Apple’s latest light field patent describes the use of a camera array for immersive augmented reality (AR), live display walls, head-mounted displays, video conferencing, and similar applications driven by the user’s point of view. The patent application, simply titled “Light field capture”, describes AR video conferencing in which the user’s background can be replaced with other content (e.g. their own view of a scene, or live sports).
The invention also covers concepts such as pixel culling (i.e. following the user’s movements and cropping to the relevant parts of the full camera view) and the conversion of 3D data into separate 2D views for the left and right eyes of the second party.
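The patent doesn’t spell out how pixel culling would be implemented; as a rough illustration of the idea, here is a minimal Python/NumPy sketch (function name and windowing strategy are my own, not Apple’s) that crops a full camera frame to a window around the viewer’s current point of interest:

```python
import numpy as np

def cull_to_view(frame, center, size):
    """Crop a full camera frame to a window around the viewer's point of interest.

    frame:  2D (or H x W x C) image array covering the entire camera view.
    center: (row, col) the viewer is currently looking at.
    size:   (height, width) of the window to keep.
    """
    h, w = size
    # Clamp the window so it stays inside the frame.
    y = int(np.clip(center[0] - h // 2, 0, frame.shape[0] - h))
    x = int(np.clip(center[1] - w // 2, 0, frame.shape[1] - w))
    return frame[y:y + h, x:x + w]
```

In a real system the window position would be updated continuously from head or gaze tracking, and the discarded pixels would simply never be read out or transmitted, which is where the bandwidth savings come from.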
Interestingly, the authors also mention the possibility of a hybrid display/camera array that would integrate both devices into a single, light-field-sensing screen.
For more information, check out Patently Apple and Patent US9681096 – Light field capture on Google Patents.
Lytro recently took their Immerge VR camera to the next generation, with a larger, planar camera array for easier VR video production. Their most prominently promoted feature is capturing content with six degrees of freedom, meaning that viewers can not only rotate their view, but actually move their head through space (within limits).
At the recent Tribeca Film Festival, the company premiered its first VR video experience, titled “Hallelujah”, featuring a performance of Leonard Cohen’s popular song and recorded with the second-generation Immerge. Lytro’s “Making Of” video not only hints at what VR viewers will see, but also gives some insight into the Immerge production controls and interfaces: Continue reading
After releasing two light field cameras for end users, Lytro is branching out into other fields to enable broader application of their plenoptic technology: back in November, Lytro announced the Lytro Development Kit, essentially a way for interested companies to license the technology and explore light field applications on their own.
Now the company has reportedly raised $50 million to shift toward virtual reality and video. Lytro’s “refocus” on these new areas entails laying off 25 to 50 employees – a sizeable chunk of their workforce of just 130 – so that new specialists from the fields of video and VR can be hired. Continue reading
At first glance, the music video below consists only of slow panning and focus-shifting across otherwise static scenes, where the camera movement matches the calm soundscape of Big Noble‘s new song “Ocean Picture”.
However, there’s something special about this video: It was recorded solely with a Lytro Illum light field camera, and thus consists of many individual images brought to life by two of the most popular light field features: post-capture refocus and single-exposure 3D.
Light field technology produces a completely new, and fundamentally richer, set of data. With all of the advantages of this technology, the obvious downside is that there isn’t yet much software out there to process this data.
Now, the Fraunhofer Digital Cinema Alliance has presented a light-field plug-in for the popular video post-production application Avid Media Composer.
According to their press release, the plug-in can create high-resolution depth maps, shift focus, change perspective, and add 3D effects and virtual camera movement. The output can then be used to generate footage for 2D, 3D, or multi-view displays – all within existing video software. A licensing model is available for professional users interested in working with light-field video.
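Post-capture refocus, one of the features mentioned above, is classically implemented as a “shift-and-add” over the sub-aperture images of a light field: each viewpoint is shifted in proportion to its distance from the array center, then all views are averaged, so scene points at the matching depth align (sharp) while everything else blurs. The internals of the Fraunhofer plug-in aren’t public, so the following is only a minimal Python/NumPy sketch of the general technique (function name and array layout are my own):

```python
import numpy as np

def refocus(subapertures, shift_px):
    """Shift-and-add refocus over a (U, V, H, W) light field.

    subapertures: grayscale sub-aperture images indexed by lens position (u, v).
    shift_px:     per-unit-baseline shift in pixels; varying this value
                  moves the synthetic focal plane through the scene.
    """
    U, V, H, W = subapertures.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2  # center of the camera array
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its offset from the center.
            dy = int(round((u - cu) * shift_px))
            dx = int(round((v - cv) * shift_px))
            out += np.roll(subapertures[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)  # average all aligned views
```

A scene point with disparity d between neighboring views comes into focus at `shift_px = -d`; sweeping the parameter produces the characteristic focus-pulling effect. Production tools use sub-pixel interpolation instead of `np.roll`, but the principle is the same.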
Full press release after the break: