Earlier this year, Facebook and OTOY revealed a new recording technique that combines 360-degree video with depth information. The football-sized sphere contains 24 cameras (there is also a smaller version with just 6) and allows the recording of 360-degree virtual reality video with 6 degrees of freedom, making it possible for viewers not only to turn their heads and look around from the camera’s fixed position, but also to move around slightly within the scene.
Now, at the Adobe MAX creators’ conference in Las Vegas, Adobe has unveiled Project Sidewinder, an experimental software tool that creates a very similar effect from just two camera views.
Apple’s latest light field patent describes the use of a camera array for immersive augmented reality (AR), live display walls, head-mounted displays, video conferencing, and similar applications that depend on the user’s point of view. The patent application, simply titled “Light field capture”, describes AR video conferencing in which the user’s background can be replaced with other content (e.g. their own view of a scene, or live sports).
The invention also covers concepts such as pixel culling (i.e. following the user’s movements and cropping the full camera view to the parts of interest) and the conversion of 3D data into 2D views for the second party’s left and right eyes.
Interestingly, the authors also mention the possibility of a hybrid display/camera array that would integrate both devices into a single, light-field-sensing screen.
For more information, check out Patently Apple and Patent US9681096 – Light field capture on Google Patents.
Lytro recently upgraded their Immerge VR camera to the next generation, with a larger, planar camera array for easier VR video production. Their most heavily promoted feature is recording content with 6 degrees of freedom, meaning that you can not only rotate your view, but actually move your head around in space (within limits).
At the recent Tribeca Film Festival, the company presented its first VR video experience, titled “Hallelujah”, featuring a performance of Leonard Cohen’s popular song and recorded with the second-gen Immerge. Lytro’s “Making Of” video not only hints at what VR viewers will see in the video, but also gives some insight into the Immerge production controls and interfaces:
After releasing two light field cameras for end users, Lytro seems to be branching out into other fields to enable broader application of their plenoptic technology: back in November, Lytro announced the Lytro Development Kit, essentially a way for interested companies to license the technology and explore light field applications on their own.
Now the company has reportedly raised $50 million to shift toward virtual reality and video. Lytro’s “refocus” on these new areas entails laying off 25 to 50 current positions – a sizeable chunk of their workforce of just 130 – so that new specialists from the fields of video and VR can be hired.
At first glance, the music video below consists only of slow panning and focus-shifting across otherwise static scenes, where the camera movement matches the calm soundscape of Big Noble‘s new song “Ocean Picture”.
However, there’s something special about this video: It was recorded solely with a Lytro Illum light field camera, and thus consists of many individual images brought to life by two of the most popular light field features: post-capture refocus and single-exposure 3D.
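Post-capture refocus from a light field can be sketched as a shift-and-add over the camera’s sub-aperture views: shift each view in proportion to its position in the lens array, then average. This is a minimal, generic illustration of the principle, not Lytro’s actual processing pipeline:

```python
import numpy as np

def refocus(views, slope):
    """Synthetic-aperture refocus by shift-and-add. `views` maps
    integer (u, v) sub-aperture offsets to equally sized 2D images;
    `slope` selects the focal plane (0 keeps the original alignment)."""
    acc = None
    for (u, v), img in views.items():
        # Shift rows by slope*v and columns by slope*u before summing.
        shifted = np.roll(img, (round(slope * v), round(slope * u)), axis=(0, 1))
        acc = shifted if acc is None else acc + shifted
    return acc / len(views)
```

Points whose view-to-view disparity matches `slope` add up coherently and stay sharp, while everything else is averaged out into blur, which is the refocus effect.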