Oct 13

Researchers Develop Geometric Calibration Method for MLA-based Light Field Cameras using Line Features in RAW Images

(picture: Bok et al. 2014)

Calibration is an important part of light field photography: Image processing and image quality can be significantly improved when the physical properties of the camera are known. More specifically, geometric information about the microlenses in a microlens-array-based light field camera can help create more precise depth maps with fewer errors.

Yunsu Bok and colleagues from the Korea Advanced Institute of Science and Technology (KAIST) have devised a new method for geometric calibration which – in contrast to conventional methods – does not rely on processing sub-aperture images. Instead, they extract line features and compute a light field camera’s geometric parameters directly from RAW images. Continue reading
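For readers who want to experiment with the basic ingredient, here is a minimal Python/OpenCV sketch that detects line features in a (decoded) grayscale light field RAW image using edge detection and a probabilistic Hough transform. It illustrates the general idea only; it is not Bok et al.'s calibration pipeline, which goes on to estimate the camera's intrinsic and microlens parameters from such line features. The file name "lightfield_raw.png" is a hypothetical placeholder.

```python
# Illustrative sketch only (not the method from Bok et al. 2014):
# find straight-line segments in a grayscale light field RAW image.
import cv2
import numpy as np

raw = cv2.imread("lightfield_raw.png", cv2.IMREAD_GRAYSCALE)  # placeholder file name
if raw is None:
    raise FileNotFoundError("place a decoded RAW image next to this script")

edges = cv2.Canny(raw, 50, 150)  # edge map
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                        threshold=80, minLineLength=40, maxLineGap=5)

if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        print(f"line segment: ({x1}, {y1}) -> ({x2}, {y2})")
```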

Aug 09

Lytro Meltdown: Updates to Lytro Compatible Viewer, Communicator, and Library

Jan Kučera has recently released a suite of software updates for his Lytro Meltdown tools: the Lytro Compatible Viewer (updated to version 3.0.0.0), the Lytro Compatible Communicator (new version: 1.0.1.2), and the Lytro Compatible Library (new version: 2.1.0.0).

Updates include a 3D mesh view generated from depth maps in the Viewer, improved demosaicing, and user manuals. The Library has gained accessors for well-known components of light field packages, as well as dedicated classes and methods for easier access to sub-aperture and individual microlens images.

Continue reading

Jul 06

Pinlight Display: Light Field Glasses for Augmented-Reality Applications

(picture: Siggraph 2014 website)

Earlier this year at Augmented World Expo, Nvidia researcher Douglas Lanman gave a talk about Near-Eye Light Field Displays, i.e. electronic glasses which offer both depth and refocus capability to the human eye. When asked about Augmented Reality (AR) applications during the discussion, Lanman noted that creating a set of transparent glasses that would include microlenses (or something equivalent) yet still allow “normal” see-through vision was a real challenge. He very briefly teased “pinlight displays”, which were to be presented at the same conference, but no further information could be found online.

In the Emerging Technologies section of the Siggraph 2014 conference (10-14 August 2014), Adam Maimone and colleagues from the University of North Carolina at Chapel Hill and Nvidia will be presenting their new invention in a talk entitled “Pinlight Displays: Wide-Field-of-View Augmented-Reality Eyeglasses Using Defocused Point-Light Sources”. Continue reading

Jul 01

Sony Patents Light Field Sensor with Full-Resolution 3D Stereo Output

With today’s light field sensors, extracting 3D stereo images from light field recordings typically results in a lowered effective image resolution – but that limitation may soon be history: Sony has developed a novel sensor design with overlapping pixels in two layers that will allow 3D output without the typical decrease in image resolution. In Sony’s recently published US patent application No. US 2014/0071244, inventor Isao Hirota introduces a dual-level microlens array setup in combination with a sensor that consists of two layers of light-sensitive pixel grids: a front-facing and a back-facing grid, rotated relative to each other by, for example, 45 degrees.
The described configuration allows different neighbouring pixels to share the same information from a single microlens while being allocated to either the left or the right stereo view, resulting in higher-resolution 3D stereo output from a single-lens, single-sensor device (i.e. a “monocular 3D stereo camera”).
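To make the pixel-allocation idea concrete, here is a small numpy sketch that assigns the pixels under each microlens of a conventional single-layer lenslet image to a left or right view and averages them into a stereo pair. This is a conceptual illustration only; it does not reproduce Sony's two-layer, 45-degree-rotated design, and the 2x2 lenslet size is an assumption chosen for simplicity.

```python
import numpy as np

def stereo_from_lenslets(raw, lenslet=2):
    """Conceptual sketch: split the pixels under each (lenslet x lenslet)
    microlens into left and right halves and average them per lenslet.
    NOT Sony's patented two-layer sensor; just an illustration of how
    neighbouring pixels sharing one microlens can feed a stereo pair."""
    h = raw.shape[0] - raw.shape[0] % lenslet
    w = raw.shape[1] - raw.shape[1] % lenslet
    blocks = raw[:h, :w].reshape(h // lenslet, lenslet, w // lenslet, lenslet)
    left = blocks[:, :, :, : lenslet // 2].mean(axis=(1, 3))   # left half of each lenslet
    right = blocks[:, :, :, lenslet // 2:].mean(axis=(1, 3))   # right half of each lenslet
    return left, right

# toy usage with random data standing in for a decoded RAW frame
raw = np.random.rand(480, 640).astype(np.float32)
left_view, right_view = stereo_from_lenslets(raw)
print(left_view.shape, right_view.shape)  # (240, 320) (240, 320)
```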

Fig. 8 from the patent application shows an example setup which uses both a multi-lens array (34) and an additional on-chip lens array (OCL, 35), together with a color filter array (33), to create a stereo light field image. Fig. 18 is a diagram of a CMOS image sensor in which pixels are arranged in a 2x2 matrix and, in the second pixel layer, rotated by 45 degrees to output multiple perspectives in the Bayer arrangement.
Fig. 19 illustrates how the setup in Fig. 18 yields nine perspective images (six actual parallax images and three interpolated ones), making it suitable for a monocular 3D stereo camera. Fig. 27 provides examples of the electric potential distribution on the light-receiving (back) face of a typical square (A) and triangular (B) pixel, and shows that the microlenses can be circular, ellipsoidal or polygonal in shape to improve the lenses’ extinction ratios.

Patent abstract: Continue reading

Jun 06

Nvidia Near-Eye Light Field Display: Background, Design and History [Video]

Binocular OLED-based prototype (YouTube screenshot)

About a year ago, Nvidia presented a novel head-mounted display that is based on light field technology and offers both depth and refocus capability to the human eye. Their so-called Near-Eye Light Field Display was more of a proof of concept, but it’s an exciting new technology that solves a number of existing problems with stereoscopic virtual reality glasses.

Nvidia researcher Douglas Lanman recently gave a talk at Augmented World Expo (AWE2014), in which he explained the background and evolution of head-mounted displays and the history and design of Nvidia’s near-eye light field display prototypes: Continue reading