Pelican Imaging is working hard on releasing “a smart camera for your Smartphone”. Their first-generation 4×4 camera array is reportedly 50 % thinner than current smartphone camera modules. It captures the Light Field not with a microlens array, but with – in this case – 16 individual fixed-focus cameras.
But hardware is only one part of the technology. Its strength lies in the power of post-processing and sophisticated computation.
In a new demo released today, Pelican Imaging demonstrates 3D video recording at 1080p and 30 fps, as well as two application examples: distance measurement within the picture, and 3D printing of recorded scenes.
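Distance measurement from a camera array typically relies on triangulation: the distance z to a point follows from the disparity d it shows between two rectified views, via z = f · b / d. A minimal sketch of that relation is below; all numbers are hypothetical, since Pelican Imaging hasn’t published its calibration parameters.

```python
# Illustrative depth-from-disparity calculation for a multi-camera array.
# Focal length, baseline and disparity values here are made up for the
# example; they are not Pelican Imaging's actual parameters.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Triangulated distance z = f * b / d for a rectified camera pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 1000 px focal length, 5 mm baseline between two array cameras,
# 10 px measured disparity -> 0.5 m subject distance.
print(depth_from_disparity(1000.0, 0.005, 10.0))  # 0.5
```

The same relation also explains the accuracy limit of such thin modules: with only millimetres of baseline, disparity shrinks quickly with distance, so depth estimates get coarser for far subjects.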
With their newest YouTube upload, German LightField specialist Raytrix demos the LightField features of their R29 camera.
The HD video includes original footage recorded with the high-end camera (downsized from the full 3288 × 2192, i.e. 7.2 megapixel, resolution, plus 100 % crops), and demonstrates the reconstruction of 3D information – even at a subject-to-lens distance of 400 meters – as well as software refocus and extended depth of field.
Video frame rate for full-resolution imaging is 5 frames per second.
LightField technology has a wide field of potential applications, which we’ve only recently begun to explore. One of these applications, which may soon be realized, is the 3D imaging of fish for scientific use:
The US National Oceanic and Atmospheric Administration (NOAA) has recently filed a statement of need notice for the development of an “underwater LightField camera system for single camera 3D imaging of fish” (WRAD-13-02577).
Looking at the specific camera requirements, the NOAA evidently knows exactly what it wants. At this stage, however, there’s no information about the precise application of the LightField pictures and video the agency hopes to record.
3D displays are slowly moving into the mainstream, but most of today’s technologies require viewers to wear special 3D glasses, or to watch from a small, narrowly defined optimum viewpoint. More advanced 3D displays use eye tracking and create a stereoscopic effect by sending different images to each eye.
David Fattal and colleagues from HP Laboratories in Palo Alto, California, have developed a new approach to glasses-free 3D displays that comes with a number of improvements: Their prototype displays use multi-directional diffractive backlight technology, which makes them particularly well-suited for mobile devices (e.g. smartphones, tablets, or watches). They’re high-resolution, very thin (<1 mm), don’t require eye tracking, and feature a very wide view zone (up to 180 degrees) at an observation distance of up to a metre. Their work was recently published in Nature.
So you’ve built your own LightField Camera? Taken your first LightField pictures? What’s next?
The next step is finding software that will allow you to process the captured LightField information. LightField setups can differ in countless ways, so unfortunately processing your pictures is not just a matter of click and refocus. There is some software available, though, that will help you work with your very own LightField photographs.
Originally developed for , LFDisplay will also work with LightField pictures taken with other setups (including a DIY LightField camera). The open-source tool for Mac and Windows provides the following LightField features…
software refocus: two refocus sliders (coarse and fine) for adjustment along the virtual z-axis
synthetic aperture controls: pinhole, full and custom aperture