Depth perception through a single lens

The key, they found, is to infer the angle of the light arriving at each pixel rather than measuring it directly (something standard image sensors and film cannot do).
 
The team’s solution is to take two images from the same camera position but focused at different depths. The slight differences between these two images provide enough information for a computer to mathematically create a brand-new image, as if the camera had been moved to one side.
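The article does not spell out the math, but the general recipe can be sketched. Below is a minimal, hypothetical NumPy illustration that assumes the difference between the two focal slices approximates a focal derivative, and that the sideways flow of light can be recovered by solving a Poisson equation and then used to warp the image toward a new viewpoint. The function name, parameters, and the Poisson-equation step are illustrative assumptions, not details taken from the researchers' paper.

```python
# Hypothetical sketch: synthesize a shifted-viewpoint image from two focal slices.
# Assumes a transport-of-intensity-style relation dI/dz = -div(I * M), where M is
# the average ray angle ("moment") at each pixel; all names here are illustrative.
import numpy as np

def synthesize_view(I1, I2, dz, view_shift=3.0):
    """I1, I2: grayscale images focused at depths z and z + dz (2D float arrays)."""
    I = 0.5 * (I1 + I2)                    # in-focus estimate at the midpoint
    dIdz = (I2 - I1) / dz                  # finite-difference focal derivative

    # Solve the Poisson equation  laplacian(U) = -dI/dz  with an FFT, assuming
    # the flux I*M is curl-free so it can be written as the gradient of U.
    ny, nx = I.shape
    fy = np.fft.fftfreq(ny).reshape(-1, 1)
    fx = np.fft.fftfreq(nx).reshape(1, -1)
    k2 = (2 * np.pi) ** 2 * (fx ** 2 + fy ** 2)
    k2[0, 0] = 1.0                         # avoid division by zero at the DC term
    U = np.real(np.fft.ifft2(np.fft.fft2(-dIdz) / (-k2)))

    # Moment field M = grad(U) / I gives the mean ray angle per pixel.
    Uy, Ux = np.gradient(U)
    eps = 1e-6
    Mx, My = Ux / (I + eps), Uy / (I + eps)

    # Shift each pixel sideways in proportion to its moment to mimic a new viewpoint.
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    src_x = np.clip(np.round(xx - view_shift * Mx).astype(int), 0, nx - 1)
    src_y = np.clip(np.round(yy - view_shift * My).astype(int), 0, ny - 1)
    return I[src_y, src_x]
```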
 
By stitching these two images together into an animation, Crozier and Orth provide a way for amateur photographers and microscopists to create the impression of a stereo image without the need for expensive hardware.
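The "stitching" can be as simple as alternating the original and the synthesized views in a looping "wiggle" animation. A short sketch using Pillow, with the file names assumed purely for illustration:

```python
# Hypothetical usage: alternate the two viewpoints as a looping GIF.
from PIL import Image

left = Image.open("view_original.png")    # illustrative file names
right = Image.open("view_shifted.png")
left.save("wiggle.gif", save_all=True, append_images=[right],
          duration=200, loop=0)           # 200 ms per frame, loop forever
```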
 
They are calling their computational method “light-field moment imaging” — not to be confused with “light-field cameras” (like the Lytro), which achieve similar effects with specialized hardware rather than computational processing.
 
Importantly, the technique offers a new and very accessible way to create 3D images of translucent materials, such as biological tissues.
 
Biologists can use a variety of tools to create 3D optical images, including light-field microscopes, which offer limited spatial resolution and are not yet commercially available; confocal microscopes, which are expensive; and a computational method called “shape from focus,” which uses a stack of images focused at different depths to identify the layer at which each object is sharpest.
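For comparison, here is a minimal sketch of the shape-from-focus idea: pick, for each pixel, the slice in the focal stack where a local focus measure peaks. The Laplacian-based focus measure, the smoothing step, and the helper name are my assumptions, not details from the article.

```python
# Hypothetical shape-from-focus sketch: assign each pixel the depth of the slice
# where a local focus measure (here, the magnitude of the Laplacian) is largest.
import numpy as np
from scipy import ndimage

def depth_from_focus_stack(stack, z_values):
    """stack: array of shape (num_slices, H, W); z_values: focal depth of each slice."""
    focus = np.stack([np.abs(ndimage.laplace(img.astype(float))) for img in stack])
    # Smooth the focus measure so the per-pixel argmax is less noisy.
    focus = ndimage.uniform_filter(focus, size=(1, 5, 5))
    best = np.argmax(focus, axis=0)        # index of the sharpest slice per pixel
    return np.asarray(z_values)[best]      # depth map in the same units as z_values
```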
 
Shape from focus is less sophisticated than Crozier and Orth’s new technique because it makes no allowance for overlapping materials, such as a nucleus that is visible through a cell membrane, or a sheet of tissue that is folded over on itself.
 
Stereo microscopes may be the most flexible and affordable option right now, but they are still not as common in laboratories as traditional monocular microscopes.