So I am browsing /. and I come across a post linking to a Wired article about a Stanford research project – and I check out the research paper website and download some of the example movies and think to myself holy shit, that’s really fucking cool! Honestly, the exclamation mark was there and all.
Here is what is happening:
Traditionally, light rays pass through a camera’s lens and converge at a single point on film or a digital sensor; the camera summarizes the incoming light without recording much information about where it came from. Ng’s camera puts about 90,000 micro lenses between the main lens and the sensor. The micro lenses measure all the rays of incoming light along with their directions of origin, and software later adds the rays back up according to how the picture is being refocused.
The result is that you can, after the fact, move the plane of focus anywhere within the possible range, entirely in software.
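The refocusing step is conceptually simple: each micro lens gives you a slightly offset view of the scene, and you shift those views relative to each other before averaging, with the shift amount determining where the new focal plane lands. A minimal shift-and-add sketch of the idea (the array layout, function name, and `alpha` parameter here are my own assumptions for illustration, not the actual Stanford code):

```python
import numpy as np

def refocus(lightfield, alpha):
    """Synthetic refocusing by shift-and-add -- an illustrative sketch.

    lightfield: 4D array (U, V, S, T); sub-aperture images indexed by
    lens position (u, v), each an S-by-T image.
    alpha: relative depth of the virtual focal plane; 1.0 keeps the
    original focus, other values refocus nearer or farther.
    """
    U, V, S, T = lightfield.shape
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            # Shift each sub-aperture image in proportion to its offset
            # from the lens centre, then average all of them together.
            du = int(round((u - U / 2) * (1 - 1 / alpha)))
            dv = int(round((v - V / 2) * (1 - 1 / alpha)))
            out += np.roll(lightfield[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)
```

Points on the chosen focal plane line up across all the shifted views and come out sharp; everything off that plane gets smeared into blur, which is exactly the after-the-fact focus change in the example movies.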
For any Shadowrun GMs who may happen to be reading this – it was this kind of thing that I had in mind with my surveillance field in the forest outside the Proteus AG base – many small lenses > one big one.