We usually think of photos as two-dimensional representations of a scene. However, Stanford University researchers have developed an image sensor that adds a third dimension by judging the distance of subjects within a snapshot. The new technology, called the 'multi-aperture image sensor,' sees things differently than the light detectors used in ordinary digital cameras.
Unlike a normal camera sensor, the three-megapixel chip uses overlapping 16x16 arrays of pixels, each "sub-array" with its own lens. Because these arrays view the scene from slightly different angles, 3D information can be obtained by examining the differences between their images, much as the human brain gleans depth from comparing the views seen by our two eyes.
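The parallax principle described above is the same one used in stereo vision: a nearby subject shifts more between two viewpoints than a distant one. The article does not publish Stanford's actual algorithm, so the sketch below only illustrates the standard depth-from-disparity relation; the focal length, lens spacing, and disparity values are illustrative assumptions, not specifications of the chip.

```python
# Depth from disparity: a minimal sketch of the principle the sensor exploits.
# All numbers below are hypothetical examples, not the chip's real parameters.

def depth_from_disparity(focal_length_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Return subject distance in mm using the stereo relation
    depth = f * B / d, where f is the focal length in pixels, B is the
    spacing between two sub-array lenses, and d is the pixel shift
    (disparity) of the same feature between the two sub-array images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_mm / disparity_px

# Example: a feature shifted by 4 px between neighboring sub-arrays
# (assumed f = 500 px, assumed lens spacing B = 2 mm) would lie at 250 mm.
print(depth_from_disparity(500.0, 2.0, 4.0))  # 250.0
```

Features that shift less between sub-arrays yield larger depth values, which is how comparing the overlapping images recovers the third dimension.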
In addition to extracting depth information, the design may reduce the color-crosstalk problems that current sensors suffer from. It can also take macro close-ups in confined spaces, making it potentially useful for medical imaging.
New Camera Chip Takes 3D Photos
- By: Ayman, Feb 21, 2008