Lidar used to cost $75,000—here’s how Apple brought it to the iPhone
How Apple made affordable lidar with no moving parts for the iPhone.
At Tuesday’s unveiling of the iPhone 12, Apple touted the capabilities of its new lidar sensor. Apple says lidar will enhance the iPhone’s camera by allowing more rapid focus, especially in low-light situations. And it may enable the creation of a new generation of sophisticated augmented reality apps.
Tuesday’s presentation offered little detail about how the iPhone’s lidar actually works, but the iPhone 12 isn’t Apple’s first device with lidar. Apple first introduced the technology with the refreshed iPad Pro in March. And while no one has done a teardown of the iPhone 12 yet, we can learn a lot from recent iPad Pro teardowns.
Lidar works by sending out laser light and measuring how long it takes to bounce back. Because light travels at a constant speed, the round-trip time translates directly into a precise distance estimate. Repeat that process across a two-dimensional grid of points and the result is a three-dimensional “point cloud” showing the positions of objects around a room, street, or other scene.
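To make the arithmetic concrete, here is a minimal sketch in Swift, assuming a hypothetical LidarReturn type that records the direction each pulse was fired in along with its measured round-trip time. It converts a round-trip time into a distance (dividing by two because the light covers the path out and back) and turns a grid of returns into 3-D points via a spherical-coordinate conversion; this is an illustration of the geometry, not how Apple’s sensor actually reports its data.

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Converts a measured round-trip time (seconds) into a distance (meters).
/// The division by two accounts for the pulse traveling out and back.
func distance(fromRoundTripTime t: Double) -> Double {
    speedOfLight * t / 2.0
}

/// One lidar return: the direction the pulse was fired in (radians) and the
/// measured round-trip time (seconds). A hypothetical type for illustration.
struct LidarReturn {
    let azimuth: Double
    let elevation: Double
    let roundTripTime: Double
}

/// Turns a grid of returns into a 3-D point cloud using spherical coordinates.
func pointCloud(from returns: [LidarReturn]) -> [(x: Double, y: Double, z: Double)] {
    returns.map { (r: LidarReturn) -> (x: Double, y: Double, z: Double) in
        let d = distance(fromRoundTripTime: r.roundTripTime)
        return (x: d * cos(r.elevation) * cos(r.azimuth),
                y: d * cos(r.elevation) * sin(r.azimuth),
                z: d * sin(r.elevation))
    }
}

// A pulse that returns after about 6.67 nanoseconds hit something roughly
// one meter away.
let sample = LidarReturn(azimuth: 0, elevation: 0, roundTripTime: 6.67e-9)
print(pointCloud(from: [sample]))   // ≈ [(x: 1.0, y: 0.0, z: 0.0)]
```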
A June analysis by System Plus Consulting found that the iPad Pro’s lidar sends out light using an array of vertical-cavity surface-emitting lasers (VCSELs) made by Lumentum. It then detects the return flash using an array of sensors called single-photon avalanche diodes (SPADs) supplied by Sony. I’ll explain what these are in the next section.
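The teardown doesn’t reveal how the raw detections are processed, but a common approach for direct time-of-flight sensors built around single-photon detectors is to fire many pulses and accumulate photon arrival times into a histogram; the most-populated time bin marks the return. The toy simulation below, written in Swift with made-up numbers for the target distance, timing bins, and noise level, illustrates that general principle rather than Apple’s or Sony’s actual design.

```swift
import Foundation

// Toy sketch of direct time-of-flight with a single-photon detector.
// Individual detections are noisy, so the sensor fires many pulses,
// histograms the photon arrival times, and takes the most-populated
// bin as the round-trip time. All numbers here are illustrative.

let speedOfLight = 299_792_458.0
let trueDistance = 1.5                               // hypothetical target, meters
let trueRoundTrip = 2 * trueDistance / speedOfLight  // roughly 10 nanoseconds

let binWidth = 250e-12                               // assumed 250 ps timing bins
let binCount = 400                                   // covers a 100 ns window
var histogram = [Int](repeating: 0, count: binCount)

for _ in 0..<10_000 {
    // Most detections come from the return pulse (with a little timing jitter);
    // the rest are stray ambient-light photons spread across the whole window.
    let isSignal = Double.random(in: 0..<1) < 0.7
    let arrival = isSignal
        ? trueRoundTrip + Double.random(in: -50e-12 ... 50e-12)
        : Double.random(in: 0..<(Double(binCount) * binWidth))
    let bin = Int(arrival / binWidth)
    if bin >= 0 && bin < binCount {
        histogram[bin] += 1
    }
}

// The histogram peak marks the round-trip time, and hence the distance.
let peakBin = histogram.indices.max(by: { histogram[$0] < histogram[$1] })!
let estimatedDistance = speedOfLight * Double(peakBin) * binWidth / 2
print(String(format: "Estimated distance: %.2f m", estimatedDistance))   // ≈ 1.50 m
```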
I found Apple’s announcement particularly interesting because I’ve been working on a story about companies that are using the same combination of technologies—VCSEL lasers and SPAD detectors—to build much cheaper lidar.