Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, the people said. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
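The arithmetic behind time-of-flight sensing is simple: distance is half the pulse's round-trip time multiplied by the speed of light. The Swift sketch below illustrates that calculation only; the function name and sample figures are illustrative and have nothing to do with Apple's actual implementation.

```swift
import Foundation

/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Illustrative helper: converts a measured round-trip time (in seconds)
/// for an emitted laser pulse into a distance estimate (in meters).
/// The pulse travels to the object and back, hence the division by 2.
func distance(forRoundTripTime t: Double) -> Double {
    return speedOfLight * t / 2.0
}

// Example: a pulse returning after roughly 6.67 nanoseconds implies
// an object about one meter away.
print(distance(forRoundTripTime: 6.67e-9)) // ≈ 1.0 m
```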
The existing TrueDepth system would remain on the front of future iPhones to power Face ID, while the new system would bring the more advanced “time-of-flight” 3D sensing capability to the rear camera, according to sources. Discussions with potential suppliers, including Infineon, Sony, STMicroelectronics, and Panasonic, are reportedly already underway. Testing is said to still be in the early stages, and the technology could end up not being used in the phones at all.
With the release of iOS 11, Apple introduced the ARKit software framework, which lets iPhone developers build augmented reality experiences into their apps. Adding a rear-facing 3D sensor could, in theory, let virtual objects interact more convincingly with real-world environments and enhance the illusion of solidity.
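For context, the kind of scene understanding a depth sensor could improve is already exposed to developers through ARKit's world tracking and plane detection. The minimal Swift sketch below shows a standard ARKit session setup as it exists today; it is a generic illustration, not tied to any reported Apple hardware plans.

```swift
import UIKit
import ARKit

// Minimal ARKit world-tracking setup (iOS 11+). Any hardware-level depth
// improvements would surface through this same API, handled by ARKit itself.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking estimates the device's position in the room;
        // plane detection finds flat surfaces virtual objects can rest on.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```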
Apple was reportedly beset by production problems when building the sensor array in the iPhone X’s front-facing camera, because its components have to be assembled to very tight tolerances. According to Bloomberg, while the time-of-flight approach uses a more advanced image sensor than the one in the iPhone X, it does not demand the same level of precision during assembly. That alone could make a rear-facing 3D sensor easier to produce at high volume.