The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3-D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a 3-D picture of the environment.
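The time-of-flight approach described above reduces to a simple round-trip calculation: distance is half the light's travel time multiplied by the speed of light. The sketch below illustrates that principle with made-up numbers; it is not Apple's implementation.

```python
# Time-of-flight depth sensing: emit a laser pulse, time its return,
# and convert the round-trip time into a distance.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance in meters for a measured round-trip time (illustrative)."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after ~20 nanoseconds implies an object roughly
# 3 meters away -- the room-scale range a rear-facing sensor would target.
print(round(tof_distance(20e-9), 2))  # → 3.0
```

The tiny timescales involved (nanoseconds for room-scale distances) are why time-of-flight sensors need fast, specialized hardware rather than an ordinary camera.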
It would be surprising to see Apple use time-of-flight sensors only on the back of the next iPhone, but not entirely unprecedented. Plenty of augmented reality hardware already uses these sensors, including Microsoft’s HoloLens and Google’s Tango platforms.
Where the current TrueDepth sensor does a great job gathering data up close, the smaller time-of-flight sensors needed for the back of a phone would be better suited to gathering data at room scale. Knowing how far away the wall or the couch is means augmented reality apps could integrate with the environment directly, instead of asking the user to find a suitable space to play in, as many ARKit apps currently do.
As with other AR platforms, what we will likely see from this supposed research is a new TrueDepth sensor that combines time-of-flight with the existing structured-light technique for a more complete picture of the world around the iPhone. Either way, an iPhone with better depth sensing on the back is great news for the future of ARKit and a clear indicator of how important Apple thinks this technology will be going forward.