Apple could reportedly add TrueDepth features to the rear iPhone camera

Apple has packed a ton of sensors into the notch of the iPhone X. The TrueDepth camera system powers many features, from Face ID to face-tracking technology using ARKit. It also lets you take selfies in Portrait mode. According to a new Bloomberg report, Apple now wants to improve the rear camera with 3D-sensing capabilities.
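
For what it's worth, the face-tracking side of this is already exposed to developers through ARKit. Here's a minimal sketch, assuming a standard ARKit project with an ARSCNView wired up in the storyboard (the class and outlet names are placeholders, not anything Apple ships):

```swift
import UIKit
import SceneKit
import ARKit

// Minimal sketch: ARKit face tracking, which is driven by the TrueDepth camera.
// "FaceTrackingViewController" and the sceneView outlet are placeholder names.
class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking only works on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever ARKit updates its model of the tracked face.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }
        // Blend shapes describe the current facial expression, e.g. how open the jaw is.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        print("jawOpen:", jawOpen)
    }
}
```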

Bloomberg says that Apple won’t use the exact same technology in the rear sensor. The iPhone X currently projects a grid of thousands of laser dots onto your face and looks at the distortion of that pattern to work out depth.
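
That approach is classic structured light, and the geometry behind it is just triangulation: the more a projected dot shifts from where it would land on a flat surface, the closer the object is. A toy Swift sketch of the relationship, with made-up numbers rather than Apple’s actual calibration:

```swift
// Structured light boils down to triangulation: a projected dot that shifts by
// `disparity` pixels (compared to where it would land on a flat reference
// surface) sits at depth z = focalLength * baseline / disparity.
// All numbers below are made up for illustration, not Apple's calibration.
func depth(disparityPixels: Float, focalLengthPixels: Float, baselineMeters: Float) -> Float {
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Example: a 600 px focal length, a 2.5 cm projector-to-camera baseline and a
// 30 px shift put the surface roughly half a meter from the phone.
print(depth(disparityPixels: 30, focalLengthPixels: 600, baselineMeters: 0.025)) // 0.5 m
```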

This rumored 3D sensor would instead project laser dots and calculate how long they take to bounce off objects and come back to the phone. On the back of the phone, Apple currently relies on the two rear cameras and the parallax between them to figure out what’s closer to you.
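
Measuring how long light takes to come back is what’s generally called a time-of-flight approach, and the core math is trivial. A toy Swift sketch, assuming an idealized, noise-free sensor:

```swift
// Toy time-of-flight calculation: an emitted pulse travels to an object and
// back, so the distance is half the round trip at the speed of light.
// Real sensors deal with picosecond-level timing and plenty of noise.
let speedOfLight = 299_792_458.0 // meters per second

func distance(roundTripSeconds: Double) -> Double {
    return speedOfLight * roundTripSeconds / 2.0
}

// A pulse that returns after about 6.67 nanoseconds bounced off something
// roughly one meter away.
print(distance(roundTripSeconds: 6.67e-9)) // ≈ 1.0 m
```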

But this new system would be much more accurate. Your phone could understand its surroundings and create a rough 3D map of your environment, which would be quite useful for ARKit and other augmented reality features.
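
To give a rough idea of what such a 3D map is built from: once you have a depth value for each pixel, you can back-project every pixel into a 3D point with a pinhole camera model. A simplified Swift sketch, with hypothetical intrinsics rather than real calibration data:

```swift
import simd

// Simplified back-projection: turn a per-pixel depth map into 3D points in
// camera space using a pinhole camera model. The intrinsics are made-up
// values for illustration; a real app would read them from the camera.
struct Intrinsics {
    let fx: Float, fy: Float // focal lengths in pixels
    let cx: Float, cy: Float // principal point in pixels
}

func pointCloud(depthMap: [[Float]], intrinsics k: Intrinsics) -> [SIMD3<Float>] {
    var points: [SIMD3<Float>] = []
    for (v, row) in depthMap.enumerated() {
        for (u, z) in row.enumerated() where z > 0 {
            // Pixel (u, v) at depth z maps to a point in front of the camera.
            let x = (Float(u) - k.cx) * z / k.fx
            let y = (Float(v) - k.cy) * z / k.fy
            points.append(SIMD3<Float>(x, y, z))
        }
    }
    return points
}
```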

Apple is getting serious about augmented reality — Bloomberg also reported that the company has been working on an AR headset. Adding new AR capabilities to the iPhone could be a great way to convince people to buy an AR headset in a few years.

It reminds me of Google’s Project Tango. Tango never really took off, and Google pivoted to ARCore. But the idea behind Project Tango might live on in Apple’s upcoming phones if the sensors and chips become cheap and small enough to fit in a thin flagship smartphone.

Bloomberg says this new 3D sensor won’t be ready for next year’s iPhone. It could ship in 2019.