According to a Bloomberg report, Apple is hard at work developing a new rear-facing 3D sensor for an iPhone launching in 2019. Unlike the TrueDepth sensor on the front of the iPhone X, which projects an array of 30,000 laser dots, the new sensor would rely on time-of-flight measurement to detect objects. With this method, the sensor fires laser pulses at nearby objects and times how long they take to bounce back, building a 3D map of the environment behind the phone.
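The time-of-flight principle itself is simple to sketch: distance follows directly from the pulse's round-trip time and the speed of light. The snippet below is an illustrative calculation only, with hypothetical names and values, not anything from Apple or a sensor vendor.

```python
# Illustration of the time-of-flight principle: a laser pulse travels
# to an object and back, so the one-way distance is half the total
# path the light covers in the measured round-trip time.

SPEED_OF_LIGHT_M_PER_S = 299_792_458


def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance in metres to an object, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2


# A pulse returning after about 6.67 nanoseconds corresponds to an
# object roughly one metre from the sensor.
print(distance_from_round_trip(6.67e-9))
```

The tiny round-trip times involved (nanoseconds for room-scale distances) are one reason these sensors need precise, power-hungry timing electronics, which feeds into Apple's size and power constraints discussed below.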
The main reason for adding a rear-facing sensor is more believable augmented reality experiences. If the iPhone can detect real-world objects, the AR scene can take them into account. For example, parts of the scene could be hidden when they fall behind a real object, maintaining the illusion of depth. Currently, AR rendering ignores real objects, so virtual content always appears in front of them.
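The occlusion idea described above amounts to a per-pixel depth comparison: draw a virtual object's pixel only where it is closer to the camera than the real surface the sensor measured. This is a minimal sketch of that technique under assumed inputs, not ARKit code; the function and depth values are hypothetical.

```python
# Depth-based occlusion sketch: given a real-world depth reading from a
# hypothetical rear 3D sensor and the depth of a virtual object at the
# same pixel, decide whether the virtual pixel is visible.

def virtual_pixel_visible(virtual_depth_m: float, real_depth_m: float) -> bool:
    """True if the virtual surface is in front of the real surface."""
    return virtual_depth_m < real_depth_m


# A virtual cube placed 2 m away: a real chair at 1.5 m should hide it,
# while a wall at 3 m should not.
print(virtual_pixel_visible(2.0, 1.5))  # occluded by the chair
print(virtual_pixel_visible(2.0, 3.0))  # drawn in front of the wall
```

In a real renderer this comparison runs per pixel against the sensor's full depth map, but the decision at each pixel is exactly this test.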
Time-of-flight laser sensors are nothing new; Infineon, Sony, STMicroelectronics, and Panasonic already offer them. However, Apple is famously demanding when it comes to new technology for its smartphones. It will want the sensor to be very small, very thin, and to draw minimal power. If those requirements can't be met, the sensor won't make it into a 2019 iPhone.
If the sensor does make the grade, future iPhones will carry two 3D sensors: the TrueDepth system will remain on the front of the handset, while the new sensor sits on the back, most likely next to the cameras. It would also mean developers gain access to an upgraded ARKit framework from Apple, allowing them to take full advantage of the new sensor data in their AR apps.