Apple’s big improvements to AR might be pushed back to 2020
The announcement of the 2018 iPhones is just hours away, but some experts are already looking toward the future, wondering what Apple has in store for its customers in 2019, a year expected to bring bigger changes to the iPhone lineup.
One feature rumored for the 2019 iPhones was a 3D sensor on the back of the phone. According to Apple analyst Ming-Chi Kuo, however, the company won’t be implementing the technology in next year’s models.
Unlike the front-facing system that Apple uses today for Face ID and the infamous Animoji, a sensor on the back would require a different technology for its purposes. So, what exactly are those purposes? As we already know, Apple is working hard on augmented reality, and a big part of making it work properly is having a good idea of the 3D environment your device is pointed at. While today’s solutions do a decent job at that, a dedicated 3D sensor on the back would significantly increase the accuracy, and therefore the usefulness, of AR.
The difference between the two sensors comes down to how they work. Apple’s current TrueDepth camera is based on structured-light technology: it projects a known pattern onto the object (the person’s face) and builds a 3D model from the distortion that the object’s shape causes in that pattern. While this works well for small objects close to your device, it’s not so useful for modeling rooms, let alone streets.
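To see why the method degrades with distance, here’s a minimal sketch in Swift of the triangulation that structured light relies on. All the numbers and names are illustrative assumptions, not details of Apple’s TrueDepth pipeline:

```swift
// Toy structured-light depth recovery (illustrative only).
// A projector casts a known dot pattern; a camera a fixed baseline away
// sees each dot shifted sideways (the disparity) by an amount that
// depends on how far away the surface is. Classic triangulation gives:
//   depth = focalLength * baseline / disparity

func depth(focalLengthPx: Double, baselineMeters: Double, disparityPx: Double) -> Double {
    // Disparity shrinks toward zero as the surface recedes, so small
    // pixel-measurement errors turn into large depth errors at range.
    return focalLengthPx * baselineMeters / disparityPx
}

// Hypothetical numbers: a 600 px focal length, 5 cm baseline, and a dot
// shifted by 40 px put the surface about 0.75 m away, roughly face range.
print(depth(focalLengthPx: 600, baselineMeters: 0.05, disparityPx: 40)) // 0.75

// At 10 m the same setup sees a shift of only 3 px, so a single pixel
// of noise changes the answer by metres.
print(depth(focalLengthPx: 600, baselineMeters: 0.05, disparityPx: 3)) // 10.0
```

That vanishing disparity is exactly why a pattern-based sensor that is great at arm’s length runs out of precision at room or street scale.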
Streets are exactly what Apple has in mind: according to specialists, the company is planning to make AR part of Apple Maps and a core feature of its rumored smart glasses. For that, it will need a better 3D-mapping technology called Time of Flight (ToF). ToF sensors precisely measure the time it takes for a signal to bounce off an object and return to the sensor, pinning down its position in space very accurately. There are standalone devices that use this method across various industries, but fitting such a system into a smartphone (or glasses) is obviously a technological challenge.
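The math behind ToF itself is simple; the hard part is timing, since light covers phone-to-room distances in nanoseconds. Here’s a back-of-the-envelope sketch in Swift (the pulse timing below is a made-up example, not a real sensor reading):

```swift
import Foundation

// Toy Time-of-Flight distance calculation (illustrative only).
// The signal travels out to the object and back, so the one-way
// distance is half of speed * round-trip time.

let speedOfLight = 299_792_458.0 // metres per second

func distance(roundTripSeconds: Double) -> Double {
    return speedOfLight * roundTripSeconds / 2.0
}

// A pulse that returns after 20 nanoseconds hit something about 3 m away:
print(String(format: "%.2f m", distance(roundTripSeconds: 20e-9))) // "3.00 m"

// Flipping the formula around: resolving depth to 1 cm means timing the
// round trip to within about 67 picoseconds.
let timingBudget = 2 * 0.01 / speedOfLight
print(String(format: "%.0f ps", timingBudget * 1e12)) // "67 ps"
```

Numbers like that 67-picosecond budget hint at why squeezing ToF hardware into a phone or a pair of glasses is the challenge it is.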
We’re eager to see what Apple is working on, and while the company keeps future projects under wraps, earlier this month we got a look at the experts Apple wants to hire in the field, so sooner or later we should see something new on that front.
via: AppleInsider