You need at least two cameras looking at your face at all times - the Amazon Fire Phone has four, so at least two are always left uncovered by your fingers.
The most notable feature of the new Amazon Fire Phone is something called Dynamic Perspective, a glasses-free, 3D-like effect embedded throughout the phone - from the wallpapers, to the refashioned Carousel interface, to apps and games. But how was Amazon able to pull off such a trick?
Jeff Bezos took to the stage yesterday to unveil the new phone, and he also spoke about the challenges the company faced in bringing it to market. Bezos mentioned that the Fire Phone project started nearly four years ago, and that the first prototypes required the user to wear glasses so the phone could track their face. Not the best way to do it, is it?
In order to get rid of such unwanted headgear, Amazon had to double the number of cameras on the initial prototypes - two cameras are the minimum for stereo vision with depth, but hold the phone in a different orientation and you can easily cover one of those two with your hand. That's why Amazon uses four cameras: even when you hold the phone in landscape, at least two of them are still looking at you.
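To make that idea a bit more concrete, here is a minimal sketch of how a system like this could pick an uncovered camera pair and turn stereo disparity into head distance. It is not Amazon's actual code; the camera layout, focal length and baseline figures are assumptions chosen purely for illustration.

```java
// Illustrative sketch only, not Amazon's algorithm. Camera positions, focal
// length and baseline values below are made-up assumptions.
public class StereoHeadTracking {

    // The four front corner cameras, identified by corner.
    enum Camera { TOP_LEFT, TOP_RIGHT, BOTTOM_LEFT, BOTTOM_RIGHT }

    /**
     * Pick a camera pair the hand is unlikely to cover for a given grip.
     * In landscape, the fingers tend to wrap around one short edge, so we
     * fall back to the two cameras on the opposite edge.
     */
    static Camera[] pickPair(boolean landscape, boolean gripOnLeftEdge) {
        if (!landscape) {
            // Portrait: the two top cameras are usually unobstructed.
            return new Camera[] { Camera.TOP_LEFT, Camera.TOP_RIGHT };
        }
        return gripOnLeftEdge
                ? new Camera[] { Camera.TOP_RIGHT, Camera.BOTTOM_RIGHT }
                : new Camera[] { Camera.TOP_LEFT, Camera.BOTTOM_LEFT };
    }

    /**
     * Classic stereo depth from disparity: Z = f * B / d, where f is the focal
     * length in pixels, B the baseline between the two cameras in meters, and
     * d the horizontal disparity of the same facial feature in pixels.
     */
    static double depthFromDisparity(double focalPx, double baselineM, double disparityPx) {
        return focalPx * baselineM / disparityPx;
    }

    public static void main(String[] args) {
        Camera[] pair = pickPair(true, true);
        System.out.println("Using cameras: " + pair[0] + " and " + pair[1]);

        // Assumed values: 600 px focal length, 6 cm baseline, 90 px disparity.
        double z = depthFromDisparity(600, 0.06, 90);
        System.out.printf("Estimated head distance: %.2f m%n", z); // ~0.40 m
    }
}
```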
The next challenge was the field of view of traditional front cameras - with the 72-degree field of view most of them sport, a person's head would drift out of frame far too often, breaking the whole 3D effect. That's why Amazon had to equip the Fire Phone with custom-built cameras offering a wider, 120-degree field of view, which keep the user's face in frame in the vast majority of use cases.
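Some quick back-of-the-envelope geometry shows why the wider lens matters. Only the 72-degree and 120-degree figures come from Amazon; the 30 cm viewing distance below is an assumption.

```java
// Rough geometry sketch: how far can the head move sideways before it leaves
// the frame? Only the two field-of-view figures are Amazon's; the distance is
// an assumed typical phone-to-face distance.
public class FieldOfViewCheck {

    /** Maximum sideways head offset (in meters) that stays in frame. */
    static double maxOffset(double fovDegrees, double distanceM) {
        double halfAngle = Math.toRadians(fovDegrees / 2.0);
        return distanceM * Math.tan(halfAngle);
    }

    public static void main(String[] args) {
        double distance = 0.30; // assumed viewing distance in meters

        System.out.printf("72-degree FOV:  head stays in frame within +/- %.2f m%n",
                maxOffset(72, distance));   // ~0.22 m
        System.out.printf("120-degree FOV: head stays in frame within +/- %.2f m%n",
                maxOffset(120, distance));  // ~0.52 m
    }
}
```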
It's a technology 4 years in the making
Going even deeper into the technical details, Amazon also revealed that it uses global shutter cameras on the front rather than the more traditional rolling shutter ones. Global shutter cameras are much faster, and because each capture takes so little time, they also use less power, even though they are fired dozens of times every second. There is, in fact, a 10x difference in efficiency between a rolling shutter and a global shutter camera.
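The arithmetic behind that claim is simple enough to sketch. The per-capture times below are purely illustrative assumptions; only the roughly 10x ratio comes from Amazon.

```java
// Purely illustrative arithmetic: the 10 ms and 1 ms per-capture times are
// assumptions, chosen only to show how a faster capture means less time with
// the sensor powered on. Only the ~10x ratio is Amazon's figure.
public class ShutterDutyCycle {

    /** Fraction of each second the sensor spends actively capturing. */
    static double dutyCycle(int capturesPerSecond, double secondsPerCapture) {
        return capturesPerSecond * secondsPerCapture;
    }

    public static void main(String[] args) {
        int fps = 60; // "fired dozens of times every second"

        double rolling = dutyCycle(fps, 0.010); // assumed 10 ms per rolling-shutter capture
        double global  = dutyCycle(fps, 0.001); // assumed 1 ms per global-shutter capture

        System.out.printf("Rolling shutter active %.0f%% of the time%n", rolling * 100); // 60%
        System.out.printf("Global shutter active %.0f%% of the time%n", global * 100);   // 6%
        System.out.printf("Roughly %.0fx less sensor-on time%n", rolling / global);      // ~10x
    }
}
```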
The next step towards perfecting this technology was making it work in low light. After all, we often use our phones at night, in a car, or generally in poorly lit conditions. To overcome this, Amazon uses infrared light - a kind of light that we can't see, but that the cameras can pick up in the dark to work out where your face is in relation to them, which is exactly what makes the 3D-like effect possible.
Finally, Amazon opened the new Dynamic Perspective SDK to developers on the day of the event, so if you're a coder, you can start supporting the 3D-like functionality in your apps right away - Amazon says it has made it all really simple.
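To give a rough idea of what apps actually do with the head-tracking data, here is a hypothetical sketch of the core parallax trick: shift each UI layer against the head's movement, scaled by how "deep" the layer is meant to sit. This is not the actual Dynamic Perspective SDK API - the class, method names and tuning constant below are made up for illustration.

```java
// Hypothetical illustration of the idea behind Dynamic Perspective, not the
// Amazon SDK: given the tracked head position, shift a UI layer against the
// head's movement, scaled by its virtual depth, to fake parallax.
public class ParallaxSketch {

    /**
     * Horizontal offset (in pixels) for a layer at the given virtual depth.
     * headX is the head's sideways position in meters relative to the screen
     * center; layerDepth is 0..1 (0 = on the screen plane, 1 = far behind it).
     */
    static float parallaxOffsetPx(double headX, double layerDepth, double pxPerMeter) {
        return (float) (-headX * layerDepth * pxPerMeter);
    }

    public static void main(String[] args) {
        double pxPerMeter = 400; // assumed tuning constant

        // Head moved 5 cm to the right of the screen's center.
        System.out.println("Background layer offset: "
                + parallaxOffsetPx(0.05, 0.8, pxPerMeter) + " px"); // -16.0 px
        System.out.println("Foreground layer offset: "
                + parallaxOffsetPx(0.05, 0.1, pxPerMeter) + " px"); // -2.0 px
    }
}
```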