During the introduction of the new iPhone 6 and 6 Plus, Apple spent some time talking about the new 'Focus Pixels' in the latest iSight camera. Behind the fancy name, though, lies a well-known technology that is typically found in SLR cameras. It's a different focusing technique from what's traditionally used in smartphones, and it's called phase detection autofocus. Hearing the name of the technology, some of our readers may remember that the Samsung Galaxy S5, launched earlier this year, became the first smartphone to utilize phase detection autofocus. Now, the iPhone 6 (and its bigger sibling, the 6 Plus) becomes the second smartphone to join this club.
The vast majority of smartphones today rely on 'contrast detection' autofocus, in which the camera moves the lens back and forth while measuring the contrast between nearby pixels - the image is sharpest where that contrast peaks. The downsides of this technique are that it tends to be less reliable in low-light environments, where contrast differences are weak, and the hunting back and forth past the peak makes it slower compared to phase detection. Phase detection autofocus works in a totally different way - it compares pairs of images formed by light rays entering through opposite sides of the lens. The offset (or 'phase difference') between the two tells the camera both the direction and the amount by which to move the lens, so it can snap to focus in a single move instead of hunting. It's a fairly complicated process under the hood, but the bottom line is that phase detection, and Apple's 'Focus Pixels', allow the camera to lock focus on an object more quickly and reliably. The Galaxy S5 proved that the technology works quite well in the context of smartphones, as its benefits could be easily felt by users. Now, we can expect the iPhone 6 to push things even further with its own implementation of phase detection.
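To make the speed difference concrete, here's a minimal Python sketch (using only NumPy) of both approaches. This is an illustration under simplifying assumptions, not how Apple's or Samsung's firmware actually works: the capture function, contrast metric, and pixels-per-step conversion are all hypothetical stand-ins. The key thing to notice is that the contrast-detection routine has to capture and score many lens positions, while the phase-detection routine computes the required lens move from a single pair of views.

```python
import numpy as np

def contrast_score(image):
    # Variance of a simple Laplacian response: sharper images have
    # stronger intensity differences between nearby pixels.
    lap = (-4 * image[1:-1, 1:-1]
           + image[:-2, 1:-1] + image[2:, 1:-1]
           + image[1:-1, :-2] + image[1:-1, 2:])
    return lap.var()

def contrast_detection_af(capture_at, positions):
    # Sweep the lens through candidate positions, capture an image at
    # each one, and keep the sharpest. The camera only "knows" it passed
    # peak focus once the score starts dropping - the hunting that makes
    # this method comparatively slow.
    best_pos, best_score = None, -np.inf
    for pos in positions:
        score = contrast_score(capture_at(pos))
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos

def phase_detection_af(left, right, pixels_per_step):
    # Compare the two views formed by light from opposite sides of the
    # lens. Their relative shift (the "phase difference") encodes both
    # the direction and the magnitude of defocus, so one measurement
    # yields the full lens correction.
    left = left - left.mean()
    right = right - right.mean()
    corr = np.correlate(left, right, mode="full")
    shift = corr.argmax() - (len(right) - 1)  # signed offset in pixels
    return shift / pixels_per_step            # lens steps; sign = direction

# --- Hypothetical demo --------------------------------------------------
def simulated_capture(pos, true_focus=10):
    # Stand-in for a real sensor: the further the lens position is from
    # true focus, the more blur passes are applied to the test image.
    img = np.random.default_rng(1).standard_normal((64, 64))
    for _ in range(abs(pos - true_focus)):
        img = (img + np.roll(img, 1, 0) + np.roll(img, -1, 0)
                   + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5
    return img

print(contrast_detection_af(simulated_capture, range(21)))  # 10, after 21 captures

# Two phase-detect views of the same scene, offset by 6 pixels.
scene = np.random.default_rng(0).standard_normal(256)
left_view, right_view = scene[:200], scene[6:206]
print(phase_detection_af(left_view, right_view, pixels_per_step=2.0))  # 3.0, one measurement
```

In a real sensor, the two 'views' don't come from separate exposures - they come from pairs of masked photosites scattered across the sensor, which is what Apple's 'Focus Pixels' name refers to.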