Developer tests 2020 iPhone SE camera, results show impressive photography for a single-camera setup
Ben Sandofsky, an iOS developer behind the popular camera app Halide, has taken a deep look at the newly released 2020 iPhone SE's single-camera setup.
As we know, the new budget iPhone is priced so low in part because it reuses an old iPhone shell, the one from the 2017 iPhone 8, while its internals and camera have been significantly upgraded.
The first notable feature of the lone back camera on the new iPhone SE is what is called "Single Image Monocular Depth Estimation," the technology that makes Portrait Mode photography possible with only a single camera. Normally, a smartphone needs at least two cameras to get a good enough idea of which objects are in the background, and therefore should be blurred out, and who, or what, is the main subject.
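Under the hood, the idea boils down to having a neural network estimate a per-pixel depth map from the one image, then blurring each pixel in proportion to its estimated distance. Here is a minimal Core Image sketch of that general technique in Swift; it assumes you already have a single-channel depth estimate (brighter = closer), and it illustrates the concept rather than Apple's actual Portrait Mode pipeline:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Minimal sketch: simulate a portrait-style effect from a photo and a
// single-channel depth estimate (brighter = closer to the camera).
// Illustrative only; not Apple's actual Portrait Mode pipeline.
func portraitEffect(photo: CIImage, depthMap: CIImage) -> CIImage? {
    // Invert the depth estimate so distant pixels become bright;
    // CIMaskedVariableBlur blurs more where the mask is brighter.
    // (A real implementation would first scale the depth map, which is
    // usually low-resolution, to match the photo's dimensions.)
    let invert = CIFilter.colorInvert()
    invert.inputImage = depthMap
    guard let blurMask = invert.outputImage else { return nil }

    // Blur the background according to the mask, keeping the subject sharp.
    let blur = CIFilter.maskedVariableBlur()
    blur.inputImage = photo
    blur.mask = blurMask
    blur.radius = 12 // strength of the simulated bokeh
    return blur.outputImage?.cropped(to: photo.extent)
}
```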
A photo taken with the Halide app, showing that Portrait Mode is possible on the SE even on non-human subjects, although Apple's own camera app only supports it on human subjects.
Notably, these impressive results are achieved entirely through machine learning, despite the single camera. And because the SE lacks more advanced depth-sensing hardware, its camera can even be tricked into registering a flat photo as a scene with depth, as shown in the photo below.
The flat 2D photo on the left was photographed with the SE, with the result shown on the right: a focused subject in front of a blurred background! The image in the middle shows what the SE correctly identified as the main subject, and what it registered as the background, despite the source being a flat 2D photo with no real depth.
Ben notes that Apple likely disabled the option to take Portrait Mode photos of non-human subjects because the results are likely to turn out badly, with the wrong parts of the image ending up blurry. This can be seen in the next image, where the SE registered part of the background, a tree, as part of the subject, and thus wrongly left it in focus.
The SE uses machine learning to detect human subjects with great results, but the same can't be said for non-human subjects, so Apple disabled that option in its camera app.
And while the flagship, cutting-edge iPhone 11 Pro can distinctly detect the depth of various objects in a scene, the SE 2 simply "gets the general gist of things." As seen below, the latter is notably less precise at detecting object edges.
The left photo was taken with the iPhone 11 Pro, the right one with the 2020 iPhone SE. The portrait effect has been significantly intensified for the purpose of this test.
Ben notes that the machine-learned depth data the SE 2 camera collects is available to developers, so even though Apple itself doesn't want to make bold claims that the feature works well on non-human subjects, third-party developers can still use that data in their own apps.
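On iOS, this kind of depth information is exposed through AVFoundation's AVDepthData API. Below is a minimal capture-side sketch in Swift showing the general shape of opting into depth delivery; it assumes an already-configured AVCaptureSession, and whether depth is actually delivered depends on the device and capture configuration:

```swift
import AVFoundation

// Minimal sketch: request depth data alongside a photo capture.
// Assumes `photoOutput` has already been added to a running
// AVCaptureSession; depth support varies by device and camera.
final class DepthCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func capture(with photoOutput: AVCapturePhotoOutput) {
        // Depth delivery must be enabled on the output...
        if photoOutput.isDepthDataDeliverySupported {
            photoOutput.isDepthDataDeliveryEnabled = true
        }
        // ...and requested again in the per-shot settings.
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // The depth map arrives as a CVPixelBuffer that a third-party
        // app can feed into its own blur or segmentation effects.
        guard let depthData = photo.depthData else { return }
        print("Got depth map:", depthData.depthDataMap)
    }
}
```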
Earlier in April, we also covered another Halide developer's testing of the iPad Pro camera setup and its LiDAR sensor, which likewise gave us a fascinating look into the technology behind it and the future uses of LiDAR.
The 2020 iPhone SE itself was released on April 24, featuring a small 4.7-inch screen and a fairly dated look, but monster specs on the inside, especially for a $400 budget device, and it is likely to be a huge success for Apple.