App developer analyzes 2020 iPad Pro LiDAR scanner and cameras
Sebastian de With, developer of iPhone camera app Halide, has taken a closer look at the 2020 iPad Pro camera array and LiDAR sensor.
What becomes clear after inspecting the iPad Pro's cameras is that they are once again inferior to those of the current iPhone 11 models. In addition, comparing photos taken with the 2020 iPad Pro and the 2018 iPad Pro shows little difference or improvement between the two. Aside from minor changes in software photo processing, not much has improved in the iPad camera department since 2018; the camera sensors and lenses appear to be identical, or nearly so, between the two iPad Pro models.
De With notes that the iPad Pro's camera array is most closely comparable to the one found on the iPhone 8, and that it is still a great set of cameras.
The ultra-wide lens on the iPad Pro appears to be slightly less wide than the one found on the iPhone 11 series, with a 1mm difference. Its sensor is also a mere 10 megapixels, the lowest resolution of any Apple rear camera since the iPhone 6, though it's important to remember that megapixel count is only one factor in an image and doesn't necessarily mean worse or better image quality.
De With concludes that the iPhone 11 and 11 Pro pack a significantly larger and better sensor than the iPad.
Next is the LiDAR depth sensor, notable for being the first new image capture technology to appear on an iPad before an iPhone. With mostly just Apple's Measure app making use of it at the moment, it remains relatively unexplored. Thanks to the sensor, augmented reality (AR) apps no longer need an initial calibration pass for the iPad to map an area, which allows for more seamless and polished AR experiences. From detecting features in a room to measuring distances and even people's height, it has a lot of potential.
The LiDAR sensor makes Apple's iPad great at sensing three-dimensional space, though its output isn't currently accurate enough to, for example, scan a 3D object and send it to a 3D printer. For now, the Halide app lets users capture the LiDAR's raw depth data, which is a cool concept.
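For illustration, raw depth data from the LiDAR scanner can be read through ARKit's scene-depth frame semantics on supported devices. The sketch below is an assumption about the general approach, not Halide's actual implementation, and it relies on the `sceneDepth` API, which requires a LiDAR-equipped device:

```swift
import ARKit

// Minimal sketch (not Halide's actual code): streaming the LiDAR
// scanner's per-pixel depth map through an ARKit session.
class DepthCaptureDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth is not supported on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // sceneDepth wraps a CVPixelBuffer of depth values in meters.
        guard let depth = frame.sceneDepth else { return }
        let depthMap = depth.depthMap
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Received \(width)x\(height) depth map")
    }
}
```

The depth map's resolution is much lower than the camera image's, which is consistent with de With's observation that the sensor's output isn't precise enough for tasks like 3D-print-ready object scanning.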
iPhone 11 Pro left, 2020 iPad Pro right. Source: Sebastian de With
It doesn't appear to be an adequate sensor for photo enhancements such as improved portrait mode, however. De With notes that machine learning could make it useful in that area in the future, though portrait photography is clearly not what the LiDAR sensor was designed for, so this seems unlikely.
We have yet to see how Apple and app developers will take full advantage of the sensor; it will most likely be used for scanning and building augmented reality environments, which is what it was designed for.