Here's how Google fixes your discombobulated face in wide-angle Pixel shots
These lenses are great for group or nature photos, but wide-angle shots also capture the curved periphery of the lens's field of view, producing very visible distortions, especially toward the edges of the frame.
Thus, if your face happens to be at the periphery of a group shot, it could very well look like it was painted by Picasso. Google has been addressing that in the Pixel phones by employing machine learning algorithms to correct your discombobulated face. From the Pixel 3 camera writeup:
If you’re having trouble fitting everyone in shot, or you want the beautiful scenery as well as your beautiful face, try our new wide angle lens that lets you get much more in your selfie. You can get up to 184% more in the shot, or 11 people is my own personal record. Wide angle lenses fit more people in the shot, but they also stretch and distort faces that are on the edge. The Pixel camera uses AI to correct this, so every face looks natural and you can use the full field of view of the selfie cam.
How did Google achieve that? Well, they've been researching the subject for a while, it seems, and now we have the first paper - "Distortion-Free Wide-Angle Portraits on Camera Phones" - co-authored by Google and MIT researchers that explains how they did it.
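The core idea in the paper is that a standard perspective projection stretches faces near the edges of a wide-angle frame, while a stereographic projection preserves their local shape; the algorithm warps only the face regions toward a stereographic mapping (via a mesh optimization) and leaves the background in perspective. As a rough illustration of that first ingredient only - not Google's full mesh-based method - here is a minimal sketch that remaps a perspective-projected image point to its stereographic counterpart, assuming a pinhole model with focal length `f` in the same units as the coordinates:

```python
import math

def perspective_to_stereographic(x, y, f):
    """Illustrative sketch: remap a point from perspective projection
    (r = f * tan(theta)) to stereographic projection (r = 2f * tan(theta/2)),
    which preserves local shapes and so reduces face stretching at the edges.
    Not Google's actual pipeline, which warps only face regions via a mesh."""
    r_p = math.hypot(x, y)
    if r_p == 0.0:
        return (0.0, 0.0)          # optical axis is unchanged
    theta = math.atan2(r_p, f)     # incidence angle from the optical axis
    r_s = 2.0 * f * math.tan(theta / 2.0)
    scale = r_s / r_p              # < 1 away from center: pulls edges inward
    return (x * scale, y * scale)
```

Near the image center the two projections agree, so faces there are untouched; toward the edges the stereographic radius is smaller than the perspective one, which is exactly the compression that undoes the characteristic wide-angle face stretch.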
Google's new wide-angle distortion correction method compared to others
The quote you see above is just a tiny fraction of Google's Pixel camera feature explanations, mentioned in passing in the selfie section, but there's quite a lot of painstaking research behind its algorithms.
This level of attention to detail may explain why Google does with one paltry 12MP camera on the back of the Pixels what others need multi-lens kits to do, and are still catching up to Google's results. In that line of thought, we can't wait to see what the Pixel 4 is capable of.