iOS 13 introduces eye contact simulation to FaceTime

Video calls have made long-distance communication a far more intimate experience, but from the start they have suffered from one problem: realistic eye contact. Because you're looking at the other person's face on the screen, not at the camera, and they are doing the same, each of you appears to the other to be looking off to the side. iOS 13 addresses this with the introduction of FaceTime Attention Correction.

FaceTime Attention Correction essentially fakes eye contact: even while you're looking at the screen, the other person sees you looking straight at them. Apple achieves this by leveraging ARKit and computational image manipulation; a sketch of the face-tracking side follows below.
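Apple hasn't published how the feature works under the hood, but the tracking half relies on the kind of data ARKit already exposes to third-party developers on TrueDepth devices. Here is a purely illustrative sketch, not Apple's actual implementation, of how an app could read per-frame eye poses and an estimated gaze point using ARKit face tracking:

```swift
import ARKit

// Purely illustrative sketch: Apple hasn't published how FaceTime
// Attention Correction works internally. This just shows the kind of
// face- and eye-tracking data ARKit exposes on TrueDepth devices.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // ARKit reports per-frame poses for each eye and an estimate
            // of where the user is looking, in the face's coordinate space.
            let gaze = faceAnchor.lookAtPoint
            let leftEye = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            print("gaze:", gaze, "eyes:", leftEye.columns.3, rightEye.columns.3)
        }
    }
}
```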

The feature was discovered by Mike Rundle, who tested it with Will Sigmon on Twitter.

According to Rundle and Sigmon, FaceTime Attention Correction currently works on the iPhone XS and XS Max (and possibly the iPhone XR) running iOS 13 beta 3, while the iPhone X doesn't seem to support it. We can't say for sure how much processing power the effect requires, but considering that the iPhone X also lacks real-time previews for post-processing effects like HDR in the Camera app, FaceTime Attention Correction may be reserved for the newer models.

Dave Schukin explains that eye contact correction uses ARKit to "grab a depth map of your face" and then employs computational imaging to readjust your eyes in real time. It sounds complex, but you can think of it as essentially a Snapchat filter (only a much more meaningful implementation of the technology). A rough sketch of what that correction step might look like is below.
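To make the idea concrete, here is a hypothetical sketch of such a correction step: it projects the eye positions ARKit reports into the camera frame and applies a small local warp around each eye with Core Image. The filter choice, radius, and strength are assumptions for illustration; Apple's real pipeline is not public.

```swift
import ARKit
import CoreImage

// Hypothetical correction step, for illustration only: project the eye
// positions ARKit reports into the camera frame and apply a small local
// warp around each eye so the gaze appears aimed at the camera. The
// filter, radius, and strength below are assumptions, not Apple's pipeline.
func correctGaze(in frame: ARFrame, faceAnchor: ARFaceAnchor) -> CIImage {
    var image = CIImage(cvPixelBuffer: frame.capturedImage)
    let viewport = CGSize(width: image.extent.width, height: image.extent.height)

    for eyeTransform in [faceAnchor.leftEyeTransform, faceAnchor.rightEyeTransform] {
        // Eye transforms are relative to the face anchor; bring them into world space.
        let world = simd_mul(faceAnchor.transform, eyeTransform)
        let eyePosition = simd_float3(world.columns.3.x, world.columns.3.y, world.columns.3.z)

        // Project the 3D eye position to 2D pixel coordinates (coordinate-space
        // conversions between UIKit and Core Image are glossed over here).
        let center = frame.camera.projectPoint(eyePosition,
                                               orientation: .landscapeRight,
                                               viewportSize: viewport)

        // Nudge the pixels around the eye with a gentle bump distortion.
        let warp = CIFilter(name: "CIBumpDistortion")!
        warp.setValue(image, forKey: kCIInputImageKey)
        warp.setValue(CIVector(cgPoint: center), forKey: kCIInputCenterKey)
        warp.setValue(40.0, forKey: kCIInputRadiusKey)  // assumed eye-region size in pixels
        warp.setValue(0.2, forKey: kCIInputScaleKey)    // assumed warp strength
        image = warp.outputImage ?? image
    }
    return image
}
```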
