Apple released a new iOS 13 beta this week, and the update adds a stealthy new feature that makes it appear as though FaceTime users are looking into the caller's eyes rather than at their screen.

Normally, when you place a FaceTime call, you look at the other person on your phone's display. The byproduct of that is that you aren't able to make eye contact with them.

The new iOS 13 FaceTime Attention Correction feature, discovered by Mike Rundle, fixes that, and it's all done in software.

The new feature currently works only on the iPhone XS and iPhone XS Max, although it's possible that will change in future updates.

However, it appears to work well in the tests we've seen, and while it isn't a feature many will have been crying out for, it's one you'll immediately appreciate once you start using it.

As Dave Schukin noted on Twitter, Apple appears to be using an ARKit depth map to adjust the position of the person's eyes on-screen. He demonstrated this by showing a horizontal line warping as he moved it up and down his face during a call. It's a pretty dramatic test, and while it's interesting to see how the feature works, the warping is impossible to spot during a normal call. It even appears to work when you're wearing sunglasses.
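Apple hasn't documented how Attention Correction works internally, but Schukin's test hints at the general idea: use a depth map to decide which pixels belong to the face, then warp the eye region. Purely as a hypothetical illustration (not Apple's actual implementation), here is a toy NumPy sketch of a depth-gated warp. All of the names, the bounding-box approach, and the fixed pixel shift are assumptions made for the example:

```python
import numpy as np

def warp_eye_region(image, depth, eye_box, shift_px=3, depth_threshold=0.5):
    """Toy gaze-correction warp: shift pixels inside an eye bounding box
    upward by shift_px, but only where the depth map says the pixel is
    near the camera (i.e. on the face, not the background).

    image: (H, W) grayscale array.
    depth: (H, W) array in [0, 1], where smaller values mean closer.
    eye_box: (top, bottom, left, right) row/col bounds of the eye region.
    """
    out = image.copy()
    top, bottom, left, right = eye_box
    for r in range(top, bottom):
        # Sample from a row further down so content appears raised,
        # like eyes shifting up toward the camera.
        src = min(r + shift_px, image.shape[0] - 1)
        for c in range(left, right):
            if depth[r, c] < depth_threshold:  # only warp face pixels
                out[r, c] = image[src, c]
    return out

# A horizontal line moved through the "eye box" warps, just like in
# Schukin's demo, while background (far-depth) pixels stay untouched.
img = np.zeros((10, 10))
img[6, :] = 1.0                     # horizontal line at row 6
near = np.zeros((10, 10))           # everything close: face
warped = warp_eye_region(img, near, eye_box=(4, 8, 2, 8), shift_px=2)

far = np.ones((10, 10))             # everything distant: background
unwarped = warp_eye_region(img, far, eye_box=(4, 8, 2, 8), shift_px=2)
```

In `warped`, the line inside the eye box is pulled up two rows; in `unwarped`, nothing changes because the depth gate rejects every pixel. The real feature presumably uses a smooth per-pixel displacement field derived from the ARKit depth map rather than a hard box and fixed shift, which is why the warping is invisible in a normal call.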

You can take this for a spin if you're running the current iOS 13 beta 3 release. The final public update is expected to arrive later this year, likely in September. It is of course possible the feature will change or be removed entirely by then, but hopefully that isn't the case.
