Each new round of Apple’s operating system betas brings small new features that will most likely appear in the final release, and the third developer betas of iOS 13 and iPadOS 13 include something highly unusual for Apple — a subtle face-changing feature called FaceTime Attention Correction. Flipping the switch on makes you appear to look directly at the front-facing camera during FaceTime Video calls, even though your actual gaze is on the adjacent screen.

The feature is as surprising for the way it’s described in the betas as for how it works. Apple’s explanation says “[y]our eye contact with the camera will be more accurate during FaceTime Video calls,” though in truth, the feature adjusts your pupils to appear to be looking straight at the camera when they’re not. Consequently, users of the feature might seem to be paying undivided attention during a video call.

Developers Mike Rundle and Dave Schukin have been tweeting their findings on the feature. Rundle discovered the setting, while Schukin demonstrated how it works: ARKit 3 uses a depth map to determine the position of your face relative to the camera, then makes appropriate eye adjustments using unspecified AI and machine learning techniques. The feature can be fooled, Schukin notes, by moving an object such as the arm of a pair of eyeglasses in front of your eyes, which produces obvious distortion, visible as small curves in an otherwise straight line.
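Apple hasn't published details of the technique, but the face-position step Schukin describes maps onto ARKit's public face-tracking API, which exposes both a depth-derived face transform and a gaze estimate. The following is an illustrative sketch only, not Apple's FaceTime implementation; the class name and the use of `lookAtPoint` as the gaze signal are assumptions:

```swift
import ARKit

// Hypothetical sketch: reading face position and gaze via ARKit's
// TrueDepth face tracking. This is NOT Apple's FaceTime code, just
// the public API that exposes the same kind of depth/gaze data.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Where the face sits relative to the world origin...
            let position = face.transform.columns.3
            // ...and the point the eyes are converging on, in face space.
            let gaze = face.lookAtPoint
            print("face at \(position), gaze target \(gaze)")
            // A correction step would then warp the eye regions so the
            // rendered gaze points at the camera rather than the screen.
        }
    }
}
```

The distortion Schukin observed around eyeglass frames is consistent with this kind of localized image warp: anything overlapping the eye region gets bent along with it.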

For the time being, FaceTime Attention Correction appears as an option — turned off by default — under the FaceTime settings of A12 Bionic and A12X Bionic devices running iOS 13 and iPadOS 13 developer beta 3, including the iPhone XR, iPhone XS, iPhone XS Max, and 2018 iPad Pros. It doesn’t seem to be working properly on iPad Pros, however, and that list of devices notably excludes the iPhone X for reasons unknown. The feature will likely be included in the second public betas of iOS 13 and iPadOS 13 for users interested in seeing how it works.
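The device list suggests the gating is the A12/A12X chip rather than the TrueDepth camera alone, since the excluded iPhone X has the same TrueDepth hardware. There is no public API for the FaceTime setting itself; the closest check a third-party app can make is ARKit's face-tracking support flag, shown here as a rough proxy (an assumption, not Apple's actual gate):

```swift
import ARKit

// TrueDepth face tracking is supported on iPhone X and later, so this
// check alone would NOT reproduce the A12-only restriction on FaceTime
// Attention Correction; it only confirms the depth camera is present.
if ARFaceTrackingConfiguration.isSupported {
    print("TrueDepth face tracking available")
} else {
    print("No TrueDepth camera on this device")
}
```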