During the Surface Pro X unveiling yesterday, Microsoft showed off new eye gaze technology. The feature taps AI to adjust the appearance of your eyes so you appear to be looking straight at the camera when you’re on a conference call. We talked to Microsoft to get a little more detail on the AI-powered functionality, which is similar to “eye correction” in iOS 13.
“AI is understanding where the camera placement is compared to where you are looking in the video, and it’s shifting your eyes downward [or upward],” Microsoft Devices director Megan Solar told VentureBeat. “It’s creating a sense that you’re looking at me eye-to-eye, even though you may be looking at your camera, you may be looking slightly off.”
Panos Panay, head of engineering for all of Microsoft’s devices, noted onstage that the feature uses the Surface Pro X’s ARM-based SQ1 chip, co-engineered by Microsoft and Qualcomm. Normally, such a feature on a regular PC would draw about 15 watts, Panay said, which is why it “doesn’t happen today.” But thanks to the SQ1, he claimed it uses 50 times less power “and doesn’t even touch the GPU.”
This eye gaze technology requires the SQ1, Microsoft confirmed, so it will only be available on the Surface Pro X. The Surface team built a number of features and improvements optimized for the SQ1, and the eye gaze technology falls into that category. Assuming Microsoft uses the SQ1 in future devices, the feature could presumably make its way there too.
Furthermore, the eye gaze technology will only be available in Microsoft Teams on the Surface Pro X. Microsoft wouldn’t say whether it would open up the functionality to other messaging and video conferencing apps or keep it a Teams exclusive.
For reference, here’s Panay explaining why Microsoft bothered to build the feature at all:
There’s a lot of information you can get when you’re making eye contact. And there’s a lot of information you can get when you’re not. So when somebody gives you their eyes, you can really feel that connection. But when they don’t, you can feel that too. Now today, when you look at your screen, and you make a video call or use a webcam, that’s what happens. You’re looking at your screen.
And so when you’re looking across the ocean at each other or across state[s] or in different conference rooms, or in different parts of the house — that happens to me, believe it or not — you’re not looking each other in the eyes.
So how do we get to that point where we can connect even deeper, where we can have that conversation, where we can co-engineer products deeper, where we can work together as humans and continue to push things forward?
He concluded the demo by saying “this is the example of power on the intelligent edge.” Expect many more such features to come to the Surface line in the next few years.