Qualcomm has a lot of traction in XR devices (virtual reality, augmented reality and mixed reality), and its chips have been used in more than 60 XR products to date. That’s up from 30 devices at CES 2020.

The man in charge of that effort is Hugo Swart, vice president and general manager of Qualcomm’s XR efforts. I caught up with Swart at CES 2023 to talk about the future of XR and the devices that we’ll see coming out in the new generation of products.

While Qualcomm isn’t making the headsets, the Snapdragon family of processors it designs is key to the power and processing efficiency of the devices. When the company makes progress with its technology, we get lighter, more capable XR headsets with better battery life. And one of these days, we’ll get some AR glasses that look just like a pair of ordinary sunglasses or prescription lenses.




What’s coming down the road? Snapdragon Spaces devices, better connectivity with Wi-Fi 7, better video passthrough, and hopefully a set of standards and devices that will enable an open metaverse. I talked with Swart about these topics at the recent CES 2023 tech trade show in Las Vegas.

Here’s an edited transcript of our interview.

Hugo Swart is vice president and general manager of XR (VR and AR) at Qualcomm.

VentureBeat: I was just around the corner looking at the new HTC headset. It looks pretty good.

Hugo Swart: The hardware is pretty nice. We’re still supporting a lot of the software things, but I really like it too. We’re starting to see more and more devices. The Lenovo VRX, I’m not sure if you’ve tried that. They also use pancake lenses. It’s targeted at enterprises.

It’s what we’ve been seeing for the longest time. If you’re doing a VR headset, there’s nothing better than the XR2. It’s still a very popular platform. Overall, believe it or not, we’ve already launched more than 60 devices. Of course, there are devices from the tier-one companies, but also many startups. We have a good way to scale our support for all these types of companies.

VentureBeat: Do you see some clear periods of time when we had a 1.0 generation, and then a 2.0 generation?

Swart: We did the XR2+. Essentially the main difference was the packaging. It was just a different package. The XR2 Gen 1, the original one, has a package which helps with the form factor, but on heat dissipation, it actually concentrates heat in one area. With XR2+ we put the memory on the side. It helps with the heat dissipation. That’s the kind of change we call a Plus or a Pro. When it comes to the generations, it’s really a new processor as a whole.

One example of that, we announced the AR2 last November. I don’t think we’ve talked about that. The XR product family is for, of course, products that can be VR, MR, AR. But then as we focused on sleeker, smaller form factor glasses, we felt we needed to do something different. That’s why we created a new platform that is only for AR. That’s why we call it the AR platform. The AR2 is a completely new architecture for AR so that we can get under one watt of power in the glasses. How do you do that? You actually need more power if you want to create rich multimedia and graphics and all that. But you can distribute the processing between the glasses and something that you carry at the same time, or something like a PC.

We were talking about split rendering way back when. I remember it was at a GDC. It’s kind of the same thing, the same concept, where you do perception – hand tracking, head tracking, and some of those perception features – in the headset, and then send those poses to the host compute unit. The host compute unit is actually running the applications. Then it encodes the rendered frame and sends it back, and you do reprojection and all those display processing features in the headset.
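The round trip Swart describes can be sketched as a single iteration of a loop: track a pose on the glasses, render and encode on the host, then decode and reproject back on the glasses. This is a minimal Python illustration of the concept, not Qualcomm’s implementation; every name here is a hypothetical stand-in, and zlib stands in for a real video encoder.

```python
import zlib
from dataclasses import dataclass

@dataclass
class Pose:
    """6DoF head pose tracked on the glasses (position + orientation quaternion)."""
    position: tuple
    orientation: tuple

def track_pose() -> Pose:
    # On-device perception: head/hand tracking runs on the glasses themselves.
    return Pose(position=(0.0, 1.6, 0.0), orientation=(0.0, 0.0, 0.0, 1.0))

def host_render(pose: Pose) -> bytes:
    # The host (phone or PC) runs the application, renders a frame for the
    # received pose, and encodes it for the wireless link.
    frame = f"frame@{pose.position}".encode()
    return zlib.compress(frame)

def glasses_display(encoded: bytes, latest_pose: Pose) -> str:
    # Back on the glasses: decode, then reproject the frame against the newest
    # pose so link latency is less visible.
    frame = zlib.decompress(encoded).decode()
    return f"{frame} reprojected@{latest_pose.position}"

# One iteration of the split-rendering loop.
pose = track_pose()
encoded = host_render(pose)
shown = glasses_display(encoded, track_pose())
print(shown)
```

The design point is that only poses go up and encoded frames come down, so the heavy application and rendering work never has to run inside the glasses’ one-watt budget.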

That’s why it’s completely different. It’s ground-breaking. It’s going to enable small form factor AR glasses, because this architecture eliminates, or at least greatly decreases, the power consumption. Power is what drives size, because power means heat dissipation. You can’t put a fan in there, and any kind of thermal solution adds size. To reduce size, you need to reduce power. That’s the compromise we found to enable these small form factor glasses.

The other thing is that we enable wireless. It’s a wireless distributed process. We’re using Wi-Fi 7. It doesn’t necessarily need Wi-Fi 7, but of course with Wi-Fi 7 you have the best latency and techniques to work in more challenging scenarios. You connect Wi-Fi 7 from your phone or your PC to the glasses.

VentureBeat: What perspective do you have when people look at something like the Meta Quest Pro? Outstanding mixed reality, great wireless, all the other advantages. You can read text now. But they want it for $400.

Swart: There is certainly evolution toward lower price points. I can’t say the exact price point because that involves a lot of things around the business model and so forth. You saw that HTC has a lower price point than the Quest Pro. The VRX, I’m not sure if they’ve talked about pricing or not. The Pico 4 is significantly lower in price. I don’t recall the number off the top of my head. But it doesn’t have all the features. That’s why I think it’s healthy to see all these devices coming. People can create products at different levels of features and price points and then target different use cases, different markets. That helps us move the industry forward.

You can draw or sculpt with the Meta Quest Pro.

VentureBeat: Keeping everyone’s expectations in line, that’s one thing that’s always hurt this industry in some ways.

Swart: I agree. If you look into video passthrough, that’s an amazing functionality. But this is the first version that folks are putting on the market. There’s still a lot of optimization that can be done, both on the display front and even on how you do processing, how we do the silicon. That can be optimized for video passthrough types of functionality. So I agree. “Expectation” is a keyword here. Even the hype and the buzz around the metaverse–it was great to have that, but then people have a certain expectation. “It’s here now. I can live in a virtual world!” Well, wait. It’s one step at a time.

That’s what I like about what we’re seeing in the XR industry, or metaverse industry. If you look back at 2015 and compare it to where we are now, that’s quite a lot of progress. We have the stand-alone category and all the functionality that’s coming to stand-alones. Stand-alones are also able to pair with compute units like PCs and so forth. There’s a lot of progress we’ve made over the last few years.

VentureBeat: Do you see things yet that could be called the metaverse?

Qualcomm’s Snapdragon supports AR distributed processing with AR2.

Swart: I think so. I think of the metaverse not necessarily as a virtual world, but really as the combination of the physical and digital worlds. I like to equate the metaverse more to spatial computing. In that sense, we’re just early in the metaverse experiences we have. We’re early in spatial computing.

In the early 2000s, when we were talking about the mobile internet, it was very much the same. That’s when I started my career. I was asked, “What’s the killer use case for the mobile internet?” “Well, you’ll watch videos. You’ll have video calls. You’ll have navigation.” It’s not that we didn’t know the use cases. It’s just that the technology wasn’t ready for everyone. It took displays that had the right form factor. It took connectivity.

That’s what we see now with the metaverse and spatial computing. You have all these vectors of improvement, whether it’s the processor or connectivity or displays or the ecosystem. We now have Snapdragon Spaces, which is a set of APIs following OpenXR. We want to make it easier for developers to create content and be available across devices. It’s not an app store. It’s really just perception features. How do we expose these perception features to developers? That becomes a horizontal solution. That’s what we’re aiming for with Snapdragon Spaces.
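Since Snapdragon Spaces follows OpenXR, “exposing perception features to developers” roughly matches OpenXR’s pattern of enumerating and enabling named extensions rather than shipping features through an app store. Below is a hypothetical Python sketch of that pattern; the extension names and functions are illustrative placeholders, not real Snapdragon Spaces or OpenXR identifiers.

```python
# Hypothetical runtime capability table: each perception feature is a named
# extension the device may or may not support. Names are illustrative only.
AVAILABLE_EXTENSIONS = {
    "XR_EXAMPLE_hand_tracking": lambda: "hand joints",
    "XR_EXAMPLE_plane_detection": lambda: "detected planes",
}

def enumerate_extensions():
    """List what this runtime offers, in the spirit of OpenXR's
    extension-enumeration step."""
    return sorted(AVAILABLE_EXTENSIONS)

def create_instance(requested):
    """Fail fast if the app requests a perception feature the device lacks,
    otherwise hand back callable feature handles."""
    missing = [ext for ext in requested if ext not in AVAILABLE_EXTENSIONS]
    if missing:
        raise RuntimeError(f"unsupported extensions: {missing}")
    return {ext: AVAILABLE_EXTENSIONS[ext] for ext in requested}

instance = create_instance(["XR_EXAMPLE_hand_tracking"])
print(instance["XR_EXAMPLE_hand_tracking"]())
```

The horizontal-solution idea is visible in the shape of the API: an app written against the extension names runs on any device whose runtime advertises them, which is what makes content portable across headsets.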

VentureBeat: I don’t know if it’s the healthiest thing I see in the ecosystem, but I see all the creators, all the people who are using Nvidia Omniverse to create enterprise applications. There are all these unexpected things that big developers aren’t necessarily doing. It’s the same on the creator side, with things like Beat Saber coming out of nowhere. That seems like one of the most encouraging things.

Swart: Another thing we talked about in November when we announced AR2 is our collaboration with Adobe. I really like what these guys are doing. They’re creating tools for 3D content, 3D assets, for non-coders. That’s the kind of thing we need, so that anyone can create content. You need tools for regular people to start creating. It goes all the way from full-featured digital twins that need to work exactly like they do in real life, back down to me creating in whatever way I want to create.

We’re seeing a lot of these things come together. I’m not sure if you played with ShapesXR, but that’s a very nice application. That’s also creation-related, 3D creation. There’s good progress on that front.

VentureBeat: What about things like USD and what that might lead to as far as metaverse standards?

Swart: We’re supporting and working toward USD, but also glTF. It’s not like we’re religious about one or the other. We see that these standards, these formats are gaining traction. On the client side we want to be optimized. Coincidentally, that’s also what we talked about at our Snapdragon Summit with Adobe. With Adobe we’re looking at USD and how to make it optimized for mobile. By which we mean smartphones, but also headsets.
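For a sense of why glTF is attractive on the client side: a glTF 2.0 asset is plain JSON, compact enough to sanity-check in a few lines. The sketch below builds a minimal asset and checks it against two rules from the glTF 2.0 spec (a required `asset.version`, and a default `scene` index that must point into `scenes`); `is_plausible_gltf` is a hypothetical helper, not a full validator.

```python
import json

# A minimal glTF 2.0 document: the only required top-level property is
# "asset" with a "version" string.
gltf = {
    "asset": {"version": "2.0", "generator": "example"},
    "scenes": [{"nodes": [0]}],
    "scene": 0,
    "nodes": [{"name": "root"}],
}

def is_plausible_gltf(doc: dict) -> bool:
    # Two sanity checks drawn from the glTF 2.0 spec: asset.version is
    # required, and the default scene index must be in range.
    asset = doc.get("asset", {})
    if asset.get("version") != "2.0":
        return False
    scene = doc.get("scene")
    return scene is None or 0 <= scene < len(doc.get("scenes", []))

# Round-trip through JSON to show the asset is a plain transmission format.
print(is_plausible_gltf(json.loads(json.dumps(gltf))))  # True
```

USD covers much richer scene description and composition, which is part of why optimizing it for phones and headsets, as Swart describes with Adobe, is its own engineering effort.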

Qualcomm is making a lot of Snapdragon chip variants for XR.

VentureBeat: I see very interesting activity at Nvidia, with all of the Omniverse work and support for USD. But it feels like some of the other companies are maybe too quiet. Intel is kind of quiet. They’ve supported it, and AMD as well, but more companies could be as enthusiastic as some of the leaders are.

Swart: It’s not that we don’t support it. We’re just focusing more of our energy on the bigger challenges. I’m not saying that’s not a challenge. But there are challenges where we can make a bigger impact. Things like power and size and functionality, making all that available to developers. That’s where we’re concentrating. Also, it’s important to have these standards and things to help eliminate or diminish fragmentation.

VentureBeat: Do you feel like we’re on a yearly cadence now for new generations?

Swart: Not yet. Not yet. If you look at XR2, our VR product line, with XR2 Gen 1 we had the Plus. I’m going to keep a little suspense as far as Gen 2. But there’s a lot that can be improved around a given platform, meaning the processor: the software, the displays, the things that go around it. Creating a new chip every year, first, doesn’t make economic sense yet. Second, it’s disruptive. Some of these things take time to optimize to a given platform. By the time you’ve optimized, if you suddenly need to move to the next one, you’re always playing catch-up.

The smartphone business, the PC business, maybe these are mature markets and that makes sense. But for XR it’s better to concentrate effort in the industry on a given platform. There’s a right time. I think the interval is going to decrease, but right now a yearly cadence is too fast.

Hugo Swart is an optimist about the future of XR.

VentureBeat: Do you think it’s worrisome that Moore’s Law is slowing down at a time when we’re all supposed to be heavily investing in the metaverse?

Swart: Yes and no. The nice thing about engineers is that if you give them a problem, they’ll find a way around it. AR2 is an example. I don’t have the same kind of gains I get on the process node. How do I overcome this problem? Oh, it’s by distributing the processing. Ideally, of course, I’d like to have it all in one place. I’d like to have it with 5G. I’d like to have everything, and maybe I’d have it if Moore’s Law continued at the same pace. Without it, we need to be more creative and find ways to work around it.

Yes, it’s concerning. But we need to come together on ways to create the experiences that everyone wants without counting on process node benefits.
