Connect with top gaming leaders in Los Angeles at GamesBeat Summit 2023 this May 22-23. Register here.


Engineers are an interesting cross section of practical thinking and creative vision. When you give them a problem like the metaverse to work on, they’re going to ponder it in a different way.

Science fiction writers and Hollywood creatives have done a good job painting the vision of the metaverse. But the engineers are the ones who have to think about building it. To get a flavor for the practical side of engineering the metaverse, I talked to Thomas Coughlin, president of Coughlin Associates and president-elect of the IEEE engineering society.

Coughlin is an IEEE Life Fellow who has been providing market and technology analysis services for more than 40 years. He holds six patents, spent 40 years in the data storage industry, and has been consulting for the last 20 years.

Before starting his own company, Coughlin held senior leadership positions at Ampex, Micropolis, and SyQuest. He is the author of Digital Storage in Consumer Electronics: The Essential Guide, which is in its second edition. He is a regular contributor on digital storage for the Forbes blog and other news outlets.


I’ve gone to many of the same events with Coughlin for years, and we talked a bit about the upcoming CES, but most of our conversation focused on how to build the metaverse.

Here is an edited transcript of our interview.

Tom Coughlin is president-elect of IEEE and founder of Coughlin Associates.

GamesBeat: The metaverse would require a real-time internet. We’re not where we need to be for that. When I think of where the internet needs to be, the best thing that comes to mind is the Comcast announcement that they’re going to have two-way 10G, and latency should be better as well. That latency is really the thing that can kill online games. But on that front I’m curious whether you see that kind of infrastructure coming into place in time for what everybody wants so they can deploy a real-time metaverse.

Tom Coughlin: And at a price that people can afford. It’s going to take a while to get that kind of experience. There are networking constraints. A lot of metaverse stuff is based on wearable equipment, things of that sort. We have constraints on battery life. A lot of the headsets, one to three hours is what you get out of a charge. Unless you want to wear a backpack — people have offered those backpacks. But it’s inconvenient. You look weird. You could pretend it’s your ammo pack for a game, I guess.

A lot of the technology is coming together to make this possible. One of the questions for IEEE, if we get down to that lower part of the stack, is: can we come to some common terminology? Can we develop open standards for how you do this stuff? That would make it easier to accelerate building the infrastructure that makes various types of extended reality experiences more real.

GamesBeat: There’s progress that needs to be made on so many fronts, but that basic internet infrastructure has to grow up as well. People have suggested that a true metaverse experience means getting lots of people together in the same space, like a concert. If you did an all-digital concert where people individually interacted with each other, could hear each other with 3D audio, and could see that there were 10,000 people in a stadium with them, that would be a metaverse experience.

I saw that demo by Improbable and the Bored Ape people, Yuga Labs. They have something called Otherside. They did an experience like that with 4,500 people in one space. It’s interesting that there’s some technology out there that could get us beyond just 100 people in a space, which is what Fortnite does.

Coughlin: If you’re going to get any kind of resolution, anything that acts like people would act, especially with a social element, your networking will be very important. You’re going to need awesome networking capability if you’re going to get hundreds or thousands of people together and have it act like real life.

Yuga Labs had a real-time demo of 4,500 players with 3D audio and full physics.

GamesBeat: What is the barrier there? Is it somebody’s law?

Coughlin: First of all, you have speed of light issues. If you’re further away, you’re going to have built-in latency. If you’re on Earth, that’s generally not that bad. The biggest constraint often is your local connectivity. Getting on the big pipes from whatever little pipe you have. A lot of people still have crummy internet hookups. Even in Silicon Valley, sometimes it can be sketchy depending on who you work with.
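Coughlin’s speed-of-light point can be made concrete with a back-of-the-envelope calculation. This is an illustrative sketch, not from the interview; the factor of roughly 1.47 for light slowing down in optical fiber is a typical figure, and real networks add routing and queuing delays on top of this lower bound.

```python
C_VACUUM_KM_S = 299_792  # speed of light in vacuum, km per second
FIBER_FACTOR = 1.47      # typical refractive index of optical fiber

def min_rtt_ms(distance_km: float) -> float:
    """Lower bound on round-trip time over fiber, ignoring routing and queuing."""
    one_way_s = distance_km * FIBER_FACTOR / C_VACUUM_KM_S
    return 2 * one_way_s * 1000.0

# San Francisco to New York is roughly 4,100 km as the crow flies.
print(f"{min_rtt_ms(4100):.1f} ms minimum round trip")
```

Even with perfect infrastructure, physics alone puts a cross-country round trip in the tens of milliseconds, which is why local connectivity matters so much: it is the part that can actually be improved.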

GamesBeat: A lot of the people speaking about the metaverse are saying that you have to include those people. You have to include not only VR headsets but desktops and laptops and smartphones. People should be able to access the metaverse through any of those things.

Coughlin: Which means you’re going to have to do some kind of compression. If you don’t want horrible latency, you’ll have to do an awful lot to make it easier for people on limited 3G connections versus 6G connections, you know? You have to make some compromises to get everyone on there. There are a lot of technology factors. We have the technology to do this to some extent, but the ability to do it has a lot of growth ahead, especially since the metaverse is supposed to be a social thing. To get that social element, you’re going to have to handle a lot of different people coming in with a lot of different connectivity and make that somehow work.

A digital divide, a network divide, a reality divide, whatever you call it, between those with bad connectivity and those with good connectivity imposes constraints. If a lot of things move to the metaverse, from entertainment to even education, then the people who have money can get the connectivity they need, and the people who don’t probably won’t; they’ll be dealing with whatever they can get. What can you do about that to make it more equitable, so more people can be part of whatever this modern economy is going to be?
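The compromises Coughlin describes usually take the form of adaptive quality tiers: serve each client the richest experience its connection can sustain. The sketch below is hypothetical; the function name, tier labels, and bandwidth thresholds are made-up examples, not figures from the interview.

```python
def pick_quality(bandwidth_mbps: float) -> str:
    """Pick a delivery tier for a client. Thresholds are illustrative only."""
    tiers = [
        (50.0, "full-resolution 3D with spatial audio"),
        (10.0, "compressed 3D with spatial audio"),
        (2.0, "low-poly 3D with mono audio"),
        (0.0, "2D video fallback"),
    ]
    for floor, label in tiers:
        if bandwidth_mbps >= floor:
            return label
    return "2D video fallback"

print(pick_quality(75.0))  # a fast fiber connection
print(pick_quality(3.5))   # a weak mobile connection
```

Streaming video services already work this way; the open question is whether a shared social space stays coherent when participants are seeing it at very different fidelities.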

GamesBeat: If you look at that one problem, right now a lot of games can get 100 or 150 people in an instance, in one world where they can interact with each other. If we take that leap to where we want 1,000 people, what’s involved in making that possible?

Coughlin: And all interacting. The more people, the more simultaneous exchanges are possible. Complexity probably goes up with something like the square of the number of participants. It’s at least that; maybe even a higher power. The more people you have, the more connections you have, and the more communication might be going on. That complexity puts a lot of pressure on the infrastructure to support getting those people in there.
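Coughlin’s square-of-the-participants estimate matches simple combinatorics: if every participant can potentially interact with every other, the number of distinct pairs grows as n(n-1)/2, which is on the order of n². A quick sketch of why going from 100 to 1,000 players is roughly a 100x problem, not a 10x one:

```python
def pairwise_channels(n: int) -> int:
    """Distinct player-to-player pairs when everyone can interact with everyone."""
    return n * (n - 1) // 2

for n in (100, 1_000, 10_000):
    print(f"{n:>6} players -> {pairwise_channels(n):>11,} pairs")
```

In practice engines prune most of these pairs with interest management (only sync what each player can perceive), but the worst case is what the infrastructure has to be designed against.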

GamesBeat: There’s a reason we’ve been stuck at this limit for many years now.

Coughlin: I think that’s part of it.

GamesBeat: The other way the Epic Games people were good at expressing this was something they called “the sniper and the metaverse.” You put a sniper in a stadium filled with people, or just up on a mountain or something, and they could scope in on one individual and take a shot. But you don’t know who they’re going to target.

Coughlin: Only the sniper knows.

AleXa’s virtual concert could be a forerunner of the metaverse.

GamesBeat: You might have 10,000 people visible to the sniper [like in a Hitman game], but only the sniper knows where they would go. You have to do that instantly. The movement has to be synchronized. What they’re saying is that crosses server lines. Usually you have a grid, a play space that’s handled by one server. But if you have this distance where you can see for a mile, then that’s probably going to cross multiple (servers).

Coughlin: The distance in the virtual space may not at all relate to distance in real space.

GamesBeat: It was explained to me as: You would cross server lines if you were having such a wide viewing distance. The servers usually were constrained in some way such that they could only show a certain geography.

Coughlin: It depends on how they break up the computation.

GamesBeat: Any time you cross the line you would lose the real-time nature of things.

Coughlin: If you’re going between one server and another, there’s going to be latency built into that. As soon as you get communication out of one box, there’s built-in latencies around that. Now, there’s a lot of technology coming into play within data centers that may help a lot in the future. For instance, one of the biggest consumers of energy and delays is moving data around between memory and compute. Including rendering and things like that.

There are new technologies that could allow you to do that at data centers — CXL is one of these. It’s an interface that allows you to switch and pool memory. I can pool memory, share memory between devices, build up virtual machines that potentially span multiple servers, or create servers as I need them with various resources, in this case memory. I can have some directly attached memory, and then shared memory with a little bit higher latency.

One thing involved in that as well is the idea: can I compute closer to memory? Then you reduce the delay and energy consumption of moving data around. Within data centers, people are developing what they call computational accelerators. These are located closer to the memory, where the data lives, and they can do certain functions and offload the CPUs. These things should help reduce certain latencies. I believe it would also impact things like games and metaverse performance as well.

GamesBeat: It was encouraging to hear Intel talk at IEDM about how they don’t think Moore’s Law is dead.

Coughlin: They’re doing all kinds of stuff. Chiplets. Getting finer lithographies is getting more and more expensive. The lithographic equipment, the extreme ultraviolet stuff, costs hundreds of millions of dollars. The next generations are going to cost even more. They don’t want to use that in everything. The idea of chiplets, for example, was I only use that where I need that, which gives an advantage. This is all part of this disaggregation of traditional server architectures, creating what they call composable infrastructure, where I can build stuff up as I need it.

In this case, they’re deconstructing the chip — this is not programmable, but they’re deconstructing the chip into little pieces. Then they put those on a connected substrate. I can have some memory separate from my computation, but close enough that I can get good performance. It gives me a lot more options. It allows me to do more scaling. Also it’s more cost-effective than trying to do everything with the high lithographic nodes. A lot of that stuff could impact embedded devices. That’s where we’re getting into things that could be on the network edge, or in the wearable devices.

Can we auto-generate art for the metaverse? Image source: Nvidia Research

GamesBeat: I was thinking that it would be a tragedy if Moore’s Law came to an end right when we got to the metaverse.

Coughlin: Yeah. All of a sudden we can’t do anything for years. But no, I think we’ll be able to address a lot of this stuff in a number of different ways. The other thing is metadata, which is information about the stuff you’re moving around, that you’re doing stuff with. That could be an important source for optimizing network performance.

If I know something about how far something is, could I cache stuff up and do things so that the latencies don’t appear to be as bad? If I’m playing a game, can I have the game seem realistic and not pause? Even if it doesn’t have the information yet, it does something that is in line with the nature of the game because it understands the game. Can you do things like that, so that even if you do have issues with your connectivity, you can build smarts into the system, your game system, so it deals with that effectively, so it has the least impact on the players? They still feel like they’re engaged. Things may not respond as fast if you’re in Antarctica playing somebody in Greenland, but you can still get a reasonable gameplay experience.

There are definitely physical constraints on things we can do. There are technological constraints in terms of what we know how to do yet. But there are also ways to mitigate that, things that could continue to let us have good experiences with whatever infrastructure you’ve got, while we’re improving and making the infrastructure better. We’re continuing with something like Moore’s Law, only it’s not Moore’s Law anymore. We need more performance over time. Faster responses and all this stuff. There are a lot of technologies going on right now, spanning everything from networking to compute to computer architectures to memory and storage.
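The “build smarts into the system” approach Coughlin describes is what game networking calls client-side prediction or dead reckoning: when the next authoritative server update hasn’t arrived yet, extrapolate from the last one so the player never sees a stall. A minimal sketch (the class, function name, and constant-velocity model are illustrative, not from the interview):

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    t: float    # server timestamp in seconds
    x: float    # last known position
    vx: float   # last known velocity

def predict_x(last: Snapshot, now: float) -> float:
    """Extrapolate position, assuming the entity kept its last known velocity."""
    return last.x + last.vx * (now - last.t)

snap = Snapshot(t=10.0, x=5.0, vx=2.0)
print(predict_x(snap, 10.25))  # extrapolates 0.25 seconds ahead of the server
```

When the real update finally arrives, the client smoothly corrects toward it. The player in Antarctica still feels responsive gameplay; the cost is occasional small corrections rather than constant waiting.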

Neal Stephenson and Dean Takahashi talk about turning science fiction into reality.

GamesBeat: Are you optimistic about the metaverse, then?

Coughlin: Well, I’m optimistic about the concept of creating immersive realities that extend what human beings can do. The metaverse, Neal Stephenson coined that in Snow Crash, which is kind of a dystopian novel. Facebook grabbed on to Meta, metadata and stuff. I think we’ll be having these kinds of experiences, whatever you want to call it. New ways to extend reality. If it ends up being called the metaverse or something else, I think that’s something that will be very important.

That’s going to be involved in things like telepresence. I could remotely seem like I’m somewhere else. It could even be with little robot things that roll around. I see them at some of the convention centers. You have your face on a tablet or something like that. Maybe more sophisticated versions of that down the road. Could you make a human-like robot that could be somebody else for a while in some other location, so you don’t have to travel? Give you sensory experiences that span the gamut of what we could do. Those are all possibilities. It just takes a while to build the capabilities to do that.

Also, I think it will be accelerated by having standards that are underlying it, especially if they can be open source. Allowing people to make things that will work better together. No one outfit will be able to make this work. We can only make it work as an industry. That’s where standards come in. There are activities going on in IEEE that are trying to address some of that.

GamesBeat: As far as CES goes, it looks like we’ll hear a lot about the metaverse there.

Coughlin: Oh, I think so. CES is always interesting. There are things that make sense and things that don’t make sense. One place I really like is Eureka Park. It’s sort of the cheap seats. You find startups and people like that. It goes all the way from the silly to the sublime in terms of the things you find there.

GamesBeat: I don’t know where we are on the semiconductor or electronics content in cars now.

Coughlin: It’s over 50% of the cost of the car, I think, at this point. Especially when you get into the electric vehicles.

GamesBeat: KPMG is now saying that they expect automotive to drive semiconductor revenues in the next year, in contrast to wireless communications. Wireless, with smartphones, just seems so gigantic. I didn’t realize that automotive had any chance of surpassing it.

Coughlin: Automotive was really hit by the chip shortages. When they started to order stuff again, they found out that — the thing about automotive is they go through this really rigorous qualification. Once they qualify chips, and anything else, they want to keep getting them for decades. The problem is that technology like semiconductors doesn’t stand still. Ten years is more like mayfly years in the life of the technology. You get these old nodes, and there are very few places that can still make them. When automakers cut their orders and then want the parts again, some of the sources aren’t available anymore.

Automotive is responding to that. I think they’re trying to get more modern technologies where they can. They still have all the safety stuff they have to do. That limits what you can do. And there are a lot of semiconductors being built. Some of that is going to be supporting automotive. Automotive is certainly not the biggest driver, but it’s going to be a significant driver for the next few years.

Jensen Huang is CEO of Nvidia. He gave a virtual keynote at the recent GTC event.

GamesBeat: I don’t know if you’ve listened to a lot of Jensen Huang’s talks, but there’s an interesting bridge that I see between enterprise and games through something like the Omniverse. He was saying that they’re going to use the Omniverse to build the digital twin of the earth, the Earth 2 simulation, so that they can really predict climate change for decades to come. They would apply all the world’s supercomputers to this problem and try to simulate the earth with meter-level accuracy, so they could have the most accurate forecasting possible. And then I asked him, “Does that mean you get the metaverse for free?” And he says, “Yes, you get the metaverse for free.”

After designing this in the Omniverse, it theoretically should be reusable. If some video game people out there want to create a planet-size world and auto-generate a lot of it, they couldn’t find enough artists in the world to hand-create everything they want. They would rely on generative AI for a fair amount of it. But if they’re getting this handed to them for free and it’s reusable, then that makes the metaverse so much easier to implement.

Coughlin: It’s really a commoditization of technology, which is a long-term trend. A lot of new tech is first implemented in data centers, places that can spend higher amounts of money that’s brand new. They can get economic value from it. As that technology matures and you do more of it, it gets cheaper. That’s the general trend. Most technology gets commoditized.

If you look back, some of the stuff we’re calling the metaverse, these technologies like heads-up displays and virtual reality, they’ve been around for decades, but they’ve been extremely expensive. The Air Force uses it for pilot training, for pilots to be able to know what’s going on. If it costs a few million dollars, it’s still a fraction of the cost of a fighter plane, and it makes it work better. If you do more of this stuff, you make more of this stuff, the cost goes down, and then that allows it to become commoditized. More people can access it.

It’s the other Moore’s statement, Geoffrey Moore’s. You get something with enough niche applications that it can go into high volume. Then the costs go down and it becomes commoditized. There’s a very good chance that the cost of all the things that come together to make what we might call a metaverse, or some kind of extended reality, is going to go down. It’s going to become a part of everyday life in the not too distant future. Ten years?

GamesBeat: Some of that starts to feel like the space program. We got Tang. We got Velcro.

Coughlin: Freeze-dried food!

Geoffrey Moore onstage at Demo Fall 2011

GamesBeat: The metaverse could lead to these unexpected benefits.

Coughlin: That’s true. The other thing, and other people have said this, is that there will probably be new jobs and economic opportunities. We keep remaking what it is that people do. More of that is going to happen. Data is like the new oil, all these things that people glibly say, but it’s really true. All these are tools for us to interact with each other and the world around us. It’s going to be part of our economy. They’ll be important drivers.

GamesBeat: So you’re not in the camp of the curmudgeonly engineers who think this is never going to happen?

Coughlin: Oh, it’s going to happen. It will probably be able to do more than what we think it could, in ways which we can’t think of how it would do that. People will come up with things to do with something. You mentioned the Bored Ape Yacht Club. Who would have thought that would be a thing? That people would pay that kind of money for stupid avatars of apes with yachting hats on? There will be things we can’t anticipate. I guarantee that’s going to happen.

Meta Quest Pro on a charger.

GamesBeat: How soon do you think we can get great AR (and even VR) headsets that are the size and shape of ordinary glasses?

Coughlin: To some extent there are AR/VR headsets that are close to the size and shape of ordinary glasses.

However, the real question in terms of practicality and usefulness is how soon can we get AR/VR headsets that look like — and hopefully weigh about — what modern glasses do, with 4K resolution or higher, at high enough frame rate and that can operate for several hours on a single charge while being affordable.

I estimate, based upon developments in battery technology, processing, display technology, memory, and materials, that it will take 5 to 7 years for there to be a viable product in high-volume production.
