Two recent announcements signal that game developers are realizing that, for cloud gaming, matching servers to a player's location and placing them closer dramatically decreases lag and improves the experience. They are Google's new cloud-based game service Stadia, and news that Pokemon Go developer Niantic has built a mixed-reality "multiplayer experience" that runs on an edge-cloud service.

But the two announcements work in very different ways. In particular, the March announcement of Stadia, designed to run on just about any device (PCs, Android phones, Chromecast, and more), has left me wondering: is Google doing it all wrong?

The idea behind Stadia is to run games on centralized servers, taking in controller inputs and sending back video and audio over Google's network of low-latency data centers. But do we need large, inefficient streaming services that process games in the cloud and stream video back to users? Leveraging decentralized networks that run at the edge of the network, rather than sending data to a few centralized locations for processing, offers a much more affordable and scalable approach to online gameplay. Data travels only the minimum necessary distance, reducing lag and enabling more interactive and immersive in-game experiences.

Edge computing is well suited to multiplayer gaming, which is both latency-sensitive and bandwidth-intensive. In a centralized model, gaming data is processed in the data center or the cloud; in a decentralized, edge computing model, initial processing happens close to the gamer. Combine that with a decentralized network of globally dispersed nodes, and multiplayer latency can reach single-digit milliseconds, dramatically reducing lag.
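Simple speed-of-light math shows why proximity matters so much here. A minimal sketch, where the distances chosen are purely illustrative assumptions, not measurements of any real deployment:

```python
# Back-of-envelope round-trip propagation delay over fiber, illustrating
# why server distance dominates multiplayer latency. Distances below are
# illustrative assumptions, not measured values.

SPEED_OF_LIGHT_KM_S = 300_000   # speed of light in vacuum, km/s
FIBER_FACTOR = 0.67             # light travels at roughly 2/3 c in optical fiber

def rtt_ms(distance_km: float) -> float:
    """Round-trip propagation time in milliseconds over fiber."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

for label, km in [("edge node, ~50 km", 50),
                  ("regional data center, ~1,500 km", 1_500),
                  ("distant data center, ~6,000 km", 6_000)]:
    print(f"{label}: {rtt_ms(km):.1f} ms round trip")
```

Even before adding processing, encoding, and last-mile delays, a nearby edge node keeps propagation alone well under a millisecond, while a continent-distant data center burns tens of milliseconds on physics alone.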

You could think of edge computing as platform-as-a-service, but one that is decentralized and distributed at strategic points along the network, improving performance and reliability. This makes for a scalable, elastic solution, unlike the massively inefficient one Stadia proposes.

The limiting factor is the cost of building enough infrastructure to overcome distance, and therefore reduce latency, to make streaming scalable and viable as a business.

Ensuring there are enough GPUs near gamers to make cloud gaming work carries a huge cost per user per hour. There is no denying that streaming game distribution will revolutionize the business, especially as broadband infrastructure continues to improve worldwide and 5G technology rolls out. But even if you're Google, this all needs to generate revenue at some point.

Though delivering content based on user location certainly captures the ethos of edge computing, companies like Google and Amazon still rely on their existing data centers to achieve it. Whether those data centers can accommodate large numbers of gamers (and whether they can do so profitably, given the sheer processing power required) remains to be seen. And that's without even trying to overcome "last mile" hurdles. Even assuming it is feasible, the service will appeal only to those close enough to communicate with the data centers.

Latency — the killer issue

Latency can destroy the user experience; a game needs to respond instantly to keystrokes or controller inputs. Every command must travel over the network in each direction and be processed by the data center fast enough that the gamer feels the game responding to each keyboard and mouse stroke in real time.
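To make that round trip concrete, here is a hypothetical input-to-photon latency budget. Every component value below is an illustrative assumption, not a measured figure for Stadia or any other service:

```python
# Hypothetical end-to-end latency budget for one input-to-photon cycle in
# a cloud gaming setup. All component values are illustrative assumptions.

budget_ms = {
    "controller input sampling": 8,
    "uplink to server (one way)": 15,
    "server game tick + render": 16,
    "video encode": 5,
    "downlink to client (one way)": 15,
    "video decode": 5,
    "display refresh": 8,
}

total = sum(budget_ms.values())
print(f"cloud input-to-photon latency: {total} ms")

# Local play skips the network and the encode/decode stages entirely.
local = total - budget_ms["uplink to server (one way)"] \
              - budget_ms["downlink to client (one way)"] \
              - budget_ms["video encode"] \
              - budget_ms["video decode"]
print(f"equivalent local latency: {local} ms")
```

Under these assumptions, cloud delivery more than doubles the delay a local machine would show, and the two network legs are the only components that shrink by moving servers closer to the player.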

Google's Stadia team has tried to reassure players that game streaming should provide a smooth, full-resolution experience on Internet connections above a threshold of 20 to 30 Mbps, yet many see this as a suboptimal way to play certain genres: shooters, fighters, racers, and sports titles. The lag will be noticeable even on good connections.

By combining edge computing and distributed networks, gaming companies can now escape a situation where they either have to force users to download massive files — something that will only get worse as games like Fortnite are updated every few weeks — or have to build out massive infrastructure and pay the cost of GPUs.

Yet leaving all the hard work to data centers — the Google and Apple model, where the server computes the physics, rendering, and movement and sends a video stream of gameplay to the gamer's device — is hugely expensive. And Google has yet to explain its solution. Google's Phil Harrison has been quoted as saying: "I know [Stadia] won't reach everybody [and] I respect that some people will be frustrated by that." We are still waiting for details on Google's latency mitigation efforts, or the threshold of additional latency the company would consider acceptable.

Fortnite's record concurrent player count now stands at 10.8 million. More than ten million simultaneous players means more than ten million separate processing loads — roughly ten million GPUs. And since not everyone plays at the same time, the fleet must be sized for peaks, building in huge redundancy that sits idle the rest of the time.
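A back-of-envelope sizing exercise illustrates the redundancy problem. Apart from the 10.8 million peak cited above, every number here is an assumption for illustration only:

```python
# Back-of-envelope GPU fleet sizing for a Fortnite-scale streaming service.
# All values except the 10.8M peak (cited in the article) are assumptions.

peak_concurrent = 10_800_000   # record concurrent players
avg_concurrent = 3_000_000     # assumed typical off-peak load
sessions_per_gpu = 1           # assume one full render pipeline per player

peak_gpus = peak_concurrent // sessions_per_gpu
utilization = avg_concurrent / peak_concurrent

print(f"GPUs needed to cover the peak: {peak_gpus:,}")
print(f"average fleet utilization:     {utilization:.0%}")
```

Under these assumptions, most of a fleet provisioned for the peak sits idle most of the time — capacity the operator pays for whether or not anyone is playing.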

The lower the latency between a game console or gaming PC and the backend server, the lower the lag. Game-streaming services like Sony's PlayStation Now and Nvidia's GeForce Now are placing a lot of faith in edge computing to enable their success, and the rise of competitive gaming suggests that the massive gaming community is willing to pay a premium for a better experience.

Neeraj Murarka is a scientist and technology entrepreneur, and is also co-founder and CTO of Bluzelle Networks.