Ever since releasing the original Grand Theft Auto in 1997, Dave Jones has been obsessed with making believable worlds. He went on to build more recent online game worlds like the one in All Points Bulletin (APB), and at each point along the way, he ran into huge technological obstacles.
Now, he serves as president of Cloudgine, which has created a cloud-based game engine to help alleviate the problems of running online titles. The company released a demo game called They Came From Space to show off the engine, which is designed to take advantage of cloud computing and produce massive physics-based simulations.
Jones started Cloudgine in Edinburgh, Scotland, in 2012 to develop cloud computing technology for making games. So far, studios have used the tech to create the upcoming Microsoft exclusive Crackdown 3 and Facebook’s virtual reality title, Oculus Toybox. Cloudgine can be used to make massive online games for virtual reality.
They Came From Space is a “proof of concept” game for the PC, Oculus Rift, and HTC Vive virtual reality headsets. We talked to Jones about the demo, where as many as 10 players can participate in the same game world at the same time.
They Came From Space uses a funny art style and tone that borrows heavily from classic 1950s B-movies about alien invasions. In the demo, you can annihilate entire cities while playing as a massive alien. As the destruction unfolds, you can see tons of individual particles floating in the air, pushed along by physically accurate wind. We talked with Jones about how tough it is to make these games.
Here’s an edited transcript of our interview.
GamesBeat: I wondered why you guys went ahead and did the new game as a proof of concept when you already had Crackdown 3 and the Oculus Toybox.
Dave Jones: It was mainly to show we could support additional players in VR, beyond those in Toybox. One of the benefits of running a bunch of simulations in the cloud is that we can connect a lot more players, pretty much for free. We wanted to demonstrate something with that.
GamesBeat: How many more players can get into the game?
Jones: On the server side, it scales really well. It’s pretty much a completely scalable architecture. You can just add players and add players. At some point, you’re going to hit a rendering limit on the client. But from a backend infrastructure perspective, that’s one of the big benefits of building the simulation in the cloud. It scales pretty much infinitely. We can just add more and more CPU threads to keep building up the simulation. Right now, the demonstration we’re doing live is just going to be 10 players. But that’s only because that particular demonstration needs to hit 90 frames per second for VR.
The other thing, which we don’t have in either Crackdown or Toybox, is that we actually attach a cloud GPU to the game as well. We use that to give the VR player a selfie camera, basically. Streaming from your own first-person perspective just isn’t very conducive to streaming. A 2D view from a VR perspective doesn’t look very good. But one thing we liked was, because the VR player is the main protagonist, we gave him another camera — effectively, a virtual camera that’s rendered in the cloud. It doesn’t really matter where it’s coming from … from a streaming perspective. It’s neat for streamers because obviously, that’s a new method of discovery for games. It gives them a controllable camera, so they can have it looking at their face or off to the side.
GamesBeat: Are they streaming that in VR, then, or in a 2D perspective?
Jones: To a 2D screen. It’s kind of like having a third-person camera. If you watch the video, you can see that everything’s taken from a third-person camera because it looks better that way. But it’s coming from a camera properly set up for the game. It’s not like a dev camera. It’s a camera we can give to VR players, like attaching a virtual selfie stick to themselves.
That’s a good example of — a lot of people have talked about doing actual rendered streaming for a full game. We think, from a streamer’s perspective, it makes sense because you’re sending from just one GPU to potentially hundreds of thousands of viewers. It could be applied to any game. If devs want to, they can start applying really good cinematic cameras to their games and give control of those to streamers or players. It’s a better way to stream a different perspective on the game.
GamesBeat: What are some good things to notice as far as what the game is capable of, as we look at the demo? I saw all the physics-based elements, the pieces of the buildings flying around.
Jones: Yeah, there’s a lot of that. Nothing’s scripted. Everything is completely physical. It’s small things. If you wave your hands as a VR player, or if you watch the downdraft from the drones, that’s all modeled, so the trees bend based on wind and things like that. It’s all the small neat touches that make the world much more physical, which I think players will come to appreciate.
From a VR perspective, a lot of VR worlds are a little bit sparse because you’re trying to put all your power into rendering. This goes the other way: you get a very nice interaction with a VR world where everything is physical. That makes a big difference, and it is quite neat for a VR player to get that level of physicality and density.
GamesBeat: How are you able to do some of that? Are you offloading rendering tasks into the cloud?
Jones: That demonstration used Unreal and PhysX. Typically, in any game, the maximum budget for physics [is probably] going to be about 20 percent of your CPU. Other things like AI, pathfinding, and audio all have to fit in the budget, too. Normally, running something like PhysX, you can give maybe 20 percent of your CPU to it.
With this, we run PhysX completely in the cloud, and we run multiple instances of it. We used something like six instances of PhysX for that game. Effectively, you can have six times that budget, or even more, because each instance can run on its own CPU. You could have 200 percent of the CPU you’d normally see in a game applied just to physics.
We’re about to do something similar with AI as well. Typically, you have about 10 percent of the CPU budget in your game for AI. What if you could have a whole CPU, multiple CPUs, just dedicated to AI? It’s taking each of those incrementally, pushing more of them to the cloud and then scaling them up. But it’s all based on technologies developers know. They know PhysX. They know Unreal. They know how to do pathfinding and so on. We’ll start to push a lot more of those services to the cloud — but in a very usable way — using very normal game technologies.
GamesBeat: What are the different things that Crackdown 3 or Toybox used or showcased compared to what this demo does?
Jones: Crackdown 3 was using Havok in the cloud. It was more about doing complex structures, complex simulations of large buildings. However you destroy a building, the stresses throughout the superstructure would be calculated properly, and the building would fall over in a completely non-scripted way. It was applying heavy compute to things like building superstructure.
In They Came From Space, we’re applying it more to forces from the VR players. The VR players get access to these huge arsenals of super weapons, these massive destruction beams that cut through the landscape and destroy all the buildings in their path. It’s just a different application of how game design can creatively use physics.
With Toybox, it was more about super-fast interaction in a virtual space, a social space, where you’re both playing with the same things. Any latency would have been immediately noticeable at 90 frames per second when you’re stacking blocks and passing objects to another VR player. So, that was about solving problems around very smart ownership, how we transfer ownership between players in the backend, in the cloud.
GamesBeat: Can you talk about the reason for Cloudgine and the need to bring the cloud into modern gaming?
Jones: It’s really to open up new creative opportunities. These days, we’re mostly limited by CPUs — on console and PC alike. There’s been a lot of hardware evolution in GPUs. For the last two generations of consoles, it’s mostly been about higher resolutions and HDR. This last generation is all about 4K. But rarely have we seen these kinds of huge pushes in compute power, which I believe offers the most interesting creative opportunities for new games. That controls things like AI and physics, all the things that open more doors from a design perspective.
Our goal was to start to offer developers an easy path to taking game systems that they know and saying, “If you want, you can run some of these systems as they are in the cloud. The benefit you’ll get is we’ll make them scale — from 1X to 2X to 10X. We’ll make it very easy to pick up any of these technologies and make them scale by running them in the cloud.” That way, we can start to build huge, complex simulations and dynamic worlds that just can’t be done today.
Even today, when you look at typical game worlds, they are very static. You can fire off rocket launchers and stomp around in huge mechs in a game like Titanfall, but ultimately, everything you come across in the world, every structure, is static. We like to think these worlds are very dynamic, but they’re still — the [polygons] are set. They’re indestructible. That’s one example of where we could start to make inroads on something a little bit more dynamic.
GamesBeat: If you look back on things like Crackdown and APB, were there things you wished for there that you have now?
Jones: We think of the cloud as being new, but it’s not really. It’s just started having an effect on gaming infrastructure in the last five years. Going back to APB in 2010, when you made an online game back then, you had to invest in your own dedicated servers. That was really expensive and very hard to manage. You were never quite sure. You had this huge spike at launch, and then things settled down. The cloud has come along in the last five years and taken a lot of those problems away. The cloud is what brought gamers dedicated servers, and gamers understand the value of that. It’s hard to launch a game now that isn’t on dedicated servers. Even if they’re cloud dedicated servers, it’s a huge advantage.
Behind the scenes, then, this has been happening, anyway. Cloud has been affecting gaming in a positive way. We’re just seeing newer and more unique ways to do that. Our goal is to find more ways to leverage the awesome compute capability that’s out there and start to make it more accessible to developers. It is a different skill set, a skill set they don’t have. There’s a lot of hard work to do in making the cloud accessible for compute. We want to make it very easy and very cost effective to get developers using these kinds of services.
GamesBeat: Are there ways this is either similar to or different from things like Improbable — or the folks who started the MaxPlay game engine?
Jones: There are a couple of key differences. Both of those are good examples of — if you’re going to jump in there, it’s a whole paradigm shift in how you develop. Improbable is built on more traditional web-based technologies. Game developers, we’re all about performance. When we looked at how to build a massive-scale simulation running in the cloud, we started with the hardest problem for us, which is physics. In games like Crackdown and They Came From Space, we’re dealing with upwards of 50,000 dynamic objects in every frame, and they have to move around in real time, from server to server. Players can’t notice any kind of latency in that experience.
We tackled it from … players have to be able to see the difference. When Crackdown showed up a few years ago, people said, “Wow.” With VR running at 90 frames per second, when we wanted to create a dynamic world for that, we said, “Consumers have to be able to see and feel the difference in the things we’re doing.” That required solving the hardest problems first. From there, going up the stack, things like a traditional MMO backend are very easy for us to do. We tackled it from the low level, and we’re moving further into the market.
GamesBeat: What are some of your plans for this game? Do you want to release it as a demo or actually get it out into the market commercially?
Jones: Certainly the demo. We plan to launch a variation of this, so people can play it and get a feel for what these kinds of worlds are like. We’re using this release to start to gauge reaction as well. We think there’s enough interest that we can find some partners out there and bring some other things to market. But these kinds of game jams or shorts, demonstrating new bits of tech, we’re going to keep on doing that. We’ll always be looking creatively at what can be some unique, inspirational ideas for ourselves, as well as developers and publishers, that we can put together to demonstrate the possibilities. If we get some interest in some of those — if people say, “Oh, I’d love to see this as a full game,” then we’re more than happy to talk to the people who can make that happen.
GamesBeat: Where do you want this to go as far as other kinds of things that could be accomplished with the engine?
Jones: That comes down to personal preference, I think. I’m really interested in how people can apply this as we go away from big physics demonstrations. There’s a lot of work on AI just now, and one of the things we allow access to is GPUs in the cloud from a compute perspective. A lot of companies are working with that because it gives you something like 3,000-plus cores to do tremendous AI. It’s about developers realizing that we need to start thinking a bit outside the box. If we have access to that level of AI for a game, how can we leverage that? AI has a huge opportunity as far as running in the cloud.
I’ve always been a big fan of open worlds, but like I said, we’re in the very early days still. I know people like to think open world games are quite advanced, but I don’t really believe they are. The worlds are still pretty static. It’s hard to find an evolving open world game. I’d like to see us taking a [crack at] some initial versions of technology that can show that everything, the whole simulation is in the cloud 24/7, and we can start to give the player a world that evolves in some ways. I think there are huge possibilities there.
GamesBeat: Are you seeing any success at getting third parties interested in the engine?
Jones: Definitely. We hope to be announcing some other users of the technology shortly. We’re definitely getting a fair bit of interest now. Up until now, we’ve mostly been doing this as bespoke services in the background. We’ve been working on building a platform, so we can start to roll this out to many more developers without each engagement having to be a bespoke project. Part of our big drive now is turning this into a platform and getting it into developers’ hands everywhere, where they can try it with very little friction and start to use it as part of their pipeline. Then we can see what they come up with creatively.
GamesBeat: How many people are at your company now? Have you raised any money so far?
Jones: We’re about 22 or 23, mostly engineers, and no, we’re 100 percent independent.
GamesBeat: Crackdown 3 should be an interesting test.
Jones: Yeah, we’re very excited to see that launching. It’ll be quite a big stress test for the technology.
GamesBeat: Are you having as much fun now as you did in the early days? You’ve been at this a very long time.
Jones: I am. I’m really enjoying this, to be honest — this kind of game jam we’re doing with the technology. I always loved thinking about technology in gaming. Any game that brings a bit of unique tech is always something that helps. With this kind of tech, it opens up some new creative ideas, which I love as well. Some of the stuff we’re going to see in the coming months [or] years from the smaller, more creative projects is going to be a lot of fun to work on and, hopefully, spark a lot of interest.
GamesBeat: You could revisit some of the ambitions in games like Grand Theft Auto.
Jones: Absolutely. I still get frustrated by the technology barriers we face and how far we’d like to push these worlds we build. It’ll be exciting to potentially lift some of those barriers. But we have a long way to go.