After cloud gaming leader OnLive ran out of money in August, the future of cloud gaming became, er, cloudy. Rival cloud gaming service Gaikai had sold itself to Sony for $380 million, but OnLive’s failure to gain enough consumers to offset the costs of its cloud infrastructure raised questions about whether cloud technology was economical for games. Cloud gaming let a user play a high-end game on a low-end PC simply by logging into OnLive, which executed games in web-connected data centers that computed the game and sent images to the user’s machine in real time. OnLive launched in 2010, but too few subscribers materialized. Surrounded by free-to-play games, OnLive tried to sell consumers on instant access to the cloud and the capability to log in from any machine.
But Phil Eisler, the general manager of GeForce Grid Cloud Gaming at Nvidia, told us in an interview that he believes the technology is “hugely disruptive.” He thinks it’s ready, and that numerous cloud gaming projects will appear in 2013. Nvidia is adding cloud graphics technology to its graphics chips and making graphics cards available for cloud gaming data center servers. In doing so, Nvidia enables service providers to accommodate multiple gamers per server and thereby make more economical use of their server infrastructure. Here’s an edited transcript of our interview with Eisler.
GamesBeat: Where did the interest in cloud gaming begin for Nvidia?
Phil Eisler: Cloud gaming began with some software architects about four or five years ago. They were looking at encoding the output of the GPU and sending it over the internet, and experimenting with that. They built that into a series of capture routines and application programming interfaces (APIs) that could capture the full frame buffer or part of the frame buffer.
GamesBeat: Why were they thinking about that? They had just watched things in the market, where the cloud was already starting to … ?
Eisler: I think it’s just a precursor to a remote desktop. People have always thought that they want to get rid of all the wires around a computer — the capability to remote in and use the IP protocol to send displays anywhere. The concept, I guess, has been around in different forms going back to X terminals and things like that, using different methods. It’s always been a useful feature when you encounter it, but it’s never gained great traction in the marketplace. X terminals kind of went away, their original incarnation. But I think now, with all the multiscreen proliferation, it’s generated a lot of incompatibility between applications. There’s even more call for it now. And then it kind of evolved into the hardware team actually adding hardware encoders to graphics chips to make it more functional, make it scale better, than passing all that data back to the CPU. It’s a lot more efficient in the computer system to do the encoding on the GPU.
GamesBeat: It seems like it would be a little less interesting to you guys if it was purely enterprise. But when gaming came into the picture, it maybe has spurred more interest at Nvidia.
Eisler: Yeah. I think it also enables Nvidia to expand beyond PC gaming. Now we can reach the television. We’ve always participated in television gaming through Xbox and PlayStation, but we see the opportunity to support TV gaming directly. And so that big-screen gaming experience being delivered from the cloud is appealing to us. We think that’s a direction for the future.
GamesBeat: That’s when you have Nvidia hardware in data centers that are streaming things into TVs, right?
Eisler: We look at it both ways. A lot of our focus is on that, putting in the data centers and streaming to TVs or tablets or phones. Or PCs or Mac products. But also, the idea that if you have a high-end gaming PC in your home, you could actually stream it to screens around the home yourself. So it’s both data-center-based and local home-server-based.
GamesBeat: OnLive came out in 2009. Were people already thinking about cloud gaming and talking about things like this?
Eisler: Sure. One of the great things about a company like Nvidia is it does invest a lot in long-range research. We have research teams internally that work on a lot of long-range things. We had started doing early work on remote desktops prior to OnLive’s existence. Actually, we filed quite a few patents early on in that area. But then we worked and supported most of the early cloud gaming startups: OnLive, Gaikai, and such. We still work with the remaining ones, as the market evolves and grows.
GamesBeat: The idea that, say, your graphics chip hardware might change because of the cloud, was that also pretty early?
Eisler: The planning cycle for a new GPU is at least two years, so … Kepler came out last year. It would have had to begin in 2009 — and probably earlier than that. We decided to enable the remote desktop through GPU hardware, which definitely makes it more scalable and more hardware-friendly. And now that we have our first generation built, we also look forward to the ability to improve remote desktop or cloud gaming hardware with each generation. Next-generation chips are learning a lot, and will be much better at delivering more concurrent streams of better experiences to more people.
GamesBeat: I think the way your chief executive, Jen-Hsun Huang, described it was that the hardware now, rather than expecting every task coming into it from just one PC, can receive a task from any Internet protocol address.
Eisler: Yeah. The concept of a CPU being shared as a virtual machine has been around since the ’60s. With hypervisors that then enable that…. Part of our VGX effort is to develop the hardware and software for a GPU hypervisor to enable the virtual GPU. So we are going to apply that to gaming, which provides us even more scalability of sharing a GPU as well as the CPU in the system so that both are virtualized. Virtualizing a GPU is a little bit different and more challenging than virtualizing a CPU, just because of the nature of the tasks and the data sets that are associated with it. But we’re solving those problems. The early results are promising, and I think we’ll get better with each generation of hardware and software. That’s a major effort for us right now.
GamesBeat: So with Kepler, you could do something like four users on a single GPU?
Eisler: It has the ability to encode four simultaneous streams. The actual virtualization of the GPU can be more than that. We have some game partners that do more than four. The encoding, currently, depends on the resolution and frame rate. But practically, it’s about four.
GamesBeat: I think Jen-Hsun mentioned that you could go to 16 users over time. What makes that happen?
Eisler: I think it will get on a Moore’s Law type of trajectory, where the number of streams per server will double every year. I think the original Gaikai and OnLive implementations, which are a couple of years old now, are one stream per server. We’ve now announced the K520, which does two per board… Most of the systems have two in them, so that makes four. That’ll be this year. Next year we’ll see eight, and then 16 and 32. I think it’ll go up for a while.
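The doubling cadence Eisler describes can be sketched as a simple projection. This is purely illustrative, using his stated figures (four streams per two-board server as of the interview), not official Nvidia capacity numbers:

```python
# Hypothetical projection of concurrent cloud gaming streams per server,
# assuming Eisler's "double every year" trajectory (illustrative only).
def streams_per_server(base_year: int, base_streams: int, year: int) -> int:
    """Double the concurrent stream count for each year after base_year."""
    return base_streams * 2 ** (year - base_year)

# Eisler's starting point: 4 streams per two-board K520 server in 2012.
for year in range(2012, 2016):
    print(year, streams_per_server(2012, 4, year))
# → 2012 4, 2013 8, 2014 16, 2015 32
```

The sequence matches his forecast: four this year, eight next year, then 16 and 32.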
GamesBeat: One thing surprised me about the early architecture of the cloud gaming infrastructure. It didn’t seem very efficient. They studied the issue for so long, for a decade or so. I would have thought they might have come up with something more economical. But in the end result, with OnLive, it looks like the infrastructure was pretty costly. Is there a conclusion you can draw from that? Have the infrastructure requirements evolved over time?
Eisler: Yeah, I think that was part of the OnLive story. I don’t think it was all of it. Maybe not even the most important part. They built something like an 8,000 concurrent stream capacity, which in a world of a billion gamers is not a lot. I think Gaikai built something like half that.
GamesBeat: And both of them were mostly able to lease this server capacity, right?
Eisler: Yeah. I think Gaikai used a company called TriplePoint Capital. So they were both able to lease it. It was a heavy burden on them as startups, from that standpoint. But you’re right. They only supported one game per machine. They were early pioneers and they were focused mostly on getting it working. When they started, nobody was doing it, so they had to invent the encoding technology, the virtual machine technology, the distributed game technology. Being small startups, they were mostly focused on the quickest path to make it work, which is what they did. They didn’t really get to the second phase, which was cost-reducing it and adding more concurrency. And they did it largely without a lot of support from even us. In the early days we gave them little bits of help, but not a lot. We didn’t design products for them. Grid was the first time we designed products for the market that were made for that purpose, so it makes it easier for them to scale it out.
GamesBeat: So custom servers aren’t as efficient as general-purpose servers that have these capabilities built in.
Eisler: One thing is, we’re offloading their encoding parameters. They were either doing it through external hardware or through software on the CPU, both of which involve a lot of data transfer and power consumption. Because we’re doing it on the GPU, that offloads the CPU for more games running and reduces the power and enables more concurrency, so you can get more simultaneous streams per server and reduce the cost per stream.
GamesBeat: And then there’s more data centers with servers with GPUs in them these days.
Eisler: They’re coming, yeah. We had to work with the people who build servers, whether it’s Quanta or Dell or Super Micro, to build space for GPUs into their architecture. We’ve done that now, and we actually pioneered that, primarily through our Tesla group. We’re leveraging a lot of the infrastructure that was built for that to plug our GPUs in. That’s working out well and making it easy to assemble and put GPU servers into the data center.
GamesBeat: What’s the part you call the GeForce Grid again?
Eisler: It’s a server board product, so it’s passively cooled. It doesn’t have its own fans. It’s designed to use the fans from a server chassis. It has multiple GPUs on it. The one we have announced has two GPUs on it. And then there’s a software package that comes with it that enables the control of the capture and encoding routines to enable streaming of the output.
GamesBeat: So the more that the cloud infrastructure people deploy these, then the better able they could be to support the whole cloud gaming industry.
Eisler: Yeah. It’s possible that the industry could shake out into different levels. There are infrastructure-as-a-service providers that operate data centers and we’re talking to those folks. They’re looking at just putting capacity in place, and then anyone who’s in the game distribution business could layer their service on top of that. That would be one way of financing the build-out of it. And then there’s other people who want more of a wholly integrated solution, where they want to purchase their own servers and rent the games, or for that matter any application, directly themselves to consumers or business users.
GamesBeat: What sort of expectation should consumers have about how much progress they’re going to see? You guys have the figures, like the minimum requirements of bandwidth that you need for a good cloud gaming experience. 30 megabits a second on a desktop line and 10 megabits on a mobile device. …
Eisler: It’s variable bit rate, but a lot of it… A good tradeoff of experience versus bandwidth is 720p, 30 frames per second, and about five megabits per second. That’s where we think it’s pretty good. We have the ability to go higher. We could do 1080p, we could do 60 frames per second if you’ve got 15 or 20 megabits per second, but not many people do, at least today. And then we can also go lower, down to 480p and two megabits per second, but there’s kind of a range. The sweet spot, we think, is 720p, 30 frames per second, five megabits per second.
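The tiers Eisler lays out amount to picking the best stream profile a connection can sustain. A minimal sketch of that logic, using his figures (the profile names and thresholds are illustrative, not an Nvidia API):

```python
# Illustrative stream-profile selection from Eisler's bandwidth tiers:
# 1080p60 at ~15 Mbps, 720p30 at ~5 Mbps, 480p at ~2 Mbps.
PROFILES = [
    (15.0, "1080p @ 60 fps"),
    (5.0,  "720p @ 30 fps"),
    (2.0,  "480p @ 30 fps"),
]

def pick_profile(mbps: float) -> str:
    """Return the highest-quality profile the given bandwidth can sustain."""
    for required_mbps, name in PROFILES:
        if mbps >= required_mbps:
            return name
    return "below minimum bandwidth"

print(pick_profile(6.0))   # → 720p @ 30 fps (the stated sweet spot)
```

A real service would adapt the bit rate continuously, as Eisler notes; this only captures the static tiers.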
GamesBeat: I think that’s exactly what OnLive targeted.
Eisler: Yes. OnLive and Gaikai both were in that range, which has a lot to do with the current broadband infrastructure.
GamesBeat: I don’t know whether consumers found it to be good enough. They tried that for a while and then they didn’t wind up with too many subscribers, so… I don’t know if the consumers just weren’t aware of it, or if they didn’t actually like the experience.
Eisler: Bandwidth is one thing. The other thing is also latency. They had a lot of latency in their system associated with the encoding and the network infrastructure. One of the things we’ve done with Grid is do a much faster capture and encode. We’re able to save about 30 milliseconds of latency, which helps a lot. We’re also working on client optimizations, which can save maybe another 20 milliseconds of latency. The latency feel was certainly a big part of it. Also, with Moore’s Law, we will improve the bandwidth and resolution. Even in countries like Korea… I was at a product launch there about a month ago with the LG U+. They were announcing two choices for their TV service. One was 50 megabits per second and the other choice was 100 megabits per second. There, we can go 1080p, 60Hz, and it’ll be a beautiful picture. I think that is the future for this country as well. I think there was an experience problem. That might have been the larger problem, even over the density and cost side. We are working on that one hard too, in terms of improving the quality of the experience, reducing latency, and improving the visual quality of it. And then lastly, it’s access to first-run content. The content that OnLive had was a little bit older, and so we’re trying to work with developers to have fresher content available.
GamesBeat: The problem there with the experience is that it’s always improving… The rumored console resolutions are going to be like 4K or so and multiple Blu-ray discs worth of content for their games in the next generation. That’s the kind of thing people were talking about at the Cloud Gaming USA conference at the end there. That puts more of a burden on improving the whole experience for the cloud.
Eisler: Yeah. I think so. Although I’m not so sure that there’s going to be a lot of 4K display screens or TVs available any time soon.
GamesBeat: Sony’s got those TVs for $20,000 …
Eisler: Yeah, exactly. There may be a few people who would want to buy a console for that. 1080p at 60Hz is a pretty awesome experience. The thing about the consoles … they say this is the last console, and I am certainly a believer in that. The last one is almost 10 years old now in terms of the technology.
The good thing about cloud gaming is it’s going to get better every year. One of the reasons we’re investing in it is we see that there are some issues today, but they’re all solvable, and they’re all moving in the right direction. Bandwidth is going up. The cost of server rooms is going down. We’re bringing latency down. The experience will just get better and better every year, to the point where I think it will become the predominant way that people play games.
You can put out multiple Blu-ray discs, but who wants to jockey discs anymore? People don’t want discs in their lives anymore. They want to download everything, and when you’re downloading that kind of stuff, it takes a long time. So we’re also pushing the ability, of course, to play instantly. You don’t have to download anything. You don’t have to update any patches. It’s all maintained for you. You just play.
GamesBeat: Rob Wyatt, chief scientist at Otoy, said at our 2011 GamesBeat conference that he sees a day when, say, a cloud gaming experience could be faster than a disc-based console system. [chuckles] That they can attack the lag so much and remove it that he thinks that day will come.
Eisler: We did some experiments in that area, around when we launched our product. One of the things we found was that… The consoles, when they launch, they’re rendering the games at 60Hz and the average game has got three frames queued up, so it takes about 50 milliseconds to render a game. As time goes on and more advanced games come, the hardware’s too slow, so they basically take the rendering down to 30Hz. Now you’re spending 100 milliseconds to get a frame out of the console. We also measured a lot of the HDMI inputs on TVs, and the best one that we could find is about 60 milliseconds. So it’s funny. The average gamer playing on an Xbox today with a standard television is probably experiencing 150 to 200 milliseconds of latency, and that’s what they’re used to playing with every day. Because we can always improve the hardware at the server end and we can improve the capture and encode… We can do that portion in about 60 milliseconds and effectively hide the network delay. I think Gaikai showed, when they worked with Limelight, that with a distributed network in the United States, you could get to most homes in about 30 milliseconds. When I was out talking to the Koreans about these things, they’re under 20 to get to houses. People worry about the network latency, but actually, in the whole pipeline, it’s the smallest piece. Our monitors that we work with today are under 10 milliseconds of latency. We think that, working with smart TV manufacturers, we’ll be able to cut that time down. It’s going to be possible very shortly to have a cloud-rendered experience that has lower latency than the current console plus standard television experience.
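Eisler’s figures form a rough end-to-end latency budget. Summed naively (which overstates the cloud total, since he says the capture-and-encode stage can effectively hide some network delay), the comparison looks like this:

```python
# Back-of-envelope latency budgets built from Eisler's stated figures
# (illustrative only; stages are summed naively, ignoring overlap).
console_tv_ms = {
    "console render (30 Hz, ~3 queued frames)": 100,
    "TV HDMI input processing": 60,
}
cloud_ms = {
    "server capture + encode": 60,
    "network (distributed US CDN, per Gaikai/Limelight)": 30,
    "low-latency monitor": 10,
}

for name, pipeline in [("console + TV", console_tv_ms), ("cloud", cloud_ms)]:
    print(f"{name}: {sum(pipeline.values())} ms total")
# → console + TV: 160 ms total
# → cloud: 100 ms total
```

Even this crude sum supports his claim: the cloud path can undercut a 30Hz console on a standard television.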
GamesBeat: Do you improve on the distance that is involved here, too? I think that 1,000 miles to the data center is the limit of what the current cloud guys could do.
Eisler: Yeah. There’ll be a tradeoff versus experience there. The closer you are, the lower latency you’re going to have.
GamesBeat: Just a couple of data centers to cover the whole U.S. would be ideal …
Eisler: That’s one way of doing it, yeah. The alternative is more distributed data centers and giving people an even better experience in terms of lower latency. It could be taken either way. If you say that 150 milliseconds is good enough, then you could probably get down to two data centers. If you want to push toward 100 milliseconds, then you may want more data centers. We’re working with different providers, going down either one of those strategies.
GamesBeat: Is there something evident about how you guys are doing things differently from AMD?
Eisler: We haven’t heard much from AMD. I think they made a small investment in one of the startups, who actually currently uses Nvidia products…
GamesBeat: They were active with Otoy. They recently invested in CiiNow.
Eisler: Yeah. The thing is, there’s a lot of software that needs to be developed and supported. Nvidia’s invested quite a bit in that. We haven’t seen the cloud gaming software come from AMD yet. Really, at this time, the only practical choice that people have is Nvidia.
GamesBeat: So it’s not clear what, say, architectural difference there is right now.
Eisler: Well… On paper, there’s also encoders in the AMD architecture, but I don’t know that anyone’s seen any software for that either. Once the software shows up, then people might be able to get an idea of whether the hardware is good enough or not. Nvidia at that point might be on to our second generation of stuff. It’s just a question of Nvidia investing in stuff in a higher-level integrated way. For a GPU vendor, the revenue from cloud gaming right now is close to zero. It’s just getting off the ground. It’s definitely a bet on the future, to invest your engineers in it. Nvidia so far has been willing to do that. We haven’t seen much investment from AMD in it so far.
GamesBeat: Have you also figured out what you need to do for the ecosystem in order to make it happen? Theoretically you could be operating data centers. Building your own data centers, putting Nvidia-based hardware into them, and leasing them out to whoever wants to use them. Those are investments you could make.
Eisler: Yeah. There’s lots of possibilities, of course. So far, we’ve decided to focus on delivering great boards and software to enable others to be successful in running services and data centers.
GamesBeat: Who do you think’s going to come into this market? The CiiNow folks in particular were hoping to see carriers, cable companies, game companies, game publishers all in some way thinking that they could offer games on the cloud.
Eisler: We’ve seen it from all of those people. The cable telco companies are interested, both here and abroad. Game publishers are interested. Anybody in the game distribution business. If you’re offering rental services or selling games you’re interested in this. We’ve seen it from all types of people. Some people want to offer more than just gaming. Some people want to also offer virtual desktops or other applications as well. We’ve seen people in the software-as-service business, or anyone building a software app. Again, I think in large part the consumer client has now changed to so many different form factors that are incompatible with general software development on PCs. The ability to stream to any of those desktops is becoming greater. You can stream to any Apple device, whether it’s a Mac or an iOS device. You can stream to televisions. So the reach for a software-as-a-service company, whether it’s gaming or another app, is extended a lot by cloud gaming.
GamesBeat: That speaks to the whole disruptive potential… Everybody’s interested in cloud gaming because it can disrupt businesses.
Eisler: It’s potentially hugely disruptive. That’s why we’re investing in it, and we’re investing in it with our partners. We think it’s a paradigm shift in computing. It’s almost like cloud 2.0. Cloud 1.0 was… Run some sort of master servers and communicate data back and forth. Cloud 2.0, we think, is just render the screen at the server and stream it to everybody. Our early prototypes are showing that that’s very possible to do and has a lot of benefits.
GamesBeat: I wonder if the untimely end of OnLive and cloud 1.0 is scaring some people from cloud 2.0.
Eisler: The naysayers certainly had a field day with the demise of OnLive. But as we’ve discussed, I think a lot of their problems were of their own doing. We still see a lot of potential for the vision of cloud gaming. We wouldn’t do some things the way that OnLive did, and it’s unfortunate that things ended the way they did there, but we don’t see that as a negative for the long-term potential of cloud gaming. We’re still bullish on it.
GamesBeat: I suppose Gaikai’s exit could be viewed more positively, as a great outcome for them …
Eisler: Sure. Clearly Sony believes in it enough to put their $380 million into it. That was equally supportive for those people that are pro-cloud gaming. Anybody who’s in the game console business is clearly awakened to the potential of streaming games to TVs.
GamesBeat: Is it logical to expect more startups in this area, because of different opportunities that arise?
Eisler: There’s already quite a lot of them. There was Gaikai, OnLive, CiiNow, and Agawi at the conference in China. You had Cloud Union at the conference.
GamesBeat: G-Cluster …
Eisler: Yeah, G-Cluster in Europe. Playcast. Ubitus.
GamesBeat: Otoy, in some ways …
Eisler: Yeah. So we’re approaching 10. I remember in the graphics business, there was originally 30, so we’ve got another 20 to go, I guess. [laughs] And then back down to two. There does seem to be a new cloud gaming middleware company coming up every week or month or so.
GamesBeat: It seems like the middleware folks are trying to provide this one slice of the whole picture. That could enable some better white-labeling or mix-and-matching of different technologies. I wonder if that will help other companies get services off the ground.
Eisler: They perform a system integration function, and they provide some software that enables user accounts and signed games to users and onboarding of games. So they have services that they provide to anyone trying to offer a service. They’re a necessary function in the ecosystem, to bring it together. They’re trying to work with different partners to reach as many potential game subscribers as possible.
GamesBeat: Given what you know, then, how soon do you think cloud gaming becomes, say, a much bigger part of the consumer market for games?
Eisler: I think we’ll see a lot happening next year. Our products are sampling to the partners now. They’re building their middleware services based on them, and they’re starting to interact with telcos and other gaming-as-a-service providers. A lot of it is actually outside this country so far. Asia is leading the way. But that will come back here next year. We’ll see it gain momentum throughout next year. But if I look out five years, I think it could be a significant portion of the way people play games.
GamesBeat: How much resources is Nvidia putting behind it?
Eisler: It’s one of the major strategic initiatives in the company. There’s well over 100 people working on it to some degree. Whether it’s on the hardware, the software, the firmware and driver level… There’s a lot of people here that are working on it in some fashion or another.
GamesBeat: Some people have pointed out that if you sell a lot of graphics cards into data centers, you may not sell so many to consumers anymore. That’s a tradeoff.
Eisler: Yeah. As we discussed before, as the screens move to higher resolutions like 1440 or 2K and people want that kind of… I mean, we talk about 100 to 150 milliseconds of latency in cloud gaming. GeForce gamers are playing with 50 milliseconds of latency. There’s still a gap to that GTX experience for the hardcore gamer. I think it’ll be 10 years before we have to worry about them switching over to the cloud. They’re pretty particular about their gaming experience. I don’t think we have to worry too much about that right now. That also improves dramatically from generation to generation. We see this more the opposite way, really, as expanding the market for GeForce. I think it will attract new users, because it’s easier to play and you can now play PC games on any device, not just on a PC that you have to set up yourself. For the most part, it expands the market for Nvidia more than it takes away.