GamesBeat: Do you think that covers enough different types of behavior from players?
Wirtz: The nice thing about some of those scenarios is that a well-developed one looks at multiple types of milestones. What’s your incentive? What’s your “achievement”? You might be playing Skyrim in a way such that it’s “How much of the map can I see?” So maybe it should give you a quest that shows you more of the map, because that’s what you’re in the process of doing anyway. That’ll pique your interest. You can balance between those things. Maybe it seems like you’re really into exploring the landscape, so it’ll send you to find this particular plant that’s really hard to find, a detail-oriented thing. The idea is to find what sort of excitement is good for that user, so you can keep them on that storyteller’s curve.
GamesBeat: When it comes to personalization on the ad side, Kabir, what do you think about how personal this can get and how personal it should be?
Mathur: I think from a content perspective, personalization can be thought of as a spectrum. On one side of the spectrum you have the EAs and Ubisofts and Kings of the world, which can push users from one property to another across a huge portfolio, and also have the resources to hire huge data science teams. They could actually deploy personalization models down to the user level if they wanted. And on the other side you have indie developers, who have neither the resources nor the data scientists to accomplish any of that.
Somewhere in the middle you have smaller game companies – Storm8 comes to mind – who have built up network effects really well. They have a portfolio of dozens of games with well-developed mechanics to push users from one game to another. They have a rich data set of user actions. They know when user A moves from this game to that game. They’ve learned a lot about the preferences of that user. They can deploy certain content personalization strategies that are going to appeal to that user.
What I think game developers on the smaller side of the spectrum need to think about is that you have to learn something about your users. Build a connection with these people and use it. Figure out who your users are. Even if you’re only deploying three or four models based on basic attributes like age, gender, and education, that’s better than pushing everyone through the same content pipeline. There are a lot of variables you can test: push notifications, in-game messaging, iOS versus Android. iOS users tend to be better-off than Android users, so maybe your first IAP offer there should be of higher value. If Android users are less likely to monetize, maybe you can show them more ads. There are lots of models you can deploy.
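The coarse segmentation Mathur describes, a handful of simple rules keyed to attributes like platform, can be sketched as below. All segment rules, offer names, and price points here are hypothetical illustrations, not anything stated on the panel:

```python
# Sketch of coarse user segmentation for a first IAP offer: a few simple
# models beat pushing everyone through the same content pipeline.
# All thresholds, offer names, and prices are made-up examples.

def first_offer(platform: str, sessions_played: int) -> dict:
    """Pick a first in-app-purchase offer from coarse user attributes."""
    if platform == "ios":
        # iOS users tend to monetize better, so lead with a higher-value offer.
        return {"offer": "starter_bundle_large", "price_usd": 9.99, "extra_ads": False}
    # Android users who are less likely to monetize see a cheaper offer,
    # with more ad placements shown instead.
    if sessions_played < 5:
        return {"offer": "starter_bundle_small", "price_usd": 1.99, "extra_ads": True}
    return {"offer": "starter_bundle_small", "price_usd": 2.99, "extra_ads": True}

print(first_offer("ios", 3))
print(first_offer("android", 2))
```

In practice each branch would be backed by an A/B test rather than a hard-coded rule, but the structure, a small number of segments each with its own content path, is the same.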
Fletcher: It’s critical to understand your users, no matter what. That’s critical from the standpoint of narrative development all the way to understanding the LTV of your user base. As for personalization and what A.I. brings, or at least what I want it to bring: as someone who plays a lot of games, I want to have fun. I’m here to play games and enjoy myself. I recognize there’s a balance between creative needs and business needs, and I think A.I. can successfully balance those two things.
I started a game yesterday where I got through about 15 seconds of content before I ran into a minute and a half of ads. I never played it again. That game is dead to me. But there was an opportunity for that game to have a much better personalized experience by showing me the right amount of ads, or possibly sending the right level of IAP prompts, and yet still keeping the enjoyment level high.
The comment about being able to move gamers between your properties, if you’re a big enough publisher or studio, is valid. You’re continuing a sort of meta-cycle of fun there. The advertisements you see there are trying to draw you into the next game and so on. King and other companies are very good at that. But the critical issue is simply to keep people having fun. We’re making games. A.I. will help that happen. It’ll lead to a better experience for the user balanced with LTV for the developer.
Wirtz: There’s a big lesson in the contrast between what the two of you pointed out. You said, “Here’s the very sparse data from which you can make decisions.” That’s a very important thing. Then Adam’s company says, “Here is a metric ton of data that may be overwhelming.” You have to know where you are on that spectrum to know how to use your data.
GamesBeat: Adam, can you tell us in some more detail about what your company is doing?
Fletcher: What Gyroscope is doing is we’re gathering a whole bunch of data points that are passively instrumented in the game. They don’t represent specific user attributes in the sense of what we’re talking about here – we don’t actually care about stuff like demographic data. We’re watching interaction data and we’re watching a lot of it. From that, we feed it into a machine that learns when to trigger actions in the app or in the game.
The key here is that with the unreasonable effectiveness of data, the machine doesn’t care. It doesn’t care about your gender or anything else like that. It just evaluates a stream of events, makes decisions based on that, and acts in the game. Our models let us do this in ways that are effective as far as IAP prompting, changing up the random number generator to create a better experience, really any action that can take place in the game.
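The pattern Fletcher describes, a model that watches only a stream of interaction events and decides when to act in the game, might be sketched like this. The class, events, and heuristic below are purely hypothetical stand-ins, not Gyroscope's actual system, which learns its policy from data:

```python
# Hypothetical sketch of an event-stream trigger loop: the model sees only
# interaction events (no demographics) and decides when to act in the game.
from collections import deque

class TriggerModel:
    """Toy stand-in for a learned model: scores a recent window of events."""
    def __init__(self, window: int = 20):
        self.events = deque(maxlen=window)

    def observe(self, event: str) -> None:
        self.events.append(event)

    def should_prompt_iap(self) -> bool:
        # Stand-in heuristic: prompt after a streak of losses with no purchase.
        # A real system would replace this with a learned decision policy.
        losses = sum(1 for e in self.events if e == "level_failed")
        bought = any(e == "iap_purchase" for e in self.events)
        return losses >= 3 and not bought

model = TriggerModel()
for e in ["level_failed", "level_failed", "level_failed"]:
    model.observe(e)
print(model.should_prompt_iap())
```

The point of the architecture is that nothing about the user as a person enters the model; only what they did in the game does.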
You’re right that we are very much taking a data-first approach to this. We’re gathering tons of data. Our customers often ask, “Hey, can we see that data?” and they can, but it’s not really human-interpretable. It doesn’t map to something like a business function. It doesn’t tell you how people are getting to where they’re going, and that’s okay. We’re not designing around that. We’re designing around retention and engagement, not necessarily acquisition.
Wallace: One area I’ve been working in that uses A.I. and machine learning is pretty interesting. I was working with the Scripps Research Institute to build Science Game Lab, the equivalent of a web portal, something like Kongregate, but for citizen science games. What we’re building toward is using human computation across a variety of citizen science games, games that drive research. They’re not just educational, but actually part of data-driven research, using human computation as a basis for building more sophisticated machine learning. A lot of what we’re doing is built on principles we’ve worked around in gaming for years, a lot of those stepping stones.
GamesBeat: I’m curious how much you all think the two different conversations here come together into one conversation. We have A.I. for marketing and A.I. for game design. It seems like in this area of why people drop out, why people stop playing a game, these two areas come together. You might apply science from one side to solve the problems of the other.
As far as dropping out goes, some people might think they’ve run out of content. In a situation where the content is procedural, where it’s constantly being generated, it might still eventually feel like the same type of thing every time. When a game turns into a grind, it feels like there’s no point in continuing.
Wirtz: The big guys are doing a lot of things where they ask, “When do I drop the DLC? When do I push somebody into the next purchase?” But you also see that in the mobile space. I play this game called Hill Climb Racing 2 from Fingersoft. Every time I lose badly enough, it hits me with, “You should buy the upgrades! Look, for a dollar you can buy some stuff to make your car go faster!” It has hit me more times than I’d care to admit for a game that I wouldn’t have paid four dollars for up front. I’m getting nickel-and-dimed by these pay-to-win purchases.
Fletcher: The content space is a really interesting issue. How many of us were excited for No Man’s Sky? It looked awesome, and now it’s the best way to take photos of a game you never play. It’s too bad, because it’s such a great basic idea. But there was too much content, no narrative, nothing compelling you to move on.