This post has not been edited by the GamesBeat staff. Opinions by GamesBeat community writers do not necessarily reflect those of the staff.
Game design isn’t something that many people understand, even within the industry, and that’s a real problem. Now that we’re approaching a new console generation, this is a good time to start talking about what game design actually entails and how it informs both players and developers. This might get a little heavy, because in trying to understand games, you need to understand the economic and mathematical definitions of what a game “is.” OK, I can hear you tabbing out already, but please bear with me – I’ll do my best to keep things interesting.
In essence, a game is a situation in which multiple actors compete to achieve certain goals; actors are defined as individuals who are both rational and intelligent. An actor’s goal can be anything from the satisfaction that comes with winning to some sort of reward (virtual or otherwise). Immediately, this should raise a couple of flags in your mind if you’ve given computer games much consideration, especially with regard to artificial intelligence. Playing games online is so satisfying partly because human beings enjoy lording their victories over one another, but also because the behaviour of a human being is typically more interesting than that of an AI routine. Computers are certainly rational – often more rational than their human counterparts – but creating a digital intelligence is beyond us right now.
An example analysis of a “game” as defined by game theory is the “prisoner’s dilemma,” which is as follows:
Two members of a criminal gang are arrested and imprisoned. Each prisoner is in solitary confinement with no means of speaking to or exchanging messages with the other. The police admit they don’t have enough evidence to convict the pair on the principal charge. They plan to sentence both to a year in prison on a lesser charge. Simultaneously, the police offer each prisoner a Faustian bargain. Here’s how it goes:
- If A and B both confess to the crime, each of them serves 2 years in prison
- If A confesses but B denies the crime, A will be set free whereas B will serve 3 years in prison (and vice versa)
- If A and B both deny the crime, both of them will only serve 1 year in prison
Our old friend Wikipedia notes that in purely individualistic terms, the most rational move in the game is to betray your partner: a betraying actor can never receive the three-year sentence, so betrayal typically yields better results, despite mutual cooperation being ultimately more rewarding. The game breaks down in real life because people tend to be biased towards working together, but video games aren’t real life – and that’s where things get interesting.
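That dominance argument is easy to check mechanically. Here's a minimal sketch (variable names and structure are my own, not from any particular game theory library) that encodes the sentences above and shows that confessing is the best response to either move your partner might make:

```python
# Prisoner's dilemma sentences in years (lower is better for you).
# Keys are (your_move, partner_move), values taken from the list above.
SENTENCE = {
    ("confess", "confess"): 2,
    ("confess", "deny"): 0,   # you betray, partner stays silent: you walk free
    ("deny", "confess"): 3,   # partner betrays you: worst outcome
    ("deny", "deny"): 1,      # mutual cooperation: one year each
}

def best_response(partner_move):
    """Return the move that minimises your own sentence, holding the
    partner's move fixed -- the 'purely individualistic' calculation."""
    return min(["confess", "deny"], key=lambda my: SENTENCE[(my, partner_move)])

# Confessing dominates: it is the best response whatever the partner does.
print(best_response("confess"))  # -> confess (2 years beats 3)
print(best_response("deny"))     # -> confess (0 years beats 1)
```

Note that both actors reasoning this way land on mutual confession (2 years each), even though mutual denial (1 year each) is better for both – which is exactly what makes the dilemma interesting.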
Assuming that you are both rational and intelligent, when you begin playing a video game, you always have a number of choices available to you. These differ depending on the game, and encompass what level to select, what character to play as, and so on. More importantly, though, a meta-level choice available to the player is whether to continue playing or to switch the console off; unlike in the situation described by the prisoner’s dilemma, you have no obligation to keep playing. We’re assuming that you’re playing video games for the purposes of entertainment, not because you’re the victim of a particularly eccentric prison sentence, so when a video game stops being fun, the rational thing to do is to stop playing.
There are a number of reasons games stop being fun, but one of the biggest is when a player feels they have no chance of winning. If all of your actions lead to failure (even if you have the capacity to delay the inevitable), then why keep playing? Expending further effort isn’t rational, which means ragequitting in multiplayer games is actually somewhat understandable. Of course, unless the game in question suffers from a glitch that makes victory literally impossible, games always tend to offer some chance of victory – but if that chance is too low (and the effort you need to expend to progress outweighs the perceived enjoyment that might result from doing so), then the rational thing is to switch off. This is why people quit games when they “get stuck”: trying harder isn’t rational.
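The quit-or-continue decision above is just a cost/benefit inequality. Here's a toy sketch of it – the numbers are invented utility units purely for illustration, not anything a real game would expose:

```python
def rational_to_continue(win_probability, enjoyment_of_winning, effort_cost):
    """Keep playing only if the expected enjoyment of continuing
    outweighs the effort required. All values are hypothetical
    utility units, not measured quantities."""
    return win_probability * enjoyment_of_winning > effort_cost

# A close match is worth playing out...
print(rational_to_continue(0.4, 10, 2))   # -> True  (expected 4 > cost 2)
# ...but a near-hopeless grind isn't, so quitting is the rational move.
print(rational_to_continue(0.05, 10, 2))  # -> False (expected 0.5 < cost 2)
```

Being "stuck" is the second case: the perceived win probability has collapsed, so no amount of grinding clears the bar.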
So: good game design dictates that loss should not be a foregone conclusion, and that victory should never be certain until the game itself has ended. Prior to its most recent expansion, Civilization V had a problem where the gap between players would typically widen to the point where winners and losers were decided too early, leaving the late-game phase criminally dull. MMORPGs don’t appeal to a large number of people because they’re effectively designed to be unwinnable; this doesn’t mean that MMORPGs are typically poorly designed, or that the genre is a wash, but they’re built like treadmills to keep you playing, because that’s what the MMORPG business model typically relies upon. Because this is so blatant, the terms of the game are visible from the outset, and many people aren’t willing to get involved.
The reason games like Pac-Man and Street Fighter have endured for so long isn’t misplaced nostalgia. It’s because, as far as games go, they are very well designed in terms of when loss and victory become clear. While in real life taking part is sometimes all that counts (and society would be a much worse place without such activities), video games are designed to be entertaining, so victory should be attainable at all stages of the game and for all players. Of course, some games resort to bizarre methods of making sure fortunes can be reversed, which also isn’t the answer; Mario Kart games typically contain weapons designed to punish players in first place, which incentivises staying in second place for as long as possible and makes the game very frustrating for highly skilled players.
Bearing all of that in mind, then, the golden rule of game design should be to create games where victory is always attainable for all players, but skill is still adequately rewarded. If you want players to get better at the game, you have to incentivise doing so, whether by offering them better in-game items, with leaderboards, or by matching them with players of similar skill online. As a developer, you need to lull players into a state of flow; while this is much more complex in a multiplayer environment, with good matchmaking protocols it’s entirely possible. In offline games, AI can – and should – adapt to players who are having difficulty with the game or who are simply finding it too easy, although this is easier said than done because true artificial intelligence is a very long way off. I also think many people would be infuriated if they were under the impression a video game intelligence was letting them win out of pity (although a 100% rational AI would want to remain operational for as long as possible, so letting somebody win just enough to keep them playing would effectively be the best way for it to achieve that end).
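To make the matchmaking point concrete, here's a deliberately naive sketch: sort the queue by a skill rating and pair neighbours. This is my own toy illustration – real systems (Elo, TrueSkill, and friends) also model rating uncertainty, queue time, latency, and party size – but it captures the core idea of keeping both players in a match where victory feels attainable:

```python
def matchmake(ratings):
    """Pair players with the closest skill ratings.
    `ratings` maps player name -> numeric rating (hypothetical scale).
    With an odd player count, the highest-rated player waits in queue."""
    queue = sorted(ratings.items(), key=lambda kv: kv[1])  # sort by rating
    # Adjacent players in the sorted queue are each other's closest matches.
    return [(queue[i][0], queue[i + 1][0]) for i in range(0, len(queue) - 1, 2)]

players = {"alice": 1510, "bob": 1200, "carol": 1480, "dave": 1230}
print(matchmake(players))  # -> [('bob', 'dave'), ('carol', 'alice')]
```

Both matches here are within about 30 rating points, so neither player should feel that loss is a foregone conclusion – which is exactly the flow state the golden rule is after.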
This is obviously very challenging, but it has been done. Street Fighter (and fighting games in general) is designed so that the health bars are largely arbitrary, because it is almost always possible to avoid damage and secure a flawless victory at any point in a match. Pac-Man, Tetris, and the like are designed in such a way that losing never takes too long, and the rewards for playing well are ample enough that people can have fun. Even Call of Duty is enjoyable because player death is short and quick, and shooting at opponents feels snappy and responsive.
The trouble is that developers often seem caught up in making games incredibly complex to make up for their own lack of ideas about how their game should play out. “Triple-A” titles – the video game equivalent of a Hollywood blockbuster – focus less on the player’s potential to win and lose and more on the game’s visual and auditory nuances. While we’re building better consoles to improve the latter, it seems somewhat absurd that video game developers so frequently put the “game” part second to the presentation of the game itself. Complexity is not just graphical; complexity in terms of rules is equally problematic, meaning that players sometimes get lost in a labyrinth of rules and can no longer discern whether they’re winning or losing, or getting closer to either outcome. Again, there’s no point fighting through a bureaucracy if it isn’t enjoyable, and lord knows anybody who’s tried to deal with a government body can tell you that bureaucracy is the furthest thing from fun.
There’s nothing wrong with Triple-A titles, of course, but releasing big-budget games gets riskier with every console generation. We’re at the point now where it only takes one bad game to force a studio into shutting its doors, and I think marketing executives are confused as to why games with lavish budgets and long development cycles aren’t doing particularly well in today’s market. I’d hazard it’s because large studios are no longer producing games but packages of set pieces; effectively, many modern games are designed upside down, in that what makes them games is deemed less important than the tropes from other media that make them marketable in the first place. Really, take a look at Medal of Honor: Warfighter – the game was a clear attempt at the Call of Duty money, leaning hard into the same tropes, but what made it a game was unremarkable at best. Do games companies genuinely believe the public is so easily fooled? Sure, you need to market products to the public, but don’t kid yourselves – if playing a game isn’t enjoyable, there’s basically nothing stopping consumers from switching it off.