
The character suddenly becomes unplayable while the player absorbs new pieces of information. Armed with that knowledge, she embarks on the next quest or level and gets right back into the action.

At first glance, this completely makes sense. After all, if games want to tell a story they need to have some type of narrative anchor, and cut-scenes are a great way to do that.

Since the mid-1990s, games have used pieces of cinematic animation to bridge levels and build the motivation that drives the story forward, which shows the industry clearly wants to tell stories. Ever since the plot moved on from “the princess is in another castle,” games have sought new forms of narrative that pull the player forward.



Cut-scenes can provide a default way of showing the personality of a character, which can be a particularly fickle trait in something as subjective as gaming. They provide a canon, stability, and a solid story arc in a game filled with thousands of options for each player.

But is it time for the industry to outgrow cut-scenes?


They may very well provide a solid way to deliver narrative, but the use of in-game storytelling has grown over the past decade, a trend that largely started with Valve’s Half-Life in 1998. It was one of the first titles never to break away from the main character’s viewpoint, filtering every story element through the player’s gaze.

The advantages of this are obvious. The main benefit is that you can direct the gamer down a certain path and draw her attention to key elements of the story, or simply tell her what she needs to know while she retains control.

After all, control is the essence of a game. Developers want gamers to be in control at all times, and that is far superior to having them sit back and watch a cut-scene.

But a number of disadvantages reveal themselves when a game relies too heavily on cut-scenes, especially as new titles become so good at using in-game, in-character narrative.

Firstly, cut-scenes are used far too often. As stated above, the purpose of a game is to interact, play, and feel in control. Cut-scenes detract from that feeling, so they need to be used sparingly.

Secondly, cut-scenes are often used for little or no purpose. The most recent Medal of Honor sported several cut-scenes that provided no information beyond the character stabbing or shooting someone. What is the point? Why couldn’t the gamer have done it herself?

Finally, cut-scenes can distract the gamer from her actual goal. When she sits down to play a game, she doesn’t necessarily want to watch a film. She has a DVD or Blu-ray player for that. Rather, she wants to actually engage with and control a character.

She wants to dictate exactly how she handles herself, uses weapons, talks, jumps, or runs. And she wants to feel that power at all times. Cut-scenes break that power and turn the gamer into a passive agent once again; in some ways, they directly contradict the purpose of the medium.

Debate surrounds Activision Blizzard’s hint that the publisher may at some point take all the cut-scenes from StarCraft II, stitch them together, and sell them as a movie. The company estimates it could make millions of dollars doing this, and it is probably correct. Gamers would pay through the nose to see this type of “film,” if the product could even be called that at all.

But in theory, that plan shouldn’t even be allowed to work. Cut-scenes need to be a tool in the writer’s arsenal — not the means by which they tell an entire story.

Just as a screenwriter does not rely entirely on either dialogue or on-screen action to completely tell a story, games must rely on both in-game action and well-crafted cut-scenes to provide the full breadth of characterisation and action the narrative deserves.

Neither tool makes sense without the other present; they must work in tandem. In the ideal scenario, Activision Blizzard’s film would be nothing more than a group of shots that have no context when stitched together.

The best examples of games that use cut-scenes properly are Mass Effect and Knights of the Old Republic. For the most part, these games rely on cut-scenes in which the user still directs the action: she chooses the dialogue the character speaks and, to some extent, the responses, and thereby shapes what happens next in the narrative.

In this situation, the gamer has at least some essence or perception of control about what is happening directly on the screen, even though she cannot control the actions of the character at that particular time.

And that is the purpose of the cut-scene — to provide a cinematic experience that gives the gamer some excitement without completely losing that element of playability.

So, what makes a good cut-scene? I'll share a few points gamers should keep in mind when judging whether a cut-scene has had some time and effort put into it.

The action within the cut-scene matches the gameplay the user experiences, or at least doesn’t exceed it. This is, in some ways, the most important point. Often in cut-scenes found in intense action games such as Resident Evil or Crysis, the character performs feats she isn't actually able to perform during normal gameplay.

And this completely ruins any sort of immersion the gamer has. If you’ve been controlling a character for a solid 10 hours, you’re going to feel in sync with that character and know how to operate him or her efficiently.

But too often gamers see that character perform moves in a cut-scene that aren’t in sync with the way they have been playing. The character might move too quickly, use her hands when she might have relied on weapons, and so on.

The cut-scene at some point provides some type of action for the user to perform. Some users might know of these as “quick-time events.” These operate just like a normal cut-scene, but within that scene the game gives users some type of action to perform: an attack, a handshake, or whatever it may be.

Sometimes there is a consequence for missing the event, and sometimes there isn’t. Many gamers hate quick-time events, but they are at least better than nothing at all, which leaves the gamer to sit back and watch. As Mass Effect demonstrates, games need to live up to their actual definition and provide some type of gameplay for users to interact with.

Cut-scenes are fine, but they need to be coupled with action to make them worthwhile.

The cut-scene contributes something to the plot. There’s a saying in screenwriting: start a scene as late as possible, and end it as early as you can. The same applies to cut-scenes. Developers really need to be asking themselves: Is this cut-scene necessary?

Can the information in the cut-scene be delivered through gameplay instead? If there is any doubt, gameplay should take precedence over the passive alternative.

The Call of Duty series is the worst offender here. The franchise has gamers controlling their characters when, all of a sudden, they enter a quick-time event that sees them slide down a hill or grab onto a cliff. Why do something that completely ruins the gamer’s immersion instead of providing her with the tools to interact with the game on a deeper level? Given the choice, let gamers hang on to the cliff themselves; they shouldn’t have it done for them.

Cut-scenes need to integrate with the story. Every cut-scene, every single one, must provide some piece of information to the player. That information might be visual or verbal, but it must contain something she didn’t know before.

Simply showing something cool, like an explosion and the character dodging away, might be fun, but it doesn’t actually deliver the player any kind of narrative prompt. For all its strengths, Prince of Persia: The Sands of Time was pretty bad at this.

After every fight, a short cut-scene would show the Prince sheathing his sword. Who cares? Do we need this? Not at all; it provides nothing.

Cut-scenes are a tool — they are not the finished project. They need to assist the player in coming to the conclusion of the story — not just give her something pretty to look at.
