In 2012, developer Telltale Games stunned the game industry with The Walking Dead. With its relentlessly tense sequences and dark but endearing tone, the episodic series was one of the year’s biggest surprises, going on to win the lofty Game of the Year title from the VGAs and many other publications. At the time, however, there was quite the divide between those who felt the saga was a gaming innovation and those who doubted whether it should be considered a game at all. Although the title involves player interaction, the heavily cinematic, on-rails feel of the experience led many to argue that it wasn’t truly a video game.
Since 2012, this argument has continued to occupy the collective discourse of the gaming community. The contention has expanded to include all of Telltale’s episodic series, visual novels, and even games built around full-motion video (FMV). Some call them interactive narratives, some call them graphic adventures, but I think it’s high time we clear up some of the finer points of the debate.
Games, by definition, require rule sets, and their outcomes are determined by strength, skill, or luck. By that logic, video games are a natural outgrowth of traditional games. A game’s code is really just an internal rule set that places challenges in front of players, who must use skill (and sometimes luck) to overcome them. Ask a random person what they think of when they hear “video game” and you’ll get Mario or Zelda or Call of Duty as an answer. There’s a certain expectation for games to have interactive scenes determined by player skill; that’s called gameplay. A game offers the chance to win and the chance to lose, but completing it nonetheless requires skill.
Telltale’s games, however, offer little of what one would call conventional gameplay. You can’t really move except when the game occasionally breaks to let you explore a small, controlled environment, there aren’t many puzzles to solve, and most of your time is spent conversing with other characters. To “lose” at a Telltale game is to fail an easily repeatable quick-time event. There are few moments where the plot’s momentum stops to have players shoot zombies or play a minigame. Some members of the community contend that these so-called “games” have minimal interactivity and are nothing more than glorified cutscenes broken up by QTEs. Without the ability to overcome a difficult task and reap the sense of achievement that comes with it, by definition they cannot be games. So goes the argument, at least.
Video games are considered in a different light than they were twenty or thirty years ago. Gaming used to be about achieving high scores or beating the big bad guy on the final level, with stories relegated to rarely-read booklets included in the box. But as technology has improved and data storage has increased, developers have started to build narratives directly into their games. Mass Effect, The Last of Us, and The Witcher have each been lauded for their mature storytelling as well as their gameplay. In fact, one could (and I indeed would) argue that stories are now an inseparable part of games. They provide context for our actions, giving us tasks to accomplish and rewarding our efforts.
While the main story in Tales from the Borderlands or The Wolf Among Us does not greatly diverge based on your choices, how you experience the story is still ultimately shaped by the actions you take. This kind of interactivity doesn’t remove the end goal from a player’s experience but alters it, emphasizing the progression and resolution of the story rather than “beating” a challenge. The agency may be different from that of a typical game, but it’s still there. Maybe you’re trying to survive the zombie apocalypse, or maybe you’re just trying to date a certain girl. Phenomenologically, the way you interact with and receive the game is unique to you. This approach, drawn from reader-response criticism, argues that an individual’s interaction with a game imparts a sort of “real existence” to the art; you create and interpret the meaning of a game on your own terms. The structure is there for you to interact with, and the blank spaces are there for you to fill in with your own thoughts and projections.
You can call them games or interactive experiences or garbage or whatever. The point is that they are fundamentally game-like. They have interactive elements, a goal or resolution to strive toward, and they require player input to get there. Whether or not they are games in the strictest sense ultimately doesn’t matter. The notion of what a “game” can be has changed over the years. Can you really “beat” Journey? What’s the final boss in Until Dawn? Games have changed, and as our technological capacities increase, they will continue to change. At a certain point, arguing about whether the experiences they present fit into traditional molds feels constricting. The debate is only relevant in that it forces us to address the fundamental nature of what video games are. Insisting that these experiences fit into a traditional hierarchy, however, might make us less open to new forms. I may not know in which direction video games are heading, but I’m going to play them regardless.