So, this topic was brought on by this article. In it, the author talks about gameplay/story segregation in video games: how the story a video game intends to tell is often undermined by its gameplay. The author uses Tomb Raider as an example, pointing out that while the story and cutscenes portray Lara as reluctant to kill and emotionally vulnerable, the gameplay portrays her as a mass murderer with special-operations commando skills. Another example comes up in Bioshock Infinite: at one point in the game, you come across a group of executed rebels, and Elizabeth, your partner, gasps in horror at the sight. I laughed. I mean, does the game expect me to feel sympathy or regret? I've probably killed literally HUNDREDS of the same people just to get where I'm standing. How am I supposed to believe Elizabeth would show anything but utter apathy at this point?
This becomes the biggest problem, IMO, in open-world games. The best open-world games let you do literally whatever you want, but balance that freedom with real-world consequences (Fallout 1 and 2). On the other end of the spectrum, we have games that simply don't have good "safeguarding mechanisms". Safeguarding against what, you might ask? Safeguarding against a player's inevitable wish to break the game.
You see, when you have a companion in a game and you decide to shoot him or her, a couple of things might happen. Blood might squirt out of your companion and nothing else happens. Blood might squirt out and your companion might say "hey, what gives!" or something to that effect. You could actually kill your partner and have the game restart. Or the game could just lower your gun when you pull the trigger. This is a safeguarding mechanism: it safeguards against players being @ssholes, basically. In LA Noire, the most recent game I've played, I immediately started a driving destruction spree. I ran over everything, probably killed like 20 people. Other than my partner making smart-aleck comments and a little flash in the top-right corner of the screen saying "don't kill people", nothing really happened.
The implications are hilarious. In the story, I'm playing a do-gooder cop who wants to do the right thing. In the game, I'm a homicidal maniac who, despite killing like a hundred people, becomes the fastest-promoted detective in LAPD history.
In a perfect world, interactive fiction would compel players to behave in ways roughly analogous to how the fiction's author intends them to behave. For example, in Fallout 1 and 2, I could literally kill anyone in the game world. I would have to deal with the consequences, of course, but I could do it. There were no "quest-essential characters". If I wanted to, I could stop in the middle of the desert and shoot my companion in the head. But Fallout 1 and 2 had such a strong fiction, such a strong story, that I didn't WANT to. In GTA, literally the first thing I try to do is figure out how to ramp my car to decapitate the maximum number of hookers. In Fallout, I don't have a desire to randomly shoot every stranger I come across, because the fiction makes me CARE about the people and the place. I don't want to shoot Ian (a companion in FO1) because I really LIKE Ian. And Dogmeat. And Cassidy. And all my companions in New Vegas (they're really well written).
Anyway, my question to you all is this:
How big of a problem is it that players can effectively screw up video-game stories? Is it important to you? Why or why not? Do you think it's an inherent flaw in interactive storytelling in general?