This article first appeared in the January 2010 issue of Game Developer magazine.  It has since been reprinted on Gamasutra at this link.


How Realistic is Too Realistic?

One can ask the makers of The Polar Express, the animated Christmas film released just in time for Thanksgiving in 2004. Despite huge investments, a big-name director, and Tom Hanks providing the voice and mocap animation for several roles, the film struggled at the box office, getting swamped by another animated film, The Incredibles, released five days prior. Around this time, the concept of the Uncanny Valley entered the public mind.

The Uncanny Valley is a theory that most game artists (especially modelers and animators) are well aware of now, but that wasn’t always the case. Around the same time The Incredibles was trouncing Polar Express at the box office, too many art directors believed that games made for serious gamers had to chase photorealism in order to be successful. World of Warcraft eschewed all of that for a cartoony appearance, and in doing so blew past all the competition and expectations. Three years later, Team Fortress 2 would do the same for a shooter market that had previously obsessed over realism to an insane degree.

Realism is a choice, both for artists and for designers — but it can also be a trap, and one that is perilously easy to fall into. In art, chasing realism is expensive — technology can provide incredibly lifelike visuals now, but it’s increasingly costly and time-consuming to generate that content, and the end result is a screenshot that looks not all that different from those of competitors who are also chasing realism as an end goal. But realism isn’t just a pitfall for artists — game designers also flirt with realism as a source of inspiration for their game mechanics, often with staggering implications for their game designs.

The Realism Trap

The unwary designer can get into trouble by trying to follow realism too closely. Making a scene look realistic doesn’t necessarily make it look more beautiful, fantastic, or intriguing. Similarly for designers, making a game mechanic realistic doesn’t necessarily make the game fun.

A common way this shows up in our game designs is the rise of sandbox games. Sandbox design was once largely limited to strict simulations, but the success of Grand Theft Auto has resulted in game designers trying to shoehorn sandbox principles into almost every genre of gaming. In sandboxes, players are free to go anywhere and tackle content in almost any order, rather than being drawn along a linear game path with unreachable areas blocked off by unrealistic obstacles or invisible walls. True, it’s more realistic, but it’s also more expensive to build and test that world.

And even if it weren’t, sandbox gameplay may fight with other tenets of the design. For example, most players get confused and overwhelmed when told to find their own fun, and systems need to be devised to lead them to interesting activities. Compelling narratives are harder to tell, because designers lose control of the order and flow. Sometimes the issues are more insidious: Burnout: Paradise’s open world structure made it difficult for players to run the same race or challenge twice in a row, something many racing game players want to do.

Are sandboxes inherently bad? No — some of the finest games in the world are sandboxes. But injecting this level of realism into a game has very direct repercussions on the cost and design of the game that the designers must be mindful of.

When Realism Creates Unrealistic Behavior

In the early days of Everquest, it was not uncommon to stumble upon another player in the wild who was throwing himself off a short cliff over and over again while spewing gibberish indecipherable to passers-by. Use-based advancement was to blame: while most of Everquest’s advancement model was centered on a classic level-based system, non-combat skills like “Language” and “Safe Fall” advanced as the character performed in-game actions. Thus, our mysterious cliff-diving tonguespeaker was someone whose character was, ostensibly, learning new trades.

The ironic thing, of course, is that these use-based systems are designed to be realistic — practice making perfect, and all. Some players and designers are bothered by the idea that you can learn how to speak Orcish by killing kobolds until you gain a level. A learning-by-doing system makes perfect sense to them.

But in practice, learning-by-doing falls into a sort of uncanny valley of game mechanics. Efficient advancement in a use-based system often nudges players toward odd gameplay that is repetitive and not particularly fun. Rather than feeling natural and elegant, the mechanic feels unnatural and contrived, and worse, draws attention to itself in the process. Learning Orcish by killing kobolds may not be terribly realistic, but at least at no point is the player being asked to do something he didn’t want to do anyway.
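To make the incentive problem concrete, here is a minimal sketch of a use-based advancement rule, with names and numbers invented purely for illustration (this is not Everquest’s actual formula). The only way to raise a skill is to repeat the action that trains it, whether or not that action is fun at the moment:

import random

# Hypothetical use-based advancement: each time an action is performed,
# the skill governing it has a chance to tick up. All numbers here are
# invented for illustration.
class Character:
    def __init__(self):
        self.skills = {"Safe Fall": 0, "Orcish": 0}

    def perform(self, skill_name):
        skill = self.skills[skill_name]
        # Lower skill means a higher chance to learn from the attempt.
        gain_chance = max(0.05, 1.0 - skill / 100.0)
        if random.random() < gain_chance:
            self.skills[skill_name] = min(100, skill + 1)

# The "optimal" way to train Safe Fall is to hop off the same short
# cliff hundreds of times -- exactly the degenerate behavior above.
character = Character()
for _ in range(500):
    character.perform("Safe Fall")
print(character.skills["Safe Fall"])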

Realism vs. Consistency

There’s a lot to like about Gotham Central. The DC comic was a police procedural set in Gotham City, one that tried to describe what it was like to be a detective who has to clean up after Batman and the Joker slug it out amongst the rooftops. The comic ran for 40 issues, earning meager sales but strong critical praise. Those who loved it often cited the series’ gritty realism, which is interesting given that the series still hinges on a man who fights crime dressed as a bat.

A lot of times, people think they want realism when what they really crave is internal consistency within a given universe. Gotham Central feels a lot like what happens if you merge the classic Dark Knight with gritty TV cop fare like The Shield. The goal is to make the rest of Gotham as real as possible, and the end result is a world where Batman is still amazing and mysterious, without becoming silly or ludicrous. He feels possible — even though he’s not.

Immersion is the goal. The player should be drawn into your worlds and experiences. Realism is good when it supports immersion, and bad when it gets in the way. For example, most single-room buildings in games are huge, often with 18-foot ceilings. It’s not realistic, but the player rarely notices. On the other hand, he always notices when, in a small room, the camera moves in too close to see or do anything.

Jumping is an interesting place where realism and gaming diverge. Most games that have jumping allow ludicrously high jumps — often a character can leap 6 feet high from a dead stop, because it feels right (see Inner Product November 2009). But recently, some action games — such as Gears of War — have been experimenting with not allowing jumping, since jumping around like a jackrabbit in heat isn’t particularly realistic. For the most part, these experiments have been successful — until the player finds an obstacle that he can’t jump over in the game but could easily in real life. Even worse, he could clear it by five feet in a game that allows jumping. The obstacle feels unrealistic, and worse, noticeably so. It’s a problem because it breaks immersion.
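For a sense of how far the 6-foot convention strays from reality, here is a quick back-of-the-envelope calculation using basic projectile physics; the real-world jump height below is a rough, assumed figure for a fit adult:

import math

GRAVITY = 9.81  # m/s^2

def launch_velocity(apex_height_m):
    # Vertical launch velocity needed to reach a given apex: v = sqrt(2 * g * h)
    return math.sqrt(2 * GRAVITY * apex_height_m)

game_jump = 1.83  # ~6 feet, the common video game jump height
real_jump = 0.5   # rough standing vertical leap for a fit adult (assumed)

print(round(launch_velocity(game_jump), 1))  # ~6.0 m/s
print(round(launch_velocity(real_jump), 1))  # ~3.1 m/s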

To some degree, the realism we are bound to is determined not by real life, but by our forerunners. Hit points linger as a concept because most games teach us that you usually hit what you swing at, but fights shouldn’t be over instantly. When an NPC tells you to “hurry,” he doesn’t mean it unless a timer appears on your screen. Rocket launchers aren’t just great weapons; they’re also solid ways to propel yourself up to a hard-to-reach ledge. But it’s not just games — most gun effects in shooters sound more like they do in the movies than they do in real life, because the theatre is where most players learn what automatic gunfire sounds like.

In all these cases, following unrealistic conventions can make the games feel better than taking a more realistic approach that breaks player expectations. Worse, breaking convention can make the game feel less realistic, even though it is more so.

The Place of Realism

Designers sacrifice realism all the time, of course. In the real world, it only takes one bullet from an assault rifle to kill a man. Building a breastplate from raw iron doesn’t happen in less than 10 seconds. If you get brought near death by the jet of a flamethrower, you aren’t likely to be hopping back into battle after a couple of first aid kits. This is before we get to the inherent fantasy of the worlds we build: worlds full of dragons, gangsters, or battle cruisers. And lest we cut out the mundane — short of The Sims, no games require your characters to take bathroom breaks.

But realism can enrich a game as well. An MMO that has crafting can have a much more realistic economy than one that doesn’t, even if the mechanics of crafting aren’t realistic. An assassin who trades in poisons feels more real, even if game balance requires that poison be a minor damage-over-time effect instead of being immediately lethal. Bouts of Madden that end with scores like 30–27 feel more real, even if it takes five-minute quarters to keep the scoring that low.

At the end of the day, players play games to escape the real world, so designers shouldn’t be such slaves to it. Players are hoping to live a fantasy provided by the game designer. Good games make those fantasies as immersive as possible, but they don’t always do that by making them realistic. Sometimes, too much realism gets in the way.