[Editor’s note: We’re not just a (rad) news site — we also publish opinions/editorials from our community & employees like this one, though be aware it may not jibe with the opinions of Destructoid as a whole, or how our moms raised us. Want to post your own article in response? Publish it now on our community blogs.]
Gaze upon the cold steel maw of the imposing yet awe-inspiring stronghold. Note the trails of rust snaking through every fissure, every iron grate, every shattered window, as though an otherworldly ivy has ensnared the structure in its myriad tendrils. The coppery stains that litter the floors and walls as if an unseen painter allowed his brush to splatter across the canvas indiscriminately indicate that the souls of this once-bustling compound met a gruesome end.
Everything is so rich, the world so alive in its destitution. The fog is so thick that it almost seeps through your television screen. You teeter on the edge of the couch, pricking your ears at every hollow reverberation. You are so happy you sprang for that 7.1 surround system. As you ready yourself before plunging onward, you remark how you’ve never known true immersion until this very moment.
As technology allows for greater visual and audio fidelity, as game developers further their pursuit of the perfect balance between narrative and gameplay, gamers grow excited at the prospect of greater levels of immersion. There is no reason to return to the old gaming haunts that can’t provide an experience of the same awesome magnitude as today’s hardware. The great holy grail of gaming is an immersive universe in which you cease to be a player and become a resident of the world. Immersion, immersion, immersion.
I keep hearing that word: Immersion. What does it mean? Why is it so damn important? Why does it appear, according to the rumblings on game sites, message boards, and blogs, to be a quality that is unattainable on anything other than the most high-end hardware available? Does that mean that games we consider immersive today will no longer be as immersive tomorrow? Why isn’t there a clear-cut definition?
At the very least, “immersion” describes the state in which a person is completely absorbed in an activity. In the case of videogames, to be immersed is to be so totally engaged that we discard all our cares and worries in order to devote all our attention to playing the game. Sometimes we lose track of the time or put off necessities like eating or using the toilet just so we can get to that next save marker, that next town, that finish line.
But really, to what extent does atmosphere alone keep a player engrossed? I’m not downplaying the importance of presentation as an effective means of drawing you into a game. Obviously, a game that is critically lauded for having fantastic character models and a beautiful ambient soundtrack is going to be more immediately appealing than a game that is lambasted for looking like a graphic design major’s first-semester homework assignment. We are reminded every day of the benefits of making a good first impression, whether in preparation for a job interview or in meeting a significant other’s relatives for the first time. Outward presentation doesn’t always reflect internal value, but it does inspire confidence that quality and care were heavily invested.
Once you have been enticed, after you have been drawn in, what guarantees that you remain engaged? One of several things can happen. You can grow bored or frustrated by the nature of play and lose your confidence; you can feel inspired to play for an extended period because there is just enough content to tickle your curiosity, or because you hope that at some point the game will ensnare you completely; or you can click with the game immediately and play until you will yourself away. Yeah, this sounds like the old “graphics over gameplay” cliché, but I want to stress that a game doesn’t have to be fun or even all that good to be engaging — like watching a terrible public speaker lose his or her composure on stage and then anticipating what else could possibly go wrong. I think that’s an important distinction to make, even if it isn’t the most likely outcome.
You’ve heard of the Tetris effect, haven’t you? If you play Tetris frequently, you may find yourself staring at ceiling tiles and remarking how similar they appear to tetrominoes or gazing towards a cluttered shelf and imagining how you could rotate books and knick-knacks to form neat and orderly lines. It’s similar to when you see light spots superimposed on the backs of your eyelids after staring at a pattern of black dots on a page for a few minutes. While the same effect can occur with prolonged exposure to any game, it is most commonly associated with Tetris because of its ubiquitous and addictive nature.
Contrast that game with any high-end Hollywood-esque production from the past few years. Tetris is often hailed as one of the most engaging games ever made, yet it has no narrative and only enough graphical flourish for a player to distinguish among the different shapes. If a game about falling bricks can absorb players so deeply, wouldn’t that imply that immersion relies very little on how “natural” a game looks or feels?
I know, not a fair comparison. Opposing genres, apples and oranges, all that jazz. There are different expectations for a puzzle game compared to an open-world shooter or a 3D platformer. If you are racing through a misty jungle, you have to believe that it’s a misty jungle. If you are launching through the cosmos, you have to believe that you are hurtling through the cold vacuum of space. Surely, the less work required to suspend your disbelief, the more likely you are to sync up with the heartbeat of the game’s fictional environment.
As beautiful as a game like God of War, Heavenly Sword, or Ninja Gaiden may be, what are you most focused on when you are beset on all sides by blood-thirsty hordes? Are you thinking of what variety of techniques you should string together in order to minimize health loss as efficiently as possible? Or are you admiring the stitching on your clothing, the individual beads of sweat running down your enemies’ cheeks, the hollow sound of footsteps on the floorboards? Get distracted and you’ll be dead within a minute. If you are serious about seeing a task to completion, you’ll want to devote the bulk of your attention to that task. Everything else is superfluous nonsense.
Don’t believe me? When you read, how much information do you retain? Can you recite an entire passage word for word on your first try? Unless you possess a true photographic memory, you’ll only be able to recall the most important details. It’s the key words in each sentence that are crucial. The more focused you are on the content, the less likely you will get hung up on spelling and grammar errors should you happen to notice them.
The same thing happens in videogames. Your brain can only process so much information at a time, so you remain on the lookout for visual and audio cues. That’s not a Mettaur, it’s a little yellow blob. Those aren’t Locust, they’re big gray guys that bleed. It doesn’t matter how atmospheric or true to life the game’s initial presentation is. Eventually, you are going to filter out nonessential data, after which it will be up to the game to lay a trail of breadcrumbs to lead you to the next objective.
If that’s the case, why don’t games revert to the Atari age? Aside from an appealing presentation, atmosphere provides necessary context. The pieces have to come together to form a cohesive, believable whole. What I argue is that the importance of minute details is overblown. It doesn’t take much to fool the human mind. All it takes is a very convincing lie. Some game developers are able to pull that lie off with fewer resources than most while others can sink millions into a project yet make us feel so disconnected from the world that we shelve the game after half an hour.
Ultimately, what you and I believe encapsulates immersion varies wildly. That’s where my big problem is. The concept is intentionally vague, left up to the interpretation of the individual. Its use within the gamer community picked up tenfold over these past few years as the rift between Wii and HD console owners widened. Whenever a Wii port is considered or the rare major third-party Wii exclusive is announced, I hear talk about how much more immersive it would be on the PS3 or the Xbox 360. It’s as though gamers more or less understand that a lot of the output this generation is just spit-shined variants of earlier titles, and thus they play the “immersion” card because no one can argue against something so nebulous.
Why would those games be more immersive? No one can seem to answer that. People say that the world would be more vibrant and the game much more satisfying than a similar experience on lesser hardware. Is there some grading rubric you guys use to measure how much more involved you are with a game in which you can hear artillery fire a hundred feet out, 13 degrees north of east, thanks to Dolby Digital magic? What about portable gaming? Is that a lost cause?
Claiming that higher-grade hardware will, all other elements being equal, yield greater immersion in gaming is as crazy as claiming that higher-grade paper will yield greater immersion in books. Like “vision,” “immersion” is just another wall for console warriors to hide behind when what they really mean is that a game should come to their machine and not to yours. I wish they would just be honest and admit that they need to justify the hundreds of dollars they blew during a wild trip to Best Buy. No one will hold it against you.
Or maybe I’m just a bitter Wii fanboy with an agenda, and thus my argument is invalid.