Or: They just don’t make ’em like that any more
Videogames, as we all know, are quite the success story: In only about fifty short years, they developed from a fringe phenomenon into the multi-billion dollar industry and world’s favorite pastime they are today. Videogames tell increasingly complex stories, present us with ever more astonishing graphics and soundscapes, and create game-worlds so large that one could wander for hours upon hours without crossing the same virtual river twice. But for an increasingly large number of people, myself included, this rapid development appears to bring about something quite worrisome indeed: the death of creativity in the videogame industry.
You may be wondering what I’m talking about - after all, I just talked about how excellent some videogame stories are and how artfully creative the graphics in some games are. Well, allow me to elaborate. For most scholars in videogame studies, things like story, graphics, the overall presentation of a videogame, are secondary to a much more essential aspect: gameplay. Gameplay is the single most important part of a videogame; it’s the thing that makes a game enjoyable. It’s the main differentiator from other audio-visual media like movies, television or the world wide web. Unfortunately, it’s also real fucking hard to define. I might return to this issue at a later point, but for now, I’m sure most everybody has a certain thing in mind when thinking about gameplay; these very subjective definitions will do.
For many years, gameplay was quite obviously the most important part of a videogame: when everything is presented in very crude form, with just a few white lines on a black screen and maybe some tinny audio from an early sound chip, there is just not a whole lot there to distract you from the main activity. There was also next to no reason to play a videogame if it just wasn’t fun. With the advent of ever more elaborate presentational capabilities, on the other hand, the distractions for the player - and also the means for hiding weaknesses in the gameplay - increased tremendously. Let’s take a popular (and kind of controversial) example: Ubisoft's Assassin's Creed.
Both the original and its sequels have amazing graphics and provide ample opportunities for "emergent" gameplay. The core gameplay mechanics work wonderfully well, be it climbing, fighting or just running for your life from an angry horde of Templars: almost everything is fluid, feels intuitive and natural. As far as control interfaces go, you can’t do much better than Ubisoft Montreal did with these games. The games’ historic cities are presented in great detail and offer astonishing visuals. So, why would I still agree with most of the scathing criticism in Destructoid’s Jim Sterling’s review of Assassin’s Creed II? Well, because there are some serious issues with the gameplay. (Although, to be fair, I probably wouldn’t give the game a 4.5. The first game? Less than that, probably.)
And that’s really the problem with many, many newer games out there. Many games create awesome worlds, but the things you can do in them? The missions and quests, the activities and the rest? Boring. Unimaginative. Stale. Just plain not fun. The creativity in gameplay that for so many years dominated the videogame industry and made this great pastime ever more popular seems all but dead in many development studios. There are probably many reasons for that, but the three most important ones, in my point of view, are the following.
1. Graphics fetishism
2. Economic risk management
3. Drive to make games more "cinematic"
I already touched on the first point above, so I will concentrate mainly on the other two. As the industry grew, game production became more and more expensive. This led to the split between game developers and game publishers, comparable to how things work in the film and music business. The developer presents an early form of the game to the publisher, and the publisher then decides if the game is worth financing - read: if there's money in the idea. In the last few years, publishers have demanded ever more elaborate and advanced versions of the game from developers before deciding whether they get their money after all - and this early version has to be paid for by the developers themselves (although they usually get their investment paid back once a publisher picks their project up). This unfortunately leaves developers ill-advised to try something off the beaten path: They might have a great, revolutionary idea, but pursuing it is much more risky than just churning out one gritty military shooter after another. Everybody knows that there's a huge market for those games, but if nobody is ever going to take chances, this is likely all we'll get.
The last great problem the game industry has is its obsession with the term "cinematic". Just go and read a few reviews of modern action games, or read the back of the box of your newest (J)RPG. Most every game promises a "cinematic experience" - and it does sound great, doesn't it? It sounds like bullets whizzing past our heads and explosions so lifelike that their shockwaves will cause our intestines to tremor. But what "cinematic" really means is: scripted gameplay with minimal freedom to explore and a heavy reliance on (mostly in-engine) non-interactive cutscenes. Games that work like that can still be very exciting, fun and a great experience. But when "cinematic-ness" becomes the most important part of video games, more or less regardless of genre, the gameplay will suffer. Movies are movies. Games are games. They don't work the same way, and they shouldn't look or feel the same way, either.
I for one implore all of you: Support developers and publishers willing to take risks. Question your own opinions on the presentation of games. Don't pirate games by indie devs, because they are one of the last innovative forces remaining in the business. If you feel like you have to pirate something, pirate bestsellers.