For my not-so-triumphant return to C-Blogging after a couple months' absence, I've decided to write about graphics.
The visual representation of thousands of lines of code, neatly displayed in rows of colours on our TVs or monitors. Many gamers' first impressions of games come from them. But have they become an obsession?
It's no surprise that video games this generation have excellent, near-realistic (emphasis on the "near") graphics. We have high polygon counts, advanced lighting, and incredible texture resolutions. But are these great graphics really necessary?
This generation of gaming has brought a huge improvement to graphics - the Xbox 360 and PS3 brought high-end HD visuals to the masses. Higher polygon counts and texture resolutions meant greater graphical quality. More explosions, animations and lighting effects. But, really, how does any of that help the game? Sure, it improves the cinematic quality. But take away the graphical improvements and we're still jumping (metaphorically) from platform to platform, shooting things in the face, role-playing, and playing through action scenarios and/or adventuring. I honestly think that the last generation of graphics was just fine and needed no upgrade to provide a good gaming experience.
"Next generation" gaming you say?
That's not to say that the actual gameplay hasn't improved since last generation. Far from it. But my point is that most of it could be accomplished on an Xbox, or a console of similar power, with enough effort.
I'm no computer genius, but if we took away some of those (in my opinion) unnecessary lighting effects and ridiculous polygon counts, could we have an improved gaming experience? Maybe we could have more units on screen. Maybe we could play through larger maps with more actually going on in the game. PC gaming (which is unfortunately a very expensive choice for the average consumer - impractical for most) at least usually lets you turn down the graphics settings to make games run smoother. Imagine if we could do that with mainstream consoles like the 360 - giving people who don't care about visual fidelity the option to sacrifice graphics for a better gameplay experience.
I'm getting off topic.
The main problem with our obsession with graphics is the consumer. If people see a game with terrible graphics, chances are they'll sh*t all over it before they've even played it. A game leaves a bad public impression if it doesn't look up to the standard of all the other AAA titles out there. It's pressure like this that makes graphics such a high priority, and that brings up another point of mine.
Would it really be so bad?
Imagine if no one had upgraded their graphics in the move to this generation. Just look at Call of Duty: Black Ops on the Wii and tell me that next-gen games couldn't function with limited graphics. With all of the newer systems' spare resources, what kinds of games could be made? The actual quality of games would improve tremendously, and games that previously pushed systems like the PS2 to the limit could now run with plenty of room to spare.
Or maybe I'm just a crazy lunatic.