

PenguinFactory's blog

10:41 AM on 03.19.2013

More than a survivor: my take on Tomb Raider

"Why is this necessary?"

It's a question that often gets asked whenever an older property is rebooted, and in today's troubled financial landscape it's often a question well worth asking as older film and game franchises are shamelessly mined for brand recognition. So, let's apply the question to Tomb Raider and see if we can come up with an answer- why is this necessary?

Rewind the clock back to the release of the original Tomb Raider. At the time Sony and many developers were aggressively pushing the PlayStation as the "adult" alternative to the more kid-friendly image of Nintendo and video games in general. This led to a lot of questionable marketing decisions that would only begin to ease with the PS3 era.

Then along came Tomb Raider, which offered not just innovative gameplay and amazing graphics but a potential sex symbol in the form of Lara Croft. At a time when consoles were still expected to have mascots she seemed like the perfect figurehead to emphasize Sony's focus on older gamers. Nintendo and Sega had a fat plumber and a cartoon hedgehog- the PlayStation had a sexy lady with guns! It was perfect.

I don't know whether Sony, Eidos or the people behind the PlayStation magazines at the time are more to blame for what happened next, but this led to perhaps the most embarrassingly juvenile period in gaming history. Primitive CG images of Lara Croft in "sexy" poses started to pop up everywhere, her plastic face frozen in awkward come-hither expressions. It was incredibly tacky and sleazy and spoke volumes about the boys' club attitude of the gaming world and how misguided the quest for "maturity" had become, both issues that continue to plague games today, if to a slightly lesser extent.

As the PS2 era advanced and gaming established itself more as a mainstream hobby Lara's public image improved, perhaps as publishers realized that associating themselves with softcore pornography wasn't doing them any favours. The Tomb Raider franchise went through some rough patches at this time with a string of mediocre games that failed to update the now-ancient gameplay mechanics to the analogue stick age. After a long struggle Tomb Raider eventually re-established itself in terms of quality, but not cultural relevance. Elements of Lara's character and iconic visual design carried atavistic traces of her origin as a crude sex symbol that made her difficult to take seriously in a landscape where action heroes were becoming more grounded in reality.

Tomb Raider is a franchise worth saving. Its gameplay was revolutionary at the time, and it's important to remember that Lara Croft's actual in-game persona was always much more grounded and down-to-earth than the ludicrous virtual porn star image she was saddled with in public. I fully believe this helped to encourage the same trend towards video game characters as realistic human beings instead of over-the-top cartoon characters that would eventually make her obsolete.

This is one situation where a clean slate was absolutely necessary. Lara Croft was like that epic seven-part Pokemon fanfiction you were so proud of when you were fifteen- a product of a bygone age that you look back on with equal parts nostalgia and shame. Gamers had to be reintroduced to the character without that baggage attached. Did Crystal Dynamics pull it off?

Just like a superhero reboot, the lynchpin of this game was always going to be the portrayal of Lara Croft. Early game footage and the developers' statements initially gave some people, including myself, cause to worry about the direction the game was taking. Some of those fears were justified, and I might write more about the game's more questionable aspects in future, but for the most part I was pleasantly surprised by how much I liked hiking, climbing and face-stomping in the boots of this new Lara Croft. The fact that the game takes so long to establish her at the beginning as an ordinary person rather than simply placing a gun in the player's hand and commanding them to kill gives everything in the game a greater weight and context. I'm climbing a wrecked plane over a cliff- pretty dangerous! Holy crap I just shot a guy! With bullets! Listen to those enemies panic as I tear through their ranks- I'm kicking ass! Nothing you do in the game is fundamentally different from what you do in any other game made in the last six or seven years, but seen through the eyes of a shipwrecked archaeologist rather than a faceless action hero it all takes on a new light.

This allows for a unique synergy between story and gameplay, as the player gains skills and weapons in tandem with Lara's growing confidence. The acquisition of new gear or upgrades is accompanied by a series of escalating "fuck yeah" moments, each serving to propel Lara one step further on her journey from shipwrecked hostage to survivor, then from survivor to badass, and finally from badass to leader. By the time she runs out of a burning temple waving a grenade launcher and screaming about her intention to murder everyone in sight I was just about ready to get up and start cheering at both the triumphant character moment and the thought of reducing those machine-gun fuckers to pieces.

Of course all those skills would be useless without a good combat system. Herein lies Tomb Raider's greatest strength and its biggest weakness. Early gameplay footage suggested a predictable retread of the Gears of War style whack-a-mole cover shooting that's infested so much game design lately; in reality this couldn't be further from the truth. Tomb Raider uses a revolutionary cover system that involves moving behind objects in order to escape from bullets. I know it might seem massively counter-intuitive to not have to press a button and have your character latch awkwardly onto the nearest wall as if attracted by a powerful magnet but trust me, you'll get used to it eventually. This fluidity, along with enemies who will aggressively flank and throw grenades, means the player is encouraged to stay mobile and use improvisational strategies to out-think larger, better armed enemy forces. The addition of close combat attacks adds a whole new dimension to battles, as distracting and then charging enemies is often a much more efficient strategy than waiting for them to move out of cover. Plus murdering someone with a climbing axe just makes you feel like more of a badass.

It's unfortunate then that the combat is at its best early on, when the player only has a bow, a handgun and the ability to shove enemies around a bit, or possibly off a cliff if the opportunity presents itself. As more weapons and skills are obtained the battles start to become more conventional, relying more on rapid fire weapons than cunning and experimentation. It never entirely loses that exciting win-by-the-skin-of-your-teeth flair and the new tools do offer some cool action-movie moments (my favorite was when I launched a grenade at someone at close range, flinging myself backwards down a flight of stairs and away from his heavily armed buddies) but too often I found myself crouching behind some boxes waiting for a dude to poke his head out so I could spray bullets at him.

The other component to Tomb Raider is exploration, as there are several large hub areas stuffed full of hidden collectibles and every combat area can be revisited to explore at the player's leisure for XP-granting achievements. It's here that Tomb Raider innovates by taking a step backward, shrugging off many of the limitations modern game design has foisted on the player. Objective markers and usable-item highlights are only visible by using an optional "survival vision", indicating a refreshing willingness to trust that the player can find their way around without having their hand held constantly. It's a sad irony that this version of Lara is significantly less acrobatic than her PS1 counterpart (no backflips here) but she moves like a goddamn Cirque du Soleil member compared to the majority of cement-footed game protagonists trundling around the environment these days. Even just being able to jump whenever you want feels incredibly refreshing. Unfortunately this effort to cast off the shackles of modern gaming conventions doesn't extend to Quick Time Events, which are used far too frequently.

I'd love to say the mostly-excellent gameplay is accompanied by an excellent story, but that's unfortunately not the case. Tomb Raider spins a predictable, cliched, extraordinarily silly yarn over the course of its playtime. For all it's been marketed as a gritty Batman Begins-style reinvention, Tomb Raider goes into territory just as hokey as the early games in the series. This doesn't mean it's badly executed by any means- the dialogue is pretty well written and Lara's adventure crew reveal themselves to be a bit more than the pack of one-dimensional cliches they initially appear to be- but it's impossible to take certain plot revelations seriously. This isn't helped by the fact that the player will figure out exactly what's going on a good six hours before the characters do. The game also follows the recent trend of anti-climactic endings, going out with a whimper after 10-12 hours of increasingly big bangs.

These are flaws to be sure, but they were only minor blemishes on Tomb Raider's gloriously scratched, mud-splattered veneer for me. This is a quality game representing one major step forward for a venerable icon and a refreshing step back for gameplay design.

4:05 PM on 03.09.2013

Simcity and the review score problem

Review scores matter to gamers. Wars have been fought over them. I've been in the trenches with both hardcore Nintendo and Silent Hill fans through multiple sequels and a generation leap. I've seen things you people wouldn't believe.

Rewind the clock to the run-up to The Legend of Zelda: Twilight Princess' launch. By this point in the game's lengthy development cycle anticipation was at a fever pitch, and the regulars in the Nintendo forum I used to post in were anxiously awaiting the first review scores, specifically IGN's, which tended to be seen at the time as the prime arbiter of quality. The absurd element to all this is that the focal point of their speculation wasn't what score the game would get but how many decimal points above 9.5 it would be awarded in comparison to GTA IV, which came out shortly before. To many armchair analysts anything less than a 9.8 would be a failure, with a below-9.5 review unthinkable. IGN's semi-mythical four-point review scale had been reduced to a 0.5-point scale.

So game scores matter. But the disastrous launch of Simcity gives us more reason than ever to reject them.

If you've had any contact with the gaming world at all this week you're probably aware of Simcity's predictable server meltdown. Comment threads are aflame. Polemics have been penned. Pre-release screenshots of burning buildings have been gleefully re-purposed by journalists for hilarity by way of visual metaphor. But there's a smaller, almost as interesting story unfolding in the shadow of the larger Simcity release disaster.

I discovered Polygon a few months ago and it's quickly become one of my favourite gaming sites due to its insightful journalistic approach and classy visual style, which I have shamelessly ripped off for my own humble blogging efforts. So I was interested to see that their Simcity review has undergone quite an odyssey over the last few days. Starting off with an excellent 9.5, the score was reduced to 8.0 after the game's server woes before being dramatically chopped down to a 4.0 yesterday.

Other review sites such as Gamespot and our own Destructoid have given the game mediocre to bad scores, based partially on intrinsic design flaws such as the small amount of land available to players and the lack of a true single player mode but also to an extent on the server issues, with Destructoid's Joshua Derocher unable to tell if some of the gameplay problems he experienced were inherent to the game's design or due to the broken connectivity.

These are situations that don't lend themselves to review scores. A 4/10 says "bad game" whereas the text of the review itself says "game with problems, some of which might not still be around this time next week". Strip away the rather startling drop in Polygon's score and the reviewer's change in opinion seems a lot more reasonable- from "this game is great" to "think twice before buying" to "I can't recommend this due to technical issues". And yet it's the review score that anyone who visits the site months or even years from now will see, assuming Polygon doesn't change their score again once the servers stabilize.

Obviously most games don't have this issue, but I think it serves to illustrate the fundamental problem with condensing a reviewer's often nuanced opinion into a simple number or even a five-star scale. In even the best of circumstances it's a compromise, one that often only serves to distract attention away from the writer's opinion and facilitate toxic flame wars.

I've heard from many reviewers both inside and outside the gaming world that they don't like using review scores but feel obliged to do so due to audience expectations. In today's rapid-fire internet communication age people seem to need some measure of summary to make quick judgements. Kotaku's current binary "should you buy this?" system may be a decent compromise, eliminating a lot of the potential for squabbling and readers demanding that a reviewer explain the difference in quality between a 9.5 and a 9.8 score.

2:43 AM on 03.08.2013

Worth the wait? Tropes vs Women in Games, episode 1

I missed the boat on becoming a backer for Anita Sarkeesian's Tropes vs Women in Games project, but having been introduced to Feminist Frequency through her Lego videos just prior to the announcement I was more than excited at the prospect.

Nine months later, does the first video meet expectations?

The first thing I noticed is the length, a pleasingly chunky 23 minutes and change. Assuming all twelve planned videos in the series follow suit this would make for a combined running time of more than four and a half hours. Anita Sarkeesian: The Movie, anyone?

In the past the Feminist Frequency videos have adopted quite an informal tone, akin to a (obviously rehearsed and planned out) vlog. Tropes vs Women takes a somewhat more formal approach, with a structure inspired by documentary techniques. The video opens with an anecdote about the production of Star Fox Adventures that demonstrates the trope better than a simple dry explanation would, and we also get the historical and cultural roots of the Damsel in Distress, encompassing movies and other pre-game media.

I have to confess the use of the word "trope" in the video series' title made me a little wary, as the usual approach of simply listing occurrences of a particular story element can make for compelling reading but probably wouldn't be very interesting to watch in a video. Thankfully Sarkeesian didn't go down this route, instead focusing on a few popular franchises, even giving some historical background on their development I didn't know about, while demonstrating the ubiquity of the tropes with quick video montages. And let me tell you, when you see dozens of distressed damsels lined up ear to ear it really drives home just how prevalent the use of that character is.

One of the things that initially drew me to Sarkeesian's videos is the relatively high production values on display. After wading through so much jaggy edged green-screen and 72-point gothic fonts it was refreshing to see a video made by someone with an ounce of design sense, using a clean, visually simple style. With the extra time and Kickstarter money the Tropes vs Women project pushes the boat out even further on this, with tons of animated graphics that you would usually only see in a video put out by a professional online publication. That said, I'm not overly fond of the rather garish sea blue and pink colour choice on display.

Judging the actual content of the video in terms of discussing the trope is difficult at the moment, as this video only covers older examples, with a second part to focus on more recent trends. However it does succinctly demonstrate just how persistent and pernicious the trope has been, with franchises like Mario and Zelda endlessly recycling the "kidnapped princess" plot device even when later games in the series at first appear to be subverting it.

Sarkeesian is in a sense hamstrung in this video, as the damsel in distress trope is one of the most basic and surface-level sexist elements in video games, and as such there just isn't much there to analyse. However, I came away from the video with a renewed understanding and awareness of the pervasiveness of the damsel in distress in games. The fact that this stock character has been so widespread in games speaks volumes about who developers assume their target audience to be.

Focusing as the video does on older titles, I found the sun-faded photo of a young Anita Sarkeesian sitting enraptured in front of a SNES near the end quite a powerful moment. I think the intention was to head off the inevitable "fake gamer" claims at the pass, but it also demonstrates that many women and girls have grown up consuming media in which people of their gender are almost universally portrayed as weak and powerless, if they exist at all. You have to look at that young girl and wonder what must have been going through her head at the time, then realize that this is still a daily reality not just in video games but across all media. That's why we need this series, and a dozen more like it.

Welcome back, Anita Sarkeesian. It was worth the wait.

7:01 AM on 03.03.2013

The problems with video game writing and what we can do to help

Video games: the newest, hippest, fastest growing entertainment medium in the world. More profitable than movies, by some estimates. They're mainstream now, don't you know. Your grandparents are probably playing them. We've moved on from the semi-mythical days when playing video games on your Commodore 64 would get you beaten up at school.

And yet while games continue to change and evolve in the breadth of gameplay experiences they offer (not always in positive ways, it needs to be said) the quality of video game storytelling has stagnated or even receded in the past decade and a half. Counter-examples to this trend can be found on every platform for every year, but they continue to be bright, shining pinnacles rising above a sea of mediocrity.

As someone who loves both games and well-crafted, original stories, I've always been bothered by this situation. Over the years I've discovered that the problems in video game writing can usually be broken down into a handful of distinct flaws, many of them stemming from one original sin. In this post I'll go through them and then offer some constructive advice on how we as consumers can help to rectify the situation.

Note: for the purposes of this post I am mostly talking about Western-made games published professionally for retail, either in stores or downloadable, i.e. what most would consider the current "mainstream" of gaming in America and Europe. Japanese-developed and indie games have their own writing flaws, but they tend to be quite different.

"There are no new stories" is a phrase that often gets trotted out when it comes to writing. The intent of the statement is that certain genres tend to follow the same basic template- for example, romance will always involve people falling in love - so the writer must try to handle it in an interesting way. It does not mean that every story that could ever be told already has been and so we might as well stop trying to be original.

Nobody appears to have informed video game writers of this, because by God are they taking the concept at face value.

Picture the following scenario: you play as a lone agent operating in a dangerous situation. You're acting not on your own initiative, but according to the instructions of someone else isolated from the action. Maybe you return to a central location between missions, or perhaps your contact is a distant voice over a radio. You fight through hordes of bad guys and finally take out the villain. What happens next?

If you're at all familiar with video game narratives- if you've played even a handful of prominent games over the last five years- you should have been able to instantly guess the plot twist: the entire mission was a ruse designed to play into the hands of your mission control, who betrays you in order to further their own sinister agenda. You must then hunt them down to get revenge/rescue your love interest/stop the world from exploding. This has gotten so bad that when I played Dishonored last year I correctly guessed that the Royalist conspiracy was going to double-cross me before I even bought the game. I spent two-thirds of Mark of The Ninja waiting for my ninja master to play his hand and reveal himself to be an asshole. It's like game writers have gotten it into their heads that this is just how you write a story, that if they crack open the Bible there'll be a bit halfway through Revelation where God turns out to be Satan in disguise and Jesus has to go on a one-man crusade to kick him out of heaven.

I think Bioshock is to blame for all of this. It used this formula so well that everyone rushed to copy it without realizing how much effort Irrational put into pulling it off. Did you know they went through several different accents for Atlas before making him Irish? They got test audiences to play through the opening of the game specifically looking for a voice players would trust, just to make sure the twist would completely blindside them. There are hints scattered throughout the game that he isn't who he appears to be, but they're subtle and very easy to miss if you aren't paying attention.

Now Irrational are making Bioshock Infinite, and according to interviews with Ken Levine they decided not to go with the "voice over the radio" format partially because they knew players would spend the entire game waiting to be double-crossed or to have it revealed that their unseen ally is actually a three-headed sentient venus fly-trap. The twist has become so ubiquitous that it's impossible to pull off successfully. Gamers are going to expect it even if it's not in the game.

This is far from the only way videogame story-telling endlessly recycles familiar tropes, of course. There's the old Damsel in Distress scenario that's been going on since the early days of the medium, widely recognized as outdated and sexist but still a staple of the industry. Amnesiac heroes have decreased in frequency but still linger on as a cheap source of mystery. The rise of the gun as gaming's sacred totem has brought with it hordes of faceless soldiers hell-bent on destroying everything in their path for no obvious reason (Killzone, Gears of War, Resistance, Call of Duty, Battlefield: Bad Company, Battlefield: Bad Company 2, Battlefield 3, Medal of Honor, Medal of Honor: Warfighter, every Mass Effect game, damn near any game made this generation that involves holding a firearm). "There are zombies" is of course a concept that became over-used almost as soon as it appeared on the scene, now only considered acceptable when it's paired with innovative gameplay concepts or a story that's widely championed as among the best in the industry.

Let's have some new stories, okay? Let's have some games that don't telegraph their entire plot on the front cover.

Have you heard of the Plinkett Star Wars test? It was coined in the first of Red Letter Media's famous Star Wars prequel takedowns and goes like this:

“Describe this character to me without mentioning their appearance, their job or what their role in the plot is.”

I want you to apply this to Marcus Fenix. Take as long as you need.

Now Soap McTavish. Adam Jensen. Kratos? Hell, even Solid Snake.

If you could come up with any adjectives other than "gruff" or "serious" then you must be seeing something I'm not.

I'm not going to claim that there aren't good video game characters, because of course that's not true, but there are also rafts of terrible ones; monosyllabic grunting meat-heads and smarmy assholes and cooing sexpots who might as well make Austin Powers-style vagina puns every other second. Gaming's cup runneth over with tortured dark heroes seeking redemption and wise-cracking Han Solo knock-offs.

All of these problems come about because video game characters are largely written according to a method favoured by hacks in every medium: taking one of a handful of stock character archetypes and dressing them up with superficially distinct characteristics. I'm going to be going back to this point repeatedly, but it needs to be driven home: we're not getting characters written for video games, we're getting knock offs of the sorts of characters that went out of style in movies decades ago. This needs to stop.

Sometimes developers decide that their characters shouldn't be empty cyphers, that they should have personalities. Horrible, horrible personalities.

I don't know why so many games force us to play as assholes. They snarl and scowl through all of their lines, they have no setting between "off" and "angry". They don't have friends, they have people they won't punch or murder on sight. Most of the time, anyway.

The violent sociopath that is Kratos in the second and third God of War games may be the worst example of this, but there are plenty of others. Sam Fisher, Jak in the post-Jak and Daxter games, Cole from InFamous, Niko Bellic, what's his face from GTAIII, Wei Shen from Sleeping Dogs.....

Most of these characters are supposed to be conflicted or multi-faceted. I get that. You can write characters who are unlikable on many levels but still ultimately someone we can relate to and empathize with. Case in point- John Marston of Red Dead Redemption. We spend a lot of time at the start of the game not really sure what to think of the guy. He has a history of violence and a short temper, but he's also polite and courteous to ordinary people. We see him go out of his way to help people in danger but we also see him try to shoot someone for annoying him. He's got different sides to him and not all of them are good, but there's an underlying sense of something decent trying to come through. When he acts like an asshat it's tragic because we know he's trying to be someone better, and under the tough veneer he is better.

Most asshole characters don't have that extra layer. There's usually nothing underneath the tough guy facade but a writer's attempt to make them "badass".

Visual and non-verbal storytelling is something I have a great deal of respect for but many kinds of stories are dialogue driven by necessity. This is a problem.

The deck is in many ways stacked against game writers compared to their counterparts working in other fields of entertainment. Developers don't have the luxury of pacing in many genres (particularly in today's action heavy environment), forcing story advancement and character development to be crammed into short cut scenes. Attempting a more organic story-telling style can yield transcendent results, but is difficult to pull off while also serving the needs of the gameplay.

All of that might help to explain the generally shoddy state of video game dialogue, but it doesn't excuse it. And just so we're clear, I'm not talking about the infamous clangers from the PlayStation era. I'll take a hundred Jill sandwiches any day over writing that's just boring. Far too often game dialogue feels like a pale imitation of whatever movie the writer happened to be watching at the time (probably Aliens), is far too obviously expository, or attempts to be cool and edgy and comes across like something a 13 year old on Xbox Live would yell at his opponent.

Let's take a look at some examples. First we have a taunt delivered by the Succubus boss in DmC, a game I quite like in many respects:

“I'm going to rip off your head, piss down your neck and shit on your worthless corpse!”

Yes, Ninja Theory, we're very impressed at how many naughty words you know. Have a cookie and go watch cartoons.

Which isn't to say profanity can't be used well. Ellie from The Last of Us can't seem to go two sentences without dropping an F-bomb if the trailers and gameplay videos are any indication, but in this case it feels as though it arises naturally from the character. I should listen to someone's dialogue and feel as though I'm listening to them, not the writer's Id. Dialogue should feel natural, it should feel spontaneous. It should obscure the strings linking the characters to their human creators rather than calling attention to them.

Keep in mind that this has nothing to do with dialogue being realistic. Here's another example from one of my favourite games of all time, Bioshock 2. The villain, Sophia Lamb, spends a great deal of time trolling the main character, Delta, over his radio, leading to this particularly nice taunt:

"Rapture is a body, Delta. I am the voice. Big Sister is the hand. When Rapture speaks of you it says only this: sleep, now. Your time is done."

This is not "realistic" dialogue. It's not something a real person would ever say unless they had a speech writer feeding them lines every second of every day. But it is something Sophia Lamb would say, and it gives you a glimpse into how she views the world. This line wasn't just written to sound cool, even though I think it is. It gives you a window into the character and the world she inhabits, even though it's only a throw-away line with no real story consequence. Take any of Andrew Ryan's lines out of context and anyone who played the original Bioshock would be able to tell you where they came from instantly, because no one else could have said them.

This isn't a low bar to reach. It's not an easy thing to achieve. It takes skilled, seasoned writers working with strong creative direction. But it's the goal games must aspire to if they want to tell excellent stories. "Good enough" isn't good enough.

A subset of the previous problem. There should be a special place in hell for modern FPS games that constantly assault the player with reams of esoteric military slang. I do not need to be told every five seconds that Bravo Actual is Oscar Mike to the LZ, which is hot (the LZ is always hot). At best it's annoying and distracting, at worst I don't understand a word anyone is saying.

And no, "realism" isn't an excuse. These games are testosterone-fueled fever dreams of patriotic carnage, not sober combat simulators. Lose the soldier speak.

The mostly one-sided creative relationship between games and movies is a topic that's written about often, but I don't think enough is said about how damaging this is to many games’ writing. It leads to a creative stagnation where game stories feel like shallow pastiches of popular movies, but it also leads to a situation where developers try to tell stories in games as though they're movies even though the two mediums are not compatible on fundamental levels. You can make games "cinematic" and you can have excellent, well-directed cut scenes and even gameplay sequences that use the well-established language of cinema, but if you sit down to write a game's story and try to structure it like a film you're almost guaranteed to fail.

Look at the Call of Duty games, or any of their myriad clones. Their stories are in essence geo-political thrillers with some war movie elements thrown on top. It's a pretty simple narrative, but it's constantly sabotaged by the gameplay structure, which requires multiple characters operating in different locations featuring a wide variety of action set pieces. This plays havoc with the narrative- events occurring in different locations seem to have barely any connection to each other even though they're ostensibly interlinked, character motivations are vague or non-existent, things just seem to happen for no reason. The Eiffel Tower explodes because the developers wanted a scene where the Eiffel Tower explodes, not because it has any real place in the story. Often while playing these games I'll stop and realize I have no idea who I'm fighting or why I'm fighting them. The absolute worst offender in this regard is Battlefield 3, which actually has entire chunks of the narrative occur off-screen between missions. It's like DICE came up with each level first and then tried to make up some excuse to justify it after the fact.

Of course the other problem with military FPS stories is that most of them feel as if they were written by putting a Roomba on a keyboard and hitting the on button, but that's a topic for another day.

Open world games represent a particular problem. By their very nature these games are expected to present the player with much more to do than other genres and so you end up with tons of padding as the player is forced to wade through side-plots that don't go anywhere or advance the story in any way. Several different plot arcs often run simultaneously, randomly falling into the background and then re-appearing for seemingly no reason. The by-now standard approach of having the player take missions from designated quest-givers means that we're forced to cycle through one-note side characters who often vanish from the story as abruptly as they appeared.

I remember in Sleeping Dogs being forced to drive a rapper around Hong Kong for a shady record label owner; at this point my character was supposed to be a lieutenant holding the second highest rank possible in his triad organization. Why the fuck am I doing odd jobs for a sleazy entertainment tycoon when I'm supposed to be the commander of my own criminal army? It's because the story (you are a powerful crime boss) conflicted with the gameplay (drive around Hong Kong kicking people in the face). United Front wanted to have Wei Shen rise up the ranks of the Sun On Yee, but they couldn't think of any way to do that without radically altering the nature of the game. In this situation they should have realized the story they wanted to tell was fundamentally incompatible with the game they were trying to make. Contrast this with the GTA games, which have the player gain prestige and influence with various criminal organizations but generally stop short of having them become official members or attain high ranking positions. Thus in GTA 3 you gain the admiration of the Mafia by being an awesome mercenary enforcer but you don't become the Godfather, because at that point the game either ends or turns into SimCity with drive-by shootings.

I've mentioned several times before that elements of video game storytelling can often feel like shallow rip-offs of movies. We deride Hollywood for churning out inferior videogame adaptations, but the truth is that games are just as bad about taking the fruits of film-makers’ labor and turning them into lukewarm oatmeal. No, I'm not talking about licensed games. I mean the fact that vast swathes of video games feature stories, characters, visuals and even music that are simply reheated elements from a handful of influential movies, endlessly recycled over and over again. I once read a blog post that argued convincingly that James Cameron's Aliens has done more to shape the look and feel of video games than any other single cultural influence in the world. Really stop to think about it and you'll realize it's a fair point. The concept of the space marine, the way characters in shooters with any sort of sci-fi bent to them look and talk and behave, all comes more or less straight from Aliens. Black Hawk Down, meanwhile, did the same thing for the military shooter genre that's become so dominant recently.

Often games borrow so heavily from movies that they can start to resemble unofficial remakes. The Road comes out and reshapes how we think about the post-apocalypse, so we get I Am Alive and The Last of Us. Event Horizon? Dead Space. After The Fast and The Furious came out there was a massive glut of "urban" racing games focusing on car modification and "underground" elements, until everyone got so sick of it that reviews started to note the style as a detriment. Don't even get me started on what the post-Matrix years were like. You were there. You saw bullet time creep into every single goddamn action game in existence, you saw the badly-rendered swirling black trenchcoats.

None of this is necessarily a bad thing. I'm not saying any of these games have bad stories per se or are completely lacking in originality. I have often dreamed of an FPS based heavily on No Country For Old Men (EA: call me). But it leads to massive amounts of repetition as games repeatedly draw from the same sources of inspiration and directly or indirectly leads to many of the problems I discussed above, as developers try to replicate something they're not skilled enough to handle, or try to force a plot lifted from a movie into a fundamentally incompatible medium. Games also keep borrowing from movies long after the rest of the cultural landscape has changed, so we continue to get gritty 90s anti-heroes even though that trope is now widely considered to be juvenile and shallow. Elements of 80s action heroes keep showing up even though that type of character looks silly and cheesy to most people today outside of a few beloved classics. Did the world need yet another Mad Max style post-apocalypse setting with biker gangs and scavenger chic clothes and mutants? No, but id still made Rage anyway.

The end result is that games often feel like they lack their own cultural identity. How many games really feel like unique entities and not just pale imitations of films? If we're talking about visuals there are quite a few, Bioshock being the most readily available example. But what about in terms of story? Hardly any.

America has largely unseated Japan as the dominant force in game development as far as western audiences are concerned and is also by far the country most fond of mythologizing its own history. This leads to a related problem.

I don't think it's unfair to say that America never really got over World War 2, still championing it as the country's Finest Hour more than half a century later. (Britain has a similar problem when it comes to the First World War, incidentally.) This may explain why every single military FPS- and I mean virtually all of them- will include at least some element of WWII-story heroism, even in games set in modern War On Terror conflicts where it's not remotely applicable or suitable. Developers just can't seem to get over the plucky band of misfits completing their virtuous freedom quest against all odds, the heroic sacrifice and last stand, the Evil Villain whose death will instantly end the conflict, the "he's just a boy!" lament as the young rookie is cut down in his prime. Thanks, Saving Private Ryan. Thanks for a million Normandy landing missions.

Of course, real life is also an important and annoying source of inspiration, which is why every single military FPS has a bit where you shoot people using night vision (just like on the news!). If games fare poorly when trying to imitate movies, just think how badly they stack up against reality.

Killzone goes perhaps the furthest with this, blatantly turning its villains into Space Nazis with some Stalinist Russia elements thrown in (and a pinch of Blitz-era London, for some reason). Which is a shame because that series has a legitimately interesting setting and visual style of its own and doesn't need the history aping. Hopefully Shadow Fall, which riffs on Cold War themes in a slightly more subtle way, will abandon this trend.

Now before anyone gets the wrong impression from all of this, it's true that there are many games with good stories. But are they actually good, or just good compared to the low bar set by other games?

The truth is that a lot of games praised for their writing would be utterly forgettable if they showed up in any other medium. The most recent example is probably Heavy Rain, a completely bog-standard potboiler detective story that looked good only because there are very few competently made bog-standard potboiler detective games. See also: LA Noire. Even games justifiably held up as masterpieces like Bioshock and Half Life 2 have plots that are nothing particularly special, instead achieving memorability via their characters and settings. Most games that stand out from the pack barely approach the level of storytelling competence expected of an average made-for-TV movie.

We need to stop accepting stories that are good "for a video game" and start expecting stories that can stand up to comparison with any other medium.

Complaining about video game writing is nothing new. I've just spent more than 3000 words explaining grievances that have largely been voiced by others elsewhere. Everyone knows the writing in games is terrible, audiences know, critics know, developers certainly know. Or do they?

Without naming names, it's not uncommon to hear developers wax rhapsodic about the narrative in their game only for the finished product to fall far short of their praise. In some cases this may be due to development problems taking a toll on the storytelling, but this happens even with developers whose games rake in millions of dollars and who have access to massive teams of programmers, artists and writers. And if developers really are aware of all of these problems, why do they keep happening?

This situation isn't going to change until we as consumers start demanding it. The next time you pay good money for a game and see one of these problems crop up, contact the developer. Don't be belligerent. Just send them a short, polite, to-the-point e-mail with the following message:

You're not trying hard enough. I think you can do better.

The second and most important step is to support games with good writing. I realize that in the age of piracy as a righteous moral cause it's become heretical to suggest that anyone should ever pay money for anything, but developers need to know that people are willing to pay for strong narratives in games or they're not going to bother.

There are encouraging signs that the situation may be changing for the better. The Walking Dead dared to reach higher than perhaps any of its peers and was rewarded with commercial and critical success, embarking on the sort of underdog story you'd dismiss for being too unrealistic if you saw it in a movie. Journalists who have played the opening hours of Bioshock Infinite have said enough about it to indicate that it could go in some very interesting directions. And Naughty Dog's The Last of Us looks to be attempting to tell a genuinely mature story.

I think a lot of the hostility in the gaming community comes about when gamers see their hobby and its place in the world shifting and feel as though there's nothing they can do personally to change anything. But I firmly believe this is something we can change. If consumers want better writing, developers will have to step up to the plate and deliver it.

12:38 PM on 02.26.2013

Harvest Moon and the art of doing more with less

When I was a kid I had a fascination with games about unusual subjects. They were a lot more visible back then, before the mainstream started to converge on first person shooters and third person action games as the only viable genres. And those games are good too, but it was the game about surviving on a stranded island or driving a train that really got my attention. If you've ever looked at a game like Bus Driver and wondered who in their right mind would pay for something like that, well, look no further.

One fateful day I walked into my nearest game shop and saw a Game Boy Color game called Harvest Moon. It was about farming. Some small but highly endorphin-productive part of my brain immediately lit up like it never had before.

I could play every single Harvest Moon game back to back and not get bored, even the ones that are almost indistinguishable from each other (which is to say most of them). Part of it is that same childish "I'm pretending to be a FARMER, check me out" glee that made me impulse buy Harvest Moon GBC all those years ago, but after playing through one of the DS titles again recently I wanted to try and figure out just what it is that's so appealing about the games.

If you stop to think about it, Harvest Moon games really do offer an astonishingly wide variety of gameplay options. Even the earliest and most primitive entries gave you a blank plot of land and some tools and told you to fend for yourself. There are no minute-to-minute gameplay objectives. If you decide to just sit around all day and watch the seasons pass, the game will just shrug its shoulders and say "fine, try explaining this when the Harvest Goddess comes knocking". You can upgrade your farm with entirely optional buildings that open up completely new gameplay mechanics, such that two people's gameplay experiences might be very different. Later games have added relationship building systems, mining, fishing, mini-games, town building elements and even, in A New Beginning for the 3DS, the ability to completely customize the layout of anything on your farm. And all of this is presented not with the esoteric interlocking systems of a strategy game or SimCity, but with simple controls that even the most casual player could easily pick up.

Over the last year or so a lot of conversations have been occurring about the apparent regression of the industry in terms of complexity, with higher development costs leading to shorter, more linear and more heavily scripted experiences. Many developers in the mainstream game world are increasingly doing less with more, utilizing breathtaking graphical and computational wizardry to make games that feel smaller and less complex than their more primitive forebears. I feel like Harvest Moon turns that on its head in a way that developers could learn from. For a long time the handheld games (the only true way to play a Harvest Moon title, in my subjective but also correct opinion) avoided 3D graphics completely, most likely to keep the cost of development down and avoid this very issue.

Harvest Moon is also relevant to other discussions that have come up recently, such as emergent gameplay and storytelling, how to encourage player-driven experiences and potential ways of lessening gaming's reliance on violence.

Could this design philosophy be applied to other genres and types of games? Not universally. The design of a roller-coaster thrill ride like Modern Warfare probably isn't going to borrow much from Harvest Moon. Still, I like to think valuing more open-ended gameplay and building complex systems from simple components might be able to breathe fresh life into other genres.

Let's see more games that try to do more with less.

6:15 PM on 02.22.2013

Gender, the PS4 conference and missing the point

A crisis is sweeping the game world! No, it's not booth babes or gross trailers or the looming spectre of fake geek girls this time, it's women and Sony's PS4 conference, and the fact that there were none in it.

The reaction to this controversy was not surprising, but still deeply disappointing. As I made the mistake of stepping into the whirling Id vortex that is the comment section on any gaming site you care to name I encountered the same sentiment over and over again: what's your point? Why is this a problem? You're just making a mountain out of a molehill. Usually this would be followed by one of several rationalizations, all of which demonstrate a profound inability to understand the issue at hand. Let's take a look.

But all the important developers are men!

I'm going to describe a little scenario. Pay attention.

You're walking to work one day. For the purposes of this exercise we're just going to assume it's the 1950s and you're wearing a hat. You arrive at the office and notice that- oh no!- your fancy business hat is on fire. You can smell burning hair. It's getting awfully hot! You turn to your co-worker and politely indicate that you're having some trouble.

He looks at the blaze for a moment then shrugs his shoulders. "Afraid I can't do anything, buddy!" he says. "Your head's on fire. No way I can take the hat off when it's like that."

Did you notice how slightly rephrasing the problem doesn't actually do anything to solve it?

Yes, most influential and high ranking game developers and executives are men. That's not an excuse. That's the entire problem. Simply pointing that out does absolutely nothing to change anything.

Is Sony single-handedly responsible for the gender imbalance in the games industry? No. Is an all-male conference part of some sort of misogynist conspiracy? No. Should they have roped in some token female presenters just to give the illusion of equality? The only people who seem to be suggesting that are those trying to derail the argument, but no.

Does any of that change the magnitude of the issue? No.

There are two related points here, one being that Sony couldn't or wouldn't put any women on stage and the second being the imbalance of men and women in important gaming positions. The latter is the underlying problem causing the former; it is not an excuse for the former. It is not a reason for us to stop talking about it.

Now the actual premise of this argument may not even be true, as Kellee Santiago of thatgamecompany claims in these twitter posts reported by Kotaku's Patricia Hernandez (who was of course immediately bombarded with irrelevant comments about how all of her articles suck and she should be fired). Santiago didn't present data and I'm not currently able to find reliable demographics for women in high positions in the gaming industry, but just keep in mind that this line of reasoning may be flawed from the outset.

Women probably just didn't want to present at the conference!

Noticing that the above strategy wasn't working, some comment crusaders then invented an elaborate fanfiction scenario in which Sony went to various qualified women and asked them to present at the conference... and they all said no. Sure, they could stand in front of the entire world and help usher in the next phase of their company's evolution, but they all decided not to because... well, just because.

Let's use Occam's Razor here. We have a woman-free conference. Which explanation for this requires fewer unnecessary assumptions- that Sony didn't ask any women to present, or that they did ask and all of them refused? I'm going to go with the first option as the more parsimonious explanation. Several people have reached out to Sony for comment, so if they did ask women then they'll have ample opportunity to say so.

It's just one conference- what's the big deal?

Ah the vacuum gambit, wherein any one event is stripped of all cultural and historical context and examined purely on its own. Yeah, sorry, that's not how these things work. The lack of women at Sony's conference is just one more expression of the near-invisibility of women in games and the gaming industry as anything other than objects of male attention. Look at the percentage of female game protagonists, look at how they're usually presented in games. Look at the cover of Bioshock Infinite relegating Elizabeth to the back despite the fact that the game's story revolves around her. Look at the ratio of female speakers at gaming events to booth babes.

It's not "just one conference". Ditch your tunnel vision and start looking at the bigger picture.

Why is this a problem?

In trying to address this I'm coming dangerously close to attempting to explain women's feelings on their behalf, something I'm not remotely entitled to do. All I'll say on that front is that women have explained, repeatedly, why the gender imbalance at the conference is a problem. Listen to them.

The bottom line is this: you don't have to be bothered by this. You don't have to think it's a big deal. It's okay. No, really. The feminist police are not going to rappel through your window and take your consoles away. Sony isn't going to cancel the PS4 because someone criticized their conference. You can just ignore the blog posts and the website articles and be on your merry way.

What you can't do is try to tell people that their feelings of outrage and disappointment are illegitimate without a damn good reason and as I've spent this blog post explaining, those are pretty thin on the ground. And for the love of God don't be a shithead and try to chase people off the internet just for expressing an opinion you don't agree with. I don't have to bring this up again, do I?

I don't know how to fix this problem. But I do know that it isn't going to fix itself if we all just shut our mouths and stop talking about it.

3:12 PM on 02.21.2013

Has Sony learned from its past? My take on the PS4

In the immediate aftermath of Sony's PS4 conference the internet has been awash in opinion on every tiny detail of the event. You can get expert commentary from gaming journalists and business analysts from every corner of the world. People have talked about the controller, the graphics, the social networking features and the hardware specs until the cows come home. That's all good. I'm going to do that too. But I want to address another topic as well: Sony's approach to the unveiling and how it compares to their Playstation 3 reveal in years gone by. You see, the company has undergone something of a corporate attitude adjustment since then and I think it's for the best.

Anyone who was paying attention during the year between E3 2005 and 2006 will remember the insane hyperbole thrown around by the "big three" console makers. Microsoft and even Nintendo, who had by far the least to brag about in terms of graphics, got in on the act but it was Sony who really rode that train all the way into Crazytown Station. One memorable quote from Sony Computer Entertainment president Ken Kutaragi claimed that the PS3 would run games at 120 frames per second, making 60 fps look like "a slideshow". This is especially hilarious since PS3 ports quickly became notorious for suffering from frame-rate issues. Meanwhile the console's GPU was officially dubbed the "reality synthesizer" in an act of corporate hubris all but inviting divine retribution. Eventually the console manufacturers snapped out of whatever drug-induced fever dream they were in at the time and took on an improved relationship with reality, but for a while they seemed to be setting themselves up for a very large fall.

I was very glad to see Sony take on an admirably more humble tone this time around, mostly letting the technology and games speak for themselves with a notable lack of the corporate preening and grandstanding that tends to clog up these kinds of events (I'm looking at you, Nintendo). The hardware presentation was a simple, straightforward explanation of the PS4's features that didn't resort to overblown rhetoric. We were not promised an entertainment supercomputer. No components were given names implying they have supernatural abilities. Mark Cerny even criticized some of the mistakes in hardware design that Sony made with the PS3. A large corporation admitted they made a mistake, without being prompted by lawsuits or protestors. In public. During a press conference.

I don't know what brought on this change of attitude, but I like it. This time around Sony has a lot more to be humble about, with a previous console that didn't demolish the competition like the last two did and much more competition for dominance. I'm glad to see they've recognized that.

For whatever reason Sony decided to go with an idiosyncratic design for the PS3's internal architecture focused around a proprietary "Cell" processor designed with Toshiba and IBM. As a result the PS3 gained a reputation for being difficult to work with, resulting in most cross-platform games being developed for the Xbox 360 primarily and then ported over to the PS3, often badly. This resulted in something of a schism in the console's library, with visually impressive exclusives (look at God of War 3 and anything made by Naughty Dog) but cross-platform titles that were often considered sub-par. I don't have any direct evidence of this but I've always suspected that the difficult nature of the hardware may also explain the PS3's smaller catalogue of indie games compared to what you can find on XBLA.

Thankfully Sony appears to have learned their lesson from all this, as the information given in the conference stressed how PC-like, and therefore probably easy to develop for, the PS4's architecture is. The 8GB of RAM was as welcome as it was unexpected. For those not in the loop, consoles use a lot less RAM than PCs since they're not running a complex OS simultaneously. 8GB of RAM in PC terms is more than is usually deemed necessary for gaming but not anything to get overly excited about, whereas for a console that's an insane amount (the PS3 has 512MB, for comparison). It's also GDDR5 RAM, which people who know what that means tell me is a big deal.

I remember when I got my first PS3 after having owned all of the other platforms for years that it was the little things that stood out to me. Features like rechargeable controller batteries included as standard, free online multiplayer, a neat user interface, an HDMI port, even the totally superfluous touch sensitive front buttons all combined to give a sense of quality that the other consoles lacked. It was kind of like driving an expensive sports car- fundamentally it's the same as any other vehicle, but it's got all these neat bells and whistles that set it apart. The PS4 feature set is very similar. Nothing jumps out as being a revolutionary step up, but they've built in all these cool features like an instant start-up, a sleep mode, streaming downloads and video capture functions.

When I saw the leaked photos of controller prototypes ahead of the conference (God bless Sony and their charming inability to keep anything a secret) I worried that it looked far too chunky and bulky. Thankfully the real deal is far sleeker and more sculpted. I like the form factor a lot. The Dualshock has always been a bit cheap and plasticky compared to the Xbox 360 controller so I'm glad to see that Sony have taken some pages from their competitor's playbook by making the grips larger and more rounded and making the lower shoulder buttons into triggers. It also has a cool textured pattern on the bottom. Very nice.

The biggest new addition is obviously the touch pad and to be honest I'm having a bit of trouble imagining what it's going to be used for outside of some interface applications. I can see a handful of launch titles including some superficial support for it in gameplay before developers promptly stop trying. The light bar at the top has similar problems. Despite being introduced as a way to make recognizing controllers easier, the bar and camera are clearly intended to carry over the functionality of the Move. Integrating the technology right into the controller instead of making consumers buy a separate peripheral is a smart move but it will only pay off if developers actually use it. Otherwise we're going to be looking at another Sixaxis situation (remember the Sixaxis? No? Neither do I). The middle space being taken up by the touch pad, along with the sleep mode making traditional pausing obsolete, necessitates the merging of Start and Select into a single Options button, which is fine by me since I can't remember the last time I used Select for anything. The touch pad is clickable anyway, meaning that the overall button count hasn't been reduced.

The headphone jack is a feature I personally wanted, although I never imagined it would be integrated into the controller. I initially thought the speaker was actually a microphone, obviating the need to buy headsets to yell obscenities at strangers, but apparently not. One can dream.

The share button is a cool idea, although I have to wonder how robust the functionality actually is. Sony is correct that watching people play games is becoming increasingly popular, but mostly in the form of commentaries and Let's Plays. Simply spamming ten minute clips of your gameplay isn't really going to attract much of an audience. Now if you can take the footage you record and upload it to a PC for further editing, bypassing the need for a video capture device... I can see that taking off in a big way.

The D pad is pretty drastically different and looks quite similar to the one used by the Vita. I've seen Vita owners praise that console's D pad to the high heavens so this is apparently a good thing.

Overall I really like the look of the new controller and how I can imagine it feeling, but the features packed into it make it seem as though Sony is throwing ideas at the wall in the hope that one of them sticks. I get the distinct feeling that the touch pad and light bar are probably going to fall by the wayside soon after launch. Still, that leaves an exceptionally pretty and well designed controller that does everything the Dualshock 3 can do and more.

The big question when any new generation kicks off is always "what are the graphics like?". Personally I think judging a console's worth solely on its graphical capability is increasingly misguided, but that's a topic for another day.

The unfortunate elephant in the room here is that Sony has a history of "enhancing the truth" when it comes to console launches. The PS3 was unveiled in 2005 with a jaw-dropping Killzone 2 trailer that later turned out to be a "target render" of what the game was going to look like, something that Jack Tretton and Phil Harrison blatantly lied about in interviews. While still very impressive in its own right, the actual game looked nowhere near that good, and in many ways the target render has yet to be matched even in high-end PC games. The take-away lesson here is that you should be skeptical around console launches. For the purposes of this post I'll give Sony the benefit of the doubt and assume that the videos they didn't explicitly identify as tech demos were at least running in-engine.

Coming full circle, one of the first big game reveals was a next-gen Killzone game. Someone actually got on stage with a controller while this footage was playing so it is apparently legit. "Current gen turned up to 11" is probably the best way to describe this, along with a lot of the other games. Nothing about it obviously leaps out at you as being next-gen but upon closer examination there's a whole raft of graphical features that you wouldn't find on the PS3- sophisticated cloth physics, crisp lighting, very nice particle and smoke effects, huge draw distances (this one in particular was a common feature among many of the games shown). I'm prepared to call all of that next-gen. Keeping in mind that the first wave of titles for any new console never shows off the hardware at its best, I think this is a perfectly decent starting point. The same could also be said of Bungie's Destiny: you could tell me some of this footage was a PS3 game and I'd believe you; on the other hand the high level of anti-aliasing and massive draw distances are clearly more than the current generation could handle.

One other thing that stood out to me is that many of the games shown had bright, vibrant, almost cartoonish colour schemes, even the traditionally dour and ashen Killzone. I'm not sure if this represents a stylistic shift in the industry or if it's something Sony asked for, but either way I'm very happy to see it. More bright colours, please.

Watch_Dogs looked to my eye more impressive than what we saw last year, particularly in the lighting department. Whether that means it's running on PS4 hardware and wasn't before or the game has just been improved since then, I don't know. It certainly looks impressive though, particularly for an open-world game, and should serve as an interesting comparison when we see it running on current and next-gen hardware simultaneously.

The first trailer to sail right into "too good to be true" territory was Capcom's Deep Down tech demo. Some parts of this had the jerky camera movements I always look for in genuine gameplay footage, but other parts were clearly just cut-scenes with an interface pasted over them. The trailer also switches between first and third person at one point, which doesn't inspire confidence that what we're seeing is in any way indicative of what this or any other game running on the oddly named Panta Rhei engine will actually look like. (I can't hear that name without imagining a manta ray in women's underwear.)

I'd love to be proven wrong on this but for now I'm not buying it.

Square Enix also trotted out their Agni's Philosophy demo that they showed at last year's E3. I'll say the same thing now that I said then: I'll believe a game can look this good when I'm sitting in front of it with a controller in my hand and not a second before. (I'm also mildly concerned that this video shows poverty-stricken gangs of brown people besieging attractive white sorcerers from a technologically advanced wonder metropolis. Nice one, Square Enix.)

Then we had David Cage's giant CG head, which I can guarantee we'll all look back at and laugh five years from now.

As a special bonus to anyone unlucky enough to not be watching the conference, I want to talk about my two favourite moments. It's a sort of unwritten rule that no video game press conference can be complete without someone making an ass of themselves, and this one was no exception.

First up we had the guy from Evolution Studios introducing Driveclub, who appeared to teeter on the very brink of orgasm while discussing the level of detail that's gone into his game's vehicles. A lot of it, such as rendering the orientation of individual paint flecks, frankly sounds like complete bullshit. Then there was the InFamous: Second Son trailer, prefaced by a brief trip into the inner psyche of Sucker Punch's Nate Fox, who immediately launched into a frankly uncomfortable speech about police brutality and the erosion of personal privacy without mentioning the game at all, at one point appearing to be on the verge of tears as he recounted the time he got gassed at a protest. I don't think I've ever seen so much sorrow in the eyes of a press conference speaker.

Two subjects remain: the console design and the price. I can't talk about the former because Sony didn't show it, much to everyone's surprise. Shuhei Yoshida apparently told Kotaku that this was to stop people from getting bored, whereas Jack Tretton recently gave the far more sensible reason that the design simply isn't finalized yet. Maybe they're having trouble cramming all of that hardware into a console-sized case? It's a little surprising given that the PS3's design was unveiled a full year before we even saw any games for it. Nintendo's somewhat bungled Wii U unveiling, which focused on the controller to such a degree that many people came away with the impression it was just a peripheral for the existing console, shows that seeing the hardware matters to the average consumer.

As for the price, they didn't say anything about that either, as I pretty much expected. But we do have unsubstantiated rumours so let's not let that stop us!

At E3 2006 Kaz Hirai stepped on stage and into the annals of legend with the announcement that the premium version of the PS3 would retail in America for a wallet-shattering 599 US dollars. The memes started to flow before the conference was even over, forged in a crucible of image macros and techno remixes alongside ridge racers and giant enemy crabs. It's difficult to remember now, but at the time the price was genuinely shocking underneath all the hilarity. There was a widespread perception that Sony's hubris after the success of their last two consoles had uncoupled them from reality, not helped by arrogant statements made after the conference implying that Sony believed people would pay any price for a PS3 simply because it had the Playstation brand.

The Times reported earlier this month that Sony was considering a price of $449. Given the apparently powerful hardware involved I'd consider that a reasonable price, but I have to wonder if it will look as attractive to more casual gamers who already own an expensive smartphone. One advantage Sony has this time around is that the price difference between the Wii U and the PS4 isn't nearly as large as that of their predecessors, making unexpected competition from a console Sony had largely dismissed less likely- particularly if the Wii U's less-than-stellar sales continue. Whatever it ends up costing, price is going to be a problem for both the PS4 and Microsoft's console. People were willing to pay for these consoles before because a games console was still the default way to play games; that's rapidly becoming less true.

I like the Playstation 4. I like its controller, I like its features, I like its graphical capabilities. It seems to be building up a decent amount of publisher and developer support. I went into this press conference with no opinion on the console; I came out of it not ravenous for more, not desperate to get my hands on one, but feeling like I'll probably make it my next console purchase. I think that could be considered the best possible outcome for a show like this.

But more than that, I came away feeling more confident in Sony themselves. They've proven to me that they have their heads screwed on straight and aren't taking wild risks, while still being willing to innovate. In many ways I (probably unfairly) still viewed Sony as the arrogant schoolyard bully of the 2006 era. Today's Sony was the sixth-year student quietly studying in the back of the class. He's probably a prefect and popular with all the teachers. Look at that kid! they all say. He's going places.

Well done, Sony. Good job.
