Derek Pietras's Profile - Destructoid

About
Derek spends his days trying to keep up with Sonic the Hedgehog, his evenings attempting to jump as high as Mario, and his nights by sneaking into the Ninja Turtles’ secret lair in the hopes of getting some special ninja training from Master Splinter.

Among other things.

Born and raised in boring ol’ Massachusetts, Derek has felt the call of fantasy from a young age. Proudly declaring that “Reality is boring!” he strives to find new and interesting fantastic worlds with an unmatched drive. He hopes that his works will one day inspire others to explore the fantastical. He welcomes anyone on board for the ride.
Player Profile
Xbox LIVE: Ryoma90
PSN ID: Ryoma90
Steam ID: Ryoma900
Wii U code: Don't know it.
Follow me:
Twitter: @DerekPietras
Facebook: Link
Derek Pietras's sites
URL: Blog!


If you don't know, frame rate is the term for how many images, or frames, are displayed per second in rapid succession to create the illusion of movement. Movies, with rare exceptions, tend to clock in at 24 frames per second (FPS), while video games tend to average around 30 FPS, with some reaching 60. Obviously, the more frames you can display in a single second, the smoother the game will play. It has become a point of contention among some gamers, especially since games on the PS4 can hit a solid 60 FPS while the same game on the Xbone stutters along at 30.
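
To put rough numbers on that (this is just my own back-of-the-envelope math, not anything from a developer), the frame rate determines how much time the hardware has to draw each frame:

    # Rough frame-time budgets: how long each frame can take before the
    # target frame rate is missed. Purely illustrative arithmetic.
    for fps in (24, 30, 60):
        budget_ms = 1000 / fps  # milliseconds available per frame
        print(f"{fps} FPS -> about {budget_ms:.1f} ms per frame")

    # 24 FPS -> about 41.7 ms per frame
    # 30 FPS -> about 33.3 ms per frame
    # 60 FPS -> about 16.7 ms per frame

In other words, a 60 FPS game has half the time per frame that a 30 FPS game does, which is part of why holding a locked 60 is so much harder.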

But are frame rates really important? We, the gaming community, love to find reasons to bash other consoles and other gamers. Is the frame rate debate (rhyme!) just another way for us to pick on each other, or is it important?

Depends on who you ask.


Seriously? That's your brilliant answer?

To be honest, I'm surprised at how few gamers actually know what a frame rate is. I've talked to gamer friends who have no idea what I mean when I say the frame rate sucks or that a game chugs along. This always seems strange to me, personally, since I've known what a frame rate was ever since Sonic Adventure 2 came out on the Dreamcast (long may it reign). Sonic Adventure 2 made a big deal about running at a silky smooth 60 FPS, and I remember being impressed with it right away. There was a noticeable difference in smoothness between Sonic Adventure and Sonic Adventure 2. Since then, I've been able to look at a game and tell you its frame rate. And to me, it's very important.

Personally, I think video games should strive for a rock-solid 30 FPS at minimum. If your game struggles to reach that on a console (PCs are a bit different, since every rig is unique), then I think the developers should be embarrassed. Anything less than 30 FPS is very noticeable in gameplay. I realize that there's a desire to have the best graphics and textures and everything, but I think it speaks of overly ambitious design if the game you're making cannot run smoothly on the console you're putting it on. After all, what good are pretty graphics if you struggle to see them in motion?

Plus, games that don't run smoothly give me eyestrain and a headache. I have wimpy eyes, apparently.


My eyes will be starting a new workout regime soon. See, my blogs are educational too!

I am speaking as an ignorant gamer who keeps up with blogs but doesn't actually work in game design, so this next paragraph may be rubbish. But I do think there is a desire among game companies to make beautiful screenshots of their games and post them online. Those screenshots aren't produced on the console, though, and so it becomes a process of forcing a pretty game to run on lesser hardware. I think that's where a lot of frame rate trouble comes from: taking a game that wasn't built around certain hardware and trying to force it into that mold. It's very much like trying to force a square peg into a round hole. Something has to go before it'll fit. Too often, that something is frame rate, and I don't think that's the right direction.

But I am also the person who turns his PC graphics down to minimum to get his games running smoothly. I don't think mine is the majority opinion there.


Just about right.

Some people claim that they cannot see the frame rate, and that may be true, especially if you don't play many games that have a rock-solid 60. But I do think that even if you can't see the difference, you can feel it. There's a reason Bayonetta on the PS3 was poorly received: it just isn't as fluid as the Xbox 360 version. To me, it's noticeable. And I think anyone who put any amount of time into Bayonetta would agree.

But that's the issue at hand. Some people don't play Bayonetta (actually, judging by sales, a lot of people don't, and that's a shame), and frame rates are less important in turn-based RPGs or anything less reflex-heavy.

I personally don't think that that's an excuse for a game to stutter along.

I think at the end of the day, the frame rate thing really comes down to personal preference. If it doesn't bother you, that's fine. Enjoy your game. After all, that's what we're all here for. But frame rate bothers me. I won't refuse to play games with bad frame rates (I did play GTAV), but I do think that more effort should be placed into making a smooth experience. Because people do care. And even if they don't, they notice the difference.

What about you? Are frame rates important to you when you play games?








Since it's summer and I'm a former lifeguard, I don't like to keep myself cooped up in the house when the weather is decent. While this is great for my pasty white complexion, it's terrible for my video gaming. Thus, progress on gaming has been slow, but I've managed to play a few things over the past week or so.



Let's get the simpler one (not worse one) out of the way. Shovel Knight is awesome. If you don't know, Shovel Knight is a 2D platformer in the style of games released in the late 80s and early 90s in the NES/SNES era. You play as the titular Shovel Knight, on a quest to defeat the Enchantress and possibly reunite with his lost love, Shield Knight.

In a word, this game has charm. From the nice throwback visuals to the very strong sense of humor, this is a game that keeps calling me back. I've not finished it yet (see above comment on being less pasty and white), but I can't help but smile whenever I play. I almost don't want to see the end, because that means that the quest is complete.

Gameplay is probably closest to a non-Metroidvania Castlevania game. You run to the right, swinging your shovel as your primary attack, and use items that consume magic points. You keep going until you reach the boss; you fight it, defeat it, and move on to the next level. There are even turkeys hidden in the walls before fights. And like the Castlevania games, this game is hard. You have to navigate deviously designed platforms, often while attacking or dodging enemies, and one wrong step can spell death.

Fortunately, the game does not use a life system, and in a clever turn, death is punished by losing money. When you die, three bags of coins pop out of your corpse, and you can return to that area to claim them. Sometimes, that money gets stuck in a bad spot, and you may find yourself dying again trying to get it back. It's a great system, though, because it doesn't make death meaningless, but it also doesn't make death heart-wrenchingly punishing. I like it a lot.
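
For the programming-inclined, here's a minimal sketch of how a death penalty like that might work. Every name and number here is my own guess, not anything from Yacht Club Games:

    # A rough sketch of a Shovel Knight-style death penalty, with invented
    # names and numbers; the real game's rules almost certainly differ.
    class Player:
        def __init__(self, gold):
            self.gold = gold
            self.dropped_bags = []  # (position, amount) left floating where you died

        def die(self, position, penalty_fraction=0.25):
            # Lose a chunk of gold, split into three bags at the death spot.
            lost = int(self.gold * penalty_fraction)
            self.gold -= lost
            self.dropped_bags = [(position, lost // 3) for _ in range(3)]

        def collect_bags(self):
            # Reclaim the bags if you make it back to where you died.
            self.gold += sum(amount for _, amount in self.dropped_bags)
            self.dropped_bags = []

    player = Player(gold=1000)
    player.die(position=(40, 12))
    print(player.gold)    # 750: a quarter of the gold is left behind
    player.collect_bags()
    print(player.gold)    # 999: most of it comes back (integer division eats a coin)

The point is the shape of the system: death always costs you something, but you always have a shot at earning it back.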

And yes, you will die. I haven't had a game make me this angry about dying in a long time. I've slammed my WiiController-Tablet-thing a few too many times playing. This is because each death is my fault. There are no cheap deaths in this game. Only you failing to overcome a challenge. It's completely fair, and all the more anger-inducing because of it.

In a word, I love it.



Now, let's move on to Gone Home. I got this game during the Steam summer sale for five bucks. I knew very little about it, primarily by choice. I knew the game's central premise (you come back to your house after a summer abroad to find it empty and need to figure out where everyone is), and I knew that it caused controversy, with some refusing to label it a “game.” Other than that, I avoided discussion of the game, because I knew that story was its selling point, and that if I ever got to play it, I wouldn't want to have it spoiled.

I am glad I did.

I won't go into spoiler territory, as this is one of those games that is so much sweeter when you know next to nothing about what to expect. Instead, I want to talk about the game in a broader sense, and how its design works in its favor.

Gone Home is the epitome of game design where you get as much as you put in. You can burn through the game in probably an hour if you rush, but if you do, you won't get nearly as much out of it as someone who spent 5 or 6 hours in it. Granted, it's not a long game, but it's a game that seems to beg for slow methodical play. Every item you find almost demands to be interacted with directly and thoroughly examined. Every room calls your name, asking you to find its secrets. And the more secrets you find, the richer the story becomes.

There is a central story, one that is quite clearly played out when you play. But more stories are unearthed as you delve deeper. You learn about the family through the items you find, you come to feel that they are real people and that you are really exploring their house. You learn their stories, their interests, their lives. It's pretty amazing, actually.

But my favorite part? None of the game experience is orchestrated. Yes, the items are placed in such a way that you will likely find them in a certain order, but you are just as likely to pass over something. The game doesn't care if you miss something. It will let you pass over many of the items in the game world.

This goes back to something I discussed in a past blog about cutscenes, but Gone Home seems to take that idea to an extreme. It doesn't care if you examine every item. You can complete the game without doing so. It doesn't force emotions on you. It doesn't say “Here's a scary scene” or “Here's an exciting scene.” It merely presents things to do, and lets you, the player, decide how to feel about them.

It would be very easy for the game to force you to examine every item, and then tell you when you've found everything. Or to not let you move on without finding everything. But it doesn't. It merely presents its story to you, and expects you to enjoy it as you see fit.

Compared to a game like Uncharted, which forces thrills down your throat, Gone Home is a refreshing change. It's unapologetic about what it is. It's a house to explore, and a story to uncover. And that's it.

And that's all it needs to be.








Like the title says. Let's do it.

I don't know if the term “Cutscene” has a formal dictionary definition, but I take it to mean those points in video games where the gameplay essentially stops, and a movie of sorts plays. This movie may be fully animated or essentially a set of talking heads discussing the plot. It generally means that gameplay has stopped, and the player can take a breather. 

Some games have hours and hours of cutscenes. Some have barely ten minutes, if any at all. They have become more prevalent in modern games though, especially since things like motion capture have become such a big part of the industry. But they are also a point of contention among gamers. Should games have cutscenes? Shouldn't that stuff be saved for the film industry where it belongs? Can cutscenes and gameplay be integrated? 


I don't fuckin' know. 

Well, I'm going to throw my hat into the ring here and talk about my thoughts on them. They may surprise you.

I like cutscenes, overall. I do. This may surprise you, considering a past post of mine, and I'll try and address that throughout this post. But I do like cutscenes. As an avid player of JRPGs, I damn well better like cutscenes. But I think, as with anything, there's a right way and a wrong way to do them. 

I think the success of cutscenes comes down to how they are integrated into the game. That does NOT mean I think cutscenes should happen while I am still playing the game. With rare exceptions (Portal being a prime example), I don't think that method works as well as the developers think it does. There's nothing worse than a game that puts you in a firefight or a sword fight with a big boss and then has characters spouting plot while you're fighting for your life. In that situation, you're focused on the fight. Your eyes are glued to the screen, trying to see what you need to keep yourself alive. Characters spouting exposition get ignored in that moment, because your focus is elsewhere. And usually, once the battle ends, the cutscene has also ended, and you've missed a valuable plot point.


You don't look busy. Let me explain why the zombies are here.

That's not to say that this method inherently does not work. Like I said, Portal 1 and 2 do it brilliantly, having the important stuff happen during calmer points in the game. Chell might be walking through a room without a puzzle while GLaDOS or Wheatley spouts on and on. Or, in Grand Theft Auto, plot can happen while you're driving someplace. That works, and works well. But if you're in a firefight in Grand Theft Auto, or running from the cops, the last thing you want is exposition being shoved at you. It could be the best exposition in the world, but you won't hear it.

This flows nicely into my next point. Cutscenes are meant to be something of a breather. Anyone who has seen Transformers or Sucker Punch knows that “non-stop action” is hardly a compliment. Games took a while, but I think they are starting to realize this too. You need moments of calm, where the player can relax a little and let their heartbeat slow down. I think this is where cutscenes can serve their purpose perfectly. They can be those moments of calm between gameplay, which is where the thrills should be.
 
And this is where we get into my thoughts regarding my past blog post. You see, too many modern games (like Uncharted) try to put those thrills into cutscenes as much as into gameplay. I think this creates the opposite effect. When I'm playing Uncharted, the thrill should come from me getting Drake out of some sort of scrape, not the game getting him out of that scrape for me. That, to me, is not thrilling. At least, not in the way that video games can be thrilling. Movies are essentially one long cutscene, but in a game, the thrill comes from the way I am challenged, not from the way the character escapes demise. That's why a game like Uncharted (and a game like The Order: 1886) doesn't work for me. The game is creating the thrills with no interaction from me. I'm not getting Drake out of a crashing airplane--the game is.

That's not a game. That's a movie.


Good job saving yourself, Drake. I'll just, uh, watch.

But Derek, you ask, you said you like JRPGs. They have tons of cutscenes that do that kind of stuff!

I would argue that JRPG cutscenes don't actually do that kind of stuff. Let's take an example from a generic JRPG I'm going to call Generica. In Generica, let's say the gameplay flow is something like this:

1. Cutscene introducing dungeon.
2. Fairly linear dungeon crawl.
3. Cutscene indicating halfway point of dungeon.
4. More linear dungeon crawl.
5. Cutscene introducing boss
6. Boss fight.
7. Cutscene indicating boss was defeated, and introducing the next area to explore.

Sound like a fair assessment? Sure seems like cutscenes really interrupt the gameplay there, especially since there's one in the middle of the dungeon. And you're right, in a sense. They do interrupt gameplay. But they are also kept separate from the gameplay. That, I think, is the big point I'm trying to make.

So, the cutscene plays that introduces Generica's dungeon. It tells me it's dangerous, I should expect monsters, but boy oh boy is my party excited to venture forth. The cutscene ends with my characters entering the dungeon. Note that this cutscene presents the problem to me. That problem is surviving the dungeon. It's not solving the problem for me. I still have to navigate the dungeon. That falls on me, the player. The game is giving me the tools to solve the dungeon (the party members) but it's up to me to put them into a group that can defeat the monsters I face.

The second cutscene tells me I'm doing well. I've made it this far. I'm doing something right. But I still have a ways to go.


Damn right I am.

Then, the boss cutscene. In my opinion, a good JRPG pre-boss cutscene has to do only one thing. It has to show off the boss. Let's assume that the boss here is one of the party's rivals, one who I may have seen in other scenes. Here, I get to see the character in full. They show off a little, and the cutscene ends with them “attacking” me. Then, I'm taken to the boss battle.

This cutscene presents the challenge (the boss) and then challenges me to beat it. If the boss beats me, I have to try again. The game does not continue if I lose, nor does the game solve the problem for me. It presents the problem to me, and expects me to figure out how to solve it on my own.

I beat the boss, and the “Whoo-hoo look how awesome you are” scene plays. Then, a new area appears, furthering the story.

This is an example of good cutscene design. It presents me with the challenge, tasks me with overcoming it, and then gets out of my way. Games such as Uncharted and The Order do not do that. They may present me with a challenge, but more often than not, another cutscene solves that problem. Drake is in a burning building. I have to climb the stairs, going through scripted “thrills” on the way. Oh no, that floor broke! How dangerous!

Except that the floor breaks every time. I, the player, am not in any danger because the game is pushing me through a sequence. It's not presenting me with a challenge and expecting me to solve it. It's walking me through a generated thrill. 

The boss battle in Generica may be incredibly challenging if I'm underleveled, or it may be incredibly simple if I'm overleveled. The game isn't watching to make sure I have a certain set of “Gasp” moments. It's just presenting me with a challenge to overcome.

My reward for completing that challenge? More story. More challenges. My reward in Uncharted? More vaguely interactive cutscenes.

In my opinion, this is how cutscenes should be used in video games. And this is something we don't see nearly enough these days.

Agree? Disagree? Let's talk.







Derek Pietras
10:05 AM on 06.21.2014

Let me walk you through my experience at E3. And, as you can see by the title, you probably have some idea where this is going. 

E3 this year really held little interest for me. Too often, I've been let down by game developers making promises that they don't keep, whether it be in the way the game looks or whether the game is actually released at all (I still want The Last Guardian, Sony!). Too many disappointments have made me into a cynical bastard, to the point where I rarely look at game previews at all. Instead, I just wait for the review. Since E3 is essentially all previews, I wasn't looking forward to it and had all but forgotten it was happening.

That, and I work a full-time job. I couldn't exactly watch the press conferences and work at the same time.

But I could listen. I started by listening to Microsoft's press conference, and, credit where credit is due, it was all about video games. Game after game was shown, all in various stages of development. However, I cannot remember anything shown that wasn't Halo 5 or Sunset Overdrive. Neither game holds much interest for me. I don't care for first-person shooters or online multiplayer, so while the Master Chief Collection sounds cool, it's not for me. And Sunset Overdrive seemed to scream “Look how cool we are!” in that 90s-not-really-cool-at-all way. I was unimpressed.


Radical.

I missed EA but found out that I didn't miss much. I caught a bit of Ubisoft's conference, where they showed games like Assassin's Creed Unity (I don't like AC on a good day—I much prefer Prince of Persia) and Far Cry 4 (see above for the FPS comment). So, nothing too exciting there.

Then came Sony's press conference. I was home from work by then, and while I had to go to bed midway through, I managed to see a bit of it and caught up on what I missed the following day. As I said in my previous blog, the trend of gameplay experiences really doesn't do much for me. Combine that with a number of trailers (such as Uncharted 4: They Might Actually Die This Time for Realz) that had no gameplay, and I was left jaded.

I went to bed that night feeling that video gaming had left me behind. As if the kinds of games I liked no longer mattered. Everything felt somewhat gritty, with gunplay, and very little gameplay. A few things were interesting, like No Man's Sky, but as a whole, I just felt like I had no place in today's world of gaming. I talk to coworkers who loved Heavy Rain (HA!) and who thought Uncharted 3 had a good story (DOUBLE HAH) and just wondered where my place in this world was. Was it time for me to just go to a used games store, buy all the old consoles, and just keep playing old games? Was there anything to be excited about?


Was this as good as it got?

Then, Nintendo happened.

Once again, I was at work, so I could only listen. But holy crap. Smash Bros. gameplay. Splatoon (a new IP) gameplay. New Zelda footage running in the in-game engine. New Kirby! Bayonetta! Captain Toad! (!) Hyrule Warriors! Xenoblade! Gameplay, gameplay, gameplay!

These were games I could get behind. They weren't all gritty. They weren't glorified cutscenes. They had gameplay, honest-to-goodness gameplay. Where other developers tried to make their games into movies and hide the gameplay as much as possible, Nintendo showed games that weren't afraid to be games. Why is Kirby in a claymation style and rolled into a round ball? Who cares! It's fun! Why are we turning into squids and shooting paint? Who cares! It's fun! Why is Bayonetta fighting a Lumen Sage and going to Hell? Who cares! It's fun!

And so on and so on.

Jim Sterling said it best.

Because of my fangirling all day at work, I was called Nintendo's Bitch by more than a few people. I found that to be a funny comment, because it's just not true. I love Nintendo games, but it's not because they are Nintendo games that I love them. Nintendo makes games that I love to play. Games that I can get behind. If Sony or Microsoft did that, then I'd be behind them as well. I don't have brand loyalty to Nintendo in the sense that I will support them no matter what. I support them because they make what I want. They gave Captain Toad his own game, for crying out loud! You'd never see Microsoft do that.

If Sony or Microsoft started making games that appeal to me, I'd be far more supportive. As it stands, the games they produce just don't interest me. Does that make me Nintendo's Bitch? I don't think so. It just makes me a fan. A loyal fan, because I plan to throw money at Nintendo for these puppies.

I'm not going to say the Wii U is a great idea. Hell, it's a pretty crappy idea that seems to be biting them in the ass. But I own one, proudly. And even if its library of games is smaller than the PS4's, almost all of them appeal to me.


Nintendo, don't ever change.








E3, in addition to being the debut of many a new gameplay trailer, has also become a home for many a video game buzzword. Words like “innovation,” “gameplay experiences,” “social interaction,” and so on get tossed around to such an extent that they become a mockery of what they once meant. Suddenly, no one can describe a video game without mentioning at least three of these buzzwords. It's gotten to the point that some people make a drinking game out of it.

And end up passed out under their computers.

Most of those buzzwords just become part of the E3 noise that I try to tune out while I wait for a trailer to begin. However, this year, one buzzword finally made sense to me: gameplay experience. Because of one trailer played during Sony's E3 press conference, I finally understood not only what it meant, but how it described a certain type of game. It was no longer white noise. It meant something.



Disclaimer: I'm going to talk about The Order: 1886. A game that I know very little about, outside of what was shown at Sony's E3 press conference.

Let's think back to The Order: 1886's gameplay demo. Let's also avoid criticizing its use of zombie-werewolf things. Instead, let's think about the demo as a whole. It opens with the player character walking slowly through a supposedly creepy house, shining a useless lantern through pervasive darkness. The character moves slowly, in what is likely meant to inspire suspense and fear in the player. Soon, the player character stumbles upon the previously mentioned zombie-werewolf things. Through a cutscene, we see a zombiwolf (which I sincerely hope is their real name) eating the corpse of something before it notices our hero. It stands up, starts to change, and then it attacks.

The player character, as is always the case, is prepared with a gun. He fires at the creature in a gameplay sequence, but the gunshots seem to do nothing. The zombiwolf gets up close, and then another cutscene plays, in which the hero is tossed through a window. The cutscene continues, and eventually the player character gets back up. Once more, the zombiwolf sees him and attacks. Once more, gameplay briefly resumes, and the player fires at the zombiwolf. And once more, his attacks do nothing. The trailer ends with the zombiwolf closing in on the hero.

This demo exemplifies what a gameplay experience is. It's a carefully crafted sequence of events punctuated by player interaction that drives the game forward. These moments are created by the developer, and are meant to elicit some kind of emotion. In this sequence, that emotion is meant to be fear. Look at this unstoppable monster! You should be afraid of it attacking you!

And yet, it inspires the exact opposite reaction from me.


Which had nothing to do with the lameness of the werewolves

According to the gameplay demo (of which very little was actual gameplay), the player's actions have very little effect on what's going on in the game. I doubt that any number of headshots would've actually stopped the zombiwolf, because the developers didn't want the player to feel like they could kill it yet. It's a sequence, then, a glorified cutscene, one that the developers have planned from start to finish. The player interaction is there simply as a means to call it a video game. No doubt, sooner or later, the hero will face off against the zombiwolf again, and this time he will be able to kill the beast. Maybe it's a boss battle. Maybe the hero finds the magical whazzit that lets him finally hurt it. But not at the moment demoed, because that's not how the developers want the sequence to go.
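
I obviously haven't seen a line of The Order's code, so take this as pure speculation, but the kind of scripted invulnerability I'm describing could be as simple as something like this (every name here is invented):

    # A made-up sketch of scripted "damage gating": during the story beat,
    # the creature simply cannot die, no matter how well the player shoots.
    class Zombiwolf:
        def __init__(self, health=500):
            self.health = health
            self.scripted_invulnerable = True  # turned off later, when the story allows a kill

        def take_damage(self, amount):
            self.health -= amount
            if self.scripted_invulnerable:
                # The player's input is acknowledged, but the outcome is fixed.
                self.health = max(self.health, 1)

    beast = Zombiwolf()
    for _ in range(50):        # fifty headshots later...
        beast.take_damage(100)
    print(beast.health)        # ...it still has 1 health; the scene plays out as planned

If that flag only flips when the designers decide it should, then nothing the player does in the moment actually changes anything.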

They are essentially leading players down a series of events, a movie where player interaction is a means to an end. They want this sequence to create fear, and rather than let fear grow dynamically (and risk the player killing the beast and not getting any fear at all), they force the player into a sequence meant to create that emotion.

But to me, it just feels like a cheap thrill.


Like a certain director's movies.

When I'm playing a video game, and I get myself in a fight that the game wants me to lose for story purposes, it takes me right out of the game. Suddenly, I'm not playing the game, so much as being a spectator in what the developers want to have happen to me. I'm watching a movie and pressing buttons in a smoke and mirrors attempt to include gameplay. But my actions don't matter. My gameplay doesn't matter. The moment has been preplanned—my failures or successes have been preplanned.

This has become an increasingly popular trend in video games, popularized by games such as Uncharted and its action setpieces. I was never able to put my finger on why Uncharted didn't excite me.

Now, I can.

A few years back, I was playing Uncharted 3 and Ninja Gaiden Sigma 2 at roughly the same time. I remember being completely enthralled by Ninja Gaiden, and bored with Uncharted 3. Ignoring the story problems inherent in Uncharted 3 (that's another blog subject entirely), I knew something about the game wasn't connecting with me the same way Ninja Gaiden was. I just couldn't figure out why.


Could have been the ninjas. I do have a raging hard-on for ninjas.

But it's actually pretty simple. In Ninja Gaiden, I am faced with challenges, not experiences. The developers aren't trying to manipulate my emotions or create certain situations or setpieces. They are dropping me in a room with enemies and expecting me to emerge victorious. They are throwing boss battles at me and expecting me to prove myself superior. The thrills are not carefully crafted, pre-designed sequences--they are created through the well-done battle system, crazy combat, and my own skill. A battle in Ninja Gaiden can be simple if I'm good at the game, or heart-wrenching if I'm bad. My health drops low when I mess up, not when the developers choose for it to happen. Every fight can feel different from the last, even if I'm fighting the same set of enemies. If I don't throw up Ryu's block fast enough, I could be left in a tough spot with very little health. If I manage to chain together a series of devastating attacks, the same battle could be over in a matter of moments. My actions matter in that moment, because the gameplay can go either way. I can fail miserably, succeed splendidly, or fall somewhere in between. But that's for me to decide, not the developers.

In Uncharted 3, almost the exact opposite occurs. I'm running along rooftops because the game told me to, not because I chose it as the best course of action. The ship is sinking, but it only moves along when the developers determine that it should. My actions are not driving the battle so much as I'm doing what I'm told. Failure is barely a setback, either, and if I do fail, I respawn in a location the developers have determined is the best place for the battle to unfold. They don't want my lack of skill to hold me back; they want me to keep moving forward, from one thrill to the next. Outside of shooting the guys, my actions matter very little. Cutscenes take over often, and there is always a set path through the level and a set way to deal with the enemies. “See that cover over there? Use that!” the developers seem to say. “It will put you in a good spot to deal with the enemies and to see the explosion we have planned!”

It just feels like I'm being led along, doing what I'm told like a good gamer.


"Can I get a gold star?" "Does this look like fucking Mario to you?"

Ninja Gaiden doesn't care if I suck. I either need to learn to get better or stop playing the game. It doesn't care if a certain battle makes my heart start pounding. If I know how to beat the boss, the boss is easy. If I don't, then the boss is hard, and it gets my blood racing. The sequence is not trying to force its thrills on me; they happen naturally through the fight. In Uncharted, the thrills are forced on me. Outside of choosing which cover to take and which guy to shoot first, my actions matter very little. The fight will end the same way.

It becomes like a movie interrupted reluctantly by gameplay.

But I don't know if it works as well in video games as in movies. In a movie, we're wondering if the hero can survive a challenge, and how they are going to do it. In a video game, we're trying to see if we, the gamer, can survive that challenge. We don't really want to see how Drake does it—we want to do it ourselves. For me, seeing Drake save himself through cutscene just doesn't have the same thrill as surviving a challenging boss fight by my own merit.

As is probably obvious by now, I prefer video game challenges over video game experiences. I feel that video games are meant to challenge us, test our skill, and have us prove ourselves superior to anything the game throws at us. Gameplay experiences, on the other hand, are there to force their thrills on us, force their emotions on us. And to me, that seems like a cheap thrill, rather than a genuine experience.

What do you think? Am I on to something here?








I was playing Mario Kart 8 online the other day when I came to a startling discovery.

I don't like when other people play as Toad.



You see, Toad is my Mario Kart buddy. He and I, we tight. We've been through some crazy tracks together, taken our fair share of red, blue, and to a lesser extent, green shells. We've clawed our way to the front of the pack, we've faced certain loss only to speed ahead, we've cursed our low speed and sung the praises of our acceleration.

We've been through a lot together. 

From my first Mario Kart game (Super Circuit, if you must know, and you may judge me accordingly), Toad and I have stuck it out. Sure, I've dabbled with the Princesses, and even tried some other male characters that weren't fungi, but I always found myself going back to the little shroom that could. It's a kind of bizarre emotional connection, one that happens in games when you select your characters. I've grown attached to Toad, and don't like when other people use him. He's mine, after all.

I find that this happens in other games too. In fighting games, I gravitate to certain kinds of characters, and then become kind of defensive about them. My first fighting game, Power Stone, was played obsessively by me and my little brother. We ended up dividing the cast of 10 (at the time, quite large) in half, and my characters have stuck with me since. We still go back to that game now and then, and when we do, I pick the same characters.

Sometimes, we joke around and he picks a character I generally play, and vice versa. While I don't get really angry about this, it does feel like he's stepping across a kind of boundary. It's not like I have any particular claim to these characters, any more than anyone else does, but I've spent so much time with them that they feel like my own. I beat Arcade Mode with them; we stuck together as we fought the CPU and earned our endings. We went through that together. They may be just code, polygons, and game data, but dammit, we went through a lot together.



Growing up playing split-screen with my little brother (and sometimes with friends (they weren't big gamers and I used to always win, resulting in little fun for anyone)), this kind of territorialism became an unspoken agreement between the two of us. I'd stick with my characters, he'd stick with his. We tend to have different play styles, which works all the better. I like the slick and speedy; he likes the heavy and slow. We never really cross the boundaries.

But now, with online gaming, those boundaries no longer exist. Anyone can play as Toad, and there's nothing that can be done about it. Toad isn't mine alone, and others should be able to enjoy the character as well. After all, it's not like I created him.

But damn, we've been through a lot together. Dashing around Rainbow Road, avoiding fireballs in Bowser's Castle, shooting out of a Cannon in DK Mountain, getting lost in Yoshi Valley, cursing the blue shell for the hundredth time. I've grown attached to the little mushroom guy. And a part of me hates the idea that someone else can play as him.

I'm not the only one who gets this way, am I?