About
Enid is a 25-year-old female gaming enthusiast who enjoys writing opinions on gaming, comics, movies, her cats, and people with awkward bathroom habits.
I am what is considered a "social" MMORPG player. When I'm hooked on an MMO, I spend hours upon hours every week (almost every day) leveling my character, working on my crafting professions, and maximizing my gear so that I can play cooperatively with other members of my guild to conquer whatever evil monster is currently threatening the continent. But without the social element of playing with other people, even befriending many of them, an MMORPG becomes mundane.

I think some game developers have forgotten how important community is for the continued success of an MMO. New content, new raids, new bosses, new gear, new mounts, new player-versus-player modes, new gizmos, toys, and gadgets should always improve a community. If you consider what an MMORPG is when you remove the other people you play with, it becomes a large-scale, watered-down RPG with repetitive and inconsequential gameplay. You cast the same five spells, kill twenty of whatever monster is invading a nearby helpless village, turn in your quest for the reward of a fraction of a piece of gold, then repeat the whole process again, probably hundreds of times. Whatever decisions or actions you take during the game have little to no impact on the game's environment.

However, when you take this "boring" game and add thousands (millions?) of people to the game world, you create unlimited potential for conversations between players. Designing an MMO with amazing bosses, great looking gear, and innovative gameplay means nothing if you don't create a means of connecting players together.

So, because I'm a huge fan of writing lists, I created a new one: how to keep subscribers from getting bored of your MMO and quitting.


Looking like a badass in Tier 19 gear doesn't mean much without babes following you around, throwing e-panties at you.


Update old content instead of just adding new content on top of it.

This is less of an issue when a game is still relatively new and no expansions have been released. But when an MMORPG has been on the market for a few years, perhaps with a couple of expansions released, it's extremely alienating (especially if you're new to the game) to create a new character and log in to a completely barren game environment. Additionally, developers tend to "water down" difficult content as it becomes obsolete, so that players trying to reach the next tier of content aren't stuck waiting for help from other players, most of whom have already passed them by.

It seems like an incredible waste of resources to ignore entire areas of a game because fewer players are required to visit them. Developers should find ways to motivate higher-level players to frequent older areas and interact with lower-level players instead of just creating new content for only higher-level players to explore. It's good to recycle old content while adding the new; adding entire new continents and forgetting the old just causes a game to bloat.



A map of the WoW universe after the Wrath of the Lich King expansion. Three continents and hundreds of areas to explore.


Areas that were relevant by the end of the expansion are marked in red (most of which are capital cities).


Add content that improves community instead of derailing it.


"Why World of Warcraft is the best MMO ever/sucks ass" is an extremely tired conversation, and it's one that I prefer to avoid, but Blizzard has implemented a variety of things over time that I consider hugely dentrimental to their own community.

The introduction of flying mounts made world PvP almost non-existent (I'm not a fan of player-versus-player encounters, but I do think faction rivalries add to a community). And while I'm sure many players were thankful to be able to quest and travel in faster and easier ways, any time you make it easier for players to avoid interacting with each other, you are simultaneously weakening the community.

As the population of Warcraft started to decline and the bloating of content caused players of differing levels to spread away from each other, it became almost impossible to organize groups for group-focused content, such as "elite" quests and lower-level dungeon raids. Too few people were creating new characters to keep the lower-level group content active, and the higher-level players had already outgrown everything besides end-game raid content. As a result, many players were unable to explore huge amounts of content without spending hours trying to put a group together. Blizzard's "quick fix" was to link servers together so that players from one server could be automatically matched with players from other servers to complete dungeons isolated from the actual game world.

While it vastly reduced the amount of time required to find a group to complete a 30-minute dungeon, it also detracted from the community of individual servers. Instead of grouping with people on the same server (people they'd likely play with again in the future), players were grouped with complete strangers from other servers, players who had no reason to communicate with each other because they probably wouldn't meet again, not that they'd have a way to communicate after the dungeon anyway. Guilds no longer recruited by observing the ability of other group members, players no longer "friended" each other to group together in the future, and dungeons themselves became chores that players only wanted to finish and leave, as opposed to a means of meeting and communicating with fellow players.



Playing a low-level character on an older MMORPG is about as fun as starting a go-cart race two laps late with a cart that smells like farts.


Force players into situations that require group-play.

A lot of players want to get to the end of a game as quickly as possible, as easily as possible. Many will attempt to skip content that requires them to group with other players, because it usually takes more time with little pay-off when compared to questing alone. This is probably the only time I'll ever say a developer should ignore the wishes of gamers, because it's good for the community to put players in situations that require them to interact. Allowing players to avoid each other turns an MMO into a single-player experience.

I played DC Universe Online earlier this year and managed to play a healer to the level cap. During that time (it took about two weeks of regular but not obsessive playing sessions), I was motivated to group with other players twice (in case you don't realize how terrible this is: in an MMORPG that I put about 40 hours of game time into, I spent a grand total of 60 minutes in a group as a healer). Nearly every quest encouraged solo-play, and even the group "dungeons" did little to promote cooperative team play as opposed to "five dudes wandering around a military base smashing things for 30 minutes." People rarely talked to each other and I never once saw a person questing with another player. The gameplay itself was interesting, the story, characters, and background were highly appealing, but DCUO did little to connect players, and because of this I have a hard time believing the game will last longer than a year.


Just like this child would inevitably benefit from being forced to attend fat camp by his parents, gamers would benefit from developers forcing them to socialize with other gamers.


With a plethora of MMORPGs being released this year (DC Universe Online, Rift, Guild Wars 2, Star Wars: The Old Republic, and a handful of others), it will be interesting to see which developers understand that community should always come first, and which games will make the mistake of believing content trumps community before disappearing from the MMORPG market.

______________________

If you're interested, my original blog is published here.








(Dtoid likes to skew my images, so if you feel inclined, my original post is here.)

I knew immediately I wouldn't like Gears of War just by reading the game case. It reeked of tired premises, outdated character designs, and stale gameplay. But I consider myself a somewhat open-minded person, so after listening to four years of constant and unwavering praise for the Gears series, I finally picked both games up from the bargain bin at a local game shop.

I really wanted to like the game. I really did. But instead of becoming engrossed in the game, I ended up making a mental list of things I hated about it while playing. I kept going until I finally (and painfully) reached the game's ending (so no one could feed me that bullshit line: hang in there, it gets better). When the ending cinematic finally cued and I could put down my controller, I breathed a sigh of relief, put the game back in its case, and vowed to encase the piece of shit in heavy concrete before dropping it into the Gulf of Mexico.

Gears left such a bitter taste in my mouth, it made me consider things I hate about all the games I've played. Which inevitably motivated me to write a list of poor game design choices that I really, really hate.


Saving

* Using beds or typewriters as designated "save points." If you're going to require a player go to a specific area to save their game, at least be creative enough so that it adds to the atmosphere of the game.


"Thank fucking god. I can finally take a break from killing all these monsters and get started on my memoir."

* Implementing "automatic check point saves" and not using enough checkpoints. A game is more annoying than challenging if you're asked to repeat a twenty-minute sequence because the last 5 seconds is a nearly unavoidable death-trap.

* Putting a checkpoint before an unskippable animated segment. Yes, your final boss's nine-minute entrance was incredibly impressive the first, even the second and third, time I watched it. But when he's slaughtered me twenty times, and I've had to endure watching the same sequence an equal number of times, it gets progressively less "awesome".


Bugs

Creating a completely bug-free game is virtually impossible, and I recognize that. But it's equally important for a game creator to understand why a game should be released as bug-free as possible.

Until a player comes across a game-breaking bug, they'll assume any problem they encounter in the game is the result of user error. Which is a good thing, because when that player finally figures out the solution to the problem, they'll not only feel accomplished, they'll also be impressed with how creative and complex the puzzle's design was.

When that same player does come across a game-breaking bug, instead of devoting all their attention to solving future problems, they will now doubt the integrity of the game rather than their own ability to solve the puzzle.


Puzzles

* It pisses me off to no end when a game requires the player to think like the developer, when it should always, always be the other way around. Puzzles should have multiple, logical solutions. If it's impossible or illogical to create multiple solutions to a certain problem, then the designer should make the solution obvious (highlighting the object used for the solution, etc.) or make damn sure that the player isn't going to waste thirty minutes trying to interact with random objects around a room until they finally realize that this piece of copper wiring combined with this sheet of toilet paper and this rubber ducky creates a timed explosive that will help you escape the Evil Overlord's death trap.

* Timed events that are poorly executed are also aggravating. If a game is going to require a player to dash across an obstacle course, at least show the player where the endpoint actually is or provide a means of navigating to that area. This turns a timed event into a calculated and challenging race against time, as opposed to a panicky clusterfuck of wrong turns and dead ends.


Maps

* A map that is too small to read and/or doesn't have a marker to indicate the player's location is a map that's worthless. Why even bother including a map if it doesn't help the player navigate the game world?
* Not having a clear objective that can be easily read or heard at any time is another nuisance. Including a short blurb of easily forgotten dialogue explaining where to go next is not enough, especially when most gamers play games over the course of several days and weeks, not a single lengthy playing session.


The Limited Edition of Shitfest: Return of the Shit comes with a magnifying glass if you need help reading the in-game map.


Atmosphere

* Creating a game that requires you to work with a team of obnoxiously masculine and heavily armed comrades generally makes a game's atmosphere less frightening. So when a designer decides to add "jump" scares and creepy music to the aforementioned team-based game, the playing experience feels less like a horror game and more like the designers had no idea what the hell atmosphere they were trying to create. Atmospheric elements should complement each other.

* Putting a muddy brown filter over everything makes a game feel more shitty than "gritty."


"Boooooring...Not nearly gritty enough."


"Now we're talkin', but needs more fuckin' brown."


"Hells fuckin' yeah! So fuckin' brown I can't even see! This is real gritty!"

AI

* Unless an AI character specifically has X-ray vision and a super-gun, they should not be able to see you through concrete walls and shoot curving bullets.

* If a game gives you a team of AIs to help you in combat, it's not all that beneficial to have them do an automatic suicide run into the enemy's base.

* I really doubt even the most unintelligent of people would hold a standing position in clear view while being openly fired upon. So why would you program an AI that does?

* When a friendly AI is obstructing a doorway, I shouldn't have to shoot them or punch their face in to get them out of my way.

Pacing

* When you require a player to complete a task, make sure the importance of the task is on par with the amount of effort required to complete it. A game shouldn't spend four-fifths of its story having the player do mundane tasks, and only one-fifth actually doing something vital to the main story.

* Pace the difficulty appropriately. If your game is supposed to be as challenging as chewing bubble gum, keep it that way. If a game is supposed to be more difficult, it should have a steeper but consistent level of difficulty. Throwing in "Mega-Douche: The Ultimate Endgame Boss, complete with flamethrower eyes, heat-seeking missiles, and a billion health points" at the end of a nine-hour flower-picking session is probably going to make someone stop playing your game.

Now, I already know that this list of "no-no's" is highly subjective. I will always applaud developers that implement new and creative ideas in their games instead of adhering to strict rules and formulas. It just seems that so many common problems in games have been repeated for decades, and it makes me wonder how thoroughly some developers think about what they're creating before putting it on a disc.








I received my first gaming console in 1996. I had been begging for a system of my own for years, ever since I went across the street to a friend's house and demo'd out Super Mario World on his SNES. My parents finally buckled one Christmas and surprised me with a Nintendo 64.

My happiest gaming memories took place on that console. Long playing sessions of Mario Kart were usually the highlight of slumber parties with friends. I was introduced to Zelda, which I have long-revered as the greatest video game series I've ever played. I spent many recesses with other Nintendo-enthusiasts discussing what secret stars we'd found in Super Mario. I lost complete days to games like Tony Hawk, Goldeneye, Rocket: Robot on Wheels, and Mischief Makers.




It was when I was looking these games up online that I realized how poorly reviewed so many of my fondest games were. Diddy Kong Racing was probably one of the most expansive and fun games I had played. Gamespot reviewed the game at a 6.6, a failing grade by most standards. (Interestingly enough, Mickey's Speedway USA, a completely forgotten bastardization of the kart-genre at the time, received a 7.5). Plenty of reviewers hated Super Smash Bros. (even Nintendo Power gave it a 7/10), while I (and many others) revere SSB as one of the greatest multiplayer experiences of the N64's lifetime.




The value and means by which we interpret reviewers' scores have been debated for ages; it's nothing new. And more than ever, people are choosing to be more selective about the games they're willing to invest money in. Many people just don't have the income to take a chance on a game they may not like, which means taking the word of reviewers to decide on a purchase.

And I'm extremely guilty of that. Scanning over my "legacy console" games, many of them received aggregate reviewer scores in the 60% to 80% range, a "sub-average" grade by many people's standards. Yet there are some real gems in my collection. When I look at my game collection for this current generation of consoles, I notice that almost every game I possess is universally praised (and hyped) by critics. Anything that didn't receive 90% positive reviews was a bargain purchase. And it makes me wonder what unfairly reviewed games I'm missing out on.

I personally think there's something wrong with the way the majority of reviewers score games. I think it sucks that 80% is not considered a good game, and I wish there were a better way to convey the quality of a game than a number that's extremely susceptible to bias.

I'm personally a fan of review systems that use as few numbers as possible and, interestingly enough, provide as little granularity as possible. I never understood reviewers who write individual reviews of a game's individual parts (graphics, gameplay, music) and then compile those "micro-scores" into an average, because people don't play games in individual parts, and a score built from individual parts isn't representative of the game as a whole.

It also seems silly to adopt such a specific means of rating a game, when it is impossible for a review to be precise. Reviewers often skew their own reviews to affect how well the game is received, as opposed to how the game actually played. Scores generated by users are almost always completely worthless as well. Many submit scores of 0 or 10 instead of an honest opinion of the game in order to alter the overall average. Obviously, averages based on numbers just don't work.




What I don't understand about systems that use percentages or "out-of-ten" scales is why you would have such a broad spectrum for rating a game when reviewers only ever really use about 40% of those numbers. The percentage system (A=90%, B=80%, etc.) makes sense in grade school, because really, learning less than 70% of your given material isn't much of an accomplishment. But in games, why do you only need 30% of a grading system to tell if a game is good (80-100%) and 70% of a grading system to tell if it sucks (0-70%)? I'm pretty sure no one is going to care whether a game universally receives scores of 40% or 60%; they're just going to know not to buy it. The degree to which it sucks isn't important.





I think systems that use stars or a thumbs-up, thumbs-down approach are much fairer. Instead of agonizing over whether one game is an 8.5 or a 7.5, it should really just come down to whether the game is crap or not. Most film reviewers seem to use the same system, and I think it's a much fairer representation of whether a film is worth seeing. Most people decide whether they want to see a film based on two questions:

1. Does the content look appealing to me (your favorite actor is in it, you're a fan of the genre, etc)?
2. Did it at least get an average review?

The general population seems to look at film reviewers to see if the film was universally hated, and the faith in those reviewers' opinions seems to end there.




Removing the specifics of reviewer scores also makes it harder for scores to be skewed. With a binary vote, a jerk who would have submitted a 0 or 10 for a game that obviously doesn't deserve it now carries exactly the same weight as any honest voter, so outliers can't drag the average around, and the result is a much more accurate representation of how a game is received. Either 70% of the population hated the game, or they didn't. Less opportunity for outliers means a better representation of the overall opinion.
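To make that point concrete, here's a minimal sketch (in Python, using made-up numbers rather than any real review data) of how a handful of 0-score "bombs" drags a 0-10 average well below the honest consensus, while the same votes counted as a simple thumbs-up share barely move:

```python
# Hypothetical example: ten honest reviewers rate a game around 8/10,
# then three "bombers" submit 0s to drag the average down.

honest_scores = [8, 7, 9, 8, 8, 7, 9, 8, 8, 7]   # honest 0-10 scores (mean ~7.9)
bomb_scores = [0, 0, 0]                           # outlier "review bombs"

all_scores = honest_scores + bomb_scores
mean_score = sum(all_scores) / len(all_scores)
print(f"0-10 mean with bombs: {mean_score:.1f}")  # ~6.1, down from ~7.9

# The same votes as thumbs up/down: treat anything >= 5 as a thumbs-up,
# so each bomber counts exactly as much as one honest negative vote.
thumbs_up = sum(1 for s in all_scores if s >= 5)
approval = thumbs_up / len(all_scores)
print(f"thumbs-up share with bombs: {approval:.0%}")  # ~77% still recommend it
```

Under these assumed numbers, three bombers knock nearly two points off a ten-point average, but only shave the thumbs-up share from 100% to roughly 77%, which still reads as "most people liked it."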

Overall, I just really wish reviewers would stop assigning specific numbers and point values to titles when we only really need to know two things before playing a game: Is this game of a genre I'm interested in, and is the game worthy of the public's time?