For the record, I think review scores are the spawn of Satan. I hate them with a fiery passion because not only do they incite stupid arguments, but people's entire livelihoods can hinge on a positive review score. But what even is positive? That depends on the scale, and even then the meanings have changed over the years.
So, just how the hell do we review video games? What's the best way? Every method seems to suck, but there's got to be some kind of answer, right? Yes, I think there is. When I started blogging as a dumb teenager, I modeled my own reviews after the popular systems of the time without giving it a second thought. Over the years, though, I've come to find that there's only one system that makes sense. No offense to any websites that hand out scores, though. I understand why you feel the need to do it, though I wish you wouldn't.
So, let's turn the tables and assess the scales themselves!
This one is still more common than you'd think, even today. Outlets like GameInformer, IGN, and GamesBeat still use it. It usually consists of a 10-point scale with tenths (though GI will also go to hundredths, like .75, for some asinine reason). In the case of GamesBeat, they literally use a 0-100 scale. Metacritic uses this scale when it magically averages together the scores from every other scoring system. Why is this system crap? The answer is very simple: it's just too many damn numbers. No one could possibly explain the difference between a 7.4 and a 7.5. Add in crap like 7.45, and you end up looking completely foolish. It's the most arbitrary of all the arbitrary numbering systems. There is no reason whatsoever reviewers can't round up or down to, at the very least, the nearest half point. For some reason, this was the standard back in the days of gaming magazines. Today, it's just ridiculous.
The most widely used scale today is the 10-point with halves, known here as the 20-point. Outlets like Polygon, EGM, and Destructoid itself use this system. It seems pretty good on paper, but it suffers from a lot of problems in practice. Chiefly, this one still uses too many points on the scale. Half points essentially mean "I can't decide between this score and that one, so I'm just going to go with something in the middle." It's a scale for indecisive folks that rarely works well. The real issue is that things are still too arbitrary, and there's a crapton of inflation as a result. Ever wonder why so many reviewers seem to use only 7-10 (essentially six points)? It's because there are just too many damn numbers to possibly use them all! The industry is calling out for a simpler system while we all either complain or accept 7 as a low score.
Do not be fooled by stars! I can't think of a single major outlet that has ever used a true 5-star scale. At first, you may think sites like GamesRadar and (formerly) Joystiq are using a 5-star scale, but they're really using a 10-point scale with stars because they use half stars. The problem with this, of course, is that it's not easy to read. I always end up converting the stars into numbers in my head (one star equals two points, while a half star equals one). And, of course, this system suffers from the same problem as the 20-point. It's got half as many numbers, but that's still too many. Scores will continue to average on the high end, likely 7-10 again. Except now you're really using a 3-point scale. So if we chop off the other seven numbers, would that be a viable option?
A 3-point system is pretty straightforward, but probably too much so. It breaks down something like this: good, average, and bad. That doesn't seem like enough wiggle room. What if the game isn't one of those three things but falls somewhere in between? This system is simply too restrictive. Many smaller sites use a variation on this where the verdict is something like "buy", "rent", or "skip". Using something like this is too definitive, and might actually insult readers. Rather than placing a score in front of you that you would (theoretically) interpret, this simply tells you what you should and should not be playing. A score should ideally reflect the reviewer's opinion and be used as a guide, along with the text, to help you make a purchasing decision. Under this system, the reviewer has made that decision for you.
Although this system, formerly used by EGM and 1up, has gone the way of the dinosaur, it's still worth noting because no one should ever use it again. Anyone who's ever gone to school knows that letter grading makes no sense anyway, and what's a good grade to one person may not be to another. Then there's the decision of whether or not to use pluses and minuses. The scale can come out anywhere between 5 points (then why not just use numbers?) and 15. It's a mess.
Wise old Adam Sessler. If there's got to be a review score, this system might as well be the industry standard. Almost nobody actually uses a 5-point scale (Rev3 Games and G4 used it, but they're dead now), but it's absolutely the most balanced. Since five is an odd number, there is a clear middle ground. The 10-point also has this, but a 5 is never considered average. With fewer points, you can actually use them all because they can each clearly mean something. There are two points in each direction, so you can have one that's extremely good, one that's extremely bad, a milder version of each, and something in the middle. One problem with this is that it's hard to read because it's so alien. Without clearly defining what each point on the scale represents, most people would probably try to convert stars into numbers, or else be very confused by just the numbers 1-5. Is a score of 2 supposed to be bad? How about 3? With some brief explanation, though, this system is top. The real problem with this scale, of course, is that it mindfucks Metacritic. What are they supposed to do with a score of 1-5? Nothing, that's what. And that's why nobody uses it.
A five... out of five.
The five star system was the last scale I used when I still scored my reviews. I interpreted each value and gave it a meaning with the score that summed up my experience with the game. It went like this:
Before the score, I'd have a short summary of my most important points, so that readers who liked skipping to the bottom for the score could still skim them.
I can settle for five stars, but I think we rational gamers can agree that scores need to go. Obviously, I hate the overreliance on scores and I hate the expectation that there should be a score, so anybody willing to give Metacritic the middle finger and do things the way they want without caving in to pressure deserves my praise. I have to give props to Eurogamer for dropping review scores last year, and to Kotaku for realizing their system was stupid and going scoreless. But it was really Joystiq who paved the way for a scoreless future. Unfortunately, the change came a bit too late and, if I remember correctly, they only went a few months with this system before they were closed. I just wish more websites would follow their example so we can focus on what really matters.

Sure, there will still be reviews people won't agree with, but cutting the number and taking away its power will go a long way towards inciting discussion rather than arguing. Without that numeric representation, that value that means different things to different people, maybe we can learn to better understand the reviewer's position instead of igniting flame wars. As I see it, taking away the score removes 50% of the butthurt. The other 50%? Well, that'll probably always be there, because gamers are a passionate group, many of whom apparently hate dissenting opinions. You know what, though? That's okay. Because reviews aren't supposed to be the definitive word on whether you'll like a game. That's for you to find out. Reviews are more of a "day-one purchasing aid" anyway. So next time, just read the text. Pretend the score isn't there and see how that makes you feel.
That's one man's opinion, anyway.