So there have been some minor uproars over game review scores lately that have gotten enough attention to rise above the usual internet static. First was Hydrophobia's developer pitching a fit over the game receiving some less-than-glowing evaluations. Then the developer of the new Castlevania: Lords of Shadow took issue with IGN giving it a 7.5, which equates to "good" on their rating scale.
Since I personally reviewed one of these games, and I'm tentatively slated to review the other, it got me thinking about review scores in general as a sliding scale. It seems, as of late, that things have gone a bit awry.
Remember when you were in elementary school and the class was divided into groups according to reading skill? Each group had a specific color to go along with its workbooks. Children in the "Advanced" group set the gold standard; readers in the "Average" group were told they were "good," but were secretly spurned and tracked as sub-par, with "good" quietly translating to "good enough." Kids in the "Remedial" group had parents who were related to one another.
The video game industry appears to have reached a point where it is so cut-throat and competitive that a less-than-stellar review score now equates to a title being forgettable or outright bad. This is only reinforced by the above incidents, where developers speak out against a score that sits in the upper echelon of a site's review scale, just not at the very top.
It would be easy to just blame developers and PR people for "not delivering on promises," but the gaming journalism industry is largely at fault too. We (yes, I'm lumping myself in there) sit and hype games based on pure speculation, creating a buzz months before there is even a tangible product to evaluate. This in turn creates a readership with very high expectations. I'm not telling anyone they should have to settle. Don't ever settle. Stay hungry, like Twisted Sister told you to. But I think this behavior on our part creates an almost self-destructive environment.
In a world where only the top scores mark something as worthwhile, where a few docked points get a title widely ignored, and where a low score yields a possible phone call or email from an angry company, a new precedent is set. Reviewers most likely give much higher scores simply because their frame of reference is skewed, or because they don't want to deal with the backlash from the occasional low score. So then what's even the point of having numbers below a 4 (out of 5) or an 8 (out of 10) on the scale at all?
I think as the holiday game season comes upon us we will only see more of this, as the top titles compete for the dollars of cash-strapped consumers. There are always winners and losers in a competitive industry, but the combination of the journalist hype machine and impertinent developers reacting to imperfect scores for their work is creating a culture where we can't separate "median" from "mediocre".
So what do you think? Are review scores a flawed system? Or is any game labeled simply "ok" not worth your time and hard-earned money?
I think review scores from the popular gaming press are largely inflated. It's just like you've described it -- everything's a 9 (or perhaps a high 8), or it's not worth your time. How useful is that to us? We end up with overcrowding at the very top, and it becomes difficult to differentiate between all the 9.5s and 9.7s and 9.3s out there. It's difficult to come up with examples because everyone disagrees on how good specific games are (I think it's ludicrous that GTA IV received so many perfect scores -- it's CLEARLY not perfect, in my eyes, but many will disagree), but I think many of us can agree that there's a problem.
Honestly, for me, I tend to skip the scores and read the review for the overall "What did it do and how well did it do it" picture. I take note of any scores given, obviously, but I don't really pay any attention to them unless they're below 3/5 or 60%.
I pay VERY close attention to scores given out below the 30s because that's usually a tip-off that I'm about to read a very, very entertaining review.
It's the same thing that happens with any 0-10 review scale in any medium. Look at album reviews, or even scores for voting on user submissions to a site like YouTube (which is why they switched to the like/dislike system). 9's and 10's abound, 7-8 becomes average, 6 is passably mediocre, and 5 or below is just horrible (Pitchfork's use of the 10-point system is the one thing I think they do right in reviews). The other problem is that instead of reading about the game and figuring out if it's right for them, most readers of reviews (me included, sometimes) will just straight up look at the score. These two attributes of a 10-point system feed off each other and perpetuate inflated scores, because the de facto 10-point scale, where 7-8 is average, is different from what a 10-point scale is supposed to mean, where a 6 would be something flawed but still definitely enjoyable.
I personally don't think hype is as much of a problem as that; hype should be encouraged, but also recognized as a natural reaction of excitement for something new, and it should factor as little as possible into the actual score of a game. However, in a system where the score is the be-all-end-all result, actually giving qualitative praise -- the kind a user will recognize without reading the review -- to a game that fails certain objective criteria is difficult, if not nearly impossible.
I think that's the biggest problem with these scores, and with the publishers, honestly; it's completely and utterly self-evident that nobody is READING the reviews anymore. They're looking at the scores, ignoring what the reviewers wrote in the article (which are the SUPPORTING ARGUMENTS for those scores), positive or negative, and throwing tantrums. Heaven forbid people actually sit through a review, weigh the information, and make their OWN decisions.
And this "Do it, Give it an 8 or we'll fucking spank you" attitude publishers have is completely ridiculous. There should be punishments for running a business like that.
If I've read positive buzz about a game, I'll check out the trailers and demos myself, and if I'm still interested, review scores generally don't sway me one way or the other.
I also think scores (and reviews in general) often ignore the key facet of any game: whether the reviewer enjoyed playing it. I couldn't care less what a game "scored" or its position relative to other games, as long as I had fun.
That's something I try to be highly aware of when writing reviews, Layton. If the reviewer doesn't broadcast a sense of whether they actually enjoyed the game they were playing, it's a disservice to the readers. Admittedly, that can be difficult to do when you're also trying to create an objective critique of something. It's definitely a fine line that needs to be walked.