But isn't the point of Metacritic to smooth out that margin of error?

Baresark said:
Actually, systems such as reviews on a scale of 100 (they are all scales of 100, basically; even a 1-10 scale uses the tenths decimal place) tend to be totally subjective. The metric is, in and of itself, flawed. Leonard Mlodinow explains it beautifully in his book The Drunkard's Walk. Essentially, people have tried to create a system like this for grading school papers. One fellow spent months drilling the strict criteria of his 100-point paper-grading system into 25 trained graders. After the training, he gave them all the same stack of 25 papers to grade. When he looked at the results, grades on the same paper varied by as much as 12 points. He also tells the story of how two of his son's friends accidentally turned in identical papers. Instead of getting caught, one received a 90 while the other received a 79. For the number system to mean anything, there would have to be a palpable difference from one point to the next, but there isn't.
The margin for error is huge. I would invite you to check out Metacritic (I know, I hate it too, but it shows my point beautifully). In the official reviews column for a game that has seen international release, there tends to be a huge spread in the scores out of 100. One game (it's been a while, so I don't remember which) had a top score of 71, while the lowest score on the list was a 10.
Also, I know no one is a fan of a by-the-numbers system.
It's like how a junk rifle can be inaccurate: keep firing it at the target and the centre of the cluster of bullet holes will actually be your centre of aim.
Your paper-marking example is a perfect analogue of the spread between two shots of a rifle: the identical paper gets different marks from different people. To get the true score, you have it marked by many more teachers, and the most accurate score is the average of all their marks.
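The averaging idea can be sketched numerically. This is a hypothetical simulation (the names `TRUE_SCORE`, `SPREAD`, and `one_mark` are mine, not from the book): assume each marker gives the paper's true score plus some independent random error, and see how the average of many marks behaves.

```python
import random

random.seed(42)

TRUE_SCORE = 80   # hypothetical "true" quality of the paper, out of 100
SPREAD = 12       # each marker can be off by up to ~12 points, matching the variation in the grading experiment

def one_mark():
    # One marker's score: the true score plus uniform noise in [-SPREAD, +SPREAD].
    return TRUE_SCORE + random.uniform(-SPREAD, SPREAD)

# A single mark can miss by up to 12 points, but the average of many
# marks clusters ever more tightly around the true score.
for n in (1, 5, 25, 100, 1000):
    avg = sum(one_mark() for _ in range(n)) / n
    print(f"average of {n:4d} marks: {avg:.1f}")
```

Just as with the rifle, each individual shot (mark) is unreliable, but the centre of the cluster homes in on the point of aim: the typical error of the average shrinks roughly in proportion to one over the square root of the number of markers.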
Metascore. THAT is what we are talking about here. Frankly, critics might as well never publicly reveal their scores at all; they should send them straight to Metacritic to be aggregated, since a single score by itself is useless given the inaccuracy of trying to quantify one's judgement. Publishing them does no good anyway, because then the fanboys and haters say, "GRRR, IGN skewed the result; if it wasn't for them this game would have had a different score! GRRR."
The worth of a single critic should be in their prose. What they actually write about a given work is the most important guide for the customer.
As to Modern Warfare 3, consider this: it may be hardly an improvement over COD4, but:
-COD4 is still a good game, 4 years later
-No other game has really surpassed it in what it does.
So standards have NOT gone up significantly; MW3 is a bit better than COD4 in the most valued areas, and enough things are changed around for it to get the same score COD4 got. One critic may rate it a bit higher, another a bit lower. THERE IS NO REASON FOR CONFORMITY! People can have varying opinions and judgements.