Review reviewers: Metacritic responds
Who watches the Watchmen?
Kinectimals developer David Braben ruffled a few feathers this week when he suggested that game reviewers should be subjected to a Metacritic-like system to judge which of them are the most reliable.
"The best reviewers give spot-on reviews pretty soon after a game is released," he said. "They do not wait to see what others say, but nevertheless consistently come very close to the final average score. There could be a prize for the best each year."
But what does review score aggregation website Metacritic think?
According to its co-creator Marc Doyle, Braben's "Metacritic for reviewers" idea isn't a good one.
"Braben posits that 'the best reviewers give spot-on reviews pretty soon after a game is released'," Doyle told Eurogamer.
"He's making some assumptions here that are not always the case. Reviewers are not always provided with the game or review code before a game's release date. So if he or she must purchase the game at retail, of course, it's going to take time to play the game, write a well-reasoned review, and publish it.
"But even if a critic were given the code early, there should not be an externally-imposed race to publish that review. Spending extra time playing the game, thinking about the review, and pondering the most accurate score to give it are things to be encouraged. Of course there are business reasons to get a review in front of a publication's readers as quickly as possible, but should we, the consumers of those reviews, be encouraging the needless acceleration of that process? I don't think so."
On Braben's suggestion that there could be a prize given to the reviewer who effectively wins the 'closest to the average score' battle, Doyle is equally dismissive.
"A critic's review and his or her score is an opinion - it's not right or wrong. We can judge the credibility of a critic based on the quality of his or her analysis, the depth of his or her experience in gaming, and a host of other criteria. We do exactly that when selecting critics at Metacritic.
"But once you've established that critics meet these basic threshold criteria, if they happen to deviate from the final Metascore (or average review score) on certain games, there's no reason to deem those critics or their reviews of lower quality than those whose review scores more frequently match up with the final aggregated score.
"Penalizing a brilliant critic who happens to utilize the lower end of his or her publication's scale more often than a middling critic who never gives lower than a 6/10 doesn't make sense to me."
Doyle pointed out that Metacritic currently offers a way of putting game review scores in context by comparing each publishing outlet's reviews in aggregate to those of others.
"I believe that is the best measure of just how 'tough' a grader the publication is in a field which is always going to be subjective," Doyle added. "But simply because a publication tends to grade higher or lower than another in the aggregate doesn't make it any 'better' or 'worse'."