Metacritic’s Secret Weighting System “Outed”, Company Denies Claims

Overnight, trade site Gamasutra published a report claiming to have unlocked Metacritic’s secret weighting system, a series of numbers the score-ranking site assigns to certain publications when averaging out the ‘meta’ score of a game (or, of course, a movie, a TV show or a piece of music).

“Each critic/publication is assigned one of six different weightings of ‘importance’,” claims Gamasutra, “with some publications exerting considerably more influence over a game’s final ‘metascore’ than others.”

The findings were revealed by Adams Greenwood-Ericksen of Full Sail University at a talk titled ‘A Scientific Assessment of the Validity and Value of Metacritic’ delivered at GDC.

The research took six months and assigned the rankings thus (a rough sketch of how such weights would feed into an average follows the list):

Highest (1.5) — Dark Zero
Highest (1.5) — Digital Chumps
Highest (1.5) — Digital Entertainment News
Highest (1.5) — Extreme Gamer
Highest (1.5) — Firing Squad
Highest (1.5) — Game Almighty
Highest (1.5) — Game Informer
Highest (1.5) — GamePro
Highest (1.5) — Gamers Europe
Highest (1.5) — GameTrailers
Highest (1.5) — GotNext
Highest (1.5) — IGN
Highest (1.5) — IGN AU
Highest (1.5) — IGN UK
Highest (1.5) — Just Adventure
Highest (1.5) — Machinima
Highest (1.5) — Planet Xbox 360
Highest (1.5) — PlayStation Official Magazine UK
Highest (1.5) — PlayStation Official Magazine US
Highest (1.5) — Telegraph
Highest (1.5) — The New York Times
Highest (1.5) — TheSixthAxis
Highest (1.5) — TotalPlayStation
Highest (1.5) — VGPub
Highest (1.5) — Videogameszone.de
Highest (1.5) — Wired
Highest (1.5) — Xboxic
Highest (1.5) — Yahoo Games
Highest (1.5) — ZTGames Domain

High (1.25) — Absolute Games
High (1.25) — ActionTrip
High (1.25) — Adventure Gamers
High (1.25) — Computer & Video Games
High (1.25) — Console Gameworld
High (1.25) — Da GameBoyz
High (1.25) — Darkstation
High (1.25) — Edge Magazine
High (1.25) — EGM
High (1.25) — EuroGamer Italy
High (1.25) — EuroGamer Spain
High (1.25) — G4 TV
High (1.25) — Game Chronicles
High (1.25) — GameDaily
High (1.25) — Gameplayer
High (1.25) — Gamer 2.0
High (1.25) — Gamervision
High (1.25) — Games Master UK
High (1.25) — Gamespot
High (1.25) — GameSpy
High (1.25) — Gaming Age
High (1.25) — Gaming Nexus
High (1.25) — Maxi Consoles (Portugal)
High (1.25) — Pelit
High (1.25) — Play.tm
High (1.25) — PlayStation Universe
High (1.25) — PlayStation Official AU
High (1.25) — PSM3 Magazine UK
High (1.25) — PS Extreme
High (1.25) — RPG Fan
High (1.25) — Strategy Informer
High (1.25) — Team Xbox
High (1.25) — The Onion (AV Club)
High (1.25) — Totally 360
High (1.25) — WonderwallWeb
High (1.25) — XGN

Medium (1.0) — 1Up
Medium (1.0) — CPU Gamer
Medium (1.0) — Cubed3
Medium (1.0) — Cynamite
Medium (1.0) — D+Pad Magazine
Medium (1.0) — DailyGame
Medium (1.0) — Destructoid
Medium (1.0) — Eurogamer
Medium (1.0) — Everyeye.it
Medium (1.0) — Game Revolution
Medium (1.0) — Game Shark
Medium (1.0) — Gameblog.fr
Medium (1.0) — GameKult
Medium (1.0) — Gamereactor Denmark
Medium (1.0) — Gamers’ Temple
Medium (1.0) — GamesNation
Medium (1.0) — GameStar
Medium (1.0) — GameTap
Medium (1.0) — Gaming Target
Medium (1.0) — Gamereactor Sweden
Medium (1.0) — The Guardian
Medium (1.0) — Hardcore Gamer Magazine
Medium (1.0) — HellBored
Medium (1.0) — NiceGamers
Medium (1.0) — Joystiq
Medium (1.0) — Just RPG
Medium (1.0) — Level 7.nu
Medium (1.0) — Modojo
Medium (1.0) — MondoXbox
Medium (1.0) — Multiplayer.it
Medium (1.0) — Netjak
Medium (1.0) — NGamer Magazine
Medium (1.0) — Nintendo Life
Medium (1.0) — Nintendo Power
Medium (1.0) — Nintendojo
Medium (1.0) — Nintendo World Report
Medium (1.0) — NZGamer
Medium (1.0) — Official Nintendo Magazine UK
Medium (1.0) — Official Xbox 360 Magazine UK
Medium (1.0) — Official Xbox Magazine
Medium (1.0) — Official Xbox Magazine UK
Medium (1.0) — PALGN
Medium (1.0) — PC Format
Medium (1.0) — PC Gamer (Germany)
Medium (1.0) — PC Gamer UK
Medium (1.0) — PC Gamer
Medium (1.0) — PC Powerplay
Medium (1.0) — PGNx Media
Medium (1.0) — Play Magazine
Medium (1.0) — PlayStation LifeStyle
Medium (1.0) — Pocketgamer UK
Medium (1.0) — PT Games
Medium (1.0) — Real Gamer
Medium (1.0) — SpazioGames
Medium (1.0) — Talk Xbox
Medium (1.0) — The Escapist
Medium (1.0) — Thunderbolt
Medium (1.0) — Total VideoGames
Medium (1.0) — Worth Playing
Medium (1.0) — X360 Magazine UK
Medium (1.0) — Xbox World 360 Magazine UK
Medium (1.0) — Xbox World Australia
Medium (1.0) — Xbox360 Achievements
Medium (1.0) — Xbox Addict

Low (0.75) — 360 Gamer Magazine UK
Low (0.75) — 3DJuegos
Low (0.75) — Ace Gamez
Low (0.75) — Atomic Gamer
Low (0.75) — BigPond GameArena
Low (0.75) — Console Monster
Low (0.75) — Deeko
Low (0.75) — Eurogamer Portugal
Low (0.75) — Game Focus
Low (0.75) — Gameplanet
Low (0.75) — Gamer Limit
Low (0.75) — Gamer.nl
Low (0.75) — Games Radar (in-house)
Low (0.75) — Games TM
Low (0.75) — Gamestyle
Low (0.75) — GameZone
Low (0.75) — Gaming Excellence
Low (0.75) — Gaming Trend
Low (0.75) — Impulse gamer
Low (0.75) — MEGamers
Low (0.75) — Metro Game Central
Low (0.75) — MS Xbox World
Low (0.75) — NTSC-uk
Low (0.75) — PS Focus
Low (0.75) — PSW Magazine UK
Low (0.75) — Video Game Talk
Low (0.75) — VideoGamer

Lower (0.5) — Armchair Empire
Lower (0.5) — Cheat Code Central
Lower (0.5) — Game Over Online
Lower (0.5) — Game Positive
Lower (0.5) — Gamer’s Hell
Lower (0.5) — Gamereactor Sweden
Lower (0.5) — Gamers.at
Lower (0.5) — Giant Bomb
Lower (0.5) — PS3bloggen.se
Lower (0.5) — RPGamer
Lower (0.5) — Vandal Online

Lowest (0.25) — 9Lives
Lowest (0.25) — Boomtown
Lowest (0.25) — Computer Games Online RO
Lowest (0.25) — GamerNode
Lowest (0.25) — GamingXP
Lowest (0.25) — IC-Games
Lowest (0.25) — Insidegamer.nl
Lowest (0.25) — Jolt Online Gaming
Lowest (0.25) — Kikizo
Lowest (0.25) — LEVEL
Lowest (0.25) — Meritstation
Lowest (0.25) — My Gamer
Lowest (0.25) — Official PlayStation 2 Magazine UK
Lowest (0.25) — Play UK
Lowest (0.25) — WHAM! Gaming
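
If the claimed tiers are taken at face value, they would act as simple multipliers in a weighted mean. Purely to illustrate that reading, here’s a minimal sketch; the formula itself is an assumption on our part, and the outlets and scores in it are invented, not taken from Metacritic:

```python
# Hypothetical sketch only: the tier multipliers come from the GDC talk's
# claims, the scores are invented, and Metacritic has not confirmed that
# its formula is a weighted mean like this.

def weighted_metascore(reviews):
    """reviews: list of (score out of 100, claimed tier weight) pairs."""
    total_weight = sum(weight for _, weight in reviews)
    return round(sum(score * weight for score, weight in reviews) / total_weight)

reviews = [
    (90, 1.5),   # a 'Highest' tier outlet
    (80, 1.25),  # a 'High' tier outlet
    (70, 1.0),   # a 'Medium' tier outlet
    (60, 0.5),   # a 'Lower' tier outlet
]

print(weighted_metascore(reviews))                        # 79
print(round(sum(s for s, _ in reviews) / len(reviews)))   # 75, the plain mean
```

Under that (unconfirmed) reading, a handful of heavily weighted outlets can pull the final number several points away from a straight mean, which is what several of the comments below dig into.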

You might see some familiar names in there. To all publications, these weightings are a surprise (they certainly are to us) and obviously aren’t indicative of anything other than Metacritic’s own calculations, based on a formula we’re not privy to.

That said, Metacritic were quick to respond.

“Today, the website Gamasutra ‘revealed’ the weights that we assign to each gaming publication (for the purpose of calculating our Metascores), based on a presentation given at the Game Developers Conference this morning,” said a statement.

“There’s just one major problem with that: neither that site, nor the person giving the presentation, got those weights from us; rather, they are simply their best guesses based on research (the Gamasutra headline is misleading in this respect).”

The response says the guesses are “wildly, wholly inaccurate”, that Metacritic use “far fewer tiers than listed in the article”, and that “the disparity between tiers listed in the article is far more extreme than what we actually use on Metacritic.”

They also say that their “placement of publications in each tier differs from what is displayed in the article,” so none of the weights above can really be treated as confirmed.
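
To give a sense of what “best guesses based on research” might involve, here is a toy sketch of one way such a reverse fit could work: try tier assignments for each outlet, predict each game’s weighted average, and keep the assignment that best matches the published Metascores. Every outlet, score and Metascore in it is invented, and the six-value tier grid simply mirrors the presentation’s claim; none of this is Metacritic’s or the researcher’s actual method.

```python
# Toy brute-force reverse fit; all data below is made up for illustration.
from itertools import product

TIERS = (0.25, 0.5, 0.75, 1.0, 1.25, 1.5)   # the six weights the talk claims exist

# crawled review scores per game: {game: {outlet: score out of 100}}
reviews = {
    "Game A": {"OutletX": 92, "OutletY": 85, "OutletZ": 70},
    "Game B": {"OutletX": 60, "OutletY": 72, "OutletZ": 64},
}
published = {"Game A": 86, "Game B": 64}    # the Metascores we're trying to explain

outlets = sorted({o for game in reviews.values() for o in game})

def predict(weights, game_reviews):
    """Weighted mean of one game's reviews under a given outlet -> weight map."""
    total = sum(weights[o] for o in game_reviews)
    return sum(weights[o] * s for o, s in game_reviews.items()) / total

best_error, best_weights = float("inf"), None
for combo in product(TIERS, repeat=len(outlets)):   # 6^3 candidate tierings here
    weights = dict(zip(outlets, combo))
    error = sum((predict(weights, reviews[g]) - published[g]) ** 2 for g in reviews)
    if error < best_error:
        best_error, best_weights = error, weights

print(best_weights, round(best_error, 2))
```

Worth noting: scaling every weight by the same factor leaves every prediction unchanged, so fits of this kind are never unique, and with a small sample many quite different tierings can fit the published numbers almost exactly, a caveat some of the comments below also raise.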

24 Comments

  1. Good to see you up there sixthaxis, how the heck is IGN all the way up there!

    • If I’m not mistaken, IGN is the most read of all gaming sites. They’re huge whether you like them or not.

      • But this isn’t based on how widely a site is read

      • I don’t know what these things are based on, but IGN is huge, so I’m not surprised.

      • So are Eurogamer and others, and they’re much lower down, so it’s not to do with a site’s size or traffic.

        Don’t forget though, Metacritic claim this is wholly inaccurate, but unless they explain further we don’t actually know that for sure.

  2. Interesting. And gratz to TSA for being in the ‘Highest’ category.

  3. Weighting things just makes the whole thing utterly pointless. One opinion is just as valid as another.

    • Absolutely… what it’s saying is that you’re entitled to your opinion, but that guy’s opinion over there is more valid than yours, so we’ll give him more cookies.

      Metacritic is a blight and this weighting system should be put out to pasture as soon as possible, but it never will be, and in a few months we’ll have another post about it.

      If a meta-review site is biased (which is essentially what weighting is), why do publishers and devs put so much faith in it, to the extent that bonuses are based on it? Surely they can count for themselves how many 100%s or 2/10s they get for a game.

    • But I don’t think it’s about opinions; rather I reckon it’s about the credibility of the opinion amongst the public. I’m not saying that’s necessarily the right thing, but I think it’s a factor.

      Some of the smaller sites could be rather individualistic and personal, valuing or devaluing a game more on personal likes. Some smaller reviewers have been crap, saying Atelier is just repetitive, boring, quest-based Japanese stuff; hardly fair or objective towards the developers, considering the reviewer didn’t weigh that up against the pros.

    • Disagree. The weighting may not be used to determine a site’s validity but instead its suitability to the Metacritic way of scoring.

      For example: by using a five-point scale, Giant Bomb’s scores can only be 0, 20, 40, 60, 80 or 100, which, when working to a 100-point scale, can really throw things out of balance.

  4. TSA, TSA, TSA, TSA, TSA! Ok, so this may not be accurate, but I find it interesting that “Official” sites/magazines are weighted less. Is this a theory of potential bias on their part?

  5. Presumably the scores have been crawled and correlations have been found between the final meta score and the contributions from individual sources. It could be skewed quite easily by a particular source consistently scoring higher or lower than the real average, thus appearing to have a high or low weighting.

    • You can see a publication’s average on their Metacritic ‘page’.

      TheSixthAxis is pretty reflective of the whole industry, reviewing 0.5 points lower than other critics; remember it’s out of 100, so 0.5 is pretty negligible.

      I haven’t checked them all, but a few in the highest tier review comfortably above average, whilst sites in the middle tier like Eurogamer (6.8pts below other critics’ average) and Edge (9pts below average) are neutrally weighted, meaning their scores carry less weight than those higher-scoring sites in the top tier.

      This could mean, TSA aside, that Metacritic is skewing scores upwards.

      Although… I’ve only looked at a couple, so it may not follow that pattern, and of course Metacritic say it’s very far from the truth.

      It all highlights the pointlessness of their Metascore imo.

      There are many sites’ reviews they exclude from the initial launch-period Metascore, only adding them days and days later, which wildly affects the Metascore… It just means it can’t really be trusted, and then when you factor in the lack of transparency of their secret sauce, you don’t really know what you’re looking at.

      • All their included publications are here, where you can see them all and sort by score or name.

        IGN UK is 9pts above average and in the top tier.

        Giant Bomb, GamesRadar and other reasonably high-traffic sites, along with EG, Edge etc. which score lower on average, are tiered lower.

        Although, if Metacritic are correct, this is all wrong anyway, so I shouldn’t read too much into it.

  6. “To all publications, these weightings are a surprise (they certainly are to us)”. Are you surprised by the ‘tier’ you’ve been placed in, or by the weightings themselves? Because Metacritic have always stated they weight publications differently, just not what criteria they use.

    • The tier. Makes little sense, to me at least.

      • Thanks for clarifying; it seems a lot of the reaction around this story is people thinking the scores were a simple arithmetic mean in the first place.

        I do agree there seems to be little rhyme or reason to the groupings, but then I can imagine there are other permutations they missed in their study. If they’d hit the right formula, the scores would be exactly correct and predictable, when in fact they say they are ‘almost exact’. There is no almost in maths.

  7. I wonder how many maths ‘models’ you can run weightings through and end up with those Metascores.

    Just assuming the research is sound, that there aren’t other weighting combinations that ‘fit’ the final outcome, and that it is overall reasonably accurate:

    It does seem to be websites or publications which are new, or which don’t have much readership or a proven track record in reviewing, that are in the bottom, negatively weighted tiers.

    Above that, it seems there are plenty of sites whose review scores are a sizeable amount below average sitting in the middle tier, sites like Eurogamer and Edge… whilst most of the sites in the highest tier actually review above average (with the exception of TSA, who are broadly in line with the industry, reviewing just 0.5/100 below average).

    So (again assuming the research is sound and there aren’t other configurations which could end up with those Metascores) it could seem that, TSA aside, Metacritic is skewing the final Metascore upwards.

    Although, as Metacritic themselves say it’s bollocks, we’ll never know until they do reveal their secret sauce, which I’m sure they won’t.

    The sad thing is people’s jobs depend on this… with developers’ bonuses often depending on Metascore, which is awful when no one really knows how the outcome is derived. Remember Fallout: New Vegas Metascored 84, meaning Obsidian missed out on a hefty bonus that was tied to hitting 85.

    I think the publishing industry should move on from Metascores… and just work to a true average of the top 10/20/50 sites or something, rather than this mysterious figure that can be manipulated by excluding high- or low-scoring reviews… and of course this secret sauce that no one is privy to.

    • To be fair, that’s not Metacritic’s fault. There’ll always be a very popular “mean” scoring site somewhere. Any software house that does this needs a bloody good slap to the face.

  8. You should have edited the list and put The Sixth Axis in the Lowest category, sat back & watched all the negative/abusive posts roll in about Metacritic.
    Probably not a good idea in hindsight but I’d find it funny! :)

    • That would’ve been some top antagonising. *approves* :-p

  9. In vaguely related news, check the Metacritic user reviews for The Walking Dead: Survival Instinct:
    http://www.metacritic.com/game/xbox-360/the-walking-dead-survival-instinct/user-reviews
    There is a suspicious-looking number of positive reviews from people with only one review on their account. Does this happen often?

  10. Yahoo Games is in the highest rating category!? They rarely report on games and I suspect it’s the same quality as the rest of their site. Average at best. :O

    Metacritic should only serve as a way to compare review scores from different outlets, not be used by publishers to judge whether to keep developers or not. I still remember Bethesda refusing to give Obsidian their bonus because they were one point away from their target score. In doing so, they have most likely blown their chances of having the developers of the original Fallout games develop the West Coast Fallout games. Which is a massive disappointment, as F:NV is far better than 3 imo.

    Getting back on topic, this means nothing to me as I tend to judge sites by the quality of the articles, reviews and, more importantly, the community. TSA is number 1 in my top 10 sites list. :)

    Hang on, why are IGN higher than us? They were given an exclusive 4-day window for their BioShock: Infinite review, which looks very dodgy, and they tend to have dodgy reviews. And they tend to do… no, I’m not going to go any further.

    Metacritic seems to serve as fanboy bait, as I suspect many fanboy arguments have broken out, and I remember MW3 being given 1s and 0s when it launched on there.

Comments are now closed for this post.