Reviewing Reviews (Or, The Long And Short Of It)

Another day rolls by, another review embargo. Yesterday was the turn of Evolution’s MotorStorm Apocalypse, the latest (and greatest) in the series of brutal, off-road (not without a sense of irony) arcade racers, and reviews were set to go live at 10am GMT – that’s the embargo time and date set by SCEE, after which gaming sites like TSA could talk about the game openly.  Why was that particular embargo in place?  Well, presumably it was due to the demo announcement, which went live on the official PlayStation Blog at the same time: gamers could see that there was a demo, and then immediately follow up by reading a review or two – it’s a shame the same blog didn’t post links to reviews, but hopefully they weren’t too hard to find; VG247 rounded up the first batch anyway.

It’s a strange one, though – and the above link demonstrates this perfectly: a good proportion of people only really care about the score – that singular figure at the end of one and a half thousand carefully picked, deliberated-over words that can easily take anywhere from a good few hours to a couple of days to nail and perfect, at least as much as we can.  Naturally, that score is hugely significant – it’s a ready reckoner for anyone doing the rounds to quickly see what that game is ‘worth’, but in relation to what?  What’s the frame of reference?  What’s the score actually being compared to?

Take, seeing as we’re already here, MotorStorm’s score.  8/10.  Our review policy states that such a score equates to two words – ‘very good’ – but what does that mean?  Can you relate it to our review of MotorStorm: Pacific Rift, which Michael rated a 6/10 way back in 2008?  Not really, because the two games were reviewed by different people, with different approaches to gaming and completely different outlooks on what a game should be.  I’m not saying that the Rift score is ‘wrong’ in any way – it’s not, but that’s the point: it’s just one man’s view of a game boiled down to a figure at the end, albeit in this case a stylised one that harks back to an old site design.  It’s the words that count and, re-reading the Rift review, they still ring true: the sequel to MotorStorm simply wasn’t as good as the first.

Is all this really a big deal?  Possibly.  When Test Drive Unlimited 2 came out I wrote 900 words on it but stopped short of awarding the game a score.  It wasn’t so much an experiment – more of a conscious decision to try something else and actually write about the game as a gamer rather than desperately trying to evaluate everything as I went along.  I made no notes, I just played the game, on and off, for a week, and then poured out a series of thoughts into that article.  It was held back until the embargo, and went live along with all the other reviews of the game at the time.  Some of the comments on the story were confused as to where the score was, but we were happy with the piece as it stood, and I wouldn’t have written any more for it if I had scored it, except perhaps to play the online portion a little more.

Things move quickly in the world of videogames, and when you’ve got limited time to get through a game, chances are that after it’s reviewed and out there’s little motivation to go back to it unless it’s a really special title.  There are exceptions: those that actively push the community angle normally get some playtime after launch but, from a personal point of view, it’s normally the games I haven’t reviewed myself that I find myself enjoying the most down the line.  There are occasions when reviewing a game spoils the experience somewhat: being forced to distil something down into things you liked and things you didn’t is hard enough, but trying to remain objective whilst doing so is harder still.  TSA’s readers are a widely diverse bunch and you’ve got to try to be true to everyone.

But the hardest bit is scoring a game.  When you’re on an embargo like MotorStorm’s you’ve no idea what everyone else is going to score something and, whilst you’re confident in your own ability to effectively ‘judge’ something in text, it’s the figure that everyone’s going to see.  TSA: 8/10.  As it stands, we were pretty much in line with everyone else and after reading the text (yes) it seems most reviews commented on roughly the same sort of things: Apocalypse isn’t a terribly complicated game, and it’s unlikely to split opinion as much as something like Killzone 3, but there’s always that lingering notion that you might have missed something obvious.  At least we got a score out there, eh?  We’ve still to finally award one to Gran Turismo 5.

Perhaps it’s time to do that.  How about [randomreviewscore]/10?



  1. 5/10 on Release, 7/10 now :D. to be honest i never worry about a score; if you patronise particular websites and magazines you start to learn something about the reviewers, then it’s just a case of finding those reviewers that are most akin to you and making an informed decision based on that.
    never been led astray so far (with the glaring exception of Mercenaries 2…..just awful…)

    • i thought i’d qualify my dual scores with a story. one of the reasons i’d score the game twice is that, for people who bought the game on day one like myself, it was broken – just not a good experience, and it felt quite last gen, with a myriad of other errors that drove me, as a long term GT fan, crazy, and they deserve some criticism for it.
      the second score would be for those who buy it now – it’s just not the same game anymore. it’s a lot faster, with more features (all free) than it had at launch, therefore it’s impossible to assign one score to it.
      i tell you what i do miss on game review sites, especially those with a list of games and their scores: scores that DEcrease over time. PCZone used to do this (don’t know if they still do) – for example they gave Quake 2 96% (totally justified, best online multiplayer ever….no it was…..stop laughing… WAS!) then as time went by it decreased as it became more and more irrelevant and servers closed etc. i’d love this to be adopted everywhere, sorry for the rant :)

      • That’s a really good idea. I must admit I’ve reviewed games in the past and given them a top score, but when I come back to them a year or two later and consider how much gaming has moved forward I think “I should really knock 10% off this game now”

  2. Good question. I must admit in hindsight one of my regrets for our website was to score games with a percentage. Pinning down games to a score out of 10 and giving that figure a relational meaning is hard enough, let alone trying to work out the difference between an 84% game and an 85% one lol.
    Overall, I take actual scores with a pinch of salt these days. A recent example I can think of is Dragon Age 2 on IGN. Reading the review, the game sounded like a natural improvement on the original, yet it scored 8.0 while ‘Origins’ got a higher score… go figure.

    • They did the same with Killzone 3, which astounded me. The review really made me feel like I was reading about a game worthy of a minimum 9 out of 10 score, as they only criticised the story that was told.

      • Yeah, that surprised me too; after all, narratives aren’t usually considered when reviewing an FPS (at least I hope not, given their high score for Unreal Tournament 3!)

  3. I don’t really pay too much attention to reviews, but I always like to see a score. Sometimes I can’t be bothered, or don’t have time to read a whole review, and I’m sure there are many with the same opinion.

    I’ve said it before and I still maintain (convenient or not) that the fairest way to review games is for multiple people to review it and have an average score. This is something that is given to us by metacritic, which is why I always look for a metacritic score before parting with my hard earned cash.

  4. Typical real-driving-sim GT haters….

  5. Haha, so it’s official then, on metacritic under gt5 it’ll read “TheSixthAxis” 40/100 ;)

    But yes, everything you say here hits the nail on the proverbial head!

  6. ‘the two games were reviewed by different people, with different approaches to gaming and completely different outlooks on what a game should be’

    It irritates me when people say this. That’s NOT what a review’s meant to be! A review is supposed to be an unbiased scoring of a game to advise whether OTHER PEOPLE should buy it or not. You don’t review a game for yourself, so the particular reviewer’s taste should be irrelevant!

    • This is what makes 99% of game reviews inaccurate. There’s too much bias, e.g. Killzone 3 getting 1 or 2 points knocked off simply ’cause it’s a PS3 exclusive on most websites, even though everyone knows it’d get all 10s across the board if it were the EXACT same game but a 360 exclusive. Jealousy and bitterness are all too common in internet reviews.

      • I disagree. I know a lot of people who review games and they go about it in a wholly professional manner. Obviously I can’t vouch for everyone, but to say 99% of reviews are inaccurate is a bit much.

        As for being completely unbiased, it’s a lot harder than it sounds because everyone will have some form of preconception about a game, or a particular genre.

      • i can’t agree with 99%; most sites that attract regular hits for game reviews are generally quite professional across platforms. it’s incredibly difficult to remain unbiased about particular games and you always annoy people no matter what score you give. also it’s pretty difficult to stay in your job / keep your website up and running if your reviewer(s) are constantly bitter.

    • I agree. Reviews are difficult to do (which is why not many people can do them properly). An opinion is easy and everyone can have one. It takes zero talent to shout your opinion or give a score. Reviewing is a difficult task that requires the ability to see past personal preferences.
      I do think that in order for a review to be entertaining, it needs to be coloured with the personality of the reviewer though. So it should be objective but carry the reviewer’s take on how those objective points are implemented.
      So, in summary, a review should be mostly objective but with a garnish of subjective flavouring. Otherwise it would just be a bullet pointed list of features.

    • That’s simply untrue. Critics are notoriously biased — even Siskel and Ebert could disagree and give different opinions… because ultimately, a review is two things:
      1) a statement of fact – this is what the game is, how it works, and how it fits into the rest of its genre.
      2) a statement of opinion – this is how well it works.

      Someone who hates SRPGs will have a hard time doing either of those things for an SRPG, and therefore simply shouldn’t review them. But nobody can possibly do it “without bias”.

      The reviewer cannot, and should not, remove their opinion – I mean, that’s why we are reading the review in the first place! If they think it sucks, how can they possibly know most people will find it fun (or vice versa)? It’s not like they have a crystal ball.

  7. Worst site EVER is Eurogamer. TSA is the best I have come across for this type of thing. I don’t care about reviews – I enjoy reading them but they rarely sway me unless it’s a game I know little about. I mean, KZ3 could have got zeros across the board and I would have still bought it, knowing I would enjoy it as it’s my type of FPS.

    Reviews in many cases [not TSA] are used as a tool. I mean I have seen 5/6 for KZ3 and I was like ‘wuuuutttt???’

    Reviews should be enjoyed as someones perspective and thats it.

    • exactly, it’s like taking Yahtzee Croshaw’s critiques of games seriously – you’d never buy a game again :D

      • Rofl, definitely. Though whenever I have a huge list of games I want and I know I can’t afford them all, I watch his reviews and his negative comments help me keep my wallet in my pocket.

      • That man is a cynical genius! I use his reviews so all the negative points can be blown way out of proportion, then put them in perspective with over-blown scores on sites like IGN. Then I get a (usually) realistic average and actually read about it here on TSA. Has helped me many a time.

  8. For me, the easiest way to tell if a game is worth a buy is by watching video reviews. I pretty much block out the dialogue and just focus on the game in action to see if it’s my cup o’ tea.

  9. I think the score is important to quickly categorize a game as great (10-8), good with problems (7-6) or bad (5 or below). If I am thinking of getting a game, I will jump on GameRankings and see what the average score is. Then I can read a review with a higher score and a review with a lower score and come to a reasonably well rounded opinion of the game and its potential flaws. Then I can decide if those flaws will ruin the game for me.

    Without scores, it would be very difficult to quickly come to a decision as to whether or not a game is worth buying. If it is an 8 or above, I usually don’t bother reading the review anyway unless I am after something to read. Anything less, it warns you there might be problems to consider.

  10. Getting every review bang-on isn’t impossible, but it requires a lot of manpower – a lot more than the TSA collective has. In my personal opinion, this is the best and most refined way of reviewing a game:

    Instead of assigning a game to just one writer, assign it to two, three, or even four, who each have to play through the bulk of the game within a deadline. Instead of rushing straight into a review, there should be a meeting and a discussion before the review group comes up with the most accurate score. One of the writers, the “leader” of the review group, is the one who actually writes the review, taking into account points raised by his peers.

    This would hopefully prevent bias, and improve the quality and entertainment value of a review. Hopefully, if TSA ever expands into a fully up-and-running office, the team will have the means to do something like this.

    • That. Is. Genius.
      that is how i always wanted to run a Review section in a magazine/website, get it done :D

      • Glad to see someone agrees, though with most sites having a fairly tight budget, getting a team to play through and talk about the same game is time/money consuming, even if only one of them will have to put pen to paper.

        This is just another stab in the dark, but maybe, if this were to be implemented, the discussion sessions could be videoed or recorded in audio, proving the transparency of our reviews.

      • I’ll do it for free *cough *cough

        I agree as well – having four different reviewers would be interesting, but of course money would be an issue. :/

    • I agree. Pretty much what I said earlier but written in a more comprehensive manner.

      • i think you’ll find more than enough people willing to do it for free, Jim lol. recording audio shouldn’t be too much of a challenge in this day and age either; i think the main issue would be having reliable people willing to move heaven and earth to meet the deadline.

    • The main issue is when you have a 60 hour monster like Dragon Age. The amount of time needed would be horrendous unless all members got a review copy. I’ll let you ask BioWare for that one. ;-)

      • lol, that conversation would be quite something to sit in on. although i do think that if the site established itself as the only place on the web that does it, and if it’s sufficiently (i hate that word) popular hit-wise, i could see it being a success. think about it: today’s age is all about inter-connectivity via facespace etc – what better way to appeal to companies than to make the reviews a group of blokes/girls having fun, playing games together? at the very least it should help dispel the lonely, acne-ridden image gamers have.

    • That would be a pretty cool experiment but the budget required would be astronomical. It would seem that many of the biggest hitters in online games media can’t even afford the man hours to have one guy play through the entirety of a game before reviewing, let alone a committee!
      I can say, with some confidence, that this is almost certainly never going to happen at TSA. It’s just not a sensible allocation of resources to put four guys on one title for the amount of time required. Imagine it was an RPG, that’s four guys out for a whole working week (plus meeting and writing time). That’s an entire monthly salary to cover one game in a month that might see fifteen other games that all need covered too. It’s a nice “ideal world” scenario but never going to be workable. Unless you’ve got a lottery win you’d like to sink into keeping a website running for a few years? ;)

      • So how about publishing reviews written by readers? Or let readers send in their scores and have an additional section to the main TSA review. This wouldn’t work for pre-release reviews, but there are plenty of games that are reviewed on here well after the release date.

        Thinking about it you could have a poll style section to a review where the readers could simply click a score out of ten and the average would then be displayed. Surely this would be easy enough to implement? It would also work for pre-release reviews, as the readers could come back to the review and score it after its release.
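        [Editor’s aside: the poll described above really is only a few lines of code. A minimal sketch in Python – the `ScorePoll` name and its methods are purely illustrative, not anything TSA actually runs:]

```python
class ScorePoll:
    """Minimal sketch of the reader-score poll idea: each reader
    clicks a score out of ten, and the page shows the running average.
    Hypothetical names throughout - not a real TSA API."""

    def __init__(self):
        # reader id -> score, so a re-vote simply replaces the old one
        self._votes = {}

    def vote(self, reader_id, score):
        if not 0 <= score <= 10:
            raise ValueError("score must be between 0 and 10")
        self._votes[reader_id] = score

    def average(self):
        if not self._votes:
            return None  # nothing to display until someone votes
        return round(sum(self._votes.values()) / len(self._votes), 1)


poll = ScorePoll()
poll.vote("alice", 8)
poll.vote("bob", 7)
poll.vote("alice", 9)   # alice re-scores the game after a patch
print(poll.average())   # 9 and 7 average out to 8.0
```

        Storing votes keyed by reader id also handles the “come back and re-score it after release” suggestion for free, since a second click just overwrites the first.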

      • If i did, i would, that’s a promise :)

      • @Kronik Reader reviews are a no-go simply because the vast majority of our readers have opinions, not reviews. We can’t stake our hard-won reputation on a reader’s potentially misguided or biased opinion.
        We have published little round ups in the past but there wasn’t much interest from our readership so we ended up with the same four or five people submitting every week. The polling idea is also something we’ve looked at. Actually, just last week myself and Alex were looking at something similar to a round up thing, like the Verdicts section in our forums but published on the main site and kind of “curated” by us. It’s difficult to gauge the level of interest we’d get though. I’ll keep thinking about it.

    • That’s also what I thought but like you pointed out, not everyone has the manpower to let 4 people work on a review for an entire week.

    • Wait… isn’t this basically what Metacritic is doing? Take the average of every review without the team meetings and one combined review…

      • not exactly – it doesn’t take into account the stupidity of certain reviews, or the ones that are paid for, or the ones that are simply biased or incomplete.

Comments are now closed for this post.