Read-only archive of the All About Symbian forum (2001–2013) · About this archive

Review score

17 replies · 67 views · Started 16 February 2003

I've been thinking about this for a while and it still bothers me.
The current score system on the reviews is not objective and not even close to being fair ...

I think we need guidelines for scoring apps/games instead of voting just by feeling ... and maybe a few more categories for the application department wouldn't be bad too.

I realise that the score is the reviewer's opinion, but I think using some guidelines for a score would be fair when scoring apps in the same division.

Things like performance, memory load, user-friendliness ...

I had to upgrade the Appman score to 90% just to get it into the Mega Award band, because, well, it would belong there if those other apps do 😛

Your comments? 😊

I had some thorough objections against the current system but they just wouldn't come out now 😉

No score system is going to be fair to everyone; there will always be losers and winners.

I'm of the completely opposite opinion, Dazler. I think it should be done on feel - trying to shoehorn in memory usage, speed, etc. is just as subjective. You either add up the scores so they fit the final score you want (in which case it serves no purpose) or you have an unrealistic score that doesn't match up with your expectations.

I also think having about 5 or 6 different scores in one review clouds the whole thing. One score, giving it Ace, not so ace, average, etc as we have now, is clear and simple.

We may write reviews for ourselves, but they are read by everyone.

Of course, the idea of breaking down each "score band" is a good one, and could be expanded easily to give more guidance.

Of course we could do what the majority of magazines do and score based on advertising income from the company.

Maybe just stop scoring 😃
I was thinking maybe something like the following:
The reviewer gives a base score, let's say 75%. Every reader can then add a value from -10 to +10, and these values give the final score of (Σ(Si)/n) + Sr, where Si are the reader values, n is the number of reader values, and Sr is the reviewer's base score.
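A minimal sketch of that formula, assuming the reader adjustments are simply averaged and added to the reviewer's base score (the function name is illustrative, not part of the proposal):

```python
def final_score(reviewer_score, reader_adjustments):
    """Combine a reviewer's base score Sr with reader values Si.

    Each reader adjustment must be in the range -10..+10; the
    average of the adjustments is added to the base score, i.e.
    (sum(Si) / n) + Sr.
    """
    if not reader_adjustments:
        return reviewer_score  # no readers voted yet
    assert all(-10 <= a <= 10 for a in reader_adjustments)
    return reviewer_score + sum(reader_adjustments) / len(reader_adjustments)

# Base score 75%; readers vote +5, -2, +3 -> average +2 -> final 77%
print(final_score(75, [5, -2, 3]))  # 77.0
```

Note that with this scheme the readers can only shift the reviewer's score by at most ten points in either direction.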

(1) The reviewer still needs to decide a base score.

(2) It means we're reviewing by committee, which defeats the purpose of stressing that reviews are those of an individual, and makes getting scores a chore.

Mind you, problems do happen when opinions among us are wildly different (the Fairway review saw Switchblade and me at odds and over 25% out!)

Not 25%, more like 15%, wasn't it? It's still a bloody good game when I'm bored at work. Although we need to finalise the review to get it up. As for scoring bands, I score in bands for all apps and then develop a final score from them relating to major areas in the app/game. This is the way reviews were done when I were a lad reading ST Format, ST Action, Atari ST User, and even still now in PC Plus.

I agree that there should be some things we should consider by scoring:

Memory Used
Memory Required
Graphical User Interface
Usability / Addictability (Vexed is the benchmark 😃 )
Pricing

If we stick to verdicts using Mega App, Average App, Below Expected kind of categories, it will be easy to review.

I am 2 days away from a surprise for you all, and I will let you decide. I agree with you both ... but I haven't said my last word 😉

I don't score by the app's requirements as I feel that's unfair. For games I use: Graphics, Sound, Playability, Addictability, Usability (ease of use). For apps (marking scheme recently changed) I rate on: Usability, Usefulness, Presentation.

That way people can see if a game got a good result by being pretty and playing OK, but not being too addictive. Although for me playability and addictability are the highest-scoring factors; one of my favourite games of all time was in 2 colours and lines.

So we'd have to agree on categories as well? Sheesh!!

Okay, how about people can use sub categories, bands, types, ickle scores, whatever... as long as the final mark fits the scoring scheme and makes sense?

Howzat?

Or how about we give them a score depending on how good the author's Christmas card was last year, or maybe how well he stands his round? 😉

I use the same system as Switchblade ... maybe to make it all simple, a reviewer gives it a score and 1 other person from admin verifies this? That is for very high and very low scores?

Like the filecommander thing that got 45% (I agreed with Ewan's comments) but if you read the statement in the review scoring section for scores between 40-49%, well I don't think it is really appropriate ...

But I'm thinking of dropping this issue. I can still give MGS Karting an 85 or even 90 percent without feeling bad, because it still falls in the category "games" and therefore won't compete with things like Mylist pro and Appman...

[quote="Dazler"]I use the same system as Switchblade ... maybe to make it all simple, a reviewer gives it a score and 1 other person from admin verifies this? That is for very high and very low scores?
[/quote]

That's what we kinda have now, just not firmed up. If memory serves, the 'Board' is myself, Switchblade and MaleBuffy.

[quote="Dazler"]
Like the filecommander thing that got 45% (I agreed with Ewan's comments) but if you read the statement in the review scoring section for scores between 40-49%, well I don't think it is really appropriate, although the score does need re-described methinks to be a little less "Argh!!!"
[/quote]

In which case the score description needs reworking. If you can't recommend a product (a simple yes or no) then scores are below 50%. I think the 45% score is about right - the comment says it's like a Released Beta. Which I agree with.

[quote="Dazler"]
But I'm thinking to drop this issue. I can still give MGS Karting a 85 or even 90 percent without feeling bad because it still falls in the category "games" and therefor won't compete with things like Mylist pro and Appman...[/quote]

NO! Don't drop it! We need to discuss this stuff so we're all comfortable and to force people like me to re-assess the whole thing to see if we were doing it right!

Yes, I'd hope people could tell the difference between Game and App. And also note that the majority of awards would be "recommended", with the Mega award very, very rare. And before anyone asks, Rafe and Langdona did the Vexed review and I have nothing to do with it.

I still think both awards should be over 90%, this is the first time I've seen a review system where apps should struggle to get 80%. The majority of averagely-good apps should get 70-80%, good apps up to 90%, thrilling with a gong 90-95%, and so bloody special I'm gonna write a song about it 95%+.

[quote="Ewan-FreEPOC"]And before anyone asks, Rafe and Langdona did the Vexed review and I have nothing to do with it.[/quote]

Ah, but did you send them a good quality Christmas card?

[quote="SwitchBlade"]I still think both awards should be over 90%, this is the first time I've seen a review system where apps should struggle to get 80%. The majority of averagely-good apps should get 70-80%, good apps up to 90%, thrilling with a gong 90-95%, and so bloody special I'm gonna write a song about it 95%+.[/quote]

Surely average should be 50%? 😊

After talking with Rafe and a few others (external to the site) we decided the average band was going to be 65%-75%, with a narrow (but unstated) 75%-80% band for the almost-great apps. Our 80%-90% would correlate to a magazine's 90%-95%, and our 90%+ is a mag's 95%.
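Treating the bands just described as a lookup, the scheme could be sketched like this (the cut-offs follow the post; the labels and function name are illustrative, not an official AAS scheme):

```python
# Hypothetical band lookup based on the discussion above.
# Cut-offs come from the post; labels are illustrative only.
AAS_BANDS = [
    (90, "exceptional (a magazine's 95%+)"),
    (80, "great (a magazine's 90%-95%)"),
    (75, "almost great (narrow, unstated band)"),
    (65, "average"),
]

def band(score):
    """Return the band label for a percentage score (0-100)."""
    for cutoff, label in AAS_BANDS:
        if score >= cutoff:
            return label
    return "below average"

print(band(72))  # average
print(band(85))  # great (a magazine's 90%-95%)
```

The point of the mapping is that an AAS score always reads lower than the magazine equivalent for the same quality of app.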

A lot of these magazines will not give scores under 80% because of advertising pressure - which is not a consideration with AAS. I also personally don't see what's wrong with an app scoring 70% when it's run of the mill. Companies wouldn't worry about scoring 7/10, but they worry about 70%. Why shouldn't apps struggle to get 80% when that's where we recommend them? We don't like apps just because they run on our nice phones.

I think we have two main points raised here...

(1) What we would score the 'average' app (personally 65-70, but it seems we're arguing for 80-85 here).

(2) Should we follow the pack in terms of scores, or use something that provides a more realistic assessment of an app or game? Average should really mean 50%, but then that would be too mean, so it's moved up slightly.

I'd give an average (neither good nor bad) app 50%, an averagely good app 75%, and an averagely bad one 25%. The thing here is that if we move away from a scoring system similar to most mags, which people will be used to, readers won't view the apps the way they would when reviewed elsewhere.

We're not sheep! I refuse to consider something just because 'other mags' do it that way! If we had the ad budget that 'mags' do then fine, but we get more readership per month than "PDA Essentials", and the puff pieces in that make me vomit - written just so they won't upset anyone. Apps there get...

7/10 : They've given us no money and its okay
8/10 : They've given us money and it's okay
9/10 : They've given us money and it's a bit better than okay

"Not on my watch," says Capt Mainwaring

<Ewan removes tongue from cheek>

I never said lie, I said use a marking scheme people are used to. Mags I read happily give apps scores of 20% if they are shite. I couldn't give a toss what they get, but if everyone fits in together it's easier for people to understand how good something is.

I think it is up to the reviewer to give the score and make the judgment.

We will play between the MegaApp and the other categories, and the final score will reflect the category it's in.

What do ya think?