Romulus ratings



 
 
#1 - October 22nd 03, 06:10 AM - BUFF


"Lem" wrote in message
...
At http://www.romulus2.com/feedback/chart.php?1 there is a series of
ratings of good UK suppliers.

What I don't understand is how some suppliers have got such a high
ranking but they have a mediocre score.

For example, Dabs gets a score of 4.66 and a ranking of 19 but that
4.66 is worse than several other suppliers which are lower down in
the rankings.

Can anyone explain this as I like to use this page a lot.

There are 5 stages:

1. Suppliers with a minimum of 15 votes (reviews) in the past year
2. Suppliers with a minimum of 15 votes (reviews)
3. Suppliers with less than 15 votes (reviews)
4. Suppliers which have been archived (denoted by )
5. Suppliers with no votes (reviews) at all

So a supplier with a low rating but over 15 votes would be further up than
one with a higher rating but under 15 votes.
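
Purely as an illustration (a rough sketch, not the site's actual chart.php code - the record layout here is invented), that tiering works out to something like this in Python:

from datetime import datetime, timedelta

def stage(supplier):
    # supplier is assumed to look like:
    #   {"reviews": [{"date": datetime, "score": float}, ...], "archived": bool}
    one_year_ago = datetime.now() - timedelta(days=365)
    reviews = supplier["reviews"]
    recent = [r for r in reviews if r["date"] >= one_year_ago]
    if len(recent) >= 15:                      # 1. at least 15 reviews in the past year
        return 1
    if len(reviews) >= 15:                     # 2. at least 15 reviews overall
        return 2
    if reviews and not supplier.get("archived"):
        return 3                               # 3. some reviews, but fewer than 15
    if supplier.get("archived"):
        return 4                               # 4. archived suppliers
    return 5                                   # 5. no reviews at all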



#2 - October 22nd 03, 09:32 PM - Paul Hopwood

"BUFF" wrote:

For example, Dabs gets a score of 4.66 and a ranking of 19 but that
4.66 is worse than several other suppliers which are lower down in
the rankings.


Can anyone explain this as I like to use this page a lot.


So a supplier with a low rating but over 15 votes would be further up than
one with a higher rating but under 15 votes.


Does make the rankings somewhat misleading imho - basically a company
with 100 people agreeing they're sh*te would get a better ranking than
a competitor with 10 customers saying they're the best company they
ever dealt with.

I can see some of the logic behind it but as more reviews have been
added to the site the flaws in the basic ethos behind the ranking
system have become more apparent. I still use it occasionally to read
some of the reviews but completely disregard the rankings.

--
iv Paul iv


[ Mail: ]
[ WWW: http://www.hopwood.org.uk/ ]
#3 - October 23rd 03, 10:24 AM - BUFF


"Lem" wrote in message
...
"BUFF" wrote:

"Lem" wrote in message
...
At http://www.romulus2.com/feedback/chart.php?1 there is a
series of ratings of good UK suppliers.

What I don't understand is how some suppliers have got such a
high ranking but they have a mediocre score.

For example, Dabs gets a score of 4.66 and a ranking of 19 but
that 4.66 is worse than several other suppliers which are lower
down in the rankings.

Can anyone explain this as I like to use this page a lot.

5 stages

1.Suppliers with a minimum of 15 votes (reviews) in past year
2.Suppliers with a minimum of 15 votes (reviews)
3.Suppliers with less than 15 votes (reviews)
4.Suppliers which have been archived (denoted by )
5.Suppliers with no votes (reviews) at all

So a supplier with a low rating but over 15 votes would be
further up than one with a higher rating but under 15 votes.



Strange thing is that the first 34 companies on

http://www.romulus2.com/feedback/chart.php?1

have over 15 votes. And yet low scoring companies are still ranked
above high scoring ones.

Something doesn't seem right to me.


See what you mean now - it seems to work OK down to no. 24 (i.e. lower
score = lower place) but then goes wonky for the next 10 where some should
be higher up.
Why don't you drop him a line &
1) point it out - he may not have noticed
2) ask for an explanation


#4 - October 23rd 03, 01:02 PM - RomQ

Strange thing is that the first 34 companies on

http://www.romulus2.com/feedback/chart.php?1

have over 15 votes. And yet low scoring companies are still ranked
above high scoring ones.

Something doesn't seem right to me.


See what you mean now - it seems to work OK down to no. 24 (i.e. lower
score = lower place) but then goes wonky for the next 10 where some should
be higher up.
Why don't you drop him a line &
1) point it out - he may not have noticed
2) ask for an explanation



I have noticed ;-)

It may *look* wonky but actually it's doing exactly what it's supposed to
do. It's following the 5-stage rule noted by you (Buff) in your previous
post:

1. Suppliers with a minimum of 15 reviews in the past year
2. Suppliers with a minimum of 15 reviews
3. Suppliers with less than 15 reviews
4. Suppliers which have been archived
5. Suppliers with no reviews at all

The top 24 on the chart have had 15 or more reviews in THE PAST YEAR. The
next 10 (the ones that look wonky) have had 15 or more reviews BUT (and
here's the crunch!) not in the past year. The firms after that (from 35
down) have less than 15 (the minimum) reviews or no reviews at all.

I felt that for the chart to be meaningful and up-to-date each company must
have at least 15 reviews to be at the top of the chart (otherwise only two
or three reviews - perhaps even posted by friends of the company owner -
could put a two-bit firm in number one position! That actually happened on
the old Feedback Arena before I rewrote the script.).

I also felt that the rankings must reflect the CURRENT performance of a
company. For instance, a firm which performed very well over a year
ago may have had 15 or more positive reviews at that time. Their performance
may have dropped abysmally since then (I can think of a few companies whose
service has declined in the past year) but unless 15 or more people have
posted reviews more recently, they would still be ranked highly.

It also works in the opposite direction of course. Some companies may be
performing much better now than they were over a year ago. Thus I introduced
the rule where the 15 or more reviews must have been posted in the past
twelve months for the company to reach the highest rankings.

Also, the ranking achieved by reviews over the past year supersedes earlier
reviews - so the ranking of a company is constantly updated as time goes on.
See: http://www.romulus2.com/feedback/fan...57968547,61675,

I know it's complex and it ties my head up in knots too sometimes! But it
does produce the fairest and most meaningful results. The code for a simple
highest-rating-gets-highest-rank chart would have been waaaayyy easier to
write but utterly meaningless as it would not reflect the NUMBER of reviews
posted or how RECENTLY they were posted.
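
As a very rough sketch of that ordering (again purely illustrative - this is not the actual chart.php script, and the data layout is made up), the chart sort comes down to something like:

from datetime import datetime, timedelta

CUTOFF = datetime.now() - timedelta(days=365)
MIN_REVIEWS = 15

def sort_key(supplier):
    # Each review is assumed to be {"score": float, "date": datetime};
    # archived suppliers (stage 4) are left out to keep the sketch short.
    reviews = supplier["reviews"]
    recent = [r for r in reviews if r["date"] >= CUTOFF]
    if len(recent) >= MIN_REVIEWS:
        tier, pool = 1, recent        # the past year's reviews supersede older ones
    elif len(reviews) >= MIN_REVIEWS:
        tier, pool = 2, reviews
    elif reviews:
        tier, pool = 3, reviews
    else:
        tier, pool = 5, []
    average = sum(r["score"] for r in pool) / len(pool) if pool else 0.0
    return (tier, -average)           # ascending sort: stage 1 first, highest average within a stage

def build_chart(suppliers):
    return sorted(suppliers, key=sort_key)

A supplier that no longer has 15 reviews within the year drops back to stage 2 and competes on its full history, which is exactly why the middle of the chart looks wonky at first glance.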

I hope that helps ...


David Knell

Feedback Arena
http://www.feedbackarena.com







"BUFF" wrote in message
...

"Lem" wrote in message
...
"BUFF" wrote:

"Lem" wrote in message
...
At http://www.romulus2.com/feedback/chart.php?1 there is a
series of ratings of good UK suppliers.

What I don't understand is how some suppliers have got such a
high ranking but they have a mediocre score.

For example, Dabs gets a score of 4.66 and a ranking of 19 but
that 4.66 is worse than several other suppliers which are lower
down in the rankings.

Can anyone explain this as I like to use this page a lot.

5 stages

1.Suppliers with a minimum of 15 votes (reviews) in past year
2.Suppliers with a minimum of 15 votes (reviews)
3.Suppliers with less than 15 votes (reviews)
4.Suppliers which have been archived (denoted by )
5.Suppliers with no votes (reviews) at all

So a supplier with a low rating but over 15 votes would be
further up than one with a higher rating but under 15 votes.



Strange thing is that the first 34 companies on

http://www.romulus2.com/feedback/chart.php?1

have over 15 votes. And yet low scoring comanies are still ranked
above high scoring ones.

Something doesn't seem right to me.


See what you mean now - it seems to work OK down to no. 24 (i.e. lower
score= lower place) but then goes wonky for the next 10 whwere some

should
be higher up.
Why don't you drop him a line &
1) point it out - he may not have noticed
2) ask for an explanation




#5 - October 23rd 03, 01:18 PM - RomQ

Does make the rankings somewhat misleading imho - basically a company
with 100 people agreeing they're sh*te would get a better ranking than
a competitor with 10 customers saying they're the best company they
ever dealt with.


Paul, I hope my reply to Buff answers your misgivings, particularly my
paragraph:

"I felt that for the chart to be meaningful and up-to-date each company must
have at least 15 reviews to be at the top of the chart (otherwise only two
or three reviews - perhaps even posted by friends of the company owner -
could put a two-bit firm in number one position! That actually happened on
the old Feedback Arena before I rewrote the script.)."

As I'm sure you'll agree, any survey can only be meaningful if a reasonable
number of people are polled. Think of the ranking of a company as a survey.
Would you place any trust in a survey based on less than 15 people?

David


"Paul Hopwood" wrote in message
...
"BUFF" wrote:

For example, Dabs gets a score of 4.66 and a ranking of 19 but that
4.66 is worse than several other suppliers which are lower down in
the rankings.


Can anyone explain this as I like to use this page a lot.


So a supplier with a low rating but over 15 votes would be further up

than
one with a higher rating but under 15 votes.


Does make the rankings somewhat misleading imho - basically a company
with 100 people agreeing they're sh*te would get a better ranking than
a competitor with 10 customers saying they're the best company they
ever dealt with.

I can see some of the logic behind it but as more reviews have been
added to the site the flaws in the basic ethos behind the ranking
system have become more apparent. I still use it occasionally to read
some of the reviews but completely disregard the rankings.

--
iv Paul iv


[ Mail: ]
[ WWW:
http://www.hopwood.org.uk/ ]



#6 - October 23rd 03, 10:29 PM - Paul Hopwood

"RomQ" wrote:

Does make the rankings somewhat misleading imho - basically a company
with 100 people agreeing they're sh*te would get a better ranking than
a competitor with 10 customers saying they're the best company they
ever dealt with.


As I'm sure you'll agree, any survey can only be meaningful if a reasonable
number of people are polled. Think of the ranking of a company as a survey.
Would you place any trust in a survey based on less than 15 people?


My statement above still applies. It's true you need a decent number
of contributions for a ranking to be meaningful but it's nonsensical
imho that a company with a huge number of negative responses can be
ranked more highly than one which has predominantly good ratings from
fewer people.

For example, you have Time, Watford and PC World, with abysmal
ratings contributed by a large number of people (thus they can
probably be regarded as crap companies to deal with), being rated
higher (and climbing!) than, for example, Evesham, Insight and
Maplin, each of which has an average score in excess of DOUBLE the
former companies'. Then you have companies with a dozen or so
customers giving them glowing reports, resulting in average scores of
8 or higher, languishing outside the top 100.

A ranking system can only really be regarded as effective if the end
result is a chart with the most favourable at one end of the scale and
least at the opposite end, and those in between being sorted in
(approximate) order. Ignoring the ethos behind the ranking system for
a moment, can you really say the Romulus chart gives a true
representation of the suppliers listed, or even a reliable Top 100?
Companies with infamously terrible service like Time, Watford and PC
World, with terrible reviews and an overall rating which reflects the
true nature of those companies, are in your top 10%, ranked above
well-regarded companies with significantly better scores such as
Evesham, Overclock.co.uk, Extreme Cooling, Digi-UK, MicroWarehouse
etc.

As I said, I find the site useful for the overall rating and to read
the reviews people have posted, but the ranking system is so wide of
the true picture that I consider it to be of no practical use. When
it comes down to it, it's your site and if the ranking system orders
them how *you* consider each of the suppliers then it's probably
successful; I just happen to disagree with your evaluation. ;-)

--
iv Paul iv


[ Mail: ]
[ WWW: http://www.hopwood.org.uk/ ]
#7 - October 24th 03, 12:33 AM - RomQ

When it comes down to it, it's your site and if the ranking system orders
them how *you* consider each of the suppliers then it's probably
successful; I just happen to disagree with your evaluation. ;-)


You're being a little unfair, Paul. You know very well that Feedback Arena
has absolutely nothing to do with *my* evaluation or how *I* consider the
suppliers. I have no control over the rankings whatsoever. I merely choose
the minimum number of reviews that I feel represents an adequate poll for a
rank to be reliable. The number could be 3 or 5 or 10 or whatever but I
think 15 is about right. I could be wrong and 10 has been suggested as
better - but I don't want some tiny operation to just get ten friends to
post great reviews for them and shoot them to the top. The larger the
number, the less likely that is to happen.

Incidentally, the American ResellerRatings.com uses a minimum of 30 ("The
resellers must have at least five reviews for every month in the past six
months to qualify for the top seller list")
http://www.resellerratings.com/faq.pl

I would like to see companies like Evesham, etc. properly ranked in the
chart too but until enough people post reviews for those companies within
the past year for the results to be statistically reliable, there's nothing
I can do about it. As I said, *I* don't control the rankings, the site users
do. The more people who use the site and post reviews, the better it gets.

A ranking system can only really be regarded as effective if the end
result is a chart with the most favourable at one end of the scale and
least at the opposite end, and those in between being sorted in
(approximate) order.


Regardless of how many reviews they get or how long ago they were posted? A
ranking system where a company with only two reviews posted by mates of the
owner two years ago could be ranked way higher than one with 50 reviews
posted by bona fide customers in the last month? That's not the kind of
ranking system I would regard as effective or want to place any confidence
in. ;-)

Regards,
David


"Paul Hopwood" wrote in message
...
"RomQ" wrote:

Does make the rankings somewhat misleading imho - basically a company
with 100 people agreeing they're sh*te would get a better ranking than
a competitor with 10 customers saying they're the best company they
ever dealt with.


As I'm sure you'll agree, any survey can only be meaningful if a

reasonable
number of people are polled. Think of the ranking of a company as a

survey.
Would you place any trust in a survey based on less than 15 people?


My statement above still applies. It's true you need a decent number
of contributions for a ranking to be meaningful but it's nonsensical
imho that a company with a huge number of negative responses can be
ranked more highly than one who has predominantly good ratings from
fewer people.

For example you have Time, Watford and PC World with an abysmal
ratings contributed by a large number of people (thus they can
probably regarded as being crap companies to deal with) being better
rated (and climbing!) higher than for example, Evesham, Insight and
Maplin, each of which has an average score in excess of DOUBLE the
former companies. Then you have companies with a dozen or so
customers giving them glowing reports, resulting in average scores of
8 or higher, languishing outside the top 100.

A ranking system can only really be regarded effective if the end
result is a chart with the most favourable at one end of the scale and
least at the opposite end, and those in between being sorted in
(approximate) order. Ignoring the ethos behind the ranking system for
a moment, can you really say the Romulus really gives a true
representation of the suppliers listed, or even a reliable Top 100?
Companies with infamously terrible service like Time, Watford and PC
World are in your top 10%, with terrible reviews and an overall rating
which reflect the true nature of those companies, are ranked above
well-regarded companies with significantly better scores such as
Evesham, Overclock.co.uk, Extreme Cooling, Digi-UK, MicroWarehouse
etc.

As I said, I find the site useful for the overall rating and to read
the reviews people have posted, but the ranking system is so wide of
the true picture that I consider it to be of no practical use. When
it comes down to it it's your site and if the ranking system orders
them how *you* consider each of the suppliers then it's probably
successful, I just happen to disagree with your evaluation. ;-)

--
iv Paul iv


[ Mail: ]
[ WWW:
http://www.hopwood.org.uk/ ]



#8 - October 25th 03, 02:53 PM - Paul Hopwood

"RomQ" wrote:

When it comes down to it, it's your site and if the ranking system orders
them how *you* consider each of the suppliers then it's probably
successful; I just happen to disagree with your evaluation. ;-)


You're being a little unfair, Paul. You know very well that Feedback Arena
has absolutely nothing to do with *my* evaluation or how *I* consider the
suppliers. I have no control over the rankings whatsoever.


I wasn't implying you have any influence over the feedback itself, but
you *do* write the code behind the ranking mechanism, based on a set of
rules you determine, so the rankings are, in effect, of your making.

I merely choose
the minimum number of reviews that I feel represents an adequate poll for a
rank to be reliable. The number could be 3 or 5 or 10 or whatever but I
think 15 is about right. I could be wrong and 10 has been suggested as
better - but I don't want some tiny operation to just get ten friends to
post great reviews for them and shoot them to the top. The larger the
number, the less that will happen.


The most effective might be some kind of weighted average, using a
"balanced scorecard" approach to give each supplier a score, which in
turn is used to rank them. Each review would be given a degree of
"trust", possibly based on the number of reviews posted by the
reviewer (avoiding a supplier getting 10 friends to post positive
remarks), and on the age and quantity of reviews. I've also seen other
sites make very effective use of review feedback (reviews of reviews,
as it were) so that other visitors can state their level of agreement
with a review without necessarily posting a review themselves.

Another simpler alternative would be to simply leave suppliers you
consider not to have sufficient feedback as "unranked", rather than
try to rank suppliers using incomplete data.
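
Purely as a sketch of the kind of weighting I mean (the trust and age factors and the field names here are invented for illustration, not anything the site actually implements):

from datetime import datetime
import math

def review_weight(review, reviews_by_author):
    # Weight a review by how prolific its author is and how recently it was posted.
    author_count = reviews_by_author.get(review["author"], 1)
    trust = min(1.0, math.log(1 + author_count) / math.log(11))   # levels off around 10 reviews
    age_days = (datetime.now() - review["date"]).days
    recency = 0.5 ** (age_days / 365.0)                           # half weight per year of age
    return trust * recency

def weighted_score(supplier, reviews_by_author, min_weight=5.0):
    # Weighted average score, or None ("unranked") if there isn't enough trusted feedback.
    pairs = [(review_weight(r, reviews_by_author), r["score"]) for r in supplier["reviews"]]
    total = sum(w for w, _ in pairs)
    if total < min_weight:
        return None
    return sum(w * s for w, s in pairs) / total

Suppliers whose total weight falls below the threshold would then simply be listed as "unranked" rather than squeezed into the chart on incomplete data.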


A ranking system can only really be regarded as effective if the end
result is a chart with the most favourable at one end of the scale and
least at the opposite end, and those in between being sorted in
(approximate) order.


Regardless of how many reviews they get or how long ago they were posted? A
ranking system where a company with only two reviews posted by mates of the
owner two years ago could be ranked way higher than one with 50 reviews
posted by bona fide customers in the last month? That's not the kind of
ranking system I would regard as effective or want to place any confidence
in. ;-)


As it stands the current ranking system isn't one I'd suggest you
could regard as effective or want to place any confidence in. Unless
you really think we should be discouraging people from doing business
with, for example, Evesham and suggesting everyone goes out and buys
from Tiny instead? ;-)

The site is very useful for gauging the general service level of a
supplier using the reviews and overall rating but, imho, the chart is
quite clearly NOT a true reflection of the comparative merits of the
suppliers listed.

--
iv Paul iv


[ Mail: ]
[ WWW: http://www.hopwood.org.uk/ ]
 



