Ownership of the print top 20?

Conaill said:
See, this is a good example of why more "professional" rating services like imdb.com use something called a "Bayesian estimate" to compensate for differences in the number of votes each candidate receives. I won't go into the details of why the formula has the shape it does, but the upshot is that it's equivalent to adding a small number of "pseudo-votes" with a score equal to the overall average score.

For example, the average score for print reviews is around 3.85 (3.846 before rounding), so let's add 5 "3.85-star" votes to each product. The sum of scores for Slaine would be 6 x 5.00 + 5 x 3.846 = 49.23, for an adjusted average score of 4.48. The sum of scores for M&M would be 11 x 5.00 + 1 x 4.00 + 5 x 3.846 = 78.23, for an adjusted average score of 4.60.

So if you take into account the fact that M&M has twice as many votes as Slaine, and only a single 4-star vote, M&M should be ranked higher than Slaine.
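The pseudo-vote arithmetic above fits in a few lines. Here's a minimal sketch (the 3.85 prior and 5 pseudo-votes are the figures from the post; with the rounded 3.85 prior the raw sums differ from the quoted ones in the third decimal, but the two-decimal scores come out the same):

```python
def bayesian_average(ratings, prior_mean=3.85, pseudo_votes=5):
    """Shrink a product's average toward the overall mean by adding
    `pseudo_votes` fake votes at `prior_mean`."""
    return (sum(ratings) + pseudo_votes * prior_mean) / (len(ratings) + pseudo_votes)

slaine = [5.0] * 6         # six 5-star reviews
mm = [5.0] * 11 + [4.0]    # eleven 5-star reviews plus one 4-star

print(f"{bayesian_average(slaine):.2f}")  # 4.48
print(f"{bayesian_average(mm):.2f}")      # 4.60
```

Note that a product with no reviews at all would simply get the prior mean, which is exactly the point: few votes means little evidence to move away from the overall average.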

Here's what the top 20 would look like, using a Bayesian estimate à la imdb.com:

Code:
 1  Mutants & Masterminds      Green Ronin Publishing    4.60
 2  Tome of Horrors            Necromancer Games         4.54
 3  Manual of the Planes       Wizards of the Coast      4.54
 4  2000AD: Sla'ine RPG        Mongoose Publishing       4.48
 5  Midnight                   Fantasy Flight Games      4.48
 6  Crooks!                    Green Ronin Publishing    4.42
 7  Lords o/t Night: Vampires  Bottled Imp Games         4.42
 8  Starfarers Handbook        Fantasy Flight Games      4.42
 9  Book of the Righteous      Green Ronin Publishing    4.42
10  The Book of Taverns        Necromancer Games         4.40
11  Call of Cthulhu d20        Wizards of the Coast      4.40
12  Scarred Lands: Ghelspad    Sword & Sorcery Studios   4.37
13  Lords o/t Night: Liches    Bottled Imp Games         4.37
14  The Tomb of Abysthor       Necromancer Games         4.35
15  Monsternomicon Vol. I      Privateer Press           4.35
16  Forgotten Realms CS        Wizards of the Coast      4.33
17  Legions of Hell            Green Ronin Publishing    4.33
18  Hammer & Helm              Green Ronin Publishing    4.32
19  Plot and Poison            Green Ronin Publishing    4.32
20  Unearthed Arcana           Wizards of the Coast      4.29
21  Judge Dredd RPG            Mongoose Publishing       4.29

Note how all the 5- and 6-vote entries moved down, whereas entries with many more 5-star reviews (such as MotP, Starfarer's HB, and BotR) moved up. FRCS, with a whopping 17 reviews, 11 of them 5-star, moved up 13 places, from #29 to #16, whereas Hammer & Helm and Plot and Poison dropped 11 places because they only barely squeaked past the 5-vote minimum.

You wouldn't happen to know SQL coding and want to volunteer to tweak the review database, would you? ;)

Yes, I brought up imdb's method ages ago, but we were never able to implement anything like it. Something like that would indeed make the scoring system more credible. As it is now, all it takes is for a publisher to convince 5 fans to give a product a glowing review and it's at the top of the charts.
 


Psion said:
You wouldn't happen to know SQL coding and want to volunteer to tweak the review database, would you? ;)
I don't know SQL, but it should be pretty trivial to code up. The only thing you need is the average of all individual votes (*not* the average of the products' scores, mind you). If the votes aren't stored individually, you can get that by multiplying each product's score by its number of votes, adding up across products, and dividing by the total number of votes.

Next, you add 5 "fake" votes with that average to each product, and recalculate each product's score from that. Should be a 5-minute job for someone who is familiar with SQL and the current review database. It took me about 15 minutes, but much of that was pasting the data from the website into Excel...
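For anyone who does know SQL, the whole adjustment fits in a couple of queries. Here's a sketch using SQLite via Python; the `reviews` table and its columns are hypothetical (the real review database schema isn't known here), and the two sample rows use the M&M and Slaine figures quoted earlier in the thread:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Hypothetical schema: one row per product, with its average score and vote count.
con.execute("CREATE TABLE reviews (product TEXT, score REAL, votes INTEGER)")
con.executemany("INSERT INTO reviews VALUES (?, ?, ?)", [
    ("Mutants & Masterminds", 4.92, 12),
    ("2000AD: Sla'ine RPG", 5.00, 6),
])

PSEUDO_VOTES = 5

# Grand mean of all individual votes: total stars / total votes
# (not the mean of the per-product averages).
(grand_mean,) = con.execute(
    "SELECT SUM(score * votes) / SUM(votes) FROM reviews").fetchone()

# Add 5 pseudo-votes at the grand mean to every product and re-rank.
ranked = con.execute(
    "SELECT product, (score * votes + ? * ?) / (votes + ?) AS adjusted "
    "FROM reviews ORDER BY adjusted DESC",
    (PSEUDO_VOTES, grand_mean, PSEUDO_VOTES)).fetchall()

for product, adjusted in ranked:
    print(f"{product}\t{adjusted:.2f}")
```

With only two sample rows the grand mean is itself close to 5, so the adjustment plays out differently than on the full dataset; the point is just that both steps are plain aggregate SQL.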
 

I have the following:

2. Midnight Fantasy Flight Games 5.00 6
3. Crooks! Green Ronin Publishing 5.00 5
5. Mutants & Masterminds Green Ronin Publishing 4.92 12
9. Manual of the Planes Wizards of the Coast 4.79 14
11. Call of Cthulhu d20 Wizards of the Coast 4.75 8
18. Unearthed Arcana Wizards of the Coast 4.67 6
22. Freedom City - Green Ronin Publishing

I will use Crooks! and Freedom City. I have used Manual of the Planes many times. I have used CoC d20 once. I have used M&M once, and intend to use it a good deal more.
 

Conaill said:
So what do you all think... does this ranking make more sense to you? From a statistical point of view, it should be more reliable...
Yes, it certainly does make more sense to me.

And Conaill - I just want you to know that I absolutely love your statistical analyses on this board (in this thread and others). My statistics years are far behind me, but I still find your posts fascinating!
 

Interesting...

I did a little more number-crunching, and it seems the current ranking, compared to my adjusted one, agrees as well as or better with the products people actually say they *use* in this thread. Not necessarily a measure of quality, but it could be an indication that people are more likely to actually use a niche product once they've bought it.
 

arnwyn said:
And Conaill - I just want you to know that I absolutely love your statistical analyses on this board (in this thread and others). My statistics years are far behind me, but I still find your posts fascinating!
Thanks! Always nice to know they're not just disappearing into a black hole. :D
 

Here's what I own. Each of these books I've used at least once.

6. Tome of Horrors Necromancer Games 4.83 12
9. Manual of the Planes Wizards of the Coast 4.79 14
11. Call of Cthulhu d20 Wizards of the Coast 4.75 8
18. Unearthed Arcana Wizards of the Coast 4.67 6

Here's a listing of what I get the most use out of:
1. Unearthed Arcana
2. Manual of the Planes
3. Tome of Horrors (I love, LOVE opening this book to look for wacky dungeon inhabitants, but I don't really use dungeons that much, alas alas...)
4. Call of Cthulhu (In reality this should be number 6 or 7 :p. I don't use it that much)
 


Conaill said:
See, this is a good example of why more "professional" rating services like imdb.com use something called a "Bayesian estimate" to compensate for differences in the number of votes each candidate receives. I won't go into the details of why the formula has the shape it does, but the upshot is that it's equivalent to adding a small number of "pseudo-votes" with a score equal to the overall average score.

For example, the average score for print reviews is around 3.85 (3.846 before rounding), so let's add 5 "3.85-star" votes to each product. The sum of scores for Slaine would be 6 x 5.00 + 5 x 3.846 = 49.23, for an adjusted average score of 4.48. The sum of scores for M&M would be 11 x 5.00 + 1 x 4.00 + 5 x 3.846 = 78.23, for an adjusted average score of 4.60.

Excellent, excellent work! This makes more sense.

Very interested in the formula -- any links to it? I'm curious why 5 fake votes are added to each score, as opposed to 2, 10, 100, etc.

Thanks,
-D
 

devilish said:
Very interested in the formula -- any links to it? I'm curious why 5 fake votes are added to each score, as opposed to 2, 10, 100, etc.
I'll try to dig up a reference that's understandable to the lay person. Meanwhile, you can find the formula at the bottom of imdb.com's top 250 page. They also discuss it in their FAQ.

As for why to add exactly 5 votes... imdb recommends using the same value as the cutoff used to weed out entries with too few votes. In their case, movies have to have received at least 1250 votes to appear in the top 250 list, and they add 1250 "pseudo-votes".
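The formula imdb publishes is usually written as a weighted blend, WR = (v/(v+m))*R + (m/(v+m))*C, where R is the item's mean score, v its vote count, C the overall mean, and m the pseudo-vote count. A quick check that this is the same thing as the "add m fake votes at C" version discussed above, using the Slaine numbers from earlier in the thread:

```python
def weighted_rating(R, v, C, m):
    # imdb's published form: blend the item's mean R with the prior C,
    # weighted by the item's vote count v against m pseudo-votes.
    return (v / (v + m)) * R + (m / (v + m)) * C

def pseudo_vote_average(R, v, C, m):
    # "Add m fake votes at C" form from earlier in the thread.
    return (R * v + C * m) / (v + m)

# Algebraically identical: put the blend over the common denominator
# (v + m) and it collapses to (R*v + C*m) / (v + m).
print(f"{weighted_rating(5.00, 6, 3.85, 5):.2f}")      # 4.48
print(f"{pseudo_vote_average(5.00, 6, 3.85, 5):.2f}")  # 4.48
```

So the choice of m is just a choice of how strongly to pull low-vote items toward the overall average, which is why tying it to the minimum-vote cutoff is a natural convention rather than a mathematical necessity.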
 
