Review inflation on ENworld

tleilaxu said:
So I went through the three official reviewers, Kushner, Simon and Psion. (heh, Simon and Psion, anyone remember Simon and Simon?)

Scrolling down quickly I found that out of Psion's 100+ reviews not a -single- one warranted a "1 - atrocious" rating. Kushner as well has not nailed anyone with a 1 rating. In Simon's 100+ reviews there are only about 3 or 4 atrocious ratings.

(...)

Now, in my opinion, you should ideally be able to add up all the scores from all the products, divide by the number of reviews, and get 3.00. So when I get access to the reviews page that is what I am going to do. My guess is that the average is much closer to 3.66 than 3.00.
(...)
Here is a question for the rest of you: why does this always seem to happen, in grades, movie reviews, etcetera? A "C" should be the average grade in classes, but often B is the average.
(...)
Opinions, thoughts, rants, flames?

Allow me to thoroughly disagree. There are a few important things you are missing.

1) First, nothing about the ENWorld rating system says that it is a ranking system that is strictly relative to one's peers. The only thing that says what the numbers mean is the short description on the review page:

1 - Appalling
2 - Poor
3 - Average
4 - Good
5 - Superb


Note that those ratings don't say

1 - Bottom 20 percentile
2 - 20th-40th percentile

etc.

Nothing I have reviewed strikes me as "appalling." Nothing. That is not to say that there is nothing appalling out there. This leads me to point number 2.

2) You assume that all of the staff reviewers have reviewed a fair cross section of the material available out there. This is not the case.

There are a few factors at work here.

First, while some publishers do contact me, for the most part I solicit the publishers I would like to see review material from. I won't solicit companies whose material I consider poor to begin with. As a result, I typically get material that is above the d20 average to begin with. I am, in large part, showing you reviews of the companies that I consider the best d20 publishers.

Second, when publishers are rated badly, they tend to not want to send you review material anymore. There is not a company I have given a "2" to that sends me material anymore. Many publishers take it pretty hard when I give them a "3".
3) Third is the moving average. Material has significantly improved in quality over the last 2 years. But I try not to shift my standards as time goes by. When a person comes to the board and looks at ratings, those ratings have to stand up irrespective of when the review was posted.

4) The primary content of the reviews page is the reviews, not the numbers. Anyone can put up a poll of what people thought of product X or Y, but in the end, those are just subjective measures. The numbers don't tell you how the reviewer came to that conclusion, and the reviewer's criteria may be irrelevant for your purposes. The best way to determine what the value of a product is to you is to...

(drum roll...)

READ THE REVIEW!
 


Reviewers don't review "everything" -- they get to pick and choose what they review. Maybe they stay away from the stinkers, whether consciously or not.

That would be "consciously". I have seen a few stinkers, but I certainly didn't squander my money on them, and if the companies responsible solicited me to do reviews, I would politely decline. I make enough that I can buy most RPG products with an hour or two's wages. Writing, editing, and submitting a review takes AT LEAST two hours (for a large product, more like four or six). Reviewing poor products is annoying to me and only adds a product that I didn't want in the first place to my collection. The motivation is just not there for me to do lots of reviews of bad products.

I do plan on doing a review of a stinker I picked up during the ENnies (though still a 2, not a 1), but gratis items submitted for reviews come first.
 

Tallarn said:
5*) A movie masterpiece, like LotR, Star Wars Ep4...films that are generally agreed to be incredible pieces of work.

I would hasten to point out that the reception of some films changes over time. Star Wars (I refuse to call it Episode 4, "A New Hope" or anything else) did not receive universal praise when it first came out. That it rose to the status of a cultural touchstone has affected reviews of it since, but it received its share of bad reviews when it was new.


That said, I'd prefer a rating system that incorporates several categories, with or without a 'master' score. An example is how some magazines review computer games: they rate them based on three to five categories, such as graphics, sound, replayability, and so on.

I don't much care for going to 1-10, as I think it will just skew the scores to a different point, without much benefit. And technically, you have that now, if you include 1/2 stars. Ultimately, the score should only be an invitation to read the review, and find out what the reasons behind the score were. No simple number or collection of numbers is going to tell the story effectively enough.
 

Re: Re: Review inflation on ENworld

Psion said:

Second, when publishers are rated badly, they tend to not want to send you review material anymore. There is not a company I have given a "2" to that sends me material anymore. Many publishers take it pretty hard when I give them a "3".

Wow... harsh.

I know you have never given one of my products a "2" yet, but another staff reviewer has, and I certainly still have him on my distribution list (hint hint, Simon Collins, you know who you are... :D). The best thing a company that gets a bad review can do is work hard to earn a better one. That's why I still send out promotional copies even to people who have panned my work in the past.

But there ARE exceptions. For example, we won't send out promotional copies of our material to several review sites after the Kid's Coloring Book fiasco. My two children wrote a book, paid to have a limited print run done out of their own pockets, and brought said book to GenCon. They gave away copies to their favourite game designers, and sold the rest. A lot of reviewers came up and asked for free copies to review, and they all got one. In fact, enough were given away that the girls lost money on the print run.

That was August 8th - 11th, and there are still NO reviews of this book up anywhere. One review site "lost" the copy they were given, and none of the others have bothered to post their reviews.

It's one thing when a complimentary review copy is volunteered, but when a reviewer SOLICITS a review copy, especially from a 7 year old child, they had damn well better actually review it.

Wow... did I mention I am bitter about this? If these same two children hadn't been part of the award-winning Portable Hole team, they would have walked away from the whole GenCon thing with a significant dislike for the whole game publishing scene.
 

Dextra was saying something about this that had her up in arms on a thread on the d20 publishers forum. At the time, she didn't specify what the product or situation was, but now I understand. Yes, I would agree that sounds a bit aggravating.
 

I don't think we need to change the ratings system for reviews, but I do agree with tleilaxu that it would be useful to add a percentile to the score.

Likewise, I think we could add a percentile to indicate how this particular review compares to other reviews by the same reviewer. Yes, some reviewers give higher marks than others. One person's "2" may be another's "4".

No need to actually change any of the ratings or overhaul the whole system, mind you! Just a little more info on how a product compares to others and how a review compares to others by the same reviewer...
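A reviewer-relative percentile like the one suggested above could be computed from nothing more than a reviewer's score history. Here is a minimal sketch of the idea; the function name and all of the scores are invented for illustration, and ENWorld's actual review data is not represented:

```python
def percentile_within(scores, this_score):
    """Return the percentage of a reviewer's past scores that sit
    at or below the given score (an invented helper, for illustration)."""
    at_or_below = sum(1 for s in scores if s <= this_score)
    return 100.0 * at_or_below / len(scores)

# A generous reviewer: a "4" from them is only middling.
generous = [3, 4, 4, 4, 5, 5, 4, 3, 5, 4]
print(percentile_within(generous, 4))  # 70.0

# A harsher reviewer: the same "4" tops their whole range.
harsh = [2, 3, 3, 2, 4, 3, 2, 3, 3, 2]
print(percentile_within(harsh, 4))  # 100.0
```

The same raw "4" lands at very different percentiles for the two hypothetical reviewers, which is exactly the extra context this suggestion would surface without changing anyone's actual ratings.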
 

First, Jason, I am terribly sorry to hear about this chain of events. Your kids definitely deserve better than that. I don't know if the publishers in question didn't feel that the book was worth reviewing because of the age of the authors, or if there were other circumstances, but I am very disappointed to hear of the shoddy treatment. Frankly, it sucks.

Second, about the reviews: I think Eric and Psion put it best. You already HAVE your pool of potential ones and twos out there - all those products that people have, consciously or not, avoided. Frankly, the purpose of a review is to tell another person to either seek out or avoid a book based on how it affected you personally.

Most ENWorld reviewers PAY for their books; only a select few house reviewers are on distribution lists of any appreciable size. The rest are amateur reviewers who have to go buy their books and then review them. All of the reviews I have posted to ENWorld (so far only 5 or 6, the products I have bought in the past year) have been 3's or 4's, because if the books weren't good or didn't interest me, I wouldn't have bought them. This shows up in my reviews, for people to see if the material will interest them.

Ironically, the only review copy I EVER received for free, I had to review on RPGnet, because the review logins here were terrible at the time, and I couldn't log in to post a review if my life depended on it! Fortunately, the problems have since been resolved, and the review was posted nonetheless. If you ever read this, Hyrum Savage, I'm sorry it took so long to post a review at all! :o
 

Here is a question for the rest of you: why does this always seem to happen, in grades, movie reviews, etcetera? A "C" should be the average grade in classes, but often B is the average. If someone gets an A in a class, they should be in at least the top 10 if not 5 percent.

I'll explain to you very simply why that doesn't work: it doesn't actually measure the person's (or, in this case, the product's) ability in and of itself.

As an example, say you had a class that gave math tests, each one with 100 problems, 10 times a semester. Assume you have 50 students: 10 always get all the questions right, 10 always miss 1, 10 always miss 2, 10 always miss 3, and 10 always miss 4. (We are assuming this is an advanced class, I guess.)

Now, individually, if someone said "I got 96% on all my tests this year!", you would think they did pretty well, right? I sure would.

But by your model, they would all fail and have to repeat the class, despite having done good work, simply because there was someone slightly better than them.

Now, granted, that's a fabricated example. But I do think the point it examines is valid. I mean, if you were to assume that a "D" grade or below fails a class, then by necessity, in your system, 40 percent of every class MUST fail, regardless of how well they did.

Or, to shift back to reviews, if you reviewed ten really, really good products... let's say... Wheel of Time, FRCS, Rokugan, Manual of the Planes, Monsternomicon, Kalamar players guide, Oriental Adventures, Traps and Treachery, the EQ RPG, and Spycraft... Of those, 2 MUST, no matter how good they are (assuming you are still going on a hard 1-5 scale), earn a 1. Likewise, if you reviewed 10 horrible products, two MUST earn a 5. This system is deceptive and doesn't really tell you anything. For example, if I see that Psion gave the Manual of the Planes a 1, I might think he hated it, when in fact he simply loved it less than other stuff.

That's not really useful... it forces me to read the entire review to understand how he felt about it. And while I do make it a point of reading the reviews, sometimes I just want a quick baseline... which your system would not give me.
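The forced-curve distortion described above is easy to demonstrate in a few lines. This toy sketch (the product names and raw quality scores are invented, not real review data) ranks ten uniformly excellent products and hands out grades 1-5, two per grade, the way a strict relative system would:

```python
def force_curve(products):
    """Assign forced-curve grades (1-5) to ten products,
    two per grade, from worst-ranked to best-ranked."""
    ranked = sorted(products, key=products.get)  # lowest quality first
    return {name: 1 + i // 2 for i, name in enumerate(ranked)}

# Ten products, all excellent (raw quality 90-99 out of 100):
excellent = {f"product_{i}": 90 + i for i in range(10)}
grades = force_curve(excellent)
print(grades["product_0"])  # 1 -- a 90/100 product branded "appalling"
print(grades["product_9"])  # 5
```

Even though the "worst" product here scores 90 out of 100, the curve is forced to brand it a 1, which is exactly the deception the post is objecting to.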
 

Review ratings

The primary content of the reviews page is the reviews, not the numbers. Anyone can put up a poll of what people thought of product X or Y, but in the end, those are just subjective measures. The numbers don't tell you how the reviewer came to that conclusion, and the reviewer's criteria may be irrelevant for your purposes. The best way to determine what the value of a product is to you is to...

READ THE REVIEW!

Well said, Psion. The numeric rating system should never be considered the definitive grade of a review, but rather a subjective measure that both complements the text of the review and provides a means of comparing similar products. For example, if I were looking for a city book and trying to decide between Freeport: CoA and Skraag, I could look at the review scores as a comparison. If Freeport was consistently rated high and Skraag wasn't rated as high, I would be inclined to lean towards Freeport, especially after reading the reviews. That's not to say that Skraag was a bad book (it wasn't), but rather that the scores can serve to guide me when looking at similar products. In the end it all boils down to one thing for every single consumer: "Can I use this in my campaign?" If the answer is yes, then reviews of a product can certainly sway a person to buy it. But if the answer is no, then it doesn't matter if the review gave it a 1 or a 5, because it probably will not be purchased. Period.
 

One nice thing that has happened recently is that we now have reviewers with track records and websites with editorial policies regarding their reviews. I'll pick up a movie in a video store if it got a thumbs-up from Ebert because I've done so in the past and seldom been disappointed. Likewise, if I notice a reviewer assigns similar grades as I would then I'll give a product I otherwise might have passed over a try if the reviewer scores it well.

The main way to build this faith is for a reviewer to have a large number of reviews, a consistent rating system, and well-written, informative reviews.

A reviewer who gets a stamp of approval from a respected entity, such as becoming a staff or affiliate reviewer here or a print reviewer for Dragon magazine, also makes me more confident in their opinion. This process of editorial vetting of internet reviewers is new; many websites still seem to let just anyone post reviews, and that decreases the value of those reviews (at least to me). I still look at those reviews, but I don't treat them any differently than what someone might post to a message board unless they've got a strong track record *and* I know about that record.
 
