Archive for September 2009
For quite some time the travel space has been debating the issue of hotel reviews and the possibility of fraud, either for monetary gain or to tarnish a competitor's reputation. It is a hot debate with a lot at stake.
What do we know?
As I follow the debate closely, it got me thinking about the facts this debate is based on. By that I mean: how many reviews are fraudulent? Everyone seems very sure that fraud exists, but nobody can quantify it with any real accuracy. That's understandable, of course. So let's try a less precise measurement of the problem. How about a rough percentage? Is it 2% fraud or 25% fraud? Hmm, let me tell you that nobody knows that answer either. You see where I am going? There is no accurate, or even credible rough, estimate of the extent of the issue. Basically, all we know is that fraudulent reviews exist. That's it. Ironically, that's the exact same thing we know about Wikipedia articles: some entries are incorrect.
Debates feed on the unknown, the unproven, and the unmeasurable. I think this debate has the perfect environment to keep going and going … until we can measure things accurately.
Reviews Credibility Test (RCT)
Admittedly, some fraudulent reviews will slip through the cracks despite the best efforts of hotel review sites. That's what makes the problem hard to measure. But the real question is: what can we measure that would give us a feel for the overall credibility of the system? Something we can measure accurately.
So let's try this simple test on movie reviews. Let's say I don't know whether to trust reviews written about 'The Dark Knight' on some new site, but then I find that the rating and ranking the site gives the movie match my expectations, since I've seen the movie before. Let's also say this happens consistently across all the movies I've seen recently. Well, that sounds like a credible source I could use for recommendations before watching my next movie. Now, if I tell you that the rankings are based on user reviews only, that is an indication that the reviews themselves are also credible. Notice that I did not say there is no fraud, because that assertion is hard to make beyond reasonable doubt. This same simple thought process applied to hotel reviews will yield the same result.
This simple test tells us at a glance whether a source is credible or not. If the results are way off, then we should dig deeper. Put another way: before we debate fake reviews, let's first find the cases where a mediocre hotel made it to the top of the ranking list in a big city; or a bad movie became a blockbuster; or a terrible song topped the Billboard charts. This test is measurable and easy.
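The test described above can be sketched in a few lines of code. This is a minimal illustration, not a real implementation: the movie titles and ratings are hypothetical, and Spearman rank correlation is just one reasonable way to quantify "the site's rankings match my expectations."

```python
# Hypothetical data: my own scores for movies I've seen,
# versus the aggregate user-review scores on some review site.
my_ratings   = {"Movie A": 9, "Movie B": 4, "Movie C": 7, "Movie D": 2}
site_ratings = {"Movie A": 8.8, "Movie B": 5.1, "Movie C": 7.4, "Movie D": 3.0}

def rank(scores):
    """Map each title to its rank (1 = highest score)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {title: i + 1 for i, title in enumerate(ordered)}

def spearman(a, b):
    """Spearman rank correlation between two rating dicts (assumes no ties).

    1.0 means the two sources rank every item identically;
    values near 0 or below suggest the site's rankings are not credible.
    """
    ra, rb = rank(a), rank(b)
    n = len(a)
    d2 = sum((ra[t] - rb[t]) ** 2 for t in a)
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

print(spearman(my_ratings, site_ratings))  # 1.0: the rankings agree perfectly
```

If the correlation is consistently high across everything you've seen yourself, the site passes the test; a low score is the "mediocre hotel at the top of the list" signal that warrants digging deeper.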
To me, this is really the catalyst of this debate, and a question I always ask in forums and on blogs. To no one's surprise, nobody has been able to show a single example of a bad hotel at the top of the rankings. Yet the debate continues…
Challenge UGC, if you can
In a sign of the times, Microsoft discontinued its Encarta product, citing the popularity of Wikipedia. That's a wise decision; I don't recall using Encarta in years. But what's interesting here is that Microsoft did not try to poke holes in Wikipedia and its open model. That would be a losing battle.
The lessons from Encarta have not yet been absorbed in the hotel reviews space. The marketing machines of companies with professionally written reviews are firing on all cylinders, taking shots at user-generated-content review sites. That's not a surprise. In fact, it is the only way to fight the growing popularity of UGC. The question, though, is: does it work? Well, ask Microsoft!
If you've read this far, you probably have something to say about all this, and I am interested to know what you think. Do you think you can rely on the Reviews Credibility Test (RCT) to get a feel for how trustworthy a review site is?