Keeping crowdsourcing honest: can we trust the reviews?
- Published
The crowdsourced review - where the opinions of thousands are collected to help customers decide what to purchase or whom to hire - has proved to be one of the most disruptive business forces of the modern age.
TripAdvisor is perhaps the industry's best-known name.
The hotel and restaurant review site enjoyed a 55% rise in traffic last year, attracting 260 million unique monthly visitors. It is expanding its remit too, having recently acquired seatguru.com, where air passengers get help choosing the best seats on specific flights.
Local business review site Yelp has just formed a partnership with Yahoo! search, and the UK's assess-and-hire-a-builder service RatedPeople is preparing for a public flotation on the London Stock Exchange.
These companies can make or break the businesses listed on their pages.
But that can be a problem when some of the opinions they promote cannot be trusted.
Bad experiences
The big review sites say they operate robust editorial controls to weed out the fakes.
But some of the people who have suffered bad experiences have their doubts.
Feolla Chastanet is a sales manager at her family's hotel in St Lucia. At a recent staff meeting she showed her colleagues a TripAdvisor review of their business.
It was bad.
"What's the story with this guest?" she asked.
They told her that the man who had posted the review had been at the front desk barely an hour earlier. He was upstairs, not far from where they sat. He had demanded a room upgrade and been refused, she tells the BBC. He had said nothing, left the front desk, gone upstairs, gone online - and vented his anger.
Ms Chastanet says that much of the review was unfair and a direct reaction to his request being denied rather than a fair reflection of her establishment.
She also has doubts about some of the other comments posted on the site.
"How does a guest know how much it costs to renovate a hotel?" Ms Chastanet asks.
"It's clearly fake when they write about the kind of details they couldn't possibly know."
Checks and balances
"If I tell you, I'd have to kill you," jokes Julio Bruno, TripAdvisor's global vice-president of sales.
He has just been asked about the system put in place to catch bogus reviews.
The company says it uses a confidential algorithm - software that carries out step-by-step checks to root out odd posts from the mountain of user reviews.
He likens it to the secret formula of Coca-Cola and Colonel Sanders' KFC restaurants - but stresses that it is supported by hundreds of workers who sift through the flagged material.
"We call them the content integrity team," Mr Bruno says.
"But we always give the last word to the establishments."
Botched job
Unfairly negative reviews can depress sales. But undeservedly positive reviews are also a problem and can lead to buyer's remorse.
Michael and Diana, a couple in their sixties living in south-west London, wanted a workman to fix their leaking roof.
They turned to RatedPeople, found a builder with good reviews and paid him £2,000 to do the job. The roof started leaking again a week later. He refused to return to do it properly. A subsequent survey found that he had actually made the problem worse.
The couple then discovered the person they had picked had subcontracted the work to someone else without their knowledge, meaning the reviews they had read were not for the person who had come to their home.
They questioned RatedPeople in a series of email exchanges. The company closed the builder's account, but the couple ended up paying £9,000 to rectify the damage caused.
While the damage was regrettable, the company says farming work out was not against its rules.
"Subcontracting is a big part of our business - many tradesmen subcontract work," RatedPeople chief executive Chris Havemann tells the BBC.
"It is in their interests to only subcontract to tradesmen that they know and trust, as the rating left by the homeowner will be attributed to the tradesman/business who employed the subcontractor."
But he acknowledges that tradesmen do sometimes try to game the system.
"One way is to pretend to be a homeowner, effectively buy access to that work, and try and rate yourself," he says.
But he insists his company has taken steps to stop such problems before they occur - for instance, a quick rating would raise alarms.
"If you joined our platform at 4.30pm, how can you get a rating on a bathroom project at 7pm? We're also checking IP [internet protocol] addresses and going down to the digital fingerprint of the device."
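The checks Mr Havemann describes amount to simple plausibility rules: a rating posted implausibly soon after an account is created, or from the same address as the tradesman being rated, gets flagged for scrutiny. A minimal sketch of that idea in Python - the threshold, field names and function are hypothetical, not RatedPeople's actual system:

```python
from datetime import datetime, timedelta

# Hypothetical threshold: a real job cannot plausibly be finished
# within hours of the customer joining the platform.
MIN_JOB_DURATION = timedelta(hours=24)

def suspicious_rating(account_created: datetime,
                      rating_posted: datetime,
                      rater_ip: str,
                      tradesman_ip: str) -> bool:
    """Flag a rating for manual review using the two signals
    described in the interview: an implausibly quick rating,
    or a rating submitted from the tradesman's own address."""
    too_fast = rating_posted - account_created < MIN_JOB_DURATION
    same_ip = rater_ip == tradesman_ip
    return too_fast or same_ip

# The 4.30pm-join, 7pm-rating example from the interview:
joined = datetime(2014, 2, 6, 16, 30)
rated = datetime(2014, 2, 6, 19, 0)
print(suspicious_rating(joined, rated, "203.0.113.7", "198.51.100.2"))  # True
```

Real systems layer many more signals on top - device fingerprints, review-text analysis, reviewer history - but the shape is the same: cheap rules filter the mountain of reviews down to a pile a human team can check.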
Yelp also says it uses automated software, and supplements that with sting operations.
"Once [someone is] caught red-handed, we clearly make this information available to consumers looking to patronise these businesses with a banner on their business page, outing their shady business practices," Yelp's senior PR manager Elliot Adams tells the BBC.
"[We recommend] that they may want to take their business elsewhere."
Growing challenge
But do these checks mean customers can be confident about making what might be very expensive purchases?
Giorgos Zervas, an assistant professor of marketing at Boston University, says his research indicates that the number of suspicious reviews has been rising.
"Reviews are becoming a more integral part of consumer decision-making," Mr Zervas says.
"I believe algorithms can only go so far in detecting fake reviews.
"In the end, crafting a legitimate-looking fake review is not that hard, and those who commit review fraud are becoming increasingly sophisticated."
It is not exactly what Ms Chastanet or Michael and Diana want to hear.