TripAdvisor is pulling back the curtain on its process for catching fake reviews.
The site’s inaugural “TripAdvisor Review Transparency Report” provides insight into its moderation processes and key data. In 2018, TripAdvisor says, users posted 66 million reviews and its fraud-detection technology rejected 2.1% of those, or about 1.4 million reviews, with roughly three-quarters of them blocked before they were posted.
The company says fraudulent reviews are generally one of three types: biased positive reviews, biased negative reviews or paid reviews.
TripAdvisor’s content moderation team rejected or removed an additional 1.7 million reviews for guideline violations such as an incorrect location.
In total, TripAdvisor’s combination of technology and human assessment rejected 4.7% of reviews in 2018 (3.1 million) either before or after they were posted.
TripAdvisor’s report comes out less than two weeks after the latest criticism of its reviews, this time from Which? Travel in the United Kingdom. The consumer group says it analyzed 250,000 reviews for the 10 highest-ranked hotels in each of 10 popular tourist destinations and found that 15 of the 100 hotels looked “blatantly suspicious.”
In the report’s foreword, TripAdvisor president and CEO Stephen Kaufer writes, “No one has a greater incentive than TripAdvisor to ensure the reliability of the content on our platform. It is the core of our business because if people don’t find our content reliable and helpful, they won’t keep coming back to our site.”
Kaufer goes on to say third-party criticism is often based on “inaccurate figures,” so the report is intended “to provide definitive insights into the details and data behind our extensive content moderation efforts. We do this in the spirit of transparency. We know we’re not perfect. But we’re constantly working to stay one step ahead of the people who try to abuse our platform — and we believe that no other review platform does more to protect the integrity of their content than TripAdvisor.”
TripAdvisor says its users or businesses flagged less than 1% of reviews for possibly violating the platform’s guidelines, and its content moderation team reviewed most of those within six hours of receiving the alert.
The company also imposed a “ranking penalty” on more than 34,000 businesses caught attempting to post fake reviews, and since 2015 it has stopped the activity of more than 75 websites it says were attempting to “sell” reviews.
“We’ve continued to make advancements to our industry-leading fraud detection efforts in recent years, but it’s a daily battle and we are far from complacent,” says Becky Foley, senior director of trust and safety at TripAdvisor.
“While we are winning the fight against fake reviews on TripAdvisor, we can only protect our corner of the internet. As long as other review platforms aren’t taking aggressive action, then fraudsters will continue to exploit and extort small businesses for cash. It is time other platforms like Google and Facebook stepped up to the plate to join us in tackling this problem head on.”
In the report, TripAdvisor also outlines steps it is taking to protect the integrity of reviews, including investing in training for human moderation teams, improving its fraud-detection technology, partnering with law enforcement authorities to tackle fake reviews and enhancing the transparency of its review moderation process.