When it comes to web bots and scrapers, travel sites are some of the most vulnerable websites on the net. How can you protect your travel site?
NB: This is an analysis by Courtney Cleaves, director of marketing at Distil Networks.
Though web bots and scraping agents can attack virtually any site on the Internet, some types of sites are more vulnerable than others.
Travel sites are among the most vulnerable. In fact, according to a recent study we conducted, more than 30% of a travel website’s traffic comes directly from malicious bots and scrapers.
In that one-week study, we counted more than 149,000 unique bots and tens of millions of scraping attempts – and that was just on a handful of sites.
The truth of the matter is, if you own, operate or work for a travel site, bots and scrapers are probably cutting into your paycheck in a big way.
Fortunately, there are ways to identify the problem, correct it and prevent it from recurring – often with a meaningful impact on your company’s revenue.
The problem
When we talk about travel websites, we’re referring to websites for airlines, hotels, motels, online travel agencies and travel content sites, like ones that share reviews and ratings.
We are not talking about aggregator sites like Kayak, Skyscanner et al. Many of those companies have agreements in place that allow them to scrape legally, and the "scrapee" knows it is going on.
Unfortunately, due to the growing popularity of travel sites and aggregator sites, the problem of bots and scrapers only continues to expand.
More and more startups are launched every year built around simple scraping schemes that steal data, lure away customers and divert sales from your site.
To put it into perspective, Priceline purchased Kayak for a whopping $1.8 billion. That’s how successful price scraping has become; a legitimate travel site – one with legitimate relationships with airlines and hotels – has found a price-scraping scheme to be worth billions of dollars.
Expedia took the same route, taking a massive stake in the European version of Kayak – Trivago – in 2013. This just goes to show that scraping is only going to continue to grow – but it needs to be properly controlled.
What’s at stake?
Before we go into solutions for price scraping, it’s important to know what’s at stake first.
Though bots and scrapers do steal your proprietary data and use it for their own purposes, it’s the big-picture, long-term effects that will affect your company most significantly.
Here’s what you should take into account when considering the effect of bots and scrapers at your company:
1) Removes interaction with customers
When a customer purchases your travel deals elsewhere, you miss out on the entire buying experience.
You can’t create customized recommendations for them, and you can’t capitalize on the opportunity to up-sell and cross-sell.
In the end, that means lost revenue. You also can’t generate a customer profile so you can target that customer’s needs/demographic in the future.
2) Add-ons and up-sells look like they come from you (but they don’t)
On that same note, the scraping site can sell add-ons to your packages, and make them look like they’re from you.
Depending on how these are priced, worded and presented, they could affect how the customer views your brand.
3) Eats up resources
When you consider the possibility that 30% of your traffic is coming from bots, think about how much bandwidth and server space they are using up.
And what about the costs for that bandwidth and server space? Just think: without bots, your costs could be 30% lower.
4) Impacts customer loyalty
When customers buy your deals from another site, they don’t build a relationship with your brand.
That means they won’t think of you for future travel needs, and they won’t feel any loyalty or familiarity to buy from you again.
5) Increases your costs
The damage done by bots and scrapers has to be accounted for somehow. Unfortunately, that’s usually in marketing, advertising or customer acquisition costs.
Since you’re losing out on sales from the scrapers, you’ll need to pour more money into these other areas of your business in order to get the sales and revenues you need.
In the end, you spend more and make less.
A solution
There’s no way around it: bots and scrapers are a serious threat to a travel site’s revenues, sales and overall brand image.
It’s not an issue you can safely ignore. If you want to take charge of your company’s revenue streams and make your brand as successful and profitable as it can be, then finding a way to stop and prevent bots is crucial.
In the past, many travel sites have tried – and failed – to block bots and scrapers.
They’ve tried to protect their APIs, or they’ve restructured their data in an effort to prevent malicious attacks.
While these efforts may work for a short time, neither is a long-term, permanent solution.
Ultimately, the only truly effective way to protect a travel site from bots is to take a proactive approach – to prevent them from accessing the site in the first place.
As part of an overall strategy, your company should think about doing one or more of the following (a rough code sketch of a couple of these options follows the list):
- Block any and all bots and scrapers entirely
- Block specific scrapers, while whitelisting others (if you have relationships with certain aggregation sites, like Expedia or Orbitz)
- Allow the entry of scrapers, but capture information on them so you can take legal action (time-consuming and not cost-efficient)
- Protect only specific data sets from scrapers and bots
- Protect your site from bots during sluggish and slow loading times
- Simply monitor scraping activity and decide what to do next
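To make a couple of those options concrete – blocking specific scrapers while whitelisting partners, and monitoring suspicious traffic – here is a minimal, hypothetical sketch of request classification in Python. The partner and scraper user-agent fragments, the rate-limit threshold and the classify_request helper are all illustrative assumptions, not a description of any particular vendor’s product; a production bot-mitigation layer would rely on far richer signals than user-agent strings and request rates.

```python
# Minimal sketch: classify each incoming request as allow / block / monitor.
# All lists and thresholds below are illustrative assumptions.
import time
from collections import defaultdict

# Hypothetical partners you have scraping agreements with, and known
# scraping-tool user-agent fragments. Real lists would be maintained
# from observed traffic, not hard-coded.
ALLOWED_AGENT_FRAGMENTS = ["expedia-partner-bot", "orbitz-feed"]
BLOCKED_AGENT_FRAGMENTS = ["python-requests", "scrapy", "curl"]

REQUESTS_PER_MINUTE_LIMIT = 120          # assumed ceiling for human-like browsing
_recent_requests = defaultdict(list)     # client IP -> recent request timestamps


def classify_request(user_agent: str, client_ip: str) -> str:
    """Return 'allow', 'block' or 'monitor' for a single incoming request."""
    agent = (user_agent or "").lower()

    # 1) Whitelisted partner scrapers pass straight through.
    if any(fragment in agent for fragment in ALLOWED_AGENT_FRAGMENTS):
        return "allow"

    # 2) Obvious scraping tools are blocked outright.
    if any(fragment in agent for fragment in BLOCKED_AGENT_FRAGMENTS):
        return "block"

    # 3) Everything else is rate-limited: an IP making more requests per
    #    minute than a human plausibly would gets flagged for monitoring.
    now = time.time()
    window = [t for t in _recent_requests[client_ip] if now - t < 60]
    window.append(now)
    _recent_requests[client_ip] = window
    if len(window) > REQUESTS_PER_MINUTE_LIMIT:
        return "monitor"

    return "allow"


if __name__ == "__main__":
    print(classify_request("Scrapy/2.11 (+https://scrapy.org)", "203.0.113.7"))          # -> block
    print(classify_request("Mozilla/5.0 (Windows NT 10.0; Win64; x64)", "203.0.113.8"))  # -> allow
```

Even a toy filter like this illustrates the trade-off in the list above: whitelisting known partners keeps legitimate aggregator agreements intact, while rate limiting catches the high-volume scraping that user-agent checks alone will miss.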