In the years since everyone became their own travel agent, finding the best deal has become almost a Vegas-like "beat the system" game.
Identify the precise time that an airline, hotel, or rental car company offers its lowest price, or best bundle of incentives, and you’re a winner!
But if you operate one of those sites, the problem is that automated bots are also hitting it, looking for that same magic point in time.
In fact, your human prospects aren’t even your biggest source of traffic.
Welcome to the bot world of web scrapers and price-scrapers
Web scraping bots are automated agents deployed across the web to gather specific information, or to deposit malware that assists in acquiring it.
Some bots, like Google search agents, are good bots. But the vast majority – over 60% in a recent Distil Networks study – are bad bots, looking to steal competitive information that could facilitate the theft of customers from legitimate businesses.
The travel industry is filled with bot-based business models
It started with metasearch engines like Kayak and Trivago, whose bots send simultaneous queries to other travel industry sites and aggregate the results.
The proliferation of these types of sites has increased market transparency and put downward pressure on margins.
You could posit that the next generation of bot-based business models is bent on destroying margins.
These aren't just metasearch engines scouring current offers. Companies like Skiplagged examine multiple flight segments and help customers pay less by flying only one segment of the trip and then abandoning the rest.
Known as "hidden city" ticketing, this type of activity is explicitly against the terms of service of the airlines and potentially against TSA rules as well. Orbitz and United Airlineshave filed a lawsuit against Skiplagged.
What’s next is anyone’s guess, but launching bots is incredibly cheap and easy. The widespread availability of cloud computing, virtualization, commodity hardware, and freeware tools means that almost anyone can get in on the game.
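How cheap and easy? A basic price-scraper fits in a dozen lines of Python. The target URL and CSS selector below are hypothetical placeholders; real scrapers simply wrap proxy rotation and scheduling around this same core.

```python
# Minimal price-scraper sketch. The URL and selector are hypothetical
# placeholders, not a real target.
import requests
from bs4 import BeautifulSoup

resp = requests.get(
    "https://example-travel-site.com/flights?from=SFO&to=JFK",  # hypothetical
    headers={"User-Agent": "Mozilla/5.0"},  # masquerade as a browser
    timeout=10,
)
soup = BeautifulSoup(resp.text, "html.parser")
for fare in soup.select(".fare-price"):  # hypothetical selector
    print(fare.get_text(strip=True))
```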
How bad bots can hurt travel sites
A good travel business is built around excellence in customer engagement; it’s a high-touch business.
When a bad bot hijacks your information – whether pricing, incentive packages, keyword placement, editorial content or any other unique advantage – it diminishes your ability to interact with your customer.
You’re not just losing a ticket or room sale. You’re losing the opportunity to engage with that customer throughout their travel experience, to upsell or cross-sell them higher-margin services, and ultimately, their loyalty.
Blocking bad bots – What’s the answer?
A number of technological approaches have been developed to do just that. Grouped under the umbrella label of “proactive bot detection and mitigation”, solutions available today offer some or all of the following protection options:
- Block all scraper bots completely
- Selectively block scraper bots based on business logic
- Monitor but don’t block scraper bots in order to build evidence for legal action
- Selectively protect specific data sets against scraping
- Allow bots free rein but feed them useless data
Selective approaches make a lot of sense, because they enable site operators to continue feeding legitimate search engines while blocking the bad bots that seek to hijack customers or otherwise damage the business.
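As a rough sketch of what selective, business-logic-driven blocking can look like in practice – the classifier heuristic, route name, and policy labels here are illustrative assumptions, not any vendor's actual API:

```python
# Illustrative request policy: allow known good bots, apply business
# logic to everything else. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Request:
    path: str
    headers: dict = field(default_factory=dict)

GOOD_BOT_AGENTS = ("Googlebot", "Bingbot")   # legitimate search engines

def classify_request(request):
    """Stand-in for a real detection layer; a naive heuristic here."""
    ua = request.headers.get("User-Agent", "")
    return "bot" if (not ua or "bot" in ua.lower()) else "human"

def bot_policy(request):
    ua = request.headers.get("User-Agent", "")
    if any(good in ua for good in GOOD_BOT_AGENTS):
        return "allow"              # keep feeding legitimate search engines
    if classify_request(request) == "bot":
        if request.path.startswith("/pricing"):
            return "serve_decoy"    # protect high-value data with fake results
        return "block"
    return "allow"                  # humans pass through untouched

print(bot_policy(Request("/pricing", {"User-Agent": "scrapy-bot/1.0"})))
# -> serve_decoy
```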
Legacy bot-blocking solutions were never purpose-built for the task
Technologies such as firewalls and web application firewalls (WAFs), which were once expected to offer protection, are not well positioned to defend against scraper bots.
Content Delivery Networks (CDNs) are focused on content delivery; they’re hardly incentivized to protect against bot activity, since bots drive up traffic, increasing the monthly bandwidth fees paid to the networks.
Detection and mitigation by IP address alone is no longer an option, since bots can hop from one IP address to another.
Also, IP-centric solutions present the major risk of blocking human customers who happen to be using the same stock of IP addresses as the bot.
For this reason, Web Application Firewalls cannot safely be used to block bots, as they will likely block legitimate customers as well.
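A toy demonstration of both failure modes – the rate limit, proxy pool, and traffic volumes are invented purely for illustration:

```python
# Toy illustration: a per-IP rate limit that a rotating bot evades
# while legitimate customers behind one NAT get blocked.
from collections import Counter

RATE_LIMIT = 100          # requests per IP before blocking (invented)
hits = Counter()

def allow(ip):
    hits[ip] += 1
    return hits[ip] <= RATE_LIMIT

# A bot cycling through a proxy pool stays under every per-IP limit...
proxy_pool = [f"10.0.0.{i}" for i in range(50)]   # hypothetical pool
bot_allowed = sum(allow(proxy_pool[i % 50]) for i in range(4000))

# ...while hundreds of real customers behind one corporate NAT do not.
nat_allowed = sum(allow("203.0.113.7") for _ in range(500))

print(bot_allowed)  # 4000 -> every bot request passes
print(nat_allowed)  # 100  -> 400 legitimate requests blocked
```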
How proactive bot detection and mitigation works
A far more effective approach is to develop a unique fingerprint for each bot and use that information to prevent illegitimate access.
It works even if the bot's controller attempts to disguise its identity or to attack through proxies.
Alongside this fingerprint development, an effective solution will also "learn" what typical human behavior looks like on your site, so that atypical behavior that might indicate a bot attack can be flagged and mitigated.
Couple this with a cloud-based repository of fingerprints and behavior patterns accessible to all users of that technology, and you have the potential for a real-time, intelligent bot detection and mitigation solution.
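In highly simplified form – the header set, time window, and threshold below are illustrative assumptions, not Distil's actual method – the core idea looks something like this:

```python
# Simplified fingerprint + behavioral-anomaly sketch (illustrative only).
import hashlib
import time
from collections import defaultdict

def fingerprint(headers):
    """Hash stable client attributes so identity survives IP hopping."""
    parts = [
        headers.get("User-Agent", ""),
        headers.get("Accept-Language", ""),
        headers.get("Accept-Encoding", ""),
        # real products add TLS, JavaScript, and device-level signals
    ]
    return hashlib.sha256("|".join(parts).encode()).hexdigest()

seen = defaultdict(list)              # fingerprint -> request timestamps
HUMAN_MAX_REQ_PER_MIN = 30            # invented behavioral baseline

def is_suspicious(headers):
    fp = fingerprint(headers)
    now = time.time()
    seen[fp] = [t for t in seen[fp] if now - t < 60] + [now]
    return len(seen[fp]) > HUMAN_MAX_REQ_PER_MIN
```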
Eight questions to help you determine if your site requires bot protection
If you’re in the online travel business today, the reality is that your business is under threat from scraper bots. If you need to convince management this is a real problem, you may find it helpful to build your request for a solution around the following questions:
- Can you measure your current good bot and bad bot traffic? (A rough log-audit sketch follows this list.)
- Can you identify the specific content that’s being scraped?
- Have you seeded content to try to identify who is scraping your site?
- Specifically, can you identify who is scraping your highest-margin data?
- Do you know what that entity might do with that data?
- How is your SEO performance trending in that high-margin segment?
- Which competitors are gaining on you or overtaking you?
- Are your promotions – and the SEO results for those promotions – being undercut by specific competitors?
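For the first question, a back-of-the-envelope answer can come straight from a web server access log. This sketch buckets requests by user agent; the patterns are crude heuristics for illustration, and a real audit would also verify "good" bots via reverse DNS, since bad bots routinely forge user agents.

```python
# Rough log audit: bucket traffic by user agent (crude heuristic only).
# Assumes a combined-format access log where the user agent is the
# last quoted field of each line.
import re
from collections import Counter

GOOD_BOTS = re.compile(r"Googlebot|Bingbot|DuckDuckBot", re.I)
BAD_HINTS = re.compile(r"python-requests|curl|scrapy|wget|httpclient", re.I)

counts = Counter()
with open("access.log") as f:
    for line in f:
        ua = line.rsplit('"', 2)[-2]   # user agent field
        if GOOD_BOTS.search(ua):
            counts["good_bot"] += 1
        elif BAD_HINTS.search(ua) or not ua.strip("-"):
            counts["suspected_bad_bot"] += 1
        else:
            counts["human_or_unknown"] += 1

print(counts)
```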
NB: This is an analysis by Rami Essaid, CEO and co-founder at Distil Networks. It appears here as part of Tnooz's sponsored content initiative.
NB2: Download the full report here.
CLARIFICATION: This article originally suggested TripScanner uses bots to monitor rates after you book. This was a misunderstanding. TripScanner says that instead of price-scraping it uses APIs from established, reputable vendors to access legitimate search channels, such as a GDS.