Google has ramped up its war on web spam in recent months, with some big brand websites, including those of travel companies, reportedly being penalised because they are believed to have tried to manipulate their search engine rankings.
Top travel site Expedia saw its visibility in Google searches drop by around 25% when it was thought to have been punished by the search giant at the start of this year. Not only would such a penalty have caused a drop in traffic and revenue, but it also affected Expedia’s stock price.
And every month brings fresh news of more brand sites being negatively impacted. Search is a key channel for the travel sector and senior business chiefs in the industry must be concerned about why this is happening and what can be done to protect their businesses.
Google has made no secret of its fight against sites that attempt to game their way to the top of search results using techniques that infringe its webmaster guidelines. These techniques are considered bad for search because they can bury relevant websites in the results and make sites from legitimate owners harder to find.

The search engine’s algorithms detect many spam techniques and automatically demote the sites that use them. Google also employs teams that manually review sites for spam activity.
One of the ways in which Google decides rankings is the number and quality of links to a site coming in from other sites and blogs. These backlinks are seen as a ‘vote of confidence’ from other sites in your content. The more links you have from high quality sites, such as media outlets or prominent sites within your niche, the higher you are likely to rank.
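Google’s actual ranking system is proprietary, but the published PageRank idea it grew out of captures this ‘vote of confidence’ principle well. The Python sketch below is a toy illustration only: the site names and link graph are invented, and a real search engine weighs hundreds of additional signals.

```python
# Toy PageRank-style sketch: links act as weighted "votes of confidence".
# Illustrative only -- not Google's actual ranking algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base score, plus shares passed by linkers.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Invented example graph: a link from a well-linked page passes on more
# value than links from obscure pages.
graph = {
    "news-site": ["travel-brand"],
    "spam-blog-1": ["travel-brand"],
    "spam-blog-2": ["spam-blog-1"],
    "travel-brand": ["news-site"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Note how a link from a well-linked page passes on more value than one from an obscure page, which is why the quality of linking sites matters as much as their quantity.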
But for years Search Engine Optimization (SEO) professionals have found methods of inflating the number of backlinks that are outlawed by Google.
Analysis of Expedia seems to indicate that it may have been penalised for a variety of alleged ‘unnatural’ links to its domain and other important URLs. Searchmetrics’ analysis suggests many of the links in question are probably a legacy of an older link building strategy, yet they are affecting the business now.
But why does it seem as though many more sites are being affected at this time?
The simple answer is that Google is becoming smarter at identifying spam and widening the definition of what is now considered against the rules. Many SEO 'tricks' that helped sites achieve high search rankings in the past are now identifiable by Google due to continuous improvements in its algorithms’ sensitivity.
Walking a fine line
In the past it was very common in the SEO industry to incorporate many keyword links into the footer of company sites, for example. But now such measures may well be counter-productive, since Google is getting better at recognizing this kind of ’link optimization’.
Even the excessive use of guest blogging to increase the number of inbound links to your web pages, and so improve your search rankings, seems to be something Google is clamping down on.
If you have been using these methods (even unknowingly) and have done nothing to remedy the situation, you could be walking a dangerous line.
Often, as appears to be the case with Expedia, the problems could relate to outdated SEO practices that were implemented years ago and forgotten about. Or they could be caused by an agency or department that is still using these practices, unbeknownst to senior management.
In rare cases it could even be that a competitor has targeted the company’s website by deliberately pointing poor-quality links at it in a ‘negative SEO’ campaign designed to draw Google’s attention and trigger a penalty.
Of course, a single misstep does not automatically lead to punishment by Google. But the interplay of several infringements increases the likelihood of a penalty, which could hit overnight.
Senior chiefs in travel companies need to be asking their marketing departments to explain their SEO strategy. They should also consider bringing in outside experts to independently test whether a site is at risk, or commission regular internal reviews using appropriate site auditing tools.
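Such a review usually starts with the backlink profile. As a rough sketch of what an internal check might look like, the script below flags exact-match commercial anchor text that is repeated suspiciously often in a backlink export. The CSV column names, keyword list and 5% threshold are illustrative assumptions, not any standard tool’s format.

```python
# Minimal backlink-audit sketch. Assumes a CSV export of backlinks with
# "source_url" and "anchor_text" columns (an invented format for
# illustration).
import csv
from collections import Counter

# Example "money" keywords an old link building campaign might have used.
MONEY_TERMS = {"cheap flights", "best hotel deals", "book holidays"}

def audit(path):
    anchors = Counter()
    rows = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchors[row["anchor_text"].strip().lower()] += 1
            rows.append(row)
    total = sum(anchors.values())
    for row in rows:
        anchor = row["anchor_text"].strip().lower()
        # Exact-match commercial anchors repeated across many links are a
        # classic sign of old-style manipulative link building.
        if anchor in MONEY_TERMS and anchors[anchor] / total > 0.05:
            print(f"REVIEW: {row['source_url']} -> '{anchor}' "
                  f"({anchors[anchor]} of {total} links)")

audit("backlinks_export.csv")  # assumed export file
```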
The next step would be to develop a plan to gradually replace any sub-optimal or ‘illegal’ elements on the site. It is important that companies that think they may have old links Google might view as infringements do not simply remove all the links at once.
The best way to reduce risk is to remove the bad links slowly and substitute them with good new links in order to try and maintain, or even improve, the domain’s search rankings.
Algorithm updates
It is helpful to understand the series of algorithm updates that Google is continuously rolling out in order to help weed out spam. The Panda updates are all about reducing the unhelpful and irrelevant content that appears in searches, while Penguin scrutinizes suspicious link-building techniques.
More recently, the Hummingbird algorithm, which went live in mid-2013, was a dramatic step change: Google is getting much better at understanding the search intent of the user and taking account of all the words in a search query in order to present the most relevant results.
Increasing relevance is clearly the direction in which Google is moving and that is where companies need to pay attention in the long run. It is important to put resources behind delivering helpful and valuable online content and constantly keeping it updated and fresh. To do this you need to consider and then answer the questions that the searcher is likely to ask.
The content needs to address the requirements of site visitors at the different stages of the buying cycle. Are customers looking for general information about holidays or specific destinations (informational searches)? Are they looking for flight prices and availability (directed informational searches)? Or, are they making transactional searches which indicate an intention to buy?
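To make those stages concrete, here is a deliberately crude sketch of how search queries might be bucketed by intent. The trigger-word lists are invented examples; real intent detection, as Hummingbird’s query understanding shows, draws on far richer signals.

```python
# Rough sketch of bucketing search queries by intent, mirroring the three
# stages above. Trigger words are invented examples only.

TRANSACTIONAL = ("book", "buy", "deal", "discount")
DIRECTED = ("price", "prices", "availability", "from", "to")

def classify(query):
    words = query.lower().split()
    if any(w in words for w in TRANSACTIONAL):
        return "transactional"           # intention to buy
    if any(w in words for w in DIRECTED):
        return "directed informational"  # e.g. flight prices and availability
    return "informational"               # general research

for q in ("family holidays in spain",
          "flight prices london to rome",
          "book hotel barcelona"):
    print(q, "->", classify(q))
```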
As well as having the right information, a site needs to be structured technically so that search engine crawlers can find and recognize the content. And of course the structure and usability needs to be designed so that human visitors can immediately be taken to the relevant pages and content for their stage in the buying cycle.
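One basic, well-established way to help crawlers find your content is an XML sitemap following the sitemaps.org protocol. The sketch below generates a minimal one; the URLs are invented examples, and a large site would typically generate this from its content database.

```python
# Sketch of generating a minimal XML sitemap so crawlers can discover a
# site's pages. Follows the standard sitemaps.org protocol; the URLs are
# invented examples.
from xml.sax.saxutils import escape

PAGES = [
    "https://www.example-travel.com/",
    "https://www.example-travel.com/destinations/spain",
    "https://www.example-travel.com/flights/london-rome",
]

def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_sitemap(PAGES))
```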
Search strategies should also account for the fact that Google is increasingly presenting different results on computers, mobiles and tablets to match the differing intent of searchers on specific devices.
If you move in the direction of producing relevant, valuable and helpful content, and making it easy to find on multiple devices, then you are moving in tune with Google and building a successful long-term search strategy.
NB: This is a viewpoint by Tom Schuster, chief executive of Searchmetrics.
NB2: Risk ahead image via Shutterstock.