While not as shocking as the human–machine transcendence depicted in the sci-fi drama written by Jack Paglen, the transcendence of SEO into UX is happening now. In the following blog post I will show how UX has become an increasingly large part of ranking in Google, and thus a vital part of SEO. It also shows the direction in which the world’s leading search engine is heading, and that direction is ranking based largely on user experience. At the end I will briefly cover the current user experience signals in Google’s ranking. I can’t and don’t want to give a checklist; instead I want SEO experts to embrace user-centred thinking.
SEO before UX
Until 2005, SEO started with finding the keywords that would get you a lot of visitors for as little effort as possible, then getting a lot of links with that keyword pointing to your site. This worked because ranking in Google was largely governed by PageRank, based on the research of Sergey Brin and Larry Page. You can imagine the PageRank algorithm as a cat in front of a touchscreen. Assume that the cat can identify links on a website (she has a cute browser plugin that puts a dancing mouse over every link), and she will randomly click on one of them. PageRank is basically the probability of this stochastically clicking cat arriving at your website. To make sure the cat does not get stuck on a dead-end page with no outgoing links, each time she visits a site there is a chance that she will get bored and jump to a new, random page. This will lead to two things: the cat might arrive back at the original page within a few hours, and you will get a totally scratched display.
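The randomly clicking cat can be sketched in a few lines of Python. This is a toy simulation of the random-surfer model, not Google’s actual implementation; the tiny link graph and the 0.85 damping factor are illustrative assumptions:

```python
import random
from collections import Counter

# Toy link graph: page -> list of pages it links to.
# "d" is a dead-end page with no outgoing links.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": [],
}

def random_surfer(links, steps=100_000, damping=0.85, seed=42):
    """Estimate PageRank by simulating the stochastically clicking cat."""
    rng = random.Random(seed)
    pages = list(links)
    page = rng.choice(pages)
    visits = Counter()
    for _ in range(steps):
        visits[page] += 1
        # With probability 1 - damping, or on a dead-end page,
        # the cat gets bored and jumps to a random page.
        if links[page] and rng.random() < damping:
            page = rng.choice(links[page])
        else:
            page = rng.choice(pages)
    # Normalise visit counts into probabilities.
    return {p: visits[p] / steps for p in pages}

ranks = random_surfer(links)
```

The fraction of time the cat spends on each page approximates its PageRank: the well-linked page "a" ends up with a much higher score than the dead end "d", which is only ever reached by a bored random jump.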
What did the SEO guys do to exploit this system? They created link farms and other very low-quality pages with the sole purpose of manipulating PageRank. Others simply bought links on pages with high(er) PageRank. Then the bright minds working at Google realized that users don’t want to find the sites with the most cash behind them (to buy links and pay for link farms) or with the smartest SEO guys. Users wanted the most relevant content for their search. Someone searching for “city council Southampton” might not want to arrive at a page selling cheap Viagra, even if said city had no money to compete with the “smart” ecommerce sites.
After the Jagger update in 2005 things got better: most of the time you could not accidentally stumble upon those kinds of ecommerce or adult entertainment sites, unless you wanted to. As the biggest pain point vanished, another one got the spotlight: if you searched for “cheap car insurance” you expected to get the best car insurance website, the most user-friendly one with very low prices. Back then you found the one with the title element “cheap car insurance” and the URL “cheap-car-insurance.com”, which still had cartloads of links pointing at it, and was still of quite low quality, just not below a certain imaginary standard.
While the Vince update in 2009 made sure that cheap-adidas-shoes.info would not outrank adidas.com for the keyword “Adidas”, it was still an uphill battle for sites focusing on the user rather than on pleasing Google.
In 2009 Google ran a user experience experiment on how users react when web search takes longer. As you might have guessed, people hate slow sites. So an undeniably UX-based ranking criterion was introduced in early 2010: site speed.
Real breakthroughs: Panda and Penguin updates
The real breakthrough came in early 2011, when the Panda update shook the search engine result pages. Panda obliterated sites with thin content, especially content farms, which were the new enemy of Google after link farms fell. Sites with high ad-to-content ratios or serious quality issues also got penalized. Not incidentally, those kinds of sites were terrible from a user experience perspective. A few months later, in November 2011, the Freshness update put the focus on fresh content (also a good thing from the user’s perspective), and the bashing train did not slow down. In January 2012 Google started to penalize sites with too many ads above the fold. After all, users visit sites for the content, not to get some advertising (unless they are internet marketing experts). A month later the search giant realized that people searching for “Chinese restaurant” might be looking for one within a few miles, not on a different continent, so the Venice update rolled out.
Penguin appears in the nightmares of SEO experts. No, not the arch-villain from Batman Returns portrayed by Danny DeVito, but the algorithm update that happened on the 24th of April, 2012. The main target of the update was keyword stuffing, a technique for writing “Google-friendly texts” by adding your main keywords to the text as many times as possible. Some SEO experts claimed that 4–6% of the words in the text should be your main keyword. This often led to texts not meant for human readers: repeating the same word once in every 20 words will make sure most readers do not go past the first paragraph. Needless to say, this is extremely bad for the users, as they might actually want to read the text. It even led to claims by SEO experts that users will not read the text anyway, so you can do whatever you wish with it. Keyword stuffing was only the tip of Penguin’s iceberg. Some called it the “over-optimization penalty”, because it seemingly penalized sites for having too much SEO. In reality it penalized sites for having poor user experience.
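The keyword-density rule of thumb mentioned above is trivial to compute, which is partly why it was so over-applied. A minimal sketch (the example sentence and the word-splitting regex are my own illustration):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = ("Cheap car insurance is the best cheap car insurance "
           "because our cheap car insurance beats other insurance")
print(round(keyword_density(stuffed, "insurance"), 2))  # → 0.24
```

Here 4 of the 17 words are the keyword, roughly 24%, far beyond even the claimed 4–6% “optimum”, and exactly the kind of text Penguin was built to catch.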
To make things even worse for SEO experts, in September 2012 Google introduced the Exact-Match Domain penalty. This means that if it is not a brand search and your domain matches the search query, you will be penalized if you have a low-quality site. It was a common practice to buy domains with the sole intent of tricking the search engines. Research has shown that, from the user’s perspective, a domain name like elistaria.com seems a lot more trustworthy (and is better from a marketing and PR perspective) than fasthtm5cms.com, even to someone searching for “fast HTML5 CMS”.
2013 and 2014
Compared to 2012, 2013 saw no SEO bombshells, but a lot of smaller updates arrived, and Google was looking into search queries like porn and payday loans, which are usually full of web spam and black hat SEO techniques (techniques that try to manipulate Google, usually at the user’s expense).
The major hit of 2014 (as of the 4th of July, 2014) was Panda 4.0 on May 20, 2014. Even industry giants like eBay were struck by it. According to a study by moz.com, eBay lost quite a few top-10 positions in the search engine result pages. For example, for the “fiber optic christmas tree” search eBay held positions 7 and 8 on the 19th, but no top-10 position on the 20th. Most of the time the lost URLs were category or sub-category pages, not individual auction listings. (All of the URLs of the form ebay.com/bhp disappeared.) This is a clear warning that even if you are an industry-leading giant, you are not safe, and Google will penalize you if you try to manipulate the search engine.
Do you see a pattern? Google tries to force good user experience onto the web, so the SEO expert of the future (or rather the present) needs to be well versed in the art of UX. In the following I will list a few UX signals that Google’s ranking currently uses. Please note that this is an ever-growing list, and if Google wants to keep its position as the leading search engine, this trend has to continue.
Current user experience signals in Google ranking
One of the most important ranking factors is the page title, which sits inside the <head> of the HTML, wrapped in a <title> element. This is used for ranking and for generating the snippet on the search engine result page… as long as it is relevant and sums up what the page is about without being overly verbose or keyword-heavy; in other words, as long as it was written for the users. But relevance is the key here, because the whole page should be relevant to the query visitors used to land on your page. Users should find solutions to problems they want to solve, read interesting things and have fun. If they are bored, or the whole page is uninteresting, badly designed and bad UX, they will leave, so Google will anticipate this and make sure people don’t arrive there from Google.
And it’s not only the home page or the landing page: each and every page should be aimed at the users and should be easy to use, easy to navigate and a generally great user experience. Nowadays cluttered pages with lots of visual noise are rare in the top spots of the SERP (search engine result page), and so are overly complicated navigation schemes. Information architecture is growing into a science of its own, and it helps SEO tremendously.
One of the reasons full Flash sites died out was that they were bad for SEO, and the same is true for every old and outdated technique that is generally not user-friendly. While there are surely easy-to-measure factors among the ranking criteria, like site speed, button sizes or readability, to name a few, no one can give you a full list of Google’s ranking criteria. And this is quite fortunate, because if there were such a list, most SEO experts would just go through it and tick everything off, without trying to understand user behaviour or trying to provide great UX.
Google has over 200 ranking factors, most of them well-kept secrets, but one thing is sure: user experience already plays a major role in ranking, and this trend will surely continue. My best advice is: keep your SEO practices user-centred to avoid future updates (most likely named after cute animals) that might severely hurt the ranking of your websites.