The job of an SEO Analyst has evolved a great deal over the years, and with it the role of keyword research. What was once simply keyword stuffing is now far more complex. The Hummingbird and RankBrain algorithms have taken Google's understanding of text to a higher level. With this in mind, we ask the question: "Is keyword research dead?"
We receive a lot of enquiries about keyword research, and it has evolved well past how most people think about it. Many people want to rank for as many terms as possible, and believe keyword stuffing is the way to do it. However, a recent Ahrefs study shows that a page ranking in 1st position will often also rank for up to 1,000 other related terms. Surely their text isn't stuffed with 1,000 keywords.
If we look at keyword research the same way we used to, then I am confident in saying it is surely dead. However, if we evolve and redefine our understanding of keywords – it’s still very much alive and kicking.
Latent Semantic Indexing
We see the term 'LSI', or Latent Semantic Indexing, thrown around a lot, but not everybody grasps what it is. To most it's simply using synonyms, but it's much more than that. I would argue that understanding LSI is vital, because it's one of the main reasons why keyword research might be considered dead. So what actually is it?
So what does this look like in practice? The example I often give to clients is to imagine 'Apple'. I could be referring to an apple that you eat; an Apple Macintosh computer; the Apple record label; or even an Adam's apple. There aren't many synonyms for the word 'apple', yet in everyday conversation we understand which apple somebody is referring to.
This is because our understanding of the meaning of a word is not based on the word alone; it includes the context of the sentence and the topic. If I were talking about music and then mentioned Justin, you would expect me to mean Justin Timberlake, not Justin Trudeau, the Canadian Prime Minister.
A great tool for finding these latent semantic keywords is LSI Graph, which makes keyword research a lot easier.
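To make the context-implies-meaning idea concrete, here is a toy Python sketch of disambiguating 'apple' by the words around it. The senses and word lists are made up purely for illustration; this is not how Google or LSI Graph actually work, just the intuition behind them:

```python
# Toy word-sense disambiguation: guess which 'apple' is meant by counting
# which hand-picked context words appear in the sentence. The sense labels
# and word lists below are illustrative assumptions, not a real LSI system.
sense_contexts = {
    "fruit":   "eat juicy ripe orchard pie crunchy sweet fruit tree".split(),
    "company": "mac iphone computer software keynote store app cupertino".split(),
}

def guess_sense(sentence):
    words = sentence.lower().split()
    # Score each sense by how many of its context words occur in the sentence.
    scores = {sense: sum(w in ctx for w in words)
              for sense, ctx in sense_contexts.items()}
    return max(scores, key=scores.get)

print(guess_sense("I want to eat a sweet apple pie"))  # fruit
print(guess_sense("the new apple iphone keynote"))     # company
```

The word 'apple' itself never breaks the tie; only the surrounding context does, which is exactly the point of the Justin Timberlake example above.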
Hummingbird Algorithm Update
The Hummingbird algorithm allows Google to read text much as you and I would. This is a scary thought for people relying on Fiverr gigs for all of their written content and wondering why they aren't ranking on the first page.
Whilst exact match and partial match keywords are a strong ranking factor, alone they don't guarantee a spot in the top 10. You need to look more closely at user intent. This is where Google Trends is such a useful tool: you can narrow down your audience by country, look back over the past 12 years, and explore related topics and related queries.
The related topics and related queries are actual terms that people are searching for, and they help you understand user intent much better. A fantastic example is the term 'Guinness'. It has a dual meaning, and using Google Trends I could tell you what the SERPs would look like before searching it. Look at the images below, and have a guess at what the SERPs will look like before reading further:
If we analyse the related topics and the related queries, there's a clear divide. Some users are looking for the Guinness World Records and others for the alcoholic beverage, so we would expect Google to create a results page split between the two. If we had to hazard a guess, we could say there would be images of the drink and news about records broken. We might expect links to the official sites and social media pages too. Let's take a look below.
So it's clear that, at some level, Google Trends data correlates with the SERPs. This is to be expected, as Google talks tirelessly about user experience. It also makes Google Trends one of the most powerful tools for content writing.
The last piece of the puzzle is RankBrain.
RankBrain Algorithm Update
The RankBrain machine-learning algorithm is not an artificial intelligence. It does not analyse the SERPs and then make huge decisions on its own. Machine learning typically works by processing data based on the analysis of a sample: manual reviews are collected by the thousands and then processed for trends that are consistent amongst them. This returns a percentage match, and if a strong trend emerges across the industry's bad websites, Google can push an update.
I worked with audio signals before I became an SEO Analyst, and I think the SEO industry's understanding of signals is oversimplified. Terms such as noise, bandwidth, signal-to-noise ratio (SNR), headroom, dynamics and dithering are useful in understanding Google's algorithm. It's something we use at the agency, because it gives a more holistic picture of signals.
Signals are not simply '1' and '0', on or off. They have noise involved, and when RankBrain processes data, it's definitely looking for a strong SNR. This is why you should be looking to add noise to your link profile and money sites. In audio we use a technique called dithering to make quantization errors sound smoother: we add low-level noise to audio files, which actually reduces audible distortion. It seems counter-intuitive, but it works.
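The dithering effect itself is easy to demonstrate. The sketch below is a simplified audio illustration (nothing Google-specific): it quantizes a sine wave with and without dither. The plain quantization error repeats in lockstep with the signal, the audible, patterned distortion, while the dithered error loses that pattern and behaves like benign random noise:

```python
import math
import random

def quantize(x, step):
    """Round x to the nearest quantizer step (a coarse 'bit depth')."""
    return step * round(x / step)

random.seed(0)
step = 0.25
period = 200  # the test tone repeats every 200 samples
signal = [math.sin(2 * math.pi * n / period) for n in range(1000)]

# Plain quantization: the error is a deterministic function of the signal,
# so it repeats with the signal -- signal-correlated, audible distortion.
plain_err = [quantize(s, step) - s for s in signal]

# Dithered quantization: add low-level random noise *before* quantizing,
# which breaks that repeating pattern up into noise-like error.
dith_err = [quantize(s + random.uniform(-step / 2, step / 2), step) - s
            for s in signal]

def corr(a, b):
    """Pearson correlation between two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Correlation of the error with itself one signal period later:
# close to 1.0 for plain quantization (perfectly repeating pattern),
# near 0 once dither is added (pattern destroyed).
print(corr(plain_err[:-period], plain_err[period:]))
print(corr(dith_err[:-period], dith_err[period:]))
```

Adding noise makes the error less predictable, which is exactly the property the next paragraph argues a link profile should have.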
Likewise, within your link profile you want to add noise. Stop making Google's life easy by building every PBN with two pages, 1,000 words of content, links on the same page, always dofollow, and images labelled 1.jpg, 2.jpg and 3.jpg. Stop using Fiverr for every PBN, and use decent content on a number of them. Stop using expired domains for everything, and start mixing it up month-on-month with as few predictable patterns as possible.
For as long as link building is a strong part of the algorithm, keep on doing this. Because making the link profile dynamic will reduce your chances of being hit by a penalty, and mitigate the damage if your network is found.
What Keyword Research Will Look Like
My hope is that in the future keyword research will look very different. I want to see SEO become more competitive and difficult; that is what makes SEO fun, not the money but the challenge. As Google becomes more agile, it forces us to become more cunning, and the game of cat and mouse continues.
I think that keyword research in the future will look much more like market research and topical research. I expect it to become more professional, somewhat reminiscent of academic study. Google already have a very complex understanding of words, but their weighting algorithm is quite easily gamed. As soon as Google create a more complex way of weighting words in text, there will be an overhaul in how content is written.
I predict that page titles will become less relevant, as will headings and use of the <strong> element. They are easy to abuse and don't give users the best experience; quality text, images and media do. So well-researched, well-written content will likely be a staple of tomorrow's keyword research.
The important thing to remember is that Google uses TF-IDF-style term weighting. You will likely always be compared with other articles in your niche, so competitor research is vital when writing text: it suggests how far you can optimise a page without it becoming over-optimised.
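As a rough illustration of why TF-IDF makes competitor comparison matter, here is a minimal sketch using a made-up three-page 'Guinness' corpus (an assumption for illustration only, not Google's actual system). A term that appears on every competing page carries no discriminating weight, while a term that appears on fewer pages scores higher:

```python
import math

# Hypothetical mini-corpus: three competitor pages, tokenised into words.
# The text is invented purely to demonstrate the weighting.
docs = [
    "guinness world records longest fingernails records".split(),
    "guinness stout is a dry irish beer".split(),
    "guinness beer pairs well with oysters".split(),
]

def tfidf(term, doc, corpus):
    tf = doc.count(term) / len(doc)                  # term frequency on this page
    df = sum(term in d for d in corpus)              # how many pages contain it
    idf = math.log(len(corpus) / df) if df else 0.0  # rarer across pages => higher idf
    return tf * idf

# "guinness" appears on every page, so its idf (and thus its score) is zero;
# "beer" appears on only two of three pages, so it scores higher.
print(tfidf("guinness", docs[1], docs))  # 0.0
print(tfidf("beer", docs[1], docs))
```

This is why stuffing the head term everywhere adds nothing: its weight is relative to how the whole niche uses it, which is exactly what competitor research reveals.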