In this guide I am going to cover hundreds of Google ranking factors and how they apply to your website. I will not only list out the signals, but also include resources to help you diagnose your website. There is a ton of juicy content below.
If you would like to skip ahead to the Google ranking factors that apply to your website, use the contents below to jump to the relevant section.
- Domain Factors
- Page Level Factors
- Site Level Factors
- Backlink Factors
- User Interaction
- Special Algorithm Factors
- Social Signals
- Brand Signals
- On-Site Webspam Factors
- Off-Site Webspam Factors
1. Domain Age & Registration – as of June 2017, domain age is still a ranking factor. Whilst Google typically does not reward older domains, it does purposefully suppress new sites. This period of not being able to rank effectively is called the “Google Sandbox”. To escape the sandbox you can use patience, high quality links and social signals. The workaround, if you want a good quality name without the wait, is to buy an aged domain instead. This is one of Google’s arbitrary filters to stop SEOs from ranking quickly, but it hurts small businesses in the process.
Another ranking factor that ties in with age is the registration date. If you register the domain 3–4 years in advance, this indicates to Google that you are around for the long haul – you’re not looking to make a quick buck. This is more of a trust signal than anything else and could help you out of the sandbox.
If you want to check when your competition created their website, you can use WHOIS. If the website is a .com TLD then you can use https://whois.icann.org/en to find this information quickly. If you need it for other TLDs then simply search “WHOIS .co.uk” or some other variant of that until you find one that you like.
2. Domain Name – every URL can be broken down into components, and many of these are ranking factors. For example, if we take the URL below, it has a protocol, subdomain, root domain, top level domain, subdirectory and filename. Each of these is a ranking factor that can be optimised. If your root domain includes your targeted keyword as an exact match, partial match, or is topically relevant, then you can expect a small ranking boost. However, URLs that include your targeted keyword can easily be hit with penalties from anchor text over-optimisation, so you should consider whether you want this or not. When you are starting out this can be a problem, but as your backlink profile grows it will cease to be an issue.
If you really want to improve your rankings, consider putting the targeted term first. For example, if you were creating a paleo restaurant in the United Kingdom, you might choose a name such as paleorestaurant.co.uk. Whilst .com is the most popular TLD on the internet, there is a slight boost for using the targeted country’s TLD. However, if you have ambitions to scale internationally, you may wish to choose .com as a neutral domain.
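The URL components described in point 2 can be pulled apart programmatically. Here is a minimal sketch using Python's standard library; the URL itself is hypothetical and only for illustration:

```python
from urllib.parse import urlparse

# Hypothetical URL used purely for illustration.
url = "https://blog.paleorestaurant.co.uk/recipes/chicken.html"
parts = urlparse(url)

print(parts.scheme)    # protocol: https
print(parts.hostname)  # subdomain + root domain + TLD: blog.paleorestaurant.co.uk
print(parts.path)      # subdirectory + filename: /recipes/chicken.html
```

Note that urlparse stops at the hostname; splitting that further into subdomain, root domain, and TLD needs a public-suffix list, since TLDs like .co.uk span two labels.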
3. Domain History – the history of a domain is the activity that happened on the site before you owned it. Google has said recently that it will reset backlinks to expired domains. This would increase the number of available domains that are not negatively affected, while stopping people from building Private Blog Networks. However, recent studies suggest that Google has not done this yet. I think it makes sense for Google to do this moving forwards, so it is hard to say whether this will still be a ranking factor in the future.
If you would like to view the history of your website, you can use the Wayback Machine. Use it to see if the domain was used for SEO purposes, then cross reference with AHREFS, Majestic and SEMRush to see if the site received a penalty.
4. Public vs. Private WhoIs – there is a good case to argue that hiding your WHOIS information may reflect having something to hide. Whilst it is understandable that a small business might hide its information, large companies should make it available. This is likely a trust signal rather than a signal of authority or relevance. There should be bonus points for people who use the same contact details in the WHOIS as on their site. However, if your website is small or you do not wish to be contacted, then privacy is safe too. It’s not one of the negative Google ranking factors and can save you a lot of grief from spammers.
For those who do wish to reveal their WHOIS information, you should play by the rules. Once your name and contact details are in Google’s database, a penalty on one site can follow you to others. There are several cases where Google has taken a grudge against an individual and banned all their sites. #FreeCharlesFloate
5. Page Title Tags – this is the most popular of the Google ranking factors; it’s the one everybody knows. It has become the staple for all sites that are not mammoth sized like Amazon or Nike. However, it’s not as powerful a ranking factor as it once was – and I say this carefully. It used to be the case that exact match page titles were the hot topic; you could stuff your keywords into the page title and rank. Now it is the opposite – you cannot stuff keywords into your page title and you need to be discreet. The best way to stay under the radar is to include Latent Semantic Indexing (LSI) terms. These are phrases that are relevant to a topic but are not the targeted keyword. For example, in the image below PaleoLeap uses the title Paleo Chicken and Poultry Recipes to include two closely related terms.
One of the biggest traps people fall into with page titles is over-optimisation. Whilst you can and should include the keyword, if you over-optimise the page you start signalling that you are abusing Google ranking factors. If this is the case, Google drops your site very quickly, because you have written for search engines rather than users. That said, many people still use a single keyword in their page title with a compelling reason to click through. Within some niches this can be every page in the top results, but on average around 50% of sites in the top 10 results include a targeted keyword.
6. H1, H2, H3 Tag Optimisation – the heading elements are three of the most popular ways for Google to understand relevance on a page. Conventional wisdom says that if headings help Google gauge relevance, then the more the merrier. However, if you mark your whole page up as <h1>, there is no structure to the page. Therefore, if you want Google to have the clearest understanding of relevance, you should intentionally restrict yourself to certain practices.
Firstly, your core term should appear in the URL and Page Title, which means that your <h1> tag should be semantically related. This means that rather than using the exact same term in your URL, page title and <h1>, you choose to use something relevant instead. As an example, if you are targeting Home Recording then your heading might be Home Studio & Audio Equipment. There should only be one <h1> tag and preferably it’s above the fold in a large font to draw attention to it. By including multiple <h1> tags you dilute the relevance somewhat.
With your <h2> tags you can be more liberal. You should aim to include multiple <h2> tags, and they should be topically relevant and preferably include your keywords in some fashion. Using the above example, you might include <h2> tags such as Essential Recording Equipment, or Home Studio and other combinations of the keywords. This keeps it natural by using real terms from your industry that are relevant to your topic and page content. This will not only be useful for Google, but also for your users.
Finally, <h3> tags are the least important of the heading tags. My advice is typically for this to be used as subheadings within a <h2>. Using the examples above, in your section about Home Studio you may choose to include headings such as Audio Interface, Studio Monitors, and Microphones. These are naturally occurring headings that fall within the topic and provide the user relevance.
In the example below, even though Google hasn’t optimised its headings to rank for specific keywords, you can see it has followed the same clear structure. Google uses a <h1> tag at the top in large letters, followed by a sub-topic called Things to do, and in that list of things to do you find specific items such as Give visitors the information they’re looking for.
7. Meta Description Tags – the meta description has not been a ranking factor for a long time. It is not something you ultimately need to worry about. However, many SEOs would argue that it can still be optimised to improve rankings.
Since click-through rate, bounce rate, and dwell time are often considered ranking factors, optimising it indirectly makes sense. It’s really as simple as these two steps:
If your meta description encourages users to click your link – great!
If your meta description is accurate to what your page will achieve – great!
Then just throw a keyword in that you’re targeting, and you will rank in no time at all. This is one of the most overthought Google ranking factors. It’s really that simple!
8. Relevant Content – when people talk about relevant content, they often get confused. We think about relevance in human terms, but that’s not how Google sees it.
Google is an algorithm, based on mathematical equations. Whilst its understanding of words seems similar to a human’s, it is fundamentally mathematical. So when we talk about relevance, we need to think about the numbers. Let me show you why:
The most popular form of information retrieval with regard to relevance is TF-IDF, which stands for Term Frequency – Inverse Document Frequency. If you’ve not read about this before, you need to learn this formula:
(Term Frequency/Total Terms) * log(Total Documents/Documents Including Term)
For example, let’s say you have 10,000,000 documents, and 1,000 of them include the term “doctor”. The specific page you’re analysing has just 100 words, and includes doctor 3 times in the text. Using a base-10 logarithm, your TF-IDF score would be:
(0.03) * log(10000000/1000) = 0.12
If you then compare this to a term such as “the”, which shows up in almost every document on the internet, the mathematics looks more like this:
(0.03) * log(10000000/9000000) = 0.001
So even though the number of times the term appears in the document remains the same, the number of times it shows up across the internet is not. This helps Google weight terms and pick which are the most important to a document.
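The worked examples above can be reproduced in a few lines of Python – a sketch, using the base-10 logarithm the numbers imply:

```python
import math

def tf_idf(term_count, total_terms, total_docs, docs_with_term):
    # TF: how often the term appears in this document.
    tf = term_count / total_terms
    # IDF: how rare the term is across the corpus (base-10 log,
    # matching the worked examples above).
    idf = math.log10(total_docs / docs_with_term)
    return tf * idf

# "doctor": 3 uses in a 100-word page, found in 1,000 of 10,000,000 documents
print(round(tf_idf(3, 100, 10_000_000, 1_000), 2))      # 0.12
# "the": same on-page frequency, but found in 9,000,000 documents
print(round(tf_idf(3, 100, 10_000_000, 9_000_000), 3))  # 0.001
```

Same on-page frequency, two orders of magnitude difference in weight – that's the whole trick.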
Latent Semantic Indexing is how Google understands words in relationship to one another, and this needs a whole page to itself.
9. Keyword is Most Frequently Used Phrase in Document:
10. On-Page Content – if you’ve heard that content is king, you’ve undoubtedly been told to write good content. Yet, whilst there is some correlation between length and ranking – it’s not clear cut.
One reason that content length is important is LSI. By writing more text you give yourself more opportunities to include your keyword and semantically related keywords. However, not every page benefits from large volumes of content.
If you look at content within your niche, it may differ completely from other niches. Beyond being niche based, it’s also keyword based. Informational keywords such as ‘how to dye my hair’ may warrant long articles and step-by-step instructions. However, if you look for ‘hairdressers near me’ then you’ll find pages that seldom have more than 200 words of text. This is because Google handles relevance at the site-wide level, not just the page level.
That is to say, if you write a lot of articles such as ‘how to dye my hair’ and ‘top haircuts in 2017’, there’s going to be an association with hair. Then Google finds your pricing page and sees you do male and female haircuts, and your Google Maps listing places you five miles from Bobby who needs his hair cut. Google uses these various pages to get a holistic understanding of your site. So there’s not much need to write 2,000 words of content for a category page when you could write 2,000 words in a buying guide.
11. Keyword Density & LSI – you can find out more about this in my recent blog article about keyword research including LSI.
12. LSI Keywords in Title and Description Tags:
13. Page Loading Speed via HTML – the page load speed is a ranking factor, and it’s huge! But it’s fairly easy to achieve. Why?
I’ll show you:
A long page load time does not indicate that a site is bad; it simply indicates that it takes a long time to load. In fact, sites with lots of dynamic content will load slower than others.
If you have been biting your nails and ripping your hair out trying to go from 2 seconds to 1.8 seconds to beat your competition – then I’m going to save you time.
Page Speed is a threshold ranking factor, and likely a negative one.
That means if your page loads slower than X seconds, it incurs a devaluation. If your site isn’t slow, you’re not going to get a big boost from reducing page speed further. You can see this with PageSpeed Insights.
Google marks everything with a score above 80 as being Good.
This doesn’t mean you shouldn’t try to get closer to 100, but once you’re marked as Good – you know that Google is currently happy with your site speed.
14. Duplicated & Syndicated Content – duplicate content can be a real issue for users, and Google uses it as a quality signal for a website. Whilst many companies won’t write duplicate content on purpose, they may create it accidentally. In my time doing SEO I have found three main causes of duplicate content: copying manufacturers’ descriptions, using WordPress defaults, and using URL parameters. As you will see, all three situations are caused by the webmaster’s choices. It is both controllable and avoidable.
In the first situation, a webmaster has chosen to use manufacturer descriptions. This happens a lot in eCommerce and it’s really not great value. Manufacturers include lots of sales copy about why their product is the greatest around, but it rarely includes a breakdown of features with any depth. If you are writing product descriptions, aim for 500 words of unique content that addresses the user’s needs. If you can’t write 500 words about your product, aim for 300 and try to improve this with time. However, if you can’t write 300 words of unique content about a product, do you really know what you’re selling? Is it really good value? Do people really want to buy this product? These are the types of questions you should start asking. Putting the page through Copyscape will quickly reveal whether a page is syndicated across the web.
In the second situation, the site owner has chosen to use WordPress, but wasn’t aware of several features of the blog that cause duplicated content. In WordPress there are /category/, /tag/, /author/ and /archive/ pages that include links to your articles. These are great value for users reading your blog. However, they are not good for Google crawling your website: the pages all include the same snippets of text as your main blog pages and appear as duplicated content. To avoid this, you can quite easily apply noindex,follow to those pages. If your site is small, this is the only step you need to take. However, if your site is large and you have used a lot of categories and tags, you may have an issue with crawl budget. Whilst Google will not index those pages, it will still crawl them. To solve this, write the following in your robots.txt file:
Disallow: /category/
Disallow: /tag/
This can normally be detected quite easily by using Siteliner to check for internally duplicated pages. For example, I chose a random WordPress blog on the internet as an example.
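You can also sanity-check those Disallow rules with Python's built-in robots.txt parser before deploying them – the example.com URLs are placeholders:

```python
from urllib import robotparser

# The same rules suggested above for WordPress duplicate-content pages.
rules = [
    "User-agent: *",
    "Disallow: /category/",
    "Disallow: /tag/",
]
rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/category/news/"))  # blocked
print(rp.can_fetch("*", "https://example.com/my-post/"))        # crawlable
```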
In the final example the webmaster has chosen to use URL parameters that create duplicates of a product. This can often happen when dealing with eCommerce sites and leads to many duplicate pages. Whilst it’s understandable your eCommerce will have these issues – it is avoidable. You should look to disallow specific queries from the robots.txt file. For example, if you have an issue with lots of search pages indexed – you may wish to disallow the search query. Let’s take ?Search= as an example. You could disallow it in two ways:
Disallow: *?Search=*
Disallow: *?*
In the first example, you are disallowing all URLs that include ?Search=, no matter what comes before or after it. In the second example, you are disallowing all URLs that include ?. This would by default remove ?Search= and all its variants, as well as every other URL parameter in one go. If you go to your Webmaster Tools, you can find a full list of URL parameters being used. This tool also allows you to change how Googlebot behaves when it finds those pages.
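Note that Python's built-in robotparser ignores the * wildcard, so here is a simplified matcher purely to illustrate how those two patterns behave – this is my own helper, not how Google implements it:

```python
import re

def is_disallowed(url, patterns):
    # Translate Google-style wildcards: * matches any run of characters,
    # everything else is matched literally.
    for pattern in patterns:
        regex = ".*".join(re.escape(part) for part in pattern.split("*"))
        if re.search(regex, url):
            return True
    return False

print(is_disallowed("/shop?Search=shoes", ["*?Search=*"]))  # True
print(is_disallowed("/shop?colour=red", ["*?Search=*"]))    # False
print(is_disallowed("/shop?colour=red", ["*?*"]))           # True: any parameter
```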
23. Page Loading Speed via Chrome:
24. Image Optimization – one of the cornerstones of SEO, image optimisation is often misunderstood and left alone.
In an effort to optimise every page title, heading, meta description, and backlink – people forget to ever do their images. This is partly because WordPress does not have great facilities for bulk updating your images.
The main things to remember include:
- Alt Text – this is what the user will see if the image is broken. Particularly useful for visually impaired people that use screen reading software. Over-optimising this is bad SEO, but it’s also bad user-engagement. If you value your users, be descriptive.
- File Name – this is something you really need to remember. File names matter. Period. It’s really as simple as naming your file what it is, instead of some random series of numbers and letters. There’s really no excuse, because it makes your life so much easier when browsing the media libraries of your own website on WordPress.
- Title Tags – an often overlooked opportunity is to add a title tag to your images. These are the little boxes that appear when people hover over your image. Since this is visible to people, not just robots, you should optimise it for reading.
25. Recency of Content Updates – some people are skeptical about this one, but it’s one of the Google ranking factors that makes sense. If your website covers a topic from 2014, and a new site covers 2018, yours will be more relevant for queries specifically including 2014. However, the more recent one may be more relevant for broad terms.
This isn’t always the case, because great articles can last many years without needing to be refreshed. So I personally lean towards saying this is a small ranking factor. If you deliver good content, and market it properly, there’s no problem.
26. Magnitude of Content Updates:
27. Historical Updates Page Updates:
28. Keyword Prominence:
30. Keyword Word Order:
31. Outbound Link Quality – one of the core parts of the algorithm is links. ‘Links’ is the computer-programming term, but really they are citations.
Imagine if you were writing a dissertation at university and you only cited Wikipedia, YouTube, your friend’s website, and a couple of blog posts you liked. It would not end well for your grade.
And so it stands with online marketing.
If you want to rank, then you need to be linking towards authoritative websites. Industry leaders, authoritative sources.
These are major news websites such as the Financial Times, Huffington Post, The Guardian and the BBC. Other sources might be research papers or special-interest websites such as Wired, Forbes, and National Geographic.
Within the SEO industry, these are websites such as AHREFS, Majestic, Moz, Google – the type of sites that you will find in this post.
But sometimes you need to link to websites that are topically relevant, but not authoritative. This is where the rel="nofollow" attribute can be added to your anchors. For example, earlier in this article I was talking about PaleoLeap. I included a link to them, but it’s nofollow.
Users may have wanted to check out the PaleoLeap website, and it was useful to include a link towards the site – but it’s not authoritative for the purpose of this post. So, they got a nofollow link instead. This is a totally natural way of using it. I’m not sculpting, and I’m not abusing.
32. Outbound Link Theme:
33. Grammar and Spelling:
35. Helpful Supplementary Content:
36. Number of Outbound Links:
38. Number of Internal Links Pointing to Page – internal linking is a great ranking factor. It’s one that is easy to control, can pass relevance and authority – and it’s low effort.
In simple terms, the more internal links pointed towards a page, the easier that page is to find. This means that Google will value that page as being important for your website. However, despite this, many people read that internal links are good and try to hyperlink every other word in a paragraph.
Think about it like this:
If you have 100% authority, and 10 links, that’s a 10% weighting for each of those links. If you have 100 links instead, there’s now a 1% weighting for each link.
You want your posts to link to only as many topically relevant pages as necessary to deliver a great experience. Phew, what a mouthful. To summarise: use links sparingly and aim to help the reader.
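The 10%-versus-1% arithmetic above is simply an even split of the page's authority – a deliberately naive model, since real link weighting is far more nuanced:

```python
def weight_per_link(page_authority, link_count):
    # Naive model: each internal link passes an equal share of the
    # page's authority, so more links means less weight per link.
    return page_authority / link_count

print(weight_per_link(100, 10))   # 10.0 -> 10% per link
print(weight_per_link(100, 100))  # 1.0  -> 1% per link
```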
39. Quality of Internal Links Pointing to Page – this is somewhat a continuation of the above point. However, there’s a twist.
Whilst including internal links on your website is great, the placement is important. Contextual links mean more, both to Google and to the user. It’s really obvious: if you’re talking about dog food and link to your dog treats page, it’s something your users might also be interested in reading.
Then after that I generally weigh links in this order, best to worst:
- Top Navigation
I can see arguments for swapping around the sidebar and top navigation. However, my reasoning is that sidebars often include topically relevant articles. This differs from the top navigation, which is prominent but includes links to every page.
If your sidebar includes the exact same links across every page of your website, then I would say they’re equal in worth. However, for people who take sidebar links seriously and make them useful – those people are helping users to navigate the website.
40. Broken Links – being hit with a 404 page is not useful, and kinda sucks. But it generally is not a direct Google ranking factor, because it’s hard to punish a website for occasional errors, especially huge websites.
However, there are three indirect ways that 404s can hurt your website:
- Broken Backlinks – if your website has backlinks pointed towards the page that is returning a 404, you’re losing authority. If the page is permanently gone and has backlinks, you need to write a custom 301 redirect to your new page.
- Soft 404s – if Googlebot crawls your page multiple times, each time receiving a server error, it will start to ignore the page. The page may still exist, but Google will treat it as a soft 404 and no longer give your website credit for it.
- User Experience – if your website has one or two errors, it’s unlikely to hurt your user experience. If you have hundreds of them, you may be getting an unnecessarily high bounce rate. These user metrics are often considered strong ranking factors.
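For the broken-backlinks case in the first bullet, a 301 redirect on an Apache server is a single line in your .htaccess file – the paths here are hypothetical placeholders:

```apache
# Permanently redirect a removed page that still has backlinks
Redirect 301 /old-page/ https://example.com/new-page/
```

Nginx and other servers have their own equivalents; the point is simply that the redirect is permanent (301), so the old page's authority is passed on.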
41. Reading Level:
42. Affiliate Links – there are good and bad affiliate sites, and it’s obvious which category a website falls into – mostly from its business model.
A website such as SkyScanner, or CompareTheMarket – these are high quality affiliate sites. Whilst they make their money from comparing and selling services; both are so useful they are a service in their own right.
On the other hand, Amazon affiliate sites are often just glorified link building schemes, built on Fiverr content and tons of shady backlinks. This type of site has little inherent value, so it borrows it from backlinks. These websites tend to get penalised with every Google update that comes out, no matter what strategy they use.
Whilst I have no moral objections to affiliate sites, I only work with an affiliate site if it adds value. This is what users want, what Google wants, and ultimately what we as business owners should want for our products too.
43. HTML errors/W3C validation:
44. Page Host’s Domain Authority:
45. Page’s PageRank:
46. URL Length:
47. URL Path:
48. Human Editors:
49. Page Category:
50. WordPress Tags:
51. Keyword in URL – it’s not likely to be a surprise for people, but including your keyword in the URL is a big ranking factor. It’s absolutely huge.
Recently a new colleague asked me a simple question.
“Is it really a big ranking factor? Really?”
So I said let’s check the SERPs and see what they have to say. I picked payday loans because I had been working on a campaign there, and the below screenshot proves the point.
If you look at the below screenshot and don’t think this is a big ranking factor – the rest of the article isn’t much use for you.
52. URL String:
53. References and Sources:
54. Bullets and Numbered Lists – this is an interesting one, because adding lots of bullets and numbered lists will not dramatically improve your rankings. However, they are part of a good user experience and can be really useful.
For example, if you are trying to highlight three small and simple points, then lists are for you. It’s such a great way to provide easy to digest content that is neatly formatted.
I’m not advocating that you go overboard and place lists everywhere. However, where a list is appropriate, use either bullets or numbered lists.
If your question is when a list is appropriate, it’s when you’re listing multiple items. The clue is in the title on this one.
55. Priority of Page in Sitemap:
56. Too Many Outbound Links – when we use an outbound link to another website, it’s supposed to be useful for users. It’s something that passes our authority onto them, but should ultimately provide users with some added value.
It’s good practice to include outbound links across your website – a couple per article is fine. However, when you include too many outbound links, any ranking power you gain from other sites you instantly lose here.
The same can be said for ‘high authority websites’ that link out to lots of sites: the power is divided amongst each of those links, eventually leaving not a great deal to work with.
Include valuable assets and links to resources as you feel is appropriate for your users to learn and grow. If you’re seeing a decrease in rankings, rein it in and focus on the core authoritative sources only.
57. Quantity of Other Keywords Page Ranks For:
58. Page Age:
59. User Friendly Layout:
60. Parked Domains:
61. Useful Content:
62. Content Provides Value and Unique Insights:
63. Contact Us Page:
64. Domain Trust/TrustRank:
65. Site Architecture – this is one of those Google ranking factors that can be misunderstood. It seems obvious that site architecture should be a ranking factor – and it is – but it’s not simple.
66. Site Updates:
67. Number of Pages:
68. Presence of Sitemap – a sitemap is really easy to set up on your website, and a huge benefit for your site.
There are two types of sitemaps that you will need to know about. Both of these are recommended by Google:
- HTML Sitemap – a user friendly sitemap, normally placed in your footer. This should include links to all of your most important pages that users will need to find. This shouldn’t include every possible page, just the most important categories, informational pages, and posts.
- XML Sitemap – a robot friendly sitemap, normally placed in your root domain. This should include either a library of sitemaps or every single link on your website. If you use YoastSEO this will normally be sitemap_index.xml, page-sitemap.xml, post-sitemap.xml, and various others. Non-essential sitemaps such as post_tag-sitemap.xml can be removed.
The important thing is that a robot is only interested in the naked URL, priority, update rate, and last-updated date. This is easily generated by Screaming Frog. The objective is to improve the crawl rate of the website by providing every important URL.
With an HTML sitemap, you’re trying to provide meaningful URLs for users, with human friendly anchor text. To prevent the list getting too long, you should only include the core pages, and group them into categories for the best user experience.
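As a sketch of what the robot-facing file contains, here's a minimal XML sitemap generator in Python – the page URLs, dates, and priorities are made-up examples:

```python
from xml.etree import ElementTree as ET

def build_sitemap(pages):
    # Build a minimal XML sitemap: one <url> entry per page, with the
    # fields robots care about (URL, last-updated date, priority).
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page["loc"]
        ET.SubElement(entry, "lastmod").text = page["lastmod"]
        ET.SubElement(entry, "priority").text = page["priority"]
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages for illustration only.
xml = build_sitemap([
    {"loc": "https://example.com/", "lastmod": "2018-01-15", "priority": "1.0"},
    {"loc": "https://example.com/blog/", "lastmod": "2018-01-10", "priority": "0.8"},
])
print(xml)
```

In practice you'd let YoastSEO or Screaming Frog generate this, but it's useful to know there's nothing magic inside the file.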
69. Site Uptime – it’s not likely that this is a Google ranking factor by itself; it’s hard to imagine Google reasoning ‘oh, the site is down very often, so let’s just penalise it.’
However, if every time Google visits your website the server is down – there’s a problem. After a while, their algorithm treats that page as a soft 404. This means that the page may still be there, but is not returning any content.
Since a soft 404 is not good for users, it will be devalued in the SERPs and no longer be able to rank. So whilst it’s not a direct negative or positive ranking factor, it matters. Great server uptime will reduce the chance of being marked as a soft 404.
70. Server Location – this is an important one that people often overlook.
71. SSL Certificate – there is really no need to string this one out with a long description. This is a 100% confirmed Google ranking factor. They themselves announced this in the HTTPS as a ranking signal post that they wrote. It would be silly to argue against this one.
However, more notably, how you set up your https:// protocol is going to be a contributing factor towards how well you rank. You will need to take a break and read my guide on setting up HTTPS for your website.
This guide will help you cover all the mistakes that I find from regularly working with websites.
72. Terms of Service and Privacy Pages:
73. Duplicate Meta Information On-Site:
74. Breadcrumb Navigation:
75. Mobile Optimized – this one is going to impact sites with lots of mobile traffic more than desktop-heavy sites. However, I rarely come across websites whose organic traffic is 100% desktop.
It’s quite typical for me to see websites with large desktop audiences that still get around 30–40% of their traffic from mobile. Other websites I have worked with get up to 75% of traffic from mobile, which is a very different story.
There are lots of different things that you can do to optimise for mobile. Here’s a short list of my favourite ways to optimise:
- Accelerated Mobile Pages – this is a movement led by Google and many other big businesses that wanted to deliver a better mobile experience. In a nutshell, it processes your website in a way that renders above the fold faster than the rest of the page. This means the website appears to load immediately for users. Cool technology endorsed by Google – it’s a winning formula.
- Buttons – one thing people forget is that people have fat fingers. This is a real issue people have with using your website on mobile. Spreading your buttons out and making them large will help users to click them. It’s also something that Google recommends – so worth doing.
- Colour Coordination – many websites think it’s okay in 2018 to get away without organising their colour palette. It’s absolutely not okay. If your fonts are a dark blue on a light blue background, I’m glad you like it – but change it to black on white. It’s easier to read.
- Horizontal Scrolling & Zooming – this fits into responsive design below, but deserves to be stated alone. There’s no need for horizontal scrolling on mobile. Similarly, if you have to constantly zoom out and in to read the content, something is horribly wrong with the user experience.
- HTML Phone Numbers – in early 2017 I was using a Windows phone. I’m glad those days are over. However, one thing it taught me is that HTML tags for phone numbers are super useful. Instead of having to copy and paste the phone number and then open up my phonebook to dial it, I want to just tap it. This is what a simple <a href="tel:"></a> tag can do for you.
- Large Fonts – it’s up for debate whether this alone is a ranking factor. If your website is hard to read on mobile, then it sucks. Simple. I find 16px is a great size for people to read, and strongly advocate this font size for mobile.
- Responsive Designs – simply put, if your website on mobile is just a small version of your website on desktop – then you need to redesign. It’s 2018 and making mobile friendly websites is super easy with all the technology available. Make it fit to mobile devices, and use a different navigation style that users are familiar with.
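To make a couple of the checklist items above concrete, here’s a minimal sketch – using Python’s standard `html.parser`, run against a made-up page – that tests for two of the easy wins: a responsive viewport meta tag and a clickable `tel:` link.

```python
from html.parser import HTMLParser

class MobileCheck(HTMLParser):
    """Collects two quick mobile-friendliness signals from a page:
    a responsive viewport meta tag and a clickable tel: phone link."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False
        self.has_tel_link = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True
        href = attrs.get("href") or ""
        if tag == "a" and href.startswith("tel:"):
            self.has_tel_link = True

# Invented sample page for illustration.
page = """
<html><head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head><body>
  <a href="tel:+441234567890">Call us</a>
</body></html>
"""

checker = MobileCheck()
checker.feed(page)
print(checker.has_viewport, checker.has_tel_link)  # True True
```

This is obviously not how Google measures mobile-friendliness – it’s just a way to spot-check your own templates before a proper audit.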
76. YouTube – this one can be summed up in three carefully chosen words:
Google owns YouTube.
If you’re doing video, and you want it to rank well – then YouTube is the platform you need to be on. There is a clear reason for Google to prioritise YouTube over any other platform. It’s also a great opportunity for you to add a link back to your website.
This is something that can be abused. However, I’m confident that if you do it naturally, and link towards a page that is relevant – it’s fine. Relevant in this context means a page appropriate to the video. Likewise, that video is probably appropriate to that page, so place the video there.
This will not only funnel referral traffic to your main page – it will also improve the user experience on your target page. A double whammy.
With so many ways to create animated and short videos on a budget, it makes sense to get a few well done videos for your YouTube channel to help improve conversions.
77. Site Usability:
78. Use of Google Analytics and Google Webmaster Tools:
79. User reviews/Site reputation:
80. Linking Domain Age:
81. # of Linking Root Domains:
82. # of Links from Separate C-Class IPs:
83. # of Linking Pages:
84. Alt Tag (for Image Links):
85. Links from .edu or .gov Domains – this generally works really well. The whole point of a citation based algorithm is that authoritative government and education websites have lots of power.
In recent years people have used scholarship programs to connect with Universities for backlinks. There’s nothing inherently wrong with a scholarship, or getting backlinks towards your website. However, if you grab hundreds of education pages – this can flag you up.
The same isn’t necessarily true for government websites. These tend to be a lot harder to manipulate and acquire links from. So if the Houses of Parliament are linking towards you – it’s a great sign.
86. Authority of Linking Page:
87. Authority of Linking Domain:
88. Links From Competitors:
89. Social Shares of Referring Page:
90. Links from Bad Neighborhoods:
91. Guest Posts:
92. Links to Homepage Domain that Page Sits On:
93. Nofollow Links:
94. Diversity of Link Types:
95. “Sponsored Links” Or Other Words Around Link:
96. Contextual Links:
97. Excessive 301 Redirects to Page:
98. Backlink Anchor Text – if external websites pass backlinks towards you with targeted anchor text – you’re going to rank really well. In fact, it’s so effective that Google actively police it.
Anchor text over-optimisation is quite easy to detect, and so people need to be really careful with how many targeted anchors they use. Branded anchor text is a safe bet, but doesn’t pass the same punch for your target keywords.
This is another one that Google have confirmed matters, and all my tests show that it works extremely well. I strongly recommend manipulating backlink anchor text if you want to rank effectively.
99. Internal Link Anchor Text:
100. Link Title Attribution:
101. Country TLD of Referring Domain:
102. Link Location In Content:
103. Link Location on Page:
104. Linking Domain Relevancy:
105. Page Level Relevancy:
106. Text Around Link Sentiment:
107. Keyword in Title:
108. Positive Link Velocity:
109. Negative Link Velocity:
110. Links from “Hub” Pages:
111. Link from Authority Sites:
112. Linked to as Wikipedia Source:
114. Backlink Age:
115. Links from Real Sites vs. Splogs:
116. Natural Link Profile:
117. Reciprocal Links:
118. User Generated Content Links:
119. Links from 301:
120. Schema.org Microformats:
121. DMOZ Listed:
122. TrustRank of Linking Site:
123. Number of Outbound Links on Page:
124. Forum Profile Links:
125. Word Count of Linking Content:
126. Quality of Linking Content:
127. Sitewide Links:
128. Organic Click Through Rate for a Keyword:
129. Organic CTR for All Keywords – whilst Google stopped passing keyword data through to Analytics for the searches users click – that doesn’t mean they stopped tracking it.
If you regularly get clicks for your target keyword, it’s fairly safe to say that you are going to build a strong interaction with this site. Some people would suggest that Google can’t track this, but I think they can.
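As an illustration of what organic CTR means in practice, this small sketch computes clicks ÷ impressions per query from a hypothetical Search Console-style export – the queries and numbers are invented:

```python
# Hypothetical Search Console-style export: (query, impressions, clicks).
rows = [
    ("cheap flights", 12000, 540),
    ("flights to france", 3000, 210),
    ("south of france holidays", 800, 16),
]

def ctr(clicks, impressions):
    """Click-through rate as a percentage, guarding against zero impressions."""
    return round(100 * clicks / impressions, 2) if impressions else 0.0

report = {query: ctr(clicks, impressions) for query, impressions, clicks in rows}
print(report)
# {'cheap flights': 4.5, 'flights to france': 7.0, 'south of france holidays': 2.0}
```

Queries whose CTR sits well below the average for their ranking position are the ones worth rewriting titles and descriptions for.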
130. Bounce Rate:
131. Direct Traffic:
132. Repeat Traffic:
133. Blocked Sites:
134. Chrome Bookmarks:
135. Google Toolbar Data:
136. Number of Comments:
137. Dwell Time:
Special Algorithm Factors
138. Query Deserves Freshness:
139. Query Deserves Diversity:
140. User Browsing History – this is not a part of the core algorithm, but applies to users struggling to find what they are looking for. There’s really no way to manipulate this for your advantage – but you should be aware that browsing history is part of the algorithm.
141. User Search History – similar to above, the search history of a user can impact what they find. For example, if a user searches ‘cheap flights’, and then five minutes later searches ‘south of France’ – it’s fairly safe to assume they’re looking for a holiday.
This may influence the algorithm slightly to push results that are associated with both the phrase ‘cheap flights’ and ‘south of France’. This essentially improves the relevance of the search results for a single user based on what they have recently searched.
Whilst this is something that Google is definitely doing, the importance of this is very low. The level of processing that is required for this is not justified by the difference it brings in results. I would not expect this to be large scale for 1 – 2 years at least.
142. Geo Targeting:
143. Safe Search – this is an easy one that people often forget. If a user enables SafeSearch, then any adult-related content that might be relevant is left out. Again, there’s not really any way to manipulate this, except to avoid having your website classified as adult entertainment.
If you’re doing a website around pornography, or escort services – then safe search will be against you. If you’re selling girl scout cookies – this is probably a ranking factor to ignore.
144. Google+ Circles:
145. DMCA Complaints:
146. Domain Diversity:
147. Transactional Searches:
148. Local Searches:
149. Google News Box:
150. Big Brand Preference:
151. Shopping Results:
152. Image Results:
154. Single Site Results for Brands:
155. Number of Tweets:
156. Authority of Twitter Users Accounts:
157. Number of Facebook Likes:
158. Facebook Shares:
159. Authority of Facebook User Accounts:
160. Pinterest Pins:
161. Votes on Social Sharing Sites:
162. Number of Google+1’s:
163. Authority of Google+ User Accounts:
164. Known Authorship:
165. Social Signal Relevancy:
166. Site Level Social Signals:
167. Brand Name Anchor Text:
168. Branded Searches:
169. Site Has Facebook Page and Likes:
170. Site has Twitter Profile with Followers:
171. Official Linkedin Company Page:
172. Employees Listed at Linkedin:
173. Legitimacy of Social Media Accounts:
174. Brand Mentions on News Sites:
176. Number of RSS Subscribers:
177. Brick and Mortar Location With Google+ Local Listing:
178. Website is Tax Paying Business:
On-Site Webspam Factors
179. Panda Penalty – the Panda algorithm looks at pages that are thin, duplicated, or low quality. It does this partly by analysing how similar multiple pages are.
A pretty common cause for Panda can be found on each of the major CMS platforms. WordPress generates lots of /tag/, /author/, and /category/ pages.
These are often handled through Yoast SEO, the most popular SEO plugin for WordPress. By noindexing these pages, you can quickly reduce the amount of duplicate content on your site.
Shopify creates lots of tagged pages and, if you don’t change the Liquid theme files, duplicate pages. One of the big issues with Shopify is /collections/collection-handle/products/product-handle URLs, which are referenced throughout the website. Instead you can change these to always use /products/product-handle/.
This simple change can essentially halve the size of your website and improve crawl efficiency, reducing cannibalisation at the same time.
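A sketch of that Shopify clean-up, assuming the standard /collections/<handle>/products/<handle> URL shape – the shop domain and handles here are invented:

```python
import re

# Collapse /collections/<handle>/products/<handle> URLs down to the
# canonical /products/<handle> form, so every internal link points at
# one URL per product instead of one per collection it appears in.
COLLECTION_PRODUCT = re.compile(r"/collections/[^/]+(/products/[^/?#]+)")

def canonicalise(url):
    return COLLECTION_PRODUCT.sub(r"\1", url)

print(canonicalise("https://shop.example.com/collections/sale/products/red-mug"))
# https://shop.example.com/products/red-mug
```

In practice you’d make the equivalent change inside the Liquid templates themselves (so the duplicate URLs are never generated), but the rewrite above is the shape of the transformation.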
180. Links to Bad Neighborhoods:
181. Redirects – When dealing with external websites pointing towards you, redirects are quite normal. It’s not possible for you to keep every page you ever created live at the same URL forever.
Naturally, when a page is gone or moved, then you will add a 301 redirect from the old page, to the most relevant new page. This improves the internet, because people can find relevant content without hitting 404 pages. So this is positive.
However, when we’re dealing on-site, it is possible for you to handle your own redirects. For example, if you move your own article – then you can change your internal links to point towards this page.
Likewise, if you point externally to another website and they change their URL, you should check to make sure that this is a good redirect. You don’t want to be linking to a playground website, only to find out it’s been repurposed. I’ve seen church websites turned into sexual enhancement blog networks – you’ve got to be careful where your redirects take people.
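One way to audit your own redirects is to resolve each old URL through your redirect map and flag chains and loops, since both waste crawl budget and should be collapsed into a single hop. This sketch, with an invented redirect map, shows the idea:

```python
def resolve(redirects, url, max_hops=5):
    """Follow an old -> new redirect map to its final destination.
    Returns (final_url, hops); raises on loops or over-long chains,
    which are the cases worth fixing by linking straight to the end."""
    seen = set()
    hops = 0
    while url in redirects:
        if url in seen or hops >= max_hops:
            raise ValueError(f"redirect loop or chain too long at {url}")
        seen.add(url)
        url = redirects[url]
        hops += 1
    return url, hops

# Hypothetical site: two page moves ended up chained together.
redirects = {
    "/old-article": "/blog/old-article",
    "/blog/old-article": "/blog/new-article",
}
print(resolve(redirects, "/old-article"))  # ('/blog/new-article', 2)
```

Anything that resolves in two or more hops should be repointed directly at the final URL, and any internal link still using the old URL updated.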
182. Popups or Distracting Ads:
183. Site Over-Optimization:
184. Page Over-Optimization:
185. Ads Above the Fold:
186. Hiding Affiliate Links:
187. Affiliate Sites:
188. Autogenerated Content:
189. Excess PageRank Sculpting:
190. IP Address Flagged as Spam:
191. Meta Tag Spamming:
Off-Site Webspam Factors
192. Unnatural Influx of Links:
193. Penguin Penalty – this is an algorithmic penalty that seriously harms your ability to rank. It’s one that updates regularly and is identified by over-optimised backlinks.
If you have been using press releases, guest post services, private blog networks, link wheels, link trading, or even just spammy outreach campaigns – you’re a potential target for Penguin.
It’s fairly easy to pick up by looking in Ahrefs or Webmaster Tools. Here are my tips on how to find Penguin penalties:
- Anchor Text – check the top 100 anchors in Google Search Console; if you find lots of targeted anchor text, there’s a chance you will receive Penguin if you haven’t already. This isn’t to say that targeted anchor text is bad, but when a single keyword shows up in 25% of all your anchors – you may have overcooked it.
- Internal Linking – because this is such a strong ranking factor, it’s heavily policed. Internal links across your website are GREAT for rankings. But just like eating cake, too much is really bad for you. You can view this in Search Console under your most linked pages. Naturally your homepage should be the most linked page.
- Low Quality Backlinks – this is one that is covered across the internet in lots of detail. Needless to say, if you have tons of porn, sex, drugs, and rock’n’roll sites pointing towards you – then these are not great citations. Unless of course, your website is about porn, sex, drugs, and rock’n’roll.
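A quick way to run the anchor text check above yourself: tally the anchors from a backlink export and flag any that exceed the 25% share mentioned earlier (treat that number as a rough warning line, not a hard rule – and the anchors below are invented):

```python
from collections import Counter

# Hypothetical backlink export: one anchor text per backlink.
anchors = [
    "Example Brand", "example.com", "cheap flights", "cheap flights",
    "click here", "Example Brand", "cheap flights", "https://example.com",
]

counts = Counter(a.lower() for a in anchors)
total = len(anchors)

# Flag any anchor taking more than a quarter of the whole profile.
overcooked = {a: round(100 * n / total, 1)
              for a, n in counts.items() if n / total > 0.25}
print(overcooked)  # {'cheap flights': 37.5}
```

Branded anchors dominating the profile is healthy; a money keyword dominating it is the pattern Penguin is built to catch.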
195. Linking Domain Relevancy – this is a phrase that will come up quite a lot, and it’s really hard to quantify. Majestic took a crack at it with their manual topical classification of the internet. However, their classification doesn’t sit quite right with me – something about it seems off.
I think that linking domain relevance is important, but how Google calculates it is not clear. There are a few things we can expect to be checked, though. The more of these six points the linking website has, the more relevant the domain is likely to be:
- Anchors – the anchor is relevant to your page or domain, and useful for users.
- Content – the content that links towards you is the same topic as the page it links towards.
- Directory – the directory includes your link with other industry related websites.
- Domain – the domain that links towards you covers the same topic that your website does.
- Language – the language of the website that links towards you is the same as yours.
- Resource – the website that links towards you is using your website as a citation.
Whilst it’s possible to have good domains that do not meet all of these requirements, if you look out for these six points – you’re safe.
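If you want to turn those six points into a rough score when reviewing link prospects, a simple checklist tally works. This sketch is purely illustrative – the signal names mirror the list above and the linking-site data is invented:

```python
# The six relevance signals from the list above.
SIGNALS = ("anchors", "content", "directory", "domain", "language", "resource")

def relevancy_score(link):
    """Count how many of the six relevance signals a linking domain meets."""
    return sum(bool(link.get(s, False)) for s in SIGNALS)

# Hypothetical linking site: topically relevant, but in a different
# language and not listed alongside industry peers in any directory.
link = {"anchors": True, "content": True, "domain": True,
        "language": False, "directory": False, "resource": True}
print(relevancy_score(link))  # 4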
196. Unnatural Links Warning:
197. Links from the Same Server – this is one that people are deeply confused about.
The internet has not worked on class-based addressing for almost two decades. The idea that c-class or c-block IP addresses matter is easily debunked.
Across the internet there are five Regional Internet Registry (RIR) organisations responsible for allocating IP addresses. Each RIR also assigns Autonomous System Numbers (ASNs).
When an ISP or web host contacts their RIR asking for a block of IP addresses, they are given ranges in CIDR format, for example 192.0.2.0/24.
This means they would be assigned all the IP addresses from 192.0.2.0 to 192.0.2.255, with the first (network) and last (broadcast) addresses reserved.
However, the myth behind c-block ip addresses is that a single server or data centre would only include one IP range. This isn’t true. It’s possible for a single server to hold many ip addresses, and it’s all located geographically in the same place.
Therefore, despite all these IP addresses appearing diverse, they can all share the same ASN and the same IP range. They may even all share the same DNS records. It’s simply not sufficient to say that the IP address is the only thing Google is looking at.
Every time they crawl your website, they physically have to connect to your server. That means resolving the DNS records and routing via the ASN. Why wouldn’t they take these publicly available pieces of data and incorporate them?
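The /24 arithmetic above is easy to verify with Python’s standard ipaddress module, here using the reserved documentation range 192.0.2.0/24:

```python
import ipaddress

# A /24 allocation holds 256 addresses; the network and broadcast
# addresses bookend the range.
block = ipaddress.ip_network("192.0.2.0/24")
print(block.num_addresses)      # 256
print(block.network_address)    # 192.0.2.0
print(block.broadcast_address)  # 192.0.2.255
```

A single server can answer for addresses across many such blocks, which is exactly why the old c-block heuristic tells you so little on its own.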
198. “Poison” Anchor Text:
199. Manual Penalty – there are a lot of people that think Google do not use manual penalties any more. This is simply not true, and is misleading.
At the time of writing, there are 11 common ways to get a manual penalty, as outlined by Google’s Webmaster Manual Actions report.
These 11 common reasons for a penalty include, but are not necessarily limited to the following:
- Hacked Website
- User Generated Spam
- Spammy Freehosts
- Spammy Structured Markup
- Unnatural Links to Your Site
- Thin Content with Little or No Added Value
- Cloaking and/or Sneaky Redirects
- Unnatural Links from Your Site
- Pure Spam
- Cloaked Images
- Hidden Text and/or Keyword Stuffing
The two most interesting are unnatural links to your website and unnatural links from your website. It shows that Google differentiate the two – it’s possible for your website to be penalised for linking out to poor quality websites.
The most unlikely is keyword stuffing, which probably dates back to the days of meta keywords. For this reason, my on-site audits always remove meta keywords from client websites. They have no place in 2018, and are just pure spam.
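For that audit step, a small parser – Python’s standard `html.parser`, run against an invented sample head – can flag any lingering meta keywords tags:

```python
from html.parser import HTMLParser

class MetaKeywordsFinder(HTMLParser):
    """Records the content of any <meta name="keywords"> tags found."""
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "keywords":
            self.found.append(attrs.get("content", ""))

# Invented sample <head> for illustration.
head = '<head><meta name="keywords" content="seo, ranking, google"></head>'

finder = MetaKeywordsFinder()
finder.feed(head)
print(finder.found)  # ['seo, ranking, google']
```

Run it across your templates; any non-empty result is a tag worth deleting.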
200. Selling Links:
201. Google Sandbox:
202. Google Dance:
203. Disavow Tool:
204. Reconsideration Request:
205. Temporary Link Schemes:
Categorizing the Google Ranking Factors