

15+ Tips On How To Use Screaming Frog (From Start to Finish)

This article is designed to help beginners quickly learn how to use Screaming Frog.

Screaming Frog is a powerful tool that is essential for any eCommerce site owner or blogger. When it is used properly, it can lead to significant ranking increases surprisingly quickly.

Here’s an example of how Screaming Frog helped me to improve my client’s rankings immediately:

Improvements due to Screaming Frog

I’ll show you how to use Screaming Frog, so that you can also improve your rankings.

But first, we need to look at the basic layout…

User Interface:

The first step to using this tool is understanding where everything is. I have created a numbered list to break down the different sections of the interface.

  1. Menu Bar
  2. Crawl Bar
  3. Tab Bar
  4. Main Window
  5. Detailed Window
  6. Sidebar

Screaming Frog on Windows

Tab Selection:

I will not cover every tab for the sake of brevity, but I will include the most popular tabs within the software. Learning where these are and how to use them will sharpen your SEO analysis and help you move on to the more advanced tutorial further below.

Protocol – the protocol of a website can be filtered to include all pages, HTTP pages and HTTPS pages. This tab is great for quickly checking both internal and outbound links. If you have recently acquired an SSL or TLS security certificate, this tab is going to be useful for you.

Status Codes – these are the codes that your website returns when a page is requested, and the crawl will only include pages that can be found through your internal linking. If large volumes of user profiles or user-generated content bloat your crawl, you can exclude those sections of your website to make crawling faster. Typically, the server will return a 2XX, 3XX, 4XX or 5XX status, which mean OK, redirected, client error, and server error respectively.

Page Titles – this is where you can quickly access the page titles across your site. If a page has more than one title, it will say so. This tab also has filter options to find duplicate, short, long, and missing titles, plus an option to check whether your title matches your H1 tag.

Meta Description – this is where you can find your meta description, and filter by duplicate, short, long, and missing descriptions.

H1 Tags – this is where you will find all of your H1 tags. It includes filters for duplicate, multiple and missing H1 tags, which is useful for analysing your whole site's H1s in one place.

H2 Tags – this is a repeat of the above section, but applied to H2 tags. It's important to note that whilst there are options to find multiple H2 tags, having more than one H2 is no longer an important factor.

Images – this is where you will find a list of all the images used on the website that are found during the crawl. It will include their URL path, the number of times the image is used and can be filtered by size and alt attribute. This section is very basic, reflecting how difficult it is for crawlers to read images.

The tab selection in Screaming Frog

1. How To Install Screaming Frog?

Before we learn how to use Screaming Frog, we need to install it.

The first step is to download your free copy here.

If you are downloading for Windows, this will be a .exe file. Those of you using a Mac will need to download the .pkg file.

The website will automatically detect your system, so this should be straightforward. Install the software like you would any other on your computer.

Once you have done this, you're ready to get started with the rest of the article. However, the free version is limited to 500 URLs, so you may wish to purchase a full licence.

Installed Screaming Frog on Windows

2. Get A Licence.

This step is not essential if you are dealing with a small website. However, you should buy a copy if you’re looking to do freelance or agency work.

The process is simple, but first you will need to buy a licence.

Once you have purchased a licence, you need to load Screaming Frog. Then, in the top navigation you can select Licence > Enter Licence.

After you have entered the correct details and pressed OK, you will need to restart the program. This will activate your licence and enable you to crawl large sites.

Purchasing a Screaming Frog licence

3. How To Use Screaming Frog?

There are many different ways to use Screaming Frog, but the main objective is to crawl and fix your website. To get started, simply download your free copy online, and install.

Once you have completed this, you are ready to start crawling your website. This will normally take a few minutes for medium-sized websites. Large websites may take significantly longer to crawl, and you could run out of memory.

Once you have completed a crawl, it is easy to review errors across your website. Simply navigate through the menus and tabs to explore the various tools available.


4. How to Crawl A Site?

With Screaming Frog, you can easily crawl a website. To get started, you simply need to enter the starting URL into the spider. This option can be found at the top of the screen.

From here, the website will be crawled automatically, one URL at a time. However, it is important to note that Screaming Frog discovers pages through internal linking. Therefore, any pages that have no links pointing towards them will not appear in the crawl.

If you are having trouble crawling any URLs on your website, it may be that bots are being blocked. You can normally check this in your robots.txt file.

Begin a crawl on Screaming Frog
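If you want to check the robots.txt yourself before blaming the crawl, a few lines of Python's standard library will do it. This is a minimal sketch; the domain, page path, and user-agent tokens are placeholders for illustration, not values taken from your own setup.

```python
# Minimal sketch: check whether a robots.txt blocks a given crawler.
# The domain, path, and user-agent tokens below are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://example.com/robots.txt")
robots.read()  # fetch and parse the live robots.txt

page = "https://example.com/some-page/"
for agent in ["*", "Screaming Frog SEO Spider", "Googlebot"]:
    allowed = robots.can_fetch(agent, page)
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```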

5. How To Crawl A List of URLs?

There are some instances when you will want to crawl a list of URLs. Perhaps you are cleaning the index and want to check for broken links there.

To do this, you will need to create a list of the URLs that you want to check. I recommend using the Scraper tool; it's easy to use and discreet.

Once you have created a list, you will need to copy it to your clipboard. The Windows shortcut is Ctrl + C, and it's Cmd + C on a Mac.

Select ‘Mode’ from the Screaming Frog menu. Here you can change to list mode instead of spider mode. This will let you paste those URLs straight into Screaming Frog.

You can also crawl lists from a file, a sitemap, or by adding them manually:

Crawling a list with Screaming Frog
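If the URLs you want already live in an XML sitemap and you would like to trim or combine them before pasting them into list mode, a short script can pull the locations out for you. This is just a sketch; the sitemap URL is a placeholder and it assumes the standard sitemap namespace.

```python
# Sketch: pull <loc> entries from an XML sitemap into a plain text list
# for list mode. The sitemap URL is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NAMESPACE)]

with open("url-list.txt", "w") as handle:  # one URL per line
    handle.write("\n".join(urls))

print(f"Collected {len(urls)} URLs")
```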

6. How To Change Crawl Settings?

If you are having trouble crawling a website, it can be useful to change the crawl settings. They are fast and easy to update, so just follow these instructions:

Make Screaming Frog Crawl Faster:

If you wish to make Screaming Frog crawl faster, this can be done in Configuration > Speed. Just increase the maximum number of threads, and remove any URI limits.

However, if you crawl your website too quickly, you may get blocked. You can prevent this by whitelisting your IP address in your server settings.

Crawling faster with Screaming Frog

Make Screaming Frog Crawl Slower:

If your website is hosted on Shopify, or you can't whitelist your IP address, then you will want to crawl the website slowly. This can also be changed in Configuration > Speed, where you can limit the number of URI requests per second.

You can also reduce crawl speed by lowering the number of threads. By default there is a thread count of 5; reducing this to 1 or 2 will significantly slow down the crawl.

Crawling slowly with Screaming Frog

Ignore Robots.txt:

There are a few websites on the internet that have blocked all robots. This is normally done by mistake in the robots.txt, and if that is the case you will want to fix it as quickly as possible.

However, if this is a development website and you do not want it indexed, you will need to tell Screaming Frog to ignore the robots.txt.

To do this, change your settings in Configuration > Robots.txt > Settings. There is an option to Ignore robots.txt, which will allow you to crawl the website even if the robots.txt forbids it.

Ignoring the robots.txt with Screaming Frog

Change User-Agent:

Something I rarely need to do is change my user-agent for a crawl. The default is Screaming Frog, but you may wish to view the website the same way that Googlebot does. Therefore, changing your user-agent can sometimes be helpful.

To change this, open the settings in Configuration > User-agent. Here you will find a drop-down menu to select different presets.

However, you should be careful when using custom user-agents. If you regularly do server log analysis, your crawl requests will show up under that user-agent and can skew the data.

Changing user agent in Screaming Frog
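As a quick sanity check outside of Screaming Frog, you can fetch the same page with two different user-agents and compare the responses. This sketch uses the third-party requests library; the URL is a placeholder and the Googlebot string is simply an example of a crawler-style user-agent.

```python
# Sketch: fetch the same URL with the default and a Googlebot-style
# user-agent, then compare status codes and response sizes.
import requests

URL = "https://example.com/"  # placeholder
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

default = requests.get(URL, timeout=10)
googlebot = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)

print("Default UA:  ", default.status_code, len(default.content), "bytes")
print("Googlebot UA:", googlebot.status_code, len(googlebot.content), "bytes")
```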

7. Creating Sitemaps.

Having an XML sitemap is strongly recommended by Google. It helps them to crawl your site and find all your new content.

The first step to creating a sitemap is to crawl your website. Then you will need to generate a new file. To do this, select ‘Create XML Sitemap’ from the menu.

Generate XML Sitemap with Screaming Frog

This will open a popup asking for configuration. The default settings are fine to use, but you may wish to make changes.

For example, you may wish to change the frequency of the updates:

Changing Crawl Frequency in XML Sitemaps

Alternatively, you may wish to include canonicalised URLs:

Including canonicalised URLs in Screaming Frog

This is useful if you have added a canonical tag to every page of your website; without including them, you may get a notice that the sitemap is empty.
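Once the sitemap has been generated, a quick script can confirm it isn't empty and show how many URLs made it in. A small sketch, assuming the export was saved as sitemap.xml and uses the standard sitemap namespace:

```python
# Sketch: count the <url> entries in the generated sitemap.
# Assumes the file was saved as sitemap.xml in the current folder.
import xml.etree.ElementTree as ET

NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse("sitemap.xml")

locs = [loc.text for loc in tree.findall(".//sm:url/sm:loc", NAMESPACE)]
print(f"Sitemap contains {len(locs)} URLs")
for loc in locs[:5]:  # preview the first few entries
    print(" -", loc)
```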

8. How To Find Broken Links?

The most popular question is how to find broken links. Finding errors is how most people hear about Screaming Frog, and it’s the main thing they want to fix.

To find broken links, first select the 'Response Codes' tab. Then select the filter that says Client Errors (4XX).

If you have done this correctly, then you will see something like the screenshot below.

When you click on a broken link, the 'Inlinks' tab at the bottom will become populated. These are the pages that link towards the broken URL.

Finding 404 Errors in Screaming Frog
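Before fixing anything, it can be worth double-checking that the flagged URLs really do return a client error. A hedged sketch using the third-party requests library, with placeholder URLs standing in for your own flagged pages:

```python
# Sketch: re-check URLs that the crawl flagged as client errors.
# The URLs below are placeholders.
import requests

flagged_urls = [
    "https://example.com/old-page/",
    "https://example.com/deleted-product/",
]

for url in flagged_urls:
    status = requests.get(url, timeout=10, allow_redirects=False).status_code
    label = "client error" if 400 <= status < 500 else "not a 4XX"
    print(f"{status}  {label:12}  {url}")
```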

9. How To Check Redirects?

Using the same approach as above, select the filter that says Redirection (3XX). This will then provide you with all the internal redirects.

Whilst there's no harm in having redirects for visitors arriving from other sites, you should try to reduce them internally. Here's why this is important:

When an external website links towards a page that has moved, the redirect helps the user reach the right place, which makes it a good thing to have. However, relying on redirects for your own internal links is a different matter.

This is because a redirect requires the server to process an extra request, which adds to your page load time. Since you control your internal links, you should update them to point directly at the final URL.

Redirects in Screaming Frog
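To see the cost of a redirect for yourself, you can follow a chain hop by hop and time each response. A small sketch with the third-party requests library; the starting URL is a placeholder:

```python
# Sketch: follow a redirect chain and print each hop with its response time.
import requests

response = requests.get("https://example.com/old-url/", timeout=10)

for hop in response.history:  # each intermediate 3XX response
    print(f"{hop.status_code}  {hop.url}  ({hop.elapsed.total_seconds():.3f}s)")

print(f"{response.status_code}  {response.url}  "
      f"({response.elapsed.total_seconds():.3f}s)")
```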

10. Checking The Internal Links.

The fast and easy way to check internal links is by using Screaming Frog. This is because it crawls your entire website through internal links, and therefore collects tons of data about site navigation, anchor text, and alt attributes.

To check the internal links towards a page, simply search for the URL that you wish to review. The search box can be found on the right side of the main window.

Once you have found your URL, select the Inlinks tab at the bottom of the screen. This will then populate the detailed window with data, and you can begin to explore the links.

There are six columns, titled Type, From, To, Anchor Text, Alt Text, and Follow. However, the most important columns are From, To, and Anchor Text.

Reviewing internal links in Screaming Frog

11. Titles, Headings & Meta Tags.

The titles and headings are some of the biggest signals for revealing page relevance to Google. Therefore, you will want to use Screaming Frog to quickly analyse your content.

To get started, simply select the Page Titles tab. This will reveal all the HTML pages and their corresponding titles. The data can be filtered to reveal pages with missing, duplicate, short, long and multiple titles, which will help you find problems to fix quickly.

Page Title Optimisation.

To get the most out of your page titles, you should aim to have a single unique title for each page. Be sure to include your target keyword for that page, and avoid using the same keyword in multiple page titles.

Filtering Page Titles in Screaming Frog

Heading Optimisation.

My main tips are to make sure that you have included a single H1 tag per page, and at least 1 H2 tag per page. The headings are a great way to let Google know what your page is about, and they are visible to your user. Try to include your core terms and topically relevant headings.

Filtering Headings in Screaming Frog

Meta Description Optimisation.

These are not a direct ranking factor, but they are used by Google in its Search Engine Result Pages (SERPs). Therefore, it's worth making them short and catchy for the user. Include your core keyword in the description, as matching terms are often shown in bold when the searcher finds your page.

Filtering Meta Descriptions in Screaming Frog

Meta Keywords Optimisation.

These used to give Google important data about what your page covers. However, they have not been a ranking factor for almost 10 years, so I would suggest you avoid spending any time filling them out. If your pages include meta keywords, I would advise you to remove them.

Filtering Keywords in Screaming Frog

12. Images.

These are often overlooked by most SEOs, especially when there are bigger problems with the website. However, images are still a ranking factor and deserve some attention. It helps that Screaming Frog is great at identifying opportunities to improve them.
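For a single page, you can spot images with missing alt text without any tools at all. A minimal sketch using only Python's standard library; the URL is a placeholder:

```python
# Sketch: list <img> tags with missing or empty alt attributes on one page.
import urllib.request
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if not attributes.get("alt"):  # missing or empty alt text
                print("Missing alt:", attributes.get("src", "(no src)"))

with urllib.request.urlopen("https://example.com/") as response:
    html = response.read().decode("utf-8", errors="ignore")

MissingAltFinder().feed(html)
```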

13. HREFLang.

There are many great ways to check for hreflang tags, and Screaming Frog is one of them. These can be found under the Hreflang tab at the top of the screen.

When you are viewing this tab, there are many ways to filter the data. Using these filters will help you to find any problems.

Finding HREFLangs in Screaming Frog

14. Exclude Tool.

The exclude tool is used to exclude sections of your website from your crawl. In that sense, it functions a lot like a robots.txt file.

This can be useful when you are dealing with large websites. For example, if you want to exclude WordPress tag pages, you would add an exclude rule like this:

https://rowanseo.com/tag/.*

The .* is a regular expression that matches anything after /tag/, so every URL in that section is left out of the crawl. This approach can be useful if you want to block URL parameters or whole sections of a website.

Excluding URLs in Screaming Frog
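Because the exclude tool works on regular expressions, you can test a pattern against a few URLs before trusting it with a full crawl. A quick sketch with Python's re module; the test URLs are made up for illustration:

```python
# Sketch: test which URLs an exclude pattern would match.
import re

pattern = re.compile(r"https://rowanseo\.com/tag/.*")  # dot escaped here

test_urls = [
    "https://rowanseo.com/tag/seo/",
    "https://rowanseo.com/tag/screaming-frog/page/2/",
    "https://rowanseo.com/blog/screaming-frog-guide/",
]

for url in test_urls:
    result = "excluded" if pattern.match(url) else "crawled"
    print(f"{result:8}  {url}")
```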

15. Other Settings.

Once you have taken the trouble to crawl a large site, you may not wish to do it again each time. Simply save the crawl using the File menu. It can be opened again at a later date by double-clicking the saved file, or by opening Screaming Frog and choosing the load option from the same menu you used to save it.

The bulk export tool is useful for creating a spreadsheet with a single type of information. My favourite bulk options include export all anchor text, export all 3XX redirects, and export all 4XX errors. There are also options for internal links, images and directives. However, when I audit a website I find the above to be particularly useful.
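Bulk exports are plain CSV files, so they are easy to process further. A hedged sketch using Python's csv module; the filename and the 'Anchor' column name are assumptions for illustration, so check the header row of your own export and adjust:

```python
# Sketch: count the most common anchor texts in a bulk export.
# The filename and column name are assumptions; match them to your export.
import csv
from collections import Counter

anchors = Counter()
with open("all_anchor_text.csv", newline="", encoding="utf-8") as handle:
    for row in csv.DictReader(handle):
        anchors[row.get("Anchor", "").strip().lower()] += 1

for anchor, count in anchors.most_common(10):
    print(f"{count:5}  {anchor or '(empty anchor)'}")
```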

My name is Rowan Collins, and I am an SEO Specialist based in London. I started SEO back in 2016 after moving from an eCommerce company to an agency. Since then, I have enjoyed years of experience working on websites from a plethora of niches. I pride myself on my Christian beliefs and focus on helping others to improve at digital marketing.
