The right tools can make an SEO audit of a website much easier. One such tool is the Screaming Frog SEO Spider, which you can use to review a website and identify flaws that hurt its performance in search results. In this tutorial we’ll walk you through its most important features.
What is Screaming Frog?
The Screaming Frog SEO Spider is a small desktop application you can install locally on your PC, Mac or Linux machine. It crawls a website’s links, images, CSS and so on from an SEO perspective. In essence, it shows you what a search spider sees when it crawls a website.
This information allows you to quickly analyse, audit and review a site from an on-site SEO perspective. It can save you a ton of work, because manually analysing each page of a big website can be very challenging.
1. Download and Install
I’m going to explain some of the most frequently used features of the Screaming Frog SEO Spider via a small case study. It would be great if you could follow along, so start by downloading the tool here: http://www.screamingfrog.co.uk/seo-spider/.
It’s available for Windows, Mac and Ubuntu, and it’s completely free, though the free license is limited to crawling 500 URIs. Purchasing a license allows more URIs to be crawled (useful if you’re dealing with a large site) and unlocks a few extra features.
2. Crawling a Website
Ready to crawl your first website? Let’s go! I’ll be using the website Gero Wonen Meubelen as an example, but you can follow along using your own domain.
First of all we need to enter a URL to spider. Paste the root domain in the box and hit the Start button.
If you want to crawl additional subdomains (for example a blog on the URL ‘blog.website.com’), you need to check the Crawl All Subdomains box under Configuration > Spider.
Depending on the size of your website, the crawl can take a couple of minutes. Check the progress bar in the top right corner to see when it’s done, after which we can start analysing the results.
3. Check the Response Codes
I usually start by taking a look at the HTTP status codes under the ‘Response Codes’ tab. The most important HTTP status codes are:
- 200: OK
- 301: Permanent redirect
- 302: Temporary redirect
- 404: Not found
- 500: Server error
- 503: Unavailable
Start by checking all the redirects (301 and 302) and verify that you’ve used the correct version. Remember that 301s are permanent redirects, so they pass authority to the new page. This isn’t the case with a 302 redirect.
Next we can take a look at the 404 errors. This error appears when you try to visit a page that doesn’t exist, either because it has been deleted or renamed (without redirecting to the new URL). Solve these errors by implementing a 301 redirect to a relevant page.
Note: don’t forget to cross-check the crawl errors report in Google Webmaster Tools.
On this page you can also search for broken internal links. Click on a URL with a 4xx or 5xx status code and take a look at the In Links tab in the bottom window. Here you’ll see a list of all the pages that link to the broken URL.
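If you export the Response Codes report, you can also slice it up outside the spider. Here’s a minimal Python sketch, assuming a CSV export with ‘Address’ and ‘Status Code’ columns (check the header names in your own export before relying on them):

```python
import csv
import io
from collections import defaultdict

def group_by_status(rows):
    """Group (url, status) pairs by HTTP status class: 2xx, 3xx, 4xx, 5xx."""
    groups = defaultdict(list)
    for url, status in rows:
        groups[f"{int(status) // 100}xx"].append(url)
    return dict(groups)

# A stand-in for a real Screaming Frog CSV export -- the column names
# 'Address' and 'Status Code' are assumptions, so verify them first.
sample = io.StringIO(
    "Address,Status Code\n"
    "https://example.com/,200\n"
    "https://example.com/old-page,301\n"
    "https://example.com/missing,404\n"
)
rows = [(r["Address"], r["Status Code"]) for r in csv.DictReader(sample)]
print(group_by_status(rows))
# → {'2xx': ['https://example.com/'], '3xx': ['https://example.com/old-page'],
#    '4xx': ['https://example.com/missing']}
```

Anything that lands in the 4xx or 5xx buckets deserves a closer look.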
4. URL Structure

Start by checking the length of each URL by sorting the column. See if you can spot any unnaturally long ones. Remember that a good URL is short (preferably only four to five words) and descriptive (e.g. website.com/seo-tutorial instead of website.com/p5145).
Don’t forget to take a look at the filters for non-ASCII characters, underscores, uppercase, duplicate and dynamic URLs. These elements can all cause indexation problems, so it’s best to solve them as soon as possible.
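As a rough companion to these filters, here’s a small Python sketch that flags the same problems for a single URL. The five-word limit mirrors the guideline above and is a rule of thumb, not a hard rule:

```python
import re
from urllib.parse import urlparse

def url_issues(url, max_words=5):
    """Flag common URL-structure problems: underscores, uppercase,
    non-ASCII characters and overly long paths."""
    path = urlparse(url).path
    issues = []
    if "_" in path:
        issues.append("underscore")
    if path != path.lower():
        issues.append("uppercase")
    if not path.isascii():
        issues.append("non-ascii")
    # Count the words separated by slashes and hyphens
    words = [w for w in re.split(r"[/\-]", path) if w]
    if len(words) > max_words:
        issues.append("too long")
    return issues

print(url_issues("https://website.com/SEO_Tutorial"))
# → ['underscore', 'uppercase']
```

Run it over an exported list of URLs to get a quick fix-list per page.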
5. Page Titles
Next, we can take a look at the page titles. Every page should have a unique title, with the page’s most important keyword near the beginning. As you can see from this analysis, our website uses the same title on every page. This is something that needs to be fixed as soon as possible.
Remember the rule that a title shouldn’t be longer than 60 characters? Although character count is still a useful metric, we should actually look at the Title 1 Pixel Width column. As the name suggests, it measures the length of your titles in pixels instead of characters. Google truncates titles that are longer than 512 pixels. Certain letters and numbers take up more space than others (for example, I vs. W), so a title that is shorter than 60 characters might still be too long.
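To see why pixels matter more than characters, here’s a toy Python estimate. The per-character widths below are invented for illustration; they are not Google’s real font metrics:

```python
# Invented width classes -- purely illustrative, not real font metrics.
NARROW = set("Iiljtf1.,;:!|' ")
WIDE = set("mwMW@")

def approx_pixels(title, narrow=4, normal=9, wide=14):
    """Crude pixel-width estimate using narrow/normal/wide character classes."""
    return sum(narrow if c in NARROW else wide if c in WIDE else normal
               for c in title)

# Two 10-character titles with very different widths:
print(approx_pixels("IIIIIIIIII"))  # → 40
print(approx_pixels("WWWWWWWWWW"))  # → 140
```

Both titles pass a 60-character check, yet one is three and a half times as wide — which is exactly what the Pixel Width column accounts for.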
6. Meta Descriptions
Meta descriptions are used in the search engine result pages under the page title. It’s best to keep them under 160 characters. Longer descriptions will be truncated.
In our analysis we see that the Meta Description 1 column is empty, which means there are no meta descriptions on this website at all. We need to add a unique meta description to each page.
Screaming Frog has a handy new feature which allows you to simulate a search snippet and analyse it. Simply click on a URL and select the SERP snippet tab at the bottom of your screen. Change the title and meta description and see how it looks on a desktop, smartphone or tablet.
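If you want to preview the cut-off outside the tool, this Python sketch trims a description to 160 characters at a word boundary. Keep in mind that real SERP truncation is pixel-based, so treat the character limit as an approximation:

```python
def truncate_description(text, limit=160):
    """Trim a meta description at a word boundary and add an ellipsis.
    Real SERP truncation is pixel-based; this is a character-based sketch."""
    if len(text) <= limit:
        return text
    cut = text[:limit - 1]
    if " " in cut:
        cut = cut.rsplit(" ", 1)[0]  # avoid chopping mid-word
    return cut + "…"

long_text = "A very long meta description that goes on and on " * 5
print(truncate_description(long_text))
```

Anything the function trims off is text your searchers will never see — put the important part first.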
7. Images and Page Speed

Page speed is important, not only for mobile users but also for search engines. Google has stated that it takes page speed into consideration when ranking web pages. One of the things that can slow a page down is images: high-resolution images can take up a lot of bandwidth, which is why it’s best to keep them under 100kb.
Via the ‘Over 100kb’ filter you can easily export the images that are too big. Don’t forget to take a look at the alt text as well: make sure each image has a unique and descriptive alt text.
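If you have local access to the site’s files, you can run a similar check yourself. This Python sketch mimics the filter on a local folder; the extension list and the 100kb threshold are assumptions you can adjust:

```python
from pathlib import Path

def oversized_images(folder, limit_kb=100):
    """Return image files under `folder` larger than `limit_kb` kilobytes."""
    exts = {".jpg", ".jpeg", ".png", ".gif", ".webp"}
    return sorted(
        p for p in Path(folder).rglob("*")
        if p.suffix.lower() in exts and p.stat().st_size > limit_kb * 1024
    )

# Example: oversized_images("wp-content/uploads") on a local copy of the site.
```

Every file it returns is a candidate for compression or resizing.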
Tip: you can see the speed of each page in the Response Codes tab under Response Time. Check for pages that stand out and use a tool such as YSlow or PageSpeed Insights to learn more about the loading time.
8. Directives

In the Directives tab you can see information about the meta robots tag, canonical links and rel=next/prev annotations. Use one of the filters to quickly view all pages with a certain type of directive. As you can see, this report for Gero Wonen Meubelen is empty, meaning that we need to implement the appropriate tags where necessary.
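For reference, these are the kinds of tags the Directives tab reports on. The URLs are placeholders, not pages from the example site:

```html
<!-- Keep a page out of the index while still following its links -->
<meta name="robots" content="noindex, follow">

<!-- Point duplicate pages at the preferred version -->
<link rel="canonical" href="https://www.example.com/sofas/">

<!-- Tie a paginated series together -->
<link rel="prev" href="https://www.example.com/sofas/page/1/">
<link rel="next" href="https://www.example.com/sofas/page/3/">
```

All of these belong in the `<head>` of the relevant page.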
9. XML Sitemap
You can use the Screaming Frog SEO Spider to generate an XML Sitemap. Search engines use this type of Sitemap to double-check whether they have found all of your relevant content; it’s a roadmap that helps them crawl your website.
You’ll find this feature in the top navigation bar under Sitemaps > Create XML Sitemap. Once the file has been created, you can manually modify the priority and change the frequency of certain pages.
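The generated file follows the standard Sitemap protocol and looks roughly like this (example.com and the frequency/priority values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/seo-tutorial</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Upload the finished file to your site’s root and submit it via Google Webmaster Tools.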
Conclusion

With the Screaming Frog SEO Spider you can analyse several on-site elements, such as page titles, meta descriptions, URL structure, response codes and images. It’s a great tool to help you optimise a website and boost its performance in the search result pages. On top of that, the free version covers crawls of up to 500 URIs, so it deserves a place in every web designer’s toolbox!