One of the first things you should do after launching a website is create a Google Search Console account. Why? Because it helps you understand how Google crawls, analyses and indexes your website. It’s a great tool to help you discover problems which might hurt your rankings or user experience.
In this article we’ll run through the process of using Search Console, whilst highlighting all the relevant features.
Note: Up until May 20, 2015 Google Search Console was known as “Google Webmaster Tools”. It became apparent that the term “Webmaster” was outdated, hence the rebranding.
In order to use Google Search Console you need to verify ownership of the domain you’re analysing. There are several verification methods:
- Add an HTML tag to the <head> of your site
- Sign in to the domain name provider
- Use Google Analytics
- Use Google Tag Manager
- Upload an HTML file
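The HTML tag method, for example, adds a verification meta tag to the <head> of your homepage. A minimal sketch — the content token below is a placeholder; Search Console generates a unique one for your site:

```html
<head>
  <!-- Verification tag provided by Google Search Console (token is a placeholder) -->
  <meta name="google-site-verification" content="your-unique-token-here" />
  <title>Example Site</title>
</head>
```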
Verifying with more than one method will make your ownership more resilient.
I also suggest verifying both the www and non-www versions of your website URL. Adding as many versions as possible ensures that you’ll see all errors and problems. Once you’ve verified your website, Google will be able to crawl it and report back.
As the name suggests, the Search Appearance section tells you how your website is shown in the search results. Its appearance is influenced by a number of factors and can have a significant influence on your click-through rate.
On this page, Google gathers all the Structured Data that was found. The graph shows the number of Structured Data elements and markup errors. Download the report to fix any problems.
This report works really well in combination with the Structured Data Testing Tool. Identify pages with errors, put them in the testing tool and use the suggestions to solve any issues.
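As an illustration, here’s a minimal schema.org snippet (using microdata) of the kind this report checks — the product name and price are made up:

```html
<!-- Hypothetical example: a Product marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="19.99">$19.99</span>
    <meta itemprop="priceCurrency" content="USD" />
  </span>
</div>
```

Paste a page containing markup like this into the testing tool and it will list any missing or malformed properties.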
Want to use Structured Data but don’t have access to the backend? There’s a solution; the Data Highlighter. This point-and-click tool is one of the easiest ways to implement Structured Data.
Start with the URL of a typical page on your website, select the type of information and start highlighting. When you highlight an element, use the context menu to select a tag.
All the highlighted information can be found in the right sidebar. And if you make an error, hit the clear tag button.
Information which isn’t visible on the page can be entered manually. Use the gear icon and select add missing tags.
The HTML Improvements page collects all errors relating to meta descriptions, titles and non-indexable content. This report is mostly used to search for duplicate content, something frowned upon by Google. For more information about duplicate content, take a look at an earlier article.
Sitelinks are used to help users navigate a website. They are effectively shortcuts to deeper pages which might be relevant to the user. For example: when I search for ‘Amazon’, I see Sitelinks to gift cards, the Amazon Appstore, books and Kindle ebooks.
If you think that a sitelink is inappropriate or incorrect, you can demote it. Select a page and enter the URL that should be removed.
Okay, now it’s getting interesting. Want to know where your visitors come from, which domains link to your website and what your internal link structure looks like? The Search Traffic section has the answers.
Note: Search Analytics was previously known as Search Queries.
This is one of the most popular reports in Google Search Console. It gives you insights into the organic traffic from Google. You can see popular queries, pages, countries and devices.
The various filters give you a better understanding of your website’s performance in Google’s search results. For example:
- See how many visitors used “Image Search” to visit your site.
- Compare the average CTR of desktop and mobile.
- Check the average position of certain pages.
- Compare the number of visitors from “country x” with the same period last year.
Links to your Site
There are plenty of services which analyse the link profile of a website, such as Ahrefs, Open Site Explorer and Majestic SEO. These are premium tools that require a monthly fee to access all their features.
The links report in Google Search Console is a great, free alternative. It shows linking domains, anchor text and your most linked-to pages. Download the complete table via the more button, which is useful for some Excel magic.
Both external and internal links can help a page rank higher. Via the internal links report, you can identify the pages that receive the most internal links. Click on a URL for the full list.
The Manual Actions page is the worst nightmare of every SEO professional. Here you’ll find information about any Google penalties which are currently in effect. For most people this page will be blank, but if you happen to be penalized, you’ll find a message on this page.
There are several issues that can trigger a penalty, including:
- Unnatural links to your site
- Hacked pages
- Thin content with little or no added value
- Cloaking and/or sneaky redirects
The steps needed to solve these problems are beyond the scope of this article, but there are plenty of resources that can help you. Once you’re done, don’t forget to file a reconsideration request.
Plenty of websites try to attract international visitors with tailored content. The hreflang markup should be used to identify the language and geographical targeting of each page.
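For example, an English page with a German alternate might declare annotations like these (the URLs are placeholders):

```html
<!-- Placed in the <head> of every language version of the page -->
<link rel="alternate" hreflang="en" href="https://example.com/en/page" />
<link rel="alternate" hreflang="de" href="https://example.com/de/seite" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Note that each language version should list all the alternates, including itself, and the annotations must point at each other reciprocally.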
If implemented incorrectly, the hreflang tag can create some problems. For this reason it’s a good idea to check the Google Search Console International Targeting report from time to time.
The second tab (country) is only available for generic domain extensions, such as .com, .net and .org. If you want to target users in a specific country, use this option. Country-specific domains (such as .de and .es) are automatically associated with the corresponding country.
Google recently announced that, starting April 21, 2015, mobile-friendly design will be used as a ranking signal. As you can imagine, the Mobile Usability report has since become quite popular.
If this report is empty, your website is optimized for mobile users. Errors are displayed in a handy list, together with an overview of all the pages that are affected.
Tip: use the Mobile-Friendly Test to check if a page has a mobile-friendly design.
Under this section you can see how many pages of your website have been added to Google’s index and remove unwanted URLs.
The Index Status shows the number of pages that are indexed. These are URLs that Googlebot can access; URLs which are blocked by the robots.txt file or require a login are not displayed.
The Content Keywords report shows the most important (and frequently used) keywords of your website. Interesting, but perhaps not the most useful report available here.
On this page you’ll find all pages that are blocked by robots.txt rules.
The robots.txt file can be used to prevent pages from being added to Google’s index. But what if a page is already in the index and you want to remove it as soon as possible? In this case you should use the Remove URLs feature of Search Console. Don’t forget that only site owners and users with full permissions can request removals.
First of all, you need to indicate that it’s OK to remove the URL by blocking the page via the robots meta tag or robots.txt file. Check whether the URL is correctly disallowed before submitting a removal request.
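A sketch of the meta tag option — this goes in the <head> of the page you want removed:

```html
<!-- Tells crawlers not to index this page -->
<meta name="robots" content="noindex" />
```

Alternatively, a Disallow rule for the page’s path in your robots.txt file achieves the blocking; an example of that syntax appears in the robots.txt section below.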
As well as single URLs, you can remove entire directories. The removal process for directories works in pretty much the same way.
Made an error? Don’t worry, because you can cancel removal requests for any site you own at any time.
As we mentioned earlier, before your pages can be indexed, your site needs to be crawled. All the information about this process can be found here.
The Crawl Errors report should be checked regularly. It shows all the errors Googlebot encounters whilst crawling the pages of your website. Response codes (404, 403, etc.) can be subdivided by device; desktop, mobile or feature phone. Don’t forget to mark fixed items to keep things organised.
This page shows the number of pages that have been crawled over the last 90 days, in addition to the time spent downloading and the download size (in KB).
Ideally, you want to minimize time spent downloading and download size, whilst maximizing the pages crawled per day. If pages crawled per day decreases, but time spent downloading increases, check your site for performance issues.
Fetch as Google
“Fetch as Google” is a diagnostic tool that allows you to simulate how Google renders a certain page.
robots.txt Tester
The robots.txt Tester is pretty self-explanatory. Use it to test new robots.txt rules and check for errors. All types of Google crawlers (Googlebot, News, Images, Video, Mobile, Mediapartners and AdsBot) are available.
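A minimal robots.txt to test might look like this (the directory names are hypothetical):

```text
# Block all crawlers from a hypothetical admin directory
User-agent: *
Disallow: /admin/

# Give Googlebot-Image an extra, hypothetical restriction
User-agent: Googlebot-Image
Disallow: /private-images/
```

Paste rules like these into the tester, then enter a URL to check whether the chosen crawler is allowed to fetch it.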
In our SEO Fundamentals series we advised creating an XML-Sitemap. You can notify Google of the location of this Sitemap via Google Search Console. Use the button in the top right corner to add or edit a Sitemap (enter the URL). Afterwards you can see the difference between the number of submitted and indexed pages. Sitemap errors can also be found on this page.
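For reference, a minimal XML Sitemap has this shape (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2015-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```

Each <url> entry needs a <loc>; elements such as <lastmod> are optional.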
Google tries its best to understand URL parameters–and most of the time, it’s very good at doing so. But occasionally, it needs some help understanding certain parameters.
When this is the case, add a parameter (case sensitive) and indicate if it changes the page content seen by the user. For a tracking parameter you can select no. For parameters that reorder products, for example, you can select yes.
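To illustrate with hypothetical URLs: a tracking parameter doesn’t change what the user sees, while a sorting parameter does:

```text
# Same content, different tracking value -> select "no"
https://example.com/shoes?sessionid=abc123
https://example.com/shoes?sessionid=xyz789

# Reordered product list -> select "yes"
https://example.com/shoes?sort=price_asc
```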
If you’re a site owner and Google shows a warning next to your site (such as “This site may be hacked” in the search results), it’s likely that your site has been hacked. To find out more, go to Google Search Console and check the “Security Issues” page. You’ll find more information about the type of problem and suggestions for how to fix it.
This is arguably one of the most important features of Google Search Console and is somewhat ‘hidden’ behind the icon in the top right corner.
Here you can set a preferred domain. You can choose between the www and non-www version. To do this, you may need to verify ownership of both domains.
Why is this feature useful? It tells Google which version of your domain to show in the search results and stops the www and non-www versions from being treated as duplicates.
You can also adjust the crawl rate. By default, Google tries to crawl as many pages as possible without overwhelming your server’s bandwidth. However, it is possible to change the crawl rate via Google Search Console. The new custom crawl rate will be valid for 90 days.
Change of Address
Moving your primary website to a new domain? Don’t forget to use the change of address feature, which helps Google update its index.
Google rebranded its Webmaster Tools to reflect the changing landscape of website professionals. The tools available under the newly-named “Search Console” are simple, but extremely useful when setting up a recently launched website. And even once a website is well established, Search Console remains a great way to optimize its search performance.