Step-by-step guide: Google Webmaster Tools

With the latest updates from Google, competition to get your business onto the first page, and into one of the top positions, is getting tougher and tougher. To stay ahead of your competitors it is important to optimise your website efficiently.
Google Webmaster Tools is one of the best (and free!) SEO tools from Google and can help improve your website rankings. By using it you can make sure that your online strategy is consistent with your overall business goals.

New Features

Recently there have been a few improvements that have made this tool more powerful. The integration of +1 gives webmasters better information about users, and data can now be downloaded in CSV format, which makes analysis easier for webmasters and SEO professionals and lets them make better decisions based on this data. Maile Ohye, Developer Programs Tech Lead at Google, posted the following video on the Google Webmaster Blog, explaining how Google Webmaster Tools can help businesses get better SEO results.

[youtube=https://youtu.be/tQQmq9X5lQw?w=560&h=316]

In this post I will explain, step by step, how you can set up and use the tools in Google Webmaster Tools. Let us start with the first step.

Setup

Correct setup of Google Webmaster Tools is very important, and anyone can do it in three easy steps.

Step 1: Google Account

You will need a Google account to start. If you already have one, you can sign in to Google Webmaster Tools at https://www.google.com/webmasters.

Step 2: Add Website

On the home page of Google Webmaster Tools you will see a welcome message with an option to “add a site”. Click this link and it will bring up a site submission box. Enter the domain of your website; it is always recommended to add the site’s www version. You can use Google Webmaster Tools as soon as your website is published, and you can add a number of properties under the same account.

Step 3: Owner Verification

Once you have added a website to Google Webmaster Tools, you have to verify that you own the web property you wish to track. This step stops unauthorised people from gaining control of your website’s data.

This can be done in one of the following four ways:

HTML Code (Recommended): This method verifies your site through an HTML meta tag. During setup, Google gives you the tag and asks you to paste it into the head section of your website’s home page.

It will look something like this:
<meta name="google-site-verification" content="rxx_-xxxxxxxxxxxxxxxxxxxxxxxxxxxx" />

And place this tag as below:

<html>
<head>
<meta name="google-site-verification" content="rxx_-xxxxxxxxxxxxxxxxxxxxxxxxxxxx" />
<title>My title</title>
</head>
<body>
page contents
</body>
</html>

When the tag is live, press “Verify” to get ownership control.
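If you want to confirm the tag is live before clicking “Verify”, a quick check like the following can help. This is a minimal sketch only: the URL and token are placeholders, and the actual verification is of course performed by Google itself.

```python
import urllib.request

def has_verification_tag(html: str, token: str) -> bool:
    """Return True if the google-site-verification meta tag with the
    expected token appears in the page source."""
    return 'name="google-site-verification"' in html and token in html

# Replace the URL and token with your own before running:
# html = urllib.request.urlopen("https://www.example.com/").read().decode("utf-8")
# print(has_verification_tag(html, "rxx_-xxxxxxxxxxxxxxxxxxxxxxxxxxxx"))
```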

If you cannot do that (due to limited access or any other reason), there are alternative ways of completing verification.

HTML File upload: Webmaster Tools gives you a file to download. Place it in the root directory of your website; once it is in place, you will have the option to verify.

Google Analytics: This method works well if you have Google Analytics installed on your website under the same Google ID. It hooks both accounts together.

Domain Name Provider: If you know your domain name provider, you can use this method. Choose your provider from the drop-down menu and then verify the selection.

Once your site is verified it can take up to 24 hours to start pulling the data.

Once you start getting the data, there will be different kinds of information available. I will go through all of them one by one explaining how this information can be useful for the SEO.

Once you choose a web property on the home page, you will see the following options:

Messages

This section shows any messages from Google. Usually Google will inform you about account changes, Analytics owner changes and warnings through this channel. It is very important to check these from time to time so that you stay up to date with the latest communication from the search engine. Google will contact the business if they find anything unusual within the account, so this kind of information can be very important. These messages can also be forwarded to your email through the preference options, which allows you to get updates without logging in to the Webmaster console.

Dashboard Status Bar

This gives you a quick status overview of your website in graph form, covering potential crawl errors, search queries and sitemaps.

Crawl Errors Graph

These are errors reported by Googlebot, and can be caused by DNS problems, server connectivity or the robots.txt file. This section verifies whether Google is indexing your site properly, and you can find the source of each error. Fixing errors helps in two ways –

  1. It reduces bounce rate.
  2. It allows link value (PageRank) to pass on.

Search Queries Graph

This section shows you how many impressions your site has had and which search queries brought users to your website. It shows your number of impressions, clicks and queries over a certain time frame, allowing you to invest your time in the products that are in demand. Remember, as an SEO you cannot create demand for a product. Out of all the search queries, you can star the ones you think are relevant and track them separately.

Note:  This data is not 100% precise. This should be taken as an indication only.

Sitemaps Graph

This shows you a graphical representation of the URLs in Google’s index compared with those submitted via sitemap.xml. You might find, for example, that 25 URLs were submitted but only 19 were indexed by Google.

Configuration

This section allows you to change settings, site links, URL Parameters, change of address and manage users.

Settings

The settings section allows you to customise the geographical target, preferred domain and crawl rate of your website.

Geographical Target

This is the place where you can choose your geographical target for Google. In this particular instance I have used Australia. If your site has a neutral top-level domain, such as .com or .org, geo targeting helps Google determine how your site appears in search results, and improves Google search results for geographic queries. If you don’t want your site associated with any location, select ‘Unlisted’.

Preferred Domain

You can choose how you want your website to be displayed, with www or without www. If you specify your preferred domain as https://www.example.com and Google finds a link to https://example.com, they will consider both links the same.
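On the site itself, the preferred domain is usually enforced with a redirect to one host name. As a rough illustration of the idea (not Google’s own behaviour), a small helper can rewrite URLs onto the preferred host; `www.example.com` below is a placeholder for your own domain.

```python
from urllib.parse import urlsplit, urlunsplit

def to_preferred_domain(url: str, preferred_host: str = "www.example.com") -> str:
    """Rewrite a URL so it uses the preferred host (www vs non-www).
    The default host is a placeholder - use your own domain."""
    parts = urlsplit(url)
    # Work out the bare (non-www) form of the preferred host.
    bare = preferred_host[4:] if preferred_host.startswith("www.") else preferred_host
    if parts.hostname in (bare, "www." + bare):
        parts = parts._replace(netloc=preferred_host)
    return urlunsplit(parts)

print(to_preferred_domain("https://example.com/page"))
```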

Crawl Rate

You can change your crawl rate but it is usually recommended to let Google choose the crawl rate. You can change the crawl rate (the speed of Google’s requests during the crawl) for sites at the root or subdomain level – for example, www.example.com and https://subdomain.example.com.

Site Links

This is a very interesting section. It allows you to control what is shown as a site link in Google. Site links are automatically generated by Google and show the most authoritative or popular content of your website.

Site links are automatically generated links that may appear under your site’s search results. Google does not allow users to add site links, but if you feel a particular page on your website is shown as a site link and should not be there, you have the option to demote it. Only site owners and users with full permissions can demote site links. To do this, just enter the URL of the page you want to demote, and it will stop appearing as a site link the next time Google regenerates them.

URL Parameters

These parameters should only be set if you are very sure about how you use them, as they can affect the number of pages shown in search results. Google does not want to crawl content it has already crawled before, so this option, used properly, can limit the number of pages Google bots crawl; by using parameters you help Google crawl your website efficiently. For a detailed guide, see Google Support. When Google detects duplicate content, such as that caused by session IDs, it groups the duplicate URLs into one cluster and selects what it thinks is the “best” URL to represent the cluster in the search results.
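The idea behind URL parameter handling can be sketched in a few lines: parameters that do not change the page content are dropped, so duplicate URLs collapse into one canonical form. The parameter names below are assumptions; adjust the list to whatever your site actually uses.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that do not change page content - session IDs, tracking tags.
# This list is illustrative only; use your site's own parameter names.
IGNORABLE_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url: str) -> str:
    """Drop parameters that only create duplicate URLs, keeping the rest."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in IGNORABLE_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url("https://www.example.com/shop?item=5&sessionid=abc"))
```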

Change of address: If you decide to change the URL of your website, it is very important to migrate it correctly. You want the whole migration to run smoothly, with minimal traffic and ranking loss (I found a detailed article on SEOMOZ very useful here). The change of address option gives you an easy transition of your links and authority if you decide to transfer your site to a new domain: essentially, it helps you permanently redirect your site to a new URL.
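The redirects behind a domain move follow a simple rule: every old URL should 301-redirect to the same path and query on the new domain. A minimal sketch of that mapping, with placeholder domains:

```python
from urllib.parse import urlsplit, urlunsplit

def moved_url(old_url: str, new_host: str = "www.newdomain.com") -> str:
    """Where an old URL should 301-redirect after a domain move: same
    scheme, path and query, new host. The host is a placeholder."""
    parts = urlsplit(old_url)
    return urlunsplit(parts._replace(netloc=new_host))

print(moved_url("https://www.olddomain.com/about?x=1"))
```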

Users: This option allows you to add multiple users to the current account, and owners can change the permissions of other users. This lets more than one webmaster get information without going through the verification process mentioned at the start of this article.

Health: This section allows you to see how your website is performing in search engines. This gives you detailed information on your website’s crawl errors, crawl stats, blocked URLs and malware (if any). You can also see how Google fetches your site through “Fetch as Google” option. I will go through these one by one:

Crawl Errors: The crawl error area shows data from the last 90 days for any crawl errors that happened on your site. It tells you the number of errors found and the source of that error so you can fix problems, like a 404 error page.

Crawl Stats: This area gives you all the information regarding the number of pages crawled per day, size of data downloaded per day by Google Bots and time spent for those downloads. This data can be useful if you need to research in detail about robot activity on your domain.

Blocked URLs: If your website has content you don’t want Google or other search engines to access, use a robots.txt file to specify how search engines should crawl your site’s content. This area shows whether that functionality is working properly.
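You can also test a robots.txt file locally with Python’s standard library before relying on the Blocked URLs report. The rules and URLs below are just an example.

```python
from urllib.robotparser import RobotFileParser

# An example robots.txt that blocks one directory for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://www.example.com/private/report.html"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/blog/post.html"))       # True
```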

Fetch as Google: “Fetch as Google” lets you submit particular URLs that you feel Google has not crawled yet (there is a limit on the number of URLs you can request). Here you can also check that redirects and dynamic pages are working properly.

Your site might look different to search engine robots. You can use this feature to check how a page will look to Google by entering the URL you want to check.

Malware: This section tells you if Google found any spam or malware while crawling your site. This information is very useful if you are in the process of clearing malware from your domain.

Traffic: This is the most important and useful section for the owner, webmaster or SEO professional of the website. It shows information on traffic and is very useful for anybody who wants to know which search queries are used, which links point to your domain, internal link information and Google Plus reports. Just keep in mind that the figures here should be taken as indicative; the data is not 100% accurate.

Search Queries: This lets the owner know the most popular content or products on the website. It details the keywords that are getting the most clicks, including the click-through rate and the average position of your website for each term, and there is an option to download this data in CSV format.
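Once you have downloaded the data, a small script can compute click-through rates per query. The rows below are hypothetical, and the real export has more columns, but the calculation is the same.

```python
# Hypothetical rows mirroring the export: (query, impressions, clicks).
rows = [
    ("digital agency brisbane", 1200, 96),
    ("seo guide", 800, 12),
]

def ctr(impressions: int, clicks: int) -> float:
    """Click-through rate as a percentage, rounded to one decimal."""
    return round(100.0 * clicks / impressions, 1) if impressions else 0.0

for query, impressions, clicks in rows:
    print(f"{query}: {ctr(impressions, clicks)}% CTR")
```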

Links to your site: This shows which other domains link to your website, and is the most accurate way to know how many links you have. You can also see the most linked content on the website. These links are one of the major ranking factors in SEO: each link to a web page is counted as a vote, but not all votes have equal weight. A link from a high-quality site on the same topic as the publisher’s site is worth more than a link from a site on an unrelated topic. Google Webmaster Tools gives you this comprehensive data, also offered as a downloadable table, which is very useful in analysing and building a strong SEO strategy.

Internal Links: This lists the URLs that are linked to internally within the website.

+1 Report: This new feature in Google Webmaster Tools analyses the impact of Google Plus on search. It includes search impressions, activity and audience. If you want to know how +1’s are affecting your search performance, these reports provide valuable information about their impact on search results, click-through rate and more.

Search Impact: This tells you how Google Plus has helped in getting impressions for a particular URL. Impressions and clicks can be compared easily with a table. This feature lists the pages on your site that received the most impressions with a +1 annotation, and allows you to see how +1 annotations impact click through rate (CTR).

+1 data is used as a ranking factor and it helps Google personalise the search results for users. After you +1 a webpage, there’s a chance that your vote could be attached to a search result as a +1 annotation (when your social connections search Google). The +1 annotation can help your social connections understand that you “recommend” that content.

Activity: This shows you which content has the most +1s. This will show you fresh +1’s to your site during the time frame selected. For example, you can view the number of +1’s from your own buttons on your site versus +1’s from the search results or on ads. You can also click a toggle button to reveal all +1’s to the site in aggregate.

Audience: This tells you the number of new users who +1’d your content during a set time period. Once you have enough +1’s, it shows more characteristics of those users: a trending graph of the number of users who have +1’d your content, graphs and charts for gender and age, and a button at the top of the report showing the location of those users. The better you know your audience, the better (and more informed) your decisions will be.

Optimisation: This section allows the owner of site to review the optimisation efforts done on the site. You can get full information about Sitemaps, removed URLs, HTML improvements and content keywords.

Sitemaps: This section allows you to submit a Sitemap for your website. It warns you if Google is unable to reach particular content on the website, shows the date the Sitemap was submitted, and gives stats on the number of URLs submitted versus the number of URLs indexed. When URLs are not indexed, they appear as warnings in this section.

These warnings allow the owner to rectify any server issues with the site; once fixed, this can help SEO in a big way.
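A Sitemap itself is a simple XML file listing your page URLs. As a minimal sketch (with placeholder URLs), one can be generated like this:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder pages - replace with your own URLs.
xml = build_sitemap(["https://www.example.com/", "https://www.example.com/about"])
print(xml)
```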

Remove URLs: This tells you which URLs have been removed using robots.txt, and allows you to stop Google visiting pages you do not want crawled.

HTML Improvements: This gives you a summary of any HTML problems your site might have, with detailed information about duplicate title tags, duplicate meta descriptions, long or short meta descriptions, and any missing title tags.

This will help you find and fix duplicate content and uncover basic SEO issues which, when resolved, can help your website rank better.
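The duplicate-title check this report performs is easy to reproduce on your own crawl data. A sketch, with hypothetical page titles:

```python
from collections import defaultdict

# Hypothetical crawl output: page URL -> <title> text.
titles = {
    "/": "Acme Widgets",
    "/products": "Acme Widgets",
    "/contact": "Contact Acme",
}

def duplicate_titles(titles):
    """Group pages sharing the same title tag - candidates for rewriting."""
    by_title = defaultdict(list)
    for page, title in titles.items():
        by_title[title].append(page)
    return {t: pages for t, pages in by_title.items() if len(pages) > 1}

print(duplicate_titles(titles))
```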

Content Keywords: The Content Keywords section lists the most significant keywords, and their variants, that Google found while crawling your site, along with the level of significance of each keyword for the domain. If unusual keywords appear in this section, it is a clear sign that your site might have been hacked.

If keywords you are expecting are missing, it may be because Google is not crawling and indexing all of the pages of your site.
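A rough approximation of this report is a word-frequency count over your pages’ visible text. A minimal sketch (the stopword list is illustrative only, and Google’s actual weighting is more sophisticated):

```python
import re
from collections import Counter

STOPWORDS = {"the", "and", "a", "of", "to", "in"}  # small illustrative list

def top_keywords(text: str, n: int = 5):
    """Most frequent non-stopword terms in a page's text."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(n)

print(top_keywords("The widget guide: widget care and widget repair"))
```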

Rich Snippets Testing Tool: Use the Rich Snippets Testing Tool to check that Google can correctly read your structured data mark-up and display it in its search results. It can also be used to preview how your website will look in search results.

To sum it up

Google Webmaster Tools is a very useful tool for search engine optimisation. It helps detect and resolve SEO issues your site might have, and with it webmasters and SEOs can easily check the health of their website. By resolving the issues it detects, a website can be optimised efficiently, which can result in better rankings for the right keywords. The better the rankings, the more likely an increase in traffic and conversions.
