We’ve updated this guide to include new ways to set up your website with Google Webmaster Tools, the new data Webmaster Tools provides about your website, important data you might have forgotten about, and how to continually monitor for any issues that might affect your search engine rankings.
Setting Up Your Website with Webmaster Tools
If you haven’t already, the first thing you will need to do is set up your website with Webmaster Tools.
To do this, visit the Google Webmaster Tools website and sign in with your Google Account, preferably the one you are already using for Google Analytics.
Click the red Add Property button to begin.
Next, you will have to verify this site as yours.
Previously, this involved having to embed code into your website header or upload an HTML file to your web server.
Now, if you already have Google Analytics, you can verify your site by connecting Webmaster Tools to Google Analytics.
To do so, select the Use your Google Analytics account option.
Once your site is verified, you will want to submit a sitemap if you have one available.
This is a simple XML file that will tell Google Webmaster Tools what pages you have on your website.
You’ll find the option to add a sitemap under the “Crawl” section of the sidebar menu.
To create a sitemap if you don’t already have one, you can use online tools like XML Sitemaps.
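Whether you generate it or write it by hand, a minimal sitemap is just a short XML file. Here’s a sketch (the URLs are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to know about -->
  <url>
    <loc>http://yourdomain.com/</loc>
    <lastmod>2016-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://yourdomain.com/about/</loc>
  </url>
</urlset>
```

Only the `<loc>` element is required for each URL; `<lastmod>` and `<changefreq>` are optional hints to crawlers.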
If you are running a website on your own domain using WordPress, you can install the Google XML Sitemaps plugin.
Once you have activated the plugin, look under your Settings in the WordPress dashboard and click on XML-Sitemap.
The plugin should have already generated your sitemap, so there’s nothing else you have to do.
You’ll find your URL at the very top of the page:
Copy the link address and head back over to your Webmaster Tools page.
Then paste the portion of the URL that comes after http://yourdomain.com/ into the box to submit your sitemap to Google Webmaster Tools.
It may take a few days for Webmaster Tools to start pulling information about your website if you are setting up your website on Webmaster Tools for the first time.
Be sure to wait a bit, then continue on to see what you can learn from Webmaster Tools.
Valuable Information within Webmaster Tools
Once you have data in Webmaster Tools, you will be able to view the following about your website.
These are only the highlights: the newer types of data within Google Webmaster Tools, plus the most important reports you should remember to check occasionally.
When you visit your website in Webmaster Tools, you will first come to your dashboard.
This is an overview of the important data within Webmaster Tools. You can visit specific areas such as your Crawl Errors, Search Analytics, and Sitemaps from this screen by clicking on the applicable links.
You can also navigate to these areas using the menu in the left sidebar.
In the left sidebar, the first option you’ll see is Search Appearance.
Reviewing the reports here is one way to check how your site’s SEO elements appear to Google.
The Structured Data tab will send you to a page that looks something like this:
This gives you insight into how Google is viewing the content on your page, and if there are any errors that stand out.
You can find out more about how Structured Data works in your favor with Google’s helpful guide.
Rich cards are a form of structured data you can add to your site’s HTML.
This helps the search engine to better understand what’s on your webpage and display it in the results.
You’ll often see them displayed when you look up a book, movie, or even a recipe:
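Under the hood, rich cards are built on structured data markup you add to your pages. As a rough sketch, a recipe page might include a JSON-LD block like this (the schema.org Recipe type is real; the values here are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Recipe",
  "name": "Simple Banana Bread",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "image": "http://yourdomain.com/images/banana-bread.jpg",
  "totalTime": "PT1H",
  "recipeIngredient": ["3 ripe bananas", "2 cups flour"]
}
</script>
```

Google reads this block alongside your visible content and can use it to build the card shown in search results.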
Google has another guide to help you discover ways to create Rich Cards for your own site.
When you do, you’ll come here to analyze and manage their performance.
Google’s Data Highlighter is a point-and-click tool that allows you to manually tell Google what to emphasize in its search engine results.
It’s an easier method of creating the clean callouts you see from professional sites.
This video gives a quick rundown of how it works and how you can start using it immediately.
This page gives you a brief rundown of any HTML issues detected by Google.
You’ll always hope to see this message:
That means your updates, changes, and overall structure are performing as expected.
If issues do turn up, Google uses this report to point you in the right direction for revisions and updates.
Accelerated Mobile Pages
Accelerated Mobile Pages (AMP) is an open-source initiative, started by Google, to provide fast-loading mobile pages that work even on slow connections.
You can go here to get started creating your first page if you haven’t got one already.
You’ll be given a boilerplate piece of coding that you can customize to your site.
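That boilerplate looks roughly like this. This is a trimmed sketch with placeholder URLs; the full version (including the required `amp-boilerplate` CSS, omitted here for brevity) is in AMP’s own documentation:

```html
<!doctype html>
<html amp>
  <head>
    <meta charset="utf-8">
    <!-- The AMP runtime script is required on every AMP page -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <!-- Points back to the regular (non-AMP) version of this page -->
    <link rel="canonical" href="http://yourdomain.com/regular-page.html">
    <meta name="viewport" content="width=device-width,minimum-scale=1">
    <!-- Required amp-boilerplate style block goes here (omitted for brevity) -->
  </head>
  <body>Hello, AMP world.</body>
</html>
```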
When you set it up, Webmaster Tools will be able to break down your pages and show a chart of successfully indexed pages as well as any AMP-specific errors.
The Search Traffic section is the second option of dropdowns you can select. Here, you’ll find a breakdown of how your site is performing from a variety of angles and where you can improve in the future.
This section of your Google Webmaster Tools provides an overall display of clicks, impressions, click-through rate, and your average position in search results.
Why is this now more important than ever?
It gives insight into the top keyword searches in which your website appears. It also shows the relationships between your impressions vs. clicks, average position, and changes in position.
You can also see how you’re performing on certain devices, and which pages are doing the heavy lifting.
Use this information to help track your overall SEO and find pages that are performing well or underperforming.
Links to Your Site
Curious about your backlinks?
Google Webmaster Tools shows you the domains that link to you the most as well as the pages on your website with the most links.
This is probably the most comprehensive listing of your backlinks that you will find, for free at least.
It’s a powerful tool to know where your content is being leveraged around the web, and what performs best in Google’s eyes.
Sitelinks are the extra internal links from your site shown below it in search results. If you Google Kissmetrics, for example, you will see their listing plus an additional six top links from the site.
Unfortunately, you can’t specify which pages you want to show up in site links.
The Manual Actions tab is where you can find out if any of your pages are not compliant with Google’s webmaster quality guidelines.
It’s one of the ways that Google has taken action against web spamming.
On the Mobile Usability tab, you can check to make sure that all of your website’s pages are aligned with what Google considers best practice.
As you can see, you can have issues with text size, viewport settings, or even the proximity of your clickable elements.
Any of these problems, as well as other errors, can negatively affect your mobile site’s rankings and push you lower on the results page. Finding and fixing these errors will help your user experience and results.
This report gives you data about the URLs that Google has tried to index on your selected property over the past year.
As Googlebot crawls the Internet, it processes each page it comes across to compile an index of every word it sees on every page.
It also looks at content tags and attributes like your Titles or alt texts.
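Those are the same tags you write in your page’s HTML. A quick illustrative snippet (the text values are placeholders):

```html
<head>
  <!-- The title tag Googlebot reads for this page -->
  <title>Simple Banana Bread Recipe | Your Site</title>
</head>
<body>
  <!-- The alt attribute describes the image to crawlers -->
  <img src="banana-bread.jpg" alt="Sliced banana bread on a plate">
</body>
```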
This graph shows a breakdown of the URLs on your site that have been indexed by Google and can thus appear in search results.
As you add and remove pages, this graph will change with you.
And don’t worry too much if you have fewer indexed pages than you think you should. Googlebot filters out URLs it sees as duplicates, non-canonical versions, or pages with a noindex meta tag.
You’ll also notice a number of URLs that have been disallowed from crawling by your robots.txt file.
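Those disallowed URLs come from rules in the robots.txt file at your site’s root. A small sketch, with placeholder paths:

```txt
User-agent: *
# Keep crawlers out of admin and internal search pages
Disallow: /wp-admin/
Disallow: /search/

# Optional: point crawlers at your sitemap
Sitemap: http://yourdomain.com/sitemap.xml
```

If the numbers in this report surprise you, your robots.txt rules are the first place to look.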
And you can also check on how many URLs you’ve removed with the Removal Tool. This will most likely always be a low value.
If you want to know whether any of your site’s resources are somehow beyond Google’s crawling ability, you’ll be able to take a look at your Blocked Resources report.
This report doesn’t show every blocked resource, though, just the ones Google thinks you can control and fix.
Since Googlebot needs access to a great deal of information on your page in order to index you correctly, a blocked resource can affect how your page performs in Google’s search rankings.
Finding and fixing these errors will help Google consistently rank you accurately.
If for some reason you need to temporarily block a page from Google’s search results, this is where you would go.
You can hide a page for approximately 90 days before the block expires.
If you want to permanently remove a page from Google’s crawling, you’ll have to do it on your actual website.
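The standard way to do that on the page itself is a noindex meta tag in the page’s head, which tells Google to drop the page from its index the next time it’s crawled:

```html
<head>
  <!-- Ask search engines not to index this page -->
  <meta name="robots" content="noindex">
</head>
```

Note that Googlebot still has to be able to crawl the page to see this tag, so don’t also block it in robots.txt.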
This section of tools helps you break down the performance of your site according to how Googlebot sees it.
By utilizing the insights here, you can find broken links, see how often Google takes a look at you, and set up technical parameters for how you want Google to crawl your site.
It’s never good to have broken links on your website.
But when you do or suspect that you might, you can verify what’s actually going on with this section of your Webmaster Tools.
You’ll be given an overall graph of your site’s performance:
This lets you know how errors have increased or decreased over time. You can see here we have quite a few broken links to fix.
Below the graph, you’ll find a list of URLs, response codes, and the date that the error was detected.
Google gives these to you in a priority ranking, so you know which one should be dealt with first.
By clicking on the URL, you’ll also be given an in-depth look at what exactly is going on.
If you have a lot of errors like this, focus on redirecting the ones with the most incoming links.
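If your site runs on Apache (as most WordPress hosts do), a single line in your .htaccess file handles a redirect like that. The paths here are placeholders:

```apache
# Permanently redirect the old, broken URL to its replacement
Redirect 301 /old-broken-page/ http://yourdomain.com/new-page/
```

A 301 (permanent) redirect also tells Google to pass the old URL’s link value along to the new one.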
For a more in-depth analysis of how often Googlebot is looking at your site, you can select the Crawl Stats tab.
Here, Webmaster Tools shows you how often the pages of your site are crawled, how many kilobytes are downloaded per day, and what your site’s download times are.
According to Google, there is no “good” crawl number, but they do have advice for any sudden spikes or drops in your crawl rates.
Fetch as Google
This tool is helpful as it lets you actually do a test run of how Google crawls and renders a specific URL on your site.
It’s a helpful way to make sure that Googlebot can access a page that might otherwise be left to guesswork.
If you’re successful, the page will render, and you’ll be able to see if any resources are blocked to Googlebot.
When you get to the debugging point of web development, you can’t beat this free tool.
If you’re using a robots.txt file to block Google’s crawlers from a specific resource, this tool allows you to double-check that everything is working.
So if you have an image you don’t want to appear in a Google Image Search, you can test your robots.txt here to make sure that your image isn’t popping up where you don’t want it.
When you test, you’ll either receive an Accepted or Blocked message, and you can edit accordingly.
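For the image example, the rule you’d be testing might look like this (the filename is a placeholder):

```txt
# Keep a specific image out of Google Image Search
User-agent: Googlebot-Image
Disallow: /images/private-photo.jpg
```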
I mentioned sitemaps earlier, so I’ll only cover this briefly.
Here, you will see information about your sitemap.
If you notice the last date your sitemap was downloaded is not recent, you might want to submit your sitemap to refresh the number of URLs submitted.
Otherwise, this helps you keep track of how Google is reading your sitemap and whether or not all of your pages are viewed as you want them to be.
Google themselves recommend using this tool sparingly, as an incorrect URL parameter can negatively impact how your site is crawled.
You can read more about how to properly use URL parameters from Google.
When you do use them, this tool will help you keep tabs on their performance and make sure they’re not pointing Googlebot in the wrong direction.
Webmaster Tools can give you powerful insights into how your site performs, as well as what you can do to keep Google’s attention.
Do you use Webmaster Tools? What areas do you find most useful? Please share your thoughts in the comments below, and happy data analyzing!
If you need any help, please feel free to contact us.
Author: Neil Patel