Webmaster tools checklist
Google’s Webmaster Tools is hugely valuable for SEO, and it’s worth taking full advantage of the insights it offers. It helps you see your website the way Google sees it, so you can spot errors that are affecting your rankings. Here’s our guide to using Webmaster Tools to get the most out of your SEO campaign.
Submit a sitemap
Sitemaps tell Google and other search engines about the pages on your website and how they are organised. You should submit at least one sitemap. To start, go to your Webmaster Tools dashboard and, in the left-hand menu, click ‘Crawl’, where you’ll find the Sitemaps option. Here you can add your sitemap address, and you should keep it updated over time with recently added URLs.
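A sitemap is a short XML file listing your pages, one per `<url>` entry. Here’s a minimal sketch (the example.com addresses and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> is optional but helps crawlers -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

Save this as sitemap.xml at the root of your site, and submit that address in the Sitemaps section.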
Review HTML improvements
The HTML Improvements report shows recommended HTML fixes for your website. You’ll be able to identify duplicate meta descriptions, missing title tags, and non-indexable content. Click the ‘Search Appearance’ menu in the dashboard and find the HTML Improvements section. Fixing these issues can help improve your site’s performance in search.
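The two most common issues flagged here are missing title tags and duplicate meta descriptions. Each page’s head should carry its own unique pair, something like this (the wording is purely illustrative):

```html
<head>
  <!-- Unique, descriptive title for this page only -->
  <title>Handmade Leather Wallets | Example Shop</title>
  <!-- Unique meta description for this page only -->
  <meta name="description" content="Browse our range of handmade leather wallets, crafted in the UK with free delivery on all orders.">
</head>
```

If two pages share the same description, the report will list them together so you can rewrite one of them.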
Review your crawl stats
You need to make sure your website is accessible and being crawled regularly. The Crawl Stats report shows the number of pages Google has crawled per day, as well as the time spent downloading a page in milliseconds, covering activity on your site over the last 90 days. With this information, you can also keep an eye on page load speeds. Access the report under ‘Crawl’ in the dashboard.
Check for malware
This is a high priority, so check regularly for security issues on your site. If Google detects malware, a message is sent to the email address associated with the account, and users may be shown a warning saying the site isn’t safe to visit. Keep checking for issues, because they can otherwise go unnoticed.
Check your structured data
Use the Structured Data report to check that your schema.org or microformat markup is correctly in place. To access it, go to ‘Search Appearance’, where you’ll find ‘Structured Data’. With this tool, you can fix your structured markup so Google understands the content on your site, allowing search engines to categorise and index your content accurately.
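As an example, a page for a local business might embed schema.org markup as JSON-LD like the snippet below (the business name, address, and phone number are placeholders). The Structured Data report will flag entries with missing or malformed properties:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "London"
  },
  "telephone": "+44 20 0000 0000"
}
</script>
```

This block goes in the page’s head or body; Google reads it alongside the visible content.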
Check blocked URLs and robots.txt
Check for URLs that are blocked by robots.txt and unblock any areas you want Google to index. You can find this in the Blocked Resources report under ‘Google Index’, where you’ll see the number of blocked URLs. Use a robots.txt file to block the content you don’t want Google to crawl. To update it, edit your rules in the robots.txt editor, click ‘Submit’ and download your file, then upload the new robots.txt file to your root domain and click ‘Verify and submit live version’.
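A simple robots.txt might look like this (the paths are placeholders): each Disallow line blocks crawling of a path, and the Sitemap line points crawlers at your sitemap.

```
User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of the domain, e.g. https://www.example.com/robots.txt, or crawlers will ignore it.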
Follow this checklist and your website should start ranking higher and running more smoothly. For more information, take a look at Google’s own Webmaster Tools checklist.