Understanding Google’s Index Coverage Report

cnxt_dev
2018/05/18 03:01

What you need to know about the changes made to Google’s Index Coverage Report.

Over the last year, Google has made a number of exciting changes to their Search Console, one of which was the revamp of their Index Coverage Report.

When it comes to SEO, knowing whether Google is crawling your website and which of your pages are indexed is incredibly important. It indicates how successful your SEO efforts have been and what changes you need to make should certain pages not be indexed.

Monitoring your index status is an SEO activity that should be performed on a regular basis, and Google’s Index Coverage Report has always been an effective way to do just that. Now it’s been taken a step further.

The new and improved Index Coverage Report

Upon logging into Google’s Search Console, you will discover that the Index Coverage Report is far more detailed than ever before.

Colourful and detailed charts will show you how the number of indexed pages has changed over the last 90 days. You may notice a slight difference in the numbers between the classic Search Console and the updated version; this is due to Google’s data refinement processes that have recently taken place.

The chart will be split into three sections:

  1. Valid pages (Green)
  2. Pages with warnings (Yellow)
  3. Pages with errors (Red)

Google will now also tell you precisely why certain pages have warnings or errors, so that you know exactly how to fix them. It’s recommended that you start with the red errors first and work your way through the yellow warnings thereafter.

The new features

Below is a list of the new features that you can take advantage of in the updated Index Coverage Report:

  • Sitemap filtration. Webmasters now have the ability to filter data by sitemaps submitted to Google. By viewing all of the URLs submitted within your sitemaps or viewing the individual pages submitted within individual sitemaps, you can more easily determine where the problem areas are.
  • Impression comparison. If you would like to determine whether your website has seen an increase in impressions due to more pages being indexed, you can tick the ‘Impressions’ checkbox when viewing the indexing chart.
  • Highlight specific issues. By selecting one error row, you will be presented with a list of URLs that are affected by that issue. Knowing which pages to focus on in order to correct that issue will save a lot of time going forward.
  • Pin down error dates. By having the ability to turn index statuses on and off, you can focus on single or multiple areas, which, in turn, will help you pinpoint whether there were spikes in errors on particular dates.

Navigating critical indexing errors

As mentioned above, it’s best to begin with the red errors. Below is a list of the most common errors that you might see in your Index Coverage Report and how you can fix them:

Red

  • Redirect error. When a URL keeps redirecting in a loop, the request will eventually time out. To prevent this error from occurring, check that your sitemap only contains URLs that return a 200 OK HTTP status code (see the sketch after this list).
  • 404 error. If a page no longer exists but is still present in your sitemap, it will return a 404 error and Google won’t index it. Either update the URL if the page should remain in your sitemap, or remove it.
  • Noindex marker. Pages that you don’t want indexed should not be included in your sitemap. Either remove the noindex directive from the page or remove the page from your sitemap.
  • URL blocked by robots.txt. You will see this error if a submitted URL has been disallowed in your site’s robots.txt file. Remove the disallow rule that blocks the page, or remove the URL from your sitemap, to correct the error.
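
If you want to verify the 200 OK advice yourself, a short script can fetch every URL in your sitemap and report its status, including 404s and redirect chains that never resolve. The sketch below is a minimal Python example that assumes the third-party requests library; the sitemap URL is a placeholder and the redirect handling is deliberately simplified, so treat it as a starting point rather than a finished audit tool.

```python
"""Rough sitemap audit: flag sitemap URLs that don't return 200 OK."""
import xml.etree.ElementTree as ET
from urllib.parse import urljoin

import requests  # third-party: pip install requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
MAX_REDIRECTS = 10  # treat anything longer as a likely redirect loop


def sitemap_urls(sitemap_url):
    """Return every <loc> entry from a standard XML sitemap."""
    xml = requests.get(sitemap_url, timeout=10).content
    tree = ET.fromstring(xml)
    return [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]


def check(url):
    """Follow redirects by hand so chains and loops are visible."""
    seen = []
    for _ in range(MAX_REDIRECTS):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code in (301, 302, 303, 307, 308):
            target = urljoin(url, resp.headers.get("Location", ""))
            if target in seen:
                return f"redirect loop via {target}"
            seen.append(url)
            url = target
            continue
        if resp.status_code == 404:
            return "404 - update or remove this sitemap entry"
        if resp.status_code != 200:
            return f"unexpected status {resp.status_code}"
        return "200 OK" if not seen else f"200 OK after {len(seen)} redirect(s)"
    return "too many redirects - likely a loop"


if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        print(f"{check(url):<45} {url}")
```

Running this against your own sitemap gives you a quick shortlist of the URLs most likely to show up in the red section of the report.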

Yellow

  • Submitted URL marked ‘noindex’. If Google was previously able to index a page but is now prevented from doing so by a noindex tag, it will produce this warning. To correct it, decide whether you want to keep the tag: remove it if the page should be indexed, or remove the URL from your sitemap if it shouldn’t.
  • Indexed, though blocked by robots.txt. If pages have been indexed by Google despite being blocked in your robots.txt file, you will see this warning. It’s up to you to decide whether these URLs should stay blocked in robots.txt; if your goal is to keep them out of the index, rather make use of a noindex tag, since Google has to be able to crawl a page to see that tag (see the sketch after this list).
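
For both of these warnings the underlying questions are the same: does the page actually carry a noindex directive, and does robots.txt block Google from crawling it? Below is a minimal Python sketch that answers both for a single placeholder URL, using the third-party requests library plus the standard library’s robotparser; the meta-tag check is a simple regex, so it is illustrative rather than bulletproof.

```python
"""Check one URL for a noindex directive and a robots.txt block."""
import re
from urllib import robotparser
from urllib.parse import urlparse

import requests  # third-party: pip install requests

PAGE_URL = "https://www.example.com/some-page/"  # placeholder page


def is_blocked_by_robots(url, user_agent="Googlebot"):
    """True if the site's robots.txt disallows this URL for the user agent."""
    parts = urlparse(url)
    parser = robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    parser.read()
    return not parser.can_fetch(user_agent, url)


def has_noindex(url):
    """True if the page sends noindex via the X-Robots-Tag header
    or a <meta name="robots"/"googlebot"> tag in the HTML."""
    resp = requests.get(url, timeout=10)
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True
    meta = re.search(
        r'<meta[^>]+name=["\'](?:robots|googlebot)["\'][^>]*>', resp.text, re.I
    )
    return bool(meta and "noindex" in meta.group(0).lower())


if __name__ == "__main__":
    blocked = is_blocked_by_robots(PAGE_URL)
    noindex = has_noindex(PAGE_URL)
    print(f"blocked by robots.txt: {blocked}")
    print(f"noindex directive:     {noindex}")
    if blocked and noindex:
        print("Note: while the page is blocked, Google may never see the noindex.")
```

If the script reports both a block and a noindex tag, that combination is worth fixing first, because the robots.txt block can stop Google from ever seeing the noindex directive.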

While Google’s new console might take some getting used to, it certainly provides webmasters with far more valuable detail, making their jobs that much easier.

Have you switched over to the new Google Search Console yet?