In my previous article I talked about how to get indexing information in bulk from Google Search Console using the URL Inspection Tool and Node.js. That tool is great for gathering information about specific, individual URLs on your site. However, Google also provides site owners with a more holistic view of the indexing status of their sites: the Index Coverage Report.

You can check Google's own documentation and video tutorial to understand in more detail the data this section provides, but at a very top level the key data points are:

- The number of pages that Google has indexed.
- The number of pages that Google has found but has not indexed (either because of an error or because they were purposefully excluded).
- How big your site is from Google's point of view (Valid + Excluded + Errors).

Right now there are four main categories (Errors, Valid with warning, Valid and Excluded), subdivided into 29 subcategories. Each of these subcategories provides an additional level of classification to help site owners and SEOs understand why URLs belong in the main category. Not all subcategories will be visible, only the ones that apply to your site.

Unfortunately, the export option on the Index Coverage Report view (pictured above) only gives you the top-level numbers per report. If you want to know and export which URLs are inside the individual reports, you have to click on each report and export them one by one. This way of extracting the data is very manual and time consuming. Hence, I decided to automate it with Node.js and add a few more features.

Make sure that you have Node.js on your machine. At the time of writing this post I'm using version 14.16.0. In this script I'm using syntax that is only available from version 14 onwards, so double check that you are on at least that version.

```
# Check Node version
node --version
```

Download the script using git, GitHub's CLI or simply by downloading the code from GitHub directly. Then install the necessary modules to run the script by typing `npm install` in your terminal:

```
npm install
```

In order to extract the coverage data from your website/property, update the credential.js file with your Search Console credentials. After that, use your terminal and type `npm start` to run the script. The script logs the processing in the console so you are aware of what is happening.
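As a rough illustration, a credential.js file might look something like the sketch below. The field names used here (`email`, `password`, `site`) are assumptions made for illustration only, so check the script's own README for the exact shape it expects:

```javascript
// credential.js -- hypothetical sketch only; the real field names depend
// on what the script expects, so check its documentation before using it.
// Never commit real credentials to version control.
const credentials = {
  email: "you@example.com",         // Google account with Search Console access (placeholder)
  password: "your-password-here",   // placeholder value
  site: "https://www.example.com/", // the Search Console property to extract coverage data from
};

module.exports = credentials;
```

Once the placeholders are replaced with your own values, the script can load them with a standard `require("./credential.js")` call (again, an assumption about how it reads the file) before `npm start` kicks off the extraction.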