Google’s New Search Console URL Inspection API: What It Is & How to Use It

Diagnosing technical issues on your site can be one of the most tedious yet important parts of running a site.

To make matters worse, Google only allows you to inspect one URL at a time when diagnosing potential issues on your site (this is done within Google Search Console).

Fortunately, there is now a faster way to test your site: enter the Google Search Console URL Inspection API…

What is the Google Search Console URL Inspection API?

The Google Search Console URL Inspection API is a way to bulk-check the data that Google Search Console has on your URLs. Its purpose is to help developers and SEOs debug and optimize their pages more efficiently using Google’s own data.

Here’s an example of me using the API to check whether a few URLs are indexed and submitted in my sitemap.

What type of data can you get from the Google Search Console URL Inspection API?

The Google Search Console URL Inspection API allows you to pull a wide range of data. Below is a rundown of some of the most talked-about fields:

lastCrawlTime

With this field, you can see exactly when Googlebot last crawled your site. This is extremely useful for SEOs and developers who want to measure how frequently Google crawls their sites. Previously, you could only get this kind of data through log file analysis or by spot-checking individual URLs in Google Search Console.

robotsTxtState

With this field, you can find out whether you have any robots.txt rules that block Googlebot. This is something you can check manually, but being able to test it at scale with Google’s own data is a great step forward.

googleCanonical and userCanonical

In certain circumstances, Google has been known to choose a different canonical from the one specified in the code. In this situation, being able to compare the two (side by side and at scale) using the API is useful, as it enables you to make the appropriate changes.

crawledAs

This field lets you see which user agent was used to crawl your site: mobile or desktop. The response codes are below for reference:

  • DESKTOP – Desktop user agent
  • MOBILE – Mobile user agent

pageFetchState

Understanding the pageFetchState can help you diagnose server errors, not found errors (4xx), soft 404s, redirection errors, pages blocked by robots.txt, and invalid URLs. A list of responses is below for reference, followed by a short sketch for tallying these states at scale.

  • PAGE_FETCH_STATE_UNSPECIFIED – Unknown fetch state
  • SUCCESSFUL – Successful fetch
  • SOFT_404 – Soft 404
  • BLOCKED_ROBOTS_TXT – Blocked by robots.txt
  • NOT_FOUND – Not found (404)
  • ACCESS_DENIED – Blocked due to unauthorized request (401)
  • SERVER_ERROR – Server error (5xx)
  • REDIRECT_ERROR – Redirection error
  • ACCESS_FORBIDDEN – Blocked due to access forbidden (403)
  • BLOCKED_4XX – Blocked due to other 4xx issue (not 403, 404)
  • INTERNAL_CRAWL_ERROR – Internal error
  • INVALID_URL – Invalid URL
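
When you pull results for hundreds of URLs at once, tallying these states makes problems stand out. Here’s a minimal Python sketch, with sample data standing in for results already pulled from the API:

```python
from collections import Counter

# `results` holds the indexStatusResult object for each inspected URL,
# already pulled from the API (sample values for illustration).
results = [
    {"pageFetchState": "SUCCESSFUL"},
    {"pageFetchState": "SUCCESSFUL"},
    {"pageFetchState": "SOFT_404"},
    {"pageFetchState": "BLOCKED_ROBOTS_TXT"},
]

# Count how many URLs fall into each fetch state.
counts = Counter(r.get("pageFetchState", "PAGE_FETCH_STATE_UNSPECIFIED") for r in results)
for state, n in counts.most_common():
    print(f"{state}: {n}")
```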

indexingState

The indexing state tells you the current indexation status of your URLs. Apart from the more obvious Pass and Fail verdicts, there are other responses (see the mapping sketch after this list):

  • NEUTRAL is equivalent to the “Excluded” message in Search Console.
  • PARTIAL is equivalent to the “Valid with warnings” message in Search Console.
  • VERDICT_UNSPECIFIED means that Google is unable to come to a conclusion about the URL(s) in question.
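
For reporting, you may want to translate these verdict codes into the labels you see in Search Console. Here’s a small sketch of that mapping (NEUTRAL and PARTIAL follow the equivalences above; the “Valid” and “Error” labels for PASS and FAIL are my assumption, matching Search Console’s coverage categories):

```python
# Verdict codes mapped to Search Console labels. NEUTRAL and PARTIAL
# follow the equivalences listed above; PASS/FAIL are assumed to match
# Search Console's "Valid" and "Error" categories.
VERDICT_LABELS = {
    "PASS": "Valid",
    "FAIL": "Error",
    "NEUTRAL": "Excluded",
    "PARTIAL": "Valid with warnings",
    "VERDICT_UNSPECIFIED": "Unknown",
}

print(VERDICT_LABELS["NEUTRAL"])  # -> Excluded
```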

coverageState 

This gives you detail on whether a URL has been submitted in your sitemap and indexed.

referringUrls

This allows you to see where each page is linked from, according to Google.

Sitemap 

This enables you to see which URLs are included in your sitemap(s).

Other uses for the API

You can also use the API to evaluate your AMP site, if you have one.

How to use the Google Search Console URL Inspection API step by step

Using the Google Search Console URL Inspection API involves making a request to Google. The request parameters you need to define are the URL you want to inspect and the URL of the property in Google Search Console.

The request body contains the URL you want inspected, the verified property it belongs to, and (optionally) a language code for the result messages.
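
Here’s a minimal sketch of that request in Python, assuming you’ve already obtained an OAuth 2.0 access token with the Search Console scope (token handling isn’t shown; the endpoint and body fields follow Google’s documentation):

```python
import requests

# An OAuth 2.0 access token with the Search Console scope
# (https://www.googleapis.com/auth/webmasters.readonly).
# Obtaining this token is not shown here.
ACCESS_TOKEN = "your-access-token"

# The request body: the URL to inspect, the verified property it
# belongs to, and an optional language code for result messages.
body = {
    "inspectionUrl": "https://www.example.com/some-page/",
    "siteUrl": "https://www.example.com/",
    "languageCode": "en-US",
}

resp = requests.post(
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=body,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```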

If you are curious to learn more about how to use the API, Google has extensive documentation on it.

Below is an example of the kind of response you can get from the API.
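
The values here are illustrative, and the exact payload varies by URL, but the structure follows the fields covered above:

```python
# A trimmed, illustrative response (values are examples only; the exact
# payload varies by URL).
response = {
    "inspectionResult": {
        "indexStatusResult": {
            "verdict": "PASS",
            "coverageState": "Submitted and indexed",
            "robotsTxtState": "ALLOWED",
            "indexingState": "INDEXING_ALLOWED",
            "lastCrawlTime": "2022-02-25T07:18:24Z",
            "pageFetchState": "SUCCESSFUL",
            "googleCanonical": "https://www.example.com/some-page/",
            "userCanonical": "https://www.example.com/some-page/",
            "crawledAs": "MOBILE",
        }
    }
}
```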

To try it out right away, you can use valentin.app’s free Google Bulk Inspect URLs tool. The tool gives you a quick way to query the API without any coding skills!

Here’s how to use it:

  1. Go to https://valentin.app/inspect.html, authorize access to your Google account, and select the Search Console property you want to test. Then paste your URLs into the box below. (The data will be processed in your browser and not uploaded to a server or shared with anyone.)
  2. Click the “Inspect URLs” button. The data will start to pull from the API.
  3. Export the data as a CSV or Excel file by clicking the button.
  4. Analyze the data and check for any potential issues.

How can you use the Google Search Console URL Inspection API in practice?

In theory, the Google Search Console URL Inspection API seems like a great way to learn more about your site. However, you can pull so much data that it’s hard to know where to start. So let’s look at a few example use cases.

1. Site migration – diagnosing any technical issues

Site migrations can cause all sorts of issues. For example, developers can accidentally block Google from crawling your site, or certain pages, via robots.txt.

Fortunately, the Google Search Console URL Inspection API makes auditing for these issues a doddle.

For example, you can check whether you’re blocking Googlebot from crawling URLs in bulk by calling robotsTxtState.

Here’s an example of me using the Google Search Console URL Inspection API (via valentin.app) to call robotsTxtState and see the current status of my URLs.

As you can see, these pages are not blocked by robots.txt, and there are no issues here.
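
If you’re querying the API directly rather than through valentin.app, the same bulk check is a short loop over your results. A minimal sketch, with sample data standing in for results already pulled from the API (DISALLOWED is the state reported for blocked URLs):

```python
# `results` holds (url, indexStatusResult) pairs already pulled from
# the API (sample values for illustration).
results = [
    ("https://www.example.com/", {"robotsTxtState": "ALLOWED"}),
    ("https://www.example.com/checkout/", {"robotsTxtState": "DISALLOWED"}),
]

# Flag every URL that Googlebot is blocked from crawling.
blocked = [url for url, r in results if r.get("robotsTxtState") == "DISALLOWED"]
for url in blocked:
    print(f"Blocked by robots.txt: {url}")
```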

2. Understand if Google has respected your declared canonicals

If you make a change to the canonical tags across your site, you will want to know whether or not Google is respecting them.

You may be wondering why Google would ignore the canonical you declared. Google can do this for various reasons, for example:

  • Your declared canonical is not https. (Google prefers https for canonicals.)
  • Google has chosen a page that it believes is a better canonical page than your declared canonical.
  • Your declared canonical is a noindex page.

Below is an example of me using the Google Search Console URL Inspection API to see whether Google has respected my declared canonicals:

As we can see from the screenshot above, there are no issues with these particular pages, and Google is respecting the canonicals.
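
In code, the same audit is a matter of comparing googleCanonical against userCanonical for each URL and flagging any mismatches. A minimal sketch, with sample data standing in for results already pulled from the API:

```python
# `results` holds (url, indexStatusResult) pairs already pulled from
# the API (sample values for illustration).
results = [
    ("https://www.example.com/a/", {
        "userCanonical": "https://www.example.com/a/",
        "googleCanonical": "https://www.example.com/a/",
    }),
    ("https://www.example.com/b/?ref=nav", {
        "userCanonical": "https://www.example.com/b/?ref=nav",
        "googleCanonical": "https://www.example.com/b/",
    }),
]

# Flag every URL where Google chose a different canonical than declared.
for url, r in results:
    if r.get("googleCanonical") != r.get("userCanonical"):
        print(f"Mismatch on {url}: declared {r.get('userCanonical')}, "
              f"Google chose {r.get('googleCanonical')}")
```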

3. Understand when Google recrawls after you make changes to your site

When you update many pages on your site, you will want to know the impact of your efforts. This can only happen after Google has recrawled your site.

With the Google Search Console URL Inspection API, you can see the exact time Google crawled your pages by using lastCrawlTime.

If you can’t get access to the log files for your site, then this is a great option for understanding how Google crawls your site.

As you can see in the screenshot above, lastCrawlTime shows the date and time each page was crawled. In this example, the page Google crawled most recently is the homepage.
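
If you’re working with the API directly, lastCrawlTime comes back as an RFC 3339 timestamp, so you can sort by it and measure crawl recency. A minimal sketch, with sample data standing in for results already pulled from the API:

```python
from datetime import datetime, timezone

# `results` holds (url, indexStatusResult) pairs already pulled from
# the API (sample values for illustration). lastCrawlTime is RFC 3339.
results = [
    ("https://www.example.com/", {"lastCrawlTime": "2022-02-25T07:18:24Z"}),
    ("https://www.example.com/blog/", {"lastCrawlTime": "2022-02-20T11:02:51Z"}),
]

now = datetime.now(timezone.utc)
# Sort most recently crawled first and report how long ago each crawl was.
for url, r in sorted(results, key=lambda x: x[1]["lastCrawlTime"], reverse=True):
    crawled = datetime.fromisoformat(r["lastCrawlTime"].replace("Z", "+00:00"))
    print(f"{url}: last crawled {(now - crawled).days} days ago")
```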

Understanding when Google recrawls your site after you make changes allows you to connect those changes to any positive or negative impact that follows Google’s crawl.

FAQs

How to get around the Google Search Console URL Inspection API limits?

Although the Google Search Console URL Inspection API is limited to 2,000 queries per day, this quota is determined per Google Search Console property.

This means you can include multiple properties within one site if they are verified separately in Google Search Console, effectively allowing you to bypass the limit of 2,000 queries per day.

Google Search Console allows you to have 1,000 properties in your account, so this should be more than enough for most users.
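
To put numbers on that workaround: the number of properties you need is simply your daily URL volume divided by the 2,000-query cap, rounded up. A quick sketch (the daily volume is a made-up example):

```python
import math

URLS_PER_DAY = 10_000       # example: how many URLs you want to inspect daily
QUOTA_PER_PROPERTY = 2_000  # daily query cap per verified property

# Number of separately verified properties you'd need.
print(math.ceil(URLS_PER_DAY / QUOTA_PER_PROPERTY))  # -> 5
```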

Can I use the Google Search Console URL Inspection API on any website?

Another potential limiting factor is that you can only run the Google Search Console URL Inspection API on a property that you own in Google Search Console. If you don’t have access to the property, then you can’t audit it using the API.

So this means that auditing a site you don’t have access to is tricky.

How accurate is the data?

The accuracy of the data itself has been an issue for Google over the past few years. This API gives you access to that data. So, on the face of it, the Google Search Console URL Inspection API is only as good as the data inside it.

As we have previously shown in our study of Google Keyword Planner’s accuracy, data from Google is often not as accurate as people assume it to be.

Final thoughts

The Google Search Console URL Inspection API is a great way for site owners to get bulk data directly from Google on a larger scale than what was previously possible with Google Search Console.

Daniel Waisberg and the team behind the Google Search Console URL Inspection API have certainly done a great job of getting this released into the wild.

However, one of the criticisms of the Google Search Console URL Inspection API from the SEO community is that the query rate limit is too low for larger sites. (It is capped at 2,000 queries per day, per property.)

For larger sites, this isn’t enough. And despite the possible workarounds, this number still seems to be on the low side.

What’s your experience of using the Google Search Console URL Inspection API? Got more questions? Ping me on Twitter.
