Google updated its Search Central documentation on verifying Googlebot, adding a section about user-triggered bot visits. That information was missing from previous Googlebot documentation, which created confusion for years and led some publishers to block the IP ranges of legitimate visits.
Newly Updated Bot Documentation
Google added new documentation that categorizes the three different kinds of bots that publishers should expect.
These are the three categories of Google Bots:
- Googlebot – Search crawler
- Special-case crawlers
- User-triggered fetchers (GoogleUserContent)
That last one, GoogleUserContent, is the one that has confused publishers for a long time because Google didn’t have any documentation about it.
This is what Google says about GoogleUserContent:
“Tools and product functions where the end user triggers a fetch.
For example, Google Site Verifier acts on the request of a user.
Because the fetch was requested by a user, these fetchers ignore robots.txt rules.”
The documentation also states which domain the reverse DNS mask will show for these fetchers.
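Google’s general guidance for verifying that a crawler IP is genuine is a reverse DNS lookup followed by a forward confirmation that the hostname resolves back to the same IP. Here is a minimal sketch in Python; the domain list and function names are my own, not Google’s:

```python
import socket

# Domains Google documents for its crawler categories (assumption:
# googlebot.com for Googlebot, google.com for special-case crawlers,
# googleusercontent.com for user-triggered fetchers).
GOOGLE_DOMAINS = ("googlebot.com", "google.com", "googleusercontent.com")


def hostname_is_google(hostname: str) -> bool:
    """True if the hostname falls under one of Google's crawler domains."""
    host = hostname.rstrip(".").lower()
    return any(host == d or host.endswith("." + d) for d in GOOGLE_DOMAINS)


def verify_google_ip(ip: str) -> bool:
    """Two-step check: reverse DNS, then forward-confirm the hostname
    resolves back to the same IP (guards against spoofed PTR records)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except socket.herror:
        return False
    if not hostname_is_google(hostname):
        return False
    try:
        # Forward confirmation: hostname must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
```

The forward confirmation matters because anyone can set a PTR record claiming to be googlebot.com; only Google controls the forward zone.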
In the past, some in the SEO community told me that bot activity from IP addresses associated with GoogleUserContent.com was triggered when a user viewed a website through a translate function that used to appear in the search results, a feature that no longer exists in Google’s SERPs.
I don’t know if that’s…