AI Crawl Control blocks Googlebot crawls
Hello there. For some reason, under my AI Crawl Control section I see many unsuccessful crawls made by Googlebot, even though I have a proper robots.txt that allows everything. This worries me because it might damage my SEO. I don't have any AI bot blockers activated, only DDoS protection and caching, but it still seems wrong: out of 292 crawls, only 189 were allowed and the rest were marked as unsuccessful.
Could it perhaps be related to fake crawls? For example, people imitating Googlebot, so Cloudflare blocks them but still classifies them as "Googlebot"?
Thanks in advance to everyone who answers!
Support AI
I'm happy to help you with your concern.
According to the documentation, it's possible that the unsuccessful crawls are due to fake crawls imitating Googlebot. Cloudflare's AI Crawl Control feature may block these fake crawls but still classify them as "Googlebot" (https://developers.cloudflare.com/bots/additional-configurations/managed-robots-txt/).
However, to confirm whether the issue is related to fake crawls, I recommend going through the guidance on preventing crawl errors (https://developers.cloudflare.com/support/troubleshooting/general-troubleshooting/troubleshooting-crawl-errors/). Make sure that you are not blocking Google crawler IP addresses via custom rules or IP Access rules within the Security app, and that your rate limiting rules do not apply to the Google crawler. You can also verify whether a given request really came from Googlebot using the reverse-DNS check that Google documents, as in the sketch below.
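Here is a minimal Python sketch of that verification, assuming you can pull the client IPs of the unsuccessful crawls from your Cloudflare security events or logs; the sample IP at the end is only a placeholder.

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check that Google documents for verifying Googlebot."""
    try:
        # Reverse lookup: a genuine Googlebot IP resolves to a hostname
        # ending in googlebot.com or google.com.
        host, _, _ = socket.gethostbyaddr(ip)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: that hostname must resolve back to the original IP.
        return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}
    except OSError:
        # Reverse or forward lookup failed, so the claim cannot be verified.
        return False

# Example usage with an IP taken from the unsuccessful crawl entries
# (the address below is illustrative only).
print(is_real_googlebot("66.249.66.1"))
```

If many of the "Googlebot" IPs fail this check, the unsuccessful crawls are likely spoofed crawlers rather than real Google traffic, which would not affect your SEO.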
Additionally, you can test your robots.txt via Google to ensure that it's properly configured (https://support.google.com/webmasters/answer/6062598?hl=en).
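If you want to double-check it yourself from code, Python's standard-library robots.txt parser can confirm that Googlebot is allowed; the domain and paths below are placeholders for your own site.

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (replace example.com with your domain).
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# Confirm Googlebot is allowed to fetch the URLs you care about.
for path in ("/", "/blog/", "/products/"):
    print(path, rp.can_fetch("Googlebot", f"https://example.com{path}"))
```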
If you're still unsure about the issue, you can also check the troubleshooting guide for more information (https://developers.cloudflare.com/support/troubleshooting/general-troubleshooting/troubleshooting-crawl-errors/).
This is an AI-generated response; it may not be 100% accurate. Please validate the output. You can ask follow-ups by replying to this message or mentioning me.