8 AI Crawlers Tested

GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and 4 more

7 Detection Types

Hard blocks, challenges, cloaking, throttling, redirects, and more

3 Pages Scanned

Homepage + product + blog for a full cross-section

Free forever · No login required · Results in ~30 seconds · 7 detection types

What is Crawl Radar?

Before AI platforms like ChatGPT, Perplexity, or Gemini can recommend your brand, their crawlers need to actually reach your website. Many sites unknowingly block AI crawlers through robots.txt rules, WAF configurations, or CDN settings — making themselves invisible to AI search.

Crawl Radar tests 8 major AI crawlers against 3 pages on your site, detecting blocks, challenges, cloaking, throttling, and other access issues. Think of it as a pre-flight check before optimising your content for AI visibility.

What Does Crawl Radar Detect?

Hard Blocks

Bot gets a 4xx/5xx error while your browser loads the page fine — the bot is explicitly denied access.

Cloudflare / CAPTCHA Challenges

Bot receives an interstitial challenge page instead of your content. Common with aggressive WAF settings.

Content Cloaking

Bot receives a significantly smaller page than the browser — your server is serving different content to crawlers.

Throttling

Bot response is 10× slower than the browser, suggesting intentional rate-limiting of crawler traffic.

Redirect Mismatches

Bot gets redirected to a different URL than the browser — often a sign of bot-specific redirect rules.

X-Robots-Tag Blocks

Response headers contain noindex or bot-specific disallow directives that prevent indexing.
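Such a header might look like the following (an illustrative response, not taken from any specific site; the second, bot-scoped form targets a single crawler by its user-agent token):

```http
HTTP/1.1 200 OK
Content-Type: text/html
X-Robots-Tag: noindex, nofollow
X-Robots-Tag: gptbot: noindex
```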

JS Rendering Gaps

Bot receives an empty JavaScript shell while the browser gets fully rendered content.

Frequently Asked Questions

What is Crawl Radar?
Crawl Radar is a free tool that tests whether AI crawlers — GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, Google-Extended (Gemini), Bytespider (ByteDance), Applebot-Extended, Meta-ExternalAgent, and cohere-ai — can actually reach and read your website. It simulates each bot's user-agent against your pages and detects blocks, challenges, cloaking, and throttling.
How does the test work?
We make HTTP requests to your site using each AI bot's real user-agent string, plus a control request with a standard Chrome browser user-agent. By comparing the responses (status codes, body size, response time, headers), we detect 7 types of access issues: hard blocks, Cloudflare/CAPTCHA challenges, content cloaking, throttling, redirect mismatches, X-Robots-Tag blocks, and JS rendering gaps.
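The comparison logic can be sketched as a pure function over the two probes. This is a minimal illustration, not Crawl Radar's actual implementation: the `Probe` type, the check ordering, and the thresholds other than the <50% body-size rule and the 10× slowdown figure stated on this page are assumptions, and challenge and X-Robots-Tag detection are omitted for brevity.

```python
from dataclasses import dataclass

@dataclass
class Probe:
    status: int        # HTTP status code
    body_bytes: int    # size of the response body
    elapsed: float     # response time in seconds
    final_url: str     # URL after following redirects

def classify(bot: Probe, browser: Probe) -> str:
    """Compare a bot-user-agent probe against the browser control probe."""
    if browser.status >= 400:
        return "inconclusive"          # site is down for everyone
    if bot.status >= 400:
        return "hard_block"            # bot is explicitly denied
    if bot.final_url != browser.final_url:
        return "redirect_mismatch"     # bot-specific redirect rule
    if bot.body_bytes < browser.body_bytes / 2:
        return "cloaked"               # bot gets <50% of the browser's body
    if bot.elapsed > 10 * browser.elapsed:
        return "throttled"             # 10x slower suggests rate-limiting
    return "ok"

browser = Probe(status=200, body_bytes=84_000, elapsed=0.3, final_url="https://example.com/")
bot = Probe(status=403, body_bytes=1_200, elapsed=0.2, final_url="https://example.com/")
print(classify(bot, browser))  # hard_block
```

Checking the browser control first matters: a 4xx on both sides means the site is down, not that the bot is singled out.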
Is the Crawl Radar test free?
Yes, completely free. No login, no credit card. You get the summary verdict and top 4 bot results instantly. Enter your email to unlock the full 8-bot breakdown with page-level details and fix recommendations.
How many pages does it test?
The free test checks 3 representative pages from your site: your homepage (always), plus the first product/collection page and blog/content page it finds via your sitemap or homepage links. This gives a good cross-section without being slow.
What's the difference between Crawl Radar and a GEO Score?
A GEO Score measures whether your site is optimised for AI citation — structured data, content format, authority signals. Crawl Radar answers the prerequisite question: can AI bots even reach your site in the first place? If bots are blocked, no amount of content optimisation will get you cited.
What does 'challenged' mean?
A 'challenged' status means the AI bot received a Cloudflare or CAPTCHA challenge page instead of your actual content. This typically happens when your CDN or WAF (Web Application Firewall) is configured to challenge bot traffic. The bot can technically reach your server, but it can't get past the challenge to read your content.
What does 'cloaked' mean?
Cloaking means the bot received a significantly different (smaller) page than a normal browser. Specifically, the bot's response body was less than half the size of the browser's response. This can happen due to server-side bot detection that serves stripped-down content to crawlers.
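The half-size rule stated above reduces to a one-line check (a sketch using the 50% threshold quoted here; the real tool may weigh additional signals):

```python
def looks_cloaked(bot_body_bytes: int, browser_body_bytes: int) -> bool:
    """Flag cloaking when the bot's body is under half the browser's."""
    return bot_body_bytes < browser_body_bytes / 2

print(looks_cloaked(30_000, 84_000))  # True: bot got ~36% of the browser body
print(looks_cloaked(80_000, 84_000))  # False: sizes are comparable
```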
Why might I see 'inconclusive' results?
An 'inconclusive' result means we couldn't determine whether the bot is specifically blocked. This happens when both the bot and browser requests fail (site is down), when both get challenged (site challenges all visitors), or when the bot request times out. It doesn't necessarily mean there's a problem.
How can I fix a blocked AI crawler?
Most AI crawler blocks happen in robots.txt. Check your robots.txt file for Disallow rules targeting GPTBot, ClaudeBot, or other AI bots. If you're using Cloudflare, check your WAF rules for bot-management settings that might challenge or block AI crawlers. We provide specific fix recommendations after you run the test.
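For illustration, the first stanza below is the kind of robots.txt rule that blocks a crawler; removing the Disallow (or replacing it with an explicit Allow, as in the second stanza) restores access. Adapt the user-agent tokens to the bots you care about:

```
# Blocks OpenAI's crawler from the whole site:
User-agent: GPTBot
Disallow: /

# Explicitly allows it instead:
User-agent: GPTBot
Allow: /
```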
Can I get ongoing monitoring instead of a one-time test?
This free test is a one-time snapshot. For continuous weekly monitoring with alerts when bots get blocked, check out Cited's platform plans at getcited.in/pricing.