Cloudflare will now block AI crawlers by default
Yesterday, AI bots could scrape your pages by default.
Today, Cloudflare slams the door shut.

What's going on?
As of 1st July 2025 (dubbed "Content Independence Day"), Cloudflare blocks all known AI crawlers by default on every brand-new zone.
That flips roughly 20% of the public web from opt-out to opt-in scraping.
Cloudflare has also proposed a Pay Per Crawl model that blocks AI bots unless they pay creators for access to their content, with a marketplace setting the per-request price.
Cloudflare's traffic data reveals that Anthropic's ClaudeBot made approximately 71,000 requests for every single referral click it sent back.
In 2023, Cloudflare let webmasters block AI crawlers, but only bots that abided by robots.txt took any notice. Last year it began letting websites block "all" AI bots at the edge, and that setting is now enabled by default for new Cloudflare customers.
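The weakness of that earlier approach is that robots.txt is purely advisory: a crawler has to fetch the file and choose to comply. Here's a minimal Python sketch of that voluntary check using the standard-library parser; the domain, page path, and bot list are illustrative only.

```python
# A well-behaved crawler is expected to run this check itself; nothing at the
# network edge enforces it, which is exactly the gap Cloudflare's block closes.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")  # placeholder site
rp.read()  # fetch and parse the live robots.txt

for bot in ("GPTBot", "ClaudeBot", "CCBot"):  # published AI crawler tokens
    verdict = "allowed" if rp.can_fetch(bot, "https://example.com/blog/post") else "disallowed"
    print(f"{bot}: {verdict} by robots.txt")
```

A bot that never downloads the file, or simply ignores the answer, sails straight past this.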
The changelog:
- Default Deny. New sites start with AI crawlers blocked. Existing sites keep a one-click toggle: zero regex, zero firewall gymnastics.
- Verified Bot ID. Crawlers must sign requests and declare intent (train, infer, or index). You choose whom to feed and whom to bill.
- Pay Per Crawl. Bots that let their invoices go overdue will meet a 403 page at the edge (the probe sketch below shows what that looks like from the crawler's side).
Press release from Cloudflare CEO
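To see the Default Deny behaviour from a crawler's point of view, here is a minimal probe you can run against a zone you control. The URL and user agent string are placeholders, and a self-declared user agent is, of course, exactly the spoofing problem Verified Bot ID exists to solve.

```python
# A quick probe of your own zone, not Cloudflare's API: request a page while
# identifying as an AI crawler and see what the edge returns.
import requests

resp = requests.get(
    "https://example.com/some-article",                                 # placeholder page
    headers={"User-Agent": "GPTBot/1.1 (+https://openai.com/gptbot)"},  # illustrative bot UA
    timeout=10,
)

# With blocking enabled, a declared AI crawler gets an error status at the
# edge, while the same URL fetched with a browser user agent returns 200.
print(resp.status_code, resp.reason)
```

If both the bot-flavoured and browser-flavoured requests come back 200, the zone isn't blocking yet.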
What does this mean for SEOs?
ChatGPT and other generative answer engines are the fastest growing referrers in the new AI search era. So understandably, we've got some mixed feelings about blocking by default.
Should we block AI crawlers?
Every move is a trade-off, and there's no one-size-fits-all answer here. As an SEO, you now own the decision: think visibility versus value, reach versus revenue, brand presence versus bandwidth.
Upsides
- Guard content equity; no more unpaid answer-box cannibalisation.
- Fresh revenue stream when Pay Per Crawl matures.
- Granular knobs: whitelist GPT-4o for branding, charge Claude for training.
Downsides
- AI answer engines may ignore you if you lock the gate.
- Legacy zones need a dashboard audit.
- Marketplace pricing is still the wild west.
We're also seeing a new opportunity for smaller sites to grow their influence by opting out of the block, letting their content be crawled for AI search and absorbed into LLM training datasets.
Five-minute action checklist
If your site is hosted or proxied by Cloudflare, stop what you're doing and follow this checklist.
- Log in → Security → Bots → "Block AI Bots" (managing many zones? see the API sketch after this checklist)
- Review GA4 for AI-sourced clicks. If the line is flat, blocking won't hurt.
- Pick a policy:
  - Do not block (off) - We recommend!
  - Block on all pages - The new default
- Add a monthly reminder to eyeball Bot Analytics. New spoofers will try sneaky masks.
- Sync with dev + legal so robots.txt and Terms of Service match your edge rules.
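If you manage a pile of legacy zones, that first dashboard step can be scripted rather than clicked through. Below is a rough sketch against the Cloudflare v4 API: the /zones and /zones/{zone_id}/bot_management endpoints are part of the public API, but treat the ai_bots_protection field name as an assumption to confirm against your account's current API schema.

```python
# Audit the AI-bot blocking setting across every zone in an account.
# Requires an API token with permission to read zones and Bot Management.
import os
import requests

API = "https://api.cloudflare.com/client/v4"
HEADERS = {"Authorization": f"Bearer {os.environ['CF_API_TOKEN']}"}

zones = requests.get(f"{API}/zones", headers=HEADERS, timeout=10).json()["result"]
for zone in zones:
    bm = requests.get(
        f"{API}/zones/{zone['id']}/bot_management", headers=HEADERS, timeout=10
    ).json()
    # "ai_bots_protection" is assumed here; verify the exact field name.
    setting = (bm.get("result") or {}).get("ai_bots_protection", "not exposed on this plan")
    print(f"{zone['name']}: AI bot blocking = {setting}")
```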
Although it is now the default, blocking is not mandatory. If your strategy relies on generative-AI visibility (e.g., being the quoted source in answer engines), you can opt in to AI crawling for free. Just understand that the referral return may be tiny compared with the crawl load, so measure twice.
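One way to measure is to put your own crawl-to-referral ratio next to the 71,000:1 ClaudeBot figure quoted above. Here's a rough sketch over a combined-format access log; the log path, bot tokens, and referrer domains are assumptions to adapt to your own stack.

```python
# Count AI crawler hits versus clicks referred from AI answer engines in a
# combined-format access log, to estimate your own crawl-to-referral ratio.
import re
from collections import Counter

AI_BOTS = ("GPTBot", "ClaudeBot", "PerplexityBot", "CCBot")            # crawler UA tokens
AI_REFERRERS = ("chatgpt.com", "perplexity.ai", "gemini.google.com")   # assumed referrer domains

crawls, referrals = Counter(), Counter()
with open("access.log") as log:                                        # hypothetical log path
    for line in log:
        # Combined log format lines end with: "referrer" "user-agent"
        m = re.search(r'"([^"]*)"\s+"([^"]*)"\s*$', line)
        if not m:
            continue
        referrer, user_agent = m.groups()
        for bot in AI_BOTS:
            if bot in user_agent:
                crawls[bot] += 1
        for domain in AI_REFERRERS:
            if domain in referrer:
                referrals[domain] += 1

print("AI crawler hits:   ", dict(crawls))
print("AI referral clicks:", dict(referrals))
```

If the crawl counter dwarfs the referral counter, the blocking default is probably working in your favour; if the referrals matter to you, leave the gate open and keep watching the ratio.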
James Berry
Founder & CEO at LLMrefs
llmrefs.com