Unblocking ClaudeBot for WordPress
- Claude does not use Google for web retrieval. Its citations show an 86.7% overlap with Brave Search's top results.
- Your site is likely invisible to Claude because you are blocking its crawlers in your robots.txt file.
- You must allow three specific user-agents: ClaudeBot, Claude-SearchBot, and Claude-User.
- Submitting your sitemap directly to Brave Webmaster Tools is a required step for visibility.
Your Google rankings are solid and your traffic is stable, but Claude never cites your content. This is a common issue for WordPress site owners who assume all AI models pull from Google's index. They do not. Anthropic's Claude models rely almost exclusively on Brave Search for web retrieval.
If you are not visible in Brave, you are invisible to Claude. The fix is usually simple, involving a few lines in your robots.txt file and a one-time sitemap submission. This guide provides the exact technical steps to get your WordPress site indexed by Brave and cited by Claude.
Unblocking ClaudeBot: How to Restore Brave Search Visibility for WordPress
The core of the problem is a misunderstanding of how Claude sources its information. While ChatGPT uses Bing and Gemini uses Google, Claude has a deep integration with Brave Search. Data shows an 86.7% overlap between Claude's citations and Brave's top search results.
This makes every technical signal you send to Brave's crawlers extremely important. Claude is famously selective, with a crawl-to-cite ratio of 38,065 to 1. You cannot afford to have a misconfigured robots.txt file blocking access. Getting this right is the first step in any serious Answer Engine Optimization strategy.
The Three Anthropic Crawlers You Must Allow
Anthropic uses a set of distinct crawlers for different tasks. Blocking any of them can harm your visibility. The old anthropic-ai user-agent is deprecated and no longer in use. You need to ensure your robots.txt file explicitly allows the three current crawlers.
| User-Agent | Purpose | Action Required |
|---|---|---|
| ClaudeBot | AI model training data collection | Allow: essential for the model's base knowledge |
| Claude-SearchBot | Live web search indexing for Brave Search | Allow: critical for real-time answer retrieval |
| Claude-User | User-triggered browsing from within Claude | Allow: enables Claude to visit your site on a user's behalf |
Many default WordPress or hosting robots.txt configurations are too aggressive. They may block "unknown" bots, which often includes these newer AI crawlers. You must check your configuration manually.
Step 1: Audit and Fix Your robots.txt File
Your robots.txt file, located at yourdomain.com/robots.txt, gives instructions to web crawlers. A single incorrect line can make you invisible to an entire answer engine.
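Before editing anything, it helps to see which user-agent groups your file already declares. A minimal audit sketch (in practice you would fetch the live file with `curl`; a sample aggressive configuration is inlined here for illustration):

```shell
# In practice, fetch the live file first (replace yourdomain.com):
#   curl -s https://yourdomain.com/robots.txt -o robots.txt
# Sample aggressive configuration, typical of strict hosting defaults:
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /wp-admin/
Disallow: /
EOF

# List every declared user-agent group.
grep -i "^User-agent:" robots.txt

# Flag any Anthropic crawler that has no explicit group of its own.
for bot in ClaudeBot Claude-SearchBot Claude-User; do
  grep -qi "^User-agent: $bot" robots.txt || echo "$bot: no explicit rule"
done
```

With a configuration like the sample above, all three Anthropic crawlers fall through to the blanket `Disallow: /` rule, which is exactly the failure mode to look for.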
The Manual Fix
Access your robots.txt file. You can do this via an FTP client or a WordPress SEO plugin that has a file editor. Add the following lines to ensure all of Anthropic's crawlers have full access:
```
User-agent: ClaudeBot
Allow: /

User-agent: Claude-SearchBot
Allow: /

User-agent: Claude-User
Allow: /
```
This configuration explicitly tells each bot that it is welcome to crawl your entire site, and it overrides any broad Disallow rules that might otherwise block them. For a more detailed look at crawler rules, see this guide on the exact robots.txt configuration to allow OAI-SearchBot while blocking GPTBot.
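You can verify that the override behaves as expected with Python's standard-library `urllib.robotparser`. This sketch parses an inline sample in which a blanket `Disallow: /` is in place and only ClaudeBot has an explicit Allow group, to show what a missing group costs you:

```python
from urllib import robotparser

# Sample rules: a blanket Disallow for all bots, with an explicit Allow
# group for ClaudeBot only. Claude-SearchBot and Claude-User are omitted
# on purpose to demonstrate that they fall through to the "*" group.
RULES = """\
User-agent: *
Disallow: /

User-agent: ClaudeBot
Allow: /
"""

rp = robotparser.RobotFileParser()
# For a live check, use instead:
#   rp.set_url("https://yourdomain.com/robots.txt"); rp.read()
rp.parse(RULES.splitlines())

for agent in ("ClaudeBot", "Claude-SearchBot", "Claude-User"):
    ok = rp.can_fetch(agent, "https://yourdomain.com/")
    print(f"{agent}: {'allowed' if ok else 'BLOCKED'}")
```

Once all three groups from the fix above are present, every `can_fetch` call should return True.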
The Automated Fix
Manually managing a growing list of AI crawlers is inefficient. The AEO God Mode plugin includes an AI Crawler Allowlist that automatically manages these rules for you. It recognizes all three active Anthropic crawlers (and 15 others) and ensures your robots.txt is always configured correctly without manual file edits. You can see exactly which bots visit your site using the built-in AI crawler log.
Step 2: Submit Your Sitemap to Brave Webmaster Tools
Google Search Console has no influence on Brave Search. To ensure Brave can efficiently find and index all your important pages, you must submit your sitemap directly to their platform.
- Go to Brave Webmaster Tools and create a free account.
- Verify ownership of your site.
- Navigate to the "Sitemaps" section.
- Submit your sitemap URL (usually yourdomain.com/wp-sitemap.xml).
This action tells Brave's indexer about your site's structure and which pages to prioritize. It is a one-time setup that significantly improves indexing speed and coverage for Claude.
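The default WordPress sitemap at wp-sitemap.xml is a sitemap index that points at child sitemaps. Before submitting, it is worth confirming the file parses as valid XML and actually lists those children. A minimal sketch using the standard library (the inlined sample stands in for a live fetch of your wp-sitemap.xml):

```python
import xml.etree.ElementTree as ET

# Sample WordPress-style sitemap index; in practice, fetch the live file
# from https://yourdomain.com/wp-sitemap.xml before submitting to Brave.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://yourdomain.com/wp-sitemap-posts-post-1.xml</loc></sitemap>
  <sitemap><loc>https://yourdomain.com/wp-sitemap-pages-page-1.xml</loc></sitemap>
</sitemapindex>"""

# The sitemaps.org namespace must be registered for findall() to match.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SAMPLE)
locs = [el.text for el in root.findall("sm:sitemap/sm:loc", ns)]
print(f"{len(locs)} child sitemaps found")
```

If parsing raises an error or the list comes back empty, fix the sitemap before submitting it, since Brave's indexer will hit the same problem.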
Step 3: Implement an llms.txt File
The llms.txt file is an emerging convention that acts like a robots.txt specifically for large language models. While not yet a formal standard, Anthropic has been a vocal supporter of the specification. Implementing it sends a strong, positive signal that you are preparing your content for AI consumption.
An llms.txt file tells AI systems what your site is about, which pages are most important, and how you prefer your content to be used. You can learn more about formatting and implementation in the complete guide to llms.txt.
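A minimal llms.txt, following the draft convention of an H1 site name, a blockquote summary, and H2 sections of annotated links, might look like the sketch below. The page names and URLs here are illustrative placeholders, not part of any standard:

```markdown
# Example Site

> A WordPress blog covering answer engine optimization for small publishers.

## Key pages

- [Getting cited by Claude](https://yourdomain.com/claude-guide): crawler setup walkthrough
- [Brave sitemap submission](https://yourdomain.com/brave-sitemap): step-by-step indexing guide

## Optional

- [Archive](https://yourdomain.com/archive): older posts
```

The file lives at the site root (yourdomain.com/llms.txt), alongside robots.txt.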
Why Claude Ignores Content Even When Unblocked
Simply allowing access is not enough. Claude's algorithms are designed to find definitive, trustworthy information. If your content is thin, vague, or lacks verifiable claims, it will be crawled but never cited.
Claude's preference for factual density means that a well-researched article from 2024 can easily outrank a shallow blog post from last week. This is a direct contrast to Perplexity, which shows an extreme bias for content updated in the last 30 days.
To get cited by Claude, your content must:
- Contain specific data, numbers, and statistics.
- Reference authoritative sources (studies, official reports, expert commentary).
- Make clear, definitive statements rather than using weak or hedging language.
This is a key part of preparing your website for ClaudeBot and Anthropic AI in 2026. Technical access opens the door; content quality earns the citation.