TL;DR
- An llms.txt file is like robots.txt, but provides instructions for AI crawlers.
- It tells AI models about your site’s purpose, important pages, and usage rules.
- You can create it manually or use a WordPress plugin to generate it automatically.
- While adoption is growing, it is forward-looking tech; not all AI engines use it yet in 2026.
Creating an llms.txt file gives you a direct line of communication to the AI crawlers visiting your website. This simple text file, placed in your site's root directory, acts as a guide for models from Anthropic, OpenAI, and others. It helps them understand your content's structure and your preferences for how it should be used and cited. This guide shows you exactly how to create and optimize an llms.txt file in your WordPress root directory.
The process is straightforward. You can either build the file by hand and upload it via FTP, or you can use a dedicated plugin to handle the generation and updates for you.
What is an llms.txt File?
An llms.txt file is a public, plain-text file that lives at the root of your domain, for example, yourdomain.com/llms.txt. It follows an open specification designed to give site owners control over how Large Language Models (LLMs) interact with their content.
Think of it as a more detailed version of robots.txt. While robots.txt is a simple set of Allow and Disallow commands for crawlers, llms.txt provides richer context. It can define your site's purpose, point to key pages, and even suggest how AI should format citations when referencing your material.
How to Create and Optimize an llms.txt File in Your WordPress Root Directory
You have two main options for adding this file to your WordPress site. The manual method offers full control, while the plugin method provides automation and ease of use.
Method 1: Create the File Manually
This approach involves creating a text file on your computer and uploading it to your server.
- Create a New Text File: Open a plain text editor like Notepad (Windows) or TextEdit (Mac). Do not use a word processor like Microsoft Word, as it adds formatting that will break the file.
- Add Your Directives: Populate the file with the desired fields. Start with a simple configuration.
- Save the File: Save the file with the exact name `llms.txt`.
- Upload to Your Root Directory: Use an FTP client (like FileZilla) or your hosting provider's File Manager to upload the `llms.txt` file. It must be placed in the main `public_html` or `www` folder, the same location where your `wp-config.php` file resides.
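For step 2 above, a minimal starter configuration might look like the following; the paths and sitemap URL are placeholders you should adapt to your own site:

```
User-agent: *
Allow: /
Disallow: /wp-admin/
Sitemap: https://yourdomain.com/wp-sitemap.xml
```

Once uploaded, confirm the file loads in a browser at yourdomain.com/llms.txt before moving on.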
Method 2: Use a WordPress Plugin (Recommended)
For most WordPress users, a plugin is the simplest and most reliable way to manage this file. A good WordPress llms.txt plugin ensures the file is always correctly formatted and updated.
The AEO God Mode plugin includes a free llms.txt generator that automates this entire process. After installation, it scans your site's content, pages, and settings to build a compliant file. It also provides a custom context area for adding freeform instructions and a live preview before you save. This removes the risk of syntax errors and the need for FTP access.
The llms.txt Specification Explained
The file uses a simple Field: Value format. While the specification is still developing, several key directives are commonly used.
| Directive | Purpose | Example |
|---|---|---|
| User-agent | Specifies which AI crawler the rules apply to. Use * for all. | User-agent: * |
| Allow | Paths you explicitly want AI to crawl and use. | Allow: /blog/ |
| Disallow | Paths you want to exclude from AI use, like user-generated content or admin areas. | Disallow: /forums/ |
| Sitemap | Points crawlers to your XML sitemap for content discovery. | Sitemap: https://yourdomain.com/wp-sitemap.xml |
| Content-focus | A short description of your site’s main topics. | Content-focus: WordPress optimization for AI search |
| Citation-format | Your preferred format for how AI should cite your content. | Citation-format: {title} by {author} on {publication_date} |
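To make the Field: Value format concrete, here is a minimal sketch of how a crawler might read these directives into a lookup table. This is an illustration only, not an official parser; the function name and comment-handling behavior are assumptions:

```python
# Sketch: parse robots-style "Field: Value" lines from an llms.txt file
# into a dict mapping lowercased field names to lists of values.
# Blank lines and "#" comments are skipped (an assumed convention).

def parse_llms_txt(text: str) -> dict[str, list[str]]:
    """Parse 'Field: Value' lines, ignoring blanks and # comments."""
    directives: dict[str, list[str]] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        field, sep, value = line.partition(":")
        if not sep:
            continue  # not a directive line
        # A field may repeat (e.g. multiple Disallow lines), so store a list.
        directives.setdefault(field.strip().lower(), []).append(value.strip())
    return directives

sample = """\
User-agent: *
Allow: /blog/
Disallow: /wp-admin/
Sitemap: https://yourdomain.com/wp-sitemap.xml
"""
rules = parse_llms_txt(sample)
print(rules["disallow"])  # ['/wp-admin/']
```

Note that `str.partition` splits only on the first colon, so URLs in a Sitemap value survive intact.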
llms.txt Examples for Different Sites
Your file's contents will change based on your site's goals.
Example 1: Standard Blog
A typical blog wants all its articles indexed but might want to exclude comment sections.
User-agent: *
Allow: /
Disallow: /comments/
Disallow: /wp-admin/
Sitemap: https://yourdomain.com/sitemap_index.xml
Content-focus: Technical guides on Answer Engine Optimization and WordPress performance.
Example 2: E-commerce Store
An e-commerce site wants product pages included but may want to block user account and checkout pages.
User-agent: *
Allow: /product/
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Sitemap: https://yourdomain.com/sitemap_index.xml
Content-focus: Sells high-performance WordPress plugins for AEO.
The Reality Check: Do AI Engines Use llms.txt in 2026?
It is important to have realistic expectations. The llms.txt file is an emerging, community-driven proposal from llmstxt.org, not a fully ratified standard that all AI companies have officially adopted.
Based on extensive CDN and server log audits in early 2026, major AI crawlers like GPTBot, PerplexityBot, and ClaudeBot are not yet observed actively requesting llms.txt files at scale. Data shows that only around 10% of the top 300,000 domains currently implement the file. For a deeper analysis, see this data-driven answer on whether llms.txt is worth implementing.
Think of implementing llms.txt today as building forward-looking infrastructure. It prepares your site for a future where these directives become standard practice. While it may not provide a direct optimization benefit today, it establishes good governance for AI interaction. Some AI companies, Anthropic among them, are on record encouraging its use, making it a valuable signal for platforms like Claude. Properly preparing your website for ClaudeBot should include this step.
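If you want to run a similar audit on your own site, a small script can scan your access logs for the crawler names mentioned above and count how many of those requests were for llms.txt. This is a sketch only: the log lines below are invented samples, and the bot list simply matches user-agent substrings:

```python
# Sketch: count requests from known AI crawlers in access-log lines,
# and how many of those requests were for /llms.txt.
# Bot tokens are the user-agent names discussed above.

AI_BOTS = ("GPTBot", "PerplexityBot", "ClaudeBot")

def count_ai_hits(log_lines):
    """Return (hits per bot, number of bot requests for /llms.txt)."""
    hits = {bot: 0 for bot in AI_BOTS}
    llms_requests = 0
    for line in log_lines:
        for bot in AI_BOTS:
            if bot in line:
                hits[bot] += 1
                if "/llms.txt" in line:
                    llms_requests += 1
    return hits, llms_requests

# Invented sample lines in a combined-log-like format:
sample_log = [
    '1.2.3.4 - - [10/Jan/2026] "GET /blog/ HTTP/1.1" 200 "-" "GPTBot/1.0"',
    '5.6.7.8 - - [10/Jan/2026] "GET /llms.txt HTTP/1.1" 200 "-" "ClaudeBot/1.0"',
]
hits, llms = count_ai_hits(sample_log)
print(hits["GPTBot"], llms)  # 1 1
```

In practice you would feed this the lines of your real server log file; if the llms.txt request count stays at zero over weeks of traffic, that matches the low adoption the audits above describe.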
Frequently Asked Questions
Where should I upload the llms.txt file? Place it in your site's root directory, the folder typically named public_html or www, so that it is served at yourdomain.com/llms.txt.