TL;DR
- Most AI crawlers do not execute JavaScript, so they never see content that only appears after scripts run.
- If your site uses Client-Side Rendering (CSR), 69% of AI bots see a blank page.
- Server-Side Rendering (SSR) sends fully-formed HTML that all bots can read.
- To check your site, right-click and “View Page Source.” If your content isn’t there, AI can’t see it.
- Fixing this is critical for getting cited by AI engines like Perplexity and Claude.
Your modern, JavaScript-powered website might be completely invisible to the AI engines you want to rank in. While it looks great to a human user, many AI crawlers see nothing but an empty shell. They don't wait for your code to load; they read the initial HTML and move on.
This isn't a minor issue. It means your carefully written content, product details, and expert answers never enter the models of major AI platforms. The core problem is how your site delivers content, and for a majority of AI crawlers, the current approach is broken.
Parsing JavaScript for AI: Why 69% of Bots Ignore Your Content
Think of a website as a house. Server-Side Rendering (SSR) delivers a fully built, move-in ready house. Client-Side Rendering (CSR) delivers a flat-pack box from IKEA with instructions.
Human browsers are happy to assemble the IKEA furniture. They have the time and resources. But AI crawlers are on a tight schedule. They need to visit millions of sites. They don't have time for assembly. They glance at the box, see it's not a house, and leave.
This is the fundamental challenge of parsing JavaScript for AI. Crawlers are built for efficiency. Their job is to download and index text from HTML. Executing JavaScript is a separate, resource-intensive process called rendering. It requires a virtual browser, significant CPU power, and time. Most AI companies choose not to pay this cost, so their bots simply skip it.
If your site relies on JavaScript running in the user's browser (client-side) to show text, pricing, or articles, that content is hidden from the 69% of bots that don't render.
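To make this concrete, here is an illustrative sketch of what a non-rendering crawler extracts from a CSR page: it reads the raw HTML and strips tags, but never executes `<script>` blocks. This is a simplified simulation, not any vendor's actual pipeline.

```javascript
// Illustrative sketch: what a non-rendering crawler "sees" on a CSR page.
// It strips tags from the raw HTML but never runs <script> content.
const csrHtml = `
<!doctype html>
<html><body>
  <div id="app"></div>
  <script>
    // This runs in a browser, but a non-rendering bot never executes it.
    document.getElementById('app').textContent = 'Pricing: $49/mo';
  </script>
</body></html>`;

function visibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // drop scripts entirely
    .replace(/<style[\s\S]*?<\/style>/gi, '')   // drop stylesheets too
    .replace(/<[^>]+>/g, ' ')                   // strip remaining tags
    .replace(/\s+/g, ' ')
    .trim();
}

console.log(visibleText(csrHtml)); // prints an empty string: the bot sees nothing
```

Run the same function on a server-rendered page (for example `<body><p>Pricing: $49/mo</p></body>`) and it returns the pricing text, because the content is already in the HTML.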
Which AI Crawlers Can (and Can't) Render JavaScript?
Not all bots are created equal. Some have limited rendering abilities, while others skip JavaScript entirely. This inconsistency is why relying on client-side rendering is so risky for Answer Engine Optimization (AEO).
| AI Crawler | Vendor | Renders JavaScript? | Notes |
|---|---|---|---|
| Googlebot | Google (Gemini) | Yes | Renders using a version of Chrome, but indexing can happen in two waves. The initial HTML is still vital. |
| OAI-SearchBot | OpenAI (ChatGPT) | Limited | Based on Bing’s crawler, which has some rendering abilities but is less consistent than Googlebot. |
| PerplexityBot | Perplexity | No | Focuses on raw HTML for speed and efficiency. It will not see CSR content. |
| ClaudeBot | Anthropic | No | Documented to only parse HTML. It does not execute JavaScript. |
| Google-Extended | Google (Training) | No | This bot gathers training data and prioritizes raw text. It generally does not render. |
| Applebot-Extended | Apple | No | Used for Apple Intelligence training. It focuses on the initial HTML payload. |
The bots that power a huge portion of the AI answer ecosystem, like Perplexity and Claude, do not render JavaScript. If you want to appear in their answers, your content must be in the initial HTML document. You can learn more about specific crawlers and their behaviors in our guide to OpenAI web crawlers.
The Two Rendering Models: CSR vs. SSR
Your website's rendering model determines whether AI bots see a finished house or a box of parts. Understanding the difference is the first step to fixing your site's visibility.
Client-Side Rendering (CSR)
This is common with modern JavaScript frameworks like React, Vue, and Angular. The server sends a nearly empty HTML file and a large JavaScript file. The user's browser executes the JavaScript to build the page and display the content.
- ✓ Rich, app-like user interactions
- ✓ Can feel faster after the initial load
- ✓ Less load on the web server
- ✗ Content is invisible to non-rendering bots
- ✗ Poor initial page load performance
- ✗ Bad for AEO and traditional SEO
Server-Side Rendering (SSR)
With SSR, the server does the work. It runs the JavaScript and builds the complete HTML page before sending it to the browser. The browser receives a finished document that is immediately visible to both users and bots.
- ✓ Excellent for AEO and SEO visibility
- ✓ Fast “First Contentful Paint” for users
- ✓ Content is readable by all crawlers
- ✗ Increases server CPU load
- ✗ Can be more complex to configure
- ✗ Slower “Time to Interactive” for complex apps
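In code, the SSR model amounts to assembling the finished document on the server before it leaves. Below is a minimal, framework-free sketch of the idea; the template and post data are illustrative.

```javascript
// Sketch of the SSR idea: the server builds complete HTML from data
// before responding, so the content is in the initial payload any bot reads.
function renderPostHtml(post) {
  return `<!doctype html>
<html>
  <body>
    <article>
      <h1>${post.title}</h1>
      <p>${post.body}</p>
    </article>
  </body>
</html>`;
}

const html = renderPostHtml({
  title: 'Why SSR matters for AEO',
  body: 'This text is readable in the raw HTML, no rendering required.',
});
console.log(html.includes('Why SSR matters for AEO')); // true
```

Frameworks like Next.js automate exactly this step at scale: they run your components on the server and ship the resulting HTML to the browser.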
For AEO, the choice is clear. Your content must be server-rendered to ensure all AI crawlers can access, index, and cite it.
How to Diagnose JavaScript Rendering Issues
Before you can fix the problem, you need to confirm you have it. There are several simple tools and techniques to see your website the way an AI crawler does.
1. The "View Page Source" Test
This is the fastest method, the same check outlined in the TL;DR above. The page source is the raw HTML file your server sends, before any JavaScript runs. If it contains your text, you are likely using SSR. If it is mostly code with no readable content, you are using CSR. Look for the telltale signs: an empty `<body>` tag or a single `<div id="app">` element.
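The same check can be scripted. The heuristic below assumes the classic CSR signature of a near-empty `<body>` with a single mount-point `div`; the marker patterns and the length threshold are illustrative, not a standard.

```javascript
// Heuristic sketch: given a page's raw source, guess CSR vs. SSR.
// A near-empty body plus a single #app/#root mount point is the
// classic client-side rendering signature.
function looksLikeCsrShell(rawHtml) {
  const bodyMatch = rawHtml.match(/<body[^>]*>([\s\S]*?)<\/body>/i);
  const body = bodyMatch ? bodyMatch[1] : rawHtml;
  const withoutScripts = body.replace(/<script[\s\S]*?<\/script>/gi, '');
  const text = withoutScripts.replace(/<[^>]+>/g, '').trim();
  const hasMountPoint = /<div id=["'](app|root)["']/i.test(withoutScripts);
  return text.length < 40 && hasMountPoint; // threshold is illustrative
}

console.log(looksLikeCsrShell(
  '<body><div id="root"></div><script src="/bundle.js"></script></body>'
)); // true
console.log(looksLikeCsrShell(
  '<body><article><h1>Full article text, visible to every bot</h1></article></body>'
)); // false
```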
2. Google's Rendering Tools
Google is the most advanced at rendering, so if your content doesn't appear in their tools, it definitely won't appear for other bots.
- Rich Results Test: Paste your URL into Google's Rich Results Test. The tool will render the page and show you a screenshot. More importantly, it has a "View Crawled Page" option that shows you the final HTML after Google has rendered the JavaScript. This is the best-case scenario for what a bot can see.
- Google Search Console: The URL Inspection tool in GSC is powerful. You can "Test Live URL" to see how Google renders your page. It will show you the initial HTML crawl and the final rendered version, highlighting any differences or resources that failed to load.
3. AI Crawler Logs
Checking your server logs tells you who is visiting, but not what they see. If you see frequent visits from PerplexityBot and ClaudeBot but are getting zero citations, a rendering issue is the most likely cause. You need a way to check which AI bots are crawling your site. The AEO God Mode plugin includes an AI Crawler Log that tracks these visits automatically, making it easy to spot these patterns.
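If you prefer to check logs by hand, a short script can tally AI-crawler hits by user-agent token. The token list and sample log lines below are illustrative, not exhaustive.

```javascript
// Sketch: count AI-crawler visits in access-log lines by user-agent token.
// The token list is a partial, illustrative sample.
const AI_BOT_TOKENS = ['PerplexityBot', 'ClaudeBot', 'OAI-SearchBot', 'GPTBot', 'Google-Extended'];

function countBotHits(logLines) {
  const counts = Object.fromEntries(AI_BOT_TOKENS.map(t => [t, 0]));
  for (const line of logLines) {
    for (const token of AI_BOT_TOKENS) {
      if (line.includes(token)) counts[token] += 1;
    }
  }
  return counts;
}

const sampleLog = [
  '203.0.113.7 - - [01/Jan/2025] "GET /pricing HTTP/1.1" 200 "Mozilla/5.0 (compatible; PerplexityBot/1.0)"',
  '198.51.100.2 - - [01/Jan/2025] "GET /blog HTTP/1.1" 200 "Mozilla/5.0 (compatible; ClaudeBot/1.0)"',
];
console.log(countBotHits(sampleLog)); // PerplexityBot: 1, ClaudeBot: 1, others: 0
```

Frequent visits with zero citations is the pattern to watch for: the bots are arriving, but leaving empty-handed.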
Fixing Rendering Problems on Your WordPress Site
Once you've diagnosed a CSR problem, you need a plan to fix it. The right solution depends on your site's technical foundation.
Solution 1: Use Server-Rendered Themes
The simplest fix is prevention. When building a new site or redesigning an old one, choose a WordPress theme that generates static HTML on the server. Most themes from the WordPress repository and established marketplaces work this way. Avoid themes that rely heavily on JavaScript frameworks for displaying core content.
Solution 2: Implement Server-Side Rendering (SSR)
This is the most robust solution. For a "headless" WordPress site using a JavaScript front-end, this means using a framework like Next.js (for React) or Nuxt.js (for Vue). These frameworks have built-in SSR capabilities. The server runs the code and outputs pure HTML, solving the problem perfectly.
For a traditional WordPress site, this is much harder and often requires custom development. It's not a simple plugin fix.
Solution 3: Use Dynamic Rendering
Dynamic rendering is a hybrid approach. Your server detects the visitor's user-agent. If it's a human, it serves the normal client-side rendered version. If it's a known bot, it serves a pre-rendered, static HTML version of the page.
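The user-agent branch at the heart of dynamic rendering fits in a few lines. The bot pattern list and the Express-style middleware shape below are illustrative assumptions, not a complete implementation.

```javascript
// Sketch of dynamic rendering's core decision: known bots get a
// pre-rendered static page, humans get the normal client-side app.
// The pattern list is illustrative, not exhaustive.
const BOT_PATTERNS = /PerplexityBot|ClaudeBot|OAI-SearchBot|GPTBot|Google-Extended|Applebot-Extended/i;

function isKnownBot(userAgent) {
  return BOT_PATTERNS.test(userAgent || '');
}

// In an Express-style middleware (hypothetical handler name):
// app.use((req, res, next) => {
//   if (isKnownBot(req.get('user-agent'))) servePrerenderedHtml(req, res);
//   else next(); // fall through to the normal CSR app
// });

console.log(isKnownBot('Mozilla/5.0 (compatible; ClaudeBot/1.0)')); // true
console.log(isKnownBot('Mozilla/5.0 (Windows NT 10.0; Win64; x64)')); // false
```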
This can be achieved with third-party services like Prerender.io. It's a valid workaround that makes your content visible to bots without a full site rebuild. While some once considered it cloaking, it is now a widely accepted practice for accommodating crawlers that cannot execute JavaScript.
The Direct Impact on Your AEO KPIs
Failing to serve static HTML has a direct, negative impact on every important AEO metric. It's not just a technical detail; it's a barrier to performance.
- Citation Frequency: If bots can't read your content, they can't cite it. Your citation frequency for any query will be zero. You will be invisible.
- Citability Score: A page's Citability Score measures its readiness for AI citation. A client-side rendered page will always score an F because none of the positive signals—headings, direct answers, lists, tables—are present in the HTML for the tool or the bot to analyze.
- AI Share of Voice: You cannot capture your share of AI-driven conversations if your content never makes it into the AI models. Your competitors who use SSR will dominate, leaving you with no visibility.
Ultimately, proper rendering is foundational. Without it, no amount of content optimization matters. Before you worry about advanced strategies like crafting the perfect llms.txt file, you must ensure your content is delivered in a way bots can actually read.