JavaScript SEO has been a messy topic for years, made messier by the proliferation of rendering modes and the arrival of AI crawlers that don't render JavaScript at all. This guide covers what each mode actually means for bot visibility, and what to do about it.
The four rendering modes
CSR — Client-Side Rendering
The browser receives a near-empty HTML document and JavaScript builds the page in the browser. Create React App, plain Vite apps, and many older SPAs work this way. Bot visibility: poor. A bot that doesn't execute JavaScript gets the shell and nothing else.
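One way to spot this case is a rough heuristic: if the raw HTML's body contains almost no visible text once scripts and tags are stripped, a non-rendering bot sees nothing. This is a sketch I'm assuming for illustration, not an official tool, and the 50-character threshold is an arbitrary cutoff:

```typescript
// Heuristic: does this HTML look like a CSR shell?
// Strips scripts and tags from the <body>, then checks whether any
// meaningful visible text remains. Threshold of 50 chars is arbitrary.
function looksLikeCsrShell(html: string): boolean {
  const bodyMatch = html.match(/<body[^>]*>([\s\S]*)<\/body>/i);
  if (!bodyMatch) return true; // no body at all: nothing for a bot to read
  const visibleText = bodyMatch[1]
    .replace(/<script[\s\S]*?<\/script>/gi, "") // drop inline and external scripts
    .replace(/<[^>]+>/g, "") // drop remaining tags
    .trim();
  return visibleText.length < 50;
}
```

Run it against the raw HTML response (curl, not a headless browser) so you see what a non-rendering bot sees.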
SSR — Server-Side Rendering
The server generates HTML for each request, so the bot receives full page content. Good for bot visibility. The tradeoff is server load and latency. Next.js supports SSR via getServerSideProps (Pages Router) or dynamically rendered Server Components (App Router).
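A minimal Pages Router sketch of the getServerSideProps pattern. The fetchProduct function and the Product fields are stand-ins I'm assuming for illustration; in a real app this file would live at something like pages/products/[id].tsx and fetch from your actual data source:

```typescript
type Product = { id: string; name: string; price: number };

// Stand-in for a real database or API call.
async function fetchProduct(id: string): Promise<Product> {
  return { id, name: "Example widget", price: 19.99 };
}

// Runs on every request, so even a JS-less bot receives
// fully rendered HTML with the product data in it.
export async function getServerSideProps(ctx: { params: { id: string } }) {
  const product = await fetchProduct(ctx.params.id);
  return { props: { product } };
}
```

The page component then renders from props as usual; the key point is that the data is resolved server-side before any HTML leaves the server.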
SSG — Static Site Generation
Pages are pre-built at deploy time. Bots get complete HTML because it's just a static file. Excellent for bot visibility, but only works for content that doesn't change between deploys. Product prices, inventory levels, and personalized content can't be statically generated.
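For comparison, here is the SSG equivalent with getStaticPaths and getStaticProps (Pages Router). The in-memory pages map is an assumption standing in for a CMS or filesystem; everything here runs once at build time:

```typescript
// Stand-in content source; in practice this would be a CMS or markdown files.
const pages: Record<string, string> = {
  "getting-started": "Install the package and run the dev server.",
  faq: "Answers to common questions.",
};

// Tells Next.js which pages to pre-build at deploy time.
export async function getStaticPaths() {
  return {
    paths: Object.keys(pages).map((slug) => ({ params: { slug } })),
    fallback: false, // unknown slugs 404 instead of rendering on demand
  };
}

// Runs at build time only; the output is a static HTML file.
export async function getStaticProps(ctx: { params: { slug: string } }) {
  return { props: { body: pages[ctx.params.slug] } };
}
```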
ISR — Incremental Static Regeneration
Next.js-specific. Pages are statically generated but can be revalidated on a schedule or on demand. Better than SSG for dynamic content, but revalidation takes time. A bot crawling right after a price change might get stale HTML.
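In code, ISR is just getStaticProps plus a revalidate window. The 60-second interval and the price stand-in below are illustrative assumptions; pick an interval that matches how fast your data actually changes:

```typescript
// ISR sketch: identical to SSG, plus a revalidation window.
export async function getStaticProps() {
  const price = 19.99; // stand-in for a live price lookup
  return {
    props: { price },
    revalidate: 60, // regenerate this page at most once every 60 seconds
  };
}
```

The staleness caveat above follows directly from this number: a bot crawling within the window gets the previously generated HTML.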
The AI crawler complication
Even SSR and SSG leave a problem: LLM crawlers like GPTBot and ClaudeBot don't execute JavaScript and process your HTML as plain text. They don't understand JavaScript frameworks, they don't evaluate DOM structure, and they struggle with navigation-heavy HTML. A product page served via SSR contains lots of markup that isn't the product: header, footer, nav, sidebar, scripts.
Serving clean Markdown to LLM crawlers gives them exactly the content they need: product name, description, specs, price, reviews — in a format that maps directly to how language models represent text. This is what PerfectSearch calls LLM-native rendering.
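A minimal sketch of what that could look like: the same product data, emitted as Markdown instead of a templated HTML page. The Product shape and field names are assumptions for illustration, not a fixed schema:

```typescript
type Product = {
  name: string;
  price: number;
  description: string;
  specs: Record<string, string>;
};

// Render the product as plain Markdown: no header, nav, sidebar, or scripts,
// just the content an LLM crawler actually needs.
function productToMarkdown(p: Product): string {
  const specs = Object.entries(p.specs)
    .map(([key, value]) => `- **${key}:** ${value}`)
    .join("\n");
  return [
    `# ${p.name}`,
    `**Price:** $${p.price.toFixed(2)}`,
    p.description,
    "## Specs",
    specs,
  ].join("\n\n");
}
```

The output is a fraction of the bytes of the equivalent HTML page, and every byte is product content.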
What to do right now
First, check whether your pages have a rendering problem. Then fix it with the approach that fits your stack. If you're on Next.js, the middleware approach is the least disruptive. If you're on any other stack, the DNS proxy works with zero code changes.
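The middleware idea reduces to one decision: is this request from an LLM crawler, and if so, serve it the clean version. Below is a sketch under assumptions: the user-agent list is real but not exhaustive, and the /llm route prefix is hypothetical:

```typescript
// Known LLM crawler user-agent tokens (real tokens, but not a complete list).
const LLM_CRAWLERS = /GPTBot|ClaudeBot|PerplexityBot|CCBot/i;

export function isLlmCrawler(userAgent: string): boolean {
  return LLM_CRAWLERS.test(userAgent);
}

// In a Next.js project this check would plug into middleware.ts, e.g.:
//
//   import { NextRequest, NextResponse } from "next/server";
//   export function middleware(req: NextRequest) {
//     if (isLlmCrawler(req.headers.get("user-agent") ?? "")) {
//       // Rewrite (not redirect) to a hypothetical Markdown route.
//       return NextResponse.rewrite(new URL(`/llm${req.nextUrl.pathname}`, req.url));
//     }
//   }
```

Regular users and traditional crawlers fall through untouched, which is what makes this approach low-risk.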
Run a site audit on your most important pages first.
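A simple way to audit a page by hand: fetch the raw HTML (what a non-rendering bot receives) and check that the content you care about is actually in it. The helper below is a sketch; the mustContain list is yours to define per page:

```typescript
// Returns every expected string that is absent from the raw HTML.
// Anything returned here is invisible to bots that don't run JavaScript.
export function findMissingContent(
  rawHtml: string,
  mustContain: string[],
): string[] {
  return mustContain.filter((needle) => !rawHtml.includes(needle));
}

// Usage against a live page (Node 18+, which ships a global fetch):
//
//   const html = await (await fetch("https://example.com/products/42")).text();
//   console.log(findMissingContent(html, ["Blue Widget", "$19.99"]));
```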