JavaScript SEO: How Google Handles JS-Rendered Content
How Googlebot processes JavaScript, the two-phase rendering model, why JS can delay indexing, and practical patterns to ensure your JS-rendered content gets indexed.
JavaScript-heavy sites — React, Vue, Angular SPAs — present unique challenges for SEO. Googlebot can execute JavaScript, but it processes pages in two phases separated by potential delays of hours or days. Understanding this pipeline is essential for any modern web developer.
Google's Two-Phase Rendering
- Phase 1 (Crawl): Googlebot fetches the HTML. If the page is empty HTML waiting for JS to populate content, Googlebot sees almost nothing. This initial version is processed quickly.
- Phase 2 (Render): Googlebot adds the URL to a render queue. A headless Chrome instance executes the JavaScript and generates the final DOM. This can happen immediately or days later depending on crawl budget.
- The rendered version is what actually gets indexed.
New pages on JS-heavy sites can take hours to days to be fully indexed because of the render queue. Time-sensitive content (news, events) may miss its indexing window entirely. Server-side rendering avoids this delay because the content is already present in the initial HTML response.
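The two phases can be made concrete with a quick self-check: does the raw, un-rendered HTML (what phase 1 sees) already contain the content you need indexed? A minimal sketch, where `hasCriticalContent` is a hypothetical helper, not anything Googlebot actually runs:

```javascript
// Phase 1 sees only the raw HTML, before any JavaScript executes.
// This illustrative check asks: is there a non-empty <h1> in the markup
// as delivered, or does the page depend on the render queue?
function hasCriticalContent(html) {
  const h1 = /<h1[^>]*>([^<]+)<\/h1>/i.exec(html);
  return Boolean(h1 && h1[1].trim().length > 0);
}

// CSR page: the crawler's first pass sees only an empty mount point.
const csrHtml = '<html><body><div id="root"></div></body></html>';

// SSR page: the same content is already in the delivered markup.
const ssrHtml = '<html><body><h1>Product Name</h1><p>Details here.</p></body></html>';

console.log(hasCriticalContent(csrHtml)); // false: indexing waits on phase 2
console.log(hasCriticalContent(ssrHtml)); // true: indexable in phase 1
```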
Rendering Approaches Compared
| Approach | How It Works | SEO Impact |
|---|---|---|
| Client-Side Rendering (CSR) | Empty HTML, JS builds DOM in browser | Worst — depends entirely on render queue |
| Server-Side Rendering (SSR) | Full HTML generated on server per request | Best — Googlebot sees content immediately |
| Static Site Generation (SSG) | Pre-built HTML files at deploy time | Excellent — instant content, no render needed |
| Incremental Static Regeneration | SSG with scheduled revalidation | Excellent for mostly-static content |
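The SSR and SSG rows above boil down to one idea: the HTML string is produced before the browser ever runs JavaScript. A minimal sketch of the server side, assuming a hypothetical `renderProductPage` helper rather than any specific framework's API:

```javascript
// SSR in miniature: the server assembles the complete HTML per request,
// so the crawler's very first fetch already contains the content.
// renderProductPage and the product fields are illustrative only.
function renderProductPage(product) {
  return [
    '<!doctype html>',
    '<html><head>',
    `<title>${product.name} – Example Store</title>`,
    '</head><body>',
    `<h1>${product.name}</h1>`,
    `<p>${product.description}</p>`,
    '</body></html>',
  ].join('');
}

const html = renderProductPage({
  name: 'Trail Shoe',
  description: 'Lightweight trail running shoe.',
});

console.log(html.includes('<h1>Trail Shoe</h1>')); // true: content is in the initial HTML
```

SSG is the same function run once at deploy time and written to disk, instead of per request.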
Dynamic Rendering
Dynamic rendering serves pre-rendered HTML to crawlers and normal JS to users. It works as a stopgap, but Google describes it as a workaround rather than a long-term solution. The detection logic (serving different content to bots vs. users) can also be flagged as cloaking if the pre-rendered version diverges from what users see.
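The whole approach hinges on user-agent detection. A sketch of the branching logic, where the crawler pattern list is illustrative and deliberately incomplete:

```javascript
// Dynamic rendering branches on the requester: crawlers get a
// pre-rendered snapshot, browsers get the normal JS bundle.
// This pattern list is a sketch, not an exhaustive or maintained one.
const CRAWLER_PATTERN = /googlebot|bingbot|baiduspider|yandex/i;

function isCrawler(userAgent) {
  return CRAWLER_PATTERN.test(userAgent || '');
}

// In a real server handler you would branch on the result:
//   if (isCrawler(req.headers['user-agent'])) -> serve pre-rendered HTML
//   else                                      -> serve the SPA shell
console.log(isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1)')); // true
console.log(isCrawler('Mozilla/5.0 (Windows NT 10.0) Chrome/120')); // false
```

The cloaking risk lives in what you serve after this branch: the snapshot must contain the same content users get, only pre-rendered.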
Practical JS SEO Rules
- Critical content (title, headings, main text) should be in the initial HTML — not JS-rendered.
- Use server-side rendering or static generation for content that needs to rank.
- Avoid lazy-loading critical content (above-the-fold, main body text).
- Test how your pages look to Googlebot using Google's URL Inspection Tool in Search Console.
- Internal links must work even with JavaScript disabled — use real `<a href>` tags.
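The last rule matters because crawlers discover URLs by extracting `href` attributes; a click handler on a `div` gives them nothing to follow. A simplified sketch of the extraction (a real crawler uses a full HTML parser, not a regex):

```javascript
// Crawlers follow href attributes on <a> tags. JS-only navigation
// (onclick + router.push) exposes no URL in the markup at all.
function extractLinks(html) {
  return [...html.matchAll(/<a\s[^>]*href="([^"]+)"/gi)].map((m) => m[1]);
}

const goodLink = '<a href="/pricing">Pricing</a>';
const badLink = '<div onclick="router.push(\'/pricing\')">Pricing</div>';

console.log(extractLinks(goodLink)); // ['/pricing']: discoverable
console.log(extractLinks(badLink)); // []: invisible to the crawler
```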