JavaScript SEO: How Google Handles JS-Rendered Content

How Googlebot processes JavaScript, the two-phase rendering model, why JS can delay indexing, and practical patterns to ensure your JS-rendered content gets indexed.

Hugo Team · October 7, 2026
javascript seo · rendering · client-side rendering · server-side rendering · googlebot · indexing

JavaScript-heavy sites — React, Vue, Angular SPAs — present unique challenges for SEO. Googlebot can execute JavaScript, but it processes pages in two phases separated by potential delays of hours or days. Understanding this pipeline is essential for any modern web developer.

Google's Two-Phase Rendering

  1. Phase 1 (Crawl): Googlebot fetches the raw HTML. If the page is an empty shell waiting for JavaScript to populate content, Googlebot sees almost nothing at this stage. This initial version is processed quickly.
  2. Phase 2 (Render): Googlebot adds the URL to a render queue. A headless Chrome instance executes the JavaScript and generates the final DOM. Depending on crawl budget, this can happen almost immediately or days later.
  3. The rendered version, not the initial HTML, is what actually gets indexed.
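The difference between the two phases can be sketched for a client-side-rendered page. This is an illustration only: `simulateRender` and the page contents are hypothetical stand-ins for what headless Chrome does when it executes the page's bundle.

```javascript
// What Googlebot fetches in Phase 1 (crawl): an empty application shell.
const phase1Html = `<!DOCTYPE html>
<html>
  <head><title>Loading…</title></head>
  <body><div id="root"></div><script src="/bundle.js"></script></body>
</html>`;

// Stand-in for Phase 2 (render): executing the page's JavaScript fills the shell.
function simulateRender(shellHtml, appContent) {
  return shellHtml.replace(
    '<div id="root"></div>',
    `<div id="root">${appContent}</div>`
  );
}

const phase2Html = simulateRender(
  phase1Html,
  '<h1>Product Page</h1><p>Actual content.</p>'
);

// Only the Phase 2 version contains the indexable content.
console.log(phase1Html.includes('Product Page')); // false
console.log(phase2Html.includes('Product Page')); // true
```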
⚠️ The Render Queue Delay

New pages on JS-heavy sites can take hours to days to be fully indexed because of the render queue. Time-sensitive content (news, events) may miss its indexing window entirely. Server-side rendering avoids the queue altogether, because the content is already present in the initial HTML.

Rendering Approaches Compared

| Approach | How It Works | SEO Impact |
| --- | --- | --- |
| Client-Side Rendering (CSR) | Empty HTML; JS builds the DOM in the browser | Worst — depends entirely on the render queue |
| Server-Side Rendering (SSR) | Full HTML generated on the server per request | Best — Googlebot sees content immediately |
| Static Site Generation (SSG) | HTML files pre-built at deploy time | Excellent — instant content, no render needed |
| Incremental Static Regeneration (ISR) | SSG with scheduled revalidation | Excellent for mostly-static content |
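The server-side approaches boil down to the same idea: the server emits complete HTML, so a crawler needs no JavaScript at all. A minimal, framework-free sketch of an SSR render function follows; `renderProductPage`, `escapeHtml`, and the product object are hypothetical examples, not a real framework API.

```javascript
// Escape user-provided strings before interpolating them into HTML.
function escapeHtml(s) {
  return s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
}

// Server-side render: build the full document per request, content included.
function renderProductPage(product) {
  return `<!DOCTYPE html>
<html>
  <head><title>${escapeHtml(product.name)} | Example Shop</title></head>
  <body>
    <h1>${escapeHtml(product.name)}</h1>
    <p>${escapeHtml(product.description)}</p>
  </body>
</html>`;
}

const html = renderProductPage({
  name: 'Blue Widget',
  description: 'A widget, in blue.',
});

// The indexable content is in the response body from the first byte.
console.log(html.includes('<h1>Blue Widget</h1>')); // true
```

SSG is the same function run once at deploy time with the output written to disk, rather than on every request.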

Dynamic Rendering

Dynamic rendering serves pre-rendered HTML to crawlers and the normal JavaScript app to users. It is a valid intermediate solution, but Google describes it as a workaround, not a long-term fix. The detection logic (serving different content to bots than to users) can also be flagged as cloaking if implemented incorrectly.
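The detection step is usually a user-agent check at the edge or in server middleware. A sketch follows; the pattern list is illustrative rather than exhaustive, `handleRequest` is a hypothetical handler, and production setups typically also verify Googlebot via reverse DNS lookup, since user-agent strings can be spoofed.

```javascript
// Known-crawler user-agent fragments (illustrative, not exhaustive).
const CRAWLER_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider/i;

function isCrawler(userAgent) {
  return CRAWLER_PATTERN.test(userAgent || '');
}

// Hypothetical request handler showing where the branch happens.
function handleRequest(req) {
  return isCrawler(req.headers['user-agent'])
    ? 'serve pre-rendered snapshot'
    : 'serve JS application shell';
}

console.log(isCrawler(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // true
console.log(isCrawler('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0')); // false
```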

Practical JS SEO Rules

  • Critical content (title, headings, main text) should be in the initial HTML — not JS-rendered.
  • Use server-side rendering or static generation for content that needs to rank.
  • Avoid lazy-loading critical content (above-the-fold, main body text).
  • Test how your pages look to Googlebot using Google's URL Inspection Tool in Search Console.
  • Internal links must work even with JavaScript disabled — use real <a href> tags.
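The last rule is worth illustrating: Googlebot discovers URLs from `<a href>` attributes, while click handlers without an href are invisible to crawling. The toy extractor below is a hypothetical sketch in the spirit of what a crawler's link discovery does, not Googlebot's actual parser.

```javascript
// A link a crawler can follow vs. one it cannot.
const crawlable = '<a href="/pricing">Pricing</a>';                        // followed
const notCrawlable = '<span onclick="goTo(\'/pricing\')">Pricing</span>';  // ignored

// Toy link discovery: pull hrefs out of anchor tags.
function extractHrefs(html) {
  return [...html.matchAll(/<a\s[^>]*href="([^"]+)"/g)].map((m) => m[1]);
}

console.log(extractHrefs(crawlable + notCrawlable)); // ['/pricing']
```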

