SEO
May 4, 2026
JavaScript SEO for React/Next.js Sites: The Hydration Trap

The hydration trap, in one sentence
JavaScript SEO for React and Next.js sites comes down to one question: what does Googlebot see when it hits your URL before any JavaScript runs? If you searched "JavaScript SEO React" expecting framework-specific tips, the honest answer is upstream of the framework. React ships an HTML shell. JavaScript fills it in. Googlebot does not always wait.
That gap is the hydration trap.
Gravidy runs on Framer, which is React under the hood. We diagnosed our own rendering pipeline the same way we diagnose client sites in our SEO audits: disable JavaScript, reload, and read what comes back. This post reflects that audit, not a theoretical walkthrough.
What the hydration trap actually is
Hydration is not a bug. It is a timing problem.
React sends static HTML to the browser, then JavaScript "wakes up" the page and takes control of the DOM. Anything that only exists after that JavaScript executes is invisible to Googlebot during its first crawl pass. If your navigation links, product descriptions, or meta tags are injected client-side, they may never be indexed at all.
Three places this happens most often:
- Content fetched inside useEffect (and held in useState) arrives after the initial HTML response; a sketch of this pattern follows the list
- Meta tags written by react-helmet or assigned via document.title exist only after hydration
- Lazy-loaded components (React.lazy, dynamic Suspense boundaries) show a spinner in the initial HTML
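To make the first failure concrete, here is a minimal sketch of the pattern. The component name, API route, and fallback text are illustrative, not taken from any real codebase:

```tsx
"use client";
// Minimal sketch of the trap: the server HTML contains only the fallback,
// and the real content appears after hydration.
import { useEffect, useState } from "react";

export function ProductDescription({ id }: { id: string }) {
  const [description, setDescription] = useState<string | null>(null);

  useEffect(() => {
    // Runs only in the browser, after hydration. The wave-one HTML
    // never contains the fetched text.
    fetch(`/api/products/${id}`)
      .then((res) => res.json())
      .then((data) => setDescription(data.description));
  }, [id]);

  // This fallback is what "view source" and wave-one indexing see.
  if (!description) return <p>Loading…</p>;
  return <p>{description}</p>;
}
```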
The mismatch between what users see (post-hydration) and what crawlers index (pre-hydration) is the entire problem. Google documents this directly in their JavaScript SEO basics guide.
How Googlebot handles JavaScript: the two-wave model
Googlebot does render JavaScript. Just not immediately.
Google operates a two-wave indexing system. The first wave indexes the raw HTML response. The second wave hands the page to a Chromium-based renderer that executes JavaScript, but that second wave is queued, prioritized, and not guaranteed within any fixed window. Pages deeper in your architecture may only ever receive wave-one indexing because they fall outside crawl budget.
The practical implication is simple. Treat wave two as a bonus. Never as a dependency.
If your H1, body content, internal links, or canonical tag only exist after hydration, you are gambling on a crawl pass that may never come. Google's documentation on fixing JavaScript SEO issues says the quiet part out loud: server-side rendering, static rendering, or hydration are all acceptable, but rendering must complete before Googlebot moves on.
For why architecture beats content tweaks here, see our piece on the hidden power of technical SEO.
CSR, SSR, SSG, and ISR: what each mode means for indexing
Pure client-side rendering (CSR) is the worst option for SEO. Googlebot receives an empty <div> on wave one. Server-side rendering (SSR) sends fully rendered HTML on every request. Static site generation (SSG) pre-builds pages at deploy time. Incremental static regeneration (ISR), a Next.js feature, combines static HTML with scheduled revalidation, giving you static-page speed and dynamic-page freshness.
For most B2B SaaS pages, ISR or SSG is the correct default.
| Rendering mode | What Googlebot sees on wave 1 | Freshness | SEO risk |
|---|---|---|---|
| CSR (pure React) | Empty HTML shell | Real-time | High |
| SSR (Next.js) | Fully rendered HTML | Real-time | Low |
| SSG | Pre-built static HTML | Stale until rebuild | Minimal |
| ISR | Pre-built HTML, revalidated on schedule | Configurable | Low |
| Framer (default) | Server-rendered HTML with hydration | Deploy-time | Low to medium |
A note on Framer, since this is what Gravidy itself runs on. Framer's default rendering is server-side, but dynamic CMS sections and custom code embeds can revert to client-side rendering with no obvious warning. We audit each dynamic block on our own site separately, the same way we audit clients. Page weight matters here too; see Core Web Vitals for B2B SaaS for the thresholds Google now enforces.
Five hydration mistakes that damage rankings
The most damaging hydration mistakes are predictable once you know what to look for. In Next.js, the same five patterns show up in nearly every audit.
Mistake 1: useEffect for content fetching. Any data loaded inside useEffect is invisible on wave one. Move the fetch into an async server component, and pre-build static routes with generateStaticParams where it fits. If a product description lives inside a client-side fetch, Google does not see it.
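A hedged sketch of the server-side version, assuming a Next.js App Router dynamic route; the API URL and params shape are placeholders:

```tsx
// app/products/[id]/page.tsx — data fetched on the server, so the
// description is part of the initial HTML that Googlebot receives.
async function getProduct(id: string) {
  const res = await fetch(`https://api.example.com/products/${id}`, {
    next: { revalidate: 3600 }, // cache the response, refresh hourly
  });
  return res.json();
}

export default async function ProductPage({
  params,
}: {
  params: { id: string };
}) {
  const product = await getProduct(params.id);
  // Server-rendered: visible on wave one, no hydration required.
  return <p>{product.description}</p>;
}
```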
Mistake 2: Meta tags set client-side. react-helmet and document.title assignments execute after hydration. Use the Next.js Metadata API in layout.tsx or page.tsx instead, which guarantees server-rendered titles and descriptions.
Mistake 3: Missing canonical tags. Next.js does not auto-generate canonical tags in the App Router. You must declare alternates.canonical explicitly in the Metadata API. Skip this and you create duplicate-URL indexing problems, especially on paginated content.
Mistake 4: HashRouter navigation. Routes that look like example.com/#/page are not crawlable as separate URLs. Replace HashRouter with proper Next.js Link components and real path-based routing.
Mistake 5: Lazy-loaded above-the-fold components. React.lazy and Suspense for content above the fold means Googlebot may index the loading state, not the content. Reserve lazy loading for genuinely below-the-fold or interactive blocks.
The Framer version of mistake 1 is custom code embeds that fetch from an external API after mount. Validate by viewing page source (Ctrl+U), not the DevTools Elements tab. Elements shows the post-hydration DOM. Page source shows what Googlebot actually receives.
The cost of ignoring these is documented. After Stackmatix migrated a React app from pure CSR to Next.js with static rendering, indexed pages grew from 12 to 327 (a 96% indexation rate against the sitemap) and organic traffic increased 412% within 90 days. That outcome is not exotic. It is what happens when wave-one HTML stops being empty.
For SaaS-specific structured data that pairs with server rendering, our schema markup cheat sheet covers the JSON-LD patterns we recommend.
How to audit your React site in five minutes
The fastest rendering audit requires no paid tool. Open Chrome DevTools, hit the command palette (Cmd+Shift+P or Ctrl+Shift+P), and run "Disable JavaScript". Reload your target URL. What you see now is approximately what Googlebot indexes on wave one.
If your main content, navigation, or meta description has disappeared, you have a CSR problem. Fix that before any other SEO work.
Four checks worth running in sequence:
- Disable-JavaScript reload, as above. The 30-second smoke test
- View page source (Ctrl+U or Cmd+U), not Elements. Page source is the wave-one HTML; Elements is the live, hydrated DOM
- Run Google Search Console's URL Inspection tool and review the rendered HTML and screenshot under "More info"
- Crawl with Screaming Frog twice, once with JavaScript rendering enabled, once with it disabled, then diff the indexable word counts per URL. Pages with large gaps are your hydration leaks
This sequence catches more than most paid audits because it tests what actually matters: the HTML that reaches the indexer.
Concrete Next.js fixes, ordered by impact
Five fixes resolve the majority of React SEO problems without a framework migration or full rewrite.
The highest-impact change is keeping your page.tsx files as server components by default. Add "use client" only at the leaf-component level when interactivity actually requires it. Page-level "use client" directives are a silent killer.
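As a sketch of that boundary, using a hypothetical pricing page: the route file stays a server component, and only the interactive toggle opts into the client bundle.

```tsx
// app/pricing/PlanToggle.tsx — the "use client" directive lives at the leaf.
"use client";
import { useState } from "react";

export function PlanToggle() {
  const [annual, setAnnual] = useState(false);
  return (
    <button onClick={() => setAnnual((a) => !a)}>
      {annual ? "Annual" : "Monthly"}
    </button>
  );
}

// app/pricing/page.tsx — no "use client" here, so the heading and copy
// are server-rendered into wave-one HTML.
import { PlanToggle } from "./PlanToggle";

export default function PricingPage() {
  return (
    <main>
      <h1>Pricing</h1>
      <PlanToggle /> {/* only this leaf ships client JavaScript */}
    </main>
  );
}
```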
Next, consolidate metadata. Replace any react-helmet usage with the Next.js Metadata API. Export a metadata object or a generateMetadata function from page.tsx. This guarantees server-rendered titles and descriptions on wave one.
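A minimal sketch, with placeholder values:

```tsx
// page.tsx — a static metadata export, rendered into the server HTML.
import type { Metadata } from "next";

export const metadata: Metadata = {
  title: "JavaScript SEO for React | Example Co",
  description: "What Googlebot sees before your JavaScript runs.",
};
```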
Then set canonicals explicitly. Add alternates.canonical to every page's metadata export. For dynamic routes, build it inside generateMetadata from the URL params. Do not assume Next.js handles this. It does not.
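For a dynamic route, a sketch of building the canonical inside generateMetadata; the base URL and slug param are assumptions:

```tsx
// app/blog/[slug]/page.tsx
import type { Metadata } from "next";

export async function generateMetadata({
  params,
}: {
  params: { slug: string };
}): Promise<Metadata> {
  return {
    alternates: {
      // Built from the route params; Next.js does not generate this for you.
      canonical: `https://www.example.com/blog/${params.slug}`,
    },
  };
}
```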
For product pages, documentation, or blog posts that change infrequently, switch to SSG via generateStaticParams and set a revalidate value that matches your update cadence. Hourly (3600) is a reasonable default when content gets edited often.
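A sketch of that setup for a hypothetical blog route; getAllSlugs stands in for whatever CMS or database call you actually use:

```tsx
// app/blog/[slug]/page.tsx
export const revalidate = 3600; // ISR: re-render in the background at most hourly

// Hypothetical data helper; replace with your CMS or database query.
async function getAllSlugs(): Promise<string[]> {
  const posts: { slug: string }[] = await fetch(
    "https://api.example.com/posts"
  ).then((res) => res.json());
  return posts.map((post) => post.slug);
}

// Pre-build one static page per slug at deploy time.
export async function generateStaticParams() {
  const slugs = await getAllSlugs();
  return slugs.map((slug) => ({ slug }));
}
```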
Finally, inject structured data via a <script type="application/ld+json"> tag inside a server component. Never via a client-side script appended after mount. JSON-LD that arrives during hydration is JSON-LD Google may not see.
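A sketch of server-rendered JSON-LD; the schema values are placeholders:

```tsx
// Rendered inside a server component, so the JSON-LD is present in the
// wave-one HTML rather than injected after mount.
export function ProductJsonLd() {
  const jsonLd = {
    "@context": "https://schema.org",
    "@type": "SoftwareApplication",
    name: "Example App",
    applicationCategory: "BusinessApplication",
  };

  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
    />
  );
}
```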
Frequently Asked Questions
Does Google index React sites?
Yes, Google indexes React sites, but the timing and completeness depend on your rendering setup. Pure client-side React apps send an empty HTML shell on wave one. Google may index that empty state or queue a JavaScript render that never happens for lower-priority pages. SSR or SSG removes the uncertainty entirely.
How do I fix CSR SEO without rebuilding the whole site?
Partial migration is the fastest path. Identify the pages with the most organic traffic potential (product pages, documentation, marketing landing pages) and convert those to SSG or SSR first. Authenticated dashboards do not need server rendering, since search engines never reach them. Spend effort where crawlers actually land.
What are the most important Next.js SEO settings?
Three settings matter most: the Metadata API for server-rendered titles and descriptions, explicit canonical tags via alternates.canonical on every route, and a dynamic sitemap generated at build time. After those are in place, confirm your robots.txt does not accidentally block /_next/static/. Blocking it prevents Googlebot from loading the JavaScript it needs for wave-two rendering.
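For the sitemap, a sketch using the App Router's sitemap convention; the URLs and posts endpoint are placeholders:

```ts
// app/sitemap.ts — Next.js serves this as /sitemap.xml.
import type { MetadataRoute } from "next";

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const posts: { slug: string; updatedAt: string }[] = await fetch(
    "https://api.example.com/posts"
  ).then((res) => res.json());

  return [
    { url: "https://www.example.com", lastModified: new Date() },
    ...posts.map((post) => ({
      url: `https://www.example.com/blog/${post.slug}`,
      lastModified: new Date(post.updatedAt),
    })),
  ];
}
```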
What we tell every audit client
React and Next.js do not break SEO by default. The hydration trap is a configuration problem, not a framework flaw. The fix is knowing exactly which rendering mode each page uses, auditing the pre-JavaScript HTML that crawlers receive, and moving metadata and core content into server components where they belong.
If you are not sure whether your site has a rendering gap, the five-minute disable-JavaScript test tells you more than most paid audits.
The Gravidy site runs on Framer, and we have been through this audit on our own URLs. If you want a structured review of your React or Next.js site's crawlability and indexation, book a free SEO audit call. Thirty minutes, specific findings, no slide decks.


