Nearly every B2B SaaS marketing site in 2026 is built on React. Most use Next.js, some use Gatsby, fewer use Remix, and a small number still run on Webflow, Framer, or Lovable. The rendering question (CSR vs SSR vs SSG vs ISR) matters more than most marketing teams realize: it is the single highest-impact technical SEO decision on a modern SaaS marketing site.
A B2B SaaS site with a CSR-only marketing layer ranks 20 to 40 percent worse than a structurally identical site rendering server-side. Most teams do not know which mode their site is in. Some teams discover it 8 months into an SEO program that is not producing results, when the audit finally surfaces the cause. This is the diagnostic and the fix.
01 / The four rendering modes
The four ways a React-based B2B SaaS marketing page can be rendered.
Client-side rendering (CSR)
The HTML response is essentially empty: an empty root div and a JavaScript bundle. The browser downloads the bundle, executes it, fetches data, and only then renders content. Common in early-stage React sites and in tools that default to CSR (older create-react-app, some headless CMS implementations).
Server-side rendering (SSR)
The server generates the full HTML on each request. The browser receives complete content immediately. All crawlers see content on first response. Slower per-request than SSG but more flexible for personalized content.
Static site generation (SSG)
The HTML is pre-built at deploy time and served as static files. Same crawler benefits as SSR, lower runtime cost. The default for most marketing pages.
Incremental static regeneration (ISR)
A Next.js hybrid pattern. Pages are statically generated and cached, but regenerated on a schedule or on demand. Best for content that updates periodically without requiring full deployment.
For B2B SaaS marketing sites, SSG and ISR are the right defaults. SSR is acceptable when content is highly personalized or dynamic. CSR is almost never the right answer for marketing pages.
02 / The diagnostic
Open any marketing page on your site. Right-click and select "View Page Source" (not "Inspect Element"). View Source shows the HTML response from the server. Inspect Element shows the rendered DOM after JavaScript execution.
Search the source for your page's main heading text. If the heading text is in the HTML source, the page is SSR or SSG. If you only see an empty root div with no text content inside, the page is CSR-only.
Run this check on:
- The homepage
- A blog post template
- A product page
- A pricing page
- A comparison or integration page
Each page type can render differently. We have audited sites where the homepage was SSG (great) but every blog post was CSR (catastrophic for organic traffic).
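The view-source check can be scripted. A minimal TypeScript sketch of the idea (the heading strings you pass in are your own): given the raw HTML a server returns, decide whether the page's main content is present before any JavaScript runs.

```typescript
// Check whether a page's raw HTML (as returned by the server, before
// any JavaScript executes) already contains its main heading text.
// Returns true for SSR/SSG/ISR pages, false for CSR-only shells.
function isServerRendered(rawHtml: string, headingText: string): boolean {
  const textOnly = rawHtml
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop script bodies
    .replace(/<[^>]+>/g, " ")                    // drop remaining tags
    .replace(/\s+/g, " ")                        // normalize whitespace
    .trim();
  const needle = headingText.replace(/\s+/g, " ").trim();
  return textOnly.includes(needle);
}
```

Feed it the raw response (for example, the output of `curl -s https://example.com/pricing`), not the DevTools DOM; the whole point is to test what crawlers receive on first response.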
03 / Why CSR-only sites rank worse
Three mechanics, all compounding.
Crawl delay
Google can render JavaScript and will eventually index CSR pages. The rendering happens in a separate "rendering tier" that processes JavaScript pages 2 to 3 days after the initial crawl. Pages that update frequently fall behind. New pages take longer to be indexed and ranked.
Other crawlers do not render
Bing renders less JavaScript than Google and inconsistently. Most AI Search engines do not render JavaScript at all. This means CSR-only content is invisible to ChatGPT, Perplexity, Claude, Gemini, archive.org, social media preview bots (Facebook, LinkedIn, Twitter), and most enterprise web crawlers.
Ranking suppression
Even when Google does render and index a CSR page, the ranking is suppressed by 20 to 40 percent compared to an equivalent SSR/SSG page. The suppression is not a penalty; it is an algorithmic effect of slower indexation, weaker freshness signals, and lower confidence in content interpretation.
The combined effect: your CSR-only marketing site is competing with one hand tied behind its back. Every other technical SEO improvement you make compounds at half effectiveness because rendering is dragging down the baseline.
04 / The fix in Next.js
For Next.js sites (which is most B2B SaaS marketing sites in 2026), the fix is usually a single configuration change.
App Router (Next.js 13+)
Server Components are the default. Pages render server-side automatically unless you opt in to client-side rendering with the "use client" directive. If your marketing pages are accidentally CSR, it is usually because the entire layout was wrapped in a client component. Audit for "use client" at the top level and remove it where unnecessary.
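A sketch of what the fix typically looks like (component names are hypothetical): keep the layout as a Server Component and push the "use client" boundary down to the one interactive leaf.

```tsx
// app/layout.tsx -- stays a Server Component: no "use client" here,
// so every marketing page under it renders on the server.
import { NavMenu } from "./nav-menu"; // hypothetical interactive leaf

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <body>
        <NavMenu /> {/* the only client component in the shell */}
        {children}
      </body>
    </html>
  );
}

// app/nav-menu.tsx -- the "use client" boundary is pushed down to
// the one leaf that actually needs state and event handlers.
"use client";
import { useState } from "react";

export function NavMenu() {
  const [open, setOpen] = useState(false);
  return <button onClick={() => setOpen(!open)}>Menu</button>;
}
```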
Pages Router (Next.js 12 and earlier)
Pages are pre-rendered, but the data-fetching method determines the mode: getServerSideProps yields SSR and getStaticProps yields SSG. Pages that instead fetch data client-side with useEffect after mount are effectively CSR for the content that matters. Convert to getStaticProps for static content or getServerSideProps for dynamic content.
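A sketch of the conversion (the CMS endpoint and data shape are placeholders): the same fetch moves from a client-side effect into getStaticProps, so the content lands in the HTML response.

```tsx
// Before (effectively CSR): data fetched after mount, so the server
// response is an empty shell.
//
//   useEffect(() => { fetch("/api/post").then(...) }, []);
//
// After (SSG): the same data fetched at build time and rendered
// into the server HTML.
import type { GetStaticProps } from "next";

type Post = { title: string; body: string };

export const getStaticProps: GetStaticProps<{ post: Post }> = async () => {
  const res = await fetch("https://cms.example.com/api/post"); // hypothetical CMS
  const post: Post = await res.json();
  return { props: { post } };
};

export default function PostPage({ post }: { post: Post }) {
  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
    </article>
  );
}
```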
ISR
For content that changes periodically (blog posts after publishing, pricing updates, integration page updates), return a revalidate value from getStaticProps. The page generates statically but regenerates on a schedule.
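The ISR variant is the SSG example plus one field (fetchPost is a hypothetical data loader):

```tsx
import type { GetStaticProps } from "next";

// Same getStaticProps as before, plus a revalidate window: the cached
// page is regenerated in the background at most once per hour.
export const getStaticProps: GetStaticProps = async () => {
  const post = await fetchPost(); // hypothetical data loader
  return {
    props: { post },
    revalidate: 3600, // seconds; tune to how often the content changes
  };
};
```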
The change is usually 5 to 30 lines of code. The impact is up to 30 percent organic traffic improvement within 60 days.
05 / Beyond the rendering mode
Three additional JavaScript SEO issues common on B2B SaaS sites.
Hydration delays
The page renders quickly (good LCP) but stays non-interactive while React hydrates (bad INP). Symptoms: the page looks ready but clicks do not respond. Fix: minimize the JavaScript bundle size, defer non-critical hydration, use Streaming SSR in Next.js 13+ for progressive hydration.
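One way to defer non-critical hydration in Next.js is next/dynamic, sketched here with a hypothetical widget: the critical content stays server-rendered, while a non-SEO-relevant widget is excluded from the server bundle and hydrated lazily.

```tsx
import dynamic from "next/dynamic";

// A non-critical widget: exclude it from the server render and load
// its JavaScript lazily so it does not compete with hydration of the
// content that matters. "./live-chat" is a hypothetical component.
const LiveChat = dynamic(() => import("./live-chat"), {
  ssr: false,             // widget is not SEO-relevant, so CSR is fine here
  loading: () => <div />, // lightweight placeholder
});

export default function Page() {
  return (
    <main>
      <h1>Pricing</h1>
      {/* critical content above stays server-rendered */}
      <LiveChat />
    </main>
  );
}
```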
Hash-based routing
Some legacy single-page applications use # URLs (e.g., example.com/#/products) instead of clean paths (example.com/products). Search engines treat the hash as a fragment, not a URL. The page never gets indexed properly. Fix: switch to history-based routing (browser History API). Every modern React router (Next.js, React Router v6+) supports this by default.
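For a standalone SPA on React Router v6.4+, the switch is one router constructor (Home and Products are hypothetical components; Next.js sites get history-based routing automatically):

```tsx
import { createBrowserRouter, RouterProvider } from "react-router-dom";

// Before: createHashRouter serves example.com/#/products -- everything
// after # never reaches the server or search engines.
// After: createBrowserRouter uses the History API and clean paths.
const router = createBrowserRouter([
  { path: "/", element: <Home /> },           // hypothetical component
  { path: "/products", element: <Products /> }, // hypothetical component
]);

export function App() {
  return <RouterProvider router={router} />;
}
```

Note that the server then needs a catch-all rewrite to the app shell, otherwise deep links like /products return 404.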
Lazy-loaded above-the-fold content
Components wrapped in <Suspense> or React.lazy() that are above the fold on initial render. The browser waits for the lazy load before rendering, which delays LCP. Fix: do not lazy-load anything in the initial viewport. Lazy-load only below-the-fold content.
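A sketch of the fix (component paths are hypothetical): import the above-the-fold hero statically and reserve React.lazy for sections below the fold.

```tsx
import { Suspense, lazy } from "react";

// Before: const Hero = lazy(() => import("./hero"));  -- delays LCP.
// After: the hero is imported statically and renders immediately;
// only below-the-fold content is lazy-loaded.
import Hero from "./hero";
const Testimonials = lazy(() => import("./testimonials"));

export default function Landing() {
  return (
    <main>
      <Hero /> {/* above the fold: renders with the initial HTML */}
      <Suspense fallback={<div />}>
        <Testimonials /> {/* below the fold: loads after first paint */}
      </Suspense>
    </main>
  );
}
```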
06 / Metadata in source, not just DOM
A common mistake on SSR sites: metadata managed via JavaScript that injects <title> and meta tags after page load. The page renders correctly for users, but view source shows missing or incorrect meta tags. Google reads the source first and may pick up different values between the initial crawl and the rendering pass.
The fix in Next.js: use the Metadata API in App Router or next/head properly in Pages Router with the metadata included in the server-rendered HTML.
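In the App Router, this looks roughly as follows (titles and URLs are placeholders): the exported metadata object is emitted into the server-rendered HTML, so crawlers see it in the source.

```tsx
import type { Metadata } from "next";

// app/pricing/page.tsx -- metadata exported here is rendered into the
// HTML <head> on the server, visible in view source.
export const metadata: Metadata = {
  title: "Pricing | Example SaaS",
  description: "Simple, transparent pricing.",
  alternates: { canonical: "https://example.com/pricing" },
  openGraph: {
    title: "Pricing | Example SaaS",
    url: "https://example.com/pricing",
  },
};

export default function PricingPage() {
  return <h1>Pricing</h1>;
}
```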
Verification: view source on any marketing page. Check that <title>, meta description, canonical, and OpenGraph tags are present in the HTML source. If they are not, your metadata is being injected client-side and Google may read inconsistent values.
07 / The AI Search angle on JavaScript SEO
AI Search engines crawl differently from Google.
ChatGPT's web browsing tool fetches HTML and parses content. It does not execute JavaScript. CSR content is invisible.
Perplexity's crawler is similar: HTML-first, minimal JavaScript execution. CSR content is largely invisible.
Claude's web search and Gemini's grounding similarly prioritize HTML content. JavaScript-rendered content is partially or fully invisible depending on the engine and content type.
The implication: a B2B SaaS site that is CSR-only is essentially invisible to AI Search engines, even if it is somewhat indexed by Google. As AI Search becomes a larger fraction of how buyers research B2B SaaS, this gap is widening.
The fix is the same as for traditional SEO: render server-side. SSG, SSR, or ISR. Any of those modes makes the content visible to AI Search engines.
08 / The migration path from CSR to SSG/ISR
For sites currently CSR-only, the migration path:
- Identify which pages need server rendering (almost all marketing pages).
- Choose target mode: SSG for static content, ISR for periodically updated content, SSR for highly dynamic.
- Refactor data fetching from useEffect to getStaticProps or getServerSideProps (Pages Router) or to Server Components (App Router).
- Test the rendering mode on staging by viewing source.
- Deploy in stages: highest-traffic pages first, full rollout after verification.
- Monitor GSC indexation and rankings for 14 to 28 days post-launch.
Expected outcome: a 15 to 30 percent organic traffic increase within 60 days for most B2B SaaS sites, with larger improvements for sites that currently have significant CSR content.
09 / Part of a larger technical playbook
For the full B2B SaaS technical SEO process, see our B2B SaaS technical SEO checklist. For related deep dives, see B2B SaaS website migrations without ranking loss and Schema markup for B2B SaaS.