A B2B SaaS site at DR 30 with broken indexation will rank worse than a B2B SaaS site at DR 12 with clean indexation. The engineering team will tell you the indexation issue is minor. The agency will report it as one item in a list of 47. Six months later, the content engine produces a fraction of the traffic the budget was supposed to justify, and nobody connects the underperformance to the technical fix that never shipped.
This is the playbook for the technical SEO work we run on every B2B SaaS engagement. Ordered by impact, not by how impressive the diagnostic looks. Not a 47-item checklist — the 8 to 20 things that actually move the needle.
01 / Why generic checklists fail for B2B SaaS
Generic checklists were written for the websites that dominated the internet in 2015. B2B SaaS sites are structurally different.
WordPress blogs. Shopify stores. Local business sites. The technical SEO advice for those sites is mostly correct, and mostly irrelevant for B2B SaaS marketing sites. There are six structural reasons.
01
The stack is JavaScript-heavy.
Most B2B SaaS marketing sites in 2026 are built on React (Next.js, Gatsby, Remix) or Vue (Nuxt). The rendering question — CSR vs SSR vs SSG vs ISR — matters more than most marketing teams realize and produces an entire class of technical SEO problems that does not exist for HTML-first sites.
02
The site is split across surfaces.
Marketing site, product app, and documentation typically live on three different infrastructures. Subdomain versus subdirectory, indexable versus gated, internal linking across surfaces — all B2B SaaS-specific calls.
03
Programmatic content is common.
B2B SaaS sites at scale have integration pages, comparison pages, location pages, and template galleries built from a single template populated with hundreds or thousands of variations. Programmatic content has its own technical SEO failure modes: thin content, near-duplicates, improper canonicals, indexation leaks.
04
Conversion paths are technical events.
The demo signup flow, the trial start flow, the pricing calculator, the demo scheduler. Each is a rendering and Core Web Vitals concern with direct revenue impact when broken.
05
Replatforming is frequent.
B2B SaaS sites get rebuilt every 18 to 36 months on average. WordPress to Webflow. Webflow to Framer. Framer to Lovable. Lovable to a custom Next.js build. Each migration is a high-risk technical SEO event, and most teams do not know they handled the previous one badly until the next agency audits them.
06
AI Search has technical preconditions.
Visibility in ChatGPT, Perplexity, Gemini, and Claude depends on technical foundations — structured data, entity clarity, llms.txt, self-contained content — that overlap with but are distinct from traditional Google SEO. The bar in 2026 is higher than the bar for Google ranking was in 2020.
02 / The audit (the first 2 weeks of every engagement)
Before any content gets written and any link campaign launches, the audit has to be done.
Eight to fourteen working days of audit work. Not negotiable. Not "nice to have." It is the precondition for everything that comes after. The audit answers four questions in this order:
CRAWL
What is being crawled, indexed, and what is in the gap?
GSC Coverage report · Submitted vs Valid · Excluded / Discovered, not indexed
Most B2B SaaS sites have 20 to 60 percent of their submitted URLs in the Excluded or Discovered, not indexed buckets. The first audit usually surfaces the systemic reason.
How we run it: Pull the Coverage report, segment by category, then sample 10 to 20 URLs in each bucket to identify the underlying pattern (rendering, canonical, robots, soft 404).
PAGE
What is the page experience profile?
LCP < 2.5s · INP < 200ms · CLS < 0.1
Core Web Vitals on the highest-traffic pages, mobile usability, JavaScript rendering verification on a 20 to 50 page sample, third-party script audit. Measure on real pages with real traffic — not the homepage.
How we run it: CrUX field data segmented by page template. The aggregate CWV picture for high-traffic content is the actual ranking variable, not the lab score on the homepage.
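The template segmentation can be sketched as a small aggregation over per-URL field samples. The `Sample` shape and the template patterns below are assumptions about your own RUM or CrUX export, not a real API:

```typescript
// Sketch: segment field LCP by page template, assuming you have exported
// per-URL LCP samples (e.g. from the CrUX API or your RUM tool).
type Sample = { url: string; lcpMs: number };

// p75 over a non-empty list of values (the percentile CWV reporting uses).
function p75(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  return sorted[Math.min(sorted.length - 1, Math.ceil(sorted.length * 0.75) - 1)];
}

function lcpByTemplate(
  samples: Sample[],
  templates: Record<string, RegExp>
): Record<string, number> {
  const buckets: Record<string, number[]> = {};
  for (const s of samples) {
    for (const [name, pattern] of Object.entries(templates)) {
      if (pattern.test(new URL(s.url).pathname)) {
        (buckets[name] ??= []).push(s.lcpMs);
        break; // first matching template wins
      }
    }
  }
  return Object.fromEntries(
    Object.entries(buckets).map(([name, vals]) => [name, p75(vals)])
  );
}
```

A per-template p75 table is what surfaces the failing template that the homepage-only lab score hides.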
SCHEMA
What is the schema and structured data state?
Article · FAQPage · BreadcrumbList · Person
Every page type validated through Google's Rich Results Test. Missing schema (Article on blog posts, FAQPage on resource pages, BreadcrumbList everywhere) is one of the most common audit findings — and one of the highest-leverage fixes for AI Search citation.
How we run it: Crawl with Screaming Frog and extract structured data per page template. Cross-check each template against Rich Results Test before deploying broadly.
ARCHITECTURE
What is the site architecture state?
URL structure, canonical patterns, internal linking density, subdomain or subdirectory split, sitemap completeness, robots.txt rules. The architecture audit catches the migration debt that previous platforms left behind.
How we run it: Sitebulb crawl path analysis on sites over 1,000 URLs; manual review of robots.txt and the sitemap index; cross-check internal inlinks per template.
The tools, and what each one is for
No single tool is sufficient. The audit findings come from cross-referencing what each tool surfaces.
Screaming Frog SEO Spider
The workhorse crawler. Catches broken links, redirect chains, missing tags, schema issues, indexability flags. Pays for itself in the first audit.
Ahrefs Site Audit
Cross-checks the Screaming Frog crawl, surfaces additional issues from a different perspective, useful for ongoing monthly monitoring.
Sitebulb
Higher-resolution architecture analysis for sites over 1,000 URLs. Visual crawl maps and crawl path analysis.
Google Search Console
The indexation reality check. Coverage, performance, enhancements, page experience reports. Free and the most underrated tool in the stack.
Bing Webmaster Tools
Secondary index, less weight than GSC but useful for detecting issues before they show up in GSC.
Google Rich Results Test
Schema validation per page type. Every template validated at least once before deploying broadly.
PageSpeed Insights & CrUX
Real-user CWV data versus lab data. Both matter; real-user data matters more for ranking decisions.
Chrome DevTools
JavaScript rendering investigation. Coverage tab for unused code. Performance tab for INP issues.
03 / Indexation: the issue most teams do not know they have
The marketing team thinks 200 pages are live. Google has indexed 73 of them.
The diagnostic is straightforward. Open Google Search Console, go to Coverage, look at the gap between "Submitted" and "Valid." That gap is the indexation problem. Common B2B SaaS indexation issues, in rough order of frequency:
01
Marketing pages built in client-side React without SSR
When the page loads, the HTML response contains an empty <div id="root"></div> and content gets injected after page load. Google can render JavaScript, but rendering happens 2 to 3 days after the initial crawl, ranking is suppressed by 20 to 40 percent, and most non-Google crawlers (including AI Search engines) never see the content. Fix: enable SSR or SSG in Next.js — usually a one-time configuration change with massive ranking impact.
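The Next.js configuration change can be sketched as follows, assuming the pages router; `fetchPost` is a hypothetical stand-in for your CMS client, stubbed here so the sketch is self-contained:

```typescript
// Sketch of the fix: export getStaticProps so the page is pre-rendered
// at build time (SSG) instead of client-rendered.
type Post = { slug: string; title: string; body: string };

async function fetchPost(slug: string): Promise<Post> {
  // Hypothetical CMS fetch, stubbed for the sketch.
  return { slug, title: `Post: ${slug}`, body: "…" };
}

export async function getStaticProps({ params }: { params: { slug: string } }) {
  const post = await fetchPost(params.slug);
  return {
    props: { post },
    revalidate: 3600, // ISR: regenerate at most once per hour
  };
}
```

With this export in place, the HTML response contains the full page content, so crawlers that never execute JavaScript still see it.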
02
Pages canonicalized to other URLs by mistake
During a migration or template update, the canonical tag on resource pages gets pointed to a category index instead of self-referencing. Every resource page on the site quietly tells Google "do not index me, index this other page instead." Diagnostic: view source on resource pages and check the canonical URL matches the page URL.
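The view-source diagnostic can be automated as a spot check. This is a naive regex extraction, fine for triage but not a full HTML parser:

```typescript
// Sketch: check that a page's canonical tag is self-referencing.
function extractCanonical(html: string): string | null {
  // Handle both attribute orders (rel before href and href before rel).
  const m =
    html.match(/<link[^>]+rel=["']canonical["'][^>]*href=["']([^"']+)["']/i) ??
    html.match(/<link[^>]+href=["']([^"']+)["'][^>]*rel=["']canonical["']/i);
  return m ? m[1] : null;
}

function canonicalIsSelfReferencing(html: string, pageUrl: string): boolean {
  const canonical = extractCanonical(html);
  if (!canonical) return false; // a missing canonical is its own finding
  const norm = (u: string) => new URL(u).href.replace(/\/$/, "");
  return norm(canonical) === norm(pageUrl);
}
```

Run it over a crawl export of resource-page URLs and every `false` is a page quietly deferring its rankings to another URL.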
03
noindex, nofollow left on pages from staging
Most teams clone production to staging, add <meta name="robots" content="noindex, nofollow"> to prevent staging from being indexed, then forget to remove it when the new templates ship to production. Entire sections of the site become uncrawlable overnight.
04
robots.txt rules blocking indexable sections
Old debugging or staging directives left in robots.txt block important page patterns. Common pattern: Disallow: /resources/ left over from a docs migration. Diagnostic: site.com/robots.txt → look for Disallow rules that affect indexable URLs.
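The robots.txt diagnostic can be scripted as a simplified check. Real robots.txt parsing has wildcard rules, Allow precedence, and per-agent groups; this sketch only does prefix matching for the `User-agent: *` group, which is enough for triage:

```typescript
// Simplified triage check: does any Disallow rule in the User-agent: *
// group prefix-match this path?
function disallowedByRobots(robotsTxt: string, path: string): boolean {
  let inStarGroup = false;
  for (const line of robotsTxt.split("\n")) {
    const [rawKey, ...rest] = line.trim().split(":");
    const key = rawKey.toLowerCase().trim();
    const value = rest.join(":").trim();
    if (key === "user-agent") inStarGroup = value === "*";
    if (inStarGroup && key === "disallow" && value && path.startsWith(value)) {
      return true;
    }
  }
  return false;
}
```

Feed it the live robots.txt plus a sample of URLs from each indexable template; any `true` on a URL you want ranked is a finding.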
05
URL parameters creating duplicate content
UTM parameters, session IDs, filter parameters all create distinct URLs from Google's perspective unless properly handled. Without consistent canonical tags, the marketing team's UTM-tagged campaign links create duplicate-content issues that suppress the canonical version's rankings.
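A minimal normalization sketch: strip common tracking parameters so the canonical URL stays stable. The parameter list is illustrative; extend it for your own stack:

```typescript
// Tracking parameters to strip; illustrative, not exhaustive.
const TRACKING_PARAMS = new Set(["gclid", "fbclid", "msclkid", "sessionid", "ref"]);

function canonicalUrl(raw: string): string {
  const url = new URL(raw);
  // Snapshot the keys before deleting to avoid mutating while iterating.
  for (const key of [...url.searchParams.keys()]) {
    const k = key.toLowerCase();
    if (k.startsWith("utm_") || TRACKING_PARAMS.has(k)) {
      url.searchParams.delete(key);
    }
  }
  return url.href;
}
```

The same normalized value belongs in the canonical tag, so every UTM-tagged campaign variant resolves to one indexable URL.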
06
Pages indexed but ranking poorly due to thin content
Programmatic page directories (integration pages, location pages) with 200-word templated content. Google indexes them but treats them as low-value, which drags down the entire site's quality signal.
Indexation fixes come first. Until indexation issues are resolved, every other technical SEO improvement produces less ROI than it should, because some percentage of the affected pages are not actually competing for rankings.
04 / Core Web Vitals for B2B SaaS marketing sites
Three thresholds. Pages that fail rank worse — and the gap is widening.
Do not measure Core Web Vitals on the homepage. The homepage gets the most engineering attention by default. Pull the CrUX field data segmented by page template — the aggregate CWV picture for high-traffic content is the actual ranking variable.
LCP
Largest Contentful Paint < 2.5s
Three usual culprits: 4 MB hero PNGs (fix: WebP, srcset, explicit dimensions, fetchpriority="high", loading="eager"), 800 KB JS bundles blocking the main thread (fix: code splitting, defer non-critical scripts), web fonts without font-display: swap (fix: swap, preload critical font, system fallback).
INP
Interaction to Next Paint < 200ms
Almost always third-party script overhead. HubSpot tracking, Drift, Intercom, the cookie consent banner. Each loads synchronously, blocks the main thread, and degrades interactivity. Fix: defer, load after main-thread idle, lazy-load chat widgets that only appear after a delay.
CLS
Cumulative Layout Shift < 0.1
Images without explicit dimensions, late-loading embeds (YouTube, Wistia, calendar widgets), and web fonts causing reflow. Fix: explicit width and height on every image, reserve space with CSS aspect-ratio for embeds, font-display: optional or swap with size-adjust on the fallback font.
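The three thresholds above can be turned into a single lint over p75 field data. The `FieldData` shape is an assumption about your RUM export, not a real API:

```typescript
// Sketch: report which Core Web Vitals fail at the thresholds above.
type FieldData = { lcpMs: number; inpMs: number; cls: number };

function failingVitals({ lcpMs, inpMs, cls }: FieldData): string[] {
  const failures: string[] = [];
  if (lcpMs >= 2500) failures.push("LCP"); // target: < 2.5s
  if (inpMs >= 200) failures.push("INP"); // target: < 200ms
  if (cls >= 0.1) failures.push("CLS"); // target: < 0.1
  return failures;
}
```

Combined with the per-template segmentation from the audit section, this gives you a failing-metric-per-template table rather than a single homepage score.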
05 / JavaScript and rendering (the modern SaaS stack problem)
Rendering mode is the single highest-impact technical decision on a modern B2B SaaS marketing site.
The diagnostic: open any marketing page, view source (not "inspect element"), search for the page's main heading text. If the heading is in the HTML source, the page is SSR or SSG. If you only see <div id="root"></div>, the page is CSR-only and you have a problem.
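The diagnostic can be automated across a URL sample. Given the raw HTML response (not the rendered DOM), check for the expected heading and for the empty-mount-node signature; the mount-node ids below are the common React/Vue/Next defaults:

```typescript
// Sketch: flag pages that look CSR-only from their raw HTML response.
function looksClientRendered(rawHtml: string, expectedHeading: string): boolean {
  if (rawHtml.includes(expectedHeading)) return false; // content is server-rendered
  // An empty React/Vue mount node is the classic CSR signature.
  return /<div[^>]+id=["'](root|app|__next)["'][^>]*>\s*<\/div>/i.test(rawHtml);
}
```

Fetch each page with a plain HTTP client (no JavaScript execution) and run this over the response body; a `true` means every non-rendering crawler sees an empty page.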
CSR
Client-side rendering only
Empty HTML response · Content injected after JS executes · Other crawlers see nothing
Google can technically render JavaScript and will eventually index CSR pages, but rendering happens 2 to 3 days after the initial crawl. Other crawlers (Bing, AI Search engines, social preview bots, archive.org) often never see the rendered content at all. CSR pages typically rank 20 to 40 percent worse than SSR equivalents and earn dramatically fewer AI Search citations.
SSR
Server-side rendering
Full HTML on each request · All crawlers see content immediately · Normal ranking baseline
The server generates the full HTML on each request. The browser receives complete content immediately. The cost is server load and per-request latency, both manageable for marketing-site traffic.
SSG
Static site generation
HTML pre-built at deploy · Served as static files · Lowest runtime cost
Same crawler benefits as SSR with lower runtime cost. Best for marketing content that does not change frequently. The default for most marketing pages.
ISR
Incremental static regeneration
Statically generated and cached · Regenerated on schedule or on demand · Same crawler benefits as SSG
Pages are statically generated and cached, but regenerated on a schedule or on demand. Best of both worlds for content that updates periodically.
We have inherited engagements where switching the entire marketing site from CSR to SSG produced a 30 percent organic traffic lift within 60 days, with no other changes. Get the rendering wrong and the rest of the SEO work compounds at half speed.
06 / Site architecture: marketing site, app, and docs
Three surfaces. Three architectural decisions that determine how authority flows through the system.
The marketing site, the product app, the documentation. The architectural decisions cost very little to make correctly upfront and are expensive to change after the fact.
01
Subdomain vs subdirectory for the blog
Subdirectory wins almost always. /blog rather than blog.technotize.io. Google treats subdomains as separate sites in subtle but meaningful ways, including authority transfer and link equity flow. Sites that move a blog from a subdomain to a subdirectory typically see 15 to 35 percent organic traffic lifts within six months.
02
Subdomain vs subdirectory for docs
More nuanced. Customer-only reference docs are fine on a subdomain. Docs with prospect-education content that has search demand (tutorial guides, integration walkthroughs) belong in a subdirectory so the marketing site's authority lifts the docs content. The decision turns on whether docs content has commercial intent.
03
URL structure: hierarchical and shallow
/b2b-saas-seo/keyword-research is good. /blog/category/year/month/2026/keyword-research-guide-for-b2b-saas-companies is bad. Three levels deep maximum for most pages. Hyphens, lowercase, no parameters in canonical URLs. Pick a trailing-slash convention and 301 the alternative.
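The conventions above can be expressed as a lint check to run over a crawl export. A sketch, checking depth, query parameters, case, and hyphenation:

```typescript
// Sketch: lint a URL against the conventions above.
// Returns a list of violations; empty means the URL is clean.
function violatesUrlConventions(rawUrl: string): string[] {
  const url = new URL(rawUrl);
  const problems: string[] = [];
  const segments = url.pathname.split("/").filter(Boolean);
  if (segments.length > 3) problems.push("deeper than three levels");
  if (url.search) problems.push("query parameters in canonical URL");
  if (url.pathname !== url.pathname.toLowerCase()) problems.push("uppercase characters");
  if (/_/.test(url.pathname)) problems.push("underscores instead of hyphens");
  return problems;
}
```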
Plus the smaller decisions that consistently matter: pagination over infinite scroll for content listings, hreflang tags for multi-region sites, and 5 to 10 separate sitemaps in a sitemap index rather than one monolithic file. See how this connects to the keyword and cluster architecture →
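The sitemap index itself is simple to generate. A sketch; the file names are illustrative, not a prescribed naming scheme:

```typescript
// Sketch: build a sitemap index pointing at segmented sitemaps
// (e.g. blog, integrations, comparison pages) instead of one monolithic file.
function sitemapIndex(baseUrl: string, sitemaps: string[]): string {
  const entries = sitemaps
    .map((name) => `  <sitemap><loc>${baseUrl}/${name}</loc></sitemap>`)
    .join("\n");
  return [
    `<?xml version="1.0" encoding="UTF-8"?>`,
    `<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">`,
    entries,
    `</sitemapindex>`,
  ].join("\n");
}
```

Segmented sitemaps also make the GSC Coverage report more useful, because indexation gaps show up per segment instead of as one undifferentiated number.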
07 / Schema markup that actually matters
The right schema, deployed correctly, produces visible ranking lifts.
Structured data is the technical foundation for both Google's rich results and AI Search citation. The wrong schema deployed enthusiastically produces manual penalties or does nothing.
Validation discipline: every page type validated through Google's Rich Results Test before deploying broadly; the GSC Enhancements report monitored monthly for new errors.
Organization
One block, deployed on the homepage and in the global footer. Includes name, logo, URL, contact information, social profiles. Foundational for knowledge graph entity recognition.
WebSite
Deployed on the homepage. Includes the site search action if the site has a working search function. Enables sitelinks search box in some Google results.
Article
Every blog post, every pillar page, every sub-pillar. Headline, description, image, datePublished, dateModified, author linked to Person, publisher, articleSection. The single most-deployed schema on a B2B SaaS site.
FAQPage
Every page with a real FAQ section. The strongest single signal for AI Search citation because AI extractors lift Q&A pairs cleanly when structured. Required for any page that wants to compete for AI Overview citation.
BreadcrumbList
Every page, not just blog posts. Reinforces the site's hierarchical structure for both Google and AI crawlers. Cheap to deploy, consistently helpful.
SoftwareApplication
For the product page itself. The accurate type for B2B SaaS products. Includes name, description, applicationCategory, operatingSystem, offers (pricing), aggregateRating only if you have real, verifiable reviews.
Person
Every author profile page. Name, jobTitle, worksFor, image, sameAs (linking to verified LinkedIn, Twitter). Critical for the E-E-A-T signal that AI Search engines weight heavily.
Skip the rest
Speakable, Video (unless genuinely video-centric), Recipe, Event, Course (unless you actually offer one), most niche schema types. Wrong schema is worse than no schema. Schema deployed solely for SEO without an underlying real entity is a manual review risk.
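The Article block from the list above can be generated from post metadata rather than hand-written per page. A sketch; the `PostMeta` shape is a hypothetical stand-in for your CMS fields:

```typescript
// Sketch: generate Article JSON-LD from post metadata.
type PostMeta = {
  title: string;
  description: string;
  url: string;
  image: string;
  datePublished: string; // ISO 8601
  dateModified: string; // ISO 8601
  authorName: string;
  authorUrl: string; // the author profile page carrying Person schema
};

function articleJsonLd(post: PostMeta): string {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Article",
    headline: post.title,
    description: post.description,
    image: post.image,
    datePublished: post.datePublished,
    dateModified: post.dateModified,
    mainEntityOfPage: { "@type": "WebPage", "@id": post.url },
    author: { "@type": "Person", name: post.authorName, url: post.authorUrl },
  });
}
```

Emit the string inside a `<script type="application/ld+json">` tag in the page template, then validate the template once through Rich Results Test before deploying broadly.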
08 / Internal linking architecture at scale
At 200 to 1,000 pages, internal linking is not "add a few links per article." It is an architectural decision.
It determines how PageRank flows through the site and which pages compete effectively. For sites at scale (500+ pages), an internal linking audit takes 4 to 8 hours and consistently produces a 5 to 15 percent ranking lift across the cluster within 90 days.
Pillar pages receive links from every cluster post
Plus contextually relevant links from services pages, industry pages, and case studies. The /b2b-saas-seo master pillar should receive 50 to 150 internal links across the site. Each sub-pillar should receive 20 to 50.
Cluster posts link both up and sideways
Each cluster post receives a link from its parent pillar (top-down) plus 2 to 3 sideways links from sibling cluster posts. The bidirectional architecture is what Google reads as topical authority.
Commercial pages get linked from intent-mapped informational content
A comparison page about "HubSpot vs Pipedrive" should be linked from problem-aware and solution-aware content about CRM selection. Without those internal links, the comparison page ranks slower than it should.
Anchor text varies but stays descriptive
"Click here" produces no ranking signal. Over-optimized exact-match keyword anchors look spammy. The right pattern is descriptive natural-language anchors that include the target keyword variant.
Diagnose with Screaming Frog → Internal Inlinks
Sort ascending. Pages with fewer than 3 internal inlinks are isolated and will not rank well regardless of content quality. The fix is to link to them more, retire them through 301s, or remove them from the publishing roadmap.
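The same diagnosis can be run over any edge-list export of internal links (for example, a Screaming Frog inlinks export parsed to from/to pairs). A sketch, assuming self-links do not count:

```typescript
// Sketch: find pages with fewer than minInlinks internal inlinks.
type Link = { from: string; to: string };

function isolatedPages(pages: string[], links: Link[], minInlinks = 3): string[] {
  const counts = new Map<string, number>(pages.map((p) => [p, 0]));
  for (const { from, to } of links) {
    if (from !== to && counts.has(to)) {
      counts.set(to, (counts.get(to) ?? 0) + 1);
    }
  }
  return pages.filter((p) => (counts.get(p) ?? 0) < minInlinks);
}
```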
09 / Migrations and replatforming
Most B2B SaaS sites get migrated every 18 to 36 months. Each migration is a high-risk technical SEO event.
We have inherited engagements where the previous migration cost 30 to 60 percent of organic traffic and the team did not realize it until the new agency audited the site months later. The checklist that prevents most of these losses:
01
URL mapping document
Every old URL has a new URL it maps to or returns a 410 ("intentionally gone"). The mapping is documented before the migration, not improvised after.
02
301 redirects for every changed URL
Not 302s. 302 redirects are temporary and do not pass full link equity. The single most common migration mistake is using 302s by default because that is what the new platform's redirect tool defaults to.
03
Sitemap regeneration and resubmission
New sitemap files generated for the new URL structure, listed in a sitemap index, submitted to GSC and Bing Webmaster Tools the day of launch.
04
Internal linking audit on the new site
Internal links inside content that pointed to old URLs need to point directly to the new equivalent — not rely on the 301 chain. Direct linking is cleaner.
05
Schema markup re-deployment
Schema is often stripped during platform migrations because the new templates do not include the same structured data. Verify each page type's schema is intact after launch.
06
Core Web Vitals verification
The new platform's CWV profile is often worse during the first 30 days as caching warms up and edge nodes propagate. Monitor for regressions.
07
JavaScript rendering verification
If the new platform uses a different rendering approach (Webflow's HTML-first SSG to a Next.js site that defaulted to CSR), verify rendering on every major page type. Catches the most expensive migration error possible.
08
Manual spot-check of the top 50 traffic pages
Open each one. Verify rendering, canonical, schema, redirects, content. Eight to twelve hours of manual work that catches what automated tools miss.
09
Documented rollback plan
Before launch, document the conditions under which the migration gets rolled back. Without one, a failed migration produces panic decisions that compound the damage.
10
Two-week post-launch monitoring
Daily checks of GSC Coverage, Performance, and the top 20 traffic pages. Most migration issues show up days 3 to 10 after launch as Google re-crawls.
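Step 01 above is checkable before launch. A sketch of a URL-mapping validator: every old URL must map to a new URL or be explicitly marked as an intentional 410 (the `Mapping` shape is an assumption about how the mapping document is stored):

```typescript
// Sketch: validate a migration URL-mapping document before launch.
type Mapping = { oldUrl: string; newUrl?: string; gone?: boolean };

function mappingProblems(oldUrls: string[], mappings: Mapping[]): string[] {
  const byOld = new Map(mappings.map((m) => [m.oldUrl, m]));
  const problems: string[] = [];
  for (const oldUrl of oldUrls) {
    const m = byOld.get(oldUrl);
    if (!m) {
      problems.push(`${oldUrl}: no mapping`);
    } else if (!m.gone && !m.newUrl) {
      problems.push(`${oldUrl}: no target and not marked 410`);
    }
  }
  return problems;
}
```

Run it against the full crawl of the old site; an empty problem list is the precondition for launch day.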
A classic failure mode: launching on a Friday afternoon and not monitoring until Monday. By Monday, broken redirects have been crawled, indexed, and started showing in search. Migrate Tuesday morning. Watch GSC daily for the first 14 days. Have engineering on standby for the first 72 hours.
10 / FAQ
What teams ask before they hand over the technical SEO work.
Part 04 of the B2B SaaS SEO playbook
This is the technical chapter.
The full playbook covers strategy, keyword research, content, technical, links, AI search, and reporting.