You shipped the site. Google can't find it.
Launch day is exciting. You've built the features, polished the design, deployed to production. But in the rush to ship, SEO is usually the last thing anyone checks - if it's checked at all.
The result? Your site goes live with invisible problems that silently prevent Google from indexing, ranking, or even crawling your pages. By the time you notice, you've lost weeks of potential organic traffic.
Here are the five SEO mistakes we see most often on freshly launched sites - and how to catch them before they cost you.
1. Shipping a noindex tag from staging
This is the number one killer. During development, you add a noindex meta tag or X-Robots-Tag header to keep staging out of Google's index. Then you deploy to production and forget to remove it.
<!-- This one line makes your entire site invisible to Google -->
<meta name="robots" content="noindex, nofollow" />
Google honors the directive as soon as it recrawls the page. Within days, your pages start disappearing from search results. And because your site looks perfectly fine to visitors, nobody notices until traffic flatlines.
The fix: Check every page for robots directives before launch. Not just the homepage - every route. A noindex tag on your pricing page or blog won't throw an error, but it will silently remove those pages from search.
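That check is easy to script. Here's a minimal sketch in Python - the `has_noindex` helper is hypothetical, and the regex only handles the common `name`-before-`content` attribute order, so treat it as a starting point rather than a full HTML parser:

```python
# Hypothetical pre-launch check: flag a page that carries a noindex
# directive in either its meta robots tag or its X-Robots-Tag header.
import re

def has_noindex(html: str, headers: dict) -> bool:
    """Return True if the page would be excluded from Google's index."""
    # Header check: X-Robots-Tag: noindex (case-insensitive)
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return True
    # Meta tag check: <meta name="robots" content="...noindex...">
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        re.IGNORECASE,
    )
    match = pattern.search(html)
    return bool(match and "noindex" in match.group(1).lower())
```

Run it against the fetched HTML and response headers of every route, not just `/`.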
2. A robots.txt that blocks everything
The default robots.txt on many frameworks and CMS platforms is restrictive. Some block entire directories. Others ship Disallow: /, which tells every search engine bot to stay away from your entire site.
# This blocks all crawlers from your entire site
User-agent: *
Disallow: /
You might have set this during development to prevent premature crawling. Or your deployment platform might generate a restrictive default. Either way, if this goes live, Google won't crawl a single page.
The fix: Before launch, visit yourdomain.com/robots.txt and verify it allows crawling of all public pages. A good starting point for most sites:
User-agent: *
Disallow:
Sitemap: https://yourdomain.com/sitemap.xml
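You can verify the deployed file with Python's standard urllib.robotparser. This sketch feeds it a robots.txt body and reports which public routes a generic crawler would be forbidden to fetch (the `blocked_paths` helper and the path list are illustrative):

```python
# Sanity-check a robots.txt body against the routes you expect
# crawlers to reach. Uses only the standard library.
from urllib import robotparser

def blocked_paths(robots_txt: str, site: str, paths: list) -> list:
    """Return the paths a generic crawler ("*") may not fetch."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [p for p in paths if not parser.can_fetch("*", site + p)]
```

With `Disallow: /` every path comes back blocked; with an empty `Disallow:` the list is empty, which is what you want before launch.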
Robots.txt Validator
Validate your robots.txt file for syntax errors and blocking rules.
3. Missing or broken sitemap.xml
A sitemap tells Google which pages exist on your site and when they were last updated. Without one, Google relies solely on following links to discover your content - which means orphaned pages or deep routes might never get indexed.
Common launch-day sitemap issues:
- No sitemap at all - you forgot to generate one or your build step doesn't produce one
- Sitemap returns 404 - the file isn't deployed or the route isn't configured
- Sitemap lists staging URLs - https://staging.yourdomain.com/ instead of your production domain
- Sitemap references deleted pages - URLs that return 404, which wastes crawl budget
The fix: Generate your sitemap as part of your build process and verify every URL in it returns a 200 status code with the correct production domain.
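The domain half of that check can be sketched with the standard library alone. This hypothetical `wrong_domain_urls` helper parses the sitemap XML and flags any <loc> that isn't on the production host; a full check would also fetch each URL and assert a 200 response, which is omitted here:

```python
# Minimal sitemap check: parse the XML, then flag every <loc> whose
# hostname is not the production host (e.g. leftover staging URLs).
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def wrong_domain_urls(sitemap_xml: str, production_host: str) -> list:
    """Return sitemap URLs that do not live on the production host."""
    root = ET.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.findall(".//sm:loc", NS)]
    return [u for u in locs if urlparse(u).hostname != production_host]
```

Comparing hostnames exactly (rather than substring-matching the domain) is deliberate: staging.yourdomain.com contains yourdomain.com and would slip past a naive check.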
Sitemap Validator
Validate your sitemap.xml file format and URL count.
4. No Open Graph tags (your links look terrible when shared)
This isn't a ranking factor, but it directly impacts your launch. When you share your new site on Twitter, LinkedIn, Slack, or Discord, the link preview is controlled by Open Graph tags. Without them, your link shows up as a bare URL with no image, no title, and no description.
First impressions matter. A broken link preview on launch day kills the momentum of every share, tweet, and message your team sends.
<!-- The four OG tags every page needs -->
<meta property="og:title" content="Your Page Title" />
<meta property="og:description" content="A compelling description." />
<meta property="og:image" content="https://yourdomain.com/og-image.png" />
<meta property="og:url" content="https://yourdomain.com/page" />
The fix: Add OG tags to every page. Test them with a link preview debugger before you start sharing. Make sure og:image points to an absolute URL that's actually accessible.
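A simple audit of those four tags fits in a few lines of stdlib Python. The `og_problems` helper below is a sketch, not a full validator - it reports missing og: properties and a relative og:image, which are the two failures that most often break link previews:

```python
# Hypothetical Open Graph audit: collect og:* meta tags from a page,
# report which required ones are missing, and flag a non-absolute
# og:image URL. Uses only html.parser from the standard library.
from html.parser import HTMLParser

REQUIRED = {"og:title", "og:description", "og:image", "og:url"}

class OGParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            prop = a.get("property", "")
            if prop.startswith("og:"):
                self.tags[prop] = a.get("content", "")

def og_problems(html: str) -> list:
    parser = OGParser()
    parser.feed(html)
    problems = [f"missing {t}" for t in sorted(REQUIRED - parser.tags.keys())]
    image = parser.tags.get("og:image", "")
    if image and not image.startswith("http"):
        problems.append("og:image is not an absolute URL")
    return problems
```

An empty list means the page is ready to share; anything else is worth fixing before launch day.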
Open Graph Tag Preview
Check your Open Graph and Twitter Card tags for social media sharing.
5. Broken heading hierarchy
Search engines use your heading structure (<h1> through <h6>) to understand the topic and organization of each page. A page with no <h1>, multiple <h1> tags, or headings that skip levels sends confusing signals.
This is especially common in component-based frameworks where different components render headings independently, and nobody checks the final assembled hierarchy.
<!-- Bad: skipped levels, no clear structure -->
<h1>Welcome</h1>
<h3>Features</h3>
<h2>Pricing</h2>
<h4>FAQ</h4>
<!-- Good: logical hierarchy -->
<h1>Welcome to Our App</h1>
<h2>Features</h2>
<h3>Feature One</h3>
<h3>Feature Two</h3>
<h2>Pricing</h2>
<h2>FAQ</h2>
The fix: Before launch, check each page for a single <h1> and a logical heading order with no skipped levels. This also improves accessibility for screen reader users.
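This is another check that's easy to automate against the assembled page. The sketch below (the `heading_issues` helper is hypothetical) collects h1-h6 tags in document order and flags a missing h1, multiple h1 tags, or a skipped level:

```python
# Minimal heading audit: gather h1-h6 levels in document order, then
# flag no h1, multiple h1 tags, and skipped levels (e.g. h1 -> h3).
from html.parser import HTMLParser

class HeadingParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self.levels.append(int(tag[1]))

def heading_issues(html: str) -> list:
    parser = HeadingParser()
    parser.feed(html)
    issues = []
    h1_count = parser.levels.count(1)
    if h1_count == 0:
        issues.append("no h1")
    elif h1_count > 1:
        issues.append("multiple h1 tags")
    for prev, cur in zip(parser.levels, parser.levels[1:]):
        if cur > prev + 1:
            issues.append(f"skipped level: h{prev} -> h{cur}")
    return issues
```

Run against the "bad" example above it reports two skipped levels; the "good" example comes back clean.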
Heading Structure Checker
Analyze your H1-H6 heading hierarchy for SEO best practices.
Catch all five in one scan
Every one of these mistakes is invisible to visitors. Your site looks fine, works fine, and passes every functional test. But Google sees a completely different picture.
The fastest way to catch all of them is to run a full SEO audit before you deploy. LintPage checks for all five of these issues - plus 35 more - in a single scan. Paste your URL, get your results in 30 seconds, and ship with confidence.