You can't fix what you can't see
Most SEO issues are invisible. Your site looks perfect in the browser, but Google sees something different: a missing canonical tag splitting your rankings, a noindex tag hiding your best page, a broken sitemap pointing to URLs that don't exist.
An SEO audit finds these invisible problems before they cost you traffic. Here's how to do one systematically, from the checks that matter most to the ones that give you an edge.
Step 1: Check indexability
Nothing else matters if Google can't index your pages. Start here.
robots.txt
Visit https://yourdomain.com/robots.txt and check for overly broad Disallow rules. The most common mistake is a staging-environment Disallow: / that ships to production:
# Fetch your robots.txt and check for broad blocks
curl -s https://yourdomain.com/robots.txt
Look for rules that block important directories like /blog/, /products/, or static asset paths like /_next/.
Robots.txt Validator
Validate your robots.txt file for syntax errors and blocking rules.
noindex tags
Check every important page for <meta name="robots" content="noindex"> tags. These tell Google to drop the page from search results entirely. Also check for the X-Robots-Tag HTTP header, which does the same thing but is harder to spot:
curl -s https://yourdomain.com | grep -i "noindex"
curl -sI https://yourdomain.com | grep -i "x-robots-tag"
XML sitemap
Your sitemap at /sitemap.xml should exist, return a 200 status, contain valid XML, and list only URLs that actually resolve. A sitemap full of 404s wastes your crawl budget and signals poor site maintenance to Google.
Check that:
- Every URL in the sitemap returns 200
- No staging or development URLs are included
- The sitemap is referenced in your robots.txt
- lastmod dates are accurate (don't set them all to today)
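The first point can be spot-checked from the command line. A rough sketch, assuming a standard sitemap at /sitemap.xml with one URL per `<loc>` element (yourdomain.com is a placeholder):

```shell
# Extract every <loc> URL from the sitemap and report any that
# don't return HTTP 200. yourdomain.com is a placeholder domain.
curl -s https://yourdomain.com/sitemap.xml \
  | grep -o '<loc>[^<]*</loc>' \
  | sed -e 's|<loc>||' -e 's|</loc>||' \
  | while read -r url; do
      code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
      [ "$code" = "200" ] || echo "$code $url"
    done
```

Any line this prints is a sitemap entry that doesn't resolve cleanly and should be removed or fixed.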
Sitemap Validator
Validate your sitemap.xml file format and URL count.
Step 2: Audit meta tags
Meta tags control how your pages appear in search results. Missing or broken meta tags mean Google has to guess - and it usually guesses wrong.
Title tags
Every page needs a unique title between 50 and 60 characters. Check for:
- Missing titles (Google generates a bad one for you)
- Duplicate titles across pages (signals redundant content)
- Titles over 60 characters (get truncated in search results)
- Generic titles like "Home" or "Untitled"
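To spot-check a single page's title against those rules, you can pull it straight out of the served HTML. A sketch (yourdomain.com is a placeholder):

```shell
# Extract the <title> tag and count its characters.
# yourdomain.com is a placeholder domain.
title=$(curl -s https://yourdomain.com \
  | tr -d '\n' \
  | grep -o '<title>[^<]*</title>' \
  | sed -e 's|<title>||' -e 's|</title>||')
echo "${#title} chars: $title"
```

No output at all means the title tag is missing; a count over 60 means the title will likely be truncated in results.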
Meta descriptions
Write a unique description of 150-160 characters for each page. Descriptions aren't a ranking factor, but they control your search snippet and directly affect click-through rates.
<!-- Bad: missing or generic -->
<meta name="description" content="Welcome to our website" />
<!-- Good: specific and action-oriented -->
<meta name="description"
content="Run 45+ SEO checks in 30 seconds. Catch noindex tags,
broken robots.txt, and missing meta tags before launch." />
Canonical URLs
Every page should declare its canonical URL to prevent duplicate content issues. Without canonicals, Google might index your page at /pricing, /pricing/, and /pricing?ref=nav as three separate pages - splitting your ranking signals.
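To see what a page actually declares, grep the served HTML for the canonical link tag. A sketch that assumes double-quoted attributes; the /pricing URL is a placeholder:

```shell
# Print the canonical link tag, if any. No output means the page
# declares no canonical URL. The URL below is a placeholder.
curl -s https://yourdomain.com/pricing \
  | tr -d '\n' \
  | grep -io '<link[^>]*rel="canonical"[^>]*>'
```

Run it against both the trailing-slash and no-slash versions of a URL; both should point at the same canonical.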
Meta Tag Checker
Check your page title, meta description, viewport, charset, and robots tags.
Step 3: Check heading structure
Your heading hierarchy tells Google (and screen readers) how your content is organized. Common problems:
- Multiple <h1> tags on a single page
- Skipped heading levels (jumping from <h2> to <h4>)
- Empty heading tags
- Headings used for styling instead of structure
Every page should have exactly one <h1>, and headings should follow a logical order without gaps.
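One crude way to tally heading levels without opening DevTools (yourdomain.com is a placeholder; this counts opening tags only):

```shell
# Count opening tags per heading level. More than one h1, or a
# gap between levels, deserves a closer look in the page source.
curl -s https://yourdomain.com \
  | grep -io '<h[1-6]' \
  | tr '[:upper:]' '[:lower:]' \
  | sort \
  | uniq -c
```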
Heading Structure Checker
Analyze your H1-H6 heading hierarchy for SEO best practices.
Step 4: Review Core Web Vitals
Google uses three performance metrics as ranking signals:
- LCP (Largest Contentful Paint): How fast the main content loads. Target: under 2.5 seconds.
- INP (Interaction to Next Paint): How fast the page responds to user input. Target: under 200ms.
- CLS (Cumulative Layout Shift): How much the layout moves during loading. Target: under 0.1.
Run Google Lighthouse in Chrome DevTools (F12 > Lighthouse tab) on your key pages. Focus on the pages that drive the most traffic first.
The most common fixes:
- Add width and height attributes to images (fixes CLS)
- Lazy-load below-the-fold images (fixes LCP)
- Reduce JavaScript bundle size (fixes INP)
Step 5: Validate structured data
Structured data (JSON-LD) helps Google understand your content and can unlock rich snippets in search results. Check for:
- Valid JSON-LD syntax (use Google's Rich Results Test)
- Correct schema types for your content (Article, Product, FAQPage, etc.)
- Required fields filled in (many schemas have mandatory properties)
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Audit Your Website for SEO",
  "author": { "@type": "Person", "name": "Your Name" },
  "datePublished": "2026-03-28"
}
</script>
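Before reaching for the Rich Results Test, you can at least confirm the JSON is well-formed locally. A sketch that assumes a single JSON-LD script tag with exactly that type attribute (yourdomain.com is a placeholder):

```shell
# Strip newlines, cut out the JSON-LD payload, and try to parse it.
# Prints a failure message if the JSON is malformed or missing.
curl -s https://yourdomain.com \
  | tr -d '\n' \
  | sed -n 's|.*<script type="application/ld+json">\(.*\)</script>.*|\1|p' \
  | python3 -m json.tool >/dev/null \
  && echo "JSON-LD parses" \
  || echo "invalid or missing JSON-LD"
```

This only checks syntax; required schema properties still need the Rich Results Test.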
Step 6: Check internal linking
Google discovers pages by following links. If important pages are buried deep in your site with no internal links pointing to them, they may never get crawled.
Look for:
- Orphan pages - pages with no internal links pointing to them
- Dead links - internal links that return 404
- Redirect chains - links that go through multiple redirects before reaching the final page
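The dead-link check can be scripted as a starting point: collect every root-relative link on a page, then ask for each one's status. A rough sketch that ignores absolute internal URLs (yourdomain.com is a placeholder):

```shell
base=https://yourdomain.com   # placeholder domain
# Collect unique root-relative hrefs, then flag any that 404.
curl -s "$base" \
  | grep -o 'href="/[^"]*"' \
  | sed -e 's|href="||' -e 's|"$||' \
  | sort -u \
  | while read -r path; do
      code=$(curl -s -o /dev/null -w '%{http_code}' "$base$path")
      [ "$code" = "404" ] && echo "dead link: $path"
    done
```

For redirect chains, curl's write-out variables help: `curl -sIL -o /dev/null -w '%{num_redirects}\n' URL` prints how many hops a link takes before settling.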
The one-command version
Running through all six steps manually is thorough but time-consuming. If you want to check everything at once, LintPage covers steps 1 through 4 in a single 30-second scan - indexability, meta tags, headings, and performance checks included.
Free Website Audit Tool
Run all 45+ SEO checks in a single scan. Catch every issue on this checklist automatically.