Website Audit

Scan. Analyze. Improve.

Enter any URL and get a single-page technical audit covering SEO metadata, crawlability, social sharing tags, and security headers — powered by industry-standard checks.

This audit fetches the public URL you provide and inspects its visible technical signals. It is a starting point for identifying common issues — not a substitute for a full technical SEO audit, penetration test, or performance analysis under real user conditions.

Overview

The Website Audit tool fetches a public URL and runs a set of deterministic checks against the page response and its HTML. In seconds it surfaces the most common technical SEO, metadata, security, and crawlability issues that affect how search engines index pages and how users experience them.

The audit covers twelve checks across five areas: HTTP and HTTPS status, on-page metadata (title tag, meta description, canonical, H1), social sharing tags (Open Graph and Twitter Card), crawl signals (robots.txt and sitemap.xml), and security headers (HSTS, X-Content-Type-Options, X-Frame-Options, and CSP). Each check is graded as passed, warning, or critical, and combined into an overall score.
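The grading model described above can be sketched in a few lines. The deductions below are illustrative assumptions, not the tool's published weights:

```python
# Hypothetical scoring sketch: each check is graded "passed", "warning",
# or "critical", and grades combine into a 0-100 overall score.
# The per-grade deductions are assumptions for illustration only.

def overall_score(results: dict) -> int:
    """Combine per-check grades into a single 0-100 score."""
    score = 100
    for check, grade in results.items():
        if grade == "critical":
            score -= 15  # assumed deduction per critical finding
        elif grade == "warning":
            score -= 5   # assumed deduction per warning
    return max(score, 0)

print(overall_score({"https": "passed", "title": "critical", "hsts": "warning"}))
```

A passed check costs nothing, so a fully clean page scores 100 regardless of how the weights are chosen.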

This is a single-page, single-request audit. It checks the URL you enter — not your entire site. Use it for homepage checks, pre-launch QA, deployment verification, and quick spot checks before sharing a URL with a client or stakeholder.

Use cases

When to use it

  • Pre-launch review: run the audit on a staging URL before going live to catch missing metadata, absent security headers, or incorrect canonical tags.
  • Post-deployment verification: confirm that a recent deployment has not inadvertently removed metadata, broken HTTPS, or dropped security headers.
  • Technical SEO spot check: quickly audit a specific page for title length, meta description, canonical, H1, and social sharing tags without opening DevTools.
  • Client handoff review: run an audit on a client site before delivery to identify obvious issues and show a documented score.
  • Crawlability check: verify that robots.txt and sitemap.xml are reachable at their standard paths before submitting a site to search engines.

When it's not enough

  • Full site crawl: this tool audits a single URL per run. It does not crawl multiple pages, follow internal links, or report site-wide issues.
  • Core Web Vitals analysis: the audit does not measure Largest Contentful Paint, Interaction to Next Paint, or Cumulative Layout Shift. Use Lighthouse or PageSpeed Insights for field and lab performance data.
  • Penetration testing: header presence is a useful signal but is not a security assessment. Use a dedicated security scanner for a thorough review.
  • Accessibility auditing: the audit does not run WCAG or ARIA checks. Use an accessibility-specific tool for compliance testing.

How to use it

  1. Enter a public URL

    Type or paste the full URL you want to audit, including https://. The audit will follow redirects to the final destination.

  2. Click Run Audit

    The tool fetches the page, reads its HTML, and checks its response headers. This usually takes under five seconds.

  3. Review the score and findings

    The overall score gives a quick read on how many checks passed versus failed. Critical issues are shown first, followed by warnings and quick wins.

  4. Focus on critical issues first

    Missing HTTPS, missing title tags, and multiple missing security headers carry the highest score deductions. Address these before working through warnings.

  5. Re-run after making changes

    Once you have fixed issues on your site or staging environment, run the audit again to confirm the fixes are reflected in the live response.
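The "critical first, then warnings" ordering in the review step can be sketched as a simple severity sort. The grade names follow the audit's three levels; the finding structure is an assumption:

```python
# Sort findings so critical issues surface first, then warnings,
# then passed checks. The dict shape here is illustrative.
SEVERITY_ORDER = {"critical": 0, "warning": 1, "passed": 2}

def order_findings(findings: list) -> list:
    """Return findings sorted by severity, most urgent first."""
    return sorted(findings, key=lambda f: SEVERITY_ORDER[f["grade"]])

findings = [
    {"check": "meta description", "grade": "warning"},
    {"check": "title tag", "grade": "critical"},
    {"check": "https", "grade": "passed"},
]
for f in order_findings(findings):
    print(f["grade"], f["check"])
```

Because `sorted` is stable, findings with the same severity keep their original relative order.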

Common errors and fixes

Missing title tag

Add a descriptive <title> element to the <head> of each page. Keep it between 10 and 60 characters, and make it unique per page. This is the most important on-page SEO element.
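A title check of this kind can be approximated with a regex over the fetched HTML. This is a simplified sketch (a real auditor would use an HTML parser), and the grade labels mirror the tool's three levels:

```python
import re

def check_title(html: str) -> tuple:
    """Grade the <title> element: present, and between 10 and 60 characters."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if not m:
        return "critical", "missing title tag"
    title = m.group(1).strip()
    if not 10 <= len(title) <= 60:
        return "warning", f"title is {len(title)} characters (aim for 10-60)"
    return "passed", title
```

For example, a page with no `<head>` at all grades as critical, while an over-long title only grades as a warning.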

Missing or short meta description

Add a <meta name="description" content="..."> tag to each page. Aim for 50 to 160 characters. Descriptions do not directly affect rankings but improve click-through rates in search results.
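The length guidance above can be checked the same way. This sketch assumes the `name` attribute precedes `content`, which a real parser would not require:

```python
import re

def check_meta_description(html: str) -> tuple:
    """Grade the meta description: present, and between 50 and 160 characters."""
    m = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.IGNORECASE | re.DOTALL)
    if not m:
        return "warning", "missing meta description"
    desc = m.group(1).strip()
    if not 50 <= len(desc) <= 160:
        return "warning", f"description is {len(desc)} characters (aim for 50-160)"
    return "passed", desc
```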

Missing canonical tag

Add <link rel="canonical" href="https://yourdomain.com/page/"> to each page pointing to the preferred URL. This prevents duplicate content issues when the same page is accessible via multiple URLs.
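A canonical check typically extracts the tag and compares it against the audited URL, ignoring a trailing slash. A regex sketch, with the same parser caveat as above:

```python
import re

def check_canonical(html: str, page_url: str) -> tuple:
    """Check for a canonical link and whether it matches the page URL."""
    m = re.search(r'<link\s+rel=["\']canonical["\']\s+href=["\'](.*?)["\']',
                  html, re.IGNORECASE)
    if not m:
        return "warning", "missing canonical tag"
    canonical = m.group(1)
    # Treat /page and /page/ as the same URL for comparison purposes.
    if canonical.rstrip("/") != page_url.rstrip("/"):
        return "warning", f"canonical points elsewhere: {canonical}"
    return "passed", canonical
```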

Missing or multiple H1 tags

Each page should have exactly one H1 tag that clearly describes the page topic. If there are multiple H1 tags, consolidate them into one. If there are none, add one near the top of the content.
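Counting H1s is one of the simpler deterministic checks. A sketch of the "exactly one" rule:

```python
import re

def check_h1(html: str) -> tuple:
    """Grade H1 usage: a page should have exactly one <h1> element."""
    count = len(re.findall(r"<h1[\s>]", html, re.IGNORECASE))
    if count == 0:
        return "warning", "no H1 tag found"
    if count > 1:
        return "warning", f"{count} H1 tags found; consolidate into one"
    return "passed", "exactly one H1"
```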

Missing Open Graph or Twitter Card tags

Add og:title, og:description, og:image, and twitter:card meta tags to the page <head>. These control how the page appears when shared on social platforms. Without them, platforms generate their own previews, which are often inaccurate.
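Presence of these four tags can be checked directly. This sketch accepts either `property=` (used by Open Graph) or `name=` (used by Twitter Card) on the meta tag:

```python
import re

REQUIRED_SOCIAL_TAGS = ["og:title", "og:description", "og:image", "twitter:card"]

def missing_social_tags(html: str) -> list:
    """Return which of the common social sharing meta tags are absent."""
    missing = []
    for tag in REQUIRED_SOCIAL_TAGS:
        # og:* tags conventionally use property=, twitter:* tags use name=.
        pattern = r'<meta\s+(?:property|name)=["\']' + re.escape(tag) + r'["\']'
        if not re.search(pattern, html, re.IGNORECASE):
            missing.append(tag)
    return missing
```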

Multiple security headers missing

HSTS, X-Content-Type-Options, X-Frame-Options, and Content-Security-Policy must be configured at the web server or CDN level. Check your Vercel, Cloudflare, nginx, or Apache configuration and add the missing headers. Use the HTTP Headers Checker tool for a detailed view.
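Checking these headers amounts to a case-insensitive presence test on the response headers. A minimal sketch, taking the headers as a plain dict:

```python
# The four security headers the audit looks for.
EXPECTED_HEADERS = [
    "Strict-Transport-Security",  # HSTS
    "X-Content-Type-Options",
    "X-Frame-Options",
    "Content-Security-Policy",
]

def missing_security_headers(response_headers: dict) -> list:
    """Return the expected security headers absent from a response.

    HTTP header names are case-insensitive, so compare lowercased.
    """
    present = {name.lower() for name in response_headers}
    return [h for h in EXPECTED_HEADERS if h.lower() not in present]
```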

Robots.txt or sitemap.xml not found

Place robots.txt at https://yourdomain.com/robots.txt and sitemap.xml at https://yourdomain.com/sitemap.xml. Most frameworks and CMSs generate these automatically when configured correctly.
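Both files live at fixed paths on the site's origin, so their locations can be derived from any page URL before checking reachability. A small helper using the standard library:

```python
from urllib.parse import urlsplit

def crawl_file_urls(page_url: str) -> tuple:
    """Derive the standard robots.txt and sitemap.xml URLs for a page's origin."""
    parts = urlsplit(page_url)
    base = f"{parts.scheme}://{parts.netloc}"
    return base + "/robots.txt", base + "/sitemap.xml"
```

Note that robots.txt must sit at the root of the origin, so a deep page URL still resolves to the same two locations.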

Frequently asked questions