Googlebot won't fully index pages or resources heavier than 2 MB. Enter a URL and the tool analyzes your HTML document, every CSS & JS file, whether Googlebot can reach the page, and more.
Full audit in seconds — no sign-up, no installation
Googlebot downloads and processes only the first 2 MB of HTML. Everything beyond is ignored — critical for pages with large inline data or JSON-LD blocks.
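A minimal sketch of this check, assuming Node 18+ with the built-in fetch; the 2 MB constant and the Googlebot user-agent string are illustrative, not the tool's actual implementation:

```ts
// Sketch: fetch a page and compare its raw HTML size to an assumed 2 MB budget.
const LIMIT_BYTES = 2 * 1024 * 1024; // assumed per-document budget

async function checkHtmlSize(url: string): Promise<void> {
  const res = await fetch(url, {
    // Hypothetical UA string; some servers serve different markup to crawlers.
    headers: { "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1)" },
  });
  const bytes = new TextEncoder().encode(await res.text()).length;
  console.log(`${url}: ${bytes} bytes of HTML`);
  if (bytes > LIMIT_BYTES) {
    console.warn(`Over budget by ${bytes - LIMIT_BYTES} bytes; trailing markup may never be parsed.`);
  }
}

checkHtmlSize("https://example.com");
```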
Each external stylesheet and script also has a 2 MB limit. Heavy bundles not only slow page load but also restrict how much Googlebot can parse and render.
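A sketch of the per-resource pass, under the same Node 18+ fetch assumption; the regex extraction is a simplification of real DOM parsing:

```ts
// Sketch: list external scripts/stylesheets and HEAD-check each against an
// assumed 2 MB per-resource budget. Content-Length may be absent or reflect
// the compressed size, so treat the result as an estimate.
const RESOURCE_LIMIT = 2 * 1024 * 1024;

async function auditResources(pageUrl: string): Promise<void> {
  const html = await (await fetch(pageUrl)).text();
  const refs = [...html.matchAll(/<(?:script[^>]*?src|link[^>]*?href)=["']([^"']+)["']/gi)]
    .map((m) => new URL(m[1], pageUrl).href)
    .filter((u) => /\.(js|css)(\?|$)/.test(u));
  for (const u of refs) {
    const head = await fetch(u, { method: "HEAD" });
    const size = Number(head.headers.get("content-length") ?? 0);
    console.log(`${size > RESOURCE_LIMIT ? "OVER " : "ok   "}${size} B  ${u}`);
  }
}
```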
We check robots.txt, X-Robots-Tag headers, and the meta robots tag. A page can be open to users but invisible to Google at the same time.
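A sketch of the header and meta portion of that check (robots.txt matching is shown further below); the regex parsing is deliberately naive, for illustration only:

```ts
// Sketch: detect noindex delivered via the X-Robots-Tag header or a
// <meta name="robots"> tag. A page can return 200 to users yet carry either.
async function checkIndexability(url: string): Promise<void> {
  const res = await fetch(url);
  const header = res.headers.get("x-robots-tag") ?? "";
  const html = await res.text();
  const meta = html.match(/<meta[^>]+name=["']robots["'][^>]+content=["']([^"']+)["']/i)?.[1] ?? "";
  console.log(`X-Robots-Tag: ${header || "(none)"}`);
  console.log(`meta robots:  ${meta || "(none)"}`);
  if (/noindex/i.test(header) || /noindex/i.test(meta)) {
    console.warn("Served to users, but marked noindex for Google.");
  }
}
```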
gzip/Brotli compression reduces transfer size 3–10×. Without it, a file that compresses to 900 KB can exceed the 2 MB limit when served raw. Cache-Control headers prevent repeated downloads of unchanged resources.
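A sketch of the header side of this check. Note that fetch() transparently decompresses bodies, so only response headers are read here; exact on-the-wire savings need a lower-level HTTP client:

```ts
// Sketch: report the negotiated content encoding and cache policy.
// Some runtimes may strip Content-Encoding after decoding the body.
async function checkTransfer(url: string): Promise<void> {
  const res = await fetch(url, { headers: { "Accept-Encoding": "gzip, br" } });
  console.log(`Content-Encoding: ${res.headers.get("content-encoding") ?? "none (uncompressed)"}`);
  console.log(`Cache-Control:    ${res.headers.get("cache-control") ?? "(missing)"}`);
}
```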
We audit HSTS, CSP, X-Frame-Options, and other headers. They affect both user security and Google's Page Experience signals.
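For illustration, a sketch that dumps the headers such an audit typically inspects; the list is a common baseline, not an exhaustive or official set:

```ts
// Sketch: print security-related response headers, flagging missing ones.
const SECURITY_HEADERS = [
  "strict-transport-security", // HSTS
  "content-security-policy",
  "x-frame-options",
  "x-content-type-options",
  "referrer-policy",
];

async function checkSecurityHeaders(url: string): Promise<void> {
  const res = await fetch(url);
  for (const name of SECURITY_HEADERS) {
    console.log(`${name}: ${res.headers.get(name) ?? "(missing)"}`);
  }
}
```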
We parse the robots.txt file: is Googlebot blocked, are there sitemap links, and which rules apply to the checked URL.
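A sketch of that pass, again assuming Node 18+; real robots.txt evaluation (longest-match rules, Allow/Disallow precedence, multi-agent groups) is more involved than this:

```ts
// Sketch: fetch robots.txt, list Sitemap entries, and print the rules in the
// groups that apply to Googlebot (its own group or the * wildcard).
async function readRobots(origin: string): Promise<void> {
  const res = await fetch(new URL("/robots.txt", origin));
  if (!res.ok) {
    console.log("No robots.txt found; it places no restrictions on crawling.");
    return;
  }
  const text = await res.text();
  const sitemaps = text.match(/^sitemap:\s*\S+/gim) ?? [];
  console.log(sitemaps.length ? sitemaps.join("\n") : "No Sitemap declared.");
  let applies = false;
  for (const line of text.split(/\r?\n/)) {
    const ua = line.match(/^user-agent:\s*(\S+)/i);
    if (ua) applies = ua[1] === "*" || /googlebot/i.test(ua[1]);
    else if (applies && /^(allow|disallow):/i.test(line.trim())) console.log(line.trim());
  }
}
```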
Everything about the 2 MB limit and indexing
Enabling compression in nginx:

```nginx
gzip on;
gzip_types text/css application/javascript;
```
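If the server build includes the ngx_brotli module, Brotli can be enabled alongside gzip with `brotli on;` and a matching `brotli_types` list.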
We build high-performance web products and SEO solutions. The Ivatech agency is open to new projects on Upwork.
Work with Ivatech on Upwork