Robots.txt & sitemap preview

Client

Paste a robots file or an XML sitemap to see a structured summary: User-agent blocks, Allow/Disallow rules, declared Sitemap URLs, or table rows from a urlset or sitemapindex document. Parsing runs offline in your browser; pair it with the Open Graph preview when you are auditing site metadata.


Paste a robots file or a sitemap document; parsing runs locally. This is a structural preview, not a live crawl or a Google-specific validator.

Parsed structure

Sitemap directives

  • https://example.com/sitemap.xml

Group 1

User-agent: *

  • Disallow: /admin/
  • Allow: /admin/login

Group 2

User-agent: OtherBot

  • Disallow:

Other directives

  • Crawl-delay: 1
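The grouping above follows the usual robots.txt convention: consecutive User-agent lines open a group, Allow/Disallow lines attach to the current group, Sitemap lines apply file-wide, and anything else falls into an "other" bucket. A minimal sketch of that logic (an illustration, not the tool's actual implementation) might look like:

```python
def parse_robots(text):
    """Split a robots.txt body into user-agent groups plus loose directives.

    Returns (groups, sitemaps, other), where each group is a dict of
    agents and (rule, path) pairs. Sketch only: real parsers also handle
    BOM stripping, encoding quirks, and size limits.
    """
    groups, sitemaps, other = [], [], []
    current = None
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue  # blank lines and malformed lines are ignored
        name, _, value = line.partition(":")
        name, value = name.strip().lower(), value.strip()
        if name == "user-agent":
            # Consecutive User-agent lines share one group; a User-agent
            # line after rules starts a new group.
            if current is None or current["rules"]:
                current = {"agents": [], "rules": []}
                groups.append(current)
            current["agents"].append(value)
        elif name in ("allow", "disallow") and current is not None:
            current["rules"].append((name, value))
        elif name == "sitemap":
            sitemaps.append(value)  # Sitemap lines are not tied to a group
        else:
            other.append((name, value))  # e.g. Crawl-delay
    return groups, sitemaps, other
```

Running this on the sample above yields two groups (for `*` and `OtherBot`), one Sitemap URL, and `("crawl-delay", "1")` under other. Note that `Disallow:` with an empty value is kept as a rule, since an empty Disallow means "allow everything" for that agent.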

Common use cases

  • Skim robots.txt groups and Sitemap: lines before changing production crawl rules.
  • Extract loc URLs from a sitemap or sitemap index file you pasted from Search Console.
  • Compare Allow/Disallow lines when debugging why a path is blocked.
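For the loc-extraction use case, the core idea is simple: accept a urlset or sitemapindex root and collect every `loc` value regardless of the XML namespace prefix. A hedged sketch with the standard library (names and error handling are illustrative, not the tool's code):

```python
import xml.etree.ElementTree as ET

def extract_locs(xml_text):
    """Collect <loc> values from a urlset or sitemapindex document.

    Ignores the sitemap XML namespace by comparing local tag names only.
    """
    root = ET.fromstring(xml_text)
    local = root.tag.rsplit("}", 1)[-1]  # strip "{namespace}" prefix if present
    if local not in ("urlset", "sitemapindex"):
        raise ValueError(f"unexpected root element: {local}")
    return [
        el.text.strip()
        for el in root.iter()
        if el.tag.rsplit("}", 1)[-1] == "loc" and el.text
    ]
```

The same check explains the FAQ entry below: if the root element is not urlset or sitemapindex, there are no loc children to list.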

Common mistakes to avoid

  • Treating this as a live Google Search Console report

    This tool only parses the text you paste. It does not fetch your live site or query Google.

  • Expecting every non-standard directive

    Unknown directives are listed under “Other” when recognized as name:value pairs; highly vendor-specific lines may need manual review.

FAQ

Does this upload my robots.txt or sitemap to Toolcore?

No. Parsing runs entirely in your browser. Nothing is sent to our servers.

Why does my sitemap show no URLs?

Check that the root element is urlset or sitemapindex with standard loc children. Namespaced or unusual XML may need cleanup before parsing.
