Parse robots.txt and explain its rules in plain English.
Fetch and parse robots.txt for any domain. See user-agent blocks, allow/disallow rules, sitemaps, and a plain-English summary.
A fast technical SEO check for crawlability and sitemap discovery.
What this tool helps you do
Rule parsing
Extract user-agent, allow, and disallow rules automatically.
Sitemap discovery
List sitemap URLs declared in robots.txt.
Plain-English summary
Get a human-readable overview of the file.
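The extraction described above can be sketched as a small parser. This is an illustrative simplification, not the tool's actual implementation; the field names (`groups`, `rules`, `sitemaps`) are assumptions.

```javascript
// Minimal robots.txt parser sketch: groups Allow/Disallow rules by
// user-agent and collects Sitemap directives. Field names are
// illustrative, not the tool's real schema.
function parseRobots(text) {
  const groups = [];   // each: { userAgents: [...], rules: [{ type, path }] }
  const sitemaps = [];
  let current = null;

  for (const rawLine of text.split("\n")) {
    const line = rawLine.split("#")[0].trim(); // strip comments
    if (!line) continue;
    const idx = line.indexOf(":");
    if (idx === -1) continue;
    const field = line.slice(0, idx).trim().toLowerCase();
    const value = line.slice(idx + 1).trim();

    if (field === "user-agent") {
      // Consecutive User-agent lines share one rule group.
      if (!current || current.rules.length > 0) {
        current = { userAgents: [], rules: [] };
        groups.push(current);
      }
      current.userAgents.push(value);
    } else if ((field === "allow" || field === "disallow") && current) {
      current.rules.push({ type: field, path: value });
    } else if (field === "sitemap") {
      sitemaps.push(value);
    }
  }
  return { groups, sitemaps };
}
```

Grouping rules under their user-agent lines (rather than keeping a flat list) is what makes a per-crawler summary possible.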
How it works
Enter the domain.
Use the homepage or any page on the target domain.
Fetch and parse.
The tool reads robots.txt and breaks it into structured rules.
Review the summary.
Confirm crawlability, spot missing sitemaps, and compare user-agent blocks.
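The fetch step above can be sketched as follows. The resolution logic (any page URL maps to `/robots.txt` at the domain root) matches how robots.txt works in general; the return shape is an assumption for illustration.

```javascript
// Derive the robots.txt URL from any page on the target domain:
// robots.txt always lives at the root, regardless of the page entered.
function robotsUrlFor(pageUrl) {
  return new URL("/robots.txt", pageUrl).href;
}

// Fetch it; assumes a fetch-capable runtime (browser, Node 18+, or a Worker).
// The return shape here is illustrative, not the tool's actual output.
async function fetchRobots(pageUrl) {
  const res = await fetch(robotsUrlFor(pageUrl));
  // Per RFC 9309, a 404 means crawling is unrestricted,
  // while server errors are commonly treated as "assume disallow".
  return { status: res.status, body: res.ok ? await res.text() : null };
}
```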
Use cases
Where teams use this tool
Technical SEO audits
Check crawl instructions and sitemap setup quickly.
Migration QA
Verify robots.txt rules after domain or platform changes.
Competitor research
See how competitors manage crawler access.
Developer-friendly
Simple Worker endpoint
/api/inspect?url=https://example.com

Each free tool runs as a small Cloudflare Worker with a public HTTP interface, which makes it easy to call from the browser, embed in SEO pages, or wire into automation flows.
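Calling the endpoint from code might look like this. The base URL and the response shape in the comment are assumptions; check the repo for the actual schema.

```javascript
// Build the Worker endpoint URL; URLSearchParams percent-encodes the
// target URL, which is equivalent to the unencoded form shown above.
function inspectEndpoint(target, base) {
  const endpoint = new URL("/api/inspect", base);
  endpoint.searchParams.set("url", target);
  return endpoint.href;
}

// Call the endpoint from any fetch-capable runtime. The response shape
// in the comment is an assumption, not a documented contract.
async function inspectRobots(target, base) {
  const res = await fetch(inspectEndpoint(target, base));
  return res.json(); // e.g. { groups: [...], sitemaps: [...], summary: "..." }
}
```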
Open source
Browse the repo, fork it, or deploy your own copy.