Runs fully in your browser.
Robots.txt Tester
Test crawl rules for a given user-agent and path.
Check crawl rules before they block important content. The robots.txt tester helps you compare user-agent directives against a path and quickly confirm whether a URL should be allowed.
What this tool helps you review
- Parses common `Allow` and `Disallow` rules and matches them against a chosen user-agent and path.
- Explains the matched directive so you can see why a URL is allowed or blocked.
- Works as a safe pre-deploy review tool for SEO teams and site owners.
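The rule-matching described above can be sketched in Python. This is a hypothetical illustration, not the tool's actual implementation: `parse_groups` and `is_allowed` are invented names, and the sketch uses plain prefix matching, omitting wildcards (`*`), end anchors (`$`), and other edge cases real crawlers handle. It follows the common convention that the longest matching pattern wins and `Allow` beats `Disallow` on ties.

```python
from typing import List, Tuple

def parse_groups(robots_txt: str) -> List[Tuple[List[str], List[Tuple[str, str]]]]:
    """Split a robots.txt body into (user-agents, rules) groups."""
    groups, agents, rules = [], [], []
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()    # drop comments and blanks
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if rules:                          # a new group starts
                groups.append((agents, rules))
                agents, rules = [], []
            agents.append(value.lower())
        elif field in ("allow", "disallow"):
            rules.append((field, value))
    if agents or rules:
        groups.append((agents, rules))
    return groups

def is_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Longest matching prefix wins; Allow beats Disallow on ties."""
    ua = user_agent.lower()
    groups = parse_groups(robots_txt)
    # Prefer a group that names this agent; otherwise fall back to '*'.
    rules = next((r for a, r in groups if any(t != "*" and t in ua for t in a)),
                 None)
    if rules is None:
        rules = next((r for a, r in groups if "*" in a), [])
    best = ("allow", "")                       # no matching rule => allowed
    for directive, pattern in rules:
        if pattern and path.startswith(pattern):
            if len(pattern) > len(best[1]) or (
                    len(pattern) == len(best[1]) and directive == "allow"):
                best = (directive, pattern)
    return best[0] == "allow"

# Example: a Disallow with a more specific Allow carved out of it.
robots = """\
User-agent: *
Disallow: /private/
Allow: /private/public/
"""
print(is_allowed(robots, "Googlebot", "/private/secret"))       # False
print(is_allowed(robots, "Googlebot", "/private/public/page"))  # True
```

The tie-breaking step is what lets a specific `Allow` carve an exception out of a broader `Disallow`, which is exactly the kind of interaction worth confirming before deploy.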
FAQ
Common questions before you publish
Why test robots.txt before launch?
A small typo can block large sections of a site from crawling. Testing before launch helps you catch mistakes before they reach production and affect discovery.
Does robots.txt stop indexing completely?
Not always. Robots.txt controls crawling, not indexing: a blocked URL can still appear in search results if other pages link to it. Use it alongside canonical tags, `noindex` directives, and sitemaps as part of a broader SEO setup.
What path format should I enter?
Use the path that appears after the domain, such as `/blog/post/`. This mirrors how robots.txt rules are evaluated by crawlers.
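If you are starting from a full URL rather than a path, the path component can be extracted with Python's standard `urllib.parse`. A minimal sketch (the URL here is a made-up example):

```python
from urllib.parse import urlsplit

url = "https://example.com/blog/post/?page=2"
path = urlsplit(url).path or "/"   # an empty path means the site root
print(path)  # /blog/post/
```

Note that the scheme, host, and query string are stripped, leaving only the part that robots.txt rules are matched against.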