Robots.txt & Security.txt Checker
Analyze your robots.txt for SEO issues and security misconfigurations. Check if security.txt follows RFC 9116. Ensure crawlers see the right pages and security researchers can reach you.
What We Check
Robots.txt syntax validation
Disallow directive analysis
Sitemap reference check
Security.txt RFC 9116 validation
Sensitive path exposure detection
Crawl budget optimization tips
How It Works
1. Enter your website domain
2. We fetch /robots.txt and /.well-known/security.txt
3. Robots.txt directives are parsed and checked for common mistakes
4. Security.txt is validated against RFC 9116 requirements
5. You receive actionable recommendations for both files
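Steps 3 and 4 can be sketched with Python's standard library. The sample files and the rule set below are illustrative assumptions, not the checker's actual implementation:

```python
from urllib import robotparser

# Hypothetical sample inputs; the real tool fetches these over HTTP.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

SECURITY_TXT = """\
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:00.000Z
"""

# Step 3: parse robots.txt with the stdlib parser and probe two paths.
rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())
print(rp.can_fetch("*", "/admin/secret"))  # False: blocked for all agents
print(rp.can_fetch("*", "/blog/post"))     # True: no directive matches

# Step 4: check that the fields RFC 9116 makes mandatory are present.
REQUIRED = {"Contact", "Expires"}
present = {line.split(":", 1)[0].strip()
           for line in SECURITY_TXT.splitlines()
           if ":" in line and not line.lstrip().startswith("#")}
print(REQUIRED - present)  # set(): nothing missing
```

A real checker would add many more rules (syntax errors, wildcard misuse, expired `Expires` dates), but the shape is the same: parse, probe, report what is missing.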
Security Checks Included
This tool runs the following security checks on your website:
Frequently Asked Questions
What is robots.txt?
Robots.txt is a plain-text file at your domain root that tells search engine crawlers which pages they may crawl and which to skip. Note that it controls crawling, not indexing: a blocked page can still appear in search results if other sites link to it. A misconfigured robots.txt can hide important pages from Google or accidentally expose private URLs.
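For illustration (the paths and domain are placeholders), a small robots.txt might read:

```
User-agent: *
Disallow: /search
Allow: /search/about
Sitemap: https://example.com/sitemap.xml
```

The Sitemap line is optional but worth including; it is one of the references this checker looks for.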
What is security.txt?
Security.txt (RFC 9116) is a standard file at /.well-known/security.txt that tells security researchers how to report vulnerabilities. The Contact and Expires fields are mandatory; optional fields include Encryption, Acknowledgments, and Policy.
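A minimal compliant file (the contact address, URLs, and date here are placeholders) might look like:

```
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:00.000Z
Encryption: https://example.com/pgp-key.txt
Preferred-Languages: en
```

Only Contact and Expires are required by RFC 9116; Expires must be an Internet date/time stamp (RFC 3339 format) set in the future.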
Can robots.txt leak sensitive paths?
Yes. Disallow directives can inadvertently reveal the existence of admin panels, staging environments, or internal tools. Attackers often check robots.txt first to find hidden paths.
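As an illustration (these paths are hypothetical), directives like the following keep compliant crawlers out while advertising exactly where the interesting targets are:

```
Disallow: /admin/
Disallow: /staging/
Disallow: /internal-api/
```

For genuinely private paths, rely on authentication or a noindex response header rather than listing them in robots.txt.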
Do I need both robots.txt and security.txt?
Yes. Robots.txt is essential for SEO and for controlling crawler access. Security.txt is a security best practice standardized by the IETF in RFC 9116 that helps researchers report vulnerabilities responsibly.
Ready to Check Your Website?
Run a free security scan now and get instant results with actionable fix recommendations.