Category: Content · Priority: Low · Free
Robots.txt Security Audit
A robots.txt file controls search engine crawling, but it can inadvertently reveal sensitive paths on your site.
Why It Matters
Disallowing paths in robots.txt tells everyone (including attackers) that those paths exist. Sensitive areas should be protected by authentication, not obscurity.
How We Check
We analyze your robots.txt for paths that might reveal admin panels, backup files, or other sensitive locations.
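The kind of analysis described above can be sketched as a simple scan of Disallow directives for suspicious keywords. This is an illustrative approximation, not the scanner's actual rule set; the keyword list and the `flag_sensitive_disallows` helper are assumptions for the example.

```python
import re

# Illustrative keywords that often hint at sensitive locations
# (not the scanner's real rule set).
SENSITIVE_KEYWORDS = ("admin", "backup", "config", "private", ".sql", ".zip")

def flag_sensitive_disallows(robots_txt: str) -> list[str]:
    """Return Disallow paths that appear to reveal sensitive locations."""
    flagged = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        match = re.match(r"(?i)disallow:\s*(\S+)", line)
        if match:
            path = match.group(1)
            if any(keyword in path.lower() for keyword in SENSITIVE_KEYWORDS):
                flagged.append(path)
    return flagged
```

For example, a robots.txt containing `Disallow: /admin/` and `Disallow: /backup.zip` would have both paths flagged, while `Disallow: /blog/` would pass.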
How to Fix
Protect sensitive areas with authentication rather than robots.txt. If you must list paths in robots.txt, avoid specific file names and use generic patterns instead.
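As a sketch of the difference, the paths below are hypothetical examples, not paths from any real site:

```
# Risky: advertises exact sensitive locations to anyone who reads the file
User-agent: *
Disallow: /admin/login.php
Disallow: /backups/db-dump.sql

# Safer: keep sensitive areas out of robots.txt entirely and require
# authentication for them; disallow only generic, non-revealing paths
User-agent: *
Disallow: /search
```

The safer version works because an area behind authentication returns 401/403 to crawlers and attackers alike, so it needs no robots.txt entry at all.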
Check Your Website Now
Run a free security scan to audit your robots.txt for information disclosure, along with 58+ other security vulnerabilities.
Scan Your Website Free