What is a Robots.txt File?
A `robots.txt` file is part of the Robots Exclusion Protocol (REP) and tells web crawlers which parts of your website they are, or are not, allowed to access. This **Free Online Robots.txt Tester** is a simulator that shows how search engines interpret the instructions in your `robots.txt` file.
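Crawlers look for the file at the root of the host it governs. For a site at `www.example.com` (a placeholder domain used only for illustration), the file is fetched from:

```
https://www.example.com/robots.txt
```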
Why Should You Test It?
Incorrect syntax or faulty logic in your robots.txt file can accidentally block search engines such as Google from crawling your entire website. This testing tool lets you validate your rules before applying them to your live server.
Robots.txt Basic Rules
- **User-agent:** Targets a specific bot (e.g., `Googlebot`).
- **Disallow:** The path of the folder or file you want to restrict.
- **Allow:** The path explicitly permitted (useful for re-allowing a path inside a disallowed folder).
- **Sitemap:** Tells bots where your XML sitemap is located.
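Combined, these directives make up a complete file. The sketch below is illustrative only; the user-agents, folder names, and sitemap URL are placeholders for your own values:

```
User-agent: Googlebot
Disallow: /private/
Allow: /private/whitepaper.pdf

User-agent: *
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```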
How to Simulate Rules
Paste your robots.txt content into the left panel, enter the URL path you want to check on the right, select the User-agent you want to simulate, and click the test button. Our tool follows Google's standard rule precedence, where the most specific (longest) matching rule wins, to determine whether the URL is **ALLOWED** or **BLOCKED**.
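To make that precedence concrete, here is a minimal Python sketch of the decision logic, assuming rules have already been parsed for the selected User-agent. It is simplified (no `*` or `$` wildcard handling) and the function name is ours, not the tool's actual code:

```python
from typing import List, Tuple

def is_allowed(url_path: str, rules: List[Tuple[str, str]]) -> bool:
    """rules is a list of ("allow" | "disallow", path-prefix) pairs."""
    best_len = -1
    best_verdict = True  # with no matching rule, the URL is allowed
    for kind, prefix in rules:
        if not url_path.startswith(prefix):
            continue
        if len(prefix) > best_len:
            # the longest matching path wins
            best_len = len(prefix)
            best_verdict = (kind == "allow")
        elif len(prefix) == best_len and kind == "allow":
            # on a tie, the less restrictive rule (Allow) wins
            best_verdict = True
    return best_verdict

# Example: /admin/ is blocked, but /admin/public/ stays crawlable.
rules = [("disallow", "/admin/"), ("allow", "/admin/public/")]
print(is_allowed("/admin/login", rules))        # False -> BLOCKED
print(is_allowed("/admin/public/page", rules))  # True  -> ALLOWED
```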