---
title: "Robots.txt"
description: "Robots.txt is a text file placed at the root of a website that tells web crawlers which URLs they are allowed or disallowed from crawling, following the Robots Exclusion Protocol."
category: "SEO Crawling & Indexation"
date: "2026-03-05"
url: "https://getbeast.io/glossary/robots-txt/"
type: "glossary"
---

# Robots.txt

**Category:** SEO Crawling & Indexation | **Updated:** 2026-03-05

Robots.txt is a text file placed at the root of a website that tells web crawlers which URLs they are allowed or disallowed from crawling, following the Robots Exclusion Protocol.

---

## What Is Robots.txt?
Robots.txt is a plain-text file served at `https://yoursite.com/robots.txt` that provides crawl directives to web robots. Under the Robots Exclusion Protocol (REP, standardized as RFC 9309), rules are grouped by `User-agent` and expressed as `Allow` and `Disallow` path patterns. The widely supported `Sitemap` directive and the nonstandard `Crawl-delay` directive (honored by some crawlers but ignored by Googlebot) are common extensions.
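
A minimal example shows the structure (the paths and sitemap URL are placeholders):

```txt
# Rules for all crawlers
User-agent: *
Disallow: /admin/          # keep crawlers out of the admin area
Disallow: /search          # block internal search result pages
Allow: /admin/public/      # exception inside a blocked directory

Sitemap: https://yoursite.com/sitemap.xml
```

Each `User-agent` line opens a group, and the `Allow`/`Disallow` lines beneath it apply only to that group; a blank line separates groups.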

## Why Robots.txt Matters for SEO
Robots.txt is your **first line of crawl budget defense**. It prevents search engines from wasting time on admin pages, internal search results, and non-indexable content. It is also the primary mechanism for blocking AI crawlers like GPTBot and ClaudeBot from scraping your content.
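
To block AI crawlers, give each bot its own group keyed to its user-agent token (GPTBot is OpenAI's crawler, ClaudeBot is Anthropic's):

```txt
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
```

`Disallow: /` blocks the entire site for that agent; all other crawlers are unaffected unless you add a `User-agent: *` group.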

## How to Use Robots.txt Effectively
Create rules that block low-value URL patterns (faceted navigation, internal search results, session parameters) while leaving all important content crawlable. Never block the CSS, JavaScript, or image files Googlebot needs to render your pages. Verify your rules with the robots.txt report in Google Search Console. Remember that robots.txt is advisory and does not prevent indexing: a disallowed URL can still be indexed if other sites link to it, and a `noindex` tag only works if the page remains crawlable.
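
You can also sanity-check rules locally with Python's standard-library parser. A quick sketch, using illustrative rules and example URLs:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block /admin/ for everyone, block GPTBot entirely
rules = """\
User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/blog/post"))       # True
print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))  # False
```

Note that `urllib.robotparser` checks rules in file order rather than using Google's longest-match precedence, so results can differ for overlapping `Allow`/`Disallow` patterns.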

---

## Related Terms

- [Crawl Budget](/glossary/crawl-budget/)
- [Noindex](/glossary/noindex/)
- [Meta Robots](/glossary/meta-robots/)
- [XML Sitemap](/glossary/xml-sitemap/)
- [Crawl Trap](/glossary/crawl-trap/)
- [Web Crawler](/glossary/web-crawler/)

## Further Reading

- [The Ultimate robots.txt Guide](/blog/robots-txt/)

---

*Part of the [GetBeast SEO Glossary](/glossary/). Visit [GetBeast.io](https://getbeast.io) for professional SEO and log analysis tools.*
