
# The robots.txt for the Generative AI Era

In 2025, websites are increasingly read by machines as well as humans. **llms.txt** is a proposed standard for helping AI models understand your business without the noise of traditional web structure.

## What is llms.txt?

The **llms.txt** file is a proposal for a Markdown-based handshake between website owners and Large Language Models (LLMs). While traditional search engines rely on XML sitemaps and complex HTML parsing, AI agents prefer high-density, structured text that fits within their context window.
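Because the format is plain Markdown with a small set of conventions (an H1 title, a blockquote summary, H2 sections of links), it is easy to parse. The sketch below is a hypothetical helper, not an official library; the section names and return shape are assumptions for illustration.

```python
import re

def parse_llms_txt(text: str) -> dict:
    """Parse an llms.txt body into its conventional parts:
    an H1 title, an optional blockquote summary, and H2
    sections containing markdown link lists."""
    title = None
    summary = None
    sections = {}
    current = None
    for line in text.splitlines():
        if line.startswith("# ") and title is None:
            title = line[2:].strip()
        elif line.startswith("> ") and summary is None:
            summary = line[2:].strip()
        elif line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        else:
            # Match lines like: - [Title](/path): description
            m = re.match(r"-\s*\[([^\]]+)\]\(([^)]+)\)", line)
            if m and current is not None:
                sections[current].append((m.group(1), m.group(2)))
    return {"title": title, "summary": summary, "sections": sections}

doc = """# Project Name
> A one-sentence description of the site's purpose.

## Core Documentation
- [Product Overview](/product): Detailed feature descriptions.
- [API Reference](/docs/api): Technical integration details.
"""
parsed = parse_llms_txt(doc)
```

An agent could run a parser like this over `GET /llms.txt` before deciding which linked pages to fetch in full.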

## The Difference: robots.txt vs. sitemap.xml vs. llms.txt

| Feature | robots.txt | sitemap.xml | llms.txt |
| --- | --- | --- | --- |
| **Audience** | Search Bots | Search Indexers | LLMs & Assistants |
| **Goal** | Restriction | Inventory | Context & Meaning |
| **Format** | Plain Text | XML | Markdown |
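The density difference in the table is concrete: the same link inventory rendered as sitemap XML and as llms.txt Markdown shows the Markdown form carrying more information (titles as well as URLs) in fewer characters. The URLs below are made up purely for the comparison.

```python
# Render the same three-link inventory both ways and compare sizes.
entries = [
    ("Product Overview", "https://example.com/product"),
    ("API Reference", "https://example.com/docs/api"),
    ("Pricing", "https://example.com/pricing"),
]

# sitemap.xml: URLs only, wrapped in XML boilerplate.
sitemap = '<?xml version="1.0" encoding="UTF-8"?>\n'
sitemap += '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
for _, url in entries:
    sitemap += f"  <url><loc>{url}</loc></url>\n"
sitemap += "</urlset>\n"

# llms.txt section: human-readable titles plus URLs, minimal syntax.
llms = "## Core Documentation\n"
for title, url in entries:
    llms += f"- [{title}]({url})\n"
```

Fewer bytes per link matters because an agent must fit the file inside its context window alongside the pages it actually fetches.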

## Example Implementation

The file lives at the site root, at `/llms.txt`:

```markdown
# Project Name
> A one-sentence description of the site's purpose.

A more detailed summary of the business identity and core mission goes here.

## Core Documentation
- [Product Overview](/product): Detailed feature descriptions.
- [API Reference](/docs/api): Technical integration details.
```
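A file with this structure is simple enough to generate from existing site metadata. The function below is a minimal sketch under assumed inputs (a name, a summary, and per-section link entries), not an official tool.

```python
def render_llms_txt(name, summary, sections):
    """Build an llms.txt body: H1 title, blockquote summary,
    then one H2 section per group of (title, path, description) links."""
    lines = [f"# {name}", f"> {summary}", ""]
    for heading, links in sections.items():
        lines.append(f"## {heading}")
        for title, path, desc in links:
            lines.append(f"- [{title}]({path}): {desc}")
        lines.append("")
    return "\n".join(lines)

body = render_llms_txt(
    "Project Name",
    "A one-sentence description of the site's purpose.",
    {
        "Core Documentation": [
            ("Product Overview", "/product", "Detailed feature descriptions."),
            ("API Reference", "/docs/api", "Technical integration details."),
        ]
    },
)
```

Regenerating the file as part of a site build keeps it in sync with the documentation it links to.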