1 What Is llms.txt?
llms.txt is a text file in the root directory of your website (alongside robots.txt and sitemap.xml) that is specifically created for Large Language Models (LLMs) like ChatGPT, Gemini, and Claude. It contains a machine-readable summary of your company, services, and products.
While robots.txt controls what can be crawled, llms.txt tells AI systems what your company is about and where to find the most important information.
Think of llms.txt as your elevator pitch for AI systems: compact, structured, and with links to the details.
2 Structure of an llms.txt File
The file follows a simple Markdown-like structure:
```
# Company Name

> Short description of the company in one sentence.

## About Us

More detailed description...

## Services / Products

### Service 1
Description of the service.
URL: https://www.example.com/service-1

### Service 2
Description of the service.
URL: https://www.example.com/service-2

## Contact

- Phone: +49 ...
- Email: info@example.com
- Website: https://www.example.com
```
3 Why llms.txt Is Important for E-Commerce
Online stores typically have thousands of pages: product pages, categories, filter pages, CMS pages. For an AI system, it's difficult to extract the relevant information from this mass. llms.txt solves this problem:
- Gives AI systems a structured overview of your product range
- Links to the most important pages (categories, top products, about us)
- Communicates your expertise and differentiation
- Accelerates AI indexing of your store
- Complements Schema.org with company-level information
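Because store catalogs change frequently, it can make sense to generate the file from your product data rather than maintain it by hand. The sketch below is a minimal illustration of that idea; the shop name, catalog structure, and URLs are invented for the example.

```python
# Hypothetical sketch: building an llms.txt from a small product catalog.
# The catalog fields (name, desc, url) are illustrative assumptions.
catalog = [
    {"name": "Running Shoes", "desc": "Lightweight trail runners.",
     "url": "https://www.example.com/running-shoes"},
    {"name": "Hiking Boots", "desc": "Waterproof boots for alpine terrain.",
     "url": "https://www.example.com/hiking-boots"},
]

def build_llms_txt(company, tagline, catalog):
    # Follows the Markdown-like structure from section 2:
    # H1 company name, blockquote tagline, one H3 per product with its URL.
    lines = [f"# {company}", f"> {tagline}", "", "## Services / Products"]
    for item in catalog:
        lines += [f"### {item['name']}", item["desc"], f"URL: {item['url']}", ""]
    return "\n".join(lines)

print(build_llms_txt("Example Outdoor Store",
                     "Specialist retailer for trail running and hiking gear.",
                     catalog))
```

Regenerating the file as part of a catalog export keeps it current without manual editing.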
4 Best Practices for E-Commerce llms.txt
Keep it brief
Maximum 1-2 pages. AI systems prefer compact information.
Most important first
Put core services and USPs in the first paragraph, like the lead of a press article.
Link URLs
Each section should link to the corresponding page.
Update regularly
Incorporate new products, services, or changes promptly.
Mention certifications
Partnerships and certificates strengthen trustworthiness.
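The first three best practices lend themselves to an automated sanity check. The snippet below is a hedged sketch of such a check; the character limit and the specific rules are assumptions, not part of any standard.

```python
import re

def check_llms_txt(text, max_chars=4000):
    """Hypothetical sanity checks mirroring the best practices above.
    The 4000-character default is an assumed proxy for '1-2 pages'."""
    issues = []
    if len(text) > max_chars:  # "Keep it brief"
        issues.append(f"file exceeds {max_chars} characters")
    if not text.lstrip().startswith("# "):  # "Most important first"
        issues.append("missing top-level '# Company Name' heading")
    if not re.search(r"https?://\S+", text):  # "Link URLs"
        issues.append("no URLs found; each section should link to a page")
    return issues
```

Running this in a CI pipeline or deployment script catches a stale or malformed file before it goes live.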
5 llms.txt Together with robots.txt and Schema.org
The three files/standards together form the technical foundation for AI visibility:
- robots.txt controls access: which AI crawlers may crawl which pages?
- llms.txt provides context: who are you, what do you offer, where are the details?
- Schema.org provides the data: structured product information at the page level.
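For the access-control part, a robots.txt can address AI crawlers by user agent. The excerpt below is illustrative; GPTBot (OpenAI) and ClaudeBot (Anthropic) are commonly cited AI user agents, and the blocked path is an invented example.

```
# Illustrative robots.txt excerpt -- paths and policy are examples only
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Keep crawlers out of low-value filter/search pages, for example:
User-agent: *
Disallow: /search
```

Together, the three layers answer who may read, what the company is, and what each product is.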