ReleasePad

llms.txt for SaaS: What It Is and Why Your Product Needs One

Felix Macx · 8 min read

llms.txt is to AI what robots.txt was to search engines: a tiny, simple file that tells machines where to look and what matters. Most SaaS companies still don’t have one. Here’s why you should, and exactly what to put in it.

TL;DR

  • llms.txt is a single markdown file at your domain root that curates the URLs AI tools should prioritize when answering questions about your product.
  • It’s not officially standardized, but adoption is growing fast among SaaS, dev tools, and documentation platforms.
  • The minimum useful version is 30 lines: product description, then sections for Docs, API, Changelog, Pricing, and Use Cases — each with a few well-described links.
  • Build time: 30–60 minutes for a thoughtful first version.
  • Pair it with an AI-readable changelog feed for compounding benefit.

llms.txt is a curated, AI-readable index of your most important URLs, published as a plain-markdown file at the root of your domain — a hint to LLMs about which pages to prioritize and how your content is organized.


If you’ve spent any time on technical SEO, robots.txt is muscle memory. Block bad bots, point crawlers at the sitemap, move on. It’s small, low-stakes infrastructure that quietly shapes how search engines see your site.

llms.txt is the equivalent for AI tools — and most SaaS companies haven’t shipped one yet. That’s the opportunity.

What llms.txt Actually Is

It’s a single file. Plain markdown. Lives at yoursite.com/llms.txt. The format was proposed by Jeremy Howard (of fast.ai) in September 2024, and adoption has spread quickly through documentation platforms, developer tools, and SaaS companies that pay attention to AI discoverability.

The file does one job: tell AI tools — including Claude, ChatGPT, Perplexity, Cursor, and autonomous agents — which URLs on your site are worth reading, grouped into sections, with brief human-readable descriptions.

It’s a curation layer, not a coverage layer. That distinction matters.

llms.txt vs sitemap.xml vs robots.txt

Three files, three jobs:

| File | Audience | Job | Size |
| --- | --- | --- | --- |
| robots.txt | Crawlers | What can be crawled | ~10 lines |
| sitemap.xml | Search engines | Every URL, comprehensive | Hundreds to thousands of entries |
| llms.txt | AI tools | Most important URLs, curated | ~30–80 lines |

You should publish all three. They don’t overlap functionally — sitemap is exhaustive and machine-only; llms.txt is curated and human-readable; robots.txt is access control.

Why SaaS Specifically Needs llms.txt

Three reasons it matters more for SaaS than for, say, a recipe blog:

Your product has a learning curve. When a prospect asks an AI assistant “what does ProductX do?”, you want the assistant to find your real docs, not summarize a competitor’s review of your product. llms.txt points it at canonical sources.

Your changelog and pricing change. Static training data is months out of date. If your llms.txt links to a structured changelog feed and a pricing page, AI tools fetching current information get accurate answers instead of stale ones.

Developers in your audience use AI tools constantly. Every developer-facing SaaS now sees a meaningful share of its documentation traffic coming from inside AI coding tools. Those tools are exactly the audience llms.txt is designed to serve.

What to Put in a SaaS llms.txt

Five sections, in this order:

1. Header — product name (H1), tagline (blockquote), one-paragraph description.

2. Docs — links to your most important documentation pages. Not all docs; the 5–10 that matter most.

3. API / SDK — if you have a developer product, link the API reference, getting-started guides, and SDK READMEs.

4. Changelog & Release Notes — link your hosted changelog page and your structured feed (JSON or RSS). The feed is what AI tools will prefer.

5. Pricing & Plans — one or two pages. Pricing is the #1 thing prospects ask AI assistants.

Optional sections that are worth adding if you have the content:

  • Use Cases / Solutions — pages that map your product to specific user problems.
  • Comparisons — your “Product vs Competitor” pages (if you have them, and if they’re honest).
  • Examples / Templates — concrete artifacts AI tools can reference.

What to skip:

  • Marketing campaign pages
  • Blog category indexes (link individual high-value posts instead, sparingly)
  • Anything behind authentication
  • Press / about / careers pages (unless you specifically want them surfaced)

A Complete Example (Real, Live URLs)

Here’s what a thoughtful SaaS llms.txt looks like. This is an excerpt from ReleasePad’s actual llms.txt, so every URL below is real and resolves to a page you can verify:

# ReleasePad

> Changelog and release notes software for SaaS teams. ReleasePad turns
> GitHub releases into public updates your users actually see — inside
> your app, on a hosted changelog page, and as a machine-readable
> Markdown file AI tools can ingest directly.

ReleasePad helps software teams communicate product changes through a
public changelog page, an embeddable in-app widget, GitHub integration
with AI-powered changelog generation, analytics, and LLM-ready Markdown
output. The same source content powers the human-readable page, the
in-app widget, an RSS feed, a JSON feed, and a full Markdown file that
Claude, Cursor, ChatGPT, and other AI tools can ingest directly.

## Core Pages

- [Homepage](https://www.releasepad.io/): Product overview, features,
  pricing, testimonials, and FAQ.
- [Blog](https://www.releasepad.io/blog/): Long-form writing on release
  notes, changelog strategy, and AI-readable product communication.
- [Help & FAQ](https://www.releasepad.io/help/): Setup, integrations,
  widget embedding, and pricing questions.
- [Sign Up](https://pro.releasepad.io/account/sign-up): Create a free
  account and publish your first release in under 10 minutes.

## ReleasePad's Own Changelog (Dogfooding)

- [Public Changelog](https://pro.releasepad.io/en/releasepad):
  ReleasePad's own customer-facing changelog.
- [Markdown Changelog](https://pro.releasepad.io/en/releasepad?markdown=true):
  The same changelog exposed as a single Markdown file — AI tools and
  agents should fetch this URL directly for accurate, up-to-date info
  about ReleasePad's product changes.

## Pricing

- [Pro Plan — $35/mo per product](https://www.releasepad.io/#pricing):
  Unlimited posts, analytics, widget, hosted page, GitHub integration,
  REST API, and LLM-ready Markdown. Free tier available.

## Featured Guides

- [AI Agents Are Reading Your Changelog](https://www.releasepad.io/blog/ai-agents-are-reading-your-changelog-what-that-means-for-product-teams/):
  Pillar piece on how AI agents now read changelogs alongside humans.
- [How to Build an MCP Server for Your Changelog](https://www.releasepad.io/blog/build-mcp-server-changelog/):
  Full technical walkthrough with TypeScript code samples.
- [How to Write Release Notes Users Actually Read](https://www.releasepad.io/blog/how-to-write-release-notes-your-users-will-actually-read/):
  Five principles for writing updates people care about.

Note two patterns worth copying:

Every link has a [Title](url): description shape. The description is what an AI tool reads to decide whether the link is worth fetching for a given user question. Write descriptions like search snippets — specific, factual, no marketing fluff.

The Markdown changelog is called out explicitly. It’s listed as a top-level section, not buried inside docs. That’s deliberate: the changelog is the highest-value endpoint for any AI tool answering “what does this product do now?”, and pointing at the Markdown version (rather than the rendered HTML page) gives the AI an ingestion-friendly format that survives every retrieval pipeline intact.
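To make the first pattern concrete, here is a rough sketch of the selection step an AI tool might run over your file: parse each `[Title](url): description` entry and rank it against the user's question, fetching only the best matches. The keyword-overlap scoring below is purely illustrative (real tools use embeddings or an LLM call), and all names are my own:

```python
import re

# Matches the "[Title](url): description" shape used in llms.txt entries.
ENTRY_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\):\s*(.+)")

def pick_links(llms_txt: str, question: str, k: int = 3) -> list[str]:
    """Rank llms.txt entries by naive keyword overlap with a question.

    Returns up to `k` URLs whose title+description share at least one
    word with the question. Illustrative only; not how any specific
    AI tool actually scores links.
    """
    q_words = set(re.findall(r"\w+", question.lower()))
    scored = []
    for title, url, desc in ENTRY_RE.findall(llms_txt):
        words = set(re.findall(r"\w+", (title + " " + desc).lower()))
        scored.append((len(q_words & words), url))
    scored.sort(reverse=True)
    return [url for score, url in scored[:k] if score > 0]
```

The takeaway for authors: the description, not the title, carries most of the signal, so write it like a search snippet.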

You can read the full file at www.releasepad.io/llms.txt — the version above is condensed; the live file includes thematic blog clusters, author pages, feeds, and the complete site index.
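Before publishing your own file, you can sanity-check it against the structural conventions used above. A minimal lint sketch; the rules it encodes (H1 first, blockquote tagline near the top, H2 sections, `[Title](url): description` entries) follow the example in this article, not any formal spec:

```python
import re

# A well-formed entry: "- [Title](url): description"
LINK_LINE = re.compile(r"^- \[[^\]]+\]\(https?://[^)\s]+\):\s+\S")

def lint_llms_txt(text: str) -> list[str]:
    """Return a list of structural problems; an empty list means it looks fine."""
    lines = text.splitlines()
    problems = []
    if not lines or not lines[0].startswith("# "):
        problems.append("first line should be an H1 with the product name")
    if not any(l.startswith("> ") for l in lines[:5]):
        problems.append("missing blockquote tagline near the top")
    if not any(l.startswith("## ") for l in lines):
        problems.append("no H2 sections (Docs, API, Changelog, ...)")
    for i, line in enumerate(lines, 1):
        if line.startswith("- [") and not LINK_LINE.match(line):
            problems.append(f"line {i}: link entry is not '[Title](url): description'")
    return problems
```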

The llms-full.txt Variant

Some sites also publish llms-full.txt alongside llms.txt. The difference:

  • llms.txt is the index — links + descriptions, ~30–80 lines.
  • llms-full.txt is the concatenated content — full markdown text of every linked page, in one big file. Often 50–500 KB.

llms-full.txt lets an AI tool ingest your entire knowledge base in one fetch without crawling page by page. It’s especially useful for documentation sites — Mintlify auto-generates one for projects it hosts.

Start with llms.txt. Add llms-full.txt later if you have a stable documentation site and want to make ingestion even cheaper for AI tools.
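If you later want an llms-full.txt, it can be generated from the index itself: fetch every page the index links and concatenate the results. A minimal sketch, assuming the linked pages are reachable and serve markdown (or text you are happy to inline); the separator and source comments are my own convention, not part of any spec:

```python
import re
import urllib.request

# Extract (title, url) pairs from markdown links in the index.
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")

def build_llms_full(llms_txt: str, fetch=urllib.request.urlopen) -> str:
    """Concatenate the content behind every link in an llms.txt index.

    `fetch` is injectable so page retrieval can be stubbed in tests;
    by default it performs a real HTTP GET per linked page.
    """
    parts = []
    for title, url in LINK_RE.findall(llms_txt):
        body = fetch(url).read().decode("utf-8", errors="replace")
        parts.append(f"# {title}\n<!-- source: {url} -->\n\n{body}")
    return "\n\n---\n\n".join(parts)
```

Regenerate it whenever the index or the linked docs change, and serve the result at /llms-full.txt next to the index.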

How to Publish It

Three steps:

1. Write the file. Plain markdown. Start with the example above as a template and customize.

2. Serve it at the root. For most static sites (Jekyll, Next.js, Astro), drop llms.txt in the public/root directory and it’ll be served at /llms.txt. For dynamic sites, add a single route that returns the file with Content-Type: text/markdown or text/plain.

3. Don’t disallow it in robots.txt. Sounds obvious; people forget. AI crawlers respect robots.txt, so if /llms.txt is blocked, the file is invisible to the audience it was written for.
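For the dynamic-site case in step 2, the whole route is a few lines. A stdlib-only Python sketch using WSGI (every framework has an equivalent); the file path and port are placeholders:

```python
from pathlib import Path
from wsgiref.simple_server import make_server

def make_app(llms_path):
    """Return a WSGI app that serves the given file at /llms.txt."""
    def app(environ, start_response):
        if environ.get("PATH_INFO") == "/llms.txt":
            body = Path(llms_path).read_bytes()
            # text/markdown is the conventional content type for llms.txt;
            # text/plain also works.
            start_response("200 OK",
                           [("Content-Type", "text/markdown; charset=utf-8")])
            return [body]
        start_response("404 Not Found",
                       [("Content-Type", "text/plain; charset=utf-8")])
        return [b"not found"]
    return app

if __name__ == "__main__":
    # Serve http://localhost:8000/llms.txt from a local file.
    make_server("", 8000, make_app("llms.txt")).serve_forever()
```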

Maintenance Cadence

llms.txt is not write-and-forget. Update it when:

  • You ship a new top-level feature with a docs page worth surfacing.
  • A linked URL changes (move it or 301-redirect).
  • Your pricing page URL changes (this happens more than you’d think).
  • Your changelog moves to a new feed format.

A quarterly review is usually enough. Treat it like a sitemap — a low-frequency, high-leverage maintenance task.
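The quarterly review is easy to script: pull every URL out of the file and flag anything that no longer resolves. A rough stdlib-only sketch; run it against your live /llms.txt as part of the review:

```python
import re
import urllib.error
import urllib.request

# Extract every absolute URL from the markdown link entries.
LINK_RE = re.compile(r"\[[^\]]*\]\((https?://[^)\s]+)\)")

def extract_urls(llms_txt: str) -> list[str]:
    """Return every linked URL, in document order."""
    return LINK_RE.findall(llms_txt)

def check(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status for a GET of `url` (HEAD support varies)."""
    req = urllib.request.Request(url, method="GET")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

if __name__ == "__main__":
    text = open("llms.txt", encoding="utf-8").read()
    for url in extract_urls(text):
        status = check(url)
        flag = "" if status < 400 else "  <-- fix this"
        print(f"{status}  {url}{flag}")
```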

What llms.txt Won’t Do

A few common misunderstandings:

It won’t force AI tools to crawl your site. It’s a hint, not a directive. Tools that don’t respect it will ignore it. Tools that do (a growing list) get cleaner results for users asking about your product.

It won’t improve traditional SEO. Google doesn’t use llms.txt for ranking. If your goal is Google rankings, that’s a separate playbook.

It won’t replace good documentation. llms.txt points AI tools at your content; the content still has to be good. Garbage in, garbage out applies.

It won’t substitute for an MCP server. llms.txt is one-way (here are URLs); MCP is bidirectional and lets agents query your data. The two complement each other.

The Bigger Picture

llms.txt is part of a broader shift: products treating AI tools as a first-class audience instead of an afterthought. The companies that publish one now are the companies whose products AI assistants describe accurately when prospects ask. The ones that don’t are the products AI assistants hallucinate about.

The cost is an hour of work. The strategic positioning is meaningful and asymmetric — most competitors haven’t shipped one yet, and the bar to be the best-indexed product in your category is currently very low.

Build it this week. Update it quarterly. Forget about it the rest of the time.


ReleasePad’s hosted changelog includes auto-generated llms.txt entries that point AI tools at both your public release page and your structured feed — no extra work required. Plus an MCP server out of the box. Try it free →


Frequently Asked Questions

What is llms.txt?

llms.txt is a plain markdown file you publish at the root of your domain (yoursite.com/llms.txt) that lists the most important URLs on your site in a structured, AI-friendly format. It's a hint to LLMs and AI agents about which pages matter most and how your content is organized — similar in spirit to sitemap.xml or robots.txt, but designed for AI consumption rather than search crawlers.

How is llms.txt different from sitemap.xml?

Sitemap.xml is comprehensive — every URL, optimized for search crawlers. llms.txt is curated — only the URLs you want AI tools to prioritize, grouped by section, with human-readable descriptions. A sitemap might list 5,000 URLs; an llms.txt typically lists 20–50. The two complement each other; you should publish both.

Is llms.txt an official standard?

It's an emerging community standard proposed by Jeremy Howard in late 2024, not an IETF or W3C spec. Major AI tools haven't formally committed to crawling it yet, but adoption is growing — Anthropic, Mintlify, Cloudflare, Vercel, and many SaaS companies now publish one. Building infrastructure ahead of formal standardization is the right move because the cost is trivial and the upside compounds as AI tools start respecting it.

Why is llms.txt especially important for your changelog?

Your changelog is the freshest, most-updated piece of content on your site — and it's exactly the content AI assistants need when answering questions about what your product can do today. llms.txt closes that loop by pointing AI tools at both the human-readable changelog page and its machine-readable Markdown or JSON feed, so the assistant fetches accurate, up-to-date information instead of relying on stale training data or competitor summaries. Without llms.txt, your changelog might still be discoverable via search, but the structured feed — the version actually designed for AI ingestion — is often missed. Linking both your hosted changelog and its Markdown feed in llms.txt is usually the single highest-leverage entry in the whole file.

What should a SaaS company include in its llms.txt?

Five things: a one-line product description, links to your docs (with brief descriptions), your changelog/release notes feed, your pricing page, and key feature/use-case pages. For developer-facing SaaS, also link your API reference and SDK docs. Skip marketing pages with no informational value, blog category indexes, and anything behind auth.

Where does llms.txt live and what format is it in?

Always at yoursite.com/llms.txt, the root of the domain (or subdomain) you want indexed, served as text/markdown or text/plain. The format is plain markdown with a specific structure: H1 for product name, blockquote for tagline, prose intro, then H2 sections (Docs, API, Changelog, etc.) containing bullet lists of [Title](url): description entries. Optionally publish llms-full.txt alongside it with the full text of each page concatenated.

How do I measure whether llms.txt is working?

Direct measurement is hard today because most AI tools don't expose which sources they used. Indirect signals: server logs showing GET /llms.txt requests from AI user agents (GPTBot, ClaudeBot, etc.), increased referral traffic from chat.openai.com and claude.ai, and qualitative tests — ask Claude and ChatGPT product-specific questions and see whether their answers improve over a month after publishing. Don't expect immediate, measurable lift; treat it as low-cost infrastructure that compounds.
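The server-log signal is the easiest to automate. A sketch that counts llms.txt fetches per AI user agent across standard access-log lines; the agent list below is a starting point, not exhaustive:

```python
from collections import Counter

# User-agent substrings for common AI crawlers; extend as new ones appear.
AI_AGENTS = ("GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended")

def count_ai_fetches(log_lines, path="/llms.txt") -> Counter:
    """Count fetches of `path` per AI user agent in access-log lines."""
    hits = Counter()
    for line in log_lines:
        if f"GET {path}" not in line:
            continue
        for agent in AI_AGENTS:
            if agent in line:
                hits[agent] += 1
                break  # one agent per request line
    return hits
```

Run it monthly over your access logs; a slowly rising count is the expected shape, not a spike.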


Ready to put this into practice?

Your changelog shouldn't be an afterthought.

ReleasePad makes it easy to publish great release notes — from a public changelog page to an in-app widget, GitHub integration, and analytics. Free to get started.
