A lot of small business owners are getting pushed toward the same shiny-object task right now: adding an llms.txt file.
If you listen to enough SEO chatter, you’d think this one file is the difference between being invisible and showing up in AI search.
That’s not what Google’s own documentation says.
Google’s AI features documentation is unusually clear here: you don’t need to create new machine-readable files, AI text files, or special markup to appear in AI Overviews or AI Mode. Google also says a page just needs to be eligible to appear in regular Search with a snippet, and that traffic from AI features is reported inside the normal Web search type in Search Console’s Performance report.
So no, most small business websites do not need to drop everything and ship llms.txt this week.
That doesn’t mean the file is fake. It means it’s being oversold.
The official llms.txt proposal describes the file as a way to help language models use website content at inference time. The original Answer.AI project says the same thing. Proposal is the key word. That’s very different from a broadly adopted web standard that Google has told site owners to implement.
For a small business, the bigger risk is obvious. You spend time debating one speculative file while the basics that actually affect crawlability, understanding, and conversion are still weak.
Here’s the practical version of the argument.
What llms.txt is, and what it isn’t
At its core, llms.txt is a text file that gives AI systems a cleaner guide to the parts of your site you want them to use. The proposal compares it to a curated map of important content, especially for documentation-heavy sites, software docs, and structured knowledge bases, according to llmstxt.org and the project repository.
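To make that concrete, here is a minimal sketch of what an llms.txt file looks like under the proposal: plain Markdown served at /llms.txt, with an H1 title, a short blockquote summary, and curated sections of links. Every business name and URL below is hypothetical.

```markdown
# Example Plumbing Co.

> Residential plumbing services in Springfield, with service pages, guides, and FAQs.

## Services

- [Drain cleaning](https://www.example.com/services/drain-cleaning/): scope, pricing, service area
- [Water heater repair](https://www.example.com/services/water-heaters/): common issues and fixes

## Company

- [Contact](https://www.example.com/contact/): phone, hours, service area
```

Notice how little it does for a 20-page site: it restates links your navigation and sitemap should already expose.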
That can be useful in some situations.
If you run a developer platform, publish technical documentation, or maintain a huge resource library, I can see why you’d experiment with it.
But most small business sites are not documentation portals. They’re usually 20 to 200 pages, with a homepage, service pages, location pages, a few case studies, and a blog.
Those sites rarely have an AI-parsing problem first. They usually have a clarity problem, a crawlability problem, or a conversion problem.
A local law firm doesn’t lose leads because it forgot llms.txt. It loses leads because its service pages are thin, its contact paths are weak, or its trust signals are buried.
A home service company doesn’t disappear from search because it missed one experimental file. It disappears because its Google Business Profile is outdated, its title tags are generic, or its local landing pages don’t explain what it actually does.
What Google says to focus on instead
This is where the hype dies down fast.
In Google’s “AI features and your website” documentation, Google says there are no additional requirements to appear in AI Overviews and AI Mode beyond the usual requirements for appearing in Search. The same page says standard SEO fundamentals still matter, including allowing crawling, making content discoverable through internal links, and making sure the content can be indexed and shown with a snippet.
That lines up with the technical basics Google has been repeating for years.
Google’s robots.txt guide says robots.txt is used to manage crawler traffic. Google’s sitemap documentation says a sitemap helps search engines discover important URLs, especially when pages might not be easy to find otherwise. Google’s structured data introduction explains that structured data helps Google understand page content.
That’s the actual checklist.
Not mystery files. Not trend-chasing.
If your small business site is blocking key assets, hiding important pages deep in navigation, shipping weak internal links, or missing useful business details in structured data, fix those first.
The five things I’d fix before touching llms.txt
1. Make sure Google can crawl and index the pages that matter
This sounds basic because it is basic, and it’s still where a lot of sites leak performance.
Check your robots.txt file. Check that your important pages are in the sitemap. Check that you’re not accidentally blocking CSS, JavaScript, or sections of the site Google needs to render. Google’s robots.txt documentation and sitemap guide spell out how these files should work.
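If you want a quick self-check without waiting on a crawler, Python’s standard library can parse a robots.txt and answer “can this page be fetched?” The file contents, user agent, and URLs below are all made-up examples; the point is to confirm your money pages are crawlable while only the admin area is blocked.

```python
from urllib import robotparser

# Hypothetical robots.txt for a small business site (example.com is a placeholder).
# The Allow line comes before the Disallow so the AJAX endpoint stays fetchable.
robots_txt = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/

Sitemap: https://www.example.com/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Confirm the pages that matter are crawlable and the admin area is not.
for path in ("/services/drain-cleaning/", "/contact/", "/wp-admin/settings.php"):
    allowed = parser.can_fetch("Googlebot", f"https://www.example.com{path}")
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```

Run this against your real robots.txt before launch day, not after a traffic drop.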
If you only have time for one technical pass this month, spend it here.
A page that isn’t crawled or indexed is not going to win in traditional search or AI search.
2. Tighten your internal linking
Google explicitly calls out internal linking in its AI features guidance. That’s important because internal links do two jobs at once: they help search engines discover pages, and they help them understand which pages on your site carry the most context.
For a small business, this doesn’t need to become a giant SEO project.
It can be simple:
- Link your homepage to your core service pages.
- Link service pages to related case studies and FAQs.
- Link blog posts back to the service pages that should convert readers.
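In HTML terms, this is nothing fancier than descriptive anchor text pointing at the page you want to rank. A hypothetical sketch from the end of a blog post (all URLs invented):

```html
<!-- Blog post footer: point readers (and crawlers) back to the converting service page -->
<p>
  If your drains back up every few months, our
  <a href="/services/drain-cleaning/">drain cleaning service</a>
  covers the whole Springfield area. Questions first? Start with the
  <a href="/services/drain-cleaning/#faq">drain cleaning FAQ</a>.
</p>
```

The anchor text tells both the reader and the search engine what the destination page is about, which is why “click here” links waste the opportunity.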
If your website has useful content but nothing points to it clearly, llms.txt won’t rescue it.
3. Add better structured data, especially business details
A lot of business sites still treat schema markup like optional decoration.
It isn’t magic, but it does help search engines interpret what your business is, where you’re located, and what each page represents. Google’s LocalBusiness structured data documentation says you need the required properties to be eligible for local business rich results, and that recommended properties can improve the user experience. Google’s structured data intro makes the broader point: this markup helps Google understand your content.
For a small business, the useful fields are usually the boring ones:
- business name
- address
- phone number
- opening hours
- website URL
- service area or location details
That’s the kind of information search engines can actually use.
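Here’s what those boring fields look like as JSON-LD, the format Google’s structured data docs recommend embedding in a `<script>` tag. Every business detail below is invented; swap in your real information and validate the result with Google’s Rich Results Test before shipping.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Example Plumbing Co.",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "openingHoursSpecification": [{
    "@type": "OpeningHoursSpecification",
    "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
    "opens": "08:00",
    "closes": "17:00"
  }],
  "areaServed": "Springfield metro area"
}
</script>
```

Note the specific subtype: `Plumber` is one of schema.org’s LocalBusiness subtypes, and picking the closest match beats a generic `LocalBusiness`.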
One warning here: don’t waste time stuffing fake review markup onto your own site. Google’s review snippet documentation and its review rich results update make it clear that self-serving reviews for LocalBusiness and Organization aren’t shown as rich results.
4. Rewrite service pages so they answer real buyer questions
This is where most small business sites still underperform.
They say things like “quality service,” “custom solutions,” or “trusted experts,” then wonder why nothing ranks or converts.
Google’s AI features page says existing SEO fundamentals still apply. That means your pages need to be clear enough to understand, not just optimized enough to publish.
A service page should answer questions like:
- What do you do?
- Who is it for?
- Where do you offer it?
- What problem does it solve?
- What happens next if someone contacts you?
If your plumbing page, dental page, HVAC page, legal page, or IT services page doesn’t answer those questions directly, you’re not dealing with an AI-search problem. You’re dealing with a messaging problem.
5. Use Search Console to find pages that already have demand
A lot of owners guess where to optimize.
That’s backwards.
Search Console’s Performance report shows clicks, impressions, CTR, and average position for your pages and queries. Google’s AI features documentation also says appearances in AI features are included in the Web search type. That means you don’t need a separate AI dashboard before making better decisions.
Look for pages with these patterns:
- good impressions, bad CTR
- page 1 or page 2 rankings with weak leads
- important service pages getting impressions for the wrong intent
Those are the pages worth rewriting first.
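Those patterns are easy to triage programmatically once you export the Pages report from Search Console. The rows below are made-up sample data in the shape of that export, and the thresholds are illustrative judgment calls, not official Google numbers.

```python
# Flag pages worth rewriting from a Search Console performance export.
# Sample rows mimic the Pages report: page, clicks, impressions, average position.
rows = [
    {"page": "/services/drain-cleaning/", "clicks": 12, "impressions": 2400, "position": 8.2},
    {"page": "/about/",                   "clicks": 40, "impressions": 500,  "position": 3.1},
    {"page": "/services/water-heaters/",  "clicks": 5,  "impressions": 1800, "position": 14.7},
]

def needs_rewrite(row, min_impressions=1000, max_ctr=0.02, max_position=20):
    """High impressions + low CTR on page 1-2 means demand you aren't converting."""
    ctr = row["clicks"] / row["impressions"]
    return (row["impressions"] >= min_impressions
            and ctr <= max_ctr
            and row["position"] <= max_position)

for row in rows:
    if needs_rewrite(row):
        print(row["page"])  # pages with proven demand but a weak title/snippet/intro
```

With the sample data, both service pages get flagged: people are seeing them in results and not clicking, which is exactly the “good impressions, bad CTR” pattern above.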
Most of the time, a stronger title, sharper excerpt, clearer service-page intro, and better internal links will produce more value than an experimental file few customers will ever know exists.
When llms.txt might be worth testing
I don’t think small businesses should ignore it forever.
I think they should put it in the right place on the priority list.
Testing llms.txt may make sense if you have one of these situations:
- a very large documentation or resource library
- a product with heavy developer docs
- a content team already strong on technical SEO basics
- time for experimentation after the core site is in good shape
Even then, I’d treat it as an experiment, not a rescue plan.
The reason is simple. Google has already told us in its AI features guidance that you don’t need AI text files to appear in AI features. So if your site is still missing the basics, llms.txt is not your bottleneck.
The small business rule of thumb
If your website still has weak service pages, poor internal links, inconsistent business information, thin structured data, or sloppy indexing signals, do not let llms.txt jump the line.
The businesses that win search in 2026 are usually not the ones doing the most experimental things first.
They’re the ones doing the obvious things better than their competitors.
They make their key pages easy to crawl. Easy to understand. Easy to trust. Easy to contact.
That still works in normal search. It still works in local search. And based on Google’s own documentation, it still matters in AI search too.
So if you’re a small business owner asking whether you need llms.txt right now, my honest answer is this:
Probably not.
Not before you’ve fixed the pages that make you money.
If you want help tightening the parts of your site that actually drive rankings, calls, and leads, get started here.
Richard Kastl
Founder & Lead Engineer
Richard Kastl has spent 14 years engineering websites that generate revenue. He combines expertise in web development, SEO, digital marketing, and conversion optimization to build sites that make the phone ring. His work has helped generate over $30M in pipeline for clients ranging from industrial manufacturers to SaaS companies.