
Documentation now needs to work for AI agents too. That does not mean writing docs for robots and forgetting humans. It means accepting that AI coding tools are now real consumers of developer documentation, and they fail in predictable ways when the documentation is messy, stale, or hard to retrieve.
Vercel published a useful guide on this recently, and the short version feels right: agents do not need animated code blocks, heavy navigation, or a pile of HTML. They need clean retrieval, discoverable structure, freshness metadata, canonical URLs, and tool access where precision matters.
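Freshness metadata and canonical URLs can live in ordinary page frontmatter. A minimal sketch of what that might look like; the field names here are illustrative assumptions, not a standard (on rendered HTML pages, the canonical URL is conventionally expressed with a `<link rel="canonical">` tag):

```markdown
---
title: Authentication API
canonical: https://example.com/docs/auth
last_updated: 2025-01-15
---
```

The exact keys matter less than being consistent across the whole docs site, so a retriever can rely on them.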
Bad docs have always been expensive. They waste human time, cause wrong assumptions, and slow onboarding. Now they also make AI tools worse. If an agent pulls stale examples, misses the version number, or cannot tell which page is canonical, it will produce worse code and a developer will still have to clean up the confusion.
The practical work is not glamorous: clean markdown, useful indexes like llms.txt or sitemap.md, clear last-updated dates, project-specific instructions near the code, and search or MCP tools where the docs need to be queried carefully. But this is exactly the sort of boring work that makes AI-assisted development less annoying. Better context is now part of the engineering system.
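An index like sitemap.md can be generated rather than maintained by hand. A minimal sketch in Python, under stated assumptions: the directory layout and output format are illustrative, and file modification time stands in for a real last-updated date. The point is just a plain markdown list with dates that an agent can fetch in one request:

```python
"""Generate a sitemap.md-style index of docs with last-updated dates."""
from datetime import datetime, timezone
from pathlib import Path


def build_sitemap(docs_dir: str) -> str:
    """Return a markdown index of every .md file under docs_dir."""
    lines = ["# Docs index", ""]
    for path in sorted(Path(docs_dir).rglob("*.md")):
        # File mtime as a stand-in for a real last-updated field.
        mtime = datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc)
        rel = path.relative_to(docs_dir)
        lines.append(f"- [{rel}]({rel}) (last updated {mtime:%Y-%m-%d})")
    return "\n".join(lines) + "\n"


if __name__ == "__main__":
    print(build_sitemap("docs"))
```

Running this in CI and committing the output keeps the index from going stale, which is the whole problem it exists to solve.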