A Model Context Protocol (MCP) server that provides AI assistants with documentation for libraries and frameworks.
**Warning:** Atlas Docs is currently in beta. Not everything may work perfectly yet, but we're actively improving the service. Your patience and feedback are greatly appreciated!
LLMs are great at generating general code, but suck at correctly using less popular or newly released libraries. This isn't surprising, since the models have not been trained comprehensively on code using these libraries.
Atlas Docs MCP server:
- Provides technical documentation for libraries and frameworks
- Processes the official docs into a clean markdown version for LLM consumption
- Is easy to set up with Cursor, Cline, Windsurf and any other MCP-compatible LLM clients
Demo: Claude 3.5 Sonnet on its own vs. Claude 3.5 Sonnet with Atlas Docs MCP (video: Atlas-MCP-tool-use.mp4).
Atlas Docs MCP server works with any MCP client that supports the `stdio` protocol, including:
- Cursor
- Cline
- Windsurf
- Claude Desktop
Add the following to your MCP client configuration file:
```json
{
  "mcpServers": {
    "atlas-docs": {
      "command": "npx",
      "args": ["-y", "@cartographai/atlas-docs-mcp"]
    }
  }
}
```
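For Claude Desktop, the configuration file is typically `claude_desktop_config.json` (on macOS, under `~/Library/Application Support/Claude/`); other clients keep their MCP settings in their own locations, so check your client's documentation if you're unsure.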
That's it! You may need to restart the app (for Claude Desktop) for the server to be recognised.
Tip: Prompt your model to check the docs, e.g. "Use the tools to check the documentation for Astro to ensure that you use the library correctly."
Alternatively, you can install Atlas Docs MCP automatically via Smithery. Example for Claude Desktop:
```bash
npx -y @smithery/cli install @CartographAI/atlas-docs-mcp --client claude
```
Change "claude" to "cursor", "cline" or "windsurf" for the respective clients.
- AI-SDK (source: https://sdk.vercel.ai/docs/introduction)
- Astro (source: https://docs.astro.build/en/getting-started)
- ast-grep (source: https://ast-grep.github.io/llms.txt)
- Bun (source: https://bun.sh/llms.txt)
- CrewAI (source: https://docs.crewai.com/llms.txt)
- Drizzle (source: https://orm.drizzle.team/llms.txt)
- ElevenLabs (source: https://elevenlabs.io/docs/llms.txt)
- Fireworks (source: https://docs.fireworks.ai/llms.txt)
- Hono (source: https://hono.dev/llms.txt)
- Langgraph-js (source: https://langchain-ai.github.io/langgraphjs/llms.txt)
- Langgraph-py (source: https://langchain-ai.github.io/langgraph/llms.txt)
- Mastra (source: https://mastra.ai/llms.txt)
- ModelContextProtocol (source: https://modelcontextprotocol.io/llms.txt)
- Pglite (source: https://pglite.dev/docs/about)
- Prisma (source: https://www.prisma.io/docs/llms.txt)
- Resend (source: https://resend.com/docs/llms.txt)
- shadcn/ui (source: https://ui.shadcn.com/docs)
- Stripe (source: https://docs.stripe.com/llms.txt)
- Svelte (source: https://svelte.dev/docs/svelte/overview)
- SvelteKit (source: https://svelte.dev/docs/kit/introduction)
- tailwindcss (source: https://tailwindcss.com/docs/installation)
- TanStack-Router (source: https://tanstack.com/router/latest/docs/framework/react/overview)
- Trigger.dev (source: https://trigger.dev/docs/llms.txt)
- X (source: https://docs.x.com/llms.txt)
- Zapier (source: https://docs.zapier.com/llms.txt)
Want docs for another library not in this list? Please open an issue in this repo and we'll try to process and add it!
- `list_docs`: List all available documentation sets
- `get_docs_index`: Retrieves a condensed, LLM-friendly index of a documentation set
- `get_docs_full`: Retrieves a complete documentation set in a single consolidated file
- `search_docs`: Search a documentation set by keywords
- `get_docs_page`: Retrieves a specific page of a documentation set
Atlas Docs processes tech libraries' documentation sites into clean markdown versions. This MCP server provides the docs as MCP tools, calling Atlas Docs APIs for the data.
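For illustration, here is a minimal sketch of how an MCP client could call these tools programmatically with the official TypeScript SDK (`@modelcontextprotocol/sdk`). The tool names are taken from the list above; the `docName` argument passed to `get_docs_index` is a hypothetical field name used only for illustration, so check each tool's input schema from `listTools()` for the actual parameters.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the Atlas Docs MCP server over stdio, the same way an MCP client would.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@cartographai/atlas-docs-mcp"],
});

const client = new Client(
  { name: "atlas-docs-example", version: "0.1.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Discover the tools exposed by the server (list_docs, get_docs_index, ...).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// List the available documentation sets.
const docs = await client.callTool({ name: "list_docs", arguments: {} });
console.log(docs.content);

// Fetch the index for one documentation set.
// NOTE: "docName" is an assumed argument key for illustration only;
// use the input schema reported by listTools() for the real field name.
const index = await client.callTool({
  name: "get_docs_index",
  arguments: { docName: "Hono" },
});
console.log(index.content);

await client.close();
```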
Please visit CartographAI/atlas and follow the instructions in the README. Update `ATLAS_API_URL` with the URL of your deployment.
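If your MCP client supports passing environment variables to servers, pointing the server at your own deployment might look like the sketch below. The `env` block is a common convention in MCP client configs; treating `ATLAS_API_URL` as an environment variable here, and the deployment URL itself, are assumptions for illustration.

```json
{
  "mcpServers": {
    "atlas-docs": {
      "command": "npx",
      "args": ["-y", "@cartographai/atlas-docs-mcp"],
      "env": {
        "ATLAS_API_URL": "https://your-atlas-deployment.example.com"
      }
    }
  }
}
```

Replace the placeholder URL with wherever you deployed CartographAI/atlas.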
Please open an issue in this repo to request docs for a library, or to report a bug.
If you have any questions, feedback, or just want to say hi, we'd love to hear from you. You can find us on Cartograph's Discord community for real-time support, or email us at [email protected]