LLMS.txt Documentation MCP server

Provides AI systems with access to documentation from llms.txt files by fetching and parsing content from specified URLs, enabling seamless documentation lookup during coding sessions.
Provider
LangChain
Release date
Mar 18, 2025
Language
Python
Stats
38.2K downloads
834 stars

The MCP Documentation Server is an open-source tool that provides AI coding assistants like Cursor, Windsurf, and Claude with controlled access to documentation through the Model Context Protocol (MCP). It allows you to serve llms.txt files with full control over document retrieval and context, letting you audit what information is accessed by your AI assistants.

Installation

Prerequisites

First, install the uv package manager:

curl -LsSf https://astral.sh/uv/install.sh | sh

Running the MCP Server

No separate installation step is required; uvx fetches and runs the mcpdoc package on demand:

uvx --from mcpdoc mcpdoc

Basic Usage

Starting the Server

Launch the MCP server with one or more llms.txt URLs:

uvx --from mcpdoc mcpdoc \
    --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" "LangChain:https://python.langchain.com/llms.txt" \
    --transport sse \
    --port 8082 \
    --host localhost

This starts the server at http://localhost:8082 with documentation for LangGraph and LangChain.

Security and Domain Access Control

The server implements strict domain access controls:

  • For remote llms.txt files, only the domain of that file is automatically allowed
  • For local llms.txt files, no domains are automatically allowed
  • Additional domains can be specified with --allowed-domains domain1.com domain2.com
  • Use --allowed-domains '*' to allow all domains (use with caution)
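The access-control rules above can be sketched in a few lines of Python. This is a hypothetical illustration of the described behavior, not mcpdoc's actual implementation; the function names are made up for clarity.

```python
from urllib.parse import urlparse

def build_allowed_domains(llms_txt_sources, extra_allowed=()):
    """Sketch of the rules described above (hypothetical helper):
    - remote llms.txt sources contribute their own domain
    - local file paths contribute nothing
    - extra entries (or '*') model --allowed-domains
    """
    allowed = set(extra_allowed)
    for source in llms_txt_sources:
        parsed = urlparse(source)
        if parsed.scheme in ("http", "https"):
            allowed.add(parsed.netloc)  # remote file: its domain is auto-allowed
    return allowed

def is_fetch_allowed(url, allowed):
    if "*" in allowed:  # wildcard: allow every domain (use with caution)
        return True
    return urlparse(url).netloc in allowed

allowed = build_allowed_domains(
    ["https://langchain-ai.github.io/langgraph/llms.txt", "./local-llms.txt"],
    extra_allowed=["python.langchain.com"],
)
print(is_fetch_allowed("https://langchain-ai.github.io/langgraph/concepts.md", allowed))  # True
print(is_fetch_allowed("https://example.com/page", allowed))  # False
```

Note that the local file path contributes nothing to the allowlist, matching the second bullet above.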

Testing with MCP Inspector

You can test the server using the MCP inspector tool:

npx @modelcontextprotocol/inspector

Connect to your running server to test tool calls.

Integrating with AI Assistants

Cursor Integration

  1. Open Cursor Settings and navigate to the MCP tab
  2. Edit the ~/.cursor/mcp.json file with:
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
        "--transport",
        "stdio"
      ]
    }
  }
}
  3. Update Cursor Global Rules with:
for ANY question about LangGraph, use the langgraph-docs-mcp server to help answer -- 
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt 
+ reflect on the input question 
+ call fetch_docs on any urls relevant to the question
+ use this to answer the question
  4. Press CMD+L to open chat and select "agent"
  5. Try a question like "what are types of memory in LangGraph?"

Windsurf Integration

  1. Open Cascade with CMD+L and click "Configure MCP"
  2. Update the ~/.codeium/windsurf/mcp_config.json file with the same configuration as for Cursor
  3. Update Windsurf Global Rules with similar rules as for Cursor
  4. Try your example questions

Claude Desktop Integration

  1. Open Settings/Developer to update ~/Library/Application\ Support/Claude/claude_desktop_config.json
  2. Add the same MCP server configuration
  3. Restart Claude Desktop
  4. Add the rules directly to your prompt:
<rules>
for ANY question about LangGraph, use the langgraph-docs-mcp server to help answer -- 
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt 
+ reflect on the input question 
+ call fetch_docs on any urls relevant to the question
</rules>

Claude Code Integration

In a terminal, run:

claude mcp add-json langgraph-docs '{"type":"stdio","command":"uvx","args":["--from", "mcpdoc", "mcpdoc", "--urls", "langgraph:https://langchain-ai.github.io/langgraph/llms.txt", "LangChain:https://python.langchain.com/llms.txt"]}' -s local

Launch Claude Code and verify the server is available:

$ claude

Then run /mcp inside the session to list the connected MCP servers.

Include the rules in your prompt as with Claude Desktop.

Advanced Configuration

Command-line Options

The server can be configured using various options:

mcpdoc --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt --follow-redirects --timeout 15

You can specify documentation sources in three ways:

  1. Using a YAML config file:
mcpdoc --yaml sample_config.yaml
  2. Using a JSON config file:
mcpdoc --json sample_config.json
  3. Directly specifying URLs:
mcpdoc --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt

These methods can be combined to merge documentation sources.
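For example, a single invocation can mix all three source types. This is a sketch assuming the flags combine as described above; sample_config.yaml and sample_config.json are the hypothetical files from the configuration examples in the next section.

```shell
# Merge sources from a YAML file, a JSON file, and a directly specified URL
# into one server (all three flags contribute documentation sources).
uvx --from mcpdoc mcpdoc \
    --yaml sample_config.yaml \
    --json sample_config.json \
    --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
    --transport sse \
    --port 8082
```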

Configuration File Formats

YAML Configuration Example

- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt

JSON Configuration Example

[
  {
    "name": "LangGraph Python",
    "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt"
  }
]
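Both formats describe the same structure: a list of entries with name and llms_txt keys. As a quick stdlib-only sketch, the JSON form can be generated and round-tripped programmatically (the filename sample_config.json matches the CLI example above):

```python
import json

# The list-of-sources structure both config formats describe.
sources = [
    {
        "name": "LangGraph Python",
        "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt",
    }
]

# Write the JSON config file.
with open("sample_config.json", "w") as f:
    json.dump(sources, f, indent=2)

# Round-trip check: the file parses back to the same list of dicts.
with open("sample_config.json") as f:
    assert json.load(f) == sources
```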

Programmatic Usage

You can also use the server programmatically in Python:

from mcpdoc.main import create_server

server = create_server(
    [
        {
            "name": "LangGraph Python",
            "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt",
        },
    ],
    follow_redirects=True,
    timeout=15.0,
)

server.run(transport="stdio")

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "langgraph-docs-mcp" '{"command":"uvx","args":["--from","mcpdoc","mcpdoc","--urls","LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt","--transport","stdio"]}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can instead add it to a .cursor/mcp.json file inside that project.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > Tools & Integrations and click "New MCP Server".

Clicking that button opens the ~/.cursor/mcp.json file, where you can add your server like this:

{
    "mcpServers": {
        "langgraph-docs-mcp": {
            "command": "uvx",
            "args": [
                "--from",
                "mcpdoc",
                "mcpdoc",
                "--urls",
                "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
                "--transport",
                "stdio"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server exposes and will call them as needed.

You can also explicitly ask the agent to use a tool by naming it and describing what it does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "langgraph-docs-mcp": {
            "command": "uvx",
            "args": [
                "--from",
                "mcpdoc",
                "mcpdoc",
                "--urls",
                "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
                "--transport",
                "stdio"
            ]
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
