LLM.txt Directory MCP server

Access up-to-date API documentation efficiently.
Provider: MCP Get
Release date: Dec 04, 2024
Language: TypeScript
Stats: 6.7K downloads, 57 stars

This MCP server extracts and serves context from LLM.txt files, allowing AI models to understand file structure, dependencies, and code relationships in development environments. It provides access to the LLM.txt Directory, offering file listing, content retrieval, and multi-query search, with local caching of results.

Installation Options

Using MCP Get (Recommended)

The simplest installation method uses MCP Get, which automatically configures the server in Claude Desktop:

npx @michaellatman/mcp-get@latest install @mcp-get-community/server-llm-txt

Manual Configuration

Alternatively, manually configure the server by adding this to your claude_desktop_config.json:

{
  "mcpServers": {
    "llm-txt": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-llm-txt"
      ]
    }
  }
}

Available Tools

List LLM.txt Files

The list_llm_txt tool retrieves all available LLM.txt files from the directory. Results are cached locally for 24 hours in OS-specific locations:

  • Windows: %LOCALAPPDATA%\llm-txt-mcp
  • macOS: ~/Library/Caches/llm-txt-mcp
  • Linux: ~/.cache/llm-txt-mcp
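As a rough sketch, the OS-specific cache locations above can be resolved in TypeScript like this (the paths mirror the list; the helper name and the XDG/LOCALAPPDATA fallbacks are illustrative, not taken from the server's source):

```typescript
import os from "node:os";
import path from "node:path";

// Illustrative helper mirroring the cache locations listed above.
function cacheDir(): string {
  const home = os.homedir();
  switch (process.platform) {
    case "win32":
      // %LOCALAPPDATA%\llm-txt-mcp
      return path.join(
        process.env.LOCALAPPDATA ?? path.join(home, "AppData", "Local"),
        "llm-txt-mcp"
      );
    case "darwin":
      // ~/Library/Caches/llm-txt-mcp
      return path.join(home, "Library", "Caches", "llm-txt-mcp");
    default:
      // ~/.cache/llm-txt-mcp (the XDG default on Linux)
      return path.join(
        process.env.XDG_CACHE_HOME ?? path.join(home, ".cache"),
        "llm-txt-mcp"
      );
  }
}
```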

Example response:

[
  {
    "id": 1,
    "url": "https://docs.squared.ai/llms.txt",
    "name": "AI Squared",
    "description": "AI Squared provides a data and AI integration platform that helps make intelligent insights accessible to all."
  }
]
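For reference, each entry in that list can be modeled with a small TypeScript type inferred from the example response; the interface and lookup helper below are illustrative, not part of the server's published API:

```typescript
// Shape of one directory entry, inferred from the example response above.
interface LlmTxtEntry {
  id: number;
  url: string;
  name: string;
  description: string;
}

// Illustrative helper: resolve an entry's numeric ID by name (case-insensitive).
function findEntryId(entries: LlmTxtEntry[], name: string): number | undefined {
  return entries.find((e) => e.name.toLowerCase() === name.toLowerCase())?.id;
}
```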

Get LLM.txt Content

The get_llm_txt tool fetches content from a specific LLM.txt file by ID.

Parameters:

  • id: The numeric ID of the LLM.txt file (from list_llm_txt)
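Under the hood, an MCP client invokes this tool with a standard tools/call request. The sketch below builds that JSON-RPC payload; the "tools/call" framing comes from the MCP specification, while the helper name is made up for illustration:

```typescript
// Sketch of the JSON-RPC request an MCP client sends to invoke get_llm_txt.
// The tool name and "id" argument come from this page; the helper is illustrative.
function buildGetLlmTxtCall(fileId: number, requestId: number = 1) {
  return {
    jsonrpc: "2.0" as const,
    id: requestId,
    method: "tools/call" as const,
    params: {
      name: "get_llm_txt",
      arguments: { id: fileId },
    },
  };
}
```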

Example response:

{
  "id": 1,
  "url": "https://docs.squared.ai/llms.txt",
  "name": "AI Squared",
  "description": "AI Squared provides a data and AI integration platform that helps make intelligent insights accessible to all.",
  "content": "# AI Squared\n\n## Docs\n\n- [Create Catalog](https://docs.squared.ai/api-reference/catalogs/create_catalog)\n- [Update Catalog](https://docs.squared.ai/api-reference/catalogs/update_catalog)\n..."
}

Search LLM.txt Files

The search_llm_txt tool allows searching for multiple substrings within an LLM.txt file.

Parameters:

  • id: The numeric ID of the LLM.txt file
  • queries: Array of strings to search for (case-insensitive)
  • context_lines (optional): Number of lines to show before and after matches (default: 2)

Example response:

{
  "id": 1,
  "url": "https://docs.squared.ai/llms.txt",
  "name": "AI Squared",
  "matches": [
    {
      "lineNumber": 42,
      "snippet": "- [PostgreSQL](https://docs.squared.ai/guides/data-integration/destinations/database/postgresql): PostgreSQL\n popularly known as Postgres, is a powerful, open-source object-relational database system that uses and extends the SQL language combined with many features that safely store and scale data workloads.\n- [null](https://docs.squared.ai/guides/data-integration/destinations/e-commerce/facebook-product-catalog)",
      "matchedLine": "- [PostgreSQL](https://docs.squared.ai/guides/data-integration/destinations/database/postgresql): PostgreSQL\n popularly known as Postgres, is a powerful, open-source object-relational database system that uses and extends the SQL language combined with many features that safely store and scale data workloads.",
      "matchedQueries": ["postgresql", "database"]
    }
  ]
}
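The behavior described above (case-insensitive multi-query matching with a window of context lines) can be sketched locally like this; the function illustrates the documented semantics and is not the server's actual implementation:

```typescript
interface SearchMatch {
  lineNumber: number; // 1-based, as in the example response
  snippet: string; // matched line plus surrounding context lines
  matchedLine: string;
  matchedQueries: string[];
}

// Illustrative sketch of multi-query, case-insensitive search with context.
function searchContent(
  content: string,
  queries: string[],
  contextLines: number = 2
): SearchMatch[] {
  const lines = content.split("\n");
  const matches: SearchMatch[] = [];
  lines.forEach((line, i) => {
    const lower = line.toLowerCase();
    const hit = queries.filter((q) => lower.includes(q.toLowerCase()));
    if (hit.length > 0) {
      const start = Math.max(0, i - contextLines);
      const end = Math.min(lines.length, i + contextLines + 1);
      matches.push({
        lineNumber: i + 1,
        snippet: lines.slice(start, end).join("\n"),
        matchedLine: line,
        matchedQueries: hit,
      });
    }
  });
  return matches;
}
```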

Note About IDs

The server uses numeric IDs rather than string identifiers. This design choice helps prevent language models from hallucinating non-existent LLM.txt files, encouraging models to check the actual list of available files first.

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "llm-txt" '{"command":"npx","args":["-y","@modelcontextprotocol/server-llm-txt"]}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "llm-txt": {
            "command": "npx",
            "args": [
                "-y",
                "@modelcontextprotocol/server-llm-txt"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you may need to return to Settings > MCP and click the refresh button so Cursor picks up the new server.

The Cursor agent will then see the tools the MCP server exposes and call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "llm-txt": {
            "command": "npx",
            "args": [
                "-y",
                "@modelcontextprotocol/server-llm-txt"
            ]
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
