Docs MCP Server

Grounded Docs MCP Server: Open-Source Alternative to Context7, Nia, and Ref.Tools

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "arabold-docs-mcp-server": {
      "url": "http://localhost:6280/sse",
      "headers": {
        "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY"
      }
    }
  }
}

You can run Grounded Docs MCP Server on your machine to keep an up-to-date, private documentation index for your AI coding assistant. It fetches official docs from websites, GitHub, npm, PyPI, and local files, so your AI can query the exact versions of the libraries you use. Grounding responses in real documentation reduces hallucinations.

How to use

Start the server on your machine and point your MCP client at it to begin querying documentation. Use the built-in Web UI to add documentation sources, then configure your MCP client so your AI can request context from those sources during conversations. Any MCP client that supports HTTP-based endpoints or local IPC channels can connect to a locally running server.

How to install

Prerequisites: you need Node.js 22 or newer installed on your machine.
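You can verify the requirement before installing. The sketch below parses the output format of `node --version` (e.g. `v22.4.1`); the sample version string is a placeholder for the real command output.

```shell
# Check that the installed Node.js major version is at least 22.
version="v22.4.1"              # in practice: version=$(node --version)
major=${version#v}             # strip the leading "v"
major=${major%%.*}             # keep only the major component
if [ "$major" -ge 22 ]; then
  echo "Node.js $version is supported"
else
  echo "Please upgrade Node.js (22 or newer required)"
fi
```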

Install and run the server using the following command:

npx @arabold/docs-mcp-server@latest

Additional setup and configuration

Open the Web UI at http://localhost:6280 to add documentation sources such as websites, GitHub repositories, local folders, and zip archives. Then connect your MCP client by pointing it to the server’s endpoint.

If you want to connect via a direct MCP configuration, you can add an HTTP-based endpoint for the server as shown in the example configuration. This allows your MCP client to stream or fetch data from the server as needed.

Optionally, you can run the server with Docker. The following command maps the necessary volumes and exposes the port for client connections.

docker run --rm \
  -v docs-mcp-data:/data \
  -v docs-mcp-config:/config \
  -p 6280:6280 \
  ghcr.io/arabold/docs-mcp-server:latest \
  --protocol http --host 0.0.0.0 --port 6280
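For longer-running setups, the same invocation can be expressed as a Compose file. This is a sketch assuming Docker Compose is installed; the service name and the optional OPENAI_API_KEY pass-through are illustrative, while the image, volumes, port mapping, and flags mirror the docker run command above.

```yaml
# docker-compose.yml — equivalent of the docker run command above (a sketch)
services:
  docs-mcp-server:
    image: ghcr.io/arabold/docs-mcp-server:latest
    command: --protocol http --host 0.0.0.0 --port 6280
    ports:
      - "6280:6280"
    volumes:
      - docs-mcp-data:/data
      - docs-mcp-config:/config
    # environment:
    #   - OPENAI_API_KEY=${OPENAI_API_KEY}   # optional, enables embeddings

volumes:
  docs-mcp-data:
  docs-mcp-config:
```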

Embedding models (optional)

Using an embedding model is optional but dramatically improves search quality by enabling semantic vector search. You can provide your embedding provider’s credentials when starting the server.

Example: enable embeddings with an OpenAI API key at startup.

OPENAI_API_KEY="sk-proj-..." npx @arabold/docs-mcp-server@latest
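If you run the server in Docker instead, the same variable can be passed into the container with Docker's standard -e flag. This sketch combines the Docker command from the previous section with the embeddings setup; the key value is a placeholder.

```shell
docker run --rm \
  -e OPENAI_API_KEY="sk-proj-..." \
  -v docs-mcp-data:/data \
  -v docs-mcp-config:/config \
  -p 6280:6280 \
  ghcr.io/arabold/docs-mcp-server:latest \
  --protocol http --host 0.0.0.0 --port 6280
```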

Available tools

web_ui

Web-based interface to add documentation sources, start/stop the server, and monitor status.

embedding_search

Optional semantic vector search to improve relevance and reduce hallucinations by matching queries to embedded representations.

scrape_sources

Support for indexing documents from websites, GitHub repositories, local folders, and zip archives.