The Docs MCP Server is an AI-powered documentation manager that solves the problem of outdated information and hallucinations in AI coding assistants. It indexes third-party documentation from websites, GitHub, npm, PyPI, and local files, making the latest official documentation available to your AI through the Model Context Protocol (MCP).
1. **Install Docker and Docker Compose.**

2. **Clone the repository:**

   ```bash
   git clone https://github.com/arabold/docs-mcp-server.git
   cd docs-mcp-server
   ```

3. **Set up the environment:**

   ```bash
   cp .env.example .env
   # Edit .env and add your OpenAI API key
   ```

4. **Start the services:**

   ```bash
   docker compose up -d
   ```

5. **Configure your MCP client.** Add this configuration to your MCP settings:

   ```json
   {
     "mcpServers": {
       "docs-mcp-server": {
         "url": "http://localhost:6280/sse",
         "disabled": false,
         "autoApprove": []
       }
     }
   }
   ```

6. **Access the web interface.** Open http://localhost:6281 in your browser.
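The configuration snippet above must be valid JSON. If your client rejects it, a quick way to rule out a syntax problem is to parse it yourself — a minimal sketch in Python (purely illustrative, not part of the server):

```python
import json

# The MCP client configuration from above, as a string.
config_text = """
{
  "mcpServers": {
    "docs-mcp-server": {
      "url": "http://localhost:6280/sse",
      "disabled": false,
      "autoApprove": []
    }
  }
}
"""

# json.loads raises a ValueError subclass on malformed JSON,
# pointing at the offending line and column.
config = json.loads(config_text)
server = config["mcpServers"]["docs-mcp-server"]
print(server["url"])  # http://localhost:6280/sse
```

Note that strict JSON does not allow comments or trailing commas, which are the most common reasons a hand-edited settings file fails to load.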
1. **Install and start Docker.**

2. **Configure your MCP client.** Add this to your MCP settings, replacing `sk-proj-...` with your OpenAI API key:

   ```json
   {
     "mcpServers": {
       "docs-mcp-server": {
         "command": "docker",
         "args": [
           "run",
           "-i",
           "--rm",
           "-e",
           "OPENAI_API_KEY",
           "-v",
           "docs-mcp-data:/data",
           "ghcr.io/arabold/docs-mcp-server:latest"
         ],
         "env": {
           "OPENAI_API_KEY": "sk-proj-..."
         },
         "disabled": false,
         "autoApprove": []
       }
     }
   }
   ```
Run the server without installation:

```bash
OPENAI_API_KEY="sk-proj-..." npx @arabold/docs-mcp-server@latest
```

Then open http://localhost:6281 in your browser to access the web interface.
You can index documentation from your local filesystem using `file://` URLs:

- `file:///Users/me/docs/index.html`
- `file:///Users/me/docs/my-library`

When using Docker, you must mount your local folders into the container:

```bash
docker run --rm \
  -e OPENAI_API_KEY="your-key" \
  -v /absolute/path/to/docs:/docs:ro \
  -v docs-mcp-data:/data \
  ghcr.io/arabold/docs-mcp-server:latest \
  scrape mylib file:///docs/my-library
```
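The key detail in the Docker case is that the `file://` URL must use the path *inside* the container, not on the host. The translation implied by a `-v host:container` mount can be sketched as follows (a hypothetical helper for illustration, not part of the server):

```python
def to_container_url(host_path: str, mount_src: str, mount_dst: str) -> str:
    """Translate a host path to a file:// URL valid inside the container,
    given a `-v mount_src:mount_dst` volume mapping."""
    if not host_path.startswith(mount_src):
        raise ValueError(f"{host_path} is not under the mount {mount_src}")
    # Keep the part of the path below the mount point, re-rooted at mount_dst.
    relative = host_path[len(mount_src):].lstrip("/")
    container_path = f"{mount_dst}/{relative}".rstrip("/")
    return f"file://{container_path}"

# -v /absolute/path/to/docs:/docs:ro  =>  scrape with file:///docs/...
print(to_container_url("/absolute/path/to/docs/my-library",
                       "/absolute/path/to/docs", "/docs"))
# file:///docs/my-library
```

So a host folder `/absolute/path/to/docs/my-library` mounted at `/docs` is scraped as `file:///docs/my-library`, exactly as in the command above.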
Run CLI commands via Docker:

```bash
docker run --rm \
  -e OPENAI_API_KEY="your-openai-api-key" \
  -v docs-mcp-data:/data \
  ghcr.io/arabold/docs-mcp-server:latest \
  <command> [options]
```

For example, to list indexed libraries:

```bash
docker run --rm \
  -e OPENAI_API_KEY="your-openai-api-key" \
  -v docs-mcp-data:/data \
  ghcr.io/arabold/docs-mcp-server:latest \
  list
```
Or run CLI commands via npx:

```bash
npx @arabold/docs-mcp-server@latest <command> [options]
```

Example:

```bash
npx @arabold/docs-mcp-server@latest list
```
Configure the server using these environment variables:

| Variable | Description |
|---|---|
| `DOCS_MCP_EMBEDDING_MODEL` | Embedding model (default: `text-embedding-3-small`) |
| `OPENAI_API_KEY` | OpenAI API key |
| `OPENAI_API_BASE` | Custom OpenAI-compatible API endpoint |
| `GOOGLE_API_KEY` | Google API key for Gemini |
| `GOOGLE_APPLICATION_CREDENTIALS` | Path to Google service account JSON |
| `AWS_ACCESS_KEY_ID` | AWS key for Bedrock |
| `AWS_SECRET_ACCESS_KEY` | AWS secret for Bedrock |
| `AWS_REGION` | AWS region |
| `AZURE_OPENAI_API_KEY` | Azure OpenAI API key |
| `AZURE_OPENAI_API_INSTANCE_NAME` | Azure OpenAI instance name |
| `AZURE_OPENAI_API_DEPLOYMENT_NAME` | Azure OpenAI deployment name |
| `AZURE_OPENAI_API_VERSION` | Azure OpenAI API version |
| `DOCS_MCP_DATA_DIR` | Data directory (default: `./data`) |
| `DOCS_MCP_PORT` | Server port (default: `6281`) |
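Putting a few of these together, a minimal `.env` for the default OpenAI setup might look like this (the values are placeholders; set only the variables your provider actually needs):

```bash
# .env — placeholder values, adjust for your provider
OPENAI_API_KEY=your-openai-api-key
DOCS_MCP_EMBEDDING_MODEL=text-embedding-3-small
DOCS_MCP_DATA_DIR=./data
DOCS_MCP_PORT=6281
```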
Supported embedding model formats:

- `text-embedding-3-small` (default, OpenAI)
- `openai:llama2` (OpenAI-compatible, Ollama)
- `vertex:text-embedding-004` (Google Vertex AI)
- `gemini:embedding-001` (Google Gemini)
- `aws:amazon.titan-embed-text-v1` (AWS Bedrock)
- `microsoft:text-embedding-ada-002` (Azure OpenAI)

OpenAI (default):

```bash
docker run -i --rm \
  -e OPENAI_API_KEY="your-key" \
  -e DOCS_MCP_EMBEDDING_MODEL="text-embedding-3-small" \
  -v docs-mcp-data:/data \
  ghcr.io/arabold/docs-mcp-server:latest
```

Google Vertex AI:

```bash
docker run -i --rm \
  -e DOCS_MCP_EMBEDDING_MODEL="vertex:text-embedding-004" \
  -e GOOGLE_APPLICATION_CREDENTIALS="/app/gcp-key.json" \
  -v docs-mcp-data:/data \
  -v /path/to/gcp-key.json:/app/gcp-key.json:ro \
  ghcr.io/arabold/docs-mcp-server:latest
```

AWS Bedrock:

```bash
docker run -i --rm \
  -e AWS_ACCESS_KEY_ID="your-aws-key" \
  -e AWS_SECRET_ACCESS_KEY="your-aws-secret" \
  -e AWS_REGION="us-east-1" \
  -e DOCS_MCP_EMBEDDING_MODEL="aws:amazon.titan-embed-text-v1" \
  -v docs-mcp-data:/data \
  ghcr.io/arabold/docs-mcp-server:latest
```
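The model names above all follow the same `provider:model` scheme, with a bare model name implying OpenAI. How such a string splits can be sketched like this (an illustration of the naming convention only, not the server's actual parser):

```python
def parse_embedding_model(spec: str) -> tuple[str, str]:
    """Split a DOCS_MCP_EMBEDDING_MODEL value into (provider, model).
    A spec without a provider prefix defaults to OpenAI."""
    provider, sep, model = spec.partition(":")
    if not sep:
        # No ":" found — the whole spec is the model name.
        return "openai", spec
    return provider, model

print(parse_embedding_model("text-embedding-3-small"))
# ('openai', 'text-embedding-3-small')
print(parse_embedding_model("vertex:text-embedding-004"))
# ('vertex', 'text-embedding-004')
```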
There are two ways to add an MCP server to Cursor. The most common is to add the server globally in the `~/.cursor/mcp.json` file so that it is available in all of your projects.

If you only need the server in a single project, you can instead add it to that project's `.cursor/mcp.json` file.

To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server". This opens the `~/.cursor/mcp.json` file, where you can add your server like this:

```json
{
  "mcpServers": {
    "cursor-rules-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "cursor-rules-mcp"
      ]
    }
  }
}
```
To add an MCP server to a single project, create a new `.cursor/mcp.json` file (or add to the existing one). Its contents look exactly the same as the global MCP server example above.

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the added MCP server exposes and call them when it needs to. You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.