
Claude Context MCP Server

Provides semantic code search over large codebases via an MCP server with vector-backed indexing.

Installation
Add the following to your MCP client configuration file.

Configuration

{
    "mcpServers": {
        "claude_context": {
            "command": "npx",
            "args": [
                "-y",
                "@zilliz/claude-context-mcp@latest"
            ],
            "env": {
                "OPENAI_API_KEY": "your-openai-api-key",
                "MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
                "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
            }
        }
    }
}
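Where this configuration file lives depends on your client. For Claude Code, an equivalent project-scoped config can go in a `.mcp.json` file at the repository root (the file name and `mcpServers` shape here follow Claude Code's project MCP convention; check your own client's documentation for its config location). One way to create it:

```shell
# Write a project-scoped MCP config for Claude Code
# (assumption: .mcp.json at the repository root; substitute your real keys)
cat > .mcp.json <<'EOF'
{
  "mcpServers": {
    "claude_context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
EOF
echo "Wrote $(pwd)/.mcp.json"
```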

Claude Context MCP is a server that enables semantic code search across your codebase by integrating with AI coding assistants. It indexes your code into a vector database and serves contextually relevant snippets to improve search accuracy and reduce token usage during interactions with agents like Claude Code.

How to use

You use Claude Context MCP by connecting it to an MCP-compatible client such as Claude Code or another AI assistant. The server runs locally or remotely and exposes a standard interface through which your assistant requests indexed code and search results using natural-language queries. You index your codebase once, then run semantic searches to find relevant snippets, functions, or patterns across millions of lines of code without loading everything into memory on every request.
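Under the hood, each search your assistant performs is a standard MCP `tools/call` JSON-RPC request sent to the server over its transport. A sketch of what such a request could look like (the `path` and `query` argument names are assumptions for illustration; your client constructs the real request for you):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_code",
    "arguments": {
      "path": "/absolute/path/to/your/repo",
      "query": "functions that handle user authentication"
    }
  }
}
```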

How to install

Prerequisites: Node.js 20.x to 22.x, and a compatible package manager such as npm or pnpm.
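A quick way to check the Node.js prerequisite (a sketch; the `sed` pattern assumes the usual `vMAJOR.MINOR.PATCH` output of `node --version`):

```shell
# Print the local Node.js major version and compare it to the supported range
node_major=$(node --version 2>/dev/null | sed 's/^v\([0-9]*\).*/\1/')
echo "Detected Node.js major version: $node_major"
case "$node_major" in
  20|21|22) echo "Within the supported 20.x-22.x range" ;;
  *) echo "Outside the supported 20.x-22.x range" ;;
esac
```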

Install and start the MCP server with the following command.

# Step 1: Install dependencies and start the MCP server
npx @zilliz/claude-context-mcp@latest

Additional setup for the MCP server

The MCP server expects embedding keys and access tokens for the vector database. You provide these via environment variables when starting the server.

Common environment variables include OPENAI_API_KEY for the embedding model, and MILVUS_ADDRESS together with MILVUS_TOKEN for the Milvus or Zilliz Cloud vector database.

# Example environment setup for local development
OPENAI_API_KEY=your-openai-api-key MILVUS_TOKEN=your-zilliz-cloud-api-key npx @zilliz/claude-context-mcp@latest

Usage in an MCP client

1. Start the MCP server with the appropriate environment variables.

2. In your MCP client, connect to the Claude Context MCP server and issue indexing and search commands. Use the client’s built-in controls to index your codebase, monitor indexing progress, and perform semantic searches against the indexed content.

3. Index your codebase, then run queries like “find functions that handle user authentication” to retrieve relevant code chunks with scores indicating relevance.

Notes on environment and configuration

Use OpenAI for embeddings if you prefer OpenAI’s models. Provide your Milvus or Zilliz Cloud credentials to enable vector storage and retrieval.

Adjust file inclusion rules and embedding settings to optimize retrieval quality and cost for your codebase.
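As an illustration of tuning embeddings, the server's environment could select a specific embedding model alongside the required credentials. The `EMBEDDING_PROVIDER` and `EMBEDDING_MODEL` variable names below are assumptions for this sketch; verify the supported variables in the project's README before relying on them.

```json
{
  "env": {
    "OPENAI_API_KEY": "your-openai-api-key",
    "EMBEDDING_PROVIDER": "OpenAI",
    "EMBEDDING_MODEL": "text-embedding-3-small",
    "MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
    "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
  }
}
```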

Available tools

index_codebase

Index a codebase directory for hybrid search (BM25 + dense vector) to enable fast semantic code retrieval.

search_code

Query the indexed codebase using natural language to retrieve relevant code snippets with relevance scores.

clear_index

Clear the search index for a specific codebase, removing all indexed content.

get_indexing_status

Check the current progress and status of an active indexing operation.