
DeepRepo MCP Server

Exposes retrieval-augmented generation for local codebases via MCP clients with support for multiple providers.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "abhishek2432001-deeprepo": {
      "command": "deeprepo-mcp",
      "args": [],
      "env": {
        "HF_TOKEN": "<HF_TOKEN>",
        "GEMINI_API_KEY": "<GEMINI_API_KEY>",
        "OPENAI_API_KEY": "<OPENAI_API_KEY>",
        "ANTHROPIC_API_KEY": "<ANTHROPIC_API_KEY>",
        "HUGGINGFACE_API_KEY": "<HUGGINGFACE_API_KEY>"
      }
    }
  }
}

You can run the DeepRepo MCP server to expose retrieval-augmented generation capabilities for local codebases and connect it with popular MCP clients such as Cursor, Claude Desktop, and Antigravity. This server lets you ingest code, query with context, and manage conversations through standardized MCP endpoints while keeping all processing local or within selected providers.

How to use

Install and run the MCP server to enable AI-assisted code queries from compatible MCP clients. Start the server with the CLI command or as a Python module, then configure your MCP client with the resulting endpoint and any environment settings. Once it is running, you can ingest your codebase, run context-aware queries, and retrieve sources for verification.

How to install

Prerequisites: ensure you have Python installed on your system. You may also want Node.js or other tooling if you plan to integrate with additional MCP clients, but the MCP server itself runs through Python.

# From a clone of the repository, install in editable mode
pip install -e .
# Or, if the package provides extras for MCP dependencies
pip install 'deeprepo[mcp]'

MCP server setup and usage

You can start the MCP server in two ways: use the CLI command to launch it locally, or run it as a Python module. Pick whichever fits your setup and follow the steps below.

# Using the CLI command
deeprepo-mcp

# Or as a Python module
python -m deeprepo.mcp.server

Configuring Cursor or other MCP clients

Create or edit your client’s MCP configuration file to point to the server. You can run the server with a default provider or choose specific embeddings and LLM providers via environment variables.

{
  "mcpServers": {
    "deeprepo": {
      "command": "python",
      "args": ["-m", "deeprepo.mcp.server"],
      "env": {
        "LLM_PROVIDER": "ollama"
      }
    }
  }
}

Available MCP commands for the server

The MCP server exposes a set of tools you can invoke through the MCP pipeline to manage and query your codebase.
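For example, once connected, an MCP client discovers the available tools with a standard `tools/list` request (JSON-RPC 2.0, as defined by the MCP specification); the server responds with the tool names and input schemas listed later in this document:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```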

REST API and endpoints for MCP usage

This server communicates with MCP clients over the MCP protocol; it does not currently expose a standalone REST API, though it could be extended to do so. Use your MCP client to perform ingestion, querying, and history management through the tools the server provides.

MCP environment variables and provider switching

You can specify which providers to use for embeddings and LLMs by setting environment variables when starting the MCP server. The following variables are commonly used to configure the providers.

# Common provider environment variables
export EMBEDDING_PROVIDER=openai
export LLM_PROVIDER=anthropic
export OPENAI_API_KEY=your-openai-key
export ANTHROPIC_API_KEY=your-anthropic-key
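Equivalently, the variables can be set inline for a one-off launch (the key values shown are placeholders):

```shell
# Launch with explicit providers for this invocation only
EMBEDDING_PROVIDER=openai LLM_PROVIDER=anthropic \
OPENAI_API_KEY=your-openai-key ANTHROPIC_API_KEY=your-anthropic-key \
deeprepo-mcp
```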

Security and best practices

Run the MCP server in a controlled environment. Use private API keys, enable network access only from trusted clients, and consider local-only operation when handling sensitive codebases.

Troubleshooting tips

If the server fails to start, verify Python dependencies are installed, check that the selected providers are reachable, and confirm the environment variables are correctly exported. Review client configuration to ensure the MCP endpoint is correct.
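As a quick sanity check before launching the server, a short Python snippet can confirm the relevant variables are exported. The variable names below are the ones used in this document; adjust the list to match your chosen providers:

```python
import os

# Provider variables used in this document; adjust for your setup.
EXPECTED = ["EMBEDDING_PROVIDER", "LLM_PROVIDER",
            "OPENAI_API_KEY", "ANTHROPIC_API_KEY"]

def missing_vars(names, env=os.environ):
    """Return the subset of names that are unset or empty."""
    return [n for n in names if not env.get(n)]

if __name__ == "__main__":
    missing = missing_vars(EXPECTED)
    if missing:
        print("Missing environment variables:", ", ".join(missing))
    else:
        print("All provider variables are set.")
```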

Tools exposed by the MCP server

The MCP server offers a set of core tools for managing your codebase and running context-aware queries against it.

Available tools

ingest_codebase

Ingest a directory into the vector store so your codebase can be queried with context.
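An MCP client invokes this tool with a standard `tools/call` request. The argument name shown here (`directory`) is illustrative; check the tool schema reported by `tools/list` for the exact parameter names:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "ingest_codebase",
    "arguments": { "directory": "/path/to/your/project" }
  }
}
```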

query_codebase

Query the knowledge base with retrieval-augmented generation to obtain relevant answers and sources.

search_similar

Find similar code snippets or files without invoking the LLM.

get_stats

Retrieve statistics about the vector store, such as chunk counts and similarity metrics.

clear_history

Clear the conversation history maintained by the MCP server.
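As a sketch of a typical round trip, a `query_codebase` call might look like the following. The `question` argument name is an assumption; confirm it against the tool schema the server reports via `tools/list`:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "query_codebase",
    "arguments": { "question": "Where is the vector store initialized?" }
  }
}
```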