Exposes retrieval-augmented generation for local codebases via MCP clients with support for multiple providers.
Configuration
```json
{
  "mcpServers": {
    "abhishek2432001-deeprepo": {
      "command": "deeprepo-mcp",
      "args": [],
      "env": {
        "HF_TOKEN": "<HF_TOKEN>",
        "GEMINI_API_KEY": "<GEMINI_API_KEY>",
        "OPENAI_API_KEY": "<OPENAI_API_KEY>",
        "ANTHROPIC_API_KEY": "<ANTHROPIC_API_KEY>",
        "HUGGINGFACE_API_KEY": "<HUGGINGFACE_API_KEY>"
      }
    }
  }
}
```

You can run the DeepRepo MCP server to expose retrieval-augmented generation capabilities for local codebases and connect it to popular MCP clients such as Cursor, Claude Desktop, and Antigravity. The server lets you ingest code, query it with context, and manage conversations through standardized MCP endpoints while keeping processing local or limited to the providers you select.
Install and run the MCP server to enable AI-assisted code queries from compatible MCP clients. Start the server with the CLI command or as a Python module, then configure your MCP client with the resulting endpoint and any optional environment settings. Once the server is running, you can ingest your codebase, run context-aware queries, and retrieve sources for verification.
Prerequisites: ensure Python is installed on your system. Node.js or other tooling is only needed if you plan to integrate additional MCP clients; the MCP server itself runs on Python.
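As a quick sanity check before installing, you can confirm your interpreter meets a minimum version from a short script. The minimum of 3.10 below is an assumption; check the deeprepo release notes for the actual requirement.

```python
import sys

# Assumed minimum version; adjust to match the deeprepo release you install.
MIN_VERSION = (3, 10)

def python_ok(version_info=sys.version_info, minimum=MIN_VERSION):
    """Return True if the running interpreter meets the minimum version."""
    return (version_info[0], version_info[1]) >= minimum

if __name__ == "__main__":
    print("Python OK" if python_ok() else "Python too old")
```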
```shell
# Install the MCP server package and related components
pip install -e .

# Or, if you are using extras for MCP dependencies
pip install 'deeprepo[mcp]'
```

You can start the MCP server in two ways: launch it with the CLI command, or run it as a Python module. Pick whichever method you prefer and follow the steps below.
```shell
# Using the CLI command
deeprepo-mcp

# Or as a Python module
python -m deeprepo.mcp.server
```

Create or edit your client's MCP configuration file to point at the server. You can run the server with the default provider or select specific embedding and LLM providers via environment variables.
```json
{
  "mcpServers": {
    "deeprepo": {
      "command": "python",
      "args": ["-m", "deeprepo.mcp.server"],
      "env": {
        "LLM_PROVIDER": "ollama"
      }
    }
  }
}
```

The MCP server exposes a set of tools you can invoke through the MCP pipeline to manage and query your codebase.
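If you manage configurations for several clients, you can also build this JSON programmatically instead of editing it by hand. A minimal sketch using only the standard library; the server name and provider value are the ones from the example above, and where you write the file depends on your client:

```python
import json

def build_mcp_config(server_name="deeprepo", provider="ollama"):
    """Build the MCP client configuration shown above as a Python dict."""
    return {
        "mcpServers": {
            server_name: {
                "command": "python",
                "args": ["-m", "deeprepo.mcp.server"],
                "env": {"LLM_PROVIDER": provider},
            }
        }
    }

# Serialize for writing into your client's config file (path varies by client).
config_text = json.dumps(build_mcp_config(), indent=2)
```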
This MCP server is designed to work with MCP clients and may be extended to provide REST endpoints for client interactions. Use the MCP client to communicate with the server and perform ingestion, querying, and history management through the provided tools.
You can specify which providers to use for embeddings and LLMs by setting environment variables when starting the MCP server. The following variables are commonly used to configure the providers.
```shell
# Common provider environment variables
export EMBEDDING_PROVIDER=openai
export LLM_PROVIDER=anthropic
export OPENAI_API_KEY=your-openai-key
export ANTHROPIC_API_KEY=your-anthropic-key
```

Run the MCP server in a controlled environment: keep API keys private, allow network access only from trusted clients, and consider local-only operation when handling sensitive codebases.
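Server-side, provider selection of this kind usually comes down to reading environment variables with sensible defaults. A hypothetical sketch, not deeprepo's actual implementation; the variable names match the ones above, and the `ollama` defaults are assumptions:

```python
import os

def resolve_providers(env=os.environ):
    """Resolve embedding/LLM providers from the environment (defaults assumed)."""
    return {
        "embedding_provider": env.get("EMBEDDING_PROVIDER", "ollama"),
        "llm_provider": env.get("LLM_PROVIDER", "ollama"),
    }
```

Passing `env` explicitly keeps the function testable without mutating the real process environment.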
If the server fails to start, verify Python dependencies are installed, check that the selected providers are reachable, and confirm the environment variables are correctly exported. Review client configuration to ensure the MCP endpoint is correct.
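The checks above can be scripted as a small preflight. This sketch verifies that required packages are importable and that expected environment variables are set; the module and variable names you pass in should match your provider choice:

```python
import importlib.util
import os

def preflight(required_modules, required_env, env=os.environ):
    """Return a list of human-readable problems; an empty list means ready to start."""
    problems = []
    for mod in required_modules:
        # find_spec returns None when a top-level module is not installed.
        if importlib.util.find_spec(mod) is None:
            problems.append(f"missing Python package: {mod}")
    for var in required_env:
        if not env.get(var):
            problems.append(f"environment variable not set: {var}")
    return problems
```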
The MCP server offers a set of core tools to manage your codebase and perform context-aware querying:

- Ingest a directory into the vector store so your codebase can be queried with context.
- Query the knowledge base with retrieval-augmented generation to obtain relevant answers and sources.
- Find similar code snippets or files without invoking the LLM.
- Retrieve statistics about the vector store, such as chunk counts and similarity metrics.
- Clear the conversation history maintained by the MCP server.