
Autodev Codebase MCP Server

Provides an MCP server to index codebases, expose semantic search endpoints, and integrate with IDEs.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "anrgct-autodev-codebase": {
      "url": "http://localhost:3001/sse",
      "headers": {
        "IS_ENABLED": "true",
        "QDRANT_URL": "http://localhost:6333",
        "EMBED_MODEL": "dengcao/Qwen3-Embedding-0.6B:Q8_0",
        "EMBED_BASEURL": "http://localhost:11434",
        "QDRANT_APIKEY": "YOUR_API_KEY_IF_NEEDED",
        "EMBED_PROVIDER": "ollama",
        "SEARCH_MIN_SCORE": "0.4"
      }
    }
  }
}

Autodev Codebase runs an MCP (Model Context Protocol) server that exposes your codebase as a semantic search service with endpoints IDEs can connect to. It enables fast, vector-based code search backed by a local vector store and provides an HTTP MCP endpoint through which IDEs interact with your workspace. It supports both interactive TUI usage and a long-running MCP server mode for IDE integration and automation workflows.

How to use

You can use the MCP server in two practical modes. First, run an interactive session to index and explore your codebase locally. Second, start the MCP server so your IDE or tooling can connect and perform semantic searches, file previews, and code-context operations.
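The two modes above map to two commands. A minimal sketch: the `codebase` binary name is an assumption based on the package name and should be verified after the global install, while `npm run mcp-server` is the server start command shown later in this page.

```shell
# Mode 1: interactive TUI (run from the workspace you want to index).
# The `codebase` command name is an assumption; verify it exists first.
if command -v codebase >/dev/null 2>&1; then
  echo "TUI available: run 'codebase' in your workspace"
else
  echo "TUI binary not found: install with 'npm install -g @autodev/codebase'"
fi

# Mode 2: long-running MCP server for IDE connections:
#   npm run mcp-server
```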

How to install

Before starting, make sure the following are installed on your system: Node.js and npm, a container runtime if you plan to run components (such as Qdrant) in Docker, and a working shell for running commands.

# 1) Install Node.js and npm (if not already installed)
# Use your platform’s package manager, e.g., apt, brew, or installer from nodejs.org

# 2) Install the MCP server package globally
npm install -g @autodev/codebase

# 3) Optional: install dependencies locally for a project
# git clone your project, then:
npm install
npm run build

If you prefer running the MCP server as a long-running process ready for IDE connections, use the following standard start command from your project workspace.

npm run mcp-server

Configuration and usage notes

Configure how the MCP server indexes your codebase and where it stores embeddings and vectors. You can customize the embedding provider (Ollama, an OpenAI-compatible API, or others), the embedding model, and the Qdrant vector store URL. Configuration follows a clear priority order: CLI parameters override the project config, which overrides the global config, which overrides built-in defaults.

Common MCP server endpoints you will use from the IDE or tooling:

- Home: http://localhost:3001
- Health: http://localhost:3001/health
- MCP endpoint: http://localhost:3001/sse

The example IDE configuration below uses the MCP endpoint.
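Before wiring up an IDE, you can verify the server is reachable by polling the health endpoint. A minimal sketch, assuming the default port 3001 from the configuration above:

```shell
# Probe the MCP server's health endpoint; report up/down without failing the shell
HEALTH_URL="http://localhost:3001/health"
if curl -fsS --max-time 2 "$HEALTH_URL" >/dev/null 2>&1; then
  echo "mcp-server: up"
else
  echo "mcp-server: down"
fi
```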

IDE integration example

Configure your IDE to connect to the MCP server using the SSE endpoint. If your IDE supports SSE MCP, supply the endpoint URL as shown.

{
  "mcpServers": {
    "codebase": {
      "url": "http://localhost:3001/sse"
    }
  }
}

Usage tips for the MCP server

- Use the MCP server in HTTP mode for IDE integrations and automated tooling.
- Use the interactive TUI mode for local indexing and exploration when you first set up a workspace.
- Start the MCP server in the workspace you want to index, so the server can expose the correct codebase context.

Security and maintenance

Protect access to the MCP endpoints by limiting host exposure and employing appropriate network controls. Regularly update the embedding models and vector store backends to benefit from improvements and security fixes. Monitor health and performance endpoints to ensure the service remains responsive.

Troubleshooting

If you encounter connection issues from your IDE, verify that the MCP server is running and listening on the expected host and port. Check the health endpoint for status, and review the server logs for any errors related to embeddings, vector storage, or API access.
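To narrow down where a failure sits, you can probe each service the setup depends on. A sketch assuming the default ports from the example configuration (3001 for the MCP server, 6333 for Qdrant):

```shell
# Probe each service and report its status; FAIL lines point at the broken link
for url in http://localhost:3001/health http://localhost:6333; do
  if curl -fsS --max-time 2 "$url" >/dev/null 2>&1; then
    echo "OK   $url"
  else
    echo "FAIL $url"
  fi
done
```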

Available tools

search_codebase

Semantically search through your codebase using embedded representations. Provide a query and receive file paths, similarity scores, and code blocks as results.
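Over the wire, an MCP client invokes the tool with a standard JSON-RPC `tools/call` request. A sketch of what such a request might look like; the argument name `query` is an assumption, so check your client's tool listing for the exact input schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_codebase",
    "arguments": {
      "query": "where is the embedding provider configured?"
    }
  }
}
```

Most MCP clients construct this request for you; it is shown here only to clarify how the tool is addressed by name.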