Codebase indexing MCP server
Configuration
{
  "mcpServers": {
    "reyneill-kontxt": {
      "url": "http://localhost:8080/sse",
      "headers": {
        "GEMINI_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Kontxt MCP Server is an MCP server that analyzes a local codebase to generate context for AI clients. It connects to your repository, exposes context tools for queries, and supports both SSE and stdio transport to fit your workflow. It tracks token usage and lets you tune context generation to your needs.
Start the server to analyze a local codebase and provide context to your MCP clients. You can connect with an SSE client, or run the server over stdio and let the client launch and manage the transport.
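For stdio transport, the client typically launches the server process itself rather than connecting to a URL. A sketch of such a client entry, reusing the `--repo-path` flag from the run command below (the exact command and arguments depend on how you installed the server):

```json
{
  "mcpServers": {
    "reyneill-kontxt": {
      "command": "python",
      "args": ["kontxt_server.py", "--repo-path", "/path/to/your/codebase"]
    }
  }
}
```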
Prerequisites: you need Python and a local code repository to analyze.
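As a quick sanity check before setup, a short script (illustrative only, not part of Kontxt) can confirm the interpreter version and repository path; the minimum Python version here is an assumption, so check the project's README for the actual requirement:

```python
import sys
from pathlib import Path


def check_prereqs(repo_path, min_python=(3, 9)):
    """Return a list of problems; an empty list means you can proceed.

    min_python is an assumed floor -- consult the project's own docs.
    """
    problems = []
    if sys.version_info < min_python:
        problems.append(f"Python {min_python[0]}.{min_python[1]}+ required")
    repo = Path(repo_path)
    if not repo.is_dir():
        problems.append(f"repository path not found: {repo}")
    elif not any(repo.iterdir()):
        problems.append(f"repository is empty: {repo}")
    return problems
```

For example, `check_prereqs("/path/to/your/codebase")` returns `[]` when everything is in place.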
Create a Python virtual environment, install dependencies, and prepare the repository path.
Concrete commands you can run:
python kontxt_server.py --repo-path /path/to/your/codebase

Configure the Gemini API key to enable tokenization and context generation. You can provide the key via an environment variable or a command-line flag.
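The usual precedence between the two sources — a command-line flag overriding the environment — can be sketched like this. The `GEMINI_API_KEY` variable name comes from the configuration above; `--gemini-api-key` is an assumed flag name, so check the server's `--help` output for the real one:

```python
import argparse
import os


def resolve_api_key(argv=None):
    """Resolve the Gemini API key.

    An explicit command-line flag wins over the GEMINI_API_KEY
    environment variable; the flag name is illustrative.
    """
    parser = argparse.ArgumentParser(add_help=False)
    parser.add_argument("--gemini-api-key", default=None)
    args, _ = parser.parse_known_args(argv)
    # Fall back to the environment when the flag is absent.
    return args.gemini_api_key or os.environ.get("GEMINI_API_KEY")
```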
If you are running the server in a local environment, you may also need to install a tokenizer library and verify that any required tools are available on your PATH.
Tools
Codebase context: provides context from the connected codebase for a given user query to enable targeted analysis by AI clients.
Repository structure: lists the directory layout of the connected repository to help understand project organization.
File reading: reads file contents from the repository to supply relevant code snippets to the analysis tools.
Code search: searches the codebase for patterns or terms specified in your queries to refine context.
Token tracking: tracks and reports token usage across repository structure, file reads, searches, and responses for cost awareness and optimization.
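Token tracking of this kind can be illustrated with a small accumulator. This is a sketch only: the categories mirror the usage areas listed above, and the characters-divided-by-four estimate is a rough stand-in for real Gemini tokenization, not the server's actual method:

```python
from collections import defaultdict


class TokenUsageTracker:
    """Accumulate approximate token counts per operation category."""

    # Categories mirroring the usage areas reported by the server.
    CATEGORIES = ("structure", "file_reads", "searches", "responses")

    def __init__(self):
        self.usage = defaultdict(int)

    @staticmethod
    def estimate_tokens(text):
        # Rough heuristic: ~4 characters per token. A real server would
        # call the model's own tokenizer instead.
        return max(1, len(text) // 4) if text else 0

    def record(self, category, text):
        """Record usage for one operation and return its token estimate."""
        if category not in self.CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        tokens = self.estimate_tokens(text)
        self.usage[category] += tokens
        return tokens

    def report(self):
        """Return per-category counts plus a grand total."""
        return {**self.usage, "total": sum(self.usage.values())}
```

A report built this way makes it easy to see which operation class (structure listings, file reads, searches, or responses) dominates cost.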