Code Knowledge Tool MCP Server
Provides local memory storage and RAG context for code knowledge with MCP integration.
Configuration

You can run the Code Knowledge Tool as an MCP server to store code-related knowledge locally and provide context-aware insights through a modular MCP interface. It enables persistent memory, fast embeddings via Ollama, and seamless integration with RooCode and Cline, augmenting your development workflows with RAG-based context.

```json
{
  "mcpServers": {
    "code_knowledge": {
      "command": "python",
      "args": [
        "-m",
        "code_knowledge_tool.mcp_tool"
      ],
      "env": {
        "PYTHONPATH": "${workspaceFolder}"
      }
    }
  }
}
```
Connect to the Code Knowledge Tool MCP server from your MCP client using the provided stdio configuration. This config runs locally with Python and exposes the tool as an MCP endpoint that you can query for project-wide memory and code-context augmentations.
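The stdio connection speaks JSON-RPC 2.0: the client writes requests to the server's stdin and reads responses from its stdout. As a rough illustration, here is a sketch of the first message an MCP client sends; the field names follow the MCP specification, while the client name and version are placeholders:

```python
import json

def make_initialize_request(request_id=1):
    """Build the JSON-RPC 2.0 'initialize' message an MCP client sends first
    over the server's stdin. Field names follow the MCP spec; the clientInfo
    values here are placeholders, not part of the Code Knowledge Tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }

# Messages are serialized as JSON and written to the server process's stdin.
wire_message = json.dumps(make_initialize_request())
```

In practice your MCP client (RooCode, Cline, etc.) performs this handshake for you; the sketch only shows what travels over the stdio pipe.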
Prerequisites: Python 3.8 or higher, and Ollama running locally.
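If you want to verify both prerequisites before installing, a minimal preflight check might look like this (it assumes Ollama's default HTTP address, `localhost:11434`):

```python
import sys
import urllib.request

def python_version_ok(min_version=(3, 8)):
    """True if the current interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

def ollama_reachable(url="http://localhost:11434", timeout=2):
    """True if an HTTP server answers at Ollama's default address.
    Returns False on connection errors instead of raising."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```

Run `ollama serve` first if `ollama_reachable()` returns False.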
Step 1. Build the package locally.

```bash
# Clone the repository
git clone https://github.com/yourusername/code-knowledge-tool.git
cd code-knowledge-tool

# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate

# Install build tools
python -m pip install --upgrade pip build

# Build the package
python -m build
```

Step 2. Prepare and start Ollama.
```bash
# Install Ollama (if not already installed)
curl https://ollama.ai/install.sh | sh

# Start the Ollama service
ollama serve
```

Step 3. Install the Code Knowledge Tool package.
```bash
# Navigate to where you built the package
cd /path/to/code_knowledge_tool

# Install from the wheel file
pip install dist/code_knowledge_tool-0.1.0-py3-none-any.whl
```

Step 4. (Optional, for development) Install in editable mode with development dependencies.
```bash
# Assuming you're already in the code-knowledge-tool directory
# and have activated your virtual environment
pip install -e ".[dev]"
```

Step 5. Configure the MCP endpoint for local use.
```json
{
  "mcpServers": {
    "code_knowledge": {
      "command": "python",
      "args": ["-m", "code_knowledge_tool.mcp_tool"],
      "env": {
        "PYTHONPATH": "${workspaceFolder}"
      }
    }
  }
}
```

Configuration notes: The MCP endpoint is provided as a stdio server. It runs the Python module code_knowledge_tool.mcp_tool and adds the workspace folder to PYTHONPATH so the tool's modules can be imported at runtime.
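To see how a client turns this configuration into a running server process, here is a sketch of the launch logic an MCP client typically applies: merge the config's `env` over the parent environment and spawn the command with stdio pipes. The workspace path is a hypothetical placeholder:

```python
import os

server_config = {
    "command": "python",
    "args": ["-m", "code_knowledge_tool.mcp_tool"],
    "env": {"PYTHONPATH": "/path/to/workspace"},  # hypothetical workspace path
}

def build_launch(config):
    """Merge the config's env over the parent environment, as MCP clients
    typically do, and return the argv list for the stdio server process."""
    env = {**os.environ, **config.get("env", {})}
    argv = [config["command"], *config["args"]]
    return argv, env

argv, env = build_launch(server_config)
# A client would then run something like:
#   subprocess.Popen(argv, env=env, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
```

Because the child process inherits this merged environment, the tool's modules resolve against the workspace folder without any changes to your global Python installation.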
Usage notes: You can use this MCP server as a memory bank for your project and as a RAG context provider. It supports local vector storage, embedding generation with Ollama, multiple file type handling, and integration with RooCode and Cline for seamless development workflows.
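The RAG flow behind these usage notes is: embed the query, rank stored snippets by vector similarity, and hand the top matches to the model as context. This is a generic sketch of that retrieval step, not the tool's actual API:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec, store, k=2):
    """Return the k snippets whose embeddings are closest to the query.
    `store` is a list of (snippet, embedding) pairs; the real tool keeps
    these vectors in persistent local storage."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [snippet for snippet, _ in ranked[:k]]
```

In the actual tool the embeddings come from Ollama and the store lives on disk, but the ranking principle is the same.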
Project setup templates: Copy the memory and rules template to your project to tailor knowledge base management and RAG-based workflows.
Folder and env hints: The configuration example uses PYTHONPATH to ensure the tool’s modules are discoverable during execution. Keep the environment variables aligned with your IDE or editor integration.
- Ensure Ollama is running before attempting embeddings or memory-related tasks.
- Use the provided JSON MCP config exactly as shown to expose the stdio endpoint. Do not modify the command or arguments unless you know you need to point to a different module or path.
- If you encounter module import errors, verify PYTHONPATH includes the workspace folder where code_knowledge_tool.mcp_tool resides.
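When chasing down such import errors, a quick diagnostic is to check whether the package resolves from the current environment and what PYTHONPATH the process actually sees. A small helper along these lines (the checks are generic Python, not part of the tool):

```python
import importlib.util
import os

def diagnose(module="code_knowledge_tool.mcp_tool"):
    """Report whether the server module's top-level package is importable
    and what PYTHONPATH the current process sees."""
    top_level = module.split(".")[0]
    spec = importlib.util.find_spec(top_level)
    return {
        "found": spec is not None,
        "pythonpath": os.environ.get("PYTHONPATH", "<unset>"),
    }
```

If `found` is False while the package exists in your workspace, the workspace folder is missing from PYTHONPATH in the environment the MCP client passes to the server.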
Features:
- Local vector storage for code knowledge, persisted across sessions.
- Efficient embedding generation using Ollama.
- RAG-based context augmentation that supplies relevant code context.
- MCP integration, exposing the tool as a server to RooCode and Cline.
- Memory bank for project knowledge and memory management.
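The embedding feature above relies on Ollama's local HTTP API. As a hedged sketch, requesting an embedding directly might look like this; the endpoint path and model name are assumptions based on Ollama's documented defaults, and `ollama serve` must be running with the model already pulled:

```python
import json
import urllib.request

def build_embed_request(text, model="nomic-embed-text"):
    """Payload for Ollama's /api/embeddings endpoint (the model name is an
    example; any embedding model pulled into Ollama should work)."""
    return {"model": model, "prompt": text}

def ollama_embed(text, model="nomic-embed-text",
                 url="http://localhost:11434/api/embeddings"):
    """POST the payload to a local Ollama server and return the embedding
    vector. Requires `ollama serve` to be running locally."""
    data = json.dumps(build_embed_request(text, model)).encode()
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["embedding"]
```

The Code Knowledge Tool wraps this kind of call internally; the sketch is only meant to show what happens when the server generates embeddings for your code snippets.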