The MCP Server is a powerful Model Context Protocol implementation powered by txtai, offering semantic search, knowledge graph capabilities, and AI-driven text processing through a standardized interface. It serves as a bridge between your data and AI systems, providing intelligent document retrieval and understanding.
# Install uv if you don't have it already
pip install -U uv
# Create a virtual environment with Python 3.10 or newer
uv venv --python=3.10
# Activate the virtual environment
source .venv/bin/activate
# Install from PyPI
uv pip install kb-mcp-server
# Create a new conda environment
conda create -n embedding-mcp python=3.10
conda activate embedding-mcp
# Install from PyPI
pip install kb-mcp-server
# Create a new conda environment
conda create -n embedding-mcp python=3.10
conda activate embedding-mcp
# Clone the repository
git clone https://github.com/Geeksfino/kb-mcp-server.git
cd kb-mcp-server
# Install dependencies
pip install -e .
# Run the MCP server directly
uvx --from [email protected] kb-mcp-server --embeddings /path/to/knowledge_base
# Build a knowledge base
uvx --from [email protected] kb-build --input /path/to/documents --config config.yml
# Search a knowledge base
uvx --from [email protected] kb-search /path/to/knowledge_base "Your search query"
# Build a knowledge base from documents
kb-build --input /path/to/documents --config config.yml
# Update an existing knowledge base
kb-build --input /path/to/new_documents --update
# Export a knowledge base for portability
kb-build --input /path/to/documents --export my_knowledge_base.tar.gz
# Search a knowledge base
kb-search /path/to/knowledge_base "What is machine learning?"
# Search with graph enhancement
kb-search /path/to/knowledge_base "What is machine learning?" --graph --limit 10
# Build a knowledge base using a template configuration
./scripts/kb_build.sh /path/to/documents technical_docs
# Build using a custom configuration file
./scripts/kb_build.sh /path/to/documents /path/to/my_config.yml
# Update an existing knowledge base
./scripts/kb_build.sh /path/to/documents technical_docs --update
# Search a knowledge base
./scripts/kb_search.sh /path/to/knowledge_base "What is machine learning?"
# Start with a specific knowledge base folder
kb-mcp-server --embeddings /path/to/knowledge_base_folder
# Start with a given knowledge base archive
kb-mcp-server --embeddings /path/to/knowledge_base.tar.gz
# Using uvx (no installation required)
uvx [email protected] --embeddings /path/to/knowledge_base_folder
# With additional configuration options
kb-mcp-server --embeddings /path/to/knowledge_base --host 0.0.0.0 --port 8000 --enable-causal-boost
The MCP server can be configured using command-line arguments or environment variables:
# Using command-line arguments
kb-mcp-server --embeddings /path/to/knowledge_base --host 0.0.0.0 --port 8000
# Using environment variables
export TXTAI_EMBEDDINGS=/path/to/knowledge_base
export MCP_SSE_HOST=0.0.0.0
export MCP_SSE_PORT=8000
python -m txtai_mcp_server
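The interplay between the two mechanisms can be sketched as follows. This is a hypothetical illustration, not the server's actual implementation; it assumes the common convention that command-line flags override environment variables, which in turn override built-in defaults, and it reuses the variable and flag names from the examples above.

```python
import os
import argparse

# Sketch of option resolution (assumption, not the real server code):
# CLI flag > environment variable > built-in default.
def resolve_settings(argv=None):
    parser = argparse.ArgumentParser(prog="kb-mcp-server")
    parser.add_argument("--embeddings",
                        default=os.environ.get("TXTAI_EMBEDDINGS"))
    parser.add_argument("--host",
                        default=os.environ.get("MCP_SSE_HOST", "localhost"))
    parser.add_argument("--port", type=int,
                        default=int(os.environ.get("MCP_SSE_PORT", "8000")))
    return parser.parse_args(argv)

# With no flags or environment variables set, the defaults apply
settings = resolve_settings(["--embeddings", "/path/to/knowledge_base"])
print(settings.host, settings.port)
```

Passing `--host` or `--port` on the command line would take precedence even when the corresponding environment variable is set, which is the behavior most CLI tools follow.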
Common configuration options:
--embeddings: Path to the knowledge base (required)
--host: Host address to bind to (default: localhost)
--port: Port to listen on (default: 8000)
--transport: Transport to use, either 'sse' or 'stdio' (default: stdio)
--enable-causal-boost: Enable the causal boost feature for enhanced relevance scoring
--causal-config: Path to a custom causal boost configuration YAML file

To connect LLM clients to the MCP server, create an MCP configuration file:
{
"mcpServers": {
"kb-server": {
"command": "/your/home/project/.venv/bin/kb-mcp-server",
"args": [
"--embeddings",
"/path/to/knowledge_base.tar.gz"
],
"cwd": "/path/to/working/directory"
}
}
}
{
  "mcpServers": {
    "rag-server": {
      "command": "python3",
      "args": [
        "-m",
        "txtai_mcp_server",
        "--embeddings",
        "/path/to/knowledge_base.tar.gz",
        "--enable-causal-boost"
      ],
      "cwd": "/path/to/working/directory"
    }
  }
}
{
"mcpServers": {
"kb-server": {
"command": "uvx",
"args": [
"[email protected]",
"--embeddings", "/path/to/knowledge_base",
"--host", "localhost",
"--port", "8000"
],
"cwd": "/path/to/working/directory"
}
}
}
The knowledge base building process is controlled by YAML configuration files. Here's an example:
# Path to save/load embeddings index
path: ~/.txtai/embeddings
writable: true

# Content storage in SQLite
content:
  path: sqlite:///~/.txtai/content.db

# Embeddings configuration
embeddings:
  # Model settings
  path: sentence-transformers/nli-mpnet-base-v2
  backend: faiss
  gpu: true
  batch: 32
  normalize: true
  # Scoring settings
  scoring: hybrid
  hybridalpha: 0.75

# Pipeline configuration
pipeline:
  workers: 2
  queue: 100
  timeout: 300

# Question-answering pipeline
extractor:
  path: distilbert-base-cased-distilled-squad
  maxlength: 512
  minscore: 0.3

# Graph configuration
graph:
  backend: sqlite
  path: ~/.txtai/graph.db
  similarity: 0.75
  limit: 10
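For smaller corpora or machines without a GPU, a trimmed-down configuration can be enough. The sketch below reuses only keys shown in the example above and simply drops the optional sections; whether a given section may be omitted depends on the kb_builder defaults, so treat this as a starting point rather than a reference.

```yaml
# Minimal CPU-only configuration (same schema as the example above)
path: ~/.txtai/embeddings
writable: true

embeddings:
  path: sentence-transformers/nli-mpnet-base-v2
  backend: faiss
  gpu: false
  normalize: true
```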
Several pre-made configurations are available in the src/kb_builder/configs directory:
python -m kb_builder build --input /path/to/documents --config src/kb_builder/configs/technical_docs.yml
To add this MCP server to Claude Code, run this command in your terminal:
claude mcp add-json "kb-server" '{"command":"kb-mcp-server","args":["--embeddings","/path/to/knowledge_base.tar.gz"],"cwd":"/path/to/working/directory"}'
See the official Claude Code MCP documentation for more details.
There are two ways to add an MCP server to Cursor. The most common is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects. If you only need the server in a single project, you can instead add it to that project's .cursor/mcp.json file.
To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server". This opens the ~/.cursor/mcp.json file, where you can add your server like this:
{
"mcpServers": {
"kb-server": {
"command": "kb-mcp-server",
"args": [
"--embeddings",
"/path/to/knowledge_base.tar.gz"
],
"cwd": "/path/to/working/directory"
}
}
}
To add an MCP server to a single project, create a new .cursor/mcp.json file or add the server to the existing one. The entry looks exactly the same as the global example above.
Once the server is installed, you might need to head back to Settings > MCP and click the refresh button. The Cursor agent will then be able to see the tools the MCP server exposes and will call them when it needs to. You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.
To add this MCP server to Claude Desktop:
1. Find your configuration file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
2. Add this to your configuration file:
{
"mcpServers": {
"kb-server": {
"command": "kb-mcp-server",
"args": [
"--embeddings",
"/path/to/knowledge_base.tar.gz"
],
"cwd": "/path/to/working/directory"
}
}
}
3. Restart Claude Desktop for the changes to take effect