
Embedding MCP Server

Provides a Model Context Protocol server powered by txtai for semantic search, knowledge graphs, and AI-driven text processing.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "geeksfino-kb-mcp-server": {
      "command": "kb-mcp-server",
      "args": [
        "--embeddings",
        "/path/to/knowledge_base_folder"
      ]
    }
  }
}
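If you prefer the install-free runner described below, an equivalent configuration should also work. This is a sketch under the assumption that `uvx` is on your PATH; the `"kb-mcp-server"` key name is your choice:

```json
{
  "mcpServers": {
    "kb-mcp-server": {
      "command": "uvx",
      "args": [
        "kb-mcp-server",
        "--embeddings",
        "/path/to/knowledge_base_folder"
      ]
    }
  }
}
```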

You can run a local MCP Server powered by txtai that provides semantic search, knowledge graph querying, and AI-driven text processing over your knowledge base. It runs locally, keeps data private, and exposes a standardized interface for client applications to interact with your content.

How to use

Set up the MCP Server locally, then connect a client to perform semantic searches, explore knowledge graphs, and run text processing pipelines such as summarization and extraction. The server can start from a knowledge base folder or from a portable knowledge base archive (tar.gz). You can run it several ways: a direct PyPI install, an install-free uvx invocation, or a Python module invocation. On the client side, prepare a configuration that points at the running MCP server so your LLM client can send requests and receive structured results.

How to install

# Prerequisites
- Python 3.10+ (recommended)
- pip for Python package management
- Optional: uv or uvx for faster, install-free runs

# Approach 1: Install via PyPI and run
pip install -U uv
uv venv --python=3.10
source .venv/bin/activate
uv pip install kb-mcp-server

# Approach 2: Install from PyPI and run with uvx (no local install)
pip install uv
uvx kb-mcp-server --embeddings /path/to/knowledge_base

# Approach 3: Install from source (development)
git clone https://github.com/Geeksfino/kb-mcp-server.git
cd kb-mcp-server
pip install -e .

# Start the server with a knowledge base
kb-mcp-server --embeddings /path/to/knowledge_base --host 0.0.0.0 --port 8000

# Alternative: start via uvx with a specific version
uvx kb-mcp-server@<version> --embeddings /path/to/knowledge_base --host 0.0.0.0 --port 8000

# Start with a knowledge base archive
kb-mcp-server --embeddings /path/to/knowledge_base.tar.gz --host 0.0.0.0 --port 8000

Additional sections

Below are practical details you will use when configuring and running the MCP Server, along with optional knowledge base workflows and client considerations.

Configuration and startup options

The server is configured via command-line arguments or environment variables. The essential option is the path to your knowledge base. You can also bind the server to a host and port, choose a transport method, and enable features that improve relevance scoring. The examples below use explicit values.

Starting commands you can use

kb-mcp-server --embeddings /path/to/knowledge_base_folder

kb-mcp-server --embeddings /path/to/knowledge_base.tar.gz --host 0.0.0.0 --port 8000

uvx kb-mcp-server --embeddings /path/to/knowledge_base_folder --host 0.0.0.0 --port 8000

uvx kb-mcp-server --embeddings /path/to/knowledge_base.tar.gz --host 0.0.0.0 --port 8000

python -m txtai_mcp_server --embeddings /path/to/knowledge_base.tar.gz --host 0.0.0.0 --port 8000

Knowledge base building and formats

Build your knowledge base from a collection of documents using supported inputs, embeddings, and optional graph construction. You can export portable knowledge bases as tar.gz archives and load them later with the MCP server.
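A portable export is an ordinary tar.gz archive, so you can inspect one with the standard library before loading it into the server. A minimal sketch; the archive path is a placeholder and the member layout of a real export is not shown here:

```python
import tarfile

def list_archive(path: str) -> list[str]:
    """List member names inside a knowledge base tar.gz archive."""
    with tarfile.open(path, "r:gz") as tar:
        return [member.name for member in tar.getmembers()]

# Example: list_archive("/path/to/knowledge_base.tar.gz")
```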

How the server works with clients

Clients load an MCP configuration that points to the running server and choose how to interact with the server’s capabilities. Typical clients perform semantic searches, query the knowledge graph, and run text processing pipelines such as summarization and extraction.
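Under the hood, MCP messages are JSON-RPC 2.0, and tools are invoked with the `tools/call` method. As a sketch of the request a client sends, with the `kb_search` argument names here being assumptions rather than the server's documented schema:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 request that invokes an MCP tool via tools/call."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

request = make_tool_call(1, "kb_search", {"query": "vector databases", "limit": 3})
print(json.dumps(request, indent=2))
```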

Notes on security and privacy

Run the MCP Server locally to keep data on your device. When exposing the server over the network, apply standard security measures such as firewall rules, authentication if supported, and TLS for transport where applicable.

Troubleshooting and tips

If you encounter issues starting the server, verify that the knowledge base path exists, the host and port are not in use, and the Python environment is correctly activated. Check compatibility between Python and the server package version, and ensure you are using a compatible embedding model and toolkit versions.
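These checks can be scripted. A minimal pre-flight sketch using only the standard library; the path and port below are placeholders:

```python
import socket
import sys
from pathlib import Path

def preflight(kb_path: str, host: str = "127.0.0.1", port: int = 8000) -> list[str]:
    """Return a list of problems that would prevent the server from starting."""
    problems = []
    if sys.version_info < (3, 10):
        problems.append("Python 3.10+ is recommended")
    if not Path(kb_path).exists():
        problems.append(f"knowledge base path not found: {kb_path}")
    with socket.socket() as sock:
        # connect_ex returns 0 when something is already listening on the port
        if sock.connect_ex((host, port)) == 0:
            problems.append(f"port {port} is already in use")
    return problems

print(preflight("/path/to/knowledge_base"))
```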

Examples and workflows

- Load a local knowledge base and run a semantic search for a topic.
- Build a knowledge graph from your documents and traverse related concepts.
- Use text processing pipelines to summarize lengthy documents or extract key entities.

Available tools

kb_builder

Command-line tool for creating and managing knowledge bases by processing documents, extracting text, and building embeddings and graphs.

kb_mcp_server

The MCP server that provides semantic search, knowledge graph access, and text processing pipelines via a standardized interface.

kb_search

Tool to query a knowledge base and optionally enhance results with graph information.