txtai MCP server

A configurable MCP server for semantic memory storage and retrieval, with adjustable limits and settings for scalable use.
Provider
rmtech1
Release date
Jan 11, 2025
Language
Python
Stats
6 stars

This MCP server provides a semantic search and memory management system built on txtai, enabling AI assistants like Claude and Cline to store, retrieve, and organize text-based information with powerful semantic search capabilities.

Installation

Prerequisites

  • Python 3.8 or higher
  • pip (Python package installer)
  • virtualenv (recommended)

Setup Instructions

  1. Clone the repository:

    git clone https://github.com/yourusername/txtai-assistant-mcp.git
    cd txtai-assistant-mcp
    
  2. Run the start script to set up and launch the server:

    ./scripts/start.sh
    

This script automatically:

  • Creates a virtual environment
  • Installs required dependencies
  • Sets up necessary directories
  • Creates a configuration file from the template
  • Starts the server

Configuration

The server can be configured using environment variables in a .env file. A template is provided:

# Server Configuration
HOST=0.0.0.0
PORT=8000

# CORS Configuration
CORS_ORIGINS=*

# Logging Configuration
LOG_LEVEL=DEBUG

# Memory Configuration
MAX_MEMORIES=0
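As a rough illustration of how these variables might be consumed, the sketch below reads each setting from the environment with the template's values as defaults. The function name and the interpretation of `MAX_MEMORIES=0` as "no limit" are assumptions for this example, not taken from the server's source:

```python
import os

def load_config() -> dict:
    """Read server settings from the environment, falling back to the
    defaults shown in the .env template. Variable names mirror the
    template; the actual server may parse them differently."""
    return {
        "host": os.environ.get("HOST", "0.0.0.0"),
        "port": int(os.environ.get("PORT", "8000")),
        # CORS_ORIGINS may hold a comma-separated list; "*" allows all origins
        "cors_origins": os.environ.get("CORS_ORIGINS", "*").split(","),
        "log_level": os.environ.get("LOG_LEVEL", "DEBUG"),
        # Treating 0 as "unlimited" is an assumption here
        "max_memories": int(os.environ.get("MAX_MEMORIES", "0")),
    }
```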

Integration with AI Assistants

Claude Integration

Add this server to Claude's MCP configuration file (typically at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "txtai-assistant": {
      "command": "path/to/txtai-assistant-mcp/scripts/start.sh",
      "env": {}
    }
  }
}

Cline Integration

Add the server to Cline's MCP settings file (typically at ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):

{
  "mcpServers": {
    "txtai-assistant": {
      "command": "path/to/txtai-assistant-mcp/scripts/start.sh",
      "env": {}
    }
  }
}

Available MCP Tools

Store Memory

Stores new memory content with metadata and tags:

{
  "content": "Memory content to store",
  "metadata": {
    "source": "conversation",
    "timestamp": "2023-01-01T00:00:00Z"
  },
  "tags": ["important", "context"],
  "type": "conversation"
}

Retrieve Memory

Retrieves memories based on semantic search:

{
  "query": "search query",
  "n_results": 5
}

Search by Tag

Searches memories by tags:

{
  "tags": ["important", "context"]
}

Delete Memory

Deletes a specific memory by content hash:

{
  "content_hash": "hash_value"
}
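The delete tool identifies a memory by its content hash. One common scheme for such an identifier, shown here purely as an assumption (check the server's source for the exact algorithm before relying on it), is a SHA-256 digest of the stored text:

```python
import hashlib

def content_hash(content: str) -> str:
    """Return a hex digest identifying a memory by its text.
    SHA-256 over UTF-8 bytes is an illustrative assumption; the
    server may use a different hashing scheme."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()
```

The key property is determinism: the same content always yields the same hash, so a memory can be deleted without storing a separate ID.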

Get Stats

Gets database statistics:

{}

Check Health

Checks database and embedding model health:

{}

Usage Examples

In Claude or Cline, you can use these tools through the MCP protocol:

# Store a memory
<use_mcp_tool>
<server_name>txtai-assistant</server_name>
<tool_name>store_memory</tool_name>
<arguments>
{
  "content": "Important information to remember",
  "tags": ["important"]
}
</arguments>
</use_mcp_tool>

# Retrieve memories
<use_mcp_tool>
<server_name>txtai-assistant</server_name>
<tool_name>retrieve_memory</tool_name>
<arguments>
{
  "query": "what was the important information?",
  "n_results": 5
}
</arguments>
</use_mcp_tool>

API Endpoints

Store Memory

POST /store

Request Body:

{
    "content": "Memory content to store",
    "metadata": {
        "source": "example",
        "timestamp": "2023-01-01T00:00:00Z"
    },
    "tags": ["example", "memory"],
    "type": "general"
}

Search Memories

POST /search

Request Body:

{
    "query": "search query",
    "n_results": 5,
    "similarity_threshold": 0.7
}

Search by Tags

POST /search_tags

Request Body:

{
    "tags": ["example", "memory"]
}

Delete Memory

DELETE /memory/{content_hash}

Get Statistics

GET /stats

Health Check

GET /health
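The endpoints above can also be called directly over HTTP. The sketch below builds request payloads matching the documented shapes and posts them with the standard library; it assumes the server is reachable at the default `HOST`/`PORT` from the .env template, and the helper names (`store_payload`, `search_memories`, etc.) are this example's own, not part of the server's API:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # default HOST/PORT from the .env template

def store_payload(content: str, tags=None, metadata=None, type_="general") -> dict:
    """Build a request body for POST /store, matching the shape documented above."""
    return {
        "content": content,
        "metadata": metadata or {},
        "tags": tags or [],
        "type": type_,
    }

def search_payload(query: str, n_results=5, similarity_threshold=0.7) -> dict:
    """Build a request body for POST /search."""
    return {
        "query": query,
        "n_results": n_results,
        "similarity_threshold": similarity_threshold,
    }

def _post(path: str, payload: dict) -> dict:
    """POST a JSON body to the server and decode the JSON response."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def store_memory(content: str, tags=None) -> dict:
    return _post("/store", store_payload(content, tags=tags))

def search_memories(query: str, n_results=5) -> dict:
    return _post("/search", search_payload(query, n_results=n_results))
```

Usage would look like `store_memory("Important fact", tags=["important"])` followed by `search_memories("important fact")`, with the server running locally.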

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "txtai-assistant" '{"command":"path/to/txtai-assistant-mcp/scripts/start.sh","env":{}}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can instead add it to the project by creating (or editing) a .cursor/mcp.json file in the project root.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > Tools & Integrations and click "New MCP Server".

Clicking that button opens the ~/.cursor/mcp.json file, where you can add your server like this:

{
    "mcpServers": {
        "txtai-assistant": {
            "command": "path/to/txtai-assistant-mcp/scripts/start.sh",
            "env": []
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the MCP server exposes and call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning it by name and describing what you want it to do.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "txtai-assistant": {
            "command": "path/to/txtai-assistant-mcp/scripts/start.sh",
            "env": []
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
