
MCP-Mem0 MCP Server

Provides persistent AI memory with save, recall, and semantic search across sessions

Installation
Add the following to your MCP client configuration file.

Configuration

{
    "mcpServers": {
        "mem0": {
            "url": "http://localhost:8050/sse"
        }
    }
}

MCP-Mem0 provides persistent memory capabilities to AI agents by storing, retrieving, and semantically searching memories across sessions. It enables your agents to remember useful context over time, improving continuity and relevance in conversations and tasks.

How to use

You connect to MCP-Mem0 from an MCP client using either a remote HTTP endpoint or a local stdio process. Use the HTTP transport to talk to a running server at a stable URL, or use the stdio transport to run the server inline with your client. Once connected, you can save memories with semantic indexing, retrieve all memories for a full-context view, and perform semantic searches to surface relevant memories for your agents.
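If you prefer the stdio transport, the client launches the server process itself instead of connecting to a URL. A minimal sketch of such a configuration is below; the exact command, script path, and environment variable name are assumptions and will depend on where you installed the server.

{
    "mcpServers": {
        "mem0": {
            "command": "uv",
            "args": ["run", "src/main.py"],
            "env": {
                "TRANSPORT": "stdio"
            }
        }
    }
}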

When you save a memory, provide the information you want the agent to recall later. Memories are stored with semantic representations so retrieval can be contextually relevant even if phrased differently. Use get_all_memories to fetch the entire memory corpus for debugging or to provide a comprehensive context during agent reasoning. Use search_memories to find memories related to a concept, event, or entity without scanning every item.
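Under the hood, MCP tool invocations are JSON-RPC 2.0 messages with the method `tools/call`. The sketch below builds such requests for the tools named in this document; the argument keys (`text`, `query`) are assumptions for illustration, not the server's confirmed schema.

```python
import json

def make_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Save a memory, then search for it later. The argument names
# ("text", "query") are assumptions, not the server's documented schema.
save_req = make_tool_call("save_memory", {"text": "User prefers concise answers"})
search_req = make_tool_call("search_memories", {"query": "user preferences"}, request_id=2)
print(save_req)
```

In practice an MCP client library handles this framing for you; the snippet only shows what travels over the wire.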

How to install

Prerequisites: Python 3.12 or later, a PostgreSQL-compatible database for vector storage, and an API key for your chosen LLM provider. You may run the server with Docker for convenience.

# Prerequisites
pip install uv

# Install as a local editable package (from a clone or downloaded source)
uv pip install -e .

# Create configuration from a sample
cp .env.example .env

If you prefer running with Docker, build and run the image, then connect to the exposed endpoint from your MCP client.

# Build the Docker image (example tag)
docker build -t mcp/mem0 --build-arg PORT=8050 .

# Run with your environment settings
docker run --env-file .env -p 8050:8050 mcp/mem0

Configuration and usage notes

Configure transport, database, and LLM details through environment variables. You can choose SSE for remote HTTP access or stdio for local execution. The server integrates with a PostgreSQL-compatible database for vector storage and requires an API key for your LLM provider.

Common variables include the transport, host, port, LLM provider and base URL, API key, model choice, embedding model, and the database connection URL.
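A sample .env illustrating those categories might look like the following; the exact variable names and values are assumptions and should be checked against the server's own .env.example.

TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
LLM_PROVIDER=openai
LLM_BASE_URL=https://api.openai.com/v1
LLM_API_KEY=your-api-key
LLM_CHOICE=gpt-4o-mini
EMBEDDING_MODEL_CHOICE=text-embedding-3-small
DATABASE_URL=postgresql://user:password@localhost:5432/mem0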

Available tools

save_memory

Stores information in long-term memory with semantic indexing for efficient retrieval later.

get_all_memories

Retrieves all stored memories to provide comprehensive context for the agent.

search_memories

Performs semantic search over memories to find relevant items based on a query.
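To make the semantic-search idea concrete: stored memories are embedded as vectors, and a query is answered by ranking memories by similarity to the query's embedding. The toy sketch below illustrates that ranking with hand-made three-dimensional vectors and cosine similarity; it is not the server's implementation, and real embeddings come from the configured embedding model.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy memory store: text mapped to a hand-made embedding vector.
memories = {
    "likes hiking": [0.9, 0.1, 0.0],
    "works in finance": [0.1, 0.9, 0.2],
    "owns a dog": [0.8, 0.2, 0.1],
}

# Pretend embedding of the query "outdoor hobbies" (made up for illustration).
query_vec = [1.0, 0.0, 0.0]

# Rank memories by similarity to the query, most similar first.
ranked = sorted(memories, key=lambda m: cosine(memories[m], query_vec), reverse=True)
print(ranked[0])  # the memory closest in meaning to the query
```

This is why search_memories can surface "likes hiking" for a query about outdoor hobbies even though the words do not match exactly.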