Graphiti MCP server

Provides a temporal knowledge graph system for storing, retrieving, and reasoning about relationships between entities with persistent memory across conversations
Provider: Zep
Release date: Aug 08, 2024
Language: Python
Stats: 16.0K stars

Graphiti MCP Server is a framework for building and querying temporally-aware knowledge graphs for AI agents. It continuously integrates user interactions with structured and unstructured data into a coherent, queryable graph that supports historical queries without requiring complete graph recomputation. This MCP server implementation exposes Graphiti's functionality through the Model Context Protocol.

Quick Start

Using Claude Desktop and stdio Clients

  1. Clone the repository:
git clone https://github.com/getzep/graphiti.git

or

gh repo clone getzep/graphiti
  2. Note the full path to the directory:
cd graphiti && pwd
  3. Install the prerequisites (Python 3.10+, Neo4j, OpenAI API key).

  4. Configure your MCP client to use Graphiti with a stdio transport.

Using Cursor and SSE-enabled Clients

  1. Navigate to the mcp_server directory:
cd graphiti/mcp_server
  2. Start the service using Docker Compose:
docker compose up
  3. Configure your MCP client to point to http://localhost:8000/sse.

Installation

Prerequisites

  • Python 3.10 or higher
  • Neo4j database (version 5.26 or later)
  • OpenAI API key for LLM operations

Setup

  1. Clone the repository and navigate to the mcp_server directory
  2. Use uv to create a virtual environment and install dependencies:
# Install uv if you don't have it already
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create a virtual environment and install dependencies
uv sync

Configuration

Configure the server using the following environment variables:

  • NEO4J_URI: URI for the Neo4j database (default: bolt://localhost:7687)
  • NEO4J_USER: Neo4j username (default: neo4j)
  • NEO4J_PASSWORD: Neo4j password (default: demodemo)
  • OPENAI_API_KEY: OpenAI API key (required)
  • MODEL_NAME: OpenAI model name to use
  • SMALL_MODEL_NAME: OpenAI model for smaller operations
  • LLM_TEMPERATURE: Temperature for LLM responses (0.0-2.0)

Additional variables are available for Azure OpenAI and concurrency settings. You can set these in a .env file in the project directory.
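As a sketch, a minimal .env for a local setup might look like this (the values below are the documented defaults plus placeholders, not working credentials):

```shell
# .env — read from the project directory at startup
NEO4J_URI=bolt://localhost:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo
OPENAI_API_KEY=sk-XXXXXXXX
MODEL_NAME=gpt-4.1-mini
LLM_TEMPERATURE=0.0
```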

Running the Server

Direct Execution

Run the server using uv:

uv run graphiti_mcp_server.py

With options:

uv run graphiti_mcp_server.py --model gpt-4.1-mini --transport sse

Available arguments:

  • --model: Override the MODEL_NAME environment variable
  • --small-model: Override the SMALL_MODEL_NAME environment variable
  • --temperature: Override the LLM_TEMPERATURE environment variable
  • --transport: Choose transport method (sse or stdio, default: sse)
  • --group-id: Set a namespace for the graph (default: "default")
  • --destroy-graph: If set, destroys all Graphiti graphs on startup
  • --use-custom-entities: Enable entity extraction using predefined ENTITY_TYPES

Managing Concurrency

Adjust the SEMAPHORE_LIMIT environment variable (default: 10) to control concurrent operations. Lower this value if you encounter rate limit errors from your LLM provider, or increase it for better performance if your provider allows higher throughput.
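To illustrate the mechanism (a sketch of the pattern, not Graphiti's actual implementation), an asyncio semaphore caps how many LLM calls are in flight at once:

```python
import asyncio

SEMAPHORE_LIMIT = 10  # same default as the environment variable

async def call_llm(task_id: int, sem: asyncio.Semaphore) -> int:
    # At most SEMAPHORE_LIMIT coroutines pass this point concurrently.
    async with sem:
        await asyncio.sleep(0)  # stand-in for a real LLM request
        return task_id

async def run_batch(n: int) -> list[int]:
    sem = asyncio.Semaphore(SEMAPHORE_LIMIT)
    # gather preserves submission order even though execution interleaves
    return await asyncio.gather(*(call_llm(i, sem) for i in range(n)))

results = asyncio.run(run_batch(25))
print(len(results))  # 25
```

Lowering the semaphore's value trades throughput for fewer simultaneous requests, which is why it helps with provider rate limits.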

Docker Deployment

  1. Configure environment variables using a .env file (recommended):

    cp .env.example .env
    # Edit the file to set your configuration
    
  2. Or set environment variables directly when running:

    OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up
    
  3. Start the services:

    docker compose up
    

The Docker setup includes both the Neo4j database and the Graphiti MCP server, exposing the server on port 8000.

Integrating with MCP Clients

STDIO Transport Configuration

{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}

SSE Transport Configuration

{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}

Available Tools

  • add_episode: Add an episode to the knowledge graph
  • search_nodes: Search for relevant node summaries
  • search_facts: Search for relevant facts (edges between entities)
  • delete_entity_edge: Delete an entity edge
  • delete_episode: Delete an episode
  • get_entity_edge: Get an entity edge by UUID
  • get_episodes: Get the most recent episodes for a group
  • clear_graph: Clear all data and rebuild indices
  • get_status: Get server and Neo4j connection status
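Under the hood, an MCP client invokes these tools with JSON-RPC tools/call requests. A hedged sketch of such a request for search_facts (the "query" argument name is an assumption for illustration, not taken from the Graphiti docs):

```python
import json

# Hypothetical MCP tools/call request; only the tool name comes from the
# list above — the argument shape is assumed.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_facts",
        "arguments": {"query": "What products does Acme Technologies sell?"},
    },
}
wire = json.dumps(request)
print(wire)
```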

Working with JSON Data

Process structured JSON data using the add_episode tool with source="json":

add_episode(
    name="Customer Profile",
    episode_body="{\"company\": {\"name\": \"Acme Technologies\"}, \"products\": [{\"id\": \"P001\", \"name\": \"CloudSync\"}, {\"id\": \"P002\", \"name\": \"DataMiner\"}]}",
    source="json",
    source_description="CRM data"
)
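Rather than hand-escaping the JSON string, you can build episode_body with json.dumps. This sketch reproduces the payload above:

```python
import json

profile = {
    "company": {"name": "Acme Technologies"},
    "products": [
        {"id": "P001", "name": "CloudSync"},
        {"id": "P002", "name": "DataMiner"},
    ],
}

# Serialize the dict into the escaped string expected by episode_body.
episode_body = json.dumps(profile)
print(episode_body)
```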

Integrating with Cursor IDE

  1. Run the Graphiti MCP server:

    python graphiti_mcp_server.py --transport sse --use-custom-entities --group-id <your_group_id>
    

    or

    docker compose up
    
  2. Configure Cursor:

    {
      "mcpServers": {
        "graphiti-memory": {
          "url": "http://localhost:8000/sse"
        }
      }
    }
    

Integrating with Claude Desktop (Docker MCP Server)

  1. Run the Graphiti MCP server:

    docker compose up
    
  2. (Optional) Install mcp-remote globally:

    npm install -g mcp-remote
    
  3. Configure Claude Desktop by adding to claude_desktop_config.json:

    {
      "mcpServers": {
        "graphiti-memory": {
          "command": "npx",
          "args": [
            "mcp-remote",
            "http://localhost:8000/sse"
          ]
        }
      }
    }
    
  4. Restart Claude Desktop.

Disabling Telemetry

To disable anonymous telemetry collection, set:

export GRAPHITI_TELEMETRY_ENABLED=false

Or add to your .env file:

GRAPHITI_TELEMETRY_ENABLED=false

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "graphiti-memory" '{"transport":"stdio","command":"/Users/<user>/.local/bin/uv","args":["run","--isolated","--directory","/Users/<user>/dev/zep/graphiti/mcp_server","--project",".","graphiti_mcp_server.py","--transport","stdio"],"env":{"NEO4J_URI":"bolt://localhost:7687","NEO4J_USER":"neo4j","NEO4J_PASSWORD":"password","OPENAI_API_KEY":"sk-XXXXXXXX","MODEL_NAME":"gpt-4.1-mini"}}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "graphiti-memory": {
            "transport": "stdio",
            "command": "/Users/<user>/.local/bin/uv",
            "args": [
                "run",
                "--isolated",
                "--directory",
                "/Users/<user>/dev/zep/graphiti/mcp_server",
                "--project",
                ".",
                "graphiti_mcp_server.py",
                "--transport",
                "stdio"
            ],
            "env": {
                "NEO4J_URI": "bolt://localhost:7687",
                "NEO4J_USER": "neo4j",
                "NEO4J_PASSWORD": "password",
                "OPENAI_API_KEY": "sk-XXXXXXXX",
                "MODEL_NAME": "gpt-4.1-mini"
            }
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the added MCP server exposes and call them when it needs to.

You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "graphiti-memory": {
            "transport": "stdio",
            "command": "/Users/<user>/.local/bin/uv",
            "args": [
                "run",
                "--isolated",
                "--directory",
                "/Users/<user>/dev/zep/graphiti/mcp_server",
                "--project",
                ".",
                "graphiti_mcp_server.py",
                "--transport",
                "stdio"
            ],
            "env": {
                "NEO4J_URI": "bolt://localhost:7687",
                "NEO4J_USER": "neo4j",
                "NEO4J_PASSWORD": "password",
                "OPENAI_API_KEY": "sk-XXXXXXXX",
                "MODEL_NAME": "gpt-4.1-mini"
            }
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
