Graphiti MCP server

Provides a temporal knowledge graph system for storing, retrieving, and reasoning about relationships between entities with persistent memory across conversations
Setup instructions
Provider
Zep
Release date
Aug 08, 2024
Language
Python
Stats
20.4K stars

The Graphiti MCP server exposes Graphiti, a framework for building temporally-aware knowledge graphs, to AI assistants. It continuously integrates user interactions, structured data, and external information into a queryable graph that supports historical queries and efficient retrieval without recomputing the entire graph.

Installation

Prerequisites

  1. Docker and Docker Compose (for the default FalkorDB setup)
  2. An OpenAI API key, or an API key for another supported LLM provider
  3. (Optional) Python 3.10+ if running the MCP server standalone

Quick Start

Clone the Graphiti repository:

git clone https://github.com/getzep/graphiti.git

or

gh repo clone getzep/graphiti

For HTTP-enabled clients (like Cursor):

  1. Navigate to the MCP server directory:
cd graphiti/mcp_server
  2. Start the combined FalkorDB + MCP server:
docker compose up
  3. Point your MCP client to http://localhost:8000/mcp/

Setup with Python Environment

If you prefer running without Docker, create a virtual environment and install the dependencies from the mcp_server directory:

# Install uv if you don't have it already
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create a virtual environment and install dependencies
uv sync

# Optional: Install additional LLM providers
uv sync --extra providers

Configuration

The server can be configured using a config.yaml file, environment variables, or command-line arguments.
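When the same setting appears in more than one place, a common layering is for command-line arguments to override environment variables, which in turn override the YAML file. A minimal sketch of that precedence (the key names are illustrative, not Graphiti's actual schema):

```python
# Hypothetical precedence sketch: CLI args override env vars override YAML.
# Key names are illustrative; consult config.yaml for the real schema.
def merge_config(yaml_cfg: dict, env_cfg: dict, cli_cfg: dict) -> dict:
    """Later sources win; None means 'not set in this source'."""
    merged = dict(yaml_cfg)
    for source in (env_cfg, cli_cfg):
        merged.update({k: v for k, v in source.items() if v is not None})
    return merged

yaml_cfg = {"llm_provider": "openai", "model": "gpt-4.1"}
env_cfg = {"model": "gpt-4.1-mini"}   # e.g. from MODEL_NAME
cli_cfg = {"llm_provider": None}      # --llm-provider not passed
print(merge_config(yaml_cfg, env_cfg, cli_cfg))
# {'llm_provider': 'openai', 'model': 'gpt-4.1-mini'}
```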

Default Configuration

  • Transport: HTTP (accessible at http://localhost:8000/mcp/)
  • Database: FalkorDB (combined in single container with MCP server)
  • LLM: OpenAI with model gpt-5-mini
  • Embedder: OpenAI text-embedding-3-small

Database Configuration

FalkorDB (Default)

database:
  provider: "falkordb"  # Default
  providers:
    falkordb:
      uri: "redis://localhost:6379"
      password: ""  # Optional
      database: "default_db"  # Optional

Neo4j

database:
  provider: "neo4j"
  providers:
    neo4j:
      uri: "bolt://localhost:7687"
      username: "neo4j"
      password: "your_password"
      database: "neo4j"  # Optional, defaults to "neo4j"

LLM Configuration

Configure multiple LLM providers in your config.yaml:

llm:
  provider: "openai"  # or "anthropic", "gemini", "groq", "azure_openai"
  model: "gpt-4.1"  # or another model supported by the provider

Using Ollama for Local LLM

llm:
  provider: "openai"
  model: "gpt-oss:120b"  # or your preferred Ollama model
  api_base: "http://localhost:11434/v1"
  api_key: "ollama"  # dummy key required

embedder:
  provider: "sentence_transformers"  # recommended for local setup
  model: "all-MiniLM-L6-v2"

Make sure Ollama is running locally with: ollama serve

Environment Variables

Key variables that can be set in a .env file:

  • NEO4J_URI: URI for the Neo4j database
  • NEO4J_USER: Neo4j username
  • NEO4J_PASSWORD: Neo4j password
  • OPENAI_API_KEY: OpenAI API key
  • ANTHROPIC_API_KEY: Anthropic API key
  • GOOGLE_API_KEY: Google API key
  • GROQ_API_KEY: Groq API key
  • SEMAPHORE_LIMIT: Episode processing concurrency
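As a sketch of how such a variable is typically consumed, here is a hypothetical reader for SEMAPHORE_LIMIT with a fallback default of 10 (the server's actual parsing may differ):

```python
import os

# Hypothetical reader for SEMAPHORE_LIMIT; the default of 10 and the
# clamping to a minimum of 1 are assumptions, not the server's real logic.
def read_semaphore_limit(default: int = 10) -> int:
    raw = os.environ.get("SEMAPHORE_LIMIT")
    if raw is None:
        return default
    try:
        return max(1, int(raw))
    except ValueError:
        return default  # fall back on malformed input

os.environ["SEMAPHORE_LIMIT"] = "5"
print(read_semaphore_limit())  # 5
```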

Running the Server

Default Setup (FalkorDB Combined Container)

docker compose up

This starts a single container with:

  • HTTP transport on http://localhost:8000/mcp/
  • FalkorDB graph database on localhost:6379
  • FalkorDB web UI on http://localhost:3000

Running with Neo4j

Using Docker Compose:

docker compose -f docker/docker-compose.neo4j.yaml up

With an existing Neo4j instance:

export NEO4J_URI="bolt://localhost:7687"
export NEO4J_USER="neo4j"
export NEO4J_PASSWORD="your_password"

uv run graphiti_mcp_server.py --database-provider neo4j

Command-Line Arguments

  • --config: Path to YAML configuration file
  • --llm-provider: LLM provider to use
  • --embedder-provider: Embedder provider to use
  • --database-provider: Database provider to use
  • --model: Model name to use with the LLM
  • --temperature: Temperature setting for the LLM
  • --transport: Choose the transport method (http or stdio)
  • --group-id: Set a namespace for the graph
  • --destroy-graph: Destroys all Graphiti graphs on startup

Concurrency and Rate Limiting

Concurrency in Graphiti's ingestion pipeline is controlled by the SEMAPHORE_LIMIT environment variable. Suggested starting points by provider tier:

  • OpenAI Tier 1 (free): SEMAPHORE_LIMIT=1-2
  • OpenAI Tier 2: SEMAPHORE_LIMIT=5-8
  • OpenAI Tier 3: SEMAPHORE_LIMIT=10-15 (default)
  • OpenAI Tier 4: SEMAPHORE_LIMIT=20-50
  • Anthropic Default: SEMAPHORE_LIMIT=5-8
  • Anthropic High tier: SEMAPHORE_LIMIT=15-30
  • Ollama (local): SEMAPHORE_LIMIT=1-5 (hardware dependent)
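Conceptually, SEMAPHORE_LIMIT caps how many episodes are processed at once. A toy asyncio sketch of the idea (not Graphiti's actual pipeline):

```python
import asyncio

# Toy illustration: a semaphore bounds how many "episodes" run concurrently.
# SEMAPHORE_LIMIT=3 here; real ingestion makes LLM calls instead of sleeping.
async def process_episode(sem: asyncio.Semaphore, gauge: list) -> None:
    async with sem:
        gauge[0] += 1                        # episodes currently in flight
        gauge[1] = max(gauge[1], gauge[0])   # high-water mark
        await asyncio.sleep(0.01)            # stand-in for model calls
        gauge[0] -= 1

async def main() -> int:
    sem = asyncio.Semaphore(3)
    gauge = [0, 0]  # [current, peak]
    await asyncio.gather(*(process_episode(sem, gauge) for _ in range(10)))
    return gauge[1]

print(asyncio.run(main()))  # peak concurrency, never above 3
```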

Integrating with MCP Clients

VS Code / GitHub Copilot

Add to your VS Code settings:

{
  "mcpServers": {
    "graphiti": {
      "uri": "http://localhost:8000/mcp/",
      "transport": {
        "type": "http"
      }
    }
  }
}

Other MCP Clients (stdio transport)

{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}

HTTP Transport (Default)

{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "http",
      "url": "http://localhost:8000/mcp/"
    }
  }
}

Available Tools

  • add_episode: Add an episode to the knowledge graph
  • search_nodes: Search for relevant node summaries
  • search_facts: Search for relevant facts (edges)
  • delete_entity_edge: Delete an entity edge
  • delete_episode: Delete an episode
  • get_entity_edge: Get an entity edge by UUID
  • get_episodes: Get recent episodes for a specific group
  • clear_graph: Clear data and rebuild indices
  • get_status: Get server and connection status

Working with JSON Data

add_episode(
  name="Customer Profile",
  episode_body="{\"company\": {\"name\": \"Acme Technologies\"}, \"products\": [{\"id\": \"P001\", \"name\": \"CloudSync\"}, {\"id\": \"P002\", \"name\": \"DataMiner\"}]}",
  source="json",
  source_description="CRM data"
)
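Rather than hand-escaping quotes in the episode body, a client can build it with json.dumps and pass the result to the add_episode tool:

```python
import json

# Build episode_body programmatically instead of hand-escaping a JSON string.
profile = {
    "company": {"name": "Acme Technologies"},
    "products": [
        {"id": "P001", "name": "CloudSync"},
        {"id": "P002", "name": "DataMiner"},
    ],
}
episode_body = json.dumps(profile)  # ready to pass as the tool argument
print(episode_body)
```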

Integrating with Claude Desktop

Claude Desktop communicates over stdio, so it needs a gateway such as mcp-remote to reach the HTTP transport:

  1. Run the Graphiti MCP server:

    docker compose up
    
  2. Configure Claude Desktop:

    {
      "mcpServers": {
        "graphiti-memory": {
          "command": "npx",
          "args": [
            "mcp-remote",
            "http://localhost:8000/mcp/"
          ]
        }
      }
    }
    
  3. Restart Claude Desktop

Telemetry

To disable telemetry, set the environment variable:

export GRAPHITI_TELEMETRY_ENABLED=false

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "graphiti-memory" '{"transport":"stdio","command":"/Users/<user>/.local/bin/uv","args":["run","--isolated","--directory","/Users/<user>/dev/zep/graphiti/mcp_server","--project",".","graphiti_mcp_server.py","--transport","stdio"],"env":{"NEO4J_URI":"bolt://localhost:7687","NEO4J_USER":"neo4j","NEO4J_PASSWORD":"password","OPENAI_API_KEY":"sk-XXXXXXXX","MODEL_NAME":"gpt-4.1-mini"}}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button the ~/.cursor/mcp.json file will be opened and you can add your server like this:

{
    "mcpServers": {
        "graphiti-memory": {
            "transport": "stdio",
            "command": "/Users/<user>/.local/bin/uv",
            "args": [
                "run",
                "--isolated",
                "--directory",
                "/Users/<user>/dev/zep/graphiti/mcp_server",
                "--project",
                ".",
                "graphiti_mcp_server.py",
                "--transport",
                "stdio"
            ],
            "env": {
                "NEO4J_URI": "bolt://localhost:7687",
                "NEO4J_USER": "neo4j",
                "NEO4J_PASSWORD": "password",
                "OPENAI_API_KEY": "sk-XXXXXXXX",
                "MODEL_NAME": "gpt-4.1-mini"
            }
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the MCP server provides and will call them when needed.

You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "graphiti-memory": {
            "transport": "stdio",
            "command": "/Users/<user>/.local/bin/uv",
            "args": [
                "run",
                "--isolated",
                "--directory",
                "/Users/<user>/dev/zep/graphiti/mcp_server",
                "--project",
                ".",
                "graphiti_mcp_server.py",
                "--transport",
                "stdio"
            ],
            "env": {
                "NEO4J_URI": "bolt://localhost:7687",
                "NEO4J_USER": "neo4j",
                "NEO4J_PASSWORD": "password",
                "OPENAI_API_KEY": "sk-XXXXXXXX",
                "MODEL_NAME": "gpt-4.1-mini"
            }
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
