Graphiti MCP Server is a framework for building and querying temporally-aware knowledge graphs for AI agents. It continuously integrates user interactions as well as structured and unstructured data into a coherent, queryable graph that supports historical queries without requiring complete graph recomputation. This MCP server implementation exposes Graphiti's functionality through the Model Context Protocol.
```bash
git clone https://github.com/getzep/graphiti.git
# or
gh repo clone getzep/graphiti

cd graphiti && pwd
```
Install the prerequisites (Python 3.10+, Neo4j, OpenAI API key).
Configure your MCP client to use Graphiti with a stdio transport.
```bash
cd graphiti/mcp_server
docker compose up
```

The server is then available at `http://localhost:8000/sse`.
The project uses `uv` to create a virtual environment and install dependencies:

```bash
# Install uv if you don't have it already
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create a virtual environment and install dependencies
uv sync
```
Configure the server using the following environment variables:

- `NEO4J_URI`: URI for the Neo4j database (default: `bolt://localhost:7687`)
- `NEO4J_USER`: Neo4j username (default: `neo4j`)
- `NEO4J_PASSWORD`: Neo4j password (default: `demodemo`)
- `OPENAI_API_KEY`: OpenAI API key (required)
- `MODEL_NAME`: OpenAI model name to use
- `SMALL_MODEL_NAME`: OpenAI model for smaller operations
- `LLM_TEMPERATURE`: Temperature for LLM responses (0.0-2.0)

Additional variables are available for Azure OpenAI and concurrency settings. You can set these in a `.env` file in the project directory.
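To see how these defaults fit together, here is a minimal sketch of resolving the variables above with `os.environ.get`. The function name and return shape are illustrative, not Graphiti's actual code:

```python
import os

def load_config() -> dict:
    """Resolve server settings from the environment, falling back to the
    documented defaults. Illustrative only, not Graphiti's internal code."""
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY is required")
    return {
        "neo4j_uri": os.environ.get("NEO4J_URI", "bolt://localhost:7687"),
        "neo4j_user": os.environ.get("NEO4J_USER", "neo4j"),
        "neo4j_password": os.environ.get("NEO4J_PASSWORD", "demodemo"),
        "openai_api_key": api_key,
        "model_name": os.environ.get("MODEL_NAME"),  # no documented default
    }
```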
Run the server using `uv`:

```bash
uv run graphiti_mcp_server.py
```

With options:

```bash
uv run graphiti_mcp_server.py --model gpt-4.1-mini --transport sse
```
Available arguments:

- `--model`: Override the `MODEL_NAME` environment variable
- `--small-model`: Override the `SMALL_MODEL_NAME` environment variable
- `--temperature`: Override the `LLM_TEMPERATURE` environment variable
- `--transport`: Choose transport method (`sse` or `stdio`, default: `sse`)
- `--group-id`: Set a namespace for the graph (default: `"default"`)
- `--destroy-graph`: If set, destroys all Graphiti graphs on startup
- `--use-custom-entities`: Enable entity extraction using predefined `ENTITY_TYPES`

Adjust the `SEMAPHORE_LIMIT` environment variable (default: 10) to control concurrent operations. Lower this value if you encounter rate limit errors from your LLM provider, or increase it for better performance if your provider allows higher throughput.
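To illustrate what a semaphore limit controls, the sketch below caps the number of coroutines running at once with `asyncio.Semaphore`. This is the generic concurrency-limiting pattern, not Graphiti's internal implementation:

```python
import asyncio
import os

SEMAPHORE_LIMIT = int(os.environ.get("SEMAPHORE_LIMIT", "10"))

async def call_llm(i: int, sem: asyncio.Semaphore, counters: dict) -> int:
    """Simulated LLM call that respects the concurrency cap."""
    async with sem:
        counters["active"] += 1
        counters["peak"] = max(counters["peak"], counters["active"])
        await asyncio.sleep(0.01)  # stand-in for network latency
        counters["active"] -= 1
    return i

async def main() -> int:
    sem = asyncio.Semaphore(SEMAPHORE_LIMIT)
    counters = {"active": 0, "peak": 0}
    # 50 requests are queued, but at most SEMAPHORE_LIMIT run concurrently
    await asyncio.gather(*(call_llm(i, sem, counters) for i in range(50)))
    return counters["peak"]

if __name__ == "__main__":
    print(f"peak concurrency: {asyncio.run(main())}")
```

With a lower limit, requests queue behind the semaphore instead of hitting the provider all at once, which is why reducing `SEMAPHORE_LIMIT` helps with rate-limit errors.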
Configure environment variables using a `.env` file (recommended):

```bash
cp .env.example .env
# Edit the file to set your configuration
```

Or set environment variables directly when running:

```bash
OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up
```
Start the services:

```bash
docker compose up
```

The Docker setup includes both the Neo4j database and the Graphiti MCP server, exposing the server on port 8000.
```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
```
```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```
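MCP clients tend to fail silently on malformed configuration, so it can save time to sanity-check the JSON before restarting the client. A minimal sketch, with the expected keys taken from the config examples in this document (`check_mcp_config` is a hypothetical helper, not part of any client):

```python
import json

def check_mcp_config(text: str) -> list[str]:
    """Return a list of problems found in an mcpServers config; empty means OK."""
    problems = []
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    servers = cfg.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        return ["missing or empty 'mcpServers' object"]
    for name, server in servers.items():
        # an SSE server needs "url"; a stdio server needs "command"
        if "url" not in server and "command" not in server:
            problems.append(f"{name}: needs either 'url' (sse) or 'command' (stdio)")
    return problems
```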
- `add_episode`: Add an episode to the knowledge graph
- `search_nodes`: Search for relevant node summaries
- `search_facts`: Search for relevant facts (edges between entities)
- `delete_entity_edge`: Delete an entity edge
- `delete_episode`: Delete an episode
- `get_entity_edge`: Get an entity edge by UUID
- `get_episodes`: Get the most recent episodes for a group
- `clear_graph`: Clear all data and rebuild indices
- `get_status`: Get server and Neo4j connection status

Process structured JSON data using the `add_episode` tool with `source="json"`:
```python
add_episode(
    name="Customer Profile",
    episode_body="{\"company\": {\"name\": \"Acme Technologies\"}, \"products\": [{\"id\": \"P001\", \"name\": \"CloudSync\"}, {\"id\": \"P002\", \"name\": \"DataMiner\"}]}",
    source="json",
    source_description="CRM data"
)
```
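Hand-escaping a JSON string for `episode_body` is error-prone; building the same payload with `json.dumps` produces correct escaping automatically. A sketch using the same data:

```python
import json

payload = {
    "company": {"name": "Acme Technologies"},
    "products": [
        {"id": "P001", "name": "CloudSync"},
        {"id": "P002", "name": "DataMiner"},
    ],
}
# json.dumps handles all quoting and escaping for you
episode_body = json.dumps(payload)
# episode_body can then be passed to add_episode(..., source="json")
```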
Run the Graphiti MCP server:

```bash
python graphiti_mcp_server.py --transport sse --use-custom-entities --group-id <your_group_id>
```

or

```bash
docker compose up
```

Configure Cursor:

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```
Run the Graphiti MCP server:

```bash
docker compose up
```

(Optional) Install `mcp-remote` globally:

```bash
npm install -g mcp-remote
```

Configure Claude Desktop by adding to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:8000/sse"
      ]
    }
  }
}
```
Restart Claude Desktop.
To disable anonymous telemetry collection, set:

```bash
export GRAPHITI_TELEMETRY_ENABLED=false
```

Or add to your `.env` file:

```
GRAPHITI_TELEMETRY_ENABLED=false
```
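Boolean environment flags like this are typically parsed case-insensitively, with the feature staying on unless it is explicitly disabled. The sketch below shows that common pattern; it is an assumption about the general convention, not Graphiti's exact parsing logic:

```python
import os

def telemetry_enabled(env=os.environ) -> bool:
    """Common truthy-flag convention: anything other than an explicit
    'false' (case-insensitive) leaves the feature enabled."""
    return env.get("GRAPHITI_TELEMETRY_ENABLED", "true").strip().lower() != "false"
```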
To add this MCP server to Claude Code, run this command in your terminal:

```bash
claude mcp add-json "graphiti-memory" '{"transport":"stdio","command":"/Users/<user>/.local/bin/uv","args":["run","--isolated","--directory","/Users/<user>/dev/zep/graphiti/mcp_server","--project",".","graphiti_mcp_server.py","--transport","stdio"],"env":{"NEO4J_URI":"bolt://localhost:7687","NEO4J_USER":"neo4j","NEO4J_PASSWORD":"password","OPENAI_API_KEY":"sk-XXXXXXXX","MODEL_NAME":"gpt-4.1-mini"}}'
```
See the official Claude Code MCP documentation for more details.
There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the `~/.cursor/mcp.json` file so that it is available in all of your projects.
If you only need the server in a single project, you can add it to the project instead by creating or adding it to the `.cursor/mcp.json` file.
To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server". When you click that button, the `~/.cursor/mcp.json` file will be opened and you can add your server like this:
```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
```
To add an MCP server to a project, you can create a new `.cursor/mcp.json` file or add it to the existing one. This will look exactly the same as the global MCP server example above.
Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.
The Cursor agent will then be able to see the available tools the added MCP server has available and will call them when it needs to.
You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.
To add this MCP server to Claude Desktop:

1. Find your configuration file:
   - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
   - Linux: `~/.config/Claude/claude_desktop_config.json`
2. Add this to your configuration file:
```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
```
3. Restart Claude Desktop for the changes to take effect.