Graphiti MCP Server is a framework that enables AI assistants to interact with temporally-aware knowledge graphs. It continuously integrates user interactions, structured data, and external information into a queryable graph that supports historical queries and efficient retrieval without requiring complete graph recomputation.
Clone the Graphiti repository:
git clone https://github.com/getzep/graphiti.git
or
gh repo clone getzep/graphiti
For HTTP-enabled clients (like Cursor):
cd graphiti/mcp_server
docker compose up
The server will be available at http://localhost:8000/mcp/.

If you prefer running without Docker, create a virtual environment and install dependencies:
# Install uv if you don't have it already
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create a virtual environment and install dependencies
uv sync
# Optional: Install additional LLM providers
uv sync --extra providers
The server can be configured using a config.yaml file, environment variables, or command-line arguments.
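A common pattern for servers with layered configuration is to let command-line flags override environment variables, which in turn override values from the config file. Assuming Graphiti follows that convention (an assumption, not verified here), the resolution logic can be sketched with a hypothetical `resolve` helper:

```python
import os

def resolve(key: str, cli_args: dict, file_config: dict, env_var: str):
    """Resolve one setting: CLI flag > environment variable > config.yaml value."""
    if cli_args.get(key) is not None:
        return cli_args[key]
    if os.environ.get(env_var):
        return os.environ[env_var]
    return file_config.get(key)

# Example: no CLI flag set, so the environment variable wins over the file value.
os.environ["LLM_PROVIDER"] = "anthropic"
print(resolve("llm_provider", {"llm_provider": None},
              {"llm_provider": "openai"}, "LLM_PROVIDER"))
```

An explicit CLI value (e.g. `--llm-provider groq`) would take precedence over both of the other sources.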
database:
  provider: "falkordb"  # Default
  providers:
    falkordb:
      uri: "redis://localhost:6379"
      password: ""  # Optional
      database: "default_db"  # Optional
To use Neo4j instead:

database:
  provider: "neo4j"
  providers:
    neo4j:
      uri: "bolt://localhost:7687"
      username: "neo4j"
      password: "your_password"
      database: "neo4j"  # Optional, defaults to "neo4j"
Configure multiple LLM providers in your config.yaml:
llm:
  provider: "openai"  # or "anthropic", "gemini", "groq", "azure_openai"
  model: "gpt-4.1"  # Default model
For local inference with Ollama (which exposes an OpenAI-compatible API):

llm:
  provider: "openai"
  model: "gpt-oss:120b"  # or your preferred Ollama model
  api_base: "http://localhost:11434/v1"
  api_key: "ollama"  # dummy key required

embedder:
  provider: "sentence_transformers"  # recommended for local setup
  model: "all-MiniLM-L6-v2"
Make sure Ollama is running locally with: ollama serve
Key variables that can be set in a .env file:
- NEO4J_URI: URI for the Neo4j database
- NEO4J_USER: Neo4j username
- NEO4J_PASSWORD: Neo4j password
- OPENAI_API_KEY: OpenAI API key
- ANTHROPIC_API_KEY: Anthropic API key
- GOOGLE_API_KEY: Google API key
- GROQ_API_KEY: Groq API key
- SEMAPHORE_LIMIT: Episode processing concurrency

To start the default stack:

docker compose up
This starts a single container with:
- MCP server: http://localhost:8000/mcp/
- FalkorDB: localhost:6379
- FalkorDB web UI: http://localhost:3000

To run with Neo4j instead, using Docker Compose:
docker compose -f docker/docker-compose.neo4j.yaml up
With an existing Neo4j instance:
export NEO4J_URI="bolt://localhost:7687"
export NEO4J_USER="neo4j"
export NEO4J_PASSWORD="your_password"
uv run graphiti_mcp_server.py --database-provider neo4j
- --config: Path to YAML configuration file
- --llm-provider: LLM provider to use
- --embedder-provider: Embedder provider to use
- --database-provider: Database provider to use
- --model: Model name to use with the LLM
- --temperature: Temperature setting for the LLM
- --transport: Choose the transport method (http or stdio)
- --group-id: Set a namespace for the graph
- --destroy-graph: Destroys all Graphiti graphs on startup

Graphiti's ingestion pipelines are controlled by the SEMAPHORE_LIMIT environment variable:
- SEMAPHORE_LIMIT=1-2 (heavily rate-limited accounts)
- SEMAPHORE_LIMIT=5-8 (lower API tiers)
- SEMAPHORE_LIMIT=10-15 (default)
- SEMAPHORE_LIMIT=20-50 (high-throughput API tiers)
- SEMAPHORE_LIMIT=15-30 (generous mid-tier limits)
- SEMAPHORE_LIMIT=1-5 (local models, hardware dependent)

Add to your VS Code settings:
{
  "mcpServers": {
    "graphiti": {
      "uri": "http://localhost:8000/mcp/",
      "transport": {
        "type": "http"
      }
    }
  }
}
For clients that launch the server over stdio:

{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
For clients that connect over HTTP:

{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "http",
      "url": "http://localhost:8000/mcp/"
    }
  }
}
The server exposes the following MCP tools:

- add_episode: Add an episode to the knowledge graph
- search_nodes: Search for relevant node summaries
- search_facts: Search for relevant facts (edges)
- delete_entity_edge: Delete an entity edge
- delete_episode: Delete an episode
- get_entity_edge: Get an entity edge by UUID
- get_episodes: Get recent episodes for a specific group
- clear_graph: Clear data and rebuild indices
- get_status: Get server and connection status

For example, structured JSON data can be ingested directly:

add_episode(
    name="Customer Profile",
    episode_body="{\"company\": {\"name\": \"Acme Technologies\"}, \"products\": [{\"id\": \"P001\", \"name\": \"CloudSync\"}, {\"id\": \"P002\", \"name\": \"DataMiner\"}]}",
    source="json",
    source_description="CRM data"
)
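Because episode_body is a JSON string, the quotes inside it must be escaped. When calling the tool programmatically, it is less error-prone to build the payload as a dict and serialize it with json.dumps (a sketch using the payload shape from the example above):

```python
import json

# Build the structured payload as a plain dict, then serialize it once.
payload = {
    "company": {"name": "Acme Technologies"},
    "products": [
        {"id": "P001", "name": "CloudSync"},
        {"id": "P002", "name": "DataMiner"},
    ],
}
episode_body = json.dumps(payload)

# The resulting string is valid JSON and round-trips cleanly.
assert json.loads(episode_body) == payload
```

The episode_body string can then be passed to add_episode with source="json" exactly as in the example above.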
Claude Desktop requires a gateway for HTTP transport:
Run the Graphiti MCP server:
docker compose up
Configure Claude Desktop:
{
  "mcpServers": {
    "graphiti-memory": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:8000/mcp/"
      ]
    }
  }
}
Restart Claude Desktop
To disable telemetry, set the environment variable:
export GRAPHITI_TELEMETRY_ENABLED=false
To add this MCP server to Claude Code, run this command in your terminal:
claude mcp add-json "graphiti-memory" '{"transport":"stdio","command":"/Users/<user>/.local/bin/uv","args":["run","--isolated","--directory","/Users/<user>/dev/zep/graphiti/mcp_server","--project",".","graphiti_mcp_server.py","--transport","stdio"],"env":{"NEO4J_URI":"bolt://localhost:7687","NEO4J_USER":"neo4j","NEO4J_PASSWORD":"password","OPENAI_API_KEY":"sk-XXXXXXXX","MODEL_NAME":"gpt-4.1-mini"}}'
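The quoted argument to add-json must parse as valid JSON, which is easy to get wrong in a shell one-liner. A quick way to sanity-check a payload before registering it (a sketch using a trimmed version of the config above):

```python
import json

# A trimmed version of the server definition passed to `claude mcp add-json`.
payload = '{"transport":"stdio","command":"uv","args":["run","graphiti_mcp_server.py","--transport","stdio"]}'

# json.loads raises json.JSONDecodeError on malformed input,
# so reaching the assertions means the payload is well-formed.
config = json.loads(payload)
assert config["transport"] == "stdio"
print("payload is valid JSON")
```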
See the official Claude Code MCP documentation for more details.
There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.
If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.
To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server".
Clicking that button opens the ~/.cursor/mcp.json file, where you can add your server like this:
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.
Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.
The Cursor agent will then see the tools the MCP server exposes and call them when needed.
You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.
To add this MCP server to Claude Desktop:
1. Find your configuration file:
   - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
   - Windows: %APPDATA%\Claude\claude_desktop_config.json
   - Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
3. Restart Claude Desktop for the changes to take effect