The Graphiti MCP Server exposes Graphiti, a framework for building and querying temporally aware knowledge graphs tailored for AI agents in dynamic environments. It continuously integrates user interactions, structured and unstructured data, and external information into a coherent graph, supporting incremental updates and precise historical queries without requiring complete graph recomputation.
Clone the repository:

```bash
git clone https://github.com/getzep/graphiti.git
```

or

```bash
gh repo clone getzep/graphiti
```

Then change into the repository and use `uv` to create a virtual environment and install dependencies:

```bash
cd graphiti

# Install uv if you don't have it already
curl -LsSf https://astral.sh/uv/install.sh | sh

# Create a virtual environment and install dependencies
uv sync
```
1. Clone the repository as described above.
2. Configure environment variables:

   ```bash
   cp .env.example .env
   ```

   Edit the `.env` file to set your OpenAI API key and other options:

   ```
   OPENAI_API_KEY=your_openai_api_key_here
   MODEL_NAME=gpt-4.1-mini
   ```

3. Start the services with Docker Compose:

   ```bash
   docker compose up
   ```
The server uses the following environment variables:

- `NEO4J_URI`: URI for the Neo4j database (default: `bolt://localhost:7687`)
- `NEO4J_USER`: Neo4j username (default: `neo4j`)
- `NEO4J_PASSWORD`: Neo4j password (default: `demodemo`)
- `OPENAI_API_KEY`: OpenAI API key (required for LLM operations)
- `OPENAI_BASE_URL`: Optional base URL for the OpenAI API
- `MODEL_NAME`: OpenAI model name to use for LLM operations
- `SMALL_MODEL_NAME`: OpenAI model name for smaller operations
- `LLM_TEMPERATURE`: Temperature for LLM responses (0.0–2.0)

Azure OpenAI options:

- `AZURE_OPENAI_ENDPOINT`: Optional Azure OpenAI endpoint URL
- `AZURE_OPENAI_DEPLOYMENT_NAME`: Optional Azure OpenAI deployment name
- `AZURE_OPENAI_API_VERSION`: Optional Azure OpenAI API version
- `AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME`: Optional Azure embedding deployment name
- `AZURE_OPENAI_EMBEDDING_API_VERSION`: Optional Azure embedding API version
- `AZURE_OPENAI_USE_MANAGED_IDENTITY`: Optional; use Azure Managed Identities for authentication

Run the Graphiti MCP server with `uv`:

```bash
uv run graphiti_mcp_server.py
```
With options:

```bash
uv run graphiti_mcp_server.py --model gpt-4.1-mini --transport sse
```
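The lookup order implied above, where a CLI flag overrides its environment variable, which in turn falls back to a default, can be sketched as follows (the `resolve` helper is hypothetical, not the server's actual code):

```python
import os

def resolve(cli_value, env_var, default):
    """Return the CLI flag if given, else the environment variable, else the default."""
    if cli_value is not None:
        return cli_value
    return os.environ.get(env_var, default)

# Simulate MODEL_NAME being set in the environment
os.environ["MODEL_NAME"] = "gpt-4.1-mini"

print(resolve(None, "MODEL_NAME", "fallback"))       # → gpt-4.1-mini (env var)
print(resolve("gpt-4.1", "MODEL_NAME", "fallback"))  # → gpt-4.1 (CLI flag wins)
```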
Available arguments:

- `--model`: Override the `MODEL_NAME` environment variable
- `--small-model`: Override the `SMALL_MODEL_NAME` environment variable
- `--temperature`: Override the `LLM_TEMPERATURE` environment variable
- `--transport`: Choose the transport method (`sse` or `stdio`, default: `sse`)
- `--group-id`: Set a namespace for the graph (default: `"default"`)
- `--destroy-graph`: If set, destroy all Graphiti graphs on startup
- `--use-custom-entities`: Enable entity extraction using the predefined `ENTITY_TYPES`

To use Graphiti with an MCP-compatible client via stdio transport:
```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "/Users/<user>/.local/bin/uv",
      "args": [
        "run",
        "--isolated",
        "--directory",
        "/Users/<user>/dev/zep/graphiti/mcp_server",
        "--project",
        ".",
        "graphiti_mcp_server.py",
        "--transport",
        "stdio"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "password",
        "OPENAI_API_KEY": "sk-XXXXXXXX",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
```
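Before pointing a client at a config like the one above, it can help to sanity-check the JSON shape. A minimal sketch follows; the required keys are assumptions based on the examples in this document, not a formal MCP schema:

```python
import json

# A trimmed-down config in the same shape as the example above
CONFIG = """
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "uv",
      "args": ["run", "graphiti_mcp_server.py", "--transport", "stdio"]
    }
  }
}
"""

def check_mcp_config(text):
    """Parse the config and confirm each stdio server entry names a command."""
    servers = json.loads(text)["mcpServers"]
    for name, entry in servers.items():
        # stdio servers need a command to launch; sse servers use a url instead
        if entry.get("transport") == "stdio" and "command" not in entry:
            raise ValueError(f"{name}: stdio transport requires a 'command'")
    return sorted(servers)

print(check_mcp_config(CONFIG))  # → ['graphiti-memory']
```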
For HTTP-based SSE transport:

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```
Claude Desktop requires a gateway for SSE transport:

1. Run the Graphiti MCP server:

   ```bash
   docker compose up
   ```

2. (Optional) Install `mcp-remote` globally:

   ```bash
   npm install -g mcp-remote
   ```

3. Configure Claude Desktop:

   ```json
   {
     "mcpServers": {
       "graphiti-memory": {
         "command": "npx",
         "args": [
           "mcp-remote",
           "http://localhost:8000/sse"
         ]
       }
     }
   }
   ```

4. Restart Claude Desktop.
Run the server with SSE transport:

```bash
python graphiti_mcp_server.py --transport sse --use-custom-entities --group-id <your_group_id>
```

or

```bash
docker compose up
```

Then configure Cursor:

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```
The server exposes these tools:

- `add_episode`: Add an episode to the knowledge graph
- `search_nodes`: Search the graph for relevant node summaries
- `search_facts`: Search for relevant facts (edges between entities)
- `delete_entity_edge`: Delete an entity edge
- `delete_episode`: Delete an episode
- `get_entity_edge`: Get an entity edge by UUID
- `get_episodes`: Get the most recent episodes for a specific group
- `clear_graph`: Clear all data and rebuild indices
- `get_status`: Get server and Neo4j connection status

Process structured data with the `add_episode` tool by passing `source="json"`:
```python
add_episode(
    name="Customer Profile",
    episode_body="{\"company\": {\"name\": \"Acme Technologies\"}, \"products\": [{\"id\": \"P001\", \"name\": \"CloudSync\"}, {\"id\": \"P002\", \"name\": \"DataMiner\"}]}",
    source="json",
    source_description="CRM data"
)
```
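Rather than hand-escaping the JSON string passed as `episode_body`, the payload can be built as a Python dict and serialized with `json.dumps`. A sketch using the same data as the example above (the `add_episode` call is shown commented out, since it requires a running server):

```python
import json

payload = {
    "company": {"name": "Acme Technologies"},
    "products": [
        {"id": "P001", "name": "CloudSync"},
        {"id": "P002", "name": "DataMiner"},
    ],
}

# Serialize once, so the structure stays valid JSON inside the episode_body string
episode_body = json.dumps(payload)

# add_episode(
#     name="Customer Profile",
#     episode_body=episode_body,
#     source="json",
#     source_description="CRM data",
# )
print(episode_body)
```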
There are two ways to add an MCP server to Cursor. The most common is to add the server globally in the `~/.cursor/mcp.json` file so that it is available in all of your projects.

If you only need the server in a single project, you can instead add it to that project's `.cursor/mcp.json` file.

To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server". This opens the `~/.cursor/mcp.json` file, where you can add your server like this:
```json
{
  "mcpServers": {
    "cursor-rules-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "cursor-rules-mcp"
      ]
    }
  }
}
```
To add an MCP server to a project, create a new `.cursor/mcp.json` file or add the entry to the existing one. The format is exactly the same as the global example above.

Once the server is installed, you may need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the added MCP server exposes and call them when needed. You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.