Cognee-MCP is a server that runs Cognee's memory engine over the Model Context Protocol (MCP). It gives AI agents memory by creating and accessing a knowledge graph, which you can query from any MCP-compatible client such as Cursor or Claude Desktop, or directly from your terminal.
Clone the repository:
git clone https://github.com/topoteretes/cognee.git
Navigate to the MCP directory:
cd cognee/cognee-mcp
Install uv (if not already installed):
pip install uv
Install dependencies:
uv sync --dev --all-extras --reinstall
Activate the virtual environment:
source .venv/bin/activate
Create a .env file with your OpenAI API key:
LLM_API_KEY="YOUR_OPENAI_API_KEY"
Run the server with your preferred transport method:
# Default stdio transport
python src/server.py
# SSE transport for real-time streaming
python src/server.py --transport sse
# HTTP transport (recommended for web deployments)
python src/server.py --transport http --host 127.0.0.1 --port 8000 --path /mcp
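To sanity-check the HTTP transport started above, you can POST the MCP handshake to it by hand. The sketch below builds the JSON-RPC `initialize` request defined by the MCP specification; the `/mcp` path and port come from the command above, the `clientInfo` values are placeholders, and `"2024-11-05"` is one published protocol revision (adjust if your server expects a newer one).

```python
import json

# First JSON-RPC message an MCP client sends to the server.
# clientInfo values are arbitrary placeholders for a manual check.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "manual-check", "version": "0.1"},
    },
}

payload = json.dumps(initialize_request)
print(payload)
# POST this body to http://127.0.0.1:8000/mcp with header
# Content-Type: application/json to confirm the server responds.
```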
You can run Cognee-MCP in a Docker container using one of these options:
Create a .env file with your API key and settings, then build the image:
docker rmi cognee/cognee-mcp:main || true
docker build --no-cache -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .
# HTTP transport
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport http
# SSE transport
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport sse
# stdio transport
docker run --env-file ./.env --rm -it cognee/cognee-mcp:main
Once running, the MCP server exposes its functionality through tools that can be called from any MCP-compatible client.
# List all available datasets and data items
list_data()
# List data items in a specific dataset
list_data(dataset_id="your-dataset-id-here")
# Soft delete (safer, preserves shared entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="soft")
# Hard delete (removes orphaned entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="hard")
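Under the hood, an MCP client invokes these tools through the JSON-RPC `tools/call` method from the MCP specification. The sketch below assembles such a message for the `delete` tool; the UUIDs are hypothetical placeholders (real values come from `list_data()`), and the argument names mirror the calls shown above.

```python
import json

# Hypothetical UUIDs for illustration; obtain real ones via list_data().
data_id = "2f1a9f6e-0c1b-4f3a-9a5d-1d2e3f4a5b6c"
dataset_id = "7c8d9e0f-1a2b-3c4d-5e6f-7a8b9c0d1e2f"

# JSON-RPC "tools/call" message a client would send to invoke delete.
delete_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "delete",
        "arguments": {
            "data_id": data_id,
            "dataset_id": dataset_id,
            "mode": "soft",  # "soft" keeps shared entities; "hard" removes orphans
        },
    },
}
print(json.dumps(delete_request, indent=2))
```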
Create a run script for cognee (save as run-cognee.sh):
#!/bin/bash
export ENV=local
export TOKENIZERS_PARALLELISM=false
export EMBEDDING_PROVIDER="fastembed"
export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
export EMBEDDING_DIMENSIONS=384
export EMBEDDING_MAX_TOKENS=256
export LLM_API_KEY=your-OpenAI-API-key
uv --directory /{cognee_root_path}/cognee-mcp run cognee
Install Cursor IDE and navigate to Settings → MCP Tools → New MCP Server
In the mcp.json file, configure your server:
{
"mcpServers": {
"cognee": {
"command": "sh",
"args": [
"/{path-to-your-script}/run-cognee.sh"
]
}
}
}
Refresh the server using the toggle next to your new cognee entry, and check for the green dot to verify it's running.
Open Cursor Agent and start using cognee tools via prompting.
For more detailed configuration options:
Create a full .env file using the template available at env.template
Visit the documentation for information on using different LLM providers and database configurations.
To add this MCP server to Claude Code, run this command in your terminal:
claude mcp add-json "cognee" '{"command":"uv","args":["--directory","/Users/{user}/cognee/cognee-mcp","run","cognee"],"env":{"ENV":"local","TOKENIZERS_PARALLELISM":"false","LLM_API_KEY":"sk-"}}'
See the official Claude Code MCP documentation for more details.
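Hand-editing a single-quoted JSON string is error-prone, so you may prefer to generate the blob passed to `claude mcp add-json` programmatically. This is a sketch only: the directory path and API key are placeholders you must substitute.

```python
import json

# Build the server config for `claude mcp add-json`.
# Path and API key are placeholders; substitute your own values.
server_config = {
    "command": "uv",
    "args": ["--directory", "/Users/{user}/cognee/cognee-mcp", "run", "cognee"],
    "env": {
        "ENV": "local",
        "TOKENIZERS_PARALLELISM": "false",
        "LLM_API_KEY": "sk-",  # replace with your real key
    },
}
blob = json.dumps(server_config)
print(f'claude mcp add-json "cognee" \'{blob}\'')
```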
There are two ways to add an MCP server to Cursor. The most common is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects. If you only need the server in a single project, you can instead add it to that project by creating or editing its .cursor/mcp.json file.
To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server". The ~/.cursor/mcp.json file will open, and you can add your server like this:
{
"mcpServers": {
"cognee": {
"command": "uv",
"args": [
"--directory",
"/Users/{user}/cognee/cognee-mcp",
"run",
"cognee"
],
"env": {
"ENV": "local",
"TOKENIZERS_PARALLELISM": "false",
"LLM_API_KEY": "sk-"
}
}
}
}
To add an MCP server to a project, create a new .cursor/mcp.json file or add the entry to the existing one. The configuration looks exactly the same as the global example above.
Once the server is installed, you may need to head back to Settings > MCP and click the refresh button.
The Cursor agent will then see the tools the MCP server exposes and call them when needed.
You can also explicitly ask the agent to use a tool by mentioning its name and describing what it does.
To add this MCP server to Claude Desktop:
1. Find your configuration file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
2. Add this to your configuration file:
{
"mcpServers": {
"cognee": {
"command": "uv",
"args": [
"--directory",
"/Users/{user}/cognee/cognee-mcp",
"run",
"cognee"
],
"env": {
"ENV": "local",
"TOKENIZERS_PARALLELISM": "false",
"LLM_API_KEY": "sk-"
}
}
}
}
3. Restart Claude Desktop for the changes to take effect
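The three platform-specific paths in step 1 can be selected programmatically. This is a small sketch using only the standard library; the helper name is our own, not part of any Claude tooling.

```python
import os
import platform
from pathlib import Path

# Pick the Claude Desktop config path for the current OS,
# using the three locations listed in step 1 above.
def claude_config_path() -> Path:
    system = platform.system()
    if system == "Darwin":  # macOS
        return Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
    if system == "Windows":
        return Path(os.environ["APPDATA"]) / "Claude/claude_desktop_config.json"
    # Linux and other Unix-likes
    return Path.home() / ".config/Claude/claude_desktop_config.json"

print(claude_config_path())
```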