Cognee MCP server

AI-friendly database and knowledge-management capabilities via various database schemas.
Provider: Topoteretes
Release date: Aug 16, 2023
Language: Python
Stats: 6.5K stars

Cognee-MCP is a server that runs Cognee's memory engine using the Model Context Protocol (MCP). It enables AI agents to have memory capabilities by creating and accessing a knowledge graph, allowing you to query from any MCP-compatible client like Cursor, Claude Desktop, or directly from your terminal.

Quick Installation

Method 1: Local Installation

  1. Clone the repository:

    git clone https://github.com/topoteretes/cognee.git
    
  2. Navigate to the MCP directory:

    cd cognee/cognee-mcp
    
  3. Install uv (if not already installed):

    pip install uv
    
  4. Install dependencies:

    uv sync --dev --all-extras --reinstall
    
  5. Activate the virtual environment:

    source .venv/bin/activate
    
  6. Create a .env file with your OpenAI API key:

    LLM_API_KEY="YOUR_OPENAI_API_KEY"
    
  7. Run the server with your preferred transport method:

    # Default stdio transport
    python src/server.py
    
    # SSE transport for real-time streaming
    python src/server.py --transport sse
    
    # HTTP transport (recommended for web deployments)
    python src/server.py --transport http --host 127.0.0.1 --port 8000 --path /mcp
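
Before launching, it can help to fail fast on a missing key. The following is a small standalone sketch (not part of cognee) that parses the .env from step 6 and checks that LLM_API_KEY has been filled in:

```python
# Standalone helper (not part of cognee): parse .env and verify the key.
import pathlib


def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in pathlib.Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env


def api_key_is_set(env: dict) -> bool:
    """True once LLM_API_KEY is present and no longer the placeholder."""
    key = env.get("LLM_API_KEY", "")
    return bool(key) and key != "YOUR_OPENAI_API_KEY"
```

For example, `api_key_is_set(load_env())` should be True before you start the server.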
    

Method 2: Docker Installation

You can run Cognee-MCP in a Docker container using one of these options:

Option A: Build locally

  1. Make sure you're in the cognee root directory and create a .env file with your API key and settings
  2. Build the image:
    docker rmi cognee/cognee-mcp:main || true
    docker build --no-cache -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .
    
  3. Run the container:
    # HTTP transport
    docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport http
    
    # SSE transport
    docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport sse
    
    # stdio transport
    docker run --env-file ./.env --rm -it cognee/cognee-mcp:main
    

Option B: Pull from Docker Hub

If the image is not present locally, these commands pull cognee/cognee-mcp:main from Docker Hub automatically, so no build step is needed:

# HTTP transport
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport http

# SSE transport
docker run --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main --transport sse

# stdio transport
docker run --env-file ./.env --rm -it cognee/cognee-mcp:main

Using Cognee-MCP

Once running, the MCP server exposes its functionality through tools that can be called from any MCP-compatible client.

Core Tools

  • cognify: Transforms your data into a structured knowledge graph and stores it in memory
  • codify: Analyzes code repositories, builds a code graph, and stores it in memory
  • search: Queries your memory (supports GRAPH_COMPLETION, RAG_COMPLETION, CODE, CHUNKS, INSIGHTS)
  • list_data: Lists all datasets and their data items with IDs
  • delete: Deletes specific data from a dataset (supports soft/hard deletion)
  • prune: Resets cognee for a fresh start (removes all data)
  • cognify_status / codify_status: Tracks pipeline progress
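
With the HTTP transport shown above, these tools are invoked as JSON-RPC `tools/call` requests. The sketch below builds such a request with only the standard library; the argument names (`search_query`, `search_type`) and the `/mcp` path are assumptions taken from this guide — confirm the exact tool schema with a `tools/list` request, and note that a real client must complete the MCP initialize handshake first (an MCP client library handles that for you):

```python
# Sketch of the JSON-RPC 2.0 request shape for an MCP tools/call.
# Posting it directly only works after the MCP initialize handshake,
# which a proper MCP client performs on your behalf.
import json
import urllib.request


def build_tool_call(name: str, arguments: dict, request_id: int = 1) -> bytes:
    """Serialize an MCP tools/call request body."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }).encode()


def post_tool_call(body: bytes, url: str = "http://127.0.0.1:8000/mcp") -> bytes:
    """POST the request to the HTTP transport (server must be running)."""
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


# Example request body for the search tool (argument names assumed):
search_body = build_tool_call(
    "search",
    {"search_query": "What do we know about X?",
     "search_type": "GRAPH_COMPLETION"},
)
```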

Data Management Examples

# List all available datasets and data items
list_data()

# List data items in a specific dataset
list_data(dataset_id="your-dataset-id-here")

# Soft delete (safer, preserves shared entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="soft")

# Hard delete (removes orphaned entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="hard")

Setting Up with Cursor IDE

  1. Create a run script for cognee (save as run-cognee.sh):

    #!/bin/bash
    export ENV=local
    export TOKENIZERS_PARALLELISM=false
    export EMBEDDING_PROVIDER="fastembed"
    export EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
    export EMBEDDING_DIMENSIONS=384
    export EMBEDDING_MAX_TOKENS=256
    export LLM_API_KEY=your-OpenAI-API-key
    uv --directory /{cognee_root_path}/cognee-mcp run cognee
    
  2. Install Cursor IDE and navigate to Settings → MCP Tools → New MCP Server

  3. In the mcp.json file, configure your server:

    {
      "mcpServers": {
        "cognee": {
          "command": "sh",
          "args": [
            "/{path-to-your-script}/run-cognee.sh"
          ]
        }
      }
    }
    
  4. Refresh the server from the toggle next to your new cognee server. Check for the green dot to verify it's running.

  5. Open Cursor Agent and start using cognee tools via prompting.

Advanced Configuration

For more detailed configuration options:

  1. Create a full .env file using the template available at env.template

  2. Visit the documentation for information on using different LLM providers and database configurations.
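
As a starting point, a fuller .env might combine the keys already used in this guide — treat this as an illustrative sketch and defer to env.template for the authoritative list of settings:

```
ENV=local
TOKENIZERS_PARALLELISM=false
LLM_API_KEY="YOUR_OPENAI_API_KEY"
EMBEDDING_PROVIDER="fastembed"
EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"
EMBEDDING_DIMENSIONS=384
EMBEDDING_MAX_TOKENS=256
```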

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "cognee" '{"command":"uv","args":["--directory","/Users/{user}/cognee/cognee-mcp","run","cognee"],"env":{"ENV":"local","TOKENIZERS_PARALLELISM":"false","LLM_API_KEY":"sk-"}}'

After adding the server, you can verify it is registered with claude mcp list. See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button the ~/.cursor/mcp.json file will be opened and you can add your server like this:

{
    "mcpServers": {
        "cognee": {
            "command": "uv",
            "args": [
                "--directory",
                "/Users/{user}/cognee/cognee-mcp",
                "run",
                "cognee"
            ],
            "env": {
                "ENV": "local",
                "TOKENIZERS_PARALLELISM": "false",
                "LLM_API_KEY": "sk-"
            }
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server exposes and will call them when it needs to.

You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "cognee": {
            "command": "uv",
            "args": [
                "--directory",
                "/Users/{user}/cognee/cognee-mcp",
                "run",
                "cognee"
            ],
            "env": {
                "ENV": "local",
                "TOKENIZERS_PARALLELISM": "false",
                "LLM_API_KEY": "sk-"
            }
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
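
If you prefer to script the edit, this standalone sketch (not part of cognee or Claude) merges the entry above into an existing claude_desktop_config.json without disturbing other configured servers; the cognee-mcp directory path is whatever you used locally:

```python
# Standalone helper: add the cognee entry to claude_desktop_config.json,
# preserving any other mcpServers already configured.
import json
import pathlib


def add_cognee_server(config_path: str, cognee_mcp_dir: str,
                      api_key: str = "sk-") -> None:
    """Insert/overwrite the "cognee" server entry in the config file."""
    path = pathlib.Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["cognee"] = {
        "command": "uv",
        "args": ["--directory", cognee_mcp_dir, "run", "cognee"],
        "env": {
            "ENV": "local",
            "TOKENIZERS_PARALLELISM": "false",
            "LLM_API_KEY": api_key,
        },
    }
    path.write_text(json.dumps(config, indent=2))
```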
