MemGPT MCP server

Implements a multi-provider memory system for LLMs, enabling conversation history management and seamless model switching across OpenAI, Anthropic, OpenRouter, and Ollama.
Provider
Vic563
Release date
Jan 04, 2025
Language
TypeScript
Stats
25 stars

This MCP server implements a memory system for LLMs, allowing you to chat with various AI providers while maintaining conversation history across interactions. It supports multiple models, including Claude, GPT-4, and locally hosted Ollama models.

Installation

Prerequisites

Before installing, ensure you have:

  • Node.js and npm installed
  • API keys for any providers you want to use (OpenAI, Anthropic, OpenRouter)

Setup for Claude Desktop

To use this server with Claude Desktop, add the server configuration to the Claude Desktop config file:

On macOS:

~/Library/Application Support/Claude/claude_desktop_config.json

On Windows:

%APPDATA%\Claude\claude_desktop_config.json

Add the following configuration to the file:

{
  "mcpServers": {
    "letta-memgpt": {
      "command": "/path/to/memgpt-server/build/index.js",
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}

Environment Variables

Set the following API keys in the `env` block of the configuration; you only need keys for the providers you plan to use:

  • OPENAI_API_KEY - Your OpenAI API key
  • ANTHROPIC_API_KEY - Your Anthropic API key
  • OPENROUTER_API_KEY - Your OpenRouter API key

Using the MCP Server

Available Tools

Chat Tool

Send messages to the current LLM provider:

{
  "name": "chat",
  "parameters": {
    "message": "Your message here"
  }
}
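Under the hood, an MCP tool invocation is a JSON-RPC 2.0 `tools/call` request; MCP clients such as Claude Desktop and Cursor construct and frame these messages for you. As a rough sketch, the chat call above corresponds to a request like this (the `makeChatCall` helper is illustrative, not part of the server):

```typescript
// Sketch: the JSON-RPC 2.0 message an MCP client sends for the chat tool.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

// Hypothetical helper that builds the request for the chat tool shown above.
function makeChatCall(message: string, id = 1): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: "chat", arguments: { message } },
  };
}

const req = makeChatCall("Your message here");
console.log(JSON.stringify(req));
```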

Memory Management

Retrieve conversation history:

{
  "name": "get_memory",
  "parameters": {
    "limit": 10
  }
}
  • Set limit to the number of memories to retrieve
  • Use "limit": null for unlimited memory retrieval
  • Returns memories in chronological order with timestamps
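The server's internals aren't shown on this page, but the documented `get_memory` semantics (chronological order, numeric `limit`, `null` for unlimited) can be sketched with a simple in-memory store. The `MemoryEntry` shape and `MemoryStore` class below are illustrative assumptions, not the server's actual implementation:

```typescript
// Illustrative sketch of the documented memory semantics -- not the server's code.
interface MemoryEntry {
  role: "user" | "assistant";
  content: string;
  timestamp: string; // ISO-8601, so entries sort chronologically
}

class MemoryStore {
  private entries: MemoryEntry[] = [];

  add(role: MemoryEntry["role"], content: string): void {
    this.entries.push({ role, content, timestamp: new Date().toISOString() });
  }

  // limit: number of most recent entries to return; null means return everything.
  get(limit: number | null = null): MemoryEntry[] {
    const all = [...this.entries]; // insertion order is chronological order
    return limit === null ? all : all.slice(-limit);
  }

  clear(): void {
    this.entries = [];
  }
}
```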

Clear all conversation history:

{
  "name": "clear_memory",
  "parameters": {}
}

Provider and Model Selection

Switch between LLM providers:

{
  "name": "use_provider",
  "parameters": {
    "provider": "anthropic"
  }
}

Supported providers:

  • openai
  • anthropic
  • openrouter
  • ollama

Select a specific model for the current provider:

{
  "name": "use_model",
  "parameters": {
    "model": "claude-3-sonnet"
  }
}

Supported Models

Anthropic Claude Models

  • Claude 3 Series:
    • claude-3-haiku: Fast responses, ideal for customer support
    • claude-3-sonnet: Balanced performance for general use
    • claude-3-opus: Advanced reasoning for complex tasks
  • Claude 3.5 Series:
    • claude-3.5-haiku: Enhanced speed and cost-effectiveness
    • claude-3.5-sonnet: Superior performance with computer interaction

OpenAI Models

  • gpt-4o
  • gpt-4o-mini
  • gpt-4-turbo

OpenRouter

Any model in 'provider/model' format (e.g., 'openai/gpt-4', 'anthropic/claude-2')
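The 'provider/model' format splits into two non-empty parts at the first slash. A hypothetical helper (not part of the server's API) that enforces this shape:

```typescript
// Hypothetical helper: split an OpenRouter model ID into its two parts.
function parseOpenRouterModel(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash <= 0 || slash === id.length - 1) {
    throw new Error(`Expected 'provider/model' format, got: ${id}`);
  }
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}
```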

Ollama

Any locally available model (e.g., 'llama2', 'codellama')

Debugging

Since MCP servers communicate over stdio, you can use the MCP Inspector for debugging:

npm run inspector

This will provide a URL to access debugging tools in your browser.

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "letta-memgpt" '{"command":"/path/to/memgpt-server/build/index.js","env":{"OPENAI_API_KEY":"your-openai-key","ANTHROPIC_API_KEY":"your-anthropic-key","OPENROUTER_API_KEY":"your-openrouter-key"}}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button, the ~/.cursor/mcp.json file will open and you can add your server like this:

{
    "mcpServers": {
        "letta-memgpt": {
            "command": "/path/to/memgpt-server/build/index.js",
            "env": {
                "OPENAI_API_KEY": "your-openai-key",
                "ANTHROPIC_API_KEY": "your-anthropic-key",
                "OPENROUTER_API_KEY": "your-openrouter-key"
            }
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the MCP server provides and will call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning its name and describing what you want it to do.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "letta-memgpt": {
            "command": "/path/to/memgpt-server/build/index.js",
            "env": {
                "OPENAI_API_KEY": "your-openai-key",
                "ANTHROPIC_API_KEY": "your-anthropic-key",
                "OPENROUTER_API_KEY": "your-openrouter-key"
            }
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
