Provides a persistent memory layer and structured session context for MCP clients, enabling cross-session recall and semantic search across AI tools.
Configuration
```json
{
  "mcpServers": {
    "lyellr88-marm-systems": {
      "url": "http://localhost:8001/mcp"
    }
  }
}
```

You can connect your AI agents to the MARM MCP Server to give them persistent memory, cross-session recall, and structured session data. This memory layer sits beneath your AI tools, enabling semantic search, auto-classification, and shared memories across tools and sessions so your agents stay contextually aware and productive.
Connect your MCP client to the MARM MCP Server using either HTTP or STDIO transport. The HTTP method exposes a remote endpoint you can call over the network, while STDIO runs as a local process that communicates via standard input and output. Once connected, your agents can store memories, logs, notebooks, and session context in a centralized memory store and retrieve them by meaning across sessions and tools.
Typical usage patterns include creating a session for a project, sending logs and decisions as you work, and asking the memory system to surface relevant past notes or summaries when you face a similar task. You can also use semantic search to find related discussions, code snippets, or decisions across multiple tools and sessions.
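Semantic search of this kind generally works by comparing embedding vectors: memories and queries are mapped to vectors, and "related" means high cosine similarity. A toy sketch of the underlying idea, with hand-made three-dimensional vectors standing in for real embedding-model output (this illustrates the concept, not MARM's actual implementation):

```python
import math

# Toy illustration of semantic recall: each memory has an embedding
# vector, and the best match for a query is the memory whose vector
# has the highest cosine similarity to the query's vector.
def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hand-made vectors; a real system would get these from an embedding model.
memories = {
    "Chose JWT for auth tokens": [0.9, 0.1, 0.0],
    "Picked PostgreSQL for storage": [0.1, 0.9, 0.1],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "auth token decision"

best = max(memories, key=lambda text: cosine(memories[text], query))
print(best)  # the auth-related memory scores highest
```

The same principle lets recall work across tools and sessions: matches are found by meaning, not by exact keywords.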
Prerequisites: Docker for the container-based setup, or Python (with pip) for the Python-based setups.
```shell
docker pull lyellr88/marm-mcp-server:latest
docker run -d --name marm-mcp-server -p 8001:8001 -v ~/.marm:/home/marm/.marm lyellr88/marm-mcp-server:latest
```
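If you prefer Compose over a raw `docker run`, the same image, port, and volume can be captured in a `docker-compose.yml` sketch (the restart policy is my addition, not from the project docs):

```yaml
# docker-compose.yml -- mirrors the docker run command above
services:
  marm-mcp-server:
    image: lyellr88/marm-mcp-server:latest
    container_name: marm-mcp-server
    ports:
      - "8001:8001"
    volumes:
      - ~/.marm:/home/marm/.marm
    restart: unless-stopped
```

Start it with `docker compose up -d`.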
```shell
claude mcp add --transport http marm-memory http://localhost:8001/mcp
```

If you prefer a local HTTP install without Docker, run these commands to install and start the server, then connect your MCP client to the HTTP endpoint.
```shell
pip install marm-mcp-server==2.2.6
pip install -r marm-mcp-server/requirements.txt
python marm-mcp-server
claude mcp add --transport http marm-memory http://localhost:8001/mcp
```

For STDIO transport, you run the server as a local process and connect your MCP client to the STDIO stream. The final run command below is the starting point for STDIO deployments.
```shell
pip install marm-mcp-server==2.2.6
pip install -r marm-mcp-server/requirements_stdio.txt
<platform> mcp add --transport stdio marm-memory-stdio python "your/file/path/to/marm-mcp-server/server_stdio.py"
python marm-mcp-server/server_stdio.py
```

If you want to configure authentication or customize how the server is exposed, you can provide a manual HTTP configuration or rely on the STDIO setup, where you specify how the client launches the server. The STDIO configuration example below shows how to run the server with a specific Python script and path.
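To make the STDIO mechanics concrete: MCP's stdio transport exchanges newline-delimited JSON-RPC messages over the child process's stdin and stdout. The sketch below shows how a client would launch the server and send a first message; the handshake fields are illustrative, and the script path is a placeholder you would point at your actual install.

```python
import json
import pathlib
import subprocess

# Sketch: launching an MCP STDIO server and sending one JSON-RPC
# message per line over its stdin. Handshake fields are illustrative;
# a real client follows the full MCP initialize sequence.
cmd = ["python", "marm-mcp-server/server_stdio.py"]  # placeholder path
initialize = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "clientInfo": {"name": "example-client", "version": "0.1"},
        "capabilities": {},
    },
})

if pathlib.Path(cmd[1]).exists():
    proc = subprocess.Popen(
        cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
    )
    proc.stdin.write(initialize + "\n")  # one JSON-RPC message per line
    proc.stdin.flush()
    print(proc.stdout.readline())        # server's response line
    proc.terminate()
else:
    print("server_stdio.py not found; message that would be sent:")
    print(initialize)
```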
<!-- STDIO JSON configuration example used for IDEs and certain MCP clients -->
```json
{
  "mcpServers": {
    "marm-memory-stdio": {
      "command": "python",
      "args": ["marm-mcp-server/server_stdio.py"],
      "cwd": "/path/to/marm-mcp-server"
    }
  }
}
```

MARM exposes a set of memory and session tools:

- AI-powered semantic similarity search across all memories, with optional global search.
- Intelligent auto-classifying memory storage using vector embeddings.
- Activate MARM's intelligent memory and response-accuracy layers.
- Refresh AI agent session state and reaffirm protocol adherence.
- Create or switch to named memory session containers.
- Add structured log entries with automatic dating.
- Display all memory entries and sessions, with filters.
- Delete specified sessions or individual entries.
- Generate context-aware summaries with intelligent truncation.
- Smart context bridging for seamless workflow transitions.
- Add new notebook entries with semantic embeddings.
- Activate notebook entries as instructions.
- Display all saved notebook keys and summaries.
- Delete specific notebook entries.
- Clear the active instruction list.
- Show the current active instruction list.
- Background tool providing the current date and time for log entries.
- Comprehensive system information, health status, and loaded docs.
- Reload documentation into memory for fast access.
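Each of the capabilities above is reached through the standard MCP `tools/call` method. A minimal sketch of the request body follows; the `notebook_add` name and its argument shape are hypothetical, so discover the real tool names and schemas via a `tools/list` request first.

```python
import json

# Sketch: building an MCP "tools/call" JSON-RPC request body.
# "notebook_add" and its arguments are hypothetical placeholders.
def build_tool_call(request_id, name, arguments):
    """Return a JSON-RPC 2.0 tools/call request as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

request = build_tool_call(7, "notebook_add", {
    "key": "deploy-checklist",
    "value": "1. run tests  2. bump version  3. tag release",
})
print(request)
```

An MCP client POSTs a body like this to the HTTP endpoint (or writes it as one line to the STDIO server's stdin) and reads back a JSON-RPC result.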