
ACE MCP Server

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "angry-robot-deals-ace-mcp": {
      "command": "node",
      "args": [
        "/absolute/path/to/ace-mcp-server/dist/index.js"
      ],
      "env": {
        "GOOGLE_MODEL": "gemini-1.5-pro",
        "LLM_PROVIDER": "deepseek",
        "OPENAI_MODEL": "gpt-4o",
        "ACE_LOG_LEVEL": "info",
        "MISTRAL_MODEL": "mistral-large-latest",
        "DEEPSEEK_MODEL": "deepseek-chat",
        "GOOGLE_API_KEY": "your-google-api-key",
        "LMSTUDIO_MODEL": "local-model",
        "OPENAI_API_KEY": "sk-your-openai-api-key",
        "ACE_CONTEXT_DIR": "./contexts",
        "ANTHROPIC_MODEL": "claude-3-5-sonnet-20241022",
        "MISTRAL_API_KEY": "your-mistral-api-key",
        "API_BEARER_TOKEN": "YOUR_SECURE_TOKEN",
        "DEEPSEEK_API_KEY": "sk-your-deepseek-api-key",
        "ANTHROPIC_API_KEY": "sk-ant-your-api-key",
        "LMSTUDIO_BASE_URL": "http://localhost:1234/v1",
        "OPENAI_EMBEDDING_MODEL": "text-embedding-3-small",
        "DEEPSEEK_EMBEDDING_MODEL": "deepseek-embedding"
      }
    }
  }
}

ACE MCP Server is an intelligent development assistant that learns from your coding patterns and automatically enhances your development workflow. It integrates with Cursor AI through the Model Context Protocol (MCP) to provide contextual code generation, deep analysis, and self-improving recommendations, helping you work faster and smarter.

How to use

You connect an MCP client to the ACE MCP Server to access its four core tools: code generation with ACE insights, code analysis with improvement suggestions, context-aware help for a given domain, and prompt enhancement backed by your accumulated playbook.
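Under the hood, an MCP client invokes each of these tools by sending a JSON-RPC 2.0 `tools/call` request over the server's stdio transport. A minimal sketch of that request shape is below; the argument key `task` is hypothetical, since the server's exact tool input schemas are not listed here (a client would discover them via `tools/list`).

```typescript
// Shape of an MCP "tools/call" request (JSON-RPC 2.0 over stdio).
type ToolCallRequest = {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
};

// Build a tools/call request for a named tool with arbitrary arguments.
function buildToolCall(
  name: string,
  args: Record<string, unknown>,
  id = 1
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Example: invoke the code-generation tool.
// NOTE: "task" is an illustrative argument key, not the server's documented schema.
const req = buildToolCall("ace_smart_generate", {
  task: "write an Express route with input validation",
});
console.log(JSON.stringify(req));
```

In practice your MCP client (Cursor AI, Claude Desktop, etc.) constructs and sends these requests for you; the sketch only shows what travels over the wire.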

How to install

# 1. Ensure prerequisites
node -v
docker -v
npm -v

# 2. Install dependencies and build (if starting from source)
npm install
npm run build

# 3. Start the MCP server over stdio
# (your MCP client launches it via the "node dist/index.js" command
#  given in the configuration above)
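If you prefer not to embed secrets in the client configuration, the same variables can live in your shell environment or a local `.env` file during development. Whether the server loads a `.env` file natively is not documented here, so assume you load it yourself (e.g. with `dotenv`); the fragment below trims the full configuration down to a single provider:

```
LLM_PROVIDER=deepseek
DEEPSEEK_API_KEY=sk-your-deepseek-api-key
DEEPSEEK_MODEL=deepseek-chat
ACE_LOG_LEVEL=info
ACE_CONTEXT_DIR=./contexts
```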

Additional configuration and usage notes

ACE MCP Server can run in several environments. The primary method shown above launches the server as a local Node.js process and uses environment variables to select the LLM provider and supply API keys and access tokens. You can wire the server into MCP clients such as Cursor AI, Claude Desktop, or Windsurf, or run it as a standalone process.
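For example, Claude Desktop reads the same `mcpServers` structure from its `claude_desktop_config.json` (on macOS, under `~/Library/Application Support/Claude/`). A minimal entry, trimmed to one provider for brevity:

```json
{
  "mcpServers": {
    "angry-robot-deals-ace-mcp": {
      "command": "node",
      "args": ["/absolute/path/to/ace-mcp-server/dist/index.js"],
      "env": {
        "LLM_PROVIDER": "deepseek",
        "DEEPSEEK_API_KEY": "sk-your-deepseek-api-key"
      }
    }
  }
}
```

Restart the client after editing its config so it respawns the server process with the new environment.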

Available tools

ace_smart_generate

Automatically generate enhanced code with ACE insights using accumulated playbook knowledge and context.

ace_smart_reflect

Automatically analyze code and suggest improvements with actionable guidance.

ace_context_aware

Provide contextually relevant suggestions for a given domain such as web, API, database, frontend, or backend.

ace_enhance_prompt

Enhance prompts using accumulated playbook knowledge to focus on security, reliability, and clarity.