DARP Engine MCP server

Enables discovery and intelligent routing of user requests to the most appropriate MCP servers through metadata-based search, so you don't need to know the specific server connections in advance.
Provider: DARP AI
Release date: Apr 10, 2025
Language: Python
Stats: 9 stars

DARPEngine is a metadata storage and search system for MCP (Model Context Protocol) servers. It lets you index, search, and route requests to appropriate MCP servers based on their capabilities, combining smart search with a routing mechanism that connects users to the right tools for their needs.

Installation

To install and run DARPEngine:

export OPENAI_API_KEY=sk-...
docker network create highkey_network
docker compose build
docker compose -f docker-compose.yaml -f docker-compose-debug.yaml up --build --wait

Getting Started

Connecting via MCP Client

You can connect DARPEngine to an MCP Client (such as Claude Desktop or Cursor) using the provided MCP tools:

  1. In your MCP client, select SSE mode
  2. Specify http://localhost:4689/sse as the endpoint
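
If you prefer to connect programmatically, the same endpoint can be reached with the official MCP Python SDK (the mcp package). The sketch below assumes the engine is running locally on the default port shown above; it simply lists the tools the engine exposes.

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to the DARPEngine SSE endpoint (assumes the default local setup above).
    async with sse_client("http://localhost:4689/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the tools the engine exposes to MCP clients.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())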

Using the CLI

Alternatively, you can use the provided command-line interface. For basic scripts, you only need standard Python libraries, but the routing tool requires additional packages:

conda create -n darp 'python>=3.10'
conda activate darp
pip install -r mcp_server/requirements.txt

Adding MCP Servers

To add MCP servers to the engine:

python scripts/darp-add.py --url http://memelabs.ai:3006/sse --name code_analysis --description "Analyze gitlab repo for quality, topics, packages use"
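
If you have several servers to register, you can wrap the same script in a short loop. The sketch below only uses the --url, --name, and --description flags shown above; the entries in the list are placeholders for your own servers.

import subprocess

# Placeholder entries; replace with the MCP servers you want to index.
servers = [
    {
        "url": "http://memelabs.ai:3006/sse",
        "name": "code_analysis",
        "description": "Analyze gitlab repo for quality, topics, packages use",
    },
]

for server in servers:
    # Invokes the same darp-add.py script shown above for each entry.
    subprocess.run(
        [
            "python", "scripts/darp-add.py",
            "--url", server["url"],
            "--name", server["name"],
            "--description", server["description"],
        ],
        check=True,
    )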

Searching for Servers

To search for MCP servers that match your query:

python scripts/darp-search.py "Analyze https://github.com/BenderV/autochat"

Example output:

Found 1 servers:
code_analysis

Using the Router

The routing tool automatically directs your query to appropriate MCP servers and returns results:

python scripts/darp-router.py "Analyze https://github.com/BenderV/autochat"

This will return a detailed analysis of the specified repository using the available MCP tools. The output includes:

  • Code quality assessment
  • Code structure analysis
  • Functionality overview
  • Library usage details
  • Summary of the project

Features

  • Simple CLI: Easy-to-use command-line interface for adding servers and searching
  • API Search Access: Search for MCP servers programmatically
  • MCP Tool: Retrieve search results for connecting manually to MCP servers
  • Smart Routing: Answer questions by automatically using the most appropriate tools based on user requests
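
As a sketch of the programmatic search access, you can also query the engine over the same SSE connection used earlier. The tool name ("search") and its "query" argument below are assumptions for illustration only; check them against the tool list returned by your running engine.

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def search_servers(query: str):
    async with sse_client("http://localhost:4689/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "search" and the "query" argument are assumed names for illustration;
            # verify them against session.list_tools() on your running engine.
            result = await session.call_tool("search", {"query": query})
            print(result)

asyncio.run(search_servers("Analyze https://github.com/BenderV/autochat"))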

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the ~/.cursor/mcp.json file will open and you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.
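
For this particular server, which is reached over SSE rather than launched with a command, the entry would presumably point at the engine's URL instead. The server name below is arbitrary, and the port assumes the default local setup described earlier:

{
    "mcpServers": {
        "darp-engine": {
            "url": "http://localhost:4689/sse"
        }
    }
}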

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server provides and will call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
