Deep Research MCP server

Integrates search engines, web scraping, and Gemini models to perform iterative, deep research on any topic, generating insights and detailed reports with source citations.

Provider: ssdeanx
Release date: Feb 21, 2025
Language: TypeScript
Stats: 38 stars

This MCP server implementation provides a powerful AI-powered research assistant that leverages Gemini LLMs and web scraping to conduct deep, iterative research on any topic. The server integrates with the Model Context Protocol for seamless incorporation into AI agent ecosystems.

Installation

Prerequisites

Before setting up the MCP server, ensure you have:

  • Node.js v22.x installed
  • API keys for Gemini (GEMINI_API_KEY) and Firecrawl (FIRECRAWL_KEY)

Setup Process

  1. Clone the repository:

    git clone [your-repo-link-here]
    
  2. Install dependencies:

    npm install
    
  3. Configure environment variables:

    Create a .env.local file in the project root with your API keys:

    GEMINI_API_KEY="your_gemini_key"
    FIRECRAWL_KEY="your_firecrawl_key"
    # Optional: For self-hosted Firecrawl
    # FIRECRAWL_BASE_URL=http://localhost:3002
    
  4. Build the project:

    npm run build
    

Using the MCP Server

Starting the Server

To launch the MCP server, run:

node --env-file .env.local dist/mcp-server.js

Integrating with MCP-Compatible Tools

You can invoke the deep-research tool from any MCP-compatible agent with these parameters:

  • query (string, required): Your research question
  • depth (number, optional, 1-5): How deep the research should go (default: moderate)
  • breadth (number, optional, 1-5): How wide the research should be (default: moderate)
  • existingLearnings (string[], optional): Previous findings to guide research

Example Integration (TypeScript)

const mcp = new ModelContextProtocolClient(); // Assuming MCP client is initialized

async function invokeDeepResearchTool() {
  try {
    const result = await mcp.invoke("deep-research", {
      query: "Explain the principles of blockchain technology",
      depth: 2,
      breadth: 4
    });

    if (result.isError) {
      console.error("MCP Tool Error:", result.content[0].text);
    } else {
      console.log("Research Report:\n", result.content[0].text);
      console.log("Sources:\n", result.metadata.sources);
    }
  } catch (error) {
    console.error("MCP Invoke Error:", error);
  }
}

invokeDeepResearchTool();

Alternative Usage Methods

Command Line Interface

For standalone usage without MCP integration:

npm run start "your research query"

Example:

npm run start "what are latest developments in ai research agents"

MCP Inspector Testing

For interactive testing and debugging:

npx @modelcontextprotocol/inspector node --env-file .env.local dist/mcp-server.js

Research Features and Capabilities

Research Process

The deep-research tool follows an iterative process:

  1. Generate targeted search queries based on your research question
  2. Fetch results using Firecrawl's web searching capabilities
  3. Process and analyze content using Gemini LLMs
  4. Extract key learnings and identify new research directions
  5. Recursively explore deeper based on depth parameter
  6. Generate a comprehensive markdown report
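
Conceptually, the loop looks something like the sketch below. The SearchAndAnalyze callback and every name in it are illustrative assumptions used to show the shape of the recursion, not the server's actual internals.

// Illustrative sketch of the iterative research loop described above.
// The SearchAndAnalyze callback (search + LLM analysis for one query) is a
// hypothetical stand-in for the server's Firecrawl/Gemini pipeline.
interface ResearchStep {
  learnings: string[];       // key facts extracted in this round
  followUpQueries: string[]; // new directions worth exploring
  sources: string[];         // URLs consulted
}

type SearchAndAnalyze = (query: string, priorLearnings: string[]) => Promise<ResearchStep>;

async function deepResearch(
  query: string,
  depth: number,
  breadth: number,
  step: SearchAndAnalyze,
  learnings: string[] = [],
  sources: string[] = []
): Promise<{ learnings: string[]; sources: string[] }> {
  // One round: search, analyze with the LLM, and collect learnings and sources
  const result = await step(query, learnings);
  learnings.push(...result.learnings);
  sources.push(...result.sources);

  // Recurse on up to `breadth` follow-up questions until depth is exhausted
  if (depth > 1) {
    for (const next of result.followUpQueries.slice(0, breadth)) {
      await deepResearch(next, depth - 1, breadth, step, learnings, sources);
    }
  }
  return { learnings, sources };
}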

Research Validation

The system incorporates robust validation:

  • Minimum input requirements (10+ characters, 3+ words)
  • Citation density metrics (1.5+ per 100 words)
  • Recent sources verification (3+ post-2019 references)
  • Conflict disclosure enforcement
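
A minimal sketch of what such checks can look like, using the thresholds listed above; the function names and the [n]-style citation pattern are assumptions for illustration, not the server's actual code:

// Hedged sketch of the validation rules listed above (names are hypothetical).
function validateQuery(query: string): string[] {
  const errors: string[] = [];
  if (query.trim().length < 10) errors.push("Query must be at least 10 characters");
  if (query.trim().split(/\s+/).length < 3) errors.push("Query must contain at least 3 words");
  return errors;
}

function checkReportQuality(report: string, sourceYears: number[]): string[] {
  const warnings: string[] = [];
  const words = report.split(/\s+/).length;
  // Assumes [1]-style inline citations; the real report format may differ
  const citations = (report.match(/\[\d+\]/g) ?? []).length;
  if (citations < (words / 100) * 1.5) warnings.push("Citation density below 1.5 per 100 words");
  if (sourceYears.filter((y) => y > 2019).length < 3) warnings.push("Fewer than 3 post-2019 sources");
  return warnings;
}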

Performance Optimizations

The latest version includes significant improvements:

  • Concurrent processing pipeline for faster results
  • Semantic text splitting for better LLM context management
  • 30% faster research cycles
  • 25% more efficient token usage
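
To illustrate the semantic splitting idea, here is a rough paragraph-level splitter that keeps related text together under an approximate token budget. The 4-characters-per-token estimate and the splitting strategy are assumptions for illustration, not the server's implementation.

// Rough paragraph-level splitter: keep related text together while staying
// under an approximate token budget (assumes ~4 characters per token).
function splitSemantically(text: string, maxTokens = 2000): string[] {
  const approxTokens = (s: string) => Math.ceil(s.length / 4);
  const chunks: string[] = [];
  let current = "";

  for (const paragraph of text.split(/\n{2,}/)) {
    if (current && approxTokens(current + paragraph) > maxTokens) {
      chunks.push(current.trim());
      current = "";
    }
    current += paragraph + "\n\n";
  }
  if (current.trim()) chunks.push(current.trim());
  return chunks;
}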

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the ~/.cursor/mcp.json file will be opened and you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}
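
For this server specifically, an entry along the following lines should work once the project is cloned and built; the "deep-research" name and the paths are placeholders, so point them at your local checkout and .env.local file:

{
    "mcpServers": {
        "deep-research": {
            "command": "node",
            "args": [
                "--env-file",
                "/absolute/path/to/your-clone/.env.local",
                "/absolute/path/to/your-clone/dist/mcp-server.js"
            ]
        }
    }
}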

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server exposes and will call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
