OpenSearch MCP server

Integrates with OpenSearch to enable powerful full-text search, data aggregation, and real-time analytics capabilities for applications requiring advanced search and log analysis.
Provider
ibrooksSDX
Release date
Feb 18, 2025
Language
Python
Stats
3 stars

The mcp-server-opensearch provides a Model Context Protocol (MCP) server for OpenSearch, enabling AI applications to interact with OpenSearch as a semantic memory layer. This implementation allows LLMs to store and retrieve information from OpenSearch databases through a standardized protocol.
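Under the hood, MCP clients invoke server tools via JSON-RPC 2.0 messages. As a rough sketch of what a client such as Claude Desktop sends for this server's search-openSearch tool (the argument schema here is an assumption for illustration, not taken from the server's code):

```python
import json

# Sketch of an MCP "tools/call" JSON-RPC request. The "arguments"
# shape is an assumption; the actual schema is defined by the server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search-openSearch",
        "arguments": {
            "query": {"query": {"match": {"content": "meeting notes"}}}
        },
    },
}

# Requests are serialized to JSON before being written to the transport.
print(json.dumps(request, indent=2))
```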

Installation Options

Via Smithery

To automatically install mcp-server-opensearch for Claude Desktop using Smithery:

npx -y @smithery/cli install @ibrooksSDX/mcp-server-opensearch --client claude

Using uv (Recommended)

You can run mcp-server-opensearch directly with uv without prior installation:

uv run mcp-server-opensearch \
  --opensearch-url "http://localhost:9200" \
  --index-name "my_index"

Or alternatively:

uv run fastmcp run demo.py:main

Testing

Local OpenSearch Client Test

Test your OpenSearch client connection with:

uv run python src/mcp-server-opensearch/test_opensearch.py

MCP Server Connection Test

Test the MCP server connection to the OpenSearch client:

cd src/mcp-server-opensearch
uv run fastmcp dev demo.py

Configuration with Claude Desktop

To integrate with the Claude Desktop app, add the following to the "mcpServers" section of your claude_desktop_config.json:

{
  "opensearch": {
    "command": "uvx",
    "args": [
      "mcp-server-opensearch",
      "--opensearch-url",
      "http://localhost:9200",
      "--opensearch-api-key",
      "your_api_key",
      "--index-name",
      "your_index_name"
    ]
  },
  "Demo": {
    "command": "uv",
    "args": [
      "run",
      "--with",
      "fastmcp",
      "--with",
      "opensearch-py",
      "fastmcp",
      "run",
      "/Users/ibrooks/Documents/GitHub/mcp-server-opensearch/src/mcp-server-opensearch/demo.py"
    ]
  }
}

Alternatively, use the FastMCP UI to install the server to Claude:

uv run fastmcp install demo.py

Environment Variables

You can also configure the server using these environment variables:

  • OPENSEARCH_HOST: URL of the OpenSearch server (e.g., http://localhost)
  • OPENSEARCH_HOSTPORT: Port of the OpenSearch server (e.g., 9200)
  • INDEX_NAME: Name of the index to use in OpenSearch
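If the server reads these variables with standard library calls, the logic might look like this sketch (the variable names come from the list above; the defaults and the helper function are assumptions):

```python
import os

def opensearch_url() -> str:
    """Build the OpenSearch URL from the documented environment variables.

    Defaults mirror the examples above; they are assumptions, not the
    server's actual fallback behavior.
    """
    host = os.environ.get("OPENSEARCH_HOST", "http://localhost")
    port = os.environ.get("OPENSEARCH_HOSTPORT", "9200")
    return f"{host}:{port}"

# INDEX_NAME selects which OpenSearch index the server operates on.
index_name = os.environ.get("INDEX_NAME", "my_index")
```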

Available Tools

search-openSearch

This tool allows you to store memories in the OpenSearch database.

Input:

  • query (JSON): Prepared JSON query message

Returns: Confirmation message after the operation is complete
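A "prepared JSON query message" presumably follows the OpenSearch Query DSL. A minimal sketch of building one and serializing it for the tool (the field name content and the query shape are assumptions about how memories are indexed):

```python
import json

# Build an OpenSearch Query DSL body. The field name "content" is an
# assumption about the document schema in the target index.
query = {
    "size": 5,
    "query": {"match": {"content": "project deadlines"}},
}

# The tool takes the query as JSON, so serialize it before sending.
query_json = json.dumps(query)
```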

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the ~/.cursor/mcp.json file opens, and you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the added MCP server provides and call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
