
Bigeye MCP Server

Provides tools to query data quality, analyze lineage, and track AI agent data access within the Bigeye platform.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "bigeyedata-bigeye-mcp-server": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "BIGEYE_API_KEY=your_api_key_here",
        "-e",
        "BIGEYE_API_URL=https://your-instance.bigeye.com",
        "-e",
        "BIGEYE_WORKSPACE_ID=your_workspace_id_here",
        "-e",
        "BIGEYE_DEBUG=false",
        "bigeye-mcp-server:latest"
      ],
      "env": {
        "BIGEYE_DEBUG": "false",
        "BIGEYE_API_KEY": "your_api_key_here",
        "BIGEYE_API_URL": "https://your-instance.bigeye.com",
        "BIGEYE_WORKSPACE_ID": "your_workspace_id_here"
      }
    }
  }
}

The Bigeye MCP Server lets you interact with the Bigeye Data Observability platform through a configurable, ephemeral MCP service that Claude Desktop starts on demand. The server exposes tools for managing data quality, lineage, AI agent data access, and incidents, so you can programmatically query, analyze, and act on data quality and lineage information without exposing credentials in chat.

How to use

Start Claude Desktop and ensure the Bigeye MCP Server is configured in your Claude Desktop setup. The MCP server runs as an ephemeral container and starts automatically when you begin using the Bigeye tools. It stops when you are done, and a fresh instance is used for each session. Use the available tools to fetch data quality issues, analyze table quality, trace data lineage, track agent data access, and manage incidents.

How to install

Prerequisites: Docker installed on your machine and Claude Desktop configured to launch MCP servers. Alternatively, you can skip Docker and run the server directly from a local Python development setup.

# Quick start: build a local Docker-based MCP server (example)
# 1) Build the Docker image locally
# Note: replace the repository path with the actual image you build

git clone https://github.com/your-org/bigeye-mcp-server.git
cd bigeye-mcp-server

docker build -t bigeye-mcp-server:latest .

# 2) Configure Claude Desktop to use the Docker-based MCP server (see Configuration section)
# Claude Desktop will spin up the container as needed
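Claude Desktop launches the container using the command and args from the Configuration section. As an illustration only (this helper is not part of the official tooling), here is a sketch of how the same `docker run` invocation can be assembled from those settings:

```python
# Sketch: assemble the `docker run` argv that Claude Desktop effectively
# executes for this MCP server. Env var names match the Configuration
# section; the image tag is the one built in the quick start above.

def build_docker_args(env: dict, image: str = "bigeye-mcp-server:latest") -> list:
    """Return the argv for an ephemeral, stdio-attached MCP container."""
    args = ["docker", "run", "-i", "--rm"]  # -i keeps stdin open for the MCP stdio transport
    for key, value in env.items():
        args += ["-e", f"{key}={value}"]    # pass each credential as an env var
    args.append(image)
    return args

argv = build_docker_args({
    "BIGEYE_API_KEY": "your_api_key_here",
    "BIGEYE_API_URL": "https://your-instance.bigeye.com",
    "BIGEYE_WORKSPACE_ID": "your_workspace_id_here",
    "BIGEYE_DEBUG": "false",
})
```

The `--rm` flag is what makes each session disposable: the container is deleted as soon as the MCP session ends.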

Additional configuration and usage notes

Credentials are required to run the MCP server. Store them securely in your Claude Desktop configuration and do not paste API keys into chat interfaces. The server relies on three credentials: an API key, the API URL, and a workspace ID, plus an optional debug flag you can enable for troubleshooting.

Environment variables you will provide in your configuration include BIGEYE_API_KEY, BIGEYE_API_URL, BIGEYE_WORKSPACE_ID, and BIGEYE_DEBUG. These are passed to the MCP server container to authorize and connect to your Bigeye instance.
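The server refuses to start without the required variables. A minimal sketch of that kind of startup check (the function name is illustrative, not the server's actual code):

```python
import os

# Required variables per the Configuration section; BIGEYE_DEBUG is optional.
REQUIRED = ("BIGEYE_API_KEY", "BIGEYE_API_URL", "BIGEYE_WORKSPACE_ID")

def missing_env(environ=os.environ) -> list:
    """Return the names of required Bigeye variables that are unset or empty."""
    return [name for name in REQUIRED if not environ.get(name)]

# With only the API key set, the other two show up as missing:
missing = missing_env({"BIGEYE_API_KEY": "abc123"})
```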

If you run into missing credentials or authentication issues, verify that the environment variables match exactly in your Claude Desktop config and that the values are correctly formatted without trailing slashes for the API URL.
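A trailing slash on the API URL is a common source of failed requests, since joined paths end up with a double slash. One defensive approach, sketched here for illustration (the endpoint path is hypothetical), is to normalize the value before use:

```python
def normalize_api_url(url: str) -> str:
    """Strip trailing slashes so path joins stay clean."""
    return url.rstrip("/")

base = normalize_api_url("https://your-instance.bigeye.com/")
# Hypothetical endpoint path, purely for illustration:
endpoint = f"{base}/api/v1/issues"
```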

Configuration and security

Configure the MCP server as an ephemeral Docker container that Claude Desktop starts on demand. Your credential values are supplied as environment variables in the container launch command.

Security best practices: never paste API keys into chat interfaces, rotate keys regularly, and store credentials in secure configuration files with restricted permissions.

Troubleshooting

If required environment variables are missing, the server prints detailed setup instructions at startup. Ensure the Claude Desktop config file includes BIGEYE_API_KEY, BIGEYE_API_URL, BIGEYE_WORKSPACE_ID, and BIGEYE_DEBUG as needed.

If authentication fails, confirm that your API key is valid, the workspace ID is correct, and the Bigeye URL is accurate (no trailing slash). Connectivity issues may require checking network access or firewall settings and enabling debug mode.

Notes

The server supports a broad set of tools for data quality, lineage, and agent data access management. It is designed to run as a disposable container, ensuring clean sessions and isolation between uses.

Available tools

get_issues

Fetch data quality issues with filtering by status, schema names, and pagination

get_table_issues

Get issues for a specific table

analyze_table_data_quality

Perform a comprehensive quality analysis for a table including metrics and issues

update_issue

Update issue status, priority, or add comments

merge_issues

Merge multiple issues into a single incident

unmerge_issues

Unmerge issues from incidents

get_issue_resolution_steps

Get AI-powered resolution suggestions

lineage_get_graph

Retrieve lineage graph for a data entity (upstream/downstream/bidirectional)

lineage_get_node

Get details for a specific lineage node

lineage_get_node_issues

Get all issues affecting a lineage node

lineage_analyze_upstream_causes

Trace upstream to identify root causes of data issues

lineage_analyze_downstream_impact

Analyze downstream impact of data issues

lineage_trace_issue_path

Complete lineage trace from root cause to impact

lineage_track_data_access

Track which tables/columns an AI agent accesses

lineage_commit_agent

Commit tracked access to Bigeye's lineage graph

lineage_get_tracking_status

View current tracking status

lineage_clear_tracked_assets

Clear tracking without committing

lineage_cleanup_agent_edges

Clean up old agent lineage edges

lineage_delete_node

Delete a custom lineage node
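Under the MCP protocol, each tool above is invoked via a JSON-RPC `tools/call` request sent over the container's stdio. A sketch of the message a client would build for `get_issues` (the argument names here are assumptions for illustration; consult the tool's actual schema):

```python
import json

def tool_call_request(name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 message for an MCP tools/call invocation."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Hypothetical filter arguments -- the real get_issues schema may differ.
msg = tool_call_request("get_issues", {"status": "open", "page_size": 25})
```

In practice your MCP client (here, Claude Desktop) constructs and sends these messages for you; this sketch only shows the shape of what travels over stdin.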