
Confluent MCP Server

An MCP server for managing Confluent Cloud resources, including Kafka topics, connectors, and Flink SQL statements, through natural language interactions.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "confluentinc-mcp-confluent": {
      "url": "http://localhost:8080/mcp",
      "headers": {
        "HTTP_HOST": "127.0.0.1",
        "HTTP_PORT": "8080",
        "LOG_LEVEL": "info",
        "MCP_API_KEY": "YOUR_API_KEY",
        "MCP_ALLOWED_HOSTS": "localhost,127.0.0.1",
        "MCP_AUTH_DISABLED": "false",
        "SSE_MCP_ENDPOINT_PATH": "/sse",
        "HTTP_MCP_ENDPOINT_PATH": "/mcp",
        "SSE_MCP_MESSAGE_ENDPOINT_PATH": "/messages"
      }
    }
  }
}

This MCP server enables AI assistants to interact with Confluent Cloud REST APIs, letting you manage Kafka topics, connectors, and Flink SQL statements through natural language. It supports multiple transports and can be embedded in desktop tools or run standalone for conversational control of Confluent Cloud resources.

How to use

Use this MCP server with a compatible MCP client, such as Claude Desktop or Goose CLI, to issue natural-language requests that are translated into Confluent Cloud REST API calls. Start the server, connect your client, and ask it to list topics, create topics, manage connectors, or run Flink SQL statements. You can enable or block specific tools to tailor the capabilities exposed to your client.

How to install

Prerequisites: Node.js installed (ideally managed with a tool such as NVM) and a working Confluent Cloud environment, with the necessary credentials available in an environment file.

Step 1: Prepare the environment file. Copy the example and fill in your Confluent Cloud and related service credentials.

Step 2: Install dependencies and run. You can run the server from source or use npx to execute it directly.
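The two steps above can be sketched as a short shell session. This is illustrative: the `.env.example` filename and the `npm` script names are assumptions about the project layout, and `@confluentinc/mcp-confluent` refers to the project's npm package.

```shell
# Step 1: copy the example environment file, then edit it
# with your Confluent Cloud credentials
cp .env.example .env

# Step 2a: install dependencies and run from source
# (script names are assumptions; check the project's package.json)
npm install
npm start -- -e .env

# Step 2b: or run directly with npx, no local checkout needed
npx -y @confluentinc/mcp-confluent -e /path/to/.env
```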

Configuration

The server supports HTTP and SSE transports with API key authentication by default. You configure environment variables in a .env file and provide optional host restrictions for DNS rebinding protection. The following sections summarize the key configuration areas and the variables you will typically set.
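A minimal `.env` sketch, combining the server variables shown in the configuration above with placeholder Confluent Cloud credentials. The credential variable names are illustrative; the exact set depends on which APIs (Kafka, Flink, Schema Registry) you use, so start from the project's example environment file.

```shell
# MCP server transport settings (from the configuration above)
HTTP_HOST=127.0.0.1
HTTP_PORT=8080
LOG_LEVEL=info
MCP_API_KEY=YOUR_API_KEY
MCP_ALLOWED_HOSTS=localhost,127.0.0.1

# Confluent Cloud credentials (illustrative names)
CONFLUENT_CLOUD_API_KEY=...
CONFLUENT_CLOUD_API_SECRET=...
KAFKA_API_KEY=...
KAFKA_API_SECRET=...
```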

Security and authentication notes

Authentication for HTTP/SSE transports is enabled by default to protect against unauthorized access. You can generate an API key, add it to your environment file, and pass it in requests using the cflt-mcp-api-Key header. For development, you may disable authentication temporarily with explicit warnings and safeguards.
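As a sketch, a request against the local HTTP endpoint would carry the API key in the `cflt-mcp-api-Key` header; the URL and port here are the defaults from the configuration shown earlier.

```shell
curl -H "cflt-mcp-api-Key: YOUR_API_KEY" \
     -H "Content-Type: application/json" \
     http://localhost:8080/mcp
```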

Usage patterns and tips

Install, configure, and run the server, then connect your MCP client and start issuing commands. Use the provided command-line options to deploy with multiple transports, enable or block tools, and manage how the server exposes Confluent Cloud capabilities. Remember the allow-list is applied before the block-list, and if neither is provided, all tools are enabled by default.
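A hedged sketch of a multi-transport deployment with tool filtering. The option names below are illustrative assumptions, not confirmed flags; run the CLI with `--help` to see the actual names for transports and the allow/block lists.

```shell
# Flag names are assumptions; verify with --help before use.
npx -y @confluentinc/mcp-confluent -e .env \
  --transport http sse \
  --allow-tools list-topics,create-topics,list-connectors \
  --block-tools delete-topics
```

Because the allow-list is applied before the block-list, the example above would expose only the three allowed tools, with `delete-topics` excluded regardless.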

Claude Desktop configuration guide

Configure Claude Desktop to connect to your local MCP server by editing the mcpServers section of Claude's configuration. You can run the MCP server from source or via npx, and you should point Claude to the local server URL.
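A sketch of the `mcpServers` entry for launching the server via npx: the `command`/`args` form is standard Claude Desktop configuration, and the env-file path is a placeholder to replace with your own.

```json
{
  "mcpServers": {
    "mcp-confluent": {
      "command": "npx",
      "args": ["-y", "@confluentinc/mcp-confluent", "-e", "/path/to/.env"]
    }
  }
}
```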

Goose CLI configuration guide

Install Goose CLI, run goose configure, and add an extension for the mcp-confluent server. Choose a method to run the MCP server from source or with npx, and provide the path to your .env file so Goose can connect to the MCP server.

Gemini CLI configuration guide

To use Gemini CLI with this MCP server, install Gemini CLI, register the mcp-confluent extension, ensure a valid .env file is available in the extension directory, and verify tools are visible with the Gemini extension list command.

mcp-confluent CLI usage

The MCP server includes a flexible command line interface to tailor environment files, transports, and tool enablement. You can start with help to see available options, deploy using all transports, or selectively enable or disable tools.
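To explore the available options, start with the built-in help (the `-y` flag simply skips the npx install prompt):

```shell
npx -y @confluentinc/mcp-confluent --help
```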

List of tool capabilities and endpoints

The server exposes a comprehensive set of tools covering topic management, Kafka connectors, Flink SQL, TableFlow resources, and catalog operations. Depending on your allow/block configuration, you can list, describe, create, update, delete, or read resources through these tools.

Flink catalog and diagnostics

Tools are provided to introspect Flink catalogs, databases, tables, statements, and to diagnose health and performance issues with Flink SQL statements.

Developer notes and contributing

For development, you can build, run, and test the MCP server locally, or deploy with Docker or Docker Compose. The project includes a structured codebase with transports, tools, and a CLI, plus commands to generate types and add new tools.

Available tools

add-tags-to-topic

Assign existing tags to Kafka topics in Confluent Cloud.

alter-topic-config

Alter topic configuration in Confluent Cloud.

consume-messages

Consume messages from Kafka topics, with optional Schema Registry deserialization support.

create-connector

Create a new connector and return connector information upon success.

create-flink-statement

Create a Flink SQL statement and submit it for execution.

create-topic-tags

Create new tag definitions in Confluent Cloud.

create-topics

Create one or more Kafka topics.

delete-connector

Delete an existing connector and return a confirmation.

delete-flink-statements

Delete one or more Flink SQL statements.

delete-tag

Delete a tag definition from Confluent Cloud.

delete-topics

Delete topics by name.

check-flink-statement-health

Run an aggregate health check on a Flink SQL statement.

describe-flink-table

Get full schema details for a Flink table.

detect-flink-statement-issues

Analyze status, exceptions, and metrics to detect issues in a statement.

get-flink-statement-profile

Retrieve profiler data for a Flink statement.

get-flink-table-info

Get table metadata via INFORMATION_SCHEMA.TABLES.

list-flink-catalogs

List catalogs in the Flink environment.

list-flink-databases

List databases in a Flink catalog.

list-flink-tables

List tables in a Flink database.

get-topic-config

Retrieve configuration details for a Kafka topic.

list-clusters

List Kafka clusters in Confluent Cloud.

list-connectors

List active connectors and fetch connector details.

list-environments

List environments in Confluent Cloud.

list-schemas

List schemas from Schema Registry.

list-topics

List topics in the Kafka cluster.

produce-message

Produce messages to a Kafka topic with optional Schema Registry support.

read-connector

Read information about a connector.

read-environment

Get details of an environment by ID.

read-flink-statement

Read a Flink statement and its results.

remove-tag-from-entity

Remove a tag from an entity in Confluent Cloud.

search-topics-by-name

Search topics by name.

search-topics-by-tag

Search topics by tag.

create-tableflow-topic

Create a TableFlow topic.

list-tableflow-regions

List TableFlow regions.

list-tableflow-topics

List TableFlow topics.

read-tableflow-topic

Read a TableFlow topic.

update-tableflow-topic

Update a TableFlow topic.

delete-tableflow-topic

Delete a TableFlow topic.

create-tableflow-catalog-integration

Create a TableFlow catalog integration.

list-tableflow-catalog-integrations

List TableFlow catalog integrations.

read-tableflow-catalog-integration

Read a catalog integration.

update-tableflow-catalog-integration

Update a catalog integration.

delete-tableflow-catalog-integration

Delete a catalog integration.