Confluent MCP Server
Provides an MCP server for managing Confluent Cloud resources (Kafka topics, connectors, and Flink SQL statements) through natural language interactions.
Configuration
{
"mcpServers": {
"confluentinc-mcp-confluent": {
"url": "http://localhost:8080/mcp",
"headers": {
"HTTP_HOST": "127.0.0.1",
"HTTP_PORT": "8080",
"LOG_LEVEL": "info",
"MCP_API_KEY": "YOUR_API_KEY",
"MCP_ALLOWED_HOSTS": "localhost,127.0.0.1",
"MCP_AUTH_DISABLED": "false",
"SSE_MCP_ENDPOINT_PATH": "/sse",
"HTTP_MCP_ENDPOINT_PATH": "/mcp",
"SSE_MCP_MESSAGE_ENDPOINT_PATH": "/messages"
}
}
}
}

This MCP server enables AI assistants to interact with Confluent Cloud REST APIs, letting you manage Kafka topics, connectors, and Flink SQL statements through natural language. It supports multiple transports and can be embedded into desktop tools or run standalone for conversational control of Confluent Cloud resources.
You use this MCP server with compatible MCP clients like Claude Desktop or Goose CLI to issue natural language requests that are translated into Confluent Cloud REST API actions. Start the server, connect your client, and begin asking to list topics, create topics, manage connectors, or run Flink SQL statements. You can enable or block specific tools to tailor the capabilities exposed to your automation.
Prerequisites: you need Node.js installed (preferably managed with a tool like NVM), and a working Confluent Cloud environment with the necessary credentials in an environment file.
Step 1: Prepare the environment file. Copy the example and fill in your Confluent Cloud and related service credentials.
Step 2: Install dependencies and run. You can run the server from source or use npx to execute it directly.
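The two steps above can be sketched as shell commands. This is a sketch, not a verified recipe: the example-file name `.env.example`, the npm scripts, and the `-e` flag for pointing the npx invocation at your environment file are assumptions to check against the project's README.

```shell
# Step 1: copy the example environment file (name assumed) and
# fill in your Confluent Cloud credentials
cp .env.example .env

# Step 2, option A: run from source (script names assumed)
npm install
npm run build
npm run start

# Step 2, option B: execute directly with npx, pointing at your .env file
npx -y @confluentinc/mcp-confluent -e .env
```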
The server supports HTTP and SSE transports with API key authentication by default. You configure environment variables in a .env file and provide optional host restrictions for DNS rebinding protection. The following sections summarize the key configuration areas and the variables you will typically set.
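As a minimal sketch, a `.env` file using the variable names shown in the configuration block above might look like this; the values are placeholders, not working credentials.

```shell
# Minimal .env sketch -- variable names taken from the configuration
# block above; values are placeholders only.
HTTP_HOST=127.0.0.1
HTTP_PORT=8080
LOG_LEVEL=info
MCP_API_KEY=YOUR_API_KEY
# Optional host restriction for DNS rebinding protection
MCP_ALLOWED_HOSTS=localhost,127.0.0.1
MCP_AUTH_DISABLED=false
```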
Authentication for HTTP/SSE transports is enabled by default to protect against unauthorized access. You can generate an API key, add it to your environment file, and pass it in requests using the cflt-mcp-api-Key header. For development, you may disable authentication temporarily with explicit warnings and safeguards.
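A request against the HTTP transport would then carry the API key in the cflt-mcp-api-Key header. For example (endpoint URL and path taken from the configuration block above):

```shell
# Pass the API key from your .env file in the cflt-mcp-api-Key header
curl -H "cflt-mcp-api-Key: YOUR_API_KEY" \
     -H "Content-Type: application/json" \
     http://localhost:8080/mcp
```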
Install, configure, and run the server, then connect your MCP client and start issuing commands. Use the provided command-line options to deploy with multiple transports, enable or block tools, and manage how the server exposes Confluent Cloud capabilities. Remember the allow-list is applied before the block-list, and if neither is provided, all tools are enabled by default.
Configure Claude Desktop to connect to your local MCP server by editing the mcpServers section of Claude's configuration. You can run the MCP server from source or via npx, and you should point Claude to the local server URL.
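A stdio-based variant of Claude's mcpServers entry might look like the following sketch; the package name and the `-e` environment-file flag mirror the npx invocation style described above and should be checked against the project docs before use.

```json
{
  "mcpServers": {
    "mcp-confluent": {
      "command": "npx",
      "args": ["-y", "@confluentinc/mcp-confluent", "-e", "/path/to/.env"]
    }
  }
}
```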
Install Goose CLI, run goose configure, and add an extension for the mcp-confluent server. Choose a method to run the MCP server from source or with npx, and provide the path to your .env file so Goose can connect to the MCP server.
To use Gemini CLI with this MCP server, install Gemini CLI, register the mcp-confluent extension, ensure a valid .env file is available in the extension directory, and verify tools are visible with the Gemini extension list command.
The MCP server includes a flexible command line interface to tailor environment files, transports, and tool enablement. You can start with help to see available options, deploy using all transports, or selectively enable or disable tools.
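A sketch of such invocations follows. `--help` is a standard convention; the transport and tool-filtering flag names and the tool names below are illustrative assumptions, so confirm them against the actual `--help` output.

```shell
# Show all available command-line options
npx -y @confluentinc/mcp-confluent --help

# Illustrative only (flag and tool names are assumptions): deploy with
# multiple transports and restrict the tool set. The allow-list is applied
# before the block-list; if neither is given, all tools are enabled.
npx -y @confluentinc/mcp-confluent -e .env \
    --transport http sse \
    --allow-tools list-topics,create-topics \
    --block-tools delete-topics
```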
The server exposes a comprehensive set of tools covering topic management, Kafka connectors, Flink SQL, TableFlow resources, and catalog operations. You can list, describe, create, update, delete, or read resources through these tools, depending on your allow/block configuration.
Tools are provided to introspect Flink catalogs, databases, tables, statements, and to diagnose health and performance issues with Flink SQL statements.
For development, you can build, run, and test the MCP server locally, or deploy with Docker or Docker Compose. The project includes a structured codebase with transports, tools, and a CLI, plus commands to generate types and add new tools.
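A containerized run might look like the following sketch; the image tag and the presence of a compose file in the repository are assumptions.

```shell
# Build and run locally with Docker (image name is an assumption)
docker build -t mcp-confluent .
docker run --env-file .env -p 8080:8080 mcp-confluent

# Or, if the repo ships a compose file:
docker compose up
```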
Assign existing tags to Kafka topics in Confluent Cloud.
Alter topic configuration in Confluent Cloud.
Consume messages from Kafka topics with optional Schema Registry deserialization support.
Create a new connector and return connector information upon success.
Create a Flink SQL statement and submit it for execution.
Create new tag definitions in Confluent Cloud.
Create one or more Kafka topics.
Delete an existing connector and return a confirmation.
Delete one or more Flink SQL statements.
Delete a tag definition from Confluent Cloud.
Delete topics by name.
Run an aggregate health check for a Flink SQL statement.
Get full schema details for a Flink table.
Analyze status, exceptions, and metrics to detect issues in a statement.
Retrieve profiler data for a Flink statement.
Get table metadata via INFORMATION_SCHEMA.TABLES.
List catalogs in the Flink environment.
List databases in a Flink catalog.
List tables in a Flink database.
Retrieve configuration details for a Kafka topic.
List Kafka clusters in Confluent Cloud.
List active connectors and fetch connector details.
List environments in Confluent Cloud.
List schemas from Schema Registry.
List topics in the Kafka cluster.
Produce messages to a Kafka topic with optional Schema Registry support.
Read information about a connector.
Get details of an environment by ID.
Read a Flink statement and its results.
Remove a tag from an entity in Confluent Cloud.
Search topics by name.
Search topics by tag.
Create a TableFlow topic.
List TableFlow regions.
List TableFlow topics.
Read a TableFlow topic.
Update a TableFlow topic.
Delete a TableFlow topic.
Create a TableFlow catalog integration.
List TableFlow catalog integrations.
Read a catalog integration.
Update a catalog integration.
Delete a catalog integration.