Kubernetes Claude MCP server

Integrates Claude with Kubernetes, ArgoCD, and GitLab to analyze and troubleshoot GitOps workflows by collecting resource information, correlating cross-system data, and providing actionable recommendations through a RESTful API.
  • Provider: Blank Cut Inc.
  • Release date: Mar 24, 2025
  • Language: Go
  • Stars: 7

The Claude Kubernetes MCP Server provides a bridge between Kubernetes operations and Claude AI, allowing you to automate and enhance management of your Kubernetes environments through the Model Context Protocol.

Prerequisites

Before installing the MCP server, you'll need:

  • Go 1.20+
  • Docker
  • Kubernetes cluster with a valid ~/.kube/config
  • EKS cluster with AWS_PROFILE set locally (if using EKS)
  • ArgoCD credentials
  • GitLab personal access token
  • Claude API key (Anthropic)
  • Vault credentials (optional)

Installation

Clone the Repository

git clone https://github.com/blankcut/kubernetes-mcp-server.git
cd kubernetes-mcp-server

Set Environment Variables

Export the necessary credentials for integration with external services:

export ARGOCD_USERNAME="argocd-username"
export ARGOCD_PASSWORD="argocd-password"
export GITLAB_TOKEN="gitlab-token"
export CLAUDE_API_KEY="claude-api-key"
export VAULT_TOKEN="optional-if-using-vault"

Configure your Kubernetes environment:

export KUBECONFIG=~/.kube/config
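
If your cluster is on EKS, you can also generate or refresh the kubeconfig entry with the AWS CLI (the profile, cluster name, and region below are placeholders):

export AWS_PROFILE="your-aws-profile"
aws eks update-kubeconfig --name your-cluster --region us-east-1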

Configure the Server

Update the kubernetes-claude-mcp/config.yaml file with your credentials and server preferences:

server:
  address: ":8080"
  readTimeout: 30
  writeTimeout: 60
  auth:
    apiKey: "${API_KEY}" 

kubernetes:
  kubeconfig: ""
  inCluster: false
  defaultContext: ""
  defaultNamespace: "default"

argocd:
  url: "http://example.argocd.com"
  authToken: ""
  username: "${ARGOCD_USERNAME}"
  password: "${ARGOCD_PASSWORD}"
  insecure: true

gitlab:
  url: "https://gitlab.com"
  authToken: "${AUTH_TOKEN}"
  apiVersion: "v4"
  projectPath: "${PROJECT_PATH}"

claude:
  apiKey: "${API_KEY}"
  baseURL: "https://api.anthropic.com"
  modelID: "claude-3-haiku-20240307"
  maxTokens: 4096
  temperature: 0.7
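
Note that the sample configuration also references API_KEY, AUTH_TOKEN, and PROJECT_PATH, which differ from the variables exported earlier. Assuming the server substitutes ${VAR} placeholders from the environment, export those as well (the values below are illustrative):

export API_KEY="server-api-key"
export AUTH_TOKEN="$GITLAB_TOKEN"
export PROJECT_PATH="group/project"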

Running the Server

Run Locally

From the project directory:

cd kubernetes-claude-mcp
go run ./cmd/server/main.go

For debug logging:

LOG_LEVEL=debug go run ./cmd/server/main.go --config config.yaml

Run with Docker

Build and run using Docker:

cd kubernetes-claude-mcp
docker build -t claude-mcp-server -f Dockerfile .
docker run -p 8080:8080 claude-mcp-server
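
The container still needs your credentials and a kubeconfig. A minimal sketch, assuming the server honors the standard KUBECONFIG environment variable (adjust paths and variables to match the image):

docker run -p 8080:8080 \
  -e ARGOCD_USERNAME -e ARGOCD_PASSWORD -e GITLAB_TOKEN -e CLAUDE_API_KEY \
  -e KUBECONFIG=/kube/config \
  -v ~/.kube/config:/kube/config:ro \
  claude-mcp-server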

Or with docker-compose:

docker-compose build
docker-compose up -d

Production Deployment

Deploy to Kubernetes using the included Helm chart:

cd kubernetes-claude-mcp/deployments/helm
helm install claude-mcp .

To update the deployment:

helm upgrade claude-mcp .
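
Installing into a dedicated namespace is a common choice; any value overrides depend on the chart's values.yaml, so only standard Helm flags are shown here:

helm install claude-mcp . --namespace claude-mcp --create-namespace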

Using the API

All API requests require the X-API-Key header set to your configured API key value.

Health Check

curl -H "X-API-Key: your-api-key" http://localhost:8080/api/v1/health

Kubernetes Operations

List namespaces:

curl -H "X-API-Key: your-api-key" http://localhost:8080/api/v1/namespaces

List resources:

curl -H "X-API-Key: your-api-key" http://localhost:8080/api/v1/resources/pods?namespace=default

Get specific resource:

curl -H "X-API-Key: your-api-key" http://localhost:8080/api/v1/resources/pods/nginx-pod?namespace=default

Claude MCP Operations

Analyze a resource:

curl -X POST \
  -H "X-API-Key: your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "resource": "pod",
    "name": "example-pod",
    "namespace": "default",
    "query": "What is the status of this pod?"
  }' \
  http://localhost:8080/api/v1/mcp/resource

Troubleshoot a resource:

curl -X POST \
  -H "X-API-Key: your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "resource": "deployment",
    "name": "my-app",
    "namespace": "production",
    "query": "Why is this deployment failing to start?"
  }' \
  http://localhost:8080/api/v1/mcp/troubleshoot

API Reference

General Endpoints

  • GET /api/v1/health - Health check

Kubernetes Endpoints

  • GET /api/v1/namespaces - List all namespaces
  • GET /api/v1/resources/{kind}?namespace={ns} - List resources of a specific kind
  • GET /api/v1/resources/{kind}/{name}?namespace={ns} - Get a specific resource
  • GET /api/v1/events?namespace={ns}&resource={kind}&name={name} - Get events for a resource
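
For example, to fetch events for the pod used in the earlier examples:

curl -H "X-API-Key: your-api-key" \
  "http://localhost:8080/api/v1/events?namespace=default&resource=pods&name=nginx-pod"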

ArgoCD Endpoints

  • GET /api/v1/argocd/applications - List ArgoCD applications
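
For example:

curl -H "X-API-Key: your-api-key" http://localhost:8080/api/v1/argocd/applications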

Claude MCP Endpoints

  • POST /api/v1/mcp/resource - Analyze a Kubernetes resource
  • POST /api/v1/mcp/troubleshoot - Troubleshoot a Kubernetes resource
  • POST /api/v1/mcp/commit - Analyze a GitLab commit
  • POST /api/v1/mcp - Generic MCP request
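
A commit analysis request follows the same pattern as the resource and troubleshoot examples above. The body fields below are illustrative, so check the server's request schema for the exact names:

# field names in the body are illustrative
curl -X POST \
  -H "X-API-Key: your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "project": "group/project",
    "commitSha": "abc1234",
    "query": "Summarize the changes introduced by this commit."
  }' \
  http://localhost:8080/api/v1/mcp/commit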

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button, the ~/.cursor/mcp.json file will open and you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the added MCP server provides and will call them when needed.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
