MCP server for understanding AWS spend
Configuration
```json
{
  "mcpServers": {
    "aarora79-aws-cost-explorer-mcp-server": {
      "url": "https://YOUR_MCP_SERVER_DOMAIN/sse",
      "headers": {
        "MCP_TRANSPORT": "stdio",
        "BEDROCK_LOG_GROUP_NAME": "YOUR_BEDROCK_CW_LOG_GROUP_NAME",
        "CROSS_ACCOUNT_ROLE_NAME": "ROLE_NAME_FOR_THE_ROLE_TO_ASSUME_IN_OTHER_ACCOUNTS"
      }
    }
  }
}
```

This MCP server lets you query AWS spend data from Cost Explorer and Amazon Bedrock model invocation logs through a conversational Claude interface. It exposes AWS cost data to Claude Desktop and supports running locally or remotely, enabling secure, scalable access to your AWS spending insights.
You can run the MCP server locally and access it from Claude Desktop, or run a remote MCP server on Amazon EC2 and connect with a LangGraph Agent. The server provides tools to retrieve EC2 spending, Bedrock usage, and detailed cost breakdowns, which Claude can invoke through natural language queries.
Prerequisites: Python 3.12 and AWS credentials with Cost Explorer access. You also need an Anthropic API key for the Claude integration and, optionally, Amazon Bedrock access for the LangGraph agent.
Step 1: Install the uv tool on your system.
```shell
# On macOS and Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# On Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

Step 2: Set up the project workspace by cloning the repository and entering the directory.
```shell
git clone https://github.com/aarora79/aws-cost-explorer-mcp.git
cd aws-cost-explorer-mcp
```

Step 3: Create and activate a Python 3.12 virtual environment, then install dependencies.
```shell
uv venv --python 3.12
source .venv/bin/activate
uv pip install --requirement pyproject.toml
```

Step 4: Configure your AWS credentials for Cost Explorer and CloudWatch.
```shell
mkdir -p ~/.aws
# Set up your credentials in ~/.aws/credentials and ~/.aws/config
```

You will run the server locally using the stdio transport. If you need to access AWS spend data from other accounts, you can configure a cross-account role.
For remote access, you can expose the MCP server over HTTPS using a reverse proxy (nginx). The server will typically expose an HTTPS endpoint like https://your-mcp-server-domain-name.com/sse. If you plan to expose the service publicly, ensure you configure TLS certificates and restrict access as appropriate.
Remote server prerequisites include opening port 443, configuring an SSL certificate, and setting up nginx to forward requests to the MCP server running on port 8000.
Start the local MCP server with the stdio transport and required environment variables.
```shell
export MCP_TRANSPORT=stdio
export BEDROCK_LOG_GROUP_NAME=YOUR_BEDROCK_CW_LOG_GROUP_NAME
export CROSS_ACCOUNT_ROLE_NAME=ROLE_NAME_FOR_THE_ROLE_TO_ASSUME_IN_OTHER_ACCOUNTS
python server.py
```

There are two ways to connect Claude Desktop to your local MCP server.
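When CROSS_ACCOUNT_ROLE_NAME is set, cross-account access is usually implemented by assuming that role via STS. A minimal sketch of the common boto3 pattern, where the `build_role_arn` helper and the account ID are illustrative, not part of the actual server code:

```python
import os

def build_role_arn(account_id: str, role_name: str) -> str:
    """Construct the ARN of the cross-account role to assume.
    (Illustrative helper; the real server may do this differently.)"""
    return f"arn:aws:iam::{account_id}:role/{role_name}"

role_name = os.environ.get("CROSS_ACCOUNT_ROLE_NAME", "CostExplorerReadRole")

# With boto3 available and credentials configured, the role would be
# assumed roughly like this:
#   import boto3
#   sts = boto3.client("sts")
#   creds = sts.assume_role(
#       RoleArn=build_role_arn("111122223333", role_name),
#       RoleSessionName="cost-explorer-mcp",
#   )["Credentials"]

print(build_role_arn("111122223333", "CostExplorerReadRole"))
# → arn:aws:iam::111122223333:role/CostExplorerReadRole
```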
Option 1: Using Docker (recommended for isolation)
```json
{
  "mcpServers": {
    "aws_cost_explorer": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "AWS_ACCESS_KEY_ID",
        "-e", "AWS_SECRET_ACCESS_KEY",
        "-e", "AWS_REGION",
        "-e", "BEDROCK_LOG_GROUP_NAME",
        "-e", "MCP_TRANSPORT",
        "-e", "CROSS_ACCOUNT_ROLE_NAME",
        "aws-cost-explorer-mcp:latest"
      ],
      "env": {
        "AWS_ACCESS_KEY_ID": "YOUR_ACCESS_KEY_ID",
        "AWS_SECRET_ACCESS_KEY": "YOUR_SECRET_ACCESS_KEY",
        "AWS_REGION": "us-east-1",
        "BEDROCK_LOG_GROUP_NAME": "YOUR_CLOUDWATCH_BEDROCK_MODEL_INVOCATION_LOG_GROUP_NAME",
        "CROSS_ACCOUNT_ROLE_NAME": "ROLE_NAME_FOR_THE_ROLE_TO_ASSUME_IN_OTHER_ACCOUNTS",
        "MCP_TRANSPORT": "stdio"
      }
    }
  }
}
```

Option 2: Using uv (without Docker), if you prefer running the server directly
```json
{
  "mcpServers": {
    "aws_cost_explorer": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/aws-cost-explorer-mcp-server",
        "run",
        "server.py"
      ],
      "env": {
        "AWS_ACCESS_KEY_ID": "YOUR_ACCESS_KEY_ID",
        "AWS_SECRET_ACCESS_KEY": "YOUR_SECRET_ACCESS_KEY",
        "AWS_REGION": "us-east-1",
        "BEDROCK_LOG_GROUP_NAME": "YOUR_CLOUDWATCH_BEDROCK_MODEL_INVOCATION_LOG_GROUP_NAME",
        "CROSS_ACCOUNT_ROLE_NAME": "ROLE_NAME_FOR_THE_ROLE_TO_ASSUME_IN_OTHER_ACCOUNTS",
        "MCP_TRANSPORT": "stdio"
      }
    }
  }
}
```

Once connected, you can ask Claude questions like "What was my Bedrock usage in the last 7 days?" or "What did my EC2 spending look like yesterday?". The MCP server exposes tools to retrieve daily usage, detailed breakdowns, and cross-account spend data.
- get_ec2_spend_last_day(): Retrieves EC2 spending data for the previous day.
- get_detailed_breakdown_by_day(days=7): Delivers a comprehensive analysis of costs by region, service, and instance type.
- get_bedrock_daily_usage_stats(days=7, region='us-east-1', log_group_name='BedrockModelInvocationLogGroup'): Delivers a per-day breakdown of model usage by region and users.
- get_bedrock_hourly_usage_stats(days=7, region='us-east-1', log_group_name='BedrockModelInvocationLogGroup'): Delivers a per-day per-hour breakdown of model usage by region and users.
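Under the hood, a tool like get_ec2_spend_last_day typically wraps the Cost Explorer GetCostAndUsage API. A hedged sketch of how the previous-day EC2 request might be built; the `previous_day_request` helper is illustrative, and the server's actual implementation may differ:

```python
from datetime import date, timedelta

def previous_day_request(today: date) -> dict:
    """Build GetCostAndUsage parameters covering yesterday only.
    Cost Explorer treats the End date as exclusive."""
    start = today - timedelta(days=1)
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": today.isoformat()},
        "Granularity": "DAILY",
        "Metrics": ["UnblendedCost"],
        "Filter": {
            "Dimensions": {
                "Key": "SERVICE",
                "Values": ["Amazon Elastic Compute Cloud - Compute"],
            }
        },
    }

params = previous_day_request(date(2024, 5, 2))
print(params["TimePeriod"])  # → {'Start': '2024-05-01', 'End': '2024-05-02'}

# With boto3 and Cost Explorer access, the call would be roughly:
#   import boto3
#   ce = boto3.client("ce")
#   response = ce.get_cost_and_usage(**params)
```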
A Dockerfile is included for containerized deployment.
```shell
docker build -t aws-cost-explorer-mcp .
docker run -v ~/.aws:/root/.aws aws-cost-explorer-mcp
```

Extend functionality by adding new cost analysis tools to the server and annotating them with the MCP tool decorator in the code.
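The registration pattern can be sketched with a stand-in decorator. The real server uses the MCP SDK's decorator (e.g. @mcp.tool()); the `TOOLS` registry, `tool` decorator, and `get_s3_spend_last_day` tool below are illustrative only:

```python
from typing import Callable

# Stand-in registry mimicking what the MCP SDK keeps internally.
TOOLS: dict[str, Callable] = {}

def tool(func: Callable) -> Callable:
    """Register a function as a callable tool, mimicking @mcp.tool()."""
    TOOLS[func.__name__] = func
    return func

@tool
def get_s3_spend_last_day() -> dict:
    """Hypothetical new tool: previous-day S3 spend (stub data here)."""
    return {"service": "Amazon S3", "amount_usd": 0.0}

print(sorted(TOOLS))  # → ['get_s3_spend_last_day']
```

Once registered, the MCP client can discover the tool by name and Claude can invoke it through natural language, just like the built-in tools.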
You can use nginx as a reverse proxy to provide an HTTPS endpoint that proxies traffic to the local MCP server. This enables secure remote connections to the MCP server.
Follow these high-level steps to set up nginx with TLS and proxy to localhost:8000, then connect clients over HTTPS.
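A minimal nginx server block along these lines can serve as a starting point; the domain and certificate paths are placeholders, and buffering is disabled because Server-Sent Events require streamed responses:

```nginx
server {
    listen 443 ssl;
    server_name your-mcp-server-domain-name.com;

    ssl_certificate     /etc/letsencrypt/live/your-mcp-server-domain-name.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/your-mcp-server-domain-name.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        # SSE needs streaming, not buffered, responses
        proxy_buffering off;
        proxy_set_header Host $host;
        proxy_read_timeout 3600s;
    }
}
```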