Natural Language to Kusto Query MCP Server
Converts natural language prompts into Kusto queries and executes them against configured Kusto databases.
Configuration
{
"mcpServers": {
"alexneyler-kusto-mcp": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-v",
"/path/to/settings.yaml:/app/settings.yaml",
"-e",
"AZURE_OPENAI_KEY",
"-e",
"KUSTO_ACCESS_TOKEN",
"alexeyler/kusto-mcp-server"
],
"env": {
"AZURE_OPENAI_KEY": "YOUR_AZURE_OPENAI_KEY",
"KUSTO_ACCESS_TOKEN": "YOUR_KUSTO_ACCESS_TOKEN"
}
}
}
}

You can turn natural language prompts into executable Kusto queries against your configured Kusto databases. This MCP server translates your prompts into Kusto Query Language (KQL) queries and can execute them, returning results in JSON or CSV. It supports multiple database configurations and tailored prompts to guide query generation.
You interact with the MCP server through a client that sends your natural language prompts. Before you begin, provide a settings.yaml that defines the Kusto databases you want to query and the prompts the server can use for generation. You can run the MCP server locally via Docker, or run it in your preferred runtime environment if you have an appropriate runtime image.
To execute a prompt against a configured Kusto table, run the container with your settings and credentials. The server will generate a Kusto query from your prompt and run it against the selected database, returning results in the requested format (JSON or CSV). Use the following command pattern as a starting point.
docker run \
-i \
--rm \
-v "/path/to/settings.yaml:/app/settings.yaml" \
-e AZURE_OPENAI_KEY=YOUR_AZURE_OPENAI_KEY \
-e KUSTO_ACCESS_TOKEN=YOUR_KUSTO_ACCESS_TOKEN \
alexeyler/kusto-mcp-server

If you are integrating with VS Code, you can pass the settings file and credentials through a run configuration. The following example mounts the settings file and supplies the environment variables to Docker, prompting you for the secret values at launch.
{
  "inputs": [
    { "type": "promptString", "id": "azure-open-ai-key", "description": "Enter your Azure OpenAI key", "password": true },
    { "type": "promptString", "id": "kusto-token", "description": "Enter your Kusto token", "password": true }
  ],
  "servers": {
    "kusto": {
      "type": "stdio",
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-v", "/path/to/settings.yaml:/app/settings.yaml",
        "-e", "AZURE_OPENAI_KEY",
        "-e", "KUSTO_ACCESS_TOKEN",
        "alexeyler/kusto-mcp-server"
      ],
      "env": {
        "AZURE_OPENAI_KEY": "${input:azure-open-ai-key}",
        "KUSTO_ACCESS_TOKEN": "${input:kusto-token}"
      }
    }
  }
}

Prerequisites: you need Docker installed on your machine. You will also need a settings.yaml file that configures the model, Kusto connections, and prompts.
1) Build the MCP server image from the repository source.
docker build -f ./src/Server/Dockerfile src

2) Prepare your settings.yaml with your Kusto configurations and model credentials. See the configuration section for the structure and required fields.
3) Run the server using a container, mounting the settings file and passing your credentials as environment variables.
docker run \
-i \
--rm \
-v "/path/to/settings.yaml:/app/settings.yaml" \
-e AZURE_OPENAI_KEY=YOUR_AZURE_OPENAI_KEY \
-e KUSTO_ACCESS_TOKEN=YOUR_KUSTO_ACCESS_TOKEN \
alexeyler/kusto-mcp-server

Configure the server with a YAML file that defines the model connection (the Azure OpenAI endpoint and deployment) plus one or more Kusto configurations. Each Kusto entry describes the table, database, endpoint, and authentication method, along with the prompts used to craft queries.
Model configuration example: you specify the endpoint, deployment, and an optional key.

model:
  endpoint: <Azure OpenAI Endpoint>
  deployment: <Deployment Name>
  key: <Azure OpenAI key>

Kusto configuration example (one or more entries): each entry defines a table, its category, database, endpoint, and optional access token. Prompts provide the system, user, and assistant content that guides query generation.
kusto:
  - name: <Table Name>
    category: <Category>
    database: <Database Name>
    table: <Table Name>
    endpoint: <Kusto Endpoint>
    accessToken: <Access Token>
    prompts:
      - type: <Prompt Type>
        content: |
          <Prompt Content>

Keep endpoints and credentials secure. Use environment variables for sensitive values and avoid committing keys to version control. When running in VS Code or CI environments, prefer secure secret-management mechanisms for keys and tokens.
If you need to generate an access token for a Kusto database, you can use Azure CLI to obtain a token for the Kusto resource.
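As a sketch, assuming the Azure CLI is installed and you have already run `az login`, you can request a token scoped to your cluster and export it for the Docker commands above to forward into the container. The cluster URL below is a placeholder.

```shell
# Placeholder cluster endpoint -- replace with your own Kusto cluster URL.
KUSTO_RESOURCE="https://mycluster.kusto.windows.net"

# Request a bearer token for that resource (requires a prior `az login`)
# and export it so `docker run -e KUSTO_ACCESS_TOKEN ...` can pick it up.
export KUSTO_ACCESS_TOKEN="$(az account get-access-token \
  --resource "$KUSTO_RESOURCE" \
  --query accessToken \
  --output tsv)"
```

Tokens obtained this way expire, so regenerate them before long sessions.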
Tools
The server exposes three tools:
Lists the supported Kusto tables as configured in settings.yaml.
Generates a Kusto query against a specified table for a given natural language prompt.
Generates and executes a Kusto query for a given prompt and returns the results in JSON or CSV.