Trino MCP server

Integrates with Trino databases to enable natural language querying, data analysis, and SQL execution via a FastAPI-based REST interface.
Provider
Stink Labs
Release date
Mar 02, 2025
Language
Python
Stats
6 stars

This MCP server for Trino provides AI models with structured access to Trino's distributed SQL query engine, allowing them to execute SQL queries against your data sources through a standardized protocol.

Installation

Docker Installation (Recommended)

The easiest way to get started is using Docker Compose:

# Start the server with docker-compose
docker-compose up -d

This will start three services.

Standalone Installation

For non-containerized usage, you can run the standalone API server:

# Run the standalone API server on port 8008
python llm_trino_api.py

Using the MCP Server

Basic Query Execution

You can verify the API is working by executing a simple query:

# Verify the API is working
curl -X POST "http://localhost:9097/api/query" \
     -H "Content-Type: application/json" \
     -d '{"query": "SELECT 1 AS test"}'

Command-Line Query Interface

For simple direct queries, use the command-line tool:

# Simple direct query
python llm_query_trino.py "SELECT * FROM memory.bullshit.real_bullshit_data LIMIT 5"

# Specify a different catalog or schema
python llm_query_trino.py "SELECT * FROM information_schema.tables" memory information_schema

API Integration Options

Docker Container API (Port 9097)

The Docker container provides an API on port 9097:

# Execute a query against the Docker container API
curl -X POST "http://localhost:9097/api/query" \
     -H "Content-Type: application/json" \
     -d '{"query": "SELECT 1 AS test"}'

Standalone Python API (Port 8008)

For more flexible deployments, use the standalone API:

# Start the API server on port 8008
python llm_trino_api.py

This creates endpoints at:

  • GET http://localhost:8008/ - API usage info
  • POST http://localhost:8008/query - Execute SQL queries

Python Client Example

import requests

def query_trino(sql_query):
    """POST a SQL query to the standalone API and return the JSON response."""
    response = requests.post(
        "http://localhost:8008/query",
        json={"query": sql_query},
        timeout=30,  # avoid hanging indefinitely on long-running queries
    )
    response.raise_for_status()  # surface HTTP errors instead of decoding an error page
    return response.json()

# Example query
results = query_trino("SELECT job_title, AVG(salary) FROM memory.bullshit.real_bullshit_data GROUP BY job_title ORDER BY AVG(salary) DESC LIMIT 5")
print(results["formatted_results"])

Demo and Sample Data

Generate and Load Sample Data

# Generate the bullshit data
python tools/create_bullshit_data.py

# Load the bullshit data into Trino's memory catalog
python load_bullshit_data.py

Run Complex Queries

The test script demonstrates end-to-end MCP interaction:

# Run a complex query against the sample data through MCP
python test_bullshit_query.py

Transport Options

STDIO Transport (Recommended)

# Run with STDIO transport inside the container
docker exec -i trino_mcp_trino-mcp_1 python -m trino_mcp.server \
     --transport stdio --debug \
     --trino-host trino --trino-port 8080 \
     --trino-user trino --trino-catalog memory
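MCP's STDIO transport exchanges newline-delimited JSON-RPC 2.0 messages over the server process's stdin and stdout. As a minimal sketch, a client frames each request as one JSON line (the `params` shown here are illustrative, not this server's exact schema):

```python
import json

def frame_message(method, params, msg_id=1):
    """Serialize a JSON-RPC 2.0 request as a single newline-terminated line."""
    msg = {"jsonrpc": "2.0", "id": msg_id, "method": method, "params": params}
    return json.dumps(msg) + "\n"

# Illustrative initialize request; a real client would write this line to the
# container process's stdin (e.g. via subprocess.Popen with stdin=PIPE) and
# read each response back as one JSON line from stdout.
line = frame_message("initialize", {"protocolVersion": "2024-11-05"})
```

Each response line is then parsed with `json.loads` before the next request is sent.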

Troubleshooting

API Returns 503 Service Unavailable

If you encounter 503 errors:

# Rebuild and restart the container
docker-compose stop trino-mcp
docker-compose rm -f trino-mcp
docker-compose up -d trino-mcp

Check Container Logs

docker logs trino_mcp_trino-mcp_1

Verify Trino is Running

curl -s http://localhost:9095/v1/info | jq

Port Conflicts with Standalone API

If port 8008 is already in use:

# Run with a custom port
python -c "import llm_trino_api; import uvicorn; uvicorn.run(llm_trino_api.app, host='127.0.0.1', port=8009)"

API Reference

Both Docker Container API (port 9097) and Standalone API (port 8008) offer:

  • GET /api - API documentation and usage examples
  • POST /api/query - Execute SQL queries against Trino

Example API request body:

{
  "query": "SELECT * FROM memory.bullshit.real_bullshit_data LIMIT 5",
  "catalog": "memory",
  "schema": "bullshit"
}
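As a sketch, the request body above can be built and sent from Python using only the standard library (`build_query_request` and `run_query` are hypothetical helper names; the optional `catalog` and `schema` fields mirror the example body):

```python
import json
import urllib.request

def build_query_request(sql, catalog=None, schema=None):
    """Build the JSON body for POST /api/query; catalog and schema are optional."""
    body = {"query": sql}
    if catalog:
        body["catalog"] = catalog
    if schema:
        body["schema"] = schema
    return body

def run_query(base_url, sql, catalog=None, schema=None):
    """POST the query to the API and return the decoded JSON response."""
    data = json.dumps(build_query_request(sql, catalog, schema)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/query",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

# Example (requires the Docker container API on port 9097):
# run_query("http://localhost:9097", "SELECT 1 AS test")
```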

How to add this MCP server to Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server go to Cursor Settings > MCP and click "Add new global MCP server".

When you click that button the ~/.cursor/mcp.json file will be opened and you can add your server like this:

{
    "mcpServers": {
        "cursor-rules-mcp": {
            "command": "npx",
            "args": [
                "-y",
                "cursor-rules-mcp"
            ]
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.
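As a sketch, a project-level entry for this server could reuse the STDIO transport command from the Transport Options section (the server name `trino-mcp` is arbitrary, and the container name assumes the default docker-compose project name):

```json
{
    "mcpServers": {
        "trino-mcp": {
            "command": "docker",
            "args": [
                "exec", "-i", "trino_mcp_trino-mcp_1",
                "python", "-m", "trino_mcp.server",
                "--transport", "stdio",
                "--trino-host", "trino",
                "--trino-port", "8080",
                "--trino-user", "trino",
                "--trino-catalog", "memory"
            ]
        }
    }
}
```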

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the added MCP server exposes and call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.
