
FastAPI CRUD MCP Server

A minimal example of a FastAPI CRUD API exposed as MCP tools via the fastapi-mcp library.

Installation
Add the following to your MCP client configuration file.

Configuration

```json
{
  "mcpServers": {
    "brunolnetto-fastapi-crud-mcp": {
      "url": "http://127.0.0.1:8000/mcp",
      "headers": {
        "LLM_MODEL": "openai:gpt-4o-mini",
        "LLM_PROVIDER": "openai",
        "MCP_HOST_URL": "http://127.0.0.1:8000/mcp",
        "LLM_MODEL_NAME": "gpt-4o-mini",
        "OPENAI_API_KEY": "sk-your-api-key"
      }
    }
  }
}
```

You can run a minimal CRUD API for items and expose it as MCP tools with an auto-generated client harness. This setup lets you manage items via a FastAPI server, validate actions with a scenario runner, and view results with a rich command-line interface. The MCP surface is available at a dedicated HTTP endpoint, and you can extend or swap the backend database as needed.

How to use

When you start the MCP server, you expose CRUD operations as MCP tools that a client can exercise. You will interact through the MCP URL to list, create, read, update, and delete items. The scenario harness drives these actions automatically, validating responses and presenting prompts and outputs in a readable Rich UI.
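
The shape of that "drive an action, validate the response" loop can be sketched as follows. This is not the project's harness (which uses PydanticAI agents against the MCP endpoint); `run_scenario` and `fake_client` are hypothetical names showing the pattern against any callable client:

```python
# Illustrative scenario runner: run each step against a client callable
# and record whether the response passed its validation check.
def run_scenario(client, steps):
    """client: callable(action, **kwargs) -> response dict
    steps: list of (action, kwargs, check) where check(response) -> bool"""
    results = []
    for action, kwargs, check in steps:
        response = client(action, **kwargs)
        results.append((action, check(response)))
    return results

# A fake client standing in for the MCP endpoint.
def fake_client(action, **kwargs):
    if action == "create_item":
        return {"id": 1, **kwargs}
    if action == "read_item":
        return {"id": kwargs["item_id"], "name": "widget"}
    return {}

steps = [
    ("create_item", {"name": "widget"}, lambda r: r["id"] == 1),
    ("read_item", {"item_id": 1}, lambda r: r["name"] == "widget"),
]
```

In the real harness the client is the MCP connection and the checks come from the scenario definitions, but the control flow is the same.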

How to install

Prerequisites
- Python 3.11+ installed
- Docker and Docker Compose installed (for running the server with Docker)
- Internet access to install dependencies

```
# 1) Clone the project
git clone https://github.com/yourusername/fastapi-crud-mcp.git
cd fastapi-crud-mcp

# 2) Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate

# 3) Install Python dependencies
pip install -r backend/requirements.txt

# 4) Prepare environment variables (copy the example)
cp .env.example .env
```

```
# Optional: run the server locally via Docker Compose
docker compose up -d --build
```

```
# If you prefer to run the server directly (no Docker), set up with uv
uv venv
source .venv/bin/activate
uv sync
# Then start the FastAPI app (the exact module path may differ in your checkout)
uv run uvicorn backend.main:app --host 127.0.0.1 --port 8000
```

Notes and setup tips

Configuration and runtime notes help you tailor the server to your environment. You can switch the database backend by editing the DB configuration, add authentication guards for the MCP endpoints, and extend the scenario set used by the client harness to cover your specific use cases.

Configuration and runtime details

The MCP surface is exposed at a URL that the client uses to drive actions against the API. You configure environment variables to control the MCP host, language model provider, and API keys used by the scenario runner. Ensure you supply the correct host URL and credentials when you start the client harness.
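
Reading those variables might look like the sketch below. The variable names (`MCP_HOST_URL`, `LLM_PROVIDER`, `LLM_MODEL_NAME`, `OPENAI_API_KEY`) come from the client configuration shown above; the `load_settings` function and its defaults are illustrative, not the project's actual settings code:

```python
import os

# Sketch of loading the environment variables used by the client harness.
def load_settings(env=None):
    env = env if env is not None else os.environ
    settings = {
        "mcp_host_url": env.get("MCP_HOST_URL", "http://127.0.0.1:8000/mcp"),
        "llm_provider": env.get("LLM_PROVIDER", "openai"),
        "llm_model_name": env.get("LLM_MODEL_NAME", "gpt-4o-mini"),
        "openai_api_key": env.get("OPENAI_API_KEY"),
    }
    # Fail early with a clear message instead of at the first LLM call.
    if settings["openai_api_key"] is None:
        raise RuntimeError("OPENAI_API_KEY must be set for the scenario runner")
    return settings
```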

Available tools

scenario_runner

Client harness that drives and validates your API via PydanticAI agents.

mcp_exposure

Auto-expose endpoints as MCP tools under /mcp/tools and /mcp/events.

rich_cli

Rich terminal output for scenario runs and results.

sqlite_backend

SQLite backend used for the demo database, designed to be easily swapped for other databases.
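
For reference, an SQLite-backed item table can be driven entirely from Python's standard library. The sketch below is illustrative only; the table name and columns are assumptions, not the project's actual schema:

```python
import sqlite3

# Minimal sketch of an SQLite-backed items table, similar in spirit
# to the demo backend. Uses an in-memory database for simplicity.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, price REAL)")

def create_item(name: str, price: float) -> int:
    cur = conn.execute(
        "INSERT INTO items (name, price) VALUES (?, ?)", (name, price)
    )
    conn.commit()
    return cur.lastrowid  # the auto-assigned primary key

def read_item(item_id: int):
    return conn.execute(
        "SELECT id, name, price FROM items WHERE id = ?", (item_id,)
    ).fetchone()

def delete_item(item_id: int) -> bool:
    cur = conn.execute("DELETE FROM items WHERE id = ?", (item_id,))
    conn.commit()
    return cur.rowcount == 1
```

Swapping to another database mostly means replacing the connection and SQL dialect while keeping the same function signatures.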