Nano Agent MCP server

Bridges OpenAI's Agent SDK with natural language task execution, enabling autonomous agent workflows with file system operations, multi-provider LLM support, and comprehensive session management for complex coding and automation tasks.
Provider
Daniel Isler
Release date
Aug 11, 2025
Language
Python
Stats
178 stars

This MCP server acts as a bridge for model-based agents, allowing you to interact with various language models (OpenAI GPT-5, Claude, and local Ollama models) through a consistent interface to perform file operations and coding tasks.

Installation

Quick Setup

First, install the required dependencies:

# Install Astral UV package manager
curl -LsSf https://astral.sh/uv/install.sh | sh

# Setup Ollama for local models
curl -fsSL https://ollama.com/install.sh | sh

# Clone the repository
git clone https://github.com/disler/nano-agent

Configure your environment:

# Create and configure environment files
cp ./.env.sample ./.env
cp ./apps/nano_agent_mcp_server/.env.sample ./apps/nano_agent_mcp_server/.env

Edit both .env files to add your API keys:

  • OPENAI_API_KEY for GPT models
  • ANTHROPIC_API_KEY for Claude models
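
Each `.env` file is a plain list of key=value assignments. A minimal sketch (the key values below are placeholders — substitute your real keys):

```shell
# .env — replace the placeholder values with your actual API keys
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
```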

Install the MCP server:

cd nano-agent/apps/nano_agent_mcp_server
./scripts/install.sh
uv tool install -e .

Create a .mcp.json configuration file:

cp .mcp.json.sample .mcp.json

Your .mcp.json file should look like this:

{
  "mcpServers": {
    "nano-agent": {
      "command": "nano-agent",
      "args": []
    }
  }
}

Using Nano Agent

Via Command Line

The simplest way to test the agent is through the CLI:

cd apps/nano_agent_mcp_server

# Test the tools functionality
uv run nano-cli test-tools

# Run with default model (gpt-5-mini)
uv run nano-cli run "List all Python files in the current directory"

# Run with specific models
uv run nano-cli run "Create a hello world script in python" --model gpt-5-nano
uv run nano-cli run "Summarize the README.md" --model gpt-5

# Use Anthropic models
uv run nano-cli run "Hello" --model claude-opus-4-1-20250805 --provider anthropic

# Use local Ollama models (install first with: ollama pull gpt-oss:20b)
uv run nano-cli run "List files" --model gpt-oss:20b --provider ollama

# Enable verbose output to see token usage
uv run nano-cli run "Create and edit a test file" --verbose

Via MCP Client (Claude Code)

If you're using Claude Code or another MCP client, you can interact with the server in two ways:

Direct MCP Calls

mcp nano-agent: prompt_nano_agent "Create a hello world script in python" --model gpt-5
mcp nano-agent: prompt_nano_agent "Summarize the README.md" --model claude-opus-4-1-20250805 --provider anthropic

Using Sub-Agents

@agent-nano-agent-gpt-5-mini "Create a hello world script in python"
@agent-nano-agent-gpt-5 "Summarize the README.md"
@agent-nano-agent-claude-opus-4-1 "Create a Python function that calculates prime numbers"

Model Evaluation System

For comparing model performance, use the Higher Order Prompt (HOP) and Lower Order Prompt (LOP) pattern:

/perf:hop_evaluate_nano_agents .claude/commands/perf/lop_eval_1__dummy_test.md
/perf:hop_evaluate_nano_agents .claude/commands/perf/lop_eval_2__basic_read_test.md
/perf:hop_evaluate_nano_agents .claude/commands/perf/lop_eval_3__file_operations_test.md

This will run the same prompt across multiple models simultaneously and generate comparison tables.

Available Tools

The nano-agent includes these built-in tools:

  • read_file: Reads content from a file
  • list_directory: Lists files and directories (defaults to current directory)
  • write_file: Creates or overwrites files
  • get_file_info: Gets file metadata (size, dates, type)
  • edit_file: Edits files by replacing exact text matches
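
The exact-match behavior of edit_file can be pictured with a short shell sketch. This only illustrates the semantics described above, not the server's actual implementation; note that `sed` stands in for the replacement step and interprets its pattern as a regex, whereas the tool matches literal text:

```shell
# Work on a scratch file so nothing real is touched
demo=$(mktemp)
printf 'hello world\n' > "$demo"

# edit_file-style behavior: replace only when the exact text is present
if grep -qF 'world' "$demo"; then
  sed -i.bak 's/world/there/' "$demo"
fi

cat "$demo"   # prints: hello there
```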

Supported Models

OpenAI Models

  • gpt-5 (full capability)
  • gpt-5-mini (default model)
  • gpt-5-nano (fastest, lowest cost)
  • gpt-4o

Anthropic Models

  • claude-opus-4-1-20250805 (highest capability)
  • claude-opus-4-20250514
  • claude-sonnet-4-20250514
  • claude-3-haiku-20240307 (fastest)

Local Ollama Models

  • gpt-oss:20b (smaller, faster)
  • gpt-oss:120b (larger, more capable)
  • Any other model you've pulled with Ollama

Switching Between Models

When running commands, you can specify both the model and provider:

# Format: uv run nano-cli run "prompt" --model MODEL_NAME --provider PROVIDER_NAME

# Examples
uv run nano-cli run "Write a test function" --model gpt-5 --provider openai
uv run nano-cli run "Analyze this code" --model claude-opus-4-1-20250805 --provider anthropic
uv run nano-cli run "Create a simple web server" --model gpt-oss:120b --provider ollama

Each provider requires different setup:

  • OpenAI needs OPENAI_API_KEY environment variable
  • Anthropic needs ANTHROPIC_API_KEY environment variable
  • Ollama requires the Ollama service running locally
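
Concretely, the per-provider setup can be sketched like this (the key values are placeholders, and the Ollama check is one simple way to see whether the local service is up):

```shell
# OpenAI and Anthropic: export keys in your shell (or set them in the .env files)
export OPENAI_API_KEY="sk-your-openai-key"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"

# Ollama: the local service must be running before using --provider ollama
if ! pgrep -x ollama >/dev/null 2>&1; then
  echo "Ollama is not running; start it with: ollama serve"
fi
```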

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "nano-agent" '{"command":"nano-agent","args":[]}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "nano-agent": {
            "command": "nano-agent",
            "args": []
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the MCP server provides and will call them when it needs to.

You can also explicitly ask the agent to use a tool by mentioning its name and describing what you want it to do.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "nano-agent": {
            "command": "nano-agent",
            "args": []
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
