Just Prompt (Multi-LLM Provider) MCP server

A unified interface for interacting with multiple LLM providers (OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama), with parallel prompt sending and response file saving.
Provider: Daniel Isler
Release date: Mar 31, 2025
Language: Python
Stats: 494 stars

Just Prompt is a lightweight MCP (Model Context Protocol) server that provides a unified interface to various Large Language Model providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. It lets you interact with multiple LLMs through a consistent API and compare responses across different models.

Installation

# Clone the repository
git clone https://github.com/yourusername/just-prompt.git
cd just-prompt

# Install dependencies with uv
uv sync

Setting Up Environment Variables

Create a .env file with your API keys:

cp .env.sample .env

Then edit the .env file with your API keys:

OPENAI_API_KEY=your_openai_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here
GEMINI_API_KEY=your_gemini_api_key_here
GROQ_API_KEY=your_groq_api_key_here
DEEPSEEK_API_KEY=your_deepseek_api_key_here
OLLAMA_HOST=http://localhost:11434
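Only providers whose environment variable is set will be usable. As a rough sketch (a hypothetical helper, not part of just-prompt itself), you could check which providers are configured like this:

```python
import os

# Environment variable for each provider; names match the .env entries
# above. Note OLLAMA_HOST is a URL rather than an API key.
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GEMINI_API_KEY",
    "groq": "GROQ_API_KEY",
    "deepseek": "DEEPSEEK_API_KEY",
    "ollama": "OLLAMA_HOST",
}

def configured_providers(env=None):
    """Return providers whose environment variable is set and non-empty."""
    if env is None:
        env = os.environ
    return [name for name, var in PROVIDER_ENV_VARS.items() if env.get(var)]
```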

Setting Up in Claude Code

Using mcp add-json

Copy one of the JSON configurations below to your clipboard, then run:

claude mcp add just-prompt "$(pbpaste)"

Basic JSON configuration:

{
    "command": "uv",
    "args": ["--directory", ".", "run", "just-prompt"]
}

With a custom default model:

{
    "command": "uv",
    "args": ["--directory", ".", "run", "just-prompt", "--default-models", "openai:gpt-4o"]
}

With multiple default models:

{
    "command": "uv",
    "args": ["--directory", ".", "run", "just-prompt", "--default-models", "openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"]
}

Using mcp add with project scope

# With default models
claude mcp add just-prompt -s project \
  -- \
    uv --directory . \
    run just-prompt

# With custom default model
claude mcp add just-prompt -s project \
  -- \
  uv --directory . \
  run just-prompt --default-models "openai:gpt-4o"

# With multiple default models
claude mcp add just-prompt -s project \
  -- \
  uv --directory . \
  run just-prompt --default-models "openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"

Removing the MCP Server

claude mcp remove just-prompt

Available Tools

Provider Prefixes

All models must be prefixed with the provider name (you can use short names for quicker referencing):

  • o or openai: OpenAI
    • Example: o:gpt-4o-mini or openai:gpt-4o-mini
  • a or anthropic: Anthropic
    • Example: a:claude-3-5-haiku or anthropic:claude-3-5-haiku
  • g or gemini: Google Gemini
    • Example: g:gemini-2.5-pro-exp-03-25 or gemini:gemini-2.5-pro-exp-03-25
  • q or groq: Groq
    • Example: q:llama-3.1-70b-versatile or groq:llama-3.1-70b-versatile
  • d or deepseek: DeepSeek
    • Example: d:deepseek-coder or deepseek:deepseek-coder
  • l or ollama: Ollama
    • Example: l:llama3.1 or ollama:llama3.1
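A minimal sketch of how such a provider-prefixed string could be resolved (the actual just-prompt parsing logic may differ):

```python
# Map short provider prefixes to full provider names.
SHORT_NAMES = {
    "o": "openai", "a": "anthropic", "g": "gemini",
    "q": "groq", "d": "deepseek", "l": "ollama",
}
FULL_NAMES = set(SHORT_NAMES.values())

def resolve_model(prefixed: str) -> tuple:
    """Split 'prefix:model' at the first colon and expand short names.

    Splitting at the first colon only means any trailing suffix
    (e.g. ':high') stays attached to the model part.
    """
    provider, sep, model = prefixed.partition(":")
    if not sep or not model:
        raise ValueError(f"expected 'provider:model', got {prefixed!r}")
    provider = SHORT_NAMES.get(provider, provider)
    if provider not in FULL_NAMES:
        raise ValueError(f"unknown provider prefix: {provider!r}")
    return provider, model
```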

MCP Tool Functions

Sending Prompts

# Send a text prompt to default models
claude call just-prompt.prompt --text "Explain quantum computing in simple terms"

# Send a prompt to specific models
claude call just-prompt.prompt --text "Explain quantum computing in simple terms" --models_prefixed_by_provider "openai:gpt-4o,anthropic:claude-3-5-sonnet"

Sending Prompts from Files

# Send a prompt from a file to default models
claude call just-prompt.prompt_from_file --abs_file_path "/absolute/path/to/prompt.txt"

# Send a prompt from a file to specific models
claude call just-prompt.prompt_from_file --abs_file_path "/absolute/path/to/prompt.txt" --models_prefixed_by_provider "o:gpt-4o,a:claude-3-5-sonnet"

Saving Responses to Files

# Send a prompt from a file and save responses to files
claude call just-prompt.prompt_from_file_to_file --abs_file_path "/absolute/path/to/prompt.txt" --abs_output_dir "/absolute/path/to/output"

Using the CEO and Board Tool

This tool sends a prompt to multiple "board member" models and then has a "CEO" model make a decision based on their responses:

claude call just-prompt.ceo_and_board --abs_file_path "/absolute/path/to/decision_prompt.txt" --abs_output_dir "/absolute/path/to/output" --ceo_model "openai:o3"

Listing Providers and Models

# List all available providers
claude call just-prompt.list_providers

# List all models for a specific provider
claude call just-prompt.list_models --provider "openai"

Advanced Model Configuration

OpenAI Reasoning Effort

For OpenAI o-series models, you can control reasoning depth by adding a suffix:

  • :low – minimal internal reasoning (faster, cheaper)
  • :medium – balanced (default if omitted)
  • :high – thorough reasoning (slower, more tokens)

Examples:

openai:o4-mini:low
o:o4-mini:high
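Conceptually, handling the effort suffix amounts to peeling an optional trailing token off the model spec. A hypothetical sketch (not the actual just-prompt implementation):

```python
# Recognized reasoning-effort levels for o-series model specs.
EFFORT_LEVELS = {"low", "medium", "high"}

def split_effort(model_spec: str) -> tuple:
    """Return (model, effort) for specs like 'o4-mini:high'.

    When no recognized suffix is present, default to 'medium',
    as described above.
    """
    model, sep, suffix = model_spec.rpartition(":")
    if sep and suffix in EFFORT_LEVELS:
        return model, suffix
    return model_spec, "medium"
```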

Anthropic Thinking Tokens

For Claude models, you can enable thinking tokens:

anthropic:claude-opus-4-20250514:1k
anthropic:claude-sonnet-4-20250514:4k

Supported values range from 1024 to 16000 tokens.

Gemini Thinking Budget

For Gemini models, you can set a thinking budget:

gemini:gemini-2.5-flash-preview-04-17:1k
gemini:gemini-2.5-flash-preview-04-17:8000

Supported values range from 0 to 24576 tokens.
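Interpreting these suffixes can be sketched as parsing the token count (assuming the 'k' shorthand multiplies by 1024, which matches the 1024-token lower bound above) and clamping it to the provider's supported range. This is a hypothetical helper, not the actual just-prompt code:

```python
def parse_budget(suffix: str, lo: int, hi: int) -> int:
    """Parse a thinking-token suffix like '1k' or '8000'.

    Assumes 'k' means x1024; clamps to the provider's range,
    e.g. (1024, 16000) for Anthropic, (0, 24576) for Gemini.
    """
    if suffix.lower().endswith("k"):
        tokens = int(suffix[:-1]) * 1024
    else:
        tokens = int(suffix)
    return max(lo, min(hi, tokens))
```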

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "just-prompt" '{"type":"stdio","command":"uv","args":["--directory",".","run","just-prompt","--default-models","openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"],"env":[]}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "just-prompt": {
            "type": "stdio",
            "command": "uv",
            "args": [
                "--directory",
                ".",
                "run",
                "just-prompt",
                "--default-models",
                "openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"
            ],
            "env": []
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then see the tools the MCP server provides and call them when needed.

You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "just-prompt": {
            "type": "stdio",
            "command": "uv",
            "args": [
                "--directory",
                ".",
                "run",
                "just-prompt",
                "--default-models",
                "openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"
            ],
            "env": []
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
