Just Prompt is a lightweight MCP (Model Context Protocol) server that provides a unified interface to various Large Language Model providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. It lets you interact with multiple LLMs through a consistent API and compare responses across models.
# Clone the repository
git clone https://github.com/yourusername/just-prompt.git
cd just-prompt
# Install dependencies with uv
uv sync
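To verify the install, you can launch the server directly; it speaks MCP over stdio and simply waits for a client, so stop it with Ctrl-C:
# start the server over stdio (Ctrl-C to stop)
uv run just-prompt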
Create a .env file from the provided sample:
cp .env.sample .env
Then edit the .env file and fill in your API keys:
OPENAI_API_KEY=your_openai_api_key_here
ANTHROPIC_API_KEY=your_anthropic_api_key_here
GEMINI_API_KEY=your_gemini_api_key_here
GROQ_API_KEY=your_groq_api_key_here
DEEPSEEK_API_KEY=your_deepseek_api_key_here
OLLAMA_HOST=http://localhost:11434
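If you want to exercise the keys from a shell before wiring up an MCP client, one way to load them (a generic POSIX-shell sketch, not a just-prompt feature) is:
# export every variable defined in .env into the current shell
set -a; . ./.env; set +a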
mcp add-json
Copy one of the JSON configurations below to your clipboard, then run this in Claude Code:
claude mcp add-json just-prompt "$(pbpaste)"
Basic JSON configuration:
{
"command": "uv",
"args": ["--directory", ".", "run", "just-prompt"]
}
With a custom default model:
{
"command": "uv",
"args": ["--directory", ".", "run", "just-prompt", "--default-models", "openai:gpt-4o"]
}
With multiple default models:
{
"command": "uv",
"args": ["--directory", ".", "run", "just-prompt", "--default-models", "openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"]
}
mcp add with project scope
# With default models
claude mcp add just-prompt -s project \
-- \
uv --directory . \
run just-prompt
# With custom default model
claude mcp add just-prompt -s project \
-- \
uv --directory . \
run just-prompt --default-models "openai:gpt-4o"
# With multiple default models
claude mcp add just-prompt -s user \
-- \
uv --directory . \
run just-prompt --default-models "openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"
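You can confirm the server was registered with:
claude mcp list
To remove the server later: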
claude mcp remove just-prompt
All models must be prefixed with the provider name; short names also work for quicker referencing, as the example after this list shows:
- o or openai: OpenAI (e.g. o:gpt-4o-mini or openai:gpt-4o-mini)
- a or anthropic: Anthropic (e.g. a:claude-3-5-haiku or anthropic:claude-3-5-haiku)
- g or gemini: Google Gemini (e.g. g:gemini-2.5-pro-exp-03-25 or gemini:gemini-2.5-pro-exp-03-25)
- q or groq: Groq (e.g. q:llama-3.1-70b-versatile or groq:llama-3.1-70b-versatile)
- d or deepseek: DeepSeek (e.g. d:deepseek-coder or deepseek:deepseek-coder)
- l or ollama: Ollama (e.g. l:llama3.1 or ollama:llama3.1)
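For example, these two calls are equivalent:
claude call just-prompt.prompt --text "Hello" --models_prefixed_by_provider "openai:gpt-4o-mini"
claude call just-prompt.prompt --text "Hello" --models_prefixed_by_provider "o:gpt-4o-mini"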
# Send a text prompt to default models
claude call just-prompt.prompt --text "Explain quantum computing in simple terms"
# Send a prompt to specific models
claude call just-prompt.prompt --text "Explain quantum computing in simple terms" --models_prefixed_by_provider "openai:gpt-4o,anthropic:claude-3-5-sonnet"
# Send a prompt from a file to default models
claude call just-prompt.prompt_from_file --abs_file_path "/absolute/path/to/prompt.txt"
# Send a prompt from a file to specific models
claude call just-prompt.prompt_from_file --abs_file_path "/absolute/path/to/prompt.txt" --models_prefixed_by_provider "o:gpt-4o,a:claude-3-5-sonnet"
# Send a prompt from a file and save responses to files
claude call just-prompt.prompt_from_file_to_file --abs_file_path "/absolute/path/to/prompt.txt" --abs_output_dir "/absolute/path/to/output"
This tool sends a prompt to multiple "board member" models and then has a "CEO" model make a decision based on their responses:
claude call just-prompt.ceo_and_board --abs_file_path "/absolute/path/to/decision_prompt.txt" --abs_output_dir "/absolute/path/to/output" --ceo_model "openai:o3"
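The prompt file is ordinary text describing the decision to be made. A hypothetical decision_prompt.txt (illustrative only, not part of the repository) might read:
Should we build our new public API with REST or GraphQL? Weigh team experience, tooling maturity, and client requirements, then recommend one option.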
# List all available providers
claude call just-prompt.list_providers
# List all models for a specific provider
claude call just-prompt.list_models --provider "openai"
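The short provider names from the prefix list should also be accepted here, though that is an assumption; the full name is always safe:
# same as --provider "openai" (assuming short names are accepted)
claude call just-prompt.list_models --provider "o"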
For OpenAI o-series models, you can control reasoning depth by adding a suffix:
- :low – minimal internal reasoning (faster, cheaper)
- :medium – balanced (the default if omitted)
- :high – thorough reasoning (slower, more tokens)
Examples:
openai:o4-mini:low
o:o4-mini:high
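A reasoning suffix can be used anywhere a model name appears, for example:
claude call just-prompt.prompt --text "Prove that the square root of 2 is irrational" --models_prefixed_by_provider "openai:o4-mini:high"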
For Claude models, you can enable thinking tokens:
anthropic:claude-opus-4-20250514:1k
anthropic:claude-sonnet-4-20250514:4k
Supported values range from 1024 to 16000 tokens.
For Gemini models, you can set a thinking budget:
gemini:gemini-2.5-flash-preview-04-17:1k
gemini:gemini-2.5-flash-preview-04-17:8000
Supported values range from 0 to 24576 tokens.
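For instance, you could compare a Claude model given 4k thinking tokens against a Gemini model given an 8000-token budget in a single call:
claude call just-prompt.prompt --text "Design a rate limiter for a public API" --models_prefixed_by_provider "anthropic:claude-sonnet-4-20250514:4k,gemini:gemini-2.5-flash-preview-04-17:8000"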
To add this MCP server to Claude Code, run this command in your terminal:
claude mcp add-json "just-prompt" '{"type":"stdio","command":"uv","args":["--directory",".","run","just-prompt","--default-models","openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"],"env":{}}'
See the official Claude Code MCP documentation for more details.
There are two ways to add an MCP server to Cursor. The most common is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.
If you only need the server in a single project, you can instead add it to that project's .cursor/mcp.json file, creating the file if it doesn't exist.
To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server".
When you click that button, the ~/.cursor/mcp.json file will open and you can add your server like this:
{
"mcpServers": {
"just-prompt": {
"type": "stdio",
"command": "uv",
"args": [
"--directory",
".",
"run",
"just-prompt",
"--default-models",
"openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"
],
"env": []
}
}
}
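Note that "--directory", "." resolves relative to wherever Cursor launches the command, so for a global server you will generally want the absolute path to your just-prompt checkout instead, e.g. (hypothetical path):
"args": ["--directory", "/Users/you/code/just-prompt", "run", "just-prompt"]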
To add an MCP server to a single project, create a new .cursor/mcp.json file in the project (or add the entry to the existing one). The contents look exactly the same as the global example above.
Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.
The Cursor agent will then see the tools the server exposes and call them when it needs to.
You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.
To add this MCP server to Claude Desktop:
1. Find your configuration file:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/Claude/claude_desktop_config.json
2. Add this to your configuration file:
{
"mcpServers": {
"just-prompt": {
"type": "stdio",
"command": "uv",
"args": [
"--directory",
".",
"run",
"just-prompt",
"--default-models",
"openai:o3:high,openai:o4-mini:high,anthropic:claude-opus-4-20250514,anthropic:claude-sonnet-4-20250514,gemini:gemini-2.5-pro-preview-03-25,gemini:gemini-2.5-flash-preview-04-17"
],
"env": []
}
}
}
3. Restart Claude Desktop for the changes to take effect.