
Wolfram Alpha MCP Server

A Python-powered Model Context Protocol (MCP) server and client that query Wolfram Alpha through its API.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "akalaric-mcp-wolframalpha": {
      "command": "python3",
      "args": [
        "/path/to/src/core/server.py"
      ],
      "env": {
        "GeminiAPI": "YOUR_GEMINI_API_KEY",
        "WOLFRAM_API_KEY": "YOUR_WOLFRAM_API_KEY"
      }
    }
  }
}

This MCP server connects chat applications to Wolfram Alpha, enabling advanced computational queries and structured knowledge retrieval within conversations. It is designed for easy integration with MCP clients and supports a Gradio-based UI, a CLI client, and a multi-client setup.

How to use

Set up an MCP client to talk to the Wolfram Alpha MCP server. Your client can send natural language questions, which the server will translate into Wolfram Alpha API requests and return structured results suitable for chat flows. You can use the Gradio UI for a user-friendly web interface, or run the CLI/Docker client to test interactions locally. The client supports switching between Wolfram Alpha results and other AI providers configured in your environment.
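
As an illustration of the translation step, a natural-language question maps onto a Wolfram Alpha Full Results API request. The endpoint and the `appid`, `input`, and `output` parameters below follow Wolfram Alpha's public API; the helper function name is illustrative and not taken from this project.

```python
from urllib.parse import urlencode

# Wolfram Alpha Full Results API endpoint (public API, not project-specific).
WOLFRAM_ENDPOINT = "https://api.wolframalpha.com/v2/query"

def build_query_url(question: str, app_id: str) -> str:
    """Build a Full Results API URL for a natural-language question.

    output=json requests JSON instead of the API's default XML.
    """
    params = urlencode({"appid": app_id, "input": question, "output": "json"})
    return f"{WOLFRAM_ENDPOINT}?{params}"

print(build_query_url("integrate x^2", "DEMO-APPID"))
```

The server can then parse the JSON response into the structured results it hands back to the chat client.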

How to install

Before installing, make sure you have Python 3.8+ and pip. Optionally install uv if you plan to manage the environment with it. You will also need an active Wolfram Alpha API key (AppID) and, optionally, a Gemini API key for the LangChain-based client.
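
A quick check that your interpreter meets the 3.8 floor named above (run it with whichever `python3` your MCP client will invoke):

```python
import sys

# Minimum version from the prerequisites above.
REQUIRED = (3, 8)

if sys.version_info < REQUIRED:
    raise SystemExit(
        f"Python {REQUIRED[0]}.{REQUIRED[1]}+ required, "
        f"found {sys.version_info.major}.{sys.version_info.minor}"
    )
print("Python version OK")
```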

Clone the project to your workstation and navigate into the project directory.

git clone https://github.com/ricocf/mcp-wolframalpha.git
cd mcp-wolframalpha

Create a .env file in the project root and populate the required keys.

WOLFRAM_API_KEY=your_wolframalpha_appid
GeminiAPI=your_google_gemini_api_key # Optional if using the Client method below
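
The project presumably loads these values with a dotenv helper; as a stdlib-only illustration of the format, a minimal parser for such simple `KEY=value` lines might look like the sketch below (real `.env` files also support quoting and escapes, which this ignores).

```python
import os

def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=value lines, skipping blanks, comment lines,
    and anything after an inline ' #' comment."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.split(" #", 1)[0].strip()  # drop inline comments
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    os.environ.update(values)  # make the keys visible to the server process
    return values
```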

Install Python dependencies listed in the requirements file.

pip install -r requirements.txt

Alternatively, if you manage dependencies with uv, sync the environment instead of installing with pip.

uv sync

Configure your MCP server entry for your development environment. For Claude Desktop-style setups, you will point to the Python entry script as shown in the example configuration.

Additional setup and configuration

You can integrate a VSCode MCP Server or a Claude Desktop workflow by providing a runtime command that launches the server. The example below shows the required runtime details for starting the Wolfram Alpha MCP server locally.

{
  "mcpServers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": [
        "/path/to/src/core/server.py"
      ],
      "env": [
        {"name": "WOLFRAM_API_KEY", "value": "YOUR_WOLFRAM_API_KEY"},
        {"name": "GeminiAPI", "value": "YOUR_GEMINI_API_KEY"}
      ]
    }
  }
}
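
A malformed entry can fail silently in some MCP clients, so it helps to sanity-check the file before loading it. The sketch below just verifies the shape used in the examples above (`command` string, `args` list, `env` object); it is not part of this project.

```python
import json

def check_mcp_config(text: str) -> list:
    """Return the names of server entries that carry the fields
    the configuration examples rely on."""
    config = json.loads(text)
    ok = []
    for name, entry in config.get("mcpServers", {}).items():
        assert isinstance(entry.get("command"), str), f"{name}: missing command"
        assert isinstance(entry.get("args"), list), f"{name}: args must be a list"
        assert isinstance(entry.get("env"), dict), f"{name}: env must be an object"
        ok.append(name)
    return ok
```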

MCP client and UI usage

This project includes an MCP client example that uses Gemini via LangChain to connect the LLM to the MCP server. A Gradio-based UI is provided for an interactive web experience, allowing you to switch between Wolfram Alpha results and Google Gemini responses, view query history, and experiment with prompts.

Run the Gradio UI client locally to try the interface.

python main.py --ui

If you prefer running the CLI-based client directly, you can start it without the UI.

python main.py

A Docker setup is available to containerize and run the UI client or the MCP server as needed.

# Build UI container
docker build -t wolframalphaui -f .devops/ui.Dockerfile .

# Run UI container (Gradio serves on port 7860 by default)
docker run -p 7860:7860 wolframalphaui

# Build LLM client container
docker build -t wolframalpha -f .devops/llm.Dockerfile .

# Run LLM client container interactively
docker run -it wolframalpha

Notes and considerations

Security: store API keys securely and avoid committing .env files to version control. Use a secrets-management approach suited to your deployment.
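
Failing fast when a key is absent beats a confusing downstream API error. A sketch of reading the keys named above from the environment (the helper name is illustrative, not from this project):

```python
import os

def require_env(name: str) -> str:
    """Fetch a required key from the environment, failing fast with a
    pointer to the .env setup if it is absent."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; add it to your .env or MCP client env block"
        )
    return value

# WOLFRAM_API_KEY is required; GeminiAPI is only needed for the LangChain client.
```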

Multiple clients can connect concurrently to the same MCP server. The server is designed to support multi-client scenarios and can be extended to additional APIs beyond Wolfram Alpha.

Available tools

wolfram_query

Executes a computational query against the Wolfram Alpha API and returns structured results suitable for chat integration.

gradio_ui

Gradio-based web UI that lets you interact with the Wolfram Alpha MCP server and switch between Wolfram Alpha and Gemini responses.

mcp_client_gemini

MCP client example using Gemini via LangChain to connect large language models to the MCP server for real-time Wolfram Alpha interactions.

Wolfram Alpha MCP Server - akalaric/mcp-wolframalpha