MCP Server for LangGraph Agent

Shopping Assistant MCP

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "bmaranan75-mcp-shopping-assistant-py": {
      "url": "http://your-server:8000/sse",
      "headers": {
        "API_KEYS": "your-api-key-1,your-api-key-2",
        "OKTA_DOMAIN": "your-domain.okta.com",
        "OAUTH_ENABLED": "true or false as required",
        "OAUTH_PROVIDER": "google or okta",
        "OKTA_CLIENT_ID": "your-okta-client-id",
        "GOOGLE_CLIENT_ID": "your-client-id.apps.googleusercontent.com",
        "OKTA_CLIENT_SECRET": "your-okta-client-secret",
        "GOOGLE_CLIENT_SECRET": "your-client-secret"
      }
    }
  }
}

This MCP server provides a FastMCP-based interface to a LangGraph agent, making it easy to connect ChatGPT Enterprise with your local LangGraph deployment. It exposes secure, scalable endpoints to invoke the agent, stream responses, and manage agent state, enabling real-time interactions and robust tooling within enterprise workflows.

How to use

Connect your MCP client to the LangGraph agent over the SSE transport to stream responses in real time. Start the MCP server, ensure the LangGraph agent is running on port 2024, then call the tools to invoke the agent, stream answers, or query the agent's state. Use the provided test UI to sanity-check connectivity and see how prompts flow from your client to the agent and back.
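Under the hood, an MCP client invokes a server tool with a JSON-RPC `tools/call` request, which the SSE transport carries to the server. The sketch below builds such a request for the `invoke_agent` tool listed further down; the method name comes from the MCP specification, but the argument names (`prompt`, `thread_id`) are illustrative assumptions, not a confirmed schema for this server.

```python
import json

# JSON-RPC 2.0 message an MCP client would send to call a tool.
# "tools/call" is the standard MCP method; the tool name matches
# the "invoke_agent" tool documented below. Argument names are
# assumed for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "invoke_agent",
        "arguments": {
            "prompt": "What laptops are on sale?",
            "thread_id": "thread-123",  # optional conversation context
        },
    },
}

# Serialize for transmission over the transport.
payload = json.dumps(request)
print(payload)
```

In practice an MCP SDK builds and sends this message for you; the sketch only shows the wire-level shape of a tool call.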

How to install

Prerequisites: Python 3.8+ and network access to the LangGraph agent on port 2024.
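You can verify the Python prerequisite from the interpreter itself before installing anything:

```python
import sys

# Check that the running interpreter meets the documented minimum (3.8+).
assert sys.version_info >= (3, 8), "Python 3.8 or newer is required"
print("Python version OK:", sys.version.split()[0])
```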

Step 1: Install dependencies

pip install -r requirements.txt

Step 2: Configure environment (optional)
You can enable or disable authentication as needed.
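As a sketch, the environment might be configured with values like the following. The variable names are taken from the configuration block above; which ones your deployment actually reads depends on your setup.

```shell
# Example environment values (names from the configuration block above)
OAUTH_ENABLED=false            # set to true to require OAuth
OAUTH_PROVIDER=google          # "google" or "okta"
API_KEYS=your-api-key-1,your-api-key-2
OKTA_DOMAIN=your-domain.okta.com
```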

Step 3: Generate credentials (if using OAuth) and set up environment variables as required.

python generate_credentials.py

Step 4: Start the MCP server and, optionally, the test UI. Use one of the commands below; the quick-start script is the recommended workflow.

# Quick start (recommended)
./start.sh

# Manual start (alternative)
python src/agent_mcp/mcp_server.py

# Start test UI (optional)
cd web_ui
python server.py

Using the server with an MCP client

The MCP server supports both HTTP transport and a local stdio-based workflow. For HTTP, connect to a remote or local endpoint and use the SSE transport for real-time updates. For local development, run the Python module directly to start the server and connect your client over stdio.
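For the stdio workflow, a local client configuration might look like the following. The `command`/`args` keys follow the common MCP client configuration shape, and the script path comes from the manual-start command above; adjust both to your checkout.

```json
{
  "mcpServers": {
    "shopping-assistant-local": {
      "command": "python",
      "args": ["src/agent_mcp/mcp_server.py"]
    }
  }
}
```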

Available tools

invoke_agent

Execute a single invocation of the LangGraph agent with a given prompt and optional thread context.

stream_agent

Stream responses from the LangGraph agent for real-time viewing of output.

get_agent_state

Retrieve the current state of a conversation thread by its thread ID.

check_system_health

Check the health status of the LangGraph agent and the MCP server components.

check_agent_status

Query the current status of the agent, including availability and performance metrics.

list_threads

List active or recent conversation threads with their metadata.
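Tools like stream_agent deliver output as Server-Sent Events, where each event's payload arrives on `data:` lines separated by blank lines. The sketch below parses raw SSE text into per-event payloads; the token payloads shown are made up, since the real event schema depends on the server implementation.

```python
# Minimal SSE frame parser, illustrating how streamed stream_agent
# output arrives over the wire. Payload contents are illustrative.

def parse_sse(raw: str) -> list[str]:
    """Return the data payload of each event in a raw SSE stream."""
    events = []
    for block in raw.strip().split("\n\n"):  # events are blank-line separated
        data_lines = [
            line[len("data: "):]
            for line in block.splitlines()
            if line.startswith("data: ")
        ]
        if data_lines:
            events.append("\n".join(data_lines))  # multi-line data is joined
    return events

raw = 'data: {"token": "Hello"}\n\ndata: {"token": " world"}\n\n'
print(parse_sse(raw))  # ['{"token": "Hello"}', '{"token": " world"}']
```

Real MCP clients handle this framing internally; the parser only makes the transport format concrete.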