Shopping Assistant MCP
Configuration
{
"mcpServers": {
"bmaranan75-mcp-shopping-assistant-py": {
"url": "http://your-server:8000/sse",
"headers": {
"API_KEYS": "your-api-key-1,your-api-key-2",
"OKTA_DOMAIN": "your-domain.okta.com",
"OAUTH_ENABLED": "true or false as required",
"OAUTH_PROVIDER": "google or okta",
"OKTA_CLIENT_ID": "your-okta-client-id",
"GOOGLE_CLIENT_ID": "your-client-id.apps.googleusercontent.com",
"OKTA_CLIENT_SECRET": "your-okta-client-secret",
"GOOGLE_CLIENT_SECRET": "your-client-secret"
}
}
}
}
This MCP server provides a FastMCP-based interface to a LangGraph agent, making it easy to connect ChatGPT Enterprise with your local LangGraph deployment. It exposes secure, scalable endpoints to invoke the agent, stream responses, and manage agent state, enabling real-time interactions and robust tooling within enterprise workflows.
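Before wiring the configuration above into a client, it can help to sanity-check its shape. A minimal sketch using only the standard library (the config string below is a trimmed copy of the block above, with placeholder values; `validate_config` is a hypothetical helper, not part of this server):

```python
import json

# Trimmed copy of the client configuration shown above (placeholder values).
CONFIG = """
{
  "mcpServers": {
    "bmaranan75-mcp-shopping-assistant-py": {
      "url": "http://your-server:8000/sse",
      "headers": {
        "API_KEYS": "your-api-key-1,your-api-key-2",
        "OAUTH_ENABLED": "false"
      }
    }
  }
}
"""

def validate_config(raw: str) -> dict:
    """Parse an MCP client config and check the fields this server expects."""
    config = json.loads(raw)
    for name, entry in config["mcpServers"].items():
        # This server exposes an SSE endpoint, so the URL should end in /sse.
        assert entry["url"].endswith("/sse"), f"{name}: expected an SSE URL"
        assert "headers" in entry, f"{name}: missing headers block"
    return config

cfg = validate_config(CONFIG)
print(sorted(cfg["mcpServers"]))
```

Running this catches the most common copy-paste mistakes (wrong endpoint path, dropped headers block) before the client ever tries to connect.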
You connect your MCP client to the LangGraph agent through the SSE transport, enabling real-time streaming of responses. Start the MCP server, ensure the LangGraph agent is running on port 2024, then make calls to invoke the agent, stream answers, or query the agent's state. Use the provided test UI to sanity-check connectivity and see how prompts flow from your client to the agent and back.
Prerequisites: Python 3.8+ and network access to the LangGraph agent on port 2024.
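Both prerequisites can be checked before installing anything. A small preflight sketch, assuming the agent runs on localhost (`agent_reachable` is a hypothetical helper; adjust host and port to your deployment):

```python
import socket
import sys

def agent_reachable(host: str = "127.0.0.1", port: int = 2024,
                    timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the LangGraph agent port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Python 3.8+ is required by this server.
    assert sys.version_info >= (3, 8), "Python 3.8+ is required"
    print("LangGraph agent reachable:", agent_reachable())
```

If this prints `False`, start the LangGraph agent (or fix firewall rules) before continuing with the steps below.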
Step 1: Install dependencies
pip install -r requirements.txt
Step 2: Configure environment (optional). You can enable or disable authentication as needed.
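The variable names below mirror the headers in the configuration block at the top of this page; a sketch of a local `.env`, assuming the server reads these settings at startup:

```
# .env -- authentication settings (names taken from the client config above)
OAUTH_ENABLED=false
OAUTH_PROVIDER=google
GOOGLE_CLIENT_ID=your-client-id.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=your-client-secret
API_KEYS=your-api-key-1,your-api-key-2
```

With `OAUTH_ENABLED=false` you can skip Step 3 entirely during local development.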
Step 3: Generate credentials (if using OAuth) and set up environment variables as required.
python generate_credentials.py
Step 4: Start the MCP server and optional UI. The start commands are shown below in the recommended workflow.
# Quick start (recommended)
./start.sh
# Manual start (alternative)
python src/agent_mcp/mcp_server.py
# Start test UI (optional)
cd web_ui
python server.py
The MCP server supports both HTTP transport and a local stdio-based workflow. For HTTP, connect using a remote or local endpoint with the SSE transport for real-time updates. For local development, run the Python module directly to start the server and connect your client accordingly.
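For the local stdio workflow, a typical MCP client entry launches the server as a subprocess instead of pointing at a URL. A sketch following the standard `command`/`args` convention of MCP client configs (the server path is taken from the manual start command above; the entry name is arbitrary):

```json
{
  "mcpServers": {
    "shopping-assistant-local": {
      "command": "python",
      "args": ["src/agent_mcp/mcp_server.py"]
    }
  }
}
```

The SSE-based config shown at the top of this page remains the right choice for remote or multi-user deployments.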
Execute a single invocation of the LangGraph agent with a given prompt and optional thread context.
Stream responses from the LangGraph agent for real-time viewing of output.
Retrieve the current state of a conversation thread by its thread ID.
Check the health status of the LangGraph agent and the MCP server components.
Query the current status of the agent, including availability and performance metrics.
List active or recent conversation threads with their metadata.
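Each capability above is exposed as an MCP tool and invoked with a `tools/call` request. The JSON-RPC envelope below follows the MCP specification, but the tool name `invoke_agent` and its argument keys are assumptions for illustration, since this listing describes the tools without giving their exact identifiers:

```python
import json

def tool_call_payload(name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call request (JSON-RPC 2.0 envelope per the MCP spec)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Hypothetical tool name and arguments -- check the server's tools/list
# response for the real identifiers.
payload = tool_call_payload("invoke_agent",
                            {"prompt": "Find running shoes under $100"})
print(payload)
```

In practice an MCP client library builds this envelope for you; the sketch is only meant to show what travels over the SSE transport when one of the tools above is invoked.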