BMAD Agent FastMCP Service MCP Server
🤖 Enterprise-grade AI agent service built on the FastMCP framework. Features 25+ MCP tools, 10 professional agents, dual LLM modes (Cursor built-in and DeepSeek API), and seamless Cursor IDE integration.
Configuration
```
{
  "mcpServers": {
    "2799662352-bmad-agent-fastmcp": {
      "command": "python",
      "args": ["bmad_agent_mcp.py"],
      "env": {
        "PYTHONPATH": ".",
        "USE_BUILTIN_LLM": "true",
        "DEEPSEEK_API_KEY": "YOUR_API_KEY",
        "PYTHONIOENCODING": "utf-8",
        "PYTHONUNBUFFERED": "1"
      }
    }
  }
}
```

This MCP server provides the BMAD Agent FastMCP Service, exposing a modular set of agents, workflows, and LLM modes that integrate with Cursor IDE. It lets you run a local BMAD MCP instance, switch between built-in and external LLMs, and orchestrate tasks through a suite of prebuilt tools and workflows.
You run the BMAD Agent MCP server locally and connect your MCP client to it. Start the service with Python, then interact through the client to list agents, start workflows, or switch LLM modes. Use the provided tools and workflows to automate common tasks across analysis, architecture, development, product management, and QA.
Key usage patterns include listing all professional agents, activating a chosen agent, invoking an agent to complete a task, listing and triggering workflows, and switching between Cursor's built-in LLM and the external DeepSeek API when you need more advanced reasoning.
Example interactions you might perform inside your MCP client include: list_agents to view available agents, call_agent_with_llm with a specific task, and switch_llm_mode to switch the LLM backend. The system also exposes status queries to monitor health and progress.
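Under the hood, MCP clients issue tool invocations as JSON-RPC 2.0 `tools/call` requests. As a sketch, the payload below targets the `switch_llm_mode` tool named above; the `mode` argument value is an assumption, so check the tool's actual input schema (via `tools/list`) before relying on it:

```python
import json

# JSON-RPC 2.0 envelope used by MCP for tool invocation.
# The "mode" argument value is illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "switch_llm_mode",
        "arguments": {"mode": "deepseek"},
    },
}

print(json.dumps(request, indent=2))
```

Your MCP client builds and sends these frames for you; the structure is shown here only to clarify what a tool call looks like on the wire.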
Prerequisites: you need Python installed and access to install Python packages.

Install the dependencies:

```
pip install -r requirements.txt
```

Create a local environment file by copying the template, and optionally adjust its settings:

```
cp .env.example .env
```

```
# Optional edits in .env
# USE_BUILTIN_LLM=true
# DEEPSEEK_API_KEY=your_key
```

Start the MCP server using the main Python script:

```
python bmad_agent_mcp.py
```

Integrate with Cursor IDE by configuring the MCP server in your Cursor settings. The server runs locally and uses Python as the runtime for the main script.
```
{
  "mcpServers": {
    "bmad_agent": {
      "command": "python",
      "args": ["bmad_agent_mcp.py"],
      "cwd": ".",
      "env": {
        "PYTHONPATH": ".",
        "USE_BUILTIN_LLM": "true",
        "PYTHONIOENCODING": "utf-8",
        "PYTHONUNBUFFERED": "1"
      }
    }
  }
}
```

Environment variables control LLM usage and encoding. Provide your own DeepSeek API key if you plan to use the external LLM. Keep API keys and secrets secure, and avoid committing .env files to version control.
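As a minimal sketch of how these variables could drive backend selection at startup (the helper name and fallback logic are assumptions, not the service's actual code):

```python
import os

def resolve_llm_mode() -> str:
    """Illustrative only: pick an LLM backend from the environment.

    Uses the variables from the .env template; the real service's
    selection logic may differ.
    """
    use_builtin = os.getenv("USE_BUILTIN_LLM", "true").lower() == "true"
    api_key = os.getenv("DEEPSEEK_API_KEY", "")
    if not use_builtin and api_key:
        return "deepseek"
    # Without an API key, fall back to Cursor's built-in LLM.
    return "builtin"

os.environ["USE_BUILTIN_LLM"] = "false"
os.environ["DEEPSEEK_API_KEY"] = "example-key"
print(resolve_llm_mode())
```

The point of the fallback is safety: a missing or empty `DEEPSEEK_API_KEY` should never leave the service without a working backend.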
The server exposes a collection of tools to manage agents, workflows, tasks, and templates. You can discover and use these through your MCP client.
- Return a list of all professional agents available in the system.
- Fetch detailed information for a specific agent by its ID.
- Activate or enable a specified agent to start handling tasks.
- Invoke an agent to perform a task using LLM-based reasoning.
- List all defined workflows in the MCP environment.
- Start a selected workflow by its ID.
- Check the current status of an active workflow.
- Advance the workflow to the next step.
- Switch between the built-in Cursor LLM and the external DeepSeek API.
- Retrieve current LLM mode information.
- Query the overall health and status of the MCP server.
- List all available tasks in the MCP system.
- Execute a specific task by its ID.
- List all available templates for reuse in tasks or workflows.
- Retrieve the content of a specific template by name.
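To illustrate the switch/query pair of LLM-mode tools above, here is a toy state holder; the class, method names, and mode identifiers are illustrative, not the service's actual API:

```python
# Toy model of the switch/query LLM-mode tools described above.
# Mode names mirror the two documented backends; the real service
# may use different identifiers.
VALID_MODES = {"builtin", "deepseek"}

class LLMModeState:
    def __init__(self) -> None:
        self.mode = "builtin"  # default, matching USE_BUILTIN_LLM=true

    def switch_llm_mode(self, mode: str) -> str:
        if mode not in VALID_MODES:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode
        return self.mode

    def get_llm_mode(self) -> str:
        return self.mode

state = LLMModeState()
state.switch_llm_mode("deepseek")
print(state.get_llm_mode())
```

Validating the mode before switching matters in practice: rejecting unknown backends up front gives the client a clear error instead of leaving the server in an undefined state.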