Provides a Flowise-backed MCP server to list chatflows, create predictions, and dynamically register tools for Flowise chatflows or assistants.
Configuration
{
"mcpServers": {
"matthewhand-mcp-flowise": {
"command": "uvx",
"args": [
"--from",
"git+https://github.com/matthewhand/mcp-flowise",
"mcp-flowise"
],
"env": {
"FLOWISE_API_KEY": "${FLOWISE_API_KEY}",
"FLOWISE_CHATFLOW_ID": "abc123",
"FLOWISE_SIMPLE_MODE": "true",
"FLOWISE_API_ENDPOINT": "${FLOWISE_API_ENDPOINT}",
"FLOWISE_ASSISTANT_ID": "assistant-001",
"FLOWISE_BLACKLIST_ID": "id4,id5",
"FLOWISE_WHITELIST_ID": "id1,id2",
"FLOWISE_BLACKLIST_NAME_REGEX": ".*deprecated.*",
"FLOWISE_WHITELIST_NAME_REGEX": ".*important.*",
"FLOWISE_CHATFLOW_DESCRIPTIONS": "abc123:Chatflow One,xyz789:Chatflow Two"
}
}
}
}

You deploy and run an MCP server that integrates Flowise chatflows and predictions, exposing a straightforward API for listing chatflows, creating predictions, and dynamically registering tools for Flowise chatflows or assistants. This lets you manage Flowise-powered assistants within your MCP ecosystem with flexible configuration and mode options.
You interact with the Flowise MCP server by running it as a local process and then connecting your MCP client to it. There are two modes you can use, depending on how much dynamic behavior you want from the server.
In FastMCP mode, you get a minimal set of tools that are easy to configure and use, including listing chatflows and creating predictions. In LowLevel mode, the server dynamically exposes tools for each chatflow or assistant, enabling rich, per-chatflow interactions. Your choice determines how tools are registered and named in your MCP workflow.
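In LowLevel mode, each per-chatflow tool needs a valid identifier, so chatflow names are normalized before registration. The exact rule is internal to mcp-flowise; the function below is only an illustrative guess (lowercase, with runs of non-alphanumeric characters collapsed to underscores):

```python
import re

def normalize_tool_name(chatflow_name: str) -> str:
    """Illustrative guess at turning a chatflow name into a tool
    identifier: lowercase, non-alphanumerics become underscores.
    The real normalization used by mcp-flowise may differ."""
    name = chatflow_name.strip().lower()
    name = re.sub(r"[^a-z0-9]+", "_", name)
    return name.strip("_")

print(normalize_tool_name("Chatflow One"))  # chatflow_one
```

Under this scheme, a chatflow named "Chatflow One" would surface in your MCP client as a tool called chatflow_one.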
To start, you will configure the MCP server entry, point your MCP client at the local process, and then use the available tools to list chatflows, fetch predictions, or trigger specific chatflows. The exact tools available depend on the mode you enable.
Prerequisites: before running the Flowise MCP server you need Python 3.12 or higher and uvx, which ships with the uv package manager.
Install and run the server via uvx from the repository, using the following command to start the server locally:
uvx --from git+https://github.com/matthewhand/mcp-flowise mcp-flowise

If you prefer to register the server in an MCP client configuration that points at the GitHub source, use the following server entry. It uses uvx to load the server from the repository and run the mcp-flowise command.
{
"mcpServers": {
"mcp-flowise": {
"command": "uvx",
"args": [
"--from",
"git+https://github.com/matthewhand/mcp-flowise",
"mcp-flowise"
],
"env": {
"FLOWISE_API_KEY": "${FLOWISE_API_KEY}",
"FLOWISE_API_ENDPOINT": "${FLOWISE_API_ENDPOINT}"
}
}
}
}

Configuration details, security considerations, and troubleshooting tips help you run and maintain the Flowise MCP server smoothly. Adapt the environment variables to your deployment environment and enforce access controls as needed.
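The FLOWISE_WHITELIST_* and FLOWISE_BLACKLIST_* variables in the configuration above imply a filtering step before chatflows are registered as tools. The sketch below shows one plausible way to apply them; the precedence between whitelist and blacklist here is an assumption, not taken from the project:

```python
import os
import re

def chatflow_allowed(chatflow_id: str, chatflow_name: str) -> bool:
    """Sketch: decide whether a chatflow should be exposed as a tool,
    based on the FLOWISE_* filter variables. Whitelists, when set,
    take precedence here; that ordering is an assumption."""
    wl_ids = [i for i in os.getenv("FLOWISE_WHITELIST_ID", "").split(",") if i]
    bl_ids = [i for i in os.getenv("FLOWISE_BLACKLIST_ID", "").split(",") if i]
    wl_re = os.getenv("FLOWISE_WHITELIST_NAME_REGEX", "")
    bl_re = os.getenv("FLOWISE_BLACKLIST_NAME_REGEX", "")

    if wl_ids and chatflow_id not in wl_ids:
        return False
    if wl_re and not re.search(wl_re, chatflow_name):
        return False
    if chatflow_id in bl_ids:
        return False
    if bl_re and re.search(bl_re, chatflow_name):
        return False
    return True

# Using only the blacklist regex from the configuration example:
os.environ["FLOWISE_BLACKLIST_NAME_REGEX"] = ".*deprecated.*"
print(chatflow_allowed("id1", "Important Flow"))       # True
print(chatflow_allowed("id2", "deprecated old flow"))  # False
```

With the example values from the configuration (whitelist ids id1,id2 plus the .*deprecated.* blacklist regex), only whitelisted chatflows whose names do not match the blacklist pattern would be registered.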
The tools exposed depend on the mode you enable:
- List all available chatflows from Flowise (FastMCP mode, or static exposure in LowLevel mode).
- Create a prediction for a given chatflow or assistant (FastMCP mode).
- One dynamically registered tool per chatflow (LowLevel mode), named after the normalized chatflow name and described by FLOWISE_CHATFLOW_DESCRIPTIONS when provided.
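Whichever mode you run, a prediction tool ultimately maps onto Flowise's REST prediction endpoint (POST /api/v1/prediction/{chatflowId} with a Bearer token and a JSON body whose question field carries the prompt). The sketch below only builds the request, so no live Flowise instance is needed; how mcp-flowise issues the call internally may differ:

```python
import json
from urllib.request import Request

def build_prediction_request(endpoint: str, api_key: str,
                             chatflow_id: str, question: str) -> Request:
    """Construct (but do not send) the HTTP request for a Flowise
    prediction, following Flowise's public prediction API shape."""
    url = f"{endpoint.rstrip('/')}/api/v1/prediction/{chatflow_id}"
    body = json.dumps({"question": question}).encode("utf-8")
    return Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_prediction_request("http://localhost:3000", "my-key",
                               "abc123", "Hello?")
print(req.full_url)  # http://localhost:3000/api/v1/prediction/abc123
```

Passing the request to urllib.request.urlopen (against a running Flowise instance) would return the chatflow's prediction as JSON.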