Provides access to Fabric workspaces, notebooks, SQL, Livy, pipelines, and more through MCP tools.
Configuration
{
  "mcpServers": {
    "bablulawrence-ms-fabric-mcp-server": {
      "command": "uvx",
      "args": [
        "ms-fabric-mcp-server"
      ],
      "env": {
        "FABRIC_SCOPES": "https://api.fabric.microsoft.com/.default",
        "MCP_LOG_LEVEL": "INFO",
        "AZURE_LOG_LEVEL": "info",
        "FABRIC_BASE_URL": "https://api.fabric.microsoft.com/v1",
        "MCP_SERVER_NAME": "ms-fabric-mcp-server",
        "FABRIC_MAX_RETRIES": "3",
        "LIVY_POLL_INTERVAL": "2.0",
        "FABRIC_RETRY_BACKOFF": "2.0",
        "LIVY_API_CALL_TIMEOUT": "120",
        "FABRIC_API_CALL_TIMEOUT": "30",
        "LIVY_SESSION_WAIT_TIMEOUT": "240",
        "LIVY_STATEMENT_WAIT_TIMEOUT": "10"
      }
    }
  }
}
You can expose Microsoft Fabric operations as MCP tools that an AI agent can call, enabling automated workflows for workspaces, notebooks, SQL, Livy sessions, pipelines, and more. This MCP server is designed for development environments and should be reviewed before enabling any destructive actions in production. It provides a straightforward way to invoke Fabric actions from your MCP-enabled clients and agents.
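The FABRIC_MAX_RETRIES and FABRIC_RETRY_BACKOFF settings suggest exponential backoff between retried API calls. As a minimal sketch of how those two values could be interpreted (the server's actual retry logic may differ):

```python
import os

def retry_delays(max_retries: int, backoff: float, base: float = 1.0) -> list:
    """Return the wait (in seconds) before each retry, growing by `backoff`."""
    return [base * backoff ** attempt for attempt in range(max_retries)]

# Read the same settings the server receives via its env block.
max_retries = int(os.environ.get("FABRIC_MAX_RETRIES", "3"))
backoff = float(os.environ.get("FABRIC_RETRY_BACKOFF", "2.0"))

print(retry_delays(max_retries, backoff))  # e.g. [1.0, 2.0, 4.0] with the defaults
```

Raising FABRIC_RETRY_BACKOFF therefore spreads retries out more aggressively, which is useful when the Fabric API is throttling requests.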
To use this MCP server, start it as a local stdio MCP endpoint and connect your MCP client or agent. The server exposes tools that map to Fabric resources such as workspaces, items, notebooks, jobs, Livy sessions, pipelines, semantic models, Power BI tasks, and optional SQL endpoints. You can compose tool calls in your client to list resources, create or modify items, run jobs, manage Livy sessions, and perform data operations. For best results, start the server with the standard local invocation (uvx ms-fabric-mcp-server) and then query the available tools from your MCP client.
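Under the hood, an MCP client speaks JSON-RPC 2.0 to the server over stdio. A minimal sketch of the two messages involved in discovering and invoking a tool (the tool name `list_workspaces` is illustrative; check the server's actual `tools/list` response for real names):

```python
import json

def jsonrpc_request(req_id: int, method: str, params=None) -> str:
    """Build a JSON-RPC 2.0 request as sent over the server's stdio transport."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# Ask the server which Fabric tools it exposes.
list_tools = jsonrpc_request(1, "tools/list")

# Call a tool by name with its arguments (name here is hypothetical).
call = jsonrpc_request(2, "tools/call",
                       {"name": "list_workspaces", "arguments": {}})
```

In practice your MCP client or an SDK builds these messages for you; the sketch is only meant to show what "composing tool calls" amounts to on the wire.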
Typical workflows include listing workspaces, retrieving notebook contents, creating and running on-demand jobs, managing Livy sessions, and building pipelines that automate notebook or dataflow activities. Always review AI-generated tool calls before execution, especially in development environments where operations can be destructive.
Before installing this MCP server, you need Python and a command runner (for example, uv for the uvx invocation below). You will also need an MCP client or editor to connect to the server.
Step 1: Install the MCP server package with Python’s package manager.
pip install ms-fabric-mcp-server
Step 2: (Optional) Install SQL support if you plan to use the SQL tools. This adds SQL-related endpoints when the sql extras are requested.
pip install ms-fabric-mcp-server[sql]
Step 3: (Optional) Install OpenTelemetry tracing for distributed tracing in your environment.
pip install ms-fabric-mcp-server[sql,telemetry]
Step 4: Start the MCP server locally using uvx.
uvx ms-fabric-mcp-server
Step 5: Alternatively, start the server directly if you have Python installed.
python -m ms_fabric_mcp_server
You can also run the server through a Node-based inspector for development visibility.
npx @modelcontextprotocol/inspector uvx ms-fabric-mcp-server
Tools
List all workspaces in Fabric.
List items within a workspace.
Delete a specified item in Fabric.
Import a notebook into Fabric.
Retrieve the content of a notebook.
Attach a Lakehouse to a notebook.
Get details about notebook executions.
List notebook execution records.
Retrieve driver logs for a notebook.
Run a job on demand.
Get the current status of a job.
Get job status by its URL.
Retrieve the result of an operation.
Create a Livy session.
List Livy sessions.
Get Livy session status.
Close a Livy session.
Run a statement in a Livy session.
Check the status of a Livy statement.
Cancel a Livy statement.
Get logs for a Livy session.
Create a new blank pipeline.
Add a copy activity to a pipeline.
Add a notebook activity to a pipeline.
Add a dataflow activity to a pipeline.
Add a generic activity to a pipeline.
Create a new semantic model.
Add a table to a semantic model.
Add a relationship to a semantic model.
Get details of a semantic model.
Get the definition of a semantic model.
Add measures to a semantic model.
Delete measures from a semantic model.
Refresh a semantic model in Power BI integration.
Execute a DAX query against a semantic model.
Get the SQL endpoint for the optional SQL tools.
Execute a SQL query against the configured endpoint.
Execute a SQL statement against the configured endpoint.
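The Livy tools above compose naturally into a session lifecycle: create a session, run a statement, check its status, then close the session. A hedged sketch of the call sequence an agent might plan, using hypothetical tool names and a placeholder workspace id (query tools/list for the real names and argument schemas):

```python
import json

def livy_statement_workflow(session_params: dict, code: str) -> list:
    """Return the ordered tool calls an agent might issue for one Spark statement.

    Tool names here are illustrative, not the server's documented names.
    """
    return [
        {"name": "create_livy_session", "arguments": session_params},
        {"name": "run_livy_statement", "arguments": {"code": code}},
        {"name": "get_livy_statement_status", "arguments": {}},
        {"name": "close_livy_session", "arguments": {}},
    ]

plan = livy_statement_workflow({"workspace_id": "<workspace-id>"},
                               "spark.range(10).count()")
print(json.dumps(plan, indent=2))
```

Between the run and status steps, a client would typically poll at LIVY_POLL_INTERVAL seconds until the statement completes or LIVY_STATEMENT_WAIT_TIMEOUT is reached.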