
Fabric MCP Server

Provides access to Fabric workspaces, notebooks, SQL, Livy, pipelines, and more through MCP tools.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "bablulawrence-ms-fabric-mcp-server": {
      "command": "uvx",
      "args": [
        "ms-fabric-mcp-server"
      ],
      "env": {
        "FABRIC_SCOPES": "https://api.fabric.microsoft.com/.default",
        "MCP_LOG_LEVEL": "INFO",
        "AZURE_LOG_LEVEL": "info",
        "FABRIC_BASE_URL": "https://api.fabric.microsoft.com/v1",
        "MCP_SERVER_NAME": "ms-fabric-mcp-server",
        "FABRIC_MAX_RETRIES": "3",
        "LIVY_POLL_INTERVAL": "2.0",
        "FABRIC_RETRY_BACKOFF": "2.0",
        "LIVY_API_CALL_TIMEOUT": "120",
        "FABRIC_API_CALL_TIMEOUT": "30",
        "LIVY_SESSION_WAIT_TIMEOUT": "240",
        "LIVY_STATEMENT_WAIT_TIMEOUT": "10"
      }
    }
  }
}
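The retry settings above (FABRIC_MAX_RETRIES, FABRIC_RETRY_BACKOFF) suggest exponential backoff between failed Fabric API calls. A minimal sketch of how such settings are commonly interpreted, as an assumption, since the server's exact retry policy is not documented here:

```python
# Sketch of an exponential-backoff schedule derived from the env settings above.
# Assumption: the wait before attempt n is backoff ** n; the server's actual
# policy may differ (e.g. add jitter or cap the delay).

def backoff_delays(max_retries: int, backoff: float) -> list[float]:
    """Return the wait (in seconds) before each retry attempt."""
    return [backoff ** attempt for attempt in range(1, max_retries + 1)]

# With FABRIC_MAX_RETRIES=3 and FABRIC_RETRY_BACKOFF=2.0:
print(backoff_delays(3, 2.0))  # [2.0, 4.0, 8.0]
```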

This server exposes Microsoft Fabric operations as MCP tools that an AI agent can call, enabling automated workflows for workspaces, notebooks, SQL, Livy sessions, pipelines, and more. It is designed for development environments; review its configuration before enabling any destructive actions in production.

How to use

To use this MCP server, start it as a local stdio MCP endpoint (for example, with `uvx ms-fabric-mcp-server`) and connect your MCP client or agent to invoke Fabric operations. The server exposes tools that map to Fabric resources such as workspaces, items, notebooks, jobs, Livy sessions, pipelines, semantic models, Power BI tasks, and optional SQL endpoints. You can compose tool calls in your client to list resources, create or modify items, run jobs, manage Livy sessions, and perform data operations. After starting the server, query the available tools from your MCP client.
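Once connected, an MCP client discovers the available tools with a standard `tools/list` request over the stdio transport. A sketch of the JSON-RPC exchange; the message shapes come from the MCP specification, not this server's docs, and the response is trimmed:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      { "name": "list_workspaces", "description": "List all workspaces in Fabric." }
    ]
  }
}
```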

Typical workflows include listing workspaces, retrieving notebook contents, creating and running on-demand jobs, managing Livy sessions, and building pipelines that automate notebook or dataflow activities. Always review AI-generated tool calls before execution, especially destructive ones such as `delete_item`.
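A typical first step, listing workspaces, maps to a `tools/call` request. The envelope below follows the MCP specification; that `list_workspaces` takes no arguments is an assumption based on its description:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": { "name": "list_workspaces", "arguments": {} }
}
```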

How to install

Before installing, you need Python and a command runner such as uv (which provides `uvx`). You will also want an MCP client or editor to connect to the server.

Step 1: Install the MCP server package with Python’s package manager.

pip install ms-fabric-mcp-server

Step 2: (Optional) Install SQL support if you plan to use the SQL tools. The `sql` extra enables the SQL-related tools; the quotes keep your shell from interpreting the brackets.

pip install "ms-fabric-mcp-server[sql]"

Step 3: (Optional) Install the `telemetry` extra for OpenTelemetry distributed tracing (shown here together with the `sql` extra).

pip install "ms-fabric-mcp-server[sql,telemetry]"

Step 4: Start the MCP server locally with uvx.

uvx ms-fabric-mcp-server

Step 5: Alternatively, run the installed package directly as a Python module.

python -m ms_fabric_mcp_server

Additional startup options

You can also run the server through the MCP Inspector, a Node-based debugging UI, for development visibility.

npx @modelcontextprotocol/inspector uvx ms-fabric-mcp-server

Available tools

list_workspaces

List all workspaces in Fabric.

list_items

List items within a workspace.

delete_item

Delete a specified item in Fabric.

import_notebook_to_fabric

Import a notebook into Fabric.

get_notebook_content

Retrieve the content of a notebook.

attach_lakehouse_to_notebook

Attach a Lakehouse to a notebook.

get_notebook_execution_details

Get details about notebook executions.

list_notebook_executions

List notebook execution records.

get_notebook_driver_logs

Retrieve driver logs for a notebook.

run_on_demand_job

Run a job on demand.

get_job_status

Get the current status of a job.

get_job_status_by_url

Get job status by its URL.

get_operation_result

Retrieve the result of an operation.

livy_create_session

Create a Livy session.

livy_list_sessions

List Livy sessions.

livy_get_session_status

Get Livy session status.

livy_close_session

Close a Livy session.

livy_run_statement

Run a statement in a Livy session.

livy_get_statement_status

Check the status of a Livy statement.

livy_cancel_statement

Cancel a Livy statement.

livy_get_session_log

Get logs for a Livy session.

create_blank_pipeline

Create a new blank pipeline.

add_copy_activity_to_pipeline

Add a copy activity to a pipeline.

add_notebook_activity_to_pipeline

Add a notebook activity to a pipeline.

add_dataflow_activity_to_pipeline

Add a dataflow activity to a pipeline.

add_activity_to_pipeline

Add a generic activity to a pipeline.

create_semantic_model

Create a new semantic model.

add_table_to_semantic_model

Add a table to a semantic model.

add_relationship_to_semantic_model

Add a relationship to a semantic model.

get_semantic_model_details

Get details of a semantic model.

get_semantic_model_definition

Get the definition of a semantic model.

add_measures_to_semantic_model

Add measures to a semantic model.

delete_measures_from_semantic_model

Delete measures from a semantic model.

refresh_semantic_model

Refresh a semantic model in Power BI integration.

execute_dax_query

Execute a DAX query against a semantic model.

get_sql_endpoint

Get the SQL endpoint (available with the optional `sql` extra).

execute_sql_query

Execute a SQL query against the configured endpoint.

execute_sql_statement

Execute a SQL statement against the configured endpoint.
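The Livy tools above compose into a session lifecycle: create a session, run statements, check their status, then close the session. A sketch of that sequence as MCP `tools/call` payloads built with plain Python; the tool names are real (listed above), but the argument keys (`workspace_id`, `session_id`, `code`, `statement_id`) are illustrative assumptions, not documented parameters:

```python
import json

def tool_call(req_id: int, name: str, arguments: dict) -> str:
    """Build an MCP tools/call JSON-RPC request as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Hypothetical Livy lifecycle; placeholder IDs stay as placeholders.
calls = [
    tool_call(1, "livy_create_session", {"workspace_id": "<workspace-id>"}),
    tool_call(2, "livy_run_statement",
              {"session_id": "<session-id>", "code": "spark.range(10).count()"}),
    tool_call(3, "livy_get_statement_status",
              {"session_id": "<session-id>", "statement_id": 0}),
    tool_call(4, "livy_close_session", {"session_id": "<session-id>"}),
]
for call in calls:
    print(call)
```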