Unstructured API MCP server

Integrates with the Unstructured API to enable document processing operations: manage connectors, workflows, and jobs for extracting structured information from documents, all without switching tools.
Provider: Unstructured
Release date: Mar 15, 2025
Language: Python
Stats: 32 stars

The Unstructured API MCP Server acts as a bridge to the Unstructured platform, enabling users to manage data sources, destinations, workflows, and jobs through a set of specialized tools. This server makes it easy to create automated document processing pipelines between various storage systems and vector databases.

Installation

Prerequisites

  • Python 3.12 or higher
  • uv for environment management
  • An Unstructured API key (available from the Unstructured platform)

Option 1: Using uv (Recommended)

Install the package directly:

uv pip install uns_mcp

Option 2: From Source Code

  1. Clone the repository
  2. Install dependencies:
    uv sync
    
  3. Create a .env file with your API key:
    UNSTRUCTURED_API_KEY="YOUR_KEY"
    
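The .env format is simple enough to read with the standard library alone. Below is a minimal, illustrative parser; the server itself likely relies on a dotenv library, and `parse_dotenv` is a hypothetical helper name, not part of uns_mcp:

```python
def parse_dotenv(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, ignoring blanks, '#' comments, and surrounding quotes."""
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip empty lines, comments, and lines without an assignment
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env
```

For example, `parse_dotenv(Path(".env").read_text())["UNSTRUCTURED_API_KEY"]` would return the key configured above.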

Configuration

Configuring with Claude Desktop

Add the following to your claude_desktop_config.json file (on macOS, located in ~/Library/Application Support/Claude/):

Using the uvx command:

{
   "mcpServers": {
      "UNS_MCP": {
         "command": "uvx",
         "args": ["uns_mcp"],
         "env": {
           "UNSTRUCTURED_API_KEY": "<your-key>"
         }
      }
   }
}

Using the Python package:

{
   "mcpServers": {
      "UNS_MCP": {
         "command": "python",
         "args": ["-m", "uns_mcp"],
         "env": {
           "UNSTRUCTURED_API_KEY": "<your-key>"
         }
      }
   }
}
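If your claude_desktop_config.json already lists other servers, the UNS_MCP entry should be merged in rather than pasted over the whole file. A sketch using only the standard library; `add_uns_mcp` is a hypothetical helper name:

```python
import json
from pathlib import Path

def add_uns_mcp(config_path: str, api_key: str) -> dict:
    """Merge a UNS_MCP entry into claude_desktop_config.json, keeping other servers."""
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["UNS_MCP"] = {
        "command": "uvx",
        "args": ["uns_mcp"],
        "env": {"UNSTRUCTURED_API_KEY": api_key},
    }
    path.write_text(json.dumps(config, indent=2))
    return config
```

Using setdefault means any existing entries under "mcpServers" are left untouched.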

Available Tools

The UNS_MCP server provides the following capabilities:

Source Management

  • list_sources: View all available source connectors
  • get_source_info: Get details about a specific source
  • create_source_connector: Set up a new source connector
  • update_source_connector: Modify an existing source connector
  • delete_source_connector: Remove a source connector

Destination Management

  • list_destinations: View all available destination connectors
  • get_destination_info: Get details about a specific destination
  • create_destination_connector: Set up a new destination connector
  • update_destination_connector: Modify an existing destination
  • delete_destination_connector: Remove a destination connector

Workflow Management

  • list_workflows: View all workflows
  • get_workflow_info: Get details about a specific workflow
  • create_workflow: Create a new processing workflow
  • run_workflow: Execute a specific workflow
  • update_workflow: Modify an existing workflow
  • delete_workflow: Remove a workflow

Job Management

  • list_jobs: View all jobs for a specific workflow
  • get_job_info: Get details about a specific job
  • cancel_job: Cancel a running job
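On the wire, an MCP client invokes any of these tools as a JSON-RPC 2.0 tools/call request. The sketch below shows roughly what such a request might look like for run_workflow; the workflow_id argument name is an illustrative assumption, not taken from the server's actual tool schema:

```python
import json

# Illustrative JSON-RPC 2.0 payload for calling the run_workflow tool.
# The argument name "workflow_id" is an assumption, not the server's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_workflow",
        "arguments": {"workflow_id": "<workflow-id>"},
    },
}
print(json.dumps(request, indent=2))
```

In practice your MCP client library builds this envelope for you; the sketch only shows what the client and server exchange.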

Supported Connectors

Source Connectors

  • S3
  • Azure
  • Google Drive
  • OneDrive
  • Salesforce
  • SharePoint

Destination Connectors

  • S3
  • Weaviate
  • Pinecone
  • AstraDB
  • MongoDB
  • Neo4j
  • Databricks Volumes
  • Databricks Volumes Delta Table

Credential Configuration

The following credentials must be defined in your .env file to use the respective connectors:

Essential Credentials

  • UNSTRUCTURED_API_KEY: Required for all operations
  • ANTHROPIC_API_KEY: Required for running the minimal client

Source Connector Credentials

  • S3: AWS_KEY, AWS_SECRET
  • Azure: Either AZURE_CONNECTION_STRING, AZURE_ACCOUNT_NAME+AZURE_ACCOUNT_KEY, or AZURE_ACCOUNT_NAME+AZURE_SAS_TOKEN
  • Google Drive: GOOGLEDRIVE_SERVICE_ACCOUNT_KEY
  • OneDrive: ONEDRIVE_CLIENT_ID, ONEDRIVE_CLIENT_CRED, ONEDRIVE_TENANT_ID
  • Salesforce: SALESFORCE_CONSUMER_KEY, SALESFORCE_PRIVATE_KEY
  • SharePoint: SHAREPOINT_CLIENT_ID, SHAREPOINT_CLIENT_CRED, SHAREPOINT_TENANT_ID
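Since Azure accepts three different credential combinations, a small helper makes it easy to check which one a given environment actually satisfies. This is a sketch; `resolve_azure_auth` is a hypothetical name, not part of uns_mcp:

```python
def resolve_azure_auth(env: dict[str, str]) -> str:
    """Return which of the three Azure credential combinations is present."""
    if env.get("AZURE_CONNECTION_STRING"):
        return "connection_string"
    if env.get("AZURE_ACCOUNT_NAME") and env.get("AZURE_ACCOUNT_KEY"):
        return "account_key"
    if env.get("AZURE_ACCOUNT_NAME") and env.get("AZURE_SAS_TOKEN"):
        return "sas_token"
    raise ValueError("No complete Azure credential combination found")
```

You could call it with `dict(os.environ)` before starting the server to fail fast on a half-configured connector.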

Destination Connector Credentials

  • Weaviate: WEAVIATE_CLOUD_API_KEY
  • AstraDB: ASTRA_DB_APPLICATION_TOKEN, ASTRA_DB_API_ENDPOINT
  • Neo4j: NEO4J_PASSWORD
  • MongoDB: MONGO_DB_CONNECTION_STRING
  • Databricks: DATABRICKS_CLIENT_ID, DATABRICKS_CLIENT_SECRET
  • Pinecone: PINECONE_API_KEY

Other Configuration Options

  • LOG_LEVEL: Set logging level for the minimal client
  • CONFIRM_TOOL_USE: Set to true to require confirmation before each tool is executed
  • DEBUG_API_REQUESTS: Set to true for detailed API request debugging
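Flags like CONFIRM_TOOL_USE arrive from the environment as strings. Exactly which values the server accepts is up to its implementation, but a common pattern for interpreting them looks like this (`env_flag` is a hypothetical helper):

```python
import os

# Common truthy spellings for string-valued environment flags; which of
# these uns_mcp itself accepts is an assumption, not documented behavior.
_TRUTHY = {"1", "true", "yes", "on"}

def env_flag(name: str, default: bool = False) -> bool:
    """Interpret an environment variable such as CONFIRM_TOOL_USE as a boolean."""
    value = os.environ.get(name)
    if value is None:
        return default
    return value.strip().lower() in _TRUTHY
```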

Firecrawl Integration

The server includes integration with Firecrawl for web content extraction:

HTML Content Retrieval

  • Use invoke_firecrawl_crawlhtml to start web crawling
  • Monitor jobs with check_crawlhtml_status
  • Cancel jobs with cancel_crawlhtml_job

LLM-Optimized Text Generation

  • Use invoke_firecrawl_llmtxt to generate LLM-optimized text
  • Retrieve results with check_llmtxt_status

Note: You'll need to set the FIRECRAWL_API_KEY environment variable to use these features.
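Crawl jobs are asynchronous, so a client typically polls check_crawlhtml_status until the job finishes. A generic polling sketch follows; the status values and the shape of the result dict are assumptions for illustration, not Firecrawl's documented contract:

```python
import time

def poll_until_done(check_status, job_id, interval=5.0, timeout=600.0):
    """Poll a long-running job until it reaches a terminal state.

    `check_status` stands in for a status call such as check_crawlhtml_status
    and is assumed to return a dict with a 'status' key.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = check_status(job_id)
        if result.get("status") in ("completed", "failed", "cancelled"):
            return result
        time.sleep(interval)
    raise TimeoutError(f"Job {job_id} did not finish within {timeout}s")
```

Passing the status tool as a callable keeps the loop reusable for both the crawl and llmtxt job types.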

Debugging

For debugging your MCP server, Anthropic provides an MCP Inspector tool:

mcp dev uns_mcp/server.py

You can also enable detailed logging of API request parameters by setting:

DEBUG_API_REQUESTS=true

Terminal Access (Advanced)

To add terminal access to the minimal client:

  1. Install desktop-commander:

    npx @wonderwhy-er/desktop-commander setup
    
  2. Start the client with the additional parameter:

    uv run python minimal_client/client.py "http://127.0.0.1:8080/sse" "@wonderwhy-er/desktop-commander"
    

Warning: This gives the client (and LLM) access to your local files. Use with caution.

Troubleshooting

If you encounter Error: spawn <command> ENOENT, the command is either not installed or not on your PATH. To fix it:

  • Install the missing command
  • Add it to your PATH
  • Or provide an absolute path in your configuration
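You can check how the spawning process would resolve a command with Python's standard-library shutil.which; `diagnose_command` here is a hypothetical helper, not part of uns_mcp:

```python
import shutil

def diagnose_command(command: str) -> str:
    """Report whether a command (e.g. 'uvx') is resolvable on PATH."""
    path = shutil.which(command)
    if path:
        return f"{command} found at {path}"
    return f"{command} not found: install it, add it to PATH, or use its absolute path"
```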

How to install this MCP server

For Claude Code

To add this MCP server to Claude Code, run this command in your terminal:

claude mcp add-json "UNS_MCP" '{"command":"uvx","args":["uns_mcp"],"env":{"UNSTRUCTURED_API_KEY":"<your-key>"}}'

See the official Claude Code MCP documentation for more details.

For Cursor

There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.

If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.

Adding an MCP server to Cursor globally

To add a global MCP server, go to Cursor Settings > Tools & Integrations and click "New MCP Server".

When you click that button, the ~/.cursor/mcp.json file opens and you can add your server like this:

{
    "mcpServers": {
        "UNS_MCP": {
            "command": "uvx",
            "args": [
                "uns_mcp"
            ],
            "env": {
                "UNSTRUCTURED_API_KEY": "<your-key>"
            }
        }
    }
}

Adding an MCP server to a project

To add an MCP server to a project you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.

How to use the MCP server

Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.

The Cursor agent will then be able to see the tools the MCP server exposes and will call them when needed.

You can also explicitly ask the agent to use the tool by mentioning the tool name and describing what the function does.

For Claude Desktop

To add this MCP server to Claude Desktop:

1. Find your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

2. Add this to your configuration file:

{
    "mcpServers": {
        "UNS_MCP": {
            "command": "uvx",
            "args": [
                "uns_mcp"
            ],
            "env": {
                "UNSTRUCTURED_API_KEY": "<your-key>"
            }
        }
    }
}

3. Restart Claude Desktop for the changes to take effect
