
Oxylabs MCP Server

Provides access to Oxylabs Web Scraper API and AI Studio tools to scrape, render, clean, and extract data for AI workloads across 195+ countries.

Installation
Add the following to your MCP client configuration file.

Configuration

{
    "mcpServers": {
        "oxylabs_http1": {
            "url": "https://server.smithery.ai/@oxylabs/oxylabs-mcp/mcp"
        }
    }
}
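For clients that store their configuration as JSON, the snippet above can also be merged programmatically. A minimal sketch in Python, assuming a dict-shaped config; the helper name is illustrative, not part of the Oxylabs tooling:

```python
import json

def add_mcp_server(config: dict, name: str, spec: dict) -> dict:
    """Merge one server entry into an MCP client configuration
    without clobbering servers that are already registered."""
    config.setdefault("mcpServers", {})[name] = spec
    return config

# Register the Oxylabs remote endpoint in an empty configuration.
config = add_mcp_server(
    {},
    "oxylabs_http1",
    {"url": "https://server.smithery.ai/@oxylabs/oxylabs-mcp/mcp"},
)
print(json.dumps(config, indent=2))
```

Using `setdefault` keeps any servers you have already registered intact, so the Oxylabs entry is added alongside them rather than replacing the whole `mcpServers` object.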

Oxylabs MCP Server acts as a bridge between AI models and the real web. It lets you scrape any URL, render JavaScript-heavy pages, clean and format content for AI consumption, bypass anti-scraping measures, and access geo-restricted data from 195+ countries. This enables you to feed AI analysis with fresh, structured web content without building scraping infrastructure from scratch.

How to use

You can run Oxylabs MCP Server with different clients that fit your workflow. Use the HTTP endpoint configuration to connect to the remote MCP service via URL, or use the STDIO configurations to run a local MCP server instance on your machine or in your environment. In either case, the server exposes only the tools your credentials enable, so you see just the options available to you.
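The distinction between the two modes shows up directly in the configuration shape: remote HTTP servers are addressed by a `url` key, while local STDIO servers carry a `command` plus `args`. A small illustrative check; the helper name is an assumption, not part of any MCP SDK:

```python
def transport_of(server: dict) -> str:
    """Classify an MCP server entry by its configuration shape:
    remote HTTP endpoints carry a 'url', local STDIO servers a 'command'."""
    if "url" in server:
        return "http"
    if "command" in server:
        return "stdio"
    raise ValueError("unrecognized MCP server entry")

print(transport_of({"url": "https://server.smithery.ai/@oxylabs/oxylabs-mcp/mcp"}))  # http
print(transport_of({"command": "uvx", "args": ["oxylabs-mcp"]}))  # stdio
```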

How to install

Before building or running the server, you need Node.js, a runtime capable of executing the chosen MCP method, and internet access for remote service calls. Install the required MCP tooling and prepare your credentials as described in the configuration steps.


Additional configuration and setup

There are two options for running Oxylabs MCP Server locally: the uvx or uv package managers. Choose one based on your environment and preference.

If you use uvx, install it and run the MCP server with the following configuration snippet.

{
  "mcpServers": {
    "oxylabs": {
      "command": "uvx",
      "args": ["oxylabs-mcp"],
      "env": {
        "OXYLABS_USERNAME": "OXYLABS_USERNAME",
        "OXYLABS_PASSWORD": "OXYLABS_PASSWORD",
        "OXYLABS_AI_STUDIO_API_KEY": "OXYLABS_AI_STUDIO_API_KEY"
      }
    }
  }
}
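Before wiring the snippet into a client, it can help to verify that all three environment variables are actually set. A hedged sketch; the helper is illustrative, but the variable names match the snippet above:

```python
import os

# Required variable names, mirroring the configuration snippet above.
REQUIRED = ("OXYLABS_USERNAME", "OXYLABS_PASSWORD", "OXYLABS_AI_STUDIO_API_KEY")

def missing_credentials(env) -> list:
    """Return the names of required Oxylabs variables that are absent or empty."""
    return [key for key in REQUIRED if not env.get(key)]

# Check the current process environment; an empty list means you are set.
print(missing_credentials(os.environ))
```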

Alternative: run with the uv package manager

If you prefer the uv package manager, use this configuration to run Oxylabs MCP Server locally. Make sure to replace the placeholder path with the actual absolute path to your oxylabs-mcp folder.

{
  "mcpServers": {
    "oxylabs": {
      "command": "uv",
      "args": [
        "--directory",
        "/<Absolute-path-to-folder>/oxylabs-mcp",
        "run",
        "oxylabs-mcp"
      ],
      "env": {
        "OXYLABS_USERNAME": "OXYLABS_USERNAME",
        "OXYLABS_PASSWORD": "OXYLABS_PASSWORD",
        "OXYLABS_AI_STUDIO_API_KEY": "OXYLABS_AI_STUDIO_API_KEY"
      }
    }
  }
}
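For reference, the client spawns the configured command followed by its args, so the entry above is roughly equivalent to a single shell invocation. A sketch assembling that command line; the directory below is a placeholder, not a real path:

```python
import shlex

entry = {
    "command": "uv",
    # Placeholder path -- substitute your absolute oxylabs-mcp folder.
    "args": ["--directory", "/path/to/oxylabs-mcp", "run", "oxylabs-mcp"],
}

# The MCP client effectively executes the command followed by its args.
print(shlex.join([entry["command"], *entry["args"]]))
```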

Smithery OAuth2 and URL-based configurations

You can connect using an OAuth2 flow via Smithery or pass credentials in the query string for clients that do not support OAuth2.

{ 
  "mcpServers": {
    "oxylabs": {
      "url": "https://server.smithery.ai/@oxylabs/oxylabs-mcp/mcp"
    }
  }
}
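For clients without OAuth2 support, credentials go into the query string of the URL above. A hedged sketch of building such a URL with Python's standard library; the parameter name below (api_key) is an assumption, so use whatever parameters your client and Smithery account actually require:

```python
from urllib.parse import urlencode

BASE_URL = "https://server.smithery.ai/@oxylabs/oxylabs-mcp/mcp"

def url_with_credentials(params: dict) -> str:
    """Append credential parameters to the server URL; names are illustrative."""
    return f"{BASE_URL}?{urlencode(params)}"

print(url_with_credentials({"api_key": "YOUR_KEY"}))  # placeholder key
```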

Troubleshooting and notes

If you see errors related to authentication, verify that your credentials are correctly set in the environment variables and that you have access rights to the Oxylabs services you are attempting to use.

Available tools

universal_scraper

Uses Oxylabs Web Scraper API for general website scraping.

google_search_scraper

Extracts results from Google Search using Oxylabs Web Scraper API.

amazon_search_scraper

Scrapes Amazon search results pages via Oxylabs Web Scraper API.

amazon_product_scraper

Extracts data from individual Amazon product pages using Oxylabs Web Scraper API.

ai_scraper

Scrapes content from any URL with AI-powered data extraction, returning JSON or Markdown.

ai_crawler

Crawls a site based on a prompt and collects data across multiple pages, returning Markdown or JSON.

ai_browser_agent

Controls a browser based on a prompt and returns data as Markdown, JSON, HTML, or screenshots.

ai_search

Searches the web for URLs and extracts content with AI-powered processing.