ScrapeGraph MCP Server
Provides enterprise-ready MCP endpoints for AI-powered web scraping with eight tools for markdown, extraction, crawling, and automation.
Configuration
```json
{
  "mcpServers": {
    "scrapegraphai-scrapegraph-mcp": {
      "url": "https://scrapegraph-mcp.onrender.com/mcp",
      "headers": {
        "SGAI_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```

You can run the ScrapeGraph MCP Server to empower AI agents with powerful web scraping capabilities. This server exposes eight tools for markdown conversion, AI-driven data extraction, multi-page crawling, and flexible output formats, enabling reliable integration with MCP clients like Claude Desktop and Cursor.
You will connect your MCP client to the hosted MCP endpoint or run the server locally to start using all tools. In both cases, you will configure the client to interact with the server through the MCP protocol, send natural language prompts or structured requests, and receive results in your preferred format. Use the client to request page transformations, data extraction, site crawls, or advanced agentic scraping workflows. Authentication is performed via an API key that you obtain from ScrapeGraph and provide to the MCP client as needed.
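Under the hood, MCP clients talk to the server with JSON-RPC 2.0 requests. As a minimal sketch (your MCP client handles this protocol for you; the URL and `SGAI_API_KEY` header are taken from the configuration above), this is roughly the shape of a `tools/list` request:

```python
# Sketch of the JSON-RPC request an MCP client sends to the hosted
# endpoint to discover the available tools. The payload shape follows
# JSON-RPC 2.0, which MCP uses as its wire format.
import json

MCP_URL = "https://scrapegraph-mcp.onrender.com/mcp"


def build_tools_list_request(api_key: str) -> tuple[dict, dict]:
    """Return (headers, payload) for an MCP `tools/list` call."""
    headers = {
        "Content-Type": "application/json",
        "SGAI_API_KEY": api_key,  # authentication header from the config above
    }
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/list",
        "params": {},
    }
    return headers, payload


headers, payload = build_tools_list_request("YOUR_API_KEY")
print(json.dumps(payload))
```

In normal use you never build these requests yourself; this only illustrates what the client exchanges with the server on your behalf.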
Before installation you need a supported runtime and a ScrapeGraph API key. You can either run the server locally for development or use the remote MCP endpoint.
Step 1. Install the MCP server (recommended): use a package manager for a one-command setup.

```shell
# Automated installation via Smithery (recommended)
npx -y @smithery/cli install @ScrapeGraphAI/scrapegraph-mcp --client claude
```

Configure your client with either the remote MCP endpoint or a local startup command. Use the hosted server for immediate access, or run the server locally for development and testing. Ensure your API key is available where the client expects it, typically via an environment variable or a configuration string.
Remote server configuration is provided for Claude Desktop and Cursor clients. Local server usage requires starting the MCP server on your machine and connecting your MCP client to it.
Protect your ScrapeGraph API key. Use environment variables or secure configuration mechanisms to pass the API key to your MCP client. Do not hard-code keys in code that can be exposed in version control or logs.
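One common way to keep the key out of source control is to read it from the environment at startup. A minimal sketch, assuming the `SGAI_API_KEY` variable name used in the configuration above:

```python
# Load the ScrapeGraph API key from the environment rather than
# hard-coding it, so it never appears in committed code or logs.
import os


def load_api_key() -> str:
    """Return the API key from SGAI_API_KEY, failing fast if unset."""
    key = os.environ.get("SGAI_API_KEY")
    if not key:
        raise RuntimeError(
            "SGAI_API_KEY is not set; export it before starting the MCP client"
        )
    return key
```

Failing fast on a missing key surfaces configuration problems at startup instead of as opaque authentication errors later.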
If tools do not appear or you encounter authentication errors, verify that your API key is correct and accessible to the MCP client. Check logs for startup messages and confirm the MCP server is reachable at the configured URL or is running locally.
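A quick reachability probe can separate network problems from authentication problems. This is a generic sketch using only the standard library, not a ScrapeGraph-specific diagnostic:

```python
# Check whether the MCP endpoint is reachable at all. Any HTTP response,
# including an error status such as 401, means the server is up, so the
# remaining problem is likely the API key or client configuration.
import urllib.error
import urllib.request


def endpoint_reachable(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL answers at the HTTP level, False otherwise."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded (even with an error status), so it is up.
        return True
    except Exception:
        # DNS failure, refused connection, timeout, etc.
        return False
```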
Convert a page to markdown, extract structured data with AI prompts, crawl a site with pagination, or run an agentic scraping workflow with a defined output schema.
Remote configuration (HTTP):

```json
[
  {
    "type": "http",
    "name": "scrapegraph_remote",
    "url": "https://scrapegraph-mcp.onrender.com/mcp",
    "args": []
  }
]
```

Local configuration (stdio):

```json
[
  {
    "type": "stdio",
    "name": "scrapegraph_local",
    "command": "scrapegraph-mcp",
    "args": []
  },
  {
    "type": "stdio",
    "name": "scrapegraph_local_python",
    "command": "python",
    "args": ["-m", "scrapegraph_mcp.server"]
  }
]
```

The setup includes a one-command installation via Smithery for the local server and remote endpoints for Claude Desktop and Cursor. When configuring Claude Desktop, you can specify an `mcpServers` entry with a command using npx to run the MCP client and pass your API key in the config. For Cursor, you provide the remote MCP URL and API key in the headers section.
If you are developing against the server, ensure you have Python 3.13 or higher and a working Python package manager. You can test with MCP Inspector and verify tooling with linting and type checks as part of your workflow.
This server provides eight enterprise-grade tools for web scraping, including markdown output, AI-driven extraction, multi-page crawling, and agentic workflows. It is designed for reliability in production environments and easy integration with MCP clients.
```json
{
  "mcpServers": {
    "scrapegraph-mcp-local": {
      "command": "scrapegraph-mcp",
      "args": []
    }
  }
}
```

The eight tools:

- Transform any webpage into clean, structured markdown format.
- AI-powered extraction with support for infinite scrolling to gather structured data.
- AI-powered web searches with structured, actionable results.
- Fetch page content with optional heavy JavaScript rendering.
- Extract sitemap URLs and structure for a website.
- Initiate asynchronous multi-page crawling with configurable depth and extraction mode.
- Poll and retrieve results from asynchronous crawls.
- Run advanced agentic scraping workflows with customizable steps and schemas.
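Each of these tools is invoked through the MCP `tools/call` method. As a sketch of the payload shape (the tool name `"markdownify"` and the `website_url` argument are assumptions for illustration; check the server's actual `tools/list` response for the exact names and argument schemas):

```python
# Sketch of an MCP `tools/call` JSON-RPC payload. Tool and argument
# names here are hypothetical examples, not confirmed server API.
def build_tool_call(tool_name: str, arguments: dict, request_id: int = 2) -> dict:
    """Return a JSON-RPC 2.0 payload invoking one MCP tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }


# Hypothetical call to the markdown-conversion tool:
call = build_tool_call("markdownify", {"website_url": "https://example.com"})
```

Again, MCP clients such as Claude Desktop or Cursor construct these calls for you from natural language prompts; the sketch only shows what crosses the wire.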