
Baidu Search MCP Server

Provides an MCP server that queries the Baidu Wenxin API with a choice of models and returns search results with sources.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "appleinmusic-baidu-search-mcp": {
      "command": "node",
      "args": [
        "/path/to/baidu-search-mcp/build/index.js"
      ],
      "env": {
        "BAIDU_API_KEY": "your_api_key_here"
      }
    }
  }
}

This MCP server performs intelligent searches via the Baidu Wenxin API through MCP clients, returning both results and references. You can choose among multiple models and apply depth and recency filters to tailor searches to your AI assistant workflows.

How to use

Run this server locally or register it in your MCP configuration, then query it through your MCP client just as you would any other data source. Use it to perform intelligent web searches via the Baidu Wenxin API: specify a model, enable deep search when needed, and apply recency filters to keep results timely. The server returns search results along with references so your AI can cite sources when answering questions.

How to install

Prerequisites: you need Node.js and npm installed on your system.

Install the project dependencies for the MCP server, including the required SDKs.

Build the server package so you can run it.

Start the server locally or configure it in your MCP setup as described in the configuration section.
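Assuming the standard npm workflow (the exact directory layout and script names here are illustrative and may differ from the repository), the steps above typically look like:

```shell
# Enter the repository (path is illustrative)
cd baidu-search-mcp

# Install dependencies, including the MCP SDK
npm install

# Compile the TypeScript sources into build/
npm run build

# Start the server directly, or let your MCP client launch it
BAIDU_API_KEY=your_api_key_here node build/index.js
```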

Configuration and usage notes

To use this server, obtain your Baidu Wenxin API key from Baidu Smart Cloud by creating an application and retrieving the API key.

Set the environment variable BAIDU_API_KEY to your API key before starting the server.
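As a sketch of how the server might read that key at startup (the helper name `getApiKey` is illustrative, not part of the package):

```typescript
// Read the Baidu Wenxin API key from the environment, failing fast if absent.
function getApiKey(env: Record<string, string | undefined> = process.env): string {
  const key = env.BAIDU_API_KEY;
  if (!key) {
    throw new Error("BAIDU_API_KEY is not set; export it before starting the server");
  }
  return key;
}
```

Failing at startup gives a clearer error than letting the first search request fail against the Baidu API.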

Configuration example for MCP

Use the following in your MCP configuration to run the server as a local process. Replace the placeholder path with the actual build index path and insert your API key.

{
  "mcpServers": {
    "baidu_search": {
      "command": "node",
      "args": ["/path/to/baidu-search-mcp/build/index.js"],
      "env": {
        "BAIDU_API_KEY": "your_api_key_here"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}

API

The server exposes a single tool, baidu_search, with the following parameters. You can set the model, search mode, and depth or recency filters to suit your needs. The response includes results and references.

Parameters:
- query (required)
- model: ernie-3.5-8k, ernie-4.0-8k, deepseek-r1, deepseek-v3 (default: ernie-3.5-8k)
- search_mode: auto, required, disabled (default: auto)
- enable_deep_search: boolean (default: false)
- search_recency_filter: week, month, semiyear, year

You can enable deep search and specify a recency window to refine results.
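As a sketch of how these defaults and enums fit together (the `SearchArgs` type and `normalizeArgs` helper are illustrative, not the server's actual code):

```typescript
type SearchArgs = {
  query: string;
  model?: "ernie-3.5-8k" | "ernie-4.0-8k" | "deepseek-r1" | "deepseek-v3";
  search_mode?: "auto" | "required" | "disabled";
  enable_deep_search?: boolean;
  search_recency_filter?: "week" | "month" | "semiyear" | "year";
};

// Apply the documented defaults to a baidu_search tool call.
function normalizeArgs(args: SearchArgs): SearchArgs {
  if (!args.query) {
    throw new Error("query is required");
  }
  return {
    query: args.query,
    model: args.model ?? "ernie-3.5-8k",
    search_mode: args.search_mode ?? "auto",
    enable_deep_search: args.enable_deep_search ?? false,
    search_recency_filter: args.search_recency_filter, // no default: omit for no recency limit
  };
}
```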

Development

Build and run steps are provided to prepare the MCP server for use. Install dependencies, compile the TypeScript sources, and start the server as an MCP endpoint.

Available tools

baidu_search

Search using Baidu Wenxin API with configurable model, search mode, deep search, and recency filters. Returns results and references for citation in your AI agent responses.
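For example, an MCP client might invoke the tool with arguments like the following (the surrounding request envelope depends on your client; the argument names come from the parameter list above):

```typescript
// Example arguments for a baidu_search tool call: deep search enabled on the
// default model, restricted to results from the last month.
const args = {
  query: "latest large language model benchmark results",
  search_mode: "auto",
  enable_deep_search: true,
  search_recency_filter: "month",
};
console.log(JSON.stringify(args));
```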