A Model Context Protocol (MCP) server that enables natural language queries to databases.
Configuration
```json
{
  "mcpServers": {
    "xgenerationlab-xiyan_mcp_server": {
      "command": "python",
      "args": [
        "-m",
        "xiyan_mcp_server"
      ],
      "env": {
        "YML": "path/to/config.yml"
      }
    }
  }
}
```

You can run XiYan MCP Server to enable natural language queries against your databases. This server bridges language models with SQL databases, letting you fetch data and read table contents through intuitive prompts. It supports multiple LLM backends, including local and remote models, and offers secure local operation as an option.
Start the MCP server locally over the standard stdio transport and point your MCP client at the provided command. The server accepts queries and returns SQL results: a client sends a natural language question like "Show me customers from last month", and the server translates it to SQL, executes it against your configured database, and returns the results.
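Over stdio this exchange is ordinary MCP JSON-RPC. A minimal sketch of the request a client would send to invoke the `get_data` tool described below (the `"query"` argument key is an assumption; check the tool schema the server actually advertises):

```python
import json

def build_get_data_request(question, request_id=1):
    # JSON-RPC 2.0 "tools/call" request invoking the server's get_data tool.
    # The argument key "query" is an assumption, not confirmed by the docs.
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "get_data",
            "arguments": {"query": question},
        },
    }

req = build_get_data_request("Show me customers from last month")
print(json.dumps(req, indent=2))
```

The client writes this message to the server's stdin and reads the SQL results back from its stdout.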
Prerequisites: ensure you have Python 3.11 or newer installed.
Install the server from PyPI to get the latest release.
Run the following command in your terminal to install the server:
```shell
pip install xiyan-mcp-server
```

If you prefer installing from source, clone the repository and install from there.
Prepare a YAML configuration file (for example, config.yml) with your MCP settings and database connection details, then run the server using the Python module. The following command starts the server and serves the MCP endpoint locally via standard input/output.
```shell
YML=path/to/config.yml python -m xiyan_mcp_server
```

If you prefer server-sent events (SSE) transport, set the transport to sse in the config and run the same command. The server will expose the SSE endpoint at the configured port.
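The same launch can be scripted from Python. A small sketch that builds the command and environment (the `YML` variable name matches the configuration above; the actual `subprocess` launch is left commented out so the sketch runs without the package installed):

```python
import os
import subprocess
import sys

def build_server_command(config_path):
    """Return the command and environment used to launch the MCP server."""
    env = dict(os.environ)
    env["YML"] = config_path  # the server reads its config path from YML
    cmd = [sys.executable, "-m", "xiyan_mcp_server"]
    return cmd, env

cmd, env = build_server_command("path/to/config.yml")
# subprocess.Popen(cmd, env=env, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
```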
You need a YAML configuration file to configure the server. A default configuration example is shown below. Update the values to match your environment and database details.
```yaml
mcp:
  transport: "stdio"  # or "sse"
model:
  name: "XGenerationLab/XiYanSQL-QwenCoder-32B-2412"
  key: ""
  url: "https://api-inference.modelscope.cn/v1/"
database:
  host: "localhost"
  port: 3306
  user: "root"
  password: ""
  database: ""
```

You can set the transport protocol to stdio or sse in the mcp section. The following options are supported; their defaults are shown below.
```yaml
mcp:
  transport: "stdio"  # for simple local development
```

```yaml
mcp:
  transport: "sse"
  port: 8000
  log_level: "INFO"
```

Configure the large language model (LLM) used to translate natural language to SQL. You may use a general LLM, a state-of-the-art model via Modelscope or Dashscope, or run a local LLM. Each option requires its own settings, including model name, API key, and endpoint URL.
```yaml
model:
  name: "gpt-3.5-turbo"
  key: "YOUR KEY"
  url: "https://api.openai.com/v1"
```

XiYanSQL-QwenCoder-32B-2412 is a SOTA text-to-SQL model. You can use it via Modelscope or Dashscope with the appropriate API keys.
Three main paths exist: a remote general LLM, a specialized SOTA model via Modelscope or Dashscope, or a local LLM. Each path requires different configuration fields, including model name, API key, and endpoint URL.
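For the general-LLM path, the server talks to an OpenAI-compatible endpoint. A hedged sketch of assembling such a request from the model section of the config (the `/chat/completions` path and message shape are assumptions about the provider, not something the server documents here):

```python
def build_chat_request(model_cfg, prompt):
    # Assemble an OpenAI-compatible chat-completions request from the
    # "model" section of config.yml. The endpoint path is an assumption.
    url = model_cfg["url"].rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {model_cfg['key']}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model_cfg["name"],
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, body

url, headers, body = build_chat_request(
    {"name": "gpt-3.5-turbo", "key": "YOUR KEY", "url": "https://api.openai.com/v1"},
    "Translate to SQL: show me customers from last month",
)
```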
Configure the database connection details. The MCP server currently supports MySQL and PostgreSQL. Provide host, port, user, password, and database name.
```yaml
database:
  host: "localhost"
  port: 3306
  user: "root"
  password: ""
  database: ""
```

```yaml
database:
  dialect: "postgresql"
  host: "localhost"
  port: 5432
  user: ""
  password: ""
  database: ""
```

Once the server is running, you can connect clients like Claude Desktop, Cline, Goose, Cursor, or Witsy by configuring each client with the MCP server details. Follow each client's setup instructions to point it at the server's SSE URL or at the stdio command with your YAML config.
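The database sections above map directly onto a standard connection URL. A minimal sketch, assuming a SQLAlchemy-style DSN format and treating MySQL as the default when no dialect is given (both assumptions; the server's internal connection handling may differ):

```python
from urllib.parse import quote

def build_dsn(db_cfg):
    # Build a SQLAlchemy-style DSN from the "database" section of config.yml.
    # MySQL is assumed when no dialect is set, matching the first example above.
    dialect = db_cfg.get("dialect", "mysql")
    user = quote(db_cfg.get("user", ""))
    password = quote(db_cfg.get("password", ""))
    return (
        f"{dialect}://{user}:{password}"
        f"@{db_cfg['host']}:{db_cfg['port']}/{db_cfg['database']}"
    )

dsn = build_dsn({"host": "localhost", "port": 3306,
                 "user": "root", "password": "", "database": "shop"})
# → "mysql://root:@localhost:3306/shop"
```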
The MCP server exposes tools that help you query the database and sample data. The key tools are get_data for translating natural language into SQL and data-resource endpoints like {dialect}://{table_name} to preview sample data and list database names.
If you experience issues starting the server, verify that the YAML file path is correct and that the Python environment has the necessary dependencies installed. Ensure the database credentials are correct and that the database is reachable from the server host.
Use the local (stdio) transport for higher security by avoiding external API keys and keeping model inference on your own infrastructure when possible. If you use remote models, safeguard API keys and manage access to your MCP server.
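One way to keep keys out of config files when you do use remote models is to let an environment variable override the key field. A small sketch (the `XIYAN_API_KEY` variable name is an invented convention for illustration, not a feature of the server):

```python
import os

def resolve_api_key(model_cfg, env_var="XIYAN_API_KEY"):
    # Prefer a key from the environment; fall back to the config file.
    # The env var name is a hypothetical convention, not a server feature.
    return os.environ.get(env_var) or model_cfg.get("key", "")

os.environ["XIYAN_API_KEY"] = "sk-from-env"
key = resolve_api_key({"key": "sk-from-file"})
# → "sk-from-env"
```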
If you want more context on XiYanSQL and related models, explore the XiYanSQL-QwenCoder family and Modelscope/Dashscope integrations for performance and deployment details.
You can list available tables as resources and read table contents to help your models reference the database structure during query construction. Use natural language prompts to retrieve data and inspect your schema as needed.
get_data: Translates natural language questions into SQL queries using the internal model and retrieves results from the configured database.
{dialect}://{table_name}: Accesses sample data or lists databases via the dialect-specific resource endpoints, to aid model reference during query construction.
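Client code working with these resources needs to split the {dialect}://{table_name} form back into its parts. A minimal sketch:

```python
def parse_resource_uri(uri):
    """Split a '{dialect}://{table_name}' resource URI into (dialect, table)."""
    dialect, sep, table = uri.partition("://")
    if not sep:
        raise ValueError(f"not a resource URI: {uri!r}")
    return dialect, table

print(parse_resource_uri("mysql://customers"))  # → ('mysql', 'customers')
```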