
Lanhu MCP Server

Provides a centralized MCP server for analyzing Axure/UI docs, sharing knowledge, and enabling cross-AI collaboration.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "dsphper-lanhu-mcp": {
      "url": "http://localhost:8000/mcp?role=后端&name=%E5%BC%A0%E4%B8%89",
      "headers": {
        "DEBUG": "false",
        "DATA_DIR": "./data",
        "SERVER_HOST": "0.0.0.0",
        "SERVER_PORT": "8000",
        "HTTP_TIMEOUT": "30",
        "LANHU_COOKIE": "your_lanhu_cookie_here",
        "VIEWPORT_WIDTH": "1920",
        "VIEWPORT_HEIGHT": "1080",
        "FEISHU_WEBHOOK_URL": "https://open.feishu.cn/open-apis/bot/v2/hook/your-webhook-url"
      }
    }
  }
}

Lanhu MCP Server is a centralized knowledge hub for AI assistants. It analyzes design and requirement documents, collects team insights, and enables seamless cross-role collaboration by sharing context and tasks among AI agents. This server helps you avoid duplicated analysis, accelerates delivery, and keeps a single source of truth for your team's knowledge and decisions.

How to use

Connect your MCP-capable AI clients to the Lanhu MCP Server to enable shared context, knowledge items, and task-oriented collaboration. You can run the server locally or inside a container, then point your AI clients to the MCP URL. Use the server to store and query large bodies of knowledge, link analysis results to documents, and drive @reminders and cross-role tasks between AI assistants.

How to install

Prerequisites: Python 3.10+, optional Docker for containerized deployment.

Docker deployment (recommended):

```
# 1. Clone the project
git clone https://github.com/dsphper/lanhu-mcp.git
cd lanhu-mcp

# 2. Generate the environment configuration
bash setup-env.sh        # Linux/Mac
# or
setup-env.bat            # Windows

# 3. Start the services with Docker
docker-compose up -d
```

Alternatively, run the server from source if you prefer a Python-based setup without Docker:

```
# 1. Clone the project
git clone https://github.com/dsphper/lanhu-mcp.git
cd lanhu-mcp

# 2. One-click install (recommended)
bash easy-install.sh        # Linux/Mac
# or
easy-install.bat            # Windows

# 3. Run the server directly (Linux/Mac)
python lanhu_mcp_server.py
```

If you prefer manual setup, install the dependencies yourself and run the server directly; for simplicity, though, the one-click install or Docker flow is recommended.

Additional sections

Configuration, security, and troubleshooting details are provided below to help you tailor the server to your environment and keep your data safe.

Environment variables shown in the setup guide include settings for authentication cookies, webhook integrations, host/port, data directories, and diagnostic options. You can configure them in your shell or within a .env file as appropriate for your deployment.
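For example, a local `.env` file might look like the following; the variable names mirror the header settings in the configuration block above, and the placeholder values should be replaced for your deployment:

```shell
# .env — example settings, mirroring the configuration shown earlier
DEBUG=false
DATA_DIR=./data
SERVER_HOST=0.0.0.0
SERVER_PORT=8000
HTTP_TIMEOUT=30
LANHU_COOKIE=your_lanhu_cookie_here
VIEWPORT_WIDTH=1920
VIEWPORT_HEIGHT=1080
FEISHU_WEBHOOK_URL=https://open.feishu.cn/open-apis/bot/v2/hook/your-webhook-url
```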

Clients connect through the MCP URL. By default the server listens at http://localhost:8000/mcp, with optional query parameters identifying the caller's role and user name. Point your MCP client configuration at this URL to begin exchanging knowledge and tasks.
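Note that the role and name query parameters must be URL-encoded, as in the configuration example earlier. A small Python sketch of building such a URL (the helper name `build_mcp_url` is illustrative, not part of the server):

```python
from urllib.parse import urlencode

def build_mcp_url(base: str, role: str, name: str) -> str:
    """Append URL-encoded role/name query parameters to the MCP base URL."""
    return f"{base}?{urlencode({'role': role, 'name': name})}"

# Non-ASCII and spaces are percent-encoded automatically by urlencode.
url = build_mcp_url("http://localhost:8000/mcp", "backend", "Zhang San")
print(url)  # http://localhost:8000/mcp?role=backend&name=Zhang+San
```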

Security and usage notes

Do not expose sensitive cookies or credentials in publicly accessible locations. Deploy in a trusted network or behind appropriate access controls. Ensure you understand the data you cache locally and comply with your organization's privacy and security policies.

The server stores messages, resources, and design artifacts locally. Protect the data directory and back it up as part of your normal data governance practices.
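A simple backup of the data directory can be taken with standard tools, for example (this assumes the default ./data location; adjust DATA_DIR to match your deployment):

```shell
# Back up the server data directory to a timestamped archive
DATA_DIR=./data
BACKUP="lanhu-data-$(date +%Y%m%d).tar.gz"
mkdir -p "$DATA_DIR"          # no-op if the directory already exists
tar czf "$BACKUP" "$DATA_DIR"
```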

Troubleshooting

If the server fails to start, verify that Python 3.10+ is installed and that the required dependencies are present. Check that the data directory is writable and that the chosen port is not in use by another process.
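To check whether the configured port is already taken, a quick sketch (the function name `port_in_use` is illustrative):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. another process is already listening there.
        return s.connect_ex((host, port)) == 0

if port_in_use(8000):
    print("Port 8000 is busy; change SERVER_PORT or stop the other process.")
```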

If you see issues with design asset downloads or cookie access, re-check the configured LANHU_COOKIE value and ensure the cookie is valid for your browser session. Update as needed and restart the server.

Tips for best results

  • Use a single MCP URL across all AI clients to keep context synchronized.
  • Leverage the knowledge and task types to structure collaboration between backend, frontend, and QA AI agents.
  • Enable Feishu (Lark) notifications to alert team members when actions require human review.
  • Regularly prune cached data if storage becomes constrained, relying on incremental updates when possible.

Available tools

lanhu_resolve_invite_link

Parse invite links to join projects and initialize MCP sessions.

lanhu_get_pages

Fetch a list of prototype pages for analysis.

lanhu_get_ai_analyze_page_result

Analyze page content to extract requirements and details.

lanhu_get_designs

Retrieve a list of UI design assets for a project.

lanhu_get_ai_analyze_design_result

Analyze design artifacts to extract assets and semantics.

lanhu_get_design_slices

Obtain information about design slices and exported assets.

lanhu_say

Post a message to the team knowledge board and trigger notifications.

lanhu_say_list

List messages stored in the knowledge board for review.

lanhu_say_detail

Retrieve full content of a specific message from the board.

lanhu_say_edit

Edit an existing message on the knowledge board.

lanhu_say_delete

Delete a message from the knowledge board.

lanhu_get_members

List collaborators who accessed project resources.
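Under the hood, MCP tool invocations are JSON-RPC 2.0 requests with the `tools/call` method. As an illustration, a request body for `lanhu_say` could be assembled like this; the `message` argument name is an assumption, so consult the tool's actual input schema before use:

```python
import json

def tool_call_payload(tool: str, arguments: dict, req_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request body for an MCP tools/call invocation."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# "message" is a hypothetical argument name — check the tool schema.
body = tool_call_payload("lanhu_say", {"message": "API review done"})
print(body)
```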