
PromptX MCP Server

An MCP server that provides AI agents with expert roles, memory, and tool integrations over HTTP or a local runtime.

Installation
Add the following to your MCP client configuration file.

Configuration

{
  "mcpServers": {
    "deepractice-promptx": {
      "url": "http://127.0.0.1:5203/mcp"
    }
  }
}

Running this MCP server gives AI applications programmable expert roles, memory, and tool integrations. It connects your AI apps to scripted expert capabilities over HTTP endpoints or a local runtime, so you can orchestrate complex AI behavior without reconfiguring each time.
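Under the hood, MCP clients and servers speak JSON-RPC 2.0, starting with an initialize handshake. A minimal sketch of the first message a client would send to an endpoint like the one above (the clientInfo values and protocol revision are illustrative, taken from a published MCP spec date):

```python
import json

def build_initialize_request(request_id: int = 1) -> dict:
    """Build the JSON-RPC 2.0 'initialize' request, the first message
    an MCP client sends to a server. Values below are illustrative."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # an MCP spec revision date
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }

payload = json.dumps(build_initialize_request())
```

In practice your MCP client library handles this handshake for you; the sketch only shows the wire format the configuration above points at.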

How to use

You interact with the MCP server through an MCP client. Start the server via one of several options, then configure your AI application to reach the MCP endpoint. Once connected, you can steer AI behavior by selecting expert roles, activating tools, and maintaining long-term context across chat sessions.

Recommended pattern: choose an MCP connection method, then add the MCP server entry to your AI tool configuration. When you say the trigger phrase or ask for an expert, your AI tool will switch into that expert's role and keep acting with that identity for the session.

How to install

Prerequisites: Node.js (or a compatible runtime), access to a shell or terminal, and an MCP client or another way to run the MCP server locally.

Step 1: Install using the direct-run option, which uses a package runner to start the MCP server.

Step 2: Run the server in a container for production environments.

HTTP connection and local runtime options

You can connect to the MCP server over HTTP so your AI applications call a local endpoint.

You can also run the MCP server as a local tool via a standard runtime command, and point your AI application to that local server.
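For the local-runtime option, most MCP clients accept a command/args entry in place of a URL. A sketch of that shape (the command and package name are placeholders; check the PromptX docs for the exact invocation):

```json
{
  "mcpServers": {
    "deepractice-promptx": {
      "command": "npx",
      "args": ["-y", "<promptx-package>"]
    }
  }
}
```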

Usage examples and start commands

HTTP example (local endpoint): connect to the server at http://127.0.0.1:5203/mcp and begin sending requests for expert roles.

Local runtime example: you run a runtime command to start the MCP server, then configure your client with the local endpoint.
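In the local-runtime case, the client typically talks to the server process over stdio, one JSON-RPC message per line (MCP's stdio transport forbids embedded newlines). A sketch of that framing; the spawned command in the comment is a placeholder:

```python
import json

def frame_message(message: dict) -> bytes:
    """Serialize one JSON-RPC message for MCP's stdio transport:
    a single line of compact JSON terminated by a newline."""
    line = json.dumps(message, separators=(",", ":"))
    if "\n" in line:
        raise ValueError("stdio messages must not contain embedded newlines")
    return (line + "\n").encode()

# Hypothetical launch of a local MCP server process:
# import subprocess
# proc = subprocess.Popen(["<promptx-command>"],
#                         stdin=subprocess.PIPE, stdout=subprocess.PIPE)
# proc.stdin.write(frame_message(
#     {"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {}}))
```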

Notes and best practices

Keep the expert state active during sessions to preserve context and memory. Use natural language to request expert capabilities rather than specialized commands. This approach simplifies collaboration with AI agents.

Available tools

Excel Tool

Enables data analysis, automated report generation, chart visualization, and data processing automation within AI-assisted workflows.

Word Tool

Provides document reading, professional writing assistance, batch text replacement, and format conversions.

PDF Reader

Offers page-by-page reading, content extraction, image extraction, and smart caching for faster access to PDFs.

Nuwa

AI Role Designer that creates specialized professional roles from natural language, enabling tailored expert behavior.

Luban

Tool Integration Master that connects AI to external APIs and data sources, enabling actions like database queries or Slack postings.

Writer

Professional content creator that crafts technical or marketing content with human-like tone and personality.