The PRD Creator MCP Server is a specialized tool that enables AI systems to generate detailed Product Requirements Documents through a standardized Model Context Protocol interface. It supports multiple AI providers and offers template-based generation for creating professional PRDs quickly and efficiently.
Using NPX (Recommended):
npx -y prd-creator-mcp
Using Docker:
docker pull saml1211/prd-creator-mcp
docker run -i --rm saml1211/prd-creator-mcp
To run from source:
git clone https://github.com/Saml1211/prd-mcp-server.git
cd prd-mcp-server
npm install
npm run build
npm start
For development:
npm run dev
To configure the server with your preferred AI providers, copy the .env.example file to .env and add your provider credentials, or update provider settings at runtime with the update_provider_config MCP tool.
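For example, a minimal .env can hold a single provider key. Only OPENAI_API_KEY is confirmed elsewhere in this document (it matches the Docker example further down); other providers use their own variables, so check .env.example for the exact names:
# Minimal .env sketch; only OPENAI_API_KEY is confirmed by this document.
# See .env.example for the full list of supported provider variables.
OPENAI_API_KEY=your_key_here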
The PRD generation tool creates a complete PRD document using AI or template-based generation.
Example:
{
  "productName": "TaskMaster Pro",
  "productDescription": "A task management application that helps users organize and prioritize their work efficiently.",
  "targetAudience": "Busy professionals and teams who need to manage multiple projects and deadlines.",
  "coreFeatures": [
    "Task creation and management",
    "Priority setting",
    "Due date tracking",
    "Team collaboration"
  ],
  "constraints": [
    "Must work offline",
    "Must support mobile and desktop platforms"
  ],
  "templateName": "comprehensive",
  "providerId": "openai",
  "additionalContext": "Focus on enterprise features and security",
  "providerOptions": {
    "temperature": 0.5,
    "maxTokens": 4000
  }
}
The validation tool checks a PRD document against best practices.
Example:
{
  "prdContent": "# My Product\n\n## Introduction\n...",
  "validationRules": ["has-introduction", "minimum-length"]
}
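Outside of an MCP host, the two tools above can also be exercised from a short script. The sketch below assumes the official @modelcontextprotocol/sdk client package and assumes the generation tool is registered as generate_prd; it is illustrative only, so confirm the exact tool names against the listing the server returns.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main(): Promise<void> {
  // Launch the server the same way the NPX install does and connect over stdio.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "prd-creator-mcp"],
  });
  const client = new Client({ name: "prd-demo-client", version: "0.1.0" });
  await client.connect(transport);

  // List the tools the server actually exposes; the name used below is an assumption.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Hypothetical call to the PRD generation tool with a subset of the arguments shown above.
  const result = await client.callTool({
    name: "generate_prd", // assumed name; verify against the listing printed above
    arguments: {
      productName: "TaskMaster Pro",
      productDescription: "A task management application for busy teams.",
      targetAudience: "Busy professionals and teams.",
      coreFeatures: ["Task creation and management", "Priority setting"],
      templateName: "comprehensive",
      providerId: "openai",
    },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);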
These tools help you view available options:
list_validation_rules: Shows all available validation rules
list_ai_providers: Shows all available AI providers and their status
list_templates: Shows all available PRD templates
The server includes tools for working with templates:
create_template: Create a new PRD template
get_template: Retrieve a specific template
update_template: Modify an existing template
delete_template: Remove a template
export_templates: Export all templates to JSON
import_templates: Import templates from JSON
render_template: Render a template with data
Tools for managing the server:
get_provider_config: View current provider configuration
update_provider_config: Update provider credentials or settings
health_check: Check system health and provider availability
get_logs: View recent system logs
stats: Review usage statistics
Add to claude_desktop_config.json:
{
  "mcpServers": {
    "prd-creator": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"]
    }
  }
}
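If the server needs provider credentials when launched by the host, the same entry can usually carry them in an env block (Claude Desktop supports this field); the variable name here mirrors the Docker example below and is only illustrative:
{
  "mcpServers": {
    "prd-creator": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"],
      "env": {
        "OPENAI_API_KEY": "your_key_here"
      }
    }
  }
}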
Add to your Cursor MCP client configuration:
{
  "mcpServers": {
    "prd-creator": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"]
    }
  }
}
Add to .roo/mcp.json:
{
  "mcpServers": {
    "prd-creator-mcp": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"]
    }
  }
}
The server is also listed on Glama: https://glama.ai/mcp/servers/@Saml1211/PRD-MCP-Server
Run with Docker and specify environment variables:
docker run -i --rm -e OPENAI_API_KEY=your_key_here prd-creator-mcp
View available command line options:
npx prd-creator-mcp --help
There are two ways to add an MCP server to Cursor. The most common way is to add the server globally in the ~/.cursor/mcp.json file so that it is available in all of your projects.
If you only need the server in a single project, you can add it to the project instead by creating or adding it to the .cursor/mcp.json file.
To add a global MCP server, go to Cursor Settings > MCP and click "Add new global MCP server".
When you click that button, the ~/.cursor/mcp.json file will be opened and you can add your server like this:
{
  "mcpServers": {
    "prd-creator": {
      "command": "npx",
      "args": ["-y", "prd-creator-mcp"]
    }
  }
}
To add an MCP server to a project, you can create a new .cursor/mcp.json file or add it to the existing one. This will look exactly the same as the global MCP server example above.
Once the server is installed, you might need to head back to Settings > MCP and click the refresh button.
The Cursor agent will then be able to see the tools the added MCP server provides and will call them when it needs to.
You can also explicitly ask the agent to use a tool by mentioning the tool name and describing what it does.