AI-powered Model Context Protocol server for theatrical lighting design and control. Features intelligent fixture management, script-based scene generation, and dynamic cue sequencing for professional theater productions.
LacyLights MCP Server provides an AI-powered interface to design, analyze, and control theatrical lighting through natural language. It connects your AI-assisted workflows with fixture management, cue creation, and live playback, enabling rapid iteration from concept to cue-ready designs.
You interact with the server through an MCP client or any integration that speaks the MCP protocol. Natural-language prompts let you generate looks, analyze scripts, create cue sequences, and control live playback. The server translates scripted moments, moods, and blocking into DMX-ready lighting configurations, which you can then refine with automatic optimization for dramatic impact, energy efficiency, or simplicity.

A typical session starts by creating or loading a project, defining fixtures and channels, generating looks from descriptions, assembling cue sequences, and driving playback during rehearsals or performances. When running a show, you can start a cue list, advance through cues, jump to a specific cue, or fade to black as needed, and adjust looks on the fly in response to actor blocking or directorial notes.
Prerequisites: ensure you have Node.js version 18 or newer installed on your system.
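You can confirm the installed version before proceeding:

```shell
# Check the Node.js version; expect v18.0.0 or newer
node --version
```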
Install dependencies and prepare the environment.

```shell
# Install dependencies
npm install

# Copy example environment file and edit with your configuration
cp .env.example .env
```

Build and start the server in production mode after configuring your environment.

```shell
npm run build
npm start
```

Run the MCP server once your backend is ready. You should see confirmation that the RAG service is initialized and the MCP server is running in stdio mode.
The server relies on environment variables for AI access and backend connectivity. Define OPENAI_API_KEY for AI-powered lighting generation and LACYLIGHTS_GRAPHQL_ENDPOINT to point to your lacylights-go GraphQL backend. Optional ChromaDB settings enable enhanced retrieval-augmented generation.
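A minimal `.env` might look like the following; the ChromaDB host and port values are assumed defaults for the optional RAG settings, not values confirmed by the project:

```shell
# Required: enables AI-powered lighting generation
OPENAI_API_KEY=your_openai_api_key_here

# Required: points to your lacylights-go GraphQL backend
LACYLIGHTS_GRAPHQL_ENDPOINT=http://localhost:4000/graphql

# Optional: ChromaDB for enhanced retrieval-augmented generation
# (host/port shown are assumed defaults)
CHROMA_HOST=localhost
CHROMA_PORT=8000
```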
Example MCP client configuration with placeholder values:
```json
{
  "mcpServers": {
    "lacylights": {
      "command": "/usr/local/bin/node",
      "args": ["/path/to/lacylights-mcp/run-mcp.js"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "LACYLIGHTS_GRAPHQL_ENDPOINT": "http://localhost:4000/graphql"
      }
    }
  }
}
```

Common issues include module import errors, GraphQL connection problems, OpenAI API errors, and MCP integration errors in Claude. Ensure you are using the proper Node wrapper script, verify backend availability, and confirm your API keys and endpoints are correctly configured.
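For GraphQL connection problems, one quick way to verify backend availability is to send a trivial query to the endpoint (the URL below is the same placeholder used in the configuration above):

```shell
# A minimal GraphQL query; a reachable backend returns a JSON response
curl -s -X POST http://localhost:4000/graphql \
  -H "Content-Type: application/json" \
  -d '{"query":"{ __typename }"}'
```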
To enable persistent vector storage for improved retrieval, you can run ChromaDB locally or via Docker. Update the CHROMA_HOST and CHROMA_PORT values in your environment to enable the RAG features.
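For example, a local ChromaDB instance can be started with Docker; the image name and port mapping below are assumptions based on ChromaDB's standard distribution, so check your ChromaDB version's documentation:

```shell
# Run ChromaDB in the background, exposing its default port 8000
docker run -d -p 8000:8000 chromadb/chroma
```

With this running, CHROMA_HOST=localhost and CHROMA_PORT=8000 would match the container.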
The project is organized to support tooling for fixtures, looks, cues, and projects. It includes a core API client, RAG service, and AI-based lighting generation. You can add new tools by implementing them in the designated source directories and wiring them into the MCP server entry point.
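As a sketch of what implementing a new tool might look like, the interface and names below are hypothetical illustrations, not the project's actual API; real tools are wired into the MCP server entry point as described above:

```typescript
// Hypothetical shape for a tool module; names here are illustrative only.
interface ToolDefinition {
  name: string;
  description: string;
  handler: (args: Record<string, unknown>) => Promise<string>;
}

// Example tool: lists fixtures for a project. In the real server the
// handler would call the core GraphQL API client instead of returning
// a placeholder string.
const listFixturesTool: ToolDefinition = {
  name: "list_fixtures",
  description: "Query available fixtures and their capabilities",
  handler: async (args) => {
    const projectId = String(args.projectId ?? "unknown");
    // Placeholder result; a real implementation queries the backend here.
    return `fixtures for project ${projectId}`;
  },
};
```

A module like this would then be registered in the server entry point so MCP clients can discover and invoke the tool.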
A typical workflow guides you through project creation, fixture setup, script analysis, look generation, cue sequence creation, act-level cueing, timing optimization, and live playback control. You can refine positions, color temperatures, and effects to match blocking and costume considerations, then execute the cue lists during rehearsals or performances.
The server exposes tools in five categories.

Project tools:
- List all available lighting projects with optional fixture and look counts.
- Create a new lighting project for a production.
- Get comprehensive details about a specific project.
- Delete a project and all associated data (requires confirmation).
- Get information about importing QLC+ (.qxw) files.

Fixture tools:
- Query available fixtures and their capabilities.
- Deep analysis of fixture capabilities such as color mixing and positioning.
- Add a new fixture to a project with manufacturer/model details.
- View DMX channel usage map for a project.
- Get optimal channel assignments for multiple fixtures.
- Modify existing fixture properties.
- Remove a fixture from a project (requires confirmation).

Look tools:
- AI-powered look generation based on descriptions and context.
- Extract lighting cues and suggestions from theatrical scripts.
- Optimize looks for goals like energy, impact, or simplicity.
- Update look properties and fixture values.
- Activate a look by name or ID.
- Fade all lights to black with customizable timing.
- Query information about the currently active look.
- Add fixtures to existing looks.
- Remove specific fixtures from looks.
- Read current fixture values in a look.
- Ensure fixtures exist with specific values in a look.
- Partial updates with fixture merging for looks.
- Batch partial updates across multiple looks with fixture merging.

Cue tools:
- Build cue sequences from existing looks.
- Generate complete cue lists for theatrical acts from scripts.
- Optimize cue timing for strategies like dramatic timing.
- Analyze cue lists with recommendations.
- Update cue list metadata.
- Add new cues to existing lists.
- Remove cues from lists.
- Modify individual cue properties.
- Update multiple cues simultaneously.
- Reorder cues with new numbering.
- Query cues with filtering and sorting.
- Delete entire cue lists (requires confirmation).

Playback tools:
- Begin playing a cue list from any point.
- Advance to the next cue.
- Go back to the previous cue.
- Jump to a specific cue by number or name.
- Stop the currently playing cue list.
- Get playback status and navigation options.
- Activate a look from a look board using the board's default fade time.