
stitch skill


This skill generates UI screens from text prompts, exports designs to React components, and creates a DESIGN.md design system to accelerate UI workflows.

npx playbooks add skill brixtonpham/claude-config --skill stitch

Review the files below or copy the command above to add this skill to your agents.

Files (1)
SKILL.md
1.4 KB
---
name: stitch
description: "Google Stitch UI design tool. Generate screens from text prompts, convert designs to React components, create DESIGN.md design systems. Use when: designing UI, generating screens, converting Stitch to code, creating design tokens. Keywords: stitch, design, UI, screen, generate, react, components, DESIGN.md, wireframe, prototype, mockup."
allowed-tools:
  - "stitch:*"
  - "Read"
  - "Write"
  - "Bash"
  - "WebFetch"
---

# Stitch UI Design Skill

Google Stitch MCP integration for AI-powered UI design generation.

## Workflows

### 1. Generate New Screen
```bash
mcp-cli call stitch/generate_screen_from_text '{"projectId": "ID", "prompt": "description", "deviceType": "DESKTOP"}'
```

### 2. Export to React
→ Invoke `react:components` skill after getting screen

### 3. Create Design System
→ Invoke `design-md` skill to generate DESIGN.md

## MCP Tools

| Tool | Parameters |
|------|------------|
| `stitch/list_projects` | filter: "view=owned" or "view=shared" |
| `stitch/create_project` | title: string |
| `stitch/get_project` | name: "projects/{id}" |
| `stitch/list_screens` | projectId: "projects/{id}" |
| `stitch/get_screen` | projectId, screenId |
| `stitch/generate_screen_from_text` | projectId, prompt, deviceType, modelId |
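
Example calls (IDs and titles are placeholders; the call shape mirrors the workflow example above):

```bash
# List projects you own, then fetch one by its resource name
mcp-cli call stitch/list_projects '{"filter": "view=owned"}'
mcp-cli call stitch/get_project '{"name": "projects/1234"}'
```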

## Related Skills
- `design-md` - Extract design tokens → DESIGN.md
- `react:components` - Convert screens → React code

Overview

This skill integrates with Google Stitch to generate UI screens from text prompts, export designs to React components, and produce a DESIGN.md design system. It streamlines the flow from idea to interactive mockup by combining generation, project management, and code export. Use it to accelerate UI iteration, create consistent design tokens, and get production-ready component code.

How this skill works

The skill uses Stitch MCP tools to list, create, and retrieve projects and screens. It generates new screens from natural-language prompts and a device target, then hands off to a React conversion step to export components. It can also feed design tokens into the design-md skill to produce a living DESIGN.md design system document.
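
A minimal sketch of that flow, assuming the mcp-cli call shape shown in SKILL.md and placeholder project and screen IDs:

```bash
# Create a project to hold the screens (title is illustrative)
mcp-cli call stitch/create_project '{"title": "Checkout redesign"}'

# Generate a screen from a natural-language prompt for a desktop target
mcp-cli call stitch/generate_screen_from_text \
  '{"projectId": "projects/1234", "prompt": "Checkout page with order summary, payment form, and a sticky pay button", "deviceType": "DESKTOP"}'

# Retrieve the result for review or for the React conversion step
mcp-cli call stitch/list_screens '{"projectId": "projects/1234"}'
mcp-cli call stitch/get_screen '{"projectId": "projects/1234", "screenId": "5678"}'
```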

When to use it

  • Rapidly prototype screens from a product brief or user story.
  • Create wireframes or high-fidelity mockups across device types.
  • Convert generated designs into React components for development.
  • Bootstrap or document a design system with DESIGN.md tokens.
  • Manage multiple Stitch projects and review shared screens.

Best practices

  • Write clear, concise prompts specifying layout, content, and device type (see the example after this list).
  • Use project-level organization: create projects per feature or team.
  • Review and iterate generated screens before exporting to code.
  • Pair exports with the react:components skill for clean component output.
  • Run design-md after iterations to keep DESIGN.md tokens up to date.
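
As an illustration of a well-specified prompt (values are placeholders; the call shape follows the SKILL.md workflow example):

```bash
mcp-cli call stitch/generate_screen_from_text \
  '{"projectId": "projects/1234", "prompt": "Mobile onboarding screen: progress dots at the top, centered illustration, short headline, two-line supporting copy, primary continue button pinned to the bottom", "deviceType": "MOBILE"}'
```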

Example use cases

  • Generate a dashboard screen for desktop from a product requirement prompt.
  • Create mobile onboarding screens by prompting for steps and tone.
  • Convert a validated Stitch screen into reusable React components for the frontend repo.
  • Extract color, spacing, and typography tokens into DESIGN.md for handoff to engineering.
  • List and review shared project screens from collaborators before finalizing designs.

FAQ

What inputs are required to generate a screen?

Provide a projectId, a natural-language prompt describing the UI, and a deviceType (e.g., DESKTOP or MOBILE).
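
A minimal call with just those inputs, assuming the mcp-cli shape from SKILL.md and a placeholder project id:

```bash
# projectId: target Stitch project; prompt: what to draw; deviceType: DESKTOP or MOBILE
mcp-cli call stitch/generate_screen_from_text \
  '{"projectId": "projects/1234", "prompt": "Settings page with profile, notifications, and billing sections", "deviceType": "DESKTOP"}'
```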

How do I get code from a generated screen?

After generating a screen, invoke the react:components conversion workflow to export React component code.
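
The conversion itself happens in the react:components skill; the step before it is simply retrieving the generated screen, for example (IDs are placeholders):

```bash
# Fetch the generated screen so its output can be handed to the react:components skill
mcp-cli call stitch/get_screen '{"projectId": "projects/1234", "screenId": "5678"}'
```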