---
name: critique
description: Evaluate design effectiveness from a UX perspective. Assesses visual hierarchy, information architecture, emotional resonance, and overall design quality with actionable feedback.
user-invokable: true
args:
- name: area
description: The feature or area to critique (optional)
required: false
---
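Example invocation (the `area` argument is optional; `settings-page` below is a hypothetical value):

```
/critique settings-page
```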
## MANDATORY PREPARATION
Use the frontend-design skill — it contains design principles, anti-patterns, and the **Context Gathering Protocol**. Follow the protocol before proceeding; if no design context exists yet, you MUST run teach-impeccable first. Additionally, gather what the interface is trying to accomplish.
---
Conduct a holistic design critique, evaluating whether the interface actually works—not just technically, but as a designed experience. Think like a design director giving feedback.
## Design Critique
Evaluate the interface across these dimensions:
### 1. AI Slop Detection (CRITICAL)
**This is the most important check.** Does this look like every other AI-generated interface from 2024-2025?
Review the design against ALL the **DON'T** guidelines in the frontend-design skill—they are the fingerprints of AI-generated work. Check for the AI color palette, gradient text, dark mode with glowing accents, glassmorphism, hero metric layouts, identical card grids, generic fonts, and all other tells.
**The test**: If you showed this to someone and said "AI made this," would they believe you immediately? If yes, that's the problem.
### 2. Visual Hierarchy
- Does the eye flow to the most important element first?
- Is there a clear primary action? Can you spot it in 2 seconds?
- Do size, color, and position communicate importance correctly?
- Is there visual competition between elements that should have different weights?
### 3. Information Architecture
- Is the structure intuitive? Would a new user understand the organization?
- Is related content grouped logically?
- Are there too many choices at once? (cognitive overload)
- Is the navigation clear and predictable?
### 4. Emotional Resonance
- What emotion does this interface evoke? Is that intentional?
- Does it match the brand personality?
- Does it feel trustworthy, approachable, premium, playful—whatever it should feel?
- Would the target user feel "this is for me"?
### 5. Discoverability & Affordance
- Are interactive elements obviously interactive?
- Would a user know what to do without instructions?
- Do hover/focus states provide useful feedback?
- Are there hidden features that should be more visible?
### 6. Composition & Balance
- Does the layout feel balanced or uncomfortably weighted?
- Is whitespace used intentionally or just leftover?
- Is there visual rhythm in spacing and repetition?
- Does asymmetry feel designed or accidental?
### 7. Typography as Communication
- Does the type hierarchy clearly signal what to read first, second, third?
- Is body text comfortable to read? (line length, spacing, size)
- Do font choices reinforce the brand/tone?
- Is there enough contrast between heading levels?
### 8. Color with Purpose
- Is color used to communicate, not just decorate?
- Does the palette feel cohesive?
- Are accent colors drawing attention to the right things?
- Does it work for colorblind users? (not just technically—does meaning still come through?)
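Part of this check can be automated. As an illustrative sketch (not part of this skill's workflow), here is a minimal Python implementation of the WCAG 2.x contrast-ratio formula, useful for verifying that text remains legible regardless of how color is perceived:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance for an (R, G, B) tuple with 0-255 channels."""
    def linearize(c):
        c = c / 255
        # Piecewise sRGB linearization per the WCAG definition
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; WCAG AA expects >= 4.5 for body text."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```

A ratio of 21.0 (black on white) is the maximum; anything below 4.5 for body text is a concrete finding worth flagging in the report.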
### 9. States & Edge Cases
- Empty states: Do they guide users toward action, or just say "nothing here"?
- Loading states: Do they reduce perceived wait time?
- Error states: Are they helpful and non-blaming?
- Success states: Do they confirm and guide next steps?
### 10. Microcopy & Voice
- Is the writing clear and concise?
- Does it sound like a human (the right human for this brand)?
- Are labels and buttons unambiguous?
- Does error copy help users fix the problem?
## Generate Critique Report
Structure your feedback as a design director would:
### Anti-Patterns Verdict
**Start here.** Pass/fail: Does this look AI-generated? List specific tells from the skill's Anti-Patterns section. Be brutally honest.
### Overall Impression
A brief gut reaction—what works, what doesn't, and the single biggest opportunity.
### What's Working
Highlight 2-3 things done well. Be specific about why they work.
### Priority Issues
The 3-5 most impactful design problems, ordered by importance:
For each issue:
- **What**: Name the problem clearly
- **Why it matters**: How this hurts users or undermines goals
- **Fix**: What to do about it (be concrete)
- **Command**: Which command to use (prefer: /animate, /quieter, /optimize, /adapt, /clarify, /distill, /delight, /onboard, /normalize, /audit, /harden, /polish, /extract, /bolder, /arrange, /typeset, /critique, /colorize, /overdrive — or other installed skills you're sure exist)
### Minor Observations
Quick notes on smaller issues worth addressing.
### Questions to Consider
Provocative questions that might unlock better solutions:
- "What if the primary action were more prominent?"
- "Does this need to feel this complex?"
- "What would a confident version of this look like?"
**Remember**:
- Be direct—vague feedback wastes everyone's time
- Be specific—"the submit button" not "some elements"
- Say what's wrong AND why it matters to users
- Give concrete suggestions, not just "consider exploring..."
- Prioritize ruthlessly—if everything is important, nothing is
- Don't soften criticism—developers need honest feedback to ship great design