This skill helps you diagnose with data to observe reality, then treat with design to craft creative, evidence-based solutions. To add it to your agents, run:

`npx playbooks add skill coowoolf/insighthunt-skills --skill diagnose-data-treat-design`
---
name: Diagnose with Data, Treat with Design
description: Use data to establish business observability (diagnosing problems) but use creative design thinking to solve them (treatment). Resolves the conflict between data-driven and intuition-driven product development.
---
# Diagnose with Data, Treat with Design
> "You want to diagnose with data and treat with design. Data is not a tool that's going to tell you what you should build." — Julie Zhuo
## What It Is
Data should be used to establish the "observability" of a business—understanding what is actually happening (**diagnosing**). However, data cannot dictate the solution; that requires creative empathy and design thinking (**treatment**).
## When To Use
- Teams stuck in **"analysis paralysis"**
- Designers **resist data** because they feel it stifles creativity
- Leadership expects data to **tell them what feature to build**
- Product decisions are being made on **"vibes"** alone
## Core Principles
### 1. Data Reflects Reality
Use metrics to understand user behavior and spot anomalies, not to predict the future with certainty.
### 2. Diagnosis vs. Treatment
| Phase | Tool | Output |
|-------|------|--------|
| **Diagnosis** | Quantitative data | WHERE the problem is |
| **Treatment** | Qualitative design | HOW to fix it |
### 3. Avoid False Precision
A/B tests measure narrow, short-term effects with apparent precision; they cannot replace long-term product vision.
### 4. New Context, New Metrics
As technology shifts (e.g., to LLMs), traditional metrics (clicks) must evolve to new forms (conversation quality).
## How To Apply
```
STEP 1: Build Observability Layer
├── Implement event logging
├── Create dashboards for key flows
└── Monitor anomalies in real time
STEP 2: Diagnose with Data
├── "Conversion dropped 15% on step 3"
└── "Users abandon after 2 messages"
STEP 3: Investigate the WHY
├── User interviews
├── Session recordings
└── Support ticket analysis
STEP 4: Treat with Design
├── Brainstorm creative solutions
├── Prototype multiple options
└── Test based on hypotheses, not metric optimization
```
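The diagnosis step (STEP 2) can be sketched in code. The following is a minimal, hypothetical example: it assumes an event log of `(user_id, step_name)` tuples and an ordered funnel, and reports where users drop off. The function name, funnel steps, and data are all illustrative, not part of the skill itself.

```python
def funnel_dropoff(events, steps):
    """Count distinct users reaching each funnel step and find the worst drop-off.

    events: iterable of (user_id, step_name) tuples from an event log.
    steps:  ordered list of funnel step names.
    Returns (per-step user counts, (step with largest relative drop, drop rate)).
    """
    reached = {step: set() for step in steps}
    for user_id, step in events:
        if step in reached:
            reached[step].add(user_id)

    counts = [len(reached[s]) for s in steps]
    drops = []
    for i in range(1, len(steps)):
        prev, cur = counts[i - 1], counts[i]
        rate = 1 - cur / prev if prev else 0.0
        drops.append((steps[i], rate))
    return counts, max(drops, key=lambda d: d[1])

# Hypothetical event log: user 3 never reaches checkout, user 4 never signs up.
events = [
    (1, "visit"), (1, "signup"), (1, "checkout"),
    (2, "visit"), (2, "signup"), (2, "checkout"),
    (3, "visit"), (3, "signup"),
    (4, "visit"),
]
counts, worst = funnel_dropoff(events, ["visit", "signup", "checkout"])
print(counts)  # users reaching each step
print(worst)   # step with the largest relative drop-off
```

Note that the output only tells you *where* the funnel leaks (the diagnosis); deciding *how* to fix that step is still the design work of STEP 3 and STEP 4.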
## Common Mistakes
❌ Expecting data to tell you exactly **what feature to build next**
❌ Ignoring data when it contradicts your "story" of the product's success
❌ Running A/B tests on everything instead of making bold design choices
## Real-World Example
Julie Zhuo notes that rapidly growing companies often run on "vibes" until growth slows. At that point they must implement data logging to diagnose the root cause of the slowdown, but re-igniting growth requires design intervention.
---
*Source: Julie Zhuo, Lenny's Podcast*
This skill teaches teams to diagnose product problems with data and treat them with creative design. It clarifies the roles of quantitative observability and qualitative design so you stop letting metrics dictate solutions or intuition override reality. The goal is faster, better-informed product decisions that combine evidence with creative problem solving.
First, build an observability layer: event logging, dashboards, and anomaly alerts that reveal where user journeys break. Use those metrics to form clear diagnoses (e.g., conversion dropped at step 3, users abandon after two messages). Then investigate the why through interviews, session recordings, and support analysis. Finally, apply design thinking to generate and prototype treatments, testing hypotheses rather than optimizing metrics blindly.
**Can data and design ever conflict?**

Yes; data can reveal that a behavior exists, while design decides the best way to change it. Treat the conflict as complementary: use data to frame the problem and design to propose solutions.
**When should I run A/B tests?**
Run A/B tests to validate specific hypotheses when you can measure meaningful user outcomes. Don’t use tests to avoid making bold design choices or to chase marginal metric wins.
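For the hypothesis-validation case above, a two-proportion z-test is one common way to check whether a variant's conversion rate differs from control. This is a minimal stdlib sketch; the function name and the experiment numbers are hypothetical.

```python
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for conversion rates.

    conv_a, n_a: conversions and sample size for control.
    conv_b, n_b: conversions and sample size for the variant.
    Returns (z statistic, two-sided p-value) under a pooled-variance normal approximation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical experiment: control converts 120/1000, variant 150/1000.
z, p = two_proportion_z(120, 1000, 150, 1000)
print(f"z={z:.2f}, p={p:.4f}")
```

A significant p-value here only validates the specific hypothesis being tested; it says nothing about whether the variant was a bold enough design choice in the first place.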