
analyzing-user-feedback skill


This skill helps you extract actionable product insights from customer feedback by clustering feedback into themes, identifying root causes, and guiding product decisions.

npx playbooks add skill refoundai/lenny-skills --skill analyzing-user-feedback

Review the files below or copy the command above to add this skill to your agents.

Files (2)
SKILL.md
4.1 KB
---
name: analyzing-user-feedback
description: Help users synthesize and act on customer feedback. Use when someone is analyzing NPS responses, processing support tickets, reviewing user research, synthesizing feedback from multiple channels, or trying to identify patterns in customer input.
---

# Analyzing User Feedback

Help the user extract actionable insights from customer feedback using techniques from 56 product leaders.

## How to Help

When the user asks for help analyzing feedback:

1. **Understand their sources** - Ask where feedback is coming from (NPS, support, sales, social, interviews)
2. **Help identify patterns** - Assist in clustering feedback into themes and prioritizing by frequency and impact (a minimal clustering sketch follows this list)
3. **Challenge surface-level interpretations** - Push them to find root causes, not just stated complaints
4. **Connect to action** - Help translate insights into product decisions
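
If the user's feedback is already exported as plain text, a quick clustering pass can kick off step 2. This is a minimal sketch, assuming scikit-learn is available; the sample comments and the choice of three clusters are purely illustrative, not part of the skill itself:

```python
# Minimal sketch: cluster free-text feedback into themes with TF-IDF + KMeans,
# then rank themes by how many comments they hold.
# Assumes scikit-learn is installed; the comments and k=3 are illustrative.
from collections import Counter

from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

feedback = [
    "Pricing is too high for small teams",
    "I can't justify the cost to my manager",
    "Exports keep failing on large files",
    "CSV export times out every time",
    "Love the new dashboard, very clear",
    "The dashboard finally makes our metrics readable",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(feedback)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

# Frequency is only a first-pass ranking; impact still needs human judgment.
for cluster, count in Counter(labels).most_common():
    print(f"Theme {cluster}: {count} comments")
    for text, label in zip(feedback, labels):
        if label == cluster:
            print(f"  - {text}")
```

Frequency alone orders the themes; deciding which cluster matters most still requires the root-cause and impact questions below.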

## Core Principles

### Feedback is a river, not a lake
Shaun Clowes: "Really smart product managers are constantly swimming in a feedback river. Set up streams of user interview data, NPS, and competitor info to wash over you daily." Make feedback consumption continuous, not episodic.

### Users lie (unintentionally)
Bret Taylor: "Taking what a customer says in a focus group is rarely correct. Practice intellectual honesty to distinguish surface-level complaints from root causes." When users say "price," they often mean "value."

### Cluster, don't segment
Bob Moesta: "Instead of segmenting by demographics, we cluster by behavioral pathways. It's not one reason why people do things—it's sets of reasons." Look for the 'hire and fire' criteria for different user clusters.

### Every support ticket is a product failure
Geoff Charles: "We literally have 'every support ticket is a failure of our product' posted on all channels. Share every negative review with the relevant PM and designer monthly."

### The silent signals matter
Ramesh Johari: "There's a lot of information in ratings that are NOT left. The absence of a rating is often a strong signal of a mediocre experience users are too polite to report."

### Filter the 80% noise
Jen Abel: "80% of feedback is noise based on legacy habits, 20% is gold that guides the future product. It's the founder's job to interpret what's 'the old way' versus real market needs."

### Aggregate across all channels
Brian Balfour: "AI can analyze existing feedback AND identify knowledge gaps—what customers are NOT saying. Aggregate feedback from all sources into a centralized repository."

### Talk to churned users
Uri Levine: "The most critical insights come from users who dropped out of the funnel, not those who succeeded. Interview users who churned to find the 'why' behind the failure."

### Prioritize future users over vocal minorities
Tamar Yehoshua: "Don't over-index on people unhappy with your changes. Design for the bigger number of people who will use it tomorrow, not the vocal few complaining today."

### Make insights stick
Yuhki Yamashita: "The goal is 'memification'—synthesize insights so they're catchy enough for execs to cite in meetings. Use real-world metaphors to explain complex concepts."

## Questions to Help Users

- "Where is your feedback coming from? Are you missing any channels?"
- "Have you talked to churned users, or only happy customers?"
- "What's the pattern behind these complaints—what's the root cause?"
- "Are these requests from early adopters or from users stuck in old habits?"
- "How will you act on this insight?"

## Common Mistakes to Flag

- **Taking feedback literally** - Users say they want X but often need Y
- **Only listening to vocal users** - Silent majority may have different needs
- **Ignoring non-users** - People who didn't convert have critical insights
- **Feedback hoarding** - Insights trapped in silos don't help anyone
- **Hindsight bias** - Don't dismiss research findings as "obvious" after the fact

## Deep Dive

For all 64 insights from 56 guests, see `references/guest-insights.md`

## Related Skills

- Conducting User Interviews
- Measuring Product-Market Fit
- Prioritizing Roadmap
- Setting OKRs & Goals

Overview

This skill helps you synthesize customer feedback into clear, actionable insights so product teams can make better decisions. It guides you through collecting sources, clustering themes, identifying root causes, and turning findings into prioritized actions. Use it to make feedback continuous, not episodic, and to avoid common interpretation traps.

How this skill works

I first map where feedback comes from (NPS, support, interviews, social, churn). Then I help cluster responses into behavioral themes, surface silent signals, and challenge surface-level interpretations to find root causes. Finally I translate insights into concrete product experiments, prioritization criteria, and communication hooks for stakeholders.
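
As an illustration of that first mapping step, a centralized repository can be as simple as one normalized record type that every channel maps into. This is a sketch under assumptions: the channel names, fields, and example rows below are illustrative, not a fixed schema.

```python
# Minimal sketch: normalize feedback from several channels into one record type
# before clustering, so analysis never happens per silo.
from dataclasses import dataclass
from datetime import date

@dataclass
class FeedbackItem:
    channel: str   # e.g. "nps", "support", "interview", "social", "churn_survey"
    text: str      # the verbatim comment
    user_id: str
    received: date

def from_nps(row: dict) -> FeedbackItem:
    return FeedbackItem("nps", row["verbatim"], row["respondent_id"], row["date"])

def from_ticket(row: dict) -> FeedbackItem:
    return FeedbackItem("support", row["description"], row["requester"], row["created_at"])

# One repository, whatever the source.
repository = [
    from_nps({"verbatim": "Too expensive for what we use", "respondent_id": "u1",
              "date": date(2024, 5, 2)}),
    from_ticket({"description": "Export fails on big files", "requester": "u2",
                 "created_at": date(2024, 5, 3)}),
]
print(f"{len(repository)} items from {len({item.channel for item in repository})} channels")
```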

When to use it

  • Analyzing NPS responses to identify drivers of satisfaction and churn
  • Processing support tickets to reveal product failures and UX gaps
  • Synthesizing user research across interviews, surveys, and social media
  • Triaging feature requests and distinguishing signal from noise
  • Preparing a feedback-driven roadmap or hypothesis backlog

Best practices

  • Aggregate feedback into a centralized repository across channels before analysis
  • Cluster by behavior and outcomes, not just demographics
  • Validate root causes with targeted follow-up (including churned users)
  • Prioritize based on frequency, impact, and strategic fit, not volume alone (see the scoring sketch after this list)
  • Create memorable one-liners or metaphors to make insights stick for execs
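
One way to apply the prioritization practice above is a simple combined score. The themes, 1-5 scores, and log-damping of frequency below are illustrative assumptions, not a prescribed formula; teams would plug in their own weights.

```python
# Minimal sketch: rank themes by frequency x impact x strategic fit rather than
# raw volume. All numbers are illustrative.
import math

themes = [
    # (theme, frequency count, estimated impact 1-5, strategic fit 1-5)
    ("export reliability", 42, 4, 5),
    ("pricing confusion", 77, 3, 2),
    ("dashboard polish", 12, 2, 3),
]

def score(frequency: int, impact: int, fit: int) -> float:
    # Log-dampen frequency so one loud channel cannot dominate the ranking.
    return math.log1p(frequency) * impact * fit

for name, freq, impact, fit in sorted(themes, key=lambda t: -score(*t[1:])):
    print(f"{name:20s} score={score(freq, impact, fit):5.1f}")
```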

Example use cases

  • Turn monthly support ticket themes into three priority product experiments
  • Analyze NPS verbatims to uncover the top two drivers of promoter and detractor scores
  • Synthesize interview notes and social mentions to map unmet needs for a new cohort
  • Identify silent signals by comparing low-engagement cohorts with active users (see the sketch after this list)
  • Translate scattered feature requests into a prioritized hypothesis backlog tied to metrics
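
For the silent-signals use case, a first check can be as small as comparing how often two cohorts leave ratings at all. The cohort numbers below are made up for illustration; the point is that missing feedback is itself a signal worth investigating.

```python
# Minimal sketch: compare rating-submission rates between cohorts to surface
# silent signals. Cohort figures are illustrative.
cohorts = {
    "active_users": {"sessions": 1800, "ratings_left": 240},
    "low_engagement": {"sessions": 1500, "ratings_left": 30},
}

for name, stats in cohorts.items():
    rate = stats["ratings_left"] / stats["sessions"]
    print(f"{name}: {rate:.1%} of sessions end with a rating")
# A sharply lower rating rate in one cohort hints at quiet dissatisfaction that
# NPS verbatims alone would miss.
```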

FAQ

How do you avoid over-indexing on vocal users?

Look for volume and behavioral evidence across channels, interview churned and passive users, and weight requests by impact on key metrics rather than loudness.

What if most feedback is noise?

Filter by frequency and potential impact, cluster similar comments, and surface a small set of hypotheses to validate with short experiments or targeted interviews.