This skill helps you plan, conduct, and extract insights from user interviews to validate problems and understand customer needs.
Add this skill to your agents with:

`npx playbooks add skill refoundai/lenny-skills --skill conducting-user-interviews`
---
name: conducting-user-interviews
description: Help users run better customer and user interviews. Use when someone is preparing for user research, planning discovery interviews, writing interview questions, analyzing interview findings, or trying to understand customer needs.
---
# Conducting User Interviews
Help the user run better discovery conversations and extract real insights using techniques from 43 product leaders.
## How to Help
When the user asks for help with user interviews:
1. **Understand their goal** - Ask what they're trying to learn (validating a problem, testing a solution, understanding behavior, pricing research)
2. **Help them prepare** - Suggest questions, warn against common mistakes, help them find the right participants
3. **Coach on technique** - Share principles for getting honest, useful answers rather than polite validation
4. **Help analyze findings** - Assist in synthesizing what they learned into actionable insights
## Core Principles
### Collect stories, not opinions
Teresa Torres: "Interviewing is a grossly underestimated skill. If you're not collecting rich stories, you won't identify opportunities." Don't ask "What do you like?" Ask "Tell me about the last time you..."
### Only interview people who've taken action
Bob Moesta: "I only talk to people who've already tried to make progress. What made them try? Ignore 'bitching' (complaining)—look for 'switching' (actual behavior change)."
### Watch, don't just ask
Gustaf Alstromer: "The best way to understand problem intensity isn't asking—it's watching. Have them screen share and walk through their daily workflow. Look for pain they've normalized."
### Avoid pitching
Jeff Weinstein: "Don't start with 'Hi, I'm the CEO of X, we do Y, let me show you a demo.' What a wasted opportunity. Listen first. Use silence to let them open up."
### Falsify, don't validate
Judd Antin: "We don't validate, we falsify. We look to be wrong. Many PMs want to be right—they do user-centered performance, not real research."
### Never ask what they want built
Judd Antin: "A researcher who asks customers what they want is a bad researcher. Focus on understanding behaviors and problems—not having users design your solution."
### Probe for the emotion
Nan Yu: "My goal is to feel bad the same way customers feel bad. Dig past the feature request to find the underlying negative emotion motivating it."
### Drop the discussion guide
Bob Moesta: "Not having a script drives people crazy, but rigid guides prevent you from following meaningful threads. Use the Four Forces (push, pull, anxiety, habit) as mental framework instead."
### Right-size your sample
Shaun Clowes: "Between 7-14 interviews, you stop learning new things. Less than 7, not enough data. More than 14, diminishing returns."
### Expect 90% rejection
Gustaf Alstromer: "90% of people aren't early adopters. You need to reach 10 to find 1. Rejection isn't failure—it's filtering for the right users."
### Get direct exposure
Marty Cagan: "I wasn't allowed to make product decisions until I'd visited 30 customers. Those visits changed my life—I thought I knew our customers and I really didn't."
### Respond with extreme speed
Jeff Weinstein: "When a customer goes out of their way to share a problem, that's a gift. I'll leave a meeting to reply. Be 'text message friendly' with 5-10 power users."
### Interview the non-users
Mihika Kapoor: "The most insightful conversations are with non-users. Ask why they're not using your product—you'll find perception gaps users can't see."
### Test willingness to pay
Jeff Weinstein: "Have them send you a $1 invoice right now. The gap between 'willingness to pay' and actually paying is massive. This tests real commitment."
### Co-create with lighthouse users
Tanguy Crusson: "Work with 10 'lighthouse' users over months. Put them in Slack with your team. Involve engineers directly so they build empathy."
## Questions to Help Users
- "What are you trying to learn from these interviews?"
- "Are you interviewing people who've already tried to solve this problem?"
- "How are you recruiting participants?"
- "What's your opening question? (Make sure it asks for a story, not an opinion)"
- "How will you avoid leading questions?"
- "What will you do with the findings?"
## Common Mistakes to Flag
- **Leading questions** - "Don't you think X would be better?" just gets agreement
- **Asking about hypotheticals** - "Would you use this?" is meaningless; behavior matters
- **Pitching during research** - You're there to learn, not sell
- **Too few interviews** - 2 isn't enough; aim for 7-14
- **Delegating observation** - PMs and designers must be in the room, not reading reports
## Deep Dive
For all 64 insights from 43 guests, see `references/guest-insights.md`.
## Related Skills
- Analyzing User Feedback
- Defining Product Vision
- Measuring Product-Market Fit
- Designing Surveys
## About This Skill
This skill helps you run discovery and customer interviews that surface real behavior, pain, and opportunity. It guides preparation, live interviewing technique, participant recruitment, and synthesis so you can turn conversations into actionable product insights. Use it to move beyond opinions and validate what users actually do and pay for.

The skill starts by clarifying your learning goal (problem validation, solution testing, pricing, behavior). It then helps you draft non-leading, story-focused questions, choose the right participants, and coaches in-the-moment techniques for getting honest answers. After the interviews, it assists with synthesis: spotting patterns, prioritizing findings, and turning them into next experiments or product decisions.
## FAQ
**How many interviews should I run before making a decision?**
Aim for 7-14 interviews; fewer than seven often misses patterns, and more than fourteen yields diminishing returns for early-stage discovery.

**What if participants only give opinions or feature requests?**
Pivot: ask for a recent example, probe the context and emotion, and ask what they actually did to solve it. Focus on behavior and trade-offs rather than feature lists.

**Can I recruit non-users?**
Yes. Interviewing non-users reveals perception gaps and barriers to adoption that current users can't articulate.