This skill helps you define a clear North Star metric by focusing on customer value, simplicity, actionability, and guardrails.
`npx playbooks add skill refoundai/lenny-skills --skill writing-north-star-metrics`
---
name: writing-north-star-metrics
description: Help users define their North Star metric. Use when someone is choosing their primary success metric, trying to align the team around a key measure, struggling with metric proliferation, or setting up their measurement strategy.
---
# Writing North Star Metrics
Help the user define their North Star metric using frameworks and insights from 27 product leaders.
## How to Help
When the user asks for help with North Star metrics:
1. **Understand the value** - Ask what specific value the product delivers to users (not revenue or internal activity)
2. **Test for simplicity** - Ensure the metric can be understood and discussed by anyone in the company
3. **Check for actionability** - Confirm teams can actually influence this metric through their work
4. **Add guardrails** - Help them identify countervailing metrics that prevent gaming
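To make steps 2-4 concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the metric names, definitions, and weekly numbers are invented); it simply shows one way to declare a North Star next to its countervailing guardrails and flag periods where the North Star improves while a guardrail degrades.

```python
from dataclasses import dataclass

@dataclass
class Metric:
    name: str            # evocative, human-readable name
    definition: str      # precise, codified definition
    weekly_values: list  # recent values, oldest first

# Hypothetical example: North Star plus countervailing guardrails.
north_star = Metric(
    name="Companies with zero support tickets",
    definition="Distinct companies with >=1 completed payment and 0 tickets in the week",
    weekly_values=[120, 135, 150],
)
guardrails = [
    Metric("Refund rate", "Refunded payments / completed payments, weekly", [0.02, 0.03, 0.06]),
    Metric("P95 checkout latency (s)", "95th percentile checkout time, weekly", [1.2, 1.3, 1.2]),
]

def gaming_alert(north_star: Metric, guardrails: list) -> list:
    """Flag guardrails that degraded while the North Star improved."""
    alerts = []
    ns_improved = north_star.weekly_values[-1] > north_star.weekly_values[0]
    for g in guardrails:
        degraded = g.weekly_values[-1] > g.weekly_values[0]  # for these guardrails, higher = worse
        if ns_improved and degraded:
            alerts.append(f"{north_star.name} is up, but {g.name} got worse; check for gaming.")
    return alerts

print(gaming_alert(north_star, guardrails))
```

In practice this logic usually lives in a dashboard or alerting rule rather than a script; the point is that the guardrails are declared alongside the North Star, not bolted on later.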
## Core Principles
### Measure value delivered, not captured
Itamar Gilad: "The North Star metric measures how much value we create for the market. WhatsApp measured messages sent because every message is incremental value. Airbnb used nights booked." Select a metric that tracks core utility provided to the customer, not business extraction.
### Avoid lagging indicators
Jess Lachs: "Retention is a terrible thing to goal on. It's almost impossible to drive in a meaningful way in a short term. Find a short-term metric you can measure that drives a long-term output." Identify proxy metrics that are sensitive to experimentation and correlate with long-term goals.
### Simple beats sophisticated
Jess Lachs: "If people understand it, if they have an intuition around it, if it's something people can talk about across the company, it's going to be a much better metric than your made up composite score that nobody understands." Avoid composite metrics with complex coefficients.
### Measure from the customer's perspective
Jeff Weinstein: "What was the value we're trying to produce for the customer, and can we measure it from their perspective? We suggested 'companies with zero support tickets.'" Define metrics based on the absence of friction or the presence of success moments.
### A high-level mission simplifies decisions
Hari Srinivasan: "Everything at LinkedIn is a very connected ecosystem, but decisions aren't difficult because we're all here to help people connect to economic opportunity." A clear company-wide mission serves as the ultimate tie-breaker for product decisions.
### Name metrics evocatively
Jeff Weinstein: "Picking metric titles that make you feel something. 'Companies with zero support' - the brevity and customer mindset built into the chart name can become currency inside the company." Use simple, evocative names instead of technical database field names.
### Codify definitions precisely
Manik Gupta: "Everyone will talk about the same metric but have different nuances. What is a daily active user? Pick a definition, instrument it, codify it. No confusion." Metrics only drive alignment when backed by precise definitions and accurate instrumentation.
### Select a hard activation metric
Lauryn Isford: "An activation rate that falls in a lower percentage range, maybe 5-15%, is better than a high percentage because it means there's likely much higher correlation with long-term retention." High-bar activation metrics that few users reach are often more valuable.
### Revisit periodically
Lauryn Isford: "A North Star metric should be a measure of what you plan to do. Revisit North Star metrics every 6-12 months to ensure they still align with business goals." Be willing to shift metrics if the strategy requires it.
### Avoid revenue as North Star
Sean Ellis: "I think monthly purchases is great because it maps to value people are getting. Units of value from the customer perspective is more important than overall revenue." Revenue should be a product of doing things right, not the day-to-day guiding metric.
## Questions to Help Users
- "What specific moment represents a user getting value from your product?"
- "Could a non-technical person in your company understand and discuss this metric?"
- "Can teams actually influence this metric through their work in a quarter?"
- "What would happen if you gamed this metric - what would break?"
- "Does this metric measure value delivered to users or value extracted from them?"
- "Is this a leading indicator or a lagging one that's hard to move?"
## Common Mistakes to Flag
- **Revenue as North Star** - Revenue is an outcome; focus on the customer value that drives it
- **Complex composite metrics** - If you can't explain it simply, teams can't rally around it
- **Lagging indicators like retention** - Find the leading metrics that predict retention
- **Gaming vulnerability** - Add countervailing metrics to prevent optimization that hurts users
- **Undefined terms** - 'Active user' means nothing until you codify exactly what counts
## Deep Dive
For all 35 insights from 27 guests, see `references/guest-insights.md`
## Related Skills
- Setting OKRs & Goals
- Defining Product Vision
- Prioritizing Roadmap
- Designing Growth Loops
This skill helps teams define a clear, actionable North Star metric that measures the value delivered to customers. It guides you to pick a simple, leading metric that teams can influence, while adding guardrails to prevent gaming. The outcome is a single, evocative measure that aligns product decisions and experimentation.
I ask targeted questions to surface the core user value and the key success moment your product creates. I test candidate metrics for simplicity, actionability, and whether they are leading indicators of long-term outcomes. I help name the metric, codify precise definitions and instrumentation, and suggest countervailing metrics to avoid perverse optimization. I also recommend a cadence for revisiting the metric as strategy evolves.
**Should revenue ever be the North Star metric?**
No. Revenue is an outcome. Use a metric that measures units of customer value; revenue should follow if you deliver value consistently.

**What if retention is my long-term goal?**
Don’t set retention as the day-to-day North Star. Pick a leading metric or activation event that experiments can move and that correlates with retention.

**How do I prevent teams from gaming the metric?**
Add guardrail metrics that capture quality or customer health, and codify definitions and instrumentation so optimization focuses on true value.