TL;DR
AARRR (Pirate Metrics) is a five-stage funnel that maps how a user becomes a paying, referring customer: Acquisition, Activation, Retention, Referral, and Revenue. Developed by Dave McClure in 2007, it gives product and growth teams a single diagnostic model to see where users fall off. Fix the leakiest stage first, then move to the next.
What Is AARRR?
Dave McClure introduced AARRR at a Venture Hacks conference in 2007. He was advising early-stage startups that needed a practical way to prioritize growth work without drowning in metrics. The framework maps five stages of the user lifecycle, each one measurable and each one actionable.
The name "Pirate Metrics" is straightforward: when you say the five initials aloud, it sounds like a pirate growl. McClure leaned into the nickname because it made the framework stick. Conference audiences remembered AARRR when they forgot everything else from the talk. That memorability was intentional.
The five stages form a linear funnel. Users enter at Acquisition. If the product delivers value, they move through Activation, Retention, Referral, and Revenue. Most products leak heavily at one or two stages. AARRR's purpose is to surface which stage deserves attention now.
The Five Stages
Acquisition: How Users Discover You
Acquisition covers every channel that brings a new user to your product for the first time: organic search, paid ads, social, content marketing, referral links, word of mouth, and direct traffic.
The key metric is not raw visit volume. It is customer acquisition cost (CAC) by channel. Two channels can each drive 1,000 signups per month, but if one costs $8 per signup and the other costs $80, they are not equivalent. Segment acquisition data by source and track which channels produce users who actually make it to Activation.
See customer acquisition cost (CAC) for a full breakdown of how to calculate and benchmark this metric.
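A sketch of the channel comparison above. The channel names, spend figures, and activation counts are invented for illustration; the point is that cost per *activated* user can rank channels very differently than raw CAC:

```python
# Illustrative channel data: spend, signups, and how many signups activated.
channels = {
    "paid_social": {"spend": 80000, "signups": 1000, "activated": 420},
    "content":     {"spend": 8000,  "signups": 1000, "activated": 600},
}

for name, c in channels.items():
    cac = c["spend"] / c["signups"]
    # Cost per activated user exposes channels whose signups never reach value.
    cost_per_activated = c["spend"] / c["activated"]
    print(f"{name}: CAC=${cac:.2f}, cost per activated user=${cost_per_activated:.2f}")
```

In this made-up data, the two channels deliver identical signup volume at a tenfold difference in cost, which is exactly the comparison raw visit counts hide.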
Activation: First Valuable Experience
Activation is the moment a new user "gets it." They experience enough of the product's core value that they want to come back. This is the most commonly misunderstood stage because it requires defining what "value" means specifically for your product.
Slack's internal research identified "2,000 messages sent within a team" as a strong predictor of long-term retention. That number became their activation threshold. For a project management tool, activation might be "created a project and assigned one task." For a data analytics tool, it might be "ran a query and shared the result."
The metric: percentage of new signups who complete the activation event within 7 days. If you track it via the activation rate metric, you can cohort it and see how the rate changes month over month.
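A minimal sketch of cohorted activation tracking with a 7-day window. The users, dates, and cohort granularity are made up for illustration:

```python
from datetime import date, timedelta
from collections import defaultdict

# Illustrative records: (signup_date, activation_date or None).
users = [
    (date(2024, 1, 3),  date(2024, 1, 5)),   # activated within 7 days
    (date(2024, 1, 10), date(2024, 1, 25)),  # activated too late to count
    (date(2024, 1, 15), None),               # never activated
    (date(2024, 2, 2),  date(2024, 2, 4)),   # activated within 7 days
]

cohorts = defaultdict(lambda: {"signups": 0, "activated": 0})
for signed_up, activated_on in users:
    cohort = signed_up.strftime("%Y-%m")  # monthly signup cohort
    cohorts[cohort]["signups"] += 1
    if activated_on and (activated_on - signed_up) <= timedelta(days=7):
        cohorts[cohort]["activated"] += 1

for cohort, c in sorted(cohorts.items()):
    print(f"{cohort}: {c['activated'] / c['signups']:.0%} activated within 7 days")
```

Comparing these cohort rates month over month is what surfaces whether onboarding changes are actually working.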
Activation is often the highest-leverage stage for early products. If 80% of new users never reach the aha moment, improving retention or referral is pointless. The AARRR calculator lets you model the impact of improving each stage.
Retention: Do They Come Back?
Retention measures whether users return after their first session. The standard metrics are N-day retention rates: what percentage of users who signed up on day 0 are still active on day 7, day 30, day 90?
Track retention as cohort curves rather than aggregate averages. Cohort curves show whether recent signups behave differently from users who joined six months ago. If new cohorts retain better, your product improvements are working. If they retain worse, something changed.
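Cohorted N-day retention can be computed directly from signup and activity dates. This sketch uses invented user IDs and dates, and the common "active on exactly day N" definition:

```python
from datetime import date

# Illustrative raw data: signup dates and subsequent active days per user.
signups = {"u1": date(2024, 1, 1), "u2": date(2024, 1, 1), "u3": date(2024, 2, 1)}
activity = {
    "u1": [date(2024, 1, 8), date(2024, 1, 31)],  # active on day 7 and day 30
    "u2": [date(2024, 1, 2)],                     # active on day 1 only
    "u3": [date(2024, 2, 8)],                     # active on day 7
}

def day_n_retention(n, cohort_month):
    """Fraction of a monthly signup cohort active exactly N days after signup."""
    cohort = [u for u, d in signups.items() if d.strftime("%Y-%m") == cohort_month]
    retained = [
        u for u in cohort
        if any((a - signups[u]).days == n for a in activity.get(u, []))
    ]
    return len(retained) / len(cohort)

print(day_n_retention(7, "2024-01"))  # half the January cohort returned on day 7
```

Running this for day 7, 30, and 90 per cohort gives the retention curves the section describes; plotting recent cohorts against older ones shows whether the product is improving.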
Retention is the foundation of everything else in AARRR. A product with poor retention cannot generate reliable referrals or predictable revenue. You are filling a leaky bucket.
Referral: Do They Tell Others?
Referral tracks whether satisfied users bring in new users without paid acquisition. The metric is the viral coefficient K: the average number of new signups each existing user generates through sharing, invitations, or word of mouth.
A viral coefficient above 1.0 means the product grows on its own. A coefficient of 0.3 means each user generates 0.3 new users on average, so paid acquisition must make up the difference. Most products sit between 0.1 and 0.5. Net Promoter Score (NPS) is a leading indicator for referral quality.
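A toy model shows why K above 1.0 compounds while K below 1.0 converges. The starting user count, K values, and cycle count are all illustrative:

```python
def organic_signups(initial_users, k, cycles):
    """Total users after `cycles` rounds of referrals, with no paid acquisition."""
    total = new = initial_users
    for _ in range(cycles):
        new = new * k  # each batch of new users generates K further signups
        total += new
    return round(total)

# K < 1: each referral wave shrinks, so growth flattens out.
print(organic_signups(1000, 0.5, 2))
# K > 1: each wave is larger than the last, so growth compounds.
print(organic_signups(1000, 1.1, 5))
```

With K = 0.5, two referral cycles take 1,000 users to 1,750 and each further cycle adds less; this is why a sub-1.0 coefficient must be topped up with paid acquisition.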
Referral is almost always the last stage to optimize. Users who do not retain do not refer. Build a referral program on top of a product with 10% Day 30 retention and you will accelerate churn, not growth.
Revenue: Monetization
Revenue measures whether users pay. For subscription products, track monthly recurring revenue (MRR), average revenue per user (ARPU), and free-to-paid conversion rate. For ad-supported products, substitute engagement metrics tied to ad revenue.
Use the LTV calculator to project lifetime value per cohort and set acquisition spend limits. LTV divided by CAC is the single most useful ratio in AARRR: it tells you whether the funnel is economically viable.
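The LTV-to-CAC check is a few lines of arithmetic. All inputs here are placeholder numbers, not benchmarks from the article:

```python
arpu_per_month = 25.0       # average revenue per user per month (illustrative)
avg_lifespan_months = 18    # average customer lifespan (illustrative)
cac = 150.0                 # blended customer acquisition cost (illustrative)

ltv = arpu_per_month * avg_lifespan_months
ratio = ltv / cac
print(f"LTV=${ltv:.0f}, LTV:CAC={ratio:.1f}")
```

A commonly cited rule of thumb for subscription businesses is an LTV:CAC ratio of at least 3; below 1, every acquired customer loses money.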
Track churn alongside revenue; the churn calculator helps here. A product growing MRR at 10% per month while churning 8% per month is building on sand.
Worked Example: A SaaS App
A B2C SaaS product has 100,000 monthly visitors. Here is the funnel:
| Stage | Count | Conversion Rate |
|---|---|---|
| Acquisition (monthly visitors) | 100,000 | baseline |
| Signups | 4,000 | 4% of visitors |
| Activation (reached aha moment) | 2,400 | 60% of signups |
| Retention at Day 30 | 960 | 40% of activated users |
| Referral (referred at least one user) | 96 | 10% of retained users |
| Revenue (converted to paid) | 240 | 25% of retained users |
Reading this funnel: the biggest absolute drop is from Acquisition to Signup, where 96,000 visitors leave without signing up. But a 4% visitor-to-signup rate is not unusual for a content-driven acquisition strategy. Dig deeper: are those 96,000 visitors bouncing immediately (traffic quality problem) or are they exploring the site but not converting (CTA or messaging problem)?
The next big drop is Activation. 1,600 users signed up and never reached the aha moment. That is 40% of signups gone at the first hurdle. This is where to focus.
If Activation improves from 60% to 75%, the ripple effect is significant. The activated cohort grows from 2,400 to 3,000. At the same downstream rates, retained users go from 960 to 1,200 and paying customers go from 240 to 300. That is 60 additional paying customers per month from one metric improvement.
Improving paid conversion from 25% to 30% while Activation stays at 60% would produce 288 paying customers instead of 240. Smaller gain, and it does nothing to fix the underlying product adoption problem.
Action: interview the 40% who did not activate, identify where they stalled in onboarding, and simplify that step. Set a target of 70% Activation within 60 days.
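The arithmetic in this worked example can be reproduced with a small funnel model, which also makes it easy to compare the two improvement scenarios:

```python
def run_funnel(visitors, signup_rate, activation_rate, retention_rate, paid_rate):
    """Apply stage-by-stage conversion rates, rounding to whole users."""
    signups   = round(visitors * signup_rate)
    activated = round(signups * activation_rate)
    retained  = round(activated * retention_rate)
    paying    = round(retained * paid_rate)
    return {"signups": signups, "activated": activated,
            "retained": retained, "paying": paying}

baseline = run_funnel(100_000, 0.04, 0.60, 0.40, 0.25)
improved = run_funnel(100_000, 0.04, 0.75, 0.40, 0.25)  # Activation 60% -> 75%

print(baseline["paying"])                       # 240 paying customers
print(improved["paying"] - baseline["paying"])  # +60 from one stage improvement
```

Swapping in a paid-conversion rate of 30% instead (with Activation held at 60%) yields 288 paying customers, matching the smaller gain described above.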
When to Use AARRR
AARRR fits best when:
- You have a multi-step user journey with measurable drop-off points between signup and payment
- You are at early to mid-stage and need a framework to decide where growth investment goes
- You run a freemium or self-serve SaaS product where users move through stages independently
- You need to align product, marketing, and growth teams on a shared model of the user lifecycle
- You build consumer apps where user behavior data is available in volume and acquisition channels are diverse
When NOT to Use AARRR
Skip AARRR or adapt it heavily when:
- You are enterprise sales-led: a six-month enterprise deal does not fit neatly into a self-serve funnel. Activation and Referral stages mean something different when a champion has to secure budget approval before you see Revenue.
- You run a marketplace: two-sided networks have two acquisition funnels (buyers and sellers) that interact with each other. A linear AARRR model misses the network effects that drive marketplace growth. The PLG flywheel is a better model here.
- You have very low traffic: with fewer than 50 signups per week, conversion rate percentages are statistically noisy and will send you in the wrong direction.
- You want to measure UX quality or feature success: the HEART framework is built for that purpose.
Common Pitfalls
Defining Activation vaguely. "User explored the product" is not an activation event. It must be a specific, trackable action correlated to long-term retention. If you do not have retention data yet, pick the action you believe is most meaningful, track it for 60 days, then check whether users who completed that action retained better than those who did not.
Optimizing Acquisition before Activation. More traffic into a broken funnel produces more churn, not more revenue. Check your Activation rate before you increase ad spend.
Mixing acquisition channels in your funnel. Users from organic search activate differently than users from paid social. If you blend channels in your aggregate funnel, you get false averages that hide which sources produce valuable users and which produce noise. Segment by channel.
Treating Revenue as the only stage that matters. Revenue is a lagging indicator. Problems in Activation and Retention show up in Revenue 30 to 90 days later, by which time you have compounded the damage. Track all five stages.
Skipping cohort analysis for Retention. A single aggregate retention number hides whether things are improving or worsening. Compare cohorts month over month. New cohorts should retain better than older ones if your product is improving.
AARRR vs Alternatives
AARRR vs HEART: AARRR is a business funnel model that answers "where are users dropping out?" HEART (Happiness, Engagement, Adoption, Retention, Task Success) is a UX quality model that answers "how good is the experience at each step?" See the full HEART vs AARRR comparison for a side-by-side view. The two frameworks pair well: use AARRR to find the leaky stage, then use HEART to understand the experience quality problems causing the leak.
AARRR vs North Star Metric: AARRR gives you five metrics to watch simultaneously. The North Star approach collapses that into a single metric that best represents the value users get from your product (for Spotify, it is time listening; for Airbnb, it is nights booked). The North Star is simpler to communicate but tells you less about where to intervene. Most teams use a North Star for alignment and AARRR for diagnosis.
AARRR vs RICE framework: AARRR tells you which funnel stage to focus on. RICE helps you decide which specific feature or initiative to build within that stage. They operate at different levels of abstraction and work well together.
Tools That Help
The AARRR calculator lets you enter your funnel numbers across all five stages and instantly see conversion rates, cumulative drop-off, and which stage represents the biggest opportunity. Run it with your current numbers, then model what happens if you improve each stage by 10%.
The LTV calculator connects to the Revenue stage. It takes average revenue per user and average customer lifespan and outputs lifetime value. Pair it with your CAC to sanity-check whether your acquisition economics make sense.
The churn calculator feeds directly into Retention and Revenue analysis. Monthly churn rate determines how long your average customer stays before canceling (roughly 1 divided by the churn rate), which sets the ceiling on LTV.
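The churn-to-LTV relationship in a few lines; the churn and ARPU values are placeholders:

```python
monthly_churn = 0.05  # 5% of customers cancel each month (illustrative)
arpu = 25.0           # average revenue per user per month (illustrative)

# Average lifespan is roughly the reciprocal of the monthly churn rate.
avg_lifespan_months = 1 / monthly_churn
ltv_ceiling = arpu * avg_lifespan_months
print(avg_lifespan_months, ltv_ceiling)
```

At 5% monthly churn, customers last about 20 months on average, so no amount of upsell work pushes LTV past roughly 20 months of ARPU without first reducing churn.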
Putting It Into Practice
AARRR works as a weekly diagnostic ritual. Pull your funnel numbers every Monday. Mark the stage with the largest conversion drop. That stage gets this week's focus. Write down what you will change, build, or test to improve it, and by how much. Revisit the following Monday.
The framework is not a strategy. It is a tool for deciding where strategy should go. The decision that AARRR makes easier is: "Given limited time and engineering capacity, which part of the funnel deserves attention this sprint?" Without that structure, growth teams tend to work on whatever is most visible, which is usually acquisition, because acquisition metrics are easy to report to leadership. AARRR forces you to look at the full picture and defend the choice to fix Activation over buying more traffic.
Most early-stage products that struggle with growth are not struggling because of Acquisition. They are struggling because too few users ever reach the moment where the product clicks for them. Fix that first.