Retention Analysis

If you can’t name what week-one behavior predicts month three, you’re fixing the wrong things.

Acquisition looked fine. Cohorts did not.

Most teams stop at the blended line because it is polite. The useful work starts where the average lies—splitting the people who actually reached value from the people the product never gave a fair shot.

Problem

The team watched a single blended retention curve and argued from different definitions of “activated.”

Without agreement on what good first-week usage meant, roadmap debates stayed vague. Metric noise crowded out experiments.

Insight

Splitting cohorts by acquisition source and first-week behavior made the pattern obvious: two specific journeys drove durable usage. Most other activity looked busy but didn’t predict much.
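A minimal sketch of that split, assuming event-level data with hypothetical column names (`user_id`, `source`, `week`, `active`, `key_action_week1`); the blended curve averages everyone together, while the segmented curves separate users by channel and first-week behavior:

```python
# Sketch: compare a blended retention curve with curves split by
# acquisition source and a first-week action flag. All column names
# and the toy data are hypothetical, not the real instrumentation.
import pandas as pd

def retention_curve(df: pd.DataFrame) -> pd.Series:
    """Share of a group's users active in each week since signup."""
    n_users = df["user_id"].nunique()
    return df[df["active"]].groupby("week")["user_id"].nunique() / n_users

users = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "source": ["ads", "ads", "referral", "referral"],
    "key_action_week1": [False, False, True, True],
})
events = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 3, 4, 3],
    "week":    [0, 0, 0, 0, 1, 1, 2],
    "active":  [True] * 7,
})
df = events.merge(users, on="user_id")

# One blended curve vs. one curve per (source, first-week behavior) segment.
blended = retention_curve(df)
segments = {
    key: retention_curve(group)
    for key, group in df.groupby(["source", "key_action_week1"])
}
```

In this toy data the blended curve shows 50% week-one retention, hiding that the referral users who took the key action retain at 100% while the others drop to zero; the real analysis followed the same shape with production event tables.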

What I did

  • Tagged cohorts by channel, plan, and concrete first-week actions — not vanity completion events.
  • Defined activation as milestones tied to felt value, then instrumented those honestly.
  • Worked with design to simplify first-session guidance and remove obvious friction.
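The activation definition in the second bullet can be sketched as a predicate over first-week events. The milestone names, the seven-day window, and the all-milestones-required rule are illustrative assumptions, not the actual definition used:

```python
# Sketch: activation as value-tied milestones completed within the
# first week, ignoring vanity completion events. Milestone names and
# the 7-day window are hypothetical.
from datetime import datetime, timedelta

VALUE_MILESTONES = {"created_first_report", "invited_teammate"}  # assumed

def is_activated(signup_at: datetime,
                 events: list[tuple[str, datetime]]) -> bool:
    """True only if every value milestone occurred within 7 days of signup."""
    window_end = signup_at + timedelta(days=7)
    hit = {name for name, ts in events if signup_at <= ts <= window_end}
    return VALUE_MILESTONES <= hit

signup = datetime(2024, 1, 1)
activated = is_activated(signup, [
    ("tour_completed", datetime(2024, 1, 1)),       # vanity event, ignored
    ("created_first_report", datetime(2024, 1, 2)),
    ("invited_teammate", datetime(2024, 1, 5)),
])
not_activated = is_activated(signup, [
    ("tour_completed", datetime(2024, 1, 1)),       # busywork alone fails
])
```

Keeping the definition a pure function of timestamped events is what "instrumented honestly" amounts to in practice: the same predicate runs over raw logs, so nobody can move users into the activated bucket by redefining a dashboard filter.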

Impact

  • Week-eight retention up 9.8 points.
  • Early churn in the first fourteen days down 17%.
  • Backlog aligned to measurable levers instead of a grab bag of “retention initiatives.”