
Retention

V1 app data: A retrospective (Part 2 of 3)

This is Part 2 of a retrospective on how we used app data to evaluate Even V1; Part 1 is here!

Once we’d gotten better at acquiring Even members, the next thing was to see if we could retain them.

We started off by measuring how many members who fit into the category "Active" stayed with us. To be active simply meant that a member had an active account: they'd moved on to using the app after vetting, and hadn't since deactivated.

The app data

We soon realized that the "active" category wasn't doing very much to help us understand retention. Some active members were sticking around for months; others were deactivating their accounts after a few weeks. It wasn't clear what allowed us to retain some and not others. "We'd see people come in, and then we'd see them quit," Evan remembers. "It was a quick interaction; it was like, 'I don't know what to make of this.'"

The follow-up

A hint came from the Advisors team. They'd noticed it felt like we were skipping lots of paydays. (Skipping a payday just meant that, on a member's payday, instead of making a withdrawal or deposit to even their pay, we did nothing. There were many reasons for skipping, ranging from the member asking us to skip it, to us simply not having the account information we needed from the member.) Upon investigation, we found we were actually skipping a full third of member paydays.
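The check itself is just arithmetic: skipped paydays divided by total paydays. As a minimal sketch of how you might run it, assuming a hypothetical record with a `was_skipped` flag (not Even's actual schema):

```python
from dataclasses import dataclass

@dataclass
class Payday:
    member_id: str
    was_skipped: bool  # True if we made no deposit or withdrawal on this payday

def skip_rate(paydays: list[Payday]) -> float:
    """Fraction of member paydays where no transfer was made."""
    if not paydays:
        return 0.0
    skipped = sum(1 for p in paydays if p.was_skipped)
    return skipped / len(paydays)

# Illustrative data: 2 of 6 paydays skipped -> 33%, roughly a third
paydays = [
    Payday("a", False), Payday("a", True),
    Payday("b", False), Payday("b", False),
    Payday("c", True),  Payday("c", False),
]
print(f"Skip rate: {skip_rate(paydays):.0%}")
```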

Given that handling paydays was all the app did back then, Evan points out this meant a lot of the members we were counting as "active" hadn't truly made it all the way through our funnel. They'd stalled one step short of the real finish line.

The decision

To try to separate out the members who we thought were really experiencing the app, we created a new category called Engaged. To qualify as engaged, a member had to have had at least two paydays evened by Even.

Looking at retention numbers for engaged members versus merely active ones, a pattern finally started to emerge. "The engaged number actually had really great retention," says Evan. "We were like, 'Okay, this works. If you can last through two paydays, you're tremendously likely to stay on for another six months.'"
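For a rough sense of how that cut works, here's a minimal sketch. The field names (`evened_paydays`, `retained_6mo`) and the retention window are illustrative assumptions, not our actual schema; the point is just that Engaged is a filter applied on top of Active:

```python
from dataclasses import dataclass

@dataclass
class Member:
    member_id: str
    is_active: bool        # account activated and not since deactivated
    evened_paydays: int    # paydays Even actually evened for this member
    retained_6mo: bool     # still a member six months later

ENGAGED_THRESHOLD = 2  # at least two paydays evened

def retention(members: list[Member]) -> float:
    """Share of the given members who were still around six months later."""
    return sum(m.retained_6mo for m in members) / len(members) if members else 0.0

def compare_retention(members: list[Member]) -> tuple[float, float]:
    """Six-month retention for merely-active vs. engaged members."""
    active = [m for m in members if m.is_active]
    engaged = [m for m in active if m.evened_paydays >= ENGAGED_THRESHOLD]
    return retention(active), retention(engaged)
```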

As for people who never became engaged: “Our business output is paydays being evened; if you’re not participating in the payday process, then there’s no question that you’re going to, at some point — especially when you start getting charged $3 a week — say, ‘This is not for me.’”

While this new engaged number helped us get a better understanding of what drove member retention, it absolutely did not mean we were off the hook. In the next post, I'll finish by discussing the other half of what we learned from the data about retention, and how it forced us to take a hard look at whether we were providing enough utility to enough Even members.

The takeaways

  • Right after launch, it’s hard to know which metrics will end up being helpful. It’s likely we’ll start out looking at some numbers, then adjust and look at others as we figure out what really matters.
  • To design metrics that really matter, it’s important to think about which ones have meaning in the context of the end member experience (“Engaged”) — not just on a technical level to us (“Active”).
  • At the same time, cherry-picking data to tell ourselves a story we like isn't cool, and it doesn't help us at all. Creating a metric like Engaged is no substitute for facing the original problem posed by the numbers.

Thanks again to Mr. Evan for helping me remember our data story.
