V1 app data: A retrospective (Part 3 of 3)

This is Part 3 of a retrospective on how we used app data to evaluate Even V1; Part 1 is here and Part 2 is here.

Our work on retention showed that actually experiencing Even Pay made new members way more likely to stick around. But did Even Pay alone provide enough utility to make a significant difference in members’ lives?

As described in this post, Even V1 was just the first “Hydrogen” building block of a larger “H2O” molecule we envisioned. H2O Even would be a completely automated money-managing system that planned for bills, saving, and spending, in addition to evening pay. We knew we wanted to build H2O one day; the question was, when?

If V1’s Hydrogen was “enough” to really impact people’s financial lives and stress levels, we could move forward with plans to grow our member base right away, and come back to H2O later. If it wasn’t “enough,” we needed to make a more compelling product before trying to prove ourselves with a larger market.

The app data

Even among engaged members we retained and kept serving for months, we didn’t see the level of cushion saving that would have indicated people were having a drastically easier time managing their money as a result of having Even Pay. This suggested V1 wasn’t “enough” — but we wanted to be sure.

“Most people have no way to confirm that these things line up,” Evan said, “but because you’re able to have this very close relationship with everybody [through qual research], that means that you go from concocting a graph that looks like any shape to being like, ‘Hey, let’s confirm this.’”

The follow-up

Evan remembers a study on deactivation coming in handy at this point.

We’d been sending a brief survey to everyone who decided to quit Even; now, I went back through and looked specifically at responses from people who’d quit after Even had handled multiple paychecks for them.

Unlike brand-new sign-ups, for whom distrust of Even was a major factor driving deactivation, long-term engaged members almost never cited trust issues. Instead, the main themes in people’s responses boiled down to the product not being compelling enough and not solving “the problem” well enough.

To make sure I was hearing the survey feedback right, I phone-interviewed a few of these deactivation survey-takers. A deactivator named Ryan* explained the problem this way:

“It was great to identify what my average paycheck would look like, and that anything over that, I should just immediately save it. That’s an excellent start. But for me, it’s the regular paycheck where I struggle with keeping things balanced. What do I need to account for budgeting? What do I need to put into savings? What do I have left over for everything else?”

This quote captures what a lot of other members were saying: Getting paycheck boosts was helpful, but it wasn’t necessarily leading to easier budgeting, more saving, and the ultimate goal — a feeling of financial confidence.

The decision

This third, utility-focused stage of evaluation led us to a pretty important conclusion: “Okay, we built an engine, but the engine is going to be very tough for us to scale.” We decided it was time to take out the engine of the app and build a brand new one that would include all the features of “H2O” — Even V2.

The takeaway

  • It’s really risky to make a big product decision based on a chart alone. Using qualitative methods to investigate hypotheses from quantitative app data allows us to make decisions with way more certainty.
  • (Conversely, when we sense that an important theme is bubbling up through qualitative channels like Advisors, turning to app data is a great way to investigate the extent and impact of the theme.)

*Ryan gave me permission to use his story for this post.

Thanks to Evan Goldschmidt for his assistance in reconstructing our V1 data journey for these posts.
