Driving study habits and boosting cashflow through a 3-month challenge

I designed a 3-month challenge that increased engagement, reduced churn, and boosted cashflow. 16% of participants kept studying after the challenge ended, and 10% renewed their membership within 6 months—strong signals of lasting impact beyond the typical subscription period.

My role on the project

As the only product designer in an agile team, I handled the end-to-end design process, including: User Research & Analysis, Proto-Personas & User Stories, User Flows & UX/UI Design (low → high fidelity), Prototyping & Usability Testing, and Design Handoff & collaboration with engineers.

Where the idea came from

This wasn’t a roadmap feature—it started with a real user. Our Senior PM noticed Ivan Chong, a learner documenting his journey to become a Cloud Architect on Twitter and LinkedIn using hashtags like #100DaysOfCloud and #30DaysOfAzure. He was one of thousands already participating in a grassroots learning challenge. And none of our competitors were building anything around it.

That insight sparked an idea: what if we turned this organic behavior into a structured experience that rewarded consistency, reduced churn, and improved revenue?

Sometimes ideas come from simply paying attention to what users are already doing. That’s exactly what happened when we noticed Ivan Chong, a user who had recently decided to pivot into a Cloud Architect role. He was studying across multiple platforms within short timeframes and posting daily progress on Twitter & LinkedIn.

Screenshots of social media posts shared on LinkedIn and X by Ivan Chong

He was using the same hashtags: #100DaysOfCloud and #30DaysOfAzure. Digging a little deeper, we found that he wasn’t alone. He was part of a larger movement called 100 Days of Cloud, a community-driven challenge with thousands of people doing the same thing. And surprisingly, none of our competitors were addressing this kind of user behavior. That’s when the lightbulb went off: what if we created an initiative that helped goal-driven learners stay on track while also reducing churn?

The opportunity

This was more than a trend. We saw clear business potential:


  • Acquire new users working toward clear, time-bound goals

  • Reduce early churn by encouraging longer-term commitments

  • Boost monthly net cashflow by requiring users to prepay for 3 months (the riskiest period for cancellations)

How might we help committed learners stay on track—while improving retention and revenue?

We launched Cloud Marathon, a 3-month challenge where users commit to studying 1 hour per day and posting their progress publicly. If they complete all 90 days, they get a full refund.

To make it more inclusive, I also designed a “Half Marathon” version: 45 days of study instead of 90. Same structure, less intense commitment.

Detail of the landing page, where the steps to participate in the challenge are illustrated
Screenshots from the landing page, illustrating the conditions to complete a full and half marathon.

The initial hook was obvious: learn for free.

But the deeper value came from helping users build study habits, explore beyond a single platform, and stay accountable through public sharing.

Designing the experience from scratch

With no prior solution to build on, I designed the entire experience from scratch, starting with the user journey. I mapped out all user flows, from landing page sign-up to completion, ensuring the challenge structure was easy to understand and stick with. The user flow diagrams below show how I visually broke down the experience.

Screenshot of one of the user flows produced during the discovery phase of the project

Designing with constraints

We had no research budget and limited dev time. I collaborated with learning coaches to build proto-personas based on their real-world interactions, and worked closely with our PM to define core user stories for the MVP.

Screenshot illustrating the proto-personas created to work on the project

From there, I designed a landing page, a dashboard to track progress and completion state, and a results page summarizing outcomes and rewards.

Because dev resources were shared, we shipped the project in phases over four months. I prioritized clean handoff, edge case coverage, and constant alignment with engineers.

Image of Marathon's dashboard page and Result page

Validating early ideas

To get early buy-in, I rapidly sketched and prototyped mid-fidelity wireframes for internal testing. This allowed us to de-risk key UX decisions without slowing momentum.

Screenshot of mid-fidelity wireframes

Why it worked

This wasn’t just about refunds. We designed around intrinsic motivation. Using Daniel Pink’s Drive and Yu-kai Chou’s Octalysis model, I shaped the experience around:


  • Autonomy: Users opted into the pace and platform of their study.

  • Accountability: Public sharing created positive peer pressure.

  • Progress: Streaks and daily tracking reinforced momentum.


Instead of forcing behavior change, we supported what motivated learners were already doing—and gave it structure and rewards.

From validation to handoff

With stakeholder buy-in secured, we quickly brought the solutions to high fidelity, tested them again, and handed the designs off to engineering, including edge cases and documentation.

"When the reward is the activity itself— deepening learning, delighting customers, doing one’s best—there are no shortcuts."

Daniel H. Pink

Screenshot of Marathon's result page

Results

We launched two rounds of Cloud Marathon and saw:


  • +18% MoM net cashflow increase (minimal paid ads)

  • 16% of users continued studying post-challenge

  • 10% renewed a paid membership within 6 months

  • Repeat participation and strong word-of-mouth

Visual representation of the activity results after two rounds of Marathon

What I learned

One key mistake: we relied too heavily on internal feedback for usability testing. After launch, 8% of users contacted support, confused about disqualification. I addressed this by redesigning the daily challenge cards for clarity in the second edition.

Snapshot showing the before and after versions of the dashboard cards, redesigned after user feedback

This project reinforced a few key principles:

  • Align product ideas with real user behaviors

  • Small design bets can drive big business outcomes

  • Clarity and inclusivity in challenge-based design are non-negotiable
