TheLadders job site needed to develop an iPhone app. What’s an MVP for an app? How do you iterate in a Lean methodology?
Some issues that make it unlike website development:
- Consumers control the updates
- If consumers don’t like an app the first time they use it, they’ll never try it again
- 1-star reviews in the App Store would be tough for an MVP to overcome
The team wanted to create an MVP app that would average at least 3.5 stars in the App Store, so they designed a study that would allow them to:
- Get the app into users’ hands early
- Observe use over sustained time (6 wks)
- Build, measure, learn and evaluate user satisfaction
The team’s process:
- Wrote a test plan indicating the test’s who, what, when, where, why and how.
- Recruited participants with a variety of demographics and psychographics. They offered a $300 incentive tied to three milestones ($50 for the first week, $100 for three weeks, $150 for completion). They then funneled applicants through a screening survey and procured and distributed tools: TestFlight for app distribution; the Reflector app to mirror users’ iPhone interactions onto their desktops; and GoToMeeting to share the mirrored interactions with the team.
- Identified and deployed an atomic experience in which users log in, search for jobs and save their selections.
- Planned six 1-week sprints: Wednesday, surveys sent and roadmaps established; Thursday, six 30-minute GoToMeeting calls; Friday, six more 30-minute GoToMeeting calls; Monday, design and development; Tuesday, recap call and planning/re-prioritization.
- The survey acted as a prompt to remind testers and to prepare them to think about how their usage of the app had changed that week, how their usage of the website had changed that week, how they would currently rate the app, and what it would take to reach a 5-star rating.
- Then they made the calls. One moderator ran each session while stakeholders observed remotely and chatted via GoToMeeting. They also gathered some live user feedback on features still under development.
- Then they had a findings recap call to make sure all stakeholders were in alignment about what needed to happen next.
- Then they planned the next week using Trello, reprioritizing a sort of Kanban board to remove, add or adjust activities. They front-loaded what they feared were the most tricky aspects of the app so that they would have sufficient time to resolve any issues.
The resulting app has been widely downloaded and holds a 4.5-star rating. What did they learn from testing that helped achieve this?
- Their job-sorting function didn’t work well for users: results were sorted newest-first, as in a search function, so when users came back the next time they’d see totally different results. Users needed more of an inbox metaphor, with jobs grouped by date and stacked in order of relevance within each date, plus a clear indicator of which listings were new since the user’s last login.
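The inbox ordering described above can be sketched roughly as follows. This is a minimal illustration, not TheLadders’ actual implementation; the field names (`posted`, `relevance`) and the simple tuple-sort approach are assumptions for the sake of the example.

```python
from datetime import date

def inbox_order(jobs, last_login):
    """Group jobs newest-date-first, ranked by relevance within each date,
    pairing each job with a flag for whether it is new since last login."""
    # Sorting on the (posted, relevance) tuple in reverse gives newest dates
    # first, and highest relevance first within each date.
    ordered = sorted(jobs, key=lambda j: (j["posted"], j["relevance"]), reverse=True)
    # A listing counts as "new" if posted after the user's last login.
    return [(j, j["posted"] > last_login) for j in ordered]

# Hypothetical sample data for illustration.
jobs = [
    {"title": "CTO",       "posted": date(2013, 5, 2), "relevance": 0.6},
    {"title": "VP Eng",    "posted": date(2013, 5, 3), "relevance": 0.9},
    {"title": "Architect", "posted": date(2013, 5, 3), "relevance": 0.4},
]

for job, is_new in inbox_order(jobs, last_login=date(2013, 5, 2)):
    print(job["title"], "NEW" if is_new else "")
```

Because the ordering key is the date rather than the exact timestamp rank, a returning user sees the same stable, inbox-like grouping rather than a reshuffled search result.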
- They learned how to execute a “like” feature which helped job-seekers and recruiters find relevant matches.
- They tested a mobile job application function that job-seekers entirely rejected.
- They tested a feature for job-seekers to rate how well they thought they matched a job, but recruiters rejected that feedback as unsubstantiated.
- They learned to remove the paywall in the app so that users could save jobs on their phone and then see them on their desktop later without interruption.
- They found the usual sorts of usability issues.
- Most importantly: they learned what the detractors and delighters were for users and were able to build those into the final MVP app.