During my time at CareGuide, I led design on the Nanny Lane product. Nanny Lane is a web application that connects caregivers with families seeking long- or short-term care for their children. Using our tools and resources, families can find caregivers and caregivers can find work with ease.
To make strategic decisions about which features to prioritize, we wanted to build an in-app solution that would collect data on how our users felt about nanny shares. At the same time, we wanted to address a 17% drop-off rate in our onboarding flow.
To get a better understanding of the problem, I started by reviewing the existing onboarding flow. Our most common user groups fell into four categories: families, nannies, families joining with another family, and families who already had a nanny, each signing up for different reasons.
Depending on the user's goals and how they filled out the questions while onboarding, they would typically fall into one of the following user types:
Looking for a nanny or nanny share
Looking for a nanny to complete nanny share
Looking for another family to complete nanny share
Looking for a family or nanny share
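As a purely illustrative sketch (the function, field names, and strings below are my own, not Nanny Lane's actual code), the branching from onboarding answers to user type could look like this:

```python
# Hypothetical sketch of the onboarding branching logic; the role and
# joining_with values are assumed question answers, not the real schema.

def classify_user(role: str, joining_with: str) -> str:
    """Map onboarding answers to one of the four user types."""
    if role == "family":
        if joining_with == "family":
            # Two families together still need a nanny for the share
            return "Looking for a nanny to complete nanny share"
        if joining_with == "nanny":
            # A family that already has a nanny needs a second family
            return "Looking for another family to complete nanny share"
        # "neither": the family is signing up alone
        return "Looking for a nanny or nanny share"
    # A nanny signing up is looking for work
    return "Looking for a family or nanny share"
```

The key point the sketch captures is that a single answer ("who are you joining with?") is enough to route a family into one of three distinct paths.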
For this project, we focused on the “Family”, “Family & Family”, and “Family & Nanny” user types, since they were dropping off most frequently. It wasn’t clear why, but we did know it happened within the first two to three screens.
Below are the screens where our users were dropping off, organized to show how we determine the user type and adjust our content accordingly.
Looking for another family to complete nanny share
Looking for a nanny to complete nanny share
Looking for a nanny or nanny share
This project was on a short timeline, so I relied on existing usability test results and on conversations with my product manager, who communicates with our users regularly, to understand the issues people were running into on this screen.
The results showed that the way the question was formatted confused our users and the nanny share definition was too wordy.
The question assumes users are joining with someone. Our data showed that 74% of users were joining alone (the “neither” option).
Users were skipping over this text where we tried to encourage users to try nanny sharing.
The layout of the screen above assumed that the majority of our users were signing up to complete a nanny share. Looking into our data, we found that 74% of our users were selecting the “neither” option.
This meant that a large majority of our users were signing up alone, most likely looking for a regular nanny. To simplify the path to sign-up and reduce the chance that those 74% of users would get confused or drop off, I catered the redesign of this page to their needs first.
There were a lot of intricacies involved on our end in categorizing the user into one of our user types, dealing with the drop-off rate, and capturing nanny share interest. I used user flow diagrams to map out the different ways we could do so.
Below you’ll find both the user flow diagrams along with the designs that were later made from those diagrams. The two below were the iterations we considered before arriving at the final solution.
I found it helpful to collaborate with my team to find what was working and what wasn’t with the existing iterations. My initial ideas involved changing the entire flow, so instead, we took a step back to evaluate the screen where we were experiencing the highest drop off rate.
We restructured the following screen in a way that was easier to follow. Our focus was on reducing the amount of time it would take to make a decision by reformatting the question in a way that was both easier to comprehend and goal-oriented.
According to Hick’s Law, the time it takes to make a decision increases with the number and complexity of choices. That logic explains why our users took longer and likely dropped off on this screen. So, we redesigned the parts that contributed to this complexity.
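Hick's Law is usually written as T = b · log₂(n + 1), where n is the number of equally likely choices and b is a constant fit to the interface. A quick calculation (my own illustration, not part of the project's analysis) shows why cutting choices matters:

```python
# Hick's Law: decision time grows as T = b * log2(n + 1), where n is the
# number of equally likely choices and b is an interface-specific constant.
import math

def hicks_law(n_choices: int, b: float = 1.0) -> float:
    """Relative decision time for n equally likely choices."""
    return b * math.log2(n_choices + 1)

four_options = hicks_law(4)  # ~2.32
two_options = hicks_law(2)   # ~1.58
```

Going from four options to two reduces the modeled decision cost by roughly a third, before even accounting for the wordy definition text that added further complexity.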
The question was changed to be goal-oriented, focusing on what task the user is looking to complete by signing up.
We simplified the nanny share definition to act as a quick tip for those who aren’t familiar with nanny shares.
Bringing the options down to two made it more likely that the user would accomplish their goal rather than give up or get confused.
With the drop-off issue solved, I was able to focus on designing an in-app solution to collect nanny share interest. To do so, I created a series of steps that would categorize each user accordingly while also tracking their interest in nanny sharing through button clicks.
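A minimal sketch of click-based interest tracking, assuming hypothetical event names and in-memory storage (the real implementation and its analytics pipeline are not described in this write-up):

```python
# Illustrative sketch only; button identifiers and storage are assumptions,
# not the actual Nanny Lane tracking implementation.

interest_events = []

def track_click(user_id: str, button: str) -> None:
    """Record which onboarding button a user clicked and whether
    that click signals interest in nanny sharing."""
    interested = button in {"join_nanny_share", "learn_more_about_shares"}
    interest_events.append(
        {"user": user_id, "button": button, "share_interest": interested}
    )

# Example: one share-interested click and one regular-nanny click
track_click("u1", "join_nanny_share")
track_click("u2", "find_a_nanny")
share_interest_rate = (
    sum(e["share_interest"] for e in interest_events) / len(interest_events)
)
```

The idea is that interest is inferred from which path a user chooses, so no extra survey step is needed during onboarding.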
Follow the path to understand how we determine the user type and nanny share interest upon signing up.
Looking for a nanny or nanny share
Looking for another family to complete nanny share
Looking for a nanny to complete nanny share
The intricacies that came from capturing nanny share interest with an in-app solution were difficult to understand when presented in a user flow diagram. With a prototype, we were able to see each path individually and determine how to track interest based on which button was clicked in the flow.
You can play with the prototype here to go down the different paths or take a look at the live example in the link below.
The changes we made to the onboarding flow launched in Q4 2019. Our main objective was to get a better understanding of whether our users were interested in nanny shares. We also looked into how we did with decreasing our drop off rate and increasing conversions.
With access to new data, our product manager was now able to make informed strategic decisions for the product.
The updates we made to the screen with the 17% drop-off rate brought it down to 11%.
Along with the reduced drop-off rate, we also saw our conversion rate increase by 5%.
There wasn’t a specific design process or template that could be used to come up with a solution for this project. With the constraints I was given, I needed to think quickly, adapt, and decide which methods made the most sense to solve the problem at hand. Although this project didn’t follow a formulaic approach, it did take me out of my comfort zone and teach me the reality of what it means to work as a product designer.
I learned that not every project will fit into a step by step, pre-determined process. Sometimes I won’t have the time or resources to spend two weeks recruiting and interviewing users. What makes the difference in these situations is how I adapt to the constraints given and solve the problem with the time, resources, and skills I do have.
Getting data to validate whether users are interested in a concept can be done in various ways (surveys, interviews, etc.). The solution we shipped accomplished this with less time and fewer resources.
In this case, the constraints required me to work faster than usual. I went right into working on high fidelity solutions to be able to meet our goal within the allotted period and was successful in doing so.
We had plans to provide a more personalized experience for our user types in the future. With the changes we made to the onboarding, we were one step closer to being able to provide this improved experience.
With the shorter timeframe, I wasn't able to properly define the success metrics to evaluate post-launch. This led to avoidable confusion early on about whether the launch was a success.
I would have liked to work with customer support to see if they could have provided any insight that supported (or challenged) some of the quantitative data we collected.