Problem
While mentorship programs are proven to help newcomers overcome employment barriers, their reach is limited.
According to research by ALLIES spanning 11 mentorship programs in 8 Canadian cities, skilled immigrant unemployment dropped from 73% to 19% when mentorship took place. Yet mentorship remains hard to access: many newcomer service providing organizations (SPOs) offer employment mentorship programs, but are constrained by manual, time-intensive program administration and a limited supply of mentors. Meanwhile, many newcomers remain unaware of mentorship as a channel for employment advice, and struggle to find work.
In 2018, PeaceGeeks was funded by Immigration, Refugees and Citizenship Canada to launch a pilot project, working with SPOs to co-create and test a digital platform that increases newcomers’ access to relevant employment mentors.
25-200
is the range of matches per year that each service providing organization (SPO) made, based on interviews with 15 organizations. On average, an organization made 60 matches a year.
51%
of newcomers surveyed by Vancity and Angus Reid were unable to access work that matched their workplace credentials, ending up in junior roles or different fields.
Solution
Snapp is a digital service that facilitates connections between newcomers and local professionals.
My team and I created and piloted Snapp under a grant-funded research initiative which investigated how digital interventions can help newcomers better access relevant employment mentors.
Snapp wasn't just about designing digital touchpoints. Program administrators played a critical role in making the end users' (newcomer and local professional participants') experience possible. My role involved defining not only the digital experience for participants, but also the digital experience of program administrators, and how people, processes, and digital components would work together to provide the big-picture service for newcomers.
Built with no-code
Due to resource constraints and partnership considerations, Snapp was created without engineers.
I led not only the design but also the implementation of Snapp on a no-code stack, comprising Webflow, Coda, Survicate, Google Forms, Zapier, and emails I coded with simple HTML.
Micro-matching participants
For participating newcomers and local professionals, the Snapp experience differed from other newcomer-serving employment mentorship programs in its program design.
Many newcomer mentorship programs require multiple meetings, intensive screening and orientation, and a formalized mentor/mentee relationship. In contrast, Snapp offered micro-matches (one-time introductions), was purely virtual (using email and video calls), had lower eligibility barriers, and offered a reciprocal exchange where newcomers could give local professionals feedback on their advisory skills.
Empowering program administrators
For case managers and frontline workers wanting to make connections for their clients (newcomers), Snapp streamlined logistics by consolidating participant information into easy-to-understand dashboards and automating the connection flow and communications. Matches were generated algorithmically, with room for manual curation.
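To give a flavour of how this can work, here's a minimal sketch in Python. This is not Snapp's actual logic (the real matching ran inside our no-code stack), and every field name and weight below is hypothetical:

```python
# A minimal sketch, not Snapp's actual logic: the real matching ran inside
# our no-code stack, and every field name and weight here is hypothetical.
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    industry: str
    skills: set
    languages: set

def match_score(newcomer: Participant, professional: Participant) -> float:
    """Score a candidate pairing on shared industry, skills, and languages."""
    score = 0.0
    if newcomer.industry == professional.industry:
        score += 3.0  # a shared field carries the most weight
    score += len(newcomer.skills & professional.skills)  # overlapping skills
    score += 0.5 * len(newcomer.languages & professional.languages)
    return score

def suggest_matches(newcomers, professionals, top_n=3):
    """Rank candidates per newcomer; an administrator curates the final pick."""
    return {
        n.name: sorted(
            ((match_score(n, p), p.name) for p in professionals),
            reverse=True,
        )[:top_n]
        for n in newcomers
    }
```

Because administrators reviewed a ranked shortlist rather than an auto-committed pairing, manual curation stayed in their hands.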
How did we get here?
Our focus
How might technology be used to improve newcomers' access to employment mentors?
Discovery
Engaging newcomers, mentors, and newcomer service providing organizations (SPOs)
We started with an extensive discovery phase to uncover the opportunities, strengths, and challenges in newcomers' employment and mentorship. Using interviews, focus groups, and co-creation, we learned from newcomers and local professionals, both with and without experience in mentorship, and mentorship program coordinators. We also compared 20 B.C. settlement and employment mentorship programs serving varying newcomer demographics, reviewed literature on innovative mentorship practices, and tested 18 mentorship platforms.
Insights
Insights from initial discovery
Through talking to newcomers, frontline workers, and mentors, and studying existing mentorship programs, we saw how access to mentorship was constrained on both ends: newcomers were often unaware of existing mentorship programs or had difficulty using them, and the programs themselves had limited capacity.
For many newcomers, mentorship was an unfamiliar concept (there were no mentorship programs in their home countries). Others had lengthy professional experience and saw mentorship as something only for junior professionals.
Being a “mentee” or “beneficiary” can feel invalidating to newcomers, who often had vast professional networks prior to immigrating, extensive work experience, or PhDs or Master’s degrees. The case manager or program coordinator at a service providing organization often acts as a gatekeeper, with the power to determine a newcomer’s “job-readiness” and their access to mentorship. From the newcomer’s view, this process can feel opaque and disempowering.
Due to cultural differences, networking and informational interviews can also feel foreign and intimidating for newcomers. Defined expectations in a mentorship program give newcomers a safe and clear framework for asking for advice.
Many programs required a two- to three-month time commitment, or completion of an entire employment program after which a match was not guaranteed. Newcomers, on the other hand, wanted shorter engagements with more immediate connections and fewer educational components. Program timing also did not suit newcomers with survival jobs or fluctuating work schedules.
Many service providing organizations' mentorship processes were manual and time-intensive: transferring registration data into spreadsheets, curating matches by hand, using paper-based forms, conducting lengthy screening and training, and sending individualized participant communications (emails, calls).
A top challenge for service providing organizations was recruiting and retaining qualified mentors; they dedicated extensive energy to training mentors and nurturing mentor relationships.
Service providing organizations also had challenges measuring program impact due to low survey uptake and response rates, and the difficulty of attributing long-term job acquisition to mentorship; few short-term performance indicators existed. Staff often spent time emailing and calling newcomers to follow up.
Service providing organizations’ programs were siloed and did not facilitate matches across programs. Sometimes newcomers would wait a long time for a match, not realizing a different organization had a relevant mentor. Organizations were disincentivized to collaborate due to:
- Competitive funding and reporting structure,
- Protection of their mentor relationships and hesitation to "lend" a mentor to another organization's client, and
- Mismatch between different programs' standards of mentee readiness, eligibility, and mentor screening.
I learned about mentorship just now [in this interview]. I didn't realize there are those people who are here to help you in your industry.
Interviewee, Newcomer from Iran, formerly an accountant, arrived 5 years ago
Goals of Snapp
- Help newcomers connect with relevant local professionals
- Increase capacity of newcomer service providing organizations (SPOs) to facilitate accessible and scalable mentorship
Codesign
How did we make a shared solution possible?
In the grant, PeaceGeeks had proposed to co-design and pilot a shared digital mentorship tool to be used by multiple SPOs. However, once we began to engage with SPOs to launch into co-design and discovery, it suddenly felt like no one wanted us to solve this problem. So, how did we converge on a valuable solution that our partners and funders could say "yes" to?
Design with the system in mind
Challenge: Because SPO programs’ funds were tied to meeting specific client targets (i.e., a target number of newcomers served), many SPO staff feared that a collaborative initiative would cause them to lose clients to another organization. They were also concerned that PeaceGeeks was competing in their space and threatening to take clients away from their programs.
Response: We dug into the funding and program landscape so we could design the service around those factors, asking questions like: How might we align our service with SPOs' funded mandates? How might we attribute client/volunteer outcomes to specific SPOs while keeping a shared participant pool?
We also focused on understanding who SPOs' existing programs couldn’t serve, and how our project could fill these gaps.
Find ways to add value in the program administrator's journey
Challenge: Frontline workers at SPOs worried our project would either create more work for them or threaten to erase their jobs. They were protective of their processes, even though those processes were manual and time-intensive. Many emphasized that mentorship matching was an art, and were unenthusiastic about a digital solution.
Response: By meeting with frontline workers to walk through the systems and processes they used to administer their mentorship programs, we found pain points they wanted to address, such as measuring outcomes and following up with matches. By designing solutions that addressed these, we could make a shared matching solution more compelling for them.
Shared learning and sustained value
Challenge: SPOs were wary of the flighty, short-lived impact of government-granted "innovation" pilot projects like ours.
Response: While we couldn’t control the funded project term, we focused on one question: how could we create something of value that SPO partners can use and leverage even after this project’s funding ends?
We focused on digital interventions that SPOs could reasonably implement and sustain beyond our engagement, which led us to a no-code solution.
We also made shared learning central to the project. Committee meetings became regular points of share-back, where I would present what PeaceGeeks had learned in each project phase, facilitate discussion on how SPOs might apply these learnings to their own programs, and listen as committee members guided the project direction.
Users
How did we find the "right" users?
Newcomers
By comparing existing programs, we validated with SPOs that the most opportune focus for our project was newcomers who aren’t typically eligible for federally-funded programs (e.g. temporary foreign workers, refugee claimants, international students), and those with limited time or an interest in micro-matching rather than long-term mentorship programs.
Frontline workers/program administrators
Because we were engaging with multiple SPOs, it was easy to get pulled in different directions by each SPO staff member's varying needs. At a certain point, we had to accept that some frontline workers would not be the target users for administering the service we sought to design.
Running a quick test to discover potential users and use case
One of the most helpful ways for us to converge on who would use this service was by showing people visual concepts early on.
Before getting too far into designing a solution, I mocked up a fake landing page of the initial concept and presented it to frontline workers, to probe how they thought the solution might work and to understand how they saw their role in using this tool.
Through this process, we could quickly see who (if anyone) the idea resonated with and why, iterate rapidly, and narrow in on the types of frontline workers who were our target user group.
It also revealed design opportunities and requirements for making the platform usable for frontline workers, such as: building in a way for them to review matches, adding ways to follow up, and giving newcomers guidance on starting the conversation.
Design
How did we design the digital experience?
Creating the flow of the service
We first focused on the user journey of newcomers and local professionals, highlighting opportunities and user needs along the experience.
From there, we began to sketch out how the flow might look from these participants' perspectives.
The design of Snapp required not only thinking about digital touchpoints for newcomers and local professionals, but also how program administrators run and experience the service, and how different digital components and people would work together to form the overall service experience.
So we then mapped how the different people — newcomers, local professionals, SPOs (program administrators), and PeaceGeeks ("super" administrator) — would interact within the flow.
Wireframing
From there, we wireframed the experience from different points of view — newcomer/"mentee", local professional/"mentor", and SPO program administrator/"coach".
Wireframes from the program administrator perspective:
Getting creative with testing
There were several aspects of our wireframed designs that we weren't so sure about — such as getting participants to exchange feedback about each other's skills.
To test rapidly, we recruited five newcomers to participate in a Wizard-of-Oz-style test where we simulated the steps of our proposed flow using Google Forms and manual matching. Newcomers were matched to “local professionals” who were actually PeaceGeeks staff, met for a coffee chat, and both parties were asked to fill out a form to exchange feedback after their meet-up. Afterwards, we interviewed both parties about the experience.
This lean test was very helpful. For example, we quickly learned that a separate step to review newcomers' skills felt redundant, since local professionals had already shared their comments during the meet-up.
Implement
How did we implement the design in no-code?
Next, I was responsible for figuring out how to build this digital service using no-code tools, which would allow us to test the service in the pilots. I didn't proceed to higher-fidelity wireframes, knowing that the ultimate look-and-feel of the design would be constrained by whatever software we selected.
Connecting the technical components
After consulting our lead engineer and running several internal tests of different platforms, we settled on Coda, Webflow, Survicate, Google Forms, Zapier, and email as our stack for prototyping the service to be piloted.
I created the landing page in Webflow and the program administrators' interface in Coda. The participants' experience was largely facilitated via emails that I designed with HTML and automated in a flow implemented in Coda.
(Why email, you ask? We found that email was the most accessible channel across newcomer groups, with less friction than trying to convert users onto a web app to communicate with their matched partners.)
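For flavour, here's a hedged sketch of what one of those introduction emails might look like, written in Python purely for illustration. The production emails were simple HTML triggered through Coda and Zapier rather than any script, and every address, link, and template detail below is invented:

```python
# Purely illustrative: Snapp's production emails were simple HTML triggered
# through Coda/Zapier, not sent by a script. Addresses, links, and the SMTP
# relay below are all hypothetical.
import smtplib
from email.mime.text import MIMEText

INTRO_TEMPLATE = """\
<html>
  <body style="font-family: Arial, sans-serif;">
    <p>Hi {newcomer} and {professional},</p>
    <p>You've been matched! {professional} works in {industry} and has
       offered to share employment advice over a video call.</p>
    <p><a href="{next_step_link}">Get started here</a></p>
  </body>
</html>
"""

def send_intro_email(newcomer, professional, industry, next_step_link, to_addrs):
    """Render the HTML template and send one introduction email to both parties."""
    html = INTRO_TEMPLATE.format(
        newcomer=newcomer,
        professional=professional,
        industry=industry,
        next_step_link=next_step_link,
    )
    msg = MIMEText(html, "html")
    msg["Subject"] = "Snapp: you have a new match!"
    msg["From"] = "snapp@example.org"        # hypothetical sender address
    msg["To"] = ", ".join(to_addrs)
    with smtplib.SMTP("localhost") as smtp:  # assumes a local SMTP relay
        smtp.send_message(msg)
```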
I mapped the technical flowchart of how the different systems would link together, and identified where manual processes would be required by program administrators and PeaceGeeks.
Here are some side-by-side comparisons of the wireframed designs vs. implementation!
Based on our wireframes and flows, I worked with my intern to implement the service piece by piece. This meant creating the landing page, emails, and program admin interface, but also producing the training materials and onboarding for how SPO staff would administer the program.
Matching detail screens
Local professionals' skill review
Selecting matches flow
Testing
How did we test through pilots?
Each Snapp pilot sought to learn how a digital service might increase SPO capacity and help newcomers better access employment connections.
Pilot 1 • Closed referrals
We designed and tested a collaborative digital interface that facilitated matching of newcomers and local professionals across multiple SPOs. Participants were newcomers and volunteers referred through seven employment programs from different organizations.
- 20 participants (10 newcomers, 10 local professionals)
- 7 SPO program administrators
- 20 matches made
- Pilot duration: 1 month
Pilot 2 • Public
While the collaborative approach in Pilot 1 showed promise in increasing the speed of matching and reducing administrative burden, the scale of the program was still constrained by the low supply of volunteers across the SPOs.
We decided to pivot and make Snapp publicly accessible to any newcomer or local professional, not just those referred by SPOs. Our goal was to test ways that we could augment the supply of local professionals.
- 264 participants (139 newcomers, 125 local professionals)
- 261 matches made
- Pilot duration: 1 month
A hypothesis-driven approach
Based on our research questions, we formed hypotheses and ideated different design interventions to test them. Then we identified the metrics we would need to measure in order to prove or disprove each hypothesis.
To move from Pilot 1 to Pilot 2, we laid out every screen/digital touchpoint from Pilot 1, ideated design improvements and the changes needed to test our hypotheses, and then prioritized ideas to implement. This also included designing points in our flow where we could gather data to validate our hypotheses (such as adding specific registration form fields for our research).
I used test cards to help us summarize results and make sense of next steps.
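To illustrate the shape of a test card, here's a minimal sketch; the hypothesis and criterion below are paraphrased examples in the spirit of Pilot 2's goal, not reproductions of our actual cards:

```python
# A hypothetical reconstruction of a test card's structure; the hypothesis
# and success criterion below are paraphrased examples, not our actual cards.
from dataclasses import dataclass

@dataclass
class TestCard:
    hypothesis: str   # We believe that...
    test: str         # To verify, we will...
    metric: str       # And measure...
    criterion: str    # We are right if...
    result: str = ""  # Filled in after the pilot

example = TestCard(
    hypothesis="Opening registration to the public will grow the supply "
               "of local professionals beyond SPO referrals.",
    test="Run Pilot 2 with a public landing page and sign-up form.",
    metric="Number of local professionals who register in one month.",
    criterion="Professional sign-ups roughly keep pace with newcomer sign-ups.",
)
```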
Here are some examples of findings based on our hypotheses:
Some examples of the outcomes we measured from the pilots:
Impact
Results and lessons learned
89%
of newcomer respondents wanted to participate again beyond the pilot
92%
of local professional respondents wanted to participate again beyond the pilot
96%
"happiness" rating from newcomers. 76% "extremely happy"
300%
more matches made in 1 month than an average SPO made in a year (261 matches in Pilot 2, vs. an average of 60 matches per SPO per year)
At many points, Snapp felt like a project in existential crisis — “Oh no, people don’t want this!” “We have no engineering resources for it... so I’m building it?” “Why are we even doing this? Can we give the money back?” So, to have navigated this project and received such a positive response to our final pilot feels like a huge personal success. Still, I recognize that a pilot has its limitations in proving the “stickiness” of a product/service, so I’m sure that if and when PeaceGeeks can continue Snapp, there will be more learnings to come!
Because of the initial resistance from partners towards the project, PeaceGeeks had thought that we would conduct Snapp purely for research and sunset it once the funding ended. To our surprise, as the Snapp project wrapped up and we shared our learnings, SPO partners and other organizations kept asking when we would run Snapp again, and some asked whether they could use our build to administer their own programs. Because of this, I’m excited to share that our team is currently researching, designing, and testing new business models for Snapp, to see how PeaceGeeks might continue this project now that the funding has ended.
As a shy newcomer still not used to Canadian way of networking, these meetings have been invaluable for me. Thank you!
Snapp newcomer participant (pilot 2)
Key takeaways
In initial conversations with frontline workers, I noticed that they felt frustrated by our “discovery” process because they wanted to know what we were trying to build (which, evidently, we didn’t know yet) and how they might be impacted. As a result, frontline workers became quite hesitant to engage with our project.
While I was cautious about jumping to solutions so early on, I learned that being able to show a prototype or even a visual concept shifts the conversation in a powerful way, helping both parties (the designer/researcher and the person giving feedback) better articulate where there is misalignment, uncover immediate problem points with the idea, and identify tangible steps for change.
In the same way, while there was early resistance to the Snapp project, when organizations finally saw and engaged with the pilot, their perceptions shifted dramatically: they could see how it could complement their own programs and benefit newcomers.
This project challenged me to think beyond designing for discrete digital touchpoints to look at the end-to-end picture of how the system functions to deliver a service. This required designing not only from the perspective of the end user (newcomer or local professional participant), but also those responsible for running the service (SPO staff/program administrator).
As we observed users' experiences and collected feedback through the pilots, we often uncovered design issues affecting different layers of the service (e.g. users' suggestions for form options in the feedback email, vs. SPOs' concerns about the risks of matching their clients with unvetted participants). The issues were so varied that it was challenging to figure out what to do first.
This project taught me to navigate issues of varying scope (individual touchpoints vs. connections between touchpoints vs. overall experience of the journey), and prioritize/deprioritize issues based on the goals of each pilot and phase of design we were in.
Snapp taught me a great deal about how it's still possible to design a valuable user experience when many of the usual ingredients (engineers, project buy-in, defined goals and users) aren't in place.
I learned a lot about the communication and consideration required to design a solution that contends with the practicalities of people and systems — navigating competition, partner politics, limited resources, sustainability, funding, and more. And with people and systems being so intertwined with the products that we use, I'm grateful that these considerations will help me to think more holistically about how to design truly valuable products in the future.