Research • Prototyping • Service Design

Snapp

Piloting a digital service with settlement organizations to improve newcomers' access to employment connections

About the project

Snapp is a digital service that matches newcomers with local professionals for one-on-one virtual chats to share employment advice.

As product owner and lead designer, I worked with over 15 newcomer service providing organizations (SPOs) to conceptualize, research, build, and test Snapp through a series of pilots. These pilots studied how we can improve newcomers’ access to employment mentorship through online tools and collaboration.

Why is this project interesting?

  • Architecting a service: digital solution + system and human processes
  • Getting to "yes": navigating resistance to reach a solution users, funders, and partners could buy into
  • This was a pilot project, with lots of discovery, hypothesis-driven testing, and iteration
  • I ended up building Snapp too, creating a no-code solution

Role & Timeline

Design Lead • PeaceGeeks

2019-2021

Scope

Product Strategy, Prototyping, Research, Service Design, UX, Participatory Design, No-code/Low-code Implementation

Team

See it live

Learn more on the website

Problem

While mentorship programs are proven to bridge newcomers' employment barriers, they have limited reach.

According to research by ALLIES on 11 mentorship programs in 8 Canadian cities, skilled immigrant unemployment dropped from 73% to 19% when mentorship took place. Many newcomer service providing organizations (SPOs) offer employment mentorship programs, but are constrained by manual, time-intensive program administration and limited mentor supply. Meanwhile, many newcomers remain unaware of mentorship as a channel for employment advice, and struggle to find work.

In 2018, PeaceGeeks was funded by Immigration, Refugees and Citizenship Canada to launch a pilot project, working with SPOs to co-create and test a digital platform that increases newcomers’ access to relevant employment mentors.

25-200

is the number of matches per year that each service providing organization (SPO) made, based on interviews with 15 organizations. On average, organizations made about 60 matches a year.

51%

of newcomers surveyed by Vancity and Angus Reid were unable to access work that matched their workplace credentials, ending up in junior roles or different fields.

Solution

Snapp is a digital service that facilitates connections between newcomers and local professionals.

My team and I created and piloted Snapp under a grant-funded research initiative which investigated how digital interventions can help newcomers better access relevant employment mentors.

Snapp wasn't just about designing digital touchpoints. Program administrators played a critical role in making the end users' (newcomer and local professional participants') experience possible. My role involved defining not only the digital experience for participants, but also the digital experience of program administrators, and how people, processes, and digital components would work together to provide the big-picture service for newcomers.

Snapp landing page (created on Webflow)

Built with no-code

Due to resource constraints and partnership considerations, Snapp was created without engineers.

I led not only the design but also the implementation of Snapp on a no-code stack, comprising Webflow, Coda, Survicate, Google Forms, Zapier, and emails I coded with simple HTML.
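To give a flavour of the email side of the stack, here is a minimal sketch of how a match-introduction email might be assembled as simple HTML from participant data. The field names and copy below are hypothetical placeholders, not Snapp's actual data model; the real emails were hand-coded HTML sent through the Coda/Zapier automations.

```typescript
// Hypothetical participant fields; the real Snapp records lived in Coda.
interface MatchIntro {
  newcomerName: string;
  professionalName: string;
  professionalRole: string;
  schedulingLink: string; // e.g. a scheduling URL shared with both parties (assumption)
}

// Assemble a simple HTML match-introduction email body.
// Inline styles keep rendering predictable across email clients.
function buildMatchEmail(match: MatchIntro): string {
  return `
    <div style="font-family: Arial, sans-serif; max-width: 600px;">
      <h2>You've been matched!</h2>
      <p>Hi ${match.newcomerName},</p>
      <p>
        We've connected you with <strong>${match.professionalName}</strong>,
        a local professional working as a ${match.professionalRole}.
      </p>
      <p>
        <a href="${match.schedulingLink}"
           style="background: #2b6cb0; color: #ffffff; padding: 10px 16px; text-decoration: none;">
          Book your chat
        </a>
      </p>
    </div>`;
}
```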

Automated match email sent to Snapp participants

Micro-matching participants

For participating newcomers and local professionals, the Snapp experience differed from other newcomer-serving employment mentorship programs in its program design.

Many newcomer mentorship programs require multiple meetings, intensive screening and orientation, and a formalized mentor/mentee relationship. By contrast, Snapp offered micro-matches (one-time introductions), was purely virtual (using email and video calls), had lower eligibility barriers, and offered a reciprocal exchange where newcomers could give local professionals feedback on their advisory skills.

Dashboard for Snapp program administrators (created on Coda)

Empowering program administrators

For case managers and frontline workers wanting to make connections for their clients (newcomers), Snapp streamlined logistics by consolidating participant information into easy-to-understand dashboards and automating the connection flow and communications. Matches were generated algorithmically, with the option of manual curation.
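To illustrate the general idea only (the actual matching logic lived in the no-code stack and isn't reproduced here), below is a rough sketch of how a rule-based scoring pass might rank candidate local professionals for a newcomer before an administrator curates the final match. The fields and weights are assumptions for illustration, not Snapp's real criteria.

```typescript
// Hypothetical profile shape; Snapp's real criteria and weights may differ.
interface Profile {
  id: string;
  industry: string;
  languages: string[];
  city: string;
}

// Score a candidate local professional against a newcomer.
function scoreMatch(newcomer: Profile, professional: Profile): number {
  let score = 0;
  if (newcomer.industry === professional.industry) score += 3; // shared field weighted highest
  if (professional.languages.some(l => newcomer.languages.includes(l))) score += 2;
  if (newcomer.city === professional.city) score += 1; // "local" connection
  return score;
}

// Rank candidates for one newcomer; an administrator reviews and confirms the top picks.
function suggestMatches(newcomer: Profile, professionals: Profile[], topN = 3): Profile[] {
  return [...professionals]
    .sort((a, b) => scoreMatch(newcomer, b) - scoreMatch(newcomer, a))
    .slice(0, topN);
}
```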

How did we get here?

Our focus

How might technology be used to improve newcomers' access to employment mentors?

Discovery

Engaging newcomers, mentors, and newcomer service providing organizations (SPOs)

We started with an extensive discovery phase to uncover the opportunities, strengths, and challenges in newcomers' employment and mentorship. Using interviews, focus groups, and co-creation, we learned from newcomers and local professionals, both with and without experience in mentorship, and mentorship program coordinators. We also compared 20 B.C. settlement and employment mentorship programs serving varying newcomer demographics, reviewed literature on innovative mentorship practices, and tested 18 mentorship platforms.

Through many months of relationship building, we were grateful to bring on 18 partners for the Snapp project!

Insights

Insights from initial discovery

Through talking to newcomers, frontline workers, and mentors, as well as studying existing mentorship programs, we saw how access to mentorship was constrained on both ends. Newcomers were often unaware of existing mentorship programs or had difficulty using them, and the mentorship programs themselves had limited capacity.

(Insights in white relate to newcomers; beige relate to service providing organizations)

I learned about mentorship just now [in this interview]. I didn't realize there are those people who are here to help you in your industry.

Interviewee, Newcomer from Iran, formerly an accountant, arrived 5 years ago

Goals of Snapp

  • Help newcomers connect with relevant local professionals

  • Increase capacity of newcomer service providing organizations (SPOs) to facilitate accessible and scalable mentorship

Dive into some key approaches

Codesign

How did we make a shared solution possible?

In the grant, PeaceGeeks had proposed to co-design and pilot a shared digital mentorship tool to be used by multiple SPOs. However, once we began to engage with SPOs to launch into co-design and discovery, it suddenly felt like no one wanted us to solve this problem. So, how did we converge on a valuable solution that our partners and funders could say "yes" to?

Design with the system in mind

Challenge: Because SPO programs' funds were tied to meeting specific client targets (i.e. a target number of newcomers served), many SPO staff feared that a collaborative initiative would cause them to lose their clients to another organization. They were also concerned that PeaceGeeks was competing in their space and threatening to take clients away from their programs.

Response: We delved into the funding and program landscape so we could design the service to account for those factors, asking questions such as: How might we align our service with SPOs' funded mandates? How might we attribute client/volunteer outcomes to specific SPOs while keeping a shared participant pool?

We also focused on understanding who SPOs' existing programs couldn’t serve, and how our project could fill these gaps.

We compared 20 mentorship programs to find opportunities to serve newcomers who weren't already connected to programs

Find ways to add value in the program administrator's journey

Challenge: Frontline workers at SPOs were wary that our project would either create more work for them or threaten their jobs. They were protective of their processes, even though these were manual and time-intensive. Many emphasized that mentorship matching was an art and were unenthusiastic about a digital solution.

Response: By meeting with frontline workers to walk through the systems and processes they used to administer their mentorship programs, we were able to find pain points they wanted to address, such as measuring outcomes and following up with matches. By designing solutions that addressed these, we could make a shared matching solution more compelling for them.

Visiting SPOs to walk through their mentorship programs
Quotes from frontline workers about mentorship program administration

Shared learning and sustained value

Challenge: SPOs were wary of the flighty, short-lived impact of government-granted "innovation" pilot projects like ours.

Response: While we couldn’t control the funded project term, we focused on: how can we create something of value that SPO partners can use and leverage even after this project’s funding ends?

We focused on digital interventions that SPOs could reasonably implement and sustain beyond our engagement, thus leading to a no-code solution.

We also made shared learning central to this project. Committee meetings became constant points of share-back, where I would present to SPOs what PeaceGeeks had learned along the project phases, facilitate discussion on how SPOs might apply these learnings in their own programs, and listen as committee members guided the project direction.

For example, I created a "toolkit" of learnings from our discovery, which we shared with partner SPOs. This included insights that didn't "make it" into our solution, but offered ideas for how SPOs might improve their own programs or conduct further research.
I also conducted a comparison of mentorship software to see what we might be able to leverage for our solution.

Users

How did we find the "right" users?

Newcomers

By comparing existing programs, we validated with SPOs that the most opportune focus for our project was newcomers who aren't typically eligible for federally funded programs (e.g. temporary foreign workers, refugee claimants, international students), and those who had limited time or were interested in micro-matching rather than long-term mentorship programs.

Frontline workers/program administrators

Because we were engaging with multiple SPOs, it was easy to get pulled in different directions by each SPO staff member's varying needs. At a certain point, we had to conclude that some frontline workers would not be the target users for administering the service we sought to design.

Running a quick test to discover potential users and use case

One of the most helpful ways for us to converge on who would use this service was by showing people visual concepts early on.

Before getting too far into designing a solution, I mocked up and presented a fake landing page of the initial concept to frontline workers, to probe how they thought the solution might work, and understand how they saw their role in using this tool.

I mocked up a very simple landing page which we put in front of frontline workers to gauge their reactions
We also probed what kinds of roles frontline workers would want to take in offering the service, helping us determine job stories for the admin interface.
Testing service matching variations with frontline workers

Through this process, we could quickly see who (if anyone) the idea resonated with and why, and iterate rapidly, narrowing in on the types of frontline workers who would be our target user group.

It also revealed design opportunities and requirements to ensure the platform would be usable for frontline workers, such as: building in a way for them to review the matches, adding ways for follow-up, and giving guidance to newcomers to start the conversation.

Design

How did we design the digital experience?

Creating the flow of the service

We first focused on the user journey of newcomers and local professionals, highlighting opportunities and user needs along the experience.

Newcomer journey map

From there, we began to sketch out how the flow might look from these participants' perspectives.

Whiteboarding the participant flow

The design of Snapp required not only thinking about digital touchpoints for newcomers and local professionals, but also how program administrators run and experience the service, and how different digital components and people would work together to form the overall service experience.

So we then mapped how the different people — newcomers, local professionals, SPOs (program administrators), and PeaceGeeks ("super" administrator) — would interplay in the flow.

Mapping out the flow with the different users/parties involved in the service — newcomers, local professionals, SPOs, PeaceGeeks

Wireframing

From there, we wireframed the experience from different points of view — newcomer/"mentee", local professional/"mentor", and SPO program administrator/"coach".

Wireframed user flow of an early version of Snapp

Wireframes from the program administrator perspective:

Getting creative with testing

There were several aspects of our wireframed designs that we weren't so sure about — such as getting participants to exchange feedback about each other's skills.

To test rapidly, we recruited five newcomers to participate in a Wizard-of-Oz-style test where we simulated the steps of our proposed flow using Google Forms and manual matching. Newcomers were matched to “local professionals” who were PeaceGeeks staff, met for a coffee chat, and both parties were asked to fill out a form to exchange feedback after their meet-up. Following that, we interviewed both parties about the experience.

This lean test was very helpful. For example, we quickly learned that having a step to review newcomers' skills felt redundant since local professionals had already shared their comments during the meet-up.

Side-by-side of wireframes and a Google form implementation to test our flow

Implement

How did we implement the design in no-code?

Next, I was responsible for figuring out how to build this digital service using no-code tools, which would allow us to test the service in the pilots. As such, I didn't proceed to higher-fidelity wireframes, knowing that the ultimate look and feel of the design would be constrained by whatever software we selected.

Connecting the technical components

After consulting our lead engineer and running several internal tests of different platforms, we settled on Coda, Webflow, Survicate, Google Forms, Zapier, and email as our stack for prototyping the service that would be piloted.

I created the landing page on Webflow and the program administrators' interface on Coda. Participants' experience was largely facilitated via emails I designed with HTML and automated in a flow implemented in Coda.

(Why email, you ask? We found that email was most accessible across newcomer groups, and would have less friction than trying to convert users onto a web app for communicating with their matched partners.)

I mapped the technical flowchart of how the different systems would link together, and identified where manual processes would be required by program administrators and PeaceGeeks.

Snapp pilot 1 - technical flowchart
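To give a sense of what that flowchart captured, here is a simplified, illustrative sketch of the kind of step list it mapped, tagging each step with the tool involved and whether it ran automatically or required a person. The specific steps and owners are assumptions based on the stack described above, not the exact Pilot 1 flowchart.

```typescript
type Owner = "automated" | "SPO admin" | "PeaceGeeks";

interface FlowStep {
  step: string;
  system: string;  // tool involved (Webflow, Google Forms, Zapier, Coda, email, Survicate)
  owner: Owner;    // who or what performs the step
}

// Simplified, illustrative version of the service flow — not the actual pilot flowchart.
const pilotFlow: FlowStep[] = [
  { step: "Participant lands on the Snapp page",        system: "Webflow",       owner: "automated" },
  { step: "Participant submits a registration form",    system: "Google Forms",  owner: "automated" },
  { step: "Form response synced into the database",     system: "Zapier → Coda", owner: "automated" },
  { step: "Candidate matches reviewed and confirmed",   system: "Coda",          owner: "SPO admin" },
  { step: "Match-introduction email sent",              system: "Coda → email",  owner: "automated" },
  { step: "Post-chat feedback survey sent",             system: "Survicate",     owner: "automated" },
  { step: "Outcomes reviewed across the pilot",         system: "Coda",          owner: "PeaceGeeks" },
];
```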

Here are some side-by-side comparisons of the wireframed designs vs. implementation!

Based on our wireframes and flows, I worked with my intern to implement the service piece by piece. This meant creating the landing page, emails, and program admin interface, but also producing the training materials and onboarding for how SPO staff would administer the program.

Matching detail screens
Wireframe: Program admin view of match detail
Implemented in Coda: Program admin view of match detail
Local professionals' skill review
Wireframe: Local professionals' view of skills review
Implemented in HTML & Coda: Local professionals' view of skills review
Selecting matches flow
Wireframe: Admin view of making matches
Implemented in Coda: Admin view of making matches

Testing

How did we test through pilots?

Each Snapp pilot explored how a digital service might increase SPO capacity and help newcomers better access employment connections.

Pilot 1 • Closed referrals

We designed and tested a collaborative digital interface that facilitated matching of newcomers and local professionals across multiple SPOs. Participants were newcomers and volunteers referred through seven employment programs from different organizations.

  • 20 participants (10 newcomers, 10 local professionals)
  • 7 SPO program administrators
  • 20 matches made
  • Pilot duration: 1 month

Pilot 2 • Public

While the collaborative approach in Pilot 1 showed promise in increasing the speed of matching and reducing administrative burden, the scale of the program was still constrained by the low supply of volunteers across the SPOs.

We decided to pivot and make Snapp publicly accessible to any newcomer or local professional, not just those referred by SPOs. Our goal was to test ways that we could augment the supply of local professionals.

  • 264 participants (139 newcomers, 125 local professionals)
  • 261 matches made
  • Pilot duration: 1 month

A hypothesis-driven approach

Based on our research questions, we formed hypotheses and ideated on different design interventions to test these hypotheses. Then, we identified metrics that we would need to measure in order to prove/disprove our hypotheses.

Hypotheses in Pilot 2

To move from Pilot 1 to 2, we laid out every screen/digital touchpoint from Pilot 1, ideated design improvements and changes needed to test our hypotheses, and then prioritized ideas to implement. This also included designing the points in our flow where we could gather data to validate our hypotheses (such as adding specific registration form fields for our research).

Ideating on ways to change and build upon Pilot 1 flow

I used test cards to help us summarize results and make sense of next steps.

Test card

Here are some examples of findings we arrived at based on our hypotheses:

Some examples of the outcomes we measured from the pilots:

Tracking how successful participants were in actually meeting in Pilot 2
Tracking newcomers' satisfaction in the matches

Impact

Results and lessons learned

89%

of newcomer respondents wanted to participate again beyond the pilot

92%

of local professional respondents wanted to participate again beyond the pilot

96%

"happiness" rating from newcomers. 76% "extremely happy"

300%

more matches made in 1 month than an average SPO makes in a year

At many points, Snapp felt like a project in existential crisis — “Oh no, people don’t want this!” “We have no engineering resources for it... so I’m building it?” “Why are we even doing this? Can we give the money back?” So, to have navigated this project and received such a positive response to our final pilot feels like a huge personal success. Still, I recognize that a pilot has its limitations in proving the “stickiness” of a product/service, so I’m sure that if and when PeaceGeeks can continue Snapp, there will be more learnings to come!

Because of the initial resistance from partners towards the project, PeaceGeeks had thought that we would conduct Snapp purely for research and sunset it once the funding ended. To our surprise, as the Snapp project wrapped up and we shared our learnings, SPO partners and other organizations kept asking when we would run Snapp again, and some have asked whether they could use our build to administer their own programs. Because of this, I’m excited to share that our team is currently researching, designing, and testing new business models for Snapp, to see how PeaceGeeks might continue this project now that the funding has ended.

As a shy newcomer still not used to Canadian way of networking, these meetings have been invaluable for me. Thank you!

Snapp newcomer participant (pilot 2)

Key takeaways

© 2021 - Cherrie Lam • Site built by me on GatsbyJS