At CareerPlug, updating an existing platform to modern technologies and user experience became a necessity after many years of unstructured growth. As a B2B agency, CareerPlug caters to many needs. Their primary focus is on small and medium businesses with a decentralized hiring system. As a new designer, I sought to develop an understanding of the ideal client profile, and how their needs overlapped with business goals.
What is “decentralized hiring”?
To put it concisely, decentralized hiring is when a person uses processes and qualifiers unique to them and their business needs. SMBs commonly hire on their own terms, follow unique processes, and typically don’t have much experience with hiring and associated legalities.
In the context of franchising, decentralized hiring refers to the hiring practices of a franchisee. A person buys a “business in a box” as part of a franchise network, then independently runs the business while paying the franchisor royalties. The franchisor provides basic artifacts like job descriptions and company forms, but the bulk of hiring and team development is left to the franchisee.
From a technical point of view, the outdated tech stack was common knowledge. To establish the experience point of view, I conducted research in the following areas:
Semi-structured user interviews (contextual inquiries), walkthroughs
Surveys, case analysis, NPS review
Card sorting, reverse card sorting, task analysis, hi-fi prototypes
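The NPS review mentioned above rests on a standard calculation; a minimal sketch in Python, assuming survey responses on the usual 0–10 scale (the function name and sample scores are illustrative, not CareerPlug's data):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters, 3 passives, 2 detractors out of 10 responses -> NPS of 30
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))  # 30
```

Passives (7–8) count toward the total but neither add nor subtract, which is why NPS can swing sharply when a few responses cross the 8/9 boundary.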
As a design team of one, I collaborated with internal stakeholders as often as possible. With help from various team members, I was able to gather data that informed the first redesign of the platform. I have consolidated my learnings from each source in the following sections. Click a title in the list below to read more details.
The Legacy Platform
In the process of learning the existing application, I began identifying the key workflows that related to the real-world jobs to be done by the application's users. Here are some of the major tasks I identified:
As the audit progressed, I began discovering usability issues. Some of these were:
The usability issues, particularly around efficiency of use, also highlighted the lack of communication between data models. For example, an applicant's social security number would not populate automatically in other places, forcing the user to input the same information multiple times.
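The duplication problem above is a classic single-source-of-truth issue; a minimal sketch of the normalized alternative, with illustrative class and field names rather than the platform's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    # Single source of truth: personal details live in one record.
    name: str
    ssn: str

@dataclass
class OnboardingForm:
    # Other workflows hold a reference instead of a copy, so the SSN
    # entered once is visible everywhere it is needed.
    applicant: Applicant
    form_type: str

alice = Applicant(name="Alice", ssn="123-45-6789")
w4 = OnboardingForm(applicant=alice, form_type="W-4")
i9 = OnboardingForm(applicant=alice, form_type="I-9")
print(w4.applicant.ssn == i9.applicant.ssn)  # True: entered once, used everywhere
```

With references rather than copies, a correction to the applicant record propagates to every form automatically, which is exactly the behavior users expected from the legacy app.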
I had two goals when I began reviewing support cases:
The support team employed a categorization system that mapped to the different areas of the app. A cursory glance at the category counts pointed to the areas with the highest volume. From there, it was as simple as reading individual cases to gather qualitative data and using it to inform improvements to the app.
Multiple opportunities presented themselves, but the overall trend indicated a problem with the information architecture of the web app. In analyzing the incoming questions and the responses, assistance with finding artifacts, settings, applicants, etc. was easily the largest driver of incoming communication. A subset of these dealt with understanding terminology. To address this, I conducted card sorting exercises to establish a sitemap that reflected the users' mental models. The result was a restructured navigation system that provided definitions of different areas, decluttered synonymous options, and introduced groupings that reflected how users organized the app's capabilities, such as hiring tools and my organization.
The second trend was about applicant flow. Users had a difficult time understanding that the web app wasn't a pool of applicants, but a way to manage applicants from other sources. Solutions to this issue also came from other teams at the company: marketing, customer success, and sales. In-app efforts included surfacing “source performance”, a bar chart showing the click-to-application conversion ratio of various applicant sources.
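The “source performance” chart described above boils down to a per-source conversion calculation; a minimal sketch with hypothetical event data (the function and source names are my own, not CareerPlug's schema):

```python
from collections import Counter

def source_performance(clicks, applications):
    """Click-to-application conversion rate (%) per applicant source.

    clicks / applications: iterables of source names, one entry per event.
    """
    click_counts = Counter(clicks)
    app_counts = Counter(applications)
    return {
        source: round(100 * app_counts[source] / total, 1)
        for source, total in click_counts.items()
    }

# Hypothetical traffic: 200 job-board clicks yielding 30 applications,
# 50 social clicks yielding 5.
clicks = ["Indeed"] * 200 + ["Facebook"] * 50
applications = ["Indeed"] * 30 + ["Facebook"] * 5
print(source_performance(clicks, applications))
# {'Indeed': 15.0, 'Facebook': 10.0}
```

Charting these percentages per source makes it visible that the app aggregates applicants from external channels rather than supplying its own pool.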
The third and most frequent point of frustration validated a finding from the heuristic evaluation: users were seeking efficiency tools but (a) could not find them and (b) did not have access to them. This was compounded by complicated, unwieldy workflows that failed to communicate system status or the consequences of a user's actions. The steep learning curve, combined with the lack of intuitiveness, led to high levels of frustration.
In conjunction with developing an understanding of the user base, I began interviewing internal stakeholders to develop an appreciation for their pain points. I focused on the sales, customer support, and customer success teams. These three teams brought perspectives that reflected different aspects of a user's journey with CareerPlug:
Using qualitative techniques like interviews, observing app use, and walkthroughs, I sought to solidify the foundation I’d built in my previous research, while maintaining high levels of empathy towards the user.
In interviews, we would touch on not just how the application was being used, but also which real-world workflows users were trying to replicate in the app. This was important for me to learn so I could successfully mirror their real-world processes inside the app. Walkthroughs of the existing app helped shed light on the workarounds in play, along with layout failures and in-app language improvement opportunities, and generally gave deeper insight into users' expectations.
Corroborating data from surveys, internal interviews, and user interviews led to the formation of personas. Please contact me to get the password for this whiteboard detailing the persona generation process and outcomes.
A competitive analysis was conducted to understand how direct and indirect competitors were appealing to the same set of user needs. This furthered my understanding of the user base by offering a different perspective on how mental models, jobs to be done, etc. were being approached. From a strategic standpoint, learning about positioning techniques helped inform some go-to-market plans further down the line.
After a period of research, a list of actionable items started accruing. While the list included a healthy number of low-hanging fruit, the bigger changes overshadowed them and eventually drove a redesign of the system from both technical and experience standpoints.
High Effort Items
Low Effort Items
Research Driven Designs
Combining the cumulative knowledge of the internal stakeholders with what I'd learned from user research led to a few ideas for design consideration. Here's an example of how insights translated into a design idea:
As illustrated by the support cases, finding settings and artifacts was one of the top three problems with the platform. Using card sorting and reverse card sorting exercises, I updated the information architecture to reflect the users' mental models. The hypothesis was that, as a consequence, users would be able to locate efficiency tools with ease.
One of the top-ranking usability issues identified during research was a lack of consistency. To discourage this in the future, a design system was created and put into place for the next iteration of the app. The design system included a Sketch component library, which enabled the design team to iterate quickly.
Some of the core elements of the design system are:
A constant theme across interviews with all the stakeholders was that hiring is complicated. There were several layers to this problem:
Over the course of many months, I began chipping away at the problems, one layer at a time. Each iteration was brought to all stakeholders for feedback. I leaned on high-fidelity designs and prototypes to test with users, and found that this particular user base fared better when the designs were as close to implementation as possible.
I was handed the initial concepts developed by the CEO and a contract UX designer. From these, I experimented with the information density, interactions, and presentation. Most iterations received internal feedback, and every 3-5 iterations I would test the designs with users of the platform to keep myself calibrated to their needs.
The most recent round of feedback, sourced from NPS responses, support cases, and user interviews, pointed to some more opportunities for improvement. Some of these are:
The user base of the CareerPlug ATS came with a variety of jobs to be done, methods of executing them, and several other subjective preferences. In addition, internal stakeholders brought their own opinions on the right and wrong ways of executing workflows in-app. Acknowledging and embracing this diversity, and designing tools that support it, is sometimes more important than standardization. The various iterations of the applicant list have been inching towards a world where the user is in control, but at a slower speed than anticipated.
Feasibility, the most forgotten ingredient.
Contextual inquiries were critical in defining feasibility. The more I interacted with users via interviews and surveys, the more familiar I became with their surroundings, their familiarity with hiring practices, how they used paper forms, and other minuscule details many might consider irrelevant. These details, however, generated high levels of empathy for the user base, along with an appreciation for the legacy platform's success.