
B2B Platform Design

Bringing a legacy platform to modern technologies and experience.
Company
CareerPlug
Role
Lead UX Designer, Research, Product Management, User Interface Design
Duration
2018-2021

At CareerPlug, updating an existing platform to modern technologies and user experience became a necessity after many years of unstructured growth. As a B2B agency, CareerPlug caters to many needs; its primary focus is on small and medium businesses with a decentralized hiring system. As a new designer, I sought to develop an understanding of the ideal client profile and how its needs overlapped with business goals.

What is “decentralized hiring”?

To put it concisely, decentralized hiring is when a person hires using processes and qualifiers unique to them and their business needs. SMBs commonly hire on their own terms, follow their own processes, and typically don’t have much experience with hiring or its associated legalities.

In the context of franchising, decentralized hiring refers to the hiring practices of a franchisee. A person buys a “business in a box” as part of a franchise network, then independently runs the business while paying the franchisor royalties. The franchisor provides basic artifacts like job descriptions and company forms, but the bulk of hiring and team development is left to the franchisee.

Identifying Opportunities

From a technical point of view, the outdated tech stack was common knowledge. To establish the experience point of view, I conducted research in the following areas:

  • The Legacy Platform: In this audit, I aimed to identify the key workflows that related to the jobs to be done by the user base. I also searched for interface and experience inconsistencies to address as research went on.
  • Support Cases: Paying attention to high-volume case categories, running keyword searches, and reading individual cases started to form a story that could be related back to experience and interface issues.
  • Internal Stakeholder Interviews: I interviewed the support, sales, and client success teams to understand their day-to-day interactions with clients. Senior leaders served as the source for business goals and for creating alignment between user and business needs.
  • User Interviews: The most critical ingredient in this “recipe”, qualitative interviews were conducted to understand the jobs to be done, their priority for the user, and what the users’ hiring practices were. These interviews helped me empathize with the end users as well as the decision makers.

Primary Research and Findings

Research Methods

  • Semi-structured user interviews (contextual inquiries), walkthroughs
  • Surveys, case analysis, NPS review
  • Card sorting, reverse card sorting, task analysis, hi-fi prototypes

Relevant Findings

As a design team of one, I collaborated with internal stakeholders as often as possible. With help from various team members, I was able to gather data that informed the first redesign of the platform. I have consolidated my learnings from each source in the following sections.

The Legacy Platform

In the process of learning the existing application, I began identifying the key workflows that related to the real-world jobs to be done by the application’s users. Here are some of the major tasks I identified:

  • Creating a job
  • Developing an online presence and “brand”
  • Assessing applicants’ fit for a job
  • Assorted administrative tasks such as maintaining digital versions of paper forms, creating templates, etc.

As the audit progressed, I began discovering usability issues. Some of these were:

  • System status communications: The app was incredibly deficient in informing users of the consequences of their actions as well as seeking confirmation for critical actions.
  • Match between system and real world: For a user that’s unfamiliar with hiring practices, the system used terms that required explanation at every stage. Examples of confusing terminology:

    “Deactivate”: Used in app as a replacement for “rejecting” an applicant.

    User Roles: The user base was used to hearing and using job titles like “hiring manager”, “recruiter”, etc. when defining responsibilities. The app had its own set of definitions that a user would need to learn, and then jury-rig to fit their real-world roles and permissions. The app’s user roles are not configurable, i.e. each of the four roles comes with preset privileges.
  • Consistency and standards: UI elements like modals, banners, etc. were used inconsistently, affecting the learnability of the app. The language also varied across different parts of the app.
  • Flexibility and efficiency of use: The administrative side of the app was accessible to only certain roles. By limiting access to account setup controls to admins, the app enabled consistency, but hid features that would improve productivity for other roles.

The usability issues, in particular the efficiency-of-use aspect, also highlighted the lack of communication between data models. For example, an applicant’s social security number would not populate automatically in other places, forcing the user to input the same information multiple times.
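
The paragraph above describes a normalization gap; here is a minimal sketch of the idea in TypeScript, with hypothetical names rather than CareerPlug's actual data model: forms that copy applicant fields force repeated data entry, while forms that reference a shared applicant record pick those fields up automatically.

```typescript
// Hypothetical illustration only; not CareerPlug's actual schema.

// Legacy-style shape: each form keeps its own copy of applicant fields,
// so the same SSN has to be typed again on every form.
interface LegacyTaxForm {
  applicantName: string;
  applicantSsn: string; // duplicated per form
  formType: string;
}

// Normalized shape: forms reference a single applicant record, so identity
// fields entered once can populate everywhere they are displayed.
interface Applicant {
  id: string;
  name: string;
  ssn: string;
}

interface TaxForm {
  applicantId: string; // a reference, not a copy
  formType: string;
}

// Rendering a form looks the SSN up from the shared record instead of
// asking the user to enter it a second time.
function renderSsn(form: TaxForm, applicants: Map<string, Applicant>): string {
  return applicants.get(form.applicantId)?.ssn ?? "";
}
```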

Support Cases

I had two goals when I began reviewing support cases:

  • To get into the mindset of our users
  • To identify opportunities based on mismatch with end user expectations

The support team employed a categorization system that related to the different areas of the app. A cursory glance at these categories pointed to the areas with the highest volume. From there, it was as simple as looking at individual cases to gather qualitative data and using it to inform improvements to the app.

Multiple opportunities presented themselves, but the overall trend indicated a problem with the information architecture of the web app. In analyzing the incoming questions and the responses, assistance with finding artifacts, settings, applicants, etc. was easily the largest driver of incoming communication. A subset of these cases dealt with understanding terminology. To address this, I conducted card sorting exercises to establish a sitemap that reflected the users’ mental models. The result was a restructured navigation system that provided definitions of different areas, decluttered synonymous options, and grouped the app’s capabilities the way users organized them, e.g. hiring tools, my organization, etc.

The second trend was about applicant flow. Users had a difficult time understanding that the web app wasn’t a pool of applicants, but a way to manage applicants from other sources. This issue saw solutions come from other teams at the company: marketing, customer success, and sales. In-app efforts included surfacing “source performance”, a bar chart that showed the click-to-application conversion ratio of the various applicant sources.
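
As a rough sketch of the metric behind that chart, assuming conversion is simply applications divided by clicks per source (the data shape and names below are hypothetical):

```typescript
// Hypothetical per-source traffic data.
interface SourceStats {
  source: string;       // e.g. a job board or a careers page
  clicks: number;       // visits attributed to the source
  applications: number; // completed applications from those visits
}

// Click-to-application conversion ratio per applicant source,
// the figure surfaced in the "source performance" bar chart.
function conversionBySource(stats: SourceStats[]): Record<string, number> {
  const result: Record<string, number> = {};
  for (const s of stats) {
    result[s.source] = s.clicks > 0 ? s.applications / s.clicks : 0;
  }
  return result;
}

// Example: 12 applications from 300 clicks -> 0.04, i.e. a 4% conversion ratio.
conversionBySource([{ source: "Job Board A", clicks: 300, applications: 12 }]);
```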

The third and most frequent point of frustration validated a finding from the heuristic evaluation: users were seeking efficiency tools but (a) could not find them, and (b) did not have access to them. This was compounded by complicated, unwieldy workflows that failed to communicate system status or the consequences of a user’s actions. The learning curve attached to these, in addition to the lack of intuitiveness, led to high levels of frustration.

Internal Stakeholders

In conjunction with developing an understanding of the user base, I began interviewing internal stakeholders to develop an appreciation for their pain points. I focused on the sales, customer support, and customer success teams. These three teams brought perspectives that reflect different aspects of a user’s journey with CareerPlug:

  • Sales: Which features excite potential clients, what “seals the deal”, what aspects were deal breakers, what were some wishlist items for acquired clients, etc.
  • Customer Success: How are clients measuring success, what are their overall goals, how are clients adjusting their hiring practices, what advice was the team giving, which workarounds were being used, etc.
  • Customer Support: What were current users struggling with, where was the app not matching their expectations, which features/functionality were more effort to sustain than the value they provided, which medium (blogs, videos, articles, etc.) were most effective in communicating best practices/solutions, general demographics, etc.

User Interviews

Using qualitative techniques like interviews, observing app use, and walkthroughs, I sought to solidify the foundation I’d built in my previous research, while maintaining high levels of empathy towards the user.

In interviews, we would touch upon not just how the application was being used, but also which real-world workflows users were trying to replicate in the app. This was important for me to learn so I could successfully mimic their real-world processes inside the app. Walkthroughs of the existing app helped shed light on the workarounds in play, along with layout failures and in-app language improvement opportunities, and generally gave deeper insight into the users’ expectations.

Corroborating data from surveys, internal interviews, and user interviews led to the formation of personas. Please contact me to get the password for the whiteboard detailing the persona generation process and outcomes.

Secondary Research

Competitive Analysis

A competitive analysis was conducted to understand how direct and indirect competitors were appealing to the same set of user needs. This furthered my understanding of the user base by giving a different perspective on how mental models, jobs to be done, etc. were being approached. From a strategic standpoint, learning about positioning techniques helped inform some go-to-market plans further down the line.

View Full Analysis

Design

Actionable Opportunities

After a period of research, a list of actionable opportunities started to accrue. While the list included a healthy number of low-hanging fruit, the bigger changes overshadowed them and eventually drove a redesign of the system from both technical and experience standpoints.

Research Driven Designs

Combining the cumulative knowledge of internal stakeholders with what I'd learned from user research led to a few ideas for design consideration. Here's an example of how insights translated into a design idea:

Information Architecture

As illustrated by the support cases, finding settings and artifacts was one of the top three problems of the platform. Using card sorting and reverse card sorting exercises, I updated the information architecture to reflect the users' mental models. The hypothesis was that, as a consequence, users would be able to locate efficiency tools with ease.

[Wireframe: Legacy IA]
[Wireframe: Updated IA]

Design System

One of the top usability issues identified during research was a lack of consistency. To discourage such a situation in the future, a design system was created and put into place for the next iteration of the app. The design system included a Sketch component library, which enabled the design team to iterate quickly.

Some of the core elements of the design system are:

  • Scalability: The design system went through several changes to incorporate patterns and layouts that had app-wide use
  • Default, active, disabled states for UI elements
  • Do's and don'ts detailed in usage guidelines
  • Dark, light, default UI configurations
  • Responsive/adaptive documentation
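
The component library itself lived in Sketch, but as a rough sketch of how the states and UI configurations listed above might be expressed in code (the token names and values below are hypothetical, not the production implementation):

```typescript
// Hypothetical design tokens; the actual library was maintained in Sketch.
type ControlState = "default" | "active" | "disabled";
type Theme = "default" | "light" | "dark";

interface ButtonTokens {
  background: string;
  text: string;
}

// One entry per documented state and theme, mirroring the
// default/active/disabled and dark/light/default guidance above.
const buttonTokens: Record<Theme, Record<ControlState, ButtonTokens>> = {
  default: {
    default:  { background: "#2f6fed", text: "#ffffff" },
    active:   { background: "#1e4fb8", text: "#ffffff" },
    disabled: { background: "#c5d4f7", text: "#f7f7f7" },
  },
  light: {
    default:  { background: "#3b82f6", text: "#ffffff" },
    active:   { background: "#2563eb", text: "#ffffff" },
    disabled: { background: "#dbeafe", text: "#f8fafc" },
  },
  dark: {
    default:  { background: "#60a5fa", text: "#0b1120" },
    active:   { background: "#3b82f6", text: "#0b1120" },
    disabled: { background: "#1e293b", text: "#475569" },
  },
};

// Usage: a component looks up its tokens by theme and state.
const disabledDarkButton = buttonTokens.dark.disabled;
```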

Explore The Design System

Simplifying Hiring

A constant theme across interviews with all the stakeholders was that hiring was complicated. There were several layers to this problem:

  • From a franchisees' perspective, the hiring process was a mystery and it was difficult to use, right from setting it up, all the way until applicants were put through the process.
  • After set up, understanding what was needed to be done next was difficult.
  • From a franchisors' perspective, gaining an understanding of how the franchisees were using the system, which tactics were successful and which were failing, and monitoring the general health of accounts was difficult.

Over the course of many months, I began chipping away at the problems, one layer at a time. Each iteration was brought to all stakeholders for feedback. I leaned on high-fidelity designs and prototypes to test with users. I found that this particular user base fared better when the designs were as close to implementation as possible.

I was handed the initial concepts developed by the CEO and a contract UX designer. From these, I experimented with information density, interactions, and presentation. Most iterations received internal feedback. Every three to five iterations, I would test the designs with users of the platform to keep myself calibrated to their needs.

[Wireframe: Iteration 01]
[Wireframe: Iteration 02]
[Wireframe: Iteration 03]
[Wireframe: Current Iteration]

Additional Recommendations Based on Feedback

The most recent round of feedback, sourced from NPS responses, support cases, and user interviews, pointed to further opportunities for improvement. Some of these are:

  • Add more clarity around the next steps to take in a hiring process
  • Make tools like templates, webpages, etc. more accessible to non-admin users
  • Add inline error states for input fields so a user isn't surprised when they complete a form by clicking "submit" (a minimal sketch follows this list)
  • Test terminology with users extensively to match their way of thinking
  • Address and respect the users' desire to use varying data to evaluate an applicant by providing more customization options on the list of applicants, the applicant profile, and the hiring process qualifiers/disqualifiers
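
As a minimal sketch of the inline error-state recommendation, assuming a simple required-field check that runs on blur rather than on submit (the field names and messages below are hypothetical):

```typescript
// Hypothetical inline validation: surface the error as soon as the user
// leaves the field, instead of waiting for the final "submit" click.
interface FieldValidation {
  valid: boolean;
  message?: string;
}

function validateRequired(value: string, label: string): FieldValidation {
  return value.trim().length > 0
    ? { valid: true }
    : { valid: false, message: `${label} is required.` };
}

const emailInput = document.querySelector<HTMLInputElement>("#applicant-email");
const emailError = document.querySelector<HTMLElement>("#applicant-email-error");

if (emailInput && emailError) {
  // Validate on blur so feedback appears while the user is still in the form.
  emailInput.addEventListener("blur", () => {
    const result = validateRequired(emailInput.value, "Email");
    emailError.textContent = result.valid ? "" : result.message ?? "";
  });
}
```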

Reflections

Embrace diversity.

The user base of the CareerPlug ATS came with a variety of jobs to be done, methods of executing them, and several other subjective preferences. In addition, internal stakeholders brought their own opinions of the right and wrong ways of executing workflows in-app. Acknowledging and embracing this diversity, and designing tools that support it, is sometimes more important than standardization. The various iterations of the list of applicants have been inching towards a world where the user is in control, but at a slower speed than anticipated.

Feasibility, the most forgotten ingredient.

Contextual inquiries were critical in defining feasibility. The more I interacted with users via interviews and surveys, the more familiar I became with their surroundings, their familiarity with hiring practices, how they used paper forms, and other minuscule details many might consider irrelevant. These details, however, generated high levels of empathy for the user base, along with an appreciation for the legacy platform's success.