Ingrain: AI Interview Buddy
Inspired by generative AI's emergence as a technological tool with real-world use cases, I designed a web app that helps job-seekers practice interviewing and receive high-quality feedback from a personalized AI coach.
With Ingrain, users can browse preset interviews for a range of companies and roles, or customize their own interviews to start practicing within minutes. Ingrain serves as an interview buddy, professional coach, and progress tracker that helps people land their dream jobs.
Timeline
2 months (Oct 2023 - Nov 2023)
Before designing any solutions, I set out to validate or refute my hypotheses about the problem. Does my view of the problem align with others' experiences? How do people typically approach interview preparation, and how might I bring value through design?
Interview preparation is perceived as highly important to interview performance.
Answering interview questions is the most common form of interview preparation.
The most popular forms of preparation are answering interview questions (73%) and researching interview questions (70%).
More than half of participants practice interviewing without a partner or coach.
Of those who practiced answering interview questions, more than half did so without a partner or coach.
Interview practice is not a unidimensional problem; it consists of multiple common pain points:
Receiving feedback is a chronic concern during interview practice.
A majority of participants who practiced answering interview questions (alone or with peers) described feedback accuracy or quality as a challenge.
Most practice to improve the quality of their answers rather than the quality of their delivery.
80% of participants described improving the content of their responses as a primary goal of interview preparation.
With a refined understanding of the problem, I brainstormed several viable solutions but settled on Ingrain, an AI-powered interview buddy that unifies 3 value propositions addressing shared concerns from the problem & user discovery phase:
Offering 1: Flexible practice, anytime
Offering 2: Accurate, personalized feedback
Offering 3: Track & measure progress
With these offerings in mind, I constructed 2 relevant user personas for Ingrain. The first persona was validated through surveys and interviews, while the second is a hypothetical persona that faces similar challenges to the first.
Zach M.
Aspiring Technology Salesperson
- Zach is a recent university graduate
- Has mentors, but none working in tech sales
- Interview preparation is important to Zach
- Occasionally practices interviewing with classmates, but finds doing so unreliable
Olivia S.
Aspiring Software Engineer
- Olivia is 38 and a mother of two children
- Former executive assistant transitioning to software
- Just finished a 6-month coding bootcamp
- Doesn’t have time to build relationships with mentors
- Needs to find work as soon as possible
Before thinking about granular aspects of the solution like visual design and features, I conducted a brief competitive analysis of existing solutions to similar problems. The focus of this independent study was to zoom out and identify how an AI interview product like Ingrain should be positioned among its competitors. I asked myself: how can Ingrain's features be designed to align with its intended product positioning?
Convenience as a competitive edge
AI tools like Ingrain bridge the effort gap by enabling practice with very little setup.
Automation drives user retention
"Nice-to-haves" like progress tracking become product "must-haves" when automated.
AI-powered interviews have a place in the market because users may not have the time or network to practice with others. To address that pain point, products like Yoodli and Interview School are neatly positioned as low-effort, low time-to-value tools that still yield realistic advice for users. These products strategically keep people engaged by automating other high-effort tasks like practice logs and performance history, which are often skipped during individual and partnered practice alike.
Moving forward, I conducted a usability study to test how 5 participants interact with AI-powered interviewing tools similar to Ingrain: Yoodli and Interview School. Through testing, my goal was to catalog Ingrain's non-negotiable core features, explore how users interact with Ingrain's competitors, and answer "how might we's" based upon real experiences during the usability test.
Using positive and negative behavioral trends from 5 participants, I mapped a high-level user flow and shortlisted Ingrain's various features and capabilities, tying each list item to a specific observation from either user discovery or testing. Depending on the frequency or severity of each observation, I also prioritized specific features and interfaces for production.
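To make that prioritization concrete, here is a minimal sketch of the frequency-times-severity heuristic described above; the `Observation` shape and the exact weighting are illustrative assumptions, not the precise rubric I used:

```typescript
// A minimal sketch of the frequency x severity heuristic; the Observation
// shape and weighting are illustrative assumptions, not the exact rubric used.

interface Observation {
  feature: string;   // feature or interface the observation points to
  frequency: number; // how many of the 5 participants hit this issue (0-5)
  severity: number;  // 1 (cosmetic) to 4 (blocks the task)
}

// Rank candidate features so the most frequent, most severe issues are
// produced first.
function prioritize(observations: Observation[]): Observation[] {
  return [...observations].sort(
    (a, b) => b.frequency * b.severity - a.frequency * a.severity
  );
}
```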
Finally, I sketched Ingrain's core interfaces and used them to create an interactive mid-fidelity mockup that went straight to testing. Since the UI was fairly limited at this point, I first had 5 people explore the product while acting under a hypothetical persona: an aspiring entry-level consultant seeking to improve performance in behavioral interviews. In this initial phase of testing, I discovered major oversights with my design and made adjustments to address the most critical concerns.
Then, I conducted a second round of testing with another 5 participants, where my main focus was on measuring behavioral trends that could be addressed through design.
Phase 1
Rapid sketching
Experiment with various flows, components, layouts
Phase 2
Test prototype
Consolidate metrics for success and find oversights
Phase 3
Update & retest
Adjust to outcomes and validate improvements
Phase 4
Test in high fidelity
Validate again with greater freedom and interactivity
Once I iterated again based on measured outcomes, I ran a 3rd and final round of testing – this time with a high-fidelity mockup and no hypothetical persona. Here, users explored the UI as if they were preparing for their most recent cycle of job interviews. Outcomes from this final round of testing with near-complete interactive freedom were crucial in finalizing Ingrain's design.
Many changes were made during this multi-phase design process, and design decisions were justified using behavioral trends and feedback gathered during the 3 user testing sessions. Although I treated the frequency or severity of feedback as an indicator of Ingrain's usability, each round of testing only included 5 participants, so I couldn't achieve saturation on many feedback areas. Therefore, the most important metrics of success were the ones I could measure across all participants in all testing phases. Some key measured outcomes are shared below:
Problem: Users are forced to sort & filter through hundreds of preset interviews, leading to excess time between seeking practice and starting an interview.
Solution: User preferences are collected during onboarding and used to make recommendations and default sort & filter selections.
Problem: Users spend time identifying a desired interview even when it is visible in the UI, since interview details are conveyed purely through text.
Solution: Reorganized information hierarchy within selectable interviews: users identify desirable interviews by logo and position title first.
Average time to start a desired interview (when visible after sort/filter): 24 seconds
Average time to start a desired interview when visible immediately: 5 seconds (79% improvement)
I improved the information hierarchy and several visual design elements to reduce the time between seeking practice and starting practice. My assumptions about Ingrain's interview recommendation and sort/filter algorithms are a key limitation in this phase: in the "before" iteration, we assume the desired interview becomes visible after one full sort/filter interaction, while in the "after" iteration, we assume the desired interview is listed immediately as a result of onboarding. Looking back, I should have hedged this assumption by also measuring the average time to start a desired interview when visible immediately during the "before" phase.
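For illustration, here is a minimal sketch of how onboarding preferences could drive recommendations and default sort & filter selections; the `UserPreferences` and `PresetInterview` shapes and the scoring weights are hypothetical assumptions, not Ingrain's actual implementation:

```typescript
// Hypothetical sketch: UserPreferences, PresetInterview, and the weights
// below are illustrative, not Ingrain's actual implementation.

interface UserPreferences {
  targetRoles: string[];      // e.g. ["Software Engineer"]
  targetCompanies: string[];  // collected during onboarding
  interviewTypes: string[];   // e.g. ["behavioral", "technical"]
}

interface PresetInterview {
  company: string;
  role: string;
  type: string;
}

// Score each preset against onboarding preferences; higher = better match.
function matchScore(p: PresetInterview, prefs: UserPreferences): number {
  let score = 0;
  if (prefs.targetRoles.includes(p.role)) score += 2; // role match weighs most
  if (prefs.targetCompanies.includes(p.company)) score += 1;
  if (prefs.interviewTypes.includes(p.type)) score += 1;
  return score;
}

// Default listing: best matches first, so the desired interview is usually
// visible without any manual sort/filter interaction.
function rankInterviews(
  presets: PresetInterview[],
  prefs: UserPreferences
): PresetInterview[] {
  return [...presets].sort((a, b) => matchScore(b, prefs) - matchScore(a, prefs));
}
```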
Problem: Users decide whether to engage with a question only upon encountering it, leading to excess time engaging with misfit questions.
Solution: A spoiler accordion allows questions to be viewed and skipped before the interview begins, creating a smoother interview experience.
Problem: Users make needless interactions with the recording interface (e.g. turning off the microphone) because they aren't sure how the interface works.
Solution: A step-by-step interview recording walkthrough for new users; button status overlays revealed on hover.
Average time engaging with skipped questions: 26 seconds (~16 seconds per question across ~1.6 skipped questions per interview)
Average time engaging with skipped questions using the spoiler accordion: 15 seconds (42% improvement)
I added UI components to help users spend less time engaging non-productively (e.g. spending time during an interview deciding whether a question is relevant or should be skipped). One key goal was to prevent interruptions by allowing users to view and skip questions before the interview begins. This solution also curtails engagement with interviews that contain too many irrelevant questions altogether.
One limiting factor is the rate of "misfit questions" produced per interview. For example, if Ingrain's question generation algorithm were improved to reduce the frequency of misfit questions by 50%, users would spend roughly half as much time engaging non-productively. However, my hypothesis is that the benefit of preventing interruptions still outweighs the time cost of interacting with the question spoilers up front.
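As a sketch of the spoiler accordion's skip logic, assuming hypothetical `QuestionItem`, `toggleSkip`, and `buildSession` names rather than Ingrain's real code:

```typescript
// A minimal sketch of the spoiler accordion's skip flow; QuestionItem,
// toggleSkip, and buildSession are hypothetical names, not Ingrain's real code.

interface QuestionItem {
  id: string;
  prompt: string;
  skipped: boolean; // toggled from the spoiler accordion before recording starts
}

// Previewing the accordion lets users mark misfit questions up front...
function toggleSkip(questions: QuestionItem[], id: string): QuestionItem[] {
  return questions.map(q => (q.id === id ? { ...q, skipped: !q.skipped } : q));
}

// ...so the recorded session contains only questions the user opted into,
// preventing mid-interview interruptions.
function buildSession(questions: QuestionItem[]): QuestionItem[] {
  return questions.filter(q => !q.skipped);
}
```

The design choice here is that all skip decisions are resolved before recording begins, which is what moves the decision time out of the interview itself.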
Problem: Feedback recall is extremely low, particularly for feedback not displayed in the overview.
Solution: One-sentence summaries added to detailed feedback; quantitative metrics displayed in context.
Problem: Users sometimes ignore key parts of the feedback UI (e.g. skipping delivery feedback entirely), contributing to poor retention overall.
Solution: Step-by-step tutorial for new users to help navigate the information-dense UI; overview feedback contains links to specific questions.
Average recall of overview feedback was 75%; average recall of non-overview feedback was 13%.
After the redesign, average recall of overview feedback was 80% and non-overview recall was 40% (3x improvement).
In the feedback section, I reorganized information in a way that made navigating feedback more intuitive to users, which led to an upward trend in feedback retention. Users were asked to recall feedback immediately after exiting the feedback interface, and information recall was measured separately for feedback that was displayed or not displayed in the overview section.
One limitation of this test is that the feedback used during testing was not unique to each user, which may have led users to spend less time reading and retaining feedback than if it had been unique to their specific interview.
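For clarity on how recall could be tallied per participant, here is a minimal sketch; the `FeedbackItem` shape and `recallRate` helper are illustrative assumptions rather than the study's actual tooling:

```typescript
// A minimal sketch of per-participant recall tallying; the FeedbackItem
// shape and recallRate helper are illustrative assumptions.

interface FeedbackItem {
  id: string;
  inOverview: boolean; // shown in the overview section vs. only in detail views
}

// Recall rate = items correctly recalled / items shown, computed separately
// for overview and non-overview feedback.
function recallRate(
  items: FeedbackItem[],
  recalledIds: Set<string>,
  inOverview: boolean
): number {
  const shown = items.filter(i => i.inOverview === inOverview);
  if (shown.length === 0) return 0;
  return shown.filter(i => recalledIds.has(i.id)).length / shown.length;
}
```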
My focus with this project was to design Ingrain's core features and user flow through an iterative, evidence-driven approach. However, Ingrain's testing and buildout were somewhat restricted because the product's backend does not exist (yet) and my timeline for the project was limited to 2 months. Given more time and resources to work on Ingrain, I would continue designing for 3 primary concerns:
Minimize effort to find relevant interviews
How can I refine interview selection to guide users to desired interviews?
Untested: Faceted search, user ratings, hierarchical clustering
Make reviewing feedback delightful
How can feedback be more digestible and interesting to engage with?
Untested: Additional interactivity, data visualization, comparisons
Make users want to return and practice
How can I retain users and capitalize on monetization opportunities?
Untested: Challenges, achievements, leaderboards, etc.
Lastly, I catalogued the important challenges with Ingrain's design that led me to reflect on my own design process and identify how to improve for future projects.
Challenge #1: Limited research sample
I made inferences and design choices based on a subset of users that was biased towards individuals in the early stages of a technology or business-adjacent career.
Moving forward:
Allocate more time and resources towards diversifying research participants to strengthen confidence in user research and make better-informed design decisions.
Challenge #2: Premature visual design
I spent unnecessary time making small visual adjustments between prototyping and iterative testing phases, much of which had no impact on test outcomes or final design.
Moving forward:
Continue sketching beyond the initial low-fi mockup phase for high-yield visual adjustments, and focus only on priorities until precision becomes necessary in a high-fidelity design.
Challenge #3: Excessive investigative scope
The scope of my research was too wide, so I gathered a lot of data that was "nice-to-know" but not actionable, which made it difficult to separate the signal from the noise.
Moving forward:
Rigorously audit underlying goals of investigative questions, and consider how to combine research methods in a way that is mutually exclusive and collectively exhaustive.