
This article is part of my series, “What UX Courses Didn’t Teach Me (But Real Research Did).”
When I began my journey in UX research, I took the typical route of enrolling in courses and earning certificates to build my skills. I completed several impactful programs, including:
- Foundations of UX Design (Coursera)
- Start the UX Design Process (Coursera)
- Conduct UX Research and Test Early Concepts (Coursera)
- UX Research for Agile Teams (LinkedIn Learning)
- Enterprise Design Thinking Practitioner (IBM)
These courses taught me methods and tools like usability testing, prototyping in Figma, journey mapping, and documenting in Confluence, and they left me feeling prepared to take on the challenge.
Education gives you the foundation. Experience teaches you how to survive the messy middle.
However, as the only researcher in the room, I soon faced challenges: adapting quickly, managing skeptical stakeholders, and working with messy data. While formal training offered the “what,” hands-on projects taught me the “how” and the crucial “now what?” when plans fell apart.
This isn’t a critique of bootcamps or certifications but a reflection on the gap between formal training and the essential lessons learned through experience. Let’s explore what I thought I needed versus what I truly required to thrive as a UX researcher!
Skill #1: Writing Research Plans vs. Convincing People to Care About Research
A good research plan seems straightforward: define clear objectives, identify the target audience, create unbiased questions, and propose a method. Training provided templates and examples, but the real challenge was convincing stakeholders that research was valuable.
Many stakeholders prioritized quick project milestones, viewing research as a delay rather than a safeguard against developing unwanted features. A well-crafted plan was ineffective if seen as an obstacle to rapid delivery.
I learned to position research as an accelerator by meeting with stakeholders to discuss the plan and encourage questions and feedback. These sessions were not just about logistics; they highlighted the importance of understanding customer needs to avoid costly rework.
Writing research plans taught me structure; advocating for their value taught me influence.
A perfect research plan can’t help a team racing toward the wrong finish line.
Skill #2: Conducting User Interviews vs. Surviving Curveballs Mid-Session
When I first learned to conduct user interviews, I thought success meant strictly following a script: asking open-ended questions, remaining neutral, and staying on track. However, my initial experiences taught me otherwise.
Participants didn’t behave like the ideal case studies; they often got distracted or veered off-topic. For example, one participant went on a tangent about church tea, while others vented about the closure of Lifeway retail stores, neither of which related to the study.
At first, I struggled to keep them on script, worried about losing control. However, I realized that following their lead revealed valuable insights, including:
- Users were missing small but crucial badge indicators that clarified product formats.
- Participants navigated in unconventional, inefficient ways to find products they loved.
- Customers chose certain studies primarily because they received samples, influencing future purchases.
By allowing for freedom and staying curious, I discovered pain points, workarounds, and moments of delight that a scripted approach would have missed. Good interviews do follow a plan, but they also embrace the unexpected.
Great interviews follow the participant.
A script gets you started. Curiosity gets you the insights that matter.
Skill #3: Running Usability Tests vs. Managing Messy, Conflicting Data
Usability testing may seem straightforward: write clear tasks, set up a script, observe user difficulties, and capture insights. However, in practice, clarity and consistency are often lacking.
Participants frequently misunderstand tasks, giving one-word answers like “fine” or “okay,” even when they seem stuck. They may also contradict themselves by claiming they completed a task while demonstrating otherwise.
Participants often succeed through unexpected navigation routes, skipping “obvious” shortcuts, or stumbling upon solutions instead of following the planned flow.
Defending the data becomes challenging when results don’t match stakeholders’ expectations, so I back up my insights with direct evidence: screenshots, session recordings, and quotes.
Real usability data is messy and unpredictable. Good researchers translate this chaos into actionable clarity.
Real user data isn’t clean—it’s a conversation you have to interpret.
Skill #4: Synthesizing Findings vs. Translating Them into Business Value
When I first started synthesizing research, I focused on identifying patterns:
- Which pain points were most common?
- Which features confused users?
- What ideas surfaced repeatedly in interviews?
Affinity mapping felt like progress, but there was minimal response when I shared my findings. Leadership nodded politely but didn’t act on the insights.
I realized the issue: summarizing user struggles wasn’t enough. I needed to show their impact on the business.
Instead of saying: “Users struggled to complete checkout,”
I needed to communicate: “Improving this checkout flow could boost successful orders by 12% and reduce abandoned carts by 18%.”
Rather than stating: “Users didn’t notice badge format labels,”
I should highlight: “Clarifying badge formats could cut customer support calls about incorrect product formats—saving time and money.”
This shift in perspective helped me connect insights to measurable outcomes like retention, revenue, conversions, efficiency, and support costs. Synthesizing findings taught me to spot patterns, but translating them into business value made decision-makers listen.
Patterns are easy to find. The impact behind the patterns is what gets people to act.
Skill #5: Writing Reports vs. Telling Stories That People Remember
When I started sharing research findings, I believed the data would speak for itself through organized reports and concise summaries. However, I quickly realized that most stakeholders weren’t engaging with them. It became clear that:
People connect with stories, not just information.
My reports lacked emotional appeal and often went unnoticed. Now, I treat reports like products, focusing on real-world use by incorporating:
- Brief user quotes
- Powerful visuals illustrating journeys
- Video clips capturing genuine emotions
Instead of lengthy documentation, I hold live sessions to walk stakeholders through findings, share examples, and invite questions. I now create honest videos showcasing authentic user experiences rather than quick walkthroughs.
Because, in the end, stats tell you what happened.
Stories tell you why it matters.
Data informs. Stories persuade.
Conclusion: The Real UX Research Skills Are Built Along the Way
Courses and certifications gave me a foundation.
They taught me how to write a plan, craft a survey, and structure a usability study.
But the fundamental skills—the ones that made me effective—were built in the moments when things didn’t go according to plan:
- When participants went off-script.
- When the data didn’t match leadership’s assumptions.
- When reports were ignored unless I told a story people remembered.
If you’re early in your UX research career, know this:
- You’re not failing if it feels harder than your training made it sound.
- You’re building the skills that no course can fully teach—the ones that come from real people, constraints, and conversations.
Growth doesn’t mean perfection. It means rerunning the study.
The best researchers aren’t the ones who follow the script perfectly. They’re the ones who stay curious, stay human, and keep showing up for their users.