
Human-Computer Interaction

User Research Methods


Why This Matters

User research is the foundation of everything you'll encounter in Human-Computer Interaction—without understanding users, you're just guessing. When you're tested on HCI concepts, you're being evaluated on your ability to connect research methods to design outcomes. Can you explain why a designer would choose ethnographic research over a survey? Do you understand the trade-offs between qualitative depth and quantitative breadth? These decisions shape every interface you interact with daily.

The methods in this guide aren't just a checklist to memorize. Each one represents a different lens for understanding human behavior, and knowing when to apply each method is what separates competent designers from great ones. As you study, focus on the underlying logic: What kind of data does each method produce? At what stage of design is it most useful? Don't just memorize the names—know what problem each method solves and how they complement each other.


Exploratory Methods: Understanding the Problem Space

Before you can design solutions, you need to understand the problem deeply. These methods help researchers discover user needs, motivations, and contexts that might not be obvious from the start. Exploratory research prioritizes depth over breadth, generating rich qualitative insights that inform early design decisions.

Interviews

  • One-on-one conversations that generate deep qualitative data through direct interaction with users
  • Open-ended questions allow participants to express thoughts, feelings, and unexpected insights—revealing the "why" behind behaviors
  • Uncover hidden motivations and pain points that quantitative methods miss; essential for early-stage research when you don't yet know what questions to ask

Contextual Inquiry

  • Combines observation with interviews in the user's natural environment—their office, home, or wherever they actually use products
  • Reveals real-world workflows and challenges that users might not think to mention in a lab setting; captures situated behavior
  • Collaborative approach positions the researcher as an apprentice learning from the user, fostering deeper understanding of tacit knowledge

Ethnographic Research

  • Immersive, long-term observation of users in natural settings to understand behaviors, culture, and social dynamics
  • Rich qualitative insights emerge from extended engagement—researchers may spend days or weeks embedded with users
  • Uncovers implicit behaviors and cultural factors that influence product use; especially valuable for designing for unfamiliar user populations

Compare: Contextual Inquiry vs. Ethnographic Research—both observe users in natural settings, but contextual inquiry is shorter and more task-focused, while ethnography involves deeper cultural immersion over extended periods. If an FRQ asks about understanding workplace culture, ethnography is your answer; for understanding specific task workflows, choose contextual inquiry.


Evaluative Methods: Testing What You've Built

Once you have designs—whether prototypes or live products—these methods help you assess how well they work for real users. Evaluative research focuses on identifying problems and measuring performance against user goals.

Usability Testing

  • Direct observation of users performing tasks reveals exactly where interfaces succeed or fail
  • Identifies specific usability issues like confusing navigation, unclear labels, or broken workflows—provides actionable design feedback
  • Think-aloud protocols capture users' mental models in real-time; even 5 users can uncover roughly 85% of usability problems (Nielsen's rule of thumb; see the sketch below)
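The ~85% figure comes from the Nielsen-Landauer model, in which each additional test user is assumed to uncover a fixed proportion L of the remaining problems (commonly estimated at about 0.31). A minimal sketch of that calculation, assuming that published value of L:

```python
# Nielsen-Landauer model: share of usability problems found by n test users,
# assuming each user independently uncovers a fraction L of all problems.
def problems_found(n: int, L: float = 0.31) -> float:
    return 1 - (1 - L) ** n

for n in (1, 3, 5, 10):
    print(f"{n} users -> {problems_found(n):.0%} of problems found")
# 5 users -> roughly 84%, the basis of the common "~85%" rule of thumb
```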

A/B Testing

  • Controlled experiments comparing two or more design variations to determine which performs better with actual users
  • Quantitative performance data on metrics like click-through rates, conversion, and task completion—removes guesswork from design decisions
  • Requires a sufficient sample size for statistical significance (see the sketch below); best for optimizing existing designs rather than exploring new concepts
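To make the "sufficient sample size" point concrete, one common way to judge an A/B result is a two-proportion z-test on conversion rates. The numbers below are invented for illustration; real experiments also need the sample size fixed in advance rather than checked repeatedly mid-test:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical results: conversions out of visitors for each variant.
conv_a, n_a = 120, 2400   # variant A: 5.0% conversion
conv_b, n_b = 156, 2400   # variant B: 6.5% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under the null hypothesis
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))            # two-tailed p-value

print(f"z = {z:.2f}, p = {p_value:.3f}")  # p below 0.05 suggests the difference is unlikely to be chance
```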

Compare: Usability Testing vs. A/B Testing—usability testing tells you why something fails (qualitative), while A/B testing tells you which option performs better (quantitative). Use usability testing to diagnose problems, A/B testing to validate solutions at scale.


Generative Methods: Informing Design Decisions

These methods don't just evaluate—they actively generate insights that shape what you design. Generative research produces artifacts and frameworks that guide the design process itself.

Card Sorting

  • Users organize items into categories that make sense to them, directly informing information architecture and navigation design
  • Open card sorting lets users create their own categories; closed card sorting tests predefined structures
  • Can be conducted remotely using online tools, making it cost-effective for testing with diverse user groups; results are typically summarized as a similarity matrix (see the sketch below)
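Open card sort results are usually analyzed by counting how often each pair of cards lands in the same group across participants, which suggests categories for the navigation. A small sketch of that aggregation step, using invented card names and groupings:

```python
from collections import Counter
from itertools import combinations

# Hypothetical open card sort: each participant's grouping of the same set of cards.
participants = [
    [{"Login", "Sign up"}, {"Orders", "Returns"}, {"Profile"}],
    [{"Login", "Sign up", "Profile"}, {"Orders", "Returns"}],
    [{"Login", "Sign up"}, {"Orders"}, {"Returns", "Profile"}],
]

pair_counts = Counter()
for groups in participants:
    for group in groups:
        for a, b in combinations(sorted(group), 2):  # every pair of cards placed together
            pair_counts[(a, b)] += 1

# Pairs grouped together most often are candidates for the same navigation category.
for pair, count in pair_counts.most_common():
    print(f"{pair}: grouped together by {count} of {len(participants)} participants")
```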

Focus Groups

  • Group discussions (typically 6-10 participants) generate diverse perspectives through participant interaction
  • Social dynamics spark ideas—participants build on each other's comments, revealing attitudes and reactions you might miss in individual interviews
  • Best for exploring concepts and gathering initial reactions; less reliable for predicting actual behavior due to social desirability bias

Compare: Interviews vs. Focus Groups—interviews provide individual depth without social influence, while focus groups generate broader perspectives through group dynamics. Choose interviews for sensitive topics; focus groups for brainstorming and concept exploration.


Synthesis Methods: Making Sense of Data

These methods transform raw research data into actionable design tools. Synthesis artifacts help teams maintain user focus throughout the design process.

Personas

  • Fictional user archetypes based on research data that represent key user segments with distinct goals and behaviors
  • Enable team empathy by giving abstract data a human face—"Would Sarah understand this menu?" is more actionable than "Would users understand this?"
  • Reference throughout design to ensure decisions align with user needs; effective personas include goals, frustrations, and context of use

User Journey Mapping

  • Visualizes the complete user experience across all touchpoints, from initial awareness through task completion
  • Identifies pain points and opportunities by mapping user emotions, actions, and thoughts at each stage
  • Encourages holistic thinking—reveals how individual interactions connect to form the overall experience; essential for service design

Compare: Personas vs. Journey Maps—personas describe who your users are, while journey maps describe what they experience. Use them together: journey maps often track a specific persona's path through your product.


Quantitative Methods: Measuring at Scale

When you need data from large user populations or want to identify statistical patterns, these methods deliver breadth and measurability. Quantitative research sacrifices depth for generalizability and statistical power.

Surveys

  • Reach large audiences efficiently through online distribution, gathering data from hundreds or thousands of users
  • Closed-ended questions (Likert scales, multiple choice) enable easy statistical analysis and comparison across user segments
  • Cost-effective for identifying trends but limited in explaining why users feel a certain way; often paired with qualitative follow-up (see the sample-size sketch below)
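A quick planning check for the "large audiences" point is the margin of error you can expect from a given sample size. This sketch assumes a simple random sample, 95% confidence (z ≈ 1.96), and the worst-case proportion p = 0.5:

```python
from math import sqrt

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate margin of error for a proportion from a simple random sample."""
    return z * sqrt(p * (1 - p) / n)

for n in (100, 400, 1000):
    print(f"n = {n:>4}: margin of error = ±{margin_of_error(n):.1%}")
# Bigger samples help, but with diminishing returns: quadrupling n only halves the margin.
```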

Compare: Surveys vs. Interviews—surveys tell you what users think at scale, interviews tell you why they think it. A well-designed study often uses surveys to identify patterns, then interviews to explore the most interesting findings in depth.


Quick Reference Table

Concept | Best Examples
Early-stage exploration | Interviews, Ethnographic Research, Contextual Inquiry
Testing existing designs | Usability Testing, A/B Testing
Information architecture | Card Sorting, User Journey Mapping
Understanding user attitudes | Focus Groups, Surveys, Interviews
Quantitative validation | Surveys, A/B Testing
Maintaining user focus | Personas, User Journey Mapping
Natural environment research | Contextual Inquiry, Ethnographic Research
Generating design direction | Card Sorting, Focus Groups, Personas

Self-Check Questions

  1. A designer wants to understand why users abandon their shopping cart. Which two methods would provide the deepest insight into user motivations, and why would you choose them over surveys?

  2. Compare and contrast contextual inquiry and usability testing. When in the design process would you use each, and what kind of data does each produce?

  3. Your team has created three different homepage layouts and needs to determine which one leads to more sign-ups. Which method should you use, and what would you need to ensure valid results?

  4. A persona and a user journey map both synthesize research data. Explain how these two artifacts serve different purposes and how they might be used together in a design project.

  5. An FRQ asks you to recommend a research plan for designing a new medical records system for nurses. Which methods would you combine to balance depth of understanding with breadth of validation, and in what order would you conduct them?