
🎨Design Strategy and Software

User Research Techniques


Why This Matters

User research isn't just a checkbox in the design process—it's the foundation that separates products users love from products users abandon. You're being tested on your ability to select the right research method for specific design challenges, understand when qualitative insights trump quantitative data (and vice versa), and recognize how different techniques reveal different layers of user behavior. The core concepts here—generative vs. evaluative research, attitudinal vs. behavioral data, and the discovery-validation cycle—show up repeatedly in exam questions and real-world design decisions.

Think of these techniques as tools in a toolkit: a hammer is great for nails but terrible for screws. Similarly, running a survey when you need deep emotional insights will give you shallow, misleading data. The exam will test whether you know which tool to reach for and why. Don't just memorize what each technique does—know what type of data it produces, when in the design process it's most valuable, and how it compares to alternative methods.


Generative Research Methods

These techniques help you discover what to build in the first place. Generative research explores the problem space before solutions exist, uncovering user needs, motivations, and contexts you didn't know to ask about.

Interviews

  • One-on-one qualitative conversations that reveal the "why" behind user behavior—ideal for exploring motivations, frustrations, and unmet needs early in the design process
  • Open-ended questioning allows participants to surface insights you wouldn't have thought to ask about, making interviews essential for discovery-phase research
  • Best for depth over breadth—you'll get rich stories from 8-12 participants rather than statistical significance from hundreds

Contextual Inquiry

  • Field research combining observation and interviews in the user's actual environment—their office, home, or wherever they naturally use products
  • Reveals workflow realities that users can't articulate in a conference room, including workarounds, environmental constraints, and social dynamics
  • Captures the gap between stated and actual behavior—what users say they do versus what they really do when you're watching

Ethnographic Research

  • Immersive, long-term observation where researchers embed themselves in users' lives to understand cultural and social contexts shaping behavior
  • Uncovers deep motivations and systemic patterns that shorter research methods miss—particularly valuable for products entering unfamiliar markets or cultures
  • Time-intensive but high-impact—typically reserved for complex problem spaces where surface-level research has failed to explain user behavior

Compare: Contextual Inquiry vs. Ethnographic Research—both observe users in natural settings, but contextual inquiry focuses on specific tasks during shorter sessions while ethnography requires extended immersion to capture cultural patterns. If an FRQ asks about understanding cultural influences on product adoption, ethnography is your answer.


Evaluative Research Methods

Once you have a design direction, these techniques validate whether your solution actually works. Evaluative research tests specific designs against user expectations and abilities.

Usability Testing

  • Task-based observation where users attempt realistic goals while researchers identify friction points, errors, and confusion in real-time
  • Behavioral data over opinions—watch what users do, not just what they say, to find interface problems before launch
  • Moderated or unmoderated formats offer flexibility; moderated sessions allow follow-up questions while unmoderated scales to larger sample sizes

A/B Testing

  • Controlled experiments comparing design variations with live users to determine which version drives better outcomes (clicks, conversions, completion rates)
  • Quantitative validation that removes opinion from design debates—let user behavior data decide between competing approaches
  • Requires sufficient traffic to reach statistical significance; best for optimizing existing products rather than validating new concepts
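The "statistical significance" requirement in the last bullet can be sketched with a standard two-proportion z-test. This is a minimal illustration, not part of the guide itself: the conversion counts are hypothetical and `ab_significance` is an invented helper name.

```python
from math import sqrt
from statistics import NormalDist

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))       # two-tailed
    return z, p_value

# Hypothetical checkout-button experiment: 480/10,000 vs. 560/10,000 conversions
z, p = ab_significance(480, 10_000, 560, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05, so B's lift is significant at the 5% level
```

Note the design point from the bullet above: with smaller samples (say, 48 vs. 56 conversions out of 1,000 each), the same 0.8-point lift would not reach significance, which is why A/B testing needs traffic.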

Compare: Usability Testing vs. A/B Testing—usability testing tells you why users struggle (qualitative), while A/B testing tells you which design performs better (quantitative). Use usability testing to diagnose problems, A/B testing to validate solutions at scale.


Attitudinal Research Methods

These techniques capture what users think, feel, and believe—their perceptions, preferences, and emotional responses. Attitudinal data explains the reasoning behind behavioral patterns.

Surveys

  • Scalable data collection from large populations using structured questions—ideal for identifying trends, measuring satisfaction, and validating assumptions across user segments
  • Closed-ended questions yield quantitative metrics (NPS scores, Likert scales) while open-ended questions capture qualitative themes
  • Cost-effective but limited depth—surveys tell you what users think but rarely why they think it
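As a sketch of how one of the closed-ended metrics above is computed, here is the standard Net Promoter Score formula applied to hypothetical 0-10 survey responses (`nps` is an illustrative helper, not a tool named in this guide):

```python
def nps(scores):
    """Net Promoter Score from 0-10 responses:
    percent promoters (9-10) minus percent detractors (0-6).
    Passives (7-8) count toward the total but neither bucket."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical survey batch: 4 promoters, 3 passives, 3 detractors
responses = [10, 9, 9, 8, 7, 7, 6, 5, 3, 10]
print(nps(responses))  # (4 - 3) / 10 * 100 = 10
```

The score ranges from -100 (all detractors) to +100 (all promoters), which is exactly the "what, not why" limitation the bullet describes: a single number that quantifies attitude without explaining it.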

Focus Groups

  • Facilitated group discussions with 6-10 participants exploring reactions to concepts, designs, or brand positioning
  • Group dynamics generate ideas through participant interaction—one person's comment sparks another's insight, revealing shared mental models
  • Risk of groupthink and dominant voices—skilled moderation is essential to ensure all perspectives emerge

Compare: Surveys vs. Focus Groups—surveys reach hundreds of users with standardized questions, while focus groups dive deep with small groups through dynamic conversation. Surveys quantify attitudes; focus groups explore the nuances behind them.


Synthesis and Modeling Methods

These techniques transform raw research data into actionable frameworks. Synthesis methods organize findings into tools that guide design decisions throughout the project.

Personas

  • Fictional user archetypes built from research data that represent key audience segments—including goals, behaviors, pain points, and contexts
  • Alignment tool for teams—personas keep everyone designing for the same user rather than their own assumptions or edge cases
  • Must be research-based to be valid; personas built from assumptions rather than data perpetuate bias rather than eliminating it

User Journey Mapping

  • Visual timeline of user experience across all touchpoints, documenting actions, thoughts, emotions, and pain points at each stage
  • Identifies opportunity gaps where user needs aren't being met and moments of friction that drive abandonment
  • Cross-functional communication tool—helps engineering, marketing, and support teams understand the complete user experience beyond their individual touchpoints

Card Sorting

  • Participatory design technique where users organize content or features into categories that make sense to them
  • Reveals mental models for information architecture—how users expect to find things versus how designers organized them
  • Open vs. closed formats: open sorting lets users create their own categories; closed sorting tests whether predefined categories match user expectations
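One common way to summarize open card-sort results is a pair co-occurrence count: how often two cards land in the same participant-created category. The sketch below uses hypothetical sort data and an invented `co_occurrence` helper to show the idea.

```python
from itertools import combinations
from collections import Counter

def co_occurrence(sorts):
    """Count how often each pair of cards is grouped together
    across participants' open card sorts. Higher counts suggest
    users share a mental model placing those cards together."""
    counts = Counter()
    for groups in sorts:                      # one dict per participant
        for cards in groups.values():         # category name -> list of cards
            for pair in combinations(sorted(cards), 2):
                counts[pair] += 1
    return counts

# Hypothetical sorts from two participants (category names are their own)
sorts = [
    {"Account": ["Login", "Profile"], "Shop": ["Cart", "Checkout"]},
    {"Me": ["Login", "Profile", "Cart"], "Buying": ["Checkout"]},
]
pairs = co_occurrence(sorts)
print(pairs[("Login", "Profile")])  # 2 -- grouped together by both participants
print(pairs[("Cart", "Checkout")])  # 1 -- the second participant split them
```

In practice this matrix feeds a dendrogram or similarity heatmap, which is how researchers turn individual sorts into an information architecture recommendation.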

Compare: Personas vs. Journey Maps—personas describe who your users are as people, while journey maps describe what they experience over time. Use personas to build empathy, journey maps to identify intervention points.


Quick Reference Table

Concept | Best Examples
Discovery/Generative Research | Interviews, Contextual Inquiry, Ethnographic Research
Evaluative/Validation Research | Usability Testing, A/B Testing
Qualitative Data Collection | Interviews, Contextual Inquiry, Focus Groups
Quantitative Data Collection | Surveys, A/B Testing
Attitudinal Insights | Surveys, Focus Groups, Interviews
Behavioral Insights | Usability Testing, A/B Testing, Contextual Inquiry
Research Synthesis Tools | Personas, User Journey Mapping
Information Architecture | Card Sorting

Self-Check Questions

  1. You're designing a completely new product category and need to understand potential users' daily challenges before generating any solutions. Which two techniques would be most appropriate, and why are surveys insufficient at this stage?

  2. Compare and contrast usability testing and A/B testing: What type of data does each produce, and at what stage of the design process is each most valuable?

  3. A stakeholder insists that focus groups will tell you everything you need to know about user behavior. What's the flaw in this reasoning, and which technique would you recommend adding to capture behavioral data?

  4. Your team created personas based on internal assumptions rather than research data. Explain why this undermines the purpose of personas and what research methods you'd use to build valid ones.

  5. An FRQ asks you to recommend a research plan for improving an existing e-commerce checkout flow. Identify which techniques belong in the discovery phase versus the validation phase, and justify your sequencing.