  1  # User Research Methods
  2  
  3  **Purpose:** Complete guide to understanding users through systematic research methods — from discovery to validation.
  4  
  5  **Principle:** User research is not an optional add-on. It's the foundation of user-centered design. Without research, you're designing for assumptions, not people.
  6  
  7  ---
  8  
  9  ## 1. Why User Research Matters
 10  
 11  ### The Business Case
 12  
 13  **Principle:** Research reduces risk and increases ROI.
 14  
 15  **Impact:**
 16  - **Cost savings:** Fixing problems after launch is dramatically more expensive than fixing them during design (industry estimates run as high as 100x)
 17  - **Competitive advantage:** Understanding user needs creates differentiated experiences
 18  - **Reduced abandonment:** Research-driven products are commonly reported to achieve 2-3x higher retention
 19  
 20  **The reality:**
 21  Even the best UX designers cannot design optimal experiences without observing real users. The number of variables in modern interfaces and human behavior is too large to rely on intuition alone.
 22  
 23  ### When to Do Research
 24  
 25  **Research happens throughout the design process:**
 26  
 27  1. **Discover (Before Design):** Understand user needs and context
 28  2. **Explore (During Design):** Validate design decisions and directions
 29  3. **Test (Before Launch):** Ensure usability and effectiveness
 30  4. **Listen (After Launch):** Monitor performance and uncover new opportunities
 31  
 32  ---
 33  
 34  ## 2. Types of User Research
 35  
 36  ### Qualitative vs. Quantitative
 37  
 38  **Qualitative Research:**
 39  - **Purpose:** Understand "why" users behave the way they do
 40  - **Methods:** Interviews, usability testing, field studies, diary studies
 41  - **Output:** Insights, patterns, quotes, anecdotes
 42  - **Sample Size:** Small (5-10 participants per user group)
 43  - **Best for:** Discovering problems, understanding motivations, exploring opportunities
 44  
 45  **Quantitative Research:**
 46  - **Purpose:** Measure "what" users do at scale
 47  - **Methods:** Surveys, analytics, A/B testing, heat maps
 48  - **Output:** Metrics, percentages, statistical significance
 49  - **Sample Size:** Large (100+ participants)
 50  - **Best for:** Validating assumptions, benchmarking, tracking trends
 51  
 52  ### Attitudinal vs. Behavioral
 53  
 54  **Attitudinal Research:**
 55  - **What it is:** Listening to what users say
 56  - **Methods:** Interviews, surveys, focus groups
 57  - **Use when:** You want to understand opinions, preferences, or self-reported behavior
 58  - **Risk:** What people say often differs from what they actually do
 59  
 60  **Behavioral Research:**
 61  - **What it is:** Watching what users actually do
 62  - **Methods:** Usability testing, field studies, analytics
 63  - **Use when:** You want to understand real behavior and interactions
 64  - **Advantage:** Actions speak louder than words
 65  
 66  **Best practice:** Combine both approaches for the sharpest insights.
 67  
 68  ---
 69  
 70  ## 3. Core Research Methods
 71  
 72  ### Usability Testing
 73  
 74  **Principle:** Observe real users performing real tasks with your design.
 75  
 76  **What it is:**
 77  A facilitator guides a participant through tasks using an interface while observing behavior and listening to feedback.
 78  
 79  **Key Elements:**
 80  - **Facilitator:** Guides the test, asks follow-up questions, ensures data quality
 81  - **Tasks:** Realistic activities users would perform in real life
 82  - **Participant:** Should represent your target user group
 83  
 84  **Sample Size:**
 85  - **Qualitative:** 5 participants per user group typically uncovers ~85% of usability problems (Nielsen's rule of thumb)
 86  - **Quantitative:** 20+ participants for statistically significant metrics
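
The five-user guideline above traces to a commonly cited problem-discovery model (Nielsen & Landauer): the share of problems found by *n* participants is 1 − (1 − p)^n, where *p* is the probability that a single participant reveals a given problem (~0.31 in their data). A quick sketch of that model — treat `p = 0.31` as an assumption to calibrate for your own product:

```python
# Problem-discovery model: expected fraction of usability problems
# found by n test participants. p is the per-participant probability
# of surfacing a given problem (Nielsen & Landauer's estimate ~0.31 —
# an assumption; calibrate it against your own data if you can).
def problems_found(n: int, p: float = 0.31) -> float:
    """Expected fraction of problems uncovered by n participants."""
    return 1 - (1 - p) ** n

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} participants -> {problems_found(n):.0%} of problems")
# With p = 0.31, five participants land at roughly 84%.
```

The curve flattens quickly, which is why several small rounds of testing (5 users, fix, retest) tend to beat one large round.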
 87  
 88  **Types of Usability Testing:**
 89  
 90  1. **In-Person:**
 91     - Facilitator and participant in same room
 92     - Rich observational data (body language, facial expressions)
 93     - Higher cost and logistics
 94  
 95  2. **Remote Moderated:**
 96     - Facilitator and participant in different locations
 97     - Screen sharing (Skype, Zoom, GoToMeeting)
 98     - Lower cost, geographic flexibility
 99  
100  3. **Remote Unmoderated:**
101     - Participant completes tasks alone using online platform
102     - Automated task delivery and recording
103     - Scalable but limited interaction
104  
105  **Best Practices:**
106  - Use the "think-aloud" method: ask participants to narrate their thoughts
107  - Write task instructions carefully — avoid leading language
108  - Test your tasks first (pilot test) to catch issues
109  - Record sessions for analysis (with consent)
110  
111  ### Interviews
112  
113  **Principle:** Direct conversations uncover deep insights about user needs, motivations, and contexts.
114  
115  **Types:**
116  
117  1. **Contextual Inquiry:**
118     - Interview users in their own environment
119     - Observe them performing tasks in context
120     - Reveals environmental and workflow factors
121  
122  2. **Semi-Structured Interviews:**
123     - Open-ended questions with a flexible guide
124     - Allows exploration of unexpected topics
125     - Balances consistency with depth
126  
127  3. **Stakeholder Interviews:**
128     - Interview business stakeholders about goals and constraints
129     - Understand organizational context and requirements
130     - Align research with business objectives
131  
132  **Best Practices:**
133  - Ask "why" repeatedly to dig deeper (laddering technique)
134  - Avoid leading questions that suggest answers
135  - Listen more than you speak (80/20 rule)
136  - Record and transcribe for analysis
137  
138  ### Surveys
139  
140  **Principle:** Surveys gather data from many users quickly and at scale.
141  
142  **When to use:**
143  - Validate findings from qualitative research
144  - Gather demographic information
145  - Track satisfaction over time
146  - Collect self-reported behaviors and preferences
147  
148  **Best Practices:**
149  - Keep it short (5-10 questions max)
150  - Use simple, unambiguous language
151  - Mix question types (multiple choice, rating scales, open-ended)
152  - Test your survey first (pilot with 5-10 people)
153  - Offer incentives to increase response rates
154  
155  **Question types:**
156  - **Multiple choice:** Easy to analyze, limited depth
157  - **Rating scales:** Quantifiable attitudes (Likert scales)
158  - **Open-ended:** Rich insights, harder to analyze
159  - **Ranking:** Understand priorities and preferences
160  
161  ### Field Studies
162  
163  **Principle:** Observe users in their natural environment to understand context and behavior.
164  
165  **What it reveals:**
166  - Environmental factors affecting usage
167  - Workflow and process context
168  - Social and cultural influences
169  - Pain points not visible in lab settings
170  
171  **Types:**
172  1. **Ethnographic Studies:** Immersive observation over extended periods
173  2. **Contextual Inquiry:** Interview + observation during task performance
174  3. **Shadowing:** Following users through their day
175  
176  **Best Practices:**
177  - Plan observation periods (2-4 hours typical)
178  - Take detailed notes and photos (with permission)
179  - Look for workarounds and hacks — they reveal unmet needs
180  - Stay neutral; don't intervene unless necessary
181  
182  ### Diary Studies
183  
184  **Principle:** Users record their experiences over time, revealing patterns and context.
185  
186  **What it captures:**
187  - Longitudinal behavior and attitudes
188  - Context of use (when, where, why)
189  - Emotional journey over time
190  - In-the-moment reflections
191  
192  **Implementation:**
193  - Participants log activities in a diary (digital or physical)
194  - Prompts ask about specific behaviors, contexts, or feelings
195  - Duration: 1 week to 1 month typical
196  - Daily check-ins maintain engagement
197  
198  **Best Practices:**
199  - Keep diary entries short (2-3 minutes to complete)
200  - Use mobile-friendly digital diaries
201  - Send daily reminders to participants
202  - Offer incentives for completion
203  - Analyze for patterns over time
204  
205  ### Card Sorting
206  
207  **Principle:** Users organize information into categories, revealing mental models.
208  
209  **When to use:**
210  - Design information architecture
211  - Organize navigation structures
212  - Plan content hierarchies
213  - Understand user categorization logic
214  
215  **Types:**
216  1. **Open Card Sort:** Users create their own categories
217  2. **Closed Card Sort:** Users sort into predefined categories
218  3. **Hybrid Card Sort:** Mixed approach with some predefined categories
219  
220  **Best Practices:**
221  - Use 30-60 cards maximum
222  - Label cards clearly with user-facing terms
223  - Test your cards first (are they understandable?)
224  - Analyze for patterns, not consensus
225  - Consider remote tools for scalability
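
A common way to "analyze for patterns" in an open card sort is a co-occurrence (similarity) matrix: for each pair of cards, count how many participants grouped them together. A minimal sketch — the card labels and sort data below are hypothetical illustration data:

```python
from collections import Counter
from itertools import combinations

# Each participant's sort: a list of groups, each group a set of cards.
# Hypothetical data for three participants sorting four cards.
sorts = [
    [{"Pricing", "Plans"}, {"Login", "Profile"}],
    [{"Pricing", "Plans", "Profile"}, {"Login"}],
    [{"Pricing", "Plans"}, {"Login", "Profile"}],
]

co_occurrence = Counter()
for participant in sorts:
    for group in participant:
        # Count every pair of cards placed in the same group.
        for a, b in combinations(sorted(group), 2):
            co_occurrence[(a, b)] += 1

# Pairs sorted together by most participants suggest the categories
# users expect — candidates for the same navigation section.
for pair, count in co_occurrence.most_common():
    print(f"{pair[0]} + {pair[1]}: {count}/{len(sorts)} participants")
```

Dedicated tools (e.g., Optimal Workshop) build the same matrix and add dendrograms, but the underlying pairwise counting is this simple.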
226  
227  ---
228  
229  ## 4. Research Process
230  
231  ### Step 1: Define Research Questions
232  
233  **Start with clear objectives.**
234  
235  Good research questions:
236  - Specific and focused
237  - Answerable with chosen methods
238  - Aligned with business goals
239  - Tied to specific assumptions you want to test
240  
241  **Examples:**
242  - "Why do users abandon checkout at the payment step?"
243  - "How do users currently track their expenses?"
244  - "What features do users expect in a mobile banking app?"
245  
246  ### Step 2: Choose Methods
247  
248  **Match methods to questions:**
249  
250  | Research Goal | Best Methods |
251  |---------------|--------------|
252  | Understand user needs | Interviews, field studies, diary studies |
253  | Test usability | Usability testing (moderated) |
254  | Validate at scale | Surveys, analytics, A/B testing |
255  | Explore information needs | Card sorting, tree testing |
256  | Observe natural behavior | Field studies, diary studies |
257  
258  **Consider constraints:**
259  - Time available
260  - Budget
261  - Access to participants
262  - Tools and expertise
263  
264  ### Step 3: Recruit Participants
265  
266  **Principle:** Recruit users who represent your target audience.
267  
268  **Recruitment channels:**
269  - **User database:** Existing customers or users
270  - **Recruitment agencies:** Professional services (UserResearch.com, User Interviews)
271  - **Social media:** Targeted ads and posts
272  - **Referrals:** Current participants refer others
273  - **Intercept recruiting:** Approaching users in context
274  
275  **Screening criteria:**
276  - Demographics (age, location, role)
277  - Experience level (novice vs. expert)
278  - Usage patterns (frequency, features used)
279  - Technical setup (device, browser, internet)
280  
281  **Incentives:**
282  - Monetary compensation ($50-150 per session typical)
283  - Gift cards or product discounts
284  - Early access to features
285  - Donation to charity
286  
287  **Sample size guidance:**
288  - **Qualitative:** 5-10 participants per user group
289  - **Quantitative:** 100+ for statistical significance
290  - **Card sorting:** 15-30 participants
291  - **Surveys:** 200+ for reliable data
292  
293  ### Step 4: Prepare Research Materials
294  
295  **Essential materials:**
296  
297  1. **Research Plan:**
298     - Objectives and questions
299     - Methods and timeline
300     - Participant criteria
301     - Discussion guide or tasks
302  
303  2. **Discussion Guide (for interviews):**
304     - Introduction and consent
305     - Warm-up questions
306     - Core questions aligned with objectives
307     - Closing and thank you
308  
309  3. **Task Scenarios (for usability testing):**
310     - Realistic activities
311     - Clear starting point
312     - Success criteria
313     - Avoid leading language
314  
315  4. **Consent Form:**
316     - Purpose of research
317     - What will be recorded
318     - How data will be used
319     - Right to withdraw
320  
321  ### Step 5: Conduct Research
322  
323  **Best practices during sessions:**
324  
325  - **Build rapport:** Start with casual conversation
326  - **Stay neutral:** Don't influence responses
327  - **Probe deeper:** Ask "why" and "tell me more"
328  - **Record everything:** Audio, video, notes (with consent)
329  - **Be flexible:** Follow interesting threads
330  - **Watch the clock:** Respect participant time
331  
332  **Common pitfalls:**
333  - Leading questions ("Don't you think...?")
334  - Interrupting participants
335  - Jumping to solutions
336  - Confirmation bias (seeking only agreement)
337  - Ignoring body language
338  
339  ### Step 6: Analyze Data
340  
341  **Qualitative Analysis:**
342  
343  1. **Affinity Diagramming:**
344     - Group observations and quotes by theme
345     - Identify patterns across participants
346     - Surface insights and opportunities
347  
348  2. **Thematic Analysis:**
349     - Code data for recurring themes
350     - Map relationships between themes
351     - Illustrate with quotes and examples
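
The coding step above reduces, at its core, to tallying themes across participants. A sketch of that skeleton — participant IDs, theme codes, and quotes here are hypothetical:

```python
from collections import Counter, defaultdict

# Coded observations: (participant_id, theme, quote). Hypothetical data.
observations = [
    ("P1", "trust", "I wasn't sure my card details were safe."),
    ("P2", "trust", "The padlock icon reassured me."),
    ("P2", "navigation", "I couldn't find the back button."),
    ("P3", "trust", "I quit because it asked for too much info."),
]

theme_counts = Counter(theme for _, theme, _ in observations)
participants_per_theme = defaultdict(set)
for pid, theme, _ in observations:
    participants_per_theme[theme].add(pid)

# Rank themes by breadth (distinct participants), not raw quote count —
# one talkative participant shouldn't dominate the analysis.
ranked = sorted(participants_per_theme.items(), key=lambda kv: -len(kv[1]))
for theme, pids in ranked:
    print(f"{theme}: {len(pids)} participants, {theme_counts[theme]} quotes")
```

Real thematic analysis is iterative (codes are merged and split as patterns emerge), but keeping counts of participants per theme keeps the findings honest.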
352  
353  **Quantitative Analysis:**
354  
355  1. **Descriptive Statistics:**
356     - Averages, percentages, distributions
357     - Task completion rates, time on task
358     - Satisfaction scores (SUS, NPS)
359  
360  2. **Inferential Statistics:**
361     - Compare groups (A/B testing)
362     - Test significance (p-values)
363     - Correlations and relationships
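
Both quantitative steps above can be sketched in a few lines. SUS scoring follows the standard formula (odd-numbered items: response − 1; even-numbered items: 5 − response; sum × 2.5, giving 0-100), and the A/B comparison uses a two-proportion z-test on task completion rates. The sample responses and counts are hypothetical:

```python
import math

def sus_score(responses):
    """Standard SUS score for one participant's ten 1-5 responses."""
    # Indices 0, 2, 4, ... are odd-numbered items (1, 3, 5, ...).
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic comparing two task-completion rates (A/B test)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical data: one participant's SUS answers, and completion
# counts for two design variants (42/60 vs 30/60 tasks completed).
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
z = two_proportion_z(42, 60, 30, 60)
print(round(z, 2))  # |z| > 1.96 -> significant at the 0.05 level
```

For anything beyond a quick check, use a statistics library (e.g., `scipy.stats`) rather than hand-rolled tests, but the z-test above shows what "statistical significance" is actually computing.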
364  
365  **Best practices:**
366  - Start analysis immediately after sessions
367  - Use both top-down (research questions) and bottom-up (emerging themes) approaches
368  - Triangulate findings (confirm across methods)
369  - Distinguish between what users say vs. do
370  
371  ### Step 7: Report Findings
372  
373  **Principle:** Translate research into actionable insights.
374  
375  **Report structure:**
376  
377  1. **Executive Summary:**
378     - Key findings and recommendations
379     - Business impact
380     - 1-page summary for stakeholders
381  
382  2. **Background:**
383     - Research questions and objectives
384     - Methods used
385     - Participants and context
386  
387  3. **Findings:**
388     - Organized by theme or research question
389     - Support with data (quotes, metrics, examples)
390     - Distinguish between critical and nice-to-have
391  
392  4. **Recommendations:**
393     - Specific, actionable design changes
394     - Prioritized by impact and effort
395     - Aligned with business goals
396  
397  5. **Appendices:**
398     - Detailed methodology
399     - Full transcripts (if needed)
400     - Screening criteria and materials
401  
402  **Presentation tips:**
403  - Start with key insights, not methodology
404  - Use video clips and quotes to bring findings to life
405  - Include stakeholders in the analysis process for buy-in
406  - Provide clear next steps
407  
408  ---
409  
410  ## 5. Research Timing
411  
412  ### Discovery Phase (Before Design)
413  
414  **Goal:** Understand user needs, context, and pain points.
415  
416  **Methods:**
417  - Stakeholder interviews
418  - User interviews
419  - Field studies
420  - Competitive analysis
421  - Analytics review
422  
423  **Output:**
424  - User personas
425  - User needs and pain points
426  - Opportunity areas
427  - Design requirements
428  
429  ### Exploration Phase (During Design)
430  
431  **Goal:** Validate design directions and explore solutions.
432  
433  **Methods:**
434  - Card sorting
435  - Tree testing
436  - Concept testing
437  - Prototype testing
438  - A/B testing
439  
440  **Output:**
441  - Validated design directions
442  - Information architecture
443  - Refined prototypes
444  - Prioritized features
445  
446  ### Validation Phase (Before Launch)
447  
448  **Goal:** Ensure usability and effectiveness.
449  
450  **Methods:**
451  - Usability testing (moderated)
452  - Usability testing (unmoderated)
453  - Accessibility testing
454  - Surveys (SUS, NPS)
455  
456  **Output:**
457  - Usability benchmarks
458  - List of issues to fix
459  - Confidence in launch readiness
460  - Satisfaction metrics
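
NPS, one of the satisfaction metrics listed above, is computed from the 0-10 "how likely are you to recommend" question: percent promoters (9-10) minus percent detractors (0-6). A minimal sketch with hypothetical survey scores:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses to the 0-10 recommendation question.
print(nps([10, 9, 9, 8, 7, 7, 6, 5, 3, 10]))  # 10
```

Note that passives (7-8) drag the score toward zero without counting against it, so NPS can range from −100 to +100.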
461  
462  ### Listening Phase (After Launch)
463  
464  **Goal:** Monitor performance and uncover opportunities.
465  
466  **Methods:**
467  - Analytics review
468  - Surveys
469  - Feedback analysis
470  - Support ticket analysis
471  - A/B testing
472  
473  **Output:**
474  - Performance dashboards
475  - Improvement backlog
476  - Optimization priorities
477  - Future research needs
478  
479  ---
480  
481  ## 6. Research Ethics
482  
483  ### Informed Consent
484  
485  **Principle:** Participants must understand and agree to the research.
486  
487  **Consent form should include:**
488  - Purpose of research
489  - What will be recorded
490  - How data will be used and stored
491  - Who will see the data
492  - Right to withdraw without penalty
493  - Contact information for questions
494  
495  ### Privacy and Confidentiality
496  
497  **Best practices:**
498  - Anonymize data in reports (use pseudonyms)
499  - Store data securely (encrypted, access-controlled)
500  - Retain data only as long as needed
501  - Don't share identifiable information without consent
502  - Follow GDPR and other regulations
503  
504  ### Avoiding Harm
505  
506  **Ensure research:**
507  - Does not cause stress or discomfort
508  - Respects participant time
509  - Does not exploit vulnerable populations
510  - Provides fair compensation
511  - Minimizes risk
512  
513  ### Bias Awareness
514  
515  **Common biases:**
516  - **Confirmation bias:** Seeking only data that supports assumptions
517  - **Leading questions:** Influencing responses through phrasing
518  - **Social desirability:** Participants saying what they think you want to hear
519  - **Recency bias:** Overweighting recent data
520  - **Hawthorne effect:** Participants changing behavior because they're observed
521  
522  **Mitigation strategies:**
523  - Use neutral language in questions
524  - Triangulate findings across methods
525  - Include diverse perspectives in analysis
526  - Acknowledge limitations in reporting
527  
528  ---
529  
530  ## 7. Common Research Mistakes
531  
532  ### 1. Researching Too Late
533  
534  **Problem:** Starting research after design decisions are made.
535  
536  **Solution:** Research early and often throughout the process.
537  
538  ### 2. Wrong Participants
539  
540  **Problem:** Testing with non-representative users (colleagues, friends).
541  
542  **Solution:** Recruit participants who match your target user profile.
543  
544  ### 3. Leading Questions
545  
546  **Problem:** Questions that suggest desired answers.
547  
548  **Solution:** Use neutral, open-ended questions. Test your discussion guide first.
549  
550  ### 4. Too Many Research Questions
551  
552  **Problem:** Trying to answer everything in one study.
553  
554  **Solution:** Focus on 3-5 key questions per study. Do multiple studies if needed.
555  
556  ### 5. Ignoring Context
557  
558  **Problem:** Researching in lab settings that don't reflect real use.
559  
560  **Solution:** Combine lab testing with field studies and diary studies.
561  
562  ### 6. Analysis Paralysis
563  
564  **Problem:** Collecting data but not analyzing or acting on it.
565  
566  **Solution:** Start analysis immediately after sessions. Report quickly.
567  
568  ### 7. Research Without Action
569  
570  **Problem:** Findings sit in reports but don't influence design.
571  
572  **Solution:** Involve stakeholders in research. Present actionable recommendations.
573  
574  ---
575  
576  ## 8. Research Tools
577  
578  ### Planning and Recruitment
579  - **Dovetail:** Research repository and analysis
580  - **Notion:** Research planning and documentation
581  - **UserInterviews.com:** Participant recruitment
582  - **UserResearch.com:** Professional recruitment services
583  
584  ### Data Collection
585  - **Zoom/Skype:** Remote interviews and testing
586  - **UserTesting.com:** Unmoderated usability testing
587  - **Lookback:** Moderated remote testing
588  - **Maze:** Prototype testing
589  - **Optimal Workshop:** Card sorting and tree testing
590  - **Typeform/Google Forms:** Surveys
591  
592  ### Analysis and Reporting
593  - **Miro/Lucidchart:** Affinity diagramming
594  - **Excel/Google Sheets:** Quantitative analysis
595  - **Dovetail:** Qualitative analysis and insights management
596  - **Notion:** Research repository
597  - **Keynote/PowerPoint:** Presentation and reporting
598  
599  ### Analytics and Behavior
600  - **Google Analytics:** User behavior metrics
601  - **Hotjar/CrazyEgg:** Heat maps and session recordings
602  - **Mixpanel/Amplitude:** Event tracking and funnels
603  
604  ---
605  
606  ## Quick Checklist
607  
608  ### Planning
609  - [ ] Define clear research questions
610  - [ ] Choose appropriate methods
611  - [ ] Create research plan and timeline
612  - [ ] Prepare discussion guide or tasks
613  - [ ] Create consent forms
614  
615  ### Recruitment
616  - [ ] Define screening criteria
617  - [ ] Choose recruitment channel
618  - [ ] Set incentives
619  - [ ] Schedule participants
620  
621  ### Execution
622  - [ ] Pilot test materials
623  - [ ] Conduct research sessions
624  - [ ] Record sessions (with consent)
625  - [ ] Take notes during sessions
626  
627  ### Analysis
628  - [ ] Transcribe recordings
629  - [ ] Code and categorize data
630  - [ ] Identify patterns and insights
631  - [ ] Triangulate findings
632  
633  ### Reporting
634  - [ ] Create executive summary
635  - [ ] Illustrate with examples and quotes
636  - [ ] Provide actionable recommendations
637  - [ ] Present to stakeholders
638  - [ ] Archive research for future reference
639  
640  ---
641  
642  **Remember:** Good user research is not about proving you're right. It's about discovering what's true. Approach research with curiosity, not confirmation bias.
643  
644  **Research is the foundation of user-centered design. Without it, you're designing in the dark.**