User Interviews

User interviews are one-on-one conversations designed to understand how people think, what they need, and why they behave the way they do. They are the most versatile research method and the one every UX professional should master first.

When to Use Interviews

Interviews are best when you want to understand the "why" behind behavior:

  • Discovery phase — understanding user needs before designing anything
  • Exploring pain points — digging into problems users face with current solutions
  • Understanding context — learning about the environment and workflow around a task
  • Post-task debriefs — understanding why a participant struggled during a usability test

Interviews are not ideal for measuring behavior at scale or comparing two designs quantitatively — use surveys or A/B tests for those.

Writing an Interview Script

A good script is a guide, not a rigid checklist. Structure it in three phases:

Opening (5 minutes)

Build rapport and set expectations:

"Thanks for joining today. I'm [name] and I'm a researcher at [company].
We're trying to improve [product/feature]. There are no right or wrong
answers — I'm here to learn from your experience. The session will take
about 45 minutes. Do you have any questions before we start?"

Ask for consent to record if applicable. Explain how the recording will be used.

Core Questions (30-35 minutes)

Start broad and get specific. Move from context to behavior to opinion:

  1. Context questions — establish the participant's background

    • "Tell me about your role and what a typical day looks like."
    • "How does [task] fit into your daily workflow?"
  2. Behavior questions — understand what they actually do

    • "Walk me through the last time you [did the thing]."
    • "What steps did you take to [accomplish the goal]?"
  3. Pain point questions — uncover problems

    • "What was the most frustrating part of that experience?"
    • "If you could change one thing about how you [do the task], what would it be?"
  4. Aspiration questions — learn what ideal looks like

    • "If this worked perfectly, what would that look like?"
    • "What would make this easier for you?"

Closing (5 minutes)

"Is there anything else about [topic] that I didn't ask about but
you think is important? Thank you so much for your time."

The Art of Asking Questions

The quality of your questions determines the quality of your insights.

Ask Open-Ended Questions

Closed (avoid) → Open (prefer)

  • "Do you like this feature?" → "How do you feel about this feature?"
  • "Is the navigation easy?" → "Tell me about your experience finding things."
  • "Would you use this?" → "How would this fit into your current workflow?"

Follow Up Relentlessly

The most valuable insights come from follow-up questions. When a participant says something interesting, dig deeper:

  • "Tell me more about that."
  • "What do you mean by [their word]?"
  • "Why is that important to you?"
  • "Can you give me an example?"
  • "What happened next?"

These prompts draw out the stories, emotions, and details that reveal real insights.

Avoid Leading Questions

Leading questions suggest an expected answer:

  • Leading: "Don't you think the new design is better?"

  • Neutral: "How does the new design compare to the previous one for you?"

  • Leading: "Most people find this confusing. What about you?"

  • Neutral: "What was your experience with this section?"

Embrace Silence

After asking a question, wait. New researchers fill silence with more questions or rephrased versions. Resist this urge. Participants need time to think, and the pause often produces the most thoughtful answers.

Count to five silently after the participant stops talking before you move on. They will often continue with deeper thoughts.

Active Listening

Listening is the most important skill in an interview. Active listening means:

  • Full attention — do not check your script, phone, or notes while the participant is speaking
  • Verbal acknowledgment — "I see," "That makes sense," "Interesting" (without expressing agreement or disagreement)
  • Reflecting back — "So what I'm hearing is..." to confirm understanding
  • Body language — nodding, maintaining eye contact, leaning slightly forward
  • Not solving — when a participant describes a problem, do not offer solutions. Your job is to understand, not fix.

Note-Taking

Take notes during the session, but keep them lightweight. Write down:

  • Direct quotes that capture key sentiments (mark them with quotation marks)
  • Observations about behavior or emotion
  • Questions that come to mind for follow-up

If the session is recorded, you can fill in details later. Do not try to transcribe everything in real time — it prevents you from being present.

Immediately after each session, spend 10-15 minutes writing a debrief:

  • Top 3-5 key takeaways
  • Surprising or unexpected findings
  • Quotes that stood out
  • Patterns emerging across sessions

Synthesis

After all interviews are complete, synthesize your findings:

  1. Review notes and recordings — highlight key statements and observations
  2. Create codes — tag recurring themes (e.g., "trust issues," "time pressure," "workaround")
  3. Group into themes — cluster related codes into broader themes
  4. Identify patterns — which themes appeared across multiple participants?
  5. Write insights — transform themes into actionable insight statements

An insight statement follows this format: "[User group] needs [need] because [reason], but currently [barrier]."

Example: "New users need a guided setup flow because they are unsure which settings matter, but currently the settings page presents 30+ options with no hierarchy."
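The tagging-and-counting steps above can be sketched as a small script. This is a minimal illustration, assuming your notes have already been tagged with codes; the participant IDs and code names below are hypothetical examples, not real data:

```python
from collections import defaultdict

# Hypothetical: codes tagged per interview session during review.
session_codes = {
    "P1": {"trust issues", "time pressure"},
    "P2": {"time pressure", "workaround"},
    "P3": {"trust issues", "time pressure", "workaround"},
    "P4": {"workaround"},
}

# Count how many participants each code appeared for.
code_counts = defaultdict(int)
for codes in session_codes.values():
    for code in codes:
        code_counts[code] += 1

# One way to surface patterns: codes present in a majority of sessions.
threshold = len(session_codes) / 2
patterns = sorted(c for c, n in code_counts.items() if n > threshold)
print(patterns)  # → ['time pressure', 'workaround']
```

A spreadsheet or sticky notes work just as well for a handful of sessions; a script like this mainly helps keep the frequency counts honest once you have many codes.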

Common Mistakes

  • Asking about the future — "Would you use this feature?" is unreliable. People are bad at predicting their own behavior. Ask about past behavior instead.
  • Interviewing stakeholders as users — internal opinions are not user research.
  • Confirming your hypothesis — if you only ask questions that confirm what you already believe, you are not doing research.
  • Running too many interviews — diminishing returns set in after 8-12 interviews. If every interview sounds the same, you have enough data.