People often describe what they think they want, but that doesn’t always match what will actually solve their problem. Most of the time, they’re describing symptoms, not the root cause. That’s where you come in. Your job is to dig deeper, ask better questions, and help them uncover what they really need.

Interviews might sound insightful, but users are human: they’re subject to memory lapses, bias, and the desire to be liked. This means that what people say doesn’t always match what they do or need. Sometimes people forget, sometimes they’re just being polite, and sometimes they just want to keep the peace. Whatever the reason, this is how we end up designing based on assumptions, not actual needs. And that’s how we get stuck with products, services, and experiences that totally miss the mark—or worse, flat-out fail.
Personas bridge that gap—they are grounded in user research, not just opinions. They help us see past surface-level answers and design for what matters.
Why Users Don’t Always Say What They Mean

People aren’t trying to deceive researchers. But the research process can unintentionally invite vague or inaccurate answers. Here are some of the key psychological and methodological reasons why users’ answers can’t be taken at face value:
Social desirability bias: Users want to give answers they believe are socially acceptable—for instance, claiming they brush their teeth twice a day or never forget passwords.
Example: A user claims they always read the terms and conditions before signing up because saying otherwise feels careless, even though in reality they skip them every time.
Hawthorne effect: Simply being observed makes people act differently.
Example: In a usability test, a participant carefully follows instructions and explores all features, even though, on their own, they’d likely skim or abandon the page quickly.
Recall bias: Users often misremember what they did, how often, or why.
Example: A user says they check the company’s mobile app “a few times a week,” but usage data shows they log in only once every two weeks.
Limited self-awareness: People don’t always understand the deeper motivations behind their actions.
Example: A participant insists they prefer to shop around for deals, but clicks the first product that meets their needs—every time.
Framing bias: Even the way we ask questions can suggest an answer.
Example: If you ask “How much do you enjoy using this app?”, it assumes a level of enjoyment and pressures the user to answer positively—even if their real experience was neutral or frustrating.
That’s why experienced researchers don’t rely on self-reporting alone. While interviews and surveys can reveal useful perspectives, they only show one side of the story—what users think they do, or what they want to believe about their behavior. To truly understand users, you need to pair those insights with observational methods that reveal what actually happens in real-world contexts. It’s only when you compare what people say with what they do that you can see the gaps, contradictions, and unspoken needs that drive meaningful design decisions.
Common Research Biases That Skew Your Personas (and How to Avoid Them)
User bias is only half the story. The other half is your own biases (researcher bias)—the subtle, often unintentional ways your expectations influence what you hear and how you interpret it.
Here are some researcher biases to watch for:
Confirmation bias: You focus on findings that support your assumptions.
Example: You believe users prefer minimal interfaces, so during interviews, you only highlight and record feedback that supports simplicity. As a result, you ignore those who want more detailed information upfront.
Cultural bias: You apply your own norms to interpret others’ behavior.
Example: You assume Western conventions for navigation (like left-to-right reading or hamburger menus) are universal, and misinterpret confusion from international users as usability issues rather than cultural differences.
Availability bias: You use easily accessible participants who may not reflect your true users.
Example: You conduct interviews with your coworkers or friends because they’re easy to reach, but they’re far more tech-savvy than your target audience of first-time mobile users.
Framing effect: You phrase questions in a way that suggests a “right” answer.
Example: If you ask, “How easy was it to find what you were looking for?” instead of, “Tell me about your experience navigating the site,” you subtly pressure users to give positive feedback—even if they struggled.
You can reduce these risks when you combine thoughtful research with direct observation. Start by asking neutral, open-ended questions that allow users to express themselves in their own words, rather than leading them toward a particular response. For example, instead of asking, “Do you find our checkout confusing?”—which subtly assumes there may be a problem—you might ask, “Can you walk me through how you usually complete a purchase online?” This invites users to describe their natural process without judgment or pressure.
It’s also critical to observe users in real or realistic contexts whenever possible. Watching someone interact with a product in their actual work environment or daily routine reveals habits, hesitations, and workarounds that interviews alone can’t surface. A user might say they always check their cart before checkout, but observation may show they often skip that step when multitasking or distracted. These subtle gaps between intention and behavior are where the most valuable insights live—and where your personas gain depth and accuracy.
Ultimately, it’s the combination of how you ask and what you watch that gives you a clearer picture of your users' true needs.
How to Validate Personas with Observational Research

To move from guesswork to real understanding, personas must be grounded in what users actually do—not just what they say. This is where observational research becomes essential. Interviews and surveys often capture opinions and perceptions, but they can miss critical details about users’ behaviors, habits, and pain points. This is why observation is a cornerstone of grounded theory (a research method where insights and patterns emerge directly from real-world observations and interviews). It lets researchers build understanding from the ground up, with patterns surfacing naturally from real-life behavior.
As William Hudson, User Experience Strategist and Founder of Syntagm Ltd., explains in this video, grounded theory offers an ideal framework for this kind of research—one that prioritizes open-ended exploration and builds understanding from real user behavior, not assumptions.
Grounded theory is based on observing real behavior, not just collecting opinions. So how do you put that into practice when developing your personas?
Here are three observational methods that help you move beyond surface-level insights and build personas that reflect how people behave in their everyday environments:
Contextual Inquiry
Observe users as they complete tasks in their own environments, asking clarifying questions along the way. This method is especially useful for uncovering friction points in workflows that users may not be able to describe on their own.
Example: A user testing a business expense app claimed it was “easy to use,” but in a contextual observation, they spent several minutes unsure how to categorize recurring payments—something they didn’t mention when asked directly.
Naturalistic Observation
Watch users in their normal routines without interfering. This helps you understand what people do when they’re not being prompted or guided through a structured task. It’s an ideal way to capture instinctive behavior.
Example: In a public library study, researchers observed that older patrons consistently avoided self-checkout kiosks, even though they had rated them “very convenient” in surveys. The real-world behavior suggested hesitation or confusion that wasn’t captured in the self-reported data.
Cultural Probes
Ask participants to self-document their experiences through diaries, photos, or videos over time. This method captures long-term, context-rich insights that might otherwise fall outside a traditional research session.

Example: One user described online grocery shopping as “easy.” But when asked to keep a diary of their weekly experience, they repeatedly noted frustrations with late or thawed frozen deliveries—insights that would have been missed in a single interview.
When you incorporate these real-world research techniques into your process, you don’t just gather data; you uncover contradictions, contextual cues, and emotional responses that shape far more realistic and trustworthy personas. Don’t rely only on what users say in a controlled setting; see what they do in the messy, unpredictable context of everyday life—that’s the kind of insight that leads to products, services, and experiences your users will love.
How to Balance What Users Say vs. What They Do
The key isn’t to distrust what users say; it’s to contextualize it.
What people say in interviews tells you how they perceive their behavior. What they do in natural environments tells you how they actually behave. Personas should be built on both.
Here’s how to integrate the two effectively:
Start broad: Begin with open interviews to hear pain points in users’ own words.
Example: A user says they “always look for eco-friendly options when shopping.” That belief is important—but it’s just the starting point.
Dig deeper: Use observations to spot discrepancies or validate patterns.
Example: The same user may click the first product with a fast delivery badge, skipping sustainability filters altogether.
Apply triangulation: Combine multiple data sources, researchers, and methods to build a fuller picture.
Example: You learn from interviews that users find your scheduling tool “straightforward,” but session recordings show repeated backtracking and high abandonment. By comparing sources, you uncover that users misunderstood the interface but didn’t know how to describe the problem.
Triangulation helps you check one insight against another, which reduces the chance of building on faulty data and makes your personas more reliable and representative.
Good Design Starts with a Persona That Works
When personas are rooted in real research—not assumptions or isolated anecdotes—they become more than profiles. They become tools for truth.
A behavior-driven persona gives your team a consistent lens through which to evaluate ideas, prioritize features, and challenge assumptions. It keeps design decisions grounded in the lived experiences of your users—not generic notions of what “the user” wants.
Instead of asking vague, hypothetical questions like:
“What would a typical user think of this feature?”
You ask:
“Would Priya, who skips onboarding and dives straight into the product, understand this flow without context?”
“Would Marcus, who only accesses our app on low-bandwidth connections, be able to complete this task without frustration?”
“Would Mei, who shares her device with her kids, trust this interface with sensitive data?”
These kinds of specific, grounded questions shift conversations from opinion to insight. They help you design for real people. People who are inconsistent. People who multitask. People who don't read tooltips. People who make decisions based on emotion, urgency, habit, or hesitation. Personas, when built well, embrace that complexity and turn your designs into products, services, and experiences people will love.
The Take Away
Users don’t always say what they mean, and they don’t always behave the way they claim. That gap between intention and action makes it risky to rely on self-reporting alone. Interviews and surveys reveal perceptions—but real understanding comes from observing what users actually do.
Accurate personas require more than anecdotes or assumptions. They’re shaped by real behaviors, uncovered through methods like contextual inquiry, naturalistic observation, and cultural probes. Grounded theory allows patterns to emerge without preconceptions, while triangulation strengthens findings by combining multiple sources and perspectives.
Research bias—whether confirmation, cultural, framing, or availability—can distort results, and user-side biases like recall errors or social desirability further complicate what’s said. To counteract those influences, design better questions, watch users in real contexts, and recognize that even contradictions offer valuable insight.
Reliable personas reflect this complexity. They balance what users say with what they do. They move beyond fiction and help teams make confident, human-centered decisions—rooted in evidence, not assumption. And that’s what leads to products, services, and experiences that meet the needs of the people they’re built for.
References and Where to Learn More
Want to know more about personas and how to use them effectively? Personas and User Research: Design Products and Services People Need and Want will show you how to gather meaningful user insights, avoid bias, and build research-backed personas that help you design intuitive, relevant products. You’ll walk away with practical skills and a certificate that demonstrates your expertise in user research and persona creation.
Check out Personas for a clear breakdown of what makes personas effective—and how to avoid common pitfalls.
Read Grounded Theory: Base Findings on Research, Not Preconceptions to learn how to build insights from user behavior without letting assumptions shape the outcome.