One of the biggest challenges in UX research is understanding what users truly value. People often say one thing but behave differently when faced with actual choices. Conjoint analysis helps bridge this gap by analyzing how users make trade-offs between different features, enabling UX teams to prioritize effectively. Unlike direct surveys, conjoint analysis presents users with realistic product combinations, capturing their genuine decision-making patterns.

When paired with advanced statistical and machine-learning methods, this approach becomes even more powerful and predictive. Choice-based models like Hierarchical Bayes estimation reveal individual-level preferences, allowing tailored UX improvements for diverse user groups. Latent Class Analysis further segments users into distinct preference categories, helping design experiences that resonate with each segment.

Advanced regression methods improve the accuracy of behavioral predictions. Mixed Logit Models recognize that different users value features uniquely, while Nested Logit Models address hierarchical decision-making, such as choosing a subscription tier before specific features.

Machine-learning techniques offer additional insights. Random Forests uncover hidden relationships between features, such as those that matter only in combination, while Support Vector Machines classify users precisely, enabling targeted UX personalization. Bayesian approaches manage the inherent uncertainty in user choices: Bayesian Networks visually represent interconnected preferences, and Markov Chain Monte Carlo methods handle model complexity, delivering more reliable forecasts.

Finally, simulation techniques like Monte Carlo analysis allow UX teams to anticipate user responses to product changes or pricing strategies, reducing risk. Bootstrapping further strengthens findings by testing the stability of insights across resampled datasets. By leveraging these advanced conjoint analysis techniques, UX researchers can deeply understand user preferences and create experiences that align precisely with how users think and behave.
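To make the simulation idea concrete, here is a minimal Python sketch of the Monte Carlo step. The attribute levels and part-worth values are invented for illustration, not estimates from any real study; it simulates which of two hypothetical product profiles users would prefer under random utility noise.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical part-worth utilities, as if estimated from a choice-based
# conjoint study. Levels are coded against an omitted baseline
# (price_high, tier_basic, support_email all carry utility 0.0).
partworths = {
    "price_low": 0.8,
    "price_mid": 0.3,
    "tier_pro": 0.5,
    "support_247": 0.6,
}

def utility(profile, noise_scale=1.0):
    """Sum the part-worths of a profile and add Gumbel noise, the standard
    random-utility assumption behind logit-style choice models."""
    base = sum(partworths.get(level, 0.0) for level in profile)
    return base + rng.gumbel(scale=noise_scale)

# Two competing product profiles: the current design vs. a proposed change.
current = ("price_mid", "tier_pro", "support_email")
proposed = ("price_low", "tier_pro", "support_247")

# Monte Carlo simulation: draw noisy utilities many times and count how
# often each profile wins, approximating its share of preference.
n_draws = 10_000
wins = sum(utility(proposed) > utility(current) for _ in range(n_draws))
print(f"Simulated preference share for proposed profile: {wins / n_draws:.1%}")
```

In a real study the part-worths would come from a fitted choice model (for instance, per-respondent Hierarchical Bayes estimates); the simulation logic stays the same, and bootstrapping the choice data before re-fitting would show how stable the simulated shares are.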
Analyzing User Behavior on B2B Websites
Summary
Analyzing user behavior on B2B websites involves studying how business users interact with a digital platform to uncover their preferences, needs, and pain points. This process helps companies improve user experiences, streamline navigation, address issues, and align their offerings with user expectations.
- Monitor user interactions: Use tools like session recordings or analytics to see how users navigate your website and identify areas where they face friction or disengage.
- Focus on behavior, not opinions: Observe what users do rather than relying solely on what they say in interviews, as actions often reveal more genuine insights.
- Combine diverse data sources: Cross-check qualitative interviews with behavioral data and outcomes like conversion rates to get a fuller picture and reduce bias in your findings.
-
The best PMM research doesn't come from collecting more data. It comes from collecting data from more SOURCES... aka triangulation.

Triangulation improves the validity, depth, and confidence of your findings by cross-checking insights across distinct but complementary data sources. This reduces bias and lowers how much evidence you need from any single source. For most B2B personas, just 5 solid interviews will get you 80% of the way there if you complement them with other sources.

So, how can you apply this practically? Let's go through a real example.

Research question: What key benefits should we emphasize in the messaging for our primary persona, Business Ops leads?

1️⃣ Data source 1: Qualitative (what they say)
Sources (pick one or more):
→ 4 customer interviews with biz ops leads
→ Gong snippets from late-stage technical eval calls
→ Internal CSM notes during onboarding and renewal

Common quotes include:
"Every tool we add creates another integration headache."
"I just want something that doesn't break other things."

This suggests they care less about flashy features and more about stability, reliability, and ease of maintenance. Now let's verify this against behavioral data. 👇

2️⃣ Data source 2: Behavioral (what they do)
Sources (pick one or more):
→ Support logs and ticket categories for similar accounts
→ Feature usage of admin controls, integrations, and audit logs
→ Help center searches by role/persona tag

Insights:
→ Ops users are most active in integration, data sync, and permissions
→ High-NPS users rarely file tickets, but when they do, it's for downtime or bugs, not UI complaints

This confirms that reliability and ease of system management drive real behavior.

3️⃣ Data source 3: Outcome (what they choose)
Sources:
→ Win/loss notes
→ Procurement objections tagged by role
→ Post-sale NPS comments filtered by Business Ops titles

Insights:
→ In wins: "Didn't have to loop in Engineering" or "We were able to integrate in 1 sprint"
→ High-NPS Ops users cite: "It just works. Rarely need to touch it."

This confirms that the decision patterns match the earlier sentiments.

✅ Triangulated insight: "Business Ops leaders prioritize system trust and low-maintenance integrations; they will choose a solution that promises stability, control, and minimal firefighting over advanced features."

In summary, triangulated findings are more defensible, easier to get buy-in on, and more resistant to bias. You won't always have time for deep research, especially in a startup, but even a scrappy mix of 2-3 sources can level up your insight. The good news is you can use AI to speed up the grunt work, and then YOU bring the insight. This is the type of work that helps you drive business strategy and get seen. (A minimal sketch of the cross-source check follows this post.)

❓ When you build personas or messaging, what sources do you pull from?

#productmarketing #research #strategy #coaching
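As a minimal sketch of the triangulation step above, the following Python snippet cross-checks coded themes across the three source types and flags which insights are confirmed by two or more independent sources. The theme names and tags are illustrative; in practice they would come from interview coding, analytics, and win/loss notes.

```python
from collections import defaultdict

# Hypothetical coded themes per data source (illustrative labels only).
source_themes = {
    "qualitative (say)": {"reliability", "easy_maintenance", "integration_pain"},
    "behavioral (do)": {"reliability", "integration_pain", "admin_usage"},
    "outcome (choose)": {"reliability", "easy_maintenance", "fast_integration"},
}

# Count how many independent sources support each theme.
support = defaultdict(list)
for source, themes in source_themes.items():
    for theme in themes:
        support[theme].append(source)

# A theme counts as triangulated when two or more distinct sources confirm it.
for theme, sources in sorted(support.items(), key=lambda kv: -len(kv[1])):
    status = "triangulated" if len(sources) >= 2 else "single-source"
    print(f"{theme:17} {status:13} <- {', '.join(sources)}")
```

Themes backed by a single source (like "fast_integration" here) are the ones to treat as hypotheses rather than findings until another source confirms them.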
-
Your UX research is lying to you. And no, I'm not talking about small data inconsistencies.

I've seen founders blow $100K+ on product features their users "desperately wanted" only to face 0% adoption. Most research methods are fundamentally flawed because humans are terrible at predicting their own behavior.

Here's the TRUTH framework I've used to get accurate user insights:

T - Test with money, not words
• Never ask "would you use this?"
• Instead: "Here's a pre-order link for $50"
• Watch what they do, not what they say

R - Real environment observations
• Stop doing sterile lab tests
• Start shadowing users in their natural habitat
• Record their frustrations, not their feedback

U - Unscripted conversations
• Ditch your rigid question list
• Let users go off on tangents
• Their random rants reveal gold

T - Track behavior logs
• Implement analytics BEFORE research
• Compare what users say vs. what they do (see the sketch after this post)
• Look for patterns, not preferences

H - Hidden pain mining
• Users can't tell you their problems
• But they'll show you through workarounds
• Document their "hacks" - that's where innovation lives

STOP:
• Running bias-filled focus groups
• Asking leading questions
• Taking feedback at face value
• Rushing to build based on opinions

START:
• Following the TRUTH framework
• Measuring actions over words
• Building only what users prove they need

PS: Remember, Henry Ford said if he asked people what they wanted, they would have said "faster horses." Don't ask what they want. Watch what they do.

Follow me, John Balboa. I swear I'm friendly and I won't detach your components.
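For the "Track behavior logs" step, here is a minimal, hypothetical say/do comparison in Python. The feature names and numbers are invented: stated survey interest is set against the share of active users who actually touch each feature.

```python
# Hypothetical say/do data. "stated" is survey interest (0-1 scale);
# "used" is the share of active users who touched the feature last month.
stated = {"advanced_reports": 0.9, "dark_mode": 0.7, "bulk_export": 0.4}
used = {"advanced_reports": 0.1, "dark_mode": 0.6, "bulk_export": 0.8}

# Flag features where words and actions diverge sharply.
for feature in stated:
    gap = stated[feature] - used[feature]
    flag = "SAY/DO GAP" if abs(gap) > 0.3 else "aligned"
    print(f"{feature:16} stated={stated[feature]:.1f} used={used[feature]:.1f} -> {flag}")
```

A large positive gap (loudly requested, rarely used) is the "faster horses" pattern; a large negative gap (quietly used, never requested) often points at a hidden workaround worth investigating.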
-
Look at what they do, not just what they say.

User behavior is how users interact with and use software. It includes things like:
→ how people navigate the interface
→ which features people use most often
→ the order in which people perform tasks
→ how much time people spend on activities
→ how people react to prompts or feedback
(A minimal sketch after this post shows how to pull several of these from a raw event log.)

Product managers and designers must understand these behaviors. Analyzing user behavior can enhance the user experience, simplify processes, spot issues, and make the software more effective. Discovering the "why" behind user actions is the key to creating great software.

In many of my sales discussions with teams, I notice that most rely too heavily on interviews to understand user problems. While interviews are a good starting point, they only cover half of the picture.

What's the benefit of going beyond interviews?
→ See actual user behavior, not just reported actions
→ Gain insights into unspoken needs in natural settings
→ Minimize behavior changes by observing discreetly
→ Capture genuine interactions for better data
→ Document detailed behaviors and interactions
→ Understand the full user journey and hidden pain points
→ Discover issues and opportunities users miss
→ Identify outside impacts on user behavior

Most people don't think in a hyper-rational way; they're just trying to fit in. That's why when we built Helio, we included task-based activities to learn from users' actions and then provided follow-up questions about their thoughts and feelings.

User behaviors aren't always rational. Several factors contribute to this:

Cognitive Biases
↳ Users rely on mental shortcuts, often sticking to familiar but inefficient methods.

Emotional Influence
↳ Emotions like stress or frustration can lead to hasty or illogical decisions.

Habits and Routine
↳ Established habits may cause users to overlook better options or new features.

Lack of Understanding
↳ Users may make choices based on limited knowledge, leading to seemingly irrational actions.

Contextual Factors
↳ External factors like time pressure or distractions can impact user behavior.

Social Influence
↳ Peer pressure or the desire to conform can also drive irrational choices.

Observing user behavior, especially in large sample sizes, helps designers see how people naturally use products. This method gives a clearer and more accurate view of user behavior, uncovering hidden needs and issues that might not surface in interviews.

#productdesign #productdiscovery #userresearch #uxresearch
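As a minimal illustration of the behaviors listed at the top of this post (the event data below is invented), a few lines of Python can pull feature-usage frequency and per-user task order out of a raw event log:

```python
from collections import Counter
from datetime import datetime

# Invented event log: (user, feature, timestamp), e.g. exported from analytics.
events = [
    ("u1", "dashboard", datetime(2024, 5, 1, 9, 0)),
    ("u1", "export", datetime(2024, 5, 1, 9, 4)),
    ("u2", "dashboard", datetime(2024, 5, 1, 9, 2)),
    ("u2", "settings", datetime(2024, 5, 1, 9, 9)),
    ("u2", "export", datetime(2024, 5, 1, 9, 12)),
]

# Which features are used most often?
usage = Counter(feature for _, feature, _ in events)
print(usage.most_common())  # [('dashboard', 2), ('export', 2), ('settings', 1)]

# In what order do users perform tasks? (per-user navigation paths)
paths: dict[str, list[str]] = {}
for user, feature, _ts in sorted(events, key=lambda e: e[2]):
    paths.setdefault(user, []).append(feature)
print(paths)  # {'u1': ['dashboard', 'export'], 'u2': ['dashboard', 'settings', 'export']}
```

The same log, joined with interview findings, is what lets you check whether reported behavior matches observed behavior.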
-
User research is great, but what if you do not have the time or budget for it?

In an ideal world, you would test and validate every design decision. But that is not always the reality. Sometimes you do not have the time, access, or budget to run full research studies. So how do you bridge the gap between guessing and making informed decisions? These are some of my favorites:

1️⃣ Analyze drop-off points: Where users abandon a flow tells you a lot. Are they getting stuck on an input field? Hesitating at the payment step? Running into bugs? These patterns reveal key problem areas. (See the funnel sketch after this post.)

2️⃣ Identify high-friction areas: Where users spend the most time can be good or bad. If a simple action is taking too long, that might signal confusion or inefficiency in the flow.

3️⃣ Watch real user behavior: Tools like Hotjar (by Contentsquare) or PostHog let you record user sessions and see how people actually interact with your product. This exposes where users struggle in real time.

4️⃣ Talk to customer support: They hear customer frustrations daily. What are the most common complaints? What issues keep coming up? This feedback is gold for improving UX.

5️⃣ Leverage account managers: They are constantly talking to customers and solving their pain points, often without looping in the product team. Ask them what they are hearing. They will gladly share everything.

6️⃣ Use survey data: A simple Google Forms, Typeform, or Tally survey can collect direct feedback on user experience and pain points.

7️⃣ Reference industry leaders: Look at existing apps or products with similar features to what you are designing. Use them as inspiration to simplify your design decisions. Many foundational patterns have already been solved; there is no need to reinvent the wheel.

I have used all of these methods throughout my career, but the trick is knowing when to use each one and when to push for proper user research. This comes with time. That said, not every feature or flow needs research. Some areas of a product are so well understood that testing does not add much value.

What unconventional methods have you used to gather user feedback outside of traditional testing?

_______
👋🏻 I'm Wyatt, designer turned founder, building in public & sharing what I learn. Follow for more content like this!
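For the first method, a minimal funnel sketch (with invented step names and counts) shows how step-to-step drop-off can be computed from a basic analytics export:

```python
# Invented funnel counts per step, e.g. pulled from a product analytics export.
funnel = [
    ("landing", 1000),
    ("signup", 420),
    ("add_details", 300),
    ("payment", 120),
    ("complete", 90),
]

# Step-to-step drop-off shows exactly where users abandon the flow.
for (step, n), (next_step, next_n) in zip(funnel, funnel[1:]):
    drop = 1 - next_n / n
    print(f"{step:11} -> {next_step:11}: {drop:.0%} drop-off")
```

A disproportionately steep step (here, add_details to payment at 60%) is where session recordings and support tickets are worth examining first.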