Data doesn’t have to define your design process, but failing to use it is a big mistake. In our process, we use data from the beginning to draw inspiration, then to guide our prototyping decisions, and eventually to make data-driven choices. The process is more flexible than people often think. The goal isn’t to use data for its own sake; it’s to make more informed decisions that ultimately improve user and business outcomes. Here’s how:

→ Data-Inspired Design (Frame the Challenge)
We use data to inspire and shape our understanding of the design problem. The aim is to find insights that lead to creative solutions while considering what users need, how they behave, and why they act the way they do. We identify up to 100 opportunities to create lift in a design initiative. Helio UX metrics help us gather early user feedback and signals, highlighting where users struggle or where new opportunities lie. These early insights and proxy metrics set a clear direction for the design process. We also conduct interviews; our team focuses on collecting these early signals to understand the reasons behind user actions.

→ Data-Informed Design (Assess the Potential)
We weigh the benefits and risks of different ideas. Data helps guide the design process, but intuition and insight matter as much as measurable factors. In larger engagements, we collect answers from up to 30,000 participants in this phase. Helio is especially useful here because it lets teams test early prototypes at scale, gathering the UX metrics needed to evaluate design choices. Data storytelling and user research analysis turn insights into practical feedback, and cross-team collaboration ensures the design meets both user and business needs. We gather feedback through usability tests and measure task completion rates, linking early design ideas to clear success criteria.

→ Data-Driven Design (Finalize the Choices)
Data helps us make decisions that align with business and user goals. The focus is on refining the design with feedback and data to make it as effective as possible. Once the design is live, we connect early metrics with analytics. Helio helps us collect data such as success rates, user satisfaction, and task completion. These figures give us the confidence to finalize design decisions. We align UX metrics with business goals, focusing on clear outcomes like improved usability, higher feature adoption, or revenue growth. Design KPIs and early signals guide our final decisions based on how well the product performs against these success metrics.

Data can be applied differently throughout the design process: as an initial source of inspiration, as a guiding force in assessing potential, and ultimately as the driver of final decisions. We use data differently in each design phase, balancing creativity and analysis.

Interested? DM me.

#productdesign #productdiscovery #userresearch #uxresearch
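To make the metrics side of this concrete, here is a minimal Python sketch of the kind of rollup described above: turning raw usability-test responses into a task completion rate and an average satisfaction score, then checking them against a pre-agreed success criterion. The `Response` fields, the 80% threshold, and the function names are illustrative assumptions, not Helio's actual schema or API.

```python
# Hypothetical sketch: aggregating early usability-test responses into proxy
# UX metrics (task completion rate, average satisfaction). Field names and
# the success threshold are illustrative only.
from dataclasses import dataclass


@dataclass
class Response:
    participant_id: str
    task_completed: bool   # did the participant finish the task?
    satisfaction: int      # e.g. a 1-5 post-task rating


def summarize(responses: list[Response]) -> dict:
    """Aggregate raw responses into the metrics used to judge a design phase."""
    n = len(responses)
    completion_rate = sum(r.task_completed for r in responses) / n
    avg_satisfaction = sum(r.satisfaction for r in responses) / n
    return {
        "participants": n,
        "task_completion_rate": round(completion_rate, 2),
        "avg_satisfaction": round(avg_satisfaction, 2),
    }


if __name__ == "__main__":
    sample = [
        Response("p1", True, 4),
        Response("p2", False, 2),
        Response("p3", True, 5),
    ]
    metrics = summarize(sample)
    # Compare against a pre-agreed success criterion, e.g. >= 80% completion.
    print(metrics, "meets target:", metrics["task_completion_rate"] >= 0.8)
```

The same summary can be recomputed on each prototype iteration, which is what lets early signals be compared against the final success metrics once the design is live.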
User Experience Principles for Data-Driven Applications
Summary
User experience principles for data-driven applications focus on creating intuitive and impactful designs by integrating user needs with actionable insights from data. These principles guide teams through every phase of the design process—from gathering initial insights to making data-informed decisions—to align user satisfaction with business goals.
- Start with user behavior: Use data to understand how users interact with your application, identify pain points, and uncover opportunities to create meaningful solutions that meet real needs.
- Combine creativity and data: Balance analytical findings with creative intuition by allowing insights to guide, rather than dictate, design decisions, especially during prototyping and problem-solving phases.
- Prioritize feedback loops: Build tools and interfaces that actively collect user feedback and usage patterns, enabling continuous improvement of both the design and the underlying data product.
I’ve had the chance to work across several #EnterpriseAI initiatives, especially those with human-computer interfaces. Common failures can be attributed broadly to bad design and experience, disjointed workflows, not getting to quality answers quickly, and slow response times, all exacerbated by high compute costs from an under-engineered backend. Here are 10 principles I’ve come to appreciate in designing #AI applications. What are your core principles?

1. DON’T UNDERESTIMATE THE VALUE OF GOOD #UX AND INTUITIVE WORKFLOWS
Design AI to fit how people already work. Don’t make users learn new patterns; embed AI in current business processes and gradually evolve the patterns as the workforce matures. This also builds institutional trust and lowers resistance to adoption.

2. START BY EMBEDDING AI FEATURES IN EXISTING SYSTEMS AND TOOLS
Integrate directly into existing operational systems (CRM, EMR, ERP, etc.) and applications. This minimizes friction, speeds up time-to-value, and reduces training overhead. Avoid standalone apps that add context-switching or friction; using AI should feel seamless and habit-forming. For example, surface AI-suggested next steps directly in Salesforce or Epic, and where possible push AI results into existing collaboration tools like Teams.

3. CONVERGE TO ACCEPTABLE RESPONSES FAST
Most users are accustomed to publicly available AI like #ChatGPT, where they get to an acceptable answer quickly. Enterprise users expect parity or better; anything slower feels broken. Obsess over model quality and fine-tune system prompts for the specific use case, function, and organization.

4. THINK ENTIRE WORK INSTEAD OF USE CASES
Don’t solve just a task; solve the entire function. For example, instead of resume screening, redesign the full talent acquisition journey with AI.

5. ENRICH CONTEXT AND DATA
Use external signals in addition to enterprise data to create better context for the response. For example, append LinkedIn information for a candidate when presenting insights to the recruiter.

6. CREATE SECURITY CONFIDENCE
Design for enterprise-grade data governance and security from the start. That means avoiding rogue AI applications and collaborating with IT. For example, offer centrally governed access to #LLMs through approved enterprise tools instead of letting teams go rogue with public endpoints.

7. IGNORE COSTS AT YOUR OWN PERIL
Design for compute costs, especially if the app has to scale. Start small, but plan for what it will cost at scale.

8. INCLUDE EVALS
Define what “good” looks like and run evals continuously so you can compare different models and course-correct quickly (a minimal sketch follows below this list).

9. DEFINE AND TRACK SUCCESS METRICS RIGOROUSLY
Set and measure quantifiable indicators: hours saved, people not hired, process cycles reduced, adoption levels.

10. MARKET INTERNALLY
Keep promoting the success and adoption of the application internally. Sometimes driving enterprise adoption requires FOMO.

#DigitalTransformation #GenerativeAI #AIatScale #AIUX
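As a rough illustration of principle 8, here is a minimal Python eval harness: a fixed set of cases with a crude keyword-based scoring rule, so two candidate models can be compared on the same yardstick every time something changes. The case set, scoring rule, and stand-in model callables are all hypothetical; in practice the models would wrap real LLM endpoints and the scoring would match the organization's own definition of a quality answer.

```python
# Hypothetical sketch of a continuous-eval harness: score candidate models
# against a fixed case set so regressions and improvements are visible.
from typing import Callable

EVAL_CASES = [
    {"prompt": "Summarize the refund policy in one sentence.",
     "must_contain": ["refund"]},
    {"prompt": "List the three onboarding steps for a new hire.",
     "must_contain": ["onboarding"]},
]


def score(answer: str, must_contain: list[str]) -> float:
    """Crude proxy for 'acceptable answer': required terms are present."""
    hits = sum(term.lower() in answer.lower() for term in must_contain)
    return hits / len(must_contain)


def run_eval(model: Callable[[str], str]) -> float:
    """Average score across all cases; rerun on every model or prompt change."""
    scores = [score(model(case["prompt"]), case["must_contain"]) for case in EVAL_CASES]
    return sum(scores) / len(scores)


if __name__ == "__main__":
    # Stand-in "models" for the sketch; real ones would call LLM endpoints.
    baseline = lambda prompt: "I'm not sure, please check the documentation."
    candidate = lambda prompt: "Refunds are issued within 14 days; onboarding has three steps."
    print("baseline:", run_eval(baseline), "candidate:", run_eval(candidate))
```

Running evals like this on every prompt or model change is also what feeds principle 9: the same scores become trackable success metrics over time.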
Data Products are NOT all code, infrastructure, and business data. Even from a purely technical point of view, a Data Product must also be able to capture human feedback. The user’s insight is technically part of the product and defines the Data Product’s final state and shape. This implies that human action is an integrated part of the Data Product, and it turns out that action is the preliminary building block of feedback. How the user interacts with the product influences how the product develops.

But what is the bridge between Data Products and human actions? It’s a good user interface: one that doesn’t just offer a read-only experience like dashboards (no action, and no way to capture action) but enables the user to interact actively. This bridge is entirely a user-experience (UX) problem. With the goal of enhancing the user’s experience in a way that encourages action, the interface between Data Products and human action must address the following:

How do I find the right data product for my need? A discovery problem, addressed by UX features such as natural language (contextual) search, browsing, and product exploration.

How can I use the product? An accessibility problem, addressed by UX features such as native integrability (interoperability with native stacks), policy granularity (and scalable management of those granules), documentation, and lineage transparency.

How can I use the product with confidence? A deeper accessibility problem: you can’t use data you don’t trust. Addressed by UX features such as quality/SLO overviews and lineage (think contracts), downstream update notifications, and request channels. Note that it’s the data product that enables quality, but the UI that exposes the trust features.

How can I interact with the product and suggest new requirements? A data evolution problem, addressed by UX features such as a logical modelling interface that is easily operable by both adept and non-technical data users.

How do I get an overview of the goals I’m fulfilling with this product? A measurement/attribution problem, addressed by UX features such as global and local metrics trees.

...and so on. You get the picture.

Note that not only active user suggestions but also the user’s usage patterns are recorded, acting as feedback for data product developers and managers. This UI is like a product hub where users actively discover, understand, and leverage data products, while passively enabling product development through consistent feedback loops that the UI manages and feeds into the respective data products.

How have you been solving the UX for your Data Products?
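To make the feedback-loop idea concrete, here is a minimal Python sketch of a UI-side collector that records both explicit suggestions and passive usage events per data product, so product owners can act on them. The event kinds, field names, and `FeedbackHub` class are illustrative assumptions, not a reference to any particular data product platform.

```python
# Hypothetical sketch of the UI-to-data-product feedback loop: record explicit
# suggestions and passive usage events, grouped by data product.
from collections import defaultdict
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class FeedbackEvent:
    product_id: str
    user_id: str
    kind: str                      # e.g. "search", "access_request", "suggestion"
    payload: dict = field(default_factory=dict)
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class FeedbackHub:
    """Collects events emitted by the UI and exposes them per data product."""

    def __init__(self) -> None:
        self._events: defaultdict[str, list[FeedbackEvent]] = defaultdict(list)

    def record(self, event: FeedbackEvent) -> None:
        self._events[event.product_id].append(event)

    def for_product(self, product_id: str) -> list[FeedbackEvent]:
        return list(self._events[product_id])


if __name__ == "__main__":
    hub = FeedbackHub()
    # Passive signal: what the user searched for (discovery).
    hub.record(FeedbackEvent("orders_dp", "u42", "search", {"query": "late deliveries"}))
    # Active signal: a new requirement suggested through the UI (evolution).
    hub.record(FeedbackEvent("orders_dp", "u42", "suggestion",
                             {"text": "add a carrier_id column"}))
    print(len(hub.for_product("orders_dp")), "signals recorded for orders_dp")
```

In a real product hub the same events would be routed back to the owning data product's backlog and metrics trees, which is what turns the UI from a read-only dashboard into the bridge described above.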