Creating a CSR Dashboard

Explore top LinkedIn content from expert professionals.

  • Brent Dykes

    Author of Effective Data Storytelling | Founder + Chief Data Storyteller at AnalyticsHero, LLC | Forbes Contributor

    72,255 followers

    Many #datavisualization, #dashboard, and #datastorytelling mistakes can be traced back to this simple problem: taking a presenter rather than an audience perspective. 🙋🏻

    When designing data charts 📊, are you designing them with the audience in mind? I’ve often found that data communicators expect their audience to see the data from their perspective without evaluating their visuals from the audience’s viewpoint. They assume that what works for them will also work for their audience. This approach can be a recipe for disaster if you don’t know your audience very well.

    Before rushing to present some data, you should learn as much about your audience as possible:

    👉 Knowledge level: How familiar are they with the topic or data?
    👉 Relevance: How relevant or meaningful is your data to them?
    👉 Context: What background information or assumptions are they missing?
    👉 Data literacy: Will they be able to make sense of your charts?

    Once you've gained this understanding, you can design the data charts in a way that makes the most sense for your audience. It's also valuable to ask for feedback from colleagues or audience members beforehand to test your approach and fix potential problems.

    A common excuse I hear from data professionals is that they don’t have time to tailor their content to each audience. While you might not be able to do it every time, it is crucial to do it as much as possible. If you don’t make time to take an audience-centric approach, you will continue to be “busy” without driving meaningful outcomes. This shortsighted mindset makes you vulnerable when leaders begin to question what value you’re providing.

    What has helped you maintain an audience-centric perspective when designing your data charts, dashboards, and data stories?

  • Aishwarya Srinivasan
    595,120 followers

    If you are looking for a roadmap to master data storytelling, this one's for you. Here’s the 12-step framework I use to craft narratives that stick, influence decisions, and scale across teams.

    1. Start with the strategic question
    → Begin with intent, not dashboards
    → Tie your story to a business goal
    → Define the audience - execs, PMs, engineers all need different framing
    → Write down what you expect the data to show

    2. Audit and enrich your data
    → Strong insights come from strong inputs
    → Inventory analytics, LLM logs, synthetic test sets
    → Use GX Cloud or similar tools for freshness and bias checks
    → Enrich with market signals, ESG data, user sentiment

    3. Make your pipeline reproducible
    → If it can’t be refreshed, it won’t scale
    → Version notebooks and data with Git or Delta Lake
    → Track data lineage and metadata
    → Parameterize so you can re-run on demand (see the sketch after this post)

    4. Find the core insight
    → Use EDA and AI copilots (like GPT-4 Turbo via Fireworks AI)
    → Compare to priors - does this challenge existing KPIs?
    → Stress-test to avoid false positives

    5. Build a narrative arc
    → Structure it like Setup, Conflict, Resolution
    → Quantify impact in real terms - time saved, churn reduced
    → Make the product or user the hero, not the chart

    6. Choose the right format
    → A one-pager for execs, a deeper dive for ICs
    → Use dashboards, live boards, or immersive formats when needed
    → Auto-generate alt text and transcripts for accessibility

    7. Design for clarity
    → Use color and layout to guide attention
    → Annotate directly on visuals, avoid clutter
    → Make it dark-mode (if that's a preference) and mobile friendly

    8. Add multimodal context
    → Use LLMs to draft narrative text, then refine
    → Add Looms or audio clips for async teams
    → Tailor insights to different personas - PM vs CFO vs engineer

    9. Be transparent and responsible
    → Surface model or sampling bias
    → Tag data with source, timestamp, and confidence
    → Use differential privacy or synthetic cohorts when needed

    10. Let people explore
    → Add filters, sliders, and what-if scenarios
    → Enable drilldowns from KPIs to raw logs
    → Embed chat-based Q&A with RAG for live feedback

    11. End with action
    → Focus on one clear next step
    → Assign ownership, deadline, and metric
    → Include a quick feedback loop like a micro-survey

    12. Automate the follow-through
    → Schedule refresh jobs and Slack digests
    → Sync insights back into product roadmaps or OKRs
    → Track behavior change post-insight

    My 2 cents 🫰
    → Don’t wait until the end to share your story. The earlier you involve stakeholders, the more aligned and useful your insights become.
    → If your insights only live in dashboards, they’re easy to ignore. Push them into the tools your team already uses: Slack, Notion, Jira (or even put them in your OKRs).
    → If your story doesn’t lead to change, it’s just a report, so be "prescriptive".

    Happy building 💙 Follow me (Aishwarya Srinivasan) for more AI insights!
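Step 3 of the framework above (a reproducible, parameterized pipeline) is the easiest one to make concrete, so here is a minimal Python sketch of what "parameterize so you can re-run on demand" could look like. Nothing in it comes from the original post: the CSV input, the column names, and the `refresh_report` helper are illustrative assumptions.

```python
# Minimal sketch of a parameterized, re-runnable reporting step (illustrative only).
# Assumed inputs: a CSV of events with "date" and "revenue" columns; the paths and
# date parameters are hypothetical, not from the original post.
import argparse
from pathlib import Path

import pandas as pd


def refresh_report(source: Path, start: str, end: str, out: Path) -> None:
    """Load the raw data, filter to the reporting window, and write a monthly summary."""
    df = pd.read_csv(source, parse_dates=["date"])
    window = df[(df["date"] >= start) & (df["date"] <= end)]
    summary = window.groupby(window["date"].dt.to_period("M"))["revenue"].sum()
    summary.to_csv(out)
    print(f"Wrote {len(summary)} monthly rows to {out}")


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Re-runnable reporting refresh")
    parser.add_argument("--source", type=Path, default=Path("events.csv"))
    parser.add_argument("--start", default="2024-01-01")
    parser.add_argument("--end", default="2024-12-31")
    parser.add_argument("--out", type=Path, default=Path("monthly_revenue.csv"))
    args = parser.parse_args()
    refresh_report(args.source, args.start, args.end, args.out)
```

Because the window and paths are flags rather than hard-coded values, the same script can be re-run for any reporting period (for example `--start 2025-01-01 --end 2025-03-31`) without editing the notebook or code.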

  • Bryan Zmijewski

    Started and run ZURB. 2,500+ teams made design work.

    12,261 followers

    Align design decisions with business goals and user needs. What does this mean?

    Aligning business goals with user needs is not a one-time effort; it's an ongoing process that requires regularly involving stakeholders, some of whom will be more engaged than others. To keep everyone aligned, create a regular cadence for reviewing design decisions, using user metrics and feedback as key inputs. Since stakeholders aren't closely tracking user behavior, design decisions should focus on making the feedback easy to understand and use, helping simplify their decision-making process.

    Here’s a dashboard redesign example that explains each point:

    → Purpose of design
    Explain what problem the design solves and why it's important for the business.
    Example: Redesign the dashboard to help clients better manage their supply chain data, making it easier to spot inefficiencies.

    → Impact on business
    Describe how the design will help achieve business goals.
    Example: We want to reduce customer support requests by 15% and increase retention among enterprise clients by improving the dashboard.

    → User metrics
    Identify the data points to measure the design's success. We use Helio.
    Example: Track client sentiment, engagement, and usability of the new dashboard features, and how quickly clients resolve supply chain issues.

    → User feedback
    Share what users said or did that influenced the design.
    Example: Clients reported that the current dashboard is too complex, so we're simplifying the interface to make data easier to access.

    → Risks
    Point out any potential problems and how they will be managed.
    Example: We’ll add an option to toggle between simplified and detailed modes to prevent advanced users from missing in-depth data views.

    → Timeline
    Provide an explicit schedule for when the design will be completed and tested.
    Example: The redesigned dashboard will be ready for beta testing in three weeks, with a full rollout planned for the end of the quarter.

    #productdesign #productdiscovery #userresearch #uxresearch

  • Amanda Makulec

    Data viz design, workshops & keynotes | Author of Dashboards that Deliver | Co-host of Chart Chat | Data Visualization Society Advisory Council

    10,136 followers

    Make mistakes with lower stakes.

    When I'm working with clients, I always advocate for trying out design ideas early and often with the people who will eventually use the dashboard. Why?

    ☑️ A sketch or a quick Figma mock-up takes far less time to put together than a fully baked dashboard with a complete data model. Get feedback on low-fidelity versions to create quicker feedback loops between designer and users.

    ☑️ Early input helps people build a sense of ownership over what's being created. That can translate to greater adoption later, since the dashboard reflects their real needs.

    ☑️ What people say they want, even in the best discovery interviews, can shift when they see those ideas translated into visuals.

    Don't fall into the 'black box' development trap where you disappear for a week or a month and come back with a final product that you think is perfect (without much billable time left on the contract).

    In Dashboards that Deliver, we talk about this through a 'double diamond' approach adapted from the design world: explore lots of needs and then narrow to priorities first (discovery), then brainstorm lots of visual ideas and refine them into a final mockup.

    How do you use prototyping in your #dataviz work? Have you experienced pushback from clients that insist on working 'in the real tool' from the start?

    Andy Cotgreave Steve Wexler Jeffrey Shaffer

  • Thibaut Nyssens 🐣

    Sr. Solutions Engineer @ Atlassian | founding GTM @ Cycle (acq. by Atlassian) | Early-stage GTM Advisor

    8,996 followers

    I talked with 100+ product leaders over the last few months. They all had the same set of problems. Here's the solution (5 steps).

    Every product leader told me at least one of the following:
    "Our feedback is all over the place"
    "PMs have no single source of truth for feedback"
    "We'd like to back our prioritization with customer feedback"

    Here's a step-by-step guide to fix this.

    1/ Where is your most qualitative feedback coming from? What sources do you need to consolidate?
    - Make an exhaustive list of your feedback sources
    - Rank them by quality & importance
    - Find a way to access that data (API, Zapier, Make, scraping, csv exports, ...)

    2/ Route all that feedback to a "database-like" tool, a table of records (a minimal sketch follows after this post)
    Multiple options here: Airtable, Notion, Google Sheets and of course Cycle App
    - Tag feedback with its related properties: source, product area, customer id or email, etc.
    - Match customer properties to the feedback based on a unique customer id or email

    3/ Calibrate an AI model
    Teach the AI the following:
    - What do you want to extract from your raw feedback?
    - What type of feedback is the AI looking at and how should it process it? (an NPS survey should be treated differently than a user interview)
    - What features can be mapped to the relevant quotes inside the raw feedback
    Typically, this won't work out of the box. You need to give your model enough human-verified examples (calibrate it), so it can actually become accurate at finding the right features/discoveries to map. This part is tricky, but without it you'll never be able to process large volumes of feedback and unstructured data.

    4/ Plug a BI tool like Google Data Studio (or similar) on top of your feedback database
    - Start by listing your business questions and build charts answering them
    - Include customer attributes as filters in the dashboard so you can filter on specific customer segments. Not all feedback is equal.
    - Make sure these dashboards are shared with and accessible to the entire product team

    5/ Plug your product delivery on top of this
    At this point, you have a big database full of customer insights and a customer voice dashboard. But it's not actionable yet.
    - You want to convert discoveries into actual Jira epics or Linear projects & issues.
    - You need some notion of "status" sync, otherwise your feedback database won't clean itself and you won't be able to close feedback loops.

    The diagram below gives you a clear overview of how to build your own system.

    Build or buy? Your choice
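As a rough illustration of step 2 (routing everything into one table of records), here is a small Python sketch. The record schema, source names, and sample payloads are invented for the example; a real setup would pull from the APIs, exports, or scrapers listed in step 1 and land the rows in Airtable, Notion, a sheet, or Cycle.

```python
# Illustrative sketch of step 2: normalizing feedback from several sources into one
# table of records. Field names and sample payloads are hypothetical.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

import pandas as pd


@dataclass
class FeedbackRecord:
    source: str          # e.g. "nps_survey", "support_ticket", "user_interview"
    product_area: str    # tag used later for filtering in the BI layer
    customer_id: str     # unique id or email, used to join customer properties
    text: str            # the raw feedback itself
    received_at: str     # ISO timestamp, useful for freshness checks


def normalize(raw_items: list[dict], source: str) -> list[FeedbackRecord]:
    """Map a source-specific payload onto the shared record schema."""
    return [
        FeedbackRecord(
            source=source,
            product_area=item.get("area", "unknown"),
            customer_id=item.get("email", "unknown"),
            text=item["comment"],
            received_at=item.get("ts", datetime.now(timezone.utc).isoformat()),
        )
        for item in raw_items
    ]


# Two fake payloads standing in for real API pulls or CSV exports.
nps = [{"email": "a@acme.com", "comment": "Hard to find the export button", "area": "reports"}]
tickets = [{"email": "b@beta.io", "comment": "Dashboard loads slowly", "area": "performance"}]

records = normalize(nps, "nps_survey") + normalize(tickets, "support_ticket")
feedback_table = pd.DataFrame([asdict(r) for r in records])
print(feedback_table[["source", "product_area", "customer_id", "text"]])
```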

  • Zach Gemignani

    Founder and CEO of Juice Analytics - the company behind Juicebox

    8,155 followers

    “Why won't customers use the dashboard?”

    This question is so common it has become a cliche. Frequently the answer is that the customer-facing reporting or dashboard was not treated like a product. Basic questions were missed in the rush to make the data available: What are my users' pain points? How do we make their lives better?

    The essential elements of a good data product aren’t a huge mystery. But they do take an empathetic, customer-focused perspective that is often lacking. Here are some of my key lessons:

    Lesson 1: Apps, not Dashboards. Multiple, small, focused data products are better than one comprehensive solution that tries to do too much. Many companies launch an “analytics dashboard” or “self-service portal” that is designed to answer any and all questions. Of course it doesn’t, and it ends up more confusing than useful.

    Lesson 2: Form Follows Function. A data product should be delivered and experienced by different audiences in different ways. For example, an executive audience is more interested in summarized insights delivered in static formats (PDF, PPT), whereas analytical audiences may want an interactive, exploratory solution.

    Lesson 3: The Goal is Insights. To paraphrase James Carville, “It’s the Insights, Stupid!” The data, visualizations, dashboard…these are all vehicles to find and deliver useful insights. How are you guiding people to find insights, then share and act on those insights?

    Lesson 4: Lead with Actions. For many years, we designed analytical solutions that assumed users would drill into the data to find the information most relevant to them in the moment. That isn’t always the right starting point. If possible, lead with the To Dos or Actions.

    Lesson 5: The Right Starting Points. Initial settings and personalization are powerful tools in your design toolbox. A remarkable number of data product users (we’ve watched a lot of videos of user behavior) will not click on anything to customize the views of the data.

    Lesson 6: Data Wrapped in Context. A data product needs to do much more than present data. It needs to explain the scope and purpose of the solution, guide the users through the experience, and provide help. Our solutions use images and text to put the data in context.

    Lesson 7: Secondary Audiences. Data products serve more than the direct users. The information in your product needs to travel to secondary audiences who may impact decisions. How can you ensure insights get shared more broadly?

    Lesson 8: Selling is Priority #1. This isn't comfortable territory for many data people. However, as creators of data products, we need to think about how to support the sales team, clarify the value points for customers, and deliver a premium, differentiated product.

    Lesson 9: Iterate on Feedback. A data product should be its least-good version on initial release. As you start to get (paying) customer feedback, you’ll learn more about what customers really want.

  • David Giraldo

    Saved over $500k for clients with 25+ reporting and data analytics solutions | Azure, Power Platform & Fabric Consultant

    6,553 followers

    The client called me because their executive dashboard wasn't getting used.

    "Perfect metrics," they said. "Tracks everything we need."

    I spent 5 minutes with their VP and understood the real problem. The dashboard answered every question except the one that mattered: "What changed?"

    Here's what I found: 47 beautifully designed KPIs. Color-coded. Benchmarked. Completely useless for decision-making.

    Revenue up 8%. From what?
    Customer satisfaction at 4.2/5. Compared to when?
    Support tickets down 12%. Why?

    The VP scrolled through metric after metric, then asked: "But what actually happened this quarter?" The dashboard couldn't tell him.

    This pattern is everywhere: we build metric museums instead of decision tools. Scorecards that track everything and explain nothing. Perfect data visualization that requires 10 minutes of detective work to understand what's going on.

    What I rebuilt instead: dashboards that start with the story, not the score.

    "Revenue jumped 8% this quarter because the enterprise deal finally closed, but new customer acquisition is slowing."
    "Support tickets dropped 12% after we fixed the login bug, but complexity of remaining issues is trending up."

    The KPIs are still there. But they're supporting the narrative, not replacing it.

    The uncomfortable truth: if your dashboard requires analysis to understand what happened, it's not a dashboard. It's a data dump with pretty charts. Your executives don't want to solve puzzles. They want answers.

    PS. Ever rebuilt a report just to answer “what changed?” You’re not alone. Share your experience. I’m building a playbook.
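A toy sketch of the "what changed?" idea: compute the period-over-period delta and pair it with an analyst-supplied reason, so the dashboard leads with a sentence rather than a bare number. The metric names, values, and annotations below are made up; the post itself doesn't prescribe any particular implementation.

```python
# Sketch: turn raw KPI deltas into one-line narratives. All data is hypothetical.
import pandas as pd

# Quarterly metric values (invented for illustration).
metrics = pd.DataFrame(
    {
        "metric": ["revenue", "support_tickets", "new_customers"],
        "prev_quarter": [10.0, 1250, 420],
        "this_quarter": [10.8, 1100, 395],
    }
)

# Analyst-supplied context: the "why" that a chart alone cannot compute.
context = {
    "revenue": "the enterprise deal finally closed",
    "support_tickets": "we fixed the login bug",
    "new_customers": "acquisition is slowing",
}

metrics["pct_change"] = (
    (metrics["this_quarter"] - metrics["prev_quarter"]) / metrics["prev_quarter"] * 100
)

for row in metrics.itertuples():
    direction = "up" if row.pct_change >= 0 else "down"
    reason = context.get(row.metric, "cause not yet annotated")
    print(f"{row.metric}: {direction} {abs(row.pct_change):.0f}% this quarter because {reason}.")
```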

  • Hardik C.

    Data presentations, 10x faster with AI agents

    10,259 followers

    Given that many of you pinged me asking for more content on `Metrics dump slides` and `Perfect dashboard`, let me state the most obvious but often overlooked principle for solving information overload (from my personal example).

    💡 Easiest 30-second tip that made my executive dashboard actually drive conversations: start with the story, not the stats.

    ➡️ Before: 'Monthly Usage Dashboard'
    - 15 metrics (most never discussed)
    - 5 charts (too complex to interpret quickly)
    - 3 tables (rarely referenced in meetings)
    🤯 Result: Information overload, no decisions made

    ➡️ After: 'How Product Launch X Drove 23% Growth'
    - One hero metric: 23% MoM increase in Metric Y
    - One main supporting chart: 47% conversion rate improvement
    - Clear narrative as the cause: Feature Y adoption rate correlation
    ✨ Result: Instant clarity; 5-minute discussion → concrete next steps

    🏎️ Quick implementation (a minimal sketch follows after this post):
    1. Write HEADLINES, not labels
    2. Pick ONE hero metric that matters the most TODAY
    3. Support with max 2-3 stats that explain the WHY

    Try it today. Watch engagement soar. Reply '💡' if you want more dashboard psychology tips.

    #Analytics #Dashboards #MagicBI #DataStorytelling
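To make the three implementation steps concrete, here is a small, hypothetical Python sketch: rank candidate metrics by relative change, promote the biggest mover to the hero slot, and cap the supporting stats at three. The metric names and numbers are invented, not taken from the post.

```python
# Sketch of steps 1-3: one hero metric, a headline, and at most three supporting stats.
from dataclasses import dataclass


@dataclass
class Metric:
    name: str
    previous: float
    current: float


def pct_change(m: Metric) -> float:
    """Relative change from the previous to the current period, in percent."""
    return (m.current - m.previous) / m.previous * 100


# Hypothetical candidates for this month's update.
candidates = [
    Metric("Monthly active users", 41_000, 50_430),
    Metric("Conversion rate (%)", 3.4, 5.0),
    Metric("Avg. session length (min)", 6.1, 6.3),
    Metric("Support tickets", 980, 1010),
]

# Step 2: the hero metric is the biggest mover right now.
hero = max(candidates, key=lambda m: abs(pct_change(m)))

# Step 1: a headline, not a label.
print(f"HEADLINE: {hero.name} moved {pct_change(hero):+.0f}% this month")

# Step 3: cap the supporting stats at three.
for m in [c for c in candidates if c is not hero][:3]:
    print(f"  supporting: {m.name}: {m.previous:g} -> {m.current:g} ({pct_change(m):+.1f}%)")
```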
