Data doesn't give definitive answers. This reality has become starkly apparent during my years in tech. I've watched skilled engineers and analysts present opposing conclusions using the same datasets. These weren't technical misunderstandings; they reflected a more profound challenge in how we approach data-driven decisions.

In countless meetings, data transformed from a discovery tool into a shield for existing beliefs. A product manager would highlight engagement metrics supporting feature expansion, while engineering would emphasize the same dataset's performance implications. Both analyses were technically sound. Both missed the larger picture.

Something shifted when we started each analysis by examining our assumptions. Instead of asking, "What does the data say?" we began with, "Why are we analyzing this specific data in this specific way?"

Three insights shaped my perspective:

First, strong analyses start by acknowledging what we don't know. Our most productive conversations began with clear statements of our assumptions and limitations.

Second, data serves us better as a tool for questioning than for answering. Understanding the context and constraints of our analysis matters more than statistical significance.

Third, embracing ambiguity leads to better decisions than forcing false certainty. The most impactful outcomes emerged when we combined robust analysis with clear principles and nuanced judgment.

I've seen too many organizations chase the illusion of purely data-driven decisions. The reality? Data informs rather than determines. It guides rather than dictates.

For those building data-informed teams: How do you handle decisions when your data presents multiple valid interpretations? What practices help you recognize and challenge your own analytical assumptions?
How Data-Driven Decisions Can Mislead
Explore top LinkedIn content from expert professionals.
Summary
Data-driven decisions can sometimes lead organizations astray when the data is misinterpreted, incomplete, or used without context. While data provides valuable insights, understanding its limitations and recognizing the role of human judgment is essential for making informed decisions.
- Question your assumptions: Always define the purpose behind analyzing specific data and consider the assumptions you are making to avoid biased or incomplete conclusions.
- Don't confuse correlation with causation: two metrics can move together without one driving the other, so investigate the underlying relationships in your data before drawing causal conclusions.
- Balance data with judgment: Use data as a guide, but factor in human context, emotions, and nuanced judgment to make sound decisions that reflect the bigger picture.
-
Most decisions don't fail from lack of data. They fail from misreading it.

One of the most common mistakes I see, especially in product reviews and investment decisions, is confusing correlation with causation. It's a simple concept. In fact, my 8-year-old daughter told me, just as she saw me writing this, that she learned it in 3rd grade. And yet I still see it misused often. A metric moves. A team gets credit. Budgets shift. But often the data shows correlation, not causation, and that distinction matters.

One famous example comes from eBay. They believed paid search was driving substantial revenue because of the volume of clicks coming from Google Search, until they ran controlled experiments. When the ads were turned off, total clicks remained mostly unchanged; they simply shifted from search ads to organic search results, and sales barely changed. The result: millions in marketing spend that had little to no impact.

Data doesn't make decisions. People do. And great leaders know the difference between what data shows and what it means.
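A quick way to see the eBay effect in numbers: the minimal sketch below (entirely hypothetical data and coefficients, not eBay's actual figures) simulates regions where latent organic demand drives both ad clicks and sales. Clicks correlate strongly with sales, yet a randomized holdout shows the ads add almost nothing.

```python
import numpy as np

rng = np.random.default_rng(42)
n_regions = 200

# Latent organic demand drives BOTH ad clicks and sales.
demand = rng.normal(100, 20, n_regions)
# Randomized holdout: ads run in roughly half the regions.
ads_on = rng.random(n_regions) < 0.5
clicks = 0.8 * demand * ads_on + rng.normal(0, 5, n_regions)
# The true incremental effect of a click is tiny (0.05 sales per click).
sales = 2.0 * demand + 0.05 * clicks + rng.normal(0, 10, n_regions)

# Naive view: among regions where ads run, clicks and sales look tightly linked.
corr = np.corrcoef(clicks[ads_on], sales[ads_on])[0, 1]
print(f"corr(clicks, sales) where ads run: {corr:.2f}")

# Controlled view: randomization balances demand across groups, so the
# difference in group means estimates the causal lift from the ads.
lift = sales[ads_on].mean() - sales[~ads_on].mean()
print(f"incremental sales per region from ads: {lift:.1f} (average sales ~{sales.mean():.0f})")
```

In this simulation the correlation is high while the measured lift is small relative to average sales, the same pattern the holdout experiment exposed.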
-
One reason I resist being data-driven: data is almost always measuring a specific thing. But most times in business, what we need to understand is the "gestalt": the whole picture, with all its complexity, moving parts, and unintended interactions. And the data we can get is often highly misleading for that kind of understanding.

Relying too much on the data that's easy to get, and ignoring the rest of it (the gestalt, or "something that is made of many parts and yet is somehow more than or different from the combination of its parts"), can lead to the McNamara Fallacy, which in turn leads to failures, often catastrophic ones.

"When the McNamara discipline is applied too literally, the first step is to measure whatever can be easily measured. The second step is to disregard that which can't easily be measured or given a quantitative value. The third step is to presume that what can't be measured easily really isn't important. The fourth step is to say that what can't be easily measured really doesn't exist. This is suicide." (From the Wikipedia article on the McNamara Fallacy.)
-
A model with worse MAPE made better business decisions.

I ran two Marketing Mix Models (MMM) on simulated data where I knew the true ROAS, and found that traditional accuracy metrics can actually mislead budget decisions. 🔍 The version with less informed priors had better MAPE and MAE but completely failed at ranking ROAS correctly across channels (a critical mistake for budget allocation), while the version with more informed priors correctly rank-ordered the media channels.

The challenge? Accuracy in total bookings ≠ correctly allocating incremental impact across media channels. A model that predicts total sales well can still mislead businesses into suboptimal spending decisions.

In my latest video, I break down:
✅ Why traditional error metrics can mislead MMM evaluations
✅ How adding informed priors improved ROAS rank ordering, despite worse MAPE
✅ Why business decision accuracy matters more than pure prediction accuracy

🎥 Watch here: https://lnkd.in/e9egnkPk

How do you evaluate your marketing models beyond just prediction accuracy? Would love to hear your thoughts! 👇

#datascience #marketinganalytics #MMM #causalinference #bayesianstatistics #marketingmixmodel
Data Science Tutorial and Real-world problem: Biggest Challenge in Evaluating Marketing Mix Models
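To make the distinction concrete, here is a minimal sketch (invented channel names and numbers, not the figures from the video) that scores two hypothetical MMM variants on both prediction error and decision quality, using Spearman rank correlation between estimated and true channel ROAS:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical channel ROAS values, for illustration only.
true_roas = np.array([3.0, 2.0, 1.5, 0.8])   # e.g., TV, Search, Social, Display
est_a = np.array([1.6, 2.9, 0.9, 2.1])       # model A: scrambled channel ranking
est_b = np.array([4.1, 2.8, 2.2, 1.2])       # model B: inflated levels, correct ranking

# Pretend weekly bookings: model A happens to predict totals more accurately.
actual = np.array([100.0, 110.0, 95.0, 120.0, 105.0])
pred_a = actual * np.array([1.02, 0.97, 1.01, 1.02, 0.99])
pred_b = actual * np.array([1.08, 0.91, 1.07, 1.10, 0.94])

def mape(y, yhat):
    """Mean absolute percentage error, in percent."""
    return np.mean(np.abs((y - yhat) / y)) * 100

for name, pred, est in [("A", pred_a, est_a), ("B", pred_b, est_b)]:
    rho, _ = spearmanr(true_roas, est)  # decision metric: does the ROAS ranking survive?
    print(f"model {name}: MAPE={mape(actual, pred):.1f}%  ROAS rank corr={rho:+.2f}")

# Model A wins on MAPE yet would misallocate budget; model B is the one to act on.
```

The point of the sketch is that the evaluation metric should match the decision: if the model's job is to allocate budget, score it on how well it ranks channels, not only on how well it predicts totals.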
-
Do You Understand the Data Analytics from Your AI Tool?

Data analytics driven by AI tools is transforming industries. But are we using these tools effectively? I often sense a disconnect when speaking to business owners. They create dashboards and reports with ease. Yet there's a crucial element often overlooked: a data expert is essential for using AI analytics correctly.

AI tools are valuable, but they have limitations:
• They speed up data cleansing.
• They bridge gaps between datasets.
• They generate predictive visuals quickly.

However, we must ask ourselves:
• When did we last validate our assumptions?
• Are the models still relevant to current conditions?

Let's consider a real-world example: Zillow's iBuying program aimed to revolutionize real estate. They relied heavily on AI for home valuations. Initially, it seemed promising. But Zillow's algorithms misjudged market dynamics. As a result, they significantly overpaid for homes, which led to over $1 billion in losses.

Zillow's failure highlights a critical lesson: even with vast data, AI can misfire without human insight. A data expert could have identified these flaws early on. AI tools require constant monitoring and adjustment. Don't let algorithms run unchecked in your business.

Combine AI power with human expertise for better outcomes. Regularly reassess your models and assumptions; that is the key to informed decision-making. A simple drift check like the sketch below is one place to start.

Are you leveraging both AI and human insight effectively? Your experiences could help others avoid pitfalls! Got questions? Let's discuss in the comments below!

#PostItStatistics #DataScience #ai
Follow Dr. Kruti Lehenbauer or Analytics TX, LLC
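In the spirit of "regularly reassess your models," here is a minimal sketch of the kind of ongoing check the post argues for: a hypothetical drift alert that compares a model's recent error against its historical baseline. All names, numbers, and thresholds are invented for illustration; this is not Zillow's system.

```python
import numpy as np

def rolling_error_alert(actual, predicted, window=30, tolerance=1.5):
    """Flag when a model's recent error drifts above its historical baseline.

    Crude health check: compare mean absolute percentage error (MAPE)
    over the latest `window` points against the MAPE of everything before.
    """
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ape = np.abs((actual - predicted) / actual) * 100
    baseline, recent = ape[:-window].mean(), ape[-window:].mean()
    return baseline, recent, recent > tolerance * baseline

# Hypothetical usage: a valuation model whose market shifted midway through.
rng = np.random.default_rng(0)
actual = rng.normal(300, 30, 200)                 # e.g., realized sale prices ($k)
predicted = actual * rng.normal(1.0, 0.05, 200)   # model tracks well at first...
predicted[-30:] = actual[-30:] * 1.15             # ...then systematically overpays by 15%

baseline, recent, drifting = rolling_error_alert(actual, predicted)
print(f"baseline MAPE {baseline:.1f}%, recent MAPE {recent:.1f}%, drift alert: {drifting}")
```

A check this simple would not have saved Zillow on its own, but it illustrates the habit: a human looks at the alert, asks whether conditions have changed, and decides whether the model still deserves trust.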
-
After more than 25 years in market research, I've learned that a single poorly worded survey question can mislead teams and compromise decision-making.

One of my most memorable examples was a client that had built a prototype of a device to track and monitor driving and wanted to target parents with teenage drivers. This was their question:

"With 8% of all fatal crashes occurring among drivers ages 15 to 20, motor vehicle deaths are the second-leading cause of death for that age group. We know your child's safety is of utmost importance, and you are willing to do whatever you can to keep them safe. How likely would you be to install a device in your car to track and monitor your teenage driver?"

I told them that question would guilt many parents into selecting a positive rating, but it would not give them an accurate, unbiased estimate of market potential. Here's the wording they finally agreed to:

"A manufacturer has created a device that tracks a driver's behavior (e.g., speeding, slamming on the brakes) and their location. It allows a user to set boundaries for where a car can be driven and be notified if the boundaries are crossed. It also allows a user to talk to the driver while they are on the road. How likely would you be to install a device with those capabilities to monitor your teenage driver?"

The results were not very favorable, which upset the client but also prevented them from making an expensive mistake.

#MarketResearch #SurveyDesign #DataDrivenDecisions
-
We shouldn't always take data at face value. That was my takeaway from listening to 13-year NHLer Dominic Moore on a recent panel about the influence data has had on hockey. Dominic raised an important point we can all apply to business.

Here's what he shared: early in his career, coaches would get on him for having fewer blocked shots than they perceived he should have, but he viewed it a different way. In his mind, a lack of blocked shots meant he was in a better position to start with. His point: if players are in the right position to begin with, they shouldn't need to block shots.

This kind of thinking matters now that data is shared so widely and quickly. I'm as data-driven as they come, but at times we need to step back and think critically about the data we're consuming. Surface-level data intended to help can often carry underlying implications we're missing.

Data-driven decision-making is the way, but don't discount your human intuition.

#sportsbiz #linkedinsports #dataanalytics
-
Most of the data content I see on LinkedIn falls into one of the categories below:
- How to become a data analyst
- How to become a better data analyst
- Here's why you should learn "X" over "Y"
- Here's how I became a data analyst
- Here's what I do as a data analyst

Heck, most of my stories fall into these too. But two weeks ago, I read a post from Syed Al Mahiyan that made me think, "I rarely see people actively writing about HOW to make data-driven decisions. We know why it matters, but few people talk about how to actually do it." I told myself I'd think deeper on the issue, but after becoming busy with work and life, I forgot about the matter... until yesterday, when Kelly Adams wrote a post that made me revisit it.

So I dove into my "secret knowledge safe" in search of an idea to explain HOW we can use data to make decisions, and I found this quote from Morgan Housel's book The Psychology of Money: "Spreadsheets are good at telling you when the numbers do or don't add up. They're not good at modeling how you'll feel when you tuck your kids in at night wondering if the investments you've made were a mistake that will hurt their future."

While the book is about the finance world, I see a lot of value for the data world here. No matter how much data you look at, you have to remember that we are humans... with emotions... making decisions. Keyword = EMOTIONS.

I don't care how great you are at "putting your emotions to the side." No decision is ever 100% data-driven. If a human (not AI) has the final say, there will be inevitable bias. Things like:
- How happy you are in the moment
- How high the stakes are
- How risk-averse you are at the moment
- How similar (yet independent) decisions benefited you in the past

The data you analyze only tells you half of the story: the logical side. Don't forget to think about the emotional side. Every decision has one, so it shouldn't be ignored.

P.S. I'll keep thinking about the idea of "HOW to make decisions with data." I realize now this post steered more toward the "how NOT to make decisions with data" side, haha.