Cultural Shifts Affecting Trust in Experts

Explore top LinkedIn content from expert professionals.

Summary

Cultural shifts affecting trust in experts refer to changes in society that make people more skeptical about the authority and credibility of specialists, often leading to a preference for personal beliefs, influencers, or familiar figures over traditional expert advice. These shifts are fueled by factors like political polarization, social media, and a growing tendency to question established knowledge.

  • Promote transparency: Be open about the limits and uncertainties of expert knowledge to build credibility and encourage genuine dialogue.
  • Encourage critical thinking: Help people learn how to evaluate information sources and distinguish between genuine expertise and confident misinformation.
  • Model openness: Show a willingness to adapt decisions based on new evidence or insights, demonstrating that changing your mind can be a strength, not a weakness.
Summarized by AI based on LinkedIn member posts

  • Timothy Allen, MD, PhD

    EVP, Safety/Clinical- Chief Medical Officer/ Delivery Strategy Biopharmaceuticals | Hem. Onc. with IO/CGT subFellowship. Former FDA/ODAC. “Once a Marine, always a Marine.” USMC.🇺🇸

    4,796 followers

Once the Most Trusted, Now Among the Most Doubted: Why Americans Turned on Science—And How We Rebuild It

    Physicians and scientists were once among the most trusted figures in America. Families turned to them for reassurance, from childhood vaccines to life-saving surgeries. Scientists were celebrated for conquering polio, landing on the moon, and advancing cancer therapies. Today, Pew Research (2023) shows only 29% of Americans have “a great deal of confidence” in medical scientists—a collapse that should alarm us all.

    Where did we go wrong?

    Politics drowned out evidence. During COVID-19, a mask became a political symbol. Vaccines, once embraced as collective salvation in the 1950s, were now partisan choices. Physicians weren’t just advising—they were accused of campaigning.

    Greed cast its shadow. The opioid epidemic killed over 500,000 people (CDC, 1999–2019). It wasn’t science that failed—it was marketing that corrupted it. The profession carried the stigma of profit-driven harm.

    The internet rewired trust. False health claims spread six times faster than corrections (Yale, 2021). A parent Googling vaccines often sees conspiracy videos before a single CDC link. Doctors now compete with influencers and algorithms.

    The system itself broke faith. Nearly two-thirds of U.S. bankruptcies involve medical debt (Am J Med, 2019). A physician’s advice to “fill the prescription” means little when the bill is higher than rent.

    Culture shifted. “My truth” began to outweigh data. A cardiologist may cite decades of trial evidence, yet a patient insists on an herbal TikTok cure.

    And the consequences are painfully human:
    • A 40-year-old woman in the ICU told her doctor before intubation: “I wish I had listened.” She refused vaccination after reading misinformation.
    • In Appalachia, patients refused chemotherapy, convinced “Big Pharma profits” outweighed survival. Several died of treatable disease.
    • A father in Texas delayed taking his child to the ER for appendicitis, fearing “they’ll bankrupt us.” By arrival, the appendix had ruptured.
    • A young mother launched a GoFundMe to pay for insulin—not because science failed, but because the system did.

    How do we rebuild trust?
    • Transparency: Admit limits—false certainty destroys credibility.
    • Equity: Mobile vaccine clinics rebuilt faith in underserved areas.
    • Engagement: Oncology trials with patient boards (ASCO, 2022) improved recruitment and confidence.
    • Accountability: Purdue Pharma’s $8B opioid settlement must be the rule, not the exception.
    • Humanization: Patients who felt listened to were three times more likely to follow treatment (JAMA, 2017).

    Medicine has never been more advanced, yet trust has never been more fragile. Rebuilding it requires humility, accountability, and above all, remembering patients are not consumers but partners. Because when trust breaks, science cannot move society forward—and the cost is measured in lives.

  • Allyn Bailey

    Talent Futurist + Transformation Leader + Experience Designer + Brand Builder + Keynote Speaker + TA / HR Tech Strategic Advisor

    10,715 followers

Let’s talk about something that’s shifting under all of us: expertise. Not whether it matters, but how we define it, spot it, and trust it. Because the idea of what makes someone an “expert” has changed. And the shift hasn’t been subtle.

    For years, expertise was tethered to tenure. You earned your stripes by staying in the game. Time served. Titles held. Systems navigated. Authority was about proximity to power, not always about perspective. If you were young, you waited your turn. If you had fresh insight but not enough years behind you, good luck getting the room to listen. I know this, because I lived it. I was doing high-impact work in my 20s, and still had to bring a “translator” with a senior title to get people to take it seriously.

    Then things started shifting. Millennials questioned the model. Gen Z rebuilt the platform. Now we’re in an era where expertise is more democratized, more visible, and also more vulnerable.

    Social platforms and AI have accelerated this shift. Now, if you know how to speak in headlines or prompts, you can look like an authority, instantly. Confidence is often mistaken for credibility. And a viral take can outrank a decade of lived expertise.

    But here’s the tension: That democratization isn’t inherently bad. Some of the sharpest thinking I’ve seen lately has come from people early in their careers. We all know that wisdom doesn’t magically arrive with a 20-year work anniversary. The problem isn’t who is speaking. It’s how we’re evaluating what’s worth listening to.

    So if tenure isn’t enough… If clicks aren’t enough… What does real expertise look like now? It’s not just about being right. It’s about being useful. It’s about discernment. Pattern recognition. Knowing when to speak and when to listen. Understanding the risk behind the recommendation.

    That’s the kind of expertise we should be elevating now—at every level, from every generation. Because in a world where AI can mimic knowledge and platforms reward noise, real authority won’t be about who talks first or loudest. It’ll be about who adds value when it matters most.

  • Kyrtin Atreides

    COO | Cognitive Architecture & Cognitive Bias Researcher | Co-founder

    6,813 followers

A cultural phenomenon I’ve noticed increasingly of late is the default tendency for people with an amateur or no understanding of a domain to treat someone lecturing on the subject as though they held equal footing. This was previously observable in the “post-truth” cultural tendency for people to naively assume that their personal beliefs held as much water as a mountain of scientific evidence, though it now seems increasingly more like people just assume that the mountain doesn’t exist and that all things are opinion and subjective.

    I’ve held conversations in groups where such an individual would appear, and though otherwise seemingly intelligent and possessed of some kinds of knowledge, they filled in all gaps in their knowledge, however severe, with opinion and/or disinformation, and danced to this tune of pretending that the mountain doesn’t exist. Such individuals are blind to the mountain and blind to their blindness, yet their confidence seems to be increasing, perhaps as an artifact of cultural shifts.

    This explosion of confidence can in part be traced back to various “Disinformation Brokers”: influencers who are paid or otherwise make a living by spreading disinformation far and wide. Such individuals spread and repeat overtly fraudulent claims like the popular “We don’t really know what LLMs are doing”, along with unfalsifiable “leading questions” of the kind used in social engineering, such as “What if they just reason differently?”. This leads an increasing number of amateurs and enthusiasts (and investors or their advisors) to walk around thinking that they understand something while confidently ignoring mountains of scientific evidence and all that depends upon it.

    Science is the process of climbing that mountain to “stand on the shoulders of giants”, adding our own stones to the top, and growing humanity’s sum of knowledge. The disinformation brokers persuade those standing at the base of the mountain that they’re already at the top, and that all above them remains unknown territory, full of possibilities (hype). In a way, this mirrors “Theory-induced Blindness”, but the blindness being induced here is malevolent and adversarial.

    Stranger still, because the Disinformation Brokers are treated as if they were experts in the field, many of those with greater expertise, who might otherwise have learned better, end up repeating the demonstrably false claims that they propagate, eroding the effective expertise of many in the field. Over time, this can and has “un-grounded” now-former experts in AI, in a manner not entirely dissimilar to data poisoning in AI models. However robust or antifragile people may be, no expert is wholly immune to being bombarded with disinformation, and the greater their exposure per time interval, the greater their risk of poisoning. In AI models, poisoning risks “Model Collapse”; we now face a similar risk from the humans talking about AI, which we may call “Competence Collapse”.

  • Colin Hardie

    Enterprise Data & AI Officer @ SEFE | Data & AI Strategy, Governance, Architecture

    7,809 followers

Building on my previous posts about cultivating data champions and managing organisational change, I want to explore a fundamental challenge that every data leader faces: understanding why intelligent, experienced professionals sometimes reject or ignore compelling data insights. This is rarely a matter of technical competence or data quality; it's about the psychological barriers that prevent even the most rational decision-makers from embracing data-driven approaches when they conflict with established beliefs or comfortable practices.

    Experienced professionals have built their careers on pattern recognition and intuitive judgement. This expertise becomes a double-edged sword when data suggests different conclusions. The more successful someone has been using traditional decision-making approaches, the harder it becomes to trust alternative methods. Senior managers often feel that accepting data-driven recommendations diminishes their value or expertise. This creates resistance that has nothing to do with the quality of the analysis and everything to do with professional identity and perceived relevance. The same issue can be seen in resistance to the adoption of AI.

    Another issue is that when presented with comprehensive data analysis, decision-makers sometimes become overwhelmed by the volume of information rather than empowered by it. Too much detail can create paralysis rather than clarity, leading people to revert to familiar decision-making patterns. The challenge for data leaders is finding the level of detail that provides sufficient confidence without overwhelming the decision-maker. This balance varies significantly between individuals and situations.

    Confirmation bias plays a significant role in how data insights are received. Decision-makers naturally focus on information that supports their existing beliefs whilst questioning or dismissing contradictory evidence. This isn't deliberate obstinacy; it's normal human psychology. Similarly, the sunk cost fallacy can make it emotionally difficult to accept data that suggests changing course on established initiatives or investments, even when the evidence is compelling.

    Creating an environment where data can challenge assumptions requires psychological safety. Decision-makers need to feel confident that changing their minds based on new evidence will be seen as strength rather than weakness. This cultural shift often requires leadership to model the behaviour, demonstrating how they use data to refine or change their own decisions without losing credibility or authority.

    Understanding these psychological dynamics is essential for data leaders who want to drive genuine organisational change rather than simply provide reports that gather digital dust. How do you address psychological resistance to data-driven insights in your organisation?

    #DataTransformation #DataStrategy #Leadership #Management #Innovation #AI

  • Meg Kendall

    Head of Strategy @ The Climate Hub | Climate Tech + Carbon Markets

    4,928 followers

The war on expertise is in full swing. And sustainability communicators are caught in the crossfire.

    When 'gut instinct' is glorified over fact, and 'real Americans' (whatever that means) outrank scientists, it becomes clear that expertise is no longer a trusted asset — but a political fault line.

    We see it everywhere:
    → ‘Common sense solutions’ — a euphemism for gut-instinct policies that ignore data
    → ‘The real America’ — a way to discredit urban, diverse, and educated populations
    → ‘The people versus the elites’ — a manufactured cultural divide that pits expertise against everyday experience

    And this isn't just a right-wing problem. The 'left' dismisses perspectives outside its own ideological bubble, too — effectively creating echo chambers that are just as rigid as the populist rejection of expertise. It alienates people, fuels resentment, and keeps climate solutions siloed among the already convinced.

    This tension between academia and the 'real world' is shaping our politics and our ability to solve global crises. And not in a good way. We're in a trust crisis that’s making effective sustainability communication nearly impossible.

    🚫 What are the forces driving modern anti-intellectualism?
    📈 What makes it such a powerful political strategy?
    🫣 Has climate skepticism morphed into outright hostility?

    If nobody trusts the experts anymore, how do we get the truth across?

    https://lnkd.in/euUJE4Ap
