Public Distrust in Data Handling by Companies

Explore top LinkedIn content from expert professionals.

Summary

Public distrust in data handling by companies refers to people’s growing concern that businesses are not keeping their personal information safe or using it responsibly, especially as more data is collected through digital channels. This mistrust can lead to skepticism about personalization, fear of privacy violations, and reluctance to engage with brands.

  • Communicate transparently: Clearly explain to customers what data you collect, how you use it, and why it matters, using plain language that anyone can understand.
  • Prioritize user consent: Always obtain clear and informed permission before collecting, sharing, or using personal information, and honor customers' choices throughout the process.
  • Align data use with purpose: Make sure your handling of customer data is strictly limited to the intended service or benefit and avoid selling or repurposing information for unrelated goals. (A minimal code sketch of these points follows below.)
Summarized by AI based on LinkedIn member posts
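
The consent and purpose-limitation advice above translates naturally into code. The sketch below is illustrative only, assuming a Python setting and hypothetical purpose names: a minimal consent ledger that records what each user agreed to, honors revocation, and refuses any use of data for a purpose the user never granted.

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Records user consent per purpose and gates data use on it."""

    def __init__(self):
        # (user_id, purpose) -> timestamp of the consent grant
        self._grants = {}

    def grant(self, user_id, purpose):
        """Record informed consent for one specific purpose."""
        self._grants[(user_id, purpose)] = datetime.now(timezone.utc)

    def revoke(self, user_id, purpose):
        """Honor the user's choice to withdraw consent at any time."""
        self._grants.pop((user_id, purpose), None)

    def allowed(self, user_id, purpose):
        """Data may be used only for purposes the user consented to."""
        return (user_id, purpose) in self._grants

ledger = ConsentLedger()
ledger.grant("u123", "order_updates")
assert ledger.allowed("u123", "order_updates")
assert not ledger.allowed("u123", "ad_targeting")  # never consented: blocked
```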
  • Dr. Andrée Bates

    Chairman/Founder/CEO @ Eularis | AI Pharma Expert, Keynote Speaker | Neuroscientist | Our pharma clients achieve measurable exponential growth in efficiency and revenue from leveraging AI | Investor

    26,621 followers

    80% of people believe the privacy risks outweigh the benefits of personalized marketing. Yet they still crave relevant, tailored experiences. This is the paradox keeping pharma and healthcare marketers awake at night.

    As someone who's spent years navigating the intersection of privacy and personalization in healthcare, I can tell you: we're facing an "arms race" between customer expectations and privacy concerns, and traditional approaches are failing both sides.

    The numbers are sobering:
    ▪️ 2/3 of Americans believe they can't go through life without being tracked
    ▪️ 79% don't trust companies to be responsible with their data
    ▪️ 75% don't believe governments will hold companies accountable

    But here's what's really concerning me: some healthcare marketers are still operating like it's 2010. While third-party cookies disappear and privacy laws evolve daily, some are still using ethically questionable tactics like fingerprinting, the exact behaviour driving consumer distrust.

    The path forward isn't choosing between privacy OR personalization. It's building harmony between both. After working with dozens of pharma companies on this challenge, I've identified what actually works:
    🎯 Customer-centric privacy: ask "Does this benefit THEM directly?", not just "Can we legally do this?"
    🔒 Context-appropriate silos: practitioner data stays separate from patient data unless absolutely necessary
    🏗️ Privacy by design: build protection into every process, not as an afterthought
    ⚖️ Transparent consent: simple, complete explanations of data use, with no legal jargon

    The companies getting this right aren't just avoiding regulatory headaches. They're building deeper trust, higher engagement, and ultimately better patient outcomes.

    Remember: in healthcare, we're not just marketers; we're stewards of some of the most sensitive information people will ever share. That's not a burden; it's a competitive advantage when done right.

    Privacy laws will keep changing. Customer expectations will keep evolving. But the principle remains constant: put the customer's interests at the heart of every data decision.

    How is your organization balancing personalization with privacy? I'd love to hear your strategies and challenges.

    #HealthcareMarketing #DataPrivacy #PatientTrust #Personalization
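
    To make the "context-appropriate silos" point concrete, here is a minimal sketch, assuming a Python setting and hypothetical record shapes (none of this comes from the post itself): patient and practitioner records live in separate stores, and any cross-silo read must carry an explicit justification, which is logged.

    ```python
    from enum import Enum

    class Silo(Enum):
        PATIENT = "patient"
        PRACTITIONER = "practitioner"

    class SiloedStore:
        """Keeps patient and practitioner records in separate stores."""

        def __init__(self):
            self._stores = {Silo.PATIENT: {}, Silo.PRACTITIONER: {}}
            self._audit_log = []  # one entry per cross-silo read

        def put(self, silo, record_id, record):
            self._stores[silo][record_id] = record

        def get(self, caller_silo, target_silo, record_id, justification=None):
            # Reads within the caller's own silo are routine; crossing silos
            # requires an explicit, logged justification ("unless absolutely
            # necessary", in the post's words).
            if caller_silo is not target_silo:
                if not justification:
                    raise PermissionError("cross-silo access requires a justification")
                self._audit_log.append((caller_silo, target_silo, record_id, justification))
            return self._stores[target_silo][record_id]

    store = SiloedStore()
    store.put(Silo.PATIENT, "p1", {"condition": "asthma"})
    store.get(Silo.PATIENT, Silo.PATIENT, "p1")        # same-silo read: allowed
    # store.get(Silo.PRACTITIONER, Silo.PATIENT, "p1") # raises PermissionError
    ```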

  • Tiago Dias

    Founder, Product & CEO Unlockit | Applying Blockchain to PropTech & GovTech for Transparency & Compliance | Bridging Innovation & Public Trust | Step by Step

    25,520 followers

    Why Data Privacy Isn't Just a Buzzword. It's the Future of Trust!

    In today's digital age, trust and privacy are no longer optional; they're foundational. Trust is built on belief: believing that an organization is reliable, ethical, and acting in your best interest. Privacy, meanwhile, is about control: control over your personal information and how it's used. Together, they form the bedrock of meaningful relationships between businesses and consumers.

    But here's the catch: transparency is the glue that holds it all together.

    Transparency isn't just a trend; it's a necessity. It means being clear, concise, and accessible when communicating with customers. It means breaking down complex topics like data collection and usage into language everyone can understand. And most importantly, it means answering three critical questions for every consumer:
    - What data are you collecting?
    - How are you using it?
    - Why do you need it?

    Recent findings from a Cisco survey underscore this point:
    - 81% of respondents believe how a business handles personal data reflects its respect for customers.
    - Yet 43% feel companies aren't doing enough to safeguard their data online.
    - A staggering 76% are dissatisfied with current data transparency policies and practices.

    Why the disconnect? Because people crave clarity. Without it, uncertainty breeds mistrust, and mistrust erodes loyalty faster than anything else.

    The truth is, in 2025, data privacy and ethics aren't negotiable. They're table stakes. As businesses, we have a responsibility to prioritize the safety and empowerment of our users. By embracing ethical practices and protecting user data, we don't just protect our customers; we empower them.

    At Unlockit, we believe trust should be at the heart of everything we do. That's why we leverage blockchain technology to create systems where transparency meets innovation. When you commit to integrity and openness, you build more than just a loyal customer base; you create advocates who trust and champion your mission. Empowered users lead to stronger, more resilient relationships, and new business models that thrive on mutual respect.

    Let's redefine what's possible by putting trust first. Because when trust wins, everyone wins.

    ———

    My name is Tiago Dias, Founder of Unlockit, and I'm on a mission to build trust through blockchain.

    #Trust #Innovation #DataPrivacy
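
    The three questions above can also be checked mechanically. A minimal sketch, with hypothetical field names and wording (this is not Unlockit's system, and no blockchain is involved): every collected field carries a plain-language what/how/why entry, and a helper flags any field collected without a published answer.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class DataDisclosure:
        """Plain-language answers to 'what, how, why' for one field."""
        what: str   # what data is collected
        how: str    # how it is used
        why: str    # why it is needed

    # Hypothetical notice: every field collected gets an entry, so it can
    # be rendered for users and diffed in review whenever collection changes.
    PRIVACY_NOTICE = {
        "email": DataDisclosure(
            what="Your email address",
            how="Used to send order confirmations and receipts",
            why="We cannot deliver receipts without a contact address",
        ),
        "postcode": DataDisclosure(
            what="Your postcode",
            how="Used to estimate delivery times",
            why="Delivery estimates depend on your region",
        ),
    }

    def undisclosed_fields(collected):
        """Flag any collected field that has no published disclosure."""
        return set(collected) - PRIVACY_NOTICE.keys()

    print(undisclosed_fields({"email", "device_id"}))  # {'device_id'}
    ```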

  • Santun Gunadi

    Data Protection Consultant | Lawyer | Law and Technology Enthusiast

    2,598 followers

    PROTECT DATA, EARN TRUST!

    Every day, I get bombarded with telemarketing calls offering credit cards or loans. The callers claim they got my information from a credit card association linked to a card I previously applied for. Whether that's true or not, it leaves me questioning the security of my personal data. Experiences like this make me hesitant to apply for another credit card in the future, and I'm sure I'm not the only one who feels this way.

    This highlights why compliance with personal data protection (PDP) laws is so important. When companies mishandle or misuse personal data, they erode trust, not just in their business but across the entire industry. On the flip side, businesses that follow PDP regulations show that they respect their customers' privacy, fostering loyalty and confidence.

    The impact extends to vendors, too. Companies are more selective about who they partner with, preferring vendors who prioritize data protection. Vendors that fail to comply risk losing opportunities, because no business wants to jeopardize its reputation due to careless partners.

    For businesses and vendors alike, PDP compliance isn't just a legal obligation; it's a way to build trust, improve customer relationships, and stand out in a competitive market. By protecting data, they create an environment where customers feel safe to engage.

  • Graham Hill (Dr G)

    Customer base optimisation | Operating model redesign | Change facilitation | AI augmentation of work | Opinions my own

    3,819 followers

    Personalise ME! PERSONALISATION REQUIRES CUSTOMER TRUST

    TL;DR: As companies collect more and more data about customers, they need to be trusted if customers are to accept that data being used for personalisation. The benefits of trust go far beyond acceptance of personalisation.

    WHAT YOU NEED TO KNOW

    Customers increasingly use digital channels to contact companies, providing them with an opportunity to collect unprecedented amounts of data. Excessive data collection and poor personalisation have created a trust deficit in customers. Companies must identify how they can increase trust in personalisation if they want to reap the benefits from it.

    Customers routinely communicate with companies through digital channels. The MoEngage 'Personalisation Pulse Check 2023' survey found that customers preferred to communicate through the website (16%), social media (16%) and mobile apps (15%). During each digital contact, companies collect an enormous amount of data and metadata. This 'digital exhaust' is a powerful source of insight about customers and the jobs they are trying to do.

    Customers are becoming aware of their digital exhaust and its value to companies, and they are uncomfortable with excessive data collection and the poor return they get from it. Twilio's 'The State of Personalization 2023' report found that 51% of customers do not believe brands protect their data or use it responsibly. And in a BCG study, 88% of customers said companies should either get consent or pay them for using their data for personalisation. The missing ingredient is TRUST: customers' trust that companies will protect their data, and also that they will use it in customers' best interests.

    The most effective personalisation, the kind that reinforces the value customers get from using a company's products, provides a service. A recent Institute of Customer Service report shows that not only is trust the biggest contributor to customer satisfaction, but also that companies scoring a 9 or 10 (out of 10) on trust are more likely to retain customers and be recommended by them (84%). And an article titled 'More Customers Say They Will Pay a Premium for Trusted Brands' found that customers will pay 35% more for trusted brands too.

    Customers know their data is being collected and used for personalisation. They want to trust that companies will not only protect their data, but that personalisation will provide a benefit for them too.

    THREE TAKEAWAYS
    1. Review the data and metadata collected during interactions to see if it would be expected by customers (hint: use the ICO's 'Legitimate Interest Assessment' framework as a guide).
    2. Identify how personalisation influences customers' trust and how the data collected should be used to improve it (hint: use the MIT CISR Data Monetisation framework as a guide).
    3. Monitor customers' trust and its impact on, for example, CSAT, retention, NPS and pricing power.

    #Personalisation #Trust #CustomerTrust #Retention #NPS #PricingPower
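
    Takeaway 1 references the ICO's Legitimate Interest Assessment, which is structured as three questions: a purpose test, a necessity test, and a balancing test. Here is a minimal sketch of encoding that checklist per collected field; the field names and pass/fail values are invented for illustration, and the comments paraphrase rather than quote the ICO.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class LegitimateInterestAssessment:
        """Three-part test loosely following the ICO's LIA structure."""
        purpose_test: bool    # is there a genuine legitimate interest?
        necessity_test: bool  # is this processing necessary to achieve it?
        balancing_test: bool  # do our interests outweigh any impact on the individual?

        def passes(self) -> bool:
            return self.purpose_test and self.necessity_test and self.balancing_test

    # Hypothetical review of two collected fields:
    assessments = {
        "pages_viewed": LegitimateInterestAssessment(True, True, True),
        "precise_location": LegitimateInterestAssessment(True, False, False),
    }
    flagged = [name for name, lia in assessments.items() if not lia.passes()]
    print(flagged)  # ['precise_location'] -> stop collecting it, or seek consent
    ```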

  • Odia Kagan

    CDPO, CIPP/E/US, CIPM, FIP, GDPRP, PLS, Partner, Chair of Data Privacy Compliance and International Privacy at Fox Rothschild LLP

    24,164 followers

    "Collecting, storing, using, and sharing people’s sensitive information without their informed consent violates their privacy, and exposes them to substantial secondary harms like stigma, discrimination, physical violence, and emotional distress. The Federal Trade Commission will not stand for it" - says FTC in new blog post recapping its actions in Avast, X-Mode and InMarket. Key points re some common themes: 🔹 Browsing and location data are sensitive. Full stop. 🔹 Browsing and location data paint an intimate picture of a person’s life, including their religious affiliations, health and medical conditions, financial status, and sexual orientation. 🔹 What makes the underlying data sensitive springs from the insights they reveal and the ease with which those insights can be attributed to particular people. 🔹 Years of research shows that datasets often contain sensitive and personally identifiable information even when they do not contain any traditional standalone elements of PII, and re-identification gets easier every day—especially for datasets with the precision of those at issue 🔹 People have no way to object to—let alone control—how their data is collected, retained, used, and disclosed when these practices are hidden from them. 🔹 When a developer incorporates a company’s code into their app through an SDK, that developer amplifies any privacy risks inherent in the SDK by exposing their app’s users to it. 🔹 Data handling must align with the purposes for which it was collected. 🔹 Purpose matters: Firms do not have free license to market, sell, and monetize people’s information beyond purposes to provide their requested product or service. 🔹 Any safeguards used to maintain people’s privacy are often outstripped by companies’ incentives and abilities to match data to particular people - make sure that you control the sharing and use of data by your downstream. 🔹 Promises and contract clauses are important, but they must be backed up by action. 🔹 Firms should not let business model incentives that focus on the bottom line outweigh the need for meaningful privacy safeguards. #dataprivacy #dataprotection #privacyFOMO https://lnkd.in/eAuTmutG

  • Alexandra Eavis

    Chief Product & Technology Officer

    4,243 followers

    Those of us who work in healthcare analytics will find this sort of article deeply depressing, but it is sadly what many people and patients believe.

    "They predict it will result in our details falling into the clutches of global corporates and artificial intelligence (AI) brains that could, on the basis of our medical conditions, block our credit, cancel health insurance, and harm our chances — or our children's — of getting a mortgage."

    It's great to see the NHS finally doing a national public engagement on health data, but unfortunately a lot of public trust has already been eroded. I genuinely believe that working with the public to help them understand how data can be used in a secure and privacy-enhancing way to deliver real value to them is the most important public health campaign we can look to run this century.

    When we are transparent and they are informed, people want to contribute to:
    - a longer and healthier life for themselves and their loved ones
    - preventing and finding cures for diseases around the world
    - a more efficient health service that has more money to spend on doctors and nurses

    In a world where data-informed technology can make a step change in outcomes, misinformation and disinformation scaring citizens into opting out is not good for individuals or society. Health and care providers, the research and life sciences industry, and technology suppliers all have a duty of care to protect this most precious public trust. As do the media. Shame on the publisher!

    #HealthcareTransformation #NHSPublicEngagement #NHS #Data #Trust #HealthData https://lnkd.in/ef9NU2nC
