A big part of why I got into cybersecurity, and later privacy, was concern about the effects that digital data and social media could have on the persecution of vulnerable communities. The Holocaust was one especially horrific example of how even analog data collection about persecuted communities can operationalize their genocide. With the emergence of digitized data, this threat scales. The problems we face now are bigger than what I could have imagined when I first entered this space a few years ago. We run the risk not only of hostile governments using our information against us, but of us using information about our races, ethnicities, religious identities and affiliations, political opinions, and more against each other.

When I first entered this space, the nightmare scenario that kept me up at night was that information identifying vulnerable populations (along religious, racial, or other lines) might be released en masse with the intention of harming those communities. That nightmare is already coming to fruition - just weeks ago, news broke that #23andme had experienced a data breach affecting roughly 1 million Ashkenazi Jewish users and over 100,000 Chinese users (this article from EFF provides useful resources on what to do if you're concerned you may be affected: https://lnkd.in/dZ3y3Xzv).

At the same time, the rise of generative AI leaves ample ground for the spread of increasingly mind-bending, reality-resembling disinformation and fake or manipulated content that contributes to the rise of polarization, online hate, and physical violence against persecuted groups (this article from the Anti-Defamation League describes some examples of how this is taking place in the Israel-Hamas war: https://lnkd.in/dkkxtrNJ).

The nexus of these issues, the horrific violence in the Middle East these past few weeks, and the rise of antisemitism and Islamophobia/anti-Arab sentiment in the US and across the world leaves me concerned for my family, friends, and colleagues - in the region and at home. I'm praying for all those who are affected by this conflict, and urge anyone this may reach to think critically about what you read online and practice empathy for members of these communities who are struggling in these fearful times.
How Data Trust Affects Real People
Explore top LinkedIn content from expert professionals.
Summary
Data trust refers to how individuals feel about sharing their personal information with organizations, and how secure, respected, and in control they feel regarding the use of their data. When data trust is broken, it can lead to real-world consequences like financial harm, emotional stress, and damaged relationships between people and the companies or institutions they rely on.
- Prioritize transparency: Clearly communicate to individuals how their data is being collected, used, and protected so they can make informed choices.
- Empower user control: Give people the ability to manage, access, and even own their personal information, especially when it involves sensitive data.
- Protect vulnerable groups: Take extra measures to secure data for those at higher risk, such as patients or marginalized communities, to uphold their dignity and safety.
Here's an uncomfortable truth: We're asking the most vulnerable patients to trust us with their most sensitive data while giving them the least control over how we use it.

Think about that for a moment. The communities most harmed by healthcare systems are also the ones who could benefit most from truly personalized care. But they're sitting in our waiting rooms, watching us collect their information, make decisions about their bodies, and wondering: "What happens to all this data about me?"

We can't keep building personalized healthcare ON TOP of broken trust. We need Privacy-First Experience Hubs that put patients back in control:
→ Make data decisions WITH communities, not just FOR them
→ Build cultural intelligence into AI without sacrificing security
→ Turn legal jargon into plain-language transparency
→ Give patients ownership of their health information, not just access to it

Real talk: If your personalization strategy doesn't start with rebuilding trust, you're just creating more barriers for the people who need care most.

The question isn't whether we can afford to make this change. It's whether we can afford not to. What would happen if your most vulnerable patients felt the safest in your system?

#PatientExperience #HealthEquity #PatientTrust #HealthcareInnovation #SDOH #HealthcareCX #PatientEX
-
We talk about data privacy like it's only a compliance issue. It's not. It's a dignity issue too.

Every day, vulnerable populations share their most intimate information with social services. Income data. Health records. Immigration status. Housing history. They share because they need help, not because they've chosen to. But do we always handle this with the care it deserves?

For example, imagine an organization serving domestic violence survivors and considering a new case management system that would "streamline operations" by centralizing all client data in the cloud. Efficient? Yes. But also potentially dangerous if that data was breached or subpoenaed.

They could choose a different path. Local storage. Encrypted communications. Clear data retention policies. It might be more complex, more expensive. But it could better respect the trust their clients place in them. This aligns with how Crisis Text Line handles their 10+ million conversations - they've achieved ICH accreditation by maintaining strict confidentiality protocols, only breaking them when absolutely necessary for safety (https://lnkd.in/gvikqPCs).

Privacy isn't just about preventing breaches. It's about recognizing that the people we serve have already had too many choices taken away. The least we can do is protect the information they trust us with.

How are you supporting the dignity of your clients in how you treat their data?

#DataDignity #PrivacyMatters #TrustInTech #EthicalData #TechWithRespect
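To make the "local storage, encrypted communications, clear retention policies" alternative in the post above concrete, here is a minimal sketch of what locally stored, encrypted-at-rest case notes with an explicit retention window could look like. This is an assumption-laden illustration, not the architecture any organization in the post actually uses: it assumes Python with the third-party cryptography package, and the file names, record fields, and 90-day window are hypothetical.

```python
# Illustrative sketch: local encrypted storage with an explicit retention policy.
# Assumes the third-party "cryptography" package (pip install cryptography).
# File paths, record fields, and the 90-day window are hypothetical examples.
import json
import time
from pathlib import Path
from cryptography.fernet import Fernet

RETENTION_SECONDS = 90 * 24 * 3600    # example retention window: 90 days
STORE = Path("case_notes.enc")        # encrypted records stay on local disk
KEY_FILE = Path("case_notes.key")     # in practice, keep the key in an OS keystore

def _fernet() -> Fernet:
    # Create the key once, then reuse it for every encrypt/decrypt call.
    if not KEY_FILE.exists():
        KEY_FILE.write_bytes(Fernet.generate_key())
    return Fernet(KEY_FILE.read_bytes())

def _load() -> list[dict]:
    if not STORE.exists():
        return []
    return json.loads(_fernet().decrypt(STORE.read_bytes()))

def _save(records: list[dict]) -> None:
    STORE.write_bytes(_fernet().encrypt(json.dumps(records).encode()))

def add_note(client_id: str, note: str) -> None:
    """Append a note; every record carries the timestamp the retention check needs."""
    records = _load()
    records.append({"client": client_id, "note": note, "created": time.time()})
    _save(records)

def purge_expired() -> int:
    """Delete anything older than the retention window; returns how many were removed."""
    records = _load()
    fresh = [r for r in records if time.time() - r["created"] < RETENTION_SECONDS]
    _save(fresh)
    return len(records) - len(fresh)

if __name__ == "__main__":
    add_note("client-001", "Safety plan reviewed; follow-up scheduled.")
    print(f"Purged {purge_expired()} expired record(s).")
```

The point of the sketch is the design choice the post argues for: data never leaves the local machine, it is unreadable without the key, and deletion is a scheduled policy rather than an afterthought.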
-
A hairdresser and a marketer walked into a bar. Hold on… Haircuts and marketing? 🤔

Here's the reality: Consumers are more aware than ever of how their data is used. User privacy is no longer a checkbox – it is a trust-building cornerstone for any online business. 88% of consumers say they won't share personal information unless they trust a brand.

Think about it: Every time a user visits your website, they're making an active choice to trust you or not. They want to feel heard and respected. If you're not prioritizing their privacy preferences, you're risking their data AND their loyalty.

We've all been there – asked for a quick trim and got VERY short hair instead. Using consumers' data without consent is just like cutting hair you shouldn't cut. That horrible haircut ruined our mood for weeks. And a poor data privacy experience can drive customers straight to your competitors, leaving your shopping carts empty.

How do you avoid this pitfall?
- Listen to your users. Use consent and preference management tools such as Usercentrics to give customers full control of their data.
- Be transparent. Clearly communicate how you use their information and respect their choices.
- Build trust. When users feel secure about their data, they're more likely to engage with your brand.

Make sure your website isn't alienating users with poor data practices. Start by evaluating your current approach to data privacy by scanning your website for trackers.

Remember, respecting consumer choices isn't just an ethical practice. It's essential for long-term success in e-commerce. Focus on creating a digital environment where consumers feel valued and secure. Trust me, it will pay off! 💰
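For the "scan your website for trackers" step above, here is a rough, hedged first-pass sketch: listing the third-party script hosts a single page loads. It uses only the Python standard library; the placeholder URL and the narrow focus on script tags are assumptions, and the dedicated consent and scanning tools mentioned in the post would also cover cookies, pixels, and dynamically injected tags.

```python
# Rough first-pass tracker scan: list third-party script hosts on one page.
# Standard library only; the URL below is a placeholder for your own site.
import re
import urllib.request
from urllib.parse import urlparse

def third_party_script_hosts(page_url: str) -> set[str]:
    # Fetch the page and pull the host out of every <script src="..."> reference.
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "ignore")
    own_host = urlparse(page_url).hostname
    hosts = set()
    for src in re.findall(r'<script[^>]+src=["\']([^"\']+)', html, flags=re.I):
        host = urlparse(src).hostname
        if host and host != own_host:
            hosts.add(host)
    return hosts

if __name__ == "__main__":
    for host in sorted(third_party_script_hosts("https://www.example.com")):
        print(host)
```

Any host this prints that you don't recognize is a good starting point for the "who are we sharing data with, and did users consent?" conversation.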
-
You didn’t click “I agree” to have your driver's license sold on the dark web. But you probably did it anyway—without knowing. Because when it comes to data, most companies aren’t just protecting their systems. They’re gambling yours on their vendors.

Hertz just confirmed what most customers feared but didn’t read in the fine print: Your personal data—name, DOB, payment info, driver’s license—was compromised. Not because Hertz got hacked. But because a vendor they hired did.

This wasn’t some random phishing scam. It was a zero-day exploit on a trusted file transfer platform (Cleo), used by dozens of companies. The breach didn’t happen inside Hertz’s house—but their guests left the door wide open.

3 Big Truths We Need to Swallow:
- Cybersecurity is only as strong as your weakest vendor. Fancy locks mean nothing if you hand spare keys to everyone at the party.
- “No evidence” is not the same as “No breach.” Hertz denied any impact…until they didn’t. Now, thousands of users are at risk.
- Outsourcing trust ≠ outsourcing responsibility. Customers don’t care if it was your vendor’s fault. They just know it was you they trusted.

We keep telling people to “read the privacy policy.” But what we really need is a world where companies actually understand what they’re signing away on behalf of their customers.

This isn’t just a tech story. It’s a trust story. And for HR leaders, solopreneurs, and founders—here’s the real question: Do you know who your vendors are trusting with your data? Or are you betting your reputation on someone else’s unpatched vulnerability?

Let’s talk about the hidden breach points no one audits. What’s one system or vendor you rely on... that you haven’t truly questioned?

#DataBreach #CyberSecurity #TrustIssues #PrivacyMatters #VendorRisk #DigitalTrust #LeadershipMatte
-
Everyone’s talking about 23andMe. But the real story isn’t one company filing for bankruptcy—it’s a reminder that even the boldest ideas can’t survive without a foundation of trust.

I’ve been saying this for years: trust is the most valuable currency in healthcare, especially as the industry keeps consolidating. More M&As. More companies folding. And with every shift, there’s one constant: patients are rarely told what happens to their data when the business model changes.

A few years ago, Savvy partnered with the American Medical Association to ask patients and consumers how they feel about health data privacy. The big takeaway? Trust is everything. People aren’t anti-innovation. They understand the value of data in driving research and progress. But they want to know:
📱 Who’s collecting it?
📈 How is it being used?
💸 What happens to it if a company gets acquired or shuts down?

Those questions are front and center now. We shouldn’t be seeing headlines like “California recommends deleting your data from Company X.” But here we are—because trust was eroded through data breaches, unclear policies, and shifting priorities that left patients out of the loop.

The truth is, we need bold ideas like 23andMe. We need companies willing to explore what’s possible with health data. But great innovation can’t be divorced from patient privacy. And now, as AI becomes increasingly reliant on patient data to train and scale, the stakes are even higher. Patients want to see the future take shape—but not at the cost of their privacy, autonomy, or trust.

If the business model and privacy practices aren’t built for the future—and built with patients in mind—then trust breaks down. Innovation requires risk. And when you’re pushing the envelope, things won’t always go as planned—because people are fallible, and companies are too. But that’s exactly why we need better systems—so when things don’t go as planned, the consequences don’t fall hardest on the people who trusted you in the first place.

And when trust breaks down, the innovation disappears right along with it. That’s not just a business failure. That’s a loss for patients who deserve access to what these breakthroughs could have offered.

Patients want progress. Companies want to build the future. Trust is the bridge—and it’s on all of us to reinforce it.

#AskPatients #DataPrivacy #DigitalHealth