🌍 I was recently interviewed by Paula Dupraz-Dobias for Geneva Solutions. The piece has now been published 👉 "Donor crisis prompts a rethink on rules of collaboration for humanitarian data partnerships".

🤔 The article sheds light on how aid funding cuts are pushing humanitarian organisations to seek private-sector collaboration—raising serious privacy and ethical risks in the rush for efficiency.

As I put it in the piece: "The challenge today is how do you improve the way you make decisions or design services or develop policies that leverage new tools such as data and AI in a systematic, sustainable and responsible way … where no one is left behind."

👉 To me, this means building new data governance frameworks that:
✅ Articulate the purpose behind data and AI use
✅ Ground their operation in human rights principles
✅ Involve local stakeholders and aid beneficiaries in shaping decisions (digital self-determination)
✅ Manage data thoughtfully across its entire lifecycle—collection, analysis, sharing, use, and reuse
✅ Secure a "social license" from vulnerable communities for both initial and future data re-use

Without these, even well-intentioned partnerships risk compromising privacy, legitimacy and, ultimately, the dignity and well-being of the people they aim to serve. But with them, data and AI can become a real force multiplier—inclusive, and truly impactful.

💻 Full article here 👉 "Donor crisis prompts a rethink on rules of collaboration for humanitarian data partnerships", by Paula Dupraz-Dobias, Geneva Solutions: https://lnkd.in/esF3J433

➡️ For those who want to find out more about data governance, check out the recently released Data Governance Toolkit: https://lnkd.in/ef_U_wqm

🤔 Also: these topics are not new. Almost 10 years ago we drafted for UNOCHA a think brief on "Building data responsibility into humanitarian action" with Nathaniel Raymond, Ziad Al Achkar, Ph.D. and Jos Berens - see: https://lnkd.in/edU_dqYC

#opendata #humanitarian #humanrights #UnitedNations #artificialintelligence
Risks and trust in data sharing partnerships
Summary
Risk and trust in data-sharing partnerships refers to the challenges and necessary safeguards involved when organizations share sensitive data: protecting privacy, upholding ethical standards, and building confidence among all parties. Trust is crucial because mishandled or insecure data can lead to regulatory penalties, reputational damage, and lost collaboration opportunities.
- Prioritize transparency: Make data-sharing agreements clear and straightforward so everyone knows how their data will be used and protected.
- Protect privacy: Use privacy-enhancing technologies and strong security practices, such as anonymization and encryption, to reduce risks when sharing sensitive information.
- Build mutual value: Ensure all parties benefit from the partnership and encourage open communication to strengthen trust and cooperation.
-
Incorporating Data Privacy Clauses in NDAs 🔐

As someone deeply involved in data protection, I have seen firsthand how critical it is to protect sensitive information in our collaborations. In today's landscape, integrating robust data privacy clauses into Non-Disclosure Agreements (NDAs) is no longer optional—it's essential.

Why This Matters:

1. Regulatory Compliance: With regulations like GDPR and CCPA shaping our practices, we must ensure our NDAs reflect these legal requirements. I've witnessed the repercussions of non-compliance, and it's not something any organization can afford.

2. Data Classification: Clearly defining what sensitive data looks like is crucial. For example, specifying categories like PII or financial data helps everyone understand what's at stake.

3. Access Controls: Establishing who can access sensitive information—and under what conditions—helps uphold the principle of least privilege. I've found that clarity here builds trust among all parties involved.

4. Breach Notification: It's vital to have a breach notification protocol outlined in the NDA. Knowing how to respond swiftly can make all the difference in minimizing damage.

5. Data Transfer: In our globalized world, addressing cross-border data transfers in NDAs ensures we remain compliant with international standards.

By embedding these technical aspects into our NDAs, we reinforce our commitment to data integrity and privacy. It's not just about legal compliance; it's about cultivating trust in every partnership. Let's prioritize data privacy in our agreements and foster a culture of accountability in our industry.

#DataPrivacy #NDA #LegalCompliance #DataSecurity #RiskManagement #cybersecurity #dataprotection
-
BEYOND QUALITY: SECURING & GOVERNING DATA FOR ETHICAL USE

A few weeks ago, I shared my experience tackling data quality issues in the oil & gas industry, focusing on identifying and resolving inconsistencies, missing values, and compliance risks. But fixing data quality was only the first step. The bigger challenge was ensuring that this data remained secure, ethically shared, and compliant with regulations.

As part of my project, I worked on a data-sharing scenario where an industry body required datasets to be provided to academic researchers. The issue? These datasets contained sensitive business information that couldn't be exposed carelessly. Upstream data had to align with OGA & NUPRC regulations, while corporate data needed to meet GDPR requirements to protect customer information.

The risks of getting this wrong were significant: regulatory fines reaching €20M (GDPR) or £1M (OGA), potential competitive threats, and a loss of business trust due to weak governance.

To address these challenges, I pseudonymized and anonymized sensitive data, ensuring that well IDs and customer records were masked while still retaining analytical value (see the sketch after this post for what that pattern can look like). I also encrypted files using AES-256, stored them securely with robust access controls and audit logs, and enforced Multi-Factor Authentication (MFA) and secure file transfers to prevent unauthorized access. Additionally, I drafted a Data Use Agreement (DUA) between bp and the University of Aberdeen to outline the terms for ethical and secure data usage.

Working alongside my talented course mate Amarachi Grace Agiri, PMP® throughout this project made the experience all the richer. Even though we were on different teams, we shared a commitment to data quality and security.

This experience reinforced a crucial lesson: data quality is only as good as the security and governance surrounding it. It's not just about accuracy; it's about protecting and responsibly managing data in a way that ensures compliance and maintains trust.

Now, I'm eager to dive deeper into enterprise-wide data governance strategies, exploring how security, compliance, and automation can be seamlessly integrated into industrial workflows.

More importantly, I'm curious: how does your industry handle data security and governance? Let's discuss! 👇

#DataQuality #DataSecurity #DataGovernance #OilAndGas #PetroleumData #GDPR #ISO27001 #Compliance #BigData #DigitalTransformation #EnergySector
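A minimal Python sketch of the masking-plus-encryption pattern described in the post above. The column name, file paths, and secret handling are illustrative assumptions, not the project's actual setup; treat it as a sketch of the technique, not a production implementation.

```python
# Sketch: pseudonymize identifiers with a keyed hash, then encrypt the
# resulting file with AES-256-GCM. Requires: pip install cryptography
import csv
import hashlib
import hmac
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# In practice this secret would live in a key vault, not be generated per run.
PSEUDONYM_SECRET = os.urandom(32)

def pseudonymize(value: str) -> str:
    """Map an identifier to a stable pseudonym only the key holder can link back."""
    return hmac.new(PSEUDONYM_SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask_well_ids(in_path: str, out_path: str) -> None:
    """Rewrite a CSV, replacing the (assumed) 'well_id' column with pseudonyms."""
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row["well_id"] = pseudonymize(row["well_id"])
            writer.writerow(row)

def encrypt_file(path: str, key: bytes) -> bytes:
    """Encrypt file contents with AES-256-GCM; returns nonce + ciphertext."""
    nonce = os.urandom(12)
    with open(path, "rb") as f:
        return nonce + AESGCM(key).encrypt(nonce, f.read(), None)

key = AESGCM.generate_key(bit_length=256)  # 256-bit key => AES-256
mask_well_ids("upstream_raw.csv", "upstream_masked.csv")
with open("upstream_masked.enc", "wb") as f:
    f.write(encrypt_file("upstream_masked.csv", key))
```

The keyed hash maps the same well ID to the same pseudonym across files, preserving joins for researchers, whereas an unkeyed hash would be open to dictionary attacks against known well identifiers.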
-
After 8+ years of experience working closely with dozens of data partners to integrate 40+ data sources into one platform, we've learned:

1. Invest in relationships
Focus on trust before technology. Connect, feed people, show up, operate with integrity, apologize and fix it when you get it wrong, do excellent work... It's not complicated, but it takes time.

2. Educate on what's legally and technically possible
Most concerns about data sharing are born from confusion or a lack of knowledge. When you aren't sure what's legal or ethical, you are usually more risk averse.

3. Write strong and clear data sharing agreements
Make agreements clear, solid, and simple. People feel better when they understand the boundaries of the partnership. Strong agreements aren't an indication of a lack of faith; in fact, the opposite is true. The clearer your agreements, the more trust you can build with partners.

4. Show why it matters
Don't just extract value. Deliver value back. Create win-wins. That always makes sharing more fun.

What's your take? How can the anti-trafficking movement build strong, trust-based data sharing partnerships?

#data #lighthouse #humantrafficking
-
The Future of Data Partnerships: Why Proof-Based Collaboration is the Best Path Forward

Robert Silver's article underscores an undeniable truth: data partnerships are the future of connected media. As commerce, media, and CRM converge, second-party data is becoming an essential tool for enriching customer experiences. But as the industry moves away from third-party cookies and toward privacy-first strategies, we need to ask: how do we ensure that data partnerships remain trustworthy, compliant, and actually deliver value?

At Precise.ai we believe the answer is proof-based collaboration—a model that shifts the focus from simple data sharing to secure, validated, and privacy-preserving activation.

Beyond Data Sharing—Why Proof-Based Collaboration Wins:

Traditional second-party data partnerships rely on direct data exchanges, which introduce risks: trust gaps, compliance concerns, and inefficiencies in how insights are leveraged. A proof-based approach solves these challenges by ensuring:

✅ Data Integrity Without Exposure – Instead of exchanging raw data, partners can validate insights securely using privacy-preserving AI and federated learning. This ensures brands work with real, high-fidelity insights without risking data leakage (a toy illustration follows after this post).

✅ Regulatory & Consumer Trust Compliance – The future of data collaboration isn't just about access—it's about controlled, transparent activation. Proof-based systems ensure zero-trust data handling, where brands can verify impact without overstepping privacy boundaries.

✅ Performance-Driven Partnerships – Rather than static data handoffs, continuous, real-time validation ensures that each partnership delivers measurable ROI—whether in audience enrichment, predictive modeling, or campaign performance.

The Shift from Data Ownership to Data Utility:

The real opportunity in second-party data isn't who owns it, but how it's used. A proof-based approach allows brands to activate insights dynamically, respecting both regulatory constraints and consumer trust.

The days of open-ended data exchanges are over. The future is privacy-first, performance-driven, and built on proof. Let's move beyond data partnerships. It's time for proof-based collaboration.

#DataPrivacy #AI #DataPartnerships #ConnectedMedia #ProofBasedCollaboration #PreciseAI
https://lnkd.in/eAg6JPQx
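To make "validate insights without exchanging raw data" more concrete, here is a toy Python sketch of one common building block: comparing keyed hashes of identifiers so each side learns only the overlap size. This is a simplified illustration under assumed inputs (the shared key and email lists are hypothetical), not Precise.ai's actual protocol.

```python
# Toy sketch: two partners estimate audience overlap without revealing raw emails.
import hashlib
import hmac

SHARED_KEY = b"agreed-out-of-band"  # exchanged once, e.g. at contract signing

def blind(email: str) -> str:
    """Normalize an identifier and key-hash it so raw values never leave the org."""
    normalized = email.strip().lower().encode()
    return hmac.new(SHARED_KEY, normalized, hashlib.sha256).hexdigest()

# Each partner computes this locally on its own infrastructure...
brand_hashes = {blind(e) for e in ["ana@example.com", "bo@example.com"]}
retailer_hashes = {blind(e) for e in ["bo@example.com", "cy@example.com"]}

# ...and only the hash sets are compared, yielding a count, not identities.
overlap = brand_hashes & retailer_hashes
print(f"Shared audience size: {len(overlap)}")  # -> 1
```

Worth noting: because both sides hold the key, either could test guessed emails against the other's hashes. Production-grade systems close that gap with vetted private set intersection protocols, federated learning, or trusted execution environments.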
-
With data sharing between car manufacturers and insurance companies top of mind for regulators recently, here are some to-dos we're discussing with clients in and out of the automotive space (OEMs, dealerships, and others).

Clear is kind.

🔹 Regulator: You can't bury disclosures in multiple lengthy documents that overlap and are filled with cross-references. [This also came up in the Data Protection Commission Ireland case involving Meta.]
🔹 To do: Review your terms and privacy disclosures together as a whole. Have you been "hoarding" docs? Can they be streamlined and simplified?

🔹 Regulator: You need to give the customer a meaningful opportunity to review terms of service, and the data sharing component must be made obvious.
🔹 To do: Besides Marie Kondo-ing your disclosures package, simplify the disclosure itself. Simpler, shorter. Give a copy to someone in your target audience to read and test for comprehension.

🔹 Regulator: If you engage in profiling that can result in a loss of access to a right or a service (e.g. create a risk score), you must disclose this, explain how it is calculated and explain the consequences of the score.
🔹 To do: Review your disclosure and make sure profiling is disclosed and adequately explained.

Voluntary means it's optional.

🔹 Regulator: Opt-in consent can't be disguised as an integral part of a lengthy mandatory onboarding process. It must be clear that it is voluntary.
🔹 To do: If something is voluntary, make sure it's obvious. Test this with your audience too.

🔹 Regulator: Beware when incentivizing agents to enroll customers in a data sharing program. You need to make sure the enrollment remains voluntary.
🔹 To do: Check what happens "on the ground". If your enrollment is handled by third parties, provide them with guidance and audit them.

🔹 Regulator: If someone declines to enroll in voluntary data sharing, you can't present warnings that declining will result in the degradation of service or the absence of safety features. You also shouldn't nudge them to sign up with multiple emails after they decline.
🔹 To do: Make sure no means no. Excessive nudging has been held by regulators (including the Federal Trade Commission in the US and the European Data Protection Board in the EU) to be a "dark pattern" that can undermine true consent.

If you sell something, say something.

🔹 Regulator: If you sell or share information, or make a profit / revenue share out of it, you need to disclose this in a way that is clear and not misleading. If you are sharing data for profit, you can't say that you are sharing for the improvement of your product or for safety, functionality or operability. And you can't use the download of a free app as consent to sharing information with third parties.
🔹 To do: Make sure your data sharing is clear (this is top of mind for the California Privacy Protection Agency, other state regulators and the FTC) and provide an opt-in / opt-out where required.

Image by ChatGPT

#dataprivacy #dataprotection #privacyFOMO
-
In Third-Party Risk Management (TPRM), a significant challenge arises from the trust dynamics between vendors and customers. Vendors often hesitate to share comprehensive security documentation due to concerns about losing control, potential misuse of sensitive data, or it being used against them in future negotiations. Customers, on the other hand, face intense pressure to safeguard their organizations, leading them to seek extensive transparency through security questionnaires, audit reports, and test results, going beyond mere certifications.

The outcome? Strain, delays, frustration, and sometimes only surface-level reassurance. This status quo is unsustainable.

To improve TPRM, collaboration is key. It starts with establishing mutual understanding:
- Vendors can offer redacted summaries, third-party attestations, or restricted access under Non-Disclosure Agreements (NDAs).
- Customers can prioritize risk-focused inquiries over broad requests for exhaustive information.
- While standard frameworks like HITRUST play a role, genuine advancement stems from fostering a culture of partnership over coercion.

Let's transcend the impasse. TPRM doesn't require more friction... it demands more trust.

#TPRM #Cybersecurity #ThirdPartyRisk #VendorRiskManagement #HITRUST #RiskManagement #SecurityLeadership #InfoSec #DueDiligence #TrustButVerify #Partnerships
-
In the pursuit of data-driven excellence, collaboration is essential. However, scaling these capabilities while overcoming inherent challenges isn't easy, and it raises many questions and concerns:

➝ Should you trust your partners with your sensitive data? Can you do this compliantly amidst all the various global privacy regulations?
➝ Is a "clean room" or other "secure, shared space" under someone else's control truly secure?
➝ Will all your data partners agree to use the same shared space? If not, can data collaboration be accomplished at any kind of scale?

My advice is simple:

➝ Trust only your own organization with your sensitive data. Protect PII/PHI at rest and NEVER directly share it. The recent onslaught of data breaches reinforces this necessity.
➝ From a privacy perspective, you must maintain control, an auditable trail of your data processes, and the ability to abide by "right to be forgotten" and opt-out requests (a toy sketch of such an audit trail follows after this post).

There are now better ways to protect data at rest, collaborate remotely in a privacy-preserving way, and scale data workflows. It's time to embrace them.

#DataCollaboration #DataSecurity #Privacy
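As a toy illustration of "maintain control and an auditable trail", the sketch below pairs a record store with an append-only log so that every read, write, and erasure request leaves evidence. The storage, field names, and actors are hypothetical; a real system would use durable, access-controlled infrastructure rather than in-memory objects.

```python
# Toy sketch: a PII store with an append-only audit trail and erasure support.
import json
import time
from typing import Optional

class AuditedPIIStore:
    def __init__(self) -> None:
        self._records: dict[str, dict] = {}  # subject_id -> PII record
        self._audit_log: list[dict] = []     # append-only trail

    def _log(self, action: str, subject_id: str, actor: str) -> None:
        self._audit_log.append(
            {"ts": time.time(), "action": action, "subject": subject_id, "actor": actor}
        )

    def put(self, subject_id: str, record: dict, actor: str) -> None:
        self._records[subject_id] = record
        self._log("write", subject_id, actor)

    def get(self, subject_id: str, actor: str) -> Optional[dict]:
        self._log("read", subject_id, actor)  # every access leaves a trace
        return self._records.get(subject_id)

    def forget(self, subject_id: str, actor: str) -> bool:
        """Honor a 'right to be forgotten' request; the log entry proves it ran."""
        existed = self._records.pop(subject_id, None) is not None
        self._log("erase", subject_id, actor)
        return existed

store = AuditedPIIStore()
store.put("cust-42", {"email": "ana@example.com"}, actor="ingest-job")
store.forget("cust-42", actor="privacy-team")
print(json.dumps(store._audit_log, indent=2))
```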
-
Trust starts with how privacy risks are handled at the top.

When that leadership is missing, privacy failures happen, and they aren't always splashy headlines about data breaches. Sometimes privacy failures happen when a company can't answer basic questions about how it handles data. That lack of clarity can slow growth, stall deals, and erode trust. And trust starts with owning that clarity and managing how privacy risks are handled.

This begins at the top with the C-suite, and it looks like:
✅ Reviewing privacy risks alongside financial and operational risk
✅ Naming a leader responsible for data governance with authority across business units
✅ Building privacy reviews into launch timelines
✅ Reporting regulatory changes, contractual exposure, and delivery gaps

Yet when these steps are skipped, risk trickles down. And this looks like:
❌ A new feature launches without a data review
❌ A deletion policy exists, but no one checks if the data is deleted
❌ A third-party vendor has more access to systems than anyone realizes

It's not whether leadership supports privacy. It's whether their support shows up in how their company operates. That's why companies need to embed privacy into business operations, beginning at the top of their organization.

And this matters even more as AI is added to the mix. Companies that want to get ahead need to treat AI as a core business function rather than a downstream policy issue. Because AI risk is showing up earlier and earlier, from planning through execution, companies need to perform AI risk assessments across product development, M&A, market expansion, and budget planning.

If your business touches personal data—and it most likely does—privacy and AI oversight aren't side conversations. They're foundational to how companies grow and build consumer trust. And it starts in the boardroom.

Read my latest article for Forbes Business Council to learn more about privacy as a boardroom issue: https://lnkd.in/eb2WP5HP