Technology and Society

Explore top LinkedIn content from expert professionals.

  • View profile for Shelley Zalis
    Shelley Zalis is an Influencer
    326,976 followers

    Safiya Noble, Ph.D. is a #tech justice advocate, a scholar, and a professor of Gender Studies and African American Studies at UCLA. She is zeroing in on the intersection of technology and human rights to protect marginalized communities from the harms of AI; her research has shown how these systems can and do exacerbate racism and sexism. She first discovered algorithmic discrimination in internet search engines, which got her thinking: if it’s happening there, it will happen in AI too, and #AI is already incredibly powerful. If the underlying data remains biased, the results will be deeply problematic and harmful. AI has been shown to produce racial profiling, affecting people’s ability to get mortgages or pass job screenings for interviews. Facial recognition technology has misidentified people with darker skin tones, Black women in particular, which has resulted in wrongful arrests. We know AI isn’t perfect, but these are massive issues that can be very dangerous. Noble stands by the conviction that humanity trumps technology: the more we learn about one another organically, the more empathetic we’ll be. 👏

  • View profile for Stephanie Espy
    Stephanie Espy is an Influencer

    MathSP Founder and CEO | STEM Gems Author, Executive Director, and Speaker | #1 LinkedIn Top Voice in Education | Keynote Speaker | #GiveGirlsRoleModels

    158,376 followers

    For 60 Years, Kids’ TV Cast Boys As ‘Doers’ And Girls As Passive, Study Suggests: “New research reveals that the language in children’s television is reinforcing harmful gender stereotypes, and that little has improved in 60 years. In some cases, the gender bias is getting worse over time. The study, published this week in Psychological Science, examined scripts from 98 children’s television programs in the U.S. spanning from 1960 to 2018. The researchers employed natural language processing tools to examine which words were more likely to be associated with male characters and which were more likely to be associated with female characters. In total, they analyzed 6,600 episodes, 2.7 million sentences and 16 million words. Among the shows studied were classics like The Flintstones (1960) and more modern series like The Powerpuff Girls (2016) and Lost in Space (2018). In particular, the researchers examined how often male and female characters were portrayed as active agents (those who do) versus passive recipients (those who are done to). They found that boys are ‘doers’ while girls are the ‘done-tos.’ Perhaps most shockingly, when the researchers examined how this language has changed over time, they found that it hadn’t. The gender gap in who takes action in these programs hasn’t improved in six decades. Given the amount of time children spend watching television, the study authors suggest that those who watch these programs will develop biased ideas about how women and men behave in the real world. ‘These biases aren’t just about who gets more lines; they’re about who gets to act, lead, and shape the story. Over time, such patterns can quietly teach children that agency belongs more naturally to boys than to girls, even when no one intends that message,’ explained Andrei Cimpian, professor of psychology at NYU and an author on the paper, in a press release. AI learning models that train on program scripts pose an additional threat of perpetuating the gender bias. The study authors explain in their paper, ‘The rising popularity of script-writing programs powered by artificial intelligence (AI), which are trained on language from pre-existing screenplays, adds urgency to the goal of uncovering social biases in the language in children’s media.’ As technology continues to evolve, it becomes increasingly important to understand the messages we’re sending.” Read more 👉 https://lnkd.in/e5g6Z8WF ✍️ Article by Kim Elsesser #WomenInSTEM #GirlsInSTEM #STEMGems #GiveGirlsRoleModels
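    The agent-versus-patient analysis the researchers describe can be approximated with off-the-shelf NLP tooling. A minimal sketch in Python, assuming spaCy’s dependency parser; the character name lists and example lines are illustrative stand-ins, not the study’s actual data or pipeline:

    ```python
    # Sketch of an agent-vs-patient count, loosely inspired by the study's
    # method. Assumes spaCy is installed ("pip install spacy" and
    # "python -m spacy download en_core_web_sm"); names and lines are made up.
    import spacy
    from collections import Counter

    nlp = spacy.load("en_core_web_sm")

    MALE_NAMES = {"fred", "barney"}    # hypothetical character lists
    FEMALE_NAMES = {"wilma", "betty"}

    def classify(token):
        """Return (gender, role) when a tracked character appears as an
        agent (active subject) or patient (object / passive subject)."""
        name = token.text.lower()
        gender = ("male" if name in MALE_NAMES
                  else "female" if name in FEMALE_NAMES
                  else None)
        if gender is None:
            return None
        if token.dep_ == "nsubj":                # active subject -> agent
            return (gender, "agent")
        if token.dep_ in {"dobj", "nsubjpass"}:  # object or passive subject -> patient
            return (gender, "patient")
        return None

    counts = Counter()
    script_lines = [
        "Fred builds the car.",    # made-up example dialogue
        "Barney rescues Wilma.",
        "Betty is carried home.",
    ]
    for doc in nlp.pipe(script_lines):
        for token in doc:
            hit = classify(token)
            if hit:
                counts[hit] += 1

    print(counts)  # e.g. Counter({('male', 'agent'): 2, ('female', 'patient'): 2})
    ```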

  • View profile for Patricia Gestoso-Souto ◆ Inclusive AI Innovation

    Director Scientific Services and Operations SaaS | Ethical and Inclusive Digital Transformation | Award-winning Inclusion Strategist | Trustee | International Keynote Speaker | Certified WorkLife Coach | Cultural Broker

    6,510 followers

    Techno-Patriarchy: How AI is Misogyny’s New Clothes

    Gender discrimination is baked into artificial intelligence by design, and it serves the interests of tech bros. In my day job, I support our clients using AI to accelerate the discovery of new drugs and materials. I can see the benefits of this technology to people and the planet. But there is a dark side too. That’s the reason tech:
    - Disregards women’s needs and experiences when developing AI solutions
    - Deflects its accountability in automating and increasing online harassment
    - Purposely reinforces gender stereotypes
    - Operationalises menstrual surveillance
    - Sabotages women’s businesses and activism
    I substantiate each of the points above with real examples and their impact on the lives of women. Fortunately, not all is doom and gloom. Because insanity is doing the same thing and expecting a different outcome, I also share what we need to start doing differently to develop AI that works for women too. #EthicalAI #InclusiveAI #MisogynisticAI #BiasedAI #Patriarchy #InclusiveTech #WomenInTech #WomenInBusiness

  • View profile for Laura Burge

    Educational Leader | Equity, Respect and Inclusion | Consultant

    4,141 followers

    After a couple of train trips to the CBD last week, I’m midway through 'The New Age of Sexism' by Laura Bates. It has been both fascinating and deeply unsettling to sit with the realities she describes. Bates paints a clear picture of how the inequalities and oppressions of our current world are being “baked into the very foundations” of the digital future we are building at speed. Technologies like the metaverse, AI-generated content, and deepfake pornography are not just neutral tools; they carry forward existing harms, often amplifying them in ways that are harder to prevent, regulate, or even detect. One idea that particularly struck me is her description of the “make it now and fix any safety issues later” approach in emerging tech. This is an attitude we would never tolerate in the offline world and yet, in the online and virtual world, this has become the default operating model. Women and other minoritised groups are, as Bates writes, the “canaries in the coal mine.” Their abuse and suffering provide the early warning signals, the data points that allow companies to tweak systems, while continuing to profit in the meantime. It is a chilling reminder of whose safety is deprioritised when innovation and profits are valued above responsibility. Safety should not be an optional add-on, or a ‘later stage’ consideration. If these technologies really are the foundations of our future society, then safety, equity, and accountability must be treated as the baseline.

  • View profile for Tarika Barrett, Ph.D.
    Tarika Barrett, Ph.D. is an Influencer

    Chief Executive Officer at Girls Who Code

    89,820 followers

    AI has the power to either accelerate gender equality or set us back decades. The choice is ours. Right now, women make up less than a third of AI professionals and just 18% of AI researchers worldwide. From mentorship to career growth, AI can be the great equalizer—helping women master new skills, navigate leadership, and automate the invisible labor that has long held them back. But AI also carries risks, from gender-biased hiring algorithms to deepfake abuse. That’s why we need women shaping these tools—not just using them, but leading their development. At Girls Who Code, we are giving our students the tools and resources to change the future of AI and better their communities. That starts with giving everyone a seat at the table in tech. Read more: https://bit.ly/4hNOvap

  • View profile for Riya K. Hira

    Learning Experience Designer | Impact Communications Strategist | Social Entrepreneur | Exploring AI for Learning, Storytelling & Social Impact

    5,249 followers

    Is it fair that in India, more than half of the women don’t have exclusive access to a mobile phone? Shocking, right? Yes, you heard it right. While mobile phones have become an essential tool for communication, education, and work, 52% of Indian women can only use a shared phone, and 21% don’t use one at all. Why is that? Even in 2023, when the world is more connected than ever, women still face barriers to accessing basic technology. Many families prioritize giving phones to men, considering it a “necessity” for them, while treating it as a “luxury” for women. It sounds absurd, doesn’t it? But this isn’t just about phones. It’s about independence, privacy, and opportunity. Without exclusive access, women can’t freely explore educational content, apply for jobs, or even make private calls without judgment. The gap is clear: 48% of men have their own phones compared to just 27% of women. Why does this matter? Because a mobile phone isn’t just a device—it’s a lifeline. It’s the key to staying informed, building careers, and accessing critical services. And when women are left out of the digital world, they’re left out of opportunities. So, how do we fix this? We need to make digital access a priority for women. This means: ✅ Affordable devices and data plans for women. ✅ Programs that address societal norms and encourage families to empower their daughters. ✅ Digital literacy initiatives to ensure women can fully utilize these tools. Because until everyone is connected, we can’t claim to be truly equal. What do you think? How can we make sure women get equal access to technology? LinkedIn News India LinkedIn Guide to Creating #digitaldivide #empowerwomen #genderequality #technologyforall Source: Comprehensive Annual Modular Survey, NSSO 2022-23 (via dataforindia)

  • View profile for Dr Ulrika Sultan

    Awarded PhD, researching girls in TECH, STEM & STEAM at Chalmers University of Technology. Innovator, advisory board member and science communicator.

    2,633 followers

    Early Gender Stereotypes May Shape the Future of STEM Participation New research (linked below) reveals that children as young as five begin to develop biased views about men’s and women’s abilities in STEM fields. Boys often rate men as more competent in STEM, while both boys and girls see girls as less capable of learning STEM subjects. These early stereotypes could contribute to the gender gap in STEM careers, underscoring the need for educators and caregivers to address these biases early on. https://lnkd.in/dDRTRuj4

  • View profile for Guljana Mahboob

    UN Migration | Commonwealth Shared Scholar | University of Warwick | Development Practitioner

    3,978 followers

    Excited to share insights from my Master's thesis on how information technology is transforming women's empowerment in Pakistan. Initially, I was unsure where my research would lead, but it soon became clear that IT not only opens doors to online learning, scholarships, and global educational opportunities but also challenges traditional gender roles. As Nirmal Puwar's idea of “space invaders” suggests, women are boldly entering spaces that were once considered off-limits, redefining what it means to be independent and self-reliant. My work involved extensive discussions and in-depth, semi-structured interviews with participants who highlighted both the immense benefits and the harsh challenges of our digital era. On the one hand, digital platforms like Facebook and WhatsApp have become essential tools for mentoring and connecting with scholarship opportunities. On the other hand, cyberbullying and online harassment—recorded at around 4,441 cases in 2021, with a significant number involving women—pose real threats that can diminish confidence and hinder progress. Moreover, my research pointed to unequal access to IT, influenced by economic constraints and entrenched societal hierarchies. While government initiatives such as providing laptops have been appreciated by many, issues around accessibility remain a challenge. Research by UNDP Pakistan on digitalization and its impact on women has been extremely helpful in understanding the transformative potential of digital tools, yet it also reinforces that much more work is needed. In light of these findings, I strongly urge the Ministry of Planning, Development and Special Initiatives to create more safe digital spaces where women can explore their potential without fear. Establishing secure, supportive environments is essential for enabling women to fully engage with the digital world and leverage its opportunities for education and employment. I’m grateful for the journey my research has taken me on, and I look forward to continuing this conversation on digital empowerment and gender equality. #GenderEquality #WomenEmpowerment #Technology #UNDP #Digitalization

  • View profile for Rachel Reeds

    Positive disruptor. Authentic leader. Bold thinker.

    3,990 followers

    Higher education is increasingly adopting AI-driven tools: chatbots for applicant queries, systems for verifying qualifications, and algorithms for shortlisting candidates. While these technologies promise efficiency, they also carry the risk of embedding and obscuring systemic biases. Misogyny, racism, and classism are already embedded in our systems. Add AI to the mix, and those biases don’t disappear, they get scaled. In her latest book, The New Age of Sexism, Laura Bates exposes how emerging technologies, including AI, are reinforcing and amplifying existing gender inequalities. She delves into how AI systems, often trained on biased data, can perpetuate harmful stereotypes and discrimination against women and marginalised groups. Consider these scenarios:
    - A chatbot providing less comprehensive information to female or international applicants, reflecting historical underrepresentation in training data.
    - A document verification system disproportionately flagging certificates from certain countries as suspicious.
    - An admissions algorithm favouring candidates from traditionally privileged backgrounds (by something as simple as giving primacy to A-levels), inadvertently penalising those with non-linear educational paths.
    These are not hypothetical concerns. For instance: the UK government’s AI system for detecting welfare fraud was found to exhibit bias against individuals based on age, disability, marital status, and nationality. Research by Joy Buolamwini revealed that facial recognition systems have higher error rates for darker-skinned women, highlighting the intersection of racial and gender biases in AI technologies. If your institution is integrating AI into recruitment or admissions, it’s crucial to ask:
    - Who developed and trained (and continues to train) the model?
    - What data was used, and does it reflect diverse populations?
    - How are biases identified and mitigated? (one starting point is sketched below)
    - Who is accountable for the decisions made by these systems?
    Automation doesn’t eliminate bias; it often conceals it behind a façade of objectivity. We must critically assess and address the implications of AI in our institutions to ensure equity and fairness. As Laura Bates emphasises, it’s not about fixing the individuals affected by these systems but about fixing the systems themselves. How is your institution approaching the integration of AI in a way that promotes inclusivity and mitigates bias? Or has no one thought about it yet?
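    On the “how are biases identified” question, one concrete starting point is auditing selection rates across applicant groups. A minimal sketch in Python; the records, group labels, and the 0.8 threshold (the common “four-fifths rule” heuristic) are illustrative assumptions, not a method prescribed by the post or by Bates’s book:

    ```python
    # Sketch: compare shortlisting rates across applicant groups.
    # The records below are made up; a real audit would pull anonymised
    # outcomes from the admissions or recruitment system.
    from collections import defaultdict

    applicants = [
        {"group": "A", "shortlisted": True},
        {"group": "A", "shortlisted": True},
        {"group": "A", "shortlisted": False},
        {"group": "B", "shortlisted": True},
        {"group": "B", "shortlisted": False},
        {"group": "B", "shortlisted": False},
    ]

    totals, selected = defaultdict(int), defaultdict(int)
    for a in applicants:
        totals[a["group"]] += 1
        selected[a["group"]] += a["shortlisted"]

    rates = {g: selected[g] / totals[g] for g in totals}
    best = max(rates.values())
    for group, rate in rates.items():
        # Four-fifths rule: flag groups selected at < 80% of the top rate.
        flag = " <-- review for adverse impact" if rate < 0.8 * best else ""
        print(f"group {group}: selection rate {rate:.0%}{flag}")
    ```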

  • View profile for Oliver Hauser

    Professor of Economics & Deputy Director of AI Institute, University of Exeter | Senior Advisor, UK Cabinet Office | Speaker | Improving Organisations, AI Impact, and Inclusive Leadership Through Data & Experimentation

    3,667 followers

    New Research Published in QJE! 🎉 Thrilled to share our latest paper, published this week in the Quarterly Journal of Economics (with my amazing co-authors Christine Exley, Molly Moore and John-Henry Pezzuto). 📊 Across 15 studies with nearly 9,000 participants, we found that people systematically believe women are more generous, equality-oriented, and cooperative than men. Yet the data shows that men’s and women’s behaviors in relation to trust, cooperation, and fairness across a variety of economic activities are far more similar than these stereotypes suggest. Participants predicted how men and women would behave in various economic games. In the Dictator Game, they estimated that women would be more generous when splitting money. In the Ultimatum Game, they anticipated women would be fairer. In the Public Goods Game, they expected women to contribute more to a shared pot. Across 28 contexts, women were consistently expected to choose "socially-oriented" outcomes 8–13% more often than men—but these differences rarely exist. We then find similar results across a variety of other settings, vignettes and domains, plus additional experiments that hint at the role of upbringing in shaping these beliefs about gender differences. Why does this matter? These stereotypes can limit leadership opportunities, reinforce traditional family roles, and skew support for policies like equal pay and parental leave—perpetuating inequalities. 💡 Let’s challenge these beliefs and create environments where everyone can thrive. 🔗 Link to the paper in comments! #Research #Leadership #HR #Stereotypes #GenderEquality
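    To make the design concrete: the paper compares what participants predict about each gender with what people of each gender actually do. A minimal sketch of that belief-versus-behavior comparison; all numbers below are made up for illustration and are not the study’s data:

    ```python
    # Sketch of the belief-vs-behavior comparison described above.
    # Fractions of players choosing the "socially-oriented" option (e.g. an
    # equal split in the Dictator Game), as predicted by participants and as
    # actually observed, broken down by gender. Numbers are invented.
    predicted = {"women": 0.62, "men": 0.51}  # stereotype: women more generous
    observed = {"women": 0.55, "men": 0.54}   # behavior: nearly identical

    predicted_gap = predicted["women"] - predicted["men"]
    observed_gap = observed["women"] - observed["men"]

    print(f"predicted gender gap: {predicted_gap:+.0%}")  # +11%
    print(f"observed gender gap:  {observed_gap:+.0%}")   # +1%
    print(f"stereotype overstates the gap by {predicted_gap - observed_gap:+.0%}")
    ```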
