A long time ago, a new client said to me: "It would be really nice if you could find me someone other than a well-educated, white male." It stuck with me. That one sentence called out a massive, systemic issue in the industry. When every person on your commercial team looks the same, it's not an accident. It's a decision, whether it's intentional or not. And here are some of the messages it is sending:
→ Lack of inclusivity. Companies in the top quartile for gender diversity are 25% more likely to outperform on profitability, yet your team looks like a throwback to 1995.
→ Groupthink. Diverse teams are 87% better at making decisions, but your "sameness" is creating an echo chamber that kills innovation.
→ Cultural red flag. High-performing women see your team photo and think: "That's not a place where I'll be heard, supported, or promoted." So, they don't want to work for you.
→ Missed market insight. In many sectors of healthcare, women drive 80% of consumer decisions. If you're selling without them on your GTM team, you're flying blind.
→ Recruiting blind spot. You're not "hiring the best person for the job". You're hiring who looks like you. That's not merit-based. That's bias.
So many people think that diversity is charity. Think about that. Diversity is not charity. It's actually strategy. It's profitability. It's performance. And if you're still saying "we just can't find any women," then you're either not looking hard enough or you're creating a culture where they don't want to work. And yes, we know where to find her.
#Diversity #WomenInSales #MedTechleadership #GoToMarketStrategy #ExecutiveSearch #InclusionMatters #RepresentationMatters #FoundersToWatch #StartupsToFollow #WomenInMedTech
Feminism in tech hiring practices
Explore top LinkedIn content from expert professionals.
Summary
Feminism in tech hiring practices refers to approaches that challenge gender bias and aim to create fairer, more inclusive opportunities for women in the technology sector. These practices focus on promoting representation, eliminating systemic barriers, and ensuring that companies listen to what women actually want from their workplaces.
- Prioritize representation: Make sure your hiring panels and leadership teams include women, as this signals inclusivity and attracts more talented candidates.
- Use structured processes: Implement clear, consistent interview methods and transparent compensation discussions to help minimize bias and create a level playing field for all applicants.
- Listen and adapt: Regularly gather feedback from women in tech about their challenges and priorities, then tailor your policies and employer branding to address their real needs.
-
Structured interviews and diverse panels are key to reducing bias and ensuring fair hiring practices, but they aren't yet the norm! Mark Simpson, CEO of Pillar, shares results from their studies that showcase differences in interviews based on gender (e.g., women are often asked more questions and asked to prove their worth more than men).
♦️ Question Disparity: Women are asked 20% more questions with 25% less time to answer. Structured interviews help reduce this gap.
♦️ Proving Worth: Women face more questions about strengths and failures than men. Structured processes can decrease this bias by 42%.
♦️ Compensation Discomfort: Women feel less comfortable discussing compensation. Transparency can help address this.
♦️ Small Talk Bias: Men engage in more sports-related small talk, potentially leading to biases.
♦️ Female Interviewers: Women report better experiences and receive more soft skill questions when interviewed by women.
♦️ Interview Length: Female interviewers tend to conduct interviews 10-15% longer.
My advice: diversify your panel! Use tools like BrightHire to ensure unbiased, consistent interview practices for candidates.
#Hiring #Recruitment #TalentAcquisition #Culture #Values #DiversityandInclusion #Jobseeker #Interviewing #Candidates #ExecutiveSearch
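The "structured process" the post describes can be made concrete: every candidate answers the same fixed question set, and interviewers score against written rubric anchors instead of gut feel. A minimal sketch, with illustrative questions and anchors that are not from Pillar's study:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    # Written anchors for reference scores (1/3/5) force interviewers
    # to justify a number rather than rely on overall impressions.
    anchors: dict[int, str]

@dataclass
class Interview:
    candidate: str
    scores: dict[str, int] = field(default_factory=dict)

# One fixed question bank shared by every candidate and every panelist.
QUESTIONS = [
    Question("Describe a project you led end to end.",
             {1: "Vague, no ownership", 3: "Clear role", 5: "Drove outcome, quantified impact"}),
    Question("Walk through a technical decision you reversed.",
             {1: "No example", 3: "Example, weak reasoning", 5: "Clear trade-off analysis"}),
]

def record(interview: Interview, question: Question, score: int) -> None:
    """Record a 1-5 score for one fixed question."""
    if score not in range(1, 6):
        raise ValueError(f"Score must be 1-5, got {score}")
    interview.scores[question.text] = score

def total(interview: Interview) -> int:
    # Candidates are compared on identical questions only.
    return sum(interview.scores.values())

a = Interview("Candidate A")
for q in QUESTIONS:
    record(a, q, 4)
print(total(a))  # → 8
```

Because every candidate faces the same questions with the same scoring scale, the "20% more questions, 25% less time" disparity the post cites has no room to creep in.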
-
If you're losing brilliant women at the final stages of hiring, this might be why. Let me talk you through a recent example where a company had a disproportionately high number of women dropping out at late interview and offer stage for their tech roles. They were offering great salaries. Flexible working. A decent benefits package. So what was going wrong? We took a look at the data. Out of 2 billion data points, a few things stood out:
→ Diversity is non-negotiable. Women in tech rank it 31% higher than the average candidate. If they don't see representation in leadership, they won't apply.
→ Flexible hybrid work wins, because structure matters. Demand for remote-only roles is 11% below average, while core hours and in-office collaboration rank higher.
→ Family-friendly policies trump flashy perks. Fertility leave (+41%), job sharing (+33%), and parental leave (+19%) are the real differentiators.
But then we dug deeper, and that's where it got really interesting:
→ Women in data roles showed a higher demand for in-office work; mentorship and access to resources mattered.
→ Women in engineering & development wanted mission-driven work and career progression above all else.
→ Women in product roles prioritised culture and flexibility more than any other group.
The company checked their employer brand. Their careers page talked about "great culture" and "exciting opportunities." But it said nothing about what actually mattered to the people they were trying to hire. They weren't losing candidates because of the salary or the benefits. They were losing them because they didn't know what their target talent groups actually wanted. The companies getting this right aren't guessing. They're using data to shape their employer brand, so they attract the right people with the right message. Download our women in tech report to access more of these insights: https://lnkd.in/enYcGpeW And tell me: have you ever turned down a job offer for similar reasons?
#WomenInTech #Hiring #EmployerBranding #FutureOfWork #DiversityMatters
-
More women in tech means more inclusive products—built for everyone. It also means better financial outcomes, greater innovation, and a future shaped by all of us. And yet…
👉 Women still make up only 28.2% of the global tech workforce.
🤖 In AI, just 22% of professionals are women.
⚠️ Women hold just 12.4% of C-suite roles in tech.
Why? Because of systemic bias. Because of outdated stereotypes. Because of toxic workplaces, bro culture, lack of role models, lack of sponsors, and persistent gender discrimination. The list goes on. As we close out Women's History Month, let's move from awareness to action. Supporting women in tech isn't a gesture—it's a responsibility. Here's how you can walk the talk:
🔹 Hire more women in tech, especially in leadership.
🔹 Promote women into decision-making roles, and invest in their growth.
🔹 Be a sponsor, not just a mentor—use your influence to elevate others.
🔹 Back your women's ERGs with budget, visibility, and ongoing support.
🔹 Partner with organizations like WomenTech Network and Executive Women in Tech (EWIT) to create real impact.
💡 And on April 4, we take this global: #WITDay celebrates women shaping the future of technology. 📣 Use #WITDay to spotlight the women on your team, in your company, and in your community who inspire, lead, and drive change. Because representation matters—and the world needs to see them.
#WITday #womenintech #womenleaders #womenshistorymonth
-
According to new research from Nigel Frank International, four in five men in tech believe women are treated equally in the industry. Given that women currently occupy only 22% of tech roles across European companies and a shockingly high 56% of women leave the tech industry 10-20 years into their careers (double the rate of men), I'd have to disagree. There's a glaringly obvious (but still important) question to ask here: why are we asking men about an issue that disproportionately impacts women in this industry? It's no secret women are hugely underrepresented in tech – we need to be asking them how they feel they are treated first and foremost. The fact that men in tech believe women have equal progression opportunities is only making matters worse. If they're unaware of how women in their businesses feel, then, as far as they're concerned, there's nothing to fix. This piece from Pascale Davies digs into the issue. The point on "invisible challenges" faced by women in tech is key – we must make these challenges visible if we're going to fix the underrepresentation of women in tech. That means transparent conversations about the industry's gender pay gap, taking a hard look at whether a company's culture is genuinely welcoming and supportive of women, addressing the talent bottlenecks that are stopping women reaching senior leadership positions, and – most importantly – listening and learning from the women in the industry to know what we need to do better. https://bit.ly/48ohYlY
-
AI is only as fair as the world it learns from. Artificial Intelligence isn't created in a vacuum - it's trained on data that reflects the world we've built. And that world carries deep, historic inequities. If the training data includes patterns of exclusion, such as who gets promoted, who gets paid more, whose CVs are 'successful', then AI systems learn those patterns and replicate them. At scale and at pace. We're already seeing the consequences:
🔹 Hiring tools that favour men over women
🔹 Voice assistants that misunderstand female voices
🔹 Algorithms that promote sexist content more widely and more often
This isn't about a rogue line of code. It's about systems that reflect the values and blind spots of the people who build them. Yet women make up just 35% of the US tech workforce. And only 28% of people even know AI can be gender biased. That gap in awareness is dangerous, because what gets built, and how it behaves, depends on who's in the room. So what are some practical actions we can take?
Tech leaders:
🔹 Build systems that are in tune with women's real needs
🔹 Invest in diverse design and development teams
🔹 Audit your tools and data for bias
🔹 Put ethics and gender equality at the core of AI development, not as an afterthought
Everyone else:
🔹 Don't scroll past the problem
🔹 Call out gender bias when you see it
🔹 Report misogynistic and sexist content
🔹 Demand tech that works for all women and girls
This isn't just about better tech. It is fundamentally about fairer futures. Attached in the comments is a helpful UN article.
#GenderEquality #InclusiveTech #EthicalAI
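"Audit your tools and data for bias" has a concrete starting point: the four-fifths (80%) rule used in US employment-law guidance, which flags any group whose selection rate falls below 80% of the highest group's rate. A minimal sketch over hypothetical screening outcomes (the numbers below are invented for illustration):

```python
def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (selected, total_applicants)."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def adverse_impact(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Return each group's selection rate divided by the highest group's rate.
    Ratios below 0.8 indicate potential adverse impact under the 80% rule."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Hypothetical pass-through numbers from an AI resume screener.
outcomes = {"men": (60, 100), "women": (30, 100)}
ratios = adverse_impact(outcomes)
flagged = [g for g, r in ratios.items() if r < 0.8]
print(ratios)   # women's impact ratio is 0.5
print(flagged)  # → ['women']
```

Running a check like this on every stage of an automated pipeline (screening, shortlisting, offers) makes bias visible as a number rather than an anecdote; a ratio below 0.8 is a signal to investigate, not proof of discrimination by itself.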
-
The blatant bias of AI resume-screening tools against women and people of color shouldn’t be surprising. What’s disturbing is the world’s collective shrug. If these algorithms favored Black men or Latina women over white men, we’d see headlines everywhere, people in the streets, and big tech CEOs in a frenzy trying to “fix” the problem. But since the bias here is against Black men and women, it’s treated as a niche issue, hardly newsworthy—just another consequence of tech’s “imperfections.” It’s hard not to see this as an indictment of who we actually value in this society. Consider the fallout if an AI system screened out white men from executive roles. Imagine Elon Musk or other tech giants watching this play out in their own hiring processes—do we really think they’d sit quietly on the sidelines? Not a chance. They’d be up in arms, rallying everyone to overhaul the system and ensure no one from their demographic is left behind. Yet here we are with AI systematically weeding out Black men and women from top-tier jobs, and the reaction? Silence. Some polite “concerns,” maybe a nod to “ongoing research,” but no serious action. And let’s talk about the tech companies' responses: Salesforce and Contextual AI both emphasized that their models weren’t “intended” for resume screening. But the fact is, this technology is out there, and if it’s being used in ways that systematically erase opportunities for minorities and women, hiding behind disclaimers isn’t good enough. If these tools were inadvertently disadvantaging white men, would “it wasn’t intended for this” be an acceptable response? Doubtful. The excuses and deflections are telling—it seems no one’s really interested in taking accountability unless it impacts those at the top of the societal food chain. There’s no reason why a pre-process that pseudo-anonymizes names and genders couldn’t be easily applied prior to processing these resumes. This isn’t just about hiring; it’s about power. 
AI is shaping our future, deciding who gets jobs, loans, housing, and more. It reflects the values of those who build it, and the lack of urgency to address these biases is painfully clear evidence of who counts—and who doesn’t. It’s time to demand more than hand-wringing and weak assurances. Let’s call this what it is: a deliberate disregard for fairness because the people affected are not those with enough power or influence to demand change. Until we start holding AI creators and companies to the same standards for fairness and equity that we claim to care about, this problem isn’t going anywhere. https://lnkd.in/ecyxecHT
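The pre-processing step the post describes, stripping names and gendered signals before a model sees the resume, can be sketched in a few lines. This is illustrative only: a production pipeline would need named-entity recognition rather than a known-name list, a far richer term set, and case handling; every pattern below is an assumption, not a vetted rule set.

```python
import re

# Illustrative pronoun map only; case is not preserved and real systems
# need NER plus a much larger vocabulary of gendered signals.
GENDERED_TERMS = {
    r"\bshe\b": "they", r"\bher\b": "their", r"\bhers\b": "theirs",
    r"\bhe\b": "they", r"\bhis\b": "their", r"\bhim\b": "them",
}

def pseudo_anonymize(text: str, known_names: list[str]) -> str:
    """Replace known candidate names and gendered pronouns before screening."""
    for i, name in enumerate(known_names):
        text = re.sub(re.escape(name), f"CANDIDATE_{i}", text, flags=re.IGNORECASE)
    for pattern, repl in GENDERED_TERMS.items():
        text = re.sub(pattern, repl, text, flags=re.IGNORECASE)
    return text

resume = "Maria Lopez led her team to ship the platform on time."
print(pseudo_anonymize(resume, ["Maria Lopez"]))
# → "CANDIDATE_0 led their team to ship the platform on time."
```

Even a crude pass like this shows the point being made: nothing about the screening technology prevents anonymization; the barrier is willingness, not feasibility.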
-
On Threads, Amy Diehl, PhD posted: “Exclusion of women is often hidden by making minimum requirements much higher than necessary for the job.” Diehl linked to a recent Axios article about how tough the job market is for women executives. One reason? A board running a search for a CEO might say, “We’re looking for public company CEOs who are in the tech industry.” As one executive search consultant explained, “Go with those specs and you end up with a list that inevitably excludes females from consideration.” Such exclusion isn’t limited to women or just the C-suite. A few years ago, I advised a company that was struggling to hire a new Director of Finance. When I reviewed the job posting, two requirements jumped out at me:
* Ability to be mobile (including, but not limited to, walking, bending, squatting, crouching, twisting, kneeling, reaching, etc.).
* Ability to lift/carry/push/pull objects that weigh up to 35 pounds as needed.
Yes, for a finance role. You might guess what happened next. I shared my feedback, and the team quickly admitted they’d borrowed generic HR language without thinking it through. Those physical requirements weren’t necessary and could have caused some great candidates to pass on the opportunity. Removing unnecessary requirements is a key inclusive hiring practice. In my book "The Better Allies Approach to Hiring," I recommend reviewing each requirement and asking, “If an otherwise perfect candidate came along without this experience, would we still hire them?” If the answer is yes, it shouldn’t be in the job description. Now it’s your turn. Grab a recent job posting from your organization. Are there any requirements that aren’t actually essential? Check the “preferred qualifications” and “nice-to-haves,” too—they may be silently shrinking your talent pool. — This is an excerpt from my upcoming “5 Ally Actions” newsletter.
Subscribe and read the full edition at https://lnkd.in/gQiRseCb #BetterAllies #Allyship #InclusionMatters #Inclusion #Belonging #Allies 🙏
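The audit the excerpt above recommends can be partially automated: scan postings for copy-pasted physical-demand boilerplate that has no place in a desk role. The phrase list below is a guess at common offenders, not an exhaustive or authoritative rule set; a human still makes the "would we still hire them?" call.

```python
# Phrases that often appear as borrowed HR boilerplate and rarely
# belong in office-based job postings (illustrative list only).
SUSPECT_PHRASES = [
    "lift", "carry", "push", "pull", "squat", "crouch",
    "kneel", "bend", "twist", "standing for long periods",
]

def flag_requirements(posting: str) -> list[str]:
    """Return posting lines that contain suspect physical-demand phrases."""
    flagged = []
    for line in posting.splitlines():
        lowered = line.lower()
        if any(phrase in lowered for phrase in SUSPECT_PHRASES):
            flagged.append(line.strip())
    return flagged

posting = """Director of Finance
* 10+ years of FP&A experience
* Ability to lift/carry/push/pull objects that weigh up to 35 pounds
* CPA preferred"""
print(flag_requirements(posting))  # flags only the lift/carry line
```

Each flagged line then gets the book's test applied by a person: if a perfect candidate could succeed without it, delete it.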