The AI revolution could leave women even further behind. Here’s why.
“There were 12 of us out of the 44 who were hired,” she says. “Now, I don’t see any of us in the senior ranks.”
By “us,” the 22-year-old engineer from Maharashtra means women.
Fresh out of college with a degree in electronics and telecommunications, Ankita was one of the few women in her batch who studied machine learning and artificial intelligence—skills that are suddenly in demand everywhere. Today, she’s posted in a public sector undertaking (PSU), using those same tools to build predictive models for oil and gas leaks.
It’s the kind of role she knows she can’t afford to step away from.
“Just knowing AI won’t cut it. If you take a break, you’re out,” she says. Her voice is even. Certain. “You have to work with it. Because once AI moves ahead, it won’t look back.”
Her parents—both IT professionals—cautioned her early. Skip the coding-heavy roles, they said. AI will take those first. Find jobs tied to core automation. They’ll last longer.
That quiet warning, passed down at home over meals and job applications, captures a deeper fear many women now carry. That the AI wave won’t just disrupt work but widen existing cracks.
The shorthand for it is everywhere: AI. In LinkedIn posts. In team meetings. In HR policies and CEO memos. It’s rushing into workplaces at every level, automating tasks, accelerating hiring, rewriting org charts.
But behind the rush lies a quieter danger.
“It’s not just the jobs AI will replace,” says Sonalde Desai, professor at the University of Maryland and the National Council of Applied Economic Research. “It’s who it will leave behind.”
She points to history: when economies shifted from agriculture to manufacturing, women were pushed out. Jobs deemed “breadwinning” migrated toward men.
Now, the shift is digital—but the logic hasn’t changed.
“The algorithms we use, the data we train them on—all carry human assumptions,” Desai says. “And those assumptions haven’t always included women.”
Built-in bias
It’s not theoretical. Take the seatbelt.
Crash test dummies used for decades have mirrored the 50th-percentile male: 1.77 meters tall, 76 kilos. The tests come back fine, until you remember the dummy doesn't account for female anatomy. The result? In real crashes, women are 47% more likely to be seriously injured and 17% more likely to die.
The tech worked. Just not for everyone.
That blind spot now sits inside HR systems.
AI tools used for screening CVs, assessing performance, and recommending promotions are learning from data that’s already riddled with bias.
“AI doesn’t operate in a vacuum,” says Jibu Elias, an AI ethicist. “It scales the patterns we feed it. If those patterns excluded women, the system learns exclusion.”
Elias calls this layer "infrastructure AI": resume sifting, sentiment analysis in interviews, productivity dashboards, remote surveillance tools. "It doesn't look like a robot replacing a person," he says. "It looks like a dashboard saying: this one isn't 'fit'."
A study from 2020 titled “Men should be competent, women should have it all” found that while competence was the sole standard for evaluating male candidates, women were judged on competence, sociability, and morality. One candidate, three bars to clear.
AI is now learning from that pattern—and replicating it.
“When we use AI to filter talent or evaluate performance, we risk making women not just less visible—but invisible,” Elias says. “The system doesn’t see you. So you never make it to the shortlist.”
Real-world effects
The impact is already showing up in the numbers.
A McKinsey report from May paints the picture: women make up 33% of entry-level roles in India Inc. That figure drops to 24% at the manager level—and stays flat from there. Men at the entry level are twice as likely to be promoted. Women are 1.3 times more likely to exit.
The pandemic made it worse. Many women dropped out due to caregiving burdens, and while some were brought back through flexible work models, the pipeline remains fractured. AI could turn that fracture into a break.
“Lower-wage, routine-heavy jobs—customer support, data entry, HR ops—are most at risk of automation,” says Rohini Lakshané, a technologist and researcher. “Those are the jobs where women are overrepresented.”
And the higher-paying, AI-driven roles? They’re already skewing male.
“Women are underrepresented in STEM. So fewer women are in the roles designing, deploying, or leading AI systems. And if you’re not in the room, you don’t get to shape how the room is built.”
That divide is sharper in smaller towns, where access to upskilling and reliable internet still lags. Without targeted interventions—better procurement frameworks, re-skilling programs, diversity audits—AI may entrench the very imbalances it could help solve.
“The prompts that shape hiring models today are still learning from past preferences,” Lakshané says. “If companies have always hired men for a profile, the AI will keep favoring male candidates.”
It’s not just the tool. It’s the training data. And the people missing from it.
Not all bad
Still, not everyone is sounding the alarm.
“AI can help,” says Anshuman Das, CEO and co-founder of recruitment firm Careernet. “It enables remote work, which is good for women. If the hiring manager is biased, then AI or no AI, the outcome will reflect that. But companies are more aware now. They’re building checks.”
The demand for AI talent is growing rapidly. Two years ago, only 3-4% of job mandates from clients sought AI skills in candidates with 4-10 years’ experience. Today it’s 20%. In startups it’s 40%.
And yet, it’s not clear how many of those roles go to women.
Ankita Sinha, for one, plans to hold on.
“I know I can’t step away, even for a bit,” she says. “AI doesn’t wait.”
Her skills are relevant today. But the real test will come tomorrow—when the next wave of prompts decides who fits, and who fades.
Because the danger isn’t just that AI changes the job.
It’s that, in the systems shaping the future, you’re not even in the picture.