
Feminist AI for Justice
18 Nov 2025

Explore how a feminist approach to AI can transform technology from reproducing inequality to advancing fairness and inclusion.

Artificial intelligence is no longer a futuristic idea. It already shapes how we apply for jobs, access healthcare, navigate cities, and even negotiate salaries. But while AI promises efficiency and progress, it also risks deepening the very injustices many of us have spent decades fighting. 

Technology is never neutral; it mirrors the choices and blind spots of the people and systems that build it. When inequality is coded into our systems, it becomes embedded in our tools. We have already seen how this plays out.

  • Amazon famously scrapped its recruitment engine after it taught itself to downgrade women’s CVs, simply because the data it was trained on reflected a male-dominated industry.
  • A U.S. healthcare algorithm systematically underestimated the needs of Black patients because it used past healthcare spending as a proxy for healthcare need, ignoring the structural barriers that limit access to care (the sketch after this list makes the proxy problem concrete).
  • More recently, studies revealed that large language models can give different salary negotiation advice to women and men.
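
To make the healthcare example concrete, here is a minimal, hypothetical sketch of how optimising on a proxy (past spending) rather than the real target (medical need) can systematically exclude a group that faces barriers to care. All names, numbers, and thresholds below are invented for illustration and are not taken from the study itself.

```typescript
// Hypothetical illustration: selecting patients for extra care by past
// spending (a proxy) instead of actual medical need. All values are invented.

interface Patient {
  id: string;
  group: "A" | "B";     // group B faces structural barriers to accessing care
  trueNeed: number;     // underlying medical need (what we should target)
  pastSpending: number; // observed spending (the proxy the model optimises)
}

// Both groups have identical need, but group B's spending is lower because
// barriers to access mean less care was ever delivered and billed.
const patients: Patient[] = [
  { id: "p1", group: "A", trueNeed: 8, pastSpending: 8000 },
  { id: "p2", group: "A", trueNeed: 5, pastSpending: 5000 },
  { id: "p3", group: "B", trueNeed: 8, pastSpending: 4000 },
  { id: "p4", group: "B", trueNeed: 5, pastSpending: 2500 },
];

// "Algorithm": enrol the top half of patients ranked by the proxy.
const enrolled = [...patients]
  .sort((a, b) => b.pastSpending - a.pastSpending)
  .slice(0, patients.length / 2);

console.log(enrolled.map((p) => `${p.id} (${p.group})`));
// -> [ "p1 (A)", "p2 (A)" ]: group B is excluded despite identical need,
//    because the proxy encodes unequal access, not unequal need.
```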

These examples are not isolated mishaps. They are digitalised injustices.

Technology is never neutral; it mirrors the choices and blind spots of the people and systems that build it.

Why AI Must Be Feminist

If justice requires equality, then AI must be feminist. By “feminist”, I do not mean only the Oxford English Dictionary’s definition, “advocacy of equality of the sexes”; I also mean a framework for interrogating power, elevating marginalised voices, and designing systems that work for everyone. It asks the uncomfortable but necessary questions: Who is represented in the data? Who decides how systems function? And who carries the risks when things go wrong?

These questions shift the focus from technical efficiency to social responsibility, and that shift is essential if AI is to serve justice.

[...] a framework for interrogating power, elevating marginalised voices, and designing systems that work for everyone.

Women at the Heart of Technological Innovation

The story of technology itself began with women. Ada Lovelace wrote what is widely regarded as the world’s first computer algorithm; Hedy Lamarr co-invented the frequency-hopping technique that underpins today’s WiFi and Bluetooth; Grace Hopper created the first computer compiler, paving the way for modern programming languages. Yet as technology became more profitable, women were systematically written out of its story.

Today, in the age of AI, women continue to shape and correct the trajectory of technology. Mira Murati, OpenAI’s former Chief Technology Officer, led the development of ChatGPT and now runs her own company tackling the ethical and safety challenges of AI that many companies still struggle to confront. Before ChatGPT existed, another team of women and non-binary innovators had already developed Sophia, the world’s first chatbot designed for survivors of domestic violence, proving that empathy, safety, and privacy can be engineered into technology itself.

From the world’s first algorithm to the latest generation of AI, women have continually redefined what technology can be: not just intelligent, but humane. If we are to build technology worthy of the word ‘justice,’ then it must be feminist, because women have been at the heart of innovation from the very beginning. A feminist lens insists that half the population cannot be treated as an afterthought.

If we are to build technology worthy of the word ‘justice,’ then it must be feminist, because women have been at the heart of innovation from the very beginning.

Core Principles of Feminist AI

A feminist approach to AI creates tangible benefits because it translates into concrete design principles.

  • Accountability: when diverse voices take part in building, testing, and managing systems, those systems are less likely to misclassify or exclude whole communities.
  • Transparency: documenting how models are built and which risks have been flagged allows for scrutiny and builds trust.
  • Empowerment: tools designed with and for the people they serve are more effective, and more likely to be adopted.
  • Inclusivity: feminist AI insists that no one is left behind.

Together, these principles turn ideals of justice into everyday technologies that shape real lives.
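
To make the transparency principle more concrete, the sketch below shows one possible shape for a lightweight, machine-readable record of how a model was built and which risks were flagged, loosely in the spirit of model cards. The fields and example values are hypothetical and do not describe any specific system mentioned in this article.

```typescript
// Hypothetical "model card" record: one possible way to document how a model
// was built and which risks were flagged. Fields and values are illustrative.

interface ModelCard {
  name: string;
  intendedUse: string;
  trainingDataSources: string[];
  knownLimitations: string[];
  flaggedRisks: { risk: string; mitigation: string }[];
  evaluation: { metric: string; overall: number; byGroup: Record<string, number> }[];
  reviewedBy: string[]; // who signs off, and is therefore accountable
  lastUpdated: string;
}

const exampleCard: ModelCard = {
  name: "support-triage-model (hypothetical)",
  intendedUse: "Prioritising follow-up on support requests; not for eligibility decisions.",
  trainingDataSources: ["Anonymised historical requests, 2019-2024 (invented)"],
  knownLimitations: ["Under-represents non-English requests"],
  flaggedRisks: [
    {
      risk: "Lower recall for minority-language users",
      mitigation: "Human review of low-confidence cases",
    },
  ],
  evaluation: [{ metric: "recall", overall: 0.91, byGroup: { women: 0.9, men: 0.92 } }],
  reviewedBy: ["Safety lead", "Community advisory board"],
  lastUpdated: "2025-11-01",
};

// Publishing or archiving this record is what makes outside scrutiny possible.
console.log(JSON.stringify(exampleCard, null, 2));
```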

Case Study: Chatbot Sophia - A Digital Companion to End Domestic Abuse

At Spring ACT, we have seen these benefits firsthand. The principles above guided the creation of Chatbot Sophia, the world’s first AI chatbot designed to empower people affected by domestic violence, wherever they are in the world. From the very beginning, Sophia was built with survivors, not just for them. Their input shaped its features, such as the Digital Safe, a secure digital space for storing supporting documentation like photos, threatening messages, or voice notes.

Through close collaboration between survivors, human rights workers, and engineers, the Digital Safe was designed with unique safeguards: an image-based password instead of a traditional one, a decoy website, and browser history rewriting. These innovations give people seeking support both safety and peace of mind. Privacy is another central factor: Sophia leaves no digital trace. Accessibility matters just as much: Sophia speaks 95 languages, is available 24/7, and adapts to local realities and resources in countries around the world.
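
This article does not describe Sophia’s actual implementation, but safeguards like a decoy site and browser history rewriting are commonly built from standard browser APIs. The sketch below is a generic, hypothetical “quick exit” handler in that style, not Spring ACT’s code; the decoy URL and element id are placeholders.

```typescript
// Generic "quick exit" pattern used by many support websites (hypothetical
// sketch, not Spring ACT's implementation). It sends the user to an innocuous
// decoy page and replaces the current history entry, so the back button does
// not lead back to the support site.

const DECOY_URL = "https://www.example-decoy-site.org/"; // placeholder decoy

function quickExit(): void {
  // Open the decoy in a fresh tab so the user lands somewhere neutral...
  window.open(DECOY_URL, "_blank");
  // ...and replace the current page and its history entry with the decoy.
  window.location.replace(DECOY_URL);
}

// Typical wiring: an always-visible button plus a keyboard shortcut.
document.getElementById("quick-exit")?.addEventListener("click", quickExit);
document.addEventListener("keydown", (event) => {
  if (event.key === "Escape") quickExit();
});
```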

The example of Chatbot Sophia shows that feminist AI is not an abstract theory but a design choice. In 2025, the United Nations recognised Sophia with its AI for Good Impact Award, calling it one of the world’s best AI solutions precisely because of its inclusivity. More important than the award, though, are the conversations themselves: Sophia has facilitated over 43,000 confidential conversations, reaching people in 172 countries.

The example of Chatbot Sophia shows that feminist AI is not an abstract theory but a design choice.

Embedding Inclusivity in AI from Day One

So what does this mean for the wider ecosystem? For leaders, it means embedding inclusivity from day one, not treating it as an afterthought once the product is already built, if it is considered at all. If you fail to design with diverse data, diverse teams, and diverse perspectives, you will inevitably embed exclusion into your technology. And exclusion at scale is discrimination with an algorithmic face. For the tech community and policymakers, it means taking responsibility for the environments in which AI grows: mandating gender-sensitive testing, requiring independent audits, and establishing transparency standards that prevent harm before it becomes entrenched.
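
One simple, widely used form of gender-sensitive testing is comparing outcome rates across groups, for instance with the “four-fifths” rule of thumb for selection rates. The sketch below uses invented data purely to illustrate the idea; real audits look at far more than a single ratio.

```typescript
// Hypothetical gender-sensitive check: compare selection rates by group and
// flag large gaps using the "four-fifths" rule of thumb. Data is invented.

interface Decision {
  gender: "women" | "men";
  selected: boolean;
}

function selectionRate(decisions: Decision[], gender: Decision["gender"]): number {
  const group = decisions.filter((d) => d.gender === gender);
  return group.filter((d) => d.selected).length / group.length;
}

function disparateImpactRatio(decisions: Decision[]): number {
  const rates = [selectionRate(decisions, "women"), selectionRate(decisions, "men")];
  return Math.min(...rates) / Math.max(...rates);
}

// Invented outcomes from some screening model under audit.
const outcomes: Decision[] = [
  { gender: "women", selected: true },
  { gender: "women", selected: false },
  { gender: "women", selected: false },
  { gender: "women", selected: false },
  { gender: "men", selected: true },
  { gender: "men", selected: true },
  { gender: "men", selected: false },
  { gender: "men", selected: true },
];

const ratio = disparateImpactRatio(outcomes);
console.log(`Disparate impact ratio: ${ratio.toFixed(2)}`); // 0.33 in this example
if (ratio < 0.8) {
  console.log("Flag for independent review: selection rates differ sharply by gender.");
}
```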

[...] it means embedding inclusivity from day one, not treating it as an afterthought

The Role of Social Justice Practitioners in AI

For social justice practitioners, it means leaning in rather than stepping back. Technology cannot be left solely to engineers and big companies; it must be shaped by survivor voices, grassroots expertise, and human rights frameworks. We will never again live in a world without AI. It is paramount that new innovations are grounded in values of justice and equality rather than profit. And this can only happen when human rights workers are equal partners and active users in shaping this technological revolution.

Technology cannot be left solely to engineers and big companies; it must be shaped by survivor voices, grassroots expertise, and human rights frameworks.

The Crossroads of AI and Justice

We are at a crossroads. AI can become a tool that entrenches inequality, or one that dismantles it. A feminist approach ensures we choose the latter, by embedding accountability, transparency, empowerment, and inclusivity into every stage of design. Justice will not emerge spontaneously from an algorithm. It must be coded deliberately, tested rigorously, and protected politically.

Feminist AI is a blueprint for technology that protects, empowers, and uplifts. But making that future a reality requires everyone, not just policymakers and engineers, to act. For founders and innovators, inclusivity must be embedded from day one. Build with diverse data, diverse teams, and diverse perspectives. Don’t retrofit fairness; code it in from the start. The tech community and policymakers must institutionalise gender-sensitive testing, mandate independent audits, and enforce transparency standards that catch harm before it scales. Workplaces have a role too. By integrating survivor-support tools into internal systems, they can ensure help is accessible when it is needed most: quietly, safely, and without stigma.

Towards a More Equal World

Social justice organisations must lean into technology, not away from it. Human rights work and tech are no longer separate worlds; they must inform each other if justice is to keep pace with innovation. And all of us must question the systems we use, the data they rely on, and the values they embed. Ask not only what technology can do, but what it should do, and for whom.

Technology, at its core, can do two things: it can reproduce bias, or it can redistribute power. That choice is ours. If we are serious about justice, we cannot leave it to chance. Feminist AI is a commitment to a fairer, more humane world. It’s better technology for all.

Technology, at its core, can do two things: it can reproduce bias, or it can redistribute power.

References

  • BBC News (2018) Amazon scrapped “sexist AI” tool.
  • The Guardian (2019) Healthcare algorithm used across America has dramatic racial biases.
  • Science Magazine (2019) Dissecting racial bias in an algorithm used to manage the health of populations.
  • The Next Web (2025) ChatGPT advises women to ask for lower salaries, study finds.
  • Oxford English Dictionary (n.d.) Equality.
  • Wikipedia (n.d.) Ada Lovelace.
  • Wikipedia (n.d.) Hedy Lamarr.
  • Wikipedia (n.d.) Grace Hopper.
  • Fortune (2025) Meet Mira Murati, the 36-year-old tech prodigy who shot to fame at OpenAI and now runs a startup that’s a poaching target for Mark Zuckerberg.
  • Sophia.chat (n.d.) Official website.
  • Spring ACT (n.d.) Official website.
  • AI for Good (2025) Meet the winners for the 2025 AI for Good Impact Awards.