
The Algorithm Took Notes: What We Learned at CSW70 About Tech Bias in Justice Systems

On March 11, at a side event during the 70th session of the Commission on the Status of Women (CSW70), we convened a panel that brought together voices rarely heard in the same room: UN human rights experts, a CEDAW Committee member, and a civil society leader from the Global South — all focused on a single question. When courts use algorithms to make decisions about freedom, whose bias gets baked in?

The answer, grounded in an expert paper I was commissioned to write for the CSW70 Expert Group Meeting, is unambiguous: algorithmic systems do not eliminate discrimination. They launder it — encoding structural bias, giving it a scientific credential, and scaling it beyond what any single biased judge could accomplish alone.

Three things courts don’t know about their own algorithms

  • They learn from a discriminatory past. In Brazil, language models detect gender bias in court decisions with 88.86% accuracy. Women are characterized as emotional and unpredictable; men’s violence is explained by circumstance. When these patterns become training data, the algorithm doesn’t correct history. It projects it forward.
  • They discriminate without ever naming gender. Employment gaps from caregiving, residential instability, relationship history — these proxy variables correlate with gendered social roles. The algorithm never mentions gender. Gender is everywhere in it. (See the sketch just after this list.)
  • They automate the credibility gap. Women already face lower credibility ratings from judges, particularly in sexual assault cases. Algorithms trained on those decisions systematize that gap at scale.
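
To make that proxy mechanism concrete, here is a minimal sketch in Python using entirely synthetic data (no real court or defendant records; the correlation strengths are assumed purely for illustration). A risk score built from proxy features alone, with gender never supplied to the model, still flags women at a sharply higher rate:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Ground truth only; the score below never sees this column.
    gender = rng.integers(0, 2, n)  # 0 = men, 1 = women

    # Proxy features correlated with gendered social roles (assumed strengths).
    employment_gap = rng.normal(loc=1.5 * gender, scale=1.0, size=n)  # caregiving gaps
    residential_moves = rng.poisson(lam=1 + gender, size=n)           # housing instability

    # A naive "risk score" that weights only the proxies.
    score = 0.6 * employment_gap + 0.3 * residential_moves
    high_risk = score > np.quantile(score, 0.75)  # top quartile flagged

    for g, label in [(0, "men"), (1, "women")]:
        print(f"{label}: {high_risk[gender == g].mean():.0%} flagged high risk")
    # Women are flagged far more often, though gender never enters the score.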

The numbers tell the story

The data we presented spans continents. In the United States, women rated “high risk” by the COMPAS algorithm reoffended at less than half the rate of men given the same rating — 25% versus 52%. In the Netherlands, immigrant women from Morocco and Turkey were rated 40% higher risk than Dutch women with identical criminal histories. In Canada, Indigenous women were rated 35% higher risk despite significantly lower violent reoffending rates. These are not edge cases. They are the system working as designed.
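
The computation behind comparisons like the COMPAS figures is simple to state and worth showing. Below is a minimal audit sketch with synthetic data engineered to mirror the published gap; the variable names and rates are illustrative, not drawn from the actual COMPAS dataset:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 5_000

    # Synthetic cohort: group, the tool's rating, and the observed outcome.
    group = rng.choice(["women", "men"], size=n)
    high_risk = rng.random(n) < 0.30
    reoffend = rng.random(n) < np.where(group == "women", 0.25, 0.52)

    # The audit: among people rated high risk, how many actually reoffended?
    for g in ("women", "men"):
        mask = (group == g) & high_risk
        print(f"{g}: {reoffend[mask].mean():.0%} of those rated high risk reoffended")
    # If "high risk" predicted the same thing for everyone, these rates would match.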

Fairness is a political choice — not a technical one

Computer scientists have proven that when base rates differ between groups, no algorithm can satisfy every definition of fairness at once. Researchers have catalogued at least 21 mathematical definitions of fairness, and they cannot all hold simultaneously. That means choices must be made — fair to whom, measured how, at whose expense — and those choices are political, not technical.
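
The arithmetic behind that impossibility fits in a few lines. The sketch below uses hypothetical numbers, not any real system's figures: hold positive predictive value (PPV) and true positive rate (TPR) equal across two groups whose base rates differ, and the false positive rate is forced to differ.

    def forced_fpr(base_rate: float, ppv: float, tpr: float) -> float:
        """False positive rate implied by fixing PPV and TPR at a given base rate.

        Derived from PPV = p*TPR / (p*TPR + (1-p)*FPR), solved for FPR.
        """
        p = base_rate
        return (p / (1 - p)) * tpr * (1 - ppv) / ppv

    # Two hypothetical groups, identical PPV and TPR, different base rates.
    for name, p in [("group A", 0.20), ("group B", 0.40)]:
        print(f"{name}: false positive rate forced to {forced_fpr(p, ppv=0.6, tpr=0.7):.0%}")
    # group A: 12%, group B: 31% -- equalizing two fairness metrics
    # forces inequality on a third whenever base rates differ.

Which metric to sacrifice, and for whom, is exactly the political choice the panel named.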

What came out of the room

Our panelists — Laura Nyirinkindi of the UN Working Group on Discrimination Against Women and Girls, Esther Eghobamien-Mshelia of the CEDAW Committee, and Fernanda K. Martins of Fundación Multitudes — didn’t stop at diagnosis. They identified concrete accountability levers, including using licensing and relicensing of tech platforms as regulatory leverage, cascading CEDAW obligations to cities and communities to make algorithmic accountability justiciable locally, and pushing for mandatory disclosure whenever an algorithm influences a judicial decision.

Two immediate opportunities demand attention: CEDAW is developing General Recommendation 41 on stereotyping, which will address algorithmic stereotyping in the digital era — the draft is open for consultation now. And the Working Group’s thematic report on AI and gender equality goes to the Human Rights Council in June 2026.

The bottom line

The international human rights framework is not silent on algorithmic discrimination. What’s missing is not the law. It is the political will to enforce it.

The algorithm did not rise above the judge’s prejudice. It took notes. The question is whether we will keep pretending otherwise — or demand that justice systems answer for the tools they deploy.

Operational, not aspirational. Designed with, not designed for.

Caitlin Kraft-Buchman is CEO and Founder of Women At The Table. Her expert paper, “Gender Bias in Judicial Algorithms: A Global Analysis of Algorithmic Discrimination,” was commissioned for the CSW70 Expert Group Meeting on ensuring access to justice for women and girls.

Photo credits:
Pauline Wee & DAIR
https://betterimagesofai.org | https://creativecommons.org/licenses/by/4.0/