• AI & Equality

    A Human Rights Toolbox


    The <AI & Equality> Human Rights Toolbox

    located at AIEqualityToolbox.com

    is a one-stop anti-bias and anti-discrimination website that will house a stand-alone interactive learning tool, coding notebooks, and a community engagement forum. It will enable university students and faculty to engage actively with data and human rights concepts, understand the linkages between the two, and build algorithms that better reflect human rights values.



    Our ultimate objective is to help a university generation understand the scientist's unique potential for social impact in the real world,

    bridging science and human rights policy to foster systemic resilience and more equal, just, and robust democracies.


    This will include real-life blended workshops and the Toolbox. In addition to the Women at the Table and OHCHR teams, the real-life blended workshops will ultimately be taught by other human rights and legal experts jointly with computer science and machine learning faculty, who will use the <AI & Equality> Human Rights Toolbox as the basic workshop structure to explore the concrete interplay of the two disciplines and a human rights-based approach.

    We have already presented blended workshops at EPFL on March 26, 2021 and February 13, 2020, in collaboration with the Office of the High Commissioner for Human Rights (OHCHR), the EPFL Digital Humanities Lab, and the EPFL Equalities Office.

    A further series of workshops is planned at other universities throughout Spring 2021.


    How can a human rights-based approach be applied to computer science, engineering, and machine learning algorithms?


  • Background

    Why and where can algorithms produce unequal outcomes? Why and where can algorithms be gender-biased? How can a human rights-based approach be applied to computer algorithms that engage with, reason about, and make decisions on people? A growing body of research shows that algorithmic bias is at play in every aspect of modern life and has substantial, far-reaching impacts on our work environments, private lives, and culture.


    We are at a critical turning point. In order to innovate and thrive in a rapidly changing global environment, new norms and practices are needed. Flawed systems and established cultural standards that currently control how we live and work are so normalized as defaults that we don't even notice them. From 20th-century medical trials, international standards, city transit systems, and global trading rules to 21st-century online platforms that apply sophisticated algorithmic decision-making and machine learning systems, these defaults have proven harmful.


    In this crucial moment when AI is transforming every aspect of our lives and the very fabric of our society, it is clear that the design and deployment of AI must be grounded in human rights. The principle of equality forms the core of the human rights vision of the 1945 Charter of the United Nations, which states that human rights and fundamental freedoms should be available to all human beings 'without discrimination on the basis of race, sex, language or religion'. The principle of the equal rights of women and men is one of the pillars upon which the United Nations and post-World War Two frameworks were founded. Gender equality and all equality, the very heart of human rights, must be included in AI design and deployment.


    Particularly urgent given the scale at which AI systems are being deployed, we need scientists and engineers from informatics and the humanities who understand the intersectional dimensions of the problem, and the implications their work has for all citizens, so that we all can thrive.

    Who is this workshop for?

    EPFL undergraduate and graduate students.


    The Digital Humanities Institute, in collaboration with the Equal Opportunities Office, will host a 3-hour interactive workshop for EPFL students via Zoom. Using gender as a prism to understand the human rights framework that underscores AI, the interactive workshop will focus on what a human rights approach is, how a sociological perspective can contribute to developing better practices, and the journey of data and model design, using a Jupyter notebook to work through the Pre-Processing, In-Processing, and Post-Processing steps of the data life cycle.

    Through the workshop we will foster reflection on the stereotypes, biases, and gendered roles of both women and men, with the intention of understanding which real-life constraints hinder equality in the working environments and the output of computer scientists and engineers. The workshop will increase participants' awareness of the relevance of gender and equality to their work and workplace. It provides a unique opportunity to develop, deepen, and apply gender equality learnings, putting learning into action and ultimately leading to better decision-making, excellence in science, and improved practices.
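    To give a flavour of the notebook's measurement and Pre-Processing stages, the minimal sketch below (not the workshop's actual notebook; the toy data, function names, and the choice of demographic parity as the metric are all illustrative assumptions) computes a simple group-fairness metric and then applies reweighing, a classic pre-processing step that weights each training example so that labels become statistically independent of group membership:

    ```python
    # Illustrative sketch only: toy data and names are invented, and demographic
    # parity is just one of many fairness metrics a notebook might use.

    def demographic_parity_diff(labels, groups):
        """Absolute difference in positive-label rates between groups 0 and 1."""
        rate = lambda g: (sum(y for y, grp in zip(labels, groups) if grp == g)
                          / groups.count(g))
        return abs(rate(0) - rate(1))

    def reweighing_weights(labels, groups):
        """Pre-Processing: weight each example by P(group) * P(label) / P(group, label),
        so that label and group membership become statistically independent."""
        n = len(labels)
        weights = []
        for y, g in zip(labels, groups):
            p_g = groups.count(g) / n
            p_y = labels.count(y) / n
            p_gy = sum(1 for yy, gg in zip(labels, groups)
                       if yy == y and gg == g) / n
            weights.append(p_g * p_y / p_gy)
        return weights

    # Toy hiring data: protected-group membership (0/1) and training labels.
    groups = [0, 0, 0, 0, 1, 1, 1, 1]
    labels = [1, 1, 1, 0, 1, 0, 0, 0]

    # Group 0's positive rate is 0.75, group 1's is 0.25: a 0.5 disparity.
    print(demographic_parity_diff(labels, groups))  # 0.5

    # After reweighing, the weighted positive rate is 0.5 in both groups, so a
    # model trained with these sample weights sees a balanced signal.
    w = reweighing_weights(labels, groups)
    ```

    In-Processing (fairness constraints inside the training objective) and Post-Processing (adjusting a trained model's decision thresholds per group) follow the same pattern of measuring a disparity and then intervening at a different stage of the pipeline.
    
    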


    After completing the workshop, the participant will receive a certificate.


    Applying a human rights-based approach, this workshop will develop and strengthen awareness and understanding of gender equality and gender bias as a first step towards behavioural change, and towards the integration of an intersectional perspective into the everyday work of computer science and engineering. (Intersectionality, as defined by the Global Research Council, is 'the interconnected nature of social categorizations such as race, class and gender as they apply to a given individual or group, regarded as giving overlapping and interdependent systems of discrimination and disadvantage.')


    Throughout the workshop, participants will complete a variety of interactive exercises, discussions, and activities. The workshop will be supported by specific training materials, including a checklist tailored for students to embed gender and equality across their research and day-to-day work.


    Additionally, 4-6 weeks after the workshop, participants will be invited to attend a voluntary additional 1.5-hour session focused on applying the checklists to real-life research and design scenarios. This follow-up session will allow participants to reflect on the initial training and lessons learnt, share insights that have come up in their research, design, development, and learning environments, and bring the learnings forward as a group.


    Learning outcomes

    Upon completion EPFL students will have the knowledge and skills to:

    • Explain a human rights-based approach to AI;
    • Identify the relevance of different biases, and the importance of intersectionality, gender equality, and bias, to computer science and engineering and to institutional objectives;
    • Analyze how gender or inequality bias has occurred or can occur in the research, design, and development of AI;
    • Know how and when to apply gender- and equality-inclusive tools and techniques to mitigate bias in AI;
    • Evaluate concrete methods to integrate gender and equality into the design, planning, and implementation of AI projects.

    Instructors & Hosts


    Caitlin Kraft-Buchman is the CEO/Founder of Women at the Table, a growing global Swiss civil society organization based in Geneva and the first organization to focus on systems change by helping women gain influence in sectors that have key structural impact: technology; the economy; sustainability; and democracy and governance. She is a founder of the A+ Alliance for Inclusive Algorithms, one of Fast Company's 2020 World Changing Ideas in AI & Data, and part of the leadership of the UN Women Generation Equality Action Coalition for Technology and Innovation.


    Asako Hattori serves as a human rights officer at the Women's Human Rights and Gender Section of the Office of the United Nations High Commissioner for Human Rights (UN Human Rights, OHCHR). Previously, she served at the Secretariat of the United Nations human rights treaty bodies, including the Committee on the Rights of the Child, the Committee on the Elimination of Discrimination against Women, and the Committee on the Rights of Persons with Disabilities. At UN Human Rights, she has also worked on economic, social and cultural rights, land and human rights, gender stereotyping, digital technologies, and women's rights.


    Jessica Pidoux is a doctoral assistant at the École Polytechnique Fédérale de Lausanne, holds a master's degree in sociology of communication and culture from the University of Lausanne, and is finalising her thesis on the programming of dating applications and user practices.


    Sofia Kypraiou is a final-year master’s student in Data Science at EPFL who will work on her master’s thesis exploring and enhancing a practical toolbox for AI and Human rights in 2021. She holds a degree in Informatics & Telecommunications from the University of Athens.



    Daniel Gatica-Perez leads the Social Computing Group at Idiap and is a professor at EPFL, affiliated with the School of Engineering and the College of Humanities.


    Helene Füger is the head of EPFL's Equality Office.
