AI & Gender

Concept Note

Informal round table on data-driven technologies and

protection of women’s human rights

Date: Tuesday, 7 November, 13:00 – 15:30

Venue: Women@theTable, Maison de la Paix, Geneva

Background

Data-driven technologies, such as big data and artificial intelligence (AI), are developing rapidly and will soon shape many, if not all, aspects of our lives. This trend presents both opportunities and risks for the realization of women’s human rights.

In particular, data-driven technologies can either counter or exacerbate existing gender discrimination and discriminatory gender stereotypes and biases. For example, online advertisements for certain occupations may exclude women if the data sets used to generate or disseminate such ads are biased and target mostly men. AI used in social media platforms may exacerbate misogyny and online violence against women if it has learned its behaviour from biased data.

To date, analysis has focused on AI and racial discrimination, with gender examined mainly as one dimension of that work. Analysis that addresses gender discrimination in its own right, or the interventions needed to protect women’s human rights in data-driven technologies, remains under-developed.

For human rights actors, such as Women at the Table (an NGO) and OHCHR (a UN agency), it is crucial to understand the current state and future trajectory of these technologies, their potential and risks for the protection and promotion of women’s and girls’ rights, and what interventions could ensure that such technologies facilitate the realization of human rights, including gender equality and women’s rights, rather than posing risks to their realization.

Focus and objective

The meeting will bring together a number of academic experts on digital technology to advise Women at the Table and OHCHR on the current status of data-driven technologies and on what can be done to influence their development. The brainstorming should inform the two organizations on the following questions:

  • What are the potential benefits and risks of data-driven technologies for efforts to eliminate gender discrimination, gender stereotypes and bias, and online violence against and exploitation of women and girls?
    • What can these technologies do now, and what will they be able to do in the near future?
    • How and where are these technologies used?
    • What safeguards against abuse and misuse are currently in place?
    • How is the data for self-learning generated and fed into AI systems?
  • What are tangible ways to influence the development, deployment and operation of data-driven technologies in support of gender equality, the removal of gender bias, and the prevention of online violence against and exploitation of women?
    • Can bias and discrimination be prevented in the course of developing and designing AI?
    • In data generation and collection?
    • Through safeguards against abuse and misuse?
  • How could academic institutions support the work of human rights actors (such as Women at the Table and OHCHR) on data-driven technologies, gender equality and women’s rights?

Format

The discussion will be held in a closed meeting. A number of experts will be invited to brainstorm on the key questions mentioned above. Participants will be staff members of Women at the Table and OHCHR (including staff working on digital technologies and human rights from angles other than the gender perspective). The meeting will start with one or two brief presentations providing an overview of the current status of data-driven technologies, followed by open discussion among the experts and other participants.

Invitees

  • The principals of the main Swiss research institutions in the area of digital technologies, i.e. École polytechnique fédérale de Lausanne (EPFL: https://www.epfl.ch/ ), ETH Zurich (https://www.ethz.ch/en.html ) and the University of Geneva (http://www.unige.ch/ ), will be contacted to nominate key academics in the area of data-driven technologies.

Nisheeth Vishnoi and Elisa Celis of EPFL gave a landscaping presentation on fairness in AI that touched on issues related to, but not limited to, gender.
