
Triple A: Affirmative Action for Algorithms

Artificial Intelligence, Machine Learning & Gender:

A concrete Call to Action

We are at a critical turning point. In order to innovate and thrive in a rapidly changing global environment, new norms are needed. The “standardized male” is the default of flawed systems and cultural standards that currently control how we live and work – defaults so normalized we don’t even notice. From 20th century drug trials, international standards and global trading rules, to 21st century algorithmic decision making and machine learning systems, this default has proven to harm people – and the bottom line. In fact, democracy itself is in peril. We must establish new norms.

Our focus is women and girls in all their intersections and forms. Because women have been systematically excluded from defining the old rules of the system, and continue to be excluded from defining the new rules, we must activate a new coalition to advocate for strategic and innovative thinking in laws, regulations and norms to achieve gender equality and strengthen democracy in the new systems we create. Women, an essential and untapped resource with the power to reshape our society's systems, must be influentially involved at all levels of decision-making, so change can happen now, before old norms and stereotypes are baked into the machine learning systems of the future.

Given the scale at which algorithmic decision-making (ADM) systems and machine learning are being deployed, the need is particularly urgent: we need Affirmative Action for Algorithms to correct the real-life biases and barriers that prevent women from achieving their full participation and rights, both in the present and in the future we invent.

As the observation famously attributed to Marshall McLuhan puts it, “We shape our tools, and thereafter our tools shape us.”

This is our immediate challenge. We must establish new tools, and new norms, for lasting institutional and cultural systems change now and for the century beyond. This concerns all corners of the world. It is crucial that we focus on gender equality and democracy for both women and men, now. Then everyone can thrive. We must leave no one behind.

We call on Governments, the Private Sector and Civil Society Organizations to:

1. ADVOCATE FOR AND ADOPT GUIDELINES THAT ESTABLISH ACCOUNTABILITY AND TRANSPARENCY FOR ALGORITHMIC DECISION-MAKING (ADM) IN BOTH THE PUBLIC AND PRIVATE SECTOR.

We must ensure machine learning does not embed an already biased system into all our futures.

Call To Action:

  • Algorithmic equitable action: Correct the real-life biases and barriers that prevent women and girls from achieving full participation and equal enjoyment of their rights.
  • Public institutions to pilot and lead: Deploy Affirmative Action for Algorithms when public institutions pilot ADM. Base pilots on longstanding and new social science research, allocating social incentives, subsidies or scholarships where women have traditionally been left behind by prior systems. This is a positive agenda to advance the values of equality we have long embraced, and to correct for the visibility, quality and influence of women in proportion to the population.
  • Public and private sector uptake of Algorithmic Impact Assessments (AIAs): A self-assessment framework designed to respect the public’s right to know about the AI systems that affect their lives, grounded in principles of accountability and fairness.
  • Rigorous testing across the lifecycle of AI systems: Testing should account for the origins and use of training data, test data, models, Application Programming Interfaces (APIs) and other components over a product’s life cycle. Testing should cover pre-release trials, independent auditing, certification and ongoing monitoring for bias and other harms (a minimal sketch of one such check follows this list). ADM should improve the quality of the human experience, not control it.
  • Strong legal frameworks to promote accountability: Including potential expansion of powers for sector-specific agencies, or creation of new terms of reference to oversee, audit and monitor ADM systems, providing regulatory oversight and legal liability in both the private and public sectors.
  • Gender-responsive procurement guidelines: Organizations and all levels of government should develop ADM gender-equality procurement guidelines with hard targets, and outline the roles and responsibilities of the organizations required to apply these principles.
  • Improve datasets – open gender-disaggregated data, data collection and inclusive, quality datasets: Actively produce open gender-disaggregated datasets; this enables a better understanding of the sources of bias in AI and ultimately improves the performance of machine learning systems. Invest in controls to oversee data collection processes and human-in-the-loop verification, so that data is not collected at the expense of women and other traditionally excluded groups. Engage in more inclusive data collection processes that focus not only on the quantity but also on the quality of datasets.
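
By way of illustration only, the sketch below (in Python) shows one minimal form the bias testing and gender-disaggregated evaluation called for above could take: computing selection rates per gender group from an ADM system’s decisions and flagging the demographic parity gap between groups. The function names and sample records are hypothetical assumptions, not a prescribed standard or an existing tool.

    # Minimal, hypothetical sketch of a gender-disaggregated bias check for an ADM system.
    # All names and the sample records are illustrative; real audits would use the
    # system's actual decision logs and a broader battery of fairness measures.
    from collections import defaultdict

    def selection_rates(records):
        """Share of positive decisions per gender group.

        `records` is an iterable of (gender, decision) pairs, where decision is
        1 if the ADM system selected/approved the person and 0 otherwise.
        """
        totals = defaultdict(int)
        positives = defaultdict(int)
        for gender, decision in records:
            totals[gender] += 1
            positives[gender] += decision
        return {group: positives[group] / totals[group] for group in totals}

    def demographic_parity_gap(rates):
        """Largest difference in selection rate between any two groups.

        A gap near 0 suggests parity; a large gap flags the system for the
        independent auditing and ongoing monitoring described above.
        """
        values = list(rates.values())
        return max(values) - min(values)

    if __name__ == "__main__":
        # Hypothetical decisions from an ADM pilot, disaggregated by gender.
        decisions = [("woman", 0), ("woman", 1), ("woman", 0),
                     ("man", 1), ("man", 1), ("man", 0)]
        rates = selection_rates(decisions)
        print("Selection rates:", rates)
        print("Demographic parity gap:", demographic_parity_gap(rates))

Run against real decision logs, a gap well above zero would be one trigger for the independent auditing, certification and ongoing monitoring recommended above.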

2. TAKE CLEAR PROACTIVE STEPS TO INCLUDE AN INTERSECTIONAL VARIETY AND EQUAL NUMBERS OF WOMEN AND GIRLS IN THE CREATION, DESIGN AND CODING OF ADM.

New technologies offer new opportunities, including the creation of genuinely new structures that require new ideas and new teams.

Gender roles that are being dismantled in the real world are being baked into new ADM systems, along with old and stereotypical conceptions and associations of gender, race and class.

Innovative and inclusive thinking is necessary.

This imagination and skill can be provided by the largest untapped intellectual resource on the planet – women and girls.

Call To Action:

  • Gender balance in AI decision-making: Put gender balance on the official agenda of everyone involved in the funding, design, adoption and evaluation of ADM.
  • Gender balance in design teams: Employing a robust range of intersectional feminists in the design of ADM systems will trigger and assist greater innovation and creativity, as well as the detection and mitigation of bias and harmful effects on women, girls and the traditionally excluded.
  • Require companies to proactively disclose and report on gender balance in design teams. Incentivize companies with balanced teams.  
  • Require universities and start-ups to proactively disclose and report on gender balance in research and design teams, including upstream when applying for grants. Incentivize teams that are balanced and multi-disciplinary. 
  • Research fund: Create a research fund to explore the impacts of gender and AI, machine learning, bias and fairness, with a multi-disciplinary approach that goes beyond the computer science and engineering lens to include new ways of embedding digital literacy, and to study the economic, political and social effects of ADM on the lives of women and those traditionally excluded from rule-making and decision-making.

3. PURSUE INTERNATIONAL COOPERATION AND AN APPROACH TO ADM AND MACHINE LEARNING GROUNDED IN HUMAN RIGHTS.

Mass scale correction of skewed data systems will require multilateral and international cooperation to ensure we leave no one behind.

Call To Action:

  • A UN agencies-wide review of the application of existing international human rights law and standards to ADM, machine learning and gender: This can guide and provoke the creative thinking needed for an approach grounded in human rights that is fit for purpose in the fast-changing digital age.
  • Development of a set of metrics for digital inclusiveness: To be urgently agreed, measured worldwide and detailed with sex-disaggregated data in the annual reports of institutions such as the UN, the International Monetary Fund, the International Telecommunication Union, the World Bank, other multilateral development banks and the OECD (an illustrative sketch of one such metric follows below).
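
By way of illustration only, the short Python sketch below shows one way a sex-disaggregated indicator could be condensed into a single comparable digital-inclusiveness figure, a gender parity ratio. The indicator, function name and example rates are hypothetical assumptions, not metrics the institutions named above have agreed.

    # Minimal, hypothetical sketch of a sex-disaggregated digital-inclusiveness metric.
    # The example indicator (internet-use rate) and the sample figures are illustrative only.

    def gender_parity_ratio(female_rate: float, male_rate: float) -> float:
        """Ratio of the female rate to the male rate for a given indicator.

        1.0 means parity; values below 1.0 mean women lag behind men on that indicator.
        """
        if male_rate == 0:
            raise ValueError("male_rate must be non-zero")
        return female_rate / male_rate

    if __name__ == "__main__":
        # Illustrative, made-up internet-use rates for a single country and year.
        # Prints roughly 0.8, i.e. women at about 80% of men's rate.
        print(gender_parity_ratio(female_rate=0.48, male_rate=0.60))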

Women at the Table

June 2019

Call To Action delivered at the Women in Big Data Conference

Zürich, Switzerland
