Affirmative Action for Algorithms
In order to innovate and thrive in a rapidly changing global environment, new norms are needed.
The “standardized male” is the default of flawed systems and cultural standards that currently control how we live and work – defaults so normalized we don’t even notice. From 20th century drug trials, international standards and global trading rules, to 21st century algorithmic decision making and machine learning systems, this default has proven to harm people – and the bottom line.
The systemic exclusion of women from defining the old rules of the system, and their continuing exclusion from defining the new rules, call for new strategic and innovative thinking to achieve gender equality and to strengthen democracy in the new systems we create.
In July, Women@theTable released its position paper ‘Triple A: Affirmative Action for Algorithms – Artificial Intelligence, Automated Decision-Making & Gender’ – the first paper of its kind to focus specifically on gender, artificial intelligence and automated decision-making. The paper speaks of all intersections of women and girls, who serve as a proxy for all groups traditionally invisible and ‘other’ to the system – those traditionally left behind.
The position paper highlights the mounting evidence that gender bias and sexism are pervasive in Automated Decision-Making (ADM). From inherent bias in hiring, to selection bias and stereotypes in the delivery of ads to women, to the entrenched implicit stereotypes and unconscious bias that get translated into explicit misogyny through feminised machines like Alexa – women continue to be excluded and left behind.
We are at a critical turning point, made particularly urgent by the scale at which Automated Decision-Making (ADM) systems are being deployed. Machine learning trains on data and extracts its deep, implicit inferences: it makes the implicit information in the data explicit in the code, ‘intelligently mirroring’ what it has been given from the analog world. Worse, machine learning can amplify the historic bias in the data, crafting it into an embedded, exacerbated digital form – turning bias that is slowly being stripped from the analog world into a new digital, and potentially permanent, reality.
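How implicit bias in data becomes explicit in code can be seen with a minimal sketch. The data below is entirely synthetic and hypothetical: it assumes historical hiring records in which equally qualified women were hired at half the rate of men, and uses the simplest possible learned model (per-group hire rates) to show that the model faithfully reproduces the disparity it was trained on.

```python
# Minimal sketch with synthetic, hypothetical data: a model trained on
# biased historical decisions learns to reproduce that bias, making the
# implicit information in the data explicit in the code.
from collections import defaultdict

# Hypothetical hiring records: (gender, qualified, hired).
# Equally qualified men and women, but very different historical outcomes.
history = (
    [("M", True, True)] * 80 + [("M", True, False)] * 20 +
    [("F", True, True)] * 40 + [("F", True, False)] * 60
)

# "Train" the simplest possible model: the hire rate per (gender, qualified)
# group. Real ML models are far more complex, but pick up the same signal.
counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
for gender, qualified, hired in history:
    counts[(gender, qualified)][0] += int(hired)
    counts[(gender, qualified)][1] += 1

def predict_hire_rate(gender, qualified=True):
    hired, total = counts[(gender, qualified)]
    return hired / total

# Identical qualifications, different predicted outcomes:
print(predict_hire_rate("M"))  # 0.8
print(predict_hire_rate("F"))  # 0.4
```

The model has done nothing wrong by its own lights: it has simply mirrored the analog world it was shown, which is exactly the problem the paper describes.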
The position paper advocates for Affirmative Action for Algorithms (AAA) to correct real life bias and barriers that prevent women from achieving their full participation and rights in the present, and in the future we invent. It provides practical recommendations to governments, the private sector and civil society organisations, centred on accountability, inclusion, and cooperation, to advance the values of equality and to correct for the visibility and influence of half the world’s population – women.
A Global Alliance
Recognising the need for an alliance and coordination at the global level to address gender bias in ADM, Women@theTable is leading a global alliance for AAA. The AAA Alliance is comprised of concerned technology leaders, government agencies, nonprofits, and academics committed to addressing this issue while it’s still possible. The alliance was formally launched with a Declaration at Women in Big Data in Zürich in June 2019 and at London’s renowned Chatham House in July 2019.
AAA Alliance advisory board members
To date, the AAA Alliance advisory board includes senior leaders from Ciudadania Inteligente / The Smart Citizens Foundation, the Data-Pop Alliance, EPFL, Yale, MIT, the City of Barcelona, the World Wide Web Foundation, the World Bank Group, and others.
Awareness & Advocacy
The AAA alliance focuses on awareness raising and advocacy for algorithm accountability in framework discussions held multilaterally on trade (WTO), intellectual property (WIPO), standards (ISO, ITU, IEC) and human rights (HRC), among other international settings and institutions, primarily in Geneva.
Municipal Pilots
The AAA alliance is also committed to creating low cost targeted pilots that correct for gender bias across geographies on the municipal level using a tripartite and multi-disciplinary approach. This approach brings together coalitions of: academia – data scientists in conversation with social scientists; municipalities – cities in consultation with their citizens; and the private sector – technology companies, philanthropists and foundations. The AAA pilots will be designed to actively correct real world and digital bias, not only check for bias. AAA pilots embrace the principles of Smart Cities 3.0, which focus on the quality of the lives we want to lead and a bottom-up participatory approach: citizen co-creation and design, specifically including a gender analysis.
A Critical Turning Point
We are at a critical turning point, made particularly urgent by the scale at which ADM systems and machine learning are being deployed, and AAA pilots offer us an opportunity to ensure that machine learning does not embed an already biased system into all our futures. For example, AAA pilots could be used to allocate public housing, university scholarships, and certain social incentives and subsidies to women who have traditionally been left behind – correcting real life bias and barriers that prevent women from achieving their full participation and rights. Other pilots could focus on urban mobility and justice and crime prevention.
As Marshall McLuhan famously observed, “We shape our tools, and thereafter our tools shape us.” This is our immediate challenge. We must establish new tools, and new norms, for lasting institutional and cultural systems change now and for the century beyond. This concerns all corners of the world. It is crucial that we focus on gender equality and democracy for both women and men, now. Then everyone can thrive. We must leave no one behind.
We call on Governments, Private Sector and Civil Society Organizations to:
Undertake equitable algorithmic actions to correct real life biases and barriers that prevent women and girls from achieving full participation and equal enjoyment of rights:
1. ADVOCATE FOR AND ADOPT GUIDELINES THAT ESTABLISH ACCOUNTABILITY AND TRANSPARENCY FOR AUTOMATED DECISION-MAKING (ADM) IN BOTH THE PUBLIC AND PRIVATE SECTORS.
We must ensure machine learning does not embed an already biased system into all our futures.
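One concrete form such accountability guidelines can take is a routine audit of an ADM system’s outcomes by group. The sketch below uses the “four-fifths rule” from US employment-selection guidelines as an illustrative threshold; the decision data is hypothetical, and a real audit would of course be far more thorough.

```python
# Sketch of one concrete accountability check: an adverse-impact ratio
# compared against the illustrative "four-fifths" (80%) threshold.

def selection_rate(decisions):
    """Fraction of positive decisions (e.g. 'hire', 'approve') in a group."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(group_a, group_b):
    """Ratio of the lower group selection rate to the higher one."""
    rate_a, rate_b = selection_rate(group_a), selection_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical audit data: 1 = selected, 0 = rejected.
women = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]   # 20% selected
men   = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]   # 50% selected

ratio = adverse_impact_ratio(women, men)
print(round(ratio, 2))   # 0.4
print(ratio >= 0.8)      # False: the system is flagged for review
```

A check like this only detects disparity; deciding what counts as an acceptable threshold, and what correction follows, is exactly the kind of question the proposed guidelines must answer.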
2. TAKE CLEAR PROACTIVE STEPS TO INCLUDE AN INTERSECTIONAL VARIETY AND EQUAL NUMBERS OF WOMEN AND GIRLS IN THE CREATION, DESIGN AND CODING OF ADM.
New technologies offer new opportunities, including the creation of genuinely new structures that require new ideas and new teams. Gender roles being removed from the real world are being baked into new ADM with the old, stereotypical conceptions and associations of gender, race and class. Innovative and inclusive thinking is necessary. This imagination and skill can be provided by the largest untapped intellectual resource on the planet – women and girls.
3. PURSUE INTERNATIONAL COOPERATION AND AN APPROACH TO ADM AND MACHINE LEARNING GROUNDED IN HUMAN RIGHTS.
Mass scale correction of skewed data systems will require multilateral and international cooperation to ensure we leave no one behind.