Women at the Table

Help us make these practical and urgent recommendations for inclusive algorithms a reality.

We call on Governments, Private Sector, and Civil Society Organizations to:

1

Advocate for and adopt guidelines that establish accountability and transparency for algorithmic decision making (ADM) in both the public and private sectors.
We must ensure machine learning does not embed an already biased system into our futures. Algorithmic equity actions are needed to correct for the real-life biases and barriers that prevent women and girls from achieving full participation and equal enjoyment of their rights.

Public institutions to pilot and lead: Deploy Affirmative Action for Algorithms when public institutions pilot ADM. Base pilots on longstanding and new social science research, allocating social incentives, subsidies, or scholarships where women have traditionally been left behind by prior systems. This is a positive agenda to advance the values of equality we have long embraced, correcting for the visibility, quality, and influence of women in proportion to the population.

Public and private sector uptake of Algorithmic Impact Assessments (AIAs): A self-assessment framework designed to respect the public’s right to know which AI systems impact their lives, grounded in principles of accountability and fairness.

Rigorous testing across the lifecycle of AI systems: Testing should account for the origins and use of training data, test data, models, application programming interfaces (APIs), and other components over a product’s life cycle. Testing should cover pre-release trials, independent auditing, certification, and ongoing monitoring for bias and other harms. ADM should improve the quality of the human experience, not control it.

Strong legal frameworks to promote accountability: Including the potential expansion of powers for sector-specific agencies, or the creation of new terms of reference, to oversee, audit, and monitor ADM systems, providing regulatory oversight and legal liability for the private and public sectors.

Gender-responsive procurement guidelines: Organizations and all levels of government should develop ADM gender-equality procurement guidelines with hard targets, and outline the roles and responsibilities of the organizations required to apply these principles.

Improve datasets – Open gender-disaggregated data, data collection, and inclusive quality datasets: Actively produce open gender-disaggregated datasets; this enables a better understanding of the sources of bias in AI and ultimately improves the performance of machine learning systems. Invest in controls to oversee data collection processes and human-in-the-loop verification, so that data is not collected at the expense of women and other traditionally excluded groups. Engage in more inclusive data collection processes that focus not only on the quantity but also on the quality of datasets.

2

Take clear proactive steps to include an intersectional variety and equal numbers of women and girls in the creation, design, and coding of ADM.

New technologies offer new opportunities, including the creation of genuinely new structures that require new ideas and new teams. Yet gender roles being dismantled in the real world are being baked into new ADM, along with old and stereotypical conceptions and associations of gender, race, and class. Innovative and inclusive thinking is necessary. The imagination and skill can be provided by the largest untapped intellectual resource on the planet – women and girls.

Gender balance in AI decision making: Gender balance in decision making should be put on the official agenda of all involved with the funding, design, adoption, and evaluation of ADM.

Gender balance in design teams: Employ a robust range of intersectional feminists in the design of ADM systems; this will trigger and assist greater innovation and creativity, as well as the detection and mitigation of bias and harmful effects on women, girls, and those traditionally excluded.

Require companies to proactively disclose and report on gender balance in research and design teams, including upstream when applying for grants. Incentivize teams that are balanced and multi-disciplinary.

Research fund: Create a research fund to explore the impacts of gender on AI, machine learning, bias, and fairness, taking a multi-disciplinary approach that goes beyond the computer science and economic lens to include new ways of embedding digital literacy, and to study the economic, political, and social effects of ADM on the lives of women and of those traditionally excluded from rule making and decision taking.

3

International cooperation and an approach to ADM and machine learning grounded in human rights.

Mass scale correction of skewed data will require multilateral and international cooperation to ensure we leave no one behind.

A UN system-wide review of the application of existing international human rights laws and standards to ADM, machine learning, and gender: This can guide and provoke the creative thinking needed for an approach grounded in human rights that is fit for purpose in the fast-changing digital age.

Development of a set of metrics for digital inclusiveness: To be urgently agreed, measured worldwide, and detailed with sex-disaggregated data in the annual reports of institutions such as the UN, the International Monetary Fund, the International Telecommunication Union, the World Bank and other multilateral development banks, and the OECD.

Last modified: November 21, 2022