Women have been systematically excluded from standards-making, from data collection, and from defining the old rules of the system, and they continue to be excluded from defining the new rules – an exclusion aggravated by the shortage of women scientists creating ADM systems. Strategic and innovative thinking is therefore needed to achieve gender equality, and to strengthen democracy, in the new systems we create.
What makes this urgent, given the scale at which ADM systems are being deployed, is that machine learning trains on data and extracts the deep, implicit inferences within it. Machine learning makes the implicit information in the data explicit in the code: it 'intelligently mirrors' the information it has been given from the analog world.
Worse, machine learning amplifies the historic bias in the data, crafting it into an embedded, exacerbated digital form. Bias that is slowly being stripped from the analog world thus becomes a new digital, and potentially permanent, reality. We must ensure that machine learning does not embed an already biased system into all our futures.
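The mechanism can be illustrated with a minimal sketch. The data, scenario, and numbers below are entirely synthetic and hypothetical: a toy model "trained" on historically biased hiring records learns the bias in the data and reproduces it as an explicit rule in code.

```python
from collections import defaultdict

# Synthetic historical records: (gender, qualified, hired).
# The analog-world bias: equally qualified women were hired
# far less often than men. (Illustrative numbers only.)
records = (
    [("M", True, True)] * 90 + [("M", True, False)] * 10 +
    [("F", True, True)] * 50 + [("F", True, False)] * 50
)

# "Training": estimate P(hired | gender) among qualified applicants.
hired = defaultdict(int)
total = defaultdict(int)
for gender, qualified, was_hired in records:
    if qualified:
        total[gender] += 1
        hired[gender] += was_hired

def predict_hire_rate(gender):
    """The learned model: the implicit analog bias, now explicit code."""
    return hired[gender] / total[gender]

print(predict_hire_rate("M"))  # 0.9
print(predict_hire_rate("F"))  # 0.5
```

The model was never told to discriminate; it simply mirrored the pattern in its training data, and any system built on its predictions would now apply that pattern at scale.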