Women at the Table

To correct the real-life bias and barriers that prevent women, girls and all marginalized people from achieving full participation and rights in the present and the future we invent, we must ensure that machine learning does not embed already biased systems into our collective futures.

We Shape Our Tools, Thereafter Our Tools Shape Us

Our foundational paper from 2019 harks back to media critic Marshall McLuhan’s statement that “We shape our tools and thereafter our tools shape us,” giving a full landscape and set of recommendations on ways to invent a more equitable future with the algorithmic tools we create.

Artificial Intelligence Recruitment: Digital Dream or Dystopia of Bias?

In a post-COVID world we turn more and more to online recruitment. How does this affect those already left out of the system and the data? This paper, written in collaboration with a team from Skadden, Arps, looks at three jurisdictions, the UK, EU/France, and the US, to further understand what is already happening and where the law might go.

The Algorithmic Origins of Bias

Lately there has been a great deal of concern regarding the presence of social biases in artificial intelligence (AI) systems. With the increasing adoption of AI technologies in our daily lives, it is no surprise that chinks have started to show in AI’s armour.

<A+> Declaration

Written for a keynote at Women in Data Science Zürich 2019, this Call to Action has become the manifesto for the <A+> Alliance and for all of its and Women at the Table’s work.