• The Deadly Data Gender Gap

    All data tells a story.

    And like all stories, its power and purpose depend on the protagonist and their point of view.

  • Introduction

     

    If ‘history’, as Churchill famously said, ‘is written by the victors’, who, and which data, is telling our modern (and our historical) data story? Data’s narrative, like history, depends as much on what has been omitted from the record as on what has been committed to it. And our data story begins with the consistent omission of women.

     

    Data is everywhere and influences almost everything we do, often without our recognizing its influence. When it comes to data about people, unfortunately most of the data we are using tells only half the story because most of the data isn’t gender balanced, if it even includes women at all. What more could we learn, how much more could we earn, how much better could we serve, how much more of a contribution could we make if we knew the whole story? Or more startling still, how much harm could we prevent?

     

    This is an urgent story because lives are at stake. So are our efficiency and productivity. Great opportunity and creativity – drivers of innovation – have been and continue to be lost.

     

    This is because women have traditionally been left out of the data story. Twentieth-century drug trials, the design of car safety features, medical treatments, and the equipment we wear at work, to name a few examples, are based on data that takes a “standardized male” as its default – a default born of flawed systems and cultural standards that now form the physical framework and infrastructure of how we live and work. These defaults are so normalized that we no longer notice them, yet they have been proven to cause harm, sometimes with deadly consequences.

     

    Concurrently, the digital world is exploding with new and ever-growing technologies that have led, and continue to lead, to an exponential increase in the volume of available data, even in the least developed and most isolated places in the world. With the creation of unprecedented possibilities for informing and transforming society, we are seeing innovation and experimentation thrive. Whether data is the ‘new oil’, the ‘new gold’, a ‘new currency’, or a ‘strategic resource’, data is our new future.

     

    In the anatomy of storytelling, we are at the inciting incident, where something happens to disrupt the setting and inspire change: a data revolution compounded by Artificial Intelligence (AI) and Automated Decision-Making (ADM) systems that will lead to a crisis unless we choose a different path.

     

    If data is incomplete because it is missing half, or much more than half, of the population (taking into account that the Global South has been left out of many massive datasets and research), the entire arc of the data story loses more and more of its precision and safety.

     

    It becomes more and more biased as the data story perpetuates and retells the incomplete story. Incomplete data becomes complete bias as it is interwoven into the data story that is retold over and over. With the advent of AI and ADM systems, biases so ingrained that they are unconscious are not only being passed on to the next generation; they are becoming intractable as machines begin to learn from one another and make bias, in its many forms, explicit.

     

    The lifeblood of our world’s decision-making is data. We are therefore at a critical turning point – we must create new norms and write a new data story. We must make the data revolution a gender data revolution grounded in human rights. We need data that accurately reflects and represents the lives and diversity of women. Gender data must be available, accessible and analysed – then everyone can thrive.

     

    In order to advance the values of equality that societies have long embraced, we advocate taking action to correct the data so that women are represented in proportion to the population.

     

    This paper follows on from Women at the Table’s paper “We Shape Our Tools, and Therefore Our Tools Shape Us”: Affirmative Action for Algorithms, Artificial Intelligence, Automated Decision-Making and Gender. Here, once again, when we speak of women and girls we mean all intersections of women and girls. However, women and girls can also serve as a proxy for any group traditionally invisible and ‘other’ to the system – those traditionally left behind.
