The article, titled “Setting the right standards to address gender bias in research, datasets, and AI” and written by our founder and CEO Caitlin Kraft-Buchman, has been published by Healthcare Transformers, a content hub aimed at C-suite executive leaders in healthcare. The site showcases non-promotional, innovative thought leadership from experts in the field on topics of healthcare transformation.
- Systematically including women in research will benefit science, medicine and business performance
- Not including the female perspective could exacerbate existing gender biases as new technologies, including AI, emerge
- Collecting targeted and disaggregated data will be instrumental in plugging existing data gaps and overcoming gender bias in research and science
Systematically including women in research will drive scientific excellence, support advances in medicine, and open up new markets and opportunities for industry, Caitlin Kraft-Buchman, CEO and Co-Founder of Women at the Table, a systems change think tank based in Geneva, Switzerland, says in a recent interview with Healthcare Transformers.
Medical research has been heavily skewed to focus on male subjects.1 This has resulted in a poor understanding of the biology of females and is impacting the types of treatments and care available for women, but with the advent of new technologies, including Artificial Intelligence (AI), there is a renewed impetus to change the status quo when it comes to gender bias in research.
“We need to re-examine all of our old assumptions in a way that will allow us to collectively agree on what we need to move towards excellence in science,” Caitlin says. “We’re in a moment in history where we’re able to include the gender dimension into the way we design products. If we are able to make products work for more people, one would assume that there will also be more profitability for companies,” she says.
Plugging the gender bias gap in research
Much of the production of scientific and medical knowledge has, in the past, come from men, leaving women and the issues that concern them most excluded from research and literature.1 This has often prevented women from securing top posts in the scientific community, further perpetuating gender inequity.
Although some progress had been made in recent years – the top medical journals have all made public commitments to gender equity, diversity, and inclusion – women were forced to give up much of their hard-won ground during the pandemic.2 Many women who had built accomplished research careers had to divert their attention from work and take on additional domestic duties, such as homeschooling and looking after vulnerable relatives, as schools closed and care support networks fell away.
A survey published in the journal Nature Human Behaviour showed that female scientists, particularly those with young children, saw a “substantial decline in time devoted to research”. The authors cautioned that this could have important short- and longer-term effects on women’s careers and have urged institution leaders and funders to take action.3
The immediate impact was felt during the first few months of the pandemic. The British Medical Journal (BMJ) published data revealing that, of the 3 million submissions made to major health journals during the first six months of 2020, only 36% came from women. The gender gap applied to research and non-research articles and held across all positions of authorship, in both top-tier and lower-impact journals, according to the BMJ.2
Further analysis by the BMJ of 500,000 authors across the areas of basic medicine, biology, chemistry, and clinical medicine confirmed that the gender gap was increasing, underscoring the impact of the imbalance of contributions from male and female scientists across a number of domains.2
Designing algorithms to address gender bias in research
This alarming backslide in gender equity in research comes at a time when new technologies could exacerbate pre-existing bias.
Big data analytics, digital biomarkers, and natural language processing are all emerging as tools that could be used to advance and accelerate scientific progress, but there is a risk that existing biases could be permanently wired into AI algorithms unless decisive action is taken to address this.
“If you’re using skewed data for AI models, using data that are statistically biased and you add to that societal bias, we’re going to supercharge data that are essentially corrupt and incomplete,” Caitlin says.
Researchers writing in the journal npj Digital Medicine noted an increased awareness and evidence of biases since the introduction of biomedical AI.4
“AI could magnify and perpetuate existing biases and existing sex and gender inequalities if they are developed without removing biases and confounding factors,” the authors cautioned, adding that “undesirable bias can be accidentally introduced” into AI algorithms if samples are not representative or if there is bias in the overall population as a result of underlying social, historical or institutional factors.4
But emerging technology could also help to address gender imbalance, the authors wrote.4
“On the other hand, they [algorithms] have the potential to mitigate inequalities by effectively integrating sex and gender differences in healthcare if designed properly. The development of precise AI systems in healthcare will enable the differentiation of vulnerabilities for disease and response to treatments among individuals while avoiding discriminatory biases,” they said.4
Acting with urgency
For Caitlin, it’s clear that stakeholders from across the healthcare ecosystem need to act with urgency to ensure the right data are collected and used to form the basis for AI.
“It’s an urgent, urgent, urgent issue and everybody has an opportunity now to look at the models that they have deployed, how they’re working, whether they’re effective and accurate, and who they work for,” Caitlin says. “Researchers and executives, everybody needs to pitch in and collect more and better targeted and disaggregated data so that we can fill the gaps that exist. Then we will be able to create the solutions and protocols that will guide us as we move forward.”
Fundamental to driving successful outcomes will be understanding the problems we need technology to solve, asking the right questions, and evaluating the data being captured to ensure scientists create the most appropriate solutions, Caitlin says, citing an example that was published in the New England Journal of Medicine that highlighted how wrong assumptions can lead to the wrong conclusions and outcomes for patients.5
“Researchers looked at around 13 different algorithms that are used across the US and examined the inherent bias in the models. In one particular case, they looked at an Emergency Room algorithm whose designers worked from the assumption that the more often you went to a doctor, the sicker you were, and therefore the faster you would need to be seen if you went to an Emergency Room.
“But what they failed to take into account was the fact that the Emergency Room is often the first port of call for people in the US, who are otherwise unable to afford to see a doctor. The unintended consequence was that these people, who were the most sick but didn’t see a doctor often, ended up being pushed to the back of the line. I’m sure this was not the intention, but it just shows what can happen when you don’t include the right people with the right insights in the design of the algorithmic model,” Caitlin says.
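The flaw Caitlin describes is a proxy-label problem: the model ranks patients by past healthcare utilization as a stand-in for severity, so patients with cost barriers to care are pushed down the queue. A minimal, purely illustrative sketch (all names and numbers are invented; this is not the model from the cited study):

```python
# Hypothetical illustration of the proxy problem described above:
# ranking ER triage priority by past doctor visits under-prioritizes
# sick patients who could not afford regular care.

patients = [
    # (patient, true_severity 0-10, past_doctor_visits)
    ("A", 3, 9),  # mildly ill, frequent visitor
    ("B", 9, 1),  # very sick, but rarely saw a doctor (cost barrier)
    ("C", 6, 5),
]

def rank_by_visits(ps):
    """Proxy model: more past visits -> seen sooner."""
    return sorted(ps, key=lambda p: p[2], reverse=True)

def rank_by_severity(ps):
    """What triage actually wants: sickest seen first."""
    return sorted(ps, key=lambda p: p[1], reverse=True)

print([p[0] for p in rank_by_visits(patients)])    # ['A', 'C', 'B'] - B pushed to the back
print([p[0] for p in rank_by_severity(patients)])  # ['B', 'C', 'A'] - B should be first
```

The two rankings disagree exactly on the patient the system most needs to catch, which is why interrogating the assumptions behind a training label matters as much as the model itself.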
Learning from the International Organization for Standardization
The development of AI is moving forward at a blistering pace and as it does so, Caitlin believes it is crucial that everyone holds themselves to rigorous standards.
“We need to interrogate the quality of our own work and processes to make sure that the things we are developing will stand the test of time, that they are resilient, and that they are effective,” Caitlin says, adding that we can learn a lot from how the standards used to assess our built environment are starting to be made more inclusive.
“The built world has been made for this default male body and most standards are based on that. We know from the great work in the book Invisible Women that this is reflected in everything from beakers in the chemistry lab to the size of piano keys to air conditioning levels to crash test dummies. But we have an opportunity to think about things differently. We have an opportunity to create personalized medicine with rigorous critical analysis, new discovery, new science, and it’s time for us all to embrace these possibilities,” she says.
One learning could come from the way in which Caitlin and her team worked with the United Nations Economic Commission for Europe and other partners to draft the Gender-Responsive Standards initiative, which was adopted by the International Organization for Standardization (ISO) and all major standards organizations across the world in 2019.6
Now when standards are proposed or when they come up for review, organizations must look at whether they have considered the gender dimension and address deficiencies if necessary.
“The built world has made this commitment. It’s a slow and steady commitment to rethinking how the things we have built work for both men and women,” Caitlin says.
Creating the right incentives to tackle gender bias in research and science
Ultimately, making sure organizations from all parts of the healthcare pipeline are incentivized to meet the needs of both sexes will play a crucial role in shifting the needle, Caitlin says.
“We see that in terms of diversity, equity, inclusion, as soon as you start attaching to somebody’s performance review how they have helped diverse people in their workforce move through the ranks, get promotions, be enabled, things start to change because it’s being measured. And I think it’s the same concept in terms of pharma or health insurance or any sector dealing with the kind of important medical data that we have been talking about. This dimension needs to be part of their performance reviews. I think that would move things forward a lot,” Caitlin says.

Last modified: November 22, 2023