Why the exclusion of women from data matters

FT News

Automatic transcript

Treating men as the default human in economic planning is not only costly for society; the practice can also be deadly for women when applied to things like medical trials. This is the case made by Caroline Criado Perez in her new book Invisible Women: Exposing Data Bias in a World Designed for Men. Frederick Studemann talked to business editor Sarah Gordon about the arguments put forward.

Sarah, tell us about this book by Caroline Criado Perez, Invisible Women. What exactly is the problem that she's seeking to diagnose? We've heard quite a lot about, and you've written extensively on, things like the gender pay gap, but this is a different aspect of the gap between men and women.

Yes. I mean, what she's writing about here is the gender data gap, which she defines as the absence of women in a whole range of databases, which then affect resource allocation and policy decisions in healthcare, in car design, in disaster relief, in AI. So she not only goes through the evidence of that absence but then talks about what the consequences are, and in some cases the consequences are really serious.

And there are some extraordinary examples that, I have to confess, personally I wouldn't have thought of. It's everything from how seatbelts might be designed to how medical equipment is calibrated, and even an example, which you deployed in your review of the book, about how you plan for the clearing of snow. It's extraordinary the range of areas where this is prominent.

Yes, the snow clearing one is a very good example. It's a town in Sweden which decided to include women in the data that it used to decide its snow-clearing policies. What it had done was prioritise clearing snow from the roads first before getting to the pavements. It looked at patterns of accident and injury in winter snowfall situations, which in Sweden are many, and decided to reverse that priority, so to clear the pavements rather than the roads. Now, that not only benefited women in one particular way, in that there were more women on the pavements than on the roads, and the women on the pavements were also suffering more than the men on the roads, in the sense that pushing a buggy or a shopping trolley, or being an elderly woman walking on a pavement, turns out to be more dangerous, and you're more likely to be injured, than driving a car through the snow. So, number one, it benefited women, but it actually also benefited the public purse, because what they found is that in terms of the cost of injury, accident and emergency admissions, and lost work time from broken bones and whatever, they made a huge saving just by reversing that policy priority. And that's one of the examples she uses to look at the positive benefits of including women in the data. But what she does more of in the book is look at the whole range of negative consequences of excluding them, and some of these are actually life-threatening.

I mean, we talked about seatbelts. It gets worse.

Yes. The car design chapter is pretty shocking, in the sense that driving seats are all designed predominantly for the male body. And indeed Criado Perez talks a lot about this idea of the default human, and the default human, of course, is not just a man, it's also a white man. So what she also points out is that the ethnic gender data gap is even larger than the gender data gap alone. But one of the things I found most shocking was the healthcare chapter.
So, in some cases, a lot of healthcare databases feature mainly men, actually mainly young men, who take part in a lot of clinical trials. One of the reasons is their availability, but another is that women are deemed simply too variable. She has this fantastic phrase, which she takes from Henry Higgins: why can't a woman be more like a man, which is a song that Henry Higgins sings in the musical My Fair Lady. And this is one of the root causes of the problems with healthcare data: women are seen as too variable, and it is simply easier to measure male healthcare indicators. Anyway, the consequences of this are that, for example, in the US the level of heart activity at which you have a pacemaker fitted has been set based on male data. It should be lower for women, and therefore women are not having pacemakers fitted in situations where their lives could actually be saved by one. So the gender data gap, she argues, doesn't just have consequences for urban planning or whatever; it's actually, in some cases, fatal for women. And one of the other really interesting things she did was around disaster relief and the response to the Ebola epidemic in Sierra Leone. She talks about the fact that a lot of the aid was in the form of food aid. The food aid went into the quarantined areas, but the people providing it didn't think to provide the fuel. So the women who cook the food were going out of the quarantined areas to get the fuel to cook the food that was being brought into the quarantined areas to stop them leaving, and of course spreading the disease by doing so.

I gather she also goes beyond the human, if you like, to look at how this system is now being extended into the world of AI algorithms, also with negative consequences.

It's a very interesting book, but this is probably the most interesting, and I think also the most important, message in it: she argues that the gender data gap is not just there, it's actually worsening rather than improving. And one of the examples she gives for that is AI and the databases on which it is based, for example the software that's used to scan CVs. She gives the example of a software platform called Gild, which does the first sweep of CVs for technology companies that employ, or interview, software coders. And the software has decided that a particularly strong predictor of coding ability is how much time you spend on a certain Japanese online manga site. Now, as you doubtless know, manga sites are not female-friendly for lots of reasons; also, women, research shows, have less spare time to go online gaming.

Yeah, that's your line and you're sticking to it.

And therefore the data that's being used to preselect potential candidates is already excluding women. So what she's saying is that AI in some instances is not just perpetuating biases but actually amplifying them.

Fascinating. She's obviously done a lot of research and is diagnosing a lot of problems. Does she put forward solutions, ways this could be redressed or corrected? I mean, is it possible to actually remove these biases easily?

I mean, the biases are so broad-ranging that she doesn't say it would be easy to do that. But what she proposes, and I think it's one of the reasons why it's a very good book, rests on the point that in most cases this is not a wilful exclusion of women from research or from data.
It's an absence, a lack of presence, if you know what I mean. It's not willed, it's not malicious; it's simply, in many cases, that the men who are often the ones collecting the data don't see what they're doing. So, as with much of this whole debate, and that's why it's relevant beyond just data, a recognition of the problem goes some way to solving the problem. And in fact a lot of this is not rocket science. I mean, thinking about, for example, designing replacements for slum environments in Brazil: you don't just move apartment blocks, you move schools, you move families, so that grandparents can go on looking after children. You know, it's not rocket science; as soon as you see it, it's completely obvious what should be done, but you need to see it to do it.

So, in summary, you would say this book is an important contribution to that?

I mean, it certainly contains lots of information that I hadn't thought of myself or was simply not aware of, so it's interesting from that point of view. But I also think it's very important for policymakers to read, because, as we know, data guides resource allocation, and bad data means bad resource allocation.

Sarah, thank you very much.

Thank you.
