Why Tech Companies Are Limiting Police Use of Facial Recognition

Short Wave


All right, Emily Kwong. So we're talking about this announcement from a string of tech companies that they're going to put limits on their facial recognition technology, especially when it comes to law enforcement. Amazon, Microsoft and IBM. Yes. On June 8th, IBM said it would discontinue general purpose facial recognition and analysis software altogether. Get out of the business completely. And it made an impression. After IBM's big letter, Amazon announced a one-year moratorium on sales of its very popular software Rekognition, spelled with a "k," to law enforcement, to give Congress time to implement appropriate rules. So a one-year ban. Yes. Microsoft took it a step further, saying it wouldn't sell its products to law enforcement at all until a federal law is in place. Here's Microsoft president Brad Smith speaking to The Washington Post: "We need to use this moment to pursue a strong national law to govern facial recognition that is grounded in the protection of human rights." And for Mutale Nkonde, who has been pushing for regulation changes in tech for years, this was a big deal. When these words were coming out of Silicon Valley, she felt all of the feelings. "My initial reaction was, thank God. Thank God. I was happy. I was pleased. I was optimistic. I was short of breath. I was exhausted." Mutale is the CEO of AI for the People and a fellow at both Harvard and Stanford universities. For her, these announcements shifted the conversation, but that's about it. "So I'm pleased. It's got us incredibly far, but we're by no means out of the woods." Not out of the woods, because for all of the advancements, facial recognition systems still get it wrong. They'll incorrectly match folks, what's called a false positive, or fail to associate the same person across two different images, a false negative. Yeah, and what's vexing is that these errors are happening more often.
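For listeners who think in code, the two error types just described can be sketched in a few lines of Python. This is a toy tally over hypothetical match decisions, not any real vendor's API:

```python
# Toy illustration of the two facial recognition error types described
# above. Each decision pairs the system's verdict ("are these two images
# the same person?") with the ground truth. All data here is hypothetical.

def count_errors(decisions):
    """Return (false_positives, false_negatives) from (predicted, actual) pairs."""
    false_positives = sum(1 for predicted, actual in decisions if predicted and not actual)
    false_negatives = sum(1 for predicted, actual in decisions if not predicted and actual)
    return false_positives, false_negatives

decisions = [
    (True, True),    # correct match
    (True, False),   # false positive: two different people "matched"
    (False, True),   # false negative: same person, not associated
    (False, False),  # correct non-match
]
print(count_errors(decisions))  # -> (1, 1)
```

In real evaluations these counts are divided by the number of comparisons to get false positive and false negative rates, which is what makes cross-group comparisons possible.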
They're happening more often when the machines are analyzing dark-skinned faces, and that can disproportionately affect already marginalized communities, prone to unconscious bias at the hands of law enforcement, leading to false accusations, arrests and much worse. So until there's action on this, Mutale said, words just aren't enough. Gotcha. So let's unpack this a little bit. Let's talk about how bias gets into facial recognition systems in the first place. I'd love that. OK, so it starts with how the systems learn to do their jobs, a process known as machine learning. To make facial recognition systems, engineers feed algorithms large amounts of what's called training data. In this case, that would be pictures of human faces. Yes. The way machines learn is that they repeat a task again and again and again, developing a statistical model for what a face is supposed to look like. So if you wanted to teach the algorithm to recognize a man, you'd put in, like, millions of pictures of men. You got it. The machine will then measure the distance between the eyes in each picture, the circumference of the nose, for example, the ear-to-ear measurement, and over time the machine starts to be able to predict whether the next image it's seeing is, quote, "a man." Which sounds OK, right? Here comes the but. But the machine is only as smart as its training data. So remember Joy Buolamwini, who I mentioned at the top of the episode? Yeah, from MIT. Yes. So she and her colleague Timnit Gebru developed a way to classify skin color in these training sets, and the two they looked at were overwhelmingly composed of lighter-skinned subjects: 79 percent for IJB-A and 86 percent for Adience. These are two common data sets that were largely, as Joy put it, "pale and male." So basically, the training data used to create these algorithms is not diverse, and that's how that bias gets in. The diversity of human beings is not always being represented in these training sets, and so faces outside the system's norm
sometimes don't get recognized. Here's Mutale explaining what the research meant to her: "That goes back to this other issue of not just hiring, but a bigger issue of there being no one on the team to say that you haven't put in all the faces, all the digital images of what human beings could look like, the way they show up in society, in order to recognize these faces." And so, after realizing how unbalanced these training sets were, Joy and Timnit decided to create their own, with equality in race and gender, to get a general idea of how facial AI systems performed with a more diverse population. So basically, they fed it more diverse pictures to look at. Yeah, it was kind of interesting. They used images from the top ten national parliaments in the world with women in power, specifically picking African and European nations. And they tested this new data against three different commercially available systems for classifying gender: one made by IBM, the second by Microsoft and the third by Face++. And running these tests, Joy and Timnit found clear discrepancies along gender and racial lines, with darker-skinned faces getting misclassified the most. Here's Mutale again. "So one of the things that Joy Buolamwini's amazing work looks
at is the correlation between short hair and gender. So many, many, many Black women with afros were mislabeled as men, misgendered, because the system had trained itself to recognize short hair as a male trait." And this research project, Maddie, produced a massive ripple effect: further studies, legislation. In December, the National Institute of Standards and Technology, or NIST, published a big paper of its own, testing 189 facial recognition algorithms from around the world, and they found biases too. Looking at one global data set, some algorithms in their study produced one hundred times more false positives with African and Asian faces compared to Eastern European ones. And when tested using another data set of mug shots from the U.S., the highest false positives were found among American Indians, with elevated rates in African American and Asian populations as well, again depending on the algorithm. Wow. Yeah, that is not what you want from your data. And I'm guessing white men benefited from the highest accuracy rates. Yes, they did. Now, the NIST study did conclude that the most accurate algorithms demonstrated far less demographic bias. But for Mutale, this evidence of bias raises a bigger question about the ethics of relying on AI systems to classify and police at all. "The problem with AI systems, machine learning, is that they're really, really, really good at standard, routine tasks. And the issue with humans is that we are not standard. We're not routine. We're actually massively messy, right? We're not the same." But when a police officer searches a face in the system, they're not making arrests based on just that match alone, are they? Oh, absolutely not. Yeah, it's a tool for identifying potential suspects. But if you think about how there's already implicit bias in policing, critics of facial recognition are basically saying it doesn't make sense to embrace technologies riddled with bias, too. Right. If all this research has shown
these tools are capable of misidentifying Black people. "We cannot use biometric tools that discriminate against a group of people who are already discriminated against within the criminal justice system, but policing most specifically." Maddie, when I first spoke to Mutale in March, she was open to moratoriums on facial recognition, like Amazon is doing, buying time for these systems to improve and regulations to be put in place. But the protests have shifted her views. "Because why, why am I being moderate when we could completely reimagine how we interact with technology?" So now she wants to see facial recognition banned from law enforcement use, which some cities in the U.S. have done. Mutale has tried to push for legislation to outlaw discrimination in technology before, but it seems like now people are paying attention and have a language for talking about structural racism that they just didn't have before. "Whether white America listened to me or not, I was gonna continue with this work. I believe that technology should be an empowering force for all people, and that's my work. But now, having old and new, not just allies but co-conspirators, right? I'm so happy, because I didn't think it would happen in my lifetime. And it's, and it's..."
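One last sketch for the technically curious: the training-set imbalance discussed earlier, the roughly 79 percent lighter-skinned share reported for one benchmark, is the output of a simple composition audit. The set below is a hypothetical 1,000-image collection built only to mirror that quoted percentage:

```python
# Sketch of a training-set composition audit, in the spirit of the
# skin-type breakdown described earlier in the episode. The labels and
# counts are hypothetical, chosen to mirror the ~79% figure quoted.

from collections import Counter

def composition(labels):
    """Map each label to its share of the training set."""
    counts = Counter(labels)
    total = len(labels)
    return {label: count / total for label, count in counts.items()}

# Hypothetical 1,000-image training set skewed toward lighter-skinned faces.
training_labels = ["lighter"] * 790 + ["darker"] * 210
shares = composition(training_labels)
print(shares)  # -> {'lighter': 0.79, 'darker': 0.21}
```

An audit like this is cheap to run before training, which is part of the researchers' point: the skew was detectable all along, but no one on the team was tasked with looking.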

Coming up next