Why the racism in facial recognition software probably can't be fixed


It's been proven that facial recognition software isn't good at accurately identifying people of color. We also know that police departments around the country use facial recognition tools to identify suspects and make arrests, and now we know about what is possibly the first confirmed wrongful arrest made as a result of a mistaken identification by software. The New York Times reported last week that Robert Williams, a Black man, was wrongfully arrested in Detroit in January. Joy Buolamwini has been researching this topic for years as a computer scientist based at the MIT Media Lab and head of the nonprofit Algorithmic Justice League. She says, like racism, algorithmic bias is systemic.

"This is not a case of one bad algorithm. What we're seeing is a reflection of systemic racism in automated tools like artificial intelligence. Right now we're gambling with people's faces, and when we're gambling with people's faces, we're gambling with democracy."

It does seem like awareness and policy around facial recognition are starting to catch up, if slowly. But what more needs to happen? Is it the role of companies alone to not offer the software, or is it the role of governments to stop police departments from using this technology?

"We can't leave it to companies to self-regulate. While we have three that have come out to step back from facial recognition technology, we still have major players who supply facial recognition technology to law enforcement. We absolutely need lawmakers. So we don't have to accept this narrative that the technology is already out there and there's nothing we can do. We have a voice. We have a choice. What we are pushing for with the Algorithmic Justice League is a federal law that bans face surveillance across the nation. So I am encouraged by the latest bill I've seen from Congress, and I would highly encourage support of that bill so that at least across the country there are some base-level protections."

You wrote this post.
It ended with the line, "Racial justice requires algorithmic justice." Given that humans write the algorithms, this is sort of the constant question: how can algorithmic justice be accomplished?

"Sure. When we're talking about algorithmic justice, it's really this question about having a choice, having a voice and having accountability. And so what we're seeing with this Wild West approach to deploying facial recognition technology for surveillance is the lack of choice, the lack of consent. And so when we're thinking about algorithmic justice, it is this question about who has power and who has a voice. It needs to be the voice of the people. When we're talking about racial justice, the reason racial justice requires algorithmic justice is that increasingly we have tools that are being used to determine who gets hired, who gets fired, who gets stopped or arrested by the police, what kind of medical treatment you might receive, where your children can go to school. And insofar as AI is increasingly governing and serving as the gatekeeper to economic opportunities and educational opportunities, we can't get to racial justice without thinking about algorithmic justice."

Technology companies are so sure that there must be a way to make the technology work, and make it work largely for good. And I wonder if you think the events of the past month have really changed whether they keep working on this technology, or whether they might actually start to see the real limitations that you have seen all along.

"Companies respond to the pressure of the people, and I want to emphasize that the announcements from IBM to stop selling facial recognition technology, from Amazon to do a one-year moratorium, and from Microsoft to not sell to law enforcement came after the cold-blooded murder of George Floyd, protests in the streets, and years of employee activism. We have a responsibility to think about:
How do we create equitable and accountable systems? And sometimes what that means is you don't create the tool."

Joy Buolamwini is a computer scientist at the MIT Media Lab. But speaking of tools we should not create: researchers at Harrisburg University last week said they've created facial recognition software that can actually predict crime. We already know what could go wrong.

Coming up next