How constant surveillance puts protesters at risk


As Black Lives Matter protests continue around the country, police are using facial recognition and all kinds of other technology to arrest protesters and organizers. While in some cases the people arrested did commit crimes, after-the-fact arrests can have a chilling effect on free speech and lead to cases of mistaken identity. They also show us just how much surveillance is part of our lives. Simone Browne is a professor at UT Austin and the author of the book "Dark Matters: On the Surveillance of Blackness." She told me how police identified and arrested a protester in Philadelphia.

There was a tattoo on her arm, and she was wearing a T-shirt that said "Keep the immigrants, deport the racists." So the police used the images of her, and they went to find out where she got this shirt made. They found a comment that she had made on Etsy, they looked at her Instagram, they looked at her LinkedIn profiles, and they were able to match this image to identify her, and she was eventually charged. All this is to say that there are trails of data that we leave about ourselves that are being used to form a case.

To what extent, to your knowledge, is some of this technology being used to find and arrest protesters, and even protest organizers?

One chilling effect that organizers, but also the ACLU, became quite aware of and worked to challenge, around 2015 or 2016 or so, was this company Geofeedia, which is really a kind of social media analysis company that was working hand in hand with various policing agencies to monitor keywords. "Black Lives Matter," "protests," "jihad," all of these things were then tagged and flagged. Then, in some cases, you'd have a policing agency visit a potential protester. Of course, if you go onto Geofeedia's website now, there's really nothing there, just contact information. But we know that if Geofeedia is gone, another company will pop up and fill in that gap.

When you layer on all of this technology that you've described, it sounds like it could be relatively accurate, and I could see police departments falling into the idea that, although there have been concerns, facial recognition is now accurate, and once you add in social media, this should work great. What is the pushback to that?

On the idea of something "working great": if just one person is wrongly identified, say, for example, with facial recognition technology, then it's not working at all. These technologies rely on this idea that they are perfect and correct, but they really aren't. And so people are asking for a pause, because these technologies are not outside of this system in which we live, where, you know, Black people are criminalized.

How do you feel the long-term implications of this surveillance might play out? Will people be less willing to take the risk of exercising their right to protest?

I don't think so. We're in the middle of a pandemic, and yet people are still risking a lot to go out and protest and demand something better. One case, and I'll give you an example that I think is important, is DNA collection. A lot of people want to be armchair genealogists, or want to find family or some type of connection, and they use a company like, say, 23andMe or Ancestry.com or GEDmatch. And that same company, GEDmatch, was then recently purchased, I think just last year, by a company that has close ties to, you know, a policing agency. This company is not just about finding long-lost relatives; they say they're primarily for forensic analysis. So whether it's 23andMe or whether it's Ancestry.com, we have to really think about what happens to that data.

Well, it's interesting. It feels like a thing that privacy researchers have warned about for a long time: that there is essentially a big web of surveillance, that we're leaving tracks all the time, and that it isn't always obvious what the harm might be until something like this happens.

Exactly. But, you know, one of the places that I look to see what's perhaps the future, or the future that's already here, is airport security. There has been a lot of push for AI-enabled technologies to assess risk, to assess threat, and one of the things that a few companies are starting to develop now is emotion recognition. That might be that a traveler presents themselves at an airport and speaks to an avatar; one company's AVATAR actually stands for Automated Virtual Agent for Truth Assessments. This avatar will then ask them a series of questions, and then measurements are taken: the changes in their voice, heat or sweat, or any type of what might be termed a micro-expression of guilt, like your heart rate increasing, those types of things. It then assigns a certain threat category, to see if that person might be a threat to airport security. And so I don't necessarily know if these types of technologies are being used to monitor protests, activists, and other moments of rebellion, but it is something to look out for.

That pause was, yeah, that's terrifying. Is there ear recognition? Is that a thing?

Yes. There's recognition of everything, and it's almost like throwing something at the wall and seeing what hits. But the ear has been, it's a relatively stable part of the body, and that has been known since Alphonse Bertillon, who is said to be the father of forensic science, was using it in the 1800s as a stable way of recognizing or identifying the human body, to catalog people. And so there are researchers working on every part and piece of the body that you could think imaginable, as a way to try and shore up this idea, which is just an idea, that the human body is stable, that the human can be categorized and identified. And we know that bodies don't work that way, but, you know, the science does.

Your book is called "Dark Matters: On the Surveillance of Blackness." We've been talking about this in the context of protest. Why is this surveillance of particular concern to Black Americans?

It's a particular concern because it has been the state. I say surveillance is the fact of anti-Blackness, not only in the U.S. but globally. And so it's been a concern here in the U.S. for centuries: we can think about slave patrols, plantation control, all of these technologies that were put in place to deem Black people as outside of the right to have rights. But it's also why I think it's important to study the history of surveillance within transatlantic slavery, within plantation slavery, because it also offers us moments of resistance and moments of rebellion and escape to something different, something that looks like freedom.

Simone Browne is a professor at UT Austin and author of the book "Dark Matters."