Amazon, Microsoft, Shankar Narayan discussed on Coast to Coast AM

Automatic Transcript

Welcome back to the show. Hey, everything on our show comes back to Minority Report, and so it just kind of works for what we're about to talk about. Shankar Narayan is the Technology and Liberty Project director at the ACLU of Washington. Thank you so much for coming back on the show, because as soon as the Amazon facial recognition story popped up again, and how some developers are now protesting against it, we were like, well, now we have to revisit this.

Yes, indeed. I think it's very timely, especially given that facial recognition is being discussed in the state legislature right now and being pushed by both Microsoft and Amazon.

So let's start with: where would the average person see this being used?

Well, part of the issue is we really don't know what uses are happening right now. What we can say is that, for example, we did a public records request in several states for law enforcement agencies, and we found at least a couple of agencies that were using facial recognition in public spaces without any suspicion, without a warrant. In other words, you could be subject to recognition just for walking around in a public place, and that seems very problematic to us, especially given the demonstrated bias in this technology and the lack of control, or even notice or consent, for the person who is simply going about their daily life.

Well, let's talk about consent, because this has popped up in New York City, in some of their new apartment buildings and shopping malls. When you walk in, they have a placard displayed that says: by walking in these doors, you are giving consent. Is that really the way that it could work? Like, I could be walking through Westlake, and because I'm in that neighborhood, in that area, I have given consent because I've walked past a sign.
Well, you know, just the way you say it, I think you already know the answer. There really isn't an opportunity to consent there, particularly not if, for example, the only grocery store in your town is using facial recognition and your only alternative might be to go miles and miles away. Or even if an entire mall is using it, and the mall next door is using it too. If facial recognition becomes ubiquitous in these settings, then the placard isn't going to help you very much. You are simply going to have to walk in, because ultimately you will need to do business somewhere. And unfortunately, that is in fact the approach that's being considered in one of the bills moving in Washington state right now, and it really does go to show, I think, that we need to have a conversation about where we want to have facial recognition, where it's appropriate and where it's not. And part of the problem with private use, of course, is that those databases may be accessible to government actors for law enforcement purposes, as well as to third parties who may be using it, you know, for example, around your health insurance. So there's really no control over the technology right now, and our recommendation is to get out ahead of the technology rather than let it proliferate into the wild and figure out what goes wrong after the fact.

Can you explain, sort of legally, or at least from an ethical standpoint, that distinction, or the line between the two? I mean, obviously, we're on surveillance cameras all the time, but facial recognition sort of takes it up to that next level, a much more analytical approach to the same sort of thing. I mean, should people be less worried or less concerned at places that put them under that sort of perpetual surveillance (you walk into a convenience store, for example, and you're always on camera) as opposed to places that use facial recognition software?
I would say that facial recognition supercharges your regular camera-based surveillance, and that's for several reasons. One is that facial recognition is largely undetectable, because of course you can see the camera, but you won't see the facial recognition technology being used on that footage. For that same reason, facial recognition is more pernicious because it can also be used after the fact on any video or still image. So, you know, the government doesn't have to decide, or a private entity doesn't have to decide, who to follow around with the facial recognition tool; they can just use any footage that's been collected, that exists. And the third reason is, of course, that you can't change your face. If you don't want to have your license plate scanned, you can choose not to drive; you can make other choices around your privacy having to do with, you know, your apps and how you do business on the internet. You cannot change your face. It is immutable. When you walk around, your face goes with you. And that's why I think researchers have called for caution and have called for a lot of concern in how facial recognition is rolled out. In fact, I just saw today a brand-new article by Luke Stark, who ironically is a researcher for Microsoft, in which he compared facial recognition, he called it the plutonium of AI, and that was because he thought it had the potential to really change our society, just by way of changing how the relationship works between the government and the governed. And I think we should take those concerns seriously, particularly again, given that there is now a mounting body of evidence that the technology is biased.

So let's talk about that. You mentioned Microsoft and Amazon, and they're in Olympia pushing this technology and using it in different ways. How are they doing that?

Well, as part of the data privacy bill that we discussed the last time we were on, a little-known component of that bill
actually authorizes facial recognition. The bill that came out of the Senate had this clause that basically said, you know, if a business posts a placard, you've given consent. It also authorized widespread law enforcement use of the technology. And of course, we think that's a little bit backwards, because we haven't really had a discussion of the impact of the technology. And I think we need to be very careful before the legislature simply blesses the technology and lets it proliferate, before we figure out how it's actually going to change our society. Remember, facial recognition isn't just about recognizing someone, right? It's about much more than identification. It's about things like whether someone is happy or sad, whether someone is potentially dangerous. So, for example, imagine facial recognition incorporated into a police body camera that gives the officer a score, based on analyzing someone's face, as to whether they're dangerous or not, and that officer making a life-or-death decision based on that technology, which again, we know may not be accurate, and is actually less accurate for people of color, for women, and for other vulnerable groups. So that's the kind of world that we're taking a headlong leap into. And unfortunately, I think the approach these technology vendors, who have largely written this legislation, are taking is to say, you know, trust us, we're going to roll the technology out, and there may be mistakes, but it's going to be fine. I think our approach would be more cautious: let's have this discussion with the right stakeholders at the table, and if there are limited ways in which we should use facial recognition in our society, let's put some really strong safeguards around them. But we're not there yet.

So, let's say you had a new iPhone and it had facial recognition unlock. Would you use it?
No, I would not, because I think it's not worth the risk to my privacy to unlock my phone with my face. Honestly, I'm perfectly happy unlocking my phone with a password. It's generally worked well. And again, you know, it's a biometric that's collected on you, your face print, that you can never change. And once that face print falls into the hands of third parties, your privacy and security, not just of that piece of data, but of all the rest of the data you have that may be subject to access with it, are permanently jeopardized. That's too big of a risk.

That's Shankar Narayan, director of the Technology and Liberty Project at the ACLU of Washington. Shankar, thank you so much for spending some more time with us. And I'm sure Amazon and Microsoft and Facebook and Google and everyone else will do something else that we'll have to have you back on for again. Much appreciated.

I would be happy to come back. Thank you.

Kimmy Klein taking a look at your drive now, this time brought to you by Subaru of Puyallup.

Coming up next