2 Burst results for "Washington Shankar"

"washington shankar" Discussed on KIRO Radio 97.3 FM

KIRO Radio 97.3 FM

09:48 min | 2 years ago

"washington shankar" Discussed on KIRO Radio 97.3 FM

"Northwest. Moto? Welcome back to the gap sort of tank tops got for you. Oh, yes. Hey, Shankar, Lou. Hey, everything on our show comes back, the minority report. And so it just kind of works for what we're about to talk to you about. So shocker Narayan is with the technology. Liberty project director at the ACLU of Washington. Thank you so much for coming back on the show because as soon as you know, the Amazon facial recognition story popped up again, and how now some developers are protesting against it. We were like well now we have to revisit this. Yes. Indeed. I think it's very timely especially also given that facial recognition as being discussed in the state legislature right now and being pushed by both Microsoft and Amazon, so let's start with where would the average person see this being used? Well, part of the issue is we really don't know what uses are happening right now what we can say is that for example, we did a public records request him several states for law enforcement agencies, and we found at least a couple of agencies that were using facial recognition and public spaces without any suspicion without a warrant. In other words, you could be subject to recognition just for walking around in a public place, and that seems very problematic to us, especially given the demonstrated by season technology and the lack of control or even notice consent for the person who is simply going about their daily life. Well, let's talk about consent because this has popped up in New York City one of their new apartment buildings and shopping malls. When you walk in they have a placard display that says by walking in these doors, you are gay. Consent. Is that really the way that it could work like I could be walking down through Westlake. And because I'm in that neighborhood in that area. I have given consent because I've walked past a sign. Well, you know, just the way you say it, I think you already know the answer there really isn't an opportunity to consent there, particularly not if for example, the only grocery store in your town is using facial recognition, and you're only alternative might be to go miles and miles away. Or even if an entire mall is using it if the mall the next door over as using it. If facial recognition becomes ubiquitous in the settings, then the placard isn't going to help you very much you you are simply going to have to walk in because ultimately, you will you will need to do business somewhere. And unfortunately, that is in fact, the approach that's being considered in one of the bills that's happening in Washington state right now, and it really does go to show. I think that we need to have a conversation about where we want to have facial recognition where it's appropriate and where it's not and part of the problem with the private use. Of course, also is. That those databases may be accessible to government actors for law enforcement purposes, as well as to third parties who maybe using it, you know, for example, around your health insurance. So there's really no control over the technology right now, and our recommendation is to get out ahead of the technology rather than let it proliferate into the wild figure out what goes wrong after the fact, can you explain sort of legally, or at least from a from a ethical standpoint that distinction or the line between there's I mean, obviously, we're on surveillance cameras all the time. But face recognition sort of takes it up to that next level of a much more analytical approach to the same sort of thing. 
I mean, should people be less worried or less concerned at places that put them under that sort of perpetual surveillance (you walk into a convenience store, for example, and you're always on camera) as opposed to places that use facial recognition software?

I would say that facial recognition supercharges your regular camera-based surveillance, and that's for several reasons. One is that facial recognition is largely undetectable: you can see the camera, of course, but you won't see the facial recognition technology that's being used on that footage. For that same reason, facial recognition is more pernicious because it can also be used after the fact on any video or still image. So the government, or a private entity, doesn't have to decide whom to follow around with the facial recognition tool; they can just use any footage that's been collected, that exists. And the third reason is, of course, that you can't change your face. If you don't want to have your license plate scanned, you can choose not to drive; you can make other choices around your privacy having to do with your apps and how you do business on the internet. You cannot change your face. It is immutable. When you walk around, your face goes with you. And that's why I think researchers have called for caution and for a lot of concern about how facial recognition is rolled out. In fact, I just saw today a brand-new article by Luke Stark, who ironically is a researcher for Microsoft, in which he compared facial recognition to plutonium; he called it the plutonium of AI, because he thought it had the potential to really change our society, just by way of changing how the relationship works between government and the governed. And I think we should take those concerns seriously, particularly again given that there is now a mounting body of evidence that the technology is biased.

So let's talk about that. You mentioned Microsoft and Amazon, and they're in Olympia pushing this technology and using it in different ways. How are they doing that?

Well, as part of the data privacy bill that we discussed the last time we were on, a little-known component of that bill actually authorizes facial recognition. The bill that came out of the Senate had this clause that basically said, you know, if a business posts a placard, you've given consent. It also authorized widespread law enforcement use of the technology. And of course we think that's a little bit backwards, because we haven't really had a discussion of the impact of this technology, and I think we need to be very careful, before the legislature simply blesses the technology and lets it proliferate, that we figure out how it's actually going to change our society. Remember, facial recognition isn't just about recognizing someone, right? It's about much more than identification. It's about things like whether someone is happy or sad, whether someone is potentially dangerous. So, for example, imagine facial recognition incorporated into a police body camera that gives the officer a score, based on analyzing someone's face, as to whether they're dangerous or not, and that officer making a life-or-death decision based on that technology, which again we know may not be accurate, and is actually less accurate for people of color, for women, and for other vulnerable groups. So that's the kind of world that we're taking a headlong leap into.
And unfortunately, I think the approach that these technology vendors, who have largely written this legislation, are taking is to say, you know, trust us, we're going to roll this technology out, and there may be mistakes, but it's going to be fine. I think our approach would be more cautious: let's have this discussion with the right stakeholders at the table, and if there are limited ways in which we should use facial recognition in our society, let's put some really strong safeguards around them. But we're not there yet.

So, let's say you had a new iPhone and it had the facial recognition unlock. Would you use it?

No, I would not, because I think it's not worth the risk to my privacy to unlock my phone with my face. Honestly, I'm perfectly happy unlocking my phone with a password; it's generally worked well. And again, you know, it's a biometric that's collected on you, your face print, that you can never change. And once that face print falls into the hands of third parties, your privacy and security, not just of that piece of data but of all the rest of the data you have that may be subject to access with it, are permanently jeopardized. That's too big of a risk.

That's Shankar Narayan from the Technology and Liberty Project; he's the director there at the ACLU of Washington. Shankar, thank you so much for spending some more time with us. And I'm sure Amazon and Microsoft and Facebook and Google and everyone else will do something else that we will have to have you back on for again. Much appreciated.

I would be happy to come back. Thank you.

Kimmy Klein taking a look at your drive now, this time brought to you by Subaru of Puyallup. Kimmy? In Everett, our long-term accident...

"washington shankar" Discussed on KIRO Radio 97.3 FM

KIRO Radio 97.3 FM

08:48 min | 2 years ago

"washington shankar" Discussed on KIRO Radio 97.3 FM

"Tracy Taylor. In focus. Candy, Mike and tasha. What is the most valuable thing you own your house? Your car. Now, it's your data Washington. Lawmakers are trying to address how companies collected share data what they know about us. How they sell it with do with it. A new Bill suggests you can find out who knows what about you. And then just contact the company and say, delete me. And then we just trust them to do that shocker does that even sound remotely close of what this Bill is asking, and what we are assuming it's going to do for us. Unfortunately, the reality and the the marketing around this Bill or two very different things. It is it is being marketed as something that's supposed to give power back to the consumer in the area of data privacy. But unfortunately, the first problem is that would written by the technology companies themselves, and it was really stacked up with loophole. So that a company that holds your data can actually override your consent around that data, even with your lack of consensus, you withdraw your consent to they're using your data. They can still come back over right that this Bill is really problematic. Speaking with us is Shankar Narayan with ACLU with Eric Civil Liberties Union, how do you how can you actually regulate? This sort of data privacy thing on a state by state basis that seems like it would be almost fundamentally impossible is that true. It's certainly difficult to do. And it would be preferable. I think if if congress were to enact strong data protection or if our federal regulators like the FTC were to do. So, unfortunately, the prospects were that are not looking great. And so it is possible. I think for states to take a leading role California, for example, inactive their consumer Privacy Act, which have stronger protections against the sale of your data. But unfortunately, I think the technology companies have taken notice and are looking to enact weaker statute in different states, including the one in Washington Shankar. Can you give us an example? So let's say I mean, you I'm sure you've dealt with people who have had these nightmare scenarios on their hands wants their data gets out there. What are some of the problems that you routinely? See, well, certainly what we see is that consumers rarely even know what they're consenting to when they do business with an entity. So really what's happening? When you use a, sir. Service when you do business online. Even when you tap a search term into a web browser. You are accepting terms of service that largely allows your data to be shared and soul to third parties without your really understanding that that's going to happen. And the problem with that is that so much of our decision making is now data driven that those pieces of data can come back to buy two down the line, for example, when you try to get a job or when you try to get a loan. Those data brokers will have sold your data to the company that Ben will use an algorithm or a program to figure out if you are actually worthy candidate for that loner for that job. And you will not know which piece of data at was that you let go off Deo gave up that had that consequence for you. This Bill is being marketed away to to reverse that. But unfortunately, doesn't give real power back to the consumers. What can you can you? Tell us a little bit about the gap between what's happening in California, for example, and what is potentially going to happen here. Yes. 
Certainly. So, you know, the gold standard in this space right now really is the European approach, and the basis of that European approach to data privacy is that the business that collects data has to get meaningful consent, and they actually have to get that consent for a specific business purpose. What happens to that data down the line, what they're allowed to do with it, really depends on the business purpose that they've declared. Here it's kind of the opposite: in the Washington statute, we basically presume that the business has a purpose, that they are legitimately collecting, and the burden goes back on the consumer to come back and fight. And, you know, even if they withdraw their consent, the entity that holds the data can simply declare a different business purpose and continue to hold it, despite the fact that you withdrew your consent. So we think the more consumer-focused model of Europe, and of California, which follows a similar model, is much better than making it optional for the companies to decide and giving them so many different ways to override the will of the consumer.

So what about AI? Because that's something that does come up when you talk about your data, and what somebody else has, and who's generating it, who's picking it. What are some of the concerns with the algorithms and the data that we need to look at now to help us in the future?

Yeah, that is a really good question, and it goes back to the decision-making processes that I talked about. Many industries are adopting automated decision systems, which are called ADS, which are algorithm-based systems that use the data that people are giving up to make decisions about them. For the first time in any state, a bill was introduced this year in Washington, with the ACLU's backing, that would actually make it mandatory for there to be public scrutiny around these algorithms, because of course the first problem is they're often deployed completely silently: the consumer isn't aware, when they make a credit application or when they get a public benefits check, that the determination around that is being made by an algorithm. So the first thing to do is surface them; the second thing to do is to figure out if they're being fair in their decision making, by opening them up to scrutiny. That's what the bill in Washington would have done. It unfortunately didn't advance, but we are going to come back to try to make that a part of Washington law, and that is extremely important as well.

So most of us have given up some rights to our own data. When we don't read the fine print, right, we are acknowledging that we are conceding this stuff to whatever company it is we're buying a service from or acquiring a product from. This law, I assume, supersedes any previous granting of dissemination of our data. But is it possible for companies to then find a clever rewrite of these things that no one ever reads, and just essentially change the stuff that we're approving, and essentially, you know, neuter any new law?

Well, it really depends on how you write the law. And that is, in fact, one of the problems with the bill that's in front of the legislature right now: among other things, the definition of consent has been watered down. You know, it's got to be meaningful consent. And what we know right now is that for consumers, it's really kind of a chicken-and-egg thing, right? They don't know, first of all, what data they're giving up.
And as a double whammy, they actually don't know why that data is so valuable, because they don't know who it's going to be sold to and what decisions are going to be made on the basis of that data. So we need to really make that equation clear on the consumer side, with a lot of education as to how your life can be impacted. Right? This is not just about you seeing targeted ads, which is what a lot of people think is the consequence of giving up their data. That's the least of your worries. Your worries really are not getting a job, not getting housing, your benefits getting cut in half, your health care, your health insurance premium going up. You know, all of these real-world consequences that have meaningful impacts on people's lives are being driven by data, and I think we need to educate consumers better on the fact that this is happening, and also rewrite the laws so that there's more transparency around what's being collected. And I think, you know, the European model is the way we really need to go to try to achieve that. But unfortunately, the...
