Apple, Ben Yellen, University of Maryland Center for Health and Homeland Security discussed on The CyberWire

The CyberWire

Automatic Transcript

Joining me once again is Ben Yellen. He's from the University of Maryland Center for Health and Homeland Security, and also my co-host on the podcast Caveat, which, if you have not checked out yet, what are you waiting for? Listen, Ben, welcome back.

Thanks, Dave.

We would be remiss if we did not discuss the recent hubbub over Apple's announcement that they are going to be scanning iOS devices for CSAM, which is child sexual abuse material. Can we just do a quick overview here? From your point of view, what's going on?

So Apple is doing two things. First, they are scanning iMessages. If a parent opts in, Apple will scan the messaging application on iOS devices for nude images involving minors. If a minor is between thirteen and eighteen years old, the minor would be notified, would get an alert warning them, "You're about to send or receive a nude image." If it's a child under thirteen, that warning message would also go to the parents. I think there are fewer civil liberties objections to that particular announcement from Apple.

The announcement that presents more significant civil liberties concerns, in my view, is that Apple is going to scan photos in iCloud against a known database of child pornographic images. If they discover that an image matches one in that database, they could potentially share that information with the government, and that could lead to a criminal prosecution.

Now, the sticky wicket here is that there are plenty of tech companies that scan their cloud services for these sorts of images; that is routine at this point. Facebook, Google, Dropbox, they all do that. What sets Apple apart is that their plan is to do the scanning on device. So it's not just in the cloud, it's on the device itself, and there's no technological reason they couldn't scan a hard drive, for example. They're making a policy choice to confine this, right now, to photos that are uploaded to iCloud, but the technology exists to scan somebody's device even if they don't upload that photo to the cloud.

So this presents many potential civil liberties concerns. It's not per se a Fourth Amendment violation, because this is a private company. But the government, of course, knowing that Apple has instituted this practice, this policy, is going to know that Apple probably has access to information that would be valuable for criminal prosecutions. And we know the government has tried hard to get Apple to reveal encrypted communications, to give the government access to encrypted communications. And it's not just our government. Even though this program is being piloted in the United States, it certainly will eventually be available to overseas governments that are far less concerned with civil rights and civil liberties. And even though it's being used right now for CSAM, it could be used for other purposes: to scan images, to scan messages, for disfavored political content or for censorship purposes.

So the idea is, once you build this technology and once you put it into practice, as Apple plans to do over the next several months, then you have created this back door. And even though you are trying, you are claiming, to confine the use of this technology in the short term, once the technology is created, Apple is going to be under enormous pressure from governments around the world to use it for more expanded purposes. And so that's the inherent danger here.

We should mention that users do have the ability to opt out. If you don't use Apple's iCloud Photos service, your photos on your device, according to Apple, won't even be scanned. They won't be looked at unless you're using their cloud services. But that doesn't seem to be putting people at ease.

Yes, so first of all, as I said before, that's a policy choice, not a technological choice. Apple, of course, could still scan your device. They do it for a bunch of other purposes, right, to find malware on your MacBook, for example. So that's not necessarily anything new; that's a policy choice that they're making now. And I think the concern is that this is going to be a slippery slope, where a government says, "If you really care about stopping child exploitation, why confine these searches to photos that have been uploaded to iCloud? Why can't you also search photos that have been saved to a hard drive, or have just been saved on a single device?" So I think that's the concern; it's more of a slippery slope.

I think the fact that this is Apple carries an increased weight, as opposed to another service provider. Apple presents itself as being very committed to user privacy, to the protection of users' information. That's how it sells itself; that's how they present themselves publicly. And so I think this cuts against one of their professed corporate values, which is the protection of private information. They're put in a tough place, because obviously to be against this is, seemingly, to be against rooting out sexual exploitation of minors, right? The intentions here are very noble, and I think we have to acknowledge that. I think we have to acknowledge that the problem they're trying to solve is, of course, of the utmost importance. But I think the method by which they're engaging in this type of surveillance of their own users could come back to harm those users. And so I think we have to be honest about that as well.

It also strikes me that, in some ways, Apple has a corporate culture, I believe, of kind of knowing what's best for its users.

Yes, right.

It's that old line, like Henry Ford supposedly said, "If I'd asked my customers what they wanted, they would have said faster horses," or buggy whips, or something along those lines. And so Apple, all along in its history, has said, "You don't need that floppy drive anymore. You don't need that headphone jack anymore." And I think that aligns with Apple's surprise at the backlash here. I think Apple thought that they did the hard work of designing what is, I think most people agree, a very clever technological solution to this, and yet people are still having a very strong reaction.

Yeah, I think a couple of things go into that. One is we have values in this country about protecting private information. Some of that is inherent in our legal system: the Fourth Amendment protects us against unreasonable searches and seizures. So even though this, as of now, isn't an action the government is taking, it does seem contrary to our values, where we don't want anybody in our protected private spaces. And that certainly includes technological spaces, including iCloud, where we store our photos. So I think that's a huge part of it.

The other part of it, like I said, is the fact that this is supposed to be the company that most stringently protects privacy. And so if Apple is doing it, then what does that mean for every other company that doesn't present itself as protecting our private information? And what does it mean for technology companies that are based overseas, in more authoritarian countries? Are they going to learn from Apple and deploy this technology in a way that doesn't just target exploitative images and that sort of thing, but is used to go into messaging applications, to go into photos, and try to crack down on free speech or political dissent? I think that's kind of the nature of the backlash as I see it.

All right, well, there's much more to this conversation, and in fact we spend the entire episode of Caveat this week discussing it.
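[Editor's note: The "matching against a known database" discussed above works on image fingerprints (hashes), not raw photos. Apple's actual system uses a proprietary perceptual hash called NeuralHash combined with cryptographic threshold matching, none of which is public code. The sketch below is purely illustrative: it uses a toy average-hash and Hamming-distance comparison to show the general technique of matching slightly altered images against a database of known fingerprints.]

```python
# Illustrative sketch of hash-based image matching, the general idea behind
# scanning photos against a known database. This toy average-hash stands in
# for a real perceptual hash (e.g., Apple's NeuralHash); it is NOT Apple's
# algorithm, just a minimal demonstration of the concept.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of brightness values).

    Each bit is 1 if the pixel is brighter than the image mean, else 0.
    Visually similar images tend to produce similar bit strings.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming_distance(h1, h2):
    """Count of differing bits between two hashes."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def matches_database(pixels, database, threshold=2):
    """True if the image's hash is within `threshold` bits of any known
    hash, so small alterations to a flagged image still match."""
    h = average_hash(pixels)
    return any(hamming_distance(h, known) <= threshold for known in database)

# Known database: hashes of flagged images (here, one 4x4 toy image).
flagged_image = [
    [200, 200, 10, 10],
    [200, 200, 10, 10],
    [10, 10, 200, 200],
    [10, 10, 200, 200],
]
database = {average_hash(flagged_image)}

# A slightly altered copy still matches; an unrelated image does not.
altered = [row[:] for row in flagged_image]
altered[0][0] = 180  # small perturbation, same hash neighborhood
unrelated = [[i * 16 + j for j in range(4)] for i in range(4)]

print(matches_database(altered, database))    # True
print(matches_database(unrelated, database))  # False
```

The policy tension Ben describes follows directly from this design: nothing in the matching step cares *what* the database contains, so the same mechanism would match any set of fingerprints a government compelled a company to load into it.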
