Rene Ritchie, Apple, and Karen Freeman discussed on iMore
2021. Today we're talking about Apple's latest controversial child safety initiative, which was announced yesterday, and we're also going to dive into some rumors about the rapidly approaching fall product rollout. Joining me today, as always, is iMore contributor Karen Freeman. Hey, Karen. And joining us as one of our special guests is iMore writer Stephen Warwick. Hey, Stephen. Hello. I'm definitely, like, the lesser of the special guests. Also joining us as our other special guest this week, my friend and yours, Rene Ritchie of — I believe it's pronounced — YouTube dot com. All right, great to be here. It's great to have you on the show, Rene, because, yeah, it's like having someone back for a nice family visit. And it's good that you're here this week, Rene, because it was a fairly quiet week for news — fairly low-key, we had some rumors, you know, maybe a smaller Mac Pro coming down the line — and then yesterday happened. I feel like I'm saying that more and more. Because Apple came out with these — well, they announced these — new child safety protections that will be coming with the major software updates, let's say next month just to be safe: iOS 15, iPadOS 15, and macOS Monterey. So, Rene, we wanted to have you on the show to just kind of walk us through some of that. What exactly is going on here?

All right, so there are three separate features that are coming — US-only at first — in iOS 15 and iPadOS 15. The first one: the idea is, if you are enrolled as a child — you have a device that has parental controls on it — and you get an explicit image, it will warn you that the image is explicit. It's using on-device machine learning to determine what an explicit image is. Don't make any of your impossible-burger jokes, because I'm sure they've heard them all already. A warning pops up:
this might be an explicit image, do you want to see it anyway? And you can choose yes or no. If you're under the age of thirteen and it's enabled, and you choose yes, it's going to say: are you really sure? Because we're going to alert your parents that you're getting this image. And then you can still say, yes, sure, it's literally a picture of a hot dog from the last baseball game, get off my back, Apple. And the same thing happens if you send an image. If you go to send it, it'll say: this looks like a sexually explicit image, are you sure you want to send it? And you can say yes, and if you're under the age of thirteen — and again, your parents have enforced that permission — it will say: are you really sure? Because we're going to tell your parents. And then you can say yes or no. And some people have looked at this as, oh, it's breaking two-factor — sorry, end-to-end encryption. It's not, in that end-to-end encryption is about disclosure: nobody who shouldn't be seeing the message is seeing the message. If you choose to send a message to somebody, you're opting them in to seeing that message. And this is the same thing — you, or your legal guardian, get to decide if someone else gets to see messages under certain circumstances. So nobody has to be enrolled in this. You'll know for sure if you are enrolled in this; nobody can enroll you in it without your knowledge. You may not like that it exists if you're under the age of thirteen, but it's there, and Apple is intending this to prevent grooming. Some people say, why don't you do this for adults? Nobody wants to get those eggplant pics — let all of us have this as an option. And they're not commenting on that yet.
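The warning flow Rene walks through — on-device classifier flags the image, the child confirms, and under-13s with the parental setting on get a second "your parents will be notified" confirmation — can be sketched roughly like this. This is purely an illustrative Python sketch of the decision logic as described on the show; none of these function or parameter names are Apple's actual API.

```python
# Illustrative sketch (not Apple's API) of the Messages warning flow:
# which prompts a child would see before viewing a flagged image.

def prompts_for_incoming_image(image_is_explicit: bool,
                               child_age: int,
                               parent_alerts_enabled: bool) -> list:
    """Return the sequence of confirmation prompts shown to the child."""
    prompts = []
    if not image_is_explicit:
        return prompts  # image is shown normally, no prompts
    # First gate: everyone enrolled in parental controls sees this.
    prompts.append("This might be an explicit image. View anyway?")
    # Second gate: only under-13s with the parental setting enabled.
    if child_age < 13 and parent_alerts_enabled:
        prompts.append("Are you really sure? Your parents will be alerted.")
    return prompts
```

The key point the sketch captures: the child can still say yes at every step — the feature warns and (for under-13s) notifies, it never silently blocks or reports to anyone outside the family.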
But it's meant to stop grooming — people sending images to children to eventually scale up to greater abuse. That's the first one. The second one is the CSAM database — a database of known images that involve — and I don't know if we can actually say what CSAM stands for on YouTube, or if that's an instant quarantine on YouTube's side — so, a database of the absolute worst images, images that are illegal to possess. A lot of stuff you can search for; a lot of stuff might be disreputable, you might have ethical qualms — this stuff is illegal to possess. Again, it's US-only, the US database. And what Apple is doing — my guess, my gut, is that Apple's not legally allowed to have this stuff on their servers, and they do not want this stuff on their servers — but they also don't want to scan our photos on their servers to find it and get rid of it, which is what every other tech company does. Facebook, Microsoft — Microsoft invented, I forget what it's called, PhotoDNA, that technology, ten years ago to solve this problem, and they've been using it ever since. Apple's approach is very similar in terms of the hashes being used — Facebook, Twitter, Google, everybody does this. Apple doesn't want to do it on their servers, because they'd have to actually go through all our photos. So they decided to do it with this incredibly intricate, kind of clever approach, where they take the database and convert it to what they call NeuralHash. The database is already hashes — you don't actually have the pictures, you have hashes, which are a mathematical representation of the picture — and Apple wraps them in NeuralHash to make it one more level abstracted. Because some people were afraid that you'd be getting these pictures on your device so that Apple can match against them. Nobody's getting any CSAM on their device; you're getting the NeuralHashes. And what those NeuralHashes do — it only matters if and when you upload to iCloud Photos. If you don't use
iCloud Photos — if you're not uploading anything, it doesn't affect you in the least. If you are uploading your images to iCloud Photos, then as they're being uploaded, Apple..
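The hash-matching idea Rene describes — the device only ever holds hashes, never the images, and matching happens only against photos being uploaded — can be sketched in miniature. One big caveat: real NeuralHash is a perceptual, neural-network-based hash designed so visually similar images hash alike; this toy uses SHA-256 purely to show the matching structure, and every name here is hypothetical.

```python
# Toy sketch of hash-database matching (NOT NeuralHash itself).
# SHA-256 stands in for a perceptual hash just to show the shape:
# neither side ever holds the other's pictures, only hash strings.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash like NeuralHash."""
    return hashlib.sha256(image_bytes).hexdigest()

# The device ships with a database of hashes, never the images themselves.
known_hashes = {image_hash(b"example-known-image")}

def check_on_upload(image_bytes: bytes) -> bool:
    """Runs only when a photo is actually uploaded to the cloud library."""
    return image_hash(image_bytes) in known_hashes
```

Note how the design choice falls out of the sketch: if `check_on_upload` is never called — that is, if you don't use iCloud Photos — no comparison ever happens, which matches Rene's point that the feature doesn't touch photos that stay on the device.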