Apple expanding child safety features across iMessage, Siri, iCloud Photos

Posted in General Discussion, edited August 2021
Apple is releasing a suite of features across its platforms aimed at protecting children online, including a system that can detect child abuse material in iCloud while preserving user privacy.

Credit: Apple


The Cupertino tech giant on Thursday announced new child safety features across three areas that it says will help protect children from predators and limit the spread of Child Sexual Abuse Material (CSAM). The official announcement closely follows reports that Apple would debut some type of system to curb CSAM on its platforms.

"At Apple, our goal is to create technology that empowers people and enriches their lives -- while helping them stay safe," the company wrote in a press release.

For example, Apple will implement new tools in Messages that will allow parents to be more informed about how their children communicate online. The company will also use a new system that leverages cryptographic techniques to detect collections of CSAM stored in iCloud Photos, allowing it to provide information to law enforcement. Apple is also working on new safety tools in Siri and Search.

"Apple's expanded protection for children is a game changer," said John Clark, CEO and President of the National Center for Missing & Exploited Children.
"With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material. At the National Center for Missing & Exploited Children we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known. The reality is that privacy and child protection can co-exist. We applaud Apple and look forward to working together to make this world a safer place for children."
All three features have also been optimized for privacy, ensuring that Apple can provide information about criminal activity to the proper authorities without threatening the private information of law-abiding users.

The new features will debut later in 2021 in updates to iOS 15, iPadOS 15, macOS Monterey, and watchOS 8.

CSAM detection in iCloud Photos

A diagram of the CSAM scanning system. Credit: Apple


The most significant new child safety feature Apple plans to debut focuses on detecting CSAM within iCloud Photos accounts.

If Apple detects collections of CSAM stored in iCloud Photos, it'll flag that account and provide information to the NCMEC, which acts as a reporting center for child abuse material and works with law enforcement agencies across the U.S.

Apple isn't actually scanning the content of images here. Instead, it's using on-device intelligence to match images against a database of known CSAM hashes provided by the NCMEC and other child safety organizations. This database is converted into an unreadable set of hashes that is stored securely on a user's device.

The actual method of detecting CSAM in iCloud Photos is complicated, and uses cryptographic techniques at every step to ensure accuracy while maintaining privacy for the average user.
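
To make the idea concrete, here is a minimal Python sketch of matching an image fingerprint against a fixed set of known hashes. This is an illustration under loose assumptions, not Apple's implementation: the real system uses a perceptual hash Apple calls NeuralHash, a blinded on-device hash database, and cryptographic protocols such as private set intersection, none of which are reproduced here, and every function name and value below is hypothetical.

    import hashlib

    def image_fingerprint(image_bytes: bytes) -> str:
        """Stand-in for a perceptual hash such as Apple's NeuralHash.
        A real perceptual hash tolerates resizing and re-encoding; SHA-256 does not,
        so this is purely illustrative."""
        return hashlib.sha256(image_bytes).hexdigest()

    def matches_known_database(image_bytes: bytes, known_hashes: set[str]) -> bool:
        """On-device check: the device learns only whether a fingerprint appears in
        the opaque database it was shipped, nothing about any other image."""
        return image_fingerprint(image_bytes) in known_hashes

    # Hypothetical usage: the database arrives on the device as unreadable hashes.
    known = {image_fingerprint(b"previously catalogued image bytes")}
    print(matches_known_database(b"previously catalogued image bytes", known))  # True
    print(matches_known_database(b"an unrelated photo", known))                 # False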

Apple says a flagged account will be disabled after a manual review process to ensure that it's a true positive. After an account is disabled, the Cupertino company will send a message to NCMEC. Users will have the opportunity to appeal an account termination if they feel like they've been mistakenly targeted.

The company reiterates that the feature only detects CSAM stored in iCloud Photos -- it won't apply to photos stored strictly on-device. Additionally, Apple claims that the odds of the system incorrectly flagging a given account are less than one in one trillion per year.

Unlike blanket cloud-based scanning, the feature only reports users who have a collection of known CSAM stored in iCloud Photos. A single piece of abusive material isn't enough to trigger a report, which helps cut down on false positives.
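
As a rough sketch of that threshold behavior, the reporting decision boils down to counting matches; the threshold value below is a placeholder (Apple had not published the exact figure at announcement), and the server-side cryptography that enforces it is omitted.

    def should_flag_account(match_flags: list[bool], threshold: int = 30) -> bool:
        """An account is only surfaced for human review once the number of images
        matching known CSAM meets the threshold; a single match does nothing.
        The default threshold here is a placeholder, not Apple's published figure."""
        return sum(match_flags) >= threshold

    print(should_flag_account([True] + [False] * 100))  # False: one match is not enough
    print(should_flag_account([True] * 40))             # True: a collection crosses the threshold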

Again, Apple says that it will only learn about images that match known CSAM. It is not scanning every image stored in iCloud and won't obtain or view any images that aren't matched to known CSAM.

The CSAM detection system will only apply to U.S.-based iCloud accounts to start. Apple says it will likely roll the system out on a wider scale in the future.

Communication safety

One of the new updates focuses on increasing the safety of children communicating online using Apple's iMessage.

For example, the iMessage app will now show warnings to children and parents when sexually explicit photos are received or sent.

If a child under 17 years old receives a sensitive image, it will be automatically blurred and the child will be presented with helpful resources. Apple also included a mechanism that will let children under 13 years old know that a message will be sent to their parents if they do view the image. Children between 13 and 17 years old will not trigger a parental notification when opening these images, and Communication Safety cannot be enabled on accounts belonging to adults.

The system uses on-device machine learning to analyze images and determine whether they are sexually explicit. It's specifically designed so that Apple does not obtain or receive a copy of the image.
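
The decision logic Apple describes can be sketched roughly as follows. The classifier is a placeholder (Apple has not published its model), the age cut-offs simply mirror the ones stated above, and all names are hypothetical; this is an assumption-laden illustration rather than Apple's code.

    from dataclasses import dataclass

    @dataclass
    class SafetyDecision:
        blur_image: bool
        show_warning: bool
        notify_parents_if_viewed: bool

    def looks_sexually_explicit(image_bytes: bytes) -> bool:
        """Placeholder for the on-device ML classifier; always True for illustration."""
        return True

    def communication_safety_check(image_bytes: bytes, account_age: int,
                                   feature_enabled: bool) -> SafetyDecision:
        """Everything runs on device: blur and warn for child accounts, and notify
        parents only when the child is under 13. Nothing is sent to Apple."""
        if not feature_enabled or account_age >= 18 or not looks_sexually_explicit(image_bytes):
            return SafetyDecision(False, False, False)
        return SafetyDecision(
            blur_image=True,
            show_warning=True,
            notify_parents_if_viewed=(account_age < 13),
        )

    print(communication_safety_check(b"...", account_age=11, feature_enabled=True))
    print(communication_safety_check(b"...", account_age=15, feature_enabled=True))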

Siri and Search updates

In addition to the iMessage safety features, Apple is also expanding the tools and resources it offers in Siri and Search when it comes to online child safety.

For example, iPhone and iPad users will be able to ask Siri how they can report CSAM or child exploitation. Siri will then provide the appropriate resources and guidance.

Siri and Search are also being updated to step in if users perform searches or queries for CSAM. As Apple notes, "these interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue."

Maintaining user privacy

Apple has long touted that it goes to great lengths to protect user privacy. The company has even gone toe-to-toe with law enforcement over user privacy rights. That's why the introduction of a system meant to provide information to law enforcement has some security experts worried.

This sort of tool can be a boon for finding child pornography in people's phones. But imagine what it could do in the hands of an authoritarian government? https://t.co/nB8S6hmLE3

-- Matthew Green (@matthew_d_green)


However, Apple maintains that surveillance and abuse of the systems were a "primary concern" while developing them. It says it designed each feature to ensure privacy was preserved while countering CSAM or child exploitation online.

For example, the CSAM detection system was designed from the start to only detect CSAM -- it doesn't contain mechanisms for analyzing or detecting any other type of photo. Furthermore, it only detects collections of CSAM over a specific threshold.

Apple says the system doesn't open the door to surveillance, and it doesn't do anything to weaken its encryption. The CSAM detection system, for example, only analyzes photos that are not end-to-end encrypted.

Security experts are still concerned about the ramifications. Matthew Green, a cryptography professor at Johns Hopkins University, notes that the hashes are based on a database that users can't review. More than that, there's the potential for hashes to be abused -- for example, a harmless image could share a hash with known CSAM.

"The idea that Apple is a 'privacy' company has bought them a lot of good press. But it's important to remember the this is the same company that won't encrypt your iCloud backups because the FBI put pressure on them," Green wrote.

Ross Anderson, a professor of security engineering at the University of Cambridge, called the system "an absolutely appalling idea" in an interview with The Financial Times. He added that it could lead to "distributed bulk surveillance of... our phones and laptops."

Digital rights group The Electronic Frontier Foundation also penned a blog post about the feature, saying it is "opening the door to broader abuses."

"All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change," wrote EFF's India McKinney and Erica Portnoy.

Read on AppleInsider
JoeNautilus

Comments

  • Reply 1 of 97
    Rayz2016 Posts: 6,957, member
    Hah! I said:

    The more I read about this, the more ridiculous it sounds. 
    Apple spends the past three years banging on about privacy, then starts scanning your photo library?
    I don’t buy it, and no one would buy another iPhone if they thought their lives could be ruined by buggy software. 
    The only way I see this being acceptable is if this so-called “client-side tool” would allow parents to ensure their children weren’t sending naked pics of themselves to Republican politicians posing as high-school kids. 

    Okay, so this is about the kids, fair enough, but I think what we’re seeing here is the very thin edge of a very wide wedge. 


    edited August 2021
  • Reply 2 of 97
    Beats Posts: 3,073, member
    Sounds like a Trojan horse. 
  • Reply 3 of 97
    Rayz2016 said:
    Hah! I said:

    The more I read about this, the more ridiculous it sounds. 
    Apple spends the past three years banging on about privacy, then starts scanning your photo library?
    I don’t buy it, and no one would buy another iPhone if they thought their lives could be ruined by buggy software. 
    The only way I see this being acceptable is if this so-called “client-side tool” would allow parents to ensure their children weren’t sending naked pics of themselves to Republican politicians posing as high-school kids. 

    Okay, so this is about the kids, fair enough, but I think what we’re seeing here is the very thin edge of very wide edge. 


    Thing edge to a very wide edge? How so?
  • Reply 4 of 97

    As “no good deed goes unpunished” I might question the wisdom of this; however, someone, somewhere needs to take a stand against all sorts of “misdeeds”.

    And whilst it may be preferable for the law enforcement of countries to do this, they are not always the best equipped to do so.

    It may well be that all the corporations that claim to just be the “messengers” and not the “authors” need to take a more responsible attitude to what they host.

  • Reply 5 of 97
    It's terrifying that Apple is allowing the government to bypass legal restrictions that would have made this type of search unlawful.  I am not defending criminals or pedos but I strongly object to the government having unlimited insight into the photos on personal devices.  A list of hashes is exactly that, and that list could be expanded to anything the government would like, as I strongly assume the government is providing the hashes to Apple initially.
    edited August 2021
  • Reply 6 of 97
    Mike Wuerthele Posts: 6,861, administrator
    aguyinatx said:
    It's terrifying that Apple is allowing the government to bypass legal restrictions that would have made this type of search unlawful.  I am not defending criminals or pedos but I strongly object to the government having unlimited insight into the photos on personal devices.  A list of hashes is exactly that, and that list could be expanded to anything the government would like, as I strongly assume the government is providing the hashes to Apple initially.
    Read the article and Apple's explainer about it, and understand the technology behind it better.

    This is not a system where the database can be fed an image of a dime-bag and it will then universally pick out all the dime-bags.
    edited August 2021
  • Reply 7 of 97
    Rayz2016 Posts: 6,957, member
    Rayz2016 said:
    Hah! I said:

    The more I read about this, the more ridiculous it sounds. 
    Apple spends the past three years banging on about privacy, then starts scanning your photo library?
    I don’t buy it, and no one would buy another iPhone if they thought their lives could be ruined by buggy software. 
    The only way I see this being acceptable is if this so-called “client-side tool” would allow parents to ensure their children weren’t sending naked pics of themselves to Republican politicians posing as high-school kids. 

    Okay, so this is about the kids, fair enough, but I think what we’re seeing here is the very thin edge of very wide edge. 


    Thing edge to a very wide edge? How so?
    Yeah, sorry, I meant “very wide wedge”. 
  • Reply 8 of 97
    This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
    Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as life long criminals is a good reason to stop and think carefully about a plan like this.
  • Reply 9 of 97
    Mike Wuerthele Posts: 6,861, administrator
    This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
    Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as life long criminals is a good reason to stop and think carefully about a plan like this.
    Given that Google and Facebook have been doing this for about a decade, and both companies are at least as incompetent as you claim Apple is, plus there's no rash of false positives amongst those larger user base, I believe your fears are unfounded.

    Wait and see. If there are colossal fuck-ups, they'll be found out.
    edited August 2021
  • Reply 10 of 97
    Beats Posts: 3,073, member
    aguyinatx said:
    It's terrifying that Apple is allowing the government to bypass legal restrictions that would have made this type of search unlawful.  I am not defending criminals or pedos but I strongly object to the government having unlimited insight into the photos on personal devices.  A list of hashes is exactly that, and that list could be expanded to anything the government would like, as I strongly assume the government is providing the hashes to Apple initially.

    Oh yeah. It’s definitely a “we’re doing it for the kids!” thing that will later include more and more until all your photos belong to the government. We’ve seen the government do crap like this before. They want Apple’s tasty encrypted database.

    BTW arresting pedos who have this content DOES NOT stop child abuse in any way. The abusers stay free, it’s just an excuse to put more people in jail, the children’s lives don’t benefit.
  • Reply 11 of 97
    netrox Posts: 1,421, member
    First of all, Apple's not able to see any child porn. It can't. It only matches the "pattern" found in child porn. 

    Second, there's no way to prove with just hashes. Prosecutors will have to prove that there's evidence of child porn and I am sure the court will not consider "hash" to be evidence especially when it's fuzzy. 

    Third, the government as a whole pretty much knows everything about you - you ain't that special. They are the ones that issue SSNs, the ones that issue birth certificates, the ones that issue death certificates, the ones that issue IDs or DLs, the ones that ask for your taxes, and so on. 




  • Reply 12 of 97
    This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
    Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as life long criminals is a good reason to stop and think carefully about a plan like this.
    Given that Google and Facebook have been doing this for about a decade, and both companies are at least as incompetent as you claim Apple is, plus there's no rash of false positives amongst those larger user base, I believe your fears are unfounded.

    Wait and see. If there are colossal fuck-ups, they'll be found out.
    "Other kids have been playing with matches for decades and they didn't burn their houses down. If your kid does burn down your house, they'll be found out."
    Or we could simply not give the kid matches. By the time they are "found out" it will be too late for some people.

    One revelation here is that none of your data is really secure at Apple. How can they scan your photos if they are securely encrypted? They shouldn't even be able to tell they are photos. It should just look like a lot of random data. The first thing I recommend doing is to delete all your data in iCloud.

    I notice you didn't mention my final point. Imagine it was your kid. The kid takes photos of themselves they should not have and sends them to a friend or alternately receives them from a friend their same age. Apple's AP fingerprints the pictures as CSAM and they get turned over to the police. Suddenly your kid has to register as a sexual criminal for the rest of their lives. What could have been dealt with quietly by their parents becomes a public legal nightmare. Thanks Apple! That's why we buy our kids iPhones right? How are those kids getting protected?
    edited August 2021
  • Reply 13 of 97
    Mike Wuerthele Posts: 6,861, administrator
    This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
    Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as life long criminals is a good reason to stop and think carefully about a plan like this.
    Given that Google and Facebook have been doing this for about a decade, and both companies are at least as incompetent as you claim Apple is, plus there's no rash of false positives amongst those larger user base, I believe your fears are unfounded.

    Wait and see. If there are colossal fuck-ups, they'll be found out.
    "Other kids have been playing with matches for decades and they didn't burn their houses down. If your kid does burn down your house, they'll be found out."
    Or we could simply not give the kid matches. By the time they are "found out" it will be too late for some people.

    One revelation here is that none of your data is really secure at Apple. How can they scan your photos if they are securely encrypted? They shouldn't even be able to tell they are photos. It should just look like a lot of random data. The first thing I recommend doing is to delete all your data in iCloud.
    Your comparison is weird, but okay. 

    Courts and investigators don't use the hashes to prove anything. Subpoenas will still be required for the data itself, and then, another set of eyes.

    You want to delete your iCloud based on pie-in-the-sky "mights" and "maybes" based on your own interpretation (big edit here. What I said wasn't fair, and I apologize), I'm certainly not going to try to stop you.

    In your specious "your kid" argument that you didn't say anything about in your previous post and added to your response while I was composing mine, you're missing the subpoena, discovery, and legal parts in between finding the images and the registration as a sex offender. And, unless said pictures are already in the database (which Apple isn't building itself, but which is being provided by NCMEC), they won't get flagged.

    That's what's getting missed here. That's what nearly every venue other than the tech-centric ones are missing: These database checksums that are being built are against known CSAM images. Not random ones, not vast expanses of skin. Known images, being circulated, and flagged as CSAM by NCMEC.

    There's no privacy violation here. Nobody is looking at your pictures. Nobody is going to see or know because of this that you have 20,000 pictures of a Bud can sweating on the bow of your fishing boat. 

    I'm out until the morning. Enjoy your evening. For what it's worth, Apple has already confirmed that if you turn iCloud photos off, it can't checksum the photos.
    edited August 2021
  • Reply 14 of 97
    This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
    Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as life long criminals is a good reason to stop and think carefully about a plan like this.
    Given that Google and Facebook have been doing this for about a decade, and both companies are at least as incompetent as you claim Apple is, plus there's no rash of false positives amongst those larger user base, I believe your fears are unfounded.

    Wait and see. If there are colossal fuck-ups, they'll be found out.
    "Other kids have been playing with matches for decades and they didn't burn their houses down. If your kid does burn down your house, they'll be found out."
    Or we could simply not give the kid matches. By the time they are "found out" it will be too late for some people.

    One revelation here is that none of your data is really secure at Apple. How can they scan your photos if they are securely encrypted? They shouldn't even be able to tell they are photos. It should just look like a lot of random data. The first thing I recommend doing is to delete all your data in iCloud.
    Your comparison is weird, but okay. 

    Courts and investigators don't use the hashes to prove anything. Subpoenas will still be required for the data itself, and then, another set of eyes.

    You want to delete your iCloud based on pie-in-the-sky "mights" and "maybes" based on your own interpretation (big edit here. What I said wasn't fair, and I apologize), I'm certainly not going to try to stop you.

    In your specious "your kid" argument that you didn't say anything about in your previous post and added to your response while I was composing mine, you're missing the subpoena, discovery, and legal parts in between finding the images and the registration as a sex offender. And, unless said pictures are already in the database (which Apple isn't building itself, but which is being provided by NCMEC), they won't get flagged.

    That's what's getting missed here. That's what nearly every venue other than the tech-centric ones are missing: These database checksums that are being built are against known CSAM images. Not random ones, not vast expanses of skin. Known images, being circulated, and flagged as CSAM by NCMEC.

    There's no privacy violation here. Nobody is looking at your pictures. Nobody is going to see or know because of this that you have 20,000 pictures of a Bud can sweating on the bow of your fishing boat. 
    Kids' photos are going to start getting flagged by this system the moment it goes live. Is Apple ready for that? Will they contact the parents of the kids or the authorities? And yes, there is a way that a kid's photos could end up in the CSAM database, and you should already know that.
    How do people opt out of this program?
    Will they tell users when their photos have been flagged and viewed by humans?
    Will we get to see the score our own photos got?
    Can we see the source code to make sure it was implemented correctly?
    Where is this database of known CSAM images? How would Apple possess such a thing legally?
    So many unanswered questions. I guess we have to wait for the S to hit the F.
    edited August 2021
  • Reply 15 of 97
    entropys Posts: 4,166, member
    I find this ability disturbing. We already know Facebook etc are bad, and of course, this technology is couched in terms of tracking down rock spiders to make it untouchable if one should object to it.

    But I could see this also being eventually used by State actors against their own citizens. Want to persecute anyone associated with an opposition politician? This technology will identify them for you.  All of them.
  • Reply 16 of 97
    heli0s Posts: 65, member
    Child sexual abuse has been catapulted into the stratosphere since smartphones and encrypted messaging apps were introduced. I'm glad to see Apple finally taking steps to put a small dent into this disgusting epidemic. If you want to know more about this topic and just how much of an issue it really is, listen to this interview that Sam Harris did with Gabriel Dance who is the deputy investigations editor at The New York Times. If you have the stomach, that is. Fair warning.
  • Reply 17 of 97
    mrstep Posts: 514, member
    aguyinatx said:
    It's terrifying that Apple is allowing the government to bypass legal restrictions that would have made this type of search unlawful.  I am not defending criminals or pedos but I strongly object to the government having unlimited insight into the photos on personal devices.  A list of hashes is exactly that, and that list could be expanded to anything the government would like as I strongly assume the the government is providing the hashes to Apple initially.
    Yeah, I can see issues with:

    1) Ironically, there's a headline right now "iTunes Match is not working for a growing number of users".  Clearly doing matches of content hasn't been their strong suit previously, so...

    2) Creepy AF for them to start scanning the files on your own device.  If they scan content on iCloud, that's kind of par for the course. Dropbox, Google Drive, etc. all do the same, whether it's disturbing or not. (DMCA takedowns of a video you ripped from a DVD at one point, music, etc. - but at least it's something that was pushed to an effectively public place.)

    3) Definitely a slippery slope.  Today it's from a CSAM database, tomorrow... is it a PDF of a blank vax card in the database? How about if that card was used to fake a child's vaccination?  Think of the children?  How about when it's the CCP adding images of the Tiananmen massacre to find dissidents?  All it will need is an extra hash or 2, right?  And with the tech there, there's no chance it doesn't get used.

    4) Is it really Apple's place to scan the content on your computers?  Really?

    I'm assuming this is the "privacy workaround" they've done to dodge other encryption-breaking the government is pushing for, but it's only a matter of time before this becomes a scan of all files, whether being mirrored to iCloud or not, and the type of content is bound to expand.  A bit of general image recognition and it won't need to match that database, just flag what is possibly categorized as problematic (CP, drugs, guns, political content) and upload to the government for review.  Imagine how great this is for "the war of terror", as Borat put it.  None of these tools has ever become less invasive over time. 
  • Reply 18 of 97
    Rayz2016 Posts: 6,957, member
    This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
    Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as life long criminals is a good reason to stop and think carefully about a plan like this.
    Given that Google and Facebook have been doing this for about a decade, and both companies are at least as incompetent as you claim Apple is, plus there's no rash of false positives amongst those larger user base, I believe your fears are unfounded.

    Wait and see. If there are colossal fuck-ups, they'll be found out.
    Yeah, but here’s the thing: before, when Apple made a colossal fuck up, such as letting random text strings crash the iPhone, or bugs in the login code, they’re mildly annoying. 

    A colossal fuck-up now means that someone’s life is ruined. 

  • Reply 19 of 97
    heli0s Posts: 65, member
    Rayz2016 said:

    A colossal fuck-up now means that someone’s life is ruined. 

    What if that someone is the child?
  • Reply 20 of 97
    StrangeDays Posts: 12,877, member
    Rayz2016 said:
    This is tremendously dangerous. Pattern matching can create false hits triggering ... what exactly? Transferring your data to a third party for their inspection without your permission? Keeping their private data private is exactly, precisely the reason users trust and buy Apple products. If Apple starts backing out of privacy guarantees, then what?
    Let me say clearly: I have zero faith that Apple has the technical skill to implement a feature like this. Apple screws up really simple things routinely. This is a really hard computer science problem. Apple is going to screw it up. Guaranteed. When they do screw it up, they will certainly have people looking at your private photos and quite likely get innocent people in trouble with law enforcement. The way the USA treats kids who share pictures of themselves with other kids as life long criminals is a good reason to stop and think carefully about a plan like this.
    Given that Google and Facebook have been doing this for about a decade, and both companies are at least as incompetent as you claim Apple is, plus there's no rash of false positives amongst those larger user base, I believe your fears are unfounded.

    Wait and see. If there are colossal fuck-ups, they'll be found out.
    Yeah, but here’s the thing: before, when Apple made a colossal fuck up, such as letting random text strings crash the iPhone, or bugs in the login code, they’re mildly annoying. 

    A colossal fuck-up now means that someone’s life is ruined. 
    Guess you guys are continuing to ignore the part about subpoenas, investigators, and prosecutors. Nobody is given an automatic “Go to jail” card. If it’s not child porn, nobody is going to jail. These systems already exist, right?