Apple reportedly plans to make iOS detect child abuse photos


Comments

  • Reply 21 of 26
    williamh Posts: 1,048 member
    swat671 said:
    "This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?”

    I mean, I’m sorry, but I don’t care WHAT type of government it is. CSM (child sex material) is CSM. Repressive governments definitely don’t have issues “taking out the trash”.

    As far as finding the pics or videos goes, there is “hashing” or “fingerprinting”. Governments and organizations like the ICAC (Internet Crimes Against Children) Task Force use these fingerprints to search the web (and, I assume, the dark web) for images like this. I assume Apple would just use these hashes.

    I agree with you about the approach that Apple would likely take. For those who wonder how Apple will determine what constitutes child sex material, it is this: the National Center for Missing and Exploited Children (NCMEC) maintains a database of hashes of images with known victims where someone has already been prosecuted. Apple could match hashes against the NCMEC database, so there wouldn't be any speculation about what they found (a rough sketch of this kind of matching is at the end of this comment). Through this sort of mechanism, Apple could do something about child exploitation without ever moving sensitive images off a person's phone.

    It's not hard to imagine how this sort of capability could be deployed for evil purposes, but stopping Apple from doing this wouldn't prevent that.
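
    To make the hash-matching idea concrete, here is a minimal Swift sketch. It is an illustration under simplified assumptions, not Apple's actual method: it uses an exact SHA-256 digest from CryptoKit as a stand-in, whereas the databases described above rely on perceptual fingerprints that survive resizing and re-encoding, and the knownHashes set and file path are purely hypothetical.

    ```swift
    import Foundation
    import CryptoKit

    // Purely hypothetical stand-in for a database of known-image fingerprints,
    // e.g. hex digests derived from a list such as NCMEC's. Real matching
    // systems use perceptual fingerprints, not exact SHA-256 digests.
    let knownHashes: Set<String> = [
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
    ]

    /// Computes a SHA-256 fingerprint of a file's contents as a lowercase hex string.
    func fingerprint(of fileURL: URL) throws -> String {
        let data = try Data(contentsOf: fileURL)
        return SHA256.hash(data: data)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Checks a photo's fingerprint against the known-hash set; only the hash,
    /// never the image itself, is compared.
    func matchesKnownHash(_ fileURL: URL) -> Bool {
        guard let hash = try? fingerprint(of: fileURL) else { return false }
        return knownHashes.contains(hash)
    }

    // Example usage with a hypothetical local file path.
    let photo = URL(fileURLWithPath: "/tmp/example.jpg")
    print(matchesKnownHash(photo) ? "Fingerprint matches known database" : "No match")
    ```

    Either way, the key point holds: only fingerprints need to be compared, so the images themselves never have to leave the device.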
     0 Likes, 0 Dislikes, 0 Informatives
  • Reply 22 of 26
    bobolicious Posts: 1,201 member
    markbyrn said:
    If this story is true, Apple would have the worst case of forked tongue ever known. Perpetuating surveillance of user iPhones is a massive privacy violation, and we know it won't stop at just loathsome imagery.
    ...is it possible this has been the frog-boiling 'Trojan Horse' plan ever since 2011...?

    I was shocked by the always-on Photos indexing - there is no 'opt out', and why does an Apple Watch need to sync via Apple servers when it is a few inches from a Mac...?
    edited August 2021
    markbyrn
     0 Likes, 0 Dislikes, 1 Informative
  • Reply 23 of 26
    shamino Posts: 563 member
    crowley said:
    shamino said:
    Abuse is more than just child pornography, and non-pornographic abuse photos are not illegal.

    What happens if I visit Amnesty International's web site, read a few articles, and view photos of children that were beaten by an abuser?  Is Apple going to tell law enforcement that I am now abusing children simply because I have the pictures on my phone?  What if I decide to save these photos in order to republish them on a blog or in a newsletter?  What if I work for Amnesty and took the photos myself?
    Then you have excellent, verifiable answers to any questions, and there won't be a problem.
    ...
    Nothing you've described is a nightmare; all of it is very easily solvable with a simple conversation.

    Of course, because when the police arrest you on accusations of a serious felony, they always believe you when you try to give an explanation. They have never been known to confiscate all your personal electronics and leave you to sit in jail while they conduct their own investigation.

    Simple conversations don't work when you've been pre-judged as a pedophile.  And anything you say can and will be used against you.

    I hope you can afford a good lawyer, because the ones appointed by the court won't listen to you either.  They'll try to convince you to plead to a lesser charge.
    baconstang, watto_cobra
     2 Likes, 0 Dislikes, 0 Informatives
  • Reply 24 of 26
    bobolicious Posts: 1,201 member
    I remember reading that the best ideas could be 'as fragile as a whisper' in the private lab of Jony & Steve. I could not agree more...
    edited August 2021
     0 Likes, 0 Dislikes, 0 Informatives
  • Reply 25 of 26
    jimh2 Posts: 685 member
    dewme said:
    tedz98 said:
    I’m totally against child porn. But this capability is a double-edged sword. For Apple, which touts its strong belief in the privacy of its customers using its devices, to begin surveilling customer content sets up an unusual and potentially dangerous precedent. What undesirable content will be next to be monitored? Who gets notified when supposedly undesirable content is identified? Who determines what constitutes undesirable content? What if governments demand unique monitoring capabilities if Apple wants to sell its products in their countries? Despite the universally agreed-upon disgust towards child pornography, the freedom of the entire Apple ecosystem will be jeopardized if this capability is deployed. This is as good as giving government agencies “back door” access to iPhones - which Apple has vehemently opposed. Very good intentions with very dangerous side effects.
    These are all legitimate concerns, i.e., the law of unintended consequences. I’m looking at another side of the “double-edged sword” from a different perspective. What Apple is proposing is, in my opinion, 100% the right thing to do. Bravo Apple, go right ahead and lower the heel of scrutiny on those lowlifes and societal cockroaches. However, one concern I have is that once Apple steps up and claims they are going to help stomp out a social, societal, or human behavioral problem for the common good, they’ll be held liable for the cases they miss. We see this every day with respect to the App Store. Despite Apple’s scrutiny of app submissions, a few bad apps still get through. When this happens, Apple is skewered and lambasted both in the court of public opinion and in real courts with real class-action lawsuits involving real money.

    Still, I don’t think Apple should avoid taking on challenges where the penalty for imperfection is high when it’s the right thing to do. Being forthright up front about the fallibility of what they are taking on will probably not make a difference later on when they are sued, and they most certainly will be sued at some point. That’s just a cost they’ll have to accept, and the law schools will be grateful to Apple for their high-minded decision.
    Except that imperfection means someone with a photo that looks like something it is not can have their life ruined. Another issue is when it starts detecting other things, like whether you drink, smoke, do drugs, engage in reckless behavior, own guns, etc. These new features start off with good intentions but always expand.
    shamino, baconstang
     2 Likes, 0 Dislikes, 0 Informatives
This discussion has been closed.