macplusplus

About

Username
macplusplus
Joined
Visits
293
Last Active
Roles
member
Points
3,141
Badges
1
Posts
2,119
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    crowley said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s attempt to adopt a self-declared law enforcement role to “prevent crimes before they occur” is Orwellian!
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
    Flagging such cases doesn't mean preventive Orwellian surveillance. Such a law cannot pass. Even if it did, it cannot be interpreted in such an Orwellian sense. Citizens will fight, courts will interpret.
    No idea what you're even talking about.  

    You said criminals are "not stupid enough to use iCloud", which is obviously untrue, since they're stupid enough to use Facebook.

    You said Apple are attempting to "prevent crimes before they occur", which doesn't seem to be true or even relevant.  Images of child abuse are definitely crimes that have already occurred.

    Stop using Orwellian like a trump word.  It isn't.
    This is why preventive Orwellian surveillance is not a solution. How will you distinguish a mother's baby shower photo from a child abuse photo? Not AI, I mean by human interpretation. You need context to qualify it as child abuse, and the scheme as described will not provide that context. "Images of child abuse are definitely crimes that have already occurred", agreed, but if and only if they are explicit enough to provide an abuse context. What about innocent-looking, non-explicit photos collected as a result of long abusive practices? So, the number of cases Apple can flag will be extremely limited, since such explicit context will mostly reside elsewhere, on the dark web or in some other media.
    Have you even bothered to read these articles? Like even bothered? They do NOT evaluate the subject of your photos. They are specific hash matches to *known* child pornography, cataloged in the CSAM database. 

    Seriously fucking educate yourself before clutching your pearls. If you can’t read the article you’re commenting on, try this one:

    https://daringfireball.net/2021/08/apple_child_safety_initiatives_slippery_slope
    Apparently you fucking educated yourself enough to still not understand that an innocent-looking photo may still point to child abuse, but Apple’s scheme will miss it, and thus it is ineffective. Crime is a very complex setup; it cannot be reduced to a couple of hashes.
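    For readers trying to follow the technical part of this exchange: the scheme being argued about compares image fingerprints against a database of hashes of already-known material and escalates to human review only after a threshold number of matches accumulates. The sketch below is a minimal illustration of that idea only, not Apple's actual NeuralHash / private-set-intersection protocol; the hash function, threshold value, and names are placeholders.

        # Minimal sketch of threshold-gated matching against hashes of *known* images.
        # Placeholder logic only: a real system uses a perceptual hash (robust to
        # resizing and re-encoding) plus cryptographic blinding, neither of which
        # is modeled here.
        import hashlib
        from typing import Iterable, Set

        def image_fingerprint(image_bytes: bytes) -> str:
            # Stand-in for a perceptual hash; SHA-256 keeps the sketch self-contained.
            return hashlib.sha256(image_bytes).hexdigest()

        def count_known_matches(photos: Iterable[bytes], known_hashes: Set[str]) -> int:
            # Nothing here evaluates what a photo depicts: each photo either matches
            # a catalogued fingerprint or it does not.
            return sum(1 for p in photos if image_fingerprint(p) in known_hashes)

        def should_escalate(match_count: int, threshold: int = 30) -> bool:
            # Human review is triggered only once the match count crosses a
            # threshold; the value 30 is a placeholder, not a confirmed figure.
            return match_count >= threshold

    Note that such matching can only ever flag copies of already-catalogued images; a novel or non-catalogued photo never matches, which is the limitation macplusplus points at.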
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    crowley said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s attempt to adopt a self-declared law enforcement role to “prevent crimes before they occur” is Orwellian!
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
    Flagging such cases doesn't mean preventive Orwellian surveillance. Such a law cannot pass. Even if it did, it cannot be interpreted in such an Orwellian sense. Citizens will fight, courts will interpret.
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    bulk001 said:
    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s attempt to adopt a self-declared law enforcement role to “prevent crimes before they occur” is Orwellian!
    Ever watched the TV show Dumbest Criminals? Besides, a pretext to what? What do you have that is so private that the government should not see it? Your affair? Your own dick pic? Nobody cares, and if the NSA looked at them, so what? If you are a terrorist, a 1/6 insurrectionist, a child pornographer, a teen slashing tires in a neighborhood, etc., I want you to be caught. If anything, the pretext to me seems to be that privacy is being used as an excuse to exploit children.
    You have law enforcement for that. I don't want Apple to become a cop, that's it.

    The pretext is a long-developed, deep issue involving Tim Apple's political engagements, one that apparently now leaves him in a very delicate situation. Won't discuss that further here...
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away - commercial cloud hosting uses their servers. Should they not take measures to address child pornography on them? Not using their commercial service, there’s no issue. Is that not reasonable? One needn’t use commercial hosting services, especially if using it for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud, they have the dark web, they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person’s unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple’s surveillance, based on the accumulation of some nasty tokens and the following human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple’s attempt to adopt a self-declared law enforcement role to “prevent crimes before they occur” is Orwellian!
  • Open letter asks Apple not to implement Child Safety measures

    Apple should shut down iCloud instead of developing a mass surveillance method for the government.