xyzzy-xxx

About

Username: xyzzy-xxx
Joined:
Visits: 65
Last Active:
Roles: member
Points: 850
Badges: 1
Posts: 222
  • Edward Snowden calls Apple CSAM plans 'disaster-in-the-making'

    I agree 100% with Snowden; doing this spying on the device is the worst thing Apple has ever done.
    Hopefully Apple changes its mind and puts this in iCloud instead.
    That would let me update to iOS 15 and buy an iPhone 13 Pro as I originally planned.
  • Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'

    mcdave said:
    jdw said:
    tedz98 said:
    The general public has no understanding of what a file hash is. 

    That really is the entire point, and it is what many who are defending Apple's move are ignoring. Nothing else matters, certainly not the technical way in which CSAM scanning works. That's precisely why I've said in other threads that Apple is now obligated, at the very least, to DELAY the release until they can do PR damage control and try to win more public support. They cannot do that between now and the release of iOS 15, so the feature must be pulled from iOS 15 and its release delayed until at least iOS 16. And if they never win public support and the matter only gets worse and worse, the idea may need to be permanently shelved.

    This is Tim Cook's call now.  It's no doubt a hard call for him because he's played social justice warrior at times in the past, and this no doubt would seem like a step back for him.  But it's a call he has to make and make soon.
    How do you feel about Google & Facebook’s server-side CSAM scanning? And all the other scanning they do? Apple’s is the lightest of all.
    Totally wrong; server-side scanning is much more acceptable because:
    1. You decide what information you put on the server (my iPhone currently holds a lot of information I would never put in the cloud).
    2. The on-device scan could be altered or hacked.
    3. You could be sent images that are automatically synced to iCloud.
    Doing this on the device is just bad, and it was probably only done because calculating hashes for all the uploaded images on the servers would require too much processing time.
    Apple needs to delay this "feature" and put it into iCloud.
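    For readers wondering what a "file hash" actually is in this debate, here is a minimal sketch in Python using an ordinary cryptographic hash. (Note this is an assumption-laden simplification: Apple's system reportedly uses a perceptual hash, NeuralHash, which is designed to tolerate small image changes, unlike the cryptographic hash shown here.)

    ```python
    import hashlib

    def file_hash(data: bytes) -> str:
        """Return the SHA-256 digest of a byte string, hex-encoded."""
        return hashlib.sha256(data).hexdigest()

    # Identical content always yields an identical hash...
    a = file_hash(b"same bytes")
    b = file_hash(b"same bytes")
    assert a == b

    # ...while even a tiny change produces a completely different digest.
    c = file_hash(b"same byteZ")
    assert a != c
    ```

    The point of a scheme like this is that two files can be compared by their short digests without ever transmitting or inspecting the files themselves.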
  • Researchers who built rudimentary CSAM system say Apple's is a danger

    CSAM scanning on end-user devices is just the worst idea Apple has ever had. If they feel they need to do something, they should delay the feature and put it in iCloud.

    Even if they put this in iCloud, there are technologies that allow end-to-end encryption (asymmetric cryptography).

    I will not use iOS 15 and macOS Monterey until this spyware has been removed.
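    The asymmetric-cryptography point can be sketched with textbook RSA and deliberately tiny, insecure parameters. This is only an illustration of the principle the poster invokes, not how iCloud or any real end-to-end system is implemented: a server can store ciphertext it cannot read, because only the recipient holds the private exponent.

    ```python
    # Toy RSA (textbook numbers, NOT secure) illustrating the asymmetric idea.
    p, q = 61, 53
    n = p * q          # public modulus (part of the public key)
    e = 17             # public exponent (part of the public key)
    d = 2753           # private exponent: (e * d) % ((p-1)*(q-1)) == 1

    def encrypt(m: int) -> int:
        # Anyone holding the public key (including a server) can encrypt.
        return pow(m, e, n)

    def decrypt(c: int) -> int:
        # Only the private-key holder can recover the plaintext.
        return pow(c, d, n)

    message = 65
    ciphertext = encrypt(message)
    assert ciphertext != message          # the stored blob is opaque
    assert decrypt(ciphertext) == message # only d recovers it
    ```

    With a design like this, scanning on the server is impossible by construction, which is exactly why the on-device versus server-side distinction matters in this thread.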
  • Tech industry needs to rebuild user trust after privacy losses, says Tim Cook

    Apple's only realistic option is to remove CSAM detection from the device and defer this "feature" until it is ready to run in iCloud.
    Otherwise many customers (the numbers varying by country) will not update to iOS 15 or buy an iPhone 13.
  • Outdated Apple CSAM detection algorithm harvested from iOS 14.3 [u]

    This is the kind of stuff I alluded to in a previous comment: 

    https://forums.appleinsider.com/discussion/comment/3327620/#Comment_3327620

    People will try to exploit this. It would be better if it weren't there at all. The hashes SHOULD be unique, but by the nature of hashes (an algorithm mapping arbitrary inputs to fixed-size outputs), collisions can occur. You may sooner win the lottery than encounter one by chance… but what if it weren't by chance? It was only a matter of time before someone created an image or images that match hashes in the CSAM database and started propagating them. Now, Apple may have a human review step after several flags, but what happens if reviewers get flooded because a series of such images propagates? Humans can only work so fast. And this human review implies images may be sent to people at Apple, which would necessarily circumvent the encryption that prevents anyone but the intended parties from seeing the images.

    I also find it unfortunate that this has been in since iOS 14.3, but I hope Apple doesn't enable it. I have iCloud Photos turned off and I will not be upgrading to 15. The "smart" criminals will do the same thing. All this does is root out the dumb ones: the "smarter" ones will turn it off to avoid it, and the smartest ones will find and use other channels now.
    Totally agree, this bull***t should never have made it into a release. If Apple is not stopped, I think there will be at least some funny stories in the future (besides, say, 5% losses for Apple from iCloud accounts and people abandoning Apple), and probably zero positive effects, since all the criminals have now been warned.
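    The collision worry raised above can be made concrete. Any hash maps a huge input space onto a fixed-size output, so by the pigeonhole principle collisions must exist; a full-size hash just makes them astronomically hard to find by chance. The sketch below (an illustration only, not Apple's NeuralHash) shrinks a hash to 8 bits so a collision appears almost immediately:

    ```python
    import hashlib

    def short_hash(data: bytes) -> str:
        # Truncate SHA-256 to 8 bits (2 hex chars) so collisions are easy
        # to find; real hashes make this far harder, not impossible.
        return hashlib.sha256(data).hexdigest()[:2]

    seen = {}       # digest -> first input that produced it
    collision = None
    for i in range(10_000):
        data = f"image-{i}".encode()
        h = short_hash(data)
        if h in seen:
            collision = (seen[h], data)  # two inputs, one digest
            break
        seen[h] = data

    # With only 256 possible digests, a collision is guaranteed
    # within the first 257 distinct inputs.
    first, second = collision
    assert first != second
    assert short_hash(first) == short_hash(second)
    ```

    Deliberately crafted collisions, as opposed to stumbled-upon ones, are exactly the flooding scenario described in the comment above.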