henrybay
About
- Username
- henrybay
- Joined
- Visits
- 39
- Last Active
- Roles
- member
- Points
- 285
- Badges
- 0
- Posts
- 147
Reactions
-
Apple backs down on CSAM features, postpones launch
mr. h said:
henrybay said:
Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move.
Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy.
Their CSAM concept was actually an extremely clever way of enabling all of your photos to be uploaded to iCloud fully encrypted (without giving Apple the keys), such that neither Apple nor anyone else (should they hack into iCloud, or be law enforcement with a warrant) would have been able to inspect the photos in iCloud. The only exception would have been individual photos that matched a CSAM hash, and even then there would have to be at least 30 photos matching known CSAM material before even those could be inspected.
But now, since they have backed down, all of your photos will continue to be uploaded to iCloud unencrypted, where Apple, law enforcement, and any hackers will be able to inspect all of your photos.
Which one of these two scenarios offers the most privacy?
That’s a no-brainer. I choose scenario 2 because I don’t care who sees my photos on iCloud, but I care deeply about who can access the content of my iPhone.
I consider my iPhone an extension of my private domain, and it is therefore sacrosanct. I don’t consider cloud services in the same way and assume they are open to external scrutiny.
It is also naive to believe that Apple’s CSAM approach won’t be exploited by the technical wizards who created spyware programs like Pegasus. They could turn this program against us and sell their service to repressive regimes. Let’s not kid ourselves: this is a back door into our phones. No amount of reassuring techno mumbo jumbo or nuanced obfuscation can disguise this fact.
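To make the trade-off in mr. h's quoted reply concrete, here is a minimal sketch of the threshold idea it describes: photos go up in encrypted form, only photos matching a known hash can ever be opened, and only once at least 30 such matches exist. The hash function, the toy "encryption", and all names below are illustrative assumptions, not Apple's actual NeuralHash or safety-voucher design.

import hashlib

THRESHOLD = 30  # matches required before any matched photo becomes reviewable

def photo_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash (the real system used NeuralHash);
    # SHA-256 is used here purely for illustration.
    return hashlib.sha256(data).hexdigest()

def client_upload(photos: dict, known_hashes: set):
    # Client side: every photo goes up "encrypted" (a toy byte reversal here),
    # and the client records which photos match the known-hash database.
    encrypted = {pid: data[::-1] for pid, data in photos.items()}
    matched = [pid for pid, data in photos.items() if photo_hash(data) in known_hashes]
    return encrypted, matched

def server_reviewable(encrypted: dict, matched: list):
    # Server side: below the threshold nothing at all is reviewable; at or
    # above it, only the matched photos are, never the rest of the library.
    if len(matched) < THRESHOLD:
        return []
    return [encrypted[pid][::-1] for pid in matched]

# Example: a 100-photo library with only 5 database matches reveals nothing.
library = {f"IMG_{i}": f"photo {i}".encode() for i in range(100)}
database = {photo_hash(library[f"IMG_{i}"]) for i in range(5)}
enc, hits = client_upload(library, database)
print(len(server_reviewable(enc, hits)))  # prints 0: below the 30-match threshold

In the real design the server could not run the match itself; the sketch only illustrates the threshold gate mr. h describes, where nothing is inspectable until 30 matches accumulate and even then only the matched photos.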
-
Apple backs down on CSAM features, postpones launch
Great news! Apple listened. Their CSAM concept made a mockery of Apple’s privacy ethos. Even though it was well intentioned, it would have turned our iPhones into digital Stasi officers monitoring our every move.
Apple should turn their attention to screening cloud services where much of this offensive material is apparently stored and shared. But they should leave our iPhones alone. Our phones should be sacrosanct paragons of privacy.
-
Tim Cook wants to debut one more big product category before he retires
Cook is a great CEO, no doubt about that. I think his new product category will be Virtual Reality and an Apple ‘multiverse’. Cook only made two mistakes, namely:
1. Letting Jony Ive take ‘minimalism’ too far at Apple, which resulted in the ridiculous butterfly keyboard and the removal of essential ports from MacBooks.
2. Allowing the CSAM privacy backdoor on the iPhone, which contradicts Apple’s entire commitment to privacy.
But no CEO is perfect.
-
Tech industry needs to rebuild user trust after privacy losses, says Tim Cook
According to Tim Cook: "At Apple, when we make something, we make sure that we spend an enormous amount of time thinking carefully about how it will be used." If this were truly the case, Tim, then why on earth would you introduce a ‘back door’ (i.e. the CSAM screener) into your iPhones that can be exploited by malevolent governments? Of course we should protect children, but this is not the way to do it.
-
Civil rights groups worldwide ask Apple to drop CSAM plans
Apple makes mistakes. The butterfly keyboard was one of them. But this mistake was trivial compared to Apple’s decision to start scanning our phones for images. It’s the equivalent of someone entering your house, rifling through your photo albums, taking a few away, and delivering them to the authorities.
Of course we must do all we can to prevent child abuse. But not by abusing our fundamental rights. Surely a better way to prevent CSAM is to scan photo collections in the cloud, like Google does. But this process should NOT occur on our phones, which should be sacrosanct.
Yes, I know that Apple says only half the process is done on the phone and that it’s only looking for digital tags. But that’s just techno mumbo jumbo. The point is that the privacy of our iPhones is being breached and our phones are being used to monitor us. It’s a back door to a bleak future.