macplusplus
About
- Username
- macplusplus
- Joined
- Visits
- 293
- Last Active
- Roles
- member
- Points
- 3,141
- Badges
- 1
- Posts
- 2,119
Reactions
-
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
StrangeDays said: All the boneheads claiming Apple created a "backdoor" -- nope. All the tech companies do this (Dropbox, Microsoft, Google), and Apple did 100% server-side CSAM scanning a year ago:
https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
...you dudes are simply panicking and clutching your pearls because you didn't know about it before.
Previously, Apple had rejected the government's request to develop a custom iOS in order to break into criminals' iPhones. That would have worked on a case-by-case basis, but Apple still rejected it because it would set a precedent. And now Apple sets that precedent voluntarily, and not on a case-by-case basis but for the whole ecosystem...
-
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
lkrupp said: All this handwringing and foaming at the mouth, spittle flying everywhere. Fine, but what do you all plan to DO about it? Leave the iOS platform? Where will you go? Android, even though Google has been doing this for a couple of years now and Apple is just playing catch-up? Will you extricate yourself from the online universe and go off the grid in a bunker in northern Utah or the Yukon territory, maybe Hudson’s Bay? Of course you can’t change anything at the ballot box because both parties are on board with this. It’s coming and there’s NOTHING you can do about it except bitch on a tiny tech blog.
What will you do? Where will you go? Any answers?
-
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM
Apple can resist government requests, but if a government turns that scheme into law, Apple cannot resist.
Apple has set a precedent with this. Many governments will now take that precedent as a template and try to turn it into law. Yet in the past Apple resisted the request to develop a special iOS to break into criminals' phones precisely because it would constitute a precedent.
-
Epic Games CEO slams Apple 'government spyware'
coolfactor said: Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps, thinking he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what has happened to him!
Do photos sent via these messaging apps automatically get added to iCloud Photos simply by being sent and not viewed with one's eyes?
We don't know Apple's thresholds. It could be 1,000 vouchers. It could be 10,000 vouchers. If someone has manually viewed 1,000+ photos by choice, and each photo they view is uploaded to iCloud, then let them go to jail, as they are willfully participating in the exchange of such material.
The technology that Apple has designed and implemented is not the problem. It's strongly designed with privacy in mind. The main concern is that Apple could be forced to put that same technology to other uses, which may border on invasion of privacy. It's the potential for abuse that is getting everyone up in arms.
I trust that Apple has implemented this system really well. Innocent until *proven* guilty, after all, right?
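To make the voucher/threshold argument above concrete, here is a minimal Swift sketch of how such counting could work. The SafetyVoucher type, the 1,000 figure (echoing the guess above), and the flagging logic are placeholders for illustration only, not Apple's published design, which uses threshold secret sharing rather than a simple counter.

import Foundation

// Hypothetical voucher attached to each uploaded photo.
struct SafetyVoucher {
    let photoID: UUID
    let matchedKnownHash: Bool   // did the on-device hash match the known-CSAM hash set?
}

struct AccountReviewState {
    // Placeholder threshold; Apple has not published the real number.
    let matchThreshold = 1_000
    var vouchers: [SafetyVoucher] = []

    mutating func record(_ voucher: SafetyVoucher) {
        vouchers.append(voucher)
    }

    // Only when the count of *matching* vouchers crosses the threshold
    // would an account be surfaced for human review.
    var flaggedForReview: Bool {
        vouchers.filter { $0.matchedKnownHash }.count >= matchThreshold
    }
}

var account = AccountReviewState()
account.record(SafetyVoucher(photoID: UUID(), matchedKnownHash: false))
print(account.flaggedForReview)   // false: non-matching photos never move the counter

The point of the sketch is simply that, under this model, photos that don't match the hash list never move an account toward review, however many of them accumulate.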
-
Open letter asks Apple not to implement Child Safety measures
killroy said: macplusplus said: killroy said: macplusplus said: killroy said: macplusplus said: omasou said: macplusplus said: Apple should shut down iCloud instead of developing a mass surveillance method for the government.
If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
The fact that they are bringing the monitoring right onto my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something on user devices whose scope may extend to God knows where...
Because once it's on Apple's servers they can't see it, since it's encrypted. You have to see it before it's encrypted or it won't work.
Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall in the OS to keep the nefarious stuff off their servers.
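To illustrate the "check it before it leaves the device" point, here is a rough Swift sketch. The perceptualHash function below is just SHA-256 so the example runs; Apple's actual NeuralHash is not a public API, and the real design uses blinded hashes and private set intersection rather than a plain on-device set.

import CryptoKit
import Foundation

// Stand-in for a perceptual hash; the real NeuralHash is not public API.
func perceptualHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical on-device table of known hashes (blinded in Apple's description).
let knownHashes: Set<String> = []

// The device does the comparison itself and only the result rides along with the upload,
// so the server never needs to decrypt the photo to learn the outcome.
func prepareForUpload(_ imageData: Data) -> (encryptedPayload: Data, matchedKnownHash: Bool) {
    let matched = knownHashes.contains(perceptualHash(of: imageData))
    // Encryption stubbed out; in the real pipeline the photo is encrypted before it is uploaded.
    return (encryptedPayload: imageData, matchedKnownHash: matched)
}

let result = prepareForUpload(Data("example image bytes".utf8))
print(result.matchedKnownHash)   // false against an empty hash set

That is all "scanning on device" means mechanically: the comparison happens before encryption, because afterwards the server cannot see the photo at all.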
You can just opt out. I don't think Google or Facebook and others that have been doing it for years give you that.