bbh
About
- Username: bbh
- Joined
- Visits: 62
- Last Active
- Roles: member
- Points: 1,337
- Badges: 2
- Posts: 138
Reactions
- Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'
The problem, to me, is that even after opting out by declining to use iCloud upload for photo storage, the scanning software remains on your iPhone. I think that, and the overall privacy issue, could be addressed by making the scanning software a separate "opt in" download that is required for activating iCloud Photo upload.
In other words: a separate download of the scanning software for those who wish to continue with iCloud Photo storage. No scanning software on your iPhone, no ability to use iCloud Photo upload. This does not seem technologically difficult to me (a rough sketch of that gate follows below).
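Purely as an illustration of the opt-in gate proposed above, here is a minimal Swift sketch. The ScannerModule and ICloudPhotoUpload types are invented for this example and do not correspond to any real Apple API.

```swift
import Foundation

// Hypothetical types illustrating "no scanner installed, no iCloud Photo upload".
// Nothing here exists in Apple's SDKs.
struct ScannerModule {
    let version: String
}

struct ICloudPhotoUpload {
    /// The separately downloaded, opt-in scanning module; nil means the user declined it.
    let scanner: ScannerModule?

    var isAvailable: Bool { scanner != nil }

    func upload(photoNamed name: String) {
        guard isAvailable else {
            print("iCloud Photo upload disabled: scanning module not installed")
            return
        }
        print("Scanning and uploading \(name)")
    }
}

// Declining the download simply removes the capability...
ICloudPhotoUpload(scanner: nil).upload(photoNamed: "IMG_0001.jpg")
// ...while opting in restores it.
ICloudPhotoUpload(scanner: ScannerModule(version: "1.0")).upload(photoNamed: "IMG_0001.jpg")
```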
- Apple details user privacy, security features built into its CSAM scanning system
The motivation behind this, in addition to the outright moral repugnancy of kiddy porn, is that Apple simply does not want their servers to host that stuff. What's wrong with that? If you want to use their servers … your choice … you must go through the CSAM filter. YOU still control your data. Choose to use their servers or don't. This hysteria is unbelievably overwrought.
- Apple details user privacy, security features built into its CSAM scanning system
jdw said: Apple is still not making the majority of people feel comfortable with its plan, and as such Apple has the moral obligation to at the very least DELAY its plan until it can better convey to the public why they will be 100% protected. That remains true even if some contend this is being blown out of proportion.
Apple needs to explain IN DETAIL how it will proactively help anyone falsely accused of engaging in illicit activities, seeing that Apple would be primarily responsible for putting those falsely accused people in that situation in the first place. That discussion must include monetary compensation, among other things.
Apple needs to explain how it intends to address the mental toll on its own employees or contracted workers who will be forced to frequently examine kiddy porn to determine whether the system properly flagged an account. THIS IS HUGE and must not be overlooked! On some level it is outrageous that the very content we wish banned from the world will be forced upon human eyes in order to determine whether a machine made a mistake.
Only when these points have been adequately addressed should Apple begin implementing its plan or a variant of it. The next operating system release is too soon.
Apple truly has done a terrible job of explaining what actually happens downstream of a photo being flagged. First, nobody is actually looking at your photo or any other photo. The system compares a hash generated when your photo is uploaded against the hashes of known CSAM. If your photos cross the threshold of too many suspect matches, a human confirms that the HASHES of the suspect photos match the known CSAM hashes (a toy sketch of this threshold matching follows below). Nobody ever looks at your photo. This is not a case of someone looking at your pictures and declaring that "to me, this is child pornography."
And if you are still unconvinced and wrapped up in the hysteria surrounding this issue, all you have to do is go to Settings > Photos > iCloud Photos and slide the switch to OFF. There are numerous ways to back up your photos and have them available on numerous platforms anywhere in the world without using iCloud Photos. But unless you rely entirely on your own personal backup solution (I have a 2-drive Synology NAS), chances are that any other place you choose to remotely store your photos is most likely doing CSAM scanning already.
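To make the threshold description in the post above concrete, here is a deliberately simplified Swift sketch. Assume the string hashes stand in for opaque perceptual fingerprints; every type, name, and value is hypothetical, and this is not Apple's actual implementation (which relies on NeuralHash, threshold secret sharing, and private set intersection).

```swift
import Foundation

// Toy stand-in for the opaque hash derived on-device for each upload.
typealias PhotoHash = String

struct UploadScreening {
    /// Hashes of known CSAM, supplied as an opaque database; the device holds
    /// only these values, never the underlying images.
    let knownHashes: Set<PhotoHash>
    /// How many matches an account must accumulate before anything is surfaced
    /// for human confirmation.
    let threshold: Int

    /// Returns the matching hashes only if the match count crosses the
    /// threshold; below the threshold, nothing is reported at all.
    func matchesForReview(in uploads: [PhotoHash]) -> [PhotoHash]? {
        let matches = uploads.filter { knownHashes.contains($0) }
        return matches.count >= threshold ? matches : nil
    }
}

let screening = UploadScreening(knownHashes: ["a1f3", "9c2e", "d407"], threshold: 2)

// A single stray match stays below the threshold: nothing is reported.
print(screening.matchesForReview(in: ["77b0", "a1f3"]) as Any)         // nil

// Only once enough uploads match known hashes is anything escalated, and what
// gets escalated is the list of matching hash values, not a photo library.
print(screening.matchesForReview(in: ["a1f3", "9c2e", "77b0"]) as Any) // Optional(["a1f3", "9c2e"])
```

The design point mirrored from the post is that results below the threshold produce no output at all, and even above it the reviewer is handed hash matches rather than the account's photos.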
- Apple employees express concern over new child safety tools
applesauce007 said: I think the Apple tool is a great idea.
The iPhone has top notch photo and video capabilities and is very portable.
As such, it can easily be a portable tool to abuse children.
Apple wants to stop that. Go Apple go go go!
APPLE is trashing years and years of reputation with this. BAD IDEA.
- Satechi introduces the USB-C Clamp Hub for 24-inch iMac