macplusplus
About
- Username: macplusplus
- Joined:
- Visits: 288
- Last Active:
- Roles: member
- Points: 3,140
- Badges: 1
- Posts: 2,118
Reactions
Open letter asks Apple not to implement Child Safety measures
killroy said: macplusplus said: killroy said: macplusplus said: omasou said: macplusplus said:

Apple should shut down iCloud instead of developing a mass surveillance method for the government.

If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.

The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something onto user devices whose scope may extend to God knows where...

Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.

So you are saying this is not true.
Epic Games CEO slams Apple 'government spyware'
crowley said: macplusplus said: 9secondkox2 said: crowley said: newisneverenough said:

Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And they should cancel this initiative.

Not anymore. Goodbye iCloud storage.

Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”

But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.

Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps, thinking he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate, and our poor guy will end up in jail without ever figuring out what has happened to him!
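The "safety voucher" accumulation the post describes can be sketched as threshold-based matching that runs on the device before upload. This is a deliberately simplified, hypothetical illustration, not Apple's actual protocol (which used a perceptual NeuralHash, a blinded hash database, private set intersection, and threshold secret sharing); every name, the hash function, and the threshold value below are stand-ins for illustration only.

```python
import hashlib

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real system uses a perceptual hash that is robust to
    # resizing and re-encoding, not a cryptographic hash like SHA-256.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known flagged images.
KNOWN_HASH_DB = {perceptual_hash(b"known-bad-image-%d" % i) for i in range(5)}

MATCH_THRESHOLD = 3  # the real proposal reportedly used a much higher threshold

def process_upload(image_bytes: bytes, vouchers: list) -> bool:
    """Runs on device before the photo is encrypted and uploaded.

    Appends a "safety voucher" when the photo's hash matches the database;
    returns True only once the voucher count crosses the threshold, which is
    the point at which the account would be reviewed.
    """
    h = perceptual_hash(image_bytes)
    if h in KNOWN_HASH_DB:
        vouchers.append(h)
    return len(vouchers) >= MATCH_THRESHOLD
```

In the scenario above, `process_upload` would fire for every image auto-saved to the photo library, so vouchers accumulate whether or not the user later deletes the images from the chat apps; deleting an image from the chat does not remove a voucher that was already generated at upload time.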
Epic Games CEO slams Apple 'government spyware'
9secondkox2 said: crowley said: newisneverenough said:

Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed. And they should cancel this initiative.

Not anymore. Goodbye iCloud storage.

Nothing to hide. Also not willing to allow the first footstep onto a slippery slope of “oh, your data is only yours. Well, unless we find a reason for it not to be.”

But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement on your own property.
Open letter asks Apple not to implement Child Safety measures
Apple Watch 'black box' algorithms unreliable for medical research [u]
dysamoria said: macplusplus said: AppleInsider said:

Two sets of the same daily heart rate variability data from one Apple Watch were collected, covering the same period from December 2018 until September 2020. While the sets were retrieved on September 5, 2020, and April 15, 2021, the data should have been identical given they dealt with identical timeframes, but differences were discovered.

Whether that "revision" is suitable for your research project is neither my concern nor Apple's business. Go find some funding...