macplusplus

About

Username: macplusplus
Joined:
Visits: 288
Last Active:
Roles: member
Points: 3,140
Badges: 1
Posts: 2,118
  • Open letter asks Apple not to implement Child Safety measures

    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM and is designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implement something on user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted, but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords, and certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement at any time (with a search warrant), they can do the same for their hash checking. Since they already obtain permission to scan your content on iCloud through the license agreement, what is the point of injecting another, questionable tool into your device, your own property?

    So you are saying this is not true.

    What that support document doesn't mention is that iCloud data is encrypted using Apple's own keys, not yours. More accurately, the document states that indirectly by listing the limited cases in which your own device keys are used, i.e. the end-to-end encryption cases. All other cases don't use your device keys, which means, by deduction, that they use Apple's keys.
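
    A minimal sketch of that last point, in Swift, with every name hypothetical and a plain SHA-256 standing in for whatever perceptual hash Apple actually uses: if the provider rather than the user holds the key, nothing stops the same match from being run on the server after decryption.

    ```swift
    import Foundation
    import CryptoKit

    // Toy model only: `serverKey` stands in for the provider-held iCloud key,
    // and SHA-256 stands in for Apple's perceptual NeuralHash.
    let serverKey = SymmetricKey(size: .bits256)

    // Hash list a server-side matcher would hold (left empty here).
    let knownBadHashes: Set<Data> = []

    // "Encrypted at rest, but with the provider's key" -- the provider can undo this.
    func sealForStorage(_ photo: Data) throws -> Data {
        try ChaChaPoly.seal(photo, using: serverKey).combined
    }

    // Because the provider holds the key, it can decrypt and hash on its own servers.
    func matchesOnServer(_ stored: Data) throws -> Bool {
        let plaintext = try ChaChaPoly.open(ChaChaPoly.SealedBox(combined: stored), using: serverKey)
        return knownBadHashes.contains(Data(SHA256.hash(data: plaintext)))
    }

    do {
        let stored = try sealForStorage(Data("some photo bytes".utf8))
        print("server-side match:", try matchesOnServer(stored))
    } catch {
        print("error:", error)
    }
    ```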

  • Epic Games CEO slams Apple 'government spyware'

    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No, they don't refuse anything. If their intent were to refuse something, they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" attached if there is a CSAM match. And if those vouchers reach a certain threshold, they report you.

    Just for the sake of argument, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps and thinks he keeps his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate, and our poor guy will end up in jail without ever figuring out what has happened to him!
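
    A toy model of the mechanism described above, in Swift. The names and the threshold value are illustrative, not Apple's actual parameters:

    ```swift
    import Foundation

    // One "safety voucher" per photo that matched a known hash.
    struct SafetyVoucher { let photoID: UUID }

    struct AccountState {
        private(set) var vouchers: [SafetyVoucher] = []
        let reportThreshold = 30   // illustrative number only

        // Uploads are never refused; a voucher is merely attached on a match.
        // Returns true once the account would be flagged for human review.
        mutating func accept(photoID: UUID, matchesKnownHash: Bool) -> Bool {
            if matchesKnownHash {
                vouchers.append(SafetyVoucher(photoID: photoID))
            }
            return vouchers.count >= reportThreshold
        }
    }

    var account = AccountState()
    // Images auto-saved to the photo library from a chat app count the same
    // as images the user saved deliberately.
    for n in 1...40 {
        if account.accept(photoID: UUID(), matchesKnownHash: true) {
            print("threshold reached after \(n) matches: account flagged")
            break
        }
    }
    ```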
  • Epic Games CEO slams Apple 'government spyware'

    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try and excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement on your own property.
  • Open letter asks Apple not to implement Child Safety measures

    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
  • Apple Watch 'black box' algorithms unreliable for medical research [u]

    dysamoria said:

    Two sets of the same daily heart rate variability data from one Apple Watch were collected, covering the same period from December 2018 until September 2020. While the sets were retrieved on September 5, 2020, and April 15, 2021, the data should have been identical given they dealt with identical timeframes, but differences were discovered.

    Even medical-grade devices update their firmware. Are those updates "transparent"? Do you know how that medical-grade ECG machine of yours reports those "bundle branch blocks" supposedly existing in your heart? So stop making stupid assumptions and thank Apple for refining its algorithms and retroactively revising user data. This just shows Apple's serious commitment to the integrity and accuracy of user health data.

    Whether that "revision" is suitable for your research project is neither my concern nor Apple's business. Go find some funding...
    Medical-grade devices generally have simpler designs and more reliable sensing mechanisms because they’re not designed to be a small, standalone, luxury wrist computer.
    The whole article and discussion became moot after Apple's response. According to Apple, there is no backward revision of user data after an algorithm change. Bugs aside, if one gets two different data sets after exporting the same data twice within a one-year interval, then it is the export procedure that should be inspected. HRV data is no big deal: there is a beats-per-minute column, always an integer value, and a timestamp column. The different export programs that interpret these values and derive HRV may produce such discrepancies. That is not enough to justify bold claims like "Apple should provide sensor data!"
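
    To illustrate the point, here is a small Swift sketch (all numbers made up) showing how two exporters that derive inter-beat intervals from the same integer BPM column, but round differently, end up with slightly different HRV (SDNN) figures:

    ```swift
    import Foundation

    // The same integer beats-per-minute column, as described above.
    let bpmSamples = [62, 63, 61, 64, 62, 65, 60, 63]

    // SDNN: standard deviation of the inter-beat intervals, in milliseconds.
    func sdnn(_ intervalsMs: [Double]) -> Double {
        let mean = intervalsMs.reduce(0, +) / Double(intervalsMs.count)
        let variance = intervalsMs.map { ($0 - mean) * ($0 - mean) }.reduce(0, +) / Double(intervalsMs.count)
        return variance.squareRoot()
    }

    // Exporter A keeps the derived interval (60000 / bpm) at full precision.
    let rrA = bpmSamples.map { 60_000.0 / Double($0) }
    // Exporter B rounds each interval to whole milliseconds before computing HRV.
    let rrB = bpmSamples.map { (60_000.0 / Double($0)).rounded() }

    print(String(format: "SDNN A: %.3f ms, SDNN B: %.3f ms", sdnn(rrA), sdnn(rrB)))
    ```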