macplusplus

About

Username: macplusplus
Joined
Visits: 293
Last Active
Roles: member
Points: 3,141
Badges: 1
Posts: 2,119
  • Open letter asks Apple not to implement Child Safety measures

    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implant something on user devices whose scope may extend to God knows where...

    Because once it's on Apple servers they can't see it because it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. They store iCloud content on their servers encrypted, but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions like passwords, and certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement at any time (with a search warrant), they could run their hash checks there too (a rough sketch of such a server-side check follows this post). Since they already get permission to scan your content on iCloud through the license agreement, what is the point of injecting another, questionable tool into your device, your own property?

    Your phone yes, the OS not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want to. What Apple is proposing to do is add a firewall at the OS level to keep the nefarious stuff off their servers.

    Today iCloud Photos. Tomorrow what? iCloud Drive? Including the Desktop and Documents folders of all your Macs? To protect themselves from your nefarious junk?
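
    A minimal sketch, in Swift, of the kind of server-side check described above, assuming a plain digest list; the type and names here are made up, and the actual proposal matches perceptual NeuralHash fingerprints rather than cryptographic hashes:

        import Foundation
        import CryptoKit

        // Hypothetical server-side check: digest each stored photo and compare it
        // against a set of known-CSAM digests supplied by a reporting body.
        // The real proposal uses perceptual hashing, not SHA-256; this only
        // illustrates where such a check could run.
        struct ServerSideScanner {
            let knownDigests: Set<String>   // hex digests of known material

            func digest(of imageData: Data) -> String {
                SHA256.hash(data: imageData)
                    .map { String(format: "%02x", $0) }
                    .joined()
            }

            func isKnownMatch(_ imageData: Data) -> Bool {
                knownDigests.contains(digest(of: imageData))
            }
        }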
  • Epic Games CEO slams Apple 'government spyware'

    crowley said:
    Apple completely screwed this up. It’s conceptually wrong from the start. I can’t believe this even got initiated as a project. It’s idiotic to try to excuse looking into private data by justifying the method of the technology. Apple’s entire stance before now is something I have supported. In this ridiculous step they’ve utterly failed, and should cancel this initiative.
    If it's being uploaded then it's not simply private data, it's data that the user is pushing into Apple's domain.  Why shouldn't Apple take steps to verify that they aren't being asked to host illegal photographs?
    It’s MY data being stored. Supposedly with my privacy in mind. 

    Not anymore. 

    Goodbye iCloud storage. 

    Nothing to hide. Also not willing to allow the first footstep into a slippery slope of “oh. Your data is only yours. Well, unless we find a reason for it not to be.”
    It is your data and it is private but that privacy cannot prevent Apple from performing legally required checks and scans on their servers. This is one reason most of the iCloud data is not end-to-end encrypted. It is still encrypted, but with Apple's keys, not your own device keys. Practically unencrypted, from the user's point of view. And this is why law enforcement can access your iCloud data anytime by presenting a search warrant.

    But scanning your iPhone is totally different. It is your property, not Apple's. You didn't rent that device from Apple, you bought it. And the child protection pretext falls short given the invasiveness of what they want to implement.
    They aren't scanning your iPhone, they're scanning the photo that you want to put on their service.  They're refusing to even take it without first checking if it matches a known child abuse picture.  That seems fine to me.
    No, they don't refuse anything. If their intent were to refuse something, they could refuse it on the server as well. They accept whatever you send, but with an associated "safety voucher" if there is a CSAM match. And if those vouchers reach a certain threshold, they report you.
    They're refusing to even take it without first checking if it matches a known child abuse picture.
    How so? The graphic says "uploaded to Apple", so they take it.

    https://appleinsider.com/articles/21/08/06/what-you-need-to-know-apples-icloud-photos-and-messages-child-safety-initiatives
    After the check.  
    No, the check only produces the safety voucher. Besides, since human review is required after the threshold is reached, refusing anything before the human review would be meaningless. (A sketch of this voucher-and-threshold flow follows at the end of this post.)
    I think you've misunderstood me. I never said that Apple will refuse to take the photo based on the output of the check, just that they refuse to take it without the check having taken place. The result of the check doesn't make a difference to whether they take it, but it has to be checked against the CSAM list.
    OK, the confusion came from refusing on the server versus refusing on the client. The check, as proposed, seems to be for a good cause, but it is new; it signals the introduction of a new paradigm. That new paradigm is what causes people's concern and questioning about where it can go. No such labeling of user content at the request of a third party has occurred before. Today pictures, tomorrow what? Would you like to be labeled a copyright pirate because of a song sent to you as a gift that you long forgot exists somewhere on your iPhone?
    Go to Daring Fireball, read about it, read Apple's own technical description of how it works. The scenario you present as a concern will not happen. The problem is that a bunch of people who either don't understand it, didn't take the time to understand it, or, like the Epic CEO, want to deliberately misrepresent it for personal gain have shaped your idea of how it works, and they're all wrong. Take the time to actually learn about it and understand it before you condemn it.
    Gruber admits that this is new:
    "The difference going forward is that Apple will be matching fingerprints against NCMEC’s database client-side, not server-side"

    He then continues:

    "
    This slippery-slope argument is a legitimate concern." and 

    "
    Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.

    We shall see. The stakes are incredibly high, and Apple knows it. Whatever you think of Apple’s decision to implement these features, they’re not doing so lightly."

    That doesn't support your argument much.
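
    For reference, a rough Swift sketch of the voucher-and-threshold flow argued about above, reduced to a plain counter; all names and the threshold value are assumptions, and the actual design relies on private set intersection and threshold secret sharing so nothing is readable before the threshold is crossed:

        import Foundation

        // Every upload is accepted; a safety voucher is attached whether or not
        // the photo matched a known hash. Only when enough matching vouchers
        // accumulate is the account flagged for human review.
        struct SafetyVoucher {
            let assetID: String
            let matchedKnownHash: Bool
        }

        final class UploadPipeline {
            private var vouchers: [SafetyVoucher] = []
            let reviewThreshold: Int

            init(reviewThreshold: Int = 30) {   // assumed value for illustration
                self.reviewThreshold = reviewThreshold
            }

            func upload(assetID: String, matchesKnownHash: Bool) {
                // The photo is stored regardless of the match result.
                vouchers.append(SafetyVoucher(assetID: assetID,
                                              matchedKnownHash: matchesKnownHash))
            }

            var flaggedForHumanReview: Bool {
                vouchers.filter(\.matchedKnownHash).count >= reviewThreshold
            }
        }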
  • Open letter asks Apple not to implement Child Safety measures

    jungmark said:
    What happens if someone sends child porn images to someone else's iPhone? Will they be taken down? This could be a way to silence critics.
    Nothing unless you upload the images to iCloud. 
    You don't "upload" images to iCloud. The operating system automatically syncs your photo library to iCloud once you activate iCloud Photos. So photos sent to someone may make him a real criminal if the recipient forgets to check his photo library after deleting it in the receiving messaging app, since messaging apps may come with the default option of saving media to the photo library.

    Many people use their phones by trial and error and don't have a coherent view of the operating system; they are unaware of what resides where.

    And Apple's scheme doesn't warn you in such cases. Even if it detects a hash match, it continues to sneakily upload the photo to iCloud while updating your "perversion" score. If that score reaches some threshold, it reports you, to catch you red-handed right where you are, damn pervert!
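
    To illustrate the point about messaging apps, this is roughly how an iOS app saves a received image into the shared photo library with the real PhotoKit API (the function name is made up); once the asset is in the library, iCloud Photos syncs it on its own if the user has it enabled:

        import Photos
        import UIKit

        // A messaging app writing a received picture into the user's photo
        // library. From that moment the image lives outside the app and is
        // picked up by iCloud Photos sync, whether or not the user realizes it.
        func saveReceivedImage(_ image: UIImage) {
            PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
                guard status == .authorized || status == .limited else { return }
                PHPhotoLibrary.shared().performChanges({
                    PHAssetChangeRequest.creationRequestForAsset(from: image)
                }) { success, error in
                    print(success ? "Saved to photo library"
                                  : "Save failed: \(String(describing: error))")
                }
            }
        }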
  • Open letter asks Apple not to implement Child Safety measures

    killroy said:
    What happens if someone sends child porn images to someone else's iPhone? Will they be taken down? This could be a way to silence critics.

    There's a difference between sent and downloaded. They can send you photos now, and then call the cops making a swatting call.
    Sent or downloaded doesn't matter. In most jurisdictions, possession, i.e. having it stored on your device, is the crime. You must check what messaging apps automatically save in your photo library.
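
    One way to make that check, sketched with the standard PhotoKit fetch calls (the seven-day window is an arbitrary choice, and the app is assumed to already have read access to the library): list whatever landed in the photo library recently, regardless of which app put it there.

        import Photos

        // Fetch images added to the photo library in the last 7 days,
        // no matter which app saved them.
        func recentlyAddedImages() -> [PHAsset] {
            let options = PHFetchOptions()
            let weekAgo = Date(timeIntervalSinceNow: -7 * 24 * 60 * 60)
            options.predicate = NSPredicate(format: "creationDate > %@", weekAgo as NSDate)
            options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]

            let result = PHAsset.fetchAssets(with: .image, options: options)
            var assets: [PHAsset] = []
            result.enumerateObjects { asset, _, _ in assets.append(asset) }
            return assets
        }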