StrangeDays

About

Username
StrangeDays
Joined
Visits
315
Last Active
Roles
member
Points
33,948
Badges
2
Posts
13,227
  • Apple exec said iCloud was the 'greatest platform' for CSAM distribution

    Of all the recent CSAM whiners, I haven’t seen any address their thoughts on the fact that Google, Dropbox, Microsoft, and Twitter already do this... Nobody allows images to be stored on their servers without scanning hashes to ensure they aren't known child porn. Microsoft has PhotoDNA, and Google has its own tools:

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    …are they also outraged about it? Are they going to give up Gmail and Dropbox because they too scan hashes for child pornography?

  • Purported 'Apple Watch Series 7' renders show larger speakers, iPhone 12-style straight ed...

    If Apple wanted to innovate, they could try a round face as an option. That new Samsung watch looks sexy, has better features and the price is a lot lower than the Apple Watch. If it worked with an iPhone, I would be tempted.
    Nope, there’s plenty of actual innovation in the march of progress and features in the AW. Doing a round face isn’t inherently innovative; it’s just a departure from the current design. Personally I value the information display area of a rectangle and don’t need a watch trying to mimic an analog swinging-arm design.
  • Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'

    mfryd said:
    NYC362 said:
    Come on already.  Google and Facebook have been doing this for years.   Suddenly when Apple wants to do the same thing, everyone gets twisted.

    It is just like the 1,000 gas-powered cars that catch on fire and get not a word of national attention, but one Tesla goes up in flames and there's a worldwide news bulletin like it's the end of the world.
    While Google and Facebook have been scanning user data for years, they are honest and open about it. They make their money by selling information they glean about their customers. Google doesn't charge for Gmail because they learn more about you when you use their services, and hence make more profit selling that higher-grade information.

    On the other hand, Apple prides itself on protecting the privacy of their customers.  A key sales point in buying into the Apple ecosystem is that Apple does everything they possibly can in order to protect your data.  They even fight court orders requiring them to add back doors to iPhone local encryption.

    Under Apple's new policy, every image you upload to your iCloud library will be scanned, and compared against a list of blacklisted images.  If you have too many blacklisted images, you will be reported to the authorities.  

    Initially, the blacklist will only contain child porn images.  I can easily imagine a narcissistic leader ordering Apple to add to that list images that are critical of the government.   Imagine a photo of a President that makes him look foolish, shows him in a compromising position, or reveals a public statement to be a lie.  Such a President would have a strong incentive to add these photos to the list.  Remember, Apple doesn't know what the blacklisted photos look like; Apple only has digital fingerprints of these images (it would be illegal for Apple to possess child porn, even if it were for a good cause).
    You can imagine whatever you like, but the system uses the same NCMEC child porn database utilized by Microsoft, Google, Dropbox, Twitter, etc. Your little fantasy about the POTUS adding images to it would affect every one of those major cloud storage providers, not just Apple. 

    There is nothing new here. You’re not forced to use any of these commercial cloud services to host your images online. 

    And reporting child porn libraries to the authorities is not some choice Apple makes; they're required to by law, as are the other cloud storage services. You can't store child porn; it's against the law. 
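    The mechanism described above — comparing uploads against a database of known-image fingerprints and reporting only past a threshold — can be sketched as follows. This is my own illustrative sketch, not Apple's actual system: real implementations (PhotoDNA, NeuralHash) use perceptual hashes that survive resizing and re-encoding, while plain SHA-256 is used here only to keep the example self-contained, and the `REPORT_THRESHOLD` value is hypothetical.

```python
import hashlib

# Hypothetical blacklist of fingerprints of known images.
# Note: the service holds only hashes, never the images themselves.
BLACKLIST = {
    hashlib.sha256(b"known-bad-image-1").hexdigest(),
    hashlib.sha256(b"known-bad-image-2").hexdigest(),
}

REPORT_THRESHOLD = 2  # hypothetical: a single match does not trigger a report

def scan_upload(image_bytes_list):
    """Count uploads whose fingerprint matches the blacklist;
    flag the account only when matches reach the threshold."""
    matches = sum(
        1
        for data in image_bytes_list
        if hashlib.sha256(data).hexdigest() in BLACKLIST
    )
    return matches >= REPORT_THRESHOLD

# One match stays below the threshold; two matches trigger a flag.
print(scan_upload([b"known-bad-image-1", b"cat-photo"]))            # False
print(scan_upload([b"known-bad-image-1", b"known-bad-image-2"]))    # True
```

    Note that an innocuous photo never matches: only byte-for-byte (or, in real systems, perceptually equivalent) copies of already-catalogued images produce a hit.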
  • Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'

    elijahg said:

    Criticisms of the features centered on the perception that Apple was analyzing photos on users' iPhones. "That's a common but really profound misunderstanding," said Federighi.

    "This is only being applied as part of a process of storing something in the cloud," he continued. "This isn't some processing running over the images you store in Messages, or Telegram... or what you're browsing over the web. This literally is part of the pipeline for storing images in iCloud."
    So they’re actually not analysing photos on-device as they’re uploaded to iCloud? Meaning the content of the white paper Apple published the other day is actually wrong? Or does he think people will be convinced all’s fine by claiming that photos are not scanned, only “analysed” as part of the upload process? That code could easily be expanded into a daemon that scans all data on the phone at any time. That’s the issue. Nothing to do with messaging (though that wasn’t great either).
    He’s very clearly referring to the misunderstanding that Apple is analyzing all photos all the time, like in Messages or Telegram. It isn’t. It’s analyzing the photo prior to uploading it to their commercial iCloud servers, so they don’t have to host child porn. 

    These are numeric hash comparisons to the hash signatures of known child porn. The hash is not the photo, it’s a numeric representation of the photo — in the same way that your iPhone doesn’t store your FaceID or TouchID images, but instead stores a numeric hash which is used for authentication. 

    Man, how do you not get this yet?
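    The "the hash is not the photo" point above can be shown concretely. This is a minimal sketch using SHA-256 as a stand-in (Apple's NeuralHash is a perceptual hash, not a cryptographic one, but the property illustrated — a small, fixed-size number derived from the data, from which nothing visual can be recovered — is the same).

```python
import hashlib

# Any image, regardless of size, reduces to a fixed-size digest.
image_bytes = b"pretend-image-data"
digest = hashlib.sha256(image_bytes).hexdigest()

print(len(digest))  # 64 hex characters, no matter how large the image is

# The digest is deterministic: the same bytes always give the same hash...
print(digest == hashlib.sha256(b"pretend-image-data").hexdigest())  # True

# ...and different bytes give a different hash.
print(digest == hashlib.sha256(b"a-different-image").hexdigest())   # False
```

    Because the function is one-way, holding a database of digests tells you whether a given file matches a known one, but it cannot reconstruct the file — which is why services can check for known CSAM without possessing it.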
  • Apple's Federighi says child protection message was 'jumbled,' 'misunderstood'

    darkvader said:
    The message is not the problem.

    The spyware is the problem.
    It’s not spyware. It’s the exact same child porn hash checking used by Microsoft, Google, Dropbox, Twitter, and Tumblr. Nobody wants to host child pornography on their own servers, and they won’t let you upload it. If you use any of those services, you’re already using CSAM hash checking. 

    You are free to not pay to use iCloud Photos and your problem is solved. 