crowley

I don't add "in my opinion" to everything I say because everything I say is my opinion.  I'm not wasting keystrokes on clarifying to pedants what they should already be able to discern.

About

Banned
Username
crowley
Joined
Visits
454
Last Active
Roles
member
Points
11,767
Badges
2
Posts
10,453
  • Hackers selling data on 100M T-Mobile customers after server attack

    Dogperson said:
    Not just this company, but ALL the personal info hacks - WHY IS NONE OF THIS INFORMATION ENCRYPTED???????
    Great question. Why is none of your information encrypted on iCloud? We know it isn't because Apple can scan your photos for illegal images and then have humans review them before sending all your data unencrypted to the government.
    You sure are wrong a lot: https://support.apple.com/en-us/HT202303

    Data                          In transit   On server   Notes
    Backup                        Yes          Yes         A minimum of 128-bit AES encryption
    Safari History & Bookmarks    Yes          Yes
    Calendars                     Yes          Yes
    Contacts                      Yes          Yes
    Find My (Devices & People)    Yes          Yes
    iCloud Drive                  Yes          Yes
    Messages in iCloud            Yes          Yes
    Notes                         Yes          Yes
    Photos                        Yes          Yes
    Reminders                     Yes          Yes
    Siri Shortcuts                Yes          Yes
    Voice Memos                   Yes          Yes
    Wallet passes                 Yes          Yes
    I believe much or all of that is encrypted with keys that Apple holds, though, not with end-to-end user keys, so theoretically Apple could decrypt it and share it with law enforcement.  A pretty compelling reason, from a consumer standpoint, to switch to full end-to-end encryption and shift any validation to on-device systems.
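    The distinction being drawn here is key custody, not whether encryption happens at all. A minimal sketch of that difference (toy XOR stand-in for a real cipher like AES; illustration only, not Apple's actual implementation):

    ```python
    import secrets

    def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
        # XOR stand-in for a real cipher; illustration of key custody only.
        return bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))

    toy_decrypt = toy_encrypt  # XOR is its own inverse

    photo = b"user photo bytes"
    server_key = secrets.token_bytes(32)  # server-side: the provider holds this
    user_key = secrets.token_bytes(32)    # end-to-end: only the user's devices hold this

    # "Encrypted on server": the provider can decrypt on demand (e.g. for a court order).
    stored = toy_encrypt(server_key, photo)
    assert toy_decrypt(server_key, stored) == photo

    # End-to-end: the ciphertext reaches the server, but the key never does,
    # so the provider cannot recover the photo from what it stores.
    stored_e2e = toy_encrypt(user_key, photo)
    assert toy_decrypt(server_key, stored_e2e) != photo
    ```

    In both cases the table above would read "Yes / Yes"; the difference is who can turn the ciphertext back into the photo.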
    ronn
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    bdubya said:

    And scanning with on-device learning of your photo library for any reason is an invasion of privacy without permission

    But Apple's CSAM check doesn't do this!  It doesn't "learn" your photo library.  It checks individual photos at the point of upload to iCloud, for which the terms and conditions clearly give Apple the right; therefore, by accepting the terms and conditions, the user has given their permission.  If they don't want to give their permission then they can opt out of iCloud Photos.

    Moreover, the fact that iCloud in China is not totally under Apple's control is probably a significant reason for Apple to rely on on-device processing for these checks: on-device processing remains totally under Apple's control.
    killroysagan_studentlkruppbyronlscstrrfronnjahbladeuraharan2itivguywatto_cobra
  • Apple engineers lack optimism about the Apple TV strategy, claims report

    As people migrate away from cable to streaming, Apple TV -- at least a good, quality Apple TV -- is and will be needed more than ever.
    In addition, Apple could (and I think should) produce a Dolby Atmos system that ties into it.   They should also be looking at gaming consoles to challenge Xbox and PlayStation.

    But, Apple seems to be befuddled by the home:
    -- Their HomeKit doesn't seem to get much interest or direction from management
    -- Their Apple TV doesn't seem to get much interest or direction either.
    -- Then instead of expanding and enhancing the HomePod, they killed it.

    Apple has so much cash that they are literally giving it away.  But they won't invest in products to enhance the home despite so much potential there.

    But it's also contradictory: they are investing billions into Apple TV+.  But what are people supposed to watch it on?  Comcast cable?
    Lots of TVs now come with a TV app built in.  And there's an Apple TV app for PlayStation.  And there are a fair few Mac and iOS devices out there.
    StrangeDaysscstrrf
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    Then I assume you don’t use Dropbox, Gmail, Twitter, Tumblr, etc etc… They all use the CSAM database for the same purpose. 

    The main take-away: commercial cloud hosting runs on the provider's servers. Should they not take measures to address child pornography on them? If you don't use their commercial service, there's no issue. Is that not reasonable? One needn't use commercial hosting services, especially for illegal purposes.
    And this is exactly what criminals actually do: they are not stupid enough to use iCloud; they have the dark web; they have browsers and file transfer tools tailored to the special protocols developed for the dark web. Apple has long explained very well that iCloud backups are not end-to-end encrypted. Law enforcement has (or should have) no issue with iCloud, because they can get any person's unencrypted iCloud data anytime by presenting a court order. And I assure you, this is almost always much faster than Apple's surveillance, based on the accumulation of some nasty tokens and the subsequent human review.

    So, that child protection pretext stinks. Since law enforcement can access iCloud data anytime, Apple's attempt to adopt a self-declared law enforcement role to “prevent crimes before they occur” is Orwellian!
    I'mma just leave this here:
    U.S. law requires tech companies to flag cases of child sexual abuse to the authorities. Apple has historically flagged fewer cases than other companies. Last year, for instance, Apple reported 265 cases to the National Center for Missing & Exploited Children, while Facebook reported 20.3 million, according to the center’s statistics. That enormous gap is due in part to Apple’s decision not to scan for such material, citing the privacy of its users.
    From: https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
    bulk001StrangeDaysradarthekatwatto_cobradysamoria
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    Rayz2016 said:

     this can be done without telling anyone because the code isn't open source. 
    Incidentally, while the code isn't open source, it's probably amongst the most exposed closed-source code Apple has ever committed, since they've published a detailed description of the PSI protocol, with all of its constraints and logical operators: https://www.apple.com/child-safety/pdf/Apple_PSI_System_Security_Protocol_and_Analysis.pdf.

    Apple are being uncharacteristically open about what they're doing.
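    For anyone who hasn't read the PDF, the core idea being protected is just "match a fingerprint against a known database at upload time." A deliberately naive sketch of that idea (this is NOT Apple's PSI protocol: the real system uses perceptual NeuralHash fingerprints rather than cryptographic hashes, blinds the database so the device can't read it, and reveals nothing to the server until a match threshold is crossed; the names here are hypothetical):

    ```python
    import hashlib

    # Hypothetical stand-in for the known-image fingerprint database.
    known_hashes = {hashlib.sha256(b"known-flagged-image").hexdigest()}

    def check_at_upload(image_bytes: bytes) -> bool:
        """Return True if the image's fingerprint is in the known database.
        Toy version: real systems use perceptual hashes, not SHA-256."""
        fingerprint = hashlib.sha256(image_bytes).hexdigest()
        return fingerprint in known_hashes

    assert check_at_upload(b"known-flagged-image") is True
    assert check_at_upload(b"vacation photo") is False
    ```

    The published PSI protocol exists precisely because this naive version leaks too much: it would let the server learn about every match immediately and let the device inspect the database.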
    watto_cobradysamoria