foregoneconclusion

About

Username: foregoneconclusion
Visits: 254
Roles: member
Points: 10,806
Badges: 2
Posts: 3,056
  • Butterfly keyboard MacBook owners compensation payments are arriving

    motif88 said:
    Nothing fake about it… I had my butterfly keyboard replaced twice. Did you try and type on that POS for 2+ years? I did.

    Just received my well earned $395 check today.
    AppleInsider tried to assess what the repair rates were for the MBPs with butterfly mechanisms. Their final conclusion was that MAYBE the 1st year of MBPs with that mechanism had higher repair rates. That was about as definitive as it ever got. And you have to remember that the MacBook got the butterfly mechanism a year earlier than the MBP and didn't generate complaints. 
  • Fortnite coming to iPhones in the EU via AltStore

    Fortnite was already available on iOS via three different browser game streaming platforms.
  • Apple's batterygate settlement fund didn't have enough money to pay a $92.17 check

    iOS_Guy80 said:
    Where on the check is dated January 3, 2024?
    That's a UK mistake... the United States uses month/day/year, but they were reading it as day/month/year.
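    To make the ambiguity concrete, here's a quick Swift sketch (the date string and formats are only illustrative) showing how the same printed date reads under each convention:

        import Foundation

        let printed = "01/03/2024"   // the date as printed on the check

        // Parse under the US convention (month/day/year)...
        let us = DateFormatter()
        us.dateFormat = "MM/dd/yyyy"
        us.timeZone = TimeZone(identifier: "UTC")
        let usDate = us.date(from: printed)!

        // ...and under the UK convention (day/month/year).
        let uk = DateFormatter()
        uk.dateFormat = "dd/MM/yyyy"
        uk.timeZone = TimeZone(identifier: "UTC")
        let ukDate = uk.date(from: printed)!

        // Format both back out to show the two readings diverge.
        let out = DateFormatter()
        out.dateFormat = "MMMM d, yyyy"
        out.locale = Locale(identifier: "en_US_POSIX")
        out.timeZone = TimeZone(identifier: "UTC")
        print(out.string(from: usDate))   // January 3, 2024
        print(out.string(from: ukDate))   // March 1, 2024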
  • Child safety watchdog accuses Apple of hiding real CSAM figures

    gatorguy said:
    AppleInsider said: Apple abandoned its plans for Child Sexual Abuse Material (CSAM) detection, following allegations that it would ultimately be used for surveillance of all users. The company switched to a set of features it calls Communication Safety, which is what blurs nude photos sent to children.
    So the steps are:

    A. User decides whether or not they want to use iCloud for backup
    B. User decides whether or not they agree to Apple's terms of use for iCloud backup
    C. User decides which apps have files uploaded to iCloud
    D. Per Apple terms of use, files from those apps can be scanned for illegal content

    Those steps wouldn't have changed with on-device scanning AND the files scanned wouldn't have changed with on-device scanning. The "controversy" surrounding that method was always inane. There was no change in level of surveillance at all. The user always had complete control over what would be subject to scanning.

     iCloud?

    From a related article: 

    "Almost all cloud services routinely scan for the digital fingerprints of known CSAM materials in customer uploads, but Apple does not.

    The company cites privacy as the reason for this, and back in 2021 announced plans for a privacy-respecting system for on-device scanning. However, these leaked ahead of time, and fallout over the potential for abuse by repressive governments... led to the company first postponing and then abandoning plans for this.

    As we noted at the time, an attempt to strike a balance between privacy and public responsibility ended up badly backfiring.

    If Apple had instead simply carried out the same routine scanning of uploads used by other companies, there would probably have been little to no fuss. But doing so now would again turn the issue into headline news. The company really is stuck in a no-win situation here."

    https://www.apple.com/legal/internet-services/icloud/

    "You agree that you will NOT use the Service to:

    a. upload, download, post, email, transmit, store, share, import or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy, hateful, racially or ethnically offensive, or otherwise objectionable;"

    ----

    C. Removal of Content

    You acknowledge that Apple is not responsible or liable in any way for any Content provided by others and has no duty to screen such Content. However, Apple reserves the right at all times to determine whether Content is appropriate and in compliance with this Agreement, and may screen, move, refuse, modify and/or remove Content at any time, without prior notice and in its sole discretion, if such Content is found to be in violation of this Agreement or is otherwise objectionable.


    The reality is that Apple's iCloud terms already give them the right to scan files. You can't use the service without agreeing to their terms. As shown above, their terms include the right to screen files for unlawful content. CSAM is unlawful. And note that they don't have to give notice to do it. They can scan for CSAM any time they choose. They don't have to make public statements about changing their position etc. 
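    For what it's worth, the "digital fingerprints" mentioned in the quoted article are just hash lookups against a list of known material. Below is a bare-bones Swift sketch of that matching step; real systems use perceptual hashes (PhotoDNA and the like) that survive resizing and re-encoding rather than the exact SHA-256 used here, and the fingerprint list is made up for illustration:

        import Foundation
        import CryptoKit

        // Hypothetical list of fingerprints (hex digests) of known illegal material.
        // Real services use perceptual hashes, not exact SHA-256 digests.
        let knownFingerprints: Set<String> = [
            "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
        ]

        // Compute a fingerprint for an uploaded file's bytes.
        func fingerprint(of data: Data) -> String {
            SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
        }

        // The matching step: flag an upload only if its fingerprint is already on the list.
        func shouldFlag(_ upload: Data) -> Bool {
            knownFingerprints.contains(fingerprint(of: upload))
        }

        print(shouldFlag(Data("test".utf8)))   // true: "test" happens to hash to the sample entry above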

  • Child safety watchdog accuses Apple of hiding real CSAM figures

    AppleInsider said: Apple abandoned its plans for Child Sexual Abuse Material (CSAM) detection, following allegations that it would ultimately be used for surveillance of all users. The company switched to a set of features it calls Communication Safety, which is what blurs nude photos sent to children.
    So the steps are:

    A. User decides whether or not they want to use iCloud for backup
    B. User decides whether or not they agree to Apple's terms of use for iCloud backup
    C. User decides which apps have files uploaded to iCloud
    D. Per Apple terms of use, files from those apps can be scanned for illegal content

    Those steps wouldn't have changed with on-device scanning AND the files scanned wouldn't have changed with on-device scanning. The "controversy" surrounding that method was always inane. There was no change in level of surveillance at all. The user always had complete control over what would be subject to scanning.
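    To spell the argument out: the set of files that can be scanned is a function of the user's choices in steps A through D, not of where the scan runs. A toy Swift sketch of that point, with every name invented purely for illustration:

        // Toy model: which files end up eligible for scanning.
        // All types and names are illustrative, not Apple's actual implementation.
        struct UserChoices {
            let usesICloudBackup: Bool            // step A
            let acceptedICloudTerms: Bool         // step B
            let appsSyncedToICloud: Set<String>   // step C
        }

        enum ScanLocation { case onDevice, onServer }

        // Step D: files from the synced apps are what can be scanned.
        // Note that the scan location never enters the calculation.
        func filesEligibleForScanning(choices: UserChoices,
                                      allFiles: [(app: String, name: String)],
                                      location: ScanLocation) -> [String] {
            guard choices.usesICloudBackup, choices.acceptedICloudTerms else { return [] }
            return allFiles
                .filter { choices.appsSyncedToICloud.contains($0.app) }
                .map { $0.name }
        }

        let choices = UserChoices(usesICloudBackup: true,
                                  acceptedICloudTerms: true,
                                  appsSyncedToICloud: ["Photos"])
        let files = [(app: "Photos", name: "IMG_0001.jpg"), (app: "Notes", name: "todo.txt")]

        // Same result either way: the user's choices, not the scan location, set the scope.
        print(filesEligibleForScanning(choices: choices, allFiles: files, location: .onDevice))
        print(filesEligibleForScanning(choices: choices, allFiles: files, location: .onServer))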