macplusplus

About

Username
macplusplus
Joined
Visits
293
Last Active
Roles
member
Points
3,141
Badges
1
Posts
2,119
  • Apple agrees to make key App Store changes, create $100M fund to settle developer lawsuit

    Beats said:
    What a stupid decision. Apple should not allow the small guys to twist its arm.

    I would jack up the annual developer price to $299 to make up for all the BS. Heck, Apple should charge $999/year. Nothing is free.
    That would discourage programmers from committing to Apple platforms. Many registered Apple developers are amateurs or learners.

    The alternative to the sales commission is an upfront listing fee. You pay a high listing fee in exchange for a lower commission, or a lower listing fee in exchange for a higher commission. That listing fee + commission model has worked for decades on eBay, Amazon, and many other online venues (see the rough comparison sketched at the end of this post).

    Apple can even host third-party app stores, provided that they pay their rent; obviously, nothing is free. All of these are negotiable business deals that could be concluded without going to court. So why has none of this ever happened or even been discussed? Because their intention is not to do business; their intention is to destroy Apple. This is a small, marginal, vocal community (not specifically those mentioned in the article) managed behind the curtains by some other tech giant.
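    For what it's worth, here is a rough back-of-the-envelope comparison of the two models with made-up numbers (a flat 30% cut versus a hypothetical $999 listing fee plus 10%); none of these figures are Apple's actual terms:

    # Hypothetical comparison of the two marketplace pricing models above:
    # a pure commission versus an upfront listing fee plus a lower commission.
    # All numbers are made up for illustration.

    def pure_commission(gross_sales, rate=0.30):
        """Fee under a commission-only model, e.g. a flat 30% cut."""
        return gross_sales * rate

    def listing_plus_commission(gross_sales, listing_fee=999.0, rate=0.10):
        """Fee under an upfront listing fee plus a reduced commission."""
        return listing_fee + gross_sales * rate

    for sales in (1_000, 10_000, 100_000):
        a = pure_commission(sales)
        b = listing_plus_commission(sales)
        cheaper = "listing + lower commission" if b < a else "pure commission"
        print(f"${sales:>7,} in sales: 30% cut = ${a:,.0f}, "
              f"$999 + 10% = ${b:,.0f} -> cheaper: {cheaper}")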
  • Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'

    jdw said:
    crowley said:
    Apple's software, Apple's services, no breakage.
    LOL.  I can't wait to hear Tim Cook quote your words on his next TV interview about how Apple is defending privacy as a right...

    Interviewer: "Mr. Cook, you have long defended Privacy as a right and outlined in detail the steps Apple has taken to protect that right.  How do you harmonize your current plan to hash-scan for CSAM images on-device prior to their being uploaded to iCloud?"

    Tim Cook: "Apple's software. Apple's services. No privacy broken!"

    LOL.
    Now your phrase
    “hash-scan for CSAM images on-device prior to their being uploaded to iCloud”
    makes me think of how your luggage is X-rayed before being put on a plane.
    They don't scan your luggage in your house before it is put on a plane.
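    To make the luggage analogy concrete, here is a minimal sketch of a "check before it leaves the device" step. To be clear, this is not Apple's actual NeuralHash/PSI protocol, just a plain SHA-256 lookup against a local list, and every name in it is hypothetical:

    import hashlib

    # Deliberately simplified illustration of the "x-ray before boarding" idea:
    # the photo is checked against a local list before it is sent, but it is
    # still uploaded either way. Real systems use perceptual hashing and
    # blinded matching rather than a plain SHA-256 lookup like this.

    def image_digest(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def upload(path):
        """Hypothetical uploader stand-in; a real client would send to iCloud."""
        print(f"{path}: uploaded")

    def check_then_upload(path, local_list):
        """Check the photo on-device, then upload it; return whether it matched."""
        matched = image_digest(path) in local_list
        if matched:
            print(f"{path}: matched the local list, would be flagged for review")
        upload(path)
        return matched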
  • San Francisco doctor charged with possessing child pornography in iCloud

    Looks like they caught this jerk without installing spyware on his phone.
    That’s old news. All the commercial cloud services have hash checking server-side, including Apple:

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    …what they’re working on now is making the check happen on your phone prior to upload, so that your iCloud stuff is fully E2E, which it isn’t currently. The butthurts don’t comprehend this.

    This is not the E2E encryption you were waiting for, i.e. the one that would fully encrypt "your iCloud stuff". Their technical document mentions some "device-generated keys", but that pertains only to iCloud Photos, since it is tied to CSAM matching.

    Besides, they have never openly and clearly declared that they are bringing E2E encryption to iCloud Photos; correct me if I missed something.
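    For contrast, the server-side variant described in that Sophos article would roughly look like the sketch below: the service scans a photo only after it has been uploaded in a form the server can read. This is purely illustrative, not any vendor's real API, and the hash set is a hypothetical placeholder:

    import hashlib

    KNOWN_BAD_DIGESTS = set()  # would be populated from a hash database

    def scan_uploaded_object(blob):
        """Check an already-uploaded, server-readable object against the set."""
        return hashlib.sha256(blob).hexdigest() in KNOWN_BAD_DIGESTS

    # Under true end-to-end encryption the server would only ever hold
    # ciphertext, so a check like this could not run there at all; that is why
    # on-device matching is pitched as the path toward E2E-encrypted photo
    # storage.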
  • Researchers who built rudimentary CSAM system say Apple's is a danger

    crowley said:

    There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.
    The threshold is 30 pictures, according to Federighi. 30 pictures can be sent in a flash through WhatsApp. If WhatsApp's auto-save media option is active, then by the time the user wakes up and deletes them one by one, the 30 pictures may already have been flagged as CSAM and uploaded to iCloud.
    You think someone hates you so much that they will find and distribute 30 counts of child pornography that they somehow know is on the NCMEC list to you in the hope that you have WhatsApp automatically save images to Photos and you have iCloud Photos switched on?

    All for the massive humiliation for the police to maybe turn up, at which point you explain that you received messages unsolicited from the phone number of the person who hates you?  And if you're smart you'll have reported in advance anyway.

    Be careful you don't strain yourself with those contortions.
    Well, Sherlock, be careful not to report your very kind self while trying to report unsolicited CSAM messages. Reporting doesn't save your ass, because you already possess them: you kept them as evidence. Otherwise, what would you report? You save those photos as "evidence" and expect the cops to buy that! Your only bet is to un-possess them, i.e. delete them ASAP.
    Your doom scenarios are as stupid as the “Hey, you!” FaceID thefts and the TouchID finger-chopping-off muggings, neither of which ever happened. You guys need to write fiction. 

    Apple, Google, Microsoft, and Dropbox already do server-side CSAM scanning, so you’d already get flagged today if your phone was somehow duped into sending child pornography to your iCloud Photos. 

    You’re inventing self-victim fantasy. 

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    https://www.microsoft.com/en-us/photodna

    https://protectingchildren.google/intl/en/

    Apple has never admitted in its recent communications that it has CSAM-scanned photos on iCloud servers; instead, it has consistently denied doing so. Your only reference for server-side scanning is that Sophos article, which, in light of Apple's latest statements, should be considered at the very least outdated.

    "The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers which we’ve never done for iCloud Photos."

    "This is an area we’ve been looking at for some time, including current state of the art techniques which mostly involves scanning through entire contents of users’ libraries on cloud services that — as you point out — isn’t something that we’ve ever done; to look through users’ iCloud Photos."

    https://techcrunch.com/2021/08/10/interview-apples-head-of-privacy-details-child-abuse-detection-and-messages-safety-features/

    Besides, many people don't have a coherent view of the operating system; they use their phones by trial and error. They don't know what resides where, and many don't even know what the cloud is. Forgetting that WhatsApp auto-saves photos to the photo library is not a doom scenario, it is a very common usage scenario. Go clean up your Downloads folder before fighting against "conspiracy theories".
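    To put the 30-picture threshold quoted above into code, here is a minimal sketch. It is only an illustration: the real design uses cryptographic safety vouchers, not a plaintext counter, and 30 is the approximate figure Federighi gave.

    MATCH_THRESHOLD = 30  # approximate figure from Federighi's interview

    def review_triggered(matched_photos, threshold=MATCH_THRESHOLD):
        """True once the count of matched, uploaded photos reaches the threshold."""
        return matched_photos >= threshold

    # The WhatsApp auto-save scenario: 30 unsolicited images saved to the photo
    # library and synced to iCloud Photos would reach the threshold even though
    # the account owner never went looking for them.
    print(review_triggered(29))  # False
    print(review_triggered(30))  # True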