macplusplus
About
- Username: macplusplus
- Joined
- Visits: 293
- Last Active
- Roles: member
- Points: 3,141
- Badges: 1
- Posts: 2,119
Reactions
Apple agrees to make key App Store changes, create $100M fund to settle developer lawsuit
Beats said: "What a stupid decision. Apple should not allow small guys to twist their arms."
I would jack up the annual developer price to $299 to make up for all the BS. Heck, Apple should charge $999/year. Nothing is free.
The alternative to the sales commission is an upfront listing fee: you pay a high listing fee in exchange for a lower commission, or a lower listing fee in exchange for a higher commission. That listing-fee-plus-commission model has worked for decades on eBay, Amazon, and many other online venues.
Apple could even host third-party app stores, provided that they pay their rent; obviously, nothing is free. All of these are negotiable business deals that could be concluded without going to court. So why has that never happened, or even been discussed? Because their intention is not to do business; their intention is to destroy Apple. This is a small, marginal, vocal community (not specifically those mentioned in the article) managed behind the curtains by some other tech giant. -
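The listing-fee-versus-commission tradeoff described above can be sketched with a few lines of arithmetic. All numbers here (the 30% and 10% rates, the $999 fee) are illustrative assumptions, not actual Apple or eBay terms:

```python
# Illustrative comparison of the two marketplace pricing models.
# Rates and fee amounts are assumptions for the sake of the example.

def commission_model(revenue: float, rate: float = 0.30) -> float:
    """High commission, no upfront listing fee."""
    return revenue * rate

def listing_fee_model(revenue: float, fee: float = 999.0, rate: float = 0.10) -> float:
    """Upfront listing fee in exchange for a lower commission."""
    return fee + revenue * rate

# Break-even revenue: fee + r_low * x = r_high * x  =>  x = fee / (r_high - r_low)
break_even = 999.0 / (0.30 - 0.10)
print(break_even)  # 4995.0 — above this annual revenue, the listing-fee model costs less
```

Below the break-even point the pure-commission model is cheaper for the developer; above it, the listing-fee model wins, which is why such hybrid schemes can suit both small and large sellers.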
Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'
dangermouse2 said: jdw said: crowley said: "Apple's software, Apple's services, no breakage."
Interviewer: "Mr. Cook, you have long defended privacy as a right and outlined in detail the steps Apple has taken to protect that right. How do you reconcile that right with your current plan to hash-scan for CSAM images on-device prior to their being uploaded to iCloud?"
Tim Cook: "Apple's software. Apple's services. No privacy broken!"
LOL.
“hash-scan for CSAM images on-device prior to their being uploaded to iCloud”
makes me think of how your luggage is x-rayed before being put on a plane. -
San Francisco doctor charged with possessing child pornography in iCloud
StrangeDays said: baconstang said: "Looks like they caught this jerk without installing spyware on his phone." https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/
…what they’re working on now is making the check happen on your phone prior to upload, so that your iCloud stuff is fully E2E, which it isn’t currently. The butthurts don’t comprehend this.
Besides, they have never openly and clearly declared that they are bringing E2E encryption to iCloud Photos; correct me if I missed something. -
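The "check on your phone prior to upload" flow discussed in these threads can be sketched very loosely in Python. This is a deliberate simplification: the real system uses perceptual hashing (NeuralHash) and blinded matching so the device never sees the raw hash database, whereas this sketch uses plain SHA-256 against a visible set. All names and values are illustrative:

```python
# Simplified sketch of a client-side hash check before upload.
# A real system would use perceptual hashes and a blinded database,
# not exact SHA-256 digests in a plain set.
import hashlib

# Placeholder blocklist of known-bad digests (illustrative only).
KNOWN_HASHES = {hashlib.sha256(b"known-bad-example").hexdigest()}

def should_block_upload(photo_bytes: bytes, known_hashes=KNOWN_HASHES) -> bool:
    """Return True if the photo's digest matches the on-device blocklist."""
    return hashlib.sha256(photo_bytes).hexdigest() in known_hashes
```

The x-ray analogy above maps onto this directly: the check happens locally, before anything leaves the device, rather than on the server after upload.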
Researchers who built rudimentary CSAM system say Apple's is a danger
StrangeDays said: macplusplus said: crowley said: macplusplus said: AppleInsider said:
There are also protections against a bad actor sending CSAM to an innocent person. The Apple system only detects collections of CSAM in iCloud. Unless a user saves CSAM to iCloud themselves, or their Apple account is hacked by a sophisticated threat actor, then there's little chance of such a scam working out.
All for the massive humiliation for the police to maybe turn up, at which point you explain that you received messages unsolicited from the phone number of the person who hates you? And if you're smart you'll have reported in advance anyway.
Be careful you don't strain yourself with those contortions.
Apple, Google, Microsoft, and Dropbox already do server-side CSAM scanning, so you’d already get flagged today if your phone was somehow duped into sending child pornography to your iCloud Photos.
You’re inventing self-victim fantasy.
"The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers which we’ve never done for iCloud Photos."
"This is an area we’ve been looking at for some time, including current state of the art techniques which mostly involves scanning through entire contents of users’ libraries on cloud services that — as you point out — isn’t something that we’ve ever done; to look through users’ iCloud Photos."
https://techcrunch.com/2021/08/10/interview-apples-head-of-privacy-details-child-abuse-detection-and-messages-safety-features/
Besides, many people don't have a coherent view of the operating system and use their phones by trial and error. They don't know what resides where; many don't even know what the cloud is. Forgetting that WhatsApp auto-saves photos to the photo library is not a doomsday scenario, it is a very common usage scenario. Go clean up your Downloads folder before fighting against "conspiracy theories". -
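The "only detects collections" point quoted in this thread rests on a match threshold: an account is only surfaced for review once the number of matches crosses a limit (Apple's published threat model reportedly set this around 30). The real design uses cryptographic safety vouchers and threshold secret sharing; this simple counter only illustrates the thresholding concept:

```python
# Illustrative threshold check: single stray matches do nothing;
# only a collection of matches crosses the reporting threshold.
THRESHOLD = 30  # reportedly Apple's initial published threshold

def account_flagged(match_results, threshold: int = THRESHOLD) -> bool:
    """Surface an account for review only once matches reach the threshold."""
    return sum(bool(m) for m in match_results) >= threshold
```

Under this scheme, a handful of unsolicited images auto-saved by a messaging app would stay well below the threshold, which is the protection against the "bad actor sends CSAM to an innocent person" scenario discussed above.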