zimmie
About
- Username: zimmie
- Visits: 172
- Roles: member
- Points: 2,737
- Badges: 1
- Posts: 651
Reactions
German government wants Tim Cook to reconsider CSAM plans
dantheman827 said:
genovelle said:
Since the images are not scanned, but specific data from hash markers is used to identify known child pornography images from a database, it is no different from a file with a known virus being detected and handled. There is a reason this guy is coming out, as many others have. They have these files themselves and are fearful of being caught.

Or, I don't know, maybe people actually value their privacy and Apple just messed up big time?
It's not an objection to the scanning; it's an objection to the scanning being done on your device without you having the option to disable it.
Client-side CSAM detection is incontrovertibly better for privacy than server-side CSAM detection, which Apple currently does. To do the scanning on the server side, the server (and by extension, Apple) has to be able to see the photos you send to them. With client-side scanning, your photos can be encrypted before Apple ever gets them, so Apple can't ever see them. There have been several known incidents where employees of photo sync/sharing sites have saved customers' private images and shared them with other employees without the consent of the people in the photos. NSA employees are known to have done the same with photos caught in their surveillance dragnets. Client-side CSAM scanning and sending only encrypted images to Apple is specifically meant to address that type of issue.
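Roughly, the client-side flow looks like the sketch below. To be clear, this is a deliberately simplified illustration, not Apple's implementation: the real design uses NeuralHash, private set intersection, and cryptographic safety vouchers, while the SHA-256 match, the KNOWN_BAD_HASHES table, and the Fernet cipher here are stand-ins chosen for brevity.

```python
# Toy sketch of "match on device, encrypt before upload". Illustrative only:
# the real design uses NeuralHash + private set intersection + safety
# vouchers, not SHA-256 or Fernet. KNOWN_BAD_HASHES is a hypothetical table.
import hashlib
from cryptography.fernet import Fernet

KNOWN_BAD_HASHES: set[str] = set()  # hypothetical on-device database of known-image hashes

def prepare_upload(photo: bytes, device_key: bytes) -> tuple[bytes, bool]:
    # Match happens on the device, against hashes, not against the photo library.
    matched = hashlib.sha256(photo).hexdigest() in KNOWN_BAD_HASHES
    # In the real protocol the match bit is hidden inside a safety voucher and
    # the server learns nothing until a threshold of matches exists. Either
    # way, only ciphertext leaves the device, so server-side staff can't
    # browse the photo itself.
    return Fernet(device_key).encrypt(photo), matched

key = Fernet.generate_key()
ciphertext, matched = prepare_upload(b"raw image bytes", key)
```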
Whether the scanning should happen at all is definitely worth debating. The legal teams of every major photo-sharing site clearly believe US law currently requires it. Dropbox, Facebook, Flickr, Google, Instagram, OneDrive, and more all do the same scanning server-side that Apple does today.
EU to propose common charger for all smartphones, ignores Apple's protest
GeorgeBMac said:
So what advantages does Lightning offer over USB-C / Thunderbolt? Likewise, what advantages does USB-C / Thunderbolt offer over Lightning? At one point Lightning was clearly superior to USB(-A), but I suspect that the answers today will show Apple has been dragging its feet and falling behind. The question is: "Why?" One possible answer is that Lightning gives Apple greater control over the iPhone -- you can only do those things Apple says you can do -- much like its control over apps. One can argue that Apple should have no control, but that comes with collateral damage.
Ignoring install base, the main advantage of USB-C for phones is that Apple doesn't own it, so a lot of other companies have been using it for new designs for a while.
Standardization on one type of connector for phones from many manufacturers means consumers can switch platforms more easily, and manufacturers can stop including even the cables. Of course, this prevents further progress, as nobody is allowed to make and use a different connector. Depending on the actual wording of the proposed legislation, it could mandate USB Micro-B connectors on smart watches (many of which function as phones), which would be a huge step backwards for them.
WhatsApp latest to pile on Apple over Child Safety tools
crowley said:
AppleInsider said:
"Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy."
I'm still digesting the technical analysis papers on the CSAM detection. This application of Threshold Secret Sharing is fascinating. I've only ever used it for "You need at least four managers to agree to get access to the certificate authority" kind of stuff.
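For anyone who hasn't run into it, the classic construction behind that "four managers" setup is Shamir's scheme: the secret is the constant term of a random polynomial, any k shares reconstruct it by Lagrange interpolation, and k-1 shares reveal nothing. A minimal sketch follows; the prime, threshold, and secret are arbitrary illustrative values, not anything from Apple's papers.

```python
# Minimal Shamir threshold secret sharing: the "k of n managers" primitive.
# The prime, threshold, and secret below are arbitrary illustrative values.
import random

PRIME = 2**127 - 1  # prime modulus; all polynomial arithmetic happens mod this

def make_shares(secret: int, threshold: int, count: int):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, count + 1)]

def recover(shares) -> int:
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=4, count=5)
assert recover(shares[:4]) == 123456789  # any four of the five shares suffice
assert recover(shares[1:]) == 123456789  # a different four works too
# recover(shares[:3]) yields an unrelated value: three shares learn nothing
```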
Apple reportedly plans to make iOS detect child abuse photos
CloudTalkin said:
crowley said:
"This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"
1. It's not in the hands of Apple. It's not in the hands of anyone. It's a, thus far, unsubstantiated rumor from a security researcher.
2. If it comes to fruition that Apple does enable the AI feature, wouldn't they be bound by law to report the info to authorities (idk, ianal)? If the offending data is stored in iCloud, it would also be subject to worldwide government data requests -- requests that Apple has honored roughly 80% of the time on average.
3. Keeping in mind this is only a claim by a researcher, and not Apple, the question would then have to be asked: what constitutes child pornography to the AI? Is it reviewed by a human for higher-level verification? If so, by an Apple employee or a third party (like the original voice recordings)? What triggers reporting to authorities, and who bears responsibility for errors?
A parent sending pics of the kids in a bubble bath to grandparents. A photo of a young-looking 18-year-old topless at a nude beach. Scouts shirtless around a campfire.
Would any one of those trigger the AI? What if all three were on the same phone? It's entirely possible and not far-fetched.
I can't stress enough this isn't Apple going after child abusers. This is a researcher making a claim. But if Apple were going to do so, it would most definitely affect that "government access, authoritarian or otherwise" query made by the researcher, in myriad ways not even addressed in my comment.
2,3. The system Dr. Green talked about isn't AI in any meaningful sense. It's fuzzy hashing. It matches only images which are close to existing known images. The concern is that fuzzy hashing is, by its very nature, imprecise. While normal hashes match an exact chunk of data, fuzzy hashes are more likely to match "similar" data, and our idea of similar may not be the hash's idea.
These systems are normally used forensically, after someone is already suspected of possession of CSAM. The system directs investigators to specific files, then the investigators confirm. We don't have good studies of false positive rates, because when the systems are used, it's typically on drives which contain thousands of CSAM images. And the drives people use to store CSAM tend to contain little else, limiting false positives. What's a false positive here or there, when you confirm a hundred of the images are CSAM?
When a false positive can basically destroy your ability to live in modern society, we really need a solid understanding of how likely they are.
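To make the exact-versus-fuzzy distinction concrete, here's a toy comparison. The "average hash" below is a textbook perceptual hash over a fake 8x8 grayscale grid; it is not PhotoDNA or NeuralHash, and every pixel value is made up.

```python
# Toy illustration of why fuzzy hashes behave differently from exact hashes.
# Simple average hash over an 8x8 grayscale grid; not any vendor's algorithm.
import hashlib

def exact_hash(pixels):
    # Exact hashing: any single changed byte produces a completely different digest.
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    # Fuzzy hashing: one bit per pixel, set if the pixel is brighter than average.
    # Small edits move few bits, so "similar" images land near each other.
    avg = sum(pixels) / len(pixels)
    return sum((p > avg) << i for i, p in enumerate(pixels))

def hamming(a, b):
    # Number of differing bits between two fuzzy hashes.
    return bin(a ^ b).count("1")

original = list(range(64))        # stand-in for an 8x8 grayscale image
tweaked = original.copy()
tweaked[0] += 40                  # lighten one "pixel"

print(exact_hash(original) == exact_hash(tweaked))             # False: exact match broken
print(hamming(average_hash(original), average_hash(tweaked)))  # 2: fuzzy match still close
```

Flipping one "pixel" completely changes the SHA-256 digest but moves the fuzzy hash only a couple of bits. That is exactly the property which makes near-duplicate detection possible, and which makes rigorous false-positive analysis hard.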
As for the authoritarian government angle, if such a capability were built, China would definitely demand it be used to report people in China who have a copy of Tank Man.
Apple Silicon transition may hit its two-year target with 2022 Mac Pro
tenthousandthings said:
zimmie said:
crowley said:
michelb76 said:
imagladry said:
mike54 said:
After all the unlimited praise youtubers, tech sites, Apple fanbois, etc. gave the M1, I just hope Apple is not taking advantage of this praise, milking as much revenue as they can from it and thereby delaying advancement. Apple does have a bad habit of releasing something great and then sitting on it past its use-by date.
If you look at the packaging, they're almost identical there, too. The big reason for the M1's limited memory performance and capacity is that it only has space for two RAM packages.
Thus, the M1 can reasonably be called an A14X.

I'll guess it's likely they'll follow the familiar A-series pattern with the M series, so there will be an M1X soon and then an M2 and M2X next year; but that leaves out the Mac Pro. So if there is going to be a third, highest-end iteration, whether "Z" or something else, it kind of seems like it might need to wait for the M2 architecture.