exceptionhandler
About
- Username: exceptionhandler
- Joined
- Visits: 631
- Last Active
- Roles: member
- Points: 268
- Badges: 0
- Posts: 354
Reactions
-
Compared: New Apple TV 4K versus 2021 Apple TV 4K
"We recommend that prospective buyers of the Apple TV 4K should buy the Wi-Fi + Ethernet model."
I’d also recommend the one with the Ethernet port for an additional reason. I had a 3rd gen Apple TV that lost its Wi-Fi capability and was out of its support period, but it still served us for years because I just used an Ethernet cable to connect it to the network.
-
These are the most popular emoji characters of 2021
-
Two years after Apple Silicon, Intel still wants Apple to buy chips
-
iOS 15 adoption rate appears to be lagging behind past updates
StrangeDays said: Incorrect. Sorry, but you're speaking from ignorance here. Deploying 100% E2E is exactly why Apple designed on-device CSAM scanning.
Point is, even if some criminals were using it (probably not, unless they are dumb, because of server-side CSAM scanning), they can encrypt their stash using something like TrueCrypt (and even use the plausible-deniability mode that double-encrypts it) and store it in a general file-sharing/storage service like Dropbox or Google Drive. Or heck, even just keep it local and sync it to their own devices. They’ll find ways to see and propagate it.
I’m not saying Apple shouldn’t do CSAM detection… they should, but keep it on the server side. It’s encrypted in transit (HTTPS) and is probably encrypted at rest, so only the few at Apple who hold the keys may have access to it. How does that change if they move to on-device scanning and reporting? As I’ve said before: Apple has used a threshold algorithm to enable review of the contents… remove that, and you will have true E2E encryption; otherwise it’s just “wish it were E2E”. It’s close, but no cigar.
Once a threshold is met, what happens then? Do we trust Apple to handle the data correctly? Do we trust the authorities to handle the data correctly? Do we trust individuals at these places not to share or leak it? Do we trust that the thresholds won’t change? Do we trust that other types of images won’t be deemed “illicit” in the future? Do we trust that a similar threshold algorithm won’t be applied to text in iMessage, looking for messages of terrorism or “hate” speech? Do we trust that it will only be done for photos uploaded to iCloud? I see no difference between:
- Someone having limited access outside the intended audience to the data with on-device scanning and reporting…
- Someone having limited access outside the intended audience to the data with server-side scanning and reporting…
The only difference is where it’s done, and on-device scanning has more serious privacy problems than server-side.
I treat anything I send to iCloud Photos as public knowledge, aka anything I want to keep private stays on my devices. On-device scanning and reporting encroaches on that privacy (even given the current software “limitations” of thresholds and only scanning what is sent to iCloud)… so, given that, I would much rather CSAM reporting stay server-side.
¯\_(ツ)_/¯
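The threshold mechanism the post describes can be sketched with plain Shamir secret sharing. This is a hypothetical illustration only, not Apple's actual protocol (which also involves NeuralHash matching and private set intersection): the idea is that each hash match releases one share of a "review key", and only once the number of matches reaches the threshold t can the key, and thus the flagged content, be reconstructed by anyone.

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is over the field GF(P)

def make_shares(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t of them can reconstruct it.

    The secret is the constant term of a random degree t-1 polynomial;
    each share is one point (x, f(x)) on that polynomial.
    """
    coeffs = [secret] + [random.randrange(1, P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret from >= t shares."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

With a threshold of, say, 3, holding one or two shares reveals nothing about the key; once a third match releases a third share, `reconstruct` yields it exactly. That is the property the post is pointing at: the threshold is precisely the point where the content stops being end-to-end private.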
-
New iOS 15.2 beta includes Messages feature that detects nudity sent to kids
I’m not against scanning, as the software necessarily needs to be able to read and write images (it wouldn’t be useful if it didn’t). Scanning is just a subset of reading images: it makes it easier for me to find images of particular things.
But reporting the results of a scan to a 3rd party without my consent is what I take issue with. As long as the data remains on device, scanning is a useful time-saving tool.
Also, being a concerned parent, I probably won’t be handing my kids a smartphone or tablet that I don’t have access to, so this feature is not really useful to me. They have an iPad, but I have it locked down so the password can’t be changed (and technically, at this point, they don’t know the password, so they have to ask to use it).
I guess it could be useful if an unknown number/account sends illicit images, to have them blurred out just in case.
When they are old enough to buy a phone and pay the monthly charges themselves, they can do what they want.
-
iOS 15 adoption rate appears to be lagging behind past updates
StrangeDays said: exceptionhandler said:
It may “technically” be E2E, as long as you disregard the watchdog at your front door that is not yours. Even then, it will not be true E2E… remove the thresholds and the ability to report to someone outside the sender’s intended parties, and then it will be true E2E.
StrangeDays said: elijahg said: crowley said: neverindoubt said: “It's likely that the reduced adoption rates are because of a change in Apple's update stance. The company is no longer forcing users to install the latest operating system version to gain important security updates.”
99.9% of the reason.
“Apple's set of iOS 15 features -- including the controversial-but-delayed CSAM detection system -- may also be playing a part.”
0.1% of the reason.
If I had child rape photos and didn’t want them reported to the police, I’d disable iCloud Photos.
I’m much more interested in the E2E encryption I’ll get when this eventually rolls out. That offers me value. Worrying about child rape porn? Not a concern.
The only person who is unhappy in this scenario is somebody with child rape photos on their device. The greater good is clearly on the side of all the customers in the world who will enjoy E2E privacy on all their data, vs the child rape collectors unhappy that they will still get in trouble. Oh well.
Currently, CSAM photo scanning takes place on the servers of Apple, Google, Microsoft, etc. If some nefarious government wants to make them look for Winnie the Pooh images, they can do that more easily server-side, and could force them to do it today. Nothing about on-device scanning makes it any easier; arguably it makes it more difficult.
-
EU will require iMessage, WhatsApp to communicate with smaller messaging services
All-Purpose Guru said: So does the EU have ANYBODY in their legislature who has any idea how any of this stuff could be accomplished? The only way I know of that would allow interoperability between different encrypted platforms would take a can opener to the encryption and would compromise security.
But that may not be what they are talking about here; in that case, you’d be correct, unless each separate service understood the same encryption protocol and keys.
-
New folding iPad and refreshed iPad mini 7 now in 2024, says Kuo
radarthekat said: Why does the world need a folding iPad? It's more complex to design and manufacture, with more parts and extra cabling to connect them across the two halves of the device.
More likely to break or malfunction.
Thick when folded.
Potentially reduced internal space for the battery versus a non-folding iPad with the same screen dimensions. Not really more convenient to carry.
I really don’t see the use case here for a folding screen. I think this is a case of:
“Yeah, but your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.” - Dr. Ian Malcolm
-
Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...
jdw said: I was against the proposed Apple CSAM features because, even though the risk was low, there was still the chance it could basically "call the cops" on someone, even by accident. However, I am not averse to the blurring of nudity mentioned in the article, because it doesn't secretly phone home or otherwise call the cops on the user. Some naughty people who send those kinds of photos won't be happy, but since I don't do that, it's obviously a non-issue for me. And while I don't engage in other illicit activities regarding kiddie porn either, the fact that CSAM scanning could potentially alert authorities to target a given iPhone user keeps me dead set against it.
So go ahead and blur naughty photos, Apple! Just keep the cops and prosecution out of it. That seems to be the case here, so it doesn't bother me that this could be the precursor to something else. When that something else arrives, we consumers can evaluate it at that time. For now, no objections from me.
This nudity detection feature, if, as stated, it all happens on device with no reporting to any party outside the original participants, does not violate privacy and is a tool parents can use to protect their kids (when the kids can afford their own phone/plan, they can decide for themselves whether to turn it on or off, imo).
I don’t have an issue with scanning; what I take issue with is what is done with the results of a scan (whether it is reported or not).
-
Amazon slashing 9,000 more jobs in fresh round of layoffs
lkrupp said: So millennials are finding out how the world really works: that no job is secure, and that all this talk about how companies value employees is bullshit. You didn’t want to come back to work at the office but wanted to remain cozy working at home with your pet cat on your lap. It’s the BOTTOM LINE, baby, the bottom line. Get used to it. You’ll be switching jobs every few years for the rest of your lives with no security, no perks, no free lunches and lattes. Think joining a union will make it all better? Hardy har har. The SCOTUS is likely to shitcan your hopes of getting your student loan debt laid on the backs of taxpayers. $400 billion? Think again.
The world will continue to need electricians, plumbers, brick masons, welders, carpenters, and big equipment operators for the foreseeable future; not so much programmers, data entry workers, marketing types, or even certain engineering fields, as AI will see to that. It’ll be a while until Boston Dynamics comes up with a robot that can wire and plumb a new home.
End rant from a 73 year old curmudgeon.
As @StrangeDays has said, engineers don’t have anything to worry about (at least the good ones). ChatGPT can only generate things based on prior art, not write its own code from scratch. While ML has vastly improved, it’s still in the category of 4GL languages for programming in my opinion: I don’t trust them to always do the right thing. It’s a great tool, good for finding insights or solutions, but a tool doesn’t replace the programmer. That is not to say it’s useless. It could allow us engineers to be more productive by allowing us to focus on other, more important things, or provide us with insights we may not have found in a timely manner. As software engineers, we kinda strive to automate things so that human errors don’t happen, make things quicker, or have the computer do something we find mundane to free us up to do more interesting tasks. ML is no different and really the next step.