Compared: New Apple TV 4K versus 2021 Apple TV 4K

We recommend that prospective buyers of the Apple TV 4K buy the Wi-Fi + Ethernet model.
I’d also recommend the one with the Ethernet port for an additional reason. I had a 3rd gen Apple TV that lost its Wi-Fi capability and was out of its support period, but it still served us for years because I just used an Ethernet cable to connect it to the network.
EU will require iMessage, WhatsApp to communicate with smaller messaging services

All-Purpose Guru said:
So does the EU have ANYBODY in their legislature who has any idea how any of this stuff could be accomplished? The only way I know of to allow interoperability between different encrypted platforms would take a can opener to the encryption and compromise security.
But that may not be what they’re talking about here; in that case you’d be correct, unless each separate service used the same encryption protocol and keys.
iOS 15 adoption rate appears to be lagging behind past updates

Incorrect. Sorry, but you’re speaking from ignorance here. Deploying 100% E2E encryption is exactly why Apple designed on-device CSAM scanning.
The point is, even if some criminal were using it (probably not, unless they’re dumb, because of server-side CSAM scanning), they can encrypt their stash with something like TrueCrypt (and even use the plausible-deniability mode that double-encrypts it) and store it in a general file-sharing/storage service like Dropbox or Google Drive. Or heck, just keep it local and sync it to their own devices. They’ll find ways to view and propagate it.
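The point above is that client-side encryption before upload defeats any server-side scan: the provider only ever sees ciphertext. A minimal sketch of the idea, using a toy hash-based stream cipher for illustration only (a real tool like TrueCrypt uses a vetted cipher such as AES; nothing here is real crypto):

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key || nonce || counter.
    # Toy construction for illustration only; use a vetted cipher in practice.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

key = os.urandom(32)
secret = b"file bytes the storage provider never sees in the clear"
blob = encrypt(key, secret)           # this is what gets uploaded
assert decrypt(key, blob) == secret   # only the key holder can recover it
```

Dropbox or Google Drive would receive only `blob`; with no access to `key`, their server-side scanning has nothing meaningful to match against.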
I’m not saying Apple shouldn’t do CSAM scanning; they should, but keep it server side. It’s encrypted in transit (HTTPS) and probably encrypted at rest, so only the few at Apple who hold the keys may have access to it. How does that change if they move to on-device scanning and reporting?

As I’ve said before: Apple has used a threshold algorithm to enable review of the contents. Remove that, and you will have true E2E encryption; otherwise it’s just “wish it were E2E”. It’s close, but no cigar.
Once a threshold is met, what happens then? Do we trust Apple to handle the data correctly? Do we trust the authorities to handle the data correctly? Do we trust individuals at these places not to share or leak it? Do we trust that the thresholds won’t change? Do we trust that other types of images won’t be deemed “illicit” in the future? Do we trust that a similar threshold algorithm won’t be applied to text in iMessage, looking for messages of terrorism or “hate” speech? Do we trust that it will only be done for photos uploaded to iCloud?
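The threshold mechanism being discussed can be sketched with Shamir secret sharing: each matching image releases one share of a decryption key, and only once a threshold number of shares exists can the key be reconstructed and the content reviewed. This is a simplified illustration of the general technique, not Apple’s actual construction; the numbers are made up:

```python
import random

PRIME = 2**127 - 1  # prime field for the arithmetic

def make_poly(secret: int, threshold: int):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation
            acc = (acc * x + c) % PRIME
        return acc
    return f

def reconstruct(shares) -> int:
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

THRESHOLD = 3                      # illustrative; the real threshold is undisclosed
inner_key = 123456789              # stands in for the per-account review key
f = make_poly(inner_key, THRESHOLD)
shares = [(x, f(x)) for x in range(1, 6)]  # one share released per matching image

assert reconstruct(shares[:THRESHOLD]) == inner_key      # at threshold: key recovered
assert reconstruct(shares[:THRESHOLD - 1]) != inner_key  # below threshold: key stays hidden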
I see no difference:
- Someone outside the intended audience gaining limited access to the data with on-device scanning and reporting…
- Someone outside the intended audience gaining limited access to the data with server-side scanning and reporting…
The only difference here is where it’s done, and on-device scanning has more serious privacy problems than server-side.
I treat anything I send to iCloud Photos as public knowledge; anything I want to keep private stays on my devices. On-device scanning and reporting encroaches on that privacy (even given the current software “limitations” of thresholds and scanning only on upload to iCloud), so I would much rather CSAM reporting stay server side.
iOS 15 adoption rate appears to be lagging behind past updates

exceptionhandler said:
It may “technically” be E2E, as long as you disregard the watchdog at your front door that is not yours. Even then, it will not be true E2E; remove the thresholds and the ability to report to someone outside the sender’s intended parties, and then it will be true E2E.

elijahg said:
crowley said:
neverindoubt said:
“It's likely that the reduced adoption rates are because of a change in Apple's update stance. The company is no longer forcing users to install the latest operating system version to gain important security updates.”
99.9% of the reason.
“Apple's set of iOS 15 features -- including the controversial-but-delayed CSAM detection system -- may also be playing a part.”
0.1% of the reason.
If I had child rape photos and didn’t want Apple to report them to the police, I’d disable iCloud Photos.
I’m much more interested in the E2E encryption I’ll get when this eventually rolls out. That offers me value. Worrying about child rape porn? Not a concern.
The only person who is unhappy in this scenario is somebody with child rape photos on their device. The greater good is clearly on the side of all the customers in the world who will enjoy E2E privacy on all their data, vs the child rape collectors unhappy that they will still get in trouble. Oh well.
Currently, CSAM photo scanning takes place on the servers of Apple, Google, Microsoft, etc. If some nefarious government wants to make them look for Winnie the Pooh images, that’s easier to do server-side, and they could force it today. Nothing about on-device scanning makes it any easier; arguably it makes it more difficult.
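Server-side scanning is conceptually just comparing a digest of each upload against a provider-held database. A simplified sketch, using exact SHA-256 hashes as a stand-in for the perceptual hashes (PhotoDNA, NeuralHash) real systems use; all byte strings and names here are made up:

```python
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical match database the provider maintains entirely server-side.
blocklist = {digest(b"known-bad-image-bytes")}

def scan_upload(data: bytes) -> bool:
    # True means the upload matches the database and gets flagged.
    return digest(data) in blocklist

assert scan_upload(b"known-bad-image-bytes") is True
assert scan_upload(b"vacation-photo-bytes") is False

# The point about coercion: a government leaning on the provider could expand
# the database with one server-side line, invisible to users.
blocklist.add(digest(b"winnie-the-pooh-meme-bytes"))
assert scan_upload(b"winnie-the-pooh-meme-bytes") is True
```

Since the database lives on the server, expanding it requires no client update and leaves nothing for users to inspect, which is the asymmetry the comment is pointing at: an on-device database shipped in a signed OS build is at least auditable.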