exceptionhandler
About
- Username
- exceptionhandler
- Joined
- Visits
- 856
- Last Active
- Roles
- member
- Points
- 325
- Badges
- 0
- Posts
- 369
Reactions
EU accuses Apple of breaking antitrust laws with Apple Pay
This whole thing makes me think of ‘The Little Red Hen’
https://en.m.wikipedia.org/wiki/The_Little_Red_Hen
with Apple as the hen and the EU as the other farm animals, except here they are trying to force Apple to share. -
Surfshark, TurboVPN and more are secretly undermining security
zimmie said: As I keep saying, there's nothing private about most of these "VPN" services. They are proxies which use VPN technologies for the client-to-proxy leg of the connection.
With most, you are exchanging snooping from your telco for snooping from Belarusian companies and telcos. Not exactly an upgrade in privacy. -
Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...
command_f said: exceptionhandler said: jdw said: <Snip>.
I also don't understand what encryption you think is being "circumvented". The data on your phone is encrypted but, within the phone, the encryption keys are available and the data unlocked: if it wasn't, how would Photos be able to display your photos to you?

You are correct that the data is encrypted before being stored on the phone, and it has to be decrypted to be displayed on your screen. This is different from the encryption used to send something via Messages, for instance, and also different from the encryption used to send things via iCloud Photos. When sending images via iCloud Photos, the image has to be decrypted from storage before it is sent. It should then be encrypted in transit (HTTPS) to Apple's servers using a different protocol and a different key. But at that point Apple has access to the contents to check for CSAM, and they have probably been doing this since before the announcement of on-device CSAM detection. I suspect Apple then re-encrypts the data at rest on the server side, so only people with the keys have access to it.

I highly doubt iCloud Photos will ever be fully E2E encrypted, meaning only the original intended participants are privy to the contents, with no one in the middle able to view them, including but not limited to Apple or law enforcement. Threshold encryption is a way to circumvent the original participants and report to another, originally unintended party. Thus, I see little difference between the privacy of on-device vs. server-side CSAM detection. On-device scanning gives a false sense of privacy. Since I treat anything I send off device (already decrypted from local storage) as public knowledge, this is effectively the same.
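The key-holding distinction above can be sketched with a toy example. This is NOT real cryptography: the XOR "cipher" and all key names are illustrative assumptions only. The point is who holds which key: with transport encryption the server shares the session key and can read (and scan) the plaintext; with E2E encryption the key never leaves the endpoints.

```python
# Toy sketch (NOT real crypto) contrasting transport encryption with
# end-to-end encryption. Keys and the XOR cipher are illustrative only.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: applying it twice with the same key decrypts."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

photo = b"family photo bytes"

# --- Transport encryption (e.g. HTTPS upload to a server) ---
session_key = b"client-server-key"          # shared with the server
in_transit = xor_cipher(photo, session_key)
server_plaintext = xor_cipher(in_transit, session_key)  # server CAN decrypt
assert server_plaintext == photo            # so server-side scanning is possible
storage_key = b"server-at-rest-key"         # server re-encrypts for storage
at_rest = xor_cipher(server_plaintext, storage_key)

# --- End-to-end encryption (e.g. a messaging app) ---
e2e_key = b"sender-recipient-only"          # never shared with the server
ciphertext = xor_cipher(photo, e2e_key)
# The server only ever sees `ciphertext`; without e2e_key it cannot
# recover the photo, so it cannot scan the contents.
recipient_view = xor_cipher(ciphertext, e2e_key)
assert recipient_view == photo
```

Threshold schemes sit in between: the ciphertext is E2E-style opaque to the server until some condition (e.g. a match count) releases enough key material for a third party to decrypt, which is the "originally unintended party" concern above.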
It all boils down to trust. Do I trust Apple? More so than other companies. Do I trust that they won’t look at my images? Generally I do, but I’m also not naive. Do I trust their software not to be hacked or leak these images? To some degree, but again, I’m not naive. Do I trust Apple’s claims about encryption? Generally, yes. How about end to end encryption (meaning it stays encrypted once it leaves the device and is only decrypted once it arrives at the final destination)? Enough that I’ll send private details to someone on the other side.
And that’s just part of it. Do I trust the receiver not to re-share the details? Do I trust the receiver to have taken precautions not to inadvertently leak the info, whether through untrusted third-party software, someone looking over their shoulder, or even hackers? This is why things I want to keep private stay on my phone, and things I send off of it I regard as public knowledge, despite promises made about the implementation (which I generally consider trustworthy for now). -
Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...
jdw said: I was against the proposed Apple CSAM features because, even though the risk was low, there was still the chance it could basically "call the cops" on someone, even by accident. However, I am not averse to the blurring of nudity as mentioned by the article because it doesn't secretly phone home or otherwise call the cops on the user. Some naughty people who send those kinds of photos won't be happy, but since I don't do that, it's obviously a non-issue for me. And while I don't engage in other illicit activities regarding kiddie porn either, the fact it potentially could alert authorities to target a given iPhone user keeps me dead set against it.
So go ahead and blur naughty photos, Apple! Just keep the cops and prosecution out of it. Seems like that is the case here, so it doesn't bother me that this could be the precursor to something else. When that something else arrives, we the consumers can evaluate it at the time. For now, no objections from me.
This nudity detection feature, if, as stated, it all happens on device with no reporting to any party outside the original participants, does not violate privacy, and it is a tool parents can use to protect their kids (when the kids can afford their own phone/plan, they can decide for themselves whether to turn it on or off, imo).
I don’t take issue with the scanning itself; what I take issue with is what is done with the results of a scan (whether it is reported or not). -
Smarthome firm and early HomeKit partner Insteon is dead, with no warning to customers
I abhor devices that won’t work without internet. If I’m on the local network, a device should work without needing to phone home. The only internet communication should go through HomeKit’s APIs to allow remote control access; Apple’s services/APIs aren’t going away anytime soon. Another alternative would be an on-site VPN to reach these devices remotely. From security and reliability perspectives, each vendor providing their own cloud service is a nightmare.