exceptionhandler

About

Username: exceptionhandler
Joined:
Visits: 856
Last Active:
Roles: member
Points: 325
Badges: 0
Posts: 369
  • EU accuses Apple of breaking antitrust laws with Apple Pay [u]

    This whole thing makes me think of ‘The Little Red Hen’

    https://en.m.wikipedia.org/wiki/The_Little_Red_Hen

    with Apple as the hen and the EU as the other farm animals, except that here the animals are trying to force Apple to share.
    watto_cobra
  • Surfshark, TurboVPN and more are secretly undermining security

    zimmie said:
    As I keep saying, there's nothing private about most of these "VPN" services. They are proxies which use VPN technologies for the client-to-proxy leg of the connection.

    With most, you are exchanging snooping from your telco for snooping from Belarusian companies and telcos. Not exactly an upgrade in privacy.
    This.  Security at its root is all about trust.  I don’t trust any of these VPN services; I’d rather my internet provider just see all the traffic.  That is not to say I don’t or won’t use VPNs.  They make sense when both sides of the connection can be trusted, like connecting to work to access resources on the network there.  Or when I’m out and about: I have a personal VPN server at home that I use to get access to things there (a rough client-side sketch of that kind of setup is below).  Poking a hole in the firewall for a VPN is a whole lot safer than poking a hole for each and every resource I want to access away from home.  I trust the encryption behind several VPN implementations, but I don’t always trust what’s at the other end unless I know what’s there.
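    Purely for illustration, here’s a minimal client-side sketch of that kind of profile using Apple’s NetworkExtension framework. The hostname and identifiers are placeholders, and a real setup would install a certificate or shared secret and needs the Personal VPN entitlement.

```swift
import NetworkExtension

// Sketch: an IKEv2 profile pointing at a personal VPN server at home.
// "home.example.net" and the identifiers are placeholders, not a real setup.
let manager = NEVPNManager.shared()
manager.loadFromPreferences { loadError in
    guard loadError == nil else { return }

    let proto = NEVPNProtocolIKEv2()
    proto.serverAddress = "home.example.net"   // placeholder: the server at home
    proto.remoteIdentifier = "home.example.net"
    proto.localIdentifier = "my-iphone"
    proto.authenticationMethod = .certificate  // real setup: install the identity first

    manager.protocolConfiguration = proto
    manager.localizedDescription = "Home VPN"
    manager.isEnabled = true

    manager.saveToPreferences { saveError in
        guard saveError == nil else { return }
        do {
            // One hole in the home firewall (the VPN port) instead of one per service.
            try manager.connection.startVPNTunnel()
        } catch {
            print("Could not start tunnel: \(error)")
        }
    }
}
```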
    watto_cobra
  • Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...

    command_f said:
    jdw said:
    <Snip>.
    I agree.  I was and still am against on-device CSAM detection.  I’m OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates circumventing encryption to report the data that violates the rules to a third, originally unintended party.
    I don't follow your logic. CSAM detection is OK if it's done on someone else's computer (Apple's Server in the cloud) but not if it's done on your own phone? If it's done on your phone, you then get the warning and no-one else can tell (unless you keep doing it, which is my memory of what Apple proposed). If it's done on someone else's server then the data is present somewhere out of your control.

    I also don't understand what encryption you think is being "circumvented". The data on your phone is encrypted but, within the phone, the encryption keys are available and the data unlocked: if it wasn't, how would Photos be able to display your photos to you? 
    As I’ve said before: I treat anything I send to iCloud Photos as public knowledge (even if it’s meant only for friends and family).  Anything I want to keep private stays on my devices.  On-device scanning and reporting encroaches on that privacy (even given the current software “limitations” of thresholds and scanning only what is sent to iCloud)… so, given that, I would much rather CSAM reporting stay server-side.

    You are correct that the data is encrypted before it is stored on the phone, and that it has to be decrypted to be displayed on your screen.  This is different from the encryption used to send something via Messages, for instance, and is also different from the encryption used to send things via iCloud Photos.  When an image is sent to iCloud Photos, it has to be decrypted from local storage first.  It should then be encrypted in transit (HTTPS) to Apple’s servers using a different protocol and a different key.  But at that point Apple has access to the contents to check for CSAM, and they’ve probably been doing this since before the announcement of on-device detection.  I suspect Apple then re-encrypts the data at rest on the server side, so only people with the keys have access to it.  (A rough sketch of those separate layers is below.)
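    Purely to illustrate the point about separate layers and keys (this is not Apple’s actual pipeline), here is a toy CryptoKit sketch: the same photo sits under a device key at rest, a different transport key in flight, and a server key at rest on the other side, and the middle hop necessarily sees plaintext.

```swift
import Foundation
import CryptoKit

// Toy illustration of layered encryption; not Apple's implementation.
do {
    let photo = Data("raw image bytes".utf8)

    // 1. At rest on the device: sealed under a device-local key.
    let deviceKey = SymmetricKey(size: .bits256)
    let storedOnDevice = try AES.GCM.seal(photo, using: deviceKey)

    // 2. To upload, the device first decrypts its local copy...
    let plaintextForUpload = try AES.GCM.open(storedOnDevice, using: deviceKey)

    // 3. ...then a separate transport key (standing in for the TLS session)
    //    protects it in flight.
    let transportKey = SymmetricKey(size: .bits256)
    let inTransit = try AES.GCM.seal(plaintextForUpload, using: transportKey)

    // 4. The server strips the transport layer (so it can see the contents,
    //    e.g. for server-side scanning) and re-seals under its own at-rest key.
    let receivedByServer = try AES.GCM.open(inTransit, using: transportKey)
    let serverKey = SymmetricKey(size: .bits256)
    let storedOnServer = try AES.GCM.seal(receivedByServer, using: serverKey)
    print("server stored \(storedOnServer.ciphertext.count) encrypted bytes")
} catch {
    print("encryption demo failed: \(error)")
}
```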

    I highly doubt iCloud Photos will ever be fully end-to-end encrypted, meaning only the originally intended participants are privy to the contents, with no one in the middle able to view them, including but not limited to Apple or law enforcement.  Threshold encryption is a way to bypass the original participants and report to another, originally unintended party once a threshold of matches is crossed (a toy sketch of the threshold idea is below).  Thus, I see little difference between the privacy of on-device and server-side CSAM detection.  On-device gives a false sense of privacy.  Since I treat anything I send off the device (already decrypted from local storage) as public knowledge, the two are effectively the same.
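    To make the “threshold” idea concrete (a toy sketch only, not Apple’s actual safety-voucher scheme): in k-of-n secret sharing, a secret such as a decryption key stays unrecoverable until someone holds at least k shares, after which it can be reconstructed without the original owner’s involvement.

```swift
import Foundation

// Toy Shamir-style k-of-n secret sharing over GF(p), to illustrate "threshold":
// fewer than k shares reveal nothing useful; any k reconstruct the secret.
// Parameters are illustrative and unrelated to Apple's scheme.
let p = 2_147_483_647  // prime modulus (2^31 - 1)

func mod(_ x: Int, _ m: Int) -> Int { ((x % m) + m) % m }

// Modular exponentiation, used for the modular inverse (Fermat's little theorem).
func powMod(_ base: Int, _ exp: Int, _ m: Int) -> Int {
    var result = 1, b = mod(base, m), e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % m }
        b = b * b % m
        e >>= 1
    }
    return result
}

func inverse(_ a: Int, _ m: Int) -> Int { powMod(a, m - 2, m) }

// Split `secret` into n shares; any k of them reconstruct it.
func split(secret: Int, k: Int, n: Int) -> [(x: Int, y: Int)] {
    let coeffs = [secret] + (1..<k).map { _ in Int.random(in: 0..<p) }
    return (1...n).map { x in
        var y = 0
        for c in coeffs.reversed() { y = (y * x + c) % p }  // evaluate polynomial at x
        return (x, y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret from any k shares.
func reconstruct(_ shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = num * mod(-sj.x, p) % p
            den = den * mod(si.x - sj.x, p) % p
        }
        secret = mod(secret + si.y * num % p * inverse(den, p), p)
    }
    return secret
}

let shares = split(secret: 123_456, k: 3, n: 5)
print(reconstruct(Array(shares.prefix(3))))  // 123456: threshold met
print(reconstruct(Array(shares.prefix(2))))  // meaningless value: below threshold
```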

    It all boils down to trust.  Do I trust Apple? More so than other companies.  Do I trust that they won’t look at my images? Generally I do, but I’m also not naive.  Do I trust their software not to be hacked or to leak these images? To some degree, but again, I’m not naive.  Do I trust Apple’s claims about encryption? Generally, yes.  How about end-to-end encryption (meaning it stays encrypted once it leaves the device and is only decrypted once it arrives at the final destination)? Enough that I’ll send private details to someone on the other side.

    And that’s just part of it.  Do I trust the receiver not to re-share the details? Do I trust the receiver to have taken precautions against inadvertently leaking the info, whether through untrusted third-party software, someone looking over their shoulder, or outright hackers? This is why things I want to keep private stay on my phone, and things I send out of it I regard as public knowledge, despite promises made about the implementation (which I generally consider trustworthy for now).
    muthuk_vanalingam, watto_cobra
  • Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...

    jdw said:
    I was against the proposed Apple CSAM features because, even though the risk was low, there was still the chance it could basically "call the cops" on someone, even by accident.  However, I am not averse to the blurring of nudity as mentioned by the article because it doesn't secretly phone home or otherwise call the cops on the user.  Some naughty people who send those kinds of photos won't be happy, but since I don't do that, it's obviously a non-issue for me.  And while I don't engage in other illicit activities regarding kiddie porn either, the fact it potentially could alert authorities to target a given iPhone user keeps me dead set against it.

    So go ahead and blur naughty photos, Apple!  Just keep the cops and prosecution out of it.  Seems like that is the case here, so it doesn't bother me that this could be the precursor to something else.  When that something else arrives, we the consumer can evaluate it at the time.  For now, no objections from me.
    I agree.  I was and still am against on-device CSAM detection.  I’m OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates circumventing encryption to report the data that violates the rules to a third, originally unintended party.

    This nudity-detection feature, if, as stated, it all happens on device with no reporting to any party outside the original participants, does not violate privacy, and it is a tool parents can use to protect their kids (when the kids can afford their own phone/plan, they can decide for themselves whether to turn it on or off, imo).

    I don’t have an issue with scanning; what I take issue with is what is done with the results of a scan (whether or not it is reported).  A rough sketch of a purely on-device check is below.
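    For a flavor of what an on-device check with no reporting can look like, here is a minimal sketch using the SensitiveContentAnalysis framework Apple later shipped in iOS 17; this is not necessarily how the Messages feature itself is implemented, and the image URL is a placeholder. The verdict stays in the app; nothing is sent anywhere.

```swift
import Foundation
import SensitiveContentAnalysis

// Sketch only: on-device nudity check (iOS 17+, requires the
// sensitive-content-analysis entitlement). The URL is a placeholder.
// The result stays local; the app decides whether to blur or warn, not report.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()
    guard analyzer.analysisPolicy != .disabled else { return false }
    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        return false  // fail open to "no blur" in this toy example
    }
}
```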
    elijahg, muthuk_vanalingam, watto_cobra
  • Smarthome firm and early HomeKit partner Insteon is dead, with no warning to customers

    I abhor devices that won’t work without internet access.  If I’m on the local network, a device should work without needing to phone home, and the only internet communication should go through HomeKit’s APIs to allow remote-control access; Apple’s services/APIs aren’t going away anytime soon.  Another alternative would be to use an on-site VPN to access these devices.  From security and reliability perspectives, each vendor running its own cloud service is a nightmare.  (A rough sketch of local HomeKit control is below.)
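    As a rough sketch (assuming a home is already configured and has a lightbulb accessory), this is roughly what controlling a device through Apple’s HomeKit framework looks like, with no vendor cloud involved; the class name is made up for the example.

```swift
import HomeKit

// Sketch: toggle a HomeKit lightbulb through Apple's framework rather than a
// vendor cloud. Assumes a home is already set up and the app has HomeKit access.
final class LightToggler: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self  // homes load asynchronously
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.homes.first else { return }

        for accessory in home.accessories {
            for service in accessory.services where service.serviceType == HMServiceTypeLightbulb {
                let power = service.characteristics.first {
                    $0.characteristicType == HMCharacteristicTypePowerState
                }
                // On the local network this write never leaves the house; away
                // from home it is relayed via a home hub, not a vendor server.
                power?.writeValue(true) { error in
                    if let error = error {
                        print("Failed to switch \(accessory.name): \(error)")
                    }
                }
            }
        }
    }
}
```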
    watto_cobra