exceptionhandler

About

Username: exceptionhandler
Joined:
Visits: 930
Last Active:
Roles: member
Points: 334
Badges: 0
Posts: 378
  • macOS Sonoma beta review: Few major updates, but very welcome

    I’ve never understood the desire for widgets. I’ve tried using some form of them since the old Dashboard back in Snow Leopard up through the ones currently in iOS, and I’d rather have more room for apps on my screen than have a widget take up that space. The utility they add just seems too cumbersome, and at that point, why not just open the app? Just my personal experience/opinion; I’m sure there are many who swear by them.

    I guess somewhat related are the “complications” on Apple Watch. I do use some of those on my watch faces.

    I’d be curious to hear about people’s use cases and workflows around widgets.
  • Compared: New Apple TV 4K versus 2021 Apple TV 4K

    We recommend that prospective buyers of the Apple TV 4K should buy the Wi-Fi + Ethernet model.


    I’d also recommend the one with the Ethernet port for an additional reason. I had a 3rd gen Apple TV that lost its Wi-Fi capability and was out of its support period, but it still served us for years because I just used an Ethernet cable to connect it to the network.

  • EU will require iMessage, WhatsApp to communicate with smaller messaging services

    So does the EU have ANYBODY in their legislature that has any idea how any of this stuff could be accomplished?

    The only way I know of to allow interoperability between different encrypted platforms would take a can opener to the encryption and compromise security.

    Unless, that is, the Messages app supported the protocols for these other services. The chat would work in iMessage but be explicitly limited to Facebook Messenger, for instance, if that’s how the chat started, so you’d have to log in to Messages with a Facebook account to send/receive to people on Facebook. And to use the iMessage protocol in Messages, you’d still have to log in to iCloud. So essentially each service would be sandboxed, but displayable in the one app.

    But that may not be what they are talking about here, in which case you’d be correct, unless each separate service understood the same encryption protocols and keys.
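    The sandboxed-per-service idea could be sketched roughly like this. All class and service names here are hypothetical illustrations, not any real Messages API: one client app displays every service, but each chat is routed only through the backend it started on, and each backend requires its own login.

```python
# Hypothetical sketch: one chat client, multiple sandboxed service backends.
# Names are illustrative only, not a real Messages or Facebook API.

class ServiceBackend:
    """One per-protocol sandbox; each holds its own login/session."""
    def __init__(self, name):
        self.name = name
        self.logged_in = False
        self.account = None

    def login(self, account):
        self.logged_in = True          # a real backend would authenticate here
        self.account = account

    def send(self, recipient, text):
        if not self.logged_in:
            raise PermissionError(f"log in to {self.name} first")
        return f"[{self.name}] {self.account} -> {recipient}: {text}"


class ChatClient:
    """Displays every service in one app, but never mixes protocols."""
    def __init__(self):
        self.backends = {}

    def add_service(self, name):
        self.backends[name] = ServiceBackend(name)

    def send(self, service, recipient, text):
        # A chat started on one service stays on that service.
        return self.backends[service].send(recipient, text)


client = ChatClient()
client.add_service("iMessage")
client.add_service("FacebookMessenger")
client.backends["iMessage"].login("me@icloud.example")
print(client.send("iMessage", "friend@icloud.example", "hi"))
# Sending via FacebookMessenger without logging in raises PermissionError.
```

    The point of the sketch is the sandboxing: there is no path from one backend to another, so interoperability here means "many protocols in one window", not one shared encryption scheme.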
  • iOS 15 adoption rate appears to be lagging behind past updates

    Incorrect. Sorry but you're speaking from ignorance here. Deploying 100% E2E is exactly why Apple designed on-device child rape CSAM scanning.
    Do you work for Apple? Do you know an insider at Apple who has stated so? Has Apple officially released this detail somewhere? You’re making an assumption here, and it’s no better than what the pundits do that I see bashed all the time here.

    Apple may, or may not, be implementing a sort of encryption that would be “wish it were real E2E”. Think of the basic security questions banks ask, which are a kind of “wish it were two-factor auth”: it’s not the real thing, dumpster divers can find that info, and at best they’re just secondary passwords, “something you know”. Real 2FA is something you know and something you have; 3FA is something you know, something you have, and something you are.

    The whole point is this: real E2E is when I, in secret, encrypt a message so that no one else knows its contents, and it stays encrypted until it reaches the target audience, who has the capability to decrypt it (at which point you have to trust they will not share that info, and that someone or something is not watching). The point behind E2E is that only you, the sender, are privy to the contents, and sometime later, the target audience is also privy.
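    To make that E2E property concrete, here is a toy sketch assuming a pre-shared one-time pad between sender and recipient (real systems negotiate keys with authenticated key exchange, not pre-shared pads). The only point is that the ciphertext is opaque to everyone in between, and only the pad holder can recover the contents:

```python
import secrets

def make_pad(length):
    """Pre-shared secret, known only to sender and recipient."""
    return secrets.token_bytes(length)

def xor(data, pad):
    # One-time pad: XOR with the pad encrypts; XOR again decrypts.
    return bytes(a ^ b for a, b in zip(data, pad))

message = b"meet at noon"
pad = make_pad(len(message))        # shared out-of-band, in secret

ciphertext = xor(message, pad)      # this is all any intermediary sees
plaintext = xor(ciphertext, pad)    # only the pad holder can do this
assert plaintext == message
```

    Anyone watching the wire sees only `ciphertext`; without the pad there is no decryption step, which is what “only the sender and the target audience are privy” means.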

    Point is, even if some criminal were using it (probably not, unless they are dumb, because of server-side CSAM scanning), they can encrypt their stash using something like TrueCrypt (and even use the plausible-deniability mode that double encrypts it) and store it in a general file sharing/storage service like Dropbox or Google Drive. Or heck, even just keep it local and sync it to their own devices. They’ll find ways to see and propagate it.

    I’m not saying Apple shouldn’t do CSAM scanning… they should, but keep it on the server side. It’s encrypted in transit (HTTPS) and is probably encrypted at rest, so only the few at Apple who have the keys may have access to it. How does that change if they move to on-device scanning and reporting?

    As I’ve said before:
    Apple has used a threshold algorithm to enable review of the contents… remove that, and you will have true E2E encryption; otherwise it’s just “wish it were E2E”. It’s close, but no cigar.
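    The threshold idea can be sketched as follows. This is a simplification that assumes exact hash matching (Apple’s actual design uses a perceptual NeuralHash plus private set intersection and threshold secret sharing), but it shows the mechanism being objected to: nothing is surfaced for human review until the match count crosses the threshold:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual hash like NeuralHash; exact hashing
    # is an assumption made to keep the sketch self-contained.
    return hashlib.sha256(data).hexdigest()

def over_threshold(photos, known_hashes, threshold):
    """Return True (triggering human review) only past the threshold."""
    count = sum(1 for p in photos if fingerprint(p) in known_hashes)
    return count >= threshold

known = {fingerprint(b"bad-image-1"), fingerprint(b"bad-image-2")}
library = [b"cat", b"bad-image-1", b"dog"]
print(over_threshold(library, known, threshold=2))   # False: 1 match
library.append(b"bad-image-2")
print(over_threshold(library, known, threshold=2))   # True: 2 matches
```

    Below the threshold the result is indistinguishable from zero matches; at or above it, review is enabled, which is exactly the escape hatch the post argues disqualifies it as true E2E.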

    Once a threshold is met, what happens then? Do we trust Apple to handle the data correctly? Do we trust the authorities will handle the data correctly? Do we trust individuals at these places not to share it or leak it? Do we trust that the thresholds won’t change? Do we trust that other types of images won’t be deemed “illicit” in the future? Do we trust a similar threshold algorithm won’t be applied to text in iMessages, looking for messages of terrorism or “hate” speech? Do we trust that it will only be done for photos uploaded to iCloud?

    I see no difference between:

    - someone having limited access outside the intended audience to the data with on-device scanning and reporting, and
    - someone having limited access outside the intended audience to the data with server-side scanning and reporting.

    The only difference is where it’s done, and on-device scanning has more serious privacy problems than server-side.


    I treat anything I send to iCloud Photos as public knowledge, aka anything I want to keep private stays on my devices. On-device scanning and reporting encroaches on that privacy (even given the current software “limitations” of thresholds and scanning only what is sent to iCloud)… so, given that, I would much rather the CSAM reporting stay server-side.

    ¯\_(ツ)_/¯

  • iOS 15 adoption rate appears to be lagging behind past updates

    elijahg said:
    crowley said:
    “ It's likely that the reduced adoption rates are because of a change in Apple's update stance. The company is no longer forcing users to install the latest operating system version to gain important security updates.”

    99.9% of the reason.

    “Apple's set of iOS 15 features -- including the controversial-but-delayed CSAM detection system -- may also be playing a part.”

    0.1% of the reason.
    Agree. Why would a feature that most people don't know or care about, and that was pulled, be playing a part?
    Because people don't trust that Apple isn't doing it anyway, due to the entirely out-of-the-blue way they announced it? That said, I was on Big Sur until about a week ago. There was nothing compelling enough for me to spend days fixing things after they break as Apple deprecates random core bits of the OS.
    Why on earth would I be worried about CSAM fingerprint scanning for child rape images in iCloud Photos, even if they turned it on tomorrow? There is no reason to worry about this, since Apple already does it server-side on their iCloud servers, as do Google, Microsoft, Dropbox, etc.

    If I had child rape photos and didn’t want them to report them to police, I’d disable iCloud Photos. 

    I’m much more interested in the E2E encryption I’ll get when this eventually rolls out. That offers me value. Worrying about child rape porn? Not a concern.

    It may “technically” be E2E, as long as you disregard the watchdog that is not yours at your front door. Even then, it will not be true E2E… remove the thresholds and the ability to report to someone outside of the sender’s intended parties, and then it will be true E2E.
    Incorrect. Sorry but you're speaking from ignorance here. Deploying 100% E2E is exactly why Apple designed on-device child rape CSAM scanning. It scans on the device because once it gets synced to iCloud, they can't scan it there because...encrypted. 

    The only person who is unhappy in this scenario is somebody with child rape photos on their device. The greater good is clearly on the side of all the customers in the world who will enjoy E2E privacy on all their data, vs the child rape collectors unhappy that they will still get in trouble. Oh well.

    Currently CSAM child rape photo scanning takes place on the servers of Apple, Google, Microsoft, etc. If some nefarious govt wants to make them look for Winnie the Pooh images, they can do that easier server-side, and could force them to do it today. Nothing about on-device makes it any easier, and arguably more difficult.
    What good is E2E if someone is watching everything going in and reporting to their boss?