exceptionhandler

About

Username
exceptionhandler
Joined
Visits
631
Last Active
Roles
member
Points
268
Badges
0
Posts
354
  • Compared: New Apple TV 4K versus 2021 Apple TV 4K

    We recommend that prospective buyers of the Apple TV 4K should buy the Wi-Fi + Ethernet model.


    I’d also recommend the one with the Ethernet port for an additional reason. I had a 3rd gen Apple TV that lost its wifi capability and was out of its support period, but it still served us for years because I just used an Ethernet cable to connect it to the network.

  • These are the most popular emoji characters of 2021

    How accurate is this data? I wouldn’t imagine they would have access to usage of these images in E2E encrypted systems like iMessage (at least I’d hope not). So which data sources are being used? Was this limited to unencrypted communications such as social media posts?
  • Two years after Apple Silicon, Intel still wants Apple to buy chips

    They’ve got a 1 in a million chance…
  • iOS 15 adoption rate appears to be lagging behind past updates

    Incorrect. Sorry but you're speaking from ignorance here. Deploying 100% E2E is exactly why Apple designed on-device child rape CSAM scanning.
    Do you work for Apple? Do you know an insider at Apple who has stated so? Has Apple officially released this detail somewhere? You’re making an assumption here, and it’s no better than what the pundits do that I see bashed all the time here. Apple may, or may not, be implementing a sort of encryption that would be “wish it were real E2E” (think of the basic security questions banks ask, which are a kind of “wish it were two-factor auth”: they’re not the real thing, dumpster divers can find that info, and at best they’re just secondary passwords, “something you know”. Real 2FA is something you know plus something you have; 3FA adds something you are.)

    The whole point is, real E2E is when I, in secret, encrypt a message so that no one else knows its contents, and it stays encrypted until it reaches the target audience, who has the capability to decrypt it (at which point you have to trust they will not share that info, and that someone or something is not watching). The point behind E2E is that only you, the sender, are privy to the contents, and sometime later, the target audience is too.
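    To make the E2E idea above concrete, here’s a toy sketch (not production crypto; the tiny Diffie-Hellman group and XOR “cipher” are deliberately simplified stand-ins) of how two parties can derive a shared key that the relay server never sees:

```python
# Toy illustration of end-to-end encryption: only the two endpoints ever
# hold the key, so whatever relays the ciphertext cannot read it.
import hashlib
import secrets

P = 2**127 - 1   # a small Mersenne prime; real DH uses far larger safe primes
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    # Both sides compute g^(ab) mod p, then hash it down to a 32-byte key.
    return hashlib.sha256(str(pow(their_pub, my_priv, P)).encode()).digest()

def xor_cipher(key, data):
    # Toy stream cipher: the same operation both encrypts and decrypts.
    stream = (hashlib.sha256(key).digest() * (len(data) // 32 + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

# Only the public halves cross the wire; the server never learns the key.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
key = shared_key(a_priv, b_pub)
assert key == shared_key(b_priv, a_pub)   # both sides derive the same key

ciphertext = xor_cipher(key, b"meet at noon")  # all the server ever sees
plaintext = xor_cipher(key, ciphertext)        # only a key holder recovers this
```

    The point the sketch makes is structural: if scanning-and-reporting happens before encryption or after decryption, the “only the endpoints can read it” property above no longer holds.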

    Point is, even if some criminal were using it (probably not, unless they are dumb, because of server-side CSAM scanning), they can encrypt their stash using something like TrueCrypt (and even use the plausible-deniability mode that double-encrypts it) and store it in a general file sharing/storage service like Dropbox or Google Drive. Or heck, just keep it local and sync it to their own devices. They’ll find ways to see and propagate it.

    I’m not saying Apple shouldn’t do CSAM detection… they should, but keep it on the server side. It’s encrypted in transit (HTTPS) and is probably encrypted at rest, so only the few at Apple who have the keys may have access to it. How does that change if they move to on-device scanning and reporting?

    As I’ve said before:
    Apple has used a threshold algorithm to enable review of the contents… remove that, and you will have true E2E encryption; otherwise it’s just “wish it were E2E”. It’s close, but no cigar.
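    The threshold mechanism described above can be sketched in a few lines; the hash values, set contents, and threshold here are purely illustrative stand-ins, not Apple’s actual design:

```python
# Sketch of threshold-gated reporting: uploads are hash-matched against a
# known-bad set, and nothing becomes reviewable until enough matches accrue.
KNOWN_BAD_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in for a CSAM hash DB
THRESHOLD = 3

def matches(upload_hashes):
    # Which of this account's upload hashes hit the known-bad set.
    return [h for h in upload_hashes if h in KNOWN_BAD_HASHES]

def review_triggered(upload_hashes):
    # Below the threshold the account stays opaque; at or above it, the
    # matched items become reviewable -- the carve-out from E2E in question.
    return len(matches(upload_hashes)) >= THRESHOLD
```

    The existence of `review_triggered` returning `True` for anyone other than the sender and recipient is exactly the “close, but no cigar” gap: the encryption is intact right up until a rule decides it isn’t.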

    Once a threshold is met what happens then? Do we trust Apple to handle the data correctly? Do we trust the authorities will handle the data correctly?  Do we trust individuals at these places not to share it or leak it?  Do we trust that the thresholds won’t change? Do we trust that other types of images won’t be deemed “illicit” in the future? Do we trust a similar threshold algorithm won’t be applied to text in imessages, looking for messages of terrorism or “hate” speech?  Do we trust that it will only be done for photos uploaded to iCloud?

    I see no difference between:

    - Someone having limited access, outside the intended audience, to the data with on-device scanning and reporting…

    - Someone having limited access, outside the intended audience, to the data with server-side scanning and reporting…

    The only difference here is where it’s done, and on-device scanning has more serious privacy problems than server-side.


    I treat anything I send to iCloud Photos as public knowledge, aka anything I want to keep private stays on my devices. On-device scanning and reporting encroaches on that privacy (even given the current software “limitations” of thresholds and only scanning what is sent to iCloud)… so, given that, I would much rather CSAM reporting stay server-side.

    ¯\_(ツ)_/¯

  • New iOS 15.2 beta includes Messages feature that detects nudity sent to kids

    I’m not against scanning, as the software necessarily needs to be able to read and write images (it wouldn’t be useful if it didn’t). Scanning is just a subset of reading images: it makes it easier for me to find images of particular things.

    But reporting the results of a scan to a 3rd party without my consent, that’s what I take issue with. As long as the data remains on device, scanning is a useful time-saving tool.

    Also, being a concerned parent, I probably won’t be handing over a smartphone or tablet to my kids that I don’t have access to, so this feature is not really useful to me. They have an iPad, but I have it locked down so the password can’t be changed (and technically, at this point, they don’t know the password, so they have to ask to use it).

    I guess it could be useful if an unknown number/account sends illicit images, to have them blurred out, just in case.

    When they are old enough to buy/afford a phone and pay the monthly charges, they can do what they want.
  • iOS 15 adoption rate appears to be lagging behind past updates

    elijahg said:
    crowley said:
    “ It's likely that the reduced adoption rates are because of a change in Apple's update stance. The company is no longer forcing users to install the latest operating system version to gain important security updates.”

    99.9% of the reason.

    “Apple's set of iOS 15 features -- including the controversial-but-delayed CSAM detection system -- may also be playing a part.”

    0.1% of the reason.
    Agree. Why would a feature that most people don’t know or care about, and that was pulled, be playing a part?
    Because people don't trust that Apple isn't doing it anyway due to the entirely out-of-the-blue way they announced it? That said I was on Big Sur until about a week ago. There was nothing compelling enough for me to spend days fixing things after they break as Apple deprecates random core bits of the OS.
    Why on earth would I be worried about CSAM fingerprint scanning for child rape images in iCloud Photos, even if they turned it on tomorrow? There is no reason to worry about this, since Apple already does it server-side on their iCloud servers, as do Google, Microsoft, Dropbox, etc.

    If I had child rape photos and didn’t want them to report them to police, I’d disable iCloud Photos. 

    I’m much more interested in the E2E encryption I’ll get when this eventually rolls out. That offers me value. Worrying about child rape porn? Not a concern.

    It may “technically” be E2E, as long as you disregard the watchdog, which is not yours, at your front door. Even then, it will not be true E2E… remove the thresholds and the ability to report to someone outside of the sender’s intended parties, and then it will be true E2E.
    Incorrect. Sorry but you're speaking from ignorance here. Deploying 100% E2E is exactly why Apple designed on-device child rape CSAM scanning. It scans on the device because once it gets synced to iCloud, they can't scan it there because...encrypted. 

    The only person who is unhappy in this scenario is somebody with child rape photos on their device. The greater good is clearly on the side of all the customers in the world who will enjoy E2E privacy on all their data, vs the child rape collectors unhappy that they will still get in trouble. Oh well.

    Currently CSAM child rape photo scanning takes place on the servers of Apple, Google, Microsoft, etc. If some nefarious govt wants to make them look for Winnie the Pooh images, they can do that easier server-side, and could force them to do it today. Nothing about on-device makes it any easier, and arguably more difficult.
    What good is E2E if someone is watching everything going in and reporting to their boss?
  • EU will require iMessage, WhatsApp to communicate with smaller messaging services

    So does the EU have ANYBODY in their legislature who has any idea how any of this stuff could be accomplished?

    The only way I know that would allow interoperability between different encrypted platforms would take a can opener to the encryption and would compromise security.
    Unless the Messages app supports the protocols for these other services, meaning the chat would work in Messages but be explicitly limited to Facebook Messenger, for instance, if that’s how the chat started. So you’d have to log in within Messages with a Facebook account to send/receive to people on Facebook, and to use the iMessage protocol in Messages, you’d still have to log in to iCloud. So essentially each service would be sandboxed, but displayable in the app.

    But that may not be what they are talking about here, in which case you’d be correct, unless each separate service understood the same encryption protocol and keys.
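    The sandboxed, per-protocol approach described above could look something like the sketch below; all class and method names here are hypothetical illustrations, not any real Apple or Meta API:

```python
# One client app, many wire protocols: each service gets its own sandboxed
# backend behind a common interface, and a chat stays pinned to the service
# it started on.
from abc import ABC, abstractmethod

class ChatService(ABC):
    @abstractmethod
    def login(self, credentials: str) -> None: ...
    @abstractmethod
    def send(self, recipient: str, text: str) -> str: ...

class IMessageService(ChatService):
    def login(self, credentials):
        self.account = credentials        # would authenticate against iCloud
    def send(self, recipient, text):
        return f"[iMessage] {self.account} -> {recipient}: {text}"

class FacebookMessengerService(ChatService):
    def login(self, credentials):
        self.account = credentials        # would authenticate against Facebook
    def send(self, recipient, text):
        return f"[Messenger] {self.account} -> {recipient}: {text}"

def route(chats, chat_id, text):
    # The app only routes to whichever sandboxed backend owns this chat;
    # no message ever crosses from one service's protocol into another's.
    service, recipient = chats[chat_id]
    return service.send(recipient, text)
```

    Under this design the app is a multiplexer, not a bridge: no shared wire format and no shared keys, which is why it sidesteps the can-opener problem, but also why it may not satisfy what the EU actually means by interoperability.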
  • New folding iPad and refreshed iPad mini 7 now in 2024, says Kuo

    Why does the world need a folding iPad? 

    More complex to design and manufacture, with more parts and extra cabling to connect them across the two halves of the device. 

    More likely to break or malfunction.

    Thick when folded.

    Potentially reduced internal space for battery versus non-folding iPad with same screen dimensions.  

    Not really more convenient to carry.  

    Not specifically addressed (it falls under “more likely to break or malfunction”), but the hinge also introduces an ingress point for debris, adding to the care needed to keep it functional/nice at best, and at worst hindering its ability to work. Unless of course Apple figured out how to completely seal the hinge… but even then, there are all the other issues you’ve identified.

    I really don’t see the use case here for a folding screen.  I think this is a case of:

    “Yeah, but your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.” - Dr. Ian Malcolm
  • Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...

    jdw said:
    I was against the proposed Apple CSAM features because, even though the risk was low, there was still the chance it could basically "call the cops" on someone, even by accident.  However, I am not averse to the blurring of nudity as mentioned by the article because it doesn't secretly phone home or otherwise call the cops on the user.  Some naughty people who send those kinds of photos won't be happy, but since I don't do that, it's obviously a non-issue for me.  And while I don't engage in other illicit activities regarding kiddie porn either, the fact it potentially could alert authorities to target a given iPhone user keeps me dead set against it.

    So go ahead and blur naughty photos, Apple!  Just keep the cops and prosecution out of it.  Seems like that is the case here, so it doesn't bother me that this could be the precursor to something else.  When that something else arrives, we the consumer can evaluate it at the time.  For now, no objections from me.
    I agree. I was, and still am, against on-device CSAM detection. I’m OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates circumventing encryption to report the offending data to a 3rd party the sender never intended.

    This nudity detection feature, if it all happens on device as stated, with no reporting to any party outside the original participants, does not violate privacy and is a tool parents can use to protect their kids (when the kids can afford their own phone/plan, they can decide for themselves whether to turn it on or off, imo).

    I don’t have an issue with scanning; what I take issue with is what is done with the results of a scan (whether it is reported or not).
  • Amazon slashing 9,000 more jobs in fresh round of layoffs

    lkrupp said:
    So millennials are finding out how the world really works, that no job is secure, that all this talk about how companies value employees is bullshit. You didn’t want to come back to work at the office but wanted to remain cozy working at home with your pet cat on your lap. It’s the BOTTOM LINE, baby, the bottom line. Get used to it, You’ll be switching jobs every few years for the rest of your lives with no security, no perks, no free lunches and lattes. Think joining a union will make it all better? Hardy har har. The SCOTUS is likely to shitcan your hopes of getting your student loan debt laid on the backs of taxpayers. $400 billion? Think again. 

    The world will continue to need electricians, plumbers, brick masons, welders, carpenters, big equipment operators for the foreseeable future, not so much programmers, data entry workers, marketing types, even certain engineering fields as AI will see to that. It’ll be awhile until Boston Dynamics comes up with a robot that can wire and plumb a new home.

    End rant from a 73 year old curmudgeon.
    http://www.stilldrinking.org/programming-sucks

    As @StrangeDays has said, engineers don’t have anything to worry about (at least the good ones). ChatGPT can only generate things based on prior art, not write its own code from scratch. While ML has vastly improved, it’s still in the category of 4GL languages for programming, in my opinion: I don’t trust them to always do the right thing. It’s a great tool, good for finding insights or solutions, but a tool doesn’t replace the programmer. That’s not to say it’s useless. It could allow us engineers to be more productive by letting us focus on other, more important things, or provide insights we may not have found in a timely manner. As software engineers, we strive to automate things so that human errors don’t happen, make things quicker, or have the computer do something mundane to free us up for more interesting tasks. ML is no different, and really the next step.