command_f

About

Username: command_f
Joined:
Visits: 85
Last Active:
Roles: member
Points: 550
Badges: 0
Posts: 439
  • New iMessage edit and unsend features have 15-minute time limit

    kdrummer said:
    macxpress said:
    Plenty of time before you pass out after sending that drunken text to your ex. Great feature. 
    Or the accidental dick pic....
    Now with iCloud Shared Photo Library, everyone in your family will still get to see that “accidental” picture!
    Make sure when you accidentally take it that you're in black and white: then it's art.
  • Family alleges AirTag was used to stalk mother and daughter on Disney World trip

    twlatl said:
    This happened to me on our last trip to Disney World. No one was stalking us. When you are on a ride, or in a line for a while, with someone who has an Apple device (AirTag, AirPods, etc.), the tracking notification can get confused into thinking that device is trying to track you when it's just a device on someone close by.
    I think you're probably right. There are guides to the "optimum" route to take around Disney: two parties that read the same website could well stay in close proximity for a sequence of rides.
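    To make the idea concrete, here is a minimal sketch of the kind of proximity heuristic such an alert could use. The Sighting type, the thresholds, and the coordinate rounding are all assumptions for illustration, not Apple's actual Find My logic. Two parties following the same guide would register each other's accessories repeatedly, in several distinct places, within the window, and trip the alert with no stalking involved.

        import Foundation

        // Hypothetical sketch of an unwanted-tracking heuristic; not Apple's algorithm.
        // An unfamiliar accessory seen near you repeatedly, in several distinct places,
        // within a time window triggers an alert.
        struct Sighting {
            let deviceID: String          // identifier of a nearby Find My accessory
            let time: Date
            let latitude: Double
            let longitude: Double
        }

        struct TrackingHeuristic {
            // Assumed thresholds, purely illustrative.
            let minSightings = 4
            let minDistinctPlaces = 3
            let window: TimeInterval = 2 * 60 * 60   // two hours

            func shouldAlert(deviceID: String, sightings: [Sighting], now: Date = Date()) -> Bool {
                let recent = sightings.filter {
                    $0.deviceID == deviceID && now.timeIntervalSince($0.time) <= window
                }
                guard recent.count >= minSightings else { return false }

                // Bucket coordinates onto a coarse grid (roughly 100 m) to count distinct places.
                let places = Set(recent.map { "\(Int($0.latitude * 1000)),\(Int($0.longitude * 1000))" })
                return places.count >= minDistinctPlaces
            }
        }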
  • Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...

    jdw said:
    <Snip>.
    I agree. I was and still am against on-device CSAM detection. I'm OK if CSAM detection happens in the cloud, however… their house, their rules. On-device CSAM detection necessitates the circumvention of encryption to report the data that violates the rules to a third, originally unintended, party.
    I don't follow your logic. CSAM detection is OK if it's done on someone else's computer (Apple's server in the cloud) but not if it's done on your own phone? If it's done on your phone, you then get the warning and no-one else can tell (unless you keep doing it, which is my memory of what Apple proposed). If it's done on someone else's server, then the data is present somewhere out of your control.

    I also don't understand what encryption you think is being "circumvented". The data on your phone is encrypted but, within the phone, the encryption keys are available and the data unlocked: if it wasn't, how would Photos be able to display your photos to you? 
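    The "unless you keep doing it" part refers to the threshold in Apple's proposal: individual matches were meant to stay invisible until a count was crossed. Below is a simplified, hypothetical sketch of just that threshold idea; the hash strings, the threshold value, and the plain true/false result are assumptions for illustration, and the perceptual hashing and cryptographic vouchers of the real proposal are not reproduced here.

        import Foundation

        // Hypothetical sketch of threshold-based on-device matching. Nothing is
        // reportable until the match count crosses a threshold; until then, each
        // individual scan result stays on the device.
        struct OnDeviceMatcher {
            let knownHashes: Set<String>   // database of known hashes shipped to the device
            let reportThreshold: Int       // assumed value, for illustration only
            var matchCount = 0

            // Returns true only once the threshold has been crossed.
            mutating func scan(imageHash: String) -> Bool {
                if knownHashes.contains(imageHash) {
                    matchCount += 1
                }
                return matchCount >= reportThreshold
            }
        }

        // Example: 29 matches produce nothing visible; the 30th crosses the threshold.
        var matcher = OnDeviceMatcher(knownHashes: ["example-hash"], reportThreshold: 30)
        for _ in 1...29 { _ = matcher.scan(imageHash: "example-hash") }
        print(matcher.scan(imageHash: "example-hash"))   // true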
  • Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia...

    entropys said:
    Opt in or out is not the same as a company creating the ability to do so. iMessage's main thing, apart from the cool blue boxes, is its privacy. This is helping to wreck it.

    On the particular matter, my first thought was: how many kids actually have iPhones anyway? But then I am an old fogey from The Time Before Mobiles. Every kid has a phone. And privacy, once important, and protection against the creation of a Big Brother in all its possible forms, no longer matter.
    How is this a privacy issue?

    The article says "All detection of nudity is done on-device, meaning that any potentially sensitive images never leaves an iPhone." If only the phone in question is involved, then no-one else is involved, so by definition it's private.
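    As an illustration of what "on-device" buys you here: the image bytes only ever flow into a local classifier and come back out as a yes/no blur decision, so there is no networking code anywhere in the path. The protocol and type names below are hypothetical stand-ins, not Apple's actual API.

        import Foundation

        // Hypothetical stand-in for an on-device classifier; not Apple's actual model or API.
        protocol LocalImageClassifier {
            // Runs entirely on the device: no URLSession, no server round trip.
            func probablyContainsNudity(_ imageData: Data) -> Bool
        }

        // The only thing derived from the image is a local blur decision,
        // which just drives the UI on the same phone.
        struct IncomingImagePolicy {
            let classifier: any LocalImageClassifier

            func shouldBlur(_ imageData: Data) -> Bool {
                classifier.probablyContainsNudity(imageData)
            }
        }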
  • Western Australia Police can now use CarPlay to respond to emergencies

    uroshnor said:
    iOS has been validated by the Australian Signals Directorate for the security classifications used by police in Australia for over a decade, and most Australian (& NZ) police forces use Apple devices at a fairly large scale: thousands to tens of thousands of devices, mainly iPhones or iPad Minis, across multiple forces.
    That's interesting. Did they get access to Apple's source code, I wonder.