Rayz2016

About

Banned

Username: Rayz2016
Joined:
Visits: 457
Last Active:
Roles: member
Points: 18,421
Badges: 2
Posts: 6,957
  • Apple Watch credited with saving man's life after fall

    bageljoey said:
    I often go mountain biking on my own. I've been thinking I should enable fall detection, but I'm worried that I would trip the warning.
    Would it be possible to not notice the pre-call haptic advisement? The riding can be intense, and I don't want to call 911 by accident!

    That would be a possibility. Not only could the rough ride trigger a false fall detection, but the jostling might obscure the haptic feedback of the notification (it's not very loud; it's mostly the vibration). Actually, though, I'm not sure about the false fall detection: your hands on the handlebars might prevent that. I get false triggers while playing basketball or touch football, with their sudden movements.

    But you can turn fall detection off and still enjoy all the other benefits of the watch, like being able to call for help if needed.

    Yeah, my plan has always been to stay conscious long enough to manually send for help from my watch.

    That's not a plan. You know that, right?
  • Epic Games CEO slams Apple 'government spyware'

    mbdrake76 said:
    M68000 said:
    Why are people so up in arms about this? If you have nothing to hide, why are you fighting Apple's attempt to make the world safer from criminals and sick people? This "CEO" made the amazingly stupid comment about a "presumption of guilt"? Well, at least in the United States a person is innocent until proven guilty.

    I'm frankly amazed by the feedback in this forum over the last few days: people who trust and love Apple now don't want to help Apple try to make the world safer.

    If you work for any large company, I hope you know that any email, chat messages, and files on company computers can be looked at and scanned by security and network admins.

    If you want total privacy, get rid of cloud storage and maybe go back to using a typewriter LOL

    Apple does not want their products used for crime and is making an attempt to do something about it; what is so hard to understand?
    Your employer has the right to monitor your activities at work, but certainly not at home.

    My general concern is what happens if things go wrong, and things can and do go wrong. We're Apple users; we expect things to go wrong! Due to a bug or hash mismatch (okay, the odds of a false positive are very low), it could be possible for completely innocent images to be flagged incorrectly. Apple doesn't exactly have the most marvellous reputation for dealing with sensitive and urgent problems when accounts are closed for something the account holder isn't responsible for.
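
    To put "very low" in perspective, here's a back-of-envelope sketch. Every number in it is an illustrative assumption, not Apple's published figure, and the model (independent per-image false matches, binomial tail) is a simplification:

        import Foundation

        // Rough model (all numbers are illustrative assumptions): if each photo
        // independently has a false-match probability p, the chance that a library
        // of n photos produces at least k false matches is a binomial tail.
        func probabilityOfAtLeast(_ k: Int, n: Int, p: Double) -> Double {
            // P(X >= k) = 1 - sum over i < k of C(n, i) * p^i * (1 - p)^(n - i)
            var below = 0.0
            for i in 0..<k {
                let logChoose = lgamma(Double(n) + 1) - lgamma(Double(i) + 1) - lgamma(Double(n - i) + 1)
                below += exp(logChoose + Double(i) * log(p) + Double(n - i) * log(1 - p))
            }
            return max(0, 1 - below)  // clamp tiny negative rounding error
        }

        // e.g. 20,000 photos, a one-in-a-million per-image false match, threshold of 30 matches:
        print(probabilityOfAtLeast(30, n: 20_000, p: 1e-6))  // effectively zero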

    But, as many other people have said, it doesn't have to stop there. The same tech could be used (or expanded) to check for other content that, say, a government could compel Apple to weed out and report. This has the potential for abuse (mind you, most things do).

    HOWEVER...

    Adobe already do this with their cloud services.  This is outlined here:

    https://www.adobe.com/uk/legal/lawenforcementrequests/childsafety.html

    So those using Lightroom Desktop/CC or Lightroom Classic, which sync photos to the Adobe Creative Cloud, are already having their photos scanned with CSAM-detection technology when they're uploaded to Adobe's servers. I've not seen any articles that mention this, or any comment from Adobe on it.

    I can definitely see why Apple wants to implement CSAM detection on the iPhone (perhaps it gives Apple a chance to say to law enforcement: hey, you don't need to crack the phone, we do the job for you!), and it'd be one of the few companies not already doing so (Google already do it through many of their products and services: https://transparencyreport.google.com/child-sexual-abuse-material/reporting?hl=en), but it does somewhat go against their privacy-matters mantra.

    Adobe, Microsoft, Apple and Google all scan their servers for CSAM images and report any found to the authorities.

    Apple, as far as I can tell, is the only one installing spyware on your device to scan files before they reach the servers.

  • Open letter asks Apple not to implement Child Safety measures

    iadlib said:
    This is admirable and I like the intention behind it. But to play devil's advocate: this technique could, maybe, possibly, be adapted for other kinds of data. Say emails. Text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked? By a hacker, or a government? To search for anti-state speech in China, or even for industrial espionage. This is a Pandora's box that could never be shut.

    Apple takes hash values of known child-abuse images from the NCMEC database and loads them onto the phone (which is not something I want to think about being loaded onto my phone, even if they are just hash values). The phone then runs the comparison.

    So the hijacking is basically just a matter of loading the hashes from whichever database Apple is told to track.
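
    In code terms, the matching step described above is roughly the following. This is a deliberately simplified sketch: it uses a plain SHA-256 digest and an exact set lookup, whereas Apple's announced system uses a perceptual "NeuralHash" and a blinded private-set-intersection protocol, and the knownHashes set here is a hypothetical stand-in for the NCMEC-derived database:

        import Foundation
        import CryptoKit

        // Hypothetical stand-in for the hash database shipped to the device.
        // Swapping in a different list is exactly the "hijacking" worry above.
        let knownHashes: Set<String> = []

        /// Returns true if the image's hash appears in the on-device database.
        func matchesKnownImage(_ imageData: Data) -> Bool {
            let digest = SHA256.hash(data: imageData)                     // real system: perceptual hash
            let hex = digest.map { String(format: "%02x", $0) }.joined()
            return knownHashes.contains(hex)                              // real system: private set intersection
        }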
  • WhatsApp latest to pile on Apple over Child Safety tools

    Facebook schooling Apple on privacy. 

    It’s like one of those What if … movies. 
  • Open letter asks Apple not to implement Child Safety measures

    Yup, this basically. 

