tylersdad

About

Username
tylersdad
Joined
Visits
58
Last Active
Roles
member
Points
2,020
Badges
2
Posts
310
  • Apple privacy head explains privacy protections of CSAM detection system

    blastdoor said:
    Folks -- if you don't like this your complaint is with governments, not Apple. In fact, you should thank Apple for letting us know that they're doing this. 

    The first inescapable truth is that Apple (and every other company) MUST either (1) follow the laws of countries in which they operate or (2) pull out of those countries. 
    The second inescapable truth is that there is no way Apple is pulling out of the US, and almost no way Apple is pulling out of China. 

    So, if you don't want Apple to do something a government is asking them to do, your best bet is to demand the government not ask them to do it in the first place. 

    I've seen no evidence to suggest the government is forcing them to implement this feature. Can you post a link to the source of this information? 
  • Apple privacy head explains privacy protections of CSAM detection system

    bvwj said:
    So if some political foe wants to destroy me, all they need to do is hack my phone, deposit a collection of CSAM, and the cops will do the rest.  How convenient.
    Don't worry. Someone at Apple reviews all of your images before calling the cops. Does that make you feel any better? 
  • Internal Apple memo addresses public concern over new child protection features

    Another point to consider is that people in favor of this keep saying Apple isn't looking at your photos. However, Apple says that if an image is flagged, it is reviewed for unapproved content before law enforcement is informed. 

    So at some point, Apple will most certainly be looking at users' photos. 
  • Internal Apple memo addresses public concern over new child protection features

    bulk001 said:
    lkrupp said:
    tylersdad said:
    This is monumentally bad for privacy. It's making me reconsider my investments in Apple products. My entire family belongs to the Apple ecosystem. We all have some version of the iPhone 12, iPads, and Apple Watches. It starts with examining personal pictures, ostensibly to prevent child exploitation, but where does it lead? Where does it end? 


    If it makes the perverts switch to Android, so be it and good riddance. And I doubt you will leave the platform when you realize you're still better off with iOS than Android. Google will be doing exactly the same thing shortly, as it usually follows Apple in these matters. Who will you go to? Your statement is just a fart in a wind storm. Good luck with it making any difference.
    Agree with your basic sentiment of supporting Apple in this. I think, though, that Google Drive and FB already do something along these lines and Apple is actually playing catch-up? Could be wrong. If so, be sure to write me a rant on how Apple leads on everything! 😂
    Apple shouldn’t be playing catch-up on this spyware. They should stand up for the privacy of their customers. Today it’s child pornography that’s being targeted. Maybe tomorrow it’s anti-vax memes? Regardless, this is a horrible idea that will almost certainly be abused. 
  • Internal Apple memo addresses public concern over new child protection features

    loopless said:

    ...snip...

    Using the guise of child porn as a trojan horse (anyone who objects is tarred and feathered as being in favor of child abuse), we are having our privacy invaded in the most evil way. 
    Trojan horse is a great description, because that's exactly what this is: spyware. Just because it's installed by Apple does not make it any less so. It's checking for unapproved content on the phone that I paid for. Eventually, the list of unapproved content will grow, or this will be used by totalitarian governments to punish their citizens. 

    The ethics of this are an absolute mess. 