aguyinatx

About

Username: aguyinatx
Joined:
Visits: 4
Last Active:
Roles: member
Points: 25
Badges: 0
Posts: 14
  • What you need to know: Apple's iCloud Photos and Messages child safety initiatives

    mjtomlin said:
    Just thought I’d chime in after reading so many misguided complaints about this subject. Are all of you context deaf? Did you read what is actually happening or are you just incapable of comprehending it?

    They are not scanning photos for specific images, they’re simply counting the bits and creating a hash… all they see are numbers… this in no way scans photos for offensive images, nor does it in any way violate your privacy.

    It amazes me that so many are complaining about trying to restrict/thwart child pornography?!

    it’s even more ridiculous when you consider every photo you save to the Photos app is automatically scanned through an image recognition engine to identify the contents of the photo.

    Creating a hash requires reading the image, and that absolutely requires the file to be opened.  This is pretty easy to demonstrate on a Mac.  Also, DO NOT RUN COMMANDS YOU DO NOT UNDERSTAND FROM FORUM POSTS.  Ask someone or research the commands first.

    Now.... In a terminal....

    sudo -i and enter your account password. This elevates your privileges to root.  That's a lowercase i, btw.
    echo "foo" > /opt/root-owned-file.txt  This creates a file in /opt called root-owned-file.txt with the word "foo" as its only content.
    chmod 0600 /opt/root-owned-file.txt  This ensures that only the root user can read and write the file.
    exit and hit return.

    Now you're running as the user you logged in with. 
    shasum -a 256 /opt/root-owned-file.txt   should give you a hash (those numbers you were talking about), but it doesn't.  You get a permission denied error, because you can't hash a file that you can't open.  (Note: macOS doesn't ship sha256sum by default; shasum -a 256 is the built-in equivalent.)  Apple isn't magic, they have to open the image in order to analyze it.  Full stop.  No binary or user on a Unix system can hash a file without opening it.

    Okay, clean up the file: sudo rm /opt/root-owned-file.txt
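    If you'd rather see the same point in code, here's a minimal Python sketch (Python is just a convenient stand-in; nothing here is Apple's actual code).  hashlib can only produce a digest from bytes it has read, and the only way to get the bytes is to open the file.  Run this as a normal user against the root-owned file above and it fails with PermissionError before a single byte is hashed.

    import hashlib

    def sha256_of(path):
        # You cannot hash what you cannot open: open() is the step that
        # raises PermissionError on the root-owned file for a normal user.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    print(sha256_of("/opt/root-owned-file.txt"))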

    Next up... This computer is one that I paid for and own.  Only parties I consent to should have the right to open my files to analyze them, and the example above shows that analyzing a file means opening it.  No one is complaining about stopping CSAM, but these aren't computers that Apple owns, and Apple isn't asking users whether they want to submit to surveillance (and no, scanning a photo to see if a dog is in it is not surveillance).  Additionally, Apple is clearly adopting an extra-judicial vigilante role.  Law enforcement agencies require a warrant to compel someone to surrender access to a computer, and yet Apple presumes powers that the FBI doesn't have.

    The article is primarily an ad hominem fallacy without many facts.  "Hey, the other guys are doing it too!" is a baseless argument.  I do not have a Facebook account, so I don't care what Facebook does.  I'm suddenly not given a choice with Apple, and I am perfectly justified in getting my ire up when they insist they have an uninvited right to open files that I create.
  • Apple backs down on CSAM features, postpones launch

    MplsP said:
    How many of the people screaming about CSAM have Facebook, WhatsApp, Instagram, and google apps on their devices and an Amazon or google smart speaker in their home?
    That isn't relevant; it's a very oranges-to-lemons comparison (not going to say apples: pun avoided).  Facebook et al. are services where you deliver content to them, and it is stored and processed on their systems.  They have every right to scan for anything they want on their own systems, in the same way I have the right to censor speech in my home: if I don't like what someone says, I can remove them.  Apple's proposal was to use a system I own in a way that created an adversarial relationship with me.  It was actively checking my activity to make sure it complied with a third party's definition of okay.  I have to make the obligatory declaration that exploitation of anyone, especially children, is reprehensible and morally bankrupt.  Apple is perfectly justified in scanning iCloud for CSAM if they want, and I support that 100 percent.  Apple should not be free to use a device I own as a way of letting law enforcement bypass warrants.
  • Edward Snowden calls Apple CSAM plans 'disaster-in-the-making'

    netrox said:
    You guys, I just have to remind you that MS, Google, Facebook, and Twitter have been "scanning" your videos and images for CSAM for years. Apple was actually "late" to this feature. Now you're upset about Apple? LOL

    Clearly, none of you people have a clue on how it works. 

    Snowden is a traitor and betrayed our Americans for exposing our privacy for the internet to see and you're worried about Apple who is making sure no child porn is shared across the Internet while preserving privacy? 
     

    What Google et al. are doing is NOT the same thing as what Apple proposes.  The "cloud" is just slang for someone else's computer.  When you choose to upload a photo to a third party, you give up control over that photo.  Apple proposes that the moment you take a photo you lose control over it.  That is very, very different, and Snowden's point is exactly correct.  It is true that Snowden broke laws, but that isn't relevant to the conversation; it's an ad hominem attack on what you perceive to be a character flaw.  That type of character attack doesn't contribute anything and makes you look desperate to prove a point.  Don't do it.  As far as a "clue" about how it works, well, I'm afraid I do have one.  Here is a link to a better writer than me: the Hacker Factor blog.  Read it.
  • Bill Maher declares Apple CSAM tools a 'blatant constitutional breach'

    netrox said:
    This whole CSAM is tiring, it's been done by Google, MS, Tumblr, Reddit, and so on for YEARS! Apple just announced to make us know they're doing the same thing. I don't recall anyone batting an eye with MS, Google, Facebook, and so on. 

    https://www.microsoft.com/en-us/photodna

    You're comparing apples (pun intended) to oranges.  The other companies you mention are platforms that are either designed specifically to share content or are cloud providers.  The difference between a cloud provider scanning for CSAM and Apple's strategy is that when you use the "cloud" you're really just using someone else's computer, and they have the right to scan their own computers for content they do not like.  I have the right to inspect anything in my house any time I like, right?  Apple is looking on my phone, using a CPU and storage that I paid for and own.  That's the difference, and it's not a matter of semantics.  iCloud is ON by default, and Apple is proactively making the decision to (a) upload my photo and (b) scan it.  The "spin" they're putting on it, that they do not have to look at the picture to understand its content, is poppycock; many others, who are better writers than I am, have addressed that.  You are not correct.
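    To make that concrete, here's a deliberately oversimplified Python sketch of hash-list matching (the digest set and function name are hypothetical, and real systems like Apple's use perceptual hashes such as NeuralHash rather than SHA-256, so matches survive resizing and recompression).  Even this bare-bones version has to open the photo and read every byte before it can "see only numbers":

    import hashlib

    # Hypothetical digest list -- a placeholder, not real data.
    KNOWN_BAD_DIGESTS = {"0" * 64}

    def is_flagged(photo_path):
        # There is no digest without the data: the matcher must open the
        # photo and read it end to end before it can compare "just numbers".
        with open(photo_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return digest in KNOWN_BAD_DIGESTS

    The matcher never "looks" at the picture in a human sense, but it absolutely opens and reads it, on hardware the owner paid for, and that is the whole objection.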
  • Edward Snowden calls Apple CSAM plans 'disaster-in-the-making'

    aguyinatx said:
    robaba said:
    aguyinatx said:
    netrox said:
    You guys, I just have to remind you that MS, Google, Facebook, and Twitter have been "scanning" your videos and images for CSAM for years. Apple was actually "late" to this feature. Now you're upset about Apple? LOL

    Clearly, none of you people have a clue on how it works. 

    Snowden is a traitor and betrayed our Americans for exposing our privacy for the internet to see and you're worried about Apple who is making sure no child porn is shared across the Internet while preserving privacy? 
     

    What Google et al. are doing is NOT the same thing as what Apple proposes.  The "cloud" is just slang for someone else's computer.  When you choose to upload a photo to a third party, you give up control over that photo.  Apple proposes that the moment you take a photo you lose control over it. 
    So, when you send pictures you’ve personally taken with your iPhone and send them to iCloud, and your iPhone hashes them and comparison them to hashes of child porn, you suddenly somehow have retroactively lost control of the picture the moment you took it?  Yeah, lean back on the rhetoric dude.
    The scanning is done on device.  When iCloud is enabled, iCloud photo sync is ON BY DEFAULT.  There is no rhetoric to lean back on; there are facts, however.  The phone is taking an adversarial role with its owner, as it now has the capacity to judge the content of a photo and react to that content without the consent of the owner.  We're talking about CSAM in the US, but as Apple has shown it will deliberately choose to weaken security for convenience, and that scope will change as other regulatory environments demand it.  I am all for scanning on their servers, but not on hardware that I own.

    https://www.zdnet.com/article/apple-removes-feature-that-allowed-its-apps-to-bypass-macos-firewalls-and-vpns/

    iCloud Photo Library and Photo Stream are most definitely opt-in. 
    Well done. You definitely had me second-guessing myself, but it turns out you're 100% wrong on that.  Here's what I did: I have an older iPhone (but one still able to run iOS 14.7) that I reset to defaults.  I signed in with my iCloud account, and the photo library was set to sync via iCloud on the new (old) phone by default.  iCloud photo sync was already OFF on my actual iPhone 11; as a matter of fact, I don't use photo syncing on any of my Apple devices.  The verdict: iCloud photo sync defaults to ON when iCloud is enabled.
