Rayz2016

About

Banned
Username: Rayz2016
Joined:
Visits: 457
Last Active:
Roles: member
Points: 18,421
Badges: 2
Posts: 6,957
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    Rayz2016 said:
    gatorguy said:
    Rayz2016 said:
    gatorguy said:
    Rayz2016 said:
    crowley said:
    Apple can resist government requests, but if a government makes that scheme into law, Apple cannot resist.
    This was true last week too; nothing has changed with regard to Apple's obligation to follow the law in places where they do business.

    My guess is that they've been offered a deal: implement the backdoor and the anti-trust/monopoly stuff goes away.
    Huh.

    You know another big tech company, Google, is in the antitrust crosshairs. It also coincides with a decision by Google to no longer give themselves a key to user cloud data, so that they can't turn over certain private information even if compelled by court order. They simply can't decrypt it, period. There have been two other recent Google policy changes that will restrict authorities' access to data and communications too, both here and abroad. Is there any connection between privacy and antitrust action? I'm not so sure there isn't.

    I actually meant Apple had been offered a deal, but now I'm intrigued.

    Google, throwing away the keys? 

    Where's the link for this? 
    https://www.androidcentral.com/apple-may-have-ditched-encrypted-backups-google-hasnt


    Hmmm. 

    That is very interesting. There’s a theory floating around that Apple is running the back door in the client so they can implement encrypted backups on iCloud. This seems to blow that idea out of the water. 
    Gruber suggests as much, though he's more cautious than optimistic on that point.
    But if Google can encrypt backups without building back doors in the client, then why can’t Apple?

    I suppose running it on their servers for millions of files is quite expensive. Makes more sense to shift the resource hit for a handful of files to each individual customer. 
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    cpsro said:
    Of course Apple would never expand the tools. Apple merely provides the backdoor. Governments will be the ones to walk through it.
    Apple will be opening the tools to third parties, apparently. 



  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    lkrupp said:
    All this handwringing and foaming at the mouth, spittle flying everywhere. Fine, but what do you all plan to DO about it? Leave the iOS platform? Where will you go? Android, even though Google has been doing this for a couple of years now and Apple is just playing catch up?


    Just going to politely stop you there for a second.

    Google hasn't been doing this for years. Google carries out scans on the server, just like Microsoft, just like Apple.

    What Google doesn't do (yet) is run government-sponsored spyware on the client device. 

    What will you do? Where will you go? Any answers?
    Well, I fully expect Google to do the same thing, now that Apple has given law enforcement a taste of what they can get away with.

    But the difference is that Android is open-source, so there will still be phones available that don't do the scan.


    I think the point that supporters of this are missing is that it isn't the whole scanning process: getting someone you neither know nor trust to examine your private files, then shutting down your account, demanding you prove to them that you're not a nonce, then contacting the authorities. Nope, that's not the problem.

    The problem is doing it on the device. 

    There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys… I think everybody’s coming around also to recognizing that any backdoor means a backdoor for bad guys as well as good guys. And so a backdoor is a nonstarter. It means we are all not safe… I don’t support a backdoor for any government, ever.
    We do think that people want us to help them keep their lives private. We see that privacy is a fundamental human right that people have. We are going to do everything that we can to help maintain that trust. — Apple CEO Tim Cook, October 1, 2015

    Yup, that aged well.

  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    crowley said:
    Apple can resist government requests, but if a government makes that scheme into law, Apple cannot resist.
    This was true last week too; nothing has changed with regard to Apple's obligation to follow the law in places where they do business.
    Yeah, I think the trick is to wait for them to demand a backdoor; then at least you can tell your customers that the law says they have to put in a backdoor at the same time as the competition.

    Putting one in before you're told to strikes me as a little bit dumb.

    My guess is that they've been offered a deal: implement the backdoor and the anti-trust/monopoly stuff goes away.

    Then it would be daft of the government to force Apple to allow alternative app stores, because then they couldn't prevent folk from installing software that might bypass the checks.
  • Epic Games CEO slams Apple 'government spyware'

    I am interested to know what the difference is between Apple doing this and Google doing this. Google has been scanning photos for CSAM for years and I've never seen an uproar over it. Is there something I'm missing that makes it terrible on Apple's part and OK on Google's part?

    For reference, here's an article from 2016 about a man being arrested on child porn charges after authorities received a tip from Google that he had uploaded CSAM to his Gmail account.

    https://www.denverpost.com/2016/06/22/jonbenet-ramsey-child-porn-boulder-gary-oliva/


    Google found the child porn by scanning the file on their servers. This is nothing new; Microsoft, Apple and Google have been doing this for quite some time.

    The problem is that this will not find porn if the files are encrypted before they are sent to the server.

    So Apple will get around this by installing a spy program that will scan photos, hash them, compare them with the database of child porn image hashes that has been downloaded to your phone, and report on them before they are sent to iCloud.
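
    Roughly, the flow being described is "hash the photo locally, compare it against an opaque list of known-bad hashes, flag it before upload". Here's a minimal sketch of that idea, assuming a plain SHA-256 exact match and made-up helper names; Apple's actual design uses a perceptual NeuralHash plus a private set intersection protocol, neither of which this attempts to reproduce.

        import CryptoKit
        import Foundation

        // Hypothetical stand-ins for the real blocklist download and photo
        // library enumeration; placeholders for illustration only.
        func loadVendorSuppliedHashes() -> Set<String> { [] }
        func localPhotoLibraryURLs() -> [URL] { [] }

        /// Hash a photo's bytes. The device owner never learns what the
        /// blocklist entries depict; the client only learns whether a local
        /// hash is in the set.
        func photoHash(_ data: Data) -> String {
            SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
        }

        /// Check each photo against the downloaded blocklist before upload.
        /// Anything returned here would be reported alongside the iCloud sync.
        func flaggedBeforeUpload(photoURLs: [URL], blocklist: Set<String>) -> [URL] {
            photoURLs.filter { url in
                guard let data = try? Data(contentsOf: url) else { return false }
                return blocklist.contains(photoHash(data))
            }
        }

        let flagged = flaggedBeforeUpload(photoURLs: localPhotoLibraryURLs(),
                                          blocklist: loadVendorSuppliedHashes())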

    The problem is that Apple has already admitted that the system is open to abuse. Since the images are hashed, Apple doesn't know what images are being searched for. This means that any government can inject any image into the database Apple is picking up. In some countries, this won't be child porn; it'll be images of wanted dissidents, government protestors, subversive poetry.

    Apple has basically told the world that they're happy to build them a surveillance network for any country that asks them to.

    That's the problem.