macplusplus

About

Username: macplusplus
Joined:
Visits: 293
Last Active:
Roles: member
Points: 3,141
Badges: 1
Posts: 2,119
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    All the boneheads claiming Apple created a "backdoor" -- nope. All the tech companies do this (Dropbox, Microsoft, Google), and Apple did 100% server-side CSAM scanning a year ago:

    https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

    ...you dudes are simply panicking and clutching your pearls because you didn't know about it before.
    That article is an argument against Apple, not in favor of it. Since they are already performing that scan on their servers, what is the point of injecting another mechanism into the device itself? I have no authority over the iCloud servers; those are Apple's property. But I do have authority over my device, and I don't want it used to inspect me. This is no different from planting a camera in your house to monitor whether you abuse your children or your wife.

    Previously, Apple rejected the government's request to develop a custom iOS to break into criminals' iPhones. That would have worked on a case-by-case basis, yet Apple still rejected it because it would set a precedent. And now Apple sets that precedent voluntarily, not on a case-by-case basis but for the whole ecosystem...
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    lkrupp said:
    All this handwringing and foaming at the mouth, spittle flying everywhere. Fine, but what do you all plan to DO about it? Leave the iOS platform? Where will you go? Android, even though Google has been doing this for a couple of years now and Apple is just playing catch-up? Will you extricate yourself from the online universe and go off the grid in a bunker in northern Utah or the Yukon territory, maybe Hudson’s Bay? Of course you can’t change anything at the ballot box because both parties are on board with this. It’s coming and there’s NOTHING you can do about it except bitch on a tiny tech blog.

    What will you do? Where will you go? Any answers?

    Maybe those bitching on this tiny tech blog can't do much, but [unfortunately] parents can do something by forbidding their kids the iPhone: "iPhone? God forbid! I heard that Apple's cloud is full of paedophiles and Apple is working hard to deal with that..."
  • New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

    Apple can resist government requests, but if a government turns that scheme into law, Apple cannot resist.

    Apple has set a precedent with this. Now many governments will take that precedent as a template and try to turn it into law. Yet in the past, Apple resisted the request to develop a special iOS to break into criminals' phones precisely because it would set a precedent.
  • Epic Games CEO slams Apple 'government spyware'

    Just for the sake of fun, let's imagine a user sporadically bombarded with child porn over WeChat, Telegram, WhatsApp and the like. Our decent guy duly deletes every image he finds in those chat apps and thinks he is keeping his iPhone clean of such things. But what he doesn't know is that every time such an image arrives, it is automatically saved to his iCloud photo library too. Safety vouchers will accumulate and our poor guy will end up in jail without ever figuring out what happened to him!

    Do photos sent via these messaging apps automatically get added to iCloud Photos simply by being sent and not viewed with one's eyes?

    We don't know Apple's thresholds. It could be 1,000 vouchers. It could be 10,000. If someone has manually viewed 1,000+ photos by choice, and each photo they view is uploaded to iCloud, then let them go to jail, as they are willfully participating in the exchange of such material.

    The technology that Apple has designed and implemented is not the problem. It's strongly designed with privacy in mind. The main concern is that Apple could be forced to let that same technology be used for other purposes, which may border on an invasion of privacy. It's the potential for abuse that is getting everyone up in arms.

    I trust that Apple has implemented this system really well. Innocent until *proven* guilty, after all, right?
    Whether photos are added automatically or after viewing doesn't matter. Most people use their phones by trial and error, without a coherent view of the operating system; they don't know what resides where. Besides, it is not always easy to qualify an image as child porn: if the subject is below the age of consent it is child porn, but the age of consent varies between jurisdictions. You defend Apple with the argument "innocent until proven guilty", yet you explicitly refuse to apply it to the decent individual in our fictitious case.
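    To put rough numbers on the threshold question debated above, here is a minimal sketch of the voucher-counting idea. Everything in it is an assumption for illustration: the hash set, the threshold value, and the plain SHA-256 lookup stand in for Apple's actual design, which uses NeuralHash and threshold secret sharing rather than a simple counter.

    import hashlib

    def image_hash(image_bytes):
        # Plain SHA-256 as a stand-in; the real system uses a perceptual
        # hash (NeuralHash), which this sketch does not attempt to model.
        return hashlib.sha256(image_bytes).hexdigest()

    # Hypothetical "known CSAM" hash set, built here from dummy byte strings.
    KNOWN_BAD_HASHES = {image_hash(b"dummy-known-image-1"),
                        image_hash(b"dummy-known-image-2")}

    THRESHOLD = 1000  # unknown in reality; the post above guesses 1,000-10,000

    def matching_vouchers(library):
        # Count how many photos in the library match the known-hash set.
        return sum(1 for img in library if image_hash(img) in KNOWN_BAD_HASHES)

    def account_flagged(library):
        # Only when the match count reaches the threshold is review triggered.
        return matching_vouchers(library) >= THRESHOLD

    Under these placeholder numbers, a handful of unsolicited chat-app images would sit far below the threshold; whether that is enough protection for the user in the fictitious scenario is exactly what is in dispute here.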
  • Open letter asks Apple not to implement Child Safety measures

    killroy said:
    killroy said:
    killroy said:
    omasou said:
    Apple should shut down iCloud instead of developing a mass surveillance method for the government.
    It is NOT a mass surveillance method for the government. It is a system for REPORTING CSAM, designed to be an advocate for and to protect children.

    If we see or are aware of CSAM we should report it. Apple can SEE and be AWARE of CSAM w/o violating anyone's privacy and SHOULD report it.
    OK. Why do they monitor my device from within? They can scan their servers for any abusive material. User backups on iCloud are stored unencrypted and law enforcement can always access those backups with a search warrant. They can perform the same CSAM hash checking on their iCloud servers as well.

    The fact that they are bringing the monitoring right into my device shows that they might be following a totally different agenda than preventing child abuse. They may be trying to permanently implant something on user devices whose scope may extend to God knows where...

    Because once it's on Apple's servers they can't see it, since it's encrypted. You have to see it before it's encrypted or it won't work.
    This is just not true. iCloud content is stored on their servers encrypted, but with Apple's keys. Your device keys are not used to encrypt content on iCloud (with a few exceptions such as passwords; certainly not photos). Since they can decrypt your iCloud data and deliver it to law enforcement at any time with a search warrant, they can decrypt it for hash checking too (a rough sketch of what such a server-side check could look like follows at the end of this post). Since the license agreement already gives them permission to scan your content on iCloud, what is the point of injecting another, questionable tool into your device, your own property?

    Your phone, yes; the OS, not so much. If you read your phone carrier's terms of service you might find that they can upload anything they want. What Apple is proposing to do is add a firewall in the OS to keep the nefarious stuff off their servers.

    Today iCloud Photos. Tomorrow what? iCloud Drive? Including the Desktop and Documents folders of all your Macs? To protect themselves from your nefarious junk?

    You can just opt out. I don't think Google or Facebook and others that have been doing it for years give you that.
    Such superficial comparisons with Google or Facebook don't help and are misleading. iCloud Drive is much more than a photo repository; it is a fundamental component of macOS. Google and Facebook don't host your Desktop and Documents folders. Apple's responsibility is deeper than Google's, Facebook's or Dropbox's. Pushing the opt-out option as an argument is just an implicit way of admitting that such projects may put the whole ecosystem at stake.
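    For comparison with the on-device approach, here is a minimal sketch of the server-side alternative argued for above, assuming, as the post claims, that Apple can decrypt stored iCloud photo content with keys it holds itself. Function names and the SHA-256 stand-in are illustrative assumptions, not Apple's actual pipeline.

    import hashlib

    # Illustrative known-hash set; a real deployment would use a perceptual
    # hash database rather than SHA-256 of exact bytes.
    KNOWN_BAD_HASHES = {hashlib.sha256(b"dummy-known-image").hexdigest()}

    def decrypt_with_service_key(ciphertext):
        # Placeholder: stands for Apple decrypting stored content with keys
        # it holds itself, as the post argues it already can for legal requests.
        return ciphertext  # no real cryptography in this sketch

    def scan_stored_photo(ciphertext):
        # Server-side check: decrypt with the service-held key, then hash-match.
        plaintext = decrypt_with_service_key(ciphertext)
        return hashlib.sha256(plaintext).hexdigest() in KNOWN_BAD_HASHES

    The trade-off the thread keeps circling is exactly this: the server-side variant keeps the check off the user's own device, while the on-device variant keeps the check off content Apple would otherwise have to decrypt on its servers.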