What you need to know: Apple's iCloud Photos and Messages child safety initiatives


Comments

  • Reply 41 of 162
    Marvin Posts: 15,320 (moderator)
    cpsro said:
    elijahg said:
    Remember that 1 in 1 trillion isn't 1 false positive per 1 trillion iCloud accounts - it's 1 per 1 trillion photos. I have 20,000 photos, that brings the chances I have a falsely flagged photo to 1 in 50 million. Not quite such spectacular odds then.
    One in a trillion over 20,000 photos is not 1 in 50 million. It's one in a trillion, 20,000 times. The odds do not decrease per photo, as your photo library increases in size. There is not a 1:1 guarantee of a falsely flagged photo in a trillion-strong photo library.

    And even if it was, one in 50 million is still pretty spectacularly against.
    One in 50 million with over 1 billion accounts is 200 false positives per year.
    But Apple might claim the false positive rate is per account, not per photo.
    These statistics are, however, under idealized (perfectly random) circumstances and real data typically doesn't look entirely random.
    Apple posted a document saying it's per account; they have a threshold, and multiple images have to match the database, so the number is likely based on a combination of the image hash collision rate and the threshold required per account:

    https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

    "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result.

    The device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.

    Using another technology called threshold secret sharing, the system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images.

    The threshold is selected to provide an extremely low (1 in 1 trillion) probability of incorrectly flagging a given account. This is further mitigated by a manual review process wherein Apple reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated."
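
    To picture the "threshold secret sharing" mentioned above, here is a toy Shamir-style sketch: a secret (standing in for the per-account decryption material) is split so that any t shares reconstruct it, while fewer reveal nothing. This is purely illustrative classroom code, not Apple's actual construction, and every parameter name is made up.

    ```python
    # Toy Shamir secret sharing: illustrates only the "threshold" idea.
    # Not Apple's protocol; the field size and API are arbitrary choices.
    import random

    PRIME = 2**127 - 1  # prime field large enough for a toy secret

    def make_shares(secret: int, t: int, n: int):
        """Split `secret` into n shares; any t of them reconstruct it."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
        poly = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, poly(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers the secret from t shares."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    shares = make_shares(secret=123456789, t=3, n=10)
    assert reconstruct(shares[:3]) == 123456789   # 3 shares suffice
    assert reconstruct(shares[:2]) != 123456789   # 2 shares reveal nothing useful
    ```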

    The hashes use the colors and shapes in the image; collisions between completely distinct images will be extremely rare, and only after a flag is raised will they manually review to make sure.
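
    For the statistics being argued over above, a back-of-the-envelope sketch looks something like the following. The numbers are made up for illustration, since Apple publishes neither the per-image false-match rate nor the exact account threshold.

    ```python
    # Back-of-the-envelope sketch of the statistics in this thread.
    # All parameters are illustrative, not Apple's published figures.
    from math import comb

    # 1) If "1 in a trillion" were a *per-photo* rate, the chance of at least
    #    one false match somewhere in a 20,000-photo library would be:
    p_photo = 1e-12
    n = 20_000
    p_any = 1 - (1 - p_photo) ** n        # ~2e-8, i.e. roughly 1 in 50 million
    print(f"At least one false match in {n} photos: {p_any:.2e}")

    # 2) Apple's document says the figure is *per account*, with a match
    #    threshold. With a hypothetical per-image rate and threshold, the
    #    per-account probability is a binomial tail:
    p_image = 1e-6       # hypothetical per-image false-match rate
    threshold = 30       # hypothetical number of matches needed to flag an account
    p_account = sum(
        comb(n, k) * p_image**k * (1 - p_image)**(n - k)
        for k in range(threshold, threshold + 50)   # higher terms are negligible
    )
    print(f"Account falsely flagged (threshold {threshold}): {p_account:.2e}")
    ```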

    Apple doesn't want to host content like this on their services; it makes them an unwilling participant. As soon as someone tries to upload these images to iCloud, they want to verify the content first.

    Lots of code accesses the content of images; even just viewing an image requires code to decompress it from its file format into uncompressed data in memory and copy it onto the GPU for display. Similarly with generating thumbnails to browse a gallery of photos. Copying files onto a network drive has to duplicate every bit in the image.

    Every client device has to decode image content at some point; the added step here is checking it against a database when a user tries to send it to Apple and have them host it.
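
    A tiny Pillow example of the point being made here: even "just viewing" or thumbnailing an image means code has to decode every pixel into memory first. The file path is hypothetical and this is only an illustration.

    ```python
    # Decoding is unavoidable: displaying or thumbnailing a photo requires
    # decompressing the whole file into raw pixels first. Illustrative only.
    from PIL import Image  # assumes the Pillow library is installed

    img = Image.open("photo.jpg")          # hypothetical path
    pixels = img.convert("RGB").tobytes()  # fully decoded, uncompressed pixel data
    print(f"{img.size[0]}x{img.size[1]} image decodes to {len(pixels):,} bytes in memory")

    thumb = img.copy()
    thumb.thumbnail((256, 256))            # building a gallery thumbnail also decodes it
    thumb.save("photo_thumb.jpg")
    ```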
    Images in messages are also decoded by Apple's software when viewing them; scanning for nudity and protecting against it helps the device users. It means less chance that celebrities will have iCloud photos leaked, and it means parents can worry less about their kids being targeted.

    If people don't want Apple to do any of this to them, they can avoid sending images to their services and the features in parental controls.

    It's likely that Apple has been getting inundated with requests for decryption from law enforcement. Recently, a network with 400,000 users was shut down:

    https://www.bbc.com/news/world-europe-56969414

    That's one network out of dozens, maybe hundreds. Apple could be getting tens of thousands of iCloud account decryption requests to check for this. Active scanning will make things more efficient.
  • Reply 42 of 162
    DAalseth Posts: 2,783 (member)
    The first thought I had when reading this was, “Holy F, do I ever hate statistics.” I’ve had two occasions in my life to study stats, once in University, and once last year when I was working on a higher certification for work. It always seemed like a meaningless parlour game that told me nothing. I’ve got an absolute blind spot for the subject. So you people can argue about the actual probabilities when you have 20k images and one-in-a-trillion odds of a false hit all you want. I’ll just skip those comments. Means nothing to me. (For the record, I was and still am quite good with math, except for stats.)

    The second thing that occurred to me was that no matter how low the odds, no matter how well Apple handles it, this will come back and bite them in the @$$. It will be used by their competitors and the army of Apple Hating Trolls to weave a narrative of Apple spying on people and colluding with foreign powers. No matter how untrue the narrative is, it WILL harm Apple’s reputation for being all about privacy.
  • Reply 43 of 162
    Tim Apple has shown himself to be nothing but a fraud when it comes to privacy. Time for him to go…he’s overstayed his welcome.  
  • Reply 44 of 162
    22july2013 Posts: 3,570 (member)
    I'm not saying that I'm for or against this system. I am unbiased for now. I don't have any position yet, I just have some technical and procedural questions.

    1. Why would CSAM or Apple use a hash algorithm that has a 1 in 1,000,000,000,000 (a trillion) chance of a mismatch, when using the MD5 hashing algorithm which is already built into macOS has a 1 in 70,000,000,000,000,000,000,000,000,000 (70 octillion) chance of a mismatch? Yes, I know this is an image hash and not a byte-wise file hash, but even so, why was the image hash algorithm designed with such an amazingly high collision chance? Is it impossible to design an image hash that has an error rate in the octillions? Why did they settle for an error rate as surprisingly large as one in a trillion? I want to know.

    2. What happens if the news media releases a picture of a child in a victimized situation but blurs or hides the indecent parts, in order to help get the public to identify the child, and what if this image gets sent to my iCloud? Is that going to trigger a match? The image has lots in common with the original offending image. Sure, a small part of the image was cropped out, but the CSAM algorithm results in matches even when images are, as they say, "slightly cropped." They said this: "an image that has been slightly cropped or resized should be considered identical to its original and have the same hash." This means it could trigger a match. Am I entitled to know exactly how much of a change will typically cause the match to fail? Or is this something the public is not allowed to learn about?

    3. When Apple detects a matching photo hash, how does Apple (or the US government) take into account the location of the culprit when a match occurs? Suppose the culprit who owns the offending photo resides in, say, Russia. What happens then? The US can't put that person on trial (although since 2004 the US has been putting a growing number of foreign terrorists on trial after dragging them into the US, and then using tenuous links like accessing an American bank account as justification for charging them in a US Federal court.) About 96% of the people in the world do not live in the US, so does that mean a high percentage of the cases will never go to trial? Does the US government send a list of suspects to Vladimir Putin every week when they find Russians who have these illegal images in their iCloud? Or do Russian culprits get ignored? What about Canadian culprits, since Canada is on good terms with the US? Does the US government notify the Canadian government, or does the US wait until the culprit attempts to cross the border for a vacation? I want to see the list of countries that the US government provides its suspect list with. Or is this something the public is not allowed to learn about?

    And now for a couple of hypothetical questions of less urgency but similar importance:

    4. If a friendly country like Canada were to develop its own database, would Apple report Canadian matches to the Canadian authority, or to the US authority, or to both governments? In other words, is Apple treating the US government as the sole arbitrator of this data, or will it support other jurisdictions? 

    5. What happens if an unfriendly country like China builds its own child abuse database, would Apple support that, and then would Apple report to Chinese authorities? And how would Apple know that China hasn't included images of the Tiananmen massacre in its own database?

    And now a final question that comes to my mind:

    Since the crimes Apple is trying to fight occur all over the world, shouldn't the ICC (International Criminal Court) be creating a CSAM database? Frankly, I blame the ICC for not tackling this problem. I'm aware that US Republicans generally oppose the ICC, but Democrats sound much more open to using the ICC. Biden has been silent on the ICC since getting elected, but he has said that his administration: "will support multilateral institutions and put human rights at the center of our efforts to meet the challenges of the 21st century.” Reading between the lines, that sounds like he supports the ICC. And since 70% of Americans support the ICC, according to a poll, maybe this is a popular issue that Biden can hang his hat on.

    My main concern with this whole topic is that there are so many important questions like these that are not being considered.
    1. Based on what I've been told, and what the NCMEC has said, the hash is SHA-256, and the core technology that Microsoft delivered to the NCMEC for it has been SHA-256 for some time. I'm sure there's some wobble and a decrease in the odds because of the "modified image" provision. The one in a trillion appears to be a case of under-promising and over-delivering.

    2. A "slight crop" will be similar as you say. A blur is a noticeable change to the hash. I'm pretty sure there'll be no transparency on this, as was discussed in the piece.

    3. Existing law prevails where the user is located. The system is US-only. The NCMEC isn't transparent about data sources for the hash database.

    4. No idea.

    5. No idea.

    ICC, probably, but it doesn't look like they are.
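
    Much of the confusion in this exchange comes from conflating a cryptographic hash (an exact match on bytes) with a perceptual image hash (tolerant of resizing and mild edits). Here is a minimal sketch of each, using a simple 8x8 average-hash as a stand-in for whatever algorithm NeuralHash/PhotoDNA actually uses; this is not Apple's or NCMEC's code.

    ```python
    # SHA-256 changes completely if one byte changes; a perceptual hash is
    # designed to survive resizing/re-compression. The average-hash below is
    # a stand-in for the real (unpublished) algorithms, purely illustrative.
    import hashlib
    from PIL import Image  # assumes Pillow is installed

    def sha256_of_file(path: str) -> str:
        """Exact-match hash: any modification yields a different digest."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def average_hash(path: str, size: int = 8) -> int:
        """Perceptual hash: similar-looking images yield similar bit patterns."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p >= avg else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        """Small distance between perceptual hashes => likely the same picture."""
        return bin(a ^ b).count("1")
    ```

    With hypothetical files, hamming_distance(average_hash("a.jpg"), average_hash("a_resized.jpg")) would typically be near zero, while the two SHA-256 digests would share nothing; that is why the two kinds of error rate are not directly comparable.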
    I appreciate that you tried to answer my questions. But even the ones you did answer weren't clear. If you are saying that the NCMEC claims to be using SHA256, then why didn't you provide that information in the article, and how do you get an error rate as high as "one in a trillion" if you are using SHA256? That makes no sense. The way you talk about "slight crops" and "blurs" is not at all clear, and is not justified with any evidence, and is therefore not convincing. Your third answer really doesn't tell me anything. "Existing law prevails." What does that mean? Does it mean that the US government will or will not inform the foreign government? You didn't answer that question. How do I get the answer to my question? Are the answers available, or will the government never tell us what they do in these cases?
    There is no simple answer to your follow-on question as it pertains to #3, given treaties and whatnot. Apple has geofencing for all kinds of services like Apple Music, and it stands to reason that they will here too. For now, this is a US system, run in the US, on US phones. US law prevails. If you're from Xistan and in the US, then US law applies as it pertains to the hashing, regardless of whether Xistan's laws are less restrictive. I'm sure the feds won't say what they did specifically if the scenario is more complex.
    It sounds like you are saying if you are outside US borders, you are safe from prosecution (which will be the vast majority of offenders.) But you still aren't saying whether the US government is informed of offenders who are outside US borders. Does the information stop inside Apple's corporate walls, or does it reach the US government? And if the latter, does the US then arrest the offender when they enter US borders? You don't seem to be remotely concerned about the information flow. To me this is a hugely important question and I can't get you concerned about it.

    But I appreciate that you have gone home for today and that's okay. You have no obligation to answer these questions anyway. I appreciate your contributions even if they don't address my concerns. (P.S. I couldn't find the word "xistan" in the article, or in the comments, or in the dictionary, or even by googling it. So I'm guessing it's a made up word for any non-US country. Do you consider Canada to be a "xistan"?)
  • Reply 45 of 162
    yojimbo007 Posts: 1,165 (member)
    ORWELLIAN!
    Besides absolute invasiveness, what do they hope to accomplish with this? It will solve nothing!
    Those who have these tendencies and know that they are being watched will simply circumvent it by using other platforms and avoid iCloud, or Apple altogether.
    Including those who just can't stand the totalitarian approach of big tech and Apple's increasingly invasive walled garden (it feels like a walled China).
    This is a horrific PR/publicity/business move by Apple… it feels Orwellian! It IS ORWELLIAN!

    Don't do it, Tim… this will alienate everyday normal people who value privacy!
    (It's not in the best interest of the shareholders!)
  • Reply 46 of 162
    thrang Posts: 1,008 (member)
    My take is this.

    Apple saw the writing on the wall regarding government-mandated back doors. Doors that would in all likelihood be much more open, with unfettered access to much if not all of your information. Perhaps, behind the scenes, the pressure was growing immense.

    Perhaps they decided to develop a process and technology around very narrow and focused one-way beacons (to avoid a mandated back door), initially to identify illegal and abhorrent possession and behavior. Perhaps evidence of murders, rapes, extortion, or terrorism may be covered by other beacons sent out in the future.

    I know, there will be discussions over who makes the decisions, how it is vetted, errors, misuse, hacking, etc. But essentially, to me, Apple is seeking to control the process around what they see as inevitable laws that they would need to comply with, laws with much worse outcomes for users. One-way beacons that are further vetted to ensure no false positives, in an effort to ameliorate law enforcement concerns while still protecting legitimate private information, are perhaps a very good approach.

    While it feels icky upon first hearing about it, the more you think about it this way (and about what government-enforced alternatives might be), the more it may begin to make sense.

    In terms of what defines when these future beacons will be sent out, Apple will likely ask for pertinent laws to govern such beaconing, leaving it up to elected leaders to clarify/legislate/vote on what kind of content is considered severely harmful and how it is to be legally obtained, and ultimately leaving it up to voters to support or oust those politicians in future elections. So in this case, there is a well-defined hash database for this particular category. Apple then implements an on-device methodology that is designed to keep the rest of your data protected and unavailable for unrelated sniffing about, while beaconing when there is a match.

    As to other categories of hash matching, governments will need to figure that out, which would be subject to immense scrutiny and public debate, I'm sure...

    There are caveats of course, but in principle this is how I see what has happened.
  • Reply 47 of 162
    lkrupp Posts: 10,557 (member)
    Okay, here we go with the misinformation train. I’m watching the 5 o’clock local news in St. Louis. Next story up, after the commercials, “Apple will start scanning your photos, how you could wind up in jail”. Can’t wait to listen to the spin they put on this.
  • Reply 48 of 162
    IreneW Posts: 303 (member)
    IreneW said:
    cpsro said:
    elijahg said:
    Remember that 1 in 1 trillion isn't 1 false positive per 1 trillion iCloud accounts - it's 1 per 1 trillion photos. I have 20,000 photos, that brings the chances I have a falsely flagged photo to 1 in 50 million. Not quite such spectacular odds then.
    One in a trillion over 20,000 photos is not 1 in 50 million. It's one in a trillion, 20,000 times. The odds do not decrease per photo, as your photo library increases in size. There is not a 1:1 guarantee of a falsely flagged photo in a trillion-strong photo library.

    And even if it was, one in 50 million is still pretty spectacularly against.
    One in 50 million with over 1 billion accounts is 200 false positives per year.
    But Apple might claim the false positive rate is per account, not per photo.
    These statistics are, however, under idealized (perfectly random) circumstances and real data typically doesn't look entirely random.
    The false positive rate is probably per account, considering the SHA-256 collision rate is one in 4 * 10^60 processes. 

    That's four novemdecillion hashes. Had to look that one up.
    Is it specified anywhere to be SHA-256? Some article mentioned that this was an algorithm developed and donated by Microsoft, which allows changes in pictures while still providing a match.
    If so, the number may be smaller (or bigger), so let's stay with the facts.
    The facts are, Apple says one in a trillion, and I don't know if it's per picture or per account (edit - see next page; it's one in a trillion per account). NCMEC says the current algo is based on SHA-256, which is the 4*10^60 figure, but it won't be that in practice, given the allowance for changed pictures.

    Somewhere in between the two is the actual number.

    NOW I'm getting up. Talk to you all off and on over the weekend.
    Thanks, have a great weekend!
  • Reply 49 of 162
    citpeks Posts: 246 (member)
    ORWELLIAN!
    Besides absolute invasiveness, what do they hope to accomplish with this? It will solve nothing!
    Those who have these tendencies and know that they are being watched will simply circumvent it by using other platforms and avoid iCloud, or Apple altogether.
    Including those who just can't stand the totalitarian approach of big tech and Apple's increasingly invasive walled garden (it feels like a walled China).
    This is a horrific PR/publicity/business move by Apple… it feels Orwellian! It IS ORWELLIAN!

    Don't do it, Tim… this will alienate everyday normal people who value privacy!
    (It's not in the best interest of the shareholders!)

    If the 1984 commercial was remade today, it would be hard to argue against the notion that Apple would be cast as the figure on the screen, not the hammer thrower.  It has been building up to that for a long time.

    Arguing over the statistics and techniques is missing the point -- privacy means what one keeps inside their box is known only to themselves, and those they share it with. Not to others who will now monitor the box and try to infer what is inside, by whatever means and however supposedly accurate they may be. The cops won't be going down the street knocking on every door and looking inside without reason, but they will be driving by with a scanner, sniffing for patterns and treating that as a proxy for probable cause.

    It would be naive to think that the rulers in certain countries, especially one powerful one with a market too large to ignore, aren't rubbing their hands in glee over having another potential tool to apply to their existing surveillance state.  Want access to our market?  Our laws say that requires measures, of our choosing, using our standards, to ensure the safety and security of the people.

    Others, like the media cartel, may be thinking, "Fingerprints? Yeah, we got those. How can we make better use of them?" Hmmm.

    Apple has climbed the mountain, and has stood firmly at the top for a while. But get too close to the edge, and it's a long way down.

    Disheartening to have to concede that Scott McNealy was right, all along.
  • Reply 50 of 162
    ireland Posts: 17,798 (member)
    Privacy nightmare. Ripe for abuse by bad actors, governments, and even future Apple execs. The worst idea to ever come out of Cupertino.
  • Reply 51 of 162
    netrox Posts: 1,418 (member)
    The fact that people are freaking out over the implementation of Apple's detection of child pornography shows that they have absolutely no ability to think logically and understand that they are at no risk if they don't distribute child pornography.

    It's as simple as that. Literally.

    Also, the government has your ID, your SSN, your taxes, your public records, your passports, your registrations, your work history, your residence, and so on. You ain't as private as you think you are.

  • Reply 52 of 162
    Rayz2016 Posts: 6,957 (member)
    ireland said:
    Privacy nightmare. Ripe for abuse by bad actors, governments, and even future Apple execs. The worst idea to ever come out of Cupertino.
    Worse than the hockey puck mouse?

    What I’m seeing here is folk ignoring the danger to give Apple a free pass. 

    False positives: won’t happen. 

    Coercion by authoritarian governments:  invalid slippery slope arguments.  The reason it’s called the slippery slope is that once you’re on it then you will inevitably slide to the bottom. The only way to avoid this is to stay off the slope to begin with. 

    So instead of trying to ignore the bad authoritarian actor argument, perhaps the supporters would care to explain how it won’t happen. They won’t; all they will say is that the lynching of homosexuals is unfortunate, but Apple has to respect the laws in that country, conveniently forgetting that Apple introduced the system to make the lynching possible. 

    But Google already does it: and when I become a Google customer/product, then I’ll post the same argument on GoogleInsider. 

    The supporters of this move focus on the stats: what are the chances of you getting trapped in a false positive. They point out that the government already knows everything about you anyway, so what’s the difference if Apple rifles through your personal photos, accuses you of being a pervert and demands that you prove otherwise. 

    But the heartening thing is that most people arguing against this are not concerned about themselves (most probably because they're not the perverts Apple assumes they are); they're concerned about the dissidents, the religious outcasts and the LGBTQ+ community living under oppressive regimes who will be persecuted with Apple's help.



  • Reply 53 of 162
    crowley Posts: 10,453 (member)
    Rayz2016 said:

    So instead of trying to ignore the bad authoritarian actor argument, perhaps the supporters would care to explain how it won’t happen. They won’t; all they will say is that the lynching of homosexuals is unfortunate, but Apple has to respect the laws in that country, conveniently forgetting that Apple introduced the system to make the lynching possible. 
    What database of homosexual imagery exists that Apple would be able to use a similar process to detect them?

    And are homosexual media circles even a thing?

    It doesn't seem like there's a whole lot of scope there anyway, even were Apple to be asked to do such a thing and accept it.
  • Reply 54 of 162
    Rayz2016 Posts: 6,957 (member)
    crowley said:
    Rayz2016 said:

    So instead of trying to ignore the bad authoritarian actor argument, perhaps the supporters would care to explain how it won’t happen. They won’t; all they will say is that the lynching of homosexuals is unfortunate, but Apple has to respect the laws in that country, conveniently forgetting that Apple introduced the system to make the lynching possible. 
    What database of homosexual imagery exists that Apple would be able to use a similar process to detect them?

    And are homosexual media circles even a thing?

    It doesn't seem like there's a whole lot of scope there anyway, even were Apple to be asked to do such a thing and accept it.

    Did I say database? My mistake, because it doesn't even have to be a database.

    Our favourite authoritarian regime just has to insert the checksums into their "wanted  list" and no one would be the wiser. Apple doesn't know what they're searching for, and Apple's customers running the matching software don't know what's being searched for.

    Wait a second.

    A piece of software running on your phone that scans your files, builds a profile and then downloads it to a server?

    Last week we were all calling that malware.

    S'funny how quickly things change.
  • Reply 55 of 162
    Marvin Posts: 15,320 (moderator)
    citpeks said:
    ORWELLIAN!
    Besides absolute invasiveness, what do they hope to accomplish with this? It will solve nothing!
    Those who have these tendencies and know that they are being watched will simply circumvent it by using other platforms and avoid iCloud, or Apple altogether.
    Including those who just can't stand the totalitarian approach of big tech and Apple's increasingly invasive walled garden (it feels like a walled China).
    This is a horrific PR/publicity/business move by Apple… it feels Orwellian! It IS ORWELLIAN!

    Don't do it, Tim… this will alienate everyday normal people who value privacy!
    (It's not in the best interest of the shareholders!)
    If the 1984 commercial was remade today, it would be hard to argue against the notion that Apple would be cast as the figure on the screen, not the hammer thrower.  It has been building up to that for a long time.

    Arguing over the statistics and techniques is missing the point -- privacy means what one keeps inside their box is known only to themselves, and those they share it with.  Not to others who will now monitor the box and try to infer what is inside, by whatever means and however supposedly accurate they may be.
    It's not the user's box that's the issue, it's Apple's box (iCloud), which the user is sharing the contents with. Apple doesn't want this content on their servers.

    https://techcrunch.com/2021/08/05/apple-icloud-photos-scanning

    "Apple said the feature is technically optional in that you don’t have to use iCloud Photos, but will be a requirement if users do. After all, your device belongs to you but Apple’s cloud does not."

    They could disable encryption and do the scanning on their end but that would mean Apple employees have access to every customer's private photos.

    This won't just be coming from Apple but from law enforcement pressuring Apple. Their resources are being strained dealing with this:

    https://www.theguardian.com/global-development/2021/feb/09/exclusive-rise-in-child-abuse-images-online-threatens-to-overwhelm-uk-police-officers-warn

    Their image database has 17m images and is growing by 500,000 every 2 months. Children are sending images of themselves to people exploiting them while their parents are in another room, unaware of it.

    The arguments about the slippery slope are really what-ifs, because there's not an inevitable descent here. What if Apple implemented something much worse? Then it would be much worse. Right, but they haven't, and there aren't many images that can be scanned for in this way. It's not like they'd scan for 'images that look like Winnie the Pooh'. Some historical images like Tiananmen Square could be, but again, Apple has no reason to flag these.

    If a government forced Apple to do something completely contrary to their values, they could easily shut down the service in that region. There are cases where they've complied with local laws, which are against their values:

    https://www.fightforthefuture.org/news/2021-06-14-apple-is-enabling-censorship-of-lgbtq-apps-in-152/

    but if there was to be a database for LGBTQ image content that was censored, it would impact way more than Apple. This isn't an Apple-exclusive issue, it affects all online services.
  • Reply 56 of 162
    crowley Posts: 10,453 (member)
    Rayz2016 said:
    crowley said:
    Rayz2016 said:

    So instead of trying to ignore the bad authoritarian actor argument, perhaps the supporters would care to explain how it won’t happen. They won’t; all they will say is that the lynching of homosexuals is unfortunate, but Apple has to respect the laws in that country, conveniently forgetting that Apple introduced the system to make the lynching possible. 
    What database of homosexual imagery exists that Apple would be able to use a similar process to detect them?

    And are homosexual media circles even a thing?

    It doesn't seem like there's a whole lot of scope there anyway, even were Apple to be asked to do such a thing and accept it.
    Did I say database? My mistake, because it doesn't even have to be a database.

    Our favourite authoritarian regime just has to insert the checksums into their "wanted  list" and no one would be the wiser. Apple doesn't know what they're searching for, and Apple's customers running the matching software don't know what's being searched for.
    So now the authoritarian regime is doing this surreptitiously?  That's a lot of effort to go to in order to ensure a reviewer at Apple gets sent a bundle of pictures of dissidents that they will immediately record as not child abuse.

  • Reply 57 of 162
    Rayz2016 Posts: 6,957 (member)
    Delightful ad hominem and continued slippery slope. You asked about my kids yesterday in a poorly thought-out example that didn't apply, and I care more about them, than I do a hypothetical Apple contractor who hypothetically has PTSD and is hypothetically being treated poorly by Apple. And there's a lot of could, maybe, and in the future here.

    Did you even read this article to its conclusion, or did you just start commenting? I feel like if you had read the article, that last paragraph would be a little different.

    You've been complaining about Apple hardware and software for a long time here. Some of it is warranted and some of it is not. If this is so unacceptable to you, take the final steps and go. That simple.

    There is no extreme danger from this system as it stands. As with literally everything else in this world, there is good and bad, and we've talked about it here. Every decision we all make weighs the good and the bad.

    You're welcome to believe that there is extreme danger based on hypotheticals. When and if it develops, we'll write about it.
    I will just cut to the chase. The article you wrote above is mostly apologist in nature. Your strongest argument is that it is OK if Apple does this because other companies have been doing it for years.

    You didn't address the elephant in the room: Apple has been selling itself and its products on the idea that they will keep our private data private. They stated or strongly implied that all of our data would be strongly encrypted and that governments and hackers would not be able to access it and that even Apple could not access it due to the strength of the encryption. All of those statements and implied promises were false if encrypted data from our phones is unencrypted when it is stored in iCloud. The promise of privacy is turned into a farce when Apple itself violates it and scans our data without our permission. Yes other companies have been doing things like this with our data for years but that's why we were buying Apple products, right?

    I could pick apart your article a point at a time but it would be tedious. Example: Our ISPs can scan our data for IP content. Yes but we can and do use VPNs to work around that issue. Heck most employers require the use of a VPN to keep their company secrets secret (I bet Apple does too).

    BTW I hope you appreciate that I incorporated your arguments into my own. I did read what you wrote. I just happen to completely disagree with it.

    In one of your other posts (at least I think it was you), you said that Apple was doing this so it could bring in E2E encryption across its whole service.

    Okay, if you insist, but what would be the point?

    They're installing malware software that reads (and yes, I'm afraid it is reading) your files, compiles some data and sends it off to another server without giving you the chance to see what's being sent. This, presumably, has to be done before the file is encrypted and sent up to iCloud. So they've already nullified any reason to encrypt the data, so it's pretty much a waste of processor cycles. Since this piece of reading software can now be expanded through any country's request, then why even bother?

    Except of course, there is the possibility that folk will encrypt stuff themselves before they send it to iCloud, which Apple can now circumvent. If they're happy to scan photos on-device, then what's to stop them scanning other files on-device? Folk say that if you don't like it then don't use iCloud Photos. Problem is that it's not iCloud Photos that's doing the scanning – it's the software on your device; that's the back door.  A back door with a very weak lock. A back door with a very weak lock that the Saudis, the Russians, the Chinese and the UK will have a key to as soon as they ask. It's a back door that opens out into a wide open garden simply by expanding the data that it scans and the server it reports to, and this can be done without telling anyone because the code isn't open source.

    Now, Google, Microsoft and Twitter have been scanning images for years, but I can't find any references (which may be my poor BingFu)  that they install software on devices that scans your data and then uploads it to another server. From what I can tell, they scan images once they hit the servers then report them. 


  • Reply 58 of 162
    Rayz2016 Posts: 6,957 (member)
    Delightful ad hominem and continued slippery slope. You asked about my kids yesterday in a poorly thought-out example that didn't apply, and I care more about them, than I do a hypothetical Apple contractor who hypothetically has PTSD and is hypothetically being treated poorly by Apple. And there's a lot of could, maybe, and in the future here.

    Did you even read this article to its conclusion, or did you just start commenting? I feel like if you had read the article, that last paragraph would be a little different.

    You've been complaining about Apple hardware and software for a long time here. Some of it is warranted and some of it is not. If this is so unacceptable to you, take the final steps and go. That simple.

    There is no extreme danger from this system as it stands. As with literally everything else in this world, there is good and bad, and we've talked about it here. Every decision we all make weighs the good and the bad.

    You're welcome to believe that there is extreme danger based on hypotheticals. When and if it develops, we'll write about it.
    I will just cut to the chase. The article you wrote above is mostly apologist in nature. Your strongest argument is that it is OK if Apple does this because other companies have been doing it for years.


    It's interesting that AI is happy to deny the hypotheticals when 95% of their content is based around rumours, conjecture, the tealeaf readings of untalented analysts and the ramblings of the well-connected Ming-whatever.

    Their whole business is based on hypotheticals.

    But now, hypotheticals are a problem?
  • Reply 59 of 162
    crowley Posts: 10,453 (member)
    Rayz2016 said:

    They're installing malware software that reads (and yes, I'm afraid it is reading) your files, compiles some data and sends it off to another server without giving you the chance to see what's being sent. 
    The boolean fact that a photo has been flagged?  If you don't hit a warning level it doesn't even matter.
    Rayz2016 said:

    This, presumably, has to be done before the file is encrypted and sent up to iCloud. So they've already nullified any reason to encrypt the data, so it's pretty much a waste of processor cycles.
    What?  The reason to encrypt the data is exactly the same as it ever was.  Creating a hash to compare against a CSAM list of hashes doesn't change a thing.
    Rayz2016 said:

    Since this piece of reading software can now be expanded through any country's request, then why even bother?
    Not automatically it can't.  Apple have only said that they will consider doing this on a case-by-case basis.  
    Rayz2016 said:

    Except of course, there is the possibility that folk will encrypt stuff themselves before they send it to iCloud, which Apple can now circumvent. If they're happy to scan photos on-device, then what's to stop them scanning other files on-device?
    Widespread outrage? If Apple are willing to disclose that they're doing CSAM scanning, and take the resultant flak for it, then we can have confidence that any expansion to other areas will also be disclosed. At which point you can stop using an iPhone if you want.

    The crucial point here is that they're not doing this.  They're only doing this for photos that are uploaded to iCloud, and only as a CSAM check.
    Rayz2016 said:

    it's the software on your device; that's the back door.  A back door with a very weak lock. 
    Citation needed.

    Rayz2016 said:

    A back door with a very weak lock that the Saudis, the Russians, the Chinese and the UK will have a key to as soon as they ask. It's a back door that opens out into a wide open garden simply by expanding the data that it scans and the server it reports to, and this can be done without telling anyone because the code isn't open source.
    It's software that creates a hash from a photo and compares it against other hashes. Why do you think that's infinitely expandable in any particular way by foreign governments? If foreign governments have access to your phone to such a degree that they can modify a deep-rooted piece of Apple code, then they're already in a position to install any kind of monitoring that they want. And if you think Apple will do it for them for any request they dream up, and implement it surreptitiously, then I don't think you know Apple very well.

    Apple could change any code without anyone noticing, because very little of their code is open source. The fact that they've publicly announced that they're doing this should give you reassurance that Apple is going about this in the right way.
    Rayz2016 said:

    Now, Google, Microsoft and Twitter have been scanning images for years, but I can't find any references (which may be my poor BingFu)  that they install software on devices that scans your data and then uploads it to another server. From what I can tell, they scan images once they hit the servers then report them. 
    Which Apple can't do if photos are user-encrypted in iCloud, which I imagine is the plan for the future.  They're doing the exact same thing as the others, just on the client end.  Absolutely no difference of any significance.
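
    A rough sketch of the ordering being described here, with everything (the names, the stand-in hash and encryption functions) invented for illustration: the match check happens on-device alongside the upload, and the photo bytes can still be encrypted exactly as before. This is not Apple's voucher format or protocol.

    ```python
    # Illustrative only: computing a match result on the client is
    # independent of encrypting the photo for upload. Every name is made up.
    from dataclasses import dataclass
    from typing import Callable, Set

    @dataclass
    class SafetyVoucher:
        match_blob: bytes        # opaque to the server until a threshold is crossed
        photo_ciphertext: bytes  # the photo, encrypted as it would be anyway

    def prepare_upload(photo: bytes,
                       known_hashes: Set[int],
                       perceptual_hash: Callable[[bytes], int],
                       encrypt: Callable[[bytes], bytes]) -> SafetyVoucher:
        h = perceptual_hash(photo)        # computed on-device
        matched = h in known_hashes       # the same comparison a server would make
        # In the real design the result is hidden inside a PSI/threshold
        # secret-sharing voucher; here it is just an encrypted flag.
        return SafetyVoucher(
            match_blob=encrypt(bytes([int(matched)])),
            photo_ciphertext=encrypt(photo),
        )
    ```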
  • Reply 60 of 162
    amar99 Posts: 181 (member)
    Last Apple product I purchase, last software update I install. Now that they're in bed with the government, their politically-charged "excuse" for invading every user's privacy cannot outweigh the long-term consequences. Governments want in, and Apple has just given them an avenue. The future is dark for Apple if they go down this path of complicity.