Apple urges UK to rethink anti-encryption Online Safety Bill

Posted:
in General Discussion

Apple has denounced the UK's Online Safety Bill's kneecapping of end-to-end encryption as a "serious threat" to citizens, and is trying to make the UK government think twice about the changes.

UK Houses of Parliament

The Online Safety Bill, currently before the UK Parliament, could force online messaging services that use encryption to scan for images of child abuse. As part of wider criticism of the bill's intentions, Apple has publicly objected to the proposed law.

The bill reasons that law enforcement is not capable of identifying child sexual abuse material being shared across online messaging services like iMessage, due to the implementation of end-to-end encryption. Therefore, the law would empower regulator Ofcom to order such platforms to scan the contents of messages.

However, to accomplish that, there has to be a weakening of end-to-end encryption itself, making it less secure and eliminating the whole point of using the technique for privacy in the first place.
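The conflict the article describes can be seen in a toy sketch: under end-to-end encryption, the relaying server only ever handles ciphertext, so any content scanning has to happen on an endpoint that holds the plaintext or the key. This is an illustration only, not real cryptography — actual messengers use authenticated key exchange and ciphers like AES, not a one-time XOR:

```python
# Toy model of end-to-end encryption (NOT real cryptography):
# sender and recipient share a key; the relaying server does not.
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with a one-time key of equal length."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))   # known only to the two endpoints

ciphertext = xor_cipher(message, key)     # all the server ever sees or stores

# Only an endpoint holding the key can recover (and therefore scan) the content.
assert xor_cipher(ciphertext, key) == message
```

Any Ofcom-ordered scanning would therefore have to run on the endpoints themselves, which is exactly the weakening critics object to.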

"End-to-end encryption is a critical capability that protects the privacy of journalists, human rights activists, and diplomats," an Apple statement received by the BBC on Tuesday reads. "It also helps everyday citizens defend themselves from surveillance, identity theft, fraud, and data breaches."

The statement continues: "The Online Safety Bill poses a serious threat to this protection, and could put UK citizens at greater risk. Apple urges the government to amend the bill to protect strong end-to-end encryption for the benefit of all."

Apple's statement coincides with an open letter sent by the Open Rights Group to Chloe Smith, the Secretary of State for Science, Innovation, and Technology.

Signed by over 80 civil society organizations and academics, the letter argues that "The UK could become the first liberal democracy to require the routine scanning of people's private chat messages, including chats that are secured by end-to-end encryption" if the bill becomes law.

"As over 40 million UK citizens and 2 billion people worldwide rely on these services, this poses a significant risk to the security of digital communication services not only in the UK, but also internationally," the letter warns.

With its statement against the Online Safety Bill, Apple joins other messaging providers opposed to it. Meta-owned WhatsApp told the BBC it refuses to weaken its encrypted systems, while Signal said in February that it would "walk" from the UK if ordered to do the scanning.

While Apple is against the bill, it has previously attempted something in the ballpark of what the bill would require. Its 2021 attempt to introduce on-device scanning of images as a child protection measure was praised by the UK government, but was ultimately killed off by Apple in December 2022.

Read on AppleInsider


Comments

  • Reply 1 of 21
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.
  • Reply 2 of 21
    MalcolmOwen Posts: 28 member, editor
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.
    I think the tsunami of criticism and complaints that Apple had after mentioning it the first time may have something to do with it...
  • Reply 3 of 21
    anonymouse Posts: 6,861 member
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.
    Because that's completely irrelevant to the issue of legislation outlawing end-to-end encryption. I'm not sure why this would cause confusion.
  • Reply 4 of 21
    Osbert Posts: 2 member
    Benjamin Franklin once said: "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."

    Those that pretend not to understand are the ones that wish to do great harm to society.
    Society is not easy, and Governments should not be allowed to infringe upon any citizen's, or even a non-citizen's, right to privacy.

    SO GLAD we left UK in 1648.

    !!! NOBODY !!! SHOULD HAVE ALL THE KEYS TO ALL THE CASTLES !!!! - nobody.

    Absolute power brings absolute corruption!
  • Reply 5 of 21
    coolfactor Posts: 2,248 member
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.

    1. Apple develops a safe and secure means of helping to address the concerns around distribution of illegal/damaging material through their network (iCloud).
    2. Everyone freaks out that Apple staff will be looking at every photo that they take — which is not true.
    3. Apple agrees to not release the feature.
    4. The government proposes a solution that directly violates the integrity of user privacy.
    5. Citizens have no choice but to obey the law put in place by their elected officials.


    People are weird. Apple had the better solution all along.


    edited June 2023
  • Reply 6 of 21
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.
    Because that's completely irrelevant to the issue of legislation outlawing end-to-end encryption. I'm not sure why this would cause confusion.
    The legislation described here would not outlaw end-to-end encryption outright; it would only require that CSAM be detectable and reportable. Apple's implementation would satisfy the law without breaking encryption. The purpose of the proposed law is detecting CSAM, not breaking encryption.
    I'm not sure why this would cause confusion.
  • Reply 7 of 21
    xyzzy-xxx Posts: 185 member
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.

    1. Everyone freaks out that Apple staff will be looking at every photo that they take — which is not true.
    You missed the point – Apple had developed technology (not Apple staff) that was supposed to analyse images in iOS 15. Massive protest prevented Apple from introducing the technology, and it was not until iOS 16 that Apple declared it would not be used.
  • Reply 8 of 21
    anonymouse Posts: 6,861 member
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.
    Because that's completely irrelevant to the issue of legislation outlawing end-to-end encryption. I'm not sure why this would cause confusion.
    The legislation described here would not outlaw end-to-end encryption outright; it would only require that CSAM be detectable and reportable. Apple's implementation would satisfy the law without breaking encryption. The purpose of the proposed law is detecting CSAM, not breaking encryption.
    I'm not sure why this would cause confusion.
    The purpose of the law is ending end-to-end encryption. Child sex abuse is simply being used as the most emotionally compelling rationale for doing so. Apparently this does result in a lot of confusion.

    Ironically, victims of child sex abuse are being exploited by lawmakers who want to end privacy.
  • Reply 9 of 21
    Osbert said:
    Benjamin Franklin once said: "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."

    Those that pretend not to understand are the ones that wish to do great harm to society.
    Society is not easy, and Governments should not be allowed to infringe upon any citizen's, or even a non-citizen's, right to privacy.

    SO GLAD we left UK in 1648.

    !!! NOBODY !!! SHOULD HAVE ALL THE KEYS TO ALL THE CASTLES !!!! - nobody.

    Absolute power brings absolute corruption!

    The quote, "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety," is popularly understood as a declaration about the importance of civil liberties and as a warning against giving up freedom in exchange for security.

    However, the historical context of the quote is a bit more nuanced. It comes from a 1755 letter that Franklin, then serving as a Pennsylvania assemblyman, wrote on behalf of the Pennsylvania Assembly to the colonial governor during a time of frontier war. The letter was a response to the governor's refusal to allow the Assembly to tax the lands of the Penn family, who ruled Pennsylvania from afar, for war expenses. Instead, the governor proposed that the Assembly give up its power to tax in exchange for funds to ensure frontier security. In this context, "essential liberty" refers to the Assembly's power to levy taxes, and "a little temporary safety" pertains to the financial aid for frontier defense.

    Thus, Franklin's quote was a criticism of the governor's proposal and a defense of the Assembly's political power. Over time, the quote has been abstracted from its specific historical context and widely used in discussions about civil liberties, privacy, and security.

    I highly suggest you read Edward Snowden's book before criticizing other countries' alleged lack of "freedom". 

    edited June 2023
  • Reply 10 of 21
    Osbert said:
    Benjamin Franklin once said: "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."

    Those that pretend not to understand are the ones that wish to do great harm to society.
    Society is not easy, and Governments should not be allowed to infringe upon any citizen's, or even a non-citizen's, right to privacy.

    SO GLAD we left UK in 1648.

    !!! NOBODY !!! SHOULD HAVE ALL THE KEYS TO ALL THE CASTLES !!!! - nobody.

    Absolute power brings absolute corruption!

    The quote, "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety," is popularly understood as a declaration about the importance of civil liberties and as a warning against giving up freedom in exchange for security.

    However, the historical context of the quote is a bit more nuanced. It comes from a 1755 letter that Franklin, then serving as a Pennsylvania assemblyman, wrote on behalf of the Pennsylvania Assembly to the colonial governor during a time of frontier war. The letter was a response to the governor's refusal to allow the Assembly to tax the lands of the Penn family, who ruled Pennsylvania from afar, for war expenses. Instead, the governor proposed that the Assembly give up its power to tax in exchange for funds to ensure frontier security. In this context, "essential liberty" refers to the Assembly's power to levy taxes, and "a little temporary safety" pertains to the financial aid for frontier defense.

    Thus, Franklin's quote was a criticism of the governor's proposal and a defense of the Assembly's political power. Over time, the quote has been abstracted from its specific historical context and widely used in discussions about civil liberties, privacy, and security.

    I highly suggest you read Edward Snowden's book before criticizing other countries' alleged lack of "freedom". 

    Thank you, that is invaluable context. The allusion to the rights of a State rather than the rights of the Union is worth keeping in mind as well.
  • Reply 11 of 21

    Signed by over 80 civil society organizations and academics, the group believes "The UK could become the first liberal democracy to require the routine scanning of people's private chat messages, including chats that are secured by end-to-end encryption" if the bill becomes law. 

    Wasn't the UK also the first liberal democracy to adopt the widespread use of video surveillance of public spaces? I doubt this argument will sway the thinking of the powers that be. The legal framework is such that the state grants rights to the citizens rather than the citizens grant rights to the state (which was the original purpose of the US constitution).
  • Reply 12 of 21
    davidw Posts: 2,057 member
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.

    1. Apple develops a safe and secure means of helping to address the concerns around distribution of illegal/damaging material through their network (iCloud).
    2. Everyone freaks out that Apple staff will be looking at every photo that they take — which is not true.
    3. Apple agrees to not release the feature.
    4. The government proposes a solution that directly violates the integrity of user privacy.
    5. Citizens have no choice but to obey the law put in place by their elected officials.


    People are weird. Apple had the better solution all along.



    Number 2 is wrong.

    iOS users "freaked out" because the CSAM search was to be performed on the user's device, before encryption, and then sent through Apple's servers. That would be a violation of the US Constitution's Fourth Amendment protections regarding privacy and unreasonable search. The Supreme Court has ruled many times that a cell phone has the same privacy protection as a home or a car. Therefore, like a home or car, a court search warrant would have to be obtained before any search, except in emergency situations.


    Everyone knew the staff was not going to be looking at every photo. Everyone already knew Apple was implementing CSAM detection on users' iCloud photo libraries, just as Microsoft, Google, Facebook, and others scan the photo libraries on their servers. No matter how much Apple tried to explain that it was only searching for matching CSAM "hashes" and not looking at any photos, it could not justify violating users' Fourth Amendment rights by performing such a search on the users' own devices.

    Plus, Apple's software had the capability of detecting adult material, which a parent could use to control a minor's account. Such photos and words would be blurred out on the minor's account and the parent would be informed. So Apple's CSAM software was much more capable than just searching for matching CSAM "hashes": it could scan all photos looking for adult images.

    Number 4 is also wrong.

    The US government's solution does not violate any user's constitutional right to privacy, as it involves the search being done on third-party servers. There is no expectation of privacy when one allows a third party access to one's data. It's just that the government cannot make it a law that a third party must perform the search; otherwise the third party becomes an agent of the government and must protect its users' constitutional rights. But it is within a third party's rights to "voluntarily" search for material it does not want stored on its servers. Thus there is no real violation of privacy if a third party "voluntarily" scans messages while they are on its servers. They just can't advertise that they have end-to-end encryption, which is fine with practically all governments.

    And in reality, Apple performing a CSAM search before the message gets encrypted is not true end-to-end encryption. End-to-end encryption means that only you and the person you're sending the message to know what is contained (or not contained) in the message. If Apple gets to scan the message for CSAM before encryption, Apple knows whether your message contains CSAM, so it would not be true end-to-end encryption by any definition.

  • Reply 13 of 21
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.
    Because that's completely irrelevant to the issue of legislation outlawing end-to-end encryption. I'm not sure why this would cause confusion.
    The legislation described here would not outlaw end-to-end encryption outright; it would only require that CSAM be detectable and reportable. Apple's implementation would satisfy the law without breaking encryption. The purpose of the proposed law is detecting CSAM, not breaking encryption.
    I'm not sure why this would cause confusion.
    The purpose of the law is ending end-to-end encryption. Child sex abuse is simply being used as the most emotionally compelling rationale for doing so. Apparently this does result in a lot of confusion.

    Ironically, victims of child sex abuse are being exploited by lawmakers who want to end privacy.
    You are unequivocally and absolutely incorrect. Did you read the damn bill? No you didn't. Because it doesn't ONCE mention encryption. It literally even spells out ways that a company can decide for itself how to accomplish the task. For instance, it would allow Apple to satisfy this law with the CSAM detection service it has already developed, which does not break encryption, as I stated from the beginning.
    You literally have no idea what you're talking about. You're just typing BS from your emotional response to this bill, which again, is NOT aimed at encryption. 
    Damn you're so confused, aren't you? Well I guess maybe not so much confused as completely lacking any actual knowledge of this topic whatsoever.
    edited June 2023
  • Reply 14 of 21
    chutzpah Posts: 392 member
    Osbert said:
    Benjamin Franklin once said: "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."

    Those that pretend not to understand are the ones that wish to do great harm to society.
    Society is not easy, and Governments should not be allowed to infringe upon any citizen's, or even a non-citizen's, right to privacy.

    SO GLAD we left UK in 1648.

    !!! NOBODY !!! SHOULD HAVE ALL THE KEYS TO ALL THE CASTLES !!!! - nobody.

    Absolute power brings absolute corruption!
    1648?  The English Civil War?  Or Westphalia?  Who left the UK in 1648?

    Also, calm down.
  • Reply 15 of 21
    I'm a bit confused as to why Apple would be so against this when they already have a ready-to-go service (that they voluntarily developed themselves) that would scan for such images while keeping end-to-end encryption intact.

    Because that's not the technology Apple developed.  What Apple was going to implement was scanning images on the user's machine, before they were transmitted to iCloud or a messaging service, against a curated list of CSAM hash values.  A noble-sounding goal, to be sure, but I think Apple overestimated people's willingness to have their phones hijacked to do that work, and underestimated their skepticism about any given government's willingness to subvert the "curated" list of hash values and add images unrelated to CSAM that it finds objectionable.

    Since Apple doesn't control every end-to-end encrypted messaging service, complying with this law would require every device manufacturer to implement a system similar to Apple's, or the developers of the messaging services to allow decryption at some stage to scan for images.
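For what it's worth, the general shape of the on-device matching described above can be sketched in a few lines. This is a deliberate simplification: Apple's actual system used a perceptual hash ("NeuralHash") that tolerates re-encoding, plus threshold secret sharing, whereas plain SHA-256, used here as a stand-in, only catches byte-identical files. The names `flag_before_upload` and `CURATED_DIGESTS` are hypothetical:

```python
# Sketch of client-side hash-list matching: digest each outgoing file and
# compare it against a curated list of known digests BEFORE encryption/upload.
import hashlib

# Hypothetical curated list, distributed to devices as hex digests.
# (This entry is sha256(b"test"), standing in for a real database entry.)
CURATED_DIGESTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def flag_before_upload(file_bytes: bytes) -> bool:
    """Return True if the file's digest matches an entry in the curated list."""
    return hashlib.sha256(file_bytes).hexdigest() in CURATED_DIGESTS

assert flag_before_upload(b"test") is True          # byte-identical match
assert flag_before_upload(b"holiday photo") is False
```

The commenters' concern maps directly onto `CURATED_DIGESTS`: the device owner cannot inspect what the opaque digests actually represent, so whoever controls the list controls what gets flagged.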
  • Reply 16 of 21
    The purpose of the law is ending end-to-end encryption. Child sex abuse is simply being used as the most emotionally compelling rationale for doing so. Apparently this does result in a lot of confusion.

    Ironically, victims of child sex abuse are being exploited by lawmakers who want to end privacy.
    You are unequivocally and absolutely incorrect. Did you read the damn bill? No you didn't. Because it doesn't ONCE mention encryption. It literally even spells out ways that a company can decide for itself how to accomplish the task. For instance, it would allow Apple to satisfy this law with the CSAM detection service it has already developed, which does not break encryption, as I stated from the beginning.
    You literally have no idea what you're talking about. You're just typing BS from your emotional response to this bill, which again, is NOT aimed at encryption. 
    Damn you're so confused, aren't you? Well I guess maybe not so much confused as completely lacking any actual knowledge of this topic whatsoever.

    No, anonymouse is exactly correct.  Of course the law isn't explicitly purposed to end encryption, that would expose their agenda.  But whatever the stated purpose, the intended effect is to eliminate end-to-end encryption, or weaken it unto uselessness, so that governments can have access to citizens' private communication, "just in case", and "for their own protection".  CSAM is just the excuse they're using to justify it.
  • Reply 17 of 21
    The purpose of the law is ending end-to-end encryption. Child sex abuse is simply being used as the most emotionally compelling rationale for doing so. Apparently this does result in a lot of confusion.

    Ironically, victims of child sex abuse are being exploited by lawmakers who want to end privacy.
    You are unequivocally and absolutely incorrect. Did you read the damn bill? No you didn't. Because it doesn't ONCE mention encryption. It literally even spells out ways that a company can decide for itself how to accomplish the task. For instance, it would allow Apple to satisfy this law with the CSAM detection service it has already developed, which does not break encryption, as I stated from the beginning.
    You literally have no idea what you're talking about. You're just typing BS from your emotional response to this bill, which again, is NOT aimed at encryption. 
    Damn you're so confused, aren't you? Well I guess maybe not so much confused as completely lacking any actual knowledge of this topic whatsoever.

    No, anonymouse is exactly correct.  Of course the law isn't explicitly purposed to end encryption, that would expose their agenda.  But whatever the stated purpose, the intended effect is to eliminate end-to-end encryption, or weaken it unto uselessness, so that governments can have access to citizens' private communication, "just in case", and "for their own protection".  CSAM is just the excuse they're using to justify it.
    Ah, I see you're employing the Slippery Slope Fallacy here. Moving on.
    edited June 2023
  • Reply 18 of 21
    The purpose of the law is ending end-to-end encryption. Child sex abuse is simply being used as the most emotionally compelling rationale for doing so. Apparently this does result in a lot of confusion.

    Ironically, victims of child sex abuse are being exploited by lawmakers who want to end privacy.
    You are unequivocally and absolutely incorrect. Did you read the damn bill? No you didn't. Because it doesn't ONCE mention encryption. It literally even spells out ways that a company can decide for itself how to accomplish the task. For instance, it would allow Apple to satisfy this law with the CSAM detection service it has already developed, which does not break encryption, as I stated from the beginning.
    You literally have no idea what you're talking about. You're just typing BS from your emotional response to this bill, which again, is NOT aimed at encryption. 
    Damn you're so confused, aren't you? Well I guess maybe not so much confused as completely lacking any actual knowledge of this topic whatsoever.

    No, anonymouse is exactly correct.  Of course the law isn't explicitly purposed to end encryption, that would expose their agenda.  But whatever the stated purpose, the intended effect is to eliminate end-to-end encryption, or weaken it unto uselessness, so that governments can have access to citizens' private communication, "just in case", and "for their own protection".  CSAM is just the excuse they're using to justify it.
    Ah, I see you're employing the Slippery Slope Fallacy here. Moving on.
    Governments absolutely despise end-to-end encryption, and even western governments have tried for years to use "think of the children (or the terrorists)" arguments to end it. Some of their reasons for despising it are arguably valid, as E2EE takes away a significant intelligence-gathering tool that can be used to stop terrorist plots, or to figure out how to prosecute somebody (or understand whether anyone else was involved) after an incident such as a mass shooting. But much of this is also the banal interest of the state in gathering data because that's what it has always done and always wants to be able to do, and because digital technologies made this so much easier to do at scale than in the paper days; they're addicted to it and really don't want to lose it. 
  • Reply 19 of 21
    anonymouseanonymouse Posts: 6,861member
    The purpose of the law is ending end-to-end encryption. Child sex abuse is simply being used as the most emotionally compelling rationale for doing so. Apparently this does result in a lot of confusion.

    Ironically, victims of child sex abuse are being exploited by lawmakers who want to end privacy.
    You are unequivocally and absolutely incorrect. Did you read the damn bill? No you didn't. Because it doesn't ONCE mention encryption. It literally even spells out ways that a company can decide for itself how to accomplish the task. For instance, it would allow Apple to satisfy this law with the CSAM detection service it has already developed, which does not break encryption, as I stated from the beginning.
    You literally have no idea what you're talking about. You're just typing BS from your emotional response to this bill, which again, is NOT aimed at encryption. 
    Damn you're so confused, aren't you? Well I guess maybe not so much confused as completely lacking any actual knowledge of this topic whatsoever.

    No, anonymouse is exactly correct.  Of course the law isn't explicitly purposed to end encryption, that would expose their agenda.  But whatever the stated purpose, the intended effect is to eliminate end-to-end encryption, or weaken it unto uselessness, so that governments can have access to citizens' private communication, "just in case", and "for their own protection".  CSAM is just the excuse they're using to justify it.
    Ah, I see you're employing the Slippery Slope Fallacy here. Moving on.
    Slippery Slope's status as a "fallacy" is controversial at best, particularly regarding civil and political rights. However, arguing that a conclusion is wrong because the argument is based on a fallacy unequivocally commits the argument from fallacy fallacy. Arguing that there are no slippery slopes because not all slopes are slippery seems like a significant lapsus logice.

    See how calling out fallacies can be a slippery slope of its own?

    But just for you guys who are claiming there is nothing to see here, from the article,

    The bill reasons that law enforcement is not capable of identifying child sexual abuse material being shared across online messaging services like iMessage, due to the implementation of end-to-end encryption. Therefore, the law would empower regulator Ofcom to order such platforms to scan the contents of messages. 

    However, to accomplish that, there has to be a weakening of end-to-end encryption itself, making it less secure and eliminating the whole point of using the technique for privacy in the first place. 

    Apple's plan to detect CSAM was limited to images uploaded to iCloud, and had absolutely nothing to do with images transmitted via iMessage that were not uploaded to iCloud. Anyone arguing that this is not a poorly disguised attack on end-to-end encryption generally is either engaged in wishful thinking or simply hasn't been paying attention to the attacks on end-to-end encryption engaged in by multiple governments, including that of the UK.

    And, we see how effective it is to cite child sex abuse as the target of such laws; it's such an emotional subject that some of you have already suspended all rational thinking.
  • Reply 20 of 21
    The purpose of the law is ending end-to-end encryption. Child sex abuse is simply being used as the most emotionally compelling rationale for doing so. Apparently this does result in a lot of confusion.

    Ironically, victims of child sex abuse are being exploited by lawmakers who want to end privacy.
    You are unequivocally and absolutely incorrect. Did you read the damn bill? No you didn't. Because it doesn't ONCE mention encryption. It literally even spells out ways that a company can decide for itself how to accomplish the task. For instance, it would allow Apple to satisfy this law with the CSAM detection service it has already developed, which does not break encryption, as I stated from the beginning.
    You literally have no idea what you're talking about. You're just typing BS from your emotional response to this bill, which again, is NOT aimed at encryption. 
    Damn you're so confused, aren't you? Well I guess maybe not so much confused as completely lacking any actual knowledge of this topic whatsoever.

    No, anonymouse is exactly correct.  Of course the law isn't explicitly purposed to end encryption, that would expose their agenda.  But whatever the stated purpose, the intended effect is to eliminate end-to-end encryption, or weaken it unto uselessness, so that governments can have access to citizens' private communication, "just in case", and "for their own protection".  CSAM is just the excuse they're using to justify it.
    Ah, I see you're employing the Slippery Slope Fallacy here. Moving on.
    Slippery Slope's status as a "fallacy" is controversial at best, particularly regarding civil and political rights. However, arguing that a conclusion is wrong because the argument is based on a fallacy unequivocally commits the argument from fallacy fallacy. Arguing that there are no slippery slopes because not all slopes are slippery seems like a significant lapsus logice.

    See how calling out fallacies can be a slippery slope of its own?

    But just for you guys who are claiming there is nothing to see here, from the article,

    The bill reasons that law enforcement is not capable of identifying child sexual abuse material being shared across online messaging services like iMessage, due to the implementation of end-to-end encryption. Therefore, the law would empower regulator Ofcom to order such platforms to scan the contents of messages. 

    However, to accomplish that, there has to be a weakening of end-to-end encryption itself, making it less secure and eliminating the whole point of using the technique for privacy in the first place. 

    Apple's plan to detect CSAM was limited to images uploaded to iCloud, and had absolutely nothing to do with images transmitted via iMessage that were not uploaded to iCloud. Anyone arguing that this is not a poorly disguised attack on end-to-end encryption generally is either engaged in wishful thinking or simply hasn't been paying attention to the attacks on end-to-end encryption engaged in by multiple governments, including that of the UK.

    And, we see how effective it is to cite child sex abuse as the target of such laws; it's such an emotional subject that some of you have already suspended all rational thinking.
    I'm not claiming that governments do not hate end-to-end encryption generally, and the big security agencies would like nothing more than a back door for them to see everything we do. I just truthfully do not think that this particular law necessarily achieves that goal, especially if the companies who are left to decide how to implement it do it correctly.

    I think the majority of our disagreement stems from the fact that you are taking this particular AppleInsider article as 100% factual and accurate. We both know AI gets all sorts of stuff wrong almost on a daily basis. The quote you pulled here from the article is pure opinion and speculation. Again, perhaps actually reading the proposed law, or a writeup from an actual expert on the matter, could provide clarity. If done correctly, this law allows an implementation that would not weaken end-to-end encryption.