UK tries to claim it hasn't backed down on encryption at all

Posted in iOS, edited September 2023

Despite introducing a clause that means its Online Safety Bill is no longer a concern for Apple, WhatsApp, or users, the UK government is insisting with a straight face that it's still exactly as tough on Big Tech as before.

UK Houses of Parliament

On Wednesday, the UK Parliament debated the Online Safety Bill which, in its original form, would have seen Apple, WhatsApp, Signal, and more shutter their messaging and social media services in the country rather than break encryption. Bowing to that pressure, the UK regulator Ofcom introduced a face-saving clause that effectively stopped the country's nonsensical demands to break end-to-end encryption.

Except, the Conservative government that was pushing for this -- against the advice of security experts and even an ex-MI5 head -- insists that it has not even blinked.

"We haven't changed the bill at all," UK technology minister Michelle Donelan told Times Radio, as spotted by Reuters.

"If there was a situation where the mitigations that the social media providers are taking are not enough," she continued, "and if after further work with the regulator they still can't demonstrate that they can meet the requirements within the bill, then the conversation about technology around encryption takes place."

Ofcom's amendment to the bill said that firms such as Apple would be ordered to open up their encryption only "where technically feasible and where technology has been accredited as meeting minimum standards of accuracy in detecting only child sexual abuse and exploitation content."

There is no technology today that will allow only the good guys to break end-to-end encryption -- and there never will be.
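
To see why, here is a minimal sketch of what end-to-end encryption actually involves, written in Swift with Apple's CryptoKit. It is an illustration only, not WhatsApp's, Signal's, or iMessage's real protocol, and the names and values are invented. The point is that the only keys capable of decrypting a message live on the two endpoint devices, so there is nothing sitting in the middle that could be handed to "only the good guys."

```swift
import CryptoKit
import Foundation

// Minimal end-to-end encryption sketch: two devices agree on a shared key,
// and the service relaying the message never holds anything that can decrypt it.
do {
    // Each endpoint generates its own key pair; private keys never leave the device.
    let senderKey = Curve25519.KeyAgreement.PrivateKey()
    let recipientKey = Curve25519.KeyAgreement.PrivateKey()

    // The sender derives a symmetric key from its private key and the recipient's public key.
    let sharedSecret = try senderKey.sharedSecretFromKeyAgreement(with: recipientKey.publicKey)
    let messageKey = sharedSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self,
        salt: Data(),
        sharedInfo: Data("example-session".utf8),
        outputByteCount: 32
    )

    // The only thing a server relaying the message ever sees is this ciphertext.
    let ciphertext = try AES.GCM.seal(Data("hello".utf8), using: messageKey).combined!

    // Decryption requires the recipient's private key to re-derive the same symmetric key.
    // There is no third "authorised" key that could be given to a regulator without
    // also being usable by anyone else who obtains it.
    let recipientSecret = try recipientKey.sharedSecretFromKeyAgreement(with: senderKey.publicKey)
    let recipientMessageKey = recipientSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self,
        salt: Data(),
        sharedInfo: Data("example-session".utf8),
        outputByteCount: 32
    )
    let plaintext = try AES.GCM.open(AES.GCM.SealedBox(combined: ciphertext), using: recipientMessageKey)
    print(String(decoding: plaintext, as: UTF8.self)) // "hello"
} catch {
    print("crypto error: \(error)")
}
```

Any scheme that let a third party read that ciphertext would have to either copy the private keys off the devices or quietly add an extra recipient key, and either mechanism works just as well for an attacker as for a regulator.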

Consequently the Tory government can argue -- and is arguing -- that no word has been changed in the bill. But words have been added, and they neuter the entire nonsensical and unenforceable plan.

While the UK has been debating this, its schoolchildren have been returning to school, only to be sent home again because buildings constructed with crumbling RAAC concrete have been declared unsafe.



Comments

  • Reply 1 of 8
    Diletante
    The neutering of this aspect of the bill was completely inevitable. What is interesting is why the government, despite knowing full well the negative consequences for end-to-end encrypted services, argued the opposite for so long. To which constituency were they appealing?
  • Reply 2 of 8
    Ofer Posts: 241, unconfirmed member
    Diletante said:
    The neutering of this aspect of the bill was completely inevitable. What is interesting is why the government, despite knowing full well the negative consequences for end-to-end encrypted services, argued the opposite for so long. To which constituency were they appealing?
    Their intelligence and law enforcement services.
  • Reply 3 of 8
    Coming from our useless government, which tells you black is white and up is down, this should be no surprise. Nor will it be a surprise when they try raising the same plan again.
  • Reply 4 of 8
    blastdoor Posts: 3,308, member
    I wonder if it might actually be a public service to the world for the UK to implement its stupid law. Let them and everyone else see the consequences of stupidity. 

    The UK already did this with Brexit -- they demonstrated to every other country in the EU just how incredibly stupid it is to leave the EU. I bet it will now be a very long time indeed before any other country seriously considers following the UK's boneheaded move. 
  • Reply 5 of 8
    chasm Posts: 3,307, member
    Make no mistake -- this entire proposal, from start to finish, was intended simply as a distraction from the 19 other, much larger scandals the Tory government is facing. Raw sewage in the lakes and seas around the UK, the privatisation and robber-baron destruction of the public utilities and transport system, overpaid officials raking in "contributions" and "second incomes" from private concerns that influence their votes (a.k.a. open, bald-faced buying of legislators), the fiasco of Brexit, plummeting income levels for the middle class and the ever-expanding poor, the steady dismantling of the NHS, the worker strikes because people cannot live on current wages thanks to widespread inflation and unobtainable mortgages ... the list goes on and on and on.

    There's nothing this Tory party has touched that hasn't turned to shit.
  • Reply 6 of 8
    command_f
    Could we have the William Gallagher who writes intelligent articles back, please?

    This item, and its predecessor, read like political polemics. There is no link between fighting CSAM and the problems in UK schools with RAAC (unless some AI wrote the story and thought they were both about children). The political party of our current government is hardly relevant (and even less so when RAAC is dragged in). There is a real story buried in all of this nonsense, and it's important.

    We do need to counter the distribution of CSAM, and end-to-end encryption does make that harder. Back doors and the like in the encryption were a naïve approach with some major downsides, so we should be relieved that the UK government has backed away from that.

    I would like to find a further positive by reading between the lines of the quote: "If ... the mitigations that the social media providers are taking are not enough, and if ... they still can't demonstrate that they can meet the requirements within the bill, then the conversation about technology around encryption takes place." Doesn't that say we'll be back if things don't improve? It leaves the solution in the hands of the technology companies - isn't that what we want? The "technology around encryption" bit is appropriately vague; it's just a big stick being waved in the general direction.

    Now, being constructive, can someone please remind me why Apple's original proposal was so bad: before encryption, and entirely on-device, scan photos that are leaving the device for Apple's servers (iCloud) to ensure they are not CSAM. For me, opposing this because such a scheme could be extended to scan for other types of material is nonsense. Of course it could be extended, but that is extrapolation to the point of absurdity. Apple could do many things (let's start a rumour that they are injecting subliminal messages into photos, shall we?) but that doesn't mean it would. Apple could even do it without telling us, but its track record says it wouldn't.
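
    For anyone trying to picture what that "scan before upload" step involves, a heavily simplified sketch follows. This is not Apple's actual design (the real system used NeuralHash perceptual hashes, private set intersection and a match threshold), and the function names and hash values here are invented; it just shows the general shape: fingerprint each outgoing photo and check it against a locally shipped set of known hashes.

```swift
import CryptoKit
import Foundation

// Minimal sketch of an on-device "scan before upload" check.
// Not Apple's actual design; illustrative names and values only.

// A locally shipped set of hashes of known prohibited images (placeholder value).
let knownBadHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Hex-encoded SHA-256 of the photo's bytes. The matcher only ever sees opaque
/// hashes; it cannot tell what the underlying images depict.
func photoFingerprint(_ photoData: Data) -> String {
    SHA256.hash(data: photoData).map { String(format: "%02x", $0) }.joined()
}

/// Returns true if the photo should be flagged instead of uploaded.
func shouldFlagBeforeUpload(_ photoData: Data, against database: Set<String>) -> Bool {
    database.contains(photoFingerprint(photoData))
}

// Example: an innocuous photo is uploaded untouched; only database matches are flagged.
let outgoingPhoto = Data("holiday-snap-bytes".utf8)
if shouldFlagBeforeUpload(outgoingPhoto, against: knownBadHashes) {
    print("match found - withhold upload and report")
} else {
    print("no match - upload to iCloud as normal")
}
```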
  • Reply 7 of 8
    elijahg Posts: 2,759, member
    command_f said:
    Now, being constructive, can someone please remind me why Apple's original proposal was so bad: before encryption, and entirely on-device, scan photos that are leaving the device for Apple's servers (iCloud) to ensure they are not CSAM. For me, opposing this because such a scheme could be extended to scan for other types of material is nonsense. Of course it could be extended, but that is extrapolation to the point of absurdity. Apple could do many things (let's start a rumour that they are injecting subliminal messages into photos, shall we?) but that doesn't mean it would. Apple could even do it without telling us, but its track record says it wouldn't.
    The argument was that it is easy for a government (think China) to force Apple to secretly add extra hashes to the CSAM scanning database - such as hashes of photos of people a government didn't like - with very few people needing to be in the know for that to happen. It is a *lot* harder to secretly implement an entire scanning routine without anyone finding out.
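
    To illustrate that point with the same sort of sketch as above (again with invented names and values, not Apple's real system): the matching code is completely agnostic about what the hashes represent, so changing what gets flagged is purely a data update to the set shipped to devices, with no new scanning routine for anyone to notice.

```swift
import CryptoKit
import Foundation

// The matcher is identical whatever the hash set contains; the "policy" is data, not code.
func fingerprint(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

func isFlagged(_ photoData: Data, against database: Set<String>) -> Bool {
    database.contains(fingerprint(photoData))
}

// Hypothetical hash sets pushed to devices (illustrative placeholder values only).
let csamHashes: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000" // placeholder
]
let dissidentPosterHashes: Set<String> = [
    fingerprint(Data("protest-poster-bytes".utf8))
]

// Exactly the same code path flags a completely different category of image
// once the shipped data changes; no new scanning routine is required.
let photo = Data("protest-poster-bytes".utf8)
print(isFlagged(photo, against: csamHashes))            // false
print(isFlagged(photo, against: dissidentPosterHashes)) // true
```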
  • Reply 8 of 8
    command_f
    elijahg said:
    command_f said:
    Now, being constructive, can someone please remind me why Apple's original proposal was so bad: before encryption, and entirely on-device, scan photos that are leaving the device for Apple's servers (iCloud) to ensure they are not CSAM. For me, opposing this because such a scheme could be extended to scan for other types of material is nonsense. Of course it could be extended, but that is extrapolation to the point of absurdity. Apple could do many things (let's start a rumour that they are injecting subliminal messages into photos, shall we?) but that doesn't mean it would. Apple could even do it without telling us, but its track record says it wouldn't.
    The argument was that it is easy for a government (think China) to force Apple to secretly add extra hashes to the CSAM scanning database - such as hashes of photos of people a government didn't like - with very few people needing to be in the know for that to happen. It is a *lot* harder to secretly implement an entire scanning routine without anyone finding out.
    Thanks for the response.

    I have seen this "extrapolation" argument, but I don't buy it. Apple could add lots of things secretly without them being seen (just as APTs secretly install lots of stuff that is hard to detect - and they do it remotely, to a functioning device, rather than building it into the OS). Apple is already scanning all the photos on iOS to power the search-by-content feature; we trust it to do that without also looking for sensitive material.

    We all have to trust someone: if you use an iPhone you've implicitly chosen to trust Apple and believe them when they say they won't steal/misuse all the personal info from the phone.