Apple & big tech urged to fight online child sexual abuse with more vigor
Attorney General William Barr has urged Apple and other tech giants to fight the circulation of child sexual-abuse material under a set of voluntary principles. Although the principles avoid any mention of encryption, the underlying sentiment that tech companies should introduce backdoors remains.

U.S. Attorney General William Barr speaking at the Center for Strategic & International Studies in February
At a gathering alongside representatives from Australia, Canada, New Zealand, and the United Kingdom on Thursday, Attorney General Barr and the group announced the development of voluntary standards to prevent the distribution of child abuse imagery online.
The principles were created in consultation with industry representatives and will be promoted by the WePROTECT Global Alliance, which counts 97 governments and 25 technology companies including Apple in its membership.
The principles cover a number of areas that technology firms already address in various capacities. Giants like Apple, Facebook, and Google have all taken measures to identify abusive material, make it inaccessible, and report it to authorities.
Furthermore, companies need to adopt measures to improve child safety online, as well as to prevent the use of live-streaming services for abusive purposes. The principles also advise collaboration between companies on the matter.
"For the first time, the Five Countries are collaborating with tech companies to protect children against online sexual exploitation," said Barr. "We hope the Voluntary Principles will spur collective action on the part of industry to stop one of the most horrendous crimes impacting some of the most vulnerable members of society."
The Technology Coalition, an organization that works to promote online child safety and to eradicate online child sexual exploitation, welcomed the principles as a demonstration that child exploitation "is a complex global issue." The voluntary principles "provide a vital contribution to focusing efforts on the most important areas of the threat," the group said in a statement, adding that it would work to spread awareness and "redouble our efforts" to combat online abuse.
Apple is counted as a member of the Technology Coalition, which has been in existence since 2006. Other group members include Adobe, Dropbox, Facebook, Flickr, GoDaddy, Google, Microsoft, PayPal, Roblox, Snapchat, Twitter, Verizon Media, VSCO, Wattpad, and Yubo.

EARN IT and Encryption
While the announcement did not bring up encryption at all, calls for companies to do more to fight the circulation of abuse images naturally imply the weakening of security. As it stands, other existing and related measures are already firmly entrenched in the wider debate.

The announcement ties in to a bipartisan bill titled the "Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019," or EARN IT. The draft bill intends to create a set of best practices for tech firms and websites to follow "regarding the prevention of online child exploitation."
If enacted, the bill would create a National Commission on Online Child Exploitation Prevention, tasked with drawing up the best practices for detecting and reporting child sexual-abuse material, "and for other purposes." Failure to certify compliance with the best practices could strip service providers of their immunity under Section 230 of the Communications Act, which protects them from being sued over content posted by users, exposing them to criminal prosecution or civil lawsuits.
It is thought that satisfying any content-examination requirements set out by the commission would effectively require breaking encryption. This would include end-to-end encryption, where service operators typically cannot see the contents of messages and files.
"Online child sexual-abuse material, as the bill labels it, is a heinous problem. It's understandable that the co-sponsors of this bill want to address it," said Free Press Action senior policy counsel Gaurav Laroia in a statement. "But the legislation's construct could upset the entire internet ecosystem to combat activities that are already clearly unlawful."
Encryption Debate Continues
Barr's comments are the latest in continued attempts by the U.S. government and others around the world to force tech companies into providing access to encrypted data. The long-running debate has governments and law enforcement demanding access on the grounds that potentially crucial evidence is stored on electronic devices but locked behind encryption.

The general request made of technology giants like Apple is to provide some sort of mechanism that can grant access to encrypted data, but only to authorized parties. For example, in February the head of the UK's MI5, Sir Andrew Parker, said he wanted "exceptional access" to encrypted communications "where there is a legal warrant and a compelling case" to do so.
Tech companies and critics resist the demands for a number of reasons, but the core objection is that backdoors are not safe at all. While a backdoor may be intended only for law enforcement, the same mechanism could plausibly be abused by bad actors, including hackers and spy agencies, and could even enable mass surveillance of the population.
"The idea that we can break encryption and safely store a record of everything just for the putative good guys is technically unsound," Laroia asserts. "It's anathema to the privacy rights people must have against not just corporate actors and criminals, but against overly intrusive governments, too."
Apple has been a strong proponent of encryption, and has been at the center of many arguments about backdoors, such as during the FBI investigation into the Pensacola shooting. The FBI and Barr requested that Apple unlock the iPhone at the center of the investigation, with Barr further asserting Apple didn't provide "substantive assistance." Apple rejected the claim, responding that it had already provided a considerable amount of data to the investigation.
Comments
The greatest resource is still the human resource. How about better pay for teachers (with smaller class sizes) and training to recognize these things?
Technology is great, but it isn't the answer to every problem.
How long do you think it would take before a politician used the backdoors to snoop on rivals?
It's a stretch to suggest that technology is enabling a problem: it's merely yet another channel where an existing problem can occur.
Hysterical platitudes about "the children" do not form an argument against privacy and encryption. Rather, you should have deep concerns when a government official is relying on salience bias to enact legislative changes that affect a wide range of your freedoms.
- Tech companies already scan cloud photo libraries for known child-abuse and related imagery; this is done via hashing to preserve the individual's privacy.
- This government has tried in myriad ways to paint encryption as a roadblock, yet in reality it actually has *easier* investigative access in the digital era than it had offline in the past. We've also seen that governments do not disclose every security vulnerability to tech vendors.
- Pedophiles share imagery on the dark web; ordinary investigative work is the most effective means of uncovering this activity, and removing iMessage encryption changes nothing there.
- People conducting illegal activities do not use commercial chat software; they are aware of their conduct and use other solutions, including simply rolling their own encryption. Criminal networks are sufficiently sophisticated.
- The perceived gain in investigative abilities is small in comparison to the massive loss of privacy and security. This is why political figures talk about "children" and "terrorism", they need to make the problem seem larger and more urgent to justify their unjustifiable demands.
- Child sexual abuse: the majority occurs entirely offline, committed by family members or close friends of the family. It is better detected by parents, teachers, and nurses/doctors.
- The online "grooming" of children already occurs on unencrypted chat software.
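The hash-based scanning mentioned in the first bullet can be sketched simply. This is a minimal illustration, not any vendor's actual implementation: real systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas this sketch uses a plain SHA-256 digest for clarity, and the hash set here is a made-up placeholder.

```python
import hashlib

# Hypothetical database of digests of known abusive images, e.g. as
# supplied to providers by clearinghouses such as NCMEC. This demo
# value is just sha256(b"test").
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(data: bytes) -> str:
    """Return a hex digest of the raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes) -> bool:
    """Flag an upload if its digest matches a known-bad hash.

    Only the digest is compared, so the scanner never needs to
    inspect the image content itself -- the privacy property the
    commenter refers to.
    """
    return file_digest(data) in KNOWN_BAD_HASHES

print(scan_upload(b"test"))   # True: its digest is in the demo set
print(scan_upload(b"hello"))  # False: no match
```

The key design point is that matching operates on opaque digests rather than on the images themselves, so legitimate private photos are never "viewed" by the scanner.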
It's also rich to go after encryption when we have sites like Facebook, which make it trivial to give out so much personal information by accident; the largest invaders of children's privacy online are proud parents.

He can take a flying leap, no matter what the subject.
Breaking messaging encryption might be a much bigger deal (there are indications other countries, China almost certainly and perhaps Russia, have devised ways of accessing those encrypted messages), and even if not, it becomes a bit of a slippery slope. Where do you stop? I would suggest before this becomes law.
This isn't Apple's problem. Apple doesn't work for the government or its agendas.
This goes for every industry. Budweiser's job isn't preventing people from driving drunk, Microsoft's job isn't to track down hackers and turn them in, Weiser Lock's job isn't to arrest thieves who break into your home, etc.
LOL... Only if you believe Trump's shill Bill Barr and the rest of the right wing propaganda machine.
But I do get a kick out of Republicans when they get busted. They always come up with the same first-grade-level excuse: "They ALL do it!"
Apple has proven quite effective at moving machine learning on-device. There's no apparent reason why Apple couldn't proactively use automation to scan images received on a device at the OS level, irrespective of encryption, and raise red flags in a database that is anonymised unless wrongdoing is identified and escalated.
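The commenter's proposal could, in principle, look something like the following sketch: on-device matching that records only an anonymous per-device counter, escalating for human review only past a threshold. Everything here is hypothetical, including the threshold value, the `device_flags` store, and the hash set; this says nothing about whether Apple does or should build such a system.

```python
import hashlib
from collections import defaultdict

# Hypothetical on-device set of digests of known abusive images.
KNOWN_BAD_HASHES = {hashlib.sha256(b"demo-bad-image").hexdigest()}

ESCALATION_THRESHOLD = 3  # arbitrary example value

# Anonymised store: maps an opaque device token to a match count,
# recording no image content and no user identity.
device_flags: dict = defaultdict(int)

def scan_received_image(device_token: str, image_bytes: bytes) -> bool:
    """Check a received image on-device; return True only once the
    anonymous match count warrants escalation for review."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_BAD_HASHES:
        device_flags[device_token] += 1
    return device_flags[device_token] >= ESCALATION_THRESHOLD

token = "opaque-token-123"
print(scan_received_image(token, b"demo-bad-image"))   # False: 1 match
print(scan_received_image(token, b"harmless photo"))   # False: still 1
```

The threshold is the anonymisation mechanism the comment alludes to: a single match reveals nothing about a user, and only repeated matches surface a case at all.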
In this story, police attempted to use an exercise app on a user's Android phone to link him to a crime, simply because the app showed him passing near the site of a crime scene. Essentially, they scooped up mass amounts of location information from probably hundreds of smartphones in order to finger the suspect in a crime that had happened almost a year earlier. This is the stuff of Big Brother watching your every move:
Google tracked his bike ride past a burglarized home. That made him a suspect.