Apple shareholder urges action on CSAM videos

Posted in General Discussion · edited September 2021
Apple investor Christian Brothers Investment Services is pressing the company to do more to combat the spread of child sexual abuse material (CSAM), with a specific eye on videos.

Fund manager Jeff McCroy in August sent a letter to Apple pushing for action on CSAM videos, reports Bloomberg. The shareholder owned $271 million in company stock and debt as of Aug. 31.

"The technology sector as a whole has been slow to address this important issue," McCroy told the publication. He went on to say that companies should share artificial intelligence technologies and other tools to combat videos containing imagery of sexual violence, content that can spread rapidly across the internet.

McCroy's firm has been a staunch advocate for change in the fight against CSAM and for years pushed the likes of Apple, AT&T and Verizon to do more to counter illicit photos, the report said.

Apple recently attempted to implement a suite of CSAM detection and reporting tools on iOS, but the initiative was met with significant pushback from privacy advocates.

Introduced as a multi-pronged effort to securely monitor user content for offending material, Apple's CSAM plan involved in-line image detection tools to protect children using Messages, safety protocols for Siri and Search, and an on-device photo monitoring solution. The latter feature has proven to be contentious.

Apple saw stiff resistance from industry experts, privacy advocates and customers who argued the system would lead to mass surveillance. The uproar prompted Apple to postpone a launch originally scheduled for 2021 while it gathers feedback from interested parties.

"It's disappointing to learn that Apple is delaying their efforts for change," McCroy said. "The longer it takes for action, the more children that are at risk of exposure and harm. We hope that Apple will expedite these planned improvements and take action sooner than later."

CBIS plans to engage with Apple, AT&T and Verizon on the issue, the report said.

McCroy's fund has leaned on Apple since 2016. When he first raised the issue of child pornography, McCroy said he felt like the "only investor in the room," according to the report. Over the past five years, however, Apple has faced additional pressure from investors like the Sisters of St. Dominic of Caldwell, New Jersey, and the School Sisters of Notre Dame Cooperative Investment Fund.

CBIS previously submitted a proxy proposal in 2018 to gain more information about what the company was doing to address issues related to the sexual exploitation of children online. The proposal was withdrawn after Apple outlined efforts in the area, but CBIS later met with managers, the company's director of global security investigations and child safety, and members of the law enforcement compliance and app development teams to discuss the matter.

It is unclear if McCroy intends to draft a proxy proposal for consideration in 2022. Apple already faces a shareholder resolution to reverse so-called "anti-repair policies."


Comments

  • Reply 1 of 11
    Apple should move carefully with advice from privacy experts. I doubt iCloud Photo sharing is a very common method of spreading CSAM.
  • Reply 2 of 11
    This guy should stay in his lane and let the real privacy experts have their say.

    Real privacy experts are nearly universally against Apple's proposed CSAM detection implementation.
  • Reply 3 of 11
    There’s a certain irony here . . .
  • Reply 4 of 11
    ivanh Posts: 597 member
    Apple can pay hundreds of NGOs to sue Apple. That’s a way to gain global law-enforcement power like totalitarian regimes.
  • Reply 5 of 11
    ivanh said:
    Apple can pay hundreds of NGOs to sue Apple. That’s a way to gain global law-enforcement power like totalitarian regimes.
    Are you drunk? You're making less than normal (for you) sense today.
  • Reply 6 of 11
    Apple should move carefully with advice from privacy experts. I doubt iCloud Photo sharing is a very common method of spreading CSAM.

    Apple’s anti-fraud chief said company was ‘the greatest platform for distributing child porn’


  • Reply 7 of 11
    mac_dog Posts: 1,069 member
    He should be trying to weed out the pedophiles who are running the country. I’m talking about the Jeffrey Epstein crowd. 
  • Reply 8 of 11
    Christian Brothers?  Like the booze company?
    Weren't they associated originally with the Roman Catholic church?
  • Reply 9 of 11
    ikir Posts: 127 member
    tylersdad said:
    This guy should stay in his lane and let the real privacy experts have their say.

    Real privacy experts are nearly universally against Apple's proposed CSAM detection implementation.
    Yes to fuel the mass paranoia.
  • Reply 10 of 11
    Apple has good reasons not to go down the CSAM path. It opens a Pandora's box. Apple is not the police force of the world. It's not their job to control us, and with this they start to do exactly that, on every phone out there. That's a massive intrusion into privacy and will make people drop Apple as a brand.

    While I've been buying Apple products since 1984 and am in love with them, I will never buy a new iPhone again, and I will not upgrade to iOS 15 because of the potential CSAM backdoors.

    And I'm far from alone. Europeans especially, who have GDPR laws to uphold their privacy, are _severely_ concerned. An investor who cares about his portfolio should take this into consideration. iPhone sales in Europe will drop after iPhone 13.

    Everyone feels betrayed. We are all adults. We don't need Big Brother to tell us what we are allowed to do or not. Even if it's advice to breathe ...


  • Reply 11 of 11
    crowley Posts: 10,453 member
    I feel for the guys at Apple who must be banging their heads against the wall. 
    edited September 2021