Congress asking Apple and other big tech what they're doing about deepfakes

Posted in iOS, edited December 2024

Pointed letters from Congress have been sent to tech executives, including Apple's Tim Cook, raising concerns over the prevalence of deepfake non-consensual intimate images.

[Image: Non-consensual intimate images being made with apps on iPhone]



The letters stem from earlier reports about nude deepfakes being created using dual-use apps. Advertisements popped up all over social media promoting face swapping, and users were using them to place faces into nude or pornographic images.

According to a new report from 404 Media, the US Congress is taking action based on these reports, asking tech companies how they plan to stop non-consensual intimate images from being generated on their platforms. Letters were sent to Apple, Alphabet, Microsoft, Meta, ByteDance, Snap, and X.

The letter to Apple specifically calls out the company for missing such dual-use apps despite App Review Guidelines. It asks what Apple plans to do to curb the spread of deepfake pornography and mentions the TAKE IT DOWN Act.

The following questions were asked of Tim Cook:


  • What plans are in place to proactively address the proliferation of deepfake pornography on your platform, and what is the timeline of deployment for those measures?

  • What individuals or stakeholders, if any, are included in developing these plans?

  • What is the process after a report is made and what kind of oversight exists to ensure that these reports are addressed in a timely manner?

  • What is the process for determining if an application needs to be removed from your store?

  • What, if any, remedy is available to users who report that their image was included non-consensually in a deepfake?



Apple acts as the steward of the App Store, and in doing so, takes the blame any time something that shouldn't be there makes it in. Instances like deepfake tools or children's apps that turn into casinos are used as fuel in the argument that Apple shouldn't be the gatekeeper of its app platform.

The apps identified by previous reports that were removed by Apple showed clear signs of potential abuse. For example, users could source videos from Pornhub for face swapping.

Apple Intelligence can't make nude or pornographic images, and Apple has stopped Sign-in with Apple from working on deepfake websites. However, that's just the bare minimum of what Apple can and should be doing to combat deepfakes.

As the letter from Congress suggests, Apple needs to take steps to ensure that dual-use apps can't get through review. Extra precautions should at least be taken with apps that promise AI image and video manipulation, especially those that offer face swapping capabilities.




Comments

  • Reply 1 of 6
    mattinoz Posts: 2,583 member
    What's Congress doing?
    Are they extending laws to make the use of deepfakes as unlawful in political advertising as it is in commercial advertising?
  • Reply 2 of 6
    Is Congress asking Adobe and the other photo-manipulation app developers what they are doing to curb the use of their technology for the same purpose, which has been happening since photo manipulation was a thing?
  • Reply 3 of 6
    So that Boebert episode never really happened?
  • Reply 4 of 6
    The AI genie is out of the bottle.

    In some ways, it’s helpful. 

    In others, a big problem. 

    I don't think lawmakers can put the genie back in the bottle.

    But they can rein it in and protect people from deceptive practices. 

    Just because it's software doesn't make theft any better, or stealing likenesses to misrepresent someone without it being an obvious lampoon.

    We really need some strong laws here, with significant repercussions.
    edited December 2024
  • Reply 5 of 6
    I'm not American, nor a lawyer, but doesn't the First Amendment say "Congress shall [not] abridge freedom of speech"? When they put pressure on any company to "proactively address the proliferation of deepfake pornography" isn't that a direct attempt to make companies abridge people's freedom of speech?

    I recognized one signatory on the list of questions. She was a Democrat. Were all the signatories Democrats?
  • Reply 6 of 6
    [Quoting reply 5 of 6:]
    "I'm not American, nor a lawyer, but doesn't the First Amendment say 'Congress shall [not] abridge freedom of speech'? When they put pressure on any company to 'proactively address the proliferation of deepfake pornography' isn't that a direct attempt to make companies abridge people's freedom of speech?

    I recognized one signatory on the list of questions. She was a Democrat. Were all the signatories Democrats?"

    The amendment actually says "Congress shall make no law ... abridging the freedom of speech..."  "Pressuring" a corporation skirts that restriction unless they decide to impose some sort of punishment.  It can be a very fine line, and until the unlikely action of the U.S. Supreme Court to stop them, they'll continue.

    However, Congress and U.S. courts have created many, many exceptions to unabridged free speech.  Incitement to violence, for instance.  Fraud.  Misleading advertising.  Certain forms of pornography.  Intimidation.  Often, the legitimacy of any given "speech" hinges on the intent of that speech.  Parody and satire, for instance, even though potentially depicting entirely false information, are protected speech, while false accusations of crimes or misdoing fall under slander or libel, which are not covered by free speech.
    edited December 2024