When you report bugs on iOS, some content may be used for AI training


If you decide to report a bug on a beta version of iOS, you now apparently have to let Apple use the uploaded content for Apple Intelligence training with no way to opt out.

If you want to report a bug on iOS, content you upload may be used for AI training.



On Monday, Apple announced its plans for a new opt-in Apple Intelligence training program. In essence, users can let Apple use content from their iPhone to train AI models. The training itself happens entirely on-device, and it incorporates a privacy-preserving method known as Differential Privacy.

Apple took measures to ensure that no private user data is transmitted during Image Playground and Genmoji training, as Differential Privacy introduces artificial noise. This ensures that individual data points cannot be traced back to their source.
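For readers who want a concrete sense of what "artificial noise" means here, the sketch below shows one textbook local Differential Privacy technique, randomized response, in Swift. It is an illustrative assumption only: the function name, the epsilon parameter, and the approach are generic examples of the idea, not Apple's actual implementation.

```swift
import Foundation

// Illustrative sketch of "randomized response," a textbook local differential
// privacy technique. This is NOT Apple's implementation; it only shows why
// noisy per-device reports cannot be traced back to an individual user.
func privatizedReport(sawTerm: Bool, epsilon: Double) -> Bool {
    // Probability of answering truthfully, derived from the privacy budget epsilon.
    let truthProbability = exp(epsilon) / (exp(epsilon) + 1)
    // With that probability report the truth; otherwise flip the answer.
    return Double.random(in: 0..<1) < truthProbability ? sawTerm : !sawTerm
}

// A collector that receives many noisy reports can still estimate how common a
// term is in aggregate, even though any single report may be a lie.
let reports = (0..<10_000).map { _ in privatizedReport(sawTerm: Bool.random(), epsilon: 2.0) }
let noisyPositives = reports.filter { $0 }.count
print("Noisy positive reports: \(noisyPositives) of \(reports.count)")
```

Because each device may flip its answer, any individual report is deniable, yet the aggregate statistics remain useful, which is the trade-off Differential Privacy is built around.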

Even so, some users are unhappy about the opt-in AI training program. While Apple said it would become available in a future iOS 18.5 beta, one developer has already noticed a possibly related change to the Feedback app.

In a social media post, developer Joachim outlined a new section of Apple's privacy notice in the Feedback application. When uploading an attachment as part of a bug report, such as a sysdiagnose file, users now need to give Apple consent to use the uploaded content for AI training.

The privacy notice of the Feedback app now references AI training.



"Apple may use your submission to improve Apple products and services, such as training Apple Intelligence models and other machine learning models," the notice reads, in part.

The developer who spotted this addition criticized Apple for not including an opt-out option, saying that the only way users could opt out was by not filing a bug report at all.

They blasted Apple, expressing frustration that the iPhone maker decided to "hide it in the other privacy messaging stuff," and made it very clear that this is something they did not want.

Still, this is only one developer's reaction, with a few others chiming in with replies. It remains to be seen whether other devs will echo the sentiment, but it's likely given that there's no apparent opt-out button.

Anyone who wants to file a bug report will have to consent to Apple's AI training, and people will understandably be upset, even with privacy-preserving measures in place.

You can opt out of AI training, but not when reporting bugs



As time goes on, Apple's AI training program will expand to other areas of the iPhone operating system beyond bug reporting. The company wants to use Differential Privacy-based AI training for Genmoji, Image Playground, and Writing Tools.

It's possible to opt out of Apple's AI training program, but not if you want to report bugs on a beta version of iOS.



While nothing has yet been implemented on that front, users are able to opt out of the on-device Apple Intelligence training program by turning off analytics in Settings.

This can be done by opening Settings, scrolling down to Privacy & Security, then selecting Analytics & Improvements. Those who wish to opt out can do so by turning off the "Share iPhone & Watch Analytics" setting.

There appears to be no way for developers to opt out of training Apple's AI with their bug reports at this time.





Comments

  • Reply 1 of 10
    coolfactor · Posts: 2,375 · member

    Has this claim been validated? The dialog text specifically says the user can "delete any attachments" they don't want to submit, so wouldn't that include attachments related to the AI training, while still submitting the bug report?

    Seems like opportunistic reporting here. Please try this yourself and let us know exactly what the situation is.

  • Reply 2 of 10
    DAalseth · Posts: 3,269 · member
    The developer who spotted this addition criticized Apple for not including an opt-out option, saying that the only way users could opt out was by not filing a bug report at all.
    If this is true it would adversely impact bug reporting. A lot of us are working very hard to not let our data be used for AI training. Now we may be in possession of an important bug report but have to violate our principles if we want to report it. Many may decide not to.  If this is true, it would be a very stupid move on Apple’s part.
    edited April 18
  • Reply 3 of 10
    swat671 · Posts: 169 · member
    I fail to see an issue here. What IS the issue, exactly?
  • Reply 4 of 10
    swat671 · Posts: 169 · member
    DAalseth said:
    The developer who spotted this addition criticized Apple for not including an opt-out option, saying that the only way users could opt out was by not filing a bug report at all.
    If this is true it would adversely impact bug reporting. A lot of us are working very hard to not let our data be used for AI training. Now we may be in possession of an important bug report but have to violate our principles if we want to report it. Many may decide not to.  If this is true, it would be a very stupid move on Apple’s part.
    Again, what’s the issue? What “principles”? You kinda sound like those vegans who just like to hear themselves talk and make themselves sound self-important. 
  • Reply 5 of 10
    retcable
    Like others, I fail to see a problem here. Apple's constant drumbeat about privacy and security of customers' information is the reason Siri and Apple Intelligence are pretty much useless today. Most Apple users do not allow the collection and use of any personal information, based on tech pundits and Apple itself recommending over the years that users turn off any data-collection settings on all their devices. How is Siri supposed to learn anything about you, where you are, what you're trying to do, what you like to do, where you go, where you work, where you like to eat, shop, and play, and so on, if you do not allow it to collect any information about any of those things? This information can be collected anonymously, but few Apple users allow it.

    Google, on the other hand, knows everything about you: where you live, where you go, where you eat, shop, play, and work. They collect all this data every time you use a Google-related app, whether you have allowed them to or not, and whether you know about it or not. And the result is that Google's digital assistant is FAR more useful and correct in everything it does, because it KNOWS you. There is just no comparison between the utility of Google's assistant and Siri's ignorance. And it is because Siri knows NOTHING about you.
  • Reply 6 of 10
    DAalseth · Posts: 3,269 · member
    swat671 said:
    DAalseth said:
    The developer who spotted this addition criticized Apple for not including an opt-out option, saying that the only way users could opt out was by not filing a bug report at all.
    If this is true it would adversely impact bug reporting. A lot of us are working very hard to not let our data be used for AI training. Now we may be in possession of an important bug report but have to violate our principles if we want to report it. Many may decide not to.  If this is true, it would be a very stupid move on Apple’s part.
    Again, what’s the issue? What “principles”? You kinda sound like those vegans who just like to hear themselves talk and make themselves sound self-important. 
    Generative artificial intelligence is the cocaine of this generation. It is already damaging education and creativity, and putting people out of work.
  • Reply 7 of 10
    clexman · Posts: 225 · member
    retcable said:
    Like others, I fail to see a problem here. Apple's constant drumbeat about privacy and security of customers' information is the reason Siri and Apple Intelligence are pretty much useless today.
    100% agree.
  • Reply 8 of 10
    More to the point, will they act on open bug reports?

    My experience over many years is mostly no!
  • Reply 9 of 10
    mpantone · Posts: 2,400 · member
    This seems very shortsighted from Apple. Anything that discourages people from filing bug reports is not a particularly wise move especially from a company that has built up a reputation for protecting user privacy. There really should be an opt-out toggle for the people (some of whom have already voiced their interest) who don't want to let their data be used to train AI models.

    After all, a lot of the debugging data does say what the user was doing at the time and some of this might not be desirable to let idiot AI models parse through.

    Just the fact that it generated some discussion and hesitation amongst AppleInsider forum participants is really enough to make having an AI training opt-out a necessity.
    edited April 21
  • Reply 10 of 10
    danvm · Posts: 1,484 · member
    DAalseth said:
    swat671 said:
    DAalseth said:
    The developer who spotted this addition criticized Apple for not including an opt-out option, saying that the only way users could opt out was by not filing a bug report at all.
    If this is true it would adversely impact bug reporting. A lot of us are working very hard to not let our data be used for AI training. Now we may be in possession of an important bug report but have to violate our principles if we want to report it. Many may decide not to.  If this is true, it would be a very stupid move on Apple’s part.
    Again, what’s the issue? What “principles”? You kinda sound like those vegans who just like to hear themselves talk and make themselves sound self-important. 
    Generative artificial intelligence is the cocaine of this generation. It is already damaging education and creativity, and putting people out of work.
    I’m old enough to remember the same thing being said about personal computers and the Internet, and look where we are now. Most people cannot live without at least one or both of them.

    I think the same will happen with generative AI. It is just part of the evolution of technology. It will improve and become part of our lives, one way or another.