Internal Apple AI 'Ask' tool being tested by employees

Posted:
in General Discussion

Apple is reportedly testing an AI tool called "Ask" for AppleCare support advisors that generates answers using information drawn from an internal database.

[Image: the Siri icon and ChatGPT icon combined]
Apple is working on AI tools



Apple's push into AI isn't a secret, as its CEO has even shared his excitement for what's coming later in 2024. Even though nothing has been announced, the company is likely building and testing many tools that rely on generative models similar to those behind ChatGPT.

According to a report from MacRumors based on information obtained about the project, Apple launched a pilot program that gives select AppleCare support advisors an AI tool called "Ask." It is a tool that automatically generates responses to technical questions based on information from Apple's internal database.

Unlike a simple search tool, which returns the same results every time based on relevance, the "Ask" program generates an answer based on specifics mentioned in the query, like device type or operating system. Advisors can mark these answers as "helpful" or "unhelpful."

Hallucinations are an issue with generative language models, which is a nice way of saying that chatbots tend to make things up with high confidence. The "Ask" tool attempts to avoid this behavior by drawing only on Apple's internal database, with additional checks to ensure responses are "factual, traceable, and useful."
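The grounding described above, answering only from a known set of documents and keeping results traceable, can be sketched in a few lines. This is purely illustrative; the document names, data, and keyword-overlap retrieval here are invented for the example and have nothing to do with how Apple actually built "Ask."

```python
# Illustrative sketch only: answer queries strictly from a fixed internal
# knowledge base and cite the source document, refusing when nothing matches.
# All names and data here are hypothetical, not Apple's.
KNOWLEDGE_BASE = {
    "kb-001": "To reset an iPhone, hold the side button and volume down.",
    "kb-002": "If an iPad will not charge, try a different cable and adapter.",
}

def retrieve(query: str) -> list[tuple[str, str]]:
    """Naive keyword retrieval: return (doc_id, text) pairs sharing a word."""
    words = set(query.lower().split())
    return [(doc_id, text) for doc_id, text in KNOWLEDGE_BASE.items()
            if words & set(text.lower().split())]

def answer(query: str) -> str:
    hits = retrieve(query)
    if not hits:
        # Refusing when nothing matches is one way to curb hallucination.
        return "No matching internal article found."
    doc_id, text = hits[0]
    # Citing the source document keeps the answer traceable.
    return f"{text} (source: {doc_id})"

print(answer("My iPad will not charge"))
```

A real system would use semantic retrieval rather than keyword overlap, but the principle is the same: decline when nothing matches, and attach the source so every answer can be traced back.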

The internal tool will allegedly be made available to more advisors in the future after feedback is collected.

There's a good chance this leaked "Ask" tool either is or is based on the previously leaked "Ajax." It is an internal tool that some allegedly referred to as "AppleGPT."

What is this AI stuff anyway



From this leak, it isn't clear whether Apple refers to the "Ask" tool as "AI," but the signs are there. The leaked information wouldn't use language like "generate" or "factual" if it were a generic search-and-fetch tool.

[Image: a lit-up HomePod with the text 'I'm sorry, I can't do that']
It's time for a big Siri upgrade and AI might help with that



Until a recent fad rebranded it with the sexier-sounding "artificial intelligence," much of the technology involved in modern AI was called machine learning. Apple's marketing team hasn't had a chance to name its technologies yet, so there's a chance Apple will pull a "spatial computing" and coin another name entirely.

Whatever it's called, tools like ChatGPT rely on having access to a large body of text to generate an answer using what can be described as next-word prediction. The model finds patterns in that text and decides what the next logical word might be to answer a query.

Of course, that's an oversimplification.
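As a toy illustration of the next-word prediction idea (and nothing like a real model), one can count which word follows each word in a tiny text and always pick the most frequent continuation:

```python
# Toy next-word predictor: count which word follows each word in a tiny
# corpus, then pick the most frequent continuation. Real chatbots use
# transformer networks trained on vast corpora; this only shows the idea.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept".split()

follows: defaultdict[str, Counter] = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" (seen twice after "the", vs "mat" once)
```

A transformer conditions on the entire preceding context rather than a single word, which is what makes modern chatbots coherent over long passages.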

Apple is no stranger to ML or AI. The touch keyboard has long relied on an algorithm to predict which key a user intends to press based on where they touch. Today, that same keyboard uses a transformer language model for autocorrect.

Rumors are piling up around Apple's push into AI and what it might mean for iOS 18. We'll all have to wait until WWDC in June to find out.

Rumor Score: Likely


Comments

  • Reply 1 of 17
    ... has this been Apple's long 'game' since 2011 - to 'quietly' harvest customer data and IP under the interminable EULA, misnomer of 'machine learning' and guise of 'what is on your iphone stays on your iphone' privacy for future monetisation...?

    What ever happened to the cascading class actions reported by Gizmodo per: gizmodo.com.au/2023/02/after-a-dozen-lawsuits-apple-breaks-its-silence-on-privacy-problems/ ???

    9secondkox2
  • Reply 2 of 17
    Hopefully they give this technology a better "human" name than Siri.

    How about Jeeves?
    9secondkox2
  • Reply 3 of 17
    MacPro Posts: 19,726 member
    ... has this been Apple's long 'game' since 2011 - to 'quietly' harvest customer data and IP under the interminable EULA, misnomer of 'machine learning' and guise of 'what is on your iphone stays on your iphone' privacy for future monetisation...?

    What ever happened to the cascading class actions reported by Gizmodo per: gizmodo.com.au/2023/02/after-a-dozen-lawsuits-apple-breaks-its-silence-on-privacy-problems/ ???

    I seriously doubt this.
    watto_cobra, 9secondkox2
  • Reply 4 of 17
    MacPro Posts: 19,726 member
    I hope this has LAM as well as LLM; being able to have 'Ask' do stuff will be a paradigm shift in personal computing on a Star Trekian level.  I believe it will, and this all using on-device edge technology.  My best guess is this will be released on the next iPhone with limited support for the last few iterations.  It would be cool if HomePod II also has support secretly already.
    watto_cobra, dewme, ForumPost
  • Reply 5 of 17
    Using your own internal database is the legal way to use "AI". 
    watto_cobra, 9secondkox2, ForumPost
  • Reply 6 of 17
    Hope this doesn’t spell the end of human tech support for Apple, which would now join the money-saving masses by offering only bot chat for help. Kill me now. 
    watto_cobra, 9secondkox2
  • Reply 7 of 17
    ... has this been Apple's long 'game' since 2011 - to 'quietly' harvest customer data and IP under the interminable EULA, misnomer of 'machine learning' and guise of 'what is on your iphone stays on your iphone' privacy for future monetisation...?

    I would argue you’ve got it backward. AI is neither artificial nor intelligent. Machine learning is what serious scientists in this area (NASA among many) called it from the beginning. 
    roundaboutnow, watto_cobra, 9secondkox2, ForumPost
  • Reply 8 of 17
    avon b7 Posts: 7,655 member
    Hope this doesn’t spell the end of human tech support for Apple, which would now join the money-saving masses by offering only bot chat for help. Kill me now. 
    A lot of chatbots currently used for first line contact are appalling. 

    However, there is a real (almost guaranteed) chance that for informational tech support, AI will likely prove far more accurate and far, far faster. 
    ForumPost
  • Reply 9 of 17
    If Apple doesn’t streamline it all to integrate into next-gen Siri, it’s a marketing and usability fail. 

    Let everyone else have a million AI apps for every little use case. 

    Even internally, Apple can have a special build of Siri and just wall internal info off from the public. 

    Basically everyone knows how to use Siri. But now you’ll be able to do so much more. 
  • Reply 10 of 17
    ... has this been Apple's long 'game' since 2011 - to 'quietly' harvest customer data and IP under the interminable EULA, misnomer of 'machine learning' and guise of 'what is on your iphone stays on your iphone' privacy for future monetisation...?

    What ever happened to the cascading class actions reported by Gizmodo per: gizmodo.com.au/2023/02/after-a-dozen-lawsuits-apple-breaks-its-silence-on-privacy-problems/ ???

    Pretty much 

    Apple does indeed collect info on its users. That’s why they offer an opt-out, knowing that most don’t bother. 
  • Reply 11 of 17
    chasm Posts: 3,290 member
    Hope this doesn’t spell the end of human tech support for Apple, which would now join the money-saving masses by offering only bot chat for help. Kill me now. 
    Unlikely. As described, the “Ask” tool can only point human advisors toward fixes to try based on the user’s description (and subsequent advisor interpretation) of the symptoms and issues.

    One of the BIGGEST factors beyond the cosmetic design and aesthetic of Apple’s products that keeps customers loyal is its legendary human telephone support, who go the extra mile with ALL callers, not just the AppleCare+ buyers.
  • Reply 12 of 17
    Nobody I know perceives a chatbot as a service improvement. It’s irritating and frustrating, and purely a cost-saving tactic. Just like Siri, which still understands nothing of what you are saying (non-English) and is still unusable. 
  • Reply 13 of 17
    If Apple doesn’t streamline it all to integrate into next-gen Siri, it’s a marketing and usability fail. 

    Let everyone else have a million AI apps for every little use case. 

    Even internally, Apple can have a special build of Siri and just wall internal info off from the public. 

    Basically everyone knows how to use Siri. But now you’ll be able to do so much more. 
    This literally makes no sense. Why would an internally used tool have to be part of Siri? 


    9secondkox2
  • Reply 14 of 17
    If Apple doesn’t streamline it all to integrate into next-gen Siri, it’s a marketing and usability fail. 

    Let everyone else have a million AI apps for every little use case. 

    Even internally, Apple can have a special build of Siri and just wall internal info off from the public. 

    Basically everyone knows how to use Siri. But now you’ll be able to do so much more. 
    This literally makes no sense. Why would an internally used tool have to be part of Siri? 


    Same codebase, same ease of use, and a great way to ensure that Siri is consistently tested to the max in critical applications such as tech support. If it's good enough for internal support, it's going to be great for customers. Protected IP can be removed/blocked from public-facing versions. Of course the internal version should get more input options such as keyboard, etc. This results in a more efficient, reliable, and trusted product for all involved. 
  • Reply 15 of 17
    If Apple doesn’t streamline it all to integrate into next-gen Siri, it’s a marketing and usability fail. 

    Let everyone else have a million AI apps for every little use case. 

    Even internally, Apple can have a special build of Siri and just wall internal info off from the public. 

    Basically everyone knows how to use Siri. But now you’ll be able to do so much more. 
    This literally makes no sense. Why would an internally used tool have to be part of Siri? 


    Same codebase, same ease of use, and a great way to ensure that Siri is consistently tested to the max in critical applications such as tech support. If it's good enough for internal support, it's going to be great for customers. Protected IP can be removed/blocked from public-facing versions. Of course the internal version should get more input options such as keyboard, etc. This results in a more efficient, reliable, and trusted product for all involved. 
    Same codebase? Nothing in this article nor the MacRumors one suggests this chatbot-style tool shares, or will share, any code with Siri. Further, training a model on information and then not allowing it to use that information doesn't gain you anything. These tools are only as useful as the information they are trained on. Public tools should be trained on information that will be helpful to the public.


  • Reply 16 of 17
    MacPro Posts: 19,726 member
    ... has this been Apple's long 'game' since 2011 - to 'quietly' harvest customer data and IP under the interminable EULA, misnomer of 'machine learning' and guise of 'what is on your iphone stays on your iphone' privacy for future monetisation...?

    What ever happened to the cascading class actions reported by Gizmodo per: gizmodo.com.au/2023/02/after-a-dozen-lawsuits-apple-breaks-its-silence-on-privacy-problems/ ???

    Pretty much 

    Apple does indeed collect info on its users. That’s why they offer an opt-out - Knowing that most don’t bother. 
    As far as I am aware, the opt-out feature is for third-party apps running in the Apple ecosystem.  I don't recall ever being asked if I wanted to opt out of an Apple product.  Correct me if I am wrong.
    muthuk_vanalingam
  • Reply 17 of 17
    FWIW, “AI” is the superset, which contains ML, LLMs, and generative AI within it. The subsets vary in purpose and technique. 