Apple has been working on its own ChatGPT AI tool for some time

Posted in General Discussion, edited July 2023

After years of weaving Machine Learning into all its software and devices, and despite its silence on the matter at WWDC, Apple appears to be readying its own ChatGPT-style large language model.




It can't really be said that Apple is lagging behind on AI, not when it has John Giannandrea as an actual Head of Artificial Intelligence, one who says AI and Machine Learning are deeply embedded in everything the company is doing. And not when Tim Cook was an early, if not the earliest, tech CEO to enthuse about AI tools for everyone, back in 2016.

Yet Apple is said to be lagging behind, because there is no Apple version of ChatGPT, the AI tool whose use has skyrocketed in 2023.

Now, as well as Apple recruiting more engineers to work on generative AI, Bloomberg says the company has already made an "Apple GPT".

According to the report, Apple has built both the chatbot and the framework used to create large language models (LLMs), the resource at the heart of ChatGPT, Google's Bard, and all similar tools. In Apple's case, the internal-use-only framework is called Ajax.

Bloomberg claims that Apple is privately concerned about missing out on AI, and it has made Ajax in order to unify machine learning development within the company.

The "Apple GPT" chatbot is solely for internal use at Apple, and it also requires special approval for access. It's already been briefly halted at one point by security concerns, and there is a mandate that it cannot be used to create features that will be used by customers.

Reportedly, though, Apple staff are using it for real work. It's been used for prototyping products, and it's also summarizing text for staff. According to unspecified sources within Apple, the tool currently does nothing that ChatGPT, Bard, and other AI systems can't do.

It's also not currently intended for a public launch. Instead, Apple is trying to find a consumer angle for such tools.

The sources stress that there is no plan in place yet, but also say that Apple hopes to have a major AI launch some time in 2024.

Internal disagreements



Reportedly, the drive toward AI is being led by both John Giannandrea and Craig Federighi -- and they apparently disagree on the approach. Giannandrea is said to be more conservative about what Apple can do, preferring to wait and see how ChatGPT and other tools evolve.

Separately, Bloomberg reports that Apple is planning a coaching service called Quartz which will use AI on health data collected by the Apple Watch. Any forthcoming Apple Car is also set to use Machine Learning and AI in its self-driving features.

Read on AppleInsider


Comments

  • Reply 1 of 36
    igorsky Posts: 757 member
    Sad day for the “Apple is falling behind” crowd. 
  • Reply 2 of 36
    mayfly Posts: 385 member
    It's not a matter of if, but when. Apple is unique in the tech world in that the operating system, software apps (even third-party ones), and hardware are all integrated for a unified experience for the end user. ChatGPT does not currently fit into that business model. When Apple releases a public AI-assisted interface, it's not going to look like current implementations. It will be better. It probably already is.
  • Reply 3 of 36
    Without a consumer-usable, or at least visible, AI assisting service in the OS, future improvements in hardware are going to be dismissed by consumers with two words: "so what?" It could be a conversational Siri that can take on projects, remember discussions and context, do brainstorming, and the like. It could be a WWSJD service, predicting What Would Steve Jobs Do in any situation. But there has to be some tangible benefit from the hardware, or it will erode Apple's cool factor, and quickly.
  • Reply 4 of 36
    timmillea Posts: 244 member
    There was a time when Apple always led with new technologies - mostly a deeply unprofitable time. In later years, they work in secret, study what the competition is doing, innovate on top, patent to the hilt, then embarrass the competition.

    My first degree at Durham University, starting in 1992, was 50% AI and 50% software engineering. Back then, no one I met outside the university had even heard of artificial intelligence, nor believed in it when I explained what it was. Now AI is on the main broadcast news all the time. Even now, Nick Clegg of Meta was on the airwaves this morning explaining that the current generation of AI is simply predicting the next word or 'token' from big data. Back in 1992, Durham had a huge natural language processing system called LOLITA which was based on deep semantic understanding - an internal, language-independent representation based on semantic graphs. LOLITA read the Wall Street Journal every day and could answer questions on it with intelligence, not parrot-fashion. For my final year project, I worked on the dialogue module, including 'emotion'. Then the LOLITA funding ended and that was the end of that. Had it been in the US, I can't help feeling that LOLITA would have morphed into one of the top corporates in the world. We don't support genius or foresight in the UK.

    It is truly depressing that 30 years later, the current state of AI is still neural nets trained on mediocre data sets. 
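    To make the "next token" point concrete, here is a minimal sketch in Python of a toy bigram model. It is nothing like a production LLM (no neural network, no attention, a ten-word corpus), but the objective -- pick a likely next token given what came before -- is the same one Clegg described.

        from collections import Counter, defaultdict

        # Tiny corpus; real systems train on vast amounts of text.
        corpus = "the cat sat on the mat and the cat slept".split()

        # Count how often each word follows each other word.
        follows = defaultdict(Counter)
        for prev, nxt in zip(corpus, corpus[1:]):
            follows[prev][nxt] += 1

        def predict_next(word):
            # Return the most frequent successor of `word`, if any.
            counts = follows.get(word)
            return counts.most_common(1)[0][0] if counts else None

        # Generate greedily, one token at a time.
        out = ["the"]
        for _ in range(4):
            nxt = predict_next(out[-1])
            if nxt is None:
                break
            out.append(nxt)

        print(" ".join(out))  # -> "the cat sat on the"

    Everything a ChatGPT-class model adds -- neural networks, attention, billions of parameters trained on big data -- goes into making that "most likely next token" guess enormously better, but the loop is the same.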




  • Reply 5 of 36
    byronl Posts: 363 member

    Reportedly, though, Apple staff are using it for real work. It's been used for prototyping products, and it's also summarizing text for staff. According to unspecified sources within Apple, the tool currently does nothing that ChatGPT, Bard, or other AI systems do. 

    How could a large language model be used to prototype products? 

    Also, I’m assuming it’s “other AI systems can’t do”

    Interesting development, though; they were very fast to make this.
  • Reply 6 of 36
    mayfly Posts: 385 member
    timmillea said:
    There was a time when Apple always led with new technologies - mostly a deeply unprofitable time. […]

    It is truly depressing that 30 years later, the current state of AI is still neural nets trained on mediocre data sets.

    If Apple had continued to "lead" (debatable) with new technologies without making a profit, there would be no Apple. There almost wasn't, right before Apple gave Steve Jobs $700 million to buy his NeXTSTEP OS and bring him back to the helm. There was a time when Sun Microsystems was in talks to buy Apple.

    But to bemoan the fact that AI hasn't achieved singularity in 30 years shows a lack of understanding of the enormous technical challenges involved. It will take processing power that does not even exist at the scale required at this time. Perhaps quantum computing will be the answer to the advances you're seeking. Decades from now.
  • Reply 7 of 36
    lukei Posts: 379 member
    Seems strange, given all of this, that Siri remains so poor.
  • Reply 8 of 36
    waveparticle Posts: 1,497 member
    Apple, Microsoft, and Google are the only companies that build OSes that run on personal computers. They will decide the future of AI.
    mayflyh2ppscooter63
  • Reply 9 of 36
    Prediction: The 2024 developers conference will focus on a major AI announcement implemented across the ecosystem. Particular emphasis will be on integration into Apple Vision Pro and visionOS, allowing consumers to create 3D environments and applications.
  • Reply 10 of 36
    Apple, Microsoft, and Google are the only companies that build OSes that run on personal computers. They will decide the future of AI.
    You mean Apple. Google doesn't have a real OS on the desktop. They have that wannabe, pretend ChromeOS. Microsoft doesn't have a mobile OS.

    Only Apple has a complete OS ecosystem comprising mobile & desktop.
  • Reply 11 of 36
    Japhey Posts: 1,767 member
    mayfly said:
    timmillea said: […]
    But to bemoan the fact that AI hasn't achieved singularity in 30 years shows a lack of understanding of the enormous technical challenges involved. It will take processing power that does not even exist at the scale required at this time. Perhaps quantum computing will be the answer to the advances you're seeking. Decades from now.
    Did you study AI and software engineering in college? If you did, well done. But if you didn't, what makes you think that you know more than someone who did?

    Also, who said anything about the Singularity?
    edited July 2023
  • Reply 12 of 36
    9secondkox2 Posts: 2,727 member
    Cool. Now if they'd just fuse that with Siri and make it that simple, great.

    Current AI chatbots are just annoying.
  • Reply 13 of 36
    waveparticle Posts: 1,497 member
    Apple, Microsoft, and Google are the only companies that build OSes that run on personal computers. They will decide the future of AI.
    You mean Apple. Google doesn't have a real OS on the desktop. They have that wannabe, pretend ChromeOS. Microsoft doesn't have a mobile OS.

    Only Apple has a complete OS ecosystem comprising mobile & desktop.
    I have put them in the correct order. LOL. Despite the differences you stated, each one will try hard on AI.
  • Reply 14 of 36
    danox Posts: 2,874 member
    Apple, Microsoft, and Google are the only companies that build OSes that run on personal computers. They will decide the future of AI.
    You mean Apple. Google doesn't have a real OS on the desktop. They have that wannabe, pretend ChromeOS. Microsoft doesn't have a mobile OS.

    Only Apple has a complete OS ecosystem comprising mobile & desktop.
    In the coming years, we'll find out how complete Android really is as an OS, maybe when they try to copy the Apple Vision Pro OS?
    edited July 2023
  • Reply 15 of 36
    charlesn Posts: 842 member
    Count me doubtful of Apple in this endeavor. I'm all-in on the Apple ecosystem except when it comes to a voice assistant. Thirteen YEARS after Apple bought the company that developed Siri -- which was also four YEARS before Amazon introduced Alexa -- Siri remains the laughable dumbass of the voice assistants, forever at the back of the class. I could go on ad nauseam with examples of Siri stupidity, but a fantastic recent account appeared in Mark Ellis Reviews when Mark -- another all-in on Apple guy except for Siri -- did a month-long test of switching out Alexa for Siri to see if Apple's voice assistant was finally competent in 2023. In a word... NO!!! Even Mark's wife noted that Siri was intolerably stupid. My advice to Apple: prove you can make a competent voice assistant before attempting a ChatGPT-type tool -- as of now, Siri can't even get a song request right consistently.
  • Reply 16 of 36
    danvm Posts: 1,409 member
    Apple, Microsoft, and Google are the only companies that build OSes that run on personal computers. They will decide the future of AI.
    You mean Apple. Google doesn't have a real OS on the desktop. They have that wannabe, pretend ChromeOS. Microsoft doesn't have a mobile OS.

    Only Apple has a complete OS ecosystem comprising mobile & desktop.
    From what I know, ChromeOS is considered a desktop OS, even with its limitations.

    As for MS, they have an enterprise ecosystem no other company has, including Apple. They can reach apps and services that Apple won't be able to reach with its current ecosystem, like ERPs, BI, and UC. Having a desktop and mobile OS is part of the equation, but there is much more to it than those two.
  • Reply 17 of 36
    mayfly Posts: 385 member
    Japhey said:
    mayfly said: […]
    Did you study AI and software engineering in college? If you did, well done. But if you didn't, what makes you think that you know more than someone who did?

    Also, who said anything about the Singularity?
    When I was in college, there was no AI. There was no software. The only computer in Chicago was an IBM 360 mainframe at the Illinois Institute of Technology. That's where I went to college, and where I majored in EE with a computer science minor. The first engineering job I had was at Robert Bosch Corp., developing electronic fuel injection hardware and software. Then in the engineering dept. at Siemens, working on the implementation of integrated circuit technology into their medical devices. Followed by 17 years of self-employment in graphic arts (you could find my name on the original Adobe PageMaker/InDesign and Illustrator teams). Followed by working at Apple until I retired in 2014.

    Other than that, you're right, I'm probably unqualified to opine about the resources necessary to advance AI to pass the Imitation Game.
  • Reply 18 of 36
    cg27 Posts: 213 member
    timmillea said:
    There was a time when Apple always led with new technologies - mostly a deeply unprofitable time. […]

    It is truly depressing that 30 years later, the current state of AI is still neural nets trained on mediocre data sets.
    Back in 1968, 2001: A Space Odyssey was released, based on fairly rigorous extrapolations of where tech would be 33 years later: HAL, moon bases, orbiting hotels with artificial gravity, and so on. As John Lennon said, life is what happens while you're busy making other plans. The Vietnam War and other factors delayed certain advancements and priorities. No doubt we'll have these within a few decades, and possibly a Mars base.
  • Reply 19 of 36
    Japhey Posts: 1,767 member
    mayfly said:
    Japhey said: […]
    When I was in college, there was no AI. […] Other than that, you're right, I'm probably unqualified to opine about the resources necessary to advance AI to pass the Imitation Game.
    Well done. Thanks for the CV.
    Now, what about the Singularity? The only thing I remember timmillea “bemoaning” is the slow rate of progress in the area in which they specialized. Does this not make them qualified to opine as well? At least, without being told that they don't understand and without putting words into their mouth? I think maybe it does.
    edited July 2023
  • Reply 20 of 36
    hexclock Posts: 1,259 member
    For anyone interested, I highly recommend watching Richard Feynman’s lecture on computer heuristics. At the end he does a Q & A session and discusses machine intelligence. 
