Doom and gloom reporting on Apple Intelligence continues to ignore Apple's playbook

Posted in General Discussion, edited July 22

Chances are you've been underwhelmed by Apple Intelligence, but a combination of unrealistic expectations, fictional narratives, and ignored progress won't stop it from being a key part of Apple's ecosystem.

Apple Intelligence isn't chasing the same AI grift as other companies



The world viewed WWDC 2024 with middling expectations for Apple's late entry into the artificial intelligence race. The slow rollout and limited functions drew a predictable yawn from analysts who were used to the grift and glamour promised by other tech companies, and that should surprise no one by now.

The protracted exhale of doom and gloom reporting continues with a wide-ranging history of, and speculation about, Apple's AI initiatives from The Information. Much of it has been previously covered by Bloomberg, but the report did detail the history of Apple's foundation model team.

However, my focus is on the predictable narrative about Apple's supposed internal confusion over AI, and how its strategy (or lack thereof) has resulted in discontent and staff departures. As always, it's important to remember that "people familiar with the matter" are likely disgruntled employees excited to air their pet peeves, and the coverage will likely lean into any negative narrative that helps amplify a story.

The reality is that the artificial intelligence industry is large, complex, and full of momentum, resulting in more flux and noise than even Apple is used to dealing with. It would be impossible to ignore the allure of working on what is promised to be the future of AI, superintelligence, with a pay package larger than Apple's CEO earns.

Apple's true AI problem



The issue isn't that Apple got caught off guard or has some kind of talent brain drain. It's the usual friction the world has with the company -- its pursuit of consumer privacy and its insistence on keeping product plans secret.

Both of these facets are polar opposites of the AI mantra that requires being loud, declaring the end of civilization with every point update, and sucking up every ounce of data available, legal or not. That said, Apple's offerings are very different from what we've seen from other AI companies and certainly come off as underwhelming compared to seemingly sentient AI chatbots.

There's no good way to make advanced autocorrect sound interesting. Apple Intelligence launched with Writing Tools that edit, manipulate, or generate text, Image Playground with poorly made renderings of people and emoji, and notification summaries that caught a lot of flak from news publications.

On the spectrum between Apple's worst launch, Apple Maps, and its best, the iPhone, Apple Intelligence sits much closer to Apple Maps. However, it's not the absolute disaster being peddled by many reports on the technology.

For the most part, people outside of the tech space likely have some idea of what artificial intelligence is, but no idea that Apple even has a version running on their iPhone. Apple's strategy in the space is to get out of the way and offer the technology in ways that enhance the user experience without always shouting that it is an AI feature.

The most visible indication that Apple Intelligence is in use is the shiny rainbow elements that show up in the UI. A lot of Apple Intelligence isn't even explained beyond a tiny symbol in a summary, if any sign of AI is shown at all.

The Siri animation is the most prominent use of Apple's AI rainbow



Apple's strategy going forward, offering developers and users direct access to the foundation models, is to get further out of the way. There will be times Apple Intelligence is in use that the user will never even realize -- and that's the point.

That is the perceived problem with Apple Intelligence. It isn't here to allegedly replace jobs or discover "new" physics; it's here to improve workflows and help users with mundane tasks.
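That developer access is already taking shape in the Foundation Models framework Apple previewed at WWDC 2025, which lets any app call the on-device model directly. The snippet below is a minimal sketch of that idea only, assuming the API shape Apple has shown so far (a LanguageModelSession with an async respond call); exact names and availability may differ between releases, and the summarize helper itself is purely illustrative.

```swift
import FoundationModels

// Illustrative helper: summarize a block of text with the on-device model.
// A sketch assuming the FoundationModels API previewed at WWDC 2025;
// names like LanguageModelSession and respond(to:) may differ by release.
func summarize(_ text: String) async throws -> String {
    // Fall back gracefully if the on-device model isn't available on this device.
    guard case .available = SystemLanguageModel.default.availability else {
        return text
    }

    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one short, neutral sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

Because the call happens inside an app's own feature rather than a chatbot window, the user may never know a model was involved at all -- which is exactly the "get out of the way" strategy described above.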

AI reporting's red herrings



One of the biggest misses that has been repeated about Apple Intelligence since its launch is the idea that Apple was behind or caught off guard by AI. Rewind to before ChatGPT's debut, and Apple was at the forefront of machine learning, shipping a Neural Engine in its products before that was an industry standard.

Apple Intelligence was built on Apple's strong foundation of ML



Once "artificial intelligence" became the industry's catch-all term, used everywhere even in place of the more accurate "machine learning," demand for the label skyrocketed. Instead of misleading customers, Apple continued referring to its machine learning tools as such, and waited to call anything AI until it was using generative technologies.

The report suggests that John Giannandrea was so disconnected from the idea that LLMs could be useful that he stopped Apple from pursuing them in 2022 when Ruoming Pang, the now-departed head of Apple's foundation model team, introduced them. That narrative is likely accurate, but it leaves out the fact that LLMs were largely useless in their debut state as incredibly hallucinatory chatbots.

The story around Apple Intelligence continues to suggest that Apple's executive team woke up sometime in 2023 and realized the technology was important and should be pursued. However, this very same report suggests that Apple's teams under Giannandrea had been working on various AI technologies since they were put together in 2018.

The reality is much simpler -- analysts and pundits are unhappy with how Apple has approached AI so far. They seem to have completely missed what Apple's strategy is and continue to push the idea that Apple is behind instead of considering the company's own direction.

Of course, I won't ignore that there has been some reshuffling of leadership internally as Apple has adjusted how it wants to tackle AI. Craig Federighi is now overseeing Apple's AI initiatives, and the company is looking externally to bolster its efforts.

However, instead of celebrating Apple's ability to develop its own models while also seeking to give customers options, reports suggest Apple is giving up on its internal efforts. There don't appear to be any signs of that actually occurring beyond these reports from anonymous employees.

Apple's true AI goals



For some reason, everyone seems completely ready to accept that Apple is just going to hire OpenAI or Anthropic to build the backend of Siri while alienating its own teams. This incredible departure from Apple's usual strategy of vertical integration has simply been accepted as inevitable because, given the narrative around how far behind Apple supposedly is, the company would have no other choice but to give up.

Apple CEO Tim Cook isn't sleeping on Apple Intelligence. Image source: Apple



Instead, when framed as the work of a company that knows what it is doing and is actively evolving its strategy to compete in the consistently exaggerated AI field, it's easy to see where Apple may be going with Apple Intelligence. It starts with app intents and ends with a private AI ecosystem.

Apple's executive team likely stepped in and surprised the Apple Intelligence team with its announcement of the contextual AI and Siri delay. The report reveals what I've suspected all along -- that Apple had a working version ready to ship, but its failure rate was too high for Apple's exacting standards.

The primary example Apple showed of the new contextual Siri that relies on app intents is the mom-at-the-airport scenario. Imagine if there were a 30% chance that it told you the wrong time, or even the wrong airport or the wrong day.

If poorly summarized headlines made the news, imagine what would happen if even one person out of billions of users was sent to the wrong airport.

Apple's pursuit of perfection with AI is likely a folly since the technology is inherently failure-prone. It can't think or reason, so unless there are some incredible checks and balances with human intervention, there likely will never be a 100% guarantee of accuracy.

Whatever system Apple devises to help overcome this limitation of AI, it'll likely debut later in 2025 or early 2026 during the iOS 26 cycle. I have no doubt that a version of that technology is coming, even as Apple further utilizes app intents in other ways across its platforms.
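For context, app intents are the existing, documented mechanism that a contextual Siri would lean on: an app declares the actions and data it can expose, and Siri, Shortcuts, and Apple Intelligence can invoke them on the user's behalf. Below is a rough sketch of such a declaration using Apple's App Intents framework; the flight-status intent, its parameter, and the hardcoded result are hypothetical stand-ins, not anything Apple has shown.

```swift
import AppIntents

// Hypothetical intent for illustration: exposes a flight-status lookup
// from a travel app to Siri and Shortcuts via the App Intents framework.
struct CheckFlightStatusIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Flight Status"
    static var description = IntentDescription("Looks up the arrival time for a flight saved in the app.")

    @Parameter(title: "Flight Number")
    var flightNumber: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query its own data store or a travel API here;
        // the hardcoded value simply stands in for that lookup.
        let arrival = "6:45 PM at Terminal 2"
        return .result(dialog: "Flight \(flightNumber) is scheduled to land at \(arrival).")
    }
}
```

The delayed contextual Siri would chain intents like this across apps on the user's behalf, which is precisely where a 30% failure rate becomes intolerable.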

Apple is already bringing third-party tools under the Apple Intelligence umbrella



While the contextual, on-device, and private Apple Intelligence will be game-changing on its own, it's nothing compared to what could come next. Apple's willingness to develop its models while still giving users access to others will be its ultimate victory in the space.

I don't believe Apple is going to replace the Siri backend with Claude or ChatGPT. That reporting seems to be the result of pundit wishcasting and the desire to see Apple fail.

Instead, Apple seems to be in talks with OpenAI, Anthropic, and others to bring versions of their popular AI models to Private Cloud Compute. If that happens, it goes well beyond Siri simply shaking hands with ChatGPT over a privacy-preserving connection -- it guarantees user data will never be used by third parties.

If Apple's pitch for Private Cloud Compute is a server-side AI that runs with the same privacy and security measures as an on-device AI, then imagine those same conditions applying to ChatGPT.

The future of Apple Intelligence is a combination of contextually aware local models running on devices with access to third-party models running via Private Cloud Compute. Users would be able to choose which models suit their needs in the moment and submit data without fear of it becoming part of training data.

Apple could win the race by giving users choice in an AI ecosystem



The Apple Intelligence ecosystem would be the only one built from the ground up for ethical artificial intelligence. Regardless of the model, on device or on Apple's servers, it would run privately, securely, and on renewable energy.

In a world filled with exaggerations around AI and its future, we should celebrate the grounded and realistic promises. Artificial intelligence is a tool, like a hammer, that can enhance human productivity, not replace it, and Apple's strategy reflects that.

Apple's tagline for Apple Intelligence is "AI for the rest of us." And by providing billions of Apple customers access to sustainable, ethical, private, and secure AI models, I'd argue Apple is pursuing a potential future that's not only realistic, but places it well ahead of the competition.



Read on AppleInsider


Comments

  • Reply 1 of 28
    ssfe11 Posts: 192, member
    "Apple's willingness to develop its models while still giving users access to others will be its ultimate victory in the space."
    This is the line that says it all, and what the media and so-called “AI Experts” are missing. It’s the search wars all over again, only this time Apple will not only have other companies' AI models but also its own. Sounds like a win-win.
  • Reply 2 of 28
    ssfe11 Posts: 192, member
    Great article!
  • Reply 3 of 28
    ne1 Posts: 87, member
    "The issue isn't that Apple got caught off guard or has some kind of talent brain drain."

    Usually, I would say the assertion of this article is correct. But in this case, Apple was caught off guard by the popularity of consumer available AI tools, that's why they have not yet gotten a model working to their satisfaction and have been delayed. They likely had too much of a focus on the abandoned Apple Car and Apple Vision Pro and failed to devote resources to AI early enough and recognize its importance. We all know they sat on Siri and didn't improve it for years.

    So normally, you would be right but in this case, they did drop the ball. I blame the executive team for not putting enough emphasis on it earlier. 



    edited July 22
  • Reply 4 of 28
    twolf2919 Posts: 184, member
    "a combination of unrealistic expectations, fictional narratives, and ignored progress won't stop it from being a key part of Apple's ecosystem" - I don't think anyone had unrealistic expectations!  The expectations were totally realistic, based on what Apple teased at the 2024 conference.  And it's not that everyone is ignoring progress - it's just that most of the AI progress is in areas that are pretty niche (not many people feel the need for AI generated images such as in Image playground or need email summaries) and, in others, it's just a step improvement (e.g. object recognition in Photos).  Removing unwanted objects from photos is a pretty nice feature that might be useful to anyone taking photos.  But I find it awkward/unintuitive to use - and the results are often not perfect.

    In summary, I think everyone expected a much better Siri in at most a year's time.  That wasn't an unreasonable/unrealistic expectation given what was demoed in 2024.  Apple simply shouldn't have demoed and subsequently advertised that feature if it couldn't deliver it.
  • Reply 5 of 28
    kkqd1337 Posts: 516, member
    My take is their AI and Siri are pretty bad products.

    But I don't think consumers really care about these services.
  • Reply 6 of 28
    Wesley_Hilliard Posts: 618, member, administrator, moderator, editor
    ne1 said:
    "The issue isn't that Apple got caught off guard or has some kind of talent brain drain."

    Usually, I would say the assertion of this article is correct. But in this case, Apple was caught off guard by the popularity of consumer available AI tools, that's why they have not yet gotten a model working to their satisfaction and have been delayed. They likely had too much of a focus on the abandoned Apple Car and Apple Vision Pro and failed to devote resources to AI early enough and recognize its importance. We all know they sat on Siri and didn't improve it for years.

    So normally, you would be right but in this case, they did drop the ball. I blame the executive team for not putting enough emphasis on it earlier. 



    I think the framing is important. It's not as if Apple wasn't working on LLMs in 2022 or earlier, it's just that the leadership didn't see them as worthy of pursuit that early on. I'd even argue that Apple engaging in the AI nonsense in 2024 was still too soon, and only brought about because of ceaseless chatter from pundits. To be clear, I'm glad they announced and released what they did, but it was still too soon by Apple standards. And the delay was necessary -- why release something that'll just spew garbage like the rest of the industry? We already have garbage generators; why add another?

    AI is going to help make a lot of work easier, but it's still in its infancy. All the people pretending it's this amazing futuristic technology today are going to feel really foolish in a couple of years when it's as mundane as an Excel spreadsheet. The hype is doing the technology an injustice, and people are believing the hype. AI fatigue among the general population set in last year as Apple announced Apple Intelligence, and the company is lucky it took such an approach.

    AI should be a background tool that you don't even realize you're using. It's why all the grift around it being some kind of world-altering technology that could be the end of mankind will ultimately fall apart, because it will never be much more than what it is today -- a powerful next-bit prediction engine. It's why I find the pursuit of "superintelligence" laughable, because it's a fiction invented by people seeking investment and the ability to bypass regulations that will ultimately fail. I expect that when a company finally declares they've reached superintelligence, it'll be a watered-down version that's much less than expected, if not completely underwhelming and not actually "beyond human intelligence." They moved the goalposts before, they'll do it again.

    Meanwhile, Apple will succeed by doing what it always does: releasing products people actually use, that are ethical, and maintain privacy.
  • Reply 7 of 28
    mpantone Posts: 2,497, member
    Consumer use of AI chatbots and AI assistants is real otherwise no one would be downloading and using ChatGPT. There's something of a generational gap for usage. Zoomers are using AI heavily for school (and their teachers are quite aware of it and are using their own AI tools to detect AI usage). Boomers, Generation X, and early Millennials use AI tools far less frequently.

    AI saves time for many tasks. It doesn't make anyone smarter, it just makes them more efficient. What someone does with their newly gained free time is something else. If you use it to read and learn new topics, yes, you can get smarter. If you just sink your extra free time into watching TikToks, well, I don't know about that.

    Having AI write your work e-mails doesn't make you understand your business any better. Nor does it help improve your language and communication skills. It might smooth things over with your colleagues but it won't help you come up with bright ideas on how to better the place you work.

    Anyhow, yes, all of these companies are fighting over a limited pool of talent. Some top-level AI engineer working on a model at Meta isn't helping Apple get its model improved. Yes, hiring the best people does matter, which is why these people are being paid more than minimum wage, or even more than many other people in the same company.

    It's also important to reiterate that no consumer AI business is actually turning a profit right now. Everyone is running at a loss, including OpenAI (operator of ChatGPT, the most popular consumer AI service). All of these companies are trying to set a foundational framework for future success and profit. It's a marathon, not a sprint, and Apple's mission of protecting user privacy is always going to put it at a disadvantage compared to competitors who make the lion's share of their profits by selling user activity data (Alphabet, Meta).

    Apple's senior management knows how bad Siri is. Even the "improved" Siri they were developing wasn't good enough so it was postponed to some future release date. Apple's years of neglect of Siri is now utterly and completely crystal clear.
    edited July 22
  • Reply 8 of 28
    Their playbook was to announce a bunch of features a year ago that they couldn't deliver on.
  • Reply 9 of 28
    mpantone Posts: 2,497, member
    Their playbook was to announce a bunch of features a year ago that they couldn't deliver on.
    None of these features were really expected to move the needle in terms of Apple's fiscal performance. So while Apple did fall off its timeline, it probably won't hurt them too much in the long run. And Apple does prioritize user privacy (which I appreciate), so they are always a step or two behind.

    Most of the currently deployed Apple Intelligence features are pretty lackluster. Some of them work some of the time but there's no killer app or killer feature. In the end, most consumer AI functionality is smoke-and-mirrors.

    But yes, they should have been more cautious about what they announced. My gut feeling is they really expected to deliver but belatedly discovered that getting these features to typical Apple polish was harder than they expected. We've seen periodic stumbles from Apple, after all, they are human.

    Generally the better practice is "under promise, over deliver" IMHO. But I'm not the CEO of a Fortune 5 company.
  • Reply 10 of 28
    The “unrealistic expectation” comes from Apple management. It was, and continues to be, unrealistic for them to let Siri fall so far off the curve and expect that most loyal Apple users wouldn’t be ready to just give up on it.

    Now that I’m flooded with Alexa+, which can handle multiple home automation tasks with ease, why should I wait for Siri just to catch up?

    Siri is an embarrassment and tarnishes the entire Apple ecosystem.


  • Reply 11 of 28
    You mean like how Apple introduced Siri and let it fail for the last how many years? Yeah, sure, there’s no doom and gloom. Like others have cited, Siri has a lot of catching up to do. Why wait? Why would other consumer electronics makers wait or even care if it’s compatible? They announced vaporware to sell phones last year, people bought them, and features still haven’t arrived. One broken promise after another.
  • Reply 12 of 28
    mpantone Posts: 2,497, member
    You mean like how Apple introduced Siri and let it fail for the last how many years? Yeah, sure, there’s no doom and gloom. Like others have cited, Siri has a lot of catching up to do. Why wait? Why would other consumer electronics makers wait or even care if it’s compatible? They announced vaporware to sell phones last year, people bought them, and features still haven’t arrived. One broken promise after another.
    Apple BOUGHT Siri. 

    Siri was a standalone startup whose roots were in SRI International (um, the similarity to the name isn't accidental); the speech recognition engine was based on Nuance technology. That startup had a standalone iOS app (I even remember the very un-Apple-like script "Siri" on the app icon on my iPod touch) in 2010 before the company was acquired by Apple. Apple most certainly did not invent the technology. They did incorporate it into iOS in a relatively quick 18 months.

    After that, they effectively did nothing of consequence with Siri. Siri's poor voice recognition has been well hashed over for 10+ years. People expected Apple to make Siri better once it was part of iOS. They did NOTHING. Siri has been festering for 15 years. They made bombastic statements in WWDC 2024 about Siri which failed to come to fruition.

    When Apple's senior management belatedly got the hint that ChatGPT-like consumer-facing AI was a thing, they understood that voice recognition was an integral part of this experience. But Siri's long neglect slowed down their efforts. Had they been continuously maintaining and improving Siri over 15+ years, they would have been able to leverage these improvements into launching more Apple Intelligence features.

    Voice command would be integral for the rumored screen-based HomePod device as well as something like an Apple Ring. Voice recognition needs to work 99+% of the time for 99+% of users, not just some of the time for some users. Including ones with strong accents, people not speaking in their native tongues, people with unusual cadences, lisps, etc.

    Today's Siri is nowhere close, which is why Apple senior management postponed the "improved Siri" deployment. It's still back in the garage. My guess is that New & Improved Siri™ has maybe a 50% chance of debuting in iOS 26. That's right, it might slip until iOS 27.

    While I have no special insight into Apple's engineering efforts, my guess is that Apple hoped to bolt AI features onto the existing Siri service and belatedly realized in late 2024 that they needed to do an entire code rewrite -- both front end and back end -- to get Siri to where they want it to go. One thing Apple knows it cannot do is rush out a "new" Siri that is only marginally better than what currently exists. People need to be able to TRUST the voice recognition system if they are going to use it regularly to do important things.

    Today's Siri does not enable any sense of trust whatsoever. It's actually closer to an SNL parody of voice recognition systems which it wouldn't be had Apple committed to continually improving it. What was a cute novelty in 2010 is now just an embarrassment.

    With each passing week, the AI stakes grow enormously. Apple is aware of that and they can't have "new" Siri going rogue like AI Dumpster Fire of the Week Replit.

    https://www.msn.com/en-us/news/other/replits-ceo-apologizes-after-its-ai-agent-wiped-a-companys-code-base-in-a-test-run-and-lied-about-it/ar-AA1J2IlP

    And that comes just a week after Grok 4's "MechaHitler" delusions of grandeur.

    Hell, you don't need comedy writers to make up AI technology skits. You can just cut-and-paste actual AI industry news in July 2025.

    Wouldn't it be fun if "new" Siri With Apple Intelligence accidentally on purpose deleted your entire calendar and contacts databases?

    One takeaway I've gotten from following the AI industry over the past three years is that AI CEOs are the absolutely least trustworthy executives in the entire technology industry. They are snake oil salesmen: Altman, Amodei, Musk, Zuckerberg, now this Replit doofus.

    Apple must take their time and get AI right because now the bar is dropping every single day. No one really needs some silly fawning AI chatbot that exhibits serious bias.
    edited July 22
  • Reply 13 of 28
    ne1 Posts: 87, member
    ne1 said:
    "The issue isn't that Apple got caught off guard or has some kind of talent brain drain."

    Usually, I would say the assertion of this article is correct. But in this case, Apple was caught off guard by the popularity of consumer available AI tools, that's why they have not yet gotten a model working to their satisfaction and have been delayed. They likely had too much of a focus on the abandoned Apple Car and Apple Vision Pro and failed to devote resources to AI early enough and recognize its importance. We all know they sat on Siri and didn't improve it for years.

    So normally, you would be right but in this case, they did drop the ball. I blame the executive team for not putting enough emphasis on it earlier. 

    I think the framing is important. It's not as if Apple wasn't working on LLMs in 2022 or earlier, it's just that the leadership didn't see them as worthy of pursuit that early on. I'd even argue that Apple engaging in the AI nonsense in 2024 was still too soon, and only brought about because of ceaseless chatter from pundits. To be clear, I'm glad they announced and released what they did, but it was still too soon by Apple standards. And the delay was necessary -- why release something that'll just spew garbage like the rest of the industry? We already have garbage generators; why add another?

    AI is going to help make a lot of work easier, but it's still in its infancy. All the people pretending it's this amazing futuristic technology today are going to feel really foolish in a couple of years when it's as mundane as an Excel spreadsheet. The hype is doing the technology an injustice, and people are believing the hype. AI fatigue among the general population set in last year as Apple announced Apple Intelligence, and the company is lucky it took such an approach.

    AI should be a background tool that you don't even realize you're using. It's why all the grift around it being some kind of world-altering technology that could be the end of mankind will ultimately fall apart, because it will never be much more than what it is today -- a powerful next-bit prediction engine. It's why I find the pursuit of "superintelligence" laughable, because it's a fiction invented by people seeking investment and the ability to bypass regulations that will ultimately fail. I expect that when a company finally declares they've reached superintelligence, it'll be a watered-down version that's much less than expected, if not completely underwhelming and not actually "beyond human intelligence." They moved the goalposts before, they'll do it again.

    Meanwhile, Apple will succeed by doing what it always does: releasing products people actually use, that are ethical, and maintain privacy.
    You make some interesting points. Thank you for responding in a thoughtful manner. Context is important, and I'm glad we agree on the leadership's lack of emphasis on AI. I largely agree that the technology is in its infancy and has a long way to go. But I disagree that it should always be a background tool implementation. People are not only using it as such -- they're using it as a primary driver of what they need to do every day (whether or not they should be doing that is a different matter).

    Overall, what Apple is (and should be) afraid of is the iPhone simply becoming a conduit for other AI services and apps. This was referenced last year when someone mentioned a concern that it would just become a "dumb box." We are a long way away from that right now, thankfully. However, Jony Ive's partnership with OpenAI to create new products must be increasing Apple's concern. Nothing OpenAI (or others) creates will replace an iPhone in the next few years, but in 5 years, 10? If those products (which are not meant to replace the phone) are successful, of course they will build a phone in the future.

    Whether AI is hype or not, right now people are buying into that hype (as you said) and one day, a "black box" type phone with AI as the primary driver may emerge that can do all sorts of things an iPhone can't. Which is why Apple is (and should be) busting a move on the technology now.

    The good part of last year's fiasco is they are unlikely to be caught with their pants down again. I hope. 
    edited July 22
  • Reply 14 of 28
    Wesley's article is a wandering mess, and has the feel of an Apple fanboy, which we all are to some extent.

    I would certainly differentiate Apple's previous "machine learning" with its neural chip vs. "Apple Intelligence".  I think the article conflates the two, and perhaps, I'm making more of a distinction than there really is.  The reason I differentiate between the two is that the former is really more behind the scenes, and Apple has been dedicating time, effort, and money to developing it.  However, the latter is what Apple came up with when it realized how utterly behind it was on AI because Giannandrea was disconnected from the industry's emphasis on LLMs, which is why Siri never evolved.  So, the leadership scrambled and came up with "Apple Intelligence" to make a more overt attempt at pushing "AI" to users.  Mind you, the latter is what Apple sold the public on; pundits did not push their opinions or wishes; Apple advertised those Apple Intelligence features.  And, it delivered junk and even failed to deliver the debacle that is "smarter Siri".  Furthermore, "Apple's strategy in the space is to get out of the way and offer the technology in ways that enhance the user experience without always shouting that it is an AI feature" is exactly the former "machine learning" strategy.  However, again, the public is expecting the more overt AI interactions, which is why I believe there needs to be a distinction between "machine learning" and "Apple Intelligence".  Apple did well with the former but utterly failed to anticipate the latter.

    I agree with the opinion of many of the other posts describing Apple's failure to develop Siri for the last 15 years.  As with my other posts, I believe this stems from Apple's inability to work on multiple projects at the same time instead switching from one product to another like a computer before Microsoft Windows 386.  With so many products, money, and employees, Apple has to start multitasking.


  • Reply 15 of 28
    Draco Posts: 47, member
    I can't even voice dictate a text message on my iPhone without it screwing something up. It doesn't understand context, grammar, capitalization or punctuation. 
    edited 8:52AM
  • Reply 16 of 28
    blastdoor Posts: 3,842, member
    Apple today is almost the antithesis of the Apple that introduced the iPod in 2001. 

    The iPod had almost no critical technology that Apple owned. The OS was licensed (Pixo). The key bit of hardware was a 1.8" hard drive from Toshiba. 

    The value-added in the iPod was the *design*, not the technology. Apple did a masterful job of combining off-the-shelf technologies into a unique product, and marketed it well. Later, they replaced the hard drive with flash -- a technology they also did not invent and did not control. 

    Apple today is almost the polar opposite. Apple has developed and/or owns many compelling technologies relevant to AI/ML, from Apple Silicon on up the stack -- compilers, programming languages, APIs (like CoreML), you name it. And any gaps in Apple's technology toolkit could be easily filled by licensing from others, because Apple today also has a ton of cash. 

    What Apple seems to be struggling with is combining those technologies into a compelling product. They are good at integrating those technologies into existing products, like when they added the neural engine to the iPhone and gave us Face ID. But they seem to really be struggling to create an AI product analogous to the iPod. 

    Meanwhile, we have companies like OpenAI creating a product -- the ChatGPT chatbot -- that is clearly an "AI product". It's not a perfect product -- there are many pain points. But it is an AI product, not just a product that happens to use AI. 

    If Apple can get its product development act together, it could do great things in the AI space because it does have so many great technologies. I suspect Apple will eventually get its act together and develop that great AI product, but it might require a new CEO to do it (presumably that will be John Ternus). 
  • Reply 17 of 28
    mpantone Posts: 2,497, member
    For anyone who thinks that AI is a passing fad, you are completely out of touch with reality.

    AI is here to stay. It's doing some amazing things in the enterprise markets and if it can eke out 100 basis points in net profit for some Fortune 500 company, guess what? They're gonna use it.

    I bet 99.9% of people on these tech news site discussion forums who say they don't use AI are over 30 years old. That's right. There's a generational gap in AI usage.

    Just yesterday, the AP reported on this:

    https://apnews.com/article/ai-companion-generative-teens-mental-health-9ce59a2b250f3bd0187a717ffa2ad21f

    That's right pre-teens are using this stuff and some older teenagers even see the danger in young children using AI.

    And the consumer AI industry is largely a lawless frontier right now, it needs heavy government regulation from world governments, not just your state's governor or 1600 Pennsylvania.

    And many of today's consumer AI companies are really no better than tobacco companies. They are creating AI chatbots that look and behave like anime characters (Grok's assistants, SpicyChat AI, Character.AI, et cetera) to attract youngsters into interacting with them. It's the digital equivalent of adding candy flavors to vaping products.

    Look at the way Grok started rolling out their AI anime-skinned assistants like Ani. They debuted on iPhones first, still not available on many Android devices. Why? Probably because iPhone is the platform of choice for young people (the under 25 market), especially teens. If you are over 35, guess what? You are not the target audience of consumer AI companies. I'm way beyond that but at least I know that I am not who AI companies crave as a user.

    If you care about the future of today's youngsters -- the ones who will be tasked with fixing many of the world's problems -- you need to pay attention to what AI is today, where it is going, who is using it, for what reasons, etc.

    There's one oldtimer here who continually gripes about AI, fearing it will displace him from his job as a writer. AI's potential effects are far, Far, FAR greater than that.

    Just sticking your head in the sand or plugging your ears and saying "I'm not using AI so nyah!" like a little brat throwing a tantrum isn't going to stop AI from proliferating. That much is clear in the 3+ years I've been closely following AI.
    edited 12:22PM
  • Reply 18 of 28
    Draco Posts: 47, member
    At this point, I should be able to carry on a conversation with my iPhone regarding just about anything. I've been playing with Grok, and the technology to do this is available now; it's obvious that Apple is way behind the curve on AI. 
  • Reply 19 of 28
    mpantone Posts: 2,497, member
    Is that with marginally normal Grok 3? Or with off-the-rails "delusions of grandeur" MechaHitler Grok 4?

    The problem isn't with the ability of these AI chatbot assistants to have a conversational tone. The problems are things like accuracy, bias, lack of reasoning, lack of emotion/compassion, inability to recognize satire/sarcasm, lack of common sense (which is a growing problem with a lot of humans).

    I occasionally try out all of these consumer AI assistants and they are all dreadfully inaccurate. About 5.5 months ago I asked "what time is the Super Bowl kickoff?" Not a single one got it right. About a week before the NCAA men's basketball tournament, I asked 4-5 chatbots to fill out a bracket. Not a single one provided a bracket that had the actual teams playing in the tournament.

    Grok's performance was unbelievably bad at the March Madness bracket. Not only did it create fake competitors playing a fake bracket, it predicted that all of the higher seeds would win their games. ZERO upsets. That *NEVER* happens in a real sports competition.

    Sure, it might sound like you're having a human-like conversation but Grok is about the worst of the AI chatbots. Embarrassingly bad. And let's not forget its MechaHitler hallucinations. That just happened a couple of weeks ago.

    This is a good example of how a consumer thinks Grok sounds real so it must be accurate and trustworthy. LOL, terrible assumption. And now with Grok's new found language skills, would you vote it into office? Accept its recommendation to eat rocks and put glue on pizza? Give up your ICBM launch codes?

    Ahahahahahahaha!!!!!

     :p 
    edited 12:36PM
  • Reply 20 of 28
    blastdoor Posts: 3,842, member
    mpantone said:
    For anyone who thinks that AI is a passing fad, you are completely out of touch with reality.

    AI is here to stay. It's doing some amazing things in the enterprise markets and if it can eke out 100 basis points in net profit for some Fortune 500 company, guess what? They're gonna use it.

    I bet 99.9% of people on these tech news site discussion forums who say they don't use AI are over 30 years old. That's right. There's a generational gap in AI usage.

    Just yesterday, the AP reported on this:

    https://apnews.com/article/ai-companion-generative-teens-mental-health-9ce59a2b250f3bd0187a717ffa2ad21f

    That's right pre-teens are using this stuff and some older teenagers even see the danger in young children using AI.

    And the consumer AI industry is largely a lawless frontier right now, it needs heavy government regulation from world governments, not just your state's governor or 1600 Pennsylvania.

    And many of today's consumer AI companies are really no better than tobacco companies. They are creating AI chatbots that look and behave like anime characters (Grok's assistants, SpicyChat AI, Character.AI, et cetera) to attract youngsters into interacting with them. It's the digital equivalent of adding candy flavors to vaping products.

    Look at the way Grok started rolling out their AI anime-skinned assistants like Ani. They debuted on iPhones first, still not available on many Android devices. Why? Probably because iPhone is the platform of choice for young people (the under 25 market), especially teens.

    If you care about the future of today's youngsters, the ones who will be tasked with fixing many of the world's problems, you need to pay attention to what AI is today, where it is going, who is using it, for what reasons, etc.

    There's one oldtimer here who continually gripes about AI, fearing it will displace him from his job as a writer. AI's potential effects are far, Far, FAR greater than that.

    Just sticking your head in the sand or plugging your ears and saying "I'm not using AI so nyah!" like a little brat throwing a tantrum isn't going to stop AI from proliferating. That much is clear in the 3+ years I've been closely following AI.
    I think you’re mostly right that the AI naysayers are old. BUT — I think the old timers who do embrace AI have a potentially huge advantage over everyone else, because for them the AI replaces some of the junior colleagues who they previously would have delegated to. AI used by a seasoned veteran who has a deep understanding of their business can be much more powerful than AI used by a kiddo who knows very little about the real world.