Internal fighting and privacy concerns hinder Apple's ability to modernize Siri
A new report highlights Apple's ongoing struggles to modernize its smart assistant, Siri, at a time when artificial intelligence chatbots are poised to reign supreme.
It's been suggested that Apple has decided to step up its game and focus on natural-language search processing, but the shift may be causing trouble behind the scenes.
A new report from The Information on Thursday morning takes a deep dive into Apple's forays into artificial intelligence -- specifically regarding Siri.
It suggests that a series of unfortunate events and departures has led to a lack of confidence among employees.
In late 2022, three of Apple's engineers responsible for AI implementation departed to work for Google. They believed that Google was a better environment for working on large language models, or LLMs, which allow smart chatbots to provide users with humanlike responses to questions. The departure was a blow to Apple's artificial intelligence teams, which were already hampered by existing issues.
Over three dozen former Apple employees cited organizational dysfunction and a lack of drive as hindrances to the company's AI and machine-learning efforts. This was especially true when working on Siri, Apple's highest-profile AI technology.
While Apple's relatively small gains in artificial intelligence haven't hurt the company, the tech giant likely knows it can't entirely ignore the shift to LLMs.
However, employing nascent technology can be a risk. ChatGPT is notorious for producing answers riddled with inaccuracies, if not outright falsehoods. Apple, more so than its competitors, would likely want to maintain its brand image.
Thursday's report also points out that Apple may be reluctant to implement LLMs because they would necessarily require queries to be processed in the cloud. Apple has spent years moving many of Siri's functions on-device, which helps protect users' privacy when using the voice assistant.
Apple's senior vice president of Machine Learning and AI Strategy, John Giannandrea, joined Apple in 2018. At the time, the company had limited protocols in place for collecting data on how users interacted with Siri, partly because of privacy concerns and partly because the company had decided the metrics weren't worth the investment.
Giannandrea attempted to expand the research, but executives ultimately paused the project when the media reported that third-party contractors had listened to Siri recordings without users' consent.
Instead, the company focused on curating responses that helped maintain its brand image. According to former Apple employees, most of Siri's "canned" responses are written or reviewed by humans, which helps reduce potentially embarrassing, if not outright worrisome, responses.
What direction Apple takes with Siri remains to be seen. Some reports claim the company is already researching ways to safely implement improved intelligent capabilities for user queries. Still, much like its foray into virtual reality, Apple likely won't put out any intelligent search features until it's ready.
Comments
Interestingly, this is analogous to your choices when hiring a human personal assistant. Do you want an assistant who is respectful and discreet, or one who knows more about you because they regularly rifle through all your stuff and share info about you with friends and paying customers? That second option will already know you might want to order some flowers, but that's because they listened to every bit of that argument you had with your wife this morning. Do you want an assistant who asks for clarification and is willing to admit when they simply don't know the answer, but can help you find it, or one who always tells you what you want to hear, whether or not their info is correct? The first one can be a little frustrating, but that second one is willing to send you out there misinformed.
I know I'd rather have the assistant who is discreet and self-confident enough to be honest about what info they can provide.
machow2.com/dictate-offline-catalina
"Google Assistant is designed to wait in standby mode until it detects an activation, like when it hears “Hey Google.” The status indicator on your device will let you know when Google Assistant is activated. When in standby mode, it won’t send what you’re saying to Google servers or anyone else. After Google Assistant detects an activation, it exits standby mode and sends your request to Google servers. This can also happen if there is a noise that sounds like “Hey Google” or an unintended manual activation.
If you don't want to activate your Google Assistant by voice, you can mute the mic on your device or turn off "Hey Google.""
On-device processing? Google Assistant does that too and has since 2019.
https://appleinsider.com/articles/19/05/07/google-assistant-response-speed-getting-improved-by-on-device-processing
I’m not currently concerned at all about the underlying language models or where the service is hosted, because I’ve lost faith and confidence in Siri and because using it has always seemed like a chore with hit-or-miss results. Apple simply needs to win back users’ confidence in the Siri service itself, no matter what technical approach they need to come up with to make it happen.
Maybe I’m an outlier, but I suspect that a nontrivial number of Apple customers (and employees) have largely given up on Siri? Yes, it does save lives when it works, but for everyday utility, it’s basically sitting in the junk drawer. Apple needs to change that.