Apple's Siri director to speak at AI Frontiers conference in November
Apple's senior director of Siri, Alex Acero, is scheduled to appear at this year's AI Frontiers conference, where he will be joined by other experts in the field to discuss the role natural language processing plays in modern personal assistant technology.

Apple product manager Kim Beverett demonstrates Siri Shortcuts onstage at WWDC 2018.
According to the AI Frontiers website, Acero will take part in a panel entitled "Personal Assistants," which is slated to follow the conference's opening keynote address on Nov. 9.
The Apple executive will likely discuss Siri, Apple's virtual assistant technology that powers user interaction on iPhone, iPad, Apple Watch, Mac and HomePod.
Acero joined Apple in 2013 as part of the tech giant's efforts to build out Siri's speech recognition and machine learning capabilities. Before joining Apple, Acero served a 20-year stint at Microsoft Research, where he managed teams in speech, computer vision, NLP, machine translation, machine learning and information retrieval.
AI Frontiers brings together a range of industry professionals working on artificial intelligence and deep learning applications. Over the course of three days, panelists will present and discuss their work, whether scholarly research or products used by millions of people around the world.
Joining Acero's panel are Amazon Alexa director Ruhi Sarikaya and Google research scientist Dilek Hakkani-Tur, both experts in the field of natural language processing.
Amazon's Alexa is one of the most popular virtual assistants on the U.S. market thanks to the company's Echo speakers. With integrations, or skills, that tap into third-party products and services, Alexa enables voice control of smart home appliances, media playback devices and services, shopping, communications and more.
Google's virtual assistant technology, aptly dubbed Google Assistant, is another top contender. Users and critics have found Assistant to be one of the more accurate voice-enabled products available, a testament to Google's heavy investment in AI and machine learning systems. At Google I/O 2018, the company showcased a particularly impressive feature called Google Duplex which, when trained correctly, allows Assistant to carry on a limited conversation with a human.
Acero is no stranger to the public eye. Last year, the Siri director provided an inside look at the technology's ability to learn new languages, specifically how data is ingested, processed and prepared for consumer use.
Apple has consistently improved its virtual assistant since it saw release as part of iOS 5 in 2011. Other tech companies have followed with their own solutions, Amazon and Google being the most notable.
Nearly seven years old, Siri's capabilities are still largely restricted to service tasks like creating calendar entries or controlling basic device functions. That is expected to change when iOS 12 launches this fall.
The next-generation operating system includes a new Shortcuts app that will allow users to create and run app macros via a custom Siri phrase. For example, an iPhone user can create a shortcut called "Heading home" that commands Siri to query Maps for a navigation route, send a custom text via Messages, set a home thermostat and begin media playback.
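On the developer side, exposing an app action to Siri Shortcuts is expected to rely on donating it to the system through SiriKit. The Swift sketch below is a minimal, hypothetical example of the NSUserActivity-based donation path in iOS 12; the class name and activity type identifier are placeholders, and a shipping app could just as well use a custom intent instead.

import UIKit
import Intents

// Hypothetical view controller for illustration; the class name and the
// activity type identifier are placeholders, not anything Apple has published.
class HeadingHomeViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        donateHeadingHomeActivity()
    }

    // Donate a "Heading home" activity so iOS 12 can surface it as a suggested
    // shortcut and let the user attach a custom Siri phrase to it.
    private func donateHeadingHomeActivity() {
        let activity = NSUserActivity(activityType: "com.example.app.heading-home")
        activity.title = "Heading home"
        activity.isEligibleForSearch = true
        activity.isEligibleForPrediction = true              // new in iOS 12
        activity.suggestedInvocationPhrase = "Heading home"  // hint shown when the user records a phrase
        activity.persistentIdentifier = "heading-home"

        // Assigning the activity to the view controller marks it as current,
        // which donates it to the system.
        userActivity = activity
    }
}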

Comments
Buy Watson already, and restart from scratch, Apple. And Siri-ously, shut down this silly little toy project.
Siri can leapfrog Apple over rivals like Amazon and Google. Apple holds a few aces in hand. Right now, by default, Siri monitors you and your location as you use apps, including third-party e-commerce apps like Nike and Adidas. I’ve seen no reportage on this. It’s speculation, based mostly on a review of all the various apps activated under my Settings > Siri & Search and a read of the accompanying text by Apple. Why does Siri want to know how I use the Adidas or Nike apps? This could be part of a brilliant strategy, via the App Store, to eventually bypass Amazon shopping, with Apple facilitating frictionless direct purchases of sportswear, athletic shoes, and much more. (Apple already has more credit card numbers and shipping addresses than Amazon, and probably Google too.) No analyst report I’ve seen has mentioned this yet. But in my view it’s only a matter of time, maybe iOS 14 or iOS 15, until Siri, through apps, can turn into a personalized multipurpose shopping assistant, overseeing whatever apps are important to you. It seems to me a concrete, elegant way (as with Workflow/Shortcuts in iOS 12) for Apple to tackle the giant field of e-commerce by breaking it into achievable pieces. It would be great for users, devs, and Apple. Maybe AI can flesh it out in a longer, more cogent piece.
Yes, Siri can still be dumber than a brick, recently misdirecting me, while on the road looking to repair a windshield, to a Safelite shop near Cleveland 800 miles away! Eight hundred miles, drawing a route on the map!! But a second attempt worked fine. This baffles me, since Siri understood my natural-language request and had my location data as I sat in my car on the road in Arlington, Virginia. Maybe Mr. Acero can address this. That said, Siri can be trained, and is improving. In comparisons of intelligent assistants, at least a few should include week-long sessions to take training into account.
Apple is as likely to “just sell” their iPhone line as IBM is to sell Watson. Lol.
I would absolutely LOVE to see an agreement where Watson is integrated into Siri.
However, the “just buy” Tesla, Google, Netflix, and apparently now IBM crowd’s tedious & monotonous, idiotic war cry is sad, old, & belies even the vaguest understanding of technology, business, or society in general.
I believe that Apple is shifting the problem to the developer and can then announce how much smarter Siri is performing. It’s just madness from a company that is sitting on billions and has to come up with this kind of “solution”.
If developers do not integrate these shortcuts into their apps, Siri will remain a dumb brick that can tell you the weather forecast and do the little basic things that were great news in 2014!
As to your second para, I have no clue what you mean. Explain?