This could be a problem for Google and Microsoft, because their AI services are all done online.
Nope. These things are on-device using Google AI:
Gemini Nano: Google's most efficient AI model for on-device tasks, which powers features like:
Summarize in Recorder: Summarizes recorded conversations, interviews, and presentations without a network connection
Smart Reply in Gboard: Suggests high-quality responses to save time
Magic Eraser: Erases larger objects and people from photos without leaving traces
Read aloud: Reads news articles aloud in English or another language
Transcription: Transcribes recorded conversations and identifies the two speakers
Summarizing audio recordings in the Recorder app
Generative AI reply suggestions in chat apps like WhatsApp
An enhanced Magic Eraser that generates new pixels to fill in spaces left by removed objects
Best Take: A photo editing feature that combines similar photos into one, allowing users to select their favorite expressions for each person in the photo.
Magic Audio Eraser: Removes unwanted background noise from videos on the Google Pixel 8 and Pixel 8 Pro
So while some features are done online, the resource-intensive Video Boost for instance, most of the AI features introduced with the Pixel 8 run privately on the user's phone rather than being sent to Google's servers. That makes Google's current use of on-device AI processing similar to the rumored integration of AI features on this fall's Pro iPhones.
However, if the reports are accurate, not all of Apple's AI features can be accomplished on-device. The more advanced features will still need cloud assistance.
I respect that you give examples to the contrary; they show I overgeneralized. But they don't show I was wrong about everything. All I need is one example to prove I was right.
That means Google and MS will have the benefit of running AI on-device and/or in the cloud. I think the May and June events will bring more news on how Apple, Google, and MS will work AI into their devices and platforms.
Spotlight search. I hope this upgrade provides some useful search. As Spotlight search currently stands, you will be lucky, very lucky, to find anything of any use on the Mac. The Mac is a great machine, and so is the iPhone. But search on the Mac and on the iPhone is about as terrible as it gets. Any time I am stuck having to look for a file or document, I reach for an anti-migraine tablet. I know I am going to get a massive headache from all the useless junk that is thrown at me which has nothing to do with the search criteria I entered.
Spotlight works great for me, on my Mac and iPhone/iPad. Couldn't live without it.
Perhaps you could use some instruction on composing better search prompts?
Really? I find Spotlight great, especially compared to Windows search, which makes a chocolate teaspoon seem otherworldly.
Spotlight works very well for me, too, both on my Mac and iPhone.
Regular search, not so much, on the Mac. Typing in the search field of a Finder window usually starts giving results with each letter typed, with the majority of them unrelated to my search term. Unrelated in that the results don't contain my search term at all. Maybe Search has found some metadata or something deeper than the file name, causing it to bring up scads of meaningless results.
As is typical when entering any data looking for a result, I hit the Enter/Return key. This keeps all the irrelevant, seemingly unrelated results. Relevant results may be in the noise, but most often there is so much of it that the results are useless unless I'm desperate.
The key is to avoid hitting Enter/Return at the end of typing my search terms. Instead I have to put the cursor on the drop-down that appears below the search field and click on the result that matches my search term(s). This gives me only results obviously related to my search term(s), though they may not be what I'm looking for specifically. But there are far fewer results to comb through, and most often the dozens of results from the first method are whittled down to one exact, relevant file.
If Apple made the latter result the default for hitting Enter/Return, a huge 1st World Problem/Annoyance would be eliminated. Me <— not holding my breath.
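For what it's worth, the filename-versus-everything mismatch described above can be sketched in a few lines. This is a toy illustration of the two matching behaviors, not how Spotlight actually works; the file names and contents are made up:

```python
def search(files, term, name_only=False):
    """files: list of (name, content) tuples; returns names of matches.

    With name_only=False, a file matches if the term appears in either
    its name or its content (the noisy "deep" behavior). With
    name_only=True, only the file name is checked (the drop-down behavior).
    """
    term = term.lower()
    results = []
    for name, content in files:
        if term in name.lower():
            results.append(name)
        elif not name_only and term in content.lower():
            results.append(name)
    return results


# Made-up sample files for illustration.
files = [
    ("budget-2024.numbers", "quarterly budget figures"),
    ("vacation-notes.txt", "remember to set the out-of-office budget alert"),
    ("recipes.txt", "cornflour, sugar, butter"),
]

# Deep search matches both the file named "budget" and a note that merely
# mentions the word; name-only search returns just the file actually
# named "budget".
print(search(files, "budget"))                  # name match + content match
print(search(files, "budget", name_only=True))  # name match only
```

The commenter's workaround amounts to switching from the first call to the second.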
As for AI, I don't have high hopes yet. I've seen so many wacky results that I'm not sure it will benefit Siri, which has been neglected for too long. I will gladly applaud Apple making Siri far more reliable and accurate, if and when that happens. Alexa makes Siri look like her childish, immature younger sister.
The complete opposite of my daily experience. Spotlight and search on macOS wonderfully index and return the files I search for, using text from the files.
Does that mean Siri suddenly understands Dutch? If not, any "massive upgrade" has zero value for me, and Siri remains switched off, as it has been for the past few years.
No offense but this is somewhat niche…
So much stuff is niche until it isn’t. We are still waiting for Apple News in non-English speaking countries!
Apple should have been at the forefront of this; instead they let Siri languish for the last 10 years. Today's Siri fail:
"Hey Siri, add cornflour to my shopping list." "Ok, your message to Chris says 'to my shopping list'. Shall I send it?" "What? No." "Ok."
"Hey Siri, add cornflour to my shopping list." "Ok, your message to Chris says 'to my shopping list add cornflour to my shopping list'. Shall I send it?" "No." "Ok."
"Hey Siri, add cornflour to my shopping list." "Ok, your message to Chris says 'to my shopping list add cornflour to my shopping list add cornflour to my shopping list'. Shall I send it?" "No." "Ok."
<turns HomePod off and back on>
"Hey Siri, add cornflour to my shopping list." "Ok, I have added corn and flour to your shopping list."
What a f**king disaster.
At the same time, Google respects a website owner's right to monetize their content and won't "summarize" without the content owner's permission. There have been no legal objections to it either, as far as I know, so I fully expect Apple will handle it the same way as Google: AI summarization, sans ads, will only be available with a website owner's permission. https://support.google.com/assistant/answer/14163109?hl=en
Is this satire? "Google respects..."?? They have had no respect at all for anyone's rights, be it copyright or "do not track" headers. They track everyone, even against outspoken objections to their wrongdoing. They try to use the W3C as a political instrument to strengthen their monopoly on the ad markets. They steal health data, book copyrights, and anything else they can get hold of. Google is the worst parasite the web has ever seen.
https://en.wikipedia.org/wiki/Microsoft_Copilot
"In January 2024, a premium service, Microsoft Copilot Pro, was launched, costing US$20 monthly."
Intel confirms Microsoft's Copilot AI will soon run locally on PCs, next-gen AI PCs require 40 TOPS of NPU performance | Tom's Hardware (tomshardware.com)