elijahg
About
- Username: elijahg
- Joined:
- Visits: 397
- Last Active:
- Roles: member
- Points: 6,576
- Badges: 2
- Posts: 2,904
Reactions
Apple's fix for bad AI notification summaries won't actually improve results
Wesley Hilliard said:
kkqd1337 said: I think major news organisations should be able to opt out of their data being AI processed
I think the issue is that the summary is sometimes so short it is limited in the number of words it can use, so it just picks what it thinks are the important points (like someone's name and "gay" in the previous BBC article) and mangles them together, which completely loses context and often reality.
Apple's fix for bad AI notification summaries won't actually improve results
This kind of thing proves Apple Intelligence has most definitely not been seriously worked on for anywhere near as long as Cook says it has. The Apple Summary Service does a better job than this, and doesn't hallucinate.
Merry Christmas, from the entire AppleInsider team
Apple-Nvidia collaboration triples speed of AI model production
danox said:
Friendly? Apple will use them for as long as they have to in the back of house. Apple is not far from convergence with many of the Nvidia graphics cards. When will that convergence take place: the M4 Studio Ultra? Or will it be the M5 or M6? It isn't that far away. Look at the power signature required for one of the graphics cards made by Nvidia; Apple and Nvidia aren't on the same path, the power signature wattage is so far off it's ridiculous.
https://www.pcguide.com/gpu/power-supply-rtx-4080/ recommends a 750 watt power supply for the system, and the back-of-house Nvidia stuff is even further out there. The current M2 Ultra takes 107 watts for everything, and the M4 is even more powerful and efficient than that, let alone what's coming with the M5 and M6.
Currently the 4080 is about 3.3 times faster than the M2 Studio Ultra (Blender); it will be interesting to see how close the M4 Studio Ultra gets next year. We know it'll use a hell of a lot less power to achieve its performance.
Apple Silicon is definitely powerful enough now, but given market inertia it will only be indisputable by the time of the M5.
My 2019 iMac gets a Geekbench Metal score about a third of the top-end M2 Ultra's, but it's four years older than the M2 Ultra, and it didn't cost £5,200 plus a display. The Pro Vega 48 in my iMac was pretty sluggish compared to the equivalent Nvidia card at the time, getting 11,000 on 3DMark; the Nvidia RTX 2080 of the same era was getting nearly double that. So that shows how Apple screwed over Mac users by refusing to use Nvidia. Nvidia also wrote their own Mac drivers, which were updated all the time and were much better than the Apple-written ATI drivers. And it shows that in the real world, right now, Apple Silicon GPUs are still a long way from matching dedicated GPUs.
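For concreteness, here is a minimal Python sketch of the performance-per-watt arithmetic the two posts above are arguing over, using only the figures they quote (107 W, the 750 W PSU recommendation, and the 3.3x Blender speedup); the ~320 W GPU-only board power is my own assumption, not a number from the thread.

```python
# Rough performance-per-watt comparison from the figures quoted in this thread.
# Caveat: 107 W is a whole-system figure for the M2 Ultra, while 750 W is only
# a PSU *recommendation* for an RTX 4080 system; actual GPU-only board power is
# lower (roughly 320 W, an assumption, not a figure from the posts).

def perf_per_watt(relative_speed: float, watts: float) -> float:
    """Relative Blender throughput divided by power draw."""
    return relative_speed / watts

m2_ultra     = perf_per_watt(1.0, 107)  # quoted: M2 Ultra, ~107 W for everything
rtx_4080_sys = perf_per_watt(3.3, 750)  # quoted: 3.3x faster, 750 W PSU recommendation
rtx_4080_gpu = perf_per_watt(3.3, 320)  # assumed: GPU-only board power

print(f"M2 Ultra            : {m2_ultra:.4f} perf/W")
print(f"RTX 4080 (750 W sys): {rtx_4080_sys:.4f} perf/W")
print(f"RTX 4080 (320 W GPU): {rtx_4080_gpu:.4f} perf/W")
```

On the quoted 750 W figure the M2 Ultra comes out roughly twice as efficient, which is danox's point; against a GPU-only figure the gap mostly closes, which is partly why the two posts read the same benchmark numbers so differently.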
Apple-Nvidia collaboration triples speed of AI model production
The reason Apple switched from ATI to Nvidia in the first place was that they fell out with ATI after ATI leaked an unreleased PowerBook with one of its GPUs inside. Unfortunately, that hatred of Nvidia really screwed over Mac owners from about 2012 onwards: we were stuck with crappy, hot, slow ATI GPUs when Nvidia's were much, much faster.
https://www.zdnet.com/article/ati-on-apple-leak-our-fault/