elijahg

About

Username: elijahg
Joined:
Visits: 397
Last Active:
Roles: member
Points: 6,576
Badges: 2
Posts: 2,904
  • Apple's fix for bad AI notification summaries won't actually improve results


    kkqd1337 said:
    I think major news organisations should be able to opt out of their data being AI processed 


    Apple isn't processing data from news organizations. Notification summaries are generated on-device with local data.

    @kkqd1337 never said Apple themselves were processing the data. It is being processed by on-device AI. You and Apple literally call it AI notification summaries.

    I think the issue is that the summary is sometimes so short it is limited in the number of words it can use, so it just picks what it thinks are the important points, like someone's name and "gay" in the previous BBC article, and mangles them together, which completely loses the context and often the reality.
  • Apple's fix for bad AI notification summaries won't actually improve results

    This kind of thing proves Apple Intelligence has most definitely not been seriously worked on for anywhere near as long as Cook says it has. The Apple Summary Service does a better job than this, and doesn't hallucinate. 
  • Merry Christmas, from the entire AppleInsider team

    Thanks, guys, for putting up with us. I can't speak for everyone, but I'm sure everyone would agree: we really do enjoy your work and the effort you put in, even if we don't always show quite as much appreciation as we should!

    Merry Christmas to all of you! 🎄
  • Apple-Nvidia collaboration triples speed of AI model production

    danox said:
    Friendly? Apple will use them for as long as they have to in the back of house. Apple is not far from convergence with many of the Nvidia graphics cards. When will that convergence take place: the M4 Studio Ultra? Or will it be the M5 or M6? It isn't that far away. Look at the power signature required for one of Nvidia's graphics cards; Apple and Nvidia ain't on the same path. The power signature wattage is so far off, it's ridiculous.

    https://www.pcguide.com/gpu/power-supply-rtx-4080/ recommends a 750 watt power supply for an RTX 4080 system, and the back-of-house Nvidia stuff is even further out there. The current M2 Ultra takes 107 watts for everything, and the M4 is even more powerful and efficient than that, let alone what's coming with the M5 and M6.

    Currently the 4080 is about 3.3 times faster than the M2 Studio Ultra in Blender. It will be interesting to see how close the M4 Studio Ultra gets next year; we know it'll use a hell of a lot less power to achieve its performance.

    Apple Silicon is definitely powerful enough now, market inertia aside; by the time of the M5 it will be indisputable.

    The Apple Silicon GPU benchmarks, whilst incredible for an iGPU, are somewhat cherry-picked. It will be interesting to see how much faster the Mx GPUs get, but I very much doubt, even with Apple's incredibly skilled engineers, that they will be anywhere near dedicated GPU performance. It's just physics: massively fewer transistors means less performance, no matter how many clever optimisations there are. On performance per watt, yes, Apple rules the roost by far and likely always will.

    My 2019 iMac gets a Geekbench Metal score about a third of the top-end M2 Ultra's, but it's 4 years older than the M2 Ultra and it didn't cost £5200 plus a display. The Pro Vega 48 in my iMac was pretty sluggish compared to the equivalent Nvidia card at the time, getting 11,000 on 3DMark while the Nvidia RTX 2080 of the day got nearly double that. That shows how Apple screwed over Mac users by refusing to use Nvidia. Nvidia also wrote their own Mac drivers, which were updated all the time and were much better than the Apple-written ATI drivers. And it shows that in the real world, right now, Apple Silicon GPUs are still a long way from matching dedicated GPUs.
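
    As a rough back-of-the-envelope check on the performance-per-watt point, here is a small Python sketch using only the figures quoted in this thread: the 4080 at roughly 3.3x the M2 Ultra in Blender, and a 750 watt system recommendation versus 107 watts measured. These are the thread's numbers, not official benchmarks, and the 750 watts is a recommended PSU size rather than measured draw, so treat the result as a rough upper bound on Apple's advantage.

        # Rough perf-per-watt comparison from the figures quoted in this
        # thread (not official benchmark data; 750 W is a recommended PSU
        # size, not measured draw, so the ratio is a rough upper bound).
        m2_ultra_perf = 1.0   # M2 Ultra Blender performance, normalised to 1
        rtx_4080_perf = 3.3   # 4080 quoted as ~3.3x faster in Blender
        m2_ultra_watts = 107  # quoted whole-system draw for the M2 Ultra
        rtx_4080_watts = 750  # quoted system power recommendation for a 4080

        m2_ppw = m2_ultra_perf / m2_ultra_watts
        nv_ppw = rtx_4080_perf / rtx_4080_watts

        # With these numbers the M2 Ultra delivers roughly 2x the
        # performance per watt while trailing on raw throughput.
        print(f"M2 Ultra perf/W: {m2_ppw:.4f}")
        print(f"RTX 4080 perf/W: {nv_ppw:.4f}")
        print(f"Apple/Nvidia perf-per-watt ratio: {m2_ppw / nv_ppw:.2f}x")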
  • Apple-Nvidia collaboration triples speed of AI model production

    The reason Apple switched from ATI to Nvidia in the first place was that they fell out with ATI over ATI leaking a PowerBook with an ATI GPU in it. Unfortunately, Apple's later hatred of Nvidia really screwed over Mac owners from about 2012 onwards: we were stuck with crappy, hot, slow ATI GPUs when Nvidia's were much, much faster.

     https://www.zdnet.com/article/ati-on-apple-leak-our-fault/