OctoMonkey

About

Username: OctoMonkey
Joined:
Visits: 58
Last Active:
Roles: member
Points: 758
Badges: 0
Posts: 351
  • Apple best-placed to benefit as AI goes mainstream, says Morgan Stanley


    I must admit that I'm asking myself "What's the point?" when the world is on a path of self-destruction anyway. 
    You know...  I was just about to have a piece of pie.  You had to go and ruin it for me.
  • iMac 24-inch M3 review: A clear sign that Intel Mac support is ending soon

    brianjo said:
    Since they are committed to the stupid power brick, it would have made sense to add a couple of USB or Thunderbolt ports there as well, tidying up the desk workspace.

    Hell, even better would be to move all of the brains into the brick and call it a Mac mini, thus making the display just a display and allowing you to upgrade the machine by replacing the brick instead of the whole thing...
    The requirement of the power brick is another outstanding example of Apple's insistence on form over function...  and should definitely have been on the list of "Cons". There is no justifiable excuse for the power brick, and, for me, it would reduce this from a 4/5 to a 3/5.
  • Apple CEO Tim Cook calls AI a fundamental technology

    danox said:
    and yet...  When my iPhone does a voice-to-text transcription of a voicemail message, it spells my name wrong.  A whole lot of intelligence there!

    Apple, unlike its competition, is working toward on-device AI solutions, not "ET phone home" AI. What’s implemented on the Apple Vision Pro begins with the M2/M3 chip working in conjunction with the R1 chip, which knows what to do when you look at something or slightly rub your fingers together to execute a command. In short, it won’t be Google’s (Video Boost) implementation, where information is sent to Google HQ and then bounced back to you after being scrubbed of personal information. I think the AI path Apple is taking is more private for the end user (on device).

    On-device AI is more complicated because it requires an SoC combined with OS-level software, and consequently takes a hell of a lot more time to design, engineer, and develop. Designing something that runs on a supercomputer back home at Google/Microsoft HQ does not, which is why Google resorted to that method to cover for the shortcomings of the Tensor SoC in the Pixel 8 Pro, which is five years behind Apple.
    Let us say your phone is set up with the owner contact information being Brad Lee Smith.  If somebody leaves a voicemail message starting with "Hello Brad Lee", the transcription should not read "Hello Bradley".  This is not a particularly complicated issue to understand.  The old Knowledge Navigator concept from 35 years ago showed this type of "AI" in action.
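    For what it's worth, this kind of contact-aware biasing is already expressible with Apple's public Speech framework. A minimal sketch, assuming the voicemail audio is available as a file URL; the makeVoicemailRequest helper is made up for illustration, but contextualStrings and requiresOnDeviceRecognition are real API:

        import Speech

        // Build a recognition request biased toward the user's contact names,
        // so "Brad Lee" wins over the homophone "Bradley".
        // (Hypothetical helper; property names are real Speech framework API.)
        func makeVoicemailRequest(audioURL: URL, contactNames: [String]) -> SFSpeechURLRecognitionRequest {
            let request = SFSpeechURLRecognitionRequest(url: audioURL)
            // Keep transcription on device (supported on recent hardware).
            request.requiresOnDeviceRecognition = true
            // Bias recognition toward names the user actually knows,
            // e.g. ["Brad Lee", "Brad Lee Smith"] from the owner's contact card.
            request.contextualStrings = contactNames
            return request
        }

    Feeding the owner's contact card through contextualStrings is exactly the signal that should keep "Hello Brad Lee" from collapsing into "Hello Bradley".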
  • Apple has most of the elements it needs to create its own search engine

    As long as they call it Sherlock and compensate Karelia, I am good with it!
  • Tim Cook confirms Apple is researching ChatGPT-style AI

    dewme said:
    The term “AI” is the current generation’s universally applicable adjective, equivalent to the previous generation’s “digital.” The Digital Age spawned crazy notions like the Digital Economy, Digital Ecosystem, and my favorite, the Digital Divide. No, the Digital Divide is not the line of demarcation between the free-flowing, non-quantized, infinitely divisible notions of the Analog Universe and a much different universe defined by ones and zeros.

    The good news is that you don’t need tools like ChatGPT to transport yourself and your work into the buzzwordy coolness of the new AI Age. All you need is the Search & Replace feature of your favorite human-driven generative tools, aka word processing and presentation programs, and simply replace all occurrences of “Digital” with “AI.”

    Welcome to the AI Economy, the AI Ecosystem, and my brand new favorite, the AI Divide. Just make sure you’re living on the right side of the new divide. You don’t want to be left behind.
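    Taken literally, dewme's Search & Replace recipe really is one line of Swift (pitchDeck is an invented example string):

        import Foundation

        // dewme's buzzword migration, taken literally:
        // every "Digital" becomes "AI".
        let pitchDeck = "Our Digital Economy strategy bridges the Digital Divide."
        let aiPitchDeck = pitchDeck.replacingOccurrences(of: "Digital", with: "AI")
        print(aiPitchDeck)  // Our AI Economy strategy bridges the AI Divide.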
    People are viewing "AI" today the same way people viewed "fuzzy logic" a quarter century ago...  Gads!  Has it really been that long?  "Fuzzy logic" was viewed by many "experts" in the industry as THE game changer for AI.  It was largely dead within 5-7 years.