freeassociate2

Just another faceless crustacean dog-toy. 

About

Username: freeassociate2
Joined:
Visits: 52
Last Active:
Roles: member
Points: 860
Badges: 1
Posts: 238
  • Game Mode isn't enough to bring gaming to macOS, and Apple needs to do more

    (Shrugs) Nintendo seems to do just fine with their hardware choices. It’s a lack of commitment from devs and studios, not a lack of bleeding-edge hardware.

    Also, “old” games, really? You mean the ones people are still playing, buying, and otherwise spending money on, even years later? Oh, I am so insulted.

    I don’t think the majority of the macOS market cares about gaming on that platform. On iOS and iPadOS … and potentially visionOS, definitely. Most of us just don’t sit at our Macs (which we use for work all day) to game. We sit down in front of our large-screen TVs and Atmos (or other) sound systems and play on our console(s), cradled by our comfy couches and chairs.

    Also, notably, the console community is relatively free of the endemic cheating the PC side nurtures. Let’s face it: Mac gamers don’t want to deal with the toxicity of the PC game market. They just opt out.

    https://www.pcgamer.com/on-behalf-of-pc-gaming-sorry-about-all-those-cheaters-in-your-console-games/
  • Windows XP can partially run on Vision Pro hardware in emulation

    Btw, how stupid is UTM’s “Securely run operating systems on your Mac” advertising in light of the fact that they advocate jailbreaking your phone?

    good grief
  • Windows XP can partially run on Vision Pro hardware in emulation

    About as useful as turning an Apple Watch into a mechanical one: a clever feat, more for attention and initial adulation than actual practical use.

    Because no one should ever do something they enjoy doing just for the joy of doing it.  /s
    You mean like serial killers? :D

    Doing impractical, worthless, or outright stupid things may define a lot of human behavior — that doesn’t make it an inherent good, buddy. 
  • Latest Intel and AMD vulnerabilities a gentle reminder to switch to Apple silicon

    The problem with those saying “it’s inevitable due to x” and “now that there are x number of processors” is twofold. First, Apple’s ARM processors have been on the market for fifteen years and in millions of units, yet the number of discovered exploits doesn’t even approach that of x86. Second, millions more ARM processors have been released by competitors in the same spaces, in addition to those in the embedded market. All present high-value targets, so the “obscurity” argument falls apart here. Here again, the exploits discovered over the same span of time on the x86 side are not comparable.

    Whether the weight of x86 flaws and engineering compromises will eventually sink the platform before a fundamental shift in computing makes it moot remains to be seen. As of right now, it’s mostly propped up by the inertia of the software market and the knowledge base of engineers and developers. If either of those advantages falters, or a competing market need (power savings, heat dissipation, cost, etc.) becomes paramount, x86 will fail in the market.

    Either way, the premise is still sound: there are significant factors that make transitioning to Apple Silicon, and ARM generally (or other architectures), a good move for numerous market segments. 
  • Apple has been working on its own ChatGPT AI tool for some time

    mayfly said:
    Japhey said:
    mayfly said:
    timmillea said:
    There was a time when Apple always led with new technologies - mostly a deeply unprofitable time. In latter years, they work in secret, study what the competition is doing, innovate on top of it, patent it to the hilt, then embarrass the competition.

    My first degree at Durham University, starting in 1992, was 50% AI and 50% software engineering. Back then, no one I met outside the university had even heard of artificial intelligence, nor believed in it when I explained what it was. Now AI is on the main broadcast news all the time. Just this morning, Nick Clegg of Meta was on the airwaves explaining that the current generation of AI is simply predicting the next word or 'token' from big data. Back in 1992, Durham had a huge natural language processing system called LOLITA, which was based on deep semantic understanding - an internal, language-independent representation built on semantic graphs. LOLITA read the Wall Street Journal every day and could answer questions on it with intelligence, not parrot fashion. For my final year project, I worked on the dialogue module, including 'emotion'. Then the LOLITA funding ended, and that was the end of that. Had it been in the US, I can't help feeling that LOLITA would have morphed into one of the top corporations in the world. We don't support genius or foresight in the UK.

    It is truly depressing that, 30 years later, the current state of AI is still neural nets trained on mediocre data sets.

    But to bemoan the fact that AI hasn't achieved the singularity in 30 years shows a lack of understanding of the enormous technical challenges involved. It will take processing power that does not yet exist at the scale required. Perhaps quantum computing will be the answer to the advances you're seeking. Decades from now.
    Did you study AI and software engineering in college? If you did, well done. But if you didn’t, what makes you think that you know more than someone who did?

    Also, who said anything about the Singularity?
    When I was in college, there was no AI. There was no software. The only computer in Chicago was an IBM 360 mainframe at the Illinois Institute of Technology. That's where I went to college, and where I majored in EE with a computer science minor. My first engineering job was at Robert Bosch Corp., developing electronic fuel injection hardware and software. Then I was in the engineering dept. at Siemens, working on the implementation of integrated circuit technology in their medical devices. That was followed by 17 years of self-employment in graphic arts (you could find my name on the original Adobe PageMaker/InDesign and Illustrator teams), and then by working at Apple until I retired in 2014.

    Other than that, you're right, I'm probably unqualified to opine about the resources necessary to advance AI to pass the Imitation Game.
    “you could find my name on the original Adobe PageMaker/InDesign and Illustrator teams”

    That’s pretty neat. (The other stuff is, too.) Congrats on retirement! (I’m kinda low-key dreading the financial aspects. But I’ve got another ten years to go.)