Japhey

About

Username
Japhey
Joined
Visits
73
Last Active
Roles
member
Points
4,986
Badges
2
Posts
1,767
  • Spotify's AI DJ expands to 50 countries around the world

    The Spotify DJ is actually really good. I listen to many different genres, and the AI did a great job mixing them together in a way that felt fluid. Transitions between slow and fast songs were usually broken up by the DJ, who was neither as annoying nor as frequent as I anticipated. Choosing music is something I’ve never needed help with, so I personally find this to be mostly a novelty…but a really good one that I might use again if I’m ever feeling lazy. 
  • Apple TV+ cancels Uma Thurman thriller 'Suspicion'

    jbdragon said:
    I didn't know this series even existed. I do like Uma Thurman, but knowing the series already ended, do I want to get invested in watching this?
    No. 
  • Apple TV+ cancels Uma Thurman thriller 'Suspicion'

    If I remember correctly, over the course of 8 episodes Uma Thurman had approximately 5 minutes of total screen time. So for this show to be labeled as an Uma “thriller” was disingenuous on Apple’s part from the very beginning. Though, that may not have mattered much had the show been just a little better. 
  • Apple has been working on its own ChatGPT AI tool for some time

    mayfly said:
    Japhey said:
    mayfly said:
    timmillea said:
    There was a time when Apple always led with new technologies - mostly a deeply unprofitable time. In latter years, they work in secret, study what the competition is doing, innovate on top, patent to the hilt, then embarrass the competition. 

    My first degree at Durham University, starting in 1992, was 50% in AI and 50% software engineering. Back then, no one I met outside the University had even heard of artificial intelligence, nor believed in it when I explained what it was. Now AI is on the main broadcast news all the time. Even now, Nick Clegg of Meta was on the airwaves this morning explaining that the current generation of AI is simply predicting the next word or 'token' from big data. Back in 1992, Durham had a huge natural language processing system called LOLITA which was based on deep semantic understanding - an internal, language-independent representation based on semantic graphs. LOLITA read the Wall Street Journal every day and could answer questions on it with intelligence, not parrot fashion. For my final year project, I worked on the dialogue module, including 'emotion'. Then the LOLITA funding ended and that was the end of that. Had it been in the US, I can't help feeling that LOLITA would have morphed into one of the top corporates in the world. We don't support genius or foresight in the UK. 

    It is truly depressing that 30 years later, the current state of AI is still neural nets trained on mediocre data sets. 

    But to bemoan the fact that AI hasn't achieved singularity in 30 years shows a lack of understanding of the enormous technical challenges involved. It will take processing power that does not even exist at the scale required at this time. Perhaps quantum computing will be the answer to the advances you're seeking. Decades from now.
    Did you study AI and software engineering in college? If you did, well done! But if you didn’t, what makes you think that you know more than someone who did? 

    Also, who said anything about the Singularity?
    When I was in college, there was no AI. There was no software. The only computer in Chicago was an IBM 360 mainframe at the Illinois Institute of Technology. That's where I went to college, and where I majored in EE, with a computer science minor. The first engineering job I had was at Robert Bosch Corp., developing electronic fuel injection hardware and software. Then in the engineering dept. at Siemens, working on the implementation of integrated circuit technology into their medical devices. Followed by 17 years of self-employment in graphic arts (you could find my name on the original Adobe PageMaker/InDesign and Illustrator teams). Followed by working at Apple until I retired in 2014.

    Other than that, you're right, I'm probably unqualified to opine about the resources necessary to advance AI to pass the Imitation Game.
    Well done! Thanks for the CV. 
    Now, what about the Singularity? The only thing I remember Timmillea “bemoaning” is the slow rate of progress in the area in which they specialized. Does this not make them qualified to opine as well? At least, without being told that they don’t understand and without putting words into their mouth? I think maybe it does. 
  • Apple has been working on its own ChatGPT AI tool for some time

    mayfly said:
    timmillea said:
    There was a time when Apple always led with new technologies - mostly a deeply unprofitable time. In latter years, they work in secret, study what the competition is doing, innovate on top, patent to the hilt, then embarrass the competition. 

    My first degree at Durham University, starting in 1992, was 50% in AI and 50% software engineering. Back then, no one I met outside the University had even heard of artificial intelligence, nor believed in it when I explained what it was. Now AI is on the main broadcast news all the time. Even now, Nick Clegg of Meta was on the airwaves this morning explaining that the current generation of AI is simply predicting the next word or 'token' from big data. Back in 1992, Durham had a huge natural language processing system called LOLITA which was based on deep semantic understanding - an internal, language-independent representation based on semantic graphs. LOLITA read the Wall Street Journal every day and could answer questions on it with intelligence, not parrot fashion. For my final year project, I worked on the dialogue module, including 'emotion'. Then the LOLITA funding ended and that was the end of that. Had it been in the US, I can't help feeling that LOLITA would have morphed into one of the top corporates in the world. We don't support genius or foresight in the UK. 

    It is truly depressing that 30 years later, the current state of AI is still neural nets trained on mediocre data sets. 

    But to bemoan the fact that AI hasn't achieved singularity in 30 years shows a lack of understanding of the enormous technical challenges involved. It will take processing power that does not even exist at the scale required at this time. Perhaps quantum computing will be the answer to the advances you're seeking. Decades from now.
    Did you study AI and software engineering in college? If you did, well done! But if you didn’t, what makes you think that you know more than someone who did? 

    Also, who said anything about the Singularity?