Japhey
About
- Username
- Japhey
- Joined
- Visits
- 73
- Last Active
- Roles
- member
- Points
- 4,986
- Badges
- 2
- Posts
- 1,767
Reactions
Spotify's AI DJ expands to 50 countries around the world
The Spotify DJ is actually really good. I listen to many different genres, and the AI did a great job mixing them together in a way that felt fluid. The transitions between slow and fast songs were usually broken up by the DJ, who was not as annoying or as frequent as I anticipated. Choosing music is something I’ve never needed help with, so I personally find this to be mostly a novelty…but a really good one that I might use again if I’m ever feeling lazy.
Apple TV+ cancels Uma Thurman thriller 'Suspicion'
jbdragon said: I didn't know their series even existed. I do like Uma Thurman, but knowing the series already ended, do I want to get invested in watching this?
Apple TV+ cancels Uma Thurman thriller 'Suspicion'
If I remember correctly, over the course of 8 episodes Uma Thurman had approximately 5 minutes of total screen time. So labeling this show an Uma Thurman “thriller” was disingenuous on Apple’s part from the very beginning. Though that may not have mattered much had the show been just a little better.
Apple has been working on its own ChatGPT AI tool for some time
timmillea said: There was a time when Apple always led with new technologies - mostly a deeply unprofitable time. In later years, they work in secret, study what the competition is doing, innovate on top, patent to the hilt, then embarrass the competition.

My first degree at Durham University, starting in 1992, was 50% in AI and 50% software engineering. Back then, no one I met outside the university had even heard of artificial intelligence, nor believed in it when I explained what it was. Now AI is on the main broadcast news all the time. Even now, Nick Clegg of Meta was on the airwaves this morning explaining that the current generation of AI is simply predicting the next word or 'token' from big data. Back in 1992, Durham had a huge natural language processing system called LOLITA which was based on deep semantic understanding - an internal, language-independent representation based on semantic graphs. LOLITA read the Wall Street Journal every day and could answer questions on it with intelligence, not parrot fashion. For my final year project, I worked on the dialogue module, including 'emotion'. Then the LOLITA funding ended and that was the end of that. Had it been in the US, I can't help feeling that LOLITA would have morphed into one of the top corporates in the world. We don't support genius or foresight in the UK.

It is truly depressing that 30 years later, the current state of AI is still neural nets trained on mediocre data sets.

mayfly said: But to bemoan the fact that AI hasn't achieved singularity in 30 years shows a lack of understanding of the enormous technical challenges involved. It will take processing power that does not even exist at the required scale at this time. Perhaps quantum computing will be the answer to the advances you're seeking. Decades from now.

Japhey said: Also, who said anything about the Singularity?

mayfly said: Other than that, you're right, I'm probably unqualified to opine about the resources necessary to advance AI to pass the Imitation Game. Now, what about the Singularity?

The only thing I remember timmillea “bemoaning” is the slow rate of progress in the area in which they specialized. Does this not make them qualified to opine as well? At least, without being told that they don’t understand, and without putting words into their mouth? I think maybe it does.