mayfly
About
- Username: mayfly
- Joined:
- Visits: 4
- Last Active:
- Roles: member
- Points: 1,100
- Badges: 1
- Posts: 385
Reactions
Goldman Sachs continues to bleed cash from Apple Card operations
eightzero said: vtvita said: eightzero said: mayfly said: eightzero said: OK, I really don't get this. How does GS lose money on a credit card? Are they paying Apple a disproportionate amount of their rake from the cardholders? The article says "credit losses," so somehow more Apple Card holders are welching?
Or... it is possible GS thinks they just aren't making the billion they planned? Not sure that's a "loss."
I am sure GS has overhead on these operations. They have to pay staff to provide customer service, and other infrastructure like IT and the like. But a billion in losses to that? Really?
BTW, this card from GS is the most miserable credit card experience I've ever had, for many reasons, which I've written about before. I've simply made it dormant, seldom, if ever, to be used again.
Apple guts internal communication tool, crippling union organization
larrya said: Mark of a great company: censorship and insecurity.
Apple has been working on its own ChatGPT AI tool for some time
Japhey said: mayfly said: timmillea said: There was a time when Apple always led with new technologies - mostly a deeply unprofitable time. In later years, they work in secret, study what the competition is doing, innovate on top, patent to the hilt, then embarrass the competition.
My first degree at Durham University, starting in 1992, was 50% in AI and 50% software engineering. Back then, no one I met outside the university had even heard of artificial intelligence, nor believed in it when I explained what it was. Now AI is on the main broadcast news all the time. Even now, Nick Clegg of Meta was on the airwaves this morning explaining that the current generation of AI is simply predicting the next word or 'token' from big data. Back in 1992, Durham had a huge natural language processing system called LOLITA, which was based on deep semantic understanding - an internal, language-independent representation based on semantic graphs. LOLITA read the Wall Street Journal every day and could answer questions on it with intelligence, not parrot fashion. For my final year project, I worked on the dialogue module, including 'emotion'. Then the LOLITA funding ended, and that was the end of that. Had it been in the US, I can't help feeling that LOLITA would have morphed into one of the top corporates in the world. We don't support genius or foresight in the UK.
It is truly depressing that 30 years later, the current state of AI is still neural nets trained on mediocre data sets.
But to bemoan the fact that AI hasn't achieved singularity in 30 years shows a lack of understanding of the enormous technical challenges involved. It will take processing power that does not even exist at the scale required at this time. Perhaps quantum computing will be the answer to the advances you're seeking. Decades from now.
Also, who said anything about the Singularity? Other than that, you're right, I'm probably unqualified to opine about the resources necessary to advance AI to pass the Imitation Game.