blastdoor
About
- Username: blastdoor
- Joined
- Visits: 335
- Last Active
- Roles: member
- Points: 6,902
- Badges: 1
- Posts: 3,851
Reactions
iPhone 17 may have been spotted in the wild
Stabitha_Christie said:
I love a good rumor as much as the next person, but can we not normalize this kind of behavior? While it isn't illegal to take someone's photo in public, it is still an invasion of privacy, and promoting this kind of thing will only lead to more of it.

Wesley_Hilliard said:
Like I said in the piece, it's not something that's going to be a problem, because these kinds of design changes are very rare. Nearly every other prototype iPhone has looked identical to its predecessor, with the exception of iPhone X, which was prototyped in a literal brick-sized box, IIRC. I wouldn't worry about this becoming a common way to leak iPhone information.
Apple researchers take aim at AI hallucinations and true conversations
brian_001 said:
Well said, @blastdoor. Apple's research is top-notch, but they are lagging behind in turning that research into actual products and services.
Doom and gloom reporting on Apple Intelligence continues to ignore Apple's playbook
Wesley_Hilliard said:
I do not subscribe to the idea that AI will take over or become sentient. It's going to make humans more efficient at certain things and render some jobs redundant -- not because the AI is doing the job, but because it'll take fewer humans to do the same work. The writer worried about losing his job to AI shouldn't be, because even if you write with AI, you'll need human intervention to give it soul and reason -- which AI will never have.

That's why it's so funny to me that people see Apple as so behind. It's laying the groundwork for the future of a cooperative AI ecosystem built on Apple platforms, with Apple's rules and values, and because it isn't complete this second, it somehow means they're lost in the woods. As with nearly every Apple endeavor in the past 30 years, I wouldn't bet against them.

blastdoor said:
Accurately recognizing that they are behind is not the same thing as betting against them. I also would not bet against them. But I can open my eyes and see that, for the moment, they are absolutely behind in this market with respect to actual products that are useful to people and worth paying for.

I can ask the o3 model to write an R Shiny app (what I'm doing right now) with such-and-such features and it does it. I can then iterate productively to refine the app. I can ask it why it did things and to explain how various aspects of the code work, so that I learn more (I've used R forever, but I'm new to Shiny). This effectively replaces a research assistant or programmer for me. It's a huge productivity boost.

For another example: earlier today I asked ChatGPT whether there's a connection between conducting a fixed-effects meta-analysis using weights that account for error covariance and conducting a principal components analysis. It explained the connection and then, based on an earlier conversation it remembered, suggested how the connection applied to some other work I was doing. If I asked Siri anything like that, the answer would be "here's what I found on the web."
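The post doesn't say what features the app has, so purely as a point of reference, here is a minimal sketch of the kind of R Shiny scaffold a session like the one described would start from and iterate on. The slider, the histogram, and every identifier below are placeholders, not anything from the actual app.

```r
# Minimal Shiny scaffold (placeholder UI and output, not the app described above).
library(shiny)

ui <- fluidPage(
  titlePanel("Example app"),
  sidebarLayout(
    sidebarPanel(
      sliderInput("n", "Number of draws", min = 10, max = 1000, value = 100)
    ),
    mainPanel(
      plotOutput("hist")
    )
  )
)

server <- function(input, output, session) {
  # Re-renders automatically whenever the slider value changes.
  output$hist <- renderPlot({
    hist(rnorm(input$n), main = "Random normal draws", xlab = "value")
  })
}

shinyApp(ui, server)
```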
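The post also doesn't spell out the connection ChatGPT explained. One plausible version of it, assuming the standard GLS form of the fixed-effects (common-effect) estimate with error covariance matrix Sigma, is that inverting Sigma through its eigendecomposition re-expresses the weighted average in terms of the principal components of the errors:

```latex
% Fixed-effects estimate of a common effect theta under error covariance \Sigma (GLS form):
%   \hat{\theta} = (\mathbf{1}^\top \Sigma^{-1} \mathbf{1})^{-1}\, \mathbf{1}^\top \Sigma^{-1} \mathbf{y}
% Writing \Sigma = Q \Lambda Q^\top (the eigendecomposition PCA is built on), so that
% \Sigma^{-1} = Q \Lambda^{-1} Q^\top, the same estimate becomes a sum over principal
% components q_k, each weighted by the inverse of its variance \lambda_k:
\hat{\theta}
  = \frac{\mathbf{1}^\top \Sigma^{-1} \mathbf{y}}{\mathbf{1}^\top \Sigma^{-1} \mathbf{1}}
  = \frac{\sum_k \lambda_k^{-1}\,(q_k^\top \mathbf{1})(q_k^\top \mathbf{y})}
         {\sum_k \lambda_k^{-1}\,(q_k^\top \mathbf{1})^2}
```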
Wesley_Hilliard said:
I'm glad you get those uses from ChatGPT, but that doesn't have any bearing on what Apple Intelligence can or can't do. (Between the two, I'd take the go-kart, tbh.)

It's not that I'm saying ChatGPT is bad, or that people who use it are bad. Let's take our opinion of AI and the ethical conundrums out of the equation entirely. Apple isn't building ChatGPT -- why should it? Even when Siri is backed by an LLM, it won't be ChatGPT. That's the problem here. People are comparing apples and oranges, literally. They're two different things.

Also, let's not so easily dismiss the use cases presented by each. How many people need to vibe code? How many are doing research (however flawed research with an AI might be)? Now compare that to how many people have iPhones capable of running Apple Intelligence who need to transform text, edit photos, or triage notifications.

It's not that ChatGPT isn't useful -- it's clearly being used by millions -- but what do users need? Apple may work toward something like a chatbot in the future, or offer third-party ones via Private Cloud Compute, but that isn't its goal today. Right now, Apple's clearly stated goal is to develop artificial intelligence that works privately to serve users in a way that isn't intrusive, and it's doing that.

What you're asking for is completely antithetical to Apple's goals, and the sooner nerds and pundits realize that, the happier they will be. Because if you go to McDonald's and order steak, you're going to get laughed out of the room. In the end, Apple will be the true powerhouse in AI because it's actually taking a human approach to its product instead of trying to convince everyone of a delusion of grandeur.

blastdoor said:
Apple Intelligence is not currently a product. It's a bundle of technologies. Those technologies could be incorporated into a successful product, but so far that hasn't happened.

In many ways, Apple has become more of a technology company than a product company. I'm certainly pro-technology, and Apple has some great technology. But making great products is also important. Apple doesn't have an AI product, just AI technologies that are integrated into existing products. That's fine and good, but it's not enough.