blastdoor

About

Username: blastdoor
Joined:
Visits: 335
Last Active:
Roles: member
Points: 6,902
Badges: 1
Posts: 3,851
  • iPhone 17 may have been spotted in the wild


    charlesn said:
    I love a good rumor as much as the next person, but can we not normalize this kind of behavior? While it isn't illegal to take someone's photo in public, it is still an invasion of privacy, and promoting this kind of thing will only lead to more of it.
    Sorry to say, but the law is quite clear: when you're in public, you're fair game. While I can't take a photo of someone and use it commercially, there's nothing stopping anyone from taking photos, recording video, or capturing audio of anyone in public.

    Like I said in the piece, it's not something that's going to be a problem because these kinds of design changes are very rare. Nearly every other prototype iPhone has looked identical to its predecessor with the exception of iPhone X, which was prototyped in a literal brick-sized box IIRC.

    I wouldn't worry about this becoming a common way to leak iPhone information.
    Not sure if you missed the point or are intentionally avoiding it. I clearly stated it wasn’t illegal, but legality doesn’t make it right. People should be able to go out in public without someone photographing them. When you use the photos, you are ultimately enabling the behavior. Cool that you didn’t break the law, but did y’all make the right choice? This person now has their pictures splattered around the internet. The news value? That there is a new phone, and if you put it in a giant case no one will see what it looks like? Stop the presses!
    Stabitha, I hear the Photography Police are actively recruiting. Give it some thought. Seems like you'd be an enthusiastic candidate. 
    Yes, suggesting that we should be respectful of people's privacy is really just me being overbearing. What a terrible world it would be if we respected each other.
    Somehow I simultaneously think everything you’re saying is right but also don’t care too much. Maybe that’s what “desensitized” means.
  • iPhone 17 may have been spotted in the wild

    I read a rumor that the iPhone 18 will be even better than the iPhone 17 — it might also look slightly different. So I’m going to wait for that one. 
  • Apple researchers take aim at AI hallucinations and true conversations

    brian_001 said:
    Well said @blastdoor. Apple's research is top-notch, but they are lagging behind in implementing their research work in various products and services.
    Yup. It reminds me a bit of how once upon a time Toshiba had this unique 1.8" hard drive (when the smallest hard drive anyone else had was 2.5") but couldn't figure out how to incorporate it into a successful product. Eventually another company that wasn't much of a leader in technology, but very much a leader in product design, came along and incorporated that technology into what would become a wildly successful product. 
  • Doom and gloom reporting on Apple Intelligence continues to ignore Apple's playbook

    blastdoor said:
    I think there's some truth in the middle here. AI is a tool, and can be useful, in certain circumstances. I've never dismissed it as a passing fad, but I do think the hype around it is overblown nonsense from those seeking investment capital. I use AI every day (Apple Intelligence) and I benefit from it. Apple is leading the market in creating powerful, on-device, private, and secure models while also allowing users private access to leading AI platforms. It'll prove to be an incredible combination over time.

    I do not subscribe to the idea that AI will take over or become sentient. It's going to make humans more efficient at certain things, and render some jobs redundant -- not because the AI is doing the job, but because it'll take fewer humans to do the same work. The writer worried about losing his job to AI shouldn't be, because even if you write with AI, you'll need human intervention to give it soul and reason -- which AI will never have.

    That's why it's so funny to me that people see Apple as so behind. It's laying the groundwork for the future of a cooperative AI ecosystem built on Apple platforms with Apple's rules and values, and because it isn't complete this second, it somehow means they're lost in the woods. As with nearly every Apple endeavor in the past 30 years, I wouldn't bet against them.
    If the only AI you use is Apple Intelligence then you're in no position to assess whether Apple is behind. I use an enterprise license for ChatGPT almost every day. Apple offers nothing like it -- they absolutely are behind in terms of offering a product that competes with what ChatGPT can do today. In terms of raw technology I agree that Apple has a lot going for them. But they have yet to create an AI product that is as useful as ChatGPT. 

    Accurately recognizing that they are behind is not the same thing as betting against them. I also would not bet against them. But I can open my eyes and see that for the moment, they are absolutely behind in this market with respect to actual products that are useful to people and worth paying for. 
    Someone can assess that a vehicle is fast without driving it. What are you using the enterprise ChatGPT license for? What product are you using that you believe Apple should be offering?
    But you can't assess the difference between an EV and a combustion engine without ever driving the EV. It's a qualitatively different experience, and using ChatGPT is a qualitatively different thing than what Apple Intelligence is today. Actually, it's more like the difference between a go-kart (Apple Intelligence) and a Tesla Model S. If all you've done is drive go-karts, you have zero clue what a Model S is like.

    I can ask the o3 model to write an R Shiny app (what I'm doing right now) with such-and-such features and it does it. I can then iterate productively to refine the app. I can ask it why it did things and have it explain how various aspects of the code work, so that I learn more (I've used R forever but I'm new to Shiny). This effectively replaces a research assistant or programmer for me. It's a huge productivity boost.
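
    For anyone who hasn't seen Shiny, a bare-bones app of the kind I'm describing looks roughly like this -- a generic toy sketch (the classic histogram demo), not the actual app I'm building:

    # Minimal Shiny app sketch; assumes only the shiny package is installed.
    library(shiny)

    ui <- fluidPage(
      titlePanel("Histogram demo"),
      sidebarLayout(
        sidebarPanel(
          sliderInput("bins", "Number of bins:", min = 5, max = 50, value = 30)
        ),
        mainPanel(plotOutput("distPlot"))
      )
    )

    server <- function(input, output) {
      output$distPlot <- renderPlot({
        # Redraws whenever the slider changes
        hist(faithful$waiting, breaks = input$bins,
             col = "steelblue", border = "white",
             xlab = "Waiting time to next eruption (min)",
             main = "Old Faithful waiting times")
      })
    }

    shinyApp(ui = ui, server = server)

    The point is that o3 produces (and then revises) files like this on request, which is where the productivity gain comes from.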

    For another example -- earlier today I asked ChatGPT whether there's a connection between conducting a fixed-effects meta-analysis using weights to account for error covariance and conducting a principal components analysis. It explained the connection and then, based on remembering an earlier conversation, suggested how this connection applied to some other work I was doing. If I asked Siri anything like that, the answer would be "here's what I found on the web."
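
    (For anyone curious, here's a rough numerical sketch of the kind of connection I mean -- my own toy example, not ChatGPT's answer: a fixed-effects meta-analysis with a known error covariance matrix V is just GLS with weight matrix V^-1, and you can get the same estimate by rotating the study estimates onto the eigenvectors of V -- the principal components of the sampling errors -- and inverse-weighting by the eigenvalues.)

    # Toy illustration (assumed setup, not the actual analysis): fixed-effects
    # meta-analysis with correlated sampling errors == GLS, computed two ways.
    set.seed(1)
    k     <- 6                                  # number of studies
    theta <- 0.4                                # common true effect
    V     <- 0.05 * (diag(k) + 0.5)             # known error covariance (correlated)
    y     <- theta + as.vector(t(chol(V)) %*% rnorm(k))  # simulated study estimates
    X     <- matrix(1, k, 1)                    # intercept-only design

    # 1) Direct fixed-effects estimate with weight matrix V^-1 (GLS)
    W      <- solve(V)
    fe_gls <- solve(t(X) %*% W %*% X, t(X) %*% W %*% y)

    # 2) Same estimate via the eigendecomposition of V: project onto the
    #    principal components of the error covariance, weight by 1/lambda
    eig    <- eigen(V)
    y_pc   <- t(eig$vectors) %*% y
    X_pc   <- t(eig$vectors) %*% X
    w_pc   <- 1 / eig$values
    fe_pca <- solve(t(X_pc) %*% (w_pc * X_pc), t(X_pc) %*% (w_pc * y_pc))

    c(direct = fe_gls, via_pca = fe_pca)        # identical up to rounding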

    So, you want Apple Intelligence to be a totally different product? I get what you're saying but it's also kind of like asking why your Xbox can't double as a submersible. They're clearly different products with different goals. Apple Intelligence isn't a chatbot nor is it a vibe coding platform.

    I'm glad you get those uses from ChatGPT, but that doesn't have any bearing on what Apple Intelligence can or can't do. (Between the two, I'd take the go-kart, tbh.)

    It's not that I'm saying ChatGPT is bad or that people who use it are bad. Let's take our opinions of AI and the ethical conundrums out of the equation entirely. Apple isn't building ChatGPT -- why should it? Even when Siri is backed by an LLM, it won't be ChatGPT.

    That's the problem here. People are comparing apples and oranges, literally. They're two different things.

    Also, let's not so easily dismiss the use cases presented by each. How many people need to vibe code? How many are doing research? (however flawed research with an AI might be) Now compare that to how many people have iPhones capable of running Apple Intelligence that need to transform text, edit photos, or triage notifications.

    It's not that ChatGPT isn't useful -- it's clearly being used by millions -- but what do users need? Apple may work towards something like a chatbot in the future, or offer third-party ones via Private Cloud Compute, but that isn't their goal today. Right now, Apple's clearly stated goal is to develop artificial intelligence that works privately to serve users in a way that isn't intrusive, and it's doing that.

    What you're asking for is completely antithetical to Apple's goals. And the sooner nerds and pundits realize that, the happier they will be. Because if you go to McDonald's and order steak, you're going to get laughed out of the room. In the end, Apple will be the true powerhouse in AI because it's actually taking a human approach to its product instead of trying to convince everyone of a delusion of grandeur.
    What I’m asking for is that Apple make a good AI product (or products). I’m not saying they have to make another chatbot (though they could), I’m only using successful chatbot products as an example of what an AI product is and why it’s useful. 

    Apple Intelligence is not currently a product. It’s a bundle of technologies. Those technologies could be incorporated into a successful product, but so far that hasn’t happened. 

    In many ways, Apple has become more of a technology company than a product company. I’m certainly pro technology, and Apple has some great technology. But making great products is also important. Apple doesn’t have an AI product, just AI technologies that are integrated into existing products. That’s fine and good, but it’s not enough. 