jellybelly

About

Username
jellybelly
Joined
Visits
81
Last Active
Roles
member
Points
503
Badges
1
Posts
157
  • iPadOS 26 multitasking is more Mac-like with menubar, better pointer and more

    Sure, but in the end, the iPad is now navigated and interacted with in the same way that a computer has been interacted with since 1992. We could have hoped for a more ambitious, “natural” UX by combining fingers, eyes, voice, and pen with AI. But they gave up, for now. 

    They didn’t give up. They just called it a wrap to have something ready for this year. They have teams working on next year’s version and beyond.

    "Nature does not hurry, yet everything is accomplished."  — Lao Tzu 
  • Files on iPadOS 26 doesn't suck anymore

    I hope we can see date created. Please, please, please.
    Just opening one of my two portfolio-modeling Numbers spreadsheets, only to view the file, changes the modification date, even though no changes were made. This is due to the handy auto-save feature built in, but it gets annoying.

    My files are very similar, and I’d like to know which one was created first, since that is usually the master file, without having to encode that in the file name.

    For now, I’ve compressed the file-name clue to the letters “MR” at the end of the file name, to designate the Master Record.

    I can then use my old desktop-publishing conventions, starting with cc1, cc2, etc., meaning a customer correction, versus a version change on my part, which uses v1, v2, v2.1, v2.12, v3 for versions to try or advance to. The dot suffixes (.1, .11, .12, .23) mark progressively more minor changes, with whole-number changes as the most major.
    I’ll still use those naming conventions; they have served me since 1987 in early prepress and in my graphic design, and then later, in 1991, in my Photoshop work.
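    To illustrate, that cc/v convention can be parsed mechanically. A minimal sketch (the filenames, regex, and helper name here are made-up illustrations of the scheme, not part of any real workflow):

```python
import re

# Hypothetical parser for the naming convention described above:
# "cc" marks billable customer corrections, "v" marks the designer's
# own versions; dot suffixes (.1, .11, .12) are minor revisions.
PATTERN = re.compile(r"(cc|v)(\d+(?:\.\d+)?)", re.IGNORECASE)

def parse_revision(filename):
    """Return (kind, number) from a name like 'brochure_v2.12.indd'."""
    m = PATTERN.search(filename)
    if not m:
        return None
    return m.group(1).lower(), float(m.group(2))

files = ["brochure_v1.indd", "brochure_cc1.indd", "brochure_v2.12.indd"]

# Sort only the designer's own "v" versions, oldest to newest.
versions = sorted(
    (f for f in files if parse_revision(f)[0] == "v"),
    key=lambda f: parse_revision(f)[1],
)
```

    A scheme like this is easy to sort and filter by script, which is one reason suffix conventions outlive any single app.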

    BUT I still want to see the creation date. It helps me in billing work: I use my “cc” prefixes for billable customer changes, major and minor, and bill creative time for “v” versions. I also like to see a document’s entire history for various reasons, including creation date.
    Is it available without having to view it on a Mac? Can I view it in the new iPadOS Files system?
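    On a Mac, at least, creation dates are scriptable. A rough sketch for comparing similar files (the fallback behavior is an assumption about the platform, not anything from iPadOS Files):

```python
import os

def creation_time(path):
    # Best-effort creation time: st_birthtime exists on macOS and
    # other BSDs; on Linux it is absent, so fall back to st_ctime,
    # which is metadata-change time, NOT true creation time.
    st = os.stat(path)
    return getattr(st, "st_birthtime", st.st_ctime)

def oldest(paths):
    # The likely master file is the one created first.
    return min(paths, key=creation_time)
```

    On macOS this gives the real birth time; elsewhere it degrades gracefully, which is why the comment flags the fallback.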


  • Apple & Michigan State University partner to boost US manufacturing skills

    How much does Apple really know about manufacturing? They are an R&D company; they outsource their manufacturing to other companies. Obviously some percentage of engineers at Apple understand the manufacturing process so they can act as an interface between the R&D team and the contract manufacturers, but I would expect that team to be smaller than the R&D team.
    Apple has designed a lot of the equipment that their manufacturing partners use, from jigs to low-end robotics to sophisticated robots.
    AppleInsider had an article some years ago on the robot Apple designed to take apart iPhones to recycle valuable materials.
    If they don’t design production equipment for a manufacturer, they may purchase it so they get first production rights on the equipment. TSMC is an example: Apple invests in the fab equipment made by ASML for the next-step-smaller process node and gets up to six months, perhaps more, of all production on that next process improvement on ASML-made equipment that TSMC uses in its production process.
  • Siri in iOS 18.4 is getting worse before it gets better

    tundraboy said:

    I believe Apple is seeking to allow eventual transitioning and integration to R-AI—AI with reasoning, not just trillions of predictive tests on language. 

    I read a book by some tech researcher who said "if you can replace a neuron with a man-made nano-device that did everything a neuron did, then would that brain function any differently?" (Or words to that effect.) He then added that logically then, you should be able to replace every neuron in the human brain with the same nano-device and have an artificial brain and AI that is indistinguishable from the human variety.  That is his argument for why AI will eventually instantiate human intelligence.

    Of course the main stumbling block in his argument is that he assumed that a man-made nano-device that does EVERYTHING that a neuron does is unquestionably attainable.  We don't even know how neurons work.  We don't even know if we will ever know enough to truly understand how a neuron works.  This is the fallacy of assuming infinite future knowledge that a lot of futurists including AI advocates unwittingly commit.

    Yes R-AI, AI with Reasoning, would solve a lot of the criticisms leveled at AI. Only problem is, no one really knows how to get a machine to truly reason the way the smarter segment of the human population does. We don't even know if that is achievable, but some just power through with their arguments by treating it as a given. (As for reasoning like the other, much larger, segment of humanity: well, AI has already achieved that.)
    As for R-AI’s addition of reasoning: it is at a very low level compared to humans, but it will nevertheless add to productivity in small ways at first, along with an error rate. The 95% accuracy rate you referred to can be as low as 80% and still have some uses. My bicycle multiplies my movement and is a very useful tool in only 10% or less of my travels, but I still use it. I barely have a use for AI in writing. Grammar checkers and spell checkers have been around for decades—software trained on rules by humans—and the AI equivalent hasn’t helped me very reliably. But when I do find it helpful, it’s for a first pass: it highlights possible errors so fast that I can review the suggestions and ignore what isn’t applicable or appropriate. It still is a tool that can be used with appropriate expectations. A few of the AppleInsider staff have mentioned its usefulness; I expect they are more skilled at leveraging AI as an additive tool.
    Yes, it is overhyped among the masses. But it’s steadily being improved in the labs. The same will be true for R-AI. It’ll be a tool for appropriate use, and the reasoning will make it just a bit better.

    You’re correct to point out the fallacy of approaching the vast network of neurons, which are analog, not digital. Add their interaction with the incoming bodily senses: the usual suspects of sight, hearing, taste, and smell, plus hot and cold, dull versus sharp pain, muscle tension or relaxation feedback, endocrine interaction; the list goes on and on, and a robot won’t have it in the same way. Nevertheless, as AI develops overall, it will be a useful tool in the right circumstances.

    I’m in for the long haul, warts and all. Used in conjunction with radiologists in mammogram reading, AI has increased accuracy—not alone, but when used alongside human readers (radiologists).

    I expect slow progress, so I’m not cynical, but rather hopeful and patient.


  • Siri in iOS 18.4 is getting worse before it gets better

    shad0h said:
    Dramatizing a component or feature not working in a developer beta...

    How exactly is that quality tech journalism ?
    In my opinion, it is high-quality tech journalism. It is honest and based on repeatable observations. Where an observation was not repeatable, the author points that out.

    The author is writing about observations from different sources, as well as from within the AppleInsider team, which brings a variety of experience and skills to the table.

    He’s pointing out that Siri is performing less well as it goes through a transition in its development.

    As for scrapping it and starting over to wait for a new Siri, as some have implied elsewhere online: Apple has to keep some core functions working that have been useful in things such as Apple Home, albeit with new hiccups.

    Apple is faced with a transition from a type of machine learning that required selective iterations over data in the thousands on-device, or billions or more in iCloud along with smart search, to integrating LLMs (Large Language Models). LLMs iterate over data at such a large scale that they require data centers drawing the electrical power of small cities.
    It’s a different kind of machine learning, so massive in its data scanning and so complex in its algorithms that AI software engineers admit they don’t know precisely how results are arrived at, in the sense of tracing every iterative test tried and discarded while winnowing down to mostly usable predictions of characters (the “L” stands for Language) and results.

    I believe Apple is seeking to allow eventual transitioning and integration to R-AI—AI with reasoning, not just trillions of predictive tests on language. 

    We are an impatient species, a drive that moves us forward in starts and stops.  We have wants and hopes that can become expectations and even demands.  Our weakness and strength are hopes and wants jumping to demands even when we are too impatient to have them realized (or not) in a time period that is hard to understand and/or predict.  

    In the case of artificial intelligence developing in the Apple sphere, our expectations are getting ahead of reality. If you are disappointed in the progress so far, that seems quite reasonable. Disappointment is different from demands.
    Any blanket conclusion in the blogosphere that Apple AI is useless because much of it does not yet exist comes down to impatience, which is likely an inefficient use of our energy. But it is our choice if that is sometimes our reaction, and that’s fine; we have that choice. My hope is that we don’t throw out the baby with the bathwater, now or in the future.

