jellybelly
About
- Username: jellybelly
- Joined
- Visits: 81
- Last Active
- Roles: member
- Points: 503
- Badges: 1
- Posts: 158
Reactions
Apple is asking iPhone suppliers for screens without any bezel
commentzilla said:
mayfly said: There is a phrase in this article that understates what I'd think is a much bigger issue: "increased vulnerability to external shocks"
With no metal bezel surrounding a glass lens, it seems like it would be far more likely to break in accidents that current phones survive intact. They're going to have to explore a radically different glass technology to prevent that. Or they could just use current glass tech and make more money on repairs, I suppose.

Where we in the US refer to a 'hood' on the front of a car, the Brits say 'bonnet'; what we call a 'windshield' is a 'windscreen' to them. In the same spirit, I can see calling the face of the iPhone a lens that we look through to see the image displayed on or under the film layers.

The Gorilla Glass that Apple uses has been through many iterations over the years and approaches the hardness and scratch resistance of some metals. Tougher than nails. I'd think that's the protection that does the job, unless the display film comes all the way to the edge of the glass and can be harmed by just the edge being hit.

My real concern would be reliable touch rejection when gripping the iPhone with fingers encroaching on the edges of the display. But I'd guess Apple thought of that a long time ago and it won't be a problem, so I'm not really concerned about that after all.
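For what it's worth, here is a toy Swift sketch of the kind of edge-rejection heuristic I have in mind. The EdgeRejection type, the exclusion margin, and the contact-radius threshold are all invented for illustration; this is not a description of how Apple actually does it.

```swift
import CoreGraphics

// Toy heuristic: ignore touches that land inside a narrow band along the
// screen edge unless the contact is small and deliberate (a fingertip tap
// rather than the broad contact of a gripping finger).
struct EdgeRejection {
    let screen: CGRect
    let margin: CGFloat = 8         // hypothetical exclusion band, in points
    let maxGripRadius: CGFloat = 12 // contacts wider than this look like a grip

    func shouldAccept(location: CGPoint, contactRadius: CGFloat) -> Bool {
        let inset = screen.insetBy(dx: margin, dy: margin)
        if inset.contains(location) {
            return true // comfortably inside the display: always accept
        }
        // In the edge band: accept only small, fingertip-sized contacts.
        return contactRadius <= maxGripRadius
    }
}

let rejector = EdgeRejection(screen: CGRect(x: 0, y: 0, width: 393, height: 852))
print(rejector.shouldAccept(location: CGPoint(x: 2, y: 400), contactRadius: 20)) // false: broad grip contact at the edge
print(rejector.shouldAccept(location: CGPoint(x: 2, y: 400), contactRadius: 6))  // true: deliberate fingertip at the edge
```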
Apple reaffirms privacy as a tentpole feature in Siri after lawsuit settlement
mikethemartian said: Apparently it wasn't so frivolous, given Apple's willingness to settle to prevent the discovery from coming out in open public court.

The discovery is out. A third party creepily stole user data. Apple does better than the competition at policing apps, yet gets criticized for holding up app approval while looking for loopholes an unscrupulous developer might try to hide.

The lack of a Portuguese translation of the EULA was mentioned as a problem. So, again I say, that discovery is out, and it is probably being worked on by Apple across hundreds of thousands of apps in umpteen languages.

Don't assume settling is any kind of admission of guilt. Weighing costs against benefits is standard practice in tort law; it would be malpractice not to consider whether you would likely lose more than you gain. Yes, principle is a weighted factor, and the evidence shows Apple weighs principles more heavily than many large tech companies.
Siri in iOS 18.4 is getting worse before it gets better
shad0h said: Dramatising a component or feature not working in a developer beta...
How exactly is that quality tech journalism?

The author is reporting observations from various sources, as well as from within the AppleInsider team, which brings a variety of experience and skills to the table. He's pointing out that Siri is performing less well as it goes through a transition in its development.
As for scrapping Siri and starting over to wait for an all-new one, as some elsewhere online have implied: Apple has to keep some core functions working that have been useful in things such as Apple Home, albeit with new hiccups.

Apple faces a transition from one type of machine learning, built on selective iterations over data (thousands of examples on device, billions or more in iCloud, along with smart search), to integrating LLMs (Large Language Models). LLMs iterate over data at such a large scale that they take data centers drawing the electrical power of small cities.

It's a different kind of machine learning, so massive in its data scanning and so complex in its algorithms that AI software engineers admit they don't know precisely how results are arrived at, in the sense of tracing every iterative test tried and discarded while winnowing down to the mostly usable predictions of characters (we are talking about the "L" that stands for Language) and results. A toy sketch of that final prediction step is at the end of this post.

I believe Apple is seeking to allow an eventual transition to, and integration of, R-AI: AI with reasoning, not just trillions of predictive tests on language.

We are an impatient species, and that drive moves us forward in starts and stops. We have wants and hopes that can become expectations and even demands. Our weakness and our strength is that hopes and wants jump to demands, even when we are too impatient to see them realized (or not) over a timescale that is hard to understand or predict.

In the case of artificial intelligence developing in the Apple sphere, our expectations are getting ahead of reality. If you are disappointed in the progress so far, that seems quite reasonable; disappointment is different from demands. Blanket conclusions out in the blogosphere that Apple AI is useless because much of it does not yet exist come down to impatience, which is likely an inefficient use of our energy. But it is our choice if that is sometimes our reaction, and that's fine. We have that choice. My hope is that we don't throw out the baby with the bath water, now or in the future.
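To make the "predictions of characters" point concrete, here is a toy Swift sketch of the very last step of that pipeline: turning scores over candidate tokens into a pick. The vocabulary and scores here are invented for illustration; a real LLM scores tens of thousands of tokens using billions of learned parameters.

```swift
import Foundation

// Toy next-token step: a real model scores every token in its vocabulary
// given the context; generation just keeps picking (or sampling) the next one.
let context = "Set a timer for five"
let scores: [String: Double] = [  // invented logits for illustration
    "minutes": 4.1,
    "seconds": 2.3,
    "bananas": -3.0,
]

// Softmax turns raw scores into a probability distribution.
let expScores = scores.mapValues { exp($0) }
let total = expScores.values.reduce(0, +)
let probabilities = expScores.mapValues { $0 / total }

// Greedy decoding: take the most probable token and append it to the context.
if let next = probabilities.max(by: { $0.value < $1.value }) {
    print("\(context) \(next.key)")  // "Set a timer for five minutes"
}
```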
Blackmagic's new camera for Apple Vision Pro content has a hefty $29,995 price tag
I see the biggest challenge in creating Vision Pro content as finding new ways of storytelling. It's not just depth: with a wide field of vision, you have competing things for your eyes to look at. That's why not all cinema content is suitable for, or can really take advantage of, IMAX. It would take the talent of studios or directors like Pixar, James Cameron, or Lucasfilm to invent new storytelling that takes advantage of immersive content.

You could even track the eyes and have the content change along numerous insights and perspectives. Then you re-watch it and create a diverging or more revealing storyline by examining different visual elements the second and third time, and you get new insights and plot twists.

Immersive is most effective when the camera is in close. Telephoto brings you apparently closer yet doesn't register depth, and telephoto shots taken from farther away tend to flatten the image. That flattening is used to great effect in portraits, where it reduces the apparent projection of features such as large noses; portraits taken from, say, 20 feet instead of 5 feet away are more flattering (a rough worked example follows below). That's why lenses in the 85mm to 105mm short-telephoto range tend to be called portrait lenses, usually with wide apertures of f/1.8 to f/2.8. It's not the lenses themselves, but the working distance they afford, that makes for intimate and flattering portraits; the wide apertures provide bokeh, bringing the subject to the forefront of attention and blurring distracting backgrounds.

In sports, the closer you are, the faster the angular velocity: something can happen so fast and pass by your eyes before you register it, and you're less likely to get the big picture.
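Here is that rough worked example, with made-up but plausible numbers, assuming the nose sits about 0.1 m in front of the plane of the ears: in a simple pinhole-camera model, the relative magnification of a near feature versus a far one is just the inverse ratio of their distances to the camera.

```latex
\frac{m_{\text{nose}}}{m_{\text{ears}}} = \frac{d_{\text{ears}}}{d_{\text{nose}}}
\qquad
d = 1.5\,\mathrm{m}\ (\approx 5\,\mathrm{ft}): \frac{1.5}{1.4} \approx 1.07
\qquad
d = 6\,\mathrm{m}\ (\approx 20\,\mathrm{ft}): \frac{6.0}{5.9} \approx 1.02
```

So at 5 feet the nose renders roughly 7% larger relative to the ears, while at 20 feet the difference drops below 2%: it's the longer working distance, not the focal length, that flattens and flatters the face.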
Current televised sports use many cameras with real-time cuts, run from an expensive mobile or on-site studio. Fielding many immersive two-lens cameras would initially be expensive in both gear and talent (cameras that would make this Blackmagic look inexpensive), and new directorial skills would need to develop. Initially the best use case would be immersive replays, though would that alone justify buying an expensive headset for viewing? And would the additional costs be recovered under current subscription pricing, or would competition drive adoption? Let's hope it's the latter.

Certain sports are better suited to immersive content. You'll see more overhead cable rigging for cameras as equipment is developed and deployed, like the cable-suspended cameras over American football games. It's tricky, since you can't get in the way of high kicks or high passes. The same goes for basketball, plus not annoying the live audience in the higher seats with more cable-trolleyed cameras running overhead between the action and the high seats.
Siri in iOS 18.4 is getting worse before it gets better
tundraboy said: I believe Apple is seeking to allow eventual transitioning and integration to R-AI, AI with reasoning, not just trillions of predictive tests on language.
Of course the main stumbling block in his argument is that he assumed that a man-made nano-device that does EVERYTHING that a neuron does is unquestionably attainable. We don't even know how neurons work. We don't even know if we will ever know enough to truly understand how a neuron works. This is the fallacy of assuming infinite future knowledge that a lot of futurists including AI advocates unwittingly commit.
Yes, R-AI, AI with Reasoning, would solve a lot of the criticisms leveled at AI. The only problem is, no one really knows how to get a machine to truly reason the way the smarter segment of the human population does. We don't even know if that is achievable, but some just power through with their arguments by treating it as a given. (Reasoning like the other, much larger, segment of humanity? Well, AI has already achieved that.)

Yes, it is overhyped among the masses, but it's steadily being improved in the labs. The same will be true for R-AI: it'll be a tool for appropriate use, and the reasoning will make it just a bit better.

You're correct to point out the fallacy of approximating the vast network of neurons, which are analog, not digital. Add their interaction with the body's incoming senses: the usual suspects of sight, hearing, taste, smell, and touch, plus hot and cold, dull versus sharp pain, muscle tension and relaxation feedback, endocrine interaction... the list of things a robot won't have in the same way goes on and on. Nevertheless, as AI develops overall, it will be a useful tool in the right circumstances.

I'm in for the long haul, warts and all. Used in conjunction with radiologists reading mammograms, AI has increased accuracy: not on its own, but when used alongside human readers. I expect slow progress, so I'm not cynical, but rather hopeful and patient.
iPadOS 26 multitasking is more Mac-like with menubar, better pointer and more
Sure, but in the end, the iPad is now navigated and interacted with in the same way a computer has been interacted with since 1992. We could have hoped for a more ambitious, "natural" UX combining fingers, eyes, voice, and pen with AI. But they gave up, for now.

They didn't give up. They just called it a wrap so they'd have something for this year; they have teams working on next year's version and beyond.

"Nature does not hurry, yet everything is accomplished." — Lao Tzu
iPhone 17 Pro rumored to get Liquid Glass color treatment
How about a glass iPhone? It could be sandblasted or etched to be frosted on the inside surface, which could give the appearance of a white liquid glass iPhone.

If the glass front and back were edge-lit with RGBW LEDs tucked under the metal edges, whatever color the LEDs produce would travel through the glass like a light guide: it would either strike the frosted surface, producing a diffuse glow of color, or bounce off the glossy smooth outer face back to the frosted inner surface. Voila, a frosted, glowing iPhone in changeable colors.

Light sensors could detect ambient wavelengths from the environment to determine the color created by the edge LEDs. Or the dominant color on the screen could be used to change the iPhone's appearance. Or it could be your explicit choice.

I worked with some artists who made a lot of money with edge-lit etched glass layers (or sandblasted frosting). To keep energy use down, the edge lighting could be limited to a low glow only when asleep, or perhaps only when charging. With numerous LEDs along the edge, gradients of color could be formed.
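As a thought experiment, here is a minimal Swift sketch of the color-selection logic I'm imagining: blend an ambient-sensor reading with the screen's dominant color, then dim it to a low glow. The blend weights, sensor values, and RGBW representation are all invented for illustration, not anything Apple has announced.

```swift
// Hypothetical edge-lighting controller: pick an RGBW value for the edge LEDs
// from the ambient light sensor and the dominant on-screen color.
struct RGBW { var r, g, b, w: Double } // channels normalized to 0...1

func edgeColor(ambient: RGBW, screenDominant: RGBW,
               ambientWeight: Double = 0.3, glowLevel: Double = 0.2) -> RGBW {
    // Weighted blend of the two sources, then scaled down to a dim "sleep glow"
    // so the effect sips power rather than gulping it.
    func mix(_ a: Double, _ b: Double) -> Double {
        (ambientWeight * a + (1 - ambientWeight) * b) * glowLevel
    }
    return RGBW(r: mix(ambient.r, screenDominant.r),
                g: mix(ambient.g, screenDominant.g),
                b: mix(ambient.b, screenDominant.b),
                w: mix(ambient.w, screenDominant.w))
}

// A warm room plus a mostly blue screen yields a dim, blue-leaning glow.
let glow = edgeColor(ambient: RGBW(r: 0.8, g: 0.6, b: 0.3, w: 0.5),
                     screenDominant: RGBW(r: 0.1, g: 0.2, b: 0.9, w: 0.2))
print(glow) // approximately RGBW(r: 0.062, g: 0.064, b: 0.144, w: 0.058)
```

With many LEDs along the edge, the same function could be evaluated per LED against a per-region dominant screen color to produce the gradients mentioned above.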