AppleZulu

About

Username
AppleZulu
Joined
Visits
258
Last Active
Roles
member
Points
9,142
Badges
2
Posts
2,519
  • Apple has big camera upgrades lined up through iPhone 19 Pro

    If I read the specs and descriptions of the tetraprism tech, it's pretty clever, but the use of the term "optical zoom" is a bit loosey-goosey.

    If you're interested, read below for some TL;DR thoughts about how these lenses work.

    For a true optical zoom as used on 'real' cameras, there are many stacked lens elements inside the long lens you see mounted on the camera. The zoom happens when the distance between some of those lens elements is physically increased, changing the magnification of the light projected onto the sensor (or film) at the back of the camera. The critical thing to understand here is that the full area of the sensor receives the enlarged projected image. When "zoomed in," that small, distant object is optically "blown up" by the lenses to be larger, filling the full area of the sensor. This means the resolution of the sensor's image does not change. If the lenses are of high quality, the zoomed-in image will be sharp and undistorted.
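To make that trade-off concrete, here's a minimal sketch of the geometry. The focal lengths and sensor width below are generic illustrative numbers (a hypothetical full-frame zoom), not any actual iPhone specs; the point is that a longer focal length narrows the field of view (magnifies the subject) while the sensor's pixel count stays fixed.

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal angular field of view for a lens projecting onto a sensor."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

SENSOR_WIDTH_MM = 36.0   # assumed full-frame sensor width, for illustration
SENSOR_MEGAPIXELS = 24   # fixed by the hardware, unchanged by zooming

for f in (24, 70, 200):  # wide-angle through telephoto on a hypothetical zoom
    fov = horizontal_fov_deg(f, SENSOR_WIDTH_MM)
    print(f"{f:>3} mm: {fov:5.1f} deg field of view, {SENSOR_MEGAPIXELS} MP image")
```

Every line of output reports the same megapixel count: optical magnification changes what the sensor sees, not how many pixels record it.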

    Another thing useful to note about "real" cameras is the use of interchangeable lenses. Most are "prime," or fixed-focal-length, lenses, meaning the amount they're zoomed in (or out to wide angles) is static. The benefit of prime lenses is that, because there are no moving parts, there are no compromises: they can be tuned perfectly for their one focal length, making them sharper and less prone to distortion than a movable zoom lens that must accommodate the physics of multiple focal lengths. Professional photographers use zoom lenses for convenience and speed when changing focal lengths. If a professional photographer is in a controlled environment and knows in advance the focal length needed for what they're shooting, they'll swap in the exact prime lens for the purpose.

    This is important to consider, because in reality, as best as I can tell, all of the lenses on an iPhone, including the tetraprism lens, are actually fixed, prime lenses. The only moving parts in them are image stabilizers*.

    For smartphone cameras, a severe limitation is the front-to-back depth of the phone itself. Zoom and telephoto lenses on "real" cameras are long for a reason. The physical distance between lenses and the sensor is important to how much an image can be magnified before it's projected on the sensor.

    The tetraprism lens in an iPhone offsets the sensor, and effectively uses multiple prisms (in one glass element) that allow the light to enter the front of the lens and then the prism, and then reflect back and forth four times before reaching the sensor. To achieve the same result without the tetraprism, the camera lens would have to stick out the back of the phone significantly further, which is something nobody wants. In reality, this lens is not a movable zoom lens, but a fixed focal length telephoto lens. 
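A rough way to see why folding helps: each reflection turns the optical path back on itself, so a long path can be packed into a module only about one segment deep. This sketch uses a made-up 20 mm path length, not Apple's actual dimensions, and ignores prism and packaging thickness entirely; it's only meant to show the scaling.

```python
def folded_module_depth(total_path_mm, reflections):
    """Approximate depth of a folded-optics module.

    The light path is split into (reflections + 1) straight segments that
    fold back and forth, so the module only needs to be about one segment
    deep. reflections=0 is an ordinary straight telephoto barrel.
    """
    segments = reflections + 1
    return total_path_mm / segments

PATH_MM = 20.0  # hypothetical telephoto path length
print(folded_module_depth(PATH_MM, 0))  # straight lens: the full 20.0 mm
print(folded_module_depth(PATH_MM, 4))  # tetraprism-style fold: 4.0 mm
```

With four reflections, the same optical path fits in a fifth of the depth, which is why the lens doesn't have to protrude far out the back of the phone.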

    Here's where the loosey-goosey part comes in. The proper definition of optical zoom means the lens elements physically move and change in order to magnify smaller objects to be larger and larger on the camera sensor, while the resolution of the image remains fixed at the sensor's full megapixel resolution.

    "Digital zoom" involves an unmagnified image reaching the sensor, and then the camera software crops a portion of that image and "blows it up" on the display. This quickly yields inferior results, because you are just magnifying a smaller number of pixels from the middle of the sensor, thus lowering the resolution of the final image.

    What the so-called optical zoom on the iPhone is really doing is two things. First, it selects which of the phone's lenses will be used. This is like swapping in a more powerful telephoto lens on a "real" camera. Then it does the digital cropping thing, but it defines a certain minimum megapixel resolution as the "standard" resolution and maintains that minimum by starting with a higher-resolution sensor before cropping out that acceptable-minimum-resolution portion of the image, e.g., cropping a 12-megapixel portion out of a 48-megapixel image. So calling this process optical zoom is, pun intended, a bit of a stretch.
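The crop arithmetic above is worth spelling out: pixel counts scale with sensor area, so the linear "zoom" you get from a center crop is the square root of the megapixel ratio. This little sketch just runs the 48 MP / 12 MP example from the paragraph above.

```python
import math

def crop_zoom_factor(sensor_mp, output_mp):
    """Linear zoom factor from cropping output_mp pixels out of the center
    of a sensor_mp capture. Megapixels scale with area, so the linear
    factor is the square root of the ratio."""
    return math.sqrt(sensor_mp / output_mp)

print(crop_zoom_factor(48, 12))  # -> 2.0: a 2x crop still delivers 12 MP
print(crop_zoom_factor(12, 12))  # -> 1.0: no crop headroom left
```

That's why the "zoom" range between the fixed lenses tops out where the sensor runs out of spare pixels: past that point, any further cropping drops below the standard output resolution, which is classic digital zoom.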

    *Image stabilization involves suspending the camera lens and/or the sensor in a magnetic field, and using gyroscopes to measure small movements (your hands shaking), and moving the lens and/or sensor in the opposite direction to compensate for that shake. This prevents blurring of the image during exposure so that you get sharper photos as a result. So image stabilization involves moving parts in the lens, but doesn't involve changing the focal length of the lens.
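The compensation loop described in that footnote can be sketched in a few lines. This is a toy model, not how any real OIS firmware works: the gyro readings are invented numbers, and a gain below 1.0 stands in for an imperfect actuator.

```python
def ois_correction(measured_shake_um, gain=1.0):
    """Drive the lens/sensor opposite to the measured shake, as the
    footnote describes. gain=1.0 models a perfect actuator."""
    return -gain * measured_shake_um

shake = [3.0, -1.5, 0.5]  # hypothetical gyro-derived shake, in micrometers
residual = [s + ois_correction(s) for s in shake]
print(residual)  # with perfect gain, every sample cancels to 0.0
```

The key point matches the footnote: the moving part travels opposite to the shake to hold the projected image still, and none of this motion changes the focal length.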
  • Apple has big camera upgrades lined up through iPhone 19 Pro

    Pema said:
    All this constant chatter about cameras, cameras, cameras. I get it. Phone users want to take pics. Of just about anything, anytime, everywhere. These days you can't stroll on a street and not see someone holding up their phone taking a picture of some rather ordinary pigeon perched on a bollard. Big deal. You know that this pic and the photographer isn't going to end up in a museum somewhere alongside Ansel Adams. 

    For my part, I would like a camera to be useful for taking the most mundane pictures without the constant frustrations that I always experience. I am standing in front of my shiny car attempting to take a pic of a panel that needs scratch repair. What do I see? My reflection. So I try to lean away, and what do I see? My hands hanging goofy-like trying to shoot that pic. How bloody annoying. 

    Then you are trying to flog something online, same deal. A stainless steel kettle and there you are like some skulking creep in the reflection. 

    These are my bugbears about all this talk about cameras. For the average camera user, I don't care how many pixels and how many lenses there are when I can't solve the simple, straightforward problem of reflection. Of course, you are going to jump in and say, hey, get a tripod. Why didn't I think of that? Try lining up that shot, Sherlock. 

    The other issue with phones, setting aside the all-pervasive issue with cameras, is the utterly stupid, inadvertent touching of the screen, and suddenly when you look at your phone screen you are facing some alien in outer space trying to flog you a bunch of stellar dust. Huh? How the hell did I get there? 

    And finally there is this dot-com, Dutch Tulip Mania about AI. Every few years the IT industry sinks into the doldrums and then needs a spark: AI. Well, there was a company called Borland run by a bloke called Philippe Kahn who released a piece of software called Turbo AI back in the last century. 

    Guess what the challenge was? Data. The data that the IT industry is going to scrape to give you intelligent anything is your data manipulated by algorithms, in case you haven't figured that out. 

    In other words, it's not organic AI; it's old, crap data being scraped from humongous warehouses filled to the rafters with servers housing giga mounds of data. And the more we use our phones and our computers to search and do anything, the more the data grows. But have you noticed this? As soon as you search for a warm toilet seat cover, on your next search there are ten vendors that want to flog you warm toilet seat covers. That's not generative or predictive. That's just plain old stupid AI mimicking: you searched for this, so I am going to give you the same. 

    Anyone who's ever traded stocks will have noticed the disclaimer: past winnings are no guarantee of future earnings. And that disclaimer ought to be slapped on any AI product in the future: past data is being used to give you your answers, but it is no guarantee of anything useful. It's the old saying: garbage in, garbage out. 

    Nvidia is riding a storm of success to mega trillions; watch how they plummet back to earth, same as the Dutch Tulip Mania and the dot-com bust, when the ordinary folks work out that there is no magic bullet in AI. Just the same-o, same-o. 

    The day that someone delivers organic AI is the day I will sit up and take notice. Till then, one big, fat yawn  :s     

    Come to think of it, I believe that that is what Humane AI was trying to deliver. Real time AI. See how well they did??  :D
    If your problem is reflections, what you’re looking for is a circular polarizing filter. This is not a filter effect in an app; it's a physical filter positioned right in front of the camera lens. It literally filters out unwanted light from reflections. That’s what professionals use on their cameras to diminish reflections. I’ve never tried one for an iPhone, but apparently they make them. That’s not something you’d want permanently stacked into iPhone camera lenses (or any professional lens), so don’t expect it to be a future iPhone innovation. 


  • EU hits back at Apple withholding Apple Intelligence from the region

    Perhaps Apple should just cut a deal with Samsung to manufacture an Apple-branded euPhone that runs on Android and be done with it. 
  • Apple Intelligence impresses now, and it's still very early

    melgross said:

    Will ChatGPT ultimately end up being 'Sherlocked'...?
    It wouldn’t surprise me in the least to find that Apple is working on their own solution with the intention of no longer needing to use ChatGPT or some other third-party. 
    Apple doesn't have the talent to "Sherlock" ChatGPT; if they could, they'd already have their own service instead of outsourcing to get it.
    What I do see Apple doing is trying to pass all this off as their own doing to the unaware masses. 
    That’s ridiculous. I mean, totally ridiculous. You really need to know something before posting nonsense. Apple has been doing this well before anyone else. They call it by its proper term: machine learning. Artificial intelligence is just a term dreamed up some time ago to sound more exciting for the masses.

    If you look at what Apple is doing here, you’d see that they’ve been working on it for years. But Apple has learned from the Maps fiasco not to introduce a major initiative before it’s ready. Look at what Microsoft and Google did with theirs. Both companies had to withdraw their LLMs shortly after introducing them because of major problems. Google’s engineers even wrote a letter telling management not to bring it out because it wasn’t ready and there would be problems. They’ve now withdrawn AI search until they fix it. Microsoft has withdrawn the latest Windows 11 update because of a security nightmare, and there’s no information as to when Recall will come out, if it ever will, which is what some people in the industry believe.

    So Apple doesn’t have the talent? Really, that’s garbage.
    I agree with your conclusion that Apple has the capacity to impact this field, and even that the term Artificial Intelligence has been over-hyped in recent times. That said, the term itself has been in use since at least 1956, and probably earlier, so it's not a new thing. 
  • Adobe has clarified controversial shrinkwrap license terms, but the damage may have alread...

    "The statement also doesn't address that the terms still seem to breach confidentiality agreements that artists may have signed, should they use cloud-based, well, anything, from Adobe."

    I'm not a lawyer, but I think it does, actually. It's pretty clear that if you put anything on their servers, they reserve the right to scan it for illegal activity, and if such activity is detected, escalate it for human review, and then on to law enforcement, etc., as necessary.

    Nondisclosure agreements do not shield illegal activity. So if you're photoshopping kiddie porn and using Adobe's servers to do it, the breach of your NDA is the least of your problems. 

    If you're not operating a criminal conspiracy and your nondisclosure agreement says you will guarantee that no one will ever possibly be able to look at covered content, then you're going to do your work in your own little SCIF on a device that isn't connected to the internet.

    I'm not an expert, but my bet is that most NDAs say something more along the lines that you will take reasonable measures to ensure the security of the covered content. In that case, posting drafts of your work on Instagram to get your friends' opinions is a violation of your NDA. Storing it on a server that's scanned for illegal activity probably isn't, nor would it be a violation if someone broke into your locked office and stole the material. 

    It's absolutely valid to ask questions about these things, but especially when the issue blows up publicly, it should be no surprise that it takes a few days to get an answer. Once it's gone off the rails, the engineers, executives and lawyers will all be in sweaty meetings thinking, rethinking and overthinking the response, because they know at that point whatever they say will be challenged with a "yeah, but what do you mean by..." response. 