Apple has big camera upgrades lined up through iPhone 19 Pro


Comments

  • Reply 41 of 46
    melgross Posts: 33,598member
    Pema said:
    All this constant chatter about cameras, cameras, cameras. I get it. Phone users want to take pics. Of just about anything, anytime, everywhere. These days you can't stroll down a street and not see someone holding up their phone taking a picture of some rather ordinary pigeon perched on a bollard. Big deal. You know that this pic and its photographer aren't going to end up in a museum somewhere alongside Ansel Adams. 

    For my part, I would like a camera to be useful for taking the most mundane pictures without the constant frustrations that I always experience. I am standing in front of my shiny car attempting to take a pic of a panel that needs scratch repair. What do I see? My reflection. So I try to lean away, and what do I see? My hands hanging goofy-like, trying to shoot that pic. How bloody annoying. 

    Then you are trying to flog something online, same deal. A stainless steel kettle and there you are like some skulking creep in the reflection. 

    These are my bugbears about all this talk about cameras. For the average camera user, I don't care how many pixels and how many lenses there are when I can't solve the simple, straightforward problem of reflections. Of course, you are going to jump in and say, hey, get a tripod. Why didn't I think of that? Try lining up that shot, Sherlock. 

    The other issue with phones, setting aside the all-pervasive camera issue, is the utterly stupid, inadvertent touching of the screen: suddenly, when you look at your phone screen, you are facing some alien in outer space trying to flog you a bunch of stellar dust. Huh? How the hell did I get there? 

    And finally there is this dot-com, Dutch tulip mania about AI. Every few years the IT industry sinks into the doldrums and then needs a spark: AI. Well, there was a company called Borland, run by a bloke called Philippe Kahn, which released a piece of software called Turbo AI back in the last century. 

    Guess what the challenge was? Data. The data that the IT industry is going to scrape to give you intelligent anything is your data manipulated by algorithms, in case you haven't figured that out. 

    In other words, it's not organic AI; it's old, crap data being scraped from humongous warehouses filled to the rafters with servers housing giga mounds of data. And the more we use our phones and computers to search and do anything, the more the data grows. But have you noticed this? As soon as you search for a warm toilet seat cover, on your next search there are ten vendors that want to flog you warm toilet seat covers. That's not generative or predictive. That's just plain old stupid AI mimicking: you searched for this, so I am going to give you the same. 

    Anyone who's ever traded stocks will have noticed the disclaimer: past winnings are no guarantee of future earnings. And that disclaimer ought to be slapped on any AI product in the future: past data is being used to give you your answers, but it is no guarantee of anything useful. It's the old saying: garbage in, garbage out. 

    Nvidia is riding a storm of success to mega trillions; watch how they plummet back to earth, same as the Dutch tulip mania and the dot-com bust, when ordinary folks work out that there is no magic bullet in AI. Just the same-o, same-o. 

    The day that someone delivers organic AI is the day I will sit up and take notice. Till then, one big, fat yawn  :s     

    Come to think of it, I believe that that is what Humane AI was trying to deliver. Real time AI. See how well they did??  :D
    melgross said:
    Your problem taking pictures with your phone isn’t the fault of the phone. It’s the fault of you not knowing how to minimize reflections. If you just want to take a snap, which is what you’re saying, then you get what anyone else with any camera will get, no matter how good it is. It’s up to you to think about where the reflection is coming from and to do something about it. Take a piece of cardboard and put it between the light that’s causing the reflection and the subject. Or a piece of white cloth. 

    The point is that you have to be proactive here. No camera can do that for you - yet. Maybe someday. Yes, it means you’ll have to actually think about what’s happening and how you’ll fix any problem. But you know the old adage: “if it’s worth doing, it’s worth doing well.”

    As for people taking pictures of birds or whatever, it’s not up to you to decide what they need in a camera. You also have no idea what they’re thinking. If an iPhone doesn’t do it for you, buy a flip phone and spend a few thousand on a “real” camera and lenses to see if you can do better. 

    Meanwhile, companies and individuals will continue using iPhones for movies, TV shows, product and fashion shoots, industrial videos, etc. Oh, and award-winning photos that end up in museums too.

    I am looking forward to all the improvements Apple can make to the cameras, the computational area and even some AI assist. I’m not ashamed to say that, after a career of doing commercial work, running a commercial lab and consulting with Kodak and Apple about color standards many years ago, and despite having a Canon R5 with a number of lenses, I use my iPhone Pro Max for most of my photography; for a lot of work, it’s actually good enough.
    avon b7 said:
    Yes, it is valid to point out reasonable, simple solutions to reflections. Cameras are there to capture what we are looking at. That will include reflections that our brains help to filter out in real life, so we don't 'see' them as much.

    That said, some phones have had this (ever-improving) reflection removal technology for a while now.

    This is from four years ago:



    Similar technologies are used for casual underwater photography by smartphones. 
    Yes, I’ve used the software, and often it doesn’t work well. As I said, it doesn’t work - yet. And when you see examples provided by a company or its friends, you know they’re carefully set up in advance to work as well as possible. Since Huawei, Samsung and others have been caught cheating a number of times, you should be careful of using examples they set up.

    I’ve also used Google’s sharpening software. Sometimes it works very well and sometimes it’s a horror show. I’d rather not rely on any of it right now. The problem with these technologies is that they really don’t know what they’re looking at. It’s why AI programs add extra fingers, ears, odd body parts and other anomalous features. At some point this will all work too well, but not yet.
    edited July 15
  • Reply 42 of 46
    avon b7 Posts: 7,943member
    Given the option, I'd rather have something that works half the time than none of the time because the feature doesn't exist on my phone. 

    The feature is from four years ago, and that's plenty of time for Apple to implement something similar, along with all the other similar NPU-related tasks. That it hasn't done so before now is because iPhone users get drip-fed features. That is simply how it is, especially in the area of photography. 

    Huawei has set the bar for smartphone photography advancements since 2017.

    Here’s a relatively 'old' example of Huawei’s pioneering efforts:

    "The Huawei P60 Pro is unequivocally the best camera phone we've tested in a number of key areas including macro photography. Huawei's demonstrated mastery of computational photography combines with pioneering lens technology to create an experience unlike anything we've seen on a smartphone, but there's so much more to this phone than a fantastic camera."

    https://www.digitalcameraworld.com/reviews/huawei-p60-pro-review-a-mobile-photography-revelation

    All marketing material is going to be set up for best impact. Look no further than Apple's EyeSight. 

    That's marketing, but if you don't push things forward you won't go anywhere. Some things will work better than others. Some things will stick and others won't. 

    AI features may cross a specific red line in the eyes of some people. Google's option to take a series of shots of a group of people and then pick out the best shots of each individual in the series and present one photo where everyone looks their best might be an example.

    I'd rather have that option than not though. 
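
    As a rough illustration of what a best-take option like that involves under the hood, here is a minimal, hypothetical sketch in Python. The data layout and helpers are made up for the example and are not Google's actual code: identify the same person across a burst, score each candidate face, then composite the best face for each person onto a base frame.

```python
from dataclasses import dataclass

@dataclass
class Face:
    person_id: str   # same id means the same person across frames
    frame_idx: int   # which burst frame this face came from
    region: str      # toy region key; in reality a pixel bounding box
    score: float     # higher = better expression (eyes open, smiling, ...)

def best_take(frames, faces):
    """Composite the best-scoring face of each person onto one base frame."""
    # Keep the highest-scoring candidate face per person.
    best = {}
    for face in faces:
        if face.person_id not in best or face.score > best[face.person_id].score:
            best[face.person_id] = face

    # Use the frame that already contains the most winning faces as the base,
    # then paste in the remaining winners from the other frames.
    base_idx = max(range(len(frames)),
                   key=lambda i: sum(f.frame_idx == i for f in best.values()))
    composite = dict(frames[base_idx])
    for face in best.values():
        if face.frame_idx != base_idx:
            composite[face.region] = frames[face.frame_idx][face.region]
    return composite

# Toy burst: two frames, two people ("A" and "B"), each blinking in one frame.
frames = [
    {"A": "A blinking", "B": "B smiling"},
    {"A": "A smiling",  "B": "B blinking"},
]
faces = [
    Face("A", 0, "A", 0.2), Face("B", 0, "B", 0.9),
    Face("A", 1, "A", 0.8), Face("B", 1, "B", 0.1),
]
print(best_take(frames, faces))  # {'A': 'A smiling', 'B': 'B smiling'}
```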
    edited July 16
  • Reply 43 of 46
    AppleZulu Posts: 2,120member
    Another Huawei promotional plug. Thanks for that. 

    Features that replace what was in an image with something that was not in that image should be reserved for editing tools. They shouldn’t be the direct product of a shutter click. Polarizing filters prevent unwanted light from reaching the camera. “Reflection removal” software guesses what might be behind a reflection and paints that fabricated guess in. There is indeed an ethical line that is crossed by presenting that alteration as the original photo. Most users won’t understand that their image is a fabrication. In most instances that won’t matter, but in some, they will believe a false image is actually depicting some important reality. 
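
    To make that difference concrete: a polarizer keeps reflected light from ever reaching the sensor, while removal software only sees the mixed result and has to guess how to split it. The toy NumPy sketch below is purely illustrative, not any vendor's pipeline; it shows that a single capture is consistent with many different scene/reflection splits, so whatever the software paints in is a fabricated guess.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: each captured pixel is light from the scene plus reflected glare.
scene      = rng.uniform(0.0, 0.7, size=(4, 4))   # what we want to keep
reflection = rng.uniform(0.0, 0.3, size=(4, 4))   # glare we want gone
captured   = scene + reflection                    # what the sensor records

# A polarizing filter (or a well-placed piece of cardboard) blocks the glare
# before capture, so the camera records the scene directly.
polarized = scene
print(f"polarizer: mean error vs. real scene = {np.abs(polarized - scene).mean():.3f}")

# Removal software only ever sees `captured`. Any non-negative split of it is
# equally consistent with the data, so the result depends on a guessed prior.
guess_uniform      = np.clip(captured - 0.15, 0.0, None)  # assume constant glare
guess_proportional = captured * 0.8                       # assume 20% of light is glare

for name, guess in [("uniform-glare guess", guess_uniform),
                    ("proportional guess", guess_proportional)]:
    print(f"{name}: mean error vs. real scene = {np.abs(guess - scene).mean():.3f}")
```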
  • Reply 44 of 46
    avon b7 Posts: 7,943member
    Everything coming off a digital sensor is processed. Nothing happens on a shutter click unless you want it to. It could be a filter, an HDR shot, bokeh, or an AI-infused shot that detects smiles, scenes or other objects. My phones have had an AI button on the camera app for years, and it can be toggled on or off prior to the shot or even after you've taken it. The pro mode on many phones (normally mid-to-high-end phones) will shoot RAW too.
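
    As a loose illustration of that point, here is a tiny, hypothetical pipeline sketch in Python: every capture gets some base processing, while the AI step is just one optional stage that can be toggled before the shot or undone afterwards because the unprocessed frame is kept. None of this mirrors any particular phone's firmware.

```python
import numpy as np

def demosaic_and_tone_map(raw):
    # Stand-in for the unavoidable base processing every digital capture gets.
    return np.clip(raw * 1.1, 0.0, 1.0)

def ai_enhance(image):
    # Stand-in for an optional AI stage (scene detection, smoothing, ...).
    return np.clip(image + 0.05, 0.0, 1.0)

def capture(raw, ai_enabled=True):
    base = demosaic_and_tone_map(raw)            # always applied
    shown = ai_enhance(base) if ai_enabled else base
    # Keep the base frame alongside the result so the AI step stays reversible.
    return {"base": base, "shown": shown, "ai": ai_enabled}

def toggle_ai(shot):
    # Undo or redo the AI stage after the fact, like an AI flag in the gallery.
    shot["ai"] = not shot["ai"]
    shot["shown"] = ai_enhance(shot["base"]) if shot["ai"] else shot["base"]
    return shot

raw = np.full((2, 2), 0.5)
shot = capture(raw, ai_enabled=True)
print(shot["shown"].mean())             # with the AI stage applied
print(toggle_ai(shot)["shown"].mean())  # back to base processing only
```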

    The reflection removal feature is obviously a guess and YMMV. There is no getting away from that and it is exactly why I said 'ever improving' in the original reply. 

    The thing is that AI-infused 'guesswork' is getting better and better, and it is always good to have the option of using it.

    Everyone will have their ethical lines drawn in different places. Some will be clear, others, not so. 

    No one has issues with automatic red-eye removal, because the red eye wasn't visible to the people in the shot anyway. 

    However, was it wrong for me to eliminate the very visible mosquito bites from my sister-in-law's legs in the beach photos?

    I know of a case where a married person had an affair and retouched the photos to remove a wedding ring.

    AI has the potential to make changes basically invisible to even a trained eye, so long as the changes don't go overboard. 

    Insta filters may jump off the screen but AI is capable of whitening your teeth to believable levels. And a whole lot more. Sometimes, less is more. Like with wigs!

    Whatever we may think, it's here to stay and getting automated and faster. As long as you can control it, I'm ok with it. 

    Of course, for a whole generation of children, grandmas will look nothing like they really did once they've gone, but that's life, and I can't begrudge them 'cheating' when I'll almost definitely do the same when I'm 'old'. 
    edited July 16
  • Reply 45 of 46
    AppleZulu Posts: 2,120member
    It's true that a photograph is, at its most basic, already only an approximation of reality, and cameras have always made it possible to make editorial choices when capturing an image. 

    That said, as a matter of practice, wholesale alterations of an image shouldn't be automatic. That should be a user choice in editing and finishing the photograph. 

    You can retouch your photo to your heart's content, but the camera shouldn't whiten teeth, restore hairlines, remove wrinkles, red-eye, bug bites and wedding rings without capturing and showing the harsher reality first. This is true both for a basic respect for the truth, and because it's far harder to undo an alteration that has overwritten reality than it is to go back and start with reality and make different choices about editing from there. 

    Maybe the mosquito bites actually help tell a humorous story about the trials and tribulations of a family beach vacation, and your sister-in-law would actually like the original version to remember that. A car passenger's face might be partially obscured by a reflection on the windshield. AI "reflection removal" may render the entire windshield blank, as if the passenger was never there, or it might restore a face on the passenger, but not one that looks like the actual passenger. If AI corrections are applied to the original image before it is even saved, the user can't then decide that the partially obscured face or mosquito bites are preferable. If forgotten and retrieved later, the AI-altered "original" image might also constitute a false record. Interestingly, Apple realizes this, and when you use the basic editing tools in the Photos app, it preserves the original photo's data. If you open a photo on your iPhone that you've previously cropped, edited and saved, and you tap to edit it again, you will be given the option of reverting the photo back to its original state.
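
    Apple doesn't publish the internals, but the general idea behind that revert option, non-destructive editing, is simple to sketch: keep the untouched original, store edits as a separate list of instructions, render the edited view on demand, and let "revert" just discard the instructions. A minimal, generic illustration in Python (not Apple's actual implementation):

```python
class NonDestructivePhoto:
    """Keeps the original pixels; edits are stored as instructions, not baked in."""

    def __init__(self, original):
        self._original = original      # never modified
        self._edits = []               # ordered list of (name, function) pairs

    def add_edit(self, name, fn):
        self._edits.append((name, fn))

    def rendered(self):
        # Re-apply every edit to the original each time a view is needed.
        img = self._original
        for _, fn in self._edits:
            img = fn(img)
        return img

    def revert(self):
        # Reverting is trivial because the original was never overwritten.
        self._edits.clear()


# Toy usage: "pixels" are just a list of brightness values.
photo = NonDestructivePhoto([0.2, 0.5, 0.8])
photo.add_edit("brighten", lambda img: [min(1.0, p + 0.1) for p in img])
print(photo.rendered())   # ~[0.3, 0.6, 0.9]
photo.revert()
print(photo.rendered())   # [0.2, 0.5, 0.8] -- the original is still there
```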

    There's a reason professional photographers use "raw" image files as their digital starting point. The best starting point for any photo editing operation is the original sensor data. Some pros realize that even basic digital color management standards are built on fundamentally incorrect "good enough" approximations adopted during the early development of color television. You need an unaltered raw image file if you want to apply a different color management tool before those incorrect "good enough" standards are permanently baked into the image. 
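
    To make the "start from raw" point concrete: turning raw sensor data into a display-ready image bakes in white balance, a camera-to-display color matrix and a tone curve, and once only the result is saved, the original sensor values can't be recovered exactly. A simplified NumPy sketch with made-up calibration numbers (not any real camera's):

```python
import numpy as np

# One "raw" RGB sample straight off the sensor (linear, camera color space).
raw = np.array([0.30, 0.25, 0.20])

# Step 1: white balance -- per-channel gains chosen for the light source.
wb_gains = np.array([2.0, 1.0, 1.5])            # illustrative values only
balanced = raw * wb_gains

# Step 2: 3x3 matrix mapping camera primaries to the display color space.
# Real matrices come from calibration; this one is made up for the example.
cam_to_display = np.array([[ 1.6, -0.4, -0.2],
                           [-0.3,  1.5, -0.2],
                           [-0.1, -0.5,  1.6]])
display_linear = cam_to_display @ balanced

# Step 3: tone curve (a simple gamma here) for the final encoded pixel.
encoded = np.clip(display_linear, 0.0, 1.0) ** (1 / 2.2)

print(encoded)  # once only this is saved, the raw values are effectively gone
```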
    edited July 16
  • Reply 46 of 46
    avon b7 Posts: 7,943member
    Like I said, none of the AI enhancements on my phone are permanent on a shutter click. In the gallery they appear with an AI flag. Hitting that flag removes the AI enhancements. 

    It's optional from the outset and undoable from the shutter click.

    If I modify the shot from the gallery app or any other I have the choice of creating a new file or overwriting the original. 

    Reflection removal has never been an 'automatic' step. It's an after-the-shot affair.

    Yes, some reflections can add another dimension to a photo. Sometimes by accident, sometimes deliberately.

    I removed the mosquito bites because she was self-conscious of them, and I didn't want people to be distracted from a great photo by being drawn to the red blotches on her legs. She didn't ask me to do it, and the originals are there if she wants them. 

    Everyone will have their own take on what is right or wrong with this stuff but the key is having the options in the first place. 
