It looks like Samsung is cheating on 'space zoom' moon photos

Comments

  • Reply 21 of 43
    Appleish Posts: 687 member
    Not Samsung! How could this be? /s
    radarthekat, lolliver, watto_cobra
  • Reply 22 of 43
    netrox Posts: 1,415 member
    That's why they called it "space zoom": a way of saying, "We can zoom into space, and we'll add data from known space objects so you'll think you're seeing clear details of identified objects." It never claimed to be a 600mm optical zoom.
    lolliver, watto_cobra
  • Reply 23 of 43
    netrox Posts: 1,415 member
    It is simply not possible for a sensor of that size to capture 200MP worth of real detail, even with optical zoom, and even optical zoom cannot keep adding detail indefinitely as you zoom in. There's a limit to what can be resolved at different distances.

    That's why we have both microscopes and telescopes. You cannot have one instrument and expect it to resolve all the details at all distances at the same time.

    Like how you cannot simultaneously measure both the position and the momentum of a particle with arbitrary precision.

    Simple laws of physics prevent us from achieving what we want, so we end up deluding ourselves with AI stuff. Sadly.

    radarthekat, lolliver, watto_cobra
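    For the curious, the resolution point above can be sanity-checked with the Rayleigh diffraction limit. This is a rough sketch only: the ~5 mm aperture and green-light wavelength are illustrative assumptions, not actual Galaxy specs.

    ```python
    import math

    # Back-of-envelope diffraction limit for a phone telephoto camera.
    # Assumed numbers (not real specs): ~5 mm effective aperture, green light,
    # and the moon subtending about 0.52 degrees of sky.
    wavelength = 550e-9                      # m, green light
    aperture = 5e-3                          # m, generous phone-lens estimate
    moon_diameter_rad = math.radians(0.52)   # moon's angular diameter

    # Rayleigh criterion: smallest resolvable angle for a circular aperture
    theta_min = 1.22 * wavelength / aperture

    # How many resolvable elements fit across the moon's disc
    elements_across_moon = moon_diameter_rad / theta_min
    print(round(elements_across_moon))  # roughly 68
    ```

    About 68 resolvable elements across the whole lunar disc, regardless of megapixel count, is nowhere near the crisp crater detail in the marketing shots, which is the commenter's point.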
  • Reply 24 of 43
    Might as well just download a NASA image if that’s the case. 
    Not the best analogy. A NASA photo may be composed of images from a number of Earth- and space-based telescopes, at wavelengths ranging from radio to X-ray. 
    watto_cobra
  • Reply 25 of 43
    coolfactor Posts: 2,239 member

    The use of the term "AI" (Artificial Intelligence) to describe all of this technology is really getting out of hand. 

    This is Machine Learning — "ML". Are writers using the term "AI" to dumb articles down for the average Joe?

    Here's a good explanation of the difference. AI is not wrong, but it's far too broad, as explained in the video.
    https://www.youtube.com/watch?v=J4Qsr93L1qs

    Our phones are not sentient, so they do not yet have "intelligence". But they do learn, and they respond to input with an expected output.
    radarthekat, roundaboutnow, lolliver, watto_cobra
  • Reply 26 of 43
    radarthekat Posts: 3,842 moderator
    bohuj said:
    Bullshit from Apple fans. It's easy to confirm it's not cheating: take an S22 or S23 Ultra and, using 100x zoom, take a photo of any lamppost. It works the same as the moon shot; you will see all the details inside the lamp, like the bulb or LED array, wires, etc. 
    Tell us you know Samsung is cheating by attempting to distract from the presented facts.  
    magman1979, lolliver, watto_cobra
  • Reply 27 of 43
    radarthekat Posts: 3,842 moderator
    avon b7 said:
    Xed said:
    I don’t really care if it cheated. To me, what’s more important is how well it can cheat. If the Samsung can accurately recreate PORTIONS of the moon occluded by a tree or other object, then the cheating is good enough that it’s actually useful, because people can make custom images of the moon that you can’t just download from the internet. Has anyone tried this?
    But that's not the camera, and it shouldn't be touted as such. That would be a perfectly fine app or service for photography, but it needs to be presented as such. I doubt we're very far away from taking an image and then saying/typing to an AI, "clean up this image so that the tree isn't obstructing the moon, add detail to the full moon, and make it 20% larger in the night sky," and having it render that in a split second.
    There are two sides to this problem. 

    The purist in me says the photo should be what I see and little more. 

    The realist in me says we now have the ability to do so much more with the camera app, AI, ISP and sensors, that the line between what is acceptable and what is not, is blurred.

    We take a photo through a train window and use AI to remove the reflection. We use AI on selfies and selfie video to adjust where our eyes are looking. We use AI to create a bokeh effect. We use modern ISPs and sensors to take photos in very low light. Sometimes turning night almost into day. 

    In all of the above cases, the result can be better or worse. 

    In the specific case of the moon, 99% of point-and-shoot photos do not give you anything even remotely close to what you actually see. What you get is a little white blurry blob. Enough to ruin the photo.

    Anything that corrects that problem is probably very welcome because then you will be getting closer to what you actually see. Even if the processing draws on elements that are pre-stored on the phone and AI performs the magic. 

    The key would be just how effectively the image is recomposed within the night sky you are actually seeing. 

    I'm sure most people would gladly choose such an option.

    Samsung is probably overselling what the phone is doing out of the gate and being economical with the truth. It's nothing new though. Periscope lenses and extreme zoom capabilities have their use cases. 

    We were in exactly the same position four years ago with the P30 Pro and its Moon Mode. Again, an option. Once the storm had passed no one really cared. People use it gladly. 

    It's important not to forget that in moon shots, the finished photo is probably closer to what you actually witnessed on the night. 

    Where we stand on this subject will depend on what we are talking about. Samsung's marketing push or how well the final shot represents what we saw. 

    Here is a modern periscope moonshot from an upcoming phone. 

    How many of us would love to take something like that with a point and shoot camera phone? Let's not forget that the moon might be a focal point in the composition but it's the zoom that is really shining here. 

    It's a tough call. 


    https://www.gizchina.com/2023/03/02/huawei-p60-pro-camera-zoom-the-world-is-not-yet-ready-for-this/

    It should never be a tough call when we're speaking about truth versus deception.  That's what this news story is about. 

     It's a second conversation if you want to discuss how we might consider using machine learning and external data sources to enhance images.  
    edited March 2023 | roundaboutnow, magman1979, lolliver, MacPro, watto_cobra
  • Reply 28 of 43
    So Samsung bad but fake bokeh is good, right?
  • Reply 29 of 43
    spheric Posts: 2,544 member
    So Samsung bad but fake bokeh is good, right?
    Faking an "enhancement" by including a template for a common test case, to make it look like your product is doing something it cannot do, is fraud. 

    Telling everybody that your camera can create sharp, detailed images at 100x zoom and then using the moon as an example, knowing that the camera is in fact completely incapable of this, is bad. 
    roundaboutnow, magman1979, lolliver, watto_cobra
  • Reply 30 of 43
    sflocal Posts: 6,092 member
    bohuj said:
    Bullshit from Apple fans. It's easy to confirm it's not cheating: take an S22 or S23 Ultra and, using 100x zoom, take a photo of any lamppost. It works the same as the moon shot; you will see all the details inside the lamp, like the bulb or LED array, wires, etc. 
    Oh look... Scamscum's shill cheque must have been accepted by this troll's bank.
    roundaboutnow, magman1979, lolliver, watto_cobra
  • Reply 31 of 43
    avon b7 Posts: 7,622 member
    avon b7 said:
    Xed said:
    I don’t really care if it cheated. To me, what’s more important is how well it can cheat. If the Samsung can accurately recreate PORTIONS of the moon occluded by a tree or other object, then the cheating is good enough that it’s actually useful, because people can make custom images of the moon that you can’t just download from the internet. Has anyone tried this?
    But that's not the camera, and it shouldn't be touted as such. That would be a perfectly fine app or service for photography, but it needs to be presented as such. I doubt we're very far away from taking an image and then saying/typing to an AI, "clean up this image so that the tree isn't obstructing the moon, add detail to the full moon, and make it 20% larger in the night sky," and having it render that in a split second.
    There are two sides to this problem. 

    The purist in me says the photo should be what I see and little more. 

    The realist in me says we now have the ability to do so much more with the camera app, AI, ISP and sensors, that the line between what is acceptable and what is not, is blurred.

    We take a photo through a train window and use AI to remove the reflection. We use AI on selfies and selfie video to adjust where our eyes are looking. We use AI to create a bokeh effect. We use modern ISPs and sensors to take photos in very low light. Sometimes turning night almost into day. 

    In all of the above cases, the result can be better or worse. 

    In the specific case of the moon, 99% of point-and-shoot photos do not give you anything even remotely close to what you actually see. What you get is a little white blurry blob. Enough to ruin the photo.

    Anything that corrects that problem is probably very welcome because then you will be getting closer to what you actually see. Even if the processing draws on elements that are pre-stored on the phone and AI performs the magic. 

    The key would be just how effectively the image is recomposed within the night sky you are actually seeing. 

    I'm sure most people would gladly choose such an option.

    Samsung is probably overselling what the phone is doing out of the gate and being economical with the truth. It's nothing new though. Periscope lenses and extreme zoom capabilities have their use cases. 

    We were in exactly the same position four years ago with the P30 Pro and its Moon Mode. Again, an option. Once the storm had passed no one really cared. People use it gladly. 

    It's important not to forget that in moon shots, the finished photo is probably closer to what you actually witnessed on the night. 

    Where we stand on this subject will depend on what we are talking about. Samsung's marketing push or how well the final shot represents what we saw. 

    Here is a modern periscope moonshot from an upcoming phone. 

    How many of us would love to take something like that with a point and shoot camera phone? Let's not forget that the moon might be a focal point in the composition but it's the zoom that is really shining here. 

    It's a tough call. 


    https://www.gizchina.com/2023/03/02/huawei-p60-pro-camera-zoom-the-world-is-not-yet-ready-for-this/

    It should never be a tough call when we're speaking about truth versus deception.  That's what this news story is about. 

     It's a second conversation if you want to discuss how we might consider using machine learning and external data sources to enhance images.  
    I targeted the 'truth' aspect in my post. The article, through its existence, raises my other point. 

    It's always going to be a tough call when talking about deception because people are going to draw the line between what is acceptable and unacceptable in different places for different reasons.

    For the moon specifically, I'm sure most people will accept the use of pre-shot material over a white blob, especially if you can toggle the retouching on and off. 

    With that in mind, it's disappointing that Samsung wasn't more open with what is happening here. I'm assuming there is no small print somewhere detailing the use of pre-stored material. 
    MacPro
  • Reply 32 of 43
    MacPro Posts: 19,718 member
    bohuj said:
    Bullshit from Apple fans. It's easy to confirm it's not cheating: take an S22 or S23 Ultra and, using 100x zoom, take a photo of any lamppost. It works the same as the moon shot; you will see all the details inside the lamp, like the bulb or LED array, wires, etc. 
    You have missed the point, either deliberately or because you are not thinking straight. No one is claiming the 'trickery' is limited to moon shots. What prevents the same AI tech from being used on the lamppost? Have you tried Topaz Labs Sharpen AI, or Adobe's Neural Filters such as Photo Restoration (and many other AI tools)? They don't care what the subject is.

    Now, is this disingenuous? It all depends on awareness. If you are an end user, it is fun for sure. However, if you bought the device because you were tricked by marketing and thought this was a genuine picture you took, not so much.

    I use Topaz filters in Photoshop to clean up real estate photography. They are tools. I don't use them in my bird photography; I use a Sony 600mm lens.
    edited March 2023 | magman1979, lolliver, watto_cobra
  • Reply 33 of 43
    macgui Posts: 2,350 member
    avon b7 said:
    Xed said:
    I don’t really care if it cheated. To me, what’s more important is how well it can cheat. If the Samsung can accurately recreate PORTIONS of the moon occluded by a tree or other object, then the cheating is good enough that it’s actually useful, because people can make custom images of the moon that you can’t just download from the internet. Has anyone tried this?
    But that's not the camera, and it shouldn't be touted as such. That would be a perfectly fine app or service for photography, but it needs to be presented as such. I doubt we're very far away from taking an image and then saying/typing to an AI, "clean up this image so that the tree isn't obstructing the moon, add detail to the full moon, and make it 20% larger in the night sky," and having it render that in a split second.
    There are two sides to this problem. 

    The purist in me says the photo should be what I see and little more. 

    The realist in me says we now have the ability to do so much more with the camera app, AI, ISP and sensors, that the line between what is acceptable and what is not, is blurred.

    We take a photo through a train window and use AI to remove the reflection. We use AI on selfies and selfie video to adjust where our eyes are looking. We use AI to create a bokeh effect. We use modern ISPs and sensors to take photos in very low light. Sometimes turning night almost into day. 

    In all of the above cases, the result can be better or worse. 

    In the specific case of the moon, 99% of point-and-shoot photos do not give you anything even remotely close to what you actually see. What you get is a little white blurry blob. Enough to ruin the photo.

    Anything that corrects that problem is probably very welcome because then you will be getting closer to what you actually see. Even if the processing draws on elements that are pre-stored on the phone and AI performs the magic. 

    The key would be just how effectively the image is recomposed within the night sky you are actually seeing. 

    I'm sure most people would gladly choose such an option.

    Samsung is probably overselling what the phone is doing out of the gate and being economical with the truth. It's nothing new though. Periscope lenses and extreme zoom capabilities have their use cases. 

    We were in exactly the same position four years ago with the P30 Pro and its Moon Mode. Again, an option. Once the storm had passed no one really cared. People use it gladly. 

    It's important not to forget that in moon shots, the finished photo is probably closer to what you actually witnessed on the night. 

    Where we stand on this subject will depend on what we are talking about. Samsung's marketing push or how well the final shot represents what we saw. 

    Here is a modern periscope moonshot from an upcoming phone. 

    How many of us would love to take something like that with a point and shoot camera phone? Let's not forget that the moon might be a focal point in the composition but it's the zoom that is really shining here. 

    It's a tough call. 


    https://www.gizchina.com/2023/03/02/huawei-p60-pro-camera-zoom-the-world-is-not-yet-ready-for-this/

    It should never be a tough call when we're speaking about truth versus deception.  That's what this news story is about. 

     It's a second conversation if you want to discuss how we might consider using machine learning and external data sources to enhance images.  
    This. 
    magman1979, lolliver, watto_cobra
  • Reply 34 of 43
    jfabula1 Posts: 138 member
    OK… fake, AI, machine learning, they are everywhere nowadays. So what do I call fake boobs now? Hmmm, maybe AI-enhanced? I think we should get used to it, just saying. Maybe another marketing stunt, who knows. Good job, Samsung.
  • Reply 35 of 43
    AI is not wrong, but it's far too broad,
    There is one other minor detail: AI does not exist.
    watto_cobra
  • Reply 36 of 43
    applguy Posts: 235 member
    I'd be interested to know whether the phone was connected to the internet, and whether there would be a difference between being connected and not. I can believe the onboard AI may identify that it's the moon, then download pics of the moon to create an image. 
  • Reply 37 of 43
    AppleZulu Posts: 1,989 member
    This makes me think of all those cop shows where the detective asks the tech guy to zoom in and enhance a blurry security-camera image, and in an instant a smudge becomes a sharp, detailed photo that they then use to ID the perp and read his watch to establish exactly when he was at the scene of the crime. It's all BS, but it moves the show forward to the next scene.
    lolliver, watto_cobra
  • Reply 38 of 43
    avon b7 Posts: 7,622 member
    AI is not wrong, but it's far too broad,
    There is one other minor detail: AI does not exist.
    These things are context and definition dependent:

    https://ai.engineering.columbia.edu/ai-vs-machine-learning/
  • Reply 39 of 43
    Is this similar to taking a picture of my wife from ten miles away and, once it's enlarged, it turns out I'm married to Rihanna? :)
    MacPro, lolliver, spheric, watto_cobra
  • Reply 40 of 43
    applguy said:
    I'd be interested to know if the phone was connected to the internet and if there would be a difference between being connected and not connected. I can believe the onboard AI may identify it's a moon then download pics of the moon to create an image. 
    That's no moon - it's a space station!
    DAalseth, watto_cobra