It looks like Samsung is cheating on 'space zoom' moon photos

Posted in General Discussion, edited March 2023
Moon photos taken with the "Space Zoom" of Samsung's flagship smartphone models appear to be more a feat of AI trickery than anything else, a Reddit user's investigation into the feature claims.

Samsung S23 Ultra


Samsung's flagship Galaxy smartphone lineup, including the Galaxy S23 Ultra, has an extremely high level of zoom for the rear cameras. With a 100x zoom level, created by augmenting 3x and 10x telephoto cameras with a digital zoom aided by Samsung's AI Super Resolution technology, it can capture shots of things very far away.

That so-called Space Zoom could potentially allow users to photograph the moon, and many do. However, the level of detail in those moon shots may owe more to software shenanigans than to the optics.

In Friday's post to the Android subreddit, "u/ibreakphotos" declared that Samsung's Space Zoom "moon shots are fake," and that they had proof. The lengthy post then makes that case in a fairly convincing way.

Referring to previous reporting that the moon photographs from the S20 Ultra and later models are real and not faked, the Redditor points out that no one had managed to prove them real or fake until their post.

The user tested the effect by downloading a high-resolution image of the moon, downsizing it to 170 by 170 pixels, and then applying a Gaussian blur to obliterate any remaining surface detail.

They then showed the low-res, blurry moon at full screen on their monitor, walked to the other end of the room, zoomed in on the fake celestial body, and took a photograph. After some processing, the smartphone produced an image of the moon whose surface had considerably more detail than the doctored source.
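The preparation step the Redditor describes can be sketched in a few lines of Pillow. This is an illustrative approximation, not their actual script: the blur radius and filenames are guesses, since the post doesn't state exact parameters.

```python
from PIL import Image, ImageFilter

def make_test_target(src, size=170, blur_radius=4.0):
    """Downscale to size x size pixels, then Gaussian-blur away remaining detail."""
    small = src.resize((size, size), Image.LANCZOS)
    return small.filter(ImageFilter.GaussianBlur(radius=blur_radius))

# A flat grey placeholder stands in for the high-resolution moon photo,
# which in practice would be loaded with Image.open() from a real file.
hires = Image.new("L", (2048, 2048), color=128)
target = make_test_target(hires)
print(target.size)  # (170, 170)
```

Displayed full screen and photographed from across a room, an image prepared this way contains far too little information for any optical system to recover real crater detail.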

The low-res and blurry source image of the moon (left), and what Samsung's smartphone processed it as (right) [Reddit u/ibreakphotos]


The user reckons Samsung "is leveraging an AI model to put craters and other details on places which were just a blurry mess." They go further to stress that while super resolution processing uses multiple images to recover otherwise-lost detail, this seems to be something different.

It is proposed that this is a case "where you have a specific AI model trained on a set of moon images, in order to recognize the moon and slap on the moon texture on it."

"This is not the same kind of processing that is done when you're zooming into something else, when those multiple exposures and different data from each frame amount to something," they propose. "This is specific to the moon."

It is reckoned that since the moon is tidally locked to Earth, "it's very easy to train your model on other moon images and just slap that texture when a moon-like thing is detected," and that the AI is "doing most of the work, not the optics."

Referring to an earlier failed attempt to bust Space Zoom's quality, Samsung explained that the feature captures up to 20 pictures, then processes them into a composite with AI. That AI identifies the content of the scene, then performs a "detail enhancing function" on the subject.

At the time of a previous investigation in 2021, attempts to trigger an overlay or AI processing on a clove of garlic on a black background, or on a table tennis ball, failed to trick the smartphone. The 2023 test using a 170-by-170-pixel image of the real moon may have given the AI processing just enough basic detail to make it think it was looking at the actual moon.

The new test also eliminates any sort of multi-frame sharpening from being used, since it's a shot of the same low-resolution moon for every frame.
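Why stacking identical frames can't help can be shown with a minimal NumPy sketch. This is illustrative only, not Samsung's pipeline: averaging N copies of the same image just returns that image, whereas averaging frames that differ only by independent sensor noise suppresses the noise, which is the effect multi-frame sharpening exploits.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((170, 170))  # stand-in for the blurred on-screen moon target

# Case 1: every frame is literally the same image -> stacking adds nothing.
identical_stack = np.stack([scene] * 20)
assert np.allclose(identical_stack.mean(axis=0), scene)

# Case 2: frames differ only by independent sensor noise -> averaging
# shrinks the noise by roughly sqrt(20), recovering the underlying scene.
noisy_stack = np.stack(
    [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(20)]
)
residual = np.abs(noisy_stack.mean(axis=0) - scene).mean()
print(residual < 0.05)  # True: the averaged noise is small
```

In the Redditor's test, every exposure saw the same blurred pixels, so any crater detail in the output had to come from somewhere other than the frames themselves.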

It remains to be seen if this brief investigation will trigger closer scrutiny of the use of AI in photography, but the concept is one that has been employed across the entire mobile industry. Even Apple leans on computational photography to improve the quality of images from its cameras.

While the public may be convinced that AI processing applied to smartphone camera images is generally a good thing, oddly specific instances such as this may give pause to people who care about photography as an art form.

Read on AppleInsider

Comments

  • Reply 1 of 43
Xed Posts: 2,519 member
    You may as well just use Google Images instead of taking your own photos.
  • Reply 2 of 43
DAalseth Posts: 2,783 member
    The general rule of thumb in Astronomy is an optical telescope is limited to 50x per inch of aperture. (Not counting adaptive optics etc., which is not a factor here). The lens on this thing is what, a quarter inch? (Being generous). That means that any optical zoom over 12 is not going to do anything but fuzz. It’s like those department store telescopes that promise 300-400 power out of a one inch lens. Not gonna happen. You can pile on the lenses, but it won’t do you any good. 
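The arithmetic behind that rule of thumb can be laid out in a couple of lines, taking the commenter's 50x-per-inch figure and quarter-inch aperture estimate as given (both are assumptions from the comment, not measured specs):

```python
# Rule of thumb from the comment above: useful optical magnification tops out
# at roughly 50x per inch of aperture (ignoring adaptive optics, seeing, etc.).
def max_useful_magnification(aperture_inches, per_inch=50.0):
    return per_inch * aperture_inches

phone_lens = max_useful_magnification(0.25)  # generous phone-lens estimate
dept_store = max_useful_magnification(1.0)   # one-inch "300-400 power" scope
print(phone_lens, dept_store)  # 12.5 50.0
```

On those numbers, a quarter-inch aperture caps out around 12.5x of useful magnification, far short of a claimed 100x.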

    So Samsung decided to sweeten the image with file photos, and claim absurd magnification numbers.

    They put the scum in Samscum.
  • Reply 3 of 43
chasm Posts: 3,273 member
    Wait, WHAT? Samsung CHEATING to make their products look like they perform better than they do??

    WELL I NEVER!!  :D
  • Reply 4 of 43
bohuj Posts: 1 member
Bullshit from Apple fans. It's easy to confirm it's not cheating: take an S22 or S23 Ultra and, using 100x zoom, take a photo of any lamppost. It works the same as the moon shot; you will see all the details inside the lamp, like the bulb or LED array, wires, etc.
  • Reply 5 of 43
lkrupp Posts: 10,557 member
    bohuj said:
Bullshit from Apple fans. It's easy to confirm it's not cheating: take an S22 or S23 Ultra and, using 100x zoom, take a photo of any lamppost. It works the same as the moon shot; you will see all the details inside the lamp, like the bulb or LED array, wires, etc.
    Ooo, a Samsung bot on patrol.
  • Reply 6 of 43
You can't defend this. These are faked photos. The article is about testing using a blurred photo of the moon, and the Samsung phone came out with a sharp version of that photo. That is physically impossible to do. The information from what the camera took isn't there to make a real sharp photo. Samsung is using AI to Photoshop its moon shots.
  • Reply 7 of 43
chutzpah Posts: 392 member
    How useful!  My photo albums are filled with pictures of the moon, you never know when you're going to see it next.
  • Reply 8 of 43
Mike Wuerthele Posts: 6,858 administrator
    bohuj said:
Bullshit from Apple fans. It's easy to confirm it's not cheating: take an S22 or S23 Ultra and, using 100x zoom, take a photo of any lamppost. It works the same as the moon shot; you will see all the details inside the lamp, like the bulb or LED array, wires, etc.
    1) The discoverer is an Android user, almost exclusively.
    2) This test you've proposed has nothing to do with the moon, just the zoom. This is addressed in the headline, lede, and article body.
3) The tester took a picture where the data was just enough to identify it as the moon, given that the same surface of the moon is always facing the earth. Samsung then added data that didn't exist to generate the photo. This isn't defensible, and it isn't defensible even with a picture that an owner shot of the actual moon. It's just faked, like Fidonet128 above said.
edited March 2023
  • Reply 9 of 43
s.metcalf Posts: 972 member
    Says a lot about Samsung as a company.  They are serial cheaters and liars.
  • Reply 10 of 43
saarek Posts: 1,520 member
At the end of the day you can only "zoom" in a certain amount per centimetre of lens. After that it makes no difference, so they are lying when they say the camera system on its own can zoom in 100x; that's total BS.

    Obviously the image you get out is largely fake because AI has simply guessed what should be there.

    But for the average user who just wants a good picture to dump on Facebook, well, they will be happy. Surely, at least as far as Samsung and the majority of their customers are concerned, what they get out at the end is what matters.

    For photos that are not zoomed to stupid levels the AI will enhance what is really there. 
  • Reply 11 of 43
monto Posts: 4 member
    The proposal suggests that this could be a situation where a particular AI model has been trained using a collection of moon images for the purpose of identifying the moon and superimposing its texture onto other images.
  • Reply 12 of 43
gilly33 Posts: 433 member
    bohuj said:
    Bullshit of Apple fans. It easy to confirm its not cheating, takę an s22 or s23 ultra and using 100x zoom takę a Photo of any lamppost. It works same As moon shot u will See all details inside the lamp like bulb or Led array, wires etc. 
Why are you hating on Apple fans? The person who did the experiment appears to be an Android user. And even I, an Apple fan as you say, was pretty amazed at these moon shots. Hey, people are always calling Apple's bluff, so get over it.
  • Reply 13 of 43
JimmyG Posts: 3 member
    Hi Folks,

First-time poster here; I joined just to post some of my own personal experience with telephoto lunar photography. FWIW, my experience in observing and recording the night sky spans a lifetime of amateur astronomy since before I got my first telescope in 1967.

    A few quick items that folks should be educated on and familiar with before casting one's public opinion on this thread's topic...

    1. The, er, "evidentiary" image of the moon purportedly shot with a Sony a7R III/200-600mm lens here...

    Fake Samsung Galaxy S21 Ultra moon shots debunked - MSPoweruser

    ...speaks to either A. the photographer's inability to know how to use that camera/lens combo for excellent results, and/or B. an attempt to use fake or misleading evidence for their argument. A simple search of the Sony 200-600mm moon images will reveal such...

    Search: Sony 200-600mm moon | Flickr

    ...clearly, that lens and pretty much any Sony body can provide a much more detailed lunar image than the one posted as evidence.

    2. The use of combining numerous images and using deconvolution methods in post processing has been a, um, "thing" in the world of astrophotography since the dawn of digital imaging...

    moon deconvolution at DuckDuckGo

    My take on Samsung's "100x Space Zoom" feature is that it's nothing more than an automated stacking and deconvolution software routine labeled/marketed as "AI".

    TL;DR...There is no trickery here except on the part of folks claiming such of Samsung.

    I hope that helps clear the air of any further misconceptions on this subject and, hopefully, any erroneous conclusions and/or opinions based on that unfamiliarity.

    Best,
    Jimmy G


  • Reply 14 of 43
9secondkox2 Posts: 2,663 member
    When I take a photo, I want it to be my photo - of the thing I took a picture of. What I DONT want is for my subject to be swapped out or mingled with someone else’s photo or a “sharper” AI approximation. 

    Might as well just download a NASA image if that’s the case. 
  • Reply 15 of 43
Mike Wuerthele Posts: 6,858 administrator
    JimmyG said:
    Hi Folks,

First-time poster here; I joined just to post some of my own personal experience with telephoto lunar photography. FWIW, my experience in observing and recording the night sky spans a lifetime of amateur astronomy since before I got my first telescope in 1967.

    A few quick items that folks should be educated on and familiar with before casting one's public opinion on this thread's topic...

    1. The, er, "evidentiary" image of the moon purportedly shot with a Sony a7R III/200-600mm lens here...

    Fake Samsung Galaxy S21 Ultra moon shots debunked - MSPoweruser

    ...speaks to either A. the photographer's inability to know how to use that camera/lens combo for excellent results, and/or B. an attempt to use fake or misleading evidence for their argument. A simple search of the Sony 200-600mm moon images will reveal such...

    Search: Sony 200-600mm moon | Flickr

    ...clearly, that lens and pretty much any Sony body can provide a much more detailed lunar image than the one posted as evidence.

    2. The use of combining numerous images and using deconvolution methods in post processing has been a, um, "thing" in the world of astrophotography since the dawn of digital imaging...

    moon deconvolution at DuckDuckGo

    My take on Samsung's "100x Space Zoom" feature is that it's nothing more than an automated stacking and deconvolution software routine labeled/marketed as "AI".

    TL;DR...There is no trickery here except on the part of folks claiming such of Samsung.

    I hope that helps clear the air of any further misconceptions on this subject and, hopefully, any erroneous conclusions and/or opinions based on that unfamiliarity.

    Best,
    Jimmy G


    Read the article for what the tester did.

    The data did not exist, at all, in the test shot, so it literally can't be "stacking and deconvolution" because it didn't exist in the first place. The Samsung phone added the data.
  • Reply 16 of 43
    I don’t really care if it cheated. To me, what’s more important is how well can it cheat? If the Samsung can accurately recreate PORTIONS of the moon occluded by a tree or other object, then the cheating is good enough that it’s actually useful because people can make custom images of the moon that you can’t just download from the internet. Has anyone tried this?
  • Reply 17 of 43
Xed Posts: 2,519 member
    I don’t really care if it cheated. To me, what’s more important is how well can it cheat? If the Samsung can accurately recreate PORTIONS of the moon occluded by a tree or other object, then the cheating is good enough that it’s actually useful because people can make custom images of the moon that you can’t just download from the internet. Has anyone tried this?
    But that's not the camera and it shouldn't be touted as such. That would be a perfectly fine app or service for photography, but it needs to be presented as such. I doubt we're very far away from taking an image and then saying/typing to AI, "clean up this image so that tree naturally is not obstructing the moon, add detail to the full moon, and make it 20% larger in the night sky," to have it render that in a split second.
edited March 2023
  • Reply 18 of 43
I semi-noticed this when taking photos of the moon as it rose. There were clouds in front of the moon, and at some point as the clouds moved away, my photos of the moon changed dramatically to remove any clouds from the moon. Definitely disappointing. So while I'm not saying you're wrong, I don't think you're completely right either, as the photos with the clouds in front came out just as good as those the "detail enhancing function" slaps a texture on.
  • Reply 19 of 43
avon b7 Posts: 7,622 member
    Xed said:
    I don’t really care if it cheated. To me, what’s more important is how well can it cheat? If the Samsung can accurately recreate PORTIONS of the moon occluded by a tree or other object, then the cheating is good enough that it’s actually useful because people can make custom images of the moon that you can’t just download from the internet. Has anyone tried this?
But that's not the camera and it shouldn't be touted as such. That would be a perfectly fine app or service for photography, but it needs to be presented as such. I doubt we're very far away from taking an image and then saying/typing to AI, "clean up this image so that tree naturally is not obstructing the moon, add detail to the full moon, and make it 20% larger in the night sky," to have it render that in a split second.
    There are two sides to this problem. 

    The purist in me says the photo should be what I see and little more. 

The realist in me says we now have the ability to do so much more with the camera app, AI, ISP and sensors, that the line between what is acceptable and what is not is blurred.

    We take a photo through a train window and use AI to remove the reflection. We use AI on selfies and selfie video to adjust where our eyes are looking. We use AI to create a bokeh effect. We use modern ISPs and sensors to take photos in very low light. Sometimes turning night almost into day. 

    In all of the above cases, the result can be better or worse. 

In the specific case of the moon, 99 percent of point-and-shoot photos do not give you anything even remotely close to what you actually see. What you get is a little white blurry blob, enough to ruin the photo.

    Anything that corrects that problem is probably very welcome because then you will be getting closer to what you actually see. Even if the processing draws on elements that are pre-stored on the phone and AI performs the magic. 

    The key would be just how effectively the image is recomposed within the night sky you are actually seeing. 

I'm sure most people would gladly choose such an option.

    Samsung is probably overselling what the phone is doing out of the gate and being economical with the truth. It's nothing new though. Periscope lenses and extreme zoom capabilities have their use cases. 

We were in exactly the same position four years ago with the P30 Pro and its Moon Mode. Again, an option. Once the storm had passed, no one really cared. People use it gladly.

    It's important not to forget that in moon shots, the finished photo is probably closer to what you actually witnessed on the night. 

    Where we stand on this subject will depend on what we are talking about. Samsung's marketing push or how well the final shot represents what we saw. 

    Here is a modern periscope moonshot from an upcoming phone. 

    How many of us would love to take something like that with a point and shoot camera phone? Let's not forget that the moon might be a focal point in the composition but it's the zoom that is really shining here. 

    It's a tough call. 


    https://www.gizchina.com/2023/03/02/huawei-p60-pro-camera-zoom-the-world-is-not-yet-ready-for-this/
  • Reply 20 of 43
tundraboy Posts: 1,884 member
    With a large enough image database, you can take a photo of a celebrity (at a concert you attended, say) and the Samsung phone can magically upgrade the facial image to studio quality.  Or if you have a friend who's a doppelgänger of a famous person, then pictures of them will be upgraded with database images of said famous person.