It looks like Samsung is cheating on 'space zoom' moon photos
Moon photos taken with the "Space Zoom" feature of Samsung's flagship smartphones appear to be more a feat of AI trickery than optics, a Reddit user's investigation into the feature claims.

Samsung S23 Ultra
Samsung's flagship Galaxy smartphone lineup, including the Galaxy S23 Ultra, offers an extremely high level of zoom from its rear cameras. With a 100x zoom level, created by augmenting the 3x and 10x telephoto cameras with digital zoom aided by Samsung's AI Super Resolution technology, it can capture shots of subjects very far away.
That so-called Space Zoom could potentially allow users to photograph the moon, and many do. However, the level of detail in those moon shots may owe more to software shenanigans than to the camera hardware.
In a Friday post to the Android subreddit, "u/ibreakphotos" declared that Samsung's Space Zoom "moon shots are fake," and that they had proof. The lengthy post then makes that case in a fairly convincing way.
Referring to previous reporting that the moon photographs from the S20 Ultra and later models are real and not faked, the Redditor points out that until their post, no one had managed to prove the shots real or fake.
The user tested the effect by downloading a high-resolution image of the moon, downsizing it to 170 by 170 pixels, and then applying a Gaussian blur to obliterate any remaining surface detail.
They then showed the low-res, blurry moon at full screen on their monitor, walked to the other end of the room, zoomed in on the fake celestial body, and took a photograph. After some processing, the smartphone produced an image of the moon whose surface had considerably more detail than the doctored source.
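The preparation step is straightforward to reproduce. Here is a minimal sketch in Python using the Pillow imaging library; the post doesn't say which tool was actually used, so the library, filenames, and blur radius are assumptions.

```python
# Recreate the doctored test target: shrink a high-resolution moon
# photo to 170x170 pixels, then blur away the remaining detail.
# Illustrative only; the tool, filenames, and blur radius are assumed.
from PIL import Image, ImageFilter

src = Image.open("moon_highres.jpg")           # any high-res moon photo
small = src.resize((170, 170), Image.LANCZOS)  # 170x170, as in the post
blurred = small.filter(ImageFilter.GaussianBlur(radius=3))
blurred.save("moon_test_target.png")           # shown full screen on a monitor
```

Displayed full screen and photographed from across the room, a target like this contains essentially no crater detail for the phone's optics to recover.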
![The low-res and blurry source image of the moon (left), and what Samsung's smartphone processed it as (right) [Reddit u/ibreakphotos]](https://photos5.appleinsider.com/gallery/53429-107213-moonshot-r-ibreakphotos-xl.jpg)
The low-res and blurry source image of the moon (left), and what Samsung's smartphone processed it as (right) [Reddit u/ibreakphotos]
The user reckons Samsung "is leveraging an AI model to put craters and other details on places which were just a blurry mess." They go further to stress that while super resolution processing uses multiple images to recover otherwise-lost detail, this seems to be something different.
They suggest this is a case "where you have a specific AI model trained on a set of moon images, in order to recognize the moon and slap on the moon texture on it."
"This is not the same kind of processing that is done when you're zooming into something else, when those multiple exposures and different data from each frame account to something," they propose. "This is specific to the moon."
Since the moon is tidally locked to the Earth, the user reasons, "it's very easy to train your model on other moon images and just slap that texture when a moon-like thing is detected," meaning the AI is "doing most of the work, not the optics."
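As an illustration only, that hypothesis boils down to something like the sketch below. This depicts the Redditor's theory, not Samsung's actual processing, which is unpublished; the function name, the detection flag, and the blending approach are all assumptions.

```python
# Sketch of the *hypothesized* moon-specific path: when the scene is
# classified as the moon, blend in a stored high-detail reference
# texture. This is not Samsung's actual code.
from PIL import Image

def process_zoom_shot(capture: Image.Image,
                      looks_like_moon: bool,
                      moon_reference: Image.Image) -> Image.Image:
    if not looks_like_moon:
        return capture  # ordinary multi-frame processing path
    # Match the stored reference to the capture, then blend its
    # detail over the blurry original.
    ref = moon_reference.convert(capture.mode).resize(capture.size)
    return Image.blend(capture, ref, alpha=0.7)
```

Under such a scheme, the output would track the general shape and orientation of the photographed blob while its fine detail comes from the stored reference, which is consistent with what the test produced.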
Responding to an earlier failed attempt to debunk Space Zoom's quality, Samsung explained that the feature takes up to 20 pictures, then processes them into a composite with AI. That AI identifies the content of the scene, and then performs a "detail enhancing function" on the subject.
At the time of a previous investigation in 2021, attempts to trigger an overlay or AI processing using a clove of garlic on a black background, or a table tennis ball, failed to trick the smartphone. The 2023 test using a 170-by-170-pixel image of the real moon may have given the AI processing just enough basic detail to make it think it was looking at the actual moon.
The new test also rules out any sort of multi-frame sharpening, since every frame is a shot of the same low-resolution moon.
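That elimination argument can be checked in a couple of lines. The snippet below, an illustration rather than anything from the post, shows the degenerate case: compositing identical frames (here a simple mean, standing in for any frame-averaging scheme) yields exactly the single frame again, so no new detail can emerge from combining them.

```python
# If all frames are identical, their composite equals any single
# frame, so combining them cannot recover extra detail.
import numpy as np

frame = np.random.rand(170, 170)      # stand-in for the blurry on-screen moon
stack = np.stack([frame] * 20)        # "up to 20 pictures," all identical
composite = stack.mean(axis=0)

assert np.allclose(composite, frame)  # the composite adds nothing new
```

Any extra craters in the final image therefore had to come from somewhere other than the captured frames.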
It remains to be seen whether this brief investigation will trigger closer scrutiny of the use of AI in photography, but the concept is one that has been employed across the entire mobile industry. Even Apple leans on computational photography to improve the quality of images from its cameras.
While the public may consider AI processing of smartphone camera images to be a good thing in general, oddly specific instances such as this may give pause to people who care about photography as an art form.
Comments
So Samsung decided to sweeten the image with file photos, and claim absurd magnification numbers.
They put the scum in Samscum.
2) This test you've proposed has nothing to do with the moon, just the zoom. This is addressed in the headline, lede, and article body.
3) The tester took a picture where the data was just enough to identify it as the moon, given that the same surface of the moon is always facing the Earth. Samsung then added data that didn't exist to generate the photo. This isn't defensible, and it wouldn't be defensible even with a picture that an owner shot of the actual moon. It's just faked, like Fidonet128 above said.
For photos that are not zoomed to stupid levels, the AI will enhance what is really there.
The data did not exist at all in the test shot, so the result literally can't come from "stacking and deconvolution." The Samsung phone added the data.
The purist in me says the photo should be what I see and little more.
The realist in me says we now have the ability to do so much more with the camera app, AI, ISP and sensors, that the line between what is acceptable and what is not, is blurred.
We take a photo through a train window and use AI to remove the reflection. We use AI on selfies and selfie video to adjust where our eyes are looking. We use AI to create a bokeh effect. We use modern ISPs and sensors to take photos in very low light. Sometimes turning night almost into day.
In all of the above cases, the result can be better or worse.
In the specific case of the moon, 99% of point-and-shoot photos do not give you anything even remotely close to what you actually see. What you get is a little white blurry blob. Enough to ruin the photo.
Anything that corrects that problem is probably very welcome because then you will be getting closer to what you actually see. Even if the processing draws on elements that are pre-stored on the phone and AI performs the magic.
The key would be just how effectively the image is recomposed within the night sky you are actually seeing.
I'm sure most people would gladly choose such an option.
Samsung is probably overselling what the phone is doing out of the gate and being economical with the truth. It's nothing new though. Periscope lenses and extreme zoom capabilities have their use cases.
We were in exactly the same position four years ago with the P30 Pro and its Moon Mode. Again, an option. Once the storm had passed, no one really cared. People gladly use it.
It's important not to forget that in moon shots, the finished photo is probably closer to what you actually witnessed on the night.
Where we stand on this subject will depend on what we are talking about. Samsung's marketing push or how well the final shot represents what we saw.
Here is a modern periscope moonshot from an upcoming phone.
How many of us would love to take something like that with a point and shoot camera phone? Let's not forget that the moon might be a focal point in the composition but it's the zoom that is really shining here.
It's a tough call.
https://www.gizchina.com/2023/03/02/huawei-p60-pro-camera-zoom-the-world-is-not-yet-ready-for-this/