Over the last few days, Samsung has come under fire over the originality of the “moon shots” taken by its Galaxy S23 Ultra camera.
The Samsung Galaxy S23 Ultra, which was unveiled last month and started selling in Kenya in the open market this month, has a feature called Space Zoom. This feature, among other things, has been promoted by Samsung as being very good at taking mind-blowing snaps of the Earth’s only satellite, the moon.
The big question right now is: how does it do that?
And that discussion didn’t start just the other day. The murmurs have persisted for a while, and they won’t go away, especially since there’s precedent: around this time four years ago, Huawei, pretty much the pioneer of the “crazy moon shots” segment of smartphone photography, was caught cheating. So, yeah, there’s quite a lot of reason to suspect some cheating in how these moon images, the kind that make for good conversation starters among friends and family, come about.
My personal reference point over the last few years has been this research conducted by then-Input Mag reviewer Raymond Wong.
Over the last few days, people on Reddit, and yet more reviewers, have added their voices to this unfolding debacle.
So, are the photos you get after you point your Samsung Galaxy S21 Ultra, S22 Ultra or S23 Ultra at the moon real or fake?
Well, who better to explain than the people who make the hardware and the software in question themselves?
Samsung has come up with a detailed explainer of what happens, reinforcing what had already been shared by a community manager on its forums back in January.
The company says:
Samsung is committed to delivering best-in-class camera experiences in all conditions. Since the introduction of the Galaxy S10, the Samsung Galaxy series has harnessed artificial intelligence (AI) technologies in its cameras to help users capture every epic moment anytime, anywhere.
As part of this, Samsung developed the Scene Optimizer feature, a camera functionality which uses advanced AI to recognize objects and thus deliver the best results to users. Since the introduction of the Galaxy S21 series, Scene Optimizer has been able to recognize the moon as a specific object during the photo-taking process, and applies the feature’s detail enhancement engine to the shot.
When you’re taking a photo of the moon, your Galaxy device’s camera system will harness this deep learning-based AI technology, as well as multi-frame processing in order to further enhance details.
TL;DR, yes, AI is involved, as Samsung explains:
In order to take a clear photo of the moon, Galaxy cameras harness Super Resolution to synthesize more than 10 images taken at 25x zoom or higher. The image taken at 25x zoom or above needs to eliminate noise and enhance clarity and other details.
Super Resolution technology helps produce images through multi-frame composition. When Scene Optimizer is turned on and the moon has been recognized as an object, the camera will deliver users a bright and clear image through the detail enhancement engine of Scene Optimizer on top of the Super Resolution technology.
The engine for recognizing the moon was built based on a variety of moon shapes and details, from full through to crescent moons, and is based on images taken from our view from the Earth.
It uses an AI deep learning model to detect the presence of the moon and identify the area it occupies – as denoted by the square box – in the relevant image. Once the AI model has completed its learning, it can detect the area occupied by the moon even in images that were not used in training…
Once the device has recognized the moon after zooming in, it will control the brightness of the display in order to present the moon more clearly while maintaining optimal brightness.
Accordingly, when users take photos of the moon at early evening time, the sky around the moon will appear darker than the actual sky colour because the device will darken the object background brightness in order to help make the moon appear more clearly. Once your device recognizes the moon, it will continuously adjust focus to stay directly on the moon.
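The multi-frame part of that explanation is a well-established idea in computational photography: stacking many aligned, noisy exposures of the same scene suppresses random sensor noise. Samsung's actual pipeline is proprietary, so the following is only a minimal sketch of the stacking principle, using a synthetic "moon" disc and assuming the frames are already perfectly aligned (real pipelines must register the frames first, and Samsung additionally applies its AI detail-enhancement engine on top):

```python
import numpy as np

rng = np.random.default_rng(42)

# A hypothetical "true" scene: a bright disc (the moon) on a dark sky.
size = 64
yy, xx = np.mgrid[:size, :size]
truth = ((xx - size / 2) ** 2 + (yy - size / 2) ** 2 < (size / 4) ** 2).astype(float)

# Simulate a burst of noisy captures, echoing the quote's
# "more than 10 images taken at 25x zoom or higher".
frames = [truth + rng.normal(0, 0.3, truth.shape) for _ in range(12)]

# Multi-frame composition: averaging aligned frames suppresses
# zero-mean sensor noise by roughly a factor of sqrt(N).
stacked = np.mean(frames, axis=0)

noise_single = np.std(frames[0] - truth)
noise_stacked = np.std(stacked - truth)
print(f"single-frame noise: {noise_single:.3f}")
print(f"stacked noise:      {noise_stacked:.3f}")
```

With twelve frames, the residual noise in the stack drops to roughly a third of a single frame's, which is why a burst at 25x zoom can look so much cleaner than any one shot. What stacking cannot do is conjure detail the lens never resolved; that is the job of the detail-enhancement engine, and it is precisely where the real-versus-fake debate lives.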
So, is the moon image you get when you take a photo of the moon after having zoomed in more than 25x and with the Scene Optimizer feature turned on real or fake? The answer to that question will vary depending on your understanding of the explanation given by Samsung and where you stand on the purity scale.
We are living in the age of computational photography. We have come to accept, and even expect, that some things are better handled beyond the traditional optics of photography: by complex algorithms and, now, pre-trained models. That means many aids are being used to enhance what we get out of the cameras on our phones, especially since the tiny spaces inside these devices mean there’s only so much that can be done as far as the hardware goes.
At the same time, there is something to be said about what a phone’s camera can and can’t do naturally, its maker’s extravagant claims about its capabilities, made for advertising’s sake or to get a leg up over competitors, notwithstanding.