The ability to capture extraordinarily detailed images of the Moon has long been a big selling point for Samsung's "Space Zoom"-enabled phones. But a recent Reddit post made clear just how much computational processing the company is doing, and given the evidence provided, it feels like we should come right out and say it: Samsung's images of the Moon are fake.
So what does "fake" actually mean in this context? It's a difficult question to answer, and it will only grow more significant and more complex as computational tools are woven further into the photographic process. We can be sure that our understanding of what constitutes a fake photo will evolve, just as it has in the past to accommodate digital cameras, Photoshop, Instagram filters, and more. For the moment, though, let's stick to the case of Samsung and the Moon.
Reddit user u/ibreakphotos carried out a test of Samsung's phones that was brilliant in its simplicity. They took a deliberately blurred photograph of the Moon, displayed it on a computer screen, and then photographed that screen with a Samsung S23 Ultra. While the image on the screen contained no detail at all, the resulting shot was a clean, crisp "photograph" of the Moon: the S23 Ultra added details that simply weren't there before. This wasn't a case of blurry pixels being upscaled or supposedly lost data being recovered. A new Moon had appeared, albeit a fake one.
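For readers who want to recreate the setup, here's a minimal sketch, assuming Python with the Pillow library, a local high-resolution Moon photo, and illustrative downscale and blur parameters rather than the exact ones from the Reddit post:

```python
# Minimal sketch: prepare a deliberately detail-free Moon test image.
# Assumes Pillow is installed and "moon.jpg" is a high-resolution Moon photo.
from PIL import Image, ImageFilter

src = Image.open("moon.jpg")

# Downscale aggressively so fine surface detail is thrown away,
# then blur whatever little detail remains.
small = src.resize((170, 170), Image.Resampling.BICUBIC)
blurred = small.filter(ImageFilter.GaussianBlur(radius=4))

# Display this image full-screen on a monitor, then photograph
# the monitor with the phone's zoom from across the room.
blurred.save("moon_blurred.png")
```

The point of the exercise is that the file saved here contains no crater-level detail for any camera to recover, so any such detail in the phone's output has to have been added by software.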
This controversy isn't new. Ever since Samsung introduced a 100x "Space Zoom" feature with the S20 Ultra in 2020, people have questioned the company's Moon photography. Despite claims from some quarters to the contrary, Samsung says the process is more sophisticated than simply pasting pre-stored textures onto images of the Moon.
In 2021, Input Mag published a lengthy article on the "false detailed moon images" produced by the Galaxy S21 Ultra. Samsung said at the time that it uses AI to recognize the Moon and then "provides a detail enhancement feature by eliminating blurring and noises," claiming that "no picture overlaying or texture effects are performed when capturing a shot."
Samsung later provided more detail in a blog post (translated from Korean by Google). But the crucial step in that explanation, the one that turns a photograph of a blurry Moon into a sharp one, is described in frustratingly vague terms. Samsung simply says it uses a "detail improvement engine function" that "effectively removes noise and maximizes the details of the moon to complete a bright and clear picture of the moon" (emphasis added). What does that mean, exactly? We simply don't know.
The most charitable reading is that Samsung's process involves using AI to upscale blurry detail that was present in the original image. That is a well-established technique, with its own problems. But as the Reddit experiment shows, Samsung's process is more intrusive than that: it doesn't just sharpen blurry details, it creates them. At that point, most people would agree that the resulting image is fake.
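To make the distinction concrete, here's a small sketch (with hypothetical file names, and in no way Samsung's actual pipeline) of why ordinary upscaling can't explain the result: interpolating the blurred test image to a larger size adds pixels but no new information, which even a crude sharpness score makes visible.

```python
# Sketch: plain interpolation adds pixels, not detail.
# Uses the blurred test image from the earlier snippet (hypothetical file name).
import numpy as np
from PIL import Image

def sharpness(img: Image.Image) -> float:
    """Crude sharpness score: variance of a 4-neighbour Laplacian response."""
    g = np.asarray(img.convert("L"), dtype=np.float32)
    lap = (-4.0 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

blurred = Image.open("moon_blurred.png")
upscaled = blurred.resize((blurred.width * 4, blurred.height * 4),
                          Image.Resampling.BICUBIC)

# The upscaled image is bigger but no sharper; crisp craters in the phone's
# output therefore had to come from somewhere other than these pixels.
print("blurred :", sharpness(blurred))
print("upscaled:", sharpness(upscaled))
```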
The trouble is that "fakeness" isn't a binary but a spectrum. (As with most of the categories we draw lines around.) In photography, "realness" is usually defined by the information an optical sensor captures when the shot is taken: the light that hit it. That data can be edited considerably, as skilled photographers do when adjusting colour, exposure, and contrast in RAW files, without the result becoming fake. In this case, though, the Moon photos the Samsung phone produced appear to come from a computational process rather than from optical data. It's less a photo than a generated image.
Not everyone will accept this definition, and that's okay. Drawing the line will also get much harder in the future. Ever since phone makers began using computational techniques to get around the limits of tiny smartphone image sensors, the balance of "optically captured" and "software-generated" data in their output has been shifting. Technologies like Samsung's "detail improvement engine" will only become more common and be applied in more contexts. Such engines could be trained on all kinds of data: on the faces of your family and friends so you never take a bad photo of them, or on famous landmarks to spruce up your vacation shots. In time, we'll probably forget we ever called such pictures fake.
For now, though, Samsung's Moon imagery stands out, because it's such a well-suited application for this kind of computational photography. First, the Moon is a convenient subject. Seen from Earth, it looks essentially the same in every photograph, even accounting for libration and rotational changes. It has detail but no depth, which makes layering on AI enhancement relatively easy. Second, Moon photography is a marketing goldmine, because a) everyone knows phones don't do the Moon justice and b) anyone can try the feature for themselves. That has made it easy for Samsung to show off its phones' photographic prowess.
That viral appeal is also what has caused the company's problems. By failing to explain the feature properly, Samsung has led many consumers to mistake its AI-enhanced photos for a physics-defying optical zoom that couldn't possibly fit in a smartphone. That in turn has motivated others to debunk the pictures (because the tech world loves a scandal). Samsung never explicitly claims its Moon shots are representative of all its zoom photography, but a consumer could easily come away with that impression, which is why it's worth stressing what is actually going on.
At the end of the day, as photography evolves, so will our sense of what a "genuine photo" looks like. For now, though, it seems fair to conclude that Samsung's Moon images are more fake than real. That may well change in a few years. Samsung has yet to comment. In the meantime, if you want an unprocessed shot of the Moon from your Samsung device, turn off the "Scene Optimizer" feature, and prepare to photograph a hazy circle in the sky.