There’s an ongoing controversy over claims that Samsung’s phones insert “fake” images of the moon into nighttime photos, a characterization that isn’t entirely accurate. Samsung has now stepped in to explain how the Galaxy S23 and its other devices capture moon shots.
Smartphone cameras don’t have the same lenses and zoom capabilities as dedicated DSLRs and mirrorless cameras, so they rely on a mix of specialized sensors and software processing to boost photo and video quality. That’s part of the reason a photo captured on an iPhone can look different from the same shot taken on a OnePlus or Samsung Galaxy phone: each company has its own optimizations, color balance, and other processing that affects the final result.
Back in 2020, Samsung introduced “Space Zoom” with the Galaxy S20 Ultra, which combined optical zoom with digital upscaling to push magnification to 100x. That level of zoom is useful in plenty of scenarios, but Samsung also advertised it as capable of capturing detailed images of the moon. The feature required “Scene Optimizer” to be enabled, the setting that powers many of Samsung’s AI camera features.
A Reddit post from March 10 by Reddit user ibreakphotos kicked off the current controversy, alleging that Samsung was faking moon photos and that “Samsung’s marketing is deceptive.” They downloaded a high-resolution image of the moon, downsized it, and blurred it so that all the fine detail was gone, then displayed it on a monitor and photographed it with an unspecified zoomed-in Samsung phone. The resulting photo contained detail that wasn’t present in the blurred image the camera was pointed at.
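The degradation step of that experiment can be sketched in a few lines of code. This is only an illustration of the premise, not the Redditor's actual script: it uses a synthetic checkerboard as the "detailed" image and a simple box blur in place of the Gaussian blur used in the post, and the `box_blur` and `detail` helpers are made up for this example.

```python
# Illustrative sketch (not the original experiment's code): blur an image
# hard enough and a rough "detail" metric collapses, which is the point of
# the test -- any detail in the final phone photo can't have come from the
# blurred source.

def box_blur(img, radius=3):
    """Average each pixel with its neighbors (a simple box blur)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def detail(img):
    """Crude detail metric: mean absolute difference between horizontal neighbors."""
    h, w = len(img), len(img[0])
    diffs = [abs(img[y][x] - img[y][x + 1]) for y in range(h) for x in range(w - 1)]
    return sum(diffs) / len(diffs)

# Synthetic "detailed" image: a 32x32 checkerboard of dark and bright pixels.
sharp = [[255.0 * ((x + y) % 2) for x in range(32)] for y in range(32)]
blurred = box_blur(sharp)

print(detail(sharp), detail(blurred))  # the blurred copy retains only a sliver of the detail
```

In the real experiment the role of `detail` is played by your own eyes: the blurred source visibly has no crater texture, so any texture in the Galaxy's output must have been synthesized.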
Samsung has explained how the feature works several times over the years, including in a detailed InputMag article and a Korean info page, but the controversy prompted Samsung to publish a new post explaining the feature. It’s based on the earlier article written in Korean, translated into English with some added clarity.
Samsung said in the post, “When you’re taking a photo of the moon, your Galaxy device’s camera system will harness this deep learning-based AI technology, as well as multi-frame processing in order to further enhance details. When Scene Optimizer is turned on and the moon has been recognized as an object, the camera will deliver users a bright and clear image through the detail enhancement engine of Scene Optimizer on top of the Super Resolution technology.”
Samsung uses an on-device AI model to recognize the moon in the frame, then adjusts focus and other settings to capture the best image possible. The image then runs through an “AI detail enhancement engine” to reduce noise and enhance detail. Because the AI model was trained on real photos of the moon, it can fill in details that aren’t visible in the raw data. That’s broadly how modern smartphones process every photo, not just photos of the moon; the AI enhancement is simply more aggressive for moon shots on Galaxy phones because there’s usually very little data to work with. An unprocessed smartphone photo of the moon might be nothing more than a bright dot.
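The “multi-frame processing” half of that pipeline is easy to demonstrate. The sketch below is a generic illustration of frame stacking, not Samsung’s implementation: averaging several noisy captures of the same scene cancels out random sensor noise, which is why stacking gives the detail-enhancement stage cleaner data to work with. The `capture`, `stack`, and `noise_level` functions are invented for this example, and the scene is a flat 1D strip of brightness values rather than a real sensor readout.

```python
import random

random.seed(0)  # deterministic noise for the demo

def capture(scene, noise=30.0):
    """Simulate one noisy capture of a 'scene' of true brightness values."""
    return [v + random.gauss(0, noise) for v in scene]

def stack(frames):
    """Average the frames pixel-by-pixel -- the core of multi-frame stacking."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def noise_level(img, scene):
    """Root-mean-square error against the true scene."""
    return (sum((a - b) ** 2 for a, b in zip(img, scene)) / len(img)) ** 0.5

scene = [128.0] * 1000                                 # the "true" brightness values
single = capture(scene)                                # one noisy frame
stacked = stack([capture(scene) for _ in range(16)])   # sixteen frames averaged

print(noise_level(single, scene))   # roughly the raw sensor noise
print(noise_level(stacked, scene))  # much lower -- about noise / sqrt(16) in expectation
```

Stacking alone only removes noise; it can’t invent crater texture that the sensor never saw. That part comes from the trained model, which is where the controversy lies.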
So, are moon photos from Galaxy phones “fake” or not? This is something of a Ship of Theseus situation. The result is still based on a real photo you took with your phone, but it has been dramatically enhanced by an AI model trained on much better photos of the moon. Is it still the same photo after that much modification? I’d say so, mostly because the distinction doesn’t matter much in smartphone photography. It’s also worth noting that this enhancement pushes images closer to real life, which is quite different from features like Beauty Mode that purposefully distort reality.
The only alternative to this level of AI enhancement is a blurry white dot, which is what you get when Scene Optimizer is turned off, because smartphones can’t physically fit the lenses and sensors required for detailed moon photos. We’ll need a few scientific breakthroughs before a flat rectangle in your pocket can do that.