Samsung’s Moon Shots Force Us to Ask How Much AI Is Too Much

And unlike, for example, the Eiffel Tower, its appearance is not going to change drastically based on lighting. Moon shooting typically only happens at night, and Samsung’s processing falls apart if the moon is partially obscured by clouds.

One of the clearest ways Samsung’s processing fiddles with the moon is in manipulating mid-tone contrast, making its topography more pronounced. However, it’s clearly also capable of introducing the appearance of texture and detail not present in the raw photo.
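For the curious, the mid-tone trick is easy to approximate. The Python sketch below is my own illustration, not Samsung's pipeline; the curve shape and the strength value are assumptions for demonstration. It applies an S-shaped tone curve that pins pure black and pure white in place while pushing mid-grays apart, which is exactly what makes lunar topography pop.

```python
import numpy as np
from PIL import Image

def boost_midtone_contrast(path: str, strength: float = 0.6) -> Image.Image:
    # Work on a normalized grayscale copy; the moon is close to
    # monochrome anyway.
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0
    # An S-shaped curve centered on mid-gray: the weight is 1 at 0.5
    # and 0 at pure black/white, so shadows and highlights stay pinned
    # while mid-tones are pushed apart, exaggerating surface relief.
    weight = 1.0 - np.abs(2.0 * img - 1.0)
    curved = img + strength * (img - 0.5) * weight
    out = np.clip(curved, 0.0, 1.0)
    return Image.fromarray((out * 255).astype(np.uint8))

# Example usage (hypothetical filename):
# boost_midtone_contrast("moon.jpg").save("moon_punchy.jpg")
```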

Samsung does this because the Galaxy S21, S22, and S23 Ultra phones’ 100x zoom images suck. Of course they do. They involve cropping massively into a small 10-MP sensor. Periscope zooms in phones are great, but they are not magic.

Credible Theories

Huawei is the other big company accused of faking its moon photos, with the otherwise brilliant Huawei P30 Pro from 2019. It was the last flagship Huawei released before the company was blacklisted in the US, effectively destroying its phones’ appeal in the West.

Android Authority claimed the phone pasted a stock image of the moon into your photos. Here’s how the company responded: “Moon Mode operates on the same principle as other master AI modes, in that it recognizes and optimizes details within an image to help individuals take better photos. It does not in any way replace the image—that would require an unrealistic amount of storage space since AI mode recognizes over 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps to optimize focus and exposure to enhance the details such as shapes, colors, and highlights/lowlights.”

Familiar, right?

You won’t see these techniques used by many other brands, but not for any high-minded reason. If a phone does not have a long-throw zoom of at least 5x, a Moon mode is largely pointless.

Trying to shoot the moon with an iPhone is difficult. Even the iPhone 14 Pro Max doesn’t have the zoom range for it, and the phone’s autoexposure will turn the moon into a searing blob of white. From a photographer’s point of view, the S23’s exposure control alone is excellent. But how “fake” are the S23’s moon images, really?

The most generous interpretation is that Samsung uses the real camera image data and simply applies its machine-learning knowledge to massage the processing. This could, for example, help it trace the outlines of the Sea of Serenity and the Sea of Tranquility when attempting to bring out a greater sense of detail from a blurred source.

However, that interpretation is stretched by the way the final image renders the positions of the Kepler, Aristarchus, and Copernicus craters with seemingly uncanny accuracy when these small features are not perceptible in the source. While you can infer where the moon’s features are from a blurry source, this is next-level stuff.
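To see why it would be next-level, consider the conventional, non-learned baseline for pulling detail out of a soft image: unsharp masking. This rough sketch (the radius and amount values are arbitrary illustrations) can only amplify edges already faintly present in the pixels; it cannot conjure a crater the sensor never resolved.

```python
import numpy as np
from PIL import Image, ImageFilter

def unsharp_mask(path: str, radius: float = 4.0, amount: float = 1.5) -> Image.Image:
    src = Image.open(path).convert("L")
    img = np.asarray(src, dtype=np.float32)
    # Blur, then subtract: what remains is the edge detail the blur
    # removed. Adding it back, amplified, steepens every edge without
    # adding any structure that wasn't captured in the first place.
    blurred = np.asarray(src.filter(ImageFilter.GaussianBlur(radius)), dtype=np.float32)
    sharpened = np.clip(img + amount * (img - blurred), 0.0, 255.0)
    return Image.fromarray(sharpened.astype(np.uint8))
```

A learned model effectively swaps that fixed blur-and-subtract step for priors about what lunar terrain should look like, which is where "enhancement" starts to shade into invention.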

Still, it’s easy to overestimate how much of a leg up the Samsung Galaxy S23 gets here. Its moon photos may look OK at a glance, but they are still bad. A recent Versus video pitting the S23 Ultra against the Nikon P1000 shows what a decent sub-DSLR consumer superzoom camera is capable of.

A Question of Trust

The furor over this moon issue is understandable. Samsung uses lunar imagery to hype its 100x camera mode, and the images are, to an extent, synthesized. But it has really just poked a toe outside the ever-expanding AI Overton window that has directed phone photography innovation for the past decade.

Each of these technical tricks, whether you call them AI or not, was designed to do what would have been impossible with the raw basics of a phone camera. One of the first of these, and arguably the most consequential, was HDR (High Dynamic Range). Apple built HDR into its camera app in iOS 4.1, released in 2010, the year of the iPhone 4.
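HDR’s core move is merging several bracketed exposures so that no single frame’s limits dictate the result. As a generic illustration only (this is not Apple’s implementation, and the filenames are placeholders), OpenCV’s Mertens exposure fusion does it in a few lines:

```python
import cv2

# Three bracketed frames of the same scene; the filenames stand in
# for any under-, normally, and over-exposed set.
frames = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens exposure fusion blends the best-exposed regions of each
# frame directly; unlike classic HDR it needs no exposure metadata
# and no separate tone-mapping pass.
fused = cv2.createMergeMertens().process(frames)

# process() returns float values in roughly [0, 1]; rescale to 8-bit.
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```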
