Smartphone Photography Shootout: Samsung And Apple Head-To-Head

Through the pandemic, I became passionate about photography. Although I’ve technically been doing “professional” product photography for articles featured online and in print for a couple of decades, it was only in the last few years that I stepped up my game. As much as I love my pro camera gear, though, it’s not something I can carry around perpetually. As the old saying goes, “the best camera is the one that’s with you,” and like most of you, my smartphone camera is the one that’s almost always within reach. So, when presented with the opportunity to test a couple of the top smartphones for photography, I jumped at the chance.

The smartphones I’m referencing are the Samsung Galaxy S23 Ultra and the Apple iPhone 14 Pro Max. Both devices are the current flagships for their respective manufacturers, but they couldn’t be more different in terms of their rear “world-facing” camera configurations (both phones have 12MP selfie cameras up front). The iPhone 14 Pro Max features a 48MP main camera with a 24mm, 7-element lens (ƒ/1.78 aperture) and sensor-shift optical image stabilization. That main camera is flanked by a 12MP Ultra Wide shooter with a 13mm, 6-element lens (ƒ/2.2 aperture) and a 120° field of view, and a 12MP Telephoto shooter with a 77mm, 6-element lens (ƒ/2.8 aperture), 3x optical zoom (digital zoom up to 15x), and optical image stabilization.

The Samsung Galaxy S23 Ultra features a bleeding-edge 200MP main camera with a 24mm lens (ƒ/1.7 aperture) and optical image stabilization. Adjacent to the main camera, there’s also a 10MP “periscope telephoto” shooter (ƒ/4.9 aperture) with OIS and 10x optical zoom, a 10MP telephoto camera with a 70mm lens (ƒ/2.4 aperture), OIS, and 3x optical zoom, and a 12MP ultra-wide camera with a 13mm lens (ƒ/2.2 aperture), a 120° field of view, and Dual Pixel PDAF. The ultra-wide shooters on both the Galaxy S23 Ultra and iPhone 14 Pro Max also do double duty as macro shooters.

Those main camera specifications need some additional clarification, however. While many of today’s high-end smartphones boast ultra-high megapixel ratings, they also do something called “pixel binning.” Pixel binning uses multiple pixels on the sensor to gather data that’s grouped, or “binned,” together to effectively act as a single pixel of a certain color. For example, the iPhone’s 48MP main camera actually outputs 12MP images by default, because it uses quad-pixel binning; users must shoot in RAW, which requires additional processing, to capture a full 48MP image. The Galaxy S23 Ultra goes even further: 200MP captures are possible, but by default it bins sixteen pixels into one and outputs 12MP images, with an intermediate 50MP mode also available. Pixel binning is one way to counterbalance the more limited light-gathering capabilities of the small pixels on densely-packed smartphone camera sensors. It results in more accurate color capture and less noise, but produces images at only 25% of the sensor’s maximum resolution, at least when quad binning is used.
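
For readers who like to see the mechanics, here’s a minimal sketch of quad (2x2) pixel binning using NumPy. It’s a simplification: real sensors bin same-color photosites on the Bayer mosaic before demosaicing, but the averaging idea is the same.

```python
import numpy as np

def quad_bin(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of sensor pixels into one output pixel."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    # Group the pixels into 2x2 blocks, then average each block.
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Small stand-in for a sensor readout; on the iPhone, an 8064x6048
# (~48MP) readout would bin down to 4032x3024 (~12MP) this way.
raw = np.random.randint(0, 4096, (6, 8)).astype(float)
print(quad_bin(raw).shape)  # (3, 4) -> 25% of the original pixel count
```

The trade is resolution for signal-to-noise: each output pixel averages four samples, which suppresses random sensor noise.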

High-End Smartphone Camera Systems Require Massive Processing Power

Smartphones designed with high-quality photography in mind typically combine complex image processing techniques, high-quality optics, and a multitude of computational photography technologies to capture and intelligently process as much image data as possible. As such, today’s smartphones require powerful image processors, not only to produce high-quality images, but also to offer a good user experience.

The iPhone 14 Pro Max is powered by Apple’s own A16 Bionic chipset, which features a 16-core Neural Engine capable of 17 TOPS and an advanced Image Signal Processor (ISP). The Samsung Galaxy S23 Ultra is powered by the flagship Snapdragon 8 Gen 2 Mobile Platform, which features Qualcomm Technologies, Inc.’s first-ever Cognitive ISP.

Sensor sizes and megapixel counts may be key differentiators that play a significant role in a smartphone camera system’s quality, but they are less relevant than most people think. Smartphones simply do not have the physical space to accommodate large camera sensors, and a sensor’s pixel density can only be so high before image quality is negatively impacted, particularly in challenging lighting conditions. Although there are other factors to consider, in general, smaller pixels on a camera sensor gather less light than larger pixels (all other things being equal), which can degrade image quality in sub-optimal lighting. Because of this, a smartphone’s image processing capabilities are immensely important.
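
A quick back-of-the-envelope illustration of why pixel size matters (the pixel pitches below are hypothetical round numbers, not the actual specs of either phone):

```python
# Light gathered per pixel scales roughly with pixel area,
# all other things being equal. Hypothetical pixel pitches:
small_pitch_um = 0.6   # a densely packed, high-megapixel sensor
binned_pitch_um = 1.2  # effective pitch after 2x2 binning

ratio = (binned_pitch_um / small_pitch_um) ** 2
print(f"~{ratio:.0f}x the light per (effective) pixel")  # ~4x
```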

There isn’t much data available on Apple’s proprietary ISP. The company consistently enhances the capabilities of its “Axx” mobile processors with each generation, but specifics are scarce. Qualcomm has also enhanced its Snapdragon platforms over the years with leading image processing and computational photography features. Although Qualcomm’s partners, like Samsung, don’t typically enable every photography- or videography-related feature available in the latest Snapdragon mobile platforms, those that place a strong emphasis on photography often produce devices capable of capturing excellent photos and videos. In fact, according to DxOMark, many of the best smartphone cameras are powered by Snapdragon.

The Spectra Image Signal Processor featured in the Snapdragon 8 Gen 2 has triple Cognitive ISPs capable of processing 3.2 gigapixels per second. The trio of ISPs gives devices like the Galaxy S23 Ultra the ability to leverage up to three independent cameras simultaneously, and enables features like triple simultaneous image capture, seamless zooming between cameras, multi-capture HDR composites, and up to 200-megapixel photo capture, among other things. The first-of-its-kind Cognitive ISP in the Snapdragon 8 Gen 2, in conjunction with the chip’s Hexagon AI engine, also enables real-time “Semantic Segmentation,” which is enabled on the Galaxy S23’s selfie camera.
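
To put the 3.2-gigapixel-per-second figure in perspective, a bit of simple arithmetic (the throughput number is Qualcomm’s spec; the framing is my own):

```python
isp_throughput = 3.2e9  # pixels per second, Snapdragon 8 Gen 2 spec
full_res_photo = 200e6  # one 200MP Galaxy S23 Ultra capture

# Ignoring all other overhead, the ISP can chew through the pixel
# data of a full-resolution 200MP capture in a fraction of a second.
print(full_res_photo / isp_throughput)  # 0.0625 s per capture
print(isp_throughput / full_res_photo)  # ~16 captures' worth per second
```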

Semantic Segmentation leverages a neural network trained to identify many different objects – people, faces, skies, grass, etc. – so each of those objects can be segmented and custom processing applied to it to enhance a captured image or video. If you’re familiar with image editing applications, you can think of the segments as layers, and each layer can be edited independently to produce a desired effect – bluer skies, smoother skin, greener grass – you get the idea. And since the segments can be altered independently, each segment can be properly exposed, saturated, or sharpened without diminishing the quality of the others. As software support for Semantic Segmentation matures, it has the potential to be a real game changer.
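
Here’s a minimal sketch of that layers idea in NumPy, assuming a segmentation network has already produced per-class masks. The class names and per-segment adjustments are illustrative stand-ins, not Qualcomm’s actual pipeline:

```python
import numpy as np

def adjust(pixels: np.ndarray, gain: float) -> np.ndarray:
    """Toy per-segment adjustment: scale pixel values and clamp to [0, 1]."""
    return np.clip(pixels * gain, 0.0, 1.0)

# A tiny RGB image and two boolean masks a segmentation network
# might produce (hypothetical classes: "sky" and "skin").
image = np.random.rand(4, 4, 3)
masks = {"sky": np.zeros((4, 4), bool), "skin": np.zeros((4, 4), bool)}
masks["sky"][:2, :] = True   # top half of the frame is sky
masks["skin"][2:, :] = True  # bottom half is skin

# Per-class gains: deepen the sky, brighten skin slightly.
gains = {"sky": 1.2, "skin": 1.05}

# Each segment is processed independently, like layers in an editor,
# so boosting the sky never touches the skin pixels.
result = image.copy()
for name, mask in masks.items():
    result[mask] = adjust(image[mask], gains[name])
```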

Both the iPhone 14 Pro Max and the Galaxy S23 Ultra use an array of computational photography techniques to produce images. Common computational photography features, like Night Mode, Portrait Mode, and HDR, are available on both platforms. The Galaxy S23 Ultra also offers a feature called AI Zoom, which lets users seamlessly zoom from the widest angle all the way to 100x, spanning the phone’s different rear cameras, by simply pinching to zoom or using an on-screen slider. There’s no need to manually switch between the ultra-wide, standard, or telephoto cameras, or to figure out which one best suits a particular shot.
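
A rough sketch of how that kind of seamless zoom could map a requested zoom factor to a physical camera. The breakpoints follow the S23 Ultra’s 0.6x ultra-wide, 1x main, 3x telephoto, and 10x periscope cameras, but the selection logic is an assumption about how such a feature might work, not Samsung’s actual implementation:

```python
# (native_zoom, name) for each rear camera, ordered by reach.
CAMERAS = [(0.6, "ultra-wide"), (1.0, "main"),
           (3.0, "telephoto"), (10.0, "periscope telephoto")]

def pick_camera(zoom: float) -> tuple[str, float]:
    """Pick the longest lens whose native zoom doesn't exceed the request.

    Anything beyond a camera's native zoom is made up with digital
    crop/upscale, so the longest eligible lens preserves the most detail.
    """
    name, native = CAMERAS[0][1], CAMERAS[0][0]
    for nat, cam in CAMERAS:
        if zoom >= nat:
            name, native = cam, nat
    return name, zoom / native  # chosen camera and its digital zoom factor

for z in (0.6, 1.0, 2.0, 3.0, 10.0, 100.0):
    cam, crop = pick_camera(z)
    print(f"{z:>5.1f}x -> {cam} (digital {crop:.1f}x)")
```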

The Mobile Photography Experience

Although I’m focusing on photography here, I should also note that both the iPhone 14 Pro Max and the Galaxy S23 Ultra are high-performance devices that offer quality user experiences in general. Both phones are fast and have a myriad of photo- and video-related features. Specifically as it relates to photos and videos, autofocus is quick and accurate on both devices, they capture realistic tones and textures, and noise performance is also very good. For general photography, I think anyone coming from a previous-gen device is likely to be quite pleased. But there are some major advantages to the Qualcomm-powered Samsung Galaxy S23 Ultra worth pointing out.

The Galaxy S23 Ultra’s camera application offers a number of customizations that give users access to some of the device’s special photo features, like Hyperlapse, Single Take, and Expert RAW, to name but a few. Users of any skill level, however, should be able to navigate its menu system and manipulate camera settings or quickly switch between modes. Although there are many options available to tune image quality to a user’s liking, auto mode takes phenomenal shots. The Galaxy S23 Ultra is capable of capturing excellent detail; perhaps one small criticism is that it can sometimes oversaturate images and apply a bit too much sharpening.

Without the right processing, wide dynamic range lighting situations can be challenging for the relatively small sensors in a smartphone, but even with strong backlighting and the sun shining brightly, both the iPhone 14 Pro Max’s and Galaxy S23 Ultra’s cameras performed well. Colors and tones across the scene are well balanced, with crisp detail, where lesser cameras that can’t handle this kind of lighting would render some objects as silhouettes. That said, the Galaxy S23 Ultra handled the exposure better and was able to retain more blue in the sky.

The higher-resolution 200MP sensor in the Galaxy S23 Ultra also captures more detail. When images are scaled down and posted to social media with minimal editing, the differences may be very difficult to suss out. But the massive resolution offered by the Galaxy S23 Ultra gives users much more flexibility to crop without sacrificing as much detail in the final image. Zoomed crops of the images captured on both phones show just how much additional resolution is available on the Samsung device.

A 200MP capture on the Galaxy S23 Ultra is 16320×12240 pixels, while a 48MP capture on the iPhone 14 Pro Max is 8064×6048 pixels. You could crop 50% from each dimension of the Galaxy S23 Ultra’s image and still have roughly as many pixels to work with as a full-resolution capture from the iPhone.
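
The arithmetic is easy to verify:

```python
s23_ultra = (16320, 12240)  # ~200MP Galaxy S23 Ultra capture
iphone = (8064, 6048)       # ~48MP iPhone 14 Pro Max capture

print(s23_ultra[0] * s23_ultra[1] / 1e6)  # 199.76 -> ~200MP
print(iphone[0] * iphone[1] / 1e6)        # 48.77  -> ~48MP

# Cropping 50% from each dimension of the Samsung capture still
# leaves roughly as many pixels as a full-res iPhone image.
print((s23_ultra[0] // 2) * (s23_ultra[1] // 2) / 1e6)  # 49.94 -> ~50MP
```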

Both phones handled Portrait mode well, though they offer different fields of view in this mode. The simulated background bokeh is smooth on both, with surprisingly realistic-looking transitions from in-focus to out-of-focus areas. Portrait Mode can often introduce unwanted artifacts in the fine details around a subject’s edges, but both phones do a pretty good job here. The Galaxy S23 Ultra does a slightly better job, though, with smoother transitions and more detail in the subject.

The Galaxy S23 Ultra also stands out with close-up Macro photography. Even when its ultra-wide camera is nearly touching the subject, it is able to focus properly and capture fine detail.

The iPhone 14 Pro Max does a great job too if you don’t pixel peep, but zooming in reveals more fine detail in the Samsung capture – you can actually make out individual bits of pollen, whereas the iPhone’s processing makes the interior of the flower look like it has a rough, bumpy texture.

In terms of video recording, both devices are relatively full-featured, but the edge yet again goes to the Galaxy S23 Ultra. The iPhone 14 Pro Max tops out at 4K60 recording, while the Galaxy S23 Ultra can do 8K30. Slow motion, normal, and time-lapse recording are available on both devices as well. In practice, I found the video quality on both devices to be quite good, though the Galaxy S23 Ultra seemed to handle sub-optimal, low-light conditions somewhat better.
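
In raw pixel-throughput terms, 8K30 works out to roughly double 4K60:

```python
fourk_60 = 3840 * 2160 * 60   # iPhone 14 Pro Max: 4K UHD at 60fps
eightk_30 = 7680 * 4320 * 30  # Galaxy S23 Ultra: 8K UHD at 30fps

print(fourk_60 / 1e6)   # ~497.7 megapixels per second
print(eightk_30 / 1e6)  # ~995.3 megapixels per second, ~2x
```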

Top camera performance continues to be paramount for flagship smartphone buyers, and whether you’re evaluating specifications or real-world results, it’s clear Qualcomm-powered Android devices like the Samsung Galaxy S23 Ultra are a step ahead of the competition. Independent testing outfits like DxOMark show that many of the top-ranked smartphone cameras are powered by Qualcomm processors.
