Samsung’s Galaxy A54 Already Feels Outdated Now That the Pixel 7a Is Out

It’s a bit difficult to justify buying Samsung’s mid-tier Galaxy A54 when the Google Pixel 7a comes with the same processor as its flagship counterpart and has already proven it’s a better camera phone. But the Galaxy A54 starts at $450 compared to the 7a’s $500, boasting some of its flagship sibling’s marquee features at a slightly lower price point than Google’s budget option. Those features include a high-resolution display, excellent battery life, and a design that I felt was more comfortable than the Pixel’s sharp corners and stark camera bar. The sacrifices are Samsung flagship-level camera quality, wireless charging, and higher storage options.

Samsung’s Galaxy A54 doesn’t look like a budget phone

The trick to selling a mid-range device is to make sure it doesn’t look cheap. That’s what Google did with its A-series and what Samsung has decided to do going forward. Last year’s Galaxy A53 was dull if you were buying it in black. But at least this year’s Galaxy A54 appears styled after its flagship counterpart.

The Galaxy A54 is styled after its flagship counterparts.
Photo: Florence Ion / Gizmodo

The phone looks less “mid-range” than last year’s A53, even in black. Still, I love the periwinkle blue colorway on our review unit (it appears purple in Samsung’s official renders). It looks almost pearlescent in direct light, and it offers a chic aura that doesn’t immediately make you think this is a sub-$500 device. The A54’s build is durable, with tough glass on either side of the chassis, plus an IP67 water and dust resistance rating. There’s still some plastic on the outside frame, but you can barely tell at first glance.

The Galaxy A54 (left) versus the Galaxy A53 (right).
Photo: Florence Ion / Gizmodo

The Galaxy A54’s 6.4-inch display is bold and beautiful. It’s bigger than the 6.1-inch screen you get on the Pixel 7a and way more saturated, but it’s that high-resolution Samsung AMOLED shining through with its smooth-scrolling 120Hz refresh rate that does it for me. The Pixel 7a display’s color temperature veers warmer, whereas Samsung’s displays are bluer, so video and graphics appear more vibrant. The maximum brightness for the A54 is 1,000 nits, and I found it just a smidge easier to read in direct sunlight than the Pixel 7a.

The frame is plastic, but you can’t really tell.
Photo: Florence Ion / Gizmodo

Samsung’s Exynos vs. Google’s Tensor

Samsung’s Galaxy A54 runs on Samsung-made hardware, but it’s not exactly in the same vein as Google’s homemade Tensor G2, which powers the Pixel 7a and its flagship siblings. The Exynos 1380 processor is Samsung’s latest mid-range, 8-core chipset. It benchmarks better than last year’s Galaxy A53 but remains hampered by 6GB of RAM. When jumping between apps, you can feel the A54’s relative choppiness. It’s not often, but it’s occasional enough that any sluggishness is more noticeable than on the Pixel 7a (although that phone’s 60Hz default refresh rate presents its own issues when it comes to smooth scrolling).

In battery tests, the Galaxy A54 outlasted its flagship brethren and the Google Pixel 7a. Its 5,000 mAh battery is larger than the 7a’s 4,385 mAh. It lasted about 18 hours and 48 minutes in our battery rundown test, compared to the Pixel 7a’s 16 hours and 11 minutes. It’s good to know Samsung’s mid-range can last a long time, especially since it’s pushing out a 120Hz refresh rate.

The three-camera system could use a telephoto.
Photo: Florence Ion / Gizmodo

I’m bummed there isn’t wireless charging on the Galaxy A54, even if it is a sub-$500 phone. I have managed to build out a network of chargers around my home, where I also work, and it’s imperative that I can plop down a device to charge it. I can do that with Google’s Pixel 7a but not with the Galaxy A54.

Not the best phone camera for night shots

Two test shots with the Galaxy A54 during mid-day.
Image: Florence Ion / Gizmodo

The Galaxy A54’s three-lens camera system includes a 50-MP primary camera with OIS and an f/1.8 aperture, a 12-MP ultrawide, and a 5-MP macro that seems unnecessary. I never thought to use the macro during my testing period and would have preferred a telephoto lens in that third slot for a little more optical zoom. The front-facing camera is 32-MP.

The Galaxy A54’s nighttime landscape shot.
Photo: Florence Ion / Gizmodo

Overall, the Galaxy A54’s camera system gets the job done if you’re outside and the weather is bright. Blues and reds are “punched up” in every photo, and you can use the on-screen camera controls to adjust the scene during golden hour. Just make sure you set the focus point in the right spot, or it could cost you a well-balanced photo. The maximum video recording resolution on the Galaxy A54 is 4K at 30 frames per second, which is smooth enough. It kept up with my kid panning across the rows at a colorful car show, and the microphones were solid at mitigating wind noise.

The Galaxy A54 will not get you astrophotography like a Google Pixel. Even with its dedicated night mode, it struggled to capture enough light, and when it did produce an image in the dark, you can see the pixelation, evidence of an algorithmic struggle. That’s not as apparent in the Night Sight photos shot with the Pixel 7a. If night skies and poorly lit faces are what you find yourself snapping most, the Galaxy A54 shouldn’t be your first pick.

Samsung’s version of Android is still good

Samsung’s Galaxy A54 runs Android 13 with OneUI 5.1. This gets you access to Samsung-exclusive features like Bixby Modes, widget stacks, and an improved screenshot manager. The company has promised four years of software updates for the Galaxy A54 and five years of security updates. That’s better than on the Pixel A-series, which gets four years of updates plus a sprinkling of quarterly Pixel-exclusive feature drops (though, like game loot, you never know what you’ll get and when).

Samsung adopted Android 13’s uniform icons.
Photo: Florence Ion / Gizmodo

I appreciate that Samsung’s take on Android 13 has adopted unified icons. They make the interface look less cartoony than Samsung’s colorful stock icons did. There’s also an option to dim the wallpaper when you’re in dark mode, which I like for my sensitive eyes.

The Galaxy A54 needs one more flagship feature

Samsung’s Galaxy A54 is $450 if you buy it unlocked. But if you need mmWave connectivity to access the 5G bandwidth you’re paying for, it’s $50 more; that’s the variant I’d have to buy if I wanted to use this device on Verizon.

If you don’t mind the nighttime camera performance, Samsung’s Galaxy A54 offers real bang for your buck.
Photo: Florence Ion / Gizmodo

The Galaxy A54 needs one or two more flagship-like features to justify its cost relative to the Pixel 7a. It’d be nice to see Samsung add wireless charging or higher storage space tiers for folks who want a mid-ranger that will last them through all five years of its security updates. Concessions like that may not immediately translate to more units sold, but Samsung could establish itself as a mid-range manufacturer promoting longevity. That’s a point where it still has a leg up over the Pixel on the software side, so we’d love to see the hardware match.

Samsung’s best phone just got the Galaxy S23 Ultra’s best feature for free

Samsung has today given the Galaxy Z Fold 4 a hot new photography feature that will make it even better at taking photos of the night sky.

The feature, called astrophotography mode, comes from the very top of Samsung’s phone lineup in the shape of the Galaxy S23 series of phones. It allows devices to take detailed photos of the night sky in a way that isn’t normally possible, all while creating some truly impressive images. Now, that same feature is available to Galaxy Z Fold 4 owners.

Samsung Galaxy Z Fold 4


Samsung had previously said that it intended to roll the feature out to people who owned devices other than the Galaxy S23 and Galaxy S22, and now it’s made good on that promise. The Galaxy Z Fold 4 isn’t the only model in on the act, either: the Galaxy S20 series, Galaxy Note 20, and Galaxy Note 20 Ultra can all also enjoy the new astrophotography capabilities. Every Galaxy Z Fold except the very first will get the same feature, as will the Galaxy S21, which means there are plenty of phones out there that can now take some stunning photos of the moon.

The astrophotography mode is made available thanks to an updated Expert RAW app that can now be downloaded from the Galaxy Store. The app can also note your location and then give you tips on where to look for specific constellations. What’s more, it can take long-exposure shots to help you capture as much light as possible for even better images. Those exposures can run up to 10 minutes, something helped by the fact that the Galaxy Z Fold 4 can prop itself up, removing the need for a dedicated tripod.

Those who have a Galaxy Z Fold 4 can now download the updated Expert RAW app, and we’d definitely suggest doing so before it next gets dark. Once the app is downloaded and installed, taking shots in astrophotography mode is a simple case of tapping the final button in the top-right corner of the viewfinder.

This new update comes as Samsung continues to work on getting the Galaxy Z Fold 5 ready, with the phone likely to be announced alongside its flip-based cousin as soon as July 2023. Samsung’s phones are far from the only foldables on the way, with Google expected to announce the Pixel Fold on May 10 during its annual developer conference.

San Francisco Proposal Photographer Offers New SF Locations And Photography Packages

(MENAFN- EIN Presswire)

New photography packages for proposals now available at San Francisco Proposal Photographer, providing beautiful, timeless photos to cherish for a lifetime.

SAN FRANCISCO, CA, UNITED STATES, April 30, 2023 — San Francisco Proposal Photographer is excited to announce the launch of their new proposal photography packages and locations, which aim to capture life’s intimate moments with skilled discretion and artistry. The studio has added several enchanting locations for proposals in San Francisco to its list, including Golden Gate Bridge, Baker Beach, Russian Hill, Ina Coolbrith Park, Sutro Baths, Fairmont Rooftop, Japanese Tea Gardens, Lovers Lane, Pier 14, and Muir Woods.

The studio provides four different proposal packages, each of which is designed to capture the unique essence of the moment and offer beautiful and timeless photos to cherish for a lifetime. Package A includes a smartphone shoot with all photos delivered to the client, while Package B provides professional camera coverage of the proposal. Package C includes everything in Package B along with a 10-15 minute photo session, and Package D offers a 45-minute photo shoot.

“We are thrilled to be able to offer our clients the opportunity to capture their special moment in some of the most beautiful locations in San Francisco,” said the owner of the studio. “We strive to provide our clients with the highest quality proposal photography services, and our new packages and locations are sure to make their proposal even more memorable.”

San Francisco Proposal Photographer understands the significance of a proposal and strives to provide couples with the perfect setting and the ideal photos to commemorate their special day. In addition, the studio encourages clients to come up with their own surprise proposal ideas and unique ways to make their special moment even more memorable.

A recent customer commented: “It was a pleasure working with this team. I organized a proposal shoot with them and they were discreet, punctual, and enthusiastic. The pictures came out great and my fiancé had no idea it was coming. Highly recommend them!”

For more information about the studio’s photography packages and locations, visit the San Francisco Proposal Photographer website.


San Francisco Proposal Photographer
2150 Post St, San Francisco, CA 94115
(415) 812-5014

Eric Smith
San Francisco Proposal Photographer
+1 415-812-5014



Future Apple Watch could get cameras for photography & Face ID

Apple’s proposal would be less bulky than the existing third-party Wristcam

AppleInsider may earn an affiliate commission on purchases made through links on our site.

Apple has big plans for cameras in future Apple Watches, if they can be fitted without making the watch awkward to wear — and if the cameras can be of high enough quality.

The popularity of the Apple Watch Ultra has shown that people are willing to wear bulkier devices if there is a clear benefit to them. In the case of the Apple Watch Ultra, that benefit includes a greatly extended battery life, for instance.

Future Apple Watches may also become at least a little larger, as Apple is again looking at ways to incorporate a camera. Previously, Apple has had a patent granted to include a camera in the Apple Watch’s Digital Crown, but that’s got to have very limited use.

Now, in a newly revealed patent application, Apple is proposing that a Watch could fit a camera into a slight protrusion toward the top of its chassis, above the display. The application, “Wearable Electronic Device Having a Digital Camera Assembly,” describes a design similar to the existing Wristcam in terms of where the camera is positioned, but that product is a whole Watch band.

Apple isn’t keen on taking up a whole band, or anything that makes the Apple Watch cumbersome.

“While certain electrical components, such as a camera, may perform desirable functions,” says the patent application, “the integration of such components may result in a bulky device which may hinder user performance, may be uncomfortable to wear, or may be unsuited for performing certain functions (e.g., a camera disposed within a wearable electronic device may be awkward to position when capturing optical input).”

Apple is also dismissive of compromises to fit the camera into a Watch band. “Additionally, low-quality components may not meet a user’s quality expectations,” it says, “(e.g., a low-quality camera may produce low-quality images).”

It’s not just that Apple wants you to look nice in photographs. Apple specifically wants a video camera capable of up to 4K at 60 frames per second, or a still camera up to 12MP.

Either a spacecraft, or an Apple Watch side view with a camera protrusion to the right

That’s because this is for more than photographing wildlife, more than capturing the crowd at Little League. This camera is meant for more than any regular photography or video.

“The digital camera assembly may be used for a variety of purposes,” continues Apple, “including, as non-limiting examples, facial identification, fingerprint sensing, scanning a Quick Response (QR) code, video conferencing, biometric monitoring (e.g., heart rate monitoring), photography, video or image capture, or any combination thereof.”

So with a camera on your Apple Watch, you could unlock all of your Apple devices through Face ID.

The Watch could also use its camera to “capture movement of a user’s body or other objects during certain activities.” Using visual inertial odometry (VIO), “the camera can be used to obtain a high degree of motion sensing accuracy, which may be used to monitor, detect, and/or predict a user’s motion or gesture based on certain characteristics.”

That’s a lot to demand of a camera, and Apple is not expecting to be able to fit one under the screen of an Apple Watch. Instead, it will sit on what Apple calls a protrusion, and much of the patent application is about how to do that without making the Watch distracting to wear.

“[A] digital camera assembly may be integrated into the wearable electronic device in a way so as to minimize an effect of the digital camera assembly on other electronic components and/or a form factor of the wearable electronic device,” says Apple.

“For example, in implementations where a digital camera assembly is positioned within an internal cavity (e.g., camera cavity) of a protrusion,” it continues, “the digital camera assembly may extend from the housing, over a band slot, and away from a display, a battery, a circuit assembly, or sensors of the wearable electronic device.”

“Likewise, the protrusion may be shaped to avoid interfering with geometry of the band slot,” says the patent application, “so that a band/strap may still be permitted to couple with the housing of the wearable electronic device.”

The patent application is credited to five inventors, including Christopher M. Warner, whose previous work includes muscle-sensing Apple Watch bands.

iPhone vs Android: Two different photography and machine learning approaches

Apple’s computational photography aims for realism

AppleInsider may earn an affiliate commission on purchases made through links on our site.

A controversy with Samsung’s phone cameras has renewed the conversation surrounding computational photography, and highlights the difference between it and Apple’s approach in iOS.

It isn’t a big secret that Apple relies upon advanced algorithms and computational photography for nearly all of its iPhone camera features. However, users are beginning to ask where to draw the line between these algorithms and something more intrusive, like post-capture pixel alteration.

In this piece, we will examine the controversy surrounding Samsung’s moon photos, how the company addresses computational photography, and what this means for Apple and its competitors going forward.

Computational photography

Computational photography isn’t a new concept. It became necessary as people wanted more performance from their tiny smartphone cameras.

The basic idea is that computers can perform billions of operations in a moment, like after a camera shutter press, to replace the need for basic edits or apply more advanced corrections. The more we can program the computer to do after the shutter press, the better the photo can be.

This started with Apple’s dual camera system on iPhone 7. Other photographic innovations before then, like Live Photos, could be considered computational photography, but Portrait Mode was the turning point for Apple.

Apple introduced Portrait Mode in 2016, which took depth data from the two cameras on the iPhone 7 Plus to create an artificial bokeh. The company claimed it was possible thanks to the dual camera system and advanced image signal processor, which conducted 100 billion operations per photo.

Needless to say, this wasn’t perfect, but it was a step into the future of photography. Camera technology would continue to adapt to the smartphone form factor, chips would get faster, and image sensors would get more powerful per square inch.

Portrait mode uses computational photography to separate the foreground

In 2023, it isn’t unheard of to shoot cinematically blurred video using advanced computation engines with mixed results. Computational photography is everywhere, from the Photonic Engine to Photographic Styles — an algorithm processes every photo taken on iPhone. Yes, even ProRAW.

This was all necessitated by people’s desire to capture their life with the device they had on hand — their iPhone. Dedicated cameras have physics on their side with large sensors and giant lenses, but the average person doesn’t want to spend hundreds or thousands of dollars on a dedicated rig.

So, computational photography has stepped in to enhance what smartphones’ tiny sensors can do. Advanced algorithms built on large databases inform the image signal processor how to capture the ideal image, process noise, and expose a subject.

However, there is a big difference between using computational photography to enhance the camera’s capabilities and altering an image based on data that the sensor never captured.

Samsung’s moonshot

To be clear: Apple is using machine learning models (or “AI,” for those using the popular new buzzword) for computational photography. The algorithms provide information about controlling multi-image captures to produce the best results or create depth-of-field profiles.

The image processor analyzes skin tone, skies, plants, pets, and more to provide proper coloration and exposure, not pixel replacement. It isn’t looking for objects, like the moon, to provide specific enhancements based on information outside of the camera sensor.

We’re pointing this out because those debating Samsung’s moon photos have used Apple’s computational photography as an example of how other companies perform these photographic alterations. That simply isn’t the case.

Samsung’s moon algorithm in action. Credit: u/ibreakphotos on Reddit

Samsung has documented how Samsung phones, since the Galaxy S10, have processed images using object recognition and alteration. The Scene Optimizer began recognizing the moon with the Galaxy S21.

As the recently-published document describes, “AI” recognizes the moon through learned data, and the detail improvement engine function is applied to make the photo clearer with multi-frame synthesis and machine learning.

Basically, Samsung devices will recognize an unobscured moon and then use other high-resolution images and data about the moon to synthesize a better output. The result isn’t an image captured by the device’s camera but something new and fabricated.

Overall, this system is clever because the moon looks the same no matter where on Earth you view it from. The only things that change are the color of the light reflected from its surface and the phase of the moon itself. Enhancing the moon in a photo will always be a straightforward calculation.

Both Samsung and Apple devices take a multi-photo exposure for advanced computations. Both analyze multiple captured images for the best portion of each and fuse them into one superior image. However, Samsung adds an additional step for recognized objects like the moon, which introduces new data from other high-resolution moon images to correct the moon in the final captured image.

Samsung’s moon algorithm explained. Credit: Samsung

This isn’t necessarily a bad thing. It’s just something Samsung hasn’t made clear in its advertising or product marketing, which may lead to customer confusion.

The problem with this process, and the reason a debate exists, is how this affects the future of photography.

Long story short, the final image doesn’t represent what the sensor detected and the algorithm processed. It represents an idealized version of what might be possible but isn’t because the camera sensor and lens are too small.

The impending battle for realism

From our point of view, the key tenet of iPhone photography has always been realism and accuracy. If there is a perfect middle in saturation, sharpness, and exposure, Apple has trended close to center over the past decade, even if it hasn’t always remained perfectly consistent.

We acknowledge that photography is incredibly subjective, but it seems that Android photography, namely Samsung, has leaned away from realism. Again, not necessarily a negative, but an opinionated choice made by Samsung that customers have to address.

For the purposes of this discussion, Samsung and Pixel devices have slowly tilted away from that ideal, realistic, representational center. They are vying for more saturation, sharpness, or day-like exposure at night.

The example above shows how the Galaxy S22 Ultra favored more exposure and saturation, which led to a loss of detail. Innocent and opinionated choices, but the iPhone 13 Pro, in this case, goes home with a more detailed photo that can be edited later.

This difference in how photos are captured is set in the opinionated algorithms used by each device. As these algorithms advance, future photography decisions could lead to more opinionated choices that cannot be reversed later.

For example, by changing how the moon appears using advanced algorithms without alerting the user, that image is forever altered to fit what Samsung thinks is ideal. Sure, if users know to turn the feature off, they could, but they likely won’t.

We’re excited about the future of photography, but as photography enthusiasts, we hope it isn’t so invisible. Like Apple’s Portrait Mode, Live Photos, and other processing techniques — make it opt-in with obvious toggles. Also, make it reversible.

Tapping the shutter in a device’s main camera app should take a representative photo of what the sensor sees. If the user wants more, let them choose to add it via toggles before or editing after.

For now, try taking photos of the night sky with nothing but your iPhone and a tripod. It works.

Why this matters

It is important to stress that there isn’t any problem with replacing the ugly glowing ball in the sky with a proper moon, nor is there a problem with removing people or garbage (or garbage people) from a photo. However, it needs to be a controllable, toggle-able, and visible process to the user.

Computational photography is the future, for better or worse

As algorithms advance, we will see more idealized and processed images from Android smartphones. The worst offenders will outright remove or replace objects without notice.

Apple will inevitably improve its on-device image processing and algorithms. But, based on how the company has approached photography so far, we expect it will do so with respect to the user’s desire for realism.

Tribalism in the tech community has always caused debates to break out among users. Those have included Mac or PC, iPhone or Android, and soon, real or ideal photos.

We hope Apple continues to choose realism and user control over photos going forward. Giving a company complete opinionated control over what the user captures in a camera, down to altering images to match an ideal, doesn’t seem like a future we want to be a part of.

Samsung Galaxy S23 Ultra review: All about improvements

Samsung’s premium flagship smartphone, the S23 Ultra, is its most powerful offering, with substantial and practical improvements. Here’s a lowdown on what works and what doesn’t.

Big Plus
The smartphone’s familiar industrial design with sharp edges and flat rails makes it comfortable to hold despite a massive 6.8-inch Edge Dynamic AMOLED 2X QHD+ display. It also has a solid build quality.

Best Point
Its design may be similar to previous iterations, but the Qualcomm Snapdragon 8 Gen 2 chipset is powerful and power-efficient. It handles everything you throw at it. The new 200-megapixel sensor is another major improvement. Samsung has also made some changes to the camera’s entire processing pipeline. As a result, it captures more light, provides better details, offers rich portraits, and is impressive for low-light photographs.

Standout Feature
The S Pen is one feature that stands out and is the key differentiator of the S23 Ultra. Besides, its new Astrophotography mode and 50-megapixel expert RAW features are other worthy improvements.

The X-factor
The Galaxy S23 Ultra is equipped with two dedicated telephoto lenses, offering 3x and 10x optical zoom. The zoom quality is unparalleled and produces sharp, colour-rich output.

What Could Have Been Better
It only offers Wi-Fi 6E, while even other relatively affordable smartphones offer Wi-Fi 7 support. It could also have been priced lower.

Should I Go For It?
Definitely, if you are a power user looking for all the bells and whistles in a smartphone. Starting at Rs 1,24,999, the Galaxy S23 Ultra offers an impressive display, camera, performance, and battery. It is a no-brainer for anyone looking for a premium Android phone.

The Galaxy S23 Ultra Is Great, but Would Be Better if It Folded

I struggle to look at any other Samsung smartphone now that I’ve been living with its foldables. The Samsung Galaxy Z Fold 4 has effectively changed how I use Android. Most of the time, I’ll only bother with my Google Pixel 7 if someone is calling the number linked to that phone. Otherwise, you’ll see me primarily on the foldable. It’s just so much more versatile for the life I lead.

That’s not to say I didn’t enjoy my time with the Samsung Galaxy S23 Ultra, but I missed the Fold while reviewing this one. Samsung’s ultimate new flagship device is everything you could want in a smartphone, but there is also a lot here that feels like overkill now that we’re in the second iteration of the Ultra and its stylus-wielding ways. In fact, I forgot to use the stylus until about two days ago (I don’t draw). And while four cameras are a great back-of-the-box brag, I still don’t understand how to push them to the extent they’ve been marketed as capable of, and I realize I probably never will. And I like high-spec phones!

Regardless, the Ultra still has plenty going for it, including a better design than the last generation. Those rear-facing cameras may not be enough to justify the price to casual users, but their post-processing algorithms are just as good as Google’s—better in some cases. The Ultra even has a few features I think foldables are still missing—like that stowable stylus.

But when it comes to targeting genuine innovation as opposed to niche specialty features, the Ultra might miss the mark compared to both the competition and Samsung’s other phones.

The best Ultra yet

If you like big phones, you’ll love the Galaxy S23 Ultra (I don’t—it’s not foldable). It has a 6.8-inch Dynamic AMOLED display, categorized as such because it’s based on tech that allows the display to dynamically change refresh rates without killing the battery. The jury is still out on how much battery that display tech saves, and I’ll get more into that when we talk about the battery rundown results later. Still, the display that Samsung has going here is like carrying a tiny version of its TVs in your pocket.

You might have gotten into the Galaxy line because you love Samsung’s displays. I can’t blame you. Like on the S22 Ultra, the screen on the S23 Ultra is a 1440p resolution with a 120Hz refresh rate. I love watching TV on this thing, even the 720p classics like Taxi and One Day at a Time. What I especially appreciate about Samsung is how low the brightness can go so that I can fall asleep to those shows at the end of the night without lighting up the room. Samsung enables the use of Android 12’s extra dim mode, and with that turned on, the phone doesn’t go any higher than about 350 nits—the standard rate is around 430 nits, or a whopping 1,750 nits if you’re out in direct sunlight and using the adaptive brightness feature.

The Galaxy S23 Ultra houses an S Pen inside its chassis.
Photo: Florence Ion / Gizmodo

The best part about the new Galaxy S23 Ultra is that Samsung fixed some of what I didn’t like with the Galaxy S22 Ultra’s design. Mainly, it squared off the edges instead of rounding them, so it’s easy to cradle the phone one-handed. I finally felt confident that I wasn’t going to drop it. I’m glad Samsung stopped with the overtly rounded edges, which are also annoying to use when you’re tapping on the edge of the screen.

This is still a gigantic smartphone. I hope you have big hands if you plan to play games on this thing. My small hands and long claws had difficulty cradling the Ultra to play with on-screen controls in games like Dreamlight Valley through Xbox Game Pass, and my wrists got weary holding the phone to control my character in Riptide GP: Renegade. The first-gen Razer Kishi controller that I use for Android gaming also feels as if it’s stretched to capacity on this phone, as if the Galaxy S23 Ultra will pop out at any minute. Unless it’s a point-and-tap game, I use a Bluetooth controller to play games on the S23 Ultra. The OnePlus 11’s similarly sizeable 6.7-inch display, comparatively, feels less ginormous because it doesn’t have the Ultra’s squared-off corners and the chassis is narrower.

The Galaxy S23 Ultra is a big phone, make no mistake.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra uses an in-display fingerprint sensor and face unlock for lock screen security. It’s best that Samsung didn’t carry over the power-button fingerprint sensor from the Z Fold 4, because I’m constantly pressing that one by accident and locking myself out. Scanning a fingerprint or smiling at the Ultra felt fast and responsive, unless I wore a mask or sunglasses.

The default storage space on the S23 Ultra has thankfully been bumped up to 256GB. It starts there and goes all the way up to 1TB, if you can stomach paying for it (doing so adds $420 on top of the base model’s cost). The Ultra is also IP68 rated for water and dust resistance.

Qualcomm with Samsung flavoring

Something to note about this year’s Galaxy S23 lineup is that it runs a unique flavor of the Qualcomm Snapdragon 8 Gen 2 processor. Rather than using the chip as it comes out of the box, Samsung infused some of its AI smarts to tune camera and performance algorithms to its liking. The company already does this to some effect with its Exynos chips overseas, and it’s bringing that expertise to the phones sold in the States to one-up Google’s homemade Tensor processor. Sometimes it works.

If you like to game, the S23 Ultra can serve. But its big size may not be easy for everyone to cradle.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra is available with 8GB or 12GB of RAM, which seems absurd. The Ultra should have 12GB of memory as standard, since it’s technically the ultimate Samsung phone. Even with the 12GB of RAM, you can’t tell that the chip inside the Galaxy S23 Ultra is any beefier than what’s inside the similarly specced OnePlus 11. On paper, and in Geekbench 5 (which will be Geekbench 6 in our reviews going forward), the Galaxy S23 Ultra performed better than the OnePlus 11 by only about 300 points on the single-core score and 400 points on the multi-core one. But that proves little about whether Samsung’s tuned chip is faster or more capable than OnePlus’s vanilla one in actual use. Considering the Google Pixel 7 Pro is a laughingstock on the benchmark charts but not in real-world use—it scores about 400 points lower than the Galaxy S23 Ultra—it’s hard to use these benchmarks as the sole test of what’s possible. Anyway, neither of these Android devices can hold a candle to the numbers that Apple’s A16 Bionic spits out.

The upside to having such a powerful smartphone is that it can do everything: play games locally and from the cloud, create and edit documents, quickly export edited videos, process RAW photos, and chat with whoever. The Ultra can handle each of these cases with absolute ease, but that’s expected from a phone that I’ve been running for about three weeks. The real test for these devices is how they do after a year in the hand.

Riptide GP: Renegade runs smoothly on the S23 Ultra, but it’s too big to comfortably play for my hands.
Photo: Florence Ion / Gizmodo

I echo the sentiments of a few other reviews: the Galaxy S23 Ultra doesn’t get as hot as previous versions of the device or even other Android phones. I fell asleep next to it a few nights in a row while it was charging and playing Pluto TV, and I didn’t feel the usual heat emanating as the battery fueled up for the next day. It did get toasty once while I was mindlessly scrolling through TikTok (as I often do), and it was significant enough that I remember saying, “I should probably mention this in the review.”

Apple’s iPhone 14 Pro Max lasts longer

I’m sorry to include Apple in the subhead of a Samsung Galaxy review. But I remain impressed by how Apple’s latest flagship performed in our battery test, and it’s now the benchmark for every other flagship phone review.

Samsung’s 5,000 mAh battery is enormous, yet it’s the same size as last year’s Ultra’s. Whatever Samsung did on the backend to extend battery life has worked thus far—the S23 Ultra beat the S22 Ultra by about two hours, lasting 18 hours and 33 minutes. But that’s nothing next to the nearly 24-hour battery life of Apple’s large iPhone 14 Pro Max. I want some of whatever magic Apple has going on with its software to come to Android land.

These results translated to daily use, too. As I mentioned, I’m a TikTok freak, and I was surprised to see that the Ultra chewed through only 23% of its battery in five hours of mixed use, which included tuning into my Disney streamer.

Move over, Pixel camera

The back of the Galaxy S23 Ultra houses four camera sensors.
Photo: Florence Ion / Gizmodo

Because the Galaxy S23 Ultra is dubbed “ultimate,” its cameras are appropriately extreme. They’re also the key upgrade here, and they took up the majority of Samsung’s announcement event for this phone. The primary camera is a 200-MP standard wide-angle with optical image stabilization (OIS) and an f/1.7 aperture. The ultra-wide camera is a 12-MP sensor with an f/2.2 aperture. The two telephoto lenses on the back also have OIS; one has an f/2.4 aperture with 3x optical zoom, and the other is f/4.9 with 10x optical zoom. The maximum digital zoom is 100x, just like on the S22 Ultra.

An unedited photo shot at dusk with the Galaxy S23 Ultra.
Photo: Florence Ion / Gizmodo

Whenever someone outside of the Android bubble realizes the Galaxy S23 Ultra has four cameras on the back, they often ask me, “Why?” The answer: so it has a lens for every foreseeable situation. For instance, if you’re chasing your kid around the park, you want that quick 3x optical zoom to capture them in the frame and up close. The result is a background bokeh effect that helps make the image instantly shareable on Instagram without using Portrait mode. Or if you happen to be lying down at the park, only to hear the roar of a jet engine approaching overhead, you can use the 10x optical zoom to get a closer look and maybe even post it to TikTok. For epic sky days, when the clouds seem to be cruising through as if they’re fresh cotton candy spun right out of the bin, the ultra wide-angle camera helps increase the drama when shared in your secret Slack channel of friends obsessed with sunsets.

Even with two dedicated telephoto cameras, it’s not always the best at zooming in to see what’s yonder.
Photo: Florence Ion / Gizmodo

Nowadays, most smartphone cameras are capable of everything I just described, but Samsung promises higher resolution and greater color and distance detail. These are the cameras we have on us every day, and Samsung argues that these are the digital memories we’ll be pulling from as we struggle to remember our lives someday in the future.

That’s not to say that every photo the Galaxy S23 Ultra produces is perfect. Zooming past the 10x optical limit requires praying that the image won’t be jaggy or over-sharpened. There were so many instances on the evening of my daughter’s third birthday that the pictures of her punching around a balloon came out looking blurry—a real bummer for me as I was trying to find a cute one to share within group chats. I also tried staying up one night to capture the Air Force flying their planes in the sky above, and I could not produce anything worth sharing.

Another unedited photo shot with the Galaxy S23 Ultra.
Photo: Florence Ion / Gizmodo

As it stands, the 200-MP sensor on the Galaxy S23 Ultra isn’t shooting at its full resolution at all times. Like most flagship smartphones, including the iPhone 14 Pro and Google Pixel 7, Samsung uses pixel binning: by default the phone shoots 12-MP images, combining 16 sensor pixels into each output pixel. The result is brighter photos with better detail throughout. I preferred the 12-MP images worked over by the algorithm to the full 200-MP raw ones, which usually require some post-editing anyway. I want to avoid editing a photo when I’m just trying to share it on social media.
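For the curious, the basic idea behind pixel binning can be sketched in a few lines of Python. This is a simplified averaging illustration, not Samsung’s actual processing pipeline; the array sizes and the `bin_pixels` helper are hypothetical:

```python
import numpy as np

def bin_pixels(sensor: np.ndarray, factor: int = 4) -> np.ndarray:
    """Average each factor-by-factor block of sensor pixels into one output pixel.

    A 4x4 bin turns 16 noisy samples of light into a single brighter,
    cleaner pixel -- trading resolution for light sensitivity.
    """
    h, w = sensor.shape
    assert h % factor == 0 and w % factor == 0, "dimensions must divide evenly"
    # Split the image into factor x factor tiles, then average each tile.
    return sensor.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Toy 16x16 "sensor" readout becomes a 4x4 image: 16x fewer pixels,
# each built from 16 light samples (the same ratio as 200 MP -> 12 MP).
sensor = np.arange(256, dtype=float).reshape(16, 16)
binned = bin_pixels(sensor)
print(binned.shape)  # (4, 4)
```

Real sensors do this in hardware and account for the color filter layout, but the resolution math works out the same way: 200 megapixels divided by 16 is roughly the 12-MP output the phone saves by default.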

You can see more clearly how the Galaxy S23 Ultra’s post-processing stacks up against the iPhone 14 Pro Max and Pixel 7 Pro in the slideshow I put together here. For the most part, I found Samsung’s algorithms veer toward saturation, though they were impressive at tempering the final product to maintain detail where it mattered. The most obvious example is a photo I shot of the Santa Ynez Mountains in Santa Barbara; the S23 Ultra held on to the subtle detail of the sunset, lighting up the ridges without over-contrasting them.

I wrote more about Expert RAW in the other piece, including Samsung’s improved astrophotography feature. I wish Samsung had broken this feature out on its own rather than burying it inside a separate download that has to be enabled in the camera app before anyone knows it’s even there. Samsung includes all these unique camera features as if we’re supposed to know how to use them right out of the box. But as with the improved nighttime video recording capabilities teased during the Ultra’s debut at Galaxy Unpacked earlier this month, I had no idea where to start. Just because a smartphone can do all these fancy things doesn’t mean the general population will aspire to use them. And after ten years of reviewing smartphones, even I’m tempted to give up.

That’s a big problem, as the camera system here is a major selling point and a major justification for the price tag. Compare that to Apple, which, because it makes both the iPhone and iOS, is able to bundle its phones with tons of everyday usability conveniences.

The Galaxy S23 Ultra offers so many different camera modes that I’m often too overwhelmed to play with any.
Photo: Florence Ion / Gizmodo

Before we move on from the cameras, there are a few other things to note: video recording on this smartphone is aces, even without a tripod. But for stability’s sake, I’ve been propping the Ultra up on a handheld tripod and following my kid around at 60 fps. The video is so smooth! The Ultra maxes out at 30 frames per second in 8K resolution for video recording, and there’s a Pro Video mode if you’re comfortable with tweaking camera settings. The front-facing camera is a 12-MP sensor with an f/2.2 aperture; annoyingly, it doesn’t zoom in or out.

Does a smartphone need a stylus?

The S Pen has always been a nice-to-have, but it feels more fitting for a tablet-style device.
Photo: Florence Ion / Gizmodo

Samsung’s S Pen has been around for a long while. It’s as iconic as Paris Hilton’s chihuahuas in the 2000s (RIP to them all). Last year’s Ultra was the first time it appeared in the regular Galaxy lineup after the sunsetting of the Galaxy Note series of yore. But functionally, it’s similar to what the S Pen could do before. You can pop it out for drawing and cropping when the situation on screen calls for it—accommodating business people doing precise things, like moving a cursor within a document or signing off on a contract while in line somewhere. But I’m starting to realize this screen is too limited for anything art driven. Granted, I’m not an artist, but if I imagine myself as a college student (again), the S Pen would feel much more appropriate docked inside a gadget like the Z Fold 4, which can open up into a larger display that’s fit for highlighting and making digital notes. That’s a form factor that lends itself to a stylus rather than the cramped screen on the S23 Ultra.

The other problem with the S Pen is that it requires its own space inside the chassis to dock. That’s part of the tradeoff for a phone that’s slightly too big for your pocket or for the gaming controllers that straddle it. As much as the S Pen is an iconic tool, I don’t know that it belongs on a smartphone anymore, even if you can use it as a Bluetooth controller.

Samsung’s version of Android

Samsung’s software is fine, but often it doubles up on Google’s offerings.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra ships with One UI 5, based on the latest version of Android 13. The One UI 5.1 update is the one everyone’s waiting for right now, since it includes features like Bixby Text Calling, which works similarly to the Pixel’s Call Screen. This feature is now live in English (it was available only in Korea until now), but I couldn’t get it to work during my testing period. I hope to revisit this and some of Bixby’s other features later, as I’m curious to understand the benefits of sticking with it over the tried-and-true (even if sometimes frustrating) Google Assistant.

I don’t mind Samsung’s version of Android, especially since adopting the foldable. I’ve realized it comes with the benefit of Samsung tailoring what Google gives it to its own devices, even if One UI bears little resemblance to Android’s interface framework, Material You. Samsung offers some neat integration with Microsoft’s Your Phone app on Windows PCs that goes beyond the default experience, including the ability to remotely control your device from the desktop. There’s also the ability to snap a photo in Expert RAW and have it immediately populate in Adobe Lightroom. These abilities are nice to have, but as with the Galaxy S22 Ultra last year, I hardly ever considered using them after the review period was over. They’re not a reason to go out and buy a phone.

Still too much phone

I hope you enjoyed reading this entire review only to have me tell you to buy a foldable.
Photo: Florence Ion / Gizmodo

I know there are people out there salivating over the Galaxy S23 Ultra. They want the best that Samsung has to offer in its lineup, whether it’s for bragging rights or because they want all those lenses and this is the only camera they’ll own. I get all that, but I still think the Ultra is a bit of overkill in a market where we’re all screaming for a deal. There are still two other models of the Galaxy S23 that I have yet to review, and though they’re smaller devices with slightly different chassis, they more or less deliver the same Samsung experience across the board for less. They’re priced a little over the Pixel 7 lineup, starting at $800 and $1,000 for the S23 and S23+, respectively.

If you’re going to spend a starting price of $1,200 on any Android smartphone, I’m pleading with you to get a foldable instead. Yes, it’s a new kind of form factor with dubious longevity, but it’s not going away any time soon. For many, even those who want the best, camera fidelity will reach a point of diminishing returns. But a foldable drastically changes every user’s experience. More competition is cropping up overseas, and the rumor mill is getting louder as more manufacturers hop on board this new smartphone fad. At the very least, if you’re spending a whopping amount of money on a smartphone, get something that’s a bona fide phone and a tablet for the price.

Better and better cameras are perhaps not what each new generation of phone should be targeting anymore.

Galaxy S23 Ultra Camera Tested vs Pixel 7 Pro, iPhone 14 Pro Max

Like last year’s release, the Galaxy S23 Ultra has four cameras on the back (one of those holes is just a sensor).
Photo: Florence Ion / Gizmodo

I’m having a ton of fun with the Samsung Galaxy S23 Ultra. There’s still more to explore before I have a full review ready—I still need to test its purported nighttime video-taking prowess, plus I’m still learning how to get the most out of the Expert RAW mode in the Samsung camera app. But until then, I figured I’d share what I’ve noticed so far about the $1,200 phone’s camera compared to those on other popular flagship smartphones I have floating around, like the Pixel and iPhone.

How do the Galaxy S23 Ultra’s shots compare to those on the Google Pixel 7 Pro and Apple iPhone 14 Pro Max? So far, so good. It feels like Samsung fixed some of its zooming algorithms for more precise shots, but I still need to perform side-by-side testing against last year’s S22 Ultra. Samsung’s photos are still more saturated than what Google and Apple produce, but the result is less noisy than it used to be. The company has also toned down the sharpening on its photos and figured out how to balance stark sunlight coming in from the side. However, the Ultra is still no match for the Pixel when it comes to nighttime shooting capabilities. Spoiler: the Pixel 7 Pro is still better at shooting photos of stars, at least until I become an Expert RAW power user.

Samsung’s Massive Galaxy Camera, Honor Magic 5 Leaks, Red Magic 8 Pro Review

Taking a look back at seven days of news and headlines across the world of Android, this week’s Android Circuit includes Galaxy S23 Ultra camera details, the latest Galaxy S23 photos, Pixel 7’s YouTube problems, the Honor Magic 5 is coming, Red Magic 8 Pro review, Pixel Tablet preparations, and Google’s podcast plans.

Android Circuit is here to remind you of a few of the many things that have happened around Android in the last week (and you can find the weekly Apple news digest here).

At The Heart Of The Ultra’s Image

Samsung has launched a new camera sensor, the ISOCELL HP2, which sports a massive 200-megapixel resolution and a lot of smart tricks to improve performance – such as pixel-binning technology that lets the sensor simulate different pixel sizes to match the lighting conditions. Unsurprisingly, the specs of the new sensor match the leaked specs of the Galaxy S23 Ultra:

“Samsung has already promised improved nighttime photography and astrophotography from its next-gen flagship, as revealed in a promo video accompanying the recent launch of pre-orders for the Galaxy S23 Ultra. Now, it looks highly likely that the claimed ‘epic detail and color reproduction’ of the Isocell HP2 will be making its way to that smartphone too.”


The Smooth Galaxy

The team at WinFuture has details on the upcoming Galaxy S23 handsets, which have come to light via smartphone dealers in Germany, including new promotional images that highlight the smooth back of the handsets, broken only by the main camera’s three lenses:

“Samsung mainly ‘removes’ the camera hump and refines a few details on the case so that the optics only protrude slightly from the back… Otherwise, you rely on a pleasing look, which runs in a similar form through all new smartphone series that Samsung is presenting this year, so it should also be used in the cheaper Galaxy A, E, F and M models.”

(via SamMobile).

Pixel 7’s Latest Problem

Pixel 7 and Pixel 7 Pro users are reporting problems with YouTube on the latest Google handsets. Exiting full-screen video on YouTube or YouTube TV is causing handsets to randomly lock up, sometimes leaving controls active, other times requiring a reboot:

“This isn’t the first problem that Pixel 7 owners have faced since the device was launched, including networking issues that resulted in dropped calls, display problems including touch events being misreported and white lines appearing on the screen, face unlock not unlocking Google Pay, and media player controls not showing on the lock screen… No doubt, the new issue with YouTube will also be addressed promptly.”


Get Ready For More Magic

It’s still to be announced (and let’s face it, Mobile World Congress is just around the corner), but we have another indication that Honor will be bringing the Magic 5 series to the public in the very near future.

“The HONOR Magic 5 series, model PGT-AN00, appeared on the network certification portal powered by the Snapdragon 8 Gen 2 processor and features optical image stabilization and at least one telephoto lens with the possibility of supporting up to 100x zoom. The device is also rumoured to be released at the end of February.”

(via Reddit).

Red Magic 8 Pro Review

The latest gaming phone from Nubia is the Red Magic 8 Pro. Powered by the Qualcomm Snapdragon 8 Gen 2 chipset and running Redmagic OS with its own customized gaming launcher, the handset has been reviewed by the team at GSM Arena to see what’s new in the mobile gaming world in 2023:

“…gaming is the bread and butter of the Red Magic line, and the Red Magic 8 Pro excels at it like few others. Not only does it have the raw power to chew through even the heaviest modern titles, but nubia’s extra Game Space software layer has gotten better than ever. We found ourselves particularly impressed with how well the Red Magic 8 Pro handles gaming on an external monitor via a display connection.”

(GSM Arena).

Preparing For The Tablet

Google has tweaked the UI of its Discover feed to better suit devices with larger screens, such as its upcoming Pixel Tablet. Of course, this has benefits for other Android tablets as well, so it’s a win for every manufacturer, even if Google gets there first:

“Instead of just two columns of articles, Google Discover now has a third that makes the feed go edge-to-edge. The fullscreen effect is particularly prominent to the left of your homescreen with a black background, while the Google logo and your profile avatar are moved to the top corners.”


And Finally…

What’s happened to Google’s grand adventure in podcasting? Links directly into the Google Podcasts app are no longer showing on search results; the one-click play experience for those staying inside Mountain View’s walls is no more.

“The Google Podcasts app has had no feature updates for the last eighteen months; the Google Podcasts Creators Program closed this month; and now, with the removal of the product from Google Search, we asked the business unit that owns the app if Google Podcasts is to close. The answer: ‘YouTube and Google Podcasts serve different needs. Both serve podcast listeners today and we are not changing this right now. We have nothing more to share at this time.’”


Android Circuit rounds up the news from the Android world every weekend here on Forbes. Don’t forget to follow me so you don’t miss any coverage in the future, and of course read the sister column in Apple Loop! Last week’s Android Circuit can be found here, and if you have any news and links you’d like to see featured in Android Circuit, get in touch!

Samsung Leaks Major Galaxy S23 Ultra Upgrade As Pre-Orders Go Live

Samsung’s soon-to-be-released Galaxy S23 Ultra is tipped to receive some compelling camera improvements. Now a new promo video, direct from Samsung itself, teases ‘stunning night photos’ from the next-gen Galaxy flagship as pre-orders go live.

The fifteen-second video, titled “Made for Mooonlight” (note the three ‘o’s), focuses on three of the Galaxy S23 Ultra’s (reportedly huge) rear camera lenses before displaying the text “capture the night even in low light,” followed by “stunning night photos are coming soon.” Imagery in the video, including shots of the moon and the Milky Way, suggests strong astrophotography capabilities.

No explicit mention is made of any specific hardware upgrades, but the focus on the three cameras tallies with an earlier leak in which the flagship Galaxy is tipped to feature noticeably larger rear camera lenses than its predecessor, the Galaxy S22 Ultra.

The Galaxy S23 Ultra is also tipped to feature a new 200-megapixel main camera sensor along with improved ‘daily mode’ photos, improved sharpness at 10x zoom, and more natural processing according to previous leaks. Meanwhile, the front-facing camera is expected to see a reduction in megapixels, although I can’t imagine Samsung would allow this to lead to a reduction in quality, given the focus on photography in its latest promo.

Now that we know Samsung is promoting the strength of its new smartphone’s night photography in its first official promo, we can expect a significant boost in camera performance. Potentially larger lenses, combined with the additional processing power of Qualcomm’s new Snapdragon 8 Gen 2 chipset, are no doubt key to any improved low-light performance from the Galaxy S23 Ultra.

You can pre-order the Samsung Galaxy S23 Ultra now, ahead of its launch on February 1.

