Google Pixel 8 vs. iPhone 15: did Google finally beat Apple?


Renders of the Google Pixel 8 next to the Apple iPhone 15.
Digital Trends

Google has unveiled the Google Pixel 8 lineup at its Made by Google event in New York. Though there aren’t any drastic upgrades to the Pixel 8, Google made some smaller changes that are sure to be welcome among fans.

But Apple also just released the iPhone 15 lineup a month ago. So, if you’re in the market for a new smartphone, which one should you pick up? The Pixel 8 and iPhone 15 are both the baseline models for Google and Apple, respectively, so let’s take a closer look at these two phones.

Google Pixel 8 vs. iPhone 15: specs

Display: Pixel 8: 6.2-inch LTPS OLED Actua display, 1080 x 2400 at 428 ppi, 1,400 nits HDR / 2,000 nits peak; iPhone 15: 6.1-inch Super Retina XDR OLED display, 2556 x 1179 at 461 ppi, 1,600 nits HDR / 2,000 nits peak
Always-on display: Pixel 8: yes; iPhone 15: no
Refresh rate: Pixel 8: 60Hz to 120Hz; iPhone 15: 60Hz
Colors: Pixel 8: obsidian, hazel, rose; iPhone 15: black, yellow, blue, green, pink
Storage and RAM: Pixel 8: 128GB/256GB, 8GB RAM; iPhone 15: 128GB/256GB/512GB, 6GB RAM
Processor: Pixel 8: Tensor G3; iPhone 15: A16 Bionic
Software: Pixel 8: Android 14; iPhone 15: iOS 17
Rear cameras: Pixel 8: 50MP main, 12MP ultrawide; iPhone 15: 48MP main, 12MP ultrawide
Front camera: Pixel 8: 10.5MP; iPhone 15: 12MP
Biometrics: Pixel 8: under-display fingerprint sensor, Face Unlock; iPhone 15: Face ID
Battery: Pixel 8: 4,575mAh; iPhone 15: 3,349mAh
Charging: Pixel 8: 27W wired, 18W wireless with the Google Pixel Stand, 12W Qi wireless, 4.5W reverse wireless; iPhone 15: 20W wired, 15W MagSafe wireless, 7.5W Qi wireless, 4.5W reverse wired
USB-C: both
Price: Pixel 8: from $699; iPhone 15: from $799
Available from: Pixel 8: Google Store; iPhone 15: Apple

Google Pixel 8 vs. iPhone 15: design

The black Google Pixel 8, being held by a person wearing a sweater.
Google

On the surface, neither the Google Pixel 8 nor Apple’s iPhone 15 looks very different from its predecessor.

The Google Pixel 8 looks very much like the Google Pixel 7 that came before it. However, there are some minor changes. The Pixel 8 now has more rounded corners, symmetrical bezels, and a flat display. You still get the traditional metal-finish camera bar on the back, which houses the dual camera system. Google has given the Pixel 8 a new satin frame enclosure with polished back glass (in other words, the back is glossy). There are three beautiful colors for the Pixel 8: obsidian, hazel, and rose.

A green iPhone 15 on a green bench.
Christine Romero-Chan / Digital Trends

Apple’s iPhone 15 looks pretty similar to the iPhone 14. You still have the flat edges and diagonal dual camera layout, but that’s mostly where the similarities end. Previous base model iPhones had glossy glass on the back, but the iPhone 15 uses a new “color-infused” glass with a unique matte finish. Since it’s matte, fingerprints are less likely to show up.

The iPhone 15 also has more rounded edges compared to its predecessor, which gives it a more ergonomic and comfortable feel in the hand. Apple also uses an aluminum frame that’s pretty resistant to fingerprints, and it also makes the phone quite lightweight. There are five color choices for the iPhone 15: black, yellow, blue, green, and pink. However, most of the colors are quite pale, with pink being the best of the bunch.

Both phones look great, but with more color options and a more practical matte back, we’re giving the iPhone 15 the win here.

Winner: iPhone 15

Google Pixel 8 vs. iPhone 15: display

Google Pixel 8 render against a pink background.
Google Pixel 8 Digital Trends

This year, Google actually made the display on the Pixel 8 a bit smaller than before. Previously, the Pixel 7 had a 6.3-inch display, but the Pixel 8 now has a 6.2-inch Actua LTPS OLED display with a 1080 x 2400 pixel resolution at 428 pixels per inch (ppi) and a 20:9 aspect ratio.

The Pixel 8 display also has a 120Hz refresh rate and reaches an HDR brightness of 1,400 nits and a peak brightness of up to 2,000 nits. Google says the new Actua display is up to 42% brighter than the Pixel 7’s.

The Google Pixel 8 also has an always-on display that features At a Glance and Now Playing. It uses Corning Gorilla Glass Victus as well.

A green iPhone 15 in hand showing home screen.
Christine Romero-Chan / Digital Trends

Apple’s iPhone 15 display is not quite as impressive, especially considering the price point. With the iPhone 15, you have a 6.1-inch Super Retina XDR OLED display with a 1179 x 2556 resolution, 19.5:9 aspect ratio, and 461 ppi. It reaches a typical brightness of 1,000 nits, but can go up to 2,000 nits peak brightness outdoors. It also brings in the Dynamic Island from last year’s iPhone 14 Pro models, making it the new standard for the iPhone.

However, even though the iPhone 15 costs more than the Pixel 8, the refresh rate is only 60Hz, which is far below what the Pixel 8 (and most Android phones, for that matter) is capable of. Though the display features bright and rich colors with sharp text, scrolling won’t be as smooth, and it’s definitely noticeable if you’re used to a higher refresh rate. It also does not support an always-on display mode, which is something the Pixel 8 does offer.

Winner: Google Pixel 8

Google Pixel 8 vs. iPhone 15: software and performance

Android 14 logo on a smartphone.
Joe Maring / Digital Trends

The Google Pixel 8 will ship with Android 14 out of the box, naturally. It also features Google’s new Tensor G3 chip and Titan M2 security coprocessor. The Pixel 8 also has 8GB RAM and either 128GB or 256GB of storage available.

With the Google Tensor G3, the machine learning model on the Google Pixel 8 is about 10x more complex than what was on the Google Pixel 6. Some specific features of Tensor G3 include the Pixel’s Call Screening feature that can detect and filter out spam calls, improved Clear Calling for high-definition and crystal clear phone calls, and Audio Magic Eraser, which can erase distracting sounds from your videos.

Google is also taking steps to future-proof its phones: it is promising OS and security updates for the Pixel 8 for seven years. It will also release Feature Drops, which add helpful new features to the Pixel 8, every few months.

The display on a green iPhone 15.
Christine Romero-Chan / Digital Trends

Apple’s iPhone 15 comes with iOS 17. On the inside, it has the A16 Bionic that debuted last year with the iPhone 14 Pro models and has 6GB RAM. You have the option of 128GB, 256GB, or 512GB of storage.

The A16 Bionic is quite fast in terms of performance, and iOS 17 has been a solid performer. With iOS 17, you get new features like NameDrop, improved AirDrop, StandBy mode, interactive widgets, Live Voicemail, and more. It may be a smaller update than in previous years, but it adds a lot of quality-of-life improvements. Apple also tends to support its devices for around six years on average, so you can expect the iPhone 15 to last a while too.

Ultimately, which one is better really depends on whether you prefer iOS or Android overall. But if we had to pick a winner, the iPhone 15 takes it — at least for now. While the A16 Bionic is a proven chip with great performance across the board, it remains to be seen if Tensor G3 can seriously compete with it. Tensor G2 was riddled with efficiency and performance issues, and while those should be fixed for G3, we need to see for ourselves if that’s actually the case.

Winner: iPhone 15

Google Pixel 8 vs. iPhone 15: cameras

Someone holding up the pink Google Pixel 8, while also wearing a pink shirt.
Google

The Pixel 8 has a 50MP main camera and a 12MP ultrawide. The main camera has an f/1.68 aperture, an 82-degree field of view, and up to 8x digital zoom, including 2x optical-quality zoom. The ultrawide lens has an f/2.2 aperture and a 125.8-degree field of view. The dual camera setup also has single-zone laser-detect autofocus, a spectral and flicker sensor, and optical and electronic image stabilization on the wide lens.

The selfie camera on the Pixel 8 is 10.5MP with an f/2.2 aperture, fixed focus, and 95-degree ultrawide field of view.

The rear camera should also produce some quality video. You can get 4K recording at 24/30/60 frames per second (fps) or 1080p recording at 24/30/60 fps. The front camera is capable of 4K recording at 24/30/60 fps. Google also has a macro focus video mode, cinematic blur and pan, astrophotography time lapse, and more.

Google Pixels have had a solid reputation for taking great photos, and the Pixel 8 is no exception. The upgraded camera system with the Pixel 8 has advanced image processing that will bring out vivid colors and striking details. There is also now a new Macro Focus that lets you take even better macro images. Another new feature is Best Take, which will combine similar pictures into a single image where everyone looks their best.

A green iPhone 15 showing the camera and matte glass back.
Christine Romero-Chan / Digital Trends

On the iPhone 15, you also have a dual camera setup, with a 48MP main and a 12MP ultrawide. The main camera has an f/1.6 aperture, while the ultrawide has an f/2.4 aperture. Despite not having a telephoto lens, the iPhone 15 offers 2x optical-quality zoom (by cropping into the 48MP sensor), with digital zoom up to 10x. For the selfie camera, the iPhone 15 has a 12MP lens with an f/1.9 aperture. The selfie camera also has autofocus, which you don’t get on the Pixel 8.

Apple has made some improvements in camera quality this year with Smart HDR 5. Colors don’t appear as harsh or washed out as in previous generations. There is also a new Auto Portrait feature for the entire iPhone 15 lineup. With Auto Portrait, as long as the camera detects a person or pet in focus, it automatically captures depth data even if you’re not in Portrait mode. When viewing the photo, you can toggle Portrait mode on or off as you please, and the feature works for selfies, too.

For video recording, Apple can be tough to beat. You have 4K recording at 24/25/30/60 fps or 1080p recording at 25/30/60 fps. The iPhone 15 also has a cinematic mode with shallow depth of field up to 4K HDR at 30 fps, HDR video recording with Dolby Vision up to 4K at 60 fps, and slo-mo video up to 1080p at 120/240 fps.

You get slightly more digital zoom and autofocus for the selfie camera on the iPhone 15. But the Pixel 8 has a slightly sharper 50MP main camera, and given Google’s track record, it should also provide excellent photographs. Until we’re able to put both cameras to the test, we’re calling this one a draw.

Winner: Tie

Google Pixel 8 vs. iPhone 15: battery and charging

Someone talking on the phone with the Google Pixel 8.
Google

The Pixel 8 has a 4,575mAh battery, and Google promises 24-hour battery life with up to 72-hour battery life with Extreme Battery Saver — the same claim it made last year for the Pixel 7. It uses USB-C for fast charging up to 27 watts, which results in a 50% charge in 30 minutes.

For wireless charging, the Pixel 8 uses Qi-compatible wireless chargers and can wirelessly charge up to 18W with the Google Pixel Stand accessory. For other wireless chargers, it will cap out at 12W speeds. You can also use the Pixel 8 to charge up other devices with Battery Share reverse wireless charging, which is around 4.5W.

A green iPhone 15 showing battery life.
Christine Romero-Chan / Digital Trends

Apple packed a 3,349mAh battery into the iPhone 15. In our tests, it gets you through the day with moderate usage. It’s an improvement over the 3,279mAh battery of the iPhone 14, but not by much.

The iPhone 15 has finally ditched Lightning for USB-C, so you no longer need a proprietary cable to charge the iPhone. However, Apple did not improve the overall charging speed: the iPhone 15 still charges at around 20W, so again, you’re looking at a 50% charge in 30 minutes.

As far as wireless charging goes, the iPhone 15 still has 15W MagSafe wireless charging speeds and 7.5W with Qi-compatible charging pads. Apple did add reverse wired charging, so as long as you have a cable, you can charge up other devices at 4.5W speed.

We still need to test the Pixel 8’s battery for ourselves, but there’s a good chance it could come out ahead. It’s significantly larger than the battery capacity of the iPhone 15, and when you combine that with the convenience of reverse wireless charging, it’s enough to give it the win.

Winner: Google Pixel 8

Google Pixel 8 vs. iPhone 15: price and availability

A green iPhone 15 leaning on a lamp post.
Christine Romero-Chan / Digital Trends

The Pixel 8 starts at $699 for the 128GB model, which is a $100 increase from the Pixel 7. Google justifies the price increase with a new design, upgraded cameras, and the Tensor G3. The Pixel 8 comes in three colors: obsidian, hazel, and rose.

You can preorder the Pixel 8 now, and it will be widely available to the public on October 12.

The iPhone 15 starts at $799 for the 128GB version. It comes in five colors: black, yellow, blue, green, and pink. However, most of the colors are very light and pale, though pink is a notable standout. Apple launched the iPhone 15 on September 22, and you can purchase it right now.

Google Pixel 8 vs. iPhone 15: verdict

Two Google Pixel 8 phones lying next to each other on a table.
Google

If you’re looking to get the most bang for your buck with a base model flagship, then it’s hard to beat the Google Pixel 8. At $699, even though that’s more than its predecessor cost, you’re getting a lot more for your money this year.

For one, the Pixel 8 now features a 120Hz display, a bump up from the 90Hz of the Pixel 7. Even though the screen is slightly smaller at 6.2 inches, it is also noticeably brighter. And the Pixel 8’s refresh rate is double what Apple offers with the iPhone 15, which is stuck at 60Hz despite costing $100 more. That’s disappointing.

Though the cameras on both phones are similar, the Pixel 8 edges out the iPhone 15 just slightly with the 50MP main shooter versus the 48MP main camera. Though the iPhone 15 does have the Auto Portrait mode, the Pixel 8 should be capable of taking some great photos as well, especially with the new Best Take feature, which looks especially useful if you take a lot of group photos.

Battery life could also work in the Pixel 8’s favor, offering a 4,575mAh battery compared to the iPhone 15’s measly 3,349mAh. Charging speeds for both phones are about the same with USB-C, but the Pixel 8 is capable of faster wireless charging overall (unless you want to use MagSafe). The Pixel 8 is also more convenient for reverse wireless charging, whereas the iPhone 15 can reverse charge, but only if you have a cable with you.

What does this all mean for you? If you know you want an iPhone, the iPhone 15 is a tough phone to beat. It’s expertly designed, very fast, and a pure joy to use. But if you want an Android handset — or you’re simply looking for the best deal possible — the Google Pixel 8 is a tremendous value.



From Dramatic Locations To Dreamy Photo Sets, Here Are The New Trends In Wedding Photography



It is said that the cake gets eaten and the flowers wither, but wedding photos last forever. Wedding pictures are a perfect way to preserve what are arguably the best moments of one’s life, and their value only increases with time. “Wedding photography is an art that stands as a testament to love; it transcends time. Cherished for generations, it is all about encapsulating emotions and not documenting pictures,” says Shrey Bhagat, founder and creative director, Raabta.

Undoubtedly, one of the most important aspects of a wedding is photography, which is getting bigger and better by the day. A multi-billion-dollar market that is growing exponentially, wedding photography has evolved tremendously over the years. From dramatic locations and dreamy photo sets to stunning imagery and visual storytelling, the trends in wedding photography are eclectic and exciting. Here is a look at some of the latest trends in this special genre of photography.

Creative, candid and collaborative

Gone are the days when wedding photography was a strictly formal affair, all about awkward posing and forced smiling before the camera. “Earlier, there was no real connection between a photographer and the couple, but now, couples and the photographer are a team, and all decisions with respect to the approach and style of pictures are taken mutually.

It is all about capturing real, candid moments that tell the unique story of the wedding day,” says Himanshu Patel, founder, Epic Stories. Couples are totally involved in the creative process, and the photography team spends hours listening to each couple’s story to understand the best way to capture the essence of their wedding. “Some couples even opt for a customised song dedicated to them, and this adds a whole new dimension to storytelling with wedding photography,” adds Bhagat.

With the help of new technology and changing preferences, wedding photography has become more creative and dynamic. Photographers today are more focussed on capturing nuances, subtle expressions, and the feelings shared between the couple. Whether it is casual interactions or intricate details, it is all about embracing the authentic, and even the imperfect, to capture the true spirit of the occasion. “Couples love how this gives them an output that is up close and personal. The beauty of these frames is that they are full of raw emotions, with an overlapping focus on people,” quips Bhagat. As a result, conventional wedding albums have now transformed into beautiful fairy tales that encapsulate the narrative of the couples’ journeys.

Latest trends for the Midas touch

Apart from candid images, drone shots have become a rage in recent times. With elaborate décor themes spanning from royal, traditional, tropical, vintage and even boho chic, drone shots are used to capture these details beautifully. Most weddings are all about rituals and elaborate ceremonies, and drone shots lend more gravitas to such pictures.

“Cinematic-style photography is the trend these days, with a focus on candid videography. We work with the couple for months in advance to craft the complete narrative, right from pre-wedding shoots and save-the-date creatives to short, quick teasers with about 10-20 edited wedding pictures, or a short film that is released on the same day for the couple to share on social media,” says Palani A, founder, S.A. Digital Studio, Bengaluru. He adds that editing is as challenging as the actual shooting and that the desaturated editing style is gaining popularity. This technique creates images that are artistically minimalistic, with softer tones and an earthy, natural feel.

A number of couples are adopting a vintage, nostalgic aesthetic, with black-and-white images and even hazy images that have an ethereal quality. Most photographers use slower shutter speeds to capture these soulful images. “We are also returning to using film to give photos a classic, old-fashioned feel. These dreamy pictures add an extra touch of magic; it is akin to viewing the photo through a soft, gentle filter,” adds Patel. Such pictures often feel like they carry a memory right in the photo. “Controlled blurriness adds cinematic emotion and movement, focusing on moments and gestures,” says Jayant Chhabra, founder, Cupcake Productions.

Further, first-look images are a huge trend, as they capture one of the purest moments of a wedding and seize that feeling forever in a perfect photograph. They also capture the emotional reactions of parents seeing their children in wedding attire for the first time. Bridal-room chronicles, with elaborate shots of the trousseau, the jewellery, and the whole journey of the bride getting ready, are in vogue, as is including your pets on your special day.

Tantalising techniques

Other trends include groomsmen shoots, veil shots, and the use of GIFs. While bridal veil images exude an enigmatic and timeless appeal, pictures of the groom having fun with his groomsmen allow their personalities to shine through. After all, why should brides have all the fun? “GIFs are likely to be quite popular in the coming days. Imagine capturing a small moment from the wedding and turning it into a moving picture. It is like freezing a happy dance or a cute smile and making it come alive. These GIFs are like mini stories that convey the feeling of the moment in a fun way, in a jiffy,” says Patel.

Double-exposure photographs, underwater portraits, and backlit silhouettes are other techniques photographers adopt to lend images a unique, distinctive vibe. Artistic compositions that make strategic use of reflective surfaces like mirrors help create exotic, unusual images, as does the use of smoke bombs, which create the perfect ambience for moments like the couple’s first dance. Post-wedding shoots are also extremely popular.

Like all things, change is the only constant in the field of wedding photography too. Photographers spend a lot of time researching new styles and technology to stay relevant. “Keeping up with the trends in wedding photography requires a proactive, adaptive, and continuous learning approach. Attending meets held by photography clubs, as well as workshops and conferences, can help you gain valuable insights. Further, it is imperative to keep your camera equipment, lighting gear, lenses, and editing software up to date,” concludes Chhabra.


How to Win Gold with Your Snaps: iPhone Photography Awards 2024 Now Open for Submissions


Showcase Your Artistry with iPhone Photography Awards 2024: Win Gold, Platinum, and More

KEY HIGHLIGHTS

  • IPPA 2024 invites iPhone and iPad photographers to showcase their skills and compete for gold and platinum bars.
  • The competition emphasises authenticity, allowing image alterations only via iOS apps.
  • With 14 categories ranging from abstract to travel, IPPA 2024 offers a platform for every photographic passion.
As millions of users across the globe capture their everyday lives with the sophisticated cameras on their iPhones, the iPhone Photography Awards (IPPA) offers a stage for the most creative to shine. Though not an official Apple event, the IPPA has been celebrating the art of iPhone photography since 2007.

Participation Criteria: A Worldwide Call

IPPA extends its call to iPhone and iPad photographers around the globe. While the competition accepts photographs from personal archives, they should not have been previously published, with the exception of those shared on personal Facebook or Instagram accounts.

Submission Guidelines: Celebrating Authenticity

The IPPA emphasises authenticity, allowing minimal alterations using only iOS apps. Photos edited with desktop image processing programs such as Photoshop are not acceptable. iPhone add-on lenses, however, can be utilised to enhance image capture.

Rewarding Excellence: A Trove of Prizes

The competition offers an array of coveted prizes. The grand prize winner will be awarded an iPad Air, while the top three winners will receive an Apple Watch each. Moreover, the first-place winners across the 14 categories stand a chance to win a gold bar from a highly renowned private gold mint. Runners-up are not left behind either, with second and third-place winners receiving a platinum bar from the same mint.

Diverse Categories: A Platform for Every Passion

The competition encompasses a broad range of categories, including Abstract, Animals, Architecture, Children, Citylife/Cityscape, Landscape, Lifestyle, Nature, People, Portrait, Series (3 images), Still Life, Travel, and Other.

Registration Details: Open Till March 31, 2024

Participants are required to pay an entry fee per image, which ranges from $5.50 (roughly ₹470) for a single image to $135.50 for submitting 50 images. The submission deadline is set for March 31, 2024, allowing ample time for aspiring participants.


Smartphone Photography Shootout: Samsung And Apple Head-To-Head


Through the pandemic, I became passionate about photography. Although I’ve technically been doing “professional” product photography for articles featured online and in print for a couple of decades, it was only in the last few years that I stepped up my game. As much as I love my pro camera gear, though, it’s not something you can carry around perpetually. As the old saying goes, “the best camera is the one that’s with you,” and like most of you, my smartphone camera is the one that’s almost always within reach. So, when presented with the opportunity to test a couple of the top smartphones for photography, I jumped at the chance.

The smartphones I’m referencing are the Samsung Galaxy S23 Ultra and the Apple iPhone 14 Pro Max. Both devices might be the current flagships for their respective manufacturers, but they couldn’t be more different in terms of their rear “world-facing” camera configurations (both devices feature 12MP selfie cameras). The iPhone 14 Pro Max features a 48MP main camera with a 24mm seven-element lens (ƒ/1.78 aperture) and sensor-shift optical image stabilization. That main camera is flanked by a 12MP Ultra Wide shooter, with a 13mm six-element lens (ƒ/2.2 aperture) and a 120° field of view, and a 12MP Telephoto shooter, with a 77mm six-element lens (ƒ/2.8 aperture), 3x optical zoom (digital zoom up to 15x), and optical image stabilization.

The Samsung Galaxy S23 Ultra features a bleeding-edge 200MP main camera with a 24mm lens (ƒ/1.7 aperture) and optical image stabilization. Adjacent to the main camera, there’s also a 10MP “periscope telephoto” shooter (ƒ/4.9 aperture) with OIS and 10x optical zoom, a 10MP telephoto camera with a 70mm lens (ƒ/2.4 aperture), OIS, and 3x optical zoom, and a 12MP ultra-wide camera with a 13mm lens (ƒ/2.2 aperture), a 120˚ field of view, and Dual Pixel PDAF. The ultra-wide shooters on both the Galaxy S23 Ultra and iPhone 14 Pro Max also do double duty as macro shooters.

Those main camera specifications need some additional clarification, however. While many of today’s high-end smartphones boast ultra-high megapixel ratings, they also do something called “pixel binning”. Pixel binning uses multiple pixels on the sensor to gather data that’s grouped, or “binned”, together to effectively act as a single pixel of a certain color. For example, the iPhone’s 48MP main camera actually outputs 12MP images by default, because it uses quad-pixel binning. Users must shoot in RAW, which requires additional processing, to capture a 48MP image. The Galaxy S23 Ultra does the same thing; 200MP image captures are possible, but quad-binned captures are 50MP, and the default mode bins even further, down to 12MP. Pixel binning is one way to counterbalance the more limited light-gathering capabilities of the small pixels on densely packed smartphone camera sensors. It results in more accurate color capture and less noise, but creates images that are only 25% of the sensor’s maximum resolution, when using quad-binning at least.
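
To make that trade-off concrete, here is a minimal Python sketch of quad binning. It’s a toy model (real sensors bin across a Bayer color filter array, often in the analog domain before readout), but it shows why the output holds a quarter of the sensor’s pixel count:

```python
# A minimal sketch of quad-pixel (2x2) binning on a single-channel
# sensor readout stored as a NumPy array.
import numpy as np

def quad_bin(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one output pixel."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "sensor dimensions must be even"
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A 48MP-style readout (8064 x 6048) bins down to a 12MP image (4032 x 3024).
sensor = np.random.poisson(lam=40.0, size=(6048, 8064)).astype(np.float32)
print(quad_bin(sensor).shape)  # (3024, 4032): a quarter of the pixel count
```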

High-End Smartphone Camera Systems Require Massive Processing Power

Smartphones designed with high-quality photography in mind usually use complex image processing techniques, high-quality optics, and a multitude of computational photography technologies to capture and intelligently process as much image data as possible. As such, today’s smartphones require powerful image processors to not only produce high-quality images, but offer a good user experience.

The iPhone 14 Pro Max is powered by Apple’s own A16 Bionic chipset, which features a 16-Core Neural Engine capable of 17 TOPS and an advanced Image Signal Processor (ISP). The Samsung Galaxy S23 Ultra is powered by the flagship Snapdragon 8 Gen 2 Mobile Platform that features Qualcomm Technologies, Inc.’s first ever Cognitive ISP.

Sensor sizes and megapixel counts may be key differentiators that play a significant role in a smartphone camera system’s quality, but they are less relevant than most people think. Smartphones simply do not have the physical space to accommodate large camera sensors, and a sensor’s pixel density can only be so high before image quality is negatively impacted, particularly in challenging lighting conditions. Although there are other things to consider, in general, smaller pixels on a camera sensor gather less light than larger pixels (all other things being equal), which can degrade image quality in sub-optimal lighting. Because of this, a smartphone’s image processing capabilities are immensely important.

There isn’t much data available on Apple’s proprietary ISP. The company consistently enhances the capability of its “Axx” mobile processors with each generation, but specifics are scarce. Qualcomm has also enhanced its Snapdragon platforms over the years with leading image processing and computational photography features. Although Qualcomm’s partners, like Samsung, don’t typically enable every possible photography or videography-related feature available in the latest Snapdragon mobile platforms, those that place a strong emphasis on photography often produce devices capable of capturing excellent photos and videos. In fact, according to DxOMark, many of the best smartphone cameras are powered by Snapdragon.

The Spectra Image Signal Processor featured in the Snapdragon 8 Gen 2 has triple Cognitive ISPs, capable of processing 3.2 gigapixels per second. The trio of ISPs gives devices like the Galaxy S23 Ultra the ability to leverage up to three independent cameras simultaneously and enables features like triple simultaneous image capture, seamless zooming between cameras, multi-capture HDR composites, and up to 200-megapixel photo capture, among other things. The first-of-its-kind Cognitive ISP in the Snapdragon 8 Gen 2, in conjunction with the chip’s Hexagon AI engine, also enables real-time “Semantic Segmentation”, which is used by the Galaxy S23’s selfie camera.

Semantic Segmentation leverages a neural network that was trained to identify many different objects, like people, faces, skies, grass, etc., and each of those objects can be segmented, and custom processing applied to them to enhance a captured image or video. If you’re familiar with image processing applications, you can think of the segments as layers, and each layer can be edited independently to produce a desired effect – bluer skies, smoother skin, greener grass – you get the idea. And since the segments can be altered independently, each segment can be properly exposed, saturated, or sharpened, without diminishing the quality of the other segments. As software support for Semantic Segmentation matures, it has the potential to be a real game changer.
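
To illustrate the layers analogy, here is a toy Python sketch of per-segment editing. The class IDs and adjustment values are invented for illustration, and it assumes a per-pixel label map has already been produced by some segmentation model, which is the genuinely hard part:

```python
# A toy sketch of mask-based per-segment editing: each labeled region
# gets its own adjustment, like layers edited independently.
import numpy as np

SKY, SKIN, GRASS = 1, 2, 3  # hypothetical class IDs

def enhance(image: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """image: HxWx3 float RGB in [0, 1]; labels: HxW int class map."""
    out = image.copy()
    out[labels == SKY, 2] = np.clip(out[labels == SKY, 2] * 1.15, 0, 1)      # bluer skies
    out[labels == SKIN] = np.clip(out[labels == SKIN] * 1.05, 0, 1)          # brighter skin
    out[labels == GRASS, 1] = np.clip(out[labels == GRASS, 1] * 1.10, 0, 1)  # greener grass
    return out
```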

Both the iPhone 14 Pro Max and the Galaxy S23 Ultra use an array of computational photography techniques to produce images. Common computational photography features, like Night Mode, Portrait Mode, and HDR are available on both platforms. The Galaxy S23 Ultra also offers a feature called AI Zoom. AI Zoom is an experiential feature that gives users the ability to seamlessly zoom from the widest angle all the way to 100x zoom, spanning three different cameras, by simply pinching and zooming or using an on-screen slider. There’s no need to manually switch between the ultra-wide, standard, or telephoto camera, or to figure out which one is best to compose a particular shot.

The Mobile Photography Experience

Although I’m focusing on photography here, I should also note that both the iPhone 14 Pro Max and the Galaxy S23 Ultra are high-performance devices that offer quality user experiences in general. Both phones are fast and have myriad photo- and video-related features. Specifically as it relates to photos and videos, autofocus is quick and accurate on both devices, they capture realistic tones and textures, and noise performance is also very good. For general photography, I think anyone coming from a previous-gen device is likely to be quite pleased. But there are some major advantages to the Qualcomm-powered Samsung Galaxy S23 Ultra worth pointing out.

The Galaxy S23 Ultra’s camera application offers a number of customizations that give users access to some of the device’s special photo features, like Hyperlapse, Single Take, and Expert RAW, to name but a few. Users of any skill level, however, should be able to navigate its menu system and manipulate camera settings or quickly switch between modes. Although there are many options available to tune image quality to a user’s liking, auto mode takes phenomenal shots. The Galaxy S23 Ultra is capable of capturing excellent detail; perhaps one small criticism is that it can sometimes oversaturate images and perform a bit too much sharpening.

Without the right processing, wide-dynamic-range lighting situations can be challenging for the relatively small sensors in a smartphone, but even with strong backlighting and the sun shining brightly, both the iPhone 14 Pro Max’s and the Galaxy S23 Ultra’s cameras performed well. The colors and tones of the scene are well balanced, with crisp detail in areas that lesser cameras, unable to handle this kind of lighting, would render as silhouettes. That said, the Galaxy S23 Ultra handled the exposure better and retained more blue in the sky.

The higher-resolution 200MP sensor in the Galaxy S23 Ultra also captures more detail. When images are scaled down and posted to social media with minimal editing, the differences may be very difficult to suss out. But the massive resolution offered by the Galaxy S23 Ultra gives users much more flexibility to crop without sacrificing as much detail in the final image. Zoomed crops of the images captured on both phones show just how much additional resolution is available on the Samsung device.

A 200MP capture on the Galaxy S23 Ultra is 16320×12240 pixels. A 48MP capture on the iPhone 14 Pro Max is 8064×6048 pixels. You could crop 50% from each dimension on the Galaxy S23 Ultra and still have a similar number of pixels to work with versus the full resolution on the iPhone.
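
The arithmetic behind that claim checks out, using the resolutions quoted above:

```python
# Quick arithmetic check of the crop-headroom claim.
s23_full = 16320 * 12240                   # 199,756,800 pixels (~200MP)
iphone_full = 8064 * 6048                  # 48,771,072 pixels (~48MP)
s23_cropped = (16320 // 2) * (12240 // 2)  # crop 50% from each dimension
print(s23_cropped)  # 49,939,200 (~50MP): still more pixels than the
                    # iPhone's full-resolution capture
```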

Both phones handled Portrait mode well, though they offer different fields of view in this mode. The simulated background bokeh is smooth on both, with surprisingly realistic-looking transitions from in-focus to out-of-focus areas. Portrait mode can often introduce unwanted artifacts in the fine details around a subject’s edges, but both phones do a pretty good job here. The Galaxy S23 Ultra does a slightly better job, though, with smoother transitions and more detail in the subject.

The Galaxy S23 Ultra also stands out with close-up Macro photography. Even when its ultra-wide camera is nearly touching the subject, it is able to focus properly and capture fine detail.

The iPhone 14 Pro Max does a great job too if you don’t pixel peep, but zooming in shows more fine detail in the Samsung capture – you can actually make out individual bits of pollen, whereas on the iPhone the processing makes the internal part of the flower look like it has a rough, bumpy texture.

In terms of video recording, both devices are relatively full-featured, but the edge yet again goes to the Galaxy S23 Ultra. The iPhone 14 Pro Max tops out at 4K60 recording, while the Galaxy S23 Ultra can do 8K30. Slow-motion, normal, and time-lapse recording are available on both devices as well. In practice, I found the video quality on both devices to be quite good, though the Galaxy S23 Ultra seemed to handle sub-optimal, low-light conditions somewhat better.

Top camera performance continues to be paramount for flagship smartphone buyers, but whether you’re evaluating specifications or real-world results, it’s clear Qualcomm-powered Android devices like the Samsung Galaxy S23 Ultra are a step ahead of the competition. Independent bodies like DxOMark show most of the top smartphone cameras are backed by Qualcomm processors.


Vespera Smart Telescope review: specs, performance, cost



The Vaonis Vespera smart telescope makes it easier than ever to observe the night sky with your iPhone, but at a steep cost.

The Vaonis Vespera telescope brings a sense of automation to astrophotography as an app-controlled, easy-to-carry telescope that people can use without prior telescope experience. With endorsements from astronauts Terry Virts and Scott Kelly, it’s clear that even spacefarers see the future of astrophotography being shaped by software and robotics.

Out of the box, the telescope comes with a short adjustable tripod, a USB-C cable, and an adapter. The tripod legs can be screwed on, and the magnetic charger makes it easy to power the Vespera on the go with a power bank.

The Vespera weighs around 11 pounds and is small enough to fit in most backpacks and even a smaller crossbody bag. This makes carrying it out to a park or on a hike to observe the night sky easier than most other telescopes.

The Vespera is a snug fit in a medium-sized Jansport crossbody.

The Singularity app uses GPS to set where you’re observing, taking into account the Earth’s rotation and adjusting its autofocusing features to eliminate manual adjustments for the user. All you have to do is set up the Vespera on its tripod legs, open the app, and select what you want to observe.

The Vespera will open its telescope arm, swiveling and adjusting its angle for where to look in the sky.

Vaonis Vespera Smart Telescope – App and Use

Starting up the Vespera involves connecting to it via WiFi, and the initialization process can take around five to ten minutes to scan the sky for viable objects to observe.

With that said, we recommend you set up the Vespera in a large, open environment. A small yard with many trees is not ideal, while an open field can maximize the range of motion the telescope provides.

The Singularity app will tell you how long it’ll take to observe a certain object, and the estimate generally holds up to within five to ten minutes.

Depending on your surroundings, the initialization or observation can fail if there’s something blocking the telescope’s view, so some trial and error is needed to get a successful picture. Patience is required when using the Vespera since it can take a while to re-initialize and set up an observation again.

The battery is powerful enough to take on a long night of star-watching with a claimed eight hours of automation. On average, a thirty-minute observation consumed 5% of the battery, making the eight to ten-hour range plausible.
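
That estimate is easy to sanity-check, assuming the drain rate stays roughly linear (real batteries only approximate this):

```python
# 5% of the battery consumed over a 30-minute observation.
drain_per_minute = 0.05 / 30
print(1.0 / drain_per_minute / 60)  # 10.0 -> about ten hours on a full charge
```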

In our limited astrophotography experience, the Vespera captured crisp and satisfying images. Depending on the closeness of the celestial object, the brightness and clarity can vary.

For example, we found that the Ring Nebula was much clearer to capture than the Whirlpool Galaxy. Of course, quality may depend on the user’s environment as well.

The Ring Nebula captured on Vespera

Images can be exported at 1920×1080 resolution in a variety of formats, including JPG, TIFF, and FITS. Being able to easily save your results to your phone is appealing if you want to share them with others.

Two modes on the Vespera especially take advantage of its ability to tirelessly take photos: mosaic mode and “Plan my Night.” Mosaic mode captures multiple snapshots of the sky and assembles them, which can take longer than a usual observation.

“Plan my Night”, however, allows the user to use Vespera to observe different objects throughout the night ahead of time. This makes it possible to study the stars in your sleep if you leave the telescope on your lawn or during a camping trip.

The “Plan my Night” feature shown on iPad.

Since it is designed for automation and long stretches outdoors, the Vespera has an IP43 water resistance rating, so it can handle light splashes and rain.

We’d avoid too much exposure, though, and keep watch on the weather if you leave it outside overnight.

A step in the right direction for accessibility

The Vespera telescope has a lot of potential as a tool to make astrophotography more accessible. With applications in education and with space enthusiasts, the Vespera introduces the complexities of capturing celestial objects in an easy-to-digest way.

Following along using the Singularity app, users can learn and engage with the night sky to their heart’s content. The Vespera offers a wonderful way to embark on a personal hobby or share the experience with others.

Comparatively, a beginner telescope can range from $100 to $500, a fraction of the Vespera’s $1,499 cost. What you’re paying for is the automation and ease of use, as well as the small size and portability.

Still, it’s a worthwhile investment if you want to take the leap and lack experience in handling telescopes.

Vaonis Vespera Smart Telescope Pros

  • Highly portable
  • Singularity app simplifies setup and observation
  • Good battery life and water resistant
  • Different features to take advantage of automation

Vaonis Vespera Smart Telescope Cons

  • High cost may be a deterrent
  • Brightness and clarity of captured images can vary
  • Initialization and observation can fail suddenly depending on surroundings

Rating: 3.5 out of 5

Where to buy the Vaonis Vespera Smart Telescope

The Vaonis Vespera Smart Telescope is available on the Vaonis store for $1,499 (plus $90 US shipping).


San Francisco Proposal Photographer Offers New SF Locations And Photography Packages




New photography packages for proposals now available at San Francisco Proposal Photographer, providing beautiful, timeless photos to cherish for a lifetime.

SAN FRANCISCO, CA, UNITED STATES, April 30, 2023 — San Francisco Proposal Photographer is excited to announce the launch of their new proposal photography packages and locations, which aim to capture life’s intimate moments with skilled discretion and artistry. The studio has added several enchanting locations for proposals in San Francisco to its list, including Golden Gate Bridge, Baker Beach, Russian Hill, Ina Coolbrith Park, Sutro Baths, Fairmont Rooftop, Japanese Tea Gardens, Lovers Lane, Pier 14, and Muir Woods.


The studio provides four different proposal packages, each of which is designed to capture the unique essence of the moment and offer beautiful and timeless photos to cherish for a lifetime. Package A includes a smartphone shoot with all photos delivered to the client, while Package B provides professional camera coverage of the proposal. Package C includes everything in Package B along with a 10-15 minute photo session, and Package D offers a 45-minute photo shoot.

“We are thrilled to be able to offer our clients the opportunity to capture their special moment in some of the most beautiful locations in San Francisco,” said the owner of the studio. “We strive to provide our clients with the highest quality proposal photography services, and our new packages and locations are sure to make their proposal even more memorable.”

San Francisco Proposal Photographer understands the significance of a proposal and strives to provide couples with the perfect setting and the ideal photos to commemorate their special day. In addition, the studio encourages clients to come up with their own surprise proposal ideas and unique ways to make their special moment even more memorable.

A recent customer commented: “It was a pleasure working with this team. I organized a proposal shoot with them and they were discreet, punctual, and enthusiastic. The pictures came out great and my fiance had no idea it was coming. Highly recommend them!”

For more information about the studio’s photography packages and locations, visit the San Francisco Proposal Photographer website.

###

San Francisco Proposal Photographer
2150 Post St, San Francisco, CA 94115
(415) 812-5014

Eric Smith
San Francisco Proposal Photographer
+1 415-812-5014
email us here





Future Apple Watch could get cameras for photography & Face ID


Apple’s proposal would be less bulky than the existing third-party Wristcam




Apple has big plans for cameras in future Apple Watches, if they can be fitted without making the watch awkward to wear — and if the cameras can be of high enough quality.

The popularity of the Apple Watch Ultra has shown that people are willing to wear bulkier devices if there is a clear benefit to them. In the case of the Apple Watch Ultra, that benefit includes a greatly extended battery life, for instance.

Future Apple Watches may also become at least a little larger, as Apple is again looking at ways to incorporate a camera. Previously, Apple was granted a patent for including a camera in the Apple Watch’s Digital Crown, but that placement would have very limited use.

Now, in a newly revealed patent application, Apple proposes that a Watch could fit a camera into a slight protrusion toward the top of its chassis, above the display. The application, “Wearable Electronic Device Having A Digital Camera Assembly,” places the camera similarly to the existing Wristcam, but that product is a whole Watch band.

Apple isn’t keen on taking up a whole band, or anything that makes the Apple Watch cumbersome.

“While certain electrical components, such as a camera, may perform desirable functions,” says the patent application, “the integration of such components may result in a bulky device which may hinder user performance, may be uncomfortable to wear, or may be unsuited for performing certain functions (e.g., a camera disposed within a wearable electronic device may be awkward to position when capturing optical input).”

Apple is also dismissive of compromises to fit the camera into a Watch band. “Additionally, low-quality components may not meet a user’s quality expectations,” it says, “(e.g., a low-quality camera may produce low-quality images).”

It’s not just that Apple wants you to look nice in photographs. Apple specifically wants a video camera capable of up to 4K at 60 frames per second, or a still camera of up to 12MP.

Either a spacecraft, or an Apple Watch side view with a camera protrusion to the right

That’s because this is about more than photographing wildlife or capturing the crowd at Little League. This camera is meant for more than any regular photography or video.

“The digital camera assembly may be used for a variety of purposes,” continues Apple, “including, as non-limiting examples, facial identification, fingerprint sensing, scanning a Quick Response (QR) code, video conferencing, biometric monitoring (e.g., heart rate monitoring), photography, video or image capture, or any combination thereof.”

So with a camera on your Apple Watch, you could unlock all of your Apple devices through Face ID.

The Watch could also use its camera to “capture movement of a user’s body or other objects during certain activities.” Using visual inertial odometry (VIO), “the camera can be used to obtain a high degree of motion sensing accuracy, which may be used to monitor, detect, and/or predict a user’s motion or gesture based on certain characteristics.”

That’s a lot to demand of a camera, and Apple is not expecting to be able to fit one under the screen of an Apple Watch. Instead, it will sit on what Apple calls a protrusion, and much of the patent application is about how to add one without making the Watch distracting to wear.

“[A] digital camera assembly may be integrated into the wearable electronic device in a way so as to minimize an effect of the digital camera assembly on other electronic components and/or a form factor of the wearable electronic device,” says Apple.

“For example, in implementations where a digital camera assembly is positioned within an internal cavity (e.g., camera cavity) of a protrusion,” it continues, “the digital camera assembly may extend from the housing, over a band slot, and away from a display, a battery, a circuit assembly, or sensors of the wearable electronic device.”

“Likewise, the protrusion may be shaped to avoid interfering with geometry of the band slot,” says the patent application, “so that a band/strap may still be permitted to couple with the housing of the wearable electronic device.”

The patent application is credited to five inventors, including Christopher M. Warner, whose previous work includes muscle-sensing Apple Watch bands.


iPhone vs Android: Two different photography and machine learning approaches


Apple’s computational photography aims for realism




A controversy with Samsung’s phone cameras has renewed the conversation surrounding computational photography, and highlights the difference between it, and Apple’s approach in iOS.

It isn’t a big secret that Apple relies upon advanced algorithms and computational photography for nearly all of its iPhone camera features. However, users are beginning to ask where to draw the line between these algorithms and something more intrusive, like post-capture pixel alteration.

In this piece, we will examine the controversy surrounding Samsung’s moon photos, how the company addresses computational photography, and what this means for Apple and its competitors going forward.

Computational photography

Computational photography isn’t a new concept. It became necessary as people wanted more performance from their tiny smartphone cameras.

The basic idea is that computers can perform billions of operations in a moment, like after a camera shutter press, to replace the need for basic edits or apply more advanced corrections. The more we can program the computer to do after the shutter press, the better the photo can be.

This started with Apple’s dual camera system on the iPhone 7 Plus. Other photographic innovations before then, like Live Photos, could be considered computational photography, but Portrait Mode was the turning point for Apple.

Apple introduced Portrait Mode in 2016, which took depth data from the two cameras on the iPhone 7 Plus to create an artificial bokeh. The company claimed it was possible thanks to the dual camera system and advanced image signal processor, which conducted 100 billion operations per photo.
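
For a rough sense of how depth data turns into artificial bokeh, here is a simplified Python sketch. The blur strength and subject threshold are arbitrary stand-ins; the real pipeline estimates per-pixel disparity and applies lens-shaped, depth-varying blur rather than a single Gaussian:

```python
# A simplified sketch of depth-driven background blur: blur everything,
# then restore the original pixels wherever the depth map marks the
# subject as near. Not Apple's algorithm, just the core idea.
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_bokeh(image: np.ndarray, depth: np.ndarray, near: float = 1.5) -> np.ndarray:
    """image: HxWx3 float RGB; depth: HxW distances in meters."""
    blurred = gaussian_filter(image, sigma=(6, 6, 0))  # no blur across channels
    subject = (depth < near)[..., None]                # broadcast mask over RGB
    return np.where(subject, image, blurred)
```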

Needless to say, this wasn’t perfect, but it was a step into the future of photography. Camera technology would continue to adapt to the smartphone form factor, chips would get faster, and image sensors would get more powerful per square inch.

Portrait mode uses computational photography to separate the foreground

In 2023, it isn’t unheard of to shoot cinematically blurred video using advanced computation engines with mixed results. Computational photography is everywhere, from the Photonic Engine to Photographic Styles — an algorithm processes every photo taken on iPhone. Yes, even ProRAW.

This was all necessitated by people’s desire to capture their life with the device they had on hand — their iPhone. Dedicated cameras have physics on their side with large sensors and giant lenses, but the average person doesn’t want to spend hundreds or thousands of dollars on a dedicated rig.

So, computational photography has stepped in to enhance what smartphones’ tiny sensors can do. Advanced algorithms built on large databases inform the image signal processor how to capture the ideal image, process noise, and expose a subject.

However, there is a big difference between using computational photography to enhance the camera’s capabilities and altering an image based on data that the sensor never captured.

Samsung’s moonshot

To be clear: Apple is using machine learning models (or “AI” for those using the poorly coined popular buzzword) for computational photography. The algorithms provide information about controlling multi-image captures to produce the best results or create depth-of-field profiles.

The image processor analyzes skin tone, skies, plants, pets, and more to provide proper coloration and exposure, not pixel replacement. It isn’t looking for objects, like the moon, to provide specific enhancements based on information outside of the camera sensor.

We’re pointing this out because those debating Samsung’s moon photos have used Apple’s computational photography as an example of how other companies perform these photographic alterations. That simply isn’t the case.

Samsung’s moon algorithm in action. Credit: u/ibreakphotos on Reddit

Samsung has documented how its phones, going back to the Galaxy S10, have processed images using object recognition and alteration. The Scene Optimizer began recognizing the moon with the Galaxy S21.

As the recently-published document describes, “AI” recognizes the moon through learned data, and the detail improvement engine function is applied to make the photo clearer with multi-frame synthesis and machine learning.

Basically, Samsung devices will recognize an unobscured moon and then use other high-resolution images and data about the moon to synthesize a better output. The result isn’t an image captured by the device’s camera but something new and fabricated.

Overall, this system is clever because the moon looks the same no matter where on Earth it is viewed from. The only things that change are the color of the light reflected from its surface and the phase of the moon itself. Enhancing the moon in a photo will always be a straightforward calculation.

Both Samsung and Apple devices take a multi-photo exposure for advanced computations. Both analyze multiple captured images for the best portion of each and fuse them into one superior image. However, Samsung adds an additional step for recognized objects like the moon, which introduces new data from other high-resolution moon images to correct the moon in the final captured image.
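
Mechanically, that shared fusion step can be approximated with a simple robust stack. The median below is a stand-in for illustration, not either company’s actual pipeline, and it assumes the frames have already been aligned:

```python
# A minimal stand-in for multi-frame fusion: median-stack aligned frames
# to suppress noise and transient outliers. Real pipelines register the
# frames first and fuse per-tile with learned weights.
import numpy as np

def fuse(frames: list[np.ndarray]) -> np.ndarray:
    return np.median(np.stack(frames, axis=0), axis=0)

# Ten noisy captures of the same scene fuse into one cleaner result.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, size=(32, 32))
frames = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(10)]
print(float(np.abs(fuse(frames) - scene).mean()))  # well below the 0.1 noise level
```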

Samsung’s moon algorithm explained. Credit: Samsung

This isn’t necessarily a bad thing. It just isn’t something Samsung has made clear in its advertising or product marketing, which may lead to customer confusion.

The problem with this process, and the reason a debate exists, is how this affects the future of photography.

Long story short, the final image doesn’t represent what the sensor detected and the algorithm processed. It represents an idealized version of what might be possible but isn’t because the camera sensor and lens are too small.

The impending battle for realism

From our point of view, the key tenet of iPhone photography has always been realism and accuracy. If there is a perfect middle in saturation, sharpness, and exposure, Apple has trended close to center over the past decade, even if it hasn’t always remained perfectly consistent.

We acknowledge that photography is incredibly subjective, but it seems that Android photography, namely Samsung, has leaned away from realism. Again, not necessarily a negative, but an opinionated choice made by Samsung that customers have to address.

For the purposes of this discussion, Samsung and Pixel devices have slowly tilted away from that ideal of a realistic, representational center. They are vying for more saturation, sharpness, or day-like exposure at night.

The example above shows how the Galaxy S22 Ultra favored more exposure and saturation, which led to a loss of detail. Innocent and opinionated choices, but the iPhone 13 Pro, in this case, goes home with a more detailed photo that can be edited later.

This difference in how photos are captured is set in the opinionated algorithms used by each device. As these algorithms advance, future photography decisions could lead to more opinionated choices that cannot be reversed later.

For example, by changing how the moon appears using advanced algorithms without alerting the user, that image is forever altered to fit what Samsung thinks is ideal. Sure, users who know about the feature can turn it off, but most won’t.

We’re excited about the future of photography, but as photography enthusiasts, we hope it isn’t so invisible. Like Apple’s Portrait Mode, Live Photos, and other processing techniques, it should be opt-in with obvious toggles. It should also be reversible.

Tapping the shutter in a device’s main camera app should take a representative photo of what the sensor sees. If the user wants more, let them choose to add it via toggles beforehand or edits afterward.

For now, try taking photos of the night sky with nothing but your iPhone and a tripod. It works.

Why this matters

It is important to stress that there isn’t any problem with replacing the ugly glowing ball in the sky with a proper moon, nor is there a problem with removing people or garbage (or garbage people) from a photo. However, it needs to be a controllable, toggle-able, and visible process to the user.
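
For the sake of illustration, here is one plausible non-destructive design, sketched in Python, that would satisfy all three requirements. The class and function names are ours, not any vendor’s actual API: the sensor-accurate frame is always kept, enhancements are explicit and recorded, and reverting is trivial.

```python
from dataclasses import dataclass, field

@dataclass
class CapturedPhoto:
    sensor_image: bytes                   # what the sensor actually recorded
    applied: list[str] = field(default_factory=list)  # visible edit history
    enhanced_image: bytes | None = None   # derived, never replaces the original

def apply_enhancement(photo, name, enhance_fn):
    """Opt-in: nothing happens unless the user asks for it by name."""
    source = photo.enhanced_image or photo.sensor_image
    photo.enhanced_image = enhance_fn(source)
    photo.applied.append(name)            # visible: the alteration is on record
    return photo

def revert(photo):
    """Reversible: dropping the derived image restores the real capture."""
    photo.enhanced_image = None
    photo.applied.clear()
    return photo
```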

Computational photography is the future, for better or worse

As algorithms advance, we will see more idealized and processed images from Android smartphones. The worst offenders will outright remove or replace objects without notice.

Apple will inevitably improve its on-device image processing and algorithms. But, based on how the company has approached photography so far, we expect it will do so with respect to the user’s desire for realism.

Tribalism in the tech community has always caused debates to break out among users. Those have included Mac or PC, iPhone or Android, and soon, real or ideal photos.

We hope Apple continues to choose realism and user control over photos going forward. Giving a company complete opinionated control over what the user captures in a camera, down to altering images to match an ideal, doesn’t seem like a future we want to be a part of.

The Galaxy S23 Ultra Is Great, but Would Be Better if It Folded

I struggle to look at any other Samsung smartphone now that I’ve been living with its foldables. The Samsung Galaxy Z Fold 4 has effectively changed how I use Android. Most of the time, I’ll only bother with my Google Pixel 7 if someone is calling the number linked to that phone. Otherwise, you’ll see me primarily on the foldable. It’s just so much more versatile for the life I lead.

That’s not to say I didn’t enjoy my time with the Samsung Galaxy S23 Ultra, but I missed the Fold while reviewing this one. Samsung’s ultimate new flagship device is everything you could want in a smartphone, but there is also a lot here that feels like overkill now that we’re in the second iteration of the Ultra and its stylus-wielding ways. In fact, I forgot to use the stylus until about two days ago (I don’t draw). And while four cameras are a great back-of-the-box brag, I still don’t understand how to push them to the extent they’ve been marketed as being capable of, and I realize I probably never will. And I like high-spec phones!

Regardless, the Ultra still has plenty going for it, including a better design than the last generation. Those rear-facing cameras may not be enough to justify the price for casual users, but their post-processing algorithms are just as good as Google’s—better in some cases. The Ultra even has a few features I think foldables are still missing—like that stowable stylus.

But when it comes to targeting genuine innovation as opposed to niche specialty features, the Ultra might miss the mark compared to both the competition and Samsung’s other phones.

The best Ultra yet

If you like big phones, you’ll love the Galaxy S23 Ultra (I don’t—it’s not foldable). It has a 6.8-inch Dynamic AMOLED display, categorized as such because it’s based on tech that allows the display to dynamically change refresh rates without killing the battery. The jury is still out on how much battery that display tech saves, and I’ll get more into that when we talk about the battery rundown results later. Still, the display that Samsung has going here is like carrying a tiny version of its TVs in your pocket.

You might have gotten into the Galaxy line because you love Samsung’s displays. I can’t blame you. Like on the S22 Ultra, the screen on the S23 Ultra has a 1440p resolution with a 120Hz refresh rate. I love watching TV on this thing, even the 720p classics like Taxi and One Day at a Time. What I especially appreciate about Samsung is how low the brightness can go so that I can fall asleep to those shows at the end of the night without lighting up the room. Samsung enables the use of Android 12’s extra dim mode, and with that turned on, the phone doesn’t go any higher than about 350 nits—standard brightness is around 430 nits, climbing to a whopping 1,750 nits if you’re out in direct sunlight and using the adaptive brightness feature.

A photo of the Galaxy S23 Ultra

The Galaxy S23 Ultra houses an S Pen inside its chassis.
Photo: Florence Ion / Gizmodo

The best part about the new Galaxy S23 Ultra is that Samsung fixed some of what I didn’t like with the Galaxy S22 Ultra’s design. Mainly, it squared off the edges instead of rounding them, so it’s easy to cradle the phone one-handed. I finally felt confident that I wasn’t going to drop it. I’m glad Samsung stopped with the overtly rounded edges, which are also annoying to use when you’re tapping on the edge of the screen.

This is still a gigantic smartphone. I hope you have big hands if you plan to play games on this thing. My small hands and long claws had difficulty cradling the Ultra to play with on-screen controls in games like Dreamlight Valley through Xbox Game Pass, and my wrists got weary holding the phone to control my character in Riptide GP: Renegade. The first-gen Razer Kishi controller that I use for Android gaming also feels as if it’s stretched to capacity on this phone, as if the Galaxy S23 Ultra will pop out at any minute. Unless it’s a point-and-tap game, I use a Bluetooth controller to play games on the S23 Ultra. The OnePlus 11’s similarly sizeable 6.7-inch display, comparatively, feels less ginormous because it doesn’t have the Ultra’s squared-off corners and the chassis is narrower.

A photo of the Galaxy S23 Ultra

The Galaxy S23 Ultra is a big phone, make no mistake.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra utilizes an in-display fingerprint sensor and face unlock for added lock screen security. It’s best that Samsung didn’t carry over the power button fingerprint sensor like on the Z Fold 4, because I am constantly accidentally pressing that one and locking myself out of it. Scanning in a fingerprint or smiling at the Ultra felt fast and responsive unless I wore a mask or sunglasses.

The default storage on the S23 Ultra has thankfully been bumped up to 256GB. It starts there and goes all the way up to 1TB, if you can stomach paying for it (doing so will add $420 on top of the base storage’s cost). The Ultra is also IP68 rated for water and dust resistance.

Qualcomm with Samsung flavoring

Something to note about this year’s Galaxy S23 lineup is that it runs a unique flavor of the Qualcomm Snapdragon 8 Gen 2 processor. Rather than use the off-the-shelf version, Samsung infused some of its AI smarts to tune camera and performance algorithms to its liking. The company already does this to some effect with its Exynos chips overseas, and it’s bringing that expertise to the phones sold in the States to one-up Google’s homemade Tensor processor. Sometimes it works.

A photo of the Galaxy S23 Ultra

If you like to game, the S23 Ultra can serve. But its big size may not be easy for everyone to cradle.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra is available with 8GB or 12GB of RAM, which seems absurd. The Ultra should have 12GB of memory as the standard, since it’s technically the ultimate Samsung phone. Even with the 12GB of RAM, you can’t tell that the chip inside the Galaxy S23 Ultra is any beefier than what’s inside the similarly specced OnePlus 11. On paper, and in Geekbench 5 (which will be Geekbench 6 in our reviews going forward), the Galaxy S23 Ultra beat the OnePlus 11 by only about 300 points on the single-core score and 400 points on the multi-core one. But that proves little about whether Samsung’s tuned chip is faster or more capable than OnePlus’s vanilla one in actual use. Considering the Google Pixel 7 Pro is a laughingstock on the benchmark charts but not in real-world use (it scores about 400 points below the Galaxy S23 Ultra), it’s hard to use these benchmarks as the sole test of what’s possible. Anyway, neither of these Android devices can hold a candle to the numbers that Apple’s A16 Bionic spits out.

The upside to having such a powerful smartphone is that it can do everything: play games locally and from the cloud, create and edit documents, quickly export edited videos, process RAW photos, and chat with whoever. The Ultra can handle each of these cases with absolute ease, but that’s expected from a phone that I’ve been running for about three weeks. The real test for these devices is how they do after a year in the hand.

A photo of the Galaxy S23 Ultra

Riptide GP: Renegade runs smoothly on the S23 Ultra, but it’s too big to comfortably play for my hands.
Photo: Florence Ion / Gizmodo

I echo the sentiments of a few other reviews: the Galaxy S23 Ultra doesn’t get as hot as previous versions of the device or even other Android phones. I fell asleep next to it a few nights in a row while it was charging and playing Pluto TV, and I didn’t feel the usual heat emanating as the battery fueled up for the next day. It did get toasty once while I was mindlessly scrolling through TikTok (as I often do), and it was significant enough that I remember saying, “I should probably mention this in the review.”

Apple’s iPhone 14 Pro Max lasts longer

I’m sorry to include Apple in the subhead of a Samsung Galaxy review. But I remain impressed by the battery test on Apple’s latest flagship, and it’s now the benchmark for every other flagship phone review.

Samsung’s 5,000mAh battery is enormous, and it’s the same capacity as in last year’s Ultra. Whatever Samsung did on the back end to extend battery life has worked thus far—the S23 Ultra beat out the S22 Ultra by about two hours, lasting 18 hours and 33 minutes. But that’s nothing compared to the nearly 24-hour battery life of Apple’s large iPhone 14 Pro Max. I want some of whatever magic Apple has going on with its software to come to Android land.

These results translated to daily use, too. As I mentioned, I’m a TikTok freak, and I was surprised to see that the Ultra chewed through only 23% of its battery in five hours of mixed use, which included tuning into my Disney streamer.

Move over, Pixel camera

A photo of the Galaxy S23 Ultra

The back of the Galaxy S23 Ultra houses four camera sensors.
Photo: Florence Ion / Gizmodo

Because the Galaxy S23 Ultra is dubbed the “ultimate” model, its cameras are appropriately extreme. They’re also the key upgrade here, and they took up the majority of Samsung’s announcement event for this phone. The primary camera is a 200-MP wide-angle shooter with optical image stabilization (OIS) and an f/1.7 aperture. The ultra-wide camera is a 12-MP sensor with an f/2.2 aperture. The two telephoto lenses on the back also have OIS; one has an f/2.4 aperture with a maximum 3x optical zoom, and the other is f/4.9 with a 10x optical zoom. The maximum digital zoom is 100x, just like on the S22 Ultra.

A photo taken with the Galaxy S23 Ultra

An unedited photo shot at dusk with the Galaxy S23 Ultra.
Photo: Florence Ion / Gizmodo

Whenever someone outside of the Android bubble realizes the Galaxy S23 Ultra has four cameras on the back, they often ask me, “why?” The answer is so it has camera lenses for every foreseeable situation. For instance, if you’re chasing your kid around the park, you want that quick 3x optical zoom to capture them in the frame and up close. The result is a background bokeh effect that helps make the image instantly shareable on Instagram without using Portrait mode. Or if you happen to be lying down at the park, only to hear the roar of a jet engine approaching overhead, you can use the 10x optical zoom to get a closer look and maybe even post it to TikTok. For epic sky days, when the clouds seem to be cruising through as if they’re fresh cotton candy spun right out of the bin, the ultra wide-angle camera helps increase the drama when shared in your secret Slack channel of friends obsessed with sunsets.

Photo samples from the Galaxy S23 Ultra

Even with two dedicated telephoto cameras, it’s not always the best at zooming in to see what’s yonder.
Photo: Florence Ion / Gizmodo

Nowadays, most smartphone cameras are capable of everything I just described, but Samsung touts higher resolution and greater color and distance detail. These are the cameras we have on us every day, and Samsung argues that these are the digital memories we’ll be pulling from as we struggle to remember our lives someday in the future.

That’s not to say that every photo the Galaxy S23 Ultra produces is perfect. Zooming past the 10x optical limit requires praying that the image won’t come out jaggy or over-sharpened. There were so many instances on the evening of my daughter’s third birthday where the pictures of her punching around a balloon came out blurry—a real bummer as I was trying to find a cute one to share in group chats. I also tried staying up one night to capture the Air Force flying their planes in the sky above, and I couldn’t produce anything worth sharing.

A photo taken with the Galaxy S23 Ultra

Another unedited photo shot with the Galaxy S23 Ultra.
Photo: Florence Ion / Gizmodo

As it stands, the 200-MP sensor on the Galaxy S23 Ultra isn’t shooting at its full resolution at all times. Like most flagship smartphones, including the iPhone 14 Pro and Google Pixel 7, Samsung uses pixel binning by default: the camera combines every 16 pixels into one, so the phone shoots roughly 12-MP images. The result is brighter photos with better detail throughout. I preferred the 12-MP images worked over by the algorithm to the full-resolution 200-MP ones, which usually require some post-editing anyway. I want to avoid editing a photo when I’m just trying to share it on social media.
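
For the curious, the idea behind 16-to-1 binning is simple: average each 4x4 block of sensor pixels into one output pixel, trading resolution for light gathering and lower noise. The Python sketch below assumes a plain grayscale array; a real sensor bins within its color filter array, which is considerably more involved.

```python
import numpy as np

def bin_pixels(sensor, factor=4):
    """Average each factor x factor block into a single output pixel."""
    h, w = sensor.shape
    h, w = h - h % factor, w - w % factor      # crop to a clean multiple
    blocks = sensor[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

frame = np.random.rand(1632, 1224)   # small stand-in; a 200-MP frame is ~16,320 x 12,240
binned = bin_pixels(frame)           # (408, 306): 16x fewer, but brighter, pixels
```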

You can see more clearly how the Galaxy S23 Ultra’s post-processing stacks up against the iPhone 14 Pro Max and Pixel 7 Pro in the slideshow I put together here. For the most part, I found Samsung’s algorithms veer toward saturation, though they were impressive at tempering the final product to maintain detail where it mattered. The most obvious example is a photo I shot of the Santa Ynez Mountains in Santa Barbara; the S23 Ultra held on to the subtle detail of the sunset, lighting up the ridges without over-contrasting them.

I wrote more about Expert RAW in the other piece, including Samsung’s improved astrophotography feature. I wish Samsung had shipped this feature on its own rather than burying it inside another download that has to be enabled in the camera app before anyone knows it’s even there. Samsung includes all these unique camera features as if we’re supposed to know how to use them right out of the box. But as with the improved nighttime video recording capabilities teased during the Ultra’s debut at Galaxy Unpacked earlier this month, I had no idea where to start. Just because a smartphone can do all these fancy things doesn’t mean the general population will aspire to use them. And after ten years of reviewing smartphones, I might also give up.

That’s a big problem, as the camera system here is a major selling point and a major justification for the price tag. Compare that to Apple, which, because it makes both the iPhone and iOS, is able to bundle its phones with tons of everyday usability conveniences.

A photo of the Galaxy S23 Ultra

The Galaxy S23 Ultra offers so many different camera modes that I’m often too overwhelmed to play with any.
Photo: Florence Ion / Gizmodo

Before we move on from the cameras, there are a few other things to note: video recording on this smartphone is aces, even without a tripod. But for stability’s sake, I’ve been propping the Ultra up on a handheld tripod and following my kid around at 60 fps. The video is so smooth! The Ultra maxes out at 30 frames per second in 8K resolution for video recording, and there’s a Pro Video mode if you’re comfortable with tweaking camera settings. The front-facing camera is a 12-MP sensor with an f/2.2 aperture; annoyingly, it doesn’t zoom in or out.

Does a smartphone need a stylus?

A photo of the Galaxy S23 Ultra

The S Pen has always been a nice-to-have, but it feels more fitting for a tablet-style device.
Photo: Florence Ion / Gizmodo

Samsung’s S Pen has been around for a long while. It’s as iconic as Paris Hilton’s chihuahuas in the 2000s (RIP to them all). Last year’s Ultra was the first time it appeared in the regular Galaxy lineup after the sunsetting of the Galaxy Note series of yore. But functionally, it’s similar to what the S Pen could do before it. You can pop it out for drawing and cropping when the situation on screen calls for it—accommodating business people doing precise things, like needing to move a cursor within a document or having to sign off on a contract while in line somewhere. But I’m starting to realize this screen is too limited for anything art-driven. Granted, I’m not an artist, but if I imagine myself as a college student (again), the S Pen would feel much more appropriate docked inside a gadget like the Z Fold 4, which can open up into a larger display that’s fit for highlighting and making digital notes. That’s a form factor that lends itself to a stylus rather than the cramped screen on the S23 Ultra.

The other problem with the S Pen is that it requires its own space inside the chassis to dock. That’s part of the tradeoff for a phone that’s slightly too big for your pocket and that strains gaming controllers. As much as the S Pen is an iconic tool, I don’t know that it belongs on a smartphone anymore, even if you can use it as a Bluetooth controller.

Samsung’s version of Android

A photo of the Galaxy S23 Ultra

Samsung’s software is fine, but often it doubles up on Google’s offerings.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra ships with One UI 5, based on the latest version of Android 13. The One UI 5.1 update is the one everyone’s waiting for right now, since it includes features like Bixby Text Call, which works similarly to the Pixel’s Call Screen. The feature is now live in English (it was available only in Korea until now), but I couldn’t get it to work during my testing period. I hope to revisit it and some of Bixby’s other features later, as I’m curious to understand the benefits of sticking with Bixby over the tried-and-true (even if sometimes frustrating) Google Assistant.

I don’t mind Samsung’s version of Android, especially not since adopting the foldable. I’ve realized it comes with the benefit of Samsung tailoring what Google provides to suit its devices, even if it bears little resemblance to Android’s interface framework, Material You. Samsung offers some neat integration with Microsoft’s Your Phone app on Windows PCs that goes beyond the default experience, including the ability to remotely control your device from the desktop. There’s also the ability to snap a photo in Expert RAW and have it immediately populate in Adobe Lightroom. These abilities are nice to have, but as with the Galaxy S22 Ultra last year, I hardly ever considered using them after the review period was over. They’re not a reason to go out and buy a phone.

Still too much phone

A photo of the Galaxy S23 Ultra

I hope you enjoyed reading this entire review only to have me tell you to buy a foldable.
Photo: Florence Ion / Gizmodo

I know there are people out there salivating over the Galaxy S23 Ultra. They want the best that Samsung has to offer in its lineup, whether it’s for bragging rights or because they want all those lenses and this is the only camera they’ll own. I get all that, but I still think the Ultra is a bit of overkill in a market where we’re all screaming for a deal. There are still two other models of the Galaxy S23 that I have yet to review, and though they’re smaller devices with slightly different chassis, they more or less deliver the same Samsung experience across the board for less. They’re priced a little over the Pixel 7 lineup, starting at $800 and $1,000 for the S23 and S23+, respectively.

If you’re going to spend a starting price of $1,200 on any Android smartphone, I’m pleading with you to get a foldable instead. Yes, it’s a new kind of form factor with dubious longevity, but it’s not going away any time soon. For many, even those who want the best, camera fidelity will reach a point of diminishing returns. But a foldable drastically changes every user’s experience. There is more competition cropping up overseas, and the rumor mill is getting louder as more manufacturers hop on board this new smartphone fad. At the very least, if you’re spending a whopping amount of money on a smartphone, get something that’s a bona fide phone and a tablet for the price.

Better and better cameras are perhaps no longer what each new generation of phone should be targeting.

Galaxy S23 Ultra Camera Tested vs Pixel 7 Pro, iPhone 14 Pro Max

A photo of the Galaxy S23 Ultra

Like last year’s release, the Galaxy S23 Ultra has four cameras on the back (one of those holes is just a sensor).
Photo: Florence Ion / Gizmodo

I’m having a ton of fun with the Samsung Galaxy S23 Ultra. There’s still more to explore before I have a full review ready—I still need to test its purported nighttime video-taking prowess, plus I’m still learning how to get the most out of the Expert RAW mode in the Samsung camera app. But until then, I figured I’d share what I’ve noticed so far about the $1,200 phone’s camera compared to those on other popular flagship smartphones I have floating around, like the Pixel and iPhone.

How do the Galaxy S23 Ultra’s shots compare to those from the Google Pixel 7 Pro and Apple iPhone 14 Pro Max? So far, so good. It feels like Samsung fixed some of its zooming algorithms for more precise shots, but I still need to perform side-by-side testing against last year’s S22 Ultra. Samsung’s photos are still more saturated than what Google and Apple produce, but the results are less noisy than they used to be. The company has also toned down the sharpening on its photos and figured out how to balance stark sunlight coming in from the side. However, the Ultra is still no match for the Pixel when it comes to nighttime shooting. Spoiler: the Pixel 7 Pro is still better at shooting photos of stars, at least until I become an Expert RAW power user.
