From Dramatic Locations To Dreamy Photo Sets, Here Are The New Trends In Wedding Photography

(MENAFN- Khaleej Times) Published: Thu 14 Sep 2023, 8:16 PM

Last updated: Thu 14 Sep 2023, 8:24 PM

It is said that the cake gets eaten and the flowers wither, but wedding photos last forever. Wedding pictures are a perfect way to preserve what are arguably the best moments of one’s life, and their value only increases with time. “Wedding photography is an art that stands as a testament to love; it transcends time. Cherished for generations, it is all about encapsulating emotions, not merely documenting pictures,” says Shrey Bhagat, founder and creative director, Raabta.

Undoubtedly, one of the most important aspects of a wedding is photography, which is getting bigger and better by the day. A multi-billion-dollar market that is growing exponentially, wedding photography has evolved tremendously over the years. From dramatic locations and dreamy photo sets to stunning imagery and visual storytelling, the trends in wedding photography are eclectic and exciting. Here is a look at some of the latest trends in this special genre of photography.

Creative, candid and collaborative

Gone are the days when wedding photography was a strictly formal affair, all awkward posing and forced smiles before the camera. “Earlier, there was no real connection between a photographer and the couple, but now, couples and the photographer are a team, and all decisions with respect to the approach and style of pictures are taken mutually.

It is all about capturing real, candid moments that tell the unique story of the wedding day,” says Himanshu Patel, founder, Epic Stories. Couples are closely involved in the creative process, and the photography team spends hours listening to each couple’s story to understand the best way to capture the essence of their wedding. “Some couples even opt for a customised song dedicated to them, and this adds a whole new dimension to storytelling with wedding photography,” adds Bhagat.

With the help of new technology and changing preferences, wedding photography has become more creative and dynamic. Photographers today are more focussed on capturing nuances, subtle expressions and the feelings shared between the couple. Whether it is casual interactions or intricate details, it is all about embracing the authentic, and even the imperfect, to capture the true spirit of the occasion. “Couples love how this gives them an output that is up close and personal. The beauty of these frames is that they are full of raw emotions, with an overlapping focus on people,” quips Bhagat. As a result, conventional wedding albums have transformed into beautiful fairy tales that encapsulate the narrative of each couple’s journey.

Latest trends for the Midas touch

Apart from candid images, drone shots have become a rage in recent times. With elaborate décor themes spanning royal, traditional, tropical, vintage and even boho-chic, drone shots are used to capture these details beautifully. Most weddings revolve around rituals and elaborate ceremonies, and drone shots lend more gravitas to such pictures.

“Cinematic-style photography is the trend these days, with a focus on candid videography. We work with the couple months in advance to craft the complete narrative, right from pre-wedding shoots and save-the-date creatives to short, quick teasers with about 10-20 edited wedding pictures or a short film released on the same day for the couple to share on social media,” says Palani A, founder, S.A. Digital Studio, Bengaluru. He adds that editing is as challenging as the actual shooting and that the desaturated editing style is gaining popularity. This technique creates images that are artistically minimalistic, with softer tones and an earthy, natural feel.

A number of couples are adopting a vintage, nostalgic aesthetic through black-and-white images and even hazy images with an ethereal quality. Most photographers use slower shutter speeds to capture these soulful images. “We are also returning to film to give photos a classic, old-fashioned feel. These dreamy pictures add an extra touch of magic and are akin to viewing the photo through a soft, gentle filter,” adds Patel. Such pictures often feel as though they carry a memory right in the frame. “Controlled blurriness adds cinematic emotion and movement, focusing on moments and gestures,” says Jayant Chhabra, founder, Cupcake Productions.

Further, first-look images are a huge trend, as they capture one of the purest moments of a wedding and seize that feeling forever in a perfect photograph. They also capture the emotional reactions of parents seeing their children in wedding attire for the first time. Bridal-room chronicles, with elaborate shots of the trousseau, the jewellery and the whole journey of the bride getting ready, are in vogue, as is including your pets on your special day.

Tantalising techniques

Other trends include groomsmen shoots, veil shots and the use of GIFs. While bridal veil images exude an enigmatic and timeless appeal, pictures of the groom having fun with his groomsmen allow their personalities to shine through. After all, why should brides have all the fun? “GIFs are likely to be quite popular in the coming days. Imagine capturing a small moment from the wedding and turning it into a moving picture. It is like freezing a happy dance or a cute smile and making it come alive. These GIFs are like mini stories that convey the feeling of the moment in a fun way, in a jiffy,” says Patel.
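
For photographers who want to experiment with the format, assembling a burst of frames into a looping GIF is straightforward. Below is a minimal sketch using the Pillow library in Python; the burst filenames are hypothetical, not from any studio mentioned here.

  # Assemble a burst of wedding shots into a looping GIF with Pillow.
  from PIL import Image

  frames = [Image.open(f"burst_{i:02d}.jpg") for i in range(12)]  # hypothetical burst files
  frames[0].save("moment.gif", save_all=True, append_images=frames[1:],
                 duration=80, loop=0)  # 80 ms per frame, loop forever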

Double-exposure photographs, underwater portraits and backlit silhouettes are other techniques photographers adopt to lend images a unique, distinctive vibe. Artistic compositions that make strategic use of reflective surfaces like mirrors help create exotic, unusual images, as does the use of smoke bombs, which sets the perfect ambience for moments like the couple’s first dance. Post-wedding shoots are also extremely popular.

Like all things, change is the only constant in the field of wedding photography too. Photographers spend a lot of time researching new styles and technology to stay relevant. “Keeping up with the trends in wedding photography requires a proactive, adaptive and continuous-learning approach. Attending meets held by photography clubs, as well as workshops and conferences, can help you gain valuable insights. Further, it is imperative to keep your camera equipment, lighting gear, lenses and editing software up to date,” concludes Chhabra.

Vespera Smart Telescope review: specs, performance, cost

The Vaonis Vespera smart telescope makes it easier than ever to observe the night sky with your iPhone, but at a steep cost.

The Vaonis Vespera telescope brings a sense of automation to astrophotography as an app-controlled, easy-to-carry telescope that people can use without prior telescope experience. With endorsements from astronauts Terry Virts and Scott Kelly, it is clear that even seasoned spacefarers see the future of astrophotography being shaped by software and robotics.

Out of the box, the telescope comes with a short adjustable tripod, a USB-C cable, and an adapter. The tripod legs can be screwed on, and the magnetic charger makes it easy to power the Vespera on the go with a power bank.

The Vespera weighs around 11 pounds and is small enough to fit in most backpacks and even a smaller crossbody bag. This makes carrying it out to a park or on a hike to observe the night sky easier than most other telescopes.

The Vespera is a snug fit in a medium-sized Jansport crossbody.

The Singularity app uses GPS to determine where you’re observing from, taking the Earth’s rotation into account and handling focusing automatically to eliminate manual adjustments. All you have to do is set up the Vespera on its tripod legs, open the app, and select what you want to observe.

The Vespera will open its telescope arm, swiveling and adjusting its angle for where to look in the sky.
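
For the curious, the underlying pointing problem is well understood: given a GPS position and the current time, a target's fixed sky coordinates can be converted into the altitude and azimuth a telescope arm should swivel to. Here is a rough sketch of that math using the astropy library in Python; it only illustrates the calculation, not Vaonis's actual code, and the observer location is a made-up GPS fix.

  # Convert a target's RA/Dec into the alt/az a telescope should point at.
  import astropy.units as u
  from astropy.coordinates import AltAz, EarthLocation, SkyCoord
  from astropy.time import Time

  target = SkyCoord.from_name("M57")  # the Ring Nebula (name resolved online)
  observer = EarthLocation(lat=37.77 * u.deg, lon=-122.42 * u.deg)  # hypothetical GPS fix
  altaz = target.transform_to(AltAz(obstime=Time.now(), location=observer))
  print(f"altitude {altaz.alt:.1f}, azimuth {altaz.az:.1f}")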

Vaonis Vespera Smart Telescope – App and Use

Starting up the Vespera involves connecting to it via WiFi, and the initialization process can take around five to ten minutes to scan the sky for viable objects to observe.

With that said, we recommend you set up the Vespera in a large, open environment. A small yard with many trees is not ideal, while an open field can maximize the range of motion the telescope provides.

The Singularity app will tell you how long it’ll take to observe a given object, and in our experience the estimates held up to within five to ten minutes.

Depending on your surroundings, the initialization or observation can fail if there’s something blocking the telescope’s view, so some trial and error is needed to get a successful picture. Patience is required when using the Vespera since it can take a while to re-initialize and set up an observation again.

The battery is powerful enough to take on a long night of star-watching, with a claimed eight hours of automated operation. On average, a thirty-minute observation consumed 5% of the battery, which makes an eight-to-ten-hour range plausible.
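
As a back-of-the-envelope check on that figure, using our measured drain:

  # Estimate total runtime from measured drain: 5% per 30-minute session.
  sessions_per_charge = 1.0 / 0.05       # 20 sessions on a full battery
  hours = sessions_per_charge * 30 / 60  # 20 sessions x 0.5 h = 10.0 hours
  print(hours)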

In our limited astrophotography experience, the Vespera captured crisp and satisfying images. Depending on the closeness of the celestial object, the brightness and clarity can vary.

For example, we found that the Ring Nebula was much clearer to capture than the Whirlpool Galaxy. Of course, quality may depend on the user’s environment as well.

The Ring Nebula captured on Vespera

Images can be exported in a variety of formats at 1920×1080 resolution, including JPG, TIFF, and FITS. Being able to easily save your results to your phone is appealing if you want to share them with others.

Two modes on the Vespera especially take advantage of its ability to tirelessly take photos: mosaic mode and “Plan my Night.” Mosaic mode captures multiple snapshots of the sky and assembles them, which can take longer than a usual observation.
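
Image mosaicking itself is a standard technique: overlapping tiles are matched on shared features and warped into one larger frame. A rough sketch of the idea using OpenCV's built-in stitcher follows; Vaonis's own pipeline is proprietary, and the tile filenames here are hypothetical.

  # Stitch overlapping sky tiles into one mosaic with OpenCV.
  import cv2

  tiles = [cv2.imread(f"tile_{i}.jpg") for i in range(3)]  # hypothetical tile files
  stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)  # planar mode suits sky tiles
  status, mosaic = stitcher.stitch(tiles)
  if status == cv2.Stitcher_OK:
      cv2.imwrite("mosaic.jpg", mosaic)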

“Plan my Night,” however, lets the user schedule Vespera to observe different objects throughout the night ahead of time. This makes it possible to study the stars in your sleep, whether the telescope sits on your lawn or comes along on a camping trip.

The “Plan my Night” feature shown on iPad.

Since the Vespera is built to run automated sessions outdoors for long periods of time, it carries an IP43 water resistance rating, so light splashes and rain are tolerable.

We’d avoid too much exposure, though, and keep an eye on the weather in case you leave it outside overnight.

A step in the right direction for accessibility

The Vespera telescope has a lot of potential as a tool to make astrophotography more accessible. With applications in education and strong appeal for space enthusiasts, the Vespera introduces the complexities of capturing celestial objects in an easy-to-digest way.

Following along using the Singularity app, users can learn and engage with the night sky to their heart’s content. The Vespera offers a wonderful way to embark on a personal hobby or share the experience with others.

Comparatively, a beginner telescope can range from $100 to $500, a fraction of the Vespera’s $1,499 cost. What you are mainly paying for here is the automation and ease of use, as well as the small size and portability.

Still, it’s a worthwhile investment if you want to take the leap but lack experience in handling telescopes.

Vaonis Vespera Smart Telescope Pros

  • Highly portable
  • Singularity app simplifies setup and observation
  • Good battery life and water resistance
  • Features that make good use of automation

Vaonis Vespera Smart Telescope Cons

  • High cost may be a deterrent
  • Brightness and clarity of captured images can vary
  • Initialization and observation can fail suddenly depending on surroundings

Rating: 3.5 out of 5

Where to buy the Vaonis Vespera Smart Telescope

The Vaonis Vespera Smart Telescope is available on the Vaonis store for $1,499 (plus $90 US shipping).

Samsung’s Galaxy A54 Already Feels Outdated Now That the Pixel 7a Is Out

It’s a bit difficult to justify buying Samsung’s mid-tier Galaxy A54 when the Google Pixel 7a comes with the same processor as its flagship counterpart and has already proven to be a better camera phone. But the Galaxy A54 starts at $450 compared to the 7a’s $500, boasting some of its flagship sibling’s marquee features at a slightly lower price point than Google’s budget option. Those features include a high-resolution display, excellent battery life, and a design that I felt was more comfortable than the Pixel’s rectangular edges and stark camera bar. The sacrifices are Samsung flagship-level camera quality, wireless charging, and storage space options.

Samsung’s Galaxy A54 doesn’t look like a budget phone

The trick to selling a mid-range device is to make sure it doesn’t look cheap. That’s what Google did with its A-series and what Samsung has decided to do going forward. Last year’s Galaxy A53 was dull if you were buying it in black. But at least this year’s Galaxy A54 appears styled after its flagship counterpart.

The Galaxy A54 is styled after its flagship counterparts.
Photo: Florence Ion / Gizmodo

The phone looks less “mid-range” than last year’s A53, even in black. I especially love the periwinkle blue colorway on our review unit (it appears purple in Samsung’s official renders). It looks almost pearlescent in direct light, and it offers a chic aura that doesn’t immediately make you think this is a sub-$500 device. The A54’s build is durable, with tough glass on either side of the chassis, plus an IP67 water and dust resistance rating. There’s still some plastic on the outside frame, but you can barely tell at first glance.

The Galaxy A54 (left) versus the Galaxy A53 (right).
Photo: Florence Ion / Gizmodo

The Galaxy A54’s 6.4-inch display is bold and beautiful. It’s bigger than the 6.1-inch screen you get on the Pixel 7a and way more saturated, but it’s that high-resolution Samsung AMOLED shining through with its smooth-scrolling 120Hz refresh rate that does it for me. The Pixel 7a display’s color temperature veers warmer, whereas Samsung’s displays are bluer, so video and graphics appear more vibrant. The maximum brightness for the A54 is 1,000 nits, and I found it just a smidge easier to read in direct sunlight than the Pixel 7a.

It’s plastic all around but you can’t really tell.
Photo: Florence Ion / Gizmodo

Samsung’s Exynos vs. Google’s Tensor

Samsung’s Galaxy A54 runs on Samsung-made hardware, but it’s not exactly in the same vein as Google’s homemade Tensor G2, which powers the Pixel 7a and its flagship siblings. The Exynos 1380 processor is Samsung’s latest mid-range, 8-core chipset. It benchmarks better than last year’s Galaxy A53 but remains hampered by 6GB of RAM. When jumping between apps, you can feel the A54’s relative choppiness. It’s not often, but it’s occasional enough that any sluggishness is more noticeable than on the Pixel 7a (although that phone’s 60Hz default refresh rate presents its own issues when it comes to smooth scrolling).

In battery tests, the Galaxy A54 outlasted its flagship brethren and the Google Pixel 7a. Its 5,000 mAh battery is larger than the 7a’s 4,385 mAh. It lasted about 18 hours and 48 minutes in our battery rundown test, compared to the Pixel 7a’s 16 hours and 11 minutes. It’s good to know Samsung’s mid-range can last a long time, especially since it’s pushing out a 120Hz refresh rate.

The three-camera system could use a telephoto.
Photo: Florence Ion / Gizmodo

I’m bummed there isn’t wireless charging on the Galaxy A54, even if it is a sub-$500 phone. I have managed to build out a network of chargers around my home, where I live and work, and it’s imperative I can plop down a device to charge it. I can do that with Google’s Pixel 7a but not with the Galaxy A54.

Not the best phone camera for night shots

Two test shots with the Galaxy A54 during mid-day.
Image: Florence Ion / Gizmodo

The Galaxy A54’s three-lens camera system includes a 50-MP primary camera with OIS and an f/1.8 aperture, a 12-MP ultrawide, and a 5-MP macro that seems unnecessary. I never thought to use the macro during my testing period and would have preferred telephoto glass for that third rear camera for a little more optical zoom. The front-facing camera is 32-MP.

The Galaxy A54’s nighttime landscape shot.
Photo: Florence Ion / Gizmodo

Overall, the Galaxy A54’s camera system gets the job done if you’re outside and the weather is bright. Blues and reds are “punched up” with every photo, and you can use the on-screen camera controls to adjust the scene when it’s the golden hour. Just ensure you set the focus point in the right spot, or it could cost you a well-balanced photo. The maximum video recording resolution on the Galaxy A54 is 4K at 30 frames per second, which is smooth enough. It kept up with my kid panning the rows at a colorful car show, and the microphones were solid at mitigating the wind noise.

The Galaxy A54 will not get you astrophotography like a Google Pixel. Even with its dedicated night mode, it struggled to temper photos and capture enough light. And when it did produce an image in the dark, you could see the pixelation, as if it were evidence of an algorithmic struggle. That’s not as apparent in the Night Sight photos shot with the Pixel 7a. If it’s night skies and poorly lit faces you find yourself snapping most, the Galaxy A54 shouldn’t be your first pick.

Samsung’s version of Android is still good

Samsung’s Galaxy A54 runs Android 13 with OneUI 5.1. This gets you access to Samsung-exclusive features like Bixby Modes, widget stacks, and an improved screenshot manager. The company has promised four years of software updates for the Galaxy A54 and five years of security updates. That’s better than on the Pixel A-series, which gets four years of updates plus a sprinkling of quarterly Pixel-exclusive feature drops (though, like game loot, you never know what you’ll get and when).

Samsung adopted Android 13’s uniform icons.
Photo: Florence Ion / Gizmodo

I appreciate that Samsung’s Android 13 has adopted unified icons. They make the interface look less cartoony than Samsung’s colorful stock icons did. There’s also an option to dim the wallpaper when you’re in dark mode, which I like for my sensitive eyes.

The Galaxy A54 needs one more flagship feature

Samsung’s Galaxy A54 is $450 if you buy it unlocked. But if you need mmWave connectivity to access the 5G bandwidth you’re paying for, it’s $50 more, which is the variant I’d have to buy if I wanted to use this device on Verizon.

If you don’t mind the nighttime camera performance, Samsung’s Galaxy A54 is bang for your buck.
Photo: Florence Ion / Gizmodo

The Galaxy A54 needs one or two more flagship-like features to justify its cost relative to the Pixel 7a. It’d be nice to see Samsung add wireless charging or higher storage tiers for folks who want a mid-ranger that will last them through all five years of its security updates. Additions like that may not immediately translate to more units sold, but Samsung could establish itself as a mid-range manufacturer promoting longevity. That’s a point where it still has a leg up over the Pixel on the software side, so we’d love to see the hardware match.

Samsung’s best phone just got the Galaxy S23 Ultra’s best feature for free

Samsung has today given the Galaxy Z Fold 4 a hot new photography feature that will make it even better at taking photos of the night sky.

The feature, called astrophotography mode, comes from the very top of Samsung’s phone lineup in the shape of the Galaxy S23 series. It allows devices to take detailed photos of the night sky in a way that isn’t normally possible, creating some truly impressive images. Now, that same feature is available to Galaxy Z Fold 4 owners.

Samsung Galaxy Z Fold 4 – Image source: samsung.com

Samsung had previously said that it intended to roll the feature out beyond the Galaxy S23 and Galaxy S22 lines, and now it’s made good on that promise. The Galaxy Z Fold 4 isn’t the only model in on the act either, with the Galaxy S20 series, Galaxy Note 20, and Galaxy Note 20 Ultra all also able to enjoy the new astrophotography capabilities. All of the Galaxy Z Fold phones except the very first will also get the feature, as will the Galaxy S21, which means there are plenty of phones out there that can now take some stunning photos of the moon.

The astrophotography mode arrives via an updated Expert RAW app that can now be downloaded from the Galaxy Store. The app can also note your location and then give you tips on where to look for specific constellations. What’s more, it can take long-exposure shots to help you capture as much light as possible for even better results. Exposures can run up to 10 minutes, helped by the fact that the Galaxy Z Fold 4 can prop itself up, removing the need for a dedicated tripod.

Those who have a Galaxy Z Fold 4 can now download the updated Expert RAW app, and we’d definitely suggest doing that before it next gets dark. Once the app is downloaded and installed, taking shots with the astrophotography mode is a simple case of tapping the final button in the top-right corner of the viewfinder.

This new update comes as Samsung continues to work on getting the Galaxy Z Fold 5 ready, with the phone likely to be announced alongside its flip-based cousin as soon as July 2023. Samsung’s phones are far from the only foldables on the way, with Google expected to announce the Pixel Fold on May 10 during its annual developer conference.

San Francisco Proposal Photographer Offers New SF Locations And Photography Packages

(MENAFN- EIN Presswire)

New photography packages for proposals now available at San Francisco Proposal Photographer, providing beautiful, timeless photos to cherish for a lifetime.

SAN FRANCISCO, CA, UNITED STATES, April 30, 2023/einpresswire.com / — San Francisco Proposal Photographer is excited to announce the launch of their new proposal photography packages and locations, which aim to capture life’s intimate moments with skilled discretion and artistry. The studio has added several enchanting locations for proposals in San Francisco to its list, including Golden Gate Bridge, Baker Beach, Russian Hill, Ina Coolbrith Park, Sutro Baths, Fairmont Rooftop, Japanese Tea Gardens, Lovers Lane, Pier 14, and Muir Woods.

The studio provides four different proposal packages, each of which is designed to capture the unique essence of the moment and offer beautiful and timeless photos to cherish for a lifetime. Package A includes a smartphone shoot with all photos delivered to the client, while Package B provides professional camera coverage of the proposal. Package C includes everything in Package B along with a 10-15 minute photo session, and Package D offers a 45-minute photo shoot.

“We are thrilled to be able to offer our clients the opportunity to capture their special moment in some of the most beautiful locations in San Francisco,” said the owner of the studio. “We strive to provide our clients with the highest quality proposal photography services, and our new packages and locations are sure to make their proposal even more memorable.”

San Francisco Proposal Photographer understands the significance of a proposal and strives to provide couples with the perfect setting and the ideal photos to commemorate their special day. In addition, the studio encourages clients to come up with their own surprise proposal ideas and unique ways to make their special moment even more memorable.

A recent customer commented: “It was a pleasure working with this team. I organized a proposal shoot with them and they were discreet, punctual, and enthusiastic. The pictures came out great and my fiancé had no idea it was coming. Highly recommend them!”

For more information about the studio’s photography packages and locations, visit the San Francisco Proposal Photographer website.

###

San Francisco Proposal Photographer
2150 Post St, San Francisco, CA 94115
(415) 812-5014

Eric Smith
San Francisco Proposal Photographer
+1 415-812-5014
email us here


Future Apple Watch could get cameras for photography & Face ID

Apple’s proposal would be less bulky than the existing third-party Wristcam



Apple has big plans for cameras in future Apple Watches, if they can be fitted without making the watch awkward to wear — and if the cameras can be of high enough quality.

The popularity of the Apple Watch Ultra has shown that people are willing to wear bulkier devices if there is a clear benefit to them. In the case of the Apple Watch Ultra, that benefit includes a greatly extended battery life, for instance.

Future Apple Watches may also become at least a little larger, as Apple is again looking at ways to incorporate a camera. Previously, Apple was granted a patent for including a camera in the Apple Watch’s Digital Crown, but that placement would have very limited use.

Now, in a newly revealed patent application, Apple is proposing that a Watch could fit a camera into a slight protrusion toward the top of its chassis, above the display. “Wearable Electronic Device Having A Digital Camera Assembly” would be similar to the existing Wristcam in where the camera is positioned, but that product is a whole Watch band.

Apple isn’t keen on taking up a whole band, or anything that makes the Apple Watch cumbersome.

“While certain electrical components, such as a camera, may perform desirable functions,” says the patent application, “the integration of such components may result in a bulky device which may hinder user performance, may be uncomfortable to wear, or may be unsuited for performing certain functions (e.g., a camera disposed within a wearable electronic device may be awkward to position when capturing optical input).”

Apple is also dismissive of compromises to fit the camera into a Watch band. “Additionally, low-quality components may not meet a user’s quality expectations,” it says, “(e.g., a low-quality camera may produce low-quality images).”

It’s not just that Apple wants you to look nice in photographs. Apple specifically wants a video camera capable of up to 4K at 60 frames per second, or a still camera of up to 12MP.

Either a spacecraft, or an Apple Watch side view with a camera protrusion to the right

That’s because this is about more than photographing wildlife, more than capturing the crowd at Little League. This camera is actually meant for more than any regular photography or video.

“The digital camera assembly may be used for a variety of purposes,” continues Apple, “including, as non-limiting examples, facial identification, fingerprint sensing, scanning a Quick Response (QR) code, video conferencing, biometric monitoring (e.g., heart rate monitoring), photography, video or image capture, or any combination thereof.”

So with a camera on your Apple Watch, you could unlock all of your Apple devices through Face ID.

The Watch could also use its camera to “capture movement of a user’s body or other objects during certain activities.” Using visual inertial odometry (VIO), “the camera can be used to obtain a high degree of motion sensing accuracy, which may be used to monitor, detect, and/or predict a user’s motion or gesture based on certain characteristics.”

That’s a lot to demand of a camera, and Apple is not expecting to be able to fit one under the screen of an Apple Watch. Instead, it will sit on what Apple calls a protrusion, and much of the patent application is about how to do that without making the Watch distracting to wear.

“[A] digital camera assembly may be integrated into the wearable electronic device in a way so as to minimize an effect of the digital camera assembly on other electronic components and/or a form factor of the wearable electronic device,” says Apple.

“For example, in implementations where a digital camera assembly is positioned within an internal cavity (e.g., camera cavity) of a protrusion,” it continues, “the digital camera assembly may extend from the housing, over a band slot, and away from a display, a battery, a circuit assembly, or sensors of the wearable electronic device.”

“Likewise, the protrusion may be shaped to avoid interfering with geometry of the band slot,” says the patent application, “so that a band/strap may still be permitted to couple with the housing of the wearable electronic device.”

The patent application is credited to five inventors, including Christopher M. Warner, whose previous work includes muscle-sensing Apple Watch bands.

iPhone vs Android: Two different photography and machine learning approaches

Apple’s computational photography aims for realism



A controversy surrounding Samsung’s phone cameras has renewed the conversation about computational photography, and highlights the difference between it and Apple’s approach in iOS.

It isn’t a big secret that Apple relies upon advanced algorithms and computational photography for nearly all of its iPhone camera features. However, users are beginning to ask where to draw the line between these algorithms and something more intrusive, like post-capture pixel alteration.

In this piece, we will examine the controversy surrounding Samsung’s moon photos, how the company addresses computational photography, and what this means for Apple and its competitors going forward.

Computational photography

Computational photography isn’t a new concept. It became necessary as people wanted more performance from their tiny smartphone cameras.

The basic idea is that computers can perform billions of operations in a moment, like after a camera shutter press, to replace the need for basic edits or apply more advanced corrections. The more we can program the computer to do after the shutter press, the better the photo can be.

This started with Apple’s dual camera system on the iPhone 7 Plus. Other photographic innovations before then, like Live Photos, could be considered computational photography, but Portrait Mode was the turning point for Apple.

Apple introduced Portrait Mode in 2016, which took depth data from the two cameras on the iPhone 7 Plus to create an artificial bokeh. The company claimed it was possible thanks to the dual camera system and advanced image signal processor, which conducted 100 billion operations per photo.
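
The principle behind that artificial bokeh is simple to sketch: use the depth map to decide which pixels belong to the subject, and blur the rest. The Python toy example below illustrates the idea only; Apple's actual pipeline is far more sophisticated, and the image and depth map here are synthetic stand-ins.

  # Toy depth-based bokeh: keep near pixels sharp, blur far ones.
  import numpy as np
  from scipy.ndimage import gaussian_filter

  rng = np.random.default_rng(1)
  image = rng.uniform(0, 1, (480, 640))              # stand-in photo
  depth = np.tile(np.linspace(0, 1, 640), (480, 1))  # stand-in depth map
  blurred = gaussian_filter(image, sigma=6)
  bokeh = np.where(depth < 0.5, image, blurred)      # near = sharp, far = blurred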

Needless to say, this wasn’t perfect, but it was a step into the future of photography. Camera technology would continue to adapt to the smartphone form factor, chips would get faster, and image sensors would get more powerful per square inch.

Portrait mode uses computational photography to separate the foreground

Portrait mode uses computational photography to separate the foreground

In 2023, it isn’t unheard of to shoot cinematically blurred video using advanced computation engines with mixed results. Computational photography is everywhere, from the Photonic Engine to Photographic Styles — an algorithm processes every photo taken on iPhone. Yes, even ProRAW.

This was all necessitated by people’s desire to capture their life with the device they had on hand — their iPhone. Dedicated cameras have physics on their side with large sensors and giant lenses, but the average person doesn’t want to spend hundreds or thousands of dollars on a dedicated rig.

So, computational photography has stepped in to enhance what smartphones’ tiny sensors can do. Advanced algorithms built on large databases inform the image signal processor how to capture the ideal image, process noise, and expose a subject.

However, there is a big difference between using computational photography to enhance the camera’s capabilities and altering an image based on data that the sensor never captured.

Samsung’s moonshot

To be clear: Apple is using machine learning models — or “AI,” for those using the poorly coined popular buzzword — for computational photography. The algorithms provide information about controlling multi-image captures to produce the best results or to create depth-of-field profiles.

The image processor analyzes skin tone, skies, plants, pets, and more to provide proper coloration and exposure, not pixel replacement. It isn’t looking for objects, like the moon, to provide specific enhancements based on information outside of the camera sensor.

We’re pointing this out because those debating Samsung’s moon photos have used Apple’s computational photography as an example of how other companies perform these photographic alterations. That simply isn’t the case.

Samsung’s moon algorithm in action. Credit: u/ibreakphotos on Reddit

Samsung has documented how Samsung phones, since the Galaxy S10, have processed images using object recognition and alteration. The Scene Optimizer began recognizing the moon with the Galaxy S21.

As the recently-published document describes, “AI” recognizes the moon through learned data, and the detail improvement engine function is applied to make the photo clearer with multi-frame synthesis and machine learning.

Basically, Samsung devices will recognize an unobscured moon and then use other high-resolution images and data about the moon to synthesize a better output. The result isn’t an image captured by the device’s camera but something new and fabricated.

Overall, this system is clever because the moon looks the same no matter where it is viewed on earth. The only thing that changes is the color of the light reflected from its surface and the phase of the moon itself. Enhancing the moon in a photo will always be a straightforward calculation.

Both Samsung and Apple devices take a multi-photo exposure for advanced computations. Both analyze multiple captured images for the best portion of each and fuse them into one superior image. However, Samsung adds an additional step for recognized objects like the moon, which introduces new data from other high-resolution moon images to correct the moon in the final captured image.
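
The noise-reduction half of that process is easy to demonstrate: averaging several aligned exposures shrinks random sensor noise by roughly the square root of the frame count. The toy sketch below shows only that statistical effect; real pipelines also align frames and weight the best regions, and nothing here reflects Samsung's or Apple's actual code.

  # Toy multi-frame merge: average 8 noisy exposures of an aligned scene.
  import numpy as np

  rng = np.random.default_rng(0)
  scene = rng.uniform(0, 1, (480, 640))  # stand-in "true" scene
  frames = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(8)]
  merged = np.mean(frames, axis=0)       # noise falls roughly 1/sqrt(8)
  print(np.std(frames[0] - scene), np.std(merged - scene))  # ~0.10 vs ~0.035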

Samsung’s moon algorithm explained. Credit: Samsung

This isn’t necessarily a bad thing. It just isn’t something Samsung has made clear in its advertising or product marketing, which may lead to customer confusion.

The problem with this process, and the reason a debate exists, is how this affects the future of photography.

Long story short, the final image doesn’t represent what the sensor detected and the algorithm processed. It represents an idealized version of what might be possible but isn’t because the camera sensor and lens are too small.

The impending battle for realism

From our point of view, the key tenet of iPhone photography has always been realism and accuracy. If there is a perfect middle in saturation, sharpness, and exposure, Apple has trended close to center over the past decade, even if it hasn’t always remained perfectly consistent.

We acknowledge that photography is incredibly subjective, but it seems that Android photography, namely Samsung, has leaned away from realism. Again, not necessarily a negative, but an opinionated choice made by Samsung that customers have to address.

For the purposes of this discussion, Samsung and Pixel devices have slowly tilted away from that ideal of realistic representation. They are vying for more saturation, sharpness, or day-like exposure at night.

The example above shows how the Galaxy S22 Ultra favored more exposure and saturation, which led to a loss of detail. Innocent and opinionated choices, but the iPhone 13 Pro, in this case, goes home with a more detailed photo that can be edited later.

This difference in how photos are captured is set in the opinionated algorithms used by each device. As these algorithms advance, future photography decisions could lead to more opinionated choices that cannot be reversed later.

For example, by changing how the moon appears using advanced algorithms without alerting the user, that image is forever altered to fit what Samsung thinks is ideal. Sure, if users know to turn the feature off, they could, but they likely won’t.

We’re excited about the future of photography, but as photography enthusiasts, we hope it isn’t so invisible. Like Apple’s Portrait Mode, Live Photos, and other processing techniques — make it opt-in with obvious toggles. Also, make it reversible.

Tapping the shutter in a device’s main camera app should take a representative photo of what the sensor sees. If the user wants more, let them choose to add it via toggles before or editing after.

For now, try taking photos of the night sky with nothing but your iPhone and a tripod. It works.

Why this matters

It is important to stress that there isn’t any problem with replacing the ugly glowing ball in the sky with a proper moon, nor is there a problem with removing people or garbage (or garbage people) from a photo. However, it needs to be a controllable, toggle-able, and visible process to the user.

Computational photography is the future, for better or worse

As algorithms advance, we will see more idealized and processed images from Android smartphones. The worst offenders will outright remove or replace objects without notice.

Apple will inevitably improve its on-device image processing and algorithms. But, based on how the company has approached photography so far, we expect it will do so with respect to the user’s desire for realism.

Tribalism in the tech community has always caused debates to break out among users. Those have included Mac or PC, iPhone or Android, and soon, real or ideal photos.

We hope Apple continues to choose realism and user control over photos going forward. Giving a company complete opinionated control over what the user captures in a camera, down to altering images to match an ideal, doesn’t seem like a future we want to be a part of.

Samsung Galaxy S23 Ultra review: All about improvements

Samsung’s premium flagship smartphone, the S23 Ultra, is its most powerful offering, with substantial and practical improvements. Here’s a lowdown on what works and what doesn’t.

Big Plus
The smartphone’s familiar industrial design, with sharp edges and flat rails, makes it comfortable to hold despite a massive 6.8-inch Edge Dynamic AMOLED 2X QHD+ display. It also has solid build quality.

Best Point
Its design may be similar to previous iterations, but the Qualcomm Snapdragon 8 Gen 2 chipset is powerful and power-efficient. It handles everything you throw at it. The new 200-megapixel sensor is another major improvement. Samsung has also made changes across the camera’s processing. As a result, it captures more light, provides better details, offers rich portraits, and is impressive for low-light photographs.

Standout Feature
The S Pen is one feature that stands out and is the key differentiator of the S23 Ultra. Besides, its new Astrophotography mode and 50-megapixel expert RAW features are other worthy improvements.

The X-factor
The Galaxy S23 Ultra is equipped with two dedicated 10-MP telephoto sensors, offering 3x and 10x optical zoom. The zoom quality is unparalleled and produces sharp, colour-rich output.

What Could Have Been Better
It only offers Wi-Fi 6E, while even other relatively affordable smartphones offer Wi-Fi 7 support. It could also have been priced lower.

Should I Go For It?
Definitely, if you are a power user and looking for all the bells and whistles in a smartphone. Starting at Rs 1,24,999, the Galaxy S23 Ultra offers an impressive display, camera, performance, and battery. It is a no-brainer for anyone looking for a premium Android phone.

The Galaxy S23 Ultra Is Great, but Would Be Better if It Folded

I struggle to look at any other Samsung smartphone now that I’ve been living with its foldables. The Samsung Galaxy Z Fold 4 has effectively changed how I use Android. Most of the time, I’ll only bother with my Google Pixel 7 if someone is calling the number linked to that phone. Otherwise, you’ll see me primarily on the foldable. It’s just so much more versatile for the life I lead.

That’s not to say I didn’t enjoy my time with the Samsung Galaxy S23 Ultra, but I missed the Fold while reviewing this one. Samsung’s ultimate new flagship device is everything you could want in a smartphone, but there is also a lot here that feels like overkill now that we’re in the second iteration of the Ultra and its stylus-wielding ways. In fact, I forgot to use the stylus until about two days ago (I don’t draw). And while four cameras are a great back-of-the-box brag, I still don’t understand how to push them to the extent they’ve been marketed as being capable of, and I realize I probably never will. And I like high spec phones!

Regardless, the Ultra still has plenty going for it, including a better design than the last generation. Those rear-facing cameras may not be enough to justify the price to casual users, but their post-processing algorithms are just as good as Google’s — better in some cases. The Ultra even has a few features I think foldables are still missing — like that stowable stylus.

But when it comes to targeting genuine innovation as opposed to niche specialty features, the Ultra might miss the mark compared to both the competition and Samsung’s other phones.

The best Ultra yet

If you like big phones, you’ll love the Galaxy S23 Ultra (I don’t—it’s not foldable). It has a 6.8-inch Dynamic AMOLED display, categorized as such because it’s based on tech that allows the display to dynamically change refresh rates without killing the battery. The jury is still out on how much battery that display tech saves, and I’ll get more into that when we talk about the battery rundown results later. Still, the display that Samsung has going here is like carrying a tiny version of its TVs in your pocket.

You might have gotten into the Galaxy line because you love Samsung’s displays. I can’t blame you. Like on the S22 Ultra, the screen on the S23 Ultra is a 1440p resolution with a 120Hz refresh rate. I love watching TV on this thing, even the 720p classics like Taxi and One Day at a Time. What I especially appreciate about Samsung is how low the brightness can go so that I can fall asleep to those shows at the end of the night without lighting up the room. Samsung enables the use of Android 12’s extra dim mode, and with that turned on, the phone doesn’t go any higher than about 350 nits—the standard rate is around 430 nits, or a whopping 1,750 nits if you’re out in direct sunlight and using the adaptive brightness feature.

The Galaxy S23 Ultra houses an S Pen inside its chassis.
Photo: Florence Ion / Gizmodo

The best part about the new Galaxy S23 Ultra is that Samsung fixed some of what I didn’t like with the Galaxy S22 Ultra’s design. Mainly, it squared off the edges instead of rounding them, so it’s easy to cradle the phone one-handed. I finally felt confident that I wasn’t going to drop it. I’m glad Samsung stopped with the overtly rounded edges, which are also annoying to use when you’re tapping on the edge of the screen.

This is still a gigantic smartphone. I hope you have big hands if you plan to play games on this thing. My small hands and long claws had difficulty cradling the Ultra to play with on-screen controls in games like Dreamlight Valley through Xbox Game Pass, and my wrists got weary holding the phone to control my character in Riptide GP: Renegade. The first-gen Razer Kishi controller that I use for Android gaming also feels as if it’s stretched to capacity on this phone, as if the Galaxy S23 Ultra will pop out at any minute. Unless it’s a point-and-tap game, I use a Bluetooth controller to play games on the S23 Ultra. The OnePlus 11’s similarly sizeable 6.7-inch display, comparatively, feels less ginormous because it doesn’t have the Ultra’s squared-off corners and the chassis is narrower.

The Galaxy S23 Ultra is a big phone, make no mistake.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra utilizes an in-display fingerprint sensor and face unlock for added lock screen security. It’s best that Samsung didn’t carry over the power button fingerprint sensor like on the Z Fold 4, because I am constantly accidentally pressing that one and locking myself out of it. Scanning in a fingerprint or smiling at the Ultra felt fast and responsive unless I wore a mask or sunglasses.

The default storage on the S23 Ultra has thankfully been bumped up to 256GB. It starts there and goes all the way up to 1TB, if you can stomach paying for it (doing so will add $420 on top of the base storage’s cost). The Ultra is also IP68 rated for water and dust resistance.

Qualcomm with Samsung flavoring

Something to note about this year’s Galaxy S23 lineup is that it runs a unique flavor of the Qualcomm Snapdragon 8 Gen 2 processor. Rather than use the one that came right out of the box, Samsung infused some of its AI smarts to tune camera and performance algorithms to its liking. The company already does this to some effect with its Exynos chips overseas, and it’s bringing that expertise to the phones sold in the States to one-up Google’s homemade Tensor processor. Sometimes it works.

If you like to game, the S23 Ultra can serve. But its big size may not be easy for everyone to cradle.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra is available with 8GB or 12GB of RAM, which seems absurd. The Ultra should have 12GB of memory as standard, since it’s technically the ultimate Samsung phone. Even with the 12GB of RAM, you can’t tell that the chip inside the Galaxy S23 Ultra is any beefier than what’s inside the similarly specced OnePlus 11. On paper, and in Geekbench 5 (which will be Geekbench 6 in our reviews going forward), the Galaxy S23 Ultra outperformed the OnePlus 11 by only about 300 points on the single-core score and 400 points on the multi-core one. But that proves little about whether Samsung’s infused chip is faster or more capable than OnePlus’s vanilla one in actual use. Considering the Google Pixel 7 Pro is a laughingstock on the benchmark charts but not in real-world use — it scores about 400 points less than the Galaxy S23 Ultra — it’s hard to use these benchmarks as the sole test for what’s possible. Anyway, neither of these Android devices can hold a candle to the numbers that Apple’s A16 Bionic spits out.

The upside to having such a powerful smartphone is that it can do everything: play games locally and from the cloud, create and edit documents, quickly export edited videos, process RAW photos, and chat with whoever. The Ultra can handle each of these cases with absolute ease, but that’s expected from a phone that I’ve been running for about three weeks. The real test for these devices is how they do after a year in the hand.

Riptide GP: Renegade runs smoothly on the S23 Ultra, but it’s too big to comfortably play for my hands.
Photo: Florence Ion / Gizmodo

I echo the sentiments of a few other reviews: the Galaxy S23 Ultra doesn’t get as hot as previous versions of the device or even other Android phones. I fell asleep next to it a few nights in a row while it was charging and playing Pluto TV, and I didn’t feel the usual heat emanating as the battery fueled up for the next day. It did get toasty once while I was mindlessly scrolling through TikTok (as I often do), and it was significant enough that I remember saying, “I should probably mention this in the review.”

Apple’s iPhone 14 Pro Max lasts longer

I’m sorry to include Apple in the subhead of a Samsung Galaxy review. But I remain impressed by the battery test on Apple’s latest flagship, and it’s now the benchmark for every other flagship phone review.

Samsung’s 5,000 mAh battery is enormous while remaining the same size as in last year’s Ultra. Whatever Samsung did on the backend to extend battery life has worked thus far—the S23 Ultra beat out the S22 Ultra by about two hours, lasting 18 hours and 33 minutes. But that’s nothing to Apple’s nearly 24-hour battery life on its large iPhone 14 Pro Max. I want some of whatever magic Apple has going on with its software to come to Android land.

These results translated to using the phone daily, too. As I mentioned, I’m a TikTok freak, and I was surprised to see that the Ultra chewed through only 23% of its battery life in five hours after mixed-use, which included tuning into my Disney streamer.

Move over, Pixel camera

The back of the Galaxy S23 Ultra houses four camera sensors.
Photo: Florence Ion / Gizmodo

Because the Galaxy S23 Ultra is being dubbed as “ultimate,” its cameras are appropriately extreme. They’re also the key upgrade point here, and took up the majority of Samsung’s announcement event for this phone. The primary camera is a 200-MP standard wide-angle lens with optical image stabilization (OIS) and an f/1.7 aperture. The ultra-wide camera is a 12-MP sensor with an f/2.2 aperture. And the two telephoto lenses on the back also have OIS, though one has an f/2.4 aperture with a maximum 3x optical zoom, and the other is f/4.9 with a 10x optical zoom. The maximum digital zoom for this camera is 100x, just like the S22 Ultra.

An unedited photo shot at dusk with the Galaxy S23 Ultra.
Photo: Florence Ion / Gizmodo

Whenever someone outside of the Android bubble realizes the Galaxy S23 Ultra has four cameras on the back, they often ask me, “why?” The answer is so it has camera lenses for every foreseeable situation. For instance, if you’re chasing your kid around the park, you want that quick 3x optical zoom to capture them in the frame and up close. The result is a background bokeh effect that helps make the image instantly shareable on Instagram without using Portrait mode. Or if you happen to be lying down at the park, only to hear the roar of a jet engine approaching overhead, you can use the 10x optical zoom to get a closer look and maybe even post it to TikTok. For epic sky days, when the clouds seem to be cruising through as if they’re fresh cotton candy spun right out of the bin, the ultra wide-angle camera helps increase the drama when shared in your secret Slack channel of friends obsessed with sunsets.

Even with two dedicated telephoto cameras, it’s not always the best at zooming in to see what’s yonder.
Photo: Florence Ion / Gizmodo

Nowadays, most smartphone cameras are capable of everything I just described, but Samsung purports a higher resolution and greater color and distance detail. These are the cameras we have on us every day, and Samsung argues that these are the digital memories we’ll be pulling from as we struggle to remember our lives someday in the future.

That’s not to say that every photo the Galaxy S23 Ultra produces is perfect. Zooming past the 10x optical limit requires praying that the image won’t be jaggy or over-sharpened. There were so many instances on the evening of my daughter’s third birthday that the pictures of her punching around a balloon came out looking blurry—a real bummer for me as I was trying to find a cute one to share within group chats. I also tried staying up one night to capture the Air Force flying their planes in the sky above, and I could not produce anything worth sharing.

Another unedited photo shot with the Galaxy S23 Ultra.
Photo: Florence Ion / Gizmodo

As it stands, the 200-MP sensor on the Galaxy S23 Ultra isn’t shooting at its full resolution at all times. Like most flagship smartphones, including the iPhone 14 Pro and Google Pixel 7, Samsung uses pixel binning, so the phone shoots like a 12-MP camera, with 16 sensor pixels combined into each output pixel. The result is brighter photos throughout with better detail. I preferred the 12-MP images worked over by the algorithm to the full 200-MP raw ones, which usually require some post-editing anyway. I want to avoid editing a photo while just trying to share it on social media.
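
The binning arithmetic itself is easy to picture: each 4x4 block of photosites is combined into a single output pixel, trading resolution for light. The Python sketch below is purely illustrative; the sensor dimensions are stand-ins, not Samsung's exact readout.

  # 4x4 pixel binning: fold a ~200-MP readout down to ~12.5 MP.
  import numpy as np

  raw = np.random.rand(16320, 12240)  # stand-in sensor grid, ~199.8 MP
  h, w = raw.shape
  binned = raw.reshape(h // 4, 4, w // 4, 4).mean(axis=(1, 3))
  print(raw.size / 1e6, binned.size / 1e6)  # ~199.8 -> ~12.5 megapixels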

You can see more clearly how the Galaxy S23 Ultra’s post-processing stacks up against the iPhone 14 Pro Max and Pixel 7 Pro in the slideshow I put together here. For the most part, I found Samsung’s algorithms veer toward saturation, though they were impressive at tempering the final product to maintain detail where it mattered. The most obvious example is a photo I shot of the Santa Ynez Mountains in Santa Barbara; the S23 Ultra held on to the subtle detail of the sunset, lighting up the ridges without over-contrasting them.

I wrote more about Expert RAW in the other piece, including Samsung’s improved astrophotography feature. I wish Samsung would have offered this feature on its own rather than burying it inside another download that has to be enabled in the camera app before anyone knows it’s even there. Samsung includes all these unique camera features as if we’re supposed to know how to use them right out of the box. But as with the improved nighttime video recording capabilities teased during the Ultra’s debut at Galaxy Unpacked earlier this month, I had no idea where to start. Just because a smartphone can do all these fancy things doesn’t mean that the general population will aspire to them. And after ten years of reviewing smartphones, I might also give up.

That’s a big problem, as the camera system here is a major selling point and a major justification for the price tag. Compare that to Apple, which, because it makes both the iPhone and iOS, is able to bundle its phones with tons of everyday usability conveniences.

The Galaxy S23 Ultra offers so many different camera modes that I’m often too overwhelmed to play with any.
Photo: Florence Ion / Gizmodo

Before we move on from the cameras, there are a few other things to note: video recording on this smartphone is aces, even without a tripod. But for stability’s sake, I’ve been propping the Ultra up on a handheld tripod and following my kid around at 60 fps. The video is so smooth! The Ultra maxes out at 30 frames per second in 8K resolution for video recording, and there’s a Pro Video mode if you’re comfortable with tweaking camera settings. The front-facing camera is a 12-MP sensor with an f/2.2 aperture; annoyingly, it doesn’t zoom in or out.

Does a smartphone need a stylus?

The S Pen has always been a nice-to-have, but it feels more fitting for a tablet-style device.
Photo: Florence Ion / Gizmodo

Samsung’s S Pen has been around for a long while. It’s as iconic as Paris Hilton’s chihuahuas in the 2000s (RIP to them all). Last year’s Ultra was the first time it appeared in the regular Galaxy lineup after the sunsetting of the Galaxy Note series of yore. But functionally, it’s similar to what the S Pen could do before. You can pop it out for drawing and cropping when the situation on screen calls for it—accommodating business people doing precise things, like moving a cursor within a document or signing off on a contract while in line somewhere. But I’m starting to realize this screen is too limited for anything art-driven. Granted, I’m not an artist, but if I imagine myself as a college student (again), the S Pen would feel much more appropriate docked inside a gadget like the Z Fold 4, which can open up into a larger display that’s fit for highlighting and making digital notes. That’s a form factor that lends itself to a stylus rather than the cramped screen on the S23 Ultra.

The other problem with the S Pen is that it requires its own space inside the chassis to dock. That’s the tradeoff for a phone slightly too big for your pocket or those straddling gaming controllers. As much as the S Pen is an iconic tool, I don’t know that it belongs on a smartphone anymore, even if you can use it as a Bluetooth controller.

Samsung’s version of Android

Samsung’s software is fine, but often it doubles up on Google’s offerings.
Photo: Florence Ion / Gizmodo

The Galaxy S23 Ultra ships with One UI 5, based on the latest version of Android 13. The One UI 5.1 update is the one everyone’s waiting for right now, since it includes features like Bixby Text Calling, which works similarly to the Pixel’s Call Screen. This feature is now live in English (it was available only in Korea until now), but I couldn’t get it to work during my testing period. I hope to revisit this and some of Bixby’s other features later, as I’m curious to understand the benefits of sticking with it over the tried-and-true (even if sometimes frustrating) Google Assistant.

I don’t mind Samsung’s version of Android, especially not since adopting the foldable. I’ve realized it comes with the benefit of Samsung tweaking what Google provides to suit its devices, even if it carries little semblance of Android’s interface framework, called Material You. Samsung offers some neat integration with Microsoft’s Your Phone app on Windows PCs that goes beyond the default experience, including the ability to control your device remotely from the desktop. There’s also the ability to snap a photo in Expert RAW and have it immediately populate in Adobe Lightroom. These abilities are nice to have, but as with the Galaxy S22 Ultra last year, I hardly ever considered using them after the review period was over. They’re not a reason to go out and buy a phone.

Still too much phone

I hope you enjoyed reading this entire review only to have me tell you to buy a foldable.
Photo: Florence Ion / Gizmodo

I know there are people out there salivating over the Galaxy S23 Ultra. They want the best that Samsung has to offer in its lineup, whether it’s for bragging rights or because they want all those lenses and this is the only camera they’ll own. I get all that, but I still think the Ultra is a bit of overkill in a market where we’re all screaming for a deal. There are still two other models of the Galaxy S23 that I have yet to review, and though they’re smaller devices with slightly different chassis, they more or less deliver the same Samsung experience across the board for less. They’re priced a little over the Pixel 7 lineup, starting at $800 and $1,000 for the S23 and S23+, respectively.

If you’re going to spend a starting price of $1,200 on any Android smartphone, I’m pleading with you to get a foldable instead. Yes, it’s a new kind of form factor with dubious longevity, but it’s not going away any time soon. For many, even those who want the best, camera fidelity will reach a point of diminishing returns. But a foldable drastically changes every user’s experience. There is more competition cropping up overseas, and the rumor mill is getting louder as more manufacturers hop on board this new smartphone fad. At the very least, if you’re spending a whopping amount of money on a smartphone, get something that’s a bona fide phone and a tablet for the price.

Better and better cameras are perhaps not what each new generation of a phone should be targeting, at least anymore.

Galaxy S23 Ultra Camera Tested vs Pixel 7 Pro, iPhone 14 Pro Max

Like last year’s release, the Galaxy S23 Ultra has four cameras on the back (one of those holes is just a sensor).
Photo: Florence Ion / Gizmodo

I’m having a ton of fun with the Samsung Galaxy S23 Ultra. There’s still more to explore before I have a full review ready—I still need to test its purported nighttime video-taking prowess, plus I’m still learning how to get the most out of the Expert RAW mode in the Samsung camera app. But until then, I figured I’d share what I’ve noticed so far about the $1,200 phone’s camera compared to those on other popular flagship smartphones I have floating around, like the Pixel and iPhone.

How do the Galaxy S23 Ultra’s shots compare to those from the Google Pixel 7 Pro and Apple iPhone 14 Pro Max? So far, so good. It feels like Samsung fixed some of its zooming algorithms for more precise shots, but I still need to perform side-by-side testing against last year’s S22 Ultra. Samsung’s photos are still more saturated than what Google and Apple produce, but the result is less noisy than it used to be. The company has also toned down the sharpening on its photos and figured out how to balance stark sunlight coming in from the side. However, the Ultra is still no match for the Pixel when it comes to nighttime shooting capabilities. Spoiler: the Pixel 7 Pro is still better at shooting photos of stars, at least until I become an Expert RAW power user.
