Ryan Haines / Android Authority
Google Pixel smartphones have long been among the best Android phones you can buy, but their position at the top is surprising if you take a deeper look at the spec sheet. Pixels rarely have top-of-the-line specifications and often trail the market leaders on paper. However, thanks to some nifty software magic, Google extracts the maximum possible value out of its hardware. We can see this in action with the Google Camera app on Pixel smartphones, which enables some cool photography features. Here are all the features you get in the Google Camera app.
What is Google Camera?
Ryan Haines / Android Authority
Google Camera is the default camera application that ships on Google Pixel smartphones. Most OEMs ship their own modified camera app as part of their Android skin, and Google is no different.
What makes Google Camera unique is that it extracts the best possible results out of the often-dated camera hardware found on Pixel smartphones. The app contains most of the algorithms responsible for Google’s software magic on photos.
These software optimizations are so potent that third-party modders regularly attempt to port the latest Google Camera app from Pixel devices to other Android smartphones, improving the photography prowess of their non-Pixel hardware.
The Google Camera app was initially released to the public on the Google Play Store. But those days are long gone. Google Camera is now exclusive to Pixel smartphones. If you spot the app on a non-Pixel smartphone, it will likely be a third-party Google Camera port (often called “GCam” in this context).
Google Camera: Photo features
The Google Camera app has many features, but it also misses out on some, like a dedicated manual or Pro mode for photos and videos. The absence of a manual mode is arguably a feature in itself: Google wants the average user to trust the Pixel to take the best shot for them.
Despite such a glaring shortcoming, the Google Camera app remains one of the best camera apps for Android, and the features below go a long way toward compensating for the missing Pro mode.
The highlight feature of the Google Camera app is HDR Plus, which was added around the release of the Nexus 6. HDR Plus is the engine behind HDR imaging in the Google Camera app. In its early announcement posts in 2014, the company said it uses “computational photography” for HDR Plus.
When you press the shutter button in the Google Camera app, HDR Plus captures a rapid burst of three to 15 pictures and combines them into one.
In low-light scenes, HDR Plus takes a burst of shots with short exposure times, aligns them algorithmically, and replaces each pixel with the average color at that position across the burst. Short exposures reduce blur, while averaging the shots reduces noise.
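The averaging step described above can be sketched in a few lines of Python. This is a toy illustration of the general idea, not Google's actual pipeline: the "frames" are tiny grids of brightness values, assumed already aligned.

```python
# Illustrative sketch of burst averaging: given aligned short-exposure frames,
# replace each pixel with the mean of that pixel's value across the burst,
# which suppresses random noise without the blur of one long exposure.

def merge_burst(frames):
    """Average per-pixel values across a burst of aligned frames."""
    num_frames = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    merged = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            merged[y][x] = sum(f[y][x] for f in frames) / num_frames
    return merged

# Three noisy 2x2 "frames" of the same scene (true value is ~100 everywhere).
burst = [
    [[98, 103], [101, 97]],
    [[102, 99], [100, 104]],
    [[100, 98], [99, 99]],
]
merged = merge_burst(burst)
# Averaging pulls every pixel back toward the true value of 100.
```

The same principle scales up to real bursts of 3 to 15 full-resolution frames; the hard part in practice is the alignment, which this sketch assumes away.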
In scenes with high dynamic range, HDR Plus follows the same technique, and it manages to avoid blowing out the highlights and combines enough shots to reduce noise in the shadows.
In scenes with high dynamic range, however, the shadows can remain noisy because every image in the burst is underexposed. This is where the Google Camera app uses exposure bracketing, combining two different exposures.
HDR Plus with Bracketing is the highlight feature of the Google Camera app.
Exposure bracketing gets complicated by Zero Shutter Lag (ZSL; more on this feature below). HDR Plus works around ZSL by capturing frames both before and after the shutter press. One of the shorter-exposure frames serves as the reference frame to avoid clipped highlights and motion blur. The other frames are aligned to this reference, merged, and then de-ghosted through a spatial merge algorithm that decides, pixel by pixel, whether image content should be merged or not.
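The per-pixel merge decision can be illustrated with a toy sketch. The function name and the difference threshold here are hypothetical, not Google's implementation; the point is only the branch between "static content, merge it" and "moved content, keep the reference."

```python
# Toy de-ghosting merge: a pixel from an alternate frame is averaged with the
# reference only when the two values are similar; a large difference suggests
# motion at that pixel, so the reference value is kept to avoid ghosting.

def merge_with_deghosting(reference, alternate, threshold=10):
    height, width = len(reference), len(reference[0])
    out = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            ref, alt = reference[y][x], alternate[y][x]
            if abs(ref - alt) <= threshold:
                out[y][x] = (ref + alt) / 2   # static content: average (denoise)
            else:
                out[y][x] = ref               # moving content: trust the reference
    return out

ref = [[100, 100], [100, 100]]
alt = [[104, 96], [100, 180]]   # bottom-right pixel changed between frames
merged = merge_with_deghosting(ref, alt)
# Three pixels get denoised by averaging; the "moving" pixel stays untouched.
```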
If all of this sounds complicated and confusing to you as a user, fret not. The Google Camera app doesn’t require you to worry about these details. You just have to click photos; Google’s algorithms will handle the rest.
Here are some camera samples from the Pixel 7 Pro’s primary camera:
Night Sight is essentially HDR Plus for very low light. Because so little light is available, both the per-frame exposure times and the length of the burst are allowed to be considerably longer. A Night Sight shot thus takes longer to capture, and a stronger element of motion must be compensated for.
You can expect a Night Sight photo to take about one to three seconds, and we’d advise you to wait another second after pressing the shutter button. Pixel smartphones will automatically enable Night Sight when it is dark, though you can manually toggle the mode if necessary. Note that Night Sight does not work if you have Flash turned on.
On newer Pixel smartphones, the denoising stage of the HDR Plus pipeline during Night Sight uses new neural networks that run on the Tensor processor, which has made Night Sight shots faster.
Here are some Night Sight samples from the Pixel 7 Pro:
Night Sight also includes an Astrophotography mode. You need to mount your Pixel on a tripod in practically pitch-black conditions (away from city lights) with the phone pointed toward a clear sky. Once your Pixel determines the conditions are right, it will show an “Astrophotography on” message.
In this mode, the Pixel takes 16 photos of 16 seconds each and merges them into one detailed photograph. You can also create a cool one-second astrophotography timelapse from this capture.
Zero Shutter Lag (ZSL)
Robert Triggs / Android Authority
Zero Shutter Lag has long been an invisible feature of the Google Camera experience. The philosophy is self-explanatory: the moment you press the shutter is the moment that should be captured. Users should be able to press the shutter button and forget about the image if they wish; the task is done right at the button press, with no further waiting for processing to complete.
However, this is easier said than done, especially considering features like HDR Plus (combining a burst of images) and pixel binning (combining adjacent pixels) are inherently compute-intensive.
ZSL gets around this by capturing frames before the shutter is pressed! In some situations, like HDR Plus, longer exposures are captured after the shutter is pressed, though this experience is often hidden from the viewfinder.
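The general technique behind ZSL can be sketched as a ring buffer. This is a minimal illustration under our own assumptions, not Google's code: the camera continuously writes viewfinder frames into a fixed-size buffer, and a shutter press simply grabs the frames captured just before the press.

```python
# Minimal ZSL-style ring buffer: frames stream in continuously, the oldest
# are silently discarded, and the shutter press retrieves the most recent
# frames - which correspond to the moment of the press, not some later time.

from collections import deque

class ZslBuffer:
    def __init__(self, capacity=8):
        # A deque with maxlen drops the oldest frame automatically when full.
        self.frames = deque(maxlen=capacity)

    def on_viewfinder_frame(self, frame):
        self.frames.append(frame)

    def on_shutter_press(self):
        # The buffered frames were captured *before* the press.
        return list(self.frames)

buf = ZslBuffer(capacity=4)
for timestamp in range(10):          # frames 0..9 stream in continuously
    buf.on_viewfinder_frame(timestamp)
captured = buf.on_shutter_press()    # only the last 4 frames remain
```

The fixed capacity is the design trade-off: a bigger buffer gives HDR Plus more frames to merge but costs memory, since full-resolution frames must be held at all times.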
ZSL was a more vital feature when phone processors were slow and needed a lot of time to process an image. Zero Shutter Lag is no longer advertised as strongly, as the feature is now practically universal across the smartphone ecosystem in good lighting conditions.
ZSL is also fairly challenging to orchestrate in current times, where our reliance on computational photography is at an all-time high. It gets further eclipsed by features like Night Sight and Astrophotography that intentionally take multiple seconds to capture photos.
Super Res Zoom
Kevin Convery / Android Authority
Historically, Pixels haven’t had the latest camera hardware, so Google had to rely on software magic to meet customer expectations. For instance, Google resisted adding a telephoto camera for optical zoom on the Pixel for quite some time and instead developed the Super Res Zoom feature that mimics the same functionality with digital zoom.
Super Res Zoom was introduced with the Pixel 3. On that phone, the feature merged many frames into a single higher-resolution picture (multi-frame super-resolution) instead of upscaling a crop of a single image, as conventional digital zoom does. This technique let the Pixel 3, with its single rear camera, achieve 2x zoom detail that was surprisingly better than expected from digital zoom.
Super Res Zoom salvages the results of digital zoom.
With Super Res Zoom, you would get more details if you pinched to zoom by 2x before taking a photo rather than digitally cropping the photo by 2x after taking it.
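A one-dimensional toy example shows why merging sub-pixel-shifted frames can recover detail that cropping a single frame cannot. This is illustrative only: real multi-frame super-resolution has to estimate the shifts from hand shake and handle noise, neither of which this sketch does.

```python
# Toy 1D multi-frame super-resolution: two frames of the same signal, sampled
# half a "pixel" apart, interleave onto a grid with twice the resolution.
# A single frame alone cannot see the fine alternating detail.

def interleave(frame_a, frame_b):
    """Merge two frames offset by half a pixel into one 2x-resolution signal."""
    out = []
    for a, b in zip(frame_a, frame_b):
        out.extend([a, b])
    return out

signal = [10, 20, 10, 20, 10, 20, 10, 20]  # fine detail at full resolution
frame_a = signal[0::2]                     # samples at even positions
frame_b = signal[1::2]                     # samples shifted by half a "pixel"
recovered = interleave(frame_a, frame_b)   # the fine detail is back
```

Each low-resolution frame on its own is a flat line of 10s or 20s; only by combining the shifted samples does the alternating pattern reappear, which is the core intuition behind Super Res Zoom.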
When Google finally made the jump to a telephoto lens with the Pixel 4 series, it used HDR Plus techniques on the telephoto lens to achieve even better results.
More recently, Google switched lanes, adapting Super Res Zoom to the larger primary and telephoto sensors on the Pixel 6 Pro and Pixel 7 Pro. For 2x zoom, Google crops into the inner portion of the 50MP primary sensor to produce 12.5MP photos, then applies remosaicing algorithms and uses HDR Plus with bracketing to reduce noise.
For 5x zoom, Google crops the 48MP telephoto sensor and applies the same techniques. For zoom levels other than 2x and 5x, Google uses Fusion Zoom, a machine-learning algorithm that merges images from multiple cameras into one.
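The 2x arithmetic above is straightforward to verify: cropping the central half of the sensor in each dimension keeps a quarter of the pixels. The sensor dimensions below are illustrative, not exact Pixel figures.

```python
# Crop arithmetic for sensor zoom: a 2x crop halves both width and height,
# leaving 1/4 of the pixels - which is how a ~50MP sensor yields ~12.5MP
# images at 2x with no upscaling involved.

def crop_megapixels(width, height, zoom):
    """Megapixels remaining after cropping to a given zoom factor."""
    cropped_w = width / zoom
    cropped_h = height / zoom
    return (cropped_w * cropped_h) / 1_000_000

# Roughly 50MP in a 4:3-ish layout (illustrative numbers).
sensor_w, sensor_h = 8160, 6144          # 8160 * 6144 = ~50.1MP
full = crop_megapixels(sensor_w, sensor_h, zoom=1)
at_2x = crop_megapixels(sensor_w, sensor_h, zoom=2)
```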
Once again, you, as a user, do not have to worry about any of this. Just pinch to zoom in, press the shutter button, and let Google’s Camera app figure out the rest.
Here are some zoom samples from the Pixel 7 Pro:
Portrait Mode takes photos with a shallow depth of field, letting the subject pull all the attention to itself while the background remains blurred.
Smartphones usually use two cameras located next to each other to capture depth information (just like our eyes!). Without this depth information, the phone would have difficulty separating the subject from the background.
However, Google managed to do a good job with Portrait Mode on the Pixel 2, and it did it with just one camera on the front and back. The company used computational photography and machine learning to overcome hardware limitations.
Portrait mode on single-camera setups (like the front camera) starts with an HDR Plus image. The Camera app then uses machine learning to generate a segmentation mask that identifies common subjects like people and pets.
If depth information is available in some way (when you have multiple cameras available, such as on the back), it generates a depth map, which helps apply the blur accurately.
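The final compositing step can be sketched as follows. This is a toy, one-dimensional illustration of the general mask-and-blur idea, not Google's pipeline: blur everything, then use the segmentation mask to keep the subject sharp.

```python
# Toy portrait compositing: blur the image, then composite the original
# (sharp) pixels back in wherever the segmentation mask marks the subject.

def box_blur_1d(row):
    """Tiny 3-tap box blur with edge clamping, standing in for a real blur."""
    n = len(row)
    return [
        (row[max(i - 1, 0)] + row[i] + row[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]

def composite(row, mask):
    """mask[i] == 1 keeps the sharp pixel (subject); 0 uses the blurred one."""
    blurred = box_blur_1d(row)
    return [row[i] if mask[i] else blurred[i] for i in range(len(row))]

row = [10, 10, 200, 200, 10, 10]      # a bright "subject" in the middle
mask = [0, 0, 1, 1, 0, 0]             # segmentation: subject pixels only
result = composite(row, mask)
# Subject pixels stay at exactly 200; background pixels are softened.
```

With a real depth map, the blur radius would additionally vary with distance from the focal plane, rather than being uniform as here.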
Real Tone is Google’s effort at making photography more inclusive for darker skin tones. It attempts to counter the skin tone bias that has long existed in photography, giving people of color a chance at being photographed more authentically.
As part of the Real Tone initiative, Google has improved its face identification in low-light conditions. The AI and machine learning models that Google trains are now fed a wider, more diverse data set. The company has also improved how white balance and automatic exposure work to better accommodate a wider range of skin tones.
Starting with the Pixel 7 series, the company also uses a new color scale (called the “Monk Skin Tone Scale”) that better reflects the full range of skin tones.
Dual Exposure controls and Color Temperature control
Aamir Siddiqui / Android Authority
As mentioned earlier, the Google Camera app does not have a Pro mode. Very little thinking is involved in clicking a good photo on a Pixel smartphone.
If you want some control over your photos, Google gives you three settings to play with:
- Dual Exposure Controls:
- Brightness: Changes the overall exposure. It can be used to recover more detail in bright skies or intentionally blow out the background if needed.
- Shadows: Changes only the dark areas. It manipulates the tone mapping instead of the exposure. It is helpful for high-contrast scenes, letting users boost or reduce shadows.
- Color temperature control: Changes the scene’s color temperature to make it warmer or cooler.
You can access all three sliders by long pressing on the viewfinder in Photo mode. These settings co-exist alongside HDR Plus, so you can still use computational photography features while slightly modifying specific settings to suit your taste better.
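A toy model of the first two sliders shows the difference between an exposure change and a tone-mapping change. The math here is illustrative only, not Google's tone mapper: Brightness scales every pixel, while Shadows lifts only pixels below a cutoff, leaving midtones and highlights alone.

```python
# Toy Dual Exposure controls: Brightness is a global exposure change, while
# Shadows is a tone-mapping change that touches only the darkest pixels.

def apply_brightness(pixels, gain):
    """Global exposure change: every pixel is scaled (clamped to 0-255)."""
    return [min(p * gain, 255) for p in pixels]

def apply_shadows(pixels, lift, cutoff=64):
    """Tone-map change: only dark pixels (below the cutoff) get boosted."""
    return [min(p * lift, 255) if p < cutoff else p for p in pixels]

pixels = [20, 60, 128, 240]
brighter = apply_brightness(pixels, gain=1.5)   # every value changes
lifted = apply_shadows(pixels, lift=1.5)        # only the two dark values change
```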
Ryan Haines / Android Authority
Google adopted “Computational RAW” with the Pixel 3, though the term doesn’t feature frequently in its marketing.
With the Google Camera app, you can save RAW image files (.dng) alongside processed image files (.jpg). RAW files traditionally allow you a wider range of adjustments for settings like exposure, highlights, shadows, and more.
But the trick here on the Google Camera app is that these RAW image files aren’t entirely raw and untouched. Google processes the RAW file through its computational photography pipeline before saving it.
RAW on Google Camera is Computational RAW.
This approach may alarm purists and hobbyists who want an untouched, unprocessed RAW file. But try as hard as you may, you will find it extremely difficult to process a RAW file yourself into a better result than Google’s heavily processed JPGs.
Computational RAW is the middle ground, applying some of the Google software magic and letting you apply some of your own on top. The results of this approach take advantage of Google’s processing expertise and your vision.
Macro Focus is one feature that relies heavily on hardware. It uses the ultrawide lens on the Pixel 7 Pro that is equipped with autofocus, letting you focus as close as 3cm away. When you come close to a subject, the Pixel 7 Pro will transition from the main camera to the ultrawide and let you take a macro photo.
You can also take a macro focus video on the Pixel 7 Pro.
Zak Khan / Android Authority
Motion Mode on Pixel smartphones adds a creative way to use long-exposure shots. It essentially takes a longer-exposure photo and adds a blur effect to either the background or the moving parts of the image.
Most newer Pixel phones have two blur effects: Action Pan and Long Exposure. Pixel 6a, Pixel 7a, and Pixel Fold users do not have the Action Pan effect.
Action Pan works best for moving subjects against a stationary background (where the background gets blurred), while Long Exposure is better for motion-based scenes (where the moving object is blurred).
Google Camera has a Motion Photos feature, which records a short, silent video when you capture a photo. It adds life to a still image, capturing candid moments before and after the shot. iOS offers a similar feature called Live Photos.
Motion Photos is different from Motion Mode. You can export Motion Photos as videos.
Aamir Siddiqui / Android Authority
If one thing is clear by now, it’s that the Google Camera app captures many frames even when you take a single photo. Top Shot takes advantage of this; you need to set it to Auto or On to use it.
Top Shot lets you save alternative shots from a Motion Photo or video. The camera app captures many frames before and after you tap the shutter button and then recommends a photo of better quality than the one you took, such as one where everyone in the frame is smiling and nobody is blinking.
Note that Top Shot is unavailable when you have enabled Flash, Night Sight, Selfie Illumination, or Social Media Depth features.
The Google Camera app saves data about the faces you photograph or record frequently if you turn on the Frequent Faces feature. The face data is saved on your Pixel smartphone and not sent to Google. When you turn off the feature, the data is deleted.
With Frequent Faces, the Google Camera app on Pixel 4 and later devices identifies and recommends better shots of faces you capture often. So you will get fewer blinking eyes and more smiling faces when using the Top Shot feature.
The feature also taps into the Real Tone feature, offering better auto-white balance for these recognized subjects.
Like other camera apps, the Google Camera app also lets you take videos from photo mode. Long-press the shutter button in photo mode to begin video recording.
Google Camera includes three-second and 10-second timer settings. Activating either timer also activates the Palm Timer: once you have framed yourself in the photo, raise your palm toward the camera, and the timer will begin counting down.
Guided Frame is an accessibility feature on the Google Camera app designed for the visually impaired community. This feature uses Google’s TalkBack mode to audibly guide you through the framing and photo-clicking process for a selfie.
Panorama and Photo Sphere
The Google Camera app also includes Panorama and Photo Sphere modes. Panorama lets you stitch multiple images to create one long image. Photo Sphere enables you to stitch multiple images to create an “image sphere” that shows off all around you.
Google Camera: Video features
A substantial focus of the Google Camera app is photos, and the extensive feature list and years of improvements are testimony to that attention. Video is also crucial to the Google Camera experience, but it doesn’t receive the same love. As a result, Pixel phones with the Google Camera app take excellent photos and merely good videos.
The Google Camera app can record video at up to 4K 60fps across all lenses, though limits vary depending on your Pixel phone. You can record with the H.264 (default) or H.265 codec, and you can also choose to record in 10-bit HDR.
There are a few stabilization options available within Google Camera:
- Standard: Uses optical image stabilization (OIS, if present), electronic image stabilization (EIS), or both.
- Locked: Uses EIS on the telephoto lens, or at 2x zoom if a telephoto lens is not present.
- Active: Uses EIS on the wide-angle lens.
- Cinematic Pan: For dramatic and smooth panning shots.
There is also a dedicated Cinematic video mode, which is like Portrait mode but for videos. Further, you also get the usual slow-motion and time-lapse video capabilities.
Google Camera: Availability
Ryan Haines / Android Authority
For devices other than Google Pixels, there are unofficial GCam ports. Third-party enthusiasts modify the Google Camera app to run on unsupported phones, and they tweak some of the myriad processing values to get subjectively different results that suit the hardware of a certain class of phones.
While you can use GCam ports to get the Google Camera experience on a non-Pixel device, installing unknown APKs always carries risk, and we recommend against it. Be careful about what you install on your phone: don’t install random APKs found on the internet, and only install apps from official sources and developers you trust.
The Google Camera app is free; it comes pre-installed on Pixel smartphones.
The official Google Camera app comes pre-installed on Pixel smartphones. If your phone does not have the app pre-installed, you cannot officially install the Google Camera app. Instead, you can install unofficial Google Camera ports at your own risk.