In a post on the Google Research blog, the team behind the astrophotography feature explains how it works. First, the camera captures up to 15 long-exposure frames, each limited to 16 seconds, and merges them into a single image. Together, these long exposures collect about 250 times more optical data than a regular photo. Then an on-device machine-learning model called a convolutional neural network automatically adjusts the photo's contrast and reduces noise, making celestial objects more prominent.
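To see why merging many exposures helps, here is a minimal sketch in Python with NumPy. It is not Google's actual pipeline; the synthetic "scene," the noise model, and the helper names (`capture_frame`, `merge_frames`) are illustrative assumptions. It only demonstrates the core statistical idea: averaging N noisy frames of the same scene cuts the random sensor noise by roughly the square root of N.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" scene brightness, observed through noisy exposures.
scene = rng.uniform(0.0, 1.0, size=(64, 64))

def capture_frame(scene, noise_sigma=0.2, rng=rng):
    """Simulate one long-exposure frame: scene signal plus sensor noise."""
    return scene + rng.normal(0.0, noise_sigma, size=scene.shape)

def merge_frames(frames):
    """Average aligned frames; averaging N frames cuts noise by ~sqrt(N)."""
    return np.mean(frames, axis=0)

frames = [capture_frame(scene) for _ in range(15)]  # 15 exposures, as in the post
merged = merge_frames(frames)

single_err = float(np.std(frames[0] - scene))  # noise in one frame
merged_err = float(np.std(merged - scene))     # noise after merging
print(single_err, merged_err)
```

With 15 frames, the merged image's noise is close to single_err divided by the square root of 15, which is why the stacked shot looks so much cleaner than any individual exposure. (The real pipeline also has to align the frames first, since stars drift across the sky during a four-minute capture.)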
Extremely low-light scenes like the night sky also require special processing to show a clear image on the phone's viewfinder (in other words, its screen) and to drive the camera's autofocus, so Google developed dedicated algorithms it calls "post-shutter viewfinder" and "post-shutter autofocus" to solve these problems. Both make astrophotography significantly easier on the user. Without them, you'd point your phone at the sky and see a blurry gray mess, or end up with a badly focused photo.
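The post-shutter idea can be sketched in the same spirit. This is an assumption-laden toy model, not Google's algorithm: instead of showing the raw, nearly black live feed, the phone can display a running average of the frames captured so far, boosted in brightness for the screen, so the preview gets cleaner as the exposure accumulates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical very dim scene (values near zero, as at night).
dark_scene = rng.uniform(0.0, 0.05, size=(48, 48))

running_sum = np.zeros_like(dark_scene)
errs = []  # noise level of the preview after each captured frame
for n in range(1, 6):
    # Each new frame adds the dim signal plus fresh sensor noise.
    frame = dark_scene + rng.normal(0.0, 0.1, size=dark_scene.shape)
    running_sum += frame
    preview = running_sum / n  # running average of frames so far
    errs.append(float(np.std(preview - dark_scene)))
    # Fixed display gain so the on-screen preview isn't nearly black.
    display = np.clip(preview * 8.0, 0.0, 1.0)

print(errs)  # noise drops as more frames accumulate
```

The preview refreshed after each frame is exactly what lets the user check composition and focus mid-capture, rather than waiting four minutes to discover the shot was ruined.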
As long as there is at least a modest amount of moonlight, Google Pixel phones deliver sharp images of starry skies, especially when mounted on a tripod. The camera also produces clear, vibrant images of landscapes at dawn or dusk, in case you want a change of scenery and perspective.