6 Special Stargazing Events That Include October 14’s Solar Eclipse


On Saturday, October 14, a “ring of fire” annular solar eclipse will sweep across nine states in the U.S. Southwest. Everyone in the Americas will see at least a partial solar eclipse.

Many events being staged for the eclipse—either for the ring or merely for a small partial—last just a few hours. However, October is also a fabulous time for stargazing, when many annual star parties are held.

If you’re just as interested in dark skies and stargazing as you are in the eclipse, these six star parties—some heavily themed by the eclipse and others that merely have it as a sideshow—could be for you.

1. Dark Sky Festival, California (78% eclipse)

When: October 12-15

Where: Death Valley Starcamp, Indian Ranch Road, Panamint Valley, California

The annual California Dark Sky Festival features three nights of telescopes, astronomy lectures by professional astronomers, nightscape photography workshops, afternoon demonstrations, giveaways, camping, a food truck and a 78% partial solar eclipse on October 14.

2. Heart of America Star Party, Missouri (63%)

When: October 12-15

Where: Butler, Missouri

Hosted by the Astronomical Society of Kansas City and aimed at both beginners and experienced astronomers, the annual Heart of America Star Party will feature guest speakers, a solar filter-making workshop and eclipse-viewing. It’s held on a 40-acre dark-sky site near Butler, Missouri, south of Kansas City.

3. Dark Skies Star Party, Illinois (50% eclipse)

When: October 12-15

Where: 10149 County Highway 11, Chandlerville, Illinois

Stargazers and amateur astronomers will descend on the Jim Edgar Panther Creek State Fish and Wildlife Area for the annual Illinois Dark Skies Star Party, three days of sky-gazing with university speakers on both Friday and Saturday. A 50% partial solar eclipse will be visible on Saturday. Hosted by the Sangamon Astronomical Society, the event has plenty of room for tent camping and RVs.

4. Solar Eclipse Village, Texas (‘ring of fire’)

When: October 14-16

Where: Garner State Park, Concan, Texas

Solar Eclipse Village will feature an astrophotography session and a guided stargazing tour on Saturday, October 14, but only after visitors have witnessed a “ring of fire” solar eclipse at 11:49 a.m. CDT, lasting 4 minutes 35 seconds. Expect eclipse lectures, science experiments, activities and guided eclipse viewing, with Sunday featuring a lengthy preview of April 8’s total solar eclipse, which remarkably will pass over exactly the same spot.

5. Boulder Mail Trail, Utah (‘ring of fire’)

When: October 10-14

Where: Grand Staircase-Escalante National Monument, Utah

If you’re a more active type and want dark skies and a lonely eclipse amid incredible surroundings, consider this last-minute Boulder Mail Trail backpacking trip ($1,675 per person) through the Grand Staircase-Escalante National Monument. Hosted by Wildland Trekking, this guided four-day trek through canyon country will encounter the “ring of fire” while hiking out of the lower Escalante River Canyon east of Escalante, Utah.

6. Eclipse Into Nature Star Party, Oregon (‘ring of fire’)

When: October 13-14

Where: Running Y Resort, Klamath Falls, Oregon

“Eclipse Into Nature” is a one-off two-day event at the Running Y Resort in Klamath Falls, Oregon, featuring a star party on October 13. Expect an evening of stargazing with telescopes and lectures, followed by a “ring of fire” solar eclipse on Saturday. Eclipse day will start at 6:00 a.m. PDT with live music and a breakfast buffet before the main event. Solar eclipse glasses will be provided.

I’m an expert on eclipses—the editor of WhenIsTheNextEclipse.com and author of The Complete Guide To The Great North American Eclipse of April 8, 2024. For the very latest on the “ring of fire” solar eclipse—including travel and lodging options—check my main feed for new articles each day.

Wishing you clear skies and wide eyes.


Department of Physics and Astronomy makes astrophotography more accessible to students – The Ithacan


Ithaca College purchased new astrophotography equipment — like telescopes, cameras, and light filters — in Summer 2023. Poor weather conditions have made capturing photos of the sky with the new equipment challenging, but the Department of Physics and Astronomy is working toward making the sky and the new equipment more accessible to students.

Images of space are captured by placing specialized cameras in front of a telescope, a setup that functions similarly to the human eye and a pair of glasses: the telescope makes the image accessible to the camera. However, the newly purchased equipment is far more powerful than a pair of glasses. By using different light filters, astrophotography cameras can capture light that is not part of the visual spectrum.

Sophomore Suryash Malviya, a physics and astronomy and computer science double major, worked with the new equipment extensively over the summer. The funding for the equipment — which cost $1,500 — came from a combination of alumni gifts and the Physics and Astronomy Department’s annual research budget. 

Malviya spent eight weeks doing astronomical research at the college through the Summer Scholars Program.

The Summer Scholars Program is a selective program that gives students in the School of Humanities & Sciences the opportunity to work with staff on paid research projects. Malviya worked with Matthew Price, professor in the Department of Physics and Astronomy.

Malviya said astrophotography practically supplements the theoretical study of the components of the sky.

“Astrophotography is the art of visually recording what’s in the sky,” Malviya said. “We can only see a particular part of the light spectrum, so astrophotography helps us see what’s out there.”

Price and Malviya used specialized cameras in their research, including planetary cameras and deep-sky cameras. However, smoke from the Canadian wildfires polluted the sky over the summer, making research difficult.

“We didn’t get a whole lot of data,” Price said. “You can’t see through the smoke, so a big optical telescope is going to be useless in the clouds.”

High concentrations of smoke made it unsafe to go outside and use the equipment. Malviya said the observatory was inaccessible over the summer because of these weather complications.

“You don’t get to explore all of the features of the camera unless you put it into use in the real world,” Malviya said.

Price said smoke from wildfires could have been dangerous to the telescopes and cameras, which forced Malviya and Price to be cautious in their research.

However, at the end of summer, the college ordered a radio telescope that functions differently from a traditional telescope. Instead of using optics, it collects and amplifies radio waves, which sidesteps both the problem of unclear images and the risk of damage to the cameras.

“It might be cloudy, but you can still use the [radio] telescope and do your work from inside, so it’s really helpful,” Malviya said.

A new optical telescope was also installed in August 2022 at the Clinton B. Ford Observatory, located in the Natural Lands, after it was forced to close following the COVID-19 pandemic.

“[The observatory] was taken over by the nature [during the pandemic] because no one was maintaining it,” Malviya said. “A bunch of mice got inside and they chewed up a bunch of wires.”

The observatory reopened in Fall 2022. Price plans to open the observatory to students and use the new astrophotography equipment there starting Fall 2023 to make up for lost time.

“We will start having regular public viewing nights, but there are still some things we need to do,” Price said. “We truly only had the new optics and the new mount up there for just a year and we lost months of summer work because of the smoke and rain.”

Students need to be involved in independent astronomy research in order to use the new astrophotography equipment, but this equipment is not necessary to capture the sky. 

Senior Nikolaj Konieczny, a physics and astronomy major and president of the Astronomy Club, practices astrophotography using a telescope and the camera on his phone.

“I want astrophotography to be something that’s accessible to anybody,” Konieczny said. “I don’t see why you shouldn’t have a picture of Saturn on your phone. Everybody has the ability to put their phone right up to the eyepiece and snap a picture of that.”

The Astronomy Club, which is advised by Price, has access to its own equipment that can now be used by anyone, including a large telescope. The Astronomy Club plans to meet every week and will soon have access to the observatory, where they will host their own public viewing nights. 

“Previously, [the Astronomy Club] was an academic-only club, and I personally don’t see why we should gatekeep the universe from other people,” Konieczny said. “It’s beautiful. It’s something you have to share with everybody.”


Exploring the universe through astrophotography – Estes Park Trail-Gazette


The subject of this month’s Estes Valley Astronomical Society (EVAS) meeting will be “exploring the universe through astrophotography,” with Dr. RDee Sherrill. EVAS, in conjunction with the Estes Park Memorial Observatory, is offering a free public open house and lecture on Saturday, Sept. 30, 2023. The goal of EVAS is to promote amateur astronomy and education in the Estes Valley.

Astrophotography provides us with insights into the mysteries of the universe not visible with the naked eye. Whether we enjoy the technical or physical side of this hobby, the pictures we produce are special and personal, the result of many hours of perseverance; they help give meaning to the beauty and awe of the universe.

Dr. Sherrill has the experience in equipment, techniques and processing to make this presentation interesting to everyone from beginners to experienced photographers. He will share his knowledge of several niches of astronomical imaging, especially deep-sky objects including nebulae, star clusters, galaxies and more.

Dr. RDee Sherrill will lead this month's Estes Valley Astronomical Society (EVAS) meeting on "exploring the universe through astrophotography."(Estes Park Memorial Observatory/Courtesy Photo)

After receiving his PhD in nuclear chemistry, Dr. Sherrill worked at the DOE Rocky Flats Plant, the NASA Johnson Space Center, the DOE Brookhaven National Laboratory and several private companies. RDee and his wife Beth moved full time to Estes Park in 2011, and he retired in 2013. Upon retirement, RDee decided to further pursue his lifelong interest in astronomy by building an observatory and delving into the astrophotography of deep-space objects. RDee is now a full-fledged, self-taught astrophotographer who can rival most professionals in the field.

Estes Park Memorial Observatory.(James Frank/Courtesy Photo)

The observatory is just north of the high school at 1600 Manford Avenue. Please park in the teacher’s parking lot adjacent to the observatory. The doors will open at 7 p.m. and the meeting will start at 7:30 p.m. The presentation, including a question-and-answer period, lasts about an hour. After the presentation, weather permitting, we will look through the telescope at various celestial objects. Information about the meeting can be found on the observatory website at: www.AngelsAbove.Org.


The story behind the photo: ‘Swamped Skies’ by Joshua Rozells


By AG STAFF

September 26, 2023

This forms part of a series of Q&As with winning photographers from this year’s Australian Geographic Nature Photographer of the Year awards.

Joshua Rozells was crowned the winner of the ‘Our Impact’ category in the 2023 Australian Geographic Nature Photographer of the Year (AGNPOTY) competition.

His winning image, titled ‘Swamped Skies’, shows light pollution caused by satellites in the sky above Western Australia’s Pinnacles Desert.


Can you tell us the back story of this photo? 

This photo was taken on an astrophotography trip I went on in early 2022 to the Pinnacles, Western Australia. The photo was originally meant to be a star trail, but I changed it to show the number of satellites in the night sky after I reviewed the photos. The photo shows the impact that satellites are having on our natural night sky, an issue that did not exist until a few years ago and that is increasing at an alarming rate due to the exponential rise in the number of satellites launched into Earth’s orbit.

What is your connection to the subject matter? 

As a hobbyist astrophotographer, I find satellites increasingly becoming a problem. There hasn’t been a night of astrophotography when I haven’t captured satellites in my photos, and the issue has progressively gotten worse. For astrophotographers like myself, satellites are a nuisance but not a deal-breaker: even though I don’t like them in my photos, I can live with them or Photoshop them out if I really want to. But for astronomers, it is becoming a problem that impacts their ability to conduct research.

Where is it taken, and what led you to this site? 

The photo was taken at the Pinnacles. Between the dark skies and the unique limestone structures, it is one of the most amazing locations for astrophotography in Western Australia! I have been to the Pinnacles many times to do astrophotography.

Photographer Joshua Rozells. Image credit: supplied by Joshua Rozells

Were you unexpectedly there or had you planned to cover this moment? 

The shot was taken unexpectedly; I had originally gone to the Pinnacles to do a star trail. When I was out in the field I noticed a few satellites in the sky but didn’t think much about it. It was only when I got home and reviewed the photos that I noticed the sheer number of satellites present in the sky that night. Once I realised how many there were, I decided to edit the photo to show them rather than turning it into a star trail.

What are the technical challenges of photographing this kind of scene?

Astrophotography requires a lot of planning, and it can all quickly come undone if the weather doesn’t cooperate. I often plan my astrophotography trips months in advance to ensure that I get the best photo possible.

How did you prepare to take this image? 

Going out for an astrophotography trip always requires a fair bit of preparation. Firstly, I checked the moon’s luminosity and rising/setting times to find a night without any moonlight, so the stars would be more visible. A few days before the trip I checked the weather conditions (most importantly cloud coverage and wind) to make sure they would be good for astrophotography. I arrived at the Pinnacles during daylight so that I could scout out a good location to take the photo. Finally, I set up the camera on a tripod and attached an intervalometer to take consecutive photos.
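
The moonlight check described above can be roughed out in code. This is a toy sketch, not part of any real planning tool: it approximates the Moon’s illuminated fraction purely from the mean synodic month and one reference new moon, ignoring orbital eccentricity (real planners use proper ephemerides), so it can be off by a day or so.

```python
import datetime
import math

SYNODIC_MONTH = 29.530588  # mean length of one lunar cycle, in days
KNOWN_NEW_MOON = datetime.datetime(2000, 1, 6, 18, 14)  # a reference new moon (UTC)

def moon_illumination(when: datetime.datetime) -> float:
    """Rough fraction of the Moon's disc that is illuminated (0 = new, 1 = full)."""
    days = (when - KNOWN_NEW_MOON).total_seconds() / 86400.0
    phase = (days % SYNODIC_MONTH) / SYNODIC_MONTH  # position in the cycle, 0..1
    return (1 - math.cos(2 * math.pi * phase)) / 2

def dark_enough(when: datetime.datetime, threshold: float = 0.15) -> bool:
    """Flag nights where moonlight should not wash out the stars."""
    return moon_illumination(when) < threshold
```

A night scoring near 0 is ideal for star fields; anything near 1 (full moon) is better left for lunar photography.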

Did you have special equipment? 

Other than the camera and tripod, I used an intervalometer to enable the camera to take consecutive photos. I also used two panel lights to light up the Pinnacle in the foreground.

Have you covered this topic/subject before?

I have not done a photo that showed the amount of satellites in the sky before.

Why is this form of photography important to you?

Astrophotography is important to me because it helps me to see what the human eye cannot see. As a Christian it also helps me to reflect on the greatness of God; I’m always in awe of His magnificent creation when I’m out stargazing. When I’m out stargazing I can’t help but to meditate on the Bible passage Psalm 19:1, which says “The heavens declare the glory of God; the skies proclaim the work of his hands”.

Any additional thoughts?

This photo clearly shows the impact that satellites are having on our night sky. The proliferation of satellites is increasingly becoming a problem for astronomers. In 2021, over 1,700 spacecraft and satellites were launched into orbit. SpaceX’s Starlink satellites are among the worst offenders for light pollution, as they are low Earth orbit satellites that travel in satellite trains. One can only assume the issue will grow exponentially in the next few years, with SpaceX alone intending to launch over 40,000 satellites in total. The space industry is almost entirely unregulated: there is no limit on the number of satellites anyone is able to launch, and no regulation in place to minimise the light pollution they cause. Organisations such as the International Astronomical Union’s Center for the Protection of the Dark and Quiet Sky are advocating for the regulation and protection of the night sky. But more needs to be done by the space industry, governments and NGOs so that we can all enjoy the beauty of the natural night sky for generations to come.


Related: Winners: 2023 Australian Geographic Nature Photographer of the Year




Out-of-this-world discovery lands scientist trio Astronomy Photographer of the Year 2023


The 15th Astronomy Photographer of the Year competition has crowned Marcel Drechsler, Xavier Strottner, and Yann Sainty as the overall winners. Their winning photograph, “Andromeda, Unexpected,” reveals a colossal plasma arc adjacent to the Andromeda Galaxy (M31). This revelation has triggered a global scientific collaboration, as it might be the largest structure of its kind near our Milky Way.

The Andromeda Galaxy is our closest spiral neighbor, making it a frequent subject of astrophotography. What makes this find remarkable is the arc’s massive size, spanning about 1.5 x 0.45 degrees, and its proximity, only 1.2 degrees away from M31’s center, southeast of the galaxy’s main body. But to capture it properly you will need one of the best telescopes for photography.

• These are the best cameras for astrophotography – perfect for capturing deep space in all its beauty

Astronomy Photographer of the Year 2023

Marcel Drechsler, Xavier Strottner, and Yann Sainty expressed their gratitude, saying, “It’s an enormous honor for our team to receive this important award, and we are grateful for all the support, friendship, and encouragement we have received along our journey. It encourages us to continue our passion for astrophotography and research.” László Francsics, a judge and astrophotographer, praised the image for its spectacular and valuable contribution to astrophotography.

The Young Astronomy Photographer of the Year title went to Chinese teenagers Runwei Xu and Binyu Wang for their collaboration on The Running Chicken Nebula, described as “strikingly beautiful” by judge Yuri Beletsky.

Other winners include Circle of Light by Andreas Ettl, The Dark Wolf – Fenrir by James Baguley, A Sun Question by Eduardo Schaberger Poupeau, and Grand Cosmic Fireworks by Angel An. In the Annie Maunder Prize for Image Innovation category, John White’s Black Echo stood out, visually representing the sound of a black hole.


Dr. Ed Bloomer, an astronomer at the Royal Observatory Greenwich, commented, “The highlight of this year is perhaps a number of genuine discoveries being imaged, but we’ve had wonderful efforts in every category and I’m particularly pleased to see the continued strength of our young entrants and those eligible for The Sir Patrick Moore Prize for Best Newcomer. It has led to some intense debate amongst the judges as we try to choose the very best of the best, but we don’t mind!”

This year’s Astronomy Photographer of the Year competition, organized by the Royal Observatory Greenwich, saw over 4,000 entries from 64 countries. The overall winners have been awarded a cash prize of £10,000; winners of each category will take home £1,500, while runners-up and highly commended entries will receive £500 and £250 respectively.

All winning photos and a selection of shortlisted images will be on exhibition at the National Maritime Museum in London, UK, from September 16 with entry costing £10. To see the full list of winning images head to the Royal Museums Greenwich website.

See the best telescopes for beginners, and the best telescopes for kids


Google Pixel 8 leaks reveal AI-powered camera features and major software update policy


The camera bump of the Google Pixel 7a.

June Wan/ZDNET

It’s hardly worth calling them “leaks” anymore, given the volume and the good chance that they’re essentially Google-approved advertising for next week’s release. Still, thanks to a slew of leaks over the weekend, some more details about the Google Pixel 8 and Pixel 8 Pro have been revealed.

New AI-enhanced cameras

For starters, a Google promo video touting the phone’s AI-assisted photo and video tricks was provided to 91mobiles.com. The promo begins by showing off “Video Boost,” which provides smoother video, and proceeds to show an upgraded Night Sight for taking photos in low light. You can think of it as an enhanced HDR capability.

The video goes on to show off the “Audio Erase” feature that removes unwanted background noise from videos, providing an example of a person playing a musical instrument outdoors while a siren blares in the distance. Once the feature is applied to the video, only the music remains. 

The phone’s “Real Tone” feature is also on full display. The company says it not only recognizes more diverse faces in more lighting conditions, but also uses an auto-white-balance model to determine color in a picture and reflect a variety of skin tones in photos, looking more true to life.

Google Pixel 8 series

Here’s a closer look at the Pixel 8 Pro (left) and Pixel 8 (right).

Google

After briefly showing off some astrophotography and Super Res Zoom, next up was a face-swapping feature: if someone in a photo is captured with a bad facial expression, it can be swapped out for a better one (the phone captures multiple images to do this), or AI can edit the face on its own.

Also: Google just ‘leaked’ the unreleased Pixel 8 Pro, and fans of this color will be delighted

The video closed by showing the phone’s manual camera controls, which Google says are “modeled after DSLR” and lets users manipulate shutter speed, focus, ISO, and more.

The Pixel 8 is getting “Pro” upgrades

And lastly, we now know that the regular Pixel 8 will get a display upgrade over the 7, from a 90Hz refresh rate to 120Hz. Of course, that’s not the only upgrade from the 7 to the 8, as a couple of things are now clear from these leaks. 

One, Google wants to provide a little more separation between its flagship and mid-level phones. While the more recent Pixel 7a was well-received, one complaint was that it was hard to justify over the frequently discounted Pixel 7. That seems to change this year, as the non-Pro flagship phone has enough to distinguish itself and justify its price increase.

Also: The Pixel 8 could bring back a buzzword that was once the talk of the tech industry

Two, Google is taking the camera on the Pixel 8 series very seriously. The 8 Pro’s specs, for example, include a 48 MP Quad PD telephoto lens, super resolution zoom of up to 30x, and manual controls, making for what is, on paper, a stellar camera experience.

Key buying info

In addition, the 91mobiles tipster also revealed that Google Pixel 8 Pro preorders would come with a free Pixel Watch 2. Although exact pricing hasn’t been revealed, that’s roughly a $350 value and makes the high-end price of the 8 Pro a little more palatable.  

We also learned that the Pixel 8 series will have up to seven years of upgrades for security and OS. It’s worth noting though that this doesn’t necessarily mean seven years of OS upgrades — it could be, say, five years of OS plus two more of security.

While we know almost everything there is to know about Google’s new flagship phones, you can expect this isn’t the last of the leaks. The new phones will be available October 4, so stay tuned for ZDNET’s reporting when the products are officially unveiled.




AI face swap, Pro controls, and more


Google Pixel 8 Pro Official Teaser

TL;DR

  • Camera features of the Pixel 8 series have leaked through a seemingly official Google video.
  • The Pixel 8 Pro is getting a new Pro camera mode with manual adjustment options for shutter speed, ISO, and more.
  • Google is also adding a new AI face-swapping feature to fix bad facial expressions.

The floodgates have opened for Google Pixel 8 series leaks. There is little that we don’t know about the upcoming phones ahead of their official announcement on October 4. The latest leak accompanies a price and camera specs reveal that happened over the weekend and gives us a seemingly official Google video detailing the camera features of the Pixel 8 and Pixel 8 Pro.

Tipster Kamila Wojciechowska paired up with 91Mobiles to leak the Pixel 8 series video, breaking down all the new AI and photography tricks Google plans to deploy with the latest flagships. Features presented in the clip include new manual camera controls “modeled after DSLR controls” for the Pixel 8 Pro. Google is finally giving users the freedom to play around with shutter speed, ISO, focus, and other aspects of photography on the Pro Pixel.

There’s also a new face-swapping feature on the Pixel 8 series, likely powered by AI. If you happen to capture someone with a bad or weird facial expression, you’ll be able to swap it out for a better one. It looks like the new Pixels will capture multiple images and offer the user options to choose the best expression for face-swapping purposes. Alternatively, Google might be using AI to adjust facial expressions — a feature you can find these days on many third-party AI image editing apps.


All the features that you get on Google Pixel smartphones


The Google Pixel Fold, Pixel 7 Pro, and Pixel 7a cameras up close.

Ryan Haines / Android Authority

Google Pixel smartphones have been some of the best Android smartphones you can buy, but their presence at the top is surprising if you take a deeper look at the spec sheet. The Pixels barely have top-of-the-line specifications, trailing behind most market leaders. However, thanks to some nifty software magic, Google can extract the most possible value out of its hardware. We can see this in action with the Google Camera app on Pixel smartphones, which enables some cool photography features. Here are all the features that you get on the Google Camera app.

What is Google Camera?

The Google Pixel 7 Pro camera app in use, shooting the exterior of a house.

Ryan Haines / Android Authority

Google Camera is the default camera application shipped on Google Pixel smartphones. Most OEMs ship their own modified camera app on their smartphones as part of their Android skin, so Google is no different.

What makes Google Camera unique is that it can extract the best results out of dated camera hardware often found on Pixel smartphones. The Google Camera app contains most of the algorithms responsible for Google’s software magic on photos.

These software optimizations are so potent that third-party modders regularly attempt to port the latest Google Camera app from Pixel devices to other Android smartphones, improving the photography prowess of their non-Pixel hardware.

The Google Camera app was initially released to the public on the Google Play Store. But those days are long gone. Google Camera is now exclusive to Pixel smartphones. If you spot the app on a non-Pixel smartphone, it will likely be a third-party Google Camera port (often called “GCam” in this context).

Google Camera: Photo features

The Google Camera app has many features, but it also misses out on some, like a dedicated manual or Pro mode for photos and videos. Google frames the absence of a manual mode as a feature in itself: the average user is meant to trust the Pixel to take the best shot.

Despite such a glaring shortcoming, the Google Camera app is still one of the best camera apps for Android. The presence of these other features compensates for the missing Pro mode.

HDR Plus

The highlight feature of the Google Camera app is HDR Plus, which was added around the release of the Nexus 6. HDR Plus is the engine behind HDR imaging in the Google Camera app. In its early announcement posts in 2014, the company said it uses “computational photography” for HDR Plus.

When you press the shutter button in the Google Camera app, HDR Plus captures a rapid burst of three to 15 pictures and combines them into one.

In low-light scenes, HDR Plus takes a burst of shots with short exposure times, aligns them algorithmically, and then replaces each pixel with the average color at that position across the burst. Using short exposures reduces blur, while averaging shots reduces noise.

Google Camera HDR Plus in low light
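
The align-and-average idea can be sketched in a few lines. This is a toy illustration, not Google’s pipeline: it uses brute-force whole-pixel alignment and a plain mean, whereas HDR Plus uses far more sophisticated tile-based alignment and robust merging.

```python
import numpy as np

def align(frame, reference, max_shift=4):
    """Brute-force integer alignment: find the (dy, dx) shift within a small
    search window that best matches the reference, and apply it."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.mean((np.roll(frame, (dy, dx), axis=(0, 1)) - reference) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return np.roll(frame, best, axis=(0, 1))

def merge_burst(frames):
    """Align every frame to the first, then average per pixel to cut noise."""
    reference = frames[0]
    aligned = [reference] + [align(f, reference) for f in frames[1:]]
    return np.mean(aligned, axis=0)
```

Averaging N aligned short exposures leaves the scene content intact while random sensor noise shrinks, which is the core trade the paragraph above describes.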

In scenes with high dynamic range, HDR Plus follows the same technique, and it manages to avoid blowing out the highlights and combines enough shots to reduce noise in the shadows.

Google Camera HDR Plus in high dynamic range

In scenes with high dynamic range, the shadows can often remain noisy as all images captured in a burst remain underexposed. This is where the Google Camera app uses exposure bracketing, making use of two different exposures and combining them.

HDR Plus with Bracketing is the highlight feature of the Google Camera app.

The experience with exposure bracketing gets complicated with Zero Shutter Lag (ZSL, more on this feature below). HDR Plus works around ZSL by capturing frames before and after the shutter press. One of the shorter exposure frames is used as the reference frame to avoid clipped highlights and motion blur. Other frames are aligned to this frame, merged, and then de-ghosted through a spatial merge algorithm that decides per pixel whether image content should be merged or not.
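
The bracketing idea, choosing per pixel between a clean long exposure and a brightness-matched short one wherever the long frame blows out, can be sketched as follows. The weighting ramp and thresholds here are invented for illustration; Google’s actual merge is a far more elaborate spatial, de-ghosting algorithm.

```python
import numpy as np

def merge_exposures(short, long, gain, clip=0.98):
    """Blend a short and a long exposure of the same scene.

    `short` and `long` are linear-light images in [0, 1]; `long` was exposed
    `gain` times longer. Where the long frame nears clipping, fall back to the
    brightness-matched short frame; elsewhere keep the cleaner long frame."""
    matched_short = np.clip(short * gain, 0.0, 1.0)  # match the long frame's brightness
    # Per-pixel weight: 1 well below the clip point, ramping to 0 where blown out
    w = np.clip((clip - long) / 0.1, 0.0, 1.0)
    return w * long + (1 - w) * matched_short
```

The effect is that highlight detail survives from the short frame while shadows inherit the lower noise of the long one.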

If all of this sounds complicated and confusing to you as a user, fret not. The Google Camera app doesn’t require you to worry about these details. You just have to click photos; Google’s algorithms will handle the rest.

Here are some camera samples from the Pixel 7 Pro’s primary camera:

Night Sight

Night Sight on Google Camera with HDR Plus with Bracketing

Night Sight is essentially HDR Plus in very low light. Because there is so little light, the exposures and burst limits can be considerably longer; a night shot therefore takes more time, and a stronger element of motion must be compensated for.

You can expect a Night Sight photo to take about one to three seconds, and we’d advise you to wait another second after pressing the shutter button. Pixel smartphones will automatically enable Night Sight when it is dark, though you can manually toggle the mode if necessary. Note that Night Sight does not work if you have Flash turned on.

On newer Pixel smartphones, the denoising process in the HDR Plus process during Night Sight uses new neural networks that run on the Tensor processor. This has yielded improvements in the speed of a Night Sight shot.

Here are some Night Sight samples from the Pixel 7 Pro:

Astrophotography

Pixel 4 astrophotography mode in Sutherland.

To use the astrophotography mode, you need to mount your Pixel smartphone on a tripod in practically pitch-black conditions (away from city lights), with the phone pointed toward a clear sky. Once your Pixel determines the conditions are right, it will show the message “Astrophotography on.”

In this mode, the Pixel phone will take sixteen 16-second exposures and merge them to produce one detailed photograph. You can also generate a cool one-second astrophotography timelapse from the same capture.
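The reason stacking many long exposures works is that averaging N frames of a static scene cuts random sensor noise by roughly the square root of N. A toy simulation (not Google's pipeline) shows the effect:

```python
import numpy as np

rng = np.random.default_rng(42)
true_scene = 0.02                      # a very dim, static patch of sky
# Simulate sixteen noisy long exposures of the same static scene.
frames = true_scene + rng.normal(0.0, 0.05, size=(16, 1000))

single_error = (frames[0] - true_scene).std()   # noise in one exposure
stacked = frames.mean(axis=0)                    # align-and-average the burst
stacked_error = (stacked - true_scene).std()     # noise after stacking
# Averaging 16 frames cuts random noise by roughly sqrt(16) = 4x.
```

The real pipeline must also align the frames first, since stars drift and hands shake over a four-minute capture.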

Zero Shutter Lag (ZSL)

Google Pixel 7 camera housing

Robert Triggs / Android Authority

Zero Shutter Lag has long been an invisible feature in the Google Camera experience. The philosophy with Zero Shutter Lag is self-explanatory: What you click should be immediately captured. Users should be able to click the shutter button and forget about the image if they wish. The task should be done right at the button press, requiring no further waiting for processing to complete.

However, this is easier said than done, especially considering features like HDR Plus (combining a burst of images) and pixel binning (combining adjacent pixels) are inherently compute-intensive.

ZSL gets around this by capturing frames before the shutter is pressed! In some situations, like HDR Plus, longer exposures are captured after the shutter is pressed, though this experience is often hidden from the viewfinder.
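Conceptually, ZSL amounts to a ring buffer of recent viewfinder frames. A minimal sketch with hypothetical names (not Google's implementation):

```python
from collections import deque

class ZslBuffer:
    """Sketch of a zero-shutter-lag ring buffer: the camera pipeline keeps
    the most recent N viewfinder frames, so a shutter press can grab frames
    that were captured *before* the press."""
    def __init__(self, capacity=8):
        self.frames = deque(maxlen=capacity)  # oldest frames fall off

    def on_viewfinder_frame(self, frame):
        self.frames.append(frame)

    def on_shutter_press(self, burst_size=3):
        # Hand the most recent pre-press frames to the merge pipeline.
        return list(self.frames)[-burst_size:]
```

A shutter press then returns frames that already exist, so the user has nothing to wait for; only the merge happens afterward.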

ZSL used to be a more vital feature when phone processors were slow and required a lot of time to process an image. Zero Shutter Lag no longer gets advertised as strongly, as the feature is now practically seen across the smartphone ecosystem in ideal lighting conditions.

ZSL is also fairly challenging to orchestrate in current times, where our reliance on computational photography is at an all-time high. It gets further eclipsed by features like Night Sight and Astrophotography that intentionally take multiple seconds to capture photos.

Super Res Zoom

Pixel Pro Zoom 5X Shot

Kevin Convery / Android Authority

Historically, Pixels haven’t had the latest camera hardware, so Google had to rely on software magic to meet customer expectations. For instance, Google resisted adding a telephoto camera for optical zoom on the Pixel for quite some time and instead developed the Super Res Zoom feature that mimics the same functionality with digital zoom.

Super Res Zoom was introduced with the Pixel 3. On that phone, the feature merged many frames into a higher-resolution picture (multi-frame super-resolution) instead of upscaling a crop of a single image, as conventional digital zoom does. This technique allowed the Pixel 3, with its single camera, to achieve 2x zoom detail far better than expected from digital zoom.

Super Res Zoom salvages the results of digital zoom.

With Super Res Zoom, you would get more details if you pinched to zoom by 2x before taking a photo rather than digitally cropping the photo by 2x after taking it.
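The multi-frame idea can be sketched in one dimension: if each frame samples the scene at a known sub-pixel offset (supplied by natural hand shake in the real feature), interleaving the samples reconstructs a denser grid than any single frame. A toy illustration, not Google's algorithm:

```python
import numpy as np

def super_res_2x(frames_with_shifts, low_res_width):
    """Toy 1D multi-frame super-resolution: each low-res frame samples the
    scene at a known half-pixel offset (0 or 1), so interleaving the frames
    reconstructs a grid twice as dense as any single frame."""
    hi = np.zeros(2 * low_res_width)
    counts = np.zeros(2 * low_res_width)
    for shift, frame in frames_with_shifts:
        # A frame offset by `shift` half-pixels fills every other hi-res slot.
        hi[shift::2] += frame
        counts[shift::2] += 1
    return hi / counts
```

Sampling an 8-value scene with two 4-pixel frames at offsets 0 and 1 recovers all 8 values, which is why zooming before the shot beats cropping after it.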

When Google finally made the jump to a telephoto lens with the Pixel 4 series, it used HDR Plus techniques on the telephoto lens to achieve even better results.

Google switched lanes more recently, adapting to the larger primary and telephoto sensors on the Pixel 6 Pro and Pixel 7 Pro for its Super Res Zoom feature. For 2x zoom, Google crops into the inner portion of the 50MP primary sensor to produce 12.5MP photos. It then applies remosaicing algorithms and uses HDR Plus with bracketing to reduce noise.

For 5x zoom, Google uses a crop of the 48MP telephoto sensor and the same techniques. For zoom outside of 2x and 5x, Google uses Fusion Zoom, a machine-learning algorithm that merges images from multiple cameras into one.

Google Camera Fusion Zoom on Super Res Zoom

Once again, you, as a user, do not have to worry about any of this. Just pinch to zoom in, click the shutter button, and let Google's Camera app figure out the rest.

Here are some zoom samples from the Pixel 7 Pro:

Portrait Mode

Google Pixel 2 Portrait Mode

Portrait Mode takes photos with a shallow depth of field, letting the subject pull all the attention to itself while the background remains blurred.

Smartphones usually use two cameras located next to each other to capture depth information (just like our eyes!). Without this depth information, the phone would have difficulty separating the subject from the background.

However, Google managed to do a good job with Portrait Mode on the Pixel 2 using just one camera each on the front and back. The company used computational photography and machine learning to overcome the hardware limitation.

Portrait mode on single-camera setups (like the front camera) starts with an HDR Plus image. The Camera app then uses machine learning to generate a segmentation mask that identifies common subjects like people and pets.

If depth information is available in some way (when you have multiple cameras available, such as on the back), it generates a depth map, which helps apply the blur accurately.
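A toy composite shows how a segmentation mask drives the effect: blur everything, then use the mask to keep the subject sharp. This is only a sketch; the real pipeline uses a learned mask and a depth-weighted, lens-shaped blur:

```python
import numpy as np

def portrait_blur(image, subject_mask, blur_radius=2):
    """Toy portrait-mode composite: blur the whole image, then paste the
    sharp subject back using the segmentation mask."""
    # A vertical box blur stands in for a real lens-blur kernel.
    kernel = 2 * blur_radius + 1
    padded = np.pad(image, ((blur_radius, blur_radius), (0, 0)), mode="edge")
    blurred = np.stack(
        [padded[i:i + image.shape[0]] for i in range(kernel)]
    ).mean(axis=0)
    # Keep original pixels where the mask says "subject", blur elsewhere.
    return np.where(subject_mask, image, blurred)
```

With a depth map available, the blur strength would vary per pixel with distance from the focal plane instead of being uniform.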

Real Tone

Real Tone photo on the Google Pixel 7 Pro 2

Real Tone is Google’s effort at making photography more inclusive for darker skin tones. It attempts to counter the skin tone bias that has long existed in photography, giving people of color a chance at being photographed more authentically.

As part of the Real Tone initiative efforts, Google has improved its face identification in low-light conditions. The AI and machine learning models that Google trains are now fed a wider, diversified data set. The company has also improved how white balance and automatic exposure work in photographs to accommodate a wider set of skin tones better.

Starting with the Pixel 7 series, the company also uses a new color scale (called the “Monk Skin Tone Scale”) that better reflects the full range of skin tones.

Dual Exposure controls and Color Temperature control

Google Camera Dual Exposure Controls and Color Temperature Control

Aamir Siddiqui / Android Authority

As mentioned earlier, the Google Camera app does not have a Pro mode. Very little thinking is involved in clicking a good photo on a Pixel smartphone.

If you want some control over your photos, Google gives you three settings that you can play with:

  • Dual Exposure Controls:
    • Brightness: Changes the overall exposure. It can be used to recover more detail in bright skies or intentionally blow out the background if needed.
    • Shadows: Changes only the dark areas. It manipulates the tone mapping instead of the exposure. It is helpful for high-contrast scenes, letting users boost or reduce shadows.
  • Color temperature control: Changes the scene’s color temperature to make it warmer or cooler.

You can access all three sliders by long pressing on the viewfinder in Photo mode. These settings co-exist alongside HDR Plus, so you can still use computational photography features while slightly modifying specific settings to suit your taste better.
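As a rough model (not Google's actual tone-mapping code, and with a hypothetical function name), Brightness behaves like an exposure gain in linear light, while Shadows behaves like a tone curve that mostly moves dark pixels:

```python
import numpy as np

def apply_sliders(image, brightness_ev=0.0, shadows=0.0):
    """Rough model of the two sliders on a [0, 1] linear image:
    Brightness acts as an exposure gain (in EV stops), while Shadows acts
    as a gamma-style tone curve that mostly moves dark pixels."""
    out = np.clip(image * (2.0 ** brightness_ev), 0.0, 1.0)
    # Positive `shadows` lifts dark tones: gamma < 1 brightens low values
    # far more than values already near 1.0.
    gamma = 2.0 ** (-shadows)
    return np.clip(out ** gamma, 0.0, 1.0)
```

This captures why the two sliders feel different: Brightness shifts everything (and can clip highlights), while Shadows leaves bright areas nearly untouched.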

Computational RAW

google pixel 7a camera from behind

Ryan Haines / Android Authority

Google adopted “Computational RAW” with the Pixel 3, though the term doesn’t feature frequently in their marketing.

With the Google Camera app, you can save RAW image files (.dng) alongside processed image files (.jpg). RAW files traditionally allow you a wider range of adjustments for settings like exposure, highlights, shadows, and more.

But the trick here on the Google Camera app is that these RAW image files aren’t entirely raw and untouched. Google processes the RAW file through its computational photography pipeline before saving it.

RAW on Google Camera is Computational RAW.

This approach may alarm purists and hobbyists who want an untouched, unprocessed RAW file. But try as you might, you will find it extremely difficult to process a RAW file yourself into a better result than Google's heavily processed JPGs.

Computational RAW is the middle ground, applying some of the Google software magic and letting you apply some of your own on top. The results of this approach take advantage of Google’s processing expertise and your vision.
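The editing-headroom argument can be illustrated with a toy example: deep shadows that span many distinct tones collapse into a handful of 8-bit JPEG codes, while a higher-bit-depth raw-style container keeps them separate. Real DNG and JPEG pipelines also involve tone curves, but the intuition holds:

```python
import numpy as np

# Fifty distinct deep-shadow tones, as a camera might record them linearly.
shadows = np.linspace(0.001, 0.01, 50)

as_jpeg = np.round(shadows * 255)     # 8-bit JPEG-style quantization
as_raw = np.round(shadows * 16383)    # 14-bit raw-style container

# The JPEG collapses 50 tones into a few codes, so boosting shadows in post
# posterizes; the raw file keeps them distinct and editable.
jpeg_levels = len(np.unique(as_jpeg))
raw_levels = len(np.unique(as_raw))
```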

Macro Focus

Macro Focus is one feature that relies heavily on hardware. It uses the ultrawide lens on the Pixel 7 Pro that is equipped with autofocus, letting you focus as close as 3cm away. When you come close to a subject, the Pixel 7 Pro will transition from the main camera to the ultrawide and let you take a macro photo.

You can also take a macro focus video on the Pixel 7 Pro.

Motion Mode

A Long Exposure shot taken with a Google Pixel 6 of a subway car. The subway car itself is blurry and conveys a sense of motion while the tunnel and station platform are clear.

Zak Khan / Android Authority

Motion Mode on Pixel smartphones adds a creative way to use long exposure shots. It essentially uses a longer exposure photo and adds a blur effect to the background and moving parts in the image.

Most newer Pixel phones have two blur effects: Action Pan and Long Exposure. Pixel 6a, Pixel 7a, and Pixel Fold users do not have the Action Pan effect.

Action Pan works best for moving subjects against a stationary background (where the background gets blurred), while Long Exposure is better for motion-based scenes (where the moving object is blurred).

Motion Photos

Google Camera has a Motion Photos feature, which records a short, silent video when capturing a photo. It adds life to a still image and captures candid moments before and after a shot. Apple offers a similar feature on iOS called Live Photos.

Motion Photos is different from Motion Mode. You can export Motion Photos as videos.

Top Shot

Google Camera Top Shot

Aamir Siddiqui / Android Authority

If one thing is clear so far, it's that the Google Camera app captures many frames even when you click a single photo. Top Shot takes advantage of this; to use it, set Top Shot to Auto or On.

Top Shot lets you save alternative shots from a Motion Photo or video. The camera app takes many photos before and after you tap the shutter button and then recommends a better-quality photo than the one you clicked, such as one where everyone in the frame is smiling and not blinking.
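A toy selector conveys the idea: score each buffered frame and recommend the best one. Here the score is a simple sharpness measure; the real feature also weighs smiles, open eyes, and more:

```python
import numpy as np

def pick_top_shot(frames):
    """Toy Top Shot selector: score each buffered frame by a simple
    sharpness measure (mean squared second difference across rows) and
    return the index of the sharpest frame."""
    def sharpness(frame):
        # Second difference is near zero for smooth (blurry) content and
        # large where the frame has fine detail.
        lap = frame[:-2] + frame[2:] - 2 * frame[1:-1]
        return float((lap ** 2).mean())
    scores = [sharpness(np.asarray(f, dtype=float)) for f in frames]
    return int(np.argmax(scores))
```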

Note that Top Shot is unavailable when you have enabled Flash, Night Sight, Selfie Illumination, or Social Media Depth features.

Frequent Faces

Google Camera Frequent Faces

The Google Camera app saves data about the faces you photograph or record frequently if you turn on the Frequent Faces feature. The face data is saved on your Pixel smartphone and not sent to Google. When you turn off the feature, the data is deleted.

With Frequent Faces, the Google Camera app on Pixel 4 and later devices identifies and recommends better shots of faces you capture often. So you will get fewer blinking eyes and more smiling faces when using the Top Shot feature.

The feature also taps into the Real Tone feature, offering better auto-white balance for these recognized subjects.

Long Shot

Like other camera apps, the Google Camera app also lets you take videos in photo mode. Long-press the shutter button in photo mode and begin video recording.

Palm Timer

Google Pixel Feature Drop Palm Detection in Google Camera

Google Camera includes a timer setting for three seconds and 10 seconds. When you activate these timer settings, you also activate the Palm Timer. Once you have framed yourself in the photo, raise your palm to face towards the camera, and the timer will begin counting down.

Guided Frame

Google Camera Guided Frame

Guided Frame is an accessibility feature on the Google Camera app designed for the visually impaired community. This feature uses Google’s TalkBack mode to audibly guide you through the framing and photo-clicking process for a selfie.

Panorama and Photo Sphere

The Google Camera app also includes Panorama and Photo Sphere modes. Panorama lets you stitch multiple images to create one long image. Photo Sphere enables you to stitch multiple images to create an “image sphere” that shows off all around you.

Google Camera: Video features

A substantial focus of the Google Camera app is photos, and the extensive feature list and years of improvements are a testament to that attention. Video is also crucial to the Google Camera experience, but it doesn't receive the same love. As a result, Pixel phones with the Google Camera app take excellent photos and good videos.

The Google Camera app can record video at up to 4K 60fps across all lenses, though the exact limits depend on your Pixel phone. You can record with the H.264 (default) or H.265 codec. You can also choose to record in 10-bit HDR.

There are a few stabilization options available within Google Camera:

  • Standard: Uses optical image stabilization (OIS, if present), electronic image stabilization (EIS), or both.
  • Locked: Uses EIS on the telephoto lens, or at 2x zoom if no telephoto lens is present.
  • Active: Uses EIS on the wide-angle lens.
  • Cinematic Pan: For dramatic and smooth panning shots.

There is also a dedicated Cinematic video mode, which is like Portrait mode but for videos. Further, you also get the usual slow-motion and time-lapse video capabilities.

Google Camera: Availability

google pixel fold cameras close up

Ryan Haines / Android Authority

The Google Camera app is available on all Google Pixel devices. However, depending on their camera and processing hardware, the exact feature list varies between devices. You can get the most features on the newest flagship Pixel.

For devices other than Google Pixels, there are unofficial GCam ports. Third-party enthusiasts modify the Google Camera app and make it run on unsupported phones. They also tweak some of the myriad processing values to get subjectively different results that suit the hardware output from a certain class of phones.

While you can use GCam ports to get the Google Camera experience on your non-Pixel device, installing unknown APKs always carries risk, and we recommend against it. Be careful with what you install on your phone: only install apps from official sources and developers that you trust.


FAQs

Yes, the Google Camera app is free as it comes pre-installed on Pixel smartphones.

The official Google Camera app comes pre-installed on Pixel smartphones. If your phone does not have the app pre-installed, you cannot officially install the Google Camera app. Instead, you can install unofficial Google Camera ports at your own risk.


TTArtisan unleashes monster 500mm f/6.3 lens for full-frame mirrorless cameras


There is yet another new product announcement from TTArtisan, as the company releases details of a new TTArtisan 500mm f/6.3 super-telephoto prime lens.

Set to become one of the best budget telephoto lenses of the year, the lens will be available for Sony E-mount, Nikon Z-mount, Canon RF-mount, and L-mount. This is the first telephoto lens of this reach that TTArtisan has released for full-frame mirrorless bodies, and it provides a budget-friendly alternative to the existing mirrorless telephoto market.

With a retail price of $369 / £409, this is a powerful lens that lowers the barrier to entry for super-telephoto photography. With a maximum aperture of f/6.3, it provides a relatively bright image compared with some budget rivals.

TTArtisan 500mm f/6.3

TTArtisan 500mm f/6.3

It is worth noting, however, that this is a manual-focus lens, and therefore not one designed for tracking fast-paced subjects. Where this lens may shine instead is landscape, astrophotography, and slower-paced wildlife photography. Manual focus may seem like the sacrifice made for the focal length and aperture, but it is not an issue for photographers pre-focusing on a subject such as a branch or the moon.

Stability is vital for a lens of this focal length, and to aid in this department, a tripod mount ring is included. TTArtisan seems to have missed an opportunity here, however, as the tripod mount ring is not Arca-Swiss compatible.

The 500mm f/6.3 telephoto lens is optically constructed of 8 elements in 5 groups – and uses a conventional, rather than mirror, design. Included in this configuration are two extra-low dispersion (ED) glass elements and two high-index glass elements, stated to reduce chromatic aberration and contribute to decent image quality.

Sony E-mount users may experience some slight vignetting when the lens is used wide open, the official press release warns – but this is said not to be an issue with the Z-, RF-, and L-mount versions of the lens.

TTArtisan 500mm f/6.3

TTArtisan 500mm f/6.3

There is a real focus on astrophotography with the promotion of this lens and it is easy to see why. The 500mm focal length pierces the night sky, enabling you to ‘capture distant worlds’ – and will be particularly useful for photographing the moon. The ED lenses will also control and diminish comatic aberration, a common concern in this field of photography. 
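A quick back-of-the-envelope calculation shows why 500mm suits the moon: the full moon subtends about 0.52 degrees, and the size of its image on the sensor is roughly the focal length times the tangent of that angle:

```python
import math

moon_angle = math.radians(0.52)              # the full moon subtends ~0.52 degrees
image_size_mm = 500 * math.tan(moon_angle)   # moon's image size on the sensor, ~4.5mm

# On a full-frame sensor (24mm tall), the moon fills nearly a fifth of the
# frame height at 500mm - enough to resolve real surface detail.
fraction_of_height = image_size_mm / 24
```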

Other key features include a 3.3m minimum focus distance (10.8 feet), an 82mm front filter ring, and a 12-blade aperture diaphragm. It weighs around 1600g (57oz).

The TTArtisan 500mm f/6.3 lens is available now at the price of $369 / £409 (approximately AU$775), and is supplied with a metal lens hood and the tripod foot as standard.


Pixel 8 camera features leak with manual controls and lots of AI


The focus of Google’s Pixel phones has always been on the camera, and it seems there’s a lot of new stuff coming with the Pixel 8 series, as a new leaked ad reveals manual camera controls, new features, and a whole bunch of AI in the pipeline.

A leaked video posted to 91Mobiles by the reliable leaker Kamila Wojciechowska offers a pretty detailed breakdown of all of the new camera features coming to Google Pixel 8 and Pixel 8 Pro, including some returning favorites like Night Sight, Astrophotography, and Super Res Zoom.

What’s new this year, though, is actually quite a bit.

That starts with “Video Boost,” which Google says will use AI to create a “smoother view” as well as bringing Night Sight’s effects to video, as we had previously reported was coming. The video also gives us a first look at “Audio Eraser” – another feature that popped up previously – with a quick example of removing city noise from the background of a video of a cello player.

Perhaps the wildest use of AI in the video is a Google Photos feature on the Pixel 8 where the app will literally be able to change people's faces in a shot to create the photo you want – the Michael Scott treatment, if you will. There's no explanation of how this works, but it seems to use multiple similar shots to create one final product.

Perhaps most interesting for photo enthusiasts is the arrival of full manual controls on Google Pixel 8. These are called “Pro Controls” and are “modeled after DSLR controls.” We can’t see all of the options, but shutter speed and ISO both appear, as well as manual focus.

Then, there’s Magic Editor. Google first teased this at I/O as a feature that can help you move a subject or change the sky, and Google puts that on full view in this video. It looks great.

The whole video (below) has a clear focus on AI, and it’s rather exciting to see so many new camera features in store.

But, beyond that, the leak also offers a full breakdown of the camera specs in place on the Pixel 8 series. Most notable here is that the Pixel 8 Pro will add autofocus to the selfie camera, which is a 10.5MP sensor. This is something Google offered in the past on the Pixel 3 but removed in subsequent releases. The Pixel 8, though, apparently still uses fixed focus despite having the same sensor.

The rest of the camera specs include 50MP primary sensors on both devices at f/1.68, with a 1/1.31-inch sensor size and 1.2μm pixel width. That seems to imply these will still use the Samsung ISOCELL GN1 sensor, as the specs line up. The Pixel 8 gets a 12MP ultrawide camera with autofocus (something the regular Pixel 7 lacked), while the Pixel 8 Pro still gets a 48MP ultrawide and another 48MP telephoto camera at 5x.

Google is set to unveil Pixel 8 and Pixel 8 Pro on October 4.
