Google explains how Astrophotography on the Pixel 4 works
What you need to know
- Astrophotography is an automatic mode within Night Sight on the Pixel 3 and Pixel 4.
- Google goes into detail about how its Night Sight and Astrophotography modes work.
- Night Sight debuted on the Pixel 3, with Astrophotography arriving alongside the Pixel 4 devices.
Pixel cameras are great for many reasons, and one of the big ones is their Night Sight feature. This year, when Google released the Pixel 4 and 4 XL, it made that feature even better with the inclusion of the Astrophotography mode. In a new blog post, Google goes deep into the details of how it managed to build a photo mode that was previously thought to be possible only with DSLRs.
If you're unfamiliar with Night Sight, in simple terms it makes photos taken in low light, or almost no light, look like they were shot with significantly more light. As Google's post explains, the camera deals with that lack of light by capturing a burst of shorter exposures and merging them, rather than relying on a single long exposure.
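To make the core idea concrete, here's a minimal sketch of frame averaging, assuming the frames are already aligned and available as NumPy arrays. Google's real pipeline does considerably more work, including alignment and motion handling, so treat this as an illustration of the principle rather than the actual implementation.

```python
import numpy as np

def merge_frames(frames):
    """Average a burst of aligned frames to reduce noise.

    Averaging N frames cuts random sensor noise by roughly sqrt(N),
    which is the basic reason merging a burst of short exposures can
    stand in for one long exposure.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0)

# Example: eight noisy captures of the same very dim scene.
rng = np.random.default_rng(0)
scene = np.full((480, 640), 10.0)                 # the "true" dark scene
frames = [scene + rng.normal(0, 5, scene.shape)   # add random sensor noise
          for _ in range(8)]
merged = merge_frames(frames)
print(np.std(frames[0] - scene), np.std(merged - scene))  # noise drops ~sqrt(8)
```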
As for the fantastic star shots that can be captured using a Pixel 3 or newer, those come from the new Astrophotography mode. While regular Night Sight photos can be taken with the phone handheld, your pics of the Milky Way will need to be taken with the phone on a tripod or propped against something steady.
When taking these great shots of stars, it can be a challenge to frame the shot properly on the screen. That's because when you open the camera and switch to Night Sight, what you see on your screen is, well, black. Once you set your phone up somewhere it will stay unmoved for the next four minutes and press the shutter button, the countdown starts and the view changes.
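For a sense of where that four-minute figure comes from: Google's blog post says the long capture is split into sub-exposures of at most about 16 seconds each, up to roughly 15 of them, so stars stay point-like instead of smearing into trails. The tiny snippet below just spells out that arithmetic; the constant names are mine, not Google's.

```python
# Illustrative arithmetic only: each sub-exposure is kept short so stars
# don't trail, and many of them are stacked to reach the total capture time.
MAX_FRAME_EXPOSURE_S = 16   # per-frame cap described in Google's post
MAX_FRAMES = 15             # maximum number of frames per capture

total_s = MAX_FRAME_EXPOSURE_S * MAX_FRAMES
print(f"{MAX_FRAMES} frames x {MAX_FRAME_EXPOSURE_S} s = {total_s} s "
      f"(about {total_s / 60:.0f} minutes)")   # -> 240 s, the ~4 minutes above
```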
Something else Google takes into consideration for low-light photos is "warm pixels" and "hot pixels." Something called dark current, which sounds like it's out of a Harry Potter book, causes CMOS image sensors to register small amounts of signal even when no light is hitting them, and the problem grows as the exposure time gets longer. When that happens, those "warm pixels" and "hot pixels" show up in the photo as bright specks of light. Google tackles the issue by comparing each pixel's value to its neighbors and treating the outliers as noise to be replaced.
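Here's a minimal sketch of that neighbor-comparison idea, assuming a single grayscale frame as a NumPy array. The function name, window size, and threshold are illustrative choices for the example, not Google's actual code.

```python
import numpy as np
from scipy.ndimage import median_filter

def suppress_hot_pixels(image, threshold=4.0):
    """Replace isolated bright outliers ("warm"/"hot" pixels).

    A pixel sitting far above the median of its 3x3 neighborhood is
    assumed to be dark-current noise rather than real light, and is
    replaced by that neighborhood median. The threshold is expressed
    in units of a robust local noise estimate.
    """
    img = image.astype(np.float32)
    local_median = median_filter(img, size=3)
    deviation = img - local_median
    noise = np.median(np.abs(deviation)) + 1e-6   # robust noise estimate
    hot = deviation > threshold * noise           # bright outliers only
    cleaned = img.copy()
    cleaned[hot] = local_median[hot]
    return cleaned
```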
Once the night photo is in its final processing, Google adds another trick to give you an amazing photo: sky processing. Photos of very dark scenes can appear brighter than the scene really was, which skews the viewer's sense of what time of day the picture was taken. Google's solution, sky processing, detects which parts of the image are sky and selectively darkens them, without changing the non-sky parts of the photo.
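As a rough illustration of that kind of selective adjustment, here's a sketch that assumes a sky mask (1.0 for sky, 0.0 for everything else) has already been produced by some segmentation step. The function name and the darkening factor are made up for the example.

```python
import numpy as np

def darken_sky(image, sky_mask, gain=0.6):
    """Darken only the sky region of a night photo.

    `image` is an HxWx3 float array in [0, 1]; `sky_mask` is an HxW float
    array in [0, 1] where 1.0 means "sky". Blending through the mask keeps
    the transition between sky and foreground smooth, and leaves the
    non-sky parts of the image untouched.
    """
    mask = sky_mask[..., np.newaxis]          # broadcast the mask over RGB
    darkened = image * gain                   # a simply darker rendering
    return image * (1.0 - mask) + darkened * mask
```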
Here are some photos that were taken by some of the writers here at Android Central:
To capture their beauty
The newest phone from Google brings face unlock, Motion Sense gestures, and of course, Google Assistant. These features, combined with an amazing camera, will help you share your pictures of galaxies far away with everyone.