Google Explains Astrophotography Mode on the Pixel 4

Saturday, 30 November 2019

One of the unique selling points of the Google Pixel 4, apart from Motion Sense, has to be its cameras. With the Pixel 4 series, Google has stepped up its camera game with new features, one of them being the Astrophotography mode. Google's AI team has now explained how the feature works behind the scenes.

Soon after launching the Night Sight feature with the Pixel 3, the software giant learned that viewers will tolerate motion-blurred clouds and tree branches in an image that looks sharp otherwise, but they will not tolerate “motion-blurred stars that look like short line segments”.

As a workaround, the company “split the exposure into frames with exposure times short enough to make the stars look like points of light”. They noted that the per-frame exposure time should not exceed 16 seconds when shooting the night sky.

Google also noticed that most people are unwilling to wait more than four minutes for a photo to be captured, which led it to limit the number of frames to 15.
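
Taken together, those two constraints (a per-frame cap of 16 seconds and at most 15 frames) bound the total capture time at roughly four minutes. As a rough sketch of how such a capture plan could be worked out in code (a hypothetical illustration in Python, not Google's actual implementation; the function name and structure are assumptions), consider:

    import math

    # Constraints described in the article.
    MAX_FRAME_EXPOSURE_S = 16.0   # per-frame cap so stars stay point-like
    MAX_FRAMES = 15               # keeps total capture time to about four minutes
    MAX_TOTAL_CAPTURE_S = MAX_FRAME_EXPOSURE_S * MAX_FRAMES  # 240 s

    def plan_capture(desired_total_exposure_s: float) -> tuple[int, float]:
        """Split a desired total exposure into frames that respect both limits."""
        total = min(desired_total_exposure_s, MAX_TOTAL_CAPTURE_S)
        # Use just enough frames that no single frame exceeds the 16-second cap.
        frames = max(1, math.ceil(total / MAX_FRAME_EXPOSURE_S))
        return frames, total / frames

    # Asking for five minutes of exposure gets clamped to 15 frames of 16 s each.
    print(plan_capture(300))  # -> (15, 16.0)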

Google also accounted for other factors, including dark current and hot pixels, scene composition, autofocus, and sky processing, so that you can get the perfect low-light shots.

“Dark current causes CMOS image sensors to record a spurious signal, as if the pixels were exposed to a small amount of light, even when no actual light is present…Due to unavoidable imperfections in the sensor’s silicon substrate, some pixels exhibit higher dark current than their neighbors. In a recorded frame these “warm pixels,” as well as defective “hot pixels,” are visible as tiny bright dots,” states the Google AI team in a blog post.
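
A common way to conceal such warm and hot pixels is to compare each pixel with its immediate neighborhood and replace obvious outliers. The sketch below illustrates that general idea in Python (the 3x3 neighborhood, the 2x brightness threshold, and the function name are assumptions for illustration, not Google's published parameters):

    import numpy as np
    from scipy.ndimage import median_filter

    def conceal_hot_pixels(frame: np.ndarray, ratio: float = 2.0) -> np.ndarray:
        """Replace pixels that are much brighter than their local neighborhood."""
        # Median brightness of the 3x3 neighborhood around every pixel.
        neighborhood = median_filter(frame, size=3)
        # Flag pixels that exceed their neighbors by the (assumed) ratio.
        hot = frame > ratio * (neighborhood + 1e-6)
        corrected = frame.copy()
        corrected[hot] = neighborhood[hot]
        return corrected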

You may have noticed that some images shot with Night Sight look so bright that it is hard to tell what time of day they were taken. Google mitigates this issue by “selectively darkening the sky in photos of low-light scenes.”

To achieve this, an “on-device convolutional neural network, trained on over 100,000 images that were manually labeled by tracing the outlines of sky regions, identifies each pixel in a photograph as ‘sky’ or ‘not sky’.”
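
Once every pixel has been labeled this way, the resulting mask can be used to tone down only the sky. The snippet below is a loose, hypothetical illustration of that last step in Python (the blending formula and the 0.4 darkening strength are assumptions, not details from Google's pipeline):

    import numpy as np

    def darken_sky(image: np.ndarray, sky_mask: np.ndarray, strength: float = 0.4) -> np.ndarray:
        """Darken only the regions a segmentation model labeled as sky.

        image    : H x W x 3 float array with values in [0, 1]
        sky_mask : H x W per-pixel sky probability in [0, 1]
        strength : assumed darkening amount for fully-sky pixels
        """
        # Scale each pixel's brightness down in proportion to its sky probability.
        scale = 1.0 - strength * sky_mask[..., np.newaxis]
        return np.clip(image * scale, 0.0, 1.0)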

Google has also shared a few tips and tricks for taking better night shots, which you can check out here. And if you want to use these features but don't have a Pixel 4, don't worry; check out our article on how to install the GCam Mod on any Android smartphone and enjoy Google Camera's features on your device.



from Beebom https://beebom.com/google-explains-astrophotography-mode-on-the-pixel-4/
