Google Night Sight, the new enhanced low-light imaging system announced alongside the Pixel 3 and Pixel 3 XL, is now rolling out. The feature was first showcased at the Pixel 3 launch in New York last month, and when the phones went on sale, Google confirmed it would only become available in November. Starting today, Google is officially rolling out the feature to all Pixel smartphones, meaning the update will be available on first-, second- and third-generation Pixel devices.
The feature arrives via an update to the Google Camera app that powers the imaging experience on Pixel smartphones. Once updated, you can try it by heading to More and tapping the Night Sight option in the camera interface. Night Sight takes the place of Lens in Google Camera’s revamped user interface.
With the launch of the Pixel 2 and Pixel 2 XL last year, Google showed how it envisions improving photography using artificial intelligence and machine learning. To catch up with Google, smartphone makers are scrambling to add new experiences with dual- or triple-camera setups and improved AI features. However, smartphones have always struggled in scenes with little or no light, and the Night Sight mode aims to address that issue.
Ahead of today’s official release, the Night Sight mode leaked via a ported version of the Google Camera app. The feature works on both the main and the selfie camera, and pictures shot in this mode are marked with a crescent-moon symbol.
Night Sight is essentially Google’s HDR+ mode on steroids: a computational-photography technique that captures a burst of frames and merges them in software. With Night Sight, Google uses the same approach but tunes it to improve the signal-to-noise ratio in dim scenes. The main challenges in low-light photography are measuring the ambient light around the subject (in lux) and determining how many frames to capture to compensate for the light, or lack thereof.
Other issues include auto white balance, which fails in low light, and mapping the actual tones of the scene. Google addresses these by selectively exposing the image, described as “collecting light” on the Pixel’s display. On the Pixel 3, Night Sight can combine up to 15 frames to produce a well-lit image, which Google says is equivalent to a 5-second-long exposure. All of this happens natively on the device, using the Pixel Visual Core to capture and merge the frames in a split second.
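The intuition behind this kind of burst merge can be illustrated with a toy sketch. This is not Google’s actual HDR+ pipeline (which aligns image tiles before merging and runs on dedicated hardware); it is a minimal pure-Python simulation, with made-up numbers, showing why averaging 15 noisy short exposures cuts the noise by roughly the square root of the frame count.

```python
import random
import statistics

def simulate_frame(true_value, noise_sigma, n_pixels, rng):
    # One short exposure: the true scene brightness plus per-pixel sensor noise.
    return [true_value + rng.gauss(0.0, noise_sigma) for _ in range(n_pixels)]

def merge_frames(frames):
    # Stand-in for the merge step: average each pixel across the burst.
    # Averaging n independent noisy frames shrinks the noise by ~sqrt(n).
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

rng = random.Random(42)
true_value, sigma, n_pixels = 0.10, 0.05, 2000  # hypothetical dim-scene values

single = simulate_frame(true_value, sigma, n_pixels, rng)
burst = [simulate_frame(true_value, sigma, n_pixels, rng) for _ in range(15)]
merged = merge_frames(burst)

print(statistics.stdev(single))  # noise of one frame, ~sigma
print(statistics.stdev(merged))  # ~sigma / sqrt(15), noticeably smaller
```

The same arithmetic explains the “5 second long exposure” claim: 15 frames of roughly a third of a second each gather about as much light as one 5-second exposure, without the motion blur a real 5-second shot would suffer.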
When the scene is dark enough, Google pops up a chip on the screen suggesting that you can take a better picture using Night Sight. “Night Sight works best on Pixel 3. We’ve also brought it to Pixel 2 and the original Pixel, although on the latter we use shorter exposures because it has no optical image stabilization (OIS),” Marc Levoy, Distinguished Engineer, and Yael Pritch Knaan, Staff Software Engineer at Google, said in a blog post.
Smartphones have been extremely successful at replacing point-and-shoot cameras as people’s primary camera, but they have fallen short in scenarios like low light or zoomed shots. With the Pixel 3 and Pixel 3 XL, Google is targeting those very pain points with the Night Sight and Super Res Zoom options. The key thing to watch will be how these features improve going forward, as Google tends to use new data to refine its machine-learning algorithms.