The Google Pixel 3 may have only one rear camera, but it relies heavily on Google’s promising AI to deliver high-quality images. The latest feature Google launched for all three generations of Pixel lets you shoot clean and bright images in near darkness – even when you can barely see anything with your own eyes. It works on both front and rear cameras, and you don’t even need a tripod or a flash.
Night Sight builds on Google’s HDR+ mode, which was first introduced in 2014 and can be found on all three generations of Pixel phones. It shoots a burst of images and merges them together, improving the dynamic range in situations with tricky lighting. “As it turns out, merging multiple pictures also reduces the impact of shot noise and read noise,” Marc Levoy of Google writes. So, “why not use HDR+ to merge dozens of frames so we can effectively see in the dark?”
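The noise-reduction effect of merging a burst can be illustrated with a toy simulation (this is not Google’s implementation; the scene, noise model, and frame count here are illustrative): averaging N noisy frames of the same scene cuts the noise standard deviation by roughly a factor of sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.2)          # a dim, constant "true" scene

def capture(noise_std=0.1):
    """One simulated frame: the scene plus Gaussian read/shot noise."""
    return scene + rng.normal(0.0, noise_std, scene.shape)

single = capture()
burst = np.mean([capture() for _ in range(15)], axis=0)  # merge 15 frames

print(f"noise of single frame: {np.std(single - scene):.3f}")
print(f"noise of merged burst: {np.std(burst - scene):.3f}")  # roughly 1/sqrt(15) of the above
```

With 15 merged frames, the residual noise drops to roughly a quarter of a single frame’s, which is why a burst can reveal detail that one exposure buries in noise.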
With the Night Sight feature, you can capture clean and sharp images in the regime between 3 lux and 0.3 lux. To give you an idea, 3 lux is a “sidewalk lit by street lamps” and 0.3 lux is “I can’t find my keys on the floor.” So, with a single shutter press, no LED flash, and no tripod, your Google Pixel phone can find those keys on the floor – and apparently take a sharp photo of them.
According to Levoy, there are two problems with the Night Sight feature. First, it uses positive shutter lag (PSL), unlike the default mode, which uses zero shutter lag (ZSL). With PSL, the phone waits until after you press the shutter button before it starts capturing images, so you need to hold still for a short time after pressing the shutter. On the other hand, this allows the use of longer exposures.
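The difference between the two capture protocols can be sketched schematically (this is not actual camera-stack code; the function names and frame counts are illustrative): ZSL keeps a ring buffer of recent viewfinder frames and uses frames from before the shutter press, while PSL starts capturing only after the press.

```python
from collections import deque

def zsl_shot(frame_stream, buffer_size=6):
    """Zero shutter lag: return the frames already buffered at press time."""
    ring = deque(maxlen=buffer_size)
    for frame in frame_stream:          # the viewfinder runs continuously
        ring.append(frame)
    return list(ring)                   # shutter press: grab the past frames

def psl_shot(capture_frame, n_frames=15):
    """Positive shutter lag: capture n_frames *after* the press; the user
    must hold still while this burst completes."""
    return [capture_frame(i) for i in range(n_frames)]

print(zsl_shot(range(100)))                 # the last 6 pre-press frames
print(psl_shot(lambda i: f"frame{i}", 3))   # frames captured post-press
```

The trade-off is visible in the sketch: ZSL feels instant because the frames already exist, but their exposure was fixed before the press; PSL can pick exposures freely, at the cost of making the user wait.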
The second problem, which grows with increasing exposure time, is motion blur caused by handshake or by moving objects in the scene. The optical image stabilization found on the Pixel 2 and 3 “reduces handshake for moderate exposure times (up to about 1/8 second), but doesn’t help with longer exposures or with moving objects,” Levoy explains. Google solves this by measuring motion in the scene and setting a per-frame exposure time that minimizes blur. If the camera is stabilized, the exposure of each frame is increased to as much as one second. In addition to varying the per-frame exposure, the number of frames captured varies as well: 6 if the phone is on a tripod, and up to 15 if it is handheld.
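The capture schedule described above can be sketched as a simple heuristic (the function name, blur tolerance, and motion units are hypothetical, not Google’s actual code): pick a per-frame exposure short enough to keep blur under control, and compensate for shorter exposures by capturing more frames.

```python
def plan_capture(motion_px_per_s: float, on_tripod: bool):
    """Return (per-frame exposure in seconds, number of frames to capture)."""
    max_blur_px = 1.0                       # tolerate roughly 1 pixel of motion blur
    if on_tripod:
        # No handshake: allow long exposures (up to 1 s) and fewer frames.
        if motion_px_per_s == 0:
            exposure = 1.0
        else:
            exposure = min(1.0, max_blur_px / motion_px_per_s)
        frames = 6
    else:
        # Handheld: cap exposure where OIS stops helping (~1/8 s)
        # and make up for it with more frames (up to 15).
        exposure = min(1.0 / 8.0, max_blur_px / max(motion_px_per_s, 1e-6))
        frames = 15
    return exposure, frames

print(plan_capture(motion_px_per_s=0.0, on_tripod=True))    # few long frames
print(plan_capture(motion_px_per_s=40.0, on_tripod=False))  # many short frames
```

The key idea the sketch captures is the trade: total light gathered is roughly exposure × frames, so a stabilized phone gets it from long frames while a handheld one gets it from many short ones.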
You can read more about the Night Sight feature, and see more sample images, on the Google blog. The results seem pretty impressive, considering that these are smartphone images taken with a single camera.