Google explains how the Live HDR+ feature on the Pixel 4 and 4a works

Peter, 04 August 2020

The new Pixel 4a supports Live HDR+ and Dual Exposure, features first introduced with the Pixel 4 (note that these won’t be backported to older devices). The tech giant published a detailed blog post explaining how the two features work.


First, what is Live HDR+? It shows a real-time preview of what the final HDR+ photo will look like. Note that this is just a preview derived using a different algorithm. The real HDR+ takes 3-15 underexposed photos (to reduce noise), then aligns and merges them.
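
To make the align-and-merge idea concrete, here is a minimal numpy sketch. It assumes a simple global translation between frames, estimated with phase correlation; the real HDR+ pipeline aligns tiles and merges robustly, so treat the function names and data below as hypothetical stand-ins.

```python
# Minimal sketch of "align and merge": shift each frame onto the first one,
# then average the burst so random noise cancels out.
import numpy as np

def estimate_shift(ref, frame):
    """Estimate a global (dy, dx) translation via phase correlation."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    cross /= np.abs(cross) + 1e-9          # keep only phase information
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap large positive indices around to negative offsets
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

def align_and_merge(frames):
    """Align every frame to the first one and average to reduce noise."""
    ref = frames[0]
    aligned = [ref]
    for frame in frames[1:]:
        dy, dx = estimate_shift(ref, frame)
        aligned.append(np.roll(frame, shift=(dy, dx), axis=(0, 1)))
    return np.mean(aligned, axis=0)

# Usage: merge a burst of underexposed frames (random data stands in here).
burst = [np.random.rand(128, 128) for _ in range(5)]
merged = align_and_merge(burst)
```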

After the merge, tone mapping is applied to produce the final photo, with details visible in both highlights and shadows. To achieve that, the phone computes a 2D histogram, which the blog post visualizes.
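
As a rough illustration of what such a 2D histogram can show, here is a short sketch that relates input luminance to tone-mapped output luminance; the choice of axes is an assumption for illustration, not a statement of Google's exact plot. A single global tone curve would show up as a thin line, while local tone mapping spreads into a 2D cloud.

```python
# Hypothetical sketch: 2D histogram of (input luminance, output luminance)
# pairs for one image. With local tone mapping, one input brightness can
# map to many outputs, so the histogram forms a cloud rather than a curve.
import numpy as np

def tone_map_histogram(input_lum, output_lum, bins=64):
    """Both arrays hold per-pixel luminance in [0, 1] for the same image."""
    hist, _, _ = np.histogram2d(input_lum.ravel(), output_lum.ravel(),
                                bins=bins, range=[[0, 1], [0, 1]])
    return hist  # hist[i, j]: pixel count with input in bin i, output in bin j

# Dummy data: a locally varying gamma stands in for the tone mapping.
inp = np.random.rand(256, 256)
out = inp ** (0.5 + 0.5 * np.random.rand(256, 256))
hist = tone_map_histogram(inp, out)
```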


However, current mobile chipsets don’t have the computational power to do that 30 times per second. Instead, a dash of AI is used. The image is sliced into small tiles and the AI predicts the tone mapping for each of them. Then every pixel on the viewfinder is computed as a combination of the tone maps from the nearest tiles.
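
The per-tile blending described above can be sketched in a few lines of numpy. This is not Google's implementation: the per-tile gamma values below are a hypothetical stand-in for whatever curves the network predicts, and the blend uses plain bilinear weights between the four nearest tiles.

```python
# Sketch of tile-based tone mapping: apply each neighbouring tile's curve
# (here a per-tile gamma) to a pixel, then blend with bilinear weights.
import numpy as np

def tone_map_with_tiles(image, tile_curves):
    """image: HxW luminance in [0, 1]; tile_curves: TyxTx per-tile gammas."""
    h, w = image.shape
    ty, tx = tile_curves.shape
    # Continuous tile coordinates of every pixel centre
    ys = (np.arange(h) + 0.5) / h * ty - 0.5
    xs = (np.arange(w) + 0.5) / w * tx - 0.5
    yy, xx = np.meshgrid(ys, xs, indexing="ij")
    y0 = np.clip(np.floor(yy).astype(int), 0, ty - 1)
    x0 = np.clip(np.floor(xx).astype(int), 0, tx - 1)
    y1 = np.clip(y0 + 1, 0, ty - 1)
    x1 = np.clip(x0 + 1, 0, tx - 1)
    wy = np.clip(yy - y0, 0.0, 1.0)
    wx = np.clip(xx - x0, 0.0, 1.0)

    def apply(gamma):
        # Apply one tile's (hypothetical) tone curve to every pixel
        return image ** gamma

    return (apply(tile_curves[y0, x0]) * (1 - wy) * (1 - wx)
          + apply(tile_curves[y0, x1]) * (1 - wy) * wx
          + apply(tile_curves[y1, x0]) * wy * (1 - wx)
          + apply(tile_curves[y1, x1]) * wy * wx)

# Usage: a 4x4 grid of predicted gammas for a 256x256 viewfinder frame.
preview = np.random.rand(256, 256)
gammas = np.random.uniform(0.4, 1.0, size=(4, 4))
mapped = tone_map_with_tiles(preview, gammas)
```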

Here’s a comparison between the predicted image and the actual HDR+ result. It doesn't get it quite right, but it looks pretty close (especially since you'll be viewing this on the phone's screen).

Predicted HDR image (seen on the viewfinder) vs. actual HDR+ result

Balancing highlights and shadows is done automatically by HDR+. The Dual Exposure sliders give you manual control over the process, so you can get the desired look for your photo in camera. Traditionally, this is something you would do afterwards by processing the RAW file.
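
Conceptually, the two sliders act on different stages of the image. The sketch below is only an assumption about their semantics (one slider as overall exposure compensation, the other as a gamma-style shadow lift), not Google's actual implementation.

```python
# Conceptual sketch of two-slider control over brightness and shadows.
import numpy as np

def dual_exposure(lum, brightness=0.0, shadows=0.0):
    """lum: luminance in [0, 1]; brightness/shadows: slider values in [-1, 1]."""
    # Brightness slider: exposure compensation in stops (gain of 2**stops)
    exposed = np.clip(lum * (2.0 ** brightness), 0.0, 1.0)
    # Shadows slider: gamma lift (<1 brightens shadows, >1 deepens them)
    gamma = 2.0 ** (-shadows)
    return exposed ** gamma

# Same frame, two different looks.
frame = np.random.rand(64, 64)
lifted_shadows = dual_exposure(frame, brightness=-0.3, shadows=0.7)
deep_shadows = dual_exposure(frame, brightness=0.0, shadows=-0.5)
```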

Same scene, different adjustments to the Dual Exposure sliders

If you want a more detailed explanation of how all of this works, you can follow the Source link to Google's blog post for more.

Source



Reader comments

  • A patel
  • 05 Aug 2020
  • Dkb

Very nice phone

Exactly, I agree with you. I never let the software dictate the final outcome; it has always been more miss than hit. For casual users who only post photos on social media, that wouldn't be an issue. For someone like me who always sells photos a...

  • Anonymous
  • 05 Aug 2020
  • DkP

Pixels do have computational RAW, i.e. it does just the stacking and not the colour/tone mapping. In fact, many don't know that Pixels have the best mobile RAW, along with the Nokia 9.
