Apple’s Deep Fusion photography arrives with iOS 13.2 Beta

Ricky, 01 October 2019

During Apple’s keynote last month, where it announced the iPhone 11 lineup, the company also unveiled a new camera technology called “Deep Fusion,” which captures four frames before you hit the shutter, four more once you do, and one long-exposure shot. The 8-core Neural Engine then selects the best frames and combines them into a high-quality HDR photo.

The resulting images are more detailed, sharper, and more natural-looking. The machine learning side of the Neural Engine analyzes the scene and processes it differently depending on whether it sees sky, foliage, or skin tones. Meanwhile, structure and color tones are adjusted based on ratios computed by the Neural Engine on the A13 chip.
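To give a rough sense of what multi-frame fusion means in practice, here is a minimal, purely illustrative sketch in Python: it weights each pixel of a burst by a local sharpness measure, so detail from the sharpest frame dominates the result. This is a toy assumption for illustration only, not Apple's actual Deep Fusion pipeline, which runs proprietary machine-learning models on the Neural Engine.

```python
import numpy as np

def fuse_frames(frames):
    """Naively fuse a burst of same-size grayscale frames.

    Each frame is weighted per-pixel by a local sharpness measure
    (gradient magnitude), so the sharpest exposure contributes most.
    Illustrative only -- not Apple's Deep Fusion algorithm.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])  # (N, H, W)

    # Per-pixel sharpness proxy: sum of absolute image gradients.
    gy, gx = np.gradient(stack, axis=(1, 2))
    sharpness = np.abs(gx) + np.abs(gy) + 1e-6  # avoid divide-by-zero

    # Normalize weights across the burst, then blend.
    weights = sharpness / sharpness.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)

# Toy burst: a sharp frame plus a blurred copy of the same scene.
rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
blurry = (sharp + np.roll(sharp, 1, axis=0) + np.roll(sharp, 1, axis=1)) / 3
fused = fuse_frames([blurry, sharp])
print(fused.shape)  # prints (32, 32)
```

Because the weights are non-negative and sum to one at every pixel, the fused value always stays between the darkest and brightest frame at that pixel; real pipelines add alignment and noise modeling on top of a blending step like this.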

Deep Fusion is supported only on the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max.

Source: TechCrunch (click for full-res)

The feature will arrive with iOS 13.2, and those with developer access to the iOS beta can begin testing Deep Fusion now. Those on the public beta will be able to try it soon enough. We are intrigued to test the feature and see what kind of images the iPhone 11 trio can produce with Deep Fusion.



Reader comments

Now I understand what you mean. The name I have in mind is "still life".

Humans aren't still objects unless they're dead. Even then, the one taking the picture isn't a still object either, so there's that. Do you get my point?

You sure didn't understand... You said it would make a difference with still objects, and that "the artifacts" could be from the subject moving... but the subject in that photo wasn't moving, so...? About being stupid, I'll let you think for yourself...
