Apple’s Deep Fusion photography arrives with iOS 13.2 Beta

01 October 2019
It will reach Public Beta users once iOS 13.2 is released.


  • ?
  • Anonymous
  • Scq
  • 02 Oct 2019

akm, 02 Oct 2019: "Apple just added a fancy name; it's the same as Google's HDR+."

Apple already has HDR+, and no, it's not the same.

    • ?
    • Anonymous
    • Scq
    • 02 Oct 2019

    The last Oracle, 02 Oct 2019: "Half of Apple's charm is giving it fancy names."

    Actually, all companies do that. Of course companies want to make catchy names that are easy to remember; just look at Sony TVs and all the fancy names they use for their features, or Samsung.

      • ?
      • Anonymous
      • Scq
      • 02 Oct 2019

      The last Oracle, 02 Oct 2019: "Deep fusion photography is just reworded HDR."

      Not really, Apple already has HDR+.

        Deep fusion photography is just reworded HDR. Everybody does it, but Apple just gives it a fancy name. Like the Retina Display with a 720p display, and low display brightness. Half of Apple's charm is giving it fancy names.

        Next will be Divine HDR, or AI-HDR. It's just marketing rubbish.

          • D
          • AnonD-804996
          • M}3
          • 02 Oct 2019

          It feels like the Neural Engine on the A12 in older devices is going totally unused, given that only the 11 series is getting all this stuff but not the X series.

            • D
            • AnonD-706668
            • KAe
            • 02 Oct 2019

            Huh! Deep Fusion is nothing but multi-frame image stacking to reduce image noise without sacrificing any detail, which the Google Pixel already used on its earlier phones.

              Deep Fusion has little in common with HDR+. It has more in common with pixel shift (only not using IBIS) and exposure stacking.
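The stacking idea mentioned above can be sketched in a few lines. This is a toy NumPy illustration of generic multi-frame averaging, not Apple's or Google's actual pipeline; the scene, frame count, and noise level are all made up for the demo:

```python
import numpy as np

# Toy sketch: averaging N aligned noisy frames of the same scene reduces
# random sensor noise by roughly a factor of sqrt(N), while the underlying
# detail (the "scene") is preserved. Real pipelines also align frames and
# weight pixels; this skips all of that.

rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, size=(64, 64))   # stand-in for the true scene

N = 9  # number of captured frames (illustrative)
frames = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(N)]

single_noise = np.std(frames[0] - scene)   # residual noise in one frame
stacked = np.mean(frames, axis=0)          # the simple exposure stack
stacked_noise = np.std(stacked - scene)    # roughly single_noise / sqrt(N)

print(single_noise, stacked_noise)
```

With N = 9 the stacked residual comes out around a third of the single-frame noise, which is the sqrt(N) improvement the comments above are describing.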

                • a
                • akm
                • HsL
                • 02 Oct 2019

                Apple just added a fancy name; it's the same as Google's HDR+.

                  • ?
                  • Anonymous
                  • Kxu
                  • 02 Oct 2019

                  Kristo Radion, 02 Oct 2019: "I think that's not what he meant. Machine learning requires a lot of images for training ..."

                  That may not be the only way to do the training.

                  According to Apple, whenever they talk about machine learning or AI, they always say it runs locally on the device rather than being transmitted to their end for processing, and is thus kept private. So as time goes on, the phone becomes more and more tailored to your needs, as the model is built from your experience rather than someone else's.

                  That, of course, means you have to take their word for it.

                  I'd be keen to see someone disprove this by showing proof that the AI/ML model is identical on everyone's iPhone. But so far, no one has tried.

                    DroidBoye, 02 Oct 2019: "My question is, where did you read the part by which Apple does not collect and process data? ..."

                    I think that's not what he meant. Machine learning requires a lot of images for training, so we can assume that Apple took tons of users' images from iCloud and used them for training.

                      Panino Manino, 02 Oct 2019: "This sample... I don't see much difference from a 'normal' photo. ..."

                      This will surely make a difference on still subjects. The artifacts could be due to a moving subject, to which multi-frame image stacking is not applicable.

                        • ?
                        • Anonymous
                        • Scq
                        • 02 Oct 2019

                        Natural Selection, 02 Oct 2019: "I remember such a thing in an iPhone X review, what's happening?"

                        You probably remember Smart HDR.

                          I remember such a thing in an iPhone X review, what's happening?

                            ItsMeMyself, 02 Oct 2019: "Someone please clear up my confusion. Apple tends to use a ..."

                            My question is: where did you read the part where Apple says it does not collect and process data? Whether it does or doesn't, it should be in the Terms and Conditions and the EULA. Did you read those before making your assumption?

                              Someone please clear up my confusion. Apple tends to use a lot of machine-learning terms in their marketing. Doesn't training a model require large amounts of data? So if they claim not to collect any data, how are they training their models?

                                Anonymous, 02 Oct 2019: "My Friends: wow bruh.. is that the picture that you took? i..."

                                You should tell your friend to have his eyes checked by an eye specialist instead of being happy. I assume he's colour blind or has a mild case of cataracts. That, or he has only ever used or seen cheap stuff (e.g. a dirt-cheap $250 "DSLR" kit, a $50 PC monitor, etc.).

                                  How long is long exposure?

                                    This sample... I don't see much difference from a "normal" photo.
                                    And can't this technique of combining multiple photos produce artifacts in the final photo?

                                      • ?
                                      • Anonymous
                                      • pQr
                                      • 02 Oct 2019

                                      The eyes look artificial at the pixel level. It reminds me of AI zoom.

                                        • ?
                                        • Anonymous
                                        • pQr
                                        • 02 Oct 2019

                                        I guess Apple totally exaggerated. Apparently Deep Fusion isn't used in perfect lighting conditions, but in worse ones. That means Deep Fusion image quality won't be better than what you get in perfect lighting, and even in perfect lighting the iPhone's image quality isn't perfect.