Samsung is working on its own version of Apple's Deep Fusion feature

11 September 2019
Computational photography is taking over.

  • A
  • A Nonny Moose
  • K6q
  • 17 Sep 2019

Anonymous, 12 Sep 2019: Ford isn't the first car to have 4 wheels and a steering wh... more
Yeah, but they just copied the four wheels from the horse-and-carriage guys... yawn, so unoriginal.

    Mr. Anonymous, 15 Sep 2019: Google is also using machine learning and neural network al... more
    "Machine learning and neural network algorithms"
    They are mostly for HDR+ processing, such as tile-based image stacking that prevents moving parts of the image from being merged, AI white-balance correction, etc. It's good that the PVC supports third-party software, though, unlike the NPUs integrated into many smartphone SoCs, which are often only available to the first-party camera/AR apps.
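The tile-based stacking mentioned above can be sketched like this (a hypothetical toy illustration, not Google's actual HDR+ code; the tile size and threshold are made-up numbers): a tile is averaged across frames only when it closely matches the reference frame, so tiles containing motion fall back to the reference pixels instead of ghosting.

```python
# Toy sketch of tile-based image stacking with motion rejection.
# Hypothetical illustration only; real HDR+ aligns and merges raw
# bursts, this just shows the "reject moving tiles" idea.

TILE = 2        # tile edge in pixels (real pipelines use e.g. 16)
THRESHOLD = 30  # mean absolute difference that counts as motion

def tile_diff(ref, alt, y, x):
    """Mean absolute difference over one TILE x TILE tile."""
    total = 0
    for dy in range(TILE):
        for dx in range(TILE):
            total += abs(ref[y + dy][x + dx] - alt[y + dy][x + dx])
    return total / (TILE * TILE)

def merge_burst(ref, alternates):
    """Average tiles across frames, skipping tiles that moved."""
    out = [row[:] for row in ref]
    for y in range(0, len(ref), TILE):
        for x in range(0, len(ref[0]), TILE):
            # Only alternates whose tile matches the reference merge in.
            usable = [a for a in alternates
                      if tile_diff(ref, a, y, x) < THRESHOLD]
            for dy in range(TILE):
                for dx in range(TILE):
                    vals = [ref[y + dy][x + dx]]
                    vals += [a[y + dy][x + dx] for a in usable]
                    out[y + dy][x + dx] = sum(vals) // len(vals)
    return out
```

Tiles where the subject moved between frames fail the threshold test and simply keep the reference pixels, which is how burst stacking can avoid ghosting without a full motion-estimation step.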

      Nick Tagataka, 14 Sep 2019: Hmm, that's interesting to hear. What are the other uses of... more
      Google is also using machine learning and neural network algorithms for camera software effects (using the Pixel Visual Core, or PVC for short).
      The PVC supports Halide for image processing and TensorFlow for machine learning.

        • D
        • AnonD-889996
        • Ki1
        • 15 Sep 2019

        Anonymous, 13 Sep 2019: It is correct. App drawer lag even on S10 Plus.
        You can't differentiate between lag and delay.

          Anonymous, 13 Sep 2019: It is correct. App drawer lag even on S10 Plus.
          Hah, I can understand the lag on something like the J3 Prime, but Samsung phones do not lag for me, even on an A10.

            • ?
            • Anonymous
            • 6wN
            • 14 Sep 2019

            Anonymous, 13 Sep 2019: Not on mine
            Open and close it as fast as you can. It will lag.

              Mr. Anonymous, 14 Sep 2019: It's actually very much alike. Pixel visual core is not onl... more
              Hmm, that's interesting to hear. What are the other uses of the Visual Core, apart from accelerating the data gathering and image processing needed for the HDR+ algorithm?
              Also, could you kindly tell me how Smart HDR and HDR+ are very much alike? To my understanding, the former is an advanced form of HDR exposure stacking, whereas the latter was derived from regular multi-frame noise reduction. Sure, they both use buffered frames to achieve zero shutter lag, but the fundamental idea behind each image-processing pipeline is quite different IMO.
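The zero-shutter-lag trick mentioned above can be sketched with a simple ring buffer (a hypothetical Python illustration; the buffer size of eight comes from the Deep Fusion description elsewhere in this thread): the camera keeps only the most recent preview frames, so pressing the shutter just snapshots frames that already exist.

```python
# Toy sketch of zero shutter lag via a frame ring buffer.
# Hypothetical illustration; a real camera pipeline does this with
# raw sensor frames in the ISP, not Python strings.
from collections import deque

RING_SIZE = 8  # Deep Fusion reportedly keeps 8 frames buffered

ring = deque(maxlen=RING_SIZE)  # old frames drop off automatically

def on_preview_frame(frame):
    """Runs for every frame the sensor produces, before any capture."""
    ring.append(frame)

def on_shutter_pressed():
    """No capture delay: the merge input already sits in the ring."""
    return list(ring)

# Simulate a dozen preview frames arriving, then a shutter press.
for i in range(12):
    on_preview_frame(f"frame-{i}")
burst = on_shutter_pressed()  # the 8 most recent frames
```

Because the merge starts from frames captured before the button press, the user-visible latency is zero regardless of how heavy the processing afterwards is.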

                Nick Tagataka, 13 Sep 2019: Yes, Google uses Pixel Visual Core to process HDR+ images q... more
                It's actually very much alike. The Pixel Visual Core is not only about processing HDR+ quickly.

                  • s
                  • s-pen pusher
                  • PMT
                  • 14 Sep 2019

                  YUKI93, 13 Sep 2019: It seems the days of hardware physics doing the job really ... more
                  I am afraid so. I am still holding on to a K Zoom as my point-and-shoot up to now. Although it has a smaller sensor than a regular point-and-shoot, it definitely takes better photos without the loads of post-processing on today's smartphones. There is just no other way but computational photography to improve photos shot with smartphone cameras. I mean, smartphones are only getting thinner and thinner, and that does not leave the physical room, namely focal length, that bigger and better sensors require.

                    • ?
                    • Anonymous
                    • U{U
                    • 13 Sep 2019

                    Anonymous, 13 Sep 2019: It is correct. App drawer lag even on S10 Plus.
                    Not on mine.

                      • ?
                      • Anonymous
                      • 6wN
                      • 13 Sep 2019

                      Cam Dexter, 13 Sep 2019: A sweeping statement that's also totally incorrect.
                      It is correct. App drawer lag even on the S10 Plus.

                        • D
                        • AnonD-731363
                        • n$p
                        • 13 Sep 2019

                        Instead of making their own, they just want to copy, paste, and rebrand.
                        Shame on Shamesung.

                          Mr. Anonymous, 12 Sep 2019: This was introduced first by Google in Pixel 2, they have a... more
                          Yes, Google has used the Pixel Visual Core to process HDR+ images quickly since the Pixel 2. The actual image processing itself, however, is completely different on the Pixel and the iPhone.

                            Anonymous, 12 Sep 2019: You want many features. Samsung has many of them. You want ... more
                            A sweeping statement that's also totally incorrect.

                              It seems the days of hardware physics doing the job really are coming to an end. I am definitely going to miss the days of the Nokia 808 PureView, the Nokia Lumia 1020, and the Panasonic Lumix CM1; there will be no direct replacement for that legendary trio.

                              An honorable mention goes to Samsung's Galaxy S4 Zoom and K Zoom for bringing a properly legit optical telephoto lens.

                                Mr. Anonymous, 12 Sep 2019: This was introduced first by Google in Pixel 2, they have a... more
                                I thought the same. Who wrote this article?
                                Google pioneered this feature, and suddenly all the credit goes to Apple?
                                Is this ever going to stop?

                                  [deleted post]
                                  Read more about the Pixel Visual Core to understand.

                                  If you disagree, explain why. Going down the road of personal attacks only proves your own ignorance.

                                    • ?
                                    • Anonymous
                                    • DWe
                                    • 13 Sep 2019

                                    Mr. Anonymous, 12 Sep 2019: This was introduced first by Google in Pixel 2, they have a... more
                                    Huawei did it first.

                                      Anonymous, 12 Sep 2019: Who said it uses monochrome sensor to produce high quality ... more
                                      I read DxOMark's review of the Meizu Pro 7 Plus, which says it combines data from the two sensors for improved depth effects and image quality. GSMArena might have failed to mention it in the article, but to be honest I can't think of any other reason to include a dedicated monochrome camera on a phone.
                                      https://www.dxomark.com/meizu-pro-7-plus-dual-cam-dual-screen-powerhouse/

                                      "it is similar to what pixel does. multiple stack of different exposure from 1 sensor either rgb only or monochrome only"
                                      Pixel's HDR+ does NOT stack different exposures, it merges underexposed images taken at exactly the same exposure level to preserve highlights then bring up the shadows afterwards.
                                      Also you need to understand that iPhone's Deep Fusion is much more than simply combining multiple exposures. It also uses pixel-shift-like algorithms to produce higher resolution image that does not need demosaicing, and constantly buffers 8 frames before shutter is being pressed to make sure that there won't be any shutter lag.
                                      So it's not just a "catchy term to fool the sheeps", here Apple clearly does what other phone manufacturers don't/can't do, therefore it deserves a separate name even if it sounds a bit too fancy and Apple-like.
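The distinction drawn above can be sketched numerically (a hypothetical Python toy, not the actual HDR+ pipeline): every frame is captured underexposed at identical settings, the frames are averaged to cut noise, and only then is digital gain applied to bring up the shadows while the highlights stay below clipping.

```python
# Toy sketch of the HDR+ idea: merge SAME-exposure underexposed
# frames, then lift shadows digitally. Hypothetical illustration;
# the real pipeline merges aligned raw tiles, not 1-D pixel lists.

GAIN = 4  # shadow boost applied only after the noise-reducing merge

def hdr_plus_merge(frames):
    """Average frames pixel-wise, then apply gain with clipping."""
    n = len(frames)
    merged = (sum(px) / n for px in zip(*frames))  # noise reduction
    return [min(255, round(v * GAIN)) for v in merged]

# Four noisy captures of the same scene at identical exposure.
# Underexposure keeps the bright pixel (around 60) far from clipping.
burst = [
    [10, 40, 62, 12],
    [12, 40, 58, 10],
    [ 8, 41, 60, 11],
    [10, 39, 60, 11],
]
photo = hdr_plus_merge(burst)  # -> [40, 160, 240, 44]
```

Exposure bracketing, by contrast, would vary the exposure between frames; keeping it constant is what lets this scheme preserve highlights and still denoise the shadows before boosting them.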

                                        • r
                                        • red boy
                                        • HsL
                                        • 13 Sep 2019

                                        Apple copied from Google?