Samsung is working on its own version of Apple's Deep Fusion feature

Ro, 11 September 2019

Apple announced the so-called "Deep Fusion" feature for its iPhone 11-series smartphones the other day, promising excellent photos thanks to an advanced machine-learning algorithm. Details were scarce, but it sounded a lot like the Google Pixel's HDR+. And according to the latest rumors, Samsung is working on a version of its own.

Ice Universe is a renowned Samsung leakster, and most of the predictions coming from this Twitter account are on point, so we have good reason to believe the report. Samsung's new feature is supposed to take advantage of the NPU inside the next-generation Exynos chipset, which will power the upcoming Galaxy S11 series.


Reader comments

A Nonny Moose, 17 Sep 2019

Yeah, but they just copied the 4 wheels from the horse and carriage guys... yawn, so unoriginal.

"machine learning and neural network algorithms" They are mostly for HDR+ processing, such as tile-based image stacking that prevents moving part of the image from being merged, AI white balance correction etc. It's good that PVC supports 3rd party...

Google is also using machine learning and neural network algorithms for camera software effects (via the Pixel Visual Core, or PVC for short). The PVC supports Halide for image processing and TensorFlow for machine learning.
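
For a rough idea of what "tile-based image stacking" means in practice, here is a minimal Python/NumPy sketch that merges a burst of already-aligned grayscale frames tile by tile, dropping any tile that differs too much from the reference frame (i.e. contains motion). This is only an illustration of the general idea, not Google's or Apple's actual pipeline; the function name, tile size, and rejection threshold are assumptions made up for the example.

```python
import numpy as np

def merge_burst(frames, tile=16, thresh=8.0):
    """Average a burst of aligned grayscale frames tile by tile.

    frames: list of 2D arrays with identical shape.
    tile:   tile size in pixels (assumed value for illustration).
    thresh: mean-absolute-difference threshold above which a tile is
            treated as containing motion and excluded from the merge.
    """
    ref = frames[0].astype(np.float64)
    out = np.zeros_like(ref)
    h, w = ref.shape

    for y in range(0, h, tile):
        for x in range(0, w, tile):
            ref_tile = ref[y:y + tile, x:x + tile]
            stack = [ref_tile]
            for f in frames[1:]:
                cand = f[y:y + tile, x:x + tile].astype(np.float64)
                # Reject tiles that moved too much relative to the reference;
                # this is what keeps moving parts of the scene from ghosting.
                if np.mean(np.abs(cand - ref_tile)) < thresh:
                    stack.append(cand)
            # Averaging the surviving tiles reduces noise in static regions.
            out[y:y + tile, x:x + tile] = np.mean(stack, axis=0)
    return out

# Example: merge five noisy 64x64 frames, the last of which has "motion".
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = rng.uniform(0, 255, (64, 64))
    burst = [base + rng.normal(0, 5, base.shape) for _ in range(4)]
    burst.append(np.roll(base, 20, axis=1))  # simulated moving content
    merged = merge_burst(burst)
    print(merged.shape)
```

In this sketch, static tiles get averaged across the burst (less noise), while the shifted tiles from the last frame are rejected, which is the basic trade-off the comment above describes.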
