Samsung Galaxy A9 (2016) review: A-lister

19 February, 2016
Prepare to meet the biggest premium phone Samsung has made. No, not the Note5; even that one is dwarfed by the Samsung Galaxy A9 and its 6" Super AMOLED screen.

Woodpops, 14 Aug 2017

It seems to use up its battery power very quickly, or am I doing something wrong?

hunaini, 04 Feb 2017

Why doesn't it have a notification light? 😭😭😭

Anonymous, 16 Jan 2017

Arsalam, 08 Nov 2016: "Hi I live in dubai, A9 is not available here, cal someone ..."
Please, does anyone know this phone's battery life?

Arsalam, 08 Nov 2016: "Hi I live in dubai, A9 is not available here, cal someone ..."
You can buy it from AliExpress.

Anonymous, 18 Dec 2016

My Samsung Galaxy A9 (2016)'s internet data does not start, even though the APN is set perfectly.

Adzangel, 04 Dec 2016

Please, I need help: why is the back camera on my Samsung A9 not clear? What can I do? Is it better to use the front camera than the back camera? Also, there are many floating apps that repeatedly install malware. Will it never be gone? What can I do?

Arsalam, 08 Nov 2016

Hi. I live in Dubai, and the A9 is not available here. Can someone please guide me on where I can buy this phone, and how?

Anas shah, 05 Nov 2016

The A9 is a nice phone.

monu, 11 Oct 2016

HennaM, 28 Jun 2016: "It is hard to connect to wifi. It is connected but 5 mn lat..."
Reset it.

AnonD-596378, 11 Oct 2016

Where can I get this?

Anonymous, 13 Sep 2016

mir, 22 Feb 2016: "You obviously don't know what you're talking about."
Yes bro, you are right. According to you, the Nokia 1200 is more power efficient because it has great battery life. Hahahaha.

Amu, 17 Aug 2016

The A9 is a very good phone; I have been using it since last February. I found the sound and loudness quite satisfying and up to the mark.

Thanks

HennaM, 28 Jun 2016

It is hard to connect to wifi. It connects, but five minutes later it disconnects. I don't know how to solve it. Moreover, when I try connecting to the office wifi, it is blocked.

Qureshi, 10 Jun 2016

I would like to buy this handset. It's awesome and has a very good look.

Finder, 21 Mar 2016

Is the A9 a CDMA phone too?

AnonD-373198, 15 Mar 2016

anon, 19 Feb 2016: "Sounds like a flagship killer. Honestly if you can live wit..."
I would take this any time over the S6. Perfect for power users!

LoveThatRed, 24 Feb 2016

AnonD-259899, 22 Feb 2016: ":D You must be not very bright if you could not understand ..."
Agreed. Thumbs up for the effort.

AnonD-259899, 23 Feb 2016

mir, 22 Feb 2016: "You obviously don't know what you're talking about."
Ah ha? Enlighten me :D

mir, 22 Feb 2016

AnonD-259899, 22 Feb 2016: ":D You must be not very bright if you could not understand ..."
You obviously don't know what you're talking about.

AnonD-259899, 22 Feb 2016

nganghanimtonla, 21 Feb 2016: "you said 'SD652 is far more efficient than the SD615 of the...'"
:D You must not be very bright if you could not understand a much simplified explanation of the architectural reasons why the A72 is far more efficient than an A53. I could easily get into the finer points of CPU architecture and give you even more detailed explanations, but given your level of arrogance in your own ignorance and your sheer inability to grasp the simplest of concepts, I can be excused for not bothering with such an effort and thereby saving my time.

1) Yes, the A72 is far more efficient than the A53 at any heavy-load task. The SD652 has A72 cores and the SD615 does not; therefore, logic dictates that under heavy loads the 615 consumes more power to do the same task. See? That didn't require you to be knowledgeable about CPUs. I already gave a fairly sufficient explanation in my previous reply to you, but it is clear it went over your head.

2) Your comment --> "base on what you said now if i watch 2hrs movie in s652 phone ill finish it faster than in s615 phone? lol"

Oh dear, where do I begin? Should I laugh out loud at the sheer daftness of that statement, or should I pity your ignorance? As an academic I am taught not to look down on people who aren't as well versed in a subject as I am, but your "lol" at the end shows contempt born of ignorance, so I shall choose the former. XD LMAO! Now, with that out of the way, here is an easy example showing how absurd that statement is.

AMD's FX 8-core Bulldozer has significantly lower per-core IPC than, say, a Haswell core, roughly 50-55% behind. So is the AMD system going to play the same video file 50-55% more slowly than the Haswell? Lol, oh my goodness, I can't laugh enough at that unintelligent assertion of yours.

I could go into a low-level explanation of why this is absurd, but it would go over your head, so I'll make it even simpler this time. Video playback is a real-time process, versus rendering, which is not. A real-time process has a set of conditions to meet, otherwise it isn't considered to be working. Video playback at a given fps is a fixed target, and the algorithm works in a way that guarantees that fps, so it doesn't matter whether it runs on a lower-end chip like the A53 or on the A72: both can do the task and guarantee that fixed fps. This is even easier to understand given that video playback involves no special-effects calculations at playback time (unless you apply post-processing techniques). But the A72's savings come from how fast it processes operands and instructions ahead of time and then races to sleep, thereby saving more power. The frame buffers (i.e., what you see on your screen) are saved to memory until they are ready to be shown.

What happens is that the A53, which is maintaining the required fps and showing the movie without stutters, is not able to go to sleep very often, and because of this it runs at its maximum frequency almost all the time, thereby using more power. The A72 is sprinting and going to sleep often in between, saving power, yet to the user none of this is apparent: all the user sees is a uniform 24 fps, or whatever the movie file uses.
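To make the race-to-sleep argument above concrete, here is a toy back-of-the-envelope model. All power and throughput figures are invented purely for illustration; real A53/A72 numbers differ:

```python
# Toy race-to-sleep model: finish a fixed amount of decode work
# every second, then sleep. All numbers are made up for illustration.

def energy_per_second(work_units, perf, p_active, p_idle):
    """Energy (joules) to finish `work_units` of work each second
    on a core that processes `perf` units/s, then sleeps."""
    busy_time = work_units / perf          # seconds spent decoding
    idle_time = max(0.0, 1.0 - busy_time)  # seconds spent asleep
    return busy_time * p_active + idle_time * p_idle

WORK = 24.0  # decode work per second of video (e.g., 24 frames)

# "Little" core: just fast enough for the deadline, so it barely sleeps.
e_little = energy_per_second(WORK, perf=25.0, p_active=0.4, p_idle=0.02)

# "Big" core: draws more power while active, but finishes quickly
# and spends most of each second asleep.
e_big = energy_per_second(WORK, perf=100.0, p_active=0.9, p_idle=0.02)

print(f"little core: {e_little:.4f} J/s, big core: {e_big:.4f} J/s")
```

With these (hypothetical) figures the faster core comes out ahead despite its higher active power, which is exactly the race-to-sleep effect described above; both cores still deliver the same 24 fps to the viewer.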

The video playback you cited is a woefully poor example to begin with; these workloads are best suited to specialized ASICs designed for this purpose alone, e.g., hardware decoders. While these decoders sit on the same die as the CPU cores, the cores are asleep while the decoders take over, and the decoders are far more power efficient because they don't need all the extra logic even an A53 carries for the task. Lower-end SoC design houses skip good decoders because they want to save on per-SoC cost; they just let the weaker CPU do everything. You get what you pay for ;)

If you were smart, the real task you would have compared is video transcoding. That is where you will see the huge advantage of the A72 over the A53: transcoding is all about how fast you can convert the media, since it is a rendering process, and that comes down to raw instructions in flight and how well and how soon they are executed. The A53 is just an in-order, partially dual-issue CPU; the A72 is an out-of-order, 3-wide-issue CPU. The difference is night and day.

3) Another hilarious example of ignorance. Your comment ---> "coc do my builder speed the building process? again no. lol"

So AnandTech, HardOCP, Guru3D, TechPowerUp, TechReport, PCPer, et al. are all doing worthless game comparisons of CPUs with the same graphics card, because there shouldn't be any difference in rendered frame rates, right? LOL, wrong, and embarrassingly so. Bro, do you even understand hardware reviews? Seriously? ROFLMAO!!

What you are talking about has no relevance to hardware at all; that is an in-game mechanic that builds your assets over a set amount of time. It is based on a clock routine in code. How can anyone be this dense? O.o This is as daft as saying the system clock will run faster on the Haswell than on the AMD chip, why? "Because the IPC is higher, duh!"

Well, there is a cute term for your kind of logic: bro-science. But even bro-science has some degree of coherence and misunderstood science behind it. Your comment has no logic at all. Epic fail.

I'll say it again: the CoC builder works on a clock, and clock code running on any chip has to remain a clock; it does not depend on how fast the chip can process it. To slow this code down you would have to be down in the single-digit hertz range before the clock even began to slow, yet in fact we can build a fully functional simple clock from just a 1 Hz clock pulse. Of course, I don't expect you to know how that can even be done.
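The clock argument can be sketched in a few lines. This uses Python's `time.monotonic` as a stand-in for whatever timing routine a game actually uses; the point is that the timer completes after the same wall-clock duration no matter how fast the CPU spins the loop:

```python
import time

def build_timer(duration_s):
    """Wait until `duration_s` of wall-clock time has elapsed.
    A faster CPU spins through this loop more times, but the
    build still completes after the same real-world duration."""
    deadline = time.monotonic() + duration_s
    iterations = 0
    while time.monotonic() < deadline:
        iterations += 1  # loop count depends on CPU speed...
    return iterations    # ...but the elapsed time does not

start = time.monotonic()
build_timer(0.1)
elapsed = time.monotonic() - start
print(f"finished after {elapsed:.2f}s of wall time")
```

A Haswell and a Bulldozer would return very different `iterations` counts here, yet both finish the "build" after the same wall-clock time, which is why in-game build timers say nothing about CPU efficiency.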

So, two utterly irrelevant examples which make no sense whatsoever and have no bearing on what CPU efficiency is about. If I were a betting person, a trillion says you are never going to be employed in the electronics engineering industry, like, ever. =))

4) Your comment: "why s652 is only quad core a72 if its far more power efficient than a53? why not octa instead? rofl"

Oh, finally, one worthy logical question! Maybe there is hope for you after all, but I am not holding my breath given the disastrous first two.

Answer: simple, die size. A wafer, from which these chips are extracted, is a fixed cost after all the fab etching procedures. The fab doesn't charge per chip or per yield; it charges by the wafer. It is up to the designer to make the chips smaller and thereby extract more fully functional chips from each wafer. More chips per wafer = higher margin per chip = higher profits and larger discounts for the special customer buying chips in high volumes.
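To put rough numbers on the per-wafer economics above (all figures are invented round numbers, not real foundry pricing or real SoC die sizes):

```python
import math

def chips_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Crude upper bound: usable wafer area divided by die area.
    Real layouts lose more to edge dies and scribe lines."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

WAFER_COST = 5000.0  # hypothetical fixed cost per 300 mm wafer, USD

for name, die_mm2 in [("octa-A72 (bigger die)", 120.0),
                      ("4xA72 + 4xA53 (smaller die)", 80.0)]:
    n = chips_per_wafer(300, die_mm2)
    print(f"{name}: ~{n} dies/wafer, ~${WAFER_COST / n:.2f} per die")
```

Since the wafer price is fixed, shaving die area directly raises the number of sellable dies and lowers the cost of each one, which is the whole incentive for padding out a design with tiny A53s instead of more A72s.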

A72s, with their out-of-order logic, take up more die area; the A72 also has larger caches and more cache levels, which take up space too. A53s are tiny in comparison, so since today's Android ecosystem is a mix of low and high loads, vendors can get away with a mix of weaker A53s for the low loads and A72s/A57s for the heavy loads. In the future, when process nodes get cheaper and smaller, octa-A72 designs will make business sense for mainstream markets. But at this time, even four A72s seem more than plenty for today's high loads.

Qualcomm has gone the smart way: they use one core design for all workloads. They have four cores with good IPC and put the majority of the transistor budget towards making a good, strong core. The space they would otherwise have spent on A53s for a big.LITTLE HMP setup can now go towards a beefier GPU, an improved modem, or any other logic.

A53s are meant for the smallest die size, at a big trade-off in efficiency, especially at high workloads. The only reason some flagship SoCs still have A53s on them is to boost multi-core benchmark scores (yup, that's the untold truth); I am looking at you, Exynos ;) If your custom core is good enough, you never need the A53s; the A72s alone are already good. Samsung's Mongoose cores as a quad-core are enough for all of today's tasks; they don't need the A53s on there. Cheeky Samsung is after that higher multi-core score in benchmarks against Qualcomm. MediaTek is no different: they are trying to get the gullible to buy phones based on their chips because of higher multi-core scores over single-core scores, especially on those SoCs that only have A53s.

It's big marketing to the gullible like you. 8 cores, 10 cores, wow, moar cores, yeehaw!! Yeah, but the technical reality is very, very different, as I have explained so far.

If you don't understand something, either read up or just ask; nothing wrong with that. Just saying ;)