The Asus ROG Phone 5 will have up to 18GB of RAM, Geekbench scorecard confirms

01 March 2021
Previously we've seen versions with 8GB and 16GB of RAM. Would-be buyers can now reserve a place in line to buy the phone over at JD.com.


Anonymous, 28 Mar 2021: "Theres full laptops running the 8cx," no theres ...
The MacBook 13 has the M1, same thing as the Galaxy Book S. And yes, it's Geekbench.


They both destroy them. What fuming?


It's not nitpicking; there's no source for their claims. WikiChip doesn't have it.

Adrenos did fare well, look it up.


Ah yes, sure, no improvement on anything else in those 40 years, got it. Totally similar.

Is that the Exynos S10? The SD one does just fine at 60.

That's something no one other than Apple will ever know.

I compared Aztec, and the X1 does beat the A10 in it.

What gatekeeping? It's more demanding than it looks. The 732G is really weak and it does better than the PC. The devs probably didn't care much about the PC version.


I have always been on the same context: which has the better GPU. Not in terms of flops, core count, clocks and so on, but how it runs things.

What choice? It's both at the same time. I was comparing Apple GPUs to Adrenos. Apple GPUs do both at the same time.


Ah, more "if they did that, they would go like this, and then do that, next this", etc. Great.


Well yeah, 1080p vs 2K onscreen, it's obvious which will win. The offscreen test makes both run at the same resolution. No Metal magic to be seen there; Qualcomm was comfortably ahead back then. Well, not anymore.

I don't see any advantage of all that RAM on mobiles though, not for gaming at least. You mention some things, but whatever they are, devs aren't making use of them.


I mean, yeah, it's getting tiring. I even stopped with the quoting.

    Anonymous (B{P), 28 Mar 2021

    Mediatek sux, 26 Mar 2021: "the only thing you were complaining about soo much wa..."
    "There's full laptops running the 8cx" - no, there's only notepads and some 2-in-1s, nothing really on the scale of a MacBook Pro. Anyway, using the 8cx for a laptop is the same thing as using the 800 series for tablets and notepads, and there is probably no native ARM benchmarking software besides Geekbench; let alone benchmarking, not even MS Office is fully ARM native.
    Add to that the fact that the Microsoft SQs support very few GPU benchmarks to compare them at all.

    "Ipads do, so do iphones. Yada yada your point yada."
    Then just compare them to iPhones instead of fuming about how the A12Z crushed Snapdragons.

    "No websites like tomshardware, anandtech, notebookcheck, or anything knows the gpu clocks of apple gpus. That site just assumed it"
    This is just nitpicking; AnandTech is cringe.
    No, really, there are plenty of sources like that one, and WikiChip; they can give more valid info than the three websites you mentioned. NotebookCheck is not accurate about CPU info, and AnandTech once gave the wrong FP32 performance for the iPhone 6 (A8).
    Regardless, each review comes with a certain assumption: that A14 graphics performance slows down like a carbon-ceramic brake. Whether the approach is higher frequency or a very big amount of shading units crammed into each of those 4 clusters, it's a hell of an approach.

    "Its outdated. Wildlife is the current 3dmark bench. Aztec is he current kishontis bench.
    "
    Sling Shot is the latest and last of the OpenGL ES benchmarks that ran on iOS. Keep in mind that all the nagging about how Adrenos used to fare well against Apple A-series GPUs on GFXBench (they didn't, because they never fared well in GFXBench) was in the OpenGL ES benchmarking era. It's kind of ironic seeing how you were heavily gatekeeping and disregarding all the factors I mentioned, one of them being the GL, which is what this point is about. By the way, Manhattan is still used by media outlets like AnandTech. Is that outdated? I don't know; maybe you should give AnandTech a callout too and tell them to stop using these benchmarks, since Sling Shot is newer than Manhattan and thus still valid in that sense. (Totally not my problem, since we're talking purely AnandTech here, as if their reviews were some sort of holy book.)

    "This isnt as 40 years old difference as those."
    Yeah, though the only thing common between those was the x86 architecture. This is literally the same thing you're saying: that both chips are the same because they're ARM64. They're not even the same version of the ARM architecture... my sides.

    "Look it up. Sd 865 s20 ultra does 120fps at max graphics pubgm. Until it throttles"
    Weren't you just saying that your OnePlus 7, a 2019 flagship, isn't outdated (which it isn't, but it is outdated in the sense of comparing it to the ROG 5 instead of using the ROG 3 for the comparison)? Bear with me: if a 2019 flagship isn't outdated, why the hell can't the S10 get past 20 fps in PUBG Mobile on Ultra? PUBG Mobile is so light, after all, and, you know, not only Adrenos should run it fine - unless Adrenos were already good GPUs in 2019 and they needed better bandwidth more than a better architecture? One can guess.



    "Who said it is? Just saying how apple just did a transition well"
    Who said transitions to a newer arch can only be bad? Is AMD Zen bad? No. I said it could be, which I backed with the fact that the improvement wasn't on the same scale as other manufacturers'. Apple made the M1 work so well because it had all the foundation laid out 3 years before. When they made their own GPUs, they had never designed anything like the four-cluster GPU before.

    "Its you who said a10 beat it, smarphone or not, while it really didnt. The one with the 'better' gpu is the one that scored higher."

    Thank god it's not the case, because if it was, this would be a forever stretch:

    https://gfxbench.com/compare.jsp?benchmark=gfx40&did1=52778367&os1=iOS&api1=gl&hwtype1=GPU&hwname1=Apple+A10+GPU&D2=Google+Pixel+C
    I did not even say the A10 didn't beat it, I just said you shouldn't compare a low-power chip with a higher-power one. If this statement did anything to you, it just made you look wrong at EVERY. SCALE. POSSIBLE.

    "Its still different in lighting, resolution, draw distance, texture sizes."
    Stop gatekeeping. I play them both; the PC version only has a tad more reflections and shadows. Not a big difference from mobile to PC, and there is no visible difference in texture quality whatsoever. Even the desktop version has less shading than Zelda: Breath of the Wild, which makes it not as texture-packed as you claimed compared to the mobile one. Yet the GTX barely gives 25 fps on High, not Ultra, which should be the equivalent of Ultra on mobile, where you currently get about 60 fps. Come oooon.

    "Ah the maturity measurement again. Did you even read what i first said? I suggest a reread. No one wants your point of 'probably better' gpus on apple, while the results give an obvious difference in lead."
    Look, tbh, I don't even know what you are complaining about atm. How my statements about Apple GPUs vs Adrenos have anything to do with the point I made about Apple's more efficient CPU architecture is yet another new mystery. I'm tipping you off, nothing personal, but that's the truth: "the results show better" is an immature way to answer, a point that's not related to what I tried to prove, with no further elaboration. Read, and instead of wasting your time and my time, try to stay within the context of what you quoted, or make your quotations more specific.

    "What did i contradict? Fps/watt is the most important on handheld. How do you determine its better other than it scoring higher and using lower power at the same time?"
    I'm just going to quote this one:
    "Who cares when it runs stuff better at lower power."
    The choice is yours; pick one.


    "What obstacle? You think apple just started work a day before release?"
    Omg, that's so irrelevant to what I stated. Look buddy, 101 here: planning a new microarchitecture rarely gives the desired performance upgrades on the first cycles, regardless of the planning phase. I'm not saying a company doesn't design its architecture before release; what the mainstream architecture has over the new one is ready OS support and an easier upgrade path, thus it was hard for Apple to deliver the same improvement as everyone else, not even close to their older supplier, Imagination Technologies, whose 8th gen gave 35-40% performance improvement over their previous-gen GPU, the one found in the A10. Had Apple used that GPU they would have been considerably closer.



    "adreno 530 beat the a9 gpu.
    Remember, your first point was "qualcomm was never ahead"

    So yes, yikes bro lmao"

    Really, yikes: https://www.anandtech.com/show/10120/the-samsung-galaxy-s7-review/4
    You're probably going to say something immature along the lines of "but that one offscreen test is the actual determiner because of the display resolution", because sadly that wasn't the majority of cases:
    https://www.anandtech.com/show/9837/snapdragon-820-preview#:~:text=820%20uses%20a%20new%20Qualcomm,power%20cores%20clocked%20at%201593MHz.
    https://gfxbench.com/device.jsp?benchmark=gfx40&D=Apple+iPhone+6S+Plus&testgroup=overall
    https://gfxbench.com/device.jsp?benchmark=gfx40&os=Android&api=gl&D=Xiaomi+MI+5+%281.8+GHz%29&testgroup=overall
    Look at the massive difference in onscreen T-Rex specifically.
    Like I said, I'm not trying to devalue GFXBench or discredit any benchmark that shows iPhones faring better, but Metal is overpowered as hell, and that is seen in Wild Life; thanks to Metal 2.3 it's so good that it buries Vulkan in the abyss.
    However, you're just cherry-picking what you want from any point I make, hence I think this is useless. If you question each sentence I make, then we go into it, then I have to make another point for the sake of convincing, then you question those statements again and bring the questioning to a whole different point. Honestly, don't you think this is lame? Look at the walls of text here. You went all the way from asking whether there is GDDR3 with more bandwidth than 40 GB/s to actually comparing latest-gen smartphone graphics to the fps of an outdated 100-dollar card. Why are we still here... Whether you're convinced or not, thinking more RAM is useless is up to you, because things that could boost gaming performance a lot, like RAM disks and graphics memory utilization, clearly don't even exist in your world. Sigh.

      Anonymous, 25 Mar 2021: "Did you not see the word 'phones'?" ...
      "the only thing you were complaining about soo much was literally how "adrenos suck, ipads gave better peformance and a12z gives better performance" yada yada yada. thats what my point was about."

      Ipads do, so do iphones. Yada yada your point yada.

      "8x powers a surface notepad, that snapdragon is running x86 while m1 runs on roserta 2 while 8x uses the much slower windows x86 emulation, thats so smart of you to compare the two"

      There's full laptops running the 8cx, and there's native ARM benchmark software. That's what I compared, not apps.

      "you mentioned how a14 beaten adrenos, sd865 was the same gen as a14 and its a 7nm chip."

      The A14 beats the 888 too. The 865 isn't the same gen; if anything, the release dates are closer between the 888 and the A14 than between the 865 and the A14.

      "https://www.cpu-monkey.com/en/cpu-apple_a14_bionic-1693"

      No websites like Tom's Hardware, AnandTech, NotebookCheck, or anything else know the GPU clocks of Apple GPUs. That site just assumed it.

      "you mean, 3dmark ss?
      ....
      3dmark slingshot 3.1 came after kentoshi's manhatten by approx 4 years."

      It's outdated. Wild Life is the current 3DMark bench. Aztec is the current Kishonti bench.

      "no, by the time integerated graphics will grow again it will be a decade behind, graphics are evolving pretty fast and games are becoming more and more demanding."

      Well yeah, it will never reach higher-power, higher-area chips anyway. But there's still room to improve.

      "you really LOVE contradicting yourself, first you mention anandtech's silly fps/watt charts based on gfx alone (which didnt even show that great difference) and claim apple nailed adrenos on a cross because of their efficiency and how that in specific is a useful determiner to which is better than which."

      What did I contradict? Fps/watt is the most important thing on a handheld. How else do you determine it's better other than it scoring higher and using lower power at the same time?

      "ok, intels 8086 is not entirely different to amd's zen 3 because both are x86 processors."

      This isn't a 40-year difference like those.

      "and in which way that makes them comparable to adrenos?"

      Who said it is? Just saying Apple did the transition well.

      "A10 is a smartphone chip. the tegra x1 sole competitor was the a9x, and despite tegra x1 having a better gpu, gfx favored a9x."

      It's you who said the A10 beat it, smartphone or not, while it really didn't. The one with the 'better' GPU is the one that scored higher.

      "repulsively immature.
      let us compare the a12z to the tegra x1, because the tegra x1 is nvidias only tablet chipset availabe at this time."

      You sure have some great ways to measure maturity. Uh, sure, compare them if it makes any sense to you. It doesn't to me. I sure hope I become as mature as you, comparing chipsets with a 5-year gap, or a 40-year gap like the Intel ones you mentioned.

      "did you read? I said "could be usefull"
      desktop gls like directx made use of that for decades"

      Did you read? Where is the 'could be useful'? Why do I not see it anywhere?

      "great banter."

      Look it up. The SD 865 S20 Ultra does 120 fps at max graphics in PUBG Mobile. Until it throttles.

      "oh my eyes are watering from the difference....get a grip, I play the game on both platforms. its nowhere the difference between pubgm and pubg. and stop nitpicking on it you wont get anywhere, this is literally an outdated 100 dollar card for 1080p 30fps. it barely handles medium-high on 25 fps, not even ultra."

      Never said it was. It's still different in lighting, resolution, draw distance, texture sizes. However it runs on PC, it doesn't require anywhere near those specs on Android. A Poco X3 with the 732G gets 27 fps at the highest settings.

      "lol wth is that kind of deflection? you literally came up with 1 single statement about results, when my point was about how apple chips consume less power (giving props to their cpu architecture design, and not critiquing them)....are you even reading at this point? I suggest investing more if youre really stubborm on proving ang point here, because no one thinks youre proving anything with that kind of immature, baseless deflection"

      Ah, the maturity measurement again. Did you even read what I first said? I suggest a reread. No one wants your point of 'probably better' GPUs on Apple, while the results give an obvious difference in lead.

      "lol what. they were making a whole new design, a newer architecture and design will always be an obstacle on the first cycle. hence qualcomm and others managed to get same performance improvements from lesser node transitions or more for same node transitions around that time compared to apple....so youre yiking me for what bro lmao"

      What obstacle? You think Apple just started work a day before release? Their ARM 64-bit transition and x86-to-ARM transition would have been even worse then. This isn't Qualcomm. Good job not even reading this line - also, the Adreno 530 beat the A9 GPU.

      Remember, your first point was "qualcomm was never ahead"

      So yes, yikes bro lmao.

        Anonymous (B{P), 25 Mar 2021

        Mediatek sux, 25 Mar 2021: "Since when qualcomm designed a notepad or a tablet sp..."
        "Did you not see the word 'phones'?"
        The only thing you were complaining about soo much was literally how "adrenos suck, ipads gave better performance and a12z gives better performance", yada yada yada. That's what my point was about.
        "They did release some laptop chips but that was too outdated and weak compared to apples m1"
        Holy ...., the M1 powers the MacBook Pro while Qualcomm's 8cx powers a Surface notepad. That Snapdragon is running x86 apps through the much slower Windows x86 emulation while the M1 runs them on Rosetta 2; that's so smart of you, comparing the two.
        "Who compared 5nm and 7nm? Thats because apple throttles too much. "
        You mentioned how the A14 beat Adrenos; the SD865 was the same gen as the A14 and it's a 7nm chip.
        "Whats your source? "
        https://www.cpu-monkey.com/en/cpu-apple_a14_bionic-1693
        "Gfx. Ss extreme is outdated. Let that sink in."
        You mean 3DMark SS?
        ....
        3DMark Sling Shot 3.1 came after Kishonti's Manhattan by approx 4 years.
        "but theres still room to grow as efficiency..."
        No; by the time integrated graphics grow again they will be a decade behind. Graphics are evolving pretty fast and games are becoming more and more demanding.

        "Who cares when it runs stuff better at lower power."
        You really LOVE contradicting yourself. First you mention AnandTech's silly fps/watt charts based on GFXBench alone (which didn't even show that great a difference) and claim Apple nailed Adrenos to a cross because of their efficiency, and that this specifically is a useful determiner of which is better than which.
        "Huh? Its still not 'entirely different"
        OK, so Intel's 8086 is not entirely different from AMD's Zen 3 because both are x86 processors.

        "No, i was talking about the x86 to arm transition"
        and in which way that makes them comparable to adrenos?

        "Look it up, 'bruh'."
        The A10 is a smartphone chip. The Tegra X1's sole competitor was the A9X, and despite the Tegra X1 having a better GPU, GFXBench favored the A9X.

        "That was the only thing available at the time. So what?"
        Repulsively immature.
        Let us compare the A12Z to the Tegra X1, because the Tegra X1 is Nvidia's only tablet chipset available at this time. (/s)

        "And? Where can i see the results for that? Which app makes use of that?"
        Did you read? I said "could be useful".
        Desktop GLs like DirectX made use of that for decades.

        "Pubgm is lightweight" great banter.

        "Genshin on pc and android are too different. Go watch a comparison." oh my eyes are watering from the difference....get a grip, I play the game on both platforms. its nowhere the difference between pubgm and pubg. and stop nitpicking on it you wont get anywhere, this is literally an outdated 100 dollar card for 1080p 30fps. it barely handles medium-high on 25 fps, not even ultra.

        "Did you?" lol wth is that kind of deflection? you literally came up with 1 single statement about results, when my point was about how apple chips consume less power (giving props to their cpu architecture design, and not critiquing them)....are you even reading at this point? I suggest investing more if youre really stubborm on proving ang point here, because no one thinks youre proving anything with that kind of immature, baseless deflection.

        "Shouldnt it be easier since they have even more access? Yikes."
        Lol what. They were making a whole new design; a newer architecture and design will always be an obstacle on the first cycle. Hence Qualcomm and others managed to get the same performance improvements from smaller node transitions, or more from same-node transitions, around that time compared to Apple.... So you're yiking me for what, bro, lmao.

          Anonymous, 22 Mar 2021: "The results speak for themselves, adreno has nothing ..."
          "Since when qualcomm designed a notepad or a tablet specific chip? they never did, stay there with your self conviction."

          Did you not see the word 'phones'? They did release some laptop chips but that was too outdated and weak compared to apples m1.

          "good, we're now comparing a 5nm chip vs 7nm chip...very thoughtful. yet only 5% or less margin of difference in sustained performance which shouldve been 25% atleast thanks to the improved node.wow the a14 literally showed the adrenos! barely having 5% difference at the same clock speeds of the 650. incase you didnt know, their gpus clock at 3 ghz, which is the dumbest move ever in history of gpu designs.
          anyways, all that based on what? so far the only bench youve been talking about is gfx. and im sticking with that. but besides that its worth to mention that both a14 and 650 got the exact same result on 3dmarks ss extreme graphics score. let that sink in."

          Who compared 5nm and 7nm? That's because Apple throttles too much. 3GHz where? What's your source? GFXBench. SS Extreme is outdated. Let that sink in.

          "integerated graphics has no room to grow. I wont debunk the integerated graphics myth again, they will always stay 6 gens or so behind dedicated graphics.
          and read the statement you mentioned, the things I listed were actually disadvantageous to mobile architecture. they were't ways to actually improve mgpu efficiency correctly."

          Uh sure, it won't match dedicated, but there's still room to grow as efficiency improves. What 'correctly'?

          "yeah bandwith is one, powerdraw, tdp, thermals, gl and os as well. gpu architecture equates to a resulting gpu processing power at a reasonable scale pf efficiency. apple is desperate with their insane 3ghz for mgpu to catch up with adreno's fp performance game. Huawei did similar approach with their insane mp24 integeration."

          There you go with fp perf gains again. Who cares when it runs stuff better at lower power.

          "calling 2 different microarchitectures the same, holy mac n cheese balls.... "the same arm" theyre not even the same gen of ARM! LOL! Apple is 2 gens ahead, and calling 2 different ARM designs the same because theyre ARM is like calling all x86 chips the same because theyre x86. jeez."

          Huh? Its still not 'entirely different'.

          "more shaders equals a larger transition, said who? putting more of the same thing is not a larger transition. a13 beaten 12 gpu by 20%, a14 managed to beat 13 by 10% only. despite tsmc 5nm's 70% increased density."

          No, i was talking about the x86 to arm transition.

          "bruh"

          Look it up, 'bruh'.

          "wth, as if the a8x came along with the x1? you want to compare the x1 with the a8x? the a8x powervr wasnt even half the compute power of the x1, and wasnt even the same gen."

          That was the only thing available at the time. So what?

          "all im literally saying is that more integerated memory could be usefull for reducing overhead and bandwith limitations. When did I say phones cant be outdated"

          And? Where can i see the results for that? Which app makes use of that?

          "youre so good contradicting yourself, werent you complaining about vietnamess pubgm +90fps performance a couple of comments earlier? ."

          PUBG Mobile is lightweight; Balanced graphics at 120 fps runs fine on the SD 865.

          "a 1050 with 2gb dedicated memory cant even handle playable genshin atm. yet the gpu itself packs more power than every gpu mentioned above. gtx1050 was for 1080p + 30-40fps on low gaming from the day it was released. its literally what you get for a tiny 100 dollar card. (but its worth it tho). youre comparing the lowest and the cheapest entry level graphic card from 2016 to what like....2020 ipad pro, that showed me.... my statements about integerated vs dedicated memory just got destroyed! (/s)"

          Genshin on pc and android are too different. Go watch a comparison.

          "did you even read what you quoted?"

          Did you?

          "except that sd835 was supposdly a gen or atleast half a gen ahead, its a 10nm chip while the a10 fusion a 16nm chip, it had a much better gpu processing power and similar bandwith, it shouldve atleast been 30% faster than the iphone 7 on gfx. atleast. yet it had the same results within 5% of margin ( https://www.anandtech.com/show/11540/samsung-galaxy-s8-exynos-versus-snapdragon/4 ), cool story. (a true one as well). I could honestly care less about anandtech doing this fps/watt based on gfx or whatever theyre think is good, I aslo still remember this:
          https://m.imgur.com/gallery/Zbx92
          anandtech smartphone reviews were meme material (the comment section too) specially when you consider the fact that their now ex, topguy is an apple employee. sadly their reviews happen to be the most detailed, however Id rather take a read on notebookcheck or something anyday of the week."

          Whatever it was, it was the closest thing to compare at the time. What even is that camera comparison? I only go for the soc section. Also the adreno 530 beat the a9 gpu.

          "its not about cores, its about the architecture, apple in house gpu is a complete different, new design which ofcourse would be harder to make significant performance gain vs when you extend on your previous architecture. while a11 was 30% faster than a10 with a newer node (16-10) adreno 630 did the same 30% jump with lpe-lpp transition. yikes."

          Shouldn't it be easier, since they have even more access? Yikes.

            Anonymous (B{P), 22 Mar 2021

            Mediatek sux, 20 Mar 2021: "if you think adrenos are bad compared to ipad pro, th..."
            "The results speak for themselves, adreno has nothing competitive in tablets or phones compared to a12x"
            Since when did Qualcomm design a notepad or tablet-specific chip? They never did; stay there with your self-conviction.

            "or a14" good, we're now comparing a 5nm chip vs 7nm chip...very thoughtful. yet only 5% or less margin of difference in sustained performance which shouldve been 25% atleast thanks to the improved node.wow the a14 literally showed the adrenos! barely having 5% difference at the same clock speeds of the 650. incase you didnt know, their gpus clock at 3 ghz, which is the dumbest move ever in history of gpu designs.
            anyways, all that based on what? so far the only bench youve been talking about is gfx. and im sticking with that. but besides that its worth to mention that both a14 and 650 got the exact same result on 3dmarks ss extreme graphics score. let that sink in.

            "I did say integrated still has room to grow, and gave reasons, unlike what you claimed."
            Integrated graphics has no room to grow. I won't debunk the integrated graphics myth again; they will always stay 6 gens or so behind dedicated graphics.
            And read the statement you mentioned: the things I listed were actually disadvantages of mobile architecture. They weren't ways to actually improve mobile GPU efficiency correctly.

            "The bottleneck isnt the memory amount. What else other than the architecture? Never said not enough bandwidth wasnt a problem" yeah bandwith is one, powerdraw, tdp, thermals, gl and os as well. gpu architecture equates to a resulting gpu processing power at a reasonable scale pf efficiency. apple is desperate with their insane 3ghz for mgpu to catch up with adreno's fp performance game. Huawei did similar approach with their insane mp24 integeration.

            "Huh? Werent you talking about a lot of theories? And? It still runs the same app. Same game engine. Same arm."
            Are you out of your mind? Calling 2 different microarchitectures the same, holy mac n cheese balls.... "The same ARM"? They're not even the same gen of ARM! LOL! Apple is 2 gens ahead, and calling 2 different ARM designs the same because they're ARM is like calling all x86 chips the same because they're x86. Jeez.

            "They just did an even larger transition, and still manages to be more efficient than anything."
            More shaders equals a larger transition, said who? Putting in more of the same thing is not a larger transition. The A13 beat the A12's GPU by 20%; the A14 managed to beat the A13 by only 10%, despite TSMC 5nm's 70% increased density.

            "A10 didnt beat it"
            bruh
            "A10 didnt beat it, a9x came 6 months later" wth, as if the a8x came along with the x1? you want to compare the x1 with the a8x? the a8x powervr wasnt even half the compute power of the x1, and wasnt even the same gen.

            "Too bad your device will get outdated before eggns runs stuff fullspeed. "
            ...
            All I'm literally saying is that more integrated memory could be useful for reducing overhead and bandwidth limitations. When did I say phones can't be outdated?


            "Oh, i sure do wonder what mobile games are targeted at"

            You're so good at contradicting yourself; weren't you complaining about the Vietnamese PUBG Mobile's 90+ fps performance a couple of comments earlier?
            A 1050 with 2GB of dedicated memory can't even handle playable Genshin atm, yet the GPU itself packs more power than every GPU mentioned above. The GTX 1050 was for 1080p 30-40 fps low-settings gaming from the day it was released; it's literally what you get for a tiny 100-dollar card (but it's worth it, though). You're comparing the lowest and cheapest entry-level graphics card from 2016 to, what, like.... a 2020 iPad Pro. That showed me.... my statements about integrated vs dedicated memory just got destroyed! (/s)

            "The results say otherwise, but sure whatever you say!" what results.... ? did you even read what you quoted?

            "Even the 835 was ahead of the a10."

            Except that the SD835 was supposedly a gen, or at least half a gen, ahead: it's a 10nm chip while the A10 Fusion is a 16nm chip, it had much better GPU processing power and similar bandwidth, so it should have been at least 30% faster than the iPhone 7 in GFXBench. At least. Yet it had the same results within a 5% margin ( https://www.anandtech.com/show/11540/samsung-galaxy-s8-exynos-versus-snapdragon/4 ), cool story (a true one as well). I could honestly care less about AnandTech doing this fps/watt based on GFXBench or whatever they think is good; I also still remember this:
            https://m.imgur.com/gallery/Zbx92
            AnandTech smartphone reviews were meme material (the comment section too), especially when you consider the fact that their now-ex top guy is an Apple employee. Sadly their reviews happen to be the most detailed; however, I'd rather read NotebookCheck or something any day of the week.

            "Why would the in house or core count even matter? Adrenos have 2 or 3 cores and look how much better they are than malis with 12,18,24 cores. Its the result that matters"

            It's not about cores, it's about the architecture. Apple's in-house GPU is a completely different, new design, for which of course it would be harder to make significant performance gains versus extending your previous architecture. While the A11 was 30% faster than the A10 with a newer node (16nm to 10nm), the Adreno 630 did the same 30% jump with just an LPE-to-LPP transition. Yikes.

              Anonymous, 19 Mar 2021: "No, if mine is outdated compared to yours, yours is o..."
              "if you think adrenos are bad compared to ipad pro, then the ipad pro graphic levels are gutter compared to any proper notebook mgpu with dedicated memory. no one cares bout your apple larping, more importantly what I meant is double bandwith is still nothing, you need atleast 5 times this bandwith to make a leap and achieve proper last gen console graphics on mobile or even a powerful notepad like the ipad pro, which is late, very very late btw"

              The results speak for themselves, adreno has nothing competitive in tablets or phones compared to a12x or a14.

              "more power allows bigger design with wider memory bus, higher memory clock and more powerdraw"

              I did say integrated still has room to grow, and gave reasons, unlike what you claimed.

              "I brought it only to explain why gpu architecture is not the bottleneck and its other factors instead, in the first segment you clearly deflected what you insisted on, that the memory bandwith problem wasn't the gpu problem."

              The bottleneck isn't the memory amount. What else would it be other than the architecture? I never said insufficient bandwidth wasn't a problem.

              "gfxbench is lightly crossplatform and a valid bench to mention, also most of its tests on ios only run on metal, however excluding these factors and saying clearly ios has upper hands in gfxbench, first they are not anyqhere light on the cpu, second thing and most importantly tdp and powerdraw are completely independent factors, whether you run the test for 10 seconds or ten minutes."

              It is actually light on the cpu, check it yourself. They arent independent, when you take throttling into account.

              "what theories? those devices are completely different, architecture wise and os wise."

              Huh? Werent you talking about a lot of theories? And? It still runs the same app. Same game engine. Same arm. Sure, api different, os different. Not 'completely different'.

              "it even allows much better multicore utilization, the only thing that means is android's implementation of vulkan was very poor that even their biggest chip producers fail miserably to demonstrate its effects in work. ever since nougat and we have yet to witness vulkan's real improvements for android games and we have not, only certain controlled demos from the companies themselves show the actual great leap that is expected from a transition like this."

              Oh great, so nothing you can actually use.

              "somethibg like a trasition from imaginationtech's powervr to their own design, yeah."

              They just did an even larger transition, and still manages to be more efficient than anything.

              "tegra x1 was beaten by a9x and a10 on gfx too. idk where youre getting with this."

              The A10 didn't beat it, the A9X came 6 months later, and the Tegra X1 never had a successor other than efficiency improvements. The Tegra X2 and Xavier came, but as development boards or something.

              "A lot to name, one is the dinosaur GTX 285, which had double that, I hope that doesnt hurt you."

              Oh, good to know. Why would it hurt me? Did I say something to hurt you? You aren't hurt over me saying your 16GB RAM phone is wasted on gaming, right? I apologize if so. You can still use it for Egg NS, I guess. Too bad your device will get outdated before Egg NS runs stuff at full speed. 2D sidescrollers are full speed now, I guess.

              "the 1050 didnt even need bandwith, it was targeted for 1080p sub 60fps
              ultra affordable category and its a very small card."

              Oh, i sure do wonder what mobile games are targeted at.

              "wow thats surprising, its almost as if apple didnt invest in arm architecture 2 gens before qualcomm or any major chip provider for android devices to have extremely efficient cpus that draw way less power. I wont say apple cpus arent better, but their gpus are probably not."

              The results say otherwise, but sure whatever you say!

              "qualcomm was never ahead, the 845 was only on par with the a11 because the latter had its first cycle of apples in house gpu which was only 3 cores, had they used a newer powervr things wouldve been the same."

              Even the 835 was ahead of the A10. Why would in-house design or core count even matter? Adrenos have 2 or 3 cores, and look how much better they are than Malis with 12, 18, 24 cores. It's the result that matters.

              "idc about that anymore, I told you what genshin takes on average on my device and thats all, youre wasting yourself trying to preach me how to check app memory usage as if someone doesn't know it."

              I mean, you sure talked about it a lot. There's no way it's taking an average, or even a max, of 4GB on your device. Here's gameplay of the ROG 3 at max Genshin settings: https://youtu.be/mIJxTzFjBSc. RAM usage is in the top left corner. In his other video he tests the Mi 11, and even that has the same RAM usage.

                Anonymous (B{P), 19 Mar 2021

                Mediatek sux, 18 Mar 2021: "That wasn't your point , you were blaming gpu de..."
                "No, if mine is outdated compared to yours, yours is outdated compared to sd888 devices. So yours is also outdated. The ram is irrelevant to any android game. It runs on a slightly higher resolution on the 865 than the 855, so it will take a bit more ram, definitely not an average 4gb like you claimed. It would still be around 3gb at max. Learn how to check ram usage."
                That's true, my device will be outdated next year compared to the ROG Phone 6, but as of now it's not; it's not 2 gens behind.

                "The ipad pro gives the results"
                If you think Adrenos are bad compared to the iPad Pro, then the iPad Pro's graphics levels are gutter compared to any proper notebook mobile GPU with dedicated memory. No one cares about your Apple larping; more importantly, what I meant is that double the bandwidth is still nothing. You need at least 5 times that bandwidth to make a leap and achieve proper last-gen console graphics on mobile, or even on a powerful notepad like the iPad Pro, which is late, very very late, btw.

                "Something to do with bits and channels i think?"
                more power allows bigger design with wider memory bus, higher memory clock and more powerdraw.


                "You were the one who bought the fp numbers and tried to sound smart"
                I brought it up only to explain why GPU architecture is not the bottleneck and other factors are instead. In the first segment you clearly walked back what you had insisted on: that the memory bandwidth problem wasn't a GPU problem.

                "Good going there. The results say the adrenos are bad now, doesnt matter what you think. You say a lot about tdp and thermals, but the gfxbench tests are very light on the cpu and finish in a few minutes. Those really dont apply much here."

                GFXBench is loosely cross-platform and a valid bench to mention; also, most of its tests on iOS only run on Metal. However, even excluding these factors and accepting that iOS clearly has the upper hand in GFXBench: first, the tests are not anywhere near light on the CPU; second, and most importantly, TDP and power draw are completely independent factors, whether you run the test for 10 seconds or ten minutes.

                "Not when you run the same app on the same form factor device. I'm not getting it for theory classes, its for personal use"
                what theories? those devices are completely different, architecture wise and os wise.


                "On gfxbench vulkan and opengl have similar scores, on 3dmark opengl even scores higher."
                Vulkan is a much lower-level API with far less driver overhead; it even allows much better multi-core utilization. The only thing that means is that Android's implementation of Vulkan was so poor that even its biggest chip producers fail miserably to demonstrate its effects in practice. It's been available ever since Nougat and we have yet to witness Vulkan's real improvements in Android games; only certain controlled demos from the companies themselves show the great leap that is expected from a transition like this.

                "What were they doing when the sd 845 gpu beat them?" somethibg like a trasition from imaginationtech's powervr to their own design, yeah.
                "Or the tegra x1?" tegra x1 was beaten by a9x and a10 on gfx too. idk where youre getting with this.

                "What was the highest for gddr3? I found one old ATI FireGL V7350 with 41gbps. Anything over 70gbps? "
                A lot to name, one is the dinosaur GTX 285, which had double that, I hope that doesnt hurt you.

                "theres even low end cards like 1050 3gb gddr5 with just 84gbps" the 1050 didnt even need bandwith, it was targeted for 1080p sub 60fps
                ultra affordable category and its a very small card.

                ". It sure shows in results, but the whole soc uses almost double the power to get that claimed 35% improvment. Way more power usage than any iphone"
                Wow, that's surprising; it's almost as if Apple didn't invest in ARM architecture 2 gens before Qualcomm or any major chip provider for Android devices, to get extremely efficient CPUs that draw way less power. I won't say Apple CPUs aren't better, but their GPUs probably are not.
                "when qualcomm was actually ahead 3 years ago with the 845"
                Qualcomm was never ahead; the 845 was only on par with the A11 because the latter had the first cycle of Apple's in-house GPU, which was only 3 cores. Had they used a newer PowerVR, things would have been the same.

                "That how it is on android. When your foreground app requires more ram, it will free the necessary amount, dont worry. Thats why i tell you to actually look it up. Come back after that"
                Idc about that anymore. I told you what Genshin takes on average on my device and that's all; you're wasting your time trying to preach to me about how to check app memory usage, as if I don't know it.

                  [deleted post]"That wasn't your point , you were blaming gpu design for the lack of memory bandwith which is wrong on so many levels."

                  Its a part of the problem then, sure its not a part of the gpu.

                  "no, your phone is outdated in a sense of comparing it to sd888 and 2021 devices, mine is just last gen of the same device we are discussing, the rog phone, plus it has more RAM than you experienced on any other phone according to you, so its more relevant to this argument and certainly has less outdated stabdards than yours, thats all really, games take lower amounts of memory on older hardware. idk if is this came surprising to you."

                  No, if mine is outdated compared to yours, yours is outdated compared to sd888 devices. So yours is also outdated. The ram is irrelevant to any android game. It runs on a slightly higher resolution on the 865 than the 855, so it will take a bit more ram, definitely not an average 4gb like you claimed. It would still be around 3gb at max. Learn how to check ram usage.

                  "except that they cant, lpddr bandwith is going nowhere, you even said that lpddr5 didnt give marginal performance advantage over lpddr4x in benchmarks, no matter how much better your mgpu is it will certainly suck for high LOD rendering, do you know what is culling? thats right, you will never be able to cull graphics on integerated memory with your logic, even if it was the ipad 20, sorry to burst your apple shaped bubbles. memory will always cripple graphic performance here. the only way is either go dedicated, which is impossible, or stay integerated and provide more of the same thing. these are the only available options for mobile gpu performance leaps so my argument stands still."

                  What are you on about? The iPad Pro gives the results, not your theories. Idk how much bandwidth it has, but it's said to be double that of the regular A12. There's LPDDR4X with 68GB/s of bandwidth on the Apple M1 and Intel Xe G7, though. The SD 865 with LPDDR5 has 40.98GB/s. Something to do with bits and channels, I think? The faster RAM on the 888 gives it 51.2GB/s. Seems like an efficiency issue; it should increase with newer nm processes and/or battery tech. There's still room for improvement.
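                  For what it's worth, the "bits and channels" part is just arithmetic: peak bandwidth is the effective transfer rate times the bus width. A rough sketch; the bus widths below are assumptions based on typical configurations for these parts, not confirmed specs:

                  ```python
                  def peak_bw_gb_s(transfer_mt_s: float, bus_width_bits: int) -> float:
                      # GB/s = mega-transfers per second * bytes per transfer / 1000
                      return transfer_mt_s * (bus_width_bits / 8) / 1000

                  # Assumed typical configurations:
                  print(peak_bw_gb_s(6400, 64))   # LPDDR5-6400 on a 64-bit bus   -> 51.2 GB/s (SD888 class)
                  print(peak_bw_gb_s(4266, 128))  # LPDDR4X-4266 on a 128-bit bus -> ~68.3 GB/s (M1 class)
                  ```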

                  "sd835 with similar flops to the a9x literally gave literally 5% performance difference to the a9x, sd835 may came 2 years later, a9x had much beefier CPU, much larger tdp and much better thermals, and literally double the memory bandwith of the sd835's maximum memory bandwith (28 vs 52), I wont larp for apple just because you think adrenos are bad.

                  btw thats pretty much self explanatory, you debunked yourself here saying that the performance difference was from the bandwith not from floating point performance. if you think so and if you really don't care about floating point performance then you should not blame it on gpu architecture. at all."

                  You were the one who brought up the FP numbers and tried to sound smart; now you are saying something else, now that you know it doesn't really relate to game performance. Good going there. The results say the Adrenos are bad now, doesn't matter what you think. You say a lot about TDP and thermals, but the GFXBench tests are very light on the CPU and finish in a few minutes. Those really don't apply much here.

                  "fps/watt are useless to compare any 2 different platforms."

                  Not when you run the same app on the same form factor device. I'm not getting it for theory classes, its for personal use.


                  "lol ok, first think about vulkan and how much faster vulkan graphics are, afaik when vulkan first emerged it was 100% better than the opengles counterpart, metal is probably two folds of that. most ios games are metal while most android games are opengles, you dont know how graphic libs can have a great impact on performance, And I did not say metal was the only reason, I mentioned a lot of reasons regarding graphic performance difference that make fps/watt silly, and why you cannot directly compare these devices, yet you insist that we must compare apple in house gpu to adrenos. ugh"

                  Oh, I sure do; Vulkan vs OpenGL can be the difference between playable and unplayable in some PPSSPP and Dolphin games. But that's usually because the OpenGL ES drivers on Android are missing some features the emulator needs. On GFXBench, Vulkan and OpenGL have similar scores; on 3DMark, OpenGL even scores higher.

                  Never heard of any problems in native Android games though; I don't even know which API they run on. I'm sure the game devs know what features the Android GPU supports and design accordingly. And iPhones run at higher settings and resolution and give more fps/watt, and Metal alone makes up for that? What were they doing when the SD 845 GPU beat them? Or the Tegra X1?

                  "I thought about literally copypasting my previous comment which is a whole paragraph just because you asked the same question here again and changed mind. you don't have to repeat yourself to gain some self validation after I disproved your argument."

                  You disproved nothing. The fact that im able to run it at max and record it with an app in the background at the same time on a 6gb device is proof enough you dont need more than that just for gaming.

                  "newsflash: lpddr bandwith is garbage to function as vram, whether its ipad pro or ipad 20x, that ipad bandwith youre boasting about is literally worse than gddr3, 2008 has called, anyways the only thing you can make use of with lpddr is being dedicated RAM for smartphones due to its low latency. like it or not RAM will always be bad for graphics due to its low bandwith because RAM bandwith is 1/10 that of vram, theres no room for improvement on that and they will always be 10"

                  What was the highest for GDDR3? I found one old ATI FireGL V7350 with 41GB/s. Anything over 70GB/s? The M1 and Intel have it at 68 with LPDDR4X. Even Adrenos surpassed it. There's even low-end cards like the 1050 3GB GDDR5 with just 84GB/s. On the weak GPUs used in phones, the other stuff would bottleneck first, like on the SD865, where LPDDR4X and 5 had the same performance. And the SD888 has a slightly faster LPDDR5, which Qualcomm says actually makes a difference. It sure shows in results, but the whole SoC uses almost double the power to get that claimed 35% improvement. Way more power usage than any iPhone. No need to compare with the A12X. But sure, all of it is somehow due to Metal, when Qualcomm was actually ahead 3 years ago with the 845. And that was on the worse Samsung 10nm vs the A11's TSMC 10nm. Now, I do know TSMC 5nm is better. But the 888 is still the best you've got on Androids for now.

                  "where did you get this astonishing fact? were talking strictly about operating systems memory consumption."

                  That's how it is on Android. When your foreground app requires more RAM, it will free the necessary amount, don't worry. That's why I tell you to actually look it up. Come back after that.

                    Anonymous, 12 Mar 2021: "No i didnt, i said its something i include in gpu pow..."
                    "the more you try to dispute this the more it makes you sound immature as RAM doesn't relate to gpu design in shared memory architecture, you can use the same chip on a different smartphone with different RAMs like sk hynix lpddr or samsung lpddr. it has nothing to do with GPU's OWN performance or design, 0, none, nada. if faster RAM means anything here then its better system performance."

                    Why are you stating obvious stuff like you are some sort of know-it-all? It's still something that greatly affects GPU performance. You just keep being overly technical about it.

                    "Except that I wasnt discussing any of that, and wasn't comparing mobile to pc, I was talking about graphics in general, which is something you clearly dont know anything about."

                    Except that the whole discussion is about mobile games here, and whether they require all that ram or not. And they arent there yet. Good job avoiding it again. Keep on acting like a know it all.

                    "ok lol first my asus with bloatware still have double the availabe memory than your outdated op7, sit down be humble.
                    I don't relate to your irrelevant experience on an outdated flagship, a phone from 2016 can run this game too, however I am exclusively talking about my experince on the latest chip available on the market with the biggest amount of RAM, which is relevant."

                    Why don't you be humble first, Mr. "know it all, other people's phones are just a year older than mine but still outdated toasters". Your phone is outdated by your logic too; in case you didn't know, there's a thing called the SD888 now. Look it up. I'm not talking about 2016 phones running Genshin on low. I'm talking about running it on max graphics on my phone, at 60 fps, and the RAM used for that. Check your RAM usage again, and this time read the steps carefully. Download Simple System Monitor and check the active processes section. You should be able to do at least that, right?
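                    If you'd rather not trust a monitor app, here is a rough sketch of pulling the same number over adb; the package name is an assumption, and dumpsys reports the app's PSS in kB:

                    ```python
                    import subprocess

                    # Assumed package name; verify with: adb shell pm list packages | grep -i genshin
                    PACKAGE = "com.miHoYo.GenshinImpact"

                    # dumpsys meminfo <package> prints the app's memory breakdown; the TOTAL row is PSS in kB.
                    out = subprocess.run(["adb", "shell", "dumpsys", "meminfo", PACKAGE],
                                         capture_output=True, text=True, check=True).stdout
                    for line in out.splitlines():
                        if line.strip().startswith("TOTAL"):
                            print(line.strip())
                            break
                    ```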

                    "which is exactly why 18GB RAM matters, that and the fact that modern mobile game development can benefit from memory size instead of being stuck with lpddr bandwith limitations."

                    Obvious point about multitasking aside, no, they can't take advantage of the extra RAM amount. There are already games that only run officially on certain flagship SoCs, like GRID Autosport, but no game has any specific option for all the 12 or 16GB phones out there. By the time those games actually come, your 16GB gamer phone will be outdated by midrangers.

                    "anyone that says fp performance doesn't relate to graphic performance is someone that haven't studied graphics. contrary to the popular belief that flops don't matter because they're just mathmatical calculations and games aren't mathmatical calculations, they do, graphics ARE mathmatical calculations in essence.
                    which is why this is bee es, anyone who took more or less of a single course on graphics will tell you the exact same thing, but people and companies too tend to underestimate maximum fp performance as a coping mechanism to their horrid thermal design compared to others."

                    I don't care about your theory classes, I care about in-game performance, and FP doesn't equate to it. Apple claimed the A12X iPad Pro was as powerful as the Xbox One (1.3 TFLOPS). Idk on what basis Qualcomm claims the yearly performance increases, but the 855 was said to be 954 GFLOPS iirc. So adding the 25% for the 865 and 35% for the 888, the 888 GPU would be about 1.6 TFLOPS. But it's still way weaker than the A12X GPU. And Apple said most of the performance increase of the A12X over the A12 came from the doubled bandwidth.
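                    The compounding itself, using the figures above (the 954 GFLOPS starting point and the yearly percentages are the claims being discussed, not verified numbers):

                    ```python
                    adreno_640 = 954                # claimed GFLOPS for the SD855 GPU (as quoted above)
                    adreno_650 = adreno_640 * 1.25  # +25% claimed for the SD865 -> ~1193 GFLOPS
                    adreno_660 = adreno_650 * 1.35  # +35% claimed for the SD888 -> ~1610 GFLOPS, i.e. ~1.6 TFLOPS
                    print(round(adreno_660))
                    ```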

                    No, mobile gpus cant run games in the same settings as the xbox one, just because its got higher tflops. Read my above para. Again, idc about your theories. Its still something that really affects the game performance.

                    "that is some anandtech "bee es". because only anandtech uses fps/watt, what a useless thing to discuss, imagine comparing efficiwncy of two different gpu/cpu architecures on two different operating systems to get revenue and traffic. what if I told you that before apple's in house gpus and back when apple used Powervr, there were some android oems that used powervr of the same class of the iphone and got smashed by arduinos and malis? (asus zenfone 2 and a ton of other phones that used powervrs of similar power to apple counterparts of their time and gave horrid gpu performance) Apple is a different league in terms of their software environment, they have metal and tons of hardware acceleration and tons of accelerations PUs on their devices."

                    That's actually the most useful thing to discuss for mobile gaming: more battery while gaming, a less hot phone, less throttling. The differences are too big for Metal alone to be the reason. The Tegra X1 actually beat the Apple SoC of its time in fps/watt. Nothing matches Apple in fps/watt now. Qualcomm once came close with the 845 vs the A11.
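                    The metric itself is just sustained frames per second divided by average power; a sketch with placeholder numbers, since no measured power figures have been posted here:

                    ```python
                    def fps_per_watt(avg_fps: float, avg_power_w: float) -> float:
                        # Efficiency = sustained fps / average SoC (or board) power in watts.
                        return avg_fps / avg_power_w

                    # Hypothetical values, not measurements:
                    print(fps_per_watt(60, 4.5))  # ~13.3 fps/W
                    print(fps_per_watt(60, 6.0))  # 10.0 fps/W at the same frame rate, so less efficient
                    ```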

                    "they cannot. I was talking about comparing them both with other devices"

                    Yes, they can; they are both available for consumer use, in similar form factors to Android devices, running the same games.

                    "multitasking wise 18gb does benefit and for futureproofing too, these were my points"

                    Your point was that 6GB was somehow 'outdated' for gaming. And other than Egg NS, no, it's not. What's the use of 16GB of RAM when your SoC will be outdated by midrangers in the future? Keeping 200 tabs open in Chrome?

                    " its not, I was explaining how much of a tumor shared memory is and why you would need high amount of it, dedicated memory won't give you the same memory consumption issues and performance wise it will demolish it too."

                    It is. What tumor? No mobile game needs that ram amount now, and bandwidth of shared memory can still be increased, like a12x.

                    "and what kind of peasant would depend on integerated graphic chips to game on them? exactly, nobody."

                    And? Its still the closest thing to a mobile soc in the pc space. And its been getting better lately.

                    "now youre seriously defending androids memory consumpion, you clearly dont know know how much RAM any other major operating system uses. just fyi, idle win10 uses 2.8 GBs max on any machine. with bloatware and office that is."

                    And you clearly don't know how each OS uses RAM. On phones with less RAM, Android won't use the same idle memory as on ones with more RAM. Free RAM is wasted RAM, and when needed, the Android device will free up RAM for the foreground app. Look it up.

                      Anonymous (6vV), 12 Mar 2021

                      Mediatek sux, 09 Mar 2021: "you dont, youre dodging a lot, you claimed that memor..."
                      "No i didnt, i said its something i include in gpu power, as in faster ram = higher perf of the entire gpu. So more power. Get it yet?"
                      The more you try to dispute this, the more it makes you sound immature, as RAM doesn't relate to GPU design in a shared memory architecture; you can use the same chip in a different smartphone with different RAM, like SK Hynix LPDDR or Samsung LPDDR. It has nothing to do with the GPU's OWN performance or design, 0, none, nada. If faster RAM means anything here, then it's better system performance.

                      "We are talking about mobile games and mobiles gpus here. They arent there yet. Good job avoiding the topic. Nowhere did i claim anything about pc games or gpus"

                      Except that I wasnt discussing any of that, and wasn't comparing mobile to pc, I was talking about graphics in general, which is something you clearly dont know anything about.

                      "Its your asus bloat then. I play on oneplus 7, 6gb ram. If mines a toaster, yours is just the next years model toaster. Did you even check in the developer options exactly as i told you to? Try on a phone with 6gb ram and you'll see genshin max usage on highest settings will be around 2-3 gb. Again, proving my point that ram amount over 6gb isnt important for genshin even with recording and a browser in the background."
                      OK lol, first, my Asus with bloatware still has double the available memory of your outdated OP7, so sit down, be humble.
                      I don't relate to your irrelevant experience on an outdated flagship; a phone from 2016 can run this game too. However, I am exclusively talking about my experience on the latest chip available on the market with the biggest amount of RAM, which is relevant.

                      "For now, yes, you can pick, mobile game requirements arent quite there yet, for all that ram amount. You can obviously pick the lower ram model of 6/8/12 phones for now and get the same gaming perf, ofc not the same multitasking experience as the higher ram model."
                      Which is exactly why 18GB of RAM matters, that and the fact that modern mobile game development can benefit from memory size instead of being stuck with LPDDR bandwidth limitations.


                      "No they arent. Fp performance doesnt relate to general game performance"
                      Anyone who says FP performance doesn't relate to graphics performance is someone who hasn't studied graphics. Contrary to the popular belief that FLOPS don't matter because they're just mathematical calculations and games aren't, they do matter; graphics ARE mathematical calculations in essence.
                      Which is why this is BS; anyone who has taken even a single course on graphics will tell you the exact same thing, but people, and companies too, tend to downplay maximum FP performance as a coping mechanism for thermal designs that are horrid compared to others'.
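
                      For what it's worth, the peak FP32 figure itself is just arithmetic, which is also why it says nothing about bandwidth, drivers or thermals on its own. A rough sketch with assumed (not official) numbers:

                      // peak FP32 GFLOPS = ALU lanes * 2 (a fused multiply-add counts as 2 ops) * clock in GHz
                      fun peakGflops(aluLanes: Int, clockGhz: Double): Double = aluLanes * 2 * clockGhz

                      fun main() {
                          // hypothetical mobile GPU configurations, purely for illustration
                          println(peakGflops(aluLanes = 512, clockGhz = 0.6))   // ~614 GFLOPS
                          println(peakGflops(aluLanes = 1024, clockGhz = 0.7))  // ~1434 GFLOPS
                      }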

                      "remember when i said mobile gpus have really low bandwidth? Checkout bandwidth of the xbox one."
                      Having more bandwidth or not doesn't relate to the GPU's theoretical max performance or its own processing power, which was my point. I already know that more bandwidth or VRAM is better, and I explained thoroughly why.
                      What you're thinking of as a GPU is a graphics card; graphics cards are GPUs + VRAM and a ton of other stuff. The GPU is just the chip responsible for processing graphics. The more you know.


                      "No, you make them have more fps/watt"

                      That is some AnandTech "BS", because only AnandTech uses fps/watt. What a useless thing to discuss; imagine comparing the efficiency of two different GPU/CPU architectures on two different operating systems to get revenue and traffic. What if I told you that before Apple's in-house GPUs, back when Apple used PowerVR, there were some Android OEMs that used PowerVR of the same class as the iPhone's and got smashed by Adrenos and Malis? (The Asus Zenfone 2 and a ton of other phones used PowerVR parts of similar power to their Apple counterparts of the time and gave horrid GPU performance.) Apple is in a different league in terms of its software environment; it has Metal, tons of hardware acceleration, and tons of acceleration PUs on its devices.


                      "Yes, they can. Ipads simply have a much better gpu, so it performs better than iphones, even at much higher resolution. What did you mention before? Other than throttling reducing the iphone perf faster, we dont know how iphones would even run the same game if it ran on ipad resolution. But they dont even give an option to up the res, that should tell you something"
                      They cannot. I was talking about comparing them both with other devices.

                      "Unless you are heavily multi tasking or multi tasking while maxed out gaming, no it doesnt, atleast above a certain point anyway"

                      "I was just saying for genshin at max, even with recording, you dont need more than 6gb ram."

                      Multitasking-wise 18GB does benefit, and for future-proofing too; these were my points.


                      "The whole pc example is a bit pointless here." It's not; I was explaining how much of a tumor shared memory is and why you would need a high amount of it. Dedicated memory won't give you the same memory consumption issues, and performance-wise it will demolish it too.

                      "The closest comparison would be amd apus i guess?" And what kind of peasant would depend on integrated graphics chips to game on? Exactly, nobody.


                      "I have never even thought about saying that. I know 12GB or 18GB can be useful for specific use cases. The case of modern Android firmware using 3-4GB is the 'free RAM is wasted RAM' thing: it's just caching whatever it can for the smooth functioning of the system. It varies with the amount of RAM the device has, and the system will free the RAM when another task requires it, like when you open a game that takes a few gigs of RAM, if there's not enough free RAM available."

                      Now you're seriously defending Android's memory consumption; you clearly don't know how much RAM any other major operating system uses. Just FYI, idle Win10 uses 2.8GB max on any machine, with bloatware and Office that is.

                        Anonymous, 09 Mar 2021"I know, but It determines the performance more than t... more"you dont, youre dodging a lot, you claimed that memory bandwith of lpddr, a *RAM* is the same thing as gpu processing power. RAM bandwith = memory performance.”

                        No I didn't. I said it's something I include in GPU power, as in faster RAM = higher performance of the entire GPU. So more power. Get it yet?

                        "That is the wrongest of all wrong things you mentioned so far, the obly reason rtg and nvidia are pushing +12gb vram from 8gb or so in the previous gens into their cards is to accomodate high resolution performance, thats like the basis of gpu design that you should get to know, higher res textures use more space, game data don't just float in the card and give you high res textures."

                        We are talking about mobile games and mobile GPUs here. They aren't there yet. Good job avoiding the topic. Nowhere did I claim anything about PC games or GPUs.

                        "I game on My 12gb RAM ROG 3, you mentioned you have a 6gb RAM lg or something, not sure know what kind of bread toaster you use to play genshin impact, anyways 4.4gb average memory usage on max'd out genshin on my rog beg to differ with you, A lot."

                        It's your Asus bloat then. I play on a OnePlus 7 with 6GB RAM. If mine's a toaster, yours is just next year's model of toaster. Did you even check in the developer options exactly as I told you to? Try on a phone with 6GB RAM and you'll see Genshin's max usage on highest settings will be around 2-3GB. Again, proving my point that RAM amount over 6GB isn't important for Genshin, even with recording and a browser in the background.


                        "BOTH Memory size and bandwith matters, memory size dictates the maximum amount of information you can hold into, bandwith dictates how fast that info can be held, a memory with double the bandwith but half the size won't give you major benefits. all of it matters really, you can't just pick what you like and abandon the others. this is actually one of the reasons smartphone games need a lot of memory like 18gb."

                        For now, yes, you can pick: mobile game requirements aren't quite there yet for all that RAM. You can obviously pick the lower-RAM model of 6/8/12GB phones for now and get the same gaming performance, though of course not the same multitasking experience as the higher-RAM model.

                        I think it's obvious that the SD820 would perform worse, since this game uses all 8 cores and it was pretty inefficient, and the A73 cores and later are more powerful in the midrangers that came after. You are probably right; I was just sharing what I saw in Techutopia's ROG 3 gameplay, which showed more GPU usage than CPU usage.

                        "mobile gpus are fine, adrenos now have fp performance two folds of the nintendo switch, and match xbox one level of compute performance, enough said".

                        No they aren't. FP performance doesn't relate to general game performance. Remember when I said mobile GPUs have really low bandwidth? Check out the bandwidth of the Xbox One.

                        "bigger gpus in a smartphone will just drain more power. they are useless."

                        No, you make them have more fps/watt, not just big and power hungry. iPhone GPUs have a good lead here in benchmarks, but we don't see that advantage in any games, probably because they run at higher settings and don't let the phone get as hot as Android flagships. The Tegra X1 GPU was way ahead in this when it first came out.

                        "my rog 3 cpu jumps easily to 60 and above on pubg mobile which shouldnt be that demanding on a chip like the sd865+, and despite fps being pretty stable and high and never dips, thats totally not the gpus fault in this case."

                        Are you on 90fps? What's the usage like in Genshin?

                        " and while iphones and ipads use the same principle they cannot be compared as I mentioned before."

                        Yes, they can. iPads simply have a much better GPU, so it performs better than iPhones even at a much higher resolution. What did you mention before? Other than throttling reducing iPhone performance faster, we don't know how iPhones would even run the same game at iPad resolution. But they don't even give an option to up the res; that should tell you something.

                        " tl;dr the ram amount does play"

                        Unless you are heavily multitasking, or multitasking while maxed-out gaming, no it doesn't, at least above a certain point anyway.

                        For Genshin at max with 1080p60 recording, the total RAM usage was less than 5GB (the whole system: Android OS, activities in the background, etc.; Genshin alone used just 2.4GB). At least for now anyway.

                        Phones don't have dedicated VRAM anyway; the whole PC example is a bit pointless here. The closest comparison would be AMD APUs, I guess? That said, on the SD865 there was no difference between LPDDR4X and LPDDR5 in benchmarks (OnePlus 8 vs 8 Pro); don't know how it is in games though. I think Qualcomm was saying that on the 888 it makes a huge difference. Maybe only the 888 has the processing power to take advantage of the extra speed.
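
                        As a rough idea of the gap being argued about, peak LPDDR bandwidth is easy to estimate. A minimal sketch; the speed grades below are example figures, real phones vary and real-world throughput is lower:

                        // theoretical peak bandwidth in GB/s = transfer rate (MT/s) * bus width (bits) / 8 / 1000
                        fun peakGBps(mtPerSec: Int, busBits: Int): Double = mtPerSec.toDouble() * busBits / 8 / 1000

                        fun main() {
                            println(peakGBps(mtPerSec = 4266, busBits = 64))  // LPDDR4X-4266 on a 64-bit bus: ~34 GB/s
                            println(peakGBps(mtPerSec = 6400, busBits = 64))  // LPDDR5-6400 on a 64-bit bus: ~51 GB/s
                        }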

                        If you are going to use the same phone for 2 years or more, then I guess go with the one with more RAM? The SoC would get outdated by midrangers in 3 years though. You would just have a strong but inefficient GPU, old drivers, poor CPU performance, etc. compared to new midrangers.

                        "keep in mind when you say things like "but my phone has 12gb thats more than my laptop" most of the times its not the case. plus the fact that modern android firmwares use 3-4gb alone which is a JOKE. and youre here saying 18gb is useless afterall."

                        I have never even thought about saying that. I know 12GB or 18GB can be useful for specific use cases. The case of modern Android firmware using 3-4GB is the 'free RAM is wasted RAM' thing: it's just caching whatever it can for the smooth functioning of the system. It varies with the amount of RAM the device has, and the system will free the RAM when another task requires it, like when you open a game that takes a few gigs of RAM, if there's not enough free RAM available.

                        I was just saying that for Genshin at max, even with recording, you don't need more than 6GB of RAM. If you want Chrome with 100 tabs open, a Discord chat or something, all in the background while playing at max, recording, streaming, etc., then sure, even 8 or 12 might not be enough.

                          • ?
                          • Anonymous
                          • B{P
                          • 09 Mar 2021

                          Mediatek sux, 09 Mar 2021I know, but It determines the performance more than the ram... more"I know, but It determines the performance more than the ram amount"
                          You don't; you're dodging a lot. You claimed that the memory bandwidth of LPDDR, a *RAM*, is the same thing as GPU processing power. RAM bandwidth = memory performance.


                          "Did i say its not? "
                          https://youtu.be/kxnsmsXYGJ4


                          "Ram amount is not the reason its not going 1080p or higher."
                          That is the wrongest of all the wrong things you've mentioned so far. The only reason RTG and Nvidia are pushing 12GB+ of VRAM, up from 8GB or so in previous generations, is to accommodate high-resolution performance. That's basic GPU design you should get to know: higher-res textures use more space, game data doesn't just float in the card and give you high-res textures.

                          " Do you even game? Do you even know how to check ram usage of a specific app? Let me teach you - go to developer options - memory usage - memory used by apps - click on tne app you want"
                          I game on my 12GB RAM ROG 3; you mentioned you have a 6GB RAM LG or something, not sure what kind of bread toaster you use to play Genshin Impact. Anyway, 4.4GB average memory usage on maxed-out Genshin on my ROG begs to differ with you, a lot.



                          "And? Its still not the ram amount is it? Gpu processing power and bandwidth are what matters than having 8 or 12 or 18gb ram, even for that high resolution that the ipad pro runs genshin on. I was just giving you an example for that"
                          BOTH memory size and bandwidth matter: memory size dictates the maximum amount of information you can hold, bandwidth dictates how fast that information can be moved. A memory with double the bandwidth but half the size won't give you major benefits. All of it matters, really; you can't just pick what you like and abandon the others. This is actually one of the reasons smartphone games need a lot of memory, like 18GB.

                          "The bottleneck in genshin on android is the gpu, atleast from what i have seen in the usage meters on the rog 3. The cpu is probably the bottleneck in something like pubgm at 120 low settings or something"

                          No, the bottleneck is everything but the GPU design.
                          The SD820's Adreno 530 is a massive mobile GPU; it was the best mobile GPU at the time, and it gives similar performance to, and sometimes exceeds, the Adreno 6xx parts found in the newer SD7xx and SD6xx series.
                          However, SD7xx and even newer SD6xx chips totally demolish the SD820 and SD821 in terms of graphics performance: better CPU, better memory, newer node, better efficiency, etc. Everything plays a role.
                          Mobile GPUs are fine; Adrenos now have twice the FP performance of the Nintendo Switch and match Xbox One levels of compute performance, enough said.
                          Bigger GPUs in a smartphone will just drain more power; they are useless. And you mentioned the ROG 3: my ROG 3's CPU easily jumps to 60 and above on PUBG Mobile, which shouldn't be that demanding on a chip like the SD865+, and while the fps is pretty stable and high and never dips, that's totally not the GPU's fault in this case.

                          Shared memory is the main reason, because LPDDR has very low bandwidth and because it uses memory space that is also used by the CPU; and while iPhones and iPads use the same principle, they cannot be compared, as I mentioned before.
                          Suppose one computer has 8GB but it's shared memory; that's not better than 4GB RAM + 4GB VRAM, because those 8GB will be used by the GPU as well. In fact it's not even better than 4GB RAM + 2GB VRAM, because dedicated graphics memory has much more bandwidth, which is the same as having more memory with much lower bandwidth. tl;dr the RAM amount does play a role.

                          Keep in mind, when you say things like "but my phone has 12GB, that's more than my laptop", most of the time it's not the case. Plus the fact that modern Android firmware uses 3-4GB alone, which is a JOKE. And you're here saying 18GB is useless after all.

                            Anonymous, 08 Mar 2021one of the reasons is ipad's memory bandwith* yeah tha... moreAre you even the same anonymous guy? Why dont you get a name?

                            The bottleneck in Genshin on Android is the GPU, at least from what I have seen in the usage meters on the ROG 3. The CPU is probably the bottleneck in something like PUBGM at 120fps low settings or something.

                              Anonymous, 08 Mar 2021"bandwith isn't gpu processing power. if it is anything... moreI know, but it determines the performance more than the RAM amount, anyway. When talking about GPUs, isn't the type of memory they use always seen as a part of them?

                              " if it is anything then its entirely memory related, and since smartphones use shared memory space they are limited with lpddr bandwith, dont try to teach me my stuff."

                              Did I say it's not? You know nothing. RAM amount is not the reason it's not going 1080p or higher. If you read what I wrote, I mentioned that bandwidth matters, but not RAM amount, for now. I have to teach you when you know nothing, otherwise you will spread this misinformation all over.

                              "secobdly you clearly dont game on your phone,genshin maxd out uses 4gb and more easily."

                              Do you even game? Do you even know how to check the RAM usage of a specific app? Let me teach you: go to developer options - memory usage - memory used by apps - tap on the app you want.
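
                              For reference, an app can also read its own footprint programmatically. A minimal sketch (on recent Android versions this only works for your own process; the developer options screen shows a similar averaged PSS figure):

                              import android.app.ActivityManager
                              import android.content.Context
                              import android.os.Process

                              // returns this app's proportional set size (PSS) in MB, roughly what the
                              // "memory used by apps" screen reports for a running process
                              fun myPssMb(ctx: Context): Int {
                                  val am = ctx.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
                                  val info = am.getProcessMemoryInfo(intArrayOf(Process.myPid()))[0]
                                  return info.totalPss / 1024  // getProcessMemoryInfo reports PSS in kilobytes
                              }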

                              "the reason why ipads get more fps is actually related to bandwith, their 64 bit chanbel lpddrs are way bigger and faster than their smartphone counterpart, again everything I stated was right, everything you stated was wrong."

                              And? It's still not the RAM amount, is it? GPU processing power and bandwidth matter more than having 8 or 12 or 18GB of RAM, even at the high resolution the iPad Pro runs Genshin at. I was just giving you an example of that.

                              Again, everything you stated was wrong. I know bandwidth matters and never disagreed on it anyway, but you are totally wrong on the RAM amount. I was hoping you would understand something from the iPad example, but you avoided even mentioning the RAM amount.



                                Anonymous, 08 Mar 2021its what nice or fake What do you mean? Why don't you check on your own device if you really think I'm faking?

                                  • ?
                                  • Anonymous
                                  • B{P
                                  • 08 Mar 2021

                                  Anonymous, 08 Mar 2021"bandwith isn't gpu processing power. if it is anything... moreOne of the reasons is the iPad's memory bandwidth*; yeah, that and a different thermal design, heat dissipation, a different TDP and a much better CPU design. CPUs also bottleneck GPU performance. In terms of floating-point performance, Adrenos can reach TFLOP-level figures. Adrenos used to have better floating-point performance than the Apple A series (before Apple designed its own GPU) and yet they didn't deliver the performance of Apple's previous GPU, PowerVR; clearly GPU architecture is not the answer here.

                                    • ?
                                    • Anonymous
                                    • tTb
                                    • 08 Mar 2021

                                    Mediatek sux, 07 Mar 2021Heres my experience " I said it only uses around 3gb t... moreits what nice or fake

                                      • ?
                                      • Anonymous
                                      • B{P
                                      • 08 Mar 2021

                                      Mediatek sux, 06 Mar 2021Yeah you are right, i forgot about eggns. "graphics... moreBandwidth isn't GPU processing power. If it is anything, then it's entirely memory related, and since smartphones use a shared memory space they are limited by LPDDR bandwidth, so don't try to teach me my stuff. Secondly, you clearly don't game on your phone; Genshin maxed out easily uses 4GB and more. The reason iPads get more fps is actually related to bandwidth: their 64-bit-channel LPDDR is way bigger and faster than its smartphone counterpart. Again, everything I stated was right, everything you stated was wrong.

                                        Hemedans, 07 Mar 2021does it matter if its genshini alone or not, it shows that ... moreHere's my experience: "I said it only uses around 3GB total. Idk how it runs on a 4GB device, but it runs fine on a 6GB phone even with recording. I can even keep my browser open with a few tabs in the background."

                                        Try it yourself and see. You don't need 18GB or 12 or even 8 for what I did. Like I said, if you've got lots of apps running in the background, sure, get the 12GB or 18 or whatever.