Intel announces 5.5GHz Core i9-12900KS CPU

06 January 2022
That's for a single core - it can sustain 5.2GHz on all performance cores.


Carol, 09 Jan 2022: If you think ARM does not get hot, well...you have a thing ...
"If you think ARM does not get hot"
You're putting words into my mouth.

    • P
    • PixelFan
    • w4Y
    • 10 Jan 2022

    Tim Chef, 08 Jan 2022: RISC-V is a blessing for the Chinese, as they are building ...
    Tbh, I'd have expected Google to go in that direction as well.
    But here we are with the ARM-based Tensor.

      • C
      • Carol
      • n@x
      • 09 Jan 2022

      Nick Tegrataker, 07 Jan 2022: You can still recognise all of Apple's technological f...
      If you think ARM does not get hot, well...you have a thing or two coming. I see nothing anti-customer about a CPU that heats up under load; that is simply how a CPU works. And nobody needs a bigger case when their CPU runs hotter than before. All they need to do is clean the radiator and apply new thermal paste. While at it, buying a better paste that can last the CPU more than a couple of years would be a good idea.

        • P
        • Pc Tekk
        • 4Yh
        • 09 Jan 2022

        Goddess Lyrienne, 06 Jan 2022: Well... to be specific it's a gain of roughly 61% comp...
        A 10-12% boost? More like 3% from the 3770K to the 4790K.

          • P
          • PC-Tekk
          • 4Yh
          • 09 Jan 2022

          Anonymous, 08 Jan 2022: Building your own system has its downfall as well. I used t...
          So true, man. Even power supplies change size, to the point where an upgrade no longer fits in your case...

            • ?
            • Anonymous
            • IbE
            • 09 Jan 2022

            Anonymous, 08 Jan 2022: Building your own system has its downfall as well. I used t...
            I mean, so you'd rather not have a computer? All the points you made apply to every way of buying a computer, unless you can get a multi-year upgrade plan on prebuilts like you can on phones.

              • T
              • Tim Chef
              • 39y
              • 08 Jan 2022

              PixelFan, 07 Jan 2022: Agreed. It would be interesting to see how things will pla...
              RISC-V is a blessing for the Chinese, as they are building around that license-free architecture in an attempt to rid their systems of proprietary Western technology (i.e., ARM and x86).

                • F
                • AnonF-1006353
                • M8r
                • 08 Jan 2022

                Anonymous, 08 Jan 2022: Building your own system has its downfall as well. I used t...
                Yeah, so just throw away the enclosure and the other still usable parts, right? They want you to consume, while painting themselves as oh so environmentally friendly...

                One can just upgrade the RAM, mainboard and CPU while keeping the old GPU, as long as you don't use it often for gaming. Or the other way around, if games are important to you. The PSU and SSD (or even the HDD), as well as the enclosure (that's what the ATX standard is for), can still be used in a new system, as long as it isn't more power hungry. You can also sell all the parts, or build an office machine out of them (while keeping, let's say, the GPU). There are plenty of good backup tools for Windows and Linux as well, if one takes the effort to actually set them up right.

                But I do fully agree with the point about the price madness on GPU and CPU parts, since scalpers and miners distort the market, as does the chip shortage.

                Yeah, it takes more time to build your own machine, but just because the average Joe doesn't care about it, the possibility shouldn't be taken away from those who know how to do it. Parts should stay interchangeable. Or do you throw away your car just because you need new tires or brakes?

                  • ?
                  • Anonymous
                  • UG4
                  • 08 Jan 2022

                  AnonF-1006353, 08 Jan 2022: It's not about apple here. It's about proper powe...
                  Building your own system has its downfalls as well. I used to build my own systems, thinking it was the most bang for the buck. But nowadays, not really.

                  Let's say you buy all your parts to last 5 years. After 5 years, the mainboard is no longer supported, and neither are the RAM or the CPU socket. So you end up buying the entire set again, and if you need new storage, a Windows license again... and moving the software from your old computer to the new one is a pain in itself. And the scalpers keep the prices high.

                  The all-in-one concept saves you time and effort. Get a new phone / tablet / PC / Mac, just use Time Machine to recover, and you're back to work in less than an hour. Not working? No need to debug what went wrong, just go for a one-to-one exchange.

                    • F
                    • AnonF-1006353
                    • gD$
                    • 08 Jan 2022

                    Shanti Dope, 07 Jan 2022: The only anti-consumer here is you who's continuously ...
                    It's not about Apple here. It's about proper, powerful desktop components for PC hardware professionals and enthusiasts who want the freedom to choose their components, instead of buying a prebuilt or a notebook/tablet like the average Joe.

                    If that will still be possible when ARM arrives on desktop PCs, I'm all in for it. I'm just not a fan of that one-size-fits-all, glued-together nonsense, because it's perfectly doable without the stupid trend some brands push down their customers' throats. It just needs new designs and standards. I get that some people are overwhelmed when they have to choose their hardware internals. Just don't take the choice away from those who are not.

                    I'm not a fan of how Intel has under-delivered for a few years now, either. This 5.5 GHz heater won't be the answer for non-mainstream applications outside the ARM world. AMD will hopefully be a bit more interesting with the 5800X3D and Ryzen 7000.

                      • A
                      • Adul Al Salami Kebab
                      • gx$
                      • 07 Jan 2022

                      x86 is required for backwards compatibility; emulation is not the answer yet. Maybe when ARM becomes 3 to 4 times faster it will be?

                        AnonF-1006353, 07 Jan 2022: Stop fanboying stupid anti consumer trends
                        You can still recognise all of Apple's technological feats while disliking the so-called "anti-consumer trends". But you could definitely claim that making a desktop CPU so hot and power-hungry that many existing custom PC users who want it would also need to buy a new case or install a more powerful cooling solution just to keep the temperature under control is somewhat anti-consumer too.

                          • ?
                          • Anonymous
                          • XMr
                          • 07 Jan 2022

                          Shanti Dope, 07 Jan 2022: The only anti-consumer here is you who's continuously ...
                          You don't understand what RISC is...

                            • ?
                            • Anonymous
                            • PqC
                            • 07 Jan 2022

                            Shanti Dope, 07 Jan 2022: The only anti-consumer here is you who's continuously ...
                            Floppy drives disappeared when USB drives came in. You need that big a difference to see one technology replace another. The end user doesn't care what's more efficient as long as it doesn't get in the way of his usage.

                              • P
                              • PixelFan
                              • w4Y
                              • 07 Jan 2022

                              Shanti Dope, 07 Jan 2022: Given enough ample time for ARM architecture to dominate th...
                              Agreed.
                              It would be interesting to see how things play out when the architecture is changed. But I don't think "go ARM" is the sole way to achieve that.

                              Apple did a remarkable job with ARM, no doubt. It put much-needed pressure on a market where the original big players just fell asleep at the wheel after gaining a considerable lead early on. Now, they're deservedly the leaders of the SoC sphere.

                              It's great to see more players attempting to adopt the ARM architecture, but that shouldn't be the only direction in which development proceeds. If that happens, we'd soon be back to square one with one major player in the market. A monopoly of any kind is bad.

                              I'm not saying people should go with RISC-V, but that's the only alternative I have off the top of my head. The actual experts might be able to come up with something else.

                              That being said, the companies might unfortunately not do any of this, as they lack the financial incentive. So I guess we should brace for ARM dominance soon.

                                • P
                                • PixelFan
                                • w4Y
                                • 07 Jan 2022

                                Shanti Dope, 07 Jan 2022: The only anti-consumer here is you who's continuously ...
                                The x86 architecture is in the process of being replaced (a couple of decades too late, sure, but hey, at least it's happening).

                                Moreover, pushing for ARM dominance isn't exactly pro-consumer either. I'd very much like an alternative architecture to exist. By "exist", I don't mean existing for the sake of it, but actually being competent.

                                From my understanding, AMD is looking into hiring RISC-V experts. Hopefully, we'll see development on that front.

                                All in all, we need more players. Actually competent players, not posers.

                                  • n
                                  • nice looking CPU
                                  • XDv
                                  • 07 Jan 2022

                                  Man... I wish I could understand what you guys are saying.

                                    AnonF-1006353, 07 Jan 2022: Stop fanboying stupid anti consumer trends
                                    The only anti-consumer here is you, who's continuously insisting that the old, outdated, and inefficient x86 architecture remain in all consumer computers.
                                    If Apple had not made the move of introducing the M1, neither Intel nor AMD would be making any progress toward ARM-based SoCs, and Microsoft wouldn't even bother continuing its efforts in the development of Windows on ARM.

                                    ARM is the future, whether you like it or not. x86 will stay around for longer, but only for running super old programs that ARM no longer supports.
                                    No amount of optimization can combat the inefficiency of x86.
                                    Just imagine how much convenience it brings a productive user when video editing and other similar tasks are more than twice as fast as on their previous computer while consuming less than half as much power.
                                    I don't like Apple's anti-consumer decisions on their iPhones, but credit is to be given where it's due, as Apple is pretty much the leader in ARM SoC development right now.

                                      • M
                                      • Montes
                                      • ajP
                                      • 07 Jan 2022

                                      AshkanArabim, 07 Jan 2022: Power draw is probably somewhere around 600 - 650 watts
                                      Hahahaha... good one, and very conservative. ;)

                                        TheLastOracle, 07 Jan 2022: Yes, more advanced chip architecture. Also, after a poin...
                                        Given enough time for the ARM architecture to dominate the computer market, it will reach the same generational improvements that x86-based systems had, and may even surpass them by a long shot thanks to the more modern instruction set.

                                        I want to see how much further the RTX 3090 Ti and Radeon 6900 XT can go if they are not held back by the burden of x86's inefficiency and outdatedness.