Hi.
I have a 9600K @ 4.8 GHz all cores, with an RTX 3080 12 GB and 16 GB of DDR4-3600 RAM.
I usually play at 4K resolution.
Would I see any kind of improvement upgrading to a 14600K with 32 GB of DDR5-6000?
Thanks everybody. :)
You would, in some titles. But people underestimate how few cores some games still use, and how well a 9th-gen chip at 4.8 GHz still holds up. I had an 8600K overclocked to 4.8 GHz all-core with a 5 GHz single-core boost, and when I upgraded to a Ryzen 7700X I really didn't see huge gains in a lot of games. That was at 1080p with a 6600 XT, which is likely a similar CPU load to a 3080 at 4K. I wasn't playing many games that use a lot of threads. And a 5 GHz 8th-, 9th-, or 10th-gen Intel CPU with no Hyper-Threading is still about equal to a Ryzen 5600 with its SMT disabled.
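If you want to check how many cores your own games actually load, here's a rough sketch you could run in a second window while playing. It assumes Python with the third-party psutil package; the 50% threshold is just my pick, nothing standard:

```python
# Prints per-core CPU load once a second; run it while the game is up.
# Requires: pip install psutil
import psutil

try:
    while True:
        # Utilization of each logical core, sampled over one second.
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        busy = sum(1 for pct in per_core if pct > 50)
        print(f"cores over 50%: {busy}/{len(per_core)} | "
              + " ".join(f"{pct:3.0f}" for pct in per_core))
except KeyboardInterrupt:
    pass
```

If only four to six logical cores ever get loaded, a 9600K at 4.8 GHz probably isn't the bottleneck in that game.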
People often way overestimate how CPU-demanding games really are. There are exceptions: Starfield, that Battlefield game from a few years ago that came out in a really broken state, or Star Wars Jedi: Survivor, which also launched broken. Lots of really badly optimized cash grabs.
Some games are broken in other ways but still well optimized. Cyberpunk, when it released, played at 60 fps on a now 12-year-old 2600K. I underclocked my 8600K to 1.8 GHz to see what would happen, and it still ran at 40 fps.
Ray tracing is very CPU-heavy, though, and you'll see large gains there, especially if you also switch to DDR5.