Intel Core i9-13900KS Review: The World's First 6 GHz CPU

xDiVolatilX

[H]ard|Gawd
Joined
Jul 24, 2021
Messages
1,531
Don't you have a 13900KS? With a shitload of rad space? Why don't you OC it and share the results?
I'd rather see a [H] member OC'ing this and see the outcome than watch some random YouTuber I've never heard of doing it.
Ya, I have the 13900KS on 840mm of Alphacool copper radiators in push/pull with an EK Quantum Velocity 2 block and an EK Quantum Kinetic D5 pump.

I haven't OC'd yet because I'm waiting for my ATX 3.0 power supply. Even though I'm sure I'd be fine, I'm kinda just looking forward to tuning the whole system once I get the new 1300W MSI MEG Ai Platinum PSU, because the 4090 uses a lot of power, along with the 13900KS, the 16 fans, pumps, M.2 drives, peripherals, etc.

Also, SkatterBencher is probably the best Intel overclocker on YouTube. I haven't seen anyone else with this much knowledge. Have you seen his videos? The guy is a mad scientist. He knows everything; I know about 5% of what he knows. I'm just a filthy casual. Admittedly, SkatterBencher is an actual pro who can fine-tune a CPU better than anyone I've ever seen.

I'll post some stuff up when I get my rig set up. But you really should watch SkatterBencher if you wanna boost past 6GHz.
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
61,578
I've done a lot of CPU testing with games, and I think people conclude CPUs don't matter for gaming because at 4K the results often look the same, or because at lower resolutions a faster CPU gets 50-100FPS more than a slower CPU that already pulls 300FPS, so they declare "CPUs don't matter." They do, and you can see a difference in games. It's just that the average frame rates don't tell you much of anything. You have to look at the frame times and the minimum frame rates to get a good idea of how they differ.
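
To put numbers on that, here's a rough Python sketch of why averages hide stutter. The log format and figures are made up for illustration; tools like CapFrameX or PresentMon give you this kind of per-frame data.

```python
# Rough sketch: why average FPS hides stutter. Assumes a per-frame
# frame-time log in milliseconds (CapFrameX/PresentMon-style data).

def summarize(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms)[-max(1, n // 100):]  # slowest 1% of frames
    low_1pct_fps = 1000 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps, max(frame_times_ms)

# Mostly-smooth run with occasional 50 ms hitches:
smooth = [10.0] * 990 + [50.0] * 10
avg, low, worst = summarize(smooth)
print(f"avg {avg:.0f} FPS, 1% low {low:.0f} FPS, worst frame {worst:.1f} ms")
# avg ~96 FPS looks fine, but the 1% low (~20 FPS) exposes the stutter.
```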

When I went down the rabbit hole on this very subject, I found out that a Threadripper 2920X was basically garbage for 4K gaming. It couldn't deliver a smooth 60FPS experience in a lot of games. The average frame rates alone would have made you think otherwise. But in something like Destiny 2, I had frame rates drop into the 20's or 30's where an Intel 9900K didn't. The lowest the 9900K ever dropped was 56FPS in the same test, and when overclocked, it didn't even drop that low.

The CPU does matter, and using games as a test is indeed valid. It's not the biggest factor in your gaming performance, to be sure; that's mostly on your GPU. But the CPU does factor in and does matter.
 

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
8,842
...then why are you using it as a benchmark to call the KS "weak"? If it isn't the CPU that is limiting GTA, which is completely unsurprising, then it goes back to the argument I've been making: the CPU is just not that big a deal for gaming, and using games as a benchmark for how good a CPU is doesn't make sense.
why can't a 4090 run it though? :( should slice through like butter, but the CPUs/GPUs aren't doing it
 

xDiVolatilX

[H]ard|Gawd
Joined
Jul 24, 2021
Messages
1,531
why can't a 4090 run it though? :( should slice through like butter, but the CPUs/GPUs aren't doing it
Probably just poor game optimization? Just like Cyberpunk, no matter what, it runs like dog doodie💩
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
61,578
why can't a 4090 run it though? :( should slice through like butter, but the CPUs/GPUs aren't doing it
There are lots of design decisions in a game that can negatively impact performance. Look at the original Crysis. It still runs like crap. It's not as simple as "optimization" as people often seem to believe. The way they implemented certain features caused the game to be more demanding, but that's also why its visuals stood the test of time so well. Additionally, the engine just isn't capable of utilizing more than two CPU threads. GTA V isn't quite that bad but there are similar issues at work there.
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
61,578
Probably just poor game optimization? Just like Cyberpunk, no matter what, it runs like dog doodie💩
I wish people would just stop with the "optimization" crap. When CDPR did optimize the game and squeezed about 10FPS more out of it in one of the previous patches, tons of people screamed about how much worse the game looked, and CDPR ended up reverting the changes. Optimization isn't running some magic algorithm that makes the game run better while looking the same; that's not at all how it works. They adjust the draw distance, LOD, and tons of other visual effects to make the game run better. Sometimes they can adjust a few things that aren't noticeable, and sometimes they can't.
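
To illustrate what that kind of tuning actually is, here's a toy Python sketch; the LOD table and distances are invented and have nothing to do with CDPR's actual engine:

```python
# Toy sketch of what "optimizing" via LOD/draw distance means. Raising the
# LOD bias or cutting draw distance reduces GPU work, and it's visible.

LOD_TRIANGLES = [20000, 5000, 1200, 300]  # detail levels, highest first

def triangles_for(distance_m, draw_distance_m, lod_bias):
    if distance_m > draw_distance_m:
        return 0                      # culled entirely: pops out of view
    lod = min(int(distance_m // 50) + lod_bias, len(LOD_TRIANGLES) - 1)
    return LOD_TRIANGLES[lod]

scene = [25, 80, 140, 260, 400]       # object distances in meters

for draw, bias in [(500, 0), (300, 1)]:
    total = sum(triangles_for(d, draw, bias) for d in scene)
    print(f"draw distance {draw}m, LOD bias {bias}: {total} triangles")
# The "optimized" settings render far fewer triangles -- that's the FPS
# gain, and also exactly the fidelity loss people screamed about.
```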

Some engines are more efficient than others. The fact is, there is only so much that can be done about it once a game is finished. Each engine has its pros and cons. Those pros and cons come from how it's coded, and sometimes there just isn't a way to improve the performance all that much. It's very likely that REDengine isn't as efficient as Unreal Engine 5, but the latter wasn't even available when Cyberpunk 2077 began development.

A game running "badly" often isn't about optimization as people understand it. Optimization is about making visual trade-offs for performance. It's about stressing the hardware less by neutering visuals, not doing some alchemy on the back end that magically makes a game butter smooth.
 

xDiVolatilX

[H]ard|Gawd
Joined
Jul 24, 2021
Messages
1,531
Take it easy on a filthy casual like me. I'm no engineer, I just like to play games lol 😆
 

Sycraft

Supreme [H]ardness
Joined
Nov 9, 2006
Messages
5,305
There are lots of design decisions in a game that can negatively impact performance. Look at the original Crysis. It still runs like crap. It's not as simple as "optimization" as people often seem to believe. The way they implemented certain features caused the game to be more demanding, but that's also why its visuals stood the test of time so well. Additionally, the engine just isn't capable of utilizing more than two CPU threads. GTA V isn't quite that bad but there are similar issues at work there.
Yes and no.

Yes, I agree that optimization can be about trade-offs, but there are also things you can do in-engine and in the design that make things work better or worse. A simple example is just making sure non-visible objects are culled correctly so you aren't rendering things that don't need to be.
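
For instance, a bare-bones sketch of that idea in Python (real engines use frustum and occlusion tests; this only shows the principle of skipping hidden work):

```python
# Bare-bones visibility culling sketch: skip anything behind the camera
# or beyond the far plane so it never reaches the renderer.
from dataclasses import dataclass

@dataclass
class Object3D:
    name: str
    distance: float       # distance from the camera
    behind_camera: bool   # trivial stand-in for a real frustum test

def visible(obj, far_plane=1000.0):
    return not obj.behind_camera and obj.distance <= far_plane

scene = [
    Object3D("player", 2.0, False),
    Object3D("building", 150.0, False),
    Object3D("mountain", 5000.0, False),    # beyond the far plane
    Object3D("street_behind", 10.0, True),  # behind the camera
]

draw_list = [o for o in scene if visible(o)]
print([o.name for o in draw_list])  # only 'player' and 'building' get drawn
```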

Things like not using more CPU cores are also an optimization issue when it comes to current systems. Old games are not optimized for how current systems work, and that is one of the reasons why they underperform. It's not the fault of modern CPUs and GPUs that an old game is written in such a way that it doesn't properly use the resources available, and it would be an optimization if it were written to do so.
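
As a trivial sketch of the difference, nothing engine-specific: the same per-entity work done on one thread versus spread across all cores:

```python
# Trivial sketch of spreading per-frame work across cores instead of one
# thread (real engine job systems are far more involved than this).
from concurrent.futures import ProcessPoolExecutor
import os

def update_entity(entity_id):
    # Stand-in for AI/physics/animation work on one entity.
    return sum(i * i for i in range(10_000)) + entity_id

entities = list(range(1024))

if __name__ == "__main__":
    # Old-style: one thread walks every entity.
    serial = [update_entity(e) for e in entities]

    # "Optimized for current systems": split the batch across all cores.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        parallel = list(pool.map(update_entity, entities, chunksize=64))

    assert serial == parallel  # same results, just computed in parallel
```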

For a single-player game that's no longer being developed, it makes sense that isn't going to happen. It is what it is; you aren't getting something for new computers unless there's a remaster. But for an online, still-developed, still-charging-for-new-things game like an MMO or GTA? That's more questionable. There I think you can argue that the engine should get rewrites to optimize for modern hardware.
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
61,578
Yes and no.

Yes, I agree that optimization can be about trade-offs, but there are also things you can do in-engine and in the design that make things work better or worse. A simple example is just making sure non-visible objects are culled correctly so you aren't rendering things that don't need to be.

Things like not using more CPU cores are also an optimization issue when it comes to current systems. Old games are not optimized for how current systems work, and that is one of the reasons why they underperform. It's not the fault of modern CPUs and GPUs that an old game is written in such a way that it doesn't properly use the resources available, and it would be an optimization if it were written to do so.

For a single-player game that's no longer being developed, it makes sense that isn't going to happen. It is what it is; you aren't getting something for new computers unless there's a remaster. But for an online, still-developed, still-charging-for-new-things game like an MMO or GTA? That's more questionable. There I think you can argue that the engine should get rewrites to optimize for modern hardware.
Some optimizations aren't about visual trade-offs, I'd agree. However, in a lot of instances, that's what we are really talking about. Some engines are better at certain types of gameplay mechanics than others. Some lend themselves well to open worlds, shooters, or simulation-type games, and others don't. A lot of the issues with many games come down to engine selection. If a company licenses a third-party engine, it can only do so much with it before it's beyond the vendor's realm of support. You can't heavily modify Unreal Engine 4 and expect Epic to help you out when you've altered the code so badly they barely recognize it. In many cases it comes down to licensing or using the wrong engine for the task. We've seen how badly EA wanted to make Frostbite the engine of choice, and many games built with it suffered because of that choice. Those issues weren't a result of poor optimization so much as trying to fit a square peg into a round hole. All of the Frostbite-based BioWare games are good examples of this, with some problems being more egregious than others.

Take Cyberpunk 2077, for example: the engine just isn't that good. It's complicated and difficult to work with. But I don't think anything on the back end will make it run better, or CDPR would have done that already. When they optimized the game, they made adjustments to the draw distances of objects, LOD, etc., and people noticed right away that the visual fidelity of the game was compromised. People act like Cyberpunk 2077 can be made to run smoother, but the fact is that it probably can't. You can only improve the streaming engine so much. You can make the game use more video memory, and there is some evidence that higher core count CPUs run it better, but when you turn up the graphics, the advantage of higher core count CPUs disappears completely.

This comes down to poor decision making rather than optimization. Most of what people think of as optimization is what you are talking about, but I don't think that kind of stuff is really the solution, nor what benefits a game the most in terms of frame rates. It comes down to the visuals and making trade-offs. Of course, a lot of the time there is optimization that could be done, and it isn't done because some third-party company was hired to port a game optimized for consoles to the PC and only did the bare minimum to make it run with reasonable stability. People are also under the mistaken impression that if game developers would just make the game use more threads, everything would be fine. However, CPUs aren't the biggest factor in game performance, and a lot of the tasks involved in a game won't benefit from additional parallelization.
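
That last point is basically Amdahl's law. A quick back-of-the-envelope in Python, where the 40% parallel fraction is just an assumed figure for illustration:

```python
# Amdahl's law: if only part of the frame work can be parallelized, piling
# on cores stops helping fast. The 40% parallel fraction is an assumption
# for illustration, not a measured number from any real game.

def speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8, 16, 32):
    print(f"{cores:2d} cores: {speedup(0.40, cores):.2f}x")
# 2 cores: 1.25x ... 16 cores: 1.60x, 32 cores: 1.63x -- past ~8 cores the
# returns are negligible, which is why "just use more threads" falls flat.
```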
 

xDiVolatilX

[H]ard|Gawd
Joined
Jul 24, 2021
Messages
1,531
Just starting to mess around here.
This is with the 8 P-cores manually set to 6GHz and everything else on AUTO: hyper-threading off, all 8 E-cores enabled and running at their AUTO 4.3GHz.
This is just a quick n' dirty test to see if it can hold 6GHz for gaming (all I'm interested in, as I don't do any productivity work whatsoever anymore) with everything on AUTO.
No fine-tuning, no nothing. EVERYTHING AUTO.
Z790 AORUS MASTER
13900KS
Cinebench all core workload (16 core)
840mm Alphacool copper (push/pull low speed)
EK Quantum Velocity 2 block
EK Quantum Kinetic D5 pump (low speed 950rpm)
The Vcore is much higher than I'd like at 1.440V, but the test has been running stable for 20 minutes now. Is that OK as long as the temps are in check and power usage is under 300W?
This is just to show 6GHz with everything on AUTO and see how it behaves: all 16 cores at 100% usage, about 300W power draw, 90C average temps, 1.440V.

[Attachment: 6GHz 8 pcores 4.3GHz 8 ecores auto.jpg]
 