Is it normal for a 2003 game to use 100% of one CPU core on a modern CPU?


Jan 18, 2016
The CPU usage is always 100% because of how games were programmed back then. They had tight code loops that executed work as it came in, but when there was nothing to do, the loop would spin waiting for the next task rather than yield to the OS or halt the CPU. This means the CPU was always 100% busy doing something, even if that something was just waiting. The pipeline stayed full even when the instructions in it were completely wasted and useless. This is why, no matter what CPU you have - a 386 or an i9-10980XE - that one core will always sit at 100% usage.
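The busy-wait pattern described above can be sketched in a few lines of Python (illustrative only - a 2003 game would be C/C++, but the loop shape is the same): one loop spins flat out until the next frame deadline, the other yields the core back to the scheduler while waiting.

```python
import time

def busy_wait_frame(deadline):
    """2003 style: spin until the frame deadline.
    The core executes this loop continuously, so the OS reports 100% usage
    even though no useful work is happening."""
    spins = 0
    while time.perf_counter() < deadline:
        spins += 1          # burning cycles, doing nothing
    return spins

def sleeping_frame(deadline):
    """Modern style: hand the core back to the OS while waiting,
    letting the CPU clock down or run other threads."""
    naps = 0
    while time.perf_counter() < deadline:
        time.sleep(0.001)   # core can idle here
        naps += 1
    return naps

FRAME = 0.05  # a 50 ms "frame" for demonstration
spins = busy_wait_frame(time.perf_counter() + FRAME)
naps = sleeping_frame(time.perf_counter() + FRAME)
print(spins, naps)  # the busy loop iterates vastly more times for the same wait
```

Both loops wait the same 50 ms, but the first one does it by executing as many iterations as the core can manage, which is exactly what the OS counts as 100% CPU usage.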

This is also why Windows 9x had crap idle behavior and caused thermal issues on some machines with broken ACPI tables: instead of letting the CPU idle down, Windows 9x would fall back to a code loop that did the same thing.
Great post.
I would assume the game has a loop in which it checks whether there is something to do. The CPU could run at 999 YHz and the game would still use 100% of the thread's CPU time :)
Thankfully for modern CPU users, this doesn't mean any single core is overheating. The scheduler shifts threads between physical cores, so a single thread behaving like this gets spread nicely across all available cores.

One thing you can do, if you think burning 100% of a thread's time on a modern CPU is wasteful, is to change the process affinity to force the game to run only on Efficiency cores. If the game is wasting CPU time by sitting in a loop anyway, this should not affect its performance.
Personally, I would not bother unless doing so made the CPU cooler quieter.
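The affinity change can also be scripted. Here is a minimal sketch using Linux's `os.sched_setaffinity` (on Windows the equivalent is Task Manager's "Set affinity" dialog or the `SetProcessAffinityMask` API). Which logical CPU numbers actually map to E-cores is machine-specific, so the "highest-numbered half" heuristic below is purely an assumption; check `lscpu` or your CPU's documentation first.

```python
import os

# Pin a process to a subset of logical CPUs (Linux API shown).
pid = os.getpid()  # or the game's PID, e.g. found via `pidof`
available = sorted(os.sched_getaffinity(pid))

# ASSUMPTION: treat the highest-numbered half of the CPUs as E-cores.
# Real E-core indices vary by CPU model -- verify before using this.
ecores = set(available[len(available) // 2:])

os.sched_setaffinity(pid, ecores)
print(sorted(os.sched_getaffinity(pid)))  # the process is now confined to these CPUs
```

Run against the game's PID (with sufficient privileges), this confines the busy-wait thread to the chosen cores, so the spinning no longer occupies a Performance core.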