Too bad the monitor bottleneck is not considered. "CPU bottleneck" is too general a term anyway: there is always something limiting performance, whether it's game code, the API, RAM speed, background processes, and so on.
What I find interesting is that in YouTube Far Cry 6 RT results, the 4090 at 4K with a 12900K or other high-end CPU does around 100-110 fps, while my 3090 does 72 fps at 4K with ultra settings + HDR and a 5800X3D. If I set the in-game render resolution to 0.7 (70%) I get around 110 fps; at 80% I get 96 fps, and the quality looks better than FSR to me. The game is already very smooth without scaling the render resolution, so it's a moot point. Now the question is: is adding 30 fps to a game that already plays smoothly worth $1600? Not to me. The 4090's RT may have improved 2x-3x, but RT games are still mostly rasterized, so the gains in RT games are smaller than the improvement in RT performance itself.
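For anyone curious how render-resolution scaling maps to actual pixel counts: the scale factor applies per axis, so a 0.7 scale renders only 49% of native pixels. A quick sketch of that arithmetic (just the math behind the settings quoted above; real fps never scales perfectly with pixel count):

```python
# Render-scale vs. pixel count at 4K.
# The scale factor applies to width and height separately,
# so the pixel count shrinks by scale squared.

NATIVE_W, NATIVE_H = 3840, 2160  # 4K native resolution

def scaled_resolution(scale):
    """Return (width, height, fraction of native pixels) for a render scale."""
    w, h = int(NATIVE_W * scale), int(NATIVE_H * scale)
    return w, h, (w * h) / (NATIVE_W * NATIVE_H)

for scale in (1.0, 0.8, 0.7):
    w, h, frac = scaled_resolution(scale)
    print(f"scale {scale:.1f}: {w}x{h} ({frac:.0%} of native pixels)")
```

So 70% scale is rendering roughly half the pixels of native 4K, which is why the fps jump from 72 to ~110 is so large.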
As a note, the 6900 XT performs virtually identically to the 3090 with RT on. Go figure.