• Some users have recently had their accounts hijacked. It appears the now-defunct EVGA forums may have been compromised, exposing passwords there, and many members were reusing the same password here. We suggest you UPDATE YOUR PASSWORD and TURN ON 2FA for your account here to further secure it. None of the compromised accounts had 2FA turned on.
    Once you have enabled 2FA, your account will soon be updated to show a badge, letting other members know that you use 2FA to protect your account. This should be beneficial for everyone who uses FSFT.

Linux gaming thread, what games are you currently playing?

When people reply to me I will reply, and nothing you say will stop me from doing so; you need to learn to live with that fact.
Case in point.

As stated: At this point, your trolling is beyond obvious when you have nothing positive to contribute. To quote Douglas Adams:

Human beings, who are almost unique in having the ability to learn from the experience of others, are also remarkable for their apparent disinclination to do so.

Anyway, I'm done feeding the troll.
 
So Linux is useless for the 75-80% of the market unless you like 5-50% performance degradation, gotcha.

Next question (since AMD has nothing in my performance bracket I cannot answer this):
Why does AMD's driver under Windows suck?
That might be why they were down to 5% of sales last quarter, as catering to Linux while ignoring Windows (looking at the market share now) seems like a terrible idea.
Obviously. Can't pull the wool over your eyes.
 
Case in point.

As stated: At this point, your trolling is beyond obvious when you have nothing positive to contribute. To quote Douglas Adams:



Anyway, I'm done feeding the troll.
I have to change my argument, since you insist on killing the messenger and ignoring the facts:
Linux gaming (for the majority of users) costs 5-50% performance and introduces crashes on top of that:
Been playing ME3 Legendary Edition. It has mostly been smooth, except for about three crashes, but only during conversations. I blame it on having the EA version of the game. I probably found it on a really good sale a few years ago through EA's store and bought it there. Still regretting that purchase. I'll probably end up buying it through Steam during a really good sale at some point, just to avoid having to use whatever EA is calling their store/launcher. I'll still have to deal with the EA launcher stub through Steam, but that's usually not an issue.

Kinda weird attitude: "You do not talk about performance degradation - kill the messenger." Almost like facts need to be swept under the rug to preserve the false notion that "Linux is better for everything". It has a religious-cult tone to it.
 
I have to change my argument, since you insist on killing the messenger and ignoring the facts:
Linux gaming (for the majority of users) costs 5-50% performance and introduces crashes on top of that:


Kinda weird attitude: "You do not talk about performance degradation - kill the messenger." Almost like facts need to be swept under the rug to preserve the false notion that "Linux is better for everything". It has a religious-cult tone to it.
It's obvious you have no experience with what I was talking about. First of all, the EA storefront/launcher is a piece of shit on any operating system. Just ask anyone or do your own searching; there are complaints going back years, and it's always been shit no matter what it was called.

Mass Effect games, even the LE, have always had bugs and crashes no matter the OS. I should know, since I've been playing the games since the first one came out and I've never once had a crash-free experience. There's a reason I save quite often.

I have no idea where you're getting performance issues. Framerate is set in-game to max out at 144, which is the refresh rate I have set for my monitors, and not once have I noticed a single issue with it. Even at 1440p and 144 Hz the 6750 XT has no issues running the game. Said card doesn't even get close to maxing out clock speed, and it's a completely smooth experience. The original ME games, and even ME LE, don't need massive performance from hardware to run great.

It's funny how most people complaining about Linux gaming performance never take this into account. It's always about the absolute newest games pushing hardware the hardest and running the highest settings possible. Few people play like that. Not only do they not have the hardware, like 4K monitors, to push the problematic settings, most people don't even play the newest and "greatest" games. Outside of a very few select titles, I have little interest in most new games. Some of it has to do with me, and some of it has to do with the quality of the games which have been coming out. Even then, my interest is constrained by my wallet, which is one reason I don't have any games newer than a few years old. It's easier on my wallet to get older games I haven't played yet when they're on sale than it is to mess with new games, and eventually those new games will become older games with discounts and lower prices. And that doesn't take into account that I have other hobbies which also need funding.

I play games because it's fun to play them. I'm not playing them to max out settings or run the resolution as high as possible. With my hardware and my choice of games, I'm perfectly capable of having fun playing games I find to be fun. I don't chase the latest and greatest, nor do I worry about the current flavor of the month.
 
I play games because it's fun to play them. I'm not playing them to max out settings or run the resolution as high as possible. With my hardware and my choice of games, I'm perfectly capable of having fun playing games I find to be fun. I don't chase the latest and greatest, nor do I worry about the current flavor of the month.
This 100%.

First of all, the EA storefront/launcher is a piece of shit on any operating system. Just ask anyone or do your own searching; there are complaints going back years, and it's always been shit no matter what it was called.

The EA App works perfectly here; the only thing you need to do is copy the files from EA Staged to EA Desktop after updates. For some reason the EA App can download the updates, but it can't seem to apply them. Why the app needs to update so often simply defies all reasonable logic when there's absolutely no change after the update.
 
I just decided to test and share this: the Marvel Rivals benchmark, 1080p, Ultra, so... about 30% less.
I'm not sure if there's something I can do to get a better result.

[Screenshots: Marvel Rivals FHD Ultra, Linux 132 fps vs Windows 176 fps]
 
I just decided to test and share this: the Marvel Rivals benchmark, 1080p, Ultra, so... about 30% less.
I'm not sure if there's something I can do to get a better result.

If you have any active GPU OC apps you're utilizing, try LACT on Linux to provide a similar performance boost on the hardware side. There are plenty of system optimization options for Linux, WINE, and Proton that may work as well.
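As a rough sketch of getting LACT going (assuming a systemd-based distro with LACT packaged; the `lactd` service name follows the LACT README and may differ on your distro):

```shell
# Enable the LACT daemon so applied profiles persist across reboots
sudo systemctl enable --now lactd

# Launch the GUI to set power limits, fan curves, and clock offsets
lact
```

Settings applied in the GUI are then reapplied by the daemon at boot, much like an OC app's startup profile on Windows.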
 
"Oh no, an Nvidia issue, Linux bad........" says the troll.

I think we can have some compassion for Andrea Yates now.
 
I just decided to test and share this: the Marvel Rivals benchmark, 1080p, Ultra, so... about 30% less.
I'm not sure if there's something I can do to get a better result.


You're seeing the descriptor heap issue under VKD3D/DX12. Fixes are leaking under CachyOS, but they're far from fully baked as yet and have to be enabled using a Steam launch option (and may cause crashing and less than ideal performance gains at this point in time).

Remember - Don't switch to Linux under the misconception of the same or more performance. Switch to Linux for the sense of ownership, freedom, and privacy it provides - Some games will perform better, some games will perform worse, some games will perform the same.
 
You're seeing the descriptor heap issue under VKD3D/DX12. Fixes are leaking under CachyOS, but they're far from fully baked as yet and have to be enabled using a Steam launch option (and may cause crashing and less than ideal performance gains at this point in time).

Remember - Don't switch to Linux under the misconception of the same or more performance. Switch to Linux for the sense of ownership, freedom, and privacy it provides - Some games will perform better, some games will perform worse, some games will perform the same.
I just decided to test and share this: the Marvel Rivals benchmark, 1080p, Ultra, so... about 30% less.
I'm not sure if there's something I can do to get a better result.

I'll add to that: as silly as it sounds, maybe try an alt CPU scheduler. (I doubt it makes up the difference, but it might get you a few more frames.)
I know your CPU is in no way being pushed that hard. SCX_LAVD, though, was developed by Valve developers to smooth frame times. It's been gaining popularity in servers, actually, as its efficient load balancing is very good; Meta recently swapped all their servers to LAVD. I think, as Maz says, you're probably seeing the NV overhead issue here.
Doesn't hurt to try, though. Not sure what distro you're on. If you're on Cachy, it's as simple as firing up the SchedExt GUI Manager and turning LAVD on. It's a user-space scheduler; you don't have to recompile or anything.
If you're on a distro using an older kernel, though, you may have to run a different kernel to use SchedExt.
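For those not on CachyOS, a minimal command-line sketch (assuming a recent kernel built with sched_ext and your distro's scx package installed; the `--autopower` flag matches what's mentioned later in the thread):

```shell
# Verify the running kernel was built with sched_ext support
# (requires /proc/config.gz; some distros expose the config elsewhere)
zgrep CONFIG_SCHED_CLASS_EXT /proc/config.gz

# Start the LAVD scheduler in the foreground; Ctrl+C hands
# scheduling back to the kernel's default scheduler
sudo scx_lavd --autopower
```

Because it's a user-space scheduler, stopping the process cleanly reverts you to stock behavior, so it's a low-risk experiment.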
 
The same applies for Linux users in Windows, correct?

As a general observation, I'd say if the discussion were under the Windows-specific subforum, that might be the case. But when we're talking about a thread under Tech News where the OP specifically mentions Linux in comparison to Windows, I'd say you're reaching.

I definitely think you're asking for trouble in questioning the comments of an admin.
 
In other news, I've been getting back into Death Stranding after struggling to grasp the concepts in the game (I feel like I've entered the story in the middle; the whole 'Beach' and 'BTs' thing went over my head for a bit) and I'm really enjoying it. The graphics and ambience of the game, combined with the background music in sections, are just wonderful.

It performs great on my system, no performance issues whatsoever with most graphics settings maxed out.
 
The same applies for Linux users in Windows, correct?

My comment was a suggestion for those who were having a problem with rule #1:

(1) Absolutely NO FLAMING, NAME CALLING OR PERSONAL ATTACKS, NO TROLLING. Mutual respect and civilized conversation is the required norm, this includes personal attacks in signatures.

If you have a problem with the way I moderate the forum, take it up with me or FrgMstr.
And your temper tantrum of reporting 39 posts in the last 30 minutes (which were posted in Tech News, not Operating Systems) earned you a 3-day vacation for abusing the report-post system.
 
Youngin' was playing a little Goat Simulator this morning. I was going to try my hand at Skald: Against the Black Priory later.
 
I'll add to that: as silly as it sounds, maybe try an alt CPU scheduler. (I doubt it makes up the difference, but it might get you a few more frames.)
I know your CPU is in no way being pushed that hard. SCX_LAVD, though, was developed by Valve developers to smooth frame times. It's been gaining popularity in servers, actually, as its efficient load balancing is very good; Meta recently swapped all their servers to LAVD. I think, as Maz says, you're probably seeing the NV overhead issue here.
Doesn't hurt to try, though. Not sure what distro you're on. If you're on Cachy, it's as simple as firing up the SchedExt GUI Manager and turning LAVD on. It's a user-space scheduler; you don't have to recompile or anything.
If you're on a distro using an older kernel, though, you may have to run a different kernel to use SchedExt.
I run LAVD with Auto & --autopower. Works great on my desktop so far.
 
If you have any active GPU OC apps you're utilizing, try LACT on Linux to provide a similar performance boost on the hardware side. There are plenty of system optimization options for Linux, WINE, and Proton that may work as well.
I tried LACT and nvidia-settings; they work similarly in this situation.
I've tried almost all versions in Proton-Qt; the difference is 1–3 FPS. But since this is a Steam game, I can't run it outside of Steam using Wine/Proton.



You're seeing the descriptor heap issue under VKD3D/DX12. Fixes are leaking under CachyOS, but they're far from fully baked as yet and have to be enabled using a Steam launch option (and may cause crashing and less than ideal performance gains at this point in time).

Remember - Don't switch to Linux under the misconception of the same or more performance. Switch to Linux for the sense of ownership, freedom, and privacy it provides - Some games will perform better, some games will perform worse, some games will perform the same.
I tried using VKD3D in the Steam command line, but it didn't help.

I'm just asking if there's a way to fix this game—that's all. Asking is the best way to learn :)

I'll add to that: as silly as it sounds, maybe try an alt CPU scheduler. (I doubt it makes up the difference, but it might get you a few more frames.)
I know your CPU is in no way being pushed that hard. SCX_LAVD, though, was developed by Valve developers to smooth frame times. It's been gaining popularity in servers, actually, as its efficient load balancing is very good; Meta recently swapped all their servers to LAVD. I think, as Maz says, you're probably seeing the NV overhead issue here.
Doesn't hurt to try, though. Not sure what distro you're on. If you're on Cachy, it's as simple as firing up the SchedExt GUI Manager and turning LAVD on. It's a user-space scheduler; you don't have to recompile or anything.
If you're on a distro using an older kernel, though, you may have to run a different kernel to use SchedExt.
Thanks for the information on SCX_LAVD; that's interesting. It can maintain a bit more stable 0.1% lows, which is very good.

Why does AMD's driver under Windows suck?
That's not the case; AMD is clueless and relies heavily on Windows libraries, and when those fail, the driver starts causing "issues." Nvidia uses something that works around this "issue" (I'm not sure exactly what) and is therefore a bit more "stable." Ultimately, both drivers are fine. I use both Nvidia and AMD cards, and reinstalling the driver rarely helps for either, but fixing the Windows system files/libraries/VC++/audio drivers/etc. almost always resolves the issues.
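For reference, the usual Windows system-file repair sequence alluded to above uses two built-in tools from an elevated Command Prompt (repair the component store first, then let SFC restore clean copies from it):

```shell
:: Repair the Windows component store (the source SFC restores from)
DISM /Online /Cleanup-Image /RestoreHealth

:: Scan all protected system files and replace corrupt ones
sfc /scannow
```

A reboot afterwards is often needed before retesting the driver. VC++ redistributables and audio drivers would still need reinstalling separately.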
 
I tried using VKD3D in the Steam command line, but it didn't help.

You're already using VKD3D under Linux. VKD3D is the Vulkan compatibility layer translating calls from DX12 to Vulkan; right now there's an issue with that translation on Nvidia hardware that's in the process of (hopefully) being resolved.
 
You're already using VKD3D under Linux. VKD3D is the Vulkan compatibility layer translating calls from DX12 to Vulkan; right now there's an issue with that translation on Nvidia hardware that's in the process of (hopefully) being resolved.
Yes, if Nvidia decides to improve its driver.
There's a similar issue on Windows with the AMD driver for Assassin's Creed Origins/Odyssey: when using DXVK, the frame rate jumps significantly (~20%+). But they never decided to fix it.

Anyway, thanks to everyone for the help :)
 
The issue isn't specifically a driver issue; it's mostly a Vulkan issue. Intel GPUs are also affected by the Vulkan issue.
Well, that isn't really the way to put it, right?
The issue is that Nvidia and Intel use an inferior method of descriptor handling in hardware.
AMD used to have the same issue and the same performance hit with DXVK. Valve simply realized AMD hardware was capable of editing descriptors directly, as it doesn't use a heap. DXVK used to have a 20% hit on AMD as well, until Valve used AMD's (honestly, forgotten) superior programmable-descriptor design that has carried over since Vega.

Yes, they have added new Vulkan extensions, which should help. I hope it zeros out the translation debt on NV. Really, though, as I understand it, it won't; it will maybe halve it. Hopefully NV gets it 100% rolled out soon so we can find out.
 
I still blame Nvidia. If they were really interested, they'd put more people on the Linux driver team and fast-track some fixes; it seems like it's just one (or two) dudes posting a little (but sorely needed) fix every three months or so.
 
I still blame Nvidia. If they were really interested, they'd put more people on the Linux driver team and fast-track some fixes; it seems like it's just one (or two) dudes posting a little (but sorely needed) fix every three months or so.
There is no money in that. Nvidia is on another track these days.
 
There is no money in that. Nvidia is on another track these days.
I agree; at this point it's probably in NV's best interest to stop selling GPUs to consumers, which could push more of their gaming income to game streaming à la GeForce Now. That way they keep their big-chip profit margins from corporations and collect a monthly fee from always-online game streaming.
 
Well, that isn't really the way to put it, right?
The issue is that Nvidia and Intel use an inferior method of descriptor handling in hardware.
AMD used to have the same issue and the same performance hit with DXVK. Valve simply realized AMD hardware was capable of editing descriptors directly, as it doesn't use a heap. DXVK used to have a 20% hit on AMD as well, until Valve used AMD's (honestly, forgotten) superior programmable-descriptor design that has carried over since Vega.

Yes, they have added new Vulkan extensions, which should help. I hope it zeros out the translation debt on NV. Really, though, as I understand it, it won't; it will maybe halve it. Hopefully NV gets it 100% rolled out soon so we can find out.
Nvidia and Intel use descriptor heaps, as descriptor heaps were used under OGL, and Vulkan was supposed to be the evolution of OGL. Even DX12 makes better use of descriptor heaps compared to Vulkan.

Personally, I never believed descriptor heap fixes alone would immediately provide 100% fps parity with Windows (bearing in mind that not all games run at 100% fps parity or higher under Linux on AMD either, just the cherry-picked titles used by tech tubers). Right now, based on a certain tech tuber's video comparing 20 titles between Windows and Linux on Nvidia hardware, the combined average fps drop is ~13%; if that figure can be halved, it drops to ~6.5%, which is a huge improvement and nothing worth complaining about.

Running Nvidia hardware here under Linux, I never feel like my games aren't performing adequately. However, I don't deliberately keep booting into Windows and comparing fps like it's some kind of drag race between the two operating systems. I use Linux for the sense of ownership, privacy, and customization it provides - the ability to game with adequate performance is just a bonus that keeps getting better as time advances.
 
Nvidia and Intel use descriptor heaps, as descriptor heaps were used under OGL, and Vulkan was supposed to be the evolution of OGL. Even DX12 makes better use of descriptor heaps compared to Vulkan.

Personally, I never believed descriptor heap fixes alone would immediately provide 100% fps parity with Windows (bearing in mind that not all games run at 100% fps parity or higher under Linux either, just the cherry-picked titles used by tech tubers). Right now, based on a certain tech tuber's video comparing 20 titles between Windows and Linux on Nvidia hardware, the combined average fps drop is ~13%; if that figure can be halved, it drops to ~6.5%, which is a huge improvement and nothing worth complaining about.

Running Nvidia hardware here under Linux, I never feel like my games aren't performing adequately. However, I don't deliberately keep booting into Windows and comparing fps like it's some kind of drag race between the two operating systems. I use Linux for the sense of ownership, privacy, and customization it provides - the ability to game with adequate performance is just a bonus that keeps getting better as time advances.
Indeed. I don't know why people sweat 5-10% so much. We make it up in so many other ways. Who cares if a game running at 150 fps could maybe be running at 165 fps on Windows? Agree completely.

AMD was forward-thinking in their swap to fully programmable descriptors, and I expect at some point Nvidia will also move to them. AMD did it thinking multi-architecture compute would be more of a thing way back with Vega, and it actually is starting to be. So I expect heaps will be phased out in hardware at some point. Qualcomm (and I'm 95% sure Apple) have both moved to programmable descriptors as well.

I do hope the heap extensions will help. The real issue, though, is that the game is loading into heaps for DX12, and DXVK needs to change the headers to Vulkan headers. The extensions will save one step, but not all of them. Without each descriptor being individually changeable, you still have to do reloads. Anyway, fingers crossed. On-par NV performance would be huge for Linux.
 
I do hope the heap extensions will help. The real issue, though, is that the game is loading into heaps for DX12, and DXVK needs to change the headers to Vulkan headers.
Totally. I too hope the descriptor heap implementations provide a really tangible benefit, as it would help Linux adoption considerably. However, things aren't as bad regarding Nvidia under Linux as tech tubers looking for clicks would have you believe.
 
So the CS2 Animgraph update has finally been released under the main branch and isn't limited to the beta branch anymore, and wow, what an improvement.

A 14 fps increase in min fps on the CS2 Bench map shows that the ever-present CPU bottleneck has finally been removed, evidenced by the fact that my GPU now hits 95-99% utilization almost constantly, which wasn't the case in the past. But the bench results only tell half the story. When connected to a server filled with players, GPU utilization no longer drops in proportion to the number of real players present; in a full server my GPU still hits 95-99% utilization almost constantly, making the game far more playable on a really busy server. In the past, GPU utilization would drop to ~70% in places when connected to a busy server, as a result of the game's CPU bottleneck.

A massive improvement, well done Valve!

Before the Animgraph update (1200p, all settings maxed out):

[Screenshot: CS2 bench, GPU memory overclock]


After the Animgraph update (1200p, all settings maxed out):

[Screenshot: CS2 bench after the Animgraph update]
 
Totally. I too hope the descriptor heap implementations provide a really tangible benefit, as it would help Linux adoption considerably. However, things aren't as bad regarding Nvidia under Linux as tech tubers looking for clicks would have you believe.
It is in a lot better place than it has ever been, for sure. As much as I wish they would still properly open-source their drivers, at least NV has worked with the various projects to make things painless for end users. (And yes, I know the old proprietary DKMS was no sweat for folks like us... but it was a hurdle for a lot of brand-new-to-Linux people.)

You could be correct, and the next extensions could 100% zero out the overhead. I'm not knowledgeable enough on which part of the pipe is really the issue. I know it's like a 4-step process of pushing and poking memory as it is, and the extensions should drop that in half, to 2, vs AMD's 1. However, that 1 extra step might really be a minor thing; I'm not smart enough to know. Sometimes an extra step is no big deal. If the half of the instructions they can cut with the next extension was 90% of the issue, we might be golden.

Have you done any testing with the Cachy env variables? Or does it still not even matter with the current NV driver?
 
Tweaked things a little more, another Catzilla 1440p run. Still running off a mechanical HDD. EDIT: And I'm running KDE Neon 6.6.4; I'm sure running CachyOS would yield higher results:

[Screenshot: Catzilla 1440p best run]
 
Nobu, Mazzspeed
I'd better write here.

It's Wayland. I tried gamescope and gamescope-plus with many different settings, but without success :)

gamescope --expose-wayland -w 2560 -h 1080 -W 3440 -H 1440 -S lanczos -f -- %command%

It's best to launch the game at a high resolution that extends beyond the monitor. So yes, it scales, but it doesn't fit; it can't launch, and it just crashes or displays this error.

 
Worked for me with Timberborn on a B570 with 10GB VRAM. Used this command:
gamescope -W 7680 -H 4320 -f -- %command%

That's loading GOG Timberborn with Steam's Wine, but it should work for you too. For some reason Heroic isn't launching GOG's Timberborn; probably fscked the Wine prefix or something.



Obviously, I don't have an 8K TV here; it does 4K60, which is what it's set to right now, or 1080p240 IIRC.
 
Installed LEGO: The Hobbit (Steam version)

Set launch options as above, then configured it in game to 7680x4320 and disabled bloom and AA. It runs very well on the B570, but you can tell it's at its limit running that resolution. I didn't have the HUD enabled, but while it was usually smooth, there were times when it skipped frames, especially when you get to the forge.

After I got past that part to the first save point, I saved and quit. I opened up nvtop, which surprisingly shows the Arc GPU's core and memory utilization, started the game, and loaded the save again. The core was maxed (100%, but it said 47% effective or something like that); memory was at 7-8GB.
 
I beat this on the Series X, running through it on the sTeAm MaChInE now.

 
Worked for me with Timberborn on a B570 with 10GB VRAM. Used this command:
gamescope -W 7680 -H 4320 -f -- %command%

That's loading GOG Timberborn with Steam's Wine, but it should work for you too. For some reason Heroic isn't launching GOG's Timberborn; probably fscked the Wine prefix or something.


Obviously, I don't have an 8K TV here; it does 4K60, which is what it's set to right now, or 1080p240 IIRC.
lol
It's working now; I probably had to restart the PC after swapping gamescope for gamescope-plus, or idk...
 
So I finally updated to the Nvidia 595.58.03 open drivers today on the KDE Neon system and decided to give the new and still-under-development Vulkan descriptor heap fixes a go. So far the only non-native Vulkan title I've tested is CP2077, as it has a handy built-in benchmark and I have results from a previous run at the same settings; the results are simply amazing, with the GPU sitting at an almost constant 95-99% utilization now.

Hardware used is identical to the specs in my sig. Resolution is 1200p, all graphics settings are maxed out, and full path-traced RT is enabled along with DLSS4 Performance, FG, and Ray Reconstruction, running Proton-CachyOS Latest with PROTON_ENABLE_WAYLAND=1 and PROTON_VKD3D_HEAP=1 added to the Steam launch options for the game. Results as follows.

Pre descriptor heap fix and 580 drivers:

[Screenshot: CP2077 benchmark, 580 drivers]

Post descriptor heap fix and 595 drivers:

[Screenshot: CP2077 benchmark, descriptor heap fix]

Comparing Min, Max and Avg figures, results as follows:

Min: +24.33 fps
Max: +19.18 fps
AVG: +13.74 fps.

The notable increase in min fps indicates the CPU bottleneck is finally being vastly reduced, which is also backed up by the notable increase in GPU utilization.
 