Starfield’s Performance on Nvidia’s 4090 and AMD’s 7900 XTX analyzed by Chips and Cheese

Probably because no one cares. It's multiplayer, and not as many people play MP games on a 4K display to begin with. They're playing on high refresh rate monitors at probably 1440p maximum. And even if they did, the 1% lows on both of them are nearly at 144 FPS. It's also an AMD-favored game to begin with, yet the 7900XTX is like 5% faster.
LOL considering Call of Duty is usually one of the top multiplayer games every year. Saying that no one cares is absolutely wrong.

This just shows that you want to ignore that it is entirely possible it's the GPU architecture that makes AMD cards run better than Nvidia when it comes to specific games.

Sorry after this answer, no one can even take you seriously anymore.
 
LOL considering Call of Duty is usually one of the top multiplayer games every year. Saying that no one cares is absolutely wrong.

This just shows that you want to ignore that it is entirely possible it's the GPU architecture that makes AMD cards run better than Nvidia when it comes to specific games.

Sorry after this answer, no one can even take you seriously anymore.

I like how you completely ignored the rest of the post. Don't worry, I wasn't ever taking you seriously to begin with nor do I give a flying fuck about your opinions. Blocked.
 
I think the key qualifier you're missing is "held together with bubble gum and has duct-taped-on features". Other engines clearly have more and better development, because most games using them have better performance while offering better graphics, and often worlds that are just as big if not bigger. Red Dead Redemption 2 stands out in my mind; even to this day, when I fire it up, I still think it looks great... although it's also running on a proprietary engine. Apparently a proprietary engine that's done right.
We know Bethesda games aren't built on the best graphics engine, but it's still just like any other engine in that it was built on top of engines that go as far back as the '90s. How many people know that the Source 2 engine is based on the Quake engine? World of Warcraft is actually running on a heavily modified Warcraft 3 engine.
The more I play Starfield, the more I wonder how much of that 10+ years of development they wasted on optimizing their shitty engine vs. just picking up a more standard one and learning it. Because I'm wondering where exactly all of the other development went, if not into that. The world is otherwise a bit disappointing.
Besides money, there are other reasons they kept their own engine. Bethesda games are known for being friendly when it comes to mods, so that may have been taken into account.
I meant exactly what I said, and I understand exactly what I said. If you're optimizing your compute units, you optimize them for the kind of workloads that they will be computing. Bethesda titles have, afaik, always essentially run like shit, versus their graphics level. If most titles out there don't compute things like a Bethesda title does, then are you going to tune your hardware to compute it? No.
I'm not going to pretend like I know what was done in Starfield that makes it run better on AMD, but other people have, and they've done a much better job of it than what you're saying.
That's like saying that basically everyone else is using their pipe wrench to do the things a pipe wrench was designed to do, but Bethesda likes using it to hammer nails instead, so maybe we should design our pipe wrenches to look more like hammers instead, because Bethesda just likes hammering shit for no reason. No one is going to do that. AMD's pipe wrench is just closer in shape to a hammer this time around, and both pipe wrenches still work for hammering those nails.
This is proof that Nvidia owners want to believe they have the best, because they paid the most. That hasn't always been the case. There's always going to be an edge case where hardware that hasn't been the fastest in most games can be the fastest in one game. AMD cards have advantages, and if done right those can give them the edge.
Everyone just deals with it because FSR (and DLSS with a mod) come in to save the day.
DLSS is coming to the game, and nobody cares because it's 2023 and new games will demand more powerful hardware. This year we've seen people upset over a lot of new games because their new hardware can't keep up. What used to be common sense in the PC gaming world is no longer common.
A vast majority of them, more like.
Yes, that would be most games.
The 1% lows on the 4090 at 4K were higher than the 7900XTX's average in this review. This is with the reviewers not enabling Ray Tracing except when required. So these results are a bit skewed because Ray Tracing sort of really screwed with AMD in one game, where it was an absurd amount slower. It's to the point where some reviews of the 7900XTX don't bother comparing it to the 4090 to begin with. They just stop at the 4080. Which makes sense, they're in different price brackets. But at the same time, the 4090 is better than the 4080 to a degree that's actually commensurate with its price tag (actually more). Pound for pound, its price is actually usually justified.
Nvidia sponsors a lot more games as well, so it shouldn't surprise anyone that the RTX 4090 is the fastest in most of those games. Also, no amount of justification can make the RTX 4090 seem like a good deal. You paid $1,600 for the best, even though we're now approaching used car prices for a graphics card. People don't buy an RTX 4090 because it'll last them 10 years, or because a game they played needs it. People buy it because the 4090 is the fastest, but when it loses to a cheaper AMD card the ego takes a hit.
Probably because no one cares. It's multiplayer, and not as many people play MP games on a 4K display to begin with. They're playing on high refresh rate monitors at probably 1440p maximum. And even if they did, the 1% lows on both of them are nearly at 144 FPS. It's also an AMD-favored game to begin with, yet the 7900XTX is like 5% faster.
It's like sponsored games always favor the sponsor's hardware? This isn't anything new.
 
Besides money, there are other reasons they kept their own engine. Bethesda games are known for being friendly when it comes to mods, so that may have been taken into account.

If your point is that Bethesda's modding ecosystem and toolkit are part of what keeps them chained to their antiquated engine... well, that may be a good point. Bethesda games pretty much survive on their modders. Without modding, they're usually pretty crappy and disappointing, to put it mildly (as is this game). So this is a good point, and one that I didn't think of.
We know Bethesda games aren't built on the best graphics engine, but it's still just like any other engine in that it was built on top of engines that go as far back as the '90s. How many people know that the Source 2 engine is based on the Quake engine? World of Warcraft is actually running on a heavily modified Warcraft 3 engine.
None of these history factoids change how well each engine actually runs or looks in practice, though. I don't care about where something came from. I care about the end result.

This is proof that Nvidia owners want to believe they have the best, because they paid the most. That hasn't always been the case. There's always going to be an edge case where hardware that hasn't been the fastest in most games can be the fastest in one game. AMD cards have advantages, and if done right those can give them the edge.
My point is that for the large majority of games, the 4090 performs much better. Many of these games look better than this title. It is objectively the stronger piece of hardware. Logically speaking, that means this title does things differently from the vast majority of modern titles. But the majority of modern titles perform better while having better graphics. The nitty gritty of why or how, or Red vs Green, is irrelevant. My point is that this game does things in a different way, and it's generally for no one's benefit. Basically every game except a tiny subset of AMD sponsored titles shows that. This title does things differently, but to no one's benefit. Well, except perhaps modding.

Also I don't have a 4090 if that's what you're implying.

It's like sponsored games always favor the sponsor's hardware? This isn't anything new.

That's not the point. My point to them was that most people aren't playing multiplayer shooters, and especially not competitively, on 4K displays to begin with. You can get a 240Hz 1440p monitor for like $370. And you can actually drive that framerate with either card. A 240Hz 4K display is like what, $1k? They asked why most people don't seem to care. I told them why most people wouldn't care. Because there's basically no reason to actually care when both are nearing 144 FPS, which is about the fastest most people's monitors are to begin with, and those monitors are probably not 4K to begin with. Not to stereotype, but it's CoD. Which enthusiast builds a 4K gaming machine with a $1.1k 4K display to play CoD? The very rare one. I view it as a "lowest common denominator" IP.

Although if AMD helped optimize Starfield and it didn't even come with DLSS out of the box, it might be a good excuse to just handwave and ignore its performance to begin with.

Everyone just likes to get hung up on Team Red vs Team Green. The second you support "a side" you must be "on their team". If an AMD card had Nvidia's 4090 performance (and driver features) right now, and the situation was flipped, my arguments would be exactly the same, but in favor of something being weird because AMD's card wasn't working as well as it should. It could be Intel for all I care. It's like let's just get into the weeds of who's in what tribe while ignoring that the game runs like shit given its graphics.
 
My point is that for the large majority of games, the 4090 performs much better. Many of these games look better than this title.
Bethesda games are known for being the king of open world games. There's a lot going on in their games compared to other open world titles. From the TechSpot article you linked, the 7900 XTX and 4090 are running at nearly the same frame rate at 1440p in some games. Assassin's Creed Valhalla is faster on the 7900 XTX, as is Modern Warfare II. Hitman 3 runs identically at 1440p. Some of these games do perform better on the 4090 at 4K, but again it's not a clear 100% win for the 4090.
It is objectively the stronger piece of hardware.
That's debatable.
Logically speaking, that means this title does things differently from the vast majority of modern titles.
Most of the games Nvidia tends to win are older games.
But the majority of modern titles perform better while having better graphics.
Except for Jedi Survivor, yet another AMD sponsored game.
The nitty gritty of why or how, or Red vs Green, is irrelevant. My point is that this game does things in a different way, and it's generally for no one's benefit.
Seems to benefit AMD users. Without knowing what was done to the game itself, you can't assume that it can be done better.
Basically every game except a tiny subset of AMD sponsored titles shows that. This title does things differently, but to no one's benefit. Well, except perhaps modding.
Keep in mind that 2023 has been a shit show for Nvidia for most of their GPUs, with the exception of the 4090. If a game uses more than 8GB of VRAM, then it's a badly optimized game. If a game doesn't come with DLSS, then AMD has paid the developers not to include it. Again, it isn't just Starfield that performs better on AMD hardware, but also Jedi Survivor. We had a huge thread about Jedi Survivor and how it's a bad game, because it performs poorly on the RTX 4090. 2023 is the year Nvidia owners are having buyer's remorse.

https://youtu.be/PSjSDapjVGc?si=pDmF3NU3JEpZrJ4e
Everyone just likes to get hung up on Team Red vs Team Green. The second you support "a side" you must be "on their team". If an AMD card had Nvidia's 4090 performance (and driver features) right now, and the situation was flipped, my arguments would be exactly the same, but in favor of something being weird because AMD's card wasn't working as well as it should. It could be Intel for all I care. It's like let's just get into the weeds of who's in what tribe while ignoring that the game runs like shit given its graphics.
AMD owners are constantly in a situation where it's flipped. It's just expected that AMD cards perform worse than Nvidia. Nvidia doesn't need defending, not with the pricing they ask for their cards. As soon as AMD offered the 7800 XT, the price of the 4060 Ti 16GB dropped $50. The more AMD wins in games like Starfield and Jedi Survivor, the more likely Nvidia will have to drop the price of the 4090. Starfield brought out the worst in fanboys, with PlayStation fans review-bombing the game while Xbox fans do as much as they can to give the game a 10/10. Now we have PC fanboys fighting because there's no DLSS and it runs worse on Nvidia than on AMD graphics cards. From what I hear the game is rather boring.
 
AMD's lineup this generation gives better, at times significantly better, FPS per dollar. Comparing a $1,600 card to a $1,000 card does not speak well for Nvidia's lineup, lol. I just laugh when the 4090 is compared to the 7900 XTX; there should be no comparison there to begin with.
 
I meant exactly what I said, and I understand exactly what I said. If you're optimizing your compute units, you optimize them for the kind of workloads that they will be computing. Bethesda titles have, afaik, always essentially run like shit, versus their graphics level. If most titles out there don't compute things like a Bethesda title does, then are you going to tune your hardware to compute it? No.
Huh? Bethesda titles don't run like shit. Their problem has always been the length of QA testing. The engines they use have always been pretty damn solid. Starfield, for example, is CPU bound. Really think about that for a second, since it can utilize 8 cores. The other engines they use, some of them coming from John Carmack himself, run more efficiently than most engines ever created. When it comes to CPU utilization, Starfield is actually pretty good.


That's like saying that basically everyone else is using their pipe wrench to do the things a pipe wrench was designed to do, but Bethesda likes using it to hammer nails instead, so maybe we should design our pipe wrenches to look more like hammers instead, because Bethesda just likes hammering shit for no reason. No one is going to do that. AMD's pipe wrench is just closer in shape to a hammer this time around, and both pipe wrenches still work for hammering those nails. Everyone just deals with it because FSR (and DLSS with a mod) come in to save the day.
You mean that data is being transferred in accordance with a standard. In this case it's AMD's wave32. This is no different than what happens if nVidia were to sponsor a title. This is just like when a game is centered around CUDA and the architecture which supports it. It's not more "complicated" than that. The difference is that AMD rarely if ever gets to control the development of a game like Starfield, while nVidia does all of the time. Like Portal (which actually DOESN'T support FSR), or Quake RTX, and especially Cyberpunk 2077. Go ahead and load up Portal Remix RTX on an AMD card... if you get to load it at all.
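As a rough illustration of what "code written around a specific SIMD width" can mean, here is a minimal CUDA sketch (purely an analogy; this is not Starfield's shader code or AMD's wave32 ISA, and the kernel is invented for the example). The reduction below bakes a 32-wide execution group into its shuffle pattern, and shaders authored with a particular wavefront width in mind tend to map more naturally onto hardware that matches that width.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy reduction that assumes a 32-wide execution group (a CUDA warp here,
// loosely analogous to an RDNA wave32 wavefront). The shuffle loop below
// hard-codes that width into the algorithm.
__global__ void warpSumKernel(const float* in, float* out) {
    float v = in[threadIdx.x];
    // Butterfly-style reduction across the 32 lanes of the warp.
    for (int offset = 16; offset > 0; offset >>= 1)
        v += __shfl_down_sync(0xffffffffu, v, offset);
    if (threadIdx.x == 0) *out = v;   // lane 0 holds the full sum
}

int main() {
    float h_in[32], h_out = 0.0f;
    for (int i = 0; i < 32; ++i) h_in[i] = 1.0f;   // expected sum: 32
    float *d_in, *d_out;
    cudaMalloc(&d_in, sizeof(h_in));
    cudaMalloc(&d_out, sizeof(float));
    cudaMemcpy(d_in, h_in, sizeof(h_in), cudaMemcpyHostToDevice);
    warpSumKernel<<<1, 32>>>(d_in, d_out);
    cudaMemcpy(&h_out, d_out, sizeof(float), cudaMemcpyDeviceToHost);
    printf("sum = %.1f\n", h_out);
    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}
```

The same reduction could be written width-agnostically; code that assumes one vendor's preferred width will still run on the other's hardware, it just may not map as neatly.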
And most of them being nVidia titles. Also the XTX is priced to compete with the 4080 not the 4090.
The 1% lows on the 4090 at 4K were higher than the 7900XTX's average in this review. This is with the reviewers not enabling Ray Tracing except when required. So these results are a bit skewed because Ray Tracing sort of really screwed with AMD in one game, where it was an absurd amount slower. It's to the point where some reviews of the 7900XTX don't bother comparing it to the 4090 to begin with. They just stop at the 4080. Which makes sense, they're in different price brackets. But at the same time, the 4090 is better than the 4080 to a degree that's actually commensurate with its price tag (actually more). Pound for pound, its price is actually usually justified.
Who has ever said the 4090 wasn't worth it? It's easily the most desirable / reasonable card nVidia has released. The others? Not so much. The 7900xtx is smaller than a 4090, by A LOT. If it ever ends up in a situation where it's close to or beating it, that goes to show just how efficient the architecture can be.
I also believe Ray Tracing is frankly the future of game design:
https://computergraphics.stackexcha...of-rasterization-over-ray-tracing/10962#10962
Apparently it's easier to implement than all the tricks a company has to do with rasterization. That being the case, I can see more and more games saving time on development by switching to RT. Apparently that's what Hogwarts Legacy did (which is why it looks much worse with RT off). With the next Nvidia gen (and arguably already with the 4090), RT is probably going to become an actual viable thing to entirely base a game around. And then I'm sure in another 5-10 years the next Bethesda game will come out and somehow have a shitty ray tracing implementation, too. That's neither here nor there, though.
RT has been "the future" since before the RTX 1080, which sucked at it. Ray-tracing itself in games is pointless. Why? Because lights and shadows are calculated/consumed differently between a human brain and a computer. The amount of data created by light at those rates far exceeds what the human brain can compute. The human brain can easily be tricked when it comes to lights and shadows because it's not designed to determine what an accurate ray of light should look like. This happens routinely (even in games played with children), which is why whenever you see an explanation of it, it's through freeze frames and a lengthy explanation of the reflection you're looking at.

The human brain is designed to pick out anomalies. Things that don't belong. It's not designed to tell the difference between an accurate ray of light and a ray of light that's 2 centimeters off. It cannot do that. PERIOD. When you're talking about raster, it's about getting as close to realism as possible to trick the brain without absorbing its penalty, and it's pretty damn past that at this point. Path-tracing is built on this way of thinking as well.
 
I went to sleep and I woke up, and I'm kind of not starting off my morning with debating about Starfield and Nvidia on an essay vs essay basis. You folks win by attrition on that front. I've seen other topics and you basically never stop posting.

On the other hand:
RT has been "the future" since before the RTX 1080, which sucked at it. Ray-tracing itself in games is pointless. Why? Because lights and shadows are calculated/consumed differently between a human brain and a computer. The amount of data created by light at those rates far exceeds what the human brain can compute. The human brain can easily be tricked when it comes to lights and shadows because it's not designed to determine what an accurate ray of light should look like. This happens routinely (even in games played with children), which is why whenever you see an explanation of it, it's through freeze frames and a lengthy explanation of the reflection you're looking at.

The human brain is designed to pick out anomalies. Things that don't belong. It's not designed to tell the difference between an accurate ray of light and a ray of light that's 2 centimeters off. It cannot do that. PERIOD. When you're talking about raster, it's about getting as close to realism as possible to trick the brain without absorbing its penalty, and it's pretty damn past that at this point. Path-tracing is built on this way of thinking as well.

RTX 1080? That doesn't exist, unless it's some unreleased variant of the 1080 GTX...

My point, which all of this basically completely ignores, is that afaik RT is easier to implement on a per-scene basis for a developer. As I understand it, much easier. Because it's just a lot of calculations with an algorithm, not a bunch of dirty tricks. That would allow them to spend time on other aspects of the game. Someone told me that this is essentially what the developer of Hogwarts Legacy did with it. And it looks quite good with Ray Tracing on, and runs reasonably on a 4090. It has a lot of nice, intricate visual details, especially within Hogwarts. Although it requires a very strong CPU to keep up due to its CPU design decisions. The fact that RT also looks better is just a cherry on top. The entirety of your ramble here is really just giving me question marks above my head. AMD is also notably terrible at ray tracing. As in, at least a generation behind. They've clearly been playing catch-up. That doesn't mean Ray Tracing is dead. The 4090 already does it fine, and I believe by next gen we will most definitely have cards capable of it, maybe even among the midrange. I also believe that we will have more developers leveraging it over time. Because if it's easier and it looks better, and AMD people can just turn it off... who cares.
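To make the "one algorithm instead of a pile of tricks" point concrete, here is a minimal toy sketch (a hypothetical CUDA kernel against a single hard-coded sphere, not code from any real engine or API): a ray-traced shadow query boils down to one occlusion test per shading point, while the raster equivalent needs a separate shadow-map depth pass plus bias and resolution tuning.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };

__device__ Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
__device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns true if a ray from point 'p' toward 'lightPos' hits the sphere
// before reaching the light, i.e. the point is in shadow.
__device__ bool inShadow(Vec3 p, Vec3 lightPos, Vec3 center, float radius) {
    Vec3 d = sub(lightPos, p);           // un-normalized ray toward the light
    Vec3 oc = sub(p, center);
    float a = dot(d, d);
    float b = 2.0f * dot(oc, d);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * a * c;
    if (disc < 0.0f) return false;       // ray misses the sphere entirely
    float t = (-b - sqrtf(disc)) / (2.0f * a);
    return t > 1e-4f && t < 1.0f;        // hit strictly between point and light
}

__global__ void shadowKernel(bool* out, Vec3 lightPos, Vec3 center, float radius) {
    // One shading point per thread along the x axis of a ground plane.
    Vec3 p = {(float)threadIdx.x, 0.0f, 0.0f};
    out[threadIdx.x] = inShadow(p, lightPos, center, radius);
}

int main() {
    const int n = 8;
    bool h[n];
    bool* d;
    cudaMalloc(&d, n * sizeof(bool));
    Vec3 light  = {4.0f, 10.0f, 0.0f};   // point light above the scene
    Vec3 center = {4.0f, 5.0f, 0.0f};    // occluding sphere
    shadowKernel<<<1, n>>>(d, light, center, 1.0f);
    cudaMemcpy(h, d, n * sizeof(bool), cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i) printf("x=%d %s\n", i, h[i] ? "shadow" : "lit");
    cudaFree(d);
    return 0;
}
```

Production engines obviously trace against full acceleration structures through DXR or Vulkan RT rather than one sphere, but the per-query logic stays about this simple, which is the "easier to implement" argument in a nutshell.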
 
AMD is also notably terrible at ray tracing. As in, at least a generation behind. They've clearly been playing catch-up. That doesn't mean Ray Tracing is dead. The 4090 already does it fine, and I believe by next gen we will most definitely have cards capable of it, maybe even among the midrange. I also believe that we will have more developers leveraging it over time. Because if it's easier and it looks better, and AMD people can just turn it off... who cares.
This may have been true with the 6000 series cards, but the 7000 series seem competitive with their Nvidia counterparts. Cyberpunk is the most distant outlier, so outside of that AMD RT is in the ballpark. Take the 7800xt vs 4070. This doesn't look like "at least a generation behind."
[Attached benchmark charts: Spider_RT-p.jpg, Hogwarts_RT-p.jpg, RE4_RT-p.jpg]
 
This may have been true with the 6000 series cards, but the 7000 series seem competitive with their Nvidia counterparts. Cyberpunk is the most distant outlier, so outside of that AMD RT is in the ballpark. Take the 7800xt vs 4070. This doesn't look like "at least a generation behind."
A recent game to compare would be Immortals of Aveum (because it is developed on Unreal Engine 5.1)

Also compare 3 cards: 6800xt, 7800xt & 4070 to get a better picture
 
This may have been true with the 6000 series cards, but the 7000 series seem competitive with their Nvidia counterparts. Cyberpunk is the most distant outlier, so outside of that AMD RT is in the ballpark. Take the 7800xt vs 4070. This doesn't look like "at least a generation behind."
Last I checked, the 7900 XTX in any RT game is basically on the level of the 3090. Which the 4090 basically surpasses by roughly 1.5-2x (and the 4080 consistently surpasses as well, trivially). Although it's interesting how the more budget AMD boards compare to Nvidia budget boards, I guess? But Nvidia's budget boards for this gen have always sucked, and/or been mediocre values. I wouldn't really consider them RT-capable cards. I don't think many sane people would run ultra or high quality RT in a next gen title on them at least...

Edit: Also some of those results look sketch. The 4070 in Hogwarts Legacy is clearly getting handicapped by its bus width at 4k. At 1440p it's 20% faster. Is that even an RT result for Resident Evil...? It doesn't explicitly state it.
 
How did the topic turn into being all about Nvidia's Ray Tracing, when Starfield doesn't even have it, and about the RTX 4090 being so good at it while needing $2,000 worth of hardware and GDDR6X memory, like some cheat, compared to something using plain old GDDR6? It's better at costing too much.
 
How did the topic turn into being all about Nvidia's Ray Tracing, when Starfield doesn't even have it, and about the RTX 4090 being so good at it while needing $2,000 worth of hardware and GDDR6X memory, like some cheat, compared to something using plain old GDDR6? It's better at costing too much.
Good question. Egos get bruised when game devs don't back up Nvidia's marketing.
 
But the majority of modern titles perform better while having better graphics.

Except for Jedi Survivor, yet another AMD sponsored game.
You are saying it performed better on AMD? Did you know that is because AMD paid to keep both competitors' tech out as well as hamper performance on competitors' GPUs?

Did you know a modder put DLSS in Jedi Survivor, and not only did I see a 50% fps boost, instantly, but also had better picture quality? Even with DLSS input resolution equal to the output resolution, I was seeing up to 59% more FPS, with better image quality.

It's just expected that AMD cards perform worse than Nvidia....The more AMD wins in games like Starfield and Jedi Survivor.....
I would be happy to see AMD win, but not by cheating, which is what they are doing. It's anti-competitive and anti-consumer.

They pay to keep superior technology out of games. That's not a 'win'.
 
You are saying it performed better on AMD? Did you know that is because AMD paid to keep both competitors' tech out as well as hamper performance on competitors' GPUs?

Did you know a modder put DLSS in Jedi Survivor, and not only did I see a 50% fps boost, instantly, but also had better picture quality? Even with DLSS input resolution equal to the output resolution, I was seeing up to 59% more FPS, with better image quality.


I would be happy to see AMD win, but not by cheating, which is what they are doing. It's anti-competitive and anti-consumer.

They pay to keep superior technology out of games. That's not a 'win'.
JS has DLSS now, they patched it in.
 
JS has DLSS now, they patched it in.
Yeah, now they've moved on to "it's not good enough" and insist the modded DLSS is better.

Which defeats the original claim that it's so easy to implement DLSS, the devs just have to turn it on.

Apparently not, because then why would the modded DLSS be better than official?

I'm sure it'll be something like "AMD paid devs to botch the DLSS".
 
I'm going to necro this thread after the gold edition comes out in a few years just to ask if any of this really mattered in the end. Forget the 1000 empty worlds of content at launch, did it have DLSS support then? lol
It doesn't even matter now. Tons of copies sold and are played. I'm sure the standard Bethesda crowd are doing just fine.

Most of the criticism is based on actual game elements.
 
Yeah, now they've moved on to "it's not good enough" and insist the modded DLSS is better.

Which defeats the original claim that it's so easy to implement DLSS, the devs just have to turn it on.

Apparently not, because then why would the modded DLSS be better than official?

I'm sure it'll be something like "AMD paid devs to botch the DLSS".
They fucked it up pretty bad though.

From the TechPowerUp review of the patch.

In Star Wars Jedi: Survivor, the DLSS Super Resolution implementation has one major issue—it does not work in Fullscreen mode and you have to run the game in Windowed Fullscreen mode in order to use DLSS. For some people it may not be that big of an issue, but for those who enjoy using NVIDIA's Dynamic Super Resolution (DSR) or Deep Learning Dynamic Super Resolution (DLDSR) and want to use DLSS with DSR/DLDSR, you first need to set the desktop resolution to a desirable DSR/DLDSR resolution, launch the game in windowed fullscreen mode and enable DLSS, which is a tedious process for no good technical reason.

https://www.techpowerup.com/review/star-wars-jedi-survivor-fsr-2-2-vs-dlss-2-vs-dlss-3-comparison/

They also forgot to actually pass the buffer through the sharpening filter. They broke it for TAA native as well.

It almost looks like TAA was broken from the get-go, which is why it was easy to see FSR looking better than it from day 1.
 
You are saying it performed better on AMD? Did you know that is because AMD paid to keep both competitors' tech out as well as hamper performance on competitors' GPUs?
Nobody knows if AMD did pay to keep DLSS out of Jedi Survivor, especially not you.
Did you know a modder put DLSS in Jedi Survivor, and not only did I see a 50% fps boost, instantly, but also had better picture quality? Even with DLSS input resolution equal to the output resolution, I was seeing up to 59% more FPS, with better image quality.
Nah, we just had a thread about this locked, so clearly I never knew Jedi Survivor had a DLSS mod. Also, that mod created graphical glitches because, again, it's a mod.
I would be happy to see AMD win, but not by cheating, which is what they are doing. It's anti-competitive and anti-consumer.
AMD is doing what Nvidia has been doing for two decades. It's likely that the reason Nvidia wins in most game titles is because most game titles are sponsored by Nvidia. It's just that AMD's budget isn't as big as Nvidia's, so it's rare to find an AMD sponsored title. This is why PhysX made it into so many games, as well as hairworks. More recently Ray-Tracing and now DLSS. There are still some RTX games that just don't work on AMD hardware, and yet nobody cares.
They pay to keep superior technology out of games. That's not a 'win'.
Considering that DLSS did end up in Starfield and Jedi Survivor, I doubt AMD paid to not have DLSS in these games. Otherwise AMD didn't get their money's worth. AMD probably asked for FSR to be the priority, and that's reasonable. Probably the reason AMD is ending up as the sponsor is either because Nvidia is too focused on AI, or because developers see value in FSR because it works on all GPUs. You can argue about image quality and performance, but it still took months for DLSS to get into Jedi Survivor and 1 month for Starfield. Apparently it's not as easy as some people say it is to implement.
 
it still took months for DLSS to get into Jedi Survivor and 1 month for Starfield. Apparently it's not as easy as some people say it is to implement.
And yet somehow, even without the source code, PureDark got DLSS working in Starfield on August 31st.
 
There were no graphical glitches in Jedi Survivor with the DLSS mod that I saw, and I played it for weeks using it.

DLSS ended up (officially) in Jedi Survivor likely because there was some "exclusivity timeframe", or the devs finally had to add it, or AMD would have had to admit they paid to keep it out. GN submitted the exact question to AMD and AMD's response was "no comment". It's basically an admission.

I'm not aware that DLSS has been officially added to Starfield... I don't own it. There's a mod to add DLSS to Starfield that Puredark was working on, and released first versions of, on day 1. The game companies have no real excuse to leave out XeSS and DLSS, other than doing so at the request of AMD, which is done to keep them from looking bad.
 
There were no graphical glitches in Jedi Survivor with the DLSS mod that I saw, and I played it for weeks using it.

DLSS ended up (officially) in Jedi Survivor likely because there was some "exclusivity timeframe", or the devs finally had to add it, or AMD would have had to admit they paid to keep it out. GN submitted the exact question to AMD and AMD's response was "no comment". It's basically an admission.

I'm not aware that DLSS has been officially added to Starfield... I don't own it. There's a mod to add DLSS to Starfield that Puredark was working on, and released first versions of, on day 1. The game companies have no real excuse to leave out XeSS and DLSS, other than doing so at the request of AMD, which is done to keep them from looking bad.
They announced it's coming to Starfield; it's not in there yet.

I’ll hopefully get a chance to actually start the game late October.
 
They fucked it up pretty bad though.

From the TechPowerUp review of the patch.

In Star Wars Jedi: Survivor, the DLSS Super Resolution implementation has one major issue—it does not work in Fullscreen mode and you have to run the game in Windowed Fullscreen mode in order to use DLSS. For some people it may not be that big of an issue, but for those who enjoy using NVIDIA's Dynamic Super Resolution (DSR) or Deep Learning Dynamic Super Resolution (DLDSR) and want to use DLSS with DSR/DLDSR, you first need to set the desktop resolution to a desirable DSR/DLDSR resolution, launch the game in windowed fullscreen mode and enable DLSS, which is a tedious process for no good technical reason.

https://www.techpowerup.com/review/star-wars-jedi-survivor-fsr-2-2-vs-dlss-2-vs-dlss-3-comparison/

They also forgot to actually pass the buffer through the sharpening filter. They broke it for TAA native as well.

It almost looks like TAA was broken from the get-go, which is why it was easy to see FSR looking better than it from day 1.
Add it to the long list of stuff they F'd up in that game. The recent DF video shows it's still not in great shape. And based on the fact that the first game still has stutter issues it may never be 100%.
 
Add it to the long list of stuff they F'd up in that game. The recent DF video shows it's still not in great shape. And based on the fact that the first game still has stutter issues it may never be 100%.
The game, just like its predecessor, has serious multi-threading problems and relentlessly pounds one core.
Their customizations to the UE4 engine also seem to struggle with caching and fetching, which makes the zone stuttering especially bad.
The developers' ambitions exceed UE4's capabilities, and their attempts to shove it all in just leave things bursting at the seams.
 
And yet somehow, even without the source code, PureDark got DLSS working in Starfield on August 31st.
Working seems like a bit of an oversell considering many using the mod ended up disabling it due to stability issues.

Both official and modded DLSS have had various issues in several recent games, demonstrating that it isn't as simple to enable and have working properly as many have tried to claim.
 
There were no graphical glitches in Jedi Survivor with the DLSS mod that I saw, and I played it for weeks using it.

DLSS ended up (officially) in Jedi Survivor likely because there was some "exclusivity timeframe", or the devs finally had to add it, or AMD would have had to admit they paid to keep it out. GN submitted the exact question to AMD and AMD's response was "no comment". It's basically an admission.

I'm not aware that DLSS has been officially added to Starfield... I don't own it. There's a mod to add DLSS to Starfield that Puredark was working on, and released first versions of, on day 1. The game companies have no real excuse to leave out XeSS and DLSS, other than doing so at the request of AMD, which is done to keep them from looking bad.
I suspect it's not an exclusivity thing but a customization one. Adding DLSS to the stock engine is easy, but depending on the changes made at a core level, things can get off the rails pretty fast. Guaranteed that UE4 engine is a damned monster of modified code and plugins; I think AMD's framework is easier to implement in that sort of situation, and much of the framework already existed for the console plugins.
 
How did the topic turn into being all about Nvidia's Ray Tracing, when Starfield doesn't even have it, and about the RTX 4090 being so good at it while needing $2,000 worth of hardware and GDDR6X memory, like some cheat, compared to something using plain old GDDR6? It's better at costing too much.

I don't know. The same way that every topic that mentions the pitfalls of one brand just turns into a shitting fest between "Red vs Green"? You get one rare "win" for AMD, and you get AMD users gloating and shoving it down the throats of Nvidia users while defending it with essays and strawman arguments. You get an Nvidia win in the vast majority of games... well, Nvidia must clearly be paying developers to handicap it on AMD, and they all must clearly be Nvidia sponsored titles. AMD is justified in doing what they do because Nvidia has been doing it. We should just all be satisfied with this race to the bottom. I literally just mentioned the Ray Tracing thing as an aside.

Then I had some dude completely ignore my actual point while calling me shit because he wasn't capable of understanding what I was actually saying, and then being an asshat on top of it while taking a short line out of context. Needless to say that idiot got blocked.

I don't know, you tell me, why do these things go off the rails and end up just being shitting fests? Apparently we can't have calm discussions about GPUs, their actual featuresets, pitfalls, etc. Everyone feels the need to defend and justify their "side". Again, I don't even have a 4090. I have a 3080 Ti. It was literally just the best deal available to me at the time I bought it. The one reason I didn't buy a 7900XTX yet is its Ray Tracing performance sucks (literally a sidegrade to my 3080 Ti), and I believe that future games are going to leverage Ray Tracing more and more. Clearly Starfield didn't at all. I don't see 7900XTX as being futureproof enough because I loved the way Hogwarts Legacy looked with RT on. I would actually quite welcome it if the 7900XTX kept getting more wins, because maybe then AMD would be more competitive and competition always drives prices down. It's just I don't believe from what I've seen that Starfield is any valid indicator of that.
 
DLSS ended up (offcially) in Jedi Survivor likely because there was some "exclusivity timeframe", or the dev's finally had to add it, or AMD would have had to admit they paid to keep it out. GN submitted the exact question to AMD and AMD's response was "no comment". It's basically an admission.
The carefully chosen words of AMD's Frank Azor (who joked that he was having to choose them carefully) on the topic of DLSS, when interviewed by The Verge about Starfield, pointed to what sounded like timed exclusivity for FSR. And I'd guessed it was more of a soft agreement than something specified in a contract.

Bethesda suddenly mentioning in the last patch notes that DLSS will now be coming soon makes it seem like the backlash over not shipping with DLSS has changed their internal calculus, and they're going to fast-track it to stop that bleeding.

Again, AMD's done nothing wrong or evil that NV wouldn't, but even timed exclusivity seems like it'll mostly just shine a light on the disparity between DLSS/FSR more than help FSR adoption. Hopefully FSR continues to improve, but that's not easy or trivial since NV has a sarcastic amount of budget being thrown at DLSS.
 
On the topic of the article, thanks for the link, OP. That's a pretty cool look at the insides of how the shaders are being fed to the GPUs and the differences in their cache design and strategies.

I currently have a 3090 since, at the time I bought it, it was the cheapest way for me to get a 24GB card. I think that was pre-7k series launch though, or the 7900XTX would have been a contender, but NV's better RT performance probably would have meant the 3090 would win out for me anyway. I do like XFX cards though, I had a HD6950 (well still have actually) that I reflashed as a HD6970; solid card for its time.
 
Out of curiosity, is this one of those 4K-ultra-only issues? Or does the game have issues at lower settings/resolutions too?
 
Again, AMD's done nothing wrong or evil that NV wouldn't, but even timed exclusivity seems like it'll mostly just shine a light on the disparity between DLSS/FSR more than help FSR adoption. Hopefully FSR continues to improve, but that's not easy or trivial since NV has a sarcastic amount of budget being thrown at DLSS.
FSR's adoption has more to do with it being open and able to work on all GPUs. Many people have argued that DLSS is superior, but not enough to justify the development time to implement it over FSR. The difference between DLSS 2 and FSR 2 requires people to zoom in and point out certain sections of a game. If anything, we may see FSR implemented into DirectX and Vulkan just because it's open source and works on all hardware.
I don't know. The same way that every topic that mentions the pitfalls of one brand just turns into a shitting fest between "Red vs Green"? You get one rare "win" for AMD, and you get AMD users gloating and shoving it down the throats of Nvidia users while defending it with essays and strawman arguments. You get an Nvidia win in the vast majority of games... well, Nvidia must clearly be paying developers to handicap it on AMD, and they all must clearly be Nvidia sponsored titles. AMD is justified in doing what they do because Nvidia has been doing it. We should just all be satisfied with this race to the bottom. I literally just mentioned the Ray Tracing thing as an aside.
I favor AMD and Intel because Nvidia has priced themselves out of the competition. When Nvidia one day returns prices back to sane levels, I will defend them as well. It's better for the industry that we don't just give Nvidia more credit than they deserve.
The one reason I didn't buy a 7900XTX yet is its Ray Tracing performance sucks (literally a sidegrade to my 3080 Ti), and I believe that future games are going to leverage Ray Tracing more and more.
The 7900 XTX isn't what I'd call... affordable. It wasn't until the price of the 7900 XTX started to drop that sales started to happen. Nobody here is advocating that people should buy a 7900 XTX, because you shouldn't. Buying those kinds of GPUs at those prices just encourages AMD and Nvidia to keep raising prices. The RTX 4090 is the most overpriced GPU ever made, and it sells because there's a group of people who buy the best, and the best is the RTX 4090.

I'm going to tell you what I've told Android and Apple users, and that's to stop praising multi-billion-dollar companies. They all suck, including AMD. We all know the flaws of AMD GPUs, just like we know the flaws of Nvidia GPUs. Buy whatever has the best price to performance and that will do the market a lot of good.
Clearly Starfield didn't at all. I don't see 7900XTX as being futureproof enough because I loved the way Hogwarts Legacy looked with RT on. I would actually quite welcome it if the 7900XTX kept getting more wins, because maybe then AMD would be more competitive and competition always drives prices down. It's just I don't believe from what I've seen that Starfield is any valid indicator of that.
If you're looking to buy a 7900 XTX or RTX 4090, then you're not looking at future proofing. People with that kind of budget are going to upgrade every 2 years, depending on when Nvidia releases their next flagship graphics card. People who buy RTX 4070s or RX 7800 XTs are probably going to upgrade every 3-4 years. People who buy RX 7600s or RTX 4060s will upgrade every 5 years or more. Different budgets, different expectations.
 
I kind of get the sense that Intel Arc and Starfield found out about each other at the same time, and that's why no XeSS was added: Intel never had a working driver. So I still have an RTX 3070 and an A770 to try once some patch work gets done on adding DLSS and the other stuff they've said they're going to add, which should make for a nice long winter game.
 
Follow-up report

where they look at Starfield on an RX 7600, RX 6900 XT, and RTX 2060


RDNA 2 (RX 6900 XT vs RX 7900 XTX)

AMD’s RX 6900 XT and RX 7600 show similar characteristics to the larger RX 7900 XTX, but there are a few differences worth calling out. The RX 6900 XT is occasionally ROP-bound.

Because RDNA 3 can dual issue the most common instructions like FP32 adds, multiplies, and fused multiply adds, it ends up less compute bound than RDNA 2. Dual issue is less of a factor in wave32 code, where it depends on the compiler to find dual issue pairs.

RX 7600 at 1440p

The RX 7600 might suffer from lower Infinity Cache hitrate, though we don’t have enough data to say for certain. Different draw calls become more expensive (or vice versa) depending on resolution.

The RX 7600 has a much smaller last level cache than the other two cards. It could have suffered more misses, but there’s no way for us to confirm that because RGP doesn’t provide counters for the Infinity Cache.

Another culprit is that this shader didn’t scale down in resolution as much as you’d expect going from 4K to 1440p. The RX 7600 dispatched 676,054 invocations for the 1440p frame, while the 6900 XT dispatched 797,573 invocations. The RX 7600 therefore did 84% as much work, compared to the 44% as much work for the full screen pixel shader.

Nvidia’s RTX 2060 at 1080p: Instruction Starvation

AMD cards still keep themselves better fed thanks to larger register files and higher occupancy. This isn’t a case of optimizing for RDNA while forgetting to do so for Nvidia, because there’s no way around a smaller register file. I suppose it would become less of an issue with simple shaders that don’t need a lot of registers, but larger, more complex shaders seem to be trending with recent games.

A return to the simplicity of the DirectX 9 and early DirectX 11 era is unlikely outside of small indie games. Nvidia might want to consider larger register files going forward. They haven’t increased per-SM vector register file capacity since Kepler. While Kepler was an excellent architecture, there is room for innovation.


https://chipsandcheese.com/2023/10/15/starfield-on-the-rx-6900-xt-rx-7600-and-rtx-2060-mobile/
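As a back-of-the-envelope illustration of the report's register-file point, here is a small host-side sketch (all numbers are invented for illustration, not measured values for any real GPU) of how per-thread register footprint caps how many waves can stay resident; fewer resident waves means fewer opportunities to hide latency, which is what the instruction starvation described above comes down to.

```cuda
#include <cstdio>

// Occupancy limited by register pressure alone. Real occupancy calculators
// also account for shared/LDS memory, scheduler caps, and launch bounds.
static int wavesResident(int regFileRegs,    // 32-bit registers per SM/SIMD group (assumed)
                         int regsPerThread,  // register footprint of the shader (assumed)
                         int threadsPerWave, // SIMD width
                         int maxWaves) {     // hardware cap on waves in flight (assumed)
    int byRegisters = regFileRegs / (regsPerThread * threadsPerWave);
    return byRegisters < maxWaves ? byRegisters : maxWaves;
}

int main() {
    const int regsPerThread = 200;  // a "large, complex shader", hypothetical value
    // Hypothetical smaller vs. larger register files, 32 threads per wave,
    // and a cap of 16 waves per scheduler.
    printf("smaller register file: %2d waves resident\n",
           wavesResident(64 * 1024, regsPerThread, 32, 16));
    printf("larger register file:  %2d waves resident\n",
           wavesResident(128 * 1024, regsPerThread, 32, 16));
    return 0;
}
```

With those assumed numbers the bigger file keeps 16 waves in flight instead of 10, which matches the report's framing that a smaller register file leaves the scheduler with fewer runnable waves when a heavy shader is on screen.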
 