
DLSS 5 - Generative AI

 
I don't get the pushback. Graphics have been stagnant for over 10 years; we keep throwing more and more TFLOPS at them for minor upgrades.
Nvidia creates an algorithm that makes meh materials look realistic and implements realistic lighting, and everyone is clutching their pearls. Neural rendering may end up as impactful as shaders and bump mapping were.
I honestly don't care what it does or how it does it. I want the game to look better.

I suspect the pushback is due to AI causing memory and GPU prices to rise. I didn't know we were huge on respecting the art direction of AAA studios nowadays.
I want to see what the artist originally intended, not what some computer thinks looks better.
 
I want to see what the artist originally intended, not what some computer thinks looks better.
But this is directly controlled by the artists. Also, some of the most popular mods for games are reshades and model and texture enhancements.
 
I agree. I am not 100% against AI assisting development. I am not against it making placeholders, or helping clean up some type of asset prior to review and adjustment. But DLSS 5 seems to be doing exactly what we don't want to see: it is essentially creating the end product without final adjustments from the artists, more or less allowing sloppier work with what is essentially a crappy filter thrown on top. And like all upscaling and whatnot, there will be some downside, and we really don't need more of that on individual assets.
DLSS 5 is going to be even worse, because it's a compounding problem with multiplicative errors. First you have upscaling, which has its own issues and errors; then you add frame generation, which adds more issues and errors; and then you top that with the new DLSS 5 stuff, which introduces more issues and errors of its own while compounding everything the previous stages did to the image.
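To put the compounding point in concrete terms, here is a toy calculation with entirely made-up per-stage fidelity numbers (none of these figures come from Nvidia or any benchmark):

```python
# Toy model: each pipeline stage is assumed to leave some fraction of the
# image free of its own artifacts; independent stages then compound
# multiplicatively. All numbers are illustrative, not measured.
stages = {
    "upscaling": 0.95,
    "frame generation": 0.93,
    "generative pass": 0.90,
}

fidelity = 1.0
for name, fraction in stages.items():
    fidelity *= fraction
    print(f"after {name}: {fidelity:.3f}")

# With these assumptions, three stages that each cost 5-10% leave only
# ~0.795 of the image free of artifacts overall.
```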

This is one of the reasons this is stupid. Nvidia is throwing this out there to be a long-term "tool"; it's not meant to be turned off. Once developers decide it's best to "save money" and cut staff to put out a mediocre-at-best product that is supposed to be "fixed" by DLSS 5, there is nothing for anyone to do except use it or not buy the games. Personally, I think it's more likely that game sales go down. Just like RT, this "tech" isn't ready and won't be for years as far as hardware goes, and with current GPU prices the amount of upgrading most people will do for the foreseeable future is going to be small.
 
This will be coming fast. Right now the semantics you can add are only stuff like skin, fur, hair, satin, plastic, foliage, water (just 256 different classes right now).
You can't add specific prompts like that to a model that runs on a rendered image, because different things will be visible in each frame. Well, technically you could, by running a visual AI model on the frame to determine what is in the picture, but I doubt that can be made feasible in a real-time application.
If they do offer a way to give hints to the model like this, devs would have to commit further to Nvidia proprietary tech, since that's not just drag and drop. Granted, there was that figure showing the vast majority of the market is Nvidia anyway, but it does make me think of PhysX.
I think the best case is them allowing devs to use their own LoRAs.
That may be. To be honest, Nvidia has said multiple conflicting things about this tech: it's lighting, it's not a filter, it is a filter, it enhances geometry, it leaves models alone. I don't think they were actually prepared for any backlash; they seem pretty out of touch. Maybe when Jensen said DLSS 5 enhances geometry he was just wrong and no one there wants to correct him? I don't know lol
They are trying to piss on you and claim it's raining, nothing new. It's clearly an AI filter that they are trying to pass off as something more than that.
 
For years everyone complained about not getting realistic graphics, especially after paying 3K+ for a graphics card. Then they show that you can, and everyone starts crying about it.
1. They're not realistic, they're overprocessed. This is the dynamic/showroom setting on TVs.

2. Photorealism is not and has never been the goal of gaming graphics. Of some games, yes, and it is a valid goal as an available tool. Butchering existing games' visual design, in both geometry and rendering, ain't it.
 
2. Photorealism is not and has never been the goal of gaming graphics. Of some games, yes, and it is a valid goal as an available tool. Butchering existing games' visual design, in both geometry and rendering, ain't it.
I must have been in another timeline for the past 30 years. For me, every generation has been about the march towards photorealism, with obvious exclusions like cel-shaded and pixel-art games.
 
I must have been in another timeline for the past 30 years. For me, every generation has been about the march towards photorealism, with obvious exclusions like cel-shaded and pixel-art games.
What is the most successful gaming platform?
 
For years everyone complained about not getting realistic graphics, especially after paying 3K+ for a graphics card. Then they show that you can, and everyone starts crying about it.

But what they showed doesn't pass the eye test and feels fake. The lighting and atmosphere are all wrong, and the character models look and feel uncanny.
 

View: https://www.youtube.com/watch?v=GUxsmp8iojY

Really disappointed to see Digital Foundry walk back some of their takes, saying "We should have waited to see what the reaction would be." So they just plan to parrot the audience? Discourse in this space has become so bad lately.

Like, yeah, the demo looked bad. They likely cranked settings to max to show off the tech, similar to how TVs on the showroom floor are set to completely inaccurate color settings because it's flashy.

Yes, the demo ran on two 5090s. That doesn't mean it will require two 5090s at release.

Idk man, where is the nuance? I keep seeing these takes with zero nuance, parroting misinformation.
 

NVIDIA DLSS 5 Gets 84% Dislikes on YouTube as Backlash Grows

by AleksandarK Today, 08:16
NVIDIA's latest DLSS 5 technology has faced a significant community backlash, with its approval rating dropping considerably. NVIDIA's official DLSS 5 announcement video on YouTube has received an overwhelming 83.7% dislikes, with only 16.3% likes. This is a substantial negative rating: 16,107 likes and 82,515 dislikes (and counting) on a video with 1,527,915 views at the time of writing. Other videos published on the NVIDIA GeForce YouTube channel have also recorded surprisingly low approval from the community. The Resident Evil Requiem video scored only a 14.9% positive rating, while Starfield had an 18.2% positive ratio of likes to dislikes. Other demos, such as Hogwarts Legacy and EA Sports FC, saw positive ratings of 18.7% and 14.5%, respectively. The best rating now belongs not to a real game but to a tech demo: the Zorah Unreal tech demo, with a 37% positive ratio.

Gamers' reactions are shifting negatively towards the technology, while NVIDIA CEO Jensen Huang famously noted that gamers are "completely wrong" because these games offer massive programmability and controllability in how DLSS 5 is applied, keeping the artistic intent intact. However, according to game developers from both Capcom and Ubisoft who spoke to Insider Gaming, while the individual studios may have been involved in marketing DLSS 5, the teams who worked on them were just as surprised by the results as the rest of the gaming community. A Ubisoft developer is quoted as saying, "We found out at the same time as the public," while developers at Capcom expressed similar sentiments, stating that it was surprising to see Capcom, which has generally been protective of its IPs when it comes to AI involvement, getting involved in the marketing for DLSS 5. Furthermore, the Capcom developers expressed concern about how DLSS 5 might change Capcom's approach to generative AI and its role in game development.
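As a quick sanity check on the quoted figures, the percentages follow directly from the raw like/dislike counts the article cites:

```python
# Recompute the article's approval percentages from its raw counts.
likes, dislikes = 16_107, 82_515
total = likes + dislikes  # 98,622 reactions

like_pct = 100 * likes / total
dislike_pct = 100 * dislikes / total

print(f"{like_pct:.1f}% likes, {dislike_pct:.1f}% dislikes")
# prints "16.3% likes, 83.7% dislikes", matching the article's figures
```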
 
The plan to replace GPUs with simple AI accelerators and software (to increase profit) is well underway at Team Green!

Eventually game engines won't even be needed anymore. Games will be coded as just descriptors, and the AI accelerator and software will take care of the rest of the rendering.
 
The plan to replace GPUs with simple AI accelerators and software (to increase profit) is well underway at Team Green!

Eventually game engines won't even be needed anymore. Games will be coded as just descriptors, and the AI accelerator and software will take care of the rest of the rendering.
Real-time vibe-code rendering... seems like a nightmare.
 
People are putting too much emotion into their comments. It's the same people that screamed you must not buy Hogwarts Legacy, the same people reacting the same way to everything they FEEL is bad. Then don't use DLSS, don't buy Nvidia, don't watch a certain channel or podcast; but no, instead we must cancel everything all the time.

If a game came out with the same graphics without DLSS, everyone would be blown away by the amazing graphics: mind-blowing, next gen...

To me it is a night-and-day comparison. The image on the right is 10000000x better than the original. So basically, "don't believe your lying eyes".

View attachment 792342
Yaass, slay!
 
People are putting too much emotion into their comments. It's the same people that screamed you must not buy Hogwarts Legacy, the same people reacting the same way to everything they FEEL is bad. Then don't use DLSS, don't buy Nvidia, don't watch a certain channel or podcast; but no, instead we must cancel everything all the time.

If a game came out with the same graphics without DLSS, everyone would be blown away by the amazing graphics: mind-blowing, next gen...

To me it is a night-and-day comparison. The image on the right is 10000000x better than the original. So basically, "don't believe your lying eyes".

View attachment 792342
84% dislike ratio don’t lie
 
People are putting too much emotion into their comments. It's the same people that screamed you must not buy Hogwarts Legacy, the same people reacting the same way to everything they FEEL is bad. Then don't use DLSS, don't buy Nvidia, don't watch a certain channel or podcast; but no, instead we must cancel everything all the time.

If a game came out with the same graphics without DLSS, everyone would be blown away by the amazing graphics: mind-blowing, next gen...

To me it is a night-and-day comparison. The image on the right is 10000000x better than the original. So basically, "don't believe your lying eyes".

View attachment 792342

I like how you cropped out just the character model.
 
People are putting too much emotion into their comments. It's the same people that screamed you must not buy Hogwarts Legacy, the same people reacting the same way to everything they FEEL is bad. Then don't use DLSS, don't buy Nvidia, don't watch a certain channel or podcast; but no, instead we must cancel everything all the time.

If a game came out with the same graphics without DLSS, everyone would be blown away by the amazing graphics: mind-blowing, next gen...

To me it is a night-and-day comparison. The image on the right is 10000000x better than the original. So basically, "don't believe your lying eyes".

View attachment 792342
agreed.
 
It's new and will get better over time, just like all the advancements in GPU technology and drivers always have. The closer to photorealism, the happier I am. Cartoony games can stay cartoony if they like, but games that feature real people should look like real people and objects. This is getting us there.
 
People are putting too much emotion into their comments. It's the same people that screamed you must not buy Hogwarts Legacy, the same people reacting the same way to everything they FEEL is bad. Then don't use DLSS, don't buy Nvidia, don't watch a certain channel or podcast; but no, instead we must cancel everything all the time.

If a game came out with the same graphics without DLSS, everyone would be blown away by the amazing graphics: mind-blowing, next gen...

To me it is a night-and-day comparison. The image on the right is 10000000x better than the original. So basically, "don't believe your lying eyes".

View attachment 792342

You get sucked into a lot of OnlyFans payments, don't you? :) (just teasing)

Maybe you're right... Nvidia might have something here. Deep Lipstick Super Slathering.

And yes, if you're serious: your eyes are lying.

This isn't specifically related, though it feels like NV has perhaps hired some of these folks. :) lol

View: https://www.youtube.com/watch?v=T4Upf_B9RLQ
 
People are putting too much emotion into their comments. It's the same people that screamed you must not buy Hogwarts Legacy, the same people reacting the same way to everything they FEEL is bad. Then don't use DLSS, don't buy Nvidia, don't watch a certain channel or podcast; but no, instead we must cancel everything all the time.

If a game came out with the same graphics without DLSS, everyone would be blown away by the amazing graphics: mind-blowing, next gen...

To me it is a night-and-day comparison. The image on the right is 10000000x better than the original. So basically, "don't believe your lying eyes".
I'm not technically up on all the shading and lighting stuff, but I'll say that most of the images I've seen are brighter than the comparison shots. Is the detail more apparent on the right? Yes. But it also changed the model itself, so what is it really doing, and is it always going to produce good changes to the model every time? Why are a lot of the shadows gone? The background in the image on the right is much brighter than the original. Same with the Assassin's Creed images earlier: a lot of the shadows were gone.

My overall take is that some of the images look better and some do not; some aspects of some images look better and some do not. It is difficult to say if the current version of DLSS they're pushing is an overall win. Given that they've apologized and admitted it might have been pushed out hastily, even they seem to believe they oversold this based on the images they provided as proof.
 
You can't add specific prompts like that to a model that runs on a rendered image, because different things will be visible in each frame. Well, technically you could, by running a visual AI model on the frame to determine what is in the picture, but I doubt that can be made feasible in a real-time application.
Nvidia may not want to do that, because they will want something that has not only temporal stability but an exact run-to-run, machine-to-machine look. Others could do it differently, though: with that level of semantics you could build a system that is deterministic. A lot of the randomness we see in models is injected by design so they don't get stuck.
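The determinism point is easy to illustrate: the "randomness" in these models is noise that gets injected explicitly, so seeding that noise source makes the output exactly reproducible. A toy stand-in, not Nvidia's actual pipeline:

```python
# The jitter here models injected noise in a generative pass: seed it
# and the output is identical run to run; change the seed and it differs.
import random

def noisy_pass(pixel_values, seed):
    """Apply hypothetical 'generative' jitter to pixel values, seeded."""
    rng = random.Random(seed)
    return [v + rng.uniform(-0.01, 0.01) for v in pixel_values]

frame = [0.2, 0.5, 0.7, 0.9]

# Same seed: deterministic, machine-to-machine reproducible output.
assert noisy_pass(frame, seed=42) == noisy_pass(frame, seed=42)
# Different seed: the injected noise differs, so the output differs.
assert noisy_pass(frame, seed=42) != noisy_pass(frame, seed=7)
```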
 
It's new and will get better over time, just like all the advancements in GPU technology and drivers always have. The closer to photorealism, the happier I am. Cartoony games can stay cartoony if they like, but games that feature real people should look like real people and objects. This is getting us there.
Plastic surgeons and Instagram filters love you.
 
If they do offer a way to give hints to the model like this, devs would have to commit further to Nvidia proprietary tech, since that's not just drag and drop. Granted, there was that figure showing the vast majority of the market is Nvidia anyway, but it does make me think of PhysX.
It is using the Nvidia Streamline framework to talk to the DLSS 5 implementation, but those semantics are just a simple int derived from a material string value; passing them to a different model would be easy (the Unreal, Blender, etc. assets that describe them are not proprietary), a bit like motion vectors, which are not a big deal to pass to any upscaler once you have them.

The material-ID tagging part will be open source and well documented if you want to follow the Nvidia one and add to it afterward.
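As a hypothetical sketch of the material-ID idea described above: a material's string name maps to a small integer class ID (the thread mentions only 256 semantic classes for now), which could be written into a one-byte-per-pixel ID buffer much like motion vectors. The class names and IDs below are invented for illustration, not Nvidia's actual table:

```python
# Hypothetical material-ID tagging: map material name strings to compact
# integer class IDs. The class list here is invented for illustration.
MATERIAL_CLASSES = ["skin", "fur", "hair", "satin", "plastic", "foliage", "water"]
CLASS_ID = {name: i for i, name in enumerate(MATERIAL_CLASSES)}

def tag_material(material_name: str, fallback: int = 255) -> int:
    """Return the semantic class ID for a material, or a fallback ID."""
    return CLASS_ID.get(material_name, fallback)

# With at most 256 classes, one byte per pixel suffices for the ID buffer,
# so exporting the tags costs no more than one extra 8-bit render target.
assert tag_material("fur") == 1
assert tag_material("mystery_goo") == 255
```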
 
It's new and will get better over time.
I'm sure it will.

I think people are missing where this technology ultimately takes us.

[Chart: average GPU price over the last 18 months]


We will own nothing; we will be forced to subscribe to access it. That's my biggest recoil from all of this.

People think this is going to be readily available on their own equipment. No: in the future you're going to subscribe to a monopolistic cloud service and own nothing, because everything is going to be astronomically expensive. The current hardware price hikes related to powering all of this are just the beginning.

It will not make financial sense to sell hardware to consumers when they can sell it to big corporations for big money, which will then subscribe it back to you as a service.

Edit: more useful: Memory Price Trends - PCPartPicker https://pcpartpicker.com/trends/price/memory/
 
I do love you guys, but sometimes it's like talking to a brick wall.
If the masses wake up from the constant... push of crap, and we could instead bankroll development of tech that would actually be cool, that would be great.
We'll see on this one. You guys might be correct; when this comes out, the average-joe video game player might just not care and might love the yassified visuals.

As you have said yourself, though, maybe this tech is good in 3 years? I mean, it took RT 6 or 7 years to be semi-usable on higher-end GPUs. Gamers bankrolled Nvidia's development into a non-gaming company.

I'm not sure they need the bankrolling anymore, but this new feature feels like a replay: tech that isn't very good and introduces major errors in the visuals, with some bits that look better and a lot of bits that look worse, and that EATS compute for 2-4 generations of video cards, making it mostly unusable in most cases for most people.

Perhaps when this tech looks decent in 6 or 7 years and runs reasonably well on higher-end GPUs, it will be time to dump it for the next thing NV needs us to bankroll. (And yes, I know AMD will be chasing this same dragon.)
 