NVIDIA CEO Jensen Huang hints at ‘exciting’ next-generation GPU update on Tuesday, September 20th

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,483
Looks like if I'm upgrading to 4xxx, I'd be kissing my mini-ATX cases goodbye. The performance of the new cards looks...nice...but for what purpose? So I can get 4K 100+ FPS instead of 4K 95 FPS in Spider-Man: Remastered? Back in the day, graphics in software were improving at roughly the same rate as GPU horsepower. But now? Software developers have no use for that much GPU power. SLI has been dead/redundant for quite a long time now (and I remember having to convince people it was dead a few years back). We're entering a new ballgame. I don't blame the developers; they're just trying to target and accommodate the common consumer rather than the enthusiast, and the buy-in on a regular GPU has increased an incredible amount over the last 5 years.
Well, developers have to focus on the consoles and then offer upgrades for the PC audience in terms of textures and bling. But the consoles, for now and the foreseeable future, are the baseline.

The bulk of gamers are still at 1080p, with 1440p coming in second and 4K way in the back.

But the last 2 years didn't help; the average gamer (60% by the Steam hardware survey) is still running 16GB of RAM on a GTX 1060 at 1080p, with a 6-core CPU in the 2.3–2.7 GHz range. If Steam makes up 90% of PC game sales, more than half your potential buyers are still at or below what the modern consoles are capable of.
So, to ensure a game is available to the widest audience possible, developers probably for the first time ever have to look at the base specs of the consoles and dumb it down for PC users…
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,023
Well, developers have to focus on the consoles and then offer upgrades for the PC audience in terms of textures and bling. But the consoles, for now and the foreseeable future, are the baseline.

The bulk of gamers are still at 1080p, with 1440p coming in second and 4K way in the back.
Because of the console market I am not so sure; probably close to 50% of US households have a 4K TV, and among console gamers probably more.
 

XenIneX

Gawd
Joined
May 19, 2012
Messages
867
Because of the console market I am not so sure; probably close to 50% of US households have a 4K TV, and among console gamers probably more.

Consoles drive a 4k TV with -- roughly -- an RX 6700, and will continue to do so for at least the next half-decade. Meanwhile, the Steam Hardware Survey pegs PC 4k display market penetration at about 2.5%.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,023
Consoles drive a 4k TV with -- roughly -- an RX 6700, and will continue to do so for at least the next half-decade. Meanwhile, the Steam Hardware Survey pegs PC 4k display market penetration at about 2.5%.
All factual, but I am not sure I fully follow the point being made. That GPU is 2.2 times a 1060 6GB according to TechPowerUp, and console gamers are a significant portion of newer AAA game sales, which makes the claim that 4K is way in the back versus 1440p among the gamers targeted by new full-priced games still somewhat dubious.
 

Flogger23m

[H]F Junkie
Joined
Jun 19, 2009
Messages
12,907
Consoles drive a 4k TV with -- roughly -- an RX 6700, and will continue to do so for at least the next half-decade. Meanwhile, the Steam Hardware Survey pegs PC 4k display market penetration at about 2.5%.

...typically at 30 FPS.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,023
If I didn't make an error, the share of DX12 GPUs in the September Steam hardware survey that are a GTX 1080 or better is at least 36.8%. If the claim of over 120 million monthly active users is real and the survey is representative (with 92% of users on a DX12-capable GPU), that would be a giant user base of roughly 40 million with GTX 1080-and-up cards. Games hard to run, like Cyberpunk 2077 (35 fps at 1080p Medium on a 1060), Flight Simulator, or Elden Ring, can be big hits on the platform.

40 million could be a very similar figure to the number of PS5 and Xbox Series X consoles in the wild.
Aug 3, 2022 — To date, roughly 21.61 million PlayStation 5 consoles have been sold worldwide.

Series X and S together are around 15 million.

If we go by 5700 XT / 1080 Ti / 2070 and up, that would be around 31.7 million users.

Steam could be exaggerating its figures, one imagines, but the existence of non-Steam gamers in some markets could compensate for that.
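The estimate is simple to check with back-of-the-envelope arithmetic (the 120M monthly actives, 92% DX12 share, and 36.8% figures are the ones quoted in this post):

```python
# Back-of-the-envelope check of the Steam user-base estimate above.
monthly_active_users = 120e6   # claimed Steam monthly active users
dx12_share = 0.92              # share of surveyed users on a DX12-capable GPU
gtx1080_or_better = 0.368      # share of DX12 GPUs that are GTX 1080 or better

users = monthly_active_users * dx12_share * gtx1080_or_better
print(f"{users / 1e6:.1f} million users")  # ≈ 40.6 million
```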
 
Last edited:

Brackle

Old Timer
Joined
Jun 19, 2003
Messages
8,162
Consoles drive a 4k TV with -- roughly -- an RX 6700, and will continue to do so for at least the next half-decade. Meanwhile, the Steam Hardware Survey pegs PC 4k display market penetration at about 2.5%.
Just because you are playing at 4K doesn't mean it's producing the image at 4K. Most games on the Xbox and PS5 aren't native 4K.

So, yeah, they can't drive a 4K TV properly.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,483
If I didn't make an error, the share of DX12 GPUs in the September Steam hardware survey that are a GTX 1080 or better is at least 36.8%. If the claim of over 120 million monthly active users is real and the survey is representative (with 92% of users on a DX12-capable GPU), that would be a giant user base of roughly 40 million with GTX 1080-and-up cards. Games hard to run, like Cyberpunk 2077 (35 fps at 1080p Medium on a 1060), Flight Simulator, or Elden Ring, can be big hits on the platform.

40 million could be a very similar figure to the number of PS5 and Xbox Series X consoles in the wild.
Aug 3, 2022 — To date, roughly 21.61 million PlayStation 5 consoles have been sold worldwide.

Series X and S together are around 15 million.

If we go by 5700 XT / 1080 Ti / 2070 and up, that would be around 31.7 million users.

Steam could be exaggerating its figures, one imagines, but the existence of non-Steam gamers in some markets could compensate for that.
Yeah, the math is about right; it is also pretty close to what a few of my friends in active game development say. Basically, as far as their management is concerned, there has been zero growth in the average gaming PC over the last 3 years. What their data shows is that the gap between the top end and the average PC has widened, but the average overall has moved very little.
 

jobert

Gawd
Joined
Dec 13, 2020
Messages
919
Looking at this shows you just how much of a joke the 4080 12GB really is. At just 1440p the 4090 has a 71% lead over the 4080 12GB, so imagine at 4K. https://videocardz.com/newz/nvidia-geforce-rtx-4090-delivers-500-fps-in-overwatch-2-at-1440p-ultra


For some perspective, a 3090 is only 47% faster than even just a 3060 Ti at 1440p according to TechPowerUp. https://tpucdn.com/review/nvidia-ge...ion/images/relative-performance_2560-1440.png


So even if the 4080 12GB was called a 4070, it would probably be the weakest 70-class card ever released.
 
Last edited:

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,023
how much of a joke the 4080 12gb really is
3090 is only 47% faster than even just a 3060 ti at 1440p
Well, it depends on the point of view; one could see it the other way around: how much of a joke the $1,499 RTX 3090 was value-wise for gaming.

I think there is a combination of things going on: how terrible a value the 3090 was for a gamer (at MSRP, let alone actual price), and the new 4090 possibly being a bit of a new tier above the previous one. The 4080 maybe should not be judged a joke for how far below the 4090 it will be, given how close the 3080 was to the 3090 despite their giant price gap.

The 4080 will be a joke depending on a mix of how much it beats the 3080 and how much it costs to buy a 3090/3090 Ti instead, imo, not because buying a xx90 instead of a xx80 to play games ever started to make sense perf-per-dollar-wise.
 

jobert

Gawd
Joined
Dec 13, 2020
Messages
919
Well, it depends on the point of view; one could see it the other way around: how much of a joke the $1,499 RTX 3090 was value-wise for gaming.

I think there is a combination of things going on: how terrible a value the 3090 was for a gamer (at MSRP, let alone actual price), and the new 4090 possibly being a bit of a new tier above the previous one. The 4080 maybe should not be judged a joke for how far below the 4090 it will be, given how close the 3080 was to the 3090 despite their giant price gap.

The 4080 will be a joke depending on a mix of how much it beats the 3080 and how much it costs to buy a 3090/3090 Ti instead, imo, not because buying a xx90 instead of a xx80 to play games ever started to make sense perf-per-dollar-wise.
The 4080 12GB is a joke no matter what, and again would be pathetic even as a 4070. Look again at the chart there: the 4080 12GB is only beating the 3080 10GB by 18%. Even if they were the same price it would be horrible, but the 3080 10GB launched at $699 while the 4080 12GB starts at $899. There is no way to defend that BS in any logical way.
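Put as arithmetic, using the 18% uplift and the $699/$899 launch MSRPs quoted above (the exact uplift varies by benchmark), the generational perf-per-dollar actually goes backwards:

```python
# Generational perf-per-dollar at launch MSRP, from the figures quoted above.
uplift = 1.18          # 4080 12GB vs 3080 10GB relative performance
price_old = 699        # 3080 10GB launch MSRP (USD)
price_new = 899        # 4080 12GB launch MSRP (USD)

perf_per_dollar_ratio = uplift / (price_new / price_old)
print(f"{perf_per_dollar_ratio:.2f}")  # ≈ 0.92, i.e. about 8% WORSE perf per dollar
```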
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,023
The 4080 12GB is a joke no matter what, and again would be pathetic even as a 4070. Look again at the chart there: the 4080 12GB is only beating the 3080 10GB by 18%. Even if they were the same price it would be horrible, but the 3080 10GB launched at $699 while the 4080 12GB starts at $899. There is no way to defend that BS in any logical way.
That is on a very specific high-FPS benchmark where the CPU starts to be a large percentage of the time between frames, so 19% is not a lot here. But you are exactly on the right line: it is a joke at that price point because of how little it goes above a 3080 or 3090 at their current prices, not because the 4090 is so much better. The 3090 was the aberration for how close it was to the 3080, not the 4090 for how much higher it is.

In a more GPU-bound scenario, NVIDIA seems to be telling us that the 4080 12GB can get close to a 3090 Ti. That could have sounded nice when the 3090 Ti was selling for $2,000: hey, we found a cheaper, cooler, lower-power way to give you a 3090 Ti experience at 50% of the price, with maybe some new features you may or may not like as a bonus and better RT performance. The massive price drop destroyed that.

One big issue as well: even with RT and 4K enabled, a 3090 Ti was just 23% above a 3080 10GB in Watch Dogs: Legion. It was never an interesting $/power/perf point of reference to start with, so being good relative to it in that regard is not that appealing, a bit like beating Turing in perf per dollar.
 

GotNoRice

[H]F Junkie
Joined
Jul 11, 2001
Messages
11,252
I think there is a combination of things going on: how terrible a value the 3090 was for a gamer (at MSRP, let alone actual price), and the new 4090 possibly being a bit of a new tier above the previous one. The 4080 maybe should not be judged a joke for how far below the 4090 it will be, given how close the 3080 was to the 3090 despite their giant price gap.

The 4080 will be a joke depending on a mix of how much it beats the 3080 and how much it costs to buy a 3090/3090 Ti instead, imo, not because buying a xx90 instead of a xx80 to play games ever started to make sense perf-per-dollar-wise.

I think that your post is actually a good example of what the big issue with the 4080 12GB is. Specifically, the way you refer to it simply as the "4080". This was exactly what Nvidia wanted people to do. But in reality you can't simply refer to either card as a "4080" because what we actually have, between the 4080 12GB and the 4080 16GB, are two completely different cards. They each use a different GPU. They not only have different amounts of RAM, but the RAM on the 12GB model is also slower. They might as well have labeled the 4090 the "4080 24GB". It also uses a different GPU, different amount of memory, and different memory bandwidth, but apparently none of that matters anymore when choosing a name.

Part of the issue is that Nvidia wants the 3000 series to fill in the midrange and low-end segments for a while, to help clear out inventory and keep board partners happy. A 4070 at a 4070 price would have tread on that a bit too much. So they just labeled it a 4080 and upped the price. It doesn't matter that it's terrible for the price, as long as it doesn't interfere with 3000-series sales for now. Maybe they will discount it later? They also know that many people will see benchmarks for the 4080 16GB thinking "wow, look how good that card performs!" and then go buy an overpriced 4080 12GB instead thinking that all they are missing out on is 4GB of video RAM (not knowing it's a totally different card).
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,023
I think that your post is actually a good example of what the big issue with the 4080 12GB is. Specifically, the way you refer to it simply as the "4080".
You are right, obviously, but I feel both 4080s will be grossly overpriced by about the same amount. Yes, the performance gap seems real, but there is a $300 (33%) price gap as well; i.e., had I split my message, I would have said the exact same thing about both. I do not expect FPS per dollar to be that different (even keeping in mind that the higher the FPS, the more they cost).

And for the 16GB, the closer the highest AIB models get to the cheapest 4090 you can find, the more a joke value versus the 4090 could indeed start to creep in.
 

NightReaver

[H]ard|Gawd
Joined
Apr 20, 2017
Messages
1,809
And for the 16GB, the closer the highest AIB models get to the cheapest 4090 you can find, the more a joke value versus the 4090 could indeed start to creep in.
Intended. They want you to either buy old 3000 stock, or buy the overbought 4090.
 

Domingo

Fully [H]
Joined
Jul 30, 2004
Messages
21,364
I think it'll be really interesting to see how the 12GB 4080 model sells. Feels like it's priced too high to actually "trick" anyone into buying it and if you have a grand to drop on a GPU, what's another $150-200?
 

jobert

Gawd
Joined
Dec 13, 2020
Messages
919
Good god. Just massive. I have the 6800xt Tuf and consider it to be huge. I honestly don’t want any GPU this large again after this 6800xt. I love the AMD reference dimensions personally
I have the TUF 3080 Ti right now and it looks huge in my case, with just a slight bit of sag if you look closely. How on earth can that massive 4090 TUF not sag like crazy and put lots of stress on the slot? And why didn't they use a 3-slot bracket to better support the weight? I had that on an EVGA card and the FE 3090 before, and it kept the card from sagging.
 
Last edited:

t1337duder

Limp Gawd
Joined
Sep 7, 2014
Messages
395
I have the TUF 3080 Ti right now and it looks huge in my case, with just a slight bit of sag if you look closely. How on earth can that massive 4090 TUF not sag like crazy and put lots of stress on the slot? And why didn't they use a 3-slot bracket to better support the weight? I had that on an EVGA card and the FE 3090 before, and it kept the card from sagging.
Those anti-sag brackets are going to go from a "nice to have" to a "need to have". I don't have one on my 3090 Ti, but I'm thinking one would be necessary on the 4090 and whatever the hell NVIDIA concocts down the road...
 

TheHig

[H]ard|Gawd
Joined
Apr 9, 2016
Messages
1,080
I used one of the aluminum ones on mine now to keep it level. Like a kickstand. Definitely a requirement on these 4090s. I assume lots of people will vertical mount them as well.

Bottom right.

64E49E75-899C-4876-ABAA-5959AB17D1F2.jpeg
 

GotNoRice

[H]F Junkie
Joined
Jul 11, 2001
Messages
11,252
Those anti-sag brackets are going to go from a "nice to have" to a "need to have". I don't have one on my 3090 Ti, but I'm thinking one would be necessary on the 4090 and whatever the hell NVIDIA concocts down the road...

Yeah that's going to be an issue, especially if you move your computer around at all with the card installed. At least it will have 3 screws keeping it secured to the rear of the case.

It kind of reminds me of my previous Samsung monitor. It used some hipster-inspired design that only had a support on one side. And, of course, after a few years the other side had begun to sag. Sad thing is, I knew that was going to happen since the first day I saw it. Why wasn't it that obvious to the engineers? I ended up improvising a stand out of an old plastic bottle. Maybe I will have to improvise a similar solution for my 4090 also?
 

DPI

[H]F Junkie
Joined
Apr 20, 2013
Messages
12,504
They say in the future you'll just slide a mini-ITX board into an RTX 5090 as a daughtercard. I am ready for that future.

I mean look at this damn thing. This man is 6'4 and struggled just to wrap his lips around it.

1665034168379.png
 
Last edited:

TheHig

[H]ard|Gawd
Joined
Apr 9, 2016
Messages
1,080
The PCB on the founders is tiny again and looks like some of the AIBs are small as well. But yeah I imagine blocks will be pricey because why not. You already paid $1600+ for the card so clearly you can afford the block too.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
6,483
The 4080 16GB and the 4090 excite me on a technical level, but not on a consumer one. I really hope that AMD comes to the table ready to play. I have a mid-range system whose aging 1080 I need to replace so it's capable of doing some fun VR stuff. My daughter enjoys it; it makes me physically ill (on my 3080 Ti, so it is not a framerate issue).
I get that the 4000 series was intentionally priced badly to make the 3000 series more attractive and clear out stock, but I am looking at the 6700 XT and thinking, for $590 CAD, that's not too shabby.
 

Domingo

Fully [H]
Joined
Jul 30, 2004
Messages
21,364


Looks like these things aren't going to fit in a lot of popular medium-size cases. Especially that monstrous STRIX card.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,023
This was interesting:
Cyberpunk 2077, 1440p (Max Settings + Ultra RT + Psycho):

| | RTX 4090 (Native) | RTX 4090 (DLSS 3 Quality) | RTX 3090 Ti (Native) | RTX 3090 Ti (DLSS 2 Quality) |
|---|---|---|---|---|
| FPS (Average) | 59.9 | 170.7 | 35.8 | 61.9 |
| FPS (1% Lows) | 49.3 | 119.6 | - | - |
| FPS (Min) | 40.6 | 91.2 | - | - |
| Latency (Average) | 75.4 ms | 53.5 ms | - | - |
| GPU Clock | 2800-2850 MHz (Stock) | 2800-2850 MHz (Stock) | 2000-2050 MHz | 2000-2050 MHz |
| GPU Temps | 50-57 C | 50-53 C | 70-75 C | 70-75 C |
| GPU Power | 461.3 W | 348.9 W | 454.2 W | 409.2 W |
| PCAT Perf/Watt (FPS/Joule) | 0.135 | 0.513 | 0.079 | 0.152 |


That could explain in part why, despite being a thicker but shorter card, you can add 100 watts without moving the GPU temps by almost any amount, and why the non-FE AIB cards seem so much bigger in comparison. A lot seems to have gone into those coolers.
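As a rough sanity check, the perf/watt column in the table tracks average FPS divided by measured board power (PCAT's per-frame energy accounting makes the published 4090 numbers land slightly higher than this naive division; all inputs below are from the table):

```python
# Naive FPS-per-watt from the benchmark table above (average FPS / board power).
cards = {
    "4090 native":    (59.9, 461.3),
    "4090 DLSS 3":    (170.7, 348.9),
    "3090 Ti native": (35.8, 454.2),
    "3090 Ti DLSS 2": (61.9, 409.2),
}
for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.3f} FPS/W")

# In this test DLSS 3 nearly quadruples the 4090's efficiency:
ratio = (170.7 / 348.9) / (59.9 / 461.3)
print(f"DLSS 3 efficiency gain: {ratio:.1f}x")  # ≈ 3.8x
```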
 
Top