RTX 4xxx / RX 7xxx speculation

DFenz

[H]ard|Gawd
Joined
Apr 3, 2014
Messages
1,310
[attached screenshots, 2022-06-10]
 

oldmanbal

2[H]4U
Joined
Aug 27, 2010
Messages
2,612
Great strategy.

Even with all the improvements AMD has made in the last 5 years of GPU design, Nvidia still outsells them by more than 10 to 1.

Let me just reiterate that in fonts that represent their sale ratios:

NVIDIA GPUs OUTSELL AMD GPUs
BY A MARGIN OF
10 to 1

While that may look like a big difference, due to forum constraints it only represents a 3.2-to-1 size ratio. The true representation would probably take up enough memory to crash a PC circa 1995.

However, this gives AMD an incentive to keep innovating and taking risks that make gamers happy and rebuild its sales base.


The whole reason Nvidia is rushing to launch, potentially burning their own partners who are still holding older inventory, is because Nvidia is widely anticipated to lose on both fronts, performance per watt and overall performance, this generation.

It actually has nothing to do with that; it's entirely tied to financial decisions that can be summed up as follows: crypto crashed and no one is buying 3xxx cards anymore, so if they don't release a new generation, they're going to have one of the worst fourth quarters from the GPU division in over a decade. The shareholders have gotten used to consistent growth, growth Nvidia repeatedly said was not due to the crypto boom, and now they're in a position where the only solution is to launch ASAP. Otherwise investors would lose confidence and the overall market would probably start downgrading Nvidia's growth outlook.

Don't get me wrong, I see them as an ever-evolving company that is gearing up to engage with a TAM that will likely increase indefinitely. However, lying about crypto put them in a position where they could have been sued, or worse, had management gutted by shareholders who wanted another 15% at the end of the year. It's silly, and I enjoy watching a fiery wreck every once in a while, don't you?
 
Last edited:

Axman

[H]F Junkie
Joined
Jul 13, 2005
Messages
13,355
Video cards aren't only for gaming, though.

I also don't think the 10:1 number is accurate. We can only really know from things like the Steam hardware survey, which puts it at five to one, on the consumer side. Their numbers are so big I'll bet the only segments that aren't generally well-reflected by Valve are datacenter and mining.

And Nvidia will all day, every day, cater to datacenters and miners first, then consumers, well, I don't even know if second is the right word. Last. Last is.
 

GoldenTiger

Fully [H]
Joined
Dec 2, 2004
Messages
25,362
I also don't think the 10:1 number is accurate. We can only really know from things like the Steam hardware survey, which puts it at five to one, on the consumer side. Their numbers are so big I'll bet the only segments that aren't generally well-reflected by Valve are datacenter and mining.

And Nvidia will all day, every day, cater to datacenters and miners first, then consumers, well, I don't even know if second is the right word. Last. Last is.
Yeah, I've seen around 85% Nvidia to 15% AMD for discrete consumer GPUs recently from Jon Peddie Research, but nothing quite like 10:1.
 

Deleted member 289973

Guest
I've never understood
I also don't think the 10:1 number is accurate.
The research I've done has mostly shown something around 4:1 or 5:1 (17-21% for AMD, 79-83% for Nvidia). They fare a bit better with CPUs, at around 28% to Intel's 72% as of Q1 2022, but I don't think they'll see numbers like that on the GPU side anytime soon, unless the RX 7000s have a significant advantage over the RTX 4000s in price and performance. Even so, Nvidia will be the rabbit and AMD the turtle in this race.
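For reference, the share-to-ratio arithmetic is trivial; a quick sketch using the rough splits mentioned in this thread (illustrative numbers, not an exact dataset):

```python
# Rough discrete-GPU share splits cited in this thread (illustrative only).
for nvidia, amd in [(79, 21), (83, 17), (85, 15)]:
    print(f"{nvidia}% vs {amd}% -> about {nvidia / amd:.1f} : 1")
# 79% vs 21% -> about 3.8 : 1
# 83% vs 17% -> about 4.9 : 1
# 85% vs 15% -> about 5.7 : 1  (even the JPR split is nowhere near 10:1)
```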
 

crazycrave

[H]ard|Gawd
Joined
Mar 31, 2016
Messages
1,421
I've been watching prices on the RX 6700 XT, as I plan to maybe add one soon to replace two AMD cards I sold back in Oct/Nov 2021. I still have 3 AMD cards running in other systems.
 

pippenainteasy

[H]ard|Gawd
Joined
May 20, 2016
Messages
1,067
Even with all the improvements AMD has made in the last 5 years of GPU design, Nvidia still outsells them by more than 10 to 1.

Let me just reiterate that in fonts that represent their sale ratios:

NVIDIA GPUs OUTSELL AMD GPUs
BY A MARGIN OF
10 to 1

While that may look like a big difference, due to forum constraints it only represents a 3.2-to-1 size ratio. The true representation would probably take up enough memory to crash a PC circa 1995.

However, this gives AMD an incentive to keep innovating and taking risks that make gamers happy and rebuild its sales base.

It actually has nothing to do with that; it's entirely tied to financial decisions that can be summed up as follows: crypto crashed and no one is buying 3xxx cards anymore, so if they don't release a new generation, they're going to have one of the worst fourth quarters from the GPU division in over a decade. The shareholders have gotten used to consistent growth, growth Nvidia repeatedly said was not due to the crypto boom, and now they're in a position where the only solution is to launch ASAP. Otherwise investors would lose confidence and the overall market would probably start downgrading Nvidia's growth outlook.

Don't get me wrong, I see them as an ever-evolving company that is gearing up to engage with a TAM that will likely increase indefinitely. However, lying about crypto put them in a position where they could have been sued, or worse, had management gutted by shareholders who wanted another 15% at the end of the year. It's silly, and I enjoy watching a fiery wreck every once in a while, don't you?

Not to mention with the Fed rate hikes and QT, the economy in 2023 is gonna be a wasteland. They might as well launch earlier and reap the sales now, and let AMD eat dirt by launching RDNA3 in a recession.
 

Comixbooks

Fully [H]
Joined
Jun 7, 2008
Messages
18,932
Not to mention with the Fed rate hikes and QT, the economy in 2023 is gonna be a wasteland. They might as well launch earlier and reap the sales now, and let AMD eat dirt by launching RDNA3 in a recession.

You ain't kidding about eating dirt.
Kopite just leaked an October launch for the 4090.
 

Deleted member 289973

Guest
Isn't the 6900xt, 6950xt only 2%-7% slower than 3090 but about 60% cheaper?
The 6900XT MSRP is $1000 and the 3090 is $1500, so a third cheaper, not 60%. It's about 15-20% slower.
The 6950XT is not much of an increase: for $100 more than the 6900XT it's maybe 5-6% faster, and it still falls about 15% short of the 3090.
Out of the three, the 6900XT is the biggest bang for your buck.
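A back-of-envelope perf-per-dollar sketch using those MSRPs; the normalized performance numbers below are my own rough estimates from the deltas above, not measured data:

```python
# Back-of-envelope perf per dollar from the MSRPs above.
# "perf" is normalized to the 3090 = 100; deltas are rough thread estimates.
cards = {
    "RTX 3090":  {"price": 1500, "perf": 100},
    "RX 6900XT": {"price": 1000, "perf": 82},  # ~15-20% slower, midpoint-ish
    "RX 6950XT": {"price": 1100, "perf": 86},  # ~5% faster than the 6900XT
}
for name, c in cards.items():
    print(f"{name}: {1000 * c['perf'] / c['price']:.1f} perf per $1000")
# RTX 3090: 66.7 perf per $1000
# RX 6900XT: 82.0 perf per $1000  <- biggest bang for your buck
# RX 6950XT: 78.2 perf per $1000
```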
 

Deleted member 289973

Guest
Userbenchmark? Not really reliable or accurate.
I've never heard anyone say anything about it until now. Educate me: is it the way they calculate scores, or that they aggregate results from a bunch of people with non-constant specs, etc.?
 
Joined
Oct 12, 2020
Messages
559
I've never heard anyone say anything about it until now. Educate me: is it the way they calculate scores, or that they aggregate results from a bunch of people with non-constant specs, etc.?
Well, their score aggregation is just broken; how scores are weighted has changed over the years, and they're generally known for an extreme bias towards Intel, for example, which factors into that.

Edit: To add, the aggregate percentage differences in performance between two parts are largely bullshit in reality. Looking at actual benchmarks, you can see a 6900XT and a 3090 are very comparable to each other; depending on resolution and on the game/engine, one will win over the other and vice versa. So to just say a 6900XT is "15-20% slower" grossly misrepresents the performance differences between the two and is just flat out wrong.
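To illustrate the weighting issue with a toy sketch: the sub-scores and weights below are completely made up (not UserBenchmark's actual data), but they show how re-weighting the same sub-scores can flip which part "wins":

```python
# Toy example: same two GPUs, two different weighting schemes.
# Sub-scores and weights are invented for illustration only.
scores = {"GPU A": {"raster": 95, "compute": 80},
          "GPU B": {"raster": 85, "compute": 95}}

def aggregate(s, w_raster, w_compute):
    return s["raster"] * w_raster + s["compute"] * w_compute

for w in [(0.8, 0.2), (0.4, 0.6)]:
    ranked = sorted(scores, key=lambda g: aggregate(scores[g], *w), reverse=True)
    print(f"weights raster={w[0]}, compute={w[1]}: winner = {ranked[0]}")
# weights raster=0.8, compute=0.2: winner = GPU A  (92 vs 87)
# weights raster=0.4, compute=0.6: winner = GPU B  (86 vs 91)
```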
 
Last edited:

Deleted member 289973

Guest
Well, their score aggregation is just broken; how scores are weighted has changed over the years, and they're generally known for an extreme bias towards Intel, for example, which factors into that.

Edit: To add, the aggregate percentage differences in performance between two parts are largely bullshit in reality. Looking at actual benchmarks, you can see a 6900XT and a 3090 are very comparable to each other; depending on resolution and on the game/engine, one will win over the other and vice versa. So to just say a 6900XT is "15-20% slower" grossly misrepresents the performance differences between the two and is just flat out wrong.
Understood. Thanks for that.
What are some better sites to find more accurate benchmarks? That way I'm not inadvertently searching for incorrect information.
 

NightReaver

[H]ard|Gawd
Joined
Apr 20, 2017
Messages
1,392
Understood. Thanks for that.
What are some better sites to find more accurate benchmarks? That way I'm not inadvertently searching for incorrect information.
Honestly? Look for comparisons in games you actually play. A lot of people love to cite the HWUB 50-game averages, but that's still just an average.

Some games play way nicer on AMD and some are way nicer on Nvidia.
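To put a number on that, here's a toy sketch with made-up FPS figures: two cards that tie on average while each is 25-30% ahead in specific games.

```python
# Made-up per-game FPS showing how an average hides big per-game swings.
from statistics import mean

fps = {              # (Card X, Card Y) -- invented numbers
    "Game A": (140, 110),   # Card X ~27% ahead
    "Game B": (90, 120),    # Card Y ~33% ahead
    "Game C": (100, 100),   # dead even
}
x_avg = mean(x for x, _ in fps.values())
y_avg = mean(y for _, y in fps.values())
print(f"averages: X = {x_avg:.0f} fps, Y = {y_avg:.0f} fps")  # both 110: a "tie"
# Yet in the one game you actually play, the "tied" cards differ by ~30%.
```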
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
2,452
Understood. Thanks for that.
What are some better sites to find more accurate benchmarks? That way I'm not inadvertently searching for incorrect information.
https://www.techpowerup.com/gpu-specs/geforce-rtx-3090.c3622

is a quick way to see GPU specs, and their relative performance graph seems reasonably good.

Because of architecture differences it can be a bit more complicated: the difference between a 6900XT and a 3090 is not the same at 1440p as at 4K, for example, so a single raw number can be misleading.
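A quick sketch of that resolution effect, with hypothetical FPS numbers (invented for illustration, not real benchmark data):

```python
# Hypothetical FPS showing why one relative-performance number can mislead:
# the gap between two cards shifts with resolution.
fps = {
    "1440p": {"6900 XT": 160, "3090": 155},  # 6900 XT slightly ahead
    "4K":    {"6900 XT": 85,  "3090": 95},   # 3090 pulls ahead at 4K
}
for res, cards in fps.items():
    ratio = cards["6900 XT"] / cards["3090"]
    print(f"{res}: 6900 XT at {ratio:.0%} of the 3090")
# 1440p: 6900 XT at 103% of the 3090
# 4K: 6900 XT at 89% of the 3090
```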
 

c3k

2[H]4U
Joined
Sep 8, 2007
Messages
2,306
Yesterday, something occurred that will ensure that RDNA3 will be a revolutionary leap ahead in performance: I ordered an RX6700XT.

Yep...since I finally committed to a new card, it guarantees that the NEXT card will be the one I should've waited for. Sigh....
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
5,037
Yesterday, something occurred that will ensure that RDNA3 will be a revolutionary leap ahead in performance: I ordered an RX6700XT.

Yep...since I finally committed to a new card, it guarantees that the NEXT card will be the one I should've waited for. Sigh....
If it makes you feel any better, I just picked up a 3090 from a friend at a fire-sale price because he didn't want to deal with the risk of selling on eBay. ...This is after I sold my 3080 last week to wait for the 4xxx or 7xxx cards. I'll likely be using a Turing card for a while now and will hold off on a new-gen card.

Thing is, your current gen card is by no means bad simply because a new gen is released. It performs exactly like it did yesterday.
 

NightReaver

[H]ard|Gawd
Joined
Apr 20, 2017
Messages
1,392
If it makes you feel any better, I just picked up a 3090 from a friend at a fire-sale price because he didn't want to deal with the risk of selling on eBay. ...This is after I sold my 3080 last week to wait for the 4xxx or 7xxx cards. I'll likely be using a Turing card for a while now and will hold off on a new-gen card.

Thing is, your current gen card is by no means bad simply because a new gen is released. It performs exactly like it did yesterday.
This is what a lot of people get overly worried about. It's not like your card performs any worse. If it drives your current display/games, then it will continue to do so. Hell, it seems like only a few games worth playing really push GPU requirements each generation. Even then, DLSS/FSR will shore that up.
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
5,037
This is what a lot of people get overly worried about. It's not like your card performs any worse. If it drives your current display/games, then it will continue to do so. Hell, it seems like only a few games worth playing really push GPU requirements each generation. Even then, DLSS/FSR will shore that up.
I have no reason to want performance above a 3090, and certainly not for the premium I expect the new gen to fetch. There's not a single game out there where I don't get >60 FPS with more than respectable IQ, more likely much better/higher than I need.

It'll likely be another year or two before software pushes me to upgrade.
 

Armenius

Extremely [H]
Joined
Jan 28, 2014
Messages
33,126
I've never understood

The research I've done has mostly shown something around 4:1 or 5:1 (17-21% for AMD, 79-83% for Nvidia). They fare a bit better with CPUs, at around 28% to Intel's 72% as of Q1 2022, but I don't think they'll see numbers like that on the GPU side anytime soon, unless the RX 7000s have a significant advantage over the RTX 4000s in price and performance. Even so, Nvidia will be the rabbit and AMD the turtle in this race.
Unless AMD can at least match NVIDIA in ray tracing performance, I won't even consider them.
I have no reason to want performance above a 3090, and certainly not for the premium I expect the new gen to fetch. There's not a single game out there where I don't get >60 FPS with more than respectable IQ, more likely much better/higher than I need.

It'll likely be another year or two before software pushes me to upgrade.
My 3090 is the first card I've had since the 8800 GTX where I don't feel the need to upgrade right away with the next generation. I may be convinced to if ray tracing performance doubles again, as some games can still struggle with ray tracing even with DLSS. I still don't have the itch for a new video card at this point.
 

Nebell

2[H]4U
Joined
Jul 20, 2015
Messages
2,124
I just bought a 3060 because I couldn't live with an integrated GPU. I wanted to play some MMORPGs like WoW and edit videos. Found one for €400, which is more expensive than usual, but I wanted a white card for my all-white build.
Well, color me surprised, but this 3060 is able to run everything at 4K with maxed-out visuals at acceptable framerates. New World runs at 35 fps maxed out. It's not ideal, but it's the most demanding game I have and I almost never play it.

I'll be fine until next-gen is out and affordable, which probably won't be early on :)
 
Joined
Jan 16, 2013
Messages
3,451
I just bought a 3060 because I couldn't live with an integrated GPU. I wanted to play some MMORPGs like WoW and edit videos. Found one for €400, which is more expensive than usual, but I wanted a white card for my all-white build.
Well, color me surprised, but this 3060 is able to run everything at 4K with maxed-out visuals at acceptable framerates. New World runs at 35 fps maxed out. It's not ideal, but it's the most demanding game I have and I almost never play it.

I'll be fine until next-gen is out and affordable, which probably won't be early on :)
C'mon man, lower the settings and boost that frame rate; 35 fps is garbage.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
2,452
but this 3060 is able to run everything at 4K with maxed-out visuals at acceptable framerates.
Really not my experience with a 3070. I can't even play Cyberpunk maxed out at 4K without DLSS on; it's down to 30 fps before ray tracing, and around 35 with RT ultra and DLSS Performance.

Control at 4K without DLSS would be around the mid-10s FPS, I think. Borderlands 3 around 30, etc.

Without RTX on:
[chart: average FPS at 3840×2160]


4K is just a lot of pixels. But every eye is different: if a 35 fps average with lows in the 20s runs smooth for you, then about every game with RTX off would run very well at 4K maxed out, yes, with Control/Cyberpunk being rare exceptions.
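To put "a lot of pixels" in numbers, a quick arithmetic sketch (the DLSS note assumes Performance mode's usual half-resolution-per-axis internal render):

```python
# "4K is just a lot of pixels", in numbers. DLSS Performance mode renders
# internally at half resolution per axis (1080p for a 4K output), i.e. a
# quarter of the pixels, which is why it recovers so much frame rate.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP, {px / base:.2f}x the pixels of 1080p")
# 1080p: 2.1 MP, 1.00x the pixels of 1080p
# 1440p: 3.7 MP, 1.78x the pixels of 1080p
# 4K: 8.3 MP, 4.00x the pixels of 1080p
```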
 

Nebell

2[H]4U
Joined
Jul 20, 2015
Messages
2,124
I have a PS5 if I want to play more games. This 3060 is supposed to be a temporary solution. It plays WoW, Lost Ark, and Guild Wars 2 just fine at 4K, which I found very surprising. I did not expect this much from it.

The problem is I need a white GPU for my system. I don't want to overpay for a 3080/3090 right now, so the 3060 gives me enough.
And when am I going to get a white next-gen card? Who knows. Probably not this year.
It's chugging along just fine in this system :)

[photo of the build]
 

NightReaver

[H]ard|Gawd
Joined
Apr 20, 2017
Messages
1,392
To each their own. I'd stop caring about theming the moment it prevents me from getting the level of performance I'm aiming for.

But that's just me.
 