Great strategy.
The whole reason Nvidia is rushing to launch, potentially burning their own partners who are still holding older inventory, is that Nvidia is widely anticipated to lose on both fronts this generation: performance per watt and overall performance.
10 to 1
Video cards aren't only for gaming, though. In terms of gaming parts, it's five to one.
Yeah, I've seen around 85% Nvidia to 15% AMD for discrete consumer GPUs recently from Jon Peddie Research, but nothing quite like 10:1.
I also don't think the 10:1 number is accurate. We can only really know from things like the Steam hardware survey, which puts it at five to one on the consumer side. Their numbers are so big I'll bet the only segments that aren't well reflected by Valve are datacenter and mining.
And Nvidia will, all day, every day, cater to datacenters and miners first, then consumers... well, I don't even know if second is the right word. Last. Last is.
The research I've done has mostly shown something around 4:1 or 5:1 (17-21% for AMD, 79-83% for Nvidia). They fare a bit better with CPUs, at around 28% to Intel's 72% as of Q1 2022, but I don't think they'll see numbers like that on the GPU side anytime soon, unless the RX 7000s have a significant advantage over the RTX 4000s on price and performance. Even still, Nvidia will be the rabbit and AMD the turtle in this race.
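Just to sanity-check the ratio talk with the numbers already quoted in this thread (the JPR-style splits and the ~85/15 figure above; nothing new is being measured here), a quick Python sketch of what those shares work out to as "X to 1" ratios. A genuine 10-to-1 split would need roughly a 91%/9% share.

```python
# Convert the discrete-GPU market-share splits quoted above into "X to 1" ratios.
# These percentages come from the posts in this thread, not from any new source.
splits = [(83, 17), (79, 21), (85, 15), (91, 9)]

for nvidia_pct, amd_pct in splits:
    ratio = nvidia_pct / amd_pct
    print(f"{nvidia_pct}% vs {amd_pct}%  ->  roughly {ratio:.1f} to 1")

# Output:
# 83% vs 17%  ->  roughly 4.9 to 1
# 79% vs 21%  ->  roughly 3.8 to 1
# 85% vs 15%  ->  roughly 5.7 to 1
# 91% vs 9%  ->  roughly 10.1 to 1
```

So the 17-21% shares above line up with 4:1-5:1, and a real 10:1 would imply AMD selling under 10% of discrete cards.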
Even with all the improvements AMD has made in the last 5 years of GPU design, Nvidia still outsells them by more than 10 to 1.
Let me just reiterate that in fonts that represent their sales ratio:
NVIDIA GPUs OUTSELL AMD GPUs
BY A MARGIN OF
10 to 1
While that may look like a big difference, forum constraints mean it only represents about a 3.2 to 1 size difference. A true-to-scale version would probably take up enough memory to crash a PC circa 1995.
However, this gives AMD the incentive to keep innovating and taking risks that may continue to make gamers happy and rebuild their sales base.
It actually has nothing to do with that and is entirely tied to financial pressures that can be summed up as follows: crypto crashed and no one is buying 3xxx cards anymore, so if they don't release a new generation, they're going to have one of the worst fourth quarters the GPU division has seen in over a decade. Shareholders have gotten used to consistent growth, which Nvidia repeatedly insisted was not due to the crypto boom, and now the only solution is to launch ASAP. Otherwise investors would lose confidence and the market would probably start downgrading its outlook on Nvidia's growth.
Don't get me wrong, I see them as an ever-evolving company that is gearing up to engage with a TAM that will likely increase indefinitely. However, lying about crypto put them in a position where they could have been sued, or worse, had management gutted by shareholders who wanted another 15% at the end of the year. It's silly, and I enjoy watching a fiery wreck every once in a while, don't you?
But I did not expect AMD to be this competitive.
Not to mention that with the Fed rate hikes and QT, the economy in 2023 is gonna be a wasteland. They might as well launch early and reap the sales now, and let AMD eat dirt by launching RDNA3 in a recession.
100% right.
Would love to buy a 4080 for MSRP.
People who have been spending double over MSRP for hardware will be willing to spend MSRP in a recession.
Gaming is cheap thrills.
6900XT MSRP is $1000, 3090 is $1500. A third cheaper. It's about 15-20% slower.
Isn't the 6900XT/6950XT only 2%-7% slower than a 3090 but about 60% cheaper?
Where are you getting "15-20% slower" from? Is that number ray-tracing performance?
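For what it's worth, here's the back-of-the-envelope math both sides of that exchange are gesturing at, using only the MSRPs and the two disputed performance gaps quoted above (2-7% vs 15-20%); this is not new benchmark data, just arithmetic. At those MSRPs the 6900XT is a third cheaper, not 60% (street pricing may have been another story).

```python
# Price/performance arithmetic using the MSRPs and performance-gap claims above.
# No new benchmark data; the gaps are the two figures being argued about.
msrp_3090, msrp_6900xt = 1500, 1000

discount = 1 - msrp_6900xt / msrp_3090
print(f"6900XT is {discount:.0%} cheaper than a 3090 at MSRP")  # a third, not 60%

for gap in (0.02, 0.07, 0.15, 0.20):
    # Relative performance per dollar of the 6900XT vs the 3090 at MSRP.
    value_ratio = (1 - gap) * msrp_3090 / msrp_6900xt
    print(f"if it's {gap:.0%} slower, it delivers about {value_ratio:.2f}x the perf per dollar")

# Output:
# 6900XT is 33% cheaper than a 3090 at MSRP
# if it's 2% slower, it delivers about 1.47x the perf per dollar
# if it's 7% slower, it delivers about 1.40x the perf per dollar
# if it's 15% slower, it delivers about 1.27x the perf per dollar
# if it's 20% slower, it delivers about 1.20x the perf per dollar
```

Either way you slice the performance gap, the 6900XT comes out ahead on perf per dollar at MSRP; the argument is really about how big that edge is.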
That can give a better idea, I think: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3090-vs-AMD-RX-6900-XT/4081vs4091
It doesn't say explicitly, but I believe that does take ray tracing into account.
Nope, but I do now.
I thought we all knew better by now than to use userbenchmark as a source.
Userbenchmark? Not really reliable or accurate.
I've never heard anyone say anything about it until now. Educate me. Is it based on the way they calculate, or because it's an aggregate from a bunch of people with non-constant specs, etc.?
Well, their score aggregation is just broken, how their scores are weighted has changed over the years, and they are known for extreme bias towards Intel, for example, which factors into that.
Understood. Thanks for that.
Edit: To add to that, the aggregate percentage differences in performance between two parts are largely bullshit in reality. Looking at actual benchmarks, a 6900XT and a 3090 are very comparable to each other; depending on resolution and the game/engine, one will win over the other and vice versa. So to just say a 6900XT is "15-20% slower" grossly misrepresents the performance difference between the two and is just flat-out wrong.
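To make the weighting point concrete, here's a generic sketch (made-up numbers, and emphatically not UserBenchmark's actual formula, which isn't spelled out anywhere in this thread): the exact same per-test results can produce opposite "effective speed" rankings depending on how the subscores are weighted.

```python
# How subscore weighting alone can flip an aggregate ranking.
# The per-resolution results and the weights below are hypothetical.
def aggregate(subscores, weights):
    """Weighted average of per-test subscores."""
    return sum(s * w for s, w in zip(subscores, weights)) / sum(weights)

#          1080p  1440p  4K    (relative results, made up for illustration)
card_a = [  105,   100,  92]   # stronger at low resolution
card_b = [   98,   100, 104]   # stronger at high resolution

even   = [1, 1, 1]
skewed = [3, 1, 1]             # an aggregate that leans heavily on 1080p

print(aggregate(card_a, even),   aggregate(card_b, even))    # 99.0 vs ~100.7 -> B "wins"
print(aggregate(card_a, skewed), aggregate(card_b, skewed))  # 101.4 vs 99.6  -> A "wins"
```

Same underlying numbers, different headline winner, which is why per-game benchmarks at your own resolution tell you more than any single aggregate score.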
Honestly? Look for comparisons for games you actually play. A lot of people love to cite the HWUB 50-game averages, but that's still just an average.
What are some better sites to find more accurate benchmarks? That way I'm not inadvertently searching for incorrect information.
https://www.techpowerup.com/gpu-specs/geforce-rtx-3090.c3622
If it makes you feel any better, I just picked up a 3090 from a friend at a fire-sale price because he didn't want to deal with the risk of selling on eBay. ...This is after I sold my 3080 last week to wait for the 4xxx or 7xxx cards. I'll likely be using a Turing card for a while now and will hold off on a new-gen card.
Yesterday, something occurred that will ensure that RDNA3 will be a revolutionary leap ahead in performance: I ordered an RX6700XT.
Yep...since I finally committed to a new card, it guarantees that the NEXT card will be the one I should've waited for. Sigh....
This is what a lot of people get overly worried about. It's not like your card performs any worse. If it drives your current display/games, then it will continue to do so. Hell, it seems like only a few games worth playing really push GPU requirements each generation. Even then, DLSS/FSR will shore that up.
Thing is, your current gen card is by no means bad simply because a new gen is released. It performs exactly like it did yesterday.
I have no reason to want performance above a 3090, and certainly not for the premium I expect the new gen to fetch. There's not a single game out there where I don't get >60 FPS with more than respectable IQ, and more likely much better/higher than I need.
Unless AMD can at least match NVIDIA at ray tracing performance, I won't even consider them.
My 3090 is the first card I've had since the 8800GTX where I don't feel the need to upgrade right away with the next generation. I may be convinced to if ray tracing performance doubles again, as some games can still struggle with ray tracing even with DLSS. I still don't have the itch for a new video card at this point.
It will likely be another year or two before software pushes me to upgrade.
C'mon man, lower settings and boost that frame rate; 35fps is garbage.
I just bought a 3060 because I couldn't live with an integrated GPU. I wanted to play some MMORPGs like WoW and edit videos. Found one for €400, which is more expensive than usual, but I wanted a white card for my all-white build.
Well, color me surprised, but this 3060 is able to run everything at 4K with maxed-out visuals at acceptable framerates. New World runs at 35 fps maxed out. It's not ideal, but it's the most demanding game I have, and I almost never play it.
I'll be fine until next-gen is out and is affordable. Which is probably early on!
Pretty much. Chugging at 35fps is not something I'd use as an example of "look what it can do".
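Since the disagreement is really about how far 35fps is from comfortable, here's the frame-time arithmetic behind it (plain math, not a measurement of that particular 3060):

```python
# Frame-time math for the 35 fps vs 60 fps disagreement above. Pure arithmetic,
# not a measurement of any particular card.
for fps in (35, 60, 90):
    frame_time_ms = 1000 / fps
    print(f"{fps} fps = {frame_time_ms:.1f} ms per frame")

headroom = 60 / 35 - 1
print(f"getting from 35 to 60 fps needs about {headroom:.0%} more performance")

# Output:
# 35 fps = 28.6 ms per frame
# 60 fps = 16.7 ms per frame
# 90 fps = 11.1 ms per frame
# getting from 35 to 60 fps needs about 71% more performance
```

Which is why "just lower some settings" usually means stepping down from maxed-out to high or leaning on DLSS/FSR, not a small tweak.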
Really not my experience with a 3070. I can't even try to play Cyberpunk maxed out at 4K without DLSS on; it drops to 30 before ray tracing and sits around 35 with RT Ultra and DLSS Performance.