Join us on November 3rd as we unveil AMD RDNA™ 3 to the world!

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,602
People keep showing stuff like this so you have to ask yourself, why would Nvidia crank it to 450w when it can get 90-95% of the performance at 70% of the power consumption.

It suggests to me that the AMD flagship was much too close for comfort for Nvidia, so they cranked everything to 11.
Not sure about 11; they seem to have left a lot of room (some OC Strix editions reach the 570-watt range, so the 600-watt edition talk seems to have been the plan at some point, but as this shows, the returns are already sharply diminishing).

Yes, as we saw, at least in some titles AMD comes close enough that those 5-6% gained at absurd power pay off for Nvidia (and if they built the cards for a giant power budget anyway, before knowing how efficient their own design would be or what AMD's plans were, they may as well use it). The general point was more that Lovelace does seem potentially quite efficient, so despite the talk around RDNA 3, it is far from certain that AMD will have an edge there, at least one that changes things. For AMD to reach parity on a worse node, with the overhead added by the chiplet interconnect, would be quite impressive already, even accounting for the memory difference.
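A quick back-of-the-envelope on the quoted figures (a minimal sketch; the 450 W, 90-95%, and 70% numbers are taken straight from the post above, not measured):

Code:
# Perf/watt implied by the quoted claim: ~90-95% of the performance
# at 70% of the power, for a 450 W card.
full_power_w = 450
power_ratio = 0.70
limited_power_w = full_power_w * power_ratio  # 315 W

for perf_retained in (0.90, 0.95):
    gain = perf_retained / power_ratio - 1
    print(f"{perf_retained:.0%} perf at {limited_power_w:.0f} W "
          f"-> {gain:.0%} better perf/watt")
# -> roughly 29-36% better perf/watt at ~315 W

In other words, that last 5-10% of performance costs roughly 30% of the perf/watt, which is exactly the diminishing return being described.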
 

Axman

[H]F Junkie
Joined
Jul 13, 2005
Messages
14,984
The issue they will have is simply zero experience with such designs.

That's what I keep coming back to. AMD threaded the needle and found technology to make -- let's be honest -- outdated, older nodes, perform damn near as well as the cutting edge, all while using less silicon* (edit silicone is boobs) and less power. Sure, at the added cost of construction. But that's something Intel and Nvidia will have to learn to do, too. Mixing and matching is the way forward.

And they have years of experience implementing it that Intel and Nvidia just can't even.

It's AMD 3D chess at this point. To further mix metaphors, they've turned a sow's ear into a silk semiconductor.
 

ChadD

Supreme [H]ardness
Joined
Feb 8, 2016
Messages
5,931
That's what I keep coming back to. AMD threaded the needle and found technology to make -- let's be honest -- outdated, older nodes, perform damn near as well as the cutting edge, all while using less silicone and less power. Sure, at the added cost of construction. But that's something Intel and Nvidia will have to learn to do, too. Mixing and matching is the way forward.

And they have years of experience implementing it that Intel and Nvidia just can't even.

It's AMD 3D chess at this point. To further mix metaphors, they've turned a sow's ear into a silk semiconductor.
Next few gens of CPUs and GPUs are going to be fun. Both Intel and Nvidia will be going that way as well. AMD will either reap the benefit of a lead... or perhaps lose it completely. I hope they make hay while they can the next 1-3 years in both CPU and GPU.
There is also always the possibility that NV and/or Intel fall on their face and have a generation that stinks. Nvidia sort of had that back with Volta... they added in tensor cores but the raster uplift wasn't there; it was so poor they didn't even bother releasing a consumer version. AMD allowed that, of course, by not really competing. I'm not sure Nvidia could get away with that again if their next gen doesn't best Lovelace. Same goes for Intel... they are obviously competing very well with AMD now, and they don't want another generation of stagnation. AMD beat them up just short of into the ground, yet despite 2-3 solid AMD win generations, it only really takes one good Intel generation for people to start saying "I knew Intel would bounce back." lol
 

Axman

[H]F Junkie
Joined
Jul 13, 2005
Messages
14,984
There is also always the possibility that NV and/or Intel fall on their face and have a generation that stinks.

This or they will reverse-engineer the tech and dump money into development and bury AMD, patents with or without standing. TSMC is a mercenary company, and will serve its bidders. Especially right now, political-climate-wise (not going into this -- if anyone reading this wants to talk politics, sub to GenMay, it's tits).

AMD is the corporate equivalent of Colin McRae. When in doubt, flat out. I think they're flat out on chiplets, whether they're GCD plus MCDs, along with stacking, as they've done this generation, or flat out on all chiplet options, which I believe they are, based on the dual-GCD stuff they either teased or "teased."

Jesus, I bet even the most die-hard Nvidia fanbois are pissed at, and proud of, AMD this round, because they want to beat AMD even harder. Instead, they got cards that are expensive, suck too much power, and sometimes catch fire. That's not fanboi comfort zone.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
7,285
DLSS 3.0 looks like trash with its interpolation. FSR's not bad; it basically does the same thing. And they announced FSR 3.0 will be dropping in 2023, which they say can give up to 2x FPS at 4K, and they've got this Hyper-RX thing coming too.

just nvidia thinking $1600 is a great price to charge for graphics cards is enough to "woo" me to AMD
Yeah, but their FSR 3 gets that 2x FPS increase thanks to their new "Motion Engine," aka their AI frame generator, just like Nvidia's Optical Flow. I can't imagine AMD's AI-generated frames will look any better, and it's bound to have the same latency issues as well.

The 7900 XT and the card formerly known as the 4080 12GB are the cards that interest me, but I'm still eagerly awaiting reviews.
 

tybert7

2[H]4U
Joined
Aug 23, 2007
Messages
2,763
I'm out of the loop. How bad is the ray tracing on RDNA 3? Can it at least match a 3090? And what about some of the more RT-heavy games like Control or Cyberpunk 2077?
 

tybert7

2[H]4U
Joined
Aug 23, 2007
Messages
2,763
Yeah, but their FSR 3 gets that 2x FPS increase thanks to their new "Motion Engine," aka their AI frame generator, just like Nvidia's Optical Flow. I can't imagine AMD's AI-generated frames will look any better, and it's bound to have the same latency issues as well.

The 7900 XT and the card formerly known as the 4080 12GB are the cards that interest me, but I'm still eagerly awaiting reviews.
Is AMD only doing a version of DLSS 3.0, or are they going to do the same style as DLSS 2.0 also?
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
7,285
I'm out of the loop. How bad is the ray tracing on RDNA 3? Can it at least match a 3090? And what about some of the more RT-heavy games like Control or Cyberpunk 2077?
Honestly, if you are at 1440p then the 3090 is more than good enough at ray tracing, but for all their talk of 4K and 8K I don't see how these are going to stack up there. 4K is maybe a 3% slice of the market, though, and that slice is handily going to Nvidia with the 4090 for now, so AMD can talk a big game there. I also think their "8K" is what most others I know would call 6K, which these will also do fine.

Honestly, I love ray tracing; it is the new thing, but it's still not totally here yet and won't be until the next major console refresh. Until the consoles can do ray tracing properly, the industry must be raster-first, ray-traced second.

The 7900 cards are what AMD needs: they are "cheap" to both produce and sell, they force Nvidia to respond, and they generate a lot of positive buzz for AMD on both the consumer and investor sides. Assuming AMD can get the chips out in good volume, they utterly ruin Nvidia's ability to sell through the 3000-series overstock, and that's a big deal for AMD.
 

funkydmunky

2[H]4U
Joined
Aug 28, 2008
Messages
3,332
When it comes to Nvidia, it's how it's usually been: if you want the best, you have to pay for the best. It's the reason I got a 3090 on release day. Same goes for the 4090 RTX. It is the fastest card and will still be once the 7900 XTX is released, imo.
If you have the $$ for the best, you pay for the best. The rest of the world does some sort of performance/price equation that makes sense in reality for them (wife/kids/bills, etc. Don't want to bore ya with reality for the 99%, ya).
[H] used to be THE place for the most from the least. Not the most from the most with effort ZERO!
Enjoy your card, bruh
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,602
I'm out of the loop. How bad is the ray tracing on RDNA 3? Can it at least match a 3090? And what about some of the more RT-heavy games like Control or Cyberpunk 2077?
Depends what we mean by that, but if you mean "should a card as strong as the 7900 XTX beat a 3090 with RT enabled in some games," then yes; it looks like a possibly mixed result in the more RT-heavy titles, though:

Ratio over the 6950 XT at 4K with RT on, according to AMD (https://www.dsogaming.com/news/amd-...on-rx-7900xtx-prices-dates-gaming-benchmarks/):
Resident Evil: 1.5x
Metro Exodus: 1.5x
Doom Eternal: 1.6x

According to https://www.techpowerup.com/review/amd-radeon-rx-6950-xt-reference-design/32.html, the 3090 was over the 6950 XT:
Resident Evil: 1.06x
Metro Exodus: 1.44x
Doom Eternal: 1.41x

So maybe not by much, if at all, in something like Control or Cyberpunk, where the 3090 was 60-65% higher than a 6950 XT.
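Composing the two sources' ratios gives a rough implied standing (a back-of-the-envelope sketch; it assumes the two sites' scenes and settings are comparable enough for the ratios to multiply through, which is far from guaranteed):

Code:
# Implied 7900 XTX vs 3090 at 4K with RT on, combining AMD's claimed
# uplift over the 6950 XT with TechPowerUp's 3090-over-6950 XT ratios.
amd_uplift = {"Resident Evil": 1.5, "Metro Exodus": 1.5, "Doom Eternal": 1.6}
rtx3090_over_6950xt = {"Resident Evil": 1.06, "Metro Exodus": 1.44, "Doom Eternal": 1.41}

for title, uplift in amd_uplift.items():
    implied = uplift / rtx3090_over_6950xt[title]
    print(f"{title}: 7900 XTX ~{implied:.2f}x a 3090")
# Resident Evil ~1.42x, Metro Exodus ~1.04x, Doom Eternal ~1.13x.
# In Control/Cyberpunk the 3090 sits at ~1.60-1.65x the 6950 XT, so the
# 7900 XTX would need a ~1.6x uplift there just to tie.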
 

tybert7

2[H]4U
Joined
Aug 23, 2007
Messages
2,763
Honestly, if you are at 1440p then the 3090 is more than good enough at ray tracing, but for all their talk of 4K and 8K I don't see how these are going to stack up there. 4K is maybe a 3% slice of the market, though, and that slice is handily going to Nvidia with the 4090 for now, so AMD can talk a big game there. I also think their "8K" is what most others I know would call 6K, which these will also do fine.

Honestly, I love ray tracing; it is the new thing, but it's still not totally here yet and won't be until the next major console refresh. Until the consoles can do ray tracing properly, the industry must be raster-first, ray-traced second.

The 7900 cards are what AMD needs: they are "cheap" to both produce and sell, they force Nvidia to respond, and they generate a lot of positive buzz for AMD on both the consumer and investor sides. Assuming AMD can get the chips out in good volume, they utterly ruin Nvidia's ability to sell through the 3000-series overstock, and that's a big deal for AMD.
I'll likely be getting the 7900 XTX either way, but I'm on a 4K OLED CX10 for my PC display, and after this I'm never going back to non-emissive displays. I am going to keep needing more performance to drive this screen, though. It looks like the 4090 would be the better card for my display, but I'm not going that route.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,602
No one with any sense. Not until we see reviews.
With how hard it seems to be to get these cards, and with how easy return policies tend to be (or how easy it is to resell them used), I'm not sure it's a bad strategy at all to pre-order one before reviews, if that option is available to you.
 

UnknownSouljer

Supreme [H]ardness
Joined
Sep 24, 2001
Messages
7,803
With how hard it seems to be to get these cards, and with how easy return policies tend to be (or how easy it is to resell them used), I'm not sure it's a bad strategy at all to pre-order one before reviews, if that option is available to you.
You can always play the return song and dance, but it's also generally not worth it. At the end of the day, whether you're looking to flip the card or use it, when you pre-order you're just gambling, hoping it's worth your time and cash, because ultimately you don't know what it is you're even buying.
I'd rather play politics and not reward companies for inferior products. I'm on team: preorder nothing, ever, for any reason; all companies are here to serve me and not the other way around, and they all could just as soon die in a fire - forever.
If no one ever preordered, companies would have to deliver the goods all the time, and we'd have a lot less bullshit. I can't control what anyone else does, but I'm out, and naturally I think wisdom is on my side. Agree or not, that's up to you.
 

TaintedSquirrel

[H]F Junkie
Joined
Aug 5, 2013
Messages
11,818
So this means the 7900 XT would've been around 50% faster than the 4080 12 GB for the same price.
 

jobert

[H]ard|Gawd
Joined
Dec 13, 2020
Messages
1,244
So this means the 7900 XT would've been around 50% faster than the 4080 12 GB for the same price.
I still don't see how they are going to keep the 4080 16GB at $1199 compared to the 7900 XTX. And given how shitty Nvidia is with slower CPUs, I have no doubt the 7900 XTX will match or beat the 4090 with some of the CPUs people are actually using. I think at 1440p the 7900 XTX is going to be faster than the 4090 in many if not most games, even with a top-end CPU. On TechPowerUp the 4090 is only 30% faster than the 6950 XT at 1440p.

Everything changes with ray tracing though, as AMD is WAY behind. Nvidia should be over 50% faster in most cases, and even 75% faster in some games.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
7,285
I still don't see how they are going to keep the 4080 16GB at $1199 compared to the 7900 XTX. And given how shitty Nvidia is with slower CPUs, I have no doubt the 7900 XTX will match or beat the 4090 with some of the CPUs people are actually using. I think at 1440p the 7900 XTX is going to be faster than the 4090 in many if not most games, even with a top-end CPU. On TechPowerUp the 4090 is only 30% faster than the 6950 XT.

Everything changes with ray tracing though, as AMD is WAY behind.
At 1440p, either the 7900 XTX or the 4090 will leave you CPU-bottlenecked, even with a 13900K. So no difference between them at 1440p; they are 4K cards through and through.
 

jobert

[H]ard|Gawd
Joined
Dec 13, 2020
Messages
1,244
At 1440p, either the 7900 XTX or the 4090 will leave you CPU-bottlenecked, even with a 13900K. So no difference between them at 1440p; they are 4K cards through and through.
Not anywhere near as much on AMD, if you actually look at how AMD has been scaling at lower resolutions compared to Nvidia. The last two generations of top-end Nvidia cards do poorly at 1440p relative to AMD because Nvidia does its scheduling in software, which piles overhead on the CPU. There are cases where an AMD card that is not even half as fast as an Nvidia card at 4K can match or beat it at lower resolutions without a top-end CPU. Of course, it can vary wildly depending on the game.
 

auntjemima

[H]ard DCOTM x2
Joined
Mar 1, 2014
Messages
11,267
If you have the $$ for the best, you pay for the best. The rest of the world does some sort of performance/price equation that makes sense in reality for them (wife/kids/bills, etc. Don't want to bore ya with reality for the 99%, ya).
[H] used to be THE place for the most from the least. Not the most from the most with effort ZERO!
Enjoy your card, bruh
I like how you're telling Brackle how the forums "used to be" and then calling him "bruh".

Would read again. 10/10.
 

zehoo

Limp Gawd
Joined
Aug 22, 2004
Messages
460
Depending on how reviews pan out, I might replace my 6700 XT, which I bought at MSRP since I couldn't get a better card at non-scalper prices.
 

Domingo

Fully [H]
Joined
Jul 30, 2004
Messages
21,756
Maybe a dumb question, but what's the HDMI audio-out like on AMD cards these days? Do they support 7.1, Atmos, DTS:X, etc.? What about instant-on audio? My last experience with an AMD card was 5 years ago, and all of those things were issues. It didn't support all of the standard formats, and there was a roughly 1-2 second delay with all new audio sources.
 

scottypippin

Weaksauce
Joined
Feb 28, 2022
Messages
75
that moment when you find out that Ubisoft is FINALLY going to release a new Splinter Cell game....via an AMD event...
 

GDI Lord

Limp Gawd
Joined
Jan 14, 2017
Messages
263
Maybe a dumb question, but what's the HDMI audio-out like on AMD cards these days? Do they support 7.1, Atmos, DTS:X, etc.? What about instant-on audio? My last experience with an AMD card was 5 years ago, and all of those things were issues. It didn't support all of the standard formats, and there was a roughly 1-2 second delay with all new audio sources.
I'm happily using audio via HDMI on my Samsung QN90B from my RX 6800 XT. I also occasionally plug in a generic "ECCO" brand TV to watch streaming via PC in bed.

HTH.
 

Domingo

Fully [H]
Joined
Jul 30, 2004
Messages
21,756

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
3,602
In a universe just a few months different, these seem like they would have been one hell of a set of mining cards.
 

GDI Lord

Limp Gawd
Joined
Jan 14, 2017
Messages
263
*deer in headlights look*
I don't even know what you mean. It games great and streams movies great. Audio great.

It also Words and Excels and Visual Studios great. ;-)

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
7,285
*deer in headlights look*
I don't even know what you mean. It games great and streams movies great. Audio great.

It also Words and Excels and Visual Studios great. ;-)
There was an older installer-package issue where it wouldn't install components if the hardware wasn't detected at the time of installation. So if you didn't have your equipment all plugged in when you installed the drivers, it would leave things out, and you would have to do a full uninstall and reinstall of the driver package on site once things were all hooked up.
It made batch deployments a serious PITA. AMD has since corrected the issue, but it went on for a full year or so before they got around to fixing it.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
7,285
Rumored AIB models with higher voltage options should be about 15-20 percent faster:

The architecture was designed to run at 3GHz.

Well, AMD has to leave some room for the PowerColors and the rest of their AIBs' Spooge Red Devil edition cards.
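For what it's worth, the arithmetic on that rumor roughly checks out (a quick sketch; it assumes performance scales linearly with clock, which is optimistic, and takes the 7900 XTX's advertised ~2.5 GHz boost clock as the baseline):

Code:
# Sanity check on the "15-20 percent faster" AIB rumor: if the silicon
# was designed for 3 GHz and the reference boost is ~2.5 GHz, linear
# clock scaling puts the ceiling right at the top of that range.
reference_boost_ghz = 2.5
rumored_design_ghz = 3.0

uplift = rumored_design_ghz / reference_boost_ghz - 1
print(f"Implied uplift at 3 GHz: {uplift:.0%}")  # -> 20%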
 