RTX 4xxx / RX 7xxx speculation

Nebell

2[H]4U
Joined
Jul 20, 2015
Messages
2,124
I'm more about aesthetics nowadays than top performance. It's more fun to build something special than just throw parts together and call it a day.
I have built basically everything there is except an LN2 cooling system. From ITX to tower ATX to a hardline custom-loop box that cost me over €5000...
And I even thought I could live with a low powered laptop for net/image editing + PS5 as my gaming machine. That didn't last. I *need* something to work on, customize and overclock.

But yeah, that system is not top of the line when it comes to CPU/GPU. It cost me nearly €3000 without a GPU, and that's with a 12600k (which is also supposed to be a temporary CPU until 13th gen i9 is out).
 

TaintedSquirrel

[H]F Junkie
Joined
Aug 5, 2013
Messages
11,281
My tower is around the side of my desk on the floor. I can't even see it unless I lean against the wall and peek over.
Otherwise I see it every 3-6 months when I pop it open for cleaning.

It's a box I use to play video games and shitpost on internet forums. Don't care what it looks like, in fact I prefer the opposite, it should neither be seen nor heard.
 

kalston

[H]ard|Gawd
Joined
Mar 10, 2011
Messages
1,343
My tower is around the side of my desk on the floor. I can't even see it unless I lean against the wall and peek over.
Otherwise I see it every 3-6 months when I pop it open for cleaning.

It's a box I use to play video games and shitpost on internet forums. Don't care what it looks like, in fact I prefer the opposite, it should neither be seen nor heard.

I keep my pc in a separate room (long-ish cables and fancy routing) from the one where my desk actually is, for the reasons listed. I don't want to see it, nor hear it. But I want it to perform, definitely.
 

NightReaver

[H]ard|Gawd
Joined
Apr 20, 2017
Messages
1,226
I keep my pc in a separate room (long-ish cables and fancy routing) from the one where my desk actually is, for the reasons listed. I don't want to see it, nor hear it. But I want it to perform, definitely.
Yup. Once the basement gets somewhat done, it's going down there to keep both itself and the PC room upstairs cooler.
 

Nebell

2[H]4U
Joined
Jul 20, 2015
Messages
2,124
I live in Sweden, so the heat is not a problem even during the summer.
The setup with its 15 fans (13 case and 2 GPU) is very quiet. I'd rather have many quiet fans than a few loud ones. And the looks fit well with the theme of my apartment.
I will never build a hardline box again. That's too much work and too much maintenance. Mini ITX is a fun thing to build, although quite frustrating as there's so little space and so much I want to cram into it. I can definitely see myself building a tiny tiny PC next time. Although in 5 or so years I might just get a laptop and call it a day.
I want high performance too, and even though my box is flashy, it's a high-end PC with DDR5 and a Z690 Formula. That 12600K runs at 5.1GHz all P-core and 4.1GHz all E-core. I did not cheap out on any of the components except those I want to upgrade later this year, like the CPU and GPU.
 
  • Like
Reactions: DFenz

DFenz

[H]ard|Gawd
Joined
Apr 3, 2014
Messages
1,300
I live in Sweden, so the heat is not a problem even during the summer.
The setup with its 15 fans (13 case and 2 GPU) is very quiet. I'd rather have many quiet fans than a few loud ones. And the looks fit well with the theme of my apartment.
I will never build a hardline box again. That's too much work and too much maintenance. Mini ITX is a fun thing to build, although quite frustrating as there's so little space and so much I want to cram into it. I can definitely see myself building a tiny tiny PC next time. Although in 5 or so years I might just get a laptop and call it a day.
I want high performance too, and even though my box is flashy, it's a high-end PC with DDR5 and a Z690 Formula. That 12600K runs at 5.1GHz all P-core and 4.1GHz all E-core. I did not cheap out on any of the components except those I want to upgrade later this year, like the CPU and GPU.
I've been building smaller and smaller every build. It's really satisfying.
 

Deleted member 289973

Guest
mITX cases have come a long way over the years. I've found them to be quite versatile now, particularly when using full-size GPUs. My first build was a mITX with a GTX 970 about six years ago. When it comes to visibility, I'm split. I can see both sides of the coin here; my preference is something that looks cool and isn't totally plain (because my PCs always sit on my desk) but doesn't have to be themed or flashy with over-the-top RGB, etc., focusing instead primarily on optimum performance.
 
  • Like
Reactions: DFenz

Nebell

2[H]4U
Joined
Jul 20, 2015
Messages
2,124
mITX cases have come a long way over the years. I've found them to be quite versatile now, particularly when using full-size GPUs. My first build was a mITX with a GTX 970 about six years ago. When it comes to visibility, I'm split. I can see both sides of the coin here; my preference is something that looks cool and isn't totally plain (because my PCs always sit on my desk) but doesn't have to be themed or flashy with over-the-top RGB, etc., focusing instead primarily on optimum performance.

I know there are ITX cases where you can put a 360 CPU AIO and another 240 radiator (I'd probably get one of those high-end AIO 240 GPUs) and a few fans.
I've seen some people do custom watercooling (hardline!) in ITX and it's just amazing they have the patience for that, lol. Computer building can be art.
 

Deleted member 289973

Guest
I know there are ITX cases where you can put a 360 CPU AIO and another 240 radiator (I'd probably get one of those high-end AIO 240 GPUs) and a few fans.
I've seen some people do custom watercooling (hardline!) in ITX and it's just amazing they have the patience for that, lol. Computer building can be art.
You should check out some of the Meshlicious custom loops. I've seen lots of crazy cool custom hardline loops, and the Meshlicious is nice because it fits most full-size GPUs without trouble. I fully agree that computer building is an art -- and for aesthetics with flashy RGB or completely plain matte black or anything in between, it all comes down to preference.
I'm waiting for the Meshroom S, the successor to the Meshlicious, to come out in the next several weeks, and I'll be looking at a custom loop for it along with upgrading my GPU. If I do custom, it's much more likely to be a soft line, but these can still be done very cleanly.

Back to the main topic, has there been any real discussion or info on roughly how long or thick the cards will be? I'd love to imagine that as technology evolves the most powerful cards won't need four GPU slots, and even some mid-range cards might need only two instead of three, along with being a bit shorter, but in reality I don't know how viable that is.
 
Joined
Oct 12, 2020
Messages
544
Back to the main topic, has there been any real discussion or info on roughly how long or thick the cards will be? I'd love to imagine that as technology evolves the most powerful cards won't need four GPU slots, and even some mid-range cards might need only two instead of three, along with being a bit shorter, but in reality I don't know how viable that is.
Not sure, but my guess is that if the power draw rumors are true on the top-end cards and we are looking at 400-450W TDP, then they are only going to get bigger. If the rest of the stack stays pretty normal on power draw (under 250W TDP), then I'd imagine they'll stick to fairly normal 2-slot designs.
 

Dayaks

[H]F Junkie
Joined
Feb 22, 2012
Messages
8,964
Not sure, but my guess is that if the power draw rumors are true on the top-end cards and we are looking at 400-450W TDP, then they are only going to get bigger. If the rest of the stack stays pretty normal on power draw (under 250W TDP), then I'd imagine they'll stick to fairly normal 2-slot designs.
At the end of the day we'll have cards from 75W and up that will perform way better than last gen, like always.

Not sure why everyone gets their panties in a twist. As long as the same-wattage card this gen is 50-100% faster than last gen, who cares? Buy the wattage you are comfortable with. Being able to buy a beast of a card without having to do multi-GPU was the dream for decades... people are acting like *only* a 600W card will be offered. - Just a general rant.
 
Joined
Oct 12, 2020
Messages
544
At the end of the day we'll have cards from 75W and up that will perform way better than last gen, like always.

Not sure why everyone gets their panties in a twist. As long as the same-wattage card this gen is 50-100% faster than last gen, who cares? Buy the wattage you are comfortable with. Being able to buy a beast of a card without having to do multi-GPU was the dream for decades... people are acting like *only* a 600W card will be offered. - Just a general rant.
Oh I agree, most mainstream cards should be fine. It'll just be the top end that might push past the boundaries we've traditionally had.
 

Wat

Limp Gawd
Joined
Jun 23, 2019
Messages
397
I'll take one no matter how thick it is. Its worth it in the end.
 

kalston

[H]ard|Gawd
Joined
Mar 10, 2011
Messages
1,343
At the end of the day we'll have cards from 75W and up that will perform way better than last gen, like always.

Not sure why everyone gets their panties in a twist. As long as the same-wattage card this gen is 50-100% faster than last gen, who cares? Buy the wattage you are comfortable with. Being able to buy a beast of a card without having to do multi-GPU was the dream for decades... people are acting like *only* a 600W card will be offered. - Just a general rant.

Yes, I agree. I don't want a 500-600W card, but I am not complaining: next gen will still improve performance at whatever wattage I am fine with, and that's all that matters at the end of the day. I also appreciate the fact that this power consumption gives us something in return now, unlike multi-GPU, which did not always work, or often worked poorly.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
2,362
Not sure why everyone gets their panties in a twist. As long as the same-wattage card this gen is 50-100% faster than last gen, who cares? Buy the wattage you are comfortable with. Being able to buy a beast of a card without having to do multi-GPU was the dream for decades... people are acting like *only* a 600W card will be offered. - Just a general rant.
I think it is because some people want to have the strongest video card out there, or close to it. It is mostly illogical, but there is something reassuring, brain-wise, about being confident you can simply set your games to very high, which comes with having the best, I suppose.

So if that card is too hot or too pricey for them, they become a bit angry regardless of what their own power/price envelope looks like (just a little bit, and they express it in a hyperbolic way online).
 

Wat

Limp Gawd
Joined
Jun 23, 2019
Messages
397
Just because a company puts out an extreme product doesn't mean you are obligated to buy it.

If someone feels angst because they can't buy the most overpriced, ridiculous card, the marketing department owns them. They have become the mindless consumer every company dreams about.
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
5,026
Some more rumours dropping (and I quote):

RTX 4090, AD102-300, 16384FP32, 384bit 21Gbps 24G GDDR6X, 450W

RTX 4080, AD103-300, 10240FP32, 256bit (?18Gbps 16G GDDR6?) 420W(?)

RTX 4070, AD104-275, 7168FP32, 160bit 18Gbps GDDR6 10G. 300W

DO NOT expect a lower MSRP.

https://twitter.com/kopite7kimi/status/1539853156275761152
A few impressions, if correct:

- Wattage is outlandish. They either didn't get the uplift they were expecting per core, or they know RDNA3 is going to be a beast. Either way, they're pushing power past a reasonable limit (remember when we thought the GTX 480 at 375W was a barn burner?)

- Huge divide in CUs between 4090 and 4080. nV is giving themselves breathing room for more upper level SKUs.

- 160-bit bus for the 4070?! Are they targeting the 1024x768 market? (I understand they'll compensate with mem speed and caching, but that's downright anemic)

- No kidding on the MSRP. We're in for pain on pricing: pucker up buttercups.
 

Armenius

Fully [H]
Joined
Jan 28, 2014
Messages
32,500
A few impressions, if correct:

- Wattage is outlandish. They either didn't get the uplift they were expecting per core, or they know RDNA3 is going to be a beast. Either way, they're pushing power past a reasonable limit (remember when we thought the GTX 480 at 375W was a barn burner?)

- Huge divide in CUs between 4090 and 4080. nV is giving themselves breathing room for more upper level SKUs.

- 160-bit bus for the 4070?! Are they targeting the 1024x768 market? (I understand they'll compensate with mem speed and caching, but that's downright anemic)

- No kidding on the MSRP. We're in for pain on pricing: pucker up buttercups.
- The PCI-E 5.0 specification allows for up to 600W over the 16-pin auxiliary power cable, so power is well within spec. TDP of the GTX 480 was 250W. Shocking for the time, considering we were coming from an era of 150-200W cards. 250W, however, became the standard for the top card going forward. Don't forget the TDP of the R9 290X was 290W.

- Looks like it based on the chip designation. NVIDIA may just want to differentiate the models more, or try to keep the mid-high range at a more reasonable price.

- 160-bit is due to the number of memory chips. Each GDDR6 chip has a 32-bit interface. Five 16Gb (2GB) chips make 10GB of VRAM, and 5 * 32 bits = 160 bits (quick sanity check in the sketch below).

- Given the premium on fab space these days it's not surprising.
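
To make the chip math concrete, here's a minimal sketch in Python (assuming standard 32-bit, 16Gb GDDR6 parts and the 18Gbps rate from the rumor above; none of this is confirmed spec):

# Derive bus width, VRAM size, and peak bandwidth from GDDR6 chip count.
# Assumes standard parts: a 32-bit interface and 16Gb (2GB) per chip.
CHIP_BUS_BITS = 32
CHIP_CAPACITY_GB = 2

def memory_config(num_chips, data_rate_gbps):
    bus_bits = num_chips * CHIP_BUS_BITS
    vram_gb = num_chips * CHIP_CAPACITY_GB
    # Peak bandwidth (GB/s) = bus width in bytes * per-pin data rate (Gbps)
    bandwidth_gbs = bus_bits / 8 * data_rate_gbps
    return bus_bits, vram_gb, bandwidth_gbs

# Rumored 4070: five chips at 18Gbps
print(memory_config(5, 18.0))  # (160, 10, 360.0) -> 160-bit, 10GB, 360 GB/s
# For reference, a 256-bit bus at 14Gbps (3070-style) works out to 448 GB/s.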
 

Deleted member 289973

Guest
I may end up going to AMD or sticking with the 3000s if what has been said is correct or close to it. 300W for a 10GB card in this day and age seems like a big waste (referring to the 4070). There are probably things that will 'buff' the card a bit, but the picture doesn't make sense to me.
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
5,026
- The PCI-E 5.0 specification allows for up to 600W over the 16-pin auxiliary power cable, so power is well within spec. TDP of the GTX 480 was 250W. Shocking for the time, considering we were coming from an era of 150-200W cards. 250W, however, became the standard for the top card going forward. Don't forget the TDP of the R9 290X was 290W.

- Looks like it based on the chip designation. NVIDIA may just want to differentiate the models more, or try to keep the mid-high range at a more reasonable price.

- 160-bit is due to the number of memory chips. Each GDDR6 chip has a 32-bit interface. Five 16Gb (2GB) chips make 10GB of VRAM, and 5 * 32 bits = 160 bits.

- Given the premium on fab space these days it's not surprising.
While the specification can handle it, 450w to run a GPU at load is bananas.

That's my thought as well: nV is trying to broaden the divide between their launch models so they have more wiggle room to introduce mid-cycle chips. I expect a myriad of chip versions this round, all formulaically designed to segment the market further and get people to step up to "newer" cards.

Good point on the chip-to-bus-size relation. It does sound like the 4070 will have 10GB of VRAM.

The price increase we'll see on MSRP will not be explainable by, or proportional to, increased fab cost. nV will once again be pushing the envelope on recouping value.
 

reaper12

2[H]4U
Joined
Oct 21, 2006
Messages
2,663
- Wattage is outlandish. They either didn't get the uplift they were expecting per core, or they know RDNA3 is going to be a beast. Either way, they're pushing power past a reasonable limit (remember when we thought the GTX 480 at 375W was a barn burner?)

AMD's RDNA 3 cards are rumoured to be in the 375W-450W region as well. And the source for the rumours harmattan listed above, kopite7kimi, said that he is disappointed with RDNA 3.

So who knows.
 
Joined
Oct 12, 2020
Messages
544
Some more rumours dropping (and I quote):

RTX 4090, AD102-300, 16384FP32, 384bit 21Gbps 24G GDDR6X, 450W

RTX 4080, AD103-300, 10240FP32, 256bit (?18Gbps 16G GDDR6?) 420W(?)

RTX 4070, AD104-275, 7168FP32, 160bit 18Gbps GDDR6 10G. 300W

DO NOT expect a lower MSRP.

https://twitter.com/kopite7kimi/status/1539853156275761152
I can see them putting the 80-class back on 256-bit, sure, but 160-bit for the 70 card? I don't buy that one at all. I'd rather they do the 4080 as a 320-bit 20GB card and the 4070 as a 256-bit 16GB card.

Agree on price. Has anyone been paying attention to prices of everything since the 30-series launch?
 

Deleted member 289973

Guest
I can see them putting the 80-class back on 256-bit, sure, but 160-bit for the 70 card? I don't buy that one at all. I'd rather they do the 4080 as a 320-bit 20GB card and the 4070 as a 256-bit 16GB card.

Agree on price. Has anyone been paying attention to prices of everything since the 30-series launch?
I was thinking about this as well. Why not 192-bit and 12GB for the 4070, then 256-bit and 16GB for the 4080 as listed, and then the potential 4060 that has been discussed somewhat could be the 'low end' at 128-bit and 8GB, leaving room for the Ti variants to fill the gaps (4060 Ti at 160-bit/10GB, 4080 Ti at 320-bit/20GB). The sketch below runs those numbers.
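
Running that hypothetical stack through the same per-chip math Armenius laid out above (my speculation, not leaked spec; 32-bit, 16Gb chips assumed, with the 18Gbps rate carried over from the rumor):

# Hypothetical lineup (speculation only), assuming 32-bit, 16Gb (2GB)
# GDDR6 chips at the rumored 18Gbps data rate.
CHIPS_PER_CARD = {"4060": 4, "4060 Ti": 5, "4070": 6, "4080": 8, "4080 Ti": 10}

for card, chips in CHIPS_PER_CARD.items():
    bus_bits = chips * 32
    vram_gb = chips * 2
    bandwidth_gbs = bus_bits / 8 * 18.0  # peak GB/s
    print(f"{card}: {bus_bits}-bit, {vram_gb}GB, {bandwidth_gbs:.0f} GB/s")
# 4060: 128-bit, 8GB, 288 GB/s ... 4070: 192-bit, 12GB, 432 GB/s
# 4080 Ti: 320-bit, 20GB, 720 GB/s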

I don't think, or at least I hope not, that the prices will be based on the hyperinflated GPU price craze we've seen over the last several months, but rather will take the global economic situation into consideration and bump up the MSRP based on the CPI and such. If they stay the same as they are now (let alone go lower), then I'll be incredibly surprised.
 
Joined
Oct 12, 2020
Messages
544
I was thinking about this as well. Why not 192-bit and 12GB for the 4070, then 256-bit and 16GB for the 4080 as listed, and then the potential 4060 that has been discussed somewhat could be the 'low end' at 128-bit and 8GB, leaving room for the Ti variants to fill the gaps (4060 Ti at 160-bit/10GB, 4080 Ti at 320-bit/20GB).

I don't think, or at least I hope not, that the prices will be based on the hyperinflated GPU price craze we've seen over the last several months, but rather will take the global economic situation into consideration and bump up the MSRP based on the CPI and such. If they stay the same as they are now (let alone go lower), then I'll be incredibly surprised.

Yeah, could see them going that route too on the 4080 and 4080 Ti, good point. Would be more in line with past gens. I still just don't think anything less than a 256-bit bus would be good for a 70-class card at a $500+ MSRP. A 192-bit card for $500 sounds awful.

More or less my commentary on price. I wouldn't expect a 4080 to MSRP for $699, but who knows.
 

DFenz

[H]ard|Gawd
Joined
Apr 3, 2014
Messages
1,300
If the rumored cache sizes are correct, the low memory bandwidth might not matter that much for games at least.
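
The intuition there: a big on-die cache serves a share of memory requests, so only misses touch the GDDR bus, and effective bandwidth scales roughly as raw / (1 - hit rate). A toy model (the hit rates below are purely illustrative, not measured figures):

# Toy effective-bandwidth model: cache hits are served on-die, so the
# GDDR bus only carries misses. Ignores cache bandwidth limits and write
# traffic, so treat it as an upper-bound intuition, not a benchmark.
def effective_bandwidth(raw_gbs, hit_rate):
    return raw_gbs / (1.0 - hit_rate)

raw = 160 / 8 * 18.0  # rumored 4070: 160-bit @ 18Gbps = 360 GB/s
for hit in (0.0, 0.3, 0.5):  # illustrative hit rates
    print(f"{hit:.0%} hit rate -> ~{effective_bandwidth(raw, hit):.0f} GB/s effective")
# 0% -> 360, 30% -> ~514, 50% -> 720 GB/s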
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
6,783
If stock is 450W, what will the AIBs have? 500W+? 600W+? In any case, anything over 350W for me will have to be water cooled. I am more interested in a decent monitor (good for 5+ years) than a new GPU at this time. If MSRP, or should I say the real price, is like last generation's for the first year, AMD and Nvidia may have a very different experience this time, with cards sitting on the shelf.
 

BassTek

Supreme [H]ardness
Joined
Jul 13, 2002
Messages
6,287
I'll probably go 4090 this time unless AMD matches NVDA at 4K for a cheaper price. Although I have a feeling that a 4090 Ti or Titan will follow with a decent performance bump, given the amount of room they are leaving available on the top end for future models.

Of course, every time I say I'm buying the flagship card at launch, I second-guess myself due to the price and end up getting a cheaper model.
 

Zahua

n00b
Joined
Jun 6, 2022
Messages
41
Any word on how much of an improvement in 4K performance we'll be seeing from the new generation of either card?
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
5,026
I'll probably go 4090 this time unless AMD matches NVDA at 4K for a cheaper price. Although I have a feeling that a 4090 Ti or Titan will follow with a decent performance bump, given the amount of room they are leaving available on the top end for future models.

Of course, every time I say I'm buying the flagship card at launch, I second-guess myself due to the price and end up getting a cheaper model.
More so than with any release I remember, going back all the way to my 9700 Pro, I don't see a single reason to move from my 3090. I'll likely sit back during this launch until the dust settles -- maybe pick up a new card in late 2023. A 4090 may bring a 50% performance uplift, but to me it's throwing money away at this point.

Unless there's a deluge of games coming out in the next year that are going to push the graphical envelope (read: there isn't), there's really no game that requires more horsepower.

Any word on how much of an improvement in 4K performance we'll be seeing from the new generation of either card?
No idea. It will be interesting to see if nV can compensate for the smaller mem bus size across the board with memory speeds and cache.
 

Andrew_Carr

2[H]4U
Joined
Feb 26, 2005
Messages
2,593
A few impressions, if correct:

- Wattage is outlandish. They either didn't get the uplift they were expecting per core, or they know RDNA3 is going to be a beast. Either way, they're pushing power past a reasonable limit (remember when we thought the GTX 480 at 375W was a barn burner?)

- Huge divide in CUs between 4090 and 4080. nV is giving themselves breathing room for more upper level SKUs.

- 160-bit bus for the 4070?! Are they targeting the 1024x768 market? (I understand they'll compensate with mem speed and caching, but that's downright anemic)

- No kidding on the MSRP. We're in for pain on pricing: pucker up buttercups.
The wattage seems in line with their HPC GPUs and their failure to design an MCM GPU. I'm guessing it's a way to stay competitive with what they see as future AMD cards having a major efficiency advantage (i.e., to match a 300W AMD card they predicted they'll need 500W). The bus nerf, I think, is partially a way to prevent miners from buying cheap gaming GPUs and to force them into buying higher-end stuff like the 4090 / HPC GPUs (kinda unnecessary at this point, but I guess the decision was made before the latest crypto crash). Not sure if RDNA3 will translate perfectly to gaming, but the MCM-based CDNA2 is already a beast and it seems like RDNA3 is mostly following that with a slight node advantage.

Both AMD's and Nvidia's new products seem kind of weird to me, I guess. They should be great for VR or productivity if you have money to burn, but for general gaming it seems like lower MSRPs or improved efficiency would've been more welcome (AMD looks like they'll deliver on efficiency gains, at least). It just seems like they're all targeting a small market that's about to collapse (crypto, recession, etc.).
 

Deleted member 289973

Guest
The wattage seems in line with their HPC GPUs and their failure to design an MCM GPU. I'm guessing it's a way to stay competitive with what they see as future AMD cards having a major efficiency advantage (i.e., to match a 300W AMD card they predicted they'll need 500W). The bus nerf, I think, is partially a way to prevent miners from buying cheap gaming GPUs and to force them into buying higher-end stuff like the 4090 / HPC GPUs (kinda unnecessary at this point, but I guess the decision was made before the latest crypto crash). Not sure if RDNA3 will translate perfectly to gaming, but the MCM-based CDNA2 is already a beast and it seems like RDNA3 is mostly following that with a slight node advantage.

Both AMD's and Nvidia's new products seem kind of weird to me, I guess. They should be great for VR or productivity if you have money to burn, but for general gaming it seems like lower MSRPs or improved efficiency would've been more welcome (AMD looks like they'll deliver on efficiency gains, at least). It just seems like they're all targeting a small market that's about to collapse (crypto, recession, etc.).
I'm waiting to hear more about AMD's upcoming line because Nvidia so far has disappointed me. I could see myself getting a 4070 IF the specs are higher than the 160/10GB and the wattage isn't absurd. It's more likely that I'll stick with the 3000s and go for a 3080 to upgrade the 3060 I have now.
 

Andrew_Carr

2[H]4U
Joined
Feb 26, 2005
Messages
2,593
I'm waiting to hear more about AMD's upcoming line because Nvidia so far has disappointed me. I could see myself getting a 4070 IF the specs are higher than the 160/10GB and the wattage isn't absurd. It's more likely that I'll stick with the 3000s and go for a 3080 to upgrade the 3060 I have now.
I'm hopeful for it but kinda curious how well MCM will work in gaming. I'm worried that AMD's drivers team + SLI/crossfire type goblins will cause issues that'll mar the raw performance & efficiency gains.
 

Deleted member 289973

Guest
I'm hopeful for it but kinda curious how well MCM will work in gaming. I'm worried that AMD's drivers team + SLI/crossfire type goblins will cause issues that'll mar the raw performance & efficiency gains.
Yeah, I don't know enough about AMD's GPUs so I'd have to do some research. If what you said ends up being the case then I have no problem sticking with this gen's cards for a few more years.
 