He spent it all on the case and RGB.
To each their own. I'd stop caring about theming the moment it prevents me from getting the level of performance I'm aiming for.
But that's just me.
My tower is around the side of my desk on the floor. I can't even see it unless I lean against the wall and peek over.
Otherwise I see it every 3-6 months when I pop it open for cleaning.
It's a box I use to play video games and shitpost on internet forums. Don't care what it looks like, in fact I prefer the opposite, it should neither be seen nor heard.
Yup. Once the basement gets somewhat done, it's going down there to keep both itself and the PC room upstairs cooler.
I keep my PC in a separate room (long-ish cables and fancy routing) from the one where my desk actually is, for the reasons listed. I don't want to see it, nor hear it. But I want it to perform, definitely.
I've been building smaller and smaller every build. It's really satisfying.
I live in Sweden, the heat is not a problem even during the summer.
The setup with its 15 fans (13 case and 2 GPU) is very silent. I'd rather have many silent fans than a few loud ones. And the looks fit well with the theme of my apartment.
I will never build a hardline box again. That's too much work and too much maintenance. Mini ITX is a fun thing to build, although quite frustrating, as there's so little space and so much I want to cram into it. I can definitely see myself building a tiny tiny PC next time. Although in five or so years I might just get a laptop and call it a day.
I want high performance too, and even though my box is flashy, it's a high-end PC with DDR5 and a Z690 Formula. That 12600K is at 5.1 GHz all P-core and 4.1 GHz all E-core. I did not cheap out on any of the components except those I want to upgrade later this year, like the CPU and GPU.
mITX cases have come quite a way over the years. I've found them to be quite versatile now, particularly when using full-size GPUs. My first build was mITX with a GTX 970 about six years ago. When it comes to visibility, I'm split. I can see both sides of the coin here: my preference is something that looks cool and isn't totally plain (because my PCs always sit on my desk) but doesn't have to be themed or flashy with over-the-top RGB, etc.; instead I focus primarily on optimum performance.
You should check out some of the Meshlicious custom loops. I've seen lots of crazy cool custom hardline loops, and the Meshlicious is nice because it fits most full-size GPUs without trouble. I fully agree that computer building is an art -- and for aesthetics, whether flashy RGB or completely plain matte black or anything in between, it all comes down to preference.
I know there are ITX cases where you can put a 360 CPU AIO and another 240 radiator (I'd probably get one of those high-end AIO 240 GPUs) and a few fans.
I've seen some people do custom watercooling (hardline!) in ITX and it's just amazing they have the patience for that, lol. Computer building can be art.
Not sure, but my guess is that if the power draw rumors are true on the top-end cards and we are looking at 400-450W TDP cards, then they are only going to get bigger. If the rest of the stack stays pretty normal on power draw (under 250W TDP), then I'd imagine they should stick to fairly normal 2-slot designs.
Back to the main topic, has there been any real discussion or info on roughly how long or thick the cards will be? I'd love to imagine that as technology evolves the most powerful cards won't need four slots, and even some mid-range cards might need only two instead of three, along with being a bit shorter, but in reality I don't know how viable this is.
At the end of the day we'll have cards from 75W+ that will perform way better than last gen, like always.
Not sure, but my guess is that if the power draw rumors are true on the top-end cards and we are looking at 400-450W TDP cards, then they are only going to get bigger. If the rest of the stack stays pretty normal on power draw (under 250W TDP), then I'd imagine they should stick to fairly normal 2-slot designs.
Oh I agree, most mainstream cards should be fine. It'll just be the top end that might push the boundaries we've traditionally had.
At the end of the day we'll have cards from 75W+ that will perform way better than last gen, like always.
Not sure why everyone gets their panties in a twist. As long as the same wattage card this gen is 50-100% faster than last gen who cares, buy the wattage you are comfortable with. Being able to buy a beast of a card and not having to do multiGPU was the dream for decades... people are acting like *only* a 600W card will be offered. - Just a general rant.
At the end of the day we'll have cards from 75W+ that will perform way better than last gen, like always.
I think it is because some people want to have the strongest (or close to the strongest) video card out there. It is mostly illogical, but there is something reassuring about knowing you can simply set games to very high, which comes with having the best, I suppose.
Not sure why everyone gets their panties in a twist. As long as the same wattage card this gen is 50-100% faster than last gen who cares, buy the wattage you are comfortable with. Being able to buy a beast of a card and not having to do multiGPU was the dream for decades... people are acting like *only* a 600W card will be offered. - Just a general rant.
A few impressions, if correct:
Some more rumours dropping (and I quote):
RTX 4090, AD102-300, 16384FP32, 384bit 21Gbps 24G GDDR6X, 450W
RTX 4080, AD103-300, 10240FP32, 256bit (?18Gbps 16G GDDR6?) 420W(?)
RTX 4070, AD104-275, 7168FP32, 160bit 18Gbps GDDR6 10G. 300W
DO NOT expect a lower MSRP.
https://twitter.com/kopite7kimi/status/1539853156275761152
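For rough context on what those rumored bus widths and data rates would mean: peak theoretical memory bandwidth is just the bus width in bits divided by 8, times the per-pin rate in Gbps. A quick sketch using the figures quoted above (rumors, not confirmed specs):

```python
# Peak-bandwidth math for the rumored configs above (rumored figures, not confirmed specs).
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s: (bus width / 8 bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

rumored = {
    "RTX 4090 (384-bit @ 21 Gbps)": (384, 21.0),
    "RTX 4080 (256-bit @ 18 Gbps)": (256, 18.0),
    "RTX 4070 (160-bit @ 18 Gbps)": (160, 18.0),
}

for name, (bus, rate) in rumored.items():
    print(f"{name}: {peak_bandwidth_gb_s(bus, rate):.0f} GB/s")
# 4090 ~1008 GB/s, 4080 ~576 GB/s, 4070 ~360 GB/s, before any cache effects
```

That ~360 GB/s figure for the rumored 4070 config is what the bus-width complaints further down are reacting to.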
- The PCI-E 5.0 specification allows for up to 600W over the 16-pin auxiliary power cable, so power is well within spec (quick power-budget math after this post). TDP of the GTX 480 was 250W. Shocking for the time, considering we were coming from an era of 150-200W cards. 250W, however, became the standard for the top card going forward. Don't forget the TDP of the R9 290X was 290W.
A few impressions, if correct:
- Wattage is outlandish. They either didn't get the uplift they were expecting per core, or they know RDNA3 is going to be a beast. Either way, they're pushing power past a reasonable limit (remember when we thought the GTX 480 at 375W was a barn burner?)
- Huge divide in CUs between 4090 and 4080. nV is giving themselves breathing room for more upper-level SKUs.
- 160-bit bus for the 4070?! Are they targeting the 1024x768 market? (I understand they'll compensate with memory speed and caching, but that's downright anemic)
- No kidding on the MSRP. We're in for pain on pricing: pucker up, buttercups.
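The quick math behind the "well within spec" point above: a PCIe x16 slot can supply up to 75W on its own, and the 16-pin connector is rated for up to 600W, so even a rumored 450W card leaves plenty of headroom. A minimal sketch (the 450W figure is the rumored board power from this thread, not a confirmed spec):

```python
# Power-budget sanity check: PCIe x16 slot (75 W) plus PCIe 5.0 16-pin connector (up to 600 W).
SLOT_POWER_W = 75
SIXTEEN_PIN_MAX_W = 600

rumored_board_power_w = 450  # rumored top-end figure from this thread
available_w = SLOT_POWER_W + SIXTEEN_PIN_MAX_W

print(f"Available: {available_w} W, rumored draw: {rumored_board_power_w} W, "
      f"headroom: {available_w - rumored_board_power_w} W")
# Available: 675 W, rumored draw: 450 W, headroom: 225 W
```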
While the specification can handle it, 450W to run a GPU at load is bananas.
- The PCI-E 5.0 specification allows for up to 600W over the 16-pin auxiliary power cable, so power is well within spec. TDP of the GTX 480 was 250W. Shocking for the time, considering we were coming from an era of 150-200W cards. 250W, however, became the standard for the top card going forward. Don't forget the TDP of the R9 290X was 290W.
- Looks like it based on the chip designation. NVIDIA may just want to differentiate the models more, or try to keep the mid-high range at a more reasonable price.
- 160-bit is due to the number of memory chips. Each GDDR6 chip has a 32-bit interface. Five 16Gb chips make 10GB of VRAM, and 5 * 32 bits = 160 bits (quick sketch after this list).
- Given the premium on fab space these days it's not surprising.
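A sketch of that chip-count arithmetic, assuming GDDR6 parts with a 32-bit interface and 16Gb (2GB) density, as in the rumored 4070 config:

```python
# Chip-count arithmetic: each GDDR6 package has a 32-bit interface; density given in Gbit.
def memory_config(num_chips: int, density_gbit: int) -> tuple[int, int]:
    """Return (bus width in bits, capacity in GB) for a GDDR6 chip count and per-chip density."""
    bus_width_bits = num_chips * 32
    capacity_gb = num_chips * density_gbit // 8  # Gbit per chip -> GB total
    return bus_width_bits, capacity_gb

for chips in (4, 5, 6, 8, 10, 12):
    bus, cap = memory_config(chips, 16)
    print(f"{chips} chips -> {bus}-bit, {cap} GB")
# 5 chips -> 160-bit, 10 GB (the rumored 4070); 6 -> 192-bit/12 GB; 8 -> 256-bit/16 GB;
# 10 -> 320-bit/20 GB; 12 -> 384-bit/24 GB
```

The same table covers the 192-bit/12GB and 320-bit/20GB configurations floated later in the thread.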
I can see them putting the 80-class back on 256-bit, sure, but 160-bit for the 70 card? I don't buy that one at all. I'd rather they do the 4080 as a 320-bit 20GB card and the 4070 as the 256-bit 16GB card.
Some more rumours dropping (and I quote):
RTX 4090, AD102-300, 16384FP32, 384bit 21Gbps 24G GDDR6X, 450W
RTX 4080, AD103-300, 10240FP32, 256bit (?18Gbps 16G GDDR6?) 420W(?)
RTX 4070, AD104-275, 7168FP32, 160bit 18Gbps GDDR6 10G. 300W
DO NOT expect a lower MSRP.
https://twitter.com/kopite7kimi/status/1539853156275761152
I was thinking about this as well. Why not 192-bit and 12GB for the 4070, then 256-bit and 16GB for the 4080 as listed, and then the potential 4060 that has been discussed somewhat could be the 'low end' with 128 bits and 8GB, leaving room for the Ti variants to fill the gaps (4060 Ti at 160/10, 4080 Ti at 320/20).
I can see them putting the 80-class back on 256-bit, sure, but 160-bit for the 70 card? I don't buy that one at all. I'd rather they do the 4080 as a 320-bit 20GB card and the 4070 as the 256-bit 16GB card.
Agree on price. Has anyone been paying attention to prices of everything since the 30-series launch?
I don't think (at least I hope not) that the prices will be based on the hyperinflated GPU price craze we've seen over the last several months; rather, they'll take the global economic situation into consideration and bump up the MSRP based on the CPI and such. If they stay the same as they are now (let alone go lower), I'll be incredibly surprised.
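Purely as an illustration of that kind of CPI-based bump (both numbers below are placeholder assumptions, not actual pricing or official inflation data):

```python
# Illustrative CPI-style MSRP adjustment; the launch price and inflation figure are
# placeholder assumptions for the example, not real NVIDIA pricing or CPI data.
launch_msrp_usd = 699.0        # hypothetical last-gen launch MSRP
cumulative_cpi_change = 0.12   # hypothetical ~12% cumulative inflation since that launch

adjusted_msrp = launch_msrp_usd * (1 + cumulative_cpi_change)
print(f"CPI-adjusted MSRP: ${adjusted_msrp:.2f}")  # ~$782.88 under these assumptions
```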
Maybe. Infinity Cache only got RDNA2 so far.
If the rumored cache sizes are correct, the low memory bandwidth might not matter that much, for games at least.
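A very rough way to think about why a large on-die cache can offset a narrow bus: only the misses have to go out to GDDR6, so the bandwidth the shaders actually see is a blend of cache and DRAM bandwidth. A crude first-order sketch; the hit rate and cache bandwidth here are made-up illustrative numbers, not measurements of any real part:

```python
# Crude first-order model: effective bandwidth as a hit-rate-weighted blend of
# on-die cache bandwidth and DRAM bandwidth. All figures below are illustrative assumptions.
def effective_bandwidth_gb_s(dram_bw: float, hit_rate: float, cache_bw: float) -> float:
    """Hit-rate-weighted average of cache and DRAM bandwidth (GB/s)."""
    return hit_rate * cache_bw + (1.0 - hit_rate) * dram_bw

# e.g. a 360 GB/s bus (the rumored 4070 figure) with a hypothetical 50% hit rate
# into a hypothetical 1.5 TB/s on-die cache:
print(f"{effective_bandwidth_gb_s(360, 0.50, 1500):.0f} GB/s effective")  # 930 GB/s
```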
Anywhere from 30% to double, depending on who you ask and what day of the week it is.
Any word on how much of an improvement in 4K performance we'll be seeing from the new gen of either card?
More so than any release I remember, going back all the way to my 9700 Pro, I don't see a single reason to move from my 3090. I'll likely sit back during this launch until the dust settles -- maybe pick up a new card in late 2023. A 4090 may bring a 50% performance uplift, but to me, it's throwing money away at this point.
I’ll probably go 4090 this time unless AMD matches NVDA at 4K for a cheaper price. Although I have a feeling that a 4090 Ti or Titan will follow with a decent performance bump, given the amount of room they are leaving available on the top end for future models.
Of course every time I say I’m buying the flagship card at launch I second guess myself due to the price and end up getting a cheaper model.
No idea. It will be interesting to see if nV can compensate for the smaller memory bus sizes across the board with memory speeds and cache.
Any word on how much of an improvement in 4K performance we'll be seeing from the new gen of either card?
The wattage seems in line with their HPC GPUs and their failure to design an MCM GPU. I'm guessing it's a way to stay competitive with what they see as future AMD cards having a major efficiency advantage (i.e., to match a 300W AMD card they predicted they'd need 500W). The bus nerf, I think, is partially a way to prevent miners from buying cheap gaming GPUs and force them into buying higher-end stuff like the 4090 / HPC GPUs (kinda unnecessary at this point, but I guess the decision was made before the latest crypto crash). Not sure if RDNA3 will translate perfectly to gaming, but the MCM-based CDNA2 is already a beast, and it seems like RDNA3 is mostly following that with a slight node advantage.
A few impressions, if correct:
- Wattage is outlandish. They either didn't get the uplift they were expecting per core, or they know RDNA3 is going to be a beast. Either way, they're pushing power past a reasonable limit (remember when we thought the GTX 480 at 375W was a barn burner?)
- Huge divide in CUs between 4090 and 4080. nV is giving themselves breathing room for more upper-level SKUs.
- 160-bit bus for the 4070?! Are they targeting the 1024x768 market? (I understand they'll compensate with memory speed and caching, but that's downright anemic)
- No kidding on the MSRP. We're in for pain on pricing: pucker up, buttercups.
I'm waiting to hear more about AMD's upcoming line because Nvidia has disappointed me so far. I could see myself getting a 4070 IF the specs are higher than 160-bit/10GB and the wattage isn't absurd. It's more likely that I'll stick with the 3000 series and go for a 3080 to upgrade the 3060 I have now.
The wattage seems in line with their HPC GPUs and their failure to design an MCM GPU. I'm guessing it's a way to stay competitive with what they see as future AMD cards having a major efficiency advantage (i.e., to match a 300W AMD card they predicted they'd need 500W). The bus nerf, I think, is partially a way to prevent miners from buying cheap gaming GPUs and force them into buying higher-end stuff like the 4090 / HPC GPUs (kinda unnecessary at this point, but I guess the decision was made before the latest crypto crash). Not sure if RDNA3 will translate perfectly to gaming, but the MCM-based CDNA2 is already a beast, and it seems like RDNA3 is mostly following that with a slight node advantage.
Both AMD's and Nvidia's new products seem kind of weird to me, I guess. They should be great for VR or productivity if you have money to burn, but for general gaming it seems like lower MSRPs or improved efficiency would've been more welcome (AMD looks like they'll deliver on efficiency gains, at least). It just seems like they're all targeting a small market that's about to collapse (crypto, recession, etc.).
I'm hopeful for it but kinda curious how well MCM will work in gaming. I'm worried that AMD's driver team + SLI/Crossfire-type goblins will cause issues that'll mar the raw performance & efficiency gains.
I'm waiting to hear more about AMD's upcoming line because Nvidia has disappointed me so far. I could see myself getting a 4070 IF the specs are higher than 160-bit/10GB and the wattage isn't absurd. It's more likely that I'll stick with the 3000 series and go for a 3080 to upgrade the 3060 I have now.
Yeah, I don't know enough about AMD's GPUs, so I'd have to do some research. If what you said ends up being the case, then I have no problem sticking with this gen's cards for a few more years.
I'm hopeful for it but kinda curious how well MCM will work in gaming. I'm worried that AMD's driver team + SLI/Crossfire-type goblins will cause issues that'll mar the raw performance & efficiency gains.