Axman
VP of Extreme Liberty
I'm hopeful for it but kinda curious how well MCM will work in gaming.
Recent rumors say their gaming GPUs will not have multiple graphics dies; only the workstation parts will.
Well there goes the end of SFF builds. Managed to squeeze a 3-slot 3080 into my mini PC, but it looks like with the 4-slot monstrosities it's back to medium tower cases...

Or buying a lower card without a 4-slot cooler, or just using an AIO-cooled card.
Yeah, waiting to see about this. Leakers like MLID and kopite7kimi were talking about how RDNA 3's Navi 31 was going to be MCM for over a year; now, all of a sudden, only a few months from release, they are going to be monolithic.
I'm actually hoping the power numbers are fairly accurate so I can absolutely not care about owning any of these cards. Maybe the 70 at most. However, I am interested in seeing what gains in performance are there. Should be able to just read all about it and comfortably watch these Nvidia cards from the sidelines personally.

I'm not concerned about the 450W usage. I already have a 3080 running the 450W BIOS. Not a problem to pull that heat out of my case.
I'm not blowing out 450W of heat into my PC room just for a GPU. No way.

Yeah, I'm concerned more about how it affects the ambient room temperature, not cooling the card. Happy with my undervolted 3080 still. Not interested in the 4xxx series right now, especially if the power numbers are true.
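For a rough sense of what 450W does to a room, a quick back-of-the-envelope conversion (nothing GPU-specific, just the standard watts-to-BTU/hr factor):

Code:
# Nearly all of a GPU's electrical draw ends up as heat in the room.
# 1 W = 3.412 BTU/hr, the usual HVAC conversion factor.
gpu_watts = 450
btu_per_hour = gpu_watts * 3.412
print(f"{gpu_watts} W ~= {btu_per_hour:.0f} BTU/hr")  # ~1535 BTU/hr, small-space-heater territory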
This is why I'm curious about switching to the RX series, whether new or current. Prices for the 6000s are better, and regarding their next-gen cards, I've heard nothing about the insane power consumption rumored to [dare I say] plague the RTX 4000s.
My limitation is SFX power supplies; I'm not gonna get an ATX case just for moderately faster performance. But a 450W GPU isn't off the table for me if I can undervolt it to 350W and keep most of the performance. I am really curious what AMD has: they are claiming a 50% performance-per-watt increase, and if that's true I'm very interested, since we haven't seen gains like that in six years (since Pascal).

I agree with everything after your first sentence. Not sure what case you have now, but a ton of ITX cases these days can support ATX power supplies. I'd imagine undervolting to 350W would cause a considerable drop in performance if you run your cards at load, but I'd love to be wrong about this!
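To put that claimed 50% performance-per-watt jump in numbers, a quick sketch (the wattages here are just illustrative assumptions, not leaked specs):

Code:
# perf = efficiency * watts; a 50% perf/watt gain means 1.5x efficiency.
old_eff = 1.0            # arbitrary baseline perf-per-watt
new_eff = 1.5 * old_eff  # the claimed gen-on-gen gain
watts = 350              # hypothetical SFF-friendly power target
print(f"At {watts} W: {new_eff * watts / (old_eff * watts):.1f}x old-gen performance")  # 1.5x
print(f"Or match old-gen {watts} W performance at ~{watts / 1.5:.0f} W")               # ~233 W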
Yeah, I'm the same, I guess. I'm on 350W right now, but I think I'll wait for the following gen so that I may still get a significant performance increase while lowering the wattage to maybe 250W or so. I'm sure a 350W (maybe even 300W) RTX 4xxx card will be an upgrade already, but I'm afraid it'll only be a small one.
I have a Lian Li/DAN A4-H2O (11L). My 3090 is undervolted to 750mV at 1695MHz with minimal performance loss; it usually pulls between 120W and 270W depending on the game, and some AAA games will pull around 300W.
Agree. Will need to see how well it undervolts. My 3080 only draws about 235W right now, and in most games it's only a frame or two slower than stock; some, like HZD, are actually faster. Plus the frame rate stays consistent.
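For anyone wanting to try the same, a bare power cap gets much of the benefit before you ever touch a voltage/frequency curve. A minimal sketch using NVML from Python (pip install nvidia-ml-py; needs admin rights; the 235W target just mirrors the post above, and a proper undervolt in a curve editor like Afterburner will still beat a plain cap):

Code:
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
# Check what the board actually allows before setting anything.
lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"Supported power-limit range: {lo // 1000}-{hi // 1000} W")
pynvml.nvmlDeviceSetPowerManagementLimit(handle, 235_000)  # value in milliwatts
pynvml.nvmlShutdown()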
If AMD is moving away from a chiplet design, that will be really disappointing.
Nvidia Ampere Oversupply Leak: AIBs demand Lovelace Delays!
— speculation by Moore's Law Is Dead
NVIDIA RTX 4080/4090 May Get Delayed to End of 2022, Paper Launch Likely in Sept or October
https://www.hardwaretimes.com/nvidi...aper-launch-likely-in-sept-or-october-report/
Seeing that some shops over here still charge €1,200 and more for RTX 3080s and up to €2,600 for 3090 Tis, I'm not surprised they don't sell. For a lot of cards there has been stock for well over a year, but prices don't come down, not everywhere anyway.

At those prices you'd be better off importing one from the U.S.
Is there any speculation on the power draw of the RTX 4050?
It's way too early to tell.
Jeebus... a 2750MHz clock on the 4090 should obliterate the 3090 in pure rasterization performance. What do you guys think re the 4090 vs the 3090 Ti? 30% better? 50% better?

Methinks 0%-5% gaming experience improvement. In other words, there are just more useful and more interesting items to buy first over something where I would not notice a gameplay improvement. Still, I will see what comes about from AMD and Nvidia and then determine if it is worth whatever costs come with it.
Same garbage launch schedule is going to mean the F5 key will get destroyed. Again. Sigh...
If there are no instruction improvements in the architecture, then the clock speed alone would equal a 60% improvement in performance over the 3090.
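The arithmetic behind that, taking the 3090's 1695MHz reference boost clock and the rumored 2750MHz at face value:

Code:
# Naive clock-ratio scaling: same architecture, same work per clock.
clock_3090 = 1695  # MHz, RTX 3090 reference boost
clock_4090 = 2750  # MHz, rumored, not confirmed
speedup = clock_4090 / clock_3090
print(f"~{(speedup - 1) * 100:.0f}% faster from clocks alone")  # ~62%
# Ignores SM count, memory bandwidth, and any IPC changes.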
What does that have to do with the gaming experience? For example, on a 120Hz monitor, going from 200fps to 400fps in a given game is a 100% improvement in performance, yet I doubt more than a few would consider the gaming experience any better in that particular case.
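The frame-time math behind that point, using the same numbers:

Code:
# A fixed-refresh 120 Hz panel shows a new frame at most every 8.33 ms.
refresh_hz = 120
for fps in (200, 400):
    frame_ms = 1000 / fps
    displayed = min(fps, refresh_hz)  # the panel caps what you actually see
    print(f"{fps} fps = {frame_ms:.2f} ms/frame, displayed at ~{displayed} Hz")
# Both render faster than the panel refreshes; the 400 fps run mostly buys
# lower input latency, not visibly smoother motion.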
Depends on what you play and your setup, in terms of gaming experience improvement. The 3090 Ti won't max out Cyberpunk with ray tracing and DLSS at 4K 120Hz. I'd wager the experience improvement there will be big. Who knows what future titles will require? I'm getting the impression games like Stalker 2 will bring the 3090 down a peg or two. I personally won't be buying a 3090 or a 4090, but the 4070 rumours have my ears perked.
Blanket assumptions without considering the whole system can also lead one astray. Take Far Cry 6, an unusual case, maybe just poor usage of available CPU cores. In any case, look at the performance difference between the 5800X3D and the 5600, particularly the 1% lows (where it really counts), with data from Hardware Unboxed: a 36% improvement in 1% lows with the same GPU. When you're talking very high frame rates, the CPU really does matter. If one is going to get the presumed 50%-60% performance increase from the 4090 (450W), it's probably best to have a 5800X3D or a good Alder Lake system to support it. AMD may need something even stronger (Zen 4 V-Cache?).
[Attachment: Hardware Unboxed Far Cry 6 benchmark chart, 5800X3D vs 5600]
In my case the 5800X3D has been an eye opener, maybe because I went from a 3900X to a 5800X3D for Far Cry. This was at 4K, and it was definitely noticeable in smoothness: where the 3900X would drop to sub-40fps, briefly but noticeably, the 5800X3D maintains over 60fps. Of course with RT and Ultra settings, using a 6900XT. Anyway, the 5800X3D will get a new MSI motherboard and will change places with the 3960X (the CPU that has been holding back the 3090).
Those who are going to foot the bill for the 4090 will, I'd say, need to max out the rest of the system as much as possible, including a monitor that can show the fps it produces. For me, uber FPS (beyond 100, which I know is kind of a low standard for some) adds very little gaming experience value. I'm not a competitive gamer; those who are may have a real treat coming in the upcoming GPUs from Nvidia and AMD.
In the last few weeks, we went from "40 series launches in July" to "only the 4090 will launch this year."

They have an abundance of surplus Ampere cards to move, coupled with a massive downturn in discretionary spending. Not surprising the launch keeps moving.
https://twitter.com/greymon55/status/1547805136210509824
AD102:2022
AD103/104/106:2023
Yeah, I figure we will be lucky to see the 4080 by the end of the year. I almost expect paper launches at this point.
I'm guessing a paper launch for the 4090 in September with the actual retail launch in October. But who knows at this point; it may be later.