If not interested in RT, always go AMD. More bang for buck.
Also add VR to this. AMD has worse frame timing, so VR will feel worse with an AMD GPU than Nvidia at identical FPS. But otherwise the 7900XTX easily. It matches/exceeds the 4080 in raster performance and is $200 cheaper.

Yup. It's getting better, but last I checked, definitely Nvidia for VR.
You can do the math yourself by looking at the cuda cores and clockspeeds.
The best example that puts Ada Lovelace in the best light would be the 4080 at 4K, since it is not CPU limited and does not have the scaling issues of the 4090; then compare the 4080 to the 3080.
The 4080 has 9728 cores and in real world testing in techpowerup review averages 2737 mhz. https://www.techpowerup.com/review/nvidia-geforce-rtx-4080-founders-edition/40.html
The 3080 has 8704 cores and in real world testing in techpowerup review averages 1931 mhz. https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/32.html
9728 x 2737 = 26,625,536
8704 x 1931 = 16,807,424
26,625,536 / 16,807,424 ≈ 1.58, i.e. about 58% more theoretical throughput
That means with unrealistic perfect scaling and no other limitations the 4080 could be 58% faster than the 3080 if they had the same IPC. The overall difference was 49% at 4k where the 3080 was also likely running into vram constraints in a couple games. https://www.techpowerup.com/review/nvidia-geforce-rtx-4080-founders-edition/32.html
This is NOT scientific of course but does give you an idea that IPC is likely the same at best. If you used the 4090 it would look like a huge regression in IPC but big gpus don't scale well and many games run into limitations thus making the huge core count not even come close to being fully utilized.
So the POINT earlier was that if the 3070 and 4070 end up with the same core count, it clearly is going to come down to the large clockspeed advantage of the 4070 to make the difference.
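If anyone wants to sanity-check that back-of-the-envelope math, here's a minimal sketch in Python. The core counts and average clocks are the TechPowerUp figures quoted above; everything else, including the assumption of identical IPC, is the same simplification the post itself makes.

Code:
# Rough "cores x average clock" scaling estimate, same simplification as above:
# it ignores memory bandwidth, caches, and occupancy, and assumes identical IPC.
cards = {
    "RTX 4080": {"cuda_cores": 9728, "avg_clock_mhz": 2737},  # TechPowerUp averages
    "RTX 3080": {"cuda_cores": 8704, "avg_clock_mhz": 1931},
}

def theoretical_throughput(card):
    return card["cuda_cores"] * card["avg_clock_mhz"]

ratio = theoretical_throughput(cards["RTX 4080"]) / theoretical_throughput(cards["RTX 3080"])
print(f"{ratio:.2f}x, i.e. ~{(ratio - 1) * 100:.0f}% faster in theory")
# Prints ~1.58x (~58%); TechPowerUp's measured 4K average gap was ~49%.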
Different amount of VRAM, different bandwidth; it is NOT apples to apples. FULL STOP.
Edit: I will also be ignoring this bullshit comparison from here on out as it is already so far off topic that this thread should already get scrubbed.

I get it, you don't understand a bit. Even with all the advantages (except for a slight bandwidth disadvantage), the 4080's performance over the 3080 is almost exactly in line with the clock speed increase. This is only at 4K, where the 3080 is definitely at a disadvantage due to likely hitting VRAM limits on the card, which is yet another big advantage for the 4080. This is every indication that the IPC of the 4000 series is no better than the 3000 series. If the IPC was noticeably better, the performance of the 4080 should have been quite a bit better. Higher IPC along with much higher clockspeed act as a multiplier for performance, and we're not seeing that.
Whatever it is you're comparing here, it is decidedly not IPC. IPC means instructions-per-cycle, yet not a single term in your maths here involves instruction counts.

It's a rough test to determine if there is an IPC increase, not to determine exactly what the IPC increase is. Every indicator shows the IPC is at best equal between the architectures. When roughly taking into account the differences such as clock speed and CUDA cores, the performance in gaming is about equal between the two cards even though the 4080 has even more advantages. Again, a strong indicator that IPC hasn't increased.
Performance at 4K from a benchmark is not a useful metric for computing IPC at all.
You'd really need to run an optimized CUDA compute kernel which reasonably maxes out shader core utilization (and avoids scheduling/pipeline stalls) for a set amount of time at a given frequency, and count instructions using a profiler tool. Then you can compute:

Code:
IPC = num_insns / (freq_in_hz * runtime_in_seconds)

Do this on both architectures and you might get a reasonable IPC comparison for this specific and spectacularly contrived testcase - because that's the only thing IPC is useful for. How useful it is for anyone who is not specifically working on optimizing a particular workload for a particular architecture is even more questionable.
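As a toy illustration of that formula only: the instruction count, clock, and runtime below are invented numbers for the sake of the arithmetic, not output from any real profiler run.

Code:
# Toy plug-in of the IPC formula above. All three inputs are hypothetical;
# in practice num_insns would come from a profiler and the clock would be locked.
num_insns = 1.0e12            # instructions executed by the kernel (made up)
freq_in_hz = 2.0e9            # fixed core clock during the run (made up)
runtime_in_seconds = 1.0      # kernel wall time at that clock (made up)

ipc = num_insns / (freq_in_hz * runtime_in_seconds)
print(f"IPC = {ipc:.0f}")     # chip-wide IPC for this one contrived workload: 500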
That's my point - IPC is a completely useless metric when talking about gaming or even just general generation-to-generation improvements, unless it's in the context of specific workloads' IPC/CPI.
There's no need for your effectively synthetic test to determine IPC unless you're determining IPC for that specific workload. Your test is only useful for that specific use case which in this particular discussion is moot because we're talking about gaming and the figures used for the gaming comparison are an average of many different games. It's one of the few times I would even look at an averaged figure like that because it tends to have little use otherwise.
Higher frame times = lower FPS, however - they wouldn't be identical FPS with worse/higher frame times. You may be thinking of frame pacing - the consistency of frame times and delivery.
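A minimal sketch of the distinction being drawn there, with made-up frame times: the average frame time determines FPS, while frame pacing is about how consistent the individual frame times are.

Code:
# Two made-up frame-time traces (ms) with the same average, i.e. the same FPS,
# but very different pacing. The second one would feel far worse in VR.
smooth = [11.1] * 9
uneven = [8.0, 14.2] * 4 + [11.1]   # alternating fast/slow frames, same average

def fps_and_jitter(frame_times_ms):
    avg = sum(frame_times_ms) / len(frame_times_ms)
    fps = 1000.0 / avg              # FPS is just the inverse of the average frame time
    stddev = (sum((t - avg) ** 2 for t in frame_times_ms) / len(frame_times_ms)) ** 0.5
    return fps, stddev

for name, trace in (("smooth", smooth), ("uneven", uneven)):
    fps, jitter = fps_and_jitter(trace)
    print(f"{name}: {fps:.0f} FPS, frame-time stddev {jitter:.1f} ms")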
Is there a modern in-depth review of this? The 7000 series is obviously having issues at this time, and I've seen reports of many 6000 series owners who had upgraded to 7000 going back to their 6000 for vastly better performance.
Babel Tech Reviews posts frametime graphs, as well as synth/dropped frame graphs, for a Valve Index at 100% SteamVR resolution (2016x2240 for a 1440x1600 per eye HMD), 90 Hz.
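For context on that resolution figure: 100% SteamVR resolution renders larger than the physical panels to leave headroom for lens-distortion correction, which for the Index works out to roughly 1.4x per axis. A quick sketch of the arithmetic, not something taken from the review itself:

Code:
# Valve Index: per-eye panel vs. the 100% SteamVR render target quoted above.
panel_w, panel_h = 1440, 1600
render_w, render_h = 2016, 2240

print(render_w / panel_w, render_h / panel_h)        # 1.4x on each axis
print(render_w * render_h / (panel_w * panel_h))     # ~1.96x the pixels per eye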
I didn't really perceive microstutter with the 7900 XTX, though, just high frame times/low framerates and lots of reprojection for failing to keep frame times low enough - not something I want to see in a $1,000 GPU, even if it is still a significant improvement over my old GTX 980.

Thanks for the links. It looks like yet another example of AMD rushing these cards out without having time to get the drivers where they needed to be for launch this gen. Hopefully they get things tightened up soon in regards to VR, as the 7900 series should be well ahead at 4K/VR resolutions compared to the older gen.
Is it possible this is added latency due to the MCM design?
They've had that issue for quite a while - VR is both a niche, and a difficult use case.
Bestbuy really not dropping any 4090s lately. One model here or there, limited to regional drops. Newegg seems to be dropping a shit load lmao. Wondering what has changed there.

Well, when Newegg gets them, the scalper bots from the 3rd party scalpers snatch them up, so then you see them all for much more than MSRP when you uncheck the Sold/Shipped by Newegg option.
NA Newegg has some models in stock for hours. Like right now they have the Gigabyte Gaming OC model. Scalpers are nitpicking certain models now, usually cheaper ones etc or Asus Strix. You can grab one easy if you wanted on Newegg on a weekly basis now.
100%. That Gigabyte Gaming OC model has been in stock almost non-stop for the last week. A few other models hang around for hours and hours. Like NKD said, not sure what is up with BestBuy though, they have not seemed to have any significant drops in a long time.
Oh that's good to know. But I'm just going to wait for the 2024/2025 next gen at this point.
I don't think I am guaranteed life, so I just enjoy what I can. I tell my wife, I save for the kids and family; when it comes to me enjoying my hobby, don't say I spend too much every few years lmao.
Maybe BB bought a shit load of 4080s and 4070 Tis they wanna offload before trying to sell 4090s again lmao. Stock seems to be flowing through given how often Newegg drops them, and I think scalpers are holding on to them too and backing off a bit.
4090 buyers are looking for MSRP while scalpers are looking for exclusivity. A 4090 with a basic cooler going for $1700 is in no man's land.
Oh yeah for sure, I agree. I am more coming from the standpoint that I am not dissatisfied with the performance of my current 3080 Ti, so by the time I am, I am sure we'll have newer and better stuff out there. The hobby at this point is too expensive to be chasing upgrades every gen like I did from the mid-2000s till 2014.

That’s the other nice part. I remember upgrading motherboards and CPU to get minor boosts. Like every year. DDR2 and slightly faster? Sold!
Now? 5 year old I7 is still enough to game on high with the right GPU. Folks are still running 1080TI and happy as clams. Longevity is much higher now than it used to be.
Makes sense if you are doing low res, but not if you have a 4K 165 Hz monitor. Then the new cards make sense.

If you’re running 4K 165 you’re on the bleeding edge and spent a lot on a monitor very recently. Hence you’re likely to spend a lot on a card right now, or very recently, too. But totally agreed.
Makes sense.
I still haven’t jumped because, outside of the absurdly expensive ones, there still aren’t good “everything” 4K screens. Either they’re workstation or they’re gaming, but not both. I’ll wait for one that hits both at around $1,500 before I jump. I stay pretty bleeding edge, but not every generation.
Right. Literally zero generation-on-generation performance-per-dollar uplift.
It is 45% faster than the 3070 Ti and costs 33% more, so technically it does offer a small perf-per-dollar increase over the card it officially replaces. That said, the 3070 Ti was already a poor value over the plain 3070.
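Taking the post's own percentages at face value, the perf-per-dollar change works out like this. A rough sketch; the 45% and 33% figures are the ones claimed above, not independently verified.

Code:
# 4070 Ti vs 3070 Ti, using the percentages quoted in the post above.
perf_ratio = 1.45    # "45% faster"
price_ratio = 1.33   # "costs 33% more"

perf_per_dollar = perf_ratio / price_ratio
print(f"~{(perf_per_dollar - 1) * 100:.0f}% better performance per dollar")   # ~9%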
Wrong. The 3080 Ti was $1200. This card outperforms it and costs $800. Pretty big price drop!
"Anyone seriously considering the 4070Ti"
- most of the discussion is about the 4090, 4080, 7900, 4070, everything except the 4070Ti
That would be a resounding "No"!

The problem there is that for my $800, I'd rather have a used 3090 (Ti), because I don't care about DLSS 3 and would probably benefit more from literally twice the VRAM and bus width; and the RX 7900 XTX seemed like so much more card for $200 more that I might as well have taken the chance. (Which I did, and soon regretted due to the vapor chamber flaw on top of the subpar VR performance, but it wound up saving me at least $100 on an RTX 4080 in a roundabout way.)
I wonder how the perception of the 4070Ti is going to be in a couple/few years when its street price is much lower? Because unlike some controversial/disliked GPUs, there's really nothing inherently wrong with it other than the price. It's basically a 3080Ti that uses much less power, that's great! Too bad it also costs the same as a 3080Ti.
While true, I feel that the problem people are having is that the 3rd card down from the top of the stack is still $800. Seeing a 70-level card for that kind of money is hard to accept for a lot of people. It feels more like a more efficient 3080 Ti for a similar price to me, having 12GB of VRAM. My opinion is with the ‘good card, bad price’ crowd on this one. The 3070 Ti AIB cards are in stock all day around here for $649, which is absurd for an 8GB card in 2023. The 4070 Ti needs to get to $700 at least IMO, and that may happen if the 3070 Ti stock gets depleted.
Our last price was absolutely insane, so this seemingly less insane price is such a bargain! Totally not pulling wool over the eyes of my loyal consumers! You'd be foolish not to buy it!
This has been the lie ever since Ampere came out. Compare the price to the previous terrible value (2080 Ti) to make the prices seem like a great value. Completely disregard the linear increase in price/perf.
The 3080 Ti hasn't been $1200 for a very long time.Wrong. 3080ti was $1200. This card outperforms it and costs $800. Pretty big price drop!
I think NV did a great disservice to its own products as well as customers by pushing the "4070Ti = 3090/Ti" thing so hard. Like, I get it, it makes the 4070Ti sound better while technically being true, kind of, in some situations. But like you said, the 3090/Ti is uniquely equipped with the extra VRAM, a niche feature with specific uses that cannot simply be replaced with "higher clocks and more cache". I went with a 3090Ti because I'm somewhat of an edge case with texture mods and maxxing resolutions at low FPS - no 12GB card would work for me, not thru guile nor brute force. If that wasn't the case I may have considered a 4070Ti, but NV is just being misleading saying a 12GB card is the same as a 24GB card when there is a 12GB version of the 24GB card right there for direct comparison.
The 4070 Ti should've been a 4060 - that's the tier when we start expecting memory buses smaller than 256 bits wide - and priced to match, to boot ($350 and under). The 1060 6 GB and 3060 12 GB seemed like good cards for their day, bang-for-the-buck champs for people who either don't need all-out GPU performance (likely due to gaming at 1080p still) or would be horribly CPU-bottlenecked otherwise (like a 3060 in anything prior to Zen 3 or Alder Lake).
The 4080 at least has the convenient excuse of beating the 3090 Ti decisively, even with a bit less VRAM and bus width compromising memory throughput somewhat - it's an objective improvement over the past generation, if less so than the 4090. It's just, as you stated, overpriced.
Arguably, the 3080Ti was almost never a $1200 card. It was like an $1800 card in 2021/early 2022, then quickly plummeted to $1000 and below in mid 2022. I think that's a big part of the "pricing problem" with RTX 4000 - it's not just that prices were very high the past couple years, it's that they were inconsistent. All the high-end cards went from way above MSRP to way below MSRP in a flash (even months and months before Ada was announced). There was no consistent value baseline coming from last gen, just whatever prices ended up being in any given month. That unfortunately provides an opening for RTX 4000 to artificially follow the same pattern, being released high and then getting lots of price cuts later in the cycle to boost sales once value has been "established" thru high MSRP.
Single model below $1200 here (and a Zotac one):
If that was true they would be flying off the shelves. It's disingenuous to compare prices during the COVID madness. The ever-negative Tech Jesus has a good video on why the new cards are not a great value.

3080 Ti's are off the shelves. You seeing much restock of anything above a 3070 Ti in the 30-series? Yeah, me neither.