The 6500 XT and 6400: Meet your next GPU, PC gaming peasants.

NightReaver

[H]ard|Gawd
Joined
Apr 20, 2017
Messages
1,213
No, it's not - with DLSS, that NVIDIA result is more than playable.

Quit justifying a shit card that should have shipped with a 96-bit bus, 6GB of RAM, and an x8 PCIe link.

There are a lot of less demanding RT games out there, and they all MOSTLY maintain that massive 5:1 performance advantage - when AMD has the balls to price the 6500 within 50 bucks of that ass-raping 3050!
What does any of that have to do with the initial chart you posted? None of them were playable. That specific chart was dumb.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,801
Same performance as the 1650 while using the same amount of power... and without encoding ability.

When restricted to PCIe 3.0, it slots between a 1650 and a 1050 Ti. Even low-profile users are better off with either of the old Nvidia cards. Standard-size case owners are much better off with a used RX 570 or 1650 Super.

This trash is useful for nobody.
I'm waiting for people to realize that the reason this card is going up against four-year-old Nvidia cards is that Nvidia hasn't developed their replacements. There's a reason for that. There's also a reason TechPowerUp is being so disingenuous here: Nvidia still makes the 1650 and the 1050 Ti and cleans up selling them for $200+, with ZERO reviewers saying anything about that. There's a reason for that.

A better test would have been to run these lower-tier cards against APUs, to show everyone just how close APUs are to them.
 

Axman

[H]F Junkie
Joined
Jul 13, 2005
Messages
12,882
IF a baby Ampere card gets made. Nvidia seems to have zero interest in that.

Both AMD and Nvidia have said they want the used market to replace the entry-level market. And AMD and Intel both have plans for stonkin' integrated graphics.

Nvidia is only going to focus on the mid-to-high end, along with halo parts, for the foreseeable future. They'll have to start making their own processors before they'll be able to compete on the entry-level.
 

TheHig

Gawd
Joined
Apr 9, 2016
Messages
991
Indeed, and early buzz is that the next-gen AMD APUs will replace entry-level dGPUs even for gaming. So we could see "entry" gaming dGPUs staying around $300 new, and anything under that will be used cards or APUs.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
2,342
Both AMD and Nvidia have said they want the used market to replace the entry-level market. And AMD and Intel both have plans for stonkin' integrated graphics.
OEMs need something that is new and entry level, I would imagine. I think what tends to replace new entry-level cards is older-generation mid-level cards.

Low end on HP/Dell seems to be a Radeon RX 5500 or a GTX 1650 Super/1660 Ti.

If generational gains were mostly about efficiency (delivering more performance with far fewer chips per card), it would make sense to make new low-end cards. But if the gains come mostly from complexity, pricier nodes, and more chips/power - and with DX12/Vulkan/driver support lives being ultra long - it can make sense to simply keep selling older cards. I'm not sure they ever stopped making Turing all this time; stopping only the high end and keeping low-end Turing in place of a low-end Ampere makes sense.

The generation gap is small enough that the xx60 card can easily be as powerful as, if not more powerful than, the next generation's xx30 budget card would be.
 

Axman

[H]F Junkie
Joined
Jul 13, 2005
Messages
12,882
OEM need something that is new and entry level I would imagine, I think what tend to replace new entry level is older generation mid level card.

It's all going to be integrated graphics. I wouldn't be surprised if integrated graphics comes to workstation and server parts. If any of these three companies has plans to fart out a "make the monitors go" card it's going to be Intel.
 

LukeTbk

2[H]4U
Joined
Sep 10, 2020
Messages
2,342
It's all going to be integrated graphics. I wouldn't be surprised if integrated graphics comes to workstation and server parts. If any of these three companies has plans to fart out a "make the monitors go" card it's going to be Intel.
I am not sure why that's something NVIDIA would particularly want, but maybe there's no money in it for them anyway. I would think the 1650s of the world sold as entry options (or, until very recently, the 2060) carry nice margins.

There is still a giant gap between integrated graphics and a new generation's card offerings, where older cards seem to fit very well.
 

Axman

[H]F Junkie
Joined
Jul 13, 2005
Messages
12,882
There is still a giant gap between integrated graphics and a new generation's card offerings, where older cards seem to fit very well.

Yeah, for CPUs without integrated graphics. But we're maybe one generation out, at the very least from AMD, from APUs whose integrated graphics will compete on paper with their console counterparts. How integrators use them will be another matter.
 

Usual_suspect

Limp Gawd
Joined
Apr 15, 2011
Messages
184
FSR would help, but according to this, 1650 and 1050tis would get it too.
https://www.amd.com/en/partner/changing-the-game-amd-fidelityfx-super-resolution

A baby Ampere card would get dlss and true rtx support as well.
Yeah, I know, but Radeon Super Resolution is driver-level FSR, which means that in games that don't have FSR, the 6500/6400 cards would get a leg up.

Giving a baby Ampere card true RTX is laughable though. The 3070 Ti can't even handle true RT at its targeted resolution without DLSS. Hell, even the 3090 Ti struggles with it.

DLSS would be the only selling point, but with the way games are going graphically, and with VRAM requirements steadily climbing, it will practically be required.

I honestly think AMD will be tied, if not ahead, this upcoming generation of cards. I'm not being a fanboy; my reasoning is the PS5/Xbox Series X. They both use RDNA 2, so the vast majority of games are going to be programmed with that hardware in mind. Then there are these ludicrous power requirements being floated around on the Nvidia side. I think AMD has a shot at the crown this coming gen.
 

3dprophet

Limp Gawd
Joined
Oct 9, 2020
Messages
160
Prices came down a lot. They are pretty much down to MSRP.

Unless you want next-gen cards, I no longer see a reason to delay upgrading.
 

whateverer

[H]ard|Gawd
Joined
Nov 2, 2016
Messages
1,660
Joined
Nov 24, 2021
Messages
204
Hey man, when prices are falling hard on this card, you need some excuse to charge a premium.
Skeptical, as Samsung only shows 8 and 16Gb modules (1 or 2 GB). But yeah, very much pointless unless there is some obscure compute application that demands a lot of VRAM while needing limited GPU performance.
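For reference, the Gb-to-GB arithmetic, sketched in Python (the two-chip, 64-bit-bus layout is an assumption for illustration, not a confirmed board design):

```python
# GDDR6 chips are specified in gigabits (Gb); card capacity is in gigabytes (GB).
def chip_capacity_gb(gigabits):
    return gigabits / 8  # 8 bits per byte

assert chip_capacity_gb(8) == 1.0   # 8Gb module  -> 1 GB
assert chip_capacity_gb(16) == 2.0  # 16Gb module -> 2 GB

# Assuming a 64-bit bus fed by 32-bit GDDR6 chips, that's two chips,
# so 16Gb parts give 2 x 2 GB = 4 GB with standard mounting; an 8 GB
# card on that bus would need clamshell mode (two chips per channel).
chips = 64 // 32
print(chips * chip_capacity_gb(16))  # -> 4.0
```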
 

zandor

2[H]4U
Joined
Dec 14, 2002
Messages
3,840
Now that pretty much everything else is back in stock, these can go in laptops where they belong. I was looking at some of the new ThinkPad models and saw one in the specs. Get out of my gaming rig and into a business laptop where you belong, RX 6500/6400.
 

Nobu

[H]F Junkie
Joined
Jun 7, 2007
Messages
8,372

Ah, the 8 GB does seem legit, but how exactly did they make the die run x8 PCIe?
No clue. Actually, does it need to? That's just the number of lanes available, and you can run a 16x pcie card in an 8x slot. It's not optimal, but runs just fine. I think 8x and 16x mode are referring to something else here, but I'd have to examine the data more closely to know what exactly.
 
Joined
Nov 24, 2021
Messages
204
No clue. Actually, does it need to? That's just the number of lanes available, and you can run a 16x pcie card in an 8x slot. It's not optimal, but runs just fine. I think 8x and 16x mode are referring to something else here, but I'd have to examine the data more closely to know what exactly.
There was a definite performance hit going from pcie 4.0 x4 to pcie 3.0 x4 with that card. Since pcie 3.0 x8 has the same bandwidth as pcie 4.0 x4, it would be a nice improvement for that card in older platforms.
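That bandwidth equivalence checks out from the spec numbers; a quick Python sketch (per-lane rates are the standard PCIe transfer rates after encoding overhead):

```python
# Raw PCIe transfer rates per lane, in GT/s, by generation.
GTS = {2.0: 5.0, 3.0: 8.0, 4.0: 16.0, 5.0: 32.0}

def lane_gbps(gen):
    # PCIe 2.0 uses 8b/10b encoding (80% efficient); 3.0+ use 128b/130b.
    enc = 0.8 if gen == 2.0 else 128 / 130
    return GTS[gen] * enc / 8  # GT/s -> GB/s per lane

def link_bandwidth(gen, lanes):
    return lane_gbps(gen) * lanes

# PCIe 3.0 x8 and PCIe 4.0 x4 land at the same ~7.9 GB/s,
# while the 6500 XT's PCIe 3.0 x4 link gets only half that.
print(link_bandwidth(3.0, 8))  # ~7.88 GB/s
print(link_bandwidth(4.0, 4))  # ~7.88 GB/s
print(link_bandwidth(3.0, 4))  # ~3.94 GB/s
```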
 

ZodaEX

Supreme [H]ardness
Joined
Sep 17, 2004
Messages
4,318
No clue. Actually, does it need to? That's just the number of lanes available, and you can run a 16x pcie card in an 8x slot. It's not optimal, but runs just fine. I think 8x and 16x mode are referring to something else here, but I'd have to examine the data more closely to know what exactly.

It doesn't need to, but for people who are on PCI-E 2.0 like me, the improvement to 8X makes the card a much better value.
 

Nobu

[H]F Junkie
Joined
Jun 7, 2007
Messages
8,372
There was a definite performance hit going from pcie 4.0 x4 to pcie 3.0 x4 with that card. Since pcie 3.0 x8 has the same bandwidth as pcie 4.0 x4, it would be a nice improvement for that card in older platforms.

It doesn't need to, but for people who are on PCI-E 2.0 like me, the improvement to 8X makes the card a much better value.
Right, but he asked how they managed to do it. I was just saying it must be possible, or we wouldn't be able to run 16x cards in an 8x slot. (or 4x, or 1x for that matter)
 

1_rick

2[H]4U
Joined
Feb 7, 2017
Messages
2,671
Right, but he asked how they managed to do it. I was just saying it must be possible, or we wouldn't be able to run 16x cards in an 8x slot. (or 4x, or 1x for that matter)
Each socket has multiple presence-detect pins, so the host knows how many lanes the card has. The host and card negotiate how many lanes to use, presumably at boot.

As for the video RAM, it certainly seems like you could just have more address lines between the GPU and the RAM, irrespective of the PCIe bus width.
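If anyone wants to see what a link actually negotiated, Linux exposes the result in sysfs; a rough sketch (Linux-only, and the `0x03` display-class filter is just one way to pick out GPUs):

```python
# List the negotiated PCIe link width/speed for display-class devices.
# The kernel exposes these per device under /sys/bus/pci/devices/.
from pathlib import Path

def pcie_links(base=Path("/sys/bus/pci/devices")):
    if not base.is_dir():
        return  # not Linux, or sysfs unavailable
    for dev in sorted(base.iterdir()):
        try:
            cls = (dev / "class").read_text().strip()
            if cls.startswith("0x03"):  # 0x03xxxx = display controllers
                width = (dev / "current_link_width").read_text().strip()
                speed = (dev / "current_link_speed").read_text().strip()
                yield dev.name, width, speed
        except OSError:
            continue  # device without PCIe link attributes

if __name__ == "__main__":
    for addr, width, speed in pcie_links():
        print(f"{addr}: x{width} @ {speed}")
```

A card advertised as x16 that trained down in an x8 slot will simply report `current_link_width` of 8 here.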
 