The Achilles' heels of Ryzen no one is talking about

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
6,773
I'm glad I can play DaVinci Resolve... is that multiplayer?

It's a small margin, for an even smaller margin of the market, some "Achilles Heel". Bring it back when AMD makes a fuck-up like pinning their CPUs to RDRAM that cost more than their competitors' whole CPU+MOBO+RAM combo.
Granted, in your case it's not a big deal, nor for me at this time. If I keep my next motherboard for 5 years and want a 2-GPU option, it might be an issue, in which case I would just upgrade if needed. At this time I don't see 8x/8x SLI as causing any significant performance disadvantage, and I will be doing just that hopefully shortly with Ryzen.
 

OrangeKhrush

[H]ard|Gawd
Joined
Dec 15, 2016
Messages
1,673
It looks like AMD are maximising the highest bandwidth possible, and they are maxing out efficiency, which is great; for me bandwidth > speed. Another telling aspect of AMD hitting such high effective bandwidths will be when Raven Ridge arrives: those APUs are going to be unlike the gimped APUs of the past, where weak cores bottlenecked the iGPU before memory bandwidth did. AMD chose dual channel because it is cheaper and easier to sell to a greater market than segmentation, and the bandwidth is so good it is not far off quad-channel levels even at low frequency.
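The "not far off quad channel at low frequency" comparison is easy to sanity-check with the usual peak-rate formula (channels × MT/s × 8 bytes per transfer). The specific speed grades below are my own illustrative picks, not from the post:

```python
# Back-of-the-envelope DDR4 peak bandwidth.
# Formula: channels * transfer rate (MT/s) * 8 bytes per 64-bit transfer.
def ddr4_peak_gbs(channels: int, mt_per_s: int) -> float:
    """Theoretical peak bandwidth in GB/s (decimal gigabytes)."""
    return channels * mt_per_s * 8 / 1000

dual_3200 = ddr4_peak_gbs(2, 3200)   # fast dual channel DDR4-3200
quad_2133 = ddr4_peak_gbs(4, 2133)   # low-frequency quad channel DDR4-2133

print(f"dual DDR4-3200: {dual_3200:.1f} GB/s")   # 51.2 GB/s
print(f"quad DDR4-2133: {quad_2133:.1f} GB/s")   # 68.3 GB/s
```

So fast dual channel gets within roughly 75% of low-clocked quad channel on paper, which is the spirit of the claim, even if quad still wins outright.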
 

Trimlock

[H]F Junkie
Joined
Sep 23, 2005
Messages
15,228
Since those tests, more powerful GPUs have arrived that require more bandwidth, so previous results don't necessarily reflect what will happen now. Otherwise we would never have needed to go beyond PCIe 1.0 or 2.0, if more powerful GPUs always used the same bandwidth. Now what AMD is doing, if they are successful, is using the bandwidth more efficiently: loading only what is needed and not a whole bunch of other stuff constantly; for example, just one small mipmap level instead of all the 4K mipmap textures and smaller for a distant object.
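The mipmap point is worth quantifying. A rough sketch, assuming an uncompressed 4096×4096 RGBA8 texture (4 bytes per texel; the texture size and format are my assumptions, not from the post):

```python
# Bytes moved when streaming a whole mip chain versus only the one
# level a distant object actually needs.
def mip_bytes(size: int, bytes_per_texel: int = 4) -> int:
    """Bytes for a single square mip level of side `size`."""
    return size * size * bytes_per_texel

def chain_bytes(base: int, bytes_per_texel: int = 4) -> int:
    """Total bytes for the full mip chain of a base x base texture."""
    total, size = 0, base
    while size >= 1:
        total += mip_bytes(size, bytes_per_texel)
        size //= 2
    return total

full = chain_bytes(4096)     # entire chain of a 4K texture
needed = mip_bytes(256)      # one small level for a far-away object

print(f"full chain: {full / 2**20:.1f} MiB")           # ~85.3 MiB
print(f"one 256x256 level: {needed // 2**10} KiB")     # 256 KiB
```

Streaming only the needed level moves a few hundred KiB instead of tens of MiB per texture, which is exactly how smarter residency management stretches the same PCIe bandwidth further.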

As for PCIe x16 and a single GPU, I don't see it being limiting in the next 3-5 years. 8x/8x could be more limiting next GPU generation for multiple top-end GPUs.
My point wasn't for you specifically but to show that just because one game (which is more of a benchmark) doesn't show improvement, it doesn't mean other games wouldn't. As proven by [H], it is game dependent and can vary between games.

Note for everyone else, this also affects Z270.
 

Trimlock

[H]F Junkie
Joined
Sep 23, 2005
Messages
15,228
It looks like AMD are maximising the highest bandwidth possible, and they are maxing out efficiency, which is great; for me bandwidth > speed. Another telling aspect of AMD hitting such high effective bandwidths will be when Raven Ridge arrives: those APUs are going to be unlike the gimped APUs of the past, where weak cores bottlenecked the iGPU before memory bandwidth did. AMD chose dual channel because it is cheaper and easier to sell to a greater market than segmentation, and the bandwidth is so good it is not far off quad-channel levels even at low frequency.
They chose dual channel for many reasons most likely.

Less pins
Less silicon
Less cost to board manufacturers

I have no doubt that quad wouldn't help Ryzen much, but it for sure would help APUs.
 

efishta

Limp Gawd
Joined
Oct 12, 2004
Messages
184
This MSI B350 AM4 board mentioned in the thread - why the hell does it have 2 PCI slots? Am I missing something? I had figured mobos with PCI slots would be kind of niche today, for the "legacy" market. Or is this a side effect of the apparently more limited PCI-E lanes of this chipset?

Pretty sure I'm picking one up as this 2700K is getting a bit long in the tooth... but also hard to justify a whole new platform when it works so well still. Decisions. They are hard.
 

Pieter3dnow

Supreme [H]ardness
Joined
Jul 29, 2009
Messages
6,784

DDR4 - 3400 @1t

What about those RAM issues? AMD is doomed, right?

It seems it will take a while for these things to be sorted. Some things are overblown by most people on the internet these days.

There is a problem in that people who signed an NDA can't comment on certain things like RAM timings and speed. Even a simple question such as "Forgive me, but I remember hearing that the BIOS was embedded on the CPU, correct? How does the manufacturer's board BIOS/firmware fit in all this?" cannot be answered because of how it works on the AM4 platform.
 

jahsoul

Gawd
Joined
Feb 3, 2011
Messages
620
This MSI B350 AM4 board mentioned in the thread - why the hell does it have 2 PCI slots? Am I missing something? I had figured mobos with PCI slots would be kind of niche today, for the "legacy" market. Or is this a side effect of the apparently more limited PCI-E lanes of this chipset?

Pretty sure I'm picking one up as this 2700K is getting a bit long in the tooth... but also hard to justify a whole new platform when it works so well still. Decisions. They are hard.
They said that it was for developing countries. I'm actually glad because I fall into the "niche" market. Going forward, I knew I would have to deal with a bridge, but it is more convenient having it on the board.
 

Anarchist4000

[H]ard|Gawd
Joined
Jun 10, 2001
Messages
1,659
They chose dual channel for many reasons most likely.

Less pins
Less silicon
Less cost to board manufacturers

I have no doubt that quad wouldn't help Ryzen much, but it for sure would help APUs.
Quad might have been pointless. If they managed an APU with HBM, the extra channels would just be an added cost. A single stack would be roughly the equivalent of 8 memory channels. The socket and board capabilities, beyond providing power, wouldn't affect it. That leaves most of the performance and capabilities to the CPU. The APU gets its bandwidth and the platform stays inexpensive with two channels. Even a lower-end system might be able to forgo memory channels entirely to offset the cost of HBM.
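The "one stack ≈ eight channels" estimate checks out against published first-gen HBM figures (1024-bit interface at 1 Gb/s per pin); the DDR4-2133 comparison point is my choice:

```python
# Sanity check on "a single HBM stack is roughly eight memory channels".
hbm1_stack_gbs = 1024 * 1 / 8             # 1024 pins * 1 Gb/s / 8 = 128 GB/s
ddr4_2133_channel_gbs = 2133 * 8 / 1000   # one 64-bit channel, ~17.1 GB/s

ratio = hbm1_stack_gbs / ddr4_2133_channel_gbs
print(f"one HBM1 stack = {ratio:.1f} DDR4-2133 channels")  # ~7.5
```

About 7.5 DDR4-2133 channels per stack, so "roughly the equivalent of 8 memory channels" is a fair characterization, and faster stacks only widen the gap.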
 

spine

2[H]4U
Joined
Feb 4, 2003
Messages
2,686
This MSI B350 AM4 board mentioned in the thread - why the hell does it have 2 PCI slots? Am I missing something? I had figured mobos with PCI slots would be kind of niche today, for the "legacy" market. Or is this a side effect of the apparently more limited PCI-E lanes of this chipset?

It's very useful if you have an expensive PCI soundcard for example. My Z97 Gamer has 2 PCI slots, so it's not that uncommon really.
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
6,773
Quad might have been pointless. If they managed an APU with HBM, the extra channels would just be an added cost. A single stack would be roughly the equivalent of 8 memory channels. The socket and board capabilities, beyond providing power, wouldn't affect it. That leaves most of the performance and capabilities to the CPU. The APU gets its bandwidth and the platform stays inexpensive with two channels. Even a lower-end system might be able to forgo memory channels entirely to offset the cost of HBM.
That makes a lot of sense; plus, with Vega's HBCC memory manager it could access either HBM or main memory. No need for quad channel at all with HBM, if it is ever used, that is.
 

ZodaEX

Supreme [H]ardness
Joined
Sep 17, 2004
Messages
4,318
This MSI B350 AM4 board mentioned in the thread - why the hell does it have 2 PCI slots? Am I missing something? I had figured mobos with PCI slots would be kind of niche today, for the "legacy" market. Or is this a side effect of the apparently more limited PCI-E lanes of this chipset?

Pretty sure I'm picking one up as this 2700K is getting a bit long in the tooth... but also hard to justify a whole new platform when it works so well still. Decisions. They are hard.

Why the hell did my monitor come with an HDMI port when DisplayPort is the current standard? Oh yeah, that's right: because the world doesn't revolve around me and these products aren't designed exclusively to suit my needs alone.
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
60,629
Why the hell did my monitor come with an HDMI port when DisplayPort is the current standard? Oh yeah, that's right: because the world doesn't revolve around me and these products aren't designed exclusively to suit my needs alone.

I think you are confused about what's the standard. DisplayPort has been around a while, but it's more of an emerging standard than "the" standard. Many monitors come with different arrangements of DVI, HDMI, and DP. Sometimes they support any two of the three, or all three, but not always. Fuck, some monitors still have the shitty D-SUB connector even though DVI replaced it over a decade ago.
 

efishta

Limp Gawd
Joined
Oct 12, 2004
Messages
184
Why the hell did my monitor come with an HDMI port when DisplayPort is the current standard? Oh yeah, that's right: because the world doesn't revolve around me and these products aren't designed exclusively to suit my needs alone.

LOL, I dunno. 4K support at 60 Hz is about the only limitation HDMI has in practical terms, whereas PCI presents much greater limitations compared to PCIe, so I don't see them as comparable, but I get where you're coming from with your analogy. Apart from PCI sound cards, as another poster mentioned (which, to be honest, I thought were obsolete given the audio-stack changes in recent versions of Windows and the increased popularity of external audio interfaces, but I guess high-quality DACs don't go out of style...), I'm having a hard time coming up with other categories of PCI cards with enough widespread usage to necessitate TWO PCI slots on a next-gen, yet-to-be-released "enthusiast" platform.
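The 4K60 limitation is pure bandwidth arithmetic. A quick sketch, using the commonly cited TMDS rates for HDMI 1.4 (10.2 Gb/s) and HDMI 2.0 (18 Gb/s) with 8b/10b encoding overhead:

```python
# Raw video payload versus HDMI link capacity.
def raw_video_gbps(w: int, h: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Pixel data rate in Gb/s, ignoring blanking intervals."""
    return w * h * fps * bits_per_pixel / 1e9

uhd60 = raw_video_gbps(3840, 2160, 60)   # ~11.9 Gb/s before blanking
hdmi14_effective = 10.2 * 8 / 10         # 8.16 Gb/s after 8b/10b encoding
hdmi20_effective = 18.0 * 8 / 10         # 14.4 Gb/s after 8b/10b encoding

print(f"4K60 raw pixel data: {uhd60:.1f} Gb/s")
print(f"HDMI 1.4 effective:  {hdmi14_effective:.2f} Gb/s  (too slow)")
print(f"HDMI 2.0 effective:  {hdmi20_effective:.1f} Gb/s  (fits)")
```

Even before blanking intervals push the real requirement higher, 4K60 at 24 bpp overshoots HDMI 1.4's effective rate, which is why it took HDMI 2.0 to match what DisplayPort 1.2 could already drive.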

Yeah, you can argue that "some" people have uses for it, but I'm not convinced it's common enough to warrant taking up valuable motherboard space.

Secondly, through that question, I was wondering aloud whether this was a strategy they took to make up for the apparently limited PCIE lanes of the new chipset. Which seems like a reasonable approach, if faced with actual limitations. The architectural diagrams posted earlier seem to hint at as much.
 

Nathan_P

[H]ard DCOTM x3
Joined
Mar 2, 2010
Messages
3,464
I'm not so sure that X370 is the top-end chipset. I would not be surprised if an X390 version appeared in around 6 months with two x16 PCIe 3.0 slots running at full speed.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,678
I'm not so sure that X370 is the top-end chipset. I would not be surprised if an X390 version appeared in around 6 months with two x16 PCIe 3.0 slots running at full speed.

X370 is the top end.

2x16 3.0 native would require a new socket and a new CPU. That's why, like on LGA 11xx, you use a PLX chip or simply split it into 2x8.
 

Dan_D

Extremely [H]
Joined
Feb 9, 2002
Messages
60,629
They chose dual channel for many reasons most likely.

Less pins
Less silicon
Less cost to board manufacturers

I have no doubt that quad wouldn't help RyZen much but it for sure would help APUs.

Without more information it's hard to guess, but going with dual-channel RAM only may be to help keep platform costs low, as Ryzen's main goal was to compete with Intel's current mainstream processors. Even if Ryzen has exceeded that goal, it's still roughly what AMD was going for. Therefore, quad-channel memory may never have been on the drawing board for desktop Ryzen CPUs. I don't think pin count or silicon has that much to do with it. Motherboard costs only factor in if AMD considered this as part of targeting a lower price point for the platform. Given how X370 stacks up against Z270, I think this is probably the case.

Dual-channel memory designs require fewer trace paths and can potentially be done on thinner PCBs. The design is easier, as optimizing those trace paths is easier when you have half as many. You also need fewer DIMM slots and, more importantly, fewer power phases to implement it.
 
D

Deleted member 214115

Guest
I found Ars' information regarding the PCIe lane support to be thorough enough. I was very curious about how much I/O can be plugged directly into the processor and memory, rather than going through the southbridge I/O.
It's a bit of a TL;DR, so if it's already been posted, my bad.

From Ars Technica's review:

A platform that includes a chipsetless chipset

The Ryzen R7 is not just a processor; it's a system-on-chip. Whether it's used in this capacity, however, will depend on which motherboard and chipset it's paired with.

The processor itself has 20 PCIe 3.0 lanes, 4 USB 3.1 generation 1 (5 Gb/s) controllers, and a further 4 mixed PCIe and I/O lanes; these can be grouped for a single x4 NVMe device or split into 2 SATA plus 1 x2 NVMe or 2 SATA plus x2 PCIe. There are also two DDR4 memory channels.

Of those 20 PCIe 3.0 lanes, four are normally used to communicate with the chipset. AMD has three regular chipsets; the high-end X370, mid-range B350, and low-end A320. All include USB 3.1 generation 2 (10 Gb/s) controllers (two for the X370 and B350, one for A320), six USB 2 controllers, some additional SATA controllers (4, 2, and 2, for X370, B350, and A320, respectively), two SATA Express ports (which can be used as 4 SATA 3.0 ports), and some PCIe 2.0 lanes (8, 6, and 4). The X370 and B350 both enable overclocking, and the X370 allows the 16 remaining PCIe 3.0 lanes from the processor to be split into 2x8 channels for dual GPU support.
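The lane budget in the quoted breakdown tallies up cleanly. A quick sketch of the CPU-side PCIe 3.0 lanes on a regular (X370/B350/A320) board versus the chipsetless X300/A-B300 case, straight from the figures above:

```python
# Ryzen's 20 CPU PCIe 3.0 lanes under the two platform configurations
# described in the Ars quote (the 4 mixed PCIe/SATA/NVMe lanes and the
# chipset's own PCIe 2.0 lanes are counted separately in the article).
CPU_PCIE3_LANES = 20

regular = {
    "GPU (x16, or 2x8 on X370)": 16,
    "chipset link": 4,            # X370/B350/A320 consume 4 lanes
}
x300 = {
    "GPU": 16,
    "freed-up general purpose": 4,  # SPI chipset link leaves all 20 usable
}

for config in (regular, x300):
    assert sum(config.values()) == CPU_PCIE3_LANES
    for name, lanes in config.items():
        print(f"{name}: x{lanes}")
```

This is why the small-form-factor chipsets matter: the same silicon gains four usable PCIe 3.0 lanes just by not spending them on the chipset link.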

But for small systems, the extra I/O and extra size of a chipset might be undesirable. Accordingly, there are two other chipsets, X300 and A/B300, that greatly diminish what it means to be a chipset. With X300 or A/B300, the only I/O capabilities are those within the processor itself; they don't include USB 3.1 generation 2, they don't include SATA Express, they don't add any PCIe 2.0 lanes. The chipsets provide a handful of functions, mainly around security and provision of a Trusted Platform Module. As such they are tiny: AMD says they'll fit on a chip the size of a fingernail.

Because these chipsets do so little, they don't need four PCIe lanes from the processor; there's a dedicated SPI link for them. This means that all 20 of the processor's PCIe lanes become available. The X300 also enables overclocking and dual GPUs.
 