> 16 P cores is not HEDT.

Problem is: 14P cores is HEDT (specs in sig). /fight me.
> You clearly didn't look at mine.

I saw your sig; my response was tongue in cheek. I want the extra PCIe lanes so I can run 3 x M.2 SSDs without sharing slots or running through the chipset. I don't really have any pressing need for multiple cards in the PCIe slots, but it's nice to know they're there in case Nvidia brings back some form of multi-GPU implementation (DLSS5?).
Our systems were HEDT, in 2019; 4-core Nehalem was HEDT in 2009, 6-core Sandy Bridge-E in 2011, etc. Extra memory channels, extra PCIe slots (all mine are in use, how about you?), most of the server platform features plus OC: it's generally about having more cores and other features over the standard desktop platform.
> 16 P-cores has been on mainstream desktop since AMD's 3950X, and 24 is coming with Ryzen 10k. You can still get Threadripper/Xeon-W chips with fewer cores than desktop, but that's super entry-level stuff for those that need the platform more than the cores. Intel is just spamming the E-cores, and I don't think it's providing the real experience they want us to believe it is over a comparably priced AMD performance-only CPU, instead of upping the good-core count with that sweet AVX-512.

Agreed, AVX-512 is a game changer that Intel has pretty much had to abandon due to E-cores. I guess we'll have to see how the 256-bit implementation on the E-cores works before forming an opinion.
Agreed. A chip that just has more cores than its brethren, maybe because of a second chiplet, but is still socketed on the same limited desktop motherboards, doesn't deserve to share the HEDT label with the likes of TR and Xeon-W (and the former true HEDT Core(-X) products). Core 2 Quad wasn't the start of HEDT, either; it was just a higher tier of a mainstream platform from before there was true differentiation.
> A chip that just has more cores than its brethren, but is still socketed on the same limited desktop motherboards, doesn't deserve to share the HEDT label with the likes of TR and Xeon-W. [...]

I wouldn't call any Threadripper or Xeon HEDT. Those are workstation. Even the later Intel X series were slipping hard towards workstation: they were basically overclocked server chips with ECC disabled, but not quite as fast as the desktop chips in games most of the time, since they didn't clock as high. HEDT to me is what I used to build an SLI rig with a pair of GTX 680s back in 2012; then I added a NIC that wanted an 8x slot. So to me, HEDT = cram a lot of stuff into a gaming rig: a Socket 2011 X79 board and an i7-3820 that was only a little more money than a regular desktop i7 and about as fast CPU-wise, with lots more I/O and RAM capacity. If you wanted, you could pay more for a 6-core CPU. To me it's not current HEDT if it can't keep up with current desktop chips in games. I like more than 8 cores for occasional hobby use, but I'm not going to pay up for an expensive workstation setup that runs games slower.
> I wouldn't call any Threadripper or Xeon HEDT. Those are workstation. [...]

Bearing in mind that X299 coupled with an X-series CPU also allowed for quad-channel memory, I can get bandwidth that's right up there with many DDR5 systems. It was something that set HEDT parts above entry-level consumer-grade parts, and I think this is the point Grebuloner is making: increased core count alone doesn't really make a CPU a HEDT part.
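For what it's worth, the quad-channel claim holds up on paper. A back-of-the-envelope sketch (theoretical peaks only; the DDR4-3200 and DDR5-6400 speeds are illustrative assumptions, and real-world throughput is lower):

```python
# Theoretical peak bandwidth = channels * transfer rate (MT/s) * 8 bytes,
# since each channel has a 64-bit data bus. Decimal GB/s.
def peak_gbps(channels: int, mts: int) -> float:
    return channels * mts * 8 / 1000

print(peak_gbps(4, 3200))  # quad-channel DDR4-3200 (X299)        -> 102.4
print(peak_gbps(2, 6400))  # dual-channel DDR5-6400 (mainstream)  -> 102.4
```

Same number either way, which is exactly the point: four slower channels match two fast ones.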
I did build an X299 system, largely because at the time desktop boards just didn't have enough slots. I wanted a minimum of five 4x PCIe devices: 3 M.2 + a 10Gb SFP+ NIC + some spare for expansion, because it seems like I always end up wanting to add something over the life of a rig. That X299 build could provide an 8x slot for my NIC, 3 onboard M.2 plus 4 on a 16x card for 7 M.2 total, and still run the video card at 16x. Fast forward to now and basic Z890 boards have 6, usually 4 M.2 and 2 4x slots. Some AMD boards have 5-6, but they tend to get into that "use this and that turns off" mess, and some of it's PCIe 3.0 because they steal lanes from SATA ports. I want an "HEDT" chipset that has more PCIe lanes and lets me stuff more storage and cards into a rig. It would be nice if it had a bit better CPU, but at this point a "9" desktop CPU is plenty good enough for me.
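If you want to audit that "use this and that turns off" lane sharing on a running Linux box, `sudo lspci -vv` reports both what a slot supports (LnkCap) and what was actually negotiated (LnkSta). A sketch that parses a fabricated fragment of that output (the device and widths below are made up for illustration):

```python
import re

# Hypothetical fragment of `sudo lspci -vv` output for one device:
sample = """\
01:00.0 VGA compatible controller: SomeVendor GPU
        LnkCap: Port #0, Speed 16GT/s, Width x16
        LnkSta: Speed 8GT/s (downgraded), Width x8 (downgraded)
"""

def negotiated_width(lspci_text: str) -> int:
    # LnkCap = what the device supports; LnkSta = what it actually got.
    m = re.search(r"LnkSta:.*Width x(\d+)", lspci_text)
    return int(m.group(1))

print(negotiated_width(sample))  # -> 8: the x16 card fell back to x8, lanes are shared
```

A quick grep like this across all devices shows at a glance which slots got starved when the M.2 sockets filled up.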
> So... a 2-lever setup like my Socket 2011 and 2066 rigs? All that was old is new again!

lol, gotta keep all those tiny little pins connected when that long-ass chip tries to flex. Whenever I closed one of those sockets, I was always worried I was going to crack something from the amount of force that second lever required.
> lol, gotta keep all those tiny little pins connected when that long-ass chip tries to flex. [...]

I always found LGA2066 somewhat less daunting, as there was less pressure on a single lever. FCLGA1700 really wasn't a great design: it let the IHS flex, which caused uneven contact between the IHS and the cooler unless a contact frame was used. Going back to two levers will hopefully resolve this problem 100%.
One of the Ultra 7s has only 4 P-cores max?
Looks like there are other Ultra 7s with 8 P-cores.
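On Linux you can settle a P-core/E-core count from `lscpu -e=CPU,CORE,MAXMHZ`: P-cores show two SMT threads per core and a higher max clock, E-cores one thread and a lower clock. A sketch against fabricated output for a hypothetical 4P+8E part (the layout and clocks here are invented, not any real SKU):

```python
# Fabricated `lscpu -e=CPU,CORE,MAXMHZ` output for a hypothetical 4P+8E chip:
lscpu = """\
CPU CORE MAXMHZ
0 0 5000
1 0 5000
2 1 5000
3 1 5000
4 2 5000
5 2 5000
6 3 5000
7 3 5000
8 4 3800
9 5 3800
10 6 3800
11 7 3800
12 8 3800
13 9 3800
14 10 3800
15 11 3800
"""

def count_core_types(text: str) -> tuple[int, int]:
    # Map each physical core id to its max clock, then split on the top clock.
    mhz = {}
    for line in text.splitlines()[1:]:
        _cpu, core, clock = line.split()
        mhz[core] = int(clock)
    top = max(mhz.values())
    p = sum(1 for v in mhz.values() if v == top)
    return p, len(mhz) - p

print(count_core_types(lscpu))  # -> (4, 8): 4 P-cores, 8 E-cores
```

This is a heuristic (it assumes every P-core boosts to the same top clock), but it is good enough to check a spec sheet against what the OS actually sees.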
https://youtu.be/KIK4fgsYtqQ?si=tnqWDUgIEBpnLFKp
> Onboard IGP using power and heating the CPU, as it will NEVER be used.

You can usually disable it in the BIOS; that's a non-issue.
> You can usually disable it in the BIOS; that's a non-issue.

A disabled GPU still uses power (albeit a small amount), and a KF is usually cheaper. Why would I pay more for a "GPU" I will never use?
> A disabled GPU still uses power (albeit a small amount), and a KF is usually cheaper. Why would I pay more for a "GPU" I will never use?

Because when you're troubleshooting the melted power connector on your 6090, you can still use your system.
> Cheaper is an obvious reason to pick a KF when that's the case. Paying more for an iGPU despite owning a discrete GPU is about resale value (or a second life as a server because you don't sell it; even headless, it's nice to have the option of a monitor for initial setup and whatnot) or emergency GPU issues/debugging. The power usage of a disabled iGPU must be minuscule (even enabled but unused, it is minimal), and with Quick Sync often being better than a discrete GPU for some things, it's probably better to keep it enabled. It is a non-issue; it would be different if the KF didn't have the actual silicon for the iGPU on there and you gained something by its absence, but that's not the case.

I am not selling my systems; my RTX 4090 system goes to my wife. I dislike midget "GPUs" on my CPU; I am not a console user.
> Because when you're troubleshooting the melted power connector on your 6090, you can still use your system.

I am not part of that 0.06% minority that is subpar at connecting cables.
> I am not part of that 0.06% minority that is subpar at connecting cables.
https://www.pcworld.com/article/138...ufacturing-debris-behind-12vhpwr-melting.html
> I am not selling my systems; my RTX 4090 system goes to my wife. [...]

All fair enough, but they are always there; it's not like the KF doesn't have them, they're just disabled. If the option existed to buy a CPU that didn't spend silicon on the GPU and gave you more CPU in exchange, then whether Intel made one for a new generation would be a bigger deal, but as it works it doesn't matter.
Too many idiots voicing the same "concerns" in earnest online for this kind of joke to work without sarcasm tags.