
AMD has Ryzen? Eh, kinda

You keep doing this... which Ryzen chip are you comparing it to? It seems you don't understand that AMD has all but caught up to Intel, and at a lower price point. Intel will have to react to this, as price is king for 99% of consumers. The more you diss AMD and favour Intel, the more I suspect you are nothing but an Intel fanboy. I couldn't care less... I go for value for money, and here AMD has caught up big time, which is what most reviewers are pointing out.

Price is king, but the competition is closer in price than you might think at the low end. AMD's pricing decisions here have baffled me a bit. They rightfully compare the R7 line to the 6800k/6900k/6950X, where they absolutely blow Intel out of the water in terms of value; they match the performance of the 6900k at half the price. Why leave such a large gap!?

But once you drop into the price range of the 7700k, things get weird.

7700k: $350, 4.2GHz base, 4.5GHz turbo. R7 1700: $330, 3.0GHz base, 3.7GHz turbo. In this price bracket, AMD goes up against the best gaming CPU that exists and loses out in performance. The R7 line just isn't competing with this processor; the intended usage is too different. Overclockers can get fantastic value out of either: 7700ks usually hit 5.0GHz, while an overclocked 1700 manages to keep up with the $500 1800X.

7600k: $250, 3.8GHz base, 4.2GHz turbo. 1600X: $250, 3.6GHz base, 4.2GHz turbo. While the 1600X boasts two extra cores plus AMD's hyperthreading equivalent, Intel's IPC advantage and slightly better clock speeds still win out for the pure gamer. Now, an argument definitely still exists here for the 1600X. There are games and other tasks that can take advantage of the extra cores for a smoother experience, and obviously other productivity work sees a boost. Against that, the Kaby Lake chip has superior overclocking by a large margin. This is what I would consider the "high-end gamer without an unlimited budget" matchup. It's a big price jump to the i7/R7 line, and the extra money is usually better spent on a better GPU: a GTX 1070 + 7600k beats a GTX 1060 + 7700k.

7600: $230, 3.5GHz base, 4.1GHz turbo. 1600: $220, 3.2GHz base, 3.6GHz turbo. Similar comparison: two extra cores, but lower speed and IPC than the Intel equivalent. Non-overclockers will see better gaming out of the Intel chip, but overclockers should probably pick the 1600.

Now we're on 4 core vs 4 core.
7500: $205, 3.4GHz base, 3.8GHz turbo. 1500X: $190, 3.5GHz base, 3.7GHz turbo. Pretty equivalent clock speeds here; the Intel IPC advantage wins again for non-overclockers, but an overclocker still gets a better deal from the AMD chip.

7400: $185, 3.0GHz base, 3.5GHz turbo. 1400: $170, 3.2GHz base, 3.4GHz turbo. Exact same scenario as the above.

So, the short story is, it's a tighter race at the low end. AMD's marketing department is pretty smart and emphasized the higher end comparisons as well as 4K/1440p benchmarks.
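If you want to eyeball those gaps side by side, the matchups above drop neatly into a quick script. (Prices and clocks are the launch figures quoted in this post, with the $170 quad listed as the R5 1400; treat the numbers as a sketch, not a spec sheet.)

```python
# Intel-vs-AMD matchups from the post above: (chip, USD price, base GHz).
matchups = [
    (("7700k", 350, 4.2), ("R7 1700", 330, 3.0)),
    (("7600k", 250, 3.8), ("1600X", 250, 3.6)),
    (("7600", 230, 3.5), ("1600", 220, 3.2)),
    (("7500", 205, 3.4), ("1500X", 190, 3.5)),
    (("7400", 185, 3.0), ("1400", 170, 3.2)),
]

def gaps(intel, amd):
    """Price and base-clock gap at each bracket, Intel minus AMD."""
    return intel[1] - amd[1], round(intel[2] - amd[2], 1)

for intel, amd in matchups:
    price_gap, clock_gap = gaps(intel, amd)
    print(f"{intel[0]:>6} vs {amd[0]:>7}: ${price_gap:+} price, {clock_gap:+} GHz base clock")
```

The spread is only $10-20 at the bottom of the stack, which is why the low-end race is so much tighter than the R7-vs-6900k headline suggests.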

Once again I must emphasize this is all a "pure gamer" comparison. If you're doing productivity stuff that involves video encoding/streaming/etc, then you should be eyeballing that R7 line pretty hard right now.

If I were Intel, my response wouldn't be price slashing. I think I'd just make sure the i5 line got hyperthreading in the Whatever-Lake-Comes-Next series, and probably drop this stupid locked multiplier nonsense that stops people from overclocking the non-k chips. Yes, overclocking is still a niche enthusiast market... but now those guys have the entire AMD line to look at while Intel only offers them two quad-cores and one dual-core option, if you don't count the $500+ range that very, very few people are in the market for.
 
Sure, but that's a silicon lottery issue. You paid for a dual core processor. If core number 3 happened to be totally defective, would you have complained about it? If you want to guarantee 4 viable cores at the advertised frequency, you pay for the 4-core. If you roll the dice on a 2-core and it turns out all four work fine, bonus!

Of course, sometimes it is pure marketing. A lot of the early RX480 GPUs in the "4GB" model were actually 8GB cards with literally a different sticker slapped over the top of the old one (and the BIOS set to disable the extra VRAM). Why is hard to tell; I think they wanted to make as many units as possible for launch, so it was faster to run everything on one production line, built the same way. People who bought the early ones could flash them to unlock all 8GB of VRAM.

...and many poor bastards tried to do the same with later models and bricked their GPUs.

AMD tries some weird **** to compete, is the takeaway!

The production line limitation has always been AMD's problem. Intel's biggest asset is the scope of its production and its ability to push new products when threatened by AMD. AMD releases its best and brightest; Intel usually has several generations waiting in the wings.

I've been wanting to support the AMD proc+vid card for years, but they haven't put it all together at one time in a long while. Maybe with Ryzen2 and Vega they will get there. I do like the Freesync monitors in conjunction with the AMD graphics.
 
Freesync is nice because, well, it's free. However, it does highlight the weakness of AMD's cards: Freesync only works within a certain refresh range, dependent upon the monitor. Your GPU has to be powerful enough to actually reach that range, which will be a problem at 1440p/4K until Vega launches.

Nvidia's G-Sync works better, but that price premium is irritating, as is being locked in to Nvidia GPUs. I think Nvidia will be forced out of this position in a year or two, though, as the HDMI 2.1 spec requires adaptive sync to be available. They can only put off adopting that for so long.

I hope, anyway.
 
All true, but then AMD does make cards that are powerful enough AND they are cheaper than comparable Nvidia products, for the most part, especially those that support G-Sync.
 
All Nvidia GPUs support g-sync.
 
Well, not all. The GTX 650 Ti and later support it. Everything from the Radeon HD 7xxx on supports Freesync... so both date back to 2012.

The lower end of the refresh range on most Freesync monitors is 40-48Hz, some as low as 30Hz. I don't know that you need a particularly beefy card to hit better than 30 FPS, and if you are gaming below 40 FPS then it is time to think about an upgrade anyway.
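Since adaptive sync only engages inside the monitor's window, the check is trivial to express. A sketch with made-up panel numbers (real monitors complicate this with low framerate compensation, so treat it as a first approximation):

```python
def in_freesync_range(fps, low_hz, high_hz):
    """True while the frame rate sits inside the monitor's
    adaptive-sync window; outside it you fall back to
    tearing or plain v-sync."""
    return low_hz <= fps <= high_hz

# Hypothetical 48-144 Hz FreeSync panel:
print(in_freesync_range(90, 48, 144))   # comfortably inside the window
print(in_freesync_range(35, 48, 144))   # dipped below it: adaptive sync drops out
```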

I suppose someone may have an old Radeon HD 7xxx that can't run DOOM on high settings above 20 FPS... but that is just not a rational expectation. ;)
 
The situation changes!

So, Intel has a... response? To the new competition with the Skylake-X and Kabylake-X processors on the X299 platform. It's a strange response, though. This is the "enthusiast" platform, so "low" and "high" become veerrry relative phrases.

The 7640X and 7740X are Kabylake-X chips. They're essentially the 7600k/7700k pasted into the X299 socket. The better power delivery and surface area for heat dissipation mean these chips can overclock a little better than their mainstream counterparts. I really don't understand these processors. They're effectively the same as the regular Kabylake chips, and priced the same ($242 and $339), but you're stuck on a set of much more expensive motherboards whose featureset you can't even use entirely. The X299 platform is capable of supporting 44 PCIe lanes, but Kabylake-X gets you only 16. You'll also lose 4 of the 8 possible RAM slots. So, other than slightly better overclocking headroom, there's nothing to show for the possibly hundreds more you spend on the motherboard.

Moving up in the ranks to the 78XX series, we get the 6-core 7800X and the 8-core 7820X for $389 and $599, respectively. In core count, best compared to the Ryzen 5 1600X and Ryzen 7 1800X at $250 and $500, respectively. I like this competition. Previously, the 6800k was Intel's cheapest 6-core model at $434 and the 8-core 6900k was over $1000. So, about a ~30% price drop for the Intel 6/8 core entries. But adding to the weirdness: the 6-core model will basically be obsolete next month with Coffee Lake's 6-core mainstream chip. What on earth is Intel thinking?

New to this generation's marketing is the "i9" line. Last generation we'd have still called these chips i7s. But 9 is two more than 7, and that's better! The 7900X 10-core for ~$1000, 12-core 7920X for $1200, 14-core 7940X for $1400, 16-core 7960X for $1700, and the 18-core 7980XE for $2000. Christ. Ridiculous as these prices seem, this is actually an improvement. Previously, the big daddy chip was 10 cores at $1700, now you can get 16 cores at that price or 10 cores for $1000. Yay?
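Put another way, here's the back-of-envelope dollars-per-core math on those quoted launch prices (a sketch using the figures above, nothing more authoritative):

```python
# i9 lineup as quoted above: name -> (cores, USD price).
lineup = {
    "7900X": (10, 1000),
    "7920X": (12, 1200),
    "7940X": (14, 1400),
    "7960X": (16, 1700),
    "7980XE": (18, 2000),
}

def dollars_per_core(cores, price):
    """Launch price divided evenly across physical cores."""
    return round(price / cores, 2)

for name, (cores, price) in lineup.items():
    print(f"{name}: ${dollars_per_core(cores, price)}/core")
```

Everything lands around $100-111 per core, so the real "improvement" is that the per-core rate now holds all the way up the stack instead of ballooning at the top like the old $1700 10-core did.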

AMD does have a response: Threadripper for their many-core consumer market, and Epyc for their server platform. AMD needs to fire their marketing team out of a cannon, into the sun. AMD is basically just scaling up Ryzen for these chips. I don't have a lot of details, but the 12 and 16 core variants are allegedly $800 and $1000. Not bad, two extra cores over the 7900X for $200 less, or six extra cores for the same price. For fully multithreaded applications, a definite competitor in value.
Threadripper, I believe, launches next month.
Epyc is a server platform that I'm not well-qualified to comment on. I'm not sure it will be as competitive as AMD hopes it is, from what I understand the higher power draw of the AMD chips is a much, much bigger factor in a server farm. The electric bill is not trivial, and every extra watt for the processor is also a watt of extra heat you need to get rid of. But that's Amazon's question to figure out, not mine.

As with the previous Intel/AMD comparisons, we still see Intel with a substantial clockspeed advantage when overclocked, and an IPC advantage, resulting in better performance overall. Gamers looking for every last frame per second for CS:GO on their 240hz monitor are still probably sticking with the 7700k. These new chips, obviously, are in the professional/productivity/extreme enthusiast price range.

One thing I could see being popular among the Twitch streaming crowd is the 7820X or 7900X. Expensive as hell, but able to deliver very close to 7700k levels of single-thread performance, which is still important for gaming, while having tons of leftover capability for encoding/recording/streaming, so you wouldn't need an extra capture card or a second streaming system. Niche market, and not the most cost-effective solution for this, but certainly convenient to be able to do all of it on one box without any slowdowns. (Although maybe a 6-core variant is sufficient for this? I'm not familiar enough with streaming to say.)

Fun fact: when overclocked, some of the people reviewing the 7900X are finding a power draw of four hundred watts. It's impossible to cool with an air cooler at that point, and even high-end AIO liquid coolers can hit thermal limits and start throttling. Intel has doubled down on their crappy thermal paste between the CPU die and the heat spreader, which was previously soldered. This creates a bottleneck in heat transfer which really screws with enthusiast overclockers. You can get around it by delidding the chip and replacing the crappy paste with some good stuff, at the risk of destroying your thousand-dollar processor and voiding your warranty. But hey, Intel saved like 5 bucks per unit on their thousand-dollar product. :roll:
 
The thing I said about AMD's marketing team... well it got better. Behold, Threadripper's actual retail packaging.

[image attachment: Threadripper retail box]
 
My current main rig is now over 10 years old, and I still have no interest in replacing it.

Since 2006 the only thing I have done is upgrade the processor, from a 64-bit dual-core Athlon X2 3000 to an Athlon X2 6000. Faster performance, and that has been enough (RAM and GPU upgrades not counting). None of the improvements in the last decade have made me interested in scrapping my CPU-MB-RAM for such a small incremental improvement.

It is 10 years old, but still does everything I need of it. It is not like back a decade ago when CPU speeds were jumping by a huge factor every other year. I will probably use this system until it eventually dies of old age.
 
New Ryzen Threadripper chip benchmarks and prices... wow... Intel is in serious trouble. Intel needs to lower prices or make a leap in tech, because right now there is zero justification to buy Intel CPUs at almost any level, given the massive price difference and equal-to-near-equal performance for the most part.
 
For high-end desktops, yeah, Threadripper is a clear value winner. There are only a few niche things, like some virtualization features, that are still apparently not operational on Ryzen chips, which is a dealbreaker for people with those specific tasks in mind (though I'm not personally familiar with that). This goes double given that the X299 platform was a huge stumble. The chips are power hungry as all hell and barely outperform their previous-generation equivalents. Despite being "Skylake-X," the X299 chips actually underperform in gaming workloads because the new interconnect between the cores adds latency. Even clocked the same as a Skylake chip, you don't get the same per-core performance as a regular Skylake core.

But we have to remind ourselves that 8+ core processors and the people who buy them are... not actually representative of the CPU market. Threadripper, like Intel's X299 platform, represents something like 0.01% of buyers: the high-end gaming nerd crowd. The real money is almost entirely in laptops and servers.

Intel wins hands down on the laptop front. The large gulf in power efficiency sees to that. AMD's Epyc line might be breaking into the server market now.

Budget gamers have a ton of great options in the R5 and R3 line now. Intel's 7700k is still the king of enthusiast high-FPS gaming, but that lead is shrinking. (and it comes at a hefty price tag) More and more titles are starting to stretch what a 4-core chip, even with hyperthreading, can do.

I absolutely love this new competition. We already see the results: Intel scrambled a bit to shove Skylake-X out the door, (falling flat on their face and who doesn't love seeing the big bully fall down once in a while) and the upcoming Coffee Lake will finally see 6-core chips on the mainstream consumer platform. The Sandy Bridge 4-core chips were a generational leap, vaulting well ahead of AMD's chips at the time with a huge leap in performance and efficiency. That was 2011, and a 2500k or 2600k from that era is actually still a solid chip if overclocked. Since then? Nothing special. Still 4-core chips, one unlocked, one locked, one unlocked with hyperthreading, one locked with hyperthreading. For six years, just tiny incremental gains from Intel because AMD wasn't putting up a fight.

The second AMD launches real competition with upped core counts? Oh, suddenly Intel has a 6-core option in the pipe. If Coffee Lake's high end chips are 6-core versions of the 7600k/7700k, they'll be the new raw gaming performance kings for enthusiast builders, with AMD's offering a better value for many.
 
AMD is going to launch Ryzen for laptops soon.. that will be interesting.
 
I spent most of yesterday trolling /r/amd

Ryzen is a nice chip, but the errors right now are bad, with BIOS issues and lower-clocked memory. The chip (1800X) is maxed out right out of the box; it needs a water cooler to stay at 100% load at base clock, and forget about overclocking. You can joke about the enthusiast line being overpriced, but the lower-priced 7700k is better for gaming and the lower-priced 6800k is better for multitasking. Ryzen is close, but it's not there yet. Hard pass from me.

On the bright side, with the money I didn't spend on a Ryzen I pre-ordered a GTX 1080 Ti.

Many moons ago, I built a PC with an Athlon chip in it. It was all the rage back then. I had it about 6 months, and one day I was playing Civ 3, which had just released if memory serves...

I smelled ozone, then the screen went wonky... and then smoke. Lots of blue and grey smoke. I pulled the power cable and grabbed the extinguisher... wish I had pics of this, but the damned thing MELTED my MOBO, burned it out badly. I have been Intel ever since.
 
The second AMD launches real competition with upped core counts? Oh, suddenly Intel has a 6-core option in the pipe. If Coffee Lake's high end chips are 6-core versions of the 7600k/7700k, they'll be the new raw gaming performance kings for enthusiast builders, with AMD's offering a better value for many.

Unless they have the IPC and speed to match the 4 core chips will always be king because game optimization is a very long way from scaling beyond 8 threads
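That scaling ceiling is basically Amdahl's law. A quick sketch, with the parallel fraction being a made-up illustrative figure rather than anything measured from a real engine:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of each
    frame's work can be spread across cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assume a game where 60% of the frame work parallelizes:
for n in (4, 8, 16):
    print(f"{n} cores: {amdahl_speedup(0.6, n):.2f}x")
```

With 40% of the frame stuck on one thread, going from 4 to 16 cores only buys about another 25% on this model, which is the "four fast cores beat many slow ones" argument in a nutshell.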
 
I was never into all the AMD rage. There was like an 8-month period where they had a better chip than Intel; other than that, they have always been a budget option that ran slightly hot. In your case, quite a bit hotter.

I haven't used my Ryzen machine much. I paired my 7600k with my 1080 Ti and haven't had any issues gaming or otherwise.
 
Unless they have the IPC and speed to match the 4 core chips will always be king because game optimization is a very long way from scaling beyond 8 threads

8 physical cores is still better than 8 virtual cores for such applications, generally speaking. There are some titles where the 6/8-core chips have superior 1% low or frametime metrics, which tells you the 4-core chips are choking slightly on those titles even though average FPS numbers might not show it. I don't think "very long way" is accurate.
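For reference, the 1% low metric is simple to compute from raw frametime logs. A sketch (the hitch numbers are invented for illustration):

```python
def fps_stats(frametimes_ms):
    """Average FPS and 1% low FPS from per-frame render times (ms).
    1% low = average FPS over the slowest 1% of frames."""
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    one_pct_low = 1000.0 / (sum(worst[:n]) / n)
    return round(avg_fps, 1), round(one_pct_low, 1)

# 99 smooth 10 ms frames plus a single 50 ms hitch: the average
# barely flinches while the 1% low collapses.
print(fps_stats([10.0] * 99 + [50.0]))
```

That's exactly how a choking quad-core can post a healthy average FPS while feeling noticeably rougher than a 6/8-core chip.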
 
I would question the board manufacturer, and if the heat sink was from an OEM boxed set, or some aftermarket one.

I will be honest here, 30 years in the industry and I have never seen that happen. In fact, I have even torture tested Athlon (and other) processors to destruction and never seen that happen. By design they should burn up long before they would ever generate enough heat to do that.

So I question a bad Motherboard. Heaven knows I have seen some real garbage ones out there over the decades.
 
It was the stock heatsink. I was not pleased. Melted into the MOBO.
 
Such a thing would require a catastrophic electrical fault. You can run a CPU with no heat sink at all and it shouldn't melt.
 
I don't disagree. Doesn't change the fact the CPU/MOBO area melted.
 
This forum needs a "computer nerds" subforum ;)

Well, AMD's "Ryzen" line of CPUs has finally started to launch. Long-anticipated shakeup of a badly-stagnated CPU market. Intel has had such a strong lead for so long that we've been getting minor, incremental iterations with each passing year but nothing really exciting. And at the high-end range, some pretty hilarious price gouging on the "enthusiast" chipsets. But this is why they could do it:

[chart attachment]

It's been almost a decade since this was a real competition.

The Hype Train has been running full steam these last few weeks, with leaked benchmarks and pricing showing that AMD had a CPU going head-to-head with Intel's $1000, 8-core 6900k for half the price. Which it does! Kind of. The processor definitely performs very well for a very good price compared to Intel's 6900k, 6800k, 5820k, etc. A solid multithreading chip. In a more multithreaded environment, i.e. video encoding, streaming, etc., AMD has something you should definitely be looking at.

That I will believe only when I see it. Using AMD for video work has long been like running on a treadmill in a suit of armor, inside a sauna because it gives off so much heat.

Still, I will keep an eye on it. I'll be doing a major upgrade toward the end of the year. AMD's got a pretty big wall of presumption to clear.
 
The reviews I have seen... and there have been a lot... show the Threadripper Ryzen chips plus the new AMD graphics cards running circles around Intel when it comes to content creation (minus the Final Cut BS), especially when you factor in price. Now they do give off more heat, but the heat is proportional to the power... which it was not before.

It is almost like Threadripper was designed for content creation.
 
The heat is equal to its consumption of electricity, not its processing power :)

Power efficiency is on Intel's side still, but that's something that primarily matters to server builders. Good AC takes care of Threadripper. Epyc? (Ugh, the names.) That power-efficiency issue could be a big deal there. Epyc chips cost less for the performance than Intel's various Xeon processors, but when you're installing hundreds or thousands of the things, extra wattage per unit is an expensive problem. You raise the power bill to run the chips, and raise it again to cool the building against the extra heat.
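It's easy to put rough numbers on that. A sketch of the fleet-level power math, where the electricity rate and the cooling-overhead factor are assumed figures, not anyone's actual datacenter costs:

```python
def annual_power_cost(extra_watts, units, rate_per_kwh=0.10, cooling_overhead=0.5):
    """Yearly cost of extra per-socket wattage across a fleet running
    24/7. cooling_overhead approximates the additional energy spent
    removing the heat (0.5 = 50% on top of the raw draw)."""
    kwh = extra_watts * units * 24 * 365 / 1000.0
    return round(kwh * (1.0 + cooling_overhead) * rate_per_kwh, 2)

# 30 extra watts per socket across a 1,000-server deployment:
print(f"${annual_power_cost(30, 1000):,.2f} per year")
```

So even a modest per-chip efficiency gap compounds into real money at datacenter scale, which is why the Epyc-vs-Xeon fight isn't decided by sticker price alone.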
 
The heat is equal to its consumption of electricity, not its processing power :)

Yes and no. Power efficiency more often than not means less processing power.

Power efficiency is on Intel's side still, but that's something that primarily matters to server builders.

Yes and no. Intel is only leading because AMD has not had anything serious in the mobile market for decades, if ever. That might change now.
 
Such a thing would require a catastrophic electrical fault. You can run a CPU with no heat sink at all and it shouldn't melt.

In almost every case, the CPU itself will become fried, long before a motherboard would fail in that way.

And "stock heatsink" can only mean it was an OEM set, which has a 3 year warranty.
 