Official AMD "Ryzen" CPU Discussion

Status
Not open for further replies.
Is that the voltage when you’re gaming?
hopefully the new cooler sorts out your issues. make sure to give us an update when you get it installed.

That's the voltage during Cinebench CPU benchmark and during a quick game of Squad. I'm hoping I get the cooler today because tomorrow is a national holiday here in Australia so I'd have all day to fuck around with it.
 
They're advertising the 2700X at 4.3 GHz turbo, or whatever AMD calls it. An i7-8700 is 4.6 turbo. You're getting two more cores and four more threads with Ryzen 2.
The R5 2600X is 4.3 GHz turbo and the i5-8600 is 4.6. You're getting the same core count but six more threads this time.
Prices are comparable.
They're still not going to trade blows with the 8700k, but for anything under that I would look towards Ryzen. That of course depends on benchmarks.
To add to this, the 8600 factory turbo frequency is only guaranteed for a single core.

The Ryzen 2600X turbo speed is advertised for all six cores, which is why core-vs-core there isn't much of a gap at all in the benchmarks
 

iirc the 8600 boosts to 4.0 on all cores. I still wouldn’t buy it though.
I stand by my statement above, the 8700k is the only intel processor I’d buy from this gen.
It’s great to see AMD putting the boot to Intel’s throat this gen. They’ve already forced Intel to shake up their product lines. They’ll have to do it again.
This gen’s Xeon lineup is a mess as well.
The only thing I don’t like about the new Ryzen chips is there’s no onboard video. Onboard can be great for diagnosing issues.

Has anyone read why AMD can’t put onboard video in place of the dummy dies on Threadripper?
 
The Scythe Fuma rev B could be added to that list. From my experience with Scythe fans, I wouldn't expect them to last as long as the others. So you'd be saving a little bit up front, but it will cost the same after 2 years when you have to replace the fans.
Ha! How ironic. Have you forgotten our exchange? I had noticed the Fuma's incredible performance/price ratio:
http://forums.sherdog.com/posts/130618469/
To which you finally responded with this:
http://forums.sherdog.com/posts/131397577/

That's why I took the Fuma off. The very reason I chose not to add that cooler to the list was because of what you had written about the sleeve bearing types, your personal experience, and consequently the shorter lifespan of the Scythe coolers as built. Build quality is a factor towards Tier 1, so while the Fuma was definitely the mid-tier performance king at that time, competing with the big boys, I looked at it the way I did the "flagship killer" budget Android phones: all the same specs on paper as the real flagships, including display resolution, but corners cut where it counted, with displays that were vastly inferior and would disappear the moment you walked outside.

Or did they change to ball bearing sleeves with the Rev. B?
 

The first gen had a higher than normal mounting pressure that ran a risk of bending Skylake chips. Some of the older Arctic Cooling AIOs can bend Skylake as well. I don't know if Skylake has a thinner IHS or what the deal is.
Rev B lowered that mounting pressure. It still uses the shitty sleeve bearing fans.
 
Ah, gotcha. I remember quite a few CPU coolers had those issues with Skylake, and that concerned you.

Definitely not gonna add it to Tier 1 with that build quality and its 2-year warranty. I bet we could add quite a few coolers to the list if we replaced their default fans with the Noctua F series fans, for example, but when that makes them more expensive than the Noctua NH-D14 or NH-D15 themselves, what's the point?
 

If someone only had $50 for a cooler and needed one now, it wouldn't be a bad choice as long as you understood you'd have to replace the fans down the line.
I don’t keep up on mid tier coolers though so there might be something better.
IMO it’s either a Hyper 212 or you step up to the tier 1’s you listed. If you don’t have the clearance for the tier 1’s, go to a 240 AIO.
The new Ryzen chips change that though so I should probably do some research. I haven’t seen any charts, but I’ve heard the Tech Deals guy say on a stream the 212 only performs 3% better than the Wraith Prism.
I prefer a tower cooler over a flower style. I don’t know why though.
I’m excited for the copper Cryorig versions coming out, they’re claiming a 15% increase in cooling on the C7.
 
My Noctua should arrive tomorrow. I hoped it would arrive yesterday, as today was a holiday and I would have had time to tinker with it. So I'll have to put it on after work tomorrow, and I'll do some bench tests and stress tests and see what the cooling is like. If not, I'll get the chip delidded. I'll check my PC's airflow too. Right now I've got the three big fans at the front, two on the CPU cooler, the other case-mounted fan blowing air out the back, and two top-mounted fans pushing heat out the top as well.
 
Jim Keller, the guy who helped design Ryzen, is leaving Tesla to work on microprocessors again; it's rumored to be with Intel. What? Yeah, that's my take. Intel could be facing a billion-dollar lawsuit if this is right, unless Keller had a contract AMD couldn't use to stop him from leaving. Some people said he is going to work for Apple on a high-performance ARM chip again.

https://electrek.co/2018/04/25/tesla-autopilot-jim-keller-leaving-chip/
 
If I were him I'd go to Apple, or hope to get poached by one of the Chinese companies looking to crack the top ranks (Mediatek, HiSilicon, etc.)

As much as I hate to say it, ARM is kicking x86's ass. VR is the latter's only hope because, otherwise, we've sort of reached the point in the silicon evolution where peak processing power is no longer the prime concern, and performance per watt + cost efficiency are far more critical to business relevance.

ARM is the future.
 
Noctua has arrived. I'll install it once I finish work and start doing tests. I didn't realise just how big it was compared to my 212
 
Hey guys, I need some help on a hardware issue.

My 8-ish year old desktop monitor went kaput. It can only be used a few minutes at a time.

I tried to use another old desktop which I am pretty sure still works, except that it still uses a VGA connector. I tried to use a DVI to VGA adapter but it doesn't work. Is that expected? If I use an HDMI to VGA adapter, will that work?
 
@Madmick @jefferz Installed the Noctua NH-D15 after work.

4.6ghz, idling about 35c and under full load it got to 79c. Much better than the 99c. About to do the Intel Processor diagnostic tool as my final test. Got to 81c on the IPDT but the ambient room temp is about 32c so I find that acceptable.
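When comparing temps like these against other builds, delta over ambient is the fairer number than the raw reading, since my room was at 32C. A quick sketch using the figures from this post:

```python
# Normalize the reported temperatures to delta-over-ambient so results
# from rooms at different temperatures can be compared fairly.
# Figures are the ones reported above (ambient 32C).
ambient = 32  # degrees C

readings = {
    "idle": 35,
    "Cinebench load": 79,
    "IPDT load": 81,
}

for label, temp in readings.items():
    print(f"{label}: {temp}C ({temp - ambient}C over ambient)")
```

So the 79C load figure is really only 47C over ambient, which looks a lot better next to results measured in a 20C room.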
 
The DVI-to-VGA adapter should prove less troublesome than the HDMI-to-VGA adapter, usually, because it doesn't have to deal with separating audio.

Are you certain you need a DVI-I to VGA adapter instead of a DVI-D to VGA converter?
 
If I were him I'd go to Apple, or hope to get poached by one of the Chinese companies looking to crack the top ranks (Mediatek, HiSilicon, etc.)

As much as I hate to say it, ARM is kicking x86's ass. VR is the latter's only hope because, otherwise, we've sort of reached the point in the silicon evolution where peak processing power is no longer the prime concern, and performance per watt + cost efficiency are far more critical to business relevance.

ARM is the future.

My thought may have been shortsighted. It seems that Intel slammed their chip designers post-Spectre and Meltdown, and Keller has been given the top job heading Intel's future CPU efforts.

He is also to lead the effort to integrate CPU and GPU functions, along with AI functionality enhancements and tighter integration. More AMD people have been recruited to Intel to handle the marketing in the wake of the chip vulnerabilities.

Intel is not messing around with this shakeup and house cleaning, apparently.
 
I think the market is wearing its ass on its head.

That 1700X and 1700 are going to fucking sell. They've been pessimistic and wrong about AMD for the past three years. I remember someone asking me if a PC thread title praising AMD was a joke. They killed it in the market that year: outperformed all expectations. No, it wasn't a joke.

Meanwhile, Nintendo shares have been surging on news of the Switch. LOL, are you fucking kidding me? A console that doubles down on the worst-selling strategy of the 8th generation, and doesn't even offer a special exclusive at launch? Get ready to lose some money, chumps.


I think right now will be the best time to buy AMD stock for probably several years. You won't see it undervalued again like this for a while.
Madmick looks really stupid (although I also liked NVIDIA as a buy; I just didn't believe in the Switch).
Madmick looks incredibly smart.

[attached chart: AMD stock price, July 2018]

In particular, if you'd invested $100k, for example, in May 2017 when the stock cratered, and I expressed bewilderment, noting that AMD was actually building cash reserves for the first time in a decade and had bumped GPU prices manufacturer-side due to the crypto surge, that stock would be worth $149k today, for roughly a 49% ROI over a 14-month period.
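The return math checks out, and it's even more striking annualized. A quick sketch using the figures from the post ($100k in, $149k out, 14 months):

```python
# Sanity-check the ROI quoted above: $100k invested in May 2017,
# worth $149k fourteen months later.
invested = 100_000
value_now = 149_000
months = 14

roi = (value_now - invested) / invested      # simple return over the period
annualized = (1 + roi) ** (12 / months) - 1  # compound-annualized equivalent

print(f"ROI: {roi:.0%}")  # 49%
print(f"Annualized: {annualized:.1%}")
```

The annualized figure comes out to roughly 41% per year, which would beat almost any index fund over the same stretch.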
 
I'm still upset I didn't put any money into AMD. Just about everyone in the tech community joked they should buy stock when it was under $3.
 
So, is the Threadripper 1950x a good buy?
I am looking to assemble a PC for rendering in $2000-2500 price range and could use some advice. Myself I haven't been following the tech news the last couple of years.
 
http://cpu.userbenchmark.com/
Organize by "M-Core Pts" for overall performance.

Oh, if you're looking to build an editing computer, which is my assumption with that price range, then the Threadrippers absolutely destroy the Intel i9 & Core X processors in terms of value. It's only $699.99 at Amazon right now:

  • ($620) TR-1920X [12c/24t]
  • ($700) TR-1950X [16c/32t]

  • ($860) i7-6950X [10c/20t]
  • ($850) i9-7900X [10c/20t]
  • ($950) i9-7920X [12c/24t]
  • ($1050) i9-7940X [14c/28t]
  • ($1300) i9-7960X [16c/32t]
  • ($1880) i9-7980XE [18c/36t]
AMD Threadripper 1950X vs. Intel i9-7960X
You can get ~$100 back on the Intel side if you go with the cheapest ATX LGA 2066 board vs. the cheapest ATX TR4 board, and you don't really give anything up in the way of features. It doesn't matter. It doesn't spare Intel the brutality.
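To make the value gap concrete, here's a quick sketch computing price per thread from the list above (using the prices as quoted in this post; street prices move around):

```python
# Price-per-thread comparison using the (price, threads) figures quoted above.
cpus = {
    "TR-1920X":  (620, 24),
    "TR-1950X":  (700, 32),
    "i7-6950X":  (860, 20),
    "i9-7900X":  (850, 20),
    "i9-7920X":  (950, 24),
    "i9-7940X":  (1050, 28),
    "i9-7960X":  (1300, 32),
    "i9-7980XE": (1880, 36),
}

# Sort cheapest-per-thread first; the two Threadrippers top the list.
for name, (price, threads) in sorted(cpus.items(),
                                     key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name:10s} ${price / threads:6.2f}/thread")
```

The 1950X lands around $22 per thread versus about $41 for the equal-thread i9-7960X, which is the brutality in one number.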

Aim for 32GB+ 3200 MHz DDR4 RAM.

Here is a more realistic comparison based on price (TR-1950X vs. i9-7900X):

 
Nvidia Drivers Have Over 2 Times Failure Rate of AMD Drivers According to 3rd-Party Analysis
A grand total of six AMD and six Nvidia graphics cards were subjected to 12 days of 24-hour stress testing using CRASH in order to identify their reliability. Each test spans four hours and covers resolution changes, rendering, display orientation changes and more, totaling 288 hours of testing for each GPU and a massive 432 tests each for AMD and Nvidia GPUs.

The final results indicated an average success rate of 93% for AMD GPUs (401 out of 432 tests) and 82% for Nvidia GPUs (356 out of 432 tests). AMD had a failure rate of 7% while Nvidia was more than twice as unreliable with an 18% failure rate.
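Those percentages reproduce cleanly from the raw pass counts the article gives (401/432 for AMD, 356/432 for Nvidia):

```python
# Reproduce the quoted success/failure rates from the raw test counts:
# 6 GPUs per vendor, 72 four-hour tests per GPU = 432 tests per vendor.
total = 432
amd_pass = 401
nvidia_pass = 356

amd_fail = 1 - amd_pass / total        # ~7%
nvidia_fail = 1 - nvidia_pass / total  # ~18%

print(f"AMD:    {amd_pass / total:.0%} success, {amd_fail:.0%} failure")
print(f"Nvidia: {nvidia_pass / total:.0%} success, {nvidia_fail:.0%} failure")
print(f"Nvidia's failure rate is {nvidia_fail / amd_fail:.1f}x AMD's")
```

The exact ratio is about 2.45x, so "more than twice as unreliable" is fair, though note the sample is only six cards per vendor.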
 
