59 Comments
Ian Cutress - Thursday, March 2, 2017 - link
I missed a trick. Should have titled it 'Making AMD Tock'. :D
Eden-K121D - Thursday, March 2, 2017 - link
AMD Ryzen - Process + Architecture + Optimization rolled into one
kingkazuma - Thursday, March 2, 2017 - link
Hahaha. That woulda ticked someone off at Intel. Out of curiosity, do you guys automate any benchmarking or is it mostly manual?
Ian Cutress - Thursday, March 2, 2017 - link
I finished my CPU non-gaming testing scripts a couple of weeks ago. They automate the manual processes, but allow for equal testing conditions between platforms (e.g. 20 seconds between benchmarks to allow cache processes to settle for more consistent results). The actual tests themselves don't change when you automate, and the industry-standard tests like PCMark and others are basically an automated series of tests attempting to represent real-world usage patterns. You still need a sanity check on the scores at the end, sure. Besides, one button to start a 6-8 hour run means being able to go off and do other things.
Automate when you can, manual when you have to.
I'm sure someone can dig up that graph...
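To illustrate the kind of harness described above, here is a minimal Python sketch (not the actual AnandTech scripts): it runs a list of benchmark commands back to back, waits a fixed settle period between them, and records wall-clock times. The benchmark command lines and the 20-second delay are placeholders for whatever a real suite would use.

```python
# Minimal, illustrative benchmark harness: fixed settle delay between tests,
# timed runs, and a simple printout for the sanity check at the end.
import subprocess
import time

SETTLE_SECONDS = 20          # pause so caches and background tasks settle
BENCHMARKS = [               # placeholder command lines, not a real suite
    ["cinebench.exe", "-cb_cpux"],
    ["7z.exe", "b"],
]

def run_suite():
    results = {}
    for cmd in BENCHMARKS:
        time.sleep(SETTLE_SECONDS)              # equal conditions between platforms
        start = time.perf_counter()
        subprocess.run(cmd, check=True)         # run the benchmark to completion
        results[" ".join(cmd)] = time.perf_counter() - start
    return results

if __name__ == "__main__":
    for name, seconds in run_suite().items():
        print(f"{name}: {seconds:.1f} s")       # eyeball the numbers afterwards
```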
jjj - Thursday, March 2, 2017 - link
And if the results are weird, let's say in WinRAR, do you check whether you might get much better results with different settings?
Anyway, hope you had time to look at memory scaling a bit, as well as power and efficiency in actual apps.
Ian Cutress - Thursday, March 2, 2017 - link
WinRAR is a little unique: being memory dependent means there's a fair amount of caching going on, so if you run the test enough times you eventually reach a steady state. My new WinRAR test on the latest version runs through ten times, and I take the average of the last five runs as my result.
Things like that take observation and experience, which I've tried to apply extensively in our new suite.
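A rough sketch of that steady-state method in Python: run the workload a fixed number of times, discard the warm-up runs, and average the rest. The run counts and the RAR command line below are placeholders, not the actual test configuration.

```python
# Illustrative "run N times, keep the average of the last K" timing loop.
import statistics
import subprocess
import time

TOTAL_RUNS = 10   # total passes over the workload
KEEP_LAST = 5     # discard the early runs while caches warm up

def timed_run(cmd):
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

def steady_state_average(cmd):
    times = [timed_run(cmd) for _ in range(TOTAL_RUNS)]
    return statistics.mean(times[-KEEP_LAST:])   # average of the last five runs

if __name__ == "__main__":
    # Hypothetical invocation: compress a fixed test folder with the RAR CLI.
    # (A real harness would delete test.rar between runs to keep the work identical.)
    print(steady_state_average(["rar.exe", "a", "-r", "test.rar", "testdata"]))
```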
jjj - Thursday, March 2, 2017 - link
Any chance you are simulating a hexa-core and a quad-core in a soon-to-arrive article, as opposed to waiting for those SKUs? Curious to see how those do with more memory bandwidth per core.
jjj - Thursday, March 2, 2017 - link
One more thing: you state in a question that Ryzen has 16 PCIe lanes, but we kinda thought it has 32 with 24 usable, or 32 with 20 usable if 4 go to the NVMe slot. In case you can clarify before the NDA lifts.
HSG - Thursday, March 2, 2017 - link
I believe what he means is that Ryzen has 16 lanes for general-purpose use, whereas 4 lanes are reserved for NVMe and 4 for the chipset. AMD has been wary of saying they have 24 lanes, specifically to prevent this kind of confusion. The number of 32 comes from people adding another 8 lanes from the chipset, but it's incorrect to do so.
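To spell out the lane arithmetic under that reading (the figures come from the comment above, not an official spec sheet), a tiny sketch:

```python
# Ryzen lane budget under the interpretation above; figures are from the
# comment, not a datasheet.
cpu_lanes = {"x16 graphics slot": 16, "NVMe": 4, "chipset link": 4}
chipset_downstream = 8   # extra lanes provided by the chipset itself

usable_from_cpu = sum(cpu_lanes.values())               # 16 + 4 + 4 = 24
inflated_total = usable_from_cpu + chipset_downstream   # 32, but this mixes CPU and chipset lanes

print(usable_from_cpu, inflated_total)
```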
ddriver - Friday, March 3, 2017 - link
WinRAR is basically trash; it has no place in an 8-core, 16-thread chip review. Go for 7-Zip.
Manch - Thursday, March 2, 2017 - link
It's one hell of a Tock. Until this NDA lifts, we won't know if they're the Patriots or the Falcons...
AndrewJacksonZA - Thursday, March 2, 2017 - link
Lol, haha! Nice one Ian! :-)
mschira - Thursday, March 2, 2017 - link
Haha, I was about to suggest exactly that. And what a tock it is.
M.
Alistair - Thursday, March 2, 2017 - link
A very nice interview. Thanks!
AndrewJacksonZA - Thursday, March 2, 2017 - link
Thanks for doing and posting the interview.
3ogdy - Thursday, March 2, 2017 - link
"A note to what you said about Jim Keller though - he was definitely a brilliant CPU guy, but he was a part of that vision of what we wanted to do."Doesn't sound like they're good friends to me. Everyone thought Jim Keller was behind Zen...and a lot would credit him for Zen and AMD's comeback. How does he get mentioned by Su? "definitely a brilliant guy, BUT..."
Wow
Ian Cutress - Thursday, March 2, 2017 - link
My interpretation is that a CPU design team is not just one person. You might have a brilliant figurehead, but there are hundreds of people, often very competent engineers who have been at this game just as long, who don't necessarily take the spotlight. In recent years Mike Clark has been filling that role, and he's an architecture guy through and through who loves to talk about the intricacies of the design he has implemented and developed from his ideas and those of the whole team.
TL;DR - I think you're trying to find malice where there isn't any. It takes more than one smart dude to design a new CPU uArch.
Tamz_msc - Thursday, March 2, 2017 - link
Nuance is the word that aptly describes her response, and she did the right thing by acknowledging the whole team's contribution instead of just one person.
AndrewJacksonZA - Thursday, March 2, 2017 - link
+1
3ogdy - Thursday, March 2, 2017 - link
And that is probably because they'd want to inspire confidence in their team, since Jim is not part of AMD anymore. I remember how disappointed I was when news broke that he had left AMD, which in turn put a big question mark on Zen.
Manch - Thursday, March 2, 2017 - link
Zen taped out about the same time Jim Keller left, so I don't know why that would affect your opinion of the design. Ian makes a very valid point here. Keller was not the sole designer. He's brilliant and I'm a fan, but he was part of AMD's team. Lisa Su was right to acknowledge him AND the team.
Krysto - Thursday, March 2, 2017 - link
Focusing on PC made sense for the first roll-out. However, if AMD switches to notebook-first, it may give Intel an even bigger headache: Intel will now focus on servers first, which means Intel will be BEHIND by another year when it comes to competing in the notebook space. There's a great opportunity for AMD to take back the notebook market hard.
However, I also realize AMD wants to go after the highly profitable server space, and the only way to do that is to at least stay competitive with Intel. But I still think it would be better for AMD to "disrupt" Intel from the lower end - the notebook market. Let the server market run its course, and just grow into that naturally. But take back the notebook market aggressively. AMD has the best chance of success there as well.
Also, yes, definitely stop using the Opteron brand and come up with a new one. AMD is at ZERO in the server market right now. Nobody cares about "what Opteron used to represent 10 years ago". So come up with a fresh new brand that identifies with the performance of Ryzen.
Michael Bay - Thursday, March 2, 2017 - link
The phrase "AMD in a notebook" will give many chills. But I suspect it won`t be for reasons AMD would like.rocketbuddha - Thursday, March 2, 2017 - link
The biggest problem that AMD has in the notebook area seems to reflect the phrase "With friends like these, who needs enemies!". AMD notebooks are often saddled with poor supporting peripherals: 15.6" 720p screens, poor-visibility panels, 5400 RPM HDDs, flimsy keyboards, and 802.11n Wi-Fi.
AMD gave a good price on its mobile components so that OEMs could spend money equipping the laptops with good components. Instead of "value", the OEMs went for "cheapness". So effectively an AMD laptop is rarely perceived as a "value" compared to an Intel one.
Nagorak - Friday, March 3, 2017 - link
True, but AMD processors for the past half decade have been largely junk, and even before that Phenom didn't stack up too well. Who would design a high-end laptop and then put in an AMD processor? It's sort of a catch-22 situation.
Now with Ryzen at least being competent, I think you'll see notebook manufacturers more interested in putting out laptops with decent-quality components.
JasonMZW20 - Friday, March 3, 2017 - link
Except the hardware isn't ready for that. APUs are 2H 2017, and adding a discrete GPU eats battery life. Definitely not the right time; Carrizo is a bit of a holdover.
Naples is much closer to release, and as Intel has 99% of the server market, I think it's wise to target this profitable area. AMD needs capital. Since server parts have high margins, they can undercut Intel and still offer more performance.
It's difficult to do that on low-margin parts like mobile/notebook.
aryonoco - Thursday, March 2, 2017 - link
Good interview Ian, great mix of technical and business questions.
Also, can I just say that it's great to see a woman, and one with impeccable technical credentials, leading a tech company. Hopefully one day this won't be novel anymore.
And, since we are giving credit to Jim Keller and Mark Papermaster et al., let's also not forget Lisa Su here. She's been leading the company through some very tough times; there were many times in 2014 and 2015 when I wasn't sure if AMD would even make it. She deserves a lot of credit for getting AMD to the place it's at today.
Haawser - Thursday, March 2, 2017 - link
Yep. She decided to stop the 'scatter gun' approach and concentrate purely on CPU and graphics. Intel could probably learn a few lessons, given the number of 'pies' they have their fingers in.
Meteor2 - Sunday, March 5, 2017 - link
Intel actually think their future margins are in things like 5G and FPGAs. I'm not convinced myself.
Meteor2 - Sunday, March 5, 2017 - link
+1.
ABR - Thursday, March 2, 2017 - link
That was an interesting question about the thermal limit for CPUs being so much lower than GPUs nowadays. Part of it likely has to do with usable parallelism, as Su pointed out, but then why not go for the gigahertz? Leakage, etc. holding back higher voltages/clocks, I suppose, but also maybe it's just tougher to dissipate that much heat sitting on the motherboard vs. in a PCI card form factor. There was a reason CPU packages were standing up off the board for a while in the late '90s / early '00s.
Krysto - Thursday, March 2, 2017 - link
> but then why not go for the gigahertz?
Probably for the same reason. It's easier to dissipate heat from the more distributed GPU.
fanofanand - Thursday, March 2, 2017 - link
Fantastic interview Ian! 108 minutes to go until NDA lift!
Ian Cutress - Thursday, March 2, 2017 - link
I need another 1080. x10. Please. :)
lilmoe - Thursday, March 2, 2017 - link
"I think you should help your users through that! I really think that’s the case."Me too
ET - Thursday, March 2, 2017 - link
Thanks a lot for the question on Bristol Ridge. I've been waiting for it for a long time and I know others have too. I do hope that AMD releases it soon and that AnandTech reviews it.
Ian Cutress - Thursday, March 2, 2017 - link
When they get released, I want to test them all. THEM ALL. *sinister laugh*
lilmoe - Thursday, March 2, 2017 - link
You sound pretty excited about Ryzen. Hmmmmm.
Ian Cutress - Thursday, March 2, 2017 - link
I just like testing.
baskiria - Thursday, March 2, 2017 - link
When will they ever tell the stylist that jeans do not go with high heels? Lisa looks like a man anyways; good that they are slowly moving off the grandmother image, but those high heels are not needed at all...
BrokenCrayons - Thursday, March 2, 2017 - link
What is it with people and worrying about how someone else is dressed? I swear some guys out there fuss over women's fashion more than the women who actually have to figure out what to wear, just to keep tender sensitivities about stereotypical gender-based attire from flipping out. Why not focus your mind on the CPU technology and try to keep your little boy downstairs from driving your fingers when you're typing?
Nagorak - Friday, March 3, 2017 - link
You're right. AMD should definitely fire Su and hire Melania Trump...
EasyListening - Friday, March 3, 2017 - link
She used to go grey, but at some point she decided she wanted to look younger because she wanted AMD to present a younger, more athletic and lively profile. Anyways, try to curb your unhinged desire to randomly spurt out sexist crap, unless you want to die a virgin.
serendip - Sunday, March 5, 2017 - link
That's harsh. Why should we as CPU buyers care about what the CEO looks like?
Anyway, black tee + blue jeans seems to be the default mode of dress for many people in tech. It's clean, simple, boring, and forces you to focus on what's being said instead of who's saying it. I'm guilty of wearing this combo almost every day when I don't have to wear a suit.
Meteor2 - Sunday, March 5, 2017 - link
WTF?
Manch - Thursday, March 2, 2017 - link
2 min!!!
Krysto - Thursday, March 2, 2017 - link
What I'd like to ask Lisa is to please stop treating AMD's customers like they are stupid. I'm talking about using misleading marketing tactics, such as renaming an old GPU to make it seem "all new" and stuff. The PC guys are also the least likely to fall for stuff like this, and you're just making them dislike you more.
It's even worse when you use a brand name that's supposed to be reserved for Vega on Polaris cards, like you're planning to do with the RX 500 series this April. Stop doing that.
BrokenCrayons - Thursday, March 2, 2017 - link
AMD is not the only company guilty of rebranding. Despite the optimization claims made regarding Kaby Lake, Intel is effectively re-releasing Skylake. Nvidia has done the same with their GPUs, as have many, many other companies large and small. I used to change the model numbers (but none of the internal components) of the PCs I built in the 90s when I was running a little computer shop, just to keep my branding looking fresh. Cereal companies change their box art or rename a cereal after spraying on different food coloring.
If you find rebranding intolerable, you're living on the wrong planet.
EasyListening - Friday, March 3, 2017 - link
Triggered. Clean up on aisle 5.
0iron - Thursday, March 2, 2017 - link
Wow, a lot of 'I think' answers - 31 in total!
Pishi86 - Thursday, March 2, 2017 - link
Excellent review, Ian. Any power consumption numbers for Ryzen? Some sites state the 1700 uses 20-30 W less than the 7700K. If so, it's one hell of a CPU.
MadManMark - Monday, March 6, 2017 - link
Review? Did you post this comment to the wrong article?
eva2000 - Saturday, March 4, 2017 - link
AMD Server cpu name = Opteron Z:)
tipoo - Saturday, March 4, 2017 - link
"IC: Fake News!"LOL, Ian!
serendip - Sunday, March 5, 2017 - link
Please, please, please put Zen in an ultra-low-power APU with a stonking GPU inside. Intel killed Atom and walked away from the mobile segment; now AMD can try their hand at it. I would love to see a non-Pro Surface 4 with 20-hour battery life and a fast GPU, at half the price of an Intel-equipped Surface tablet.
Meteor2 - Sunday, March 5, 2017 - link
Too right. Nvidia's X1 shows what putting a real GPU on an SoC can do. AMD could do the same in the x86 space.
ThreeDee912 - Sunday, March 5, 2017 - link
"I think what we’re trying to address is maybe the forward thinking users, not just the today gamer."Since this was before all the benchmarks were made public, her comments about gaming are interesting. Seems like she knew that Ryzen might not perform quite as well in games, but is really good at video encoding and other "professional" applications.
usernametaken76 - Tuesday, March 7, 2017 - link
I kept reading "power point" but after the second reference, it's clear Dr. Su was referring to a PowerPoint slide deck.AntDX316 - Tuesday, April 4, 2017 - link
I'm more interested in what the new consoles will look like powered by AMD. Console deals give AMD the funding to actually research and produce better-performing products, and when the console business picks up, so does the talent working on PC, as the designs would be very similar. I don't really play any new PC games, though, but I assume VR is where the interest is headed.