34 Comments
nathanddrews - Wednesday, October 21, 2015 - link
Yippee ki yay, Merlin Falcon!
Oxford Guy - Saturday, October 24, 2015 - link
Silicon silicon silicon silicon silicon silicon.
milli - Wednesday, October 21, 2015 - link
There are plenty of laptops out there with Carrizo processors. Why hasn't AnandTech reviewed one yet? I think everybody is pretty eager to see a decent review of this processor so we can put Zen's 40% IPC increase into perspective.
Ian Cutress - Wednesday, October 21, 2015 - link
Simply put, when we've asked for samples, no one has wanted to send us one yet. Even though I'm not the laptop guy, I actually just ordered one for my grandparents. I don't have the tools to do display tests, but I can run some quick CPU benchmarks.
ddriver - Wednesday, October 21, 2015 - link
Hardly anyone would expect a product of that class to sport a display worthy of testing. System, CPU and GPU benches would suffice.
Gc - Wednesday, October 21, 2015 - link
Several Carrizo models are available in configurations with FHD IPS screens. I think they might include some configurations of the 14 and 12.5 inch EliteBooks, some Satellite P50D, some Pavilion 17z, and the Pavilion 23 AIO.
BurntMyBacon - Thursday, October 22, 2015 - link
It is true that they are starting to pair these setups with better displays. However, I think many are looking for a review of Carrizo in general. Laptop-specific things like display, keyboard, battery size, etc. take a back seat to things like system/CPU/GPU performance and power efficiency (minutes/WHr). Though I wouldn't mind a laptop-specific review separate from the general Carrizo review.
nathanddrews - Wednesday, October 21, 2015 - link
If the opportunity presents itself, that would be great. It could be a Pipeline mini-review. AMD has made a lot of promises with Carrizo (non-L) that I would love to see substantiated.
ImSpartacus - Wednesday, October 21, 2015 - link
That would be pretty kickass of you.
silverblue - Wednesday, October 21, 2015 - link
Good luck finding one not restricted to 12 or 15W. :S Still, you can at least show the improvement over Kaveri.
Alexvrb - Thursday, October 22, 2015 - link
HP has some models configured with the FX-8800P or A12 equivalent. I'd bet at least some of the larger, more expensive models are 35W. But the default RAM configs will no doubt be screwed up and result in single-channel operation, crippling the memory bandwidth. They like to pack in stupid uneven or single-stick configs to save a few bucks at each RAM capacity tier.
Flunk - Tuesday, November 3, 2015 - link
That manufacturers don't want to send out samples is generally not a good sign.
DrMrLordX - Wednesday, October 21, 2015 - link
If you want benchmarks of Carrizo, The Stilt provided some over on the OCN forums: http://www.overclock.net/t/1560230/jagatreview-han...
It's a big thread, but there's some good data in there, especially the fp benchmarks at static clockspeeds. tl;dr: it has ~11% better IPC than Kaveri in fp benchmarks, though it was only 5% faster on Cinebench R10.
The downside is that voltage/clockspeed scaling is inferior for Carrizo vs. GV-A1 Kaveri past 2.6 GHz. It really isn't a logical choice for a desktop CPU unless AMD has done something to the R-series (new stepping?) to improve that situation, at least not with Kaveri out there already. Unless you simply must have GCN 1.2 in an iGPU, then it's Carrizo or bust.
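To make the tl;dr concrete: at a static clock, the benchmark score ratio is the IPC ratio. A minimal Python sketch of that reasoning, using placeholder scores rather than The Stilt's actual numbers:

```python
# IPC comparison at a fixed clock, as in The Stilt's static-clockspeed fp tests.
# The scores below are illustrative placeholders, not his measured data.

def relative_ipc(score_a, score_b, clock_a_ghz, clock_b_ghz):
    """Per-clock throughput of A relative to B."""
    return (score_a / clock_a_ghz) / (score_b / clock_b_ghz) - 1.0

# With both chips locked to the same frequency, the score ratio IS the IPC ratio.
carrizo, kaveri = 111.0, 100.0  # hypothetical fp benchmark scores at 2.6 GHz
print(f"Carrizo vs. Kaveri IPC: {relative_ipc(carrizo, kaveri, 2.6, 2.6):+.1%}")
# -> Carrizo vs. Kaveri IPC: +11.0%, matching the ~11% fp figure quoted above
```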
BurntMyBacon - Thursday, October 22, 2015 - link
@DrMrLordX: "The downside is that voltage/clockspeed scaling is inferior for Carrizo vs. GV-A1 Kaveri past 2.6 GHz. It really isn't a logical choice for a desktop CPU ..."
Not surprising. AMD stated at launch that Carrizo was superior at low power, but not at higher power. I'm pretty sure they had a chart that showed the crossover point was about 20W. Probably why you see so many 15W systems out there. Thanks for the links.
silverblue - Thursday, October 22, 2015 - link
20W per module, I believe, as opposed to 20W for the entire APU.
With the CPU cores running at 40W total, Carrizo would still be a little faster than Kaveri, but without knowing how much power the GPU portion would use, there's little chance of working out what TDP such a chip would have. Even so, a 45W Carrizo should handily beat the A8-7600 in at least 45W mode. Carrizo was meant to top out at 65W, incidentally.
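A rough sketch of the arithmetic above. The ~20 W/module crossover comes from AMD's launch chart as quoted in this thread; the iGPU power figures are pure guesses for illustration:

```python
# Rough TDP arithmetic behind the comment above. The ~20 W/module crossover is
# from AMD's launch chart; the iGPU draw figures are guesses for illustration.

MODULES = 2
CROSSOVER_W_PER_MODULE = 20                      # below this, Carrizo wins per module
cpu_budget_w = MODULES * CROSSOVER_W_PER_MODULE  # 40 W for the CPU cores alone

for igpu_guess_w in (5, 10, 15):
    total = cpu_budget_w + igpu_guess_w
    print(f"iGPU at {igpu_guess_w:>2} W -> roughly {total} W for the whole APU")
# Anywhere much past ~45-50 W total, the cores leave the region where Carrizo's
# voltage/clock scaling beats Kaveri's - consistent with the 65 W ceiling.
```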
Alexvrb - Thursday, October 22, 2015 - link
Their embedded solutions are generally low-power like their mobile solutions anyway. So just like in mobile, Carrizo beats Kaveri for the R-series, takes up less mainboard real estate, and has access to DDR4 for additional memory bandwidth and future RAM cost/supply concerns.
On the desktop, it doesn't make as much sense and I doubt they will release anything built on Excavator for the desktop. If they do, it would probably be in the 45W range, 65W max. As Silverblue points out, I think that would do better than the existing 45W-capable models. Could make for a nice low-power HTPC with no dGPU required.
I would like to see them release new steppings of Kaveri for FM2+ between now and Zen (especially for a good overclocking Athlon K). Zen is a long way off and it's been very boring on the desktop front.
albert89 - Monday, November 2, 2015 - link
You may have a point. AnandTech are scared they might find AMD's Carrizo chip to be a really good APU.
przemo_li - Wednesday, October 21, 2015 - link
"we are working with AMD to perhaps get a couple of evaluation boards to test, at least to see how DDR3 vs. DDR4 performance comes into the equation.
"
Please do! Especially for APU as all previous APUs benefited from bigger bandwidth.
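In the meantime, the theoretical peak numbers give a sense of what's at stake for a bandwidth-starved APU. A quick sketch (peak figures only; real-world gains are smaller):

```python
# Theoretical peak bandwidth for the DDR3 vs. DDR4 comparison suggested above:
# transfer rate (MT/s) x 8-byte bus width x number of channels.

def peak_gb_s(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * bus_bytes * channels / 1000.0

for name, rate in [("DDR3-1600", 1600), ("DDR3-2133", 2133),
                   ("DDR4-2400", 2400), ("DDR4-3200", 3200)]:
    print(f"{name}: {peak_gb_s(rate):5.1f} GB/s dual-channel")
# DDR3-1600: 25.6, DDR3-2133: 34.1, DDR4-2400: 38.4, DDR4-3200: 51.2 GB/s.
# For scale, a 512-shader GCN card with its own GDDR5 typically gets well over
# 70 GB/s to itself, which is why these APUs run bandwidth-limited.
```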
ImSpartacus - Wednesday, October 21, 2015 - link
I know! I'm cumming buckets just thinking of what Carrizo's biggest GPU might be able to do with some high-clocked DDR4.
TallestJon96 - Wednesday, October 21, 2015 - link
Good first move to DDR4. Next will be Zen with DDR4, then maybe even the AMD-built Xbox One will go to DDR4.
DDR4 is pretty reasonable to get now. You can get 16GB for $100 or less, and there is a basic 32GB kit that's about $175.
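Per gigabyte, those quoted prices work out as follows (just the arithmetic on the numbers above):

```python
# Cost-per-GB on the late-2015 DDR4 street prices quoted above.
kits = [("16GB kit", 16, 100), ("32GB kit", 32, 175)]
for name, gb, usd in kits:
    print(f"{name}: ${usd / gb:.2f}/GB")
# 16GB kit: $6.25/GB; 32GB kit: $5.47/GB - the bigger kit is already cheaper per GB.
```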
ImSpartacus - Wednesday, October 21, 2015 - link
Is there any precedent for a console making such a large change as the memory type?
I know that clocks occasionally increase while chips shrink and merge onto the same package/die. However, a change in memory technology sounds like it might be a "big" change, even if it made financial/technical sense.
looncraz - Wednesday, October 21, 2015 - link
I don't know if there's a precedent - probably not - but if DDR4 prices plummet and DDR3 climbs, Microsoft may find a financial benefit in using DDR4. I very seriously doubt they would use new CPU or GPU tech at the same time, though, unless they feel like making a total refresh.
They'd probably call it the Xbox One Two, just for good measure.
Alexvrb - Thursday, October 22, 2015 - link
Not quite the same thing, but I did have the RAM upgrades for my Saturn and N64.
Refuge - Monday, November 2, 2015 - link
There is no real precedent for it, and I honestly doubt it would ever happen.
The whole reason developers love consoles is that everyone has the same hardware; fragmentation defeats the only reason for consoles to exist.
Atreidin - Wednesday, October 21, 2015 - link
No, it doesn't "beg the question." Please don't use that phrase unless you understand what it means.
Metroid64 - Wednesday, October 21, 2015 - link
A custom AMD APU with 512 GCN shaders, 2 Excavator modules, a 65W TDP and quad-channel DDR4 would be cheap and fast enough for the next Nintendo console.
Alexvrb - Friday, October 23, 2015 - link
Nintendo likes low power and compact form factors. So if they were to release one in the near future, Puma+ would potentially be better than Excavator. They could use 8 Puma+ cores and match the clocks of the competing consoles' Jaguar cores while using less power, or beat their clocks at the same power. I'd also recommend more than 512 shader cores, though they probably wouldn't need THAT many more.
But regardless of whether they went Puma+ or Excavator, I think it's kind of a waste of time. Soon they'll have the option of a custom 14/16nm Zen-based APU with HBM2. Cost would definitely creep up a bit, but it would allow for a massive leap in performance while keeping power reasonable. Zen cores + Arctic Islands + HBM2 on a much more modern process? Yes please. My advice to Nintendo would be: don't abandon the Wii U until you have something like THAT to offer as a replacement.
SunnyNW - Saturday, October 24, 2015 - link
Rumor has it the Nintendo NX (the successor to the Wii U) dev kits are already being delivered to key developers.
toyotabedzrock - Wednesday, October 21, 2015 - link
Depending on the HEVC and H.264 encode/decode acceleration and the number of SATA ports, these could be great for set-top boxes and media server/NAS boxes.
MrSpadge - Wednesday, October 21, 2015 - link
AMD meeting, 2-3 years ago:
"Sales are dropping rapidly, we need to offer something new."
"Yeah, let's build an APU so big, it's GPU shall be severely limited by DDR3 bandwidth!"
"Oh, and DDR4 is coming soon. Better be prepared."
"Alright, make that chip larger and slap both memory controllers onto it."
"Great, we didn't know what else to put onto those chips anyway, right?"
"Yeah, and we must sell the people some large die area, otherwise they'll be disappointed."
"But don't confuse our regular customers with too much performance. They have to recognize they're running AMD, so stick with DDR3 for them."
"Great, let's do it!"
looncraz - Wednesday, October 21, 2015 - link
AMD has a long history of hybrid memory controllers. There is probably only one controller; it's just capable of using either type of memory.
I, for one, hope Zen FX CPUs support DDR3; that would make my next upgrade a much more affordable affair (and pretty much guarantee Zen will be in my future... unless it fails miserably, which I kinda doubt).
DDR3-2133 CL9 is quite fast - and still few applications notice much of any benefit over DDR3-1333 or even DDR2-800. APUs aside, faster RAM really is not currently needed.
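The reason DDR3-2133 CL9 counts as fast is that absolute latency depends on both the CAS number and the clock. A small sketch of the standard conversion:

```python
# First-word CAS latency in nanoseconds: CAS cycles divided by the memory
# clock, which is half the transfer rate (double data rate).

def cas_latency_ns(cas_cycles, mt_per_s):
    return cas_cycles * 2000.0 / mt_per_s

for name, cas, rate in [("DDR2-800 CL5", 5, 800),
                        ("DDR3-1333 CL9", 9, 1333),
                        ("DDR3-2133 CL9", 9, 2133)]:
    print(f"{name}: {cas_latency_ns(cas, rate):.1f} ns")
# DDR2-800 CL5: 12.5 ns, DDR3-1333 CL9: 13.5 ns, DDR3-2133 CL9: 8.4 ns -
# the CL9 kit at 2133 MT/s has lower absolute latency despite the higher CAS.
```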
MrSpadge - Thursday, October 22, 2015 - link
Yep, for 4 high-performance CPU cores, 2 channels of DDR3-2133 CL9 perform very well. But it's a 512-shader APU we're talking about. It's mostly DRAM bandwidth limited even with DDR3-2400.
yannigr2 - Thursday, October 22, 2015 - link
AM2+ supported both AM2 processors (DDR2 only) and AM3 processors (DDR2+DDR3), so in my opinion AMD chose to keep DDR4 as a feature for a future Carrizo. I don't think Carrizo-L was the reason for not getting DDR4 in laptops.
petteyg359 - Monday, October 26, 2015 - link
"Moving the entire GPU driver stack to open source"... did I read that right? :drool: