Some OEM needs to pick this up fast: a Carrizo-based "NUC" device with HDMI 2.0 output, with a more barebones approach than Intel's to reduce the cost of entry.
To gostan: I find your comment above baseless and unconstructive, to be honest. One article on AMD and suddenly the site is AMD's marketing arm? So what does that make you, then?
My impression is that it will be difficult (almost impossible?) for AMD to compete with a 28nm part against Intel's 14nm parts. And I think the next "tick tock" from Intel will be 10nm. Or not?
Finally AMD releases a reasonably power-efficient chip.
At 15W this is perfect for a passively cooled HTPC with 4K capability built in. I appreciate the HTPC market is small, but AMD have something that potentially (I will reserve judgment until it is out and tested) beats everything Intel have, comprehensively.
The problem for AMD will be that people like me already have an HTPC (in my case using an i7-3770T, which is overkill), and until the world moves to 4K there is no need to upgrade. But if they produced something the size of an Intel NUC, but passively cooled, I would be very tempted.
I think this makes for a very interesting APU; in fact, the most interesting APU from AMD to date. Unfortunately, it may not reach the shores of where I come from. It is either limited availability, or the distributors are not interested in carrying it because they expect low demand.
These scores definitely need validation. If true, Carrizo is a massive win. The FX-8800P graphic shows a 3DMark 11 score of nearly 2000 at 15W, and 2700+ at 35W. The A10-7850K has a score of 2403 at 95W. http://www.anandtech.com/show/7677/amd-kaveri-revi...
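Put in performance-per-watt terms, the gap those numbers imply is stark (a back-of-the-envelope sketch using only the scores quoted above; TDP is just a proxy for real power draw):

```python
# Rough 3DMark 11 points-per-TDP-watt, using only the scores quoted above.
# TDP is a crude proxy for actual power draw, so treat these as ballpark.
parts = {
    "Carrizo FX-8800P @ 15W": (2000, 15),
    "Carrizo FX-8800P @ 35W": (2700, 35),
    "Kaveri A10-7850K @ 95W": (2403, 95),
}

for name, (score, tdp) in parts.items():
    print(f"{name}: {score / tdp:.0f} points/W")

# Carrizo FX-8800P @ 15W: 133 points/W
# Carrizo FX-8800P @ 35W: 77 points/W
# Kaveri A10-7850K @ 95W: 25 points/W
```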
I didn't realize AMD's processors were so terrible at video playback. My 2-year-old (pushing 3 now) Ivy Bridge i5-3317U-equipped HP Envy 4t can manage roughly 6 hrs of video playback of a 1080p H.264 12Mbps source, and it only has about a 45 Whr battery in it. With a higher-TDP chip and lots of "not power saving" features.
I am definitely in the market for one of these laptops to replace two older laptops in the house with one new one. If it has Carrizo at its heart, I would be mighty happy to support AMD over Intel this round, as the improvements here sound very much adequate for the system I am looking for.
The ideal system would be something like the HP Spectre x360 for around $750.
AMD's problems start with the first slide: "more people buy notebooks priced between $400 and 700 than at any other price. Almost 2 out of every 5 notebooks sold is in that segment."
Umm, 3 out of 5 notebooks are sold outside of the 400-700 dollar range. That's greater than 2 out of 5.
No: 3 out of 5 notebooks are either sold below $400 or above $700, and of those two disparate segments, neither is as large as the $400-$700 segment.
The bigger issue is that AMD is admitting they are so uncompetitive in the market that it doesn't make business sense to chase at least 60% of consumers (while ignoring business customers completely). And realistically, that 400-700 market is really more like a 550-700 market, as 400-500 is close enough to base iPad Air 2/premium Android tablet pricing that you lose a lot of sales in that direction.
For AMD's sake we'll class idle as full-screen video playback, and 1.5 hours as all day, and no WiFi, Bluetooth or DVD player active as "full multimedia active" - there, now look, you were correct! Your ego is intact; you're never wrong.
You are comparing a $400 laptop to a $1500 laptop and, what do you know, the $1500 laptop comes out better. What a surprise!
The point is that in this space batteries have long been cheap and the energy efficiency nothing like at the higher end, which means the working battery life has been something like 3 hrs. If AMD shifts that to six hours with this chip, that's a massive improvement in the target space.
You're also making bad assumptions about why these laptops are bought. If you rely on your laptop heavily for your job, you buy a $1500 laptop. These machines are bought to act as light performance desk machines that are occasionally (but only occasionally) taken to a conference room or on a field trip.
AMD does not have infinite resources. This play makes sense. Intel is essentially operating by starting with a Xeon design point and progressively stripping things out to get to Broadwell-M, which means that Broadwell-M over-supplies this $400-$700 market. Meanwhile at the really low end, Intel has Atom.
AMD is seeing (correctly, I think) that there is something of a gap in the Intel line which they can cover AND that this gap will probably persist for some time --- Intel isn't going to create a third line just to fit that gap.
I might be ready to get into AMD, as AMD has a lot of innovation lately. But it still disappoints me greatly that they aren't able to adopt a more modern process node.
If they launch their new high-performance CPU core next year as part of an APU that uses HBM memory and is at the very least on 16nm FinFET, I might get that instead of a Skylake laptop. HSA is pretty cool and one of the reasons I'd get it.
The Kaveri FX parts still have barely more than half the IPC of a competing Intel Core i3 with the same TDP. Only in tests involving multithreaded apps that can load all four cores do the FX parts keep up with the Core i3. Let's hope the Carrizo generation of APUs improves this situation.
Without being an AMD apologist, I think the point was that single threaded performance was "good enough" for your usual light work which tends to be hamstrung by I/O anyway.
There are two things that I need to see clarified about Carrizo, however:
1) Does Carrizo drop CPU frequency automatically when the GPU is being taxed? That's certainly going to be an issue as regards the comparison with an i3.
2) With the addition of AVX2, were there any architectural changes made to accommodate AVX2, for example a wider FlexFPU?
Ian, you appear to have confused I-cache and D-cache.
You wrote: "The L1 data cache is also now an 8-way associative design, but with the better branch prediction when needed it will only activate the one segment required and when possible power down the rest".
This is of course gibberish. Branch prediction would help to predict the target set of an *instruction* fetch from the I-cache, but is useless for D-cache set prediction for the most part (I say "for the most part" because Brad Calder did publish a way-prediction scheme based on instruction address back in the 90s. It didn't work very well and hasn't been productized that I know of).
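To make the I-cache/D-cache distinction concrete, here is a toy sketch of instruction-side way prediction (purely illustrative, not AMD's actual design): the predicted way is derived from the fetch address, which the branch-prediction unit produces early, so only one of the eight data arrays needs to be powered on a correct prediction. A D-cache load can't benefit the same way, because its set and way aren't known until the address is computed.

```python
import random

WAYS = 8

# Toy model of a way-predicted instruction cache (illustrative only).
# The predictor is indexed by fetch block, which the branch predictor
# knows early, so a correct prediction reads (and powers) just one way.
way_predictor = {}   # fetch block -> last observed way
arrays_read = 0      # proxy for dynamic read energy

def fetch(block, true_way):
    global arrays_read
    guess = way_predictor.get(block, 0)
    if guess == true_way:
        arrays_read += 1         # common case: single-way read
    else:
        arrays_read += WAYS      # mispredict: fall back to reading all ways
        way_predictor[block] = true_way

# Instruction fetch is highly repetitive (loops), so prediction works well:
placement = {b: random.randrange(WAYS) for b in range(16)}
for _ in range(1000):
    for b in range(16):          # a hot loop touching the same 16 blocks
        fetch(b, placement[b])

print(f"average ways read per fetch: {arrays_read / 16000:.2f}")  # ~1.01
```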
Imagine what they could do with this at 14nm: probably at half the cost of a Core M, with 60 to 70% of the Core M's CPU performance, yet with better graphics at the same TDP.
I already signed up for the mailing list that tells you when laptops with Carrizo come out and are ready to buy. You can do so on AMD's website if you're interested. The H.265 hardware decoding alone interests me, and all the other features, like program-specific acceleration and the better GPU performance for mainstream games, are nice.
If you only play stuff like LoL or Counter-Strike, or browser games, or even older games on GOG and Steam, the A10 and up look like they'll be quite good.
AMD not including VP9 support is a mistake. They could always drop it later if YouTube loses popularity, but a lot of video in media articles tends to be linked from YouTube. It would also be nice to see a die shrink with AMD adding more CPU cores to make up the difference, to at least compete with Intel in number crunching.
I think everyone should look at APUs with respect. The APU is the future of the PC and the notebook. HBM on the next AMD GPU will be a start and a test for a new APU with HBM on-chip; that will be faster than any DDR4 now available on the market, and probably than any 'on motherboard' RAM we will ever see. AMD could start a revolution in the PC market, and others will probably copy them shortly, even with faster CPUs. But IF that happens, we should be grateful to AMD. And sorry for my English...
Something I'm always interested in but that is never addressed in these articles: UVD playback and all its magical power savings - what codecs/players support it? If I have CCCP installed, will MPC-HC automatically benefit? Or will that be reserved for some cyberpower payware DVD/BD player?
SilthDraeth - Tuesday, June 2, 2015 - link
It is sad that AMD struggles so much. I bought an HP with a 1366x768 screen (i.e. the budget screen on most retail laptops) and an AMD A10-4600, and it plays games like Dirt 3 and such just fine. I haven't installed a ton of games on it, but it certainly outperforms any mobile i5 or i7 that doesn't have a dedicated graphics card.
I really hope Carrizo takes off, and AMD finally gets some love.
meacupla - Tuesday, June 2, 2015 - link
1366x768 screens are one of the very issues that plague AMD mobile chips.
monstercameron - Tuesday, June 2, 2015 - link
So true. AMD needs some kind of program to prevent OEMs from shipping 13*7 screens with certain SoCs.
Penti - Tuesday, June 2, 2015 - link
1366x768 is fine in sub-11.6-inch devices. Broadwell GT3e essentially has a stronger GPU, so AMD needs to learn that it's no selling point. They can't do much until they have a stronger CPU anyway, so pairing both the IGP and a decent dGPU (switchable or whatever) makes sense with their chips. AMD chips are essentially still too weak to drive 7970M GPUs from 2012. Plus, the only decent design win for mobile GCN is the new MacBook Pro 15; switchable graphics needs to be okay with Windows and Intel CPUs/IGPs too. Do a shrink to 16/14 nm and make drivers that make that happen, and they should be decent enough.
BillyONeal - Wednesday, June 3, 2015 - link
1. GT3e is a 28W part; this is a 15W part. 2. GT3e is a part that costs over 3 times as much. (I'm not saying I'd buy the thing; I'm saying we need to be fair to AMD here :) )
Taneli - Wednesday, June 3, 2015 - link
GT3e starts at 47W. GT3 (without Crystalwell) is available with a dual-core CPU at 15W (HD 6000) and 28W (HD 6100).
Penti - Wednesday, June 3, 2015 - link
GT3 is essentially stronger even without the eDRAM, though.
albert89 - Thursday, June 4, 2015 - link
I'd have to agree that people aren't reading the stats correctly. Broadwell beats Kaveri by a few points yet costs between one and a half and three times as much. The price of Intel's APUs is rising faster than their performance. And it wouldn't surprise me if Carrizo outperforms Broadwell in 6 out of 10 games, and that's all at 28nm!
Penti - Thursday, June 4, 2015 - link
Actually, when you can get a Broadwell laptop (or SFF machine) for the same price, what's the point?
Penti - Saturday, June 6, 2015 - link
The list price for a 15W GT3 Broadwell is no more than 315 USD. It should be as fast as or faster than a top-end Kaveri in game benchmarks. It's available in 370 USD NUC barebones, and soon it will be in 600 dollar laptops. How much is a Kaveri FX laptop? Probably a lot, and most Intel parts, at least the GT2+ SKUs of Haswell and Broadwell, are faster than the A10 Kaveri in laptops anyway. The jump from 500 to 600-700 USD just gives you a notebook that's much stronger than a Kaveri A10 device overall, and even the FX-7600P isn't really strong enough to game on. Carrizo doesn't really change that. AMD's 3DMark 11 numbers for the 15W parts are on par with Iris graphics, and a discrete HD 7750M or 940M+ would be faster anyway and is found in cheap laptops.
albert89 - Monday, June 8, 2015 - link
Tell me where you can buy one for that price, because most I see are like this article described: double the price.
Prashant Jain - Wednesday, June 3, 2015 - link
Switchable graphics is a real deal in the Windows ecosystem. Linux lacks a ton of drivers and isn't expected to deliver more, so AMD has scope in enterprise servers, where Linux is thriving. AMD will therefore succeed in the long run if they somehow manage to have CPUs on par with Intel's.
Penti - Wednesday, June 3, 2015 - link
Switchable graphics from AMD with Intel CPUs is just so much worse than Optimus.
Wolfpup - Wednesday, June 3, 2015 - link
And Optimus doesn't work either. I'm NEVER buying another PC with it, and I'm just thankful mine lets you disable it and run on the Nvidia GPU directly.
duploxxx - Thursday, June 4, 2015 - link
Well, nice to hear that switchable AMD graphics sucks; so does Optimus. Setting manual profiles is OK with Optimus, and I suppose the same goes for AMD; anything automatic is NOT.
Oh, and every time I dock or undock, my Explorer, Chrome and Firefox will crash due to graphics issues... uber Optimus.
barleyguy - Wednesday, June 3, 2015 - link
Switchable graphics sucks, bad. My current work laptop is Intel/ATI, and I've also used Intel/NVidia. Neither one works as well as just a single video adapter. I've had issues with windows not refreshing, driver crashes, and just overall wonkiness.
Luckily Dell lets you disable it completely in the BIOS. That tends to get rid of the issues.
RandUser - Thursday, June 4, 2015 - link
It depends. On my ASUS laptop Optimus works perfectly, no issues.
Margalus - Wednesday, June 3, 2015 - link
1366x768 is perfectly fine on 15.6-inch devices...
fokka - Wednesday, June 3, 2015 - link
If you're bordering on blind, yes.
meacupla - Wednesday, June 3, 2015 - link
Actually, the blind and hard of seeing benefit greatly from higher DPI and much sharper images.
I wouldn't be surprised if 1366x768 caused blindness in the first place, however.
renegade800x - Thursday, June 4, 2015 - link
Although viewable, it's far from being "perfectly" fine. 15.6 should be FHD.
albert89 - Tuesday, June 23, 2015 - link
You haven't needed a strong CPU since Win8, because most laptops use Atom, Celeron or Pentium processors. AMD APUs are the natural choice!
mabsark - Wednesday, June 3, 2015 - link
AMD should make Steam Boxes. They already do APUs, chipsets (which are going on-die) and memory. It would be pretty simple for AMD to partner with a motherboard maker. Imagine a Steam Box about the size of a router, with a nano-ITX motherboard, a 14 nm APU with HBM, WiFi, a few USB ports and an HDMI port to connect to a TV.
An AMD/Valve partnership could potentially revolutionise the console market, providing cheap yet powerful and efficient console-type PCs.
Refuge - Wednesday, June 3, 2015 - link
HBM isn't coming to APUs anytime soon.
Cryio - Saturday, June 6, 2015 - link
Probably the first APU after Carrizo.
coder111 - Wednesday, June 3, 2015 - link
Aren't Steam Boxes supposed to run Linux?
AMD drivers for Linux are a bit weird. Catalyst is the officially supported driver, but it's buggy.
The open-source drivers are quite good, but they are slower than Catalyst and don't support the latest OpenGL spec. There is no Mantle/Vulkan/HSA/CrossFire support with the open-source drivers either. OpenCL support is in the alpha stage.
So AMD would need to man up and do the Linux drivers properly. They are working on it and making good progress, but I doubt it's ready to be used as it is at the moment...
Besides, lots of games these days get developed with Nvidia's "help" to ensure they run well on Nvidia GPUs and run like crap on AMD GPUs. And if the games are built using Intel Compiler, they'll run like crap on AMD CPUs as well. All of these tactics are anticompetitive and should be illegal IMO but who said the world is fair...
And don't get me wrong, I love AMD, I use Linux + AMD dGPU + APU, but I don't think it's ready for the masses yet.
AS118 - Wednesday, June 3, 2015 - link
I agree. I'm a double AMD Linux gamer and I've run into the exact same problems as you have, and I wish they'd be more serious about Linux. Sure, they have Microsoft's support, but I feel that they should take Linux more seriously outside of the enterprise (where they already do take it more seriously).
yankeeDDL - Wednesday, June 3, 2015 - link
I disagree. For casual gaming on laptops, 1366x768 is just fine. You'll need a lot more horsepower to drive a full-HD screen, and battery life will suffer.
I won't say there's no benefit to gaming at full HD vs 1366x768: obviously the visuals are better. But if you want an "all-rounder" laptop that does not weigh a ton (like "real" gaming laptops do) and that comes in below $500, it's not bad at all.
BrokenCrayons - Wednesday, June 3, 2015 - link
I personally would rather have a cheap 1366x768 panel. I don't care much about color accuracy, light bleed, panel responsiveness or much of anything else, and haven't since we transitioned from passive to active matrix screens in the 486 to original Pentium era of notebook computers. In fact, I see higher resolutions as an unnecessary drain on battery life (because I have to scale things anyway to easily read text and interact with UI elements, and because native-resolution gaming on higher-res screens demands more otherwise unnecessary GPU power) that invariably drives up the cost of the system for otherwise identical performance. The drive for progressively smaller, higher pixel density displays is a pointless struggle to fill in comparable checkboxes between competitors, to appease a consumer audience that has been swept up in an artificially fabricated frenzy over an irrelevant device specification.
yankeeDDL - Wednesday, June 3, 2015 - link
I think it depends on the use, ultimately. For office work (i.e. a lot of reading and writing emails), a reasonably high resolution helps make the text sharp and easier on the eyes.
For home use (web browsing, watching videos, casual gaming) though, I find it a lot less relevant.
Personally, at home, I'd rather have a <$400 laptop always ready to be used for anything and moved around, even in the kitchen, than a $1000 laptop which I would need to treat with gloves for fear of damaging it. Since Kaveri I have also started recommending AMD again to my friends and family: much cheaper than Intel, and the decent GPU makes those laptops a lot more versatile. Again, my opinion, based on my use. As they say: to each his own...
Refuge - Wednesday, June 3, 2015 - link
Built my mother a new system from her old scraps with a new A8. She loves that desktop, and when she finally put an SSD in it she loved it ten times more. The upgrade only cost her $300 for CPU, mobo and RAM. Threw it together in 45 minutes, and she hasn't had a problem with it in 2 years so far.
nathanddrews - Wednesday, June 3, 2015 - link
I prefer the following setup:
1. Beast-mode, high-performance desktop for gaming, video editing, etc.
2. Low-power, cheap notebook/tablet for In-Home Steam Streaming and light gaming (720p) on the go.
In my use case, as long as I can load and play the game (20-30fps for RTS, 30fps+ for everything else) on a plane ride or some other scenario without AC access, I'm not really concerned with the AA or texture quality. I still want to get the best experience possible, but balanced against the cheapest possible price. The sub-$300 range is ideal for me.
AS118 - Wednesday, June 3, 2015 - link
Yeah, that's my thing as well. High resolutions at work; at home, 768p or 900p is just fine, especially for gaming.
I also recommend AMD to friends and relatives who want laptops and such that can do casual gaming for cheap.
FlushedBubblyJock - Tuesday, June 9, 2015 - link
Why go AMD when the HD 3000 does just fine gaming, and the added power of the Intel CPU is an awesome boost overall...
Valantar - Wednesday, June 3, 2015 - link
1366*768 on anything larger than 13" looks a mess, but in a cheap laptop I'd rather have a 13*7 IPS for the viewing angles and better visuals than a cheap FHD TN panel - bad viewing angles especially KILL the experience of using a laptop. Still, 13*7 is pretty lousy for anything other than multimedia - it's simply too short to fit a useful amount of text vertically. A decent compromise would be moving up to 1600*900 as the standard resolution on >11" displays. Or, of course, moving to 3:2 or 4:3 displays, which would make the resolution 1366*911 or 1366*1024 and provide ample vertical space. Still, 13*7 TN panels need to go away. Now.
yankeeDDL - Wednesday, June 3, 2015 - link
Like I said, to each his own. I have a Lenovo Z50 with the A10-7300, for which I paid less than $470. Quite frankly, I could not be happier, and I think it provides massive value for that money.
Sure, a larger battery and a better screen would not hurt, but for hustling it around the house, bringing it to a friend's or family member's house, watching movies, and playing games at native resolution, it is fantastic.
It's no road warrior, for sure (it's heavy, and the battery life doesn't go much beyond 3 hrs of "serious" use), but playing at 1366*768 on something that weighs 5 pounds and costs noticeably less than $500 is quite amazing. Impossible with Intel plus discrete graphics, as far as I know.
FlushedBubblyJock - Tuesday, June 9, 2015 - link
Nope, the HD 3000 plays just fine.
Margalus - Wednesday, June 3, 2015 - link
I'd rather have a cheaper 15.6" 1366x768 TN panel over a more expensive, smaller IPS panel.
UtilityMax - Wednesday, June 3, 2015 - link
1366x768 is fine for movies and games. But it's a bad resolution for reading text or viewing images on the web, since you see pixels the size of moon craters.
BrokenCrayons - Thursday, June 4, 2015 - link
I understand there's going to be a variety of differing opinions on the idea of seeing individual pixels. As far as I'm concerned, seeing individual pixels isn't a dreadful or horrific thing. In fact, to me it simply doesn't matter. I'm busy living my meat-world life and enjoying whatever moments I have with family and friends, so I don't give the ability to discern an individual pixel so much as a second thought. It is an insignificant part of my life, but what isn't is the associated decline in battery life (in relative terms) required to drive additional, utterly unnecessary pixels and to push out sufficient light as a result of the larger multitude of them. That sort of thing is marginally annoying -- then again, I still just don't care that much one way or another, aside from noticing that a lot of people are very much infatuated with an insignificant, nonsense problem.
FlushedBubblyJock - Tuesday, June 9, 2015 - link
They probably have their contrast off, ClearType disabled, the wrong screen font, font edge smoothing disabled, and then he's probably far-sighted with coke-bottle glasses and bifocals... thus AMD must be had in super high res at 14fps...
meacupla - Wednesday, June 3, 2015 - link
Ideally, what you'd get is a 2560x1440 screen with PROPER scaling to 1280x720, unlike what most monitors seem to do, which is a terrible job at scaling.
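The reason 2560x1440 can scale 1280x720 "properly" is that the ratio is exactly 2:1, so every logical pixel maps to a clean 2x2 block of physical pixels with no interpolation. A toy sketch of the idea (illustrative only, not how any particular monitor's scaler is built):

```python
# Integer 2x scaling: each logical pixel becomes an exact 2x2 block,
# so nearest-neighbour is perfectly sharp. Illustrative sketch only.
def integer_upscale(image, factor=2):
    out = []
    for row in image:
        expanded = [px for px in row for _ in range(factor)]
        out.extend([expanded] * factor)
    return out

frame = [["a", "b"],        # stand-in for a 1280x720 frame
         ["c", "d"]]
for row in integer_upscale(frame):
    print(row)
# ['a', 'a', 'b', 'b']
# ['a', 'a', 'b', 'b']
# ['c', 'c', 'd', 'd']
# ['c', 'c', 'd', 'd']

# A non-integer ratio (e.g. 1280x720 shown on a 1366x768 panel, a
# 1.067x factor) forces interpolation - the source of the blur.
```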
SilthDraeth - Wednesday, June 3, 2015 - link
I run a Kaveri on my desktop and it drives 1080p just fine. Anyways, most Intel laptops in that same price range run 1366x768 screens also - at least almost every laptop you find in a Staples or a Walmart. Very few of the devices on display have a higher-resolution screen or an IPS solution.
yankeeDDL - Wednesday, June 3, 2015 - link
Yes, *and* Intel's in that price range don't run games.
FlushedBubblyJock - Tuesday, June 9, 2015 - link
Yes they do; make sure you have the HD 3000 at least, but that's cheap as beans.
Margalus - Wednesday, June 3, 2015 - link
How is it an "issue that plagues AMD"? Are you saying AMD chips are buggy and crash at that resolution? Are you saying AMD graphics get corrupted at that resolution?
medi03 - Thursday, June 4, 2015 - link
Ditto. In fact, the 1366x768 TN (!!!) screen is the main issue with AMD mobile chips.
Drumsticks - Wednesday, June 3, 2015 - link
The 45W A8-6500T scores 71 in Cinebench R15. Even assuming that a 15W Kaveri gets identical performance (I couldn't think of a 15W Kaveri chip to check in Bench), that puts it at 106, which is finally, finally even with Broadwell in the XPS 13, or at least within 10%.
Skylake will probably add another 10% gap, but this is the closest AMD might have been in the last seven years or so to catching up with Intel. And at half the price, they could seriously have the start of a comeback!
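For reference, the arithmetic implied by those numbers (a reconstruction from the figures in the comment only; the equal-performance-at-15W premise is the commenter's assumption):

```python
# Reconstructing the comment's arithmetic from the quoted figures alone.
a8_6500t_r15 = 71        # Cinebench R15 multithreaded, 45W A8-6500T (the stand-in part)
carrizo_projection = 106 # the projected 15W Carrizo score

uplift = carrizo_projection / a8_6500t_r15
print(f"implied generational uplift: {uplift:.2f}x")             # ~1.49x

# Being "within 10%" of Broadwell would put the XPS 13 at roughly:
print(f"implied XPS 13 score: ~{carrizo_projection / 0.9:.0f}")  # ~118
```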
xenol - Wednesday, June 3, 2015 - link
My only problem with AMD and the marketing blurbs is that they're targeting niche markets for all intents and purposes. Despite what PC gamers want us to think about how big PC gaming is, it's not exactly a majority player in the entire scheme of things, and gaming on a laptop isn't really that accepted.
And even if gaming is involved, it's usually not games that require beefy specs to begin with.
Michael Bay - Wednesday, June 3, 2015 - link
PC gaming by market share is on par with any past-gen/next-gen console, and sometimes more than one.
yankeeDDL - Wednesday, June 3, 2015 - link
If you think of gaming as hard-core gaming - the latest and greatest titles, max settings, high res - then yes, I agree.
But, as a father, I can tell you that there are tons of kids who are OK with "some" gaming, whose parents won't fork out $1K for a powerful gaming rig, and for whom AMD really offers a price/performance ratio that cannot be matched.
FlushedBubblyJock - Tuesday, June 9, 2015 - link
Having kids in the USA is down, down, down, so not so much biz there.
barleyguy - Thursday, June 4, 2015 - link
The marketing for this particular launch is specifically targeting players of MOBAs, such as League of Legends, DOTA2, and Starcraft. The active player base of those 3 games is about 100 million people. You can argue that's not a majority of users, but AMD has never had a majority share of laptops, and 100 million people is a sizable market.
I expect that AMD will make more direct appeals to that group of people, such as sponsoring tournaments, and generally trying to establish mindshare for laptops in that group.
I don't think targeting that particular niche market is a bad idea, personally.
Refuge - Thursday, June 4, 2015 - link
Their APUs can perform on those games very well, but so can all of Intel's HD series graphics, at least every rendition I've ever seen.
It won't be an easy crowd to win over. Even if the requirements for their sport only warrant a $300 computer, you know they are gamers and still want their supercomputers.
Magichands8 - Wednesday, June 3, 2015 - link
I agree with this. But it also doesn't make sense to me why they have. CPUs have been commodity devices for some time now, and it makes no sense to me whatsoever that the average consumer would buy Intel when they can transparently get the very same performance for their needs with a much cheaper AMD processor. This, plus the fact that AMD does so well with essentially embedded systems (Xbox, PlayStation) even though they are 'AMD Inside', suggests to me that their struggle is largely due to marketing failures rather than technological ones.
medi03 - Thursday, June 4, 2015 - link
Athlon 64s were outsold by Intel's Prescott, which was:
a) slower
b) consumed more power
c) more expensive
also because of "AMD's marketing failures" I guess...
FlushedBubblyJock - Tuesday, June 9, 2015 - link
No, it's the skin that surrounds the AMD heart - and that skin is cheap and ugly - that's how AMD plays their cards... a low-class, dirt-bucket-cheap build. So even if it could be made to look great and "feel very expensive and well constructed", those creating the AMD skin WON'T DO IT.
FlushedBubblyJock - Tuesday, June 9, 2015 - link
Thinking again, I will admit that when browsing new laptops in person, "AMD GAMING!" is not something I can ever recall seeing... it's very tough sometimes just to get the basic specs made visible by the box stores....
AS118 - Wednesday, June 3, 2015 - link
I don't mind 1366 x 768 on a smaller screen (14" and less for me) and think that AMD's performance at 900p and below is great. I also think that it's sad that more people don't know about them and how capable they are.
Many casual customers who would want that extra gaming performance don't even know it exists on A8s and A10s.
AS118 - Saturday, June 6, 2015 - link
Yeah, I don't mind 1366 x 768 myself on smaller laptops, and gaming at that resolution is a thing with AMD APUs. Gaming performance on the A8 and especially the A10 APUs is why I've been getting AMD processors in laptops for years.
I don't need a lot of GPU oomph on my laptops; I just needed the option to play casual games and not-that-hard-to-run stuff like WoW (well, before WoD at least) on the go.
albert89 - Tuesday, June 23, 2015 - link
Most people are doing fine with an Intel Atom, Celeron or Pentium with a dedicated GPU, while most AMD APUs have over twice the horsepower of the above, and that's without a dedicated GPU! Seriously... the 1366x768 issue was fixed a long time ago, while Intel is still charging twice the price for dinosaur CPUs!
Stuka87 - Tuesday, June 2, 2015 - link
Great article. Carrizo really looks promising. Looking forward to performance and power consumption numbers when the time comes.
Refuge - Wednesday, June 3, 2015 - link
I'm trying not to get my hopes up too high; AMD has a history of over-promising and under-delivering.
But my 3rd gen APU is long in the tooth and I could use a new work laptop. If it is everything they say it will be, I expect to pick up a new Carrizo laptop this year - if I can find one, that is.
monstercameron - Tuesday, June 2, 2015 - link
I believe the improvement to power use for video decode is called intra-frame power gating, which makes sense seeing how the UVD engine can decode a video stream x times faster than realtime.
fteoath64 - Wednesday, June 3, 2015 - link
This chip would be perfect for an HTPC serving 4K and below streams. Gives the Shield TV box a run for its money! Go AMD.
jjj - Tuesday, June 2, 2015 - link
lol, another launch without reviews, and they wonder why we'll all forget about their product by next week. Good or bad, we would at least know what it is.
SolMiester - Wednesday, June 3, 2015 - link
I agree... too many promises from this outfit, not enough action.
FlushedBubblyJock - Tuesday, June 9, 2015 - link
Everything is a secret, like bad governments.
bloodypulp - Wednesday, June 3, 2015 - link
Likely aligning reviews with the Windows 10 launch. Soon.
JDG1980 - Tuesday, June 2, 2015 - link
Carrizo looks promising, though I want to see actual benchmarks.
AMD needs to bring the HEVC decoder to their discrete video cards (not just Fiji).
Novacius - Tuesday, June 2, 2015 - link
From the article: "However for a 15W part, to which Carrizo will be primarily, this means either a 10%+ power reduction at the same frequency or a 5% increase in frequency for the same power. Should AMD release other APUs in the 7.5W region, this is where most of the gains are."
It's power per core pair (module), and Carrizo clearly has two of them. This means the highest gains are exactly in the 15W TDP range.
Ian Cutress - Tuesday, June 2, 2015 - link
Good point, article updated :)
Novacius - Tuesday, June 2, 2015 - link
"The big upgrade in graphics for Carrizo is that the maximum number of compute units for a mobile APU moves up from six (384 SPs) to eight (512 SPs), affording a 33% potential improvement."
Mobile Kaveri already has 8 CUs (FX-7600P), but only at 35W.
el etro - Tuesday, June 2, 2015 - link
They stated that the power savings allowed them to put the 512 SPs/8 CUs in the 15W part.
Ian Cutress - Wednesday, June 3, 2015 - link
I changed that on the fly a little while back between meetings, should be OK now :)
ClamShall - Tuesday, June 2, 2015 - link
This article is basically just an explanation of AMD's marketing slides without actual empirical data to back things up (other than what has been provided by AMD). Worse still, it doesn't make any notable attempt to critically analyze whether the company's claims will or will not materialize.
In short, this article should've been left to AMD's marketing team and posted on the company's site.
Iketh - Tuesday, June 2, 2015 - link
I don't have time to visit multiple sites. This is the only one I visit. Thank you AT for this article, and keep them coming please.
KenLuskin - Wednesday, June 3, 2015 - link
clamboy, this is a good article, and you are just another butt-hurt Intel fanboy. Intel does the EXACT same thing, and yet dumbos like you suck it up... Bend over and assume the position for maximum penetration!
SolMiester - Wednesday, June 3, 2015 - link
Eh? And how are AMD providing the butt hurt?....
formulav8 - Sunday, June 7, 2015 - link
Because there are many wackos who feel like an Intel/NVidia corp is their mommy and hate to see AMD improve anything. Look at the comments on the FreeSync reviews and such. Stupid how anyone gets attached to a corp that cares nothing about you.
FlushedBubblyJock - Tuesday, June 9, 2015 - link
Amazing how a critically correct comment turns into an angry, ranting conspiracy from you.
BillyONeal - Wednesday, June 3, 2015 - link
This is a preview piece. They don't have empirical data because the hardware isn't in actual devices yet. Look at any of AT's IDF coverage and you'll see basically the exact same thing.
Refuge - Wednesday, June 3, 2015 - link
Nothing has been released yet, but it was announced. This is a news site; do you think they are just going to ignore AMD's product announcement? That would be considered "not doing their job".
They go through the claims, explain them, and try to see if they are plausible with what little information they have. I like these articles; they give me something to digest while I wait for an in-depth review, and when I go to read said review I know exactly what information I'm most interested in.
KaarlisK - Wednesday, June 3, 2015 - link
About adaptive clocking.
Power is not saved by reducing frequency by 5% for 1% of the time.
Power is saved by reducing the voltage margin (increasing frequency at the same voltage) _all_ the time.
Also, when the voltage instability occurs, only frequency is reduced. The requested voltage, IMHO, does not change.
ingwe - Wednesday, June 3, 2015 - link
Interesting. That makes more sense for sure.
name99 - Monday, June 8, 2015 - link
It seems like a variant of this should be widely applicable (especially if AMD have patents on exactly what they do). What I have in mind is that when you detect droop, rather than dynamically changing the frequency (which is hard and requires at least some cycles), you simply freeze the entire chip's clock at the central distribution point --- for one cycle you just hold everything at zero rather than transitioning to one and back. This will give the capacitors time to recover from the droop (and obviously the principle can be extended to freeze the clock for two cycles or even more, if that's how long it takes for the capacitors to recover).
This seems like it should allow you to run pretty damn close to the minimum necessary voltage --- basically all you now need is enough margin to ensure that you don't overdraw within a worst-case single cycle. But you don't need to provision for 3+ worst-case cycles, and you don't need the alternative of fancy checkpoint and recovery mechanisms.
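A toy simulation of the trade-off being described (a sketch under assumed numbers - the droop probability and recovery time are invented for illustration, not taken from AMD):

```python
import random

# Sketch: run with a slim voltage margin and, when a droop is detected,
# freeze the clock for a couple of cycles so the supply recovers, instead
# of carrying enough static margin to ride out several worst-case cycles.
RECOVERY_CYCLES = 2       # assumed capacitor/supply recovery time
DROOP_PROBABILITY = 0.01  # assumed chance of a worst-case current spike

completed = stalled = cycles = 0
while completed < 100_000:
    cycles += 1
    completed += 1                    # work advances this cycle
    if random.random() < DROOP_PROBABILITY:
        stalled += RECOVERY_CYCLES    # clock held at zero; no work done
        cycles += RECOVERY_CYCLES

print(f"throughput cost of droop stalls: {stalled / cycles:.1%}")  # ~2.0%
# A ~2% throughput cost in exchange for running near the minimum
# necessary voltage is the kind of trade this mechanism aims for.
```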
KaarlisK - Wednesday, June 3, 2015 - link
About that power plane. "In yet more effort to suction power out of the system, the GPU will have its own dedicated voltage plane as part of the system, rather than a separate voltage island requiring its own power delivery mechanism as before"
As I understand it, "before" = same power plane/island as other parts of the SoC.
Gadgety - Wednesday, June 3, 2015 - link
Great read and analysis given the fact that actual units are not available for testing. As a consumer looking for uses of Carrizo beyond laptops, provided AMD releases it for consumers, it could make a nice living room HTPC/light gaming unit.
Laxaa - Wednesday, June 3, 2015 - link
I would buy a Dell XPS 13-esque machine with this (i.e. high quality materials, good design and a high res screen).
Will Robinson - Wednesday, June 3, 2015 - link
According to ShintelDK and Chizow, the above article results are from an Intel chip and AT has been paid to lie and say it's Carrizo, because their lives would have no meaning if it is a good product from AMD.
D. Lister - Wednesday, June 3, 2015 - link
Calling an unlaunched product "bad" would be just as imprudent as calling it "good", but then what do I know; fortune-telling could be just another one of your super powers, along with mind-reading.
Gigaplex - Thursday, June 4, 2015 - link
Why would they use an Intel chip to fake a bar graph? Just put fake numbers directly into a spreadsheet, job done.
Dirty_Punk - Wednesday, June 3, 2015 - link
Unfortunately, time to market, as always for AMD, will probably be six months or more... and by that time there will be something better from Intel. From this point of view it seems a lot like Asus: great product, very bad supply chain. AMD has three problems right now:
1. With 28nm it is difficult to compete against Intel's 14nm.
2. They are very slow going from project design to mass production.
3. There is no support from OEMs and major system builders (HP & co.); it always seems that Intel works to make system builders forget about AMD, like years ago...
jabber - Wednesday, June 3, 2015 - link
Yep, this time next year when you walk into a PC store you'll still see 25 Intel laptops and one cheap, nasty AMD laptop on the shelves. OEMs don't care about AMD anymore. What I see from customers is that they only buy AMD if it's the cheapest machine in the shop. And then it's an E1 chip and the customer really regrets it.
haukionkannel - Wednesday, June 3, 2015 - link
That problem 1 is the worst! AMD APUs are OK, but 28nm vs Intel's 14nm is just a huge deal when considering power, efficiency and how much they can put on the chip.
jimjamjamie - Wednesday, June 3, 2015 - link
HP is actually pretty good at offering a decent selection of AMD-powered laptops, usually the cheaper models. Better than most other OEMs, anyway.
bloodypulp - Wednesday, June 3, 2015 - link
The only decent laptops HP makes are the EliteBooks. And you will pay through the nose for them, when you can get better quality from other brands for less. HP should just stop making PCs, period.
UtilityMax - Wednesday, June 3, 2015 - link
HP has several models with AMD chips. It's one of the few laptop makers that lets you configure an Envy laptop with, say, a 1080p screen, an SSD, an A10 CPU, plus discrete graphics.
jabber - Thursday, June 4, 2015 - link
And I bet they sell about 8 of those AMD-based machines a year.
watzupken - Thursday, June 11, 2015 - link
Point 2 is probably the killer, to be honest. Point 1 definitely puts AMD at a disadvantage, but considering that Intel hasn't seemed interested in pushing performance since its Sandy Bridge days, it gives AMD a chance to catch up in terms of performance. To be honest, I have to take my hat off to AMD's efforts: at 28nm, they are forced to be as creative as they can to squeeze performance out while keeping power requirements in check.
VeixES - Wednesday, June 3, 2015 - link
Some OEM needs to pick this up fast: a Carrizo-based "NUC" device with HDMI 2.0 output, with a more barebones approach than Intel's to reduce the cost of entry.
gostan - Wednesday, June 3, 2015 - link
Anandtech - AMD's marketing arm.
bloodypulp - Wednesday, June 3, 2015 - link
Dumbest thing I've heard yet today.
jabber - Thursday, June 4, 2015 - link
Everyone knows AMD has never had a marketing arm. That's why no one buys 'em. Seriously, the OEMs have moved on. Why bother with AMD when Intel sells, simply because the average consumer has heard of Intel? Price doesn't come into it.
watzupken - Thursday, June 11, 2015 - link
To gostan: I find your comment above baseless and unconstructive, to be honest. One article on AMD makes them AMD's marketing arm? So what does that make you, then?
l_d_allan - Wednesday, June 3, 2015 - link
My impression is that it will be difficult (almost impossible?) for AMD to compete with a 28nm part against Intel's 14nm parts. And I think the next "tick-tock" from Intel will be 10nm. Or not?
Novacius - Wednesday, June 3, 2015 - link
It'll be a tick, codenamed Cannonlake, but I don't expect it before the end of 2016/beginning of 2017.
The_Assimilator - Wednesday, June 3, 2015 - link
Which will still be before AMD gets to 14nm.cjs150 - Wednesday, June 3, 2015 - link
Finally AMD releases a reasonably power-efficient chip. At 15W this is perfect for a passively cooled HTPC with 4K capability built in. I appreciate the HTPC market is small, but AMD have something that potentially (I will reserve judgment until it is out and tested) beats everything Intel have, comprehensively.
The problem for AMD will be that people like me already have an HTPC (in my case using an i7-3770T, which is overkill), and until the world moves to 4K there is no need to upgrade. But if they produced something the size of an Intel NUC, passively cooled, I would be very tempted.
watzupken - Wednesday, June 3, 2015 - link
I think this makes a very interesting APU. In fact, the most interesting APU from AMD to date. Unfortunately, it may not reach the shores of where I come from: either availability is limited, or the distributors are not interested in carrying it because they expect low demand.
Cloakstar - Wednesday, June 3, 2015 - link
These scores definitely need validation. If true, Carrizo is a massive win. The FX-8800P graphic shows a 3DMark 11 score of nearly 2000 at 15W, and 2700+ at 35W.
The A10-7850k has a score of 2403 at 95W.
http://www.anandtech.com/show/7677/amd-kaveri-revi...
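Taking those numbers at face value (the Carrizo scores are AMD-provided, and TDP is only a proxy for actual power draw), the efficiency gap is easy to quantify:

```python
# Perf-per-watt from the scores quoted above. Carrizo scores are
# AMD-provided 3DMark 11 figures; watts are TDP labels, not measured.
parts = {
    "FX-8800P @ 15W":  (2000, 15),
    "FX-8800P @ 35W":  (2700, 35),
    "A10-7850K @ 95W": (2403, 95),
}
for name, (score, tdp) in parts.items():
    print(f"{name}: {score / tdp:.0f} points per TDP watt")
# -> roughly 133, 77 and 25 points/W respectively
```

That's roughly a 5x efficiency jump over Kaveri's desktop flagship at the 15W point, if the numbers survive independent testing.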
azazel1024 - Wednesday, June 3, 2015 - link
I didn't realize AMD's processors were so terrible at video playback. My two-year-old (pushing three now) Ivy Bridge i5-3317U-equipped HP Envy 4t can manage roughly 6 hours of playback of a 1080p H.264 12Mbps source, and it only has about a 45Wh battery in it. With a higher-TDP chip and lots of "not power saving" features.
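For reference, those figures imply a very low average system draw, which is the bar Carrizo's media pipeline has to clear (rough division only; real playback power varies with panel brightness, codec and bitrate):

```python
# Average system power implied by the playback anecdote above.
battery_wh = 45.0   # stated battery capacity
runtime_h  = 6.0    # stated 1080p H.264 playback runtime
print(f"~{battery_wh / runtime_h:.1f} W average system draw")  # ~7.5 W
# An 8-hour runtime on the same battery would need ~5.6 W average.
```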
creed3020 - Wednesday, June 3, 2015 - link
I am definitely in the market for one of these laptops, to replace two older laptops in the house with one new one. If it had Carrizo at its heart I would be mighty happy to support AMD over Intel this round, as the improvements here sound very much adequate for the system I am looking for. The ideal system would be something like the HP Spectre x360 for around $750.
michal1980 - Wednesday, June 3, 2015 - link
AMD's problems start with the first slide: "more people buy notebooks priced between $400 and 700 than at any other price. Almost 2 out of every 5 notebooks sold is in that segment." Umm, 3 out of 5 notebooks are sold outside of the $400-700 price range. That's greater than 2 out of 5.
AMD fails math. Fails in general.
silverblue - Wednesday, June 3, 2015 - link
No. 3 out of 5 notebooks are either sold below $400 or above $700, and of those two disparate segments, neither is as large as the $400-700 segment. There isn't a "math" fail here.
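A made-up distribution shows why there's no contradiction (hypothetical shares, purely for illustration):

```python
# "2 of 5 notebooks are $400-700" can coexist with "3 of 5 fall outside
# it", because the outside share is split across two separate buckets.
segments = {"under $400": 0.35, "$400-$700": 0.40, "over $700": 0.25}
biggest = max(segments, key=segments.get)
print(biggest)                                   # $400-$700
outside = sum(v for k, v in segments.items() if k != biggest)
print(f"outside that bucket: {outside:.0%}")     # 60%
```

The $400-$700 bucket is the single largest segment even though most sales fall outside it.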
takeship - Wednesday, June 3, 2015 - link
The bigger issue is that AMD is admitting they are so uncompetitive in the market that it doesn't make business sense to chase at least 60% of consumers (while ignoring business customers completely). And realistically, that 400-700 market is really more like a 550-700 market, as 400-500 is close enough to base iPad Air 2/premium Android tablet pricing that you lose a lot of sales in that direction.
silverblue - Wednesday, June 3, 2015 - link
Surely TrustZone is a sign that they want business customers? Additionally, being able to work on the battery all day is a good thing.
Gigaplex - Thursday, June 4, 2015 - link
Read closer. Their "all day battery" claim is being able to idle for 8 hours. You won't get a full work day out of that.
silverblue - Friday, June 5, 2015 - link
If you class H.264 1080p video as idle, sure. My fault for saying "work", though; if all you're doing is light stuff, you won't be far off.
FlushedBubblyJock - Wednesday, June 10, 2015 - link
For AMD's sake we'll class idle as full-screen video playback, 1.5 hours as all day, and no WiFi, Bluetooth or DVD player active as full multimedia. There, now look, you were correct! Your ego is intact; you're never wrong.
Calgon, take me away.
name99 - Saturday, June 6, 2015 - link
You are comparing a $400 laptop to a $1500 laptop and, what do you know, the $1500 laptop comes out better. What a surprise! The point is that in this space batteries have long been cheap and the energy efficiency nothing like at the higher end, which means the working battery life has been something like 3 hrs. If AMD shifts that to six hours with this chip, that's a massive improvement in the target space.
You're also making bad assumptions about why these laptops are bought. If you rely on your laptop heavily for your job, you buy a $1500 laptop. These machines are bought to act as light performance desk machines that are occasionally (but only occasionally) taken to a conference room or on a field trip.
name99 - Saturday, June 6, 2015 - link
AMD does not have infinite resources. This play makes sense. Intel is essentially operating by starting with a Xeon design point and progressively stripping things out to get to Broadwell-M, which means that Broadwell-M over-supplies this $400-$700 market. Meanwhile, at the really low end, Intel has Atom.
AMD is seeing (correctly, I think) that there is something of a gap in the Intel line which they can cover AND that this gap will probably persist for some time; Intel isn't going to create a third line just to fit that gap.
Krysto - Wednesday, June 3, 2015 - link
I might be ready to get into AMD, as AMD has a lot of innovation lately. But it still disappoints me greatly that they aren't able to adopt a more modern process node. If they launch their new high-performance CPU core next year as part of an APU that uses HBM memory and is at the very least on 16nm FinFET, I might get that instead of a Skylake laptop. HSA is pretty cool and one of the reasons I'd get it.
UtilityMax - Wednesday, June 3, 2015 - link
The Kaveri FX parts still have almost half the IPC of a competing Intel Core i3 with the same TDP. Only in tests involving multithreaded apps that can load all four cores do the FX parts keep up with the Core i3. Let's hope the Carrizo generation of APUs will improve this situation.
silverblue - Thursday, June 4, 2015 - link
Without being an AMD apologist, I think the point was that single-threaded performance was "good enough" for your usual light work, which tends to be hamstrung by I/O anyway. There are two things that I need to see clarified about Carrizo, however:
1) Does Carrizo drop CPU frequency automatically when the GPU is being taxed? That's certainly going to be an issue as regards the comparison with an i3.
2) With the addition of AVX2, were there any architectural changes made to accommodate AVX2, for example a wider FlexFPU?
sonicmerlin - Tuesday, June 9, 2015 - link
Yup. I'll wait for the 14nm Zen APUs with HBM. The performance leap (both CPU and GPU) should be truly massive.
Phartindust - Thursday, June 4, 2015 - link
Dude, you're getting a Dell with an AMD processor! When was the last time that happened?
Looks like @Dell loves #Carrizo, and will use @AMD once again. #AMDRTP http://www.cnet.com/au/news/dell-inspirion-amd-car... …
elabdump - Friday, June 5, 2015 - link
Don't forget that Intel gives you a non-fixable, NSA-approved BIOS: http://mjg59.dreamwidth.org/33981.html
patrickjchase - Friday, June 5, 2015 - link
Ian, you appear to have confused I-cache and D-cache. You wrote: "The L1 data cache is also now an 8-way associative design, but with the better branch prediction when needed it will only activate the one segment required and when possible power down the rest".
This is of course gibberish. Branch prediction would help to predict the target set of an *instruction* fetch from the I-cache, but is useless for D-cache set prediction for the most part (I say "for the most part" because Brad Calder did publish a way-prediction scheme based on instruction address back in the 90s. It didn't work very well and hasn't been productized that I know of).
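To illustrate why way prediction is naturally an I-cache trick, here is a toy model (not Carrizo's actual design, and far simpler than Calder's published scheme): the predicted way is probed first, and only a mispredict wakes the remaining ways, which is where the power saving would come from.

```python
# Toy way-predicted instruction cache. Instruction fetch addresses are
# known early (via the branch predictor), so the cache can wake one
# predicted way instead of reading all 8 ways in parallel.
WAYS, SETS, LINE = 8, 64, 64

class WayPredictedICache:
    def __init__(self):
        self.tags = [[None] * WAYS for _ in range(SETS)]
        self.last_way = [0] * SETS  # naive predictor: last way that hit
        self.way_reads = 0          # proxy for tag/data-array energy

    def fetch(self, addr):
        s = (addr // LINE) % SETS
        tag = addr // (LINE * SETS)
        guess = self.last_way[s]
        self.way_reads += 1             # probe only the predicted way
        if self.tags[s][guess] == tag:
            return True                 # fast, low-power hit
        self.way_reads += WAYS - 1      # mispredict: probe the other ways
        for w in range(WAYS):
            if self.tags[s][w] == tag:
                self.last_way[s] = w
                return True
        self.tags[s][guess] = tag       # simplistic fill on miss
        return False
```

With straight-line code the predictor is almost always right, so the cache touches one way per fetch instead of eight; a D-cache sees far less predictable addresses, which is patrickjchase's point.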
zodiacfml - Friday, June 5, 2015 - link
Imagine what they could do with this at 14nm: probably half the cost of a Core M, with 60 to 70% of the M's CPU performance, yet with better graphics at the same TDP.
AS118 - Saturday, June 6, 2015 - link
I already signed up for the mailing list that tells you when laptops with Carrizo come out and are ready to buy. You can do so on AMD's website if you're interested. The H.265 hardware decoding alone interests me, and all the other features, like program-specific acceleration and the better GPU performance for mainstream games, are nice. If you only play stuff like LoL or Counter-Strike, or browser games, or even older games on GOG and Steam, the A10 and up look like they'll be quite good.
ivyanev - Sunday, June 7, 2015 - link
As the performance is more than enough for everyday use, and the price is good, using it in a mini PC would be great.
watzupken - Thursday, June 11, 2015 - link
I was thinking the same thing. If they can produce this for use in those NUC-sized PCs, I will consider getting one as an HTPC if the price is right.
Fujikoma - Sunday, June 7, 2015 - link
AMD not including VP9 support is a mistake. They could always drop it later if YouTube loses popularity, but a lot of video in media articles tends to be linked to YouTube. It would also be nice to see a die shrink with AMD adding more CPU cores, to at least compete with Intel in number crunching.
ivyanev - Tuesday, June 9, 2015 - link
Try using the h264ify plugin for Chrome - it disables the VP8 and VP9 video, and YouTube plays the MP4 versions - butter smooth and efficient.
figus77 - Thursday, June 11, 2015 - link
I think everyone should look at APUs with respect; the APU is the future of the PC and the notebook. HBM on the next AMD GPUs will be a first step and a test run for new APUs with HBM on-package, which will be far faster than any DDR4 now available on the market, and probably than any on-motherboard RAM we will ever see. AMD could start a revolution in the PC market, and others will probably copy them soon after, even with faster CPUs; but if that happens, we should be grateful to AMD. And sorry for my English...
JDub8 - Tuesday, June 16, 2015 - link
Something I'm always interested in but that is never addressed in these articles: the UVD playback and all its magical power savings. What codecs/players support it? If I have CCCP installed, will MPC-HC automatically benefit? Or will that be reserved for some cyberpower payware DVD/BD player?