The overall PC market did very well last quarter. Console sales should have been at their highest in the fourth quarter. Yet AMD still lost a third of a billion dollars. They did nothing to reduce their debt and have not released anything of relevance in a while. Three top executives just left the company, and the stock has lost nearly half of its value in a year.
When did the stock have value?
AMD traded around $4.50-4.55 through the middle of July last year; today it is trading around $2.24-2.30 (today's range). That is roughly half the value gone within the last 12 months.
Several years ago I was such an AMD fanboy I bought 100 shares of their stock at $21/share. THAT turned out well. /sarcasm
Ouch, $21/share! I've got a serious AMD red spot in my portfolio too.
AMD has a huge success story on its hands with Carrizo if it can get it to market, at least within the niche HTPC segment. Right now the only way to get 18Gbps HDMI 2.0 out of a PC is with expensive Nvidia GTX 980/970 video cards. AMD or a partner could offer a NUC-style or mini-ITX all-in-one board/box for less than half the price of the Nvidia video card alone. There are lots of Ultra HD / 4K TVs that can accept a proper 60Hz signal via HDMI 2.0 (almost no TVs have DisplayPort), and no reasonable HTPC solution is currently on the market.
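For context, the bandwidth arithmetic behind that 18Gbps figure, as a minimal sketch; the 4400x2250 total timing for 4K60 and the 8b/10b TMDS overhead are the standard published values:

```python
# Why 4K60 RGB needs HDMI 2.0: rough data-rate math using standard 4K60 timings.
h_total, v_total, fps = 4400, 2250, 60   # total timing incl. blanking for 3840x2160@60
bits_per_pixel = 24                      # 8-bit RGB
raw = h_total * v_total * fps * bits_per_pixel   # ~14.3 Gbps of raw video data
on_wire = raw * 10 / 8                           # 8b/10b TMDS encoding overhead
print(f"raw {raw/1e9:.1f} Gbps, on the wire {on_wire/1e9:.1f} Gbps")
# -> ~17.8 Gbps: right at HDMI 2.0's 18 Gbps limit, far beyond HDMI 1.4's 10.2 Gbps
```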
AMD, get Carrizo out in a NUC or (fully integrated) mini-ITX board ASAP!!! The window of opportunity is closing fast.
Dear god, please bless notebook manufacturers using AMD's APUs.
I couldn't care less about the "single-threaded CPU performance" of Intel chips; my Excel and Word have been fast enough for many years. But buying an Intel notebook with a discrete GPU really hurts my wallet.
Amen.
Single-threaded performance is one factor mobile CPUs really need: it means the core is very efficient per clock, which matters because most mobile CPUs run at quite low clock frequencies.
Laptops with a discrete mobile GPU are usually meant for gaming and of course cost much more; just look at gaming laptops such as Alienware. Meanwhile, APUs are rarely adequate for games on mobile laptop platforms.
APUs are best suited for game consoles, cheap low-to-midrange laptops and PCs, servers, NAS boxes, tablets, and smartphones, if AMD can lower the TDP and raise the performance of the APUs. Maybe a little more cache memory (an L3) and some GDDR5 memory?
AMD could also integrate their APUs with their high-end video cards and offer a multi-APU system (quad CrossFireX) on cheap motherboards without separate CPUs, memory banks, or a northbridge.
maglito, please stop feeding the fantasy, there is no window, there's not even an unbent spoon, amd Neo isn't going to hack into the Matrix and send out the bull market signal
it's over, it was over years and years ago, the white rabbit is dead, no, you're not a hero with a plan
Last time AMD stock cost $21 or more was in 2006, nine years ago.
Then AMD diluted the stock under financial pressure, and that's when its value fell dramatically.
Console chips are sold in the third quarter, in preparation for fourth-quarter holiday sales; that is when AMD said its revenue peaked. Console chips also carry very low margins: under 20% gross, with almost no net profit at all.
Do you really expect Skylake this year? Broadwell isn't even fully out of the door yet.
According to Intel, Skylake's schedule was not much affected by the 14nm delays and is still on track for late Q3/early Q4 2015. Whether this applies only to Skylake Y remains to be seen; otherwise it wouldn't leave much room for Broadwell on LGA1150. Broadwell K for LGA2011 is scheduled to launch at around the same time as Skylake.
You meant it could have been worse for AMD.
It will probably, unfortunately, become worse for AMD this year. Every time AMD produces slides such as the one above, I am baffled by their late response to, well, basically everything that happens around them. It's almost as if every time they suffer a loss somewhere, their response comes after the fact, never before.
Take the slide above: this year they had 40% of revenue from 'growth markets'. Revenue is great, and being in growth markets is awesome, but where is the profit? When will AMD reap the benefits of what it is doing? Answer: they never really do. When the Bitcoin market exploded and everyone wanted to start mining, they saw a big rise in sales but hardly capitalized on it. They launch new products with an ever-decreasing profit margin (APU, GPU) or a very tight one to begin with (the semi-custom APUs in the Xbox One/PS4). This is all revenue, but little profit.
AMD is very good at keeping itself busy, but very bad at turning it into profit.
At their earnings call a few days ago, Brian Krzanich said Skylake isn't impacted by Broadwell because it's simply a new architecture rather than a new process, so no new fabs are needed; and since the new architecture brings new functionality and performance, releasing Skylake as early as possible is a no-brainer.
The major PC manufacturers do a poor job of creating the best products around AMD parts, hence no sales. There is an opportunity to make a decent laptop in the medium price range; good luck finding an AMD product with a good screen and an SSD. Cheap Windows gaming tablets that are less power-optimized in normal use would be another opportunity - gaming battery life would likely be about the same as Intel, since the 'wells have done most of their power saving at idle. There is no reason there couldn't be one less Intel Atom based tablet, or a version equipped with an AMD APU.
The rest of the issues are on AMD management, because if you look at Nvidia, they are thriving. All AMD needed to do to be profitable was copy half of Nvidia's corporate plan. Clearly something is wrong. They shouldn't be this absent from the market with the products they do have. Traditional desktop CPUs deserve to suck in sales because the offerings are three years old without even a die shrink. Apple did a die shrink on the A5 in the iPad 2 and it resulted in better battery life.
AMD does a poor job of producing products that manufacturers want and need, or that users want or need. This is their fault, not the PC market's or the PC manufacturers' fault. If what you said were true, it would have affected Intel just as much, but they had a record year.
They're still relatively competitive with Nvidia in the GPU market, though Nvidia has better marketing, and while 2013 was AMD's turn to take the price-for-performance crown, 2014 was Nvidia's. If they can turn it around this year with actual marketing, that business should be profitable again.
The real problem, of course, is their CPU architecture. Bulldozer goes down as one of the biggest misfires in history on multiple counts. Meanwhile, their low-power/mobile designs are good from a CPU perspective but lack the fully integrated SoCs and other features manufacturers actually want in that category.
A lot (as in, the company's survival) will ride on their new CPU architecture, out next year, being competitive. Good marketing could help too: Nvidia is no better than AMD in most ways for GPUs, but talk to the average person and it's Nvidia all the way, somehow. Gamers gushed over the new 980/970 TDP even though that's a largely irrelevant number to gamers; somehow Nvidia managed to sell it to them as a selling point anyway. Which is good on them, and bad on AMD.
TDP is not irrelevant. When I fire up two R9 280 or R9 290X GPUs compared to two GTX 970 GPUs, the difference can be felt (heat from my PC) and heard (noise from the fans). Maybe it's my cards, but all my 280 and 290X cards are retail GPUs I bought, not engineering samples.
I could find quieter 290X options now, but a year ago when I got these it was pretty much all blowers. Also, I can't close up my case without overheating problems with the 290X cards -- I need a bigger tower with more fans I guess, or just run with the side off. Even the GTX 770 and 780 GPUs I have run substantially quieter than the R9 cards I have, with basically similar performance.
It's a really bad place to be in for AMD. For similar performance, NVIDIA has GTX 970 that uses nearly 100W less per GPU under load, or for better performance there's the GTX 980 that uses 75W less under load but costs more. At idle it doesn't matter as much, but you don't buy gaming GPUs because you want to compare idle values. By the time AMD ships their next GPU, NVIDIA will likely have their "Big Maxwell" GM210 or whatever waiting in the wings if it's needed; if it's not, we'll get GTX 980 Ti to tide us over until the fall.
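To put that load-power delta in perspective, a back-of-envelope sketch; the gaming hours and electricity rate here are assumptions, not figures from the comment:

```python
# Rough yearly cost of an extra 100 W per GPU under load, two-card setup.
gpus = 2
extra_watts = 100          # per-GPU load difference cited above
hours_per_week = 20        # assumed gaming time
rate_per_kwh = 0.12        # assumed electricity price, $/kWh
kwh = gpus * extra_watts * hours_per_week * 52 / 1000
print(f"{kwh:.0f} kWh/year, about ${kwh * rate_per_kwh:.0f} in electricity")
# -> ~208 kWh, roughly $25/year; the more immediate cost is 200 W of extra heat
#    and fan noise in the case, which is the complaint above.
```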
Fans can be made less noisy, and you aren't sitting next to your PC; unless you are in a closed-off room with an SLI rig sweating bullets, it doesn't matter. But that you have such an extreme case of cognitive dissonance that you're willing to argue it shows just how effective Nvidia's PR is versus AMD's. You, and most people, will yell to the heavens to defend an irrelevance, because NVIDIA sold you on its importance while AMD has failed to point out that it's stupid.
I guess you're new to all of this. See the GTX 580 housefires of just a few years ago, and the millions of AMD fan posts that claimed for years that power usage was the sole reason to buy only AMD. AMD cannot just say now, several GPU cores later, that power does not matter, when just a couple of years ago it was all that mattered according to those same fans -- and of course the pure angelic corporate soul of the honest and trustworthy AMD would never rebrand.
Yes, AMD is "competitive" because it is selling its products at a loss. Nvidia, meanwhile, makes lots of money on its products. You can't believe that a GPU as large as Hawaii costs the same to make as the new GM204, which is smaller and uses less power (and a smaller die also means fewer defective parts).
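A toy yield model makes the die-size point concrete; the defect density and the rounded die areas below are illustrative assumptions, not foundry data (Hawaii and GM204 are commonly cited at roughly 438 and 398 mm^2):

```python
import math

# Poisson yield model: a bigger die gives fewer candidates per wafer AND a lower yield.
def good_dies_per_wafer(die_mm2, wafer_diameter_mm=300, defects_per_cm2=0.15):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    candidates = wafer_area / die_mm2                            # ignores edge loss
    yield_fraction = math.exp(-defects_per_cm2 * die_mm2 / 100)  # die area in cm^2
    return candidates * yield_fraction

for name, area_mm2 in [("GM204-class (~400 mm^2)", 400), ("Hawaii-class (~440 mm^2)", 440)]:
    print(f"{name}: ~{good_dies_per_wafer(area_mm2):.0f} good dies per wafer")
```

Even a ~10% larger die compounds the cost: fewer die sites per wafer and a higher chance each die catches a defect, so cost per good chip rises faster than area.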
I would like to remind you that Nvidia is still selling mid-range GPUs as if they were top-performance parts, and has been since GCN was introduced. To compete with Nvidia's (real) top-class GPUs, AMD had to create a bigger GPU than it had ever made, abandoning the "small is good" mantra it pursued in the VLIW days. Just imagine what that does for AMD's cash position.
So, no, AMD is not competitive. And efficiency really counts. Not being able to offer the same performance per watt as Nvidia simply cuts them out of notebook design wins. They are left making notebooks with APUs that no one seems to appreciate at all. This also contributes to their constant loss of market share, income, and net revenue.
On the desktop, you may be as big an AMD fan as you like, but no sane buyer would really consider a 290X in place of a GTX 970 unless the 290X is really (but really) discounted. A 290X has no advantage over the GTX 970; it just uses more power and produces more heat and noise. And the discounted price is not helping AMD be "competitive" at all. It is just helping them lose even more money.
To chime in on the AMD/Nvidia GPU battle; there are other reasons people choose Nvidia over AMD besides marketing.
- Drivers (still more feature-complete, more stable, and more promptly updated in the Nvidia camp; faster profile updates after game releases, etc.)
- Software suite (GeForce Experience, ShadowPlay)
- In-game performance (Far Cry and AC come to mind)
And to say 'gamers don't mind noise or heat' is total nonsense. The enthusiast gamer has always sought the GPU that offers the best value for money, and 'silent gaming' is becoming bigger these days. Besides, NO ONE likes a noisy computer next to them, so being able to reduce that noise is always a big plus. The key to the success of the 970 is not just that - it offers reduced noise and heat IN ADDITION to being a very solid card. Basically you get those as free, additional perks compared to any AMD card.
A year ago you would be laughed at for skipping an R9 290 based on the amount of noise and heat, because hey, it was the fastest card at that price point and best at high res. It actually still is best at high res. But again, this is where AMD pushed something to market ahead of its time. How many people game at 4k?
As a person with relatively current cards from both camps, I feel I should chime in after your chime-in :0 The software suites have pretty much reached parity. I've suffered bad drivers from both teams, and thankfully all but one of those bugs have been quashed. I actually uninstalled GFE since it kept hanging while installing driver updates, though this has also been fixed in recent updates. As far as in-game performance, that's a pendulum that swings back and forth depending on what people decide to cherry-pick. I have no problem recommending either to my friends at this point. Get what you can afford and then stop reading performance charts, because nothing fuels an unneeded upgrade like regret lol.
As a person that owns and has tested both top GPUs from both companies, let me chime in and say that the driver situation at AMD is often times untenable. Just the other day, one of my GPUs "disappeared" from my system (hard reboot during the night or maybe a Windows Update needed to restart?), giving me a device error for the second GPU. (These are R9 280 cards, incidentally.) I thought that was weird, so I tried reinstalling the Omega drivers; no luck on getting the second GPU working.
So next I uninstalled and cleaned out all AMD drivers (DDU), restarted, installed the drivers... and now my AMD control panel says "AMD FirePro Control Center". Ha! The second card was still not recognized. So I powered off, removed the GPUs, installed them again, powered on... and now the GPU was back! But the drivers apparently didn't fully install, I guess, as I needed to run the Omega installer one last time to get things working.
So that was about two hours of my time wasted. Maybe the root cause is a slightly flaky GPU, but the fact that I now show "FirePro" is clearly a bug in the driver software somewhere. Or maybe it's not? I should try running some professional OpenGL benchmarks to see if somehow I've unlocked previously hidden OpenGL performance? Hahaha....
Final note: this is NOT the first time I've had issues with AMD drivers during just the past six months. I don't recall precisely what happened last time, but when I was testing Assassin's Creed Unity, at one point my R9 290X cards had some horrible behavior that basically required a similar sequence of troubleshooting.
@Vayra, agree across the board. We are talking maybe $15-20 difference in many cases with their product positioning based on performance alone, with higher premiums at the top end, but you get more perceived value.
AMD fans tend to be dismissive of many of these Nvidia features as gimmicks, or stupid, pointless etc. but in reality, you add up enough of these "gimmicks" and it just amounts to a better product.
And I'll be honest, after years of reading this downplaying nonsense in spaces like this, I do feel AMD fans shoulder a lot of the blame for shaping AMD's downfall. Instead of demanding better support and features and voicing their concerns, they seemed more interested in downplaying and criticizing what Nvidia was doing, thus enabling AMD and giving them an out in their approach to business.
But it looks like those days are (thankfully) coming to an end. Let's hope whoever buys AMD's GPU business takes a more entrepreneurial/innovative approach to the graphics market.
I agree, they are like enablers for drunks and abusive spouses. Any and every excuse and lie, and the reviewers were no better. A short while ago the dam of lies finally broke open, and no longer could every site beg, borrow, steal, and lie for AMD 100% of the time just to root for the underdog and "preserve competition" at "any cost". AMD rebranded, then pulled their $590 290 release, violating the "we would never do what evil nVidia does" manifesto of lies their fan base had been buying, duped, for a decade. "Hypocrite" rang in the fans' heads; they finally realized that all those years the "jerks" like me were desperately trying to wake them up, and that they had been made fools of by AMD. The fan base continues to this day, but the absolute Stalin-like loyalty cracked; it is no longer an unpardonable sin not to worship AMD.
It's amazing people still delude themselves that it comes down to just "better marketing". Nvidia also has better support and takes a more proactive approach to delivering new innovation and features.
People are willing to pay a premium for this, and while AMD's hardware has always been competitive, it is obvious an overwhelming majority are willing to pay a premium for the better end-user experience.
What AMD might have realized, far too late IMO, is that one major showstopping bug they don't fix or address can mean a customer lost for life, if that customer goes to the other side and gets a better end-user experience. Likewise, when they are constantly the reactive force in the market, people just aren't willing to wait 6-12 months (effectively half the life of some of these parts, with product life cycles of 18-36 months between half and full generations) when they can buy a solution that supports those features today.
Remember, this entire website, and the entire industry, screamed "housefires!" just a few years ago at the nVidia 500 series. It was a never-ending meme and constant theme, and the AMD fans never stopped crowing about it - power, power, power, electricity costs, over and over - yet the GTX 580 held the top spot for speed by a very large margin.
So AMD has itself to blame. It pumped the power-PR win down the throats of the industry and its fan base, making it a standing joke on nVidia... well, nVidia heard the ridicule and came back with the 600 series doing loads better, then the 700, and now the 900 series doing low-power miracles - while somehow AMD created 95C cores for gaming and hundreds of extra watts sucked down across CPU and GPU.
@eanazag, the difference is, Intel helps spur this kind of growth with design rebates, subsidies, and their own in-house ODM center. They got criticized for it, but the reason we see cheap Intel tablets and (some) phones is that Intel isn't waiting around hoping someone makes a design based on their product; they provide the sandbox and the resources for partners to create those devices.
AMD has really forgotten their roots. I understand their desire to focus on supposed high-growth areas, but they really should continue focusing on designing fast x86 cores. Who knows when the opportunity to pull another Athlon might come along? It might even be now, with Intel focusing on power instead of performance. Also, profit isn't just about growth. You can have increased profits without growth thanks to market leadership, and once that happens, your company prospers. Case in point: people replace devices every few years with new ones. Give them a top-performing AMD chip at a good price and they'll buy. It doesn't cost much to have a small engineering team working on improving your IPC; this team could even be separate from the APU team. In the race to the bottom, AMD is quickly going to discover that ARM is far too competitive for them to continue in these supposed high-growth areas. With x86, AMD has just one competitor; with ARM, they have dozens. P.S. I'm not an AMD fanboy, though I owned an Athlon X2, Athlon 64, and Duron back in the day. These days I run top-of-the-line Intel CPUs.
Regarding market leadership, what I was getting at with the previous comment is, even if the overall PC market isn't growing, selling users a $300 chip that costs $100 to make vs $150 for the previous generation results in increased profitability without growth.
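The arithmetic behind that, as a quick sketch with an assumed flat unit volume:

```python
# Same $300 selling price, cheaper-to-make successor: profit grows with zero unit growth.
units = 1_000_000                   # assumed flat unit volume
old_gross = (300 - 150) * units     # previous generation's cost structure
new_gross = (300 - 100) * units     # die-shrunk / cheaper successor
print(f"${old_gross/1e6:.0f}M -> ${new_gross/1e6:.0f}M gross profit "
      f"(+{new_gross/old_gross - 1:.0%}) on identical unit sales")
```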
I think you're being far too idealistic about anyone catching Intel on x86. NetBurst was a misstep, caused by the assumption that they could keep scaling clock speeds -- heck, Tejas was supposed to hit 10GHz (on AIR!) when it was being designed back in the earlier half of last decade. Then we discovered that power density becomes a huge hurdle at 45nm and below, and thus the wider-is-better approach and targeting power and efficiency rather than clock speed came about. AMD beat Intel because Athlon was competitive with the Pentium 3, and they basically iterated on that and added an integrated memory controller -- something you can only do once for a performance boost. With Intel now covering everything from high performance at higher power down to the maximum performance they can get at 4.5W, 15W, 28W, 35W, 45W, and 77W TDPs, there's not much of an opening left.
The only real weak spot in Intel's lineup is the one where they're fighting against ARM, not AMD, and AMD is basically just another ARM vendor there. I mean, really, if Intel can't get enough people to adopt their x86 Atom chips when they're practically giving them away, what chance does AMD have? Far more likely than AMD beating Intel at x86 is for ARM and others to make x86 irrelevant, but we're still a very long way (5+ years) from that happening. Smartphones and tablets are great, but there's still a massive portion of the business world that uses laptops and desktops running Windows.
And as someone notes below: fabs matter, a lot. Intel is at 14nm and the only one close to that is Samsung. Others are just getting 20nm going and AMD still hasn't moved any of their APUs to that process. 16nm HP should finally get out the door later this year (we hope), but it will be new and have issues at the start, just like 40nm, 28nm, and 20nm.
Don't get me wrong: I'd love to see a killer APU from AMD that truly surprises me with the performance, efficiency, and price. But even if they can match Intel on all of those fronts, Intel is very much able to drop price a lot if they have to. And since Intel is already in a cutthroat battle with ARM SoCs, they're not slowing down and giving AMD a chance to catch up.
I hold some hope for the next AMD CPU; when was it supposed to come, mid-2016 or something? They're in a bad spot now, being performance-competitive with an i3 at best, and even then losing on efficiency.
If AMD could catch up to being on par with a low-end i7, they could easily double their high-end prices.
I wonder why they're not releasing a version of the chip inside the PS4. A reasonably priced native 8-core should sell?
What's inside the PS4 is based on the Jaguar architecture. I can't speak to it directly, as I've only got two Bobcat-based boards, but the cores are weak. The GPU is apparently dandy, but it's hard to tell with the CPU cores being as weak as they are. One board I use for a fileserver, the other as a diagnostics/recovery machine, so for my purposes the weak performance doesn't much matter.
The issue is that these cores run fine in the PS4/XBone (each has a modified version of the same semi-custom design), but the consoles aren't saddled with the overhead of Windows. They might be lovely running a stripped-down Linux, but that's a niche of a niche market, and it's not going to make AMD money.
Closest you can get now are the socketed AM1 processors - they have one or two native quad-core SKUs. I'm looking at either a quad or dual in that line to replace my slowly dying fileserver, but I don't have any great hopes that the performance will be greatly better.
Well, a top-of-the-line Beema would be good for at least a 25-30% increase in single-threaded performance compared to Kabini; however, I don't think it'd be any faster - perhaps slightly slower? - at base clock speed.
The base clock for the top-of-the-line portable Beema is 2000MHz, versus 2050MHz or 2100MHz (on the ASUS AM1I-A motherboard) for the Athlon 5350, but the Beema can turbo to 2400MHz, something the Athlon 5350 can't do. The Beema also has an iGPU clocked at 800MHz versus 600MHz for the Athlon 5350, and its RAM can run at 1866MHz versus 1600MHz.
All these values are without overclocking.
So, YES, I would love to see AMD put a top-of-the-line Beema in the AM1 socket, as was supposed to happen but never did.
Surely having a wider core, even without CMT, would help? The shared FPU never worked for them (hence the comparatively poor gaming performance), the architecture worked best under significant load (not something you see that often on the desktop), and each module was power-gated rather than each core. The narrow width of each core meant higher clock speeds were needed, hence the high power consumption.
You can get away with thousands of cores in a GPU due to their parallel nature, and you can generally increase performance by adding more cores while dropping clock speeds. For most client workloads, though, it still makes sense to have fewer, stronger CPU cores. Maybe one day AMD's approach will start to make sense, but the software never truly came to their rescue. It's not something they cannot work their way out of, but we know how long new architectures take to come to market.
It's not exactly helping AMD now, though. Forcing weak CPU cores (though stronger than the previous consoles had) onto developers, thus pushing them to utilise as many cores as possible, is an admirable attempt at giving game programming a kick up the behind; but very few people have hex- or octo-core CPUs, and PC CPU cores are far more powerful than those in the consoles, so why cater to FX users? Besides which, those eight weak console cores have eight individual FPUs... and the FX8 has four shared FPUs (one per module). I don't see an advantage.
AMD has to stop demanding everyone else make them viable; all they get then is a premature abortion, and they're left lying there gutted and dead. AMD needs a man, a visionary, a leader, not a gutless coward who is all too quick to whine as a victim. At least with the 290X I can't think of any whiny victimhood associated with it.
Where I think AMD might have a chance is in reasonably priced notebook/desktop CPUs with reasonable performance to which you can attach discrete graphics cards, assuming Intel continues in the direction of doing away with PCIe lanes on all but high-end enthusiast/server-type CPUs. So if Intel cooperates by only making Xeon and Broadwell-like CPUs, AMD's market could be gaming laptops and smaller-than-full-size desktops targeted at gamers, all with (perhaps optional) discrete graphics.
Except that AMD's current CPU design is far too anemic to be used in a gaming machine. The MSI GX60 showed that pretty well, with the same GPU in an Intel laptop performing better in every scenario, sometimes at more than double the AMD system's speed. AMD hasn't replaced that chip either, since the FX-7600P is MIA. They have no hope of being in a gaming laptop anytime soon.
The fact that they expect better things to start coming only in the second quarter of 2015 means that they have nothing for this one, and probably only the 300-series GPUs sometime in Q2.
Probably a tweaked rebrand, 90% odds. To imagine that they have some core breakthrough is insane; if they did, they couldn't write the drivers for it. They are demoralized. A string of miracles in rapid succession is all they need: a core miracle, a power miracle, a market miracle, a driver miracle. On a shoestring staff and budget. I see the 290X as their last crowning achievement. It's fast, but hot and power-hungry, and the next one won't make it to the top IMO. It is now too hard - the core is already too big and too hot, and they are too gutted.
And you have to understand financial results. The write-down just means that in previous quarters they overestimated that value, and it is now being corrected to its real value. It was entirely expected; such operations make several quarters look good (with positive results) and just one quarter look bad. If you spread this loss across the other three positive quarters, they would all end up negative, making the company look bad every quarter instead of giving the investors some hope.
If you look at the individual results of each division, this extra write-down aside, you'll see that each of them did worse than last year. And 22% less revenue year-over-year surely can't help the financial performance. Next quarter looks to be even worse. Simply put, AMD isn't able to sell its products anymore, and that is what really counts for investors (who are indeed abandoning ship).
AMD has lost that money and the write-down is just recognizing it. The acquisitions and severance pay are a bit different, but the inventory write-down is basically saying: we underestimated our losses by overestimating the value of our inventory. They just hope the investors will be too dazzled by the genuine one-time write-offs to notice that their non-GAAP figures are misleading.
I am really sorry to see AMD in such a sad state. AMD is extremely important because they are the sole competitor to Intel and NVIDIA. A strong and profitable AMD that produces superior CPUs and GPUs means lower prices for those products for the end user.
The latest AMD CPUs have the single-threaded performance of Intel CPUs made in 2008, and the latest top-of-the-line Radeon GPUs produce so much heat and consume so much power that no really savvy person can choose them over the NVIDIA alternatives, even though the latter cost more; to make things worse, the Radeons offer no performance advantage to compensate.
AMD is in a very bad spot, and they need to do radical things if they want to stay in business.
"the latest top-of-the-line Radeon GPUs produce so much heat and consume so much power that no really savvy person can go for them over the NVIDIA alternatives, even if the latter cost more and to make things worse they offer no performance advantage."
Well, in my favourite PC game (Company of Heroes 2), the R9 290 is as fast as or faster than the GTX 970, and I got it for $259 Canadian versus $380 for the cheapest GTX 970 in Canada. So I went for the R9 290 (Gigabyte Windforce) and flashed the BIOS to 1050MHz in 5 minutes. The Windforce cooler is extremely effective and quiet; I don't hear the card at all in my CoolerMaster Storm Stryker case. So I'd say you're quite wrong about the R9 290 not being competitive, and yes, I'd say I'm a 'savvy' tech person, having built all my own PCs since 1995.
nVidia simply continues to rip people off and the worst part is that people seem to like it anyhow. If you want to bend over for Jen Hsun and his proprietary technologies that lock you in to higher prices, please be my guest, but don't try to put a false spin on AMD's competitiveness.
AMD's GPU and APU sales are stumbling, and AMD expects that to continue through Q2/Q3 2015. ASPs fell, and a build-up of inventory has forced yet another write-off. During the Q3 conference call, Devinder Kumar specifically pointed out the GPU division for its poor performance and attributed that to market dynamics and competition. Being relegated to the lower segments of the market, along with a heavy reliance on sales in Asia, has hammered them quite hard.
Are you sure it's him who's putting the false spin on AMD competitiveness?
AMD desperately needs new CPU and GPU microarchitectures. Primarily, they need uArchs tuned for efficiency and low cost while offering great performance. Right now, they've got bigger dies that consume more power, and they're forced to charge less money for them. All the while, they're losing market share and the eyes of the consumer.
It's the consoles that are keeping the company afloat at the moment, and presumably into 2015. The upside seems to be the EESC (enterprise, embedded, and semi-custom) space, where AMD has made some inroads via their APUs and embedded products, along with securing another couple of design wins.
Given the breakdown of their revenue, both expected and reported, it looks like AMD's products are viable mainly within low-margin markets with longer refresh cycles. While that can make for nice added (bonus?) revenue, it won't pay for the costly development of their CPU, GPU, and other microarchitectures.
Arguing that AMD's current products are competitive is farcical. They're competitive in certain segments and some aspects, sure, but as a whole they're suffering.
Yawn, cry me a river, Intel has been the only CPU game in town for 8 years and where has it gotten us? Oh right, more awesome CPUs every. single. year.
But yes, I can tell you're the clingy fanboy type. It's OK, I used to really like 3dfx too, until they went belly-up and I moved on - first to ATI, actually, but of course Nvidia has dominated since G80, making my decision to go with them and stay with them easy!
Chizow, you nVidiot, you need to watch this video to see how you're going to be spending at least $100 more for a G-Sync monitor than exactly the same FreeSync version:
If you're still determined to give nVidia your money, despite their clear and obvious attempts to rip you off, please go right ahead and bend right over.
It is just that G-Sync is totally optional, like Surround or 3D. The difference between G-Sync and FreeSync is not unlike doing physics: you can do it in software, or you can do it with dedicated hardware. FreeSync is a software solution, while G-Sync performs a similar function with custom hardware. Once both techs are available at retail, comparisons will be made, and the better implementation will become apparent.
I agree, once both hit the market it will be evident which is better, however, we do know G-Sync does everything Nvidia said it would because it has been on the market for over a year. That's going to be a tough act for AMD to follow. :)
Why would I need a video to confirm what PCPer and others have reported, that FreeSync isn't as good as G-Sync and the $100 premium is well worth it for something that is better and has already been on the market for months?
Oh, and I bought an ROG Swift at launch; I've been enjoying it for the last 5 months while AMD keeps its fanboys like you waiting with bated breath.
Let's hope FreeSync hits the market before AMD goes belly up, it may be their final gift to their devoted AMD fanboys like yourself. ;)
Everybody is entitled to an opinion. Calling somebody out for being a fanboy when you've done the same yourself is a tad hypocritical, and wears a bit thin after a while.
AMD are performance-competitive for the price, but they've had to cut prices quite a lot to stay that way. I would also like to court controversy here by saying that Mantle has probably had a larger impact on gaming than PhysX, though if we're going to be balanced and all that, AMD have never truly offered anything to compete with 3D Vision or GeForce Experience.
Nah, I unapologetically buy what's best, if that were AMD, I'd buy AMD. But they haven't been the best, not in a long time. So yes, I'm a fanboy of the best I can afford and in every case, that means passing on cheaper, less supported alternatives like AMD.
anubis44 is a massive AMD fanboy, however; I mean, he still thinks FX chips and CMT were a good idea! Yes, he is the rare AMD *CPU* fanboy as well. That takes skill, devotion.
Mantle was a dud, just another answer to a question that was never asked: "Hey, you know what would be great? Another API/platform from the minority stakeholder in the GPU business to further segment PC gaming!"
chizow, just to prove how 'massive' an AMD fanboy I am, I actually went out and built myself an Intel Core i5 4690K gaming rig, since all the benchmarks and Intel fanboys were screaming about the 'incredible' performance advantage of the i5 4690K over my FX-8350 from over 2 years ago. Guess what? The average FPS in my favourite game, COH2, at even the comparatively low resolution I play at (1680x1050), where the CPU should have been given even more opportunity to play a role, went from 46FPS to about 50FPS. And that's with the 4690K overclocked to 4.5GHz, just like my FX-8350. What a farce. Paying extra for the Intel CPU was a rip off, even though I got it for $229 on special. To make matters worse, when I transcode and rip CDs/DVDs and otherwise run multiple tasks, the Intel chip runs out of threads way earlier than the 8 core FX, which is just pathetic. And no, I wasn't about to spend over $300 to get a Core i7, which still only has 4 real cores and hyperthreading.
I'm still using the i5 4690K, but I might just give it to my father-in-law at some point as an upgrade for his ancient Core 2 Duo rig. I'm already starting to feel dirty using this overpriced Intel processor.
Oh, and please forgive me if I don't buy your BS, when virtually every site that has tested that game shows massive gains from a faster CPU. As usual, you're probably doing it wrong or, more likely, GPU-bound with something from AMD that was as slow as your 8350.
The Titan is the bottleneck for the 4770K, but it only becomes the bottleneck for the 8350 once that chip is at 4.5GHz. This may explain why anubis44 isn't noticing better performance - and why he could downclock his i5 and still get the same numbers, teehee. However, I must point out that CoH2 doesn't appear to use more than three CPU cores, so i5s and i7s are wasted on it in general, and the game isn't really optimised for FX CPUs considering it was compiled with ICC. I wonder how core parking would affect the FX results? The game also doesn't support multiple GPUs, so this really isn't a great game to extract performance data from.
Love the rage here on both sides. Obviously chizow is a huge Nvidia fanboy, although he denies that fact, and there are plenty of AMD fanboys too. Fanboyishness aside, AMD's graphics cards were on top until the 780 Ti, which was Nvidia attempting to compete with the new top R9s. "Mantle was a dud" - chizow. Mantle was a nice performance improvement when it came out, reducing CPU overhead, and it led to Nvidia following suit.
Funny how Nvidia is barely there. AMD was the better choice on power to performance; you can't compare and throw out the AMD competition when they are older architectures than the new Nvidia 9xx series (of course those are better). Just wait for the new R9 3xx cards, which are rumored to be using HBM memory to hugely improve memory bandwidth - something Nvidia also says they are looking into - which should allow for some nice performance gains as well. But hey, as soon as those come out, "Nvidia sucks, why would anyone ever buy it; it has never won" (as seen from @chizow's perspective of what AMD currently is). I would also like to quote you here: "Nah, I unapologetically buy what's best, if that were AMD, I'd buy AMD. But they haven't been the best, not in a long time."
And to finish off, AMD's HSA architecture shows a lot of promise. It's not entirely relevant with current software, but they are several years ahead of Intel in this field; Kaveri destroyed the i5 here with HSA.
The problem in this discussion is that people are really talking about two different things. How good a product is comes down to that catch-phrase: "There are no bad products, there are only bad prices." AMD has internalized that and prices its products competitively, so on performance per dollar they are usually competitive with, if not better than, the competition (either Nvidia or Intel). This is great for consumers, as it keeps price pressure on products and keeps the market competitive.
The problem for AMD is that they are not making money. If both Nvidia and AMD sell a $250 GPU, but it costs $120 to make for one (Nvidia, 55% gross margin; Intel is similar in CPUs) and $200 for the other (AMD, 29% gross margin), then one company is not doing as well as the other and will not be profitable. This can be acceptable in the short term, either to buy market share (as Intel is trying in mobile) or to get past a bad product until the next refresh, but it depends on having the financials to support it and to invest in R&D for new, more profitable products. Both Nvidia and Intel have done it, and both came out stronger in the next round. AMD has been in that mode for years.

They bet the farm on buying ATI and they are still paying for it. It cost them their fabs, which had become increasingly expensive to maintain (only a handful of companies have competitive fabs; even IBM divested theirs recently). This has led to Intel leading in process (currently 14nm against 28nm). In both GPUs and CPUs, we can also see that they missed the design shift their competitors went through: Intel has pushed toward lower power since the first Core architecture and has continued pushing lower with each iteration (15W -> 10W -> 8W -> 4.5W...). Similarly, while AMD was creating the biggest and most powerful GPU ever (R9 290X), Nvidia shifted toward ever more efficient architectures, culminating recently in Maxwell.
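For what it's worth, the margin math on the illustrative numbers above; note the dollar costs quoted are looser than the corporate gross margins they are meant to mirror:

```python
# Gross margin = (price - cost of goods) / price.
def gross_margin(price, cost):
    return (price - cost) / price

print(f"$250 part, $120 to build: {gross_margin(250, 120):.0%}")  # ~52%
print(f"$250 part, $200 to build: {gross_margin(250, 200):.0%}")  # 20%
# The corporate margins cited (55% vs 29%) would imply costs nearer $113 and $178
# on a $250 part; either way, the gap in profit per unit sold is enormous.
```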
AMD is in a really bad place: they are out-designed in both markets and out-fabbed in CPUs, with no cash reserves and a mountain of debt. Their hope is a new architecture in both markets, which is long and costly to develop, and new process nodes at their suppliers, which is out of their hands. Their one advantage, heterogeneous computing, depends on software, a domain where they have dropped the ball more often than not and usually rely on third-party development.
Basically, unless they change their mode of operation drastically, they cannot really hope to substantially beat their competition in either of their core markets.
lol and did you really cite HSA? Wow that is some oldskool AMD fanboy playbook right there. Yeah, just wait for this...wait for that...wait for underwhelm. Same story from AMD and their apologists, different day.
Except there might not be too many future days for AMD fanboys and apologists. Time to change that AMD inside t-shirt for the first time in a decade and go for that tattoo removal!
Kind of wish you could hear yourself; you are obviously a single-minded individual. You have tunnel vision about Intel and AMD and offer no evidence to support what you say. You just sit here and bash anything anyone says. You really need an attitude adjustment.
I do like Intel and Nvidia. AMD and Nvidia have gone back and forth on GPU power forever, and Intel makes the superior CPU. Obviously you only focus on whatever is currently on top and nothing else (ignoring AMD, who you are biased against). You offer no real helpful information either. I like AMD for the innovation they have pushed, and the fact that they take risks is not a bad thing. APUs were pushed by AMD, and Intel HD Graphics started to take off behind them. HSA is starting to mature, and hopefully software will follow; Intel will follow as well, I'm sure. Some of the benefits you get from Intel (an unimaginative but efficient company) have come from competition with AMD. Although you'll call me an AMD fanboy again and blow all of this off, so why am I even trying.
The only thing Mantle does is bring the Catalyst drivers to parity with the GeForce drivers in terms of CPU reliance. Traditionally (pre-Mantle), Radeons have been much more CPU-sensitive than their Nvidian counterparts (ref: http://www.overclock.net/t/1495236/amd-vs-nvidia-c... ). That means a weaker CPU would hurt the performance of a Radeon a lot more (up to 15%, per the reference link) than an equivalent Nvidia-based solution. The difference is that instead of making a fundamental change in the drivers to overcome that, AMD gave us "Mantle", a tacked-on driver mod which requires individual developer-level changes in every game instead of being automatically applicable universally.
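A toy model of that overhead argument; the per-call costs and the draw-call count are invented for illustration, and real drivers are far more complicated:

```python
# CPU time per frame ~= draw calls x per-call driver overhead. A thinner API shrinks
# the multiplier, which matters most on weak CPUs and draw-call-heavy scenes.
def frame_cpu_ms(draw_calls, overhead_us_per_call):
    return draw_calls * overhead_us_per_call / 1000

draws = 5000  # assumed draw-call-heavy scene
for label, overhead_us in [("thick driver path (assumed 40 us/call)", 40),
                           ("thin Mantle-style path (assumed 10 us/call)", 10)]:
    print(f"{label}: {frame_cpu_ms(draws, overhead_us):.0f} ms of CPU per frame")
```

On a fast CPU both paths may fit within the frame budget, which is why the benefit shows up mainly on slower CPUs, consistent with the numbers cited above.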
Yep exactly, Mantle is and always was a crutch for their lackluster GPUs, because it solved a problem that didn't exist with typical well-balanced builds that used reasonable GPUs with equally paired CPUs.
Either way, chizow, Mantle did offer higher frame rates on Intel CPUs as well as AMD CPUs. Again, you're ignoring the fact that it benefited both and was not just an AMD issue.
Apparently, DX12 development is older than Mantle. The real motivator behind the development of a low-level graphics API is the modern consoles, which proved the potential of low-level access. Once DX12, a much more comprehensive and universal solution, becomes available, Mantle will inevitably fade out, joining AMD's other short-lived marketing ploys (the awesomeness of DX10.1, Bulldozer's miraculous Win8 performance boost, etc.) in technological history.
Possibly, but MS has been using a low-level API since the original Xbox, and the next Windows would have been due for an API update, without a doubt. The fact that they were demonstrating the "Xbox One" a year before launch on Nvidia hardware and were able to demo DX12 at Build a few months after Mantle launched clearly shows they were already working on a lower-level API. Mantle's launch may have solidified their plans and forced them to speak about it sooner, maybe.
Considering that AMD has been part of the DX12 development team alongside MS, Intel, Nvidia, et al. from the start, and considering their past track record, it is easy to extrapolate that "Mantle" was AMD essentially trying to make a quick buck with something half-cooked before DX12 came out.
That is true, but this time around, unlike in the past, the hardware competing with MS's console (the PS4) has a significant hardware advantage, and MS is hoping to overcome that disadvantage by bringing the workings of the development platform (PC) closer to their console, for faster and higher-quality ports.
"don't try to put a false spin on AMD's competitiveness."
Did you actually even read the article? How does it square with your own seemingly lofty impression of AMD's competitiveness? AMD has been desperately trying to expand into new markets and product lines for years, all at the cost of cutting corners on the quality of their primary product lines. Now it's all finally catching up to them.
I'd argue that AMD's acquisitions and attempted divorce from GloFo have been contributing factors. You can't blame a company for trying to do something different when they're not competing as well in their core markets, but the SeaMicro acquisition has done... what, exactly?
Has quality actually slipped? If you're referring to AMD's GPUs getting bigger and more power-hungry, NVIDIA suffered the same fate for a time before Kepler. If you're referring to Bulldozer, that is partly a case of software not supporting the hardware as well as some bad design. When I read "quality", I equate it more with the reference designs of graphics cards, and whilst they offered a premium-looking product with the 295X2, they haven't cascaded it down the product stack like NVIDIA have. I can definitely agree with that. Are drivers as bad as some people say? I suppose it depends on which people you speak to and whether they have an agenda or not.
I don't think anybody can blame AMD for not getting many wins with their Cat cores either, especially with Intel deliberately forcing them out of the market, but you can blame them for overestimating how well Llano and Trinity would sell - that was poor. I am amused by the idea that Apple didn't go with AMD because they couldn't meet demand, but then overproduced APUs to the point they had to write down hundreds of millions of dollars worth of inventory.
Ugliest part for AMD: total stockholder equity went from $535 million to $187 million. If that hits zero, AMD is bankrupt; game over. And their CPU/GPU designs are only getting older, since they spend all their R&D "diversifying", and there are no new consoles in progress.
There are AMD fanboys who will still tell you the acquisition of ATI was a good idea, even when the point is made that AMD overpaid for ATI and that it was the beginning of the end for the combined AMD/ATI.
Will be a case study in every B-School in 5 years, if it isn't already.
One quarter of bad financials usually isn't a bad thing. What worries me most is their product line-up. They've basically thrown in the towel on their 'traditional' market, which I feel is going to end up being their biggest mistake. Their GPUs are much more competitive with nVidia than their CPUs are with Intel, which is good, but isn't enough. Do they have any planned follow-up to their Bulldozer architecture? I understand they probably aren't capable of taking Intel on in the highest performance segments, but their current desktop (and by extension server) products are barely scraping by in mid-range performance segments WHILE guzzling power like a hot rod from the 1970s! The move to ARM is extremely risky, as they are late to the party. I think it's a necessary step, since devices that run ARM are so prevalent, but I don't think they're going to make any serious money here. There are a ton of competitors in this space (Apple, Qualcomm, nVidia), and I just don't see them bringing anything really amazing to the table. Anyone feel free to correct me if I'm wrong, though. And my question about the follow-up to Bulldozer is a serious one: do they have any high-end desktop CPUs in the works?
They have K12 and Zen planned for release some time in 2016 - ARM and x86 respectively. Just what segments of the market these chips will aim for is unknown. Both are currently just names tossed about, on which AMD has pinned the future of the company. Any guess beyond which ISA each uses is speculation.
OK, thanks for the info. I had read that K12 was planned but found no specific info on it, and I have no idea whether it's an attempt to regain some competitiveness. 2016 is still a long way out.
I would pin hopes on Zen being a high performance architecture that can take on Skylake. It might not be wishful thinking if Intel sacrifices too much performance going for low power.
I have little hope for K12. IMO AMD is out of its mind going for ARM stuff - they're not good at low power to start with. K12 might have a fighting chance against Intel Atom, but likely not against Qualcomm Krait. If they're going for high performance, ARM comes with a boatload of problems, like lack of software support from desktop/server applications. I haven't heard of anyone picking up ARM servers, and Windows RT (ARM's best shot at real productivity machines so far) didn't get far because no one bothered to port stuff over from x86.
@Creig and others who insisted 3 months ago in the GPU buyer round-up that price was the only thing that mattered: I guess the market has spoken (again), and you were wrong (again).
They have separated the earnings in a way that might make Wall Street happy, but it will distort how they invest in R&D and cause more issues later on. They need better designs, or there will be no custom work for them.
I like AMD, but... R&D expenditure in Q4 2002 was $244.85 million; twelve years later, in Q4 2014, it was $238 million. Over the same period, Nvidia raised R&D from $57 million to $340 million, and Qualcomm from $112 million to nearly $1.4 billion.
AMD fanboys unite: chant this every day for 30 days, "Die AMD Die." After AMD dies, let Intel fanboys bask in the glory of a $6000 Intel Celeron for the CPU, $400 for the cardboard packaging box, $2000 for the stock fan without heatsink, $2600 for the heatsink without fan, and last but not least, $1 per 1mm of TIM.
Wreckage - Tuesday, January 20, 2015 - link
The overall PC market did very well last quarter. Console sales should have been highest in the 4th quarter. Yet AMD still lost 1/3 of a billion. They did nothing to reduce their debt and have not released anything of relevance in a while. 3 top executives just left the company and the stock has lost nearly half of it's value in a year.eanazag - Tuesday, January 20, 2015 - link
When did the stock have value?Nam Me - Tuesday, January 20, 2015 - link
Good question.. w/ answer...JumpingJack - Tuesday, January 20, 2015 - link
AMD traded around 4.50-4.55 through the middle of July last year, today they are trading around 2.24-2.30 (today's trading range), that is roughly half the value within the last 12 month period.mfinn999 - Wednesday, January 21, 2015 - link
Several years ago I was such an AMD fanboy I bought 100 shares of their stock at $21/share. THAT turned out well./sarcasm
maglito - Wednesday, January 21, 2015 - link
Ouch $21/share! I've got serious AMD red spot in my portfolio too.AMD has a huge success story on it's hands with Carrizo if it can get it to market at least within the niche HTPC segment. Right now the only way to get 18Gbps HDMI 2.0 out of a PC is with Nvidia 980/970 $$$ video cards. AMD or a partner can offer a NUC or microITX all in one board/box for less than 1/2 the price of the Nvidia video card alone. There are lots of Ultra HD / 4k TVs that can support a proper 60Hz signal via HDMI 2.0 (almost no TVs have display port) and no reasonable HTPC solution currently on the market.
AMD, get Carrizo out in NUC or (complete on board) MicroITX ASAP!!! The window of opportunity is closing fast.
medi03 - Saturday, January 24, 2015 - link
Dear god,please bless notebook manufacturers using AMD's APUs.
I couldn't care less about "single threaded CPU performance" of Intel chips, my Excel and Word are fast enough for many years, but buying an Intel notebook with discrete GPU really hurts my wallet.
Amen.
BlueBlazer - Saturday, January 24, 2015 - link
Single threaded performance is one factor that many mobile CPUs should have as it meant that the core has very high efficiency per clock as typically most mobile CPUs run at very low clock frequencies.Laptops with discrete mobile GPU are usually meant for some gaming and of course costs much more. Just take a look at gaming laptops such as Alienware for example. Meanwhile APUs are usually rarely adequate for games on mobile laptop platforms.
Nowwhat - Wednesday, February 4, 2015 - link
APU 's are best suited for game consoles, cheap low-midrange laptops & PC's, servers, NAS, tablets and smartphones if AMD can lower the TDP and enhance the performance of the APU 's. Maybe a little more cache memory (L3 Cache) and some GDDR5 memory... ?AMD could also integrate their APU 's with their high-end video-cards and offer a multiple APU system (Quad-CrossFireX) on cheap motherboards without CPU's, memory-banks and a north-bridge.
FlushedBubblyJock - Sunday, February 15, 2015 - link
A desperation hole prayer will change nothing, but it was a delight to see. ROFL thank youFlushedBubblyJock - Sunday, February 15, 2015 - link
maglito, please stop feeding the fantasy, there is no window, there's not even an unbent spoon, amd Neo isn't going to hack into the Matrix and send out the bull market signalit's over, it was over years and years ago, the white rabbit is dead, no, you're not a hero with a plan
medi03 - Saturday, January 24, 2015 - link
Last time AMD stocks cost 21$ or more was in 2006, nine years ago.BlueBlazer - Saturday, January 24, 2015 - link
Then AMD diluted the stock under financial pressure, and that's when its value fell down dramatically.FlushedBubblyJock - Sunday, February 15, 2015 - link
I tried to warn you, I was shunned and ridiculed and hated.BlueBlazer - Wednesday, January 21, 2015 - link
Console chips are sold in the 3rd quarter in preparation for 4th quarter holiday sales, that is when they mentioned their peak revenues. Also console chips have very low profit margins (under 20%), even from the gross margin side (almost little or no net profit at all).boozed - Tuesday, January 20, 2015 - link
Do you really expect Skylake this year? Broadwell isn't even fully out of the door yet.dragonsqrrl - Tuesday, January 20, 2015 - link
According to Intel Skylake's schedule was not effected much by the 14nm delays and is still on track for late Q3/early Q4 2015. Whether this only applies to Skylake Y is still yet to be seen. Otherwise it wouldn't leave much room for Broadwell on LGA1150. Broadwell K for LGA2011 is scheduled to launch at around the same time as Skylake.nofumble62 - Tuesday, January 20, 2015 - link
You meant it could have been worse for AMD.Vayra - Wednesday, January 21, 2015 - link
It probably, unfortunately will become worse for AMD this year. Every time AMD produces slides such as the one above I am baffled by their late response to, well basically everything that happens around them. It's almost like every time they suffer a loss somewhere, their response is triggered after the fact, never before.The above slide; this year they had 40% revenue from 'growth markets'. Revenue is all great, being on growth markets is awesome, but where is the profit? When will AMD reap the benefits of what it is doing? Answer: they never really do. When the Bitcoin market exploded and everyone wanted to start mining they had a big rise in sales but hardly capitalized on that. They launch new products with an ever decreasing profit margin (APU, GPU) or a very tight one to begin with (semi-custom APU as in the X1/PS4). This is all revenue, but little profit.
AMD is very good at keeping itself busy, but very bad at turning it into profit.
FlushedBubblyJock - Sunday, February 15, 2015 - link
They should hire Mitt Romney as CEO, but I suspect the fan base would off themselves if they did, and never see the big wins that would result.witeken - Wednesday, January 21, 2015 - link
At their earnings call a few days ago, Brian Krzanich said Skylake isn't impacted by Broadwell because it's simply a new architecture, so no new fabs needed, while the new architecture will bring new functionality and performance, so of course releasing Skylake as early as possible is a no-brainer.eanazag - Tuesday, January 20, 2015 - link
The major PC manufacturers do a poor job of creating the best products for AMD, hence no sales. There is an opportunity to make a decent laptop in the medium price range; good luck finding an AMD product with a good screen and an SSD. Cheap Windows gaming tablets that are not as power-optimized during normal use would be another opportunity - gaming battery life would likely be about the same as Intel's, since the 'wells have done most of their saving at idle. There is no reason there couldn't be one less Intel Atom based tablet, or a version equipped with an AMD APU.

The rest of the issues are on AMD management, because if you look at Nvidia, they are thriving. All AMD needed to do was copy half of Nvidia's corporate plan to be profitable. Clearly something is wrong. They shouldn't be this absent from the market with their available products. Desktop traditional CPUs should suck in sales because their offerings are 3 years old without even a die shrink. Apple did a die shrink on the A5 in the iPad 2 and it resulted in better battery life.
Nam Me - Tuesday, January 20, 2015 - link
Echo!... Echo!... well said...
melgross - Tuesday, January 20, 2015 - link
AMD does a poor job of producing products that manufacturers want and need, or that users want or need. This is their fault, not the PC market's or the PC manufacturers' fault. If what you said were true, it would have affected Intel just as much, but they had a record year.
Frenetic Pony - Tuesday, January 20, 2015 - link
They're still relatively competitive with Nvidia in the GPU market, though Nvidia has better marketing, and while 2013 was AMD's turn to take the price-for-performance crown, 2014 was Nvidia's turn. If they can turn it around this year with actual marketing, they should be profitable again this year.

The real problem, of course, is their CPU architecture. Bulldozer goes down as one of the biggest misfires in history on multiple counts. Meanwhile, their low-power/mobile designs are good from a CPU perspective, but don't have the fully integrated SoCs and other stuff manufacturers actually want from that category.
A lot - as in, the company - will ride on their new CPU architecture out next year being competitive. Good marketing could help too: Nvidia is no better than AMD in most ways for GPUs, but talk to the average person and it's Nvidia all the way, somehow. Gamers gushed over the new 980/970 TDP even though that's a totally irrelevant number to gamers; somehow Nvidia managed to sell it to them as a selling point anyway. Which, good on them, and bad on AMD.
JarredWalton - Wednesday, January 21, 2015 - link
TDP is not irrelevant. When I fire up two R9 280 or R9 290X GPUs compared to two GTX 970 GPUs, the difference can be felt (heat from my PC) and heard (noise from the fans). Maybe it's my cards, but all my 280 and 290X cards are retail GPUs I bought, not engineering samples.

I could find quieter 290X options now, but a year ago when I got these it was pretty much all blowers. Also, I can't close up my case without overheating problems with the 290X cards -- I need a bigger tower with more fans I guess, or just run with the side off. Even the GTX 770 and 780 GPUs I have run substantially quieter than the R9 cards I have, with basically similar performance.
It's a really bad place to be in for AMD. For similar performance, NVIDIA has GTX 970 that uses nearly 100W less per GPU under load, or for better performance there's the GTX 980 that uses 75W less under load but costs more. At idle it doesn't matter as much, but you don't buy gaming GPUs because you want to compare idle values. By the time AMD ships their next GPU, NVIDIA will likely have their "Big Maxwell" GM210 or whatever waiting in the wings if it's needed; if it's not, we'll get GTX 980 Ti to tide us over until the fall.
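(For a rough sense of what that ~100W load-power gap means in running costs, here is a quick back-of-the-envelope Python sketch; the daily gaming hours and electricity rate below are illustrative assumptions, not figures from this thread.)

    # Back-of-the-envelope cost of a ~100W load-power difference between GPUs.
    # All inputs are assumptions for illustration, not measured figures.
    watts_delta = 100        # approximate load-power gap per GPU cited above
    hours_per_day = 3        # assumed daily gaming time
    usd_per_kwh = 0.12       # assumed electricity rate
    kwh_per_year = watts_delta / 1000 * hours_per_day * 365
    print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * usd_per_kwh:.0f}/year per GPU")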
Frenetic Pony - Thursday, January 22, 2015 - link
Fans can be made less noisy, and you aren't sitting next to your PC; unless you are in a closed-off room with an SLI rig sweating bullets, it doesn't matter. But the fact that you have such an extreme case of cognitive dissonance that you're willing to argue it shows just how effective Nvidia's PR is versus AMD's. You, and most people, will yell to the heavens to defend an irrelevance because NVIDIA sold you on its importance while AMD has failed to point out that it's stupid.
FlushedBubblyJock - Sunday, February 15, 2015 - link
I guess you're new to all of this. See the GTX 580 housefires of just a few years ago, and the millions of AMD fan posts that claimed for years power usage was the sole reason to buy only AMD.

AMD cannot just say now, several card generations later, that power does not matter, when just a couple of years ago it was all that mattered according to those same fans -- and, of course, to the pure angelic corporate soul of the honest and trustworthy AMD, who would never rebrand.
CiccioB - Wednesday, January 21, 2015 - link
Yes, AMD is competitive because it is selling its products at a loss. nvidia, meanwhile, makes lots of money with its products.
You can't think that a GPU as large as Hawaii costs the same to make as the new GM204, which is smaller and uses less power (which also means fewer defective parts).
I would like to remind you that nvidia has been selling mid-range GPUs as if they were top-performance ones ever since GCN was introduced. To compete with nvidia's (real) top-class GPUs, AMD had to create bigger GPUs than it had ever made (abandoning the "small is good" mantra it pursued in the VLIW days). Just imagine how well this architecture works out for AMD's cash.
So, no, AMD is not competitive. And efficiency really counts: not being able to offer the same performance per watt as nvidia simply cuts them out of notebook design wins. They just end up in notebooks with APUs that no one seems to appreciate at all. This, too, contributes to their constant loss of market share, income and net revenue.
On desktop, you can be as much of an AMD fan as you like, but no mentally sane human being would really consider a 290X in place of a GTX 970 unless the 290X is really (but really) discounted. A 290X has no advantages over the GTX 970; it just uses more power and produces more heat and noise. And the discounted price is not helping AMD at all in being "competitive". It is just helping it lose even more money.
Vayra - Wednesday, January 21, 2015 - link
To chime in on the AMD/Nvidia GPU battle: there are other reasons people choose Nvidia over AMD besides marketing.
- Drivers (still more feature-complete, more stable, more quickly updated, faster profile updates after game releases, etc. in the Nvidia camp)
- Software suite (GFE, Shadowplay)
- Ingame performance (Far Cry, AC come to mind)
And to say 'gamers don't mind noise or heat' is total nonsense. The enthusiast gamer has always sought the GPU that offers the best value for money, and 'silent gaming' is becoming bigger these days. Besides, NO ONE likes a noisy computer next to them, so being able to reduce that noise is always a big plus. The key to the success of the 970 is not just that it's a very solid card - it offers reduced noise and heat IN ADDITION to that. Basically, you get those as free, additional perks compared to any AMD card.
A year ago you would be laughed at for skipping an R9 290 based on the amount of noise and heat, because hey, it was the fastest card at that price point and best at high res. It actually still is best at high res. But again, this is where AMD pushed something to market ahead of its time. How many people game at 4k?
siberus - Wednesday, January 21, 2015 - link
As a person with relatively current cards from both camps, I feel I should chime in after your chime-in :0 The software suites have pretty much reached parity. I've suffered bad drivers from both teams, and thankfully all but one of them have been quashed. I actually uninstalled GFE since it kept hanging while installing driver updates, though this has also been fixed in recent updates. As far as in-game performance, that's a pendulum that swings back and forth depending on what people decide to cherry-pick. I have no problem recommending either to my friends at this point. Get what you can afford and then stop reading performance charts, because nothing fuels an unneeded upgrade like regret lol.
JarredWalton - Wednesday, January 21, 2015 - link
As a person who owns and has tested top GPUs from both companies, let me chime in and say that the driver situation at AMD is oftentimes untenable. Just the other day, one of my GPUs "disappeared" from my system (a hard reboot during the night, or maybe a Windows Update needed to restart?), giving me a device error for the second GPU. (These are R9 280 cards, incidentally.) I thought that was weird, so I tried reinstalling the Omega drivers; no luck getting the second GPU working.

So next I uninstalled and cleaned out all AMD drivers (DDU), restarted, installed the drivers... and now my AMD control panel says "AMD FirePro Control Center". Ha! The second card was still not recognized. So I powered off, removed the GPUs, installed them again, powered on... and now the GPU was back! But the drivers apparently didn't fully install, as I needed to run the Omega installer one last time to get things working.
So that was about two hours of my time wasted. Maybe the root cause is a slightly flaky GPU, but the fact that I now show "FirePro" is clearly a bug in the driver software somewhere. Or maybe it's not? I should try running some professional OpenGL benchmarks to see if somehow I've unlocked previously hidden OpenGL performance? Hahaha....
Final note: this is NOT the first time I've had issues with AMD drivers during just the past six months. I don't recall precisely what happened last time, but when I was testing Assassin's Creed Unity, at one point my R9 290X cards had some horrible behavior that basically required a similar sequence of troubleshooting.
chizow - Wednesday, January 21, 2015 - link
@Vayra, agreed across the board. We are talking maybe a $15-20 difference in many cases, with their product positioning based on performance alone and higher premiums at the top end, but you get more perceived value.

AMD fans tend to dismiss many of these Nvidia features as gimmicks, or stupid, pointless, etc., but in reality, add up enough of these "gimmicks" and it just amounts to a better product.
chizow - Wednesday, January 21, 2015 - link
And I'll be honest: over the years of reading this downplaying nonsense in spaces like this, I do feel AMD fans shoulder a lot of the blame for shaping AMD and its downfall. Instead of demanding better support and features and voicing their concerns, they seemed more interested in downplaying and criticizing what Nvidia was doing, thus empowering AMD and giving it an out in its approach to business.

But it looks like those days are (thankfully) coming to an end. Let's hope whoever buys AMD's GPU business takes a more entrepreneurial/innovative approach to the graphics market.
FlushedBubblyJock - Sunday, February 15, 2015 - link
I agree, they are like enablers for drunks and abusive spouses. Any and every excuse and lie, and the reviewers were no better. A short while ago the dam of lies finally broke open, and no longer could every site beg, borrow, steal and lie for AMD 100% of the time, just to root for the underdog and "preserve competition" at "any cost".

AMD rebranded, then pulled their $590 290 release, violating the "we would never do what evil nVidia does" manifesto of lies their fan base had bought, duped as daisies, for a decade.
"Hypocrite" rang in the fans' heads; they finally realized that all those years "jerks" like me were desperately trying to wake them up, and that they had been made fools of by AMD.
So the fan base continued, and does to this day, but the absolute Stalin-like loyalty cracked; it was no longer an unpardonable sin not to worship AMD.
chizow - Wednesday, January 21, 2015 - link
It's amazing people still delude themselves that it comes down to just "better marketing". Nvidia also has better support and takes a more proactive approach to delivering new innovations and features.

People are willing to pay a premium for this, and while AMD's hardware has always been competitive, an overwhelming majority are obviously willing to pay a premium for the better end-user experience.
What AMD might have realized, far too late imo, is that one major showstopping bug they don't fix or address can mean a customer lost for life, if that customer goes to the other side and gets a better end-user experience. And when AMD is constantly the reactive force in the market, people just aren't willing to wait 6-12 months (effectively half the life of some of these parts, when product life cycles run 18-36 months between half and full generations) when they can buy a solution that supports those features today.
FlushedBubblyJock - Sunday, February 15, 2015 - link
Remember, this entire website, and the entire industry, screamed "housefires!" just a few years ago with the nVidia 500 series... It was a never-ending meme and constant theme, and the AMD fans never stopped crowing about it - power, power, power, electricity costs, over and over - yet the GTX 580 held the top spot for speed by a very large margin.

So AMD has itself to blame for pumping the power PR win down the throats of the industry and its fan base, making it a standing joke on nVidia... well, nVidia heard the ridicule and came back with the 600 series doing loads better, then the 700 and now the 900 series doing low-power miracles - while somehow AMD created 95C cores for gaming, with hundreds of extra watts sucked down across CPU and GPU.
chizow - Wednesday, January 21, 2015 - link
@eanazag, the difference is that Intel helps spur this kind of growth with design rebates, subsidies, and their own in-house ODM center. They got criticized for it, but the reason we see cheap Intel tablets and (some) phones is that Intel isn't waiting around and hoping someone makes a design based on their product; they provide the sandbox and resources for partners to create those devices.
FlushedBubblyJock - Sunday, February 15, 2015 - link
LOL - you made me realize that other guy just blamed the motherboard makers and vendors - hahahahhahaha - it's never AMD's fault.
betam4x - Tuesday, January 20, 2015 - link
AMD has really forgotten its roots. I understand the desire to focus on supposedly high-growth areas, but they really should keep focusing on designing fast x86 cores. Who knows when the opportunity to pull another Athlon might come along? It might even be now, with Intel focusing on power instead of performance. Also, profit isn't just about growth: you can have increased profits without growth thanks to market leadership. Once that happens, your company prospers. Case in point: people replace devices every few years with new ones. Give them a top-performing AMD chip at a good price and they'll buy. It doesn't cost much to have a small engineering team working on improving your IPC. This team could even be separate from the APU team. In the race to the bottom, AMD is quickly going to discover that ARM will be far too competitive for them to continue in these supposedly high-growth areas. With x86, AMD has just one competitor; with ARM, they have dozens. P.S. not an AMD fanboy, though I owned an Athlon X2, Athlon 64, and Duron back in the day. These days I run top-of-the-line Intel CPUs.
betam4x - Tuesday, January 20, 2015 - link
Regarding market leadership, what I was getting at with the previous comment is: even if the overall PC market isn't growing, selling users a $300 chip that costs $100 to make, versus $150 for the previous generation, results in increased profitability without growth.
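(A minimal Python sketch of that per-chip arithmetic, using only the hypothetical prices and costs from the comment above:)

    # Per-chip profit and gross margin at a constant $300 selling price,
    # using the hypothetical costs from the comment above.
    price = 300.0
    for generation, cost in (("previous gen", 150.0), ("new gen", 100.0)):
        profit = price - cost
        print(f"{generation}: ${profit:.0f} profit/chip, {profit / price:.0%} gross margin")
    # previous gen: $150 profit/chip, 50% gross margin
    # new gen: $200 profit/chip, 67% gross margin -- more profit on flat unit sales

JarredWalton - Wednesday, January 21, 2015 - link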
I think you're being far too idealistic about anyone catching Intel on x86. NetBurst was a misstep, caused by the assumption that they could keep scaling clock speeds -- heck, Tejas was supposed to hit 10GHz (on AIR!) when it was being designed back in the earlier half of last decade. Then we discovered that power density becomes a huge hurdle at 45nm and below, and thus the wider-is-better approach, targeting power and efficiency rather than clock speed, came about. AMD beat Intel because Athlon was competitive with the Pentium 3, and they basically iterated on that and added an integrated memory controller -- something you can only do once for a performance boost. With Intel now doing high performance at higher power, plus the maximum performance they can get in the 4.5W, 15W, 28W, 35W, 45W, and 77W TDP ranges, there's not much of an opening left.

The only real weak spot in Intel's lineup is the one where they're fighting against ARM, not AMD, and AMD is basically just another ARM vendor there. I mean, really, if Intel can't get enough people to adopt their x86 Atom chips when they're practically giving them away, what chance does AMD have? Far more likely than AMD beating Intel at x86 is ARM and others making x86 irrelevant, but we're still a very long way (5+ years) from that happening. Smartphones and tablets are great, but there's still a massive portion of the business world that uses laptops and desktops running Windows.
And as someone notes below: fabs matter, a lot. Intel is at 14nm and the only one close to that is Samsung. Others are just getting 20nm going and AMD still hasn't moved any of their APUs to that process. 16nm HP should finally get out the door later this year (we hope), but it will be new and have issues at the start, just like 40nm, 28nm, and 20nm.
Don't get me wrong: I'd love to see a killer APU from AMD that truly surprises me with the performance, efficiency, and price. But even if they can match Intel on all of those fronts, Intel is very much able to drop price a lot if they have to. And since Intel is already in a cutthroat battle with ARM SoCs, they're not slowing down and giving AMD a chance to catch up.
Jamor - Wednesday, January 21, 2015 - link
I hold some hope for the next AMD CPU - when was it supposed to come, mid-2016 or something? They're in a bad spot now, being performance-competitive with the i3 at best, and even then losing on efficiency.

If AMD could catch up to being on par with a low-end i7, they could easily double their high-end prices.
I wonder why they're not releasing a version of the chip inside the PS4.
A reasonably priced native 8-core should sell, no?
fluxtatic - Wednesday, January 21, 2015 - link
What's inside the PS4 is based on the Jaguar architecture. I can't speak to it directly, as I've only got two Bobcat-based boards, but the cores are weak. The GPU is apparently dandy, but it's hard to tell with the CPU cores being as weak as they are. One I use for a fileserver, the other as a diagnostics/recovery machine, so power doesn't much matter for me.

The issue is that they run fine for the PS4/XBone (each has a modified version of the same semi-custom design), but those aren't saddled with the overhead of Windows. They might be lovely running a stripped-down Linux, but that's a niche of a niche market, and it's not going to make AMD money.
The closest you can get now are the socketed AM1 processors - they have one or two native quad-core SKUs. I'm looking at either a quad or a dual in that line to replace my slowly dying fileserver, but I don't have high hopes that the performance will be much better.
silverblue - Wednesday, January 21, 2015 - link
Well, a top-of-the-line Beema would be good for at least a 25-30% increase in single-threaded performance compared to Kabini; however, I don't think it'd be any faster - perhaps slightly slower? - at base clock speed.
AJSB - Wednesday, January 21, 2015 - link
Base clock for the top-of-the-line Beema for portables is 2000MHz, versus 2050MHz or 2100MHz (with the ASUS AM1I-A MoBo) for the Athlon 5350, but the Beema can turbo to 2400MHz, a thing the Athlon 5350 can't... and the Beema has an iGPU clocked at 800MHz vs 600MHz for the Athlon 5350, and its RAM can run at 1866MHz vs 1600MHz. All these values are w/o OC.
So, YES, I would love to see AMD put a top-of-the-line Beema in the AM1 socket, as was supposed to happen but never did.
FlushedBubblyJock - Sunday, February 15, 2015 - link
Beema and Kabini? They even sound like cheap kiddie knock-off cartoon cars.
Klimax - Wednesday, January 21, 2015 - link
The Xbox One runs a specialized Windows, and IIRC it all sits under a Hyper-V hypervisor...
silverblue - Wednesday, January 21, 2015 - link
Surely having a wider core, even without CMT, would help? The shared FPU never worked for them (hence the comparatively poor gaming performance), the architecture worked best under significant load (not something you'd see that often on the desktop), and each module was power-gated as opposed to each core. The narrow width of each core meant higher clock speeds were needed, hence the high power consumption.

You can get away with thousands of cores in a GPU due to their parallel nature, and you can generally increase performance by adding more cores whilst dropping clock speeds. Right now, it still makes sense to have fewer, stronger CPU cores for most client workloads. Maybe one day AMD's approach will start to make some sense, but the software never truly came to their rescue. It's not something they cannot work their way out of, but we know how long architectures can take to come to market.
Jamor - Wednesday, January 21, 2015 - link
I'd hazard a guess that games, at least, will soon start to use multiple cores effectively, as that's the one path available to PS4 and XBOne developers.
silverblue - Wednesday, January 21, 2015 - link
It's not exactly helping AMD now, though. Forcing weak (though stronger than consoles had previously) CPU cores onto developers, thus attempting to force them to utilise as many cores as possible, is an admirable attempt at giving game programming a kick up the behind, but very few people have hex- or octa-core CPUs, and PC CPU cores are far more powerful than those in the consoles, so why cater for FX users? Besides which, those eight weak console cores have eight individual FPUs... and the FX8 has four module-shared FPUs. I don't see an advantage.
FlushedBubblyJock - Sunday, February 15, 2015 - link
AMD has to stop demanding everyone else make them viable; all they get then is a premature abortion, and there they lie, gutted and dead. AMD needs a man, a visionary, a leader - not a gutless coward who is all too quick to whine as a victim.

At least with the 290X I can't think of any whiny victimhood associated with it.
Ktracho - Wednesday, January 21, 2015 - link
Where I think AMD might have a chance is in reasonably priced notebook/desktop CPUs with reasonable performance to which you can attach discrete graphics cards, assuming Intel continues in the direction of doing away with PCIe lanes on all but high-end enthusiast/server-type CPUs. So if Intel cooperates by only making Xeon and Broadwell-like CPUs, AMD's market could be gaming laptops and smaller-than-full-size desktops targeted at gamers, all with (perhaps optional) discrete graphics.
TheinsanegamerN - Monday, February 9, 2015 - link
Except that AMD's current CPU design is far too anemic to be used in a gaming machine. The MSI GX60 showed that pretty well, with the same GPU in an Intel laptop performing better in every scenario, sometimes at more than double the AMD system's speed. AMD hasn't replaced that chip either, since the FX-7600P is MIA. They have no hope of being in a gaming laptop anytime soon.
5150Joker - Wednesday, January 21, 2015 - link
That's all good in theory, but where is AMD supposed to get the fabs?
Guspaz - Wednesday, January 21, 2015 - link
They had them. They decided to get rid of them.
yannigr2 - Wednesday, January 21, 2015 - link
The fact that they expect better things to start coming only in the 2nd quarter of 2015 means that they have nothing for this one, and probably only the 300-series GPUs sometime in Q2.
silverblue - Wednesday, January 21, 2015 - link
There are rumours of a new graphics product, but only in March, which wouldn't really affect Q1.
FlushedBubblyJock - Sunday, February 15, 2015 - link
Probably a 90%-tweaked rebrand. To imagine that they have some core breakthrough is insane.
If they did they couldn't write the drivers for it.
They are demoralized.
A string of miracles in rapid succession is all they need.
Core miracle.
Power miracle.
Market miracle.
Driver miracle.
On a shoestring staff and budget.
I see the 290X as their last crowning achievement.
It's fast, but hot and power hungry, and the next won't make it to the top IMO.
It is now too hard - the core is already too big and too hot, and they are too gutted.
ET - Wednesday, January 21, 2015 - link
AMD didn't actually lose that money; it's just a write-down. Regarding consoles, see the 4th paragraph from the end. The short of it is, you've got to read what's being written.
CiccioB - Wednesday, January 21, 2015 - link
And you have to understand financial results.

The write-down just means that in previous quarters they overestimated a value that has now been corrected to its real value.
It was expected, as such operations make some quarters look good (with positive gains) and just one quarter look bad. If you diluted this loss across the other 3 positive quarters, they would all end with negative results, making the company look bad every quarter instead of giving investors some hope.
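(A small Python sketch of that point - the quarterly results and charge below are purely hypothetical, to show the idea, not AMD's actual figures:)

    # How a one-time write-down concentrates the pain in a single quarter.
    # The quarterly results and the charge are hypothetical, for illustration only.
    quarterly_results = [20, 15, 10, 18]   # $M, four "operating" quarters
    write_down = 100                       # $M, one-time charge taken in Q4
    as_reported = quarterly_results[:3] + [quarterly_results[3] - write_down]
    spread_evenly = [q - write_down / 4 for q in quarterly_results]
    print("as reported:  ", as_reported)   # three good-looking quarters, one ugly one
    print("spread evenly:", spread_evenly) # every quarter would look negative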
If you look at the individual results of each division, even apart from this extra write-down, you'll see that each of them had a worse result than last year. By the way, 22% less revenue YoY surely can't help financial performance. And next quarter looks to be even worse.
Simply put, AMD isn't able to sell its products anymore. And that is what really counts for investors (who are, in fact, abandoning ship).
Kjella - Wednesday, January 21, 2015 - link
AMD has lost that money, and the write-down is just recognizing it. The acquisitions and severance pay are a bit different, but the inventory write-down is basically saying: we've underestimated our losses by overestimating the value of our inventory. They just hope investors will be too dazzled by the genuine one-time write-offs to notice that their non-GAAP figures are misleading.
Achaios - Wednesday, January 21, 2015 - link
I am really sorry to see AMD in such a sad state. AMD is extremely important b/c they are the sole competitor to Intel and NVIDIA. A strong and profitable AMD that produces superior CPUs and GPUs means lower prices on those products for the end user.

The latest AMD CPUs have the single-threaded performance of Intel CPUs made in 2008, and the latest top-of-the-line Radeon GPUs produce so much heat and consume so much power that no really savvy person can choose them over the NVIDIA alternatives - even though the latter cost more and, to make things worse, the Radeons offer no performance advantage.
AMD is in a very bad spot, and they need to do radical things if they want to stay in business.
anubis44 - Wednesday, January 21, 2015 - link
"the latest top-of-the-line Radeon GPUs produce so much heat and consume so much power that no really savvy person can go for them over the NVIDIA alternatives, even if the latter cost more and to make things worse they offer no performance advantage."Well, in my favourite PC game (Company of Heroes 2), the R9 290 is as fast or faster than the GTX970, and I got it for $259 canadian, vs. the cheapest GTX970 being $380 in Canada, so I went for the R9 290 (Gigabyte Windforce), and flashed the bios to 1050MHz in 5 minutes. The Windforce cooler is extremely effective and quiet. I don't hear the card at all in my CoolerMaster Storm Stryker case, so I'd say you're quite wrong about the R9 290 not being competitive, and yes, I'm a 'savvy' tech person, I'd say, after having built all my previous PCs since 1995.
nVidia simply continues to rip people off and the worst part is that people seem to like it anyhow. If you want to bend over for Jen Hsun and his proprietary technologies that lock you in to higher prices, please be my guest, but don't try to put a false spin on AMD's competitiveness.
mrdude - Wednesday, January 21, 2015 - link
AMD's GPU and APU sales are stumbling, and AMD expects that to continue through Q2/Q3 2015. ASPs fell, and a build-up in inventory has forced yet another write-off. During the Q3 conference call, Devinder Kumar specifically singled out the GPU division for its poor performance and attributed that to market dynamics and competition. Being relegated to the lower segments of the market, along with a heavy reliance on sales in Asia, has hammered them quite hard.

Are you sure it's he who's putting the false spin on AMD's competitiveness?
AMD desperately needs new CPU and GPU uArchs. Primarily, they need uArchs tuned for efficiency and low cost while offering great performance. Right now, they've got bigger dies that consume more power and that they're forced to charge less money for. All the while, they're losing market share and the eyes of the consumer.
It's the consoles that are keeping the company afloat at the moment, and presumably into 2015. The upside seems to be the EESC space, where AMD has made some inroads via their APUs and embedded products along with securing another couple of design wins.
Given the breakdown of their revenue, both expected and reported, it looks like AMD's products are viable mainly in low-margin markets with longer refresh cycles. While that can make for nice added (bonus?) revenue, it won't pay for the costly development of their CPU, GPU, and other microarchitectures.
Arguing that AMD's current products are competitive is farcical. They're competitive in certain segments and some aspects, sure, but as a whole they're suffering.
chizow - Wednesday, January 21, 2015 - link
lol @ anubis44, bitter AMD fanboy until the end, I see.

Better start downloading and archiving those driver releases; keep 'em safe! :D
anubis44 - Wednesday, January 21, 2015 - link
Don't worry, chizow, I've still got my old 3dfx drivers, too. :)

And keep sucking on Jen Hsun's wang, 'cause thanks to people like you, nVidia could soon be the only game in town for GPUs.
chizow - Wednesday, January 21, 2015 - link
Yawn, cry me a river. Intel has been the only CPU game in town for 8 years, and where has it gotten us? Oh right, more awesome CPUs every. single. year.

But yes, I can tell you're the clingy fanboy type. It's OK, I used to really like 3dfx too, until they went belly-up and I moved on - first to ATI, actually, but of course Nvidia has dominated since the G80, making my decision to go with them and stay with them easy!
anubis44 - Wednesday, January 21, 2015 - link
Chizow, you nVidiot, you need to watch this video to see how you're going to be spending at least $100 more for a G-Sync monitor than for exactly the same FreeSync version:
http://www.fudzilla.com/news/graphics/36791-amd-fr...
If you're still determined to give nVidia your money, despite their clear and obvious attempts to rip you off, please go right ahead and bend right over.
D. Lister - Wednesday, January 21, 2015 - link
It is just that G-Sync is totally optional, just like Surround or 3D. The difference between G-Sync and FreeSync is not unlike doing physics - you can do it in software, or you can do it with dedicated hardware. FreeSync is a software solution, while G-Sync performs a similar function with custom hardware. Once both techs are available at retail, comparisons will be made, and the better implementation will become apparent.
chizow - Thursday, January 22, 2015 - link
I agree that once both hit the market it will be evident which is better; however, we do know G-Sync does everything Nvidia said it would, because it has been on the market for over a year. That's going to be a tough act for AMD to follow. :)
chizow - Thursday, January 22, 2015 - link
Why would I need a video to confirm what PCPer and others have reported: that FreeSync isn't as good as G-Sync, and that the $100 premium is well worth it for something that is better and has already been on the market for months?

Oh, and I bought an ROG Swift at launch; I've been enjoying it for the last 5 months while AMD keeps fanboys like you waiting with bated breath.
Let's hope FreeSync hits the market before AMD goes belly-up; it may be their final gift to devoted AMD fanboys like yourself. ;)
silverblue - Wednesday, January 21, 2015 - link
Everybody is entitled to an opinion. Calling somebody out for being a fanboy when you've done the same yourself is a tad hypocritical, and it wears a bit thin after a while.

AMD are performance-competitive for the price, but they've had to cut prices quite a lot to stay that way. I would also like to court controversy here by saying that Mantle has probably had a larger impact on gaming than PhysX, though, if we're going to be balanced and all that, AMD have never truly offered anything to compete with 3D Vision or GeForce Experience.
chizow - Wednesday, January 21, 2015 - link
Nah, I unapologetically buy what's best; if that were AMD, I'd buy AMD. But they haven't been the best, not in a long time. So yes, I'm a fanboy of the best I can afford, and in every case that means passing on cheaper, less-supported alternatives like AMD.

anubis44 is a massive AMD fanboy, however - I mean, he still thinks FX chips and CMT were a good idea! Yes, he is the rare AMD *CPU* fanboy as well. That takes skill, devotion.
Mantle was a dud, just an answer to a question that was never asked: "Hey, you know what would be great? Another API/platform from the minority stakeholder in the GPU business to further segment PC gaming!"
anubis44 - Wednesday, January 21, 2015 - link
chizow, just to prove how 'massive' an AMD fanboy I am, I actually went out and built myself an Intel Core i5 4690K gaming rig, since all the benchmarks and Intel fanboys were screaming about the 'incredible' performance advantage of the i5 4690K over my FX-8350 from over 2 years ago. Guess what? The average FPS in my favourite game, COH2, at the comparatively low resolution I play at (1680x1050) - where the CPU should have had even more opportunity to play a role - went from 46FPS to about 50FPS. And that's with the 4690K overclocked to 4.5GHz, just like my FX-8350. What a farce. Paying extra for the Intel CPU was a rip-off, even though I got it for $229 on special. To make matters worse, when I transcode, rip CDs/DVDs, and otherwise run multiple tasks, the Intel chip runs out of threads way earlier than the 8-core FX, which is just pathetic. And no, I wasn't about to spend over $300 to get a Core i7, which still only has 4 real cores plus hyperthreading.

I'm still using the i5 4690K, but I might just give it to my father-in-law at some point as an upgrade for his ancient Core2Duo rig. I'm already starting to feel dirty using this overpriced Intel processor.
chizow - Thursday, January 22, 2015 - link
That's probably because you're using a trash AMD GPU too.
chizow - Thursday, January 22, 2015 - link
Oh, and please forgive me if I don't buy your BS, when virtually every site that has tested that game shows massive gains from a faster CPU. As usual, you're probably doing it wrong - or, more likely, you're GPU-bound with something from AMD that was as slow as your 8350.
https://www.google.com/search?q=company+of+heroes+...
silverblue - Thursday, January 22, 2015 - link
You mean this article, don't you?
http://www.techspot.com/review/689-company-of-hero...
The Titan is the bottleneck with the 4770K, but it only becomes the bottleneck with the 8350 at 4.5GHz. This may explain why anubis44 isn't noticing better performance - and means he could downclock his i5 and still get the same performance, teehee. However, I must point out that CoH2 doesn't appear to use more than three CPU cores, so i5s and i7s are wasted in general, and the game isn't really optimised for FX CPUs, considering ICC was used to compile it. I wonder how core parking would affect the FX results? The game also doesn't support multiple GPUs, so this really isn't a great game to extract performance data from.
Crunchy005 - Wednesday, January 21, 2015 - link
Love the rage here on both sides. Obviously chizow is a huge Nvidia fanboy, although he denies that fact. And there are plenty of AMD fanboys. Fanboyishness aside, AMD's graphics cards were on top up until the 780 Ti, which was Nvidia attempting to compete with the new top R9s.

"Mantle was a dud" - chizow. Mantle was a nice performance improvement when it came out, by reducing CPU overhead, and it led to Nvidia following suit.
http://www.extremetech.com/gaming/175881-amd-mantl...
Here is an article from mid 2014 showing top recommended GPUs:
http://www.pcmag.com/article2/0,2817,2422133,00.as...
Funny how Nvidia is barely there. AMD was the better choice on power-to-performance; you can't compare and throw out the AMD competition when they are older architectures than the new Nvidia 9xx's (of course those are better). Just wait for the new R9 3xx's, which are rumored to use HBM memory to hugely improve memory bandwidth - something Nvidia also says they are looking into - and this should allow for some nice performance gains as well. But hey, as soon as those come out, "nvidia sucks, so why would anyone ever buy it, it has never won" (as seen from @chizow's perspective of what AMD currently is). I would also like to quote you here: "Nah, I unapologetically buy what's best, if that were AMD, I'd buy AMD. But they haven't been the best, not in a long time."
And to finish off, AMD's HSA architecture shows a lot of promise - not entirely relevant with current software, but they are several years ahead of Intel in this field. Kaveri destroyed the i5 here with HSA:
http://wccftech.com/amd-kaveri-i54670k-benchmarks-...
frenchy_2001 - Wednesday, January 21, 2015 - link
The problem in the discussion we are having here is that people are really talking about 2 different things.

How good a product is comes down to that catchphrase: "There are no bad products, there are only bad prices." AMD has internalized that and prices its products competitively, so on perf/$ they are usually competitive with, if not better than, their competition (either Nvidia or Intel). This is great for consumers, as it keeps price pressure on products and keeps the market competitive on price.
The problem for AMD is that they are not making money. If both Nvidia and AMD sell a $250 GPU, but it costs $120 to make for one (Nvidia, 55% gross margin; Intel is similar in CPUs) and $200 for the other (AMD, 29% gross margin), then one company is not doing as well as the other and will not be profitable. This can be acceptable in the short term, either to buy market share (as Intel is trying in mobile) or to get past a bad product until your next refresh, but it depends on having the financials to support it and to invest in R&D to develop new, more profitable products. Both Nvidia and Intel have done it, and both have come out stronger in the next round. AMD has been in that mode for years. They bet the farm on buying ATI and they are still paying for it. It cost them their fabs, as those had become increasingly expensive to maintain (only a handful of companies have competitive fabs; even IBM divested theirs recently). This has led to Intel leading in process (currently 14nm against 28nm). In both GPUs and CPUs, we can also see that they missed the design shift their competitors went through: Intel pushed toward lower power with the first Core architecture and has continued to push lower with each iteration (15W -> 10W -> 8W -> 4.5W...). Similarly, while AMD was creating the biggest and most powerful GPU ever (R9 290X), Nvidia shifted toward more and more efficient architectures, culminating with Maxwell recently.
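(The $120/$200 cost figures above are rough; working backwards from the stated gross margins gives similar numbers. A quick Python sketch, treating the $250 price and the 55%/29% margins from the comment as givens and deriving the per-card costs:)

    # Unit cost implied by a gross margin at a $250 selling price.
    # Price and margins are from the comment above; the per-card costs are derived.
    price = 250.0
    for vendor, gross_margin in (("Nvidia", 0.55), ("AMD", 0.29)):
        cost = price * (1 - gross_margin)
        print(f"{vendor}: ~${cost:.0f} to build, ~${price - cost:.0f} kept per card")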
AMD is in a really bad place: they are out-designed in both markets and out-fabbed in CPUs, with no cash reserves and a mountain of debt. Their hope would be new architectures in both markets, which are long and costly to develop, and new process nodes at their suppliers, which is out of their hands. Their advantage, heterogeneous computing, depends on software - a domain where they have dropped the ball more often than not, usually relying on third-party development.
Basically, unless they change their mode of operation drastically, they cannot really hope to beat their competition substantially in either of their core markets.
chizow - Thursday, January 22, 2015 - link
Yeah, again, who isn't a fan of a great product? Oh right, AMD fanboys.
chizow - Thursday, January 22, 2015 - link
lol, and did you really cite HSA? Wow, that is some old-school AMD fanboy playbook right there. Yeah, just wait for this... wait for that... wait for underwhelm. Same story from AMD and their apologists, different day.

Except there might not be too many future days for AMD fanboys and apologists. Time to change that AMD Inside t-shirt for the first time in a decade and go for that tattoo removal!
Crunchy005 - Friday, January 23, 2015 - link
Kind of wish you could hear yourself; you are obviously a single-minded individual. You tunnel-vision on Intel and AMD and offer no evidence to support what you say. You just sit here and bash anything anyone says. You really need an attitude adjustment.

I do like Intel and Nvidia. AMD and Nvidia have gone back and forth on GPU power forever, and Intel makes the superior CPU. Obviously you only focus on what is currently on top and nothing else (ignoring AMD, against whom you are biased). You offer no real helpful information either. I like AMD for the innovation they have pushed, and the fact that they take risks is not a bad thing. APUs were pushed by AMD, and Intel HD graphics started to take off behind them. HSA is starting to mature, and hopefully software will follow - and Intel will follow as well, I'm sure. Some of the benefits you get from Intel (an unimaginative but efficient company) have come from competition with AMD. Although you'll call me an AMD fanboy again and blow all of this off, so why am I even trying?
Crunchy005 - Friday, January 23, 2015 - link
Yes, great product here:
http://www.extremetech.com/extreme/198214-198214
So what is Nvidia's response to this?
D. Lister - Wednesday, January 21, 2015 - link
The only thing Mantle does is bring the Catalyst drivers to parity with GeForce drivers in terms of CPU reliance. Traditionally (pre-Mantle), Radeons have been much more CPU-sensitive than their Nvidian counterparts (ref: http://www.overclock.net/t/1495236/amd-vs-nvidia-c... ), which means that a weaker CPU would impact the performance of a Radeon a lot more (up to 15%, as per the reference link) than it would an equivalent Nvidia-based solution. The difference is that instead of making a fundamental change in the drivers to overcome that, AMD gave us "Mantle", a latched-on driver mod which requires individual developer-level changes in every game instead of being automatically applicable universally.
Yep, exactly. Mantle is and always was a crutch for their lackluster GPUs, because it solved a problem that didn't exist with typical well-balanced builds that used reasonable GPUs with equally paired CPUs.
chizow - Thursday, January 22, 2015 - link
Er, that should say lackluster CPUs in the 1st line.
Crunchy005 - Friday, January 23, 2015 - link
Either way, chizow, Mantle did offer higher frame rates on Intel CPUs as well as AMD CPUs. Again, you're ignoring the fact that it benefited both and was not an issue with just AMD.
silverblue - Thursday, January 22, 2015 - link
But it did push Microsoft to make DirectX more efficient, no?
D. Lister - Thursday, January 22, 2015 - link
http://techreport.com/review/26239/a-closer-look-a...

Apparently, DX12 development is older than Mantle. The real motivator behind the development of a low-level graphics API is the modern consoles, which proved the potential of low-level access. Once DX12, a much more comprehensive and universal solution, becomes available, Mantle will inevitably fade out, joining AMD's other short-lived marketing ploys (the awesomeness of DX10.2, Bulldozer's miraculous Win8 performance boost, etc.) in technological history.
chizow - Thursday, January 22, 2015 - link
Possibly, but MS has been using a low-level API since the original Xbox, and the next Windows would've been due for an update without a doubt. The fact that they were demonstrating the "Xbox One" a year before launch on Nvidia hardware, and were able to demo DX12 at Build a few months after Mantle launched, clearly shows they were already working on a lower-level API. Mantle launching may have solidified their plans and forced them to speak about it sooner, maybe.
D. Lister - Thursday, January 22, 2015 - link
Considering that AMD has been part of the DX12 development effort alongside MS, Intel, Nvidia, et al. from the start, and considering their past track record, it is easy to extrapolate that "Mantle" was AMD essentially trying to make a quick buck with something half-cooked before DX12 came out.
D. Lister - Thursday, January 22, 2015 - link
A correction to my earlier post: I meant to say "DX10.1" and not 10.2. Sorry about that.
D. Lister - Thursday, January 22, 2015 - link
@chizow

That is true, but this time around, unlike in the past, the hardware competing with MS's (the PS4) has a significant advantage, and they're hoping to overcome that disadvantage by bringing the workings of the development platform (the PC) closer to their console, for faster and higher-quality ports.
D. Lister - Wednesday, January 21, 2015 - link
@anubis44"don't try to put a false spin on AMD's competitiveness."
Did you actually even read the article? How does that work with your own seemingly lofty impression of AMD's competitiveness? AMD has been desperately trying to expand into new markets and product lines for years, all at the cost of cutting corners on quality of their primary product lines. Now, it's all just finally catching up to them.
silverblue - Wednesday, January 21, 2015 - link
I'd argue that AMD's acquisitions and attempted divorce from GloFo have been contributing factors. You can't blame a company for trying to do something different when it's not competing as well in its core markets, but the SeaMicro acquisition has done... what, exactly?

Has quality actually slipped? If you're referring to AMD's GPUs getting bigger and more power-hungry, NVIDIA suffered the same fate for a time before Kepler. If you're referring to Bulldozer, that is partly a case of software not supporting the hardware, as well as some bad design. When I read "quality", I equate it more to the reference designs of graphics cards, and whilst AMD offered a premium-looking product with the 295X2, they haven't cascaded it down the range like NVIDIA have - I can definitely agree with that. Are drivers as bad as some people say? I suppose it depends on which people you speak to and whether they have an agenda or not.
I don't think anybody can blame AMD for not getting many wins with their Cat cores either, especially with Intel deliberately forcing them out of the market, but you can blame them for overestimating how well Llano and Trinity would sell - that was poor. I am amused by the idea that Apple didn't go with AMD because they couldn't meet demand, but then overproduced APUs to the point they had to write down hundreds of millions of dollars worth of inventory.
Kjella - Wednesday, January 21, 2015 - link
The ugliest part for AMD: total stockholder equity went from $535 million to $187 million. If that hits zero, AMD is bankrupt, game over. And their CPU/GPU designs are only getting older and older, since they spend all their R&D on "diversifying" and there are no new consoles in progress.
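(Purely as an illustration of why that equity figure matters - a tiny Python sketch; the quarterly loss rate below is an assumption for illustration, not a reported number:)

    # Hypothetical runway: quarters until equity is wiped out at an assumed loss rate.
    equity_m = 187.0            # $M, stockholder equity cited above
    assumed_loss_per_q = 50.0   # $M/quarter, an assumption for illustration only
    print(f"~{equity_m / assumed_loss_per_q:.1f} quarters at ${assumed_loss_per_q:.0f}M/quarter")

chizow - Wednesday, January 21, 2015 - link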
There are AMD fanboys who will still tell you the acquisition of ATI was a good idea, even when the point is made that AMD overpaid for ATI and that it was the beginning of the end for the combined AMD/ATI.

It will be a case study in every B-school in 5 years, if it isn't already.
WagonWheelsRX8 - Wednesday, January 21, 2015 - link
One quarter of bad financials usually isn't a bad thing. What worries me most is their product line-up. They've basically thrown in the towel on their 'traditional' market, which I feel is going to end up being their biggest mistake. Their GPUs are much more competitive with nVidia than their CPUs are with Intel, which is good, but it isn't enough.

Do they have any planned follow-up to the Bulldozer architecture? I understand they probably aren't capable of taking Intel on in the highest performance segments, but their current desktop (and, by extension, server) products are barely scraping by in mid-range performance segments WHILE guzzling power like a hot rod from the 1970s!
The move to ARM is extremely risky, as they are late to the party. I think it's a necessary step, since devices that run ARM are so prevalent, but I don't think they're going to make any serious money here. There are a ton of competitors in this space (Apple, Qualcomm, nVidia), and I just don't see AMD bringing anything really amazing to the table.
Anyone feel free to correct me if I'm wrong, though.
And my question about the follow-up to Bulldozer is a serious one. Do they have any high-end desktop CPUs in the works?
mrdude - Wednesday, January 21, 2015 - link
They have K12 and Zen planned for release some time in 2016 - ARM and x86 respectively. Just which segments of the market these chips are aimed at is unknown. Both are currently just names tossed about that AMD has pinned the future of the company on. Anything beyond what ISA each uses is speculation.
WagonWheelsRX8 - Wednesday, January 21, 2015 - link
OK, thanks for the info. I had read that K12 was planned, but found no specific info on it, and had no idea whether it is an attempt to regain some competitiveness. 2016 is still a long way out.
chlamchowder - Wednesday, January 21, 2015 - link
I would pin my hopes on Zen being a high-performance architecture that can take on Skylake. That might not be wishful thinking if Intel sacrifices too much performance going for low power.

I have little hope for K12. IMO, AMD is out of its mind going for ARM stuff - they're not good at low power to start with. K12 might have a fighting chance against Intel's Atom, but likely not against Qualcomm's Krait. If they're going for high performance, ARM comes with a boatload of problems, like the lack of software support from desktop/server applications. I haven't heard of anyone picking up ARM servers, and Windows RT (ARM's best shot at real productivity machines so far) didn't get far because no one bothered to port stuff over from x86.
chizow - Wednesday, January 21, 2015 - link
@Creig and others who insisted 3 months ago, in the GPU buyer round-up, that price was the only thing that mattered: I guess the market has spoken (again), and you were wrong (again).
toyotabedzrock - Wednesday, January 21, 2015 - link
They have separated the earnings in a way that might make Wall Street happy, but it will distort how they invest in R&D and cause more issues later on. They need better designs or there will be no custom work for them.
FriendlyUser - Wednesday, January 21, 2015 - link
The market was expecting these results or even worse, which is why the stock is rising.
Achtung_BG - Wednesday, January 21, 2015 - link
I like AMD, but... R&D expenditure in Q4 2002 was $244.85 million; twelve years later, in Q4 2014, it was $238 million. Over the same period, Nvidia's R&D went up from $57 million to $340 million, and Qualcomm's from $112 million to nearly $1.4 billion.
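(Turning those figures into compound annual growth rates makes the gap starker - a quick Python sketch using only the numbers quoted above:)

    # 12-year compound annual growth rate of R&D spend, from the figures above ($M).
    rd_spend = {"AMD": (244.85, 238), "Nvidia": (57, 340), "Qualcomm": (112, 1400)}
    years = 12
    for company, (q4_2002, q4_2014) in rd_spend.items():
        cagr = (q4_2014 / q4_2002) ** (1 / years) - 1
        print(f"{company}: {cagr:+.1%} per year")
    # AMD: -0.2% per year; Nvidia: +16.0%; Qualcomm: +23.4%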
silverblue - Thursday, January 22, 2015 - link
http://www.electronicsweekly.com/mannerisms/rd/st-...

Different to two years ago.
Kyururin - Wednesday, April 8, 2015 - link
AMD fanboys unite: chant this every day for 30 days: Die, AMD, Die. After AMD dies, let Intel fanboys bask in the glory of a $6000 Intel Celeron for the CPU, $400 for the cardboard packaging box, $2000 for the stock fan without heatsink, $2600 for the heatsink without fan, and, last but not least, $1 per 1mm of TIM.