The article showed it using half the power (or less) of the GTX 950 at the same performance.
That's a game changer if AMD/RTG releases these as competitors. This means the new low-end tier of dGPU performance (i.e. under the 75W power limit) may be in the area of the GTX 960 (a $200+ card).
nVidia will see these same benefits, of course, so this upcoming generation is looking quite enticing for once. This bodes well for AMD APUs, though memory bandwidth is already the limiting factor there, so the memory compression tech and any L4/HBM offerings are of primary interest there.
This is such a dumb move by AMD: now Nvidia knows exactly when, and how much GPU, they have to release, because make no mistake, Nvidia has their next gen ready to go; they've just been milking profits as per usual. Second, it's gonna hurt sales... why buy AMD now when you know this is in the pipe?
And you are the expert that everyone should listen to - thanks for your wisdom straight from the strategic headquarters in your mom's basement. AMD should hire you right away.
It is not a dumb move to announce high-level details and plans along with a product demonstration. Nvidia and Intel have been doing just that for years and it hasn't helped AMD at all. It helps the people designing next-generation laptop hardware prepare and get interested in AMD products. Equal performance at about half the wattage is a good marketing demo.
The other aspect you're forgetting is that AMD needs to give investors something worth investing in, and the demo does just that. For investors, the demo makes even better business sense. You can hear the crickets chirping from the CPU division, so the GPU division needs to continue to float the company. This does it for a few months.
Would you believe them if they said nothing's in the pipeline? If you do, you are dumb, period. If you don't, then not announcing it would be dumb, as it does nothing good for either consumers or investors - in other words, you're dumb to think they are dumb.
You seriously have no idea how the industry operates if you think Nvidia has some "future tech" they're just keeping for a rainy day. Complete. Ignorance.
As exciting as the thought is, don't expect to see an APU with HBM this year. AMD was clear when they first made their HBM announcement that APUs probably weren't going to get the treatment until HBM3 or possibly even HBM4. :-/
Maybe not from AMD (though I think I've read the high-end card will be getting HBM, and the smaller cards normal RAM). NVidia wanted to go HBM as well. Gonna see something more interesting this year, hopefully.
I don't know about same performance; the demo was highly controlled. The key point being, they used vsync. The 950 could very well be much faster, but using more power. We will never know until they show us that same demo with vsync off.
If that was the case and it was hitting the vsync cap much more easily on the 950, that would be reflected in the power numbers to a larger degree.
But as others have pointed out both vendors are going FinFET so I suspect we'll see large leaps like this for both manufacturers. Further, pricing is going to play a big role in how attractive any given card is, and the competition will greatly benefit consumers in this regard!
Why are people so hung up on power:performance numbers? If you're into enthusiast gaming (like most people here) you should care first about $:performance. Power:performance should be somewhere down around 5 or 6 on your list.
Because all things considered you can't really make a card drain much more than 250W or so. If you get twice the performance per watt, then you've just doubled the maximum computing power a single GPU can have.
That's why I gave up SLI. Having two 200+ watt GPUs and a CPU pumping away basically makes your PC a space heater. Since my house doesn't have zones for AC, this makes summertime gaming unacceptable when it's 80 degrees outside. Ideally a PC shouldn't produce more than 200 watts while gaming, and laptops have trouble doing half that without some creative thermal design.
Mainstream GPUs need to take a note from the GTX 750 Ti, a card that could run off the PCIe bus without any additional power while still playing just about any game at acceptable performance and detail.
Totally agree. You can get a portable AC to dump the heat back outside again, but that means a whole lot of energy used for gaming. So much better to have power-efficient performance.
Hoping the GTX 970 successor doubles performance in the same thermal envelope. That would pretty much mean GTX 980 Ti performance at GTX 970 power and pricing. Would love that! 14/16nm could definitely make that possible.
Because not all of us are 16 any more. I want performance that is good enough for what I do, I want the hardware to be as quiet as possible and I can afford that.
My space heater is 1000W. It can bring my room to 80F when it's below freezing outside. I don't use it in the summer. I use my computer in the summer. There's a reason I don't want it to take 1000W when it's already 90F in here.
My 89W Athlon X2 5200+ and 8800GTS used to noticeably warm the room. I feel sorry for anyone who bought an FX-9590 and 390X.
Depends where you live (cost per MWh), how long you run the PC, and how loaded it is... It ain't that hard to calculate... where I live I pay ~62 EUR per MWh (75 with tax)... so running a 600W power hog vs a 300W one 8 hours a day (wife@home, so prolly it's even more) puts you at 108 vs 54 EUR a year (plus tax) on the computer alone. It's not tragic, but also not so little that you can just plain neglect it...
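For anyone who wants to plug in their own numbers, here's a minimal sketch of that arithmetic (the 62 EUR/MWh rate and 8 h/day are the commenter's assumptions, not universal figures):

```python
# Rough annual running-cost estimate; tariff and usage taken from the post above.
def annual_cost_eur(watts, hours_per_day=8, eur_per_mwh=62):
    mwh_per_year = watts * hours_per_day * 365 / 1_000_000
    return mwh_per_year * eur_per_mwh

print(round(annual_cost_eur(600)))  # ~109 EUR/year before tax
print(round(annual_cost_eur(300)))  # ~54 EUR/year before tax
```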
Because AMD runs too hot to be cooled easily compared to NVidia. Well, high-end cards anyway. Less heat = less noise, or more frequency (= more performance).
Thanks to the destruction of the middle class, the erosion of purchasing power and price rise of all sources of energy in the last 40 years, the Watt per frame equation is actually more important than ever. Unless you are part of the privileged 1% and wipe your ass with a Benjamin...
I watched your TED Talk, and I don't see how it refutes what Ramon Zarat said.
Obviously, when he refers to the 'destruction of the middle class', he means the one in the United States. And he's right: there has been a net destruction of good-paying, solid, middle-class jobs in the United States (and Canada, where I live, for that matter). He's also right that the optimistic goal espoused after WWII to use nuclear power to produce cheap energy for all has been completely co-opted by a corrupt cabal of wealthy manipulators, and for some mysterious reason, we're STILL burning fossil fuels to produce most of our energy (including transportation). Of course, the expensive energy economy these greedy fools have tried to impose on us is unsustainable, and is unravelling at the seams now that so many countries are dependent on oil revenues and have allowed their manufacturing to flow to China. There is no solidarity among oil producers, and they're ALL pumping the stuff out of the ground as fast as they can because they're all dependent on the oil revenues to survive. Either most of these countries get out of the oil business and realize they need to have a manufacturing-based economy once again while they can, or we'll descend into a desperate, anarchic world, with countries simply invading each other for oil in order to try to maintain their falling oil incomes by controlling more and more of the production. Disgusting.
Like Spoelie said, kinda: the world as a whole is living much more materially comfortably than 40 years ago, thanks in large part to the rise of Asian middle classes and a significant improvement in South America. Purchasing power and living standards are also higher in Europe and the US than they were 40 years ago, even though the gains from economic growth have been disproportionately concentrated in the hands of the richest, especially in the US.
Now, there is clearly a stagnation of income for the American middle-class, but things are not really going worse than they were 40 years ago.
We can debate the economic fortunes of the US and Europe in the last 10 years, but the world average is much, much better than it was in 1975.
Meanwhile, in the real world... people who pay their own bills know that a 40W GPU difference is effectively zero.
I replaced all of my lighting with LEDs and saved an estimated average of 500W, 24/7. The extra 40W, or even 140W, under load for the few hours of gaming a day any normal person has time for will not make any impact on the electric bill that is noticeable beyond the noise of a two-degree temperature swing outside.
It's far from my main concern, but power-performance does impact the experience. I love the fact that NVidia's GPUs are less power hungry and generate less heat at the moment than AMD GPUs. It generally makes for quieter and more overclockable cards.
As an 'enthusiast hobbyist', I like to build fanless computers. In such a situation, performance:watt is the primary constraint on gaming performance.
I'm not sure what your list looks like, but IMO "$:Performance" should be a lower priority than power consumption for 'enthusiast gamers', simply because more heat puts physical constraints on how powerfully a system can perform.
Despite AMD and Nvidia both being stuck on 28nm, Nvidia was able to make decent mobile GPUs via Maxwell. AMD just didn't bother to concentrate their efforts on mobile and low-powered GPUs.
AMD had a significant investment in moving onward with 20nm and was unable to backport all of the features and improvements due to time and staffing limitations. nVidia, meanwhile, was able to throw more manpower at the problem and handle the situation much more gracefully.
AMD's next issue was an underestimation of nVidia's performance improvement, which resulted in AMD's need to push the clock rates on their GPUs beyond the peak efficiency band. You can get ~80% of the performance of a GCN card at about half the power consumption, which the R9 Nano illustrates wonderfully. If AMD's estimate had been accurate, we'd be viewing things differently:
Imagine the 7970 releasing at 800MHz and overclocking to 1.1GHz, and the R9 290/X having a base clock of 850MHz and hitting 1.2GHz... It changes the dynamic entirely. nVidia outperformed expectations, AMD delivered exactly what was expected... then had to push to match nVidia.
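Putting a rough number on that claim (the 80%/50% figures are just the ones quoted above, not measured data):

```python
# 80% of the performance at 50% of the power works out to ~1.6x the perf/W
# of the same silicon pushed past its efficiency band.
perf_fraction = 0.80
power_fraction = 0.50
print(perf_fraction / power_fraction)  # 1.6
```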
Good for smaller devices (and I'm excited), but I doubt the high end. It may use less power, but the performance might only be a small increment over the previous high-end GPUs.
Seriously, can't stand it. RTG is just AMD's pathetic marketing attempt to separate the GPU and moribund CPU divisions so that when they inevitably spin off the GPU division (or shut down the CPUs) the only remotely valuable piece of the company left won't be tainted by association.
Pay attention to what? The fact that they could have called it something with AMD in the name? The fact that they could just refer to it as Radeon? RTG is a crappy acronym and I find its presence in the article 8000 times to be incredibly distracting.
Personally I would prefer if they just said Radeon for short.
I'm pretty sure anyone who's been around computers since before the merger, or just follows GPUs in general, will understand. We don't refer to Ford Motor Company as FMC even though that is exactly what it is. It's just Ford. There's a reason the Radeon name has stuck around this long, and I think it should just be referred to as Radeon. Then again, that's just my opinion, we all have one, and I'm sure others prefer the acronym.
Currently, RTG is what the Radeon Technologies Group requests to be called. That is why we refer to them as such and also why the name is brought up so often (we have to refer to them as something, and their chosen name is preferred). So at this point it is less about preference and more about the official branding.
Yeah, I know. I just don't care for it and agree with Owan as far as the marketing part. The acronym just doesn't read very smoothly to me right now, and you did mention it quite a few times. I know there really isn't much you can do about that, because as soon as you leave it out someone will get confused about where the information is coming from, so I choose to blame Radeon for now. I'm also sure that I probably won't care as much once we have been stuck with the acronym for a bit and the edges get rounded off so that I read it just as smoothly as any other acronym... but I still don't like it. =)
That's a bad example, because people usually prefer to say the acronym GM, not General Motors or General. Also, people are about 50/50 on saying POS, or Piece of Sh!t, instead of Chevy. Most importantly, though, RTG before being bought by AMD (acronym preferred) was ATI (acronym preferred). No one really referred to it as Array Technology Inc, Array Technologies Inc, or, as it was later changed to, ATI Technologies Inc, which is an acronym within an acronym...
It's a minor, off-putting annoyance to have to see RTG over and over in an article, and I do agree with others who feel it's disruptive to read. I'd rather see AMD's people produce a product that speaks for itself and defines the group that developed it. However, I'm not surprised to see it's AMD that's pushing the awful name. If you've read past AMD press releases, you're likely to find yourself drowning in buzzwords and clumsy-feeling, artificial excitement. They have a history of botching effective communications, which is why I'd much prefer AMD focus on delivering competitive products at competitive prices and worry less about telling journalists how to write their articles.
All I can think of when I see it is radioisotope thermoelectric generator, the primary power source of most of NASA's best toys. However, it is also applicable to AMD: consistent, if a bit less powerful than competing technologies, and very, very warm. ;-)
It will be interesting to see when NVIDIA will make their FinFET product announcement. Products are still a bit out, but here's hoping that the new products will give us good reason to upgrade from 2-3 year old video cards!
Yes. The Nvidia fans are mentally challenged and don't grasp that the baseline system differences between the two are just the card, and that the total system power consumption differences are highlighting the card architectures and FinFET. It seems each one comes in, repeats the same mentally challenged thought, and leaves.
Yes, wouldn't a GeForce 960M also hit the FPS cap at medium on the X-Wing map in Battlefront and use a lot less power doing it?
Although when it comes to prerelease GPU demos, pointless is pretty much par for the course. I'm excited to see how these perform when they're released, as well as Nvidia's Pascal architecture, of course.
Still makes the comment "GTX 950 uses a cut-down version of GM206, which is nVidia's latest and most power-efficient chip to date" not true, which is what tviceman pointed out.
Clearly they do it to make AMD look good - there are plenty of Nvidia chips that can hit that frame rate and use less power, particularly mobile ones. Capping the fps is the bigger flaw - obviously the AMD chip can only manage 60, hence the cap, but the Nvidia one can manage more. They could have been more blatant and used an OC 980 Ti - that would also give you 60 fps if capped at 60 fps and use even more power!
I'm going to cautiously turn my optimism dial up to 3. Raja has some good stuff between his ears (and we're going to also attribute any good news to Based Scott, right?), and the 28nm GPU stalemate sucked for everyone. It's certain that the die shrink will be a big boon for consumers, less certain is if AMD can grab back any market share as Nvidia won't be sitting with their thumbs up their butts either.
C-suite executives can't save floundering companies from extinction. AMD is going extinct because of an erosion of brand value, and inferior technology.
Nobody gives a crap if your R9 380 is "competitive". It consumes 150W more than the competition. Inability to improve performance per watt is a fatal design flaw, and it will ultimately limit the availability of AMD's inferior design.
Is Star Wars Battlefront still a game with a very lopsided performance differential in AMD's favor? When I looked at what was needed for 1080p/max for a friend right after launch, it was a $200 AMD card vs a $300 nVidia one. I haven't paid attention to it since then; has nVidia made any major driver performance updates to narrow the gap?
AMD, err, RTG -- if you're reading this: Please take my free recommendation to use the plural of primitive, i.e. primitives discard accelerator, instead. Thanks.
You don't want to use the word "primitive" anywhere near a description of your next generation tech product. This is a marketing slide by AMD, not a whitepaper. So please take your stupidity somewhere else.
Finally! We've been stuck at 28nm for too long. I hope this finally gives me a reason to upgrade my 4 year old GPU. I'm thinking about switching my main monitor from 1080p to 1440p this year and whoever (AMD or Nvidia) has the best bang for my buck at that resolution will get my money.
Yup, the 9600 is more like 7+ years old and only supports up to DirectX 10. If that has suited your needs thus far you may not need to be first in line for this!
They are comparing it to a current generation Nvidia card in the $150 - $170 price range and showing the results of performance/watt at that range for this card that isn't optimized yet. In short, the card will be in this price range with this new GCN architecture, FinFET, at a massive drop in power consumption and all the benefits in performance it will offer.
This isn't a pissing contest about cherry picking a card for the hell of it. It's about specific retail pricing, perf/watt and power consumption.
Dude, it's just a very generic preview to kinda 'prove' that they are taking power consumption seriously. I mean think about it, AMD has taken MAJOR heat about their lack of efficiency in their recent GPU designs. Obviously they have heard this and are aware, and are making power consumption a 1st class citizen in terms of priorities. They are just trying to put some evidence out there that, "Yes, we hear you, we are working on this issue..."
I'm glad the power numbers should be much better... but this article (while somewhat interesting) doesn't have enough information. It's exciting that we'll get genuinely new GPUs this year from AMD and NV, with 10-bit color, HDMI 2, and lower power consumption... but I think we all pretty much knew that anyway.
The big takeaway for me was that we will not see mid-to-high-range GPUs launch until the autumn at the earliest. Secondly, for all the talk about Polaris/Arctic Islands being just a shrink of Fiji (tons of it in various forums), the article fleshed out that the changes are the biggest in at least 4 years. And then there's a ton of different stuff, like the new primitive discard accelerator, etc.
Yes, I would have liked to see more, but to pretend that we knew everything is either a lazy claim or a testament that you didn't read it carefully enough.
May this be the last generation to suffer from the 275W+ madness that is Hawaii. With the generational improvements since then (Tonga and Fury) plus a new manufacturing process, hopefully TDPs can stay reasonable. I also hope we never see DDR3 graphics cards again, and that GDDR5 can make its way down to the lower end of graphics cards.
Exciting to have a really new generation of cards coming. Prices will probably be really high because of demand at first. And a card with 950 performance at sub-750 Ti power levels sounds sweet, even if it's an AMD-tilted benchmark. Imagine what a real 750 Ti replacement from NVIDIA could do.
Yeah, it would be nice to see DDR3 on graphics cards go away, but unfortunately I doubt that will happen entirely for quite some time. Blame it on the OEMs making cheap cards/laptops, though. It's not really AMD/nVidia's issue. I mean, AMD could go ahead and remove the DDR3 memory controller capability, but the only thing that would come of that is that the OEMs would just use an nVidia GPU instead.
I don't understand why they compare their future-gen card with the current-gen competitor's card? Obviously Nvidia would enjoy the same generational benefits with Pascal.
Uhh, because they don't have a next-gen Nvidia card... Perhaps they could have compared to their own last gen, but really it doesn't matter; it was just a very rough, general demo. The point is NOT to try to predict the power consumption or performance of the future Polaris GPUs based on this demo, but to show that they have made a huge relative improvement, that is all.
They could really compare it to their own, instead of showcasing it as some sort of "advantage" over a competitor, which is just not a fair comparison at all.
The smaller parts are chosen for 14nm because they are more likely to be paired with CPU cores in an APU in the future; that's very simple to guess.
So the bigger GPU cores, which aren't going to make it into APUs in the next, say, 4 years, aren't done on 14nm at a manufacturer not very practiced in big GPU dies (GF or Samsung). The more fragile goal of big GPU cores will be handled by the company that has been a pro at doing big GPUs for longer: TSMC.
And AMD gets better at 14nm before doing the first APU on that node.
Further, the change to the hardware scheduler should take some work off the driver, OS, and CPU.
Has there never been a hardware scheduler in GCN or Cayman VLIW4? I saw charts of Kepler under DX11 not being far behind Mantle when paired with weaker CPUs, but this is an eye-opener to me.
Fundamentally asynchronous dispatch is achieved by having the GPU hide some information about its real state from applications and kernels, in essence leading to virtualization of GPU resources. As far as each kernel is concerned it’s running in its own GPU, with its own command queue and own virtual address space. This places more work on the GPU and drivers to manage this shared execution, but the payoff is that it’s better than context switching.
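To make the idea concrete, here's a toy CPU-side sketch (plain Python, nothing GPU-specific; the round-robin policy and queue layout are illustrative assumptions, not how GCN's scheduler actually arbitrates): each client believes it owns its own command queue, while a single scheduler interleaves the work instead of context switching between whole applications.

```python
import queue
import threading
import time

NUM_CLIENTS = 3
virtual_queues = [queue.Queue() for _ in range(NUM_CLIENTS)]

def client(idx):
    # Each client only ever touches "its own" queue, as if it had the GPU to itself.
    for job in range(4):
        virtual_queues[idx].put(f"client{idx}-job{job}")
    virtual_queues[idx].put(None)  # sentinel: this client is finished

def scheduler():
    # One shared executor drains all virtual queues round-robin,
    # interleaving work items rather than context switching between clients.
    done = [False] * NUM_CLIENTS
    while not all(done):
        for idx, q in enumerate(virtual_queues):
            if done[idx]:
                continue
            try:
                item = q.get_nowait()
            except queue.Empty:
                continue
            if item is None:
                done[idx] = True
            else:
                time.sleep(0.01)  # stand-in for actually executing the work
                print("executed", item)

threads = [threading.Thread(target=client, args=(i,)) for i in range(NUM_CLIENTS)]
for t in threads:
    t.start()
scheduler()
for t in threads:
    t.join()
```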
"...will support the latest DisplayPort and HDMI standards."
What about HDMI-CEC, part of the HDMI 1.0 standard that almost no PC hardware seems to support? I doubt there's much chance of getting it supported at this point, but do you think you could at least pin down AMD/RTG (or NVIDIA, for that matter) for a straight answer about why they never implemented it?
I know it's not important for most PCs, but it'd be helpful for home theater systems, and might even be handy for things such as a monitor with built-in speakers.
Probably completely off-topic, but since the slide mentions stars as being the most efficient photon generators in the universe....
How does that make sense? The sun seems pretty inefficient, in terms of how much energy is used to generate light, and how most of the energy is released as heat rather than light.
Or is it the fact that it uses nuclear fusion rather than fission, and that is the most efficient source of energy generation?
Everything in the electromagnetic spectrum is photons; they never said visible light. And fusion is basically the most efficient mass-energy conversion we can achieve apart from complete annihilation via antimatter-matter collisions.
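For a rough sense of scale, a quick back-of-the-envelope check using standard atomic masses (a sketch, not a stellar model):

```python
# Hydrogen burning (4 x 1H -> 4He) converts roughly 0.7% of the fuel's rest
# mass into energy; matter-antimatter annihilation converts 100%.
m_hydrogen = 1.007825  # atomic mass units (1H)
m_helium4 = 4.002602   # atomic mass units (4He)

mass_in = 4 * m_hydrogen
fraction = (mass_in - m_helium4) / mass_in
print(f"fusion releases ~{fraction:.2%} of rest mass")  # ~0.71%
print("annihilation releases 100% of rest mass")
```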
So we can expect the next arch to be called Positron or Antiproton, I guess xD
You do realize that emitted heat IS light, right? Visible light is only a tiny part of the full EM spectrum. The charge carrier for the electromagnetic force is the photon, and the spectrum ranges from radio waves, to infrared (what you feel as heat), to visible light, to X-rays and gamma rays.
But that's not the only thing the sun emits, it also spews out atomic nuclei.
But... yeah, I think they meant fusion being the most efficient known source of energy generation.
Photons are not charge carriers; protons and electrons are charge carriers. The electromagnetic force is a field that exists between charge carriers. Photons are massless particles with zero charge, and do not interact with other photons or electromagnetic fields. You cannot use an electromagnetic field to bend light.
The big question is whether we'll see Radeons with the same/slightly lower power consumption as current high end cards but with significant performance hikes. Getting the same/marginally better performance at a much lower power consumption isn't really that appealing to me :)
Identical performance at reduced heat output would be appealing to me. I'd be a lot more interested in discrete graphics cards if AMD or NV can produce a GPU for laptops that can be passively cooled alongside a passively cooled CPU. If that doesn't happen soon, I'd rather continue using Intel's processor graphics and make do with whatever they're capable of handling. Computers with cooling fans aren't something I'm interested in dealing with.
The Celeron N3050 in the refreshed HP Stream 11 & 13 doesn't require a cooling fan. Under heavy gaming demands, the one I own only gets slightly warm to the touch. Based on my experiences with the Core M, I do think it's a decent processor but highly overpriced and simply not deserving of a purchase in light of Cherry Trail processors' 16 EU IGP being a very good performer. I do lament the idea of manufacturers capping system memory at 2GB on a single channel and storage at 32GB in the $200 price bracket since, at this point, it's unreasonable to penalize performance that way and force the system to burn up flash storage life by swapping. At least the storage problem is easy to fix with SD or a tiny plug-and-forget USB stick that doesn't protrude much from the case.
I think that a ~2 watt AMD GPU could easily be added to such a laptop to relieve the system's RAM of the responsibility of supporting the IGP and be placed somewhere in the chassis where it could work fine with just a copper heat plate without thermally overloading the design. Something like that would make a fantastic gaming laptop and I'd happily part with $300 to purchase one.
If that doesn't come to fruition soon, I think I'd rather just shift my gaming over to an Android. A good no contract phone can be had for about $30 and really only needs a bluetooth keyboard and controller pad to become a gaming platform. Plus you can make the occasional call with it if you want. I admit that it makes buying a laptop for gaming a hard sell with the only advantage being an 11 versus 4 inch screen. But I've traveled a lot with just a Blackberry for entertainment and later a 3.2 inch Android and it worked pretty well for a week or two in a hotel to keep me busy after finishing on-site. I have kicked around the idea of a 6 inch Kindle Fire though too....either way, AMD is staring down a lot of good options for gaming so it needs to get to work bringing wattage way down in order to compete with gaming platforms that can fit in a pocket.
I don't know about fanless. But if nVidia or AMD could get a meaningful CPU + GPU package in under ~30 watts TDP, that would be very interesting. I can see an ultrabook with a small fan at 1080p/medium being a great buy for Rocket League/LoL fans.
I'm surprised that RTG are comparing their upcoming tech with a GTX 950... how bizarre... why not compare it to another AMD product, as we all know how power-hungry their cards have been.
The game is also being tested at 1080p, capped at 60 FPS on medium settings... C'MON!!!
It is not surprising that they employ TSMC for the latest node, as it is the primary producer of their GPUs. GloFo will continue to produce CPUs/APUs, which could mean they might be producing them soon.
I haven't been able to read all the comments, so I don't know if this has been brought up already. We read here that FinFET will be used for generations to come. But the industry has acknowledged that FinFET doesn't look to work at 7nm. I know that's a ways off yet, but it's a problem nevertheless, as the two technologies touted as its replacement at that node haven't been proven to work either.
So what we're talking about right now is optimization of 14nm, which hasn't yet been done, and then, sometime in 2017, for Intel at least, the beginnings of 10nm. After that, we just don't know yet.
'Cept it's really more like the difference between 300 watts full system and 250 watts full system, which at 8 hours a day would be 146 kWh per year. Where I live that's about $20.
And where I live it's actually less than that, because it's a cold climate, where most of the year I would need to run the heater to cover that difference anyway. My system currently draws just about 300 watts when running maxed (that covers the monitor, network router, etc., everything plugged into my UPS as well), and that's not enough heat for that room in the winter. In the summer it would mean AC, if the house had AC... it does not (not common where I live because you would only really use it for a month or two).
So, yeah, at least for me I don't care at all if one card draws 50 more watts than the other. I'll buy whatever is the best performance/$ at the time.
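For what it's worth, the arithmetic behind that $20 checks out under a roughly 14 cents/kWh rate (the rate is an assumption; the post doesn't state it):

```python
# 50 W delta, 8 h/day, 365 days; electricity price assumed at ~$0.14/kWh.
delta_kwh = 50 * 8 * 365 / 1000
print(delta_kwh)                # 146.0 kWh per year
print(round(delta_kwh * 0.14))  # ~$20 per year
```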
But it's still GCN! Again with the same rebrands! Regardless of whether an old 7850/Pitcairn/Curacao/Trinidad only uses 50 watts, it is still an old 7870 from 2012!
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
153 Comments
Back to Article
Shadow7037932 - Monday, January 4, 2016 - link
FINALLY! A node shrink. We were stuck on 28nm for so long.boozed - Monday, January 4, 2016 - link
Maybe?boozed - Monday, January 4, 2016 - link
So other sources are suggesting a shrink of two full nodes. And not a moment too soon.However, the important power:performance comparison is against their own GCN 1.1/1.2 parts.
looncraz - Monday, January 4, 2016 - link
The article showed it using half the power (or less) than the GTX950 at the same performance.That's a game changer if AMD/RTG releases these as competitors. This means the new low end tier of dGPU performance (i.e. under the 75W power limit) may be in the area of the GTX 960 (a $200+ card).
nVidia will see these same benefits, of course, so this upcoming generation is looking quite enticing for once. This bodes well for AMD APU's, though memory bandwidth is already the limiting factor there, so the memory compression tech and any L4/HBM offerings are of primary interest there.
Pissedoffyouth - Monday, January 4, 2016 - link
I'm just thinking how if they made a quad-channel DDR4 APU on 14nm might kick some serious arsedsumanik - Monday, January 4, 2016 - link
This is such dumb move by AMD, now nvidia knows exactly when, and how much GPU they have to release, because make no mistake nvidia has thier next gen ready to go, they've just been milking profits as per usual. Second, it's gonna hurt sales...why buy amd now when you know this is in the pipe?Cellar Door - Monday, January 4, 2016 - link
And you are the expert that everyone should listen too - thanks for your wisdom straight from the strategic headquarters in your mom's basement. AMD should hire you right away..edwpang - Monday, January 4, 2016 - link
Why and how do you think nVidia is not working on die shrink already?eanazag - Tuesday, January 5, 2016 - link
It is not a dumb move to announce high level details and plans along with a product demonstration. Nvidia and Intel has been doing just that for years and it hasn't helped AMD at all. It helps those people preparing for the next generation hardware design laptops and get interested in AMD products. Equal performance at about half the wattage is a good marketing demo.The other aspect you're forgetting is that AMD needs to give investors something worth investing in and the demo does just that. That demo for investors makes even better business sense. You can here the crickets chirping from the CPU division, so the GPU division needs to continue to float the company. This does it for a few months.
levizx - Wednesday, January 6, 2016 - link
Would you believe them if they said nothing's in the pipeline?If you do, you are dumb, period.
if you don't, then not announcing it would be dumb as it does nothing good to either consumers or investors - in other words you're dumb to think they are dumb.
sna1970 - Tuesday, January 19, 2016 - link
people buy new pcs with GPU cards all the time. and the dont wait 6 months to play a game.JonnyDough - Monday, January 25, 2016 - link
GPUs take years of development. It isn't going to hurt anything.JKflipflop98 - Tuesday, January 26, 2016 - link
You have seriously no idea how the industry operates if you think Nvidia has some "future tech" they're just keeping for a rainy day. Complete. Ignorance.Computer Bottleneck - Tuesday, January 5, 2016 - link
Kaveri was actually quad channel, but AMD only enabled two channels:http://www.anandtech.com/show/7702/amd-kaveri-docs...
plonk420 - Monday, January 18, 2016 - link
supposedly Zen may include HBM, bypassing slow DDR4Refuge - Monday, January 4, 2016 - link
As exciting as the thought is, don't expect to see an APU with HBM this year. AMD was clear when they first made their HBM announcement that APU's were probably doing going to get the treatment until HBM3 or possibly even HBM4. :-/Peter2k - Monday, January 4, 2016 - link
Maybe not from AMD (though I think I've read the high end card will be getting HBM, and the smaller cards normal RAM)NVidia wanted to go HBM as well
Gonna see something more interesting this year; hopefully
jasonelmore - Tuesday, January 5, 2016 - link
I dont know about same performance. the demo was highly controlled. The key point being, they used vsync. The 950 could very well be much faster, but using more power. We will never know until they show us that same demo with vsync off.Alexvrb - Saturday, January 9, 2016 - link
If that was the case and it was hitting the vsync cap much more easily on the 950, that would be reflected in the power numbers to a larger degree.But as others have pointed out both vendors are going FinFET so I suspect we'll see large leaps like this for both manufacturers. Further, pricing is going to play a big role in how attractive any given card is, and the competition will greatly benefit consumers in this regard!
Creig - Monday, January 4, 2016 - link
Why are people so hung up on power:performance numbers? If you're into enthusiast gaming (like most people here) you should care first about $:performance. Power:performance should be somewhere down around 5 or 6 on your list.Friendly0Fire - Monday, January 4, 2016 - link
Because all things considered you can't really make a card drain much more than 250W or so. If you get twice the performance per watt, then you've just doubled the maximum computing power a single GPU can have.Samus - Monday, January 4, 2016 - link
That's why I gave up SLI. Having two 200+ watt GPU's and a CPU pumping basically make your PC a space heater. Since my house doesn't have zones for AC this making summer time gaming when it's 80 degrees outside unacceptable. Ideally a PC shouldn't produce more than 200 watts while gaming, and laptops have trouble doing half that without some creative thermal design.Mainstream GPU's need to take a note from the GTX 750Ti, a card that could run off the PCIe bus without any additional power, while still playing just about any game at acceptable peformance and detail.
coldpower27 - Monday, January 4, 2016 - link
Totally agree. You can get a portable AC to dump the heat back outside again, but that means a whole lot of energy used for gaming. So much better to have power efficient performance.Hoping the GTX 970 successor doubles performance at the same thermal envelope. Pretty much would mean GTX 980 TI performance at GTX 970 power and pricing. Would love that! 14/16nm could definitely make that possible.
smilingcrow - Monday, January 4, 2016 - link
It's a primary metric for mobile GPUs and a big deal for fans of quiet GPUs also.Arnulf - Monday, January 4, 2016 - link
Because not all of us are 16 any more. I want performance that is good enough for what I do, I want the hardware to be as quiet as possible and I can afford that.Mondozai - Monday, January 4, 2016 - link
250 watt GPUs are not noticably louder than the 165 W ones. Take a look at the 380X reviews for one, or the 980 Ti ones.Arnulf - Monday, January 4, 2016 - link
... and this is relevant ... how exactly ?dsumanik - Monday, January 4, 2016 - link
because you just said:"I want the hardware to be as quiet as possible and I can afford that"
He's pointing out the difference in power envelopes (performance/watt) doesnt have the impact on quiet computing that you are suggesting it does.
Seem pretty relevant to me smart guy
RafaelHerschel - Monday, January 4, 2016 - link
A small and silent system in the living room.Faster cards at the maximum practical power requirement.
Shorter cards (less cooling) offer more flexibility when it comes to choosing a case.
A general dislike for inefficiency.
DominionSeraph - Monday, January 4, 2016 - link
My space heater is 1000W. It can bring my room to 80F when it's below freezing outside. I don't use it in the summer.I use my computer in the summer. There's a reason I don't it to take 1000W when it's already 90F in here.
My 89W Athlon X2 5200+ and 8800GTS used to noticeably warm the room. I feel sorry for anyone who bought a FX-9590 and 390x.
Cinnabuns - Monday, January 4, 2016 - link
Power:performance translates into $:performance if you're the one footing the utility bill.nikaldro - Tuesday, January 5, 2016 - link
This is a common myth. Unless your PC drinks A LOT of Watts, your power bill won't change much.HollyDOL - Tuesday, January 5, 2016 - link
Depends where you live (cost per MWh), how long do you run the PC and how loaded it is... It ain't that hard to calculate... where I live I pay ~ 62Eur per MWh (75 with tax)... so running 600W power hog vs 300W 8 hours a day (wife@home, so prolly it's even more) puts you on 108 vs 54 Eur a year (plus tax) on computer alone. It's not tragic, but also not that little to just plain neglect it...Peter2k - Monday, January 4, 2016 - link
Because AMD runs too hot to be cooled easily compared to NVidiaWell high end cards anyway
Less heat = less noise/or more frequency (=more performance)
Ramon Zarat - Monday, January 4, 2016 - link
Thanks to the destruction of the middle class, the erosion of purchasing power and price rise of all sources of energy in the last 40 years, the Watt per frame equation is actually more important than ever. Unless you are part of the privileged 1% and wipe your ass with a Benjamin...Spoelie - Tuesday, January 5, 2016 - link
Actually, no.You may want to watch this if you want to improve your world-view beyond that of a chimp.
https://t.co/kpnCsLDidb
anubis44 - Thursday, January 14, 2016 - link
I watched your TED Talk, and I don't see how what Ramon Zarat said is refuted in this TED Talk.Obviously, when he refers to the 'destruction of the middle class', he means the one in the United States. And he's right. there has bee a net destruction of good-paying, solid, middle class jobs in the United States (and Canada, where I live, for that matter). He's also right that the optimistic goal espoused after WWII to use nuclear power to produce cheap energy for all has been completely co-opted by a corrupt cabal of wealthy manipulators, and for some mysterious reason, we're STILL burning fossil fuels to produce most of our energy (including transportation). Of course, the expensive energy economy these greedy fools have tried to impose on us is unsustainable, and is unravelling at the seams now that so many countries are dependent on oil revenues, and have allowed their manufacturing to flow to China, there is no solidarity among oil producers, and all they're ALL pumping the stuff out of the ground as fast as they can because they're all dependent on the oil revenues to survive. Either most of these countries get out of the oil business and realize they need to have a manufacturing-based economy once again while they can, or we'll descend into a desperate, anarchic world, with countries simply invading each other for oil in order to try to maintain their falling oil incomes by controlling more and more of the production. Disgusting.
ASEdouardD - Tuesday, January 5, 2016 - link
Like Spoelie said, kinda, the world as a whole is living much more materially comfortably than 40 years ago, thanks in a large part to the rise of Asian middle-classes and a significant improvement in South America. Purchasing power and living standards are also higher in Europe and the US than they were 40 years ago, even though the gains from economic growth has been disproportionately concentrated in the hands of the richest, especially in the US.Now, there is clearly a stagnation of income for the American middle-class, but things are not really going worse than they were 40 years ago.
We can debate the economic fortunes of the US and Europe in the last 10 years, but the world average is much, much better than it was in 1975.
boozed - Monday, January 4, 2016 - link
If your parents are paying for your electricity and air conditioning, sure, go ahead and ignore power consumption.Meanwhile, in the real world...
looncraz - Tuesday, January 5, 2016 - link
Meanwhile, in the real world... people who pay their own bills know that a 40W GPU difference is effectively zero.I replaced all of my lighting with LEDs and saved an estimated average of 500W 24/7. The extra 40W, or even 140W, under load for the few hours of gaming a day any normal person has time for will not make any type of impact on the electric bill that is noticeable beyond the noise of a two degree temperatures swing outside.
ASEdouardD - Tuesday, January 5, 2016 - link
It's far from my main concern, but power-performance does impact the experience. I love the fact that NVidia's GPUs are less power hungry and generate less heat at the moment than AMD gpus. It generally makes for quieter and more overclockable cards.grant3 - Monday, January 11, 2016 - link
As an 'enthusiast hobbyist', I like to build fanless computers. In such a situation, performance:watt is the primary constraint on gaming performance.I'm not sure what your list looks like, but IMO "$:Performance" should be a lower priority than power consumption for 'enthusiast gamers', simply because more heat puts physical constraints on how powerfully a system can perform.
pugster - Monday, January 4, 2016 - link
Despite that AMD and Nvidia being stuck on 28nm, Nvidia was able to make decent mobile GPU's via Maxwell. AMD just didn't bother to concentrate their efforts in mobile and low powered GPU's.looncraz - Tuesday, January 5, 2016 - link
AMD had a significant investment to move onward with 20nm and was unable to backport all of the features and improvements due to time and staffing limitations. nVidia, meanwhile, was able to throw more manpower at the situation to handle the situation much more gracefully.AMD's next issue was an underestimation of nVidia's performance improvement, which resulted in AMD's need to push the clockrates on their GPUs beyond the peak efficiency band. You can get ~80% of the performance of a GCN card at about half the power consumption, which the R9 Nano illustrates wonderfully. If AMD's estimate had been accurate, we'd be viewing things differently:
Imagine the 7970 releasing at 800Mhz and overclocking to 1.1GHz, and the R9 290/X having a base clock of 850MHz, and hitting 1.2GHz... Changes the dynamic entirely. nVidia outperformed expectations, AMD delivered exactly what was expected... then had to push to match nVidia.
Scopezz - Friday, January 8, 2016 - link
Best Analysis ever. +1zodiacfml - Wednesday, January 6, 2016 - link
Good for smaller devices (and I'm excited) but I doubt the high-end. It may use less power but the performance might only be a small increment over the previous high-end GPUsNinhalem - Monday, January 4, 2016 - link
I'm looking forward to what improvements the Polaris family will make to HBM on the unannounced high end cards.Shadow7037932 - Monday, January 4, 2016 - link
Indeed. I'm thinking they'll likely increase capacity to say 6-8GB as 4GB can start to become limiting especially at very high resolutions.jjj - Monday, January 4, 2016 - link
You are killing me with the RTG, stop saying it every 4 words.....jjj - Monday, January 4, 2016 - link
30 times on page 1, 41 times on page 2 and 24 times on page 3the article is unreadable
MrSpadge - Monday, January 4, 2016 - link
If that's your only critique the article may well get the literature nobel prize.Le Geek - Monday, January 4, 2016 - link
Lol, that made me crack up. Maybe because I felt the same.jasonelmore - Tuesday, January 5, 2016 - link
AMD Press has been hammering all tech journalist on the notion that they should be called "RTG" from now onowan - Monday, January 4, 2016 - link
Seriously, can't stand it. RTG is just AMD's pathetic marketing attempt to separate the GPU and moribund CPU divisions so that when they inevitably spin off the GPU division (or shut down the CPUs) the only remotely valuable piece of the company left won't be tainted by association.Mondozai - Monday, January 4, 2016 - link
RTG is more than marketing, they are moving a lot of resources under a single roof. Pay attention.owan - Monday, January 4, 2016 - link
Pay attention to what? The fact that they could have called it something with AMD in the name? The fact that they could just refer to it as Radeon? RTG is a crappy acronym and I find its presence in the article 8000 times to be incredibly distracting.TechGod123 - Tuesday, January 5, 2016 - link
It isn't a pathetic attempt. It is quite smart but you seem to irrationally hate AMD so why should I bother talking to you?prtskg - Tuesday, January 5, 2016 - link
Lol! I think reading the name RTG over and over is what's irritating the readers.fequalma - Thursday, January 14, 2016 - link
> Pay attention.hahahaha. shut up kid.
masouth - Monday, January 4, 2016 - link
Personally I would just prefer if they just say Radeon for short.I'm pretty sure anyone who's been around computers since before the merger or just follows GPU's in general will understand. We don't refer to Ford Motor Company as FMC even though that is exactly what it is. It's just Ford. There's a reason the Radeon name has stuck around this long and I think it should just be referred to as Radeon. Then again that's just my opinion, we all have one, and I'm sure others prefer the acronym.
Daniel Williams - Monday, January 4, 2016 - link
Currently RTG is what Radeon Technology Group requests to be called. That is why we refer to them as such and also why the name is brought up so often (we have to refer to them as something and their chosen name is prefered). So at this point it is less preference and more about the official branding.masouth - Monday, January 4, 2016 - link
Yeah, I know. I just don't care for it and agree with Owan as far as the marketing part. The acronym just doesn't read very smooth to me right now and you did mention it quite a few times. I know there really isn't much that you can do about that because as soon as you leave it out someone will get confused about where the information is coming from so I choose to blame Radeon for now. I'm also sure that I probably won't care as much once we have been stuck with the acronym for a bit and the edges get rounded off so that I will read it just as smoothly as any other acronym....but I still don't like it. =)Manch - Tuesday, January 19, 2016 - link
That's a bad example bc people usually prefer to say the acronym GM, not General Motors or General. Also people usually 50/50 with POS, or Piece of Sh!t instead of Chevy. Most importantly though RTG before being bought by AMD(acronym preferred) was ATI(acronym preferred. No one really referred to it as Array Technology Inc, Array Technologies Inc or as it was later changed to ATI Technology Inc which is an acronym within an acronym....at80eighty - Tuesday, January 5, 2016 - link
you should send them your evident grievances and sue for emotional damagesanubis44 - Thursday, January 14, 2016 - link
Keep dreaming.Mondozai - Monday, January 4, 2016 - link
RTGBrokenCrayons - Tuesday, January 5, 2016 - link
It's a minor, off-putting annoyance to have to see RTG over and over in an article, but I do agree with others who feel it's disruptive to read. I'd rather see AMD's people produce a product that speaks for itself and defines the group that developed it. However, I'm nor surprised to see it's AMD that's pushing the awful name. If you've read past AMD press releases, you're likely to find yourself drowning in buzzwords and clumsy-feeling, artificial excitement. They have a history of botching when it comes to effective communications which is why I'd much prefer AMD focuses on delivering competitive products at competitive prices and worry less about telling journalists how to write their articles.nathanddrews - Tuesday, January 5, 2016 - link
All I can think of when I see it is radioisotope thermoelectric generator, the primary power source of most of NASA's best toys. However, it is also applicable to AMD. Consistent, if not a bit less powerful that competing technologies, and very, very warm. ;-)jardows2 - Monday, January 4, 2016 - link
It will be interesting to see when NVIDIA will make their FinFET product announcement. Products are still a bit out, but here's hoping that the new products will give us good reason to upgrade from 2-3 year old video cards!tviceman - Monday, January 4, 2016 - link
The GTX 950 is a cut down version of Nvidia's currently-in-production least efficient chip. What a terrible comparison.rrinker - Monday, January 4, 2016 - link
Indeed. Even the GTX 960 tested by HardOCP didn't use 140 watts at full load.Pantsu - Monday, January 4, 2016 - link
The power consumption numbers were from the wall socket.mdriftmeyer - Monday, January 4, 2016 - link
Yes. The Nvidia fans are mentally challenged and don't grasp the baseline system differences between the two are just the card and the Total System Power consumption differences are highlighting the card architectures and FinFET. It seems each one comes in repeats the same mentally challenged thought and leave.DominionSeraph - Monday, January 4, 2016 - link
Using the 950 is like using the 7870XT as an example of Tahiti, or the 5830 for Cypress.Flunk - Monday, January 4, 2016 - link
Yes, would't a Geforce 960m also hit the FPS cap at medium on the X-Wing Map in Battlefront and use a lot less power doing it?Although when it comes to prerelease GPU demos pointless pretty much is par for the course. I'm excited to see how these perform when they're released, as well as Nvidia's Pascal architecture of course.
ToTTenTranz - Monday, January 4, 2016 - link
The GTX 950 uses a cut-down version of GM206, which is nVidia's latest and most power-efficient chip to date.tviceman - Monday, January 4, 2016 - link
GM206 is the least power efficient of all the maxwell chips, by a considerable amount. See the GTX 960 here in the most recent video card review: http://www.techpowerup.com/reviews/Gigabyte/GTX_98...The GTX 950 is even slightly less efficient than the GTX 960: http://www.techpowerup.com/reviews/Gigabyte/GTX_95...
Ch4os - Tuesday, January 5, 2016 - link
22% difference between GM206 and GM204? Oh no, what an mind boggling difference...@Dribble "obviously the AMD chip can only manage 60 hence the cap, but the Nvidia one can manage more."
It isn't obvious, it is one of the possibilities and also the one you chose to roll with.
"there's plenty of nvidia chips that can hit fps and use less power, particularly mobile ones."
And they are also considerably more expensive/rarer since they use hand picked chips and components and still aren't that much more power efficient.
FourEyedGeek - Saturday, January 16, 2016 - link
Still makes the comment "GTX 950 uses a cut-down version of GM206, which is nVidia's latest and most power-efficient chip to date" not true, which is what tviceman pointed out.Jleppard - Monday, January 18, 2016 - link
Moron the cap it 60 only to show at 60 hz the power each use. Amd will not show performance numbers until release.Dribble - Monday, January 4, 2016 - link
Clearly they do it to make AMD look good - there's plenty of nvidia chips that can hit fps and use less power, particularly mobile ones. Capping fps is a bigger flaw - obviously the AMD chip can only manage 60 hence the cap, but the Nvidia one can manage more. They could have been more blatant and used an OC 980Ti - that would also give you 60 fps if capped at 60 fps and use even more power!TheinsanegamerN - Tuesday, January 5, 2016 - link
Got any proof that the amd chip can't do more than 60?fequalma - Thursday, January 14, 2016 - link
You got proof that it can?beck2050 - Thursday, January 7, 2016 - link
AMD, over promise and under deliveriwod - Monday, January 4, 2016 - link
No mention of GDDR5X and HBM2?HEVC decode and encode. Do AMD now pay 50 million every year to HEVC Advance?
tipoo - Monday, January 4, 2016 - link
I'm going to cautiously turn my optimism dial up to 3. Raja has some good stuff between his ears (and we're going to also attribute any good news to Based Scott, right?), and the 28nm GPU stalemate sucked for everyone. It's certain that the die shrink will be a big boon for consumers, less certain is if AMD can grab back any market share as Nvidia won't be sitting with their thumbs up their butts either.fequalma - Thursday, January 14, 2016 - link
C-suite executives can't save floundering companies from extinction. AMD is going extinct because of an erosion of brand value, and inferior technology.Nobody gives a crap if your R9 380 is "competitive". It consumes 150W more than the competition. Inability to improve performance per watt is a fatal design flaw, and it will ultimately limit the availability of AMD's inferior design.
DanNeely - Monday, January 4, 2016 - link
Is starwars battlefront still a game with a very lopsided performance differential in AMDs favor? When I looked at what was needed for 1080p/max for a friend right after launch it was a $200 AMD card vs a $300 nVidia one. I haven't paid attention to it since then, has nVidia made any major driver performance updates to narrow the gap?cknobman - Monday, January 4, 2016 - link
Star Wars Battlefront is a game that supports the GeForce Experience and is listed on the Nvidia website as an optimized game.Friendly0Fire - Monday, January 4, 2016 - link
All that means is that Nvidia has performance profiles for their cards.jasonelmore - Tuesday, January 5, 2016 - link
yeah that doesnt matter.AMD Paid DICE thousands of dollars to prefer their architecture over nvidia when the Mantle deal went down.
Daniel Egger - Monday, January 4, 2016 - link
"primitive discard accelerator". Not sure I'd use the ambiguous word primitive in that context...looncraz - Monday, January 4, 2016 - link
Yeah, I first read "Primitive" as "Crude" until I realized they were talking about graphics primitives.Daniel Egger - Monday, January 4, 2016 - link
AMD, err, RTG -- if you're reading this: Please take my free recommendation to use the plural of primitive, i.e. primitives discard accelerator, instead. Thanks.extide - Monday, January 4, 2016 - link
Now, that, I can support!extide - Monday, January 4, 2016 - link
Primitive is not an ambiguous term in this case. It is talking about geometry early in the pipeline.Funny how so many commenters complain about this site being dumbed down lately, and then we they use some real words people also complain.
Sheesh!
DominionSeraph - Monday, January 4, 2016 - link
You don't want to use the word "primitive" anywhere near a description of your next generation tech product. This is a marketing slide by AMD, not a whitepaper. So please take your stupidity somewhere else.BehindEnemyLines - Monday, January 4, 2016 - link
The audience isn't the average Joe. It's a technology submit, so the slides are geared for someone who understand a bit more technical jargon.zeeBomb - Monday, January 4, 2016 - link
14nm Finfet. Awesome.bodonnell - Monday, January 4, 2016 - link
Finally! We've been stuck at 28nm for too long. I hope this finally gives me a reason to upgrade my 4 year old GPU. I'm thinking about switching my main monitor from 1080p to 1440p this year and whoever (AMD or Nvidia) has the best bang for my buck at that resolution will get my money.Yaldabaoth - Monday, January 4, 2016 - link
That's where I am, too. I have a Zotac 9600 [something] that is begging to be retired. 1440p and (DirectX 12-compatible drivers) soon!Mondozai - Monday, January 4, 2016 - link
If you're still using a 9600 and haven't found a reason to upgrade thus far you never will.bodonnell - Monday, January 4, 2016 - link
Yup, the 9600 is more like 7+ years old and only supports up to DirectX 10. If that has suited your needs thus far you may not need to be first in line for this!Birra - Monday, January 4, 2016 - link
Medium preset at 1080p? Why did they not compare it to a GTX 750 Ti with the same 60fps lock as proof?
AMD.... AMD....
mdriftmeyer - Monday, January 4, 2016 - link
They are comparing it to a current-generation Nvidia card in the $150 - $170 price range and showing the performance/watt results at that range for a card that isn't optimized yet. In short, the card will be in this price range with the new GCN architecture and FinFET, at a massive drop in power consumption and with all the performance benefits it will offer.
This isn't a pissing contest about cherry-picking a card for the hell of it. It's about specific retail pricing, perf/watt and power consumption.
extide - Monday, January 4, 2016 - link
Dude, it's just a very generic preview to kinda 'prove' that they are taking power consumption seriously. I mean, think about it: AMD has taken MAJOR heat about the lack of efficiency in their recent GPU designs. Obviously they have heard this and are aware, and are making power consumption a first-class citizen in terms of priorities. They are just trying to put some evidence out there that, "Yes, we hear you, we are working on this issue..."
andrewaggb - Monday, January 4, 2016 - link
I'm glad the power numbers should be much better... but this article (while somewhat interesting) doesn't have enough information. It's exciting that we'll get genuinely new GPUs this year from AMD and NV, with 10-bit color, HDMI 2.0, and lower power consumption... but I think we all pretty much knew that anyway.
Mondozai - Monday, January 4, 2016 - link
The big takeaway for me was that we will not see mid-to-high-end GPUs launch until the autumn at the earliest. Secondly, for all the talk about Polaris/Arctic Islands being just a shrink of Fiji (tons of it in various forums), the article fleshed out that the changes are the biggest in at least 4 years. And then there's a ton of different stuff, like the new primitive discard accelerator, etc.
Yes, I would have liked to see more, but to pretend that we knew everything already is either a lazy claim or a testament that you didn't read it carefully enough.
Jleppard - Monday, January 18, 2016 - link
Well, where is any Nvidia information, except car parts with fake chips?
TallestJon96 - Monday, January 4, 2016 - link
May this be the last generation to suffer from the 275W+ madness that is Hawaii. With the generational improvements since then (Tonga and Fury) plus a new manufacturing process, hopefully TDPs can stay reasonable. I also hope we never see DDR3 graphics cards again, and that GDDR5 can make its way down to the lower end of graphics cards.
Exciting to have a really new generation of cards coming. Prices will probably be really high because of demand at first. And a card with 950 performance at sub-750 Ti power levels sounds sweet, even if it's an AMD-tilted benchmark. Imagine what a real 750 Ti replacement from NVIDIA could do.
extide - Monday, January 4, 2016 - link
Yeah, it would be nice to see DDR3 on graphics cards go away, but unfortunately I doubt that will happen entirely for quite some time. Blame it on the OEMs making cheap cards/laptops, though. It's not really AMD/nVidia's issue. I mean, AMD could go ahead and remove the DDR3 memory controller capability, but the only thing that would come of that is that the OEMs would just use an nVidia GPU instead.
madwolfa - Monday, January 4, 2016 - link
I don't understand why they compare their future-gen card with the current-gen competitor's card. Obviously Nvidia would enjoy the same generational benefits with Pascal.
extide - Monday, January 4, 2016 - link
Uhh, because they don't have a next-gen Nvidia card... Perhaps they could have compared to their own last gen, but it really doesn't matter; it was just a very rough general demo. The point is NOT to try to predict the power consumption or performance of the future Polaris GPUs based on this demo, but to show that they have made a huge relative improvement, that is all.
Mondozai - Monday, January 4, 2016 - link
Preach.
madwolfa - Monday, January 4, 2016 - link
They could really compare it to their own, instead of showcasing it as some sort of "advantage" over the competitor, which is just not a fair comparison at all.
Jleppard - Monday, January 18, 2016 - link
Better to rub it in Nvidia's face; good business plan.
Jleppard - Monday, January 18, 2016 - link
You can't say that, but AMD did just show that the power saving at 60Hz blew away Nvidia's most power-efficient card.
Shadowmaster625 - Monday, January 4, 2016 - link
If it is the size of an i7-6700K, I'm not sure you can call it small. Also, at that size, wouldn't it have almost as many transistors as Tonga?
LukaP - Tuesday, January 5, 2016 - link
The 6700K is around 120mm². That is small compared to, say, 227mm² for the 950 and, if we look at the extremes, 601mm² for the Titan X.
TheinsanegamerN - Tuesday, January 5, 2016 - link
The die size of an i7 is tiny compared to guys. You may be thinking of the grey IHS, which is far larger.
TheinsanegamerN - Tuesday, January 5, 2016 - link
GPUs. Stupid autocorrect.
Fusion_GER - Monday, January 4, 2016 - link
The smaller parts were chosen for 14nm because they are more likely to be paired with CPU cores into an APU in the future; that's a simple guess. So the bigger GPUs, which aren't going to make it into APUs for the next, say, 4 years, aren't being done on 14nm at a manufacturer with little practice at big GPUs (GF or Samsung). The more fragile goal of big GPUs will be handled by a company that has been a pro at big GPUs for longer: TSMC.
And AMD gets better at 14nm before doing its first APU on that node.
Fusion_GER - Monday, January 4, 2016 - link
Further, the change to a hardware scheduler should take some work off the driver, OS and CPU. Has there never been a hardware scheduler in GCN or Cayman VLIW4?
I saw charts of Kepler under DX11 not being far behind Mantle when paired with weaker CPUs, but this is an eye-opener to me.
2010 Dec: Cayman
http://www.anandtech.com/show/4061/amds-radeon-hd-...
Fundamentally asynchronous dispatch is achieved by having the GPU hide some information about its real state from applications and kernels, in essence leading to virtualization of GPU resources. As far as each kernel is concerned it’s running in its own GPU, with its own command queue and own virtual address space. This places more work on the GPU and drivers to manage this shared execution, but the payoff is that it’s better than context switching.
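(As a rough illustration of what that quoted passage describes: the sketch below is a toy model only, not AMD's actual hardware or driver code. It shows several "virtual" command queues that each believe they own the GPU, while a simple scheduler interleaves their work instead of context switching. All names here, such as VirtualQueue and round_robin_dispatch, are invented for the example.)

```python
# Toy model of asynchronous dispatch: each client sees its own command
# queue and "virtual GPU", while a scheduler interleaves work from all
# queues. Names are invented for illustration; this is not a real API.
from collections import deque

class VirtualQueue:
    """What an application sees: a private command queue."""
    def __init__(self, name):
        self.name = name
        self.commands = deque()

    def submit(self, kernel):
        self.commands.append(kernel)

def round_robin_dispatch(queues):
    """Crude stand-in for a hardware scheduler: pull one command at a
    time from each non-empty queue instead of context switching."""
    while any(q.commands for q in queues):
        for q in queues:
            if q.commands:
                kernel = q.commands.popleft()
                print(f"GPU executes {kernel} from {q.name}")

graphics = VirtualQueue("graphics queue")
compute = VirtualQueue("compute queue")
graphics.submit("draw_scene")
graphics.submit("post_process")
compute.submit("physics_step")

round_robin_dispatch([graphics, compute])
```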
Halulam - Monday, January 4, 2016 - link
Wait, what??? Look at the comparison screen: the Core i7-4790K isn't supposed to support DDR4! Maybe a typo...
Anyway, looking forward to seeing what AMD can bring this year; hopefully they can compete well with Nvidia.
nfriedly - Monday, January 4, 2016 - link
"...will support the latest DisplayPort and HDMI standards."What about HDMI-CDC, part of the HDMI 1.0 standard that almost no PC hardware seems to support? I doubt there's much chance of getting it supported at this point, but do you think you could at least pin down AMD/RTG (or NVIDIA for that matter) for a straight answer about why they never implimented it?
I know it's not important for most PCs, but it'd be helpful for home theater systems, and might even be handy for things such as a monitor with built-in speakers.
medi03 - Friday, January 8, 2016 - link
Did you mean HDMI-CEC?
webdoctors - Monday, January 4, 2016 - link
Probably completely off-topic, but since the slide mentions stars as being the most efficient photon generators in the universe... how does that make sense? The sun seems pretty inefficient in terms of how much energy is used to generate light, and most of the energy is released as heat rather than light.
Or is it the fact that it uses nuclear fusion rather than fission, and that is the most efficient source of energy generation?
LukaP - Tuesday, January 5, 2016 - link
Everything in the electromagnetic spectrum is photons; they never said visible light. And fusion is basically the most efficient mass-energy conversion we can achieve, apart from complete annihilation via antimatter-matter collisions.
So we can expect the next arch to be called Positron or Antiproton, I guess xD
none12345 - Monday, January 4, 2016 - link
You do realize that emitted heat IS light, right? Visible light is only a tiny part of the full EM spectrum. The charge carrier for the electromagnetic force is the photon, which ranges from radio waves, to infrared (what you feel as heat), to visible light, to X-rays and gamma rays.
But that's not the only thing the sun emits; it also spews out atomic nuclei.
But... yeah, I think they meant fusion being the most efficient known source of energy generation.
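(For anyone wondering just how "efficient" fusion is in mass-energy terms, here's a quick back-of-the-envelope check using standard textbook atomic masses; this is an aside, not something from the article.)

```python
# Mass defect of hydrogen fusion (4 H-1 -> He-4), standard atomic masses.
m_h1  = 1.007825     # u, hydrogen-1
m_he4 = 4.002602     # u, helium-4
mev_per_u = 931.494  # energy equivalent of 1 u in MeV

mass_in  = 4 * m_h1
defect   = mass_in - m_he4
fraction = defect / mass_in
energy   = defect * mev_per_u

print(f"Fraction of mass converted: {fraction:.2%}")   # ~0.71%
print(f"Energy released per He-4:  {energy:.1f} MeV")  # ~26.7 MeV
# For comparison, matter-antimatter annihilation converts 100% of the
# rest mass, which is why it is the only thing more "efficient".
```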
fequalma - Thursday, January 14, 2016 - link
Photons are not charge carriers. Protons and electrons are charge carriers. The electromagnetic force is a field that exists between charge carriers. Photons are massless particles with zero charge, and do not interact with other photons or electromagnetic fields. You cannot use an electromagnetic field to bend light.
baii9 - Monday, January 4, 2016 - link
All this efficiency talk; you guys must love Intel.
Sherlock - Tuesday, January 5, 2016 - link
Off topic - is it technically feasible to have a GPU with only USB 3 ports, with the other interfaces tunneled through it?
Strom- - Tuesday, January 5, 2016 - link
Not really. The available bandwidth with USB 3.1 Gen 2 is 10 Gbit/s. DisplayPort 1.3 has 25.92 Gbit/s.
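(A rough pixel-rate calculation shows why that 10 Gbit/s ceiling is the problem. This is a sketch of my own, ignoring blanking intervals and link encoding overhead, so real requirements are somewhat higher.)

```python
def raw_pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"1440p60: {raw_pixel_rate_gbps(2560, 1440, 60):.1f} Gbit/s")  # ~5.3
print(f"4K60:    {raw_pixel_rate_gbps(3840, 2160, 60):.1f} Gbit/s")  # ~11.9
# 4K60 already exceeds USB 3.1 Gen 2's 10 Gbit/s before any overhead,
# while DisplayPort 1.3's 25.92 Gbit/s handles it comfortably.
```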
daddacool - Tuesday, January 5, 2016 - link
The big question is whether we'll see Radeons with the same/slightly lower power consumption as current high-end cards but with significant performance hikes. Getting the same/marginally better performance at much lower power consumption isn't really that appealing to me :)
BrokenCrayons - Tuesday, January 5, 2016 - link
Identical performance at reduced heat output would be appealing to me. I'd be a lot more interested in discrete graphics cards if AMD or NV can produce a GPU for laptops that can be passively cooled alongside a passively cooled CPU. If that doesn't happen soon, I'd rather continue using Intel's processor graphics and make do with whatever they're capable of handling. Computers with cooling fans aren't something I'm interested in dealing with.
TheinsanegamerN - Tuesday, January 5, 2016 - link
Almost all laptops have fans. Only the Core M series can go fanless, and gaming on them is painful at best.
TheinsanegamerN - Tuesday, January 5, 2016 - link
And AMD or NV making a 2-3 watt GPU wouldn't perform any better. Physically, it's impossible to make a decent GPU in a fanless laptop.
BrokenCrayons - Wednesday, January 6, 2016 - link
The Celeron N3050 in the refreshed HP Stream 11 & 13 doesn't require a cooling fan. Under heavy gaming demands, the one I own only gets slightly warm to the touch. Based on my experiences with the Core M, I do think it's a decent processor but highly overpriced and simply not deserving of a purchase in light of Cherry Trail processors' 16 EU IGP being a very good performer. I do lament the idea of manufacturers capping system memory at 2GB on a single channel and storage at 32GB in the $200 price bracket since, at this point, it's unreasonable to penalize performance that way and force the system to burn up flash storage life by swapping. At least the storage problem is easy to fix with SD or a tiny plug-and-forget USB stick that doesn't protrude much from the case.
I think that a ~2 watt AMD GPU could easily be added to such a laptop to relieve the system's RAM of the responsibility of supporting the IGP and be placed somewhere in the chassis where it could work fine with just a copper heat plate without thermally overloading the design. Something like that would make a fantastic gaming laptop and I'd happily part with $300 to purchase one.
If that doesn't come to fruition soon, I think I'd rather just shift my gaming over to Android. A good no-contract phone can be had for about $30 and really only needs a Bluetooth keyboard and controller pad to become a gaming platform. Plus you can make the occasional call with it if you want. I admit that it makes buying a laptop for gaming a hard sell, with the only advantage being an 11-inch versus 4-inch screen. But I've traveled a lot with just a BlackBerry for entertainment, and later a 3.2-inch Android, and it worked pretty well for a week or two in a hotel to keep me busy after finishing on-site. I have kicked around the idea of a 6-inch Kindle Fire too... either way, AMD is staring down a lot of good options for gaming, so it needs to get to work bringing wattage way down in order to compete with gaming platforms that can fit in a pocket.
Nagorak - Wednesday, January 6, 2016 - link
Well, you're clearly not much of a gamer, since what the Intel IGPs are capable of is practically nothing.
I will agree that the space heaters we currently have in laptops are kind of annoying, but I don't see that improving any time soon.
doggface - Wednesday, January 6, 2016 - link
I don't know about fanless, but if nVidia or AMD could get a meaningful CPU + GPU package in under ~30 watts TDP, that would be very interesting. I can see an ultrabook with a small fan @ 1080p/medium being a great buy for Rocket League/LoL fans.
smartthanyou - Tuesday, January 5, 2016 - link
So AMD announces their next GPU architecture that will eventually be delayed and then will disappoint when it is finally released. Neat!
EdInk - Tuesday, January 5, 2016 - link
I'm surprised that RTG are comparing their upcoming tech with a GTX 950... how bizarre... why not compare it to another AMD product, as we all know how power-hungry their cards have been?
The game was also tested at 1080p, capped @ 60FPS on the medium preset... C'MON!!!
mapsthegreat - Wednesday, January 6, 2016 - link
Good job, AMD! It will be a real game changer! Congrats in advance! :D
vladx - Wednesday, January 6, 2016 - link
Since the press called the previous versions GCN 1.0, 1.1 and 1.2, calling the new version GCN 2.0 sounds a lot more natural than GCN 4.
zodiacfml - Wednesday, January 6, 2016 - link
It is not surprising that they employ TSMC for the latest node, as it is the primary producer of their GPUs. GloFo will continue to produce CPUs/APUs, which could mean they might be producing them soon.
melgross - Thursday, January 7, 2016 - link
I haven't been able to read all the comments, so I don't know if this has been brought up already. We read here that FinFET will be used for generations to come. But the industry has acknowledged that FinFET doesn't look to work at 7nm. I know that's a ways off yet, but it's a problem nevertheless, as the two technologies touted as its replacement at that node haven't been proven to work either.
So what we're talking about right now is optimization of 14nm, which hasn't yet been done, and then, sometime in 2017, for Intel at least, the beginnings of 10nm. After that, we just don't know yet.
vladx - Thursday, January 7, 2016 - link
It won't be market viable below 5nm anyway, so both Intel and IBM are probably putting most of their R&D into nanotube research.
mdriftmeyer - Friday, January 8, 2016 - link
Moving to a carbon-based solution like nanotubes and other exotic materials will be the future. Silicon is a dead end.
none12345 - Friday, January 8, 2016 - link
'Cept it's really more like the difference between a 300-watt full system and a 250-watt full system. At 8 hours a day that would be 146 kWh per year; where I live that's about $20.
And where I live it's actually less than that, because it's a cold climate, where for most of the year I would need to run the heater to make up that difference anyway. My system currently draws just about 300 watts when running maxed (that covers the monitor, network router, etc., everything plugged into my UPS as well), and that's not enough heat for that room in the winter. In the summer it would mean AC, if the house had AC... it does not (not common where I live, because you would only really use it for a month or 2).
So, yeah, at least for me I don't care at all if one card draws 50 more watts than the other. I'll buy whatever is the best performance/$ at the time.
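(Those numbers check out; here's a quick sketch for anyone who wants to plug in their own figures. The $0.14/kWh electricity rate below is an assumption on my part, not something the commenter stated.)

```python
# Back-of-the-envelope yearly cost of a 50 W full-system power difference.
watts_delta   = 50     # 300 W system vs. 250 W system
hours_per_day = 8
rate_usd_kwh  = 0.14   # assumed electricity rate

kwh_per_year  = watts_delta * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * rate_usd_kwh

print(f"{kwh_per_year:.0f} kWh/year")   # 146 kWh
print(f"${cost_per_year:.2f}/year")     # ~$20
```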
fequalma - Thursday, January 14, 2016 - link
Take everything in an AMD slide deck with a HUGE grain of salt.
There is no way AMD will deliver a GPU that will consume 86W @ 1080p. AMD doesn't have that kind of technology.
No way.
FourEyedGeek - Saturday, January 16, 2016 - link
Mobile GPUs use less than that!
fishjunk - Tuesday, January 19, 2016 - link
Looks like AMD will surpass NV for the next generation of GPUs. Was there any mention of GDDR5X?
P39Airacobra - Wednesday, March 9, 2016 - link
But it's still GCN! Again with the same rebrands! Regardless of whether an old 7850/Pitcairn/Curacao/Trinidad only uses 50 watts, it is still an old 7870 from 2012!