213 Comments


  • rtho782 - Monday, August 14, 2017 - link

    First? lol
  • FireSnake - Monday, August 14, 2017 - link

    Good! Now, let us read this in peace :)
  • coolhardware - Monday, August 14, 2017 - link

    Exactly. I am VERY excited to read about this, especially since AMD has been dragging this launch out for what seems like forever.

    While reading I will also have another window open furiously refreshing http://amzn.to/2hZ9iPb (shortened URL for direct amd vega search on Amazon!) to see when they come in stock, and if we can get one before they sell out! ;-)

    WOW, just checked and NewEgg is already out of EVERY Vega SKU :-( Like 15 different models from various brands :-( Bummer and I bet 80% are miners!
  • coolhardware - Monday, August 14, 2017 - link

    BestBuy sold out of all of their SKUs as well. :-(
  • Targon - Monday, August 14, 2017 - link

    I ran into the "Out of Stock, auto-notify" on Newegg for hours... and suddenly one showed up that I could actually buy. So I hit it, and it has been in packaging for the past five hours. Amazon really messed up with the Ryzen launch, allowing far more orders than the expected number of Ryzen 7 chips, to the point where it took several additional weeks before some of them shipped out. That is why I won't order a highly anticipated item from Amazon.
  • Manch - Tuesday, August 15, 2017 - link

    I ordered the Oculus package, the $399 one, from Amazon on July 12th. They shipped the controllers two days ago; the headset is out of stock until further notice. It was in stock when I ordered. Then it was "all orders before July 15th will be filled first". Then it was "the Touch controllers are out of stock". Then the Touch controllers shipped but the headset was out of stock. Aggravating to say the least. They are one of the few that ship electronics to APO without being shitty about it or charging triple the actual cost.
  • coolhardware - Tuesday, August 15, 2017 - link

    Way to stick with it! Did Best Buy complete your order? Fingers crossed for you :-)
  • rtho782 - Monday, August 14, 2017 - link

    I think the GTA5 1440p benchmarks and the BF1 load power consumption graphs made me laugh the most.

    I guess it's a pretty effective space heater. Maybe they want to discourage crypto mining by using more power to make it unprofitable.

    It's a shame, we need more competition. *sigh*
  • Ratman6161 - Monday, August 14, 2017 - link

    295 watts..?!?!?! Currently my whole system only pulls about 225 watts even when torture testing, and that testing only covers CPU and RAM - other articles say my RX 460 is about 104 watts during torture testing. So if I were stress testing CPU, RAM, and video card all at once I'd be at around 329. Not a gamer myself, but it's hard for me to imagine over 500 watts for my system. Just doesn't make any sense in this day and age.
  • Kratos86 - Monday, August 14, 2017 - link

    Hmm, you either don't understand how crypto mining works or what a joke is. Cryptominers generally turn the GPU clock down because it isn't very useful in these situations; even bandwidth isn't as relevant as latency. These cards, with a bit of tweaking, are getting 35 MH/s at $35 for $500. The Vega 56 blows the 64 away, but both GPUs beat the RX 580 in terms of bang for buck, and that's considering they haven't been optimised for mining performance yet.

    If these things hit 40 at $500 a piece, two for $1000, that's 80 MH/s for less than a Titan Xp, which at a cost of $1370 does around 37 MH/s. Saving $50 a year on power consumption and paying double the price for that privilege is not a very intelligent way to do things.

    Suffice it to say, if you want one of these at the prices they are supposed to be selling at, you might get lucky and find one sometime this year, because you are not finding these GPUs at these prices anytime soon - and that's if they aren't sold out at any price. Unless AMD does something to get this in stock and keep it in stock, the next few months are going to suck if you want one at a price that isn't inflated.

    I guess AMD could have worse problems than "cryptominers keep buying our GPUs faster than we can make them", but it's still a situation they need to remedy.
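The hashrate-per-dollar argument above can be sketched with the commenter's own figures (the 40 MH/s at $500 for Vega and 37 MH/s at $1370 for the Titan Xp are the quoted claims, not measured values):

```python
# Hashrate per dollar, using the figures quoted in the comment above
# (the commenter's claims, not benchmarked numbers).
def mh_per_dollar(mh_s: float, price: float) -> float:
    return mh_s / price

vega = mh_per_dollar(40, 500)       # 0.080 MH/s per dollar
titan_xp = mh_per_dollar(37, 1370)  # ~0.027 MH/s per dollar

print(round(vega / titan_xp, 1))    # → 3.0, i.e. ~3x the hashrate per dollar
```

On those numbers, a $50/year power saving takes years to close the gap in up-front cost, which is the commenter's point.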
  • Kratos86 - Monday, August 14, 2017 - link

    That is at 200 Watts, not $35. Anandtech, reporting on the world of tomorrow, without an edit button.
  • mapesdhs - Monday, August 14, 2017 - link

    If/when AT finally does revamp the forums to enable editing, it's going to be a bigger forum headline splash than Threadripper. :D I'd post with typos just so I could delight at being able to edit it ten seconds later. 8)
  • AndrewJacksonZA - Tuesday, August 15, 2017 - link

    @Kratos86: You made me chuckle. :-)
  • Lolimaster - Monday, August 14, 2017 - link

    Proof of stake is almost here; Ethereum mining is basically done. Unless you get the GPUs for free, you don't have much more than 4 months for ROI.
  • Notmyusualid - Monday, August 14, 2017 - link

    That's right, Ethereum was the only crypto-coin out there.

    Fool.
  • Ryan Smith - Monday, August 14, 2017 - link

    As a heads up, this article is very much not done, as Nate and I had to rush to cover everything in 3 days. The performance data is up there, along with bits and pieces on the architecture.

    I have probably another 5000 words on the architecture left to draft and revise, and I hope to get that added in the next couple of days.

    In the meantime I apologize for the state of things, and we're continuing to work on the article to wrap things up.
  • FireSnake - Monday, August 14, 2017 - link

    Take your time ... we will wait :)
  • rtho782 - Monday, August 14, 2017 - link

    Ha, understandable, and I'd much rather have this than nothing :) Your unfinished reviews are generally more in-depth than most places' complete reviews.

    How's the GTX 960 review coming tho? :P
  • ddriver - Monday, August 14, 2017 - link

    No longer doing folding at double precision?
  • Ryan Smith - Monday, August 14, 2017 - link

    Since no one has shipped a consumer GPU with FP64 performance better than 1/16 of FP32 in a few years now, there's not much of a need for an FP64 benchmark.
  • ddriver - Tuesday, August 15, 2017 - link

    More like "since nvidia castrates FP64 performance like crazy" and "AT sure doesn't want to make nvidia look bad"...
  • Manch - Tuesday, August 15, 2017 - link

    and here we go with the shill comments....
  • ZeDestructor - Tuesday, August 15, 2017 - link

    AMD would look just as bad given they cut down FP64 just as much on modern cards.
  • zoxo - Monday, August 14, 2017 - link

    FP32 works well for MD tasks; there is not much need for double precision atm.
  • mapesdhs - Monday, August 14, 2017 - link

    If you need FP64, just stuff in some cheap, used GTX 580s. Or hunt for an original Titan or two. Actually, a Quadro 6000 is also a pretty decent buy for FP64.
  • abrowne1993 - Monday, August 14, 2017 - link

    I imagine a lot of people are very happy to have something when the embargo lifts even if it's not complete yet.
  • vanilla_gorilla - Monday, August 14, 2017 - link

    Thank you for not making us wait for the whole thing, that's awesome.
  • Xajel - Monday, August 14, 2017 - link

    Take your time with it...

    On a side note, I would love to see a revisit of the current status of GPUs and iGPUs (& CPUs & APUs) for media-centric usage (like the old HTPC GPU roundups), but covering modern media scenarios such as streaming, 4K, x265/HEVC, Plex, transcoding, encoding, etc...
  • romrunning - Monday, August 14, 2017 - link

    6th para - "seceded" should be "ceded" - AMD basically yielded the high-end market to Nvidia, it didn't "withdraw" to Nvidia. :)
  • Dr.Neale - Monday, August 14, 2017 - link

    Hear, hear!

    Indeed, this needed to be noted.

    Thank you for doing so!
  • Otritus - Monday, August 14, 2017 - link

    On the first page, the AMD Radeon RX Series Specification Comparison chart says Vega 56 has 3585 shaders instead of 3584 shaders.
  • Otritus - Monday, August 14, 2017 - link

    The GTX 1070 MSRP is $349, after the price drop that followed the 1080 Ti launch.
  • Targon - Monday, August 14, 2017 - link

    I'd be surprised if we don't need another two to three months to see how the Vega performance ends up with the expected driver updates. Every high end card from AMD and NVIDIA gets at least one big driver update to add 5-10 percent performance in games.
  • Cellar Door - Monday, August 14, 2017 - link

    Thanks for your excellent work Ryan.
  • redwarrior - Tuesday, August 15, 2017 - link

    Did you note that the dynamic cache controller that AMD has touted is disabled at this point, since the drivers have not been perfected to operate the cache efficiently? Once they solve that issue, performance will jump anywhere from 10 to 15%. There is also some feature of the shaders that is still disabled. All in all, if we exercise a little patience, Vega 64 should be a credible performer, about halfway between the 1080 and 1080 Ti. I hope that when the drivers are more mature, people will do further reviews of these Vega offerings.
  • AndrewJacksonZA - Tuesday, August 15, 2017 - link

    Thank you Ryan.
  • ddriver - Monday, August 14, 2017 - link

    It seems amd still has a long way to go with the drivers, despite vega being so late... Judging by the battlefield result, a title that is both optimized for amd rather than exclusively for nvidia and has been optimized at the driver level by amd, this is where vega's actual graphics capabilities lie in hardware terms: between the 1080 and the Ti.

    The good (for amd) news and bad (for people like me) is that vega looks like it has exceptional compute performance, which means prices will no doubt go through the roof because of the mining craze. This is not so bad for gamers, since nvidia seems like the better value anyway, but people who need compute for things other than mining will have to wait a while before vega can be bought at a good price.
  • TheinsanegamerN - Monday, August 14, 2017 - link

    If we have to rely on AMD optimizing every single game for Vega, we will never see its true potential. AMD couldn't manage to do it right in a year and some change.
  • ddriver - Tuesday, August 15, 2017 - link

    nvidia is pretty much doing that; they spend a tremendous amount of money doing others' work, money that amd is not in a position to spend
  • Scabies - Monday, August 14, 2017 - link

    SR-IOV?
  • bcronce - Monday, August 14, 2017 - link

    Exactly. I REALLY want to run my games in a VM guest.
  • sutamatamasu - Monday, August 14, 2017 - link

    In the RTG slides on the architecture side, Vega has some MB of SRAM. Can you tell me what this SRAM is used for?
  • DanNeely - Monday, August 14, 2017 - link

    Various caches and internal buffers; on-die memory is normally SRAM because it's several times faster than DRAM. (DRAM is several times denser since it only uses 1 transistor/bit vs ~6 for SRAM, which is why it's used for main memory, where total capacity is more important - and where the data bus is the main latency source anyway.) I'd be curious what the breakdown is, since only 4MB of it is in the L2 cache.
  • sutamatamasu - Monday, August 14, 2017 - link

    Yes, same here. As we all know, GCN 5 has no change in L2 cache size, but I am curious - AMD quotes the SRAM and L2 cache sizes differently.
  • extide - Monday, August 14, 2017 - link

    A lot of it is going to be in the low-level L1 caches and structures local to the shaders -- there are a lot of shaders, so it adds up fast. GCN 5 does have double the L2 cache, at least according to this article: 4MB vs 2MB. AMD says there is a total of over 45MB of SRAM on there, which is pretty impressive for a GPU!
  • ratbuddy - Monday, August 14, 2017 - link

    I'm disappointed that Vega Frontier results were not included in the benches :-/
  • Ryan Smith - Monday, August 14, 2017 - link

    AMD did not sample that card, and there's not much of a reason for us to include it now when the RX Vega is faster.
  • Nfarce - Monday, August 14, 2017 - link

    Another Fury X fail. You'd have to be a hard core AMD fan to buy this over a GTX 1080, and that's not even taking into consideration the horrid power use compared to the 1080. Isn't that what AMD fans tell us is so important when comparing Ryzen to i7 CPUs in core/watt performance? Amazingly they are silent here.
  • IchiOni - Monday, August 14, 2017 - link

    I do not care about power consumption. Only poor people care about power consumption. I will be purchasing an air cooled Vega 64.
  • Hurr Durr - Monday, August 14, 2017 - link

    So Barnum was right in the end.
  • MatthiasP - Monday, August 14, 2017 - link

    Only poor people buy the air cooled Vega and not the liquid one.
  • FreckledTrout - Monday, August 14, 2017 - link

    LOL, I care about power consumption because it makes my computer loud. Did you see how loud the Vega 64 is? Way, way too many dBs for me. So some not-"poor" people care.
  • Aldaris - Monday, August 14, 2017 - link

    Freckled, did you not pay attention to that graph? It's basically as loud as Nvidia's offerings.
  • sorten - Monday, August 14, 2017 - link

    If by "basically as loud" you mean that the AMD cards are 10% louder, then yes. Power draw translates to heat and, typically, noise. I would never buy the AMD cards given the power draw, but to each their own.
  • Brett Howse - Monday, August 14, 2017 - link

    dB is a logarithmic scale: 3 dB means twice as much sound energy, although we don't perceive it as that much - closer to 10 dB would be perceived as twice as loud. Vega 64 vs 1080 FE is about a 7 dB difference, which isn't "basically as loud"; it's going to be perceived as a lot louder.
  • Lazacom - Tuesday, August 15, 2017 - link

    Not quite: on the logarithmic scale, 3dB is about a 41% larger sound-wave amplitude (twice the power), and 10dB is 10x the power, which is what's perceived as roughly twice as loud. But our ears usually don't register a 1-3dB difference. On the other hand, 7dB should be audible as a difference, though it depends on which frequencies dominate; those around 1kHz are the most noticeable. If they wrote it's basically as loud, there are probably lower-frequency spikes, contrary to the 1080Ti...
  • bcronce - Tuesday, August 15, 2017 - link

    Decibels are base 10: 10dB is 10x more energy, and 3dB is roughly 2x (about 100% more) energy.
  • Dug - Tuesday, August 15, 2017 - link

    Anyone that deals with audio knows that 3dB is noticeable. 1dB is perceivable, but not usually noticed if it's constant. The issue with sound from a case is that the case can act like a speaker cabinet, so even if the card is only 3dB louder at 1m in free air, it can become quite a bit louder due to cabinet size and materials - just like putting a case under a desk can sometimes be louder than on top of it. Last is heat: if case fans have to run faster to keep the video card's heat down, that adds to perceived loudness.
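For reference, the three different dB relationships being argued over in this sub-thread can be sketched numerically (a rough sketch; the "+10 dB sounds about twice as loud" rule is a psychoacoustic approximation, not an exact law):

```python
def power_ratio(db: float) -> float:
    """Energy/power ratio implied by a dB difference (10*log10 scale)."""
    return 10 ** (db / 10)

def amplitude_ratio(db: float) -> float:
    """Sound-pressure (amplitude) ratio (20*log10 scale)."""
    return 10 ** (db / 20)

def perceived_loudness(db: float) -> float:
    """Rule of thumb: every +10 dB sounds roughly twice as loud."""
    return 2 ** (db / 10)

print(round(power_ratio(3), 2))         # → 2.0: 3 dB doubles the power
print(round(amplitude_ratio(3), 2))     # → 1.41: ~41% more sound pressure
print(round(perceived_loudness(7), 2))  # → 1.62: a 7 dB gap sounds ~60% louder
```

So the 7 dB gap discussed above is roughly 5x the acoustic power - clearly audible, even if it isn't "five times as loud".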
  • Exodite - Monday, August 14, 2017 - link

    While the power consumption of the cards, Vega 64 in particular, is disappointing I'd await 3rd party cooling systems before judging.

    The stock AMD cooling solution has been terrible since, well, forever really. And while I prefer AMD GPUs I've learned to avoid stock cards like the plague.
  • mapesdhs - Monday, August 14, 2017 - link

    Exodite, 3rd party coolers can help with the noise, but I can't see them doing much about the power behaviour.
  • Oxford Guy - Monday, August 14, 2017 - link

    This may be more of a shortcoming in the GloFo process than in the design of the chip. It would be very interesting to see how well it would do in PPW on TSMC's process.
  • Exodite - Wednesday, August 16, 2017 - link

    Of course, I was talking about the noise. :)

    The power consumption is, as I mentioned, a disappointment.

    I'd be interested in seeing how the cards do with undervolting and other tweaks, the 480 actually gained performance in some situations due to the lower power draw resulting in more headroom.
  • Oxford Guy - Monday, August 14, 2017 - link

    A vapor chamber is hardly "terrible" in terms of quality. But a high power draw + a blower = physics of noise.
  • Oxford Guy - Monday, August 14, 2017 - link

    Also, if one wants to talk about terrible stock cooling one should never forget the GTX 480.
  • mapesdhs - Monday, August 14, 2017 - link

    If one is in the UK, that would be a strange thing to do given it costs more than a 1080 Ti. It's priced 100 UKP higher than aftermarket 1080s with a 1759MHz base. Doesn't make sense. Factor in the power/noise, bit of a meh. If one is in the US where the price really is $500 (in the UK it's the equivalent of $750), well then maybe it's a bit more down to (irrational) brand loyalty, but still the power/noise issue doesn't make it an attractive buy IMO.

    The Vega56 looks far more interesting re price/performance and indeed performance, though it still has some power/noise issues. Perhaps aftermarket cooled versions will improve both cards, at least on the noise front.
  • mapesdhs - Monday, August 14, 2017 - link

    Rats, that was supposed to be a reply to IchiOni. Why can't we edit??
  • xfrgtr - Tuesday, August 15, 2017 - link

    You'd have to be a hard core AMD fan to buy this over a GTX 1080
  • Glock24 - Tuesday, August 15, 2017 - link

    Also people that care about heat and noise care about power consumption.
  • FourEyedGeek - Tuesday, August 22, 2017 - link

    Only poor people buy AMD Vega 64, buy 1080 Ti for better performance.
  • tipoo - Monday, August 14, 2017 - link

    64 vs the 1080, yes.

    56 vs the 1070 is much more appealing, 100 dollars less for the same performance, plus a discount on Freesync monitors.
  • Lolimaster - Monday, August 14, 2017 - link

    RX Vega 56 offers damn good value, smashing the 1070 for $100 less.
  • Da W - Monday, August 14, 2017 - link

    Think my Haswell + 780 rig is nearing retirement. A Vega 56 + Ryzen 1700X combo and a giant screen looks like a good replacement.
  • thartist - Monday, August 14, 2017 - link

    The 1700X won't do any better for gaming if that's what you care about, but the card surely will.
  • Lolimaster - Monday, August 14, 2017 - link

    4c/8t extra means not having to care about the CPU for a long time, on top of being able to actually do productivity things while gaming.
  • Da W - Tuesday, August 15, 2017 - link

    Actually I want a workhorse that can ALSO game from time to time. What kids do to a life.......
  • tipoo - Monday, August 14, 2017 - link

    There's a cryptocurrency specific ISA addition, eh. People wanting it for graphics may hate it, but this may see huge demand from miners.
  • extide - Monday, August 14, 2017 - link

    Yeah, noticed that one too, heh.
  • NeonFlak - Monday, August 14, 2017 - link

    So, after reading this review I'm not sure what the point is of the Vega 64 for $100 more than the Vega 56.
  • DanNeely - Monday, August 14, 2017 - link

    The same thing as the 1080 over the 1070 in Nvidia land; which has a similar price/power/performance tradeoff. It's the fully enabled higher clock speed version for people willing to pay a premium for higher performance and who're less concerned about power/dollar efficiency.
  • Jumangi - Monday, August 14, 2017 - link

    Not with the performance and terrible power/heat it has. The Vega 56 is a solid competitor to the 1070, but the 64 does very poorly against the 1080.
  • TheinsanegamerN - Monday, August 14, 2017 - link

    Enough AMD fans will buy at inflated prices to make AMD some cold hard cash, then they will lower the price in 3 months.
  • Aldaris - Monday, August 14, 2017 - link

    It's still a competitor against the 1080.
  • mapesdhs - Monday, August 14, 2017 - link

    Not when it costs 100 UKP more (UK pricing). If the US pricing is as claimed, then I guess it's down to how much one cares about power/noise.
  • xfrgtr - Tuesday, August 15, 2017 - link

    the 64 does very poorly against the 1080
  • darkfalz - Monday, August 14, 2017 - link

    Not really. The x70 is usually 75% the performance of the x80; 25% is nothing to sniff at, even if the price premium is much more than 25%.

    The 56 is something like 85-90% as fast as the 64 and significantly cheaper.

    It's a shame this GPU arch is essentially DOA. Makes the wait for Volta much longer. Then again my 1080 is still giving me great performance and I'm CPU limited a lot at 1440p.
  • mapesdhs - Monday, August 14, 2017 - link

    I think you're right. This will give NV more time to refine Volta, and it'll sustain 10x0 sales for longer, so those who want to move beyond the current 10x0 series will have to wait for something better.
  • Da W - Monday, August 14, 2017 - link

    Ryzen, Vega, Infinity fabric. The stage is set for a new fusion. Can't wait to see what their top 4-core + iGPU can do as a streamer box.
  • tipoo - Monday, August 14, 2017 - link

    It's surprising there still isn't even an APU as powerful as the PS4s GPU yet.
  • Qwertilot - Monday, August 14, 2017 - link

    They have, alas, no R&D money for that sort of 'side' project.
  • msroadkill612 - Monday, August 14, 2017 - link

    I hear nothing but good from folks who actually use AMD's APUs appropriately. The 7850K was a classic for the money.

    Importantly, they have remained in the APU business all along, and have the unique skillset to competently execute a new generation of APU.

    I wouldn't call mobile Ryzen a side project. It's a cornerstone.
  • tipoo - Monday, August 14, 2017 - link

    For a while those APUs were floating them while their standalone CPUs and GPUs struggled. Maybe they've gotten too slim for three strong tentpoles, alas, and one will always suffer.
  • Da W - Tuesday, August 15, 2017 - link

    The Bulldozer cores sucked next to the "ok" GCN iGPU. It was very bandwidth dependent, and the iGPU was starving for data most of the time. There was no point pushing another 'dozer APU. They were waiting for the Ryzen core and Infinity Fabric to feed the GPU. Vega is just launching now, and if you noticed, AMD is only making one 8-core monolithic die sold in multiple packages (1 for Ryzen, 2 for Threadripper, 4 for Epyc). They have yet to cut that 8-core die in half and integrate their new Vega core in there, which is, I believe, what R&D is doing as of this morning....
  • Yaldabaoth - Monday, August 14, 2017 - link

    So, the TL;DR is that the Vega 64 competes on (relatively) cheap computing power and perhaps 4K gaming, and the Vega 56 competes on (relatively) very cheap computing power and being a value for 1440p gaming? Neither seem to compete on efficiency.
  • tipoo - Monday, August 14, 2017 - link

    Vega 56 seems well positioned for now. 1070 performance at a decently lower price. Question is if Nvidia can/will drop that price on a whim with enough margin (with a smaller die in theory they could, but AMD is probably getting low margins on these). Vega 64 is a far less clear value prospect, in one way it's similar to the 1070 vs 1080, but with Nvidia you're actually getting the best, which 64 can't claim.
  • Jumangi - Monday, August 14, 2017 - link

    That's the big unknown. I suspect Nvidia is playing with much better margins than AMD when you compare the competing chips here. Nvidia could lower prices on the 1070 to squeeze AMD if they want and still make a good profit.
  • webdoctors - Monday, August 14, 2017 - link

    The Vega56 is so cheap for the hardware you get I wonder if its being sold for a loss. I commented earlier that I thought these chips would be selling for double what they released at, and if they're profitable at this price point AMD might have some secret low cost manufacturing technology that is worth more than their entire company right now.

    As a consumer I'm practically getting paid to take it LOL.
  • tipoo - Monday, August 14, 2017 - link

    I doubt it's at a loss, but it's probably at a very slim margin. Nvidia could potentially split the difference with a 50 dollar drop and still have the smaller cheaper die (presumably, if TSMC/Glofo cost similar).
  • Drumsticks - Monday, August 14, 2017 - link

    Great review Ryan and Nate. I totally agree with your comment at the end about where Vega was designed. Relative to Nvidia, it's a further step back in almost every metric you can measure - perf/W, perf/mm^2, absolute perf of the high-end flagship...

    You really have to hope AMD can find one more rabbit in their hat a year or two from now. Nevertheless, the Vega 56 looks like an impressive product, but you can't be happy about getting 8% more performance out of something >50% larger in silicon.
  • Morawka - Monday, August 14, 2017 - link

    Yup, and next-generation memory to boot. AMD needs better GPU designers. If not for crypto, AMD would be in serious trouble.
  • Threska - Thursday, April 4, 2019 - link

    Hello. I'm writing from the future and I bring important news about Google Stadia.

    " To make it possible on its servers, Google has combined an x86 processor (likely an Intel one) with hyperthreading that runs at 2.7GHz, with 16GB of RAM, and a custom AMD graphics chip. It’s said to uses HBM 2 and has 56 compute units, delivering enough raw horsepower for 10.7 TFlops.

    That sounds like a modified Vega 56, although it’s equally possible that it’s one of AMD’s upcoming Navi line of graphics cards."

    https://www.digitaltrends.com/gaming/google-stadia...
  • Stuka87 - Monday, August 14, 2017 - link

    So my question is: can these be under-volted like Polaris for some pretty decent power savings, and what is the power usage like when you enable AMD's Chill mode? They had stated you get about 90-95% of the performance at a significantly lower power usage.
  • tamalero - Monday, August 14, 2017 - link

    Does this mean that the entire future of Vega 64 rests in the hands of FineWine(tm) optimizations and boosts?

    Because right now Vega 64 is nothing but a disappointment.
  • Chaser - Monday, August 14, 2017 - link

    This is a letdown. I don't understand why AMD chooses to lag behind Nvidia. The market is ripe for a competitive alternative to Nvidia. AMD hasn't been it. I am very pleased with my GTX 1080 purchase in January. Hopefully, come my next GPU upgrade time, AMD will have something better to consider.
  • Stuka87 - Monday, August 14, 2017 - link

    They don't "choose" to. They had the money to either make an amazing CPU or an amazing GPU, and the CPU market is larger, so they chose to push the R&D budget into Ryzen (which has paid off big time).
  • TheinsanegamerN - Monday, August 14, 2017 - link

    They chose to split their resources between two GPUs (Polaris and Vega) rather than focusing on one line of chips. They chose to rebrand and resell the same chips for 5 years.

    AMD isn't rich, but they make quite a few boneheaded decisions.
  • Aldaris - Monday, August 14, 2017 - link

    Actually, that looks like it paid off for them in market share. Also, Polaris was always out of stock (irrelevant as to the reasons why. It's still money in AMD's pocket).
  • mapesdhs - Monday, August 14, 2017 - link

    That's a good point; whatever the buyer, a sale is still a sale. However, perhaps from AMD's POV they'd rather sell them to gamers, because when Ethereum finally crashes there will be a huge dump of used AMD cards on the market that will, at least for a time, stifle new card sales, whereas gamers tend to keep their cards for some time. Selling GPUs to miners now is certainly money in the bag, but it builds up a potential future sting.
  • mattcrwi - Monday, August 14, 2017 - link

    I would never buy a used GPU that has been run at full throttle 24/7 for months. I'm sure some people won't understand what miners do with their cards or will be enticed by the prices anyway.
  • wolfemane - Tuesday, August 15, 2017 - link

    I own a wide range of 290s and 290Xs I picked up at the end of the last mining craze for great prices, all purchased from miners. They all still work to this day with 0 issues. I've also purchased and sold 10x that quantity across the 280 - 290X range. Of those, only one failed, and Sapphire replaced it at the end of its warranty.

    I look forward to the new craze ending. Will get some great cards for dirt cheap, and a vast majority still under warranty.

    Nothing wrong with buying them.
  • nintendoeats - Tuesday, August 15, 2017 - link

    I have been running Folding @ Home on the GPU for several years now. I have yet to find any reason to believe that running a card 24/7 is a problem. What I would be more concerned about is heat cycles, which aren't an issue when you just run the card hot all the time.
  • DanNeely - Tuesday, August 15, 2017 - link

    I've been running BOINC on my GPUs since the GTX 260. With a half dozen cards totaling about 20-25 years of operation I've had 1 card fail at the 2 year mark (GT 560), and one fan fail after about a year (HD 5850). The others all lasted 3-4 years until newer gaming purchases pushed them out of my slowed box and into retirement.
  • Otritus - Monday, August 14, 2017 - link

    While they don't have the funds to truly compete with Nvidia, I do see what you mean, because in the GPU sector it seems RTG is focused on adding features rather than increasing performance while decreasing power consumption. Polaris had better perf per watt than Vega, and I hate that regression from AMD.
  • Aldaris - Monday, August 14, 2017 - link

    Why would anyone choose to lag behind? It's obviously not a choice.
  • milkod2001 - Monday, August 14, 2017 - link

    Vega is indeed a disappointment, and now we know that officially. The worst part is that NV doesn't have to lower its current GPU prices or rush next-gen GPUs.

    When can we expect next gen GPU from both camps?
  • TheinsanegamerN - Monday, August 14, 2017 - link

    Volta will come out with a 30-35% increase per category, and will sell for most of its reign unopposed. Navi will eventually come out, just before volta's replacement launches.

    AMD fell behind, and now must either rush a new generation or lag behind for half a year to get out of their current position.
  • Aldaris - Monday, August 14, 2017 - link

    Well, we don't know that officially because that's an opinion. Looks to me like NV do have to lower prices.
  • mapesdhs - Monday, August 14, 2017 - link

    Based on what? In the UK, the 1080 is 100 UKP cheaper than the Vega64/Air, while the Vega64/Liquid costs more than a 1080 Ti. NV doesn't have to do anything, at least not re the 1080 anyway. As for the 1070, perhaps a different matter, we'll see what happens with final retail pricing.

    If you're in the US though and the 64 really is $500, well maybe it might be more attractive if you're not bothered by the power/noise issues. Alas, outside the US the real consumer pricing is more of a mess.
  • vladx - Monday, August 14, 2017 - link

    AnandTech is turning into a joke with catchphrases like "Vega Burning Bright" or "The Vega Architecture: AMD’s Brightest Day". Quit trying to polish a turd and call Vega for what it really is, a crappy product that was released way too late, is slower than the competition on neutral games and extremely inefficient as well.

    I knew AT had a bias for AMD and Apple, but this is really getting ridiculous.
  • casperes1996 - Monday, August 14, 2017 - link

    Last comment section I read repeatedly stated how anti-AMD AnandTech is... Go figure.
  • nevcairiel - Monday, August 14, 2017 - link

    Something that burns bright typically puts out a lot of excess energy (heat) and eventually burns out. :p
  • coolhardware - Monday, August 14, 2017 - link

    I think that is actually why they went with that headline.

    As for vladx's comment: AnandTech is not a joke. They put a lot of hard work into what they do, and they have been doing it for decades. Anand's early work helped me build my first PC, and they have continued to do a good job after his departure to Apple.

    I for one am very appreciative of all Anandtech's efforts :-)
  • vladx - Monday, August 14, 2017 - link

    "I think that is actually why they went with that headline."

    Nice spin, only a fool would buy into that sort of bullshit reasoning.
  • coolhardware - Monday, August 14, 2017 - link

    Then I am a fool.

    Have a nice day :-)
  • sor - Monday, August 14, 2017 - link

    That was actually the very first thing that came to mind when I saw the article "burning... I see what you did there".
  • Oxford Guy - Monday, August 14, 2017 - link

    You're not the only one.
  • Ryan Smith - Monday, August 14, 2017 - link

    I almost didn't go with that headline specifically because I was afraid too many people wouldn't get it.

    It's a shining moment for AMD in terms of architectural advancement. It's also a rather hot card in terms of heat dissipated.
  • vladx - Tuesday, August 15, 2017 - link

    Like I said, it's a really nice spin; if anything, Polaris was the architecture that was shining brightly, not Vega, which is a clear regression compared to Fiji. So I'm sorry Mr. Smith, but I refuse to be taken for that big of a fool by you or any AT writer, and it's clear from other reviews as well that everyone is trying their best to polish the turd called Vega.
  • Beany2013 - Tuesday, August 15, 2017 - link

    Narrator:
    In reality, VladX was incredibly hurt that he didn't get the joke, as so many others managed to, and had made himself look like a fool.
  • vladx - Tuesday, August 15, 2017 - link

    A joke at AMD's expense is nothing to laugh about, so color me extra doubtful.
  • FourEyedGeek - Tuesday, August 22, 2017 - link

    Why be so immature?
  • FourEyedGeek - Tuesday, August 22, 2017 - link

    Your reflexes aren't fast enough
  • Aldaris - Monday, August 14, 2017 - link

    NV fanboy alert.

    Tell me, in what world did those results suggest to you it's slower?
  • Manch - Tuesday, August 15, 2017 - link

    ddriver calls them an Intel/Nvidia shill.
    Vladx calls them an AMD/Apple shill

    I think it was fair and balanced :D
  • sor - Monday, August 14, 2017 - link

    Performance wise it actually seems pretty good. People were worried it wouldn't even be able to compete with a 1080, but in many cases it slots between the 1080 and the Ti. The killer though is that power consumption. Burning 100+ more watts is insane. Otherwise, seems like it was a nice, competitive card.
  • blublub - Monday, August 14, 2017 - link

    This excessive power draw is, and many people forget this, node related.

    It's the same as with Ryzen:
    GloFo's 14nm is Low Power Plus (LPP), meaning it's very power efficient up to a certain frequency, but once a chip surpasses it, it drinks electricity like an elephant on steroids.

    It can be seen with Ryzen and Polaris: drop frequency and voltage, and power goes down more than proportionally.

    AMD just didn't have enough money and was bound to GloFo, so they couldn't tape out different GPU dies on a different process.
  • FreckledTrout - Monday, August 14, 2017 - link

    Yeah, but they do have a shining light in the 7nm process GloFo picked up from IBM; its high-frequency focus should really help both AMD's GPUs and CPUs a lot.
  • Manch - Tuesday, August 15, 2017 - link

    You can't drink electricity. I get your point though.

    Make like a tree and get out of here!
  • Yojimbo - Monday, August 14, 2017 - link

    It'll be interesting to see how much game developers take advantage of double rate FP16. Maybe there are some bottlenecks that can be alleviated without impacting quality much.
  • beck2050 - Monday, August 14, 2017 - link

    Overclocking seems very limited with that power draw. Custom 1080s are often 10 to 15% faster out of the box, and still cooler and less power hungry.
    A bit disappointing.
  • mapesdhs - Monday, August 14, 2017 - link

    I mentioned that elsewhere, in the UK a 1080 with a 1759MHz base is 60 UKP cheaper than a Vega64/Air, and one can get a 1080 Ti for the price of a Vega64/Liquid.
  • msroadkill612 - Monday, August 14, 2017 - link

    Anyone who is kinda meh about a Vega 56 or a 1070, and plans a Ryzen rig, is mad not to get the sibling Vega imo.

    Synergies are bound to pay dividends for some time to come.
  • Leyawiin - Monday, August 14, 2017 - link

    lol - "synergies". That was disproved years ago.
  • mapesdhs - Monday, August 14, 2017 - link

    I remember the days when tech sites were investigating why NV cards seemed to run better on AMD hw, think it was back in the P67 days or somesuch. The issue faded away, but it proved there's not necessarily a benefit to having all the tech from one side of the fence.
  • beast6228 - Monday, August 14, 2017 - link

    AMD pulled an Nvidia on this one: the touted $499 ended up being $599, and there was very limited supply at launch. I went to Microcenter and they only had 4 cards, 2 Gigabytes and 2 XFX. When I saw the $599 I was like, you can buy a faster, cooler, and less power hungry 1080 for $100 less. This performance does not warrant this high a price; sorry AMD, you failed pretty hard.
  • coolhardware - Monday, August 14, 2017 - link

    Glad to hear Microcenter at least had *some* stock. Everywhere seems out now and I am just wondering when Amazon is going to finally pull the trigger on their cards and bundles!
  • Aldaris - Monday, August 14, 2017 - link

    How is it AMD's fault sellers are ignoring RRP?
  • mapesdhs - Monday, August 14, 2017 - link

    $750 in the UK (equivalent). AMD is responsible for the hype though, and thus arguably for the demand, so if they can't meet the supply then sellers can spike the prices to exploit the demand. Economics 101. AMD must have known that either the power/noise behaviour wasn't so good, or the supply could not be met. They're not directly responsible, but the resulting pricing should not be a surprise. Those buying them at inflated prices are just the ones who either don't know about the performance/power/noise issues or don't care. (Vega64 performs better than I expected, but at 20% more expensive than a 1080 FE it makes no sense. Heck, for the cost of a 64/Liquid in the UK one can get a 1080 Ti.)
  • BrokenCrayons - Monday, August 14, 2017 - link

    GPU power consumption was already too high before Vega's release. I think it's a mistake to raise that already high bar even higher.
  • Smell This - Monday, August 14, 2017 - link

    Good for AMD.
    You guys touched on it, but this is a TSMC 28nm 'big chip' shrunk to GloFo 14nm LPP. Impressive: ~15% smaller with nearly 50% more transistors.

    And, it seems to me with the Infinity Fabric the next logical step is Zen+HBM2+Stars.

    Make it so, Dr Su!
  • FreckledTrout - Monday, August 14, 2017 - link

    Or vega + vega + HBM2 on a 7nm process aka Navi
  • BrokenCrayons - Monday, August 14, 2017 - link

    The hypothetical APU that contains Zen, Polaris/Vega, and HBM2 would be interesting if AMD can keep the power and heat down. Outside of the many-core Threadripper, Zen doesn't do badly on power versus performance, so something like 4-6 CPU cores plus a downclocked and smaller GPU would be good for the industry if the package's TDP ranged from 25-95W for mobile and desktop variants.

    By itself though, Vega is an inelegant and belated response to the 1080. It shares enough in common with Fiji that it strikes me as an inexpensive (to engineer) stopgap that tweaks GCN just enough to keep it going for one more generation. I'm hopeful that AMD will have a better, more efficient design for their next generation GPU. The good news is that with the latest product announcements, AMD will likely avoid bankruptcy and get a bit healthier looking in the near term. Things were looking pretty bad for them until Ryzen's announcement, but we'll need to see a few more quarters of financials that ideally show a profit in order to be certain the company can hang in there. I'm personally willing to go out on a limb and say AMD will be out of the red in Q1 of FY18 even without tweaking the books on a non-GAAP basis. Hopefully, they'll have enough to pay down liabilities and invest in the R&D necessary to stay competitive. With process node shrinks coming far less often these days, there's a several-years-long opening for them right now.
  • TheinsanegamerN - Monday, August 14, 2017 - link

    " It shares enough in common with Fiji that it strikes me as an inexpensive (to engineer) stopgap that tweaks GCN just enough to keep it going for one more generation. "

    We thought the same thing about Polaris. I think the reality is that AMD cannot afford a full new architecture, and can only continue to tweak GCN in an attempt to stay relevant.

    They still have not done a Maxwell-esque redesign of their GPUs, streamlining them for consumer use. They continue to put tons of compute in their chips, which is great, but it restricts clock rates and pushes power usage sky high.
  • mapesdhs - Monday, August 14, 2017 - link

    I wonder if AMD decided it made more sense to get back into the CPU game first, then focus later on GPUs once the revenue stream was more healthy.
  • Manch - Tuesday, August 15, 2017 - link

    Just like their CPUs, it's a jack-of-all-trades design. Cheaper R&D to use one chip for many markets, but you've got to live with the trade-offs.

    The power requirement doesn't bother me. Maybe after the third party customs coolers, I'll buy one if it's the better deal. I have a ventilated comm closet. All my equipment stays in there, including my PCs. I have outlets on the wall to plug everything else into. Nice and quiet regardless of what I run.
  • Sttm - Monday, August 14, 2017 - link

    That Battlefield 1 Power Consumption with Air, is that actually correct? 459 watts.... WTF AMD.
  • Aldaris - Monday, August 14, 2017 - link

    Buggy driver? Something is totally out of whack there.
  • Ryan Smith - Monday, August 14, 2017 - link

    Yes, that is correct.

    I also ran Crysis 3 on the 2016 GPU testbed. That ended up being 464W at the wall.
  • haukionkannel - Monday, August 14, 2017 - link

    Much better than I expected!
    Nice to see competition also in the GPU high end. I was expecting Vega to suffer deeply in DX11, but it is actually doing very nicely in those titles... I am really surprised!
  • Leyawiin - Monday, August 14, 2017 - link

    A day late and a dollar short (and a power pig at that). Shame. I was hoping for a repeat of Ryzen's success, but they'll sell every one they make to miners, so I guess it's still a win.
  • Targon - Monday, August 14, 2017 - link

    I would love to see a proper comparison between an AMD Ryzen 7 and an Intel i7-7700K at this point with Vega to see how they compare, rather than testing only on an Intel based system, since X299 is still somewhat new. All of the Ryzen launch reviews were done on a new platform, and the AMD X370 platform is mature enough now that reviews will be done with a lot more information. Vega is a bit of a question mark in terms of how well it does when you compare between the two platforms. Even how well the X370 chipset deals with the GeForce 1080, now that drivers have had time to mature, is worth looking at in my opinion.

    I've had the thought, without the resources to test it, that NVIDIA drivers may not do as well on an AMD based machine compared to an Intel based machine, simply because of driver issues, but without a reasonably high end video card from AMD, there has been no good way to do a comparison to see if some of the game performance differences between processors could have been caused by NVIDIA drivers as well.
  • BOBOSTRUMF - Monday, August 14, 2017 - link

    Well, I was expecting lower performance compared to a GeForce 1080, so this is one of the few plusses. Now NVIDIA only has to bump the base clocks for the GeForce 1080 while still consuming less power. Competition is great, but this is not the best product from AMD; on 14nm the gains should be much higher. Fortunately AMD is great now on CPUs, and that will hopefully bring income that should be invested in GPU research.
    Good luck AMD
  • mapesdhs - Monday, August 14, 2017 - link

    NV doesn't have to do anything as long as retail pricing has the 1080 so much cheaper. I look forward to seeing how the 56 fares.
  • webdoctors - Tuesday, August 15, 2017 - link

    It looks like the 1080 MSRP is actually less! Other sites mention that the initial price included a $100 rebate, which has expired :( and the new MSRP has taken effect....

    https://pcgamesn.com/amd/amd-rx-vega-rebates
  • mdriftmeyer - Monday, August 14, 2017 - link

    Remember your last paragraph after the game engines adopt AMD's architecture and features, which the engine developers have committed to doing and which is already partially in development. When that happens, I look forward to you asking what the hell went wrong at Nvidia.
  • Yojimbo - Monday, August 14, 2017 - link

    The whole "game engines will adopt AMD's architecture" thesis was made when the Xbox One and PS4 were released in 2013. Since then, AMD's market share among PC gamers has declined considerably and NVIDIA seems to be doing just fine in terms of features and performance in relevant game engines. The XBox One and PS4 architectures account for a significant percentage of total software sales. Vega architecture will account for a minuscule percentage. So why would the thesis hold true for Vega when it didn't hold true for Sea Islands?

    Besides, NVIDIA has had packed FP16 capability since 2015 with the Tegra X1. They also have it in their big GP100 and GV100 GPUs. They can relatively easily implement it in consumer GeForce GPUs whenever they feel it is appropriate. And within 3 months of doing so they will have more FP16-enabled gaming GPUs in the market than Vega will represent over its entire lifespan.
  • Yojimbo - Monday, August 14, 2017 - link

    That means the Nintendo Switch is FP16 capable, by the way.
  • mapesdhs - Monday, August 14, 2017 - link

    Good points, and an extra gazillion for reminding me of an awesome movie. 8)
  • stockolicious - Tuesday, August 15, 2017 - link

    "the Xbox One and PS4 were released in 2013. Since then, AMD's market share among PC gamers has declined considerably "

    The problem AMD had was that they could not play to their advantage, which was having both a CPU and a GPU. The CPU was so awful that nobody (or very few) used them to game; now that Ryzen is here and successful, they will gain GPU share even though their top cards don't beat Nvidia. This is called "attach rate": when a person buys a computer with an AMD CPU, they get an AMD GPU 55% of the time, vs 25% of the time with an Intel CPU. AMD had the same issue with their APUs; the CPU side was so bad that nobody cared to build designs around them, but now with Raven Ridge (Ryzen/Vega) coming they will do very well there as well.
  • Yojimbo - Tuesday, August 15, 2017 - link

    I wouldn't expect Bulldozer (or whatever their latest pre-Zen architecture was called) attach rates to hold true for Ryzen. There was probably a significant percentage of AMD fans accounting for Bulldozer sales. If Ryzen is a lot more successful (and by all accounts it looks like it will be), then only a small percentage of Ryzen sales will be to die-hard AMD fans. Most will be to people looking to get the best value. Then you can expect attach rates for AMD GPUs with Ryzen CPUs to be significantly lower than with Bulldozer.
  • nwarawa - Monday, August 14, 2017 - link

    *yawn* Wake me up when the prices return to normal levels. I've had my eye on a few nice'n'cheap FreeSync monitors for a while now, but missed my chance at an affordable RX 470/570.

    Make a Vega 48 3GB card (still enough RAM for 1080p for me, but it should shoo off the miners) for around $250, and I'll probably bite. And get that power consumption under control while you're at it. I'll undervolt it either way.
  • Makaveli - Monday, August 14, 2017 - link

    https://www.newegg.ca/Product/ProductList.aspx?Sub...

    Air cooled Vega 64 for $839 CAD? I don't think so.

    When you can pay $659 CAD for the 1080?

    I prefer AMD cards over NV, but even I'm not dumb enough to do this.

    $180 extra for the same performance and much higher heat output.
  • mapesdhs - Monday, August 14, 2017 - link

    That was exactly my point, real pricing is way out of whack for the Vega64 to make any sense atm. Can you check, is it possible to buy a 1080 Ti for the same price as the cheapest Vega64/Liquid? This is the case in the UK, where the 64/Liquid is basically 700 UKP, same as a Palit Jetstream 1080 Ti.
  • Makaveli - Monday, August 14, 2017 - link

    @mapesdhs

    Liquid Vega 64 is $979 CAD

    1080 TI air is in the $924-$984 CAD.

    Liquid 1080 Ti's are in the $1,014-$1,099 range.

    And the 1080 Ti will still be faster and using less power....
  • zodiacfml - Monday, August 14, 2017 - link

    AMD is truly an underdog. Fighting Nvidia and Intel at the same time but I could see that they are doing their best based on their designs. Their unique position is what made them successful in gaming consoles. Can't wait to see the performance of Raven Ridge parts.
  • Frenetic Pony - Monday, August 14, 2017 - link

    Regarding FP16 game use, devs are already using it because it's supported on the PS4 Pro. While the specific "checkerboard" rendering used in Mass Effect Andromeda and Battlefield 1 is PS4 Pro only due to hardware oddities, it still uses FP16 optimizations. And since both the PS4 and Xbox One have FP16 register capabilities, it's an easy target for optimization there, and easy to bring over to PC.

    Frankly I'd expect it to be adopted fairly quickly. Register pressure alone is reason enough for high end games to call for explicit FP16 where applicable, and porting such to the PC is relatively easy.
  • BaroMetric - Monday, August 14, 2017 - link

    Vega is already sold out, and the bundles on Newegg require you to buy the GPU, CPU, mobo, etc. You can't just pay the extra hundred dollars; you actually have to purchase the other components. Which is not what we were led to believe.
  • Azix - Tuesday, August 15, 2017 - link

    Need to include clock speed profiles; that should be basic information reviewers include nowadays to put performance in context. This is nearly useless without it.
  • Ryan Smith - Friday, August 18, 2017 - link

    Ask and you shall receive. Check the power/temp/noise page.=)
  • HollyDOL - Tuesday, August 15, 2017 - link

    It was all good... until I reached power consumption/noise part of the review.
  • Outlander_04 - Tuesday, August 15, 2017 - link

    Now go and work out what that extra power will cost you if you game 2 hours a day for a year.
    The answer is NOTHING if you heat your house with a thermostat controlling temps,
    and a very small amount if you don't.
    Now go turn off a couple of lights. You know you want to.
  • HollyDOL - Tuesday, August 15, 2017 - link

    Thank you, I already did. Electricity isn't cheap everywhere.
  • Gigaplex - Tuesday, August 15, 2017 - link

    A little over $30 per year extra. I tend to upgrade on a 3 year cadence. That's around $100 extra I can use to bump up to the Nvidia card.
  • Outlander_04 - Tuesday, August 15, 2017 - link

    The highest cost for electricity I can see in the US is 26 cents per kilowatt-hour.
    The difference in gaming power consumption is 0.078 kW, meaning it would take 12.8 hours to burn an extra kWh.
    Two hours of full-load gaming every day adds up to 730 hours a year, which means 57 kWh extra, for a total cost of $14.82 per year.
    In states with electricity at 10 cents per kWh, the difference is about $5.70 a year.

    You might have to save a bit longer than you expect.
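    For anyone who wants to plug in their own rates, the arithmetic above boils down to this quick sketch (the 0.078 kW delta, 2 h/day of gaming, and the two electricity prices are the assumptions used in this thread):

```python
# Annual cost of a GPU's extra power draw, using this thread's assumptions.
def annual_extra_cost(delta_kw, hours_per_day, price_per_kwh):
    extra_kwh = delta_kw * hours_per_day * 365   # 730 h/yr at 2 h/day
    return extra_kwh * price_per_kwh

# 0.078 kW delta, 2 hours of gaming per day:
print(annual_extra_cost(0.078, 2, 0.26))   # ~ $14.80/yr at 26 c/kWh
print(annual_extra_cost(0.078, 2, 0.10))   # ~ $5.69/yr at 10 c/kWh
```

    (The thread's $14.82 comes from rounding 56.94 kWh up to 57 before multiplying.)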
  • Yojimbo - Wednesday, August 16, 2017 - link

    Why did you assume he was interested in the 1070/Vega 56? Comparing the 1080 FE with the Vega 64 air cooled, the difference is .150 kilowatts. At your same assumption of 2 hours a day and 26 cents a kilowatt-hour it comes to $28.50 a year, right in line with his estimate. It's not a stretch to think he would game more than 730 hours a year, either.
  • Outlander_04 - Thursday, August 17, 2017 - link

    The BF1 power consumption difference between Vega 64 and the 1080 FE is 0.08 kW.
    Not sure where you get your numbers from, but it is not this review.
    The numbers are essentially the same as I suggested above: 0.078 vs 0.080.

    Less than $6 a year in states with lower utility costs, and as much as $15 a year in Hawaii.
    Yes, you could game more than 14 hours a week. It's also not a stretch to think you might game a lot less. What was your point?
  • HollyDOL - Friday, August 18, 2017 - link

    I don't know where you look, but the 1080 FE system is taking 310W and the Vega 64 system 459W, which is 149W more for no gain whatsoever.
  • Outlander_04 - Friday, August 18, 2017 - link

    379 vs 459 watts for the 1080 FE vs Vega 64.
    The delta is 0.08 kW.
    Those figures are right here in this review on the gaming power consumption chart.
  • HollyDOL - Saturday, August 19, 2017 - link

    Lol man, you need to reread that chart. 379W is the 1080 Ti FE, not the 1080 FE.
  • FourEyedGeek - Tuesday, August 22, 2017 - link

    What if you live in a hot part of the world? Extra heat equals extra throttling; during the summer I reduce my OCs because of this. Slap on the air conditioning and it'll run a bit extra too to compensate, costing more.

    I'd look at undervolting a Vega 56 if possible.
  • ET - Tuesday, August 15, 2017 - link

    So is a Vega 72 yet to come? Page 2 says that there are 6 CU arrays of 3 CUs each. That's 18 CUs per shader engine, with only 16 enabled in Vega 64.
  • Ryan Smith - Tuesday, August 15, 2017 - link

    3 CUs per array is a maximum, not a fixed amount. Each Hawaii shader engine had a 4/4/3 configuration, for example.

    http://images.anandtech.com/doci/7457/HawaiiDiagra...

    So in the case of Vega 10, it should be a 3/3/3/3/2/2 configuration.
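    As a quick sanity check of that configuration (assuming Vega 10's four shader engines, which is not stated in this comment), the arrays land exactly on Vega 64's CU count:

```python
# Vega 10 CU arrays per shader engine, per the 3/3/3/3/2/2 configuration above.
arrays = [3, 3, 3, 3, 2, 2]
cus_per_engine = sum(arrays)        # 16 CUs per shader engine
total_cus = cus_per_engine * 4      # assumed 4 shader engines in Vega 10
print(cus_per_engine, total_cus)    # 16 64

# A fully populated 6x3 layout would give the hypothetical "Vega 72":
print(3 * 6 * 4)                    # 72
```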
  • watzupken - Tuesday, August 15, 2017 - link

    I think the performance is in line with recent rumors and my expectations. The fact that AMD beat around the bush releasing Vega was a telltale sign. Unlike Ryzen, where they were marketing how well it runs in the likes of Cinebench and beating the gong and such, AMD revealed nothing on benchmarks throughout the year for Vega, just like when they first released Polaris.
    The hardware no doubt is forward looking, but where it needs to matter most, I feel AMD may have fallen short. It seems like the way around is probably to design a new GPU from scratch.
  • Yojimbo - Wednesday, August 16, 2017 - link

    "It seems like the way around is probably to design a new GPU from scratch. "

    Well, perhaps, but I do think with more money they could be doing better with what they've got. They made the decision to focus on reviving their CPU business with their resources, however.

    They probably have been laying the groundwork for an entirely new architecture for some time, though. My belief is that APUs were of primary concern when originally designing GCN. They were hoping to enable heterogeneous computing, but it didn't work out. If that strategy did tie them down somewhat, their next gen architecture should free them from those tethers.
  • Glock24 - Tuesday, August 15, 2017 - link

    Nice review, I'll say the outcome was expected given the Vega FE reviews.

    Other reviews state that the Vega 64 has a switch that sets the power limits, giving you "power saving", "normal" and "turbo" modes. From what I've read, the difference between the lowest and highest power limit is as high as 100W for about 8% more performance.

    It seems AMD did not reach the expected performance levels so they just boosted the clocks and voltage. Vega is like Skylake-X in that sense :P

    As others have mentioned, it would be great to have a comparison of Vega using Ryzen CPUs vs. Intel's CPUs.
  • Vertexgaming - Wednesday, August 16, 2017 - link

    It sucks so much that price drops on GPUs aren't a thing anymore because of miners. I have been upgrading my GPU every year and getting an awesome deal on the newest generation GPU, but now the situation has changed so much, that I will have to skip a generation to justify a $600-$800 (higher than MSRP) price tag for a new graphics card. :-(
  • prateekprakash - Wednesday, August 16, 2017 - link

    In my opinion, it would have been great if Vega 64 had a 16GB VRAM version at $100 more... That would be $599 apiece for the air cooled version... That would future-proof it to run future 4K games (CF would benefit too)...

    It's too bad we still don't have 16GB consumer gaming cards, the Vega Pro not being strictly for gamers...
  • Dosi - Wednesday, August 16, 2017 - link

    So the system consumes 91W more with Vega 64; can't imagine with the LC V64... could it be 140W more? Actually, what you saved on the GPU (V64 instead of a 1080) you've already spent on the electricity bill...
  • versesuvius - Wednesday, August 16, 2017 - link

    NVIDIA obviously knows how to break GPU tasks into chunks, process those chunks, and send them out the door better than AMD does. And more ROPs could certainly help AMD's cards a lot.
  • peevee - Thursday, August 17, 2017 - link

    "as electrons can only move so far on a single (ever shortening) clock cycle"

    Seriously? Electrons? You think that how far electrons move matters? Sheesh.
  • FourEyedGeek - Tuesday, August 22, 2017 - link

    You being serious or sarcastic? If serious then you are ignorant.
  • peevee - Friday, September 1, 2017 - link

    I am afraid it is you who are ignorant. Signal propagation has very little to do with the speed of actual electrons.
  • MajGenRelativity - Thursday, September 28, 2017 - link

    I'm sure it was intended as a metaphor for people who don't understand electromagnetic wave propagation in detail.
  • skrewler2 - Sunday, August 20, 2017 - link

    Would be nice to see stuff like crypto hash rates included in benchmarks
  • Chimpacide - Friday, February 23, 2018 - link

    I just got the new Hitman game and I'm averaging like 20 frames lower than what other sites are claiming. Anyone have some tips for getting this card to run faster?
