45 Comments

  • nandnandnand - Wednesday, October 14, 2015 - link

    The show must go on.
  • B3an - Wednesday, October 14, 2015 - link

    Errors:
    "has push though"
    "AMD does not however"
  • Ryan Smith - Wednesday, October 14, 2015 - link

    Whoops. Thanks for pointing that out.
  • weagle05 - Wednesday, October 14, 2015 - link

    Calling it a point release is a weak distinction too; they are all year.month
  • Ryan Smith - Wednesday, October 14, 2015 - link

    "Point Release" is a term that has a specific meaning in the software industry, though it is originally derived from product numbering.

    https://en.wikipedia.org/wiki/Point_release
  • Reikon - Wednesday, October 14, 2015 - link

    As defined in the Wikipedia article you linked to, "The term "point release" refers to a common method of software versioning in which a major version is followed by a decimal point and a minor version. When a new minor version is released, the number after the decimal point is incremented, e.g. from 7.0 to 7.1, or from 2.4.9 to 2.4.10."

    This means nothing for AMD drivers. It's Year.Month. The update going to 16.x just means it's 2016, while 15.10 to 15.11 could be a massive overhaul. You shouldn't use "point release" to dismiss an AMD driver update as minor.
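    The distinction is easy to sketch. A hypothetical few lines of Python (illustration only, not anything AMD actually ships):

```python
# Hypothetical illustration of the two schemes (not AMD's actual versioning code).

def point_release_bump(version: str) -> str:
    """Semantic point release: increment the last component, e.g. '7.0' -> '7.1'."""
    parts = version.split(".")
    parts[-1] = str(int(parts[-1]) + 1)
    return ".".join(parts)

def catalyst_next_month(version: str) -> str:
    """AMD Catalyst year.month: '15.12' rolls over to '16.1' purely because
    the calendar changed, saying nothing about the size of the update."""
    year, month = (int(p) for p in version.split("."))
    return f"{year + 1}.1" if month == 12 else f"{year}.{month + 1}"

print(point_release_bump("2.4.9"))   # -> 2.4.10
print(catalyst_next_month("15.10"))  # -> 15.11
print(catalyst_next_month("15.12"))  # -> 16.1 (looks "major", means nothing)
```

    In the first scheme a bigger number to the left of the dot implies a bigger change; in the second it only implies a later date.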
  • Beany2013 - Wednesday, October 14, 2015 - link

    For what it's worth, I didn't know that - every day is a school day.
  • tamalero - Monday, October 26, 2015 - link

    Didn't they stop doing monthly releases a year and a half ago?
  • AS118 - Wednesday, October 14, 2015 - link

    That's cool, I guess I'll check these out. Beta drivers haven't crashed my system yet this year. Knock on wood.
  • medi03 - Thursday, October 15, 2015 - link

    The only time I ever had to roll back a GPU driver was with nVidia.
  • MartinT - Wednesday, October 14, 2015 - link

    Will this finally fix the display corruption that has plagued the 15.x tree when downclocking the memory?

    Not holding my breath.
  • tamalero - Wednesday, October 14, 2015 - link

    Are you 100% sure it's not your video card? I haven't had that bug since 15.4.
    Also, are you using dual monitors or a single one?
  • tamalero - Monday, October 26, 2015 - link

    Why would you downclock the memory that much?
    There are minimums for each card/memory combination needed to drive a screen, and they're even higher when using two.
  • Morawka - Wednesday, October 14, 2015 - link

    So AMD's Game Ready driver for Star Wars Battlefront comes late, after the beta has ended...

    This is exactly why I preach that Nvidia's superior drivers are a perk of buying Nvidia.
  • RussianSensation - Wednesday, October 14, 2015 - link

    What's the difference if their first beta driver allowed excellent performance? Your comment hardly makes any sense in practice with respect to SW BF:
    http://www.guru3d.com/articles_pages/star_wars_bat...

    As far as you preaching NV's driver superiority, if AMD drivers are so bad, why is it they have the best price/performance and class leading performance in all categories on the desktop from $80 to $550 levels?
    http://www.techpowerup.com/reviews/AMD/R9_Nano/30....

    You can preach gospel all you want but it doesn't add up to reality.

    Even comparing dual SLI vs. CF, your statement is still gospel, as both SLI and CF have issues depending on the game:
    http://www.techspot.com/review/1033-gtx-980-ti-sli...

    AMD's single GPU drivers are rock solid, have never killed a desktop or mobile GPU via driver update either, had Full color RGB over HDMI available for more than a decade. Let's be objective at least.
  • D. Lister - Wednesday, October 14, 2015 - link

    It takes guts to make such claims, considering the state AMD is in. Well okay, maybe not guts, but it does take something... and a lot of it.
  • Horza - Wednesday, October 14, 2015 - link

    I don't see you refuting any of them just some mindless snark.
  • D. Lister - Thursday, October 15, 2015 - link

    lol, of course you don't, "subtlety" is inherently less accessible.
  • piiman - Saturday, October 17, 2015 - link

    Not to mention the fact that you indeed did NOT refute a single thing. Face it, you're just being a fanboy
  • RussianSensation - Friday, October 16, 2015 - link

    I don't understand the point of your post. AMD's drivers for single GPUs have allowed GCN to not only wipe out all the advantage Kepler GPUs have had over the last 3 years but cards like 380 and 390 are beating their direct competitors - 960 and 970. Again, if AMD's drivers were so "terribad", how is that possible?
  • Morawka - Thursday, October 15, 2015 - link

    The fact that you had to bring up all that irrelevant stuff makes your argument look even weaker.

    My point was: when indie, AA, and AAA games are released, Nvidia has day-1 drivers ready AND HAS SLI profiles, AND GeForce Experience settings... oh, did I mention they do all of this and still make WHQL certification?

    DirectX 12 is not used in any games, so that's irrelevant.
    Price? Of course AMD will win performance per dollar; that's because nobody's buying them. Nvidia has 75% GPU market share, for crying out loud.

    The whole Windows 10 thing is blown out of proportion and I have yet to have a single problem. You want to talk about bad Windows 10 GPU driver support? Start talking about Intel.
  • RussianSensation - Friday, October 16, 2015 - link

    I don't understand the point of your post. AMD's drivers for single GPUs have allowed GCN to not only wipe out all the advantage Kepler GPUs have had over the last 3 years but cards like 380 and 390 are beating their direct competitors - 960 and 970. Again, if AMD's drivers were so "terribad", how is that possible?

    "My point was: when indie, AA, and AAA games are released, Nvidia has day-1 drivers ready AND HAS SLI profiles, AND GeForce Experience settings"

    So now we are bringing SLI vs. CF into the mix? Roughly 300,000 gamers worldwide have SLI/CF and Steam has over 125 million users. Moving on then. Even on this topic, you ignore hard scientific data. All I read from you is opinion:

    "Where the Fury X Crossfire setup won big was in Thief where it was 50% faster and Total War: Attila where it was 36% faster. Removing Thief's result sees the Fury X cards losing to the GTX 980 Tis overall by 1%. Now for the interesting part, typically we expect Nvidia to have the edge when looking at frame time (99th percentile) performance, but this wasn't the case here. The R9 Fury X Crossfire cards were on average 22% faster when comparing the 99th percentile data."
    http://www.techspot.com/review/1033-gtx-980-ti-sli...

    See that's how facts actually work. Your point that AMD's drivers are shit is hilarious considering they are winning the $80-550 price/performance and Fury X CF loses to 980Ti SLI only when 980Ti SLI is overclocked. That's pretty good for a claim that "their drivers are crap."

    "oh, did I mention they do all of this and still make WHQL certification?"

    WHQL certification means nothing to me as an Intel/AMD/NV user. What do I care if the drivers are WHQL certified or not? In the past AMD had WHQL-certified drivers, and they were worse than what AMD has shipped in the last 4 years. So if you just want a WHQL badge, knock yourself out.

    "Geforce Experience Settings"

    1. AMD has Raptr.
    2. I am old enough to know how to adjust settings on my own. If you need a console-like hand-holding to tune the settings for you, that's your choice and nothing wrong with that but to say that GeForce Experience means NV has better drivers is laughable as GeForce Experience has nothing to do with stability or performance of actual games for users who own AMD/NV cards and adjust settings on their own.

    "DirectX 12 is not used in any games, so that's irrelevant."

    Sure, it's 100% relevant because you are generalizing how all AMD drivers are garbage and yet AMD is winning in all price segments besides 980Ti vs. Fury X in 2/2 DX12 games. So DX12 driver performance doesn't matter now? Ok, we'll revisit in 2016 and beyond.

    "Price? Of course AMD will win performance per dollar; that's because nobody's buying them. Nvidia has 75% GPU market share, for crying out loud."

    What a crazy argument. So you are a hardcore NV user, and if AMD offers amazing price/performance, it's because they make garbage products? I hope you are not even 18 years old, as that's forgivable for someone with so little PC gaming experience. Just a reminder: NV's GeForce 3 Ti 200, GeForce 4 Ti 4200, GeForce 5900XT, GeForce 6800 non-U unlocked, 6800GT, 6600GT, 7800GT/7950GT, 8800GT, GTX 460/470, 560 Ti, and in the recent era the GTX 970, all offered amazing price/performance. Does that mean NV was desperate because no one was buying them? Try to make a coherent argument if you are going to make a point.

    Let's try this - single GPUs:

    Sept 2014
    780Ti $699 = beats 290X by 14% at 1080P, 9% at 2560x1600
    https://www.techpowerup.com/reviews/NVIDIA/GeForce...

    Sept 2015
    780Ti = beats 290X by only 4.5% at 1080P, 0% at 2560x1600
    https://www.techpowerup.com/reviews/Colorful/iGame...

    780Ti cost $699 while 290X cost $549. Your argument continues to fail while AMD's drivers for GCN are far superior to whatever NV has been able to produce for Kepler. This looks even worse considering 780Ti cost more and is no better than a cheaper 290X today.

    Going back to your SLI vs. CF argument, it falls apart even more.

    290X CF beats 780Ti SLI and 970 SLI at 1440P:
    http://www.sweclockers.com/test/20216-nvidia-gefor...

    See, those are what I call facts. I am providing real data that disproves your point, and all you do is keep repeating the same claim that AMD drivers are crap.

    "The whole Windows 10 thing is blown out of proportion and I have yet to have a single problem. You want to talk about bad Windows 10 GPU driver support? Start talking about Intel."

    Google - nvidia driver issues windows 10

    Way to ignore how NV also had many driver issues - destroyed GPUs in laptops due to broken fan profiles in NV's control panel, completely broken full RGB mode over HDMI for more than a decade, insanely horrendous blurry LOD under SSAA mode all the way until October 2012 -- proof:
    http://www.computerbase.de/2012-10/nvidia-geforce-...

    AMD's VSR still has superior IQ to NV's DSR - tested scientifically in games:
    http://www.overclock.net/t/1529509/computerbase-de...

    If you are going to have an opinion as strong as "AMD's drivers are crap", you better have data to bring to the table or you will get called on it.

    But considering it seems you are emotionally invested in NV, it sounds to me like nothing anyone will post will change your mind to an objective state regardless. Just my opinion.
  • BurntMyBacon - Friday, October 16, 2015 - link

    Ouch. And I thought my bacon was burnt.

    I personally use ATi exclusively for HTPC applications. That said, I favor nVidia for multi-GPU builds. More importantly, I favor single-GPU builds over multi-GPU builds. It is common knowledge that nVidia generally has the edge over ATi in multi-GPU setups. Here are a few recent articles I spent 30 seconds finding, dating from current back to 2013:
    http://www.hardocp.com/article/2015/10/06/amd_rade...
    http://www.hardocp.com/article/2015/09/28/asus_str...
    http://techreport.com/review/21516/inside-the-seco...
    http://www.pcper.com/reviews/Graphics-Cards/Frame-...

    Feel free to look around. There are more to be had. However, as you stated and linked earlier, nVidia is not immune to issues either. I prefer to just stay out of such a volatile and update dependent setup.

    My personal experience with nVidia and ATi is that they have both had their quirks for single card setups and not very many have been insurmountable from either team. ATi has generally offered more performance at the same cost with a number of standout cards from nVidia (I believe you mentioned some earlier). Until recently, nVidia has been much better about getting developer support, which gives them a bit of an advantage with day 1 drivers. ATi has been making up ground here recently.

    For the time being, I'll stick with ATi for HTPC, nVidia for multi-GPU, and the best deal for my single card needs. The situation could obviously change at any point and I'm waiting to see how the "Sync" wars play out.
  • Chaser - Thursday, October 15, 2015 - link

    If you prefer inefficient, power-sucking room heaters for video cards, and CPUs that are slower, then AMD is a great option.
  • RussianSensation - Friday, October 16, 2015 - link

    What do CPUs have anything to do with this discussion? I buy either NV/AMD depending on what's better at the time I upgrade.

    Sapphire Fury runs quieter at max load than a 980Ti does at idle:
    http://www.anandtech.com/show/9421/the-amd-radeon-...

    How loud and hot a video card runs is largely a function of its cooling system. It's possible for a 250W TDP card to be quieter than a 180W TDP one. If you don't do your own research, then sure, you'll make stupid claims about how AMD's cards run hot and loud.

    As far as power efficiency goes, since HD4800 days, AMD cards have made me tens of thousands of dollars due to bitcoin mining. As a tech enthusiast I will take tens of thousands of US dollars over saving $3 a month in my electric bill. Thanks though.
  • The_Countess - Tuesday, October 27, 2015 - link

    Ah yes, a 30-watt in-game difference between a 980 Ti and a Fury X makes one a space heater while the other isn't.

    That makes perfect sense, obviously.
  • kurahk7 - Wednesday, October 14, 2015 - link

    That's funny. AMD released their SWB drivers over a week before NVidia did. http://www.anandtech.com/show/9667/amd-releases-ca...
  • Morawka - Thursday, October 15, 2015 - link

    That is BS; read the article and the patch notes. AMD specifically said this driver introduces flickering in SWBF3 in CrossFire X setups. That's the entirety of what was said about Star Wars Battlefront.
  • D. Lister - Thursday, October 15, 2015 - link

    @kurahk7

    Err... wasn't 15.9 rather hurriedly rolled back by AMD because of a crippling memory leak?
  • silverblue - Thursday, October 15, 2015 - link

    ...with an update, 15.9.1, following a few days later.
  • kuttan - Wednesday, October 14, 2015 - link

    Most of the DirectX 12 games released recently run pretty solidly on AMD GPUs, whereas Nvidia is playing catch-up with driver optimizations. Nvidia has been trying for some time to implement pseudo asynchronous compute support in their drivers and still hasn't succeeded.

    And what superior Nvidia drivers are you preaching about? Windows 10 Nvidia drivers are still a big mess. You can see angry, frustrated Nvidia customers complaining about Windows 10 Nvidia drivers at:
    https://www.reddit.com/r/nvidia/comments/3krzli/wi...
  • Morawka - Thursday, October 15, 2015 - link

    Yeah, 6 upvotes on all those comments... the problem sure is widespread.....

    I haven't had a single issue with my GTX 980, and I'm running the Windows 10 technical preview, for god's sake.

    I agree with you on the asynchronous compute support (and I wish they had it), but that has nothing to do with drivers; that is a hardware problem. Nvidia's GPUs don't have async compute in the GPU die itself. I will say async compute is not really an issue on 300W TDP desktops; maybe on consoles, where every bit of performance and every frame counts to achieve 30 FPS.
  • Asomething - Thursday, October 15, 2015 - link

    That is actually wrong. GM2xx GPUs can use async shading, but just activating it requires a lot of CPU and driver overhead (they cannot switch contexts without the CPU and driver). Here is the dev forum thread that discovered the implementation and its issues: https://forum.beyond3d.com/threads/dx12-performanc... They found that it can work, but when it does, even a simple test can take up a lot of CPU resources and can even crash if it runs for long enough.
  • RussianSensation - Friday, October 16, 2015 - link

    Oh, that says it all - GTX 980 owner. The worst GPU to buy at the $200-650 level in the last 14 months. No wonder you are butthurt, since the 290X/390X are trading blows in modern titles and making the 980 look like an overpriced pile of garbage. Now it all starts to make sense how NV's drivers are "better." You have a typical case of this - trying to justify your purchase after the fact to make yourself feel better about why it was 'worth it':
    http://techreport.com/blog/21294/the-science-of-fa...
  • showb1z - Thursday, October 15, 2015 - link

    So you're complaining about drivers even though benchmarks show AMD beating Nvidia across the board on this game? Oh, but you don't even own an AMD card of course.
    Keep preaching/trolling buddy.
  • RussianSensation - Friday, October 16, 2015 - link

    Ya, all his posts are pure trolling/justifying how he wasted $ on his $450-550 GTX980 when today 980 can barely beat a 290X/390/390X in modern games, especially in DX12 and SW BF. But BETA drivers are supposedly a terrible thing. Then again, it's pretty much impossible to argue with most 980 owners on any sort of reasonable level. Even the loyal NV owners who have technical knowledge would have purchased 970 SLI instead of that turd 980.
  • Morawka - Saturday, October 17, 2015 - link

    If you've ever done an SLI or CrossFire X setup, you'll soon learn to avoid it. It's widely discussed that a strong single-GPU setup is much preferable to a dual-GPU solution, even if it's one card with two GPUs.

    The 980 is still the gold standard at which most games target 60 FPS at max settings @ 1080p/1200p.

    It's not a cut-down chip, so I marvel at its design and efficiency. I'm running an i7 at 4GHz and a 980 Superclocked on a 450W PSU, something that cannot be achieved with top-end AMD cards. I have had it since release, and I always skip the neutered Ti releases. They throw in so many cores for very little performance gain: 50% more cores for 30% more performance.

    As an engineer, I'm pulled towards design and efficiency. I'm not trolling, I'm just stating my opinion. I know you disagree with it :).

    I have a comfortable income, so I don't look at price as the main driving force in my decision. It's all about the engineering and efficiency while giving great performance... and the 980 does exactly that.
  • BurntMyBacon - Monday, October 19, 2015 - link

    @Morawka "As an engineer, I'm pulled towards design and efficiency. I'm not trolling, I'm just stating my opinion."

    I can respect that.
  • Mahigan - Thursday, October 15, 2015 - link

    AMD had performance optimizations for Star Wars Battlefront on September 29: http://www.anandtech.com/show/9667/amd-releases-ca...

    "Today AMD has released AMD Catalyst 15.9 Beta as their latest driver update, with display driver version 15.201.1151. This driver provides optimizations for the upcoming Star Wars: Battlefront Beta that we will be seeing next week and for the Fable Legends DX12 benchmark that we saw last week"

    They had a bug with Crossfire and memory leaks which they promptly fixed on October 1st with the 15.9.1 release. With 15.10, AMD have introduced even more performance optimizations.

    So yeah, AMD had Battlefront drivers before NVIDIA.
  • Morawka - Thursday, October 15, 2015 - link

    Not WHQL Certified... Just that beta driver junk
  • RussianSensation - Friday, October 16, 2015 - link

    Troll harder it's amusing.

    $300 R9 290X vs. $480 GTX980 in SW BF at 1440P at GameGPU:
    54 fps vs. 50 fps
    http://gamegpu.ru/action-/-fps-/-tps/star-wars-bat...

    $300 R9 290X / $370 R9 390X vs. $480 GTX980 in SW BF at 1440P at Guru3d:
    67 fps / 69 fps vs. 63 fps
    http://www.guru3d.com/articles_pages/star_wars_bat...

    Face it, you got ripped off by buying a 980 and now you are trying to find any means to justify why you made the wrong decision by bringing stupid arguments like WHQL vs. Beta into the mix despite much cheaper AMD cards beating your card in the said game.

    Beta driver junk? Interestingly enough, R9 295X2/290X CF cost barely $50-100 more than a single 980 cost around its launch and today they are literally wiping the floor with a 980 in almost all modern titles.

    Coincidentally, single GPU performance for 980 continues to get worse against 290X:

    Sept 2014
    980 beats 290X by 14.9% at 1440P
    https://www.techpowerup.com/reviews/NVIDIA/GeForce...

    Sept 2015
    980 beats 290X by 11.9% at 1440P
    http://www.techpowerup.com/reviews/AMD/R9_Nano/30....
    ^ And this is best case scenario here because the 980 benefits greatly from Project CARS, WoW and broken Wolfenstein bench. Remove those and it would barely beat the 290X.

    980 continues to look worse and worse and keeps proving itself to be one overpriced Maxwell SKU of this generation. So much for AMD's junk drivers huh?

    Beta driver junk?
  • Morawka - Saturday, October 17, 2015 - link

    They're priced cheaper because nobody buys them... go figure. AMD is going out of business soon; good luck with that.

    In two years, when some Indian company has bought the ATI division, you're going to see how much end-of-life driver support those cards get.

    And for the record, Star Wars Battlefront should get better performance on AMD hardware, considering AMD paid DICE $100,000 to use Mantle, optimize for it, and promote the hell out of it.

    Let's talk about real games that actually have proper developers, like Witcher 3.

    https://www.techpowerup.com/reviews/Performance_An...
  • The_Countess - Tuesday, October 27, 2015 - link

    You mean bought-and-paid-for developers that are contractually obligated to include technology (and enable it by default) that performs poorly on ALL cards that are not Nvidia's latest generation, including Nvidia's own older generations?

    Oh yes, that's so much better.

    And all Nvidia had to do was change the default tessellation from a ridiculous 64x to 16x; visually nothing would change, and nobody, including their own customers, would have been negatively affected by it.
  • Mahigan - Thursday, October 15, 2015 - link

    As for superior drivers: AMD does have superior Windows 10 drivers over NVIDIA's. The issues with Windows 10 NVIDIA drivers are well documented... not just on Reddit. Google is your friend ;) use it and you'll see this to be true.
  • RussianSensation - Friday, October 16, 2015 - link

    Apparently he hasn't had a single driver issue with a 980, not one. How can you expect to have a coherent, objective discussion with such a user? Not to mention he is also the owner of a 980, which is ageing the worst out of the entire $300-650 market segment.
