Quick question: shouldn't the memory clock in the table on the first page be expressed in Hz instead of bps, it being a clock and all? Or you could go with throughput, but that would be just shy of 500GBps I think...
Good question. Because of the various clocks within GDDR5(X), memory manufacturers prefer that we list the speed as bandwidth per pin instead of frequency. The end result is that the unit is in bps rather than Hz.
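(For reference, converting the per-pin figure to total throughput is a quick back-of-the-envelope; the sketch below uses the 1080 Ti's published 11Gbps per-pin rate and 352-bit bus, which matches the "just shy of 500GBps" figure above:)

```python
# GTX 1080 Ti GDDR5X: 11 Gbps per pin on a 352-bit memory bus
gbps_per_pin = 11
bus_width = 352  # bits, i.e. data pins

total_gbps = gbps_per_pin * bus_width  # 3872 Gb/s across the whole bus
total_gBps = total_gbps / 8            # 484 GB/s -- "just shy of 500GBps" indeed
print(f"{total_gBps} GB/s")
```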
Before you do that though, you should test Ryzen with the Ti. Reviewers everywhere are showing that for whatever reason, Ryzen shines with the 1080 Ti at 4k.
I did a double take there, as I actually thought you typed "pull a rabbit out of my a..." Was like... wait, what?? (..chuckle) Anyway, good review Ryan. I read about the Ti being out soon... didn't realize it was here already.
I can't wait to see what Vega brings. I'm hoping we at least get a price war over a part that can sit in between the 1080 and Ti parts. I would love to see Vega pull off 75% faster than a Fury X (50% clock speed boost, 20% more IPC?) but wow that would be a tough order. Let's just hope AMD can bring some fire back to the market in May.
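(A quick sanity check on that math: a clock boost and an IPC gain compound multiplicatively, so those two guesses actually land a bit above 75%:)

```python
clock_gain = 1.50  # the hoped-for 50% clock speed boost over Fury X
ipc_gain = 1.20    # the hoped-for 20% IPC improvement

speedup = clock_gain * ipc_gain  # gains multiply: 1.8x overall
print(f"~{(speedup - 1) * 100:.0f}% faster than a Fury X")  # ~80%
```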
I'm also extremely interested in seeing what Vega brings. My wallet is ready to drop the bills necessary to get a card in this price range, but I'm waiting for Vega to see who gets my money.
It will bring the same thing as ever - superior hardware that nvidia will pay off most game developers to sandbag, forcing amd to sell at a very nice price, to the benefit of people like me, who don't care about games but instead use gpus for compute.
For compute, amd's gpus are usually 2-3 TIMES better value than nvidia's. And I have 64 7950s in desperate need of replacing.
It is kinda both, although I wouldn't really call it a job, because that's when you are employed by someone else to do what they say. More like it's my work and hobby. Building a supercomputer on a budget out of consumer grade hardware turned out to be very rewarding in every possible aspect.
This is something I'd like to do. Not necessarily with GPUs, but I have no idea how to make any money to pay the bills yet. I've only started thinking about it recently.
"nvidia will pay off most game developers to sandbag"
AMD, nvidia, etc. might work with developers to optimize a game for their hardware.
Suggesting that they would pay developers to deliberately not optimize a game for the competition, or even make it perform worse, is a conspiracy theory made up on the internet.
Not to mention it is illegal. No one would dare do it in this day and age when everything leaks eventually.
Something that blatant would be illegal. What nVidia does do is offer a bunch of blobs that do various effects simulations/etc. that can save developers a huge amount of time vs. coding their own versions, but which run much faster on their own hardware than on nominally equivalent AMD cards. I'm not even going to accuse them of deliberately gimping AMD (or Intel) performance; having only a single code path that is optimized for the best results on their hardware will be sub-optimal on anything else. And because Gameworks is offered up as blobs (or source with can't-show-it-to-AMD NDA restrictions), AMD can't look at the code to suggest improvements to the developers, or to fix things after the fact with driver optimizations.
Have you noticed anyone touching nvidia lately? They are in bed with the world's most evil bstards. Nobody can touch them. Their practice is to offer assistance on exclusive terms; all of this aims to lock developers into their infrastructure, or at the very least comes with the implied condition that they don't break a sweat optimizing for radeons.
I have very close friends working at AAA game studios and I know first hand. It all goes without saying. And nobody talks about it, not if they'd like to keep their job, or be able to get a good job in the industry in general.
nvidia pretty much does the same thing intel was found guilty of on every continent. But it is kinda less illegal, because it doesn't involve discounts, so they cannot really pin bribery on them, in the event that anyone would dare challenge them.
amd is actually very competitive hardware-wise, but failing at their business model, they don't have the money to resist nvidia's hold on the market. I run custom software at a level as professional as it gets, and amd gpus totally destroy nvidia's at the same or even higher price point. Well, I haven't been able to do a comparison lately, as I have migrated my software stack to OpenCL 2, which nvidia deliberately does not implement in order to prop up their cuda, but a couple of years back I was able to do direct comparisons, and as mentioned above, nvidia offered 2 to 3 times worse value than amd. And nothing has really changed in that aspect; architecturally amd continues to offer superior compute performance, even if their DP rates have been significantly slashed in order to stay competitive with nvidia silicon.
A quick example: ~$2500 buys you either:
- a FirePro with 32 gigs of memory, 2.6 TFLOPS FP64 perf, and top-notch CL support, or
- a Quadro with 8 gigs of memory, 0.13 TFLOPS FP64 perf, and CL support years behind
Better compute features, 4 times more memory and 20 times better compute performance at the same price. And yet the quadro outsells the firepro. Amazing, ain't it?
It is true that 3rd party cad software still runs a tad better on a quadro, for the reasons and nvidian practices outlined above, but even then, the firepro is still fast enough to do the job, while completely annihilating quadros in compute. Which is why at this year's end I will be buying amd gpus by the dozens rather than nvidia ones.
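(Taking those quoted specs at face value, and the prices as equal at ~$2500, the claimed ratios do check out:)

```python
# Specs as quoted above; both cards assumed to cost ~$2500
firepro = {"mem_gb": 32, "fp64_tflops": 2.6}
quadro = {"mem_gb": 8, "fp64_tflops": 0.13}

print(f"{firepro['mem_gb'] / quadro['mem_gb']:.0f}x the memory")          # 4x
print(f"{firepro['fp64_tflops'] / quadro['fp64_tflops']:.0f}x the FP64")  # 20x
```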
> "And nobody talks about it, not if they'd like to keep their job"
Haha, we're not scared of NVIDIA, they are just awesome. I've been in AAA for over a decade; they almost bought my first company and worked closely with my next three, so I know them very well. Nobody is "scared" of NVIDIA. NVIDIA have their devrel down. They are much more helpful with optimizations, free hardware, support, etc. Try asking AMD for the same and they treat you like you're a peasant. When NVIDIA give us next-generation graphics cards for all our developers for free, we tend to use them. When NVIDIA sends their best graphics engineers onsite to HELP us optimize for free, we tend to take them up on their offers. Don't think I haven't tried getting the same out of AMD; they just don't run the company that way, and that's their choice.
And if you're really high up, their dev-rel includes $30,000 nights out that end up at the strip club. NVIDIA have given me some of the best memories of my life, they've handed me a next generation graphics card at GDC because I joked that I wanted one, they've funded our studio when it hit a rough patch and tried to justify it with a vendor promotion on stage at CES with our title. I don't think that was profitable for them, but the good-will they instilled definitely has been.
I should probably write a "Secret diaries of..." blog about my experiences, but the bottom line is they never did anything but offer help that was much appreciated.
Actually, heh, the worst thing they did was turn on physx support by default for a game we made with them for benchmarks, back when they bought Ageia. My game engine was used for their launch demo, and the review sites (including here, I think) found out that if you turned a setting off to software mode, Intel chips doing software physics were faster than NVIDIA's accelerated physics mode. Still not illegal, and still not afraid of losing my job, since I've made it pretty obvious who I am to the right people.
Well, for you it might be the carrot, but for others it's the stick. Not all devs are as willing to leave their products unoptimized in exchange for a carrot as you are. Nor do they need nvidia to hold them by the hand and walk them through everything that is remotely complex in order to be productive.
In reality both companies treat you like a peasant; the difference is that nvidia has the resources to make you into a peasant they can use, while to poor old amd you are just a peasant they don't have the resources to pamper. Try this if you dare: instead of being a lazy, grateful slob, take the time and effort to optimize your engine to make the most of amd hardware and brag about that marvelous achievement, and see if nvidia's pampering will continue.
It is still technically a bribe - helping someone for free to do something that ends up putting them at an unfair advantage. It is practically the same thing as giving you the money to hire someone who is actually competent to do what you evidently cannot be bothered with or are unable to do. They still pay the people who do that for you, which would be the same thing as if you paid them with money nvidia gave you for it. And you are so grateful for that assistance that you won't even be bothered to optimize your software for that vile amd, who don't rush to offer to do your job for you like noble, caring nvidia does.
So we moved from "nvidia pays devs to deliberately not optimize for AMD" to "nvidia works with devs to optimize the games for their own hardware, which might spoil them and result in them not optimizing for AMD properly".
How is that bribery, or illegal? If they did not prevent the devs from optimizing for AMD, then nothing illegal happened. It was the devs' own doing.
Nope, there is an implicit, unspoken condition to receiving support from nvidia. To lazy slobs, that's welcome, and most devs are lazy slobs. Their line of reasoning is quite simple:
"Working to optimize for amd is hard, I am a lazy and possibly lousy developer, so if they don't do that for me like nvidia does, I won't do that either, besides that would angry nvidia, since they only assist me in order to make their hardware look better, if I do my job and optimize for amd and their hardware ends up beating nvidia's, I risk losing nvidia's support, since why would they put money into helping me if they don't get the upper hand in performance. Besides, most people use nvidia anyway, so why even bother. I'd rather be taken to watch strippers again than optimize my software."
Manipulation, bribery and extortion. nvidia uses its position to create a situation in which game developers have a lot to profit from NOT optimizing for amd, and a lot to lose if they do. Much like intel did with its exclusive discounts. OEMs weren't exactly forced to take those discounts in exchange for not selling amd; they did what they knew would please intel to get rewarded for it. Literally the same thing nvidia does. Game developers know nvidia will be pleased to see their hardware getting an unfair performance advantage, and they know amd doesn't have the money to pamper them, so they do what is necessary to please nvidia and ensure they keep getting support.
Where to start? Best not to start, as you are completely, 100% insane. I've spent two and a half 'reads' of your replies trying to grasp WTH you're talking about, and I'm lost. Totally, completely lost in your conspiracy theories about two major GPU silicon builders, while you're apparently and completely clueless about ANY of it! Lol - wow, I'm truly astounded that you were able to make up that much BS...
You forgot to mention one thing: Nvidia tweaking the drivers to force users into hardware updates. Say there is a bunch of games coming up this Christmas. If you have a card that's 3-4 years old, they release a new driver which performs poorly on your card (on those games) and another driver which performs way better on the newest cards. Then, if you start crying, they say: it's an old card, pal, why don't you buy a new one! With DX11 they could do that a lot. With DX12 and Vulkan it's a lot harder. Most if not all optimizations have to be done by the game programmers. Very little is left to the driver.
That's how the ENTIRE industry is. Do you really expect developers to optimize for old architectures? Everyone does it: nvidia, AMD, intel, etc.
It is not deliberate. Companies are not going to spend time and money on old hardware with little market share. That's how it's been forever.
Before you say that's not the case with radeons, it's because their GCN architecture hasn't changed dramatically since its first iteration. As a result, any optimization done for the latest GCN, affects the older ones to some extent too.
There is good news for the future. As DX12 and Vulkan become mainstream APIs, game developers will have to roll up their sleeves and sweat it hard. Architecturally, these APIs are totally different from the ground up, and both trace their origin to Mantle. And Mantle was the biggest advance in graphics APIs in a generation. The good days for lazy game developers are coming to an end, since these new APIs put just about everything back into their hands, whether they like it or not. Tweaking the driver won't make much of a difference. Read the APIs' documentation.
Yes, hopefully this will be the future, where games are the responsibility of the developer. Just like on consoles. I know people hate consoles sometimes, but the closed platform shows which developers have their stuff together and which are lazy bums, because Sony and Microsoft don't optimize anything for the games.
Always amusing watching the tin foil hat Nvidia conspiracy nuts talk. Here's my example: working on Project Cars as an "early investor." Slightly Mad Studios gave both Nvidia and AMD each 12 copies of the beta release to work on, the same copy I bought. Nvidia was in constant communication with SMS developers, and AMD was all but never heard from. After about six months, Nvidia had a demo of the racing game ready for a promotion of their hardware. Since AMD didn't take Project Cars seriously with SMS, Nvidia was able to get the game tweaked better for Nvidia. And SMS hat-tipped Nvidia by having billboards in the game showing Nvidia logos.
Of course all the AMD fanboys claimed unfair competition and the usual whining when their GPUs do not perform as well in some games as Nvidia (they amazingly stayed silent when DiRT Rally, another development I was involved with, ran better on AMD GPUs and had AMD billboards).
So was there anything preventing the actual developers from optimizing the game? They didn't have nvidia and amd hardware, so they sent betas to the companies to profile things and see how it runs?
How silly must one be to expect that nvidia - a company that rakes in billions every year - and amd - a company that is in the red most of the time and has lost billions - will have the same capacity to do game developers' jobs for them?
It is the game developer's job to optimize. Alas, as it seems, nvidia has bred a new breed of developers - those who do their job half-assedly and then wait on them to optimize, conveniently creating an unfair advantage for their hardware.
Also, talking about fanboys - I am not that. Yes, I am running dozens of amd gpus, and I don't see myself buying any nvidia product any time soon, but that's only because they offer superior value for what I need them for.
I don't give amd extra credit for offering a better value. I know this is not what they want. It is what they are being forced into.
I am in a way grateful to nvidia for sandbagging amd, because this way I can get a much better value products. If things were square between the two, and all games were equally optimized, then both companies would offer products with approximately identical value.
Which I would hate, because I'd lose the current 2-3x better value for the money I get with amd. I benefit and profit from nvidia being crooks, and I am happy that I can do that.
So nvidia, keep doing what you are doing. I am not really objecting, I am simply stating the facts. Of course, nvidia fanboys would have a problem understanding that, and a problem with anyone tarnishing the good name of that helpful, awesome, stripper-funding company.
Geez, developers can't catch a break... I've been a registered developer with ATI/AMD and NVIDIA both for years. Besides the research and papers and tools they invest in to further the interest of game development, I have never had to do anything but reap the rewards and say thank you. Maybe there are secret meetings and koolaid I am missing?
Not sure what the actual correct scenario is, to tell you the truth.
Either AAA dev is hamstringing everyone's PC experience because development is all console-centric (which means tweaking FOR an "AMD-owned architecture landscape" for two solid console cycles, where this console cycle alone sees both BOBCAT and PUMA being supported!), or the industry is in Nvidia's back pocket because of... what? The overwhelming PhysX support? There is an option to turn on Hair in a game? Their driver support is an evil conspiracy?
Whatever... If there is some koolaid, I wanna know where that line is. Gimme! I want me some green Koolaid!
Underhanded deals are part of the industry. Have you ever wondered about the prices even when price fixing is disallowed and supposedly abolished?
These deals are always happening; look at the public terms of the deal between NVIDIA and Intel.
NVIDIA gets: $1.5 billion over six years; a 6-year extension of the C2D/AGTL+ bus license; access to unspecified Intel microprocessor patents. Denver?
NVIDIA doesn't get: a DMI/QPI bus license (Nehalem/Sandy Bridge chipsets); an x86 license, including rights to make an x86 emulator.
For a company that was supposed to be headed into x86 wouldn't you say that's anti-competitive?
Prices, I don't know. Isn't it a combination of market demand, (lack of) competition, and shortages?
What does that deal have to do with this? Not giving certain licenses, publicly, is not the same as actively preventing devs from optimizing for the competition in the shadows.
It would not be illegal; it's called "having the market by its bawles". Have you ever actually talked to someone who does this, or do you just like assuming it's illegal and a conspiracy theory (@eddman)?
You're thinking about GameWorks, and GW is offered to developers under NDA, with the agreement prohibiting them from changing the GW libraries in any way or "optimizing" for AMD. And by the time AMD gets their hands on the proper optimizations to put in their drivers, it's pretty much too late. Also, GW targets all the weaknesses of AMD GPUs, like excessive use of tessellation. But since anywhere you throw a game you're going to find an Nvidia user (75% market share), game developers realize it's good for business and don't make too much noise.
Nothing is illegal, it's certainly not a conspiracy theory, it may very well be immoral but it's business. And it happens every day, all around you.
I'm not talking about GPU-bound GW effects that can be disabled in game, and I do know that they are closed-source and cannot be optimized for AMD.
I'm talking about the game code as a whole. Nvidia cannot legally prevent a developer from optimizing the non-GW code of a game (which is the main part) for AMD.
Besides, a lot of GW effects are CPU-only, so it doesn't matter what brand you use in such cases.
It cannot legally prevent devs from optimizing for amd, but it can legally cut their generous support.
There isn't that much difference between:
"If you optimize for AMD we will stop giving you money" and "If you optimize for AMD we will stop paying people to help you".
It is practically a combination of bribe and extortion. Both very much illegal.
And that's at studio level, at individual level nvidia can make it pretty much impossible to have a job at a good game studio.
Intel didn't exactly bribe anyone directly either, they just offered discounts. But they were found guilty.
Although if you ask me, intel being found guilty on all continents didn't really serve the purpose of punishing them for their crimes. It was more of a way to allow them to cheaply legalize their monopoly.
An actually punitive action would be fining them the full amount of money they made on their illegal practices, and cutting the company down to its pre-monopoly size. Instead the fines were pretty modest, even laughably low - a tiny fraction of the money they made on their illegal business practices - and they got to keep the monopoly they built on them as well.
So yeah, some people made some money, the damage done by intel was not undone by any means, and their monopoly built through illegal practices has been washed clean and made legal. Now that monopolist is as pure as the driven snow. And it took less than a quarter's worth of average net profit to wipe clean years of abuse.
amd was not in the position to offer discounts, intel was, so intel abused that on exclusive terms for monetary gains.
amd is not in the position to sponsor software developers, nvidia is, so nvidia abuses that on exclusive terms for monetary gains.
I don't really see a difference between the two. No one was exactly forced to take intel's exclusive discounts either, much like with nvidia's exclusive support and visits to strip clubs and such.
So, by citing intel as precedent, I'd say what nvidia does is VERY MUCH ILLEGAL.
Yeah, also amd was free to offer discounts to those who didn't sell intel products. They helped make two games, and gave Larry, who only sells amd systems, a free amd t-shirt as a reward. Because that's what amd can afford, after years of being sandbagged by intel, and after Hecktor made it buy ati for 3 times what it was worth so it would go bankrupt and be forced to sell its fabs to Hecktor's arab boyfriends.
What nvidia does is the same thing as lobbying. It is legalized bribery. You cannot give a briefcase of money to a politician and tell him to do what you want him to. But you can spend a briefcase of money to make a politician do what you want him to do. And it is not illegal; politicians have legalized it, and as far as they and the lobbyists are concerned, that is a good thing, a political contribution.
The same kind of advantage that allows nvidia to do that is what would give them the upper hand in court. It could be proven to be a crime, if only amd had enough money to out-sue nvidia. Which they don't. And if they did, they'd be able to support game developers, so it wouldn't even come to that. nvidia is friends with the big boys, amd is a perpetual underdog. In such scenarios, even if a lawsuit was to take place, it would be mostly a show for the public, and if found guilty, the punishment would be a symbolic and gentle slap on the wrist.
Now, with your question answered, do you feel better?
There is a high likelihood that we will see such a case against nvidia, but not until they have completely cemented their dominant position, and that case would only serve to wipe nvidia clean, so they can enjoy their dominance without being haunted by their past of sleazy illegal practices, giving them a clean slate at a very desirable price.
It can be as legal or illegal as killing people. What nvidia does is most certainly unfair business practices and abuse of its position.
The legal system is rarely about what is right or wrong. What nvidia does is certainly wrong. If they can get away with it, it is legal. If someone killed your entire family and then walked free because the legal system found him innocent, would you be as OK with it, defending his innocence the way you are doing for nvidia?
Are you by any chance on the spectrum? There is nothing wrong with helping to optimize software. For the last time - what is wrong is offering that help on implied exclusive terms. I don't know people who have been offered support by amd in exchange for sandbagging nvidia. But I know people who eventually optimized for amd and as a result lost the support nvidia offered prior to that. And the revenge didn't end there either; subsequent driver releases significantly worsened the performance of the already nvidia-optimized code.
nvidia doesn't help out of the kindness of their hearts or awesomeness, they do not even help to make the best out of their hardware, they only help if that would get them an unfair advantage, so it is implied that their help is only available to those who leave the amd rendering pipeline deliberately unoptimized.
...and some other people tell it otherwise. Who to believe? Can you provide anything solid to back that up?
"subsequent driver releases significantly worsened the performance of the already nvidia optimized code"
Which games? Which drivers? This one can be tested.
Why would nvidia reduce the performance of a game on their own cards, which is going to hurt them? The whole purpose of this was to make the game work best and sell cards based on that.
Also, granted, there are some amd optimized games, albeit few and far between. But that doesn't excuse what nvidia does, nor does it justify it.
Besides, it was nvidia who started this practice; amd simply tries its best to balance things out, but they have nowhere near the resources.
amd optimized games are so rare that, out of my many contacts in the industry, I don't know a single one. So I cannot speak to the kinds of terms on which amd offers their assistance. I can only do that for nvidia's terms.
If amd's terms for support are just as exclusive as nvidia's, then amd is guilty too. But even then, that doesn't make nvidia innocent. It makes amd guilty, and it makes nvidia like a hundred times guiltier.
Read the reply above. nvidia doesn't state the terms, because that would be illegal; the terms are implied, and they refuse further support if you break them... and worse... so it is a form of legal bribery.
And since their drivers are closed source, any hindrances they might implement to hamper your software remain a secret. But hey, there is a good reason why those drivers keep getting more and more bloated.
What if nothing is implied? Why are you so sure something must be implied if they're helping a dev? What if they simply want that game to work best with their hardware because it's an important game in their mind and might help sell some cards?
We are heavily into guessing and assuming territory.
I'm not saying shady stuff doesn't happen at all, but to think that it happens all the time without exception would be extreme exaggeration.
There is no "nothing implied". It doesn't take a genius to figure what nvidia's motivation for helping is. Of course, if it is an important, prominent title, nvidia might help out even if the studio optimizes for amd just to save face.
But then again, nvidia support can vary a lot; it can be just patching up something that would make them look bad, or it can be free graphics cards and strippers, as in the case of our lad above. I am sure he didn't optimize for amd. I mean that developers don't really care all that much how well their software runs - if it runs badly, just get a faster gpu. They care about how much pampering they get. So even in the case of a studio which is too big for nvidia to blackmail, there is still ample motivation to please it, for the perks which they won't be getting from amd.
There is no assuming in what I say. I know this first hand. nvidia is very kind and generous to those willing to play ball, and very thuggish with those who don't. So it doesn't come as a surprise that most developers choose to be on its good side. The more you please nvidia, the more you get from it. If tomorrow you apply for a job, and there is a sexy chick competing for the position, she can get the job by blowing the manager - and even if you would too, he is not into guys. You are not in the position to compete, and it is an unethical thing that wins her the job. You happy about it?
I don't understand your stubbornness. What ddriver is alluding to is questionable practices. In a free market system, fierce competition and all that, it becomes the norm. But it doesn't make it right. Free markets, you know, are like democracy. And you probably know what good old Winston had to say about that.
Agreed. Had Intel been fined the true value of what it gained with global dominance over the next 10+ years, things would be vastly different in Sunnyvale right now. So much so I would even speculate that Green team's only hope of viability would have come in the form of an acquisition on Blue team's part.
I was talking about optimizing Nvidia's libraries. When you're using an SDK to develop a game, you're relying a lot on that SDK. And if that's exclusively optimized for one GPU/driver combination, you're not going to develop an alternate engine that's also optimized for a completely different GPU/driver. And there's a limit to how much you can optimize for AMD when you're building a game using Nvidia's SDK.
Yes, the developer could go ahead and ignore any SDK out there (AMD or Nvidia) just so they're not lazy, but that would only bring worse results equally spread across all types of GPUs, and longer development times (with the associated higher costs).
AMD offers the same services technically but why would developers go for it? They're optimizing their game for just 25% of the market. Only now is AMD starting to push with the Bethesda partnership.
So to summarize:
- You cannot touch Nvidia's *libraries and code* to optimize them for AMD
- You are allowed to optimize your game for AMD without losing any kind of support from Nvidia, but when you're basing it on Nvidia's SDK there's only so much you can do
- AMD doesn't really support developers much with this, since optimizing a game based on Nvidia's SDK seems to be too much effort even for them, and AMD would rather have developers using the AMD libraries, but...
- Developers don't really want to put in triple the effort to also optimize for AMD when AMD has only 20% market share compared to Nvidia's 80% (discrete GPUs)
- None of this is illegal, it's "just business", and the incentive for developers is already there: Nvidia has the better cards so people go for them; it's logical that developers will follow
Again, most of those gameworks effects are CPU only. It does NOT matter at all what GPU you have.
As for GPU-bound gameworks, they are limited to just a few in-game effects that can be DISABLED in the options menu.
The main code of the game is not gameworks related and the developer can optimize it for AMD. Is it clear now?
Sure, it sucks that GPU-bound gameworks effects cannot be optimized for AMD and I don't like it either, but they are limited to only a few cosmetic effects that do not have any effect on the main game.
AMD demonstrated their "cache thing" (which seems to be tile-based rendering, as in Maxwell and Pascal) to result in a 50% performance increase. So 20% IPC might be far too conservative. I wouldn't bet on a 50% clock speed increase, though. nVidia designed Pascal for high clocks; it's not just the process. AMD seems to intend the same, but can they pull it off similarly well? If so, I'm inclined to ask "why did it take you so long?"
I look forward to Vega and seeing how much performance it brings, and I really hope it does end up giving performance around 1080 level for typically lower and more reasonable AMD pricing. But honestly, I expect it to probably come close to, but not quite match, a 1070 in DX11, surpass it in DX12, and at a much lower price.
Even if it's just 2 Polaris chips' worth of performance, you're past 1070 level. I think conservative is 1080 @ $400-450. Not that there won't be a cut-down part at 1070 level, but I'd be really surprised if that is the full die version.
Mind you, Volta is only coming to Tesla this year, and not to consumers until next year. So AMD should have a competitive full stack for a year. Good times!
Can't wait to pickup one of these in a year for ~$400. Easily future-proof for the next generation of console games, and my 780Ti is really showing its age, by about 154%...
If you think you are going to see a 1080 Ti in 12 months selling for $400, I've got a like-new Ferrari to sell you for $15k. It will be nearly summer before the AIB GPUs (ASUS, EVGA, Gigabyte, MSI, etc.) start becoming available in decent supply.
I know I'm an outlier, but having lived with SLI 980s for a couple of years I'm looking to go back to a single card, and I really wish I could find SLI 980 to 1080ti benchmarks :P
If it's of any help, I've done a fair few 980 SLI tests for 3DMark, Unigine, etc., you could compare those to 1080 Ti reviews (several sites have included 3DMark results, and Techpowerup has a couple of Unigine results threads/tables); PM me for links, or email (Google "Ian SGI", find the Contact page, use my Yahoo address).
It should be borne in mind that benchmark results can be misleading, especially when referring to the 780 Ti.
The 780 Ti launched at ridiculously low clocks of an 875MHz core clock and a 928MHz boost clock, which wasn't much different from what an OC'd GTX 580 could do.
When overclocked, the 780 Ti ran at around 1.25GHz, a huge difference over the stock card.
So, while the 1080 Ti has more than double the performance of an OC'd 780 Ti as per 3DMark Fire Strike, the performance shown herein for the 780 Ti is ridiculously low and thus misleading.
I guess if you're worried about someone making judgments of general GPU trends without fully researching it, that's true. But owners of 780 Ti cards should be aware of where their individual cards sit compared to the reference design. In any case it's always going to be hairy taking overclocking performance into account when making judgments about generational trends. For instance, 2017 1060s and 1080s are going to be able to overclock better than 2016 1060s and 1080s. There's also a whole range of clock speeds that come out. Do you take the fastest or the average? If you take the average do you take the average of the SKUs or an average weighted by units sold? It's not so easy to get an accurate picture without a lot of work.
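(To illustrate why the choice of average matters, here's a sketch with entirely made-up SKUs, clocks, and unit counts:)

```python
# Hypothetical AIB SKUs of one GPU model: (boost clock in MHz, units sold).
# Every number below is invented purely to show the two averages diverging.
skus = [(1480, 50_000), (1550, 120_000), (1650, 60_000), (1750, 15_000)]

plain_avg = sum(clock for clock, _ in skus) / len(skus)
weighted_avg = sum(clock * units for clock, units in skus) / sum(units for _, units in skus)

print(f"plain SKU average:  {plain_avg:.0f} MHz")    # 1608 MHz
print(f"sales-weighted avg: {weighted_avg:.0f} MHz") # 1572 MHz
```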
Some 780 Ti owners may have oc'd their cards that high, but not many I suspect. I've been searching for 780 Ti cards for a while, for CUDA, most tend to be around 980MHz at best.
Glad to see I have no need for a card that powerful and expensive. I run a single 27" 2K IPS monitor and my Gigabyte Extreme gaming 1080 is more than enough to keep me on the high end for years to come. 4K is 90% bragging rights in terms of visible difference and also game developer support.
Depends on the game and what kind of detail one likes. Site reviews and forum commentaries also don't take into account game mods which often significantly increase the GPU load (check out the OCN Best Skyrim Pics thread, have a look at the builds people are using).
I like to play games with all details/settings maxed out. Thus, such a card is very relevant. Sure, plenty of players don't mind if the fps drops to 30 or 40, but some like it smooth at a minimum. I've currently no interest in high-frequency monitors (which ironically can sensitise one's vision anyway), but I do seek a minimum of 60Hz, sync'd - something I can't get atm with a single 980 at 1920x1200. I plan on moving up to 4K soon; with a 1080 Ti, and likely being able to get away with turning off some of the AA options because of the higher pixel density (which regains performance), a minimum of 60Hz looks very possible.
It depends on one's needs; everyone has different thresholds of what they're happy with.
Quality work as always Ryan! I spotted a minor typo on the conclusions page "Because the GeForce GTX 1080 Tii Founder’s Edition isn’t NVIDIA’s first GP102-based product". Did a little midnight coffee spill and make that i key sticky? :)
I've given Ryan a LOT of heat for the last year's complete lack of, or very late, reviews.
But I'm also one to give credit where credit is due.
Amazing review Ryan; this rabbit you pulled, you should be really proud of, and the fact that you didn't hurt yourself on some motherboard or screwdriver and got it done on launch is really remarkable. Quite surprising, as I'd thought Anandtech would start to do fewer PC hardware reviews and focus more on mobile.
Really amazing work, and finally back to the highest standard of quality and timely reviews that Anand was known for. This great work is what I think he saw in you, and I hope you can keep it up and keep AT at this level, as "the bench" against which all other reviews are measured!
I think everyone knew exactly how this was going to perform, as it was pretty much a Titan, less 1GB of GDDR5X (although a bit faster due to newer bins) and with 88 rather than 96 ROPs. Otherwise largely identical.
The timeliness of this review has been great, but I was wondering about reviews for any of the Polaris family, especially the 480 and 460. I know there was a preview on the 480, but are there any plans to do a full review on any of the parts?
The 'preview' of the RX 480 wasn't really any less detailed than this review, it's just missing compute and synthetic benchmarks. Plenty of detail on the background to the card and the architecture.
It would've been nice to test more modern games than that, at least introduce Resident Evil 7 etc. Of course a new high end card is going to play old games better....
Fermi technically had a 2-year lifecycle. The only reason it was special is because the original silicon had issues, and NV opted to fix it.
There are still yearly product refreshes, but GPUs are too expensive to develop (and the underlying manufacturing process too slow to advance) to do top-to-bottom new GPUs every single year.
Disappointed to STILL not see ANY VR-related benchmarks in the suite. VR rendering has unique rendering requirements, and a large share of gamers have moved or are planning to move to VR during the lifetime of this card.
I'd love to see how that compares farther back, adjusted for inflation. My rose-tinted goggles have me remembering top-tier GPUs launching for $400 or less back in the early 2000s, but who knows how that compares to 2017 dollars.
No titans, because they are very niche cards for those gamers who cannot wait and/or have more money than sense. The following Ti variants perform almost as well as the titan cards anyway.
No radeons because this is an nvidia-only chart. I should've titled it as such. I focused on nvidia because ATI/AMD usually don't price their cards so high.
Ok on the Radeon angle, shoulda realised it was NV only. :D
However, your description of those who buy Titans cannot be anything other than your own opinion, and what you don't realise is that for many store owners it's these very top-tier cards which bring in the majority of their important profit margins. They make far less on the mainstream cards. The enthusiast market is extremely important, whether or not one individually regards the products as being relevant for one's own needs. You need to be objective here.
I think I am being objective. Titan cards do not fit into the regular geforce range. They are like an early pass. Wait a few months and you can have a Ti that performs the same at a much lower price.
If nvidia never released a similar performing Ti card, I would've included them.
Those days are long-gone, and not just because of profit taking. 16/14nm FinFET GPUs are astonishingly expensive to design and fab. The masks are in the millions, and now everything has to be double-patterned.
The only reason that we had $500 cards is because of the fierce competition from ATI back then.
Whenever ATI's cards couldn't compete, or could but were not launched yet, nvidia jacked the prices up. I don't know why people forget the $600 8800 GTX or $650 GTX 280. Take a look at the link I posted above.
What I miss is being able to buy a couple of well-priced mid-range cards that beat the high-end, with good scaling. I couldn't afford the 580 when it was new, but 2x 460 SLI was faster and served nicely for a good while. With support & optimisations moving away from SLI/CF though (lesser gains, more stuttering, costly connectors, unlock codes, etc.), a single good GPU is more attractive, but the cost is way up the scale compared to 5 years ago.
Have a look at the Anand review for the 280 though, it shows what I mean: 2x 8800GT SLI was faster than the 280, but $200 cheaper and with excellent scaling. I had 2x 8800GT 1GB before switching to the two 460s. Today, mid-range cards don't even support SLI (GTX 1060).
So 3.5 years later, Nvidia's flagship consumer card has improved 260% while using pretty much the same amount of power and generating pretty much the same level of noise.
If this continues, I can see that in the next few years, all sort of software will find a way to utilise the GPU more, not just games and neural networks.
Nvidia has a lot to be proud of. Their execution in the past few years has been Apple-esque.
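(Putting a rough annual rate on that 260% figure - it reads as either 2.6x or 3.6x the old performance depending on interpretation, so here's the compounding for both readings:)

```python
years = 3.5
for total in (2.6, 3.6):  # "improved 260%": 2.6x or 3.6x, depending on reading
    annual = total ** (1 / years) - 1  # compound annual improvement rate
    print(f"{total}x over {years} years = ~{annual * 100:.0f}% per year")  # ~31% / ~44%
```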
Unfortunately no. We don't have a central office; I'm in the US with the GPUs, and Ian is in the UK with the CPUs. He's working on game testing for Ryzen Part 2, but we likely won't be able to include the GTX 1080 Ti.
I'd love to see an Anandtech investigation of what pairings of CPU and GPU really do give the best FPS/price ratio. Could a 1080 Ti bottleneck an i5, for example? Would a 7600 be OK but a 2500 choke?
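(The core of such an investigation would look something like the sketch below - every FPS and price figure here is a made-up placeholder, purely to show the metric:)

```python
# Hypothetical pairings: (label, avg FPS, combined CPU+GPU street price in USD).
# All numbers are placeholders, not measurements.
pairings = [
    ("i5 + 1080 Ti", 95, 940),
    ("i7 + 1080 Ti", 105, 1040),
    ("i5 + 1080", 90, 740),
]

# Rank by FPS per dollar; a large FPS gap between i5 and i7 pairings
# with the same GPU would hint at a CPU bottleneck.
for label, fps, price in sorted(pairings, key=lambda p: p[1] / p[2], reverse=True):
    print(f"{label}: {fps / price * 1000:.0f} FPS per $1000")
```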
Do Re Mi Fa So La Ti .... "Ti" is a clever marketing name for a penultimate card. 11GB further emphasizes that it is leading buyers toward the ultimate.
Should be interesting to see how AMD does with 8GB for Vega.
Looks like the best Vega can hope for is a tie, and they themselves (Raja) said they only had enough software engineers to be working on Vulkan drivers. So that means DX11/OpenGL and possibly DX12 won't be very good, or could take ages to catch up if AMD doesn't make some money to hire more people. I really hope the Vega launch doesn't end up like Ryzen (motherboards all over the place having issues, SMT issues in games, etc.). I think I'll wait out Vega, and maybe even a 1080 r2 with GDDR5X & faster clocks, before jumping. Also have to wait for AMD to fix Ryzen, if they can. At least the motherboard part, as sites like PCPer think the game part will stay the same forever. AMD talking about a Ryzen rev 2 already (that fixes things) makes me think PCPer etc. are correct.
Still, an exciting time for hardware, and a great time to buy a PC this year. Even the low end is probably getting a major boost. The 1060 is low-end to me - no point in spending under $200 if you want to really game, IMHO - and faster speeds and GDDR5X could be interesting. That combo could make a really great HTPC. I wonder how much faster the 1060 will get. Pity it seems AMD has no access to GDDR5X, as nvidia is using it all.
Just read Hardocp's review, 30-35% faster than 1080. Much the same as here. AMD has a rough road ahead (so do their shareholders). That said, competition is good :)
Hm, 11GB RAM, 11Gbps, 11.3 TFLOPs. Why do I get the impression Marketing wanted to use the phrase "goes to eleven" for this card? Did they announce it with Spinal Tap music?
Thanks Ryan, nice review. Still missing info/specs on audio (inc. supported sampling rates), which, I suspect, you still do not have at hand. I have asked about this in the past.
Nvidia is so gay. I don't understand why people complain that intel's chips slowly optimize over the years. You seriously want your hardware to depreciate 50% every year?
Hi. After going through the well-written review, I think: wouldn't it be nice if AIB partners (at least one of them) released a blower-type card with 3 slots, so that it could include a heftier heatsink yet still exhaust heat out? That way I could consider putting it in a congested chassis and not worry about thermal throttling. PS: here in India, my Zotac GTX 1060 Mini reaches 78°C even in open air!
I have a GTX 1080 Ti and to date I've been using FSX, which apparently devotes more use to the CPU than the GPU. I've just loaded P3D - and it really does look super smooth. The temp maxed out around 85 degrees - and my monitoring software was having kittens showing me temps in the RED zone.
Can this GPU continue to run at this temp indefinitely? For unlimited hours?
Jon Tseng - Thursday, March 9, 2017 - link
Launch day Anandtech review?My my wonders never cease! :-)
Ryan Smith - Thursday, March 9, 2017 - link
For my next trick, watch me pull a rabbit out of my hat.blanarahul - Thursday, March 9, 2017 - link
Ooh.YukaKun - Thursday, March 9, 2017 - link
/clapsGood article as usual.
Cheers!
Yaldabaoth - Thursday, March 9, 2017 - link
Rocky: "Again?"Ryan Smith - Thursday, March 9, 2017 - link
No doubt about it. I gotta get another hat.Anonymous Blowhard - Thursday, March 9, 2017 - link
And now here's something we hope you'll really like.close - Friday, March 10, 2017 - link
Quick question: shouldn't the memory clock in the table on the fist page be expressed in Hz instead of bps being a clock and all? Or you could go with throughput but that would be just shy of 500GBps I think...Ryan Smith - Friday, March 10, 2017 - link
Good question. Because of the various clocks within GDDR5(X)*, memory manufacturers prefer that we list the speed as bandwidth per pin instead of frequency. The end result is that the unit is in bps rather than Hz.* http://images.anandtech.com/doci/10325/GDDR5X_Cloc...
close - Friday, March 10, 2017 - link
Probably due to the QDR part that's not obvious from reading a just the frequency. Thanks.MrSpadge - Thursday, March 9, 2017 - link
An HBM2 equipped vega(n) rabbit?eek2121 - Thursday, March 9, 2017 - link
Before you do that though, you should test Ryzen with the Ti. Reviewers everywhere are showing that for whatever reason, Ryzen shines with the 1080 Ti at 4k.just4U - Friday, March 10, 2017 - link
I did a double take there as I actually thought you type pull a rabbit out of my a... Was like ... wait, what?? (..chuckle) Anyway, good review Ryan. I read about the Ti being out soon.. didn't realize it was here already.Drumsticks - Thursday, March 9, 2017 - link
Nice review Ryan.I can't wait to see what Vega brings. I'm hoping we at least get a price war over a part that can sit in between the 1080 and Ti parts. I would love to see Vega pull off 75% faster than a Fury X (50% clock speed boost, 20% more IPC?) but wow that would be a tough order. Let's just hope AMD can bring some fire back to the market in May.
MajGenRelativity - Thursday, March 9, 2017 - link
I'm also extremely interested in seeing what Vega brings as well. My wallet is ready to drop the bills necessary to get a card in this price range, but I'm waiting for Vega to see who gets my money.ddriver - Thursday, March 9, 2017 - link
It will bring the same thing as ever - superior hardware nvidia will pay off most game developers to sandbag, forcing amd to sell at a very nice price to the benefit of people like me, who don't care about games but instead use gpus for compute.For compute amd's gpus are usually 2-3 TIMES better value than nvidia. And I have 64 7950s in desperate need of replacing.
MajGenRelativity - Thursday, March 9, 2017 - link
That's a lot of 7950s. What do you compute with them?A5 - Thursday, March 9, 2017 - link
Fake internet money, I assume. And maybe help the power company calculate his bill...ddriver - Thursday, March 9, 2017 - link
Nope, I do mostly 3D rendering, multiphysics simulations, video processing and such. Cryptocurrency is BS IMO, and I certainly don't need it.MajGenRelativity - Thursday, March 9, 2017 - link
I'm assuming you do that for your job? If not, that's an expensive hobby :Pddriver - Thursday, March 9, 2017 - link
It is kinda both, although I wouldn't really call it a job, because that's when you are employed by someone else to do what he says. More like it's my work and hobby. Building a super computer on the budget out of consumer grade hardware turned out very rewarding in every possible aspect.Zingam - Friday, March 10, 2017 - link
This is something I'd like to do. Not necessarily with GPUs but I have no idea how to make any money tobpay the bill yet. I'vw only started thinking about it recently.eddman - Thursday, March 9, 2017 - link
"nvidia will pay off most game developers to sandbag"AMD, nvidia, etc. might work with developers to optimize a game for their hardware.
Suggesting that they would pay developers to deliberately not optimize a game for the competition or even make it perform worse is conspiracy theories made up on the internet.
Not to mention it is illegal. No one would dare do it in this day and age when everything leaks eventually.
DanNeely - Thursday, March 9, 2017 - link
Something that blatant would be illegal. What nVidia does do is to offer a bunch of blobs that do various effects simulations/etc that can save developers a huge amount of time vs coding their own versions but which run much faster on their own hardware than nominally equivalent AMD cards. I'm not even going accuse them of deliberately gimping AMD (or Intel) performance, only having a single code path that is optimized for the best results on their hardware will be sub-optimal on anything else. And because Gameworks is offered up as blobs (or source with can't show it to AMD NDA restrictions) AMD can't look at the code to suggest improvements to the developers or to fix things after the fact with driver optimizations.eddman - Thursday, March 9, 2017 - link
True, but most of these effects are CPU-only, and fortunately the ones that run on the GPU can be turned off in the options.Still, I agree that vendor specific, source-locked GPU effects are not helping the industry as a whole.
ddriver - Thursday, March 9, 2017 - link
Have you noticed anyone touching nvidia lately? They are in bed with the world's most evil bstards. Nobody can touch them. Their practice is they offer assistance on exclusive terms, all this aims to lock in developers into their infrastructure, or the very least on the implied condition they don't break a sweat optimizing for radeons.I have very close friends working at AAA game studios and I know first hand. It all goes without saying. And nobody talks about it, not if they'd like to keep their job, or be able to get a good job in the industry in general.
nvidia pretty much do the same intel was found guilty of on every continent. But it is kinda less illegal, because it doesn't involve discounts, so they cannot really pin bribery on them, in case that anyone would dare challenge them.
amd is actually very competitive hardware wise, but failing at their business model, they don't have the money to resist nvidia's hold on the market. I run custom software at a level as professional as it gets, and amd gpus totally destroy nvidian at the same or even higher price point. Well, I haven't been able to do a comparison lately, as I have migrated my software stack to OpenCL2, which nvidia deliberately do not implement to prop up their cuda, but couple of years back I was able to do direct comparisons, and as mentioned above, nvidia offered 2 to 3 times worse value than amd. And nothing has really changed in that aspect, architecturally amd continue to offer superior compute performance, even if their DP rates have been significantly slashed in order to stay competitive with nvidia silicon.
A quick example:
~2500$ buys you either a:
fire pro with 32 gigs of memory and 2.6 tflops FP64 perf and top notch CL support
quadro with 8 gigs of memory and 0.13 tflops FP64 perf and CL support years behind
Better compute features, 4 times more memory and 20 times better compute performance at the same price. And yet the quadro outsells the firepro. Amazing, ain't it?
It is true that 3rd party cad software still runs a tad better on a quadro, for the reasons and nvidian practices outlined above, but even then, the firepro is still fast enough to do the job, while completely annihilating quadros in compute. Which is why at this year's end I will be buying amd gpus by the dozens rather than nvidia ones.
eddman - Friday, March 10, 2017 - link
So you're saying nvidia constantly engages in illegal activities with developers?I don't see how pro cards and software have to do with geforce and games. There is no API lock-in for games.
thehemi - Friday, March 10, 2017 - link
> "And nobody talks about it, not if they'd like to keep their job"Haha we're not scared of NVIDIA, they are just awesome. I'm in AAA for over a decade, they almost bought my first company and worked closely with my next three so I know them very well. Nobody is "scared" of NVIDIA. NVIDIA have their devrel down. They are much more helpful with optimizations, free hardware, support, etc. Try asking AMD for the same and they treat you like you're a peasant. When NVIDIA give us next-generation graphics cards for all our developers for free, we tend to use them. When NVIDIA sends their best graphics engineers onsite to HELP us optimize for free, we tend to take them up on their offers. Don't think I haven't tried getting the same out of AMD, they just don't run the company that way, and that's their choice.
And if you're really high up, their dev-rel includes $30,000 nights out that end up at the strip club. NVIDIA have given me some of the best memories of my life, they've handed me a next generation graphics card at GDC because I joked that I wanted one, they've funded our studio when it hit a rough patch and tried to justify it with a vendor promotion on stage at CES with our title. I don't think that was profitable for them, but the good-will they instilled definitely has been.
I should probably write a "Secret diaries of..." blog about my experiences, but the bottom line is they never did anything but offer help that was much appreciated.
Actually, heh, The worst thing they did, was turn on physx support by default for a game we made with them for benchmarks back when they bought Ageia. My game engine was used for their launch demo, and the review sites (including here I think) found out that if you turned a setting off to software mode, Intel chips doing software physics were faster than NVIDIA physics accelerated mode. Still not illegal, and still not afraid of keeping my job, since I've made it pretty obvious who I am to the right people.
ddriver - Friday, March 10, 2017 - link
Well, for you it might be the carrot, but for others is the stick. Not all devs are as willing to leave their products upoptimized in exchange for a carrot as you are. Nor do they need nvidia to hold them by the hand and walk them through everything that is remotely complex in order to be productive.In reality both companies treat you like a peasant, the difference is that nvidia has the resources to make into a peasant they can use, while to poor old amd you are just a peasant they don't have the resources to pamper. Try this if you dare - instead of being a lazy grateful slob take the time and effort to optimize your engine to take the most of amd hardware and brag about that marvelous achievement, and see if nvidia's pampering will continue.
It is still technically a bribe - helping someone to do something for free that ends up putting them at an unfair advantage. It is practically the same thing as giving you the money to hire someone who is actually competent to do what you evidently cannot be bother with or are unable to do. They still pay the people who do that for you, which would be the same thing if you paid them with money nvidia gave you for it. And you are so grateful for that assistance, that you won't even be bothered to optimize your software for that vile amd, who don't rush to offer to do your job for you like noble, caring nvidia does.
ddriver - Friday, March 10, 2017 - link
It is actually a little sad to see developers so cheap. nvidia took you to see strippers once and now you can't get your tongue out their ass :)but it is understandable, as a developer there is a very high chance it was the first pussy you've seen in real life :D
eddman - Friday, March 10, 2017 - link
So we moved from "nvidia pays devs to deliberately not optimize for AMD" to "nvidia works with devs to optimize the games for their own hardware, which might spoil them and result in them not optimizing for AMD properly".How is that bribery, illegal? If they did not prevent the devs from optimizing for AMD then nothing illegal happened. It was the devs own doing.
ddriver - Friday, March 10, 2017 - link
Nope, there is an implicit, unspoken condition to receiving support from nvidia. To lazy slobs, that's welcome, and most devs are lazy slobs. Their line of reasoning is quite simple:"Working to optimize for amd is hard, I am a lazy and possibly lousy developer, so if they don't do that for me like nvidia does, I won't do that either, besides that would angry nvidia, since they only assist me in order to make their hardware look better, if I do my job and optimize for amd and their hardware ends up beating nvidia's, I risk losing nvidia's support, since why would they put money into helping me if they don't get the upper hand in performance. Besides, most people use nvidia anyway, so why even bother. I'd rather be taken to watch strippers again than optimize my software."
Manipulation, bribery and extortion. nvidia uses its position to create a situation in which game developers have a lot to gain from NOT optimizing for amd, and a lot to lose if they do. Much like intel did with its exclusive discounts. OEMs weren't exactly forced to take those discounts in exchange for not selling amd; they did what they knew would please intel to get rewarded for it. Literally the same thing nvidia does. Game developers know nvidia will be pleased to see its hardware getting an unfair performance advantage, and they know amd doesn't have the money to pamper them, so they do what is necessary to please nvidia and ensure they keep getting support.
akdj - Monday, March 13, 2017 - link
Where to start? Best not to start, as you are completely, 100% insane. I've spent two and a half 'reads' of your replies trying to grasp WTH you're talking about, and I'm lost.
Totally, completely lost in your conspiracy theories about two major GPU silicon builders while being apparently and completely clueless about ANY of it!
Lol - Wow, I'm truly astounded that you were able to make up that much BS ...
cocochanel - Friday, March 10, 2017 - link
You forgot to mention one thing: Nvidia tweaking the drivers to force users into hardware updates. Say there is a bunch of games coming up this Christmas. If you have a card that's 3-4 years old, they release a new driver which performs poorly on your card (in those games) and another driver which performs way better on the newest cards. Then, if you start crying, they say: it's an old card, pal, why don't you buy a new one!
With DX11 they could do that a lot. With DX12 and Vulkan it's a lot harder. Most if not all optimizations have to be done by the game programmers. Very little is left to the driver.
eddman - Friday, March 10, 2017 - link
That's how the ENTIRE industry is. Do you really expect developers to optimize for old architectures? Everyone does it: nvidia, AMD, intel, etc.
It is not deliberate. Companies are not going to spend time and money on old hardware with little market share. That's how it's been forever.
Before you say that's not the case with radeons, it's because their GCN architecture hasn't changed dramatically since its first iteration. As a result, any optimization done for the latest GCN, affects the older ones to some extent too.
cocochanel - Friday, March 10, 2017 - link
There is good news for the future. As DX12 and Vulkan become mainstream APIs, game developers will have to roll up their sleeves and sweat it hard. Architecturally, these APIs are totally different from the ground up, and both trace their origin to Mantle. And Mantle was the biggest advance in graphics APIs in a generation. The good days for lazy game developers are coming to an end, since these new APIs put just about everything back into their hands, whether they like it or not. Tweaking the driver won't make much of a difference. Read the APIs' documentation.
cmdrdredd - Monday, March 13, 2017 - link
Yes, hopefully this will be the future, where games are the responsibility of the developer. Just like on consoles. I know people hate consoles sometimes, but the closed platform shows which developers have their stuff together and which are lazy bums, because Sony and Microsoft don't optimize anything for the games.
Nfarce - Friday, March 10, 2017 - link
Always amusing watching the tin foil hat Nvidia conspiracy nuts talk. Here's my example: working on Project Cars as an "early investor." Slightly Mad Studios gave Nvidia and AMD 12 copies each of the beta release to work on, the same copy I bought. Nvidia was in constant communication with SMS developers, and AMD was all but never heard from. After about six months, Nvidia had a demo of the racing game ready for a promotion of their hardware. Since AMD didn't take Project Cars seriously with SMS, Nvidia was able to get the game tweaked better for Nvidia. And SMS hat-tipped Nvidia with billboards in the game showing Nvidia logos.
Of course all the AMD fanboys claimed unfair competition and did the usual whining about their GPUs not performing as well in some games as Nvidia's (they amazingly stayed silent when DiRT Rally, another development I was involved with, ran better on AMD GPUs and had AMD billboards).
ddriver - Friday, March 10, 2017 - link
So was there anything preventing the actual developers from optimizing the game? They didn't have nvidia and amd hardware, so they sent betas to the companies to profile things and see how it runs?
How silly must one be to expect that nvidia - a company that rakes in billions every year - and amd - a company that is in the red most of the time and has lost billions - will have the same capacity to do game developers' jobs for them?
It is the game developer's job to optimize. Alas, it seems nvidia has bred a new breed of developers - those who do their job half-assedly and then wait on nvidia to optimize, conveniently creating an unfair advantage for its hardware.
ddriver - Friday, March 10, 2017 - link
Also, talking about fanboys - I am not one. Yes, I am running dozens of amd gpus, and I don't see myself buying any nvidia product any time soon, but that's only because they offer superior value for what I need them for.
I don't give amd extra credit for offering better value. I know this is not what they want. It is what they are being forced into.
I am in a way grateful to nvidia for sandbagging amd, because this way I can get much better value products. If things were square between the two, and all games were equally optimized, then both companies would offer products of approximately identical value.
Which I would hate, because I'd lose the current 2-3x better value for the money I get with amd. I benefit and profit from nvidia being crooks, and I am happy that I can do that.
So nvidia, keep doing what you are doing. I am not really objecting, I am simply stating the facts. Of course, nvidia fanboys would have a problem understanding that, and a problem with anyone tarnishing the good name of that helpful, awesome, pays-for-strippers company.
theuglyman0war - Sunday, March 12, 2017 - link
Geez, developers can't catch a break... I've been a registered developer with ATI/AMD and NVIDIA both for years...
Beyond the research, papers and tools they invest in to further the interest of game development, I have never had to do anything but reap the rewards and say thank you. Maybe there are secret meetings and koolaid I am missing?
Not sure what the actual correct scenario is, to tell you the truth.
Either AAA dev is hamstringing everyone's PC experience because development is all console centric (which means tweaking FOR an "AMD owned architecture landscape" for two solid console cycles, where "this console cycle alone sees both BOBCAT and PUMA being supported"!)
Or the industry is in Nvidia's back pocket because of what? The overwhelming PhysX support? There is an option to turn on Hair in a game? Their driver support is an evil conspiracy?
Whatever... If there is some koolaid I wanna know where that line is? Gimme! I want me some green Koolaid!
theuglyman0war - Sunday, March 12, 2017 - link
Turns out you're right! There is a monopoly in game development software optimization supporting only one hardware platform! Alert the press!
http://developer.amd.com/wordpress/media/2012/10/S...
HomeworldFound - Thursday, March 9, 2017 - link
Underhanded deals are part of the industry. Have you ever wondered about the prices, even when price fixing is disallowed and supposedly abolished?
These deals are always happening; look at the public terms of the deal between NVIDIA and Intel.
Nvidia gets:
1.5 Billion Dollars Over Six Years
6 Year Extension of C2D/AGTL+ Bus License
Access To Unspecified Intel Microprocessor Patents. Denver?
NVIDIA Doesn't Get:
DMI/QPI Bus License; Nehalem/Sandy Bridge Chipsets
x86 License, Including Rights To Make an x86 Emulator
For a company that was supposed to be headed into x86, wouldn't you say that's anti-competitive?
eddman - Friday, March 10, 2017 - link
Prices, I don't know. Isn't it a combination of market demand, (lack of) competition and shortages?
What does that deal have to do with this? Not giving certain licenses, publicly, is not the same as actively preventing devs from optimizing for the competition in the shadows.
theuglyman0war - Sunday, March 12, 2017 - link
Well, there was/is a lot of litigation, it seems, admittedly!
close - Friday, March 10, 2017 - link
It would not be illegal; it's called "having the market by its balls". Have you ever actually talked to someone who does this, or do you just like assuming it's illegal and a conspiracy theory (@eddman)?
You're thinking about GameWorks, and GW is offered to developers under NDA, with the agreement prohibiting them from changing the GW libraries in any way or "optimizing" them for AMD. And by the time AMD gets their hands on the proper optimizations to put in their drivers, it's pretty much too late. Also, GW targets all the weaknesses of AMD GPUs, like excessive use of tessellation.
But since everywhere you throw a game at you're going to find an Nvidia user (75% market share), game developers realize it's good for business and don't make too much noise.
Nothing is illegal, it's certainly not a conspiracy theory, it may very well be immoral but it's business. And it happens every day, all around you.
eddman - Friday, March 10, 2017 - link
I'm not talking about GPU-bound GW effects that can be disabled in game, and I do know that they are closed-source and cannot be optimized for AMD.
I'm talking about whole game codebases. Nvidia cannot legally prevent a developer from optimizing the non-GW code of a game (which is the main part) for AMD.
Besides, a lot of GW effects are CPU-only, so it doesn't matter what brand you use in such cases.
ddriver - Friday, March 10, 2017 - link
It cannot legally prevent devs from optimizing for amd, but it can legally cut their generous support.
There isn't that much difference between:
"If you optimize for AMD we will stop giving you money" and
"If you optimize for AMD we will stop paying people to help you".
It is practically a combination of bribery and extortion. Both very much illegal.
And that's at the studio level; at the individual level, nvidia can make it pretty much impossible to get a job at a good game studio.
Intel didn't exactly bribe anyone directly either, they just offered discounts. But they were found guilty.
Although if you ask me, intel being found guilty on all continents didn't really serve the purpose of punishing them for their crimes. It was more of a way to allow them to cheaply legalize their monopoly.
An actually punitive action would be fining them the full amount of money they made from their illegal practices, and cutting the company down to its pre-monopoly size. Instead the fines were modest, even laughably low - a tiny fraction of the money they made from their illegal business practices - and they got to keep the monopoly they built on them as well.
So yeah, some people made some money, the damage done by intel was not undone by any means, and their monopoly built through illegal practices has been washed clean and made legal. Now that monopolist is as pure as the driven snow. And it took less than a quarter's worth of average net profit to wipe clean years of abuse.
ddriver - Friday, March 10, 2017 - link
amd was not in the position to offer discounts, intel was, so intel abused that on exclusive terms for monetary gain
amd is not in the position to sponsor software developers, nvidia is, so nvidia abuses that on exclusive terms for monetary gain
I don't really see a difference between the two. No one was exactly forced to take intel's exclusive discounts either, much like with nvidia's exclusive support and visits to strip clubs and such.
So, by citing intel as precedent, I'd say what nvidia does is VERY MUCH ILLEGAL.
eddman - Friday, March 10, 2017 - link
Then why has nothing happened all these years, for something that seems to be open knowledge among game publishers, developers and GPU manufacturers?
A dev asking nvidia for help optimizing a game is not illegal. It is a service. AMD is free to do the same.
What about those games where AMD helped the devs to optimize for their hardware? Was that illegal too? It's either illegal for both or none.
If they've been doing it openly for years and even put their logos in games' loading screens, then it's safe to say it's not illegal at all.
ddriver - Friday, March 10, 2017 - link
Yeah, and amd was also free to offer discounts to those who didn't sell intel products. They helped make two games, and gave Larry, who only sells amd systems, a free amd t-shirt as a reward. Because that's what amd can afford, after years of being sandbagged by intel, and after Hecktor made it buy ati for 3 times what it was worth so it would go bankrupt and be forced to sell its fabs to Hecktor's arab boyfriends.
eddman - Friday, March 10, 2017 - link
You did not answer the question.
It has been known in the industry for many years, yet nothing has happened.
ddriver - Friday, March 10, 2017 - link
What nvidia does is the same thing as lobbying. It is legalized bribery. You cannot give a briefcase of money to a politician and tell him to do what you want him to. But you can spend a briefcase of money to make a politician do what you want him to do. And it is not illegal; politicians have legalized it, and as far as they and the lobbyists are concerned, that is a good thing - a political contribution.
The same kind of advantage that allows nvidia to do that is what would give them the upper hand in court. It could be proven to be a crime, if only amd had enough money to out-sue nvidia. Which they don't. And if they did, they'd be able to support game developers, so it wouldn't even come to that. nvidia is friends with the big boys; amd is a perpetual underdog. In such a scenario, even if a lawsuit were to take place, it would be mostly a show for the public, and if nvidia were found guilty, the punishment would be a symbolic and gentle slap on the wrist.
Now, with your question answered, do you feel better?
ddriver - Friday, March 10, 2017 - link
There is a high likelihood that we will see such a case against nvidia, but not until they have completely cemented their dominant position, and that case would only serve to wipe nvidia clean, so they can enjoy their dominance without being haunted by their past of sleazy illegal practices, giving them a clean slate at a very desirable price.
eddman - Friday, March 10, 2017 - link
So it is not illegal to help devs with optimizations?
ddriver - Friday, March 10, 2017 - link
It can be as legal or illegal as killing people. What nvidia does is most certainly unfair business practice and abuse of its position.
The legal system is rarely about what is right or wrong. What nvidia does is certainly wrong. If they can get away with it, it is legal. If someone killed your entire family and then walked free because the legal system found him innocent, would you be as OK with it, defending his innocence the way you are doing for nvidia?
eddman - Friday, March 10, 2017 - link
That comparison isn't even remotely relevant.
Companies across the entire computing industry work with devs all the time to make sure software works best with their hardware. It has never been illegal.
ddriver - Friday, March 10, 2017 - link
Are you by any chance on the spectrum? There is nothing wrong with helping to optimize software. For the last time - what is wrong is offering that help on implied exclusive terms. I don't know people who have been offered support by amd in exchange for sandbagging nvidia. But I know people who eventually optimized for amd and as a result lost the support nvidia had offered prior to that. And the revenge didn't end there either; subsequent driver releases significantly worsened the performance of the already nvidia-optimized code.
nvidia doesn't help out of the kindness of their hearts or awesomeness. They do not even help to make the best out of their hardware; they only help if it gets them an unfair advantage, so it is implied that their help is only available to those who leave the amd rendering pipeline deliberately unoptimized.
eddman - Saturday, March 11, 2017 - link
...and some other people say otherwise. Who to believe? Can you provide anything solid to back that up?
"subsequent driver releases significantly worsened the performance of the already nvidia-optimized code"
Which games? Which drivers? This one can be tested.
Why would nvidia reduce the performance of a game on their own cards, which is going to hurt them? The whole purpose of this was to make the game work best and sell cards based on that.
eddman - Saturday, March 11, 2017 - link
What about you? Are you on the spectrum?
ddriver - Friday, March 10, 2017 - link
Also, granted, there are some amd-optimized games, albeit few and far between. But that doesn't excuse what nvidia does, nor does it justify it.
Besides, it was nvidia who started this practice; amd simply tries its best to balance things out, but they have nowhere near the resources.
amd-optimized games are so rare that, out of my many contacts in the industry, I don't know a single one who worked on one. So I cannot speak to the kinds of terms on which amd offers its assistance. I can only do that for nvidia's terms.
If amd's terms for support are just as exclusive as nvidia's, then amd is guilty too. But even then, that doesn't make nvidia innocent. It makes amd guilty, and it makes nvidia a hundred times guiltier.
eddman - Friday, March 10, 2017 - link
What terms? Are we back to "if we help, you cannot optimize your game for AMD"? How do you know there are such terms?
Also, you said the entire helping-out thing is illegal, terms or no terms. Now it's illegal only if there are certain terms?
ddriver - Friday, March 10, 2017 - link
Read the reply above. nvidia doesn't state the terms, because that would be illegal; the terms are implied, and they refuse further support if you break them... and worse... so it is a form of legal bribery
and since their drivers are closed source, any hindrances they might implement to hamper your software remain a secret, but hey, there is a good reason why those drivers keep getting more and more bloated
eddman - Friday, March 10, 2017 - link
What if nothing is implied? Why are you so sure something must be implied if they're helping a dev? What if they simply want that game to work best with their hardware because it's an important game in their mind and might help sell some cards?
We are heavily into guessing and assuming territory.
I'm not saying shady stuff doesn't happen at all, but to think that it happens all the time, without exception, would be an extreme exaggeration.
ddriver - Friday, March 10, 2017 - link
There is no "nothing implied". It doesn't take a genius to figure what nvidia's motivation for helping is. Of course, if it is an important, prominent title, nvidia might help out even if the studio optimizes for amd just to save face.But then again, nvidia support can vary a lot, it can be just patching up something that would make them look bad, it can be free graphics cards and strippers as in the case of our lad above. I am sure he didn't optimize for amd. I mean that developers don't really care all that much how well their software runs, if it runs bad, just get a faster gpu. They care about how much pampering they get. So even in the case of a studio which is too big for nvidia to blackmail, there is still ample motivation to please it for the perks which they won't be getting from amd.
There is no assuming in what I say. I know this first hand. nvidia is very kind and generous to those willing to play ball, and very thuggish with those who don't. So it doesn't come as a surprise that most developers choose to be on its good side. The more you please nvidia, the more you get from it. If tomorrow you apply for a job, and there is a sexy chick competing for the position, she can get the job by blowing the manager, and even if you would too, he is not into guys. You are not in a position to compete, and it is an unethical thing that wins her the job. You happy about it?
eddman - Friday, March 10, 2017 - link
You know this first hand how? You are claiming a lot and providing nothing concrete.
eddman - Friday, March 10, 2017 - link
I already listed the motivation. The game runs well on their cards. People buy their cards.
cocochanel - Saturday, March 11, 2017 - link
I don't understand your stubbornness. What ddriver is alluding to is questionable practices. In a free market system, fierce competition and all that, it becomes the norm. But that doesn't make it right. Free markets, you know, are like democracy. And you probably know what good old Winston had to say about that.
eddman - Saturday, March 11, 2017 - link
No, he's outright calling such partnerships illegal. He claims to have "first hand" info but reveals nothing.
Hardware companies working with software studios has been going on for decades. It wasn't illegal then and it isn't now.
DMCalloway - Saturday, March 11, 2017 - link
Agreed. Had Intel been fined the true value of what it gained from global dominance over the following 10+ years, things would be vastly different in Sunnyvale right now. So much so that I would even speculate that the green team's only hope of viability would have come in the form of an acquisition on the blue team's part.
close - Monday, March 13, 2017 - link
I was talking about optimizing Nvidia's libraries. When you're using an SDK to develop a game, you're relying a lot on that SDK. And if that SDK is exclusively optimized for one GPU/driver combination, you're not going to develop an alternate engine that's also optimized for a completely different GPU/driver. And there's a limit to how much you can optimize for AMD when you're building a game on Nvidia's SDK.
Yes, the developer could go ahead and ignore any SDK out there (AMD or Nvidia) just so they're not lazy, but that would only bring worse results equally spread across all types of GPUs, and longer development times (with the associated higher costs).
You have the documentation here:
https://docs.nvidia.com/gameworks/content/gamework...
AMD technically offers the same services, but why would developers go for them? They'd be optimizing their game for just 25% of the market. Only now is AMD starting to push, with the Bethesda partnership.
So to summarize:
-You cannot touch Nvidia's *libraries and code* to optimize them for AMD
-You are allowed to optimize your game for AMD without losing any kind of support from Nvidia but when you're basing it on Nvidia's SDK there's only so much you can do
-AMD doesn't really support developers much with this since optimizing a game based on Nvidia's SDK seems to be too much effort even for them, and AMD would rather have developers using the AMD libraries but...
-Developers don't really want to put in triple the effort to optimize for AMD also when they have only 20% market share compared to Nvidia's 80% (discrete GPUs)
-None of this is illegal, it's "just business" and the incentive for developers is already there: Nvidia has the better cards so people go for them, it's logical that developers will follow
eddman - Monday, March 13, 2017 - link
Again, most of those gameworks effects are CPU-only. It does NOT matter at all what GPU you have.
As for GPU-bound gameworks effects, they are limited to just a few in-game effects that can be DISABLED in the options menu.
The main code of the game is not gameworks related and the developer can optimize it for AMD. Is it clear now?
Sure, it sucks that GPU-bound gameworks effects cannot be optimized for AMD and I don't like it either, but they are limited to only a few cosmetic effects that do not have any effect on the main game.
eddman - Monday, March 13, 2017 - link
Not to mention that a lot of gameworks games do not use any GPU-bound effects at all. Only CPU.
eddman - Monday, March 13, 2017 - link
Just one example: http://www.geforce.com/whats-new/articles/war-thun...
Look for the word "CPU" in the article.
Meteor2 - Tuesday, March 14, 2017 - link
Get a room, you two!
MrSpadge - Thursday, March 9, 2017 - link
AMD demonstrated their "cache thing" (which seems to be tile-based rendering, as in Maxwell and Pascal) resulting in a 50% performance increase. So 20% IPC might be far too conservative. I wouldn't bet on a 50% clock speed increase, though. nVidia designed Pascal for high clocks; it's not just the process. AMD seems to intend the same, but can they pull it off similarly well? If so, I'm inclined to ask "why did it take you so long?"
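For what it's worth, if you assume the two gains multiply cleanly, the back-of-the-envelope math looks like this (both figures are the thread's speculation, not measurements):

# Hypothetical compounding of the speculated gains above.
clock_gain = 1.50  # assumed +50% clock speed
ipc_gain = 1.20    # assumed +20% IPC
print(f"combined: {clock_gain * ipc_gain:.2f}x")  # 1.80x, i.e. +80%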
FalcomPSX - Thursday, March 9, 2017 - link
I look forward to Vega and seeing how much performance it brings, and I really hope it does end up giving performance around the 1080 level at typically lower, more reasonable AMD pricing. But honestly, I expect it to come close to but not quite match a 1070 in DX11, surpass it in DX12, and at a much lower price.
Midwayman - Thursday, March 9, 2017 - link
Even if it's just two Polaris chips' worth of performance, you're past 1070 level. I think conservative is 1080-level performance at $400-450. Not that there won't be a cut-down part at the 1070 level, but I'd be really surprised if that is the full-die version.
Meteor2 - Tuesday, March 14, 2017 - link
I think that Volta is sometimes overlooked. Whatever Vega brings, I feel Volta is going to top it.
AMD is catching up with Intel and Nvidia, but outside of mainstream GPUs and HEDT CPUs, they've not done it yet.
Meteor2 - Tuesday, March 14, 2017 - link
Mind you, Volta is only coming to Tesla this year, and not to consumers until next year. So AMD should have a competitive full stack for a year. Good times!
SaolDan - Thursday, March 9, 2017 - link
I don't need it but I really want it. Currently gaming and doing VR on a 1070 and loving it.
Endda - Thursday, March 9, 2017 - link
Would have loved to see the Titan XP in those graphs.
Ryan Smith - Thursday, March 9, 2017 - link
Unfortunately NVIDIA never sampled us on that one, so I don't have one on hand to test against.
Samus - Thursday, March 9, 2017 - link
Can't wait to pick up one of these in a year for ~$400. Easily future-proof for the next generation of console games, and my 780 Ti is really showing its age - by about 154%...
Nfarce - Friday, March 10, 2017 - link
If you think you are going to see a 1080 Ti selling for $400 in 12 months, I've got a like-new Ferrari to sell you for $15k. It will be nearly summer before the AIB GPUs (ASUS, EVGA, Gigabyte, MSI, etc.) start becoming available in decent supply.
rtho782 - Thursday, March 9, 2017 - link
I know I'm an outlier, but having lived with SLI 980s for a couple of years I'm looking to go back to a single card, and I really wish I could find SLI 980 to 1080 Ti benchmarks :P
Drumsticks - Thursday, March 9, 2017 - link
I think the original AT review of the 1080 compares them. Go there and just tack on another 30%, maybe?
mapesdhs - Saturday, March 11, 2017 - link
If it's of any help, I've done a fair few 980 SLI tests for 3DMark, Unigine, etc., which you could compare to 1080 Ti reviews (several sites have included 3DMark results, and Techpowerup has a couple of Unigine results threads/tables); PM me for links, or email (Google "Ian SGI", find the Contact page, use my Yahoo address).
Achaios - Thursday, March 9, 2017 - link
It should be borne in mind that benchmark results are misleading, especially when referring to the 780 Ti.
The 780 Ti launched at ridiculously low clocks of 875MHz core and 928MHz boost, which wasn't much different from what an OC'd GTX 580 could do.
When overclocked, the 780 Ti ran at around 1.25GHz, a huge difference over the stock card.
So, while the 1080 Ti has more than double the performance of an OC'd 780 Ti as per 3DMark Fire Strike, the performance shown herein for the 780 Ti is ridiculously low and thus misleading.
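For scale, the overclock described above works out to roughly the following (clock figures taken from this comment, not re-measured):

# Core clock uplift implied by the numbers above.
stock_mhz, oc_mhz = 875, 1250
print(f"{oc_mhz / stock_mhz:.2f}x")  # ~1.43x, a 43% core clock increase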
Yojimbo - Thursday, March 9, 2017 - link
I guess if you're worried about someone making judgments about general GPU trends without fully researching them, that's true. But owners of 780 Ti cards should be aware of where their individual cards sit compared to the reference design. In any case it's always going to be hairy taking overclocking performance into account when making judgments about generational trends. For instance, 2017 1060s and 1080s are going to overclock better than 2016 1060s and 1080s. There's also a whole range of clock speeds on offer. Do you take the fastest or the average? If you take the average, do you take the average of the SKUs or an average weighted by units sold? It's not so easy to get an accurate picture without a lot of work.
mapesdhs - Saturday, March 11, 2017 - link
Some 780 Ti owners may have OC'd their cards that high, but not many, I suspect. I've been searching for 780 Ti cards for a while for CUDA; most tend to be around 980MHz at best.
mapesdhs - Saturday, March 11, 2017 - link
I also recall one model which was 1002MHz.
Chaser - Thursday, March 9, 2017 - link
Glad to see I have no need for a card this powerful and expensive. I run a single 27" 2K IPS monitor, and my Gigabyte Extreme Gaming 1080 is more than enough to keep me on the high end for years to come. 4K is 90% bragging rights in terms of visible difference, and also game developer support.
sharath.naik - Thursday, March 9, 2017 - link
I think you may need to consider multi-monitor gaming. Then this card makes sense - as it does if you want to use it in a small case that allows only one GPU.
mapesdhs - Saturday, March 11, 2017 - link
Depends on the game and what kind of detail one likes. Site reviews and forum commentaries also don't take into account game mods, which often significantly increase the GPU load (check out the OCN Best Skyrim Pics thread and have a look at the builds people are using).
I like to play games with all details/settings maxed out. Thus, such a card is very relevant. Sure, plenty of players don't mind if the fps drops to 30 or 40, but some like it smooth at a minimum. I've currently no interest in high-refresh monitors (which ironically can sensitise one's vision anyway), but I do want a sync'd 60Hz minimum, something I can't get at the moment with a single 980 at 1920x1200. I plan on moving up to 4K soon; with a 1080 Ti, and likely being able to get away with turning off some of the AA options because of the higher pixel density (which regains performance), a 60Hz minimum looks very possible.
It depends on one's needs; everyone has different thresholds of what they're happy with.
Chaser - Thursday, March 9, 2017 - link
Very well written, balanced review. Nicely done, Ryan.
BrokenCrayons - Thursday, March 9, 2017 - link
+1!
nismotigerwvu - Thursday, March 9, 2017 - link
Quality work as always, Ryan! I spotted a minor typo on the conclusions page: "Because the GeForce GTX 1080 Tii Founder's Edition isn't NVIDIA's first GP102-based product". Did a little midnight coffee spill and make that i key sticky? :)
BrokenCrayons - Thursday, March 9, 2017 - link
Typo/error also on page 4 of the review, in the line that reads:
"For our review of the GTX 1060, we're using NVIDIA's 378.78 driver."
Probably should be "GTX 1080 Ti"
The table directly below that line is also missing the GTX 1080 Ti in the video cards section.
Ryan Smith - Thursday, March 9, 2017 - link
Thanks!
funkforce - Thursday, March 9, 2017 - link
I've given Ryan a LOT of heat for the last year's complete lack of, or very late, reviews.
But I'm also one to give credit where credit is due.
Amazing review, Ryan. You should be really proud of this rabbit you pulled, and the fact that you didn't hurt yourself on some motherboard or screwdriver and got it done on launch day is really remarkable. Quite surprising, as I'd thought Anandtech would start to do fewer PC hardware reviews and focus more on mobile.
Really amazing work, and finally back to the highest standard of quality and timely reviews that Anand was known for. This great work is what I think he saw in you, and I hope you can keep it up and keep AT at this level as "the bench" against which all other reviews are measured!
Thank you!
CrazyElf - Thursday, March 9, 2017 - link
Cool review! Thanks for the launch-day coverage.
I think everyone knew exactly how this was going to perform, as it is pretty much a Titan less 1 GB of GDDR5X (although a bit faster due to newer bins) and with 88 rather than 96 ROPs. Otherwise largely identical.
Let's hope AMD has a good response in Vega.
MajGenRelativity - Thursday, March 9, 2017 - link
The timeliness of this review has been great, but I was wondering about reviews for any of the Polaris family, especially the 480 and 460. I know there was a preview of the 480, but are there any plans to do a full review of any of the parts?
Meteor2 - Friday, March 10, 2017 - link
The 'preview' of the RX 480 wasn't really any less detailed than this review; it's just missing compute and synthetic benchmarks. Plenty of detail on the background of the card and the architecture.
ElBerryKM13 - Thursday, March 9, 2017 - link
C'mon, Anandtech - no Pascal Titan X benchmarks to see how it compares to this 1080 Ti? Are you serious?
Ryan Smith - Thursday, March 9, 2017 - link
http://www.anandtech.com/comments/11180/the-nvidia...
jiffylube1024 - Thursday, March 9, 2017 - link
Whoah, what a monster card!
HomeworldFound - Thursday, March 9, 2017 - link
It would've been nice to test more modern games than that - at least introduce Resident Evil 7, etc. Of course a new high-end card is going to play old games better...
Holliday75 - Thursday, March 9, 2017 - link
And a newer card will play newer games better as well.
Ryan Smith - Thursday, March 9, 2017 - link
We'll be refreshing the benchmark suite for Vega; that way we go into a new architecture with equally new games.
ryvoth - Thursday, March 9, 2017 - link
Solid card release from NVIDIA; too bad our costs in Canada suck due to the dollar.
mapesdhs - Saturday, March 11, 2017 - link
Ditto the UK. Sites keep mentioning an RRP of $700, but in the UK it's the equivalent of more like $900+.
Meteor2 - Tuesday, March 14, 2017 - link
Don't forget the US doesn't have VAT. Or an NHS.
Mr Perfect - Thursday, March 9, 2017 - link
Wait, so Pascal is going to have a two-year lifecycle as-is? There won't be a refinement cycle like Fermi to Kepler? That's a little disappointing.
Ryan Smith - Thursday, March 9, 2017 - link
Fermi technically had a 2-year lifecycle. The only reason it was special is because the original silicon had issues, and NV opted to fix it.
There are still yearly product refreshes, but GPUs are too expensive to develop (and the underlying manufacturing process too slow to advance) to do top-to-bottom new GPUs every single year.
supcaj - Thursday, March 9, 2017 - link
Disappointed to STILL not see ANY VR-related benchmarks in the suite. VR has unique rendering requirements, and a large share of gamers have moved or are planning to move to VR during the lifetime of this card.
r3loaded - Thursday, March 9, 2017 - link
Hey, 'member when top-tier GPUs cost $499? I 'member!
eddman - Thursday, March 9, 2017 - link
http://i.imgur.com/pGKlskP.png
Mr Perfect - Thursday, March 9, 2017 - link
I'd love to see how that compares farther back, adjusted for inflation. My rose-tinted goggles have me remembering top-tier GPUs launching for $400 or less back in the early 2000s, but who knows how that compares to 2017 dollars.
eddman - Friday, March 10, 2017 - link
http://i.imgur.com/OU6612c.png
There are a lot of inflation calculators online. Like this one: https://www.bls.gov/data/inflation_calculator.htm
A geforce 2 ultra was $705 in 2017 dollars!
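The adjustment itself is just a CPI ratio; a minimal sketch (the CPI averages and the ~$499 launch price below are approximations, not exact figures):

# price_then * (CPI_now / CPI_then); CPI values are approximate annual averages.
cpi = {2000: 172.2, 2017: 245.1}
launch_price = 499  # GeForce 2 Ultra launch price, approximate
print(round(launch_price * cpi[2017] / cpi[2000]))  # ~710, close to the $705 above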
eddman - Friday, March 10, 2017 - link
Adjusted for inflation: http://i.imgur.com/ZZnTS5V.png
Meteor2 - Friday, March 10, 2017 - link
Great charts!
mapesdhs - Saturday, March 11, 2017 - link
Except they exclude the Titans, Fury, etc.
eddman - Saturday, March 11, 2017 - link
I, personally, made these charts.
No Titans, because they are very niche cards for those gamers who cannot wait and/or have more money than sense. The following Ti variants perform almost as well as the Titan cards anyway.
No radeons because this is an nvidia-only chart. I should've titled it as such. I focused on nvidia because ATI/AMD usually don't price their cards so high.
mapesdhs - Saturday, March 11, 2017 - link
OK on the Radeon angle; shoulda realised it was NV-only. :D
However, your description of those who buy Titans cannot be anything other than your own opinion, and what you don't realise is that for many store owners it's these very top-tier cards which bring in the majority of their important profit margins. They make far less on the mainstream cards. The enthusiast market is extremely important, whether or not one individually regards the products as relevant for one's own needs. You need to be objective here.
mapesdhs - Saturday, March 11, 2017 - link
Sorry for the typos... am on a train, wobbly kybd. :D Is this site ever gonna get modern and allow editing??
eddman - Saturday, March 11, 2017 - link
I think I am being objective. Titan cards do not fit into the regular GeForce range. They are like an early pass: wait a few months and you can have a Ti that performs the same at a much lower price.
If nvidia never released a similar-performing Ti card, I would've included them.
eddman - Saturday, March 11, 2017 - link
Also, I don't see how stores and their profits have anything to do with that.
Mr Perfect - Tuesday, March 14, 2017 - link
Nice work on those charts.
So there were a couple of years where top-tier cards were $400 or less. The inflation numbers normalize that price a fair bit, though.
Ryan Smith - Thursday, March 9, 2017 - link
Those days are long gone, and not just because of profit-taking. 16/14nm FinFET GPUs are astonishingly expensive to design and fab. The masks alone cost millions, and now everything has to be double-patterned.
webdoctors - Thursday, March 9, 2017 - link
No, it's because the dollar is worthless and real inflation is off the charts. Once we make the dollar great again, we'll see prices come down.
eddman - Thursday, March 9, 2017 - link
The only reason we had $500 cards is the fierce competition from ATI back then.
Whenever ATI's cards couldn't compete, or could but hadn't launched yet, nvidia jacked the prices up. I don't know why people forget the $600 8800 GTX or the $650 GTX 280. Take a look at the link I posted above.
mapesdhs - Saturday, March 11, 2017 - link
What I miss is being able to buy a couple of well-priced mid-range cards that beat the high end, with good scaling. I couldn't afford the 580 when it was new, but 2x 460 SLI was faster and served nicely for a good while. With support and optimisations moving away from SLI/CF though (lesser gains, more stuttering, costly connectors, unlock codes, etc.), a single good GPU is more attractive, but the cost is way up the scale compared to 5 years ago.
Have a look at the Anand review of the 280 though; it shows what I mean: 2x 8800 GT SLI was faster than the 280, but $200 cheaper and with excellent scaling. I had 2x 8800 GT 1GB before switching to the two 460s. Today, mid-range cards don't even support SLI (GTX 1060).
Ian.
Meteor2 - Tuesday, March 14, 2017 - link
What about two RX 480s? Can they top a 1080?
aryonoco - Thursday, March 9, 2017 - link
The 780 Ti was released in November 2013.
The 1080 Ti is being released now, in March 2017.
So 3.5 years later, Nvidia's flagship consumer card has improved 260% while using pretty much the same amount of power and generating pretty much the same level of noise.
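As a rough annualization (reading "improved 260%" as a 2.6x overall gain, which is an assumption):

# Compound annual improvement for 2.6x over 3.5 years.
overall, years = 2.6, 3.5
print(f"{overall ** (1 / years):.2f}x per year")  # ~1.31x, roughly +31% a year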
If this continues, I can see that in the next few years all sorts of software will find ways to utilise the GPU more, not just games and neural networks.
Nvidia has a lot to be proud of. Their execution in the past few years has been Apple-esque.
Meteor2 - Friday, March 10, 2017 - link
Reflected in their share price!
virtuastro - Thursday, March 9, 2017 - link
@Ryan Smith
Can you benchmark the Intel i7-7700K, Intel-E processors, and Ryzen 7 1800X with a GTX 1080 Ti? :)
Ryan Smith - Thursday, March 9, 2017 - link
Unfortunately no. We don't have a central office; I'm in the US with the GPUs, and Ian is in the UK with the CPUs. He's working on game testing for Ryzen Part 2, but we likely won't be able to include the GTX 1080 Ti.
virtuastro - Thursday, March 9, 2017 - link
Aw dang. Thanks for the answer anyway. :D
Meteor2 - Friday, March 10, 2017 - link
FedEx?
mapesdhs - Saturday, March 11, 2017 - link
It would be delayed for a month by their Customs lunacy. :D
Meteor2 - Tuesday, March 14, 2017 - link
I'd love to see an Anandtech investigation of which pairings of CPU and GPU really do give the best FPS/price ratio. Could a 1080 Ti bottleneck an i5, for example? Would a 7600 be OK but a 2500 choke?
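Even a simple fps-per-dollar table per pairing would be telling; a toy sketch, where every number is a made-up placeholder rather than a benchmark result:

# Toy value metric: average fps divided by combined CPU+GPU price.
pairings = {
    "i5 + GTX 1080 Ti": (110, 950),  # (fps, price in $) - hypothetical
    "i7 + GTX 1080": (100, 850),     # hypothetical
}
for name, (fps, price) in pairings.items():
    print(f"{name}: {fps / price:.3f} fps/$")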
Gc - Thursday, March 9, 2017 - link
Do Re Mi Fa So La Ti...
"Ti" is a clever marketing name for a penultimate card.
11GB further emphasizes that it is leading buyers toward the ultimate.
Ranger1065 - Friday, March 10, 2017 - link
Great review; well done, Anandtech.
TheJian - Friday, March 10, 2017 - link
Should be interesting to see how AMD does with 8GB for Vega.
It looks like the best Vega can hope for is a tie, and they themselves (Raja) said they only had enough software engineers to work on Vulkan drivers. So that means DX11/OpenGL and possibly DX12 won't be very good, or could take ages to catch up if AMD doesn't make some money to hire more people. I really hope the Vega launch doesn't end up like Ryzen's (motherboards all over the place having issues, SMT issues in games, etc.). I think I'll wait out Vega and maybe even a 1080 rev. 2 with GDDR5X and faster clocks before jumping. Also have to wait for AMD to fix Ryzen, if they can - at least the motherboard part, as sites like PCPer think the game part will stay the same forever. AMD talking about a Ryzen rev. 2 already (that fixes things) makes me think PCPer etc. are correct.
Still, an exciting time for hardware and a great time to buy a PC this year. Even the low end is probably getting a major boost. The 1060 is low-end to me - no point in spending under $200 if you want to really game, IMHO - but faster speeds and GDDR5X could be interesting. That combo could make a really great HTPC. I wonder how much faster the 1060 will get. Pity AMD seems to have no access to GDDR5X, as nvidia is using it all.
Just read HardOCP's review: 30-35% faster than the 1080. Much the same as here. AMD has a rough road ahead (so do their shareholders). That said, competition is good :)
Ken_g6 - Friday, March 10, 2017 - link
Hm, 11GB of RAM, 11Gbps, 11.3 TFLOPS. Why do I get the impression Marketing wanted to use the phrase "goes to eleven" for this card? Did they announce it with Spinal Tap music?
Ryan Smith - Saturday, March 11, 2017 - link
They did not. Though clearly they should have.
mapesdhs - Saturday, March 11, 2017 - link
Even SGI did this. IIRC:
"audiopanel -spinaltap"
Changes the scale to 11. :D
oranos - Friday, March 10, 2017 - link
Pretty meh if you own a GTX 1080 already: higher TDP, lower base clocks.
justaviking - Friday, March 10, 2017 - link
30% faster is "meh"?
deathtollwrx - Wednesday, April 19, 2017 - link
I upgraded from an Asus GTX 1080 8G OC and it made a huge difference for me.
OW was 110 fps; now it's running at about 155.
OldManMcNasty - Friday, March 10, 2017 - link
OK, is it just me, or why are all the games old AF? AND why is the 800-pound gorilla named BF1 missing from just about every review?
stardude82 - Friday, March 10, 2017 - link
A quick search pulls up many BF1 benchmarks for the 1080 Ti...
bill44 - Friday, March 10, 2017 - link
Thanks Ryan, nice review.
Still missing info/specs on audio (incl. supported sampling rates), which, I suspect, you still do not have at hand.
I have asked about this in the past.
hughw - Saturday, March 11, 2017 - link
Does the 1080 Ti have dual DMA channels as the Titan X does?MarkieGcolor - Saturday, March 11, 2017 - link
Nvidia is so gay. I don't understand why people complain that intel's chips slowly optimize over the years. You seriously want your hardware to depreciate 50% every year?Meteor2 - Tuesday, March 14, 2017 - link
That's one way to look at it! But lose the homophobia.
prateekprakash - Sunday, March 12, 2017 - link
Hi,
After going through the well-written review, I think: wouldn't it be nice if AIB partners (at least one of them) released a blower-type card with 3 slots, so that it could include a heftier heatsink yet still exhaust heat out? That way I could consider putting it in a congested chassis and not worry about thermal throttling. PS: here in India, my Zotac GTX 1060 Mini reaches 78°C even in open air!
PocketNuke - Monday, March 13, 2017 - link
GP106-GP102 have the same 4x int8 performance, according to this article:
https://devblogs.nvidia.com/parallelforall/mixed-p...
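For anyone wondering what the 4x int8 rate refers to: it's presumably the dp4a operation discussed in that article - a four-wide int8 dot product accumulated into an int32 in one instruction. A minimal Python model of the arithmetic (not the actual GPU intrinsic):

# Models what dp4a computes: dot product of four int8 pairs plus an int32 accumulator.
def dp4a(a4, b4, acc):
    return acc + sum(x * y for x, y in zip(a4, b4))

print(dp4a([1, -2, 3, 4], [5, 6, -7, 8], 10))  # 10 + (5 - 12 - 21 + 32) = 14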
panicp - Sunday, October 15, 2017 - link
I have a GTX 1080 Ti, and to date I've been using FSX, which apparently leans more on the CPU than the GPU. I've just loaded P3D - and it really does look super smooth. The temp maxed out around 85 degrees, and my monitoring software was having kittens showing me temps in the RED zone.
Can this GPU continue to run at this temp indefinitely? For unlimited hours?
Is it going to damage the card in the long run?
I'd appreciate your kind advice.