I agree with the conclusion that these cards aren't a good buy for 1080ti owners. My 1080ti overclocks very nicely and I'll be happy to stick with it until the next generation in 7 nm. By then we might have a decent selection of games that make use of ray tracing and the performance increase will be more appealing.
Yeah, I agree, especially since it's only a 20-25 fps increase on average. Many might think that's great, but considering the price increase over the 1080 Ti, and the fact that many 1080 Tis can overclock to close that gap even further, the features don't justify the cost.
However, it could be that a lot of performance will be unlocked via driver updates; we really don't know how much the tensor cores could increase performance until games get updated to use them. Also, while it's a super expensive option, how much does the new SLI improve performance? Let's see a comparison of 1080 Ti SLI against the newer 2080 Ti SLI; maybe it's easier to put into games? So many what-ifs with this product.
I feel this product should have been delayed until more games/software with these feature sets were already available to see.
You'll be glad to hear then that we'll be backfilling cards.
There was a very limited amount of time ahead of this review, and we thought it more important to focus on things like clocking the cards at their actual reference clocks (rather than NVIDIA's factory overclocks).
Many thanks for that; I think it's useful work, since people are still using Maxwell (or even older) generation GPUs in 2018. When can we expect Maxwell (980/980 Ti) results to appear in the GPU 2018 bench? Could you also please add the GeForce GTX Titan X (Maxwell) to GPU 2018?
I'm not sure how midrange 2070/2060 cards will sell if they're not a significantly better value in performance/price than 1070/1060 cards. Even if AMD offers no competition, Nvidia still has to compete with itself.
It's interesting that every comment I've seen says a similar thing and that nobody thinks of uses outside of gaming. I would think that for real ray tracers, and for Adobe's graphics and video software for instance, the tensor and RT cores would be very interesting. I wonder, though, whether open source software will be able to successfully use that new hardware, or whether Nvidia is too closed for it to get the advantages you might expect. Apart from ray tracers and such, there is also the software science students use. And with the current interest in AI among students and developers, it might also be an interesting offering, although that again relies on Nvidia playing ball a bit.
"where paying 43% or 50% more gets you 27-28% more performance" 1080 Ti can be bought in the $600 range, wheres the 2080 Ti is $1200 .. so I'd say thats more than 43-50% price increase..at a minimum we're talking a 71% increase, at worst 100% (Launch MSRP for 1080 Ti was $699)
NVIDIA didn't just increase the price for shits and giggles; the Turing GPUs are much more expensive to fab, since you're talking about almost 20 BILLION transistors squeezed into a few hundred mm².
Regardless: comparing the 2080 with the 1080 and claiming there is a 70% price increase is bogus logic in the first place, since the 2080 brings a number of things to the table that the 1080 isn't even capable of.
Find me a 1080 Ti that has DLSS and is also capable of ray tracing, and then we can compare prices and figure out whether there's a price increase or not.
No, actually YOUR logic is bogus. Find me a DLSS or ray tracing game to bench. You can't. There is a reason for that. Ray tracing will incur a massive FPS hit; Nvidia knows this and is delaying your seeing it as damage control.
There are no ray tracing games because the technology is new, not because NVIDIA is "delaying them". As for DLSS, I think those games will appear sooner than ray-traced ones.
Darksiders III from Gunfire Games / THQ Nordic
Deliver Us The Moon: Fortuna from KeokeN Interactive
Fear The Wolves from Vostok Games / Focus Home Interactive
Hellblade: Senua's Sacrifice from Ninja Theory
KINETIK from Hero Machine Studios
Outpost Zero from Symmetric Games / tinyBuild Games
Overkill's The Walking Dead from Overkill Software / Starbreeze Studios
SCUM from Gamepires / Devolver Digital
Stormdivers from Housemarque
Ark: Survival Evolved from Studio Wildcard
Atomic Heart from Mundfish
Dauntless from Phoenix Labs
Final Fantasy XV: Windows Edition from Square Enix
Fractured Lands from Unbroken Studios
Hitman 2 from IO Interactive / Warner Bros.
Islands of Nyne from Define Human Studios
Justice from NetEase
JX3 from Kingsoft
Mechwarrior 5: Mercenaries from Piranha Games
PlayerUnknown’s Battlegrounds from PUBG Corp.
Remnant: From The Ashes from Arc Games
Serious Sam 4: Planet Badass from Croteam / Devolver Digital
Shadow of the Tomb Raider from Square Enix / Eidos-Montréal / Crystal Dynamics / Nixxes
The Forge Arena from Freezing Raccoon Studios
We Happy Few from Compulsion Games / Gearbox
Funny how the same people who praised AMD for being the first to bring full DX12 support, even though only 15 games used it in the first two years, are the same people sh*tting on nVidia for bringing a far more revolutionary technology that's going to be in far more games in a shorter time span.
Considering AMD was the first to bring support for an API that all GPUs could support, DLSS is not a valid comparison. DLSS is an Nvidia-only feature, and Nvidia couldn't manage to have even ONE game with DLSS on launch day.
AMD spawned Mantle, which then turned into Vulkan. They also pushed MS to develop DX12, as it was in both their interests. These APIs can be used by all.
DLSS, while potentially very cool, is, as Jordan said, proprietary. Like HairWorks and other such crap it will get light support, but when it comes to feature sets devs will spend most of their effort building to common ground. With consoles being AMD GPU based, guess where that will be.
It will be interesting to see how AMD ultimately responds, i.e. G-Sync/FreeSync, CUDA/OpenCL, etc.
As Nvidia has stated, these features are designed to work with how current game engines already function so the devs don't have to reinvent the wheel. Ultimately this means the integration won't be very deep, at least not for a while.
For consumers the end goal is always better graphics at the same price point when new releases happen.
Not that these are bad cards, just expensive, and two very key features are unavailable, and that sucks. Hopefully the situation will change sooner rather than later.
DLSS basically gives you a resolution jump for free (e.g. 4K at 1440p performance) and is really easy to implement. That's going to take off fast and probably means even the 2070 will be faster than the 1080 Ti in games that support it.
Die size is irrelevant to consumers. They see price vs. performance, not how big the silicon is.
AMD was roasted for having hot, slow chips many times, and so was Nvidia. Big and hot means nothing if the card doesn't perform as expected for the insane prices they're asking.
Die size is not irrelevant to consumers because increased die size means increased cost to manufacture. Increased cost to manufacture means a pressure for higher prices. The question is what you get in return for those higher prices.
People like what they know... what they are used to. If some new AA technique comes along and increases performance significantly but introduces visual artifacts it will be rejected as a step backwards. But if a new technology comes along that has a significant performance cost yet increases visual quality much more significantly than the aforementioned artifacts decrease it, people will also have a tendency to reject it. That is, until they become familiar with the technology... That's where we are with RTX. No one can become familiar with the technology when there are no games that make use of it. So trying to judge the value of the larger die sizes is an abstract thing. In a few months the situation will be different.
Personally, I think the architecture will be remembered as one of the biggest and most important in the entire history of gaming. There is so much new technology in it, some of which barely anyone is saying much about (where have you heard about texture space shading, for example?). Several of these technologies will have their greatest benefits with VR, and if VR had taken off people would be marveling about this architecture immediately. But I think that VR will eventually take off, and I think several of these technologies will become the standard way of doing things for the next several years. They are new and complicated for developers, though. Only a few developers are prepared to take advantage of the stuff today. It's going to be some time before we can really put the architecture into its proper historical perspective.
From the point of view of a purchase today, though, it's a bit of an unknown. If you buy a card now and plan to keep it for 4 years, I think you'd be better off getting a 20 series than a 10 series. If you buy it and keep it for 2 years, then it's a bit less clear, but we'll have a better idea of the answer to that question in 6 months, I think.
I do think, though, that if an architecture with this much new stuff were introduced 20 years ago everybody, including games enthusiast sites like Anandtech, would be going gaga over it. The industry was moving faster then and people were more optimistic. Also the sites didn't try to be so demure. Hmm, gaga as the opposite of demure. Maybe that's why she's called Lady Gaga.
I agree that this might be the most game-changing graphics tech of the last couple of decades, and that the future belongs to ray tracing, but I also think that precisely due to the general uncertainty and the very high prices Nvidia might suffer one of their biggest sales slumps this generation, if not *the* biggest. They did not handle the launch well: it is absurd to release a new series with zero ray-traced, DLSS-supporting or mesh-shaded games at launch.
Their extensive NDAs, lack of information, and ban on benchmarks between the Gamescom pre-launch and the actual launch, despite going live with (blind-faith-based) preorders during that window, were also controversial and highly suspicious. It appears that Nvidia gave graphics cards to game developers very late to avoid leaks, but that resulted in having no RTX-supporting games at launch. Apparently they thought they could not have it both ways, but choosing that over having RTX-supporting games at launch was a very risky gamble.
Since their sales will almost certainly slump, they should release 7nm-based graphics cards (either a true 30xx series or Turing at 7nm, I guess the latter) much sooner, probably in around 6 months. They might have expected a sales slump, which is why they released the 2080 Ti now. I suppose they will try to avoid it with aggressive marketing and somewhat lower prices later, but it is not certain they'll succeed.
Would you've still defended this if it was priced at $1500? How about $2000? Do you always ignore price when new tech is involved?
The cards themselves aren't bad. They are actually very good. It's their pricing that's the problem.
These cards, specifically the 2080 Ti, are overpriced compared to their direct predecessors. Ray tracing, DLSS, etc. still do not justify such prices for such FPS gains in regular rasterized games.
A 2080 Ti might be an OK purchase at $850-900, but certainly not $1,200+. Even the 8800 GTX, with its new CUDA cores and new generation of lighting tech, launched at the same MSRP as the 7800 GTX.
These cards are surely more expensive to make, but there is no doubt that the biggest factor for these price jumps is that nvidia is basically competing with themselves. Why price them lower when they can easily sell truckloads of pascal cards at their usual prices until the inventory is gone.
You, like so many others, don't get it. nVidia has reworked their product lines. Didn't you notice how the Ti came out at the same time as the 2080? You might also notice that the Titan is now called Titan V (Volta) and not GTX Titan. Titan is now in its own family of non-gaming cards, and that is reflected in the driver section on their site. They now have Titan-specific drivers.
You took an opinion and decided it's a fact. It's not. That guy is not the authority on graphics cards.
There is no official word that the Titan is now the 2080 Ti. Nvidia named that card 2080 Ti, and it has a 102-class chip. Nvidia themselves constantly compare it to the 1080 Ti, which also has a 102-class chip; therefore it's the successor to the 1080 Ti, and it's very normal to expect similar pricing.
Don't worry, there will be a Turing Titan, considering that the 2080 Ti does not even use the fully enabled chip.
It's really baffling to see people, paying customers, defending a $1200 price tag. It is as if you like to be charged more.
Even though the review is older and this comment is a few months old I just wanted to jump in and say "hah, look, eddman was right!" Now that the Titan RTX leaks are showing up. Lol. They didn't even wait for supply to stabilize on the 2080ti before dropping the titan.
Plus, if the 2080 replaced the 1080 Ti, then why is it more expensive and no faster? That would be a first even for Nvidia.
The model numbers aren't that significant. NVIDIA could just as easily have released a 2080, a 2070, and a 2060 by putting different labels on the boxes of the 2080 Ti, the 2080, and the 2070, for instance. The Ti, the Titan, all of those are long-standing marketing identities that buyers now automatically associate with a certain relative scale of performance among other GPUs of the same generation. NVIDIA can play upon buyer expectations by releasing various products to fill those expectations in the way that best advances the company's interest. Any company with enough brand recognition can easily do the same. Consider Intel's long-running i-series CPU numbering. The fact that something labeled as a Ti came out at a certain time isn't an example of technological development, but a way of meeting customer expectations in reflection of the MSRP. We would have balked much more at $1,200 for the exact same product if it were labeled as a plain vanilla 2080 and the current vanilla 2080 were branded as a 2070. Instead, we say, "Well, the 2080 Ti is really expensive, but at least it's a Ti, so that makes it a little bit more reasonable."
Model numbers are significant in the way that they point out the models in the same successive line up. That's the entire point of them.
I and a lot of people are not in this "we" you talk about. Again, nvidia themselves compare it to 1080 Ti every chance they get, so I do not see why I should in any way think its price is "reasonable".
That's not how past generational leaps worked, even for 8800 GTX. We got massive performance gains AND usually new rendering features at similar MSRPs or maybe a bit higher. The difference this time is that AMD has left the building, for now.
Don't misunderstand me. I'm not implying that the price is okay or that anyone should find it reasonable to stomach a $1,200 MSRP for a mere graphics card. I also agree that part of the pricing problem is due to an absence of credible competition from AMD. I'm just arguing that the people in the NVIDIA marketing department may justify the price in part by slapping a Ti label on the box so consumers are less likely to balk during checkout. The reality is that we're getting a step sideways in performance for a noteworthy increase in TDP, due to the addition of capabilities that may or may not actually add much value, because said features are too demanding to play nicely at high resolutions and because there are no indications that the software side will move to take advantage of them. At best, the addition of the hardware won't be very compelling until the next generation of GPUs after Turing, when it's likely that performance will pick up a bit.
Then again, who am I to talk? I play PC games on a laptop with an HD 4000 infrequently and end up mostly gaming on my ancient dual-core KitKat-era phone that I've been keeping as a cheap wireless mini tablet. To me, PC gaming became an overly pricey sink of my limited, single-parent free time. I'd rather bank my spare money in something that yields interest over time than throw it into gaming hardware that's obsolete in a matter of a few years. That, and my kids need me to be both of their parents these days, since that worthless ex of mine schlepped off to marry some woman in Canada. *grumble*
More like they are pricing their high-end cards as if they were flagship cards. The 2080 Founders Edition seems identical in price to a 1080 Ti. That is unacceptable, especially when they are almost identical in performance (the 2080 being slower in most games by a few small points).
They (Nvidia) just want to clear the huge build-up of Pascal cards by charging insane prices to those who are willing to claim to be "gamers" with money. Period.
"You, like so many others don't get it. nVidia has re-worked their product lines. Didn't you notice how the Ti came out at the same time as the 2080?" What the hell does this has to do? Nothing for the consumer again.
"Die size is not irrelevant to consumers because increased die size means increased cost to manufacture. Increased cost to manufacture means a pressure for higher prices. The question is what you get in return for those higher prices. " You're repeating the same. Die size means NOTHING to a consumer. It means something for the manufacturer because it costs THEM. If the die doesnt benefit anything at all (Fermi) compared to smaller dies that offer almost the same performance (Pascal). Why would the consumer have to pay MORE for LESS?
New tech is nothing if there is nothing to show, and there is NOTHING to show right now. By the time ray tracing becomes really viable, the next generation of cards will be out.
This x1000. These cards are a necessary step towards getting the technology out there, but I'm thoroughly unconvinced that it is a good idea for anyone to buy them. The sacrifice in die area was too great, for far too little benefit. Given the strong indications that 1080p ~45fps is where real-time raytracing will be at right now, I just don't care. They sold me on high-resolution and high-framerate because those actually affect how much enjoyment I get from my games. I'm not interested in that rug being pulled from under my feet *and* paying damn near double price for the privilege.
How does that matter? Are you suggesting that magically makes the die size irrelevant? If you have a 300 mm wafer and you double the die size, you also roughly halve the number of dies per wafer, and that's ignoring yield. A larger die is more costly to produce because you get fewer dies per wafer and increase the probability of having a defect within a die.
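To put rough numbers on that, here's a minimal Python sketch of the usual dies-per-wafer approximation plus a simple Poisson yield model; the 0.1 defects/cm² density and the example die areas (roughly 1080 Ti-class and 2080 Ti-class) are assumptions for illustration only:

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Gross dies per wafer (common approximation; ignores scribe lines and edge exclusion)
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def good_dies_per_wafer(die_area_mm2, defect_density_per_cm2=0.1):
    # Simple Poisson yield model: yield = exp(-D0 * A), with A converted to cm^2
    yield_fraction = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)
    return dies_per_wafer(die_area_mm2) * yield_fraction

for area in (471, 754):  # ~1080 Ti-class die vs. ~2080 Ti-class die, in mm^2
    print(area, dies_per_wafer(area), round(good_dies_per_wafer(area)))
# With these assumptions: ~119 gross / ~74 good dies vs. ~69 gross / ~33 good dies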
The problem is that it does not bring those things to the current table but is going to bring them to a future table. Essentially they expect you to buy a graphics card whose advanced features no current game supports, merely on faith that it will both deliver them in the future *and* that they will be worth the very high premium.
If there is one ultimate unwritten rule when buying computers, computer parts, or anything really, it must be this one: never buy anything based on promises of *future* capabilities; always make your purchasing decisions based on what the products can deliver *now*. All experienced computer and console consumers, in particular, must have that maxim engraved on their brains after having been burnt by so many broken promises.
It's not that the price increase wasn't warranted, at least from the transistor count perspective, it's that there's not a lot to show for it.
Many more transistors... concentrated in tensor cores and RT cores, which aren't being touched in current games. The increased price is for a load of baggage that will take at least a year to really get used (and before you say it, 3 games is not "really getting used"). We're used to new GPUs performing better in current games for the same price, not performing the same in current games for the same price (and I'm absolutely discounting everything before 2008, because that was 10 years ago and the expectations of what a new μArch should bring have changed).
I get the whole "future of gaming" angle you're pushing, and it's a perfectly valid reason to buy these new GPUs, but don't act like an apples-to-apples comparison of performance *right now* is the "wrong way of looking at it". How the card performs right now is an important metric for a lot of people, and will influence their decision. Especially when we're talking a potential price difference of $100+ (with sales on 1080 Ti's, and FE 2080 prices). Obviously there isn't a valid comparison for the 2080 Ti, but anyone who can drop $1300 on a GPU probably doesn't care too much about the price tag.
Nvidia is charging what they are because they have no competition at the top end. That's it, nothing else. They're taking in the cash today in preparation for having to price more competitively later.
Flunk, we are talking Nvidia here; typically speaking, they don't lower prices to compete. Sometimes they price too high and too few bite, but that's about it. The last time they lowered prices to compete was the 400 series, but they'd just come off getting zonked by AMD for basically 2 generations, and when they went to the 500 series it was fairly competitive with AMD (initially they were better, but AMD continued to improve their 5000/6000 series until it was consistently beating Nvidia; did they lower prices? NO, not one bit).
TNT cards were competitive and cheap, but once Nvidia knocked off all other contenders (aside from AMD) and started in with their GeForce line, they have always carried premiums, competition or not.
The GTX 280 launched at $650 because they thought AMD couldn't do much. AMD came up with the 4870. What happened? Nvidia cut the card's price to $500 a mere month after launch. So yes, they do cut prices to compete.
13.6 and 18.6 billion transistors (estimated), die sizes of 454/754 mm² (2080/2080 Ti), 12 nm.
7.2 and 12 billion transistors (estimated), die sizes of 314/471 mm² (1070/1080 and 1080 Ti/Titan X), 16 nm.
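For what it's worth, dividing those estimated transistor counts by the quoted die areas gives rough densities; a trivial Python sketch using only the figures above (so the absolute values inherit whatever error those estimates carry):

chips = {
    "TU104 (2080)":            (13.6e9, 454),
    "TU102 (2080 Ti)":         (18.6e9, 754),
    "GP104 (1070/1080)":       (7.2e9, 314),
    "GP102 (1080 Ti/Titan X)": (12.0e9, 471),
}
for name, (transistors, area_mm2) in chips.items():
    # Millions of transistors per mm^2
    print(f"{name}: {transistors / area_mm2 / 1e6:.1f} MTr/mm^2")
# Roughly 30.0 and 24.7 MTr/mm^2 for Turing vs. 22.9 and 25.5 for Pascal, per these estimates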
yes it is "expensive" no doubt about that, but, it is Nv we are talking about, there is a reason they are way over valued as they are, they produce as cheaply as possible and rack them up in price as much as they can even when their actual cards shipped are no where near the $$$ figure they report as it should.
Also, if nothing else, they always have and always will BS the numbers to make themselves appear "supreme", whether it is actual power used, TDP, API features, transistor count, etc.
As far as the ray tracing crap goes, if they used an open source approach so that everyone could use the exact same ray tracing engine and be directly compared to see how good they are or not, then it might be "worthy". But it is Nvidia; they are and continue to be the "it has to be our way or you don't play" type. I remember way back when with PhysX (Nvidia bought out Ageia for it), when Radeons were able to use it (before Nvidia took the ability away), they ran circles around comparable Nvidia cards AND used less CPU and power to do it.
Nvidia does not want to get "caught" in their BS, so they find nefarious ways around everything, and when you have a massive amount of money floating everything you do, it is not hard to "buy silence". Intel has done so time and time again; Nvidia does so time and time again... blekk.
DLSS, or whatever the fk they want to call it, means jack when only specific cards will be able to use it, instead of it being a truly open initiative where everyone gets to show how good they are (or not) and everyone stands to benefit from others putting effort into making it as good as it can possibly be. There is a reason Nvidia barely supports Vulkan: they are not "in control" and it is way too easy to "prove them wrong". Funny, because Vulkan has ray tracing "built in".
IMO, if they are as good as they claim, they would do everything in the open to show they are "the best", not find ways to "hide" what they are doing. Their days are numbered; hell, their stock price just took a hit. Good, IMHO, because they should not be over $200 anyway, $100 maybe, but they absolutely should not be valued above others whose financials and product shipments are magnitudes larger.
Remind me why consumers should give a rat's ass about die size, other than its visible effects on price and performance.
If you want to sell me a substantially larger, more expensive chip that performs a little better for a lot more money, a better reason is needed than "maybe it will make some games that aren't out yet really cool in a way that we refuse to give you any performance indications about".
They look poor value; good performance, sure. But a 1080ti offers the same for much less. They want me to buy promises! Seriously, promises are never worth the paper they are printed on - digital or the real stuff.
Why would I want a feature like DLSS when current AA methods do the job fine and we can also just run at native, higher resolution anyway and not use any AA whatsoever?
And why would anyone care about vaporware like RTRT?
You must be joking, right? What do we care if the price of manufacturing increased for Nvidia? We are not supporters, we are clients. We don't have to support their pricing because WE ARE NOT PARTNERS! Let Nvidia reduce their costs by cutting the salaries of their CEOs and other worthless corporate officers. Then I will BUY their 2080 Ti product, at the consumer-friendly price of €750.
Except for the fact that the non-Founders Edition is $999, not $1,200. And the GTX 1080 Ti released at $699 but for the better part of the past two years cost substantially more.
So buy a 1080 Ti. For some, the new features are worthwhile and boost performance quite massively, making it truly worth it if those technologies are the new way forward. At the least, a good 25 games are already coming with NV's new tech, and many are huge titles most would surely like to play.
Also, as Hexus noted a while back, the price to make these things is just below MSRP. Note the small chip is as large as a Titan, and the larger chips... WOW. That's a lot of transistors for a game card. And it's apples vs. oranges here; as said by others, the 1080 etc. can't do ray tracing or DLSS.
https://nvidianews.nvidia.com/news/nvidia-rtx-plat... 27 games are coming with NV tech. Will they look the same on a 1080 Ti or less? NOPE. Will they be faster and BETTER looking on RTX? YEP. Value is in the eye of the beholder ;)
The problem is, you just counted twice when you said "will they be faster and better looking on RTX".
The absolute truth of it from what little we can glean so far (after the official launch!!!) is that you can have RTX effects /OR/ you can have your better performance, not both. That's a heavy caveat!
It would be one thing if it were a proposition of waiting a couple of months for some amazing features that will knock your socks off and have few drawbacks. It's another to be paying over the odds for a card now to maybe get some cool stuff that will DEFINITELY run slower and at a lower res than you're used to.
Wrong. It's already known that the tensor cores have enough juice to run ray-traced effects and DLSS at the same time: https://youtu.be/pgEI4tzh0dc?t=10m55s
While your argument is solid, these days are just so weird that even $1,200 cards seem to make a hell of a lot of sense. This is also valid for similar desktop CPUs. Why? Well, go buy a high-end iPhone or a high-end Android phone... Enough said.
PS. For those who may use arguments like geekbench and such , it is just insulting to put it very very kindly!
So basically buying high-priced electronics makes sense because the companies selling them just increase the prices every year to make profits, and a certain type of consumer supports those companies by purchasing no matter the price (call them fanbois). The question is: why, why are those people acting like that? What drives them?
Consumers paying these premium prices for features that are not even fully developed or finished is mind boggling!! People are being bent over and screwed by Nvidia hard yet they still pay $1500 to be beta testers until next Gen.
Mindboggling? I suppose it would be for a time traveller visiting from the 19th century, but for everyone else it’s perfectly normal.
There is always a price premium for those early adopters who want to live on the cutting edge of technology.
When DVD players came out, they cost over $1,000 and the selection of movies they could play was extremely small. When Blu-ray players came out, they also cost well over $1,000 and the entire catalogue of Blu-ray titles was a dozen movies or so.
And keep in mind, that the price that Nvidia charges for joining the early adopter club is really shockingly low.
When OLED or 4K televisions first came out, people paid tens of thousands of dollars for a set, and the selection of 4K entertainment to watch on them was pretty much zero.
With the 2080, early adopters can climb aboard for $600-1,000.
Games that take advantage of DLSS and RTX will be here soon and in the meantime they have the most powerful graphics card on the market that will play pretty much anything you can throw at it, in 4K without breaking a sweat.
Again, these are two technologies that have not even seen the light of day, let alone been proven worth it at all. Early adopters of the other techs you listed at least got WORKING TECH from the start, as promised.
Those games you listed don't have it now; they are COMING. lol. Even then, the difference isn't worth it, considering the games hardly take a hit on the 1080 Ti. You are the Nvidia shill here and on the forums, as everyone knows.
Yeah, there was media available at launch. Also Blu-Ray provided a noticeable jump in both quality AND resolution over DVD. RTX provides maybe the first and definitely not the second.
And it’s clear that you didn’t read the article, or skimmed it at best, if you’re claiming that “the two technologies have not even seen the real light of day”.
The tools are out there, developers are working with them, and not only are there many games on the way that support them, there are games out now that use RTX.
Let me quote from the review:
“not only was the feat achieved but implemented, and not with proofs-of-concept but with full-fledged AA and AAA games. Today is a milestone from a purely academic view of computer graphics.”
Development means nothing unless games are actually released, as plans get cancelled, budgets get cut, and technology is replaced or merged into a different standard.
You just proved yourself wrong with your own quote. lol. Guess what? The Python language is out there, let's all develop games with it! All the tools are available! It's so easy! /sarcasm
Just like those HD DVD adopters, LaserDisc adopters, and Betamax adopters. V900 is pointing out that early adopters accept a level of risk in adopting new technology to enjoy cutting-edge stuff. This is no different than Blu-ray or DVD when they came out. People who buy RTX cards have "WORKING TECH" and will have few options to use it, just like the second wave of Blu-ray players. The first Blu-ray player never actually had a movie released for it, and it cost $3,800.
"The first consumer device arrived in stores on April 10, 2003: the Sony BDZ-S77, a $3,800 (US) BD-RE recorder that was made available only in Japan.[20] But there was no standard for prerecorded video, and no movies were released for this player."
Even 3 years after that, when they actually had a standard that studios would produce movies for, the players that were out cost over $1,000 and there were a whopping 7 titles available. Similar to RTX cards being the fastest available for current technology, those Blu-ray players also played DVDs (gasp).
Again, the point is that Blu-ray WORKED out of the box, even if expensive. This doesn't even have any way to test the other stuff. You are literally buying something for an FPS boost over previous gens, and not a big one at that. It would be a different tune if lots of games already had the tech from Nvidia in hand, built into games but just not enabled; but that it's not even available to test is silly.
You are simply ignoring the facts. When Blu-ray players launched they didn't play Blu-rays, because there were none, because there was no standard. It took 3 years before there was a standard and 7 movies were released. Before then they were just high-end DVD players.
These RTX cards also work out of the box. It's crazy, I know: they actually play current games, all with the highest settings and the fastest frame rates. And similar to what happened with Blu-ray, they will also support those newfangled DXR and DLSS options in games as they come out.
The first Blu-ray movies were released on June 20, 2006, coinciding with the release of the first Blu-ray player from Samsung and a Sony VAIO Blu-ray PC. The first batch distributed by Sony on June 20th, 2006:
The Fifth Element, 50 First Dates, Hitch, House of Flying Daggers, The Terminator, Underworld: Evolution, xXx
Working out of the box? You mean like the RTX 2080/ti/2070 cards?
You’re willfully ignoring facts and pretend that it’s totally up in the air whether games will support RTX, and that they won’t be available for a long time...
Which is entirely false.
There are games that support RTX out right now. Like Tomb Raider
More are coming this year: one of the biggest titles of the year, Battlefield 5, supports it.
And dozens of titles supporting RTX, many of them big AAA titles, are coming out in H1 2019.
So no: Nobody is buying a card they “can’t even test.” If you buy an RTX card, it’ll work out of the box with RTX technology enabled.
And it’s of course also the fastest card on the market for all the old titles that don’t support RTX.
If you can't compare performance per dollar between this gen and the previous one, why even bother praising the new tech when currently none of the tech sites have any means to test it?
After reading your first comment, I couldn’t help but think that you’re a paid shill for Nvidia.
You just ignored the facts and even proved yourself wrong in your own statement. lol
Let me sell you a car with the fastest engine, but I'm not going to let you use all the horsepower; I promise I'll enable it when I get the right parts for it. Don't worry about the $100k price tag on the car, it's going to be awesome, I swear.
Do you get paid for every post by nvidia, or just a lump sum?
You are comparing apples to oranges. There isn't even a ray tracing game to compare to. And when the truth comes out that even with RTX there is a massive FPS hit, well, it's game over.
First you complain that there aren't any raytracing games to get benchmarks from, then you state there will be a massive performance hit. If there are no games available to test with, you can't have any idea what their performance will be like.
Unlike a video card, Blu-ray was a STANDARD, set to replace the DVD. This wasn't a Betamax vs. VHS war again; it was an evolution. And as you said, they had a few titles coming. Nvidia is currently offering ZERO options for what they insanely charge.
Even those 4K TVs you mentioned had demos and downloadable content. It was the future.
Nvidia's take on some of these RTX features is solely Nvidia's, not a global standard.
Not a great comparison, mainly because the number of games making use of RTX and the other new features is zero. OLED and 4K/DVD/Blu-ray: pretty much zero/extremely small/a dozen or so; none of the aforementioned was as low as zero, so the consumer could see what they were getting.
Early adopters have always paid over the odds for an immature experience. That's the decision they make. You pays your money and you takes your chances...
Blame both. Why the f would you blame AMD for NVIDIA's own fault? And yes, AMD had competitive offerings in the mid-range, not the high end. But that's before 7nm. Let's see what we get at 7nm. It will be released next year anyway; it's not that far off.
So you are saying that if AMD were competitive then NVIDIA could never have implemented such major innovations in games technology... So, competition is bad?
Competition can stifle innovation when the market is involved in race to see how efficiently they can leverage current technology. The consumer GPU market has been about the core count/core efficiency race for a very long time.
Because Nvidia has a commanding lead in that department, they are able to add in other technology without falling behind AMD. In fact, they’ve been given the opportunity to start an entirely new market with ray-tracing tech.
There are a great many more companies developing ray-tracing hardware than rasterization focused hardware at the current moment. With Nvidia throwing their hat in now, it could mean other companies start to bring hardware solutions to the fore that don’t have a Radeon badge. It won’t be Red v. Green anymore, and that’s very exciting.
Your Brave New World would involve someone else magically catching up with AMD and Nvidia's lead in conventional rasterization tech. Spoiler alert: nobody has in the past 2 decades and the best potential competition, Intel, isn't entering the fray until ~2020
No. I’m saying that companies that specialize in ray-tracing technology may have an opportunity to get into the consumer discrete GPU market. They don’t need to catch up with anything.
Not AMD's fault if Nvidia is asking $1,200 US. Stop blaming AMD because you want to purchase Nvidia cards at a better price; BLAME Nvidia!
It is not AMD who forces ray tracing on us. It is not AMD who provides GameWorks tools to sabotage the competition and gamers at the same time. It is not AMD charging us the G-Sync tax. It is not AMD that screws gamers for the sake of investors' wallets.
It is all Nvidia's fault! Stop defending them! There are no excuses.
I accept that nVidia's choices are their own and not the "fault" of any third party. On the other hand, nVidia is a business and their primary objective is to make money. Manufacturing GPUs with features and performance that customers find valuable is a tool to meet their objective. So while their decisions are their own responsibility, they are not unexpected. Competition from a third party with the same money making objective limits their ability to make money as they now have to provide at least the perception of more value to the customer. Previous generation hardware also limits their ability to make money as the relative increase in features and performance (and consequently value) are less than if the previous generation didn't exist. If the value isn't perceived to be high enough, customers won't upgrade from existing offerings. However, if nVidia simply stops offering previous generation hardware, new builds may still be a significant source of sales for those without an existing viable product.
Long story short, since there is no viable competition from AMD or another third party to limit nVidia's prices, it falls to us as consumers to keep the prices in check through waiting or buying previous gen hardware. If, however, consumers in general decide these cards are worth the cost, then those who are discontent simply need to accept that they fit into a lower price category of the market than they previously did. It is unlikely that nVidia will bring prices back down without reason.
Note: I tend to believe that nVidia got a good idea of how much more the market was willing to pay for their product during the mining push. Though I don't like it (and won't pay for it), I can't really blame them for wanting the extra profits in their own coffers rather than letting it go to retailers.
It's NVIDIA making a conscious decision to spend its engineering resources on innovating and implementing new technologies that will shift the future of gaming instead of spending that energy and die space on increasing performance as much as it can in today's titles. If NVIDIA left out the RT cores and other new technologies they could have easily increased performance 50 or 60% in legacy technologies by building chips bigger than Pascal but smaller than Turing, while increasing prices only moderately. Then everyone would be happy getting a card that would be leading them into a gaming torpor. In a few years when everyone is capable of running at 4k and over 60 fps they'd get bored and wonder why the industry were going nowhere.
nVidia has done the same thing in the past, introducing new technologies and platforms like tessellation, PhysX, HairWorks, GameWorks, GPP, etc. All of these were proved to be just tricks to kill the competition, which nowadays means to kill AMD. Pseudo-ray-tracing is not an innovation or something mandatory for gaming. It's just another premature technology that the opponent doesn't have, so that nVidia can be unique again, at huge cost to the consumer and with a performance regression.
I don't think it's fair to compare ray tracing to HairWorks... ray tracing is a superior way to render graphics compared to rasterisation; there's no question about this.
GPP was a partner promotion program. HairWorks is part of GameWorks. PhysX is part of GameWorks. GameWorks is not a trick, and neither is the PhysX part of it. But neither of them compares to ray tracing. Maybe you should look up what the word "pseudo" means, because you're using it wrong.
In 1 year or a year and a half AMD will have their own ray tracing acceleration hardware and then you'll be all in on it.
As for killing AMD, NVIDIA are not interested in it. It wouldn't be good for them, anyway. NVIDIA are, however, interested in building their platform and market dominance.
Nvidia is shoving ray tracing development down gamers' throats. We are paying for something we didn't even want in the first place.
You didn't even know about ray tracing and DLSS before they were announced. You are just drinking the Kool-Aid, unlike many of us who stand up and rage against these INDECENT prices.
I couldn't give a hoot either way; I just want games that make sense and are believable, which is far more important than how a game looks. If an object cannot be used or behave in a manner that corresponds to its appearance, then what's the point? Everyone went mental about the puddle in the PS4 game, but did anyone stop to ask whether the water on the ground was wet? Likewise, the RTX demo of that fire effect (which looked awful anyway): is the fire hot? Can it melt the glass if fired close enough? Can I break the glass? Use a shard as a weapon? Would an enemy reveal their position by walking on the fragments, or do the pieces just fade away because they're nothing more than a fancy PhysX visual? Can I throw a grenade into the cabin to make the glass explode and harm passing enemies?
World interactivity, object function and unexpected complexity and behaviour make for a far more immersive game than any amount of ray tracing can ever provide. A glazed china teapot can look glorious with complex reflections and suchlike, but if I can't use it to make tea then it's not a teapot. If I can't open a door, close it, lock it, break it down, etc., then it's not a door. People are obsessed with visuals in games atm because they've been told to be. The sheep behaviour of consumers with all this is utterly mind-boggling.
That aside, these Turing cards are simply not fast enough for doing RT effects anyway. NVIDIA has spent the last five years hyping people up for high-frequency gaming, 4K and VR, all things which need strong fill rates (rasterisation performance). Those who've gotten used to high-frequency monitors physically cannot go back; the brain's vision system adapts, and standard 60Hz suddenly looks terrible to such users. Now, all of a sudden, NVIDIA is trying to tell the very crowd with money to spend, who've largely jumped onto the HF/4K/VR bandwagon, that they should take a huge step backwards to sub-60Hz 1080p, at prices which make no sense at all. That's absolutely crazy, doubly so when dual-GPU is dead below the 2080, a card which is not usefully faster than a 1080 Ti, costs more and has less RAM.
?? I knew about ray tracing before it was announced. Ray tracing isn't a new technology; it's been around for more than 25 years, and the idea might predate computers.
Who DOESN"T want ray tracing?!
You can argue you don't want to pay a premium for it, but that's not the same thing.
I just want better games; I don't care whether they're ray traced or not. This is why I like Subnautica so much: functionally it's a far more interesting and engaging game than most I've seen recently, even though the visuals are not as sophisticated. I had been spending much time playing Elite Dangerous, but that game has become very wide with no depth; it lacks the interactivity and depth that Subnautica captures nicely. And re my comments above, see:
@V900: "If you look at AMDs Vega and compare it with the previous AMD flagship: Fury, you see a similar 30-40% increase in performance.
In other words: This isn’t Nvidia wanting to rip gamers off, it’s just a consequence of GPU makers pushing up against the end of Moore’s law."
Point of consideration: Though VEGA did see a lesser performance increase (not sure how accurate 30%-40% is), the MSRP of Vega64 ($500) was less than the MSRP of the FuryX ($650) and even the Fury ($550).
If that were true then Nvidia could have left off the RTX parts this time around and created a GPU that offers a simple ~30% performance improvement at roughly the same retail cost.
Following that, the die-area benefits from 7nm could have been spent on both RTX features and another ~30% performance boost at a similar or slightly-higher cost. By then they could probably have added enough resources to at least manage high refresh rates at 1080p, if not 2.5K
Instead they massively inflated their die for features that require you to accept resolutions and frame-rates that PC gaming left behind 6 years ago.
That last sentence is something I wish tech sites would emphasise a lot more. It very much defines how those who normally buy into the higher tier of tech now regard what they like doing and why. NVIDIA pushed hard to create the market for high-refresh gaming, 4K and VR, and now suddenly they're trying to do an about-face. I can't see how it can work. I just bought a 27" 1440p IPS panel for 200 UKP; the cost of good screens has come down a lot, and now NVIDIA wants us to drop back down to 1080p? :D I get the impression the reaction of a great many is just laughter.
Ahaa! You are getting close :) Come on, just spell it out: they want to "milk" us as much as possible before Moore's Law ends and we completely stop upgrading our PCs, just replacing the defective part twice in a lifetime. No more billions of moneyz for Corporate Commander :)
These cards are a disappointment for the price; the 2080 Ti should be priced at most at $800, as it just doesn't offer the performance required to justify its price. Worse, they compared it here to the 1080 Ti FE, which, as GamersNexus pointed out, is not ideal, since those cards are noticeably slower than other cards with proper cooling, so the 1080 Ti is at least as fast as the 2080. On the ray tracing side, I like the technology, but it's not impressive enough to justify the hefty price tag. I'd rather have a real generational leap, with a 2070 beating a 1080 Ti and a 2080 Ti having at least 70% more performance, than have RT; it's a niche product and only a few games will benefit from it. The whole DLSS thing isn't good either, limited to only a few games; with more brute force we could achieve 4K and supersampling.
"I'd rather have a real generational leap with a 2070 beating a 1080ti and a 2080ti having at least 70% more performance than having RT" That reminded me of a very old quote: "If I had asked people what they wanted, they would have said ‘faster horses.’" — Henry Ford
That quote only makes sense if Nvidia came up with a "different" radical product than a graphical horse. They just made a slightly faster horse with a RTX ON button which nobody is ready to push yet i.e. developers. So, if you have a choice between a much faster horse and a RTX ON button - one would take a much faster horse. Now, when developers are ready to push the button/envelope, and sign on to the RTX, then this quote makes sense. Nvidia is asking customers to pay the price of new tech-adoption without show-casing the products that use it. They could have invested with devs and in games, to use the RTX, and then released it. But no, they want to fill in a gap until 7nm arrives.
Nobody was ready to push the mass produced automobile button, yet, either. Do you think Ford started mass producing cars and then immediately there were roads and gas stations? No, at first horses could comfortably go many more places than cars could.
His quote is entirely appropriate.
There is no gap to fill before 7 nm arrives, since AMD offers no competition. NVIDIA introduced this now because they see value in a product which will generate sales. Plus, it will get the ball rolling on developers implementing the new technologies that are present in the architecture and will be present in future NVIDIA architectures.
Have to agree here. Not only were automobiles extremely limited in where they could go at introduction, they were also very loud and considered disruptive to society, with a large voice of opposition. These new cards at least have the benefit of being able to go anywhere their predecessors can while still enabling new capabilities.
I very much agree that nVidia is using this architecture to "get the ball rolling" on the new tech. They are probably very much aware that sales of RTX cards will be lower until they can fit a meaningful amount of the new hardware resources into a mainstream chip. Though, given the size of the chips and typical associated yields, nVidia may still end up selling every chip they can make.
So don't buy it, eddman. In the end the only real justification for prices is what people are willing to pay. If one isn't able to make a product cheaply enough for it to be sold for what people are willing to pay then the product is a bad product.
I don't understand why you are so worried about the price. Or why you think they are "cut-throat". A cut-throat price is a very low price, not a high one.
There is a wealthy minority who'd pay that much, and? It's only "justified" if you are an nvidia shareholder.
The cards are overpriced compared to last gen and that's an absolute fact. Your constant defending of nvidia's pricing is certainly not a normal consumer behavior.
Yojimbo is right that an item is only ever worth what someone is willing to pay, so in that sense NVIDIA can do what it likes, in the end it's up to the market, to consumers, whether the prices "make sense", ie. whether people actually buy them. In this regard the situation we have atm is largely that made by gamers themselves, because even when AMD released competitive products (whether by performance, value, or both), people didn't buy them. There are even people saying atm they hope AMD can release something to compete with Turing just so NVIDIA will drop its prices and thus they can buy a cheaper NVIDIA card; that's completely crazy, AMD would be mad to make something if that's how the market is going to respond.
What's interesting this time though is that even those who in the past have been happy to buy the more expensive cards are saying they're having major hesitation about buying Turing, and the street cred which used to be perceived as coming with buying the latest & greatest has this time largely gone, people are more likely to react like someone is a gullible money pumped moron for buying these products ("More money than sense!", as my parents used to say). By contrast, when the 8800 GTX came out, that was a huge leap over the 7800 and people were very keen to get one, those who could afford it. Having one was cool. Ditto the later series right through to Maxwell (though a bit of a dip with the GTX 480 due to heat/power). The GTX 460 was a particularly good release (though the endless rebranding later was annoying). Even Pascal was a good bump over what had come before.
Not this time though, it's a massive price increase for little gain, while the headline features provide sub-60Hz performance at a resolution far below what NVIDIA themselves have been pushing as desirable for the last 5 years (the focus has been on high frequency monitors, 4K and VR); now NVIDIA is trying to roll back the clock, which won't work, especially since those who've gotten used to high frequency monitors physically cannot go back (ref New Scientist, changes in the brain's vision system).
Thus, eddman is right that the cards are overpriced in a general sense, as they don't remotely match what the market has come to expect from NVIDIA based on previous releases. However, if gamers don't vote with their wallets then nothing will change. Likewise, if AMD releases something just as good, or better value, but gamers don't buy it, then again nothing will change; we'll be stuck with this new expensive normal.
I miss the Fermi days, buy two GTX 460s to have better performance than a GTX 580, didn't cost much, games ran great, and the lesser VRAM didn't bother me anyway as I wasn't using an uber monitor. Now we have cards that cost many hundreds that don't even support multi-GPU. It's as daft as Intel making the cost entry point to >= 40 PCIe lanes much higher than it was with X79 (today it's almost 1000 UKP); an old cheapo 4820K can literally do things a 7820X can't. :D
Alas though, again it boils down to individual choice. Some want the fastest possible, and if they can afford it then that's up to them; it's their free choice, and we don't have the right to tell people they shouldn't buy these cards. It's their money after all (anything else is communism). It is though an unfortunate reality that if the cards do sell well then NVIDIA will know they can maintain this higher-priced and more feature-restricted strategy, while selling the premium parts to Enterprise. Btw, it amazes me how people keep comparing the 2080 to the 1080 Ti even though the former has less RAM; how is that an upgrade in the product stack? (People will respond with ray tracing! Ray tracing! A feature which can't be used yet, runs too slow to be useful anyway, and whose initial implementation is a pretty crippled version of the idea as well.) And why doesn't the 2080 Ti have more than 11GB? It really should, unless NVIDIA figures that if they can indeed push people back to 1080p then 11GB is enough anyway, which would be ironic.
I'm just going to look for a used 1080 Ti, more than enough for my needs. For those with much older cards, a used 980 Ti or 1070, or various AMD cards, are good options.
No reason Ford couldn't have done both, though. There is no technological reason nVidia could not have released a GTX 2080 Ti as well. But they know they couldn't charge as much, and the vast majority of people would not buy the RTX version. Instead, it makes their 1080 Ti stock look much more appealing to value-oriented gamers, helping them shift that stock as well as charge a huge price for the new cards.
It's really great business, but as a gamer and not a stockholder, I'm salty.
That quote applies perfectly to our digital electronic World: we want to go faster from point A to point B. To do that, Henry Ford gave us a car (a faster "horse"). We want the same from GPUs and CPU's, to be faster. Prettier sure, pink even. But first just make it fast.
Overall disappointing performance. The RTX 2080 is a flat-out bad buy at $800+ when 1080 Ti custom boards are as low as $600, and the RTX 2080 Ti is a straight-up rip-off when consumers can easily surpass its performance with 2 x 1080 Tis. I agree with the conclusion, though, that you are buying hardware you won't take advantage of yet. But still, if Nvidia wants to push this hardware to all gamers, they need to drop the pricing in line with its performance, otherwise not many will buy into the hype.
Just checked a local store: the lowest priced 2080 card, a Gigabyte RTX 2080, is $1,080, and that's Canadian dollars. The most expensive RTX card, an EVGA RTX 2080 Ti XC ULTRA GAMING 11GB, is $1,700!!!! Again, that's Canadian dollars!! To make things worse, that's PRE-ORDER pricing, with this disclaimer: "Please note that the prices of the GeForce RTX cards are subject to change due to FX rate and the possibility of tariffs. We cannot guarantee preorder prices when the stock arrives - prices will be updated as needed as stock become available." Even if I could afford these cards, I think I would pass; they're just WAY too expensive. I'd probably grab a 1080 or 1080 Ti and be done with it. IMO, Nvidia is being a tad bit greedy just to protect and keep its profit margins, but they CAN do this because there is no one else to challenge them.
Would you care to share the bill of materials for the TU102 chip? Since you seem to suggest you know the production costs, and therefore know the profit margin, which you suggest is a bit greedy.
Popin... all I am trying to say is Nvidia doesn't have to charge the prices they are charging... but they CAN, because there is nothing else out there to provide competition...
I specifically replied to Qasar's claim "nvida is being a tad bit greedy just to protect and keep its profit margins.. but, they CAN do this" which is baseless unless they have cost information to know what their profit margins are.
PopinFRESH, *sigh*, I guess you will never understand the concept of "no competition, we can charge whatever we want, and people will STILL buy it because it is the only option if you want the best or fastest." It has NOTHING to do with knowing cost info or what a company's profit margins are... but I guess you will never understand this...
Of course they're being greedy. Since they saw their cards flying off the shelves at 50% above MSRP earlier this year, they know people are willing to pay... so they're pushing the limit, as they normally do. This isn't new with Nvidia. Not sure why anyone is defending them... or getting excessively mad about it. (..shrug)
Effectively, gamers are complaining about themselves. Previous cards sold well at higher prices, so NVIDIA thinks it can push up the pricing further, and reduce product resources at the same time even when the cost is higher. If the cards do sell well then gamers only have themselves to blame, in which case nothing will change until *gamers* stop making it fashionable and cool to have the latest and greatest card. Likewise, if AMD does release something competitive, whether via price, performance or both, then gamers need to buy the damn things instead of just exploiting the lowered NVIDIA pricing as a way of getting a cheaper NVIDIA card. There's no point AMD even being in this market if people don't buy their products even when it does make sense to do so.
@PopinFRESH007: "would you care to share the bill of materials for the tu102 chip? Since you seem to suggest you know the production costs, and therefor know the profit margin which you suggest is a bit greedy."
You have a valid point. It is hard to establish a profit margin without a bill of materials (among other things). We don't have a bill of materials, but let me establish some knowns so we can better assess.
Typically, most supporting components on a graphics card are pretty similar to previous generation cards. Often, different designs used for the same function are simply a cost-cutting measure. I'm going to assume that power transistors, capacitors, output connectors, etc. will remain nominally the same cost, so I'll focus on differences. The obvious one is the larger GPU. This is not the first chip made on this process (TSMC 12nm), and the process appears to be a half node, so defect rates should be lower and the wafer cost should be similar to TSMC 16nm. On the other hand, the chip is still very large, which will likely offset some of that yield gain and reduce the number of chips fabricated per wafer. Pascal was a first-generation chip produced on a new full-node process (TSMC 16nm), but quite a bit smaller, so yields may have been higher and there were more chips fabricated per wafer. Also apparent is the newer GDDR6 memory, which will naturally cost more than GDDR5(X) at the moment, but clearly not as much as HBM2. The chips also draw more power, so I'd expect a marginal increase in power-related circuitry and cooling costs relative to Pascal; about the same cost here as for Maxwell-based chips, given similar power requirements.
From all this, it sounds like graphics cards based on Turing chips will cost more to manufacture than their Pascal equivalents. It is probably not unreasonable to suggest that a TU106 may have a similar bill-of-materials cost to a GP102, with the note that the cost to produce the GP102 has almost certainly dropped since introduction.
I'll leave the debate on how greedy or not this is to others. (A rough dies-per-wafer sketch follows below to put some numbers on the die-size point.)
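(Editor's aside: to put rough numbers on the die-size argument above, here is a minimal Python sketch using a simple Poisson yield model. The 300mm wafer, the 0.1 defects/cm² density, and the ~10% edge loss are illustrative assumptions, not NVIDIA's or TSMC's actual figures; only the die areas are the published GP102/TU102 sizes.)

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Crude estimate: usable wafer area divided by die area,
    # discounted ~10% for edge loss. Real reticle layouts differ.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area * 0.9 / die_area_mm2)

def yield_fraction(die_area_mm2, defects_per_cm2=0.1):
    # Poisson model: probability a die picks up zero defects.
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area in [("GP102 (~471 mm^2)", 471), ("TU102 (~754 mm^2)", 754)]:
    n = dies_per_wafer(area)
    y = yield_fraction(area)
    print(f"{name}: ~{n} candidates/wafer, ~{y:.0%} defect-free, ~{n * y:.0f} good dies")

Under these made-up numbers the bigger Turing die yields roughly half as many good dies per wafer, which is the gist of the cost argument even before binning.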
BurntMyBacon, just like PopinFRESH, I guess you will never understand the concept of "no competition, we can charge whatever we want, and people will STILL buy it because it is the only option if you want the best or fastest." It has NOTHING to do with knowing cost info or what a company's profit margins are... but I guess you will never understand this either.
Ray tracing would be amazing to have in games, and it really is the future of gaming. It's crazy to think there will be games with it already next year. (And some later this year!)
Is it too expensive? Meh, we are talking about TWENTY BILLION transistors squeezed into the area of a postage stamp.
People pay $600-1000 for a phone, and some have no problem paying $1000 for a CPU or a designer chair.
$700-1200 isn't an unreasonable price for a cutting-edge GPU that's capable of ray tracing and will be fast enough for the newest games for years to come.
Some of the pro-RTX posts sound more like basic trolling though, just to stir things up. If they're getting paid to post +ve stuff, they're doing a pretty rotten job of it. :D
He has a point. People are willing to pay $1000 for a phone, $1000 for a CPU, but $1000 for a high end graphics card is outrageous? I wish the pricing was cheaper, but I'm not having a fit over it. If people don't want to pay the price, they won't. If Nvidia doesn't sell the numbers they want, they'll probably cut the price somewhat.
The $1000 phone actually has more technological advancement... and it's an SoC, not individual parts...
As for a $1000 CPU, that's normal because it's an enthusiast product (e.g. Threadripper or i9). There's no $1000 i7 or Ryzen 7... Nvidia shouldn't have made the 2080 Ti. They should've made a Titan Turing or something...
News flash: a 2080 Ti is just one part of a larger system. Unlike a flagship phone, you can't game with only a 2080 Ti or a 2080; you need other parts. Your argument makes no sense. Be honest, do you hold Nvidia stock? Does your family work for Nvidia?
What argument is that? No, I don't have any nVidia stock, and none of my family work for them. You sound envious of people who can afford to be early adopters.
You're defending the almost 50% price hike with "it's an enthusiast product." Now that is a dumb excuse. Nothing to do with being envious. A fool and his money are soon parted, so if you want to buy it, go ahead!
For the majority of us it's not worth buying something whose "flagship" features aren't even working or available, and probably won't be for months to come, and which offers only a ~30% average performance increase for almost double the price.
So you think that means they haven't paid full price for their phone? Or are you saying you don't understand how to buy a video card on a similar payment plan (hint -- it's called a credit card)?
People have absolutely no sense of logic or really intelligence at all when it comes to evaluating technology prices. NONE. This forum, and pretty much every internet forum where predominantly inexperienced kids who have no clue how actual money works post (which is apparently all of them), is all the proof you need of that.
^^ This, 100% this. People who think that changing the structure of payment somehow magically changes the price of something (other than increasing it due to TVM) are amazingly ignorant.
It doesn't change the price, however it certainly hides the price. A lot of people would not buy their expensive phone if it meant coughing up $800 all at once.
Not a good example. A credit card probably has a 20% per year interest rate. A phone contract usually bundles the monthly payment on the phone at a low or zero interest rate.
The carriers here (Canada) have been considering dropping the "subsidized phone" thing for a few years now. IF they do, I wonder how many people would NOT get a new phone every year, specifically those who "have to" get the new iPhone every year. I don't know anyone who would pay that much for a phone each year if they had to buy it outright from their own pocket...
We're not having a fit over price, we're just not happy about it. A lot of people are digging deep to justify it, though, including making spurious arguments about the cost of things that have nothing to do with gaming GPUs. That strikes me as... not rational.
$1000 phones are a great analogy, incidentally, but not for the reason you thought. It's another market area where manufacturers noticed people were sweating their assets for longer, so flimsy justifications were made for increasing the cost of entry to sustain margins. People buy it because they want it, not because it's good value.
Nvidia want to walk in here singing about a brave new world of Ray Tracing and then they tell me the cost to ride is $700+. To that I am saying nooooo thank you.
Does my not caring affect Nvidia much? No. But this is a forum, this is where we share opinions. Stop trying to act like only your opinion is rational and everyone else's is childish or misinformed.
I will NEVER pay €1000 for a smartphone. Unless, of course, runaway inflation in the next 15 years turns today's €300 into tomorrow's €1000. It has already started in the USA.
BTW: Why do I get the distinct impression, that most of the people complaining about the price, would insist that it’s a totally fair and reasonable price, if this was an AMD graphics card?
They wouldn't, because an AMD card wouldn't have the extra bloat yet would perform exactly the same; people would get the AMD card to avoid all the bullshit in the Nvidia card that hasn't even been proven to be worth it yet.
Better question: How much does nvidia pay you to be a shill here and on forums? I mean you are obviously delusional about it.
Do you actually read V900's other comments? He is a shill.
And anyway, AMD did the same thing with Vega 64 (1080-ish performance but a little expensive, except in Vulkan), and many were also outraged at the Vega launch. That's just because AMD was a year late and priced it a little high.
I think they're substantially overblowing the significance of raytracing and DLSS, at least in this generation's lifespan.
The best way I've seen it put is this: look at the Shadow of the Tomb Raider demo. There isn't *that* much of a difference between raytracing on and off. Shaders, ambient occlusion, and other such cheats have gotten so good that they're very close to ray tracing.
Will ray tracing ever be significantly better than raster rendering? Of course it will. It's the future of rendering. Is it significantly better now? No.
It's not an issue of "the card is expensive", it's "the card isn't a good value compared to its predecessors in current games". Most of those extra transistors aren't contributing to the card's performance right now, so you're paying a lot more for nothing, at least for the moment. For current games, the 2080 is 1080 Ti performance for 1080 Ti prices.
Unless you either A. must have the latest and greatest, or B. anticipate playing games that have been confirmed to support RT, I don't see a compelling reason to buy a 2080 over a 1080 Ti, at least until 1080 Ti supply dries up.
@Inteli: "For current games, the 2080 is 1080 Ti performance for 1080 Ti prices."
I think you meant greater than 1080Ti prices. At the same price and performance, I'd buy newer if for no other reason than to have longer support. The extra features (regardless of whether or not they are used in games I play) are just icing on the cake at that point.
Wow... another review comparing the shitty Pascal FE to the Turding FE, and unrealistic MSRP to MSRP. Instead of comparing, you know, actual cards that you can buy (partner cards), at the prices they actually sell for?
How much is Nvidia paying you guys to write such a biased, unrealistic review to attempt to salvage this disastrous launch?
Sometimes I wonder why I haven't removed Anandtech from my bookmarks yet. I might just do that after I finish posting this comment.
Reminds me of something... same situation and outcome: pushing new tech while the hardware is horrible price/perf. https://www.anandtech.com/show/742/12 From the GeForce 3 it took until the GeForce 6 to get an Nvidia GPU that was good on both performance and price... I guess $1200 is the new $500.
The compute benchmarks really mean a lot. But I wonder why the GTX 10 series is absent from the SGEMM tests. Those cards should be able to hit quite a few GFLOPS in SGEMM.
Nvidia could be capping Turing's performance to make Pascal look good in comparison and sell remaining stock, which is a win-win for Nvidia: they don't have to discount Pascal to maintain max profits. Look at most conclusions saying Pascal is still competitive with the 2080. Unfortunately we don't have a choice in a monopoly. When Intel enters the GPU market we are probably going to have the most competition in a long time, even if it's mid-level.
I'm going to take an (admittedly small) leap of faith and suggest that nVidia most likely is not intentionally limiting the performance of Turing cards. Given the amount of hardware dedicated to tasks that don't benefit rasterization, it just doesn't seem like they could have left that much performance on the table. It is much more likely that they've simply set prices high with the intent of dropping them once high-end Pascal inventory clears out. Of course, after the mining push, they've seen how much the market is willing to bear. They may be trying to establish a new pricing structure that gives the extra profits to them rather than to retailers.
I honestly believe Nvidia's endgame is simple. They want to increase their margin, and the only way to do that is to sell the WHOLE full chip, tensor cores and all, to gamers, while still charging top dollar to pros.
This leads Nvidia to make FEWER variants, saving the cost of designing multiple versions when they can't scale down or cut.
Are you kidding me? This is exactly it. They made an all-around chip to tackle pros, gamers and compute. Vega has the same issue: it was designed to span everything from an iGPU to a dGPU. It does extremely well at the low end, but struggles as a dGPU.
They save cost and standardize their manufacturing process. It is nothing else.
Going to go in a weird direction with this. I believe cards are going to start diverging in terms of what gamers are looking for. Hardcore gamers chasing the professional scene and absolute performance always turn graphics down; they drive high refresh rate monitors with low response times and high frame rates to absolutely limit the amount of variance (spiking) in their gaming experience.
Nvidia went for absolute fidelity where they believe the best gaming experience will come from picture perfect representations of an environment, such as with ray tracing. I see ray tracing as a gamer and I go 'Welp that's something I'll turn off'. Hardware review websites are only looking at gaming from a cinematic standpoint, where best gaming will always have everything maxed out running at 8k. Cards do perform differently under different resolutions and especially with different amounts of eye candy turned on. I really think Anand should look into testing cards at 1080p on lowest settings with everything uncapped - Not as the only datapoint, but as another one. Professional gamers or anyone who takes gaming seriously will be running that route.
Which runs into another scenario where card makers are going to diverge. AMD's upcoming 7nm version of Vega, for instance, may continue down Vega's current path, which means they'll be focusing on current-day performance (although they mentioned concentrating more on compute, we can assume the two will intersect). That means while a 2080 Ti might be faster running 4K@ultra, especially with rays if that ever takes off, it may lose out completely at 1080p@low (but not eye-cancer settings, such as turning down resolution or textures).
For testing at absolute bleeding speeds, that 1% that is removed in 99th-percentile testing really starts to matter, mainly because the spikes, the micro-stutters, and the extra-long hiccups get you killed, and that pisses off gamers aiming for the pinnacle. Those might seem like outliers, but if they happen infrequently yet consistently, they are part of the distribution and shouldn't be removed. When aiming for bleeding speeds, they actually matter a lot more.
So thus is born the esports gaming card and the cinematic gaming card. Please test both.
So they should include the horizontal line of a completely CPU-bound test? Also, I'm not understanding the statistical suggestions; they make no sense. The 99th percentile is very strict already, and the minuscule number of outliers being dropped are often not due to the GPU. As long as they use the same metric for all tests in the data set, it is irrelevant.
Not all games are CPU-bound at low settings, and furthermore it wouldn't be completely horizontal; that would imply zero outliers, which never happens. In that case, instead of looking at the 99th-percentile frame time, you would focus on the remaining 1%: all the stutters and micro-stutters. You can have confidence in the tail ends of a distribution if there are enough data points.
Also, having played tons of games on low settings, you are 100% incorrect about it being a flat line. Go play something like Overwatch or Fortnite on low; you don't automagically end up CPU-capped.
Despite all the (AMD) fanboy rage about higher prices, here’s what will happen:
Early adopters and anyone who wants the fastest card on the market, will get an RTX 2080/2070, Nvidia is going to make a tidy profit, and in 6-12 months prices will have dropped and cheaper Turing cards will hit the market.
That's when people with a graphics card budget smaller than $600 will get one. (AMD fanboys will keep raging though, probably about something else that's also Nvidia related.)
That's how it always works out when a new generation of graphics cards hits the market.
But everyone who’s salty about “only” getting a 40% faster card for a higher price won’t enjoy the rest of the decade.
There won't be any more GPUs that deliver a 70-80% performance increase. Not from AMD and not from Nvidia.
We’re hitting the limits of Moore’s law now, so from here on out, a performance increase of 30% or less on a new GPU will be the standard.
Why gratuitously characterize all those complaining of price as "AMD fanboys"? Obviously, most are those who intended to buy the new NVidia boards but are dismayed to find them so costly! Hardly AMD disciples.
Your repeated, needless digs mark you as the fanboy. And do nothing to promote your otherwise reasonable statements.
I'm such an AMD fanboy that I have a 1060 and have never owned an AMD product. You, on the other hand, reek of pro nvidia bias.
A 40% higher launch MSRP is not "how it always works out".
No one is complaining about the performance itself but the horrible price/performance increase, or should I say decrease, compared to pascal.
Moore's law? 980 Ti was made on the same 28nm process as 780 Ti and yet offered considerably higher performance and still launched at a LOWER MSRP, $650 vs. $700.
Got enough straw for that strawman you're building? The last 3 GPUs I've bought have been Nvidia (970, 1060, 1070 Ti). I would have considered a Vega 56, but the price wasn't low enough that I was willing to buy one.
News flash: in current games, the 2080 is tied with the 1080 Ti for the same MSRP (and the 1080 Ti will be cheaper with the new generation launches). Sure, if you compare the 1080 to the 2080, the 2080 is significantly faster, but those only occupy the same position in Nvidia's product stack, not in the market. No consumer is going to compare a $500 card to a $700-800 card.
The issues people take with the Turing launch have absolutely nothing to do with either "we only got a 40% perf increase" or "Nvidia raised the prices" in isolation. Every single complaint I've seen is "Turing doesn't perform any better in current games than an equivalently priced Pascal card".
Lots of consumers are pragmatists, and will buy the best performing card they can afford. These are the people complaining about (or more accurately disappointed by) Turing's launch: Turing doesn't do anything for them. Sure, Nvidia increased the performance of the -80 and -80 Ti cards compared to last generation, but they increased the price as well, so price/performance is either the same or worse compared to Pascal. Many people were holding off on buying new cards until this launch, and in return for their patience they got...the same performance as before.
Where I live (UK), the 2080 is 100 UKP more expensive than a new 1080 Ti from normal retail sources, while a used 1080 Ti is even less. The 2080 is not worth it, especially with less RAM. It wouldn't have been quite so bad if the 2080 was the same speed for less cost, but being more expensive just makes it look silly. Likewise, if the 2080 Ti had had 16GB that would at least have been something to distinguish it from the 1080 Ti, but as it stands, the extra performance is meh, especially for the enormous cost (literally 100% more where I am).
If you think ray tracing on the 2080 Ti is damn cool and a game changer... you should see what AMD has done. It's just that no one really cared about it back then...
It was just another "rasterization cheat" though, may have looked nice but ultimately didn't have the longevity that Ray Tracing may have. No 3D developer or 3D artist is going to ever argue that Ray Tracing is not the future, the question is just how to get there.
Meh... The image itself is captured by cameras, it’s the manipulation of it that’s done in real time.
Which is of course a neat little trick; and while it looks good, it’s hardly as impressive and computationally demanding as creating a whole image through raytracing.
The demos you linked to are beautiful indeed; however, both the ATi Ruby demo and the Rigid Gems demo use standard DirectX features from those times, no ray tracing at all nor any vendor-specific features. Given the latter, it is worth pointing out that the 2008 Ruby demo (called Double Cross IIRC) was perfectly happy to run on nVidia cards of the time.
If these demos show anything, it is that there were and are extremely talented artists out there who can do amazing things to work around the limitations of rasterization. That way, however, we can only ever approximate how a scene should look, at increasingly high cost, so going back to proper ray tracing was only a question of when its cost would approach that of rasterization. We seem to have arrived at that balancing point, hence hybrid rendering. I also think that if AMD had pushed nVidia harder with high-end GPUs, nVidia may not have made this step at this time; at the very least it would have been a riskier proposition.
Clearly, we're firmly in the age of 4K gaming for the upper range of cards. From Vega 64, 1080, 1080Ti, 2080, 2080Ti, all are throwing up very playable performance on 4K resolutions. I still expect to see people say you'll need SLI 2080TIs for 4K gaming, but that rings incredibly hollow when even a Vega 56 can deliver playable frame rates in most games at 4K with near maximum settings.
My takeaway: 1080 owners and above should hold off, 1070 and below are probably justified in upgrading, with an eye to the future as the technologies baked in these cards mature and games utilizing them are released.
I am running a GTX970 and have zero interest in this generation of cards at that price. I am going to let the market settle, AMD release its next batch of cards or two and see where things are a year or two from now. Waste of money as far as I am concerned.
NV is messing with us. Even with no competition from AMD, these price hikes with such small performance gains are laughable. This generation of GPUs seems like just a stopgap before NV has something more serious to show next year.
No they seem like they will be exactly the same as the 1000 series: they are what they are, you pay what they ask, and they will be the only decent option they offer for the next two years.
Maybe if Radeon ever gets their shit together the landscape might look different in 2-3 years but trust me: for now, expect more of the same.
Yeah, we are pretty much getting into the Intel vs AMD scenario, where Intel dominated for years and brought customers overpriced products with very slow performance upgrades. There is hope AMD will at least try to do something about it.
The temperature and noise results are shocking. The results are much closer to what you'd expect from a blower, rather than an open-air cooler. Previous gen OEM solutions do much better than this. What's the reason for this?
I think we need DLSS and hybrid ray tracing in shipping games to judge whether it is worth it. As it stands, we could have had nearly double the performance of the 1080 Ti if Nvidia had simply spent a 7xx mm² die on more of the same.
I think the idea Nvidia had is that we have reached a plateau in gaming graphics. Imagine what you could do with a 7nm, 7xx mm² die version of the 2080 Ti: move the 4K Ultra High Quality frame rate from ~60 to 100? That is in 2019; in 2022 at 3nm, double that frame rate from 100 to 200?
The industry as a whole needs to figure out how to extract even more graphics quality with fewer transistors and simpler software, while at the same time making 3D design and modelling easier. The graphics assets in games are now worth 100s to millions - just the assets, not engine programming, networking, 3D interaction, etc., nothing to do with code. Just the graphics. Hybrid ray tracing is an attempt to bring higher quality graphics without the ever-increasing cost of engines and graphics designers simulating those effects.
What is interesting is that we will have 8 Core 5Ghz CPU and 7nm GPU next year.
Given how much die space is dedicated to the new features software support will definitely be the key for these cards' success. Otherwise their price is just too high for what they offer today. Buying these cards now is somewhat of a gamble, but nVidia does have excellent relations with developers however, so support should come. As someone who would like to have a capable GPU for 100+ FPS gaming at 1440p, especially one that is future proof, I would much rather take my chances with these new cards.
To me the question is this: would it really be worth focusing even more on 4K gaming when it is still a fairly niche market segment due to monitor prices (especially ones with low latency for gaming)? Arguably these high-end cards are niche too, but once we can already have 4K@60 FPS with maxed graphics settings, other considerations become more important. At any given resolution and feature level, pure performance becomes meaningless after a certain point, at least for gaming. Aiming for 100 FPS at 4K definitely has merit in my opinion, but by the time really good 4K monitors take over we'll get there, even on the path nVidia took.
Regarding graphics quality and transistor count, ray tracing should be a win here, if not now then certainly in the future. There are diminishing returns with rasterization as you approach more realistic scenes, and ray tracing makes you jump through fewer hoops if you want to create a correct-looking scene.
"I think the idea Nvidia had is that we have reached the plateau of Graphic Gaming. Imagine what you could do with a 7nm 7xxm2 Die of 2080Ti?"
Yes, but that is probably why they stuck with 12nm FF, actually. Note the die size, plus each card has its own GPU rather than a binned selection from the same GPU (kudos to Nate for also ruminating briefly on this in the text). This means maximizing yield is particularly important, and so begs for a mature, efficient process. TSMC achieved great things with their current 7nm process, no knock on it, but it is still DUV-based, and it's been long documented that there are yield challenges with that. IMO Nvidia will wait to hitch their wagons to TSMC's next process (expected next year), EUV-based 7nm+, which is expected to mitigate a lot of these yield concerns.
In other words it will be very interesting to see what the 2180 Ti looks like next year -- yes, I built a lot of assumptions into that sentence ;)
The quality of this website has hit rock bottom under the helm of Cutress. I waited all day for this review, but by the time it was posted word had spread of how underwhelming these cards are and I had no interest in reading it. This is the second major hardware release in a year in which Anandtech has totally screwed up the review, the other being Ryzen. Please, Mr Cutress, step down so someone else can run the site and hopefully return it to its former glory.
No, I said it was widely known that the cards' performance was underwhelming by the time the article was posted. If you reread what I said the main point of my post is that Mr Cutress should resign because of the poor quality of how the website has been run since he took over.
After reading my original post again how in the hell did you come to the conclusion that I was underwhelmed by the article? It's very clear that I was talking about the performance of the card.
"The quality of this website has hit rock bottom under the helm of Cutress"
That would be rather amazing, seeing as how I run it, not Ian.
Anyhow, if you have any questions or concerns, please don't hesitate to drop me a line (my email address is listed on the site). I always appreciate the feedback.
Then take my comments and replace Cutress with Smith. I've been an avid follower of this website since 1998 including having it as my homepage for several years and the quality has declined. The Ryzen review and this review are prime examples of that decline
I follow what you're saying. I thought the website was run by Ian, my mistake. I'm saying you need to take my original comment and replace the name Cutress with the name Smith. Here I'll do it for you.
"The quality of this website has hit rock bottom under the helm of Smith. I waited all day for this review but by the time it was posted word had spread of how underwhelming these cards are and I had no interest in reading it. This is the second major hardware release in a year in which Anandtech has totally screwed up the review, the other being Ryzen. Please Mr Smith step down so someone else can run the site and hopefully return it to it's former glory."
Now do you understand? I'm trying to say the site needs new leadership. The articles for the two biggest releases of the past year have been totally screwed up, 2080 cards and Ryzen refresh. Ryzen took over a month to complete.
I'm not saying the article was bad because it was released 11 hours later than everyone else's; I'm sure it was fine. I'm saying that the management of this website has gone to hell because they screwed up the two biggest releases of the last year. It took 5 weeks to finish the Ryzen article.
I really wonder how well ray tracing will be implemented in the next 1-2 years. My bet is that for many of the titles Nvidia has announced, the effects will be limited and sort of gimmicky, and the real benefits will come with titles that are starting development now (or, more likely, in a few months, when devs can look at how the first attempts at using RT fared).
If my hunch is right, then that means that the RT features are likely to be of little practical use in this generation, since the real benefits won't come until some point after Nvidia's next-gen (7nm? 5?) chips come out with much-improved performance.
It sounds like the 2080 Ti at maybe 1440p might be viable for RT. But yeah, for the most part this is about the future, and about getting some games out there so future releases don't have the "chicken & egg" problem they have now (no games to use it, but the reason there are no games is that there are no cards to use it).
Nvidia is clearly sacrificing short-term profitability to establish this base; notice how the 2080 is both priced and performs about the same as the 1080 Ti. With the larger die size despite 12nm FF, driver development costs, etc., there is little doubt in my mind that Nvidia makes a bigger margin on the 1080 Ti than on the 2080. But they want to make it cheap enough that, even if there is little to gain RIGHT NOW from buying a 2080 instead of a 1080 Ti, there is also little lost either.
The RTX 2080 is not worth buying right now; the 1080 Ti is cheaper (lol), cooler and performs the same. The RTX 2080 Ti is a $1200+ card that represents roughly a 60% price increase over the 1080 Ti for a 25-28% performance increase. How is that a good purchase? Neither of them is worth buying right now.
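(A quick way to make that complaint concrete is to normalize price by relative performance. The prices and the ~28% average uplift below are the approximate figures thrown around in this thread, not official numbers.)

# Rough dollars per unit of 1080 Ti performance, using approximate prices
# and the ~25-30% average uplift cited in the thread (illustrative only).
cards = {
    "GTX 1080 Ti": {"price": 700, "rel_perf": 1.00},
    "RTX 2080 Ti": {"price": 1200, "rel_perf": 1.28},
}
for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['rel_perf']:.0f} per unit of 1080 Ti performance")

By that crude measure the 2080 Ti costs roughly $940 per "1080 Ti's worth" of performance versus $700 for the 1080 Ti itself, which is the value regression people are reacting to.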
THANK YOU for the Ashes of the Singularity benchmark results. The deltas may not translate to other games but show me exactly what to expect from an upgrade.
But NVIDIA's key features - such as real time ray tracing and DLSS - aren't being utilized by any games right at launch. In fact, it's not very clear at all when those games might arrive, because NVIDIA ultimately is reliant on developers here.
In the Star Wars Reflections demo, we measured the RTX 2080 Ti Founders Edition managing around a 14.7fps average at 4K and 31.4fps average at 1440p when rendering the real time ray traced scene. With DLSS enabled, it jumps to 33.8 and 57.2fps
Direct quotes from the article. A price premium for NV tech that (1) will not be in games at launch, may have months of buggy implementation from early adoption, and may not see widespread adoption, and (2) needs extreme help from DLSS to reach usable framerates (rough math on those demo numbers below). Should've been named DTX, not RTX.
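(Rough math on the Star Wars demo numbers quoted above, just to make the "extreme help from DLSS" point concrete; the frame rates are the ones quoted from the review.)

# Reflections demo figures quoted above: (RT only, RT + DLSS) in fps.
results = {"4K": (14.7, 33.8), "1440p": (31.4, 57.2)}
for res, (rt_only, rt_dlss) in results.items():
    print(f"{res}: {rt_dlss / rt_only:.1f}x with DLSS ({rt_only} -> {rt_dlss} fps)")

So DLSS is worth roughly 2.3x at 4K and 1.8x at 1440p in that demo, and even then only 1440p clears 30 fps without it.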
Tomb Raider is a title out now with RTX enabled in the game.
Battlefield 5 is out in a month or two (though you can play it right now) and will also utilize RTX.
Sorry to destroy your narrative with the fact that one of the biggest titles this year is supporting RTX.
And that’s of course just one out of a handful of titles that will do so, just in the next few months.
Developer support seems to be the last thing that RTX2080 owners need to worry about, considering that there are dozens of titles, many of them big AAA games, scheduled for release just in the first half of 2019.
Unless I'm mistaken, TR does not support RTX yet. Obviously, otherwise it would be showing up in reviews everywhere. There is a reason every single reviewer is only benchmarking traditional games; that's all there is right now.
These cards are nothing more than a cheap magic trick show. Nvidia knew the performance was lackluster and based their marketing on a gimmick to corner the competition, claiming this is the future of gaming and you'll be missing out without it.
They literally tried to create a need... and if you are defending Nvidia over this, you have just been drinking the Kool-Aid at this point.
Quote me on this: this will be the next GameWorks feature that devs won't bother touching. Why? Because devs develop games on consoles and port them to PC, and the extra development time doesn't bring in any additional profit.
Here's the thing though: I don't think the performance is that lacklustre. The issue is we have this huge die and half of it does not do what most people want, which is give us more frames. If they had made the same size die with nothing but traditional CUDA cores, the 2080 Ti would be an absolute beast, and I'd imagine it would be a lot cheaper as well.
But nVidia (maybe not mistakenly) have decided to push the ray tracing path, and those of us who just want maximum performance for the price (me) and were waiting for the next 1080 Ti are basically left thinking "... oh well, skip".
Don't get me wrong, these cards are a normal generational performance jump; they're just not the second coming that Nvidia is marketing.
The problem here is Nvidia wants to corner AMD, and the tactic they chose is RTX. However, RTX is nothing more than a FEATURE. The gamble could cost them a lot.
If AMD's gaming and 7nm strategy pays off, devs will develop on AMD hardware and port to the PC architecture, leaving them no incentive to put in extra work for a FEATURE.
The extra cost of the bigger die should have gone toward gaming performance, but Nvidia's strategy is to disrupt the competition and entrench their near-monopoly while they can.
PhysX didn't work, HairWorks didn't work, and this will not work. As cool as it is, this should have been a feature for pro cards only, not consumers.
This reminds me quite a bit of the original GeForce 256 launch. Not sure how many of you were following Anandtech back then, but it was my go-to site then just as it is now. Here are links to some of the original reviews:
Similar to the 20XX series, the GeForce 256 was Nvidia's attempt to change the graphics card paradigm, adding hardware transform and lighting to the graphics card (and relieving the CPU of those tasks). The card was faster than contemporary cards, but also much more expensive, making the value questionable for many.
At the time I was a young mechanical engineer, and I remember feeling that Nvidia was brilliant for creating this card. It let me run Pro/E R18 on my $1000 home computer, about as fast as I could on my $20,000 HP workstation. That card basically destroyed the market of workstation-centric companies like SGI and Sun, as people could now run CAD packages on a windows PC.
The 20XX series gives me a similar feeling, but with less obvious benefit to the user. The cards are as fast or faster than the previous generation, but are also much more expensive. The usefulness is likely there for developers and some professionals like industrial designers who would love to have an almost-real-time, high quality, rendered image. For gamers, the value seems to be a stretch.
While I was extremely excited about the launch of the original GeForce256, I am a bit "meh" about the 20XX series. I am looking to build a new computer and replace my GTX 680/i5-3570K, but this release has not changed the value equation at all.
If I look at Wolfenstein, then a strong argument could be made for the 2080 being more future proof, but pretty much all other games are a wash. The high price of the 20XX series means that the 1080 prices aren't dropping, and I doubt the 2070 will change things much since it looks like it would be competing with the vanilla 1080, but costing $100 more.
Looks like I will wait a bit more to see how that price/performance ends up, but I don't see the ray-tracing capabilities bringing immediate value to the general public, so paying extra for it doesn't seem to make a lot of sense. Maybe driver updates will improve performance in today's games, making the 20XX series look better than it does now, but I think like many, I was hoping for a bit more than an actual reduction in the performance/price ratio.
How much was a 256 at launch? I couldn't find any concrete pricing info but let's go with $500 to be safe. That's just $750 by today's dollar for something that is arguably the most revolutionary nvidia video card.
Yep, and it also didn't sell well among gamers; it was a novelty that only became popular after falling under $100 a pop years later. Same here: financial analysts say the expected revenue from gaming products will drop in the near future, and Wall Street has already marked NVidia down. The product is good, but expensive; it is not going to sell in volume, and their revenue will drop in the coming quarters. Apple's XS phone was the same, but Apple started a buy-one-get-one campaign the very next day, plus upfront discounts and solid buyback of iPhones, and it's still not clear whether they will achieve volume and revenue growth within the priced-in expectations. These are public companies - they make money from Wall Street, and NVidia can lose much more and much faster on the capital markets than they would gain in profitability from lower-volume, high-end boutique products. This was a relatively sh**y launch - NVidia actually didn't want to launch anything; they want to sell their glut of GTX inventory first, but they have silicon already ordered and made at TSMC and couldn't just sit on it waiting...
I think it was actually much less, judging by comments made in one of the reviews I linked. Maybe around $350 or so, which was very expensive at the time. It is true that it was a revolutionary card, but at the same time it was greeted with a lukewarm reception from the gaming community. Much like the 20XX series. I doubt that the 20XX will seem as revolutionary in hindsight as the GeForce256 did, but the initial reception does seem similar between the two. Will be interesting to see what the next year brings to the table.
Wow, that's just $525 now. I'm interested in old card prices because some people claim they have always been super expensive. It seems they have selective memory. I've yet to find a card from that time period more expensive than the 2080 Ti. (Quick inflation math below.)
I'm not surprised that people still didn't buy many 256 cards. The previous cards were cheaper and performed close enough for the time.
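(For anyone wondering where figures like "$525 now" and "$750 by today's dollar" come from: a simple CPI adjustment. The ~1.5x multiplier from 1999 to 2018 is an approximation, not an exact CPI figure.)

# Approximate US CPI multiplier from 1999 to 2018 (assumed ~1.5x).
CPI_1999_TO_2018 = 1.5
for launch_price in (350, 500):
    print(f"${launch_price} in 1999 is roughly ${launch_price * CPI_1999_TO_2018:.0f} today")

That reproduces both numbers used in this thread: $350 -> ~$525 and $500 -> ~$750.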
I am pretty sure I'll get a 2080 Ti, simply because nothing else will run INT4- or INT8-based inference with similar performance, availability, and tools support (a tiny sketch of what INT8 quantization means is below, after this comment). Sure, when you are Baidu or Facebook you can buy even faster inference hardware, or if you are Google you can build your own. But if you are not, I don't know where you'll get something that comes close.
As far as gaming is concerned, my 1080ti falls short on 4k with ARK, which is noticeable at 43". If the 2080ti can get me through the critical minimum of 30FPS, it will have been worth it.
As far as ray tracing is concerned, I am less concerned about its support in games: Photo realism isn't an absolute necessity for game immersion.
But I'd love to see hybrid render support in software like Blender: the ability to pimp up the quality for video content creation and replace CPU-based render farms with something that is visually "awesome enough" points toward the real "game changing" capacity of this generation.
It pushes three distinct envelopes - raster, compute and render - and if you only care about one, the value may not be there. In my case, I like the ability to explore all three, and getting a 2080 Ti for myself lets me hand down a 1070 to one of my kids still running an R9 290X: Christmas for both of us!
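(Not this commenter's actual workflow, just a minimal NumPy sketch of what INT8 quantization of weights means: a per-tensor scale maps float32 values onto the int8 range, trading a little precision for much cheaper arithmetic on hardware with fast INT8 paths. All values here are toy numbers.)

import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)    # toy float32 weights

scale = np.abs(w).max() / 127.0                       # symmetric per-tensor scale
w_int8 = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
w_back = w_int8.astype(np.float32) * scale            # dequantized approximation

print("max abs quantization error:", np.abs(w - w_back).max())

INT4 works the same way with a much coarser grid (signed 4-bit only spans -8..7), hence more error but even cheaper math.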
In the end though, that's kinda the point: these are not gaming cards anymore and haven't been for some time. They are spin-offs from compute, where the real money and growth lie. We don't *need* ray tracing for gaming; that glosses over so many other far more relevant issues about what makes a good game.
High performance and a (more than) matching price. nVidia seemingly shifted the card classification down one notch (x80 => x70; Ti => x80; Titan => Ti) while keeping the prices, and overclocked them from day one, so it looks like solid progress if one disregards the price.
I think it will be a short lived (1 year or so) generation. A pricey stop gap with a few useless new features (because when devs catch up and actually deploy DXR enabled games, these cards will have been replaced by something faster).
Spelling/grammar errors (Only 2!): Wrong word: "All-in-all, NVIDIA is keeping the Founders Edition premium, now increased to $100 to $200 over the baseline" Should be: "All-in-all, NVIDIA is keeping the Founders Edition premium, now increased from $100 to $200 over the baseline" Missing "s": "Of course, NVIDIA maintain that the cards will provide expected top-tier" Should be: "Of course, NVIDIA maintains that the cards will provide expected top-tier"
Does the author think all gamers buy only the fastest cards? Maybe. But I doubt all of them buy the new generation card every year. In short, where are the comparisons to the 980/980 Ti and even 780/780 Ti? Owners of those cards are more interested in upgrading.
See the Bench link in the top menu on the right, where you can see results. I presume they'll add data to the huge database soon. And yes, people are talking about high-end GPUs, but most are spending $400 max on them.
Very nice review, by far the best one I've read. Thanks for that. How likely do you think the launch of another generation is in 2019 from Nvidia / and or something competitive from AMD based on 7nm?
I currently have a GTX 970, skipped the Pascal generation and was waiting for Turing. But I don't like being an early adopter, and feel that for pure rasterisation these cards aren't worth it. Yes, they are more powerful than the 10 series I skipped, but they also cost more - so performance per $$$ is similar, and I'm not willing to pay the same amount of $$$ for the same performance as I would have two years ago. Guess I'll just have to stick it out with my 970 at 1080p?
Just about every review of these cards states that right now they're disappointing and we need to wait and see how ray tracing games pan out to see if that will change.
We waited this many years to get the smallest generation-to-generation performance jump we have ever seen. The price went way up too. The cards run hotter and use more power, which makes me question how long they'll last before they die.
The weird niche Nvidia "features" these cards have will end up like PhysX.
The performance you get for what you pay for a 2080 or 2080 Ti is simply terrible.
V900, you've posted a lot of stuff here that was itself debatable, but that comment was just nonsense. I don't believe for a moment you think most tech sites consider these cards a worthy buy. The vast majority of reviews have been generally or heavily negative. I therefore conclude: troll.
Oof, still on the 12nm process. Frankly, it's quite remarkable how much rasterization performance they were able to squeeze out while also putting in the tensor and ray tracing cores. The huge dies are not surprising in that regard. In the end, architectural efficiency only goes so far; the fundamental limit is still the transistor budget. With that said, I'm guessing there's going to be a 7nm refresh pretty soon-ish? I would wait...
Don’t see a 7nm refresh on the horizon. Maybe in a year, probably not until 2020.
*There isn't any HP/high-density 7nm process available right now. (The only 7nm product shipping right now is the A12, and that's a low-power/mobile process. The 7nm HP processes are all in various forms of pre-production/research.)
*Price. 7nm processes are going to be expensive, and the Turing dies are gigantic and already expensive to make on their current node. That means Nvidia will most likely hold off on a 7nm Turing until prices have come down and the process is more mature.
*And then there’s the lack of competition: AMD doesn’t have anything even close to the 2080 right now, and won’t for a good 3 years if Navi is a mid-range GPU. As long as the 2080Ti is the king of performance, there’s no reason for Nvidia to rush to a smaller process.
The Kirin 980 has been shipping for a while and should be in stores in two weeks, and we know that at least 7nm Vega was sampling in June, so it depends on the allocation at TSMC; it's not 100% Apple.
The assumption under which this article operates - that the RTX 2080 should be compared to the GTX 1080 and the RTX 2080 Ti to the GTX 1080 Ti - is a disgrace. It lets you be overly satisfied with performance evolution between GPUs with vastly different price tags! It just shows that you completely bought the BS renaming of the Titan into the Ti. Of course the next gen Titan is going to perform better than the previous generation's Ti! Such a gullible take on these new products cannot be down to sheer stupidity alone.
Not sure why people are so negative about these and the prices. Sell your old card and amortize the cost over how long you'll keep the new one. So maybe $400/year (less if you keep it longer).
If you're a serious gamer, are you really not willing to spend a few hundred dollars per year on your hardware? I mean, the performance is there and it's somewhat future proofed (assuming things take off for RT and DLSS.)
A bowling league (they still have those?) probably costs more per year than this card. If you only play Minecraft I guess you don't need it, but if you want the highest setting in the newest games and potentially the new technology, then I think it's worth it.
The performance is not there. An actual performance boost of around 20% is not very convincing, especially given the much higher price. How can you be positive about it? The promise of future tech doesn't add that much, and it is not clear whether game developers will bother. When one spends $1000 on a GPU it has to deliver perfect, maxed-out 4K gaming, and NV charges even more. This is a joke; NV is just testing how much they can squeeze out of us before we simply don't buy.
The article clearly says that the Ti is 32% better on average.
The idea about future tech is you either do it and early adopters pay for it in hopes it catches on, or you never do it and nothing ever improves. Game developers don't really create technology and then ask hardware producers to support it/figure out how to do it. Dice didn't knock on Nvidia's door and pay them to figure out how to do ray tracing in real time.
My point remains though: if this is a favorite hobby/pastime, then it's a modest price to pay for what will be hundreds of hours of entertainment, plus the potential that ray tracing and DLSS and whatever else catches on and you get to experience it sooner rather than later. You're saying this card is too expensive, yet I can find console players who think a $600 video card is too expensive too. Different strokes for different folks. $1100 is not terrible value. You're talking hundreds of dollars here, not tens of thousands. It's a drop in the bucket in the scope of life.
"Ultimately, gamers can't be blamed for wanting to game with their cards, and on that level they will have to think long and hard about paying extra to buy graphics hardware that is priced extra with features that aren't yet applicable to real-world gaming, and yet only provides performance comparable to previous generation video cards."
So, I guess I can't read charts, because I thought they said the 2080 Ti was massively faster than anything before it. We also KNOW devs will take 40-100% perf improvements seriously (they've already said as much, and NV has 25 games in the works with support for their tech), and they will support NV's new tech since NV sells massively more cards than AMD.
Even the 2080 vs. 1080 is a great story at 4K, as the cards part by quite a margin in most stuff. I.e., Battlefield 1, 4K test: the 2080 FE scores 78.9 vs. 56.4 for the 1080 FE. That's a pretty big win; scoffing at it by calling it "comparable" is misleading at best, correct? Far Cry 5, same story: 57 for the 2080 FE vs. 42 for the 1080 FE. Again, a pretty massive gain for $100 more. Ashes, 74 to 61 fps (2080 FE vs. 1080 FE). Wolf 2, 100 fps for the 2080 FE vs. 60 for the 1080 FE... LOL. Well, 40% is, uh, "comparable perf"... ROFL. OK, I could go on, but whatever, dude. Would I buy one if I had a 1080 Ti? Probably not unless I had cash to burn, but many of the people who usually buy these things just laughed at the $100 premium... ROFL.
Never mind what these cards are doing to the AMD lineup. No reason to lower cards, I'd plop them on top of the old ones too, since they are the only competition. When you're competing with yourself you just do HEDT like stuff, rather than shoving down the old lines. Stack on top for more margin and profits!
$100 for future tech and a modest victory in everything or quite a bit more in some things, seems like a good deal to me for a chip we know is expensive to make (even the small one is Titan size).
Oh, I don't count that fold@home crap, synthetic junk as actual benchmarks because you gain nothing from doing it but a high electric bill (and a hot room). If you can't make money from it, or play it for fun (game), it isn't worth benchmarking something that means nothing. How fast can you spit in the wind 100 times. Umm, who cares. Right. Same story with synthetics.
It's future tech that cannot deliver *now*, so what's the point? The performance just isn't there, and it's a pretty poor implementation of what they're boasting about anyway (I thought the demos looked generally awful, though visual realism is something I care less about now anyway; games need to be better in other ways). Fact is, the 2080 is quite a bit more expensive than a new 1080 Ti for a card with less RAM and no guarantee these supposed fancy features are going anywhere. The 2080 Ti is even worse; it has the speed in some cases, but the price completely spoils the picture. Where I am, the 2080 Ti is twice the cost of a 1080 Ti, with no VRAM increase either.
NVIDIA spent the last 5 years pushing gamers into high frequency displays, 4K and VR. Now they're trying to do a total about face. It won't work.
They are both owned by Purch, the marketing company responsible for those annoying autoplay videos and the lowest-grade crap in the "From the web" section. They go by the motto: ad clicks over everything. Don't expect it to change anytime soon. Anand sold his soul twice: to Apple, and his site to Purch.
That's a perfect summary of why tom's looney article was so bad. If your current hw is doing just fine for the games you're playing atm, then upgrading makes no sense. It's rather cynical of NVIDIA, and some tech sites, to basically create a need and then push people into thinking they're idiots if they don't upgrade, while hiding behind very poor price/performance dynamics.
As someone who has worked in the game industry for 8+ years, I can tell you only one thing: nobody will rush to implement proprietary features! The ones that have demos, or are about to have the features implemented, are the ones Nvidia reached out to, not vice versa. These days, making a game is no different from making any other product on this planet. It is a corporate business, which means you want maximum profit, which translates into maximum user coverage, which translates into maximum platform coverage: PC (Windows, Mac, Linux), consoles and mobile. There is just no basis for comparing Vulkan to anything proprietary. Vulkan comes with the promise that what I make will look and feel the same visually across multiple platforms without requiring too much hassle on the development side. Even when you use a flexible engine like UE4, it is not that easy to have the same stuff working across multiple platforms, and changes and further development of materials and meshes are required just to make things look identical. So I can hardly imagine that while you are bogged down with tons of bugs and trying to deliver your product across multiple platforms, you will add yourself one more pain in the ass like nVidia ray tracing, which will have a doubtful effect on your title's income given the small user reach. I can give you Wargaming and Valve games as examples of old engines that are making tons of money. So while nVidia is trying to rip people off for that amount of money, I'm wondering how to optimise one level so it runs fast and looks cool on 6-year-old midrange hardware.
Although it is true that proprietary features do not always take off in a meaningful way (the GPU-accelerated mode of physx as an example), it doesn't mean an open standard would always be the popular choice.
Take big budget games from large publishers. These games, in the large majority of cases, are only available on three platforms, PS, xbox and windows, because these are the platforms that have the hardware to support such games and also have the largest audience.
IINM, Vulkan is not available on the X1 and PS4. If a AAA game dev were to use Vulkan on Windows, they'd still need to code the game for DirectX on X1 and GNM or GNMX on PS4, meaning they'd have to support three APIs.
If they go with directx on windows, then two platforms will already be covered and they'd only need to do additional coding for PS4 support.
On the other hand, Vulkan does make sense for devs of smaller games who want to cover as many platforms as possible, especially mobile, where Vulkan covers Windows, Linux, Mac, Android and I think iOS and even the Switch.
That's why I wrote it's not always the case. Sometimes it works, sometimes not. It all has to do with the standard having industry support. Being open source does not automatically mean it'd catch on, unfortunately.
Someone will and it will be great and then it will catch on. Or maybe it won't, but that's how things happen. Someone takes a risk and it pans out. Either contribute to it or don't, play safe or take a chance.
Nvidia has obviously been making some pretty good decisions over the years and has turned out some amazing products. Sometimes they've been wrong, but more often than not they are right. If they were wrong more than not, they'd be out of business, not a $150B company.
If you don't ever take a risk doing something new or cutting edge, you'll disappear. This is true for all technology.
Oh, and remember, at some point you don't have to care about 6 year old hardware. Look at consoles. At some point the studio just stops making a game for the last-gen, even though the new gen doesn't have the same size install base yet. Or a non-franchise game just shows up for next-gen and that's it. They never even bother to attempt to make it on older stuff.
Thanks for your comment, this was my argument all along.
The only way to force a new feature is by sheer numbers. Basically, if RTX were something available on new consoles, then it would make sense from a business standpoint; however, AMD owns the console market and might for a long time.
AMD should force multi-GPU via Infinity Fabric through consoles. This would work because devs would have access to additional power on the die via proper coding... and it would be delivered to 100% of the user base.
If this is only developed for less than 1% of the PC user base, this will fail miserably and nobody would add support unless sponsored by Nvidia themselves.
Financial analysts are seeing it and downgrades are coming.
I just want to be able to play all my games at 1440p, 60 FPS with all the eye candy turned on. Looks like my overclocked 1080 TI will be good for the immediate future is what I got from this review. The only real upgrade path is to the 2080 TI, and at $1200 that's an extremely hard sell.
Well, the problem is that in India retailers are not willing to reduce the price of the 1080 series. At present the 2080 is cheaper than all models of the 1080 Ti. Given the chance I will definitely go for the 2080; thing is, I will have to invest in a gaming monitor first.
I agree these are really for early adopters of RT, or for people doing a new build or in need of a new card who want it to last 3+ years and so need to catch the RT wave now.
I think the next generation of RT-enabled cards will probably be the optimal entry point; presumably they'll be able to roughly double RT performance on a 7nm process, which means the next xx70/80 products will actually have enough RT to match the resolution/framerate expectations of a high-end card, and that the RT core won't be too costly to put into xx50/60 tier SKUs. (If we even see a 2060 SKU, I don't think it will include RT cores at all, simply because the performance it could offer won't really be meaningful.)
More than a few things are conspiring against the price too. Aside from the specter of tariffs, the high price of all kinds of RAM right now, and the fact that this is a 12nm product rather than 7nm, it looks to me like the large and relatively monolithic nature of the RT core itself is preventing nV from salvaging/binning more dies. With the CUDA/tensor cores I'd imagine they build in some redundant units so they can salvage the SM even if there are minor flaws in the functional units, but since there's only one RT core per SM, any flaw there means the whole SM is out. That explains why the 2080 is based on the same GPU as the Ti, and why the 2070 is the only card based on the GPU that would normally serve the xx70 and xx80 SKUs. It's possible they might be holding onto the dies with too many flawed RT cores to repurpose them for the AI market, but that would compete with existing products.
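The binning argument can be made concrete with a toy yield model. The sketch below is purely illustrative (Python); the dies-per-wafer approximation and the defect density are assumptions, not foundry data. It shows how a larger monolithic die both fits fewer times on a 300mm wafer and is more likely to contain at least one defect, which matters more when a flaw in an un-duplicated block like an RT core cannot be fused off without losing the whole SM.

```python
# Illustrative sketch only: rough dies-per-wafer and Poisson yield math.
# Die areas are the commonly cited GP102/TU102 figures; defect density is assumed.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude gross-die estimate accounting for edge loss (ignores scribe lines, notch)."""
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r**2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Fraction of dies with zero defects under a simple Poisson model (assumed D0)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area in [("GP102, ~471 mm^2", 471), ("TU102, ~754 mm^2", 754)]:
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area)
    print(f"{name}: ~{gross} gross dies/wafer, ~{good:.0f} defect-free")
```

Under these assumed numbers the bigger die yields roughly half as many defect-free candidates per wafer, which is consistent with the salvage/binning pressure described above.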
Is there a graph error for BF1 99th percentile at 4k resolution? The 2080 TI FE is at 90, and the 2080 TI (non founders) is 68. How is it possible to have this gigantic difference when almost all other benchmarks and games they are neck and neck?
Are you willing to pay the extra $1-200 for a 2080 over a 1080 Ti for the same performance in current games, in exchange for the new Turing features (Ray-tracing and DLSS)?
I'm not convinced yet that the 2080 will be able to run ray-traced games at acceptable frame rates, but it is "more future-proof" for the extra money you pay.
Thing is, for the features you're talking about, the 2080 _is not fast enough_ at using them. I don't understand why more people aren't taking this onboard. NVIDIA's own demos show this to be the case, at least for RT. DLSS is more interesting, but the 2080 has less RAM. Until games exist that make decent use of these new features, buying into this tech now when it's at such a gimped low level is unwise. He's far better off with the 1080 Ti, that'll run his existing games faster straight away.
Way too expensive. If those cards were the same price as the 1080 / TI it would be a generation change.
This actually is a regression; the price has increased out of all proportion, even if you completely ignore that newer generations are EXPECTED to be a lot faster than the older ones for essentially the same price (plus a bit of inflation at most).
Paying 50% more for a 30% increase is a simple ripoff. No one should buy these cards... sadly a lot of people will let themselves get ripped off once more by nvidia.
And no: raytracing is not an argument here. This feature is not supported anywhere, and by the time it is adopted (if ever) years will have gone by and these cards will be old and obsolete. All of this is just marketing and hot air.
There are those who claim buying RTX to get the new features is sensible for future proofing; I don't understand why they ignore that the performance of said features on these cards is so poor. NVIDIA spent years pushing gamers into high frequency monitors, 4K and VR, now they're trying to flip everyone round the other way, pretend that sub-60Hz 1080p is ok.
And btw, it's a lot more than 50%. Where I am (UK) the 2080 Ti is almost 100% more than the 1080 Ti launch price. It's also crazy that the 2080 Ti does not have more RAM, it should have had at least 16GB. My guess is NVIDIA knows that by pushing gamers back down to lower resolutions, they don't need so much VRAM. People though have gotten used to the newer display tech, and those who've adopted high refresh displays physically cannot go back.
Why the tame conclusion? Do us a favor, will you? Just come out and say you'd have to be mental to pay these prices for features that aren't even available yet, and that you'd be better off buying previous-gen video cards that are currently heavily discounted.
Why does the 2080ti not use the tensor cores for GEMM? I have not seen any benchmark anywhere showing it working. It would be a good news story if Nvidia secretly gimped the tensor cores.
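For anyone wanting to check this themselves, one rough way to see whether tensor cores kick in for GEMM is to compare FP16 and FP32 matrix-multiply throughput: where FP16 GEMM is routed through the tensor cores, the gap is far larger than the roughly 2x you would expect from halved precision alone. A minimal sketch, assuming a CUDA-enabled PyTorch install (the matrix size and iteration count are arbitrary):

```python
# Minimal sketch: compare FP32 vs FP16 GEMM throughput on the GPU.
# A much-better-than-2x FP16 speedup suggests the tensor cores are being used.
import time
import torch

def bench(dtype, n=8192, iters=20):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        torch.matmul(a, b)
    torch.cuda.synchronize()
    elapsed = time.time() - start
    return 2 * n**3 * iters / elapsed / 1e12  # 2*n^3 FLOPs per GEMM, in TFLOPS

print(f"FP32: {bench(torch.float32):.1f} TFLOPS")
print(f"FP16: {bench(torch.float16):.1f} TFLOPS")
```

This is a blunt instrument (it measures whole-library behavior, not the cores directly), but it would at least show whether FP16 GEMM throughput scales the way tensor-core hardware should allow.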
ESR323 - Wednesday, September 19, 2018 - link
I agree with the conclusion that these cards aren't a good buy for 1080ti owners. My 1080ti overclocks very nicely and I'll be happy to stick with it until the next generation in 7 nm. By then we might have a decent selection of games that make use of ray tracing and the performance increase will be more appealing.imaheadcase - Wednesday, September 19, 2018 - link
Yah i agree, especially its only a 20-25fps increase on average. While many might thing thats great, considering the price increase over 1080TI and the fact many 1080TI can overclock to close that gap even more. The features don't justify the cost.However, it could be lots of performance could be unlocked via driver updates..we really don't know how tensor cores could increase performance till the games get updated to use it. Also, while super expensive option...how does the new SLI performance increase performance? Lets see a compare from 1080TI sli to newer sli 2080TI..maybe its easier to put into games? So many what-ifs with this product.
I feel this product should of been delayed till more games/software already had feature sets available to see.
Aybi - Thursday, September 20, 2018 - link
There won't be driver & optimization support for the 1000 series. They will focus on the 2000 series and with that the gap is going to increase a lot.
If you remember the 980ti and 1080ti, it was the same case when the 1080ti was announced, and then you know what happened.
Vayra - Friday, September 21, 2018 - link
Actually I don't and there is also no data to back up what you're saying. The 980ti still competes with the 1070 as it did at Pascal launch.
Don't spread BS
Matthmaroo - Sunday, September 23, 2018 - link
Dude that’s not true at all
Nvidia will fully support the 10 series for the next 5-10 years
They all use the same CUDA cores
Don’t just make crap up to justify your purchase
SanX - Thursday, September 20, 2018 - link
What a useless job the reviewer is doing comparing only to latest-generation cards. Add at least the 980Ti and 780Ti
MrSpadge - Thursday, September 20, 2018 - link
Ever heard of their benchmark database?
Ryan Smith - Thursday, September 20, 2018 - link
You'll be glad to hear then that we'll be backfilling cards.There was a very limited amount of time ahead of this review, and we thought it to be more important to focus on things like clocking the cards at their actual reference clocks (rather than NVIDIA's factory overclocks).
dad_at - Sunday, September 23, 2018 - link
Many thanks for that, I think it is useful job, people are still using maxwell(or even older) generation GPU in 2018. And when we could expect maxwell (980/980ti) results to appear in GPU 2018 bench? Could you also please add Geforce GTX Titan X (maxwell) to GPU 2018?StevoLincolnite - Sunday, September 23, 2018 - link
Hopefully you back-fill a substantial amount, the GPU bench this year has been a bit lacking... Especially in regards to mid-range and older parts.
Whole point of it is so that you can see how the latest and greatest compares to your old and crusty.
Hixbot - Friday, September 21, 2018 - link
I'm not sure how midrange 2070/2060 cards will sell if they're not a significant value in performance/price compared to 1070/1060 cards. If AMD offer no competition, Nvidia should still compete with itselfWwhat - Saturday, September 22, 2018 - link
It's interesting that every comment I've seen says a similar thing and that nobody thinks of uses outside of gaming.I would think that for real raytracers and Adobe's graphics and video software for instance the tensor and RT cores would be very interesting.
I wonder though if open source software will be able to successfully use that new hardware or that Nvidia is too closed for it to get the advantages you might expect.
And apart from raytracers and such there is also the software science students use too.
And with the interest in AI currently by students and developers it might also be an interesting offering.
Although that again relies on Nvidia playing ball a bit.
michaelrw - Wednesday, September 19, 2018 - link
"where paying 43% or 50% more gets you 27-28% more performance"1080 Ti can be bought in the $600 range, wheres the 2080 Ti is $1200 .. so I'd say thats more than 43-50% price increase..at a minimum we're talking a 71% increase, at worst 100% (Launch MSRP for 1080 Ti was $699)
V900 - Wednesday, September 19, 2018 - link
Which is the wrong way of looking at it.NVIDIA didn’t just increase the price for shit and giggles, the Turing GPUs are much more expensive to fab, since you’re talking about almost 20 BILLION transistors squeezed into a few hundred mm2.
Regardless: Comparing the 2080 with the 1080, and claiming there is a 70% price increase, is a bogus logic in the first place, since the 2080 brings a number of things to the table that the 1080 isn’t even capable of.
Find me a 1080ti with DLSS and that is also capable of raytracing, and then we can compare prices and figure out if there’s a price increase or not.
imaheadcase - Wednesday, September 19, 2018 - link
It brings it to the table... on paper, more like it. You literally listed the two things that are not really shown AT ALL.
mscsniperx - Wednesday, September 19, 2018 - link
No, actually YOUR logic is bogus. Find me a DLSS or Raytracing game to bench.. You can't. There is a reason for that. Raytracing will require a Massive FPS hit, Nvidia knows this and is delaying you from seeing that as damage control.Yojimbo - Wednesday, September 19, 2018 - link
There are no ray tracing games because the technology is new, not because NVIDIA is "delaying them". As far as DLSS, I think those games will appear faster than ray tracing.
Andrew LB - Thursday, September 20, 2018 - link
Coming soon:
Darksiders III from Gunfire Games / THQ Nordic
Deliver Us The Moon: Fortuna from KeokeN Interactive
Fear The Wolves from Vostok Games / Focus Home Interactive
Hellblade: Senua's Sacrifice from Ninja Theory
KINETIK from Hero Machine Studios
Outpost Zero from Symmetric Games / tinyBuild Games
Overkill's The Walking Dead from Overkill Software / Starbreeze Studios
SCUM from Gamepires / Devolver Digital
Stormdivers from Housemarque
Ark: Survival Evolved from Studio Wildcard
Atomic Heart from Mundfish
Dauntless from Phoenix Labs
Final Fantasy XV: Windows Edition from Square Enix
Fractured Lands from Unbroken Studios
Hitman 2 from IO Interactive / Warner Bros.
Islands of Nyne from Define Human Studios
Justice from NetEase
JX3 from Kingsoft
Mechwarrior 5: Mercenaries from Piranha Games
PlayerUnknown’s Battlegrounds from PUBG Corp.
Remnant: From The Ashes from Arc Games
Serious Sam 4: Planet Badass from Croteam / Devolver Digital
Shadow of the Tomb Raider from Square Enix / Eidos-Montréal / Crystal Dynamics / Nixxes
The Forge Arena from Freezing Raccoon Studios
We Happy Few from Compulsion Games / Gearbox
Funny how the same people who praised AMD for being the first to bring full DX12 support, even though only 15 games used it in the first two years, are the same people sh*tting on nVidia for bringing a far more revolutionary technology that's going to be in far more games in a shorter time span.
jordanclock - Thursday, September 20, 2018 - link
Considering AMD was the first to bring support to an API that all GPUs could have support for, DLSS is not a comparison. DLSS is an Nvidia-only feature and Nvidia couldn't manage to have even ONE game on launch day with DLSS.
Manch - Thursday, September 20, 2018 - link
AMD spawned Mantle, which then turned into Vulkan. They also pushed MS to develop DX12, as it was in both their interests. These APIs can be used by all.
DLSS, while potentially very cool, is, as Jordan said, proprietary. Like HairWorks and other such crap, it will get light support, but when it comes to feature sets devs will spend most of their effort building to common ground. With consoles being AMD GPU based, guess where that will be.
It will be interesting to see how AMD ultimately responds, i.e. G-Sync/FreeSync, CUDA/OpenCL, etc.
As Nvidia has stated, these features are designed to work with how current game engines already function so they (the devs) don't have to reinvent the wheel. Ultimately this means the integration won't be very deep, at least not for a while.
For consumers the end goal is always better graphics at the same price point when new releases happen.
Not that these are bad cards, just expensive, and two very key features are unavailable, and that sucks. Hopefully the situation will change sooner rather than later.
Dribble - Thursday, September 20, 2018 - link
DLSS basically gives you a resolution jump for free (e.g. 4K for 1440p performance) and is really easy to implement. That's going to take off fast and probably means even the 2070 will be faster than the 1080Ti in games that support it.
Lolimaster - Saturday, September 22, 2018 - link
No not free, everyone can see the blurry mess the renamed blur effect is.
Inteli - Saturday, September 22, 2018 - link
TIL that when you stop isolating variables in a benchmark, a lower-end card can be faster than a higher-end card.
tamalero - Wednesday, September 19, 2018 - link
Die size is irrelevant to consumers. They see price vs performance, not how big the silicon is.
AMD was toasted for having hot, slow chips many times... so did nvidia. Big and hot means nothing if it doesn't perform as expected for the insane prices they're asking for.
Yojimbo - Wednesday, September 19, 2018 - link
Die size is not irrelevant to consumers because increased die size means increased cost to manufacture. Increased cost to manufacture means a pressure for higher prices. The question is what you get in return for those higher prices.
People like what they know... what they are used to. If some new AA technique comes along and increases performance significantly but introduces visual artifacts it will be rejected as a step backwards. But if a new technology comes along that has a significant performance cost yet increases visual quality much more significantly than the aforementioned artifacts decrease it, people will also have a tendency to reject it. That is, until they become familiar with the technology... That's where we are with RTX. No one can become familiar with the technology when there are no games that make use of it. So trying to judge the value of the larger die sizes is an abstract thing. In a few months the situation will be different.
Personally, I think the architecture will be remembered as one of the biggest and most important in the entire history of gaming. There is so much new technology in it that some of it barely anyone is saying much about (where have you heard about texture space shading, for example?). Several of these technologies will have their greatest benefits with VR, and if VR had taken off people would be marveling about this architecture immediately. But I think that VR will eventually take off, and I think several of these technologies will become the standard way of doing things for the next several years. They are new and complicated for developers, though. Only a few developers are prepared to take advantage of the stuff today. It's going to be some time before we really can put the architecture into its proper historical perspective.
From the point of view of a purchase today, though, it's a bit of an unknown. If you buy a card now and plan to keep it for 4 years, I think you'd be better off getting a 20 series than a 10 series. If you buy it and keep it for 2 years, then it's a bit less clear, but we'll have a better idea of the answer to that question in 6 months, I think.
I do think, though, that if an architecture with this much new stuff were introduced 20 years ago everybody, including games enthusiast sites like Anandtech, would be going gaga over it. The industry was moving faster then and people were more optimistic. Also the sites didn't try to be so demure. Hmm, gaga as the opposite of demure. Maybe that's why she's called Lady Gaga.
Santoval - Wednesday, September 19, 2018 - link
I agree that this might be the most game-changing graphics tech of the last couple of decades, and that the future belongs to ray-tracing, but I also think that precisely due to the general uncertainty and the very high prices Nvidia might suffer one of their biggest sales slumps this generation, if not *the* biggest. They did not handle the launch well: it is absurd to release a new series with zero ray-traced, DLSS-supporting or mesh-shaded games at launch.
Their extensive NDAs, lack of information and ban on benchmarks between the Gamescom pre-launch and the actual launch, despite going live with (blind faith based) preorders during that window, were also controversial and highly suspicious. It appears that Nvidia gave graphics cards to game developers very late to avoid leaks, but that resulted in having no RTX-supporting games at launch. They apparently decided they could not have it both ways, but choosing leak prevention over having RTX-supporting games at launch was a very risky gamble.
Since their sales will almost certainly slump, they should release 7nm-based graphics cards (either a true 30xx series or Turing at 7nm, I guess the latter) much sooner, probably in around 6 months. They might have expected a sales slump, which is why they released the 2080 Ti now. I suppose they will try to avoid it with aggressive marketing and somewhat lower prices later, but it is not certain they'll succeed.
eddman - Thursday, September 20, 2018 - link
Would you still have defended this if it was priced at $1500? How about $2000? Do you always ignore price when new tech is involved?
The cards themselves aren't bad. They are actually very good. It's their pricing.
These cards, specifically 2080 Ti, are overpriced compared to their direct predecessors. Ray tracing, DLSS, etc. etc. they still do not justify such prices for such FPS gains in regular rasterized games.
A 2080 Ti might be an ok purchase for $850-900, but certainly not $1200+. Even 8800 GTX with its new cuda cores and new generation of lighting tech launched at the same MSRP as 7800 GTX.
These cards are surely more expensive to make, but there is no doubt that the biggest factor for these price jumps is that nvidia is basically competing with themselves. Why price them lower when they can easily sell truckloads of pascal cards at their usual prices until the inventory is gone.
Andrew LB - Thursday, September 20, 2018 - link
You, like so many others don't get it. nVidia has re-worked their product lines. Didn't you notice how the Ti came out at the same time as the 2080? You might also notice that Titan is now called Titan V (volta) and not GTX Titan. Titan is now in its own family of non-gaming cards and that is reflected in the driver section on their site. They now have titan specific drivers.
Here, watch this. Jay explains it fairly well.
https://youtu.be/5XRWATUDS7o?t=6m2s
eddman - Thursday, September 20, 2018 - link
You took an opinion and decided it's a fact. It's not. That guy is not the authority on graphics cards.
There is no official word that titan is now 2080 Ti. Nvidia named that card 2080 Ti, it has a 102 named chip. Nvidia themselves constantly compare it to 1080 Ti, which also has a 102 named chip, therefore it's the successor to 1080 Ti and it's very normal to expect similar pricing.
Don't worry, there will be a Titan turing, considering that 2080 Ti does not even use the fully enabled chip.
It's really baffling to see people, paying customers, defending a $1200 price tag. It is as if you like to be charged more.
eddman - Thursday, September 20, 2018 - link
$1000, but it's still too high, and you cannot find any card at that price anyway.
Bp_968 - Sunday, December 2, 2018 - link
Even though the review is older and this comment is a few months old I just wanted to jump in and say "hah, look, eddman was right!" Now that the Titan RTX leaks are showing up. Lol. They didn't even wait for supply to stabilize on the 2080ti before dropping the titan.
Plus, if the 2080 replaced the 1080ti then why is it more expensive and no faster? That would be a first even for Nvidia..
PeachNCream - Thursday, September 20, 2018 - link
The model numbers aren't that significant. NVIDIA could just have easily released a 2080, a 2070, and a 2060 by putting different labels on the boxes of the 2080 Ti, the 2080, and the 2070 for instance. The Ti, the Titan, all of those are long standing marketing identities that buyers now automatically associate with a certain relative scale of performance among other GPUs of the same generation. NVIDIA can play upon buyer expectations by releasing various products to fill those expectations in the way that best advances the company's interest. Any company with enough brand recognition can easily do the same. Consider Intel's long-running i-series CPU numbering. The fact that something labeled as a Ti came out at a certain time isn't an example of technological development, but a way of meeting customer expectations in reflection of the MSRP. We would have balked much more at $1200 for the exact same product if it was labeled as a plain vanilla 2080 and the current vanilla 2080 was branded as a 2070. Instead, we say, "Well, the 2080 Ti is really expensive, but at least its a Ti so that makes it a little bit more reasonable."
eddman - Thursday, September 20, 2018 - link
Model numbers are significant in the way that they point out the models in the same successive line up. That's the entire point of them.
I and a lot of people are not in this "we" you talk about. Again, nvidia themselves compare it to 1080 Ti every chance they get, so I do not see why I should in any way think its price is "reasonable".
That's not how past generational leaps worked, even for 8800 GTX. We got massive performance gains AND usually new rendering features at similar MSRPs or maybe a bit higher. The difference this time is that AMD has left the building, for now.
PeachNCream - Thursday, September 20, 2018 - link
Don't misunderstand me. I'm not implying that the price is okay or that anyone should find it reasonable to stomach a $1200 MSRP for a mere graphics card. I also agree that part of the pricing problem is due to an absence of credible competition from AMD. I'm just arguing that the people in the NVIDIA marketing department may justify the price in part by slapping a Ti label on the box so consumers are less likely to balk during checkout. The reality is that we're getting a step sideways in performance for a noteworthy increase in TDP due to the addition of capabilities that may or may not actually add much value, because said features are too demanding to play nicely at high resolutions and because there are no indications that the software side will move to take advantage of them. At best, the addition of the hardware won't be very compelling until the next generation of GPUs after Turing, when it's likely that performance will pick up a bit.
Then again, who am I to talk? I play PC games on a laptop with an HD 4000 infrequently and end up mostly gaming on my ancient dual-core KitKat-era phone that I've been keeping as a cheap wireless mini tablet. To me, PC gaming became an overly pricey sink of my limited, single-parent free time. I'd rather bank my spare money in something that yields interest over time than throw it into gaming hardware that's obsolete in a matter of a few years. That and my kids need me to be both of their parents these days since that worthless ex of mine schlepped off to marry some woman in Canada. *grumble*
tamalero - Thursday, September 20, 2018 - link
More like they are pricing their high-end cards as if they were flagship cards.
The 2080 Founders Edition seems identical in price to a 1080 Ti. That is unacceptable, especially when they are almost identical in performance (the 2080 going slower in most games by a few small points).
They (Nvidia) just want to clear the huge build-up of Pascal cards... by charging insanity to those who are willing to claim to be "gamers" with money. Period.
tamalero - Thursday, September 20, 2018 - link
"You, like so many others don't get it. nVidia has re-worked their product lines. Didn't you notice how the Ti came out at the same time as the 2080?"What the hell does this has to do? Nothing for the consumer again.
tamalero - Thursday, September 20, 2018 - link
"Die size is not irrelevant to consumers because increased die size means increased cost to manufacture. Increased cost to manufacture means a pressure for higher prices. The question is what you get in return for those higher prices."
You're repeating the same thing.
Die size means NOTHING to a consumer. It means something for the manufacturer because it costs THEM.
If the die doesn't bring any benefit at all (Fermi) compared to smaller dies that offer almost the same performance (Pascal), why would the consumer have to pay MORE for LESS?
New tech is nothing if there is nothing to show. And there is NOTHING to show right now.
By the time raytracing becomes really viable, the new generation of cards will be out.
Spunjji - Friday, September 21, 2018 - link
This x1000. These cards are a necessary step towards getting the technology out there, but I'm thoroughly unconvinced that it is a good idea for anyone to buy them. The sacrifice in die area was too great, for far too little benefit. Given the strong indications that 1080p ~45fps is where real-time raytracing will be at right now, I just don't care. They sold me on high-resolution and high-framerate because those actually affect how much enjoyment I get from my games. I'm not interested in that rug being pulled from under my feet *and* paying damn near double price for the privilege.
Morawka - Wednesday, September 19, 2018 - link
Doesn't TSMC charge their customers by the wafer nowadays?
PopinFRESH007 - Wednesday, September 19, 2018 - link
How does that matter? Are you suggesting that magically makes the die size irrelevant? If you have a 300mm wafer and you double the die size, you also halve the number of die per wafer. This would also ignore yield. A larger die is more costly to produce because you get fewer die per wafer and increase the probability of having a defect within a die.
Santoval - Wednesday, September 19, 2018 - link
The problem is that it does not bring those things to the current table, but is going to bring them to a future table. Essentially they expect you to buy a graphics card whose advanced features no current game can use, merely on faith that it both will deliver them in the future *and* that they will be worth the very high premium.
If there is one ultimate unwritten rule when buying computers, computer parts or anything really, it must be this one: never buy anything based on promises of *future* capabilities - always make your purchasing decisions based on what the products you buy can deliver *now*. All experienced computer and console consumers, in particular, must have that maxim engraved on their brain after having been burnt by so many broken promises.
Writer's Block - Monday, October 1, 2018 - link
That is certainly true; 'we promise', politicians and companies selling their shit use it a lot... And break it about as often.
Inteli - Wednesday, September 19, 2018 - link
It's not that the price increase wasn't warranted, at least from the transistor count perspective, it's that there's not a lot to show for it.
Many more transistors...concentrated in Tensor cores and RTX cores, which aren't being touched in current games. The increased price is for a load of baggage that will take at least a year to really get used (and before you say it, 3 games is not "really getting used"). We're used to new GPUs performing better in current games for the same price, not performing the same in current games for the same price (and I'm absolutely discounting everything before 2008 because that was 10 years ago and the expectations of what a new μArch should bring have changed).
I get the whole "future of gaming" angle you're pushing, and it's a perfectly valid reason to buy these new GPUs, but don't act like an apples-to-apples comparison of performance *right now* is the "wrong way of looking at it". How the card performs right now is an important metric for a lot of people, and will influence their decision. Especially when we're talking a potential price difference of $100+ (with sales on 1080 Ti's, and FE 2080 prices). Obviously there isn't a valid comparison for the 2080 Ti, but anyone who can drop $1300 on a GPU probably doesn't care too much about the price tag.
Flunk - Thursday, September 20, 2018 - link
Nvidia is charging what they are because they have no competition at the top end. That's it, nothing else. They're taking in the cash today in preparation for having to price more competitively later.
just4U - Thursday, September 20, 2018 - link
Flunk, we are talking Nvidia here... typically speaking they don't lower prices to compete. Sometimes they bump prices too high and too few bite, but that's about it. The last time they lowered prices to compete was the 400 series, but they'd just come off getting zonked by AMD for basically 2 generations... and when they went to the 500 series it was fairly competitive with AMD (initially they were better, but AMD continued to improve their 5000/6000 series until it was consistently beating Nvidia... did they lower prices? NO, not one bit).
TNT cards were competitive and cheap, but once Nvidia knocked off all other contenders (aside from AMD) and started in with their GeForce line they have always carried premiums, competition or not.
eddman - Thursday, September 20, 2018 - link
GTX 280, launched at $650 because they thought AMD couldn't do much. AMD came up with 4870. What happened? Nvidia cut the card's price to $500 a mere month after launch. So yes, they do cut prices to compete.
Dragonstongue - Thursday, September 20, 2018 - link
13.6 and 18.6 (bln transistors estimated), die size of 454/754mm2 (2080/2080Ti), 12nm
7.2 and 12 (bln transistors estimated), die size of 314/471 (1070/1080-1080Ti/TitanX), 16nm
yes it is "expensive", no doubt about that, but it is Nv we are talking about; there is a reason they are as overvalued as they are. They produce as cheaply as possible and rack the prices up as much as they can, even when their actual cards shipped are nowhere near the $$$ figure they report as they should.
also, if anything else, they always have and always will BS the numbers to make themselves ALWAYS appear "supreme", no matter if it is actual power used, TDP, API features, or transistor count etc etc etc.
as far as the ray tracing crap... if they used an open-source style, so that everyone could use the exact same ray tracing engine and be directly compared to see how good they are or not, then it might be "worthy". But it is Nv: they are and continue to be the "it has to be our way or you don't play" type. I remember way back when with PhysX (Nv bought out Ageia to do it), when Radeons were able to use it (before Nv took the ability away) they ran circles around comparable Nv cards AND used less CPU and power to do it.
Nv does not want to get "caught" in their BS, so they find nefarious ways around everything, and when you have a massive amount of $$$$$$$$$$$$$$$$ floating everything you do, it is not hard for them to "buy silence". Intel has done so time and time again, Nv does so time and time again........ blekk
DLSS or whatever the fk they want to call it means jack shit when only specific cards will be able to use it, instead of being a truly open-source initiative where everyone/everything gets to show how good they are (or not) and also stands to benefit from others putting effort into making it as good as it possibly can be... there is a reason why Nv barely supports Vulkan: because they are not "in control" and it is way too easy to "prove them wrong"... funny, because Vulkan has ray tracing "built in"
IMO if they are as good as they claim they are, they would do everything in the light to show they are "the best", not find ways to "hide" what they are doing..... their days are numbered.... hell, their stock price just took a hit.... good IMHO, because they should not be over $200 anyways, $100 maybe, but they absolutely should not be valued above others whose financials and product shipments are magnitudes larger.
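Taking the transistor and area figures quoted above at face value, a quick back-of-the-envelope density check helps explain why the Turing dies are so large: 12nm is essentially a refined 16nm, so more transistors mean almost proportionally more silicon rather than a denser chip. One caveat: TU104 (the 2080) is more commonly listed at roughly 545 mm² rather than the 454 mm² quoted above, so this sketch uses 545; all figures are approximate.

```python
# Back-of-envelope transistor density from the (approximate) figures discussed above.
chips = {
    "GP104 (1070/1080, 16nm)":        (7.2e9, 314),
    "GP102 (1080 Ti/Titan X, 16nm)":  (12.0e9, 471),
    "TU104 (2080, 12nm)":             (13.6e9, 545),  # ~545 mm^2 is the commonly listed area
    "TU102 (2080 Ti, 12nm)":          (18.6e9, 754),
}
for name, (transistors, area_mm2) in chips.items():
    print(f"{name}: {transistors / area_mm2 / 1e6:.1f} MTr/mm^2")
```

Under these numbers the density barely moves between 16nm Pascal and 12nm Turing (roughly 23-26 MTr/mm² across the board), so the extra transistors for RT and tensor cores show up almost one-for-one as extra die area and cost.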
Spunjji - Friday, September 21, 2018 - link
Remind me why consumers should give a rat's ass about die size, other than its visible effects on price and performance.
If you want to sell me a substantially larger, more expensive chip that performs a little better for a lot more money, a better reason is needed than "maybe it will make some games that aren't out yet really cool in a way that we refuse to give you any performance indications about".
Screw that.
Writer's Block - Monday, October 1, 2018 - link
They look poor value; good performance, sure. But a 1080ti offers the same for much less.
They want me to buy promises! Seriously, promises are never worth the paper they are printed on - digital or the real stuff.
Writer's Block - Monday, October 1, 2018 - link
Oh, and yeah, agree.
Vayra - Friday, September 21, 2018 - link
Why would I want a feature like DLSS when current AA methods do the job fine and we can also just run at native, higher resolution anyway and not use any AA whatsoever?
And why would anyone care about vaporware like RTRT?
Lolimaster - Saturday, September 22, 2018 - link
No DLSS, NO.
Blur.
It's not capable of raytracing; it just raytraces small parts of a frame, in selected scenes, in selected games...
Gastec - Thursday, September 27, 2018 - link
You must be joking, right? What do we care if the price of manufacturing increased for Nvidia? We are not supporters, we are clients. We don't have to support their pricing because WE ARE NOT PARTNERS! Let Nvidia reduce their costs by cutting the salaries of their CEOs and other worthless corporate officers. Then I will BUY their 2080Ti product, at the consumer-friendly price of €750.
Andrew LB - Thursday, September 20, 2018 - link
Except for the fact that the non founders edition is $999, not $1200. And the GTX 1080ti released at $699 but for the better part of the past two years cost substantially more.
eva02langley - Thursday, September 20, 2018 - link
Then find one at that price, genius.
3rd parties are OCing their cards and offering additional cooling solutions; they will all be over the MSRP and close to the FE.
Also, they use GDDR6... you didn't learn anything from Vega and HBM2?
jeffcd57 - Thursday, September 20, 2018 - link
Agree the cost is ridiculous. I haven't paid it and won't. Got too many children to raise. I've never seen them at the above-mentioned price.
jeffcd57 - Thursday, September 20, 2018 - link
1080 Ti for $600, where, when?
ezridah - Thursday, September 20, 2018 - link
A month ago.
https://www.theverge.com/good-deals/2018/8/21/1776...
TheJian - Thursday, September 20, 2018 - link
So buy a 1080ti. For some the new features are worthy and boost perf quite massively making it truly worth it if those technologies are the new way forward. At worst, a good 25 games are already coming with NV's new tech. Many are huge titles most would like to play surely.
Also as Hexus noted a while back, the price to make these things is just below MSRP. Note the small chip is as large as a titan, and the larger chips...WOW. That's a lot of transistors for a game card. Also apples vs. oranges here, as said by others 1080 etc can't do raytracing or dlss.
https://nvidianews.nvidia.com/news/nvidia-rtx-plat...
27 games coming with NV tech. Will they look the same on 1080ti or less? NOPE. Will they be faster and BETTER looking on RTX...YEP. Value is in the eye of the beholder ;)
Spunjji - Friday, September 21, 2018 - link
The problem is, you just counted twice when you said "will they be faster and better looking on RTX".
The absolute truth of it from what little we can glean so far (after the official launch!!!) is that you can have RTX effects /OR/ you can have your better performance, not both. That's a heavy caveat!
It would be one thing if it were a proposition of waiting a couple of months for some amazing features that will knock your socks off and have few drawbacks. It's another to be paying over the odds for a card now to maybe get some cool stuff that will DEFINITELY run slower and at a lower res than you're used to.
Billstpor - Friday, September 21, 2018 - link
Wrong. It's already known that the tensor cores have enough juice to run ray-traced effects and DLSS at the same time:
https://youtu.be/pgEI4tzh0dc?t=10m55s
Vayra - Monday, September 24, 2018 - link
Wrong, the tensor cores need DLSS to run ray tracing at somewhat bearable FPS - that is, 30 to 60.
DLSS is a way to reduce the amount of rays to cast.
Vayra - Monday, September 24, 2018 - link
Hence the non-existent improvement, or even worse position in terms of quality compared to SSAA x4 or better.
In other words, running at native 4K is miles sharper and will perform miles better than a DLSS+RTRT combination.
IUU - Sunday, September 23, 2018 - link
While your argument is solid, these days are just so weird that even $1200 cards seem to make a hell of a lot of sense. This is also valid for similar desktop CPUs.
Why? Well, go buy a high-end iPhone or a high-end Android phone... Enough said.
PS. For those who may use arguments like Geekbench and such, it is just insulting, to put it very very kindly!
Gastec - Thursday, September 27, 2018 - link
So basically buying high-priced electronics makes sense because the companies selling them just increase the prices every year to make profits, and a certain type of consumer supports those companies by purchasing no matter what the price (call them fanbois). The question is: why, why are those people acting like that? What drives them?
watek - Wednesday, September 19, 2018 - link
Consumers paying these premium prices for features that are not even fully developed or finished is mind boggling!! People are being bent over and screwed by Nvidia hard yet they still pay $1500 to be beta testers until next Gen.
V900 - Wednesday, September 19, 2018 - link
Mindboggling? I suppose it would be for a time traveller visiting from the 19th century, but for everyone else it’s perfectly normal.
There is always a price premium for those early adopters who want to live on the cutting edge of technology.
When DVD players came out, they cost over a 1000$ and the selection of movies they could watch was extremely small. When Blu-ray players came out, they also cost well over 1000$ and the entire catalogue of Blu-ray titles was a dozen movies or so.
And keep in mind, that the price that Nvidia charges for joining the early adopter club is really shockingly low.
When OLED or 4K televisions first came out, people paid tens of thousands of dollars for a set, and the selection of 4K entertainment to watch on them was pretty much zero.
With the 2080, early adopters can climb aboard for 600-1000$.
Games that take advantage of DLSS and RTX will be here soon and in the meantime they have the most powerful graphics card on the market that will play pretty much anything you can throw at it, in 4K without breaking a sweat.
It’s not a bad deal at all.
imaheadcase - Wednesday, September 19, 2018 - link
Again, two technologies that have not even seen the real light of day, let alone been proven worth it at all. Early adopters of the other techs you listed at least got WORKING TECH from the start, as promised.
V900 - Wednesday, September 19, 2018 - link
Ok, you're either deliberately spreading untruths and FUD, or you just haven't paid attention.
There are games NOW that support RTX and DLSS. Games like Shadow of the Tomb Raider, Control and PUBG.
And there are more games coming out THIS YEAR with RTX/DLSS support: Battlefield 5 is one of them.
The next Metro is one of many games coming out in early 2019 that also support RTX.
So tell me again how this is different from when Blueray players came out?
imaheadcase - Wednesday, September 19, 2018 - link
Those games you listed don't have it now, they are COMING. lol Even then the difference is not even worth it considering the games hardly take a hit on the 1080TI. You are the nvidia shill on here and forums, as everyone knows.
imaheadcase - Wednesday, September 19, 2018 - link
Because bluray players played movies from the start; they delivered what they promised from the start even if they cost a lot? Duh.
PopinFRESH007 - Thursday, September 20, 2018 - link
They played DVDs from the start. Your statement is false.
imaheadcase - Thursday, September 20, 2018 - link
Umm nope, it's true.
Spunjji - Friday, September 21, 2018 - link
Yeah, there was media available at launch. Also Blu-Ray provided a noticeable jump in both quality AND resolution over DVD. RTX provides maybe the first and definitely not the second.V900 - Wednesday, September 19, 2018 - link
And it’s clear that you didn’t read the article, or skimmed it at best, if you’re claiming that “the two technologies have not even seen the real light of day”.
The tools are out there, developers are working with them, and not only are there many games on the way that support them, there are games out now that use RTX.
Let me quote from the review:
“not only was the feat achieved but implemented, and not with proofs-of-concept but with full-fledged AA and AAA games. Today is a milestone from a purely academic view of computer graphics.”
tamalero - Wednesday, September 19, 2018 - link
Development means nothing unless games are released, as plans get cancelled, budgets get cut, and technology is replaced or converted/merged into a different standard.
You just proved yourself wrong with your own quote. lol
Guess what? The Python language is out there, let's all develop games with it! All the tools are available! It's so easy! /sarcasm
Ranger1065 - Thursday, September 20, 2018 - link
V900 shillage stench.
PopinFRESH007 - Wednesday, September 19, 2018 - link
Just like those HD-DVD adopters, Laser Disc adopters, BetaMax adopters. V900 is pointing out that early adopters accept a level of risk in adopting new technology to enjoy cutting-edge stuff. This is no different than Bluray or DVDs when they came out. People who buy RTX cards have "WORKING TECH" and will have few options to use it, just like the 2nd wave of Bluray players. The first Bluray player actually never had a movie released for it and it cost $3800.
"The first consumer device arrived in stores on April 10, 2003: the Sony BDZ-S77, a $3,800 (US) BD-RE recorder that was made available only in Japan.[20] But there was no standard for prerecorded video, and no movies were released for this player."
Even 3 years after that, when they actually had a standard and studios would produce movies for them, the players that were out cost over $1000 and there were a whopping 7 titles available. Similar to RTX being the fastest cards available for current technology, those Bluray players also played DVDs (gasp).
imaheadcase - Wednesday, September 19, 2018 - link
Again, the point is bluray WORKED out of the box even if expensive. This doesn't even have any way to test the other stuff. You are literally buying something for an FPS boost over previous gens that is not really a big one at that. It'd be a different tune if lots of games already had the tech in hand from nvidia, had it in games just not enabled... but it's not even available to test, which is silly.
PopinFRESH007 - Thursday, September 20, 2018 - link
You are simply ignoring the facts. When Bluray players launched they didn't play Blurays, because there were none, because there was no standard. It took 3 years before there was a standard and 7 movies were released. Before then they were just high-end DVD players.
These RTX cards also work out of the box. It's crazy, I know: they actually can play current games, all with the highest settings and fastest frame rates. Similar to what happened with bluray, they will also support those new-fangled DXR and DLSS options in games as they come out.
imaheadcase - Thursday, September 20, 2018 - link
"First Blu-ray movies would be released on June 20, coinciding with the release of the first Blu-ray DVD player from Samsung, and a Sony VAIO Blu-ray PC." (Jun 13, 2006)
The first batch distributed by Sony was on June 20th, 2006:
The Fifth Element
50 First Dates
Hitch
House of Flying Daggers
The Terminator
Underworld: Evolution
xXx
What was that again about facts?
V900 - Thursday, September 20, 2018 - link
Working out of the box? You mean like the RTX 2080/Ti/2070 cards?
You're willfully ignoring facts and pretending that it's totally up in the air whether games will support RTX, and that they won't be available for a long time...
Which is entirely false.
There are games that support RTX out right now. Like Tomb Raider
More are coming this year: One of the biggest titles this year: Battlefield 5 supports it.
And dozens of titles supporting RTX, many of them big AAA titles, are coming out in H1 2019.
So no: Nobody is buying a card they “can’t even test.” If you buy an RTX card, it’ll work out of the box with RTX technology enabled.
And it’s of course also the fastest card on the market for all the old titles that don’t support RTX.
sonny73n - Thursday, September 20, 2018 - link
@V900
If you can't compare performance per dollar of this gen and the previous, why even bother praising the new tech when currently none of the tech sites have any means to test it?
After reading your first comment, I couldn’t help but think that you’re a paid shill for Nvidia.
imaheadcase - Thursday, September 20, 2018 - link
You just ignored the facts and even proved yourself wrong in your own statement. lol
Let me sell you a car with the fastest engine, but I'm not going to let you use all the horsepower... but I promise I'll enable it when I get the right parts for it. Don't worry about the $100k price tag on the car, it's going to be awesome, I swear.
Do you get paid for every post by nvidia, or just a lump sum?
mscsniperx - Wednesday, September 19, 2018 - link
You are comparing apples to oranges. There isn't even a raytracing game to compare to.. And when the truth comes out that even with RTX, there is a massive FPS hit.. well, it's game over.
DigitalFreak - Wednesday, September 19, 2018 - link
First you complain that there aren't any raytracing games to get benchmarks from, then you state there will be a massive performance hit. If there are no games available to test with, you can't have any idea what their performance will be like.
tamalero - Wednesday, September 19, 2018 - link
The slides/demos from Nvidia and the Battlefield beta, brother... They show performance that could potentially be "above" what users would expect.
imaheadcase - Wednesday, September 19, 2018 - link
Hello, Hairworks would like a word with you..
CoachAub - Wednesday, September 19, 2018 - link
Laser discs were a big hit! ;)
tamalero - Wednesday, September 19, 2018 - link
Unlike a video card, DVDs were a STANDARD, set to replace the DVD. This wasn't a war between BETAMAX and VHS again. It was an evolution.
And as you said it, they had a few titles coming on.
Nvidia is currently offering ZERO options for the insane prices they charge.
Even those 4k TVs you mentioned.. had demos and downlodable content.
It was the future.
Nvidia's game in some of these RTX features are solely of Nvidia, not a global standard.
tamalero - Wednesday, September 19, 2018 - link
Errata: set to replace the "CDs", not DVDs... They really do need an edit button here.
Writer's Block - Monday, October 1, 2018 - link
Not a great comparison.
Mainly because the number of games making use of RTX and other new features is zero.
OLED and 4K/DVD/Blu-ray: pretty much zero/extremely small/a dozen or so - none of the aforementioned was as low as zero, so the consumer could see what they were getting.
boozed - Wednesday, September 19, 2018 - link
Early adopters have always paid over the odds for an immature experience. That's the decision they make. You pays your money and you takes your chances...
Gastec - Thursday, September 27, 2018 - link
Yes, drug addicts would also agree.
ianmills - Wednesday, September 19, 2018 - link
RTX - Radeon technology cross off the list. Nvidia is free to price as they please XD
NikosD - Wednesday, September 19, 2018 - link
1080 Ti vs 980 Ti ~ 70% for 50$ more MSRP
1080 vs 980 ~ 60% for 50$ more MSRP
2080 Ti vs 1080 Ti ~ 30% for 300$ more MSRP (actual price difference is a lot more)
Please, let's boycott Turing cards.
nVidia must learn its lesson.
Skip it.
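Put numerically, the generational value change is just performance gained divided by price paid. A small sketch using the uplift percentages and "$50 more MSRP" deltas cited above; the base MSRPs ($549 for the 980, $649 for the 980 Ti) and the 2080 Ti's $999/$1,199 prices are filled in from launch pricing, so treat the exact figures as approximate:

```python
# Relative performance-per-dollar change between generations,
# using the uplift figures from the comment above (approximate).
transitions = [
    ("980 -> 1080",                    1.60, 549, 599),
    ("980 Ti -> 1080 Ti",              1.70, 649, 699),
    ("1080 Ti -> 2080 Ti (MSRP)",      1.30, 699, 999),
    ("1080 Ti -> 2080 Ti (Founders)",  1.30, 699, 1199),
]
for name, perf_ratio, old_price, new_price in transitions:
    price_ratio = new_price / old_price
    value_ratio = perf_ratio / price_ratio
    print(f"{name}: {perf_ratio:.2f}x perf for {price_ratio:.2f}x price "
          f"-> {value_ratio:.2f}x perf-per-dollar")
```

On those assumptions, Pascal delivered roughly 1.5x the performance per dollar of Maxwell, while the 2080 Ti lands below 1.0x its predecessor, which is exactly the regression being complained about.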
V900 - Wednesday, September 19, 2018 - link
If you look at AMD's Vega and compare it with the previous AMD flagship, Fury, you see a similar 30-40% increase in performance.
In other words: this isn't Nvidia wanting to rip gamers off, it's just a consequence of GPU makers pushing up against the end of Moore's law.
formulaLS - Wednesday, September 19, 2018 - link
It IS nVidia wanting to rip off gamers. Their prices are absolutely a huge rip off for what you get.
DigitalFreak - Wednesday, September 19, 2018 - link
Blame AMD for not competing. Nvidia would never be able to do this if AMD had a competitive offering.
Fritzkier - Wednesday, September 19, 2018 - link
Blame both. Why the f would you blame AMD for NVIDIA's own fault?
And yes, AMD had competitive offerings in the mid-range, not the high end. But that's before 7mm. Let's see what we will get with 7mm. 7mm will be released next year anyway, it's not that far off.
PopinFRESH007 - Wednesday, September 19, 2018 - link
Yep, let's wait for those 7mm processes. Those chips should only be the size of my computer, with a couple hundred thousand transistors.
Holliday75 - Friday, September 21, 2018 - link
Haha I was about to question your statement until I paid more attention to the process size he mentioned.
Fritzkier - Saturday, September 22, 2018 - link
We seriously needs an edit button. Thanks autocorrect.
Yojimbo - Wednesday, September 19, 2018 - link
So you are saying that if AMD were competitive then NVIDIA could never have implemented such major innovations in games technology... So, competition is bad?
dagnamit - Thursday, September 20, 2018 - link
Competition can stifle innovation when the market is involved in a race to see how efficiently it can leverage current technology. The consumer GPU market has been about the core count/core efficiency race for a very long time.
Because Nvidia has a commanding lead in that department, they are able to add in other technology without falling behind AMD. In fact, they've been given the opportunity to start an entirely new market with ray-tracing tech.
There are a great many more companies developing ray-tracing hardware than rasterization focused hardware at the current moment. With Nvidia throwing their hat in now, it could mean other companies start to bring hardware solutions to the fore that don’t have a Radeon badge. It won’t be Red v. Green anymore, and that’s very exciting.
Spunjji - Friday, September 21, 2018 - link
Your Brave New World would involve someone else magically catching up with AMD and Nvidia's lead in conventional rasterization tech. Spoiler alert: nobody has in the past 2 decades, and the best potential competition, Intel, isn't entering the fray until ~2020.
dagnamit - Sunday, September 23, 2018 - link
No. I'm saying that companies that specialize in ray-tracing technology may have an opportunity to get into the consumer discrete GPU market. They don't need to catch up with anything.
eva02langley - Thursday, September 20, 2018 - link
It's not AMD's fault if Nvidia is asking $1200 US. Stop blaming AMD because you want to purchase Nvidia cards at a better price; BLAME Nvidia!
It is not AMD who forces Ray Tracing on us. It is not AMD who wants to provide GameWorks tools to sabotage the competition and gamers at the same time. It is not AMD charging us the G-Sync tax. It is not AMD that screws gamers for the wallets of investors.
It is all Nvidia's fault! Stop defending them! There are no excuses.
BurntMyBacon - Thursday, September 20, 2018 - link
I accept that nVidia's choices are their own and not the "fault" of any third party. On the other hand, nVidia is a business and their primary objective is to make money. Manufacturing GPUs with features and performance that customers find valuable is a tool to meet their objective. So while their decisions are their own responsibility, they are not unexpected. Competition from a third party with the same money making objective limits their ability to make money as they now have to provide at least the perception of more value to the customer. Previous generation hardware also limits their ability to make money as the relative increase in features and performance (and consequently value) are less than if the previous generation didn't exist. If the value isn't perceived to be high enough, customers won't upgrade from existing offerings. However, if nVidia simply stops offering previous generation hardware, new builds may still be a significant source of sales for those without an existing viable product.
Long story short, since there is no viable competition from AMD or another third party to limit nVidia's prices, it falls to us as consumers to keep the prices in check through waiting or buying previous gen hardware. If, however, consumers in general decide these cards are worth the cost, then those who are discontent simply need to accept that they fit into a lower price category of the market than they previously did. It is unlikely that nVidia will bring prices back down without reason.
Note: I tend to believe that nVidia got a good idea of how much more the market was willing to pay for their product during the mining push. Though I don't like it (and won't pay for it), I can't really blame them for wanting the extra profits in their own coffers rather than letting it go to retailers.
Holliday75 - Friday, September 21, 2018 - link
Good thing there are cops around to keep me honest. If they weren't I'd go on a murder spree and blame them for it.
Yojimbo - Wednesday, September 19, 2018 - link
It's NVIDIA making a conscious decision to spend its engineering resources on innovating and implementing new technologies that will shift the future of gaming instead of spending that energy and die space on increasing performance as much as it can in today's titles. If NVIDIA left out the RT cores and other new technologies they could have easily increased performance 50 or 60% in legacy technologies by building chips bigger than Pascal but smaller than Turing, while increasing prices only moderately. Then everyone would be happy getting a card that would be leading them into a gaming torpor. In a few years when everyone is capable of running at 4k and over 60 fps they'd get bored and wonder why the industry was going nowhere.
NikosD - Wednesday, September 19, 2018 - link
nVidia has done the same thing in the past, introducing new technologies and platforms like tessellation, PhysX, HairWorks, GameWorks, GPP, etc. All of these proved to be just tricks in order to kill competition, like always, which nowadays means to kill AMD.
Pseudoraytracing is not an innovation or something mandatory for gaming.
It's just another premature technology that the opponent doesn't have in order to be nVidia unique again with huge cost for the consumer and performance regression.
I repeat.
Skip that Turing fraud.
maximumGPU - Thursday, September 20, 2018 - link
I don't think it's fair to compare ray tracing to HairWorks... ray tracing is a superior way to render graphics compared to rasterisation, there's no question about this.
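[Editor's note: since the thread keeps arguing rasterization vs. ray tracing in the abstract, here is a minimal Python sketch of the primitive at the heart of any ray tracer, a ray-sphere intersection test, casting one ray per pixel of a tiny ASCII "image". It is purely illustrative: real-time RTX hardware traverses BVHs over triangle meshes, which this toy does not attempt.]

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None.
    `direction` is assumed to be normalized (so a == 1 in the quadratic)."""
    oc = [o - c for o, c in zip(origin, center)]          # vector from center to origin
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    discriminant = b * b - 4.0 * c
    if discriminant < 0:
        return None                                        # ray misses the sphere
    t = (-b - math.sqrt(discriminant)) / 2.0
    return t if t > 0 else None                            # ignore hits behind the camera

# Cast one ray per pixel of an 8x4 "image" toward a sphere at z = -3.
width, height = 8, 4
sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
for y in range(height):
    row = ""
    for x in range(width):
        # Map the pixel to a point on an image plane at z = -1.
        px = (x + 0.5) / width * 2 - 1
        py = 1 - (y + 0.5) / height * 2
        length = math.sqrt(px * px + py * py + 1)
        direction = (px / length, py / length, -1 / length)
        hit = ray_sphere_intersect((0, 0, 0), direction, sphere_center, sphere_radius)
        row += "#" if hit else "."
    print(row)
```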
Lolimaster - Saturday, September 22, 2018 - link
But with what? Nvidia RTX only does it on a small part of a FRAME, on selected scenes, on tensor cores repurposed for that. You will need tensor cores in the 100's to make Nvidia's implementation more "wowish", 1000's to actually talk about raytracing being a thing.
Consoles dictate gaming progress, AMD holds that.
Lolimaster - Saturday, September 22, 2018 - link
Exactly, to start talking about actual raytracing or at least most of the parts of a scene, we need 10-100x the current GPU performance.
Yojimbo - Saturday, September 22, 2018 - link
GPP was a partner promotion program. Hairworks is part of Gameworks. PhysX is part of Gameworks. Gameworks is not a trick, and neither is the PhysX part of it. But neither of them compare to ray tracing. Maybe you should like up what the word "pseudo" means, because you're using it wrong.In 1 year or a year and a half AMD will have their own ray tracing acceleration hardware and then you'll be all in on it.
As for killing AMD, NVIDIA are not interested in it. It wouldn't be good for them, anyway. NVIDIA are, however, interested in building their platform and market dominance.
Yojimbo - Saturday, September 22, 2018 - link
Edit: look up*
Eris_Floralia - Thursday, September 20, 2018 - link
I've read all ur comments and still struggle to find any consistent logic.
eva02langley - Thursday, September 20, 2018 - link
Nvidia is throwing Ray Tracing development down the throats of gamers. We are paying for something that we didn't even want at first. You didn't even know about Ray Tracing and DLSS before it was announced. You are just drinking the Kool-Aid, unlike many of us who stand up and rage against these INDECENT prices.
Midwayman - Thursday, September 20, 2018 - link
You *should* want ray tracing. It's freaking awesome. I think the question really is if it is worth the trade-off yet.
Fritzkier - Saturday, September 22, 2018 - link
I agree with you. Even though Nvidia shouldn't have priced RTX that high, we still want ray tracing.
mapesdhs - Wednesday, September 26, 2018 - link
I couldn't give a hoot either way, I just want games that make sense and are believable, that's far more important than how a game looks. If an object cannot be used or behave in a manner that corresponds to its appearance, then what's the point? Everyone went mental about the puddle in the PS4 game, but did anyone stop to ask whether the water on the ground was wet? Likewise, th RTX demo of that fire effect (which looked grud awful anyway), is the fire hot? Can it melt the glass if fired close enough? Can I break the glass? Use a shard as a weapon? Would an enemy reveal their position by walking on the fragments, or do the pieces just fade away because they're nothing more than a fancy PhysX visual? Can I throw a grenade into the cabin to make the glass explode and harm passing enemies?World interactivity, object function and unexpected complexity & behaviour makes for a far more immersive game than any amount of ray tracing can ever provide. A glazed china teapot can look glorious with complex reflections & suchlike, but if I can't use it to make tea than it's not a teapot. If I can't open a door, close it, lock it, break it down, etc., then it's not a door. People are obsessed with visuals in games atm because they've been told to be. The sheep behaviour of consumers with all this is utterly mind boggling.
That aside, these Turing cards are simply not fast enough for doing RT effects anyway. NVIDIA has spent the last five years hyping people up for high frequency gaming, 4K and VR, all things which need strong fill rates (rasterisation performance). Those who've gotten used to high frequency monitors physically cannot go back, the brain's vision system adapts, standard 60Hz suddenly looks terrible to such users. Now all of a sudden NVIDIA is trying to tell the very crowd with money to spend, who've largely jumped onto the HF/4K/VR bandwagon, that they should take a huge step backwards to sub-60Hz 1080p, at prices which make no sense at all. That's absolutely crazy, doubly so when dual-GPU is dead below the 2080, a card which is not usefully faster than a 1080 Ti, costs more and has less RAM.
Gastec - Thursday, September 27, 2018 - link
1000 thumbs-ups sensei! :)
Writer's Block - Monday, October 1, 2018 - link
+1
I'm an occasional gamer; I'd be more than an occasional gamer if games did what you suggest.
Gastec - Thursday, September 27, 2018 - link
Like that freak said: "How much of your life do you not want to be Ray traced?" or some similar abomination.
webdoctors - Thursday, September 20, 2018 - link
?? I knew about ray tracing before it was announced. Ray tracing isn't a new technology, it's been around for more than 25 years; the idea might predate computers.
Who DOESN'T want ray tracing?!
You can argue you don't want to pay a premium for it, but that's not the same thing.
mapesdhs - Wednesday, September 26, 2018 - link
I just want better games, I don't care whether they're ray traced or not. This is why I like Subnautica so much, functionally it's a far more interesting and engaging game than most I've seen recently, even though the visuals are not as sophisticated. I had been spending much time playing Elite Dangerous, but that game has become very wide with no depth, it lacks the interactivity and depth that Subnautica captures nicely. And re my comments above, see: http://www.sgidepot.co.uk/reflections.txt
sonny73n - Thursday, September 20, 2018 - link
@V900
Are you gonna reply to every comment to justify Nvidia’s rip-offs? lol
BurntMyBacon - Thursday, September 20, 2018 - link
@V900: "If you look at AMDs Vega and compare it with the previous AMD flagship: Fury, you see a similar 30-40% increase in performance.In other words: This isn’t Nvidia wanting to rip gamers off, it’s just a consequence of GPU makers pushing up against the end of Moore’s law."
Point of consideration: Though VEGA did see a lesser performance increase (not sure how accurate 30%-40% is), the MSRP of Vega64 ($500) was less than the MSRP of the FuryX ($650) and even the Fury ($550).
https://en.wikipedia.org/wiki/AMD_Radeon_Rx_300_se...
https://en.wikipedia.org/wiki/AMD_RX_Vega_series
Spunjji - Friday, September 21, 2018 - link
If that were true then Nvidia could have left off the RTX parts this time around and created a GPU that offers a simple ~30% performance improvement at roughly the same retail cost.Following that, the die-area benefits from 7nm could have been spent on both RTX features and another ~30% performance boost at a similar or slightly-higher cost. By then they could probably have added enough resources to at least manage high refresh rates at 1080p, if not 2.5K
Instead they massively inflated their die for features that require you to accept resolutions and frame-rates that PC gaming left behind 6 years ago.
mapesdhs - Wednesday, September 26, 2018 - link
That last sentence is something I wish tech sites would emphasise a lot more. It very much defines how those who normally buy into the higher tier tech now regard what they like doing and why. NVIDIA pushed hard to create the market for high-refresh gaming, 4K & VR, now suddenly they're trying to do an about-face. I can't see how it can work. I just bought a 27" 1440p IPS panel for 200 UKP, the cost of good screens has come down a lot, and now NVIDIA wants us to drop back down to 1080p? :D I get the impression the reaction of a great many is just laughter.
Gastec - Thursday, September 27, 2018 - link
Ahaa! You are getting close :) Come on, just spell it: they want to "milk" us as much as possible before Moore's Law ends and we will completely stop upgrading our PC's and we'll just replace the defective part twice in a life time. No more billions of moneyz for Corporate Commander :)Yojimbo - Wednesday, September 19, 2018 - link
"Please, let's boycott Turing cards."Throw down your chains and resist!
Ranger1065 - Thursday, September 20, 2018 - link
100%
Xex360 - Wednesday, September 19, 2018 - link
These cards are a disappointment for the price; the 2080 Ti should be priced at most $800. It just doesn't offer the performance required to justify its price. Worse, here they compared it to the 1080 Ti FE, which as GamerNexus pointed out is not ideal, for those cards are noticeably slower than other cards with proper cooling, so the 1080 Ti is at least as fast as the 2080.
On the ray tracing side, I like the technology but it's not impressive enough to justify the hefty price tag. I'd rather have a real generational leap, with a 2070 beating a 1080 Ti and a 2080 Ti having at least 70% more performance, than have RT; it's a niche feature and only a few games will benefit from it. The whole DLSS thing isn't good either, limited to only a few games; with more brute force we could achieve 4K and supersampling.
kron123456789 - Wednesday, September 19, 2018 - link
"I'd rather have a real generational leap with a 2070 beating a 1080ti and a 2080ti having at least 70% more performance than having RT"That reminded me of a very old quote:
"If I had asked people what they wanted, they would have said ‘faster horses.’" — Henry Ford
saikrishnav - Wednesday, September 19, 2018 - link
That quote only makes sense if Nvidia came up with a "different" radical product than a graphical horse. They just made a slightly faster horse with a RTX ON button which nobody is ready to push yet i.e. developers. So, if you have a choice between a much faster horse and a RTX ON button - one would take a much faster horse. Now, when developers are ready to push the button/envelope, and sign on to the RTX, then this quote makes sense. Nvidia is asking customers to pay the price of new tech-adoption without show-casing the products that use it. They could have invested with devs and in games, to use the RTX, and then released it. But no, they want to fill in a gap until 7nm arrives.Yojimbo - Wednesday, September 19, 2018 - link
Nobody was ready to push the mass produced automobile button, yet, either. Do you think Ford started mass producing cars and then immediately there were roads and gas stations? No, at first horses could comfortably go many more places than cars could.His quote is entirely appropriate.
There is no gap to fill before 7 nm arrives since AMD will have no competition. NVIDIA introduced this now because they see value in the product which will generate sales. Plus it will get the ball rolling on developers implementing the new technologies that are present in the architecture and will be present in future NVIDIA architectures.
BurntMyBacon - Thursday, September 20, 2018 - link
Have to agree here. Not only were automobiles extremely limited in where they could go on introduction, they were also very loud and considered disruptive to society, with a large voice of opposition. These new cards at least have the benefit of being able to go anywhere their predecessors can while still enabling new capabilities.
I very much agree that nVidia is using this architecture to "get the ball rolling" on the new tech. They are probably very much aware that sales of RTX cards will be lower until they can fit a meaningful amount of the new hardware resources into a mainstream chip. Though, given the size of the chips and typical associated yields, nVidia may still end up selling every chip they can make.
eddman - Thursday, September 20, 2018 - link
It still doesn't justify their prices. Great cards, finally ray-tracing for games, horribly cutthroat prices.
Yojimbo - Saturday, September 22, 2018 - link
So don't buy it, eddman. In the end the only real justification for prices is what people are willing to pay. If one isn't able to make a product cheaply enough for it to be sold for what people are willing to pay then the product is a bad product.I don't understand why you are so worried about the price. Or why you think they are "cut-throat". A cut-throat price is a very low price, not a high one.
eddman - Sunday, September 23, 2018 - link
There is a wealthy minority who'd pay that much, and? It's only "justified" if you are an nvidia shareholder.
The cards are overpriced compared to last gen and that's an absolute fact. Your constant defending of nvidia's pricing is certainly not a normal consumer behavior.
mapesdhs - Wednesday, September 26, 2018 - link
Yojimbo is right that an item is only ever worth what someone is willing to pay, so in that sense NVIDIA can do what it likes, in the end it's up to the market, to consumers, whether the prices "make sense", ie. whether people actually buy them. In this regard the situation we have atm is largely that made by gamers themselves, because even when AMD released competitive products (whether by performance, value, or both), people didn't buy them. There are even people saying atm they hope AMD can release something to compete with Turing just so NVIDIA will drop its prices and thus they can buy a cheaper NVIDIA card; that's completely crazy, AMD would be mad to make something if that's how the market is going to respond.What's interesting this time though is that even those who in the past have been happy to buy the more expensive cards are saying they're having major hesitation about buying Turing, and the street cred which used to be perceived as coming with buying the latest & greatest has this time largely gone, people are more likely to react like someone is a gullible money pumped moron for buying these products ("More money than sense!", as my parents used to say). By contrast, when the 8800 GTX came out, that was a huge leap over the 7800 and people were very keen to get one, those who could afford it. Having one was cool. Ditto the later series right through to Maxwell (though a bit of a dip with the GTX 480 due to heat/power). The GTX 460 was a particularly good release (though the endless rebranding later was annoying). Even Pascal was a good bump over what had come before.
Not this time though, it's a massive price increase for little gain, while the headline features provide sub-60Hz performance at a resolution far below what NVIDIA themselves have been pushing as desirable for the last 5 years (the focus has been on high frequency monitors, 4K and VR); now NVIDIA is trying to roll back the clock, which won't work, especially since those who've gotten used to high frequency monitors physically cannot go back (ref New Scientist, changes in the brain's vision system).
Thus, eddman is right that the card's are overpriced in a general sense, as they don't remotely match what the market has come to expect from NVIDIA based on previous releases. However, if gamers don't vote with their wallets then nothing will change. Likewise, if AMD releases something just as good, or better value, but gamers don't buy them, then again nothing will change, we'll be stuck with this new expensive normal.
I miss the Fermi days, buy two GTX 460s to have better performance than a GTX 580, didn't cost much, games ran great, and the lesser VRAM didn't bother me anyway as I wasn't using an uber monitor. Now we have cards that cost many hundreds that don't even support multi-GPU. It's as daft as Intel making the cost entry point to >= 40 PCIe lanes much higher than it was with X79 (today it's almost 1000 UKP); an old cheapo 4820K can literally do things a 7820X can't. :D
Alas though, again it boils down to individual choice. Some want the fastest possible and if they can afford it then that's up to them, it's their free choice, we don't have the right to tell people they shouldn't buy these cards. It's their money afterall (anything else is communism). It is though an unfortunate reality that if the cards do sell well then NVIDIA will know they can maintain this higher priced and more feature restricted strategy, while selling the premium parts to Enterprise. Btw, it amazes me how people keep comparing the 2080 to the 1080 Ti even though the former has less RAM; how is that an upgrade in the product stack? (people will respond with ray tracing! Ray tracing! A feature which can't be used yet and runs too slow to be useful anyway, and with an initial implementation that's a pretty crippled implementation of the idea aswell).And why doesn't the 2080 Ti have more than 11GB? It really should, unless NVIDIA figures that if they can indeed push people back to 1080p then 11GB is enough anyway, which would be ironic.
I'm just going to look for a used 1080 Ti, more than enough for my needs. For those with much older cards, a used 980 Ti or 1070, or various AMD cards, are good options.
Ian.
Yojimbo - Wednesday, September 19, 2018 - link
Yes, exactly. A very appropriate quote.Skiddywinks - Thursday, September 20, 2018 - link
No reason Ford couldn't have done both though. There is no technological reason nVidia could not have released a GTX 2080 Ti as well. But they know they couldn't charge as much, and the vast majority of people would not buy the RTX version. Instead, it makes their 1080 Ti stock look much more appealing for value-oriented gamers, helping them shift that stock as well as charge a huge price for the new cards.
It's really great business, but as a gamer and not a stockholder, I'm salty.
Spunjji - Friday, September 21, 2018 - link
Ford didn't invent the car, though. Ford invented a way to make them cheaper.
Ford's strategy was not to make a new car that might do something different one day and then charge through the effing nose for it.
Gastec - Thursday, September 27, 2018 - link
That quote applies perfectly to our digital electronic World: we want to go faster from point A to point B. To do that, Henry Ford gave us a car (a faster "horse"). We want the same from GPUs and CPU's, to be faster. Prettier sure, pink even. But first just make it fast.
Writer's Block - Monday, October 1, 2018 - link
Except there is no evidence he said that - it is a great statement though, and conveys the intended message well.
Hxx - Wednesday, September 19, 2018 - link
Overall disappointing performance. The RTX 2080 is a flat-out bad buy at $800+ when 1080 Ti custom boards are as low as $600, and the RTX 2080 Ti is a straight-up ripoff when consumers can easily surpass its performance with 2 x 1080 Tis. I agree with the conclusion, though, that you are buying hardware that you won't take advantage of yet. Still, if Nvidia wants to push this hardware to all gamers, they need to drop the pricing in line with its performance, otherwise not many will buy into the hype.
Qasar - Wednesday, September 19, 2018 - link
Just checked a local store: the lowest priced 2080 card, a Gigabyte RTX 2080, is $1080, and that's Canadian dollars... The most expensive RTX card, an EVGA RTX 2080 Ti XC ULTRA GAMING 11GB, is $1700!!!! Again, that's Canadian dollars!! To make things worse, that's PRE ORDER pricing, with this disclaimer: "Please note that the prices of the GeForce RTX cards are subject to change due to FX rate and the possibility of tariffs. We cannot guarantee preorder prices when the stock arrives - prices will be updated as needed as stock become available."
Even if I could afford these cards, I think I would pass. Just WAY too expensive. I'd prob grab a 1080 or 1080 Ti and be done with it... IMO, Nvidia is being a tad bit greedy just to protect and keep its profit margins.. but they CAN do this, cause there is no one else to challenge them...
PopinFRESH007 - Wednesday, September 19, 2018 - link
would you care to share the bill of materials for the tu102 chip? Since you seem to suggest you know the production costs, and therefor know the profit margin which you suggest is a bit greedy.Qasar - Wednesday, September 19, 2018 - link
popin.. all I am trying to say is Nvidia doesn't have to charge the prices they are charging.. but they CAN because there is nothing else out there to provide competition...
tamalero - Thursday, September 20, 2018 - link
Please again explain how the cost of materials is somehow relevant to the price/performance argument for consumers? Chips like R600, Fermi and similar were huge.. did it matter? NO. Did performance matter? YES.
PopinFRESH007 - Thursday, September 20, 2018 - link
I specifically replied to Qasar's claim "nvida is being a tad bit greedy just to protect and keep its profit margins.. but, they CAN do this" which is baseless unless they have cost information to know what their profit margins are.
Nagorak - Thursday, September 20, 2018 - link
Nvidia is a public company. You can look up their profit margin and it is quite high.
Qasar - Thursday, September 20, 2018 - link
PopinFRESH * sigh * i guess you will never understand the concept of " no competition, we can charge what ever we want, and people will STILL buy it cause it is the only option if you want the best or fastest " it has NOTHING to do with knowing cost info or what a companies profit margins are... but i guess you will never understand this....just4U - Thursday, September 20, 2018 - link
Of course they're being greedy. Since they saw their cards flying off the shelves at 50% above MSRP earlier this year, they know people are willing to pay.. so they're pushing the limit. As they normally do.. this isn't new with Nvidia. Not sure why any are defending them.. or getting excessively mad about it. (..shrug)
mapesdhs - Wednesday, September 26, 2018 - link
Effectively, gamers are complaining about themselves. Previous cards sold well at higher prices, so NVIDIA thinks it can push up the pricing further, and reduce product resources at the same time even when the cost is higher. If the cards do sell well then gamers only have themselves to blame, in which case nothing will change until *gamers* stop making it fashionable and cool to have the latest and greatest card. Likewise, if AMD does release something competitive, whether via price, performance or both, then gamers need to buy the damn things instead of just exploiting the lowered NVIDIA pricing as a way of getting a cheaper NVIDIA card. There's no point AMD even being in this market if people don't buy their products even when it does make sense to do so.BurntMyBacon - Thursday, September 20, 2018 - link
@PopinFRESH007: "would you care to share the bill of materials for the tu102 chip? Since you seem to suggest you know the production costs, and therefor know the profit margin which you suggest is a bit greedy."You have a valid point. It is hard to establish a profit margin without a bill of materials (among other things). We don't have a bill of materials, but let me establish some knowns so we can better assess.
Typically, most supporting components on a graphics card are pretty similar to previous generation cards. Often times different designs used to do the same function are a cost cutting measure. I'm going to make an assumption that power transistors, capacitors, output connectors, etc. will remain nominally the same cost. So I'll focus on differences. The obvious is the larger GPU. This is not the first chip made on this process (TSMC 12nm) and the process appears to be a half node, so defect rates should be lower and the wafer cost should be similar to TSMC 14nm. On the other hand, the chip is still very large which will likely offset some of that yield gain and reduce the number of chips fabricated per wafer. Pascal was first generation chip produced on a new full node process (TSMC 14nm), but quite a bit smaller, so yields may have been higher and there were more chips fabricated per wafer. Also apparent is the newer GDDR6 memory tech, which will naturally cost more than GDDR5(X) at the moment, but clearly not as much as HBM2. The chips also take more power, so I'd expect a marginal increase for power related circuitry and cooling relative to pascal. I'd expect about the same cost here as for maxwell based chips, given similar power requirements.
From all this, it sounds like graphics cards based on Turing chips will cost more to manufacture than Pascal equivalents. It is probably not unreasonable to suggest that a TU106 may have a similar bill of materials cost to a GP102, with the note that the cost to produce the GP102 has most certainly dropped since introduction.
I'll leave the debate on how greedy or not this is to others.
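[Editor's note: nobody outside NVIDIA has the actual bill of materials, but the die-size effect BurntMyBacon describes can be roughed out. The sketch below uses a standard dies-per-wafer approximation and a Poisson yield model; the die areas are the commonly reported ~754 mm² (TU102) and ~471 mm² (GP102), while the defect density is an illustrative assumption, not a known TSMC figure.]

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common first-order approximation for gross dies on a round wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

# Die sizes as commonly reported; defect density is an illustrative guess.
chips = {"TU102 (2080 Ti)": 754, "GP102 (1080 Ti)": 471}
d0 = 0.2  # defects per cm^2 -- assumption, not a known foundry figure

for name, area in chips.items():
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area, d0)
    print(f"{name}: {gross} gross dies/wafer, ~{good:.0f} good (at D0={d0}/cm^2)")
```

Even with generous assumptions, the bigger die roughly halves the number of good chips per wafer, which is the (partial) cost argument being made above.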
Qasar - Friday, September 21, 2018 - link
burntmybacon, just like popinfresh, I guess you will never understand the concept of "no competition, we can charge whatever we want, and people will STILL buy it cause it is the only option if you want the best or fastest". It has NOTHING to do with knowing cost info or what a company's profit margins are... but I guess you will never understand this as well.
eva02langley - Thursday, September 20, 2018 - link
Once again, it's not AMD's fault if Nvidia is trying to corner AMD with new hardware gimmicks like PhysX, and charging the customers for it. You never intended to buy an AMD card anyway, you just want a more affordable Nvidia solution. Guess what, pay for it or leave it.
Even by earning 100K a year, I refuse to pay the gimmick tax. I will buy Navi at release. Screw Nvidia.
V900 - Wednesday, September 19, 2018 - link
What a cool piece of technology!
Raytracing would be amazing to have in games, and it really is the future of gaming. It's crazy to think there will be games with it already next year. (And some later this year!)
Is it too expensive? Meh, we are talking about TWENTY BILLION transistors squeezed into the area of a postage stamp.
People pay $600-1000 for a phone, and some have no problem paying $1000 for a CPU or a designer chair.
$700-1200 isn’t an unreasonable price for a cutting-edge GPU that’s capable of raytracing and will be fast enough for the newest games for years to come.
imaheadcase - Wednesday, September 19, 2018 - link
Did that Nvidia check they sent you to promote the items cash yet?
shabby - Wednesday, September 19, 2018 - link
Definitely a shill, it's too obvious.
tamalero - Thursday, September 20, 2018 - link
There are way too many here defending Nvidia just because "it's a huge chip". That means nothing for the consumer. We're not buying SIZE, we're buying PERFORMANCE AND FEATURES.
mapesdhs - Wednesday, September 26, 2018 - link
Some of the pro-RTX posts sound more like basic trolling though, just to stir things up. If they're getting paid to post +ve stuff, they're doing a pretty rotten job of it. :D
formulaLS - Wednesday, September 19, 2018 - link
Your comments sound like a paid ad. There is no decent excuse for the prices they are charging.
DigitalFreak - Wednesday, September 19, 2018 - link
He has a point. People are willing to pay $1000 for a phone, $1000 for a CPU, but $1000 for a high end graphics card is outrageous? I wish the pricing was cheaper, but I'm not having a fit over it. If people don't want to pay the price, they won't. If Nvidia doesn't sell the numbers they want, they'll probably cut the price somewhat.Fritzkier - Wednesday, September 19, 2018 - link
The $1000 phone actually had more technological advancement... And it's an SoC, not individual parts...
About a $1000 CPU, it's normal because it's an enthusiast product (e.g. Threadripper or i9). There's no $1000 i7 or Ryzen 7...
Nvidia shouldn't have made 2080 Ti. They should've made Titan Turing or something...
PopinFRESH007 - Wednesday, September 19, 2018 - link
News flash: the 2080 Ti is an enthusiast product.
tamalero - Thursday, September 20, 2018 - link
news flash.. 2080TI is just part of a large part of a system. Unlike a flagship phone.. You cant game with only a 2080TI or a 2080. You need other parts.Your argument is retarded.
Be honest, you got cash in nvidia's stock? your family works for Nvidia?
PopinFRESH007 - Thursday, September 20, 2018 - link
What argument is that? No, I don't have any nVidia stock and none of my family work for them. You sound envious of people who can afford to be early adopters.
tamalero - Friday, September 21, 2018 - link
You're defending the almost 50% price hike with an "it's an enthusiast product". Now that is a dumb excuse.
Nothing to do with being envious. A fool and his money are soon parted. So if you want to buy it, go ahead!
For the majority of us it's not worth buying something whose "flagship" features aren't even working or available for probably months to come, that has only a 30% average performance increase, for almost double the price..
cmdrdredd - Wednesday, September 19, 2018 - link
I don't know anyone who has ever paid for their phone outright. Everyone is on a lease/upgrade/iPhone-forever plan.
bji - Wednesday, September 19, 2018 - link
So you think that means they haven't paid full price for their phone? Or are you saying you don't understand how to buy a video card on a similar payment plan (hint -- it's called a credit card)?People have absolutely no sense of logic or really intelligence at all when it comes to evaluating technology prices. NONE. This forum, and pretty much every internet forum where predominantly inexperienced kids who have no clue how actual money works post (which is apparently all of them), is all the proof you need of that.
PopinFRESH007 - Wednesday, September 19, 2018 - link
^^ This, 100% this. People who think that changing the structure of payment somehow magically changes the price of something (other than increasing it due to TVM) are amazingly ignorant.
Nagorak - Thursday, September 20, 2018 - link
It doesn't change the price, however it certainly hides the price. A lot of people would not buy their expensive phone if it meant coughing up $800 all at once.Nagorak - Thursday, September 20, 2018 - link
Not a good example. A credit card probably has a 20% per year interest rate. A phone contract usually bundles in the monthly payment on the phone at a low or no interest rate.
Qasar - Thursday, September 20, 2018 - link
the carriers here ( canada ) have been considering dropping the " subsidized phone " thing for a few years now.. IF they do.. i wonder how many people would NOT get a new phone every year.... specifically those get " have to " get the new iphone every year... i dont know any one that would pay that much for a phone each year, if they had to buy it outright from their own pocket....Gastec - Friday, September 28, 2018 - link
@cmdrdredd
Watch out for those online scammers, you are in their target demographics ;)
Writer's Block - Monday, October 1, 2018 - link
always buy up-front.BurntMyBacon - Thursday, September 20, 2018 - link
@DigitalFreak: "He has a point. People are willing to pay $1000 for a phone, $1000 for a CPU, but $1000 for a high end graphics card is outrageous?"Fair point, but I tend to take the opposite approach and ask why people will pay more for a phone than a high end graphics card. ;')
Spunjji - Friday, September 21, 2018 - link
We're not having a fit over price, we're just not happy about it. A lot of people are digging deep to justify it, though, including making spurious arguments about the cost of things that have nothing to do with gaming GPUs. That strikes me as... not rational.$1000 phones are a great analogy, incidentally, but not for the reason you thought. It's another market area where manufacturers noticed people were sweating their assets for longer, so flimsy justifications were made for increasing the cost of entry to sustain margins. People buy it because they want it, not because it's good value.
Nvidia want to walk in here singing about a brave new world of Ray Tracing and then they tell me the cost to ride is $700+. To that I am saying nooooo thank you.
Does my not caring affect Nvidia much? No. But this is a forum, this is where we share opinions. Stop trying to act like only your opinion is rational and everyone else's is childish or misinformed.
Gastec - Friday, September 28, 2018 - link
I will NEVER pay €1000 for a smartphone. Unless of course ultra-inflation will....certainly happen in the next 15 years, when €300 now will be €1000 tomorrow. It has already started in the USA.
godrilla - Wednesday, September 19, 2018 - link
Raytracing, did you read the article?
Ranger1065 - Thursday, September 20, 2018 - link
Pathetic creature.
Gastec - Friday, September 28, 2018 - link
@V900
Soundeth like a true Avram Piltch's padawan :)
V900 - Wednesday, September 19, 2018 - link
BTW: Why do I get the distinct impression, that most of the people complaining about the price, would insist that it’s a totally fair and reasonable price, if this was an AMD graphics card?
Qasar - Wednesday, September 19, 2018 - link
If this was an AMD card.. it still would be too much, and I still wouldn't get one...
imaheadcase - Wednesday, September 19, 2018 - link
They wouldn't, because it would not have the extra bloat in the card but would perform the exact same.. people would get the AMD card because all the bullshit in the Nvidia card is not yet even proven to be worth it.
Better question: How much does Nvidia pay you to be a shill here and on forums? I mean you are obviously delusional about it.
DigitalFreak - Wednesday, September 19, 2018 - link
Typical fanboy response.
Fritzkier - Wednesday, September 19, 2018 - link
Do you actually read V900's other comments? He is a shill.
And anyway, AMD did the same thing with Vega 64 (1080-ish performance but a little expensive, except in Vulkan), and you can see that many were also outraged at the Vega release. That's just because AMD was one year late and made it a little expensive.
bji - Wednesday, September 19, 2018 - link
He may not be a shill; perhaps he's just trying to counterbalance the clueless posters who don't understand why products cost what they do.I know that I am not a shill for anyone, but I can understand the motivation in posting something like that.
tamalero - Wednesday, September 19, 2018 - link
Cost means nothing if they can't provide what they are expected to. No one gives a darn if the die is insane if the performance isn't as good as promised.
R600 was huge, hot, and was a total new game in graphic design and tech. Was its performance good? NO.
Inteli - Wednesday, September 19, 2018 - link
I think they're substantially overblowing the significance of raytracing and DLSS, at least in this generation's lifespan.The best way I've seen it put is this: look at the Shadow of the Tomb Raider demo. There isn't *that* much of a difference between raytracing on and off. Shaders, ambient occlusion, and other such cheats have gotten so good that they're very close to ray tracing.
Will ray tracing ever be significantly better than raster rendering? Of course it will. It's the future of rendering. Is it significantly better now? No.
It's not an issue of "card is expensive", it's "card isn't a good value compared to its predecessors in current games". Most of those extra transistors aren't contributing to the card's performance right now, so you're paying a lot more for nothing, at least right now. For current games, the 2080 is 1080 Ti performance for 1080 Ti prices.
Unless you either A. must have the latest and greatest, or B. anticipate playing games that have been confirmed to support RT, I don't see a compelling reason to buy a 2080 over a 1080 Ti, at least until 1080 Ti supply dries up.
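[Editor's note: the value argument in this sub-thread is ultimately simple arithmetic. A quick sketch, with launch MSRPs and an assumed ~30% 2080 Ti advantage over the 1080 Ti used purely as illustrative inputs; substitute your own street prices and benchmark numbers.]

```python
# Illustrative price/performance comparison. Performance is normalized to the
# GTX 1080 Ti = 1.00; the relative figures and prices below are assumptions
# for the sake of the arithmetic, not measured review data.
cards = {
    #  name            (price_usd, relative_perf)
    "GTX 1080 Ti": (699, 1.00),   # launch MSRP
    "RTX 2080":    (799, 1.00),   # roughly 1080 Ti-class in current games
    "RTX 2080 Ti": (1199, 1.30),  # assumed ~30% faster at 4K
}

baseline_price, baseline_perf = cards["GTX 1080 Ti"]
for name, (price, perf) in cards.items():
    perf_per_dollar = perf / price
    vs_base = perf_per_dollar / (baseline_perf / baseline_price) - 1
    print(f"{name}: {perf_per_dollar * 1000:.2f} perf per $1000 "
          f"({vs_base:+.0%} vs 1080 Ti)")
```

Under those assumptions both Turing cards come out behind the 1080 Ti on perf-per-dollar in today's games, which is the complaint being voiced; the counter-argument is that the RT and tensor hardware isn't captured by "relative_perf" at all.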
BurntMyBacon - Thursday, September 20, 2018 - link
@Inteli: "For current games, the 2080 is 1080 Ti performance for 1080 Ti prices."I think you meant greater than 1080Ti prices. At the same price and performance, I'd buy newer if for no other reason than to have longer support. The extra features (regardless of whether or not they are used in games I play) are just icing on the cake at that point.
Inteli - Saturday, September 22, 2018 - link
Right.
eva02langley - Thursday, September 20, 2018 - link
Because it is their only selling point. It is obviously not the performance.
imaheadcase - Wednesday, September 19, 2018 - link
OH, so my 1080 Ti makes me an AMD fanboy. Right..
Makaveli - Wednesday, September 19, 2018 - link
lol, that distinct impression is your own personal bias.
A high price is a high price regardless of whose logo is on it.
Nagorak - Thursday, September 20, 2018 - link
Probably because you yourself are a fanboy.
Spunjji - Friday, September 21, 2018 - link
And your evidence for that assertion is... presumably still up your ass, as you have yet to pull it out.
wavetrex - Wednesday, September 19, 2018 - link
Wow... another review which is comparing shitty Pascal FE to Turding FE and unrealistic MSRP to MSRP.
Instead of comparing... you know, actual cards that you can buy (partner cards), at the prices they are at?
How much are Nvidia paying you guys to write such biased, unrealistic review, to attempt to salvage this disastrous launch ?
Sometimes I wonder why haven't I already removed Anandtech from my bookmarks yet.
I might just do that after I finish posting this comment.
diehardmacfan - Wednesday, September 19, 2018 - link
Are they supposed to have a real-time pricing information link to what each card will cost?
If you think they got paid to offer up a less than glowing review.. lol.
DigitalFreak - Wednesday, September 19, 2018 - link
There are plenty of other reviews out there you can read.
DILLIGAFF - Wednesday, September 19, 2018 - link
Reminds me of something...... same situation and outcome.. pushing new tech while the hardware is horrible on price/perf: https://www.anandtech.com/show/742/12
From GeForce 3 it took till GeForce 6 to get an Nvidia GPU that was good on performance and price... I guess 1200 bux is the new 500 bux.
maroon1 - Wednesday, September 19, 2018 - link
Huge gains in Wolfenstein II when using Vulkan.
I'd like to see a Doom with Vulkan test, please.
FirstStrike - Wednesday, September 19, 2018 - link
The computing benchmark really means a lot. But I wonder why GTX-10 series are absent in SGEMM tests. They should be able to get quite some GFLOPs in SGEMM.
godrilla - Wednesday, September 19, 2018 - link
Nvidia could be capping performance of turing to make pascal look good in comparison to sell remaining stock which is a win win for nvidia they don't have to discount pascal to maintain max profits. Look at most convulsions saying pascal is still competitive in comparison to the 2080. Unfortunately we don't have a choice in a monopoly. When intel enters the gpu market we are probably going to have the most competition in a long time even if its mid level.godrilla - Wednesday, September 19, 2018 - link
Conclusions*
BurntMyBacon - Thursday, September 20, 2018 - link
I'm going to take an (admittedly small) leap of faith and suggest that nVidia most likely is not intentionally limiting performance of Turing cards. Given the amount of hardware dedicated to tasks that don't benefit rasterization, it just doesn't seem like they could have left that much performance on the table. It is much more likely that they've simply got prices set high with the intent of dropping them once high end Pascal inventory clears out. Of course, after the mining push, they've seen how much the market is willing to bear. They may be trying to establish a new pricing structure that gives the extra profits to them rather than retailers.
mapesdhs - Wednesday, September 26, 2018 - link
Given the amount of deceitful PR being used for this launch, I don't think your leap is justified.
tamalero - Wednesday, September 19, 2018 - link
I honestly believe the endgame of Nvidia is simple. They want to increase their margin, and the only way to do that is to sell the WHOLE full chips, tensor cores and all, to gamers, while still charging top dollar to pros.
This would lead Nvidia to make FEWER variants, saving the costs of having to design multiple versions when they can't scale down or cut.
PopinFRESH007 - Wednesday, September 19, 2018 - link
Your argument is invalidated by the evidence of this product launch. All three cards are on different chips.
eva02langley - Thursday, September 20, 2018 - link
You are kidding me? This is exactly it. They made an all-around chip to tackle pros, gamers and compute. Vega has the same issue: it was aimed at being everything from an iGPU to a dGPU. It does extremely well at low input, but not as a dGPU.
They save cost and standardize their manufacturing process. It is nothing else.
Bensam123 - Wednesday, September 19, 2018 - link
Going to go a weird direction with this. I believe cards are going to start diverging from one another in terms of what gamers are looking for. Hardcore gamers that are after the professional scene and absolute performance always turn graphics down, they drive high refresh rate monitor, with low response times, and high frame rates to absolutely limit the amount of variance (spiking) that is present in their gaming experience.Nvidia went for absolute fidelity where they believe the best gaming experience will come from picture perfect representations of an environment, such as with ray tracing. I see ray tracing as a gamer and I go 'Welp that's something I'll turn off'. Hardware review websites are only looking at gaming from a cinematic standpoint, where best gaming will always have everything maxed out running at 8k. Cards do perform differently under different resolutions and especially with different amounts of eye candy turned on. I really think Anand should look into testing cards at 1080p on lowest settings with everything uncapped - Not as the only datapoint, but as another one. Professional gamers or anyone who takes gaming seriously will be running that route.
Which runs into another scenario, where card makers are going to diverge. AMD's upcoming 7nm version of Vega, for instance, may continue down Vega's current path, which means they'll be focusing on current-day performance (although they mentioned concentrating more on compute, we can assume the two will intersect). That means while a 2080ti might be faster running 4k@ultra, especially with rays if that ever takes off, it may lose out completely at 1080p@low (but not eyecancer, such as turning down resolution or textures).
For testing at absolute bleeding speeds, that 1% that is removed in 99% percentile testing really starts to matter. Mainly because the spikes, the micro-stutters, the extra long hiccups get you killed and that pisses off gamers that aim for the pinnacle. Those might seem like outliers, but if they happen frequently-infrequently enough, they are part of a distribution and shouldn't be removed. When aiming for bleeding speeds, they actually matter a lot more.
So thus is born the esports gaming card and the cinematic gaming card. Please test both.
PopinFRESH007 - Wednesday, September 19, 2018 - link
So they should include the horizontal line of a completely CPU bound test? Also I'm not understanding the statistical suggestions, they make no sense. Using the 99th percentile is very high already and the minuscule amount of outliers being dropped are often not due to the GPU. As long as they are using the same metric for all tests in the data set it is irrelevant.
Bensam123 - Thursday, September 20, 2018 - link
Not all games are CPU bound; furthermore it wouldn't be completely horizontal, as that would imply zero outliers, which never happens. In that case, instead of looking at the 99th-percentile frame time you would instead focus on the other part, the 1% frame times, or all the stuttering and microstutters. You can have confidence in the tail ends of a distribution if there are enough data points.
Also, having played tons of games on low settings, you are 100% incorrect about it being a flat line. Go play something like Overwatch or Fortnite on low, you don't automagically end up at the CPU cap.
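[Editor's note: for anyone wanting to reproduce the metrics being argued about here, the usual 99th-percentile frame time and "1% low" figures fall out of a few lines of Python once you have a per-frame time log. The frame-time data below is synthetic, generated only to mimic the occasional hitching Bensam123 describes.]

```python
import random
import statistics

# Fake frame-time log in milliseconds: mostly ~7 ms (~144 fps) with
# occasional injected hitches standing in for stutter.
random.seed(0)
frame_times_ms = [random.gauss(7.0, 0.5) for _ in range(10_000)]
for i in range(0, len(frame_times_ms), 500):
    frame_times_ms[i] += random.uniform(10, 30)   # injected spikes

def percentile(values, pct):
    ordered = sorted(values)
    idx = min(len(ordered) - 1, int(round(pct / 100 * (len(ordered) - 1))))
    return ordered[idx]

avg_fps = 1000 / statistics.mean(frame_times_ms)
p99_ms = percentile(frame_times_ms, 99)               # 99th-percentile frame time
worst_1pct = [t for t in frame_times_ms if t >= p99_ms]
one_pct_low_fps = 1000 / statistics.mean(worst_1pct)  # "1% low" in fps terms

print(f"average: {avg_fps:.1f} fps")
print(f"99th percentile frame time: {p99_ms:.1f} ms ({1000 / p99_ms:.1f} fps)")
print(f"1% low: {one_pct_low_fps:.1f} fps")
```

The point of contention is simply whether that last number (and the individual spikes behind it) gets enough weight in reviews aimed at high-refresh players.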
V900 - Thursday, September 20, 2018 - link
Despite all the (AMD) fanboy rage about higher prices, here’s what will happen:Early adopters and anyone who wants the fastest card on the market, will get an RTX 2080/2070, Nvidia is going to make a tidy profit, and in 6-12 months prices will have dropped and cheaper Turing cards will hit the market.
That’s when people with a graphics card budget smaller than 600$ will get one. (AMD fanboys will keep raging though, prob about something else that’s also Nvidia related.)
That’s how it always works out when a new generation of graphics hit the market.
But everyone who’s salty about “only” getting a 40% faster card for a higher price won’t enjoy the rest of the decade.
There won’t be anymore GPUs that deliver a 70-80% performance increase. Not from AMD and not from Nvidia.
We’re hitting the limits of Moore’s law now, so from here on out, a performance increase of 30% or less on a new GPU will be the standard.
nevcairiel - Thursday, September 20, 2018 - link
Technically AMD would have to hit a 70-80% advancement over their own cards at least once if they ever want to offer high-end cards again.
Arbie - Thursday, September 20, 2018 - link
Why gratuitously characterize all those complaining of price as "AMD fanboys"? Obviously, most are those who intended to buy the new NVidia boards but are dismayed to find them so costly! Hardly AMD disciples.
Your repeated, needless digs mark you as the fanboy. And do nothing to promote your otherwise reasonable statements.
eddman - Thursday, September 20, 2018 - link
I'm such an AMD fanboy that I have a 1060 and have never owned an AMD product. You, on the other hand, reek of pro-Nvidia bias.
A 40% higher launch MSRP is not "how it always works out".
No one is complaining about the performance itself but the horrible price/performance increase, or should I say decrease, compared to pascal.
Moore's law? 980 Ti was made on the same 28nm process as 780 Ti and yet offered considerably higher performance and still launched at a LOWER MSRP, $650 vs. $700.
Inteli - Saturday, September 22, 2018 - link
Got enough straw for that strawman you're building? The last 3 GPUs I've bought have been Nvidia (970, 1060, 1070 Ti). I would have considered a Vega 56, but the price wasn't low enough that I was willing to buy one.News flash: in current games, the 2080 is tied with the 1080 Ti for the same MSRP (and the 1080 Ti will be cheaper with the new generation launches). Sure, if you compare the 1080 to the 2080, the 2080 is significantly faster, but those only occupy the same position in Nvidia's product stack, not in the market. No consumer is going to compare a $500 card to a $700-800 card.
The issues people take with the Turing launch have absolutely nothing to do with either "We only got a 40% perf increase" or "Nvidia raised the prices" in isolation. Every single complaint I've seen is "Turing doesn't perform any better in current games as an equivalently priced Pascal card".
Lots of consumers are pragmatists, and will buy the best performing card they can afford. These are the people complaining about (or more accurately disappointed by) Turing's launch: Turing doesn't do anything for them. Sure, Nvidia increased the performance of the -80 and -80 Ti cards compared to last generation, but they increased the price as well, so price/performance is either the same or worse compared to Pascal. Many people were holding off on buying new cards until this launch, and in return for their patience they got...the same performance as before.
mapesdhs - Wednesday, September 26, 2018 - link
Where I live (UK), the 2080 is 100 UKP more expensive than a new 1080 Ti from normal retail sources, while a used 1080 Ti is even less. The 2080 is not worth it, especially with less RAM. It wouldn't have been quite so bad if the 2080 was the same speed for less cost, but being more expensive just makes it look silly. Likewise, if the 2080 Ti had had 16GB that would at least have been something to distinguish it from the 1080 Ti, but as it stands, the extra performance is meh, especially for the enormous cost (literally 100% more where I am).escksu - Thursday, September 20, 2018 - link
Lol..... most important??
If you think that ray tracing on the 2080 Ti is damn cool and a game changer..... you should see what AMD has done. Just that no one really cared about it back then.....
https://forums.geforce.com/default/topic/437987/th...
Even more incredible is that this was done 10yrs ago on 4870x2....REAL TIME......Yes, I repeat, REAL TIME.......
nevcairiel - Thursday, September 20, 2018 - link
It was just another "rasterization cheat" though, may have looked nice but ultimately didn't have the longevity that Ray Tracing may have. No 3D developer or 3D artist is going to ever argue that Ray Tracing is not the future, the question is just how to get there.V900 - Thursday, September 20, 2018 - link
Meh... The image itself is captured by cameras, it’s the manipulation of it that’s done in real time.Which is of course a neat little trick; and while it looks good, it’s hardly as impressive and computationally demanding as creating a whole image through raytracing.
escksu - Thursday, September 20, 2018 - link
https://www.cinemablend.com/games/DirectX-11-Ray-T...
AMD did that again with 5870.... REAL TIME...... but nobody cares........ because the world only bothers about Nvidia.........
Chawitsch - Thursday, September 20, 2018 - link
The demos you linked to are beautiful indeed, however both the ATi Ruby demo and Rigid Gems demo use standard DirectX feature from those times, no ray tracing at all or any vendor specific features. Due to the latter it is worth pointing out that the 2008 Ruby demo (called Double Cross IIRC) was perfectly happy to run on nVidia cards of the time.If these demos show anything, it is that there were and are extremely talented artists out there who can do amazing things to work around the limitations of rasterization. This way however we can always merely approximate how a scene should look, with increasingly high costs, so going back to proper ray tracing was only a question of when its costs will approach that of rasterization. We seem to have arrived at the balancing point, hence hybrid rendering. I also think if AMD could have pushed nVidia more with high end GPUs, nVidia may not have made this step at this time, at least it certainly could have been a more risky proposition otherwise.
Chawitsch - Thursday, September 20, 2018 - link
Besides artists, programmers working on such demos also deserve high praise of course.
Bateluer - Thursday, September 20, 2018 - link
Clearly, we're firmly in the age of 4K gaming for the upper range of cards. From Vega 64, 1080, 1080Ti, 2080, 2080Ti, all are throwing up very playable performance on 4K resolutions. I still expect to see people say you'll need SLI 2080TIs for 4K gaming, but that rings incredibly hollow when even a Vega 56 can deliver playable frame rates in most games at 4K with near maximum settings.kallogan - Thursday, September 20, 2018 - link
Higher perf but higher power consumption, this is barely an improvement. Skip.
kallogan - Thursday, September 20, 2018 - link
There is absolutely no point whatsoever to buy these over a 1080 Ti indeed.
Luke212 - Thursday, September 20, 2018 - link
Why did you not show the other cards for SGEMM? Epic fail. Nothing to compare them to.
martixy - Thursday, September 20, 2018 - link
My takeaway:
1080 owners and above should hold off; 1070 owners and below are probably justified in upgrading, with an eye to the future as the technologies baked into these cards mature and games utilizing them are released.
TheinsanegamerN - Thursday, September 20, 2018 - link
I highly doubt that those that bought a $350 card are looking at a $1200 card as an upgrade.
Holliday75 - Friday, September 21, 2018 - link
I am running a GTX970 and have zero interest in this generation of cards at that price. I am going to let the market settle, AMD release its next batch of cards or two and see where things are a year or two from now. Waste of money as far as I am concerned.RSAUser - Thursday, September 20, 2018 - link
The 1080 Ti partner cards are substantially better than the FE ones, any chance of adding that to the benchmark?
Ryan Smith - Thursday, September 20, 2018 - link
Unfortunately not. We like to keep our comparisons apples-to-apples. In this case using reference cards or the thing closest to one.Plus we don't have any other 1080 Tis in right now.
milkod2001 - Thursday, September 20, 2018 - link
NV is messing with us. Even with no competition from AMD, those price hikes at such low performance gains are laughable. This generation of new GPUs seems like just a stop gap before NV will have something more serious to show next year.
willis936 - Thursday, September 20, 2018 - link
No they seem like they will be exactly the same as the 1000 series: they are what they are, you pay what they ask, and they will be the only decent option they offer for the next two years.Maybe if Radeon ever gets their shit together the landscape might look different in 2-3 years but trust me: for now, expect more of the same.
milkod2001 - Thursday, September 20, 2018 - link
Yeah, we are pretty much getting into the Intel vs AMD scenario, where Intel dominated for years and brought customers overpriced products with very slow performance upgrades. There is hope AMD will at least try to do something about it.
yhselp - Thursday, September 20, 2018 - link
The temperature and noise results are shocking. The results are much closer to what you'd expect from a blower, rather than an open-air cooler. Previous gen OEM solutions do much better than this. What's the reason for this?milkod2001 - Thursday, September 20, 2018 - link
Chips are much bigger than previous gen.
iwod - Thursday, September 20, 2018 - link
I think we need DLSS and Hybrid Ray Tracing to judge whether it is worth it. At the moment, we could have nearly double the performance of the 1080 Ti if we simply had a 7xx mm2 die of it.
I think the idea Nvidia had is that we have reached the plateau of Graphic Gaming. Imagine what you could do with a 7nm 7xxm2 Die of 2080Ti? Move the 4K Ultra High Quality frame rate from ~60 to 100? That is in 2019; in 2022 at 3nm, double that frame rate from 100 to 200?
The industry as a whole needs to figure out how to extract even more Graphics Quality with less transistors, simpler software while at the same time makes 3D design modelling easier. The graphics assets from gaming are now worth 100s to millions. Just the asset, not engine programming, network, 3D interaction etc, nothing to do with code. Just the Graphics. And Hybrid Ray tracing is an attempt to bring more quality graphics without the ever increasing cost of Engine and graphics designer simulating those effect.
What is interesting is that we will have 8 Core 5Ghz CPU and 7nm GPU next year.
Chawitsch - Thursday, September 20, 2018 - link
Given how much die space is dedicated to the new features software support will definitely be the key for these cards' success. Otherwise their price is just too high for what they offer today. Buying these cards now is somewhat of a gamble, but nVidia does have excellent relations with developers however, so support should come. As someone who would like to have a capable GPU for 100+ FPS gaming at 1440p, especially one that is future proof, I would much rather take my chances with these new cards.To me the question is this, would it really be worth focusing even more on 4k gaming, when it is a fairly niche market segment still due to monitor prices (especially ones with low latency for gaming). Arguably these high end cards are niche too, but when we can already have 4k@60 FPS, with maxed graphics settings, other considerations become more important. At any given resolution and feature level pure performance becomes meaningless after a certain point, at least for gaming. Arguing that reaching 100 FPS at 4k definitely has merit in my opinion, but by the time really good 4k monitors take over we'll get there, even with the path nVidia took.
Regarding graphics quality and transistor count, ray tracing should be a win here, if not now then certainly in the future. There are diminishing returns with rasterization as you approach more realistic scenes, and ray tracing makes you jump through fewer hoops if you want to create a correct-looking scene.
MadManMark - Thursday, September 20, 2018 - link
"I think the idea Nvidia had is that we have reached the plateau of Graphic Gaming. Imagine what you could do with a 7nm 7xxm2 Die of 2080Ti?"Yes, but that is probably why they stuck with 12nmFF actually. Note the die size, plus each card has its own GPU, rather than binned selection from the same GPU (kudos to Nate for also ruminating briefly on this in text). This means maximizing yiled is particularly important, and so begs for a mature, efficient process. TSMC achieved great things with their current 7nm process, no knock on it, but it is still UV-based, it's been long documented that there are yield challengels with that. IMO Nvidia will wait to hitch their wagons to TSMC's next process (expected next year), EUV-based 7nm+, which is expected to mitigate a lot of these yield concerns.
In other words it will be very interesting to see what the 2180 Ti looks like next year -- yes, I built a lot of assumptions into that sentence ;)
eddman - Thursday, September 20, 2018 - link
Come on, the naming is already set; 1080, 2080, 3080. What the hell is a "2180"?
P.S. OCD
Lolimaster - Saturday, September 22, 2018 - link
That works if you expect graphics to be stagnant; tons of mini effects and higher polygon counts will drag a current 1080 Ti down to 10 fps in 2021.
29a - Thursday, September 20, 2018 - link
The quality of this website has hit rock bottom under the helm of Cutress. I waited all day for this review but by the time it was posted word had spread of how underwhelming these cards are and I had no interest in reading it. This is the second major hardware release in a year in which Anandtech has totally screwed up the review, the other being Ryzen. Please Mr Cutress step down so someone else can run the site and hopefully return it to its former glory.
MadManMark - Thursday, September 20, 2018 - link
You posted to tell us you were "underwhelmed" by an article that you admit you didn't even read?
29a - Thursday, September 20, 2018 - link
No, I said it was widely known that the cards' performance was underwhelming by the time the article was posted. If you reread what I said, the main point of my post is that Mr Cutress should resign because of the poor quality of how the website has been run since he took over.
29a - Thursday, September 20, 2018 - link
After reading my original post again, how in the hell did you come to the conclusion that I was underwhelmed by the article? It's very clear that I was talking about the performance of the card.
Ryan Smith - Thursday, September 20, 2018 - link
"The quality of this website has hit rock bottom under the helm of Cutress"That would be rather amazing, seeing as how I run it, not Ian.
Anyhow, if you have any questions or concerns, please don't hesitate to drop me a line (my email address is listed on the site). I always appreciate the feedback.
29a - Thursday, September 20, 2018 - link
Then take my comments and replace Cutress with Smith. I've been an avid follower of this website since 1998, including having it as my homepage for several years, and the quality has declined. The Ryzen review and this review are prime examples of that decline.
Ryan Smith - Thursday, September 20, 2018 - link
I'm still not sure you follow. Ian had nothing to do with this review. This was put together by Nate Oh (with some help from me).
29a - Thursday, September 20, 2018 - link
I follow what you're saying. I thought the website was run by Ian, my mistake. I'm saying you need to take my original comment and replace the name Cutress with the name Smith. Here, I'll do it for you: "The quality of this website has hit rock bottom under the helm of Smith. I waited all day for this review but by the time it was posted word had spread of how underwhelming these cards are and I had no interest in reading it. This is the second major hardware release in a year in which Anandtech has totally screwed up the review, the other being Ryzen. Please Mr Smith step down so someone else can run the site and hopefully return it to its former glory."
Now do you understand? I'm trying to say the site needs new leadership. The articles for the two biggest releases of the past year have been totally screwed up, 2080 cards and Ryzen refresh. Ryzen took over a month to complete.
Holliday75 - Friday, September 21, 2018 - link
I am really confused by your logic. Are you saying the article was bad because it was not released when you expected it to be released?
29a - Saturday, September 22, 2018 - link
I'm not saying the article was bad because it was released 11 hours later than everyone else's; I'm sure it was fine. I'm saying that the management of this website has gone to hell because they screwed up the two biggest releases of the last year. It took 5 weeks to finish the Ryzen article.
mapesdhs - Thursday, September 27, 2018 - link
Does the site earn anything from your reading their articles? Just curious.
sing_electric - Thursday, September 20, 2018 - link
I really wonder how well ray tracing will be implemented in the next 1-2 years. My bet is that for many of the titles Nvidia has announced, the effects will be limited and sort of gimmicky, and the real benefits will come with titles that are starting development now (or, more likely, in a few months, when devs can look at how the first attempts at using RT fared).
If my hunch is right, then that means that the RT features are likely to be of little practical use in this generation, since the real benefits won't come until some point after Nvidia's next-gen (7nm? 5nm?) chips come out with much-improved performance.
MadManMark - Thursday, September 20, 2018 - link
It sounds like the 2080 Ti at maybe 1440p might be viable for RT. But yeah, for the most part this is about the future, and about starting to get some games out there so that future releases don't face the "chicken & egg" problem they have now (no games to use it for, but the reason there are no games is there are no cards to use it on).
Nvidia is clearly sacrificing short-term profitability to establish this base; notice how the 2080 is both priced and performs about the same as the 1080 Ti. With the larger die size despite 12nm FF, driver development costs, etc., there is little doubt in my mind that Nvidia is making a bigger margin on the 1080 Ti than on the 2080. But they want to make it cheap enough so that, even if there is little to gain RIGHT NOW from buying a 2080 instead of a 1080 Ti, there is also little lost either.
milkod2001 - Thursday, September 20, 2018 - link
It will have to be AMD who also jumps on the ray tracing thing first; then maybe game developers will take it seriously.
MadManMark - Thursday, September 20, 2018 - link
"Bartender, I'll have what milkod2001's having!" ;)El Sama - Thursday, September 20, 2018 - link
The RTX 2080 is not worth buying right now; the 1080 Ti is cheaper (lol), cooler and performs about the same. The RTX 2080 Ti is a $1200+ card that is around a 60% price increase over the 1080 Ti for a 25-28% performance increase? How is that a good purchase? Neither of them is worth buying right now.
Toadster - Thursday, September 20, 2018 - link
decisions - 24 monthly payments for an iPhone XS Max? or a 2080 Ti :)
milkod2001 - Thursday, September 20, 2018 - link
Get the iPhone if you want to be the cool guy; you cannot put an RTX 2080 Ti into your pocket :)
Arbie - Thursday, September 20, 2018 - link
THANK YOU for the Ashes of the Singularity benchmark results. The deltas may not translate to other games, but they show me exactly what to expect from an upgrade.
darckhart - Thursday, September 20, 2018 - link
"But NVIDIA's key features - such as real time ray tracing and DLSS - aren't being utilized by any games right at launch. In fact, it's not very clear at all when those games might arrive, because NVIDIA ultimately is reliant on developers here."
"In the Star Wars Reflections demo, we measured the RTX 2080 Ti Founders Edition managing around a 14.7fps average at 4K and 31.4fps average at 1440p when rendering the real time ray traced scene. With DLSS enabled, it jumps to 33.8 and 57.2fps"
Direct quotes from article. Price premium for NV tech that (1) will not be in games at launch and may have months of buggy implementation from early adoption and may not have widespread adoption, (2) needs extreme help from DLSS to have usable framerates. Should've been named DTX not RTX.
V900 - Thursday, September 20, 2018 - link
That's plain false. Tomb Raider is a title out now with RTX enabled in the game.
Battlefield 5 is out in a month or two (though you can play it right now) and will also utilize RTX.
Sorry to destroy your narrative with the fact that one of the biggest titles this year is supporting RTX.
And that’s of course just one out of a handful of titles that will do so, just in the next few months.
Developer support seems to be the last thing that RTX2080 owners need to worry about, considering that there are dozens of titles, many of them big AAA games, scheduled for release just in the first half of 2019.
Skiddywinks - Friday, September 21, 2018 - link
Unless I'm mistaken, TR does not support RTX yet. Obviously, otherwise it would be showing up in reviews everywhere. There is a reason every single reviewer is only benchmarking traditional games; that's all there is right now.
Writer's Block - Monday, October 1, 2018 - link
Exactly. "Is supporting" or "enabled" -
However, neither actually has it now to see, to experience.
eva02langley - Thursday, September 20, 2018 - link
These cards are nothing more than a cheap magic trick show. Nvidia knew the performance was lackluster, and based their marketing on gimmicks to outflank the competition by affirming that these will be the future of gaming and that you will be missing out without them.
Literally, they basically tried to create a need... and if you are defending Nvidia over this, you have just been drinking the Kool-Aid at this point.
Quote me on this: this will be the next GameWorks feature that devs will not bother touching. Why? Because devs are developing games on consoles and porting them to PC. The extra time in development doesn't bring back any additional profit.
Skiddywinks - Friday, September 21, 2018 - link
Here's the thing though, I don't think the performance is that lacklustre; the issue is we have this huge die and half of it does not do what most people want: give us more frames. If they had made the same size die with nothing but traditional CUDA cores, the 2080 Ti would be an absolute beast. And I'd imagine it would be a lot cheaper as well.
But nVidia (maybe not mistakenly) have decided to push the ray tracing path, and those of us who just want maximum performance for the price (me) and were waiting for the next 1080 Ti are basically left thinking "... oh well, skip".
eva02langley - Friday, September 21, 2018 - link
Don't get me wrong, these cards are a normal upgrade performance jump; however, they are not the second coming that Nvidia is marketing them as.
The problem here is that Nvidia wants to corner AMD, and the tactic they chose is RTX. However, RTX is nothing more than a FEATURE. The gamble could cost them a lot.
If AMD's gaming and 7nm strategy pays off, devs will develop on AMD hardware and port to PC, leaving them no incentive to put in the extra work for a FEATURE.
The extra cost of the bigger die should have gone toward gaming performance, but Nvidia's strategy is to disrupt competition and further entrench their position as a near-monopoly as much as they can.
PhysX didn't work, HairWorks didn't work, and this will not work. As cool as it is, this should have been a feature for pro cards only, not consumers.
mapesdhs - Thursday, September 27, 2018 - link
That's the thing though, they aren't a "normal" upgrade performance jump, because the prices make no sense.
AnnoyedGrunt - Thursday, September 20, 2018 - link
This reminds me quite a bit of the original GeForce 256 launch. Not sure how many of you were following Anandtech back then, but it was my go-to site then just as it is now. Here are links to some of the original reviews:
GeForce256 SDR: https://www.anandtech.com/show/391
GeForce256 DDR: https://www.anandtech.com/show/429
Similar to the 20XX series, the GeForce256 was Nvidia's attempt to change the graphics card paradigm, adding hardware transformation and lighting to the graphics card (and relieving the CPU of those tasks). The card was faster than the contemporary cards, but also much more expensive, making the value questionable for many.
At the time I was a young mechanical engineer, and I remember feeling that Nvidia was brilliant for creating this card. It let me run Pro/E R18 on my $1000 home computer, about as fast as I could on my $20,000 HP workstation. That card basically destroyed the market of workstation-centric companies like SGI and Sun, as people could now run CAD packages on a windows PC.
The 20XX series gives me a similar feeling, but with less obvious benefit to the user. The cards are as fast or faster than the previous generation, but are also much more expensive. The usefulness is likely there for developers and some professionals like industrial designers who would love to have an almost-real-time, high quality, rendered image. For gamers, the value seems to be a stretch.
While I was extremely excited about the launch of the original GeForce256, I am a bit "meh" about the 20XX series. I am looking to build a new computer and replace my GTX 680/i5-3570K, but this release has not changed the value equation at all.
If I look at Wolfenstein, then a strong argument could be made for the 2080 being more future proof, but pretty much all other games are a wash. The high price of the 20XX series means that the 1080 prices aren't dropping, and I doubt the 2070 will change things much since it looks like it would be competing with the vanilla 1080, but costing $100 more.
Looks like I will wait a bit more to see how that price/performance ends up, but I don't see the ray-tracing capabilities bringing immediate value to the general public, so paying extra for it doesn't seem to make a lot of sense. Maybe driver updates will improve performance in today's games, making the 20XX series look better than it does now, but I think like many, I was hoping for a bit more than an actual reduction in the performance/price ratio.
-AG
eddman - Thursday, September 20, 2018 - link
How much was a 256 at launch? I couldn't find any concrete pricing info, but let's go with $500 to be safe. That's just $750 in today's dollars for something that is arguably the most revolutionary nvidia video card.
Ananke - Thursday, September 20, 2018 - link
Yep, and it also wasn't selling well among "gamers" as a novelty; it only became popular after falling under $100 a pop years later. Same here: financial analysts say the expected revenue from gaming products will drop in the near future, and Wall Street has already marked NVidia down. The product is good, but expensive; it is not going to sell in volume, and their revenue will drop in the coming quarters.
Apple's XS phone was the same, but Apple started a buy-one-get-one campaign on the very next day, plus an upfront discount and solid buyback of iPhones. Yet it is not clear whether they will achieve volume and revenue growth within the priced-in expectations.
These are public companies - they make money from Wall Street, and they /NVidia/ can lose much more and much faster on the capital markets than what they would gain in profitability from lower-volume, high-end boutique products. This was a relatively sh**y launch - NVidia actually didn't want to launch anything, they want to sell their glut of GTX inventory first, but they had silicon already ordered and made at TSMC, and couldn't just sit on it waiting...
AnnoyedGrunt - Friday, September 21, 2018 - link
I think it was actually much less, judging by comments made in one of the reviews I linked. Maybe around $350 or so, which was very expensive at the time. It is true that it was a revolutionary card, but at the same time it was greeted with a lukewarm reception from the gaming community. Much like the 20XX series. I doubt that the 20XX will seem as revolutionary in hindsight as the GeForce256 did, but the initial reception does seem similar between the two. Will be interesting to see what the next year brings to the table.
-AG
eddman - Friday, September 21, 2018 - link
Wow, that's just $525 now. I'm interested in old card prices because some people claim they have always been super expensive. It seems they have selective memory. I have yet to find a card more expensive than the 2080 Ti from that time period.
I'm not surprised that people still didn't buy many 256 cards. The previous cards were cheaper and performed close enough for the time.
abufrejoval - Thursday, September 20, 2018 - link
I am pretty sure I'll get a 2080 Ti, simply because nothing else will run INT4 or INT8 based inference with similar performance, ease of availability, and tools support. Sure, when you are Baidu or Facebook you can buy even faster inference hardware, or if you are Google you can build your own. But if you are not, I don't know where you'll get something that comes close.
As far as gaming is concerned, my 1080 Ti falls short at 4K with ARK, which is noticeable at 43". If the 2080 Ti can get me through the critical minimum of 30 FPS, it will have been worth it.
As far as ray tracing is concerned, I am less worried about its support in games: photorealism isn't an absolute necessity for game immersion.
But I'd love to see hybrid render support in software like Blender: the ability to pimp up the quality for video content creation and replace CPU-based render farms with something that is visually "awesome enough" points towards the real "game changing" capacity of this generation.
It pushes three distinct envelopes: raster, compute and render. If you only care about one, the value may not be there. In my case, I like the ability to explore all three, while getting a 2080 Ti allows me to push down a 1070 to one of my kids still running an R290X: Christmas for both of us!
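As a rough illustration of the INT8 inference point above (not the commenter's actual setup), the low-level building block Pascal and Turing expose for this is the __dp4a instruction: a packed 8-bit dot product with 32-bit accumulation, available on compute capability 6.1 and newer. The kernel, array names and sizes below are invented for the sketch; real inference would normally go through TensorRT or cuBLAS rather than a hand-written kernel like this.

#include <cuda_runtime.h>
#include <cstdint>
#include <cstdio>

// Hypothetical fully-connected layer: each thread computes one output value as an
// INT8 dot product, 4 bytes per step via __dp4a (requires sm_61 or newer).
__global__ void int8_dot(const int32_t* x, const int32_t* w, int32_t* out, int kPacked)
{
    int row = blockIdx.x * blockDim.x + threadIdx.x;
    int32_t acc = 0;
    for (int i = 0; i < kPacked; ++i) {
        // x and w each hold 4 signed 8-bit values per 32-bit word.
        acc = __dp4a(w[row * kPacked + i], x[i], acc);
    }
    out[row] = acc;   // a real pipeline would re-quantize/scale here
}

int main()
{
    const int rows = 256, kPacked = 64;   // 256 outputs, 256 INT8 inputs per row
    int32_t *x, *w, *out;
    cudaMallocManaged(&x, kPacked * sizeof(int32_t));
    cudaMallocManaged(&w, rows * kPacked * sizeof(int32_t));
    cudaMallocManaged(&out, rows * sizeof(int32_t));
    for (int i = 0; i < kPacked; ++i) x[i] = 0x01010101;           // four 1s per word
    for (int i = 0; i < rows * kPacked; ++i) w[i] = 0x01010101;
    int8_dot<<<rows / 64, 64>>>(x, w, out, kPacked);
    cudaDeviceSynchronize();
    printf("out[0] = %d (expected %d)\n", out[0], 4 * kPacked);
    return 0;
}

Compiled with something like nvcc -arch=sm_61, each __dp4a here replaces four multiply-adds, which is the kind of throughput advantage the comment is pointing at; Turing additionally adds tensor-core INT8/INT4 paths on top of this.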
mapesdhs - Thursday, September 27, 2018 - link
In the end though, that's kinda the point: these are not gaming cards anymore and haven't been for some time. These are side spins from compute, where the real money & growth lie. We don't *need* ray tracing for gaming; that glosses over so many other far more relevant issues about what makes for a good game.
Pyrostemplar - Thursday, September 20, 2018 - link
High performance and a (more than) matching price. nVidia seemingly pushed the card classification down one notch (x80 => x70; Ti => x80; Titan => Ti) while keeping the prices, and overclocked them from day one, so it looks like solid progress if one disregards the price.
I think it will be a short-lived (1 year or so) generation. A pricey stopgap with a few useless new features (because by the time devs catch up and actually deploy DXR-enabled games, these cards will have been replaced by something faster).
ballsystemlord - Thursday, September 20, 2018 - link
Spelling/grammar errors (only 2!):
Wrong word:
"All-in-all, NVIDIA is keeping the Founders Edition premium, now increased to $100 to $200 over the baseline"
Should be:
"All-in-all, NVIDIA is keeping the Founders Edition premium, now increased from $100 to $200 over the baseline"
Missing "s":
"Of course, NVIDIA maintain that the cards will provide expected top-tier"
Should be:
"Of course, NVIDIA maintains that the cards will provide expected top-tier"
Ryan Smith - Thursday, September 20, 2018 - link
Thanks!
ballsystemlord - Thursday, September 20, 2018 - link
Nate! Can you add DP Folding@Home benchmark numbers? There were none in the Vega review and only SP in this Nvidia review.
SanX - Thursday, September 20, 2018 - link
Does the author think that all gamers buy only the fastest cards? Maybe. But I doubt all of them buy the new generation card every year. In short, where are the comparisons to the 980/980 Ti and even the 780/780 Ti? Owners of those cards are more interested in upgrading.
milkod2001 - Friday, September 21, 2018 - link
See the top menu on the right; there is a Bench where you can see results. I presume they will add data to the huge database soon. And yes, people are talking about high-end GPUs, but most are spending $400 max. for one.
beisat - Thursday, September 20, 2018 - link
Very nice review, by far the best one I've read. Thanks for that.
How likely do you think the launch of another generation is in 2019 from Nvidia, and/or something competitive from AMD based on 7nm?
I currently have a GTX 970, skipped the Pascal generation and was waiting for Turing. But I don't like being an early adopter and feel that for pure rasterisation, these cards aren't worth it. Yes, they are more powerful than the 10 series I skipped, but they also cost more - so performance per $$$ is similar, and I'm not willing to pay the same amount of $$$ for the same performance as I would have 2 years ago.
Guess I'll just have to stick it out with my 970 at 1080p?
dguy6789 - Thursday, September 20, 2018 - link
RTX 2080 Ti and 2080 are highly disappointing.
V900 - Thursday, September 20, 2018 - link
That's a rather debatable take that most hardware sites and tech journalists would disagree with.
But what do they know, amirite?
dguy6789 - Friday, September 21, 2018 - link
Just about every review of these cards states that right now they're disappointing and we need to wait and see how ray tracing games pan out to see if that will change.
We waited this many years for the smallest generation-to-generation performance jump we have ever seen. The price went way up too. The cards are hotter and use more power, which makes me question how long they will last before they die.
The weird niche Nvidia "features" these cards have will end up like PhysX.
The performance you get for what you pay for a 2080 or 2080 Ti is simply terrible.
dguy6789 - Friday, September 21, 2018 - link
Not to mention that Nvidia's stock was just downgraded due to the performance of the 2080 and 2080 Ti.
mapesdhs - Thursday, September 27, 2018 - link
V900, you've posted a lot of stuff here that was itself debatable, but that comment was just nonsense. I don't believe for a moment you think most tech sites consider these cards a worthy buy. The vast majority of reviews have been generally or heavily negative. I therefore conclude troll.
hammer256 - Thursday, September 20, 2018 - link
Oof, still on the 12nm process. Frankly it's quite remarkable how much rasterization performance they were able to squeeze out while putting in the tensor and ray tracing cores. The huge dies are not surprising in that regard. In the end, architectural efficiency can only go so far, and the fundamental limit is still the transistor budget.
With that said, I'm guessing there's going to be a 7nm refresh pretty soon-ish? I would wait...
V900 - Thursday, September 20, 2018 - link
You might have to wait a long time then.
Don't see a 7nm refresh on the horizon. Maybe in a year, probably not until 2020.
*There isn't any HP/high-density 7nm process available right now. (The only 7nm product shipping right now is the A12, and that's a low-power/mobile process. The 7nm HP processes are all in various forms of pre-production/research.)
*Price. 7nm processes are going to be expensive. And the Turing dies are gigantic, and already expensive to make on their current node. That means Nvidia will most likely hold off on a 7nm Turing until prices have come down and the process is more mature.
*And then there’s the lack of competition: AMD doesn’t have anything even close to the 2080 right now, and won’t for a good 3 years if Navi is a mid-range GPU. As long as the 2080Ti is the king of performance, there’s no reason for Nvidia to rush to a smaller process.
Zoolook - Thursday, September 27, 2018 - link
The Kirin 980 has been shipping for a while and should be in stores in two weeks, and we know that at least Vega was sampling in June, so it depends on the allocation at TSMC; it's not 100% Apple.
Antoine. - Thursday, September 20, 2018 - link
The assumption under which this article operates, that the RTX 2080 should be compared to the GTX 1080 and the RTX 2080 Ti to the GTX 1080 Ti, is a disgrace. It allows you to be overly satisfied with performance evolution between GPUs with vastly different price tags! It just shows that you completely bought the BS renaming of the Titan into a Ti. Of course the next gen Titan is going to perform better than the previous generation's Ti! Such a gullible take on these new products cannot be by sheer stupidity alone.
mapesdhs - Thursday, September 27, 2018 - link
It also glosses over the huge pricing differences and the fact that most gamers buy AIB models, not reference cards.noone2 - Thursday, September 20, 2018 - link
Not sure why people are so negative about these and the prices. Sell your old card and amortize the cost over how long you'll keep the new one. So maybe $400/year (less if you keep it longer).
If you're a serious gamer, are you really not willing to spend a few hundred dollars per year on your hardware? I mean, the performance is there and it's somewhat future proofed (assuming things take off for RT and DLSS.)
A bowling league (they still have those?) probably costs more per year than this card. If you only play Minecraft I guess you don't need it, but if you want the highest setting in the newest games and potentially the new technology, then I think it's worth it.
milkod2001 - Friday, September 21, 2018 - link
The performance is not there. An actual performance boost of around 20% is not very convincing, especially given the much higher price. How can you be positive about it?
The future-tech promise doesn't add that much, and it is not clear if game developers will bother.
When one spends $1000 on a GPU it has to deliver perfect 4K gaming, all maxed out, and NV charges ever more. This is a joke; NV is just testing how much they can squeeze out of us until we simply don't buy.
noone2 - Friday, September 21, 2018 - link
The article clearly says that the Ti is 32% better on average.
The idea behind future tech is that either you do it and early adopters pay for it in the hope it catches on, or you never do it and nothing ever improves. Game developers don't really create technology and then ask hardware producers to support it/figure out how to do it. Dice didn't knock on Nvidia's door and pay them to figure out how to do ray tracing in real time.
My point remains though: if this is a favorite hobby/pastime, then it's a modest price to pay for what will be hundreds of hours of entertainment, plus the potential that ray tracing and DLSS and whatever else catches on and you get to experience it sooner rather than later. You're saying this card is too expensive, yet I can find console players who think a $600 video card is too expensive too. Different strokes for different folks. $1100 is not terrible value. You're talking hundreds of dollars here, not tens of thousands of dollars. It's a drop in the bucket in the scope of life.
mapesdhs - Thursday, September 27, 2018 - link
So Just Buy It then? Do you work for Tom's? :D
TheJian - Thursday, September 20, 2018 - link
"Ultimately, gamers can't be blamed for wanting to game with their cards, and on that level they will have to think long and hard about paying extra to buy graphics hardware that is priced extra with features that aren't yet applicable to real-world gaming, and yet only provides performance comparable to previous generation video cards."So, I guess I can't read charts, because I thought they said 2080ti was massively faster than anything before it. We also KNOW devs will take 40-100% perf improvements seriously (already said such, and NV has 25 games being worked on now coming soon with support for their tech) and will support NV's new tech since they sell massively more cards than AMD.
Even the 2080 vs. 1080 is a great story at 4k as the cards part by quite a margin in most stuff.
E.g., in the Battlefield 1 4K test the 2080 FE scores 78.9 vs. 56.4 for the 1080 FE. That's a pretty big win; scoffing at it by calling it "comparable" is misleading at best, correct? Far Cry 5, same story: 57 for the 2080 FE vs. 42 for the 1080 FE. Again, a pretty massive gain for $100. Ashes: 74 to 61 fps (2080 FE vs. 1080 FE). Wolf2: 100 fps for the 2080 FE vs. 60 for the 1080 FE... LOL. Well, 40%+ is, uh, "comparable perf"... ROFL. OK, I could go on, but whatever dude. Would I buy one if I had a 1080 Ti? Probably not, unless I had cash to burn, but for many who usually do buy these things, they just laughed at $100 premiums... ROFL.
Never mind what these cards are doing to the AMD lineup. No reason to lower prices on the old cards; I'd plop the new ones on top of them too, since they are the only competition. When you're competing with yourself you just do HEDT-like stuff, rather than shoving down the old lines. Stack on top for more margin and profits!
$100 for future tech and a modest victory in everything or quite a bit more in some things, seems like a good deal to me for a chip we know is expensive to make (even the small one is Titan size).
Oh, I don't count that fold@home crap and synthetic junk as actual benchmarks, because you gain nothing from doing it but a high electric bill (and a hot room). If you can't make money from it, or play it for fun (a game), it isn't worth benchmarking; it means nothing. How fast can you spit in the wind 100 times? Umm, who cares. Right. Same story with synthetics.
mapesdhs - Thursday, September 27, 2018 - link
It's future tech that cannot deliver *now*, so what's the point? The performance just isn't there, and it's a pretty poor implementation of what they're boasting about anyway (I thought the demos looked generally awful, though visual realism is less something I care about now anyway; games need to be better in other ways). Fact is, the 2080 is quite a bit more expensive than a new 1080 Ti for a card with less RAM and no guarantee these supposed fancy features are going to go anywhere anyway. The 2080 Ti is even worse; it has the speed in some cases, but the price completely spoils the picture - where I am, the 2080 Ti is twice the cost of a 1080 Ti, with no VRAM increase either.
NVIDIA spent the last 5 years pushing gamers into high frequency displays, 4K and VR. Now they're trying to do a total about-face. It won't work.
lenghui - Thursday, September 20, 2018 - link
Thanks for rushing the review out. BTW, the auto-play video on every AT page has got to stop. You are turning into Tom's Hardware.
milkod2001 - Friday, September 21, 2018 - link
They are both owned by Purch, the marketing company responsible for those annoying auto-play videos and the lowest crap possible in the "From the web" section. They go with the motto: ad clicks over anything. Don't think it will change anytime soon. Anand sold his soul twice: himself to Apple, and his site to Purch.
mapesdhs - Thursday, September 27, 2018 - link
One can use uBlock Origin to block those jw-player vids.
darkos - Friday, September 21, 2018 - link
Please add flight simulation testing to your list of applications, e.g. X-Plane, Prepar3D.
Vinny DePaul - Friday, September 21, 2018 - link
I am still rocking a 980 with fps over 60 and everything turned up to max. I guess I will wait.
mapesdhs - Thursday, September 27, 2018 - link
That's a perfect summary of why Tom's loony article was so bad. If your current hw is doing just fine for the games you're playing atm, then upgrading makes no sense. It's rather cynical of NVIDIA, and some tech sites, to basically create a need and then push people into thinking they're idiots if they don't upgrade, while hiding behind very poor price/performance dynamics.
As someone who has worked in the game industry for 8+ years, I can tell you only one thing: nobody will rush to implement proprietary features! The ones that have demos, or are soon to have the features implemented, are the ones Nvidia reached out to, not vice versa.
These days making a game is no different from making any other product on this planet. It is corporate business, which means you want maximum profit, which translates into maximum user coverage, which translates into maximum platform coverage: PC (Windows, Mac, Linux), consoles and mobile.
There is just no basis to compare Vulkan to anything proprietary. Vulkan comes with the promise that what I make will look and feel the same way visually across multiple platforms without requiring too much hassle on the development side. Even when you use a flexible engine like UE4 it is not that easy to have the same stuff working across multiple platforms, and changes and further development of materials and meshes are required to have everything at least look identical.
So I can hardly imagine that while you are bogged down with tons of bugs and trying to deliver your product across multiple platforms, you will add yourself one more pain in the ass like nVidia ray tracing, which will have a doubtful effect on your title's income given the small user reach.
I can give you Wargaming and Valve games as examples of old engines that are making tons of money.
So while nVidia is trying to rip people off with that amount of money, I'm wondering how to optimise one level so it can run fast and look cool on 6-year-old midrange hardware.
eddman - Friday, September 21, 2018 - link
*off-topic*
Although it is true that proprietary features do not always take off in a meaningful way (the GPU-accelerated mode of PhysX as an example), it doesn't mean an open standard would always be the popular choice.
Take big budget games from large publishers. These games, in the large majority of cases, are only available on three platforms, PS, xbox and windows, because these are the platforms that have the hardware to support such games and also have the largest audience.
IINM, vulkan is not available on X1 and PS4. If a AAA game dev was to use vulkan on windows, they'd still need to code the game for directx on X1 and GNM or GNMX on PS4, meaning they'd have to support three APIs.
If they go with directx on windows, then two platforms will already be covered and they'd only need to do additional coding for PS4 support.
On the other hand, vulkan does make sense for devs of smaller games where they want to cover as many platforms as possible, especially for mobile games, where vulkan covers windows, linux, mac, android and I think ios and even switch.
eva02langley - Friday, September 21, 2018 - link
Yes and no; for example, FreeSync is finally getting support from TV makers. Finally, I can get a 4K big screen with HDR and... FreeSync.
Open source is the way to go over proprietary technology.
eddman - Friday, September 21, 2018 - link
That's why I wrote it's not always the case. Sometimes it works, sometimes not. It all has to do with the standard having industry support. Being open source does not automatically mean it'd catch on, unfortunately.
noone2 - Friday, September 21, 2018 - link
Someone will, and it will be great, and then it will catch on. Or maybe it won't, but that's how things happen. Someone takes a risk and it pans out. Either contribute to it or don't; play safe or take a chance.
Nvidia has obviously been making some pretty good decisions over the years and has turned out some amazing products. Sometimes they've been wrong, but more often than not they are right. If they were wrong more than not, they'd be out of business, not a $150B company.
If you don't ever take a risk doing something new or cutting edge, you'll disappear. This is true for all technology.
noone2 - Friday, September 21, 2018 - link
Oh, and remember, at some point you don't have to care about 6-year-old hardware. Look at consoles. At some point the studio just stops making a game for the last gen, even though the new gen doesn't have the same size install base yet. Or a non-franchise game just shows up for next-gen and that's it. They never even bother to attempt to make it on older stuff.
eva02langley - Friday, September 21, 2018 - link
Thanks for your comment, this was my argument all along.
The only way to force a new feature is by sheer numbers. Basically, if RTX were something available on new consoles, then it would make sense from a business standpoint; however, AMD owns the consoles and might for a long time.
AMD should force multi-GPU via Infinity Fabric through consoles. This would work because devs would have access to additional power on the die via proper coding... and this delivered to 100% of the user base.
If this is only developed for less than 1% of the PC user base, this will fail miserably and nobody would add support unless sponsored by Nvidia themselves.
Financial analysts are seeing it and downgrades are coming.
whaever85343 - Friday, September 21, 2018 - link
Whatever, this is your new benchmark:
https://albertoven.com/2018/08/29/light-lands/
Golgatha777 - Friday, September 21, 2018 - link
I just want to be able to play all my games at 1440p, 60 FPS with all the eye candy turned on. Looks like my overclocked 1080 Ti will be good for the immediate future is what I got from this review. The only real upgrade path is to the 2080 Ti, and at $1200 that's an extremely hard sell.
vivekvs1992 - Friday, September 21, 2018 - link
Well, the problem is that in India retailers are not willing to reduce the price of the 1080 series. At present the 2080 is cheaper than all models of the 1080 Ti. If given the chance I will definitely go for the 2080... thing is that I will have to invest in a gaming monitor first.
webdoctors - Friday, September 21, 2018 - link
Any mining benchmarks? Can I actually make money buying these cards?
ravyne - Friday, September 21, 2018 - link
I agree these are really for early adopters of RT, or if you're doing a new build or in need of a new card but want it to last you 3+ years, so you need to catch the RT wave now.
I think the next generation of RT-enabled cards will probably be the optimal entry point; presumably they'll be able to double (or so) RT performance on a 7nm process, and that means the next xx70/80 products will actually have enough RT to match the resolution/framerate expectations of a high-end card, and also that the RT core won't be too costly to put into xx50/60 tier SKUs (if we even see a 2060 SKU, I don't think it will include RT cores at all, simply because the performance it could offer won't really be meaningful).
More than a few things are conspiring against the price too. Aside from the specter of tariffs, the high price of all kinds of RAM right now, and the fact that this is a 12nm product rather than 7nm, it looks to me like the large and relatively monolithic nature of the RT core itself is preventing nV from salvaging/binning more dies. With the CUDA/tensor cores I'd imagine they build in some redundant units so they can salvage the SM even if there are minor flaws in the functional units, but since there's only 1 RT core per SM, any flaw there means the whole SM is out. That explains why the 2080 is based on the same GPU as the Ti, and why the 2070 is the only card based on the GPU that would normally serve the xx70 and xx80 SKUs. It's possible they might be holding onto the dies with too many flawed RT cores to repurpose them for the AI market, but that would compete with existing products.
gglaw - Saturday, September 22, 2018 - link
Is there a graph error for BF1 99th percentile at 4K resolution? The 2080 Ti FE is at 90, and the 2080 Ti (non-Founders) is at 68. How is it possible to have this gigantic difference when in almost all other benchmarks and games they are neck and neck?
vandamick - Saturday, September 22, 2018 - link
A GTX 980 user. Would the RTX 2080 be a big upgrade? Or should I stick with the 1080 Ti that I had earlier planned? My upgrade cycle is about 3 years.
Inteli - Saturday, September 22, 2018 - link
Are you willing to pay the extra $100-200 for a 2080 over a 1080 Ti for the same performance in current games, in exchange for the new Turing features (ray tracing and DLSS)?
I'm not convinced yet that the 2080 will be able to run ray-traced games at acceptable frame rates, but it is "more future-proof" for the extra money you pay.
mapesdhs - Thursday, September 27, 2018 - link
Thing is, for the features you're talking about, the 2080 _is not fast enough_ at using them. I don't understand why more people aren't taking this on board. NVIDIA's own demos show this to be the case, at least for RT. DLSS is more interesting, but the 2080 has less RAM. Until games exist that make decent use of these new features, buying into this tech now, when it's at such a gimped low level, is unwise. He's far better off with the 1080 Ti; that'll run his existing games faster straight away.
AshlayW - Saturday, September 22, 2018 - link
They've completely priced me out of the entire RTX series lol. My budget ends at £400 and even that is pushing it :(
dustwalker13 - Sunday, September 23, 2018 - link
Way too expensive. If these cards were the same price as the 1080/Ti, it would be a generational change.
This is actually a regression; the price has increased out of all proportion, even if you were to completely ignore that newer generations are EXPECTED to be a lot faster than the older ones for essentially the same price (plus a bit of inflation max).
Paying 50% more for a 30% increase is a simple ripoff. No one should buy these cards... sadly a lot of people will let themselves get ripped off once more by nvidia.
And no: ray tracing is not an argument here. This feature is not supported anywhere, and by the time it is adopted (if ever) years will have gone by and these cards will be old and obsolete. All of this is just marketing and hot air.
mapesdhs - Thursday, September 27, 2018 - link
There are those who claim buying RTX to get the new features is sensible for future proofing; I don't understand why they ignore that the performance of said features on these cards is so poor. NVIDIA spent years pushing gamers into high frequency monitors, 4K and VR; now they're trying to flip everyone round the other way and pretend that sub-60Hz 1080p is OK.
And btw, it's a lot more than 50%. Where I am (UK) the 2080 Ti is almost 100% more than the 1080 Ti launch price. It's also crazy that the 2080 Ti does not have more RAM; it should have had at least 16GB. My guess is NVIDIA knows that by pushing gamers back down to lower resolutions, they don't need so much VRAM. People though have gotten used to the newer display tech, and those who've adopted high refresh displays physically cannot go back.
milkod2001 - Monday, September 24, 2018 - link
I wonder if NV didn't hike prices so much on purpose so they don't have to lower existing GTX prices as much. There is still a ton of GTX stock out there.
poohbear - Friday, September 28, 2018 - link
Why the tame conclusion? Do us a favor, will you? Just come out and say you'd have to be mental to pay these prices for features that aren't even available yet, and that you'd be better off buying previous-gen video cards that are currently heavily discounted.
Luke212 - Wednesday, October 24, 2018 - link
Why does the 2080ti not use the tensor cores for GEMM? I have not seen any benchmark anywhere showing it working. It would be a good news story if Nvidia secretly gimped the tensor cores.
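On the tensor-core GEMM question: the cores are not used automatically by an ordinary FP32 GEMM; they have to be targeted explicitly, either through cuBLAS with tensor-op math enabled or through CUDA's WMMA API (CUDA 9+, compute capability 7.0+), which may be part of why general benchmarks rarely show them at work. Below is a minimal, hedged sketch of a single 16x16x16 tensor-core multiply-accumulate; the kernel name and launch setup are illustrative assumptions, not a real GEMM benchmark.

#include <mma.h>
#include <cuda_fp16.h>
using namespace nvcuda;

// One warp computes a single 16x16 tile: D = A (FP16) * B (FP16), accumulated in
// FP32 on the tensor cores. Compile with something like: nvcc -arch=sm_75 wmma_demo.cu
__global__ void wmma_16x16x16(const half* a, const half* b, float* d)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> aFrag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> bFrag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> accFrag;

    wmma::fill_fragment(accFrag, 0.0f);           // start from a zero accumulator
    wmma::load_matrix_sync(aFrag, a, 16);         // leading dimension 16
    wmma::load_matrix_sync(bFrag, b, 16);
    wmma::mma_sync(accFrag, aFrag, bFrag, accFrag);   // the tensor-core operation
    wmma::store_matrix_sync(d, accFrag, 16, wmma::mem_row_major);
}

// Launched with exactly one warp, e.g.: wmma_16x16x16<<<1, 32>>>(dA, dB, dD);
// A full GEMM tiles the output matrix across many warps and loops over the K dimension.

Note the FP16 inputs with FP32 accumulation: that is what Volta/Turing tensor cores accept, so a strict FP32 GEMM won't touch them unless the library is allowed to down-convert, which is one plausible reason the commenter hasn't seen them show up in GEMM numbers.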