The top picture on page 1 should have included the 1080 Ti for comparison. Then we would get a better idea of how well it will play Crysis, or any other game, compared with its upcoming replacement.
Great question! There was a lot of smoke and mirrors in Nvidia's presentation, and the whole bit about them having to come up with a new benchmark struck me as a tiny bit suspect. Chances are all 3 cards will be incredibly fast, but until I see benchmarks from 3rd party testers that I trust (that would be AT!), my money is staying in my pocket.
TDP is creeping up too. It looks like the RTX 2070 is taking the place of the GTX 1080 in terms of heat and power, while the RTX 2080 and the Ti variant are reaching much higher. All that extra hardware needs to be fed, and NVIDIA is still on a brute-force approach to performance that is disappointing to say the least. Here's to the hopefully 30W 2030 that'll be out in a year or so... meh.
I am curious how they managed to rework the cooling to achieve such a massive (claimed) noise reduction at full speed. I'm guessing it's not just due to the added fan... Maybe though. This card looks great.
Previous Founders Edition cards were typically around 10 dB louder than the best of the third-party cards, so they probably mean 5x quieter based on dB values. Most cards are around 35-50 dB at full load, and there's ZERO chance any of these cards are 10 dB at full speed LOL, so they probably mean something like a 10-15 dB reduction and are just using logarithmic-scale marketing speak.
Assuming they're talking about 5x lower sound intensity, that'd be about a 7 dB reduction, which sounds about right; a 2-fan design's still going to have to spin a bit faster than a 3-fan one for a given level of cooling.
You are confusing intensity with loudness. Intensity doubling corresponds to each 3 dB increase. Loudness doubling (perceived, human hearing) is at each 6-10 dB increase (it depends on the SPL starting level, frequency and complexity of the sound; it is non-linear as a function of frequency, see equal loudness contours). Most often perceived loudness doubling is said to happen at 10 dB differentials, but that is just a rough rule of thumb. Current psychoacoustic research points in the direction that it is more often closer to a 6 dB difference than 10 dB. Again, 3 dB refers to intensity doubling, not human loudness perception.
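For anyone who wants to check that arithmetic, here is a minimal sketch of the decibel conversions involved, taking the "5x" figure at face value; the perceived-loudness note at the end is only the rule of thumb described above, not a formula.

```python
import math

# dB arithmetic behind the "5x quieter" claim, taken at face value.
# Intensity (power) ratios use 10*log10; sound pressure ratios use 20*log10.
def db_from_intensity_ratio(ratio):
    return 10 * math.log10(ratio)

def db_from_pressure_ratio(ratio):
    return 20 * math.log10(ratio)

print(db_from_intensity_ratio(5))  # ~7 dB for a 5x drop in intensity
print(db_from_pressure_ratio(5))   # ~14 dB for a 5x drop in sound pressure
print(db_from_intensity_ratio(2))  # ~3 dB per intensity doubling/halving
# Perceived loudness halving is usually placed around a 6-10 dB drop,
# which is a psychoacoustic rule of thumb rather than a formula.
```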
They did mention it's a vapor chamber design with two massive ultra-quiet blades spinning... I wouldn't be surprised if it's 5x quieter than the previous crappy FE cards.
They've priced the 2070 at the 1080's price, the 2080 at the 1080 Ti's price and the 2080 Ti at the Titan Xp's price. The Titan V meanwhile has taken on the Quadro's price and the Turing Quadros are coming close to the Tesla cards. Rip-offs all around.
Or it's just a shift in naming, if the performance matches their price points. If the 2070 beats a 1080 consistently by a decent margin, why wouldn't it take its spot?
Reviews and benchmarks will tell us if they are worth the money being asked, or not.
Jensen Huang made it very clear in his presentation that he is targeting the future while departing from the past, which is why his company dedicated almost half the die space of Turing to tensor and RT cores. So you need to define exactly what you mean by "consistently and by a decent margin". Do you mean past games that support no tensor cores, no RT cores and can make no use of the INT8 units of the shaders or of variable rate shading?
If so, no, the 2070 will not beat the 1080 consistently and by a decent margin. It will either beat it barely or be a bit behind it. Or do you mean upcoming games (where the priorities will be balanced between rendering quality and fps)? Then yes, the 2070 will "beat" the 1080 consistently, but not necessarily in fps. Some games could offload AI and physics to the tensor cores and thus allow the shaders to raise the fps, as Huang showed, but not all games, and I don't think those games will also be able to support ray tracing. It will be a case of "either/or" apparently, and the choice will be left to developers and game studios.
In all other games that will support ray tracing, the focus will be on approaching photo-realistic lighting, reflections, refractions, shadows etc. while retaining roughly the same fps rate. However, quality is not easily measurable like fps, which arguably represents quantity. This is why Huang (or his marketing team) coined the term "RTX OPS", so that every aspect is taken into account.
Therefore you need to clarify the kind of "beats" you refer to: merely fps, or much higher photorealism?
p.s. By "while retaining roughly the same fps rate" I did not mean the same as the previous generation, of course, I meant roughly the same fps rate Turing would have if it lacked tensor and RT cores.
"dedicated almost half the die space of Turing to tensor and RT cores"
Where'd you get or how did you decide on this?
Guesstimating by transistor counts of the Quadro cards, I don't think NVIDIA has dedicated half the die area to the RT cores. Perhaps 1/3, maybe a bit less.
As far as Tensor Cores, it's hard to estimate, but I believe they are integrated into the SMs and, furthermore, I think they are created by a rewiring of the existing execution units in the SMs. Guesstimating by comparing the transistor counts of Volta and Pascal, they don't seem to add that many more transistors to the architecture. So I am sure they cost a lot in terms of engineering cost and design flexibility, but they don't seem to cost that much in terms of die area.
Now how do we make sense of it? The GP100 has 6 GPCs, each containing 10 SMs. I have no idea what's what there, and I certainly have no idea how to relate it to the picture NVIDIA showed supposedly showing the Turing chip. It looks like a different type of photograph. Maybe one is metal and one is polysilicon, I dunno. So if you understand it, where are the RT cores and how do you know?
A jack of all trades and a master of none? That may be one type of future.
We have long been sold a lot of wasted die area for Intel's integrated graphics. Sometimes the future holds unwarranted profits when there isn't adequate competition.
RT cores are practically ASIC SIPs. How is that a jack of all trades, master of none? It's targeted specifically at rendering graphics, so it is actually the opposite.
And how does a company get profits by increasing both the cost to produce and the market price of their products? That makes no sense to me economically. Regardless of the "adequacy" of the competition, they will sell more at a lower price and have better margins with goods that are cheaper to produce. Only in special situations would a company be able to shoehorn in something that customers don't really want and get larger margins for it. That isn't the situation here with NVIDIA. They are instituting a technology transition, dedicating significant die area to a new technology, not to a useless technology. And, according to their most recent financial conference call, they expect gross margins to be down the next quarter, not up, despite the higher MSRPs of these new parts. So they don't seem to be trying to squeeze any more out of consumers, these are just more expensive parts to produce.
As far as the Tensor Cores, once they are developed for the compute market, I don't think they add much to the manufacturing cost of a gaming-oriented GPU. Plus, they are vital to the viability of the ray tracing methods. Without them, real-time ray tracing would still be several years out.
I hope you are aware that, with your logic, the third best consumer card from nvidia in 2025 would be around 1500 dollars.
We must expect next generation products to have better performance per dollar. Otherwise we just increase price at the same rate as performance does. Yeah... it's that obvious.
Actually, if you assume that each generation the SKU model number goes up, so each model's "successor" is the one from the grade "below" it, you see fairly constant pricing and a constant upward scale in performance.
Compare 780 to 970 to 1060, for example, or 780Ti to 980 to 1070.
The 970 is 6% faster than the 780. The 1060 3GB is 5% faster than the 970. In what world is that a "constant upward scale in performance"? I think you've drunk far too much of the Intel Kool-Aid if 5-6% performance gains are considered good.
Following that logic we'll be at $10K consumer GPUs in no time. Nvidia will release 7nm Turing and oh look, the 3070 performs just as well as the 2080, so according to you that's good enough to charge insane prices. Oh, and the 4070 will perform just as well as the 3080, so yay, another price hike! I hope you see the pattern by now.
Eventually Nvidia will be forced to stop, simply because they've fleeced every penny from PC gamers. Oh well, I'm sure they'll just throw some money at the devs and their GimpWorks program and tighten their grip on AIBs to ensure no competitor can challenge them.
Looking at listings on Newegg, it looks like all the initial cards are using the same set of outputs as the Founders Edition, and presumably are a reference layout. I wonder if custom boards will add DVI back again, or if this will be the end of the road for old 2560x1600 and 1440p Korean monitors that only support DVI in. ($80-100 for a flaky adapter is a really bad deal, and the cheaper ones won't work unless the monitor does something non-standard and rarely done.)
Those ones only do 1080p60 in standards-compliant mode. There are a few that claim 2560x1440/1600 at 60 Hz, but they only work with a very tiny subset of monitors that can take an ~100% overclock (to HDMI 1.4-equivalent speed) on DVI; this is limited to a handful of the last DVI-only displays that had an HDMI-capable panel controller but didn't put an HDMI port on to save a few cents on the BoM.
Hah, couple of weeks? Try 3 months. That's when we'll finally see custom cards available to ship. Even then, you're likely to play the waiting game as everyone rushes to buy the EVGA and Asus cards.
Nvidia certainly does know how to milk people. They've got people paying the $200 Founders Edition tax and people believing that they'll actually pay MSRP. Oh, that's rich. MSRP is a joke; expect to be milked for the full Founders Edition price until Nvidia is forced otherwise.
Remember the 10 years of Intel CPUs being the only choice? Welcome to the GPU version of that.
This is exactly what is wrong with AMD competing against Nvidia in the high-end GPU market. People only want AMD to compete to drive down Nvidia prices so that they buy the Nvidia cards anyway, and AMD gets stuck with the bill without being compensated for R&D.
AMD should not waste resources on competing in the high end GPU market and mostly focus on giving the mainstream GPU users (that actually care about both price and performance) the products that they want and deserve.
The only way AMD can offer enough competitive pressure to influence NVidia's pricing is to offer similar levels of performance per mm^2 of GPU die. If they need 2x as much chip to match performance, nvidia can lift prices across the entire product line, matching price/perf with their competition while inflating the price of high-end cards and laughing all the way to the bank. Meanwhile, if AMD is reasonably competitive density-wise, there's no reason they can't make cards that are able to compete most if not all the way up the stack.
Not to entirely defend their prices (I agree it is higher than I want to pay), but the launch is compared to the 8800 for very good reason. The 8800, with its introduction of CUDA, was also an inflation-adjusted $1000 at launch... and also useless. New tech is expensive, and not great. RTX is not going to work at 60fps 1080p... the demos were showing something closer to 720p 30fps. People who buy these chips have 4k monitors and will pretty much never use RTX (except for rendering video projects), just as nobody used CUDA for much of anything when it was introduced. This is the same thing. Fast forward a few years, you see the launch of the 580, for an inflation-adjusted $500, and CUDA was actually powerful enough to be useful. We will see a similar curve here. Lots of R&D to pay for, lots of kinks to work out, lots more cores that need to be added, and eventually it will all be affordable and usable, even on a 4k display.
That said, how much do you want to bet that if AMD had a competing card, nVidia would be selling similar hardware without the tensor cores and ray tracing capabilities, but still fantastic gaming cards, for $600? It would be great to see some competition, but sadly I bet we will see pressure from Intel's dGPUs before we see anything high end from AMD (other than a few limited-release cards that nobody can actually purchase).
Then he'd be wrong if he meant the Ultra, since the 8800 GTX is the equivalent card to the 2080. Also, the 8 series launched with the 8800 GTX, not the Ultra, which launched 6 months later simply because of the lack of proper competition from AMD, in order to maximize profits.
He never said 2080, either, so why is he wrong? The 2080 doesn't cost $1,000. One can argue that the 2080 Ti is the 8800 Ultra coming out 6 months earlier, for which it actually deserves a price premium. As technology performance increases, the value of current technology goes down over time, so something coming out now is more valuable than something coming out 6 months from now.
I am not following your reasoning. You mentioned the 2080 and the 2080 Ti. You compared the 2080 to the 1080. Why does that preclude him from being able to compare the 2080 to the 8800 GTX and the 2080 Ti to the 8800 Ultra?
The 2080 Ti can be compared to 1080 Ti but not to 8800 Ultra? Why? Because you said so? The 1080 Ti does not match his "new technology" thesis. If you're gonna refute his thesis you need to work with it, not just ignore it.
As far as supporting his thesis, I think a better argument is the increase in die size of the GPUs in question and the fact that in NVIDIA's latest earnings conference call they projected that gross margins would go down next quarter, not up. That suggests that, assuming they sell a significant number of these new cards next quarter, these increased prices do seem to have a correlation to increased manufacturing costs, and not just "NVIDIA raising prices because they can". If they were raising prices "because they can" (which every company is trying to do!) then their margins should be increasing.
... because at first I was comparing 2080 to 1080 and he replied to that.
8800 Ultra was a massively overpriced card because of no competition. It didn't offer anything besides a bit higher clock speeds. It simply existed to make more money. If that new technology's cost was that crippling, nvidia would've priced 8800 GTX more.
Thinking that these cards are this expensive ONLY because of their new technologies is naive. I don't doubt they could sell these at Pascal level prices and still make a lot of profit. They'd have simply recuperated their expenses a bit later.
These new technologies might be playing a role in pricing but based on nvidia's track record it's safe to say that the biggest reason here is lack of competition.
P.S. I have nothing against nvidia and might even upgrade to a 2070 from my 1060.
Products in general simply exist to make money. Of course someone can develop or champion something for another reason, but if it isn't fulfilling a desire in the market, in which case it can make money, then it's bound to fail.
"Thinking that these cards are this expensive ONLY because of their new technologies is naive."
No it's not, as shown by the gross margins argument I have made.
"These new technologies might be playing a role in pricing but based on nvidia's track record it's safe to say that the biggest reason here is lack of competition."
That simply isn't true. Based on the track record, prices have not gone up despite NVIDIA's lack of competition in the PC gaming space. It's just something people repeat. In fact what has happened is that a greater number of people have bought the higher-priced cards than in previous generations. Why? Because graphics cards have become more and more important to the performance of gaming systems relative to the other components of the system. There is also less of a need to spend extra money on the system for uses besides gaming. Spreadsheets, word processors, compression tools, maintenance utilities, etc. used to tax the hardware, relative to its maximum performance, much more than they do now. Therefore the money saved on the other uses for the system can be re-allocated towards the graphics card to improve the gaming experience.
The point was that the 8800 Ultra was a mere cash grab. It didn't have to be that expensive or even exist, but nvidia did it since they knew AMD couldn't respond and that there were enough people to sell these cards to. Easy money. Nothing inherently wrong with that though.
Margins are expected to go down but it doesn't mean they couldn't have priced the cards lower. Yes, they made a lot of investments which had an effect on profit, but there is no reason to think that lower prices would've resulted in losses. They still would've made the money back but at a bit later date.
"The point was that 8800 UItra was a mere cash grab. It didn't have to be that expensive or even exist, but nvidia did it since they knew AMD can't respond and that there are enough people to sell these cards to. Easy money. Nothing inherently wrong with that tough."
Why do any graphics cards have to exist? Why can't NVIDIA raise chickens instead? Huang can be a long-haul trucker. The only way a product "has to be a certain price" is if a company can't possibly sell it for less without going out of business, I guess. If a company is pricing their products that way then they are in big, big trouble. So of course it was a "cash grab" priced to maximize profits. That's how products are priced! It's ubiquitous. It isn't special to the 8800 Ultra. And since it is ubiquitous the argument cannot possibly be used to refute a claim that the 8800 Ultra cost more because of new technology. It's entirely possible that the only reason people were willing to pay that much money for the card was because of its combination of performance and new technology. You have to realize that companies don't just set the price and then that's it. A company must always obey the market with their prices, otherwise they will lose money in comparison to obeying the market. It's the market that sets the price.
"Margins are expected to go down but it doesn't mean they couldn't have priced the cards lower."
Yes, sure, margins could have gone down even more. Margins could even go to zero. They could sell the card at cost for a while before they tank their business. What's your point? We are arguing about whether they are "squeezing consumers" more, not about whether they could be "squeezing consumers" less!
"Prices DID go up after fermi."
No, the GTX 480 debuted at $499. 499 2010 dollars is 577 2018 dollars. Besides, AMD was still competitive during the Kepler days. The GTX 780 debuted at 649 2013 dollars, which is 702 2018 dollars. That's actually right where the RTX 2080 is debuting. Maxwell was when AMD was least competitive, and the GTX 980 debuted at 549 2014 dollars, which is 584 2018 dollars. So the price went down significantly, right in line with the GTX 480 debut price. This even though NVIDIA had much less competition with Maxwell and the GPU was more important to the total gaming performance of the system in 2014 compared to 2010.
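For anyone who wants to reproduce those conversions, here is a minimal sketch; the year-to-2018 multipliers are simply the ones implied by the figures quoted above, not official CPI data.

```python
# Back-of-envelope inflation adjustment behind the launch-price comparison above.
# The year-to-2018 multipliers are rough values implied by the quoted figures,
# not official CPI data.
multipliers_to_2018 = {2010: 577 / 499, 2013: 702 / 649, 2014: 584 / 549}

def to_2018_dollars(price, year):
    return price * multipliers_to_2018[year]

launches = [("GTX 480", 499, 2010), ("GTX 780", 649, 2013), ("GTX 980", 549, 2014)]
for name, price, year in launches:
    print(f"{name}: ${price} in {year} is about ${to_2018_dollars(price, year):.0f} in 2018 dollars")
```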
What I think is true is that with more AMD competition in the past prices dropped sooner after the debut. You can't use the Pascal generation for that comparison though, because it was a unique situation due to the crypto-currency craze and skyrocketing DRAM prices.
Oh, one last comment. There's another reason prices dropped faster in the past and that's because the time between generations/refreshes was shorter. But I do think that AMD competition was also a factor.
No, a business prices their stuff based on how much they can get from buyers. They can make a card that costs $200 to make, including R&D and everything else, and sell it for $1000 if they know 1) there is no competition and 2) there are enough people willing to pay for it. That's overpricing, and they still can and will do it.
We don't know how much these cards cost but I VERY much doubt they are on the edge of losing money per card. I have no proof, obviously, but I suspect they could drop the 2080 Ti to $800, or even less, and still make a lot of profit. Do you think they would've still gone with such high prices if AMD was able to respond properly?
Missed the point. Fermis were cheap (vs. now) because of AMD. The 600 series did not have a big-chip card. The prices of big-chip cards went way up after Fermi, starting with the 780, and stayed that way since AMD could not properly compete.
"No, a business prices their stuff based on how much they can get from buyers."
Yes, exactly. That's what they ALWAYS do. It's not "overpricing", it's correct pricing. They price to maximize their profits.
"We don't know how much these cards cost but I VERY much doubt they are on the edge of losing money per card."
I never said they were on the edge of losing money per card. I can guarantee you they are nowhere near being on the edge of losing money per card. And they shouldn't be anywhere near there. You, as a consumer, shouldn't want them to be there, because if they were then they would have no money for investment and no ability to absorb any sort of recession or market downturn. But it's all irrelevant. The point we are supposed to be discussing in this thread is whether NVIDIA is making MORE money on the Turing cards than on cards in the same market segment in the past. That is what you seemed to claim and that is what I and that other guy are arguing against. And I have given you evidence that no, they are not.
"Do you think they would've still gone with such high prices if AMD was able to respond properly."
Yes, because AMD would have also gone with such high prices if they could respond properly.
"Missed the point. Fermis were cheap (vs. now) because of AMD."
Fermis were not cheap. I demonstrated that. In fact I looked even deeper since then and found that the Tesla-based GTX 280 launched in 2008 for $650. That's $760 in today's money, which is $60 more than the RTX 2080 is launching. AMD was competitive at that time.
What has happened is that at times when AMD wasn't competitive and hardly anyone was buying their cards they lowered the prices of their GPUs to minimize their losses. NVIDIA responded to maintain their market share. That's less the result of healthy competition and more the result of a desperate company trying not to bleed cash. If AMD has a strong product they too will try to charge as much as they can for it. And that's exactly what they did in the past. And prices overall have not gone up or down in all that time.
"The prices of big-chip cards went way up after fermi, starting with 780, and stayed that way since AMD could not properly compete."
No. The Maxwell cards were among the cheapest, historically. And NVIDIA's market dominance was greatest during the Maxwell period.
... except when there is no competition, it turns into overpricing. They are charging more than the card being replaced in that category, ergo they are overpricing. Simple as that.
No, I didn't claim they are making more money. I'm saying the profit margin is probably high enough that they could cut the price and still make a healthy amount of money without being anywhere near the edge and without getting into any kind of financial problems. I don't want them to barely even out but I also don't want to be price gouged. Why are you even defending this? Do you like being overcharged?
No, if AMD was in proper shape it would've probably ended up like Fermi-era pricing.
You missed the point again and also missed a massive historical incident. You do realize that nvidia cut the 280's price to $500 just a month after launch because, at $650, it was unable to compete with the 4870? There goes that argument.
Fermis ARE among the cheapest cards of the past 14 years.
No, Maxwells are not the cheapest big chips; not even close. That honor goes to the 285, 480 and 580. Fermis were about $570-580, and the 285 about $470. The 980 Ti's launch price is about $690 today.
I just want to add that since I haven't mastered English yet, sometimes it might seem I'm being disrespectful. That's not the case. It's good to have a healthy discussion/argument once in a while.
"It's more expensive because of newer technologies" does hold water. You cannot claim to have refuted a statement simply by refuting one piece of evidence in support of it.
The 8800 GTX cost the same as the 7800 GTX, so it means new technology does not automatically make something more expensive. It's a classic case of "We can charge more, so we will".
Every company operates with "we can charge more, so we do". They are all trying to maximize profits, within the constraints of longer-term planning, of course.
New technology does not "automatically" make something more expensive. But increased die area does make the manufacture of something more expensive, and RT cores seem to result in almost a 1/3 increase in die area. R&D efforts also make something more expensive, and the R&D that went into the Tensor Cores and RT cores was certainly not insignificant. Finally, providing software support for new hardware and incentivizing games developers to make use of the new hardware sooner also costs money. It's pretty clear by looking at the actual facts of the situation instead of making generalized analogies to another situation from over 10 years ago that this new technology actually is more expensive. That conclusion is evidenced by NVIDIA's expected margins for the upcoming financial quarter, which are projected to go down.
Pricing is a complicated thing. Bigger die sizes, increased R&D into the hardware and supporting software, money spent on getting developers to offer some support of the technology before it has a big market share, increased DRAM cost, and finally the re-optimization of the price point because of the higher underlying cost structure of the product all contribute to the increases in price.
With this price, NVIDIA is making a prediction of demand. But we can see from their conference call that in this prediction they do not expect to have higher margins despite the higher prices. So then what do you think is causing the increase in prices other than the things I mentioned above?
RT will be likely used for certain special effects rather than the whole scene in action games. This is similar to how vastly important it was in the marketplace for an ugly old man (Witcher 3) to have luxuriously tessellated locks (to give AMD a nice performance hit and to give people the eye candy experience of said ugly old man with luxurious flowing locks).
This line of thinking is funny because I remember Nvidia fanboys making fun of AMD for years for introducing tech that would likely "never be useful". It's funny how quickly the shoe changes. Regardless it is a good laugh seeing how people will defend their purchases, especially of this scale.
I'm glad you are here to tell us something is overpriced. You would make a terrible CEO. Somehow you've already decided what everything should cost for everyone, despite having no idea how much it costs to run a company or how much it costs to produce something. You know, those little things that don't go away: payroll, insurance, future R&D (which got us to where we are now), pay raises, taxes, etc. Maybe they should sell at a loss and make all those investors unhappy, just to appease your made-up expectations.
I don't run nvidia. I simply buy their stuff and to me, as a customer, this card is overpriced compared to the card it is replacing.
Sell at a loss? You perhaps don't remember the GTX 280. It launched for $650 and a month later nvidia slashed the price to $500 because of the Radeon 4870. Nvidia still reported profit in the following quarters, IINM.
Again, this has very little to do with costs and almost everything to do with the lack of effective competition.
Duopolies aren't good enough to ensure quality competition, either. People continue to forget that. There are too many tech duopolies masquerading as a competitive marketplace.
That goes for politics, too. Two parties aren't enough. They end up being much too similar, which doesn't benefit the public but does benefit narrower interests (i.e. the rich).
The solution to that is to work a little harder and trim some of your costs so you can benefit from compound interest rather than being beholden to it through recurring debt. Upward mobility is doable if you get education, work the system to your benefit, and manage your costs so you can reap the rewards of wealth in a duopoly world rather be subjugated by it and complain to people here about it.
You're missing the fact that corporate desires are not the same as consumer desires. It's fallacious to argue from the position that only one matters and not the other. Corporations and consumers are in a battle to get the most from one another.
I know all that you wrote up there. I'm not saying their desires don't matter. I'm simply not buying the "more expensive because new technology" explanation.
They, as a business, can charge whatever they want and I, as a customer, can say they are overpricing the cards that are direct replacements of the older ones.
If you look at Nvidia's reported quarterly earnings for the past few years, the cost of developing these cards is in line with their other previous generations. There is nothing to justify the price increase other than greed.
People were already tired of the mining prices for GPUs, and the average consumer wasn't willing to put up with $500 or more for x70 series GPUs; only the miners were biting the bullet to try to make their returns with them.
But hey, good goys, look at the deals! $500 x70 series GPUs, but get this, it's Emm Ess Arr Pee!!! Don't you look forward to paying MSRP for video cards?! Nah, Nvidia. $500 was too expensive for what you were getting a year ago, and it's still expensive now for what should effectively be a midrange GPU. And honestly, even if it was $400, I'd still gripe about it being higher than the $379 MSRP of the last gen.
AMD and Intel, please reel these clowns back down to reality and release some good-performing midrange GPUs.
I bet you did the same thing here two years ago, mocking people for being excited about Pascal, declaring it overpriced, saying how much better in both price & performance Vega would be when it got here by the end of 2016.
Yep, that's me! Well, I'm still stuck on my 7950. I've gone back to my indie gaming roots haha. The Radeon 680 or its competitor will probably be my next card in the first half of next year.
I'm not an AMD or nVidia fanboy. I want better prices for better-performing GPUs, and I frankly don't care which camp it's coming from. nVidia's capitalizing on its market dominance and is releasing GPUs with ever-higher price tags, and naturally I want nVidia's two potential competitors to release a home-run product at lower price points.
I would say the same thing if AMD released overpriced GPUs. This has nothing to do with brand wars.
Pascal's 1080 Ti wasn't priced at $1,200, it was $700. Big difference, but I guess that makes me a peasant. We are going to need a super-elite PCMR badge for this guy.
AMD may never release another good gaming GPU. Relying on a duopoly setup for competition is a bad idea, especially when one of the companies is small and competing very heavily in a different market (CPUs in this case).
Why the fixation on the name (x70)? They are offering circa 30% more performance ("greater than Titan Xp") for 100 USD less than Vega 64. That's a great value! And it will force AMD to go way down with their Vega GPUs (or maybe discontinue them without an immediate replacement, as they did with the Radeon Fury). I have no doubt that when AMD release their GPUs, they will do a similar thing to nvidia (i.e. offer better performance for lower cost and force nv to lower prices). But that doesn't change the fact that, until then, the price of the 2070 is great for its performance (obviously, if it is what they claim).
In ray tracing, genius, not in game performance. With the actual die size increase, the frequency and the lithography, you can expect 25-35% max... which places an RTX 2080 at about 10% faster than a Ti. Basically, the usual Nvidia upgrade.
The context was game-specific and it strongly implied otherwise, "genius". If it was about RT, he would not have specified "Titan Xp", because in RT it should easily outperform the Titan V (as claimed with "RTX OPS"), which is way faster.
The RTX 2080 Ti has the same TDP as the GTX 1080 Ti, Titan Xp and Titan V.
However, the RTX 2080 and 2070 have higher TDPs than the previous models, but I think they will probably have better performance per watt than Pascal, and that makes the gap against Vega even bigger.
It's not only powering around the same or more CUDA cores, it also has all the new RT and AI hardware.
Though it's likely it is using even more power than it appears for that; we are just seeing net usage after the improved efficiency of GDDR6 and 12nm is taken into account.
Isn't it also providing power to the VirtualLink VR connector? I thought I read somewhere that is accounted for in the TDP overhead, but the actual usage will be lower depending on what's plugged in.
But the graphics card is more important than the motherboard, SSD, RAM, and CPU when it comes to games performance. Basically, if you have "replacement level" equipment for all those things other than the GPU and you spend a lot on a GPU you are much better off than spreading your money around equally. So of course the cost of the system will tilt more and more towards the graphics card over time (it's not just the GPU, because the on-card DRAM is also very important).
Note, however, that the die area dedicated to the GPU (and the on-board DRAM) has also increased over time. The GeForce 256 seems to have had a 111 or 125 square mm die size. The RTX 2080 Ti has a 754 square mm die size. That's a 6+ times increase. Almost all of that increase comes from more execution units, added memory controllers, etc., and not things like video encoders. Compare the board area dedicated to VRAM on the GeForce 256 with the GTX 1080 Ti; it looks to me like it's a lot more on the GTX 1080 Ti. The Intel Pentium III 700 had a 105 square millimeter die size. The latest Core i7s are about 300 square mm, I think, and that includes an integrated GPU and a bunch of other stuff that wasn't on the Pentium III 700 die (they were part of the chipset on the motherboard at the time) but takes up a significant amount of die area. The die area for the CPU itself hasn't really changed much.
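A quick sanity check of those ratios, using only the die sizes quoted above (the Core i7 figure is the rough ~300 mm^2 number mentioned, which includes the integrated GPU and other non-CPU blocks):

```python
# Ratio check for the die-size comparison above (figures as quoted in the comment).
geforce256_mm2 = 125       # upper end of the 111-125 range mentioned
rtx_2080ti_mm2 = 754
pentium3_700_mm2 = 105
modern_core_i7_mm2 = 300   # rough figure, including iGPU and formerly-chipset blocks

# ~6x growth on the GPU side vs ~2.9x on the CPU side, and much of the CPU-side
# growth is non-CPU blocks, so the CPU-proper area has changed far less.
print(f"GPU die growth: ~{rtx_2080ti_mm2 / geforce256_mm2:.1f}x")
print(f"CPU die growth: ~{modern_core_i7_mm2 / pentium3_700_mm2:.1f}x")
```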
RT, Tensor, and other stuff that may not be particularly relevant to gamers during the majority of the lifetime of the GPU chip takes up a lot of space, looking at diagrams I've seen in the past.
Die size isn't the whole story. If it were, the Fury X would have been a big hit.
We are discussing the trend in the pricing of GPUs relative to other components. Since this trend has been a long-term trend that has not reversed we can conclude that the GPU is providing more and more of the value of the gaming systems.
And regardless of how well a particular GPU is executed, larger die sizes incur higher costs and so demand higher prices. Fury X was not executed well. It had a memory capacity limit that, while probably not detrimental to performance, turned people off. It didn't offer a performance advantage over cards based on NVIDIA chips that were cheaper to produce. Maybe the Turing RTX line will be a similar failure. But, again, we are talking about the general trend of GPUs taking up more and more of the cost of gaming systems. The Turing RTX fits that trend, and so it is not an outlier.
Let me know how it is fielding a baseball team without a 2nd baseman, a shortstop, a catcher, a pitcher, etc, if Mike Trout is more valuable than any of them :P
Turing is definitely consumer-grade Volta. In every second generation, NVidia had the compute card which added high-performance FP64 cores plus somewhat increased resources per core. In the Fermi generation it was the CC 2.0 architecture, in the Kepler generation we had CC 3.7, and in the Pascal gen it was CC 6.0. So it seems that Volta and Turing are the same thing, but Turing, as usual, reduces FP64 resources. There is a possibility that it drops even more resources compared to Volta, e.g. fewer registers or less shared memory per SM, or more ALUs per SM (which effectively reduces registers/shared memory per ALU). By no means can it increase anything compared to Volta, including bandwidth!! Moreover, Volta PR said that bandwidth increased 4x, so the 2x increase stated here may really mean 2x less compared to Volta :D
But overall, they don't have the resources to produce two different architectures, so most probably it's just Volta minus FP64 plus RTX. In particular, the area (and number of transistors) per ALU is pretty close to that ratio in Volta, and this is perfectly explained by replacing FP64 cores with RT cores, so overall SM area hasn't changed much.
Nvidia benefits from having near-monopoly power. Back when AMD released a consumer GPU with a hardware tessellator, it couldn't even get Microsoft to add support for that in DirectX.
NVIDIA does not have near-monopoly power. Who exactly are they pressuring with their market share, and how? Don't confuse having a dominant market share with "monopoly power". They are two different things. The first is simply a result of economics, the second is illegal coercion.
In fact, NVIDIA only has a dominant share in PC gaming. But PC gaming and console gaming are a shared space as far as most large games developers are concerned. AMD accounts for a greater number of GPUs that are used to play games than NVIDIA does.
Microsoft is leading its own push to get real time ray tracing in graphics. I doubt Microsoft cared too much about a hardware tessellator. AMD is part of this push as well, by the way. NVIDIA is just first. When AMD comes out with their "look at the amazing thrills of ray tracing" presentation a year or whatever later, most of these people in this forum dumping on NVIDIA will be saying "boo-ya!"
Near monopoly power? 80% of all GPUs sold are Intel... I think as an "enthusiast" people sometimes fail to see the whole picture and forget that they are only looking at the smallest section of the AIB GPU market: the bloodiest of the bleeding edge.
A few ranks down and AMD are very much present. Not to mention the pretty much dominant position of Intel with an IGP in pretty much everything and even AMD with their Vega IGP in Ryzen.
But yes, they can currently command an enviable premium from their top ranked products...
And we know full well that plenty of sheep, er... customers... will be encouraging Nvidia to inflict the hideous market phenomenon that is "blind pre-purchasing" on us, when it should be shot in the cradle.
@jwcalla -- Are you sure? You mean by bringing brand new technology to consumer GPUs that have the potential to change the way top end games will look in significant ways? Hmm...
Nvidia must really be pissing off its partners. First it was the GeForce Partner Program fiasco stopping them from selling rival cards. Now with the Founders Edition they have a month of exclusivity, charge higher prices -AND- have a card with proper cooling. It looks like this will be the first generation where the Founders cards are as good as the board partners'. The partners have lost all their advantages!
It would be interesting to hear off the record how nvidia's partners are feeling right about now
The people who invent the card have every right to opening sales. The board partners will get plenty of business, they just won't get the uber fanbois who pre-order without benchmarks.
Internet shopping makes partners less and less relevant with time. Nvidia could drop all of them and still dominate AMD. However, there are clearly benefits to having them, like getting a lot more review site hype (with all the dribbling out of 3rd-party designs over time).
It's Nvidia's blind pre-purchase payment that is really pissing me off. The fact that fanboiz are willing to empty their wallets of $1k before a single review has even been written means that this sort of prepayment will start even earlier next year...
... and before you know it, we'll be pledging for "perks" for the next generation of GPUs instead of purchasing.
Pricing issues aside (although, IMO, I'm not too shocked or put out by the prices; obviously lower is better, but I'm not angry or losing sleep), I'm surprised that there is less enthusiasm on the forums for the first GPU to highlight real-time ray tracing as a fully supported feature set designed to be used in game at playable frame rates. That's pretty revolutionary. Ever since I started following PC graphics (dual Voodoo2 SLI 8MBs were my first GPUs, still remember them fondly), real-time ray tracing was considered the holy grail of rendering - the milestone that was always talked about as "you'll know we've arrived at nirvana when it comes."
Well, it's here! And, for all the talk about Nvidia's next gen of cards over the last 6 months, with the exception of the week since the Quadro announcement at SIGGRAPH, it's been kept pretty much under wraps. As of a month ago I would not have guessed that the new gen of cards would have such a focus on ray tracing, so the news is somewhat surprising and, from a pure techno-nerd standpoint, I think it's awesome! Not saying that as a Nvidia fanboy (I've owned many cards from many different OEMs over the years), but just as an avid gamer and technology enthusiast, this is a pretty seminal moment. Regardless of the price point it's at today, this tech is going to filter out into all of Nvidia's (and presumably AMD's) product stack over the next 1-2 years, and that's extremely exciting news! Assuming AMD has been active with MS in developing DXR, their next gen of GPUs should be supporting ray tracing as well, which means there is a decent chance that a DXR/ray tracing feature set is included in the next gen of consoles due in the next couple of years. This is really ground-breaking tech - 6 months ago the common wisdom was that "real-time ray tracing" was still years away, yet it's launching Sept 20th.
Granted, it's a hybrid approach, but given the 20+ year investment in rasterization 3D modeling, and the fact that every game currently released and in development is designed for a rasterized pipeline, it's not surprising, and frankly that's probably the smartest way to deploy it - rasterization has many good qualities, and if you can improve one of its biggest weaknesses (lighting) through a hybrid approach with ray tracing, that is the best of both worlds.
Anyway, I'm rambling. As a long-time gamer I'm very excited to see some reviews of these cards and how devs integrate ray tracing into their engines. Great time to be a gamer! Great job Nvidia! Say what you will about the price of the cards, that's a (relatively) short-term phenomenon, while the direction this is pushing the industry in will be felt over the next decade, if not beyond.
Very well said HammerStrike! I'm an old-school gamer too, grew up on the Voodoo 2 SLIs w/ Riva 128. Those were the days! Nvidia's keynote today really hit it right out of the park! It is propelling us into the next decade of gaming graphics. I'm truly excited and can't wait to pick up a new card!
I think you’re not seeing more enthusiasm for real-time ray-tracing because we won’t see any AAA titles that take advantage of it until the average GPU supports it. And that’s not going to happen when the price point is $1200. It’s not going to happen until there are new Xbox and PlayStation consoles that have ray-tracing capabilities. And by the time that happens the 20** series will be obsolete. It’s exciting that they’re going in this direction, but these cards will never actually take us there.
They literally showed 20+ upcoming games, including 2 near-term triple-A launches (granted, it's unknown if those games will support RTX at launch). BFV and Tomb Raider looked insanely good.
I'm with you - reading these comments is depressing. Gamers and tech lovers are some salty people of late - a jaded, skeptical lot here at AT! I know it's fair to be somewhat reserved, but the IQ increase of ray tracing looked like more than a gimmick - it fundamentally changes how real-time games look, at least judging by the demos. I know most people here want it, they are just pissed they can't afford it, so they want to dump on the capability here to justify their anger. I get it.
I decided to preorder the 2080. I'm all in on this... assuming increased performance over the 1080 for "old rendered" games (from 8 to 10 TFLOPS?) - and hopefully it can run BFV at 60 FPS with RTX on. I know I'll be buying 3 of the RTX-enabled games anyway. The way lighting and shadow worked in Shadow of the Tomb Raider was impressive. BFV looked like offline renders.
You cannot put pricing aside. In order to get useful ray tracing performance at 1080p one will need a 2080 Ti. From that I can conclude that the RTX 2070 and 2080 will not be powerful enough to run real-time ray tracing in games. I think for a gamer, spending $1200 just to get realistic reflections and shadows is ridiculous. BTW, there was no mention of VR in the keynote. What VR needed was more rendering power for less money, so these cards probably spell the end of VR on PC.
1. I never said don't take pricing as a point of data, I just said there is a LOT more about the RTX series announcement then just the price, which is what the majority of comments seemed focused on. Real time ray tracing is the holy grail of rendering - the fact that it's showing up in any form in 2018 is pretty exciting.
2. Where are you getting your info that you'll need a 2080 Ti to run "useful" frames at 1080p with RT enabled? While I agree that we should all reserve a BUYING decision until trusted reviews are out there, this is just pure conjecture that is apparently driven by your disappointment / saltiness / anger over the price of the cards.
3. We could argue that RT, even as it's implemented in the RTX series, is a lot more than just "realistic shadows", but that's beside the point. Regardless of what aspects of the image quality are improved, that's the whole reason to upgrade a GPU right now. If image quality is of minor or no importance, you can drop your IQ settings to low and get 60+ FPS on a GTX 1050 Ti right now for $150. I get confused with all this "it's not useful, it just makes the image quality better" line of argument. People weren't drooling over the Cyberpunk 2077 demo at E3 because it was running at a super high frame rate.
4. While I don't disagree with your point on VR, I wasn't really debating that. However, there still is the question of what Nvidia will position for their cards that cost less than $500. Regardless of what they are pricing the 2070+ cards at, my (semi) educated guess is that 95%+ of GPUs sold are sub-$500. I'd bet that 80%+ are less than $300. While we can talk about "market" pricing, and whether Nvidia is good or evil for pricing these cards where they did, it doesn't change the fact that the mainstream of the market is at $300 or less, and no amount of halo product / pricing is going to change that. Say what you will of Nvidia, they are not stupid, and I'd be surprised if they didn't refresh their product stack in that price range, and I'd be equally surprised if they didn't provide a meaningful performance boost in comparable price tiers.
It's also worth noting that the dies on the RTX series are FREAKING HUGE. There's lots of consternation about "price inflation" based on model series from the 7th gen to the 9th to the 10th to the 20th, but at the end of the day those are just names. The primary driver of the cost of production in any semiconductor is the size of the die. The 2080 Ti has a die size of ~757mm^2, whereas the 1080 Ti has a 471mm^2 die. That's basically a 60% increase in die size. Even assuming the yields are the same, they still use 300mm wafers, and the cost of manufacturing is per wafer, so everything else being equal their production costs went up by 60%. DRAM prices are all higher than they were 2 years ago. If you take the $700 price of the 1080 Ti at launch, add an additional 60% to it and throw in $30-$50 more in GDDR pricing, and you have a card in the $1200 range (rough arithmetic sketched below).
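For reference, a minimal sketch of that back-of-envelope arithmetic; the assumption that cost scales linearly with die area, and the $30-$50 GDDR delta, are the rough premises stated above rather than actual NVIDIA or TSMC figures.

```python
# Rough sketch of the die-size cost argument above; the assumption that cost
# scales linearly with die area (same yields, same wafer cost) is illustrative only.
die_1080ti_mm2 = 471
die_2080ti_mm2 = 757
launch_price_1080ti = 700
extra_gddr_cost_range = (30, 50)   # assumed extra memory cost in dollars

area_scaling = die_2080ti_mm2 / die_1080ti_mm2          # ~1.6x
low = launch_price_1080ti * area_scaling + extra_gddr_cost_range[0]
high = launch_price_1080ti * area_scaling + extra_gddr_cost_range[1]

print(f"die area scaling: {area_scaling:.2f}x")           # ~1.61x
print(f"naive price estimate: ${low:.0f} - ${high:.0f}")  # roughly $1155 - $1175
# ...which lands in the same ballpark as the $1,200 RTX 2080 Ti launch price.
```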
Not saying that makes the pricing "ok", but to suggest that there's no reason other than "greed" (I know you didn't make that specific argument, but it's pretty common on the forums) is pretty short-sighted.
This is a freakin' monster chip with revolutionary tech that is enabling, for the first time, the holy grail of rendering tech: real-time ray tracing. Whether or not blowing out the silicon die area to support that, and charging the prices that necessitates, turns out to be a good business decision remains to be seen. But from a pure tech standpoint it's pretty awesome. It would be like Ford announcing a Mustang with a 1500 HP engine in it that costs $150K. Maybe there is enough of a market there, maybe there isn't, but I'm not going to get so hung up on the price that I don't appreciate the pure ridiculousness of what they made. Same here.
Can't wait to see some real benchmarks on the 2080... My 1080 is so close to being good enough to do 4k games, but if this offers some substantial 4k performance I may be able to justify the 2080... after I sell my 1080... and if the 2080 is on sale... maybe...
Yeah, my 1080 Ti is pretty decent at 4k mostly, but there are a few times I would like just a bit more power to turn settings up. Still not sure I'm willing to put down $1k to do it (I'd go for the Ti), at least not this year.
Just optimize your graphics settings; there's no point in trying to stick to everything "ultra/maxed out" when you damn well know there are plenty of near-useless effects that will chop performance by 30-40% while requiring you to stare at a picture comparison for 2 minutes to spot the differences.
*The main offenders are high-quality+ shadows and post-processing.
No AA for me at 4k... pointless... and motion blur is the antichrist... Whoever thought THAT would be a good idea? HQ shadows can kill performance, so I usually turn those down a bit... I also find that medium AO is good enough for me, so that I can run my favourite games on my Oculus pegged at 90 FPS or my 4k screen at 60 FPS.
NVidia and its partners have a load of GTX stock to sell, hence NVidia is not jeopardizing those sales and keeps the next generation at higher-tier prices. Maybe in 6-8 months, once their 1060 and 1070 chips deplete, they will introduce something mid-market in the 2xxx series. So, an expected and smart marketing move, though it will probably not generate volume sales and a user base, and that translates to zero interest for dev support - hence NVidia gives away RTX cards to developers now. This tech may draw some attention in a year, or maybe in 2020, but NVidia's problem meanwhile may very well be ray tracing standardization between MS and AMD, and eventually those standards being included in next-generation consoles, which would exclude NVidia from the gaming market. NVidia is walking on very thin ice here; I think they will lose on popularity and volume - the usual indicators of financial performance expectations are stock prices, and they were not very bright today.
This. I see no reason why they should sell these cards cheaper. I'm surprised they aren't more expensive. They really have no competition at the moment for their 10 series cards, which they allegedly still have a glut of. Essentially they just kept their 10 series tier viable longer with these prices. Smart. Also, I'm sure they are planning price breaks for whenever AMD releases their next platform. It's easy to lower MSRPs, impossible to raise them after the fact.
Sure, prices are higher for next-level tech, but look how much real estate has gone up in Silicon Valley in the last 5 years. Where will engineers live and sleep? People have families, and all the NIMBYs have driven prices higher than vid cardz due to limited availability and Prop 13 causing investment bubbles in real estate. It's a domino effect that leads to higher-priced products to cover higher costs.
The dollar has been printed like toilet paper thanks to QE2 and the debt hitting 20T. They still need to pay TSMC overseas for the chips, which could be tariffed in the future thanks to Trump's tariff war with all his frenz. Everything needs to be accounted for; when you're not getting illegal labor from Mexico or slave labor from China and demand livable wages, things go up. Look at prices for an iPhone.
Look at these vid cards as an investment; you can mine ETH with them. Huge demand by Venezuelans for currency and Greeks that ain't paying taxes.
I hate pre-orders without benchmarks. It's not as bad as the Ryzen launch where we had no idea how it would perform. We can sort of guess how Turing cards will perform because of similarities to Pascal and Volta, but not well enough to determine if the prices offer good value or not.
I hear you there. I actually draw a hard line at 500. Not because I can't afford it but because I can't justify it for my use-case. It kinda sucks for me this go around because I prefer visual quality over performance, so I really appreciate the upgrade with ray tracing.
Back in the heat of the mining craze, amid the rampant rumors of nvidia's next offerings, I realized they would most likely be priced much higher due to current market conditions and lack of competition. I occasionally checked Nvidia for availability, got lucky, and picked up a 1070 Ti FE for MSRP a couple of months back. (I know, lucky me, right?) I'm okay with it though, because it's going to be a while for prices to drop and the card performs beautifully for my needs. I'll just sell it for whatever it sells for and pick up the newest card around 500 next year. Or wait until the next generation, if ray tracing really gains traction, to be ahead of the curve... At least ahead of the 500 dollar curve.
I'm going to wait for the next generation. I'm guessing that by then ray tracing will have caught on and I can get a card that costs about $400 and is powerful enough to take good advantage of it.
I don't mind the RTX Ops. Only time will tell how useful they are. It could be that there is a strong correlation between how games perform and these RTX Ops. If they are useful they will stick around, otherwise they'll fall by the wayside. They will only really be useful for comparing NVIDIA cards to each other, but NVIDIA is the only one with cards with these capabilities out at the moment anyway. You can't create a metric that targets cards that don't exist yet.
Way too expensive for 95% of the market. And practically near-zero game support (just a few titles with quickly slapped-on water reflections and softer shadows, which will kill performance in FPS shooters anyway). And no foreseeable console support for a few years into the future.
Time to wait for new mfg nodes, 2nd/3rd gen RTX cards and see what AMD/Intel can do.
At these prices nVidia's sales and stock price deserve to take a plunge.
Only FPS shooters matter. I don't even see why they worry about things like global illumination. And near-zero game support at the launch of a new technology is a non-starter. If games developers aren't spending their time supporting features that don't exist then what the hell are they doing?
That is some salty jealous commentary man. Also - it doesn't seem like you can base your assessment of RTX performance on the history of PCSS impacts on older games.
I doubt it. When has retail price been significantly more than MSRP other than during the crypto craze? And when was the last time NVIDIA instituted massive price cuts in response to anything AMD did?
Hey, good idea. But I don't think crypto will get as hot as before for a while, if ever. It was caused by a world-wide speculation craze. Many of those people in the craze got bitten by the sharp price declines. Another craze is much less likely to happen. As far as I can see, there is no good reason the currency should be $1,000,000 a coin or $100,000 a coin or $1,000 a coin or whatever. The coins are used for very little at the moment. Mostly they are just tools for speculation. So if there isn't a mass influx of speculators, like there was until recently, there isn't a good reason for the coins to be increasing in value relative to the dollar.
The bubble built throughout 2017 and then burst in 2018. Bitcoin was up to a $327B market cap and now it's down to $116B. Ethereum built up to a $135B market cap and now it's $30B. Some people made a lot of money, but those were the ones who sold when it was spiking (or who had been holding the currency for years, and there are far fewer of those than the number who became involved in 2017). People who bought when it was spiking have lost a lot of money. If the volume charts are to be trusted, there was very little volume until the middle of 2017. That's the craze I am talking about. From the middle of 2017 until the end of 2017.
Review embargo? How does that work? That means someone already knows how the cards perform compared to the previous ones. Excluding ray tracing, what kind of fps increase do we get on old titles vs. previous cards? That's what's missing! People get so blinded by the RTX hype that they can't see anything else and drop $1200 on a card. lol
Yeah I think it's crazy to pre-order. But people can do what they want with their money.
Well, most people aren't even under review embargoes yet, I guess, because they probably haven't received any review samples, yet. I think they usually get the samples about a week before the embargo lifts. The idea of the embargo is so reviewers have time to put together high-quality reviews. Otherwise sites would rush out their reviews to get the clicks from being first.
If the review embargo lifts September 13th (I have no idea if it will), then maybe sites would get their samples around September 6th. Until then, very few people outside NVIDIA and their AIB partners will have benchmarks on the cards.
I'm guessing laptops will not see this RTX feature until next gen. I'm guessing we will hear about an 11-series incremental bump to mobile chips in the winter. I've heard some rumors to this effect even for desktop cards. I.e., no RTX 2060, but rather just an 11-series bump that keeps the GTX moniker. With the die size and heat/performance requirements for RTX Turing chips, that sounds like a lot to squeeze into an 18 mm laptop.
I am guessing that if NVIDIA comes out with a non-RTX 1060 that the GPU it is based on will contain the other improvements made to Turing other than the RT cores, with the RT cores disabled or removed. So it will be a true generational increase in games performance. Maybe that's what you mean by an "11-series incremental bump" but your chosen words seem to have the connotation of disappointment or marginalization to me. I, myself, wouldn't use that phrase for a 40% boost in performance. I believe the GPUs would also have variable rate shading, independent integer units, Tensor Cores, etc., as well.
Now that would be nice... An 11 series GTX which keeps the 14G memory bandwidth but just removes the RT garb... er... overhead and gives a nice 25% performance jump.
The big weakness is die area due to these extra compute units. Huge die, lots of memory, stupidly high margins, and PC gaming is starting to become a luxury.
PC gaming was always a luxury. The Apple IIGS was introduced at 2,230 2017 dollars, not including a monitor or any floppy drives. A color monitor, a 5.25-inch floppy drive, and a 3.5-inch floppy drive cost another 2,672 2017 dollars, for a grand total of 4,902 2017 dollars.
It's just that more of the cost of the gaming system has shifted to the graphics card from the other components of the system because the graphics card has become relatively more important to the performance of the system. PC gaming is still cheaper now than before.
You're joking, right? Imagine how many 10 series cards are going to flood the second hand market in the run up to Christmas... Hell, I might pick another 1080ti on the cheap for some SLI action.
That's a mobile core with 150 GFLOPS of FP32 and probably a memory bandwidth of about 20 GB/s shared between the GPU and CPU. Also, as eddman pointed out, the power consumption you pointed out is only for the SoC and doesn't include the RAM. The power efficiency of mobile SoCs is also much greater than that of high powered chips. BTW, 10 billion divided by 100 million is 100. 100 times 2 Watts is 200 Watts. Now with a mind towards the points I made about the differences between SoCs and high powered graphics cards above, the efficiency of that ImgTec device looks like shit to me in comparison. Maybe we will have a better idea when NVIDIA comes out with a Turing-based Tegra.
Nobody cares, since HDMI 2.1-compatible screens and cables are nonexistent. And by the time you can afford 8K these cards will be too slow to drive it.
This makes it a hard pass for me, because my next computer upgrade is destined to be hooked up to a large OLED that supports hdmi 2.1 and VRR when such set becomes available (2019 or 2020).
This looks really bad. We get a decrease in core count and clock speed, with more features, for more money. The GTX 1080 has been around 400 USD for the last few months, while the RTX 2070, which has fewer cores and lower clock speeds, has an MSRP of 500 USD. No performance figures in existing games were presented, so I do not expect much if any improvement in performance, but we do get an increase in price. AMD, get your act together; we need some competition in the GPU market.
I am now more interested in this "future" Nvidia is painting. Is hybrid rendering really that "good"? And at what cost? Are all the features standardised in DirectX? Because at this moment it looks like DirectX is Glide and Nvidia is 3dfx (which they actually acquired, so I can't say this is wrong). Where does AMD stand in this?
Navi was rumoured to be aiming at PS5, hybrid rendering in PS5 as well?
We only have maybe ~4x more transistors to go: 7nm brings 2x the density of 14/12nm, and 3nm brings 2x the density of 7nm. (Those node names are TSMC's versions.) Are we sure this is the most transistor-efficient way to go?
I'm in general agreement with everyone about the price, and I also think the TDP is absurd despite the technology offering impressive features and capabilities. The trouble is that you end up paying a lot for PC hardware only to end up scraping the scraps and leftovers as a second-class gamer, scrounging around several months after the fact for poorly ported console games. What's the point of all that supposedly high end PC gear if the software portion of your hobby is treated like the nearly expired discount rack at the console and mobile grocery store?
Ray tracing is a halo technology; maybe in 5 years some games may have it, but currently it is useless for gaming. Also, lol at gigarays per second, what a silly unit that is??
It's lovely how the crazy fanboys are buying pairs of these cards without real reviews and real performance numbers! Wow, hehehe, nice job from nvidia marketing! With the same number of ROPs, I suspect the 2080 Ti will be about 10 or 15% better than the 1080 Ti, based on the architectural improvements, 12nm, faster memory and a larger bus. Excluding the ray tracing hype, what can these cards do better than the previous ones? What fps increase do we get? Probably not much, with lower clock speeds.
I imagine NVIDIA's break room has a huge poster of Gordon Gekko from the 1987 film Wall Street, delivering his infamous "Greed is good!"
The MSRPs are getting more ridiculous with every new generation. Remember when the flagship used to cost $500 not-so-long-ago? Like the GTX 580. And then NVIDIA decided to name the successor of the GTX 560 Ti GTX 680, and not 660Ti, slapped on a $500 tag, called it a flagship, despite the fact it was based on a mid-sized GPU, all the while purposefully withholding and delaying the actual flagship, which used to come out first. It flew. Still, later on, they were a bit worried whether the original Titan would sell well, because even they didn't think people would embrace a $1000 consumer video card. Yeah... Fun times. The rest is history. Nobody to blame but us, the consumer.
When I saw 80 Ti, 80, and 70 in the title I naively expected to also see $650, $500, and $350. Alas. The only actual break from the established "norm" since Kepler is that consumers can buy the ultra-overpriced card based on the big GPU a few months early. Which is for the better, I suppose, if the prices hadn't jumped across the board as well, and if we were sure a $650-700 version of the big-GPU card was coming.
On a more positive note, judging by the TDPs, the 2080 might actually be based on a cut-down version of the big GPU, like the GTX 570, or the GTX 780 if you will. That would be nice. It would also mean that the 2070 has a fully-fledged mid-sized GPU. I can see how, somewhere in NVIDIA's Gekko cash-crazed mind, the current pricing policy is beneficial to the consumer, since they charged $600 for the fully-fledged mid-sized card last time, and now it's down to $500. All praise the lord.
All in all, I was excited about this new generation but the insane, and unjustified, pricing just makes me want to quit this hobby for good. An Xbox One X can do so much for a console and it costs $500.
It is incredible to see the effect of the cards name on everybody.
They put shockingly higher prices on the cards, but just by naming them x80 Ti / x80 / x70 instead of x80 / x70 / x60 (as in the generation's flagship, the high/mid range at 75% of the flagship, and the mid range at 50% of the flagship), the effect is somehow mitigated, and people just find them "a little overpriced" compared to the previous generation.
I'm pretty sure we will still get a "TI" refresh in a year.
To be fair, it really does seem that the 2080 Ti is based on a proper, large-size GPU, aka "Big Turing", meaning it can be genuinely considered a flagship. Yes, it is a cut-down version of said GPU, but that should not make it any less of a flagship; in fact, this has been standard practice over the years - Fermi launched with a cut-down-GPU flagship, the GTX 480, and later introduced the fully enabled GTX 580; the original Titan was based on a cut-down GPU, then we had the GTX 780 (doubly cut down), and finally the fully enabled GTX 780 Ti. Thing is, those sweet Fermi flagships cost $500 then, and the equivalent of, say, a GTX 980 cost just $250... But, yeah, "hopefully" they release a $700 card based on Big Turing in 2019, cut-down or not.
Otherwise, I'm just as frustrated as you are, as you can tell by my comment just above yours.
Lazlo Panaflex - Monday, August 20, 2018 - link
I bet it'll play Crysis.
Simon_Says - Monday, August 20, 2018 - link
Only 30fps tho.
Oxford Guy - Tuesday, August 21, 2018 - link
The top picture on page 1 should have had the 1080 Ti to compare with. Then, we would get a better idea of how well it will play Crysis or any other game, when compared with its upcoming replacement.Huhschwine - Tuesday, August 21, 2018 - link
I've missed the info about real tests; when will we see it?
colonelclaw - Tuesday, August 21, 2018 - link
Great question! There was a lot of smoke and mirrors in Nvidia's presentation, and the whole bit about them having to come up with a new benchmark struck me as a a tiny bit suspect. Chances are all 3 cards will be incredibly fast, but until I see benchmarks from 3rd party testers that I trust (that would be AT!) then my money is staying in my pocket.piiman - Tuesday, August 21, 2018 - link
Only with RTX OFF
Shadyghost - Monday, August 20, 2018 - link
Yikes! That price though... I do like the redesign. Much closer to the AMD cards.
PeachNCream - Tuesday, August 21, 2018 - link
TDP is creeping up too. It looks like the RTX 2070 is taking the place of the GTX 1080 in terms of heat and power while the RTX 2080 and the Ti variant are reaching much higher. All that extra hardware needs to be fed and NVIDIA is still on a brute force for performance approach that is disappointing to say the least. Here's to the hopefully 30W 2030 that'll be out in a year or so...meh.Hixbot - Tuesday, August 21, 2018 - link
I thought performance per dollar is supposed to improve after 2.5 years?
Shadyghost - Monday, August 20, 2018 - link
I am curious how they managed to rework the cooling to achieve such a massive (claimed) noise reduction at full speed. I'm guessing it's not just due to the added fan... Maybe though. This card looks great.Sancus - Monday, August 20, 2018 - link
Previous founders edition cards were typically around 10db louder than the best of the third party cards. So they probably mean 5x quieter based on db values. Most cards are around 35-50db at full load, and there's ZERO chance any of these cards are 10db at full speed LOL so they probably mean something like 10-15db and are just using the logarithmic scale marketing speak.DanNeely - Monday, August 20, 2018 - link
Assuming they're talking about 5x lower sound pressure, that'd be a 7db reduction, which sounds about right a 2 fan design's still going to have to spin a bit faster than a 3 fan one for a given level of cooling.Santoval - Monday, August 20, 2018 - link
The decibel scale is logarithmic, not linear. Each additional 3dB means twice as loud sound.
halcyon - Tuesday, August 21, 2018 - link
You are confusing intensity with loudness. Intensity doubling is defined at each 3 dB increase.Loudness doubling (perceived, human hearing) is at each 6-10 dB increase (depends on SPL starting level, frequency and complexity of sound, it is non-linear as a function of frequency, see equal loudness contours). Most often perceived loudness doubling is said to bappen at 10dB diferentials, but that is just a rough rule of thumb. Current psychoacoustic research points to the direction that it is more often closer to 6dB difference than 10dB.
Again, 3dB refers to intensity doubling, not human loudness perception.
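For anyone trying to reconcile the different figures in this sub-thread, here is a small illustrative Python sketch (my own addition, not from any commenter) of the conversions involved; the 10 dB "twice as loud" figure is only the rough rule of thumb mentioned above.

import math

def power_ratio_to_db(ratio):
    # intensity/power ratio -> decibels (10 * log10)
    return 10 * math.log10(ratio)

def pressure_ratio_to_db(ratio):
    # sound pressure ratio -> decibels (20 * log10)
    return 20 * math.log10(ratio)

print(power_ratio_to_db(2))      # ~3 dB: doubling of intensity
print(power_ratio_to_db(5))      # ~7 dB: "5x" taken as an intensity ratio
print(pressure_ratio_to_db(5))   # ~14 dB: "5x" taken as a sound pressure ratio
print(power_ratio_to_db(10))     # 10 dB: the rough "perceived twice as loud" rule of thumb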
Byte - Monday, August 20, 2018 - link
It is simple, a total REVOLUTION of their cooler by putting a SECOND fan!!!
LemmingOverlord - Tuesday, August 21, 2018 - link
they did mention it's a vapor chamber design with two massive ultraquiet blades spinning... I wouldn't be surprised it's 5x more silent that the previous crappy FE cardsxboxsupportchat - Sunday, August 26, 2018 - link
See its depend on how it works, that's really fulfillments the thought had been designed the programme as well as its a question in another part of the matter? Need Some more effective on this like as http://xboxsupport.strikingly.com/ - Xbox Support, But do the more effectively work on it need some extraordinary support for this matter.Cellar Door - Monday, August 20, 2018 - link
Simply TOO EXPENSIVE at this point.
r3loaded - Monday, August 20, 2018 - link
They've priced the 2070 at the 1080's price, the 2080 at the 1080 Ti's price and the 2080 Ti at the Titan Xp's price. The Titan V meanwhile has taken on the Quadro's price and the Turing Quadros are coming close to the Tesla cards. Rip-offs all around.nevcairiel - Monday, August 20, 2018 - link
Or its just a shift in naming if the performance matches their price points. If the 2070 beats a 1080 consistently by a decent margin, why wouldn't it take its spot?Review and benchmarks will tell is if they are worth the money being asked, or not.
Lolimaster - Monday, August 20, 2018 - link
A new gen is supposed to take the spot of the older gen model:
1070 --> 2070
1080 --> 2080
SAME PRICE
mkaibear - Tuesday, August 21, 2018 - link
It hasn't for the last 5 years, why would nVidia do that now?
Orange_Swan - Tuesday, August 21, 2018 - link
I always thought of it as:
970 --> 780
1070 --> 980
2080 --> 1080
Last year's power comes down a price point.
Santoval - Tuesday, August 21, 2018 - link
Jensen Huang made very clear in his presentation that he is targeting the future, while departing from the past. Which is why his company dedicated almost half the die space of Turing to tensor and RT cores. So, you need to define exactly what you mean by "consistently and by a decent margin". Do you mean past games that support no tensor cores, no RT cores and can make no use of the INT8 units of the shaders and of variable shading?If so no, the 2070 will not beat the 1080 consistently and by a decent margin. It will either beat it barely or it will be a bit behind it. Do you rather mean upcoming games (where the priorities will be balanced between rendering quality and fps)? Then yes, 2070 will "beat" the 1080 consistently, but not necessarily in fps.
Some games could offload AI and physics to the tensor cores and thus allow the shaders to raise the fps, as Huang showed, but not all games, and I don't think these games will also be able to support ray-tracing. It will be a case of "either or" apparently, and the choice will be left to developers and game studios.
In all other games that will support ray-tracing the focus will be on approaching photo-realistic lighting, reflections, refractions, shadows etc while retaining roughly the same fps rate. However quality is not easily measurable like fps, which arguably represent quantity. This is why Huang (or his marketing team) coined the term "RTX OPS", so that every aspect is taken into account.
Therefore you need to clarify the kind of "beats" you refer to : merely fps or .. much higher photorealism?
Santoval - Tuesday, August 21, 2018 - link
p.s. By "while retaining roughly the same fps rate" I did not mean the same as the previous generation, of course, I meant roughly the same fps rate Turing would have if it lacked tensor and RT cores.Santoval - Tuesday, August 21, 2018 - link
Grr.. edit #2 : not "merely fps or .. much higher photorealism?", but rather "merely fps or fps along with much higher photorealism?"Yojimbo - Tuesday, August 21, 2018 - link
"dedicated almost half the die space of Turing to tensor and RT cores"Where'd you get or how did you decide on this?
Guesstimating by transistor counts of the Quadro cards, I don't think NVIDIA has dedicated half the die area to the RT cores. Perhaps 1/3, maybe a bit less.
As far as Tensor Cores, it's hard to estimate, but I believe they are integrated into the SMs and, furthermore, I think they are created by a rewiring of the existing execution units in the SMs. Guesstimating by comparing the transistor counts of Volta and Pascal, they don't seem to add that many more transistors to the architecture. So I am sure they cost a lot in terms of engineering cost and design flexibility, but they don't seem to cost that much in terms of die area.
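As a rough sanity check on that guesstimate, here is a quick back-of-the-envelope Python sketch using publicly quoted transistor counts and die sizes (the figures below are approximate and are my assumption, not numbers from this article):

# (transistors in billions, die size in mm^2), approximate public figures
chips = {
    "GP102 (Pascal)": (11.8, 471),
    "GV100 (Volta)":  (21.1, 815),
    "TU102 (Turing)": (18.6, 754),
}

for name, (billions, area) in chips.items():
    density = billions * 1000 / area  # millions of transistors per mm^2
    print(f"{name}: {density:.1f} MTr/mm^2")

# The densities land within a few percent of each other, which is why transistor
# counts alone can't say how much area the RT and Tensor cores occupy; you need
# die-shot guesswork on top of them.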
piiman - Tuesday, August 21, 2018 - link
"Where'd you get or how did you decide on this?"LOL You can LOOK at it
Yojimbo - Tuesday, August 21, 2018 - link
Where exactly are the SMs, where exactly are the RT cores, where exactly are the Tensor Cores?Here's a die shot of the GP100: http://cdn.wccftech.com/wp-content/uploads/2016/08...
Now how do we make sense of it? The GP100 has 6 GPCs, each containing 10 SMs. I have no idea what's what there, and I certainly have no idea how to relate it to the picture NVIDIA showed supposedly showing the Turing chip. It looks like a different type of photograph. Maybe one is metal and one is polysilicon, I dunno. So if you understand it, where are the RT cores and how do you know?
Oxford Guy - Tuesday, August 21, 2018 - link
A jack of all trades and a master of none? That may be one type of future.
We have long been sold a lot of wasted die area for Intel's integrated graphics. Sometimes the future holds unwarranted profits when there isn't adequate competition.
Yojimbo - Tuesday, August 21, 2018 - link
RT cores are practically ASIC SIPs. How is that a jack of all trades, master of none? It's targeted specifically at rendering graphics, so it is actually the opposite.And how does a company get profits by increasing both the cost to produce and the market price of their products? That makes no sense to me economically. Regardless of the "adequacy" of the competition, they will sell more at a lower price and have better margins with goods that are cheaper to produce. Only in special situations would a company be able to shoehorn in something that customers don't really want and get larger margins for it. That isn't the situation here with NVIDIA. They are instituting a technology transition, dedicating significant die area to a new technology, not to a useless technology. And, according to their most recent financial conference call, they expect gross margins to be down the next quarter, not up, despite the higher MSRPs of these new parts. So they don't seem to be trying to squeeze any more out of consumers, these are just more expensive parts to produce.
Yojimbo - Tuesday, August 21, 2018 - link
As far as the Tensor Cores, once they are developed for the compute market, I don't think they add much to the manufacturing cost of a gaming-oriented GPU.
Plus, they are vital to the viability of the ray tracing methods. Without them, real-time ray tracing would still be several years out.
SirPerro - Tuesday, August 21, 2018 - link
I hope you are aware that, with your logic, the third best consumer card from nvidia in 2025 would be around 1500 dollars.
We must expect next generation products to have better performance per dollar. Otherwise we just increase price at the same rate as performance does. Yeah... it's that obvious.
mkaibear - Tuesday, August 21, 2018 - link
Actually, if you assume that each generation the SKU model number goes up, so each model's "successor" is the one from the grade "below" it, you see fairly constant pricing and a constant upward scale in performance.
Compare 780 to 970 to 1060, for example, or 780 Ti to 980 to 1070.
eddman - Tuesday, August 21, 2018 - link
No.
elzafir - Tuesday, August 21, 2018 - link
Yes.
evernessince - Friday, August 24, 2018 - link
The 970 is 6% faster than the 780. The 1060 3GB is 5% faster than the 970. In what world is that a "constant upward scale in performance"? I think you've drunk far too much of the Intel Kool-aid if 5-6% performance gains are considered good.
evernessince - Friday, August 24, 2018 - link
Following that logic we'll be at 10K consumer GPUs in no time. Nvidia will release 7nm Turing and oh look, the 3070 performs just as well as the 2080, so according to you that's good enough to charge insane prices. Oh, and the 4070 will perform just as well as the 3080, so yay, another price hike! I hope you see the pattern by now.
Eventually Nvidia will be forced to stop, simply because they've fleeced every penny from PC gamers. Oh well, I'm sure they'll just throw some money at the devs and their GimpWorks program and tighten their grip on AIBs to ensure no competitor can challenge them.
Xorp - Monday, August 20, 2018 - link
For you.
DanNeely - Monday, August 20, 2018 - link
Looking at listings on Newegg, it looks like all the initial cards are using the same set of outputs as the founders edition, and presumably are a reference layout. I wonder if custom boards will add back DVI again, or if this will be the end of the road for old 2560x1600 and 1440p Korean monitors that only support DVI in. ($80-100 for a flaky adapter is a really bad deal; the cheaper ones won't work unless the monitor does something non-standard and rarely done).
peterfares - Monday, August 20, 2018 - link
Used DP to DL-DVI adapters are getting very cheap, I see them for around $25 often now.
ddor - Monday, August 20, 2018 - link
Will it work for my 110 Hz DVI-D Korean panels? They're fighting for their lives here.
DanNeely - Tuesday, August 21, 2018 - link
As buggy as the active ones were, I wouldn't even consider buying one from a place that doesn't have a no-questions-asked return policy.
Nexworks - Monday, August 20, 2018 - link
DP to DVI cables are like $15.
DanNeely - Tuesday, August 21, 2018 - link
Those ones only do 1080p60 in standards compliant mode. There're a few that claim 2560x1440/1600p60; but they only work with a very tiny subset of monitors that can take an ~100% overclock (to HDMI 1.4 equivalent speed) on DVI; this is limited to a handful of the last DVI-only displays that had an HDMI-capable panel controller but didn't put an HDMI port on to save a few cents on the BoM.
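To put some rough numbers behind the bandwidth point above, here is a small illustrative Python estimate of the required pixel clock (the blanking figures are simplified assumptions, so treat the results as ballpark values; single-link DVI tops out at 165 MHz and HDMI 1.4 at 340 MHz):

def approx_pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=40):
    # crude reduced-blanking-style estimate of the pixel clock a mode needs
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

print(approx_pixel_clock_mhz(1920, 1080, 60))  # ~140 MHz, fits single-link DVI (165 MHz)
print(approx_pixel_clock_mhz(2560, 1440, 60))  # ~242 MHz, which is why 1440p60 needs HDMI 1.4-class link speeds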
maroon1 - Monday, August 20, 2018 - link
Very cheap!
RTX 2080 Ti for $999 ONLY. If it beats the Titan V or even the Titan Xp, then it is not a bad deal.
RTX 2080 will probably beat the GTX 1080 Ti, and that's not bad for $699.
gamingkingx - Monday, August 20, 2018 - link
I think something went wrong??
https://image.ibb.co/hEdPAz/priser.png
peterfares - Monday, August 20, 2018 - link
Those are BS "founders edition" pricing for people too impatient to wait a couple weeks.Morawka - Monday, August 20, 2018 - link
hah couple of weeks, try 3 months. That's when we'll finally see custom cards available to ship. Even then, your likely to play the waiting game as everyone rushes to buy the EVGA and Asus cardsGraXXoR - Tuesday, August 21, 2018 - link
I'm holding out for a Zotac AMP Extreme... Have had great luck with them since my first 970 SLI pair back in the day.eva02langley - Tuesday, August 21, 2018 - link
You didn't read the article, did you? This MSRP will never be achieved. As of now, third party cards are retailing for as much or even more.Vega went into the same situation because of HBM2 pricing. The same will occur for GDDR6. Add to this a 760 mm2 die and you got yourself a trap.
piiman - Tuesday, August 21, 2018 - link
Couple of weeks? lol, good luck.
evernessince - Friday, August 24, 2018 - link
Nvidia certainly does know how to milk people. They've got people paying the $200 founders edition tax and people believing that they'll actually pay MSRP. Oh, that's rich. MSRP is a joke; expect to be milked for the full founders edition price until Nvidia is forced otherwise.
Remember the 10 years of Intel CPU's being the only choice? Welcome to the GPU version of that.
Frenetic Pony - Monday, August 20, 2018 - link
It's not a "bad deal" only because AMD is failing still, would probably be hundreds less if the competition could get their act together : /sgeocla - Tuesday, August 21, 2018 - link
This is exactly what is wrong with AMD competing against Nvidia in the high-end GPU market. People only want AMD to compete to drive down Nvidia prices so that they buy the Nvidia cards anyway, and AMD gets stuck with the bill without being compensated for R&D.AMD should not waste resources on competing in the high end GPU market and mostly focus on giving the mainstream GPU users (that actually care about both price and performance) the products that they want and deserve.
DanNeely - Tuesday, August 21, 2018 - link
The only way AMD can offer enough competitive pressure to influence NVidia's pricing is to offer similar levels of performance per mm^2 of GPU die. If they need 2x as much chip to match performance, nvidia can lift prices across the entire product line, matching price/perf with their competition while inflating the price of high end cards and laughing all the way to the bank. Meanwhile, if AMD is reasonably competitive density-wise, there's no reason they can't make cards that are able to compete most if not all the way up the stack.
Impulses - Monday, August 20, 2018 - link
I'm in for an RTX 2080 if it actually beats the 1080 Ti, but if it barely does so then a clearance deal on the latter might be the better deal...
eddman - Monday, August 20, 2018 - link
What kind of reasoning is that? Based on that thought process it'd be OK for the 3080 to cost $1000 since it might match or beat the 2080 Ti.
The 2080 is replacing the 1080, and as a direct replacement it is $100 overpriced. Nvidia is simply raising the price because they can.
P.S. No, inflation is not the driving factor here since 2016 $600 is about $630 now.
CaedenV - Monday, August 20, 2018 - link
Not to entirely defend their prices (I agree they're higher than I want to pay), but the launch is compared to the 8800 for very good reason. The 8800, with its introduction of CUDA, was also an inflation-adjusted $1000 on launch... and also useless. New tech is expensive, and not great. RTX is not going to work at 60fps 1080p... the demos were showing something closer to 720p 30fps. People who buy these chips have 4k monitors and will pretty much never use RTX (except for rendering video projects), just as nobody used CUDA for much of anything when it was introduced. This is the same thing.
Fast forward a few years, you see the launch of the 580, for an inflation adjusted $500, and CUDA was actually powerful enough to be useful. We will see a similar curve here. Lots of R&D to pay for, lots of kinks to work out, lots more cores that need to be added, and eventually it will all be affordable and usable, even on a 4k display.
That said; How much you want to bet that if AMD had a competing card, nVidia would be selling similar hardware without the tensor cores and ray tracing capabilities, but still fantastic gaming cards for $600... would be great to see some competition, but sadly I bet we will see pressure from Intel's dGPUs before we see anything high end from AMD (other than a few limited release cards that nobody can actually purchase)
eddman - Tuesday, August 21, 2018 - link
$1000? The 8800 GTX launched for $600 in 2006, which is about $750 today, and the 580's $500 is about $577 now.
I do get the point about new technologies, R&D, etc. but it is still $100 more expensive than the card it is replacing (ok, $70).
Yojimbo - Tuesday, August 21, 2018 - link
He never said 8800 GTX, he said 8800. There were a lot of 8800 GPUs, and the 8800 Ultra launched for $830 in 2007, which is over 1,000 2018 dollars.
eddman - Tuesday, August 21, 2018 - link
Then he'd be wrong if he meant the Ultra, since the 8800 GTX is the equivalent card to the 2080. Also, the 8 series launched with the 8800 GTX, not the Ultra, which launched 6 months later simply because of lack of proper competition from AMD, in order to maximize profits.
Yojimbo - Tuesday, August 21, 2018 - link
He never said 2080, either, so why is he wrong? The 2080 doesn't cost $1,000. One can argue that the 2080 Ti is the 8800 Ultra coming out 6 months earlier, for which it actually deserves a price premium. As technology performance increases, the value of current technology goes down over time, so something coming out now is more valuable than something coming out 6 months from now.
eddman - Tuesday, August 21, 2018 - link
... because he replied to my comment where I compared the 2080 to the 1080. Isn't that obvious?
No, the 2080 Ti can be compared to the 1080 Ti, 980 Ti, etc. and it is quite a bit more expensive.
Yojimbo - Tuesday, August 21, 2018 - link
I am not following your reasoning. You mentioned the 2080 and the 2080 Ti. You compared the 2080 to the 1080. Why does that preclude him from being able to compare the 2080 to the GTX 8800 and the 2080 Ti to the 8800 Ultra?
The 2080 Ti can be compared to 1080 Ti but not to 8800 Ultra? Why? Because you said so? The 1080 Ti does not match his "new technology" thesis. If you're gonna refute his thesis you need to work with it, not just ignore it.
As far as supporting his thesis, I think a better argument is the increase in die size of the GPUs in question and the fact that in NVIDIA's latest earnings conference call they projected that gross margins would go down next quarter, not up. That suggests that, assuming they sell a significant number of these new cards next quarter, these increased prices do seem to have a correlation to increased manufacturing costs, and not just "NVIDIA raising prices because they can". If they were raising prices "because they can" (which every company is trying to do!) then their margins should be increasing.
eddman - Tuesday, August 21, 2018 - link
... because at first I was comparing the 2080 to the 1080 and he replied to that.
The 8800 Ultra was a massively overpriced card because of no competition. It didn't offer anything besides a bit higher clock speeds. It simply existed to make more money. If that new technology's cost was that crippling, nvidia would've priced the 8800 GTX higher.
Thinking that these cards are this expensive ONLY because of their new technologies is naive. I don't doubt they could sell these at Pascal level prices and still make a lot of profit. They'd have simply recuperated their expenses a bit later.
These new technologies might be playing a role in pricing but based on nvidia's track record it's safe to say that the biggest reason here is lack of competition.
P.S. I have nothing against nvidia and might even upgrade to a 2070 from my 1060.
Yojimbo - Tuesday, August 21, 2018 - link
"It simply existed to make more money."Products in general simply exist to make money. Of course someone can develop or champion something for another reason, but if it isn't fulfilling a desire in the market, in which case it can make money, then it's bound to fail.
"Thinking that these cards are this expensive ONLY because of their new technologies is naive."
No it's not, as shown by the gross margins argument I have made.
"These new technologies might be playing a role in pricing but based on nvidia's track record it's safe to say that the biggest reason here is lack of competition."
That simply isn't true. Based on the track record the prices have not gone up in relation despite NVIDIA's lack of competition in the PC gaming space. It's just something people repeat. In fact what has happened is a greater number of people have bought the higher priced cards than in previous generations. Why? Because the graphics cards have become more and more important to the performance of gaming systems in relation to the other components of the system. There also is less of a need to spend extra money on the system for uses of the systems besides gaming. Spreadsheets, word processors, compression tools, maintenance utilities, etc, used to tax
the hardware relative to its maximum performance much more than they do now. Therefore that money saved on the other uses for the system can be re-allocated towards the graphics card to improve the gaming experience.
eddman - Tuesday, August 21, 2018 - link
The point was that the 8800 Ultra was a mere cash grab. It didn't have to be that expensive or even exist, but nvidia did it since they knew AMD couldn't respond and that there were enough people to sell these cards to. Easy money. Nothing inherently wrong with that though.
Margins are expected to go down, but it doesn't mean they couldn't have priced the cards lower. Yes, they made a lot of investments which had an effect on profit, but there is no reason to think that lower prices would've resulted in losses. They still would've made the money back, but at a bit later date.
Prices DID go up after fermi.
Yojimbo - Tuesday, August 21, 2018 - link
"The point was that 8800 UItra was a mere cash grab. It didn't have to be that expensive or even exist, but nvidia did it since they knew AMD can't respond and that there are enough people to sell these cards to. Easy money. Nothing inherently wrong with that tough."Why do any graphics cards have to exist? Why can't NVIDIA raise chickens instead? Huang can be a long-haul trucker. The only way a product "has to be a certain price" is if a company can't possibly sell it for less without going out of business, I guess. If a company is pricing their products that way then they are in big, big trouble. So of course it was a "cash grab" priced to maximize profits. That's how products are priced! It's ubiquitous. It isn't special to the 8800 Ultra. And since it is ubiquitous the argument cannot possibly be used to refute a claim that the 8800 Ultra cost more because of new technology. It's entirely possible that the only reason people were willing to pay that much money for the card was because of its combination of performance and new technology. You have to realize that companies don't just set the price and then that's it. A company must always obey the market with their prices, otherwise they will lose money in comparison to obeying the market. It's the market that sets the price.
"Margins are expected to go down but it doesn't mean they couldn't have priced the cards lower."
Yes sure, margins could have gone down even more. Margins could even go to zero. They could sell the card at cost for a while before they tank their business. What's your point? We are arguing about whether they are "squeezing consumers" more not whether they are NOT "squeezing consumers" less!
"Prices DID go up after fermi."
No, the GTX 480 debuted at $499. 499 2010 dollars is 577 2018 dollars. Besides, AMD was still competitive during the Kepler days. The GTX 780 debuted at 649 2013 dollars, which is 702 2018 dollars. That's actually right where the GTX 2080 is debuting. Maxwell was when AMD was least competitive and the GTX 980 debuted at 549 2014 dollars which is 584 2018 dollars. So the price went down significantly, right in line with the GTX 480 debut price. This even though NVIDIA had much less competition with Maxwell and the GPU was more important to the total gaming performance of the system in 2014 compared to 2010.
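For anyone who wants to check those conversions, a minimal Python sketch using approximate annual CPI-U averages (the CPI values below are rough assumptions of my own, not figures quoted by anyone in this thread):

# approximate US CPI-U annual averages, used only for illustration
CPI = {2010: 218.1, 2013: 233.0, 2014: 236.7, 2018: 251.1}

def in_2018_dollars(price, year):
    # scale a historical launch price by the ratio of CPI levels
    return price * CPI[2018] / CPI[year]

print(in_2018_dollars(499, 2010))  # GTX 480: roughly $575
print(in_2018_dollars(649, 2013))  # GTX 780: roughly $699
print(in_2018_dollars(549, 2014))  # GTX 980: roughly $582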
Yojimbo - Tuesday, August 21, 2018 - link
What I think is true is that with more AMD competition in the past, prices dropped after the debut. You can't use the Pascal generation for that comparison though, because it was a unique situation due to the crypto-currency craze and skyrocketing DRAM prices.
Yojimbo - Tuesday, August 21, 2018 - link
EDIT:
What I think is true is that with more AMD competition in the past, prices dropped sooner after the debut. You can't use the Pascal generation for that comparison though, because it was a unique situation due to the crypto-currency craze and skyrocketing DRAM prices.
Yojimbo - Tuesday, August 21, 2018 - link
Oh, one last comment. There's another reason prices dropped faster in the past, and that's because the time between generations/refreshes was shorter. But I do think that AMD competition was also a factor.
eddman - Tuesday, August 21, 2018 - link
When Pascal cards launched, the crypto-currency craze was yet to begin. Their MSRP was not affected by it. The craze began a few months later.
eddman - Tuesday, August 21, 2018 - link
No, a business prices their stuff based on how much they can get from buyers. They can make a card that costs $200 to make, including R&D and else, and sell it for $1000 if they know 1) there is no competition and 2) there are enough people willing to pay for it. That's overpricing and they still can and will do it.We don't know how much these cards cost but I VERY much doubt they are on the edge of losing money per card. I have no proof, obviously, but I suspect they could drop 2080 Ti to $800, or even less, and still make a lot of profit. Do you think they would've still gone with such high prices if AMD was able to respond properly.
Missed the point. Fermis were cheap (vs. now) because of AMD. 600 series did not have a big-chip card. The prices of big-chip cards went way up after fermi, starting with 780, and stayed that way since AMD could not properly compete.
Yojimbo - Tuesday, August 21, 2018 - link
"No, a business prices their stuff based on how much they can get from buyers."Yes, exactly. That's what they ALWAYS do. It's not "overpricing", it's correct pricing. They price to maximize their profits.
"We don't know how much these cards cost but I VERY much doubt they are on the edge of losing money per card."
I never said they were on the edge of losing money per card.I can guarantee you they are no where near being on the edge of losing money per card. And they shouldn't be anywhere near there. You, as a consumer, shouldn't want them to be there, because if they were then they would have no money for investment and no ability to absorb any sort of recession or market downturn. But it's all irrelevant. The point we are supposed to be discussing in this thread is whether NVIDIA is making MORE money on the Turing cards than on cards in the same market segment in the past. That is what you seemed to claim and that is what I and that other guy are arguing against. And I have given you evidence that no, they are not.
"Do you think they would've still gone with such high prices if AMD was able to respond properly."
Yes, because AMD would have also gone with such high prices if they could respond properly.
"Missed the point. Fermis were cheap (vs. now) because of AMD."
Fermis were not cheap. I demonstrated that. In fact I looked even deeper since then and found that the Tesla-based GTX 280 launched in 2008 for $650. That's $760 in today's money, which is $60 more than the RTX 2080 is launching. AMD was competitive at that time.
What has happened is that at times when AMD wasn't competitive and hardly anyone was buying their cards they lowered the prices of their GPUs to minimize their losses. NVIDIA responded to maintain their market share. That's less the result of healthy competition and more the result of a desperate company trying not to bleed cash. If AMD has a strong product they too will try to charge as much as they can for it. And that's exactly what they did in the past. And prices overall have not gone up or down in all that time.
"The prices of big-chip cards went way up after fermi, starting with 780, and stayed that way since AMD could not properly compete."
No. The Maxwell cards were among the cheapest, historically. And NVIDIA's market dominance was greatest during the Maxwell period.
Anyway, I'm out. Thanks for the conversation.
eddman - Tuesday, August 21, 2018 - link
... except when there is no competition, it turns into overpricing. They are charging more than the card being replaced in that category, ergo they are overpricing. Simple as that.
No, I didn't claim they are making more money. I'm saying the profit margin is probably high enough that they could cut the price and still make a healthy amount of money without being anywhere near the edge and without getting into any kind of financial problems. I don't want them to barely break even, but I also don't want to be price gouged. Why are you even defending this? Do you like being overcharged?
No, if AMD was in proper shape it would've probably ended up like fermi era pricing.
You missed the point again and also missed a massive historical incident. You do realize that nvidia cut 280's price to $500 just a month after launch because it was unable to compete with 4870 at $650? There goes that argument.
Fermis ARE among the cheapest cards in the past 14 years.
No, Maxwells are not the cheapest big-chips; not even close. That honor goes to the 285, 480 and 580. Fermis were about $570-580, and the 285 about $470. The 980 Ti's launch price is about $690 today.
You said you were out and then came back.
eddman - Tuesday, August 21, 2018 - link
I just want to add that since I haven't mastered English yet, sometimes it might seem I'm being disrespectful. That's not the case. It's good to have a healthy discussion/argument once in a while.
eddman - Tuesday, August 21, 2018 - link
*No edit button*
The 8800 GTX launched for the same $600 as the 7800 GTX, so the "it is more expensive because of newer technologies" argument does not hold water.
Yojimbo - Tuesday, August 21, 2018 - link
"It's more expensive because of newer technologies" does hold water. You cannot claim to have refuted a statement simply by refuting one piece of evidence in support of it.Yojimbo - Tuesday, August 21, 2018 - link
I meant to say "...simply by refuting one piece of evidence provided in support of it."eddman - Tuesday, August 21, 2018 - link
8800 GTX cost the same as 7800 GTX, so it means new technology does not automatically make something more expensive. It's a classic case of "We can charge more, so we would".Also, GTX 280.
Yojimbo - Tuesday, August 21, 2018 - link
Every company operates with "we can charge more, so we do". They are all trying to maximize profits, within the constraints of longer-term planning, of course.New technology does not "automatically" make something more expensive. But increased die area does make the manufacture of something more expensive, and RT cores seem to result in almost a 1/3 increase in die area. R&D efforts also make something more expensive, and the R&D that went into the Tensor Cores and RT cores was certainly not insignificant. Finally, providing software support for new hardware and incentivizing games developers to make use of the new hardware sooner also costs money. It's pretty clear by looking at the actual facts of the situation instead of making generalized analogies to another situation from over 10 years ago that this new technology actually is more expensive. That conclusion is evidenced by NVIDIA's expected margins for the upcoming financial quarter, which are projected to go down.
eddman - Tuesday, August 21, 2018 - link
I very much doubt the bigger dies and new technologies are the main reason for the 16% and 42%(!) increase in pricing.
Yojimbo - Tuesday, August 21, 2018 - link
Pricing is a complicated thing. Bigger die sizes, increased R&D into the hardware and supporting software, money spent on getting developers to offer some support of the technology before it has a big market share, increased DRAM cost, and finally the re-optimization of the price point because of the higher underlying cost structure of the product all contribute to the increases in price.
With this price, NVIDIA is making a prediction of demand. But we can see from their conference call that in this prediction they do not expect to have higher margins despite the higher prices. So then what do you think is causing the increase in prices other than the things I mentioned above?
eddman - Tuesday, August 21, 2018 - link
... doesn't mean there wasn't enough room for lower prices. As mentioned, they simply want to make back their money sooner.
Yojimbo - Tuesday, August 21, 2018 - link
"... doesn't mean there wasn't enough room for lower prices. As mentioned, they simply want to make back their money sooner."
Of course there's room for lower prices. It's not relevant. That isn't what we are arguing about.
eddman - Tuesday, August 21, 2018 - link
Then you've misunderstood.
Oxford Guy - Tuesday, August 21, 2018 - link
RT will likely be used for certain special effects rather than the whole scene in action games. This is similar to how vastly important it was in the marketplace for an ugly old man (Witcher 3) to have luxuriously tessellated locks (to give AMD a nice performance hit and to give people the eye candy experience of said ugly old man with luxurious flowing locks).
evernessince - Friday, August 24, 2018 - link
This line of thinking is funny because I remember Nvidia fanboys making fun of AMD for years for introducing tech that would likely "never be useful". It's funny how quickly the shoe changes. Regardless, it is a good laugh seeing how people will defend their purchases, especially of this scale.
Dug - Monday, August 20, 2018 - link
I'm glad you are here to tell us something is overpriced. You would make a terrible CEO.
Somehow you've already decided what everything should cost for everyone, despite having no idea how much it costs to run a company or how much it costs to produce something. You know, those little things like payroll and insurance that don't go away. Future R&D (which got us to where we are now), insurance, pay raises, taxes, etc. Maybe they should sell at a loss to keep all those investors unhappy, just to appease your made up expectations.
eddman - Tuesday, August 21, 2018 - link
I don't run nvidia. I simply buy their stuff, and to me, as a customer, this card is overpriced compared to the card it is replacing.
Sell at a loss? You perhaps don't remember the GTX 280. It launched for $650 and a month later nvidia slashed the price to $500 because of the Radeon 4870. Nvidia still reported profit in the following quarters, IINM.
Again, this has very little to do with costs and almost everything to do with the lack of effective competition.
Oxford Guy - Tuesday, August 21, 2018 - link
Duopolies aren't good enough to ensure quality competition, either. People continue to forget that. There are too many tech duopolies masquerading as a competitive marketplace.
Oxford Guy - Tuesday, August 21, 2018 - link
That goes for politics, too. Two parties aren't enough. They end up being much too similar, which doesn't benefit the public but does benefit narrower interests (i.e. the rich).
PeachNCream - Tuesday, August 21, 2018 - link
The solution to that is to work a little harder and trim some of your costs so you can benefit from compound interest rather than being beholden to it through recurring debt. Upward mobility is doable if you get education, work the system to your benefit, and manage your costs so you can reap the rewards of wealth in a duopoly world rather than be subjugated by it and complain to people here about it.
Oxford Guy - Tuesday, August 21, 2018 - link
"The solution to that is to work a little harder"
This is the bootstraps fantasy and there is more than enough hard data to refute it.
Oxford Guy - Tuesday, August 21, 2018 - link
You're missing the fact that corporate desires are not the same as consumer desires. It's fallacious to argue from the position that only one matters and not the other. Corporations and consumers are in a battle to get the most from one another.
eddman - Tuesday, August 21, 2018 - link
I know all that you wrote up there. I'm not saying their desires don't matter. I'm simply not buying the "more expensive because new technology" explanation.
They, as a business, can charge whatever they want, and I, as a customer, can say they are overpricing the cards that are direct replacements of the older ones.
evernessince - Friday, August 24, 2018 - link
If you look at Nvidia's reported quarterly earnings for the past few years, the cost of developing these cards is in line with their other previous generations. There is nothing to justify the price increase other than greed.
piiman - Tuesday, August 21, 2018 - link
They will only be $999.00 if they don't sell and/or the 10 series is finally sold out. But they may have just kept the 10 series alive.
JoeyJoJo123 - Monday, August 20, 2018 - link
People were already tired of the mining prices for GPUs, and the average consumer wasn't willing to put up with $500 or more for x70 series GPUs; only the miners were biting the bullet to try to make their returns with them.
But hey, look at the deals! $500 x70 series GPUs, but get this, it's Emm Ess Arr Pee!!! Don't you look forward to paying MSRP for video cards?! Nah, Nvidia. $500 was too expensive for what you were getting a year ago, and it's still expensive now for what should be effectively a midrange GPU. And honestly, even if it was $400, I'd still gripe about it being higher than the $379 MSRP of the last gen.
AMD and Intel, please reel in these clowns back down to reality and release some good performing midrange GPUs.
MadManMark - Monday, August 20, 2018 - link
I bet you did the same thing here two years ago, mocking people for being excited about Pascal, declaring it overpriced, saying how much better in both price & performance Vega would be when it got here by the end of 2016.
Well, how did that work out for you?
ianmills - Monday, August 20, 2018 - link
Yep, that's me! Well, I'm still stuck on my 7950. I've gone back to my indie gaming roots haha. A Radeon 680 or its competitor will probably be my next card in the first half of next year.
JoeyJoJo123 - Tuesday, August 21, 2018 - link
I'm not an AMD or nVidia fanboy. I want better prices for better performing GPUs, and I frankly don't care which camp they come from. nVidia is capitalizing on its market dominance and is releasing GPUs with ever higher price tags, and naturally I want nVidia's two potential competitors to release a homerun product at lower price points.
I would say the same thing if AMD released overpriced GPUs. This has nothing to do with brand wars.
evernessince - Friday, August 24, 2018 - link
Pascal didn't price the 1080 Ti at $1,200, it was $700. Big difference, but I guess that makes me a peasant. We are going to need a super elite PCMR badge for this guy.
kron123456789 - Monday, August 20, 2018 - link
"AMD and Intel, please reel in these clowns back down to reality and release some good performing midrange GPUs."— They will. Somewhere in 2020.
Oxford Guy - Tuesday, August 21, 2018 - link
AMD may never release another good gaming GPU. Relying on a duopoly setup for competition is a bad idea, especially when one of the companies is small and competing very heavily in a different market (CPUs in this case).
qap - Monday, August 20, 2018 - link
Why the fixation on the name (x70)? They are offering circa 30% more performance ("greater than Titan Xp") for 100 USD less than Vega 64. That's great value! And it will force AMD to go way down with their Vega GPUs (or maybe discontinue them without an immediate replacement, as they did with the Radeon Fury).
I have no doubt that when AMD releases their GPUs, they will do a similar thing to nvidia (e.g. offer better performance for lower cost and force nv to lower prices). But that doesn't change the fact that, until then, the price of the 2070 is great for its performance (obviously, if it is what they claim).
sorten - Tuesday, August 21, 2018 - link
Where are you getting your performance numbers (30% more ...)? AT doesn't even have a full spec sheet yet.
qap - Tuesday, August 21, 2018 - link
From the presentation - it was said that the 2070 will be faster than the Titan Xp. The Titan Xp is ~30% faster than the 1080 and Vega 64.
PeachNCream - Tuesday, August 21, 2018 - link
You may want to wait until there are benchmarks that prove those claims are factual.
qap - Tuesday, August 21, 2018 - link
As I wrote in the original comment...
eva02langley - Tuesday, August 21, 2018 - link
In ray tracing, genius, not in game performance. With the actual die size increase, the frequency and the lithography, you can expect 25-35% max... which places the 2080 RTX at about 10% faster than a Ti. Basically, the usual Nvidia upgrade.
qap - Tuesday, August 21, 2018 - link
The context was game specific and it strongly implied otherwise, "genius". If it was in RT, he would not specify "Titan XP", because in RT it should easily outperform the Titan V (as claimed with "RTX OPS"), which is way faster.
evernessince - Friday, August 24, 2018 - link
/facepalm
Getting your performance numbers from Nvidia....
nuff said.
Rev_C - Monday, August 20, 2018 - link
That's cute. How about ETH hashrate? :D
LuckyGyunsun - Monday, August 20, 2018 - link
That power consumption tho...
Exactly what Vega was criticised for.
But I guess NV managed to get away with this thanks to the insane horsepower.
maroon1 - Monday, August 20, 2018 - link
The RTX 2080 Ti has the same TDP as the GTX 1080 Ti & Titan Xp, and the Titan V.
However, the RTX 2080 and 2070 have higher TDPs than the previous models, but I think they will probably have better performance per watt than Pascal, and that makes the gap against Vega even bigger.
MadManMark - Monday, August 20, 2018 - link
It's not only powering around the same or more CUDA cores, it also has all the new RT and AI hardware.
Though it's likely it is using even more power than it appears for that; we are just seeing net usage after the improved efficiency of GDDR6 and 12nm is taken into account.
Shadyghost - Monday, August 20, 2018 - link
Isn't it also providing power to the universal VR link too? I thought I read somewhere that is accounted for in the overhead, but the actual usage will be lower depending on what's being used.
BurnItDwn - Monday, August 20, 2018 - link
Video card pricing is insane now, even after prices have come down a little bit after the e-coin crashes.
Anand's article on the GeForce DDR shows the appropriate response to GPUs that cost more than the motherboard + SSD + RAM + CPU combined.
https://www.anandtech.com/show/429
Yojimbo - Tuesday, August 21, 2018 - link
But the graphics card is more important than the motherboard, SSD, RAM, and CPU when it comes to games performance. Basically, if you have "replacement level" equipment for all those things other than the GPU and you spend a lot on a GPU you are much better off than spreading your money around equally. So of course the cost of the system will tilt more and more towards the graphics card over time (it's not just the GPU, because the on-card DRAM is also very important).Note, however, that the die area dedicated to the GPU (and the on-board DRAM) has also increased over time. The GeForce 256 seems to have had a 111 or 125 square mm die size. The RTX 2080 Ti has a 754 square mm die size. That's a 6+ times increase. Almost all of that increase comes from more execution units, added memory controllers, etc, and not things like video encoders. Compare the die area dedicated to VRAM on the GeForce 256 with the GTX 1080 Ti. It looks to me like it's a lot more on the GTX 1080 Ti. The Intel Pentium III 700 had a 105 square millimeter die size. The latest core i7s are about 300 square mm, I think, and that's including an integrated GPU and a bunch of other stuff that wasn't on the Pentium III 700 die (they were part of the chipset on the motherboard at the time) but take up a significant amount of die area. The die area for the CPU hasn't really changed much.
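A quick arithmetic illustration of that shift, using the die sizes quoted above (the ~120 mm^2 and ~300 mm^2 figures are the rough approximations already assumed in the comment):

# die sizes in mm^2, taken from the rough figures above
geforce_256, rtx_2080_ti = 120, 754
pentium3_700, modern_core_i7 = 105, 300

print(rtx_2080_ti / geforce_256)      # ~6.3x growth in GPU die area
print(modern_core_i7 / pentium3_700)  # ~2.9x growth in CPU die area (and that includes an iGPU)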
Oxford Guy - Tuesday, August 21, 2018 - link
RT, Tensor, and other hardware that may not be particularly relevant to gamers during the majority of the chip's lifetime take up a lot of space, judging by the diagrams I've seen. Die size isn't the whole story. If it were, the Fury X would have been a big hit.
Yojimbo - Tuesday, August 21, 2018 - link
We are discussing the trend in the pricing of GPUs relative to other components. Since this has been a long-term trend that has not reversed, we can conclude that the GPU is providing more and more of the value of gaming systems.

And regardless of how well a particular GPU is executed, larger die sizes incur higher costs and so demand higher prices. The Fury X was not executed well. It had a memory capacity limit that, while probably not detrimental to performance, turned people off. It didn't offer a performance advantage over cards based on NVIDIA chips that were cheaper to produce. Maybe the Turing RTX line will be a similar failure. But, again, we are talking about the general trend of GPUs taking up more and more of the cost of gaming systems. The Turing RTX line fits that trend, and so it is not an outlier.
PeachNCream - Tuesday, August 21, 2018 - link
Let us know how running a game works for you without a CPU, motherboard, RAM, and SSD, since the graphics card is more important than any of them. :P
Yojimbo - Tuesday, August 21, 2018 - link
Let me know how it is fielding a baseball team without a 2nd baseman, a shortstop, a catcher, a pitcher, etc., if Mike Trout is more valuable than any of them. :P
QED!
Bulat Ziganshin - Monday, August 20, 2018 - link
Turing is definitely consumer-grade Volta. In each second generation, NVIDIA has had a compute card which added high-performance FP64 cores plus somewhat increased resources per core. In the Fermi generation it was the CC 2.0 architecture, in the Kepler generation we had CC 3.7, and in the Pascal generation it was CC 6.0. So it seems that Volta and Turing are the same thing, except that Turing, as usual, reduces FP64 resources. There is a possibility that it drops even more resources compared to Volta, e.g. fewer registers or less shared memory per SM, or more ALUs per SM (which effectively reduces registers/shared memory per ALU). By no means can it increase anything compared to Volta, including bandwidth! Moreover, Volta PR said that bandwidth increased 4x, so the 2x increase stated here may really mean 2x less compared to Volta. :D

But overall, they don't have the resources to produce two different architectures, so most probably it's just Volta minus FP64 plus RT. In particular, the area (and number of transistors) per ALU is pretty close to that ratio in Volta, and this is perfectly explained by replacing the FP64 cores with RT cores, so the overall SM area hasn't changed much.
maroon1 - Monday, August 20, 2018 - link
Battlefield V, Shadow of the Tomb Raider, and the new Metro will support ray tracing.
Oxford Guy - Tuesday, August 21, 2018 - link
Nvidia benefits from having near-monopoly power. Back when AMD released a consumer GPU with a hardware tessellator, it couldn't even get Microsoft to add support for it in DirectX.
Yojimbo - Tuesday, August 21, 2018 - link
NVIDIA does not have near-monopoly power. Who exactly are they pressuring with their market share, and how? Don't confuse having a dominant market share with "monopoly power". They are two different things: the first is simply a result of economics, the second is illegal coercion.

In fact, NVIDIA only has a dominant share in PC gaming. But PC gaming and console gaming are a shared space as far as most large games developers are concerned, and AMD accounts for a greater number of GPUs used to play games than NVIDIA does.
Microsoft is leading its own push to get real time ray tracing in graphics. I doubt Microsoft cared too much about a hardware tessellator. AMD is part of this push as well, by the way. NVIDIA is just first. When AMD comes out with their "look at the amazing thrills of ray tracing" presentation a year or whatever later, most of these people in this forum dumping on NVIDIA will be saying "boo-ya!"
GraXXoR - Tuesday, August 21, 2018 - link
Near-monopoly power? 80% of all GPUs sold are Intel...

I think that as "enthusiasts" people sometimes fail to see the whole picture and forget that they are only looking at the smallest section of the AIB GPU market: the bloodiest of the bleeding edge.
A few ranks down, AMD is very much present. Not to mention Intel's pretty much dominant position with an IGP in almost everything, and even AMD with their Vega IGP in Ryzen.
But yes, they can currently command an enviable premium for their top-ranked products...
And we know full well that plenty of sheep, er... customers... will encourage Nvidia to keep pushing the hideous market phenomenon that is "blind pre-purchasing" on us, when it should be shot in the cradle.
Oxford Guy - Tuesday, August 21, 2018 - link
"80% of all GPUs sold are Intel..."lol
jwcalla - Monday, August 20, 2018 - link
You can tell that Nvidia has shifted focus towards non-gaming business ventures.
Oxford Guy - Tuesday, August 21, 2018 - link
As has AMD.
Yojimbo - Tuesday, August 21, 2018 - link
How is that? By being first to market in technology that pushes forward the state of the art in gaming?
GraXXoR - Tuesday, August 21, 2018 - link
@jwcalla -- Are you sure? You mean by bringing brand new technology to consumer GPUs that has the potential to change the way top-end games will look in significant ways? Hmm...
ianmills - Monday, August 20, 2018 - link
Nvidia must really be pissing off its partners. First it was the GeForce Partner Program fiasco, stopping them from selling rival cards. Now, with the Founders Edition, Nvidia has a month of exclusivity, charges higher prices -AND- has a card with proper cooling. It looks like this will be the first generation where the Founders cards are as good as the board partners'. The partners have lost all their advantages!

It would be interesting to hear off the record how Nvidia's partners are feeling right about now.
Morawka - Monday, August 20, 2018 - link
The people who designed the card have every right to the opening sales. The board partners will get plenty of business; they just won't get the uber fanbois who pre-order without benchmarks.
Dug - Monday, August 20, 2018 - link
Pretty good, as all pre-sale cards from all vendors are releasing the same day, and all of them are sold out.
Oxford Guy - Tuesday, August 21, 2018 - link
Internet shopping makes partners less and less relevant with time. Nvidia could drop all of them and still dominate AMD. However, there are clearly benefits to having them, like getting a lot more review site hype (with all the dribbling out of 3rd-party designs over time).
GraXXoR - Tuesday, August 21, 2018 - link
It's Nvidia's blind pre-purchase scheme that is really pissing me off. The fact that fanboiz are willing to empty their wallets of $1k before a single review has even been written means that this sort of prepayment will start even earlier next year... and before you know it, we'll be pledging for "perks" for the next generation of GPUs instead of purchasing them.
HammerStrike - Monday, August 20, 2018 - link
Pricing issues aside (although, IMO, I'm not too shocked or put out by the prices; obviously lower is better, but I'm not angry or losing sleep), I'm surprised that there is less enthusiasm on the forums for the first GPU to highlight real-time ray tracing as a fully supported feature set designed to be used in games at playable frame rates. That's pretty revolutionary. Ever since I started following PC graphics (dual Voodoo2 SLI 8MBs were my first GPUs, still remember them fondly), real-time ray tracing was considered the holy grail of rendering - the milestone that was always talked about as "you'll know we've arrived at nirvana when it comes."

Well, it's here! And, for all the talk about Nvidia's next gen of cards over the last 6 months, with the exception of the week since the Quadro announcement at SIGGRAPH, it's been kept pretty much under wraps. As of a month ago I would not have guessed that the new generation of cards would have such a focus on ray tracing, so the news is somewhat surprising, and from a pure techno-nerd standpoint I think it's awesome! Not saying that as an Nvidia fanboy (I've owned many cards from many different OEMs over the years), but just as an avid gamer and technology enthusiast, this is a pretty seminal moment. Regardless of the price point it's at today, this tech is going to filter out into all of Nvidia's (and presumably AMD's) product stack over the next 1-2 years, and that's extremely exciting news! Assuming AMD has been active with MS in developing DXR, their next gen of GPUs should support ray tracing as well, which means there is a decent chance that a DXR/ray tracing feature set is included in the next gen of consoles due in the next couple of years. This is really ground-breaking tech - 6 months ago the common wisdom was that real-time ray tracing was still years away, yet it's launching Sept 20th.
Granted, it's a hybrid approach, but given the 20+ year investment in rasterized 3D rendering, and the fact that every game currently released and in development is designed for a rasterized pipeline, that's not surprising, and frankly it's probably the smartest way to deploy it - rasterization has many good qualities, and if you can improve one of its biggest weaknesses (lighting) through a hybrid approach with ray tracing, that is the best of both worlds.
Anyway, I'm rambling. As a long-time gamer, I'm very excited to see some reviews of these cards and how devs integrate ray tracing into their engines. Great time to be a gamer! Great job, Nvidia! Say what you will about the price of the cards; that's a (relatively) short-term phenomenon, while the direction this is pushing the industry in will be felt over the next decade, if not beyond.
schen8 - Monday, August 20, 2018 - link
Very well said, HammerStrike!

I'm an old-school gamer too; I grew up on the Voodoo2 SLIs w/ Riva 128.
Those were the days! Nvidia's keynote today really hit it right out of the park!
It is propelling us into the next decade of gaming graphics. I'm truly excited and can't wait to pick up a new card!
gerz1219 - Monday, August 20, 2018 - link
I think you're not seeing more enthusiasm for real-time ray tracing because we won't see any AAA titles that take advantage of it until the average GPU supports it. And that's not going to happen when the price point is $1200. It's not going to happen until there are new Xbox and PlayStation consoles that have ray-tracing capabilities. And by the time that happens the 20** series will be obsolete. It's exciting that they're going in this direction, but these cards will never actually take us there.
Dug - Monday, August 20, 2018 - link
You didn't watch the video, did you?
wyatterp - Tuesday, August 21, 2018 - link
They literally showed 20+ upcoming games, including 2 near-term triple-A launches (granted, it's unknown if those games will support RTX at launch). BFV and Tomb Raider looked insanely good.
eva02langley - Tuesday, August 21, 2018 - link
They paid to get that support. It is all about the perception, but this is nothing more than PhysX 2.0. Nobody developing their games on consoles is going to bother, because AMD is the sole GPU provider there.
The money is not worth it for the maybe 4% of the PC market purchasing these cards.
wyatterp - Tuesday, August 21, 2018 - link
I'm with you - reading these comments is depressing. Gamers and tech lovers are some salty people of late - a jaded, skeptical lot here at AT! I know it's fair to be somewhat reserved, but the IQ increase of ray tracing looked like more than a gimmick - it fundamentally changes how real-time games look, at least in the demos. I know most people here want it; they are just pissed they can't afford it, so they dump on the capability here to justify their anger. I get it.

I decided to preorder the 2080. I'm all in on this... assuming increased performance over the 1080 for "old rendered" games (from 8 to 10 TFLOPS?) - and hopefully it can run BFV at 60 FPS with RTX on. I know I'll be buying 3 of the RTX-enabled games anyway. The way lighting and shadow worked in Shadow of the Tomb Raider was impressive. BFV looked like offline renders.
bogda - Tuesday, August 21, 2018 - link
You cannot put pricing aside. In order to get useful ray tracing performance at 1080p, one will need a 2080 Ti. From that I conclude that the RTX 2070 and 2080 will not be powerful enough to run real-time ray tracing in games.

I think that, for a gamer, spending $1200 just to get realistic reflections and shadows is ridiculous.
BTW, there was no mention of VR in the keynote. What VR needed was more rendering power for less money, so these cards probably spell the end of VR on PC.
HammerStrike - Tuesday, August 21, 2018 - link
1. I never said don't take pricing as a data point, I just said there is a LOT more to the RTX series announcement than just the price, which is what the majority of comments seem focused on. Real-time ray tracing is the holy grail of rendering - the fact that it's showing up in any form in 2018 is pretty exciting.

2. Where are you getting your info that you'll need a 2080 Ti to run "useful" frame rates at 1080p with RT enabled? While I agree that we should all reserve a BUYING decision until trusted reviews are out there, this is pure conjecture that is apparently driven by your disappointment/saltiness/anger over the price of the cards.
3. We could argue that RT, even as it's implemented in the RTX series, is a lot more than just "realistic shadows", but that's beside the point. Regardless of which aspects of the image quality are improved, that's the whole reason to upgrade a GPU right now. If image quality is of minor or no importance, you can drop your IQ settings to low and get 60+ FPS on a GTX 1050 Ti right now for $150. I get confused by all this "it's not useful, it just makes the image quality better" line of argument. People weren't drooling over the Cyberpunk 2077 demo at E3 because it was running at a super high frame rate.
4. While I don't disagree with your point on VR, I wasn't really debating that. However, there is still the question of what Nvidia will position for their cards that cost less than $500. Regardless of what they are pricing the 2070+ cards at, my (semi-)educated guess is that 95%+ of GPUs sold are sub-$500. I'd bet that 80%+ are less than $300. While we can talk about "market" pricing, and whether Nvidia is good or evil for pricing these cards where they did, it doesn't change the fact that the mainstream of the market is at $300 or less, and no amount of halo product/pricing is going to change that. Say what you will of Nvidia, they are not stupid, and I'd be surprised if they didn't refresh their product stack in that price range, and I'd be equally surprised if they didn't provide a meaningful performance boost in comparable price tiers.
It's also worth noting that the dies on the RTX series are FREAKING HUGE. There's lots of consternation about "price inflation" based on the model numbers going from the 7 series to the 9th to the 10th to the 20th, but at the end of the day those are just names. The primary driver of the cost of production in any semiconductor is the size of the die. The 2080 Ti has a die size of ~757mm^2, where the 1080 Ti has a 471mm^2 die. That's basically a 60% increase in die size. Even assuming the yields are the same, they still use 300mm wafers, and the cost of manufacturing is per wafer, so everything else being equal their production costs went up by 60%. DRAM prices are all higher than they were 2 years ago. If you take the $700 launch price of the 1080 Ti, add an additional 60% to it, and throw in $30-$50 more in GDDR pricing, you have a card in the $1200 range (a rough sketch of this arithmetic follows this comment).
Not saying that makes the pricing "OK", but suggesting that there is no reason other than "greed" (I know you didn't make that specific argument, but it's pretty common on the forums) is pretty short-sighted.
This is a freakin' monster chip with revolutionary tech that enables, for the first time, the holy grail of rendering: real-time ray tracing. Whether blowing out the silicon die area to support that, and charging the prices that necessitates, turns out to be a good business decision remains to be seen. But from a pure tech standpoint it's pretty awesome. It would be like Ford announcing a Mustang with a 1500 HP engine in it that costs $150K. Maybe there is enough of a market there, maybe there isn't, but I'm not going to get so hung up on the price that I can't appreciate the pure ridiculousness of what they made. Same here.
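For what it's worth, here is a minimal Python sketch of that back-of-envelope arithmetic. It naively assumes that per-card cost scales linearly with die area and that yields are identical, and the $30-50 GDDR6 premium is just the guess from the comment above, so treat it as an illustration of the argument rather than a real cost model.

```python
# Naive die-area scaling of the 1080 Ti launch price, per the argument above.
die_1080ti_mm2 = 471
die_2080ti_mm2 = 757          # roughly; ~754 by other counts
price_1080ti_usd = 700        # launch price
gddr6_premium_usd = (30, 50)  # guessed extra memory cost

area_ratio = die_2080ti_mm2 / die_1080ti_mm2      # ~1.61
scaled = price_1080ti_usd * area_ratio            # ~$1,125
low, high = (scaled + p for p in gddr6_premium_usd)

print(f"Area ratio: {area_ratio:.2f}x")
print(f"Naive estimate: ${low:.0f}-${high:.0f}")  # lands near the ~$1,200 being asked
```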
CaedenV - Monday, August 20, 2018 - link
Can't wait to see some real benchmarks of the 2080... My 1080 is so close to being good enough for 4K games, but if this offers some substantial 4K performance I may be able to justify the 2080... after I sell my 1080... and if the 2080 is on sale... maybe...
cmdrdredd - Monday, August 20, 2018 - link
Yeah, my 1080 Ti is pretty decent at 4K mostly, but there are a few times I would like just a bit more power to turn settings up. Still not sure I'm willing to put down $1k to do it (I'd go for the Ti), at least not this year.
Lolimaster - Monday, August 20, 2018 - link
Just optimize your graphics settings; there's no point in trying to stick to everything "ultra/maxed out" when you damn well know there are plenty of near-useless effects that will chop performance by 30-40% while requiring you to stare at a picture comparison for 2 minutes to spot the differences.

*The main offenders are high-quality+ shadows and post-processing.
RSAUser - Tuesday, August 21, 2018 - link
Yeah. Definitely turn stuff like AA off; there's no point when you already have that many pixels on a <30" screen. If gaming on a large TV, maybe.

Most more expensive screens (even some cheap Samsung ones, actually) have an option to enhance the image, which to my eyes looks like a form of AA.
I don't even use AA on my 26" 1440p screen anymore, as I can't really tell the difference in most games and would rather have the higher fps for G-Sync.
GraXXoR - Tuesday, August 21, 2018 - link
No AA for me at 4K... pointless... and motion blur is the antichrist... Whoever thought THAT would be a good idea?

HQ shadows can kill performance, so I usually turn those down a bit... I also find that medium AO is good enough for me, so I can run my favourite games on my Oculus pegged at 90 FPS or my 4K screen at 60 FPS.
Ananke - Monday, August 20, 2018 - link
NVidia and its partners have a load of GTX stock to sell, hence NVidia is not jeopardizing those sales and keeps the next generation at higher-tier prices. Maybe in 6-8 months, once their 1060 and 1070 chips deplete, they will introduce something mid-market in the 2xxx series.

So, an expected and smart marketing move, though it will probably not generate volume sales and a user base, and that translates to zero interest in dev support - hence NVidia gives away RTX cards to developers now. This tech may attract some attention in a year, or maybe in 2020, but NVidia's problem meanwhile may very well be ray tracing standardization between MS and AMD, with those standards eventually included in next-generation consoles, which would exclude NVidia from the gaming market.

NVidia is walking on very thin ice here; I think they will lose on popularity and volume. The usual indicators of financial performance expectations are the stock prices, and they were not very bright today.
Shadyghost - Monday, August 20, 2018 - link
This. I see no reason why they should sell these cards cheaper. I'm surprised they aren't more expensive. They really have no competition at the moment for their 10-series cards, which they allegedly still have a glut of. Essentially they just kept their 10-series tier viable longer with these prices. Smart. Also, I'm sure they are planning price breaks for whenever AMD releases their next platform. It's easy to lower MSRPs, impossible to raise them after the fact.
sorten - Tuesday, August 21, 2018 - link
Microsoft is already fully on board and has been working with NVidia as well as game engine makers. Peter at ars posted an article about it today.
Oxford Guy - Tuesday, August 21, 2018 - link
Smart and expected in a monopolist environment. Yay for us.
Sttm - Monday, August 20, 2018 - link
SLI sucks! We cannot sell them on dual graphics cards! What will we do, Jensen!
'Puts on Glasses'
DOUBLE THE PRICE FOR THE Ti!
Hixbot - Monday, August 20, 2018 - link
So a new generation and performance per dollar stays the same?
bogda - Tuesday, August 21, 2018 - link
No, new generation and performance per dollar goes down.
evilpaul666 - Monday, August 20, 2018 - link
So when are reviews going up?
GraXXoR - Tuesday, August 21, 2018 - link
About 24 hours after the first cards get released and the testers get hold of them, I reckon.
sorten - Tuesday, August 21, 2018 - link
Ray tracing + tensor cores sound awesome, but the pricing may have me waiting until these have been sitting on shelves for a while.
webdoctors - Tuesday, August 21, 2018 - link
Sure, prices are higher for next-level tech, but look how much real estate has gone up in Silicon Valley in the last 5 years. Where will engineers live and sleep? Ppl have families, and all the NIMBYs have driven prices higher than vid cardz due to limited availability and Prop 13 causing investment bubbles in RE. It's a domino effect that leads to higher-priced products to cover higher costs.

The dollar has been printed like toilet paper thanks to QE2, and the debt is hitting 20T. They still need to pay TSMC overseas for the chips, which could be tariffed in the future thanks to Trump's tariff war with all his frenz. Everything needs to be accounted for; when you're not getting illegal labor from Mexico or slave labor from China and people demand livable wages, things go up - look at prices for an iPhone.

Look at these vid cards as an investment: you can mine ETH with them. Huge demand from Venezuelans for currency and Greeks that ain't paying taxes.
RSAUser - Tuesday, August 21, 2018 - link
You're missing the /s tag.
Yojimbo - Tuesday, August 21, 2018 - link
I hate pre-orders without benchmarks. It's not as bad as the Ryzen launch, where we had no idea how it would perform. We can sort of guess how Turing cards will perform because of similarities to Pascal and Volta, but not well enough to determine if the prices offer good value or not.
Shadyghost - Tuesday, August 21, 2018 - link
So... Don't pre-order.
Yojimbo - Tuesday, August 21, 2018 - link
I won't. Besides, I wouldn't buy a graphics card with less than 10 gigarays/sec and I won't spend $1,000 on a graphics card.
Shadyghost - Tuesday, August 21, 2018 - link
I hear you there. I actually draw a hard line at $500. Not because I can't afford it, but because I can't justify it for my use case. It kinda sucks for me this go-around because I prefer visual quality over performance, so I really appreciate the upgrade with ray tracing.

Back in the heat of the mining craze, amid the rampant rumors of Nvidia's next offerings, I realized they would most likely be priced much higher due to current market conditions and the lack of competition. I occasionally checked Nvidia for availability, got lucky, and picked up a 1070 Ti FE for MSRP a couple of months back. (I know, lucky me, right?) I'm okay with it though, because it's going to be a while before prices drop and the card performs beautifully for my needs. I'll just sell it for whatever it sells for and pick up the newest card around $500 next year. Or wait until the next generation, if ray tracing really gains traction, to be ahead of the curve... at least ahead of the $500 curve.
Yojimbo - Tuesday, August 21, 2018 - link
I'm going to wait for the next generation. I'm guessing that by then ray tracing will have caught on and I can get a card that costs about $400 and is powerful enough to take good advantage of it.
Shadyghost - Tuesday, August 21, 2018 - link
Agreed.
Yojimbo - Tuesday, August 21, 2018 - link
I don't mind the RTX Ops. Only time will tell how useful they are. It could be that there is a strong correlation between how games perform and these RTX Ops. If they are useful they will stick around, otherwise they'll fall by the wayside. They will only really be useful for comparing NVIDIA cards to each other, but NVIDIA is the only one with cards with these capabilities out at the moment anyway. You can't create a metric that targets cards that don't exist yet.
halcyon - Tuesday, August 21, 2018 - link
Way too expensive for 95% of the market. And practically zero game support (just a few titles with quickly slapped-on water reflections and softer shadows, which will kill performance in FPS shooters anyway). And no foreseeable console support for a few years into the future.

Time to wait for new manufacturing nodes and 2nd/3rd-gen RTX cards, and to see what AMD/Intel can do.
At these prices nVidia's sales and stock price deserve to take a plunge.
Yojimbo - Tuesday, August 21, 2018 - link
Only FPS shooters matter. I don't even see why they worry about things like global illumination. And near-zero game support at the launch of a new technology is a non-starter. If game developers aren't spending their time supporting features that don't exist yet, then what the hell are they doing?
Oxford Guy - Tuesday, August 21, 2018 - link
"Only FPS shooters matter." This is perhaps the biggest failing in the gaming industry.RSAUser - Tuesday, August 21, 2018 - link
I am sure there are more MMORPG players than there are FPS players.
Oxford Guy - Tuesday, August 21, 2018 - link
MMOs barely qualify as video games.
GraXXoR - Tuesday, August 21, 2018 - link
"It's a non starter...""What the Hell are they doing?"
Meanwhile: Nvidia and all third party suppliers COMPLETELY sell out of all pre-purchase units @>$1k2
LOL.
wyatterp - Tuesday, August 21, 2018 - link
That is some salty, jealous commentary, man. Also - it doesn't seem like you can base your assessment of RTX performance on the history of PCSS's impact on older games.
WorldWithoutMadness - Tuesday, August 21, 2018 - link
I'm sure the retail prices are gonna be sky-high, well above MSRP, and then get a massive cut after AMD's cards come out in 2019.
Yojimbo - Tuesday, August 21, 2018 - link
I doubt it. When has retail price been significantly more than MSRP other than during the crypto craze? And when was the last time NVIDIA instituted massive price cuts in response to anything AMD did?
Oxford Guy - Tuesday, August 21, 2018 - link
Just invent a new coin or two with fancier names. Oh, gee... the new coins need Tensor and RT hardware to work efficiently? Fascinating!
Yojimbo - Tuesday, August 21, 2018 - link
Hey, good idea. But I don't think crypto will get as hot as before for a while, if ever. It was caused by a world-wide speculation craze. Many of those people in the craze got bitten by the sharp price declines. Another craze is much less likely to happen. As far as I can see, there is no good reason the currency should be $1,000,000 a coin or $100,000 a coin or $1,000 a coin or whatever. The coins are used for very little at the moment. Mostly they are just tools for speculation. So if there isn't a mass influx of speculators, like there was until recently, there isn't a good reason for the coins to be increasing in value relative to the dollar.
GraXXoR - Tuesday, August 21, 2018 - link
At one point, I had 10 bitcoins... Lost them somewhere a few years back... Oh well...
Oxford Guy - Tuesday, August 21, 2018 - link
"Another craze is much less likely to happen."Yeah, they said that after the Bitcoin wave. I guess Ethereum was unlikely to happen.
Yojimbo - Tuesday, August 21, 2018 - link
Bitcoin and Ethereum spiked concurrently.
https://ibb.co/hGaTqz
https://ibb.co/jRzXHe
The bubble built throughout 2017 and then burst in 2018. Bitcoin was up to a $327B market cap and now it's down to $116B. Ethereum built up to a $135B market cap and now it's at $30B. Some people made a lot of money, but those were the ones who sold when it was spiking (or had been holding the currency for years, but there are far fewer of those than the number who became involved in 2017). People who bought when it was spiking have lost a lot of money. If the volume charts are to be trusted, there was very little volume until the middle of 2017. That's the craze I am talking about: from the middle of 2017 until the end of 2017.
SerggioC - Tuesday, August 21, 2018 - link
Where can I see benchmarks and comparisons with the previous generation?
Yojimbo - Tuesday, August 21, 2018 - link
You can't, yet. Gotta wait for the review embargoes to end. Maybe that will be a week before launch (around September 13th)?? I have no idea.
SerggioC - Tuesday, August 21, 2018 - link
Review embargoes? How do those work? That means someone already knows how the cards perform compared to the previous ones. Excluding ray tracing, what kind of fps increase do we get in old titles vs. previous cards? That's what's missing! People get so blinded by the RTX hype that they can't see anything else and drop $1200 on a card. lol
Yojimbo - Tuesday, August 21, 2018 - link
Yeah, I think it's crazy to pre-order. But people can do what they want with their money.

Well, most people aren't even under review embargoes yet, I guess, because they probably haven't received any review samples yet. I think they usually get the samples about a week before the embargo lifts. The idea of the embargo is so reviewers have time to put together high-quality reviews. Otherwise sites would rush out their reviews to get the clicks from being first.
If the review embargo lifts September 13th (I have no idea if it will), then maybe sites would get their samples around September 6th. Until then, very few people outside NVIDIA and their AIB partners will have benchmarks on the cards.
wyatterp - Tuesday, August 21, 2018 - link
I'm guessing laptops will not see this RTX feature until next gen. I'm guessing we will hear about an 11-series incremental bump to mobile chips this winter. I've heard some rumors to this effect even for desktop cards, i.e. no RTX 2060, but rather just an 11-series bump that keeps the GTX moniker. With the die size and heat/performance requirements of the RTX Turing chips, that sounds like a lot to squeeze into an 18 mm laptop.
Yojimbo - Tuesday, August 21, 2018 - link
I am guessing that if NVIDIA comes out with a non-RTX successor to the 1060, the GPU it is based on will contain the other improvements made in Turing besides the RT cores, with the RT cores disabled or removed. So it will be a true generational increase in gaming performance. Maybe that's what you mean by an "11-series incremental bump", but your chosen words seem to have a connotation of disappointment or marginalization to me. I, myself, wouldn't use that phrase for a 40% boost in performance. I believe the GPUs would also have variable rate shading, independent integer units, Tensor cores, etc., as well.
GraXXoR - Tuesday, August 21, 2018 - link
Now that would be nice... an 11-series GTX which keeps the 14 Gbps memory but just removes the RT garb... er... overhead, and gives a nice 25% performance jump.
Huhschwine - Tuesday, August 21, 2018 - link
When will we see real tests of the new RTX cards? When does the NDA expire?
jjj - Tuesday, August 21, 2018 - link
The big weakness is die area, due to these extra compute units. Huge dies, lots of memory, stupidly high margins, and PC gaming is starting to become a luxury.
Yojimbo - Tuesday, August 21, 2018 - link
PC gaming was always a luxury. The Apple IIGS was introduced at 2,230 2017 dollars, not including a monitor or any floppy drives. A color monitor, a 5.25-inch floppy drive, and a 3.5-inch floppy drive cost another 2,672 2017 dollars, for a grand total of 4,902 2017 dollars.

It's just that more of the cost of the gaming system has shifted to the graphics card from the other components, because the graphics card has become relatively more important to the performance of the system. PC gaming is still cheaper now than before.
milkod2001 - Tuesday, August 21, 2018 - link
Anyone can get a brand-new PC for around $1000 that is perfectly capable of very decent 1080p gaming. That's not bad at all. It only becomes expensive when you get into 4K gaming, and that's the luxury you have to pay extra for.
GraXXoR - Tuesday, August 21, 2018 - link
You're joking, right?

Imagine how many 10-series cards are going to flood the second-hand market in the run-up to Christmas... Hell, I might pick up another 1080 Ti on the cheap for some SLI action.
p.s.
PC gaming has ALWAYS been a luxury, tho.
Shadyghost - Tuesday, August 21, 2018 - link
Good luck with the lack of support and the micro-stuttering. I'll never go that route again. I even had games perform worse in SLI.
mkaibear - Tuesday, August 21, 2018 - link
Intel continuing their tradition of pushing SKUs upmarket, I see. Those prices... eeshk.
MrPoletski - Tuesday, August 21, 2018 - link
Peak 10 billion rays/s in 250 W? Not that impressive, given that two years ago we had a measured 100 million rays/s in TWO watts. https://home.otoy.com/otoy-and-imagination-unveil-...
I don't see how that's bad. 125 times more power for 100 times better ray tracing performance.

Also, that power consumption is for the entire card. The GPU also has a LOT more pixel shaders, so it's not surprising that it consumes so much more power.
Yojimbo - Tuesday, August 21, 2018 - link
That's a mobile core with 150 GFLOPS of FP32 and probably a memory bandwidth of about 20 GB/s shared between the GPU and CPU. Also, as eddman pointed out, the power consumption you cited is only for the SoC and doesn't include the RAM. The power efficiency of mobile SoCs is also much greater than that of high-powered chips. BTW, 10 billion divided by 100 million is 100, and 100 times 2 watts is 200 watts. Now, keeping in mind the points I made about the differences between SoCs and high-powered graphics cards above, the efficiency of that ImgTec device looks like shit to me in comparison. Maybe we will have a better idea when NVIDIA comes out with a Turing-based Tegra.
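A minimal Python sketch of the rays-per-watt comparison being argued here, using only the rough figures quoted in the thread (the 250 W number covers the whole board while the 2 W number covers only the mobile SoC, so this is an apples-to-oranges comparison, not a measurement):

```python
# Rays-per-watt from the figures quoted above (marketing claims, not measurements).
figures = {
    "RTX 2080 Ti (whole board, ~250 W)": (10e9, 250),    # ~10 Grays/s claimed
    "ImgTec mobile demo (SoC only, ~2 W)": (100e6, 2),   # ~100 Mrays/s claimed
}

for name, (rays_per_s, watts) in figures.items():
    print(f"{name}: {rays_per_s / watts / 1e6:.0f} Mrays/s per watt")

# ~40 vs ~50 Mrays/s per watt -- roughly comparable on paper, even before
# accounting for the fact that the 250 W figure includes GDDR6, VRMs and fans
# while the 2 W figure covers only a low-power SoC.
```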
SydneyBlue120d - Tuesday, August 21, 2018 - link
Is the HDMI still limited to 2.0b? No HDMI 2.1?!?!
The_Assimilator - Tuesday, August 21, 2018 - link
Nobody cares, since HDMI 2.1-compatible screens and cables are nonexistent. And by the time you can afford 8K these cards will be too slow to drive it.
Simon_Says - Tuesday, August 21, 2018 - link
The appeal is not 8K resolution but having UHD + HDR + 4:4:4 + >60fps peak + Adaptive Refresh all on one display.
RSAUser - Tuesday, August 21, 2018 - link
These cards don't come close to maxing out the 2.0 standard, so why would you add the extra cost to support the 2.1 standard?
euler007 - Tuesday, August 21, 2018 - link
This makes it a hard pass for me, because my next computer upgrade is destined to be hooked up to a large OLED that supports HDMI 2.1 and VRR, when such a set becomes available (2019 or 2020).
The_Assimilator - Tuesday, August 21, 2018 - link
No, Anandtech, the RTX 2070 Founders Edition is not $799 as per your page 3. Try proofreading sometime.
Ryan Smith - Wednesday, August 22, 2018 - link
Just for that I'm increasing it to $899! Thanks Assimilator!
bogda - Tuesday, August 21, 2018 - link
This looks really bad. We get a decrease in core count and clock speed, with more features, for more money.

The GTX 1080 has been around $400 for the last few months, and the RTX 2070, which has fewer cores and a lower clock speed, has an MSRP of $500.

No performance figures for existing games were presented, so I do not expect much, if any, improvement in performance, but we do get an increase in price.

AMD, get your act together; we need some competition in the GPU market.
GraXXoR - Tuesday, August 21, 2018 - link
Don't buy it. Nvidia will then be forced to respond.
simples.
but yeah, some competition would be nice.
iwod - Tuesday, August 21, 2018 - link
I am now more interested in this "future" Nvidia is painting. Is hybrid rendering really that "good"? And at what cost? Are all the features standardised in DirectX? Because at this moment it looks like DirectX is Glide and Nvidia is 3dfx (which they actually acquired, so I can't say this is wrong). Where does AMD stand in this?

Navi was rumoured to be aiming at the PS5 - will there be hybrid rendering in the PS5 as well?

We only have maybe ~4x more transistors to go: 7nm brings 2x the density of 14/12nm, and 3nm brings 2x the density of 7nm (those node names are TSMC's versions). Are we sure this is the most transistor-efficient way to go?
PeachNCream - Tuesday, August 21, 2018 - link
I'm in general agreement with everyone about the price, and I also think the TDP is absurd, despite the technology offering impressive features and capabilities. The trouble is that you end up paying a lot for PC hardware only to end up scraping up the scraps and leftovers as a second-class gamer, scrabbling around several months after the fact for poorly ported console games. What's the point of all that supposedly high-end PC gear if the software portion of your hobby is treated like the nearly expired discount rack at the console and mobile grocery store?
It's even worse if you're not into bloodbaths and macho nonsense.
El Sama - Tuesday, August 21, 2018 - link
Ray tracing is a halo technology; maybe in 5 years some games will have it, but currently it is useless for gaming. Also, lol at gigarays per second - what kind of silly unit is that??
SerggioC - Tuesday, August 21, 2018 - link
It's lovely how the crazy fanboys are buying pairs of these cards without real reviews and real performance numbers! Wow, hehehe, nice job from Nvidia marketing!

With the same number of ROPs, I suspect the 2080 Ti will be about 10 or 15% better than the 1080 Ti, based on the architectural improvements of 12nm, faster memory, and a larger bus. Excluding the ray tracing hype, what can these cards do better than the previous ones? What fps increase do we get? Probably not much, with the lower clock speeds.
yhselp - Wednesday, August 22, 2018 - link
I imagine NVIDIA's break room has a huge poster of Gordon Gekko from the 1987 film Wall Street, delivering his infamous "Greed is good!"

The MSRPs are getting more ridiculous with every new generation. Remember when the flagship used to cost $500, not so long ago? Like the GTX 580. And then NVIDIA decided to name the successor of the GTX 560 Ti the "GTX 680" rather than the "GTX 660 Ti", slapped on a $500 tag, and called it a flagship, despite the fact that it was based on a mid-sized GPU, all the while purposefully withholding and delaying the actual flagship, which used to come out first. It flew. Still, later on, they were a bit worried whether the original Titan would sell well, because even they didn't think people would embrace a $1000 consumer video card. Yeah... Fun times. The rest is history. Nobody to blame but us, the consumers.
When I saw 80 Ti, 80, and 70 in the title, I naively expected to also see $650, $500, and $350. Alas. The only actual break from the established "norm" since Kepler is that consumers can buy the ultra-overpriced card based on the big GPU a few months early. Which would be for the better, I suppose, if the prices hadn't jumped across the board as well, and if we were sure a $650-700 version of the big-GPU card was coming.
On a more positive note, judging by the TDPs, the 2080 might actually be based on a cut-down version of the big GPU, like the GTX 570, or the GTX 780 if you will. That would be nice. It would also mean that the 2070 has a fully-fledged mid-sized GPU. I can see how, somewhere in NVIDIA's Gekko cash-crazed mind, the current pricing policy is beneficial to the consumer, since they charged $600 for the fully-fledged mid-sized card last time, and now it's down to $500. All praise the lord.
All in all, I was excited about this new generation but the insane, and unjustified, pricing just makes me want to quit this hobby for good. An Xbox One X can do so much for a console and it costs $500.
Jiai - Wednesday, August 22, 2018 - link
It is incredible to see the effect of the card names on everybody.

They put shockingly higher prices on the cards, but just by naming them x80 Ti / x80 / x70 instead of x80 / x70 / x60 (as in: first-generation flagship, high/mid-range at 75% of the flagship, and mid-range at 50% of the flagship), the effect is somehow mitigated, and people just find them "a little overpriced" compared to the previous generation.
I'm pretty sure we will still get a "TI" refresh in a year.
yhselp - Wednesday, August 22, 2018 - link
To be fair, it really does seem that the 2080 Ti is based on a proper, large-size GPU, aka "Big Turing", meaning it can be genuinely considered as a flagship. Yes, it is a cut-down version of said GPU, but that should not make it any less of a flagship; in fact, this has been standard practice over the years - Fermi launched with a cut-down-GPU flagship, GTX 480, later introduced the fully enabled GTX 580; the original Titan was based on a cut-down GPU, then we had GTX 780 (double cut-down), and finally - the fully enabled GTX 980 Ti. Thing is, those sweet Fermi flagship cost $500 then, and the equivalent of say a GTX 980 cost just $250... But, yeah, "hopefully" they release a $700 card based on Big Turing in 2019, cut-down or not.Otherwise, I'm just as frustrated as you are, as you can tell by my comment just above yours.
yhselp - Wednesday, August 22, 2018 - link
*I meant to type "and finally - the fully enabled GTX 780 Ti", and not GTX 980 Ti.