That's an oddly aggressive response to a comment about rising GPU costs. Consoles do get priority from most of the larger game studios and PC ports are unfortunately treated like second-class afterthoughts, but there are still PC-only developers out there, including a number of smaller and indie teams, that put those systems first. And while I also agree with you that it's a shame PC hardware costs a lot more and that we end up stuck with halfhearted, months-delayed releases that make the high-priced GPUs and CPUs we buy feel pretty pointless, I'm not going to lash out at someone who rightfully points out that we're not getting a very good performance-versus-cost return in the GTX-to-RTX transition. It's a valid point, after all.
It is not AMD's fault that Nvidia is charging overpriced "mainstream" GPUs; it is AMD's own greed and their investors. Stop blaming AMD for Nvidia's behavior.
Edit: It is not AMD's fault that Nvidia is charging overpriced "mainstream" GPUs; it is *****NVIDIA's***** own greed and their investors. Stop blaming AMD for Nvidia's behavior.
Well, milk~ isn't wrong. The fact that AMD doesn't have anything to compete with is one of the reasons NVIDIA can get away with this: no competition to undercut them, so they don't have to drop prices. Intel did the same for years when it had no competition. Now look, we're getting 6-core and 8-core CPUs at great prices. Competition is always a win for the consumer.
HOLD THE PRESSES: The launch price of the 2080 is the same as the launch price of the 1080 Ti! NVidia is comparing their new product to a lower-priced part. The 1080 Ti is about 30% faster than a vanilla 1080, so push all of those bars up to 1.3 instead of 1.0, for a more realistic comparison.
In an apples-to-apples comparison, versus the 1080 Ti, we are looking at a ~10% to 25% performance uplift compared to a similarly priced card from the previous generation. Note that I'm ignoring DLSS speed gains, which are only relevant if anti-aliasing of some kind is used in the comparison -- something not really needed when running at 4k resolution.
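To put rough numbers on that rescaling (the bar heights below are my own guesses at what Nvidia's chart shows, not official figures, and the 1.3x factor is the estimate above):

```python
# Re-baseline Nvidia's "2080 vs 1080" chart against a 1080 Ti instead.
# Assumption: the chart normalizes a GTX 1080 to 1.0 and a 1080 Ti is ~1.3x a 1080.
GTX_1080_TI_VS_1080 = 1.30

rtx_2080_bars = [1.40, 1.45, 1.50, 1.55]   # hypothetical non-DLSS bar heights
for bar in rtx_2080_bars:
    uplift = bar / GTX_1080_TI_VS_1080 - 1
    print(f"{bar:.2f}x vs 1080  ->  {uplift:+.0%} vs 1080 Ti")
# A 1.40-1.55x bar against the 1080 shrinks to roughly +8% to +19%
# against the similarly priced 1080 Ti.
```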
I may be wrong, but I call this an unfair comparison.
Don't worry, when reviews come out we will get all the answers. With the 1080 Ti hitting $529, the 2080 will seem overpriced and should drop rather quickly.
The chart is not using price as a basis of comparison, but rather the class of GPU. It is comparing this year's 2080 to last year's 1080. They are comparing the performance gain from last year's model, which seems like a fair comparison to me.
It's usually fair, because the price has usually been similar or not that different. Not this time.
This! The question is how much more performance-per-dollar Nvidia is providing with this new set of cards compared to the previous generation. The name they have applied to the new cards merely reflects marketing. Unless reviewers call BS, Nvidia gets away with showing "50%" more performance by applying a favorable new naming convention!
Think about what comparing a $700 launch price 1080 Ti to a $1000 launch price 2080 Ti really means when looking at the performance per dollar that Nvidia are providing. The same applies to this "2080 vs 1080" comparison.
Let's be realistic here: no cards are at MSRP, they are all around the Founders Edition price bracket, give or take $50... and that is in the US. Elsewhere in the world we are talking crazy prices. Europe gets screwed big time on a 2080 Ti: 1400-1500 euros = $1600-1700 US.
Reviewers will not call BS outright, because they won't get pre-NDA-lift hardware to play with if they besmirch the good name of the OEM by being rightfully critical. Benchmark graphs may show the reality, but the text around them will likely be very forgiving or carefully phrased to moderate the tone. It's the world we live in now: journalists can't realistically be asked to purchase GPUs out of the business expense account, and even if they did, they wouldn't get the hardware in advance of the NDA lift, which would ultimately endanger the reviewer's job security.
Read HardOCP then; they have been on NV's shitlist for a while and buy their own cards. They said NV is actually going to give them RTXs this time around, but I doubt they are going to sugarcoat anything.
I agree with the price:perf thing. The models don't line up at all anymore, so I don't give a flying F what they call it: how much does it cost and what improvement will I get? I was hoping to move on from my 970 as it's pretty maxed out running 2-4K games, and I recently went from 2K to 4K on my main screen, so I'd like a bit more performance. I imagine some of the pricing woes are due to the insane memory pricing and the large amount of it on these cards, and not just NV sticking it to us because AMD is pulling another Radeon.
DLSS is not an anti-aliasing technique, because anti-aliasing cannot have a frame-rate performance benefit. DLSS will employ deep learning to super-sample games (I just described its acronym, btw) that are running internally at a lower-than-4K resolution (probably 1440p) and output them at 4K. That is why Huang was showing those low-res cats etc. being super-sampled to a higher resolution in his presentation. If DLSS is effective enough, the results might be practically identical.
What you just described is the opposite of supersampling: https://en.wikipedia.org/wiki/Supersampling
Supersampling, aka FSAA, is the oldest form of AA. If they're *actually* supersampling (and not just using the term for marketing), they're rendering at a higher resolution, then downsampling to a lower res. This method uses the extra samples for the color calculation. With various types of AA there can be a performance benefit vs FSAA, but not a performance benefit vs running without any AA in the first place.
Again, this assumes they're actually performing some form of supersampling and not just marketing it as such to be extremely obnoxious.
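To make that distinction concrete, here's a minimal sketch of classic ordered-grid supersampling in plain NumPy (the render function is a stand-in, not any real engine); the 4x shading work is exactly why FSAA costs performance instead of adding it:

```python
import numpy as np

def render(width, height):
    """Stand-in for the renderer: the expensive part, cost scales with pixel count."""
    return np.random.rand(height, width, 3)

def supersample(width, height, factor=2):
    # Render at factor x the target resolution...
    hi = render(width * factor, height * factor)
    # ...then box-filter down: average each factor x factor block into one output pixel.
    hi = hi.reshape(height, factor, width, factor, 3)
    return hi.mean(axis=(1, 3))

frame = supersample(1920, 1080, factor=2)   # shades 4x as many pixels as plain 1080p
print(frame.shape)                          # (1080, 1920, 3)
```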
Yeah maybe it should be called deep learning upsampling. But let's try to make a case for calling it supersampling.
One could argue that what you described is supersampling anti-aliasing, and not just supersampling. Supersampling would mean choosing more samples than exist in the render output target, taken from some other space. The space does not necessarily have to be a greater resolution render, that's just one possibility. In this case the space is an interpolation produced by a neural network operating on the initial render output. So they get these super samples and then, since they are not anti-aliasing, they don't downsample. Instead, they simply render at the new supersampled resolution. Deep learning supersampling?
One minor clarification. I should say ..."taken from some other higher resolution space", because otherwise the term "super" isn't justified. But the neural network interpolation is such a higher resolution space that was made without rendering at a higher resolution.
You know what, it seems they are comparing this to Temporal Anti-Aliasing. I guess they are doing temporal super sampling in an interpolated space created by deep learning inference on the rendered output. But I dunno, I'm just guessing here. I don't really know how temporal anti-aliasing works. Maybe someone with knowledge of anti-aliasing and temporal anti-aliasing can help.
Prior to DLSS I used to render at lower quality and then manually super-sample through interactive yes-no queries. But it was hard to do and maintain good game play. I bought a 6-button mouse just for that purpose, though.
The 1080 Ti came out over 9 months later, so it's not a fair comparison. It's definitely not an "apples-to-apples comparison", unlike what you claim.
The GTX 1080 launched at $699 for the founders edition. The RTX 2080 is going to launch at $799 for the founders edition. So it's more expensive but not nearly as much as you're making it sound to be.
"The GTX 1080 launched at $699 for the founders edition. The RTX 2080 is going to launch at $799 for the founders edition. So it's more expensive but not nearly as much as you're making it sound to be."
Furthermore, $699 was the MSRP for the 1080 FE, but the retail price was considerably higher for months after the launch.
That's not even the problem. Two 1080s in SLI are not even close to double the performance of one card. Lots of these games (especially the more extreme ones) use HDR, which hammers the 1080 by 20% for some reason, and run at 4K, where the 1080 is known to lack bandwidth. These are the most cherry-picked results, and the 2080 is probably only truly 15-20% faster, matching or slightly beating the 1080 Ti depending on the workload, while being at a way higher price.
I think what happened is that the last generation (really the current generation of cards) launched at a time when the mining thing went apeshit and prices were all inflated. Now things are returning to semi-normal; that's why the 2080 is priced at the MSRP of the 1080 Ti.
There is a difference between "need" and "want" so the claim that AA isn't needed is valid. Though, it's a short walk down a slippery slope to argue that games in general aren't a need either so there is that. I guess where you draw the line in the sand for what constitutes a need is different for everyone, but still -- no hogwash there.
I use a 4k 32" and I'd say AA is still nice for some games but it isn't quite as necessary as when I played at 1080 on a 27". Aside from shadows it's the first thing I'm willing to turn down.
2080 is a lot closer in core count to a 1080 than a 1080ti, though, so if you're trying to compare architectures it's reasonable. Bang-for-buck is definitely another reasonable way to compare cards; it's just not what they're going for here.
I've bought every single 10 series card over the last 2 years for various computers, as they were great. Let's try to understand what the comparison slide actually means.
After taxes, the 2080 right now is $1231 CAD, and the 1080 is $660 CAD. So yeah, I'm still pissed about the pricing. 87 percent more expensive for probably 40 percent faster at 1440p? For me, I never doubted it would be faster, that was never the problem.
This is the first time in history that, instead of Nvidia delivering 20-50 percent more performance per dollar, the frames per dollar actually DECLINED. That has never happened before. Worst launch ever, and I'll be skipping these cards.
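The arithmetic behind that claim is easy to sanity-check; the prices are the after-tax CAD figures quoted above, and the 40% uplift is the commenter's estimate rather than a measured result:

```python
# Perf-per-dollar change, using the figures quoted above (assumed, not measured).
price_1080, price_2080 = 660.0, 1231.0   # CAD, after tax
perf_1080, perf_2080 = 1.00, 1.40        # assume the 2080 is ~40% faster at 1440p

ppd_1080 = perf_1080 / price_1080
ppd_2080 = perf_2080 / price_2080
print(f"perf-per-dollar change: {ppd_2080 / ppd_1080 - 1:+.0%}")   # about -25%
```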
So I take it you don't care about the move towards real-time ray tracing as an alternate way to render? If you were Nvidia, how would you shift the industry if you could (i.e., towards a better way to render)? It's clear they've taken a big gamble here in a lot of ways, trying to force a paradigm shift. This is an Apple move, and it may fail. I still give them credit because they ARE giving you a performance jump of 50%, and it looks like the 2080 beats the current $750 US 1080 Ti.
They're not giving you a 50 percent performance jump. They sold the 1060 and the 1070 before; you could have just bought a 1070 instead of a 1060, but you didn't because it was more expensive. Companies aren't giving you anything unless it performs better per dollar. Twice as fast for twice as much money isn't giving you anything.
Also, it isn't ray tracing for games. It is hybrid ray tracing (a long way from complete ray tracing). Actually, I think an add-in card, like a second card for PhysX, might have worked better.
"Checkerboard" has some specific baggage that doesn't apply here. But yes, the concept is similar: it's a way to get a level of quality similar to rendering at 4K without the actual computational costs of fully rendering at that resolution.
Now I see why Nvidia hasn't released any product roadmaps in a long while. More and more they're going to have to rely on these buzzword technologies to sell products.
"NVIDIA presented a non-interactive Epic Infiltrator 4K demo that was later displayed on the floor, comparing Temporal Anti Aliasing (TAA) to DLSS, where the latter provided on-average near-identical-or-better image quality but at a lower performance cost. In this case, directly improving framerates." How can an antialising technique with a "performance cost", whether high or low, improve frame rate? That is contradictory. Judging from the high increase in frame rate in the games that employ it (the graph isn't clear but I suppose that's what it shows), and by how Jensen Huang focused on deep learning's ability to super-sample low res images by "adding missing pixels" in his presenation, DLSS does not appear to be an antialising technique at all.
Rather, it looks like it is a pure super-sampler, just as its acronym says. My guess is that it allows games to be rendered internally at a lower resolution (say 1440p, maybe even 1080p), super-samples them to a 4K-equivalent resolution, and then outputs them as native 4K, thus raising the frame rate close to that of the lower internal resolution. Huang would not have presented that ability for no reason, and frankly, how else would it provide a performance gain?
An alternative idea might be to off-load game AI and physics to the tensor cores and raise frame rate by freeing the shaders from that task (something that might also be possible, but as a separate technique), but would that lead to such a large frame-rate increase? And why would that be called "super sampling"?
I don't actually object to the lower-resolution super-sampling scenario, by the way, provided it beats most blind tests. Super-sampling (or "resolution upscaling", which is the TV-set term) has always been a lame technique with little to no benefit, but deep learning might actually make it viable.
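If that lower-internal-resolution guess is right, the raw pixel math alone explains most of the frame-rate gain. A minimal sketch of such a pipeline (NumPy, with a dumb nearest-neighbour repeat standing in for the trained network, which is the assumed part):

```python
import numpy as np

def render(width, height):
    """Stand-in for the game renderer; shading cost scales roughly with pixel count."""
    return np.random.rand(height, width, 3)

def upscale(frame, factor):
    # Placeholder for the learned upscaler. If the guess is right, the tensor cores
    # run a trained network here that reconstructs plausible 4K detail instead of
    # just repeating pixels like this does.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

lo = render(1920, 1080)          # render internally at 1080p (integer scale keeps it simple)
out = upscale(lo, factor=2)      # present at 3840x2160
print(out.shape)                 # (2160, 3840, 3)

# Why the frame rate jumps: pixels actually shaded per frame.
print(3840 * 2160 / (1920 * 1080))   # 4.0  -> native 4K shades 4x the pixels of 1080p
print(3840 * 2160 / (2560 * 1440))   # 2.25 -> or 2.25x the pixels of 1440p
```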
I don't think it will always be faster than the 1080 Ti, based on the fact that the Ti has more memory and still has slightly faster memory throughput. This chart obviously makes a complex comparison difficult. I'm still excited for the 50% increase over the 1080, and more importantly, I'm on board with the move to ray tracing. I've yet to buy a high-refresh-rate monitor precisely because I don't need that assessment spoiling how I view games. I want the best possible IQ with 60 FPS if possible. I still play console games, which largely cling to 30 FPS, so I even accept that somewhat in PC games, even though I really want a 60 FPS minimum in most games.
I'm guessing they may not be getting the number of pre-orders, for what seems to be an overpriced card, that they hoped for. So they release a graph that shows no useful information and claim 135% increased performance. The graph shows the 1080 goes to 1 while the 2080 goes all the way to 2.35... the only thing that really tells us is that the tallest bar for the 2080 is 135% longer than the bar for the 1080, given the graph's scale. The best thing to do is wait for third-party testing to give us actual performance numbers. Until then, people need to resist the marketing hype and remember that you should never pre-order something you know nothing about.
At every card release Nvidia ALWAYS has numbers to back it up. Something smells with this release of new cards and having no solid number comparisons. I'm holding off on a pre-order.
If this is indeed a stumble by Nvidia, unfortunately AMD is nowhere around to capitalize. I like Nvidia cards well enough, but the prices have been climbing due to a lack of competition from AMD. They're getting obnoxiously expensive, mining craze aside. I'll hang onto my 290Xs and 1060s for a while longer. A 30% improvement for a 50% price increase is BS. I'll hang back and wait as well, but you're right, something smells. Vendor-agnostic mGPU gaming can't come soon enough.
Has anyone at the Gamescom event been able to check the Windows Device Manager for the Nvidia driver version? Do they allow any unsupervised playing time with the demo machines? Although it seems kind of doubtful they would permit that. But if so: shouldn't be too difficult to boot a portable SSD running "Windows to Go", copy over & install the Nvidia device drivers from the system's internal boot drive, and obtain some initial GPU-Z screen shots or maybe some crypto-mining benchmarks, for example.
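For what it's worth, if you could get even a minute at a command prompt on one of the demo machines (a big assumption given how locked down show-floor systems usually are), something like this would dump the GPU name and driver version without needing GPU-Z; it just shells out to the stock wmic tool on Windows:

```python
import subprocess

# Query the GPU name and driver version via WMI on a Windows machine.
# Assumes you can reach a command prompt or run your own Python from a USB stick.
result = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```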
So this RTX2080 will eclipse the existing 1080TI out of the box?
If so, why is everyone so damn salty? The 1080 Ti is not a "$529" card; used sales don't count. I'm seeing 1080 Tis still going for $750+. Even if they drop prior to launch, that's not what the 1080 Ti cost right up until the RTX 2080 launch. If you are the type that thinks ray tracing is just a "gimmick" and you don't buy Nvidia's push here (which is fine), you can still get a 1080 Ti, which suddenly became old news and will probably be $100-200 cheaper (by October).
I for one am not seeing anything that should dissuade me from my preorder. Yes, it's hella expensive, but so is the 1080 Ti, a card I previously scoffed at as ridiculous. I think real-time ray tracing is going to be the future, and faster than everyone thinks.
We live in a complicated communications environment. Firstly, it only takes a small percentage of people to create a hubbub because there are so many people to begin with. It can look like lots and lots are salty even when it's only a small percentage. Then that can snowball until it actually is lots of people. Secondly, there are fanboys who aren't even trying to be reasonable. They just say whatever to try support their fandom. Thirdly, you have bloggers and journalists that are competing fiercely for clicks. They get more clicks by being emotional and alarmist than being reasonable. So saying "Something is fishy with the presentation. Deviousness is afoot. The performance must be bad. Resist the hype!" gets more clicks than saying "Perhaps the new cards don't offer a sizable performance boost in legacy games compared to the old ones, but perhaps NVIDIA just wanted to hype up the ray tracing abilities without distraction. We have to wait until we get more information."
You are right, and it is unfortunate the internet has become the megaphone of the negative. It collates every negative opinion quickly from folks the world over into singular forums.
Early on - the internet seemed to thrive with more excitement and hype. Hype is dead - I'd hate to face the comments if I were an NVidia engineer.
I do think the video of Shadow of the Tomb Raider gameplay wasn't impressive from a ray tracing perspective, but it also is a scene that fares well with existing rasterization techniques (vs the night scene they showcased in the on-stage demo). I saw a handycam capture of the BFV demo (beyond what was in the stage demo released to the public), and that is a far more eye-opening demonstration of what ray tracing is capable of.
It's incredible that we can, in one generation, run games at playable framerates AT ALL using all ray tracing. Which bodes well for the transformation - even if many will not like that it means 30-60FPS on very expensive cards. This is how it will always work - like new tech in expensive luxury cars. It will be more fascinating if the RTX2070 can support 30FPS in ray-tracing in most games at 1080-1440P. That's still an incredible feat that our jaded cynical world wide web will dump all over.
But it isn't all raytracing. Not even close. That's years and years away.
What this is, is a tentative effort at a paradigm shift to start making raytracing the standard.
The problem is, a lot of people (myself included) buy based on performance, and ray tracing even just the things that are ray traced appears to bring rather disappointing performance to the games that have the option enabled.
Couple this with people not only being eager for some new cards to spend their shinies on, but also having literally only just gotten out of months of excessively high card prices, and the appeal of forking out over a grand for what appears to be, at best, around a 20% traditional performance improvement on a model-vs-model basis has left a very sour taste in many people's mouths.
I'm all for the progression of tech. I spent way over the odds for an SSD back in the day, but that improved performance. The level RT is at currently seems more like a side upgrade, and I think a lot of people feel the way I do and would rather have a range of cards that cost significantly less and cannot do RT, or cost the same and sacrifice RT cores etc. for higher clocks or more traditional cores.
I don't think ray tracing is years and years away.
As far as what people prefer, well, my opinion is they have their habits. They skip over visual artifacts but they are hypertuned to frames per second. If you're not playing competitively then what is the use of frames per second? Experiential quality. Well, to me, the differences that ray tracing make are worth more to experiential quality than some extra frames per second. I'm guessing I'd rather play at 40 fps with G-Sync and ray tracing than over 60 fps without ray tracing. That's if it's implemented well, of course.
Eventually ray tracing will enable a change in the artistic design of games, and even some game play changes. Those changes will take a longer time to come about, especially the game play ones. That can't really happen until most cards are ray tracing capable. The artistic changes will take time for the artists to relearn what is and isn't possible. But at the moment I think there's adequate low-hanging fruit that will come out to justify an extra $50 or $100 for the price of a card, even if it knocks frames per second down a bit in the process.
As far as sour taste in people's mouths, I don't buy it. I think the salty ones are a minority. Most people willing to pay that much for a graphics card are probably excited to be getting something special they don't get every generation.
I suppose you'd be ok with paying $1000 for a 3080 and $1500 for a 4080. After all, better performance warrants a higher price. By that logic a x080 card would be $4000 in a few years and users should be happy about it, right?
8800 GTX, which was a massive technological step (according to nvidia), launched at the SAME price point as 7800 GTX, even though it was up to two times faster or more in certain games.
It seems you've become used to being price gouged.
... and you are ignoring the fact that nvidia cut its price by $150 down to $500 a mere month after launch because it was overpriced for the offered performance compared to 4870.
So what? They still launched it at the high price. If AMD comes out with a card in a month that's competitive with the GTX 2080 and decides to price it at $400 then NVIDIA will again be forced to cut their prices. But AMD won't be able to do that. NVIDIA lost money 4 out of 5 quarters between mid 2008 and mid 2009.
Look at the quarterly earnings per share (second graph). You can see why they were trying to price things higher and how it hurt them to have to cut their prices. Of course AMD was also losing money over that time period. Of course the recession didn't help. AMD was also losing more and more to Intel Core processors on the CPU side at this time, and I am guessing they might have had heavy debt payments from their acquisition of ATI a couple of years prior. Why AMD decided to introduce cards at low prices at this time I don't know. Perhaps they either had a different idea of what the real price/demand curve for graphics cards was than NVIDIA or maybe they thought they could hurt NVIDIA through financial attrition even though they themselves were bleeding money. I really have no idea.
As you mentioned, the recession was in effect. AMD priced their cards quite low, yes, but I doubt it was low enough to lose them money. They were not in a situation to willingly sell cards at a loss. I could be wrong. Same goes for nvidia.
Intel's figures also took a nosedive in that period, so it seems recession was the main factor.
Oh it definitely was low enough to lose them money. The same for 2007 and 2006. You can look at their financial results. Go to page 60 of this financial document: http://quarterlyearnings.amd.com/static-files/2b12...
Look at the Graphics operating income. They lost 6 million in 2006, 39 million in 2007, and made 12 million in 2008. Making 12 million is pretty much breaking even. There was no recession in 2006 or most of 2007. I think NVIDIA only lost money in 2008 out of those years, probably a mix of the recession and NVIDIA switching to an expensive new architecture (Tesla).
Note that for AMD the Graphics segment "includes graphics, video and multimedia products and related revenue as well as revenue from royalties received in connection with the sale of game console systems that incorporate our graphics technology". They were probably making money on consoles, as the XBox 360 was pretty popular, so they probably still lost money on PC graphics cards in 2008.
Also notice that in 2007 and 2008 AMD was bleeding money in their CPU business. Why they decided to fight on price so heavily that they lost money in GPUs as well I don't know. But they either willingly sold at a loss, or they thought NVIDIA couldn't or wouldn't cut prices and therefore AMD would gain market share, or they miscalculated the price demand curve, or they would have lost money anyway even if they hadn't cut prices. I think the last one is very unlikely. People still would have bought almost as many graphics cards even if they all cost $50 more, as history has shown. Anyway, it wasn't good for them because they had to start cutting down their R&D expenses and then they fell behind NVIDIA. So when people say we need more competition to drive down prices, well, they had their time of lower prices years ago at the expense of said competition today, because the price cuts of yesteryear (along with AMD's awful CPU performance) cost AMD the ability to compete, until with Fury and Vega they were forced to build such expensive cards just to try to compete that they couldn't get much market share even by cutting prices. The cards were so expensive to make they could only cut the prices so much. At times they have given up the high end of the market completely.
You said AMD was losing money in that period and now we see that in 2008, where 4870/50 launched in June, they actually made money. So yea, the cards DID make them money even at such low prices. They were still recovering from the ATI purchase. Making even that amount of money in 2008 and in that financial situation and that time period is not bad with such low priced cards.
"Perhaps the most notable thing about the GeForce 7800 GTX at launch was the $600 USD price tag. Previous generations had drawn the line for flagship pricing at $400, so back in 2005 $600 really was considered to be pretty staggering."
Maybe your example is a bad one. The 8800 GTX wasn't a big jump in price only because NVIDIA had already pushed the price up to new territory with the 7800 GTX.
Let's look at what happened after the 8800 GTX. The 9800 GTX launched at $350.
In any case, things were not as stable back then as now. It's not a good comparison to be looking at generational price differences. But if we look at the 200 series onward, prices have gone up and down from generation to generation, but there hasn't really been a trend one way or the other.
It doesn't matter if 7800 GTX was already more expensive than the prior cards (and I did consider it to be quite overpriced back then). If technological advancements and better performance are the driving factors for a price increase, then how come 8800 GTX wasn't even MORE expensive, considering it was massively outperforming 7800 GTX and introduced CUDA?
9800 GTX is a terrible example. It was launched only 2 months before GTX 280 and 260, simply to act as a lower range, cheaper card to those two. Nvidia never released true low-range Tesla based cards.
2080 Ti is the most expensive generational flagship launch card in the past 18 years at $1000. The second most expensive is 7800 GTX at $767 adjusted for 2018. That's about 30% more.
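For reference, that inflation adjustment is just a CPI ratio; the index values below are approximate US CPI-U annual averages, so treat the output as ballpark rather than exact:

```python
# Rough CPI adjustment of the 7800 GTX launch price into 2018 dollars.
# Index values are approximate annual averages (assumed, not exact figures).
cpi_2005, cpi_2018 = 195.3, 251.1
launch_price_2005 = 599.0     # 7800 GTX launch MSRP, the ~$600 figure quoted above

adjusted = launch_price_2005 * cpi_2018 / cpi_2005
print(f"${adjusted:.0f} in 2018 dollars")   # ~$770, in the same ballpark as the $767 above
```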
"If technological advancements and better performance are the driving factors for a price increase, then how come 8800 GTX wasn't even MORE expensive,"
I think you can answer that for yourself if you just stop and think a second. The answer isn't hard to reach.
"9800 GTX is a terrible example."
All of the examples from back then are terrible. NVIDIA and ATI/AMD were coming out with cards all the time back then. Besides, there's no reason they need to introduce the card for a low price just because they will come out with something better and more expensive later. They can cut the price like GPU manufacturers used to do all the time in those days.
"2080 Ti is the most expensive generational flagship launch card in the past 18 years at $1000"
No. The Titan X launched at $1200. What was the major difference between the Titan X and the 1080 Ti other than time of introduction and 1 GB of RAM? It also launched close to the smaller chips, just like the 2080 Ti has. What you want to call the flagship products (Ti versions) always launched later. The 2080 Ti is launching alongside the 2080 this time. Also, compare the die size and the percentage of cores cut off the full die of the GTX 780 with the RTX 2080 Ti and it's obviously the wrong comparison to make. The RTX 2080 Ti is a huge chip. It is not proper to compare it to the GTX 780. What has happened is that people are buying more and more powerful GPUs as the GPU has become more and more important to the game experience relative to the other components. Therefore, NVIDIA has introduced a new class of product with huge die sizes that is priced higher. They introduced that back in the Maxwell generation. At first they released them as prosumer cards, only later coming out with a "Ti"-branded version at a reduced price. This time they skipped the Titan and released the "Ti"-branded card at the architecture launch. The proper comparison to the GTX 780 or the GTX 980 or the GTX 1080 is the RTX 2080.
How about you answer why 8800 gtx wasn't more expensive instead of dodging the question.
No, just a few are bad examples, like the one you came up with and I clearly explained why.
Titans are not mainstream cards and do not fit in the regular geforce line up.
Excuses. It's not my fault they named it 2080 Ti which makes it a direct replacement for 1080 Ti. It is overpriced. Simple as that. I really don't understand how you as a consumer could defend a corporation's pricing behaviour.
"How about you answer why 8800 gtx wasn't more expensive"
OK. Because there are only so many bullets in a chamber. If they already raised the prices the generation before they have less ability to raise them now, everything else being equal. It's the market that sets the prices, ultimately. A company is just trying to maximize their prices. There is a price demand curve that takes into account how many customers are willing to pay what. We would have to look at market conditions to determine what the hell was going on at the time. The conditions of cost to produce, competitive environment, strength of the economy. That's why looking back at such an example is not a good way of trying to have this discussion. My point with talking about historical prices is only to show that the trend has not been to decrease or increase same-class graphics cards over time. Sometimes they have been cheaper and sometimes more expensive. But to single out a particular launch and say "this launch is most like the current launch, let's look at that one" is a bad idea, because there are so many other factors that influence that individual launch besides the cost to manufacture. Additionally, we didn't even establish the actual cost to manufacture or research the 8800 GTX. You just declared it was new technology and therefore should be a direct comparison.
"No, just a few are bad examples, like the one you came up with and I clearly explained why."
No, they are all bad. Go back and make a chart of product introductions and price cuts back in those days. They happened much more frequently and haphazardly. It was a different time.
"Titans are not mainstream cards and do not fit in the regular geforce line up."
In the Maxwell days the Titan was a GeForce card. It fit in the GeForce lineup; it was just called Titan because it carried on the moniker of the card that went into the supercomputer. The Maxwell Titan, however, was the first to not have FP64 and really didn't have anything that distinguished it from GeForce cards other than being the highest-performing GeForce card of its time.
"Excuses. It's not my fault they named it 2080 Ti which makes it a direct replacement for 1080 Ti. It is overpriced. Simple as that. I really don't understand how you as a consumer could defend a corporation's pricing behaviour."
The name is the excuse. What's in a name? It's not the name that's important. It's the market segment that's important. That can be seen by die size, features, and by the release schedule. It's not your fault that it was named that. It's your fault that you are trying to argue that it's apples-to-apples even though the real-world situation is different, just because the name is the same. Then at the same time you try to argue that the 2080 Ti should also be compared with the 1080 strictly because of their launch schedule. Well, which is it? Should we follow the name or the launch schedule? In fact, we should follow the entire market conditions: the launch schedule, yes, but also the die size and the features of the card, i.e., the market segment the card is targeted at.
You said new technologies that require R&D and a bigger die force a price increase. The 8800 GTX had CUDA, had a much bigger die, and yet launched at the same price as the 7800 GTX. The fact of the matter is, as soon as the competition gets weak, prices jump up. It has ZERO to do with the features and performance. This very much applies to the 2080 cards.
It wasn't a different time. Price cuts shortly after overpriced launch prices are not the same as regular price cuts during a generation.
Doesn't matter if titans at first had geforce in their names. They were never part of the regular cards. They were for special use cases, like kepler titan with its FP64 performance, and/or a way to extract money from those people who didn't want to wait for a Ti card (maxwell titan); and with pascal titans nvidia officially made them a separate line from geforce, confirming what everyone was saying all along.
I NEVER compared 2080 Ti to 1080. You don't read my comments properly. I compared 1080 to 2080 and 1080 Ti to 2080 Ti.
I, as a buyer, do not care in the least how big the chip is or what it brings. If the names match, then the cards are in the same category, therefore they should be in the same pricing ballpark. If the price is this much higher, then they are overpricing it compared to its predecessor. It cannot get simpler than this.
"You said new technologies that require R&D and a bigger die force a price increase. 8800 GTX had CUDA, had a much bigger die and yet launched at the same price as 7800 GTX."
Yes, I did say that, and I supported that in direct ways. It's pretty obvious that a larger die will be more expensive to produce. It's also obvious that to maintain margins a costlier card will have to be sold at a higher price. These are facts. You mentioning the 8800 GTX is meaningless without hiring a market analyst to look into the specifics of the 8800 GTX and the RTX 2080. We just can't glean anything from it on its own. It's too complicated. I gave you a simple reason why the 8800 didn't increase prices further than the last generation, anyway.
"The fact of the matter is, as soon as the competition gets gets weak, prices jump up. It has ZERO to do with the features and performance. This very much applies to 2080 cards."
Not true if you look at the historical price trends. I've said this over and over: NVIDIA faced the weakest competition from AMD during the Maxwell era, yet Maxwell cards were actually among the more affordable. And competition in graphics cards always has to do with features and performance. I don't know what you mean by that.
"It wasn't a different time. Price cuts shortly after overpriced launch prices are not the same as regular price cuts during a generation."
It was a very different time. Go back and look at the price cuts and launch history. Each manufacturer seemingly came out with cards 3 times a year. They'd cut prices at launches and otherwise. Just because you've categorized one particular instance of a price cut in your mind doesn't change that.
"Doesn't matter if titans at first had geforce in their names."
Wait, does the name matter or not? Make up your mind. I'm being serious here. You can't have it both ways.
"They were never part of the regular cards. They were for special use cases, like kepler titan with its FP64 performance, and/or a way to extract money from those people who didn't want to wait for a Ti card (maxwell titan); ""
Not wanting to wait for a card with a particular name is a special use case? Hmm, ok. The Maxwell Titan filled the exact same market segment as the 2080 Ti. Nothing more, nothing less. You haven't made any argument against that other than the name.
"I NEVER compared 2080 Ti to 1080. You don't read my comments properly. I compared 1080 to 2080 and 1080 Ti to 2080 Ti."
You said: "2080 Ti is the most expensive generational flagship launch card in the past 18 years at $1000. The second most expensive is 7800 GTX at $767 adjusted for 2018. That's about 30% more."
The flagship launch card of the Pascal generation was the 1080. So you most certainly did compare the 2080 Ti to the 1080.
"I, as a buyer, do not care in the least how big the chip is or what it brings. If the names match, then the cards are in the same category, therefore they should be in the same pricing ballpark. If the price is this much higher, then they are overpricing it compared to its predecessor. It cannot get simpler than this."
That's fine. As a buyer you can do what you want. Then NVIDIA loses your purchase. It doesn't make your claims right or your decision rational, though.
I really doubt it's so expensive that they "need" to price it at $1000. I bet they could cut it to $800 and still make a sizable profit. I can't prove it but when there is no competition, companies tend to overprice. That's simple business 101.
Maxwells were still quite a bit more expensive than Fermis, back when AMD was competing properly.
... yet none of those 3 cards a year were generational upgrades.
I have made up my mind. It said Titan in the name and they were not part of the regular lineup. Later Nvidia even removed the GeForce from the name.
No, Titans never competed against Ti cards. Why? Because we had and still have Ti cards. There is going to be a Turing Titan card since the 2080 Ti doesn't even have all its cores enabled.
The 2080 Ti IS the most expensive flagship launch card. They both are the big-chip flagships of their generation. I cannot do anything about the lack of a 7800 Ti card. As I've already mentioned, I have never compared the 2080 Ti to the 1080. I should've written "generational flagship".
My claims are quite straightforward and clear. 1080 Ti, $700; 2080 Ti, $1000, therefore massively overpriced.
"I really doubt it's so expensive that they "need" to price it at $1000. I bet they could cut it to $800 and still make a sizable profit."
No, no chance they can make a sizable profit taking $200 off. How much do you think 11 GB of 14 Gbps GDDR6 costs? NVIDIA sells GPUs. Their operating margins are something like 35%, I think. I'm not sure what they sell the GPU for, but there's a lot of cost for the other components of a card. Suppose they sell it for $600. 35% of $600 is $210. If they took $200 off the price they'd be pretty much breaking even.
And NVIDIA tends to always make sizable profits. That's not a bad thing for consumers. It allows them to continue to invest in the technology. Look at the current difference between what NVIDIA can produce and what AMD can produce. That difference is a result of NVIDIA's sizable profits and AMD's losses.
But again, the point is not NVIDIA's sizable profits. The question is whether NVIDIA is making more money with the RTX 2080 at $700 or the GTX 1080 at $600. (The 1080 Ti and 2080 Ti are NOT a good comparison. The 1080 Ti did not come out until over 9 months later! If you want to make the comparison you must at least wait 9 months and consider the price of the 2080 Ti then, though my guess is that 9 months from now NVIDIA will be close to introducing a new generation and so may not play around with price much until that new generation comes out. That short time to the new generation is one reason the 2080 Ti is out now.) And the answer is no, they don't seem to be making more with the RTX 2080 at $700 than the GTX 1080 at $600, as evidenced by their projected gross margins for the upcoming quarter.
As far as a Turing Titan, no, I doubt it. Or if there is one, the Titan will include more RAM and the Ti will come down in price at that time. The Titan and the Ti share the same space. In the Maxwell generation they completely shared the same space and the only real difference was the timing of the release. It doesn't matter how much you try to deny that, it's true. You just insist otherwise without any real argument against it. "Why? because we had and still have Ti cards." No! We didn't have a Maxwell Ti card when the Titan X was introduced! And I don't think you want to be talking about the Maxwell Ti introduction too much, because it blows apart your whole "when NVIDIA doesn't have competition they price things very high and never cut prices" claim.
"2080 Ti IS the most expensive flagship launch card."
The 2080 Ti may be the most expensive flagship card launch but it is in a different segment from all the other flagship card launches! It's like saying that when Toyota came out with the Avalon it was the most expensive Camry ever.
"As I've already mentioned, I have never compared 2080 Ti to 1080. I should've written "generational flagship"."
No matter how you try to spin it or what labels you use, you ARE comparing the 2080 Ti to the 1080 when you are comparing "generational flagships". Stop and think for a minute. How can you compare two things without comparing them?
"My claims are quite straightforward and clear. 1080 Ti, $700; 2080 Ti, $1000, therefore massively overpriced."
Yeah, a card that came out 9 1/2 months after launch and a card that came out at launch. What's quite straightforward and clear is that the comparison is flawed.
Titans are not part of the regular geforce line. Nvidia removed the "geforce" part for the pascal variants and finally confirmed what everyone was suspecting. It doesn't get clearer than that.
Titans come early to milk as much money as possible and when that's done Ti cards come in and perform as well or better in games. That's why we didn't have a Ti maxwell before titan.
False car analogy. Avalon is a completely different line from Camry. That's the entire reason they are named differently. A newer generation Camry usually comes with a lot of new features and technologies and yet is just a bit more expensive than the last one. 2080s are simply a newer generation Camry but nvidia wants to charge you the price of an Avalon.
You are doing the spinning here. There was no Ti back then. 7800 GTX was the big-chip flagship card. Stop playing around.
It's not my fault nvidia released 2080 Ti with 2080. They did and priced it at $1000 and since it's the direct successor to 1080 Ti, as any normal person can see it is, I can say it's WAY overpriced compared to its direct predecessor.
I never thought a customer would be so willing to defend a corporation's pricing tactics. It is as if you like being charged more.
What if they priced it at $1500? Would you have also defended that? At what level would you say it's overpriced? $2000? $3000? Corporations overprice when there is no one to stop them. That's the reality.
I doubt that. Where did you get 35%? I don't buy that at $600 the 2080 would be barely breaking even. I really doubt it. Yes, I'm pretty sure 2080 cards are more expensive than 1080 cards, but not expensive enough to justify such prices, especially for the 2080 Ti.
Nvidia isn't stupid enough to price a card at $700 if a mere $100 price cut would wipe out the profits. Companies always price their cards high enough that they can absorb unexpected, forced early price cuts.
How many times do I have to repeat this: I am NOT claiming the 2080 is making them more money than the 1080. I'm just saying they could drop the price a bit and still make a respectable profit.
2080 Ti would almost certainly still cost $1000 nine months later. No competition, therefore no price cuts, and since those two cards are both Ti, therefore I CAN compare them.
1157/3123 = 37%; that's operating income over revenue (in millions) from their most recent quarterly results. What the operating margins of a particular product are, I have no idea. It's probably a bit higher than average for the Ti parts, but it's tough to say because the data center (Tesla) parts probably have margins much higher than average. But these are just rough numbers to show you that what you said is way off base.
"I don't buy that at $600 2080 would be barely breaking even. I really doubt it."
That's not what I said. I said that if you take $200 off the price of the 2080 Ti GPU (I wasn't talking about the 2080, although taking $100 off the 2080 would do something similar. It just wasn't the calculation I made) they are probably approaching break even. Let's clarify something. NVIDIA sells GPUs, not graphics cards. When you buy an MSI RTX 2080 Ti for $1000, a cut of that goes to the retailer. I don't know what their margins are, but let's say they bought the card for $900. Now MSI had to buy all the RAM and the PCB and the GPU and all the other components to put it together, plus they need to make a margin as well to make it worth it. Perhaps it costs them $800 to make the card and $100 is their cut. Out of that $800, they need to pay for the assembly line and workers and the components, including the GPU. So I think $600 for the TU-102 GPU was a very high estimate, since it doesn't leave very much for all the other stuff. That $600 is what NVIDIA is getting. If you take $200 off of that then you leave them with $400. If NVIDIA is selling the card at about 1/3 operating margins then 2/3 of the money they receive for the part is going toward expenses. 2/3 of $600 is $400. So, that $200 you took off was surely almost all of their operating profits.
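Pulling the assumed numbers from that chain into one place (every figure here is a guess from the paragraph above, not anything Nvidia or its partners have disclosed):

```python
# Back-of-envelope 2080 Ti cost chain, using the assumed figures from the paragraph above.
retail_price      = 1000.0   # what the buyer pays at the store
retailer_margin   = 100.0    # assumed: retailer buys the card from the AIB for ~$900
aib_margin        = 100.0    # assumed: the AIB's cut, so the card costs ~$800 to build
gpu_price_to_aib  = 600.0    # assumed: what Nvidia charges the AIB for the TU102 GPU
other_build_cost  = (retail_price - retailer_margin - aib_margin) - gpu_price_to_aib
nvidia_op_margin  = 0.35     # roughly Nvidia's company-wide operating margin

nvidia_cost   = gpu_price_to_aib * (1 - nvidia_op_margin)   # ~$390 of cost per GPU
nvidia_profit = gpu_price_to_aib - nvidia_cost              # ~$210 per GPU
print(f"RAM/PCB/cooler/assembly budget: ${other_build_cost:.0f}")    # only ~$200
print(f"Nvidia profit per GPU:          ${nvidia_profit:.0f}")
print(f"...after absorbing a $200 cut:  ${nvidia_profit - 200:.0f}")  # essentially break-even
```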
Now, perhaps you want NVIDIA's AIB partners and the retailers (good luck with the retailers) to share in the margin loss with your price cut. Then you are not just squeezing NVIDIA's profits you are squeezing the others' profits too. Maybe you can get Micron to take less for the GDDR6 DRAM chips...
My point is this: NVIDIA projects upcoming margins to be lower than the previous quarter's margins. So it doesn't appear like these new cards are priced in a way to give NVIDIA richer margins than the 10 series cards. That suggests that the higher prices are accounted for by greater costs to manufacture.
"So it doesn't appear like these new cards are priced in a way to give NVIDIA richer margins than the 10 series cards."
Sigh, for the 4th or 5th time, I never said that's the case. 2080s are almost certainly making less profit than 1080s, but I do not believe for a second that lower prices would still not have made them a sizable profit.
You can't calculate a card's profit margin based on the entire company's profit numbers. There is no way each 2080 Ti costs AIBs $800 to make. I very, very much doubt that. As I've mentioned before, businesses always leave enough room for unexpected price cuts so that they'd still make an acceptable profit.
"Sigh, for the 4th or 5th time, I never said that's the case. 2080s are almost certainly making less profit than 1080s, but I do not believe for a second that lower prices would still not have made them a sizable profit."
Then what are we arguing about? The price increases are justified by the greater cost of the cards if they are still making less profit off them even with the increase. You're just making a normative statement that "NVIDIA should be making less profit altogether".
"You can't calculate a card's profit margin based on the entire company's profit numbers."
You're right, but it's the best that we have. We can guesstimate.
"There is no way each 2080 Ti costs AIBs $800 to make."
If NVIDIA is charging $600 for the GPU, then that leaves $200 for the other stuff. $200 for the RAM, PCB, voltage regulators, heat sink, labor and assembly-line costs, etc., seems exceedingly low. I tried to estimate the price NVIDIA is charging AIBs at the high end of the range because that gives the best chance of your $200 price cut not resulting in a loss for NVIDIA, making your case as strong as possible. If we move the price NVIDIA is charging AIBs to $500, then that leaves more room for a possible $700 cost to make the cards. But that doesn't help your case.
"As I've mentioned before, businesses always leave enough room for unexpected price cuts so that they'd still make an acceptable profit."
No they don't. They pretty much maximize their profits for the expected market conditions while avoiding risky situations that could put them in financial distress. But NVIDIA doesn't really have to worry about that latter part at the moment. The planning of what costs are acceptable happens a lot earlier than bringing the product to market. It's the market that sets the price. Companies try to predict the market conditions and then maximize their profits within those conditions. They end up with margins because maintaining margins is the whole point of the game, not in case there are unexpected price cuts.
I assume they priced it like this so they can lower it into the normal range a year later. Imagine next year you get double the performance of the RTX 2080 with 7nm.
I haven't been following GPUs closely; what happened to dual-GPU configs? Is the software still not up to it?
Judging by your hands-on with real-time ray tracing in games from a couple of days ago, an RTX 2080 Ti struggles to maintain 60fps at 1080p. And even though games and drivers aren't final yet, it's still hard to believe they'd be able to gain much performance. If so, what on earth is NVIDIA on about with these 4K60fps stats, and on a less powerful card no less?
Is NVIDIA advertising two separate features (ray tracing and 4K) that can work on their own but not together? Are we talking about 2080 being capable of 4K rasterization and ray tracing at, what, 900p? Seems about right if 2080 Ti struggles at 1080p... And what of the 2070 then? Would it be able to run ray tracing in games at all?
It doesn't seem likely NVIDIA would champion a marquee feature, and name their cards after it, that is such a performance hog that it can only run on a $1000 flagship at 1080p in late 2018.
Something doesn't add up. Please, confirm, deny, or provide more information.
My impression has been that the struggling performance we've seen with the likes of BFV, Metro, and Tomb Raider was with RT enabled, and the recent numbers straight from nVidia have been without RT and with some unspecified AA (to then compare to DLSS). The 4K numbers are almost without a doubt with RT disabled, otherwise they'd be shouting it out from the rooftops.
Here in the UK the new 2080 (non-Ti) starts at £715. A 1080Ti can be had for £630. Sources below: https://www.scan.co.uk/shop/computer-hardware/gpu-... https://www.scan.co.uk/shop/computer-hardware/gpu-... Given that the 2080 is £85 more expensive, has less VRAM, and is a year newer, I would seriously hope that it would be not just as fast as a 1080Ti, but a lot faster. I suspect the opening topic for discussion at the most recent Nvidia board meeting was "How much can we get away with charging?".
Ansel has to be one of the most underrated features of modern nvidia cards. The kind of screenshots you can get in The Witcher 3 are really quite stunning. And such stupidly high resolutions too.
erwos - Wednesday, August 22, 2018 - link
Prices still seem high... we used to get these performance increases with no or $50 price hikes.HighTech4US - Wednesday, August 22, 2018 - link
Go back to playing on your peasant console these high and mighty cards are not meant for you.lilmoe - Wednesday, August 22, 2018 - link
I believe this product is right for you.https://www.amazon.com/Curious-Minds-Busy-Bags-Str...
eva02langley - Thursday, August 23, 2018 - link
Since when WCCF forums invaded anandtech? I know that facts doesn't worth anything there, but here it does.PeachNCream - Thursday, August 23, 2018 - link
That's an oddly aggressive response to a comment about rising GPU costs. Although consoles do get priority from most of the larger game studios and PC ports are unfortunately treated like second-class afterthoughts, there are still PC only developers out there including a number of smaller and indy teams that are putting priority on those systems. I also agree with you that it's a shame PC hardware costs a lot more and we end up stuck with halfhearted, months-delayed releases that make the high priced GPUs and CPUs we buy pretty pointless, I'm not going to lash out randomly at someone that rightfully points out we're not getting a very good performance-versus cost return in the GTX to RTX transition. It's a valid point after all.kohlscard - Wednesday, August 29, 2018 - link
https://kohlscreditcard.loginsi.comHollyDOL - Thursday, August 23, 2018 - link
Tell me about it, local store offers 2080 Ti for 1400Eur and 2080 for 1000Eur. This generation I'll be skipping.Manch - Thursday, August 23, 2018 - link
yeah, unfortunately this means current cards wont be dropping much in price.milkod2001 - Thursday, August 23, 2018 - link
Stupidly high. This happens when there is no competition. AMD really dropped ball with GPUs this year , at least they do good with CPUs.eva02langley - Thursday, August 23, 2018 - link
It is not AMD fault that Nvidia is charging overpriced "mainstream" GPU, it is AMD own greed and their investors. Stop blaming AMD for Nvidia behaviors.eva02langley - Thursday, August 23, 2018 - link
Edit:It is not AMD fault that Nvidia is charging overpriced "mainstream" GPU, it is *****NVIDIA***** own greed and their investors. Stop blaming AMD for Nvidia behaviors.
Manch - Thursday, August 23, 2018 - link
Well milk~ isn't wrong. The fact that AMD doesn't have anything to compete is one of the reasons NVIDA can get away with this. No competition to undercut them so they don't have to drop prices. Intel did for years with no competition. Now look. We're getting 6 core 8 core CPUs at great prices. Competition is always a win for the consumer.TrackSmart - Wednesday, August 22, 2018 - link
HOLD THE PRESSES: The launch price of the 2080 is the same as the launch price of the 1080 Ti! NVidia is comparing their new product to a lower-priced part. The 1080 Ti is about 30% faster than a vanilla 1080, so push all of those bars up to 1.3 instead of 1.0, for a more realistic comparison.In an apples-to-apples comparison, versus the 1080 Ti, we are looking at a ~10% to 25% performance uplift compared to a similarly priced card from the previous generation. Note that I'm ignoring DLSS speed gains, which are only relevant if anti-aliasing of some kind is used in the comparison -- something not really needed when running at 4k resolution.
I may be wrong, but I call this an unfair comparison.
shabby - Wednesday, August 22, 2018 - link
Don't worry, when reviews come out we will get all the answers. With the 1080ti hitting $529 the 2080 will seem overpriced and should drop rather quickly.bloodyduster - Wednesday, August 22, 2018 - link
The chart is not using price as a basis of comparison, but rather the class of GPU. It is comparing this year's 2080 to last year's 1080. They are comparing the performance gain from last year's model, which seems like a fair comparison to meAlistair - Wednesday, August 22, 2018 - link
It's usually fair, because the price has usually been similar or not that different. Not this time.TrackSmart - Wednesday, August 22, 2018 - link
This! The question is how much more performance-per-dollar Nvidia is providing with this new set of cards compared to the previous generation. The name they have applied to the new cards merely reflects marketing. Unless reviewers call BS, Nvidia gets away with showing "50%" more performance by applying a favorable new naming convention!Think about what comparing a $700 launch price 1080 Ti to a $1000 launch price 2080 Ti really means when looking at the performance per dollar that Nvidia are providing. The same applies to this "2080 vs 1080" comparison.
eva02langley - Thursday, August 23, 2018 - link
Let's be realistic here, no card are at the MSRP, they are all around the Founder Edition price bracket at +-50$... and that is in the US, elsewhere in the world, we are talking crazy prizes. Europe get screwed big time for a 2080 TI.1400-1500 euros = 1600-1700$ US
PeachNCream - Thursday, August 23, 2018 - link
Reviewers will not call BS outright because they won't get pre-NDA lift hardware to play with if they besmirch the good name of the OEM by being rightfully critical. Benchmark graphs may show the reality, but the text around it will likely be very forgiving or carefully phrased to moderate the tone. It's the world we live in now since journalists can't realistically be asked to purchase GPUs out of the business expense account and they'd not get the hardware in advance of a NDA lift even if they did which would ultimately endanger the reviewer in question's job security.Icehawk - Thursday, August 23, 2018 - link
Read HardOCP then; they have been on NV's shitlist for a while and buy their own cards. They said NV is actually going to give them RTXs this time around, but I doubt they are going to sugarcoat anything.
I agree with the price:perf thing. The models don't line up at all anymore, so I don't give a flying F what they call it - how much does it cost and what improvement will I get? I was hoping to move on from my 970 as it's pretty maxed out running 2-4K games, and I recently went from 2K to 4K on my main screen, so I'd like a bit more performance. I imagine some of the pricing woes are due to the insane memory pricing and the large amount of it on these cards, and not just NV sticking it to us because AMD is pulling another Radeon.
Santoval - Wednesday, August 22, 2018 - link
DLSS is not an anti-aliasing technique, because anti-aliasing cannot have a frame rate performance benefit. DLSS will employ deep learning to super-sample games (I just described its acronym, btw) that are running internally at a lower-than-4K resolution (probably 1440p) and output them at 4K. Which is why Huang was showing these low-res cats etc. being super-sampled to a higher resolution in his presentation. If DLSS is effective enough, the results might be practically identical.
Alexvrb - Wednesday, August 22, 2018 - link
What you just described is the opposite of supersampling.
https://en.wikipedia.org/wiki/Supersampling
Supersampling, aka FSAA, is the oldest form of AA. If they're *actually* supersampling (and not just using the term for marketing), they're rendering at a higher resolution, then downsampling to a lower res. This method uses the extra samples for the color calculation. With various other types of AA there can be a performance benefit vs FSAA, but not a performance benefit vs running without any AA in the first place.
Again, this assumes they're actually performing some form of supersampling and not just marketing it as such to be extremely obnoxious.
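For anyone unfamiliar with the distinction, here is a toy sketch of what classic supersampling actually does (illustrative only, not how DLSS works):

# Classic supersampling (FSAA): shade at a higher resolution, then box-filter
# down to the target resolution. The opposite of rendering low and upscaling.

def render(width, height):
    # Stand-in for a renderer: returns a width x height grid of "shaded" values.
    return [[(x * 31 + y * 17) % 256 for x in range(width)] for y in range(height)]

def downsample_2x(image):
    # Average each 2x2 block of samples into one output pixel (box filter).
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            row.append((image[y][x] + image[y][x + 1] +
                        image[y + 1][x] + image[y + 1][x + 1]) / 4)
        out.append(row)
    return out

# 2x2 supersampling for a 4x4 target: shade 8x8 (4x the work), then downsample.
final = downsample_2x(render(8, 8))
print(len(final), "x", len(final[0]))  # 4 x 4 output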
Yojimbo - Thursday, August 23, 2018 - link
Yeah, maybe it should be called deep learning upsampling. But let's try to make a case for calling it supersampling.
One could argue that what you described is supersampling anti-aliasing, and not just supersampling. Supersampling would mean choosing more samples than exist in the render output target, taken from some other space. The space does not necessarily have to be a higher-resolution render; that's just one possibility. In this case the space is an interpolation produced by a neural network operating on the initial render output. So they get these super samples and then, since they are not anti-aliasing, they don't downsample. Instead, they simply render at the new supersampled resolution. Deep learning supersampling?
Yojimbo - Thursday, August 23, 2018 - link
One minor clarification. I should say "taken from some other higher resolution space", because otherwise the term "super" isn't justified. But the neural network interpolation is such a higher resolution space that was made without rendering at a higher resolution.
Yojimbo - Thursday, August 23, 2018 - link
You know what, it seems they are comparing this to Temporal Anti-Aliasing. I guess they are doing temporal super sampling in an interpolated space created by deep learning inference on the rendered output. But I dunno, I'm just guessing here. I don't really know how temporal anti-aliasing works. Maybe someone with knowledge of anti-aliasing and temporal anti-aliasing can help.
Yojimbo - Wednesday, August 22, 2018 - link
Prior to DLSS I used to render at lower quality and then manually super-sample through interactive yes-no queries. But it was hard to do and maintain good game play. I bought a 6-button mouse just for that purpose, though.
Yojimbo - Wednesday, August 22, 2018 - link
The 1080 Ti came out over 9 months later, so it's not a fair comparison. It's definitely not an "apples-to-apples comparison", unlike what you claim.
The GTX 1080 launched at $699 for the founders edition. The RTX 2080 is going to launch at $799 for the founders edition. So it's more expensive but not nearly as much as you're making it sound to be.
Kvaern1 - Thursday, August 23, 2018 - link
"The GTX 1080 launched at $699 for the founders edition. The RTX 2080 is going to launch at $799 for the founders edition. So it's more expensive but not nearly as much as you're making it sound to be."Furthermore. $699 was the MSRP for the 1080 FE but the retail price was considerably higher for months after the launch.
RSAUser - Tuesday, August 28, 2018 - link
That's not even the problem.
The 1080 in SLI is not even close to delivering the performance of two cards.
Lots of these games (especially the more extreme results) use HDR, which hammers the 1080 by around 20% for some reason, and they run at 4K, where the 1080 is known to lack bandwidth.
These are the most cherry-picked results, and the 2080 is probably only 15-20% faster in reality, matching or slightly beating the 1080 Ti depending on the workload, while coming in at a much higher price.
Samus - Thursday, August 23, 2018 - link
I think what happened is that the last generation (really the current generation of cards) launched at a time when the mining thing went apeshit and they were all inflated. So now things are returning to semi-normal; that's why the 2080 is priced at the MSRP of the 1080 Ti.
Socius - Thursday, August 23, 2018 - link
I play on a 27” 4K monitor and I can’t stand games that don’t offer TAA. This idea that you don’t need AA at 4K is complete hogwash.
PeachNCream - Thursday, August 23, 2018 - link
There is a difference between "need" and "want", so the claim that AA isn't needed is valid. Though it's a short walk down a slippery slope to argue that games in general aren't a need either, so there is that. I guess where you draw the line in the sand for what constitutes a need is different for everyone, but still -- no hogwash there.
Icehawk - Thursday, August 23, 2018 - link
I use a 4K 32" and I'd say AA is still nice for some games, but it isn't quite as necessary as when I played at 1080p on a 27". Aside from shadows, it's the first thing I'm willing to turn down.
Peeling - Thursday, August 23, 2018 - link
The 2080 is a lot closer in core count to a 1080 than to a 1080 Ti, though, so if you're trying to compare architectures it's reasonable. Bang-for-buck is definitely another reasonable way to compare cards; it's just not what they're going for here.
Alistair - Wednesday, August 22, 2018 - link
I've bought every single 10 series card over the last 2 years for various computers, as they were great. Let's try to understand what the comparison slide actually means.
After taxes, the 2080 right now is $1231 CAD, and the 1080 is $660 CAD. So yeah, I'm still pissed about the pricing. 87 percent more expensive for probably 40 percent faster at 1440p? For me, I never doubted it would be faster; that was never the problem.
This is the first time in history that, instead of Nvidia delivering 20-50 percent better performance per dollar, the frames per dollar actually DECLINED. That has never happened before. Worst launch ever, and I'll be skipping these cards.
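A quick sketch of that frames-per-dollar claim using the numbers above (the ~40 percent uplift is an estimate, not a benchmark result):

# Back-of-the-envelope check: did performance-per-dollar go up or down?
# Prices are the CAD figures quoted above; the uplift is a rough estimate.

price_1080, price_2080 = 660.0, 1231.0   # CAD, after taxes (as quoted)
est_uplift_1440p = 1.40                  # assumed: 2080 ~40% faster than a 1080

perf_per_dollar_old = 1.0 / price_1080
perf_per_dollar_new = est_uplift_1440p / price_2080
change = perf_per_dollar_new / perf_per_dollar_old - 1

print(f"Price increase: {price_2080 / price_1080 - 1:.0%}")   # ~87%
print(f"Perf-per-dollar change: {change:.0%}")                # ~-25%, i.e. a decline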
Alistair - Wednesday, August 22, 2018 - link
Excuse my language, I wish I could edit that out. I am also interested in seeing an AnandTech analysis of DLSS. Could be very interesting.
wyatterp - Thursday, August 23, 2018 - link
So I take it you don't care about the move towards real-time ray tracing as an alternate way to render? If you were Nvidia, how would you shift the industry if you could (i.e., towards a better way to render)? It's clear they've taken a big gamble here in a lot of ways - trying to force a paradigm shift. This is an Apple move - and it may fail. I still give them credit because they ARE giving you a performance jump of 50% - and it looks like the 2080 beats the current $750 US 1080 Ti.
Alistair - Thursday, August 23, 2018 - link
They're not giving you a 50 percent performance jump. They sold the 1060 and the 1070 before; you could have just bought a 1070 instead of a 1060, but you didn't because it was more expensive. Companies aren't giving you anything unless it performs better per dollar. Twice as fast for twice as much money isn't giving you anything.
Alistair - Thursday, August 23, 2018 - link
Also, it isn't ray tracing for games. It is hybrid ray tracing (a long way from complete ray tracing). I actually think an add-in card, like a second card for PhysX, might have worked better.
evilpaul666 - Wednesday, August 22, 2018 - link
So DLSS is AI-powered checkerboard rendering, basically? Using AI training done on the dev side and inferencing on the GPU at runtime?
Ryan Smith - Wednesday, August 22, 2018 - link
"Checkerboard" has some specific baggage that doesn't apply here. But yes, the concept is similar: it's a way to get a level of quality similar to rendering at 4K without the actual computational costs of fully rendering at that resolution.
maroon1 - Wednesday, August 22, 2018 - link
And that looks like a good technology for people who have an RTX 2070 and cannot run some demanding games at 4K. You would get almost comparable quality to 4K, but with a performance cost similar to maybe 1800p.
jwcalla - Wednesday, August 22, 2018 - link
Now I see why Nvidia hasn't released any product roadmaps in a long while. More and more they're going to have to rely on these buzzword technologies to sell products.
wyatterp - Thursday, August 23, 2018 - link
Real-time ray tracing is a buzzword? Ray tracing as a rendering technique is 3 decades old and is THE gold standard in rendering outside of games.
Santoval - Wednesday, August 22, 2018 - link
"NVIDIA presented a non-interactive Epic Infiltrator 4K demo that was later displayed on the floor, comparing Temporal Anti Aliasing (TAA) to DLSS, where the latter provided on-average near-identical-or-better image quality but at a lower performance cost. In this case, directly improving framerates."How can an antialising technique with a "performance cost", whether high or low, improve frame rate? That is contradictory. Judging from the high increase in frame rate in the games that employ it (the graph isn't clear but I suppose that's what it shows), and by how Jensen Huang focused on deep learning's ability to super-sample low res images by "adding missing pixels" in his presenation, DLSS does not appear to be an antialising technique at all.
Rather, it looks like it is a pure super-sampler, just as its acronym says. My guess is that it allows games to be rendered internally at lower resolution (say 1440p, maybe even 1080p), then super-sample them to a 4K-equivalent resolution, and then output them as native 4K, and thus raise frame rate close to that of the lower internal resolution. Huang would not have presented that ability for no reason, and frankly, how else would it provide a performance gain?
An alternative idea might be to off-load game AI and physics to the tensor cores and raise frame rate by freeing the shaders from that task (something that might also be possible, but as a separate technique), but would that lead to so much higher frame rate? And why would that be called "super sampling"?
I don't actually object to the lower resolution super-sampling scenario by the way, providing it beats most blind tests. Super-sampling (or "resolution upscaling", which is the TV set term) has always been a lame technique with little to no benefit, but deep learning might actually make it viable.
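If that guess is right, the potential win is easy to see from pixel counts alone. A rough sketch that assumes shading cost scales roughly with pixel count and treats the upscaling step, whatever it is, as cheap:

# Hypothesized DLSS-style pipeline: render internally at a lower resolution,
# then upscale to the output resolution. Assumes shading cost scales roughly
# linearly with pixel count and ignores the cost of the upscale itself.

def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)

for name, (w, h) in {"1440p": (2560, 1440), "1080p": (1920, 1080)}.items():
    fraction = pixels(w, h) / native_4k
    print(f"Render at {name}, output at 4K: ~{fraction:.0%} of the native-4K "
          f"shading work (up to ~{1 / fraction:.1f}x the frame rate, best case)")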
eddman - Wednesday, August 22, 2018 - link
"How can an antialising technique with a "performance cost", whether high or low, improve frame rate?"He meant improved performance compared to TAA. That graph should've said "1080+TAA" and "2080+TAA".
maroon1 - Wednesday, August 22, 2018 - link
If the RTX 2080 is 40-50% faster than the GTX 1080, then that should make it even faster than the GTX 1080 Ti.
With DLSS enabled, it would blow any Pascal GPU out of the water (even the Titan V).
wyatterp - Thursday, August 23, 2018 - link
I don't think it will always be faster than the 1080 Ti, based on the fact that the Ti has more memory and still has slightly faster memory throughput. This chart obviously makes a complex comparison difficult. I'm still excited for the 50% increase over the 1080, and more importantly, I'm on board with the move to ray tracing. I've yet to buy a high refresh rate monitor precisely because I don't need that assessment spoiling how I view games. I want the best possible IQ with 60FPS if possible. I still play console games, which largely cling to 30FPS - so I even accept that somewhat in PC games, even though I really want a 60FPS minimum in most games.
ThrakazogZ - Wednesday, August 22, 2018 - link
I'm guessing they may not be getting the number of pre-orders they hoped for, for what seems to be an overpriced card. So they release a graph that shows no useful information and claim 135% increased performance. The graph shows the 1080 goes to 1, while the 2080 goes all the way to 2.35... the only thing that really tells us is that the tallest graph bar for the 2080 is 135% longer than the bar for the 1080, given the graph's scale. The best thing to do is wait for 3rd-party testing to give us actual performance numbers. Until then, people need to resist the marketing hype and remember that you should never pre-order something you know nothing about.
Batmeat - Wednesday, August 22, 2018 - link
With every Nvidia card release, they ALWAYS have numbers to back it up. Something smells with this release of new cards and the lack of solid number comparisons. I'm holding off on a pre-order.
darckhart - Wednesday, August 22, 2018 - link
Yea, I clicked this article because it said "performance numbers" and all I see is a bar graph with some BS y-axis that is not fps.
Manch - Thursday, August 23, 2018 - link
If this is indeed a stumble by Nvidia, unfortunately AMD is nowhere around to capitalize. I like Nvidia cards well enough, but the prices have been climbing due to a lack of competition from AMD. They're getting obnoxiously expensive, mining craze aside. I'll hang onto my 290Xs and 1060s for a while longer. A 30% improvement for a 50% price increase is BS. I'll hang back and wait as well, but you're right. Something smells. Vendor-agnostic MGPU gaming can't come soon enough.
vailr - Wednesday, August 22, 2018 - link
Has anyone at the Gamescom event been able to check the Windows Device Manager for the Nvidia driver version? Do they allow any unsupervised playing time with the demo machines? It seems kind of doubtful they would permit that. But if so, it shouldn't be too difficult to boot a portable SSD running "Windows to Go", copy over and install the Nvidia device drivers from the system's internal boot drive, and obtain some initial GPU-Z screenshots or maybe some crypto-mining benchmarks, for example.
HollyDOL - Thursday, August 23, 2018 - link
Ugh... Remembering the huge availability issues I had getting my current 1080, I really hope this card cannot cryptomine.
wyatterp - Thursday, August 23, 2018 - link
So this RTX 2080 will eclipse the existing 1080 Ti out of the box?
If so, why is everyone so damn salty? The 1080 Ti is not a "$529" card - used sales don't count. I'm seeing 1080 Tis still going for $750+. Even if they drop prior to launch, that's not what the 1080 Ti cost right up until the RTX 2080 launch. If you are the type that thinks ray tracing is just a "gimmick" and don't buy Nvidia's push here (which is fine), you can still get a 1080 Ti, which suddenly became old news and will probably be $100-200 cheaper (by October).
I for one am not seeing anything that should dissuade me from my preorder. Yes, it's hella expensive, but so is the 1080 Ti, a card I previously scoffed at as ridiculous. I think real-time ray tracing is going to be the future, and it will arrive faster than everyone thinks.
Yojimbo - Thursday, August 23, 2018 - link
We live in a complicated communications environment. Firstly, it only takes a small percentage of people to create a hubbub because there are so many people to begin with. It can look like lots and lots are salty even when it's only a small percentage. Then that can snowball until it actually is lots of people. Secondly, there are fanboys who aren't even trying to be reasonable. They just say whatever to try to support their fandom. Thirdly, you have bloggers and journalists that are competing fiercely for clicks. They get more clicks by being emotional and alarmist than by being reasonable. So saying "Something is fishy with the presentation. Deviousness is afoot. The performance must be bad. Resist the hype!" gets more clicks than saying "Perhaps the new cards don't offer a sizable performance boost in legacy games compared to the old ones, but perhaps NVIDIA just wanted to hype up the ray tracing abilities without distraction. We have to wait until we get more information."
wyatterp - Thursday, August 23, 2018 - link
You are right, and it is unfortunate the internet has become the megaphone of the negative. It collates every negative opinion quickly from folks the world over into singular forums.
Early on, the internet seemed to thrive on more excitement and hype. Hype is dead - I'd hate to face the comments if I were an Nvidia engineer.
I do think the Shadow of the Tomb Raider gameplay video wasn't impressive from a ray tracing perspective, but it also is a scene that fares well with existing rasterization techniques (vs the night scene they showcased in the on-stage demo). I saw a handycam capture of the BFV demo (beyond the stage demo released to the public) - and that is a far more eye-opening demonstration of what ray tracing is capable of.
It's incredible that we can, in one generation, run games at playable framerates AT ALL using all ray tracing. Which bodes well for the transformation - even if many will not like that it means 30-60FPS on very expensive cards. This is how it will always work - like new tech in expensive luxury cars. It will be more fascinating if the RTX 2070 can support 30FPS of ray tracing in most games at 1080-1440p. That's still an incredible feat that our jaded, cynical world wide web will dump all over.
Skiddywinks - Thursday, August 23, 2018 - link
But it isn't all raytracing. Not even close. That's years and years away.
What this is, is a tentative effort at a paradigm shift to start making raytracing the standard.
The problem is, a lot of people (myself included) buy based on performance, and raytracing even only the things that are raytraced appears to bring some rather disappointing performance to the games that have the option enabled.
Couple this with people not only being eager for some new cards to spend their shinies on, but also having literally only just gotten out of months of excessively high card prices, and the appeal of forking out over a grand for what appears to be, at best, around a 20% traditional performance improvement on a model-vs-model basis has left a very sour taste in many people's mouths.
I'm all for the progression of tech. I spent way over the odds for an SSD back in the day, but that improved performance. The level RT is at currently seems more like a side upgrade, and I think a lot of people feel the way I do, and would rather have a range of cards that cost significantly less and cannot do RT, or cost the same and sacrifice RT cores etc. for higher clocks or more traditional cores.
Yojimbo - Thursday, August 23, 2018 - link
I don't think ray tracing is years and years away.
As far as what people prefer, well, my opinion is they have their habits. They skip over visual artifacts but they are hypertuned to frames per second. If you're not playing competitively, then what is the use of frames per second? Experiential quality. Well, to me, the differences that ray tracing makes are worth more to experiential quality than some extra frames per second. I'm guessing I'd rather play at 40 fps with G-Sync and ray tracing than over 60 fps without ray tracing. That's if it's implemented well, of course.
Eventually ray tracing will enable a change in the artistic design of games, and even some gameplay changes. Those changes will take a longer time to come about, especially the gameplay ones. That can't really happen until most cards are ray tracing capable. The artistic changes will take time, as artists relearn what is and isn't possible. But at the moment I think there's enough low-hanging fruit to justify an extra $50 or $100 on the price of a card, even if it knocks frames per second down a bit in the process.
As far as the sour taste in people's mouths, I don't buy it. I think the salty ones are a minority. Most people willing to pay that much for a graphics card are probably excited to be getting something special they don't get every generation.
eddman - Thursday, August 23, 2018 - link
I suppose you'd be ok with paying $1000 for a 3080 and $1500 for a 4080. After all, better performance warrants a higher price. By that logic an x080 card would be $4000 in a few years and users should be happy about it, right?
The 8800 GTX, which was a massive technological step (according to nvidia), launched at the SAME price point as the 7800 GTX, even though it was up to two times faster or more in certain games.
It seems you've become used to being price gouged.
Yojimbo - Thursday, August 23, 2018 - link
You're still willfully ignorant of the fact that the GTX 280 cost more than the RTX 2080 and the GTX 780 cost the same...
eddman - Thursday, August 23, 2018 - link
... and you are ignoring the fact that nvidia cut its price by $150 down to $500 a mere month after launch because it was overpriced for the offered performance compared to the 4870.
Yojimbo - Thursday, August 23, 2018 - link
So what? They still launched it at the high price. If AMD comes out with a card in a month that's competitive with the RTX 2080 and decides to price it at $400, then NVIDIA will again be forced to cut their prices. But AMD won't be able to do that. NVIDIA lost money 4 out of 5 quarters between mid 2008 and mid 2009.
https://www.macrotrends.net/stocks/charts/NVDA/nvi...
Look at the quarterly earnings per share (second graph). You can see why they were trying to price things higher and how it hurt them to have to cut their prices. Of course AMD was also losing money over that time period. Of course the recession didn't help. AMD was also losing more and more to Intel Core processors on the CPU side at this time, and I am guessing they might have had heavy debt payments from their acquisition of ATI a couple of years prior. Why AMD decided to introduce cards at low prices at this time I don't know. Perhaps they either had a different idea of what the real price/demand curve for graphics cards was than NVIDIA or maybe they thought they could hurt NVIDIA through financial attrition even though they themselves were bleeding money. I really have no idea.
eddman - Thursday, August 23, 2018 - link
As you mentioned, the recession was in effect. AMD priced their cards quite low, yes, but I doubt it was low enough to lose them money. They were not in a situation to willingly sell cards at a loss. I could be wrong. Same goes for nvidia.
Intel's figures also took a nosedive in that period, so it seems the recession was the main factor.
Yojimbo - Thursday, August 23, 2018 - link
Oh it definitely was low enough to lose them money. The same for 2007 and 2006. You can look at their financial results. Go to page 60 of this financial document: http://quarterlyearnings.amd.com/static-files/2b12...
Look at the Graphics operating income. They lost 6 million in 2006, 39 million in 2007, and made 12 million in 2008. Making 12 million is pretty much breaking even. There was no recession in 2006 or most of 2007. I think NVIDIA only lost money in 2008 out of those years, probably a mix of the recession and NVIDIA switching to an expensive new architecture (Tesla).
Note that for AMD the Graphics segment "includes graphics, video and multimedia products and related revenue as well as revenue from royalties received in connection with the sale of game console systems that incorporate our graphics technology". They were probably making money on consoles, as the XBox 360 was pretty popular, so they probably still lost money on PC graphics cards in 2008.
Also notice that in 2007 and 2008 AMD was bleeding money in their CPU business. Why they decided to fight on price so heavily that they lost money in GPUs as well, I don't know. But they either willingly sold at a loss, or they thought NVIDIA couldn't or wouldn't cut prices and therefore AMD would gain market share, or they miscalculated the price-demand curve, or they would have lost money anyway even if they hadn't cut prices. I think the last one is very unlikely. People still would have bought almost as many graphics cards even if they all cost $50 more, as history has shown. Anyway, it wasn't good for them, because they had to start cutting down their R&D expenses and then they fell behind NVIDIA. So when people say we need more competition to drive down prices, well, they had their time of lower prices years ago at the expense of said competition today. The price cuts of yesteryear (along with AMD's awful CPU performance) cost AMD the ability to compete, until with Fury and Vega they were forced to build such expensive cards just to try to compete that they couldn't get much market share even by cutting prices. The cards were so expensive to make that they could only cut the prices so much. At times they have given up the high end of the market completely.
eddman - Monday, August 27, 2018 - link
You said AMD was losing money in that period, and now we see that in 2008, when the 4870/4850 launched in June, they actually made money. So yea, the cards DID make them money even at such low prices. They were still recovering from the ATI purchase. Making even that amount of money in 2008, in that financial situation and that time period, is not bad with such low-priced cards.
Yojimbo - Thursday, August 23, 2018 - link
Besides...
"Perhaps the most notable thing about the GeForce 7800 GTX at launch was the $600 USD price tag. Previous generations had drawn the line for flagship pricing at $400, so back in 2005 $600 really was considered to be pretty staggering."
Maybe your example is a bad one. The 8800 GTX wasn't a big jump in price only because NVIDIA had already pushed the price up to new territory with the 7800 GTX.
Let's look at what happened after the 8800 GTX. The 9800 GTX launched at $350.
In any case, things were not as stable back then as they are now. It's not a good comparison to be looking at generational price differences. But if we look at the 200 series and onward, prices have gone up and down from generation to generation, but there hasn't really been a trend one way or the other.
eddman - Thursday, August 23, 2018 - link
It doesn't matter if the 7800 GTX was already more expensive than the prior cards (and I did consider it to be quite overpriced back then). If technological advancements and better performance are the driving factors for a price increase, then how come the 8800 GTX wasn't even MORE expensive, considering it was massively outperforming the 7800 GTX and introduced CUDA?
The 9800 GTX is a terrible example. It was launched only 2 months before the GTX 280 and 260, simply to act as a lower-range, cheaper card alongside those two. Nvidia never released true low-range Tesla-based cards.
2080 Ti is the most expensive generational flagship launch card in the past 18 years at $1000. The second most expensive is 7800 GTX at $767 adjusted for 2018. That's about 30% more.
I made this graph last year, so prices are adjusted for 2017 dollar value: https://i.imgur.com/ZZnTS5V.png
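For reference, the inflation adjustment behind figures like that $767 number is just a CPI ratio. A minimal sketch, using the ~$600 launch price quoted earlier and approximate (assumed) CPI values:

# Adjust a launch price into 2018 dollars using a CPI ratio.
# CPI values are approximate annual averages, assumed for illustration.

CPI = {2005: 195.3, 2018: 251.1}

def adjust(price, from_year, to_year=2018):
    return price * CPI[to_year] / CPI[from_year]

print(f"7800 GTX: $600 at launch in 2005 ~= ${adjust(600, 2005):.0f} in 2018 dollars")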
Yojimbo - Thursday, August 23, 2018 - link
"If technological advancements and better performance are the driving factors for a price increase, then how come 8800 GTX wasn't even MORE expensive,"I think you can answer that for yourself if you just stop and think a second. The answer isn't hard to reach.
"9800 GTX is a terrible example."
All of the examples from back then are terrible. NVIDIA and ATI/AMD were coming out with cards all the time back then. Besides, there's no reason they need to introduce the card for a low price just because they will come out with something better and more expensive later. They can cut the price like GPU manufacturers used to do all the time in those days.
"2080 Ti is the most expensive generational flagship launch card in the past 18 years at $1000"
No. The Titan X launched at $1200. What was the major difference between the Titan X and the 1080 Ti other than time of introduction and 1 GB of RAM? It also launched close to the smaller chips just like the 2080 Ti has. What you want to call the flagship products (Ti versions) always launched later. The 2080 Ti is launching alongside the 2080 this time. Also, compare the die size and the percentage of cores cut off the full die of the GTX 780 with the RTX 2080 Ti and it's obviously the wrong comparison to make. The RTX 2080 Ti is a huge chip. It is not proper to compare it to the GTX 780. What has happened is that people are buying more and more powerful GPUs as the GPU has become more and more important to the game experience relative to the other components. Therefore, NVIDIA has introduced a new class of product with huge die sizes that are priced higher. They introduced that back in the Maxwell generation. They released them as prosumer cards first, only later coming out with a "Ti" branded version at a reduced price. This time they skipped the Titan and released the "Ti" branded card at the architecture launch. The proper comparison to the GTX 780 or the GTX 980 or the GTX 1080 is the RTX 2080.
eddman - Thursday, August 23, 2018 - link
How about you answer why the 8800 GTX wasn't more expensive instead of dodging the question.
No, just a few are bad examples, like the one you came up with, and I clearly explained why.
Titans are not mainstream cards and do not fit in the regular geforce line up.
Excuses. It's not my fault they named it 2080 Ti which makes it a direct replacement for 1080 Ti. It is overpriced. Simple as that. I really don't understand how you as a consumer could defend a corporation's pricing behaviour.
Yojimbo - Thursday, August 23, 2018 - link
"How about you answer why 8800 gtx wasn't more expensive"OK. Because there are only so many bullets in a chamber. If they already raised the prices the generation before they have less ability to raise them now, everything else being equal. It's the market that sets the prices, ultimately. A company is just trying to maximize their prices. There is a price demand curve that takes into account how many customers are willing to pay what. We would have to look at market conditions to determine what the hell was going on at the time. The conditions of cost to produce, competitive environment, strength of the economy. That's why looking back at such an example is not a good way of trying to have this discussion. My point with talking about historical prices is only to show that the trend has not been to decrease or increase same-class graphics cards over time. Sometimes they have been cheaper and sometimes more expensive. But to single out a particular launch and say "this launch is most like the current launch, let's look at that one" is a bad idea, because there are so many other factors that influence that individual launch besides the cost to manufacture. Additionally, we didn't even establish the actual cost to manufacture or research the 8800 GTX. You just declared it was new technology and therefore should be a direct comparison.
"No, just a few are bad examples, like the one you came up with and I clearly explained why."
No, they are all bad. Go back and make a chart of product introductions and price cuts back in those days. They happened much more frequently and haphazardly. It was a different time.
"Titans are not mainstream cards and do not fit in the regular geforce line up."
In the Maxwell days Titan was a GeForce card. It fit in the GeForce lineup, it just was called Titan because it carried on the moniker of the card that went into the supercomputer. The Maxwell, Titan, however, was the first to now have FP64 and really didn't have anything that distinguished it from GeForce cards other than being the highest performing GeForce card of its time.
"Excuses. It's not my fault they named it 2080 Ti which makes it a direct replacement for 1080 Ti. It is overpriced. Simple as that. I really don't understand how you as a consumer could defend a corporation's pricing behaviour."
The name is the excuse. What's in a name? It's not the name that's important. It's the market segment that's important. That can be seen by die size, features, and by the release schedule. It's not your fault that it was named that. It's your fault that you are trying to argue that it's apples-to-apples even though the real-world situation is different, just because the name is the same. Then at the same time you try to argue that the 2080 Ti should also be compared with the 1080 strictly because of their launch schedule. Well, which is it? Should we follow the name or the launch schedule? In fact, we should follow the entire market conditions: the launch schedule, yes, but also the die size and the features of the card, i.e., the market segment the card is targeted at.
Yojimbo - Thursday, August 23, 2018 - link
should be "the Maxwell Titan was the first Titan to not have FP 64..."eddman - Thursday, August 23, 2018 - link
You said new technologies that require R&D and a bigger die force a price increase. The 8800 GTX had CUDA, had a much bigger die, and yet launched at the same price as the 7800 GTX. The fact of the matter is, as soon as the competition gets weak, prices jump up. It has ZERO to do with the features and performance. This very much applies to the 2080 cards.
It wasn't a different time. Price cuts shortly after overpriced launch prices are not the same as regular price cuts during a generation.
Doesn't matter if titans at first had geforce in their names. They were never part of the regular cards. They were for special use cases, like kepler titan with its FP64 performance, and/or a way to extract money from those people who didn't want to wait for a Ti card (maxwell titan); and with pascal titans nvidia officially made them a separate line from geforce, confirming what everyone was saying all along.
I NEVER compared 2080 Ti to 1080. You don't read my comments properly. I compared 1080 to 2080 and 1080 Ti to 2080 Ti.
I, as a buyer, do not care in the least how big the chip is or what it brings. If the names match, then the cards are in the same category, therefore they should be in the same pricing ballpark. If the price is this much higher, then they are overpricing it compared to its predecessor. It cannot get simpler than this.
Yojimbo - Thursday, August 23, 2018 - link
"You said new technologies that require R&D and a bigger die force a price increase. 8800 GTX had CUDA, had a much bigger die and yet launched at the same price as 7800 GTX."Yes, I did say that, and I supported that in direct ways. It's pretty obvious that a larger die size will be more expensive to produce. It's also obvious that to maintain margins a costlier card will have to be sold at a higher price. These are facts. You mentioning the 8800 GTX is meaningless without hiring a market analyst to look into the specific of the 8800 GTX and the GTX 2080. We just can't glean anything from it with that. It's too complicated. I gave you a simple reason why the 8800 didn't increase prices further than the last generation, anyway.
"The fact of the matter is, as soon as the competition gets gets weak, prices jump up. It has ZERO to do with the features and performance. This very much applies to 2080 cards."
Not true if you look at the historical price trends. I've said this over and over: AMD provided the weakest competition during the Maxwell era, yet Maxwell cards were actually among the more affordable. And competition in graphics cards always has to do with features and performance. I don't know what you mean by that.
"It wasn't a different time. Price cuts shortly after overpriced launch prices are not the same as regular price cuts during a generation."
It was a very different time. Go back and look at the price cuts and launch history. Each manufacturer seemingly came out with cards 3 times a year. They'd cut prices at launches and otherwise. Just because you've categorized one particular instance of a price cut in your mind doesn't change that.
"Doesn't matter if titans at first had geforce in their names."
Wait, does the name matter or not? Make up your mind. I'm being serious here. You can't have it both ways.
"They were never part of the regular cards. They were for special use cases, like kepler titan with its FP64 performance, and/or a way to extract money from those people who didn't want to wait for a Ti card (maxwell titan); ""
Not wanting to wait for a card with a particular name is a special use case? Hmm, ok. The Maxwell Titan filled the exact same market segment as the 2080 Ti. Nothing more, nothing less. You haven't made any argument against that other than the name.
"I NEVER compared 2080 Ti to 1080. You don't read my comments properly. I compared 1080 to 2080 and 1080 Ti to 2080 Ti."
You said: "2080 Ti is the most expensive generational flagship launch card in the past 18 years at $1000. The second most expensive is 7800 GTX at $767 adjusted for 2018. That's about 30% more."
The flagship launch card of the Pascal generation was the 1080. So you most certainly did compare the 2080 Ti to the 1080.
"I, as a buyer, do not care in the least how big the chip is or what it brings. If the names match, then the cards are in the same category, therefore they should be in the same pricing ballpark. If the price is this much higher, then they are overpricing it compared to its predecessor. It cannot get simpler than this."
That's fine. As a buyer you can do what you want. Then NVIDIA loses your purchase. It doesn't make your claims right or your decision rational, though.
eddman - Thursday, August 23, 2018 - link
I really doubt it's so expensive that they "need" to price it at $1000. I bet they could cut it to $800 and still make a sizable profit. I can't prove it, but when there is no competition, companies tend to overprice. That's simple business 101.
Maxwells were still quite a bit more expensive than Fermis when AMD was competing properly.
... yet none of those 3 cards a year were generational upgrades.
I have made up my mind. It said Titan in the name and they were not among the regular lineup. Later nvidia even removed the geforce from the name.
No, Titans never competed against Ti cards. Why? Because we had and still have Ti cards. There is going to be a Turing Titan card, since the 2080 Ti doesn't even have all its cores enabled.
https://videocardz.com/77696/exclusive-nvidia-gefo...
2080 Ti IS the most expensive flagship launch card. They both are the big-chip flagships of their generation. I cannot do anything about the lack of a 7800 Ti card. As I've already mentioned, I have never compared 2080 Ti to 1080. I should've written "generational flagship".
My claims are quite straightforward and clear. 1080 Ti, $700; 2080 Ti, $1000, therefore massively overpriced.
Yojimbo - Thursday, August 23, 2018 - link
"I really doubt it's so expensive that they "need" to price it at $1000. I bet they could cut it to $800 and still make a sizable profit."No, no chance they can make a sizable profit taking $200 off. How much do you think 11 GB of 14 Gbps GDDR6 costs? NVIDIA sells GPUs. Their operating margins are something like 35%, I think. I'm not sure what they sell the GPU for, but there's a lot of cost for the other components of a card. Suppose they sell it for $600. 35% of $600 is $210. If they took $200 off the price they'd be pretty much breaking even.
And NVIDIA tends to always make sizable profits. That's not a bad thing for consumers. It allows them to continue to invest in the technology. Look at the current difference between what NVIDIA can produce and what AMD can produce. That difference is a result of NVIDIA's sizable profits and AMD's losses.
But again, the point is not NVIDIA's sizable profits. The question is whether NVIDIA is making more money with the RTX 2080 at $700 or the GTX 1080 at $600. (The 1080 Ti and 2080 Ti are NOT a good comparison. The 1080 Ti did not come out until over 9 months later! If you want to make the comparison you must at least wait 9 months and consider the price of the 2080 Ti then, though my guess is that 9 months from now NVIDIA will be close to introducing a new generation and so may not play around with price much until that new generation comes out. That short time to the new generation is one reason the 2080 Ti is out now.) And the answer is no, they don't seem to be making more with the RTX 2080 at $700 than the GTX 1080 at $600, as evidenced by their projected gross margins for the upcoming quarter.
Yojimbo - Thursday, August 23, 2018 - link
As far as a Titan Turing, no, I doubt it. Or if there is one, the Titan will include more RAM and the Ti will come down in price at that time. The Titan and the Ti share the same space. In the Maxwell generation they completely shared the same space, and the only real difference was the timing of the release. It doesn't matter how much you try to deny that that's true; it's true. You just insist otherwise without any real argument. "Why? because we had and still have Ti cards." No! We didn't have a Maxwell Ti card when the Titan X was introduced! And I don't think you want to be talking about the Maxwell Ti introduction too much, because it blows apart your whole "when NVIDIA doesn't have competition they price things very high and never cut prices" claim.
"2080 Ti IS the most expensive flagship launch card."
The 2080 Ti may be the most expensive flagship card launch but it is in a different segment from all the other flagship card launches! It's like saying that when Toyota came out with the Avalon it was the most expensive Camry ever.
"As I've already mentioned, I have never compared 2080 Ti to 1080. I should've written "generational flagship"."
No matter how you try to spin it or what labels you use, you ARE comparing the 2080 Ti to the 1080 when you are comparing "generational flagships". Stop and think for a minute. How can you compare two things without comparing them?
"My claims are quite straightforward and clear. 1080 Ti, $700; 2080 Ti, $1000, therefore massively overpriced."
Yeah, a card that came out 9 1/2 months after launch and a card that came out at launch. What's quite straightforward and clear is that the comparison is flawed.
eddman - Friday, August 24, 2018 - link
Titans are not part of the regular geforce line. Nvidia removed the "geforce" part for the pascal variants and finally confirmed what everyone was suspecting. It doesn't get clearer than that.
Titans come early to milk as much money as possible, and when that's done, Ti cards come in and perform as well or better in games. That's why we didn't have a Ti maxwell before the titan.
False car analogy. Avalon is a completely different line from Camry. That's the entire reason they are named differently. A newer generation Camry usually comes with a lot of new features and technologies and yet is just a bit more expensive than the last one. 2080s are simply a newer generation Camry but nvidia wants to charge you the price of an Avalon.
You are doing the spinning here. There was no Ti back then. 7800 GTX was the big-chip flagship card. Stop playing around.
It's not my fault nvidia released 2080 Ti with 2080. They did and priced it at $1000 and since it's the direct successor to 1080 Ti, as any normal person can see it is, I can say it's WAY overpriced compared to its direct predecessor.
I never thought a customer would be so willing to defend a corporation's pricing tactics. It is as if you like to be charged more.
What if they priced it at $1500? Would you have also defended that? At what level would you say it's overpriced? $2000? $3000? Corporations overprice when there is no one to stop them. That's the reality.
eddman - Friday, August 24, 2018 - link
*Ti cards come in and perform almost as well in games or even better when OCed for a much lower price.
eddman - Friday, August 24, 2018 - link
I doubt that. Where did you get 35%? I don't buy that at $600 the 2080 would be barely breaking even. I really doubt it. Yes, I'm pretty sure 2080 cards are more expensive than 1080 cards, but not expensive enough to justify such prices, especially for the 2080 Ti.
Nvidia isn't that stupid to price a card at $700 if a mere $100 price cut would wipe out the profits. Companies always price their cards high enough so that they can counter unexpected, forced early price cuts.
How many times do I have to repeat this: I am NOT claiming the 2080 is making them more money than the 1080. I'm just saying they could drop the price a bit and still make a respectable profit.
The 2080 Ti would almost certainly still cost $1000 nine months later. No competition, therefore no price cuts, and since those two cards are both Ti cards, I CAN compare them.
Yojimbo - Saturday, August 25, 2018 - link
"I doubt that. Where did you get 35%?"https://s22.q4cdn.com/364334381/files/doc_financia...
1157/3123 = 37%. What the operating margins are of a particular product, I have no idea. It's probably a bit higher than average for the Ti parts, but it's tough to say because the data center (Tesla) parts probably have margins much higher than average. But these are just rough numbers to show you that what you said is way off base.
"I don't buy that at $600 2080 would be barely breaking even. I really doubt it."
That's not what I said. I said that if you take $200 off the price of the 2080 Ti GPU (I wasn't talking about the 2080, although taking $100 off the 2080 would do something similar. It just wasn't the calculation I made) they are probably approaching break even. Let's clarify something. NVIDIA sells GPUs, not graphics cards. When you buy an MSI RTX 2080 Ti for $1000, a cut of that goes to the retailer. I don't know what their margins are, but let's say they bought the card for $900. Now MSI had to buy all the RAM and the PCB and the GPU and all the other components to put it together, plus they need to make a margin as well to make it worth it. Perhaps it costs them $800 to make the card and $100 is their cut. Out of that $800, they need to pay for the assembly line and workers and the components, including the GPU. So I think $600 for the TU-102 GPU was a very high estimate, since it doesn't leave very much for all the other stuff. That $600 is what NVIDIA is getting. If you take $200 off of that then you leave them with $400. If NVIDIA is selling the card at about 1/3 operating margins then 2/3 of the money they receive for the part is going toward expenses. 2/3 of $600 is $400. So, that $200 you took off was surely almost all of their operating profits.
Now, perhaps you want NVIDIA's AIB partners and the retailers (good luck with the retailers) to share in the margin loss with your price cut. Then you are not just squeezing NVIDIA's profits you are squeezing the others' profits too. Maybe you can get Micron to take less for the GDDR6 DRAM chips...
My point is this: NVIDIA projects upcoming margins to be lower than the previous quarter's margins. So it doesn't appear like these new cards are priced in a way to give NVIDIA richer margins than the 10 series cards. That suggests that the higher prices are accounted for by greater costs to manufacture.
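To make that cascade explicit, here is the rough arithmetic sketched above; every intermediate figure is a guess taken from the comment, not a known cost:

# Rough cascade from a $1000 retail 2080 Ti down to NVIDIA's estimated operating
# profit per GPU. All intermediate figures are guesses, as discussed above.

retail_price       = 1000.0  # what the buyer pays
retailer_cut       = 100.0   # guess: retailer margin
aib_cut            = 100.0   # guess: AIB (e.g. MSI) margin
board_cost_non_gpu = 200.0   # guess: GDDR6, PCB, VRMs, cooler, assembly
gpu_price_to_aib   = retail_price - retailer_cut - aib_cut - board_cost_non_gpu  # ~$600

operating_margin = 0.35      # NVIDIA's rough company-wide operating margin
nvidia_op_profit = gpu_price_to_aib * operating_margin  # ~$210 per GPU

print(f"Estimated GPU price to the AIB: ${gpu_price_to_aib:.0f}")
print(f"Estimated NVIDIA operating profit per GPU: ${nvidia_op_profit:.0f}")
print(f"After a $200 cut to the GPU price: ~${nvidia_op_profit - 200:.0f} (near break-even)")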
eddman - Saturday, August 25, 2018 - link
"So it doesn't appear like these new cards are priced in a way to give NVIDIA richer margins than the 10 series cards."Sigh, for the 4th or 5th time, I never said that's the case. 2080s are almost certainly making less profit than 1080s, but I do not believe for a second that lower prices would still not have made them a sizable profit.
You can't calculate a card's profit margin based on the entire company's profit numbers. There is no way each 2080 Ti costs AIBs $800 to make. I very, very much doubt that. As I've mentioned before, businesses always leave enough room for unexpected price cuts so that they'd still make an acceptable profit.
Yojimbo - Sunday, August 26, 2018 - link
"Sigh, for the 4th or 5th time, I never said that's the case. 2080s are almost certainly making less profit than 1080s, but I do not believe for a second that lower prices would still not have made them a sizable profit."Then what are we arguing about? The raise in prices are justified by the greater cost of the card if they are still making less profit off them with the raise. You're just making a normative statement of "NVIDIA should be making less profit altogether".
"You can't calculate a card's profit margin based on the entire company's profit numbers."
You're right, but it's the best that we have. We can guesstimate.
"There is no way each 2080 Ti costs AIBs $800 to make."
If NVIDIA is charging $600 for the GPU, then that leaves $200 left for the other stuff. $200 for the RAM, PCB, voltage regulators, heat sink, labor and assembly line costs, etc., seems exceedingly low. I tried to estimate the price NVIDIA was charging AIBs in the high range because that gives the best chance of your $200 price cut to not result in a loss for NVIDIA, making your case as strong as possible. If we move the price NVIDIA is charging AIB's to $500 then that leaves more room for a possible $700 cost to make the cards. But that doesn't help your case.
"As I've mentioned before, businesses always leave enough room for unexpected price cuts so that they'd still make an acceptable profit."
No they don't. They pretty much maximize their profits for the expected market conditions while avoiding risky situations that could put them in financial distress. But NVIDIA doesn't really have to worry about that latter part at the moment. The planning of what costs are acceptable happens a lot earlier than bringing the product to market. It's the market that sets the price. Companies try to predict the market conditions and then maximize their profits within those conditions. They end up with margins because maintaining margins is the whole point of the game, not in case there are unexpected price cuts.
eddman - Sunday, August 26, 2018 - link
There is no way nvidia is charging AIBs $600 for a GPU. Where do you even get these numbers from? I bet the entire card costs AIBs no more than $500.
2080s are overpriced.
eva02langley - Thursday, August 23, 2018 - link
You can buy a 1080 Ti AMP on Amazon for $529.
milkod2001 - Thursday, August 23, 2018 - link
BS, it is $679.
iwod - Thursday, August 23, 2018 - link
I assume they priced it this way so they can lower it down into the normal range a year later. Imagine next year you get double the performance of the RTX 2080 with 7nm.
I haven't been following GPUs closely; what's the state of dual-GPU configs? Is the software still not up to it?
yhselp - Thursday, August 23, 2018 - link
Judging by your hands-on with real-time ray tracing in games from a couple of days ago, an RTX 2080 Ti struggles to maintain 60fps at 1080p. And even though games and drivers aren't final yet, it's still hard to believe they'd be able to gain much performance. If so, what on earth is NVIDIA on about with these 4K 60fps stats, and on a less powerful card no less?
Is NVIDIA advertising two separate features (ray tracing and 4K) that can work on their own but not together? Are we talking about the 2080 being capable of 4K rasterization and ray tracing at, what, 900p? Seems about right if the 2080 Ti struggles at 1080p... And what of the 2070 then? Would it be able to run ray tracing in games at all?
It doesn't seem likely NVIDIA would champion a marquee feature, and name their cards after it, that is such a performance hog that it can only run on a $1000 flagship at 1080p in late 2018.
Something doesn't add up. Please, confirm, deny, or provide more information.
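A rough way to sanity-check those resolutions, assuming ray-tracing cost scales roughly with the number of pixels shaded (a simplification):

# Back-of-the-envelope: relative pixel counts for the resolutions being discussed,
# assuming ray-tracing cost scales roughly with pixels shaded (a simplification).

resolutions = {"900p": (1600, 900), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "4K": (3840, 2160)}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / base:.2f}x the pixels of 1080p")

# If a 2080 Ti barely holds 60fps at 1080p with ray tracing on, 4K is ~4x the
# work, so the 4K 60fps figures are presumably with ray tracing turned off.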
Skiddywinks - Thursday, August 23, 2018 - link
My impression has been that the struggling performance we've seen with the likes of BFV, Metro, and Tomb Raider was with RT enabled, and the recent numbers straight from nVidia have been without RT and with some unspecified AA (to then compare to DLSS). The 4K numbers are almost without a doubt with RT disabled, otherwise they'd be shouting it from the rooftops.
eva02langley - Thursday, August 23, 2018 - link
Guess what, they're probably turning on 16x MSAA or even Ubersampling in order to compare their cards with their new "equivalent" features.
Unless we see benchmarks, this graph proves absolutely nothing, besides that it looks like the 2080 is about 10% faster than a 1080 Ti.
colonelclaw - Thursday, August 23, 2018 - link
Here in the UK the new 2080 (non-Ti) starts at £715. A 1080Ti can be had for £630. Sources below:
https://www.scan.co.uk/shop/computer-hardware/gpu-...
https://www.scan.co.uk/shop/computer-hardware/gpu-...
Given that the 2080 is £85 more expensive, has less VRAM, and is a year newer, I would seriously hope that it would be not just as fast as a 1080Ti, but a lot faster.
I suspect the opening topic for discussion at the most recent Nvidia board meeting was "How much can we get away with charging?".
Solidstate89 - Thursday, August 23, 2018 - link
Ansel has to be one of the most underrated features of modern nvidia cards. The kind of screenshots you can get in The Witcher 3 are really quite stunning. And such stupidly high resolutions too.