FWIW, the RX 580 can regularly be had for ~$200 these days, so TECHNICALLY benedict is right - the 580 goes for ~57% of the 2060's MSRP ($349) but offers more like 60-65% of the performance (and that's assuming you can find the 2060 for MSRP, which we'll have to see).
IMO, they're just in different markets. The 590 is looking like a misstep by AMD at this point, since it's midway between the 580 and the 2060 in price, but isn't much faster than the 580 in real world performance. The Vega series are interesting, but AMD and partners probably have limited ability to lower prices due to how expensive the HBM2 memory on those boards is.
No, Benedict is not right. At best, he is only partially right on one out of at least three points, and only when viewed in the most favorable light. He claimed 1) the 2060 is "not much faster," 2) the 1060 is much more expensive, and 3) the 580 has the best price/performance ratio. The RTX 2060 is clearly significantly faster than the RX 580 (by more than 50%), so his first point is flat out wrong. His second point is also wrong because the GTX 1060 costs about the same as the RX 580 in terms of market price and performs about the same. His third point is debatable and may be right. The 580 does have a marginally better price/performance ratio if you compare the market price of the RX 580 ($200) to the MSRP of the RTX 2060 ($350). However, if you compare MSRP RX 580 ($240+) to MSRP FE RTX 2060 ($350), then the equation changes and the RX 580 no longer has the best price/performance either, as the 2060 is 46% pricier but performs 50-60% better. And other models of the 2060 will surely be cheaper than the FE version.
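To put rough numbers on that, here is a quick perf-per-dollar sketch using only the prices and the ~50-60% performance delta quoted in this thread (the 1.55x factor is an assumption taken from that estimate, not measured data):

```python
# Rough perf-per-dollar comparison using only the figures quoted above.
# Performance is normalized so the RX 580 = 1.0; the 1.55x factor for the
# RTX 2060 is the thread's 50-60% estimate, not a measured benchmark result.

def perf_per_dollar(perf, price):
    return perf / price

print(f"RX 580   @ $200 street: {perf_per_dollar(1.00, 200):.4f}")  # ~0.0050
print(f"RX 580   @ $240 MSRP:   {perf_per_dollar(1.00, 240):.4f}")  # ~0.0042
print(f"RTX 2060 @ $350 MSRP:   {perf_per_dollar(1.55, 350):.4f}")  # ~0.0044
```

Street-price 580 vs. MSRP 2060 favors the 580; MSRP vs. MSRP the 2060 edges ahead, which is the whole disagreement in a nutshell.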
He is completely right - the RTX 2060 has 48 ROPs and 6 GB of VRAM, which is pointless at $349, regardless of how "fast" the card is. Raytracing is also pointless on such a scarcely resourced card. At this point a person should either buy a 2080 Ti, if money is no object, or buy a Radeon 580 8GB for under $200. Most modern games simply manage huge textures in VRAM - you need more RAM, quick memory buses and ROPs for that. I am no particular fan, but AMD will kill NVidia any moment now with the announcement of 7 nm chips - on a smaller process, brute force will have the advantage no matter how smart NVidia's prefetch, bus compression and all the other "smart" stuff is. The same happened to Intel. And TSMC is already occupied working for AMD, Samsung is busy, and any excess capacity is reserved by Intel, so NVidia will have no access to a 7nm fab for a while.
Real world results matter more than on-paper specs. The actual real-world benchmarks show the RTX 2060 is 50-60% faster than the RX 580. You seem to be overly hung up on VRAM and ROPs. AnandTech and TechSpot have done plenty of tests on VRAM, and modern games don't use nearly as much VRAM as you think - even at 1440p. And take a look at the 4K benchmarks in the article. The 6GB of VRAM is clearly sufficient. Furthermore, the RX 580 8GB costs about $200-250 new, whereas the 2060 is $350 for the more expensive Founders Edition. Realistically, the aftermarket cards will be something like $200-250 for new RX 580s vs. $300 for a new non-FE 2060.
As for your mention of future AMD chips, we are comparing the 2060 vs. the 1060 vs. the RX 580.
There are several RX 580 8GB cards on sale today for around $160-170. It might be old, it might be hot, but it runs any game at 1080p just fine, and it costs half as much. Next-generation consoles are already well into development, and developers develop games for consoles, not for PCs, which will never change since the scale of that business is tenfold if not larger. Next might be game services integrated directly into the TV or whatever device reaches a mass market, but it will never be the PC. Anyway, new consoles will have A LOT more VRAM, and 6GB will simply not cut it. And they are coming later this year/early next year - it's not like waiting ten years for it. I have lived through enough NVidia "cycles" and can tell you this is typical NVidia greed - narrow memory bus and limited memory - exactly like 12 years ago, when AMD had nothing, but came out with the rough, hot, unoptimized 5850 and collected half of the market. The card is OK, it is just not worth $350 today, that's my point.
And take a look at the article's 4K benchmarks. The 2060 6GB performs equal to or better than the Vega 64 or GTX 1070 GPUs with 8GB of VRAM even at 4K resolution. You are overly hung up on paper specs and completely missing what is actually important - real world performance. The 2060 clearly has sufficient VRAM, as it performs better than its closest priced competitors and spanks the RX 580 8GB by 50-60%.
ROFL. AMD is aiming its first 7nm cards at GTX 1080/1070 Ti performance. This is not going to kill NV's cash cows: roughly 80% of NV's NET INCOME comes from ABOVE $300, i.e. the 2060 on up plus workstation/server cards (NV is currently at ~$3B a year, AMD <$350mil).
NV is one of the companies listed as having already made LARGE orders at TSMC, along with Apple, AMD etc. The only reason AMD is first (July, MAYBE, so NV has clear sailing for 6 more months on 2060+ pricing and NET INCOME) is that Nvidia is waiting for the price to come down so they can put out a LARGER die (you know, the brute force you mentioned), AND perhaps more importantly their 12nm 2080 Ti will beat the snot out of AMD's 7nm for over a year, as AMD is going small with 7nm because it's NEW. NV asked TSMC to make 12nm specially for them...LOL, as you can do that with $3B net INCOME. It worked, as it smacked around AMD's 14nm chips, and it looks like they'll be fine vs. 7nm at SMALL die sizes. If you are NOT competing above $300 you'll remain broke vs. Intel/NV. https://www.extremetech.com/computing/283241-nvidi... As he says, it's not likely NV will allow more than 6-12 on a new node without a response from NV, though if you're not battling above $300, there's no point in an NV response, as not much of their money is under $250. "If Navi is a midrange play that tops out at $200-$300, Nvidia may not care enough to respond with a new top-to-bottom architectural refresh. If Navi does prove to be competitive against the RTX 2070 at a lower price point, Nvidia could simply respond by cutting Turing prices while holding a next-generation architecture in reserve precisely so it can re-establish the higher prices it so obviously finds preferable. This is particularly true if Navi lacks GPU ray tracing and Nvidia can convince consumers that ray tracing is a technology worth investing in." AGREED. No response needed if you are NOT chasing my cash cows. Again, just a price cut is needed even if Navi is really good (still missing RT+DLSS), and then right back to higher prices with 7nm next gen Q1 2020? This is how AMD should operate, but management doesn't seem to get chasing RICH people instead of poor.
BRUTE FORCE=LARGER than your enemy, and beating them to death with it (NV does this, reticle limits hit regularly). NV 12nm is 445mm^2, AMD 7nm looks like 1/2 that. The process doesn't make up that much, so no win for AMD vs. top end NV stuff obviously and they are not even claiming that with 1080 GTX perf...LOL. 10-15% better than Vega64 or 1080 (must prove this at $249)? Whatever you're smoking, please pass it. ;)
Raytracing+DLSS on the RTX 2060 is 88fps. Without both, 90fps. I call that USEFUL, not useless. I call it FREE raytracing. https://wccftech.com/amd-rx-3080-3070-3060-navi-gp... Not impressed, since AMD likely won't have RT/DLSS tech for another year, as AMD said they won't bother for now. This card doesn't compete vs. ANY RTX card, as those come with RT+DLSS, which NV underlines by releasing new 10-series cards, probably to justify the RTX pricing too (showing you AMD is on a different rung by rehashing the 10 series). And as you see below, DLSS+RT massively boosts perf, so it's essentially free. It's clear using tensor cores to boost RT is pretty great.
https://wccftech.com/nvidia-geforce-rtx-2060-offic... Now that is impressive. Beats my 1070 Ti handily, looks better while doing it, and shaves 20-30W off of 180. That will be 120W with 7nm, likely faster, and BEFORE Xmas if AMD 7nm is anything worth talking about on the GPU side (CPU great, GPU aimed too low, like Xbox One/PS4 needing another rev or two...LOL). 5 Gigarays/s (none for AMD this year), which is plenty for 1080p as shown. It's almost like getting a monitor upgrade for free (1080p DLSS+RT is 1440p or better looking with no perf loss). What do you think happens with games DESIGNED for RT/DLSS instead of having it patched in like BF5? LOL. Those games will be faster than BF5 unless designed poorly, right? It only took 3 weeks to up perf 50% in a game NOT made for the tech (due to a bug in the hardware EA found, I guess, as noted in their vid - NV fixed it, and bam, perf up 50%). 6.5TF, which even without RT+DLSS is looking tough for AMD 7nm currently, based on the announcements above. AMD will be 150W for the 3080 it seems, vs. NV at 150-160W for the 12nm 2060 etc. Good luck, no wiping away NV here.
"If you’re having a hard time believing this, don’t worry, you’re not alone because it does sound unbelievable." Yep. I agree, but we'll see. Not too hard to understand why AMD would have to price down, as it has no RT+DLSS. You get more from NV, so it's likely higher if rumor pricing on AMD is correct.
Calling 60% faster "barely faster" is just WRONG. Categorically WRONG. Maybe you could say that at 10%, heck I'll give you 15% too. But at 60% faster, you're not even on the same playing field. RTX cards (2060+ all models total) will sell MILLIONS in the next year and have likely already sold a million (millions? All of Xmas sales) with the 2080/2080 Ti/2070 out for a while. It only takes 10mil for a dev to start making games for consoles, and I'm guessing it's roughly the same for a PC technology (at 7nm they'll ALL be RTX, probably, to push the tech). The first run of Titans was 100k and sold out in an F5 olympics session...LOL. Heck, the 2nd run did the same IIRC, and NV said they couldn't make them as fast as they sold them. I'm pretty sure volume on a 2060 card is MUCH higher than Titan runs (500k? A million-unit run for midrange?). I think we're done here, as you have no data to back up your points ;) Heck, we can't even tell what NV is doing on 7nm yet, as it's all just RUMOR, as the ExtremeTech article shows. Also, I HIGHLY doubt AMD will decimate the new 590 with prices like WCCFTECH rumored. The 3070 price would pretty much KILL all 590 sales if it's $199. Again, I don't believe these prices, but we'll see if AMD is this dumb. NONE of these "rumors" look like they are AMD slides etc. Just words on a page. I think they'll all be $50 or more higher than rumored.
I think you're confused about DLSS. DLSS is fancy upscaling. It won't make a 1080p image look like 1440p, it will make a 720p image "sorta" look like a 1080p image (does it even work in 1080p now? It was originally only available on 1440p and 4k, for obvious reasons.).
The 2060 is the best-value 20-series card so far, but compared to historical values it's absolute garbage. In the $350 price bracket you had the 4GB 970, then the 8GB 1070/1070 Ti, and now the *6GB* 2060. The 1070 and 1070 Ti were a huge improvement over the equivalently priced 970, and the 2060 (the same price bracket) is a couple of percent better (if that) and at a 2GB memory deficit!
Nvidia shoveled an enterprise design onto gamers to try and recover some R&D costs from gamers itching for a new-generation card. There are already rumors of a 2020 release of an entirely new design based on 7nm. This seems quite plausible considering they are going to be facing new 7nm cards from AMD and *something* from Intel in 2020. And underestimating Intel has always been a very, very dangerous thing to do.
"DLSS is fancy upscaling. It won't make a 1080p image look like 1440p, it will make a 720p image "sorta" look like a 1080p image (does it even work in 1080p now? It was originally only available on 1440p and 4k, for obvious reasons.)."
Um, no, it's not upscaling, but rather the opposite.
https://www.digitaltrends.com/computing/everything... "DLSS also leverages some form of super-sampling to arrive at its eventual image. That involves rendering content at a higher resolution than it was originally intended for and using that information to create a better-looking image. But super-sampling typically results in a big performance hit because you’re forcing your graphics card to do a lot more work. DLSS however, appears to actually improve performance."
That is where the tensor AI comes in. It is trained in-house by Nvidia on the game, with higher-res images (up to 64x resolution), and the AI then applies what it learned to change the image you see by effectively applying supersampling (aka DOWNscaling).
Example:
If object A looks like A1 at 1080p, and like A2 if downsampled from resolution n*1080p,
if object B looks like B1 at 1080p, and like B2 if downsampled from resolution n*1080p,
and if object C looks like C1 at 1080p, and like C2 if downsampled from resolution n*1080p,
then a scene at 1080p containing objects A, B and C, which without DLSS would look like A1B1C1, will look like A2B2C2 with DLSS.
Intel will release cards that are low-end from a gaming point of view, if you consider 1440p the new standard. Nvidia will still have the high end and they will raise the prices of their high-performance cards even further. I see nothing good in the near future for our pockets. The wealthy trolls will be even more aggressive.
Why not? If DLSS delivers the right performance, I can't see why 4K is excluded from the possible uses of this card. Or do you think that 4K means 8+GB of RAM, 3000+ shaders and 64+ ROPs with 500+GB/s of bandwidth for some particular reason? They are all there to increase performance. If the card can do that with a different kind of work (like using DLSS), then what's the problem? The fact that AMD has to create a card with all the above resources to do (maybe) the same work, since they have not invested a single dollar in AI up to now and are at least 4 years behind the competition on this particular feature?
No, the 580 has a significantly worse price/performance ratio. All of you here are making the common mistake of calculating price/performance as though a card works without a PC to put it in, instead of improving the performance of said PC. If we calculated price/performance by the cost of the individual component then every discrete card would be infinitely worse than integrated graphics, which costs $0. We can intuitively understand that this is flat out false.
If you've just dumped $3000 into a high-end machine and its peripherals, with an RX 580 the cost goes to $3200. With an RTX 2060 it's $3350. The RTX 2060 is 4.7% more expensive for 50% more speed. This price/performance holds true until we hit a $100 PC. If you pulled a $100 Sandy Bridge rig off eBay, then at $450 with an RTX 2060 it is 50% more expensive than at $300 with an RX 580.
As long as your rig is worth more than $100, the RTX 2060 stomps the RX 580 in price/performance.
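For what it's worth, here is a minimal sketch of the whole-system framing used above, with the thread's own rough numbers (RX 580 at $200 = 1.0x performance, RTX 2060 at $350 = ~1.5x); the replies below argue this framing says more about the rest of the rig than about the cards:

```python
# Whole-system "perf per dollar": GPU performance divided by the cost of the
# entire PC. Performance figures are the thread's rough estimates, not data.

def system_perf_per_dollar(base_system_cost, gpu_price, gpu_perf):
    return gpu_perf / (base_system_cost + gpu_price)

for base in (100, 1500, 3000):
    rx580 = system_perf_per_dollar(base, 200, 1.0)
    rtx2060 = system_perf_per_dollar(base, 350, 1.5)
    print(f"${base:>4} base rig: RX 580 {rx580:.5f}  RTX 2060 {rtx2060:.5f}")

# At a $100 base rig the two tie; the pricier the rest of the system,
# the better the 2060 looks by this metric.
```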
You're ignoring the point that not everyone wants to spend $350 on a GPU. 1080p gaming is perfectly acceptable on cards costing less than $300 and at the moment AMD offers the best value here. Especially with RX 570. Turing is just overpriced.
Those people can stick with the lower-end card, this card is in a separate class, why are people comparing them in the first place lol. This card should be compared with the Vega 56 on the AMD side....
10 gallons of premium gas ($2.60) is only 00.002498438476% more than regular ($2.50) in my $40K truck.
However for my $500 riding lawnmower that would be .1904751904762%
If we were to take your example to the extreme, then you would need to add in the cost of electricity, factor in the percentage of your heating/cooling bill, house structure, lot, cost of living, price of your birth...and anything else that's a factor in the operation of the computer, because otherwise you would be making the common mistake of calculating price/performance as though computers operate without existence...
In actuality, premium typically being 10 cents more per gallon is in fact 4% more, because like GPUs you are comparing the product to a similar product.
In the case of the 580 & 2060, it's a 75% increase in price for 50% more performance, with the caveat that DLSS & RT are also included.
You are comparing a one-time purchase cost to expendable resources. Your logic is in fact sound when you buy server-class HW, where the purchase cost of the mere HW is only a fraction of the total cost needed to make it run, with energy and the cooling system taking the bigger part of the bill.
Here we are saying that you have bought a $40K truck, and adding the $50 comfortable chair instead of the bare wooden chair is not that expensive. Yes, with the latter you would have $50 more in your pocket, but at the cost of diminished comfort which you could have enjoyed for a mere $50/$40K extra on the truck purchase.
And yet, on the consumption point, buying an AMD card to get the same performance as Nvidia's is going to use more "fuel" and so cost more for as long and as much as you use it. It's like a $40K truck engine that AMD has to sell at $10 to have the same performance and appeal (given that it is not really that "smart") as a $20 muscle car. You can see why many go directly for the $20 car despite the AMD discount on their crap product.
Wow, OK. Pick anything not consumable then. Although a GPU can be considered expendable, as they tend to go obsolete within the lifetime of a set of tires. Hardly a one-time cost, or do you not upgrade? The consumability of the product isn't the point, nor does it have any bearing though.
His point would be fine if you were comparing an entire system, DIY or prebuilt. Let's say a Dell vs. an HP. All things equal except for the GPU, and the price reflects that - OK, fine. But we're comparing only the GPUs. At $3K it's 4.7%, at $1.5K it's 11% of total cost. And if we're comparing system price, then total system perf comes into play, not just the GPU perf. That's the trap you both are falling into.
Let's be clear. I'm not advocating the AMD card or the Nvidia card. I'm merely pointing out the trap he and you are falling into with such an argument.
TBF, if you're looking at a card with 2060 perf, then you're not considering the 580. If you're looking to build a 1080p gaming machine, the 580 is fine; the 2060 is overkill, along with your $3K system.
And costs 75% more. Also, we are talking about FPS, so it is a non-linear comparison. Anyway, it is still not awesome value.
If Navi is 15% faster than a Vega 64 and costs $100 less than an RTX 2060, you understand the RTX 2060 is still fairly overpriced. There is still no value here, and you might be able to grab a similar card for less if you find a good deal.
"If Navi is 15% faster than a Vega 64 and cost 100$ less than a RTX 2060..."
What if it is 100% faster than Vega 64, and they give it away for free? Heck, what if they threw a big chocolate cake into the deal and Lisa Su personally came to my house to spoon-feed it to me while a hot chick cosplaying as Ruby stripped for my entertainment? C'mon, my fantasy may be slightly more unrealistic than yours but it is certainly a lot more fun.
The problem is that most probably AMD has to price their high-end card that low to have a chance to sell it. If, as it seems, Navi has no RT, DLSS, or mesh shading (not to mention multi-projection, which helps a lot in VR, and voxel effects acceleration which, alas, have not been developed up to now due to their lack of support on the crappy console HW), they will compete only in the lower segment of the market. Their possibly high frame rates will all be fake, as the use of a single advanced effect supported by the competition will make them fall like a lead stone. Yet we will have hordes of AMD fanboys crying about "GameWorks" tricks and bells and whistles and Nvidia payments to developers and all the things we have already heard since AMD's solutions have not been able to keep up with Nvidia's advanced solutions that do not require simple brute force - that is, since AMD acquired ATI - when the simple reality is that they have been behind in technological development and need the market to slow down so as not to be left in the dust.
RX580 is totally beaten by RTX 2060. Not quite double the performance, but not far from it. Not to mention perf/w, perf/money, noise, etc. characteristics, which are boatloads better than on the AMD card.
Sorry, RX 580 noise levels seem to be quite reasonable. I've been watching mostly Vega since it is the only thing that is actually a proper upgrade for me.
Not sure what you mean by "watching," but if you mean "waiting for a price drop," I wouldn't hold my breath. The HBM2 memory on those boards is significantly more expensive than the stuff found on Nvidia's, plus, the cards are very power hungry (which not only is a concern for the user, but also means that they need circuitry on board to deliver that large amount of power, which also adds to the cost to make the card).
This legend that HBM costs the same as other kinds of memory has been around since people claimed that HBM cost about as much as GDDR5 (back then the comparison was against the scarcely available GDDR5X, produced only by Micron), and that using it versus 12 chips of GDDR5/X/6, plus the PCB layout complexity needed to handle them, cost almost the same.
Unfortunately none of this is true. HBM costs more per chip (and per GB) by itself, because its construction requires a high-end process for stacking up all those layers. Moreover, it requires a big silicon interposer that is expensive enough on its own to cover the cost of any GDDRn-based PCB. Third, it requires a different path for mounting and aligning the stacks on the interposer, which is also an expensive procedure (see the problems AMD encountered with it) and one that can't be done in the AIB fabs where GDDRn chips are usually mounted and soldered for $0.01 each.
What AMD fanboys constantly state is their hope that AMD is not losing too much money on every Vega they sell. Unfortunately, they are, and this new video card by Nvidia will put even harder pressure on Vega, as we already have announcements of further price cuts on such an expensive piece of crap that can't compete with anything in any market it has been presented in and has required constant price cuts even before it launched. In fact, Vega FE cards started discounted by $200 on day one with respect to the previously announced MSRP... what a marvelous debut!
Totally beaten in raw performance, yes. But I don't want it anywhere near badly enough to pay $349 for it. What you can actually buy for what I would be willing to spend is the RX 580 or GTX 1060 at under $200.
zepi, if all one wants is normal 1080p gaming then an RX 580 is a much better buy, especially used. Mine only cost 138 UKP. The real joke here is the price hiking of the entire midrange segment. The 2060 is what should have been the 2050. People are paying almost double for no real speed upgrade compared to two years ago at the next tier up (which should have been what the 2070 is now). Tech sites know this, some talked about it early on, but now they've all caved in to the new 2x higher pricing schema, the only exception being Paul at "not an apple fan" who continues to point out this nonsense. If people go ahead and buy these products then the prices will keep going up. And AMD will follow suit; they'd be mad not to from a business standpoint.
I would not expect the 2060 to be anywhere near MSRP; considering it's the only Turing card with a reasonable MSRP/performance ratio, demand will be high. And we all know what happens when demand is high.
@Benedict, did you even read the article? The RTX 2060 is more than 50% faster than the RX 580 and the GTX 1060. Furthermore, the GTX 1060 6GB costs around the same as the RX 580 - the 1060 is not "much more expensive."
Feel free to spend 100% more than what x60 cards used to cost, for a performance level that should be the tier below, but don't complain when the prices get hiked again because consumers keep buying this ripoff junk.
If you want the best price/performance you need to go a little bit lower than the 580. The 570s have been ~$130 AR pretty regularly, with a couple of dips below that. Personally, I picked up an 8GB 570 for $190 with three stacked rebates/GCs for a total of $90 off, bringing the cost of the card down to $100. I also sold off my old video card and one of the games from the bundle for another $50, bringing my upgrade cost down to ~$50. I had been wanting to upgrade for a while and was hoping for a 580 or a 1060 3GB or 6GB, but the 570 looked like such a good deal I couldn't resist. Yes, it was quite a few rebates, but at this point I've received all of them, so that is my final AR cost. Granted, this is even further down the performance curve, but an 8GB 570 is certainly going to deliver a lot more than 50% of the performance of an 8GB 580, and that's about what I paid relative to one.
The RX 570 can be found for $140-150 now and comes with your choice of 2 out of 3 unreleased games (Devil May Cry, The Division 2, and the Resident Evil 2 remake). For that price I think the 570 is the best GPU for the money - great for 1080p gaming.
The 2060 is considerably faster than a 580 though. I recently upgraded from an R9 290X to an EVGA RTX 2060 XC Black and love it. The 290X served me very well - a great card even with today's games at 1080p - but it struggled a bit trying to hit my monitor's 144Hz refresh.
I don't care if it's as fast as a 1070 Ti. A xx60-series card should never cost more than $250, and the 1060 was already overpriced for most of its run, due to all that bitcoin-fuckery.
The Vegas are a good bit cheaper than what the scale shows. Not just on sale, but regular price reductions. It's even mentioned in the article, so why the discrepancy? Also, I thought Vega was a bit slower than the vanilla 1080. It's showing as faster than the FE?
I'm not sure what you're referring to, since the best deal I've heard of on the Vega 56 was ~$320 on Black Friday, and today, I can't find a card for less than $370 (at NewEgg on one model, all others are $400+). I like AMD but given today's prices, the only price category where I think AMD wins right now is with the ~$200 580. The ~$280 RX 590 is most of the way to the 2060's MSRP but offers significantly less performance.
Per the article: "In the mix are concurrent events like AMD-partner Sapphire's just-announced RX Vega price cuts, which will see the RX Vega 64 Nitro Plus moved to $379 and the RX Vega 56 Pulse to $329, and both with an attached 3-game bundle." That's even better than what I've seen.
I just bought an MSI Vega 64 from Amazon for $399 with the 3-game bundle in December. I've seen on average $400-450 for the Vega 64 and a good bit lower for the Vega 56.
Compared to a 2060? The average difference according to Anand's Bench is 130 watts.
The average price of electricity in the US is 12 cents a kilowatt-hour. That means it would cost you 1.2 cents per 100 watts per hour, so about 1.56 cents more an hour to run a Vega 64 at full bore, balls out, compared to the 2060. If we then calculate the difference for an entire year @ 100% power draw for 365 days, or 8760 hrs, the total comes out to about $137. Here in Germany it would be about double that.
Let's be real, no one does that. (Miners?)
The average is 12 hrs of gaming a week! It's highly doubtful the card is running at 100% for all 12 of those hours, but even if it were: 52 weeks in a year at 12 hrs a week is 624 hrs, for a soul-crushing total of about $9.73.
So yes, it costs more to run a higher-power card....duh, but it's not double. Stop the FUD.
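A quick sanity check of the arithmetic above, assuming the ~130 W average difference and $0.12/kWh quoted in this sub-thread (adjust for your own rate and hours):

```python
# Extra electricity cost of running a card that draws ~130 W more,
# at the quoted US average rate of $0.12 per kWh.

def extra_cost(delta_watts, hours, usd_per_kwh=0.12):
    return delta_watts / 1000 * hours * usd_per_kwh

print(f"24/7 at full load for a year: ${extra_cost(130, 24 * 365):.2f}")  # ~$136.66
print(f"12 gaming hours a week:       ${extra_cost(130, 12 * 52):.2f}")   # ~$9.73
```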
Here's the thing, though, right now, there ISN'T a card on the market that offers anything like that level of performance for that price, if you can actually buy one for close to MSRP. The RX 590 is almost embarrassing in this test; a recently-launched card (though based on older tech) for $60 less than the 2060 but offering nowhere near the performance. The way I read the chart on performance/prices, there's good value at ~$200 (for a 580 card), then no good values up till you get to the $350 2060 (assuming it's available for close to MSRP). If AMD can offer the Vega 56 for say, $300 or less, it becomes a good value, but today, the best price I can find on one is $370, and that's just not worth it.
I don't say that the 2060 isn't good value, but it simply is priced way too high to be a midrange card, which the xx60-series is supposed to be. Midrange = a $1000 gaming rig, and that only leaves some $200-250 for the GPU. And as I wrote, even the 1060 was out of that price range for most of the last two years.
I totally get your point - but to some extent, it's semantics. I'd never drop the ~$700 that it costs to get a 2080 today, but given that that card exists and is sold to consumers as a gaming card, it is now the benchmark for "high end." The RTX 2060 is half that price, so I guess is "mid range," even if $350 is more than I'd spend on a GPU.
We've seen the same thing with phones - $700 used to be 'premium' but now the premium is more like $1k.
The one upside of all this is that the prices mean that there's likely to be a lot of cards like the 1060/1070/RX 580 in gaming rigs for the next few years, and so game developers will likely bear that in mind when developing titles. (On the other hand, I'm hoping maybe AMD or Intel will release something that hits a much better $/perf ratio in the next 2 years, finally putting pricing pressure on Nvidia at the mid/high end which just doesn't exist at the moment.)
It could be that the RTX 2060 is not a midrange card but a lower high-end card. Most xx60 cards in the past were midrange and cost around $200-300, but not all of them: go back to the GTX 200 series and the GTX 260's MSRP was $400, making it more of an upper-range card. The Founders Edition of the 1060 also launched at $300.
Weeeeeeeeeelllll.... before all the mining happened, the 970 was a pretty popular card at $300-$325. (At one point iirc it was the single most popular discrete GPU on Steam's hardware survey.)
Yeah, I think 350 is just about the maximum Nvidia can charge for midrange. The 970 had the bonus of offering 780ti levels of performance very shortly after that card launched. Today, we're looking at almost 3 years for such a jump (1080 > 2060).
I paid an inflated $450 for my launch 1070 2.5 years ago, and this 2060 is barely faster at $100 less. Godawful value proposition, especially when release dates are taken into consideration.
It's been a *long* time since I've seen a board vendor offer a board with more VRAM than spec'd by the GPU maker. I would be surprised if anyone did it...easier to point people at the 2070.
It's likely that Nvidia has actually done something to restrict the 2060s to 6GB - either though its agreements with board makers or by physically disabling some of the RAM channels on the chip (or both). I agree, it'd be interesting to see how it performs, since I'd suspect it'd be at a decent price/perf point compared to the 2070, but that's also exactly why we're not likely to see it happen.
You can't add memory at will. You need to take into consideration the available bus, and as this is a 192-bit bus, you can install 3, 6 or 12 GB of memory unless you cope with a hybrid configuration through heavily optimized drivers (as Nvidia did with the 970).
Even if they wanted to increase it, just adding 2GB more is hard to impossible. The chip has a certain memory interface, in this case 192-bit. That's six 32-bit memory controllers, for six 1GB chips. You cannot just add 2 more without getting into trouble - like the 970, which had unbalanced memory speeds, which was terrible.
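A tiny sketch of the constraint described above, assuming one memory chip per 32-bit channel and the common GDDR densities (4, 8 and 16 Gbit per chip); anything in between needs 970-style lopsided tricks:

```python
# "Clean" VRAM capacities possible for a given memory bus width,
# assuming one chip per 32-bit channel and standard chip densities.
CHIP_DENSITIES_GBIT = (4, 8, 16)  # 0.5 GB, 1 GB and 2 GB chips

def clean_vram_options_gb(bus_width_bits):
    channels = bus_width_bits // 32
    return sorted({channels * density // 8 for density in CHIP_DENSITIES_GBIT})

print(clean_vram_options_gb(192))  # [3, 6, 12] -> the RTX 2060's options
print(clean_vram_options_gb(256))  # [4, 8, 16] -> e.g. RX 580 / RTX 2070 class
```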
It was unnoticeable back then, because even the most intensive game/benchmark rarely utilized more than 3.5GB of RAM. The issue, however, comes when newer games inevitably start to consume more and more VRAM - at which point the "terrible" 0.5GB of VRAM will become painfully apparent.
So, you agree with my original comment which was that it was not terrible at the time? Four years from launch and it's not yet "painfully apparent"?
That's not a bad lifespan for a graphics card. Or if you disagree can you tell me which games, now, have noticeable performance issues from using a 970?
FWIW my 970 has been great at 1440p for me for the last 4 years. No performance issues at all.
I am more interested in this comment: "yesterday's announcement of game bundles for RTX cards, as well as 'G-Sync Compatibility', where NVIDIA cards will support VESA Adaptive Sync. That driver is due on the same day of the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD's FreeSync ecosystem advantage." Will ALL Nvidia cards support FreeSync/FreeSync 2, or only the RTX series?
Important to remember that VESA ASync and FreeSync aren't exactly the same.
I don't *think* it will be instant compatibility with the whole FreeSync range, but it would be nice. The G-sync hardware is too expensive for its marginal benefits - this capitulation has been a loooooong time coming.
More overpriced useless shit. These reviews are very rarely harsh enough on this kind of crap either, and i mean tech media in general. This shit isn't close to being acceptable.
Professionalism doesn't demand harshness. The charts and the pricing are reliable facts that speak for themselves and let a reader reach conclusions about the value proposition or the acceptability of the product as worthy of purchase. Since opinions between readers can differ significantly, it's better to exercise restraint. These GPUs are given out as media samples for free and, if I'm not mistaken, other journalists have been denied pre-NDA-lift samples for blasting the company or the product. With GPU shortages all around and the need to have a day-one review in order to get the search engine placement that drives traffic, there is an incentive to tiptoe around criticism when possible.
It all depends on what your definition of "shit" is. Shit may be something that costs too much for you (so shit is Porsche, Lamborghini and Ferrari, but for someone else also Audi, BMW and Mercedes, and for someone else again all C-segment cars), or it may be something that does not work as expected or underperforms relative to the resources it has. So for someone else, shit may be a chip that with 230mm^2, 256GB/s of bandwidth and 240W performs like a chip that is 200mm^2, has 192GB/s of bandwidth and uses half the power. Or it may be a chip that with 480mm^2, 8GB of the latest HBM technology and more than 250W performs just a bit better than a 314mm^2 chip with GDDR5X that uses 120W less.
To each their own definition of "shit", and of what should be bought to incentivize real technological progress.
The new feature doesn't subtract from its normal functions though - there is still an appreciable performance increase despite the focus on RT and whatnot. Plus, you can simply turn RT off and use it like a normal GPU. I don't see the issue here.
If you feel compelled to turn off the feature, then perhaps it is better to buy the alternative without it at a lower price. It comes down to how much the eye candy is worth to you at performance levels that you can get from a sub $200 card.
It's shit when these fancy new features are held back by the console market, which has difficulty handling even half the polygons that Pascal can, let alone the new Turing GPUs. The problem is not the technology that is put at our disposal; it is the market that is held back by obsolete "standards".
You mean held back by economics? If Nvidia feels compelled to sell ray tracing in its infancy for thousands of dollars, what do you expect of console makers who are selling the hardware for a loss? Consoles sell games, and if the games are compelling without the massive polygons and ray tracing then the hardware limitations can be justified. Besides, this hardly can be said of modern consoles that can push some form of 4K gaming at 30fps of AAA games not even being sold on PC. Ray tracing is nice to look at but it hardly justifies the performance penalties at the price point.
The same may be said for 4K: fancy to look at, but 4x the performance cost vs. Full HD is too much. And yet, as you can see, there are more and more people looking at 4K benchmarks to decide which card to buy. I would take better graphics over resolution any day. Raytraced films on Blu-ray (so in Full HD) look way better than any rasterized graphics at 4K. The path for graphics quality has been traced. Bear with it.
4K vs ray tracing seems like an obvious choice to you but people vote with their money and right now, 4K is far less cost prohibitive for the eye-candy choice you can get. One company doing it alone will not solve this, especially at such cost vs performance. We got to 4K and adaptive sync because it is an affordable solution, it wasn't always but we are here now and ray tracing is still just a fancy gimmick too expensive for most. Like it or not, it will take AMD and Intel to get on board for ray tracing on hardware across platforms, but before that, a game that truly shows the benefits of ray tracing. Preferably one that doesn't suck.
I would like to remind you that when the interest in 4K began, there were cards like the 980 Ti and the Fury, both unable to cope with that resolution. Did you ever write a single sentence saying 4K was a gimmick, useless to most people because it was too expensive to support? You may know that if you want to get to a point, you have to start walking towards it. If you never start, you'll never reach it. Nvidia started before anyone else in the market. You find it a gimmick move; I find it real innovation. Does it cost too much for you? Yes, but plasma panels also had four zeros on their price tags at the beginning, and at a certain point I could get one myself without going bankrupt.
AMD and Intel will come to the ray tracing table sooner than you think (that is, for AMD, the generation after Navi, since Navi was already finalized without the new compute units).
Here's the problem with that comparison: 4K is not simply about gaming, while ray tracing is. 4K started in the movie industry, then home video, then finally games. That was a trend the gaming industry couldn't avoid if it tried, so yes, Nvidia started ray tracing, but nobody was that surprised, and many thought AMD would soon follow. Ray tracing in real time is a technical feat that not everyone will get on board with right away. I do applaud Nvidia for starting it, but it's too expensive, and that's a harder barrier to entry than 4K ever was.
Quote from this review " We also noticed no visual difference using "Uber" versus "Ultra" Image Streaming unfortunately. In the end, it’s probably not worth it and best just to use the "Ultra" setting for the best experience."
I wish the GPU pricing comparison charts included a relative performance index (even if it was something like the simple arithmetic mean of all the scores in the review).
The 2060 looks like it's in a "sweet spot" for performance if you want to spend less than $500 but are willing to spend more than $200, but you can't really tell that from the chart (though if you read the whole review it's clear). Spending the extra $80 to go from a 1060/RX 580 to an RX 590 doesn't net you much performance; OTOH, going from the $280 RX 590 to the $350 2060 gets you a very significant boost in performance.
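As a sketch of the kind of index being asked for above, assuming you have per-game FPS numbers for each card (the FPS values below are made up purely for illustration; a geometric mean is less skewed by one outlier title than an arithmetic mean):

```python
from math import prod

# Hypothetical per-game FPS numbers, purely for illustration.
fps = {
    "RX 580":   {"game_a": 60, "game_b": 45, "game_c": 70},
    "RTX 2060": {"game_a": 95, "game_b": 68, "game_c": 110},
}

def relative_index(fps, baseline="RX 580"):
    games = fps[baseline]
    return {
        card: prod(scores[g] / fps[baseline][g] for g in games) ** (1 / len(games))
        for card, scores in fps.items()
    }

print(relative_index(fps))  # {'RX 580': 1.0, 'RTX 2060': ~1.55}
```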
A GPU born for compute, with 480mm^2 of silicon designed for that, 8GB of expensive HBM, and consuming 120W more, getting owned by a chip in the x60 class sold at the same price (and even though not all of the Turing silicon is being used when benchmarking today's games, the latter still performs better; let's see what happens when the RT and tensor units are used for other tasks like ray tracing, but also DLSS, AI and any other kind of effect. And do not forget about mesh shading).
I wonder how low the price of that crap has to go before someone considers it a good deal. The Vega chip failed miserably at its aim of competing with Pascal in the gaming, prosumer and professional markets; now, with this new cut-down Turing chip, Vega completely loses any reason to even be produced. Each piece sold is a raid on AMD's cash coffers and it will be EOL sooner rather than later. The problem for AMD is that until Navi they will have nothing to put up against Turing (the 590 launch is a joke - you can't really think a company that is serious about this market would do that, can you?) and will constantly lose money in the graphics division. And if Navi is not launched soon enough, they will lose a lot of money the more GPUs they (under)sell. If it is launched too early, they will lose money by using a process that is not mature enough, with lower yields (and boosting the voltage isn't really going to produce a #poorvolta(ge) device even at 7nm). These are the problems of being an underdog that needs the latest, most expensive technology to create something that can vaguely be considered decent with respect to the competition.
Let's hope Navi is not a flop like Polaris, or the generation after Turing will cost even more, after prices have already gone up with Kepler, Maxwell and Pascal.
Great job this GCN architecture! Great job Koduri!
@Semel: "...get a proepr Vega 56 card, undervolt it..."
Why is AMD so bad at setting the voltage on their GPUs? How good can their products be if they can't even properly do something that even the average Joe weekend overclocker can figure out?
The answer to the first question is: "They aren't. AMD sets those voltages because they know it is necessary to keep the GPU stable under load. So, when you think yourself more clever than a multi-billion-dollar tech giant and undervolt a Radeon, you make it less reliable outside of scripted benchmark runs."
Selfish opinion: but I really would have appreciated a 970 in the graph, in addition to the 1060. (Only two generations old, same market segment and similar price point.)
I wish they would show the 970 in tests too - partially because it was a popular card and most folks wait a couple of cycles to update and partially because that is what I have :) I would like to upgrade as it struggles at 4k and even 1440 on some of the latest games but I can’t stomach $500+
There's a lot of good value at ~$200 (RX 580, 1060 6GB since prices are being dragged down by the 2060), and then essentially nothing worth buying until the 2060 at $350, and then nothing until the 2070. (You could make a case for a Vega 64 on sale for $350, but even then, it's more power-hungry, etc.).
So if GPU performance is important, and your budget can accommodate a $250-400 GPU, the 2060 is the one to buy. People can complain about $350 being a "high end" price, but the fact is, it's WAY faster than what you get for spending say, $280 on an RX 590.
So you think its performance level is not worthy of its price tag? In that case then you wouldn't find a single GPU on the market right now that can meet your price/performance standards
So.. NVIDIA's back to their old ways of giving cards too little RAM. Their cards have typically had much less RAM than the ones from AMD. So the 2060 is a tiny bit faster than a 1070 Ti... But loses 2GB of VRAM. Once you start using over 6GB VRAM, the 2060 is going to lose hard to the 1070 Ti. Price is also too high. A firm "meh".
How about identifying a couple of games in which the 1070Ti "wins hard" over the 2060?
I think you'll find that 6GB of RAM is more than enough to run games at the resolutions that the 1070 Ti or 2060 excel at (1080p fast or 1440p 60Hz).
So who's going to bet this card doesn't launch at MSRP, like every other Nvidia card this generation? You might as well throw away any price/performance comparison done in this article, as we all know there is going to be an Nvidia tax thrown on top of that price tag.
If this is Nvidia's new mainstream card, I hate to see the price tag on their "budget" $200 2050.
I am surprised that the answer to your question is "well, actually, many are" available at or about MSRP. The one thing I'm puzzled by is that even though the article states NVIDIA will sell their own cards (as they have and do for many other models), I haven't seen one listed on NVIDIA's site (other than links to AIB cards) since day one.
My sense tells me that RTX 2000 series will be a short-lived line, and NV shall come up with the next-gen (be it called 3000 series or not) on 7nm later this year. Why bother buying a half-node-like card now?
Well, people who are building now, for one. There's still a lot of people who were put off upgrades when mining shot prices through the roof; this is a new entry that's significantly faster than say, $200 cards.
Having said that, I tend to agree with you: 7nm will probably offer significantly better performance, and if you care about raytracing, it's really the games that *started* development after the RTX came out that will show a real benefit (instead of seeing a reflection in one pond in one part of one map or something), and those games will be coming out around the time that the next-gen RTX will anyways.
The reality is that this card, spec-wise, is a replacement for the 1070, but calling it a 2070 would have shown small performance increases versus calling it a 2060. With it being faster than a 1070 at $30 less MSRP, it's an okay upgrade from a 1060, although I would have rather had 8+ GB of RAM. I would expect the 2050 Ti to have similar specs to a 1060, although with just 4 GB of RAM.
My take-home is: the 2060 is a good, maybe even very good graphics card. Price-performance wise, it's not a bad proposition, if (IF) you're reasonably sure that you won't run into the memory limit. The 6 GB the 2060 comes with is vintage Nvidia: it'll keep the 2060 off the 2070's back even for games that wouldn't require the 2070's bigger GPU brawn, and give Nvidia an easy way to make a 2060 Ti model in the near future; just add 2 GB for a full 8. That's my biggest beef with this card: it could have gone from a good to a great mid-upper level card just by giving it the 8 GB VRAM to start with. Now, it's not so sure how future proof it is.
Going to do this in a few posts, since I was writing while reading a dozen or more reviews and piling up a TON of data. I own AMD stock (NV soon too), so as a trader, you HAVE to do this homework, PERIOD(or you're dumb, and like to lose money...LOL). Don't like data or own stock? Move along.
Why are DLSS and RT or VRS benchmarks not shown? They should have been the HIGHLIGHT of the entire article. No HD textures in Far Cry (would a user turn this off before testing it?)? https://www.youtube.com/watch?v=9mxjV3cuB-c CLEARLY DLSS is awesome. Note how many times DLSS makes the 2060 run like a 2080 with TAA. 39% improvement he says with DLSS. WOW. At 6:29 you see the 2060+DLSS BEATING the 2080 with TAA. Note he has MULTIPLE tests here and a very good video review with many useful data points tested. Why can't AnandTech show any of these games that use NEW TECH? Ah right, sold out to AMD as a portal site. Same as Tom's Hardware (your sister site, no DLSS or RT there either, just COMING soon...LOL). Note he also says in there it would be INSANE to do RTX features and not have the 2060 capable of them, as it will be the BASE of RTX cards likely for years (the poor will just get them next year at 7nm or something for a little cheaper than this year, people), kind of like how Intel screwed base graphics with, well, CRAP integrated graphics so devs didn't aim higher. This is the same as last-gen console stuff, which held us back on PC for how long? @9:19, 60% faster than the 1060 for 40% more MONEY (in older Crysis 3 even). It was 42% faster than the RX 590 in the same game. Next game, Shadow of the Tomb Raider: 59% faster than the 1060, 40% faster than the RX 590. Out of 11 titles tested it's better than Vega 56 in 10 of them; only Far Cry 5 was better on Vega 56 (and only because of a perf spurt at the beginning of the benchmark, or that one was lost too). Beats Vega 64 in many too, even re-benched with the latest drivers as he notes.
@ 14:30 of the vid above, Wolfenstein II: The New Colossus with VRS perf mode turned on vs. the 1060: 92% higher fps (again for 40% more cash)! Vega 56 just died, 64 not far behind, as you get RT+DLSS on NV, which just adds to the above info. Cheapest Vega on Newegg is $369, Vega 64 higher at $399. Tough sell against a 2060 WITH RT+DLSS+VRS and fewer watts (210 for V56, 295 for V64, 160 for the 2060 RTX - that's bad). The power bill for 50W at 8hrs a day is ~$19 a year @ $0.12/kWh (and many places are over $0.20 in the USA, never mind elsewhere). So double that for the V64 at best (less than 8hrs) if you game and have a kid etc. that does too on that PC. Easy to hit an 8hr average even alone if you game heavily just on weekends. You can easily put in 20hrs on a weekend if you're single and a gamer, and again easily another 4 a night during the week. Got kids? You'll have more people doing damage. My current old Dell 24 (11 years old, Dell WFP2407-HC) uses ~110W. Simply replacing it pays for G-Sync, as I'd save about the same $19 a year (for a decade? @ $0.12 per kWh; many places in the USA are over $0.20, so the savings are higher for some) just by buying a new 30in model at 50W. Drop that to a 27in Dell and it goes to 35W! Think TCO here, people, not today's price. So simply replacing your monitor+GPU (say with a 2060) might save you ~$39 a year for 5-10 years. Hey, that's a free 2060 right there ;) This is why I'll laugh at paying $100 more for 7nm with 1070 Ti perf (likely much higher) with better watts/more features. I know I'll play on it for 5 years probably, then hand it to someone else in the family for another 3-5. I game more on my main PC than the TV (HTPC), so the 1070 Ti can move to the HTPC and I'll save even more on the main PC with 7nm. Always think TCO.
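To make the "think TCO" arithmetic above concrete, here is a hedged sketch: total cost of ownership as purchase price plus electricity, using the prices, wattages and the 8-hours-a-day heavy-use scenario quoted in the comment (swap in your own numbers):

```python
# Total cost of ownership: purchase price plus electricity over the years
# you keep the card, at the quoted $0.12 per kWh.

def tco(purchase_price, avg_watts, hours_per_day, years, usd_per_kwh=0.12):
    kwh = avg_watts / 1000 * hours_per_day * 365 * years
    return purchase_price + kwh * usd_per_kwh

# Five years at 8 gaming hours a day, using the wattages quoted above:
print(f"RTX 2060 (~160 W, $350): ${tco(350, 160, 8, 5):.0f}")  # ~$630
print(f"Vega 64  (~295 W, $400): ${tco(400, 295, 8, 5):.0f}")  # ~$917
```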
https://www.ign.com/boards/threads/variable-rate-s... “The end result is that instead of the RTX 2080 improving on the GTX 1080 by an average of around 25 to 30% in most titles, the 2080 outperforms the 1080 by around 60% in Wolfenstein II using VRS.” “So in essence, turning the VRS to Performance mode gets you half way between a 1080 Ti and a 2080 Ti, as opposed to basically matching the 1080 Ti in performance.” And again mentions next gen consoles/handheld to have it.
https://store.steampowered.com/hwsurvey/Steam-Hard... Again, why are you even bothering with 4K at 1.42% usage on Steam (125 MILLION GAMERS)? NOBODY is using it. Yeah, I call 1.42% NOBODY. Why not test more USEFUL games at resolutions people actually use? This is like my argument with Ryan on the 660 Ti article, where he kept claiming 1440p was used by enthusiasts...LOL. Not even sure you can claim that TODAY, years later, as 1080p is used by 60.7% of us and only 3.89% are on 1440p. Enthusiasts are NOT playing at 4K or even 1440p, unless you think gaming enthusiasts are only 5% of the public. Are you dense? 4K actually dropped .03%...ROFLMAO. 72% of MULTI-MONITOR setups are not even 4K…LOL. Nope, just 3840x1080. Who is paying you people to PUSH 4K when NOBODY uses it? You claimed 1440p in 2012 for the GTX 660 Ti. That was stupid or ignorant, even TODAY. The TOTAL of 1440p+4K is under 5%. NOBODY is 5%. Why is 4K listed before 1080p? NOBODY is using it, so the focus should be 1080p! This is like setting POLICY in your country based on what a few extremists want passed. IE, FREE COLLEGE for everyone, without asking WHO PAYS FOR IT? Further, NOT realizing MANY people have no business going to college as they suck at learning. Vocational for them at best. You are wasting college on a kid that has a GPA under 3.0 (attach any GPA you want, you get the point). 4K, but, but, but…So what, NOBODY USES IT. Nobody = 1.42%...LOL.
MipsToRemove, again lowering qual? Why not test full-on, as many users don't even know what .ini files are?...LOL. I'm guessing settings like this make it easier for AMD. MipsToRemove=0 sets ground textures to max and taxes vid memory. What are you guys using? If it's not 0 why?
“The highest quality preset, "Mein leben!", was used. Wolfenstein II also features Vega-centric GPU Culling and Rapid Packed Math, as well as Radeon-centric Deferred Rendering; in accordance with the preset, neither GPU Culling nor Deferred Rendering was enabled. NVIDIA Adaptive Shading was not enabled.” So in this case you turn off tech for both sides that NOBODY would turn off if buying EITHER side (assuming quality doesn’t drop SET properly). You buy AMD stuff for AMD features, and you do the same for RTX stuff on NV etc. Who goes home and turns off MAIN features of their hardware. Unless it makes a game UNPLAYABLE why the heck would ANYONE do this? So why test like this? Who tests games in a way WE WOULD NOT PLAY them? Oh, right, anandtech. Providing you with the most useless tests in the industry, “Anandtech”. We turn off everything you’d use in real life, you’re welcome…LOL.
Anandtech quote: “hidden settings such as GameWorks features” for Final Fantasy and no DLSS. Umm, who buys NV cards to turn off features? “For our testing, we enable or adjust settings to the highest except for NVIDIA-specific features and 'Model LOD', the latter of which is left at standard.” WTH are you doing here? Does it speed up NV cards? If yes why would anyone turn it off (trying to show NV weaker?)? I paid for that crap, so I’d definitely turn it on if it is FASTER or higher QUALITY.
Crap, have to reply to my own post to keep info in order (5.5 pages in word...LOL). Oh well, I'll just title 2nd post and on so I don't have to care ;)
https://www.techpowerup.com/reviews/NVIDIA/DLSS_Su... If someone can test DLSS in EARLY DECEMBER, why can’t Anandtech in January? You could at least show it on NV vs. NV without so people see how FAST it is (39% as noted before by DigitalFoundry youtube vid above). Ah, right, you don’t want people to know there is a 39% bump in perf coming for many games huh? I see why AMD will skip it, it takes a lot of homework to get it right for each game, as the article from techpowerup discusses. Not feasible on 50mil net, maybe next year: “DLSS is possible only after NVIDIA has generated and sampled what it calls a "ground truth" images—the best iteration and highest image quality image you can engender in your mind, rendered at a 64x supersampling rate. The neural network goes on to work on thousands of these pre-rendered images for each game, applying AI techniques for image analysis and picture quality optimization. After a game with DLSS support (and NVIDIA NGX integration) is tested and retested by NVIDIA, a DLSS model is compiled. This model is created via a permanent back propagation process, which is essentially trial and error as to how close generated images are to the ground truth. Then, it is transferred to the user's computer (weighing in at mere MBs) and processed by the local Tensor cores in the respective game (even deeper GeForce Experience integration). It essentially trains the network to perform the steps required to take the locally generated image as close to the ground truth image as possible, which is all done via an algorithm that does not really have to be rendered.”
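As a rough illustration of the per-game pipeline that quote describes (this is a toy sketch only; the network, shapes and training data are invented for illustration and are nothing like NVIDIA's actual NGX code), the idea is to train a small network offline against much higher-resolution "ground truth" frames and then ship only the trained weights, which the tensor cores evaluate per frame:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-in for the offline, per-game training described above: learn to
# map a low-res render toward a supersampled "ground truth" of the same frame.

class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, low_res):
        up = F.interpolate(low_res, scale_factor=2, mode="bilinear",
                           align_corners=False)
        return up + self.refine(up)   # naive upscale plus learned detail

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    low_res = torch.rand(4, 3, 64, 64)                     # placeholder frames
    ground_truth = F.interpolate(low_res, scale_factor=2)  # placeholder target;
    # the real pipeline uses 64x supersampled renders of the same scene instead
    loss = F.mse_loss(model(low_res), ground_truth)
    opt.zero_grad(); loss.backward(); opt.step()

# Only the trained weights (a few MB) would ship to the user's machine.
torch.save(model.state_dict(), "dlss_like_model.pt")
```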
Yeah, AMD can't afford this on a PER GAME basis at under $50mil NET income yearly. Usually AMD posts a LOSS, BTW - 3 of the last 4 years a $400mil loss or MORE, $8B lost over the life of AMD as a company; 2018 should be ~$400mil vs. NV at $3B-4B+ NET per year now ($3.1B in 2017, and over $4B for 2018 at $1B+ NET INCOME per quarter).
“With DLSS enabled, the game runs at a nice and steady 67 FPS (a 33% performance increase). That's a huge difference that alone can be the deciding factor for many.” Agreed, I guess that’s why Anadtech/toms refuse to delve into this tech? ;) “You can now increase settings on models and textures that would have previously driven FPS down below playable levels. DLSS will in turn bring the framerates back up.” OK, then, it’s great, says he likes quality also stating “pleased”. I’m sure someone will say it’s not as good as 4K native. Well no, the point is 4K “LIKE” quality on crappy hardware that can’t support it ;) As noted in the previous quote. Turn on options that would normally kill your fps, and use DLSS to get those fps back making it playable again. DLSS will always look better than your original res, as again, it’s turning 1080p into 1440p/4k (at 4k, so far in this game, it’s just fps boosting). From what I’ve seen its pretty much 1440p for free without a monitor upgrade, or reasonable 4k “LIKE” quality again, on 1080p or 1440p. Also enables playable 4k for some that would normally turn crap down to get there or not play at all.
I could go on, but I can’t even be bothered to read the rest of the article as I keep having to check to see if benchmarks include some crap that makes the data USELESS to me yet again. You should be testing AMD best advantages vs. NV best advantages ALWAYS unless it changes QUALITY (which would be like cheating if you lower it for better perf). IE, turn on everything that boosts speed for BOTH sides, unless again, QUALITY is dropping then turn it off. USERS will turn on everything unless it HURTS them right? 8:02 in that youtube vid above, he gains 3-4% PER move up in adaptive shading settings. This lets the 2060 best 1080 by max 15%. Besides, there are so many OTHER reviews to read that maybe didn’t do all the dumb things pointed out here. https://videocardz.com/79627/nvidia-geforce-rtx-20...
I did skip to the end (conclusions on gpu reviews are usually ridiculous). Not quite mainstream, but that mark has moved up; just ask Intel/NV (HEDT, and the top 2080ti selling faster than lower models, though that might change with the 2060). “The biggest question here isn't whether it will impact the card right now, but whether 6GB will still be enough even a year down the line.”...LOL. Uh, I can say that about EVERY card on the market at some point, right? Less than 15% of the 125mil Steam users have 8GB+. Don't you think game devs will aim at the 85% of the market first (maybe an HD texture pack pushes a few over later, but the main game aims at the MAIN audience)? This is like the consoles etc. mentioned above holding us back, and Intel iGPUs doing the same. 6GB probably will too, I guess.
"Now it is admittedly performing 14-15% ahead of the 8GB GTX 1070, a card that at MSRP was a relatively close $379"...LOL. Anything to take a shot, even when the conclusion is asinine as you're not even talking what happens NEXT YEAR when all the games are using DLSS+RT (either or both), and maybe VSR which got 92% boost vs. 1060 here. That is a LOT more than 15% over 1070 too! FAR higher than 59% you quote without using it right??). I'd guess 20-30mil will be sold from RTX line in the next year (NV sells 65-70% of the discrete cards sold yearly, surely 1/2 of NV sales will be RTX after 7nm), and much of those will be 6GB or less? No doubt the 3050, next year or whatever will then have 6GB too and sell even larger numbers. If you are claiming devs will aim at 15% of the market with 8GB, I will humbly guess they will DEFINITELY aim at 6GB which IS already 11% (same as 8GB % BTW), and will double in the next year or less. Most are on 1080p and this card does that handily in everything. With only 3.5% on 1440p, I don’t think many people will be aiming at 8GB for a while. Sure you can max something the crap out of a few to CAUSE this, but I doubt many will USE it like this anyway (you benchmark most stuff with features disabled!).
There are ~92-100mil discrete cards sold yearly now (36% of ~260-280mil PCs sold yearly), and 70% currently are NV cards. How many 6GB-or-LESS cards do you think will sell out of the ~62-70mil NV gpus sold in 2018? AMD might release a 6GB card in the next year or two also. Navi has a 4GB at under $200 (rumor $129, I think higher but…), so plenty of new stuff will sell under 6GB. Hopefully the 2050 etc. will have 6GB of GDDR5X (cheaper than GDDR6 for now) or something too, to get more 6GB cards out there and move the 40% who are on 2GB/4GB (~20% each; a great upgrade for the 2GB people). Your site is such a fan of TURNING OFF features, or turning graphics DOWN to play 1440p/4K, that I don't get why this is a problem anyway. Can't you just turn off HD textures like you already do to avoid it (or something else)? Never mind, I know you can. So "something to revisit" is just hogwash. People can just turn down one or two things and magically it will be using 6GB or less again...LOL. Again, 6GB or LESS is 85% of the gamer market! See Steam. How many games hit 8GB in your benchmarks? LOL. You already test with the stuff OFF that would cause an 8GB hit (HD textures etc.). What are you complaining about here? The card doesn't stop functioning because a game goes over 6GB; you just turn something down, right? LOL.
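For what it's worth, the back-of-envelope math above looks like this when spelled out (the inputs are the estimates quoted in this comment, not audited figures):

```python
# Rough market sizing using the figures claimed above (estimates, not verified data).
pcs_sold_per_year    = 270e6   # midpoint of the claimed 260-280 million PCs per year
discrete_attach_rate = 0.36    # ~36% of PCs get a discrete GPU
nvidia_share         = 0.70    # claimed ~70% of discrete GPU sales

discrete_gpus = pcs_sold_per_year * discrete_attach_rate
nvidia_gpus   = discrete_gpus * nvidia_share
print(f"~{discrete_gpus/1e6:.0f}M discrete GPUs/year, ~{nvidia_gpus/1e6:.0f}M of them NVIDIA")
# -> roughly 97M discrete GPUs and ~68M NVIDIA cards, in line with the 92-100M / 62-70M figures above
```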
“What makes the $350 pricing at least a bit more reasonable is its Radeon competition. Against RX Vega at its current prices the RTX 2060 (6GB) is near-lethal.” So is it “a bit more reasonable” or LETHAL? LETHAL sounds VERY reasonable to anyone looking at AMD before July, if Navi even hits by then, and they don’t have RT+DLSS AFAIK, so those are worth something given the numbers from the youtube guy at DigitalFoundry. His results testing the NEW features (well duh, Anandtech) are quite amazing.
“That driver is due on the same day of the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD’s FreeSync ecosystem advantage.”
Uh, no, it DEFINITELY ENDS IT AS OF NOW. It is no ADVANTAGE if the other guy has it on all current cards, right? It’s just a matter of checking the top ~100 FreeSync monitors for approval, as NV will surely aim at the best sellers first (just like game devs vs. 85% of the market). Oh wait, they tested 400 already and 12 made it so far, but you can turn it on with all models if desired: “Owners of other Adaptive-Sync monitors will be able to manually enable VRR on Nvidia graphics cards as well, but Nvidia won't certify how well that support will work.” So maybe yours works already even if NOT on the list, it's just a matter of how well, but heck, that describes FreeSync anyway (2nd rate, gen1 at least; gen2 not much better as AMD still isn’t forcing quality components) vs. G-Sync, which is easily the best solution (consider TCO over the monitor's life).

You didn’t even mention DLSS in the conclusion, and that is a MASSIVE boost to perf, basically offsetting the hit from RT. But you guys didn’t even go there…ROFLMAO. Yet again, you mention the 6GB in the final paragraph…LOL. How many times can you mention a “what if >6GB” scenario (which can simply be fixed by turning something down slightly or OFF, like HD textures) vs. IGNORING DLSS completely, a main feature of RTX cards? A LOT, apparently. Even beta benchmarks of the tech are AWESOME. See the DigitalFoundry guy above. His VRS numbers and the techpowerup DLSS numbers both show a 32%/33% gain from turning either one on. You don’t think this should be discussed at all in your article? That is MASSIVE for EITHER tech, right? As techpowerup notes in their DLSS article: “Our RTX 2080 Ti ran at 50 FPS, which was a bit too low for comfort. With DLSS enabled, the game runs at a nice and steady 67 FPS (a 33% performance increase). That's a huge difference that alone can be the deciding factor for many.” OK, so the 2080ti shows 33% from DLSS, and the 2060 shows 32% at DigitalFoundry with VRS. Seems like even patched-in games will get this perf increase. Why would a dev ignore a ~33% increase across the board? AMD had better get this tech soon, as I totally agree this alone would persuade MANY buyers. Note DF did their testing at 1080p; techpowerup did it at 4K, which is not as much value here IMHO (really just making the unplayable playable), but at lower res it adds a 4K-LIKE look too.
Screw it, I'll post it all in a row; doesn't look bad...LOL. Third and final part (assuming the post takes; each one keeps growing):
https://devblogs.nvidia.com/turing-variable-rate-s... How can you ignore something that is SIMPLE to integrate, with NV adding VRS plugins for game engines soon? Do you even do much work if NV includes it as a plugin? I doubt it. Also note, the more COMPLEX your pixel shaders are, the MORE you GAIN in perf using the tech. So your WHAT IF scenario (games always needing more stuff) works in reverse here, right? Games will NOT pass this up if it’s easy to add, especially with a plugin in game engines. But you guys didn’t even mention a 33% perf add, and as he also noted, when the GTX 980 launched it wasn’t a massive improvement, but over its life, “as maxwell matured, it left previous gens in the DUST” with driver updates!
https://www.youtube.com/watch?v=edYiCE0b8-c For those who want to know the VRS tech. This vid is Dec 4…LOL. A month later Anandtech has never heard of it. Also note the info regarding handhelds, as VRS tech really helps when your gpu resources are stretched to the limit already (think Nintendo Switch etc.). Note devs have been asking for this tech for ages, so he thinks the next consoles will support it. https://hothardware.com/reviews/nvidia-geforce-rtx... There's discussion here of Content Adaptive/Foveated/Motion Adaptive/Lens Optimized shading and VRS as a whole. NVIDIA claims developers can implement content-based shading rate reductions without modifying their existing rendering pipeline and with only small changes to shader code. This is HUGE for VR too, as you can avoid rendering pixels that would be DISCARDED anyway before going to the VR headset. “Turing’s new Variable Rate Shading techniques, as well as its more flexible Multiview Rendering capabilities will take time for developers to adopt, but the net gain could be over a 20 percent speed-up in graphically rich scenes and game engines, but with comparable or higher image quality as a result of these optimizations.” Actually we’ve already seen 33% in Wolfenstein, right? So he’s a little low here, but point made.
https://developer.nvidia.com/vrworks/graphics/vari... “Variable Rate Shading is a new, easy to implement rendering technique enabled by Turing GPUs. It increases rendering performance and quality by applying varying amount of processing power to different areas of the image.” Ahh, well, only a 32% boost feature (netting a 92% boost over the 1060…LOL), so who cares about this crap; and never mind it may be even more depending on shader complexity, as noted before. But AMD doesn’t have it, so turn it all off and ignore free perf…LOL.
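Conceptually the idea is simple; here's a toy sketch of content-adaptive shading-rate selection (my own illustration, not NVIDIA's NVAPI/VRWorks code - the tile size, thresholds and fake frame are all made up):

```python
import numpy as np

def pick_shading_rates(frame_luma, tile=16, flat_threshold=0.02):
    """Toy content-adaptive shading-rate selection.
    frame_luma: 2D luminance array in [0, 1].
    Returns per-tile rates: 1 = full rate, 2 = shade once per 2x2 pixels, 4 = once per 4x4."""
    h, w = frame_luma.shape
    rates = np.ones((h // tile, w // tile), dtype=int)
    for ty in range(h // tile):
        for tx in range(w // tile):
            block = frame_luma[ty*tile:(ty+1)*tile, tx*tile:(tx+1)*tile]
            detail = block.std()            # crude stand-in for local contrast/detail
            if detail < flat_threshold:
                rates[ty, tx] = 4           # very flat content: shade very coarsely
            elif detail < 4 * flat_threshold:
                rates[ty, tx] = 2           # low detail: shade at quarter rate
    return rates

# Fake frame: a smooth gradient on the left (sky, walls...), noisy detail on the right.
frame = np.zeros((512, 512))
frame[:, :256] = np.linspace(0.0, 1.0, 256)
frame[:, 256:] = np.random.rand(512, 256)

rates = pick_shading_rates(frame)
saved = 1 - (1 / rates.astype(float) ** 2).mean()   # fraction of pixel-shader work skipped
print(f"~{saved:.0%} of shading invocations skipped on this frame")
```

The point tracks the quotes above: the flatter (or faster-moving) the content, the more shader invocations can be dropped, and the more expensive each invocation is, the bigger the win.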
https://www.tomshardware.com/reviews/nvidia-turing... Tomshardware knew what it was at the 2080 review in SEPT and noted THIS: “But on a slower card in a more demanding game, it may become possible to get 20%+-higher performance at the 60ish FPS level. Perhaps more important, there was no perceivable image quality loss.”
OK, so why did they turn it all off in the 2060 review? https://www.tomshardware.com/reviews/nvidia-geforc... “To keep our Wolfenstein II benchmarks fair, we disable all of the Turing cards' Adaptive Shading features.” ER, UM, if QUALITY is NOT lost, as they noted before, WTH would you turn it OFF for? Is it UNFAIR that NV is far faster due to BETTER tech that does NOT degrade quality? NO, it’s AMD’s problem they don’t have it. ALL users will turn EVERY feature on if those features do NOT drop QUALITY, right? Heck, OK, if you’re not going to test it AGAINST AMD, then at least test it against itself, so people can see how massive the boost can be. Why would you ignore that? Ah right, the sister site is just as bad as Anandtech…ROFL.

Acting like one company doesn’t have features they are CLEARLY spending R&D on (and REMEMBER, as noted before, devs wanted this tech!), just because the OTHER guy hasn’t caught up, is DUMB or MISLEADING for both sites, which have gone down the toilet for reliable data on how you'd actually use the product. It only counts if AMD has it too…Until then, THESE FEATURES DON’T EXIST, we SWEAR…LOL. The second AMD gets it too, we’ll see Tom's/Anand bench it…LOL. Then it won’t be “we turned off Turing's adaptive features”, it will be “we turned on BOTH cards' adaptive features”, because AMD now won’t be left in the DUST.

They screw up their conclusion page too…LOL: “No, GeForce RTX 2060 needs to be faster and cheaper than the competition in order to turn heads.” Uh, why do they have to be CHEAPER if they are FASTER than AMD? “Nvidia’s biggest sin is probably calling this card a GeForce RTX 2060. The GeForce GTX 1060 6GB launched at $250.” Uh, no, the 1060 started at $299 as a Founders Edition, just as this one is a Founders Edition (as PCWorld, Guru3d etc. call it), so again, expect a price drop once NV is done selling them direct.
https://www.pcworld.com/article/3331247/components... As PCWorld says, “Not only does the card pack the dedicated RT and tensor core hardware that gives RTX GPUs their cutting-edge ray tracing capabilities, it trades blows in traditional game performance with the $450 GTX 1070 Ti rather than the $380 GTX 1070.” https://www.pcworld.com/article/3331247/components... “The only potential minor blemish on the spec sheet: memory capacity. The move to GDDR6 memory greatly improves overall bandwidth for the RTX 2060 versus the GTX 1060, but the 6GB capacity might not be enough to run textures and other memory-intensive graphics options at maximum settings in all games if you’re playing at 1440p resolution.” OK, so according to them, who cares, as 1440p cards have 8GB and 1440p is only 3.5% of the market anyway…LOL. NV’s response to why 6GB: “Right now the faster memory bandwidth is more important than the larger memory size.” They could have put in a cheaper 8GB of GDDR5, but they chose faster rather than more to hit $349 (and probably $300 next month from other vendors with non-Founders models), though they think maybe all will be $349, it seems. “We focused our testing on 1440p and 1080p, as those are the natural resolutions for these graphics cards.” LOL, agreed…why anyone tests 4K here, with ~1.5% share, is dumb. “We use the Ultra graphics preset but drop the Shadow and Texture Quality settings to High to avoid exceeding 8GB of VRAM usage.” Ahh, so PCWorld proves Anandtech is misleading people; it's not like cards just die if you exceed their VRAM, you just turn something down in a game like Middle-earth: Shadow of War…LOL. Anandtech acts like this is a SHOWSTOPPER. OMG, OMG…LOL. Hmm, 18 degrees lower than Vega 64, 8 below Vega 56; ouch.
Should all make sense, but it is almost 10am and I’ve been up all night...LOL. No point in responding, Anandtech; that will just end with me using DATA to beat you to death like the 660 Ti article (used Ryan's own data against his own conclusions, plus tons of other data from elsewhere, to prove him an outright liar), which ended with Ryan and company calling me names/attacking my character (not the data...LOL), which just looks bad for professionals ;) Best to just change how you operate, or every time a big product hits and I have time, boom, data on how ridiculous this site has become since Anand left (Tom's since Tom left...ROFL). I love days off. Stock homework all day, game a bit, destroy a few dumb reviews if I have time left over. :) Yeah, I plan days off for "EVENTS", like launches. Why does my review seem to cover more RELEVANT data than yours? ROFL. I own AMD stock, NOT Nvidia (yet; maybe this year…LOL, wait for the Q1 down report first, people, IMHO). One more point: Vulkan already has VRS support, and PCWorld mentions NV is working with MSFT on DX support for VRS, but “Until then, it'll expose Adaptive Shading functionality through the NVAPI software development kit, which allows direct access to GPU features beyond the scope of DirectX and OpenGL.” OH OK ;) Back to reading OTHER reviews; done trashing this useless site (for gpus at least; too much info missing that buyers would LIKE to know about perf).
I would wait a few generations before I would purchase a video card for its ray tracing abilities. The main reason I would buy any high end card is for FPS, because in competitive gaming, FPS is king, so any visual quality feature would be at minimum settings or turned off anyway.
AMD is about to have its best year for video cards, mark my words! NVidia has launched a product that, while awesome on paper and technology-wise, has physical hardware that is not capable of using that technology. I'm referring to RTX: it's too slow. And NV is asking a premium for it. NVidia has made AMD's job very easy: all AMD has to do is provide good cards MUCH CHEAPER and they will own at least 50% of the GPU market.
I bought a 580 8GB for 199 Australian just before Xmas. That was because they all had excess supply again from all the miners. Well, we all know what happened there! It crashed faster than a meth head would on a Sunday afternoon after being up for 3 days, and they all had ROOMS full of video cards. Knowing one of the bigger etailers' bosses personally: they had so much excess stock they were going to throw out low end cards because they didn't have room for more expensive products in the warehouse! For instance, they had 6 pallets of Gigabyte Aorus 580s and the next day 590s were being delivered. Luckily, when they dropped to $199AU a lot of ppl picked them up, and they still made a profit. The moral here is AMD doesn't need to compete with NV at all: set their own low prices that they still make 15% on and spank NV.
A couple of points:
1) Nvidia's strategy - which seems to have backfired - was to hope that ray tracing would take off in a big way, and it put all its eggs in that basket. Nvidia added specialized hardware to support RT, which has effectively priced it out of the mainstream market, and it has to add in "features" like DLSS to convince the consumer that they are getting a good deal at $350 for what is essentially a 1080p-class GPU - equivalent performance from "traditional" GPUs is available for ~$200.
2) Ray tracing will not take off until it is supported on the Xbox and PS, as a majority of games are developed with those platforms in mind. Considering AMD is rumored (confirmed??) to supply the CPU/GPU for the next-gen consoles - unless AMD starts supporting ray tracing this gen - Nvidia has essentially wasted hardware resources and consumer money on something that is completely useless. By the time we have software that uses RT in a meaningful way, these Nvidia cards will be redundant.
It won't take off because the current hardware can not run it very well. It's good on paper, and great to show, but this should have been released next year with a faster chip, because when RTX is on, FPS drops so low.
Next year, with the right HW, nvidia will have all developers ready to create optimized RT games (unlike BF5). You are just looking at the finger, not what it is pointing at. And if the weapon in AMD's hands is slowing technological progress because they have a monopoly in the console market, well, let me say it: I hope AMD dies soon.
Foolish. If you think nvidia can bring ray tracing to the masses on its lonesome, then you are dreaming. They have to charge you hundreds and thousands just for that finger, as you call it; that must be the middle finger, because it's not going to get the point across. For anything to become a standard - be it 2K, 4K, or adaptive sync - it needs to be affordable, and for that to happen AMD and Intel need to get on board.
Nvidia marketing fantasy to counter another nonsensical product...the 580. Just build one top card and sell many. Then build another one. It seems we are stuck on 1080 performance four years in a row. Lame.
Do not buy the 20xx series right now. Why? Because PCI Express 4.0 is coming with AMD, and probably soon with Intel. Wait for the 21xx from nVidia instead. The scores are not really exciting, and nVidia must jump to 7~10nm, which can provide more transistors with lower TDP. An RTX 2060 card with 160 Watt consumption is really high!
Hameedo - Monday, January 7, 2019 - link
Final Fantasy 15 already supports DLSS via a recently released patch, please correct that info
benedict - Monday, January 7, 2019 - link
Right now the Radeon RX 580 (8GB) has the best price/performance. 590, 1060 and 2060 are much more expensive but not that much faster.
Hameedo - Monday, January 7, 2019 - link
The 2060 is on par with 1070Ti, it means it's much faster than 580 and 1060 (about 60%)
sing_electric - Monday, January 7, 2019 - link
FWIW, the RX 580 can regularly be had for ~$200 these days, so TECHNICALLY benedict is right - the 580 goes for ~57% of the 2060's MSRP ($349) but offers more like 60-65% of the performance (and that's assuming you can find the 2060 for MSRP, which we'll have to see).IMO, they're just in different markets. The 590 is looking like a misstep by AMD at this point, since it's midway between the 580 and the 2060 in price, but isn't much faster than the 580 in real world performance. The Vega series are interesting, but AMD and partners probably have limited ability to lower prices due to how expensive the HBM2 memory on those boards is.
Bluescreendeath - Monday, January 7, 2019 - link
No, Benedict is not right. At best, he is only partially right on one out of at least three points, and only when viewed in the most favorable light. He said 1) the 2060 is "not much faster," 2) said that the 1060 is much more expensive and 3) said the 580 has the best price/performance ratio. The GTX2060 is clearly significantly faster than the RX580 (by more than 50%) so his first point is flat out wrong. His second point is also wrong because the GTX1060 costs about the same as the RX580 in terms of market price and performs about the same. His 3rd point is debateable and may be right. The 580 does have a marginally better price/performance ratio if you compare market price RX580 ($200) to MSRP GTX2060 ($350). However, if you compare MSRP RX580 ($240+) to MSRP FE GTX2060 ($350) then the equation changes as the RX580 no longer has the best price/performance either as the 2060 is 46% pricier but performs 50-60% better. And other models of the 2060 will surely be cheaper than the FE version.Ananke - Monday, January 7, 2019 - link
He is completely right - RTX2060 has 48 ROPs and 6 GB VRAM, which is pointless at $349, regardless how "fast" is the card. Raytracing is also pointless with so scarcely resourced card.At this point a person should either buy 2080Ti, if money are no object, or buy Radeon 580 8GB for under $200. Most modern games simply manage huge textures in VRAM - you need more RAM, quick memory buses and ROPs for that.
I am no particular fan, but AMD will kill NVidia any moment with announcement of 7 nm chips - at smaller process brute force will have advantage no matter how smart is NVidia's prefetch, bus compression and all other "smart" staff. Same happened to Intel. And TSMC is already occupied working for AMD, Samsung is busy and any excess is reserved by Intel, so NVidia will have no access to 7nm fab for a while
Bluescreendeath - Monday, January 7, 2019 - link
Real world results matter more than on paper specs. The actual real world benchmarks show the Gtx2060 is 50-60% faster than the rx580. You seem to be overly hung up on VRAM and ROPs. Anadtech and techspot have done plenty of tests on VRAM, and modern games dont use nearly as much VRAM as you think - even on 1440p. And take a look at the 4k benchmarks in the article. The 6Gb vram is clearly sufficient. Furthermore, the rx580 8gb costs about $200-250 new, whereas the 2060 is $350 for the more expensive founders edition. Realistically, the aftermarket cards will be something like 200-250 for new rx580s vs 300 for a new non FE 2060.As for your mention of future AMD chips, we are comparing 2060 vs 1060 vs rx580.
heavyarms1912 - Monday, January 7, 2019 - link
Reality check: RX580 8GB cards can be had below $200, even the top end aftermarket ones, and with 3 game titles.
wintermute000 - Monday, January 7, 2019 - link
And it's what, 50% slower, so what's your point?
Ananke - Tuesday, January 8, 2019 - link
There are several RX580 8GB cards sales today for around $160-170. It night be old, it might be hot, but runs 1080p just perfect on any game, and it costs half.Next generation consoles are already well into making, and developers are developing games for consoles, not for PCs, which will never change since the scale of business is tenfold if not larger. Next might be game services directly integrated in the TV or whatever device gives a mass market, but would never be a PC.
Anyway, new consoles will have A LOT more VRAM, and 6GB will simply not cut it. And they are coming later this year/early next year, not like waiting ten years for it. I have lived through enough NVidia "cycles" and can tell you this is typical NVidia greed - narrow memory bus and limited memory - exactly like 12 years ago, when AMD had nothing, but came with the rough, hot unoptimized 5850 and collected half of the market.
The card is OK, it is just not worth $350 today, that's my point.
Bluescreendeath - Monday, January 7, 2019 - link
And take a look at the article's 4K benchmarks. The 2060 6GB performs equal to or better than the Vega 64 or GTX 1070 gpus with 8GB of vram even at 4K resolution. You are overly hung up on paper specs and completely missing what is actually important - real world performance. The 2060 clearly has sufficient VRAM, as it performs better than its closest priced competitors and spanks the rx580 8gb by 50-60%.
ryrynz - Tuesday, January 8, 2019 - link
This guy gets it.
TheJian - Tuesday, January 8, 2019 - link
ROFL. AMD is aiming its first 7nm cards at GTX 1080/1070ti perf. This is not going to kill NV's cash cows; 80% of NET INCOME comes from ABOVE $300. IE, the 2060 on up plus workstation/server cards make up 80% of NV's INCOME (currently ~$3B a year, AMD <$350mil).

NV is one of the companies listed as having already made LARGE orders at TSMC, along with Apple, AMD etc. The only reason AMD is first (July, MAYBE, so NV has clear sailing for 6 more months on 2060+ pricing and NET INCOME) is that Nvidia is waiting for the price to come down so they can put out a LARGER die (you know, the brute force you mentioned), AND perhaps more importantly their 12nm 2080ti will beat the snot out of AMD's 7nm for over a year, as AMD is going small with 7nm because it's NEW. NV asked TSMC to make 12nm special for them...LOL, as you can do that with $3B net INCOME. It worked, as it smacked around AMD's 14nm chips, and it looks like they'll be fine vs. 7nm at SMALL die sizes. If you are NOT competing above $300 you'll remain broke vs. Intel/NV.
https://www.extremetech.com/computing/283241-nvidi...
As he says, NV is not likely to allow AMD more than 6-12 months on a new node without a response, though if you're not battling above $300, there's no point in an NV response, as not much of their money is made under $250.
“If Navi is a midrange play that tops out at $200-$300, Nvidia may not care enough to respond with a new top-to-bottom architectural refresh. If Navi does prove to be competitive against the RTX 2070 at a lower price point, Nvidia could simply respond by cutting Turing prices while holding a next-generation architecture in reserve precisely so it can re-establish the higher prices it so obviously finds preferable. This is particularly true if Navi lacks GPU ray tracing and Nvidia can convince consumers that ray tracing is a technology worth investing in.”
AGREED. No response needed if you are NOT chasing my cash cows. Again, just a price cut is needed even if NAVI is really good (still missing RT+DLSS), and then it's right back to higher prices with the 7nm next gen in Q1 2020? This is how AMD should operate too, but management doesn’t seem to get chasing RICH people instead of poor.
BRUTE FORCE=LARGER than your enemy, and beating them to death with it (NV does this, reticle limits hit regularly). NV 12nm is 445mm^2, AMD 7nm looks like 1/2 that. The process doesn't make up that much, so no win for AMD vs. top end NV stuff obviously and they are not even claiming that with 1080 GTX perf...LOL. 10-15% better than Vega64 or 1080 (must prove this at $249)? Whatever you're smoking, please pass it. ;)
Raytracing+DLSS on RTX 2060 is 88fps. Without both 90fps. I call that USEFUL, not useless. I call it FREE raytracing.
https://wccftech.com/amd-rx-3080-3070-3060-navi-gp...
Not impressed, given no RT/DLSS tech is likely from AMD for another year, as AMD said they won't bother for now. This card doesn't compete vs. ANY RTX card, as those come with RT+DLSS; NV shows this by releasing new 10-series cards, probably to justify the RTX pricing too (showing you AMD is in a different rung by competing with rehashed 10-series). And as you see below, DLSS+RT massively boosts perf, so it's free. It's clear using tensor cores to boost RT is pretty great.
https://wccftech.com/nvidia-geforce-rtx-2060-offic...
Now that is impressive. It beats my 1070ti handily, looks better while doing it, and shaves 20-30W off 180. That will be 120W with 7nm, and likely faster, and BEFORE Xmas, if AMD's 7nm is anything worth talking about on gpu (cpu great, gpu aimed too low, like xbox1/ps4 needing another rev or two...LOL). 5 Gigarays/s (none for AMD this year), which is plenty for 1080p as shown. It's almost like getting a monitor upgrade for free (1080p DLSS+RT looks like 1440p or better with no perf loss). What do you think happens with games DESIGNED for RT/DLSS instead of having it patched in like BF5? LOL. Those games will be faster than BF5 unless designed poorly, right? It only took 3 weeks to up perf 50% in a game NOT made for the tech (due to a bug in the hardware EA found, I guess, as noted in their vid - NV fixed it, and bam, perf up 50%). 6.5TF, which even without RT+DLSS is looking tough for AMD 7nm currently, based on the announcements above. AMD will be 150W for the 3080 it seems, vs. NV at 150-160W for the 12nm 2060 etc. Good luck; no wiping away NV here.
"If you’re having a hard time believing this, don’t worry, you’re not alone because it does sound unbelievable."
Yep. I agree, but we'll see. Not too hard to understand why AMD would have to price down, as it has no RT+DLSS. You get more from NV, so it's likely higher if rumor pricing on AMD is correct.
Calling 60% faster "barely faster" is just WRONG. Categorically WRONG. Maybe you could say that at 10%; heck, I'll give you 15% too. But at 60% faster, you're not even on the same playing field. RTX cards (2060+, all models total) will sell MILLIONS in the next year and have likely already sold a million (millions? all of the Xmas sales), with the 2080/2080ti/2070 out for a while. It only takes 10mil units for a dev to start making games for a console, and I'm guessing it's roughly the same for a PC technology (at 7nm they'll probably ALL be RTX to push the tech). The first run of Titans was 100k and sold out in an F5 Olympics session...LOL. Heck, the 2nd run did the same IIRC, and NV said they couldn't make them as fast as they sold them. I'm pretty sure the volume on a 2060 card is MUCH higher than Titan runs (500k? a million-unit run for midrange?). I think we're done here, as you have no data to back up your points ;) Heck, we can't even tell what NV is doing on 7nm yet, as it's all just RUMOR, as the extremetech article shows. Also, I HIGHLY doubt AMD will decimate their new 590 with prices like WCCFTECH rumored. The 3070 price would pretty much KILL all 590 sales if it's $199. Again, I don't believe these prices, but we'll see if AMD is this dumb. NONE of these "rumors" look like they come from AMD slides etc. Just words on a page. I think they'll all be $50 or more higher than rumored.
Bp_968 - Tuesday, January 8, 2019 - link
I think you're confused about DLSS. DLSS is fancy upscaling. It won't make a 1080p image look like 1440p; it will make a 720p image "sorta" look like a 1080p image (does it even work at 1080p now? It was originally only available at 1440p and 4K, for obvious reasons.). The 2060 is the best value 20 series card so far, but compared to historical values it's absolute garbage. In the $350 price bracket you had the 4GB 970, then the 8GB 1070/1070ti, and now the *6GB* 2060. The 1070 and 1070ti were a huge improvement over the equivalently priced 970, and the 2060 (the same price bracket) is a couple of percent better (if that) and at a 2GB memory deficit!
Nvidia shoveled an enterprise design onto gamers to try and recover some R&D costs from gamers itching for a new generation card. There are already rumors of a 2020 release of an entirely new design based on 7nm. This seems quite plausible considering they are going to be facing new 7nm cards from AMD and *something* from Intel in 2020. And underestimating Intel has always been a very very dangerous thing to do.
D. Lister - Tuesday, January 8, 2019 - link
"DLSS is fancy upscaling. It won't make a 1080p image look like 1440p, it will make a 720p image "sorta" look like a 1080p image (does it even work in 1080p now? It was originally only available on 1440p and 4k, for obvious reasons.)."Em no, it's not upscaling, but rather the opposite.
https://www.digitaltrends.com/computing/everything...
"DLSS also leverages some form of super-sampling to arrive at its eventual image. That involves rendering content at a higher resolution than it was originally intended for and using that information to create a better-looking image. But super-sampling typically results in a big performance hit because you’re forcing your graphics card to do a lot more work. DLSS however, appears to actually improve performance."
That is where the tensor AI comes in. It is trained in-house by Nvidia on the game, with higher-res images (up to 64x resolution), which the AI then uses to change the image you see by effectively applying supersampling (aka DOWNscaling).
Example:
If object A looks like A1 at 1080p, which looks like A2 if downsampled from resolution(n*1080p)
If object B looks like B1 at 1080p, which looks like B2 if downsampled from resolution(n*1080p)
If object C looks like C1 at 1080p, which looks like C2 if downsampled from resolution(n*1080p)
Then a scene at 1080p which has objects A, B and C which, without DLSS, would look like A1B1C1, would look like A2B2C2 with DLSS.
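For what it's worth, the "render higher, then downscale" part of supersampling is easy to demonstrate directly; this is just a generic toy sketch of that step (my own illustration, not how DLSS itself is implemented):

```python
import numpy as np

def supersample(render_at, target_h, target_w, factor=4):
    """Render at factor^2 the pixel count, then average down to the target size.
    `render_at` is any function (h, w) -> float image with values in [0, 1]."""
    hi = render_at(target_h * factor, target_w * factor)     # the expensive part
    hi = hi.reshape(target_h, factor, target_w, factor)
    return hi.mean(axis=(1, 3))                               # box-filter downsample

# A stand-in "renderer": a sharp diagonal edge that aliases badly at low resolution.
def toy_render(h, w):
    y, x = np.mgrid[0:h, 0:w]
    return ((x / w + y / h) > 1.0).astype(float)

naive = toy_render(270, 480)                # rendered directly at the display resolution
nice  = supersample(toy_render, 270, 480)   # rendered at 4x per axis, then averaged down
print("anti-aliased edge pixels - naive:", int(np.sum((naive > 0) & (naive < 1))),
      "| supersampled:", int(np.sum((nice > 0) & (nice < 1))))
```

The naive render only contains pure 0/1 values along the edge; the supersampled one gets the intermediate values (A2/B2/C2 in the example above) that make it look smoother at the same output resolution.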
Gastec - Saturday, January 19, 2019 - link
Intel will release low end cards, from a gaming point of view, considering 1440p as the new standard. Nvidia will still have the high end and they will raise the prices of their high performance cards even further. I see nothing good in the near future for our pockets. The wealthy trolls will be even more aggressive.
ryrynz - Tuesday, January 8, 2019 - link
What? Pointless? You don't use this card for 4K...
CiccioB - Wednesday, January 9, 2019 - link
Why not? If DLSS allows for the right performance, I can't see why 4K is excluded from the possible uses of this card.
Or do you think 4K requires 8+GB of RAM, 3000+ shaders and 64+ ROPs with 500+GB/s of bandwidth for some particular reason?
They are all there to increase performance. If you can do that with another kind of work (like using DLSS), then what's the problem?
The fact that AMD has to create a card with the above resources to do (maybe) the same work, since they have not invested a single dollar in AI up to now and are at least 4 years behind the competition on this particular feature?
DominionSeraph - Monday, January 7, 2019 - link
No, the 580 has a significantly worse price/performance ratio. All of you here are making the common mistake of calculating price/performance as though a card works without a PC to put it in, instead of improving the performance of said PC. If we calculated price/performance by the cost of the individual component, then every discrete card would be infinitely worse than integrated graphics, which costs $0. We can intuitively understand that this is flat out false.

If you've just dumped $3000 into a high end machine and its peripherals, with an RX 580 the cost goes to $3200. With an RTX 2060 it's $3350. The RTX 2060 is 4.7% more expensive for 50% more speed.
This price/performance holds true until we hit a $100 PC. If you pulled a $100 Sandy Bridge rig off ebay, then at $450 with an RTX 2060 it is 50% more expensive than at $300 with an RX 580.
As long as your rig is worth more than $100, the RTX 2060 stomps the RX 580 in price/performance.
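A quick sketch of that arithmetic (the card prices and the 1.5x performance figure are the ones used in this comment; the $100 break-even falls straight out of the algebra):

```python
def system_value(base_pc_cost, gpu_cost, gpu_perf):
    """Performance per dollar for the whole system, not for the card in isolation."""
    return gpu_perf / (base_pc_cost + gpu_cost)

rx580, rtx2060 = (200, 1.0), (350, 1.5)     # (price, relative performance), per the comment above

for base in (3000, 1500, 500, 100, 50):
    v580, v2060 = system_value(base, *rx580), system_value(base, *rtx2060)
    winner = "RTX 2060" if v2060 > v580 else "RX 580"
    print(f"${base:>4} base rig: {winner} wins ({v2060 / v580:.2f}x the perf/$ of the 580)")

# Break-even: (base + 350) / (base + 200) = 1.5  =>  base = $100,
# which is where the "$100 PC" figure in the comment comes from.
```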
AshlayW - Monday, January 7, 2019 - link
You're ignoring the point that not everyone wants to spend $350 on a GPU. 1080p gaming is perfectly acceptable on cards costing less than $300 and at the moment AMD offers the best value here. Especially with RX 570. Turing is just overpriced.
RIFLEMAN007 - Tuesday, January 8, 2019 - link
Those people can stick with the lower-end card, this card is in a separate class, why are people comparing them in the first place lol. This card should be compared with the Vega 56 on the AMD side....
Manch - Tuesday, January 8, 2019 - link
That's a logical fallacy.

10 gallons of premium gas ($2.60/gal) is only ~0.0025% more than regular ($2.50/gal) relative to my $40K truck.
However, relative to my $500 riding lawnmower that would be ~0.19%.
If we were to take your example to the extreme, then you would need to add in the cost of electricity, factor in the percentage of your heating/cooling bill, house structure, lot, cost of living, price of your birth... and anything else that's a factor in the operation of the computer, because otherwise you would be making the common mistake of calculating price/performance as though computers operate without existing...
In actuality, premium being typically 10 cents more per gallon is in fact 4% more, because, like GPUs, you are comparing the product to a similar product.
In the case of the 580 & 2060, it's a 75% increase in price for 50% more performance, with the caveat that DLSS & RT are also included.
CiccioB - Wednesday, January 9, 2019 - link
Yours is the fallacy in logic. You are comparing a one-time purchase cost to expendable resources. Your logic is in fact good when you buy server-class HW, where the purchase cost of the mere HW is only a fraction of the total cost needed to make it run, with energy and the cooling system taking the bigger part of the bill.

Here we are saying that you have bought a $40K truck, and adding that the $50 comfortable chair is not that expensive compared to buying that other bare wooden chair.
Yes, with the latter you would have $50 more in your pocket, but at the cost of diminished comfort which you could have enjoyed for a mere $50/$40K extra on the truck purchase.
And yet, on the consumption side, buying an AMD card to get the same performance as nvidia's is going to use more "fuel" and so cost more, for as much and as long as you use it.
It's like a $40K truck engine that AMD has to sell at $10 to have the same performance and appeal (given it is not really that "smart") as a $20 muscle car's.
You can see why many go directly for the $20 car despite the AMD discount on their crap product.
Manch - Friday, January 11, 2019 - link
Wow, OK. Pick anything not consumable then. Although a GPU can be considered expendable, as they tend to go obsolete within the lifetime of a set of tires. Hardly a one-time cost, or do you not upgrade? The consumability of the product isn't the point, nor does it have any bearing, though.

His point would be fine if you were comparing an entire system, DIY or prebuilt. Let's say a Dell vs. HP, all things equal except for the GPU, with the price reflecting that - OK, fine. But we're comparing only the GPUs. At a $3K rig the difference is 4.7%; at $1.5K it's closer to 9% of total cost. And if we were comparing system price, then total system perf comes into play, not just GPU perf. That's the trap you both are falling into.
Let's be clear: I'm not advocating the AMD card or the Nvidia card. I'm merely pointing out the trap he and you are falling into with such an argument.
TBF, if you're looking at a card with 2060 perf, then you're not considering the 580. If you're looking to build a 1080p gaming machine, the 580 is fine; the 2060 is overkill, along with your $3K system.
eva02langley - Monday, January 7, 2019 - link
And it costs 75% more. Also, we are talking about FPS, so it is a non-linear comparison. Anyway, it is still not awesome value.

If Navi is 15% faster than a Vega 64 and costs $100 less than an RTX 2060, you understand the RTX 2060 is still fairly overpriced. There is still no value here, and you might be able to grab a similar card for less if you find a good deal.
D. Lister - Tuesday, January 8, 2019 - link
"If Navi is 15% faster than a Vega 64 and cost 100$ less than a RTX 2060..."What if it is 100% faster than Vega 64, and they give it away for free? Heck, what if they threw a big chocolate cake into the deal and Lisa Su personally came to my house to spoon-feed it to me while a hot chick cosplaying as Ruby stripped for my entertainment? C'mon, my fantasy may be slightly more unrealistic than yours but it is certainly a lot more fun.
CiccioB - Wednesday, January 9, 2019 - link
The problem is that most probably AMD has to price their high end card at that low price to have a chance to sell it.

If, as it seems, Navi has no RT, DLSS, or mesh shading (without speaking of multi-projection, which helps a lot in VR, and voxel effects acceleration, which, alas, has not been developed up to now due to the lack of support on the crappy console HW), they will only compete in the lower segment of the market.
Their possibly high frame rates will all be fake, as the use of a single advanced effect supported by the competition will make them drop like a lead stone.
Yet we will have hordes of AMD fanboys crying out about "GameWorks" tricks and bells and whistles and nvidia payments to developers and all the things we have already heard ever since AMD's solutions stopped being able to keep up with nvidia's advanced solutions that don't rely on simple brute force - that is, since AMD acquired ATI - when the simple reality is that they have been behind in technological development and need the market to slow down so it doesn't leave them in the dust.
zepi - Monday, January 7, 2019 - link
I don't know what mushrooms you are eating, but I want some of those as well.
https://www.anandtech.com/bench/product/2299?vs=23...
RX580 is totally beaten by RTX 2060. Not quite double the performance, but not far from it. Not to mention perf/w, perf/money, noise, etc. characteristics, which are boatloads better than on the AMD card.
zepi - Monday, January 7, 2019 - link
Sorry, RX 580 noise levels seem to be quite reasonable. I've been watching mostly Vega, since it is the only thing that is actually a proper upgrade for me.
sing_electric - Monday, January 7, 2019 - link
Not sure what you mean by "watching," but if you mean "waiting for a price drop," I wouldn't hold my breath. The HBM2 memory on those boards is significantly more expensive than the stuff found on Nvidia's, plus, the cards are very power hungry (which not only is a concern for the user, but also means that they need circuitry on board to deliver that large amount of power, which also adds to the cost to make the card).
zepi - Monday, January 7, 2019 - link
Watching with disgust due to high power usage and noise. I won't upgrade to something that sounds as bad as my current AMD card.
eva02langley - Monday, January 7, 2019 - link
HBM2 and GDDR6 are at roughly the same price, even without the actual numbers. GDDR6 is 70% more expensive than GDDR5.

But yeah, Koduri's mistake was to push HBM2 on Vega; it should have been GDDR5 or GDDR5X.
CiccioB - Monday, January 7, 2019 - link
This legend that HBM costs the same as other kinds of memory has been around since people claimed that HBM costs the same as GDDR5 (back then the comparison was against the scarcely available GDDR5X, produced only by Micron), and that using it versus 12 chips of GDDR5/X/6, plus the PCB layout complexity needed to handle them, was almost a wash.

Unfortunately, none of this is true.
HBM costs more per chip (and per GB) by itself, as its construction requires a high end process for stacking up all those layers.
Moreover, it requires a big silicon interposer, which alone is expensive enough to cover the cost of any GDDRn-based PCB.
Third, it requires a different process for mounting and aligning the stacks on the interposer, which is also expensive (see the problems AMD encountered with it) and can't be done in the AIB fabs where GDDRn chips are usually mounted and soldered for $0.01 per chip.
What AMD fanboys constantly state is their hope that AMD is not losing too much money on every Vega they sell. Unfortunately, they are, and this new video card from nvidia will put even more pressure on Vega, as we already have announcements of further price cuts on such an expensive piece of crap that can't compete with anything in any market it has been presented in, and has required constant price cuts even before it launched. In fact, Vega FE cards started out discounted by $200 at day one with respect to the previously announced MSRP... what a marvelous debut!
Ratman6161 - Monday, January 7, 2019 - link
Totally beaten in raw performance, yes. But I don't want it anywhere near badly enough to pay $349 for it. What you can actually buy for what I would be willing to spend is the Rx580 or GTX1060 at under $200.
mapesdhs - Monday, January 28, 2019 - link
zepi, if all one wants is normal 1080p gaming then an RX 580 is a much better buy, especially used. Mine only cost 138 UKP. The real joke here is the price hiking of the entire midrange segment. The 2060 is what should have been the 2050. People are paying almost double for no real speed upgrade compared to two years ago at the next tier up (which should have been what the 2070 is now). Tech sites know this, some talked about it early on, but now they've all caved in to the new 2x higher pricing schema, the only exception being Paul at "not an apple fan" who continues to point out this nonsense. If people go ahead and buy these products then the prices will keep going up. And AMD will follow suit, they'd be mad not to from a business standpoint.
Opencg - Monday, January 7, 2019 - link
I would not expect the 2060 to be anywhere near MSRP; considering it's the only Turing card with a reasonable MSRP/performance ratio, the demand will be high. And we all know what happens when demand is high.
Bluescreendeath - Monday, January 7, 2019 - link
@Benedict, did you even read the article? The RTX 2060 is more than 50% faster than the RX580 and the GTX1060. Furthermore, the GTX1060 6GB costs around the same as the RX580 - the 1060 is not "much more expensive."
mapesdhs - Monday, January 28, 2019 - link
Feel free to spend 100% more than what x60 cards used to cost, for a performance level that should be the tier below, but don't complain when the prices get hiked again because consumers keep buying this ripoff junk.
kpb321 - Monday, January 7, 2019 - link
If you want the best price/performance you need to go a little bit lower than the 580. The 570s have been ~$130 AR pretty regularly, with a couple of dips below that. Personally, I picked up an 8GB 570 for $190 with three stacked rebates/GCs for a total of $90 off, bringing the cost of the card down to $100. I also sold off my old video card and one of the games from the bundle for another $50, bringing my upgrade cost down to ~$50. I had been wanting to upgrade for a while and was hoping for a 580 or a 1060 3GB or 6GB, but the 570 looked like such a good deal I couldn't resist. Yes, it was quite a few rebates, but at this point I've gotten all of them, so that is my final AR cost. Granted, this is even further down the performance curve, but an 8GB 570 is certainly going to deliver a lot better than 50% of the performance of an 8GB 580, and that's about what I paid for mine.
zaza - Monday, January 7, 2019 - link
The RX 570 can be found for $140-$150 now and comes with your choice of 2 out of 3 (unreleased) games (Devil May Cry, The Division 2 and the Resident Evil 2 remake). For that price I think the 570 is the best GPU for the money, great for 1080p gaming.
JRW - Saturday, February 23, 2019 - link
The 2060 is considerably faster than a 580 though. I recently upgraded from an R9 290X to an EVGA RTX 2060 XC Black and love it. The 290X served me very well - great card even with today's games @ 1080p - but struggled a bit trying to hit my monitor's 144Hz refresh.
PeachNCream - Monday, January 7, 2019 - link
Turing's MSRP makes the benchmark performance meaningless.
jrs77 - Monday, January 7, 2019 - link
Midrange card for 350 bucks... :facepalm:

I don't care if it's as fast as a 1070ti. An xx60 series card should never cost more than 250, and the 1060 was already overpriced for most of its run, due to all that bitcoin-fuckery.
Manch - Monday, January 7, 2019 - link
The Vegas are a good bit cheaper than what the scale shows. Not just on sale, but regular price reductions. It's even mentioned in the article, so why the discrepancy? Also, I thought Vega was a bit slower than the vanilla 1080. It's showing as faster than the FE?
sing_electric - Monday, January 7, 2019 - link
I'm not sure what you're referring to, since the best deal I've heard of on the Vega 56 was ~$320 on Black Friday, and today, I can't find a card for less than $370 (at NewEgg on one model, all others are $400+). I like AMD but given today's prices, the only price category where I think AMD wins right now is with the ~$200 580. The ~$280 RX 590 is most of the way to the 2060's MSRP but offers significantly less performance.
Manch - Monday, January 7, 2019 - link
Per the article: "In the mix are concurrent events like AMD-partner Sapphire's just-announced RX Vega price cuts, which will see the RX Vega 64 Nitro Plus moved to $379 and the RX Vega 56 Pulse to $329, and both with an attached 3-game bundle." That's even better than what I've seen.

I just bought an MSI Vega 64 from Amazon for $399 with the 3 game bundle in Dec. I've seen on average $400-450 for Vega 64 and a good bit lower for Vega 56.
The chart has the Vega 56 at $499, which isn't the case.
Manch - Monday, January 7, 2019 - link
Vega 64 $399, Vega 56 $368 at Newegg. Plus 3 games.
Manch - Monday, January 7, 2019 - link
Vega 64 is $399 on Amazon as well. There are higher priced cards, but who cares if they're readily available at these prices?
Vayra - Wednesday, January 9, 2019 - link
They also take twice as much power at the wall. *poof* there go the savings. And you get free extra noise and heat in the case to boot.
Manch - Friday, January 11, 2019 - link
Double?! LOL. Compared to a 2060? The avg difference according to Anand's Bench is 130 watts.

The avg price of electricity in the US is 12 cents a kilowatt-hour. That means 1.2 cents per 100 watts per hour, so roughly 1.56 cents more an hour to run a Vega 64 at full bore, balls out, compared to the 2060. If we then calculate the difference for an entire year @ 100% power draw for 365 days, or 8760hrs, the total comes out to about $137. Here in Germany it would be about double that.
Let's be real, no one does that. (Miners?)
The average is 12hrs of gaming a week! Highly doubtful the card is running at 100% for those 12hrs, but even if it were:
52 weeks in a year at 12 hrs a week is 624hrs, for a soul-crushing total of about $10.
So yes, it costs more to run a higher power card....duh, but it hardly wipes out the savings. Stop the FUD.
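The arithmetic, spelled out (the 130 W delta and $0.12/kWh are the figures used above; the usage hours are obviously guesses):

```python
def extra_cost(delta_watts, hours, price_per_kwh=0.12):
    """Extra electricity cost of running a card that draws `delta_watts` more."""
    return delta_watts / 1000 * hours * price_per_kwh

delta = 130                                   # claimed average draw difference, Vega 64 vs RTX 2060
print(f"per hour:          ${extra_cost(delta, 1):.4f}")         # ~1.6 cents
print(f"24/7 for a year:   ${extra_cost(delta, 24 * 365):.2f}")  # ~$137, the absolute worst case
print(f"12 h/week, 1 year: ${extra_cost(delta, 12 * 52):.2f}")   # ~$10 for a more typical gamer
```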
just4U - Wednesday, January 23, 2019 - link
Wait, I'm late to this and likely no one will read it, but shoot, you never know. I have Vega cards. I undervolt and overclock. They work great.
sing_electric - Monday, January 7, 2019 - link
Here's the thing, though, right now, there ISN'T a card on the market that offers anything like that level of performance for that price, if you can actually buy one for close to MSRP. The RX 590 is almost embarrassing in this test; a recently-launched card (though based on older tech) for $60 less than the 2060 but offering nowhere near the performance. The way I read the chart on performance/prices, there's good value at ~$200 (for a 580 card), then no good values up till you get to the $350 2060 (assuming it's available for close to MSRP). If AMD can offer the Vega 56 for say, $300 or less, it becomes a good value, but today, the best price I can find on one is $370, and that's just not worth it.
jrs77 - Monday, January 7, 2019 - link
I'm not saying the 2060 isn't good value, but it simply is priced way too high to be a midrange card, which the xx60-series is supposed to be.

Midrange = a $1000 gaming rig, and that only leaves some $200-250 for the GPU. And as I wrote, even the 1060 was out of that price range for most of the last two years.
sing_electric - Monday, January 7, 2019 - link
I totally get your point - but to some extent, it's semantics. I'd never drop the ~$700 that it costs to get a 2080 today, but given that that card exists and is sold to consumers as a gaming card, it is now the benchmark for "high end." The RTX 2060 is half that price, so I guess it's "mid range," even if $350 is more than I'd spend on a GPU.

We've seen the same thing with phones - $700 used to be 'premium' but now the premium is more like $1k.
The one upside of all this is that the prices mean that there's likely to be a lot of cards like the 1060/1070/RX 580 in gaming rigs for the next few years, and so game developers will likely bear that in mind when developing titles. (On the other hand, I'm hoping maybe AMD or Intel will release something that hits a much better $/perf ratio in the next 2 years, finally putting pricing pressure on Nvidia at the mid/high end which just doesn't exist at the moment.)
Bluescreendeath - Monday, January 7, 2019 - link
It could be that the RTX 2060 is not a midrange card but a lower high-end card. Most xx60 cards in the past were midrange, but not all of them. While most past xx60 cards cost around $200-$300, if you go back to the GTX 200 series, the GTX 260's MSRP was $400 and it was more of an upper-range card. The Founders Edition of the 1060 also launched at $300.
dave_the_nerd - Monday, January 7, 2019 - link
Weeeeeeeeeelllll.... before all the mining happened, the 970 was a pretty popular card at $300-$325. (At one point iirc it was the single most popular discrete GPU on Steam's hardware survey.)
Vayra - Wednesday, January 9, 2019 - link
Yeah, I think 350 is just about the maximum Nvidia can charge for midrange. The 970 had the bonus of offering 780ti levels of performance very shortly after that card launched. Today, we're looking at almost 3 years for such a jump (1080 > 2060).
StrangerGuy - Wednesday, January 9, 2019 - link
I paid an inflated $450 for my launch 1070 2.5 years ago, and this 2060 is barely faster at $100 less. Godawful value proposition, especially when release dates are taken into consideration.
ScottSoapbox - Monday, January 7, 2019 - link
I wonder if custom 2060 cards will add 2GB more VRAM and how much that addition will cost.
A5 - Monday, January 7, 2019 - link
It's been a *long* time since I've seen a board vendor offer a board with more VRAM than spec'd by the GPU maker. I would be surprised if anyone did it...easier to point people at the 2070.
sing_electric - Monday, January 7, 2019 - link
It's likely that Nvidia has actually done something to restrict the 2060s to 6GB - either through its agreements with board makers or by physically disabling some of the RAM channels on the chip (or both). I agree, it'd be interesting to see how it performs, since I'd suspect it'd be at a decent price/perf point compared to the 2070, but that's also exactly why we're not likely to see it happen.
CiccioB - Monday, January 7, 2019 - link
You can't add memory at will. You need to take the available bus into consideration, and as this is a 192-bit bus, you can install 3, 6 or 12 GB of memory unless you cope with a hybrid configuration through heavily optimized drivers (as nvidia did with the 970).
nevcairiel - Monday, January 7, 2019 - link
Even if they wanted to increase it, just adding 2GB more is hard to impossible. The chip has a certain memory interface, in this case 192-bit. That's 6x 32-bit memory controllers, for six 1GB chips. You cannot just add 2 more without getting into trouble - like the 970, which had unbalanced memory speeds, which was terrible.
mkaibear - Tuesday, January 8, 2019 - link
"terrible" in this case defined as "unnoticeable to anyone not obsessed with benchmark scores"Retycint - Tuesday, January 8, 2019 - link
It was unnoticeable back then, because even the most intensive game/benchmark rarely utilized more than 3.5GB of VRAM. The issue, however, comes when newer games inevitably start to consume more and more VRAM - at which point the "terrible" 0.5GB of slow VRAM will become painfully apparent.
mkaibear - Wednesday, January 9, 2019 - link
So, you agree with my original comment, which was that it was not terrible at the time? Four years from launch and it's not yet "painfully apparent"?

That's not a bad lifespan for a graphics card. Or, if you disagree, can you tell me which games, right now, have noticeable performance issues from using a 970?
FWIW my 970 has been great at 1440p for me for the last 4 years. No performance issues at all.
atragorn - Monday, January 7, 2019 - link
I am more interested in that comment: "yesterday’s announcement of game bundles for RTX cards, as well as ‘G-Sync Compatibility’, where NVIDIA cards will support VESA Adaptive Sync. That driver is due on the same day of the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD’s FreeSync ecosystem advantage." Will ALL nvidia cards support Freesync/Freesync2, or only the RTX series?
A5 - Monday, January 7, 2019 - link
Important to remember that VESA Adaptive-Sync and FreeSync aren't exactly the same.

I don't *think* it will be instant compatibility with the whole FreeSync range, but it would be nice. The G-Sync hardware is too expensive for its marginal benefits - this capitulation has been a loooooong time coming.
Devo2007 - Monday, January 7, 2019 - link
Anandtech's article about this last night mentioned support will be limited to Pascal & Turing cards.
Ryan Smith - Monday, January 7, 2019 - link
https://www.anandtech.com/show/13797/nvidia-to-sup...
B3an - Monday, January 7, 2019 - link
More overpriced useless shit. These reviews are very rarely harsh enough on this kind of crap either, and i mean tech media in general. This shit isn't close to being acceptable.PeachNCream - Monday, January 7, 2019 - link
Professionalism doesn't demand harshness. The charts and the pricing are reliable facts that speak for themselves and let a reader reach conclusions about the value proposition or the acceptability of the product as worthy of purchase. Since opinions between readers can differ significantly, its better to exercise restraint. These GPUs are given out as media samples for free and, if I'm not mistaken, other journalists have been denied pre-NDA-lift samples by blasting the company or the product. With GPU shortages all around and the need to have a day one release in order to get search engine placement that drives traffic, there is incentive to tenderfoot around criticism when possible.CiccioB - Monday, January 7, 2019 - link
It all depends on what your definition of "shit" is. Shit may be something that costs too much for you (so shit is Porsche, Lamborghini and Ferrari, but for someone else also Audi, BMW and Mercedes, and for someone else again even ordinary C-segment cars), or it may be something that does not work as expected or underperforms given the resources it has.
So for someone else, "shit" may be a chip that, with 230mm^2, 256GB/s of bandwidth and 240W, performs like a chip that is 200mm^2, has 192GB/s of bandwidth and uses half the power.
Or it may be a chip that, with 480mm^2, 8GB of the latest HBM technology and more than 250W, performs just a bit better than a 314mm^2 chip with GDDR5X that uses 120W less.
To each their own definition of "shit" and of what should be bought to incentivize real technological progress.
saiga6360 - Tuesday, January 8, 2019 - link
It's shit when your Porsche slows down when you turn on its fancy new features.
Retycint - Tuesday, January 8, 2019 - link
The new feature doesn't subtract from its normal functions though - there is still an appreciable performance increase despite the focus on RTX and whatnot. Plus, you can simply turn RTX off and use it like a normal GPU? I don't see the issue here.
saiga6360 - Tuesday, January 8, 2019 - link
If you feel compelled to turn off the feature, then perhaps it is better to buy the alternative without it at a lower price. It comes down to how much the eye candy is worth to you at performance levels that you can get from a sub-$200 card.
CiccioB - Tuesday, January 8, 2019 - link
It's shit when these fancy new features are held back by a console market that has difficulty handling less than half the polygons that Pascal can, let alone the new Turing GPUs. The problem is not the technology being made available, but the market being held back by obsolete "standards".
saiga6360 - Tuesday, January 8, 2019 - link
You mean held back by economics? If Nvidia feels compelled to sell ray tracing in its infancy for thousands of dollars, what do you expect of console makers who are selling the hardware at a loss? Consoles sell games, and if the games are compelling without the massive polygons and ray tracing, then the hardware limitations can be justified. Besides, this can hardly be said of modern consoles that can push some form of 4K gaming at 30fps in AAA games that aren't even sold on PC. Ray tracing is nice to look at, but it hardly justifies the performance penalties at this price point.
CiccioB - Wednesday, January 9, 2019 - link
The same may be said of 4K: fancy to see, but 4x the performance cost vs Full HD is too much. Yet as you can see, there are more and more people looking at 4K benchmarks to decide which card to buy.
I would trade better graphics vs resolution any day.
Raytraced films on Blu-ray (so in Full HD) look far better than any rasterized graphics at 4K.
The path for graphics quality has been traced. Bear with it.
saiga6360 - Wednesday, January 9, 2019 - link
4K vs ray tracing seems like an obvious choice to you, but people vote with their money, and right now 4K is the far less cost-prohibitive of the two eye-candy choices. One company doing it alone will not solve this, especially at such a cost vs performance ratio. We got to 4K and adaptive sync because they became affordable; it wasn't always so, but we are here now, and ray tracing is still just a fancy gimmick too expensive for most. Like it or not, it will take AMD and Intel getting on board to bring hardware ray tracing across platforms - and before that, a game that truly shows the benefits of ray tracing. Preferably one that doesn't suck.
CiccioB - Thursday, January 10, 2019 - link
I would like to remind you that when interest in 4K began, there were cards like the 980 Ti and the Fury, both unable to cope with such a resolution. Did you ever write a single sentence arguing that 4K was a gimmick, useless to most people, because it was too expensive to support?
You may know that if you want to get to a point you have to start walking towards it. If you never start, you'll never reach it.
nvidia started before anyone else in the market. You find it a gimmick move; I find it real innovation. Does it cost too much for you? Yes, but plasma panels also had four zeros on their price tags at the beginning, and at a certain point I could get one myself without going bankrupt.
AMD and Intel will come to the ray tracing table sooner than you think (for AMD, that is the generation after Navi, which was already finalized without the new compute units).
saiga6360 - Thursday, January 10, 2019 - link
Here's the problem with that comparison: 4K is not simply about gaming, while ray tracing is. 4K started in the movie industry, then home video, then finally games. That was a trend the gaming industry couldn't have avoided if it tried, so yes, nvidia started it, but it's not like anyone was that surprised, and many thought AMD would soon follow. Ray tracing in real time is a technical feat that not everyone will get on board with right away. I do applaud nvidia for starting it, but it's too expensive, and that's a harder barrier to entry than 4K ever was.
maroon1 - Monday, January 7, 2019 - link
Wolfenstein 2's Uber texture setting is a waste of memory. It does not look any different compared to Ultra:
http://m.hardocp.com/article/2017/11/13/wolfenstei...
Quote from this review
" We also noticed no visual difference using "Uber" versus "Ultra" Image Streaming unfortunately. In the end, it’s probably not worth it and best just to use the "Ultra" setting for the best experience."
sing_electric - Monday, January 7, 2019 - link
I wish the GPU pricing comparison charts included a relative performance index (even if it was something like the simple arithmetic mean of all the scores in the review). The 2060 looks like it's in a "sweet spot" for performance if you want to spend less than $500 but are willing to spend more than $200, but you can't really tell that from the chart (though if you read the whole review it's clear). Spending the extra $80 to go from a 1060/RX 580 to an RX 590 doesn't net you much performance; OTOH, going from the $280 RX 590 to the $350 2060 gets you a very significant boost in performance.
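For what it's worth, the index being asked for is easy to compute once a baseline card is chosen. A minimal sketch follows, with made-up frame rates standing in for the review's actual scores:

```python
# Minimal sketch of a relative performance index: normalize each card's
# per-game FPS to a baseline card and take the geometric mean.
# The frame rates below are placeholders, not figures from the review.
from statistics import geometric_mean

fps = {
    "RX 580":   {"Game A": 60, "Game B": 45, "Game C": 70},
    "RX 590":   {"Game A": 66, "Game B": 50, "Game C": 77},
    "RTX 2060": {"Game A": 95, "Game B": 72, "Game C": 110},
}

def relative_index(card, baseline="RX 580"):
    ratios = [fps[card][game] / fps[baseline][game] for game in fps[baseline]]
    return round(100 * geometric_mean(ratios))  # baseline card scores 100

for card in fps:
    print(f"{card}: {relative_index(card)}")
```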
Semel - Monday, January 7, 2019 - link
"11% faster than the RX Vega 56 at 1440p/1080p, "A two fans card is faster than a terrible, underperforming due to a bad one fan design reference Vega card. Shocker.
Now get a proepr Vega 56 card, undervolt it and OC it. And compare to OCed 2060.
YOu are in for a surprise.
CiccioB - Monday, January 7, 2019 - link
A GPU born for compute, with 480mm^2 of silicon designed for that, 8GB of expensive HBM, and consuming 120W more, being owned by a chip in the x60 class sold for the same price (and despite not all of the Turing silicon being used in today's game benchmarks, the latter still performs better; let's see when the RT and tensor units get used for other tasks like ray tracing, DLSS, AI and other kinds of effects. And do not forget about mesh shading). I wonder how low the price of that crap has to go before someone considers it a good deal.
The Vega chip failed miserably at its aim of competing with Pascal in the gaming, prosumer and professional markets; now, with this new cut-down Turing chip, Vega completely loses any reason to even be produced. Each piece sold is a drain on AMD's cash, and it will be EOL sooner rather than later.
The problem for AMD is that until Navi they will have nothing to put up against Turing (the 590 launch is a joke - you can't really think a company that is serious about this market would do that, can you?) and will constantly lose money in the graphics division. And if Navi is not launched soon enough, they will lose more money the more GPUs they (under)sell. If it is launched too early, they will lose money by using an insufficiently mature process with lower yields (and boosting the voltage isn't really going to produce a #poorvolta(ge) device, even at 7nm). These are the problems of being an underdog that needs the latest, most expensive technology just to create something that can vaguely be considered decent compared to the competition.
Let's hope Navi is not a flop like Polaris, or the generation after Turing will cost even more, after prices have already gone up with Kepler, Maxwell and Pascal.
Great job with this GCN architecture! Great job, Koduri!
nevcairiel - Monday, January 7, 2019 - link
Comparing two bog-standard reference cards is perfectly valid. If AMD wanted to shine there, they should've done a better job.
Retycint - Tuesday, January 8, 2019 - link
Exactly. AMD shouldn't have pushed the Vega series so far past the performance/voltage sweet spot in the first place.
sing_electric - Tuesday, January 8, 2019 - link
I mean, at that point, then, why bother releasing it? If you look at perf/watt, it's not really much of an improvement over Polaris.
D. Lister - Monday, January 14, 2019 - link
@Semel: "...get a proper Vega 56 card, undervolt it..."
Why is AMD so bad at setting the voltage in their GPUs? How good can their products be if they can't even properly do something that the average Joe weekend overclocker can figure out?
The answer to the first question is: they aren't bad at it. AMD sets those voltages because they know it is necessary to keep the GPU stable under load. So, when you think yourself more clever than a multi-billion-dollar tech giant and undervolt a Radeon, you make it less reliable outside of scripted benchmark runs.
dave_the_nerd - Monday, January 7, 2019 - link
Selfish opinion, but I really would have appreciated a 970 in the graphs, in addition to the 1060. (Only two generations old, same market segment and similar price point.)
CiccioB - Monday, January 7, 2019 - link
Yes, the 1080 Ti is also missing, and it is quite a pity, especially for the compute tests.
Icehawk - Tuesday, January 8, 2019 - link
I wish they would show the 970 in tests too - partially because it was a popular card and most folks wait a couple of cycles to upgrade, and partially because that is what I have :) I would like to upgrade as it struggles at 4K and even 1440p in some of the latest games, but I can’t stomach $500+
poohbear - Monday, January 7, 2019 - link
Uhm, you didn't test its RTX performance? Wasn't that the main point of contention with the RTX 2060????
boozed - Monday, January 7, 2019 - link
Still waiting for real-world tests?
saiga6360 - Tuesday, January 8, 2019 - link
Battlefield V? Crappy game, but it does have ray tracing implemented.
RamIt - Monday, January 7, 2019 - link
This card is worth no more than $199. Sorry Nvidia, your pricing structure keeps me from buying your products from now on.
RamIt - Monday, January 7, 2019 - link
Sorry for the typos. A little bit hammered at the moment, but I certainly mean what I implied.
mkaibear - Tuesday, January 8, 2019 - link
Don't you buy on price/performance then? That seems odd. For the price, this offers great performance.
sing_electric - Tuesday, January 8, 2019 - link
There's a lot of good value at ~$200 (RX 580, 1060 6GB, since prices are being dragged down by the 2060), then essentially nothing worth buying until the 2060 at $350, and then nothing until the 2070. (You could make a case for a Vega 64 on sale for $350, but even then, it's more power-hungry, etc.) So if GPU performance is important and your budget can accommodate a $250-400 GPU, the 2060 is the one to buy. People can complain about $350 being a "high end" price, but the fact is, it's WAY faster than what you get for spending, say, $280 on an RX 590.
Retycint - Tuesday, January 8, 2019 - link
So you think its performance level is not worthy of its price tag? In that case you wouldn't find a single GPU on the market right now that can meet your price/performance standards.
piroroadkill - Tuesday, January 8, 2019 - link
So... NVIDIA's back to their old ways of giving cards too little RAM. Their cards have typically had much less RAM than the ones from AMD. So the 2060 is a tiny bit faster than a 1070 Ti... but loses 2GB of VRAM. Once you start using over 6GB of VRAM, the 2060 is going to lose hard to the 1070 Ti. The price is also too high. A firm "meh".
mkaibear - Wednesday, January 9, 2019 - link
How about identifying a couple of games in which the 1070 Ti "wins hard" over the 2060? I think you'll find that the 6GB of RAM is more than enough to run games at the resolutions that the 1070 Ti or 2060 excel at (fast 1080p or 1440p 60Hz).
Disagree? Provide me with some evidence!
prateekprakash - Tuesday, January 8, 2019 - link
Maybe AIB partners should offer a custom RTX 2060 with 12GB of VRAM by using 2GB GDDR6 chips; that may be helpful down the road...
evernessince - Tuesday, January 8, 2019 - link
So who's going to bet this card doesn't launch at MSRP, like every other Nvidia card this generation? You might as well throw away any price/performance comparison done in this article, as we all know there is going to be an Nvidia tax thrown on top of that price tag. If this is Nvidia's new mainstream card, I hate to see the price tag on their "budget" $200 2050.
catavalon21 - Thursday, March 7, 2019 - link
I am surprised that the answer to your question is "well, actually, many are" available at or about MSRP. The one thing I'm puzzle by is, even though the article states NVIDIA will sell their own cards (as they have and do for many other models), I haven't seen one listed on NVIDIA's site (other than links to AIB cards) since day one.
catavalon21 - Thursday, March 7, 2019 - link
*puzzled* - need more coffee
overseer - Tuesday, January 8, 2019 - link
My sense tells me that the RTX 2000 series will be a short-lived line, and NV shall come up with the next gen (be it called the 3000 series or not) on 7nm later this year. Why bother buying a half-node-like card now?
sing_electric - Tuesday, January 8, 2019 - link
Well, people who are building now, for one. There's still a lot of people who were put off upgrades when mining shot prices through the roof; this is a new entry that's significantly faster than, say, $200 cards. Having said that, I tend to agree with you: 7nm will probably offer significantly better performance, and if you care about raytracing, it's really the games that *started* development after RTX came out that will show a real benefit (instead of seeing a reflection in one pond in one part of one map or something), and those games will be coming out around the time that the next-gen RTX does anyway.
richough3 - Tuesday, January 8, 2019 - link
The reality is that this card, spec-wise, is a replacement for the 1070, but calling it a 2070 would have made the generational performance increase look small versus calling it a 2060. With it being faster than a 1070 at a $30 lower MSRP, it's an okay upgrade from a 1060, although I would have rather had 8+ GB of RAM. I would expect the 2050 Ti to have similar specs to a 1060, although with just 4 GB of RAM.
Storris - Tuesday, January 8, 2019 - link
The RTX 2060 game bundle includes the RTX showcases Battlefield V and Anthem, yet you haven't tested either of those games. What's the point of an RTX review if the RTX doesn't actually get reviewed?
Also, what's the point of a launch, and the day 1 driver, when no-one can buy the card yet?
catavalon21 - Thursday, March 7, 2019 - link
Paper launches are nothing new for either Nvidia or AMD GPUs.
eastcoast_pete - Tuesday, January 8, 2019 - link
My take-home is: the 2060 is a good, maybe even very good, graphics card. Price-performance wise, it's not a bad proposition, if (IF) you're reasonably sure that you won't run into the memory limit. The 6 GB the 2060 comes with is vintage Nvidia: it'll keep the 2060 off the 2070's back even in games that wouldn't require the 2070's bigger GPU brawn, and it gives Nvidia an easy way to make a 2060 Ti model in the near future; just add 2 GB for a full 8. That's my biggest beef with this card: it could have gone from a good to a great mid-upper level card just by giving it the 8 GB of VRAM to start with. Now, it's not so clear how future-proof it is.
TheJian - Tuesday, January 8, 2019 - link
Going to do this in a few posts, since I was writing while reading a dozen or more reviews and piling up a TON of data. I own AMD stock (NV soon too), so as a trader, you HAVE to do this homework, PERIOD (or you're dumb, and like to lose money...LOL). Don't like data or own stock? Move along.
Why are DLSS, RT or VRS benchmarks not shown? They should have been the HIGHLIGHT of the entire article. No HD textures in Far Cry (would a user turn this off before testing it?)?
https://www.youtube.com/watch?v=9mxjV3cuB-c
CLEARLY DLSS is awesome. Note how many times DLSS makes the 2060 run like a 2080 with TAA. 39% improvement he says with DLSS. WOW. 6:29 you see 2060+DLSS BEATING 2080 with TAA. Note he has MULTIPLE tests here and a very good vid review with many useful data points tested. Why can't anandtech show any of these games that use NEW TECH? Ah right, sold out to AMD as a portal site. Same as Tomshardware (your sister site, no dlss or RT there either, just COMING soon...LOL). Note he also says in there, it would be INSANE to do RTX features and not have 2060 capable as it will be the BASE of RTX cards likely for years (poor will just get them next year at 7nm or something for a little cheaper than this year people) kind of how Intel screwed base graphics with, well CRAP graphics integrated so devs didn't aim higher. This is the same with last gen console stuff, which held us back on PC for how long? @9:19, 60% faster than 1060 for 40% more MONEY (in older crysis 3 even). It was 42% faster than RX 590 in the same game. Next game Shadow of the Tomb Raider, 59% faster than 1060, 40% faster than RX590. Out of 11 titles tested it’s better than Vega56 in 10 of them, only far cry 5 was better on vega56 (only because of perf spurt in beginning of benchmark or that one lost too). Beats Vega64 in many too even rebenched with latest drivers as he notes.
@ 14:30 of the vid above Wolf New Collossus with VRS perf turned on vs. 1060 92% higher fps (again for 40% more cash)! Vega56 just died, 64 not far behind, as you get RT+DLSS on NV which just adds to above info. Cheapest Vega on newegg $369, Vega64 higher at $399. Tough sell against 2060 WITH RT+DLSS+VRS and less watts (210 V56, 295 V64, 160 for 2060 RTX - that's bad). Power bill for 50w 8hrs a day is $19 @ .12 (and many places over .2 in USA never mind elsewhere). So double that for V64 at best (less than 8hrs) if you game and have a kid etc that does too on that PC. Easy to hit 8hrs avg even alone if you game heavy just on weekends. You can easily put in 20hrs on a weekend if you're single and a gamer, and again easily another 4 a night during the week. Got kids, you’ll have more people doing damage. My current old Dell 24 (11yrs old Dell wfp2407-hc) uses ~110w. Simply replacing it pays for Gsync, as I'd save the same $19 a year (for a decade? @ .12 watt cost, many places in USA over .20 so savings higher for some) just buying a 30in new model at 50w. Drop that to 27in Dell and it goes to 35w! Think TCO here people, not today's price. So simply replacing your monitor+gpu (say 2060), might save you $39 a year for 5-10yrs. Hey, that's a free 2060 right there ;) This is why I'll laugh at paying $100 more for 7nm with 1070ti perf (likely much higher) with better watts/more features. I know I'll play on it for 5yrs probably then hand it to someone else in the family for another 3-5. I game more on my main PC than TV (htpc), so 1070ti can move to the HTPC and I'll save on the main pc with 7nm more. Always think TCO.
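The power-cost arithmetic above is easy to sanity-check. A quick sketch (assuming the $0.12/kWh rate used above; the other figures are just the scenarios mentioned in the comment) puts a steady 50 W difference at 8 hours a day at roughly $18 a year, in the same ballpark as the ~$19 quoted:

```python
# Quick sanity check of the electricity-cost arithmetic in the post above.
def annual_cost_usd(extra_watts, hours_per_day, usd_per_kwh=0.12):
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

print(round(annual_cost_usd(50, 8), 2))        # ~$17.52/yr at $0.12/kWh
print(round(annual_cost_usd(50, 8, 0.20), 2))  # ~$29.20/yr at $0.20/kWh
print(round(annual_cost_usd(135, 8), 2))       # ~$47.30/yr for a 295W vs 160W card
```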
https://www.ign.com/boards/threads/variable-rate-s...
“The end result is that instead of the RTX 2080 improving on the GTX 1080 by an average of around 25 to 30% in most titles, the 2080 outperforms the 1080 by around 60% in Wolfenstein II using VRS.”
“So in essence, turning the VRS to Performance mode gets you half way between a 1080 Ti and a 2080 Ti, as opposed to basically matching the 1080 Ti in performance.” And again mentions next gen consoles/handheld to have it.
https://store.steampowered.com/hwsurvey/Steam-Hard...
Again, why are you even bothering with 4k at 1.42% usage on steam (125 MILLION GAMERS). NOBODY is using it. Yeah, I call 1.42% NOBODY. Why not test more USEFUL games at resolutions people actually use? This is like my argument with Ryan on the 660ti article where he kept claiming 1440p was used by enthusiasts...LOL. Not even sure you can claim that TODAY, years later as 1080p is used by 60.7% of us and only 3.89% on 1440p. Enthusiasts are NOT playing 4k or even 1440p unless you think gaming enthusiasts are only 5% of the public? Are you dense? 4k actually dropped .03%...ROFLMAO. 72% of MULTI-MONITOR setups are not even 4k…LOL. Nope, just 3840x1080. Who is paying you people to PUSH 4k when NOBODY uses it? You claimed 1440p in 2012 for GTX 660ti. That is stupid or ignorant even TODAY. The TOTAL of 1440p+4K is under 5%. NOBODY is 5%. Why is 4K listed before 1080p? NOBODY is using it, so the focus should be 1080P! This is like setting POLICY in your country based on a few libtards or extremists wanting X passed. IE, FREE COLLEGE for everyone, without asking WHO PAYS FOR IT? Further, NOT realizing MANY people have no business going to college as they suck at learning. Vocational for them at best. You are wasting college on a kid that has a gpa under 3.0 (attach any gpa you want, you get the point). 4k, but, but, but…So what, NOBODY USES IT. Nobody=1.42%...LOL.
MipsToRemove, again lowering qual? Why not test full-on, as many users don't even know what .ini files are?...LOL. I'm guessing settings like this make it easier for AMD. MipsToRemove=0 sets ground textures to max and taxes vid memory. What are you guys using? If it's not 0 why?
“The highest quality preset, "Mein leben!", was used. Wolfenstein II also features Vega-centric GPU Culling and Rapid Packed Math, as well as Radeon-centric Deferred Rendering; in accordance with the preset, neither GPU Culling nor Deferred Rendering was enabled. NVIDIA Adaptive Shading was not enabled.”
So in this case you turn off tech for both sides that NOBODY would turn off if buying EITHER side (assuming quality doesn’t drop SET properly). You buy AMD stuff for AMD features, and you do the same for RTX stuff on NV etc. Who goes home and turns off MAIN features of their hardware. Unless it makes a game UNPLAYABLE why the heck would ANYONE do this? So why test like this? Who tests games in a way WE WOULD NOT PLAY them? Oh, right, anandtech. Providing you with the most useless tests in the industry, “Anandtech”. We turn off everything you’d use in real life, you’re welcome…LOL.
Anandtech quote:
“hidden settings such as GameWorks features” for Final Fantasy and no DLSS. Umm, who buys NV cards to turn off features?
“For our testing, we enable or adjust settings to the highest except for NVIDIA-specific features and 'Model LOD', the latter of which is left at standard.”
WTH are you doing here? Does it speed up NV cards? If yes why would anyone turn it off (trying to show NV weaker?)? I paid for that crap, so I’d definitely turn it on if it is FASTER or higher QUALITY.
TheJian - Tuesday, January 8, 2019 - link
Crap, have to reply to my own post to keep info in order (5.5 pages in Word...LOL). Oh well, I'll just title the 2nd post and on so I don't have to care ;)
https://www.techpowerup.com/reviews/NVIDIA/DLSS_Su...
If someone can test DLSS in EARLY DECEMBER, why can't Anandtech in January? You could at least show it NV vs. NV, with and without, so people see how FAST it is (39% as noted before by the DigitalFoundry youtube vid above). Ah, right, you don't want people to know there is a 39% bump in perf coming for many games, huh? I see why AMD will skip it; it takes a lot of homework to get it right for each game, as the article from techpowerup discusses. Not feasible on $50mil net, maybe next year:
“DLSS is possible only after NVIDIA has generated and sampled what it calls a "ground truth" images—the best iteration and highest image quality image you can engender in your mind, rendered at a 64x supersampling rate. The neural network goes on to work on thousands of these pre-rendered images for each game, applying AI techniques for image analysis and picture quality optimization. After a game with DLSS support (and NVIDIA NGX integration) is tested and retested by NVIDIA, a DLSS model is compiled. This model is created via a permanent back propagation process, which is essentially trial and error as to how close generated images are to the ground truth. Then, it is transferred to the user's computer (weighing in at mere MBs) and processed by the local Tensor cores in the respective game (even deeper GeForce Experience integration). It essentially trains the network to perform the steps required to take the locally generated image as close to the ground truth image as possible, which is all done via an algorithm that does not really have to be rendered.”
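Stripped of the branding, the quoted description is supervised learning against supersampled "ground truth" frames. Purely as a conceptual sketch - this is not NVIDIA's model, data or API, and random tensors stand in for real frame pairs - one training step of that idea looks like this:

```python
# Conceptual sketch only: train a tiny network to push rendered frames toward
# supersampled "ground truth" frames, the supervised-learning idea the quoted
# DLSS description outlines. Random tensors stand in for real frame pairs;
# this is not NVIDIA's actual model, data set or training pipeline.
import torch
import torch.nn as nn

restorer = nn.Sequential(                     # toy stand-in for the real network
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1),
)
optimizer = torch.optim.Adam(restorer.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):
    rendered = torch.rand(4, 3, 64, 64)       # frames as the game would render them
    ground_truth = torch.rand(4, 3, 64, 64)   # stand-in for 64x supersampled frames
    loss = loss_fn(restorer(rendered), ground_truth)
    optimizer.zero_grad()
    loss.backward()                           # the "back propagation" the quote mentions
    optimizer.step()
```

Per the quote, the expensive part (generating the supersampled ground truth and training against it, per game) happens on NVIDIA's side; only the small trained model ships to the user and runs on the Tensor cores at playback time.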
Yeah, AMD can’t afford this on a PER GAME basis at under $50mil NET income yearly. Usually AMD posts a LOSS BTW, 3 of 4 recent years 400mil loss or MORE, 8B lost over the life of AMD as a company, 2018 should be ~400mil vs. NV 3B-4B+ NET per year now (3.1B 2017 + over 4B for 2018 1B+ per Q NET INCOME) .
“With DLSS enabled, the game runs at a nice and steady 67 FPS (a 33% performance increase). That's a huge difference that alone can be the deciding factor for many.”
Agreed, I guess that’s why Anadtech/toms refuse to delve into this tech? ;)
“You can now increase settings on models and textures that would have previously driven FPS down below playable levels. DLSS will in turn bring the framerates back up.” OK, then, it’s great, says he likes quality also stating “pleased”. I’m sure someone will say it’s not as good as 4K native. Well no, the point is 4K “LIKE” quality on crappy hardware that can’t support it ;) As noted in the previous quote. Turn on options that would normally kill your fps, and use DLSS to get those fps back making it playable again. DLSS will always look better than your original res, as again, it’s turning 1080p into 1440p/4k (at 4k, so far in this game, it’s just fps boosting). From what I’ve seen its pretty much 1440p for free without a monitor upgrade, or reasonable 4k “LIKE” quality again, on 1080p or 1440p. Also enables playable 4k for some that would normally turn crap down to get there or not play at all.
I could go on, but I can’t even be bothered to read the rest of the article as I keep having to check to see if benchmarks include some crap that makes the data USELESS to me yet again. You should be testing AMD best advantages vs. NV best advantages ALWAYS unless it changes QUALITY (which would be like cheating if you lower it for better perf). IE, turn on everything that boosts speed for BOTH sides, unless again, QUALITY is dropping then turn it off. USERS will turn on everything unless it HURTS them right? 8:02 in that youtube vid above, he gains 3-4% PER move up in adaptive shading settings. This lets the 2060 best 1080 by max 15%. Besides, there are so many OTHER reviews to read that maybe didn’t do all the dumb things pointed out here.
https://videocardz.com/79627/nvidia-geforce-rtx-20...
I did skip to the end (conclusions on gpu reviews are usually ridiculous). Not quite mainstream, but that mark has moved up, just ask Intel/NV (hedt, and top 2080ti selling faster than lower models, might change with 2060 though). “The biggest question here isn't whether it will impact the card right now, but whether 6GB will still be enough even a year down the line.”...LOL. Uh, I can say that about EVERY card on the market at some point right? <15% of 125mil steam users have 8GB+. Don't you think game devs will aim at 85% of the market first (maybe a HD texture pack pushes a few over later, but main game aims at MAIN audience)? This is like consoles etc mentioned above holding us back, intel igpu doing the same. 6GB will too probably I guess.
"Now it is admittedly performing 14-15% ahead of the 8GB GTX 1070, a card that at MSRP was a relatively close $379"...LOL. Anything to take a shot, even when the conclusion is asinine as you're not even talking what happens NEXT YEAR when all the games are using DLSS+RT (either or both), and maybe VSR which got 92% boost vs. 1060 here. That is a LOT more than 15% over 1070 too! FAR higher than 59% you quote without using it right??). I'd guess 20-30mil will be sold from RTX line in the next year (NV sells 65-70% of the discrete cards sold yearly, surely 1/2 of NV sales will be RTX after 7nm), and much of those will be 6GB or less? No doubt the 3050, next year or whatever will then have 6GB too and sell even larger numbers. If you are claiming devs will aim at 15% of the market with 8GB, I will humbly guess they will DEFINITELY aim at 6GB which IS already 11% (same as 8GB % BTW), and will double in the next year or less. Most are on 1080p and this card does that handily in everything. With only 3.5% on 1440p, I don’t think many people will be aiming at 8GB for a while. Sure you can max something the crap out of a few to CAUSE this, but I doubt many will USE it like this anyway (you benchmark most stuff with features disabled!).
There are ~92-100mil discrete cards sold yearly now (36% of ~260-280mil pcs sold yearly), and 70% currenly are NV cards. How many 6GB cards OR LESS do you think will sell from ~62-70mil NV gpus sold in 2018? AMD might release a 6GB in the next year or two also. Navi has a 4GB at under $200 (rumor $129, I think higher but…), so plenty of new stuff will sell under 6GB. Hopefully 2050 etc will have 6GB of GDDR5x (cheaper than GDDR6 for now) or something too too get more 6GB out there raising the 40% on 2GB/4GB (~20% each, great upgrade for 2GB people). Your site is such a fan of TURNING OFF features, or graphics DOWN to play 1440p/4k, I don't get why this is a problem anyway. Can't you just turn off HD textures like you already do to avoid it (or something else)? Never mind, I know you can. So something to revisit is just hogwash. People can just turn down one or two things and magically it will be using 6GB or less again...LOL. Again, 6GB or LESS is 85% of the market in gamers! See steam. How many games hit 8GB in your benchmarks? LOL. You test with stuff OFF that would cause an 8GB hit (hd textures etc) already. What are you complaining about here? The card doesn't stop functioning because a game goes over 6GB, you just turn something down right? LOL.
“ What makes the $350 pricing at least a bit more reasonable is its Radeon competition. Against RX Vega at its current prices the RTX 2060 (6GB) is near-lethal”
So is it “a bit more reasonable” or LETHAL? LETHAL sounds VERY reasonable to anyone looking at AMD before July if navi even hits by then and they don’t have RT+DLSS AFAIK, so worth something with the numbers from the youtube guy at digitalfoundry. His results testing the NEW features (well duh anandtech) are quite amazing.
“That driver is due on the same day of the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD’s FreeSync ecosystem advantage.”
Uh, no, it DEFINITELY ENDS IT AS OF NOW. It is no ADVANTAGE if the other guy has it on all current cards right? It’s just a matter of checking the top ~100 freesync monitors for approval as NV will surely aim at the best sellers first (just like game devs vs. 85% of the market). Oh wait, they tested 400 already, 12 made it so far, but you can turn it on in all models if desired:
“Owners of other Adaptive-Sync monitors will be able to manually enable VRR on Nvidia graphics cards as well, but Nvidia won't certify how well that support will work.”
So maybe yours works already even if NOT on the list, just a matter of how well, but heck that describes freesync anyways (2nd rate gen1 at least, gen2 not much better as AMD isn’t forcing quality components still) vs. gsync which is easily the best solution (consider TCO over monitor life). You didn’t even mention DLSS in the conclusion and that is a MASSIVE boost to perf, netting the hit from RT basically. But you guys didn’t even go there…ROFLMAO. Yet again, you mention the 6GB in the final paragraph…LOL. How many times can you mention a “what if >6GB” scenario (that can simply be fixed by turning something down slightly or OFF like HD textures) vs. IGNORING completely DLSS a main feature of RTX cards? A LOT, apparently. Even beta benchmarks of the tech are AWESOME. See digitalfoundry guy above. He and the techpowerup VSR/DLSS info both say 32%/33% gain turning either on. You don’t think this should be discussed at all in your article? That is MASSIVE for EITHER tech right? As techpowerup notes in their DLSS article:
“Our RTX 2080 Ti ran at 50 FPS, which was a bit too low for comfort. With DLSS enabled, the game runs at a nice and steady 67 FPS (a 33% performance increase). That's a huge difference that alone can be the deciding factor for many.”
OK, so 2080ti shows 33% DLSS, and 2060 shows 32% at digitalfoundry on VSR. Seems like even patched in games will do it for this perf increase. Why would a dev ignore ~33% increase across the board? AMD better get this tech soon as I totally agree this would persuade MANY buyers alone. Note DF did their talk of this at 1080p, techpowerup did it at 4k but not as much value here IMHO, just making playable from unplayable really, but lower res add 4k LIKE LOOK too.
TheJian - Tuesday, January 8, 2019 - link
Screw it, all in a row, doesn't look bad...LOL. 3rd and final (assuming the post takes, each grows):
https://devblogs.nvidia.com/turing-variable-rate-s...
How can you ignore something that is SIMPLE to integrate, and will be adding VSR plugins for game engines soon. Do you even do much work if NV includes it as a plugin? I doubt it. Also note, the more COMPLEX your pixel shaders are, the MORE you GAIN in perf using the tech. So your WHAT IF scenario (games always needing more stuff) works in reverse here right? Games will NOT pass this up if it’s easy to add especially with a plugin in game engines. But you guys didn’t even mention a 33% perf add and as he also noted, when GTX 980 launched it wasn’t a massive improvement, but over it’s life “as maxwell matured, it left previous gens in the DUST” with driver updates!
https://www.youtube.com/watch?v=edYiCE0b8-c
For those who want to know VSR tech. This vid is Dec4…LOL. A month later Anandtech has never heard of it. Also note the info regarding handhelds, as VRS tech really helps when your gpu resources are stretched to the limit already (think Nintendo switch etc). Note Devs have been asking for this tech for ages, so he thinks next consoles will support it.
https://hothardware.com/reviews/nvidia-geforce-rtx...
Discussion here of Content Adaptive/Foveated/Motion Adaptive/Lens Optimized & VRS as a whole etc shading. NVIDIA claims developers can implement content-based shading rate reductions without modifying their existing rendering pipeline and with only small changes to shader code. This is HUGE for VR too as you can avoid rendering pixels that would be DISCARDED anyway before going to VR headset.
“Turing’s new Variable Rate Shading techniques, as well as its more flexible Multiview Rendering capabilities will take time for developers to adopt, but the net gain could be over a 20 percent speed-up in graphically rich scenes and game engines, but with comparable or higher image quality as a result of these optimizations.”
Actually we’ve already seen 33% in Wolfenstein right? So he’s a little low here, but point made.
https://developer.nvidia.com/vrworks/graphics/vari...
“Variable Rate Shading is a new, easy to implement rendering technique enabled by Turing GPUs. It increases rendering performance and quality by applying varying amount of processing power to different areas of the image.”
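As a toy illustration of the content-adaptive idea in that description (this is not NVAPI or any real graphics API, and "detail" here is just per-tile luminance variance, a stand-in metric): tiles with little detail get a coarser shading rate, detailed tiles keep the full rate.

```python
# Toy illustration of content-adaptive shading-rate selection, not NVAPI or any
# real graphics API. Flat tiles get a coarse 2x2 rate (a quarter of the shading
# work); detailed tiles keep the full per-pixel rate.
import numpy as np

TILE = 16  # tile size in pixels

def shading_rate_map(luma, threshold=0.01):
    rates = np.empty((luma.shape[0] // TILE, luma.shape[1] // TILE), dtype=int)
    for ty in range(rates.shape[0]):
        for tx in range(rates.shape[1]):
            tile = luma[ty*TILE:(ty+1)*TILE, tx*TILE:(tx+1)*TILE]
            rates[ty, tx] = 1 if tile.var() > threshold else 2  # 1 = full, 2 = 2x2
    return rates

frame_luma = np.random.rand(128, 128)  # stand-in for one frame's luminance
frame_luma[:64, :] = 0.5               # a flat region, e.g. sky or a smooth wall
rates = shading_rate_map(frame_luma)
saved = (rates == 2).mean() * 0.75     # 2x2 tiles skip 3 of every 4 shader invocations
print(f"~{saved:.0%} of pixel-shader work avoided in this frame")
```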
Ahh, well, only a 32% boost feature (netting 92% boost over 1060…LOL), so who cares about this crap, and never mind maybe more too depending on complexity as noted before. But AMD doesn’t have it, so turn it all off and ignore free perf…LOL.
https://www.tomshardware.com/reviews/nvidia-turing...
Tomshardware knew what it was at 2080 review in SEPT and noted THIS:
“But on a slower card in a more demanding game, it may become possible to get 20%+-higher performance at the 60ish FPS level. Perhaps more important, there was no perceivable image quality loss.”
OK so why did they turn it all off in 2060 review?
https://www.tomshardware.com/reviews/nvidia-geforc...
“To keep our Wolfenstein II benchmarks fair, we disable all of the Turing card's' Adaptive Shading features.”
ER, UM, if QUALITY is NOT lost as they noted before, WTH would you turn it OFF for? Is it UNFAIR that NV is far faster due to BETTER tech that does NOT degrade quality? NO, it’s AMD’s problem they don’t have it. ALL users will turn EVERY feature on, if those features do NOT drop QUALITY right? Heck, ok, if you’re not going to test it AGAINST AMD, then at least test it against themselves, so people can see how massive the boost can be. Why would you ignore that? Ah right, sister site just as bad as anandtech…ROFL. Acting like one company doesn’t have features that they are CLEARLY spending R&D on (and REMEMBER as noted before devs wanted this tech!), just because the OTHER guy hasn’t caught up, is DUMB or MISLEADING for both sites that have went down the toilet for reliable data as you’d use your product. It only counts if AMD has it too…Until then, THESE FEATURES DON’T EXIST, we SWEAR…LOL. The second AMD gets it too, we’ll see Toms/Anand bench it…LOL. Then it won’t be “turned off turings adaptive features”, it will be “we turned on BOTH cards Adaptive features” because AMD now won’t be left in the DUST. They screw up their conclusion page too…LOL:
“No, GeForce RTX 2060 needs to be faster and cheaper than the competition in order to turn heads.”
Uh, Why do they have to be CHEAPER if they are FASTER than AMD?
“Nvidia’s biggest sin is probably calling this card a GeForce RTX 2060. The GeForce GTX 1060 6GB launched at $250.”
Uh, no, it started at $299 as founder’s edition just as this one is called that (by PCworld, Guru3d etc), so again, expect price drop once NV is done selling them direct
https://hothardware.com/reviews/nvidia-geforce-rtx...
“GeForce RTX 2060 - Taking Turing Mainstream”
I guess, some think $350 is mainstream…LOL.
https://www.pcworld.com/article/3331247/components...
As PCWorld says, “Not only does the card pack the dedicated RT and tensor core hardware that gives RTX GPUs their cutting-edge ray tracing capabilities, it trades blows in traditional game performance with the $450 GTX 1070 Ti rather than the $380 GTX 1070.”
https://www.pcworld.com/article/3331247/components...
“The only potential minor blemish on the spec sheet: memory capacity. The move to GDDR6 memory greatly improves overall bandwidth for the RTX 2060 versus the GTX 1060, but the 6GB capacity might not be enough to run textures and other memory-intensive graphics options at maximum settings in all games if you’re playing at 1440p resolution.”
OK, so according to them, who cares, as 1440p cards have 8GB, and is only 3.5% of the market anyway…LOL.
NV’s response to why 6GB “Right now the faster memory bandwidth is more important than the larger memory size.” They could have put cheaper 8GB of GDDR5, but they chose faster rather than more to hit $349 (and $300 next month probably from other vendors that are NOT founders model). Though they think maybe all will be $349 it seems.
“We focused our testing on 1440p and 1080p, as those are the natural resolutions for these graphics cards.”
LOL, agreed…Why anyone tests 4K here with ~1.5% is dumb.
“We use the Ultra graphics preset but drop the Shadow and Texture Quality settings to High to avoid exceeding 8GB of VRAM usage”
Ahh, so PCWorld proves Anandtech is misleading people like cards just die if you hit 8GB, nope, you turn something down in a game like Middle Earth Shadow of War…LOL. Anandtech acts like this is a SHOWSTOPPER. OMG, OMG…LOL. Hmm, 18 degrees lower than Vega64, 8 below vega56, ouch.
Should all make sense, but it is Almost 10am and I’ve been up all night...LOL. No point in responding anandtech, that will just end with me using DATA to beat you to death like the 660ti article (used ryan's own data against his own conclusions, and tons of other data from elsewhere to prove him an outright liar), which ended with Ryan and company calling me names/attacking my character (not the data...LOL), which just looks bad for professionals ;) Best to just change how you operate, or every time a big product hits and I have time, boom, data on how ridiculous this site has become since Anand left (toms since Tom left...ROFL). I love days off. Stock homework all day, game a bit, destroy a few dumb reviews if I have time left over. :) Yeah, I plan days off for "EVENTS", like launches. Why does my review seem to cover more RELEVANT data than yours? ROFL. I own AMD stock, NOT Nvidia (Yet, this year…LOL, wait for Q1 down report first people IMHO). One more point, Vulkan already has VRS support, PCworld mentions they are working with MSFT on DX support for VRS but “Until then, it'll expose Adaptive Shading functionality through the NVAPI software development kit, which allows direct access to GPU features beyond the scope of DirectX and OpenGL.” OH OK ;) Back to reading OTHER reviews, done trashing this useless site (for gpus at least, too much info missing that buyers would LIKE to know about perf).
PeachNCream - Wednesday, January 9, 2019 - link
TL;DR
boozed - Friday, January 11, 2019 - link
JTFC...
LSeven777 - Tuesday, January 22, 2019 - link
It's a shame you don't use all that energy to do something useful.
El Sama - Tuesday, January 8, 2019 - link
This company needs to be brought back to reality (where are you, AMD?); at this rate we will be seeing a $500 RTX 2260 in a few years.
richough3 - Tuesday, January 8, 2019 - link
I would wait a few generations before purchasing a video card for its ray tracing abilities. The main reason I would buy any high-end card is for FPS, because in competitive gaming FPS is king, so any visual quality feature would be at minimum settings or turned off anyway.
nunya112 - Tuesday, January 8, 2019 - link
AMD is about to have its best year for video cards, mark my words! NVidia have launched a product that, while awesome on paper and technology-wise, has physical hardware that is not capable of using that technology - I'm referring to RTX; it's too slow. And NV is asking a premium for it. NVidia has made AMD's job very easy.
All AMD has to do is provide good cards MUCH CHEAPER and they will own at least 50% of the GPU market.
webdoctors - Tuesday, January 8, 2019 - link
Yo, how much do you get paid to post? Can I do it too from home? We can split the referral bonus. Make me an offer, could use quick cash.
nunya112 - Tuesday, January 8, 2019 - link
I bought a 580 8GB for 199 Australian dollars just before Xmas. That was because they all had excess supply again from all the miners - well, we all know what happened there! It crashed faster than a meth head would on a Sunday afternoon after being up for 3 days, and they all had ROOMS full of video cards. Knowing one of the bigger etailers' bosses personally: they had so much excess stock they were going to throw out low-end cards because they didn't have room for more expensive products in the warehouse! For instance, they had 6 pallets of Gigabyte Aorus 580s, and the next day 590s were being delivered. Luckily, when they dropped to $199 AUD a lot of people picked them up, and they still made a profit. The moral here is that AMD needs to compete with NV by selling for next to nothing: set their own low prices that they still make 15% on and spank NV.
Sherlock - Tuesday, January 8, 2019 - link
Couple of points:
1) Nvidia's strategy - which seems to have backfired - was to hope that ray tracing would take off in a big way, and it put all its eggs in that basket. Nvidia added in specialized hardware to support RT, which has effectively priced it out of the mainstream market, and it has to add in "features" like DLSS to convince the consumer that they are getting a good deal @ $350 for what is essentially a 1080p-class GPU - equivalent performance from "traditional" GPUs is available at ~$200.
2) Ray tracing will not take off until it is supported on the Xbox and PlayStation, as a majority of games are developed with those platforms in mind. Considering AMD is rumored (confirmed??) to supply the CPU/GPU for the next-gen consoles - unless AMD starts supporting ray tracing this gen - Nvidia has essentially wasted hardware resources and consumer money on something that is completely useless. By the time we have software that uses RT in a meaningful way, these Nvidia cards will be redundant.
nunya112 - Wednesday, January 9, 2019 - link
It won't take off because the current hardware cannot run it very well. It's good on paper and great to show off, but this should have been released next year with a faster chip, because when RTX is on, FPS drops so low.
CiccioB - Wednesday, January 9, 2019 - link
Next year, with the right HW, nvidia will have all the developers ready to create optimized RT games (unlike BF5). You are just looking at the finger, not at what it points at. And if the weapon in AMD's hands is slowing technological progress because they have a monopoly in the console market, well, let me say it: I hope that AMD dies soon.
saiga6360 - Wednesday, January 9, 2019 - link
Foolish. If you think nvidia can bring ray tracing to the masses on its lonesome, then you are dreaming. They have to charge you hundreds and thousands just for that finger, as you call it - that must be the middle finger, because it's not going to get the point across. For anything to become a standard, be it 2K, 4K or adaptive sync, it needs to be affordable, and for that to happen AMD and Intel need to get on board.
808Hilo - Friday, January 11, 2019 - link
Nvidia marketing fantasy to counter another nonsensical product... the 580. Just build one top card and sell many. Then build another one. As it is, we seem to be stuck at 1080 performance four years in a row. Lame.
Ravenmaster - Saturday, January 12, 2019 - link
It's a 1070 Ti.
hotsacoman - Tuesday, January 15, 2019 - link
Basically... the 2060 is the ultimate Ultra-Quality, 1080p, 60 fps card for 2019 @ $350.
karakarga - Tuesday, January 15, 2019 - link
Do not buy the 20xx series right now. Why? Because PCI Express 4.0 is coming with AMD, and probably soon with Intel. Wait for the 21xx from nVidia instead. The scores are not really exciting, and nVidia must jump to 7~10nm, which can provide more transistors at lower TDP. An RTX 2060 card with 160 Watt consumption is really high!
I have played this waiting game for just 2 years. Now the prices have doubled. Moore's Law is dead. Long live Rock's Law! :P