55 Comments
Ryan Smith - Tuesday, December 1, 2020 - link
And yes, a review is forthcoming. =)
SSNSeawolf - Tuesday, December 1, 2020 - link
I've kept the faith and chosen to wait for the Anandtech reviews. Hope this means all is well.
Kurosaki - Tuesday, December 1, 2020 - link
YAYYYYYYYY! :D
Mr Perfect - Tuesday, December 1, 2020 - link
Bless you. I've read other sites' reviews of the 30 series and RDNA2 cards, gotten to the end and thought, "That was two pages. Where are the charts? Where's the architecture breakdown? Where's the review?"
yeeeeman - Tuesday, December 1, 2020 - link
Techpowerup.com
TEAMSWITCHER - Tuesday, December 1, 2020 - link
You might as well take your time - no one will be able to buy this product either. The PC DIY industry is hopeless right now. The only parts available are the parts you shouldn't buy, because they have been replaced by better parts that you cannot find in stock anywhere.
I'm having a hard time with this. Why should I reward AMD or Nvidia by purchasing a product months after the greasy-faced YouTube kiddies all got them for free? They took no real action to stop the scalpers and digital currency miners. They even refuse to set up something as simple as a waiting list, so you can go on with your normal life and just wait for it. They want all of us banging away at online retailers ... hopelessly ... trying to find a product that for all practical purposes DOES NOT EXIST.
twtech - Monday, December 14, 2020 - link
It's all tied to the Fed's pumping of trillions of dollars into the economy. Stock prices, housing prices, stockpiling of $1,500 graphics cards, a 20% decline in exchange rates vs. the Euro - all because the Fed dumped several trillion dollars' worth into the market over the course of a couple of months this summer.
YB1064 - Tuesday, December 1, 2020 - link
Here is the conclusion: There is no need to upgrade if you own a 1080Ti.
Samus - Sunday, December 6, 2020 - link
Well, a 3070 for $500 (provided you can find one for that price) is still a compelling upgrade from a 1080Ti, especially if you want ray tracing (and if you are playing 2077, you will).
Spunjji - Tuesday, December 15, 2020 - link
This ain't looking so good now that we know Cyberpunk runs like a three-legged dog on the 3070 with RT on.
bearxor - Wednesday, March 2, 2022 - link
Is it tho?
Otritus - Tuesday, December 1, 2020 - link
Corrections: listed as RTX 3060 in the chart instead of 3060 Ti. In the body, the 3060 Ti is said to have a 220-watt TDP instead of 200 watts.
PeachNCream - Tuesday, December 1, 2020 - link
A 10% power consumption reduction seems like a less optimistic, limits-pushing rating than the rest of the RTX 30-series cards carry, given the performance relative to the higher-end models. I almost think NVIDIA is backing off on official TDP for the higher-end cards, since that got a bit of negative attention already.
Pneumothorax - Tuesday, December 1, 2020 - link
"Launching This Week: NVIDIA’s GeForce RTX 3060 Ti, A Smaller Bite of Ampere For $750from your favorite scalper sites!"
FTFY
Good luck against the scalper bots, folks.... So far 0/6 for:
1. PS5
2. 3080
3. 3090
4. 3070
5. Xbox One X
6. 5900X
Pneumothorax - Tuesday, December 1, 2020 - link
Actually 0/8. Forgot:
7. 6800
8. 6800XT
benedict - Tuesday, December 1, 2020 - link
A 1080p card was $200 a few years ago. It's a bad time for mainstream gamers.
webdoctors - Tuesday, December 1, 2020 - link
What 1080p card was $200? GTX660? Even that couldn't push 1080p for AAA games.
benedict - Tuesday, December 1, 2020 - link
Radeon RX580 was exactly $200 and it still runs perfectly at 1080p.
TheinsanegamerN - Saturday, December 12, 2020 - link
AMD barely made any profit from that card, and it was only sold at that price due to lacking a competitive edge against NVIDIA. You can't expect that to stay forever.
Spunjji - Tuesday, December 15, 2020 - link
True, but by that standard there should be a market for a profitable 1080p / 1440p 60 Hz card in the $250-300 price range. I guess we'll see what comes up.
Retycint - Tuesday, December 1, 2020 - link
Games were also much lower fidelity, had no ray tracing, etc. You gotta remember that the more powerful cards get, the more demanding games become as well. So it is pointless to remain fixated on the resolution alone.
0siris - Wednesday, December 2, 2020 - link
Wow, what a novel concept. Imagine that: expecting something better for the same price after five years of technological advancement.
TheinsanegamerN - Saturday, December 12, 2020 - link
The 3060 Ti is capable of at least 60 FPS in every modern game at max settings at 1440p. This is more than just a 1080p card.
And yes, cards are getting more expensive. GDDR6 memory is more expensive, newer nodes are more expensive, coolers are getting more expensive. The $200 price of older AMD cards was entirely due to them being non-competitive, and it barely made any profit; such prices are unsustainable.
Hrel - Friday, January 1, 2021 - link
Such an apologist. GDDR6 isn't any more expensive than GDDR3 was when that was new. The way process nodes work means they get more chips per wafer for less money on smaller nodes, so what you said is exactly wrong. It wasn't long ago that the 8800GT, a card that could max out every game "even Crysis", was $140.
Truth is, video cards have been getting more expensive because Moore's law is effectively dead, memory improvements only go so far, and fidelity, separate from resolution, is pretty much as good as it's gonna get.
Offloading the work of advanced lighting from developers onto algorithmic hardware is hardly something consumers should be paying for. Let the developers do the work, job security! Not like Ray Tracing hardware being ubiquitous is gonna mean cheaper video games.
darckhart - Tuesday, December 1, 2020 - link
So let me understand this: they needed something below the 3070 and decided to go with a 3060 Ti instead of just a 3060? Don't they usually milk us with the non-Ti first and then drop the Ti later? And anyway, I thought they did away with the whole "Ti" and wanted to use "Super" now?
Yojimbo - Tuesday, December 1, 2020 - link
Aren't you just searching hard for something to gripe about? What's in a name? In any case, what sets a Ti apart from a non-Ti is its market position: Ti is higher than non-Ti. So this being a 3060 Ti implies that nothing is likely to come between it and the 3070, and also that a 3060 should be forthcoming in the future. Regarding "SUPER", that was for a refresh. If they had used Ti it would have been confusing, because it wouldn't have meant what Ti had always meant. I suppose they could have just introduced one 2080 SUPER card at the top, charging the price of the previous 2080, then dropped the price of all the other cards on down the line. But the specs were a little different, and I'm sure NVIDIA decided it was more advantageous to announce a refresh than to try to hide one. Plus, if NVIDIA had called it the same thing with slightly different specs, they would have been pulled apart by a certain percentage of people.
davide445 - Wednesday, December 2, 2020 - link
Because the RX 6700 XT is coming, they needed something more powerful than the 3060. It's the same reason they rushed to release Ampere pushed to the max on power.
Yojimbo - Wednesday, December 2, 2020 - link
How did NVIDIA rush to release it? The power characteristics of the process or their architecture aren't going to change significantly between when they actually released it and a few months later. And when AMD come out with a 6700 XT they are likely to come out with a 6700 around the same time, which presumably would be a 3060 competitor.
A lot of these decisions probably come down to what's most advantageous when choosing the different die sizes and the cuts to make on each die. Right now the 3070 is the largest and, until this card, the only GPU using GA104. In the past, the x80 and x70 were on the same die (104) and the x60 was on a smaller die (106). But this time the x80 is on the 102 and the x70 is the big version of the 104 GPU. Perhaps the way the yields work out, a bigger cut from the 3070 would waste value at the moment. Perhaps there is more space to fill between the x80 and the x50 this time, so they want an extra SKU in between. Why is the faster one before the slower one? Perhaps again because of how they relate to dies and cuts of dies. These types of considerations are likely more important than any tit-for-tat product launches while the companies are rolling out their new stacks. For example, very often in the past NVIDIA would roll out from the top down and AMD would start in the middle and then move up (e.g., the Radeon 200 and 300 series). You might even say that Polaris/Vega and RDNA/RDNA2 are the result of that same sort of philosophy stretched out further in time. RDNA2 will presumably at some point extend down and replace RDNA, although I haven't seen rumors about the release dates of those lower-tier RDNA2 cards rattling around on the tech rumor sites a whole lot. So presumably sites like videocardz and wccftech don't have much confidence in the sources of the rumors.
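For what it's worth, the yield side of that argument can be sketched with some rough arithmetic. The following is a minimal Python sketch; the die areas, defect density, and wafer cost are illustrative assumptions (the areas are only ballpark GA102/GA104-like figures), not NVIDIA's actual numbers. The point is just how strongly die size drives the number of fully intact dies per wafer, and how many flawed-but-salvageable dies are left over to sell as cut-down SKUs.

import math

# Illustrative assumptions only -- hypothetical figures, not NVIDIA data.
WAFER_DIAMETER_MM = 300        # standard 300 mm wafer
DEFECT_DENSITY_PER_CM2 = 0.1   # assumed random defect density
WAFER_COST = 6000              # assumed wafer cost, arbitrary units

def gross_dies_per_wafer(die_area_mm2):
    # Common approximation: wafer area divided by die area, minus edge losses.
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def defect_free_fraction(die_area_mm2):
    # Simple Poisson yield model: probability a die has zero defects.
    return math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_mm2 / 100)

# Ballpark areas for a big (GA102-like) and a mid-size (GA104-like) die.
for name, area in [("~628 mm2 die", 628), ("~392 mm2 die", 392)]:
    gross = gross_dies_per_wafer(area)
    intact = gross * defect_free_fraction(area)
    flawed = gross - intact
    print(f"{name}: {gross} gross dies, ~{intact:.0f} fully intact, "
          f"~{flawed:.0f} candidates for cut-down SKUs, "
          f"~{WAFER_COST / intact:.0f} cost units per intact die")

Under assumptions like these, the mid-size die yields roughly twice as many intact dies per wafer and still leaves dozens of flawed dies that can be sold as a cut-down SKU, which is one plausible reason a second GA104-based product appears before a smaller die is ready.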
haukionkannel - Thursday, December 3, 2020 - link
They have GA104 now, which is used in the 3070... The GA106 that is used in the 3060 is not in production yet, so it will come next spring.
TheWereCat - Tuesday, December 1, 2020 - link
€520 from the only retailer that had them in stock on launch, for a few seconds.
yeeeeman - Tuesday, December 1, 2020 - link
Such a shame these cards are made on a Samsung process. They could have been much better with TSMC 7nm. Hoping that next gen, Hopper on 5nm, will move back to TSMC.
Yojimbo - Tuesday, December 1, 2020 - link
That's true, but they'd hardly be in existence if they were on TSMC 7 nm. I doubt NVIDIA's next gen graphics cards will be on 5 nm. They've been avoiding the latest nodes more and more as the years go on. More likely is 6 nm.
yeeeeman - Tuesday, December 1, 2020 - link
Many rumours say it will be 5nm, because NVIDIA didn't like the regression in efficiency from Samsung either.
Yojimbo - Tuesday, December 1, 2020 - link
6 nm is from TSMC. And NVIDIA knew what they were getting from Samsung when they signed up, so that rumor makes no sense anyway.
Assimilator87 - Tuesday, December 1, 2020 - link
yeeeeman, you can get a TSMC 7nm nVidia card right now: the A100. It has significantly fewer FP32 units, though, and I'm not sure if it even has RT cores.
Otritus - Tuesday, December 1, 2020 - link
They still don't have to like what they got. Nvidia went with Samsung over TSMC out of pricing and production concerns. After seeing Samsung scale up to mass production, it appears that the trade-off wasn't worth it, so I would expect them to want to go back.
Yojimbo - Wednesday, December 2, 2020 - link
By what reasoning do you conclude this? What is wrong with the mass production of NVIDIA's chips on Samsung's 8 nm process? NVIDIA chose cost savings and availability over performance and power efficiency. And, given their architectural lead on their competitors and the volume they need to supply in order to maintain their market share, it seems to have been a smart choice. I doubt they are unhappy with it. Would they like the latest process node with as much capacity as they ask for and at a low price? Yes, of course. But that doesn't exist. AMD seem to be having great difficulty supplying their latest graphics cards right now, whereas NVIDIA seems to have a strong supply. The 3080 has already appeared in the November Steam hardware survey. I believe the 2080 didn't appear until the January list, even though it also launched in the middle of September. What NVIDIA chooses for their next process is anybody's guess.
Spunjji - Tuesday, December 15, 2020 - link
"What is wrong with the mass production of NVIDIA's chips on Samsung's 8 nm process?"1) Power efficiency
2) Yields
"NVIDIA seems to have a strong supply"
This is contrary to all established fact. It's not the weakest, but far from "strong". The 3080 has successfully outsold the 2080's miserable numbers at this stage, but nobody really wanted that card anyway.
Gigaplex - Wednesday, December 2, 2020 - link
Nvidia have already placed their order for TSMC 5nm. This was confirmed back in May.
Yojimbo - Wednesday, December 2, 2020 - link
"Confirmed" how? And orders for what, exactly? NVIDIA currently produce different products on different nodes. Even if the rumor is true, how do you know the 5 nm order isn't for a data center product or an SoC?RSAUser - Wednesday, December 9, 2020 - link
When was this confirmed? Those were rumors of Nvidia negotiating with TSMC, which fell through, as can be seen by Nvidia using Samsung.
Agent Smith - Tuesday, December 1, 2020 - link
Say hello to the 'fleeced' club, any 2080 SUPER card buyers. In the club you will meet all the 2080 Ti buyers and anyone foolish enough to have gone SLI.
Yojimbo - Tuesday, December 1, 2020 - link
Technology marches on. Isn't that the point?
TEAMSWITCHER - Tuesday, December 1, 2020 - link
If they purchased a 2080 Super card... they have a product that actually renders something. A product that you cannot find in stock .. anywhere .. has an average FPS of zero.
Yojimbo - Wednesday, December 2, 2020 - link
Except you can find them in stock. People are getting them somehow. The 3080 has already appeared on Steam's hardware survey.
Pneumothorax - Thursday, December 3, 2020 - link
The only people getting these cards are people who have no job or jobs that allow them to camp out in front of the computer 24/7, have a bot program, or are willing to buy from scalpers....
Yojimbo - Thursday, December 3, 2020 - link
Maybe, but somehow or another they are getting out in the wild. So some people can find them in stock .. somewhere .. unlike what the other guy said.
TheinsanegamerN - Saturday, December 12, 2020 - link
No, you can find scalpers selling them. That is not in stock.
TheinsanegamerN - Saturday, December 12, 2020 - link
No, you can't. You can find them for double the price on eBay, but that is not "in stock". And at that price the 2080 Super is a good deal.
Cobusvj - Wednesday, December 2, 2020 - link
Can anyone explain why this card, which delivers 16.2 TFLOPS, is considered only OK for 1440p, but the Xbox Series X, pushing out 12 TFLOPS, is supposed to run 4K ray-traced graphics for the foreseeable future?
TheinsanegamerN - Saturday, December 12, 2020 - link
The Xbox will be using checkerboarding to reduce rendering pressure, and we ALL know that the consoles will run at reduced fidelity and graphical quality compared to PCs, AND will largely run at 30 FPS.
Not really that hard to figure out.
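A quick back-of-the-envelope comparison makes the same point with numbers. This is a minimal sketch: the resolutions and frame rates are the scenarios being discussed above, and the 0.5 checkerboard factor is an assumed illustration of reconstruction techniques, not a measured figure for any particular game.

# Rough pixel-throughput comparison; all figures are illustrative.
SCENARIOS = {
    "1440p @ 60 fps, native (PC)":         (2560 * 1440, 60, 1.0),
    "1440p @ 144 fps, native (PC)":        (2560 * 1440, 144, 1.0),
    "4K @ 30 fps, native (console)":       (3840 * 2160, 30, 1.0),
    "4K @ 30 fps, checkerboard (console)": (3840 * 2160, 30, 0.5),
}

for name, (pixels, fps, shaded_fraction) in SCENARIOS.items():
    mpix_per_s = pixels * fps * shaded_fraction / 1e6
    print(f"{name}: ~{mpix_per_s:.0f} Mpixels shaded per second")

Under those assumptions, a checkerboarded 4K/30 target shades fewer pixels per second than native 1440p at 60 fps, and far fewer than 1440p at 144 fps - consistent with a 12 TFLOPS console hitting its targets while a faster card is still pitched as a 1440p part.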
Spunjji - Tuesday, December 15, 2020 - link
It's also related to the assumption that everybody runs everything at Ultra. A little tweaking and this card will be way more than "good enough" for 1440p, even at 144 Hz.
Samus - Sunday, December 6, 2020 - link
I suspect AMD will need to launch an entry-level RDNA2 card, or drop the price of the outgoing 5700 XT to $300 as an interim solution, because it isn't a good time for them to have no product in the $400 segment... Everyone buying a holiday PC for gaming is going to be focused on this category, especially in OEM systems where a 200-watt card would be the realistic maximum when the stock PSU is 400 watts.
alufan - Monday, December 7, 2020 - link
Having a product is not the same as having a product to sell!
AMD has the 6700 incoming, which is rumoured to be capable of some pretty high boosts with 12 GB of RAM. Unless Nvidia has a boatload of failed/low-yield parts, this will be the same. I wanted a 6800 XT this time round but gave up in the end and managed to get a 3090 for list price from a store, but I know folks who paid when the cards launched and are still at position 300+ on waiting lists. I expect the 6900 to be the same, as will the 6700/XT when it launches; demand is just insane right now.
RSAUser - Wednesday, December 9, 2020 - link
That MSRP is not really realistic; the lowest from partners is $420 and most are $450 to $500.
The required cooling etc. makes the margins too slim; only Nvidia themselves can offer this at $400.