Can you think of one reason to make them? I mean, they will be made, as in Vega 10 chips for datacenters, but for gaming they're gonna have much higher ASPs on the 5700.
Yes, Vega 10 is being made for Google Stadia gaming, as they use PRO V340 cards with dual "Vega 56" GPUs. GCN is still better for compute and HPC, I think, but Vega 20 will largely succeed it in HPC. GCN is not going anywhere.
Oh, I do not think Vega 10 is cheaper to make than Navi 10. Yes, the process is mature and cheaper, but the die is almost 2X the size, and once you factor HBM2 and interposer cost into that, the price is largely in the same ballpark.
Navi 10 cannot do a "V340"-style card as easily or as effectively. Google Stadia needed graphics density, and the on-package memory on Vega 10 makes the overall space requirements much smaller. So yes, Vega 10 is likely still being made, and it is itself much cheaper than Vega 20.
What? Where did you get Hawaii from? The V340 uses "Vega 10", which has on-package HBM2 instead of GDDR5. That is a major advantage for space savings when putting multiple GPU packages on the same card.
My point was that Hawaii cards used 512 bits of datapath on a single card, so perhaps two 256-bit Navis can fit.
Regarding density, I don't see your point. From a server's perspective, a PCIe card is a PCIe card, unless it's low-profile, which I don't think it is.
Servers would likely have to use low-profile PCIe or mini PCIe. In either case, most servers are built for memory, I/O, and storage performance and capacity (with a focus on CPU performance as well) rather than for how powerful the onboard GPU is. Most servers are headless and don't even have GPUs. The few that do usually use theirs for interfacing with a terminal.
This does not include servers or server farms built specifically for multi-GPU setups. Usually those are scientific/graphics oriented, or part of the growing niche of bitcoin mining.
Pretty sure I heard Su mention that Navi included (at the "RDNA" level) improvements to compute. We'll know in a few days, of course. (Impatience makes time drag, eh?...;)) I don't usually do this, but *provided I can buy a 5700 XT for either $499 (20th anniversary) or $449 MSRP* I'll be buying one next week. I'm going to be rather ticked off if the prices for the card are grossly inflated! Here's hoping AMD will control this much better than what happened at the RX 480's debut. What makes me shudder a bit is that I just read some days ago that Bitcoin was on the rise again! Stadia servers are likely only using Vega now because Navi simply wasn't available when they began. Should change in a couple of weeks, possibly.
As far as consumer cards go, they've been drawing down inventory from the market for a couple of months now. I don't know if they've been formally discontinued, but they may as well be de-facto done for.
No HDMI 2.1 support, variable-rate pixel shading, or hardware ray tracing? This kinda seems like the Ryzen 1 launch: some promising improvements, but still not enough to really push the envelope. If I had the option between a 5700 XT and an RTX 2070, I would probably still go Nvidia, just because of the feature support.
If the RTX 2070 gets a price cut to $449, then yes, it is the better card, I think. And I am a Radeon fan, so it pains me to say that, by the way. But with hardware DXR, the 2070 would have the better feature set, and Turing is a very good forward-looking architecture, a first for Nvidia in a while. So yes, I would prefer the 2070 at $449, too. And I think Nvidia will do just that to deflate this launch.
How exactly is the tensor cores' potential for other applications a factor when buying a card now? It's the same as telling people to buy AMD because of the compute cores' potential to ray trace, without substance. Show me at least one game first, then talk.
Just noticed page 2. HDMI 2.0b and FreeSync-over-HDMI, but no 2.1. I really want that increased bandwidth for 4K 120Hz with VRR. Guess I will keep waiting.
Please be realistic. This would be too soon for AMD to have ray-tracing support.
In fact, that's probably the reason XBox Next and PS5 aren't launching this year. It takes a long time to design and validate chips, you know? The specs are pretty much baked a couple *years* before launch!
In some areas yes - just as Nvidia was years behind AMD and only finally caught up with original GCN with Pascal. Though not in everything. AMD is still superior in terms of async compute for example.
Nvidia released CUDA back in 2007 when the 8800GTS was king. While AMD might have had GPUs that were better at numerical code since then, nvidia's infrastructure+GPUs made them own the market on GPU computing.
AMD is good at piggybacking on Intel's infrastructure in the AMD64 market. They have a harder time doing the same with nvidia and the GPU market and the narrow focus of where the money can be spent shows. You'd think they could take some of the lessons learned in making the console GPUs and at least catch up to where nvidia was a couple of years ago but apparently that is too expensive.
I don't think wumpus realizes that Intel owns x86 but licenses x86-64 from AMD. Without AMD's "AMD64", Intel wouldn't exist today. AMD's designs for the past 15 years (maybe longer) aren't even based on Intel's designs. While they may once have been a second-tier manufacturer of Intel-based microprocessors, they haven't been for many years.
ATi dwarfed nVidia, and AMD almost bought nVidia back in the day, but they weren't OK with keeping a competent CEO at the time, so the deal fell through. *BOTH* halves of the current AMD were much larger than nVidia, so we have seen with great clarity what they would do if they had every advantage.
The problem for NVIDIA is that when the PS5 and Xbox Scarlett hit the shelves, games will use whatever AMD chooses to accelerate via hardware, and that may or may not work well with NVIDIA's current design.
I suspect AMD has been working on a ray tracing hardware solution similar to Tensor cores or something for some time now. They may even have considered implementing it into Navi for consumer GPUs, but I imagine even if the performance was there, the increase in die size would push cost above the mid-range market these cards are targeting. There may also have been a time factor involved, i.e. perhaps the ray tracing performance itself wasn't quite as good as it needed to be to be viable (and AMD has pretty consistently asserted that they feel the technology isn't quite there yet and will support it when it becomes viable... although I think with some time and additional programming effort, Nvidia is finally starting to show some compelling evidence that it is in fact becoming viable).
These two factors (price and performance) lead me to believe that whatever AMD has been working on will show up in the PS5/Xbox Scarlett hardware, especially since both companies (to my surprise) have mentioned ray tracing support. These are "semi-custom" designs, after all, and hardware ray tracing is a nice checkbox for a console feature. This would also explain why consumer Navi is launching so much earlier than the console hardware (I suspected the consoles would launch first, near the beginning of this year, but I was also under the assumption that they were working with Zen+ due to time constraints, so obviously I was very wrong). Given how everything has played out so far, it's clear that AMD needed to launch something as soon as possible and I suspect that due to price and performance concerns, it just made more sense to launch Navi without it.
I gotta admit, I'm a little disappointed with the lack of HDMI 2.1 though. As someone who often games on a TV, I would hate to spend $450 on a brand new GPU that won't be able to do VRR over HDMI when I finally upgrade my TV (then again, I'm not planning on doing that for a while, but still).
Well, for the first release of a new arch, this is pretty good. More FineWine potential here, plus arch tweaks; see how much they gained with Zen 2. And efficiency-wise, I think the cooperation with Samsung could turn out to be very helpful. Targeting mobile/tablets necessitates this.
It is because it is clocked that way. If you look at the 5700, it is rated at the same 180W, with performance (judging from the slides) very close to the 2070. Of course, that is rough parity when comparing the product cards, not the technological level (because this is all 12nm nVidia vs 7nm AMD).
I see no problem with the lack of HDMI 2.1 support. That's a TV standard, and it can easily be adapted from DP via an adapter. Supporting a new standard would also require a new controller, which makes the card pricier, so there's that. They might release an HTPC version of a card later to support it, but that's not even on my radar, or that of most PC gamers I know.
Ray Tracing and DLSS are worthless on a 2070. At least with AMD you can use their upscaler/sharpener with a less than 1% performance impact and better results than DLSS. I don't see a problem with AMD introducing features that work instead of pipe dreams that suck in implementation like Nvidia's.
Ray tracing is irrelevant at this point. Only one card, the 2080 Ti, can run ray tracing at reasonable speed... and the 2080 Ti costs $1200, so no, it is good that Navi doesn't have it yet! If Navi gets it next year, expect less speed for more money, because ray tracing takes up chip space that could go to pure raw drawing speed...
You'd think that with the money saved on Ryzen over Intel you could afford a decent Turing (or perhaps an older/used Pascal. No real advantage of going full Turing outside the money-is-no-object 2080ti).
I guess that's the difference between fans and fanboys.
Learn what? For anyone who knows the industry, there are plenty of reasons NOT TO BUY NVIDIA, and to support the underdog. And now, with Navi, there are even better AMD options to buy a faster GPU for less.
Well, remember that "SUPER" versions of the RTX cards are about to arrive, so the regular ones will get a price cut. And nowadays, when AMD describes a card as "competitive with XYZ", it should be understood as "~5-10% slower on average than XYZ" (like Radeon VII vs the 2080). So I guess waiting for a price drop on the regular 2070 is a better deal than getting the 5700 XT.
AMD is back in business in the CPU area, but unless we talk about the low-end cards, Red Team does not have impressive products. OK, Radeon VII is impressive if you're a person who does some AI/ML pet projects and likes to game at the same time, but that's a niche.
I'm quite disappointed that there is no "Big Navi" this year. I plan to finally upgrade my 2600K/GTX 970 rig, and while choosing the CPU is easy (the Ryzen 9 3950X is a no-brainer for me), the GPU market right now sucks. I guess I'll go for a used 1080 Ti, which is the best value for 1440p gaming.
"fast" GPU, not faster. Supporting a underdog in a tech industry is the most insane logic i ever heard. Its not like you are shouting at some sports team. lol
No kidding, rewarding a company for producing mediocre products, so they will continue to produce mediocre products. It takes some special thinking to justify that.
And paying a company way too much for its products is better?? It's like saying "keep charging us these insane prices, even though most of us know they are overpriced, because all you care about is your profits, and we will keep buying them at these prices..." phynaz, you are the most close-minded, ignorant person I have seen yet. It seems like you want AMD's video card business to fail, so Nvidia has no competition and can charge even more for their already overpriced products.
Mediocre as a metric depends on many variables. A 40% performance uplift over previous-generation cards, with a nice drop in power consumption and a vastly smaller die, certainly seems less mediocre than a 27% performance increase along with higher power consumption and price increases across the board. The only way you can rationalize these new AMD cards as mediocre is from a pure performance perspective compared to a $1,200 video card. Otherwise, Nvidia's Turing generation is far more mediocre, especially when you consider the price hikes. In fact, it provides worse performance per dollar than the previous-gen Nvidia cards, especially the new Titan and 2080 Ti.
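For what it's worth, the perf-per-dollar comparison here is easy to sanity-check with simple index arithmetic. All the numbers below are illustrative assumptions (gen-on-gen performance and price indexed to 100), not measured benchmarks:

```python
# Rough gen-on-gen perf-per-dollar check. Baseline = previous-gen card at
# performance index 100 and price index 100.
def perf_per_dollar(perf_index: float, price_index: float) -> float:
    """Performance delivered per unit of price, both as indices vs last gen."""
    return perf_index / price_index

# Assumed: AMD's ~40% uplift at a similar price point; Nvidia's ~27% uplift
# with a ~40% price hike. These are made-up round numbers for illustration.
amd = perf_per_dollar(140, 100)
nvidia = perf_per_dollar(127, 140)
print(f"AMD gen-on-gen perf/$ index:    {amd:.2f}")
print(f"Nvidia gen-on-gen perf/$ index: {nvidia:.2f}")
```

Under those assumed numbers, AMD's index lands around 1.40 while Nvidia's drops below 1.0, which is the sense in which Turing delivers worse performance per dollar than its predecessor.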
evernessince, keep in mind phynaz is just trying to justify his purchase of, my guess, a 2080: a card he paid WAY too much for, for little gain over the 1000 series.
"No kidding, rewarding a company for producing mediocre products, so they will continue to produce mediocre products. It takes some special thinking to justify that."
I tend to support some companies rather than others for various reasons. Unfair business practices means quite a lot to me, as does competition in the market. So if one particular company that I prefer not to endorse has a superior product than the other company that I would prefer to endorse, I heavily compare value for dollar and the actual performance difference, and weigh whether or not it truly matters for my particular application.
The end result is one of three things. Either I end up buying the product from the company I would rather endorse, because the value is still there and the performance deficit is minimal, or I wait to see what that same company produces at a later point (because the value is not there and my need to buy is not urgent), or I end up buying from the other company because I have a need and the value is too good to ignore. At the end of the day, I'm not going to spend decent money on a crappy product just because of the company that is selling it.
And I do think even if you're a "fan" of certain companies, it's dumb not to criticize their products when they are bad and urge them to do better, and it's dumb to throw money at products that aren't worth buying. In fact, I think that's the most salient point of the entire conversation. Like you said, if you reward the company for producing mediocre products, they will continue to produce mediocre products.
Having said all that, I don't think Navi is mediocre. I think it's decent. It still lags behind Nvidia on performance-per-watt (although we really need to see some actual testing and real performance numbers to draw any conclusions). IMO it's a much better showing than Vega was, at least in terms of being a consumer card, but AMD obviously still has some work to do to make it better. What's the most disappointing with these cards, IMO, is price, and I think Nvidia has a lot to do with that. I think the XT should be around $399 (maybe a touch more) and the 5700 no more than $349. I think AMD feels that current 20-series pricing justifies these prices, but they're already a year late, Nvidia has "Super" coming, and likely the ability to adjust the entire price stack down a bit. So it'll be very interesting to see how Nvidia responds and what AMD does to correct the situation, which will likely tilt back towards Nvidia's favor as soon as they respond. AMD may be forced to price these cards more competitively because of that.
There's a saying in business, there are no bad products, only bad prices.
Blowers are also more consistent and don't dump heat into your case. If you have a case with poor airflow, a blower will outperform. Blowers are a good choice for a reference card because they will perform equally as well in every PC.
I am really disappointed with the price, like, really disappointed. Honestly they don't really bring a lot new to the table and as far as I can tell with the reference design: 8+6 pin means it isn't really that efficient if it needs that power. And the blower, ugh, just no.
I knew this wasn't for me. I am happy to stay with my Radeon VII + Kraken G12 + Asetek 570LC modification until next year, but I was kinda hoping these RX 5700 XT and 5700 cards would be priced at $349 and $299 respectively. Yes... please don't hate on me. I know AMD isn't a charity and the 7nm process is likely expensive, but this chip succeeds Polaris and I was hoping it would be priced like it. Honestly, I think a custom Vega 56 for £250 is still the champ; these cards are readily available for this price here.
I await Big Navi RDNA+HBM2, or maybe I get an Nvidia 7nm card, but I will not pay more than 699 US / £650 for the card and I will not buy with less than 11 GB of Video Memory.
I agree. The rumors had these things priced WAY lower. If they come out slightly ahead of the 2070, then there's maybe reason for excitement, but I honestly think that the favorable comparison AMD showed off today was all best-case benchmarks. Either way, maybe they have some room to come down, so a price war in the mid-tier may be what they are trying to incite vs Nvidia. (They would probably benefit more from eliminating Nvidia's margins before they drop.)
Oh well, wait for the benchmarks, I guess. I really freaking hope they lose the f$^^#&= blower-style cards for Navi 20. Seriously.
I agree. I bought into the hype for two reasons. Navi was supposed to use a new architecture that could scale well using interposers. If that is true, well, they didn't use it for these cards.
The second reason is that the current generation of cards is still greatly inflated in price due to RTX and tensor cores taking up silicon. I was hoping that AMD could just match Pascal's value, and that would give them an edge. They only got halfway there, though. Perhaps they are making insane margins on these and have room to go down.
Either way, it's half a letdown for me. I'm still waiting for the day when a dollar can buy as much performance as the 10 series did before mining inflated prices.
Same here as far as price is concerned - was hoping for at least $50 less, but if nVidia responds with either faster spec and or lower price RTX 2060 / 2070 then we may still see lower prices across the board with everyone winning, regardless of which brand they prefer.
In the end, we will have to wait for reviews and what else RX5700 contains in terms of features.
AMD needs to keep reasonable margins - at least 45%. They tried to offer stuff at very low prices for a long time, often with significantly better value than Nvidia, and still didn't gain much market share (so volume never offset the lower margin). So I imagine they thought "fuck it" and went for normal pricing this time; same thing for Ryzen 3000. No more charity prices.
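To put the volume-vs-margin tradeoff in concrete terms, here's a back-of-envelope sketch; the unit cost, prices, and volume figures are all made up purely for illustration:

```python
# How much extra volume a price cut must bring in to keep gross profit flat.
# All figures below are invented for the example, not real AMD numbers.
def gross_profit(units: float, price: float, unit_cost: float) -> float:
    return units * (price - unit_cost)

unit_cost = 250.0                                  # assumed cost per card
base = gross_profit(100_000, 450.0, unit_cost)     # sell 100k cards at $450
cut = gross_profit(100_000, 400.0, unit_cost)      # $50 price cut, same volume

# Volume needed at the lower price to match the original gross profit:
needed = base / (400.0 - unit_cost)
print(f"Gross profit at $450: ${base:,.0f}")
print(f"Gross profit at $400: ${cut:,.0f}")
print(f"Units needed at $400 to break even: {needed:,.0f}")
```

In this toy example a ~11% price cut requires about a third more units sold just to stand still, which is why undercutting only pays off if it actually moves a lot of share.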
If Nvidia lowers prices, then AMD will lower as well but there is no reason to aggressively undercut them. I mean.. Look at average gamers - Nvidia releases overpriced shit with often worse value than Pascal and people still buy it. How do you compete with that mentality?
Also, inflation. 300$ now isn't 300$ ten years ago.
Yeah, but I think it's widely accepted that in tech, cash prices don't go up. Maybe they will no longer go down over time as they have in previous decades, but people just don't accept steep price rises for tech. Apple's iPhone sales are dropping for a reason, even with the boom of the Chinese middle class.
"Yeah, but I think it's widely accepted that in tech, cash prices don't go up. Maybe they will no longer go down over time as they have in previous decades, but people just don't accept steep price rises for tech."
That's absurd. Prices do indeed go up, and they have for years. Two decades ago, a high-end consumer graphics card was $299. Even if we adjust for inflation, that's only about $469. That's a far cry from the $1200 that the 2080 Ti commands. Granted, the performance range of graphics hardware is a lot greater now than it was back then, but that doesn't change the fact that people are indeed willing to pay higher prices for better performance.
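The inflation adjustment quoted here is just a multiplication. The cumulative inflation factor used below (~1.57 for roughly two decades) is an assumption for the sake of the example, not an official CPI lookup:

```python
# Quick inflation adjustment for the $299 figure above.
def adjust_for_inflation(price: float, factor: float) -> float:
    """Scale a historical price by an assumed cumulative inflation factor."""
    return price * factor

then_price = 299.0
inflation_factor = 1.57  # assumed cumulative inflation over ~20 years
now_equiv = adjust_for_inflation(then_price, inflation_factor)
print(f"${then_price:.0f} then is roughly ${now_equiv:.0f} in today's dollars")
```

That lands right around the $469 figure, still well short of a $1200 flagship.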
"but that doesn't change the fact that people are indeed willing to pay greater prices for better performance" - maybe you are willing to pay, but not all of us. At some point, there has to be a price that is just too high, and the prices that Nvidia charges for the 2070 and up are at that point. The 2080 and Titan are well past that point.
Margins don't help if you don't have volume. They still need to cover the NRE (design costs) of the boards and those prices aren't helping. It's pretty bad when your competitor can slap on 20% more transistors and pass on the "raytracing tax" on consumers and you can't really compete with those boards.
Except they're being released a year later, and all over the web, and even on this site, people kept saying the AMD cards would be hugely cheaper and people were getting ripped off. Now we find the Radeon VII and other new cards are priced about the same as Nvidia's cards.
You mean all the anonymous kids posting on this site comments section were wrong? We're not going to get cards below cost? INCONCEIVABLE!
And the regular 2060/2070 will be discounted when their "SUPER" versions arrive. So I guess buying a card from the 5700 series at launch is rather pointless...
garbage response is pure garbage. replaces performance argument with sales argument, thinking no one is paying attention. but we already know who you are.
zinfamous, more like a garbage reply from an Nvidia fanboy. As has been mentioned, before final judgement is passed, wait till these cards are reviewed by third parties. And IMO it's in everyone's best interest if these cards do perform as AMD claims. Does anyone want to keep paying Nvidia's inflated prices for their cards? I sure as hell don't. Paying $500 for an entry-level 2060 card is not worth the upgrade from my current 1060. If this doesn't close the performance gap, then I don't think many people will be upgrading their current video cards.
Er, I kind of want to see AMD succeed, but the 2060 is like $349 MSRP. If Nvidia prices are inflated, AMD sure isn't doing much to poke a hole in that bubble this time around. The 5700 is a little faster than the 2060 for $30 more, and the 5700XT is roughly equal to the 2070 for $50 less. They're both very vulnerable to price cuts or bundles from Nvidia right now.
Drumsticks, maybe in the US, but in other countries the prices are much higher. The US price is $349, but in Canada it's $500 (for the least expensive 2060), a top-of-the-line 2080 Ti is $2100, and an RTX Titan is $3500!
phynaz, that comment right there shows how ignorant and blind you really are. These prices are in US dollars; even if ATI hadn't been bought out, they would STILL list their prices in US dollars. Where something is made has NO bearing on its price being in US dollars. These cards are made where? Oh, guess where: Asia. Where do you think TSMC is?
I bought an OC'd GTX 970 for 1500 in local currency. Two generations later, the equivalent OC'd RTX 2070 costs 2750. This is insane. I really, really want AMD to start competing in the mid/high end of the GPU market. Hell, even Intel may be a saviour for the GPU market if they are capable of offering products across the whole spectrum from low-end to high-end.
You know that we can all see you moving those goalposts around, right? A bad argument is a bad argument, squealing and calling your opponent biased when they point that out is a troll's game. Did you learn to argue from Ben Shapiro or something? :/
He started posting fanboy comments before anyone even mentioned AMD vs Nvidia and didn't get enough attention so shortly after that posted even more pointless, non-informative remarks. Still not getting the attention he's craving so maybe he'll just go away.
Vega 64 rarely bested the GTX 1080, unfortunately. Due to supply shortages & cryptomining, it didn't really make up for the shortfall on price, either.
So, yes, I'd say AMD finally has a strong competitor for the GTX 1080 / RTX 2070. At least, for games not using the RTX features.
No, it didn't best the GTX 1080. But it did equal it, which was probably AMD's aim. That made it rather pointless in the gaming marketplace, of course. But cryptominers loved it; it mined ETH 30% faster than a 1080.
I said it *rarely* bested the GTX 1080. There were indeed a couple games, although one was using Vega's Rapid Packed Math feature. I think it came out at the end of last year... perhaps it was an installment in the Battlefield franchise?
Don't stress, dear Mac user. Vega Pro II is faster. Eventually, I'm sure AMD will release a Navi Pro that will replace the Radeon 580 Pro currently set to ship in the Mac Pro base config.
You can thank Nvidia for that one. AMD is just doing the smart thing and trying to earn higher margins rather than relying on volume shipments. I'm sure the prices will come down $100 or more by Christmas.
With all due respect, I am not a "fanboy". This is basic economics. If you price your product too far below that of a competitor, you decrease the perceived value of your product and miss out on higher margins. It's also important to recognize that a Turing refresh is on the way that will tack $50-100 onto the successor of the 2070.
There has been no announcement. There are rumours, which I place firmly in the "I'll believe it when I see it" category. The rumours consist of bumping RAM up by 2 GB and possibly increasing memory bandwidth. I'm not sure either or both would make a great deal of difference to frame rates. The GPUs themselves would be unchanged.
holy shit man. look at your pure garbage responses so far: you brought this nonsense here, and you only. "calling out fanboys" after introducing this garbage.
what throwaway nvidia schwag did your lazy posts pay for, anyway?
So, have anything to say about this $450 ripoff? Three years later and it's $50 less than a GTX 1080. Of course, if you were waiting for another hot-clocked AMD furnace, then I guess you will be thrilled.
The 2070 is currently 15% faster than a 1080 and sells for anywhere from $490 to $550+. No one is paying $500 for a 1080 without getting ripped off. Even if the 5700 XT is only on par with the 2070, it's 15% faster than the 1080 and $40-100 cheaper than the 2070.
The Radeon VII runs ~10C hotter than the 2070 (74C) while drawing 60W more under load. The 5700XT has a 75W lower TDP than VII. So it stands to reason that it will not run much hotter than the 2070, even before we factor in AMD's blower redesign.
And only Nvidia fanboys are happy to pay the prices that Nvidia charges for their cards. But SaberKOG91 is right about the current prices of video cards: THEY kept raising the prices of their cards to where they are now. The sweet spot was around the $300 mark, but now it seems to be between $500 and $600.
Korguz, to be fair, for $300 you now get a lot more AAA FPS now with Nvidia than you ever have before. It’s just that the high end has become even higher.
AMD don’t really have a compelling or even just competing $300 product.
Maybe, but are these the only two Navi cards AMD will release? I doubt that; these could just be the mainstream cards. There could be an RX 5800/5900 series and an RX 5600 series yet to come...
phynaz, and you like paying for Nvidia's overpriced 20 series, which isn't much of an update performance-wise over their own 10 series cards? Sorry phynaz, but Nvidia is more of the rip-off here, not these cards. But I'm going to wager a guess: you have more money than brains?
Phynaz, realistically, since I assume you have an RTX card, how much do you use DLSS and RTX?
With the low number of games supporting it, and the huge performance hit for slightly better lighting and reflection mapping, it's not a killer feature. I bet less than 5% of people who bought these cards use these features on a regular basis...
And the main feature of the RTX series? It's only used in a handful of games, causes a pretty big performance hit when used, and costs too much for that feature. Even the ray tracing fallback driver they released for the non-RTX cards isn't worth the performance hit; I tried it on my 1060. No thanks. Got anything NEW to counter with? Ray tracing won't be usable for another generation or two, and it will probably STILL be expensive...
What's stopping you from buying nVidia when they offer the better price to performance and watt on the free market? No reason to have an aneurysm over the alternative product you can't see yourself buying.
Not a gun - they set the price/performance ladder. AMD is just slotting into that. No sense in under-cutting by a lot - it would just hurt their margins probably more than the additional volume could offset.
There are two likely options here: 1) They don't know how bad their arguments are and have a serious logic impairment, 2) They know exactly how bad their arguments are and have some form of social impairment
Either way, getting angry with them probably isn't going to help. Best to leave them to stew in their own ill-concealed self-loathing.
He's obviously craving for internet attention. Only way to make it go away is to ignore him. He's wasted too much space in this comments section already so don't keep giving him more fuel.
Nobody said Nvidia forced AMD to price higher - Nvidia's pricing decisions opened up room for AMD to follow suit; that's how markets work. You couldn't have made your strawman more obvious. Why bother? If you just want to have dumb fights where facts don't matter, piss off to 4chan.
Haha. This is a funny comment. It's NVIDIA's fault!!
I'm not sure why these cards are so expensive. Maybe it's a statement to their investors. I think Nvidia will release their "super" lineup and then AMD will cut prices rather quickly.
Anyway, 7 nm is new and I'm sure the price will drop as both it & GDDR6 both mature. Remember that Polaris is currently mature and now selling well below its original list price.
Yes, and that's why AMD's balance is so low at the end of the quarter. GPU sales are dragging AMD's quarterly results down, as that division is losing a lot of money relative to the CPU division.
Were/are Polaris and Navi actually that bad power/perf-wise? Or did nVidia hit it out of the park so hard with Maxwell and Pascal that nobody else can catch up?
Either way it sucks for those of us who game, and don't want to pay >$600 for a tangible upgrade from GTX1070 level and/or actually have usable 4K gaming.
Pity the person who wants a good VR rig.
(And no, this isn't nVidia shilling; I'd love to grab another AMD card, but whoever gets me a 4K gaming card for $400 first is gonna win it.)
I think you're onto something. When Nvidia set about to design the Tegra X1, they had to focus on power-efficiency in a way they never did before. When they scaled up to a desktop GPU, this gave them a perf/W edge that ultimately translated into more perf. Just look at the performance gap between Kepler and Maxwell, even though they shared the same manufacturing node!
AMD has taken a couple generations to wise up. It seems they are still on the journey.
Anyone can catch up, if they are willing to afford the cost of redoing an inefficient architecture. In going from Kepler to Maxwell, nvidia deeply redesigned the entire architecture (also making it a bit fatter, so a little more expensive), but they knew that was the thing to do to create a better architecture.
AMD started with GCN in 2012 and is proposing its "Maxwell" in 2019. The technology has advanced, but besides the 7nm process, there are more things they still lack, like all the new features nvidia put in Maxwell, Pascal, and even more in Turing. They have only just started understanding that memory compression is an advantage rather than wasted transistors. They are about 6 years behind from this point of view.
They're definitely not 6 years behind! They introduced tile rendering in Vega, which Nvidia first brought out in Maxwell. So, perhaps more like 2-3 years.
On geometry capacity they are 6 years behind. The same goes for memory compression, which lets nvidia use about 33% less bandwidth, and which obliged AMD to use expensive HBM on high-end cards to avoid enormous, expensive buses on GPUs that are already fatter than the competition's for the same performance. Not to mention the double-projection feature and the acceleration for voxels to better support volumetric lights and effects (which we only see through GameWorks extensions, as no console engine is designed to support them, because AMD has no dedicated acceleration for them and they would result in a slide show).
Underclocking / undervolting experiments have shown that GCN is actually quite competitive in terms of power/perf, *for a given level of performance*. Unfortunately for AMD, Nvidia have been consistently able to hit a higher absolute level of performance, forcing AMD to hot-clock their cards just to keep up.
That is absolutely down to Nvidia hitting it out of the park with Maxwell - they nailed architectural efficiency in a way that has clearly taken AMD some time to catch up on, and they managed it with an architecture that scales up extremely well.
Define tangible. We’re seeing the same slow-down in performance increases that we’ve seen with CPUs.
Pascal and Maxwell in particular were amazing; the GPU equivalent of Core or the Bridge series x86 cores. AMD has caught up on the CPU side, but not so much on the GPU.
4K 60+ FPS on max AAA settings is extremely hard to do. Nvidia have got there, just, but at what price. AMD can't build such a GPU; there's not enough thermal capacity in a PC case for the amount of power such a Navi GPU would need.
Even if we take AMD slides as the gospel truth, 10% better performance than the 2060 for 8.6% more money but with no RTX hardware at all? These parts seem quite a bit overpriced, and that's if the rumors around the 'super' offerings from nVidia are wrong. If they are correct, or even half as good as the claims, these parts will lose in every metric. And that's assuming this isn't another 'overclocks like a dream' or 'poor Volta' moment.
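Taking the figures in that comment at face value, the value math is easy to sanity-check. A minimal sketch (the +10% performance and +8.6% price numbers are the comment's own claims, not verified benchmarks):

```python
perf_ratio = 1.10    # claimed: ~10% faster than the RTX 2060
price_ratio = 1.086  # claimed: ~8.6% more expensive

# Performance per dollar relative to the 2060 baseline (1.0 = identical value)
perf_per_dollar = perf_ratio / price_ratio
print(round(perf_per_dollar, 3))  # ~1.013: essentially flat value, before counting RTX hardware
```

In other words, even on AMD's own numbers the perf/$ is only about 1% better, which is what the "overpriced" complaint is getting at.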
Well, 3 years late to the party, sporting a shiny new coat of paint on last year's performance... I just don't get it. AMD had every advantage for this card, with a significant manufacturing process advantage over Nvidia, and yet still can't beat what Nvidia had out ~3 years ago (1080 Ti). I can only believe that they have stopped trying. They most definitely didn't try the last 3 generations of cards. I mean, they didn't even make an attempt at a card that could perform in games as well as Nvidia's high-end cards. It's a good thing AMD's cards could at least perform well for compute work, otherwise I have no idea how they have stayed in the graphics card business the last 6 years.
Look, I get it, the real sales are in the mid-range products for stand-alone cards, and in the low-end products on the integrated side. But the thing that drives those sales is the high end. The consumers read, see, and hear all about how Nvidia is the fastest, highest-performance, "best" card all over the benchmarks, and all of a sudden Nvidia is equated with better products. Soon customers are buying that laptop or desktop because it has the Nvidia card in it, and there go the sales on integrated and low-end systems from the big manufacturers. It doesn't matter that AMD might compete on price/performance; the brand itself is seen as second class because they haven't been able to compete with Nvidia on the high end in a decade or more now... I was REALLY hoping that this card/generation was going to be something different. 7nm vs 14nm should have been able to blow away Nvidia's performance, yet it can barely match Nvidia's 3rd or 4th fastest cards...
These are mid-range parts. Wait for Navi 20 with the full 64CU and 4096 shaders. Should fall between the 2080 and the 2080 Ti if I had to guess. Unless it is on 7nm+ which might net it some even better performance.
AMD had every advantage...until they fell 6 months behind schedule due to a retape in October. We were supposed to have had Navi 10 in January and be getting Navi 20 now. Instead they are playing catch-up and licking their wounds. Stuff happens. Nvidia aren't really trying to get much faster, so AMD still have time to catch up. Closing the gap is a huge first step.
Nvidia moved it there. AMD are just not going to give up the margins this time around. I'm not happy about any of this. But clearly you are incapable of doing anything other than seeing someone who disagrees with you as a fanboy. Which is ironic because all of your petty little comments make you worse than any fanboy I have ever encountered.
When did Nvidia do that to mid range? You mean the 2xxx cards that have a ton more features than AMDs cards? Guess what, features cost money to implement.
What AMD gave their fans today was a $50 price reduction and a power usage increase over the gtx 1080.
Wait for Polaris. Wait for Vega. Wait for Navi.
The correct answer has always been buy Nvidia now and enjoy!
You mean the consumer cards that Nvidia designed with datacenter features and sold to you by inventing ways of using them that no one cares about? DLSS only exists so that tensor cores aren't worthless to games. RTX only exists to sell high-margin Quadro cards. The 20 series barely improves on the 10 series apart from those features. It'll be years before any of the extended features of the RTX cards are actually made mainstream in games. If Nvidia cared about performance, they would have just scaled up the silicon used in the 16XX cards and gotten a huge boost in gaming performance. Instead they stuck a bunch of RT and Tensor cores onto the die and sold you a workstation card that you can't even take advantage of. So all your old games are barely better and none of your new games can use the new features for a year after launch. And rather than keep pace with inflation, they jack the price up a few hundred dollars over last generation and tell you it's all worth it.
You're just too stupid to see how badly they screwed you over.
You are saying that introducing new features to advance the market beyond the now-old classic rasterization rendering method, with the costs associated with that, is a wrong thing, and that the best strategy would have been packing in more and more transistors to do the same old things just faster?
If this Navi finally has that geometric boost, we will finally see games with more polygons. Finally. Since Kepler, nvidia has been able to support more than twice the number of polygons GCN can, and with mesh shading in Turing they can now support more than 10x. But we are stuck with little more than Wii model complexity due to GCN and game engines/assets developed for consoles.
We are far behind where we could be, just because GCN can't keep up with all the new functionality and technology nvidia has introduced over these years.
Pro tip, CiccioB - If you're starting a comment with something like "So you're saying", you're clearly signalling to anyone paying attention that you either: 1) Didn't understand what the person was saying, or 2) Are trying to deliberately misrepresent what the person was saying
In this case, you're claiming that the poster said creating new features is bad, while assuming that the cost Nvidia have attached to those features is necessary or inevitable.
The truth is that while Ray Tracing will be a great addition to gaming when we have cards that can support it at reasonable performance levels, only the 2080Ti really makes the grade. That card costs significantly more than I paid for my entire gaming laptop. That's *not* a good value proposition by any stretch of the imagination.
Nvidia could have introduced RTX at the ultra-high-end this generation and moved it downwards on the next - at least then we'd still have some good value from their "mid-range" cards. Instead they pushed those features down to a level where they don't make any sense and used that to justify deflating the perf/$ ratio of their products.
Saber's argument is pretty sound - these features only really make sense for AI right now, but Nvidia made a bet that they could get their gamer fans to subsidize that product development for them. It's good business sense and I don't begrudge them doing it, I just begrudge people for buying into it as if it's somehow The Only Right Thing To Do.
I do not understand the "intro" to your worthless comment. There have been statements saying that creating new features is not good because it takes a lot of time for them to be adopted, so it's better to spend transistors accelerating what we have now (which we've only had since someone else introduced new features).
Your statement here is worthless (and clearly written under red fanboyism):
Nvidia could have introduced RTX at the ultra-high-end this generation and moved it downwards on the next [..]. Instead they pushed those features down to a level where they don't make any sense and used that to justify deflating the perf/$ ratio of their products.
They started this generation with big dies to include those new features at whatever level they could, and you've already decided they did it to justify something you just hate, without waiting for the next generation, when the shrink may allow those features to be scaled down to the low-mid level of the market. What you ask for is something achievable in a generational evolution of the architecture together with a die shrink, but your fanboyism in defense of something AMD could not achieve in 3 years (since the launch of Polaris) just makes you state that nvidia is bad because they haven't brought RTX to the mainstream in 6 months.
If RTX is going to take a couple of years instead of an entire console cycle to be adopted, that is also because nvidia wanted to sell expensive cards with those features. You are not obliged to buy them; you can just keep buying crappy GPUs on an obsolete architecture that consume twice the power to get the same work done (but without the new features), if that makes you happy.
It's a free market, and accusing a company of selling a more expensive product in an attempt to move the market ahead (rather than grinding it to a halt as AMD has done since it introduced GCN) is clearly stupid and just shows that you are angry that AMD, even with 7nm and 3 years of development, didn't manage to get where nvidia is, both in terms of features (which is not only RTX) and in efficiency. Because yes, those fat, feature-rich nvidia dies can do more work (even without using the new features) with the same W as the new AMD GPUs at 7nm.
The reality is this: AMD with a PP of advantage can't keep up with nvidia's efficiency and feature list, and this is the real reason we have high prices. Because 7nm is not cheap, and having shrunk GCN to get this Navi performance is another flop that will be paid for when nvidia in turn shrinks Turing to new performance (and feature-rich) levels.
CiccioB, your "statement is worthless" comment.. is also worthless.. as Spunjji is correct.. nvidia COULD have kept RTX to the ultra high end, say Titan and 2080/ti, and then made cards for the 2070/2060 tier that did increase performance for everyone else over the 10 series.. but they didn't.. instead.. they want everyone to pay for the ray tracing development.
" brought RTX to mainstream in 6 months " RTX is NOT mainstream, far from it.. the cards are priced so only those with more money then brains, can buy them. which i assume.. is you Phynaz, due to your constant defending of RTX, and you just need something to justify the price you paid WAY to much for to get an RTX card ...
" this is the real reason we have high prices." WRONG, nvidia put the prices where they are, cause over the last few years.. they keep charging more and more for their cards, when they didnt need to.. but all they were worried about.. was their PROFITS !! look at the comments by nvidia for their earnings call between 2018 and 2019, now that the crypto mining craze is dead... that alone shows nvidia is only worried about profits..
You have a convoluted mind, surely because you are a red fanboy who cannot see the facts. 1. There's really no reason at all to introduce a feature like ray tracing only on high-end cards that are going to be maybe 5% of the market, when it needs a big enough user base to be supported. It would only have been a way to say "hey, we are here, so AMD, think about it as well and catch up with us next generation." 2. nvidia has not put a gun to your head to "make you all pay for the ray tracing development". You are free to buy any other card without RTX and stay in the cheap budget you have. 3. New features have a cost, and it may shock you, but they have to be paid for somehow by the ones who buy those GPUs. But you are a red fanboy, used to cheap, crappy architectures which have not brought a single advancement over the last 10 years, so yes, you may be horrified by the idea that technological advancements have a cost that has to be repaid. 4. At the end of your worthless rant, you have AMD launching a new generation that is a PP ahead yet still can't reach the competition's efficiency and, most importantly, is priced at the same level as the competition without introducing a single new feature (despite the packed math). So now you have to buy expensive crap with no advanced features to get the same performance in classic game engines while still using more W (or, if you want to play with voltages and clocks, the same power but with a PP of advantage.. yes, that's the advancement we were all waiting for!). But don't stress. You can still buy the cheap, power-hungry Polaris crap with no new advanced features that AMD has been selling at a discount since the launch of the GTX 680. That will help AMD improve its balance sheet and have more money to invest in the next generation.
So that next generation, when the AMD chip gets still fatter for RT support, you can again buy cheap GPUs and not pay for the new features, and again help AMD reach, generations later, the features introduced by the competition years before.
and you don't ?? face it.. for the most part.. nvidia priced your coveted new feature out of the hands of most people, and even you must admit that ray tracing on anything but a 2080 is almost useless because of the performance hit.
1: see above 2: stay in the cheap budget ?? um sorry, but maybe you are still living at home with next to no bills to pay, but some of us have better things to spend our money on, like a mortgage, kids, food, etc... none of the people i know have RTX cards, and it's because they can't justify the high prices your beloved nvidia is charging for them... 3. i am a red fanboy ?? tell that to the 4 1060s i own, and the 3 Radeon cards i also own, all in working comps. 4. at least amd has priced it A LOT more affordably, so more people can buy it, without the (for the time being) useless main feature that you can't really take advantage of... but i will guess.. you are an nvidia fanboy who loves to pay for the overpriced cards they have made the last few years... who lives at home, and therefore has more money than brains....
You are a clueless AMD fanboy despite having some nvidia cards. I'm not for nvidia at all costs, and there's no doubt that Turing cards are expensive. But you are just repeating the usual mantra "AMD is better because it has lower prices". The reality is that it has lower prices because it has worse products. In fact, now that they believe they have something better (and we'll see if that's true), they are raising the prices.
The fact that Vega (and Polaris too) is sold at a discount price, so that at the end of the quarter AMD has to cover its losses with the money coming from Ryzen, is not a good thing, even though it is good for your low-budget pocket. It is a sign that the products are so bad that they need very low prices to match their very low value. It's a simple marketing law that AMD fanboys constantly forget. Actually, it is easy to recognize an AMD fanboy (or an ignorant one, which is the same) as they constantly use dumb reasons to justify their preferred company without knowing the real effects of the strategy AMD is using.
On Turing, the high prices are due to the large dies. You are not forced to buy those large dies; you can be happy with your obsolete cheap technology. You think that ray tracing won't be useful for another generation or two. If we were waiting for AMD, we would not have it in 10 years, as they have not been able to bring a single technological advancement in 13 years (that's the launch of the TeraScale architecture by ATI). They just follow, like a dog does its prey. It is easy to skip investing in new things and just discount products to make them appear economically better than they technologically are. Big dies, more W, low price to stay on par with lower-tier products from the competition.
You may use all the red glasses you want to look at how things stand with this Navi, but the reality is summed up in 2 simple points: 1. in 2019, with a completely new PP, they matched Pascal's performance/W; 2. as soon as nvidia shrinks Turing, they'll be left back in the dust they deserve, having presented not one new feature on what is actually a redesign of an obsolete architecture that should have died in 2012 instead of being sold at a discount all these years, making kids like you believe that low price = better products and never looking at the hole it leaves in fiscal quarters.
And then you fanboys constantly talk about a lack of money to do this and that. It all comes down to the same cause: bad products = low prices = low margins = no money.
They know about this, and they are trying to make money before nvidia does its shrink (which will happen when the new PP is cheaper, because nvidia wants money, not your charity) and before Intel comes out with 10nm solutions (which is a bit further out in time, but they will come, and they will regain the market they held before).
CiccioB, news for you buddy: based on your own replies and your constant need to insult people.. you are WORSE, and probably just a young punk kid. the way you keep pushing nvidia and constantly bashing amd makes YOU a fanboy yourself. FYI... navi matched nvidia's midrange 20 series products.. the holy grail you call ray tracing is only viable on a 2080 or above. i could easily buy a 2080, but i won't.. cause i have more important things to spend my money on. just because you have more money than brains and love paying nvidia for their overpriced cards.. that's your choice. must be nice to live at home with probably no bills and no financial responsibility
you're calling me a fanboy.. fine, go ahead.. doesn't bother me, as we NEED competition. look what intel did to the cpu market.. and you said it yourself: " because nvidia wants money ". that's ALL they care about.. MONEY AND PROFITS. " On Turing, the high prices are due to the large dies " - that is part of it, but it's also that there's no reason to price them lower, cause there is nothing really out there as an alternative. again.. look at intel for proof of this.
oh, and CiccioB, not once that i can see have you commented on how bad turing's performance hit is with ray tracing, or how the 20 series isn't that much faster than the 10 series with ray tracing off, for the price you pay. all you seem to be focusing on.. is trying to make nvidia look better in your replies. based on this.. do YOU think the 20 series is a good upgrade over the 10 series ?? if you do.. then you are a blind nvidia fanboy
Spunjji, there is a reason that NVIDIA put RTX in their midrange cards. The reason is that most sales are midrange and lower. The entire issue with RTX is the chicken and the egg problem. Game developers won’t put the effort for raytracing if there is no hardware in the consumers hands that can take advantage of it, and consumers will typically not opt to purchase hardware that is not going to be used by any games. NVIDIA is effectively forcing the current chicken to lay a next generation egg to open the market for new techniques for games by creating a large enough of an install base for game companies to do the math and see that there is an existing market for their raytracing games.
Yes Nvidia could have simply put RTX on their highest end cards and waited. And they would be waiting and waiting and waiting for game companies to actually implement raytracing. A gaming studio won’t invest the time and effort to retool their game engines for a potential consumer base of a few thousand people. However if that potential consumer base is a few hundred thousand, or a million, they will take a look at adding the features.
a lot of good that thinking does.. when most people can't afford the cards, or don't want to pay the cost of entry for those cards. besides.. to make ray tracing usable, you kind of need a 2070, or better yet a 2080, as the performance hit.. is just too great.
Fallen Kell, but the prices nvidia is charging for their " mid range cards " are NOT mid range pricing. they have priced their mid range cards more like entry-level high-end cards.. mid range would be $400 or less..
Reading this comment thread, you must be an egotistical idiot or something.
First off, Ray Tracing needs to be on AMD (and Intel's upcoming XE) hardware before it will be supported industry wide. Why waste tons of money on a feature barely anyone will use?
Secondly, RTX is awesome, no one is denying it. But it's also not a reason I'll buy into an RTX card. sure, I can play Metro Exodus with it if I had that card, but my 980ti is rendering games just fine still, so I don't need to upgrade. I may buy a 2080 as my next card because I do use Moonlight, which is the open source nvidia stream service. But if AMD's Performance at a lower price in games without Ray Tracing is higher, then that'll be my purchase. I've got no allegiances to either AMD or NVIDIA. I got a 980ti because it was the strongest card at the time. ((Plus I watercooled it. It's darn awesome :) ))
if navi supported ray tracing.. was just as slow at it as turing, was on par with a 2080 in everything else, and was still $100 cheaper.. most of those ridiculing amd now... would probably still be ridiculing them ..
You think AMD stated prices to make AMD look bad? I'm honestly confused by your statement, or are you saying after enough backlash they'll be forced to lower prices? Or that if/when the 'super' green parts come out they'll lower the price, before they launch?
If prices go up, it is bad for us; if they go down, it means AMD's product is not the value they wanted to sell it at, and thus they take a cut to their margins (as they have since GCN met Kepler). Sorry, but there's nothing to wait for to see whether the situation gets better or worse. The MSRPs are those announced, and they have to be compared to the competition's at the same performance.
Dude, you need to chill. I’m not even in this fight, I can’t remember the last dedicated video card I bought, but you don’t have to hop on every comment chain and say the same crap.
This is a mid range card. Of course it’s going to be similar performance to the best card from previous gen. It’s how the product cycles work.
I’m so out of it that I had to go look up the pricing on RTX 2070. Looks to be $549 on high end and $499 on low. If this new card is a “few percent” better and priced at $449 it seems reasonable, objectively. The price difference could reflect the feature disparity people are alluding to.
oh, like nvidia has done over the last few years with the pricing of their own cards ? come on, unless you have more money than brains.. do you like the prices of nvidia's cards?? i know a few people who hate what nvidia charges for their cards. THAT is the rip off ....
7nm is quite an impressive technology: 251mm2 with 10.3 billion transistors. Let's put this into perspective. Take a normal hair (0.05 mm thick) and cut it. At the cutting surface you would have about 80,000 transistors.
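That figure checks out as a back-of-envelope calculation. A minimal sketch, assuming a 0.05 mm hair diameter and uniform transistor density across the die:

```python
import math

transistors = 10.3e9      # Navi 10 transistor count
die_area_mm2 = 251.0      # Navi 10 die size in mm^2
hair_diameter_mm = 0.05   # typical human hair thickness

# Average transistors per mm^2, then scale to a hair's circular cross-section
density = transistors / die_area_mm2
hair_cross_section = math.pi * (hair_diameter_mm / 2) ** 2
print(round(density * hair_cross_section))  # ~80,000 transistors
```

Real transistor density isn't uniform (SRAM packs tighter than logic), so this is an average, but the order of magnitude holds.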
V900, not really.. at the moment.. how many games even use ray tracing ?? not worth the price of admission to get an RTX card.... they are WAY too expensive...
and there's definitely no point in buying into RTX with the 2000 series unless you're a developer. With RTX 3000 it might be a selling point, if performance increases enough.
This is the point so many people seem to be missing / are deliberately trying to talk past. RTX 2000 series is basically dev hardware being marketed to gamers.
Spunjji... yep... but the same could be said for most of the new features nvidia and amd/ati have added to their cards over the years.. 1st gen.. dev hardware, 2nd gen, usable performance.. transform and lighting way back when was like that ....
Yawn. These cards are probably comparable to their Nvidia counterparts at 1080p while consuming more power. Same old story. No ray tracing hardware, either. The launch price is going to drop pretty fast.
Navi was probably designed around the consoles so it didn't need to be all that competitive. Consoles are propped up by smoke and mirrors. How else can one convince the public to accept Jaguar CPUs for gaming?
If you look at the standard features, yes they are. But Turing packs a lot of new features that old-gen GPUs don't have, and if those features get used, Pascal cards (and even more so GCN/Navi) will be left in the dust.
By the time a significant number of games come out that make proper use of these features, the current cards will be old news. This is usually true for most new graphics technology (see the GeForce 3 and DirectX 8) but I don't think it's been true to this extent and at such a high cost of entry before. We're in new territory here.
We are in territory where the console monopoly is lowering the technical level the market can reach. If just one of the two consoles were powered by nvidia HW, AMD would already be in the dust and highly regretted by its buyers, while new game engines would exploit all the features nvidia has put in its GPUs since Maxwell. In fact, we have a market where AMD struggles to keep up with nvidia HW even though all the optimizations and choices are made for games to run best on its HW. And we still have the same geometric complexity as 2012.
maybe nvidia is just charging too much to put their GPU tech in a console.. how much would it cost to put even a 2060 into a console ?? i would guess 75% of the cost of the console would just be the GPU....
Maybe because nvidia didn't want to sell at discount prices as AMD did, just to avoid being annihilated in the gaming market? Think about that: two consoles, one with nvidia and one with AMD HW. Now just think which one would have better graphics and features.
or maybe because.. ALL your precious nvidia cares about.. is its PROFITS.. or are you too blind of an nvidia fanboy to see this aspect ??? my guess.. you are... and.. have more money than brains.
" two consoles one with nvidia and one with AMD HW. Now just think which one would have netter graphics and features. " ok.. and how about you think.. which would cost A LOT more then the other one because it has nvidia inside it?? ever think about that ?? as YOU said your self : " New features have a cost " and while its easy to pass that cost on to the consumer with a discrete video card.. that wont work in the console market, but when you think with your wallet.. and not your brain.. this is what happens...
All companies care about their PROFITS, and it may shock you that AMD does too!!! They all sell products at the best price x potential pieces sold at that price, to maximize their returns. Guess what? nvidia can apply a higher price because their products have higher value. AMD has to apply a low price because they sell crap HW that only has value when at a discount.
It's a basic economic law. Now that you have learnt it, you can go back to playing BF5 at 120 FPS, but with only 10 polygons on screen, on your Vega card.
higher price because their products have higher value ?? BS... they charge that because they can, and just want profits. but you don't understand that.. cause you are just a young punk kid with more money than brains, with no financial responsibility, living off mommy and daddy.... when you grow up and get a mortgage, car payments, and kids.. then maybe you will start to understand the value of money and realize spending 1200 or more on a video card.. is not as important as being able to feed your children and provide a roof over their heads, fanboy
Well that's wrong - the features have only been out for months and there are already several games using them extensively, with significantly better visuals as a result. That's a way better adoption rate than most DX versions got. In addition, the early adoption has pushed ray tracing into the next-gen consoles in some form - you can bet that if Nvidia hadn't released RTX, they would have none. Now AMD is scrambling to put something in Navi 2 (or whatever the consoles get) as the console makers are both demanding it.
You can argue the performance isn't there yet (same for pretty well every major new graphics feature) but you can't really argue that RTX hasn't hit the ground running and had a pretty big impact.
Would still pick the 1080 Ti over any Turing, because it makes much more sense, even now. But Nvidia was smart enough to axe it as the only Pascal card. They knew their 2080 was too crappy to leave it on the market.
The regression in perf/$ is simply disgusting. When my 970 gives up I might as well go APU and hope Google Stadia is actually good. Too bad the Ryzen APUs are a full generation behind.
I liked what I saw. In contrast with the $749 Ryzen part (even thought that's revolutionary stuff right there, our wallets are still doomed -_-). Don't take the hype train bait and it'll be twice as difficult to disappoint you. Call me an AMDtard fanboy I don't mind ¯\_(ツ)_/¯
(on the topic of all the megafeaturez: not believing in wide future adoption of all those DLSSes doesn't make a person a fanboy. That's some Elon Muscus level shtick imo. Same way, you'd probably call me a fanboy for bemoaning the low popularity of numerous other -- open-source -- RTG goodies. Looks like AMD decided not to bemoan any longer and go freestyle mode. No matter the performance and competition. Meh? Meh. But product quality isn't always measured in ad banner space and use in pre-builts, winkity wink to the "yarr Vega sux!!!111" gang and RTX's Witnesses. Vega was never bad and is certainly no worse than on launch at its current prices: 56 starting at $200 used, $270 new, 64 $270 used, $330 new. (Lithuania used market, U.K. stores for new units) Don't want Vega or Pascal or Polaris? The world of RTXes, where the 2060 and 2070 just barely, and with a lot of effort, reached 1060 and 1070 MSRP, in select stores only -- while the 2080 (Ti) is still cosmic -- is waiting for you. /offtop)
Disappointed to see that they still can't keep up with high-end NVidia cards. I'm planning to get a new workstation / gaming rig and I would have liked to go full AMD, but for the gaming part I would still want an NVidia card. This leaves me with few options, since I mostly use Linux and boot into Windows for gaming, and there is no good Linux driver for NVidia (only their binary release, which is a pain in the ass to use).
These are my options: 1) Go full Intel (i9-9900KS) + NVidia (RTX 2080 Ti or successor) - excellent graphics support under Linux with the iGPU, excellent graphics performance 2) Go mixed: high-end Ryzen (3950X), get an RTX 2080 Ti and a low-end GPU for Linux, possibly sacrificing a few PCIe lanes in the process
Price wise the latter one is probably more expensive. Or I could wait for Intel to release a new Desktop CPU...
Great suggestion, that might work as well, though not as fast as the RTX 2080Ti it's probably fast enough. I'm wondering about the power use in a Desktop scenario though, that's usually better with the iGPU (and disabling the dGPU).
To be fair, the Radeon VII was never meant to compete against the 2080 Ti, which costs nearly double what the VII does. If you've got the money, go for the Ti; otherwise, make a decision based on what's most important to you, not what an Nvidia or AMD fan tells you to buy.
Money really isn't the object here. It probably makes more sense to go full AMD just to not further support NVidia and their Linux asinine driver policy. I presume one could use two of these in Crossfire mode.
Do keep in mind that the VII has 16GB of 1TB/s HBM2 memory on-board, which will be a huge boon in certain workloads if you're targeting professional workloads. The 2080ti tops out at 11GB of GDDR6 memory, which is half the bandwidth... I think.
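For reference, those bandwidth figures can be sanity-checked from the published specs (a rough sketch, assuming the commonly listed numbers: 14 Gbps GDDR6 on a 352-bit bus for the 2080 Ti, 2.0 Gbps HBM2 on a 4096-bit bus for the Radeon VII):

```python
# Peak memory bandwidth = per-pin data rate (Gbit/s) * bus width (bits) / 8 bits per byte
def bandwidth_gbps(data_rate_gbps, bus_width_bits):
    """Return peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

radeon_vii = bandwidth_gbps(2.0, 4096)   # 16 GB HBM2, 4096-bit bus
rtx_2080ti = bandwidth_gbps(14.0, 352)   # 11 GB GDDR6, 352-bit bus

print(radeon_vii)                 # 1024.0 GB/s (~1 TB/s)
print(rtx_2080ti)                 # 616.0 GB/s
print(rtx_2080ti / radeon_vii)    # ~0.60, so a bit over half
```

So "half the bandwidth" is slightly pessimistic: the 2080 Ti lands at roughly 60% of the VII's peak.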
Latest Ubuntu build has the nVidia binary drivers included, pretty easy to use (although they don't install by default). Also, nVidia binary drivers, at least for gaming, are significantly faster than their AMD counterparts.
HDMI 2.1 hardware is still really expensive and power hungry. Given the dearth of HDMI 2.1 displays and the fact that these cards wouldn't even do 30 fps at 8K, I don't see the lack of HDMI 2.1 being a big deal. DP 1.4 is far more useful for these mid-range cards.
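A rough link-budget sketch shows why 4K 120Hz is the dividing line here (assuming the standard figures: HDMI 2.0 at 18 Gbps with 8b/10b coding, HDMI 2.1 FRL at 48 Gbps with 16b/18b, DP 1.4 HBR3 at 32.4 Gbps with 8b/10b; real timings add a few percent of blanking overhead on top):

```python
# Active-pixel data rate for a given mode, in Gbit/s (decimal).
def video_gbps(width, height, refresh_hz, bits_per_pixel=24):  # 24 = 8-bit RGB
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Effective data rates after line coding:
hdmi_20 = 18.0 * 8 / 10    # 14.4 Gbps  (8b/10b)
hdmi_21 = 48.0 * 16 / 18   # ~42.7 Gbps (16b/18b FRL)
dp_14   = 32.4 * 8 / 10    # 25.92 Gbps (HBR3, 8b/10b)

need = video_gbps(3840, 2160, 120)  # ~23.9 Gbps before blanking
print(need, need < hdmi_20, need < dp_14, need < hdmi_21)
# 4K120 overflows HDMI 2.0, (just barely) fits DP 1.4, fits HDMI 2.1 easily
```

That's why DP 1.4 covers 4K120 on a monitor today while HDMI 2.1 only matters once TVs with it actually ship.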
If AMD is going back to suffixes, it should have at least gone for something more interesting like a palindrome. RX 5775 XR would have been quite cool.
That GPU looks like someone took it to the gym in a gym bag, and accidentally dropped a 45 LB plate on it...; the dented look is horrible, IMO... Fire the entire aesthetics team!
Well, I was let down by a lot of this, given the pricing. I suspect after launch, though, we'll see a pricing war between AMD and Nvidia like in the days of old, and these may wind up at a better price.
Not really, but AMD has calculated prices for these cards carefully (as usual), making it a dilemma to choose between AMD and Nvidia. The 7nm process has no benefit to the user unless we undervolt or underclock these cards. The price differential comes down to whether Nvidia's RTX hardware adds value or not. AMD has created the same old story with their GPUs, unlike the Ryzen CPUs.
Nothing on the new video codec blocks (Video Core Next++)? I hope they slot AV1 support in before the 2020 APUs. I want to buy something that can at least play streaming video without spinning up the fans for the next 10 years, and it looks like Netflix, YouTube et al. want to move off patent-encumbered formats as soon as possible.
It would be a HUGE failure on AMD's part if Navi+ doesn't bring at least AV1 decode acceleration to PS5/Xbox Scarlett. But at least some sort of encode acceleration should be in there too, because game streamers will badly need it.
Without AV1 support, PS5/Xbox Scarlett will not be future-proof basically. You can refuse to believe it all you want, but it's true.
LG has their 8K SM99 75" HDMI 2.1 TV imminent, which sounds like it might only make you choose between it and a motorcycle, not a house, for the price, so I'd really like to see an HDMI 2.1 card to drive it. AMD fail :-(
Keep in mind how small the die is for Radeon VII. People are duped by the inclusion of HBM II. You're not really getting a super-powerful prosumer chip with Radeon VII. You're getting Vega recycled with a small die.
I also think it's lame for people to get too excited over the replacement for Polaris, a midrange product at best that's years old. I wasn't excited about Polaris to begin with. All it was was AMD blasting past the efficiency curve for its LPP on a small die. Big deal.
Anandtech should really highlight die size in these reviews like it used to, to give people better perspective about what they're really getting and how it compares with the past.
These cards are dead on arrival at these prices. AMD just doesn't have the kind of brand recognition that nvidia does these days. These cards aren't cheap enough that anyone is going to go with AMD again.
and how cheap should they be ?? my guess.. they would never be cheap enough... so far.. i know a few people that are interested in these cards.. and are waiting till next month to see how they perform.. why ?? cause nvidia priced them out of the market...
Nvidia RTX were overpriced due to lack of competition at the time of their launch. Performance per dollar no better than the 3 year old GTX 1000s. AMD finally release their new gen and their prices do not significantly undercut Nvidia RTX, especially considering the lack of hardware ray tracing. This is disappointing to see as a consumer. We expected to see AMD drive the GPU performance per dollar upwards in the ways they have done to CPU.
Hixbot, and what if amd did have ray tracing, was just as slow as turing, had the same features, but was priced less than what nvidia charges ?? then what? would those that are harping on amd now, still be harping on them ?? of course.. but now they would be complaining that amd had a year more development time for ray tracing, and the performance is still the same ?? lets face it.. ray tracing isnt really viable performance wise unless you are using a 2080 or higher.. the hit is just too great to make it usable... most of the people i know.. dont care about ray tracing right now.. it adds too much to the cost, and is barely used... none of them even asked if navi had ray tracing... again.. cause with nvidia's pricing, they are priced out of the ray tracing market
I'm not following your point. Nvidia's terrible pricing is not being challenged by AMD. If it were, I think they would get a lot of credit from myself and others, just as they are in the CPU market. Nobody is praising Nvidia here, I was just expecting more value from AMD.
Hixbot, i think amd would still be being harped on, and what i mean is.. they would still be criticized like they are now for not having ray tracing, and cause of the price of the cards.. instead.. i think the common complaint would be.. amd had an extra year to work on their cards, and all they can do is match the performance ?? they suck.. and their cards are still over priced...
So what performance level are you seeing in your games with RTX on using your 2070 and where do you think it should be?
I just played through Quake 2 RTX on a 2060 and I thought it was great. Been playing metro exodus and Tomb Raider with ray tracing on too, not having problems with either of them either. Minecraft isn't my thing but my kids think the ray tracing in that game is great too(that one will run on older hardware).
Part of the disconnect in this conversation is people talking about how bad performance is at 4k ultra with ray tracing when mid range cards can't run these games at those settings without ray tracing anyway.
BenSkywalker i dont have any 20 series cards.. i have a 1060, but going by the reviews.. and word of mouth with those that do have a 20 series card, it seems the performance isnt there. maybe thats it, resolution 1440 or higher.. but it seems if you mention you play @ 1080p, you get made fun of.
I'll say all the people I know IRL that actually play the games that have tried ray tracing on with RTX hardware have thought performance was solid, on a 2060. Now to be fair no competitive shooters were used online, just actual gamers playing actual games. As an example Tomb Raider max settings without ray tracing at 1440p is close to identical performance as 1080p with ray tracing at ultra settings. Now you can run RT at medium 1440p and be quite playable(mid 40s to mid 50s), but between the two settings, double blind, everyone I had compare said the 1080p with ray tracing was much better(full disclosure, I picked a spot with ray traced shadows on screen).
But my tournament level frame rates are down..... As opposed to what image enhancing technology?
Five years from now it'll be a joke that anyone argued against it in the first place, most of those that are, rabidly, will deny they ever did.
BenSkywalker, what games are you trying RT with ?? im not sure.. but tomb raider isnt really all that taxing, is it ? i would assume the " competitive shooters used online " you mention.. may be a lot more taxing.. and the performance may not be all that bearable with RT on, on a 2060. im not arguing against it, but, maybe unlike some, for the price, it just isnt worth it... yet... i am going to assume you are in the US, but for those of us in canada, ( maybe other parts of the world as well ) take your US prices and add 200 at the low end, to around 500 at the top end and decide if the prices are worth it.. 2060's here start at $500 ( when not on sale ), and go as high as $2100. for a lot of those i know.. it just isnt worth it, as we have bills to pay, kids to feed etc...
Metro Exodus, Tomb Raider, Minecraft and Quake 2 RTX. Take Metro Exodus: fps are higher using 100% shader resolution with ray tracing on than with ray tracing off and shader resolution maxed out. Digital Foundry made a video about Quake 2 RTX, really gives a great example of the impact (performance drop is *huge*, so is the visual impact).
So you say it isn't worth the premium, in no way whatsoever would I say that's wrong, I know first hand readily available disposable income isn't always sitting around in large quantities, but what about when there is no premium?
That's what I'm seeing with this launch. The 5700 is more expensive than the 2060, barely edges it in traditional rendering and doesn't give the option to play with ray tracing *at all*.
There's only a handful of games and it's a big performance hit, both completely valid, but the pricing issue is kind of out the window now that AMD has decided to avoid the value position.
BenSkywalker, looking at the req's for Metro Exodus, im a little surprised that it runs that well for you :-) but to then add in a 22??? year old game, that would run ( exaggerating here ) 400 fps on modern hardware, add RT to it, and have it run at " only " 200 fps, is a little moot... what would the performance be, and i dont mean patching in RT support, but if the game was made for RT from the start ? the friends i talked to.. and i think this is part of the reason why they dont think it is worth it to get a 20 series card, yet.. is because with Metro Exodus, Tomb Raider, Minecraft and Quake 2 RTX, they dont play any of those games.. so the money spent for RT and the other features the 20 series brings to the table, wouldn't be used. to compare the 5700 series to the low end 20 series.. is also a little lopsided, as amd is aiming these 2 cards well above that, so comparing the 2060 series to say a 5600 or even a 5500 type series might be a little more even. but it really comes down to how much is one willing to spend on a video card, how much they can afford to, and would the games one plays, see any benefit from that purchase? one of the friends i talked to.. is one of those with a lot of disposable income, and even he says the 20 series isnt worth the cash right now :-)
The canned bench from Metro Exodus is like a torture test, the game runs quite nicely. Quake 2 is actually 22 years old, and you were way off on your estimates (think more like 1.5k to sub 100), but I'd say watch the Digital Foundry video.
Now, the rest of your comments: they are directly refuting AMD's claims. AMD is calling the 5700 a 2060 competitor, saying it is 10% faster for 8.5% more money. That's not my interpretation, that's what they have come out and said. AMD is saying, based on their hand-picked benches, that they are going to charge almost exactly the same $/fps as nVidia but with no option for ray tracing. Again, this isn't me spinning anything, it's all in their slides, their words.
Is it unreasonable to assume AMD chose benches that made them look slightly better than average? Combine that with the price premium and you may find that, in the real world, a factory-overclocked 2060 at the same price as the 5700's MSRP is dead even with AMD, but with the option of playing with RTX.
i said 22 ?? years old.. same as you.. my estimates ? you mean my exaggerations of how many FPS's quake would get on modern hardware ? just i wasnt sure about the age... maybe.. but that is just my thoughts too... will have to wait to see what street prices turn out to be when these are released.. they could very well be less than what amd has said. well.. thats what those i know said when i asked.. they dont play those games, so they see getting an rtx card as a waste of money.. but consider that they are comparing what they have now.. to getting an rtx card, and it could very well be a waste.. i am sure.. if they did play games that have rt in it.. they might be considering it.. again.. it really comes down to the games that have RT and if one plays them.. for myself, and the games i play.. i would be wasting my money.. but i have always wondered how one game i played would run on newer/better hardware than i have currently, as it always brought my comp to its knees... my current 1060 strix with the 5930K @ 4.2 ghz, vs 1080ti or the 20 series/radeon 7/5700xt, with a zen/zen 2 based cpu or a much newer intel cpu: a game called supreme commander... i read a review of it ages ago.. and if you didnt have at least a dual core, dont even bother.. with mine.. on the HUGE maps it has, and 1000 units per side ( up to 8 ) after about 20 mins.. im turning the game speed up to max, and lowering the eye candy down to at least the middle...
You had the question marks next to 22 years, I see contextually those were incredulous modifiers now, also your estimates, my frame rate goes from 1,460FPS to roughly 60FPS, it's much worse than 400 to 200.
In no way am I asserting you are making a bad decision for you, not even close- what I'm saying is why would someone choose the 5700 over the 2060? Ignoring ray tracing all together they are close to identical in performance per dollar, so with one you can play around with it if you want, the other you can't.
If you are saying neither are worth it, completely valid argument and I wouldn't argue it.
heh.. my exaggeration was a little off for how fast quake would run ;-)
well, until these cards are out ( 5700 series ) its hard to say if they will be priced that close together, currently the 2060's are priced between 500 and up to 570, so if the entry level 5700 is less than 500 then it could be a better buy.. but wont know for sure for a few more weeks...
right now.. i guess in a way i am, cause they dont provide a big enough performance increase over what i or those i know.. currently have... my self.. i would need to go to at least a 2070/5700xt.
ahh yes.. the newegg angle.. are you also factoring potential shipping costs ?? currency exchange.... moot point as that changes daily... and can vary. i will wait to see what prices are when released... then compare...
That's NewEgg Canada, not the U.S., is shipping different for some reason? There is no currency conversion on the newegg.ca site if you're in Canada.
Obviously waiting for reviews will give us a fuller picture, my issue is what they claimed just doesn't seem to have any compelling reason to buy it over the competition.
BenSkywalker i was referring to newegg.ca in regards to shipping, but seems they have free shipping on that gpu ATM.. either way.. still something to consider if newegg has anything cheaper than going to a local store, as shipping could negate the lower initial price. as for currency conversion.. that was in regards to the 5700.. as there are no current cdn prices... yet...
I'd love to get one of these. Nvidia has basically had the market for too long and they now force you to have an account and telemetry enabled to have gaming driver updates and optimizations. They've really not gotten any good karma with me or anyone caring about privacy.
" they now force you to have an account and telemetry enabled to have gaming driver updates and optimizations. " they do?? must be on the 20 series.. cause i dont have that issue with the 1060 i have....
Please explain why RX5700 has almost twice as many transistors as RX590 while having essentially the same performance (and number of ALUs) - the tiny difference in FP32 throughput can be entirely explained by higher boost clock and then some.
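The arithmetic behind that claim does check out on paper (a rough sketch, assuming the commonly listed specs: 2304 ALUs in both the RX 590 and the RX 5700, with boost clocks of 1545 MHz and 1725 MHz respectively):

```python
# Peak FP32 throughput = ALUs * 2 ops/clock (fused multiply-add) * clock
def tflops(alus, clock_mhz):
    """Return peak FP32 throughput in TFLOPS."""
    return alus * 2 * clock_mhz / 1e6

rx590  = tflops(2304, 1545)  # ~7.12 TFLOPS  (Polaris 30, 12nm, ~5.7B transistors)
rx5700 = tflops(2304, 1725)  # ~7.95 TFLOPS  (Navi 10, 7nm, ~10.3B transistors)

print(rx590, rx5700, rx5700 / rx590)  # throughput ratio ~1.12, same as the clock ratio
```

So the ALU count and peak FP32 really are near-identical, and the extra transistors presumably went into the reworked RDNA front end, wider caches, and higher achievable clocks rather than into more ALUs; whether that pays off in games is what the reviews will show.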
Cellar Door - Monday, June 10, 2019 - link
Ryan - do you know if vega 56 and 64 are EOL?
RaV[666] - Monday, June 10, 2019 - link
Can you think of one reason to make them? I mean they will be made, as in vega10 chips for datacenters, but for gaming, they're gonna have MUCH higher ASPs on 5700
AshlayW - Monday, June 10, 2019 - link
Yes Vega 10 is being made for Google Stadia gaming as they use PRO V340 cards with dual "Vega 56" GPUs. GCN still is better for Compute and HPC I think, but Vega 20 will largely succeed that in HPC. GCN is not going anywhere.
Oh, I do not think Vega 10 is cheaper to make than Navi 10. Yes the process is mature and cheaper, but the die is almost 2X the size and you factor HBM2 and interposer cost into that and the price is largely in the same ballpark.
Navi 10 cannot do a "V340" style card as easily, or as effectively: Google Stadia needed graphics density, and the on-package memory on Vega 10 makes the overall space requirements much smaller. So yes, Vega 10 is likely to still be made, and it is itself much cheaper than Vega 20.
mode_13h - Monday, June 10, 2019 - link
Why would 2x Navi's be so much worse than 1x Hawaii? You're talking about 512-bits of memory data bus, in each case - just one compute die vs. two.
AshlayW - Tuesday, June 11, 2019 - link
What? where did you get Hawaii from? V340 uses "Vega 10" which has on-package HBM2 instead of GDDR5. That is a major advantage for space saving when putting multiple GPU packages on the same card.
mode_13h - Wednesday, June 12, 2019 - link
My point was that Hawaii cards use 512-bits of datapath on a single card, so perhaps 2x 256-bit Navi's can fit.
Regarding density, I don't see your point. From a server's perspective, a PCIe card is a PCIe card, unless it's low-profile, which I don't think it is.
olafgarten - Monday, June 17, 2019 - link
They might not be using standard PCIe or maybe putting multiple chips on a single card. Either way density helps.
Acreo_Aeneas - Sunday, June 30, 2019 - link
Servers would likely have to use PCI-e LP or mini PCI-e. In either case, most servers are built more for memory I/O and storage performance and capacity (with also a focus on CPU performance) rather than for how powerful the GPU is onboard. Most servers are headless and don't even have GPUs on board. The few that do usually use theirs for interfacing with a terminal.
This does not include servers or server farms built specifically for multi-GPU setups. Usually that is scientific/graphics oriented, or the increasing niche of bitcoin mining.
WaltC - Thursday, July 4, 2019 - link
Pretty sure I heard Su mention that NAVI included (at the "Rdna" level) improvements to compute. We'll know in a few days, of course. (Impatience makes time drag, eh?...;)) I don't usually do this, but *provided I can buy a 5700XT for either $499 (20th anniversary) or $449 MSRP* I'll be buying one next week. I'm going to be rather ticked off if the prices for the card are grossly inflated! Here's hoping AMD will control this much better than what happened at the RX-480's debut--what makes me shudder a bit is that I just read some days ago that Bitcoin stock was on the rise again! Stadia servers are likely only using Vega now because NAVI simply wasn't available when they began. Should change in a couple of weeks, possibly.
Ryan Smith - Monday, June 10, 2019 - link
As far as consumer cards go, they've been drawing down inventory from the market for a couple of months now. I don't know if they've been formally discontinued, but they may as well be de facto done for.
Someguyperson - Monday, June 10, 2019 - link
No HDMI 2.1 support, variable rate pixel shading, or hardware ray tracing? This kinda seems like the Ryzen 1 launch; some promising improvements, but still not enough to really push the envelope. If I have the choice between a 5700 XT and an RTX 2070, I would probably go Nvidia still, just because of the feature support.
AshlayW - Monday, June 10, 2019 - link
If the RTX 2070 gets a price cut to $449 then yes, it is the better card I think. And I am a Radeon fan so it pains me to say that, by the way, but with HW DXR, the 2070 would have the better feature set, and Turing is a very good forward-looking architecture, a first for Nvidia in a while. So yes I would prefer the 2070 at $449, too. And I think Nvidia will do just that to deflate this launch.
mode_13h - Monday, June 10, 2019 - link
Don't forget about Nvidia's Tensor cores. I think some significant gaming roles for them might still emerge, beyond DLSS.
evernessince - Wednesday, June 12, 2019 - link
How exactly are the Tensor cores' potential for other applications a factor when buying a card now? It's the same as telling people to buy AMD because of the compute cores' potential to ray trace, without substance. Show me at least 1 game first, then talk.
V900 - Monday, June 10, 2019 - link
AMD fanboys are usually the worst kind of fanboys.
Good to see there are also some sane and reasonable AMD/Radeon fans out there. You good sir, are a credit to the red team!
Phynaz - Tuesday, June 11, 2019 - link
Agree. AMD fanboys are a pestilence.
Korguz - Tuesday, June 11, 2019 - link
and intel fanboys.. are the exact same :-)
Phynaz - Tuesday, June 11, 2019 - link
Hardly, look in the forums, the AMD fans turn on their own.
Xyler94 - Tuesday, June 11, 2019 - link
And so do all other Fanboys
catavalon21 - Friday, July 5, 2019 - link
+1
Korguz - Tuesday, June 11, 2019 - link
heh.. look on some of the posts for the articles, there are a few intel fanboys that may be worse...
Lord of the Bored - Tuesday, June 11, 2019 - link
All the fanboys are terrible.
FreckledTrout - Tuesday, June 11, 2019 - link
+1 Anyone who follows a product like it's a religion is delusional on some level.
peevee - Wednesday, June 12, 2019 - link
Anyone who follows anything like a religion, including any religion, is delusional...
WinterCharm - Tuesday, June 11, 2019 - link
They're not as bad as the smartphone crew. Apple and Samsung fanboys drive me up the wall with their bullshit.
evernessince - Wednesday, June 12, 2019 - link
How about you try something positive instead of putting others down.
Tewt - Monday, June 17, 2019 - link
Let's look at comments before and many after V900's statement and wonder WTH he is talking about.
I also have to wonder about the mentality of someone that comes into an AMD article and expects the commenters to praise Intel or Nvidia.
sircod - Monday, June 10, 2019 - link
Did they specify that it did not have HDMI 2.1 support? That is one thing I am really waiting for.
sircod - Monday, June 10, 2019 - link
Just noticed page 2. HDMI 2.0b and Freesync-over-HDMI, but no 2.1. I really want that increased bandwidth for 4k 120Hz with VRR. Guess I will keep waiting.
godrilla - Monday, June 10, 2019 - link
7nm+ fingers crossed.
arakan94 - Tuesday, June 11, 2019 - link
Doesn't DisplayPort DSC solve your problem?
mode_13h - Monday, June 10, 2019 - link
Please be realistic. This would be too soon for AMD to have ray-tracing support.
In fact, that's probably the reason XBox Next and PS5 aren't launching this year. It takes a long time to design and validate chips, you know? The specs are pretty much baked a couple *years* before launch!
Phynaz - Tuesday, June 11, 2019 - link
So you are saying AMD is years behind Nvidia.
Meteor2 - Tuesday, June 11, 2019 - link
Well, yeah. They have fewer resources than either Intel or Nvidia, but are competing with both. As Su says, they’re dependent on bets paying off.
arakan94 - Tuesday, June 11, 2019 - link
In some areas yes - just as Nvidia was years behind AMD and only finally caught up with original GCN with Pascal. Though not in everything. AMD is still superior in terms of async compute for example.
Phynaz - Tuesday, June 11, 2019 - link
AMD is superior in a game tech demo from 2013. Yay?
Xyler94 - Tuesday, June 11, 2019 - link
You know, a GPU is not only for gaming. And unfortunately for AMD, they didn't have the insane R&D money NVIDIA has. Imagine if they did?
wumpus - Tuesday, June 11, 2019 - link
Nvidia released CUDA back in 2007 when the 8800GTS was king. While AMD might have had GPUs that were better at numerical code since then, nvidia's infrastructure+GPUs made them own the market on GPU computing.
AMD is good at piggybacking on Intel's infrastructure in the AMD64 market. They have a harder time doing the same with nvidia and the GPU market and the narrow focus of where the money can be spent shows. You'd think they could take some of the lessons learned in making the console GPUs and at least catch up to where nvidia was a couple of years ago but apparently that is too expensive.
Korguz - Tuesday, June 11, 2019 - link
" AMD is good at piggybacking on Intel's infrastructure in the AMD64 market " how did they piggy back intel with amd64 ?Acreo_Aeneas - Sunday, June 30, 2019 - link
I don't think wumpus realizes that Intel owns x86 but licenses x86-64 ("AMD64") from AMD. Without AMD's AMD64, Intel wouldn't exist today. AMD's designs for the past 15 years (maybe longer) aren't even based on Intel's designs. While they may have been a 2nd-tier manufacturer of Intel-based microprocessors, they haven't been for many years.
BenSkywalker - Tuesday, June 11, 2019 - link
ATi dwarfed nVidia, and AMD almost bought nVidia back in the day but they weren't ok having a competent CEO at the time so the deal fell through. *BOTH* halves of the current AMD were much larger than nVidia, so we have seen with great clarity what they would do if they had every advantage.
Korguz - Tuesday, June 11, 2019 - link
phynaz, the same can be said about nvidia and their own tech demo's from current to their past demos
evernessince - Wednesday, June 12, 2019 - link
Um, you do realize many games use compute based shaders right?
RSAUser - Tuesday, June 11, 2019 - link
AMD will have support for the DX ray tracing though?
mode_13h - Tuesday, June 11, 2019 - link
Software support is one thing, dedicated hardware is another.
wumpus - Tuesday, June 11, 2019 - link
So will Intel. Either one will produce a slideshow, but nifty screenshots.
levizx - Wednesday, June 12, 2019 - link
The problem for NVIDIA is, when PS5 and Xbox Scarlett hit the shelves, games will use whatever AMD choose to accelerate via hardware, and that may or may not work well with NVIDIA's current design.
rarson - Thursday, June 20, 2019 - link
I suspect AMD has been working on a ray tracing hardware solution similar to Tensor cores or something for some time now. They may even have considered implementing it into Navi for consumer GPUs but I imagine even if the performance was there, the increase in die size would push cost above the mid-range market these cards are targeting. There may also have been a time factor involved, i.e. perhaps the ray tracing performance itself wasn't quite as good as it needed to be to be viable (and AMD has pretty consistently asserted that they feel the technology isn't quite there yet and will support it when it becomes viable... although I think with some time and additional programming effort, Nvidia is finally starting to show some compelling evidence that it is in fact becoming viable).
These two factors (price and performance) lead me to believe that whatever AMD has been working on will show up in the PS5/Xbox Scarlett hardware, especially since both companies (to my surprise) have mentioned ray tracing support. These are "semi-custom" designs, after all, and hardware ray tracing is a nice checkbox for a console feature. This would also explain why consumer Navi is launching so much earlier than the console hardware (I suspected the consoles would launch first, near the beginning of this year, but I was also under the assumption that they were working with Zen+ due to time constraints, so obviously I was very wrong). Given how everything has played out so far, it's clear that AMD needed to launch something as soon as possible and I suspect that due to price and performance concerns, it just made more sense to launch Navi without it.
I gotta admit, I'm a little disappointed with the lack of HDMI 2.1 though. As someone who often games on a TV, I would hate to spend $450 on a brand new GPU that won't be able to do VRR over HDMI when I finally upgrade my TV (then again, I'm not planning on doing that for a while, but still).
Spoelie - Tuesday, June 11, 2019 - link
Even beyond features, I'm disappointed by the board power, assuming close to performance parity:
RTX 2070 - 12nm - 180W
5700 XT - 7nm - 225W
AMD isn't really there yet
Mil0 - Tuesday, June 11, 2019 - link
Well for the first release of a new arch, this is pretty good. More FineWine potential here, also with arch tweaks - see how much they gained with Zen 2.
And efficiency-wise I think the cooperation with Samsung could turn out to be very helpful. Targeting mobile/tablets necessitates this.
neblogai - Tuesday, June 11, 2019 - link
It is because it is clocked so. If you look at the 5700, that is rated the same 180W, with performance (judging from the slides) very close to 2070. Of course, that is ~parity when comparing the products- cards, not the technological level (because this is all 12nm nVidia vs 7nm AMD).
mode_13h - Wednesday, June 12, 2019 - link
Looking at AMD's GPU roadmap slides, they say that perf/W will be an area of continued focus.
That's the right idea, anyhow. Efficiency isn't a one-time feature - it's an ongoing journey.
Nvidia has gone so far that it's not surprising AMD hasn't yet caught them.
evernessince - Wednesday, June 12, 2019 - link
If you factor in that the 5700 XT is faster, those power numbers seem extremely competitive.
CiccioB - Tuesday, June 18, 2019 - link
When you factor in that the 5700 XT is a full process node ahead of nvidia, those power numbers (and die size) are awful for the performance they get.
m16 - Tuesday, June 11, 2019 - link
I see no problem with no HDMI 2.1 support. That's a TV standard, and can easily be adapted from DP via an adapter. It also makes the card pricier to have a new controller to support that standard, so there's that. They might release an HTPC version of a card later to support it, but that's not even on the radar for me or most of the PC gamers I know.
Beaver M. - Tuesday, June 11, 2019 - link
They are both crap. You should wait for the next gen, which is probably coming next year anyway.
evernessince - Wednesday, June 12, 2019 - link
Ray Tracing and DLSS are worthless on a 2070. At least with AMD you can use their upscaler / sharpener with a less than 1% performance impact, with better results than DLSS. I don't see a problem with AMD introducing features that work instead of pipe dreams that suck in implementation like Nvidia.
haukionkannel - Thursday, June 13, 2019 - link
Ray tracing is irrelevant at this point. Only one card, the 2080 Ti, can run ray tracing at reasonable speed... and the 2080 Ti costs $1200, so no, it is good that Navi doesn't have it yet!
If Navi will have it next year, expect less speed for more money... because ray tracing takes up chip space from pure raw drawing speed...
powerarmour - Monday, June 10, 2019 - link
Probably DOA these...
Phynaz - Tuesday, June 11, 2019 - link
Members of the Red Team will shell out for these side grades. They never learn.
wumpus - Tuesday, June 11, 2019 - link
You'd think that with the money saved on Ryzen over Intel you could afford a decent Turing (or perhaps an older/used Pascal. No real advantage of going full Turing outside the money-is-no-object 2080ti).
I guess that's the difference between fans and fanboys.
neblogai - Tuesday, June 11, 2019 - link
Learn what? For anyone who knows the industry there are plenty of reasons NOT TO BUY NVIDIA, and to support the underdog. And now, with Navi, there are even better AMD options to buy a faster GPU for less.
elwro - Tuesday, June 11, 2019 - link
Well, remember that "SUPER" versions of the RTX cards are about to arrive, so the regular ones will get a price cut. And nowadays, when AMD describes a card as "competitive with XYZ", it should be understood as "~5-10% slower on average than XYZ" (like Radeon VII vs 2080). So I guess waiting for the price drop on a regular 2070 is a better deal than getting a 5700 XT.
AMD is back in business in the CPU area, but unless we talk about low-end cards, the Red Team does not have impressive products. OK, Radeon VII is impressive if you're a person who does some AI/ML pet projects and likes to game at the same time, but that's a niche.
I'm quite disappointed that there is no "Big Navi" this year. I plan to finally upgrade my 2600K/GTX970 rig, and while choosing the CPU is easy (the Ryzen 9 3950X is a no-brainer for me), the GPU market right now sucks. I guess I'll go for a used 1080 Ti, which is the best value for 1440p gaming I guess...
imaheadcase - Tuesday, June 11, 2019 - link
"fast" GPU, not faster. Supporting an underdog in a tech industry is the most insane logic I've ever heard. It's not like you are shouting at some sports team. lol
Phynaz - Tuesday, June 11, 2019 - link
No kidding, rewarding a company for producing mediocre products, so they will continue to produce mediocre products. It takes some special thinking to justify that.
Korguz - Wednesday, June 12, 2019 - link
and paying a company way too much for its products is better?? it's like saying: keep charging us these insane prices, even though most of us know they are overpriced because all you care about is your profits, and we will keep buying them at these prices... phynaz, you are the dumbest, most close-minded, ignorant person i have seen yet.... seems like you want amd's vid card business to fail, so nvidia has no competition and can charge even more for their already overpriced products
Phynaz - Wednesday, June 12, 2019 - link
And here’s one of the special thinkers now....
evernessince - Wednesday, June 12, 2019 - link
Mediocre as a metric depends on many variables. A 40% performance uplift over previous-generation cards, with a nice drop in power consumption and a vastly smaller die, certainly seems less mediocre than a 27% performance increase along with higher power consumption and price increases across the board. The only way you can rationalize these new AMD cards as mediocre is from a pure performance perspective compared to a $1,200 video card. Otherwise Nvidia's Turing generation is far more mediocre, especially when you consider the price hikes. In fact it provides worse performance per dollar than the previous-gen Nvidia cards, especially the new Titan and 2080 Ti.
Qasar - Thursday, June 13, 2019 - link
evernessince, keep in mind phynaz is just trying to justify the purchase of his, my guess, 2080. a card he paid WAY too much for, for little gain over the 1000 series.....
rarson - Thursday, June 20, 2019 - link
"No kidding, rewarding a company for producing mediocre products, so they will continue to produce mediocre products. It takes some special thinking to justify that."
I tend to support some companies rather than others for various reasons. Unfair business practices mean quite a lot to me, as does competition in the market. So if one particular company that I prefer not to endorse has a superior product to the other company that I would prefer to endorse, I heavily compare value for dollar and the actual performance difference, and weigh whether or not it truly matters for my particular application.
The end result is one of three things. Either I end up buying the product from the company I would rather endorse, because the value is still there and the performance deficit is minimal, or I wait to see what that same company produces at a later point (because the value is not there and my need to buy is not urgent), or I end up buying from the other company because I have a need and the value is too good to ignore. At the end of the day, I'm not going to spend decent money on a crappy product just because of the company that is selling it.
And I do think even if you're a "fan" of certain companies, it's dumb not to criticize their products when they are bad and urge them to do better, and it's dumb to throw money at products that aren't worth buying. In fact, I think that's the most salient point of the entire conversation. Like you said, if you reward the company for producing mediocre products, they will continue to produce mediocre products.
Having said all that, I don't think Navi is mediocre. I think it's decent. It still lags behind Nvidia on performance-per-watt (although we really need to see some actual testing and real performance numbers to draw any conclusions). IMO it's a much better showing than Vega was, at least in terms of being a consumer card, but AMD obviously still has some work to do to make it better. What's the most disappointing with these cards, IMO, is price, and I think Nvidia has a lot to do with that. I think the XT should be around $399 (maybe a touch more) and the 5700 no more than $349. I think AMD feels that current 20-series pricing justifies these prices, but they're already a year late, Nvidia has "Super" coming, and likely the ability to adjust the entire price stack down a bit. So it'll be very interesting to see how Nvidia responds and what AMD does to correct the situation, which will likely tilt back towards Nvidia's favor as soon as they respond. AMD may be forced to price these cards more competitively because of that.
There's a saying in business: there are no bad products, only bad prices.
Korguz - Tuesday, June 11, 2019 - link
phynaz, and members of the green team have been shelling out WAY too much for those cards.. what's your point, nvidia fanboy?
RaV[666] - Monday, June 10, 2019 - link
Well, i really like the cards, and the extra features, but at these prices and with the retail blower cooling, they don't make sense.
Meteor2 - Tuesday, June 11, 2019 - link
What’s wrong with blowers?
Phynaz - Tuesday, June 11, 2019 - link
Nothing
Korguz - Tuesday, June 11, 2019 - link
heh.. nothing?? usually louder than open air. most, i think, wait for the custom cooler releases... better cooling.. and quieter
Phynaz - Tuesday, June 11, 2019 - link
Must be a problem with AMD’s blowers
Korguz - Tuesday, June 11, 2019 - link
sorry phynaz.. but it's ALSO nvidia's blowers.. but being a blind arrogant nvidia fanboy.. you don't see it
evernessince - Wednesday, June 12, 2019 - link
RX Vega's blower is pretty impressive for how much heat it has to dissipate. It was a more effective design than the one Nvidia used pre-Turing.
evernessince - Wednesday, June 12, 2019 - link
Blowers are also more consistent and don't dump heat into your case. If you have a case with poor airflow, a blower will outperform an open-air cooler. Blowers are a good choice for a reference card because they will perform equally well in every PC.
elwro - Tuesday, June 11, 2019 - link
Sapphire will fix it
AshlayW - Monday, June 10, 2019 - link
I am really disappointed with the price, like, really disappointed. Honestly they don't really bring a lot new to the table, and as far as I can tell from the reference design, 8+6 pin means it isn't really that efficient if it needs that much power. And the blower, ugh, just no.
I knew this wasn't for me. I am happy to stay with my Radeon VII + Kraken G12 + Asetek 570LC modification until next year, but I was kinda hoping these RX 5700 XT and 5700 would be priced at 349 and 299 respectively. Yes... Please don't hate on me, I know AMD isn't a charity and the 7nm process is likely expensive, but this chip succeeds Polaris and i was hoping it would be priced like it. Honestly I think a custom Vega 56 for £250 is still the champ - these cards are readily available for this price here.
I await Big Navi RDNA+HBM2, or maybe I get an Nvidia 7nm card, but I will not pay more than 699 US / £650 for the card and I will not buy with less than 11 GB of Video Memory.
AshlayW - Monday, June 10, 2019 - link
Please don't hate me for "bashing AMD": I am just a bit negative today and seeing the down sides of all things.
Opencg - Tuesday, June 11, 2019 - link
I agree. The rumors had these things priced WAAAY lower. If they come out slightly ahead of the 2070 then there's maybe reason for excitement. But I honestly think that the favorable comparison amd showed off today was all best-case benchmarks. Either way maybe they have some room to come down. So a price war in the mid tier may be what they are trying to incite vs nvidia. (They would probably benefit more from eliminating nvidia's margins before they drop.)
Oh well, wait for the benchmarks I guess. I really freaking hope they lose the f$^^#&= blower style cards for navi 20. seriously
Meteor2 - Tuesday, June 11, 2019 - link
People seem to be under the illusion that 7nm is cheap to manufacture. It’s not! It costs more than the next-larger nodes.
PixyMisa - Tuesday, June 11, 2019 - link
Currently it costs as much per transistor as 12nm, so nearly twice as much for a given die area. That will come down, but right now it is not cheap.
Opencg - Tuesday, June 11, 2019 - link
I agree. I bought into the hype for 2 reasons. Navi was supposed to use a new architecture that could scale well using interposers. If that is true, well, they didn't use it for these cards.
The second reason is that the current generation of cards is still greatly inflated in price due to RTX and tensor cores taking up silicon. I was hoping that AMD could just match Pascal's value and that would give them an edge. They only got halfway there though. Perhaps they are making insane margins on these and have room to go down.
Either way it's half a letdown for me. I'm still waiting for the day when a dollar can buy as much performance as the 10 series did when mining wasn't inflating prices.
Irata - Tuesday, June 11, 2019 - link
Same here as far as price is concerned - I was hoping for at least $50 less, but if nVidia responds with faster specs and/or lower prices on the RTX 2060 / 2070, then we may still see lower prices across the board, with everyone winning regardless of which brand they prefer.
In the end, we will have to wait for reviews and to see what else the RX 5700 contains in terms of features.
eva02langley - Thursday, June 13, 2019 - link
This is what fanboys don't understand. It was planned by AMD when they heard about Super.
catavalon21 - Friday, July 5, 2019 - link
As it turns out, the 5700 XT just became $50 less - 2 days before launch!arakan94 - Monday, June 10, 2019 - link
AMD needs to keep reasonable margins - at least 45%. They tried to offer stuff at very low prices for a long time and often had significantly better value than Nvidia, and still didn't gain very much market share (so volume never offset the lower margin). So I imagine they thought "fuck it" and went for normal pricing this time - same thing for Ryzen 3000. No more charity prices.
If Nvidia lowers prices, then AMD will lower as well, but there is no reason to aggressively undercut them. I mean.. Look at average gamers - Nvidia releases overpriced shit with often worse value than Pascal and people still buy it. How do you compete with that mentality?
Also, inflation. $300 now isn't $300 ten years ago.
Meteor2 - Tuesday, June 11, 2019 - link
Yeah, but I think it’s widely accepted that in tech, cash prices don’t go up. Maybe they will no longer go down over time as they have in previous decades, but people just don’t accept steep price rises for tech. Apple’s iPhone sales are dropping for a reason, even with the boom of the Chinese middle class.
rarson - Thursday, June 20, 2019 - link
"Yeah, but I think it’s widely accepted that in tech, cash prices don’t go up. Maybe they will no longer go down over time as they have in previous decades, but people just don’t accept steep price rises for tech."
That's absurd. Prices do indeed go up, and they have for years. 2 decades ago, a high-end consumer graphics card was $299. Even if we adjust for inflation, that's only $469. That's a far cry from the $1200 that the 2080 Ti commands. Granted, the performance range of graphics hardware is a lot greater now than it was back then, but that doesn't change the fact that people are indeed willing to pay greater prices for better performance.
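rarson's adjustment above is a one-line calculation. A quick sketch; the ~1.57 cumulative inflation factor (1999 to 2019) is an assumption chosen to be consistent with the $299/$469 figures in the comment, not an official CPI value:

```python
# Inflation-adjust a 1999 graphics card price to 2019 dollars.
# The cumulative inflation factor of ~1.57 is an assumption, picked to
# match the $299 -> ~$469 figures quoted in the comment above.
price_1999 = 299
cpi_factor = 1.57                  # assumed cumulative US inflation, 1999-2019
adjusted = price_1999 * cpi_factor
print(round(adjusted))             # ~469, still far below the 2080 Ti's $1200
```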
Qasar - Friday, June 21, 2019 - link
"but that doesn't change the fact that people are indeed willing to pay greater prices for better performance"
maybe you are willing to pay, but not all of us. at some point, there has to be a price that is just too high, and the prices that nvidia charges for the 2070 and up are at that point. the 2080 and titan are well past that point
Meteor2 - Sunday, June 30, 2019 - link
I would posit that the FX5900 and Radeon 9800 are better references, which launched at $499. That’s when the top-of-the-range was clearly established.
Nvidia have jacked that up to $699 and more; too much.
wumpus - Tuesday, June 11, 2019 - link
Margins don't help if you don't have volume. They still need to cover the NRE (design costs) of the boards, and those prices aren't helping. It's pretty bad when your competitor can slap on 20% more transistors, pass the "raytracing tax" on to consumers, and you can't really compete with those boards.
Gastec - Tuesday, June 11, 2019 - link
How many Watts does your Radeon VII consume?
eva02langley - Thursday, June 13, 2019 - link
Similar to the 2080. You can even achieve better power than the 2080 when undervolting the card.
Basically, Vega at 7nm is roughly on par with Turing in terms of power, which would make RDNA around 25-35% more efficient than Turing.
That's with the numbers provided, of course. We will find out in July.
Meteor2 - Tuesday, June 11, 2019 - link
Let’s be clear.
The 5700XT is $449 vs $499 for the 2070, and is very slightly faster at 1440p. FPS/$ is better.
The 5700 is $379 vs $349 for the 2060, and is slightly faster. FPS/$ about the same.
These two cards are competitive. Nothing more, nothing less.
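Meteor2's FPS/$ comparison boils down to a simple ratio. A sketch with the MSRPs from the comment; the average-FPS numbers are hypothetical placeholders chosen only to illustrate the "slightly faster" claims, not benchmark results:

```python
# Perf-per-dollar comparison using the MSRPs quoted in the comment above.
# The FPS figures are hypothetical placeholders, NOT measured benchmarks.
cards = {
    "RX 5700 XT": {"price": 449, "fps": 100},  # "very slightly faster"
    "RTX 2070":   {"price": 499, "fps": 98},
    "RX 5700":    {"price": 379, "fps": 88},   # "slightly faster"
    "RTX 2060":   {"price": 349, "fps": 84},
}

for name, c in cards.items():
    # Frames per second per dollar of MSRP.
    print(f"{name}: {c['fps'] / c['price']:.3f} FPS/$")
```

With any numbers in that ballpark, the 5700 XT comes out ahead of the 2070 on FPS/$, while the 5700 vs 2060 is roughly a wash, matching the summary above.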
webdoctors - Tuesday, June 11, 2019 - link
Except they're being released a year later, and all over the web, and even on this site, people kept saying the AMD cards would be hugely cheaper and people were getting ripped off. Now we find the Radeon VII and other new cards are priced about the same as Nvidia's cards.
You mean all the anonymous kids posting in this site's comments section were wrong? We're not going to get cards below cost? INCONCEIVABLE!
Beaver M. - Tuesday, June 11, 2019 - link
You will still keep a 5700 longer than a 2060 with its laughable 6 GB.
evernessince - Wednesday, June 12, 2019 - link
6 months later, actually. Turing's low / mid range didn't release until Feb 22nd, 2019.
elwro - Tuesday, June 11, 2019 - link
And the regular 2060/2070 will be discounted when their "SUPER" versions arrive. So, I guess buying a card from the 5700 series at launch is rather pointless...
eva02langley - Thursday, June 13, 2019 - link
Correction, buying an Nvidia card is pointless.
Phynaz - Tuesday, June 11, 2019 - link
How do they compare once DXR is enabled?
eva02langley - Thursday, June 13, 2019 - link
I would add that input lag reduction is a major new feature. The value is better than RTX IMO.
Beaver M. - Tuesday, June 11, 2019 - link
Well, the naming reminds me of the Nvidia 5xxx cards. They weren't great either.
rarson - Thursday, June 20, 2019 - link
I personally think $399 and $349 would have been reasonable starting prices. Anyone who doesn't want a blower can just wait a bit for AIB cards.
catavalon21 - Friday, July 5, 2019 - link
They just met your challenge. Sweet.
AshlayW - Monday, June 10, 2019 - link
"AMD’s first 7nm GPU, which is being fabbed over at TSMC, measures in at 251mm2, pacing in 10.3 billion transistors into that modestly-sized die. "Let's not forget Radeon VII; Vega 20 is the first 7nm GPU :')
Phynaz - Monday, June 10, 2019 - link
AMD announces their GTX1080 competitor three years late.
AshlayW - Monday, June 10, 2019 - link
RX Vega 64 launched a year after the 1080, what are you talking about?
Phynaz - Monday, June 10, 2019 - link
Vega wasn’t any competition, as shown by sales.
zinfamous - Monday, June 10, 2019 - link
garbage response is pure garbage. replaces the performance argument with a sales argument, thinking no one is paying attention. but we already know who you are.
Phynaz - Monday, June 10, 2019 - link
Hahaha... Another AMD fanboy that can’t comprehend that Nvidia shipped this card three years ago today, and it used less power. How was the wait?
Korguz - Monday, June 10, 2019 - link
zinfamous, more like a garbage reply from an nvidia fanboy. as has been mentioned.. before final judgement is passed, wait till these cards are reviewed by 3rd parties. and IMO, it's in everyone's best interests if these cards do perform as amd claims.. does anyone want to keep paying nvidia's inflated prices for their cards?? i sure as hell don't.. paying at best $500 bucks for an entry-level 2060 card is not worth the upgrade from my current 1060. if this doesn't close the performance gap, then i think there aren't many people that will be upgrading their current video cards.
Drumsticks - Tuesday, June 11, 2019 - link
Er, I kind of want to see AMD succeed, but the 2060 is like $349 MSRP. If Nvidia's prices are inflated, AMD sure isn't doing much to poke a hole in that bubble this time around. The 5700 is a little faster than the 2060 for $30 more, and the 5700XT is roughly equal to the 2070 for $50 less. They're both very vulnerable to price cuts or bundles from Nvidia right now.
Korguz - Tuesday, June 11, 2019 - link
Drumsticks, maybe in the US, but in other countries the prices are much higher... aka US $349 is canada $500 (for the least expensive 2060). the top-of-the-line 2080ti is $2100, and an RTX titan is $3500 !!!!!!!!!
Phynaz - Tuesday, June 11, 2019 - link
People from other countries should stop complaining about the price of US tech if they can’t make their own. You Canucks should have kept ATI.
Korguz - Tuesday, June 11, 2019 - link
phynaz, that comment right there shows how ignorant and blind you really are... those prices are in US dollars. even if ati wasn't bought out, they would STILL list their prices in US dollars... where something is made has NO bearing on it being priced in US dollars.. these cards are made where? oh.. guess where.. asia... where do you think TSMC is??
Horza - Wednesday, June 12, 2019 - link
Hey Phynaz, how about you learn to NAVIgate social environments before telling anyone else how to behave.
elwro - Tuesday, June 11, 2019 - link
I bought an OC'ed GTX 970 for 1500 in local currency. Two generations later, the equivalent OC'ed RTX 2070 costs 2750. This is insane. I really, really want AMD to start competing in the mid/high end of the GPU market. Hell, even Intel may be a saviour for the GPU market if they are capable of offering products across the whole spectrum from low-end to high-end.
CHADBOGA - Tuesday, June 11, 2019 - link
Now we get the opportunity to pay AMD's inflated prices for their cards too.
Progress is a wonderful thing. /s
Spunjji - Tuesday, June 11, 2019 - link
You know that we can all see you moving those goalposts around, right? A bad argument is a bad argument; squealing and calling your opponent biased when they point that out is a troll's game. Did you learn to argue from Ben Shapiro or something? :/
evernessince - Wednesday, June 12, 2019 - link
Wait, are you bashing AMD or Nvidia here? You do realize that the 1080 has better efficiency than the 2080, right? How embarrassing...
gglaw - Tuesday, June 11, 2019 - link
He started posting fanboy comments before anyone even mentioned AMD vs Nvidia and didn't get enough attention, so shortly after that he posted even more pointless, non-informative remarks. Still not getting the attention he's craving, so maybe he'll just go away.
Meteor2 - Tuesday, June 11, 2019 - link
Vega sold plenty. To cryptominers.
evernessince - Wednesday, June 12, 2019 - link
By that logic Call of Duty was the only shooter from 2007 to 2017.
mode_13h - Monday, June 10, 2019 - link
Vega 64 rarely bested the GTX 1080, unfortunately. Due to supply shortages & cryptomining, it didn't really make up for the shortfall on price, either.
So, yes, I'd say AMD finally has a strong competitor for the GTX 1080 / RTX 2070. At least, for games not using the RTX features.
Meteor2 - Tuesday, June 11, 2019 - link
No, it didn’t best the GTX 1080. But it did equal it, which was probably AMD’s aim. It did make it rather pointless in the gaming marketplace, of course. But cryptominers loved it; it mined ETH 30% faster than a 1080.
mode_13h - Wednesday, June 12, 2019 - link
I said it *rarely* bested the GTX 1080. There were indeed a couple of games, although one was using Vega's Rapid Packed Math feature. I think it came out at the end of last year... perhaps it was an installment in the Battlefield franchise?
Gastec - Tuesday, June 11, 2019 - link
Time is irrelevant for me, I'm immortal :)
eva02langley - Thursday, June 13, 2019 - link
Turing is the same lineup as Pascal, just with different SKUs. The only difference is the 2080 Ti... that's it...
So what is your point, fanboy?
SaberKOG91 - Monday, June 10, 2019 - link
Ryan - Can you clarify whether or not the number of shaders per CU stays the same?
If I'm understanding this right, Navi is 2 x SIMD-32 per CU instead of 4 x SIMD-16 per CU for earlier GCN.
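For what it's worth, the two layouts in that question work out to the same shader count per CU; a trivial sanity check (the cycles-per-wavefront figures reflect the commonly described GCN behavior of issuing a 64-wide wavefront over four cycles, noted here as background rather than anything stated in the article):

```python
# Shaders per CU under the two SIMD layouts being compared.
gcn_shaders_per_cu = 4 * 16    # GCN: 4 x SIMD-16
navi_shaders_per_cu = 2 * 32   # Navi: 2 x SIMD-32
assert gcn_shaders_per_cu == navi_shaders_per_cu == 64

# GCN issues a 64-wide wavefront across a SIMD-16 over 4 cycles;
# a 32-wide wave completes on a 32-wide SIMD in a single cycle.
print(64 // 16, 32 // 32)  # cycles per wave: 4 vs 1
```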
Ryan Smith - Monday, June 10, 2019 - link
There are 64 shaders per CU, organized as 2 32-wide SIMDs. So your understanding is correct.
SaberKOG91 - Monday, June 10, 2019 - link
Thanks!
vFunct - Monday, June 10, 2019 - link
How does the Navi architecture compare to the Vega II in the new Mac Pros?
extide - Monday, June 10, 2019 - link
Vega II is just Vega 20, i.e. Radeon VII. Navi is next gen compared to that.
mode_13h - Monday, June 10, 2019 - link
Don't stress, dear Mac user. Vega Pro II is faster. Eventually, I'm sure AMD will release a Navi Pro that will replace the Radeon 580 Pro currently set to ship in the Mac Pro base config.
Phynaz - Monday, June 10, 2019 - link
Double the price of Polaris. Thanks AMD!
SaberKOG91 - Monday, June 10, 2019 - link
You can thank Nvidia for that one. AMD is just doing the smart thing and trying to earn higher margins rather than relying on volume shipments. I'm sure the prices will come down $100 or more by Christmas.
Phynaz - Monday, June 10, 2019 - link
Oh sure, Nvidia held a gun to AMD’s head.
Fanboys are idiots.
SaberKOG91 - Monday, June 10, 2019 - link
With all due respect, I am not a "fanboy". This is basic economics. If you price your product too far below that of a competitor, you decrease the perceived value of your product and miss out on higher margins. It's also important to recognize that a Turing refresh is on the way that will tack $50-100 onto the successor of the 2070.
Meteor2 - Tuesday, June 11, 2019 - link
We won’t see any refreshes of the 20x0 cards this year. GPUs aren’t on an annual release cadence.
SaberKOG91 - Tuesday, June 11, 2019 - link
I'm talking about Nvidia's "Super" announcement. Faster GDDR6 for the existing cards. No different than the V100 refreshes.
Meteor2 - Tuesday, June 11, 2019 - link
There has been no announcement. There are rumours, which I place firmly in the “I’ll believe it when I see it” category. The rumours consist of bumping RAM up by 2 GB, and possibly increasing memory bandwidth. I’m not sure either or both would make a great deal of difference to frame rates. The GPUs themselves would be unchanged.
zinfamous - Monday, June 10, 2019 - link
holy shit man. look at your pure garbage responses so far: you brought this nonsense here, and you only. "calling out fanboys" after introducing this garbage.
what throwaway nvidia schwag did your lazy posts pay for, anyway?
Phynaz - Monday, June 10, 2019 - link
So, have anything to say about this $450 ripoff? Three years later and it’s $50 less than a GTX 1080. Of course if you were waiting for another hot-clocked AMD furnace then I guess you will be thrilled.
SaberKOG91 - Monday, June 10, 2019 - link
The 2070 is currently 15% faster than a 1080 and sells for anywhere from $490-550+. No one is paying $500 for a 1080 without getting ripped off. Even if the 5700XT is only on par with the 2070, it's 15% faster than the 1080 and $40-100 cheaper than the 2070.
The Radeon VII runs ~10C hotter than the 2070 (74C) while drawing 60W more under load. The 5700XT has a 75W lower TDP than the VII. So it stands to reason that it will not run much hotter than the 2070, even before we factor in AMD's blower redesign.
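Chaining those relative-performance claims makes the arithmetic explicit; the 15% figure and the price range are the commenter's estimates, not measured results:

```python
# Chain the relative-performance claims from the comment above.
# The 15% uplift and the prices are the commenter's estimates.
perf_1080 = 1.00
perf_2070 = perf_1080 * 1.15   # "2070 is currently 15% faster than a 1080"
perf_5700xt = perf_2070        # "only on par with the 2070"

price_2070 = 490               # low end of the quoted $490-550+ range
price_5700xt = 450

print(perf_5700xt / perf_1080)      # same ~15% uplift over the 1080
print(price_2070 - price_5700xt)    # at $40 (or more) less
```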
Phynaz - Tuesday, June 11, 2019 - link
The 2070 has many features this card doesn’t. Only fanboys are happy to pay 2019 prices for 2016 tech.
Korguz - Tuesday, June 11, 2019 - link
and only nvidia fanboys are happy to pay the prices that nvidia charges for their cards.. but SaberKOG91 is right about the current prices of video cards.. nvidia kept raising the prices of their cards to where they are now.. the sweet spot was around the $300 mark.. but now.. it seems to be between $500 and $600....
Meteor2 - Tuesday, June 11, 2019 - link
Korguz, to be fair, for $300 you now get a lot more AAA FPS with Nvidia than you ever have before. It’s just that the high end has become even higher.
AMD don’t really have a compelling or even just competing $300 product.
Korguz - Tuesday, June 11, 2019 - link
Korguz - Tuesday, June 11, 2019 - link
maybe.. but are these the only two navi cards amd will release?? i doubt that. these could just be the mainstream cards... there could be an RX5800/5900 series and an RX5600 series yet to come...
Korguz - Monday, June 10, 2019 - link
phynaz.. and you like paying for nvidia's overpriced 20 series, which isn't much of an update performance-wise over their own 10 series cards?? sorry phynaz.. but nvidia is more of the rip-off here.. not these cards.. but i'm going to wager a guess.. you have more money than brains?
Phynaz - Monday, June 10, 2019 - link
AMD video cards....second rate for the poor I guess. I’m glad that in the USA we can afford the best.
Korguz - Tuesday, June 11, 2019 - link
too bad the best.. is over priced...
Phynaz - Tuesday, June 11, 2019 - link
Too bad the second best is missing so many features. AMD give you a DXR fallback driver yet? Why not?
Xyler94 - Tuesday, June 11, 2019 - link
Phynaz, realistically, since I assume you have an RTX card, how much do you use DLSS and RTX?
With the low number of games supporting it, and the huge performance hit for a bit better lighting and reflection mapping, it's not a killer feature. I bet less than 5% of people who bought these cards use these features on a regular basis...
Korguz - Tuesday, June 11, 2019 - link
Xyler94, exactly... but the nvidia fanboy Phynaz is too close-minded and ignorant to understand that
Korguz - Tuesday, June 11, 2019 - link
and the main feature of the RTX series?? only used in a handful of games, causes a pretty big performance hit when used, and costs too much for that feature... even the ray tracing fallback driver they released for the non-RTX cards isn't worth the performance hit. i tried it on my 1060.. no thanks.. got anything NEW to counter with?? ray tracing won't be usable for another generation or 2 still... and will probably STILL be expensive...
Horza - Wednesday, June 12, 2019 - link
Well I've exceeded my daily recommended intake of cringe, better move on!
uefi - Monday, June 10, 2019 - link
What's stopping you from buying nVidia when they offer the better price-to-performance and per-watt on the free market? No reason to have an aneurysm over the alternative product you can't see yourself buying.
evernessince - Wednesday, June 12, 2019 - link
"So, have anything to say about this $450 ripoff? Three years later and it’s $50 less than a GTX 1080"
You do realize the exact same could be said of every Nvidia Turing card, right?
Korguz - Monday, June 10, 2019 - link
you should know Phynaz, you are one yourself
mode_13h - Monday, June 10, 2019 - link
Not a gun - they set the price/performance ladder. AMD is just slotting into that. No sense in under-cutting by a lot - it would just hurt their margins probably more than the additional volume could offset.
Phynaz - Tuesday, June 11, 2019 - link
hahaha! Nvidia responsible for AMD’s prices. Clueless and priceless at the same time!
Korguz - Tuesday, June 11, 2019 - link
phynaz.. you really must be dense and blind ...
Spunjji - Tuesday, June 11, 2019 - link
There are two likely options here:
1) They don't know how bad their arguments are and have a serious logic impairment,
2) They know exactly how bad their arguments are and have some form of social impairment
Either way, getting angry with them probably isn't going to help. Best to leave them to stew in their own ill-concealed self-loathing.
gglaw - Tuesday, June 11, 2019 - link
He's obviously craving internet attention. The only way to make it go away is to ignore him. He's wasted too much space in this comments section already, so don't keep giving him more fuel.
Holliday75 - Tuesday, June 18, 2019 - link
What he said.
mode_13h - Tuesday, June 11, 2019 - link
I think you don't understand how business works.
Spunjji - Tuesday, June 11, 2019 - link
Nobody said Nvidia forced AMD to price higher - Nvidia's pricing decisions opened up room for AMD to follow suit; that's how markets work. You couldn't have made your strawman more obvious. Why bother? If you just want to have dumb fights where facts don't matter, piss off to 4chan.
Horza - Wednesday, June 12, 2019 - link
"Fanboys are idiots"
Said without a hint of self-awareness.
eva02langley - Thursday, June 13, 2019 - link
You are labeling yourself as an idiot then? Well, that was kind of obvious.
Yojimbo - Tuesday, June 11, 2019 - link
Haha. This is a funny comment. It's NVIDIA's fault!!
I'm not sure why these cards are so expensive. Maybe it's a statement to their investors. I think Nvidia will release their "super" lineup and then AMD will cut prices rather quickly.
Phynaz - Tuesday, June 11, 2019 - link
Phynaz - Tuesday, June 11, 2019 - link
It’s always the same for the red team, it’s always someone else’s fault, never AMD’s.
Korguz - Tuesday, June 11, 2019 - link
yea.. cause nvidia is perfect
mode_13h - Monday, June 10, 2019 - link
I wonder how much of a role tariffs played.
Anyway, 7 nm is new and I'm sure the price will drop as both it & GDDR6 mature. Remember that Polaris is now mature and selling well below its original list price.
Phynaz - Tuesday, June 11, 2019 - link
Phynaz - Tuesday, June 11, 2019 - link
7nm is over a year old. And if memory prices drop, they drop for Nvidia too.
Meteor2 - Tuesday, June 11, 2019 - link
It’s brand new for big dies.
mode_13h - Tuesday, June 11, 2019 - link
A year is still pretty new for a process node. It probably didn't become economically viable for GPU-sized dies until very recently.
CiccioB - Tuesday, June 11, 2019 - link
Yes, and that's why AMD's balance sheet is so weak at the end of the quarter.
GPU sales are dragging AMD's quarterly results down, as that division is losing a lot of money with respect to the CPU division.
evernessince - Wednesday, June 12, 2019 - link
Lol, we all know Nvidia set the pricing way back when Turing launched. Blaming AMD for pricing set 6 months ago by Nvidia is just asinine.
eva02langley - Thursday, June 13, 2019 - link
And they offer twice the performance... the price/performance ratio is better than the RTX 2060's.
xrror - Monday, June 10, 2019 - link
It's like... serious question here.
Was/are Polaris and Navi actually that bad power/perf wise?
Or
Did nVidia hit it out of the park so hard with Maxwell and Pascal that nobody else can catch up?
Either way it sucks for those of us who game, and don't want to pay >$600 for a tangible upgrade from GTX1070 level and/or actually have usable 4K gaming.
Pity the person who wants a good VR rig.
(and no, this isn't an nVidia shill - I'd love to grab another AMD card, but whoever gets me a 4K gaming card for $400 first is gonna win it)
mode_13h - Monday, June 10, 2019 - link
I think you're onto something. When Nvidia set about designing the Tegra X1, they had to focus on power-efficiency in a way they never did before. When they scaled up to a desktop GPU, this gave them a perf/W edge that ultimately translated into more perf. Just look at the performance gap between Kepler and Maxwell, even though they shared the same manufacturing node!
AMD has taken a couple of generations to wise up. It seems they are still on the journey.
V900 - Monday, June 10, 2019 - link
Yes, pretty much. Maxwell and Pascal were that great, even with NVIDIA using an older/bigger node than AMD. We'll see what Intel brings to the GPU market, though.
As for a tangible upgrade to the 1070, the RTX 2070 is available for $450-500 right now, so no, you wouldn't have to spend >$600.
CiccioB - Tuesday, June 11, 2019 - link
Anyone can catch up, if they are willing to afford the cost of redoing an inefficient architecture. In passing from Kepler to Maxwell, Nvidia deeply redesigned the entire architecture (making it also a bit fatter, so a little more expensive), but they knew that was the thing to do to create a better architecture.
AMD started with GCN in 2012 and is proposing its "Maxwell" in 2019.
And beyond the 7nm process, there are more things they still lack, like all the new features Nvidia put in Maxwell, Pascal, and even more in Turing.
They have just started understanding that memory compression is an advantage instead of wasted transistors. They are about 6 years behind from this point of view.
mode_13h - Tuesday, June 11, 2019 - link
They're definitely not 6 years behind! They introduced tile rendering in Vega, which Nvidia first brought out in Maxwell. So, perhaps more like 2-3 years.
CiccioB - Wednesday, June 12, 2019 - link
On geometry capacity they are 6 years behind. Likewise for memory compression, which allows Nvidia to use about 33% less bandwidth, and which obliged AMD to use expensive HBM on high-end cards to avoid enormous, expensive buses on GPUs that are already fatter than the competition for the same performance.
Not to mention the double-projection feature and the voxel acceleration to better support volumetric lights and effects (which we can see only through GameWorks extensions, as no console engine is designed to support them, because AMD has no dedicated acceleration for them and they would result in a slide show).
Spunjji - Tuesday, June 11, 2019 - link
Underclocking / undervolting experiments have shown that GCN is actually quite competitive in terms of power/perf, *for a given level of performance*. Unfortunately for AMD, Nvidia have been consistently able to hit a higher absolute level of performance, forcing AMD to hot-clock their cards just to keep up. That is absolutely down to Nvidia hitting it out of the park with Maxwell - they nailed architectural efficiency in a way that has clearly taken AMD some time to catch up on, and they managed it with an architecture that scales up extremely well.
Meteor2 - Tuesday, June 11, 2019 - link
Define tangible. We're seeing the same slow-down in performance increases that we've seen with CPUs. Pascal and Maxwell in particular were amazing; the GPU equivalent of Core or the Bridge series x86 cores. AMD has caught up on the CPU side, but not so much on the GPU.
4K 60+ FPS on max AAA settings is extremely hard to do. Nvidia have got there, just, but at what a price. AMD can’t build such a GPU; there’s not enough thermal capacity in a PC case for the amount of power such a Navi GPU would need.
eastcoast_pete - Monday, June 10, 2019 - link
Thanks Ryan! Are the 64 ROPs confirmed?
xrror - Monday, June 10, 2019 - link
2nd this - 64 ROPs is something to get excited about.
Ryan Smith - Monday, June 10, 2019 - link
Yes. It is confirmed.
BenSkywalker - Monday, June 10, 2019 - link
Even if we take AMD's slides as the gospel truth, 10% better performance than the 2060 for 8.6% more money, but with no RTX hardware at all? These parts seem quite a bit overpriced, and that's if the rumors around the 'super' offerings from Nvidia are wrong. If they are correct, or even half as good as the claims, these parts will lose in every metric. And that's assuming this isn't another 'overclocks like a dream' or 'poor Volta' moment.
Fallen Kell - Monday, June 10, 2019 - link
Well, 3 years late to the party, sporting a shiny new coat of paint on last year's performance... I just don't get it. AMD had every advantage for this card, with a significant manufacturing process advantage over Nvidia, and yet they still can't beat what Nvidia had out ~3 years ago (1080 Ti). I can only believe that they have stopped trying. They most definitely didn't try the last 3 generations of cards. I mean, they didn't even make an attempt at a card that could perform in games as well as Nvidia's high-end cards. It's a good thing AMD's cards could at least perform well for compute work, otherwise I have no idea how they have stayed in the graphics card business the last 6 years.
Look, I get it, the real sales are in the mid-range products for stand-alone cards, and in the low-end products on the integrated side. But the thing that drives those sales is the high end. Consumers read, see, and hear all about how Nvidia is the fastest, highest-performance, "best" card all over the benchmarks, and all of a sudden Nvidia is equated with better products. Soon customers are buying that laptop or desktop because it has the Nvidia card in it, and there go the sales on integrated and low-end systems from the big manufacturers. It doesn't matter that AMD might compete at price/performance; the brand itself is seen as second class because they haven't been able to compete with Nvidia on the high end in a decade or more now... I was REALLY hoping that this card/generation was going to be something different. 7nm vs 14nm should have been able to blow away Nvidia's performance, yet it can barely match Nvidia's 3rd or 4th fastest cards...
SaberKOG91 - Monday, June 10, 2019 - link
These are mid-range parts. Wait for Navi 20 with the full 64 CUs and 4096 shaders. It should fall between the 2080 and the 2080 Ti if I had to guess, unless it is on 7nm+, which might net it some even better performance.
AMD had every advantage... until they fell 6 months behind schedule due to a retape in October. We were supposed to have had Navi 10 in January and be getting Navi 20 now. Instead they are playing catch-up and licking their wounds. Stuff happens. Nvidia aren't really trying to get much faster, so AMD still have time to catch up. Closing the gap is a huge first step.
Phynaz - Tuesday, June 11, 2019 - link
You’re saying AMD has moved mid-range to $500. And you’re happy about that.
SaberKOG91 - Tuesday, June 11, 2019 - link
Nvidia moved it there. AMD are just not going to give up the margins this time around. I'm not happy about any of this. But clearly you are incapable of doing anything other than seeing someone who disagrees with you as a fanboy. Which is ironic, because all of your petty little comments make you worse than any fanboy I have ever encountered.
Korguz - Tuesday, June 11, 2019 - link
SaberKOG91, yep.. that he is ...
Phynaz - Tuesday, June 11, 2019 - link
When did Nvidia do that to mid-range? You mean the 2xxx cards that have a ton more features than AMD's cards? Guess what, features cost money to implement.
What AMD gave their fans today was a $50 price reduction and a power usage increase over the GTX 1080.
Wait for Polaris.
Wait for Vega
Wait for Navi
The correct answer has always been buy Nvidia now and enjoy!
SaberKOG91 - Tuesday, June 11, 2019 - link
You mean the consumer cards that Nvidia designed with datacenter features, and sold to you by inventing ways of using them that no one cares about? DLSS only exists so that tensor cores aren't worthless to games. RTX only exists to sell high-margin Quadro cards. The 20 series barely improves on the 10 series for all of the other features of the card. It'll be years before any of the extended features of the RTX cards are actually made mainstream in games. If Nvidia cared about performance, they would have just scaled up the silicon used in the 16XX cards and gotten a huge boost in gaming performance. Instead they stuck a bunch of RT and Tensor Cores onto the die and sold you a workstation card that you can't even take advantage of. So all your old games are barely better and none of your new games can use the new features for a year after launch. And rather than keep up with inflation, they jack the price up a few hundred dollars over last generation and tell you it's all worth it.
You're just too stupid to see how badly they screwed you over.
CiccioB - Tuesday, June 11, 2019 - link
You are saying that introducing new features to advance the market beyond the now-old classic rasterization rendering method, with the costs associated with that, is a wrong thing, and that the best strategy would have been packing in more and more transistors to make the same old things just faster?
If this Navi finally has that geometric boost, we will finally see games with more polygons. Finally. Since Kepler, Nvidia has been able to support more than twice the number of polygons GCN can, and with mesh shading in Turing they can now support more than 10x. But we are stuck with little more than Wii model complexity due to GCN and game engines/assets developed for consoles.
We are way behind where we could be, just because GCN can't keep up with all the new functionality and technology Nvidia has introduced these years.
Spunjji - Tuesday, June 11, 2019 - link
Pro tip, CiccioB - If you're starting a comment with something like "So you're saying", you're clearly signalling to anyone paying attention that you either:
1) Didn't understand what the person was saying, or
2) Are trying to deliberately misrepresent what the person was saying
In this case, you're inferring that the poster said creating new features is bad while assuming that the cost Nvidia have attached to those features is necessary or inevitable.
The truth is that while Ray Tracing will be a great addition to gaming when we have cards that can support it at reasonable performance levels, only the 2080Ti really makes the grade. That card costs significantly more than I paid for my entire gaming laptop. That's *not* a good value proposition by any stretch of the imagination.
Nvidia could have introduced RTX at the ultra-high-end this generation and moved it downwards on the next - at least then we'd still have some good value from their "mid-range" cards. Instead they pushed those features down to a level where they don't make any sense and used that to justify deflating the perf/$ ratio of their products.
Saber's argument is pretty sound - these features only really make sense for AI right now, but Nvidia made a bet that they could get their gamer fans to subsidize that product development for them. It's good business sense and I don't begrudge them doing it, I just begrudge people for buying into it as if it's somehow The Only Right Thing To Do.
CiccioB - Tuesday, June 11, 2019 - link
I do not understand the "intro" to your worthless comment. There have been statements saying that creating new features is not good because it takes a lot of time for them to be adopted, so it is better to spend transistors accelerating what we have now (and have had since someone else introduced new features).
Your statement here is worthless (and clearly expressed under red fanboyism):
They started this generation with big dies to include those new features at whatever level they could, and you claim they did it only to justify something you simply hate, without waiting for the next generation, when the shrink may enable those features to be scaled down to the low-mid level of the market.
You have described something that can be achieved in a generational evolution of the architecture together with a die shrink, but your fanboyism in defense of something that AMD could not achieve in 3 years (since the launch of Polaris) just makes you state that Nvidia is bad because they haven't brought RTX to the mainstream in 6 months.
If RTX is going to take a couple of years instead of an entire console cycle to be adopted, that is also because Nvidia wanted to sell expensive cards with those features.
You are not obliged to buy them; you can just continue buying crappy GPUs on an obsolete architecture that consume twice the power to get the same work done (but without the new features), if that makes you happy.
It's a free market, and accusing a company of selling a more expensive product in an attempt to move the market ahead (rather than grinding it to a halt, as AMD has done since it introduced GCN) is clearly stupid, and just shows that you are angry that AMD, even with 7nm and 3 years of development, didn't manage to get where Nvidia is, both in features (which is not only RTX) and in efficiency.
Because yes, those fat-die, feature-rich Nvidia GPUs can do more work (even without using the new features) in the same W as the new AMD GPUs at 7nm.
The reality is this:
AMD with a process node of advantage can't keep up with Nvidia's efficiency and feature list, and this is the real reason we have high prices: because 7nm is not cheap, and shrinking GCN to get this Navi performance is another flop that will be paid for when Nvidia in turn shrinks Turing to new performance (and feature-rich) levels.
Korguz - Tuesday, June 11, 2019 - link
CiccioB, your " statement is worthless " comment.. is also worthless.. as Spunjji is correct.. nvidia COULD have kept RTX to the ultra high end, say Titan and 2080/Ti, and then made a card for the 2070/2060 that did increase performance over the 10 series for everyone else.. but they didnt.. instead.. they want everyone to pay for the ray tracing development.
" brought RTX to mainstream in 6 months " RTX is NOT mainstream, far from it.. the cards are priced so only those with more money than brains can buy them. which i assume.. is you, Phynaz, due to your constant defending of RTX, and you just need something to justify the price you paid WAY too much for to get an RTX card ...
" this is the real reason we have high prices." WRONG, nvidia put the prices where they are, cause over the last few years.. they keep charging more and more for their cards, when they didnt need to.. but all they were worried about.. was their PROFITS !! look at the comments by nvidia for their earnings calls between 2018 and 2019, now that the crypto mining craze is dead... that alone shows nvidia is only worried about profits..
CiccioB - Wednesday, June 12, 2019 - link
You have a convoluted mind, surely due to the fact that you are a red fanboy that cannot see the facts.
1. There's really no reason at all to introduce a feature like raytracing only on high-end cards that are going to be maybe 5% of the market, when it needs a big enough user base to be supported. It would have been only a way to say "hey, we are here, so AMD, think about it as well and catch up with us next generation".
2. Nvidia has not put a gun to your head to "make you all pay for the raytracing development".
You are free to buy whatever other card without RTX and stay within the cheap budget you have.
3. New features have a cost, and it may shock you, but they have to be paid for somehow by the ones who buy those GPUs. But you are a red fanboy, used to cheap, crappy architectures which have not brought a single advancement over the last 10 years, so yes, you may be horrified by the idea that technological advancements have a cost that has to be repaid.
4. At the end of your worthless rant, you have AMD launching a new generation that is a process node ahead, that still can't reach the competition's efficiency, and most importantly is priced at the same level as the competition without a single new feature introduced (despite the packed math). So now you have to buy expensive crap with no advanced features to get the same performance in classic game engines, but still using more W (or, if you want to play with voltages and clocks, the same power but with a process node of advantage.. yes, that's the advancement we all were waiting for!).
But don't stress. You can still buy the cheap, power-hungry Polaris crap with no new advanced features that AMD has been selling at a discount since the launch of the GTX 680.
That is going to help AMD improve its balance and have more money to invest in the next generation. So that next generation, when the AMD chip gets still fatter for RT support, you can still buy cheap GPUs and not pay for the new features, and again help AMD reach, generations later, the features introduced by the competition years before.
Korguz - Wednesday, June 12, 2019 - link
and you dont ?? face it.. for the most part.. nvidia priced your coveted new feature out of the hands of most people, and even you must admit that ray tracing on anything but a 2080 is almost useless cause of the performance hit.
1: see above
2: stay in the cheap budget ?? um sorry, but maybe you are still living at home, with next to no bills to pay, but some of us have better things to spend our money on, like a mortgage, kids, food, etc... none of the people i know.. have RTX cards, and its because they cant justify the high prices your beloved nvidia is charging for them...
3: i am a red fanboy ?? tell that to the 4 1060s i own, and the 3 Radeon cards i also own, all in working comps.
4: at least amd has priced it A LOT more affordably, so that more people could afford to buy it, without the (for the time being) useless main feature that you cant really take advantage of...
but i will guess.. you are an nvidia fanboy, who loves to pay for the overpriced cards they have made the last few years... who lives at home, and therefore has more money than brains....
CiccioB - Friday, June 14, 2019 - link
You are a clueless AMD fanboy, despite having some Nvidia cards. I'm not for Nvidia at all costs, and there's no doubt that Turing cards are expensive.
But you are just repeating the usual mantra "AMD is better because it has lower prices".
The reality is that it has lower prices because it has worse products.
In fact, now that they believe otherwise (and we'll see if that's true), they are raising the prices.
The fact that Vega (and Polaris too) is sold at a discount price, so that at the end of the quarter AMD has to cover its losses with the money coming from Ryzen, is not a good thing, even though it is good for your low-budget pocket.
It is a sign that the products are so bad that they need very low prices to match their very low value. It's a simple marketing law that AMD fanboys constantly forget. Actually, it is easy to recognize an AMD fanboy (or an ignorant, which is the same) as they constantly use dumb reasons to justify their preferred company without knowing the real effects of the strategy AMD is using.
On Turing, the high prices are due to the large dies. You are not forced to buy those large dies; be happy with your obsolete cheap technology. You think ray tracing won't be useful for another generation or two. If we were waiting for AMD, we would not have it in 10 years, as they have not been able to bring a single technological advancement in 13 years (that's the launch of the TeraScale architecture by ATI).
They just follow, like a dog does its prey. It is easy not to have to invest in new things and just discount products to make them appear economically better than they technologically actually are.
Big dies, more W, low price to stay on par with lower-tier products made by the competition.
You may use all the red glasses you want to look at how things stand with this Navi, but the reality is summed up in 2 simple points:
1. in 2019, with a completely new process node, they matched Pascal's performance/W
2. as soon as Nvidia shrinks Turing, they'll return to the dust, as they deserve, not having presented one new feature on what is actually a redesign of an obsolete architecture that should have died in 2012, instead of being sold at a discount for all these years, making kids like you believe that low price = better products, never looking at the fact that it is a hole in the fiscal quarters.
And then you fanboys constantly speak about lack of money to do this and that. It's all down to the same cause: bad products = low prices = low margins = no money.
They know about this, and they are trying to make money before Nvidia makes its shrinks (which will happen when the new process is cheaper, because Nvidia wants money, not your charity) and before Intel comes out with 10nm solutions (which is a bit further out in time, but they will come, and they will regain the market as they have before).
Korguz - Friday, June 14, 2019 - link
CiccioB, news for you buddy: based on your own replies, and your constant need to insult people.. you are WORSE, and probably just a young punk kid. the way you keep pushing nvidia, and constantly bashing amd, makes YOU a fanboy yourself. FYI... navi matched nvidia's midrange 20 series products.. the holy grail you call ray tracing is only viable on a 2080 or above. i could easily buy a 2080, but i wont.. cause i have more important things to spend my money on. just because you have more money than brains, and love paying nvidia for their overpriced cards.. is your choice, must be nice to live at home with probably no bills, and no financial responsibility. your calling me a fanboy.. fine, go ahead.. doesnt bother me, as we NEED competition, look what intel did to the cpu market.. and you said it yourself: " because nvidia wants money ". thats ALL they care about.. MONEY AND PROFITS. " On Turing, the high prices are due to the large dies " that is part of it, but its also something called no reason to price them lower, cause there is nothing really out there as an alternative, again.. look at intel for proof of this.
Korguz - Friday, June 14, 2019 - link
oh and CiccioB, not once that i can see, have you commented on how bad turing's performance hit is with ray tracing, or how the 20 series isnt that much faster than the 10 series with ray tracing off, for the price you pay. all you seem to be focusing on.. is trying to make nvidia look better in your replies. based on this.. do YOU think the 20 series is a good upgrade to the 10 series ?? if you do.. then you are a blind nvidia fanboy
Fallen Kell - Tuesday, June 11, 2019 - link
Spunjji, there is a reason that NVIDIA put RTX in their midrange cards. The reason is that most sales are midrange and lower. The entire issue with RTX is the chicken-and-egg problem. Game developers won't put in the effort for raytracing if there is no hardware in consumers' hands that can take advantage of it, and consumers will typically not opt to purchase hardware that is not going to be used by any games. NVIDIA is effectively forcing the current chicken to lay a next-generation egg, opening the market for new techniques in games by creating a large enough install base for game companies to do the math and see that there is an existing market for their raytracing games.
Fallen Kell - Tuesday, June 11, 2019 - link
Yes, Nvidia could have simply put RTX on their highest-end cards and waited. And they would be waiting and waiting and waiting for game companies to actually implement raytracing. A gaming studio won't invest the time and effort to retool their game engines for a potential consumer base of a few thousand people. However, if that potential consumer base is a few hundred thousand, or a million, they will take a look at adding the features.
Korguz - Tuesday, June 11, 2019 - link
a lot of good that thinking does.. when most people cant afford the cards, or dont want to pay the cost of entry for those cards. besides.. to make ray tracing usable, you kind of need a 2070, or better yet, a 2080, as the performance hit.. is just too great.
Korguz - Tuesday, June 11, 2019 - link
Fallen Kell, but the prices nvidia is charging for their " mid range cards " are NOT mid range pricing. they have priced their mid range cards more like entry-level high end cards.. mid range would be $400 or less..
Phynaz - Tuesday, June 11, 2019 - link
I’ll bet you do a 180 when AMD has these features
Xyler94 - Tuesday, June 11, 2019 - link
Reading this comment thread, you must be an egotistical idiot or something.
First off, ray tracing needs to be on AMD (and Intel's upcoming Xe) hardware before it will be supported industry-wide. Why waste tons of money on a feature barely anyone will use?
Secondly, RTX is awesome; no one is denying it. But it's also not a reason I'll buy into an RTX card. Sure, I could play Metro Exodus with it if I had that card, but my 980 Ti is still rendering games just fine, so I don't need to upgrade. I may buy a 2080 as my next card because I do use Moonlight, the open-source client for Nvidia's streaming service. But if AMD offers higher performance at a lower price in games without ray tracing, then that'll be my purchase. I've got no allegiance to either AMD or Nvidia. I got a 980 Ti because it was the strongest card at the time. ((Plus I watercooled it. It's darn awesome :) ))
Meteor2 - Sunday, June 30, 2019 - link
Xyler, you read all that?! :-o
eva02langley - Thursday, June 13, 2019 - link
ROFL... it is mid-range... and it is almost matching high-end Nvidia. Get out of here...
Korguz - Thursday, June 13, 2019 - link
if navi supported ray tracing.. was just as slow at it as turing, on par with a 2080 in everything else, and still $100 cheaper.. most of those that are ridiculing amd now... would probably still be ridiculing them ..
Korguz - Monday, June 10, 2019 - link
BenSkywalker these cards arent even out yet.. wait to see where the prices fall when they are released, then see....
BenSkywalker - Monday, June 10, 2019 - link
You think AMD stated prices to make AMD look bad? I'm honestly confused by your statement. Or are you saying after enough backlash they'll be forced to lower prices? Or that if/when the 'super' green parts come out they'll lower the price, before they launch?
Korguz - Tuesday, June 11, 2019 - link
nope.. announced prices.. may not be release prices... they could go up.. or down... just have to wait and see
CiccioB - Tuesday, June 11, 2019 - link
If prices go up, it is bad for us; if they go down, it means that AMD's product is not the value they wanted to sell it at, and thus they take a cut to their margins (as they have since GCN encountered Kepler).
Sorry, but there's nothing to wait and see about whether it evolves into a better or worse situation.
The MSRPs are those announced, and they have to be compared to the competition's at the same performance.
Phynaz - Tuesday, June 11, 2019 - link
Exactly, these bring nothing new. This is AMD fleecing their fans, and their fans are too dumb to notice.
sor - Tuesday, June 11, 2019 - link
Dude, you need to chill. I’m not even in this fight, I can’t remember the last dedicated video card I bought, but you don’t have to hop on every comment chain and say the same crap.
This is a mid-range card. Of course it’s going to be similar performance to the best card from the previous gen. It’s how the product cycles work.
I’m so out of it that I had to go look up the pricing on RTX 2070. Looks to be $549 on high end and $499 on low. If this new card is a “few percent” better and priced at $449 it seems reasonable, objectively. The price difference could reflect the feature disparity people are alluding to.
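The arithmetic behind that comparison is easy to check with a quick sketch. The prices come from the thread; the relative-performance figures are illustrative assumptions, not benchmarks:

```python
# Perf-per-dollar sanity check. Prices are from the thread; the
# relative-performance figures are assumptions for illustration only.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance delivered per dollar spent."""
    return relative_perf / price

rtx_2070 = perf_per_dollar(1.00, 499)   # RTX 2070 at its lower $499 price
rx_5700xt = perf_per_dollar(1.03, 449)  # "a few percent better" at $449

# At these assumed numbers, the $449 card wins on perf/$.
assert rx_5700xt > rtx_2070
```

With those assumptions the $449 card delivers roughly 14% more performance per dollar (1.03/449 vs 1.00/499), which is the "seems reasonable, objectively" point being made.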
Korguz - Tuesday, June 11, 2019 - link
oh like nvidia has done over the last few years with the pricing of their own cards ? come on, unless you have more money than brains.. do you like the prices of nvidia's cards?? i know a few people who hate what nvidia charges for their cards, THAT is the rip off ....
wr3zzz - Monday, June 10, 2019 - link
Is it just me, or has the price-performance progression been linear starting from last year?
awehring - Monday, June 10, 2019 - link
7nm is quite an impressive technology: 251mm2 with 10.3 billion transistors.
Let's put this into perspective. Take a normal hair (0.05 mm thick) and cut it. At the cutting surface you would have about 80'000 transistors.
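That back-of-the-envelope figure can be reproduced from the die specs above, assuming a 0.05 mm hair diameter and uniform transistor density:

```python
import math

# Navi 10: ~10.3 billion transistors on ~251 mm^2 (figures from the comment).
transistors = 10.3e9
die_area_mm2 = 251.0
hair_diameter_mm = 0.05  # assumed hair thickness

density_per_mm2 = transistors / die_area_mm2            # ~41 million per mm^2
hair_area_mm2 = math.pi * (hair_diameter_mm / 2) ** 2   # ~0.00196 mm^2
transistors_on_hair = density_per_mm2 * hair_area_mm2

print(round(transistors_on_hair))  # roughly 80,000, matching the comment
```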
neblogai - Tuesday, June 11, 2019 - link
Did not know that hair has transistors in it.. /j
Meteor2 - Sunday, June 30, 2019 - link
That is (genuinely) amazing. Great example
V900 - Monday, June 10, 2019 - link
No HDMI 2.1, no hardware raytracing. Disappointing.
Navi is an alright card, but roughly the same performance at roughly the same price as an Nvidia RTX 2060/2070 doesn't cut it.
And Nvidia's cards are a year old at this point, and built on a bigger node.
Navi is an alright card, but not the savior that Radeon needs at this point.
Qasar - Monday, June 10, 2019 - link
V900, not really.. at the moment.. how many games even use ray tracing ?? not worth the price of admission to get an RTX card.... they are WAY too expensive...
oleyska - Tuesday, June 11, 2019 - link
and definitely no point in buying into RTX with the 2000 series unless you're a developer. With the RTX 3000 series it might be a selling point, if performance increases enough.
Spunjji - Tuesday, June 11, 2019 - link
This is the point so many people seem to be missing / are deliberately trying to talk past. The RTX 2000 series is basically dev hardware being marketed to gamers.
Korguz - Tuesday, June 11, 2019 - link
Spunjji... yep... but the same could be said for most of the new features nvidia and amd/ati have added to their cards over the years.. 1st gen.. dev hardware, 2nd gen, usable performance.. transform and lighting way back when was like that ....
Much better choice than a 2060 at least.zodiacfml - Monday, June 10, 2019 - link
Yawn. These cards are probably comparable to their Nvidia counterparts at 1080p while consuming more power. Same old story. They don't have ray tracing hardware. The launch price is going to drop pretty fast.
Oxford Guy - Monday, June 10, 2019 - link
Navi was probably designed around the consoles so it didn't need to be all that competitive. Consoles are propped up by smoke and mirrors. How else can one convince the public to accept Jaguar CPUs for gaming?
Phynaz - Tuesday, June 11, 2019 - link
These cards are comparable to Nvidia cards from three years ago, even the price.Gastec - Tuesday, June 11, 2019 - link
So are the RTX 2070 and 2080, comparable to 1080 and 1080Ti but more expensive :)
CiccioB - Tuesday, June 11, 2019 - link
If you look at the standard features, yes they are. But Turing packs a lot of new features that the old-gen GPUs do not, and if those features get used, Pascal cards (and even more so GCN/Navi) will be left in the dust.
Spunjji - Tuesday, June 11, 2019 - link
By the time a significant number of games come out that make proper use of these features, the current cards will be old news. This is usually true for most new graphics technology (see the GeForce 3 and DirectX 8) but I don't think it's been true to this extent and at such a high cost of entry before. We're in new territory here.
CiccioB - Tuesday, June 11, 2019 - link
We are in a territory where the console monopoly is lowering the technical level the market can reach.
If just one of the two consoles were powered by Nvidia hardware, AMD would already be in the dust, and highly regretted by its buyers, while new game engines would exploit all the new features Nvidia has put in its GPUs since Maxwell.
In fact, we have a market where AMD struggles to keep up with Nvidia hardware even though all the optimizations and choices are made for games to run best on its hardware.
And we still have the same geometric complexity as in 2012.
Korguz - Tuesday, June 11, 2019 - link
maybe nvidia is just charging too much to put their GPU tech in a console.. how much would it cost to put even a 2060 into a console ?? i would guess 75% of the cost of the console would just be the GPU....
CiccioB - Wednesday, June 12, 2019 - link
Maybe because Nvidia didn't want to sell at discount prices, as AMD did just to avoid being annihilated in the gaming market?
Think about it: two consoles, one with Nvidia and one with AMD hardware. Now just think which one would have better graphics and features.
Korguz - Wednesday, June 12, 2019 - link
or maybe because.. ALL your precious nvidia cares about.. is its PROFITS.. or are you too blind of an nvidia fanboy to see this aspect ??? my guess.. you are... and.. have more money than brains. " two consoles one with nvidia and one with AMD HW. Now just think which one would have better graphics and features. " ok.. and how about you think.. which one would cost A LOT more than the other, because it has nvidia inside it ?? ever think about that ?? as YOU said yourself: " New features have a cost ", and while its easy to pass that cost on to the consumer with a discrete video card.. that wont work in the console market. but when you think with your wallet.. and not your brain.. this is what happens...
CiccioB - Friday, June 14, 2019 - link
All companies care about their PROFITS, and it may shock you that AMD does too!!! They all sell their products at whatever price maximizes (price × potential units sold at that price).
Guess what?
Nvidia can charge a higher price because their products have higher value.
AMD has to charge a low price because they sell crap HW that has value only when discounted.
It's a basic economic law.
Now that you have learnt it, you can return to playing BF5 at 120 FPS, but with only 10 polygons on screen, with your Vega card.
Korguz - Friday, June 14, 2019 - link
"Higher price because their products have higher value"?? BS... they charge that because they can, and just want profits. But you don't understand that, because you are just a young punk kid with more money than brains, with no financial responsibility, living off mommy and daddy... When you grow up and get a mortgage, car payments, and kids, then maybe you will start to understand the value of money and realize that spending 1200 or more on a video card is not as important as being able to feed your children and provide a roof over their heads, fanboy.
Dribble - Tuesday, June 11, 2019 - link
Well, that's wrong - the features have only been out for months and there are already several games using them extensively, with significantly better visuals as a result. That's a way better adoption rate than most DX versions. In addition, the early adoption has pushed ray tracing into the next-gen consoles in some form - you can bet that if Nvidia hadn't released RTX they would have none. Now AMD is scrambling to put something in Navi 2 (or whatever the consoles get) because the console makers are both demanding it. You can argue the performance isn't there yet (same for pretty well every major new graphics feature), but you can't really argue that RTX hasn't hit the ground running and had a pretty big impact.
Korguz - Tuesday, June 11, 2019 - link
Too bad that impact is on everyone's wallet, Dribble :-) :-)
Beaver M. - Tuesday, June 11, 2019 - link
Would still pick the 1080 Ti over any Turing, because it makes much more sense, even now. But Nvidia was smart enough to axe it as the only Pascal card. They knew their 2080 was too crappy to leave it on the market.
jabbadap - Tuesday, June 11, 2019 - link
How about VirtualLink? Does it have that? Is it included in the TDP, or is that card-powered only, like Nvidia?
akyp - Tuesday, June 11, 2019 - link
The regression in perf/$ is simply disgusting. When my 970 gives up I might as well go APU and hope Google Stadia is actually good. Too bad the Ryzen APUs are a full generation behind.
zodiacfml - Tuesday, June 11, 2019 - link
About to say the same thing, but remember these cards are comparable to the Vega cards, not the RX 480/580/590.
RavenRampkin - Tuesday, June 11, 2019 - link
I liked what I saw, in contrast with the $749 Ryzen part (even though that's revolutionary stuff right there, our wallets are still doomed -_-). Don't take the hype train bait and it'll be twice as difficult to disappoint you. Call me an AMDtard fanboy, I don't mind ¯\_(ツ)_/¯
(On the topic of all the megafeaturez: not believing in wide future adoption of all those DLSSes doesn't make a person a fanboy. That's some Elon Muscus level shtick imo. Same way, you'd probably call me a fanboy for bemoaning the low popularity of numerous other -- open-source -- RTG goodies. Looks like AMD decided not to bemoan any longer and go freestyle mode, no matter the performance and competition. Meh? Meh. But product quality isn't always measured in ad banner space and use in pre-builts, winkity wink to the "yarr Vega sux!!!111" gang and RTX's Witnesses. Vega was never bad and is certainly no worse than at launch at its current prices: the 56 starting at $200 used, $270 new; the 64 at $270 used, $330 new (Lithuanian used market, U.K. stores for new units). Don't want Vega or Pascal or Polaris? The world of RTXes -- where the 2060 and 2070 just barely, and with a lot of effort, reached 1060 and 1070 MSRP, in select stores only, while the 2080 (Ti) is still cosmic -- is waiting for you. /offtop)
nils_ - Tuesday, June 11, 2019 - link
Disappointed to see that they still can't keep up with high-end Nvidia cards. I'm planning to get a new workstation / gaming rig and I would have liked to go full AMD, but for the gaming part I would still want an Nvidia card. This leaves me with few options, since I mostly use Linux and boot into Windows for gaming, and there is no good Linux driver for Nvidia (only their binary release, which is a pain in the ass to use). These are my options:
1) Go full Intel (i9-9900KS) + Nvidia (RTX 2080 Ti or successor) - excellent graphics support under Linux with the iGPU, excellent graphics performance
2) Go mixed: high-end Ryzen (3950X), get an RTX 2080 Ti and a low-end GPU for Linux, possibly sacrificing a few PCIe lanes in the process
Price-wise the latter is probably more expensive. Or I could wait for Intel to release a new desktop CPU...
scineram - Tuesday, June 11, 2019 - link
What about Radeon VII?
nils_ - Tuesday, June 11, 2019 - link
Great suggestion, that might work as well; though not as fast as the RTX 2080 Ti, it's probably fast enough. I'm wondering about the power use in a desktop scenario though; that's usually better with the iGPU (and disabling the dGPU).
Xyler94 - Tuesday, June 11, 2019 - link
To be fair, Radeon VII was never meant to compete against the 2080 Ti, which costs nearly double what the VII does. If you've got the money, go for the Ti; otherwise, make a decision based on what's most important to you, not what an Nvidia or AMD fan tells you to buy.
Korguz - Tuesday, June 11, 2019 - link
keep in mind... and as i mentioned in a previous post... there very well could be a rx 5800/5900 series, and a rx 5600 series still to come...
nils_ - Tuesday, June 11, 2019 - link
Money really isn't the object here. It probably makes more sense to go full AMD just to not further support Nvidia and their asinine Linux driver policy. I presume one could use two of these in Crossfire mode.
Xyler94 - Tuesday, June 18, 2019 - link
Do keep in mind that the VII has 16GB of 1TB/s HBM2 memory on board, which will be a huge boon in certain workloads if you're targeting professional use. The 2080 Ti tops out at 11GB of GDDR6 at 616GB/s - a bit over half the bandwidth.
BenSkywalker - Tuesday, June 11, 2019 - link
The latest Ubuntu build has the Nvidia binary drivers included, pretty easy to use (although they don't install by default). Also, the Nvidia binary drivers, at least for gaming, are significantly faster than their AMD counterparts.
nils_ - Tuesday, June 11, 2019 - link
I run a vanilla kernel, usually the most current stable version, and that always causes trouble with out-of-tree drivers. I don't play games under Linux.
dr.denton - Wednesday, June 12, 2019 - link
Navi 10 is a relatively small mid-range chip; it was never intended to take on the 2080/Ti. That will come in 2020 with the next iteration of Navi.
eva02langley - Thursday, June 13, 2019 - link
ROFL... mid-range... almost matching high-end competition... blind fanboyism at its best.
yhselp - Tuesday, June 11, 2019 - link
251mm² for $450... :(
Phynaz - Tuesday, June 11, 2019 - link
Yup.
eva02langley - Thursday, June 13, 2019 - link
Just go back to WCCF man...
R3MF - Tuesday, June 11, 2019 - link
Re: the display controller remaining a little ~2018: do you think the display controller might get an update for HDMI 2.1 / VRR for low-end Navi later this year?
Later release, so more time to introduce improvements, and more necessary for a part that will be attractive to the home theatre market...
TheUnhandledException - Tuesday, June 11, 2019 - link
HDMI 2.1 hardware is still really expensive and power hungry. Given the dearth of HDMI 2.1 displays, and the fact that these cards wouldn't even do 30 fps at 8K, I don't see the lack of HDMI 2.1 being a big deal. DP 1.4 is far more useful for these mid-range cards.
R3MF - Wednesday, June 12, 2019 - link
Sure, but it would be nice to get the bandwidth necessary to do 4:4:4 chroma / 10-bit colour / HDR / 4K at 60Hz.
Or even use VRR to let the framerate roll between 40 and 80 fps.
HDMI 2.1 will be normal on midrange 4K (+FALD) TVs next year; a shame for small Navi to miss the marketing boat.
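For what it's worth, a quick back-of-the-envelope check of that bandwidth wish (assuming the common CTA-861 4K timing of 4400×2250 total pixels including blanking, and the published per-lane link rates; the figures are approximations, not from the article):

```python
# Rough check: does 4K60 with 4:4:4 chroma and 10-bit colour fit in HDMI 2.0?
# Assumes the common CTA-861 4K timing (4400 x 2250 total, incl. blanking).
total_pixels_per_frame = 4400 * 2250
refresh_hz = 60
pixel_clock_hz = total_pixels_per_frame * refresh_hz   # 594 MHz
bits_per_pixel = 3 * 10                                # 4:4:4 at 10 bits/channel

required_gbps = pixel_clock_hz * bits_per_pixel / 1e9  # ~17.8 Gb/s of video data
hdmi20_effective_gbps = 3 * 6.0 * 8 / 10               # 3 lanes x 6 Gb/s, 8b/10b -> 14.4
hdmi21_effective_gbps = 4 * 12.0 * 16 / 18             # 4 lanes x 12 Gb/s, 16b/18b -> ~42.7

print(f"needed: {required_gbps:.1f} Gb/s")
print("fits HDMI 2.0:", required_gbps <= hdmi20_effective_gbps)
print("fits HDMI 2.1:", required_gbps <= hdmi21_effective_gbps)
```

So 4K60 4:4:4 with 10-bit colour needs roughly 17.8 Gb/s, which overflows HDMI 2.0's ~14.4 Gb/s effective rate but fits easily in HDMI 2.1 - hence the wish for the newer display controller.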
ET - Tuesday, June 11, 2019 - link
If AMD is going back to suffixes, it should have at least gone for something more interesting, like a palindrome. RX 5775 XR would have been quite cool.
Oxford Guy - Tuesday, June 11, 2019 - link
You're more creative than they are.
MDD1963 - Tuesday, June 11, 2019 - link
That GPU looks like someone took it to the gym in a gym bag and accidentally dropped a 45 lb plate on it...; the dented look is horrible, IMO... Fire the entire aesthetics team!
Psycho_McCrazy - Tuesday, June 11, 2019 - link
Ryan, can I request that you include UW (3440×1440, 3840×1600) resolutions in GPU reviews, starting with the Navi reviews?
gijames1225 - Tuesday, June 11, 2019 - link
Well, I was let down by a lot of this, given the pricing. I suspect that after launch, though, we'll see a pricing war between AMD and Nvidia like in the days of old, and these may wind up at a better price.
andrewaggb - Tuesday, June 11, 2019 - link
Unfortunately, I think Nvidia has the better graphics offerings by a wide margin and got there first.
zodiacfml - Tuesday, June 11, 2019 - link
Not really, but AMD calculated the prices for these cards carefully (as usual), making it a dilemma to choose between AMD and Nvidia. The 7nm process has no benefit to the user unless we undervolt or underclock these cards. The price differential comes down to whether Nvidia's RTX hardware adds value or not. AMD has created the same old story with their GPUs, unlike the Ryzen CPUs.
eva02langley - Thursday, June 13, 2019 - link
With RTX? Is that a joke? Because performance-wise, AMD is beating them at mid-range.
GreenReaper - Tuesday, June 11, 2019 - link
Nothing on the new video codec blocks (Video Core Next++)? I hope they slot AV1 support in before the 2020 APUs. I want to buy something that can at least play streaming video without spinning up the fans for the next 10 years, and it looks like Netflix, YouTube et al. want to move off patent-encumbered formats as soon as possible.
levizx - Wednesday, June 12, 2019 - link
Then you should wait another 2 generations. AV1 hardware acceleration is pretty much essential if you want to be future-proof.
Krysto - Wednesday, June 12, 2019 - link
It would be a HUGE failure on AMD's part if Navi+ doesn't bring at least AV1 decode acceleration to the PS5/Xbox Scarlett. And at least some sort of encode acceleration should be in there too, because game streamers will badly need it. Without AV1 support, the PS5/Xbox Scarlett will basically not be future-proof. You can refuse to believe it all you want, but it's true.
Meteor2 - Sunday, June 30, 2019 - link
Absolutely true.
hubick - Tuesday, June 11, 2019 - link
LG has their 8K SM99 75" HDMI 2.1 TV imminent, which sounds like it might only make you choose between it and a motorcycle, not a house, for the price, so I'd really like to see an HDMI 2.1 card to drive it. AMD fail :-(
Oxford Guy - Tuesday, June 11, 2019 - link
Keep in mind how small the die is for Radeon VII. People are duped by the inclusion of HBM2. You're not really getting a super-powerful prosumer chip with Radeon VII; you're getting Vega recycled on a small die. I also think it's lame for people to get too excited over the replacement for Polaris, a midrange product at best that's years old. I wasn't excited about Polaris to begin with. All it was was AMD pushing past the efficiency curve for its LPP process on a small die. Big deal.
Oxford Guy - Tuesday, June 11, 2019 - link
Anandtech should really highlight die size in these reviews like it used to, to give people better perspective on what they're really getting and how it compares with the past.
webdoctors - Tuesday, June 11, 2019 - link
They already have the transistor count in the table above; that's sufficient.
Threska - Tuesday, June 11, 2019 - link
Why do that when others do it for them?
https://www.techpowerup.com/gpu-specs/
Oxford Guy - Tuesday, June 11, 2019 - link
And Vega was Fiji recycled on a smaller node with somewhat upgraded VRAM. As I recall, the IPC of both was basically identical. "Reissue, repackage, repackage." — Morrissey
Korguz - Wednesday, June 12, 2019 - link
Oxford Guy, Nvidia has done the same "rebranding" of their cards as well...
Phynaz - Wednesday, June 12, 2019 - link
“Both sides” lol
eva02langley - Thursday, June 13, 2019 - link
WCCF post quality...
peevee - Wednesday, June 12, 2019 - link
So, no competitor for the GeForce GTX 1660 then?
cmdrmonkey - Wednesday, June 12, 2019 - link
These cards are dead on arrival at these prices. AMD just doesn't have the kind of brand recognition that Nvidia does these days, and these cards aren't cheap enough that anyone is going to go with AMD again.
Qasar - Thursday, June 13, 2019 - link
And how cheap should they be? My guess: they would never be cheap enough... So far I know a few people who are interested in these cards and are waiting till next month to see how they perform. Why? Because Nvidia priced them out of the market...
Hixbot - Friday, June 14, 2019 - link
Nvidia's RTX cards were overpriced due to the lack of competition at the time of their launch - performance per dollar no better than the 3-year-old GTX 1000 series. AMD finally releases their new generation, and their prices do not significantly undercut Nvidia's RTX, especially considering the lack of hardware ray tracing. This is disappointing to see as a consumer. We expected AMD to drive GPU performance per dollar upwards the way they have done for CPUs.
Korguz - Friday, June 14, 2019 - link
Hixbot, and what if AMD did have ray tracing, was just as slow as Turing, had the same features, but was priced less than what Nvidia charges? Then what? Would those who are harping on AMD now still be harping on them? Of course, but now they would be complaining that AMD had a year more development time for ray tracing and the performance is still the same. Let's face it: ray tracing isn't really viable performance-wise unless you are using a 2080 or higher; the hit is just too great to make it usable. Most of the people I know don't care about ray tracing right now - it adds too much to the cost and is barely used. None of them even asked if Navi had ray tracing; again, because of Nvidia's pricing, they are priced out of the ray tracing market.
Hixbot - Saturday, June 15, 2019 - link
I'm not following your point. Nvidia's terrible pricing is not being challenged by AMD. If it were, I think they would get a lot of credit from myself and others, just as they are in the CPU market. Nobody is praising Nvidia here; I was just expecting more value from AMD.
Korguz - Saturday, June 15, 2019 - link
Hixbot, I think AMD would still be harped on, and what I mean is, they would still be criticized like they are now for not having ray tracing, and because of the price of the cards. Instead, I think the common complaint would be: AMD had an extra year to work on their cards, and all they can do is match the performance? They suck, and their cards are still overpriced...
BenSkywalker - Saturday, June 15, 2019 - link
So what performance level are you seeing in your games with RTX on using your 2070, and where do you think it should be? I just played through Quake 2 RTX on a 2060 and I thought it was great. Been playing Metro Exodus and Tomb Raider with ray tracing on too, and not having problems with either of them. Minecraft isn't my thing, but my kids think the ray tracing in that game is great too (that one will run on older hardware).
Part of the disconnect in this conversation is people talking about how bad performance is at 4K ultra with ray tracing, when mid-range cards can't run these games at those settings without ray tracing anyway.
Korguz - Saturday, June 15, 2019 - link
BenSkywalker, I don't have any 20-series cards; I have a 1060. But going by the reviews, and word of mouth from those who do have a 20-series card, it seems the performance isn't there. Maybe that's it - resolution of 1440 or higher - but it seems if you mention you play @ 1080p, you get made fun of.
BenSkywalker - Sunday, June 16, 2019 - link
I'll say that all the people I know IRL who have actually tried ray tracing on RTX hardware, in games they actually play, thought performance was solid on a 2060. Now, to be fair, no competitive online shooters were used, just actual gamers playing actual games. As an example, Tomb Raider at max settings without ray tracing at 1440p performs close to identically to 1080p with ray tracing at ultra settings. You can also run RT at medium at 1440p and be quite playable (mid 40s to mid 50s), but between those two settings, double blind, everyone I had compare said 1080p with ray tracing was much better (full disclosure: I picked a spot with ray-traced shadows on screen). "But my tournament-level frame rates are down"... as opposed to what image-enhancing technology?
Five years from now it'll be a joke that anyone argued against it in the first place, and most of those who are, rabidly, will deny they ever did.
Korguz - Sunday, June 16, 2019 - link
BenSkywalker, what games are you trying RT with? I'm not sure, but Tomb Raider isn't really all that taxing, is it? I would assume the "competitive shooters used online" you mention may be a lot more taxing, and the performance may not be all that bearable with RT on, on a 2060. I'm not arguing against it, but maybe, unlike for some, for the price it just isn't worth it... yet. I am going to assume you are in the US, but for those of us in Canada (and maybe other parts of the world as well), take your US prices and add 200 at the low end to around 500 at the top end, and decide if the prices are worth it. 2060s here start at $500 (when not on sale) and go as high as $2100. For a lot of those I know, it just isn't worth it, as we have bills to pay, kids to feed, etc...
BenSkywalker - Monday, June 17, 2019 - link
Metro Exodus, Tomb Raider, Minecraft and Quake 2 RTX. Take Metro Exodus: fps are higher using 100% shader resolution with ray tracing on than with ray tracing off and shader resolution maxed out. Digital Foundry made a video about Quake 2 RTX that really gives a great example of the impact (the performance drop is *huge*, but so is the visual impact). So you say it isn't worth the premium; in no way whatsoever would I say that's wrong. I know first-hand that readily available disposable income isn't always sitting around in large quantities, but what about when there is no premium?
That's what I'm seeing with this launch. The 5700 is more expensive than the 2060, barely edges it in traditional rendering and doesn't give the option to play with ray tracing *at all*.
There's only a handful of games and it's a big performance hit, both completely valid, but the pricing issue is kind of out the window now that AMD has decided to avoid the value position.
Korguz - Monday, June 17, 2019 - link
BenSkywalker, looking at the reqs for Metro Exodus, I'm a little surprised it runs that well for you :-) But to then add in a 22(?)-year-old game that would run (exaggerating here) at 400 fps on modern hardware, add RT to it, and have it run at "only" 200 fps, is a little moot... What would the performance be - and I don't mean patching in RT support - if the game were made for RT from the start? The friends I talked to, and I think this is part of the reason they don't think it is worth getting a 20-series card yet, don't play any of those games (Metro Exodus, Tomb Raider, Minecraft, Quake 2 RTX), so the money spent on RT and the other features the 20 series brings to the table wouldn't be used. To compare the 5700 series to the low-end 20 series is also a little lopsided, as AMD is aiming these two cards well above that, so comparing the 2060 series to, say, a 5600 or even a 5500-type series might be a little more even. But it really comes down to how much one is willing to spend on a video card, how much one can afford, and whether the games one plays would see any benefit from that purchase. One of the friends I talked to has a lot of disposable income, and even he says the 20 series isn't worth the cash right now :-)
BenSkywalker - Monday, June 17, 2019 - link
The canned bench from Metro Exodus is like a torture test; the game runs quite nicely. Quake 2 is actually 32 years old, and you were way off on your estimates (think more like 1.5k down to sub-100), but I'd say watch the Digital Foundry video.
Now, the rest of your comments directly refute AMD's claims. AMD is calling the 5700 a 2060 competitor, saying it is 10% faster for 8.5% more money. That's not my interpretation; that's what they have come out and said. AMD is saying, based on their hand-picked benches, that they are going to charge almost exactly the same $/fps as Nvidia but with no option for ray tracing. Again, this isn't me spinning anything; it's all in their slides, their words.
Is it unreasonable to assume AMD chose benches that made them look slightly better than average? Combine that with the price premium and you may find that, in the real world, a factory-overclocked 2060 at the same price as the 5700's MSRP is dead even with AMD, but with the option of playing with RTX.
BenSkywalker - Monday, June 17, 2019 - link
That should say Quake 2 is 22 years old.
Korguz - Tuesday, June 18, 2019 - link
I said 22(?) years old, same as you. My estimates? You mean my exaggerations of how many FPS Quake would get on modern hardware? I just wasn't sure about the age. Maybe, but that is just my thought too... We will have to wait to see what street prices turn out to be when these are released; they could very well be less than what AMD has said. Well, that's what those I know said when I asked: they don't play those games, so they see getting an RTX card as a waste of money. But consider that they are comparing what they have now to getting an RTX card, and it could very well be a waste. I am sure, if they did play games that have RT, they might be considering it. Again, it really comes down to the games that have RT and whether one plays them. For myself, and the games I play, I would be wasting my money. But I have always wondered how one game I played would run on newer/better hardware than I currently have, as it always brought my computer to its knees: my current 1060 Strix with the 5930K @ 4.2 GHz, vs a 1080 Ti or the 20 series/Radeon VII/5700 XT with a Zen/Zen 2-based CPU or a much newer Intel CPU, in a game called Supreme Commander... I read a review of it ages ago, and if you didn't have at least a dual core, don't even bother. With mine, on the HUGE maps it has, with 1000 units per side (up to 8 sides), after about 20 mins I'm turning the game speed up to max and lowering the eye candy down to at least the middle...
BenSkywalker - Tuesday, June 18, 2019 - link
You had the question marks next to 22 years; I see now those were incredulous modifiers. Also, on your estimates: my frame rate goes from 1,460 FPS to roughly 60 FPS, so it's much worse than 400 to 200. In no way am I asserting you are making a bad decision for you, not even close. What I'm saying is: why would someone choose the 5700 over the 2060? Ignoring ray tracing altogether, they are close to identical in performance per dollar, so with one you can play around with it if you want; with the other you can't.
If you are saying neither is worth it, that's a completely valid argument and I wouldn't argue it.
Korguz - Wednesday, June 19, 2019 - link
Heh, my exaggeration was a little off for how fast Quake would run ;-) Well, until these cards are out (the 5700 series) it's hard to say if they will be priced that close together. Currently the 2060s are priced between 500 and 570; even if the entry-level 5700 is less than 500, it could be a better buy. But we won't know for sure for a few more weeks...
Right now, I guess in a way I am, because they don't provide a big enough performance increase over what I, or those I know, currently have. Myself, I would need to go to at least a 2070/5700 XT.
BenSkywalker - Wednesday, June 19, 2019 - link
Newegg.ca has the Asus Phoenix 2060 for $449, and a large selection for under $480. At a direct currency exchange the 5700 will be $508 in Canada.
Korguz - Wednesday, June 19, 2019 - link
Ahh yes, the Newegg angle. Are you also factoring in potential shipping costs? Currency exchange is a moot point, as that changes daily and can vary. I will wait to see what prices are when released, then compare...
BenSkywalker - Wednesday, June 19, 2019 - link
BenSkywalker - Wednesday, June 19, 2019 - link
That's Newegg Canada, not the U.S. - is shipping different for some reason? There is no currency conversion on the newegg.ca site if you're in Canada. Obviously waiting for reviews will give us a fuller picture; my issue is that what they've claimed just doesn't seem to offer any compelling reason to buy it over the competition.
Korguz - Thursday, June 20, 2019 - link
BenSkywalker, I was referring to newegg.ca in regards to shipping, but it seems they have free shipping on that GPU at the moment. Either way, it's still worth checking whether Newegg is actually cheaper than a local store, as shipping could negate the lower initial price. As for currency conversion, that was in regards to the 5700, as there are no current CDN prices... yet...
powerwiz - Tuesday, June 18, 2019 - link
yay
Slashchat - Tuesday, June 18, 2019 - link
and where are the 5800 XT and 5900 dual-Navi XT?
m16 - Sunday, June 23, 2019 - link
I'd love to get one of these. Nvidia has basically had the market for too long, and they now force you to have an account and telemetry enabled to get gaming driver updates and optimizations. They've really not earned any good karma with me or anyone who cares about privacy.
Qasar - Monday, June 24, 2019 - link
"they now force you to have an account and telemetry enabled to get gaming driver updates and optimizations" - they do? Must be on the 20 series, because I don't have that issue with the 1060 I have...
peevee - Friday, June 28, 2019 - link
Please explain why the RX 5700 has almost twice as many transistors as the RX 590 while having essentially the same performance (and number of ALUs) - the tiny difference in FP32 throughput can be entirely explained by the higher boost clock, and then some. Where did the extra 5B transistors go?
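A quick sanity check of the throughput half of that question, assuming the usual peak FP32 rate of 2 FLOPs per ALU per clock; the shader counts, boost clocks, and transistor counts below are approximate public figures, not taken from this article:

```python
# Peak FP32 = ALUs x 2 FLOPs/clock x boost clock; both chips have 2304 ALUs.
def fp32_tflops(alus: int, boost_ghz: float) -> float:
    return alus * 2 * boost_ghz / 1000.0

rx590 = fp32_tflops(2304, 1.545)   # Polaris 30, ~5.7B transistors
rx5700 = fp32_tflops(2304, 1.725)  # Navi 10, ~10.3B transistors

print(f"RX 590:  {rx590:.2f} TFLOPS")   # ~7.12
print(f"RX 5700: {rx5700:.2f} TFLOPS")  # ~7.95
print(f"ratio:   {rx5700 / rx590:.3f}") # the ALU counts cancel: it's just the clock ratio
```

Peak FLOPS really do track the clock alone here, so presumably the extra ~5B transistors went into the RDNA front end, the reworked cache hierarchy, and per-clock efficiency rather than raw ALU count.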