Anandtech review out on time? What is the world coming to???

That was a nate review. Ehm, I mean neat.

Oh, I get it!

Oh, nate! I mean, oh neat!

Technically a day late if you count Kyle B's review ;)

BUT I WANT MY REVIEWS TO HAVE LOTS OF EGREGIOUS TECHNICAL DETAIL AND DROP 2 WEEKS LATE! :-p

Apparently it's come to flying around the globe to tech conferences, reporting on news rather than actual product.

Waa! Waa! My free entertainment isn't exactly what I want it to be, time to get on my keyboard and complain about it. Grow up. Or go elsewhere. Anandtech is doing a great job given the constraints they operate under.

Waa! Waa! This comment isn't exactly what I want it to be, time to get on my keyboard and complain about it. Grow up. Or go elsewhere. imaheadcase is doing a great job given the constraints they operate under.

Like... being a headcase? >_> Sometimes the flying is part of getting the product to review. Other times... well, trade shows and the like are sometimes compensation for the level of pay provided by an online news reporting gig. This isn't the glory days of Personal Computer World, with issues 600 pages thick with ads.

I can't wait for the GTX960 review to finally come out so I can see how it compares!!

:-D
Should have been compared to the 1070 Ti FE imo; they don't even sell the 1070 anymore, at least not in the Nvidia store in my country. The 1070 Ti FE is the best bang for your buck right now IMO, if you don't want/need the ray tracing capability and play at 1920x. Personally, I recently bought the 1070 Ti FE and have been quite happy with it. It only cost 75% of what a 2070 FE costs, and it maxes out my FPS at the resolution I play (1920x1200).
Just because the 1070 isn't sold in "the nvidia store in your country" doesn't mean it has ceased to exist, or that it is no longer the market the 2070 is now targeting.
Well, that might be the case. But it does not change the fact that the 1070 Ti is much better value for money than the 1070 (and the 2070, for that matter). The review not including the "in between" option is not helpful for prospective buyers... all IMO, of course.
"1070 TI is much better value for money then 1070" Taking the Techpowerup summary of the relative performance (compared to a 2080Ti) and running it through the cheapest model available from a respectable retailer in Germany, the 1070 is less than .15% better performance/€ in 1080p titles than the 1070, 1.8% better performance/€in 1440p titles and a whopping 3.2% better performance/€ in 2160p titles. "much better value for money" looks different to me. Then there is the fact that there are 53 1070 products and only 36 1070ti ones listed here. And I have to question the sanity and your financial sense of anyone who looks to upgrade from a 1070ti to a 2070. Generational upgrades are usually not worth it without a corresponding process node shrink and the same is true here, especially since the 1070ti is more a 1080 in disguise than a true midrange card.
At least for me, it's not at all about upgrading from a 1070 Ti to a 2070, but rather which I should purchase to replace my 970. The 1070 Ti is roughly $400-450 while the 2070 is $600. That's a huge difference, and it's hard to know how big the performance delta is when the article keeps comparing to the 1070, which has effectively been superseded.
If you're buying a 1000 series card, it's a 1060, 1070ti, 1080 or 1080ti; virtually no one is buying a regular 1070 at this point. Honestly I think I'm going to sit out this generation until we have an idea of what this hardware can do for ray tracing and DLSS. I have a feeling these cards (especially the 2070) will struggle mightily to run those settings anywhere near 60fps. So, if I personally don't believe the 2070 will be able to deliver acceptable performance for RTX tech, then I'm leaning towards grabbing a 1000 series for (relatively) cheap.
"If you're buying a 1000 series card, it's a 1060, 1070ti, 1080 or 1080ti; virtually no one is buying a regular 1070 at this point" Any data on this? Or just made up to further your point? What about second hand cards? With a performanc/€ delta of less than % at 1080p, you are making quite bold claims here.
The 1070 Ti is far better value than the 1080 or 1070. Your comparison uses a stock 1070 Ti, a card which only makes sense overclocked (due to Nvidia banning any factory-OC models). When overclocked, the 1070 Ti is 7% lower perf than an OC 1080 for a lot less.
Oh, and since the 1070 Ti just launched last October, they are perfect to grab on the used market, as all cards will still have 2 years of warranty left. I picked up a mint used 1070 Ti for $300.
Unfortunately, it's not your warranty, so you don't get to benefit from it:
https://www.reddit.com/r/nvidia/comments/4xmewv/ps...
https://www.nvidia.com/object/manufacturer_warrant...
"This warranty applies only to the original purchases of the Warranted Products from a retailer, mail order operation, or on-line retail store; this warranty will not extend to any person that acquires a Warranted Product on a used basis."
You should shop around. I bought a new EVGA 1070 Ti from Amazon last weekend for ~$270. The regular 1070 is also that price now on Amazon's website. The 1070 Ti definitely seems the sweet spot for upgrading right now...
Have to agree here. I'd like to see the 1070 Ti for completeness, but Anandtech may not have the card to hand (you simply can't always keep stock of / put your hands on every card you want), might be planning on adding it later, or just thought it'd be easy enough to extrapolate where it will fall.

As for value for money, I did a massive table (in pen for some reason) which calculated aggregate benchmarks for a load of cards (1080Ti, 1080, 1070Ti, 1070, Vega 64) that I was interested in, and then calculated a performance per Pound (UK here) for each one. Aside from the 1080Ti (which was far better value in terms of performance per Pound), the rest had practically identical value rankings, aside from the AMD card which was slightly lower. I actually ended up buying a Vega 64 as the price briefly dropped by £110 just before Turing was released, which made it an absolute bargain and, seeing the performance here, I'm relatively delighted that for once the GPU market has not screwed me over. I'm guessing the prices dropped as people were expecting Turing to be amazing, and now they're back up as it's just pretty meh all over and very overpriced. My old card went for £100 (I don't normally sell them, but give them away to a friend) and so my cost to upgrade was ~£300, which I'm very happy with given that I would never EVER consider spending 2080 / 2080Ti money on a GPU.

The worst part for Nvidia is that I am their target market - I'm a gamer who spends a lot of money on his PC to play the latest games at decent quality settings, and I'm also a professional who isn't exactly poorly off. Could I afford a 2080Ti? Easily. Would I EVER buy one? No. The price is insulting and so is the marketing. I'd never have considered AMD before Nvidia released all that insulting marketing rubbish. They gave AMD a sale.
I know at least one other person with the exact same story. It's irritating for folks like us, but as long as the market continues to swallow it they won't miss our sales!
A 1070 Ti is way too close to the 2070 in perf; why bother at this point, when you don't know if the new features bring anything to the table with the 2070?
A lot of people have 1070s, non-Ti, and have more reason to wonder if this is a significant enough upgrade. I think Anandtech did the right thing here.
Nah. The 1070 Ti is as low as ~$400. The 1080 is as low as ~$440. http://gpu.userbenchmark.com/Compare/Nvidia-GTX-10... Game performance differentials range from 6% to as much as 22%, but I'd say for 10% more on the price, you get a tiny bit more than 10% of a performance uptick. Hence, the 1080 is a better overall value in terms of price/performance than the 1070 Ti.
Not according to Techpowerup and German prices. The cheapest 1070ti is 430€, the cheapest 1080 is 480€. The relative performance towards a 2080Ti is 62%/55%/50% for the 1070Ti in 1080p/1440p/2160p and 66%/58%/52% for the 1080 in the same resolutions. That means the 1080 delivers only 95%/94%/93% of the performance/€ compared to the 1070Ti. Things are so close though as to warrant a real inspection when seeing a deal for one card or another.
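For anyone who wants to check this, a quick sketch of the arithmetic; the prices and relative-performance figures are the ones quoted in the comment above, nothing new:

```python
# Perf-per-euro comparison using the TechPowerUp relative-performance
# figures (% of a 2080 Ti) and the cheapest German listings quoted above.
prices = {"GTX 1070 Ti": 430, "GTX 1080": 480}  # EUR
rel_perf = {
    "GTX 1070 Ti": {"1080p": 62, "1440p": 55, "2160p": 50},
    "GTX 1080":    {"1080p": 66, "1440p": 58, "2160p": 52},
}

for res in ("1080p", "1440p", "2160p"):
    ti = rel_perf["GTX 1070 Ti"][res] / prices["GTX 1070 Ti"]
    gtx = rel_perf["GTX 1080"][res] / prices["GTX 1080"]
    print(f"{res}: 1080 delivers {gtx / ti:.0%} of the 1070 Ti's perf/EUR")
# -> 95%, 94%, 93%, matching the figures in the comment
```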
While true, once overclocked, the 1080 can stretch its legs a bit more due to the faster GDDR5X, making the lead a bit larger, and its additional shaders also benefit a bit more from a 'similar' clock to a 1070ti. They're almost a perfect match if you consider perf/dollar, and in that case the 1080 is and was always the better choice, because higher absolute performance should usually result in a worse perf/dollar number.
I'll take a $329 RTX 2080. Comparison chart is off. Performance is not bad, but the price is. Until the price comes down, the 1080 seems like a much better buy, esp with rebates on some now.

Paying for RTX, which we cannot use ATM and aren't sure we want, bearing in mind the performance hit.
Why do you compare, in your final words, the RTX 2070 to the GTX 1070? Just because they have the same marketing end-name does not mean it's in the best interest of your readers to compare these two. The only relevant guide here is the pricing, of course! The RTX 2070 has the same price tag as the GTX 1080 Ti. Comparing it to the GTX 1070 Ti is a stretch, but why not. But the 1070?! A 400-dollar card?
Because the 2070 is a 1070 replacement? Why do you people get hung up on "WELL IT COST A BUNCH SO ITS A HIGH TIER CARD"
No, it isn't; it's a mid-tier card with a high-tier price tag. It doesn't make sense to compare the 2070 to a 1080 Ti because of the price tag, any more than comparing a Vega 64 to a 1070 because they both cost $400 at some point.
The 2070's die size is closer to the 1080 Ti's than it is to the 1080's. It should, at minimum, be compared to a 1080.
Exactly. It could be called RTX 2050 and still use the same chip; don't conflate marketing names with die specifications. RTX chips are huge, and they are priced accordingly; we will never see the 2080 Ti come close to 1080 Ti prices.
^^ So much of this right here! Name it whatever, but the retail price is ultimately what dictates what it must compete with, whether from prior generations or from the nearest competitor. As far as pricing is concerned, the RTX 2070 must contend with the GTX 1080 Ti and 1080 when a present-day buyer is looking at options in that price range.
Die size is irrelevant; price points are what matter in price brackets. If the 2070 is at GTX 1080 prices for a tiny improvement, it's not worth it.
Die size is not irrelevant. Cause & Effect: larger die = lower yield per wafer = higher price.

Do you make your choice based on die size OR price?
What an absolutely ridiculous statement. People don't cross shop products that have radically different pricing- they pick a budget and look for what is best in that price segment. Hmmm I'm torn between a Kia and a BMW. Yeah right.
That's reasonable, the Kia Stinger is a pretty decent RWD/AWD luxury sedan. For like $45k CAD you get roughly what you'd have to pay $65k CAD to get in a BMW, so a $45k BMW by contrast is a much less interesting vehicle. (Although personally, I'd go with the Genesis G70 instead of the Kia, you can get it with a 6-speed manual.)
Car analogies do and don't work here. Some might start with size, while others start with price. Some will start with looks, some will start with power, some will start with drivetrain, etc. Some will cross-shop products that have different prices, some won't.
Some might be torn between a Kia and a BMW, if both made rwd sports cars that compete against each other, even with a price tag difference. Look at all the comparisons between something like a Subaru WRX STI vs a BMW M3, even though the M3 is easily $20k over the STI.
The only time I see someone set a budget first is when they're looking at the used car market, not the new car market.
In the same vein you could say "why get so hung up on a name to defend that it's a same-tier card?" Price matters, because if perf/dollar doesn't improve there is no reason for any *buyer* to see it as a direct replacement.
Why would you say that the 1080 is the card to beat and then use a garbage FE version as the benchmark comparison. Every 1080 card you're going to buy today is substantially faster than that FE version.
Because that's what they have... Plus, they downclocked the Founders Edition 2070 to reference speeds too, so it's not like it's that big of a deal.
They have tested numerous non-FE 1080 cards. The issue is that it's a comparison nobody will be making when buying a 1080. It makes the 2070 look way better in the graphs than it should. If they feel the need to include an FE model for reference, fine. But they should have included a version with the faster RAM and a typical factory OC, since that is what is most often for sale right now, particularly in light of the price point of the 2070.
How does it make it look much better than it should, when they downclocked the Founders Edition to a clock below what the 3rd-party 2070 cards (the ones comparable to the 1080s you want to use) will be running?
And I don't think you can use the price point of the 2070 FE or the base 2070 as a justification to include factory overclocked cards from 3rd party board partners. There are other reasons for the price differential besides price/performance in current games. And since there is a price premium for NVIDIA FE cards you're going to end up with a price comparison problem anyway.
They tested numerous non-FE 1080 cards and when they are available I'm sure they will test numerous non-FE 2070 cards. When that happens I am sure they will make the comparisons among those two sets of cards, since there will no longer be the FE/non-FE problem.
It's a difficult situation because there seems to be a dollar value to the founder's edition beyond the performance, and the reviewed card is a founder's edition.
Our editorial policy long has been (and remains) that we'll always compare products stock to stock. That means comparing a reference-clocked 1070 to a reference-clocked 2070, etc. This is so that we never overstate the performance of a product; as we use the reference specs, anything you buy will be as fast as our card, if not faster. As we can only look at a finite number of cards, it continues to be the fairest and most consistent way to test cards.
Plus we got an earful (rightfully) whenever we've deviated from this. You guys have made it very clear that you don't like seeing factory-overclocked cards treated as any kind of baseline in a launch article.
Thank you Ryan! I, for one, appreciate this approach and I'm very glad to see Anandtech sticking to it.

Pretty sure there's a mistake in the chart on the front page that puts the transistor count of the 2070 as >2x the 2080.
My first PC was a Tandy 1000 RL with an Intel 8086 CPU. The first PC I ever built was a 486SX/25, and I've been a PC gamer ever since. For the first time since, well, ever, I'm seriously considering just forgoing PC gaming in the short term. Between the ridiculous pricing of GPUs and RAM, I just don't see how this can be a hobby for the vast majority of people anymore. It's nice that you can get a lot of bang for your CPU buck these days; that doesn't even begin to make up for how much you have to bend over for the rest of it.

I think I'll be getting a PS5 and calling it a day, and I'll use my current PC with its OC GTX 970 for any PC exclusives I may want to play. I just can't justify spending these kinds of prices. Nvidia is going to kill PC gaming for a lot of people. I'm not sure what their strategy is, except to bend people over for as long and hard as they can and only start dropping prices once sales start taking a hit. Well, sorry, Nvidia. You need to find someone else to take advantage of.
If you had truly been building since the 486 era, then you would know that, despite the price jumps, computers today are MONUMENTALLY cheaper than they were in the 90s. You don't see $4000 desktops in stores today; you sure did in 1991.
I mean, seriously, a 4 MB RAM stick cost $140 in 1994, and you are complaining that 32 GB costs $300 today?
Hold yer horses there, lad; let's do some calcs. $2000 in 1991 would be $3,684.96 today... I see LOTS of computers people build that are ~ this level, and $3,600 does not buy "cream of the crop" parts today: very high end, no doubt, but also not "best of the best".
Use a different number: $250 in 1991 money, which is ~ mid-range GPU pricing these days, would be $460.62.
I guess, to put it a slightly different way, it depends on what one is buying; the "value" of the $ spent back then was oftentimes equivalent, much worse, or only "slightly" better than what we have today.
We may get "more" for the $, but, all things being equal, we also pay more for what is received. I think the "only" thing in my books that has gotten far less expensive, taking everything into account, is hard drive pricing: $50 in 1991 would be $92.12 today, and for $92 you can pretty easily get a 2 TB hard drive, which is a WAY more substantial hard drive in pretty much every regard than what $50 would have got you in 1991 ^.^
The hard drive you could get for $50 in 1991 was a 0 MB hard drive.
I don't understand why you decided to use $2,000 in 1991 when the post you replied to talked about $4,000 in 1991. That's over $7,200 today. A $2,000 computer in 1991 was pretty mid-range. So what's the big deal if $3,600 does not buy "cream of the crop" parts today? $3,600 today gets you something certainly high end and not mid-range. Also, you are talking about driving a range of visuals that just didn't exist for consumers back in 1991. You can spend a good chunk of that $3,600 on a decent 4K monitor, driving almost 4 times the pixels of a standard 1080p monitor and over 8 times the pixels of running at 720p. I don't think these massive differences in display capabilities existed back then. Your extra money back then was mostly going towards a faster CPU, faster I/O, and enhanced sound capabilities.
You wot? Back in 1994, 1600x1200 was a thing already, and the vast majority played at 800x600 or worse. In fact, even that was still a high-end res.
So who played at 1600x1200? I mean 8K has been a thing for several years but who plays games at it? The resolution scaling game didn't really kick off until later. In the 1990s and early 2000s there was a whole lot of relatively easy visual quality improvements to be achieved through better algorithms. I don't believe people were spending massive amounts of money buying monitors with very small dot pitches so they could play games at high resolutions with crisper images. I'm sure they spent more for bigger monitors, but it was probably getting a 17 inch versus a 15 inch. That sort of difference in size doesn't induce someone to need a bigger GPU to push more pixels.
Correction to the above: "GPU" should read "graphics accelerator".
Yeah, if I remember correctly my father bought me a Dell 486SX/25 with 4 MB of RAM, a monitor, keyboard, mouse, 120 MB hd, 3.5 in and 5.25 in floppy drives. It just had the PC speakers and a standard 2d graphics adapter. It cost $1,600 I think, which is $3,000 today. PC gaming is much cheaper today.
The GPU has become more and more important to gaming performance in relation to the other components of the system. So people spend more money on their GPUs to achieve higher performance and no longer spend $1,000 for a CPU or significantly extra money for super fast RAM or a super fast hard drive.
My parents got a similar spec no-name white box PC with non accelerated graphics adapter for $1100 in summer '93. Upgrades over the next few years were 4mb more ram, CDROM+sound blaster clone, ~500 MB hdd (I think, not 100% sure on the capacity), 14.4 modem. I bought the ram and about half the HDD price as a teen, remainder were Christmas purchases.
The 970 is still fine so you really don't need to worry. Even if you did need an upgrade, prices are dropping as they always have for the last generation and if you spent the same amount of money you spent for a 970 at launch now you'd probably be able to get a 1080 so what's really the problem? Nvidia is making the 20xx series larger and more expensive because other people are willing to pay for them, it's as simple as that.
You've got a good graphics card in the 970 that should get you at least a couple more years of reasonable performance. If I were in your position, I wouldn't be in the market for a new GPU. However, I do sympathize with you when it comes to the cost it takes to be able to play these days and I agree that a shift to some form of console is a sensible alternative. PC hardware pricing has been on the rise in the last few years and it stings when you've come to expect performance improvements alongside cost reductions that we've been enjoying for the majority of the years since microcomputers found their way into homes in the 1980s.
I think what's driving that is a diminishing market. Economies of scale don't work when there's no further growth for what's become a mature industry (PCs in general) and a declining segment (desktop PCs in specific) due to the slow shift of computing tasks to mobile phones. I don't see anywhere for desktop components to go but further up as we lean into the physical limits of the materials we have available while also contending with falling sales numbers. Compound that with the damage these prices will inflict on the appeal of PC gaming to the masses and we're starting to look at a heck of an ugly snowball on its way down the hill.
It's probably a good time to make a graceful exit like you're mulling over now. As someone else that's thrown in the towel, I can happily confirm there's lots of fun to be had on very modest, inexpensive hardware. From older games to low system requirements new releases, I have faith that there will always be a way to burn up a lot of free time at a keyboard even if you end up with very old, very cheap hardware.
Concur. I'm still rocking a 750Ti and feeling no need to upgrade it or the even older CPU (phenom x4) despite having money put aside. I'll replace when it breaks, like my fridge, unless something does make going past 1080p compelling - whether that's VR, ray tracing, or a must have game that I can't play at all.
I've been considering making my current rig - an older i7 (Haswell) with a recently added 1070 - my last gaming PC. It really boils down to how next-gen consoles turn out, but even as current gen is, I seem to be spending more time on PS4 than on the gaming PC. In fact, MHW is the only game I am playing on PC atm, and even that is because of friends who insisted on playing it on PC. Eventually, we are lucky if we get to play it together once a week, on average... definitely not worth the investment in a new rig, for me.
I mean, you don't need the most top-end or recent parts. I am gaming on a 5850 and an i5-4670K (I think that is the model; it has been so long I might be mixing things up). It runs great. 256 GB SSD and 16 GB of RAM.
The prices are crazy for the high end but you also don't need the highest end and most recent gen when performance improvements are marginal.
In the 486 days, computer gaming was worth that much money. The landscape was rapidly changing, games were rapidly changing. The internet was taking hold, 3D gfx starting to be born. It was amazing. It was money well spent to be able to play groundbreaking new types of games.
Nowadays, although overall less expensive perhaps, your money doesn't buy you much new in terms of originality and exciting gameplay. All we get are prettier and prettier textures with duller and duller games. WoW and Counterstrike are STILL massively popular games, certainly not for their gfx.
Nate, iirc they handicapped the tensor performance of FP16 with FP32 accumulate, which is only half of that on equivalent Quadro cards; maybe that's why HGEMM performance is low.
The chart says half precision GEMM. So I think a lack of accelerated 32-bit accumulation should not be slowing the GPU down. As far as I know, the Turing Tensor Cores perform FP16 multiplications with FP16 accumulations at 8 operations per clock much like Volta Tensor Cores perform FP16 mults. with FP32 accumulation at 8 operations per clock.
Turing FP16 with FP16 accumulate is fully enabled on all RTX cards, but FP16 with FP32 accumulate is 1/2 rate on GeForce cards.
They used out of the box configuration which likely used Volta's FP16 with FP32 accumulate, resulting in half the performance.
HGEMM results for the 2080 Ti/2080/2070 are very close to their 54/40/30 TFLOPS theoretical performance. If it were a Quadro card, you would see double the performance with this config. If they updated the binary support, you'd likely see double the perf with FP16 accumulate too.
"They used out of the box configuration which likely used Volta's FP16 with FP32 accumulate, resulting in half the performance."
It could be some driver error. But I don't see why the GPUs not having FP32 accumulate should be the ultimate cause for the poor results. I admit I don't know much about the test, but why should the test demand FP16 multiplications with FP32 accumulate? That's more or less an experimental situation only available commercially in NVIDIA's hardware, as far as I know. If the test is meant to use FP16 accumulate and FP32 is being forced in the test then the reason for the poor results is a driver or testing error, not that Turing GPUs only have FP16 accumulate at full Tensor Core speed for the precision.
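For what it's worth, the 54/40/30 TFLOPS figures quoted above can be reproduced from public reference specs if you assume the half-rate FP32-accumulate path on GeForce. A back-of-envelope sketch; the SM counts, boost clocks, and per-clock tensor rates are assumptions taken from spec sheets, not anything measured here:

```python
# Rough theoretical tensor throughput for the Turing GeForce cards.
# Assumptions: 8 tensor cores per SM, 64 FP16 FMAs (= 128 FLOPs) per
# tensor core per clock, and half rate for FP16-mul/FP32-accumulate
# on GeForce (full rate on Quadro), per the discussion above.
cards = {              # (SM count, reference boost clock in GHz)
    "RTX 2080 Ti": (68, 1.545),
    "RTX 2080":    (46, 1.710),
    "RTX 2070":    (36, 1.620),
}
TC_PER_SM = 8
FLOPS_PER_TC_CLOCK = 64 * 2   # 64 FMAs = 128 FLOPs
FP32_ACC_PENALTY = 0.5        # half rate on GeForce

for name, (sms, ghz) in cards.items():
    tflops = sms * TC_PER_SM * FLOPS_PER_TC_CLOCK * ghz * FP32_ACC_PENALTY / 1e3
    print(f"{name}: ~{tflops:.0f} TFLOPS")
# -> ~54, ~40, ~30, matching the HGEMM figures quoted above
```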
It's hilarious in a way: take the tensor cores and ray tracing out of the equation and there's barely any difference between Pascal and Turing. It's almost like that extra memory bandwidth is giving Turing its speed bump and nothing more.
NVIDIA is heavily marketing ray tracing as the killer feature for the RTX cards. It's clear that a generational gain in performance wasn't in the cards (pun intended) this time around.
So in Far Cry 5, a game that I play a lot, I've essentially got RTX 2070 performance with my Vega 56 (OC+ Flashed to 64), but for £399 and the game free with it? Cool!
If you run your computer for anything like sensible periods of time, that extra power draw still doesn't come close to amounting to the price difference. Remember, you have to consider it in context of the power draw of your entire home.
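A rough sketch of that point, with deliberately made-up but plausible numbers; the extra draw, gaming hours, and electricity price are all assumptions, adjust to taste:

```python
# Annual cost of a card drawing more power under load (all assumptions).
extra_watts = 60       # assumed extra draw of the hungrier card
hours_per_day = 4      # assumed daily gaming time
eur_per_kwh = 0.30     # assumed electricity price

annual_cost = extra_watts / 1000 * hours_per_day * 365 * eur_per_kwh
print(f"~EUR {annual_cost:.0f} per year")
# -> ~EUR 26 per year: far below a typical price gap between cards
```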
I think my price limit on GPUs is the "not much more than an entire gaming console with slightly better performance" bracket of $350-400. I guess we'll see if the 2060 fits that bill and makes a worthy upgrade to the 970. Otherwise I'll be waiting one extra generation this time around instead of upgrading every other generation.
I’m with the crowd that says wtf to the new pricing - I’m a 670>970 owner and was hoping to upgrade to another x70 for $350-400 but they are priced too high for me now to justify. Hope they bring prices back to reality for the 2170 or that they offer GTX models along with RTX.
If they want to shift the cards up a rank, IMO, they should have adjusted the naming schema.
I feel much the same as you, and honestly I'd bet most people who buy the upper-mid range feel the same way. I also have a GTX 970 and as I told a couple of my friends while laughing at the new RTX pricing "this makes it so much easier to wait for 2020 to see if Intel can compete". I stick by that statement and barring a pricing revolution or my 970 dying here's to 2020.
@thestryker, same here. I got a 970 a couple years ago, and won't be upgrading any time soon. I'm sure it'll run Doom Eternal just fine... thanks Vulkan ;-)
New consoles have been hitting $600 at release, and then come down after a year or two. So, $600 for a new card is still in that range of being the price of an entire console. When I see $700+, that is when I really question how much faster the card is to justify the higher price.
The thing is that MS, Sony or Nintendo can sell their consoles at a loss because they are going to make it back on software... a GPU doesn't work this way.
Count me the same as well. With AAA developers no longer pushing technology beyond console envelopes, instead of a new GPU every other gen I am likely going with just one GPU (980) for this entire console cycle.
Completely agree. For the cost of the most expensive games console you should at least get the most powerful gfx card. Have Nvidia forgotten that you basically need to spend the same amount again to get a working computer? $500 for a 'mid-range' card is utter lunacy.
Used, 2nd-hand market prices for both the 1070 Ti and 1080 are going to be a major headache for Nvidia. I bought my MSI GTX 1080 Gaming X for the "buy it now" price of $320.00, and 1070 Ti cards go for less than $300.00 on the 2nd-hand market, such as eBay, Facebook Marketplace, and the FS/FT section of the AT Forums.
I get dizzy from turning my head to read the labels. I loved that you made the AMD bar in the compute benches red; it helps me identify red team. Maybe make a repeating bg with barely discernible logos, just so I don't get dizzy. Help an old man out :) If you need help with the web dev, let me know.
Thanks for the review, nice as always. I was hoping to upgrade my 970 before Turing was announced, but I feel like I'm getting ripped off with these cards. The review did nothing to change that feeling, but that was to be expected.
On the "The Test" page you show that the "NVIDIA GeForce GTX 1070 Ti Founders Edition" is one of the cards being compared, but it does not show up in the benches.
From the information given, seeing Vega 64 going up to a temp of 86C would put it into thermal-throttle range, which would cripple performance. From my own experience, manually adjusting the fan settings in Global Wattman to go up to 4500 rpm with a temperature target of 75C will avoid the throttle issues in the first place and also improve performance significantly, even without tweaking clock speeds or voltages.
So, if Vega 64 is getting throttled and still hitting the numbers reported, that implies that with the fan profile adjusted as I suggested, we would be seeing Vega 64 doing a bit better in terms of framerates.
980 owner here gaming at 1440p. Really wanted to upgrade but when I cost everything up, PC gaming has suddenly become a very expensive hobby.
Decided to completely abandon the PC as a future gaming platform mostly thanks to the pricing of the new gpu cards.
2.5yrs since the 1080 for barely better performance. RTX isn’t viable on this card. My own view is the new line up sucks.
Practically all my mates are on consoles these days which is a shame but it’s a sign of the times. Tried the BF5 beta on my xbox one S and was blown away at how decent it was. Had real fun playing with friends which is what matters.
So I can only imagine it’s even better on the Xbox One X which you can buy for the price of just this GPU.
Prices have gone insane, so I’m stepping out. Total respect for those that can justify the prices and carry on PC gaming. I can’t.
I find a lot of the discussions around here odd: Lots of people trying to convince each other that only their choice makes any sense… Please, let’s just enjoy that there are a lot more choices, even if that can be difficult.
For me, compute pays the rent; gaming is a side benefit. So I aimed for maximum GPU memory and lowest noise, because training neural networks can take a long time and I don't have an extra room to spare. It was a GTX 1070 from Zotac: 150 Watts TDP, compact, low noise at high loads, not exactly a top performer in games, OK at 1080p, slightly overwhelmed here and there with my Oculus Rift CV1, although quite OK with the DK2. I added a GTX 1050 Ti on another box, mostly because it would do video conversion just as fast, but run extremely quietly and at practically zero power on that 24x7 machine.
Then I made a 'mistake': I bought a 43” 4k monitor to replace a threesome of 24” 1080 screens.
Naturally, now games wouldn't focus on one of those, but on the big screen, which is 4x the number of pixels. With a screen so big and so close, I cannot really discern all pixels together at all times, but when I swivel my head, I will notice if the pixels in my focus are sharp or blurred, so cutting down on resolution or quality won't really do.
I replaced the 1070 with the top gun available at the time, a GTX 1080ti.
Actually, it wasn’t really the top gun, I got a Zotac Mini which again was nicely compact and low noise, does perfectly fine for GPU compute, but will settle on 180Watts for anything long-term. It’s very hard to achieve better than 70% utilization on GPU machine learning compute jobs, so all of these GPUs (except a mobile 1070) tend to stay very quiet.
A desperate friend took the 1050 Ti off my hands, because he needed something that wouldn't require extra power connectors, so I chipped in some extra dosh and got a GTX 1060 (6GB) to replace it. Again, I went for a model recommended for low noise from MSI, but was shocked to see that it was vastly bigger than the 1080 Ti in every direction when I unpacked it. It was, however, very silent even at top gaming loads, a hard squeeze to fit inside the chassis, but a perfect fit for 'noise' and surprisingly adequate for 1080p gaming at 120 Watts.
The reason I keep quoting those Watts is my observation that it's perhaps a better sign of effective GPU power than the chip, as long as generation and process size are the same: there is remarkably little difference between the high-clocked 1060 at 120 Watts, the average-clocked 1070 at 150 Watts and the low-clocked 1080 Ti at 180 Watts. Yes, the 1080 Ti will go to 250 Watts for bursts and deliver accordingly. But soon physics will weigh in on that 1080 Ti, and increasing fan speed does nothing but add noise, because surface area, much like displacement in an engine, is hard to replace.
I got an RTX 2080ti last week, because I want to explore INT8 and INT4 for machine learning inference vs. FP16 or FP32 training: A V100 only gives me FP16 and some extra cores and bandwidth while it costs 4x as much, even among friends. That makes the Turing based consumer product an extremely good deal for my use case: I don’t care for FP64, ECC or GPU virtualization enough to pay the Tesla/Quadro premiums.
And while the ML stuff will take weeks if not months to figure out and measure, I am glad to report that the Palit RTX 2080 Ti (the only one available around here) turned out to be nicely quiet and finally potent enough to run ARK: Survival Evolved at full quality at 4K without lags. Physically it's a monster, but that also means it sustains 250 Watts throughout. That's exactly how much a GTX 980 Ti and an R9 290X gulped from the mains inside that very same 18-core Xeon box, but with performance increases harking back to the best times of Gordon Moore's prediction.
IMHO, discussions about the 2xxx delivering 15% more speed at 40% higher prices vs. 1xxx GPUs are meaningless: 15 FPS vs. 9 FPS or 250 FPS vs. 180 FPS are academic. The GTX 1080 Ti failed at 4K; I had to either compromise quality or go down to 3K, and I liked neither. The RTX 2080 Ti won't deliver 100 FPS at 4K: I couldn't care less! But it never drops below 25 FPS either, and that makes it worth all the money to a gamer, while INT8 and INT4 compute will actually pay the bill for me.
I can’t imagine buying an RTX 1070 for myself, because I have enough systems and choices. But even I can imagine how someone would want the ability to explore ray-tracing or machine learning on a budget that offers a choice between a GTX 1080ti or an RTX 1070: Not an easy compromise to make, but a perfectly valid choice made millions of times.
Don't waste breath or keystrokes on being 'religious' about GPU choices: Enjoy a new generation of compute and bit of quality gaming on the side!
It's this RTX 2070 which may not make a lot of sense to pure-blooded gamers, except if they are sure that they'll continue to run at 1920x1080 over the next couple of years (where a GTX 1080 Ti is overkill) *and* want to try next-generation graphics.
So Nvidia has decided to push all their card numbers down one because AMD isn't competitive at the moment. The 2060 is now the 2070, 2070 is the 2080 and the 2080 is the 2080 TI. This sort of hubris is just calling out for a competitor to arrive and sell a more competitively priced product.
As for ray tracing, I'll eat my hat if the 2070 can handle ray-tracing in real games at reasonable frame-rates and real resolutions when they arrive.
TBH... who gives a crap? With the advent of usable integrated GPUs from Intel and AMD, dGPU vendors are basically no longer making x20, x30 or x40 cards. So maybe they're just pushing up the product stack: instead of "enthusiasts" buying x60, x70 and x80 cards, we'll now be buying x50, x60, x70 and halo x80 products. I couldn't care less what the badge number is for my card; what I care about is performance vs. price.
That said, I don't think I'll ever buy a dGPU for more than $400. The highest I've ever paid was, I think, ~$350 for my 970 or 670. As long as there's a reasonably competitive card in the $300-$400 USD range, I don't care what they call it; it could be an RTX 2110 and I'll snap it up. Given the products Nvidia has released so far under the RTX line, I'm going to wait and see what develops. Either I'll grab a cheap used 1080/1080 Ti or wait for smaller and cheaper 2100 cards. NV can ask whatever they want for a card, but at the end of the day most consumers have a price ceiling above which they won't purchase anything. It seems like a lot of people are in the $350-500 range, so either prices will have to come down or cheaper products will come out. I'm curious whether NV will make any more GTX cards, since tensor cores not only aren't that usable right now, but also dramatically increase the fab cost given their size and complexity.
Nahh, look at the die sizes. The 2080 is bigger than the 1080 Ti. The 2070 is bigger than the 1080. The price/performance changes are not because NVIDIA is pushing the cards down one, it's entirely because of the resources spent for ray tracing capabilities. As far as the 2070's ability to handle ray tracing, we won't really know for a few more months.
As for competitors, if AMD had a competitive product now they might be cleaning up. But since they don't, by the time they or some other competitor (Intel) does arrive they will probably need real time ray tracing to compete.
No one is forcing you to buy an RTX. If you're not interested in real time ray tracing you probably shouldn't be buying an RTX, and the introduction of RTX has forced the 10 series (and probably soon the Vega and Polaris series) prices down.
What is exactly "RTX 2080" which is bouncing around the tables? I did not find any reference in the test description chapter. I assumed it could be card "stock clocked" RTX 2080 FE, but it seems these cards are not always performing in expected order (sometimes 2080 beats 2080 FE).
Also, in the temp and noise section, there are two cards: 2080 and "2080 (baseline)" which give again quite different thermal and noise results.
Too much blabbering in the comments section. Way I see it:
The RTX 2070 offers the same performance as a GTX 1080, is significantly more expensive than the GTX 1080, and is less power efficient and hotter at the same time.
Personally I'm curious about what non-gaming software will use those tensor and RT cores and what that will bring. I mean if for example Blender traced 3 times faster it would be quite a thing for Blender users. Same for video editing software users I imagine. And then there's the use for students and scientist. And the whole wave of AI stuff that people are now getting into.
It's funny, because I would have thought that Anandtech would be the site whose readership was not exclusively gamers and people using graphics cards only for gaming, but going through the comments you'd think this was a gamer-oriented site, and a gamers' site only.
So that's a solid "no" then? You can get better performance for significantly less. This card isn't targeted at me (a 1080 owner), but until the ray tracing stuff starts to be worth anything, this card seems just too overpriced for a reasonable person to consider.
Spelling and grammar corrections. I did not read through the whole thing, but this is what I did find. "The card is already coming in with a price premium so it's important to firmly faster." Missing "be". "The card is already coming in with a price premium so it's important to be firmly faster."
"For the RTX 2070, 4K and 1440p performance once agani settles near the GTX 1080." Right letters, wrong ordering "For the RTX 2070, 4K and 1440p performance once again settles near the GTX 1080."
Also, I am of the opinion that you should focus your reviews on the performance of the cards vs. their price/speed positioning/slot. For example, you could note that the 20 series tends to have better 99th-percentile frame rates. This was a big win for Vega when it first came out. I have not actually crunched the numbers to see if Vega is better or worse than the 20 series. The calculation would be (minimum * 100) / average = %, with a lower value indicating a larger discrepancy (worse).
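A minimal sketch of the metric being proposed above, with made-up sample numbers for illustration:

```python
# Frame-rate consistency as proposed above: minimum (or 99th-percentile)
# FPS as a percentage of the average; lower = bigger dips below average.
def consistency(minimum_fps: float, average_fps: float) -> float:
    return minimum_fps * 100 / average_fps

print(f"{consistency(48, 60):.1f}")  # 80.0 -> dips 20% below average
print(f"{consistency(55, 60):.1f}")  # 91.7 -> steadier, same average
```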
Certainly makes me feel better about pulling the trigger on a $525 overclocked 1080 with a free game last weekend. 2070s are certainly less abundant, and definitely not for $525. The premium only buys 5-10% performance at base clocks, not worth another $100
Nvidia gimped the tensor cores on consumer RTX; that's why tensor core benchmarks are half of a Titan V or Quadro RTX. It can't do FP32 accumulate at full speed.
I currently have a GTX 1070 and just can't justify upgrading, due to the fact that ray tracing is not being used in any games right now. Yes, there is a 15-25 FPS performance boost running 1440p, but it's still not worth the $499-$599 cost. Wait a year and this video card will drop in price, and there might actually be some games taking advantage of ray tracing by then.
Is the 2070 worth it vs. the 1070 for its Tensor cores? I would use the card for machine learning. In Final Words, this wasn't covered. It seems to me that the 2070 is the cheapest solution in order to get dedicated Tensor cores, which, if I am not mistaken, make up a great portion of the difference in computational performance between these two cards. Opinions?
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
121 Comments
Back to Article
Jon Tseng - Tuesday, October 16, 2018 - link
Anandtech review out on time? What is the world coming to???MrSpadge - Tuesday, October 16, 2018 - link
That was a nate review. Ehm, I mean neat.ianmills - Tuesday, October 16, 2018 - link
Oh, I get it!bug77 - Wednesday, October 17, 2018 - link
Oh, nate! I mean, oh neat!dollarshort - Tuesday, October 16, 2018 - link
Technically a day late if you count Kyle B's review ;)Jon Tseng - Tuesday, October 16, 2018 - link
BUT I WANT MY REVIEWS TO HAVE LOTS OF EGREGIOUS TECHNICAL DETAIL AND DROP 2 WEEKS LATE! :-pimaheadcase - Tuesday, October 16, 2018 - link
Apparently its come to flying around the globe to tech conferences, reporting on news than actual product.mkaibear - Tuesday, October 16, 2018 - link
Waa! Waa! My free entertainment isn't exactly what I want it to be, time to get on my keyboard and complain about it.Grow up. Or go elsewhere. Anandtech is doing a great job given the constraints they operate under.
Diji1 - Wednesday, October 17, 2018 - link
Waa! Waa! This comment isn't exactly what I want it to be, time to get on my keyboard and complain about it.Grow up. Or go elsewhere. imaheadcase is doing a great job given the constraints they operate under.
GreenReaper - Thursday, October 18, 2018 - link
Like... being a headcase? >_>Sometimes the flying is part of getting the product to review. Other times... well, trade shows and the like are sometimes compensation for the level of pay provided by an online news reporting gig. This isn't the glory days of Personal Computer World, with issues 600-pages thick with ads.
rtho782 - Wednesday, October 17, 2018 - link
I can't wait for the GTX960 review to finally come out so I can see how it compares!!Meteor2 - Thursday, October 25, 2018 - link
:-Dbtb - Tuesday, October 16, 2018 - link
Should have been compared to the 1070 TI FE imo, they dont even sell the 1070 anymore, at least not in the nvidia store in my country. 1070 TI FE is the best bang for your buck right now IMO, if you dont want/need the raytracing capapility and play at 1920x. Personally I recently bought the 1070 TI FE, and have been quite happy with it. Only cost 75% of what a 2070 FE cost, and maxes out my FPS at the resolution I play(1920x1200)TheinsanegamerN - Tuesday, October 16, 2018 - link
Just because the 1070 isnt sold in "the nvidia store in your country" doesnt mean it has ceased to exist, or that it is no longer the market the 2070 is now targeting.btb - Tuesday, October 16, 2018 - link
Well, that might be the case. But does not change that 1070 TI is much better value for money then 1070(and 2070 for that matter). The review not including the "in between" option is not helpful for prospective buyers.. all IMO of course.Death666Angel - Tuesday, October 16, 2018 - link
"1070 TI is much better value for money then 1070"Taking the Techpowerup summary of the relative performance (compared to a 2080Ti) and running it through the cheapest model available from a respectable retailer in Germany, the 1070 is less than .15% better performance/€ in 1080p titles than the 1070, 1.8% better performance/€in 1440p titles and a whopping 3.2% better performance/€ in 2160p titles. "much better value for money" looks different to me. Then there is the fact that there are 53 1070 products and only 36 1070ti ones listed here. And I have to question the sanity and your financial sense of anyone who looks to upgrade from a 1070ti to a 2070. Generational upgrades are usually not worth it without a corresponding process node shrink and the same is true here, especially since the 1070ti is more a 1080 in disguise than a true midrange card.
Kakti - Tuesday, October 16, 2018 - link
At least for me, it's not at all about upgrading from a 1070TI to a 2070, but rather which should I purchase to replace my 970. The 1070Ti is roughly $400-450 while the 2070 is $600. That's a huge difference and it's hard to know how big the performance delta is when the article keeps comparing to the 1070 that's effectively been supereseded.If you're buying a 1000 series card, it's a 1060, 1070ti, 1080 or 1080ti; virtually no one is buying a regular 1070 at this point. Honestly I think I'm going to sit out this generation until we have an idea of what this hardware can do for ray tracing and DLSS. I have a feeling these cards (especially the 2070) will struggle mightily to run those settings anywhere near 60fps. So, if I personally don't believe the 2070 will be able to deliver acceptable performance for RTX tech, then I'm leaning towards grabbing a 1000 series for (relatively) cheap.
Death666Angel - Wednesday, October 17, 2018 - link
"If you're buying a 1000 series card, it's a 1060, 1070ti, 1080 or 1080ti; virtually no one is buying a regular 1070 at this point" Any data on this? Or just made up to further your point? What about second hand cards? With a performanc/€ delta of less than % at 1080p, you are making quite bold claims here.Cellar Door - Wednesday, October 17, 2018 - link
The 1070Ti is far better value then 1080 or 1070. Your comparison uses a stock 1070Ti - a card which only makes sense overclocked(due to Nvidia banning any factory OC models). When overclocked, 1070Ti is 7% lower perf then an OC 1080 for a lot less.Oh and since 1070Ti was just launched last October, they are perfect to grab on the used market as all cards will still have 2 years of warranty left. I picked up a mint used 1070Ti for $300.
GreenReaper - Thursday, October 18, 2018 - link
Unfortunately, it's not your warranty, so you don't get to benefit from it:https://www.reddit.com/r/nvidia/comments/4xmewv/ps...
https://www.nvidia.com/object/manufacturer_warrant...
This warranty applies only to the original purchases of the Warranted Products from a retailer, mail order operation, or on-line retail store; this warranty will not extend to any person that acquires a Warranted Product on a used basis.
webdoctors - Friday, October 19, 2018 - link
You should shop around. I bought a new EVGA 1070TI from Amazon last weekend for ~$270.The regular 1070 is also that price now on Amazon website.
The 1070TI definitely seems the sweet spot for upgrading right now...
philehidiot - Tuesday, October 16, 2018 - link
Have to agree here. I'd like to see the 1070Ti for completeness but Anandtech may not have the card to hand (you simply can't always keep stock of / put your hands on every card you want), might be planning on adding it later or just thought it'd be easy enough to extrapolate where it will fall. As for value for money, I did a massive table (in pen for some reason) which calculated aggregate benchmarks for a load of cards (1080Ti, 1080, 1070Ti, 1070, Vega 64) that I was interested in and then calculated a performance per Pound (UK here) for each one. Aside from the 1080Ti (which was far better value in terms of performance per Pound), the rest had practically identical value rankings aside from the AMD card which was slightly lower. I actually ended up buying a Vega 64 as the price briefly dropped by £110 just before Turing was released which made it an absolute bargain and, seeing the performance here, I'm relatively delighted that for once the GPU market has not screwed me over. I'm guessing the prices dropped as people were expecting Turing to be amazing and now they're back up as it's just pretty meh all over and very overpriced. My old card went for £100 (I don't normally sell them, but give them away to a friend) and so my cost to upgrade was ~£300 which I'm very happy with given that I would never EVER consider spending 2080 / 2080Ti money on a GPU. The worst part for Nvidia is that I am their target market - I'm a gamer who spends a lot of money on his PC to play the latest games at decent quality settings and I'm also a professional who isn't exactly poorly off. Could I afford a 2080Ti? Easily. Would I EVER buy one? No. The price is insulting and so is the marketing. I'd have never have considered AMD before they released all that insulting marketing rubbish. They gave AMD a sale.Spunjji - Wednesday, October 17, 2018 - link
I know at least one other person with the exact same story. It's irritating for folks like us, but as long as the market continues to swallow it they won't miss our sales!hansmuff - Tuesday, October 16, 2018 - link
A 1070 Ti is way too close to the 2070 in perf, why bother at this point when you don't know if the new features bring anything to the table with the 2070?A lot of people have 1070's, non-Ti, and have more reason to wonder if this a significant enough upgrade. I think Anandtech did the right thing here.
JoeyJoJo123 - Tuesday, October 16, 2018 - link
Nah. 1070Ti is as low as ~$400. 1080 is as low as ~$440.http://gpu.userbenchmark.com/Compare/Nvidia-GTX-10...
Game performance differentials range from 6% to as much as 22%, but i'd say for 10% more on the price, you get a tiny bit more than 10% of the performance uptick. Hence, 1080 is a better overall value in terms of price/performance to 1070 Ti.
Death666Angel - Tuesday, October 16, 2018 - link
Not according to Techpowerup and German prices. The cheapest 1070ti is 430€, the cheapest 1080 is 480€. The relative performance towards a 2080Ti is 62%/55%/50% for the 1070Ti in 1080p/1440p/2160p and 66%/58%/52% for the 1080 in the same resolutions. That means the 1080 delivers only 95%/94%/93% of the performance/€ compared to the 1070Ti. Things are so close though as to warrant a real inspection when seeing a deal for one card or another.Vayra - Monday, October 22, 2018 - link
While true, once overclocked, the 1080 can stretch its legs a bit more due to the faster GDDR5X, making the lead a bit larger, and its additional shaders also benefit a bit more from a 'similar' clock to a 1070ti. They're almost a perfect match if you consider perf/dollar, and in that case the 1080 is and was always the better choice, because higher absolute performance should usually result in a worse perf/dollar number.Marlin1975 - Tuesday, October 16, 2018 - link
I'll take a $329 RTX2080. Comparison chart is off.Performance is not bad, but the price is. Until the price comes down the 1080 seems like a much better buy, esp with rebates on some now.
Koenig168 - Tuesday, October 16, 2018 - link
Paying for RTX which we cannot use ATM and not sure we want, bearing in mind the performance hit.Antoine. - Tuesday, October 16, 2018 - link
Why do you compare in your final words the RTX2070 to the GTX1070? Just because they have the same marketing endname does not mean it's in the best interest of your readers to compare these two.The only relevant guide here is the pricing of course! RTX2070 have the same pricetag as GTX1080TI. Comparing it to GTX1070TI is a stretch but why not. But the 1070?! a 400-dollar card?
TheinsanegamerN - Tuesday, October 16, 2018 - link
Because the 2070 is a 1070 replacement? Why do you people get hung up on "WELL IT COST A BUNCH SO ITS A HIGH TIER CARD"No, it isnt, its a mid tier card with a high tier pricetag. It doesnt make sense to compare the 2070 to a 1080ti because of the pricetag, anymore then comparing a vega 64 to a 1070 because they both cost $400 at some point.
ioni - Tuesday, October 16, 2018 - link
2070's die size is closer to the 1080 ti than it is to the 1080. It should be, at minimum, being compared to a 1080.Nioktefe - Tuesday, October 16, 2018 - link
Exactly, it could be called RTX 2050 and still use the same chip, don't compare marketing with die specifications, RTX chips are huge, and they are priced accordingly to that, we will never see 2080Ti come close to 1080ti pricesPeachNCream - Tuesday, October 16, 2018 - link
^^ So much of this right here! Name it what whatever, but the retail price is ultimately what dictates what it must compete with from prior generations or from the nearest competitor. As far as pricing is concerned, the RTX 2070 must contend with the GTX 1080Ti and 1080 when a present day buyer is looking at options in that price range.tamalero - Wednesday, October 17, 2018 - link
die size is irrelevant, price points is what is important in price brackets.. if the 2070 is on 1080gtx prices for a tiny improvement, its not worth it..Vayra - Monday, October 22, 2018 - link
Die size is not irrelevant. Cause & Effect: larger die = lower yield per wafer = higher price.CheapSushi - Thursday, October 18, 2018 - link
Do you make your choice based on die size OR price?dguy6789 - Tuesday, October 16, 2018 - link
What an absolutely ridiculous statement. People don't cross shop products that have radically different pricing- they pick a budget and look for what is best in that price segment. Hmmm I'm torn between a Kia and a BMW. Yeah right.evilspoons - Wednesday, October 17, 2018 - link
That's reasonable, the Kia Stinger is a pretty decent RWD/AWD luxury sedan. For like $45k CAD you get roughly what you'd have to pay $65k CAD to get in a BMW, so a $45k BMW by contrast is a much less interesting vehicle. (Although personally, I'd go with the Genesis G70 instead of the Kia, you can get it with a 6-speed manual.)khanikun - Friday, October 19, 2018 - link
Car analogies do and don't work here. Some might start with size, while other start with price. Some will start with looks, some will start with power, some will start with drivetrain, etc. Some will cross shop products that have different prices, some won't.Some might be torn between a Kia and a BMW, if both made rwd sports cars that compete against each other, even with a price tag difference. Look at all the comparisons between something like a Subaru WRX STI vs a BMW M3, even though the M3 is easily $20k over the STI.
The only time I see someone set a budget first, is those looking at the used car market. Not the looking at the new car market.
Vayra - Monday, October 22, 2018 - link
In the same vein you could say 'why get so hung up on a name to defend its a same tier card'Price matters because if perf/dollar doesn't improve there is no reason for any *buyer* to see it as a direct replacement.
Midwayman - Tuesday, October 16, 2018 - link
Why would you say that the 1080 is the card to beat and then use a garbage FE version as the benchmark comparison. Every 1080 card you're going to buy today is substantially faster than that FE version.Dr. Swag - Tuesday, October 16, 2018 - link
Because that's what they have... Plus they downclocked the founders 2070 to reference speeds too so it's not like it's that big of a deal.Midwayman - Tuesday, October 16, 2018 - link
They have tested numerous non-FE 1080 cards. The issue is that its a comparison nobody will be making when buying a 1080. It makes the 2070 look way better in the graphs than it should. If they feel the need to include a FE model for reference, fine. But they should have included a version with the faster ram and a typical factory OC since that is what is most often for sale right now. Particularly in light of the price point of the 2070.Yojimbo - Tuesday, October 16, 2018 - link
How does it make it look much better than it should when they downclocked the founder's edition to a clock below what the 3rd party 2070 cards which are comparable to the 1080s you want to use will be using.And I don't think you can use the price point of the 2070 FE or the base 2070 as a justification to include factory overclocked cards from 3rd party board partners. There are other reasons for the price differential besides price/performance in current games. And since there is a price premium for NVIDIA FE cards you're going to end up with a price comparison problem anyway.
They tested numerous non-FE 1080 cards and when they are available I'm sure they will test numerous non-FE 2070 cards. When that happens I am sure they will make the comparisons among those two sets of cards, since there will no longer be the FE/non-FE problem.
Yojimbo - Tuesday, October 16, 2018 - link
It's a difficult situation because there seems to be a dollar value to the founder's edition beyond the performance, and the reviewed card is a founder's edition.Ryan Smith - Tuesday, October 16, 2018 - link
Our editorial policy long has been (and remains) that we'll always compare products stock to stock. That means comparing a reference clocked 1070 to a reference clocked 2070, etc. This is so that we never overstate the performance of a product; as we use the reference specs, anything you buy will be as fast as our card, if not faster. As we can only loo at a finite number of cards, it continues to be the fairest and most consistent way to test cards.Plus we got a earful (rightfully) whenever we've deviated from this. You guys have made it very clear that you don't like seeing factory overclocked cards treated as any kind of baseline in a launch article.
Exodite - Tuesday, October 16, 2018 - link
Thank you Ryan!I, for one, appreciate this approach and I'm very glad to see Anandtech sticking to it.
Eletriarnation - Tuesday, October 16, 2018 - link
Pretty sure there's a mistake in the chart on the front page that puts the transistor count of the 2070 as >2x the 2080.cwolf78 - Tuesday, October 16, 2018 - link
My first PC was a Tandy 1000 RL with an Intel 8086 CPU. The first PC I ever built was a 486SX/25 and I've been a PC gamer ever since. For the first time since, well, ever, I'm seriously considering just forgoing PC gaming in the short-term. Between the ridiculous pricing of GPU's and RAM, I just don't see how this can be a hobby for the vast majority of people anymore. It's nice that you can get a lot of bang for your CPU buck these days, that doesn't even begin to make up for how much you have to bend over for the rest of it. I think I'll be getting a PS5 and call it a day and use my current PC with its OC GTX 970 for any PC exclusives I may want to play. I just can't justify spending these kind of prices. Nvidia is going to kill PC gaming for a lot of people. I'm not sure what their strategy is except to bend people over for as long and hard as they can and only then start dropping prices one sales start taking a hit. Well, sorry, Nvidia. You need to find someone else to take advantage of.TheinsanegamerN - Tuesday, October 16, 2018 - link
If you had truly been building since the 486 era, then you would know that, despite the price jumps, computers today are MONUMENTALLY cheaper than they were in the 90s. You don't see $4000 desktops in stores today; you sure did in 1991.
TheinsanegamerN - Tuesday, October 16, 2018 - link
I mean, seriously, a 4MB RAM stick cost $140 in 1994, and you are complaining that 32 GB costs $300 today?
Dragonstongue - Tuesday, October 16, 2018 - link
Hold yer horses there, lad, let's do some calcs. $2000 in 1991 would be $3,684.96 today... I see LOTS of computers people build that are ~this level,
and $3600 does not buy "cream of the crop" parts today - very high end, no doubt, but also not "best of the best".
Use a different number: $250 in 1991 money, which is ~mid-range GPU pricing these days, would be $460.62.
I guess, to put it a slightly different way: depending on what one is buying, the "value" of the $ spent back then was often equivalent to, much worse than, or only "slightly" better than what we have today.
We may get "more" for the $, but, all things being equal, we also pay more for what is received. I think the "only" thing in my books that has gotten far less expensive, taking everything into account, is hard drive pricing: $50 in 1991 would be $92.12 today, and for $92 you can pretty easily get a 2TB hard drive, which is WAY more substantial in pretty much every regard than what $50 would have got you in 1991 ^.^
Yojimbo - Tuesday, October 16, 2018 - link
The hard drive you could get for $50 in 1991 was a 0 MB hard drive. I don't understand why you decided to use $2,000 in 1991 when the post you replied to talked about $4,000 in 1991. That's over $7,200 today. A $2,000 computer in 1991 was pretty mid-range. So what's the big deal if $3,600 does not buy "cream of the crop" parts today? $3,600 today gets you something certainly high-end and not mid-range. Also, you are talking about driving a range of visuals that just didn't exist for consumers back in 1991. You can spend a good chunk of that $3,600 on a decent 4K monitor, driving almost 4 times the pixels of a standard 1080p monitor and over 8 times the pixels of running at 720p. I don't think these massive differences in display capabilities existed back then. Your extra money back then was mostly going towards a faster CPU, faster I/O, and enhanced sound capabilities.
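The conversions being tossed around in this sub-thread are just a CPI ratio. Here is a minimal sketch of the calculation (plain host-side code, compiles with g++ or nvcc); the CPI values are approximate annual averages and are an assumption on my part, chosen to roughly match the ~1.84x factor used above:

```cuda
// Plain host-side code; no GPU involved. Compile with g++ or nvcc.
#include <cstdio>

// Approximate CPI-U annual averages (assumed values; the posts above
// effectively use a factor of ~1.842).
const double CPI_1991 = 136.2;
const double CPI_2018 = 251.1;

double usd_1991_to_2018(double dollars_1991) {
    return dollars_1991 * (CPI_2018 / CPI_1991);
}

int main() {
    const double prices_1991[] = {4000.0, 2000.0, 250.0, 50.0};
    for (double p : prices_1991)
        printf("$%.0f in 1991 ~= $%.2f in 2018\n", p, usd_1991_to_2018(p));
    return 0;
}
```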
Vayra - Monday, October 22, 2018 - link
You wot? Back in 1994, 1600x1200 was a thing already, and the vast majority played on 800x600 or worse. In fact, even that was still a high-end res.
Yojimbo - Monday, October 22, 2018 - link
So who played at 1600x1200? I mean, 8K has been a thing for several years, but who plays games at it? The resolution scaling game didn't really kick off until later. In the 1990s and early 2000s there were a whole lot of relatively easy visual quality improvements to be achieved through better algorithms. I don't believe people were spending massive amounts of money buying monitors with very small dot pitches so they could play games at high resolutions with crisper images. I'm sure they spent more for bigger monitors, but it was probably getting a 17 inch versus a 15 inch. That sort of difference in size doesn't induce someone to need a bigger graphics accelerator to push more pixels.
Yojimbo - Tuesday, October 16, 2018 - link
Yeah, if I remember correctly, my father bought me a Dell 486SX/25 with 4 MB of RAM, a monitor, keyboard, mouse, 120 MB HD, and 3.5 in and 5.25 in floppy drives. It just had the PC speaker and a standard 2D graphics adapter. It cost $1,600 I think, which is $3,000 today. PC gaming is much cheaper today. The GPU has become more and more important to gaming performance relative to the other components of the system, so people spend more money on their GPUs to achieve higher performance and no longer spend $1,000 for a CPU or significant extra money for super-fast RAM or a super-fast hard drive.
DanNeely - Tuesday, October 16, 2018 - link
My parents got a similar-spec no-name white box PC with a non-accelerated graphics adapter for $1100 in summer '93. Upgrades over the next few years were 4 MB more RAM, a CD-ROM + Sound Blaster clone, a ~500 MB HDD (I think, not 100% sure on the capacity), and a 14.4 modem. I bought the RAM and about half the HDD price as a teen; the remainder were Christmas purchases.
Eletriarnation - Tuesday, October 16, 2018 - link
The 970 is still fine, so you really don't need to worry. Even if you did need an upgrade, prices are dropping for the last generation as they always have, and if you spent now what you spent on a 970 at launch you'd probably be able to get a 1080, so what's really the problem? Nvidia is making the 20xx series larger and more expensive because other people are willing to pay for them; it's as simple as that.
PeachNCream - Tuesday, October 16, 2018 - link
You've got a good graphics card in the 970 that should get you at least a couple more years of reasonable performance. If I were in your position, I wouldn't be in the market for a new GPU. However, I do sympathize with you when it comes to the cost it takes to be able to play these days, and I agree that a shift to some form of console is a sensible alternative. PC hardware pricing has been on the rise in the last few years, and it stings when you've come to expect performance improvements alongside the cost reductions we've been enjoying for the majority of the years since microcomputers found their way into homes in the 1980s.
I think what's driving that is a diminishing market. Economies of scale don't work when there's no further growth for what's become a mature industry (PCs in general) and a declining segment (desktop PCs specifically) due to the slow shift of computing tasks to mobile phones. I don't see anywhere for desktop components to go but further up as we lean into the physical limits of the materials we have available while also contending with falling sales numbers. Compound that with the damage these prices will inflict on the appeal of PC gaming to the masses, and we're starting to look at a heck of an ugly snowball on its way down the hill.
It's probably a good time to make a graceful exit, like you're mulling over now. As someone else who's thrown in the towel, I can happily confirm there's lots of fun to be had on very modest, inexpensive hardware. From older games to new releases with low system requirements, I have faith that there will always be a way to burn up a lot of free time at a keyboard, even if you end up with very old, very cheap hardware.
WarlockOfOz - Wednesday, October 17, 2018 - link
Concur. I'm still rocking a 750 Ti and feeling no need to upgrade it or the even older CPU (Phenom x4) despite having money put aside. I'll replace when it breaks, like my fridge, unless something does make going past 1080p compelling - whether that's VR, ray tracing, or a must-have game that I can't play at all.
nikon133 - Tuesday, October 16, 2018 - link
I hear you. I've been considering making my current rig - an older i7 (Haswell) with a recently added 1070 - my last gaming PC. It really boils down to how next-gen consoles turn out - but even as the current gen is, I seem to be spending more time on PS4 than on my gaming PC. In fact, MHW is the only game I am playing on PC atm, and even that because of friends who insisted on playing it on PC. Eventually, we are lucky if we get to play it together once a week, on average... definitely not worth the investment in a new rig, for me.
ingwe - Wednesday, October 17, 2018 - link
I mean, you don't need the most top-end or recent parts. I am gaming on a 5850 and an i5-4670K (I think that's the model; it has been so long I might be mixing things up). It runs great. 256 GB SSD and 16 GB of RAM. The prices are crazy for the high end, but you also don't need the highest end and most recent gen when performance improvements are marginal.
Farfolomew - Monday, October 22, 2018 - link
In the 486 days, computer gaming was worth that much money. The landscape was rapidly changing; games were rapidly changing. The internet was taking hold; 3D graphics were being born. It was amazing. It was money well spent to be able to play groundbreaking new types of games.
Nowadays, although overall less expensive perhaps, your money doesn't buy you much new in terms of originality and exciting gameplay. All we get are prettier and prettier textures with duller and duller games. WoW and Counter-Strike are STILL massively popular games, certainly not for their gfx.
Eris_Floralia - Tuesday, October 16, 2018 - link
Nate, IIRC they handicapped the tensor performance of FP16 with FP32 accumulate, which is only half of that on equivalent Quadro cards; maybe that's why HGEMM performance is low.
https://cdn.discordapp.com/attachments/47593159264...
Yojimbo - Tuesday, October 16, 2018 - link
The chart says half-precision GEMM, so I think a lack of accelerated 32-bit accumulation should not be slowing the GPU down. As far as I know, the Turing Tensor Cores perform FP16 multiplications with FP16 accumulation at 8 operations per clock, much like Volta Tensor Cores perform FP16 multiplications with FP32 accumulation at 8 operations per clock.
Eris_Floralia - Tuesday, October 16, 2018 - link
Turing FP16 with FP16 accumulate is fully enabled on all RTX cards, but FP16 with FP32 accumulate runs at 1/2 rate on GeForce cards. They used out of the box configuration which likely used Volta's FP16 with FP32 accumulate, resulting in half the performance.
HGEMM results for the 2080 Ti/2080/2070 are very close to their 54/40/30 TFLOPS theoretical performance. If it were a Quadro card, you would see double the performance with this config. If they update the binary support, you'll likely see double the perf with FP16 accumulate too.
Yojimbo - Tuesday, October 16, 2018 - link
"They used out of the box configuration which likely used Volta's FP16 with FP32 accumulate, resulting in half the performance."It could be some driver error. But I don't see why the GPUs not having FP32 accumulate should be the ultimate cause for the poor results. I admit I don't know much about the test, but why should the test demand FP16 multiplications with FP32 accumulate? That's more or less an experimental situation only available commercially in NVIDIA's hardware, as far as I know. If the test is meant to use FP16 accumulate and FP32 is being forced in the test then the reason for the poor results is a driver or testing error, not that Turing GPUs only have FP16 accumulate at full Tensor Core speed for the precision.
hansmuff - Tuesday, October 16, 2018 - link
Great review making some very pointed and smart commentary. Thank you!
Hixbot - Tuesday, October 16, 2018 - link
Nvidia are not even interested in competing with themselves.
shabby - Tuesday, October 16, 2018 - link
It's hilarious in a way: take the tensor cores and ray tracing out of the equation and there's barely any difference between Pascal and Turing. It's almost like that extra memory bandwidth is giving Turing its speed bump and nothing more.
PeachNCream - Tuesday, October 16, 2018 - link
NVIDIA is heavily marketing ray tracing as the killer feature for the RTX cards. It's clear that a generational gain in performance wasn't in the cards (pun intended) this time around.
shabby - Tuesday, October 16, 2018 - link
And with ray tracing turned on, these things will perform like cards from 4 years ago. Nvidia's going back to the future.
AshlayW - Tuesday, October 16, 2018 - link
So in Far Cry 5, a game that I play a lot, I've essentially got RTX 2070 performance with my Vega 56 (OC'd + flashed to 64), but for £399 and with the game free? Cool!
The_Assimilator - Tuesday, October 16, 2018 - link
But you also need a small nuclear reactor to power it and a moderately-sized dam to cool it, so there's that.
Spunjji - Wednesday, October 17, 2018 - link
If you run your computer for anything like sensible periods of time, that extra power draw still doesn't come close to amounting to the price difference. Remember, you have to consider it in the context of the power draw of your entire home.
pixelstuff - Tuesday, October 16, 2018 - link
I think my price limit on GPUs is the "not much more than an entire gaming console, with slightly better performance" bracket of $350-400. I guess we'll see if the 2060 fits that bill and makes a worthy upgrade to the 970. Otherwise I'll be waiting one extra generation this time around instead of upgrading every other generation.
Icehawk - Tuesday, October 16, 2018 - link
I'm with the crowd that says WTF to the new pricing. I'm a 670 > 970 owner and was hoping to upgrade to another x70 for $350-400, but they are now priced too high for me to justify. I hope they bring prices back to reality for the 2170, or that they offer GTX models along with RTX. If they want to shift the cards up a rank, IMO, they should have adjusted the naming scheme.
thestryker - Tuesday, October 16, 2018 - link
I feel much the same as you, and honestly I'd bet most people who buy the upper mid-range feel the same way. I also have a GTX 970, and as I told a couple of my friends while laughing at the new RTX pricing, "this makes it so much easier to wait for 2020 to see if Intel can compete". I stick by that statement, and barring a pricing revolution or my 970 dying, here's to 2020.
Lazlo Panaflex - Friday, October 19, 2018 - link
@thestryker, same here. I got a 970 a couple years ago and won't be upgrading any time soon. I'm sure it'll run Doom Eternal just fine... thanks, Vulkan ;-)
Targon - Tuesday, October 16, 2018 - link
New consoles have been hitting $600 at release and then coming down after a year or two. So $600 for a new card is still in the range of being the price of an entire console. When I see $700+, that is when I really question how much faster the card is to justify the higher price.
cfenton - Tuesday, October 16, 2018 - link
The most expensive console launch recently was the Xbox One X at $500. The PS4 and PS4 Pro were $400 at launch.
eva02langley - Tuesday, October 16, 2018 - link
The thing is that MS, Sony, or Nintendo can sell their consoles at a loss because they are going to get it back on software... a GPU doesn't work this way.
@cfenton, $599? https://www.youtube.com/watch?v=BOHqG1nc_tw
wr3zzz - Wednesday, October 17, 2018 - link
Count me the same as well. With AAA developers no longer pushing technology beyond the console envelope, instead of a new GPU every other gen I am likely going with just one GPU (980) for this entire console cycle.
colonelclaw - Thursday, October 18, 2018 - link
Completely agree. For the cost of the most expensive games console, you should at least get the most powerful graphics card. Have Nvidia forgotten that you basically need to spend the same amount again to get a working computer? $500 for a 'mid-range' card is utter lunacy.
adlep - Tuesday, October 16, 2018 - link
Used, 2nd-hand market prices for both the 1070 Ti and 1080 are going to be a major headache for Nvidia. I bought my MSI GTX 1080 Gaming X for a "buy it now" price of $320.00, and 1070 Ti cards go for less than $300.00 on the 2nd-hand market, such as eBay, Facebook Marketplace, and the FS/FT section of the AT Forums.
The_Assimilator - Tuesday, October 16, 2018 - link
The 1080 Ti as well - the fastest cards from the previous gen usually get the largest % discount.
brunis.dk - Tuesday, October 16, 2018 - link
I get dizzy from turning my head to read the labels. I loved that you made the AMD bars in the compute benches red; it helps me identify red team. Maybe make a repeating background with barely discernible logos? Just so I don't get dizzy - help an old man out :) If you need help with the web dev, let me know.
beisat - Tuesday, October 16, 2018 - link
Thanks for the review, nice as always. I was hoping to upgrade my 970 before Turing was announced, but I feel like I'm getting ripped off with these cards. The review did nothing to change that feeling, but that was to be expected.
Luke212 - Tuesday, October 16, 2018 - link
Please investigate why Turing is slower than Volta for HGEMM. If it was using the tensor cores, it should not be that slow.
SMOGZINN - Tuesday, October 16, 2018 - link
On the "The Test" page you show that the "NVIDIA GeForce GTX 1070 Ti Founders Edition" is one of the cards being compared, but it does not show up in the benches.Targon - Tuesday, October 16, 2018 - link
From the information given, Vega 64 going up to a temp of 86C would put it into thermal throttle range, which would cripple performance. From my own experience, manually adjusting the fan settings in Global Wattman to go up to 4500 RPM with a temperature target of 75C avoids the throttling in the first place and improves performance significantly, even without tweaking clock speeds or voltages.
So, if Vega 64 is getting throttled and still hitting the numbers reported, that implies that with the fan profile adjusted as I suggested, we would see Vega 64 doing a bit better in terms of framerates.
The_Assimilator - Tuesday, October 16, 2018 - link
Let's be honest: Vega isn't here for competition purposes, it's just included as a courtesy.
atl - Tuesday, October 16, 2018 - link
Would be good to have some SLI & cryptocurrency benchmarks included.
TEAMSWITCHER - Tuesday, October 16, 2018 - link
These RTX cards are going to be a fantastic value... next summer, when they drop the prices.
eva02langley - Tuesday, October 16, 2018 - link
Even there, I don't know if Navi can really be a $250 GPU with GTX 1080 performance.
sandman74 - Tuesday, October 16, 2018 - link
980 owner here, gaming at 1440p. I really wanted to upgrade, but when I cost everything up, PC gaming has suddenly become a very expensive hobby. I've decided to completely abandon the PC as a future gaming platform, mostly thanks to the pricing of the new GPU cards.
2.5 years since the 1080, for barely better performance. RTX isn't viable on this card. My own view is the new lineup sucks.
Practically all my mates are on consoles these days, which is a shame, but it's a sign of the times. I tried the BF5 beta on my Xbox One S and was blown away by how decent it was. Had real fun playing with friends, which is what matters.
So I can only imagine it's even better on the Xbox One X, which you can buy for the price of this GPU alone.
Prices have gone insane, so I'm stepping out. Total respect for those who can justify the prices and carry on PC gaming. I can't.
The_Assimilator - Tuesday, October 16, 2018 - link
tl;dr: rather get a heavily discounted 1080 Ti (which will probably be factory overclocked and have a beefier cooler).
Arbie - Tuesday, October 16, 2018 - link
Thanks for including Ashes: Escalation in the results. I hope you will continue to do so. This is a unique game with great features.
abufrejoval - Tuesday, October 16, 2018 - link
I find a lot of the discussions around here odd: lots of people trying to convince each other that only their choice makes any sense… Please, let's just enjoy that there are a lot more choices, even if that can be difficult.
For me, compute pays the rent; gaming is a side benefit. So I aimed for maximum GPU memory and lowest noise, because training neural networks can take a long time and I don't have an extra room to spare. It was a GTX 1070 from Zotac: 150 watts TDP, compact, low noise at high loads, not exactly a top performer in games - OK at 1080p, slightly overwhelmed here and there with my Oculus Rift CV1, although quite OK with the DK2. I added a GTX 1050 Ti to another box, mostly because it would do video conversion just as fast, but run extremely quiet and at near-zero power on that 24x7 machine.
Then I made a 'mistake': I bought a 43” 4K monitor to replace a threesome of 24” 1080p screens.
Naturally, games now wouldn't focus on one of those but on the big screen, which is 4x the number of pixels. With a screen so big and so close, I cannot really take in all the pixels at all times, but when I swivel my head I will notice whether the pixels in my focus are sharp or blurred, so cutting down on resolution or quality won't really do.
I replaced the 1070 with the top gun available at the time, a GTX 1080ti.
Actually, it wasn't really the top gun: I got a Zotac Mini, which again was nicely compact and low noise, does perfectly fine for GPU compute, but will settle at 180 watts for anything long-term. It's very hard to achieve better than 70% utilization on GPU machine-learning compute jobs, so all of these GPUs (except a mobile 1070) tend to stay very quiet.
A desperate friend took the 1050 Ti off my hands because he needed something that wouldn't require extra power connectors, so I chipped in some extra dosh and got a GTX 1060 (6GB) to replace it. Again I went for a model recommended for low noise, from MSI, but was shocked to see that it was vastly bigger than the 1080 Ti in every direction when I unpacked it. It was, however, very silent even at top gaming loads, a hard squeeze to fit inside the chassis but a perfect fit for 'noise', and surprisingly adequate for 1080p gaming at 120 watts.
The reason I keep quoting those watts is my observation that wattage is perhaps a better sign of effective GPU power than the chip itself, as long as generation and process size are the same: there is remarkably little difference between the high-clocked 1060 at 120 watts, the average-clocked 1070 at 150 watts, and the low-clocked 1080 Ti at 180 watts. Yes, the 1080 Ti will go to 250 watts for bursts and deliver accordingly. But physics soon weighs in on that 1080 Ti, and increasing fan speed does nothing but add noise, because surface area, much like displacement in an engine, is hard to replace.
I got an RTX 2080 Ti last week because I want to explore INT8 and INT4 for machine-learning inference vs. FP16 or FP32 training: a V100 only gives me FP16 plus some extra cores and bandwidth while costing 4x as much, even among friends. That makes the Turing-based consumer product an extremely good deal for my use case: I don't care for FP64, ECC, or GPU virtualization enough to pay the Tesla/Quadro premiums.
And while the ML stuff will take weeks if not months to figure out and measure, I am glad to report that the Palit RTX 2080 Ti (the only one available around here) turned out to be nicely quiet and finally potent enough to run ARK: Survival Evolved at full quality at 4K without lags. Physically it's a monster, but that also means it sustains 250 watts throughout. That's exactly how much a GTX 980 Ti and an R9 290X gulped from the mains inside that very same 18-core Xeon box, but with performance increases harking back to the best times of Gordon Moore's prediction.
IMHO discussions about the 2xxx delivering 15% more speed at 40% higher prices vs. 1xxx GPUs are meaningless: 15 FPS vs. 9 FPS, or 250 FPS vs. 180 FPS, are academic. The GTX 1080 Ti failed at 4K; I had to either compromise quality or go down to 3K, and I liked neither. The RTX 2080 Ti won't deliver 100 FPS at 4K: I couldn't care less! But it never drops below 25 FPS either, and that makes it worth all the money to a gamer, while the INT8 and INT4 compute will actually pay the bill for me.
I can't imagine buying an RTX 2070 for myself, because I have enough systems and choices. But even I can imagine how someone would want the ability to explore ray tracing or machine learning on a budget that offers a choice between a GTX 1080 Ti and an RTX 2070: not an easy compromise to make, but a perfectly valid choice made millions of times.
Don't waste breath or keystrokes on being 'religious' about GPU choices: enjoy a new generation of compute and a bit of quality gaming on the side!
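On the INT8 point above: Turing's WMMA API does expose 8-bit integer multiply with 32-bit accumulate on sm_75 parts, which is presumably the path such inference experiments would target. A minimal sketch, with names and the 16x16x16 tile shape purely illustrative rather than anything from the poster's setup:

```cuda
// Compile with: nvcc -arch=sm_75 wmma_int8.cu
#include <mma.h>
#include <cstdio>
using namespace nvcuda;

// INT8 multiply with INT32 accumulate on Turing Tensor Cores.
__global__ void mma_int8(const signed char *a, const signed char *b, int *c) {
    wmma::fragment<wmma::matrix_a, 16, 16, 16, signed char, wmma::row_major> fa;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, signed char, wmma::col_major> fb;
    wmma::fragment<wmma::accumulator, 16, 16, 16, int> acc;  // 32-bit accumulator
    wmma::fill_fragment(acc, 0);
    wmma::load_matrix_sync(fa, a, 16);
    wmma::load_matrix_sync(fb, b, 16);
    wmma::mma_sync(acc, fa, fb, acc);
    wmma::store_matrix_sync(c, acc, 16, wmma::mem_row_major);
}

int main() {
    signed char *a, *b; int *c;
    cudaMallocManaged(&a, 256);
    cudaMallocManaged(&b, 256);
    cudaMallocManaged(&c, 256 * sizeof(int));
    for (int i = 0; i < 256; ++i) { a[i] = 1; b[i] = 1; }
    mma_int8<<<1, 32>>>(a, b, c);  // one warp computes one 16x16 tile
    cudaDeviceSynchronize();
    printf("%d\n", c[0]);  // 16: a row of sixteen 1*1 products
    return 0;
}
```

(The sub-byte INT4 path goes through the separate wmma::experimental types, with different tile shapes.)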
abufrejoval - Tuesday, October 16, 2018 - link
It's this RTX 2070 which may not make a lot of sense to pure-blooded gamers, except if they are sure they'll continue to run at 1920x1080 over the next couple of years (where a GTX 1080 Ti is overkill) *and* want to try next-generation graphics.
Flunk - Tuesday, October 16, 2018 - link
So Nvidia has decided to push all their card numbers down one because AMD isn't competitive at the moment: the 2060 is now the 2070, the 2070 is the 2080, and the 2080 is the 2080 Ti. This sort of hubris is just calling out for a competitor to arrive and sell a more competitively priced product.
As for ray tracing, I'll eat my hat if the 2070 can handle ray tracing in real games at reasonable frame rates and real resolutions when they arrive.
Kakti - Tuesday, October 16, 2018 - link
TBH... who gives a crap? With the advent of usable integrated GPUs from Intel and AMD, dGPU vendors are basically no longer making x20, x30, or x40 cards. So maybe they're just pushing up the product stack: instead of "enthusiasts" buying x60, x70, and x80 cards, we'll now be buying x50, x60, x70, and halo x80 products. I couldn't care less what the badge number on my card is; what I care about is performance vs. price.
That said, I don't think I'll ever buy a dGPU for more than $400. The highest I've ever paid was, I think, ~$350 for my 970 or 670. As long as there's a reasonably competitive card in the $300-$400 USD range, I don't care what they call it - it could be an RTX 2110 and I'd snap it up. Given the products NVidia has released so far under the RTX line, I'm going to wait and see what develops. Either I'll grab a cheap used 1080/1080 Ti or wait for smaller and cheaper 2100-series cards. NV can ask whatever they want for a card, but at the end of the day most consumers have a price ceiling above which they won't purchase anything. It seems like a lot of people are in the 350-500 range, so either prices will have to come down or cheaper products will come out. I'm curious whether NV will make any more GTX cards, since Tensor cores not only aren't that usable right now but also dramatically increase the fab cost given their size and complexity.
Yojimbo - Wednesday, October 17, 2018 - link
Nahh, look at the die sizes. The 2080 is bigger than the 1080 Ti. The 2070 is bigger than the 1080. The price/performance changes are not because NVIDIA is pushing the cards down one; they're entirely because of the resources spent on ray-tracing capabilities. As for the 2070's ability to handle ray tracing, we won't really know for a few more months.
As for competitors, if AMD had a competitive product now, they might be cleaning up. But since they don't, by the time they or some other competitor (Intel) arrives, they will probably need real-time ray tracing to compete.
No one is forcing you to buy an RTX. If you're not interested in real-time ray tracing you probably shouldn't be buying one, and the introduction of RTX has pushed 10-series (and probably soon Vega and Polaris) prices down.
Voodoo2-SLI - Tuesday, October 16, 2018 - link
WQHD Performance Index for AnandTech's GeForce RTX 2070 Launch Review:
165.1% ... GeForce RTX 2080 Ti FE
137.5% ... GeForce RTX 2080 FE
115.3% ... GeForce RTX 2070 FE
110.6% ... GeForce RTX 2070 Reference
126.8% ... GeForce GTX 1080 Ti FE
100% ..... GeForce GTX 1080 FE
81.7% .... GeForce GTX 1070 FE
99.2% .... Radeon RX Vega 64 Reference
An index from 15 other launch reviews, with an overall performance index for the GeForce RTX 2070 launch, is here:
https://www.3dcenter.org/news/geforce-rtx-2070-lau...
risa2000 - Wednesday, October 17, 2018 - link
What exactly is the "RTX 2080" bouncing around the tables? I did not find any reference to it in the test description chapter. I assumed it could be a stock-clocked RTX 2080 FE, but these cards do not always perform in the expected order (sometimes the 2080 beats the 2080 FE).
Also, in the temp and noise section, there are two cards, "2080" and "2080 (baseline)", which give quite different thermal and noise results.
Achaios - Wednesday, October 17, 2018 - link
Too much blabbering in the comments section. The way I see it: the RTX 2070 offers the same performance as a GTX 1080, and is significantly more expensive than the GTX 1080, whilst being less power efficient and hotter at the same time.
/Thread
milkod2001 - Wednesday, October 17, 2018 - link
Well and accurately said. +1
FreckledTrout - Wednesday, October 17, 2018 - link
That pretty much sums it up. Doesn't this entire generation seem like it should have been made on 7nm, to keep die sizes and costs down along with the heat?
Wwhat - Wednesday, October 17, 2018 - link
*in games, you forgot to add.
Personally, I'm curious about what non-gaming software will use those tensor and RT cores and what that will bring. I mean, if for example Blender traced 3 times faster, it would be quite a thing for Blender users. Same for video editing software users, I imagine.
And then there's the use for students and scientists.
And the whole wave of AI stuff that people are now getting into.
It's funny, because I would have thought that AnandTech would be the site whose audience was not exclusively gamers and people using graphics cards only for gaming, but going through the comments you'd think this was a gamer-oriented site - and a gamers-only site at that.
althaz - Wednesday, October 17, 2018 - link
So that's a solid "no" then? You can get better performance for significantly less. This card isn't targeted at me (a 1080 owner), but until the ray-tracing stuff starts to be worth anything, this card seems just too overpriced for a reasonable person to consider.
ballsystemlord - Wednesday, October 17, 2018 - link
Spelling and grammar corrections. I did not read through the whole thing, but this is what I did find.
"The card is already coming in with a price premium so it's important to firmly faster."
Missing "be".
"The card is already coming in with a price premium so it's important to be firmly faster."
"For the RTX 2070, 4K and 1440p performance once agani settles near the GTX 1080." Right letters, wrong ordering
"For the RTX 2070, 4K and 1440p performance once again settles near the GTX 1080."
Also, I am of the opinion that you should focus your reviews on the performance of the cards vs. their price/speed positioning/slot. For example, you could note that the 20 series tends to have better 99th-percentile frame rates. This was a big win for Vega when it first came out. I have not actually crunched the numbers to see if Vega is better or worse than the 20 series. The calculation would be (minimum*100)/average == %, with a lower value indicating a larger discrepancy (worse).
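A quick sketch of that proposed metric (plain host-side code again; the frame-rate samples are made-up placeholders, purely to show the calculation):

```cuda
// Plain host-side code; compile with g++ or nvcc.
#include <cstdio>
#include <vector>
#include <numeric>
#include <algorithm>

// (minimum * 100) / average: lower % == bigger min-vs-average discrepancy.
double consistency_pct(const std::vector<double>& fps) {
    double avg = std::accumulate(fps.begin(), fps.end(), 0.0) / fps.size();
    double mn  = *std::min_element(fps.begin(), fps.end());
    return mn * 100.0 / avg;
}

int main() {
    std::vector<double> cardA = {60, 58, 61, 45, 59};  // dips harder
    std::vector<double> cardB = {55, 54, 56, 53, 55};  // steadier
    printf("card A: %.1f%%\ncard B: %.1f%%\n",
           consistency_pct(cardA), consistency_pct(cardB));
    return 0;
}
```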
FullmetalTitan - Thursday, October 18, 2018 - link
Certainly makes me feel better about pulling the trigger on a $525 overclocked 1080 with a free game last weekend. 2070s are certainly less abundant, and definitely not available for $525. The premium only buys 5-10% performance at base clocks; not worth another $100.
lenghui - Friday, October 19, 2018 - link
Dear AT, please stop auto-playing your "Buy the Right CPU" video. Pleeeeeeeeeeeze. It's driving me away from your site. I am on my last thread.
DominionSeraph - Friday, October 19, 2018 - link
Unfortunately the design makes it look like a terrible XFX AMD card.
rtho782 - Saturday, October 20, 2018 - link
The 2070 incurs less of a perf hit in HDR? Ryan seems to think it has no impact: https://twitter.com/RyanSmithAT/status/80115626506...
Luke212 - Thursday, October 25, 2018 - link
Nvidia gimped the tensor cores on consumer RTX; that's why the tensor core benchmarks are half those of a Titan V or Quadro RTX. They can't do FP32 accumulate at full speed.
dcole001 - Friday, October 26, 2018 - link
I currently have a GTX 1070 and just can't justify upgrading, given that ray tracing is currently not being used in any games. Yes, there is a 15-25 FPS performance boost running 1440p, but it's still not worth the $499-$599 cost. Wait a year and this card's price will drop, and there might actually be some games taking advantage of ray tracing by then.
ThanosPAS - Saturday, November 10, 2018 - link
Is the 2070 worth it vs. the 1070 for its Tensor cores? I would use the card for machine learning. This wasn't covered in Final Words. It seems to me that the 2070 is the cheapest way to get dedicated Tensor cores, which, if I am not mistaken, account for a great portion of the computational performance difference between these two cards. Opinions?