It's a bit of a curious test really. The Metro Last Light benchmark is really not indicative of performance during actual gameplay. Although built around assets from the game's most demanding level, it's a GPU and CPU stress test using conditions that you never actually encounter in the game itself. It's a benchmark, nothing more.
I feel a better comparison would have been Metro 2033, which actually sees radical differences compared to the original, whereas Last Light has less noticeable enhancements. I guess at least the article demonstrates that unless you want the DLC, you don't get a hugely improved experience with Redux over buying the original? But that just makes the case for a 2033 performance comparison that much more apparent.
In general I don't plan on doing many of these where the game is an update to an existing game -- it would normally just be a question of "how well does game XYZ run on various GPUs?" Metro Redux was a bit of a special case, and since the original Last Light is still a bear to run, I figured a short look at how much things have changed (if at all) would be good.
You're right that the built-in benchmark isn't necessarily indicative of the actual game, but there are scenes that can be very demanding and so it's basically a worst-case test. If you can run the benchmark and get decent frame rates, you can safely play the rest of the game. It's also not all that CPU-intensive, or at least not so much that going from stock clocks to 4.1GHz on the i7-4770K makes much of a difference on single GPUs at anything beyond the lower settings.
Regarding the Zotac GTX 970: is it a factory-overclocked one? If raising the fan speed helps a bit, it seems like the clocks have been pushed too high for that chip. Does underclocking solve the stability issues? If that's the case, it's a problem with Zotac and their binning, not nVidia. If the card runs at stock clocks, it's nVidia's problem.
Considering less than 3% run above 1920x1200, NV has time to fix this (rather, you have time to RMA your card...LOL), and it may just be Zotac's fan that is the problem here, or your particular SINGLE sample of a single vendor's card. Considering further that most of that 3% have more than one card, this comment is even more pointless.
http://www.anandtech.com/show/8568/the-geforce-gtx... Why were there no problems in basically the same game (and all the others) tested previously in Anandtech's 970 review? It seems silly to call NV's whole 970 product line into question (made by many vendors) when you get ONE sample that doesn't do the job at a particular res, but your own 970 review shows nothing even OC'ed in any game up to 4K.
http://www.anandtech.com/show/8568/the-geforce-gtx... If it's right up there with Crysis 3, why doesn't this game crap out in anandtech's 970 review at 1600p or 4K? Or any other game at these resolutions? Oh, right, AMD slant...
But that's anandtech for you (AMD portal and all) ;) Just saying...I mean, when you have a single card sample of AMD with an issue, do you call the whole AMD line a problem? You don't even suggest such things. You say you had a problem with your particular card sample (at worst), which is exactly what you should have done here, with a tidbit saying it probably doesn't affect other cards since so many review sites had ZERO issues at 1600p or even 4K.
http://www.anandtech.com/show/8568/the-geforce-gtx... Why doesn't it crap out when OC'ed to 1.45GHz at 1600p or 4K, and I mean why doesn't the 970 do this at ANY of the TON of websites who tested it? Either your particular card has problems, or perhaps you're just lying because at this point there is no other way to save AMD's Xmas? Either way, many websites NOT having the issue would seem to suggest YOU have a problem, not NV's whole 970 card list from many vendors. Your suggestion is complete BS, but no surprise with the AMD slant this site has had since, oh, the 660 Ti review or so. :(
Such comments are silly when evidence from everywhere, at all resolutions, says the 970 is fine even up to ~1.5GHz. How can you question the whole 970 line knowing NO other site (AFAIK) had any problems with a SINGLE game? Worse, you call the whole 900 series into question...LOL.
http://in.ign.com/nvidia-gtx-980/64915/review/nvid... A single sample of the 980 worked fine in the same game, so I guess based on one sample (using your logic) we now know ALL 980s have ZERO problems in this game even up to 4K, right?...ROFL. See how stupid that is? Worse, I could go on and say since they had no problems here, probably the 970 has no issues either.
Both your example and mine are stupid, but the tons of sites and a huge number of games tested across all those sites allows me to confidently say, you sir are an idiot or this is just yet another example of this site showing the old Anandtech AMD slant (I don't think you're an idiot, so draw your own conclusions). I can't really see how it could be anything besides these two options.
In the face of all other websites showing the 980/970 fine at all resolutions and in tons of games, can you please explain how a SINGLE card having an issue means the 970s at least, and possibly all 980s, could need Nvidia to tweak their drivers? ROFL.
Left off one more option though I guess...You could just be lying ;) In any case, your assumption is ridiculous. It only took 3 comments for you to be called out on this (MrSpadge already explained what should have been obvious to a long time reviewer on a site like Anandtech). But I digress...
So two systems running the same software environment and the same SKU would expect different results between samples? If their temperatures and clocks are the same? This is a joke right? Drivers will affect data. Other hardware will affect data. Specific samples will always perform identically non-overclocked. Always.
Also, baseless accusations of falsifying data are disgusting.
Holy cow! Let me start by saying that the paragraph causing the most controversy was (in my mind) rather innocuous. Seriously, saying "the game is crashing at times with Advanced PhysX enabled" was not the main point of this post! Anyway, I'm going to confine all of my responses regarding the comments on stability/drivers to this single post. And for the record, yes, I deleted several of my earlier comments in order to clean things up (there's no edit function for me either, but I can at least delete my comments and post them again in edited form). I repeated myself a few times, so I've wiped those four initial responses and I'm going to try to cover all my bases in this single comment, which is in response to the first 10 or so comments (with a heavy slant towards TheJian).
First, while the Zotac is technically not a stock card (GPU clock is 1076MHz and RAM is 7010MHz, compared to 1050/7000 for true "stock"), I don't think that's the problem. It runs very quiet, but perhaps a bit too quiet, as the card gets relatively hot. I don't know if it's the GPU core or some other element, but where the card initially tended to crash after 5-10 minutes of heavy load in some games, a small bump in the fan speed ramp did the trick for fixing that problem.
The exception is MLL Redux with Advanced PhysX enabled, where it's still consistently crashing on the second pass of the benchmark, sometimes even the first pass. It could be the card, it could be the game, or it could be the drivers – or perhaps a little bit of each. However, where all the other settings and games I've tried now run fine on the GTX 970, Redux with PhysX is unstable. This was not the case with other NVIDIA GPUs in my limited testing of PhysX, but the GTX 970 can't even make it through two passes of the benchmark. (I run three passes and use the best result of the second or third pass for the reported results, if you're wondering).
That makes me think it's the drivers, and I've pinged NVIDIA on it so we'll see, but as I noted the Advanced PhysX really doesn't seem to do anything useful so it's not a huge issue. I'm currently doing additional testing, and where Advanced PhysX crashed on the second run at high and very high settings, it managed to make it through three full passes before crashing on the fourth loop when quality was dropped to Medium. Advanced PhysX looks like the main culprit, but it takes longer to manifest at lower quality settings – and we still don't know if the root issue is with the game or the (PhysX) drivers.
Continuing with the above, I swapped GPUs and went back to the GTX 780 to run some tests with Advanced PhysX doing 8 passes of the benchmark. At 1080p High and Very High + SSAA, the benchmark completed all eight passes without crashing to the desktop. At 2560x1440 however, it crashed to desktop on the fourth pass (51 seconds into the benchmark). So instability with Advanced PhysX is certainly present on other NVIDIA GPUs, but it appears to be less of a problem than it is on the GTX 970. (And note: I'm only talking about Redux with Advanced PhysX here – at least initial, limited testing of Batman: Arkham Origins didn't encounter any problems.)
Keep in mind, we're talking about a brand-new GPU on a brand-new (remastered) game, which is sort of the point of this article -- how does a recent release perform? In this case, Redux is unstable on my Zotac GTX 970, but only at certain settings (basically higher quality settings). I had to bump up the fan speed to address this, and now the only remaining problem is PhysX. The other NVIDIA GPUs didn't encounter this problem in my normal testing, at least not in three passes of the benchmark, but then they've all been around for 6+ months. Again, this is the reason to test new releases on a broader set of hardware in articles like this. What does a hardware reviewer find in terms of performance and stability? In this case, performance is about what you'd expect from a "new" Metro game, and stability issues occurred when enabling Advanced PhysX, particularly on the GTX 970.
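As an aside, the reporting methodology mentioned a couple of paragraphs up (three passes of the benchmark, with the best result of the second or third pass reported) can be sketched in a few lines; the FPS numbers here are invented purely for illustration:

```python
# Sketch of the reporting methodology described above: run the built-in
# benchmark three times, treat the first pass as warm-up, and report the
# best average FPS of the second or third pass. Numbers are invented.

def reported_fps(passes):
    """passes: average FPS of each benchmark pass, in the order run."""
    if len(passes) < 3:
        raise ValueError("need at least three passes")
    return max(passes[1:3])  # best of the second and third pass

runs = [24.9, 25.6, 25.4]  # the first pass is often slowest (caching, warm-up)
print(reported_fps(runs))  # -> 25.6
```

The point of discarding the first pass is simply that shader compilation and disk/texture caching tend to depress it, so the later passes better reflect steady-state performance.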
On a broader note, this does not affect my opinion of the GTX 970 as a good card. Heck, I went out and bought the Zotac because it was a very compelling card – I wanted one for my own use! So yeah, TheJian, I love AMD so much and I hate NVIDIA so badly that I spent $329 on a Zotac GTX 970. [Rolls eyes] Okay, part of the reason for getting the card was to also run tests like this, and the only reason I bought the Zotac instead of a different GTX 970 was because the others were all out of stock. With my fan speed tweaks, it's running at <70C under full load and still generates far less noise than, say, the GTX 780 – and the R9 290X is in a completely different category when it comes to noise (in a bad way).
Finally, I've gone back and edited/clarified the text a bit just to make sure everyone knows what I'm saying when I discuss potential issues with drivers and the GTX 970 – basically, a more concise version of this comment. If you've got any remaining concerns/comments, let me know.
Addendum: So I think PhysX is actually the only problem with the Zotac 970. I went back to retest some more, and while I thought the game had crashed at 2560x1440 without PhysX, I can't get that to happen now. Of course I've reinstalled the drivers today so that may have had something to do with things as well. I still like to run my cards a bit cooler, and the fans on the 970 are quiet enough that I don't mind ramping up the RPMs a bit more, but YMMV.
Hi Jarred, yeah it's definitely a bug that affects Maxwell cards only. Running PhysX in CPU mode fixes it. Metro Last Light Redux uses PhysX 3.3, which runs way faster on the CPU than previous editions, so turning it on doesn't have any sort of performance hit that I could gather when running on the CPU.
Is there any benefit to even turning Advanced PhysX on in this game? Maybe it's not visible in the benchmark scene, as I can't tell any difference between having it on or off.
Yeah, I would say there's a benefit. You get a lot more particles, debris, destruction, smoke and fog effects, plus some cloth as well. Some of the effects aren't as interactive as they used to be in the original games, but that's because PhysX 3.x is geared more towards the CPU than the GPU. It doesn't really matter though, as the overall effect is still solid in terms of how it adds to the atmosphere of the game.
I'm still in shock at how well PhysX 3.3 runs on the CPU, because the 2.x versions all ran horribly on it. It scales perfectly across my overclocked 3930K and runs without a hitch! It's a sign of things to come with future PhysX titles, to be sure.
I just tested this and I'm not sure PhysX is truly running 100% on the CPU, or if NVIDIA is doing some funny stuff in their drivers. If PhysX set to CPU is really doing everything on the CPU, why do I get better performance that way than if I use GPU PhysX? And what's more, if PhysX runs like this on the CPU, why doesn't it run that well on AMD GPUs? I know with Optimus there were games that would sometimes ignore the "force to run on dGPU (or iGPU)", so it's possible "forcing" CPU PhysX isn't really doing what we expect.
Hi Jarred, can you compare this game's performance in Linux and Windows with the same hardware? I'm curious about the difference across platforms, and it would also function as a good signpost for future benchmarks that compare against the performance here.
Take it easy on Jarred here, guys. Totally out of line criticizing him in this way. He obviously put a lot of time into this, had some problems with the 970, and proposed one potential cause.
Jarred - I've seen some negative user reviews of the Zotac 970, which happens to be one of the only 970s that's been available since launch. Perhaps this first run of 970 GPUs did have some problems. I don't think increasing the fan speed, however, would alleviate crashes. If it's crashing, there's either a hardware or software (game/driver) problem, assuming you're running at stock settings of course.
In other words, Nvidia should be taking a look at these results, because even if it's an error by Zotac in their build of the 970, Nvidia would want to know about it.
"I would like to chime in, and say metro redux crashes my 780 TI as well when i have the advanced physx setting enabled. No matter what i ever tried, that game makes my nvidia driver do one of those crashed and recovered errors. A ton of people have this issue, its the game sadly. I doubt it is your GPU itself."
Note EVGA tech support sent a private mail to the guy that had an issue. No bios or driver update mentioned. My guess, he probably got an RMA.
http://steamcommunity.com/app/286690/discussions/0... Steam community: problems with the driver crashing even when running PhysX on the CPU (comment 8). And here the 1st comment reports crashes on the 780, 780 Ti, and 980, according to the user in less than 10 minutes.
http://steamcommunity.com/app/286690/discussions/0... Crashing even at 1600x900, and one guy saying a thread was deleted and assuming a person or two are working for 4A or Deep Silver (at least on Steam's forums): "So you two have tagged teamed up? Are you working undercover for 4A or Deep Silver? I only say this because I already explained in detail why you're wrong Mattplego and why the game was obviously rushed cash grab based on the facts we know. The thread was mysteriously deleted, Not closed and locked but deleted. "
At this point I'm not even sure your CARD is the problem (certainly not the chip), but that was the point of the first post anyway :) Maxwell chips are FINE. You, however, appear to not be. ;)
Wow dude, you have some issues. The man made an educated guess at an issue and suggested many possible causes and avenues for it. There was no bias in his writing, and he was pretty much spot on when it comes to integrity.
Just because you spent an extra god knows how many hours doing research on the matter and found a more probable reason for the issue doesn't devalue the quality of the author's work. In fact, your attitude in summary just shows you to be a bit of a douche.
If the 970 is crashing then I'd think it would be prudent to test another 970, just to rule out hardware error. I haven't played Metro with my 970, but I haven't experienced any stability issues since installing mine.
I thought about that before, but basically it would double the time, and that's a lot of work for a small payoff in information. Plus, what CPU should I use as a second option? The i3-3225 I have sitting around, or I could get an FX-8320. I think most CPUs used for gaming will be at least at the i5 level, and outside of CrossFire and SLI rigs the performance will generally be GPU limited, regardless of CPU. If there's enough demand for it, I'll reconsider, but for now I'm sticking to one CPU. :)
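To illustrate why the CPU matters less as settings go up, a deliberately simplified model (not a measurement, and the millisecond figures are invented): per-frame time is roughly bounded by whichever of the CPU or GPU takes longer, so once GPU frame time dominates, a faster CPU barely moves the result.

```python
# Toy model of GPU-limited vs. CPU-limited frame rates:
# frame time ~= max(CPU frame time, GPU frame time).
# All numbers are hypothetical; real pipelines overlap work less cleanly.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Low settings: the GPU is fast, so the CPU is the bottleneck.
print(round(fps(cpu_ms=8.0, gpu_ms=5.0), 1))   # 125.0 FPS (CPU-bound)
print(round(fps(cpu_ms=6.0, gpu_ms=5.0), 1))   # 166.7 FPS (faster CPU helps)

# High settings: the GPU dominates, so the same CPU upgrade does nothing.
print(round(fps(cpu_ms=8.0, gpu_ms=25.0), 1))  # 40.0 FPS
print(round(fps(cpu_ms=6.0, gpu_ms=25.0), 1))  # 40.0 FPS (unchanged)
```

This is why a stock-vs-4.1GHz i7-4770K comparison shows little difference at higher quality settings: the GPU term in the max() is the one being stressed.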
I'd love to see testing at higher res and/or SLI/CF configs, I know that adds to the variables and would require a different test group besides the laptops, and I realize those of us running larger displays and multiple cards are the minority, but still...
A lot of these tests are just gonna boil down to a single fact otherwise: a large percentage of recent cards can run everything just fine at 1080p, and a smaller percentage can't. Pushing setups to the limit is usually a more interesting read, and more revealing of the relative performance differences.
Unfortunately I have no SLI configurations right now, though I can run 280, 290, and 290X CF. I have a lot more AMD GPUs at present than NVIDIA, though I'll see if we can fix that. Just one more 970 and two 980 cards and I'd be set. Hahaha.
Metro Last Light was one of those games that's remarkable but unremarkable when it came to SLI.
We did extensive benchmarks with Titans in Surround at 1600p and 1440p (landscape and portrait Surround, meaning 7680x1600 and 4800x2560). Multi-GPU is similar to Titanfall... not optimized.
It would be interesting to see if they fixed this issue in the Redux version. Are there any plans to check this at resolutions other than 1080p? We of course appreciate your article, but the industry is moving away from 1080p slowly but surely, and it would be good to see benchmarks from you guys in this regard.
Surround / higher resolutions are more demanding and more accurately show whether the developer is taking advantage of higher-resolution textures. I am not knocking 1080p; as a developer, they target what "70+%" of their clientele (us) use. Hence SLI / Surround is never optimized proficiently. However, with higher resolutions, SLI usage should be increasing.
Hope that makes sense as a dumbed-down version, but it would be nice to see benchmarks that are not just 1080p; otherwise most of these articles are not as beneficial. Anandtech of all people should be able to switch out a monitor or two.
At this time, I only have CrossFire setups for 290X and 280 (and 6970, though that's sort of not useful now). I only have single NVIDIA GPUs for the time being, and of course more configurations means more time to test. Assuming I can get a second GTX 970 (780 and 770 optional), I could at least run a few comparisons for surround gaming as I do have multiple monitors available.
However, let me just say that I don't think developers are targeting the 70+% when they ignore multiple monitors but more like the 95%. Yes, multiple displays are a good way to bring GPUs to their knees, but so are 2560x1440 and 3840x2160. I'm more inclined to add those than surround gaming.
I actually have 2560x1440 numbers available, but at least for Metro Redux (with SSAA enabled) it's not particularly useful data without SLI/CF results. The GTX 780 hits 26.6 FPS average, the 970 is 25.6, and the R9 290X is also 25.6. A single R9 280 meanwhile is down at 15.6 FPS and the R9 280X is 19.2 FPS. In other words, not one of the single GPU configurations is able to reach 30+ FPS in Metro Redux. (Note: GTX 980 probably gets there, but not with much room to spare.)
Hope that helps; if this becomes a regular section on AnandTech (which is what I'm hoping to do), we'll almost certainly add additional GPUs in the future. Consider this the beta release. Hahaha. :-)
I hear what you are saying. I do feel though that in today's day and age multiple GPU configurations will become more and more mainstream. From 4k to multiple monitors it would be great to see benchmarks more reflective of that.
As I said, the reason programs/games are not always optimized is the developer's budget. Nvidia and AMD claim it's a simple driver update, but that's not the case (e.g. Titanfall, and a few other games that needed code edited).
OrphanageExplosion - Thursday, October 2, 2014 - link
It's a bit of a curious test really. The Metro Last Light benchmark is really not indicative of performance during actual gameplay. Although built around assets from the game's most demanding level, it's a GPU and CPU stress-test using conditions that aren't actually present in the game as you see them in that sequence. It's a benchmark, nothing more.I feel a better comparison would have been Metro 2033, which actually sees radical differences compared to the original, whereas Last Light has less noticeable enhancements. I guess at least the article demonstrates that unless you want the DLC, you don't get a hugely improved experience with Redux over buying the original? But that just makes the case for a 2033 performance comparison that much more apparent.
JarredWalton - Thursday, October 2, 2014 - link
In general I don't plan on doing many of these where the game is an update to an existing game -- it would normally just be a question of "how well does game XYZ run on various GPUs?" Metro Redux was a bit of special case, and since the original Last Light is still a bear to run I figured a short look at how much things have changed (if at all) would be good.You're right that the built-in benchmark isn't necessarily indicative of the actual game, but there are scenes that can be very demanding and so it's basically a worst-case test. If you can run the benchmark and get decent frame rates, you can safely play the rest of the game. It's also not all that CPU-intensive, or at least not so much that going from stock clocks to 4.1GHz on the i7-4770K makes much of a difference on single GPUs at anything beyond the lower settings.
MrSpadge - Thursday, October 2, 2014 - link
Regarding the Zotac GTX970: is it a factory-overclocked one? If raising the fan speed helps a bit, it seems like the clocks have been pushed too high for that chip. Does underclocking solve the stability issues? If that's the case it's a problem of Zotac and their binning, not nVidia. If the card runs at stock clocks it's nVidias problem.TheJian - Thursday, October 2, 2014 - link
Considering less than 3% run above 1920x1200 NV has time to fix this (rather you have time to RMA your card...LOL), and it may just be Zotac's fan that is a problem here or your particular SINGLE sample of a single vendor's cards. Considering further that most of that 3% have more than one card, this comment is even more pointless.http://www.anandtech.com/show/8568/the-geforce-gtx...
Why was there no problems in basically the same game (and all the others) tested previously in Anandtech's 970 review. Seems silly to call NV's whole 970 product line into question (made by many vendors) when you get ONE sample that doesn't do the job in a particular res but your own 970 review shows nothing even OC'ed in any game up to 4K.
http://www.anandtech.com/show/8568/the-geforce-gtx...
If it's right up there with Crisis 3, why doesn't this game crap out in anandtech's 970 review at 1600p or 4k? Or any other game at these resolutions? Oh, right, AMD slant...
But that's anandtech for you (AMD portal and all) ;) Just saying...I mean, when you have a single card sample of AMD with an issue do you call the whole AMD line a problem? You don't even suggest such things. You say you had a problem with your particular card sample (at worst), which is exactly what you should have done here, with a tidbit saying but it probably doesn't affect other cards since so many review sites had ZERO issues at 1600P or even 4K.
http://www.anandtech.com/show/8568/the-geforce-gtx...
Why doesn't it crap out when OC'ed to 1.45ghz at 1600p or 4K, and I mean why doesn't 970 do this at ANY of the TON of websites who tested it? Either your particular card has problems, or perhaps you're just lying because at this point there is no other way to make AMD's xmas get saved? Either way, many websites NOT having the issue would seem to suggest YOU have a problem, not NV's whole 970's card list from many vendors. Your suggestion is complete BS, but no surprise with that AMD slant this site has had since, oh 660ti review or so. :(
Such comments are silly, when evidence from everywhere at all resolutions say 970 is fine even up to ~1.5ghz. How can you question the whole 970 line knowing NO other site (AFAIK) had any problems with a SINGLE game, worse you call the whole 900 series into question...LOL.
http://in.ign.com/nvidia-gtx-980/64915/review/nvid...
A single sample of 980 worked fine in the same game, so I guess based on one sample (using your logic) we now know ALL 980's have ZERO problems in this game even up to 4K right?...ROFL. See how stupid that is? Worse, I could go on and say since they had no problems here, probably 970 has no issues either.
Both your example and mine are stupid, but the tons of sites and a huge number of games tested across all those sites allows me to confidently say, you sir are an idiot or this is just yet another example of this site showing the old Anandtech AMD slant (I don't think you're an idiot, so draw your own conclusions). I can't really see how it could be anything besides these two options.
In the face of all other websites showing 980/970 fine in all resolutions and tons of games, can you please explain how a SINGLE card having an issue means 970's at least, and possibly all 980's could need to have Nvidia tweak their drivers? ROFL.
Left off one more option though I guess...You could just be lying ;) In any case, your assumption is ridiculous. It only took 3 comments for you to be called out on this (MrSpadge already explained what should have been obvious to a long time reviewer on a site like Anandtech). But I digress...
willis936 - Thursday, October 2, 2014 - link
So two systems running the same software environment and the same SKU would expect different results between samples? If their temperatures and clocks are the same? This is a joke right? Drivers will affect data. Other hardware will affect data. Specific samples will always perform identically non-overclocked. Always.Also baseless accusations of falsifying data is disgusting.
JarredWalton - Thursday, October 2, 2014 - link
Holy cow! Let me start by saying that the paragraph causing the most controversy was (in my mind) rather innocuous. Seriously, saying "the game is crashing at times with Advanced PhysX enabled" was not the main point of this post! Anyway, I'm going to confine all of my responses in regards to the comments on stability/drivers to this single post. And for the record, yes, I deleted several of my earlier comments in order to clean things up (there's no edit function for me either, but I can at least delete my comments and post them again in edited form). I repeated myself a few times, so I've wiped those four initial responses and I'm going to try to cover all my bases in this single comment, which is in response to the first 10 or so comments (with a heavy slant towards TheJian).First, while the Zotac is technically not a stock card (GPU clock is 1076 and RAM is 7010, compared to 1050/7000 for true "stock"), I don't think that's the problem. It runs very quiet, but it's a bit too quiet as it's getting relatively hot. I don't know if it's the GPU core or some other element, but where the card tended to crash after 5-10 minutes of heavy load initially in some games, a small bump in fan speed ramps did the trick for fixing that problem.
The exception is MLL Redux with Advanced PhysX enabled, where it's still consistently crashing on the second pass of the benchmark, sometimes even the first pass. It could be the card, it could be the game, or it could be the drivers – or perhaps a little bit of each. However, where all the other settings and games I've tried now run fine on the GTX 970, Redux with PhysX is unstable. This was not the case with other NVIDIA GPUs in my limited testing of PhysX, but the GTX 970 can't even make it through two passes of the benchmark. (I run three passes and use the best result of the second or third pass for the reported results, if you're wondering).
That makes me think it's the drivers, and I've pinged NVIDIA on it so we'll see, but as I noted the Advanced PhysX really doesn't seem to do anything useful so it's not a huge issue. I'm currently doing additional testing, and where Advanced PhysX crashed on the second run at high and very high settings, it managed to make it through three full passes before crashing on the fourth loop when quality was dropped to Medium. Advanced PhysX looks like the main culprit, but it takes longer to manifest at lower quality settings – and we still don't know if the root issue is with the game or the (PhysX) drivers.
Continuing with the above, I swapped GPUs and went back to the GTX 780 to run some tests with Advanced PhysX doing 8 passes of the benchmark. At 1080p High and Very High + SSAA, the benchmark completed all eight passes without crashing to the desktop. At 2560x1440 however, it crashed to desktop on the fourth pass (51 seconds into the benchmark). So instability with Advanced PhysX is certainly present on other NVIDIA GPUs, but it appears to be less of a problem than it is on the GTX 970. (And note: I'm only talking about Redux with Advanced PhysX here – at least initial, limited testing of Batman: Arkham Origins didn't encounter any problems.)
Keep in mind, we're talking about a brand new GPU on a brand (remastered) new game, which is sort of the point of this article -- how does a recent release perform? In this case, Redux is unstable on my Zotac GTX 970, but only at certain settings (basically higher quality settings). I had to bump up the fan speed to address this, and now the only remaining problem is PhysX. The other NVIDIA GPUs didn't encounter this problem in my normal testing, at least not in three passes of the benchmark, but then they've all been around for 6+ months. Again, this is the reason to test new releases on a broader set of hardware in articles like this. What does a hardware reviewer find in terms of performance and stability? In this case, performance is about what you'd expect from a "new" Metro game, and stability issues occurred when enabling Advanced PhysX, particularly on the GTX 970.
On a broader note, this does not affect my opinion of the GTX 970 as a good card. Heck, I went out and bought the Zotac because it was a very compelling card – I wanted one for my own use! So yeah, TheJian, I love AMD so much and I hate NVIDIA so badly that I spent $329 on a Zotac GTX 970. [Rolls eyes] Okay, part of the reason for getting the card was to also run tests like this, and the only reason I bought the Zotac instead of a different GTX 970 was because the others were all out of stock. With my fan speed tweaks, it's running at <70C under full load and still generates far less noise than, say, the GTX 780 – and the R9 290X is in a completely different category when it comes to noise (in a bad way).
Finally, I've gone back and edited/clarified the text a bit just to make sure everyone knows what I'm saying when I discuss potential issues with drivers and the GTX 970 – basically, a more concise version of this comment. If you've got any remaining concerns/comments, let me know.
--Jarred Walton
JarredWalton - Friday, October 3, 2014 - link
Addendum: So I think PhysX is actually the only problem with the Zotac 970. I went back to retest some more, and while I thought the game had crashed at 2560x1440 without PhysX, I can't get that to happen now. Of course, I reinstalled the drivers today, so that may have had something to do with it as well. I still like to run my cards a bit cooler, and the fans on the 970 are quiet enough that I don't mind ramping up the RPMs a bit more, but YMMV.
Carfax83 - Friday, October 3, 2014 - link
Hi Jarred, yeah, it's definitely a bug that affects Maxwell cards only. Running PhysX in CPU mode fixes it. Metro Last Light Redux uses PhysX 3.3, which runs way faster on the CPU than previous editions, so turning it on doesn't have any sort of performance hit that I could gather when running on the CPU.
JarredWalton - Friday, October 3, 2014 - link
Is there any benefit to even turning Advanced PhysX on in this game? Maybe it's not visible in the benchmark scene, as I can't tell any difference between having it on or off.
Carfax83 - Friday, October 3, 2014 - link
Yeah, I would say there's a benefit. You get a lot more particles, debris, destruction, smoke and fog effects, plus some cloth as well. Some of the effects aren't as interactive as they used to be in the original games, but that's because PhysX 3.x is geared more towards the CPU than the GPU. It doesn't really matter, though, as the overall effect is still solid in terms of how it adds to the atmosphere of the game.
I'm still in shock at how well PhysX 3.3 runs on the CPU, because the 2.x versions all ran horribly on it. It scales perfectly across my overclocked 3930K and runs without a hitch! It's a sign of things to come with future PhysX titles, to be sure.
JarredWalton - Friday, October 3, 2014 - link
I just tested this and I'm not sure PhysX is truly running 100% on the CPU, or if NVIDIA is doing some funny stuff in their drivers. If PhysX set to CPU is really doing everything on the CPU, why do I get better performance that way than if I use GPU PhysX? And what's more, if PhysX runs like this on the CPU, why doesn't it run that well on AMD GPUs? I know with Optimus there were games that would sometimes ignore the "force to run on dGPU (or iGPU)" setting, so it's possible "forcing" CPU PhysX isn't really doing what we expect.
srkelley - Tuesday, October 21, 2014 - link
Hi Jarred, can you compare this game's performance in Linux and Windows with the same hardware? I'm curious about the difference across platforms, and it would also function as a good signpost for future benchmarks that compare with the performance here.
Termie - Thursday, October 2, 2014 - link
Take it easy on Jarred here, guys. It's totally out of line to criticize him in this way. He obviously put a lot of time into this, had some problems with the 970, and proposed one potential cause.
Jarred - I've seen some negative user reviews of the Zotac 970, which happens to be one of the only 970s that's been available since launch. Perhaps this first run of 970 GPUs did have some problems. I don't think increasing the fan speed, however, would alleviate crashes. If it's crashing, there's either a hardware or software (game/driver) problem, assuming you're running at stock settings of course.
In other words, Nvidia should be taking a look at these results, because even if it's an error by Zotac in their build of the 970, Nvidia would want to know about it.
gonchuki - Thursday, October 2, 2014 - link
No R9 285? That's disappointing. This is one of the few cases where we could see how the updated GCN fares in this old vs. new engine comparison.
TheJian - Thursday, October 2, 2014 - link
http://forums.evga.com/To-eVGA-GTX980-SC-NEEDS-Bio...
Apparently the game crashes on many cards (780 Ti also), and one user says it's the GAME itself:
"I would like to chime in, and say metro redux crashes my 780 TI as well when i have the advanced physx setting enabled. No matter what i ever tried, that game makes my nvidia driver do one of those crashed and recovered errors. A ton of people have this issue, its the game sadly. I doubt it is your GPU itself."
Note that EVGA tech support sent a private mail to the guy who had the issue. No BIOS or driver update was mentioned. My guess: he probably got an RMA.
http://steamcommunity.com/app/286690/discussions/0...
Steam community: problems with the driver crashing even when running PhysX on the CPU (comment 8). But here the 1st comment reports crashing on a 780, 780 Ti, and 980, and according to the user in less than 10 minutes.
I suppose I could dig for AMD crashes too, but this one came up in my NV search...LOL
http://www.playstationtrophies.org/forum/metro-las...
PS4 is AMD, right? It crashes too? Whatever, you get the point.
http://steamcommunity.com/app/286690/discussions/0...
Crashing even at 1600x900, and one guy saying a thread was deleted, assuming a person or two are working for 4A or Deep Silver (at least on Steam's forums):
"So you two have tagged teamed up? Are you working undercover for 4A or Deep Silver?
I only say this because I already explained in detail why you're wrong Mattplego and why the game was obviously rushed cash grab based on the facts we know. The thread was mysteriously deleted, Not closed and locked but deleted. "
At this point I'm not even sure your CARD is the problem (certainly not the chip), but that was the point of the first post anyway :) Maxwell chips are FINE. You, however, appear to not be. ;)
Death666Angel - Thursday, October 2, 2014 - link
Someone didn't take his meds...
Bob Todd - Thursday, October 2, 2014 - link
He never takes his meds. Every post is a 15-paragraph-long tl;dr glimpse into cuckoo town.
Tunnah - Friday, October 3, 2014 - link
Wow dude, you have some issues. The man made an educated guess at an issue and offered many possible avenues to its cause. There was no bias in his writing, and he was pretty much spot on when it comes to integrity.
Just because you spent an extra god knows how many hours doing research on the matter and found a more probable reason for the issue doesn't devalue the quality of the author's work. In fact, your attitude in summary just shows you to be a bit of a douche.
Subyman - Thursday, October 2, 2014 - link
If the 970 is crashing, then I'd think it would be prudent to test another 970, just to rule out hardware error. I haven't played Metro with my 970, but I haven't experienced any stability issues since installing mine.
mapesdhs - Thursday, October 2, 2014 - link
Blimey, some people do get kinda heated about so many issues... :|
So, less seriously, Jarred, a smile was inevitable at finding both of your "And on a final note" paragraphs... ;D
Ian.
JarredWalton - Friday, October 3, 2014 - link
Fixed! No double final notes -- that's what I get for editing and adding content after posting. Hahaha.
xTRICKYxx - Thursday, October 2, 2014 - link
Could you do different CPU + GPU combinations for these kinds of articles? I know it would be time consuming...
JarredWalton - Friday, October 3, 2014 - link
I thought about that before, but basically it would double the time, and that's a lot of work for a small payoff in information. Plus, what CPU should I use as a second option? The i3-3225 I have sitting around, or I could get an FX-8320? I think most CPUs used for gaming will be at least at the i5 level, and outside of CrossFire and SLI rigs the performance will generally be GPU limited, regardless of CPU. If there's enough demand for it, I'll reconsider, but for now I'm sticking to one CPU. :)
Impulses - Friday, October 3, 2014 - link
I'd love to see testing at higher resolutions and/or SLI/CF configs. I know that adds to the variables and would require a different test group besides the laptops, and I realize those of us running larger displays and multiple cards are the minority, but still...
A lot of these tests are just gonna boil down to a single fact otherwise: a large percentage of recent cards can run everything just fine at 1080p and a smaller percentage can't. Pushing setups to the limit is usually a more interesting read, and more revealing of the relative performance differences.
JarredWalton - Friday, October 3, 2014 - link
Unfortunately, I have no SLI configurations right now, though I can run 280, 290, and 290X CF. I have a lot more AMD GPUs at present than NVIDIA, though I'll see if we can fix that. Just one more 970 and two 980 cards and I'd be set. Hahaha.
tential - Friday, October 3, 2014 - link
I think this is a good idea, but I don't like the day-one testing -- or rather, I hope you do follow-up testing for the more popular games.
DPOverLord - Friday, October 3, 2014 - link
Metro Last Light was one of those games that's remarkable but unremarkable when it came to SLI.
We did extensive benchmarks with Titans in Surround at 1600p and 1440p (portrait Surround, meaning 7680x1600 and 4800x2560). Multi-GPU is similar to Titanfall... not optimized.
It would be interesting to see if they fixed this issue in the Redux version. Are there any plans to check this in resolutions other than 1080p? We of course appreciate your article, but the industry is moving away from 1080p slowly but surely, and it would be good to see benchmarks from you guys in this regard.
Surround / higher resolutions are more demanding and more accurately show whether the developer is taking advantage of higher-resolution textures. I am not knocking 1080p; as developers, they look to what "70+%" of their clientele (us) use, hence why SLI / Surround is never optimized proficiently. However, with higher resolutions, SLI usage should be increasing.
Hope that makes sense in a dumbed-down version, but it would be nice to see benchmarks that are not just 1080p; otherwise most of these articles are not as beneficial. AnandTech of all people should be able to switch out a monitor or two.
JarredWalton - Saturday, October 4, 2014 - link
At this time, I only have CrossFire setups for the 290X and 280 (and 6970, though that's sort of not useful now). I only have single NVIDIA GPUs for the time being, and of course more configurations means more time to test. Assuming I can get a second GTX 970 (780 and 770 optional), I could at least run a few comparisons for surround gaming, as I do have multiple monitors available.
However, let me just say that I don't think developers are targeting the 70+% when they ignore multiple monitors, but more like the 95%. Yes, multiple displays are a good way to bring GPUs to their knees, but so are 2560x1440 and 3840x2160. I'm more inclined to add those than surround gaming.
I actually have 2560x1440 numbers available, but at least for Metro Redux (with SSAA enabled) it's not particularly useful data without SLI/CF results. The GTX 780 hits 26.6 FPS average, the 970 is 25.6, and the R9 290X is also 25.6. A single R9 280 meanwhile is down at 15.6 FPS and the R9 280X is 19.2 FPS. In other words, not one of the single GPU configurations is able to reach 30+ FPS in Metro Redux. (Note: GTX 980 probably gets there, but not with much room to spare.)
Hope that helps; if this becomes a regular section on AnandTech (which is what I'm hoping to do), we'll almost certainly add additional GPUs in the future. Consider this the beta release. Hahaha. :-)
DPOverLord - Sunday, October 5, 2014 - link
Thanks for writing back!
I hear what you are saying. I do feel, though, that in today's day and age multiple GPU configurations will become more and more mainstream. From 4K to multiple monitors, it would be great to see benchmarks more reflective of that.
As I said, the reason programs/games are not always as optimized is the developer's budget. Nvidia and AMD claim it's a simple driver update, but that's not the case -- e.g. Titanfall and a few other games that needed code edited.