FPS, FPS, FPS, FPS, FPS... oh, and another FPS! Hopefully you can get in at least a few other genres, like simulation (GRID seems to be a fine inclusion), RPG (other than Fallout 3), and RTS.
From your comments, it looks like it will be an X58 Nehalem platform; does that mean no need for an NVIDIA chipset to run SLI? God I hope so!
Yes, please post more information than just the average framerate. The problem with averages is that a card pumping out a solid, steady framerate can end up with the same average as a card that is fluctuating wildly between highs and lows. As long as the math worked out, you couldn't tell which card was actually better to play on.
Of course, this could also be solved with a line graph showing framerates over the course of the test, instead of a simple average framerate bar graph. For some reason just about every review site under the sun uses bar graphs though... Well, except for one, but I don't want to mention names in case it starts some sort of review-site-fanboy war. -_-
Of course, then they would be back to several charts for each game, as you would need individual charts for each resolution. And as there would be many thousands of frames in a test, there would still be some averaging to compress that data down into a graph 500 or so pixels wide.
True, it does make the review more complex. But that's a good thing, as readers will get a lot more information out of it. A line showing what a card was doing over the course of a test is far more useful than a bar labeled "35FPS".
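The averaging the comments above describe can be sketched roughly like this: bucket thousands of per-frame timings down to a fixed number of plot points (the function name and shape are mine, not from any review site's actual tooling):

```python
def bucket_frame_times(frame_times_ms, buckets=500):
    """Compress a long list of per-frame times (ms) into `buckets`
    points for plotting, averaging the frames in each bucket."""
    n = len(frame_times_ms)
    if n <= buckets:
        return list(frame_times_ms)
    out = []
    for i in range(buckets):
        # Integer slicing spreads the frames evenly across buckets
        lo = i * n // buckets
        hi = (i + 1) * n // buckets
        chunk = frame_times_ms[lo:hi]
        out.append(sum(chunk) / len(chunk))
    return out
```

Even averaged this way, a line built from those points still shows dips and spikes that a single average-FPS bar hides entirely.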
I have to agree. It was particularly nice to see games like Devil May Cry 4 and Mass Effect being tested in the laptop reviews - they're modern games with awesome graphics.
Stolf2012 - Thursday, October 30, 2008 - link
I agree with chizow, I would love to see you guys follow through with your Vista SP1 vs. XP SP3 comparison. A couple of comparative screenshots showing the difference between DX9 and DX10 would be great to include, so users can decide whether the performance hit with DX10 is worth it.

Looking forward to it,
Bill.
ebayne - Thursday, October 30, 2008 - link
AoC is a pig. It brings systems to their tiny little knees. I think it would be a worthwhile addition to the test suite; it's the equivalent of Crysis '05.

aguilpa1 - Thursday, October 30, 2008 - link
Is there anywhere we could download the recorded game scripts for your new games, so we could test our own systems against your testbed for comparison?

Zar0n - Thursday, October 30, 2008 - link
The problem is not the performance but consistency. Take this article for example:
http://www.behardware.com/articles/731-1/ssd-produ...
Several sites are reporting that the Intel SSD differs in performance depending on the type of data, and also takes some time to get stable performance after some types of write operations.
So use a fast hard drive like a WD raptor or a more "reliable" SSD like a samsung with SLC Nand.
lyeoh - Wednesday, October 29, 2008 - link
(sorry for dupes if any, my post doesn't seem to be showing up)

I personally regard 3D Mark as a meaningless test and a waste of time, except to overclockers who mainly use their computers to run 3D Mark, superpi, etc.
The time it takes to run a 3D Mark test might as well be used to run a benchmark of a real application/game.
Something like a flight simulator benchmark would definitely be more meaningful than 3D Mark. I believe only a very few flight sim games have been released in the past few years; that could be a plus or a minus depending on how you view it. I personally don't care :).
In fact 2D performance tests might be more useful to me - some 3D cards don't have as good 2D performance as others.
What I find annoying with some other benchmark sites is they only test resolutions like 2560 x 1600. Yes that's useful to test the really high end, but many people are still using 1280x1024 and 1680x1050. That's one of the reasons why I prefer Anandtech :).
Buying a bigger display is a lot of money - the display costs more, and you need to spend tons more on graphics cards just to drive that display at a decent frame rate.
Regarding minimum frame rates - if frame rate graphs are not possible, posting minimum and maximum frame rates would be good (averaged over X seconds minimum).
Then there's SLI. I've heard that for some SLI stuff, the interframe delay going from card #1 to card #2 could be different from card #2 to card #1. Say the average frame rate is 60 fps. So on average there's 16ms between frames. However in theory card #2 could be producing a frame 2 milliseconds after card #1, and then nothing happens for 30ms. So the actual perceived display is not quite as smooth as the numbers might have you believe - it might appear closer to 30fps, or "jittery".
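A rough sketch of why the uneven SLI frame pacing described above matters (intervals made up to match the example): the average interval works out to ~60 FPS, but the long gap between frame pairs is closer to what the eye perceives.

```python
def fps_stats(frame_intervals_ms):
    """Average FPS, plus the FPS implied by the worst (longest) interval."""
    avg_interval = sum(frame_intervals_ms) / len(frame_intervals_ms)
    worst = max(frame_intervals_ms)
    return 1000.0 / avg_interval, 1000.0 / worst

# Alternating short/long gaps, as in the hypothetical SLI example above:
intervals = [2.0, 31.3] * 30  # roughly one second of frames
avg_fps, worst_fps = fps_stats(intervals)
# avg_fps comes out around 60, but worst_fps is around 32 --
# the same average that a steady 60 FPS card would report
```

This is exactly why frame-time graphs or percentile figures say more than an FPS average.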
Last but not least, if possible try to measure _latency_ as well. e.g. measure the time it takes for mouse button down and/or key down to the action being displayed on the screen. A video card or video driver that produces higher frame rates but adds a lag of 50 milliseconds will be bad for most games where frames per second count. Testing latency should make for an interesting article. If you find that in general the latency is insignificant - say lower than 10ms, you can leave it out of the standard benchmarks and only do latency comparison tests for things like some fancy new tech wireless mouse.
blw37 - Wednesday, October 29, 2008 - link
The problem I have with testing low- and mid-range cards on a high-end system is that it provides no information about the tradeoffs you are actually making on the type of system that uses such cards. Say I have an E6550 and an 8600GT at the moment and a limited budget for upgrading. I want to know whether upgrading the GPU or the CPU provides the best payoff. There is no point testing a GTX280 or 4870X2 on a low- to mid-range system, but equally there is no point testing a 4350 with a QX9700.

KeithP - Tuesday, October 28, 2008 - link
I understand why a high-end system is being used to test video cards, but as someone looking for information on what to buy in the mid-range to low-end class, I find it extremely frustrating.

AT really needs two test beds: a lower-spec system and the high-end system. Seeing relative differences on the high-spec system is useless because it still doesn't tell me whether the frame rates I will be seeing will be playable.
I would think something in the 2-2.2 GHz range with 2GB of RAM and Win XP would be good.
-KeithP
crimson117 - Tuesday, October 28, 2008 - link
He explained that already... He can't possibly try every CPU with the new GPU, so he asks the question: "When it's not limited by the CPU, how well can this new graphics card perform?"
It'd be pretty boring to see games top out at 1024x768 with nVidia 280 SLI just because a slow CPU couldn't keep up.
cool - Tuesday, October 28, 2008 - link
"un-Vistaing Vista"?Just install XP32/64 and you can save yourself all the trouble. Who plays/cares about DX10 games anyway?
JimmiG - Tuesday, October 28, 2008 - link
All that talk about un-Vistaing Vista only applies if you want to run benchmarks and want consistent results. I find that most of the background programs and services in Vista are actually useful to me on a day-to-day basis. If you're really worried that you'd get 41.6 FPS one time and 42.8 FPS the next when playing through the exact same level in the same game in exactly the same way twice, then by all means disable all the auto-tuning features so the OS slows down over time, disable indexing to make file searches take 10 minutes instead of 2 seconds, and turn off the firewall, Defender, updates, etc. so your system is overrun by viruses and malware the minute you go online.

Back when my machine featured a K6-2 CPU, 6GB of hard drive space and 128MB of RAM, I felt I had to do everything in my power to make it run faster. Now that I've got terabytes of storage space and gigabytes of RAM, I no longer feel the need to strip my gaming system to the bare minimum just to save 15MB of RAM, free up 200MB of hard drive space and in the end gain 0.5 FPS, losing in the process much of the functionality that sets a modern PC apart from a Win98 machine.
So yes, I can understand why Vista would be frustrating if you run a hardware site and want to run benchmarks... but those same features make it easier, smoother and faster for regular PC users to use on a daily basis. Instead of using the computing resources to produce even higher FPS numbers and 3DMarks, you're using them to make the PC experience more enjoyable. Maybe Microsoft is working on Windows 7 - Benchmark Edition for those who don't want the background services and auto-tuning :)
strikeback03 - Wednesday, October 29, 2008 - link
Wouldn't have to be a separate edition, just a control panel option to switch to bare minimum. If the graphics drivers could run in safe mode, that would probably be about perfect.

And why would you need a search function?
Mr Perfect - Tuesday, October 28, 2008 - link
Actually, most of those tweaks can be done on XP too: performance mode, hidden tray icons, highlighted programs, automatic updates, security center alerts, disabling the welcome screen, system restore, etc., etc.

I actually thought it was kind of funny when he called it de-Vistaing. This is usually what I call de-XPing, since you almost end up with Windows 2000 with all that stuff turned off.
strikeback03 - Wednesday, October 29, 2008 - link
There is a welcome screen in XP?

Mr Perfect - Wednesday, October 29, 2008 - link
Yes, that goofy "Click your picture icon to login" screen. If there is only one user account on the PC, I think it skips right over the welcome screen and logs you in automatically. If it does show up, you can turn it off in the control panel to get a proper CTRL+ALT+DEL login box.4wardtristan - Wednesday, October 29, 2008 - link
You can also press Ctrl+Alt+Delete twice at the welcome screen to get the old-school login screen :)

Concillian - Tuesday, October 28, 2008 - link
Yeah, for the next week or so I'm still using Windows 2000 on my main gaming machine, and I have been since 2000. Making the move to Vista, I've basically come to the realization that I'm going to end up with something where I'll be using almost no new features outside of a new sound and window manager theme.

DerekWilson - Tuesday, October 28, 2008 - link
xp 64 is worse than vista by a huge margin. and we're using more than 4GB of RAM in our future test bed.

also, even if there are people out there who don't care about dx10 at all, it is still important to look at dx10 performance to get an understanding of graphics capability and the future of the industry.
Myrandex - Wednesday, October 29, 2008 - link
I <3 my XP64. I don't know what is terrible about it. At least it lets you use h/w sound acceleration :P

I will eventually move to Vista x64 for my gaming PC, but for now I'm thoroughly pleased with XP64 (Phenom 2.6GHz, 4GB DDR2-1066, 4850 1GB, X-Fi, 500GB RAID-0, etc.). I have Vista x64 on my laptop and I use that just for getting stuff done, while I play on the XP64 machine.
Jason
chizow - Tuesday, October 28, 2008 - link
Heh, you really have a hard time hiding your contempt for Vista 64, Derek, but is it justified? You had a chance to put the issue to rest when you promised a Vista 64 vs. XP comparo nearly a year ago, before you guys made the switch to 4GB and Vista 64 earlier this year. But you never got around to it for whatever reason, so I guess we're stuck with not-so-subtle jabs at Vista until 7 (or Mojave).

As for benchmarking and testbed methodology, I'd like to see some changes, as other sites have done. I know you've said in the past that you will never do frames-vs-time graphs, but even a few per review would be invaluable in drawing conclusions that simple FPS averages would not show. Not only would it put to rest any questions about min FPS, it'd also show time spent at various frame rates.
I'd also like to see some CPU/GPU speed scaling differences for the featured part in a review. Again, this would not be feasible for all parts in a review, but if done only for the featured part and one or two games, that would give readers a good indication of how that part scales with slower/faster CPUs and also how it scales with clockspeed. Over time, one would be able to cross-reference and compare featured parts as long as the test bed remained the same.
For example, if you were reviewing GTX 280 SLI, you'd run your EE Nehalem at 2GHz/3GHz/4GHz and then re-run those tests at 550/600/650MHz GPU clock. The information you might glean from such a comparison after comparing to an earlier GTX 280 review might be that a single GTX 280 with a 2GHz CPU isn't much different than GTX 280 SLI, but very different with a 4GHz CPU. Or that a 600MHz GTX 280 isn't much different in performance relative to an OC'd GTX 260 at 650MHz etc.
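The scaling runs suggested above amount to a small test matrix. As a sketch, using the clock values from the example (the game titles are placeholders, not a proposed suite):

```python
from itertools import product

cpu_clocks_ghz = [2.0, 3.0, 4.0]
gpu_clocks_mhz = [550, 600, 650]
games = ["Game A", "Game B"]  # "one or two games", per the suggestion

# Every CPU clock / GPU clock / game combination to run for the featured part
test_matrix = list(product(cpu_clocks_ghz, gpu_clocks_mhz, games))
# 3 CPU speeds x 3 GPU speeds x 2 games = 18 benchmark runs
```

Eighteen runs per featured part is why this is only feasible for the headline card in a review, not the whole lineup.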
Lastly, I'd like to see drivers updated more frequently, or at least periodic driver comparisons for a single part from each vendor. I understand you guys need to use archived results to save time on a short deadline, but using launch drivers months after release seems a bit antiquated in current reviews. At least a comparison would show any performance difference between driver versions, if any.
Hrel - Tuesday, October 28, 2008 - link
I'd love to see a flight simulator included in your average testing; it's a HUGE niche. As for 3DMark scores, how you guys have still not decided to include 3DMark in every test you run is beyond me... but please start using it! Please! It's objective testing; games are subjective. Only when using data from both types can you create a complete picture of the performance of any tested part. RAM, motherboard, GPU or CPU - it all needs to be looked at subjectively and objectively.

Gary Key - Tuesday, October 28, 2008 - link
We have an excellent Flight Simulator X benchmark coming in the next mobo roundup. ;) Also, we used to run 3DMark and considered running 3DMark Vantage (at least on the mobo/memory side); the problem is that the graphics card manufacturers have a bad habit of doing specific driver optimizations for these programs. Usually these optimizations have no bearing on actually improving gameplay in general; they're just there to ensure the benchmark results are improved.

DerekWilson - Tuesday, October 28, 2008 - link
3dmark is not objective -- it is what futuremark thinks (subjectively) will be important to the future of gaming.

it is also fully synthetic and doesn't give a good report of what exactly a piece of hardware is good at, meaning that it isn't useful for anything practical and the information it provides is not high quality.
we would be MUCH more likely to adopt a fully task specific synthetic benchmark like GPUBench than something like 3dmark ... for performance analysis anyway.
for max load power tests, i always use 3dmark -- it can fully load the graphics hardware without loading the CPU, giving you a good GPU-level power comparison.
...
flight sims might be nice though ...
lyeoh - Wednesday, October 29, 2008 - link
I personally regard 3DMark as a meaningless test and a waste of time, except to overclockers who mainly use their computers to run 3DMark, SuperPi, etc. The time it takes to run a 3DMark test might as well be spent running a benchmark of a real application or game.
Something like a flight simulator benchmark would definitely be more meaningful than 3DMark. I believe only a very few flight sim games have been released in the past few years; that could be a plus or a minus depending on how you view it. I personally don't care :).
In fact, 2D performance tests might be more useful to me - some 3D cards don't have as good 2D performance as others.
What I find annoying about some other benchmark sites is that they only test resolutions like 2560x1600. Yes, that's useful for testing the really high end, but many people are still using 1280x1024 and 1680x1050. That's one of the reasons why I prefer AnandTech :).
Buying a bigger display costs a lot of money - the display itself costs more, and you need to spend tons more on graphics cards just to drive that display at a decent frame rate.
Regarding minimum frame rates: if frame rate graphs are not possible, posting minimum and maximum frame rates would be good (each averaged over a window of at least X seconds).
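To make the suggestion concrete, here is a minimal sketch of how windowed minimum/maximum figures could be computed from a log of per-frame render times. The frame times and the one-second window length are invented for illustration; a real benchmark tool would log one entry per rendered frame.

```python
# Sketch: windowed min/avg/max frame rates from per-frame render times.
# The frame times below are made up: a steady ~60 fps run with a dip
# to ~30 fps in the middle.
frame_times_ms = [16.7] * 120 + [33.3] * 30 + [16.7] * 120

def windowed_fps(frame_times, window_s=1.0):
    """Average FPS over consecutive wall-clock windows of window_s seconds."""
    fps_per_window = []
    elapsed, frames = 0.0, 0
    for ft in frame_times:
        elapsed += ft / 1000.0
        frames += 1
        if elapsed >= window_s:
            fps_per_window.append(frames / elapsed)
            elapsed, frames = 0.0, 0
    return fps_per_window

windows = windowed_fps(frame_times_ms)
print("min: %.1f  avg: %.1f  max: %.1f" %
      (min(windows), sum(windows) / len(windows), max(windows)))
```

The minimum here exposes the mid-run dip that a single whole-run average would hide.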
Then there's SLI. I've heard that with some SLI setups, the interframe delay going from card #1 to card #2 can differ from the delay going from card #2 back to card #1. Say the average frame rate is 60 fps, so on average there's about 16.7ms between frames. In theory, though, card #2 could produce its frame 2 milliseconds after card #1, and then nothing happens for roughly 31ms. So the actual perceived display is not quite as smooth as the numbers might have you believe - it might appear closer to 30fps, or "jittery".
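A quick numerical sketch of that scenario (the 2 ms / 31.3 ms split is the hypothetical alternate-frame-rendering pacing described above, not measured data):

```python
# Sketch: why an uneven AFR output can "feel" slower than its average
# FPS suggests. Card #2 delivers its frame 2 ms after card #1, then a
# long gap follows before the next pair.
even = [2.0, 31.3] * 60   # uneven pacing: 120 frames, avg interval 16.65 ms
smooth = [16.65] * 120    # same average interval, evenly paced

def avg_fps(intervals_ms):
    return 1000.0 * len(intervals_ms) / sum(intervals_ms)

def worst_case_fps(intervals_ms):
    # Perceived smoothness is bounded by the longest gap between frames.
    return 1000.0 / max(intervals_ms)

print(avg_fps(even), worst_case_fps(even))      # same average, ~32 fps feel
print(avg_fps(smooth), worst_case_fps(smooth))  # same average, ~60 fps feel
```

Both traces report the same average frame rate, but the longest-gap figure shows one of them is effectively running at half the smoothness.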
Last but not least, if possible, try to measure _latency_ as well: for example, the time from mouse-button-down or key-down to the action being displayed on screen. A video card or video driver that produces higher frame rates but adds 50 milliseconds of lag will be bad for most games where frames per second count. Testing latency should make for an interesting article. If you find that the latency is generally insignificant - say, lower than 10ms - you can leave it out of the standard benchmarks and only do latency comparison tests for things like some fancy new wireless mouse.
whatthehey - Tuesday, October 28, 2008 - link
FPS, FPS, FPS, FPS, FPS... oh, and another FPS! Hopefully you can get in at least a few other genres, like simulation (GRID seems to be a fine inclusion), RPG (other than Fallout 3), and RTS.
From your comments, it looks like it will be an X58 Nehalem platform; does that mean no need for an NVIDIA chipset to run SLI? God I hope so!
jnmfox - Tuesday, October 28, 2008 - link
+1. Anandtech needs to add more non-FPS games. I would like to see Company of Heroes or World in Conflict.
Also post minimum frame rates in games, not just the average.
HYPhoenix - Wednesday, October 29, 2008 - link
It would be nice if you guys showed a scatter plot for one resolution, so we could see where the frame rate spends most of its time.
Mr Perfect - Tuesday, October 28, 2008 - link
Yes, please post more information than just the average framerate. The problem with averages is that a card pumping out a solid, steady framerate can end up with the same average as a card that is fluctuating wildly between highs and lows. As long as the math works out the same, you can't tell which card is actually better to play on.
Of course, this could also be solved with a line graph showing framerates over the course of the test, instead of a simple average-framerate bar graph. For some reason just about every review site under the sun uses bar graphs, though... Well, except for one, but I don't want to mention names in case it starts some sort of review-site-fanboy war. -_-
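A toy illustration of that failure mode, with invented per-second FPS samples for two hypothetical cards:

```python
# Sketch: two hypothetical cards with identical average FPS but very
# different consistency. The per-second FPS samples are made up.
card_a = [40] * 10    # rock steady
card_b = [70, 10] * 5  # wild swings, same average

def summarize(samples):
    """Return (average FPS, minimum FPS) for a list of per-second samples."""
    return sum(samples) / len(samples), min(samples)

avg_a, min_a = summarize(card_a)
avg_b, min_b = summarize(card_b)
print("card A: avg %.0f fps, min %d fps" % (avg_a, min_a))
print("card B: avg %.0f fps, min %d fps" % (avg_b, min_b))
```

The averages are identical, but the minimum (or a full line graph) immediately shows which card actually plays smoothly.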
strikeback03 - Wednesday, October 29, 2008 - link
Of course, then they would be back to several charts for each game, as you would need individual charts for each resolution. And as there would be many thousands of frames in a test, there would still be some averaging to compress that data down into a graph 500 or so pixels wide.
Mr Perfect - Wednesday, October 29, 2008 - link
True, it does make the review more complex. But that's a good thing, as readers will get a lot more information out of it. A line showing what a card was doing over the course of a test is far more useful than a bar labeled "35FPS".
Sc4freak - Tuesday, October 28, 2008 - link
I have to agree. It was particularly nice to see games like Devil May Cry 4 and Mass Effect being tested in the laptop reviews - they're modern games with awesome graphics.
JarredWalton - Tuesday, October 28, 2008 - link
Heh... well, I can't say I care for DMC4, but since it has a built-in test it's easy to run. :-) I did try for a nice cross-section of gaming.
erikejw - Thursday, October 30, 2008 - link
Maybe you guys should do two reviews of every graphics card: one FPS review like the current one, and then one game review that tries to include all kinds of games.
That would make both you as a reviewer happy (FPS review galore) and us readers happy (game review), since that's the one we'll actually read.