69 Comments
IanCutress - Tuesday, February 5, 2013 - link
It is worth noting that FM have integrated native rendering that scales to your monitor. So Fire Strike in Extreme mode is natively rendered at 2560x1440 and then scaled to the monitor's 1366x768 as required. (source: FM_Jarvis on the hwbot.org forums)
Time to fire up some desktop four-way, see if it scales :D
dj christian - Tuesday, February 5, 2013 - link
Thanks for a great walkthrough! However, I am wondering about the Llano and Trinity systems. Are those mobile, and if so, what brand and model do they run on?
JarredWalton - Tuesday, February 5, 2013 - link
All of the systems other than the desktop are laptops. As for the AMD Llano and Trinity, those are prototype systems from AMD. Llano is probably not up to snuff, as I can't update drivers, but the Trinity laptop runs well -- it was the highest scoring A10 iGPU of the three I have right now (the Samsung and MSI being the other two).
Alexvrb - Wednesday, February 6, 2013 - link
I'm annoyed with Samsung for their memory configuration. 2GB+4GB? I'd like to see tests with that same laptop running a decent pair of 4GB DDR3-1600 sticks. On top of this, even if you can configure one yourself online... they gouge so bad on storage and RAM upgrades that it makes more sense for me to buy it poorly preconfigured and upgrade it myself. I could throw the unwanted parts in the trash and STILL come out way cheaper, not that I would do so.
Tuvok86 - Tuesday, February 5, 2013 - link
So, does the 3rd test look "next gen", and how does it look compared to the best engines?
JarredWalton - Tuesday, February 5, 2013 - link
In terms of graphics fidelity, these aren't games so it's difficult to compare. I actually find even the Ice Storm test looks decent and makes me yearn for a good space simulation, even though it's clearly the least demanding of the three tests. I remember upgrading from a 286 to a 386 just so I could run the original Wing Commander with Expanded Memory [EMS] and upgraded graphics quality! TIE Fighter, X-Wing, Freespace, and Starlancer all graced my hard drive over the years. Cloud Gate and Fire Strike are more strictly graphics demos, though I suppose I could see Fire Strike as a fighting game.
The rendering effects are good, and I'm also glad we're not seeing any rehashing of old benchmarks with updated graphics (3DMark05 and 3DMark06 come to mind). Really, though, if we're talking about games it's really the experience as much as the graphics that matters--look at indie games like FTL, where the graphics are simplistic and yet plenty of people waste hours playing and replaying the game. Ultimately, I see 3DMark more as a way of pushing hardware to extremes that we won't see in most games for a few years, but as graphics demos they don't have all the trappings of real games.
If I were to compare to an actual game, though, even the world of something like Batman: Arkham City looks better in some ways than the overabundant use of particle effects and shaders in Fire Strike. Not that it looks bad (well, it does at single digit frame rates on an HD 4000, but that's another matter), but making a nice looking demo is far different from making a good game. Shattered Horizon is a great example of this, IMO.
Not sure if any of this helps, but of course you can grab the Basic Edition for free and run it on your own system. Or if you don't have a decent GPU, Futuremark posted videos of all three tests on YouTube I think.
euler007 - Tuesday, February 5, 2013 - link
Reminds me of the days when I had a bunch of batch files to replace my config.sys and autoexec.bat to change my setup depending on what I was doing. I used QEMM back in the day, dunno why I remember that.
JarredWalton - Tuesday, February 5, 2013 - link
QEMM or similar products were necessary until DOS 5.0 basically solved most of the issues. Hahaha... I remember all the Config.SYS tweaking as well. It was the "game before the game"!
HisDivineOrder - Tuesday, February 5, 2013 - link
Kids today have it easy.
Remember when buying anything that wasn't a Sound Blaster meant PC gaming hell? I mean, MIDI was the best it got if you didn't have a Sound Blaster. And that's if you were lucky. Sometimes, you'd just get nothing. Total non-functional hell.
I remember my PC screen being given its first taste of (bad) AA with smeared graphics because you had to put your 2d card through a pass-through to get 3dfx graphics and the signal degraded some doing the pass-through.
I remember having to actually figure out IRQ conflicts. Without much help from either the system or the motherboard. Just had to suss them out. Or tough luck, dude.
I remember back when you had all these companies working on x86 processors. Or when AMD and Intel chips could be used on the same motherboards. I remember when Intel told us we just HAD to have CPU's that slotted in via slot. Then told us all the cool kids didn't use slot any more a few years later.
I can remember a day way back when that AMD used PR performance ratings to make up for the fact that Intel wanted to push speed over performance per clock. Ironic compared to how the two play out now on this front.
I can remember when Plextor made fantastic optical drives of their own design. And I can remember when we had three players in the GPU field.
I remember the joy of the Celeron 300a and boosting that bad boy on an Abit BH6 to 450mhz. That thing flew faster than the fastest Pentium for a brief time there. I remember Abit.
I remember...
Dear God, at some point there, I started imagining myself as Rutger Hauer in Blade Runner at the end.
"I've seen things you wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.”
Parhel - Tuesday, February 5, 2013 - link
I remember I bought the Game Blaster sound card maybe 6 months before the Sound Blaster came out, and how disappointed I was when I saw the Sound Blaster at the College of DuPage computer show. God, how I miss going to the computer show with my grandfather. I looked forward to it all month long. And the Computer Shopper magazine, in the early days. And reading newsgroups . . . Gosh, I'm getting old.
JarredWalton - Tuesday, February 5, 2013 - link
Celery 300a was pretty awesome, wasn't it? I think that's around the time I started reading Tom's Hardware and AnandTech (when it was still on GeoCities!) I had that same Abit BH6 motherboard, I think... I also think I used an Abit IT5H before that, with the bus running at 83.3MHz and my Pentium 200 MMX ripping along at 250MHz! And I had a whopping 64MB of RAM.
But even better were the good old days before we even had a reasonable Windows setup. Yeah, I recall installing Windows 2.x and it sucked. 3.0 was actually the first usable version, and there were even Windows alternatives way back then -- I had a friend running some alternative Windows OS as well. GEM maybe, or Viewmax? I can't recall, other than that it didn't really work properly for what we were trying to do at the time (play games).
Dug - Wednesday, March 13, 2013 - link
I remember deciding between 8 and 12MB and then getting my 2nd Voodoo card later and running timedemos of Quake over and over again after overclocking. The 300 to 450 was fun times too.
WeaselITB - Tuesday, February 5, 2013 - link
RE: Space Sims
http://www.robertsspaceindustries.com/star-citizen...
Parhel - Tuesday, February 5, 2013 - link
If that ever sees the light of day, I'll buy it in a second. Freelancer was one of my all-time favorite games. I still play it from time to time.
silverblue - Wednesday, February 6, 2013 - link
Seconded... great game with excellent atmosphere.
Peanutsrevenge - Wednesday, February 6, 2013 - link
Just in case you've not heard (somehow):
Check out Star Citizen.
Also cool is Diaspora, which is actually playable!
alwayssts - Tuesday, February 5, 2013 - link
It looks exactly like the video. :-P
I'm not super impressed by how it looks, but I appreciate what I think it's saying.
I am of the mind that the Fire Strike GPU tests are FM saying to the desktop community: 'if it can run test 1 at 30fps, your average framerate in DX11 titles should be OK for the foreseeable future; if it can run test 2 at 30 frames, your minimum should be okay'. The combined test obviously shows what the current feature set can do (and how far away current hardware is from using its full potential in a realistic manner). Perhaps it's not a guide, but a suggestion and/or an approximation of where games and GPUs are headed, and I think it's a reasonable one at that.
For reference, test 1 would need ~ a stock 670/highly overclocked 7870/slightly overclocked Tahiti LE or 660ti, test 2 ~ a stock 680 or 7970/highly overclocked 670 or Tahiti LE/moderately overclocked 7950 for 1080p. IOW, pretty much what people use.
The combined score looks to be a rough estimation of what to expect from the 20nm high-end to get 30fps at 1080p, which also makes sense, as it will likely be the last generation/process to target 1080p before it becomes realistic to see higher-end consumer single screens move to 4K (probably around 14nm and 2016-2017ish). Also, the difference in frame rate from the combined test to GPU test 2 is approximately the difference in resolution from 720p to 1080p... so it works on a few different levels.
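For what it's worth, the resolution part of that claim is easy to check: 1920x1080 has exactly 2.25 times the pixels of 1280x720, so if the combined test and GPU test 2 really do differ mainly in render resolution, a roughly 2.25x frame-rate gap is what you'd expect. The snippet below is just that arithmetic and assumes nothing about how 3DMark actually renders either test.

```python
# Pixel-count ratio between 1080p and 720p; nothing 3DMark-specific here.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p = 1280 * 720     # 921,600
print(pixels_1080p / pixels_720p)  # 2.25
```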
Dustin Sklavos - Tuesday, February 5, 2013 - link
An AT editor running Nehalem and only 12GB of RAM?
What are you, some kind of animal? Was that the only computer you could fit into your cave? ;)
JarredWalton - Tuesday, February 5, 2013 - link
Hey, I only upgraded from Core 2 Quad on my desktop last year! On the bright side, I have a bunch of laptops that are decent.
Penti - Tuesday, February 5, 2013 - link
Nehalem is quite fine. Performance hasn't changed all that much. Servers are still stuck on Nehalem/Westmere Xeon "MPs" or Sandy Bridge-EP/Xeon E5. It should count as a modern desktop :) Really, it's quite nice that a 2-4 year old system isn't so ancient that it can't drive games, support lots of memory, or whatever, like in the not-so-old days. It's still PCIe 2.0, DDR3, etc. You really wouldn't have thought of putting, say, a Radeon 9800 Pro into a 600-800MHz Pentium 3 Coppermine machine.
Steinegal - Saturday, March 9, 2013 - link
In the table listing the system you list the Nehalem Desktop RAM as DDR2-800 I'm pretty shure they
Steinegal - Saturday, March 9, 2013 - link
(new comment system needs an edit option)
In the table listing the system you list the Nehalem Desktop RAM as DDR2-800, I'm pretty sure they didn't support DDR2.
dj christian - Tuesday, February 5, 2013 - link
Would be great if we could get an option to check "notify me if I get replies" when posting in articles. Or integrate the forum with posts made in articles, instead of keeping them separate from each other as it is now. Having two different accounts on AT is a pain!
dusk007 - Tuesday, February 5, 2013 - link
I feel like all these scores are useless and don't really tell you anything. If they want to make a 3D benchmark, why not bench actual capability rather than fps? Build a highly variable engine and world in terms of detail, give those details weights, and let the system try to render at 30 to 60 fps or something reasonable. The benchmark would simply adjust the details automatically according to some weighting system until it reaches a verdict: this much detail the system can handle. That way mobile and desktop could be more comparable than a 1000 fps vs. 60 fps scene.
As it stands I don't quite see what Futuremark accomplished that a few game benchmarks don't do better, especially with the two-year update schedule.
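The adaptive-detail benchmark dusk007 describes above could be expressed as a simple search over a detail weight: raise or lower the detail until the frame rate lands at a playable target, then report the detail level instead of the FPS. The sketch below is purely illustrative -- the function names, the binary search, and the 30 fps target are assumptions, not anything Futuremark implements.

```python
def measure_fps(detail_level):
    """Placeholder: render the benchmark scene at `detail_level` and return average FPS."""
    raise NotImplementedError

def adaptive_benchmark(min_detail=1.0, max_detail=100.0, min_fps=30.0, iterations=12):
    """Binary-search for the highest detail weight the system can sustain at min_fps."""
    lo, hi = min_detail, max_detail
    best = lo
    for _ in range(iterations):
        mid = (lo + hi) / 2.0
        if measure_fps(mid) >= min_fps:
            best, lo = mid, mid   # playable: push the detail higher
        else:
            hi = mid              # too slow: back off
    return best                   # "this much detail the system can handle"
```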
JarredWalton - Tuesday, February 5, 2013 - link
I agree with the last part for sure; I prefer game benchmarks to synthetic graphics tests, but there's a point to running 3DMark on laptops, as I note in the article. It's a single score that you can at least get some idea of performance from -- but really, it's three scores per test, and you still have to use your brain to analyze what it all means. :-p
MrSpadge - Tuesday, February 5, 2013 - link
> As it stands I don't quite see what futuremark accomplished that a few games benchmarks don't do better.
Really? What graphically demanding games are there which can be benchmarked across different platforms?
cityuser - Tuesday, February 5, 2013 - link
Don't dare to make nVidia angry? It's so simple to draw the conclusion from the chart, but why so shy to say "AMD WINS"???
What if the longest graphics bar belonged to nVidia -- would AnandTech avoid writing "nVidia wins" and instead write so many irrelevant words??
silverblue - Tuesday, February 5, 2013 - link
Irrelevant. It's simply not a fair test. The 630M and 650M are nowhere near the top of their mobile product stack, whereas the 7970M is for AMD. Had there been a 680M in the test, I doubt you'd be saying this.
Spunjji - Thursday, February 7, 2013 - link
Did you even read any of the paragraphs around the tests? :/
Diagrafeas - Tuesday, February 5, 2013 - link
I am waiting to see how a 3570K and 3770K with a 7970 GHz Edition and GTX 680 score.
Hyperthreading seems to double the scores on physics tests on my systems...
Diagrafeas - Tuesday, February 5, 2013 - link
2600K - 7970 OC
http://www.3dmark.com/3dm/15450
and a video from the run
http://www.youtube.com/watch?feature=player_embedd...
lever_age - Tuesday, February 5, 2013 - link
I rarely game on this, but I own a system very similar in specs to the ASUS UX51VZ tested (i7-3630QM instead of i7-3612QM, same exact clocks and memory on a GT 650M, same exact clocks and timing on the RAM) and got somewhat better results. Running 310.90 WHQL.
Left number is my Inspiron 17R SE's result, right is the UX51VZ:
Ice Storm:
74570 / 58955
96962 / 72342
41239 / 35781
Cloud Gate:
9514 / 8892
11477 / 11056
5952 / 5277
Fire Strike:
1454 / 1328
1518 / 1367
8288 / 7382
Drivers, or I guess it could be the dynamic clocks of the GPU running higher. Some of the difference is the i7-3630QM being clocked 300-400 MHz higher than an i7-3612QM, but is that supposed to show up in the graphics subscores?
JarredWalton - Tuesday, February 5, 2013 - link
It could. I tested all systems with either Catalyst 13.1 or NVIDIA 313.96 drivers, with the exception of the Llano system that's running some ancient driver set because nothing newer will install.
relztes - Tuesday, February 5, 2013 - link
Do the graphics and physics tests run sequentially? With integrated graphics they really need to be stressing both components simultaneously. Giving the GPU 17 W and then the CPU 17 W won't reflect the performance in games when the two components have to share 17 W. If the ULV parts hold up a lot better in 3DMark than they do in real games, I think that is to blame.
Also, how do these test results change running on battery?
JarredWalton - Tuesday, February 5, 2013 - link
That's the point I'm trying to make: other than Ice Storm, ULV IVB is far closer to dual-core standard voltage IVB than I would expect it to be, which suggests the results aren't going to correlate all that well with real games for HD 4000.
As for running on battery, how the results change depends heavily on what power settings you use, at least with most of the systems. Set everything for maximum performance and you'll get less battery life but should have similar performance to what you see in the charts. Drop to Power Saver and optimize for battery life and you'll get anywhere from one half to maybe 20% of the performance (e.g. HD 7970M will drop the clocks pretty far if you optimize for power).
Spunjji - Thursday, February 7, 2013 - link
They do indeed run sequentially, except for the final Fire Strike test, which is "combined" and also the most taxing by some margin. So you may be on to something there.
It's also worth noting that, in fine tradition, the scores appear to be frame-rate based, so they will take little account of the awful stop-start, fast-slow chuntering that you get with Intel integrated graphics when they're at their thermal limits.
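A toy model of relztes' shared-TDP point above, with made-up numbers (the linear power-to-performance scaling and the 60/40 split are pure assumptions, not measurements): when the graphics test runs by itself the GPU can claim the whole 17 W envelope, but in a real game the CPU takes a cut, so the GPU effectively runs on a fraction of the power it had during a sequential test.

```python
TDP_W = 17.0  # shared CPU+GPU power budget on a ULV part (illustrative)

def gpu_perf(gpu_watts):
    # Crude assumption: GPU throughput scales linearly with the power it gets.
    return gpu_watts / TDP_W

sequential_test = gpu_perf(TDP_W)    # GPU has the whole budget during its own test
in_game = gpu_perf(0.6 * TDP_W)      # assumed 60/40 GPU/CPU split while gaming

print(f"In-game GPU throughput vs. sequential test: {in_game / sequential_test:.0%}")
# -> 60%, i.e. a sequential benchmark can overstate gaming performance on shared-TDP parts
```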
Landspeeder - Tuesday, February 5, 2013 - link
I would LOVE to see specs for the current crop of high-end gaming notebooks. For example - the Clevo P370EM3 / Sager NP9370-3D with a single 670M, 670M SLI, 680M, 680M SLI, 7970M, or 7970M CrossFire.
The suite of tests done in both 2D and 3D.
The suite of tests done in 7x64 and 8.
Mmmm... can you tell what I've been itching to pick up?
JarredWalton - Tuesday, February 5, 2013 - link
Other than the Alienware M17x R4 and the MSI GX60, I don't have any high-end laptops right now (though I'm trying to get something with GTX 680M to use as a comparison for the above). Regardless, some time in the next few months I'm sure we'll have another high-end laptop or two for testing.
Landspeeder - Tuesday, February 5, 2013 - link
Thanks for all that you do, Jarred!
We are transitioning many of our number crunchers at work from self-built desktops to laptops - the shifting focus of reviews to such equipment here at AnandTech has been a fantastic help, and I eagerly consume everything you folks put out.
What city are you located in? If near enough, I'd be willing to drop off a dual 680M rig as a loaner for a few days when I pull the trigger post-bonus. Have you been able to beg XOTIC PC/Sager/AVADirect/etc. for a unit? If not, would you like to enlist my help?
Given the sheer power available in the high-end laptops, I'm planning on replacing my gaming rig - a massively overclocked, watercooled i920 with dual 280 greens. Most of the games I want to play have some amount of support for 3D. Few games are really pushing today's GPUs/CPUs, as the vendors are reined back by console specs. Even the next-gen console specs appear to be a bit underwhelming. I'm fairly confident that Clevo's massive SLI rig fitted with dual 680Ms should last a good many years - hopefully. I plan on utilizing a hefty factory overclock, so I hope the heat doesn't kill the components too quickly. It also helps push this decision that my job now sees me traveling - often with long layovers - and my aging desktop rig is seeing very little use.
MrSpadge - Tuesday, February 5, 2013 - link
That's what the ORB is for. Just wait a few more days and the results should be there.
Landspeeder - Tuesday, February 5, 2013 - link
Drawing a blank, mate - ORB?
Ryan Smith - Tuesday, February 5, 2013 - link
Futuremark's Online Results Database.
Landspeeder - Tuesday, February 5, 2013 - link
Perfect - thanks, gents.
carage - Tuesday, February 5, 2013 - link
Have you considered notebooks with the possibility of external video upgrades?
I have a ThinkPad W520 connected to a ViDock via ExpressCard, which houses a GTX 670 4GB.
The scores I got so far:
ICE STORM: 97109
(Beats the Alienware with 7970M)
CLOUD GATE: 15874
(Slightly slower than 7970M)
FIRE STRIKE: 5351
(Even beats the gaming desktop 7950)
Notmyusualid - Tuesday, February 5, 2013 - link
M18x R2.
i7-3920XM, 16GB CAS10 RAM, ONE 7970M. (CrossFire equipped, but it locked my system solid the moment the DX11 tests began.)
Left number is my Inspiron 17R SE's result, right is the UX51VZ:
Ice Storm:
146723
273526
55947
Cloud Gate:
18157
32362
7159
Fire Strike:
4467
4807
10467
It was kind of fun to watch the tests, but a bore after a 3rd time, due to Crossfire issues. Would be nice to run ONE test, but I guess they need to make money, eh?
Will update when Crossfire works...
Notmyusualid - Tuesday, February 5, 2013 - link
Oops, ignore the cut & paste error there from the other poster. I have no 17R, nor UX51VZ.
It's late in Asia....
JarredWalton - Tuesday, February 5, 2013 - link
Interesting... looks like even with the 3920XM you're still getting slower results on Fire Strike than the M17x R4. What drivers are you running?
Notmyusualid - Wednesday, February 6, 2013 - link
RE: I don't see how you saw my results as slower, so here are the comparisons, side-by-side, with data from your graphs, for all, including the Fire Strike tests.
M18x R2.
i7 3920XM, 16GB CAS10 RAM, ONE 7970M. (Cross-fire equipped, but it locked my system solid the moment the DX11 tests began).
Left number is your M17x R4's results, right is my M18x R2 on ONE 7970M card:
Ice Storm:
85072 / 146723 #
116561 /273526 #
43727 / 55947 #
Cloud Gate:
16729 / 18157 #
30624 / 32362 *
6491 / 7159 #
Fire Strike:
4332 / 4467 *
4696 / 4807 *
8910 / 10467 #
So to me, I have the M17x system beat in all respects.
# = My system besting everything on your graphs.
* = Your HD7950 showing significant gain(s), over the laptops.
Operating System is Win 8 (Bloatware) x64.
Graphics drivers are AMD_Catalyst_13.2_Beta3. (13.1 didn't install at all). No CAP file for this one yet I don't think.
Now to try the Crossfire again...
jtd871 - Tuesday, February 5, 2013 - link
Jarred,
Moar data for you...
i7-3630QM 3.3GHz
GT675M 4GB, 9.18.13.623 (306.97 Driver Package)
8GB 1600 RAM
Win 7 Pro
Samsung 830 SSD (if that matters)
3DMark Basic
Screen Resolution (if that matters) 1920x1080
Scenario: Score / Graphics / Physics / Combined
Ice Storm: 80831 / 109914 / 41967
Cloud Gate: 12475 / 18162 / 5952
Fire Strike: 2091 / 2220 / 8146 / 816
MrSpadge - Tuesday, February 5, 2013 - link
Wow, seems like FM put some real effort into this one. It's more than "just another 3DMark" :)
And regarding the currently missing diagnosis information: I think they should partner with some hardware diagnostic tool vendor, like the Argus Monitor team. Whatever FM needs to implement, they already did it. Fair deal: FM integrates their diagnostic in exchange for advertising that they do so.
There's also more potential in the enhanced result analysis and the nice graphs. An interesting point would be the time between frames. Give a mean and standard deviation and we could judge which system is smoother. Give us a graph and we could easily identify micro stutter.
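The frame-time statistics MrSpadge suggests are easy to compute once you have raw per-frame timestamps; a minimal sketch follows, assuming such timestamps can be exported from the benchmark. The 2x-median threshold used to flag a micro-stutter spike is an arbitrary choice for illustration, not anything defined in 3DMark's result files.

```python
import statistics

def frame_time_stats(timestamps_ms):
    """Mean/stdev of frame times, plus indices of frames that look like stutter."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    median = statistics.median(deltas)
    spikes = [i for i, d in enumerate(deltas) if d > 2 * median]  # arbitrary threshold
    return {
        "mean_ms": statistics.mean(deltas),
        "stdev_ms": statistics.stdev(deltas),
        "spike_frames": spikes,
    }

# Example: a steady 60 fps run with a single ~50 ms hitch halfway through.
times = [i * 16.7 for i in range(120)]
times[60:] = [t + 50 for t in times[60:]]
print(frame_time_stats(times))
```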
Hrel - Tuesday, February 5, 2013 - link
Are you guys ever gonna review those ARM-based NAS boxes or not?
humbi83 - Tuesday, February 5, 2013 - link
single (gs, phs, cs) / sli af1 / sli af2
ice storm: 30402 (27721, 45965, - ) / 125970 (247612, 46323, - ) / 125551 (246867, 46159, - )
cloud gate: 14687 (26205, 5786, - ) / 18518 (49233, 5817, - ) / 18501 (49182, 5812, - )
fire strike: 3106 (3585, 8296, 1057) / 5667 (6926, 8330, 2007) / 5795 (7011, 8306, 2104)
Landspeeder - Tuesday, February 5, 2013 - link
Just what I was looking for!
Any chance to try an OC on the CPU/GPUs?
JarredWalton - Tuesday, February 5, 2013 - link
I tried overclocking the GPU on a P170EM and it totally wouldn't work -- all the utilities I tried basically hard locked the system the instant you applied new clocks. If anyone has a recommended utility for overclocking notebooks, I'd like to know what it is.
humbi83 - Wednesday, February 6, 2013 - link
Hmm... the 7970M seems much more powerful. 3585 vs ~4700. Maybe I could improve the score using new drivers, but I don't have time for it (or overclocking or anything else until they disable the demos). SLI works by the way :D.
A couple of things (long rant):
Disable the frigging demos (I'm starting to repeat myself, not OK) !!!!! or set them looping on some other button from the launcher.
No easily discernible screen tearing at 1k FPS (are they actually showing 1K or just offscreen + 60 on screen).
They have trouble identifying SLI on/off, GPU vendor, GPU speed, etc.
You can see that the first test is targeted towards mobile stuff. Very bland looking (did I mention 1K fps?). What would be nice is to also see the GPU utilization attached to the tests (if not already done).
The physics tests seem rushed. Even in the last combined test, the rocks in the background fall kinda funny. BTW, cloth simulation is way off. Also... was I using the GPU for physics?? Don't know, don't think so.
Looking over their "invalid config guide", it seems that if you have a dedicated PPU the test is invalid. So what if the frigging cellphone can't do complex stuff -- use all available hardware at your disposal and then worry about scoring???!!!! If you get better fps, then that is a good thing. Score it accordingly. FPS was a common denominator. Use it! Fidelity (physics & gfx) plus keeping the illusion of movement is what the end user is concerned with, not what hardware you use to get there. If the phone in your pocket can't do that... well, it's a phone; in another 3-5 years it will.
The last test/demo (actually just the combined part of the test) is what I was expecting, just that we should have had 3 of those, not just 1 + 2 with fps >200. I mean 5 tests where 3 are actually DX11 with all the bells and whatnot. I have 4GB of memory per card -- are you using that? I want the PCIe to fry from all the swapping.
Also, use the demos as a more lengthy benchmark !!!
These tests should represent the future of content fidelity; for now they waste their time with 10 (actually between 9 and 11) year old tech, because this is where the money is right now.
The first time I ran 3DMark (2006), my PC was almost leaning towards seconds per frame in some situations. Now I get almost 10fps with a frigging laptop in their most demanding test. NOT OK. I know you can do better, FM!!
Cheers!
Landspeeder - Wednesday, February 6, 2013 - link
I know XOTIC PC offers it on both the CPU and GPU... perhaps they can shed some light?
akhaddd - Tuesday, February 5, 2013 - link
So frickin awesome, I love me some computer stuff.
GTRagnarok - Tuesday, February 5, 2013 - link
Why is my Ice Storm score so high with my M17x R4 with a 3720QM and 7970M?
http://www.3dmark.com/3dm/18860
129064 overall
244590 graphics
48646 physics
JarredWalton - Tuesday, February 5, 2013 - link
Are you using Enduro or do you have it disabled? I had it enabled on the M17x, so that could be the reason. Also, you're using 12.10 drivers and I'm using 13.1, so maybe that makes a difference.
Krysto - Tuesday, February 5, 2013 - link
Disappointing that it won't support OpenGL ES 3.0 right from the start. Also, I wonder how "fair" the tests will be between DirectX and OpenGL devices. Will games that look and work identically on these 2 different APIs get the same score?
If not, then it will be a pretty useless "cross-platform" benchmark. But we'll see.
mindstorm - Wednesday, February 6, 2013 - link
Result for my Dell M6700 with a FirePro M6000 2GB RAM. Tested with 13.2 beta 4.
http://www.3dmark.com/3dm/63286?
Ice Storm
Overall: 92056
Graphics Score 128702
Physics Score 46107
Cloudgate
Overall: 12160
Graphics Score 15605
Physics Score 6860
Firestrike
Overall: 2241
Graphics Score 2353
Physics Score 9565
Combined Score 896
Better performance on the Ice Storm test than the other notebooks tested. Cloud Gate 2nd (of notebooks) and Fire Strike 3rd of notebooks.
All seem to be due to a low graphics score. I suppose this is due to the fact that the M6000 is optimized for business apps, although I wonder how much drivers could be an issue, since all the versions I have been trying so far haven't been very stable. 13.1 was a bit more stable than 13.2; overclocking is out of the question with 13.2 beta 4.
Notmyusualid - Wednesday, February 6, 2013 - link
M18x R2.
i7-3920XM, 16GB CAS10 RAM.
Left number is your M17x R4's results, middle is my M18x R2 on ONE 7970M card, right is both 7970Ms in CrossFire (reduced CPU multipliers):
Ice Storm:
85072 / 146723 / 157182
116561 / 273526 / 343640
43727 / 55947 / 54218
Cloud Gate:
16729 / 18157/ 24279
30624 / 32362 / 65783
6491 / 7159 / 7568
Fire Strike:
4332 / 4467 / 7647
4696 / 4807 / 9542
8910 / 10467 /10449
Also: COMBINED score: 2699
No GPU overclocking.
Operating System is Win 8 (Bloatware) x64, and can't reverse install, and too lazy to restore backup.
Graphics drivers are AMD_Catalyst_13.2_Beta5. (13.1 didn't install at all, 13.2 completed only once). No CAP file for this one yet I don't think.
Crossfire is working, but Intel's XTU informs me I was 'throttling', which is (somewhat) new to me. I game all day on multipliers of 4 cores @ x44, but this benchmark seems to push the system harder.
XTU reports 3% throttling over the benchmark period @ 4 cores x44.
I dropped to x44,43,42,41 on beta 5 to allow it to complete the benchmark.
Oddly, Super Pi completes two 32M runs back to back without error on x44,44,44,44. (and it does it a whole 30s slower than on Win 7). Ambient room temp is ~29C. (It's 35C outside).
Hope that is of interest to someone out there.
Spunjji - Thursday, February 7, 2013 - link
Definitely of interest. It's odd how the Graphics scores increase dramatically with Crossfire, as you'd expect, but somehow the Overall scores don't. Seems they've either calibrated the engine or the score generator such that it leans towards overall system performance more than the GPU. Time will tell whether that gives us a reasonable picture of the system's performance, but past experience suggests it won't.
Notmyusualid - Friday, February 8, 2013 - link
Yes, good point!
The title of the product implies it is for testing 3D performance, yet with significant increases in GPU performance, it is shocking to not see that increase in performance reflected in the overall scores.
I took another look at some other scores online, and it appears to me that the 'Physics Score' is really just a CPU test.
And I thought 3DMark06 was being deprecated for focusing on CPU performance too much (as well as for not being DX11 capable), rather than simply reflecting overall 3D performance, as we'd expect from such a title.
But then, I guess, some games are more CPU-dependent than others, so maybe it would be a mistake to leave out this sort of test for the average user looking to benchmark his system's overall gaming performance? I can't say for sure.
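As I recall, Futuremark's technical guide describes each overall score as a weighted harmonic mean of the sub-scores, which would explain the pattern above: a harmonic mean is pulled toward the weakest component, so a big jump in the Graphics score while the Physics (CPU) score stays put moves the total far less than you might expect. The sketch below uses the Ice Storm sub-scores posted earlier in this thread; the weights are guesses for illustration, not Futuremark's published values.

```python
def weighted_harmonic_mean(scores, weights):
    """overall = sum(w) / sum(w_i / s_i); dominated by the weakest sub-score."""
    return sum(weights) / sum(w / s for s, w in zip(scores, weights))

weights = (0.85, 0.15)  # (graphics, physics) -- illustrative guess, not the real weights

one_card = weighted_harmonic_mean((273526, 55947), weights)   # Ice Storm, single 7970M
crossfire = weighted_harmonic_mean((343640, 54218), weights)  # Ice Storm, CrossFire

print(round(one_card), round(crossfire))
# Graphics rises ~26% with CrossFire, but the unchanged Physics score dominates
# the harmonic mean, so the overall total moves only ~10%.
```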
Krysto - Saturday, February 9, 2013 - link
I'm really skeptical about 3DMark outputting scores that have 1:1 parity between DirectX machines and OpenGL ES 2.0 machines. If it doesn't, then it would be pretty useless for AnandTech's benchmarks, because you're trying to compare GPUs and hardware, not graphics APIs.
So if, say, Tegra 3 on the Nexus 7 gives a 1000 score and the Surface RT gives a 1500 score because the benchmark gives a higher score to certain DirectX features, then the benchmark is useless, because it was supposed to show the GPUs are equal, and it won't.
That's just a speculation for now, but Anand and the guys should really watch out for this.
Krysto - Saturday, February 9, 2013 - link
To be clear, if the drivers are better on one device than the other, then the benchmark SHOULD reflect that, and I'm sure it will. Also, it should reflect a higher score if the graphics look better on DirectX machines or anything like that (although that will probably come with a hit in battery life, but that's another issue).
What I'm saying is that if everything else is EQUAL, DirectX shouldn't get a higher score just because it's a more complex API than OpenGL ES 2.0. That wouldn't be on its merits.
Also, I'm very disappointed they are making such a big launch out of this and they aren't even going to support OpenGL ES 3.0 out of the gate, even though it will probably be almost a year before they release even their OpenGL ES 2.0 benchmark, counting from when OpenGL ES 3.0 was launched last year.
Clearly they didn't want to prioritize the OpenGL ES part much, even 2.0, let alone 3.0. We might not see 3.0 support until mid 2014 at the earliest from them. Hopefully GLBenchmark 3.0 will come out this year.
shuhan - Wednesday, February 13, 2013 - link
Does anyone know what might be the reason for this?
http://www.3dmark.com/3dm/201917
Thanks
shuhan - Wednesday, February 13, 2013 - link
Just saw that:"Why is my Ice Storm score so high with my M17x R4 with a 3720QM and 7970M?
http://www.3dmark.com/3dm/18860
129064 overall
244590 graphics
48646 physics"
My result is exactly the same! My rig: Intel Core i5-3570K @4.3, 2x GTX 670
And then my Cloud Gate test scores lower.
failquail - Sunday, February 17, 2013 - link
Certainly seems pretty :)
I have CPU and GPU monitoring gadgets running on my second screen, and I noticed that whilst the first two tests seemed fairly balanced for CPU/GPU, the third Fire Strike test maxed out the GPU usage the entire time with the CPU barely hitting 30% usage. A test for the SLI/CrossFire crowd, I think :)
Not sure if it's just a display bug, but it didn't detect my GPU driver at all (ATI 6950 modded with the 6970 shader count, but GPU/RAM clocks still a little lower than a default 6970) and it detected my FX-8120 CPU (set to 3.4GHz, or 4.2GHz turbo) as being 1.4GHz.
Still first go was this:
http://www.3dmark.com/3dm/241866
79420/14031/3423
I need to rerun it though, as I had lots of background stuff running and I think I still had AA forced in the GPU driver!