Original Link: https://www.anandtech.com/show/8738/benchmarked-assassins-creed-unity
Benchmarked - Assassin's Creed: Unity
by Jarred Walton on November 20, 2014 8:30 AM EST

Similar to the last game we looked at, Lords of the Fallen, Assassin's Creed: Unity has had a bit of a rocky start with bugs and other issues needing to be ironed out. It also happens to be a very demanding game to run – at maximum quality, it will basically chew up any GPU you throw at it and spit out crispy bits of silicon. And it's not just GPUs that get eaten, as CPU power can have a substantial impact as well. Finally, and this is not necessarily correlated with the other items in this list, Assassin's Creed: Unity (ACU) is an NVIDIA "The Way It's Meant To Be Played" title, and it's also one of the notable games for NVIDIA's GameWorks toolset – ACU includes support for HBAO+, TXAA, PCSS, Tessellation (coming in a future patch), and now MFAA (which we looked at yesterday).
There's an interesting corollary to the above items that's worth getting out of the way: reviews of Assassin's Creed: Unity have so far been rather lackluster, with an overall average Metacritic score currently sitting at 70%. That's not particularly good for a series that has otherwise had good reviews – e.g. the last game, Black Flag, has an average score of 84%. Perhaps more telling is that the current average user review at Metacritic is an abysmal 2.1. Looking at the comments and reviews makes it abundantly clear that ACU tends to run like a slug on a lot of systems.
I think part of the problem is the mistaken idea many gamers have that they should be able to max out most settings in games. Assassin's Creed has never been a particularly light series in terms of requirements, though at lower detail settings it was usually playable on a wide selection of hardware. With ACU, the requirements have basically shot up, especially for higher quality settings; at the same time, the rendering quality even at Low is still quite good, and Medium is enough that most users should be content with the way it looks. But if you want to run at High, Very High, or Ultra quality, you'd better be packing some serious GPU heat. The other part of the problem is that the game was likely pushed out the door for the Christmas shopping season before it was fully baked, but that seems to happen every year.
There's another element to the Assassin's Creed: Unity launch worth pointing out; this is a multi-platform release, coming out simultaneously on PC, PS4, and Xbox One. By dropping support for the PS3 and Xbox 360, Ubisoft has opened the doors to much higher quality settings, but the requirements may also be too high for a lot of PCs. With the new generation of consoles now sporting 8GB RAM, we've seen a large jump in resource requirements for textures in particular. I mentioned in the Lords of the Fallen article that GPUs with less than 4GB VRAM may need to opt for lower quality settings; with ACU (at least in the current state of patch 1.2), you can drop the "may" from that statement and just go in knowing full well that GPUs with 2GB RAM are going to struggle at times.
Test System and Benchmarks
With that introduction out of the way, let's just get straight to the benchmarks, and then I'll follow up with a discussion of image quality and other aspects at the end. As usual, the test system is what I personally use, which is a relatively high-end Haswell configuration. Most of the hardware was purchased at retail over the past year or so, and that means I don't have access to every GPU configuration available, but I did just get a second ZOTAC GTX 970 so I can at least finally provide some SLI numbers (which I'll add to the previous Benchmarked articles in the near future).
Gaming Benchmarks Test Systems

| Component | Details |
| --- | --- |
| CPU | Intel Core i7-4770K (4x 3.5-3.9GHz, 8MB L3); overclocked to 4.1GHz, or underclocked to 3.5GHz with two cores to simulate a "Core i3-4330" |
| Motherboard | Gigabyte G1.Sniper M5 Z87 |
| Memory | 2x8GB Corsair Vengeance Pro DDR3-1866 CL9 |
| Desktop GPUs | Sapphire Radeon R9 280, Sapphire Radeon R9 280X, Gigabyte Radeon R9 290X, EVGA GeForce GTX 770, EVGA GeForce GTX 780, Zotac GeForce GTX 970, Reference GeForce GTX 980 |
| Laptop GPUs | GeForce GTX 980M (MSI GT72 Dominator Pro), GeForce GTX 880M (MSI GT70 Dominator Pro), GeForce GTX 870M (MSI GS60 Ghost 3K Pro), GeForce GTX 860M (MSI GE60 Apache Pro) |
| Storage | Corsair Neutron GTX 480GB |
| Power Supply | Rosewill Capstone 1000M |
| Case | Corsair Obsidian 350D |
| Operating System | Windows 7 64-bit |
We're testing with NVIDIA's 344.65 drivers, which are "Game Ready" for Assassin's Creed: Unity. (I also ran a couple sanity checks with the latest 344.75 drivers and found no difference in performance.) On the AMD side, testing was done with the Catalyst 14.11.2 driver that was released to better support ACU. AMD also released a new beta driver for Far Cry 4 and Dragon Age: Inquisition (14.11.2B), but I have not had a chance to check performance with that yet. No mention is made of improvements for ACU with the driver, so it should be the same as the 14.11.2 driver we used.
One final note is that thanks to the unlocked nature of the i7-4770K and the Gigabyte motherboard BIOS, I'm able to at least mostly simulate lower-performance Haswell CPUs. I didn't run a full suite of tests with a second "virtual" CPU, but I did configure the i7-4770K to run like a Core i3-4330 (3.5GHz, 2C/4T) – the main difference being that the CPU still has 8MB of L3 cache where the i3-4330 only has 4MB. I tested just one GPU with the slower CPU configuration, the GeForce GTX 980, but this should be the best-case result for what you could get from a Core i3-4330.
Did I mention that Assassin's Creed: Unity is a beast to run? Yeah, OUCH! 4K gaming is basically out of the question on current hardware, and even QHD is too much at the default Ultra settings. Also notice how badly the GTX 770 does at Ultra settings, which appears to be due to its 2GB of VRAM; I logged system usage for the GTX 770 at QHD Ultra and found the game trying to allocate nearly 3GB of VRAM, which on a 2GB card means there's going to be a lot of texture thrashing. (4K with High quality also uses around 3GB of VRAM, if you're wondering.) The asterisk in the chart is there because I couldn't actually run our normal benchmark sequence, so I used a "Synchronize" from the top of a tower instead, which is typically slightly less demanding than our actual benchmark run.
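If you want to do similar monitoring on your own system, here's a minimal sketch. The article doesn't say which logging tool was used for the numbers above; this simply polls nvidia-smi (assuming it's on the PATH and the driver supports these query flags) and tracks the peak dedicated VRAM in use while the game runs.

```python
# Minimal sketch: poll dedicated VRAM usage via nvidia-smi once per second and
# track the peak value while a game is running. This is an illustrative
# approach, not the tool used for the figures in this article.
import subprocess
import time

def vram_used_mib():
    # memory.used reports dedicated GPU memory in use, in MiB (first GPU only here)
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True)
    return int(out.splitlines()[0].strip())

if __name__ == "__main__":
    peak = 0
    try:
        while True:
            used = vram_used_mib()
            peak = max(peak, used)
            print(f"VRAM used: {used} MiB (peak: {peak} MiB)")
            time.sleep(1.0)
    except KeyboardInterrupt:
        print(f"Peak VRAM usage observed: {peak} MiB")
```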
Anyway, all of the single GPUs are basically unplayable at QHD Ultra settings, and a big part of that looks to be the higher resolution textures. Dropping the texture quality to High can help, but really the game needs a ton of GPU horsepower to make QHD playable. GTX 970 SLI basically gets there, though again I'd suggest dropping the texture quality to High in order to keep minimum frame rates closer to 30. Even at 1080p, I'd suggest avoiding the Ultra setting – or at least Ultra texture quality – as there's just a lot of stutter. Sadly, the GTX 980M and 880M both have 8GB GDDR5, but their performance with Ultra settings is too low to really be viable, though they do show a bit better minimums relative to the other GPUs.
As we continue down the charts, NVIDIA's GTX 780 and 970 (and faster) cards finally reach the point where performance is totally acceptable at 1080p High (and you can tweak a few settings like turning on HBAO+ and Soft Shadows without too much trouble). What's scary is that looking at the minimum frame rates along with the average FPS, the vast majority of GPUs are still struggling at 1080p High, and it's really only 1080p Medium where most midrange and above GPUs reach the point of playability.
There's a secondary aspect to the charts that you've probably noticed as well. Sadly, AMD's GPUs really don't do well right now with Assassin's Creed: Unity. Some of it is almost certainly drivers, and some of it may be due to the way things like GameWorks come into play. Whatever the cause, ACU is not going to be a great experience on any of the Radeon GPUs right now.
I did some testing with CrossFire R9 290X as well, and while it didn't fail to run, performance was no better than a single 290X – and minimum frame rates were down – so CrossFire (without any attempt to create a custom profile) isn't viable yet. Also note that while SLI "works", there are rendering issues at times. Entering/exiting the menu/map, or basically any time there's a full-screen post-processing filter, you get severe flicker (a good example is when you jump off a tower into a hay cart: you'll notice flicker in the periphery as well as on Arno's clothing). I believe these issues happen on all multi-GPU rigs, so it might be more of a game issue than a driver issue.
I even went all the way down to 1600x900 Medium to see if that would help any of AMD's GPUs; average frame rates on the R9 290X basically top out at 48FPS with minimums still at 25 or so. I did similar testing on NVIDIA and found that with the overclocked i7-4770K ACU maxes out at just over 75 FPS with minimums of 50+ FPS. We'll have to see if AMD and/or Ubisoft Montreal can get things working better on Radeon GPUs, but for now it's pretty rough. That's not to say the game is unplayable on an R9 290X, as you can certainly run 1080p High, but there are going to be occasional stutters. Anything less than the R9 290/290X and you'll basically want to use Low or Medium quality (with some tweaking).
Finally, I mentioned how 2GB GPUs are really going to have problems, especially at higher texture quality settings. The GeForce GTX 770 is a prime example of this; even at 1080p High, minimum frame rates are consistently dropping into the low teens and occasionally even single digits, and Medium quality still has very poor minimum frame rates. Interestingly, at 1600x900 Medium the minimum FPS basically triples compared to 1080p Medium, so if the game is using more than 2GB VRAM at 1080p Medium it's not by much. This also affects the GTX 860M (1366x768 Low is pretty much what you need to run on that GPU), and the 1GB R7 250X can't even handle that. And it probably goes without saying, but Intel's HD 4600 completely chokes with ACU – 3-7 FPS at 1366x768 is all it can manage.
What About the CPU?
I mentioned earlier that I also underclocked the Core i7-4770K and disabled a couple CPU cores to simulate a Core i3-4330. It's not a fully accurate simulation, but just by way of reference the multi-threaded Cinebench 11.5 score went from 8.08 down to 3.73, which looks about right give or take a few percent. I only tested the GTX 980 with the slower CPU, but this is basically the "best case" for what a Core i3 could do.
Looking at the above 1080p charts, you can see that with the slower CPU the GTX 980 takes quite the hit to performance. In fact, the GTX 980 with a "Core i3" Haswell CPU starts looking an awful lot like the R9 290X: it's playable in a pinch, but the minimum frame rates will definitely create some choppiness at times. I don't have an AMD rig handy to do any testing, unfortunately, but I'd be surprised if the APUs are much faster than the Core i3.
In short, not only do you need a fast GPU, but you also need a fast CPU. And the "just get a $300 console" argument doesn't really work either, as frame rates on the consoles aren't particularly stellar from what I've read. At least one site has found that both the PS4 and Xbox One fail to maintain a consistent 30FPS or higher frame rate.
Image Quality and Settings
In retrospect, I probably should have just skipped the Ultra quality setting and opted for some form of custom settings. The texture data just overwhelms most GPUs at Ultra, and even High still struggles in many cases. Even more problematic is that there are only three texturing options: Low, High, and Ultra.
I also want to point you to NVIDIA's Assassin's Creed: Unity Graphics and Performance Guide; if you want a better look at what the various graphics options really mean in terms of quality, that article has everything you need to know. One item particularly worth noting is that NVIDIA recommends Low textures for 2GB cards and High for 3GB cards, while Ultra is for 4GB cards (or maybe 6GB/8GB cards).
Anyway, here's a quick look at what the various presets do for quality. Let me start with a table showing what specific settings are applied for each of the presets. Again, the NVIDIA page linked above has a good explanation for what each of the settings does, and more importantly it has image sliders to let you do A/B comparisons for each setting. (Disregard their AA images, though, as it looks like they used 2560x1440 and shrunk them to 1080p – oops.)
Assassin's Creed: Unity Image Quality Presets

| Setting | Low | Medium | High | Very High | Ultra |
| --- | --- | --- | --- | --- | --- |
| Environmental Quality | Low | Medium | High | Very High | Ultra |
| Texture Quality | Low | High | High | Ultra | Ultra |
| Shadow Quality | Low | Low | High | High | Soft (PCSS) |
| Ambient Occlusion | Off | SSAO | SSAO | HBAO+ | HBAO+ |
| Anti-Aliasing | Off | FXAA | 2xMSAA | 2xMSAA | 4xMSAA |
| Bloom | Off | On | On | On | On |
The main thing to note is that there's a rather noticeable difference between Low and High texture quality, but not so much from High to Ultra. Environmental quality has a generally minor effect on the appearance of the game, especially above Medium (though a few areas are exceptions to this statement). The difference between Low and High shadows is also quite small, but the Soft Shadows setting implements PCSS (Percentage Closer Soft Shadows), which looks quite nice while also causing a moderate performance hit.
Anti-aliasing has a ton of settings, but the most useful are generally the MSAA options; those are also the most demanding. FXAA is, as usual, nearly "free" to enable and helps remove jaggies, though it softens some other image details in the process; for many it might be the best solution. TXAA performance is pretty similar to 4xMSAA I think, which means it's mostly for high-end rigs. Bloom is pretty much always on except at the lowest setting. Finally, ambient occlusion has two options besides off: SSAO and HBAO+. NVIDIA developed HBAO+ as an improved form of ambient occlusion, and in general I think they've succeeded. It's also supposed to be faster than SSAO, at least on NVIDIA GPUs, so if you have NVIDIA hardware you'll probably want to enable it.
Looking at the presets, the difference between Ultra and Very High is visible in the right areas (e.g. places with shadows), but they're overall pretty similar. There's a more noticeable drop from Very High to High, mostly due to the change in textures, and at least for our test images the Medium and High settings look almost the same.
There are a few last items to note on benchmarking, just by way of reference. First, Assassin's Creed: Unity uses "dynamic" day/night cycles. They're not really dynamic, but Ubisoft has four preset times: morning, noon, dusk, and night. This matters because benchmarking the same sequence at different times of day can produce quite different results. There's also "dynamic" weather (or at least clouds) that can throw things off. Second, if you change certain image quality settings – specifically Texture Quality – you have to restart the game for the changes to take effect. Last, the game has dynamic crowds, which means the runs aren't fully deterministic, but in repeat testing the variance is generally less than 3% and closer to 1%.
The good news is that whenever you load up, the game always starts at the morning time slot, so to keep conditions consistent you basically just have to exit and reload between every settings change. Yes, it's quite tedious if you're benchmarking a dozen or so GPUs….
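To put a number on the run-to-run consistency mentioned above, here's a quick sketch of how the sub-3% variance figure can be computed from repeated runs. The FPS values in the example are hypothetical placeholders rather than measured results.

```python
# Minimal sketch: express run-to-run variation as the spread between the best
# and worst run, relative to the mean. The values below are hypothetical.

def run_variance_pct(avg_fps_per_run):
    """Spread between best and worst run, as a percentage of the mean."""
    mean = sum(avg_fps_per_run) / len(avg_fps_per_run)
    return (max(avg_fps_per_run) - min(avg_fps_per_run)) / mean * 100

runs = [44.1, 43.7, 44.3]  # hypothetical average FPS from three repeat runs
print(f"Run-to-run variance: {run_variance_pct(runs):.1f}%")  # ~1.4% in this example
```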
Closing Thoughts
There are two things you can count on with the fall gaming season: lots of games, and the occasional botched launch as publishers rush to release new titles in time for the peak of the holiday shopping spree. Ubisoft has three major games launching right now: Assassin's Creed: Unity came out last week, Far Cry 4 just released on Tuesday, and The Crew launches next week. Obviously, they don't want to launch all three on the same day, but more than one person has come to the conclusion that ACU should have been delayed a few weeks to get all the bugs worked out.
So far there has been a Day 0 patch, then the current patch 1.2, and at least two more patches are planned, I believe. The next should provide further bug fixes (and perhaps performance optimizations), while a later patch will also add tessellation support to the game. It's probably a good idea to get performance "fixed" as much as possible before adding tessellation, as it could further reduce already low frame rates on a lot of systems.
My own experience with Assassin's Creed: Unity has thankfully been mostly uneventful. There was talk of missing textures and "faceless" people, but that's apparently only on unpatched versions – the Day 0 patch addressed that bug, and at least in my case I never saw it. Stability hasn't been perfect, but the second patch did a lot to address crashes for me – I've played for a few hours at a stretch several times without crashing, though it seems crashes are still possible after a while.
By far the biggest concern however is performance. I'd say if you can average about 40FPS (with minimums in the mid-20s or above), Assassin's Creed: Unity is playable. The problem is that to get such frame rates, you basically need to go with Low settings on quite a few "midrange" GPUs, and even beefy GPUs like the GTX 980 aren't going to be happy with all settings maxed out at resolutions beyond 1080p. If you have the hardware, ACU is a great looking game and a good addition to the Assassin's Creed series. But for those running older GPUs – or AMD GPUs – you probably want to wait at least another month to see what happens before buying the game.
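As a rough illustration of that rule of thumb, here's a short sketch that applies the ~40FPS average / mid-20s minimum check to a frame time log. The log format (one frame time in milliseconds per line, in a file named frametimes.txt) is purely an assumption for the example; the article doesn't specify a capture tool or format.

```python
# Minimal sketch: apply the "average ~40FPS, minimums in the mid-20s" rule of
# thumb to a per-frame timing log. File name and format are assumptions.

def playable(frame_times_ms, avg_target=40.0, min_target=25.0):
    avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    # Worst single frame converted to FPS; published minimums are often
    # averaged over one-second windows, so this is a stricter check.
    min_fps = 1000.0 / max(frame_times_ms)
    return avg_fps >= avg_target and min_fps >= min_target

with open("frametimes.txt") as f:  # hypothetical log: one frame time (ms) per line
    times = [float(line) for line in f if line.strip()]
print("Playable by this rule of thumb:", playable(times))
```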
And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season.