As noted in the text, I only ran the i3-4330 simulation with one GPU, and furthermore I only ran it at 1080p (Ultra/High/Medium). Basically it couldn't do more than that so I left off further testing.
How was an i3 doing so badly? This game is basically the same engine as Black Flag except not optimized at all. And the i3 always performs almost identically in games vs the i5 and i7. Are you sure you did not fake that?
I'm not sure why any of these frame rates are considered playable. Unless you have a G-Sync monitor, anything less than a 60 fps minimum frame rate is going to be awful.
"Playable" is not the same as "ideal". I've logged plenty of hours over the years playing games at well under 60 FPS. 30FPS is usually the point where things get "smooth enough" to play well. 40+ is definitely sufficient. G-SYNC is merely icing on the cake if you have it.
Jarred, testing must be done at settings which are playable. Why are you testing QHD with Ultra and 4K with High settings where not even a GTX 980 is playable? You did not even bother to show what setting is playable at 1440p/4K on a GTX 980. My guess is that High at 1440p and Medium or Low at 4K would have been playable on a GTX 980. GameWorks features like PCSS are killing fps on all cards. AMD definitely needs to improve performance in AC Unity.
There's been a rather intense furore following AC:U's launch. Essentially, Ubisoft have blamed AMD for poor performance, and then partially retracted said statement. Considering how closely Ubisoft works with NVIDIA, it would sound like they've only developed for GeForce cards... but some NVIDIA users are having issues as well. What's more, Far Cry 4 is also causing issues with AMD hardware. Both were developed by the same software house.
All in all, it looks more likely that Ubisoft Montreal's internal testing and QA is not up to scratch. You can't simply blame one vendor's CPUs and GPUs for poor performance when you've not bothered to optimise your code for anybody barring NVIDIA. I've even heard that playing offline results in a HUGE performance boost across the board...
It's amazing how it's never AMD's fault, no matter what. No matter how poorly they do. No matter how many features they do not have, or only have as a ridiculed braggart's vaporware. No matter how long it takes them to continue failing and not delivering, it's not their fault. "Never AMD's Fault" should be their slogan, since "Never Settle" is the opposite of the truth with AMD.
While I would agree with you that AMD has been relegated to an ultra-low-budget, inconsequential player on the CPU front, with respect to GPUs I am not certain where you have been living for the last couple of years, whether on Mars or under some rock.
Since the HD 4000 series, AMD has been running neck-and-neck with nVidia, sometimes kicking it in the rearside soundly (e.g. the Radeon 5870 vs. the rebadged GeForce 8800), sometimes being a bit behind, until the Maxwell 980 and 970 parts came out a couple of months ago. But even now, the fastest AMD offering is still at least on par with the second-fastest nVidia offering performance-wise (the issue is rather power consumption). And drivers-wise, there are lots of games coming out with very good graphical fidelity that have no issues on AMD cards.
Who failed here big time are Ubisoft's managers, who (probably wishing to please the shareholders) wanted to rush the games out before the December holiday season to get extra bucks, and allowed proper QA to be skipped. There is absolutely no excuse whatsoever for neglecting GPUs that still make up a third of the market (and mind you, nVidia performance is reportedly far from perfect as well). If the AMD cards did not work, they either should not have released the game at all, or released it nVidia-only/AMD beta-only.
I do hope this backfires on Ubisoft in such a way that, instead of buying these games now, people buy them a year later, in the 2015 Steam sale season.
Imagine what the nvidia hardware could do with the same power budget. And it isn't just power, but also temps and noise. How come AMD default coolers are the worst in the market yet the nvidia default coolers, esp. for the higher-end models are some of the best? How come it took AMD more than a decade to address the multi-gpu micro-stutter issue in the drivers? And how about the alleged speed boost in CPU performance that AMD promised with Win 8, that never quite took off?
AMD hires from the same talent pool as their competition, but ultimately, it is their consistent corner-cutting and false promises that hurt their business and relegate them to a lower tier.
I apologise if I offended any AMD fans, but please understand this: you aren't on AMD's side and I'm not on nvidia/intel's side... it is us, the consumers, who are all on the same side, and unless we demand the quality that we are paying for, every now and then someone will try to get away with BSing us out of our hard-earned cash.
You are kidding right? I have been fortunate enough to essentially own every top-end GPU since the days of 3DFX Voodoo (and before!). AMD has certainly released some absolute monster cards and has been responsible for keeping Nvidia in check since all other competition ceased to exist. Both companies have had their fair share of the performance crown.
Currently I own 2x 290X and have since their launch - I run every single game (aside from the topic of this one) at Ultra settings with no issues (both watercooled, so nice and silent too). Ubisoft is just plain rubbish these days; heck, look at the state of their cruddy GTA wannabe Watch Dogs. That game had issues on any PC. Tell me how Black Flag can run flawlessly and then this game just runs like absolute crud? Sure, a 980 should be in front, but the 780 Ti/290X shouldn't be that far behind.
I will freely admit that Nvidia usually do have more solid drivers in early releases, but nothing that has been a deal breaker. Having run SLI and CF since the early days I can tell you that both have their share of issues... Anyway, all I can say is you had better hope that AMD keeps on the heels of Nvidia or you will be paying $700 for the same GPU for three generations.
CrossfireX was only introduced in September 2005. Granted, the time from then to a viable fix was about 8 years (which is still a very long time), but there are two factors to consider here - how long has it been a problem, and how long has it taken AMD to acknowledge it themselves? The discrepancy between the two is what they really need to be judged upon, not how long it took for a solution to reach the user base. Promising fixes is one thing; burying your head in the sand and flat out ignoring your customers is quite another.
FlushedBubblyJock mentioned it never being AMD's fault for this, that and the other. You'd have to be blinkered to subscribe to that particular theory. AMD's biggest problem is delivery - VCE support was a joke for a long time; some might say their DirectX drivers are in need of serious work; TrueAudio doesn't appear to be having any impact... to name a few. Team Green is in the ascendancy right now, and AMD can't release a counter quickly enough, so they look to have no answer for Maxwell. It's almost all down to delivery, and we can only hope they improve in this area. It's not about being a fanboy, but bringing some objectivity to the discussion.
Yes, right. But my point was mainly that graphical glitches and poor performance in ONE PARTICULAR GAME, sponsored by AMD's competitor, should be blamed on Ubisoft Q&A and them rushing to get the game out for x-mas, rather than on AMD.
AMD do disappoint me though. Case example: when LCDs came out, I thought - great, now we will be able to get variable refresh rates. But lo and behold, 10 years pass and nothing, until nVidia comes along with G-Sync. And then we learn AMD had already done it, they had it RIGHT IN FRONT OF THEIR EYES, and they did not see the benefits, but instead tried to sell it as some crappy energy-saving thingy. *facepalm* It is clear their development lacks some people who would focus on improving *game experience*.
(btw, from my last 6 gfx cards, 3 were nVidia, 3 AMD/ATI)
"CrossfireX was only introduced in September 2005..."
I'm sorry, -almost- a decade then. Because it is really inconsequential how long a particular phase takes in the entire process of solving a problem - what matters is how long it took the end-users to get their money's worth.
Secondly, the defence that they just didn't know any better, while the competition apparently did - to the point that the competition had to develop a tool (FCAT) for AMD to actually see (or recognise) the problem - merely suggests that if they weren't being deliberately callous, they were just outright incompetent. Either way, the point is that they need to step up their game, because their customers and fans deserve better than what they have been bringing forth, and because the free market needs good competition.
Understood - definite incompetence and on a grand scale, too, considering somebody with multiple cards has put x times the money into the vendor than somebody who would purchase just the one. I would find it hard to believe that they were unaware from their own internal testing. There's the possibility that whoever presides over this was given their marching orders and AMD set about fixing the damage, but I guess we'll never know.
It's AMD's responsibility to work with game devs to make certain their cards work properly. Of course, AMD has been notorious for not doing that for many, many years, and then of course it's nVidia's fault. AMD might do well to say: "We take full responsibility." That would mean, of course, having the Catalyst makers do more than emailing and whining - like showing up at game dev studios, taking an active hand, and having game-day drivers ready. Of course, if they did that, what would their unbelievably incompetent, blame-misplacing fans have to do? I mean seriously, it's as bad as the worst politicians we've ever seen pointing fingers in every direction but their own.
1440p High is probably playable on a single GTX 980 -- I just ran GTX 970 on that and got results of 30.4/23.6 Avg/Min, which is about 40% faster (44% to be precise) on average FPS and 65% faster on minimum FPS. If 980 sees the same scaling, it will be around 35/26 FPS at 1440p High. There's not a huge difference in performance or quality between the High and Medium presets, which means you really would need to drop to Low (or close to it) for 4K gaming.
Why did I test these settings? Because you have to choose something, and we generally go with "Ultra" at 1440p -- just to see how the GPUs fare. I've tested 4K at Ultra in the past, but that was completely unplayable across the board so I dropped to High for this game. If I had dropped 1440p to High, I'm sure I'd get people wanting to see Ultra numbers -- you can't please everyone.
Anyway, as someone that has a 4K display, I can tell you I'd rather play at 1440p or even 1080p with better graphics quality (High, Very High, or Ultra) than run at native 4K with the Low/Medium settings. YMMV.
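For anyone who wants to redo that kind of projection themselves, here is a minimal sketch of the arithmetic, assuming the GTX 980 scales from Ultra to High by the same ratios measured on the GTX 970. The 980 Ultra baseline numbers are placeholders picked only to reproduce the "around 35/26 FPS" estimate, so treat the output as an illustration of the method rather than benchmark data.

    #include <cstdio>

    int main() {
        // Ultra-to-High scaling factors quoted for the GTX 970 at 1440p:
        // +44% on average FPS, +65% on minimum FPS.
        const double avgScale = 1.44, minScale = 1.65;

        // Assumed GTX 980 results at 1440p Ultra -- placeholder values, not real data.
        const double ultraAvg = 24.5, ultraMin = 16.0;

        // Project 1440p High on the 980 by applying the 970's scaling factors.
        printf("Projected GTX 980 at 1440p High: %.1f avg / %.1f min FPS\n",
               ultraAvg * avgScale, ultraMin * minScale);
        return 0;
    }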
I disagree. I find 30 perfectly playable. That's the effective frame rate of television. Movies are 24, and nobody has issues with them not being "smooth enough." Heck, people almost got out pitchforks when someone dared film a movie at 48 fps.
I mean yes, for gaming 60 fps is preferable and looks and feels better, but to call anything under that "awful" is going a little far. Especially when the game in question is not a twitch shooter. Action/adventure games like Assassin's Creed are perfectly enjoyable at 30 fps.
Well you know, this is the internet... comments must be exaggerated for effect. Either something is the greatest of all time or it's awful, never any middle ground. Anyway, I have a GTX 980 and a 5820K @ 4.0GHz, and I would say that my experience with "playability" in this game doesn't really mirror the benchmarks at 2560x1440/Ultra. Perhaps there are more taxing areas of the game that I haven't seen yet, but I'm not seeing frames dropping into the teens. I feel the controls hurt the playability of the game more than anything, as they just seem clunky.
Exactly my remarks: 3770K @ 4.8 and an EVGA 980 ACX OC'd to 1550... and at 1440/Ultra it is completely playable. I'm about 4 hours in and am completely satisfied with the results. Would I love to stay above 60fps at all times? Yes. Am I satisfied? Yup!
There is a big difference between passively watching a 24fps film and interacting with a 24fps video game. I'm far from a pedant on these things, yet I find anything under 45-50 fps distractingly unplayable.
You forget the abysmal controllers console gamers have to use. Using a controller and having low FPS is much different to using a mouse and having low FPS.
There is nothing 'abysmal' about it.. man it bothers me people are latching onto that word now and using it like candy. The 360 and Xbone controllers are wonderful controllers, you just don't have the skill to use them it sounds like. Good gamers can use all input types. After a 6 month break I hopped into BF4 the other day and still had a 2:1 KDR on my first match never even played the map so yes I'm good with KB/M, but I can also pickup my PS4 or Xbox 360 controller and crush it in games that play better with controllers.
Yeah, like yours. 30fps is visually smooth, but the issue is with input. A controller is less sensitive to input due to the large disconnect from the screen. It's the same concept that makes touch screens feel unresponsive even at high smooth framerates.
Halo: Combat Evolved was 30fps locked on Xbox and is considered by many to be a great game. While I PREFER to change settings in games to get the frame rate to match my monitor (144Hz), I still enjoy games that play at low frame rates. I can't tell you how many hours I put into Company of Heroes on my crappy laptop... that thing barely cracked 30fps when nothing was happening.
I forgot to add that my desire for smooth or higher framerates also varies greatly by game. RTS games can get away with 20-30fps as long as the jerkiness doesn't interfere with my ability to select units. For action games, I prefer 60fps+ and for shooters or other fast-paced games, I want all 144.
In one old game (SD Gundam Online, a random Korean Gundam online game), I played at 40 FPS for a while and it was a playable experience. Then I upgraded my rig and was able to push a full 60 FPS without any frame drops, and I ran at a locked 60 FPS for a while. Then I had a technical problem with the upgraded rig, so I went back to the old computer and played at 40 FPS. There was a MASSIVE difference after downgrading. The experience itself was horrible.
But it really depends on the game and the person - again, it depends on how you adjust to 30 FPS. The game I mentioned was really sensitive; every move needed to be quick and responsive.
AC: Unity? I can agree that this kind of game is okay with somewhat lower FPS, though as I said it can be fairly bad if you have been used to 60 fps for a while. A downgrade is far more noticeable than an upgrade, so it is more about how people adjust to it. If you drop straight from a fully LOCKED 60 FPS (after using it for several months) to 30 FPS, you might well call it a horrible experience.
By the way, my problem playing AC Unity was occasional freezing (frame drops below 10 or 15 FPS) for about 1 second at a time. Worse, after a frame drop my mouse cursor pops to a random place. I was playing with GTX 670 FTW SLI, a 3930K OC, and 32GB RAM.
This Unity frame drop issue was the most terrible one. At first there was a crash issue on top of the frame drops, but Patch 2 fixed many of the crash problems. However, the frame drop issue still persists. I was using the Low option for all graphics settings at 1600x900 resolution, which normally gives 60 FPS (70-80 FPS if I turn vertical sync off), but there was still this freezing issue, and it intensifies when I turn up graphics options or the resolution. It forces me to use 1600x900 with low graphics options, no AA or other random graphics sauce on it. Typically 60-80 FPS, but it was still horrible.
Easy answer: because that's just your opinion. I don't need more than 30 fps, most people can't tell the difference after that. I've played for years with 25 fps and it's fine. 23 fps is where it gets noticeable.
"a lot of people might want a GPU upgrade this holiday season". What about people who already have a 780 or 290/x coupled with a 120Hz FHD or 60Hz QHD monitors? Isn't it more reasonable to say: Don't buy this game for now, wait a few months. I don't believe that ACU is "tougher" to run than Metro or Crysis.
Isn't that what I said in the conclusion as well? "For those running older GPUs – or AMD GPUs – you probably want to wait at least another month to see what happens before buying the game."
If you have a 780 or above, the game runs fine -- just not with Ultra textures. Go for 1080p High and don't worry about it. (You might be able to reach for 1440p High, but honestly it's going to be tough on any single GPU to run that setting as it's almost twice as many pixels to render as 1080p.) If you have AMD, yeah, either the game needs some patching or AMD's drivers need tweaking -- or both.
Hear hear. Ubisoft's motto needs to become: "We shall sell no wine before its time."
I have a sneaking suspicion that nVidia's money is behind lack of AMD optimization. Ubisoft MUST make the game work on the AMD-powered consoles, so I hardly believe they didn't know how to make it work with Radeon graphics cards on the PC side. More like nVidia paid them off not to use Mantle and make it work well with AMD cards.
More like it's cutting edge stuff that brings the best to its knees, so nVidia has to spend their millions since AMD is broken and broke and helpless, so AMD whines and moans and tells PR lies, and then many moons later a few of us find out AMD refused to cooperate because they act like their most childish fans instead of acting professionally and in their own best interest.
Flushed, your posts make absolutely no sense when AMD's GCN cards run very well in modern games such as COD:AW, Civilization BE, Evil Within, Ryse: Son of Rome, Dragon Age Inquisition, and especially in another Ubisoft title: FC4.
Besides Unity and some issues in Lords of the Fallen, it is actually NV cards, specifically the Kepler architecture, that have not been pulling their weight in the last 6 months. Not to mention that AMD has the entire sub-$330 desktop GPU market locked up, winning in performance at every price level.
Focusing on the broken Unity game as evidence that AMD has issues in performance misses the other 95% of games released in the last 6 months where it is NV that's having issues. Good one.
Have you looked at what people are saying about the console versions, though? They're not exactly shining pillars of smooth frame rates. And I seriously doubt NVIDIA paid Ubisoft to not optimize for anything other than NVIDIA hardware; it's more likely a case of "AMD didn't come by and offer us free GPUs and software developer support."
It's in Ubisoft's best interest to make the best game they can that will run on the widest selection of hardware possible. Many of these games have budgets in the tens of millions, so intentionally killing performance (and sales) on a big chunk of the market would be crazy. Then again, the world is full of crazy.... :-)
Yes Anubis, of course you have a sneaking suspicion, Nvidia obviously paid AMD's driver team to write bad drivers and not bother optimizing for AC: Unity to show Ubisoft how much bad publicity they could garner for not teaming up with a vendor that is becoming less relevant by the day.
I guess you could say the difference is that the console makers actually write their own drivers for their APUs and AMD has nothing to do with it at this point. They gave them the keys and blueprints and vacated the premises, which is probably a good thing for console owners. If you bought a console, would you honestly want to rely on AMD driver updates for it? D:
AMD needs a driver update, plain and simple. The poor XFire scaling results should be enough to make this clear, which I know you are already aware of.
I spent a good 10 seconds admiring the detail they put into every strand of hair on the girl in the first pic before I realized the poor fellow on the right didn't have a face.
And as I specifically mentioned in the text: the missing faces/textures was apparently patched on Day 0; I personally never saw the problem. I wonder if all the hubbub over the faceless people might have something to do with a bad crack -- wouldn't that be fun? Anyway, it's 2014 and the game uses UPlay so unless I'm missing something, you have to be connected to the Internet to play and the only people not updating with the patch... well, you fill in the blank.
Yup. Poor performance on PC is a good indicator of a rushed PC port, but poor performance on consoles (also true for this game) sounds like the whole project was a mess. Those are fixed hardware configurations that they've known about for a long time.
When MSAA/TXAA/MFAA (the latter two are based on the former) is dropped, you are good to go on, for example, Ultra settings with a 3GB VRAM card at up to 1920x1200 (with FXAA). At these settings I get around 50 FPS on a GTX 780 Ti OC; here's my video: https://www.youtube.com/watch?v=fGdXJN-5YXw But this game sure can kill any card out there.
I love idiots who think any game under 60FPS is not playable. I imagine they have Fraps running in the corner of their screen and have a total hissy fit if any game dares to dip below 60FPS on their ego trip of a PC. I know Nvidia/AMD stock holders love them dearly.
Looks like it's basically CPU limited. The difference between Ultra and Medium is only a few fps for something like a 970 at 1080p. Would be interesting to try it with a 6- or 8-core Intel processor and see how it scales with more cores.
On which setup are you seeing "only a few FPS"? 1080p Medium is 22% faster with 970 SLI on average FPS and 31% faster on minimums, and a single 970 is 49% faster average and minimum on Medium vs. Ultra. That's far more than a few FPS.
The gap between Medium and High is much smaller, but then they both use High Quality textures and honestly they look very similar. There the performance is only about 10-30% faster (depending on GPU), though minimums still favor cards with more than 2GB VRAM by a large amount.
Well, I'd expect a bigger performance difference between Medium and Ultra. Looking at the CPUs, the 4-core pretty well doubles the 2-core's minimum frame rates; that shows the CPU is having a much bigger impact. If that's the case, what would 6 or 8 cores do?
If you have the time I would like you to test further with even lower resolutions. There's not much point knowing GPU x can do 18 fps @ 1080p, since it's much easier to adapt to a lower resolution than to a lower frame rate. Maybe you could use the slowest of the bunch and try out 1600x900 and 1280x720 as well? If the system is still up and running I guess it wouldn't take more than a few hours.
I did run 768p Low on most of the GPUs... I don't want to make a graph because really, desktop users don't want to run games below 1080p IMO. But if you're wondering about the laptops and lower end hardware...
The CPU plays a big role in Assassin's Creed Unity, so the GTX 980M comparison against the desktop GPUs is skewed. The desktop GPUs are paired with 84W+ CPUs while the GTX 980M is paired with a 47W soldered, lower-clocked CPU.
I expect the GTX 980M to be closer to the GTX 780 if they ran at the same clocks. Something that would be interesting to see from AnandTech: a review of the GTX 980M against desktop GPUs where both have roughly the same CPU power. http://gamegpu.ru/images/remote/http--www.gamegpu....
The i3-4330 numbers are there for a look at where the CPU bottleneck would lie on lower end CPUs. I would guess that the mobile quad-core CPUs like the i7-4710HQ are generally keeping the GPU "filled" with work. 980M might be a bit faster with a higher clocked CPU, but I don't think it would come anywhere near the 780 or 970.
I've got some numbers and basically across a large selection of games the 780 (with a desktop CPU compared to a mobile CPU) is around 25% faster than the 980M (and the 780 and 970 are basically tied in overall rankings -- like literally within 0.1% of each other).
We need those big, wide nvidia cards to come back - a 512-bit bus or even a 1024-bit bus. My GTX 980 only chokes when I try to enable any form of AA in FC4 and AC: Unity. As long as AA is set to None or 2x MSAA, the games run at 60FPS.
The game is a hardware thrasher from the numbers. I can understand seeing PC titles playing with low frame rates, but there's a problem if the consoles can't get the game over 30 FPS. That is a design failure since you can't upgrade consoles.
PCSS kills performance. I'm running the game with everything on Ultra except PCSS set to High, with FXAA, at WQHD (2560x1440), and getting 55 FPS average with a GTX 970 (1525/8GHz); minimum FPS is around 40.
Switching to 2xMSAA with MFAA enabled gets me around 45 FPS average and 30 minimum, so I wonder how 970 SLI in your benches couldn't sustain 60 FPS at WQHD?!
Ultra is 4xMSAA with PCSS. You had a 10FPS drop just enabling 2xMSAA, and 4xMSAA would take another 10 or so FPS off, with PCSS accounting for an additional 10 (give or take).
The main reason for the low performance is the use of MSAA. MSAA in this engine has a massive performance hit as the engine uses deferred rendering. Running the game on Ultra settings with FXAA instead of MSAA would net you over 10 FPS easily.
Umm... MSAA on many games tends to exact a fairly decent performance hit, and the more complex the game the bigger the hit. FXAA is basically a 3% hit (vs. no AA) by comparison so yes it would be much faster.
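To put a rough number on why MSAA hurts so much in a deferred renderer, here is a small sketch estimating G-buffer memory at 1080p as the sample count goes up. The layout (four render targets at 8 bytes per pixel plus a 4-byte depth buffer) is an assumption for illustration, not Unity's actual G-buffer, but the scaling with sample count is the point; FXAA avoids all of this by running as a post-process on the single-sample image.

    #include <cstdio>

    int main() {
        const long long width = 1920, height = 1080;
        // Assumed deferred G-buffer: four render targets at 8 bytes per pixel
        // (e.g. RGBA16F) plus a 4-byte depth buffer. Illustrative layout only.
        const long long renderTargets = 4, bytesPerPixel = 8, depthBytes = 4;

        // With MSAA, every G-buffer attachment stores one value per sample, so
        // both the memory and the bandwidth to fill and resolve it scale with
        // the sample count.
        for (long long samples : {1LL, 2LL, 4LL}) {
            long long bytes = width * height * samples *
                              (renderTargets * bytesPerPixel + depthBytes);
            printf("%lldx samples: ~%lld MB of G-buffer\n",
                   samples, bytes / (1024 * 1024));
        }
        return 0;
    }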
Exactly, so posting benchmarks of the game running at MSAA 4x isn't exactly an accurate representation of the kind of performance you can get out of the game, and arguably isn't even worth the massive performance hit as it just gives you a very slight IQ boost over FXAA. On my own machine, I'm playing at 1440p maxed settings with FXAA and I'm seeing 60 FPS on a regular basis with V-sync on. With V-sync off, I'm getting into the 70s.
This is on a Gigabyte G1 GTX 970 SLI rig with a 4930K @ 4.3ghz driving them..
I'll bet you a dollar you're CPU limited at mid to high 70s when you're down on the streets. Anyway, I ran the Medium numbers as well at 1080p, which is basically FXAA with High textures and a few other items turned down a notch that don't really affect things that much. As to what's "an accurate representation of the kind of performance you can get", well, the numbers don't lie. If you want to run different settings, the numbers change, but there's a reason the developers don't just use FXAA as the default at all settings.
I probably am CPU limited with V-sync off, but considering I'm above 60 FPS and how much is being rendered (the game is absolutely massive in scope and detail), I would say that the engine is still fairly optimized. When I'm playing the game, my CPU is usually around 50 to 60% loaded on all 12 threads with V-sync on. I haven't tested CPU usage with V-sync off though.
The game definitely uses a hex core processor, so that's probably why your frame rates are lower than mine..
I wonder, if Apple hadn't taken all the 20nm production this year and AMD/nvidia had 20nm cards, whether we wouldn't have a $200-300 card that easily outputs 60fps at 1080p Ultra. We really should have been at 20nm this year.
Why don't you turn off AA and show people what the game can actually run at? I don't know why this is a must-have when you can't get solid frame rates. If you ran all the same benches without any AA, I don't see why it would be so abysmal. AA is a luxury, not mandatory.
No, more on some of the high-end numbers where AA starts to get redundant, especially at 4K. I loved Crysis when it came out and it slapped my 7900 GTX SLI around, because I knew it was the start of something great to come. This game does have some nice touches, especially in the quantity of NPCs on the screen, the use of AI, and the level of detail for such an expansive city, but it is nowhere close to heralding a new look of what's to come in terms of textures and reach. Most people are going to set it to the highest textures, turn off AA and get their playable fps at whatever resolution their card supports, so I have to admit this is the first time I've really felt a little leery at the state of the game presented on AnandTech. I've been reading the site since it was launched, but this game benchmark just didn't leave me with a sense of what performance is really going to be like across various setups.
I don't understand how they can do a poorer job of porting the game to PC on AMD hardware than Nvidia when the consoles are using AMD GPUs. Unless they built it for PC with Nvidia in mind and then did a crappy job of porting it to consoles. Of course given the poor performance of the game on consoles, that isn't hard to believe.
Ubisoft is quickly becoming the new EA. I won't be buying this game this year. Probably in a year when it's down to $20 and they've maybe patched it to a reasonable state. I say maybe because Watch Dogs has been out for months and is still pretty bad.
The bleeding edge has to be pushed, lest there be no need for more. The same thing was said about Crysis, and then it wound up being the most famous FPS-freak game ever, and still is, until perhaps now. So getting down on leading-edge games that present a challenge to GPU designers is not in all of our best interest. Also, it's nice to see a "port" frustrate the highest-end elite desktops and see the whining not be about how cruddy ported games are for any sort of gaming, but in this case about "how slow my thousands of dollars are". Very glad to see it crushing the best of the best; we need more of this at a faster rate, and then we hopefully won't hear so much, and so often, that "the increase with the new core isn't worth it".
Now the GPU makers must overcome, a challenge is a good thing.
This would be a reasonable sentiment if in fact the game was "bleeding edge" graphically. Crysis was a landmark visually (and still looks impressive) and I feel very safe to wager that Unity will not be remembered in even close to the same way. Anyone can make a game that brings "elite" hardware to its knees; it's not an impressive feat on its own if it doesn't deliver the experience to justify it.
*shrug* runs fine on my PS4. I'd give it more of an 8.5 personally. Paris is the best playground yet for this series. Online features are still being ironed out, but the game is great :)
Jarred, I know you didn't test for it, but any thoughts on how system memory affects things? The minimum is 6GB with 8GB recommended; I wonder what impact this has?
(I've just gone from 4GB => 6GB to run this game; wondering if I need to replace the other two sticks too, or whether the fact that the swapfile will be on an SSD is enough.)
I was looking forward to TC: The Division, but given Ubisoft's recent track record and inherent game factors (new engine, MMO, RPG aspects) I'm just not sure that it will be anything except a colossal balls-up.
I agree with many other commenters about the strangely sanguine tone of this article, breezing past the massive performance bottlenecks and urging people to upgrade their hardware instead of pointing the finger where it belongs - Ubisoft - and attacking them for releasing what is essentially a botched game in terms of performance. You should be running 60+ at High 1080p settings with a 290/780. Instead you barely get to 45 frames with a 780.
The fact that even a 980(!) can't get over 60 fps at 1080p High means that the game needs to be canned, and not the reader base's hardware. Do better, Jarred.
That's certainly not what I'm doing. Just because the last sentence says, "And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season" doesn't mean I'm telling everyone to upgrade. What I am saying is that IF you want to run THIS game (and IF other games end up having similar requirements in the near future), then yes, a lot of people will need new hardware (or lower quality settings).
When Crysis came out, nothing -- NOTHING -- could run it properly at maximum quality settings. People skewered Crytek and said they were lousy programmers, etc. and "the game doesn't even look that good". And yet, I don't really think that was the case -- they just decided to enable settings that pushed beyond what was currently available.
Is Ubisoft intentionally doing that with their latest releases? Perhaps not in quite the same way (it is the holiday season after all), but the decision to drop support for older generation consoles in order to enable a higher quality experience certainly wasn't made to improve the sales of the game. Believe it or not, there are game developers that just really want to use the latest and greatest technologies, performance be damned.
Fundamentally, we're not a sensationalist website. We're not in the market of pointing fingers, casting blame, etc. All I can say is how the game works right now on the hardware I tested, and it's up to the readers to draw conclusions. Was the game pushed out early? Almost certainly. Should they design all games so that 1080p High gets 60+ FPS? I'm not one to dictate whether that's the best thing to do or not, and I do like seeing companies push the performance envelope on occasion.
It hurts when your high-end GPU can't run a game with settings you are accustomed to using, but I do have to say that their recreation of eighteenth century France is quite remarkable.
That moment when you're done coming through the first forest and you hit the rays coming through the trees, and you look down over the cliffs.
I don't think many people said it was coded badly (although they probably did), but it was such an incredible step up visually that people really took notice.
Assassin's Creed Unity may also be a fantastic game visually, and I will get it at some point, but the fact is, console hardware is a measly set of Jaguar cores and low to midrange previous generation Radeons.
People are right to expect their massively more powerful machine could run the game at 60 FPS.
I've just got a laptop with a 2nd-gen i7, 8GB RAM and a 6770M (2GB VRAM). I know this config is too poor for any serious gaming session... but I'd like to play ACU, like I did with the previous episodes... Could I get at least 25 fps at the lowest settings and a resolution of 1366x768? I don't need the best graphics, I just want to know the story... And of course I'd have to buy the game to try it... Need help, guys :)
Interesting findings with the texture setting, Jarred; it looks like Santa (Ryan?) sent you some early X-mas presents too with the GPU upgrades. I would also be interested to see a kind of "feature expense" comparison, where you go through some of the main settings to give an idea of what kind of perf hit you take when enabling them at different settings.
For example, I remember a time when setting textures to max was an automatic, but now it seems in this age with 2K and now 4K textures with next-gen console ports, that's no longer possible since those textures will fill VRAM in a heartbeat. Also, did you have any >4GB cards or high bandwidth cards to test to see if they helped with the texture situation at all? Like Titan Black?
But lately I have seen textures and MSAA creating a much bigger perf hit than in the past due to the amount of VRAM they take up. There was a time where VRAM didn't make as much of a difference as shading power and you could just crank up the textures and use MSAA without the crazy hits to perf we see today.
I opted to turn everything up to max on my 970 (but with soft shadows turned off) and with a 30 fps locked frame rate (1080p). It plays butter smooth, but man, if any game benefits from a 60 fps frame rate it's Assassin's Creed, with its wonky input lag (I play it with a controller) and even wonkier world traversal/parkour.
Takes a bit of getting used to, but at 30 fps it ain't all that bad and it's a damn nice looking game with the settings maxed out.
Ubisoft is just forming a pattern here of poorly optimised software. They have some of the best artists, but apparently some of the worst software developers. Also, I don't believe them for a second when they try to offload their incompetence on a hardware manufacturer.
Let's be honest, this is a poorly optimized game with an enormous number of bugs that was so ridiculously messed up it made the BBC news and Ubisoft's shares dropped 11%! It's a complete debacle.
A Core i3-4130 with a GTX 750 Ti runs this game as well as the console version.
Eurogamer did a test matching the PC's graphics quality to the console version (running it at 900p with graphics settings similar to the PS4), and the result was that the GTX 750 Ti plays it as well, if not slightly better.
When a game is barely playable on the most high-end video cards on the market at resolutions and settings PC gamers are accustomed to, you have utterly failed. Bravo Ubisoft. Bravo.
You can forget about Ubicrap fixing this! This is why Ubicrap gave the unreal PC requirements! They are getting money from GPU/CPU hardware makers to help market for them! And they don't care to spend more money on us scum customers anyway! So I say XXXXXXXXXXXX UBICRAP!!!!!
Any chance of testing CPU performance on AMD vs nvidia GPUs? I've seen a *ton* of recent games underperform on AMD GPUs due to what I think is their lack of support for deferred contexts aka 'multithreaded rendering'. It's particularly low-end CPUs that are affected.
Unity pushes something like 50,000 draw calls each frame. Note the enormous disparity in minimum framerates between the two vendors at 1080p/Medium, where even slower nvidia GPUs get higher minimums than faster AMD GPUs. I think it's worth exploring, as even low-end FX CPUs can almost double their performance on high-end nvidia GPUs vs. high-end AMD GPUs.
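For readers who haven't run into the term: "deferred contexts" are Direct3D 11's mechanism for letting worker threads record draw calls in parallel and then replay them on the immediate context, which is what's supposed to keep tens of thousands of draw calls per frame from bottlenecking one CPU core. The skeleton below just shows the API pattern (device setup and real draw parameters are omitted, and the Draw call is a placeholder); how much benefit it actually gives depends on the driver, which is exactly the question being raised here.

    #include <d3d11.h>

    // Worker thread: record a batch of draw calls into a deferred context and
    // hand back a command list. State setup and draw parameters are placeholders.
    ID3D11CommandList* RecordBatch(ID3D11Device* device)
    {
        ID3D11DeviceContext* deferred = nullptr;
        if (FAILED(device->CreateDeferredContext(0, &deferred)))
            return nullptr;

        // ... bind shaders, vertex/index buffers, constants for this batch ...
        deferred->Draw(36, 0);  // placeholder draw call

        ID3D11CommandList* commands = nullptr;
        deferred->FinishCommandList(FALSE, &commands);
        deferred->Release();
        return commands;
    }

    // Render thread: replay the recorded work on the immediate context.
    void SubmitBatch(ID3D11DeviceContext* immediate, ID3D11CommandList* commands)
    {
        if (!commands) return;
        immediate->ExecuteCommandList(commands, FALSE);
        commands->Release();
    }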
That last line about low-end FX CPUs tells me AMD is offloading multiple boatloads of work to the CPU --- isn't that exactly what Mantle is for on low-end CPUs: it relieves AMD's gigantic, overburdening, cheaty normal driver that hammers the puny AMD CPUs.
It's sad really - shortcuts and angles and scammy drivers that really only hurt everyone.
60 frames per second isn’t some arbitrary value. With Vsync enabled and a refresh rate of 60Hz, dips below 60 fps are far more unpleasant. Adaptive Vsync addresses that but isn’t available to everybody. Disabling Vsync leads to screen tearing, which some people (me included) find extremely annoying.
In a game every frame consists of discrete information. In a movie each frame is slightly blurred or at least partially blurred, a natural effect of capturing moving objects in a frame. For a game to feel fluent at 24 or 30 fps it needs to add artificial blurring.
In movies each frame has the same length. In games the length of each frame is different. So even 60 fps can feel choppy.
Different people have different sensibilities. I always notice a low frame rate and frame drops. A steady 60 fps with Vsync enabled works best for me. Anything below 50 fps (in a game) feels off to me and above 60 I don’t notice that much difference. Likewise for gaming and movies I use screens with a fast response time since ghosting really distracts me.
I feel that with a decent system a 60 fps minimum should be attainable. What bugs me is that in some games lowering the quality settings has little impact on the minimum frame rate.
I’m always surprised by blanket statements like “30 fps is perfectly playable”. Depending on the game, the settings and the person playing the game, it’s often not. For me another factor is how close I’m sitting to the screen.
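The frame-time point above is easy to demonstrate with made-up numbers: the two one-second traces in this sketch both average roughly 60 fps, but one of them has a 99th-percentile frame time of 50 ms, which is the stutter you actually feel. The traces are invented purely for illustration.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // Print average FPS and 99th-percentile frame time for a trace of frame
    // times given in milliseconds.
    static void report(const char* name, std::vector<double> ms)
    {
        double total = 0.0;
        for (double t : ms) total += t;
        double avgFps = 1000.0 * ms.size() / total;

        std::sort(ms.begin(), ms.end());
        double p99 = ms[static_cast<size_t>(0.99 * (ms.size() - 1))];
        printf("%s: %.1f average FPS, 99th-percentile frame time %.1f ms\n",
               name, avgFps, p99);
    }

    int main()
    {
        // Invented traces: 60 even frames vs. 60 frames with a hitch every
        // tenth frame. Both come out to about 60 fps on average.
        std::vector<double> even(60, 16.7);
        std::vector<double> uneven;
        for (int i = 0; i < 60; ++i)
            uneven.push_back(i % 10 == 0 ? 50.0 : 13.0);

        report("Steady", even);
        report("Hitchy", uneven);
        return 0;
    }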
FWIW, I've been playing games for about 35 years now (since I was 6 on a Magnavox Odyssey II), and when I say a game is "playable" at 40 FPS, what I'm saying is that, as someone with years of game playing behind me, I feel the game works fine at that frame rate. I've also played ACU for many hours at sub-60 FPS rates (without G-SYNC being enabled) and didn't mind the experience. Of course I wasn't the one saying it was "perfectly playable" above, but it is most definitely playable and IMO acceptable for performance. If you want *ideal*, which is completely different, then yes: 60+ FPS is what you want. But then there are those with LCDs running at 120Hz who would want even higher frame rates. YMMV.
I don’t mind somebody saying: “this game is perfectly playable for me at 40 fps”. I do mind it if people say that there is no perceivable difference between 40 fps and 60 fps (as stated in the comments) or when people say “the game runs smooth as butter” when it doesn't. The article was fair, some of the comments weren't.
For me a game is not enjoyable at anything below 50 fps and I much prefer it to have Vsync enabled.
I would say that most people accept 60 fps as a reasonable goal at medium settings (whatever those may be) with a high-end GPU. Depending on personal taste (graphics settings) and budget, people can then choose to sacrifice fps (or money) for MSAA, AO and high-res textures. I strongly believe that studios should aim for 60 fps at medium settings with a high-end card, and 60 fps at low settings with a mid-range card (both at 1080p).
With smart design choices and quality control that is certainly possible. As it stands, I’m disappointed with both Far Cry 4 and Unity.
1) I wonder if an i5 vs i7 (Hyper-Threading) matters. 2) I wonder why you guys don't borrow a Titan Black and test it to see if the extra VRAM improves things. Surely a contact at Asus, Gigabyte, nVidia, etc. has a Titan Black with 6GB of RAM to lend you. Probably two for SLI. I'm curious to see if the game can use the VRAM, because I'm hearing reports of Ultra taking 4GB and gobbling it up. 3) The Ultra settings preset includes MSAA. That's the first setting I'd turn off if my frame rates were taking a dive. It gobbles up memory AND processing like nobody's business. What happens if you turn it off?
Seems like obvious questions to me. Until Batman Arkham Knight, this looks to be The Benchmark game in terms of crushing your system. Assuming they ever finish patching it.
VRAM clearly makes a very big difference in this game. To answer the question above, I maxed out the settings at 1080p on my GTX Titan (original) and just ran/jumped round Paris a bit while GPU-Z was set to data log. The file shows constantly high memory usage, maxing out at about 4.4GB. Interestingly, with stock settings the GPU was often being pushed to relatively high clock rates by GPU Boost, so it looks like the GPU was not being worked extremely hard.
Not a scientific test, but potentially bad news for people with 2GB and 3GB cards, as tweaking will not recover the difference. Interestingly, I noticed that the main system memory the game takes is not that large, and I wonder if the issues people are experiencing are possibly related to the way the game has been programmed and the unified memory model the PlayStation and Xbox use. On the consoles the distinction between "graphics" memory and "system" memory does not matter in the same way that it does in a gaming PC with a graphics card.
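If anyone wants to repeat that check on their own card, something like the sketch below will pull the peak value out of a GPU-Z sensor log. The file name and the "Memory Used" column label are assumptions about how GPU-Z happens to name things in its comma-separated log, so adjust both to match your own file.

    #include <algorithm>
    #include <cstdlib>
    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    int main()
    {
        // Assumed log format: comma-separated, header row first, with a column
        // whose name contains "Memory Used" (reported in MB). Adjust as needed.
        std::ifstream log("GPU-Z Sensor Log.txt");
        std::string line;
        int memCol = -1;
        double peakMB = 0.0;

        while (std::getline(log, line)) {
            std::stringstream ss(line);
            std::string field;
            std::vector<std::string> fields;
            while (std::getline(ss, field, ','))
                fields.push_back(field);

            if (memCol < 0) {
                // Header row: locate the memory usage column by name.
                for (size_t i = 0; i < fields.size(); ++i)
                    if (fields[i].find("Memory Used") != std::string::npos)
                        memCol = static_cast<int>(i);
                continue;
            }
            if (memCol < static_cast<int>(fields.size()))
                peakMB = std::max(peakMB, atof(fields[memCol].c_str()));
        }
        std::cout << "Peak VRAM use in log: " << peakMB << " MB\n";
        return 0;
    }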
Lol at needing freaking SLI 970s for 60+ fps at 1080p. Do you think patches in time can make this playable on high-end single-card setups like a 290X on Ultra?
Unity is a good game once you get past the glitchfest. No, it is not a revolution of the Assassin's Creed series, more an evolution of Assassin's Creed 4. It is one awesome game (I played it on a friend's console and another one's PC) once you get past those issues. The only thing I don't like about it is that it is VERY VERY hungry for graphics power even at 1080p settings. To the point where the latest 980M's from NVidia struggle to push more than 30fps at those settings on Ultra. I'm wondering (considering I do not see much additional graphics prettiness) whether that is a sign that the game was not properly optimized for PC's and notebook PC's. If it is, that is something that Ubisoft (and other game makers) are going to have to take note of and fix.
At last, a breath of fresh air. Instead of getting everyone excited about how well you can play Pac-Man at 10K, one company still serves as a reminder of the distance we still have to cross.
Way to go Ubisoft, and if you make a game hardly playable at 1280x720, I will make a donation to you and create a church for you. We have had enough of the mobile devolution, touting meaningless resolutions (3 megapixels on a tablet, oh my god). You will serve as a reminder that high resolution is good, but you have to have some real content to show on it.
We need a new Crysis, rather not only one but several in succession.
QUOTE: And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season
WHAT? How about: the game is BS! You know good and well this game is just a piece of junk! It is not because of older cards or because the game is graphically advanced! This game is no more advanced than any other game! You should not be advising people to waste money on a new, expensive GPU just to play a game that has bloated PC requirements because Ubisoft suddenly decided to no longer properly optimize PC games! Instead you should be pointing out how horrible Ubisoft's new games are! If AnandTech is now going to push marketing instead of pointing out the truth about horrible software, then it looks like AnandTech is no longer a trustworthy website for benchmarks or anything! BS is BS, no matter how many cherries you put on top of the BS! Any benchmark site benchmarking games like this is absolutely discredited!
funnyferrell - Thursday, November 20, 2014 - link
Unless I'm totally blind, your CPU benchmarks don't appear to be up there.JarredWalton - Thursday, November 20, 2014 - link
As noted in the text, I only ran the i3-4330 simulation with one GPU, and furthermore I only ran it at 1080p (Ultra/High/Medium). Basically it couldn't do more than that so I left of further testing.FITCamaro - Thursday, November 20, 2014 - link
Yes but you mention charts and don't show any.JarredWalton - Thursday, November 20, 2014 - link
The i3-4330 + GTX 980 numbers are in black in the 1080p charts.P39Airacobra - Tuesday, January 13, 2015 - link
How was a i3 doing so bad? This game is basically the same engine as black flag except not optimized at all. And the i3 always performs almost identical in games vs the i5 and i7. Are you sure you did not fake that?P39Airacobra - Tuesday, January 13, 2015 - link
Also I know of some Pentiums like the G3258 model playing the game perfect with a 970.P39Airacobra - Thursday, December 11, 2014 - link
I suppose you are because the benchmarks are there, You just have to know how to use a webpage instead of only worrying about trends.os6B8dbVUesnzqF - Thursday, November 20, 2014 - link
I'm not sure why any of these frame rates are considered playable. Unless you have a gsync monitor, anything less than 60fps minimum frame rate is going to be awful.JarredWalton - Thursday, November 20, 2014 - link
"Playable" is not the same as "ideal". I've logged plenty of hours over the years playing games at well under 60 FPS. 30FPS is usually the point where things get "smooth enough" to play well. 40+ is definitely sufficient. G-SYNC is merely icing on the cake if you have it.raghu78 - Thursday, November 20, 2014 - link
JaredTesting must be done at settings which are playable. Why are you testing QHD with Ultra and 4k with High settings where not even a GTX 980 is playable ? You did not even bother to show what setting is playable at 1440p/4k on GTX 980. My guess is high at 1440p and medium or low at 4k would have been playable on GTX 980. Gameworks features like PCSS is killing fps on all cards. AMD definitely need to improve performance in AC Unity.
silverblue - Thursday, November 20, 2014 - link
There's been a rather intense furore following AC:U's launch. Essentially, Ubisoft have blamed AMD for poor performance, and then partially retracted said statement. Considering how closely Ubisoft works with NVIDIA, it would sound like they've only developed for GeForce cards... but some NVIDIA users are having issues as well. What's more, Far Cry 4 is also causing issues with AMD hardware. Both were developed by the same software house.All in all, it looks more likely that Ubisoft Montreal's internal testing and QA is not up to scratch. You can't simply blame one vendor's CPUs and GPUs for poor performance when you've not bothered to optimise your code for anybody barring NVIDIA. I've even heard that playing offline results in a HUGE performance boost across the board...
Friendly0Fire - Thursday, November 20, 2014 - link
More like a yearly release schedule is untenable for a game of this scale. Corners had to be cut somewhere.silverblue - Thursday, November 20, 2014 - link
Logical, but even then, it's poor form for UbiSoft to slate AMD for what is most likely their fault as opposed to poor drivers.FlushedBubblyJock - Thursday, November 20, 2014 - link
That's amazing how it's never AMD's fault no matter what. No matter how poorly they do. No matter how many features they do not have or only have as a ridiculed braggart's vaporware. No matter how long it takes them to continue failing and not delivering, it's not their fault.Never AMD's Fault should be their saying since Never Settle is the opposite of the truth with AMD.
ppi - Thursday, November 20, 2014 - link
While I would agree with you AMD has been relagated to ultra-low-budget-inconsequential player on the CPU on front, with respect to GPUs I am not certain where you have been living last couple years, whether on Mars or under some rock.Since HD 4000 series, AMD has been running neck-for-neck with nVidia, sometimes kicking it in the rearside soundly, e.g. Radeon 5870 vs. rebadged GeForce 8800, sometimes being a bit behind, until the Maxwell 980 and 970 parts came couple months ago. But even now, the fastest AMD offering is still at least on par with 2nd fastest nVidia offering performance-wise (the issue is rather power consumption). And drivers-wise, there's lot of games coming out with very good graphical fidelity that have no issues on AMD cards.
Who failed here big time are Ubisoft's managers, who (probably wishing to please the shareholders) wanted to rush the games before the December holiday season to get extra bucks, and allowed proper Q&A to be skipped. There is absolutely no no excuse whatsoever for neglecting GPUs that still make 1/3 of the market (and mind you, nVidia performance is reportedly far from perfect as well). If the AMD cards did not work, they either should not have released the game at all, or release it nVidia only/AMD beta-only.
I do hope it backfires them at Ubisoft in such a way, that instead of now, these games will be rather bought a year later, in 2015 Steam sale season.
D. Lister - Friday, November 21, 2014 - link
Imagine what the nvidia hardware could do with the same power budget. And it isn't just power, but also temps and noise. How come AMD default coolers are the worst in the market yet the nvidia default coolers, esp. for the higher-end models are some of the best? How come it took AMD more than a decade to address the multi-gpu micro-stutter issue in the drivers? And how about the alleged speed boost in CPU performance that AMD promised with Win 8, that never quite took off?
AMD hires from the same talent pool as their competition, but ultimately, it is their consistent corner-cutting and false promises that hurt their business and relegates them to a lower tier.
I apologise if I offended any AMD fans, but please understand this, you aren't on AMD's side and I'm not on nvidia/intel's side... it is us, the consumers who are all on the same side, and unless we demand the quality that we are paying for, every now and then someone would try to get away by BSing us out of our hard-earned cash.
FragAU - Friday, November 21, 2014 - link
You are kidding right? I have been fortunate enough to essentially own every top-end GPU since the days of 3DFX Voodoo (and before!). AMD has certainly released some absolute monster cards and has been responsible for keeping Nvidia in check since all other competition ceased to exist. Both companies have had their fair share of the performance crown.Currently I own 2x 290X and have since their launch - I run every single game without issue (aside from the topic of this one) at Ultra settings with no issues (Both watercooled so nice and silent too). Ubi soft is just plain rubbish these days, heck look at the status of their cruddy GTA wannabe watch dogs? That game had issues on any PC. Tell me how black flag can run flawless and then this game just run like absolute crud? Sure a 980 should be in front but the 780ti/290x shouldn't be that far behind.
I will freely admit that Nvidia usually do have more solid drivers in early releases but nothing that has been a deal breaker. Having run SLi and CF since early days I can tell you that both have share of issues .. Anyway all I can say is you better hope that AMD keep on the heels of Nvidia or you will be paying $700 for the same GPU for 3 generations.
silverblue - Friday, November 21, 2014 - link
CrossfireX was only introduced in September 2005. Granted, the time from then to a viable fix was about 8 years (which is still a very long time) but there's two factors to consider here - how long has it been a problem, and how long has it taken AMD to acknowledge it themselves? The discrepancy between the two is what they really need to be judged upon, not how long it took for a solution to reach the user base. Promising fixes is one thing, burying your head in the sand and flat out ignoring your customers is quite another.FlushedBubblyJock mentioned it never being AMD's fault for this, that and the other. You'd have to be blinkered to subscribe to that particular theory. AMD's biggest problem is delivery - VCE support was a joke for a long time; some might say their DirectX drivers are in need of serious work; TrueAudio doesn't appear to be having any impact... to name a few. Team Green is in the ascendency right now, and AMD can't release a counter quickly enough, so they look to have no answer for Maxwell. It's almost all down to delivery, and we can only hope they improve in this area. It's not about being a fanboy, but bringing some objectivity to the discussion.
ppi - Friday, November 21, 2014 - link
Yes, right. But my point was mainly that graphical glitches and poor performance in ONE PARTICULAR GAME, sponsored by AMD's competitor, should be blamed on Ubisoft Q&A and them rushing to get the game out for x-mas, rather than on AMD.AMD do disapoint me though. Case example: When LCDs came out, I thought - great, now we will be able to get variable refresh rates. But lo and behold, 10 years pass and nothing, until nVidia comes with G-Sync. And then we learn AMD had done it, they had it RIGHT IN FRONT OF THEIR EYES, and they did not see the benefits, but instead tried to sell it as some crappy energy saving thingy. *facepalm* It is clear their development lacks some people who would focus on improving *game experience*.
(btw, from my last 6 gfx cards, 3 were nVidia, 3 AMD/ATI)
D. Lister - Saturday, November 22, 2014 - link
@ silverblue"CrossfireX was only introduced in September 2005..."
I'm sorry, -almost- a decade then. Because it is really inconsequential how long a particular phase takes in the entire process of solving a problem - what matters is how long it took the end-users to get their money's worth.
Secondly, the defence that they just didn't know any better, while the competition apparently did, to the point that the competition had to develop a tool (FCAT) for AMD to actually see (or recognise) the problem, merely suggests that if they weren't being deliberately callous, they were just outright incompetent. Either way, the point is that they need to step up their game, because their customers and fans deserve better than what been bringing forth, and because the free market needs good competition.
silverblue - Saturday, November 22, 2014 - link
Understood - definite incompetence and on a grand scale, too, considering somebody with multiple cards has put x times the money into the vendor than somebody who would purchase just the one. I would find it hard to believe that they were unaware from their own internal testing. There's the possibility that whoever presides over this was given their marching orders and AMD set about fixing the damage, but I guess we'll never know.I apologise for the pedantry as well.
D. Lister - Saturday, November 22, 2014 - link
No problem at all; it takes a big man to take an opposing argument with such candor - well done.
FlushedBubblyJock - Wednesday, November 26, 2014 - link
It's AMD's responsibility to work with game devs to make certain their cards work properly. Of course, AMD has been notorious for not doing that for many, many years, and then, of course, it's nVidia's fault.
AMD might do well to say: "We take full responsibility."
That would mean, of course, having the Catalyst makers do more than emailing and whining - showing up at game dev studios, taking an active hand, and having day-one drivers ready.
Of course, if they did that, what would their equally incompetent, blame-misplacing fans have to do?
I mean seriously, it's as bad as the worst politicians we've ever seen pointing fingers in every direction but their own.
Lerianis - Friday, November 28, 2014 - link
Agreed... it should be a year and a half at least for a game of this scale, with the manpower allotted to Ubisoft Montreal.
JarredWalton - Thursday, November 20, 2014 - link
1440p High is probably playable on a single GTX 980 -- I just ran the GTX 970 on that and got results of 30.4/23.6 Avg/Min, which is about 40% faster (44% to be precise) on average FPS and 65% faster on minimum FPS than the Ultra results. If the 980 sees the same scaling, it will be around 35/26 FPS at 1440p High (a rough sketch of that arithmetic follows after this comment). There's not a huge difference in performance or quality between the High and Medium presets, which means you really would need to drop to Low (or close to it) for 4K gaming.
Why did I test these settings? Because you have to choose something, and we generally go with "Ultra" at 1440p -- just to see how the GPUs fare. I've tested 4K at Ultra in the past, but that was completely unplayable across the board, so I dropped to High for this game. If I had dropped 1440p to High, I'm sure I'd get people wanting to see Ultra numbers -- you can't please everyone.
Anyway, as someone that has a 4K display, I can tell you I'd rather play at 1440p or even 1080p with better graphics quality (High, Very High, or Ultra) than run at native 4K with the Low/Medium settings. YMMV.
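The scaling estimate above is simple multiplication; here is a minimal sketch of it, assuming the GTX 980 scales from Ultra to High by the same factors measured on the GTX 970. The Ultra baselines in the example are back-derived placeholders, not measurements from the article.

```python
# Illustrative sketch of the Ultra -> High scaling estimate described above.
# The scale factors (1.44x avg, 1.65x min) are the ones quoted for the GTX 970;
# the Ultra baselines below are placeholders, not measured results.

def estimate_high_preset(ultra_avg, ultra_min, avg_scale=1.44, min_scale=1.65):
    """Project High-preset FPS from Ultra-preset FPS using measured scale factors."""
    return ultra_avg * avg_scale, ultra_min * min_scale

# A 970 measuring roughly 21/14 FPS at 1440p Ultra lands near the quoted 30.4/23.6 at High.
print(estimate_high_preset(21.1, 14.3))   # ~(30.4, 23.6)

# A hypothetical 980 baseline of roughly 24.5/16 FPS at Ultra gives the ~35/26 estimate.
print(estimate_high_preset(24.5, 15.8))   # ~(35.3, 26.1)
```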
AnnonymousCoward - Saturday, November 22, 2014 - link
IMHO, as a 30" owner I'm more interested in 2560 benchmarks at a quality setting that gives 60fps on non-SLI cards.
Akrovah - Thursday, November 20, 2014 - link
I disagree. I find 30 perfectly playable. That's the effective frame rate of television. Movies are 24, and nobody has issues with them not being "smooth enough." Heck, people almost got out pitchforks when someone dared film a movie at 48 fps.
I mean yes, for gaming 60 fps is preferable and looks and feels better, but to call anything under that "awful" is going a little far. Especially when the game in question is not a twitch shooter. Action/adventure games like Assassin's Creed are perfectly enjoyable at 30 fps.
HanzNFranzen - Thursday, November 20, 2014 - link
Well you know, this is the internet... comments must be exaggerated for effect. Either something is the greatest of all time or it's awful, never any middle ground. Anyways, I have a GTX 980 and a 5820K @ 4.0GHz, and I would say that my experience with "playability" in this game doesn't really mirror the benchmarks at 2560x1440/Ultra. Perhaps there are more taxing areas in the game that I haven't seen yet, but I'm not seeing frames dropping into the teens. I feel the controls hurt the playability of the game more than anything, as they just seem clunky.
theMillen - Friday, November 21, 2014 - link
Exactly my remarks: 3770K @ 4.8 and an EVGA 980 ACX OC'd to 1550... at 1440/Ultra it is completely playable. I'm about 4 hours in and am completely satisfied with the results. Would I love to stay above 60fps at all times? Yes. Am I satisfied? Yup!
foxtrot1_1 - Thursday, November 20, 2014 - link
There is a big difference between passively watching a 24fps film and interacting with a 24fps video game. I'm far from a pedant on these things and I find anything under 45-50 fps distractingly unplayable.
Cellar Door - Thursday, November 20, 2014 - link
The amount of less-than-intelligent comments in here is simply appalling! Do you realize how many PS4 and Xbox One games are locked at 30fps?
In reality, if you were put in front of a TV with one of those games and not told about the 30fps, you wouldn't even realize what is happening.
Stable 30fps vs. stutter - now there is a difference; why people don't understand this is beyond me...
Death666Angel - Thursday, November 20, 2014 - link
You forget the abysmal controllers console gamers have to use. Using a controller with low FPS is much different from using a mouse with low FPS.
TheSlamma - Friday, November 21, 2014 - link
There is nothing 'abysmal' about it... man, it bothers me that people are latching onto that word now and using it like candy. The 360 and Xbone controllers are wonderful controllers; it sounds like you just don't have the skill to use them. Good gamers can use all input types. After a 6-month break I hopped into BF4 the other day and still had a 2:1 KDR on my first match, having never even played the map, so yes, I'm good with KB/M. But I can also pick up my PS4 or Xbox 360 controller and crush it in games that play better with controllers.
theMillen - Friday, November 21, 2014 - link
And while we're on the topic of controllers as well as AC:U... this is one of those games that DEFINITELY plays better with a controller!
inighthawki - Thursday, November 20, 2014 - link
Yeah, like yours. 30fps is visually smooth, but the issue is with input. A controller is less sensitive to input lag due to the large disconnect from the screen. It's the same concept that makes touch screens feel unresponsive even at high, smooth framerates.
nathanddrews - Thursday, November 20, 2014 - link
Halo: Combat Evolved was 30fps locked on Xbox and is considered by many to be a great game. While I PREFER to change settings in games to get the frame rate to match my monitor (144Hz), I still enjoy games that play at low frame rates. I can't tell you how many hours I put into Company of Heroes on my crappy laptop... that thing barely cracked 30fps when nothing was happening.
nathanddrews - Thursday, November 20, 2014 - link
I forgot to add that my desire for smooth or higher framerates also varies greatly by game. RTS games can get away with 20-30fps as long as the jerkiness doesn't interfere with my ability to select units. For action games I prefer 60fps+, and for shooters or other fast-paced games I want all 144.
ELPCU - Thursday, November 20, 2014 - link
It really depends on the game and the person, IMO. Here is my experience.
In one old game (SD Gundam Online, a random Korean Gundam online game), I played at 40 FPS for a while, and it was a playable experience. Then I upgraded my rig.
After the upgrade, I was able to push a full 60 FPS without any frame drops, and I ran at a fixed 60 FPS for a while. Then I had a technical problem with the upgraded rig, so I went back to the old computer and played at 40 FPS. There was a MASSIVE difference after downgrading.
The experience itself was horrible.
But it really depends on the game and the person - again, it depends on how you accept 30 FPS. The game I mentioned was really sensitive; every move needs to be quick and responsive.
AC: Unity? I can agree these kinds of games are okay with somewhat lower FPS, though as I said it can be fairly bad if you have been used to 60 FPS for a while. A downgrade is far more noticeable than an upgrade, so it is more about how people accept it. If you go straight from a fully fixed 60 FPS (after using it for several months) down to 30 FPS, you might well call it a horrible experience.
By the way, my problem playing AC Unity was occasional freezing (frame drops below 10 or 15 FPS) for about a second at a time. Worse, after the frame drop my mouse cursor pops to a random place. I was playing with GTX 670 FTW SLI, a 3930K OC, and 32GB RAM.
This Unity frame drop issue was the most terrible one. At first there were crashes along with the frame drops, but Patch 2 fixed many of the crash problems; the frame drop issue still persists, however. I was using the LOW option for all graphics settings at 1600x900, which normally gives 60 FPS (70-80 FPS if I turn vertical sync off), but the freezing was still there, and it intensifies when I raise graphics options or resolution. That forces me to use 1600x900 with low graphics, no AA or other extras on top. Typically 60-80 FPS, but it was still horrible.
Murloc - Friday, November 21, 2014 - link
Easy answer: because that's just your opinion. I don't need more than 30 fps; most people can't tell the difference after that. I've played for years at 25 fps and it's fine. 23 fps is where it gets noticeable.
Mr.r9 - Thursday, November 20, 2014 - link
"a lot of people might want a GPU upgrade this holiday season". What about people who already have a 780 or 290/x coupled with a 120Hz FHD or 60Hz QHD monitors?Isn't it more reasonable to say: Don't buy this game for now, wait a few months. I don't believe that ACU is "tougher" to run than Metro or Crysis.
kcn4000 - Thursday, November 20, 2014 - link
Absolutely this! I shouldn't have to buy new hardware to accommodate lazy porting/coding. Don't buy this game until it is in an acceptable state.
JarredWalton - Thursday, November 20, 2014 - link
Isn't that what I said in the conclusion as well? "For those running older GPUs – or AMD GPUs – you probably want to wait at least another month to see what happens before buying the game."
If you have a 780 or above, the game runs fine -- just not with Ultra textures. Go for 1080p High and don't worry about it. (You might be able to reach for 1440p High, but honestly it's going to be tough on any single GPU to run that setting, as it's almost twice as many pixels to render as 1080p.) If you have AMD, yeah, either the game needs some patching or AMD's drivers need tweaking -- or both.
anubis44 - Thursday, November 20, 2014 - link
Hear, hear. Ubisoft's motto needs to become: "We shall sell no wine before its time."
I have a sneaking suspicion that nVidia's money is behind the lack of AMD optimization. Ubisoft MUST make the game work on the AMD-powered consoles, so I hardly believe they didn't know how to make it work with Radeon graphics cards on the PC side. More like nVidia paid them off not to use Mantle and not to make it work well with AMD cards.
FlushedBubblyJock - Thursday, November 20, 2014 - link
More like it's cutting-edge stuff that brings the best to its knees, so nVidia has to spend their millions since AMD is broken, broke, and helpless. So AMD whines and moans and tells PR lies, and then many moons later a few of us find out AMD refused to cooperate, because they act like their most childish fans instead of acting professionally and in their own best interest.
RussianSensation - Thursday, November 20, 2014 - link
Flushed, your posts make absolutely no sense when AMD's GCN cards run very well in modern games such as COD:AW, Civilization BE, Evil Within, Ryse: Son of Rome, Dragon Age Inquisition, and especially in another Ubisoft title: FC4.
RussianSensation - Thursday, November 20, 2014 - link
R9 290X = 56 fps
980 = 57 fps
7970Ghz/280X = 42 fps
770 = 29 fps
techspot.com/review/917-far-cry-4-benchmarks/page4.html
Besides Unity and some issues in Lords of the Fallen, it is actually NV cards, specifically the Kepler architecture, that have not been pulling their weight in the last 6 months. Not to mention that AMD has the entire sub-$330 desktop GPU market locked up, winning in performance at every price level.
techspot.com/guides/912-best-graphics-cards-2014/page7.html
Focusing on the broken Unity game as evidence that AMD has issues in performance misses the other 95% of games released in the last 6 months where it is NV that's having issues. Good one.
Horza - Thursday, November 20, 2014 - link
Pesky facts won't help; this is an emotional argument. AMD are childish, whining liars, don't you know.
JarredWalton - Thursday, November 20, 2014 - link
Have you looked at what people are saying about the console versions, though? They're not exactly shining pillars of smooth frame rates. And I seriously doubt NVIDIA paid Ubisoft to not optimize for anything other than NVIDIA hardware; it's more likely a case of "AMD didn't come by and offer us free GPUs and software developer support."
It's in Ubisoft's best interest to make the best game they can that will run on the widest selection of hardware possible. Many of these games have budgets in the tens of millions, so intentionally killing performance (and sales) on a big chunk of the market would be crazy. Then again, the world is full of crazy.... :-)
chizow - Thursday, November 20, 2014 - link
Yes Anubis, of course you have a sneaking suspicion. Nvidia obviously paid AMD's driver team to write bad drivers and not bother optimizing for AC: Unity, to show Ubisoft how much bad publicity they could garner for not teaming up with a vendor that is becoming less relevant by the day.
I guess you could say the difference is that the console makers actually write their own drivers for their APUs, and AMD has nothing to do with it at this point. They gave them the keys and blueprints and vacated the premises, which is probably a good thing for console owners. If you bought a console, would you honestly want to rely on AMD driver updates for it? D:
AMD needs a driver update, plain and simple. The poor XFire scaling results should be enough to make this clear, which I know you are already aware of.
kron123456789 - Thursday, November 20, 2014 - link
Btw, about image quality... I suggest you take a look at these screenshots. This is amazing graphics!
http://cloud-2.steampowered.com/ugc/53625466562634...
http://cloud-4.steampowered.com/ugc/34103307051801...
http://cloud-4.steampowered.com/ugc/34103307025487...
WithoutWeakness - Thursday, November 20, 2014 - link
I spent a good 10 seconds admiring the detail they put into every strand of hair on the girl in the first pic before I realized the poor fellow on the right didn't have a face.
kron123456789 - Thursday, November 20, 2014 - link
Here's another screenshot with maxed out Ultra graphics ))
http://cloud-4.steampowered.com/ugc/36355106839023...
JarredWalton - Thursday, November 20, 2014 - link
And as I specifically mentioned in the text: the missing faces/textures were apparently patched on Day 0; I personally never saw the problem. I wonder if all the hubbub over the faceless people might have something to do with a bad crack -- wouldn't that be fun? Anyway, it's 2014 and the game uses UPlay, so unless I'm missing something you have to be connected to the Internet to play, and the only people not updating with the patch... well, you fill in the blank.
chizow - Thursday, November 20, 2014 - link
Yeah, probably. Pirates get half a game and wonder why it's broken. It would've been funny if Ubi had tweeted something like: "Hey PC players, those of you who are getting scenes from Darkman pirated the game!" like they did with the FC4 FOV setting.
r3loaded - Thursday, November 20, 2014 - link
I'm seeing these benchmark results and all I'm thinking of is "shitty optimization".
MooseMuffin - Thursday, November 20, 2014 - link
Yup. Poor performance on PC is a good indicator of a rushed PC port, but poor performance on consoles (also true for this game) sounds like the whole project was a mess. Those are fixed hardware configurations that they've known about for a long time.
agent_x007 - Thursday, November 20, 2014 - link
When MSAA/TXAA/MFAA (the latter two are based on the former) is dropped, you are good to go on, for example, Ultra settings with a 3GB VRAM card up to 1920x1200 (with FXAA).
At these settings I get around 50 FPS on a GTX 780 Ti OC; here's my video: https://www.youtube.com/watch?v=fGdXJN-5YXw
But this game sure can kill any card out there.
kron123456789 - Thursday, November 20, 2014 - link
This game can kill not only any card, but your mind as well, with bugs like this:
http://cloud-4.steampowered.com/ugc/50991716460791...
Lucian2244 - Thursday, November 20, 2014 - link
That bug was fixed in the day-one patch, and it only appeared on a few specific GPUs. I know the hate towards Ubi is great, but get your facts right.
dirtyferret - Thursday, November 20, 2014 - link
I love idiots who think any game under 60FPS is not playable. I imagine they have Fraps running in the corner of their screen and throw a total hissy fit if any game dares to dip below 60FPS on their ego trip of a PC. I know Nvidia/AMD stockholders love them dearly.
FlushedBubblyJock - Thursday, November 20, 2014 - link
Well, only nVidia stockholders, since AMD is the pit of Hades, deep in the red and hollowing out everyone's investment pocket.
Dribble - Thursday, November 20, 2014 - link
Looks like it's basically CPU limited. The difference between Ultra and Medium is only a few fps for something like a 970 at 1080p. It would be interesting to try it with a 6- or 8-core Intel processor and see how it scales with more cores.
JarredWalton - Thursday, November 20, 2014 - link
On which setup are you seeing "only a few FPS"? 1080p Medium is 22% faster with 970 SLI on average FPS and 31% faster on minimums, and a single 970 is 49% faster on average and minimum FPS on Medium vs. Ultra. That's far more than a few FPS.
The gap between Medium and High is much smaller, but then they both use High Quality textures and honestly they look very similar. There the performance is only about 10-30% faster (depending on GPU), though minimums still favor cards with more than 2GB VRAM by a large amount.
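For reference, the "percent faster" comparisons above are plain ratios; here is a minimal sketch of the calculation (the FPS pair in the example is a placeholder, not one of the article's measurements):

```python
# Minimal sketch of the "X% faster" comparisons used in the comment above.
# The example FPS values are placeholders for illustration only.

def percent_faster(fps_new, fps_base):
    """How much faster fps_new is relative to fps_base, in percent."""
    return (fps_new / fps_base - 1.0) * 100.0

# e.g. a hypothetical card doing 49 FPS on Medium vs. 40 FPS on Ultra
# would be ~22.5% faster on Medium.
print(f"{percent_faster(49, 40):.1f}% faster")
```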
Dribble - Thursday, November 20, 2014 - link
Well, I'd expect a bigger performance difference between Medium and Ultra. Looking at the CPUs, the 4-core pretty well doubles the 2-core's minimum frame rates; that shows the CPU is having a much bigger impact. If that's the case, what would 6 or 8 cores do?
JumpingJack - Thursday, November 20, 2014 - link
Hahaha, we have a new king... "but can it run Assassin's Creed Unity?"
Calista - Thursday, November 20, 2014 - link
If you have the time, I would like you to test further at even lower resolutions. There's not much point knowing GPU x can do 18 fps at 1080p, since it's much easier to adapt to a lower resolution than to a lower frame rate. Maybe you could take the slowest of the bunch and try out 1600x900 and 1280x720 as well? If the system is still up and running, I guess it wouldn't take much more than a few hours.
JarredWalton - Thursday, November 20, 2014 - link
I did run 768p Low on most of the GPUs... I don't want to make a graph because really, desktop users don't want to run games below 1080p IMO. But if you're wondering about the laptops and lower end hardware...
Performance at 768p Low (Avg/Min):
860M: 35/25
870M: 45/32
880M: 49/37
980M: 56/40
R7-250X: 25/12
R9-280: 37/24
R9-280X: 43/26
R9-290X: 49/27
Intel HD 4600: 6.6/3.2 (Hahaha...)
Of those, I should note that only the 860M and 250X are unable to hit "playable" frame rates at 900p Medium.
huaxshin - Thursday, November 20, 2014 - link
The CPU plays a big role in Assassin's Creed Unity, so the GTX 980M comparison against the desktop GPUs is skewed. The desktop GPUs are paired with 84W+ CPUs, while the GTX 980M is paired with a 47W soldered, lower-clocked CPU.
I expect the GTX 980M would be closer to the GTX 780 if they ran the same clocks. Something that would be interesting to see from AnandTech: a review of the GTX 980M against desktop parts when both have roughly the same CPU power.
http://gamegpu.ru/images/remote/http--www.gamegpu....
JarredWalton - Thursday, November 20, 2014 - link
The i3-4330 numbers are there for a look at where the CPU bottleneck would lie on lower end CPUs. I would guess that the mobile quad-core CPUs like the i7-4710HQ are generally keeping the GPU "filled" with work. The 980M might be a bit faster with a higher clocked CPU, but I don't think it would come anywhere near the 780 or 970.
I've got some numbers, and basically across a large selection of games the 780 (with a desktop CPU compared to a mobile CPU) is around 25% faster than the 980M (and the 780 and 970 are basically tied in overall rankings -- like literally within 0.1% of each other).
anubis44 - Thursday, November 20, 2014 - link
Jarred, I'd like to see these benchmarks on an AMD FX CPU as well. Forget the APUs, as they don't have level 3 cache, but the FX chips do.
JarredWalton - Thursday, November 20, 2014 - link
If I had an FX rig, you can be sure I'd test at least one or two GPUs on it to see how it compares, but sadly I don't.
chizow - Thursday, November 20, 2014 - link
I thought that was what the i3 simulation was meant to mimic? ;)
Morawka - Thursday, November 20, 2014 - link
We need those big, wide Nvidia cards to come back - a 512-bit bus or even a 1024-bit bus. My GTX 980 only chokes when I try to enable any form of AA in FC4 and AC: Unity. As long as AA is set to None or 2x MSAA, the games run at 60FPS.
Notmyusualid - Thursday, November 20, 2014 - link
Great to see some mobile GPU numbers in there.
Allows the rest of us to know what to expect from a title...
Thanks.
eanazag - Thursday, November 20, 2014 - link
The game is a hardware thrasher, judging from the numbers. I can understand PC titles playing at low frame rates, but there's a problem if the consoles can't get the game over 30 FPS. That is a design failure, since you can't upgrade consoles.
YazX_ - Thursday, November 20, 2014 - link
PCSS kills performance. I'm running the game with everything on Ultra except PCSS (set to High), with FXAA at WQHD (2560x1440), and getting 55 FPS average with a GTX 970 (1525/8GHz); minimum FPS is around 40.
Switching to 2xMSAA with MFAA enabled gets me around 45 FPS average and 30 minimum, so I wonder how the 970 SLI in your benches couldn't sustain 60 FPS at WQHD?!
JarredWalton - Thursday, November 20, 2014 - link
Ultra is 4xMSAA with PCSS. You had a 10 FPS drop just enabling 2xMSAA, and 4xMSAA would take another 10 or so FPS off, with PCSS accounting for an additional 10 (give or take).
Carfax - Thursday, November 20, 2014 - link
The main reason for the low performance is the use of MSAA. MSAA in this engine has a massive performance hit because the engine uses deferred rendering. Running the game on Ultra settings with FXAA instead of MSAA would easily net you over 10 FPS.
JarredWalton - Thursday, November 20, 2014 - link
Umm... MSAA in many games tends to exact a fairly decent performance hit, and the more complex the game, the bigger the hit. FXAA is basically a 3% hit (vs. no AA) by comparison, so yes, it would be much faster.
Carfax - Thursday, November 20, 2014 - link
Exactly, so posting benchmarks of the game running at 4x MSAA isn't exactly an accurate representation of the kind of performance you can get out of the game, and arguably isn't even worth the massive performance hit, as it just gives you a very slight IQ boost over FXAA. On my own machine I'm playing at 1440p maxed settings with FXAA, and I'm seeing 60 FPS on a regular basis with V-sync on. With V-sync off, I'm getting into the 70s.
This is on a Gigabyte G1 GTX 970 SLI rig with a 4930K @ 4.3GHz driving them.
JarredWalton - Thursday, November 20, 2014 - link
I'll bet you a dollar you're CPU limited at mid to high 70s when you're down on the streets. Anyway, I ran the Medium numbers as well at 1080p, which is basically FXAA with High textures and a few other items turned down a notch that don't really affect things that much. As to what's "an accurate representation of the kind of performance you can get", well, the numbers don't lie. If you want to run different settings, the numbers change, but there's a reason the developers don't just use FXAA as the default at all settings.
Carfax - Thursday, November 20, 2014 - link
I probably am CPU limited with V-sync off, but considering I'm above 60 FPS and how much is being rendered (the game is absolutely massive in scope and detail), I would say the engine is still fairly well optimized. When I'm playing the game, my CPU is usually around 50 to 60% loaded across all 12 threads with V-sync on. I haven't tested CPU usage with V-sync off, though.
The game definitely uses a hex-core processor, so that's probably why your frame rates are lower than mine.
mcmilhouse - Thursday, November 20, 2014 - link
I wonder, if Apple hadn't taken all the 20nm production this year and AMD/Nvidia had 20nm cards, whether we wouldn't have a $200-300 card that easily outputs 60fps at 1080p Ultra. We really should have been at 20nm this year.
Crazyeyeskillah - Thursday, November 20, 2014 - link
Why don't you turn off AA and show people what the game can actually run at? I don't know why this is a must-have when you can't get solid frame rates. If you ran all the same benches without any AA, I don't see why it would be so abysmal. AA is a luxury, not mandatory.
JarredWalton - Thursday, November 20, 2014 - link
You mean like the 1080p Medium graph? That uses FXAA, which is nearly "free" to enable.
Crazyeyeskillah - Friday, November 21, 2014 - link
No, more on some of the high-end numbers where AA starts to get redundant, especially at 4K. I loved Crysis when it came out and it slapped my 7900 GTX SLI around, because I knew it was the start of something great to come. This game does have some nice touches, especially in the quantity of NPCs on screen, the use of AI, and the level of detail for such an expansive city, but it is nowhere close to heralding a new look of what's to come in terms of textures and reach.
Most people are going to set it to the highest textures, turn off AA, and get their playable fps at whatever resolution their card supports, so I have to admit this is the first time I've really felt a little leery at the state of a game presented on AnandTech. I've been reading the site since it was launched, but this benchmark just didn't leave me with a sense of what performance is really going to be like across various setups.
FITCamaro - Thursday, November 20, 2014 - link
I don't understand how they can do a poorer job of porting the game to PC on AMD hardware than on Nvidia when the consoles are using AMD GPUs. Unless they built it for PC with Nvidia in mind and then did a crappy job of porting it to consoles. Of course, given the poor performance of the game on consoles, that isn't hard to believe.
Ubisoft is quickly becoming the new EA. I won't be buying this game this year. Probably in a year, when it's down to $20 and they've maybe patched it to a reasonable state. I say maybe because Watch Dogs has been out for months and is still pretty bad.
FlushedBubblyJock - Thursday, November 20, 2014 - link
The bleeding edge has to be pushed, lest there be no need for more.
The same thing was said about Crysis, and then it wound up being the most famous frame-rate-crushing game ever, and still is, until perhaps now.
So getting down on leading-edge games that present a challenge to GPU designers is not in our best interest.
Also, it's nice to see a "port" frustrate the highest-end elite desktops, and to see the whining not be about how cruddy ported games are for any sort of gaming, but in this case about how "slow my thousands of dollars are".
I'm very glad to see it crushing the best of the best. We need more of this at a faster rate; then hopefully we won't hear so much, and so often, that "the increase with the new core isn't worth it".
Now the GPU makers must overcome; a challenge is a good thing.
Horza - Thursday, November 20, 2014 - link
This would be a reasonable sentiment if the game were in fact "bleeding edge" graphically. Crysis was a landmark visually (and still looks impressive), and I feel very safe wagering that Unity will not be remembered in anywhere near the same way. Anyone can make a game that brings "elite" hardware to its knees; it's not an impressive feat on its own if it doesn't deliver the experience to justify it.
Daggard - Thursday, November 20, 2014 - link
*shrug* Runs fine on my PS4. I'd give it more of an 8.5 personally. Paris is the best playground yet for this series. Online features are still being ironed out, but the game is great :)
Jon Tseng - Thursday, November 20, 2014 - link
Jarred, I know you didn't test for it, but any thoughts on how system memory affects things? The minimum is 6GB and 8GB is recommended; I wonder what impact this has?
(I've just gone from 4GB to 6GB to run this game; wondering if I need to replace the other two sticks too, or whether the fact that the swapfile will be on an SSD is enough.)
WatcherCK - Thursday, November 20, 2014 - link
I was looking forward to TC: The Division, but given Ubisoft's recent track record and the inherent game factors (new engine, MMO, RPG aspects), I'm just not sure that it will be anything except a colossal balls-up.
Mondozai - Thursday, November 20, 2014 - link
I agree with many other commenters about the strangely sanguine tone of this article, breezing past the massive performance bottlenecks and urging people to upgrade their hardware instead of pointing the finger where it belongs - at Ubisoft - and attacking them for releasing what is essentially a botched game in terms of performance. You should be running 60+ at 1080p High settings with a 290/780. Instead you barely get to 45 frames with a 780.
The fact that even a 980(!) can't get over 60 fps at 1080p High means that the game needs to be canned, not the reader base's hardware. Do better, Jarred.
JarredWalton - Thursday, November 20, 2014 - link
That's certainly not what I'm doing. Just because the last sentence says, "And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season" doesn't mean I'm telling everyone to upgrade. What I am saying is that IF you want to run THIS game (and IF other games end up having similar requirements in the near future), then yes, a lot of people will need new hardware (or lower quality settings).
When Crysis came out, nothing -- NOTHING -- could run it properly at maximum quality settings. People skewered Crytek and said they were lousy programmers, etc. and "the game doesn't even look that good". And yet, I don't really think that was the case -- they just decided to enable settings that pushed beyond what was currently available.
Is Ubisoft intentionally doing that with their latest releases? Perhaps not in quite the same way (it is the holiday season after all), but the decision to drop support for older generation consoles in order to enable a higher quality experience certainly wasn't made to improve the sales of the game. Believe it or not, there are game developers that just really want to use the latest and greatest technologies, performance be damned.
Fundamentally, we're not a sensationalist website. We're not in the market of pointing fingers, casting blame, etc. All I can say is how the game works right now on the hardware I tested, and it's up to the readers to draw conclusions. Was the game pushed out early? Almost certainly. Should they design all games so that 1080p High gets 60+ FPS? I'm not one to dictate whether that's the best thing to do or not, and I do like seeing companies push the performance envelope on occasion.
It hurts when your high-end GPU can't run a game with settings you are accustomed to using, but I do have to say that their recreation of eighteenth century France is quite remarkable.
mcmilhouse - Friday, November 21, 2014 - link
^This. Plus, the Nvidia 900 series is still 28nm. We haven't had a 20nm card; Apple took all the TSMC production lines.
piroroadkill - Saturday, November 22, 2014 - link
Crysis absolutely blew everything else away, graphically.
That moment when you're done coming through the first forest and you hit the rays coming through the trees, and you look down over the cliffs.
I don't think many people said it was coded badly (although they probably did), but it was such an incredible step up visually that people really took notice.
Assassin's Creed Unity may also be a fantastic game visually, and I will get it at some point, but the fact is, console hardware is a measly set of Jaguar cores and low to midrange previous generation Radeons.
People are right to expect their massively more powerful machine could run the game at 60 FPS.
Milite777 - Thursday, November 20, 2014 - link
I've just got a laptop with a 2nd-gen i7, 8GB RAM and a 6770M (2GB VRAM). I know that this config is too poor for any serious gaming session, but I'd like to play ACU like I did the previous episodes... Could I get at least 25 fps at the lowest settings and a resolution of 1366x768? I don't need the best graphics, I just want to know the story... And of course I have to buy this game to try it... Need help, guys :)
chizow - Thursday, November 20, 2014 - link
Interesting findings with the texture setting, Jarred; it looks like Santa (Ryan?) sent you some early Christmas presents too with the GPU upgrades. I would also be interested to see a kind of "feature expense" comparison, where you go through some of the main settings to give an idea of what kind of performance hit you take when enabling them at different levels.
For example, I remember a time when setting textures to max was automatic, but now it seems that in this age of 2K and 4K textures in next-gen console ports, that's no longer possible, since those textures will fill VRAM in a heartbeat. Also, did you have any >4GB cards or high-bandwidth cards to test to see if they helped with the texture situation at all? Like a Titan Black?
But lately I have seen textures and MSAA creating a much bigger perf hit than in the past due to the amount of VRAM they take up. There was a time where VRAM didn't make as much of a difference as shading power and you could just crank up the textures and use MSAA without the crazy hits to perf we see today.
iceveiled - Thursday, November 20, 2014 - link
I opted to turn everything up to max on my 970 (but with soft shadows turned off) and with a 30 fps locked frame rate (1080p). It plays butter smooth, but man, if any game would benefit from 60 fps it's Assassin's Creed, with its wonky input lag (I play it with a controller) and even wonkier world traversal/parkour.
It takes a bit of getting used to, but at 30 fps it ain't all that bad, and it's a damn nice looking game with the settings maxed out.
D. Lister - Friday, November 21, 2014 - link
Ubisoft is just forming a pattern here of poorly optimised software. They have some of the best artists, but apparently some of the worst software developers. Also, I don't believe them for a second when they try to offload their incompetence onto a hardware manufacturer.
poohbear - Friday, November 21, 2014 - link
Let's be honest, this is a poorly optimized game with an enormous number of bugs, so ridiculously messed up that it made the BBC news and Ubisoft's shares dropped 11%! It's a complete debacle.
dwade123 - Friday, November 21, 2014 - link
Good thing I didn't buy a GTX 980 for $460. It can't run next-gen ports maxed out comfortably. Bring out the real next-gen GPUs!
maroon1 - Friday, November 21, 2014 - link
A Core i3-4130 with a GTX 750 Ti runs this game as well as the console version.
Eurogamer did a test matching the PC's graphics quality to the console version (running at 900p with settings similar to the PS4), and the result was that the GTX 750 Ti plays it as well, if not slightly better.
cmdrmonkey - Friday, November 21, 2014 - link
When a game is barely playable on the most high-end video cards on the market, at resolutions and settings PC gamers are accustomed to, you have utterly failed. Bravo, Ubisoft. Bravo.
P39Airacobra - Friday, November 21, 2014 - link
You can forget about Ubicrap fixing this! This is why Ubicrap gave the unreal PC requirements! They are getting money from GPU/CPU hardware makers to help market for them! And they don't care to spend more money on us scum customers anyway! So I say XXXXXXXXXXXX UBICRAP!!!!!
P39Airacobra - Friday, November 21, 2014 - link
They should be arrested for doing this!
mr. president - Sunday, November 23, 2014 - link
Any chance of testing CPU performance on AMD vs. Nvidia GPUs? I've seen a *ton* of recent games underperform on AMD GPUs due to what I think is their lack of support for deferred contexts, aka 'multithreaded rendering'. It's particularly low-end CPUs that are affected.
Unity pushes something like 50,000 draw calls each frame. Note the enormous disparity in minimum framerates between the two vendors at 1080p/Medium, where even slower Nvidia GPUs get higher minimums than faster AMD GPUs. I think it's worth exploring, as even low-end FX CPUs can almost double their performance on high-end Nvidia GPUs vs. high-end AMD GPUs.
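As a back-of-the-envelope illustration of why per-draw-call CPU overhead would matter at the draw-call count quoted above, here is a sketch; the per-call microsecond costs are invented placeholders, not measured driver figures:

```python
# Back-of-the-envelope: CPU-side frame cost from draw-call submission alone.
# The per-call overheads below are invented placeholders, not measured values.

def cpu_bound_fps(draw_calls, overhead_us_per_call):
    """Upper bound on FPS if the CPU spends overhead_us_per_call on every draw call."""
    frame_time_s = draw_calls * overhead_us_per_call * 1e-6
    return 1.0 / frame_time_s

calls = 50_000  # draw-call figure quoted in the comment above
for overhead_us in (0.2, 0.5, 1.0):  # hypothetical per-call driver cost in microseconds
    print(f"{overhead_us} us/call -> at most {cpu_bound_fps(calls, overhead_us):.0f} FPS")
# 0.2 us/call -> ~100 FPS, 0.5 -> ~40 FPS, 1.0 -> ~20 FPS: small differences in
# per-call overhead swing the CPU-side frame-rate ceiling dramatically.
```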
FlushedBubblyJock - Tuesday, November 25, 2014 - link
That last line you have tells me AMD is offloading boatloads of work to the CPU. Isn't that exactly what Mantle is for - low-end CPUs - because it relieves AMD's gigantic, overburdening, cheaty normal driver that hammers the puny AMD CPUs?
It's sad really - shortcuts and angles and scammy drivers that really only hurt everyone.
RafaelHerschel - Sunday, November 23, 2014 - link
A few observations:
60 frames per second isn't some arbitrary value. With Vsync enabled and a refresh rate of 60Hz, dips below 60 fps are far more unpleasant. Adaptive Vsync addresses that but isn't available to everybody. Disabling Vsync leads to screen tearing, which some people (me included) find extremely annoying.
In a game, every frame consists of discrete information. In a movie, each frame is slightly or at least partially blurred, a natural effect of capturing moving objects in a frame. For a game to feel fluid at 24 or 30 fps, it needs to add artificial blurring.
In movies each frame has the same length. In games the length of each frame varies, so even 60 fps can feel choppy (see the sketch after this comment).
Different people have different sensibilities. I always notice a low frame rate and frame drops. A steady 60 fps with Vsync enabled works best for me. Anything below 50 fps (in a game) feels off to me and above 60 I don’t notice that much difference. Likewise for gaming and movies I use screens with a fast response time since ghosting really distracts me.
I feel that with a decent system a 60 fps minimum should be attainable. What bugs me is that in some games lowering the quality settings has little impact on the minimum frame rate.
I'm always surprised by blanket statements like "30 fps is perfectly playable". Depending on the game, the settings, and the person playing, it's often not. For me, another factor is how close I am to the screen.
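As a small illustration of the frame-pacing point in the comment above, here is a sketch comparing two invented frame-time traces with the same average FPS but very different worst-case frames:

```python
# Frame-pacing illustration: identical average FPS, very different feel.
# Both traces are invented example data, not measurements.

def summarize(frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = max(frame_times_ms)
    return avg_fps, worst

steady = [16.7] * 12              # evenly paced ~60 fps
spiky  = [10.0] * 9 + [36.8] * 3  # same average, but with periodic long frames

for name, trace in (("steady", steady), ("spiky", spiky)):
    fps, worst = summarize(trace)
    print(f"{name}: ~{fps:.0f} fps average, worst frame {worst:.1f} ms")
# Both report ~60 fps on average, but the spiky trace stalls for ~37 ms at a time,
# which is what reads as "choppy" even though the average looks fine.
```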
JarredWalton - Monday, November 24, 2014 - link
FWIW, I've been playing games for about 35 years now (since I was 6, on a Magnavox Odyssey II), and when I say a game is "playable" at 40 FPS, what I'm saying is that someone with years of game playing behind them feels the game works fine at that frame rate. I've also played ACU for many hours at sub-60 FPS rates (without G-SYNC being enabled) and didn't mind the experience. Of course I wasn't the one saying it was "perfectly playable" above, but it is most definitely playable and IMO acceptable in performance. If you want *ideal*, which is completely different, then yes: 60+ FPS is what you want. But then there are those with LCDs running at 120Hz who would want even higher frame rates. YMMV.
RafaelHerschel - Monday, November 24, 2014 - link
I don't mind somebody saying: "this game is perfectly playable for me at 40 fps". I do mind it when people say that there is no perceivable difference between 40 fps and 60 fps (as stated in the comments), or when people say "the game runs smooth as butter" when it doesn't. The article was fair; some of the comments weren't.
For me a game is not enjoyable at anything below 50 fps, and I much prefer to have Vsync enabled.
I would say that most people accept 60 fps as a reasonable goal at medium settings (whatever those may be) with a high-end GPU. Depending on personal taste (graphics settings) and budget, people can then choose to sacrifice fps (or money) for MSAA, AO, and high-res textures.
I strongly believe that studios should aim for 60 fps at medium settings with a high-end card, and 60 fps at low settings with a mid-range card (both at 1080p).
With smart design choices and quality control that is certainly possible. As it stands, I’m disappointed with both Far Cry 4 and Unity.
HisDivineOrder - Monday, November 24, 2014 - link
1) Wonder if an i5 vs. i7 (Hyper-Threading) matters.
2) Wonder why you guys don't borrow a Titan Black and test it to see if the extra VRAM improves things. Surely a contact at Asus, Gigabyte, nVidia, etc. has a Titan Black with 6GB of RAM to lend you. Probably two for SLI. I'm curious to see if the game can use the VRAM, because I'm hearing reports of Ultra taking 4GB and gobbling it up.
3) The Ultra settings preset includes MSAA. That's the first setting I'd turn off if my frame rates were taking a dive. It gobbles up memory AND processing like nobody's business. What happens if you turn it off?
These seem like obvious questions to me. Until Batman: Arkham Knight, this looks to be The Benchmark game in terms of crushing your system. Assuming they ever finish patching it.
RafaelHerschel - Monday, November 24, 2014 - link
If the available VRAM makes a difference, then lowering texture quality and turning off all forms of AA will make a big difference.
Unfortunately, Ubisoft games don't scale well when lowering the settings.
Evenload - Wednesday, November 26, 2014 - link
VRAM clearly makes a very big difference in this game. To answer the question above, I maxed out the settings at 1080p on my GTX Titan (original) and just ran/jumped around Paris a bit while GPU-Z was set to data log. The file shows constantly high memory usage, maxing out at about 4.4GB. Interestingly, with stock settings the GPU was often being pushed to relatively high clock rates by GPU Boost, so it looks like the GPU was not being worked extremely hard.
Not a scientific test, but potentially bad news for people with 2GB and 3GB cards, as tweaking will not recover the difference. Interestingly, I noticed that the main system memory the game takes is not that large, and I wonder if the issues people are experiencing are possibly related to the way the game has been programmed and the unified memory model the consoles use. On the consoles, the distinction between "graphics" memory and "system" memory does not matter in the same way that it does in a gaming PC with a discrete graphics card.
joeh4384 - Tuesday, November 25, 2014 - link
Lol at needing freaking SLI 970s for 60+ fps at 1080p. Do you think patches can, in time, make this playable on high-end single-card setups like a 290X on Ultra?
Lerianis - Sunday, November 30, 2014 - link
Unity is a good game once you get past the glitchfest. No, it is not a revolution for the Assassin's Creed series, more an evolution of Assassin's Creed 4. It is one awesome game (I played it on a friend's console and another's PC) once you get past those issues.
The only thing I don't like about it is that it is VERY VERY hungry for graphics power, even at 1080p.
To the point where the latest 980M's from NVidia struggle to push more than 30fps at those settings on Ultra.
I'm wondering (considering I do not see much additional graphics prettiness) whether that is a sign that the game was not properly optimized for PCs and notebook PCs. If it is, that is something that Ubisoft (and other game makers) are going to have to take note of and fix.
Ramon Zarat - Sunday, November 30, 2014 - link
I'll only say this: Fuck Ubisoft, the new E.A.
IUU - Tuesday, December 2, 2014 - link
At last, a breath of fresh air. Instead of getting everyone excited about how well you can play Pac-Man at 10K, one company still serves as a reminder of the distance we have yet to cross.
Way to go, Ubisoft; and if you make a game that is hardly playable at 1280x720, I will make a donation to you and build a church for you. We have had enough of the mobile devolution touting meaningless resolutions (3 megapixels on a tablet, oh my god). You will serve as a reminder that high resolution is good, but you have to have some real content to show on it.
We need a new Crysis - and not just one, but several in succession.
wrayj - Tuesday, December 2, 2014 - link
I've seen videos where dropping the resolution to 1600x900 is really the way to claw back performance.
is4u2p - Tuesday, December 9, 2014 - link
I got way better results than this with my i5-3570K and R9 290.
P39Airacobra - Thursday, December 11, 2014 - link
QUOTE: "And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season."
WHAT? How about: the game is BS! You know good and well this game is just a piece of junk! It is not because of older cards or because the game is graphically advanced! This game is no more advanced than any other game! You should not be advising people to waste money on a new, expensive GPU just to play a game that has bloated PC requirements because Ubisoft suddenly decided to stop optimizing PC games correctly! Instead you should be pointing out how horrible Ubisoft's new games are! If AnandTech is now going to push marketing instead of pointing out the truth about horrible software, then it looks like AnandTech is no longer a trustworthy website for benchmarks or anything! BS is BS, no matter how many cherries you put on top of the BS! Any benchmark site benchmarking games like this is absolutely discredited!
kmkk - Thursday, February 5, 2015 - link
Agree 100%. Ubisoft are to blame here, not the GPUs.