56 Comments

  • luv2liv - Friday, November 4, 2011 - link

    They can't make it physically bigger than this?
    I'm disappointed.

    /s
  • phantom505 - Friday, November 4, 2011 - link

    That was so lazy... it looks like they took 3 case fans and tie-strapped them to the top. I think I could have made that look better, and I have no design experience whatsoever.
  • irishScott - Sunday, November 6, 2011 - link

    Well, it apparently works. That's good enough for me, but then again I don't have a side window.
  • Strunf - Monday, November 7, 2011 - link

    Side window and mirrors to see the fans... I don't understand why people even comment on aesthetics; it's not like they'll spend their time looking at the card.
  • phantom505 - Monday, November 7, 2011 - link

    If they were lazy here, where else were they lazy?
  • Sabresiberian - Tuesday, November 8, 2011 - link

    What is obviously lazy here is your lack of thinking and reading before you made your post.
  • Velotop - Saturday, November 5, 2011 - link

    I still have a GTX580 in shrink wrap for my new system build. Looks like it's a keeper.
  • pixelstuff - Saturday, November 5, 2011 - link

    Seems like they missed the mark on pricing. Shouldn't they have been able to price it at exactly 2x a GTX 560 Ti card, or $460? Theoretically they should be saving money on the PCB material, connectors, and packaging.

    Of course, we all know that they don't set these price brackets based on how much more the card costs to make than the next model down. They set prices based on the maximum they think they can get someone to pay. Oh well. It probably would have sold like hotcakes otherwise.
  • Kepe - Saturday, November 5, 2011 - link

    In addition to raw materials and manufacturing costs, you must also take into account the money poured into developing the card. This is a custom PCB and, as such, takes quite a bit of resources to develop. Also, this is a low-volume product that will not sell as many units as a regular 560 Ti does, so all those extra R&D costs must be distributed over a small number of units.
    R&D costs on reference designs such as the 560 Ti are pretty close to zero compared to something like the 560 Ti 2Win.
  • Samus - Saturday, November 5, 2011 - link

    I've been running a pair of EVGA GTX 460 768MB cards in SLI with the superclock BIOS for almost 2 years. Still faster than just about any single card you can buy, even now, at a cost of $300 total when I bought them.

    I'm the only one of my friends who didn't need to upgrade their video card for Battlefield 3. I've been completely sold on SLI since buying these cards, and believe me, I'd been avoiding SLI for years for the same reason most people do: compatibility.

    But keep in mind, nVidia has been pushing SLI hard for TEN YEARS with excellent drivers, frequent updates, and compatibility with a wide range of motherboards and GPU models.

    Micro-stutter is an ATI issue. It's not noticeable (and barely measurable) on nVidia cards.

    http://www.tomshardware.com/reviews/radeon-geforce...

    In reference to Ryan's conclusion, I'd say consider SLI for nVidia cards without hesitation. If you're in the ATI camp, get one of their beasts or run three-way CrossFire to eliminate micro-stutter.
  • Death666Angel - Saturday, November 5, 2011 - link

    "But keep in mind, nVidia has been pushing SLI hard for TEN YEARS "
    You mean 7 years?
  • Revdarian - Saturday, November 5, 2011 - link

    Microstutter is a thing with BOTH companies.

    The stutter (real stutter, nothing micro there) experienced in that review was caused by using CAPs on top of a driver that already had the CrossFire fix. If they had contacted AMD about it (bitched at them properly, actually; nothing wrong with bitching a bit in private to the company if you are a known reviewer), they would have gotten that answer.

    BTW, microstutter varies from person to person, but you will feel it more easily once the average fps of the dual-card solution slips below 60 fps.
  • Death666Angel - Saturday, November 5, 2011 - link

    Oh, and here are 2 tests that show CF to be no worse than SLI concerning micro-stutter:
    http://preview.tinyurl.com/6exquor
    http://preview.tinyurl.com/6kuutnl
    Got a spam notice and couldn't post with the normal links. I hope it works with tinyurl...
  • Fiah - Saturday, November 5, 2011 - link

    Micro-stuttering is very much an Nvidia problem as well; just look at the ominously green graphs here: http://techreport.com/articles.x/21516/5

    I'm not convinced that either camp will solve this problem anytime soon, as it is as much a game engine problem as a problem of the drivers/GPU.
  • marraco - Sunday, November 6, 2011 - link

    The article shows that Crossfire does worse than SLI.
  • Fiah - Sunday, November 6, 2011 - link

    Your point being?
  • Uritziel - Monday, November 7, 2011 - link

    Me too! Great performance for the price. The only thing they've not quite been able to tackle is Metro 2033 in 3D at highest settings and 1920x1080 res. Not a single issue with SLI either. Who knows when I'll need to upgrade.
  • Sabresiberian - Tuesday, November 8, 2011 - link

    If you read the article more thoroughly, you will see that it says results vary with application; microstuttering with Nvidia's SLI shows up more in other games and potentially more with different settings.

    Another thing I'll say is that microstuttering is one of those things that is terribly annoying to some people and just isn't noticed at all by others. General reading, though, shows me it's a problem recognized by both Nvidia and AMD. Personally, I say it shows up most in multi-card solutions, but isn't entirely exclusive to them.

    My experience has only been with Crossfire, and I found it very distracting.

    I particularly find it annoying that someone would go to the trouble and expense of setting up a multi-card system and end up with worse performance. (We can talk about "performance" in terms of frame rates alone if and only if the quality does not deteriorate; if the quality does, then performance is worse, not better, even if the frame rate improves.) This is an issue that needs to be addressed much more aggressively by both companies, and I will say it does not impress me that it hasn't been solved by either one.

    It makes me long for the days when Matrox was a player in gaming graphics.

    ;)
  • Sabresiberian - Tuesday, November 8, 2011 - link

    You have to realize that the card also includes the cost of the NF200 bridge chip, which allows non-SLI-capable mainboards to actually use this card.

    From the article:

    "While there were some issues with this on the GTX 460 2WIn, this has apparently been resolved (the presence of NF200 shouldsatisfy all SLI license requirements in the first place). EVGA has said that the 2Win will work on non-SLI mobos, making it fully compatible with every motherboard."

    In other words, if it's got a PCIe x16 slot in it, it will work in your mainboard. Most dual-GPU cards can't do that.

    ;)
  • keitaro - Saturday, November 5, 2011 - link

    What's missing are performance numbers on Surround and Eyefinity resolutions. EVGA is also touting Surround 3D capability on this card and it is something to at least consider. I've seen so many single-monitor scores and these days they bore me. Get us some Surround/Eyefinity benchmark numbers so we can see how they fare when pressed with higher pixel count to render.
  • Leyawiin - Saturday, November 5, 2011 - link

    It's like putting small skinny tires on a Corvette.
  • Leyawiin - Saturday, November 5, 2011 - link

    Such a dork..."wouldn't". There goes my funny analogy down the toilet. :(
  • ypsylon - Saturday, November 5, 2011 - link

    I'll stick with my lovely MSI TF3 580. So what if it's slower than 2 560s? I don't need that kind of power anyway. And just look at this EVGA monstrosity. I wouldn't take that for free; not even if someone paid me good money to take it. Absolute disgrace.

    EVGA should do the engineering part and then at least slap an Arctic Cooling thingy on it. That's what companies without imagination do. As for MSI or Asus: you pay a good premium on those, but my god, those cards are marvels of modern engineering. Not some slap-dash job a la EVGA.
  • RussianSensation - Saturday, November 5, 2011 - link

    MSI TF3 580 is a marvel of engineering? How so?

    At $499, it's not fast enough on its own for 2560x1600 gaming in modern games. At 1920x1200 or below, it's not providing more playability than a $300 GTX 570. And the fact that you can get 2x HD 6950 2GB Sapphire Dirt 3 Edition cards that have a high chance of unlocking into a 6970 makes it a pretty poor value atm.
  • Fiah - Saturday, November 5, 2011 - link

    - that SLI scaling will always be strong, and that multi-GPU timing issues are easily overcome

    Those are rather strong assumptions. I'm particularly uncertain that the GPU guys will solve micro-stuttering. Micro-stuttering doesn't lose you any benchmarks and it's a complex problem, so I'm rather sceptical that any of the GPU makers will spend the necessary time and money to solve it, if indeed it's solvable at all.
  • Death666Angel - Saturday, November 5, 2011 - link

    I see SLI/CF as acceptable in the high-end region (HD 6950+/GTX 570+), where you can't increase performance with one card only. However, even then you need at least a 2560 resolution and other demanding settings to make really good use of it, because consoles are limiting visual performance today.

    Overall though, I'm not a fan of multi-GPU setups today, because of the driver issues and, most importantly, micro lag. I had an HD 3870 X2 and two 8800 GTS 512MB cards, and recently gamed on a friend's 2x GTX 470 setup, and the experience was never smooth enough to justify the cost and power consumption.

    Isn't PCIe 3.0 supposed to bring better synchronisation to multi-GPU setups? If that happens and if games become more demanding (next console generation?) I might think about SLI/CF again.

    But a good test nonetheless. :D
  • Death666Angel - Saturday, November 5, 2011 - link

    It is only below any SLI setup. It is still well above the 6950 CF/6970 CF setups, which are better and/or cheaper.
  • Cihao - Saturday, November 5, 2011 - link

    Would it be possible to put 2 EVGA GeForce GTX 560 Ti 2Wins in SLI, thus fitting 4 GPUs in a dual-SLI config?
  • mfenn - Saturday, November 5, 2011 - link

    The article specifically says that you can't do that.
  • Grandal - Sunday, November 6, 2011 - link

    Where? I've skimmed it 3 times and still missed it.
  • Sabresiberian - Wednesday, November 9, 2011 - link

    You might try actually reading instead of "skimming":

    2nd page of article, upper middle:

    "As for display connectivity, thanks to having 2 GPUs on board EVGA is able to drive up to 4 displays rather than the usual 2 for an NVIDIA card. EVGA has broken this up into 3 DL-DVI ports and a mini-HDMI port. This should efficiently cover triple monitor setups, but if you want a 4th monitor it will be limited to 1920 @ 60Hz. Meanwhile the SLI connector next to the PCI bracket is a bit of a red herring – 4-way SLI is not supported for the 2Win; given the hardware this is presumably an NVIDIA limitation as they have only ever supported 3 and 4-way SLI on their high-end GPUs."

    ;)
  • mfenn - Saturday, November 5, 2011 - link

    Get 6950 2GB CF; it's faster and uses less power.
  • Death666Angel - Saturday, November 5, 2011 - link

    I guess they need to have that recommendation to also take into account all the cheap PSUs out there. In theory, a good 500-600W PSU like a Seasonic X-Series, Be Quiet, Enermax, etc. will be enough to power this graphics card and any modern quad-core CPU with peripherals.
  • ol1bit - Saturday, November 5, 2011 - link

    I will say that dual-GPU cards are destined to fail sooner. My 7950 GX2 died at 5 years. I know, why was I still using it... Well, I have a wife and a 9-year-old. The wife got that one; my kid has a 9800 GT.

    The funny thing was, my wife does not stress it out, and it was in a good Antec case with good cooling.

    I paid $589 for that puppy, the most expensive video card I ever bought. I now run 460s in SLI: quiet, and it runs well.

    So I am not a fan of dual-GPU cards; get a couple of 560s instead. I'll bet they last longer, if that matters to you.
  • Hauk - Saturday, November 5, 2011 - link

    I have two 580s, but I sure take notice of how good 6950 CF looks on those charts. That's some good performance for the $$. Kudos to EVGA for keeping things interesting though.
  • trengoloid - Sunday, November 6, 2011 - link

    I have 2 GTX 560 Tis in SLI, and my motherboard is a Z68 ASRock Extreme7 Gen3 that supports tri-SLI and quad-SLI. So just a question: can I buy a GTX 560 Ti 2Win and use it to make quad-SLI, or should I return one of my GTX 560 Tis and combine the other with the 2Win to make tri-SLI? Is that going to work?
  • Ryan Smith - Sunday, November 6, 2011 - link

    Sadly no. The 2Win cannot be SLI'd with any other cards. NVIDIA only supports up to 2 GTX 560s in SLI.
  • marraco - Sunday, November 6, 2011 - link

    -Short on RAM.
    -More expensive than a pair of 560 Tis. It would only make sense on single-PCIe-slot motherboards, but no single-slot motherboard is SLI certified, so that makes the card useless.
    -The article needs analysis like this:
    http://techreport.com/articles.x/21516
  • Sabresiberian - Wednesday, November 9, 2011 - link

    Read the article more thoroughly. This card has the NF200 chip built in, and WILL work on mainboards that aren't "SLI certified". If you have a PCIe x16 slot, it will work.

    ;)
  • Grandal - Sunday, November 6, 2011 - link

    Skimmed some of the article; did I miss the discussion about this?
  • DarkUltra - Sunday, November 6, 2011 - link

    - "On the other hand if you don’t share EVGA’s confidence in SLI, then very little has changed. If you believe that new games will have teething issues with SLI, that microstutter will continue to exist, and that not every game will scale well with SLI, then the 2Win is a poor choice in light of the more consistent performance of the GTX 580"

    What? Why not check frametimes in the games you benchmark? It is easy: just enable the option in FRAPS and import the data into a spreadsheet. If a high framerate comes in lumps, there is no perceived improvement.

    A focus on this will make SLI and Crossfire even better, please do that in your next review :)

    For now readers can check
    http://techreport.com/articles.x/21516
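    For example, here's a minimal sketch of that spreadsheet step in Python. It assumes FRAPS' two-column frametimes log (frame number, cumulative milliseconds); the filename and the 1.5x spike threshold are just illustrative choices:

    ```python
    # Minimal sketch: summarize FRAPS frametime data to spot microstutter.
    # Assumes a CSV with a header row and two columns: frame number and
    # cumulative milliseconds since capture start.
    import csv

    with open("frametimes.csv", newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        times = [float(row[1]) for row in reader]

    # Frame-to-frame intervals: microstutter shows up as uneven intervals
    # even when the average framerate looks healthy.
    deltas = [b - a for a, b in zip(times, times[1:])]

    avg_ms = sum(deltas) / len(deltas)
    spikes = sum(1 for d in deltas if d > 1.5 * avg_ms)

    print(f"Average frametime: {avg_ms:.1f} ms ({1000 / avg_ms:.0f} fps)")
    print(f"Worst frametime:   {max(deltas):.1f} ms")
    print(f"Frames taking >1.5x the average: {spikes / len(deltas):.1%}")
    ```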
  • Ryan Smith - Sunday, November 6, 2011 - link

    At this point it's not really a concern about our existing games. All of them are well developed on the driver side. The concern is with the future: will Batman microstutter? Will Serious Sam have SLI support with good scaling the day it launches? These are questions that can't be answered in the present, which is why it's largely a question of faith in NVIDIA's capabilities.
  • s44 - Monday, November 7, 2011 - link

    Is it possible to include a subjective microstutter report with these reviews?
  • dj christian - Thursday, November 10, 2011 - link

    Even though I am a long-time reader of Anand, going on ten years now, I really had to register to make this post.

    Actually, Ryan, I think you missed his point. Look at HardOCP:

    http://www.hardocp.com/article/2011/11/08/asus_rog...

    With diagrams you see exactly where the drops and frame spikes occur over a period of time. It's so much clearer than showing simple bar charts, which actually say nothing, especially at the highest resolutions, which can be a bit tricky to get a grip on.

    For the next graphics card review, I really hope you post diagrams along with the simple benchmarks; it would give the reader a much better overview of such things.
  • romany8806 - Monday, November 7, 2011 - link

    I know the first couple of pages of the article mention the redundancy of the SLI nub and the fact that this card needs an SLI-certified mobo, but I think these points should really rate a mention in the conclusion. VRAM issues aside, for me these two failings almost completely invalidate this card as a useful option.

    I thought the point of dual-GPU cards was to allow:
    1. quad-SLI/quad-Xfire OR
    2. dual-GPU performance on less accommodating/feature-rich motherboards.

    The only viable application I can think of for this card (and I'm clutching at straws here) is on SLI-certified mATX boards in small enclosures where it is undesirable to use both slots for airflow reasons. Not sure I'd pay a 30% premium over 2 560 Ti boards for that.

    Good article, but readers who skim to the final page might be missing out on pertinent information.
  • Black1969ta - Monday, November 7, 2011 - link

    It requires an SLI mobo, but can't be SLI'd to another 560 Ti 2Win.
    It pulls more power than 560 Tis in SLI.

    Excuses or not, this review is pretty worthless without comparing two 560 Ti boards in SLI to a dual-GPU 560 Ti board.

    Add in the SLI mobo certification requirement, the inability to add another 2Win to get quad GPUs with only two PCIe slots, and the $50+ premium over 2 cards, and I fail to see the logic of this pricing and/or design.
    Without test results, I find it hard to believe that I couldn't take two 560 Ti cards, SLI them with a modest OC, and get the same or better results.
  • Black1969ta - Monday, November 7, 2011 - link

    OK, I stand corrected: on the game graphs I see the 560 Ti SLI tested. But the 2Win isn't $50 better, which is what I suspected.

    FAIL EVGA!!!!!
  • Gonemad - Monday, November 7, 2011 - link

    I've always seen these dual-gpu cards as hit or miss, or more like an enthusiast part, really.

    If you just ignore the heat, the noise, and the power consumption, then they become appealing, or even compete with 2 SLI/CF cards.

    I still like powerful single-slot, single-GPU cards better. No SLI issues, no CF issues; it just works (or not). Should you ever need to upgrade, yay, you have a 2nd slot waiting; you hope you can still find your GPU for sale in a compatible form, and then you go back into the SLI saddle.

    These cards compromise one thing or another. Either you risk your PSU, or the builder has to tone down each GPU.

    I would still stick to a single 580, with the prospect of buying a 2nd one in a year or 2 at a discount. I don't know, really.
  • s44 - Monday, November 7, 2011 - link

    Given that this is the most significant, if not the only significant, PC-hardware-relevant release in years (it pushes hardware while being critically popular *and* selling well), I hope you add it to the bench suite sooner rather than later.
  • Ryan Smith - Monday, November 7, 2011 - link

    The whole suite is getting redone for our SNB-E testbed.
  • Tchamber - Monday, November 7, 2011 - link

    I never understood why they advertise 2GB, but the review says 1GB of usable memory. Can someone explain that to me?
  • Ryan Smith - Monday, November 7, 2011 - link

    With current multi-GPU technology (SLI and Crossfire), each GPU has its own copy of the working set of data and assets. Furthermore they can only work from their local VRAM, and they cannot (practically) work from the other GPU's VRAM.

    As a result you must duplicate the working set and assets across each GPU, which means you get half the effective VRAM.

    At the end of the day boardmakers can advertise having 2GB because it does in fact physically have 2GB, but logically it only has 1GB to work with.
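    A toy illustration of that mirroring (made-up numbers, not real driver behavior):

    ```python
    # Toy sketch: why 2x1GB behaves like 1GB under SLI/CrossFire.
    # Each GPU holds its own full copy of the working set, so physical
    # VRAM doubles across GPUs but effective VRAM does not.
    VRAM_PER_GPU_MB = 1024
    NUM_GPUS = 2

    # Hypothetical working set a game might allocate (MB):
    working_set = {"textures": 600, "geometry": 150, "render_targets": 200}
    per_gpu_mb = sum(working_set.values())  # mirrored onto every GPU

    advertised_mb = VRAM_PER_GPU_MB * NUM_GPUS  # what the box says: 2048
    effective_mb = VRAM_PER_GPU_MB              # what games can use: 1024

    print(f"Advertised: {advertised_mb} MB, effective: {effective_mb} MB")
    print(f"Each GPU mirrors the same {per_gpu_mb} MB working set")
    ```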
  • Tchamber - Friday, November 11, 2011 - link

    Thanks for clearing that up, Ryan. So in any multi-GPU setup, the VRAM does not increase unless the card itself has more of its own. Interesting limitation; so only compute power and bandwidth increase?
  • AnnonymousCoward - Monday, November 7, 2011 - link

    It seems like it might make sense for nvidia to base their architecture on having 4, 8, or 16 GPU dies on every board. This would improve yield across the entire low/medium/high end due to smaller die sizes, and it would give a huge boost to the high end (assuming power limits are figured out). In today's age of supercomputers having 4000 chips per rack, it does not seem optimal to put just 1 or 2 per video card PCB.
  • Sabresiberian - Tuesday, November 8, 2011 - link

    Great job, as usual; I have to agree with the conclusions made under "Final Thoughts". The only reason I'd go this route is if I needed the connectivity in terms of monitors and only had a single PCIe x16 slot on my mainboard. That being said, a 30% performance increase in your particular favorite game is nothing to ignore.

    One of the things I've been hoping is that EVGA would send Anandtech or Tomshardware (preferably both) a Classified card so one of these sites could run thorough overclocking tests on it. I highly doubt that the Classified could make up the 30% difference on air, but how much better than stock it can reach will be good to know before I buy.

    (It would also be interesting to know when AMD is going to release their next-gen GPU and whether or not it's going to be worth waiting a month or so for, but their recent CPU release puts them in the "I'm not holding my breath" category.)

    ;)
  • Wakanabi - Monday, February 6, 2012 - link

    I went with this card, and I'll tell you why:

    TEMPERATURE!!!

    I had two 560 Tis when I first built my PC, and with the SLI bridge and the cards close together on the board, one card would be up at 65 to 70 Celsius under full load, and the other would be at 85 to 92!

    Anytime you have multiple graphics cards, the fans on the top card are pulling hot air directly off the other card's PCB backside.

    I had sold one of my 560s a while back for full price ($250), so instead of buying another one now, I sold the other for $200 and bought the 2Win. Now I only get up to 78 Celsius total. And once I change my case next week to a higher-airflow case, it will be even better.

    This is the best card I've ever had: better than two 6870s, a single GTX 580, or even the 6990 I was using for mining. I have mine overclocked to 900MHz and get another 10-12% increase in performance. Unbeatable as far as single cards go, especially considering the 6990 is $700 and the 590 is around that too.
