123 Comments

  • Cali3350 - Thursday, March 24, 2011 - link

    On the last page you have a paragraph that says "Quickly, let's also..." and then just stops.
  • tipoo - Thursday, March 24, 2011 - link

    Also "Unlike AMD isn’t using an exotic phase change thermal compound here" on the meet the card page
  • tipoo - Thursday, March 24, 2011 - link

    Another one "This doesn’t the game in any meaningful manner, but it’s an example of how SLI/CF aren’t always the right tool for the job." on the computation page.
  • ahar - Thursday, March 24, 2011 - link

    Page 2
    "...NVIDIA’s advice about SLI mirror’s AMD’s advice..."

    mirrors
  • beepboy - Thursday, March 24, 2011 - link

    "Quickly, let's also"

    Nice review.
  • slickr - Thursday, March 24, 2011 - link

    For $700 I'd rather buy a whole new PC.

    What's the point of playing games at resolutions larger than 1600x1050?

    In fact I'd say 720p is probably the best resolution to play games at, because it tends to be easier to follow: pixels kind of move faster and you get more precision and a smoother gameplay experience.

    I'd be keeping my AMD 6870, that's for sure!
  • HangFire - Thursday, March 24, 2011 - link

    I've once heard that the secret to happiness is learning to like the taste of cheap beer.
  • nyran125 - Sunday, June 19, 2011 - link

    Did you know that's actually true, lol. If you can drink your coffee black, then when the milk runs out you still get to enjoy life......
  • cjl - Thursday, March 24, 2011 - link

    That depends entirely on your GPU. Several can push high resolutions at >60fps, and it's just as smooth. Gaming at 2560x1600 is just an awesome experience.
  • Azethoth - Sunday, March 27, 2011 - link

    Exactly, some of us have panels with native 2560x1600. I _could_ game at some miserable 1600x1050 resolution, or I could play at my native resolution. I choose 2560x1600 and ignore all review results at inferior resolutions. Damn you Crysis, damn you!
  • Ruger22C - Thursday, March 24, 2011 - link

    Don't spew nonsense to the people reading this! Write a disclaimer if you're going to do that.
  • The Finale of Seem - Saturday, March 26, 2011 - link

    Um...no. For one, HUD elements tend to shrink in physical size as resolution rises, meaning that games with a lot of HUD (WoW comes to mind) benefit by letting you see more of what's going on, which means that 720p is pretty friggin' awful. For two, 1920x1080 has become the standard for most monitors over 21" or so, and a lot of gamers get 1920x1080 displays, especially if they're also watching 1080p video or doing significant multitasking. Non-native resolutions look like ass, and as such, 1600x1050 is right out as you won't want to play at anything but 1920x1080.

    Now, you can say that there isn't much point going above that, and right now, that may be so as cost is pretty prohibitive, but that may not always be the case.
  • rav55 - Thursday, March 31, 2011 - link

    What good is it if you can't buy it? Nvidia cherry-picked the GPUs to work on this card and could only release a little over 1000 units. It is now sold out in the US and available in limited amounts in Europe.

    Basically the GTX 590 is vapourware!!! What a joke!
  • wellortech - Thursday, March 24, 2011 - link

    Reviews still seem to agree that 6950 CF or 570 SLI are just as powerful, and much less expensive. Guess I'll be keeping my pair of 6950s while continuing to enjoy 30" 2560x1600 heaven.
  • DanNeely - Thursday, March 24, 2011 - link

    Yeah, these only really make sense if you're going for a 4-GPU setup in an ATX box, or have a larger mATX case and want to run 2 GPUs and some other card.
  • jfelano - Thursday, March 24, 2011 - link

    You go boy. I'll continue to have a life.
  • The_Comfy_Chair - Thursday, March 24, 2011 - link

    Get over yourself.

    YOU are trolling on a forum about a video card on a tech-geek site on the internet. You have no more of a life than wellortech or anyone else here - self included.
  • ShumOSU - Thursday, March 24, 2011 - link

    You're 16,000 pixels short. :-)
  • egandt - Thursday, March 24, 2011 - link

    Would have been better to see what these cards did with 3x 1920x1200 displays, as obviously they are overkill for any single display.
  • Dudler - Thursday, March 24, 2011 - link

    Couldn't agree more, but since we know from the 1.5GB 580 that the nVidia cards do poorly at higher resolutions, AnandTech will probably never test any such setup. Expect 12x10 instead, as nVidia tends to do better at low resolutions than AMD. 19x12 is already irrelevant with these cards.
  • Ryan Smith - Thursday, March 24, 2011 - link

    One way or another we will be including multi-monitor stuff. The problem right now is getting ahold of a set of matching monitors, which will take some time to resolve.
  • fausto412 - Thursday, March 24, 2011 - link

    It would also be nice to test 1680x1050 on at least a couple of demanding games, to illustrate to people who have 22" screens that these cards are a waste of money at their resolution.
  • bigboxes - Thursday, March 24, 2011 - link

    It has been a waste for that low resolution since two generations ago. But you knew that. Troll...
  • tynopik - Thursday, March 24, 2011 - link

    matching monitors might matter for image quality or something, but for straight benchmarking, who cares?

    surely you have 3 monitors capable of 1920x1080

    it's not like the card cares if one is 20" and another is 24"
  • 7Enigma - Thursday, March 24, 2011 - link

    I don't understand this either. There is no need for anything fancy, heck you don't even need to have them actually outputting anything, just fool the drivers into THINKING they are driving multiple monitors!
  • DanNeely - Thursday, March 24, 2011 - link

    I don't entirely agree. While it doesn't matter much for simple average FPS benches like Anandtech is currently doing, they fall well short of the maximum playable settings testing done by sites like HardOCP.
  • strikeback03 - Thursday, March 24, 2011 - link

    Remember, the AT editors are spread all over. So while between them they certainly have at least 3 1920x1080/1200 monitors, Ryan (doing the testing) probably doesn't.

    Plus with different monitors wouldn't response times possibly be different? I'd imagine that would be odd in gaming.
  • tynopik - Thursday, March 24, 2011 - link

    > Remember, the AT editors are spread all over. So while between them they certainly have at least 3 1920x1080/1200 monitors, Ryan (doing the testing) probably doesn't.

    This has been a need for a while, and it's not like this review was completely unexpected, so not sure why they don't have a multi-monitor setup yet

    > Plus with different monitors wouldn't response times possibly be different? I'd imagine that would be odd in gaming.

    Well that's sort of the point, they wouldn't actually be gaming, so who cares?
  • Martin Schou - Thursday, March 24, 2011 - link

    I would have thought that the marketing departments of companies like Asus, BenQ, Dell, Eizo, Fujitsu, HP, LaCie, LG, NEC, Philips, Samsung and ViewSonic would cream their pants at what is really very cheap PR.

    Supply sets of 3 or 5 1920x1080/1920x1200 displays and 3 or 5 2560x1440/2560x1600 displays in exchange for at least a full year's advertisement on a prominent tech website.

    If we use Dell as an example, they could supply a set of five U2211H and three U3011 monitors for a total cost of less than 5,900 USD per set. The 5,900 USD is what us regular people would have to pay, but in a marketing campaign it's really just a blip on the radar.

    Now, excuse me while I go dream of a setup that could pull games at 9,600x1080/5,400x1920 or 7,680x1600/4,800x2560 :D
  • Ryan Smith - Friday, March 25, 2011 - link

    I'd just like to note that advertising is handled separately from editorial content. The two are completely compartmentalized so that ad buyers can't influence editorial control. Conversely as an editor I can't sell ad space.
  • Abot13 - Saturday, March 26, 2011 - link

    Well, there is advertising and advertising. Asking multiple monitor manufacturers to supply sets of screens in order to test them for multi-monitor setups, where the best sets will be used for a longer test period to see how they hold up over time, doesn't interfere with your journalistic integrity, but for the manufacturers the cost can be written off as PR/advertising costs.
    In the end your readers will get the information they want, your integrity will be intact, and we will also find out what the best monitors are for multi-monitor setups.
    Shouldn't be that hard now, should it?
  • PrinceGaz - Friday, March 25, 2011 - link

    Once you are dealing with framerates above about 80fps at the highest resolutions and quality settings, multi-monitor is one option, or it is time to start cranking up the anti-aliasing, as the results are otherwise irrelevant.

    With one card doing 100fps and another managing 140fps in a game, the best card for that game is simply whichever is cheapest, even if only slightly, unless you can do something to bring the framerates down in a useful way, and higher AA than 4x is the way to go.
  • Ryan Smith - Friday, March 25, 2011 - link

    As a general principle I agree, however 8x AA really doesn't do anything for IQ. I'd rather turn on SSAA/TrAA/MLAA, however AMD and NVIDIA do not have a common AA mode beyond DX9 SSAA. So for DX10/11 titles, MSAA is still the lowest common denominator.
  • DarknRahl - Thursday, March 24, 2011 - link

    I agree. AnandTech always seems to have an nVidia bias in their video card reviews.

    It sticks in the craw, as I dig all their other reviews for other products, but for video cards I go elsewhere.
  • B3an - Thursday, March 24, 2011 - link

    No they don't. It's just that for some reason AMD attracts more morons who like to moan about these non-existent things. And I own AMD cards, before you say anything else stupid.
  • softdrinkviking - Sunday, March 27, 2011 - link

    100% agree. I own an AMD card too, and I felt like AnandTech was extremely positive about the 5800 series when I bought that card.

    It also looks like they are leaning towards the 6990 over the 590 in this very article. To paraphrase: the 6990 is going to be better for current new games and future games, but the 590 seems to do a bit better with older titles and new games played at lower resolutions.

    I don't understand how anyone could think that makes them nVidia biased. It boggles the mind.
  • compvter - Thursday, March 24, 2011 - link

    The Finnish site Muropaketti did some 3x display tests with these new cards:

    http://plaza.fi/muropaketti/artikkelit/naytonohjai...

    And to your question I would like to answer with another question: why buy a faster CPU if you don't need one at the moment? Granted, there are no limitations like DX compatibility there, but you can still use your graphics card for quite a while. I'm still using a 3870X2 card (3 years), mainly because there are so few DX11 games.
  • VoraciousGorak - Thursday, March 24, 2011 - link

    For the "hype" they were throwing out with that video preview, I thought they were going to pull a fast one and launch a GeForce 6-series. But a teaser video for a super-enthusiast GPU using existing tech that'll be so rare I'll probably never physically look at one in my lifetime?

    Color me disappointed.
  • fausto412 - Thursday, March 24, 2011 - link

    yeah.

    Those $700 cards are only to be bought by a rich few.

    Improve the $400 price range...that's as high as I'll go these days.
  • Red Storm - Thursday, March 24, 2011 - link

    ... are you guys not testing these cards at multi-monitor resolutions? You said yourself in the review that these are marketed towards the high end. Well, the high end isn't just 30" monitors. Multi monitors boast a higher resolution and I think it's important to know how well these dual GPU monsters perform at the level they are marketed towards.
  • tipoo - Thursday, March 24, 2011 - link

    So was the WU count close to exactly double the single chip score of 360?
  • tipoo - Thursday, March 24, 2011 - link

    When using both chips with two WU's, I mean.
  • alent1234 - Thursday, March 24, 2011 - link

    I can buy an Xbox and, with the price of a lot of good older games, get a few years' worth of gaming for that.
  • MrBungle123 - Thursday, March 24, 2011 - link

    That's like someone watching NASCAR, seeing the price of a car, and saying they could buy a Honda Civic and a decade's worth of gas for the same money.
  • alent1234 - Thursday, March 24, 2011 - link

    I got sick of buying the latest video card when they hit $399 years ago. Around 60fps you really don't notice any difference in speed, so getting 100fps or some other number doesn't do it for me anymore.
  • tipoo - Thursday, March 24, 2011 - link

    TBH, in the land of console ports, very few games (on a single monitor) justify a card above $200.
  • Targon - Thursday, March 24, 2011 - link

    That just goes to show that you play the wrong games then. The new top-of-the-line games really can push the $400 cards fairly well at 1920x1080 and full details. With DirectX 11 support, these new games really push the limit. Then you have things like Eyefinity, driving 5760x1080, and you want more than a $200 card.
  • cmdrdredd - Thursday, March 24, 2011 - link

    Not really... This card isn't a one-off race car. It's a production part, limited maybe, but you can buy it at retail. A stock car is not stock...
  • Azethoth - Sunday, March 27, 2011 - link

    What!? Next you are gonna claim wrastling isn't real.
  • medi01 - Thursday, March 24, 2011 - link

    Puzzled by the cryptic color scheme on the graphs?

    Could you stick to red + shades of red for AMD and green + shades of green for nVidia (ok, blue for not so relevant cards)?

    Or at least color the labels of the cards accordingly?
  • 7Enigma - Thursday, March 24, 2011 - link

    Yeah I don't get it either. The last review of the 6990 was fantastic with how it was color-coded. Now you have a DIRECT competitor in both price and performance and the 6990 is never highlighted in the charts!?!? It made it a real PITA to always go hunting for the 6990 bars when they should have been labeled from the get-go.

    Hopefully this can be remedied easily...
  • rabidsquirrel - Thursday, March 24, 2011 - link

    Red/Green is horrible for those of us in this world that are colorblind. Blue/Green is equally horrible. I'd like to see textures used instead.
  • JarredWalton - Thursday, March 24, 2011 - link

    I've changed Ryan's colors. Green is for the stock and EVGA clocks, yellow is for OC 590, red for 6990, and orange for 6990 OC. For the color blind, I apologize but using green/red for the primary cards is pretty standard.
  • rabidsquirrel - Thursday, March 24, 2011 - link

    Common colorblindness is between Green/Red and Green/Blue. Red/Blue works great!
  • Dudler - Thursday, March 24, 2011 - link

    "After talking to several other reviewers, this does not seem to be an isolated case, and many of them have killed their cards with similar testing, which is far from being an extreme test."

    "I most strongly advise anyone to stay away from overclocking this product and use extremely conservative settings, maybe up to 650 MHz and no voltage adjustments."

    Looks like it is no OC champ.

    For high-res action, [H] got their review up. The 6990 even beats the 590 at Civ V. The conclusion is damning too:

    "We truly thought the GTX 590 was going to make the Radeon 6990 look bad, but the fact of the matter is that NVIDIA made the 6990 look that much better. The GTX 590 is not the "World's Fastest Single Card Solution" as stated on our page 1 slides; the Radeon HD 6990 is very much retaining that title. Hail to the King, baby! "

    2x 6950 on water looks to be my next buy. Must buy wider desk though :-)
  • Dudler - Thursday, March 24, 2011 - link

    http://www.youtube.com/watch?v=sRo-1VFMcbc&fea...

    SweClockers also fried their card. They blame nVidia driver quality.
  • Nfarce - Thursday, March 24, 2011 - link

    Fanboy much? Anyone who is an AMD/ATi fanboy shouldn't be crowing about nVidia driver quality. That's one reason I left AMD and went full nVidia-only GPUs after a 4870 purchase. nVidia cards also play a lot nicer on Microsoft FSX, but not everyone cares about that just like not everyone cares about Civ5.
  • Dudler - Thursday, March 24, 2011 - link

    ?? Fanboy? nVidia just released their highest-end ($700) card with a driver that fries it if you overvolt. Me commenting on that, how is that fanboyism?

    It is nothing short of a disaster. SweClockers fried 2 (two) GTX 590s. That's $1400 worth of hardware. TPU fried one.

    Sure, I've had my share of driver issues with ATi, but not any more than with my 8800GT. Couldn't care less whose name is printed on the sticker.
  • cmdrdredd - Thursday, March 24, 2011 - link

    "fries it if you overvolt."
    _______________

    YOU overvolted it...not the drivers doing it automatically. Remember the saying "your mileage may vary" or "proceed at your own risk"?
  • Dudler - Thursday, March 24, 2011 - link

    Yup. But who doesn't expect voltage tuning today? Read the back of the ASUS box: "50% faster with voltage tuning." Nice way to say: this card can't be OC'ed, it will catch fire. But of course, you are overvolting, so it is your own fault.

    But the only reason I posted it was to warn people who invest $700 in a card and have it fry on them. I'm not arguing about who should take the blame.

    Be careful: at least 6 cards are dead. Stay away from OC'ing it.
  • BreadFan - Thursday, March 24, 2011 - link

    Would this card be better for the P67 platform vs. GTX 580s in SLI, considering you won't get the full x16 going the SLI route?
  • Nfarce - Thursday, March 24, 2011 - link

    The 16x vs. 8x issue has been beaten to death for years. Long story short, it's not a measurable difference at or below 1920x1080 resolutions and only barely a difference above that.
  • BreadFan - Thursday, March 24, 2011 - link

    Thanks man. I already have one EVGA 580. The only reason I was considering it was for the step-up program EVGA offers (a 590 for around $200). I have till the first part of June to think about it, but I'm leaning towards adding another 580 once the price comes down in a year or two.
  • wellortech - Thursday, March 24, 2011 - link

    You won't get 16x going the CF route either.....although I agree that it doesn't really matter.
  • softdrinkviking - Sunday, March 27, 2011 - link

    I hope your screen name is from the Budgie song!
  • softdrinkviking - Sunday, March 27, 2011 - link

    That comment was to BreadFan.
  • 7Enigma - Thursday, March 24, 2011 - link

    Come on guys, I would have thought you could have at least had the 6990 and the 590 data points for Crysis 2. Perhaps a short video as well with the new game? :)
  • Ryan Smith - Thursday, March 24, 2011 - link

    It's unlikely we'll be using Crysis 2 in its current state, but that could always change.

    However if we were to use it, it won't be until the next benchmark refresh.
  • YouGotServed - Friday, March 25, 2011 - link

    Crysis: Warhead will always be the benchmark. Crysis 2 isn't nearly as demanding. It's been dumbed down for consoles, in case you haven't heard. There are no advanced settings available to you through the normal game menu. You have to tweak the CFG file to do so.

    I thought like you, originally. I was thinking: Crysis 2 is gonna set a new bar for performance. But in reality, it's not even close to the original in terms of detail level.
  • mmsmsy - Thursday, March 24, 2011 - link

    I know I can be annoying, but I checked it myself and the built-in benchmark in Civ V really favours nVidia cards. In a real-world scenario the situation is almost upside down. I got a reply from one of the reviewers last time saying that it provides quite accurate scores, but I thought that just for the heck of it you'd try and see for yourself that it doesn't at all. I know it's just one game and that benchmarking is slow work, but in order to keep up the good work you're doing you should at least take the advice and confront it with reality, to stay objective and provide the most accurate scores that really mean something. I don't mean to undermine you, because I find your articles to be mostly accurate and you're doing a great job. Just use the advice to make this site even better. A lot of writing for a comment, but this time maybe you will see what I'm trying to do.
  • TalonP - Thursday, March 24, 2011 - link

    First paragraph:

    "It really doesn’t seem like it’s been all that long, but it’s been nearly a year and a half since NVIDIA has had a dual-GPU card on the market. The GeForce GTX 295 was launched in January of 2009, the first card based on the 55nm die shrink of the GT200 GPU."

    Well, shit. I thought Jan 2009 was TWO and a half years ago. I MUST GET BACK TO THE FUTURE!
  • strikeback03 - Thursday, March 24, 2011 - link

    It was on the market after its launch, so if it disappeared somewhere at the end of '09/beginning of '10, that would match "a year and a half" since it was on the market.
  • RedemptionAD - Thursday, March 24, 2011 - link

    Are there any reviews with such a setup out yet, or is it even supported? Maybe even a 3x or 4x setup? If it was a 4x 6990 setup or 590 setup could it rule the world?
  • cjl - Thursday, March 24, 2011 - link

    You can't go over 4 GPUs, so you can only SLI/CF two of the dual GPU cards.
  • Nfarce - Thursday, March 24, 2011 - link

    I'll be going with two 570's for the same price, thanks. And I can spread that pain at $350 per purchase over two months instead of one big $700 plunkdown.
  • buildingblock - Thursday, March 24, 2011 - link

    "....However the noise results are nothing short of remarkable – if NVIDIA can dissipate 350W+ of heat while at the same time making 5-7dB less noise, then it starts to become clear that AMD’s design has a serious weakness. The ultimate question is what did NVIDIA do right that AMD did not?...."

    I can't see anyone tolerating the noise level of the 6990. But the 590 is barely noisier than a 580. So an easy win for nVidia if you really need/can afford one of these monsters.
  • cactusdog - Thursday, March 24, 2011 - link

    Ya, even the 6970/6950 are hot cards. Very disappointing after a very cool and silent 5870. I think AMD had a problem with the chips and never intended for them to be so hot. Maybe they had to crank up the power to get them to run right? idk...........
  • Romulous - Thursday, March 24, 2011 - link

    This card might be good for those people out there who love to cram as many GPUs into one box as they can and run folding at home.
  • smigs22 - Thursday, March 24, 2011 - link

    Major bias with the OC listing in the charts... the OC version is not enough... but a 20+% OC is included versus the other standard configs... and the lousy flip-switch OC mode of the 6990... not the 940/1400+ that other sites have attained, which offers 6970 CF+ performance :s ... Why don't they show 5870/6950/6970 CF and 470/480/570/580 SLI etc. with appropriate 20%+ overclocks to put these cards in their place... especially with price vs. performance... The 2GB 6950s also have the ability to be flashed into 6970s... not bad CF for the price...

    The second fastest single card out there... but still a beast, and it's kept its idle power within reason... I think it's time for 28nm tech asap... as the carbon taxes on these bad boys will be horrendous... lol
  • BrightCandle - Thursday, March 24, 2011 - link

    When you do the 3x monitor review, can you please include last generation's top-end card (5970) for comparison. Eyefinity and co. is really where it's at with these monster graphics cards, and in my experience the 5970 just doesn't have the horsepower to play well at 5760x1200. I would really like to see how much difference these new cards and their increased RAM actually make.

    A 50% performance gain over last generation at 2560 is OK, but do they pull even further ahead at the higher resolution?
  • valenti - Thursday, March 24, 2011 - link

    Ryan, I commented last week on the 550 review. Just to echo that comment here: how are you getting the "nodes per day" numbers? Have you considered switching to a points per day metric? Very few people can explain what nodes per day are, and they aren't a very good measure for real world folding performance.

    (also, it seems like you should double the number for this review, since I'm guessing it was just ignoring the second GPU)
  • Ryan Smith - Thursday, March 24, 2011 - link

    Last year NVIDIA worked with the F@H group to provide a special version of the client for benchmark purposes. Nodes per day is how the client reports its results. Since points are arbitrary based on how the F@H group is scoring things, I can't really make a conversion.
  • poohbear - Thursday, March 24, 2011 - link

    Good to see that a $700 card finally has a decent cooler! Why would somebody spend $700 and then have to spend another $40 for an aftermarket cooler??? nVidia and AMD really need to just charge $750 and have an ultra quiet card; people in this price range aren't gonna squabble over an extra $50, for Pete's sake!!!! It makes no sense that they skimp on the cooler at this price range! This is the top of the line, where money isn't the issue!
  • Guspaz - Thursday, March 24, 2011 - link

    Let's get this straight, nVidia. Slapping two of your existing GPUs together does not make this a "next-generation card". Saying that you've been working on it for two years is also misleading; I doubt it took two years just to lay out the PCB to get two GPUs on a single board.

    SLI and Crossfire still feel like kludges. Take Crysis 2 for example. The game comes out, and I try to play it on my 295. It runs, but only on one GPU. So I go looking online; it turns out that there's an SLI profile update for the game, but only for the latest beta drivers. If you install those drivers *and* the profile update, you'll get the speed boost, but also various graphical corruption issues involving flickering of certain types of effects (that seem universal rather than isolated).

    After two goes at SLI (first dual 285s, next a 295), I've come to the conclusion that SLI is just not worth the headache. You'll end up dealing with constant compatibility issues.
  • strikeback03 - Thursday, March 24, 2011 - link

    And that is why people still buy the 6970/580, rather than having 2 cheaper cards in SLI like so many recommend.
  • JarredWalton - Thursday, March 24, 2011 - link

    For the record, I've had three goes at CrossFire (2 x 3870, 4870X2, and now 2 x 5850). I'm equally disappointed with day-of-release gaming results. But, if you stick to titles that are 2-3 months old, it's a lot better. (Yeah, spend $600 on GPUs just so you can wait two months after a game release before buying....)
  • Guspaz - Friday, March 25, 2011 - link

    I don't know about that, the original Crysis still has a lot of issues with SLI.
  • Nentor - Thursday, March 24, 2011 - link

    "For the GTX 590 launch, NVIDIA once again sampled partner cards rather than sampling reference cards directly to the press. Even with this, all of the cards launching today are more-or-less reference with a few cosmetic changes, so everything we’re describing here applies to all other GTX 590 cards unless otherwise noted.

    With that out of the way, the card we were sampled is the EVGA GeForce GTX 590 Classified, a premium GTX 590 offering from EVGA. The important difference from the reference GTX 590 is that GTX 590 Classified ships at slightly higher clocks—630/864 vs. 607/853.5—and comes with a premium package, which we will get into later. The GTX 590 Classified also commands a premium price of $729."

    Are we calling overclocked cards "more-or-less reference" cards now? That's a nice way to put it, I'll use it the next time I get stopped by a police officer. Sir, I was going more or less 100mph.

    Reference is ONE THING. It is the basis and does not waver. Anything that is not it is either overclocked or underclocked.
  • strikeback03 - Thursday, March 24, 2011 - link

    Bad example, as in the US at least your speedometer is only required to be accurate within 10%, meaning you can't get ticketed at less than 10% over the speed limit. This card is only overclocked by 4%. More importantly, they a) weren't sent a reference card, and b) included full tests at stock clocks. Would you rather they not review it since it isn't a reference card?
  • Nentor - Thursday, March 24, 2011 - link

    That is a good point actually, I didn't think of that.

    Maybe reject the card, yes, but that is not going to happen. Nvidia is just showing who is boss by sending a non-reference card. AT will have to swallow whatever Nvidia feeds them if they want to keep bringing the news.
  • Ryan Smith - Thursday, March 24, 2011 - link

    There are only 2 reasons this card isn't reference.

    1) Factory overclock, which we can and will nullify for testing

    2) The EVGA backplate. Admittedly I don't have the reference backplate, but the NV backplates shouldn't cause any of our results to differ - the difference is mostly cosmetic.

    For this reason, it's "more-or-less" reference. Technically it's not reference, but once we change the clocks it's quite identical in performance.
  • mariush - Friday, March 25, 2011 - link

    Can you explain how you can nullify the modified / improved cooling system of the EVGA compared to reference cards?

    A reference card with stock voltages/frequencies may still run worse than this EVGA when downclocked, for example because on the reference card the voltage regulators may heat up more and throttle the card more often.

    Otherwise... I'm not an ATI fan, but it's painfully obvious you're not focusing in this review on things that make this card look bad, like playing on 2x 1920x1080 monitors or something like that.
  • Ryan Smith - Friday, March 25, 2011 - link

    The cooling on the EVGA isn't any different. With such a mild overclock, they're basically just clocking it up a bit; the voltage and the cooling are no different from reference.
  • etamin - Thursday, March 24, 2011 - link

    Where do you get the MSRPs for all the cards on the first page? I'm never able to find any of them at those prices.
  • etamin - Thursday, March 24, 2011 - link

    btw I'm looking at the 6850, 6870, and 6970. They start at $170, $210, and $340 on newegg.
  • Ryan Smith - Thursday, March 24, 2011 - link

    I factor in mail-in rebates. If you don't, that's probably why you see prices differently.
  • Ramon Zarat - Thursday, March 24, 2011 - link

    Review title:

    ''NVIDIA’s GeForce GTX 590: Duking It Out For The Single Card King''

    Comments in conclusion:

    ''...and as a result there is no one card that can be crowned king.''

    LMAO... How to pretend to say something and actually mean the complete opposite!

    Now, performance, cost, and power ratio are STILL best on the 6990. And that's without even considering multi-screen gaming setups. At 1.5GB of VRAM, the 590 WILL come up short at 5760x1200.

    Ramon
  • Silent_Scone - Thursday, March 24, 2011 - link

    But nonetheless, I think I'll stick with my GTX 580 SLI, thanks ;)
  • krumme - Thursday, March 24, 2011 - link

    REALLY FAST AT 1080 ON HAWX

    NO NOISE

    OVERCLOCK YOURS NOW !!!
  • freedomsbeat212 - Thursday, March 24, 2011 - link

    I hate coming to anandtech sometimes because so many of the comments are from wannabe editors. It's annoying and takes away from the excellent content..

    Why not have an "email correction" button vs taking it out in the comment section? It's weird, I don't see this anywhere else - you guys must be a particularly anal group...

    To go back O/T, I miss the days of powerful sub-$250 graphics cards. There's a market for it but all the action's on the high-end. Remember when the affordable TNT2 would play every recent game at playable framerates?
  • RaistlinZ - Thursday, March 24, 2011 - link

    What about the 6950 2GB? It can be had for $245.00 after rebate and it's plenty powerful.
  • the_elvino - Thursday, March 24, 2011 - link

    Is it so hard to admit that the 6990's performance is better across the board? Multi-monitor setups were left out in order to make NVidia look good.

    Remember when the GTX 580 SLI review was published, AT didn't include a 5970 crossfire setup, because they sadly only had one 5970.

    Yes, the GTX 590 is less noisy, but then again you can underclock the 6990 to GTX 590 performance levels and it will be quieter too, not really an argument.

    The GTX 590 is slower (especially at super high resolutions) and draws more power than the 6990 at the same price, AMD wins! Simple!
  • softdrinkviking - Thursday, March 24, 2011 - link

    If the 590 can only drive 2 displays, is the reason it has 3 DVI ports only for people who buy 2 cards, so that you can run all three off of one card?
  • Ryan Smith - Friday, March 25, 2011 - link

    The individual GPUs can only drive 2 monitors each. NVIDIA is using the display capabilities of both GPUs together in order to drive 4 monitors.
  • softdrinkviking - Friday, March 25, 2011 - link

    ah, got it. i should read more carefully.
    thanks for answering. :)
  • The Jedi - Friday, March 25, 2011 - link

    Surely if each GPU can run two displays, two GPUs on one card can run four displays?
  • Soulkeeper - Thursday, March 24, 2011 - link

    Wow that is massive
    I wouldn't put that in my pc if someone else bought it for me.
  • hab82 - Friday, March 25, 2011 - link

    For me the gaming differences between AMD and Nvidia at this level would not get me to change allegiance. In my case I bought two EVGA Classifieds, but they may never see a game. I use them as low-cost replacements for the Tesla cards. For parallel processing I held off until we had powers of two, i.e. 256, 512, 1024 cores. I did not like the 240, 432, 480 alternatives; it's just easier to work with powers of 2. The lower noise and lower power were big winners. Even if the power supplies can handle them, you still have to get the heat out of the room. My 4 nodes of 1024 CUDA cores each are a pain to air-condition in my den.
    I really love you gamers; you buy enough GPUs to keep the competition going and lower the price of impressive computation power.
    Thanks all!

    Hubert
  • Calin - Friday, March 25, 2011 - link

    USA manages to stay just south of Canada
  • rs2 - Friday, March 25, 2011 - link

    I thought you guys learned your lesson the last time you pitted a factory-overclocked reference nVidia card against a stock AMD card. If you're going to use an overclocked card in an article that's meant to establish the performance baseline for a series of graphics cards (as this one does for the GTX 590 series), then you really should do at least one of:

    1. Downclock the overclocked "reference" card to stock levels.
    2. Overclock the competing AMD card by a comparable amount (or use a factory overclocked AMD card as the reference point)
    3. Publish a review using a card that runs at reference clocks, and then afterwards publish a separate review for the overclocked card.
    4. Overclock both cards to their maximum stable levels, and use that as the comparison point.

    Come on now, this is supposed to be AnandTech. It shouldn't be up to the readers to lecture you about journalistic integrity or the importance of providing proper "apples to apples" comparisons. Using a factory overclocked card to establish a performance baseline for a new series of GPUs is biased, no matter how small and insignificant the overclock may seem. Cut it out.
  • buhusky - Friday, March 25, 2011 - link

    Anybody else remember back in the era when Pentiums just kept getting bigger and hotter every year? I wonder when they'll start making GPUs smaller, cooler, and quieter, like they finally ended up doing with CPUs.
  • krumme - Saturday, March 26, 2011 - link

    Yeah, then this card is like one of the first Pentium III 1GHz chips.
  • ryan1e - Saturday, March 26, 2011 - link

    No offense AnandTech, but this card is aimed squarely at bleeding-edge consumers, much like the AMD 6990 is. To that extent, any video card can only add performance to a system with respect to how much the system can deliver on the CPU side. As for the base system itself, it's a basic rig, nothing spectacular now. The GTX 590 and the AMD 6990 respectively would both perform better, and your results would prove more about the limitations and capabilities of those cards, if they were being run on the platforms they were targeted for. An example of what I mean: Tom's Hardware used a test platform based on an Intel i7-990X OC'd to 4GHz paired with an Asus Rampage III Formula motherboard, vs. AnandTech's older i7-920 clocked at 3.33GHz paired with an Asus Rampage II Extreme. In the review from Tom's Hardware, nVidia's GTX 590 and AMD's 6990 both performed far better than on Anand's rig, but still similarly overall. Personally, I think I'll stick with my SLI GTX 580 OC water-cooled setup and get an upgrade for my CPU; neither the 6990 nor the 590 in any configuration is worth the expense for the minuscule gain in performance on the graphics side.
  • mino - Saturday, March 26, 2011 - link

    One word: comparability.
  • mino - Saturday, March 26, 2011 - link

    Another important review from AT, another biased review from AT. GRRR.
    - AT chooses NOT to overclock the HD6990 BUT presents un-overclocked results as "HD6990 OC".
    Yeah, it could embarrass our masters if AMD's built-to-overclock card were shown to be overclockable.

    - "The GTX 590 simply embarrasses the 6990 here; it’s not even a contest."
    Yeah, 4dB is no contest, an embarrassment, of course. It is AMD's card after all. (It is louder, no question there.)

    PR mercenaries at their best. Let's brace ourselves for another round of PR warfare when BD and Llano launch...
  • nitrousoxide - Wednesday, March 30, 2011 - link

    Don't you know that noise level goes up by a factor of 10 with a 10dB increase? Do the math and you will find the 6990 is twice as loud as the 590. Indeed it's no contest. You can check out Linus Tech Tips' video review on YouTube; the 6990 is definitely much, much louder than the 590.

    The article itself isn't biased. The 6990 and 590 are in a similar win-some-lose-some situation, just like most cards at similar price points (570 vs. 6970, 560 vs. 6950, 460 vs. 6850, etc.). Darn, it's impossible to have a real card king these days when both NV and AMD are paying developers for optimization.
  • OblivionLord - Sunday, March 27, 2011 - link

    Does anyone know what case was used in this test? Because if different cases were used, that could affect the temp chart.
  • ryedizzel - Wednesday, March 30, 2011 - link

    Another excellent and incredibly thorough article. This is why I come back to AnandTech time and time again for the "real" story. Thank you and please keep up the good work! :)
  • trogthefirst - Thursday, March 31, 2011 - link

    Actually, if I wanted near top-end gaming performance I would just fork out for 2x HD 6950s, CrossFire them, and possibly unlock them as a bonus :P Tadaa!
  • WhatsTheDifference - Monday, April 4, 2011 - link

    How many years has it been since *any* 4890 has been represented in the charts..? Why is it this way? Come on. Exclude nV's earlier-gen top dog and maybe no one will notice.
  • Scandalman - Tuesday, May 10, 2011 - link

    Interesting article, however it would be useful to know what power supplies you recommend.

    I am interested in using a pair of these cards for running an OpenCL application flat out, and I want to know if my Tagan 1100W supply is sufficient; it's running a GA890-GPA-UD3H/Phenom X4 955 with two unlocked 6950s at the moment.

    Looks like it might just do it with a whisker to spare, but I'd like to avoid expensive pyrotechnics.
  • nyran125 - Sunday, June 19, 2011 - link

    I'm not really into giving all my money to the same manufacturer every year just to buy a new video card that does nothing except increase FPS by 3-10, and sometimes doesn't increase it at all. If what you have already owns everything out there, I don't see the point. I only see the point if what you have DOESN'T do what you want it to do. My AMD 6870 is running every game I've played so far on max without issue at the highest resolution. We are in the console era.

    Unless you're playing Arma 2 or Shogun 2: Total War, but even then that's more CPU related, and an AMD 6870 combined with a high-end CPU pretty much gets the job done at high res. NOT 30" though. My eyes are screwed up enough from staring at a monitor without wanting to go to 30".
