54 Comments

  • xxtypersxx - Tuesday, December 7, 2010 - link

    If this thing can hit 900MHz it changes the price/performance picture entirely, so why no overclock coverage in such a comprehensive review?

    Otherwise great write up as always!
  • Bhairava - Tuesday, December 7, 2010 - link

    Yes, good point.
  • vol7ron - Tuesday, December 7, 2010 - link

    Why do graphics cards cost more than cpu+mobo these days?

    I know there's a different design process and maybe there isn't as much of an economy of scale, but I'm just thinking about the days when it was the reverse.
  • Klinky1984 - Tuesday, December 7, 2010 - link

    Well, you're essentially buying a computer on a card these days, with the GPU as its CPU. A high-performance GPU with high-performance, pricey RAM, all of which needs high-quality power components to run. GPUs are now computers inside of computers.
  • lowlymarine - Tuesday, December 7, 2010 - link

    I think it's simply that GPUs can't get cheaper to the extent that CPUs have, since the die sizes are so much larger. I certainly wouldn't say they're getting MORE expensive - I paid $370 for my 8800GTS back in early 2007, and $400 for a 6800 in early 2005 before that.
  • DanNeely - Tuesday, December 7, 2010 - link

    High-end GPU chips are much larger than high-end CPU chips nowadays. The GF110 has 3bn transistors. For comparison, a quad-core i7 only has 700m and a 6-core Athlon 900m, so you get 3 or 4 times as many CPUs from a wafer as you can GPUs. The quad-core Itanic and octo-core i7 are both around 2bn transistors but cost more than most gaming rigs for just the chip.

    GDDR3/5 are also significantly more expensive than the much slower DDR3 used by the rest of the computer.
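
    As a rough back-of-the-envelope illustration of that wafer math (assuming ballpark die areas of roughly 520 mm² for a GF110-class GPU and 265 mm² for a quad-core i7, and ignoring defects and yield entirely), the standard gross dies-per-wafer approximation makes the point:

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Classic gross dies-per-wafer estimate; ignores defects, yield and scribe lines.
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return wafer_area / die_area_mm2 - edge_loss

print(round(dies_per_wafer(520)))  # GPU-sized die (assumed ~520 mm^2): ~107 candidates per 300mm wafer
print(round(dies_per_wafer(265)))  # CPU-sized die (assumed ~265 mm^2): ~226 candidates per 300mm wafer

    With real-world defect rates the gap only widens, since a larger die is more likely to land on a defect.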
  • ET - Tuesday, December 7, 2010 - link

    They don't. A Core i7-975 costs way more than any graphics card. A GIGABYTE GA-X58A-UD9 motherboard costs $600 at Newegg.
  • ET - Tuesday, December 7, 2010 - link

    Sorry, was short on time. I'll add that you forgot to consider the price of the very fast memory on high end graphics cards.

    I do agree, though, that a combination of mid-range CPU and board and high end graphics card is cost effective.
  • mpschan - Wednesday, December 8, 2010 - link

    Don't forget that in a graphics card you're getting a larger chip with more processing power, a board for it to run on, AND memory. 1GB+ of ultra fast memory and the tech to get it to work with the GPU is not cheap.

    So your question needs to factor in cpu+mobo+memory, and even then that combination doesn't have the capability to process graphics at the needed rate.

    Generic processing that is slower at certain tasks will always be cheaper than specialized, faster processing that excels at said task.
  • slagar - Wednesday, December 8, 2010 - link

    High end graphics cards were always very expensive. They're for enthusiasts, not the majority of the market.
    I think prices have come down for the majority of consumers. Mostly thanks to AMD's moves, budget cards are now highly competitive and offer acceptable performance in most games with acceptable quality. I think the high-end cards just aren't as necessary as they were 'back in the day', but then, maybe I just don't play games as much as I used to. To me, it was always the case that you'd be paying an arm and a leg to have an upper-tier card, and that hasn't changed.
  • TheHolyLancer - Tuesday, December 7, 2010 - link

    Likely because when the 6870s came out they included an FTW edition of the 460 and were hammered for it? Not to mention that in their own guidelines they said no OCing in launch articles.

    If they do an OC comparison, it will most likely be in a special article, possibly with retail-bought samples rather than vendor-sent samples...
  • Ryan Smith - Tuesday, December 7, 2010 - link

    As a rule of thumb I don't do overclock testing with a single card, as overclocking is too variable. I always wait until I have at least 2 cards to provide some validation to our results.
  • CurseTheSky - Tuesday, December 7, 2010 - link

    I don't understand why so many cards still cling to DVI. Seeing that Nvidia is at least including native HDMI on their recent generations of cards is nice, but why, in 2010, on an enthusiast-level graphics card, are they not pushing the envelope with newer standards?

    The fact that AMD includes DVI, HDMI, and DisplayPort natively on their newer lines of cards is probably what's going to sway my purchasing decision this holiday season. Something about having all of these small, elegant, plug-in connectors and then one massive screw-in connector just irks me.
  • Vepsa - Tuesday, December 7, 2010 - link

    It's because most people still have DVI for their desktop monitors.
  • ninjaquick - Tuesday, December 7, 2010 - link

    DVI is a very good plug, man. I don't see why you're hating on it.
  • ninjaquick - Tuesday, December 7, 2010 - link

    I meant to reply to OP.
  • DanNeely - Tuesday, December 7, 2010 - link

    Aside from Apple, almost no one uses DP. Assuming it wasn't too late in the life cycle to do so, I suspect the new GPU used in the 6xx series of cards next year will have DP support so nvidia can offer multi-display gaming on a single card, but only because a single DP clockgen (shared by all DP displays) is cheaper to add than 4 more legacy clockgens (one needed per VGA/DVI/HDMI display).
  • Taft12 - Tuesday, December 7, 2010 - link

    Market penetration is just a bit more important than your "elegant connector" for an input nobody's monitor has. What a poorly thought-out comment.
  • CurseTheSky - Tuesday, December 7, 2010 - link

    Market penetration starts by companies supporting the "cutting edge" of technology. DisplayPort has a number of advantages over DVI, most of which would be beneficial to Nvidia in the long run, especially considering the fact that they're pushing the multi-monitor / combined resolution envelope just like AMD.

    Perhaps if you only hold on to a graphics card for 12-18 months, or keep a monitor for many years before finally retiring it, the connectors your new $300 piece of technology provides won't matter to you. If you're like me and tend to keep a card for 2+ years while jumping on great monitor deals every few years as they come up, it's a different ballgame. I've had DisplayPort-capable monitors for about 2 years now.
  • Dracusis - Tuesday, December 7, 2010 - link

    I invested just under $1000 in a 30" professional 8-bit PVA LCD back in 2006 that is still better than 98% of the crappy 6-bit TN panels on the market. It has been used with 4 different video cards and supports DVI, VGA, Component HD and Composite SD. It has an ultra-wide color gamut (113%), great contrast, a matte screen with super deep blacks and perfectly uniform backlighting, along with memory card readers and USB ports.

    DisplayPort, like every other monitor option on the market, offers me absolutely nothing new or better in terms of visual quality or features.

    If you honestly see an improvement in quality spending $300 every 18 months on new "value" displays, then I feel sorry for you; you've made some poorly informed choices and wasted a lot of money.
  • ilkhan - Tuesday, December 7, 2010 - link

    I love my HDMI connection. It falls out of my monitor about once a month and I have to flip the screen around to plug it back in. Thanks TV industry!
  • Mr Perfect - Tuesday, December 7, 2010 - link

    It is somewhat disappointing. People with existing screens probably don't care, and the cheap TN screens still pimp the DVI interface, but all of the high end IPS panel displays include either HDMI, DP or both. Why wouldn't a high end video card have the matching outputs?
  • EnzoFX - Tuesday, December 7, 2010 - link

    A high-end gaming card is probably for serious gamers, who should probably go with TN as it's the best against input lag =P.
  • Mr Perfect - Tuesday, December 7, 2010 - link

    Input lag depends on the screen's controller; you're thinking of pixel response time. Yes, TN is certainly faster than IPS for that. I still wouldn't get a TN though, since IPS isn't far enough behind in response time to negate the picture quality improvement.
  • MrSpadge - Tuesday, December 7, 2010 - link

    Agreed. The pixel response time of my eIPS is certainly good enough to be of absolutely no factor. The image quality, on the other hand, is worth every cent.

    MrS
  • DanNeely - Tuesday, December 7, 2010 - link

    Due to the rarity of HDMI 1.4 devices (needed to go above 1920x1200), replacing a DVI port with an HDMI port would result in a loss of capability. This is aggravated by the fact that, due to their sticker price, 30" monitors have a much longer lifetime than 1080p displays, and their owners would get even more outraged at being told they had to replace their screens to use a new GPU. Mini-DVI isn't an option either because it's single-link and has the same 1920x1200 cap as HDMI 1.3.

    Unfortunately there isn't room for anything except a single mini-HDMI/mini-DP port to the side of 2 DVIs; installing it on the top half of a double-height card like ATI has done cuts into the card's exhaust airflow and hurts cooling. With the 5xx series still limited to 2 outputs that's not a good tradeoff, and HDMI is much more ubiquitous.

    The fiasco with DP-DVI adapters and the 5xxx series cards doesn't exactly make them an appealing option for consumers either.
  • Mr Perfect - Wednesday, December 8, 2010 - link

    That makes good sense too; you certainly wouldn't want to drop an existing port to add DP. I guess it really comes down to that cooling vs. port selection problem.

    I wonder why ATI stacked the DVI ports? Those are the largest ports out of the three and so block the most ventilation. If you could stack a mini-DP over the mini HDMI, it would be a pretty small penalty. It might even be possible to mount the mini ports on edge instead of horizontally to keep them all on one slot.
  • BathroomFeeling - Tuesday, December 7, 2010 - link

    "...Whereas the GTX 580 took a two-tiered approach on raising the bar on GPU performance while simultaneously reducing power consumption, the GeForce GTX 470 takes a much more single-tracked approach. It is for all intents and purposes the new GTX 480, offering gaming performance..."
  • Lonyo - Tuesday, December 7, 2010 - link

    Any comments on how many will be available? In the UK, sites are expecting cards on the 9th-11th of December, so it's not a hard launch there.
    Newegg seems to only have limited stock.

    Not to mention an almost complete lack of UK availability of GTX580s, and minimal models and quantities on offer from US sites (Newegg).
  • Kef71 - Tuesday, December 7, 2010 - link

    Or maybe they are a nvidia "feature" only?
  • Taft12 - Tuesday, December 7, 2010 - link

    Game, set and match. It will take a long time for Anandtech to redevelop its reputation.
  • 7Enigma - Tuesday, December 7, 2010 - link

    Seriously? We're still going to preach on this topic? I was one of those in disagreement with the way they handled the launch of the AMD 68XX series cards, but let it die already. This is a LAUNCH article and it deals with the design of the card and the performance of the reference card. As such it should not contain comparisons to OC'd cards...neither AMD nor NVIDIA. In a follow-up article, however, it should be compared to non-reference designs from both camps.

    If, when the AMD 69XX series cards come out, they include OC'd Nvidia cards, THEN you can rant and rave. But I can guarantee you there is no way they would do that after the fallout of the previous launch.

    So I politely ask that you stop.
  • Kef71 - Tuesday, December 7, 2010 - link

    Yes, seriously. Was there ever any official statement on whether OC cards would be used in GPU launches? I didn't see any, but on the other hand anandtech has not been in my bookmark list for a while...
  • strikeback03 - Tuesday, December 7, 2010 - link

    Yes, there was: they said that because of d-bag comments like yours they would ignore a sector of the market and only provide some of the possible competitors for new products.
  • Kef71 - Tuesday, December 7, 2010 - link

    If you really need to be rude, at least spell out "douchebag".
  • slacr - Tuesday, December 7, 2010 - link

    I was just wondering why there are no StarCraft II performance figures in the review.
    Understandably there is no "benchmark" feature implemented in the game, the runs are annoying and time-consuming, and of course the card can handle it. But it is the only game some of us play, and the figures would help us see if the card is "worth it".
  • nitrousoxide - Tuesday, December 7, 2010 - link

    It's a more CPU-bound game, so it cannot perfectly reflect the difference in GPU performance.
  • Ryan Smith - Tuesday, December 7, 2010 - link

    I'm still in the process of fully fleshing out our SC2 benchmark. Once the latest rendition of Bench is ready, you'll find SC2 in there.
  • tbtbtb - Tuesday, December 7, 2010 - link

    The GTX 570 is now available for pre-order on Amazon for $400 (http://amzn.to/e89Oo2)
  • Oxford Guy - Tuesday, December 7, 2010 - link

    Where is it, especially minimum frame rate testing?

    While it's good to see minimum frame rates for Crysis, it would be nice to see them for Metro as well.
  • Oxford Guy - Wednesday, December 8, 2010 - link

    Does Nvidia not want people to use Unigine since it showed the 480 beating the pants off the 580 in minimum frame rate at 1920x1200 and lower resolutions?

    I've noticed a definite lack of Unigine on review sites for the 570.
  • stangflyer - Tuesday, December 7, 2010 - link

    Any idea why the 580 SLI takes such a huge dump going from 1920 res to 2560 res? It loses half its framerate! It has 1.5 gigs of memory vs the 5870's 1 gig, yet the 5870 CrossFire goes from 50 fps at 1920 to 37 at 2560, while the 580 SLI goes from 72 fps at 1920 to 36 at 2560.

    Any ideas??
  • SmCaudata - Tuesday, December 7, 2010 - link

    It seems that AMD is finally getting CrossFire scaling well. The new 68xx cards are better than the old, but the 5870 is scaling as well as the Nvidia cards in a lot of cases. My guess is that with CrossFire or SLI the memory bandwidth is less of an issue. You don't fully double your framerate, after all. It is likely more dependent on the GPU clock speed...which is an advantage for AMD.

    I am really just taking a guess here. The other option is that it is simply an immature driver and will be fixed later.
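
    A quick sanity check on raw pixel counts (just arithmetic against the numbers quoted above, not a real explanation) also helps frame those figures:

# Pixel counts at the two resolutions discussed above
pixels_low = 1920 * 1200
pixels_high = 2560 * 1600
ratio = pixels_high / pixels_low            # ~1.78x more pixels to render

# If performance scaled purely with pixel count:
print(72 / ratio)   # 580 SLI: ~40 fps "expected" at 2560, vs the reported 36
print(50 / ratio)   # 5870 CF: ~28 fps "expected" at 2560, vs the reported 37

    So the 580 SLI result is only a little worse than straight pixel scaling would predict, while the 5870 CF number suggests it was being held back by something other than fill rate at 1920, which fits the scaling-overhead or immature-driver guess.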
  • nitrousoxide - Wednesday, December 8, 2010 - link

    Only when you've used a dual-AMD-card configuration will you realize how much you suffer from its poor drivers. It's fast but buggy, and I've been waiting too long for AMD to finally come up with a Catalyst that at least runs as stably as the nVidia driver. So please AMD, give us a nice driver!
  • Anchen - Tuesday, December 7, 2010 - link

    Hey,
    Good review overall as an apples-to-apples comparison. I would have liked to see what it did overclocked, as some have mentioned. On the Metro 2033 page the article says the following:

    "While Metro was an outstanding game for the GTX 580 to show off its performance advantage, the situation is quite different for the GTX 470. Here it once again fulfills its role as a GTX 480 replacement, but it’s far more mortal when it comes to being compared to other cards. "

    In the first sentence, shouldn't it be "...the situation is quite different for the GTX 570." and not the 470?
  • sanityvoid - Tuesday, December 7, 2010 - link

    Much as I love this site, the color scheme for the charts is really getting old. Why can't all the colors be the same EXCEPT for the one being reviewed? We're mostly all adults and can read, so the other GPUs in the charts could be left all one color.

    Some other sites do this and it is much easier to read what is actually being reviewed, even if the review color is always the same on each chart. The extra colors just add to the clutter of the charts; the human eye/brain gets distracted easily.

    Other than that, another good job on the article.
  • Ryan Smith - Wednesday, December 8, 2010 - link

    Thanks for the feedback.

    The colors are still a work in progress. We had some requests for additional colors in GPU articles to highlight the products we're immediately comparing the reviewed product to, which is what I did for this article. Certainly if you guys think this is too much, we can go back to fewer colors.
  • ATimson - Wednesday, December 8, 2010 - link

    Personally, my problem isn't so much that there are other colors, as that there's no good way to tell what they mean.

    Maybe one color for "other cards with benchmarks", one color for "immediate competition" (instead of each their own color), and a third for the product proper?
  • sanityvoid - Thursday, December 9, 2010 - link

    I really like this idea. All one color for 'set' of reviews (if multiple), and one color for primary.

    BTW, I didn't know others were asking for more colors. I guess do what others want. For me, personally, I like the one color for primary and one color for all others. It is just the easiest for 'first glance' to be easily distinguishable.

    Peace.
  • kirankowshik - Wednesday, December 8, 2010 - link

    I don't know why I should go for the Nvidia GTX 580/570 series when I am getting the same (almost, or more than) performance from ATI Radeon cards for a lower price. The ATI HD 5970 is almost $30 cheaper than the GTX 580 but outperforms it in every single test. The 5870 is not very close, but at least somewhat close, and the performance of the GTX 570 over the 5870 does not justify a $100 gap between the two. Anyway, I think NVIDIA is just producing cards for name's sake... with the HD 6900 series coming up, I will not be surprised if they offer a huge performance leap over the GTX 580/570 for the same price. Again it will be like when ATI released their first batch of DX11 cards and NVIDIA was struggling to get an answer to those...
  • kanthu - Thursday, December 9, 2010 - link

    Ryan Smith, from his conclusion:
    "As with the GTX 580 we’d pick the simplicity of a single-GPU setup over the potential performance advantages of a multi-GPU setup, but this is as always a personal decision."

    Going with dual GPUs (specifically nvidia) has its advantages. You get to experience nvidia 3D Surround. Yeah, I know the additional costs, the additional monitors, etc. that this entails. If GTX 460 1GB SLI can bring so much to the table, I can only imagine what GTX 560 1GB SLI will do when it comes.

    I only wish development on the display side would catch up with development on the GPU side (now that AMD has only just jumped on the 3D bandwagon).
  • owbert - Thursday, December 9, 2010 - link

    Anandtech, thank you for including a F@H benchmark.

    Thumbs up for the continued great work.
  • Shala3 - Sunday, December 19, 2010 - link

    Anand:

    In this sentence

    ------------
    The GTX 570 is fast enough to justify its position and the high-end card price premium, but at $100 over the GTX 470 and Radeon HD 5870 you’re paying a lot for that additional 20-25% in performance.
    -----------

    did you mean the performance difference between the GTX 570 & 5870 is 25%???
  • szore - Monday, April 25, 2011 - link

    I bought my BFG GTX 570 a few weeks ago and I am thrilled with it.
