I apologize for straying from the video topic, but I really get annoyed at the all-too-often trash talk about VIA and SiS chipsets.
I understand that this is a GPU article, so I can see Anandtech not recommending SiS or VIA integrated graphics based on their lackluster video capabilities. My question (or maybe I should call it a complaint) is: how can Anandtech claim SiS and VIA boards are not stable or reliable? The last review (that I can remember) of a SiS-based board was over a year ago, and even then I don't think it was a production board. Coverage of VIA-based boards isn't much better, but at least Anandtech does give VIA some budget coverage.
I can fully understand if Anandtech doesn't want to recommend VIA or SiS to their enthusiast crowd due to poor overclocking, or being "a bit more quirky" as your article states.
I'm not going to read all the way through old articles just to try and figure out what these stability and reliability issues might be (mainly because most of the articles are so old that a BIOS update could easily have made any stability issues moot). Well, I lied a little: I did briefly look through the VIA board articles within the last year and found no stability issues at stock settings. In fact, the only stability issues I saw mentioned in an article happened when "we tried to exceed the SPD settings of our DDR memory modules", but the next line reads "We did not experience these same issues with our DDR2 memory modules" (and that article is 1 week shy of 9 months old).
I hope Anandtech decides either to stop repeating these claims that VIA and SiS based boards are unstable, unreliable and quirky, or to start reviewing these boards and showing its readers why they deserve these remarks.
Then again, if the only thing we as readers get from reviews of these chipsets/boards is complaints about how budget boards are not able to overclock, or about the lack of a tweakable BIOS in a sub-$60 board, then blame the board, not the chipset; most people are already aware that budget boards are like this regardless of what chipset they use.
I know of at least one attempted SiS board review in the past year that was canned because our reviewer could not get the board to function properly (after several BIOS updates and two boards, IIRC). Motherboards (and chipsets) are such an integral part of any computer that I would never skimp in that area. Then again, maybe I'm just too demanding of my computers?
If you read user reviews of VIA/SiS boards you typically see a pattern that indicates the boards are overall "less reliable" - periodic instabilities and far higher failure rates. Some people report no problems and love the low prices, while others try to do a bit more with their systems and encounter difficulties.
If you just want to use a computer for office tasks, just about any system will be fine... but then again, if you're doing office work and your computer crashes, you probably won't be too happy. Anyone planning on running a higher-spec GPU should avoid cheaper motherboards IMO, as running a $300+ GPU in a <$75 board is just asking for problems. (For the same reason, I recommend $75+ PSUs for anyone running a CPU+GPU that cost more than $400 combined.)
Basically, I just can't recommend a questionable motherboard that saves a person $10-$20. The fact that the companies aren't out there promoting their products says something. If they're not proud enough of their work to try hard to get reviews at reputable sites, perhaps it's because they know their boards won't pass muster.
I actually had a company representative complain to me once about my stress tests being "unrealistic". He asked, "How many people actually try to run Folding@Home and a bunch of gaming benchmarks in sequence?" Basically, the system would crash if I used my script to benchmark games at various resolutions without rebooting in between each run. It's true that a lot of people might never stress a system to that level, but when I've looked at dozens of computers that handle that workload without problems, a system that crashes/locks in the same situation is clearly not as "stable or reliable" as competing solutions. All things being equal, I would recommend a different PC at the same price.
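For what it's worth, the scripted loop I'm talking about is nothing fancy. A minimal sketch is below; the game path, command-line flags, and resolution list are placeholders (every game has its own syntax), so treat it as an illustration of the idea rather than a working harness:

```python
# Hypothetical unattended benchmark loop: run a timedemo at several
# resolutions back to back without rebooting, and log whether each run
# completed. The game path, flags, and resolutions are placeholders.
import subprocess
import time

GAME = r"C:\Games\SomeGame\game.exe"   # placeholder executable
RESOLUTIONS = ["1024x768", "1280x1024", "1600x1200", "1920x1200"]

with open("stress_log.txt", "a") as log:
    for res in RESOLUTIONS:
        width, height = res.split("x")
        cmd = [GAME, "+timedemo", "demo1", "+width", width, "+height", height]
        start = time.time()
        try:
            subprocess.run(cmd, timeout=600, check=True)
            log.write(f"{res}: OK ({time.time() - start:.0f}s)\n")
        except (subprocess.TimeoutExpired, subprocess.CalledProcessError) as err:
            log.write(f"{res}: FAILED ({err})\n")
```

A system that locks up partway through that kind of loop is exactly the sort of instability I'm describing.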
That's basically how I see the VIA/SiS situation. $10 is about 100 miles of driving, a trip to most restaurants, a two hour movie.... It's not worth the risk just to save $10. If it is, maybe a new computer isn't what you really need; a used PC would probably be just as good and likely a lot cheaper (and possibly faster as well).
I agree with most of what you say; no one wants a system that crashes.
One thing I do notice, though, is that most of your arguments can be attributed to low-priced boards, yet the comments I find annoying are the generalizations about chipsets. Do you actually believe a $50 NVIDIA-based board is significantly more stable or reliable than one using any other chipset? And if you do, couldn't this just be a side effect of being a more popular chipset, and thus less work programming a BIOS? I'm sure this isn't what you meant, but going by your comments about motherboard pricing, if I found a $100 SiS-based board it should be more stable and reliable than a $50 NVIDIA board.
You also want me to read "user reviews"? This doesn't sound like a good way to judge reliability to me. Most user reviews are either in enthusiast forums like the ones you have here, which usually only review overclocking abilities, or on retail sites like Newegg, and to be honest most of the bad reviews I see there look more like PEBKAC.
You really haven't cleared up why VIA or SiS chipsets should be considered unreliable or unstable, although your dislike of budget boards is quite evident.
I'm not trying to deny you your opinion; I'm just asking that you refrain from singling out specific chipsets if what you really have a problem with is all budget boards. If there actually is a chipset-specific problem, please try to get a review published indicating what the problem is.
BTW, if the board that wouldn't function (and had its review canned) was a production board, I feel sorry for the person that bought it without a proper warning from a review site that knew it was flawed (you don't want to know what I think of a review site that would let this happen).
Knowing what to expect from a product can help a budget builder as much as it can help an overclocker.
I tend to view guides like these through the eyes of my own system, and having a 7900GT at 500/1500, there is little reason to upgrade if I'm going to continue to play games at 1280x1024. However, 22" (widescreen) LCDs have also become a lot cheaper, and with my poor eyes, the 1680x1050 or so resolution will probably work pretty well. That leads me to the great situation I'm apparently in - it looks like my card will fetch around $200 if I sell it, and I have the option of either a perhaps slightly faster X1950 Pro for $199, basically making it a free change but only slightly faster, or an X1950 XT 256MB for only $249. That's a lot of additional card for only $50, and pretty tempting. I cannot see why the $249 part doesn't get the nod for your pick over the 7950GT, though.
Despite the fact that they are separated by quite a few cards in the table, the X1950 XT 256MB and the 7950 GT give relatively similar performance. The XT is probably 10-15% faster depending on game, but that's not really enough to mean the difference between one resolution and another in my opinion. You also get 512MB of RAM with the 7950GT, and it tends to overclock better than the XT resulting in performance that is basically equal.
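To put a rough number on the resolution point: frame rates don't scale perfectly with pixel count, but a quick comparison of how many pixels each resolution pushes shows why a 10-15% faster card rarely buys you the next step up. (This is just back-of-the-envelope arithmetic, not a benchmark.)

```python
# Pixel counts for common resolutions; stepping up a resolution typically
# adds far more than 15% to the pixels rendered each frame.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1680x1050": 1680 * 1050,
    "1600x1200": 1600 * 1200,
    "1920x1200": 1920 * 1200,
}
base = resolutions["1280x1024"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x of 1280x1024)")
# 1680x1050 is ~1.35x the pixels of 1280x1024 and 1600x1200 is ~1.46x,
# so a card that is only 10-15% faster usually can't hold the same frame
# rate one notch higher.
```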
However, you're right that it is still worth considering, and so I added it to the final table. This is particularly true for people that don't like NVIDIA hardware for whatever reason - just as the 7950GT is worth considering for people that don't like ATI's drivers. Honestly, I'm still unhappy with ATI's drivers overall; they NEED TO DITCH .NET! What's next, writing low level drivers in C# or Jaba (that's big, fat, slow Java for the uninformed)? I know the .NET stuff is just for the UI, but it still blows, and I get about a 45 second delay after Windows loads while the ATI driver starts up. If I weren't running CrossFire, I might not have as many issues with ATI's drivers, though.
As a side note, Neverwinter Nights 2 appears to require/use .NET 2.0, and for those who have played the game that probably explains a lot of the performance issues. I'm not sure if CrossFire/SLI support is working yet, but I do know that my CrossFire X1900 XT config can't handle running with antialiasing, and/or water reflections/refractions at resolutions above 1280x1024. Seems decent without the AA and water stuff at 1920x1200 with the latest drivers and patch, though.
Something seems to be missing from this part of the last paragraph on page 8.
quote: As another example, we wouldn't recommend upgrading from a GeForce 6800 GT to a GeForce 7600 GT, because even though the latter is faster fair so fundamentally similar in terms of performance.
Weird speech recognition there, I guess. I'm pretty sure it was supposed to be "they are" instead of "fair so"... but I can't honestly remember if that's what I said or not. LOL
Chart of best values jumps from about $100 w/rebate to $200+, while a highly overclockable 7900gs can be had for $145 after rebate (about $35 over a 7600GT).
Fair enough - I added a Midrange Overclocking pick for you. It's still more like $165 according to the prices I found at Newegg and ZipZoomFly, unless you're seeing something cheaper?
Sure, it's the slower clocked version of the GT, but no rebate and $141 shipped is quite tasty. This wasn't available two days ago, I can say that for sure.
the 6600GT has 8 ROP's. Perhaps the Wiki is referring to the vanilla 6600, but I still doubt that NVIDIA broke with the 1:1 pixel pipe:ROP ratio with any version of the 6 series.
Basically, the way I see it is that it probably doesn't matter too much either way - X1900 and 78/7900 have both shown that 16 ROPs for more pixel shaders is fine - and the idea was to make a more budget oriented part. One of the ways to do that is to cut unnecessary extras like additional Render Output Pipelines. As for 6600 vs. 6600 GT, those are the same chip with different clock speeds, so they have the same number of functional units.
Even if 6600 does have 8 ROPs (I can't find anything official from NVIDIA), the important thing is that a 6600 GT is now slower than a lot of the newer ~$100 cards. :) But hey, if someone gets a specific answer from NVIDIA, I can update. I can also fire off an email just to verify, but it might take a bit to get an answer (if they answer at all).
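If it helps frame the ROP question: theoretical pixel fill rate is just ROPs times core clock, so the disputed count works out as below. The clocks are approximate, and I've included both possible answers for the 6600 GT rather than picking one.

```python
# Theoretical pixel fill rate = ROPs * core clock (in Mpixels/s).
# Clocks are approximate; the 6600 GT rows show both disputed ROP counts.
cards = [
    # (name, ROPs, core clock in MHz)
    ("GeForce 6600 GT (if 4 ROPs)", 4, 500),
    ("GeForce 6600 GT (if 8 ROPs)", 8, 500),
    ("GeForce 7600 GT",             8, 560),
]
for name, rops, clock in cards:
    print(f"{name}: {rops * clock:,} Mpixels/s theoretical fill rate")
# Either way, fill rate isn't what makes the 6600 GT slower than newer
# ~$100 cards - shader throughput and memory bandwidth matter more.
```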
Not sure if this is worth a mention in the article or not, but Dell.com currently has 15% off, which can be used on the XFX 8800 GTS/X cards. If you can dig up another coupon (Dell sent me one via email for 10% off), the deal can be even sweeter.
I got the 8800 GTS for $382.50 and free shipping. Sweet deal for others not living in Texas like me. About $420 with tax.
Only reason I mention this is because you mention using a MIR on the EVGA 8800 GTS.
I did the same thing. Mine came out to $404 after tax here in Michigan.
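For anyone checking the math, the numbers work out roughly like this; the ~$450 starting price and the tax rates are assumptions for illustration, not confirmed figures:

```python
# Stacking Dell's 15% off (and, hypothetically, a 10% coupon), then adding
# sales tax. The $450 starting price and the tax rates are assumptions.
list_price = 450.00
after_discount = list_price * (1 - 0.15)   # 15% off -> $382.50
stacked = after_discount * (1 - 0.10)      # if a 10% coupon also applied
print(f"15% off:          ${after_discount:.2f}")
print(f"15% then 10% off: ${stacked:.2f}")
print(f"TX total (~8.25% tax): ${after_discount * 1.0825:.2f}")  # ~$414
print(f"MI total (6% tax):     ${after_discount * 1.06:.2f}")    # ~$405
```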
XFX also has a double-lifetime warranty; that is, a lifetime warranty which applies to the first owner, and a second owner, should the first resell the card. You have to make sure you register the card, but it's a neat feature.
Has been for about a year now, but a lot of people keep dragging their feet. The fastest AGP systems are still able to run most games okay, but if you really want high-end graphics performance you are going to have to upgrade to PCI-E.
I've never tried to run two/dual monitors. I have an old CRT (VGA) and a new LCD (DVI). Can any card with both outputs run 2 monitors? Or only specific ones?
I can't think of any gfx card that has dual outputs that CAN'T support dual monitors... so I'm gonna go with "all of them can". Definitely all the ones listed in this article (meaning everything that is current or close to current tech). In fact, I'm doing dual monitors on my 6600GT (soon to be an 8800GTS!) right now :).
One point to add, would be PCI video cards. Since there are a number of Dell machines that have shipped without AGP/PCIE slots, it would be nice to know what PCI card you would recommend as bang for the buck. Right now I'm using Radeon 9250s... but I don't know if that is the best option. Yes, it's slow... but it's still cheaper than canning the entire system for people that want something just a little faster.
The fastest currently available PCI video card is going to be a Radeon X1300 I think, going for around $110 (and I see at least one that has a $20 mail-in rebate). That isn't a very fast graphics card to begin with, and I would expect the PCI interface to further bottleneck the card, but I'm not sure there's anything better if you're stuck looking for PCI parts.
I'm just looking around on Newegg, so perhaps there's something better elsewhere (I seem to recall seeing GeForce 6600 cards on PCI at one point, which might be slightly faster in some cases), but if you need more performance from your graphics subsystem you really will need to look at upgrading to a new motherboard/computer that supports something other than PCI graphics.
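To give a rough sense of why the PCI bus itself is the bottleneck, here are the usual theoretical peak bandwidths for each interface (theoretical peaks only; real-world transfers come in lower):

```python
# Theoretical peak bandwidth of each graphics interface, in MB/s.
# PCI is a 32-bit bus at 33 MHz; the AGP 8x and PCIe x16 numbers are the
# commonly quoted peaks. Real-world throughput is lower in every case.
interfaces = {
    "PCI (32-bit / 33 MHz)":    32 / 8 * 33,   # ~133 MB/s
    "AGP 8x":                   2100,          # ~2.1 GB/s
    "PCIe x16 (per direction)": 4000,          # ~4 GB/s
}
for name, mb_per_s in interfaces.items():
    print(f"{name}: ~{mb_per_s:,.0f} MB/s")
```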
DirectX 10 is not actually available yet -- no games support it, Windows Vista hasn't shipped, and even after Windows Vista becomes available it will probably be a couple of months at least before you get DirectX 10 enabled games (i.e. games that actually add new DirectX 10 features). NVIDIA says it best:
quote: Please keep in mind that Windows Vista will not be available to end-users until the end of January. We'd like to assure you that Vista drivers for the GeForce 8800 will be available to download when Vista ships to end users at the end of January.
The inability to run beta/nearly finished Windows Vista with all of the features enabled on brand new hardware isn't something that I consider a major problem. The nature of beta/release candidate software is that there are still many known problems. For all we know, DirectX 10 performance on the G80 chips is going to be terrible... or it might be the greatest thing since sliced bread. The only way we will find out for sure is when Windows Vista is finally released and we actually get games that use DirectX 10's new capabilities.
You guys list an EVGA 768-P2-N831-AR, but the one I got from Fry's electronics differs at the end w/ EVGA 768-P2-N831-FR. Does the FR=Retail, AR=Online? Or would AR be the newer "fixed transistor" SKU?
Honestly, I have no idea. EVGA (and many GPU manufacturers) tend to have so many different SKUs available with only negligible differences between them. I wouldn't be surprised if one of the models has a slight tweak to the transistors, but as for which one is "newer/better" I don't know. You could always email EVGA and ask.
The FR bought release day from Fry's had a 39C transistor and hit 660/1000. The AR ordered online last week has a 40C transistor and hits 630/1000. It may not be quite as fast, but I'll be keeping the newer AR w/ the 40C transistor...comforts me at night. :D
quote: ATI's X1800 line on the other hand is quite different from the X1900 parts, with the latter parts having far more pixel pipelines, although in terms of performance each pixel pipeline on an X1900 chip is going to be less powerful than an X1800 pixel pipeline
Again, this is completely wrong. The major difference between the x1800 and x1900 cards is that the x1900's have 3 pixel shaders per "pipe", whereas the x1800's only have one. If anything, the x1900 pipes are more powerful.
Akin to my comment above, quads are the thing these days, so the 1900 series has 4 pixel shaders per pipe. And if you go back to the original article when the 1900 was released, you'll see that the whole architecture is closer to 4 x1600's than 3 x1800's, either of which would result in the 48 shaders that we see. I recommend you read the first few pages of the debut article, but I think we can agree that the shaders in the x1800 were probably more potent than the ones in the 1600, so the 1900 is probably a little wimpier per shader than the 1800. However, it has 3 times as many, so it's better.
Also the comment was probably intended to dissuade people from assuming that the 1900 would be 3 times better than the 1800, and that there is a difference of architectures going on here.
quote: Also the comment was probably intended to dissuade people from assuming that the 1900 would be 3 times better than the 1800, and that there is a difference of architectures going on here.
Ding! That was a main point of talking about the changes in architecture. In the case of the X1650 XT, however, double the number of pixel shaders really does end up being almost twice as fast as the X1600 XT.
I also added a note on the page talking about the G80 mentioning that they have apparently taken a similar route, using many more "less complex" shader units in order to provide better overall performance. I am quite sure that a single G80 pixel shader (which of course is a unified shader, but that's beside the point) is not anywhere near as powerful as a single G70 pixel shader. When you have 96/128 of them compared to 24, however, more definitely ends up being better. :-)
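A crude way to see why lots of weaker shaders wins: aggregate shader throughput is roughly unit count times work per unit per clock times clock speed. The per-unit figures in the sketch below are invented placeholders purely to illustrate the scaling; they are not real G70/G80 numbers.

```python
# Illustrative only: the "ops per clock" figures are invented placeholders.
# The point is that a large number of simpler units can out-produce a
# small number of more capable ones.
def throughput(units, ops_per_unit_per_clock, clock_mhz):
    """Rough aggregate shader throughput in millions of ops per second."""
    return units * ops_per_unit_per_clock * clock_mhz

old_style = throughput(units=24, ops_per_unit_per_clock=8, clock_mhz=650)
new_style = throughput(units=128, ops_per_unit_per_clock=2, clock_mhz=1350)
print(f"24 complex units: {old_style:,.0f} Mops/s")
print(f"128 simple units: {new_style:,.0f} Mops/s")
```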
quote: ATI needed a lot more pipelines in order to match the performance of the 7600 GT, indicating that each pipeline is less powerful than the GeForce 7 series pipelines, but they are also less complex
The 7600gt is 12 pipes. The x1650xt is 8 pipes with 3 pixel shaders each. You may want to rethink the statement quoted above.
What he meant were "pixel shaders", which seem to be interchanged with pipelines quite often. If you look on the table you'll see that the x1650xt is listed as having 24 pixel pipelines, and the 7600gt has 12 pixel pipelines, when they should read shaders instead.
Also quads seem to be the thing, so the 7600 gt probably has 3 quads of shaders, and the 1650 has twice that with 6 quads. Pixel shaders, to be more exact.
I have changed references from "pixel pipelines" to "pixel shaders". While it may have been a slight error in semantics to call them pipelines before, the basic summary still stands: ATI needed more pixel shaders in order to keep up with the performance NVIDIA was offering, indicating that each pixel shader from ATI is less powerful (overall -- I'm sure there are instances where ATI performs much better). This goes for your comment about the X1800 below as well.
I was reading this article hoping to find a decent low-priced card, and when I saw the ultra budget section I thought I had found just that. But when I went to check the prices and specs of the cards listed, the recommended 7300GT part was listed at several sites as only having a 64-bit memory interface instead of the listed 128-bit. The part number they posted was EVGA 256-P2-N443-LX. I didn't even find this product on the EVGA website. If someone knows what the deal is with this, or even where to find one, I'd appreciate it, as a 128-bit interface card versus 64-bit is a major performance booster, especially in the price range I'm looking at.
I have modified this text slightly now. The cheapest EVGA 7300 GT is available for $75 at Newegg, but you're right that it is only a 64-bit memory interface. For about $10 more, I would recommend a Biostar 7300 GT instead, which comes with slightly higher clock speeds and a 128 bit interface. (It's also available at Newegg.)
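As a rough illustration of what the wider bus buys you: memory bandwidth is bus width (in bytes) times the effective memory clock. The ~667 MHz effective DDR2 clock below is an assumed typical value; actual 7300 GT boards vary by vendor.

```python
# Memory bandwidth = (bus width in bytes) * (effective memory clock).
# The 667 MHz effective DDR2 clock is an assumed typical value; actual
# 7300 GT boards vary by vendor.
def bandwidth_gb_per_s(bus_width_bits, effective_clock_mhz):
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

for bus in (64, 128):
    print(f"{bus}-bit @ 667 MHz effective: {bandwidth_gb_per_s(bus, 667):.1f} GB/s")
# 64-bit: ~5.3 GB/s vs. 128-bit: ~10.7 GB/s - double the bandwidth for ~$10 more.
```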
quote: ...and a power supply capable of delivering 1.21 Gigawatts of power, by all means go nuts.
page 7
Is that a joke I'm not getting, or should it say 1.21 kilowatts? If it is the latter, then why so much power? I would think a quality PSU delivering 850-1000 watts should be fine. And where does the 1.21 figure come from - adding the maximum TDP values of all the components?
Does anyone know when we will be getting low- to mid-range DX10 cards, or when DX10-exclusive games (ones that do not work on anything less than DX10) will start to come out?
ROFL, you just made my day, man. Go watch the movie Back to the Future with Michael J. Fox from the mid '80s... 1985, I think. The "Doc" in that movie makes a comment (actually, he screams it) saying that you need 1.21 gigawatts in order to provide enough power for his time machine to work.
It's a complete joke; he's saying that you need a ginormous (aka big, high-wattage) PSU in order to run some of these cards. Yeah, 800W would be PLENTY, IMO.
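Setting the time-machine joke aside, the "add up the TDPs" approach mentioned above looks roughly like the sketch below. All of the wattages are ballpark guesses for illustration, not measured numbers:

```python
# Ballpark PSU sizing: sum the estimated peak draw of the major components
# and add headroom. All wattages below are rough guesses for illustration.
components = {
    "CPU (dual core)":    90,
    "High-end GPU":      150,
    "Motherboard + RAM":  60,
    "Drives, fans, misc": 60,
}
peak = sum(components.values())
headroom = 1.3  # ~30% margin so the PSU isn't running near its limit
print(f"Estimated peak draw: {peak} W")
print(f"Suggested PSU size:  {peak * headroom:.0f} W")
# Even with a single high-end card the system lands in the 450-500 W range,
# so 800 W is comfortable and 1.21 GW is strictly for time travel.
```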
It looks like the new 256MB Radeon X1950XT is a heck of a buy for anyone running a 20" display or less at least.
This guide is much appreciated. I especially think your note on older high-end graphics cards is a good one, though I might place even more emphasis on it so that some people could make a good choice to buy used rather than new (especially AGP folks, many of whom will be best served by a top-end used card like the 6800Ultra or X850XT).
No mention of the 7900GS at all??? And the 7900GT AGP was a ghost even when it first released, so why would you even mention it and then keep mum about the 7800GS AGP which is still easy to find?
Agreed ... (For those who want to stick with NVIDIA) 7900GS is a great price/performance point.
It's affordable and offers great performance on the 19" and 20" wide displays that are so popular right now.
I don't see any reason for someone to buy a 7900GT over a 7900GS right now; they fall into the same performance bracket. For people upgrading... there are still quite a few people out there with SLI boards, too. And while, yes, it's better to just get a more powerful single card, many people can only afford XX right now. The ability to upgrade by adding a second card later adds some perceived value for people.
I do have to say, good timing on your article. It's a confusing time for GPU upgrades. With the 8800s out, the picture isn't as clear for people.
I thought I had mentioned those cards, but you're right: I didn't. I have now added text to page 5 covering the higher-end AGP offerings in more detail.