Like a large share of readers, I don't feel I can afford to buy high-end cards, but that doesn't mean I'm not interested in them.
Personally, my favourite reading is about new technology. New generations of graphics cards make for interesting articles, IMO. Yet another card that's essentially the same as the others but somewhat different in performance (like the 4830, or the various renames NVIDIA cards go through) makes for quite a boring read.
For getting a good idea about what to buy, my favourite place to go used to be the 3Digests at digit-life. They seem to have stopped, unfortunately, but they pretty much summed up the performance of the cards in order to make a purchase decision.
I'm also interested in seeing where things can be stretched. Quad-SLI, though I'd never have it, does tell me things about technology. Cross-SLI (or cross-crossfire), such as 4870+4830, would also be of interest.
I generally try to stay around the $150 mark, because that's normally where the cards with the best performance per dollar are. Right now the HD 4850 and the 9800GTX+ are in that price range, and with rebates and Newegg discounts I've seen both down to $125, which is a really good deal. But if I were putting together a whole new computer and the difference in performance was big enough (at least 30%, since I've found 30% is the smallest improvement that's noticeable in real-world situations), then I'd be willing to spend closer to $200, or maybe even slightly more if it really was just that good of a card.
However, if I already had my computer built and all I was doing was upgrading the GPU I'd be significantly less likely to spend any amount of extra money on that GPU; since the system as a whole has less life in it. I think it'd be a good idea to mention that in an upcoming article and/or make a poll about it.
Oh, I also try to stay around the $150 range because those cards tend not to be too loud when gaming and nowadays are silent at idle; considering I use my PC as my bedroom television, that's important. I also don't want my GPU to raise my electric bill too much. That's one great thing about NVIDIA's GT200 series cards: their idle power consumption is phenomenal. Honestly, that's the reason I'm holding off on upgrading my GPU; I want a card that idles at about 50W, or less would be even better. That also pretty much removes the need to switch to integrated graphics when you're not using your dedicated GPU.
So yeah, you should definitely consider the issues of noise and power consumption; and when you're talking about whole-system noise you need to be concerned with heat as well. I'm one of those who believes you shouldn't be able to hear your computer AT ALL unless you get down and put your ear to it. Heavy gaming sessions excluded, as the GPU fan obviously needs to spin up; but it should spin down quickly when you're done... so I can get to bed easily :)
One reason I read this site is the careful attention paid to price and performance. I'm in the 1% "price doesn't matter to me" camp, but nearly everyone I advise on PC parts cares about cost quite a bit. Here I can find a proper vendor-agnostic test set. An important service indeed.
I'm usually in the $150s when I purchase a video card. I had a Sapphire ATI X1950 Pro, which replaced a Guillemot 6600GT. Neither of those cards was "top of the line," but they were in the top 20%. I never ran them at max resolution (I just ran 1024 x 768) and I wasn't afraid to turn antialiasing off if I could keep detail high. I had a 19" Trinitron CRT.
When I moved to a 22" LCD I upgraded my video card with an MSI 8800GT 512MB OC. I paid $120 for it before rebate. It's faster than my X1950 Pro, but about the same if I run games at 1680 x 1050, which I do. I looked into upgrading to a better card this year, but none of them are that much faster at this resolution. This is a case where I bought a cheaper video card than usual, since the performance difference wasn't that great. I took my video card budget for the year and replaced my E6400 with a Q6600 instead.
I hope AnandTech takes notice of the results of this survey. Most people are after midrange, "bang for your buck" parts and don't really care about the expensive top end. Sure, it is interesting to see the technology and what will filter down, but it is mostly irrelevant.
Why bother with ongoing multi-GPU SLI scaling articles when the Steam hardware survey shows that 97.8% of gamers use single-GPU systems (as of Feb 2009)?
I find that the people who spend massive dollars on SLI systems are either excessively wealthy or new to the scene and haven't been through the several generational cycles that turn their new $450 card into a paperweight within 2 years.
The SLI equation rarely adds up. Although it is good in theory to add a second card later down the track, by the time you need it a new next-gen card is released that is just as fast as your SLI system but at 1/3rd of the power and much less noise and heat.
Mid range volume sellers like the 9800pro, 6800GT, 8800GT, 4850... now that is where it's at!
Even though parts are made upgradable, I tend to buy in the mid-to-high-end range and replace every 2-3 years, going along with Moore's Law.
Parts are made upgradable but I rarely do. And when it's time to buy a new CPU, you need a new mobo, and then you find out there is new RAM, yada yada. The GPU and the HD are the parts that generally last longer than 2-3 years.
The biggest problem these days is that after the properly priced, bang-for-the-buck GPUs of 2007/2008, we're back to 2006/2007, when things were overpriced.
Especially in this economy, but even in a better economy like we saw in 2007/2008, one should be able to buy a very good, latest-generation GPU for $200-250. Something that can handle the usual assortment of the latest FPS games at 20-24" monitor resolutions with high quality settings without dipping into slowdown levels of frames-per-second, and without needing to also have some top-of-the-line $500 CPU either.
2007/2008 was the generation of the Core 2 Duo + 7900GT (and later 8800GT) (and several other good combos). We are only just now starting to see GTX 260 Core 216 GPUs in the right price range, and they are only arguably the latest generation, what with the 285/295 now being on the market.
WOW… I’m not easily shocked and I’m shocked at the results. Granted times are tough but people (responding) are much cheaper than I would have ever guessed. I guess that partially explains why the market is so flooded with low end junk. Even with devaluation of the dollar and inflation prices are more restrained than ever. Just a few years ago a just released X1900XT cost me $500 and was a price performance deal compared to the in short supply $800 Geforce (7800?). Just about a year ago I picked up a bargain 8800 GTX at $350 (right when the 8800 GT and then 9000’s started appearing). In perspective a $334 GTX285 I’m thinking about buying is an outright price performance steal.
To each his own but I think the results are skewed by a much wider audience outside of the gamer, enthusiast, system builder and fanatic overclocker crowd one would normally associate with a hardware site like this. I fall somewhere in the enthusiast/overclocker/gamer crowd and I look to build a good balance of price performance. Granted I have a fair amount of disposable income (I have 4 kids, 2 dogs, 1 cat and a wife that does not work) but I have more brains than I have money. I do look at reviews and look for something to give me the best price performance gaming (and other uses) on a 26” at 1920x1200. I see a whole lot of people trying to put cheap lipstick on a pig in the comments. Conversely I always get a good laugh out of people wanting to run high end cards and/or SLI on small low resolution monitors.
"WOW… I’m not easily shocked and I’m shocked at the results. Granted times are tough but people (responding) are much cheaper than I would have ever guessed."
No, we gamer/enthusiasts are a number of things, but we're not "cheap." We're price-conscious, we're bang-for-the-buck oriented, we're smart with how we spend our money, and we're (mostly) adults who have the patience and discipline to resist being suckered into dropping big bills for a GPU that's really not worth it.
You just have a skewed perception that is now being corrected by reality, and it was skewed by spending time on forums populated primarily by the handfuls of people who actually spend their money on ultra high-end GPUs (and often do so every 6 months, and do so specifically for the sake of forum postings and signature stats).
Most people (talking middle-class, normal gamer/enthusiast folks) do not drop $500 on a GPU for their computer. They understand the value of $500 better than some kid who doesn't pay for his own housing, insurance, etc., and would not be so quick to drop that kind of money on a computer except in rare circumstances.
And the reality is the GPU market has changed to go after the higher-end pricing schema because there have been an increasing number of suckers that make such a schema more lucrative where it never was quite so lucrative in the past.
Realistically $200-300 is the range people expect for a near-top-of-the-line GPU for gaming on your average 20-24" widescreen monitor at High graphical game settings. There should only be one single-card offering of a given generation priced higher than that, at around $450-500, which is for the people who think they need it, or who run 30" displays, or who really want to run Ultra High settings at high resolutions at playable framerates.
Everyone else, and where the big money has traditionally been made, is in the sub-$300 GPU market.
And there are a lot more caveats and things I'd like to say, but whatever, this is long enough.
With young kids, my discretionary income is limited. I usually can scare up a sweet deal. I got my 9600GT for $75 AR, and sometime soon I will end up getting another one for less than that. Ending up with a pair of 9600GTs for around $125, while having graphics performance that is good enough for me for about 3-4 years, is value in my book.
My current card is an 8800 GTX that cost $550 in January 2007. It was a bit hard to swallow at the time, but I'm still happily using it on a 22 inch monitor and it provides comparable performance to today's mid-range cards (HD 4850/9800 GTX). That's pretty good longevity and value, especially considering I'll get one more year out of it before my new PC build ($180 a year; many people spend that much upgrading mid-range cards over three years).
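That per-year figure is straightforward amortization; a quick sketch of the math (the four-year figure assumes the planned extra year actually happens):

```python
def cost_per_year(price, years):
    """Amortize a card's purchase price over its service life."""
    return price / years

# 8800 GTX: $550 in January 2007, about three years of service so far.
print(round(cost_per_year(550, 3)))  # 183 -- roughly the $180/year quoted
# If it survives the planned fourth year, the effective cost drops further:
print(round(cost_per_year(550, 4)))  # 138
```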
Hopefully the "next big thing" will hit about the time I'm ready for my new build.
When I buy a new video card, I usually shoot for under $200, but will go up to $250 if pressed.
But I don't really care much about *today's* price when I look at reviews (which is what we're talking about here, right?). It's all relative, and I'm very rarely ready to buy when you put out a review. So if I'm looking at a bunch of cards compared by price on Newegg, I want to be able to come to AnandTech, look at a page, and see the performance of these cards compared to each other. Tom's tries to do that but doesn't do it well. THEN I want to be able to drill down and see the details. It's important to have generic articles on the tech behind each card and what it is that makes it a good card. Periodic scorecards on manufacturers would let you get some kind of idea of reliability and build quality.
So I'm looking for a site that gives me the tools I need to make a smart purchasing decision when I'm ready to buy. I like to keep up with the latest kit as much as the next guy, but when I'm buying, I want fast, reliable information on the cards I'm looking at. Build your reviews and benchmarks for longevity, and I'll be happy.
such as yearly hardware budget, or number of PCs, or total RAM of main PC... any indication of the level of geekness of the respondent.
You may be surprised by how many non-computer-nerds your audience has, due to the democratisation of the whole PC scene, and the lack of good web sites.
GPUs I look at usually MSRP for $200; I go ±$50 based on the ease of implementing a passive cooling (or semi-active, <19dB) solution. Several years ago it was the ATI Radeon 9800 Pro 256MB; just recently it was the ATI Radeon HD 4870 1GB for $230. I paid the premium to ensure I got the 4-phase power design, for example. Generally, I wait a while to make the purchase (the HD 4870 has been out for a long time) to let the market fight the price/performance "battle" for me.
I was thinking you could have acquired more information about relative graphics card spending by asking a similar set of questions.
1) How much do you spend when upgrading the main components of your system (mainboard, CPU, GPU, RAM)?
2) What proportion of that upgrade cost is dedicated to GPU?
I bought my last 285 cards because they came with CoD: World at War, which I wanted to buy anyway. That's $40-50 right there. (Incidentally, I got FC2 too.) So if I had to choose between three cards of the same speed/price (one without extras, one with a game I own/don't want, and one with a game I want), I'll definitely go with the third one. I'd also take a free game over a mail-in rebate.
I've had some luck winning graphics cards in recent years, which is my favourite way of acquiring them. Before that I bought a couple of used ones (sub-$100, but they were last gen's high end) and before that I typically bought low end (sub-$100), but that was 10 years ago.
Assuming I don't win another card, though I voted $100-$149, I think sub-$100 is pretty decent, and if I had to buy now, I'd probably go for a Radeon 4670. I want something that'd run games at 1600x1200, since it's my monitor's resolution, but I'm willing to do without FSAA and other eye candy, if there's need, and 30fps is typically enough for me (even less, for games that aren't action games). The programmer in me likes to have the latest tech (DX10.1, ...) even if I don't use it. I also prefer lower power cards, if possible.
I'll certainly be tempted to buy a DX11 card when they come out, but will wait for something like the 4670 for that gen.
I tend to want some decent performance, but really these days an HD 4670 makes a compelling case for a second or third computer.
I guess what I'm saying is that value is important when you're talking about multiples. I would have "never" really considered having more than one up to date "gaming" rig but with prices falling for a performance level that I think is quite good, then I don't see why limit oneself to one expensive card, when there may be more value in having a LAN party ready to happen at all times.
So, given the state of things right now, I think it's really difficult to come up with a reason to spend over $150.00, while if I needed to (in order to play the best games, not just the most demanding games, the BEST ones) then I could certainly go to $250.00, say. I for one don't think the prices are going to go that high for regular "upper mainstream" needs.
Now that we've tasted affordability, I think a lot of us aren't going back.
Also, I think more than ever, I would say my very highest concern in terms of cost is energy efficiency. Why? Because I've done away with building my own computers; I buy Dells now, and they only come with 300-350W power supplies that I don't want to change. So basically I don't want the hassle of a pricey card. When I look at the current $250+ offerings, I quite simply don't want them at ANY price. Literally, if I could buy a double-slot card that needs a 550W power supply for $50.00, I would pass.
I currently have an HD 4850 and I simply don't see the use of more. In fact, the HD 3870 that it replaced was plenty fast; I simply couldn't pass up the boost per $ of the 4850. So it's not a cut-and-dried thing.
Some products come around and they are compelling. For example, the 9800GT 512MB edition was hard not to want.. it was $200.00 and it was cool running and energy efficient. So again, the $ amount of the pricier cards is only one of the reasons that I don't want them... most of them (just look at the size of the cooling and the power draw) are no more than overpriced prototypes.
What I am currently MOST interested in is a better half-height (or low profile) offering that doesn't need a six-pin power and the like. I am basically waiting for a HD4670 level of performance in a half-height but it doesn't look like it's going to come. I would pay $150 for such a card even though the performance per $ would be way low. Because it would fit my very nice slim desktop (the extra computer).
So to beat a dead horse, I don't think I'm alone on this. We want a compelling product FIRST AND FOREMOST, and if it's compelling enough many of us will come up with the money so long as it's not that much money (the difference between $70 and $ 150 is over 100% but it's still a reasonable cost).
So... so long as there are $130 offerings with better power envelopes and sufficient performance there is no way no how that I'll spend more for a less appealing product. I want to support research and development with my dollars, not marketing hype...
I voted for the $50-99 segment (And why not? You can get an 8600GTS all the way to a 9800GT in that price segment, including the ~$70-on-ebay 9600GT smack dab in the middle of that segment.)
With a +/- $50 tacked on there.
Buuuut, if I had any money at all I would probably have shot into the $150-200 segment, again with $50 leeway.
I base the money I am willing to spend on what I ask the card to do, namely I ask my card to run dual 1280x1024 screens in a stereoscopic 3D setup (one for the left eye, one for the right). Thus I have decided that a 9600GT (or a decent 8800/9800 series card with 512MB if it falls under $80) is probably the best choice considering I have no money coming on a regular basis.
If I were to move up to dual 1680x1050 or a single 2560x1600, then I would start looking up the chain at a ~$100-250 card. I don't think that getting 10% better FPS is worth 200% of the price, so I tend to find the sweet spot that balances my system with my card and offers the very best value for money.
I purchase a new graphics card every 2-3 years to upgrade my PC. It is easy to sink a lot of money into a graphics card replacement unless discipline and restraint are exercised.
What is important to me when I consider a replacement graphics card are the following criteria I strictly abide by:
1) The existing graphics card must be in service for at least 2 years before replacement.
2) The replacement card must realize at least double the performance and double the memory for the same or less cost of the original graphics card.
3) The replacement graphics card may cost between $150 to $200.
4) The replacement graphics card power draw must be 55W to 65W at full load and preferably be a single slot card.
I am waiting for a suitable replacement for the 7950GT this summer. The nVidia 9600GT does not quite meet the double the performance criteria while it meets my power, memory and cost criteria. The nVidia 9800GT meets the double performance, memory and cost criteria, but fails the power draw criteria. To me, any card drawing more than 65W at load requires too much power.
When GPUs on a 45nm process are released this summer, then graphics cards worthy of consideration will be released.
I am assuming you have heard of the 9800 Green Power? It does not require an external power connection (all power comes through the PCI-E slot connection.)
I beg to differ with your "double performance" bashing of the 9600GT; by pure numbers it has much more than double the performance in any modern game (i.e., one using shaders, like any released in the last 2 years).
The 7900 series (which I love; my main card is a 650MHz 7900GS) has 20 or 24 pixel shaders and 7 or 8 vertex shaders, with 1,400MHz GDDR3.
The 9600GT has 64 unified shaders, a stock clock of 675MHz, and 2,000MHz GDDR3.
I had a G92 and a G80 card and sold them both. Why? Because the older games I was into actually showed a decrease in FPS versus my 7900GS. BUT! I recently picked up an 8600 GTS (2,000MHz, 128-bit bus, ~$40 on eBay), and it really is much, much faster in modern shader-laden games such as GRID. I would say that since the 8600GTS is about half a 9600GT, and the 8600 will solidly trounce a 7900 card, you can safely get a 9600GT. (Why you don't just get an HD 4830 is beyond me; it sounds like your perfect card.)
I think you may be a little too picky with the "65W is too much power" argument; after all, CPUs have drawn more power than that for years, and did you complain about that? And the GPU is doing so much more work, for crying out loud.
Not to knock being picky, I am unbelievably picky, but I channel it into action and modify an existing solution.
Might I suggest a 9800GT with a volt-mod to reduce the power, and clock it back a little? Although if you wait a little bit the factory will release the Green Power model for you.
I don't know why you need a single-slot card either. Do you demand that your CPU function with a 1U cooler on it? No, of course not; it would be silly. Why force your video card's GPU to labor under heat it can't shed with a single-slot cooler?
Getting value from a graphics card is about more than performance per dollar at time of purchase. It also means using the card for a certain period of time to recoup your money's worth. Depreciation on graphics cards is worse than that of automobiles. That's why I will keep a graphics card for 2 years before I replace it.
Also, the cost of electricity is high where I live. Performance per watt is important to me -- though I did not adequately express this in the above blog.
I am also picky about the power load because I think it ridiculous a graphics card would draw more power than the CPU. Any PC that requires more than a 350W power supply is a workstation, not a PC.
I currently use an E6600 which draws 65W. I use a 500W Seasonic power supply which could easily power more powerful graphics cards than the 7950GT (65W) in the 100W to 130W class. However, I use 500W because power supplies are most efficient between 40 to 60% of max load -- 250W for my rig at max load.
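The sizing rule in this comment (keep maximum system draw inside the supply's most efficient 40-60% load band) is easy to sketch; the 500W and 250W figures below are the commenter's own:

```python
def efficient_band(psu_watts, lo=0.40, hi=0.60):
    """System-load range (in watts) that keeps a PSU inside its
    most efficient 40-60% load band."""
    return psu_watts * lo, psu_watts * hi

low, high = efficient_band(500)
print(low, high)           # 200.0 300.0
print(low <= 250 <= high)  # True -- a 250W max-load rig sits in the band
```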
No, I have not heard of the 9800 Green Power. Where can I find power draw and performance figures for this card? This card may operate at lower voltages, but the frequency is also lower because the G94 chip is being used.
The 9800GT draws 83W, which exceeds my criteria. It is possible to reduce voltages, but not by much. It's conceivable that 75W could be achieved with lower voltages.
The HD4830 is a very good card with more than double the performance. However, the HD4830 draws 85W while the 7950GT draws 65W.
Both the HD4830 and the 9800GT have better performance/watt than the 7950GT. But I still want to limit loads to 65W.
"Also, the cost of electricity is high where I live. Performance per watt is important to me -- though I did not adequately express this in the above blog."
Logically then you should evaluate the time you spend using the card vs not using the card and then find a card that has good idle and load power draw.
This is very very important, you could find that you should use a hybrid-power graphics solution (one that uses the onboard graphics when not in 3D mode), and an ~85 or 90 watt graphics card and still see less energy usage than you do now.
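Putting rough numbers on that idle-versus-load tradeoff (the hours and idle draws here are hypothetical illustrations; only the ~85-90W load class comes from the comment):

```python
def weekly_kwh(idle_w, load_w, idle_hours, load_hours):
    """Weekly energy use (kWh) given idle/load draw in watts and
    hours spent in each state."""
    return (idle_w * idle_hours + load_w * load_hours) / 1000.0

# Hypothetical: a 65W card idling at 40W, versus a 90W card that a
# hybrid-power scheme drops to 10W at idle. 50h idle, 10h gaming per week.
print(weekly_kwh(40, 65, 50, 10))  # 2.65 kWh
print(weekly_kwh(10, 90, 50, 10))  # 1.4 kWh -- the "bigger" card uses less
```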
"I think it ridiculous a graphics card would draw more power than the CPU"
Your logic is flawed, if it is doing more work there is a good reason that it is consuming more power. (some estimates show about 10 times the amount of calculation that the CPU does. Why are you so hard on it then?)
"Any PC that requires more than a 350W power supply is a workstation, not a PC."
There is no need to run more than a 350W power supply just for 10 to 20 watts more of draw.
". . . the frequency is also lower because the G94 chip is being used."
No, the G94 chip has only 64 Unified Shaders and is used exclusively in the 9600 series. There is no reason that the G94 should clock less than the G92, I see similar clocks on both the 9800GT and 9600GT series.
As I mention, perhaps your stringent requirements are best met with customization.
I have heard the power arguments before (in a different way, cars that use a lot of gas). Frankly I don't think that 20 watts more of electricity only when you are gaming is a lot.
Do you own a "Kill-a-Watt" device? Have you measured the actual energy usage of your computer? Is the 40-60% load accurate? Would an 85watt card actually draw only 10 more watts at the wall? What is the power consumption of your monitor and speakers?
Are you running compact fluorescent bulbs? In this country they cost $1 at a store and use 25% of the energy of a normal bulb.
I don't mind you having a preference, I am just very curious if you are basing your actions on assumptions or on hard data.
I notice that power supplies run on 220V are much more energy efficient; if your country doesn't supply 220V at the wall outlets, perhaps you can save a lot of money by putting just your PC on a 220V outlet.
I once spent $4300 for a PC with no monitor. It was the hottest thing going.
Six months later, a faster model, with monitor, was under $2500.
Lesson learned.
If a customer wants something super-hot, and is willing to pay for it, I'll build it, but, generally, I'll recommend something a couple of steps down from the top.
For most people, this puts us into the 4670-4830 range.
I think most people will have a flexibility in their price point that relates to the amount they're willing to spend on the card in the first place.
So, for example, if someone willing to pay $100 typically has $20 of flexibility, you might expect someone willing to pay $500 to have flexibility nearer to $100.
The flexibility histogram would therefore be expected to reasonably mimic the one for the purchase price.
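The model being proposed here amounts to flexibility as a roughly constant fraction of the intended spend (about 20% in the example figures above); a minimal sketch:

```python
def flexibility(price_point, fraction=0.20):
    """Model price flexibility as a fixed fraction of the planned spend."""
    return price_point * fraction

print(flexibility(100))  # 20.0 -- the $20 wiggle room at $100
print(flexibility(500))  # 100.0 -- scaling to about $100 at $500
```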
Although I would never buy one of the "halo" cards, I always enjoy reading your reviews of them. It's helpful to read that there's no point in buying anything faster than card X for resolutions of 1680 x 1050 in game Y and so on.
I'm willing to spend as high as $500 because I know I'll be using the card for 4+ years. But one thing I won't tolerate is for the graphics card to be engineered as a pig.
Three power sources (PCI-e + 2 power connectors) and 300 watts of power consumption? For a single graphics card? That is absurd. And don't tell me that it only uses that power when in 3D mode. With contemporary desktop environments your card is *always* doing 3D rendering. And don't get me started about the waste heat produced and the ever-more-extreme cooling methods required to remove that heat.
Fire up GPU-Z and you will see that your card switches to low speed/power when in desktop mode. Even in Vista with all the eye candy on, my 285 drops from a 700MHz core clock to something like 100MHz, with memory at 300MHz.
But I still agree that I'd like to see smaller and cooler cards.
My rigs are built as toys, and as personal rewards. As a result, I have little issue with spending extra to ensure they're top-notch. Looking forward I plan to do big upgrades every 2-3 years and little things (quiet fans, etc.) in between, assuming of course that it is financially doable.
I enjoy the wide variety of video cards available, and I also understand that some of it is due to weird circumstances in marketing. The only real issue I have with the current GPU landscape is the practice of renaming cards piecemeal. Renaming can make sense IF it makes the product lineup more coherent, but I don't think that is the case currently.
I'm really glad you did this posting; too bad it was only after the other two multi-GPU articles had come out. Oh well. I am really glad to see Anand focus on readers' input to integrate into articles, and go further than simple polling by also including a briefing to help explain the need for, and target of, the polling.
I generally buy cards at the $150 ± $40 point. It takes a great card for me to fork out $180, and a great bargain for $110.
If I had to get a new card, I'd probably get either a 9800GTX+ 512MB ($150 before rebate) or a 9800GT 1GB ($150 before rebate) on NVIDIA's side, or be split between a 4870 512MB ($165 before rebate) and a 4850 1GB ($162, no rebate) on ATI's side.
The 4870 1GB's are too much, and the 4870 X2 and GTX260's are well out of my range at this time.
Noise and power consumption are more important to me than either price or performance. I tend to buy the fastest passively cooled GPU available (excluding the occasional power-hungry GPU with crazy big passive cooling which would require me to beef up case cooling).
There are still interesting trade-offs, e.g., does this small increase in wattage give me a big increase in performance? Also, idle power consumption is often more relevant than max power consumption (I can cope with a bit of extra noise when playing a game).
The poll was hard for me to answer. I always run more than one machine. Usually I have one that's armed to the teeth with bleeding-edge hardware, and three others ranging from mainstream- to enthusiast-level gear.
Example: currently my main rig is a water-cooled QX9650 running at 3.6GHz 24/7, with a 4GHz profile in my BIOS mainly for Crysis, on an EVGA 790i. 8GB of DDR3-1600 linked and synced, two 80GB Intel MLC SSDs in RAID 0, and three recently installed EVGA GTX 285 FTWs on a 42" 1080p display.
OK, yes, I understand 90 percent of games get minimal gains, if any, at my resolution with these GPUs. For this rig I honestly care about maximum visuals and filtering, vsynced, in the most demanding games. So do I care if card number 3 loses me 15 frames over 2 cards when I can still lock my vsync? Who cares; they are frames my monitor never would have seen anyway. Now take a frame rate and drop it below my 60... then I start to care. On this rig, at least.
My last upgrade is an excellent example. I had three 8800GTXs. They basically played every game on the block maxed out with filtering at a factor of 8, save Crysis, until the release of Far Cry 2 (which could do max visuals but only 2x filtering and still vsync). So for good reason I didn't see much point, when the 280s hit a year ago, in buying new cards, as at the time only Crysis could cripple my system, and three 280s at stock still wouldn't vsync Crysis with any filtering. A stock 280 gives you maybe a 20% performance increase over an 8800GTX.
Only with Far Cry 2 and EVGA's new FTWs did my mind start to change. With such a high overclock, the EVGA 285s convinced me to upgrade. The first time I vsynced Crysis at max settings with 8x filtering, followed by the same in Far Cry 2, I stopped doubting my purchase. It was beautiful.
Now, all my other systems: I still like high frame rates, but it's much more about bang for the buck. A Phenom II @ 3.6GHz with 3870X2 + 3870 tri-fire on a 28" 1920x1200. A Q9300 at 3GHz, again with 3870s tri-fired, on a 52" 1080p doubling as an HTPC. I have no plans to upgrade those cards/rigs till DirectX 11. They are basically LAN machines for friends and family. Tri-fire for that series of card, at those resolutions, works well in 95% of games; maxed-out settings are possible with 4x to 8x filtering.
Generally, in terms of GPUs, unless there is a game presently able to drop my main rig below 60fps, I find I need to upgrade every other GPU generation.
Ultimately, I think every rig I build, I do so with a different budget and frame rate in mind... though the latter weighs heavily. No one likes playing at crap frame rates with bad visuals. On the same note, I don't need every rig I own to leave me gushing O-negative to feel satisfied with my gaming experience.
Indeed. I mean, I'm less extreme... but it's the same concept. Main system I want most games to work great, but I become even more value conscious with other systems. And in particular, if things changed from how they are right now, I'd be willing to upgrade my main computer's power supply to say a 700W supply and add a $300 video card say (not with the current games and the current offerings but if things changed). But there's no way, I mean NONE that I'd have my other computers killing a bunch of electricity for the two or three times a month someone might use it for gaming.
Other issues that are dear to me are noise level, hassle, and driver compatibility. I run Ubuntu from time to time, and SLI or CrossFire "scares me".
It's funny (or rather, sad?) and very revealing that the second question is also about price. Personally, I'm not extremely flexible on price, especially since a higher price generally lowers the perf/price ratio. There's a lot of money to be saved playing two-year-old games on two-year-old hardware.
I am willing to spend more for quality, durability, stability, and silence. The first three are barely ever mentioned in reviews. Noise, for some reason, has made it in. I've been repeatedly disappointed (stuff not working, becoming very noisy fast, bad service...) by Asus, which for some reason seems to have a sterling reputation with journalists, for example. I've never had a problem with ASRock, go figure.
All in all, I'm all about current yet fully established technology. I mean, my 8GB of DDR2 RAM suits me just fine; I don't need 12GB of DDR3 RAM for four times the cost. Even the very act of trying to be a computer snob leaves me open to "value-priced stuff," because what you just "had to have" last year is now available in an overall better package at a lower cost.
I think a healthy budget is one where you can afford to update your equipment at that price every 18 months or so. If your budget is $3000 for a desktop but you can only afford to get a new one every five years, then you really couldn't afford the $3000 desktop to begin with. Even as improvements have cooled somewhat because of decreased demand, I still think progress is quick. So you're better off updating frequently; even the resale value of your used equipment is maximized at that point, when it's still not quite obsolete.
I used to buy the $499+ cards but I've realized that I get just as much (or little) enjoyment out of my games using a $200 card.
I am, however, flexible enough that I'd spend 25% more if that translated into a >25% performance improvement.
I think it's ridiculous the way the market is flooded with very similar cards that are only a few FPS apart and are also priced <$10 apart. Does the world really need one $149 card, one at $155, another at $169, another at $172 etc? Surely if you can afford a $149 card you can also afford a $169 card - it's the same market segment IMO.
I would have preferred if ATI and especially Nvidia had fewer cards in their lineups.
$99 for new PC gamers just coming from integrated graphics
$199 for the mid-range/mainstream crowd (possibly also one at $149, but I personally feel that it's unnecessary - the $199 card would drop to the $149 - $169 price point soon enough anyway)
Agreed, but that's what makes these reviews so important. Options are good, especially if there's no clear winner, but having too many definitely complicates things.
I'm a $200 GPU consumer, but I'll go up or down $50 to find the sweet spot. I'm not sure how to answer the questions based on that. I think I said $150-200 with a $50 swing, but I don't think I'd ever buy a $100 video card.
My friend picked up a 9800 GT for ~$100 three months ago; I think it was a great buy. I would have done the same had I not just bought an 8800GT in April 2008 for $200!
How did you guys decide on the granularity for those two questions?
If I was targeting the highest figure in any of those ranges, or the lowest, or the middle, the maximum spread using the highest flexibility option is still just two brackets. Assuming people picked price points within the given ranges evenly, and given the answers people gave on their flexibility, you really didn't learn anything from asking the flexibility question.
After reading the question I came up with the answer +$100, but the highest option was $50!
Consider this: I was willing to spend $250 on a graphics card and when the 8800GTs were at $112, it was stupid not to buy one of those. It was well below my target price but the price/performance was so great, I would have been a moron to pick anything else. The people spending $20 less for 1/3rd the performance were also equally stupid. Are people really this inflexible when it comes to picking a graphics card?
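The bracket/flexibility point above can be sketched in code. The bracket boundaries and flexibility cap below are assumed to mirror the poll's $50 steps, not its exact options:

```python
# Assumed price brackets and flexibility cap, mirroring the poll's $50 steps.
brackets = [(50, 100), (100, 150), (150, 200), (200, 250), (250, 300)]

def brackets_reachable(target, flex, brackets):
    """Count how many price brackets the interval [target - flex, target + flex] overlaps."""
    lo, hi = target - flex, target + flex
    return sum(1 for a, b in brackets if lo < b and hi > a)

# A buyer at the top of the $150-200 bracket, with the maximum $50 flexibility,
# still only reaches two brackets, so the flexibility answers add little signal.
print(brackets_reachable(200, 50, brackets))  # 2
```

With steps this coarse, almost every combination of target and flexibility lands in the same two or three brackets, which is the commenter's objection.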
I skimmed the article.
Really, all I think you need is a brief line and then the two questions.
Also, the article itself can sometimes skew results.
"But we suspect that the majority of our readers, while interested in high end or even halo parts, will care much more about lower price points and bang for buck metrics."
You wouldn't think this would matter, but sometimes lines like this actually can affect people's answers to the questions.
I'm a fan of hooking my PC to my TV for gaming. I like big :-)
If I could afford a 2560x1600 monitor for personal use I'd totally be there, but I'll opt for a 50" 1080p TV over a smaller monitor.
I know, I know, I always complain about DPI... I'd like smaller pixels everywhere, but >2x AA becomes increasingly useful with larger, TV-sized pixels :-)
I think it is important to keep a distinction between regular computer users, gamers, and people doing 3D modeling, because the demands are completely different. I liked the way you separated things in your Christmas roundup; I'd like to see more of that, and more articles in the $100-150 range. Although I do a fair bit of gaming and will spend up to $250 on a card, most of my friends whom I would build computers for would not.
That would have been a good question to ask... perhaps in a future poll we'll try to determine how much of our readership are PC gamers, console gamers, both, or neither...
Thus far, when I have aimed at getting a new graphics card for gaming, my budget has landed somewhere in the $200-250 range on average. There are a couple of instances where I have gone with less because that's all I needed at the time (an HD4670 at 1280x1024 for Quake Wars?), but in general I aim for the kickers, since I like the pretty, but I don't want to have to upgrade every 6 months to keep it that way. Usually the $200-250 cards are good for at least a year and a half to two years at medium-quality textures or better at 1680x1050, with maybe 8xAF/2xAA. It varies from game to game, but you get the picture. However, there have been times when I aimed my budget +/- $50 from my usual price range, like I will probably be doing when I upgrade my video card soon. Assuming there isn't anything new out by the time I do it, I'm going to be picking up an HD4850X2 2GB. The card may cost $40 more, but the extra performance over, say, an HD4870 1GB is insane. Of course, at $8 an hour part time, that may be a little while, since I have other higher-priority items on my plate first. If what I'm hearing is true, the HD5xxx series is coming out this summer; maybe I will look into picking up something from that, depending on the performance gains per dollar the cards offer.
I tend to aim for £150 - £200 (UK Pricing), which works out to around $250. I'm quite flexible though, but I find that such pricing gives the best bang-for-the-buck; like the 1GB 8800GT Palit Overclocked card I've got now - runs 1920x1200 with no issues, but didn't break the bank.
For a particular level of performance (i.e. 60fps, 1920x1200, full eye-candy) what is the cheapest way to get there. This will be different for different games, as well as being unobtainium for some games.
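That criterion ("cheapest card that hits the target") is simple to state as code. The table below is entirely invented for illustration, not real benchmark data:

```python
# Hypothetical benchmark table: (card name, price in USD, average fps at the
# target settings, e.g. 1920x1200 with full eye-candy). All numbers are made up.
cards = [
    ("Card A", 130, 48),
    ("Card B", 180, 62),
    ("Card C", 260, 75),
    ("Card D", 330, 82),
]

def cheapest_meeting(cards, target_fps):
    """Return the cheapest card clearing the fps target, or None if it's unobtainium."""
    capable = [c for c in cards if c[2] >= target_fps]
    return min(capable, key=lambda c: c[1]) if capable else None

print(cheapest_meeting(cards, 60))  # ('Card B', 180, 62)
```

Run per game, this gives a different answer for each title, and `None` for the games where the target really is unobtainium.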
With what's out there today, I'd prefer to stay between $100-$150, +/- $15. Although, 1680x1050 is my resolution so I can afford to fall within that price segment. Had I a larger display, I would likely have to increase the amount I was willing to spend.
I like the article and the simplified take on the poll. Just a small thought, however: considering that the price difference between a cheap card and the highest-end cards is several hundred dollars, I think the $5 increments in the second question are a bit moot and should perhaps be something like +/- $10, $25, $50, $75, $100, and infinitely flexible.
I also like the idea of a poll on people's take on coupons and mail-in rebates and if they affect their purchase decision.
Agreed, I didn't know which price category to pick because often the after-rebate price of something I buy puts it into a lower category.
Another question helpful in differentiating customer needs would aim to sort by non-gamer, infrequent gamer, and frequent gamer, though I suppose the non-gamer these days has little reason to buy a video card at all except for features instead of performance.
I make a decent living, but in my opinion, no computer is really worth more than $500 (in the same way I feel no phone plan/internet plan is worth more than $20 per month).
Am I willing to spend a little bit more if it gets me a little more performance? Sure, but I definitely have limits. I would never spend more than $200 on a video card.
I just use the internet, play music and watch movies. If I had an actual reason for needing a good video card (such as having an interest in working with any kind of 3D modeling) then I would certainly feel differently.
I just consider myself a "regular" computer user, with no special needs other than having a picture come on my screen without any major slowdowns.
As an aside, I am a console gamer. With my salary, I simply can not afford to keep buying the new parts required to play the newest games.
"As an aside, I am a console gamer. With my salary, I simply can not afford to keep buying the new parts required to play the newest games." (quote function doesn't seem to be working)
I understand that there are valid reasons to stick with console gaming, but price isn't nearly as clear-cut as you imply. You said yourself that you are a regular user and $500 seems to be your target price. If you were wise in the purchase of said system, the cost of making it gameable is usually just the cost of a decent video card. If you spend ~$150 for a Radeon 4850 or GeForce 9800GTX(+), you can get some pretty stellar gaming on reasonable monitors. I consider 1680x1050 monitors reasonable, as the price jumps going to higher resolutions. Given your target price, I'd say you're looking at 1280x1024 or similar. If this is true, you could spend even less on a card like a Radeon 4670 and still get good results. Of course, you could also spend ~$250 on a GTX260+ or a Radeon 4870 1GB and push some modern games at max settings (sometimes even high AA levels) at resolutions up to 2560x1600. If you need memory, 4GB of DDR2-1066 can be had for less than $50 (less for slower modules).
Now compare this to the price of a console. Consider that consoles max out at the graphics capability of an X1900XT/7800GT (Xbox 360/PS3). Sure, the 360 utilizes three PowerPC cores for its CPU and the PS3 has the Cell processor. However, only two of the three PPC cores in the 360 really get used in games. Further, the power of the Cell currently only really shows its value in Blu-ray decode. By the time developers make good use of these (if they can while being graphically limited), quad cores will be extremely cheap and much of the CPU load (i.e. physics) will be shifted to the video card anyway.
Now think about the price of games. In my experience, big release titles are 1/5 to 1/3 more expensive for console releases than PC releases, even for the same title (examples: Fallout 3/CoD:World at War/FEAR 2 PC:$43/$47/$45 360:$57/$57/$57). So even if you decide to spend a little more for upgrades, depending on how much you game and how long between upgrades, you can still spend less overall.
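The per-title price gap above makes the break-even point easy to check. This sketch uses the ballpark game prices quoted in the comment; the $150 upgrade cost is an assumed mid-range card, not a figure from the comment:

```python
# Break-even: how many big-release titles before cheaper PC games pay for a GPU
# upgrade? Game prices are the rough figures cited above; the $150 upgrade cost
# is a hypothetical mid-range card.
pc_price, console_price = 45, 57
upgrade_cost = 150

savings_per_game = console_price - pc_price                  # $12 per title
games_to_break_even = -(-upgrade_cost // savings_per_game)   # ceiling division
print(games_to_break_even)  # 13
```

At roughly a dozen big releases, the upgrade pays for itself, which is why "how much you game" drives the comparison.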
I have a system based on an Athlon64 X2 5600+ with 2GB RAM and a Radeon 4850. This system allows me to max out C&C: Tiberium Wars, C&C: Red Alert, Stalker: SoC, and Stalker: Clear Sky (DX9, as I'm using Windows XP) on my 1680x1050 monitor (Far Cry 2 runs extremely well, too). I have a buddy with a single-core P4 2.8GHz (no HT), 1GB RAM, and a Radeon 3670 ($80 at the time). I can still play Stalker (both) with full dynamic lighting and some settings scaled back on his monitor (1440x900, I think). It still looks good and is a very enjoyable experience. I could name a whole lot of cheaper systems used in the small LAN parties I host that are perfectly adequate for modern games.
Note: We typically play Stalker, Tiberium Wars (tried Red Alert 3, but didn't like it as much), Far Cry (might switch to or add Far Cry 2 when I can convince some people that Far Cry isn't the end-all be-all), and even some old-school Deus Ex. We would still play Generals if it weren't for the endless "sync errors".
The point is, you only have to upgrade your system if you feel the upgrade in gameplay is worth the money. I could have just stuck with my X1900XT and got almost exactly the same gameplay experience as an XBOX 360 (save control differences). With newer games, I'd just adjust the settings to my system. It would still look as good as comparable games on a 360. The situation is slightly more complicated for the PS3 due to developers having a harder time extracting performance, but it's still effectively the same situation. The beauty of the PC is that, if I deem it worth the cost, I can move beyond the current capability of consoles.
I have an old AthlonXP based system with a Radeon 9700Pro that I use as a spare in LAN parties (We use it for Tiberium Wars mostly). It still plays many modern games at low levels of detail. Sure the eye candy isn't as nice as my newer systems, but it is nice to still be able to play a modern game on a system that's around the age of the original Xbox. Try playing a 360 game on the original XBOX.
Props and +1 point for that very detailed breakdown. See folks, this is the reason for the post-a-comment feature. Without it, people would read an article and just assume it's unbiased and exactly the way to go, until they read this guy's comment above! Thanks, bro!
I personally won't purchase a card over $200, realistically $200 + MIR. The video card industry has INFECTED PCs and now basically dominates/controls the industry. Note how PSU makers have directly catered to this ridiculous SLI crap. PSU prices have gone up, and power requirements, heat, noise, dust, blah blah have all increased because of POS "gaming" zombies... All that money spent on fanboy upgrades and zealotry could have paid for a trip overseas, certs, or a treat for yourself in real life vs. colors on a screen... lol. You noobs will learn when you get much older (some of you never will) that all that time and money was spent in VAIN on an isolated "experience!" Get off the tube/keyboard and save your cash.
BTW- Unplug yourselves and listen to Alex Jones INFOWARS: Distractions in the form of "entertainment" are keeping people stupid watching sports and "gaming," while people are robbing us all blind!
No offense to the original poster, but to say that you make a decent living, then follow it up with "can't afford PC gaming" is bizarre.
If you want your 360 to look even halfway decent, you must have a hi-def TV or monitor, so you could afford that. You can afford the more expensive games. What about the costs of Xbox Live service? And where can I pay less than $20 a month for decent high-speed internet access? Sign me up!
I've got an ancient Athlon 3700+ and an 8800GT. Right now I'm playing Fallout 3 at high detail with HDR lighting etc., and it looks great at 1280x1024 or whatever the default is on my monitor. The GT cost me $120. Before that, I had a 7600GT that cost me about $110. I upgrade my card every 18 months and my PC every four years. No, I'm not a stickler for ultra-high graphics, but I've played the 360 a lot and I'm not missing out on anything. In fact, many times I can customize the graphics to look better than the 360 version. Plus, I can log in to Steam and play TF2 for FREE anytime I want.
Console gaming is great, but it's not really any cheaper than PC gaming, unless you're talking about pushing resolutions on the PC version well beyond anything a console can handle. And if you do that, you're comparing apples to oranges.
So I would assume you don't have a mobile phone, and broadband is out of the question for you, as these are generally >$20. For your needs, it doesn't even look like you need a computer. Get a $200 netbook to surf the net, watch movies on your TV, and use a stereo to listen to music.
That's a popular point: for most people there is no reason to spend so much on a computer to do just regular tasks. This is why netbooks are emerging. (I wish they had netbooks when I started university four years ago.)
For some people, however, who do very intensive computing for their job or for fun, and for whom time is money, spending the extra few hundred dollars, or even a thousand, can be worthwhile if it saves them an hour a day, every day. And if not money, then it buys time that could be spent doing other things.
I guess this validates AMD's strategy of focusing on winning the mid-range ($150-$199) and performance ($200-$300) markets since that's where the customers are. Any breakthrough in the high-end would be a bonus given the small amount of capital put in up front. (A dual die HD4870 X2 is cheaper to come up with than a mega chip).
It validates the strategy if their only concern is the subset of customers who respond to polls on AnandTech ...
There is a bigger market out there, and we've always believed that our readership are smart shoppers interested in lots of performance for the money they put in. This sort of helps confirm our thoughts about that, but we are also not randomly sampling, and there could be some correlation between people who tend to respond to polls and the answers we've gotten ... which we can't evaluate without random polling.
We actually do have the ability to randomly poll people with popup polls, but people have been down on that idea, and we still have the problem that we are limited to people who choose to respond to a poll even if they are randomly presented with a popup.
We would expect a similar poll on an overclocking website to reflect higher spending and higher flexibility, or more of a focus on absolute performance.
We would likewise expect a poll on a more general, consumer-focused computer tech site to show that value parts (<$50, or maybe $50-$100) have the highest demand.
This data is not generalizable... but, for us and our readers, this information is very valuable.
I suspect a poll conducted by a third-party consumer research company would give results suggesting that most people don't buy a discrete graphics-card themselves, and that those who do are most likely to buy a card in the US$50-99 range (roughly the cost of two games), with those who also indicate a significant interest in gaming more likely to spend in the US$100-149 or US$150-199 region on a graphics-card (three or four games).
People who visit sites like AT are not normal. They're technology enthusiasts at least, and PC hardware fanatics quite often. Therefore any poll run here can only ever give the views of people who already visit the site, which is certainly not the general public.
I don't think many people these days would actually go to the trouble of buying and installing (or having installed for them) a sub US$50 graphics card, as the time or money spent actually putting it in and setting it up would probably exceed the value of the card. Besides, whilst even sub US$50 cards are many times faster than onboard graphics, even Intel chipsets like the X3000 are quite capable of satisfying the needs of people who enjoy Sims 2 and other less graphically-demanding games (I've been playing Civ 4 a lot lately and I certainly don't need an 8800GTS for that).
The best way to find out what most people in the real world spend on a graphics card would be for Anand to use his connections with the many mobo companies (who also sell lots of graphics cards with both NVIDIA and AMD GPUs) and ask them for a rough breakdown of sales across each sector. I'm sure if he promised not to reveal his sources, he'd be able to obtain relative figures for the volumes shipped in each segment (and possibly even a split between their NVIDIA- and AMD-based sales). We know these companies are often willing to spill the beans on upcoming stuff, and even pass on the odd unreleased CPU from Intel or AMD to the likes of AT (what they get in return for this is never made clear...), so I'm sure they could say "x% of products shipped are in this range, x% in that, x% there, and x% above this amount."
I modified the second question to be a little clearer: "value" was changed to VALUE and the (perf/$) note was added.
We aren't looking at how flexible you are to get the best performance (that should be taken into account in the first question); we want to know how flexible you are to get the best perf/$.
342 people had responded before this change was pushed live. I apologize for any inconvenience or misunderstanding.
I normally buy cards from the $150-200 range at $100-150, because I wait for good sales with coupons and mail-in rebates. I think there should be a question about how much we weigh market price versus the lowest price we can find on bargain websites and such.
Same here. I'll wait for a good deal before purchasing a video card. I have the time to wait, since I don't play games very often but do like having a good card when I need it. I got my 9800GX2 a year or so ago for $300 and it's still doing very well for my needs.
That's pretty much what I do. I nabbed a GTX 260 Superclocked last summer for $220 after promo code, free shipping, and a mail-in rebate. Eight months later I still don't regret it, and that's how I've begun to conduct most of my other PC-related purchases.
To answer your question, the only time I give the bundle any consideration is when it includes a game I already wanted to purchase. Unfortunately that's very rare, so it almost never becomes a factor in my GPU purchases.
Unique programs like EVGA's Voltage tuner give value, but specific warranty length, options, and provisions are usually the biggest deciding factor beyond price for me personally.
If the GPU already had a very good quality aftermarket cooler pre-installed would this count as part of the bundle? I would factor any very good factory cooler upgrade into the price equation as it saves me from buying one myself. Otherwise I don't end up factoring the bundle into the original price.
To further muddle the issue, things like combos would factor in if I already was planning to buy the hardware. For example the Intel X25-M and Core i7 920 "combo" on Newegg gives an instant $60 off the combined total... I've been incredibly tempted by that because both items are on my future shopping list already and those items don't deal in rebates otherwise.
Hrel - Tuesday, March 3, 2009 - link
So yeah, you should definitely consider the issues of noise and power consumption; and when you're talking about whole-system noise, you need to be concerned with heat as well. I'm one of those who believes you shouldn't be able to hear your computer AT ALL unless you get down and put your ear to it. Heavy gaming sessions excluded, as the GPU fan obviously needs to spin up; but it should spin down quickly when you're done... so I can get to bed easily :)
Rigan - Monday, March 2, 2009 - link
One reason I read this site is the careful attention paid to price and performance. I'm in the 1% "price doesn't matter to me" camp, but nearly everyone I advise on PC parts cares about cost quite a bit. Here I can find a proper vendor-agnostic test set. An important service indeed.

Joe Schmoe - Monday, March 2, 2009 - link
I'm usually in the $150 range when I purchase a video card. I had a Sapphire ATI X1950 Pro, which replaced a Guilmont 6600GT. Neither of those cards was "top of the line," but they were in the top 20%. I never ran them at max resolution (I just ran 1024x768) and I wasn't afraid to turn antialiasing off if I could keep detail high. I had a 19" Trinitron CRT.
When I moved to a 22" LCD I upgraded my video card to an MSI 8800GT 512MB OC. I paid $120 for it before rebate. It's faster than my X1950 Pro, but about the same if I run games at 1680x1050, which I do. I looked into upgrading to a better card this year, but none of them are that much faster at this resolution. This is a case where I bought a cheaper video card than usual, since the performance difference wasn't that great. I took my video card budget for the year and replaced my E6400 with a Q6600 instead.
wheel - Monday, March 2, 2009 - link
I hope AnandTech takes notice of the results of this survey. Most people are after midrange, "bang for your buck" parts and don't really care about the expensive top end. Sure, it is interesting to see the technology and what will filter down, but it's mostly irrelevant. Why bother with ongoing multi-GPU SLI scaling articles when the Steam hardware survey shows that 97.8% of gamers use single-GPU systems (as of Feb 2009)?
I find that the people who spend massive dollars on SLI systems are either excessively wealthy or new to the scene and haven't been through the several generational cycles that turn their new $450 card into a paperweight within 2 years.
The SLI equation rarely adds up. Although it is good in theory to add a second card later down the track, by the time you need it a new next-gen card is released that is just as fast as your SLI system but at 1/3rd of the power and much less noise and heat.
Mid range volume sellers like the 9800pro, 6800GT, 8800GT, 4850... now that is where it's at!
Cheers :)
vol7ron - Monday, March 2, 2009 - link
Even though parts are made upgradable, I tend to buy at the mid-to-high end and replace every 2-3 years, in line with Moore's Law. Parts are made upgradable, but I rarely upgrade them. By the time you buy a new CPU, you need a new mobo, and then you find out there's new RAM, yada yada. The GPU and the HD are the parts that generally last longer than 2-3 years.
yacoub - Sunday, March 1, 2009 - link
The biggest problem these days is that after 2007/2008's more properly priced, bang-for-the-buck GPUs, we're back to 2006/2007, when things were overpriced. Especially in this economy, but even in a better economy like we saw in 2007/2008, one should be able to buy a very good, latest-generation GPU for $200-250: something that can handle the usual assortment of the latest FPS games at 20-24" monitor resolutions with high-quality settings without dipping into slowdown levels of frames per second, and without needing a top-of-the-line $500 CPU either.
2007/2008 was the generation of the Core 2 Duo + 7900GT (and later the 8800GT), among several other good combos. We are only just now starting to see GTX 260 Core 216 GPUs in the right price range, and they are questionably the latest generation, what with the 285/295 now being on the market.
anartik - Sunday, March 1, 2009 - link
WOW… I'm not easily shocked, and I'm shocked at the results. Granted, times are tough, but people (responding) are much cheaper than I would have ever guessed. I guess that partially explains why the market is so flooded with low-end junk. Even with devaluation of the dollar and inflation, prices are more restrained than ever. Just a few years ago a just-released X1900XT cost me $500 and was a price/performance deal compared to the in-short-supply $800 GeForce (7800?). Just about a year ago I picked up a bargain 8800 GTX at $350 (right when the 8800 GT and then the 9000s started appearing). In perspective, the $334 GTX285 I'm thinking about buying is an outright price/performance steal.

To each his own, but I think the results are skewed by a much wider audience outside of the gamer, enthusiast, system builder, and fanatic overclocker crowd one would normally associate with a hardware site like this. I fall somewhere in the enthusiast/overclocker/gamer crowd, and I look to build a good balance of price and performance. Granted, I have a fair amount of disposable income (I have 4 kids, 2 dogs, 1 cat, and a wife that does not work), but I have more brains than I have money. I do look at reviews for something to give me the best price/performance for gaming (and other uses) on a 26" monitor at 1920x1200. I see a whole lot of people trying to put cheap lipstick on a pig in the comments. Conversely, I always get a good laugh out of people wanting to run high-end cards and/or SLI on small, low-resolution monitors.
yacoub - Monday, March 2, 2009 - link
"WOW… I'm not easily shocked and I'm shocked at the results. Granted times are tough but people (responding) are much cheaper than I would have ever guessed."

No, we gamer/enthusiasts are a number of things, but we're not "cheap." We're price-conscious, we're bang-for-the-buck oriented, we're smart with how we spend our money, and we're (mostly) adults who have the patience and discipline to resist being suckered into dropping big bills on a GPU that's really not worth it.
You just have a skewed perception that is now being corrected by reality, and it was skewed by spending time on forums populated primarily by the handfuls of people who actually spend their money on ultra high-end GPUs (and often do so every 6 months, and do so specifically for the sake of forum postings and signature stats).
Most people (talking middle-class, normal gamer/enthusiast folks) do not drop $500 on a GPU for their computer. They understand the value of $500 better than some kid who doesn't pay for his housing, insurance, etc., and would not be so quick to drop that kind of money on a computer except in rare circumstances.
And the reality is the GPU market has changed to go after the higher-end pricing schema because there have been an increasing number of suckers that make such a schema more lucrative where it never was quite so lucrative in the past.
Realistically $200-300 is the range people expect for a near-top-of-the-line GPU for gaming on your average 20-24" widescreen monitor at High graphical game settings. There should only be one single-card offering of a given generation priced higher than that, at around $450-500, which is for the people who think they need it, or who run 30" displays, or who really want to run Ultra High settings at high resolutions at playable framerates.
Everyone else is in the sub-$300 GPU market, and that's where the big money has traditionally been made.

And there are a lot more caveats and things I'd like to say, but this is long enough.
superkdogg - Sunday, March 1, 2009 - link
With young kids, my discretionary income is limited, but I can usually scare up a sweet deal. I got my 9600GT for $75 AR, and sometime soon I will end up getting another one for less than that. Ending up with a pair of 9600GTs for around $125, while having graphics performance that is good enough for me for about 3-4 years, is value in my book.
Leyawiin - Sunday, March 1, 2009 - link
My current card is an 8800 GTX that cost $550 in January 2007. It was a bit hard to swallow at the time, but I'm still happily using it on a 22-inch monitor and it provides performance comparable to today's mid-range cards (HD 4850/9800 GTX). That's pretty good longevity and value, especially considering I'll get one more year out of it before my new PC build ($180 a year; many people spend that much upgrading mid-range cards over three years).

Hopefully the "next big thing" will hit about the time I'm ready for my new build.
cookEgawd - Sunday, March 1, 2009 - link
When I buy a new video card, I usually shoot for under $200, but will go up to $250 if pressed.

But I don't really care much about *today's* price when I look at reviews (which is what we're talking about here, right?). It's all relative, and I'm very rarely ready to buy when you put out a review. So if I'm looking at a bunch of cards compared by price on Newegg, I want to be able to come to AnandTech, look at a page, and see the performance of those cards compared to each other. Tom's tries to do that but doesn't do it well. THEN I want to be able to drill down and see the details. It's important to have generic articles on the tech behind each card and what it is that makes it a good card. Periodic scorecards on manufacturers would give some idea of reliability and build quality.
So I'm looking for a site that gives me the tools I need to make a smart purchasing decision when I'm ready to buy. I like to keep up with the latest kit as much as the next guy, but when I'm buying, I want fast, reliable information on the cards I'm looking at. Build your reviews and benchmarks for longevity, and I'll be happy.
StormyParis - Sunday, March 1, 2009 - link
It would help to ask a few more profiling questions, such as yearly hardware budget, number of PCs, or total RAM of the main PC... any indication of the level of geekness of the respondent.

You may be surprised by how many non-computer-nerds your audience has, due to the democratisation of the whole PC scene and the lack of good web sites.
Tiamat - Sunday, March 1, 2009 - link
GPUs I look at usually MSRP for $200; I go ±$50 based on the ease of implementing a passive cooling (or semi-active, <19dB) solution. Several years ago it was the ATI Radeon 9800 Pro 256MB; just recently it was the ATI Radeon HD4870 1GB for $230. I paid the premium to ensure I got the 4-phase power design, for example. Generally, I wait a while to make the purchase (the HD4870 has been out for a long time) to let the market fight the price/performance "battle" for me.

superccs - Sunday, March 1, 2009 - link

I was thinking you could have acquired more information about relative graphics card spending by asking a similar set of questions.

1) How much do you spend when upgrading the main components of your system (mainboard, CPU, GPU, RAM)?
2) What proportion of that upgrade cost is dedicated to GPU?
Great work none-the-less.
Zak - Sunday, March 1, 2009 - link
I bought my last 285 cards because they were coming with CoD: World at War, which I wanted to buy anyway. That's $40-50 right there. (Incidentally, I got FC2 too.) So if I had to choose between three cards of the same speed/price (one without extras, one with a game I own or don't want, and one with a game I want), I'd definitely go with the third one. I'd also take a free game over a mail-in rebate.

Z.
ET - Sunday, March 1, 2009 - link
I've had some luck winning graphics cards in recent years, which is my favourite way of acquiring them. Before that I bought a couple of used ones (sub-$100, but they were the previous generation's high end), and before that I typically bought low end (sub-$100), but that was 10 years ago.

Assuming I don't win another card, though I voted $100-$149, I think sub-$100 is pretty decent, and if I had to buy now I'd probably go for a Radeon 4670. I want something that'd run games at 1600x1200, since that's my monitor's resolution, but I'm willing to do without FSAA and other eye candy if need be, and 30fps is typically enough for me (even less, for games that aren't action games). The programmer in me likes to have the latest tech (DX10.1, ...) even if I don't use it. I also prefer lower-power cards, if possible.
I'll certainly be tempted to buy a DX11 card when they come out, but will wait for something like the 4670 for that gen.
gochichi - Sunday, March 1, 2009 - link
I tend to want some decent performance, but really, these days an HD 4670 makes a compelling case for a second or third computer.

I guess what I'm saying is that value is important when you're talking about multiples. I would have "never" really considered having more than one up-to-date "gaming" rig, but with prices falling for a performance level that I think is quite good, I don't see why one should be limited to a single expensive card when there may be more value in having a LAN party ready to happen at all times.

So, given the state of things right now, I think it's really difficult to come up with a reason to spend over $150.00, while if I needed to in order to play the best games (not just the most demanding games, the BEST ones), I could certainly go to, say, $250.00. I for one don't think prices are going to go that high for regular "upper mainstream" needs.
Now that we've tasted affordability, I think a lot of us aren't going back.
Also, I think more than ever, my very highest concern in terms of cost is energy efficiency. Why? Because I've done away with building my own computers; I buy Dells now, and they only come with 300-350W power supplies that I don't want to change. So basically I don't want the hassle of a pricey card. When I look at the current $250+ offerings, I quite simply don't want them at ANY price. Literally, if I could buy a double-slot card that needs a 550W power supply for $50.00, I would pass.

I currently have an HD 4850 and I simply don't see the use of more. In fact, the HD 3870 it replaced was plenty fast; I simply couldn't pass up the performance-per-dollar boost of the 4850. So it's not a cut-and-dried thing.

Some products come around and they are compelling. For example, the 9800GT 512MB edition was hard not to want: it was $200.00, and it was cool-running and energy efficient. So again, the dollar amount of the pricier cards is only one of the reasons I don't want them... most of them (just look at the size of the cooling and the power draw) are no more than overpriced prototypes.

What I am currently MOST interested in is a better half-height (low-profile) offering that doesn't need a six-pin power connector and the like. I am basically waiting for an HD4670 level of performance in a half-height card, but it doesn't look like it's going to come. I would pay $150 for such a card even though the performance per dollar would be way low, because it would fit my very nice slim desktop (the extra computer).

So, to beat a dead horse, I don't think I'm alone on this. We want a compelling product FIRST AND FOREMOST, and if it's compelling enough, many of us will come up with the money so long as it's not that much money (the difference between $70 and $150 is over 100%, but it's still a reasonable cost).

So... as long as there are $130 offerings with better power envelopes and sufficient performance, there is no way, no how, that I'll spend more for a less appealing product. I want to support research and development with my dollars, not marketing hype...
nubie - Sunday, March 1, 2009 - link
I voted for the $50-99 segment (and why not? You can get anything from an 8600GTS all the way to a 9800GT in that price segment, including the ~$70-on-eBay 9600GT smack dab in the middle of it), with a +/- $50 tacked on there.
Buuuut, if I had any money at all I would probably have shot into the $150-200 segment, again with $50 leeway.
I base the money I am willing to spend on what I ask the card to do, namely I ask my card to run dual 1280x1024 screens in a stereoscopic 3D setup (one for the left eye, one for the right). Thus I have decided that a 9600GT (or a decent 8800/9800 series card with 512MB if it falls under $80) is probably the best choice considering I have no money coming on a regular basis.
If I were to move up to dual 1680x1050 or a single 2560x1600, then I would start looking up the chain at a ~$100-250 card. I don't think that 10% better FPS is worth 200% of the price, so I tend to find the sweet spot that balances my system with my card and offers the very best value for money.
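That sweet-spot reasoning can be sketched in a few lines. The FPS and price numbers below are made up purely for illustration (not benchmark data); the idea is to discard cards that miss a playability floor, then rank the rest by frames per dollar:

```python
# Hypothetical (name, average FPS, price) tuples -- illustration only.
cards = [
    ("budget",   55, 80),
    ("midrange", 70, 150),
    ("high-end", 77, 450),  # ~10% faster than midrange at 3x the price
]

TARGET_FPS = 60  # playability floor; below this, cheapness doesn't help

# Keep only cards that clear the floor, then maximize FPS per dollar.
playable = [c for c in cards if c[1] >= TARGET_FPS]
best = max(playable, key=lambda c: c[1] / c[2])

for name, fps, price in cards:
    print(f"{name:>8}: {fps / price:.3f} FPS per dollar")
print("sweet spot:", best[0])  # -> midrange
```

With these invented numbers, the high-end card's extra 10% of FPS costs triple the money, so it loses on value despite winning on raw speed.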
ClagMaster - Saturday, February 28, 2009 - link
I purchase a new graphics card every 2-3 years to upgrade my PC. It is easy to sink a lot of money into a graphics card replacement unless discipline and restraint are exercised.

What is important to me when I consider a replacement graphics card are the following criteria, which I strictly abide by:
1) The existing graphics card must be in service for at least 2 years before replacement.
2) The replacement card must deliver at least double the performance and double the memory for the same or less cost than the original graphics card.
3) The replacement graphics card may cost between $150 and $200.
4) The replacement graphics card's power draw must be 55W to 65W at full load, and it should preferably be a single-slot card.
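The four criteria above amount to a simple yes/no filter. A minimal sketch, using hypothetical performance and spec numbers rather than real benchmark data:

```python
# Baseline: a 7950GT-class card -- "perf" is an arbitrary index, not real data.
old = {"perf": 100, "mem_mb": 512}

def qualifies(new, years_in_service):
    """Apply the four replacement criteria as a single check."""
    return (years_in_service >= 2                    # 1) old card served 2+ years
            and new["perf"] >= 2 * old["perf"]       # 2) double the performance...
            and new["mem_mb"] >= 2 * old["mem_mb"]   #    ...and double the memory
            and 150 <= new["price"] <= 200           # 3) $150-$200 price band
            and new["load_w"] <= 65)                 # 4) at most 65W at full load

# A 9800GT-like candidate: passes on performance, memory, and price,
# but fails on power draw (spec numbers are illustrative).
candidate = {"perf": 210, "mem_mb": 1024, "price": 180, "load_w": 83}
print(qualifies(candidate, years_in_service=2))  # -> False
```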
I am waiting for a suitable replacement for the 7950GT this summer. The nVidia 9600GT does not quite meet the double the performance criteria while it meets my power, memory and cost criteria. The nVidia 9800GT meets the double performance, memory and cost criteria, but fails the power draw criteria. To me, any card drawing more than 65W at load requires too much power.
When GPUs built on a 45nm process are released this summer, graphics cards worthy of consideration will follow.
nubie - Sunday, March 1, 2009 - link
I assume you have heard of the 9800 Green Power? It does not require an external power connection (all power comes through the PCI-E slot).

I beg to differ on your "double performance" bashing of the 9600GT: by pure numbers it has much more than double the performance in any modern game (i.e., one using shaders, like any released in the last 2 years).
The 7900 series (which I love; my main card is a 650MHz 7900GS) has 20 or 24 pixel shaders, 7 or 8 vertex shaders, and 1400MHz GDDR3.

The 9600GT has 64 unified shaders, is stock-clocked at 675MHz, and has 2000MHz GDDR3.
I had a G92 and a G80 card and sold them both. Why? Because the older games I was into actually showed a decrease in FPS from my 7900GS. BUT! I recently picked up an 8600 GTS (2000MHz, 128-bit bus, ~$40 on eBay), and it really is much, much faster in modern shader-laden games such as GRID. I would say that since the 8600GTS is about half a 9600GT, and the 8600 will solidly trounce a 7900 card, you can safely get a 9600GT. (Why you don't just get an HD4830 is beyond me; it sounds like your perfect card.)

I think you may be a little too picky with the "65W is too much power" argument; after all, CPUs have drawn more power than that for years, and did you bitch about that? And the GPU is doing so much more work, for crying out loud.
Not to knock being picky, I am unbelievably picky, but I channel it into action and modify an existing solution.
Might I suggest a 9800GT with a volt-mod to reduce the power, and clock it back a little? Although if you wait a little bit the factory will release the Green Power model for you.
I don't know why you need a single-slot card either. Do you demand that your CPU function with a 1U cooler on it? No, of course not; it would be silly. So why force your GPU to labor under heat it can't shed with a single-slot cooler?
ClagMaster - Sunday, March 1, 2009 - link
Getting value from a graphics card is about more than performance per dollar at the time of purchase. It also means using the card for a certain period of time to recoup your money's worth. Depreciation on graphics cards is worse than on automobiles; that's why I will keep a graphics card for 2 years before I replace it.

Also, the cost of electricity is high where I live. Performance per watt is important to me, though I did not adequately express this in the blog above.
I am also picky about the power load because I think it is ridiculous for a graphics card to draw more power than the CPU. Any PC that requires more than a 350W power supply is a workstation, not a PC.

I currently use an E6600, which draws 65W. I use a 500W Seasonic power supply, which could easily power graphics cards more powerful than the 7950GT (65W), cards in the 100W to 130W class. However, I use 500W because power supplies are most efficient between 40% and 60% of max load, which works out to about 250W for my rig at max load.
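That sizing rule is easy to check with a quick calculation. Assuming, as above, that efficiency peaks when load sits at 40-60% of capacity, the ideal capacity range for a given full-load draw is:

```python
def psu_capacity_range(full_load_watts):
    """If load should be 40-60% of capacity, capacity = load/0.6 .. load/0.4."""
    return full_load_watts / 0.6, full_load_watts / 0.4

lo, hi = psu_capacity_range(250)  # ~250W rig at max load, as described above
print(f"ideal PSU capacity: {lo:.0f}W to {hi:.0f}W")  # -> 417W to 625W
```

A 500W unit does indeed land inside that ~417-625W band for a 250W rig.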
No, I have not heard of the 9800 Green Power. Where can I find power draw and performance figures for this card? It may operate at lower voltages, but the frequency is also lower because the G94 chip is being used.

The 9800GT draws 83W, which exceeds my criterion. It is possible to reduce voltages, but not by much. It's conceivable that 75W could be achieved with lower voltages.
The HD4830 is a very good card with more than double the performance. However, the HD4830 draws 85W while the 7950GT draws 65W.
Both the HD4830 and the 9800GT have better performance/watt than the 7950GT. But I still want to limit loads to 65W.
nubie - Sunday, March 1, 2009 - link
"Also, the cost of electricity is high where I live. Performance / watt is important to me -- though I did not adequate express this in the above blog."

Logically, then, you should evaluate the time you spend using the card versus not using it, and find a card with good idle and load power draw.
This is very, very important. You could find that you should use a hybrid-power graphics solution (one that uses onboard graphics when not in 3D mode) and an ~85 or 90 watt graphics card, and still see less energy usage than you do now.
"I think it ridiculous a graphics card would draw more power than the CPU"
Your logic is flawed: if the GPU is doing more work, there is a good reason it is consuming more power. (Some estimates put it at about 10 times the amount of calculation the CPU does. Why are you so hard on it, then?)
"Any PC that requires more than a 350W power supply is a workstation, not a PC."
There is no need to step up beyond a 350W power supply for 10 to 20 more watts of draw.
". . . the frequency is also lower because the G94 chip is being used."
No: the G94 chip has only 64 unified shaders and is used exclusively in the 9600 series. There is no reason the G94 should clock lower than the G92; I see similar clocks on both the 9800GT and 9600GT series.

As I mentioned, perhaps your stringent requirements are best met with customization.

I have heard the power arguments before (in a different form: cars that use a lot of gas). Frankly, I don't think 20 more watts of electricity, only while you are gaming, is a lot.
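A back-of-envelope number supports that. Assuming (hypothetically) 3 hours of gaming a day and an electricity rate of $0.12/kWh, the 20 extra watts cost:

```python
extra_watts = 20        # additional draw while gaming
hours_per_day = 3       # assumed gaming time -- adjust to taste
rate_per_kwh = 0.12     # assumed electricity rate in $/kWh

kwh_per_year = extra_watts * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")
# -> 21.9 kWh/year -> $2.63/year
```

Only a couple of dollars a year under those assumptions, which is also why idle draw matters far more than load draw for a machine that is on all day.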
Do you own a "Kill-a-Watt" device? Have you measured the actual energy usage of your computer? Is the 40-60% load figure accurate? Would an 85W card actually draw only 10 more watts at the wall? What is the power consumption of your monitor and speakers?

Are you running compact fluorescent bulbs? In this country they cost $1 at a store and use 25% of the energy of a normal bulb.
I don't mind you having a preference, I am just very curious if you are basing your actions on assumptions or on hard data.
I notice that power supplies run on 220V are much more efficient; if your country doesn't supply 220V at the wall outlets, perhaps you can save a lot of money by putting just your PC on a 220V outlet.
JohnMD1022 - Saturday, February 28, 2009 - link
I once spent $4300 for a PC with no monitor. It was the hottest thing going.

Six months later, a faster model, with monitor, was under $2500.
Lesson learned.
If a customer wants something super-hot, and is willing to pay for it, I'll build it, but, generally, I'll recommend something a couple of steps down from the top.
For most people, this puts us into the 4670-4830 range.
Excellent value for a few bucks.
Just an opinion.
blowfish - Saturday, February 28, 2009 - link
I think most people will have a flexibility in their price point that relates to the amount they're willing to spend on the card in the first place.

So, for example, if someone willing to pay $100 typically has $20 of flexibility, you might expect someone willing to pay $500 to have flexibility nearer to $100.
The flexibility histogram would therefore be expected to reasonably mimic the one for the purchase price.
Although I would never buy one of the "halo" cards, I always enjoy reading your reviews of them. It's helpful to read that there's no point in buying anything faster than card X for resolutions of 1680 x 1050 in game Y and so on.
swsnyder - Saturday, February 28, 2009 - link
I'm willing to spend as much as $500 because I know I'll be using the card for 4+ years. But one thing I won't tolerate is a graphics card engineered as a pig.

Three power sources (PCI-E slot + 2 power connectors) and 300 watts of power consumption? For a single graphics card? That is absurd. And don't tell me it only uses that power in 3D mode: with contemporary desktop environments, your card is *always* doing 3D rendering. And don't get me started on the waste heat produced and the ever-more-extreme cooling methods required to remove it.
I won't buy a pig at any price.
Zak - Sunday, March 1, 2009 - link
Fire up GPU-Z and you will see that your card switches to low speed/power when in desktop mode. Even in Vista with all the eye candy on, my 285 drops to something like a 100MHz core clock (from 700MHz) and 300MHz memory.

But I still agree that I'd like to see smaller and cooler cards.
Z.
alkalinetaupehat - Sunday, March 1, 2009 - link
My rigs are built as toys, and as personal rewards. As a result, I have little issue with spending extra to ensure they're top-notch. Looking forward, I plan to do big upgrades every 2-3 years and little things (quiet fans, etc.) in between, assuming of course that it is financially doable.

I enjoy the wide variety of video cards available, and I also understand that some of it is due to weird circumstances in marketing. The only real issue I have with the current GPU landscape is the practice of renaming cards piecemeal. It can make sense to rename cards IF it makes the product landscape more sensible, but I don't think that is the case currently.
Razorbladehaze - Saturday, February 28, 2009 - link
I'm really glad you did this posting; too bad it was only after the other two multi-GPU articles had come out. Oh well. I am really glad to see Anand focus on readers' input to integrate into articles, going further than simple polling by also including a briefing that explains the need for, and target of, the polling.

Demon-Xanth - Saturday, February 28, 2009 - link

I generally buy cards at the $150 +/- $40 point. It takes a great card for me to fork out $180, and a great bargain for $110.

If I had to get a new card, I'd probably get either a 9800GTX+ 512MB ($150 before rebate) or a 9800GT 1GB ($150 before rebate) on nVidia's side, or be split between a 4870 512MB ($165 before rebate) and a 4850 1GB ($162, no rebate) on ATI's side.
The 4870 1GB's are too much, and the 4870 X2 and GTX260's are well out of my range at this time.
lplatypus - Saturday, February 28, 2009 - link
Noise and power consumption are more important to me than either price or performance. I tend to buy the fastest passively cooled GPU available (excluding the occasional power-hungry GPU with a crazy-big passive cooler that would require me to beef up case cooling).

lplatypus - Saturday, February 28, 2009 - link

There are still interesting trade-offs, e.g., does this small increase in wattage give me a big increase in performance? Also, idle power consumption is often more relevant than max power consumption (I can cope with a bit of extra noise when playing a game).

W4R - Saturday, February 28, 2009 - link
The poll was hard for me to answer. I always run more than one machine. Usually I have one that's armed to the teeth with bleeding-edge hardware, and three others ranging from mainstream- to enthusiast-level gear.

Example: currently my main rig is a water-cooled QX9650 running at 3.6GHz 24/7 (with a 4GHz profile in my BIOS, mainly for Crysis) on an EVGA 790i, with 8 gigs of DDR3-1600 linked and synced, 2 Intel 80GB MLC SSDs in RAID 0, and 3 recently installed EVGA GTX 285 FTWs driving a 1080p 42" display.

OK, yes, I understand that 90 percent of games get minimal gains, if any, at my resolution with these GPUs. For this rig I honestly care about maximum visuals and filtering, vsynced, in the most demanding games. So do I care if card number 3 loses me 15 frames over 2 cards when I can still hold my vsync? Who cares; they are frames my monitor never would have seen anyway. Now drop the frame rate below my 60... then I start to care. On this rig, at least.

My last upgrade is an excellent example. I had 3 8800GTXs. They basically played every game on the block maxed out with filtering at a factor of 8, save Crysis, until the release of Far Cry 2 (where they could do max visuals but only 2x filtering and still vsync). So, for good reason, I didn't see much point when the 280s hit a year ago: at the time only Crysis could cripple my system, and 3 280s at stock still wouldn't vsync Crysis with any filtering. A stock 280 gives you something like a 20% performance increase over an 8800GTX.

Only with Far Cry 2 and EVGA's new FTWs did my mind start to change. With such a high overclock, the EVGA 285s convinced me to upgrade. The first time I vsynced Crysis at max settings and 8x filtering, followed by the same in Far Cry 2, I stopped doubting my purchase. It was beautiful.

Now, all my other systems... I still like high frame rates, but it's much more about bang for the buck: a Phenom II @ 3.6GHz with 3870X2 + 3870 tri-fire on a 28" 1920x1200, and a Q9300 at 3GHz, again with 3870s tri-fired, on a 52" 1080p doubling as an HTPC. I have no plans to upgrade those cards/rigs till DirectX 11; they are basically LAN machines for friends and family. Tri-fire for that series of card, at those resolutions, works well in 95% of games, and maxed-out settings are possible with 4x to 8x filtering.

Generally, in terms of GPUs, unless there is a game presently able to drop my main rig below 60fps, I find I need to upgrade only every other GPU generation.

Ultimately, every rig I build, I build with a different budget and frame rate in mind... though the latter weighs heavily. No one likes playing at crap frame rates with bad visuals. On the same note, I don't need every rig I own to leave me gushing O-negative to feel satisfied with my gaming experience.
gochichi - Sunday, March 1, 2009 - link
Indeed. I mean, I'm less extreme... but it's the same concept. On my main system I want most games to work great, but I become even more value-conscious with other systems. In particular, if things changed from how they are right now, I'd be willing to upgrade my main computer's power supply to, say, a 700W unit and add a $300 video card (not with the current games and the current offerings, but if things changed). But there's no way, I mean NONE, that I'd have my other computers burning a bunch of electricity for the two or three times a month someone might use one for gaming.

Other issues dear to me are noise level, hassle, and driver compatibility. I run Ubuntu from time to time, and SLI or CrossFire "scares me".
StormyParis - Saturday, February 28, 2009 - link
It's funny (or rather... sad?) and very revealing that the second question is also about price. Personally, I'm not extremely flexible on price, especially since a higher price generally lowers the performance/price ratio. There's a lot of money to be saved playing 2-year-old games on 2-year-old hardware.

I am willing to spend more for quality, durability, stability, and silence. The first three are barely ever mentioned in reviews; noise, for some reason, has made it in. I've been repeatedly disappointed (stuff not working, becoming very noisy fast, bad service...) by Asus, which for some reason seems to have a sterling reputation with journalists, for example. I've never had a problem with ASRock, go figure.
gochichi - Sunday, March 1, 2009 - link
Yeah, I hear you.

All in all, I'm all about current yet fully established technology. I mean, my 8GB of DDR2 RAM suits me just fine; I don't need 12GB of DDR3 RAM for four times the cost. Even the very act of trying to be a computer snob leaves me open to "value-priced stuff", because what you just "had to have" last year is now available in an overall better package at a lower cost.

I think a healthy budget is one where you can afford to update your equipment at that price every 18 months or so. If your budget is $3000 for a desktop but you can only afford to get a new one every five years, then you really couldn't afford the $3000 desktop to begin with. Even as improvements have cooled down somewhat because of decreased demand, I still think progress is quick. So you're better off updating frequently; even the resale value of your used equipment is maximized while it's still not quite obsolete.
JimmiG - Saturday, February 28, 2009 - link
I used to buy the $499+ cards, but I've realized that I get just as much (or as little) enjoyment out of my games using a $200 card.

I am, however, flexible enough that I'd spend 25% more if that translated into a >25% performance improvement.
I think it's ridiculous the way the market is flooded with very similar cards that are only a few FPS apart and are also priced <$10 apart. Does the world really need one $149 card, one at $155, another at $169, another at $172 etc? Surely if you can afford a $149 card you can also afford a $169 card - it's the same market segment IMO.
I would have preferred if ATI and especially Nvidia had fewer cards in their lineups.
$99 for new pc gamers just coming from integrated
$199 for the mid-range/mainstream crowd (possibly also one at $149, but I personally feel that it's unnecessary - the $199 card would drop to the $149 - $169 price point soon enough anyway)
$299 for the enthusiasts
$499+ for those with more money than brains.
icrf - Saturday, February 28, 2009 - link
Agreed, but that's what makes these reviews so important. Options are good, especially if there's no clear winner, but having too many definitely complicates things.

I'm a $200 GPU consumer, but I'll go up or down $50 to find the sweet spot. I'm not sure how to answer the questions based on that. I think I said $150-200 with a $50 swing, but I don't think I'd ever buy a $100 video card.
crimson117 - Sunday, March 1, 2009 - link
My friend picked up a 9800 GT for ~$100 three months ago; I think it was a great buy. I would have done the same had I not just bought an 8800GT in April 2008 for $200!

cosmotic - Saturday, February 28, 2009 - link

How did you guys decide on the granularity for those two questions?

Whether I was targeting the highest figure in any of those ranges, the lowest, or the middle, the maximum spread using the highest flexibility option is still just two brackets. Assuming people picked price points evenly within the ranges given, and going by the flexibility answers people gave, you really didn't learn anything from the flexibility question.

After reading the question I came up with the answer +$100, but the highest option was $50!

Consider this: I was willing to spend $250 on a graphics card, but when 8800GTs were at $112, it was stupid not to buy one of those. It was well below my target price, but the price/performance was so great that I would have been a moron to pick anything else. The people spending $20 less for one-third the performance were equally stupid. Are people really this inflexible when it comes to picking a graphics card?
strafejumper - Saturday, February 28, 2009 - link
i skimmed the article

really, all i think you need is a brief line and then the two questions
also the article can sometimes skew results.
"But we suspect that the majority of our readers, while interested in high end or even halo parts, will care much more about lower price points and bang for buck metrics."
you wouldn't think this would affect anything, but sometimes lines like this actually can affect people's answers to the questions.
no biggie though
reactor - Saturday, February 28, 2009 - link
I typically buy a new GPU every 2 years or so and try to stay under $300, unless there is a good deal at another price point (higher or lower).

My main concern these days is that it can run modern games at 1080p with high/max details and do HDMI out with audio.
DerekWilson - Saturday, February 28, 2009 - link
This describes my philosophy as well :-)

I'm a fan of hooking my PC to my TV for gaming. I like big :-)
If I could afford a 2560x1600 monitor for personal use I'd totally be there, but I'll opt for a 50" 1080p TV over a smaller monitor.
i know, i know, i always complain about DPI ... i'd like smaller pixels everywhere, but >2x AA becomes increasingly useful with larger TV sized pixels :-)
spunlex - Saturday, February 28, 2009 - link
I think it is important to keep a distinctions between regular computer user, gamers, and people doing 3D modeling, because the demands are completely different. I liked the way you separated things in your christmas round up I like to se more of that, and more articles in $100-105 range. Although I do a faire bit of gaming and will spend up to $250 on a card, most of my friends who I would build computers for would not.spunlex - Saturday, February 28, 2009 - link
sorry about the typo, I meant $100-150DerekWilson - Saturday, February 28, 2009 - link
that would have been a good question to ask ... perhaps in a future poll we'll try to determine how much of our readership are PC gamers, console gamers, both, or neither ...faxon - Saturday, February 28, 2009 - link
thus far, when i have aimed at getting a new graphics card for gaming, my budget has landed somewhere in the $200-250 range (average). there are a couple instances when i have gone with less because thats all i needed at the time (HD4670 at 1240x1024 for quake wars?), but in general i aim for the kickers, since i like the pretty, but i dont want to have to upgrade every 6 months to keep it that way. usually the $200-250 cards are good for at least a year and a half to 2 years at at LEAST medium quality textures at 1650x1080 with maybe 8xAF 2xAA. it varies from game to game, but you get the picture. however, there have been times when i aimed my budget +/-$50 from my usual price range, like i will probably be doing when i upgrade my video card soon. assuming that there isnt anything new out by the time i do it, im going to be picking up an HD4850X2 2GB. the card may cost $40 more, but the extra performance over say an HD48701GB is insane. of course, at $8 an hour part time, that may be a little while, since i have other higher priority items on my plate first. if what im hearing is true, the HD5XXX series is coming out this summer. maybe i will look into picking up something from that, depending on the performance gains/$ the cards offerLazerFX - Saturday, February 28, 2009 - link
I tend to aim for £150 - £200 (UK pricing), which works out to around $250. I'm quite flexible, but I find that such pricing gives the best bang for the buck; like the 1GB 8800GT Palit overclocked card I've got now - it runs 1920x1200 with no issues, but didn't break the bank.

JustSomeDude - Saturday, February 28, 2009 - link
For a particular level of performance (i.e. 60fps, 1920x1200, full eye candy), what is the cheapest way to get there? This will be different for different games, as well as being unobtainium for some games.

josh6079 - Saturday, February 28, 2009 - link
With what's out there today, I'd prefer to stay between $100-$150, +/- $15. 1680x1050 is my resolution, though, so I can afford to stay within that price segment. Had I a larger display, I would likely have to increase the amount I was willing to spend.

KingstonU - Saturday, February 28, 2009 - link
I like the article and the simplified take on the poll. Just a small thought, however: considering that the price difference between a cheap card and the highest-end cards is several hundred dollars, I think the $5 increments in the 2nd question are a bit pointless and should perhaps be something like +/- $10, $25, $50, $75, $100, and unlimited.

I also like the idea of a poll on people's take on coupons and mail-in rebates and whether they affect their purchase decision.
mindless1 - Saturday, February 28, 2009 - link
Agreed, I didn't know which price category to pick because often the after-rebate price of something I buy puts it into a lower category.

Another question helpful in differentiating customer needs would aim to sort by non-gamer, infrequent gamer, and frequent gamer, though I suppose the non-gamer these days has little reason to buy a video card at all except for features instead of performance.
BigToque - Saturday, February 28, 2009 - link
I make a decent living, but in my opinion, no computer is really worth more than $500 (in the same way I feel no phone plan/internet plan is worth more than $20 per month).

Am I willing to spend a little bit more if it gets me a little more performance? Sure, but I definitely have limits. I would never spend more than $200 on a video card.
I just use the internet, play music and watch movies. If I had an actual reason for needing a good video card (such as having an interest in working with any kind of 3D modeling) then I would certainly feel differently.
I just consider myself a "regular" computer user, with no special needs other than having a picture come on my screen without any major slowdowns.
As an aside, I am a console gamer. With my salary, I simply cannot afford to keep buying the new parts required to play the newest games.
JPForums - Monday, March 2, 2009 - link
"As an aside, I am a console gamer. With my salary, I simply can not afford to keep buying the new parts required to play the newest games." (quote function doesn't seem to be working)

I understand that there are valid reasons to stick with console gaming, but price isn't nearly as clear-cut as you imply. You said yourself that you are a regular user, and $500 seems to be your target price. If you were wise in the purchase of said system, the cost of making such a system gameable is usually the cost of a decent video card. If you spend ~$150 for a Radeon 4850 or GeForce 9800GTX(+), you can get some pretty stellar gaming on reasonable monitors. I consider 1680x1050 monitors to be reasonable, as the price jumps going to higher resolutions. Given your target price, I'd say you're looking at 1280x1024 or similar. If this is true, you could spend even less on a card like a Radeon 4670 and still get good results. Of course, you could also spend ~$250 on a GTX260+ or a Radeon 4870 1GB and push some modern games at max settings (sometimes even high AA levels) at resolutions up to 2560x1600. If you need memory, 4GB of DDR2-1066 can be had for less than $50 (less for slower modules).
Now compare this to the price of a console. Consider that consoles max out at the graphics capability of an X1900XT/7800GT (XBOX360/PS3). Sure, the 360 utilizes 3 PowerPC cores for its CPU and the PS3 has the Cell processor. However, only two of the three PPC cores in the 360 really get used in games. Further, the power of the Cell currently only really shows value in Blu-ray decode. By the time they make good use of these (if they can while being graphically limited), quad cores will be extremely cheap and much of the CPU load (i.e., physics) will be shifted to the video card anyway.
Now think about the price of games. In my experience, big release titles are 1/5 to 1/3 more expensive for console releases than PC releases, even for the same title (examples: Fallout 3, CoD: World at War, and FEAR 2 are $43/$47/$45 on PC vs. $57/$57/$57 on the 360). So even if you decide to spend a little more on upgrades, depending on how much you game and how long you go between upgrades, you can still spend less overall.
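The cost argument above boils down to simple arithmetic: a cheaper per-game price can pay back a hardware upgrade over time. A minimal sketch (all figures are illustrative assumptions, not numbers from this thread):

```python
# Rough break-even sketch for PC vs. console gaming costs.
# All numbers below are illustrative assumptions, not figures from the thread.

def total_cost(platform_cost, game_price, games_per_year, years):
    """Hardware (or upgrade) cost plus the price of every game bought."""
    return platform_cost + game_price * games_per_year * years

# Assumed: a $150 GPU upgrade makes an existing PC game-capable,
# while a console costs $400 up front; PC games ~$45, console games ~$57.
years = 3
pc = total_cost(150, 45, games_per_year=10, years=years)
console = total_cost(400, 57, games_per_year=10, years=years)

print(f"PC over {years} years: ${pc}")           # $1500
print(f"Console over {years} years: ${console}")  # $2110
```

Under these assumed numbers, the per-game savings alone cover the GPU upgrade well within the first year; buying fewer games per year narrows the gap.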
I have a system based on an Athlon64 X2 5600+ with 2GB RAM and a Radeon 4850. This system allows me to max out C&C: Tiberium Wars, C&C: Red Alert, Stalker: SoC, and Stalker: Clear Sky (DX9, as I'm using Windows XP) on my 1680x1050 monitor (Far Cry 2 runs extremely well, too). I have a buddy with a single-core P4-2.8GHz (no HT), 1GB RAM, and a Radeon 3670 ($80 at the time). I can still play Stalker (both) with full dynamic lighting and some settings scaled back on his monitor (1440x900, I think). It still looks good and is a very enjoyable experience. I could name a whole lot of cheaper systems used in the small LAN parties I host that are perfectly adequate for modern games.
Note: We typically play Stalker, Tiberium Wars (we tried Red Alert 3, but didn't like the system as much), FarCry (might switch to or add FarCry 2 when I can convince some people that FarCry isn't the end-all be-all), and even some old school Deus Ex. We would still play Generals if it weren't for the endless "sync errors".
The point is, you only have to upgrade your system if you feel the upgrade in gameplay is worth the money. I could have just stuck with my X1900XT and got almost exactly the same gameplay experience as an XBOX 360 (save control differences). With newer games, I'd just adjust the settings to my system. It would still look as good as comparable games on a 360. The situation is slightly more complicated for the PS3 due to developers having a harder time extracting performance, but it's still effectively the same situation. The beauty of the PC is that, if I deem it worth the cost, I can move beyond the current capability of consoles.
I have an old AthlonXP based system with a Radeon 9700Pro that I use as a spare in LAN parties (We use it for Tiberium Wars mostly). It still plays many modern games at low levels of detail. Sure the eye candy isn't as nice as my newer systems, but it is nice to still be able to play a modern game on a system that's around the age of the original Xbox. Try playing a 360 game on the original XBOX.
v12v12 - Wednesday, March 4, 2009 - link
Props and +1pt for that very detailed breakdown. See folks, this is the reason for the Post-a-comment feature. Without it, people read an article and just assume it's unbiased and exactly the way to go, until you read the guy's comment above! Thanks bro!

I personally won't purchase a card over $200, realistically $200 + MIR. The vid card industry has INFECTED PCs and now basically dominates/controls the industry. Note how PSU makers have directly catered to this ridiculous SLI crap. PSU prices have gone up; power requirements, heat, noise, dust, blah blah all have increased b/c of POS "gaming" zombies... all that money spent on fanboy upgrades and zealotry could have paid for a trip overseas, paid for certs, or treated yourself to something in real life vs. colors on a screen... lol. You noobs will learn when you get much older (some of you will never learn) that all that time and money was spent in VAIN on an isolated "experience!" Get off the tube/keyboard and save your cash.
BTW- Unplug yourselves and listen to Alex Jones INFOWARS: Distractions in the form of "entertainment" are keeping people stupid watching sports and "gaming," while people are robbing us all blind!
pourspeller - Monday, March 2, 2009 - link
+1 to JP on that.

No offense to the original poster, but to say that you make a decent living, then follow it up with "can't afford PC gaming," is bizarre.
If you want your 360 to look even halfway decent, you must have a hi-def TV or monitor, so you could afford that. You can afford the more expensive games. What about the costs of Xbox Live service? And where can I pay less than $20 a month for decent high-speed internet access? Sign me up!
I've got an ancient Athlon 3700+ and an 8800GT. Right now I'm playing Fallout 3 at high detail with HDR lighting, etc., and it looks great at 1280x1040 or whatever the default is on my monitor. The GT cost me $120. Before that, I had a 7600GT that cost me about $110. I upgrade my card every 18 months and my PC every four years. No, I'm not a stickler for ultra-high graphics, but I've played the 360 lots and I'm not missing out on anything. In fact, many times I can customize the graphics to look better than the 360 version. Plus, I can log in to Steam and play TF2 for FREE anytime I want.
Console gaming is great, but it's not really any cheaper than PC gaming, unless you're talking about pushing the resolutions on the PC game well beyond anything a console can handle. And if you do that, you're comparing apples to oranges.
Exar3342 - Monday, March 2, 2009 - link
So I would assume you don't have a mobile phone, and broadband is out of the question for you, as these are generally >$20. For your needs, it doesn't even look like you need a computer. Get a $200 netbook to surf the net, watch movies on your TV, and use a stereo to listen to music.

KingstonU - Saturday, February 28, 2009 - link
That's a popular point: for most people there is no reason to spend so much on a computer to do just regular tasks. This is why netbooks are emerging. (I wish they had netbooks when I started university 4 years ago.)

For some people, however, who do very intensive computing for their job or for fun, and for whom time is money, spending the extra few hundred dollars or even a thousand can be worthwhile if it saves them 1 hour a day, every day. Or if not money, then it is time that could be spent doing other things.
ltcommanderdata - Saturday, February 28, 2009 - link
I guess this validates AMD's strategy of focusing on winning the mid-range ($150-$199) and performance ($200-$300) markets, since that's where the customers are. Any breakthrough in the high-end would be a bonus given the small amount of capital put in up front. (A dual-die HD4870 X2 is cheaper to come up with than a mega chip.)

DerekWilson - Saturday, February 28, 2009 - link
It validates the strategy if their only concern is the subset of customers who respond to polls on AnandTech ...

There is a bigger market out there, and we've always believed that our readership are smart shoppers interested in lots of performance for the money they put in. This sort of helps confirm our thoughts about that, but we are also not randomly sampling, and there could be some correlation between people who tend to respond to polls and the answers we've gotten ... which we can't evaluate without random polling.
We actually do have the ability to randomly poll people with popup polls, but people have been down on that idea, and we still have the problem that we are limited to people who choose to respond to a poll even if they are randomly presented with a popup.

We would expect a similar poll on an overclocking website to reflect higher spending and higher flexibility, or more of a focus on absolute performance.

We would likewise expect a poll on a more general consumer-focused computer tech site to reflect that value parts (<$50 or maybe $50-$100) have the highest demand.
This data is not generalizable... but, for us and our readers, this information is very valuable.
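The self-selection effect described above is easy to see in a toy simulation (every number here is invented for illustration): if enthusiasts are more likely to answer a poll than budget buyers, the poll's average overstates the population's.

```python
import random

random.seed(0)  # deterministic toy run

# Hypothetical population: 90% budget buyers (~$100 cards),
# 10% enthusiasts (~$300 cards). Purely illustrative numbers.
population = [100] * 9000 + [300] * 1000

def poll_average(spend_levels, response_prob):
    """Average spend among the people who choose to answer the poll.

    response_prob maps a spend level to that buyer's chance of responding,
    which is how self-selection bias enters the estimate.
    """
    responses = [s for s in spend_levels if random.random() < response_prob(s)]
    return sum(responses) / len(responses)

true_avg = sum(population) / len(population)  # 120.0

# Self-selected poll: assume enthusiasts answer 5x as often as budget buyers.
biased = poll_average(population, lambda s: 0.25 if s == 300 else 0.05)

print(f"true average spend: ${true_avg:.0f}")
print(f"self-selected poll: ${biased:.0f}")  # skews well above the true average
```

The size of the skew depends entirely on the assumed response rates, which is exactly why the poll results can't be generalized beyond the responding audience.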
PrinceGaz - Sunday, March 1, 2009 - link
I suspect a poll conducted by a third-party consumer research company would give results suggesting that most people don't buy a discrete graphics card themselves, and that those who do are most likely to buy a card in the US$50-99 range (roughly the cost of two games), with those who also indicate a significant interest in gaming more likely to spend in the US$100-149 or US$150-199 region on a graphics card (three or four games).

People who visit sites like AT are not normal. They're technology enthusiasts at least, and PC hardware fanatics quite often. Therefore, any poll run here can only ever give the views of people who already visit the site, which is certainly not the general public.
I don't think many people these days would actually go to the trouble of buying and installing (or having installed for them) a sub-US$50 graphics card, as the time or money spent actually putting it in and setting it up would probably exceed the value of the card. Besides, whilst even sub-US$50 cards are many times faster than onboard graphics, even Intel chipsets like the X3000 are quite capable of satisfying the needs of people who enjoy The Sims 2 and other less graphically demanding games (I've been playing Civ 4 a lot lately and I certainly don't need an 8800GTS for that).
The best way to find out what most people in the real world spend on a graphics card would be for Anand to use his connections with the many mobo companies (who also sell lots of graphics cards with both nVidia and AMD GPUs) and ask them for a rough breakdown of sales across each sector. I'm sure if he promised not to reveal his sources, he'd be able to obtain relative figures for the volumes shipped in each segment (and possibly even between their nVidia- and AMD-based sales). We know these companies are often willing to spill the beans on upcoming stuff, and even pass on the odd unreleased CPU from Intel or AMD to the likes of AT (what they get in return for this is never made clear...), so I'm sure they could say "x% of products shipped is in this range, x% in that, x% there, and x% above this amount."
Bremen7000 - Saturday, February 28, 2009 - link
Drawing any kind of conclusion about a company's strategy based on a poll on a tech site is an epic fail.

GaryJohnson - Saturday, February 28, 2009 - link
Yeah, the people participating in the poll are only consumers. What the hell do they know?

crimson117 - Sunday, March 1, 2009 - link
The only thing I know is that I know nothing.

DerekWilson - Saturday, February 28, 2009 - link
I modified the second question to be a little more clear -- value was changed to VALUE and the (perf/$) note was added.

We aren't looking at how flexible you are to get the best performance (that should be taken into account in the first question) ... we want to know how flexible you are to get the best perf/$ ...
342 people had responded before this change was pushed live. I do apologize for any inconvenience or misunderstanding.
Jjoshua2 - Saturday, February 28, 2009 - link
I normally buy cards in the $150-200 range for $100-150, because I wait for good sales with coupons and mail-in rebates. I think there should be a question about how much we consider market price versus the lowest price we can find on bargain websites and such.

The0ne - Saturday, February 28, 2009 - link
Same here. I'll wait for a good deal before purchasing a video card. I have the time to wait since I don't play games very often, but I do like to have a good card when I need it. I got my 9800GX2 a year or so ago for $300 and it's still doing very well for my needs.

DerekWilson - Saturday, February 28, 2009 - link
This is a good suggestion. We'll look at a separate poll in the coming weeks on coupons, instant rebates, mail-in rebates ...

Are bundles part of the same equation, or should that be considered separately from the cost?
Kougar - Monday, March 2, 2009 - link
That's pretty much what I do. I nabbed a GTX 260 Superclocked last summer for $220 after a promo code, free shipping, and a mail-in rebate. Eight months later I still don't regret it, and that's how I've begun to conduct most of my other PC-related purchases.

To answer your question, the only time I give the bundle any consideration is when a game I already wanted to purchase is included. This unfortunately is very rare, so it almost never becomes a factor in my GPU purchases.
Unique programs like EVGA's Voltage tuner give value, but specific warranty length, options, and provisions are usually the biggest deciding factor beyond price for me personally.
If the GPU already had a very good quality aftermarket cooler pre-installed, would this count as part of the bundle? I would factor a very good factory cooler upgrade into the price equation, as it saves me from buying one myself. Otherwise I don't end up factoring the bundle into the original price.
To further muddle the issue, things like combos would factor in if I were already planning to buy the hardware. For example, the Intel X25-M and Core i7 920 "combo" on Newegg gives an instant $60 off the combined total... I've been incredibly tempted by that, because both items are already on my future shopping list and those items don't get rebates otherwise.
v1001 - Saturday, February 28, 2009 - link
I try to keep it around $150, maybe give or take $30 or so if necessary, but I'd really prefer not to go over $150 if possible.