120 Comments

  • Hardin4188 - Monday, August 20, 2012 - link

    This is a good move on AMD's part. I've been planning on getting a new card and Sleeping Dogs.
  • wicketr - Monday, August 20, 2012 - link

    About time nVidia joined the game to put some pressure on the prices. Now if only they could put their 650 and 640 lines out there to compete on the low end. They've been destroyed this cycle with AMD seemingly the only game in town for so long.
  • StevoLincolnite - Tuesday, August 21, 2012 - link

    It is a nice move, but the price/performance ratio in my opinion still doesn't match the previous 6000 series. I picked up 2x Radeon 6950 2GB cards which unlocked into 6970s (then I overclocked them), and they each came with a copy of Dirt 3, for about $200 AU each a year ago.
    Sold the 2x copies of Dirt 3 for $60 each, so in the end each card was about $180.
    The 6970s are competitive with the 7870 in a lot of games that aren't compute heavy.

    The higher prices this generation are probably caused by a combination of 28nm being new and expensive and yields being limited, so it's not entirely unexpected. I just hope the next generation, which I intend to jump on, is priced lower. :)
  • RussianSensation - Tuesday, August 21, 2012 - link

    6950 2GB were amazing cards. The problem is AMD couldn't afford to sell $200-250 6950s that unlocked to 6970s, of all things. They were losing a lot of profitability with that strategy. It was only a matter of time before the new CEO had to cut that cord. It just means we are back to the historical situation of the 9800XT, X1800XTX, or X1950XTX, when AMD cards were $500+. It just means you have to wait longer to catch better value / upgrade. On the positive side, current games aren't getting more demanding that fast, since next-generation consoles are still 1-2 years away and graphics progress is very slow. Those 6950s in CF will probably last you until 2014 easily.
  • piroroadkill - Tuesday, August 21, 2012 - link

    They were, and are. The fact that they existed left me very confused at the success of the GeForce 560 Ti, a card with far less potential (and VRAM).

    6950 2GB unlocking was underestimated, I believe, and they were the sweet spot for the entire time they were on the market.
  • HisDivineOrder - Tuesday, August 21, 2012 - link

    That's a reflection of just how bad AMD's drivers were last year and, arguably, this year up until the last few months.
  • RussianSensation - Saturday, August 25, 2012 - link

    Umm...no. The low prices were a reflection of the old CEO and management team: HD4850/4870 ($199/$299), HD4890 ($269), HD5850/5870 ($269/$379), HD6950/6970 ($299/$369). Those prices had nothing to do with the state of AMD's drivers. All of those cards had rock-solid drivers in single-GPU setups. It was simply a matter of strategy -- trying to gain market share through price competitiveness. If anything, AMD's drivers were better for all of those series than in the first 6 months of the HD7970 series. So you cannot use drivers as an explanation for why AMD was selling its cards so cheaply.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    amd drivers still aren't right
    http://www.overclock.net/t/1231670/official-the-am...

    Welcome to the continuing GSOD
    Welcome to black screens
    Welcome to black and white lines when OCing
    Welcome to load line calibration issues
  • Pantsu - Tuesday, August 21, 2012 - link

    The performance difference was maybe 10%, mostly due to different clocks. Clock to clock the difference between 6970 and 6950 was 0-5%.

    In any case, the $200 market hasn't moved since then in terms of performance. I bought two of them and sold them for maybe 50€ less a year later. CrossFire was a big disappointment for me. No driver support whatsoever back then.

    I'd say those who bought the early cheap 5870's got the best deal in hindsight, as long as we're talking 1080p gaming.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    They were the $300 to $400 market, not the $200 market.
    How the HECK do you people do that?
    How is it possible to make errors that large?
  • StevoLincolnite - Tuesday, August 21, 2012 - link

    2014 for a pair of Radeon 6950's? Ain't happening, I game at 5760x1080.
  • TheJian - Tuesday, August 21, 2012 - link

    Thus proving my point about 2560x1600.

    Thanks for that Steve :)

    Dang, you must feel like you almost stole those cards...LOL. NICE.
  • RussianSensation - Tuesday, August 21, 2012 - link

    Did you actually read AT's own review?

    Using 1080P only,

    HD7950B wins in Crysis Warhead and Metro 2033. The cards are tied in Dirt 3, and 1 fps separates them in Batman AC. However, in Skyrim mods weren't used, and both Skyrim and Batman AC can run well on a 7950 with 8xMSAA. Like I showed you earlier, once you add mods and higher MSAA for better image quality, the 7950 will win in Skyrim, Dirt 3 and Batman AC as well.

    It's not a clear win for the 660Ti at all. In fact the main reason the 660Ti even looks good is that the Zotac AMP! 660Ti OC / MSI Power Edition 660Ti were used.

    Take a 7950 @ 1100-1150MHz and it'll be at the top of those benchmarks in every game over a GTX670, besides Portal 2 SSAA. Portal 2 is an exception for SSAA.

    Using a wide variety of games with SSAA, even the HD7970 beats a 680:
    http://www.computerbase.de/artikel/grafikkarten/20...

    That means in a wide variety of games with SSAA, the 7950 will once again win over the 660Ti.
  • TheJian - Tuesday, August 21, 2012 - link

    You don't see the benchmarks at 2560x1600 in AT's review? That's not 1080P. You're not seriously telling me you don't know what 1080P is, are you? I'm not even sure how to respond to that.
    Check the comments in the article, click VIEW ALL comments and read page 4. And #1, don't bother pointing me to sites not in English, and #2, they're useless if they don't tell me the speeds, models, etc. of whatever is being tested. Read the "wall" of text. I used half of AnandTech's OWN tests and statements against them... Did YOU read the review? I quoted Ryan at least a dozen times FROM that article. He really made it too easy. He replied that they test at 2560x1600 because of some monitor that "enthusiasts" can now buy under $400. But only if you order from Korea or New Zealand from websites that "Just Started" on Amazon...LOL. Their website (the only one with reviews) has a blank FAQ and About page...ROFL. Their email is a Gmail account and they have no contact number. But Ryan says this is a popular monitor with enthusiasts, yet neither Newegg nor Amazon sells it, and it can't be bought in the USA. Did you read any of our exchange, or are you just trolling? Pointing me to sites in German that have no explanation of the test machine, what cards are in it, or what speeds the tests were run at does me no good. Stop wasting my time. As Ryan said, the 7950B chips are REJECTS, which is why they are at 1.25V. Did you READ AnandTech's articles? You seem to be able to write in English; you should be able to READ it.

    Zotac speeds can be had for $299; check Newegg.
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    Core Clock: 1019MHz
    Boost Clock: 1097MHz
    Memory can be boosted FAR above the 6.6GHz shown already:
    http://hardocp.com/article/2012/08/21/galaxy_gefor...
    7.71GHz. 6.6GHz is easy to do, and they did it with a 3GB card, not 2GB (fewer chips are easier to OC).
    Knowing this, why would you NOT compare Zotac benchmarks when most cards out of the box will do these speeds and higher easily? HardOCP got their card to 1300MHz! You keep quoting 1150MHz for 7950s. It's tough to even buy a default-clocked 660. 1150 on your chip RUNS HOT and chews up watts. 80 watts more, to be exact:
    http://www.guru3d.com/article/radeon-hd-7950-overc...
    138W vs. 217W at 1.25V. That's what it took, and you can keep your hot chip, thanks.

    http://www.anandtech.com/show/6152/amd-announces-n...
    Ryan (quoted many times in my wall to him :)) is quoted as saying they are HOT REJECTS in their boost review. Sorry.
    "These numbers paint an interesting picture, albeit not one that is particularly rosy. For the 7970 AMD was already working with top bin Tahiti GPUs, so to make a 7970GE they just needed to apply a bit more voltage and call it a day. The 7950 on the other hand is largely composed of salvaged GPUs that failed to meet 7970 specifications. GPUs that failed due to damaged units aren’t such a big problem here, but GPUs that failed to meet clockspeed targets are another matter. As a result of the fact that AMD is working with salvaged GPUs, AMD has to apply a lot more voltage to a 7950 to guarantee that those poorly clocking GPUs will correctly hit the 925MHz boost clock."

    There won't be many old 7950s left soon...LOL. Welcome to your boost chips :) Deboost chips? The Tahiti chips were BINNED to get 1.125V default (660s are 0.987V) and your rejects are at 1.25V. Good luck hitting 1150 without heating your house. Your top clock is the boost clock out of the box, whereas, as HardOCP shows, the 660 will do another 100MHz past that all on its own...LOL. 1300MHz after they changed it to 1200. That's not a limit either; they'll go to whatever is safe.

    http://www.behardware.com/articles/853-18/roundup-...
    Look at the chart. 11 7950 cards. Only 2 hit 1200 and needed a heck of a lot of volts to do it. 1.275? Jeez.
    "The maximum clock on the Radeon HD 7900s generally seems to be between 1125 and 1200 MHz when the GPU voltage is adjusted."
    Not good, and those aren't even the BOOST chips that already need 1.25V to hit 925 reliably. Keep dreaming. AnandTech's words, not mine.

    Do you read AnandTech articles? I'm done with you, troll.
  • Ammaross - Wednesday, August 22, 2012 - link

    Speaking of troll TheJian...

    Anyway, you (eventually) cite how hard it is to get a 7950 OCed to 1150, yet you still rely on AT's benches comparing heavily-overclocked 660Ti cards against a STOCK 7950 card? Even a few MHz of bumps would do loads better (as shown by the slightly OCed 7950B marks).

    What I don't get is why AT insists on reviewing extremely OCed cards but lines them up with non-OCed competitor parts. It's like comparing an OCed [email protected] against AMD's stock lineup.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    The next Newegg verified buyer:

    " Blown away

    Pros: Out of the box very impressive. Card is well designed, lots of cooling added to the Twin Frozr. After some tweaking, I benchmarked 1330mhz stable while adding ~500mhz to memory as well."

    This is reality son.
  • RussianSensation - Saturday, August 25, 2012 - link

    No, the reality is that a GTX660Ti @ 1330MHz can't even touch a stock GTX670 in games because of the ROP/memory bandwidth limitation:

    http://www.bit-tech.net/hardware/2012/08/16/nvidia...

    An OCed 7950 walks all over your 660Ti at 1300MHz:
    http://www.hardocp.com/article/2012/08/23/galaxy_g...

    Less cheerleading, more facts please :)
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    I see the page you linked at bit-tech, and the 660Ti equals the 7970.
    LOL
    Let's face it, you're the amd fanboy of the decade here, who can't even read the facts he links himself.
  • RussianSensation - Saturday, August 25, 2012 - link

    HardOCP just proved everything we have been saying since GTX660Ti launch:

    1) HD7950 OC keeps up with an OCed 670 at 1080P and is arguably faster at 1600P

    2) GTX660Ti OC is no match for a 7950 OC at 1080P, and especially at 1600P with MSAA:

    660Ti got smashed:
    http://www.hardocp.com/article/2012/08/23/galaxy_g...
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    From your link, see the 660Ti win:
    http://www.hardocp.com/image.html?image=MTM0NTczNj...

    amd did a Skyrim driver hack, so that's 1 win for nVidia; it's only a recent change for amd.

    The third game, of 3, that "proves everything" to you is an amd-favored game.

    So much for the wild eyed amd fanboy again.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    Your goose is cooked because the 660Ti is hitting over 1300 core solid, and 7000 memory.

    Now your idiotic 7950 OC whine is OVER.
  • RussianSensation - Saturday, August 25, 2012 - link

    An OCed 7950 walks all over your 660Ti at 1300MHz:
    http://www.hardocp.com/article/2012/08/23/galaxy_g...

    A 660Ti at 1300MHz is a class below a 7950 OC, which actually competes with a 670 OC.

    Next time do more research before calling someone names.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    Answered above wacko.
  • jiffylube1024 - Tuesday, August 21, 2012 - link

    I agree on the smart move by AMD. Essentially, they are having their cake and eating it too with the 7xxx series. They got to skim the market with the early release (Nvidia has been doing this for years - remember the 7800 GTX 512MB starting at $600+ and climbing to ~$900 at launch?). Now, AMD is settling into a more mass-market-friendly $200/$300/$400 pricing structure (with some rounding).

    Of course, this is a direct result of the GTX 680 and 660 Ti being such strong parts, but the AMD cards are still quite attractive, especially the 7850 and 7870 in the sub-$300 category, since they have no corresponding competing card from Nvidia.

    It's funny how, again and again, Nvidia and AMD converge on price and performance, despite taking radically different approaches initially (i.e. small die vs. big die, compute vs. non-compute performance) and then essentially regressing toward the mean in both cases.
  • TheJian - Tuesday, August 21, 2012 - link

    Unfortunately they DIVERGE on profits. AMD has lost 6 billion in the last 10 years, and lost 629 million in the last 12 months. Meanwhile Nvidia has made 2.3 billion over the same 10 years, and Nvidia has made 473 million over the same 12 months.

    http://investing.money.msn.com/investments/stock-p...
    http://investing.money.msn.com/investments/stock-p...

    Great for consumers, but terrible price cuts for AMD. They should have charged more and clocked them higher to begin with, and maybe I could tell you about profits instead of losses then. I cringed when they let Dirk go. AMD only has another $1.5 billion to burn through, and at this rate they will be out of money by Xmas 2014. All they can do is dilute their stock more by selling senior notes again (more bad), or try to get loans at even worse rates, since (like America, which was just downgraded from AAA) they have been downgraded to basically JUNK bond status. You can't borrow cheaply with bad credit and billions in debt. You can't compete with your enemy with no money for R&D (hence the admitted giving up of the race with Intel). They are about to give up the race with Nvidia. This is a bummer on TWO fronts.
    http://www.fool.com/investing/general/2012/08/20/c...
    Bankruptcy wouldn't be bad now, so we can get someone who has capital behind their IP to get some good R&D going. CPUs are designed 5 years out. We are just seeing the end of AMD's CPU/GPU pipeline run its 5-year course. Watch the video, it's grim. Then read the link. You can give a fake email to get the report, I think...LOL. I'm a registered Fool, so don't quote me on the fake email ;)
    http://www.fool.com/fool/free-report/18/sa-datamin...
    The Gardner brothers are famous and so is Motley Fool. They're picking Nvidia as the next trillion-dollar company. A bit over the top, I think, but I'd bet on 100 billion and have been putting my money where my mouth is for a while. It's like free money buying their stock right now. Nexus, Surface, Kindle 2, Ouya are all using Tegra 3 and selling now or shortly, and Tegra 4/5/6 were being developed stride for stride with Tegra 3 (Jen-Hsun said all 4 were being developed at the same time, Wayne already in the bag, etc.). GPUs are strong and cool on volts, awesome OC to 1300 on the 660, and 3 years earlier than AMD to mobile. Fools buy Nvidia like mad :) 4/5 stars from CAPS. You can develop like this when you have zero debt, 3 billion in cash, and your competition is 2 billion in the hole. Bulldozer is the start of the downfall, unfortunately. CEOs and company leads jumping ship right and left is ugly. It's good for consumers now, but we'll all pay in the end. You should load up on NVDA now so you can pay for the expensive Intel/Nvidia stuff later for free :) By 2015 we'll all be getting screwed again by both of them.
  • RussianSensation - Saturday, August 25, 2012 - link

    HD7750 is going for $90-100 on Newegg
    HD7770 is going for $100-120 on Newegg
    HD7850 is going for $180-220 on Newegg
    HD7870 is going for $230-240 on Newegg
    NV has no desktop competitors for these cards.

    NV lost 10% of desktop discrete GPU shipments last quarter:
    http://www.techpowerup.com/170575/Graphics-Shipmen...

    HD7950 is going for $300-310, with similar performance to the 660Ti at stock speeds, and it beats it handily with OCing.

    HD7970 GE undercuts 680 and outperforms it stock or OC vs. OC.

    NV is doing well as a company but its desktop discrete GPU line-up only has 1 good card -- GTX670.

    Rory got left with a company that worked for 5 years on Bulldozer and failed. Don't blame AMD's GPUs for how AMD is doing. They don't have a mobile/smartphone/tablet strategy and are uncompetitive in the CPU space. However, in terms of GPUs, AMD has won this generation so far:

    1) Better price/performance
    2) Better single-GPU performance
    3) Better performance/watt for sub $250 cards.

    AMD's problems are in execution in other business segments and lack of professional penetration for GPUs. On the consumer side, their GPUs are kicking ass.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    nVidia's last generation competes with those cards, you idiot - and often beats them, sometimes completely.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    Good move on amd's part, I agree... after nVidia right-crossed them, decked them again with an uppercut to the crooked jaw, and kicked them in the scrotum, bringing them down, it was time amd dodged the wooden chair about to be busted over their backside.

    Good duck, amd. Keep going lower and lower and pretty soon you'll be knocked out and won't be able to get up. At least your fans will have fun looting you on the way down.
  • RussianSensation - Saturday, August 25, 2012 - link

    Yes, NV hit them hard, considering the GTX660Ti launched on August 16th and the HD7950 came out Jan 31, with the HD7850/7870 on March 3rd. NV is 5-6 months late. You failed to mention that part..... All AMD did was collect nice profits and let NV catch up, and then NV released a 660Ti that can hardly compete with an OCed 7950 on performance or with a 7870 on price. Good one.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    amd was catching up with nVidia last generation - how late were they there?
    6 months of bad drivers means they barely got to the table first...
    In any case, it took amd how many months over a full year to catch the nVidia 500 series?
    Oh yes, they are still trying to beat it with the 7870 and 7850, and THEY HAVE FAILED.
    LOL
    What were you saying ? LOL
  • frostyfiredude - Monday, August 20, 2012 - link

    Good move on their part; this makes AMD's cards the best value at basically every price point. The only exception is the very top, where 7970GE vs. GTX680 goes to the 680 for noise/power reasons.
  • TheJian - Tuesday, August 21, 2012 - link

    And basically keep losing another 630 million per year. These price cuts are not good. Why do people not get this? DEBT is not good. With $1.5 billion left in cash (they already mortgaged the house to get that at crappy interest rates), they will burn through that in 9 quarters (rough math sketched after this comment). Xmas 2014 is looking like when they'll go broke. Like our country being another 6.5 trillion in debt since Obama took office (about to hit $16 trillion!), AMD can't take another 2 years of losses after buying ATI for a ridiculous price. They had to write down 2/3 of what they paid for it, which killed them. They've lost 6 billion in 10 years.

    Like the USA will go bankrupt if Obama gets another 4 years, AMD will go bankrupt in 2.25 years. That's not good in either case. Debt = bad. Losing money = bad. At some point you have to bring in more than you piss away or you're dead.

    These price cuts are BAD for amd, but admittedly good for us ;) Long term, VERY bad for us all, as NV/Intel will be able to hike prices like mad without AMD.

    I've posted enough data to refute AMD being better in the $300 range. The 660 is a strong product, and the boost chips are rejects that required AMD to put 1.25V to them to reliably hit the 850/925 boost.
    http://www.anandtech.com/show/6152/amd-announces-n...
    Scroll down to Ryan's comments on the voltages. Then go read my wall response to his response to me on page 4 of the all-comments view in the 660 article at AnandTech. It's not pretty :( But very enlightening. No BS, just data (most of it his).
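A minimal sketch, in Python, of the cash-runway arithmetic in the comment above, using only the figures the commenter quotes (roughly $1.5B of cash on hand and a ~$630M annual loss; neither figure is verified here, and real cash flow would not be this simple):

```python
# Rough cash-runway arithmetic using the figures quoted in the comment above
# (commenter's numbers, treated as straight cash burn: no new financing, no cost cuts).
cash_on_hand = 1.5e9           # ~$1.5 billion said to be left
annual_loss = 630e6            # ~$630 million lost per year (per the comment)

quarterly_burn = annual_loss / 4
quarters_left = cash_on_hand / quarterly_burn

print(f"Quarterly burn: ${quarterly_burn / 1e6:.0f}M")
print(f"Runway: {quarters_left:.1f} quarters (~{quarters_left / 4:.1f} years)")
# -> roughly 9.5 quarters, i.e. about 2.4 years; counted from mid-2012 that
#    lands near the end of 2014, which is where the "Xmas 2014" estimate comes from.
```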
  • RussianSensation - Saturday, August 25, 2012 - link

    1) "These price cuts are not good. Why do people not get this? DEBT is not good."

    ^ People get this, but since you can't go back and re-do the AMD-ATI merger, you can't make the $5.4 billion in debt from the leveraged buy-out just disappear. Also, AMD didn't just cut prices magically because it wanted to; the competition from NV forced it to do so.

    2) You really need to take a couple of classes in accounting and finance, since you have no idea about cash flow vs. net income. The only thing that matters is cash flows, not net losses. $6B in losses is not what the company actually lost in cash.

    3) You seem to be overly happy to keep pointing out how AMD is going to be bankrupt in 2-3 years, blah blah blah. Let us know how you enjoy your $1000 GTX990 and Intel Core i5 for $500.

    And then, despite discussing how it's bad that AMD lowered prices, you still go out of your way to make up data showing the 660Ti is a better card. You are something else, guy.

    No one cares about a reference 7950 review since only 1 out of 19 cards on Newegg is a reference 7950 B card. Stop trolling or go to GeForce.com.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    you can also buy things marked "value" in the dollar aisle at Walmart
  • RussianSensation - Saturday, August 25, 2012 - link

    You can go ahead and call AMD cards value/budget, but most of us would rather take the faster card and the $120 extra than throw $ away for the brand:

    http://www.xbitlabs.com/articles/graphics/display/...
  • TheJian - Monday, August 20, 2012 - link

    This isn't leverage... They were forced, as the 660Ti wins almost everything at 1920x1200, unlike your review, which seems to think 2560x1600 is important. Yet Newegg has 68 monitors at 24in that use 1920x1200/1080 and NONE that use 2560x1600. Same story for the 52 models at 27in, where again NONE use 2560x1600. ZERO. You have to buy a 30in monitor for 2560x1600 to be important. Translation? As I proved in the comments section over and over in your 660Ti review, you based your conclusions and digs about bandwidth on 2560x1600, which is only for 2% of the user base. Say what?

    Steampowered.com's hardware survey shows 98% of the users in the world are using ...wait for it... 1920x1200 or below. Even that res plus 1920x1080 is only used by 29.5%; the rest are far lower. But you couldn't decide on a recommendation based on 2560x1600 results...ROFL. But you kept digging at that 2560x1600 memory bandwidth issue (which only exists here and there anyway), instead of concentrating on the fact that 98% of us use the res the 660 Ti DOMINATES in. Do you own a 30in monitor, Ryan? We don't. "That Darned Memory Bus"...LOL. Umm...Didn't affect a thing at 1920x1200, did it? So 98% of us couldn't care less about that darned memory bus?

    USELESS drivel...I'll help you fix your recommendation...

    "At 27in or lower the 660 TI is the obvious choice for gamers. Because even those monitors use less than 2560x1600, including every 24 or 27in on newegg, yes I mean EVERY ONE :) " LOL. Fixed for ya :) Feel free to paste it into your Conclusion page and remove your drivel.

    AMD had to lower prices... They got stomped at 27in-and-lower resolutions. Meaning, for almost everybody, they got stomped. Before any fanboys come out... I own a Radeon 5850 and can prove it (already did in the comments - see my Amazon backorder)...LOL.

    Somebody has to call Ryan out. The review and its digs about bandwidth at 2560x1600 (at some points twice on a page) SUCKED. Every time he made that comment, he should have said something like "but it only affects 2% of you anyway". I might have said good review then... It would have been accurate instead of ridiculously misleading :)
  • Zoomer - Monday, August 20, 2012 - link

    Good 27 inchers are usually 2560 x 1440.
  • TheJian - Tuesday, August 21, 2012 - link

    I don't dispute that... In fact I totally agree with you and pointed that out in my response to Ryan 5 minutes ago in the comments section of his 660 Ti review. It's easy to find; look for the wall of text, but read his post to me first :)

    It's a long read, but PURELY backed up with data and easy to follow.

    I challenge anyone to argue with the data... Most of which is his own words and scores :)
  • RussianSensation - Saturday, August 25, 2012 - link

    1. NV charges $300, or 25-30% more, for 9% more performance over the $230-240 HD7870 at 1080P, and it's just 4% faster at 2560x1600 (the premium arithmetic is sketched after this comment):
    http://www.computerbase.de/artikel/grafikkarten/20...

    2. NV's 660Ti OC is hopelessly out-gunned by the HD7950 OC:
    http://www.hardocp.com/article/2012/08/23/galaxy_g...

    GTX660Ti vs.:
    - HD7870 has better price/performance
    - HD7950 has better performance for enthusiasts.

    Any more rants on your end?
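For reference, the price-premium claim in point 1 above is simple arithmetic; a minimal sketch using only the prices and performance deltas quoted in the comment (the figures are the commenter's, not re-benchmarked):

```python
# Price premium of the GTX 660 Ti over the HD 7870, using the figures quoted above
# (prices and % performance deltas are taken from the comment, not re-measured).
gtx660ti_price = 300.0
hd7870_prices = (230.0, 240.0)

premiums = [(gtx660ti_price - p) / p * 100 for p in hd7870_prices]
print(f"Price premium: {min(premiums):.0f}%-{max(premiums):.0f}%")          # ~25%-30%

perf_gain_1080p = 9.0   # % faster at 1920x1080 per the cited roundup
perf_gain_1600p = 4.0   # % faster at 2560x1600 per the cited roundup
print(f"Premium vs. 1080p gain: ~{min(premiums) / perf_gain_1080p:.1f}x")   # ~2.8x
print(f"Premium vs. 1600p gain: ~{min(premiums) / perf_gain_1600p:.1f}x")   # ~6.2x
```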
  • Bull Dog - Monday, August 20, 2012 - link

    I game on a Dell U2711 (2560x1440) as well, so the 2560x1600 benchmarks are relevant for me.
  • tuteja1986 - Tuesday, August 21, 2012 - link

    The 2560x1600 benchmarks are important for me, as I also play games on my U2711. Also, after getting this monitor I bought myself a second used GTX 570 instead of a new GTX 680, as SLI is more beneficial at 2560x1440.
  • TheJian - Tuesday, August 21, 2012 - link

    Proving my point to Ryan once again...Thanks.
    You are CORRECT sir :) Dual cards. My point in my WALL response to ryan in the comments section of his 660 TI review ;)

    Ryan, you reading this stuff? Go read the wall Ryan ;) I dare you to respond again.

    Another guy with an $850 monitor, and money for duals too. The PROPER way to go (or an expensive 680 as he mentioned). People like this guy don't game on their $850 monitor with a $300 SINGLE card and expect good results. Which I proved with multiple data points and links.

    Head to hardocp for 5760x1200 results etc...He likes running those benches :)
    http://hardocp.com/article/2012/08/21/galaxy_gefor...
    His reviews are great, but once I nailed him (politely, even), unlike Ryan (so far), Kyle deleted me :( Not surprised though; Ryan will have a headache too...ROFL. Hats off to Ryan for actually responding to my first criticism. But I don't think he'll be able to stomach doing it again.

    While I understand the results are important for some, I took issue with his conclusions and his excuses for them. I mean, a $400 monitor from Korea? No USA sellers? Etc.?
  • TheJian - Tuesday, August 21, 2012 - link

    Said the one guy in the room with $850 for a monitor from Newegg. Congrats, Ryan wrote his GTX 660 Ti conclusion for you. :)

    According to Ryan, you are a "rich enthusiast".

    But poor me, I'm not, with just a 24in Dell, a 22in LG, and a Radeon 5850, etc... :) Bummer for me.

    Seriously though, I am jealous of the monitor :) mmmm...WANT ONE ;)
  • CeriseCogburn - Saturday, August 25, 2012 - link

    Dude, these people are not serious, they are wacked out fanboys.
    You call them on the facts, and they offer hatred and lies and fudging and flawed side points and it never ends.
    A GTX680 or a 7970GE are NOT ENOUGH for 1920x1200.
    When I spend 2 grand + on a new build, I don't want to have to screw around and knock down the settings in most games.
    I let all these fools know it on guess what release.
    But they all know everything, with no facts, just their feelings. If the facts can be twisted to how they feel, all the better. If there are no facts, they'll make up a speculation.
    When the facts hit them in the face, their brain takes a verdetrol trip. LOL
    You did us all a service, I'll say that, and it will eventually sink in. By then it may be too late, but it's going to do some good.
    Now, Crysis is still being shown as a test here - let's take the recent 660Ti review - the 7970 @ 1920x1200, and they have it on 4xAA - and 54 fps... At 2560x it's 31 frames - unplayable in both cases for amd's best showing, and it's not even cranked.
    Once again, not even the top single-GPU cards now handle 1920x1200. They crumble before moving on.
    Now, you can stretch em out... but you're compromising.
    (Dirt 3 is playable; the point is, the best games, the ones with sales and hype and IQ, you knock em down to play.)
    That's the facts, period.
    Let's not forget, most of these whining joes do not have an OC 2500K or better rig - a LOT of them have an AMD Spider platform LOL....
    Heck, that's even worse for 1920x1200 - the CPU makes a lot of fps difference nowadays, as offloading to it started becoming more relevant a couple of years ago.
    So the FPS cited is from a monster system on a clean install, not connected to a 64-player server, etc.
    In other words, your point on resolution fits not only for the price or availability or those who are using such monitors; it fits because the cards themselves are lacking SEVERELY at that resolution.
    I tried to hammer it home months ago...

  • RussianSensation - Monday, August 20, 2012 - link

    Average of 12 professional reviews:

    800MHz 7950 > GTX660Ti
    http://www.3dcenter.org/artikel/launch-analyse-nvi...

    The HD7950 also has 30-40% overclocking headroom. So it's just as fast at stock speeds and has extra reserve to handle MSAA, mods and high resolutions. 3GB VRAM 660Ti versions also cost $340.

    Even TPU shows GTX660Ti not any better than a stock 7950. See the "Blue graph":
    http://tpucdn.com/reviews/Palit/GeForce_GTX_660_Ti...

    That's against an 800MHz 7950.

    There are an 880MHz HD7950 MSI TwinFrozr 3 and a 900MHz Gigabyte Windforce 3x 7950 for $330 on Newegg, and those prices should fall even more soon.

    Enthusiasts who buy a 2500K/3570K know where it's at --> an OCed 7950 crushes the 660Ti.
  • TheJian - Tuesday, August 21, 2012 - link

    You keep pointing to things I can't or wouldn't buy. A reference-clocked 660Ti? What for? They come out of the box for $299 @ 1015+ core / 1115 boost. Why would I buy a reference clock for the same $299?

    You point to charts where I can't even see how it's being tested, the setup, the speeds, etc. I looked at your chart... It tells me nothing. The entire $280-350 range will dip so far below 30fps (unplayable territory) in so many games that it's pointless to act like people buy these for that 2560x1600 resolution, besides the fact that NO 27in-or-below monitor even uses it.

    See my wall to Ryan. The 68 Newegg 24in monitors don't go above 1920x1200 native. NONE, and that's all they sell. 27in, same story: it's 2560x1440, and bandwidth isn't an issue when you can't run your game above 30fps in almost all games on a $300 card at 2560x1600, which everyone seems to want to test at. It's not important if my car can do 300mph when there are no streets to drive on. It's not important if I run out of bandwidth at a resolution that not even people who spend $850 for a U2711 at Newegg would run at (it's 2560x1440; it doesn't run at 2560x1600), and if you run a 7870, 7950 or 660Ti at 2560x1600 you won't do it in many games without hitting BELOW 30fps and having a generally crappy experience. SLI/CrossFire is for this... See the wall; I explain everything, since you don't seem to get my point.
    http://hardocp.com/article/2012/08/21/galaxy_gefor...
    I know all cards OC... Just like the 660Ti, from 915MHz to 1300 here.
    That's a 42% overclock, and he's not the only one hitting 1300. :) $339, it won't heat my house, and it can't be damaged; the 600 series won't let you damage it. The top clock is NOT always the top clock on a 660. Ryan's own review tells you the 7950 is a SALVAGED part and that they put the boost at 1.25V for a REASON. He also notes it's a heater and a watt-eater to get that perf. :)
    http://www.anandtech.com/show/6152/amd-announces-n...

    "These numbers paint an interesting picture, albeit not one that is particularly rosy. For the 7970 AMD was already working with top bin Tahiti GPUs, so to make a 7970GE they just needed to apply a bit more voltage and call it a day. The 7950 on the other hand is largely composed of salvaged GPUs that failed to meet 7970 specifications. GPUs that failed due to damaged units aren’t such a big problem here, but GPUs that failed to meet clockspeed targets are another matter. As a result of the fact that AMD is working with salvaged GPUs, AMD has to apply a lot more voltage to a 7950 to guarantee that those poorly clocking GPUs will correctly hit the 925MHz boost clock."

    Go read the wall to ryan in the 660ti comments section. Argue if you can.
  • RussianSensation - Tuesday, August 21, 2012 - link

    Your entire argument keeps coming back to 2560x1600, but I am not sure why, out of all people, you are the only one focused on that.

    I just showed you in 12 professional reviews that a GTX660Ti cannot beat a 7950, and a 660Ti OC cannot beat an 850-925MHz HD7950.

    TechReport even has an MSI TwinFrozr III 880MHz 7950, and it beats a GTX660Ti OC at 1080P:
    http://techreport.com/r.x/geforce-gtx-660ti/value-...

    You really don't have a point here. No one is going to buy a stock HD7950 either. Take an 880-900MHz, $320-330 7950 and it'll be as fast as any factory pre-overclocked 660Ti at 1920x1080, while the 7950 has 30-40% free performance on top.

    Also, I already showed you that an OCed 7950 keeps up with an OCed 670 at 1920x1080 per Guru3D. So there is no point trying to tell us that 660Ti is the fastest card at 1920x1080, since it isn't.

    And if the consumer wanted to save $, they'd get an HD7850/7870 and pocket $50-90 over the 660Ti.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    TechReport: 70 fps to 72 fps, OC MSI BOOST 7970, and it costs a lot more... and it's a single-point chart.

    You showed what? LOL fail

    No res either, but most of the few games are at 2560x1440, and without Dirt Showdown the 7950 loses.

    So get amd for one game and a $450 no-warranty monitor with bad pixels, to ship back overseas for $60 more and another chance.

    no thanks
  • RussianSensation - Saturday, August 25, 2012 - link

    660Ti OC <<<<<<<< HD7950 OC = GTX670 OC:
    http://www.hardocp.com/article/2012/08/23/galaxy_g...

    Must be awesome to get a $300 660Ti and then not be able to use 8xMSAA in Batman AC or use mods in Skyrim with MSAA, and then have awful performance in every DirectCompute game out right now (Sleeping Dogs, Sniper Elite V2, Dirt Showdown) and in all the other games where the 660Ti is slow:

    - Alan Wake, Bulletstorm, Anno 2070, Arma II DayZ mod, Serious Sam 3, Trine 2 with SSAA, etc.

    What's the 660Ti got going for it? Portal 2, Lost Planet 2, HAWX 2, Project CARS and WoW. I am running out of games.

    Great. $300 for a card that loses in 90% of modern games and only has 1.5GB of usable VRAM....and can't handle MSAA.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    Yet anyone who opens the boxes and installs the 2 cards finds "yours" LOSES.
    Let's face it, you're going to have to say who gives a **** since Arma 3 is coming out with PhysX.
    If you're going to use high res and all the settings, I suggest 2x the TOP single card from either company for 1920x1200.
    Also make certain you have an OC'd 2500K minimum and NO AMD CPU.
    The 7950 loses to the 660Ti, and the 660Ti won't be burned to a crisp and dead like your 7950 after you try desperately to get stable clocks that, in a few games, get a bare win against the 660Ti.
    Go ahead, burn it to the ground; the housefire will get you either way - for lack of features, for an enormous electric bill, or for the sad GSOD, BSOD, and all the other amd driver fails, including 6 months before they could run the common games without crashing.
    Forget the amd junk.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    ROFL

    Sniper Elite V2 is part of AMD's Gaming Evolved program

    shame on you amd
  • CeriseCogburn - Saturday, August 25, 2012 - link

    Russian, in your TechReport chart the 660Ti BEATS the 7950 - the card furthest to the upper left wins...
    If you're looking at fps, 70 vs. 72 is margin of error; then you've got the $25 price difference and the endless extra feature set of the nVidia card.
    So EVERYONE is saying what TheJian is saying, not no one.
  • RussianSensation - Tuesday, August 21, 2012 - link

    GTX660Ti SLI is just 6% faster at 2560x1440 with 4xAA than the HD7970 GE:
    http://www.computerbase.de/artikel/grafikkarten/20...

    You can buy an HD7970 GE for $450-460 against a $600 GTX660Ti SLI setup:

    1) Vapor-X 7970 GE ($457; you can usually get a $10-15 coupon code on the front page)
    This card comes with 8 Black Diamond Chokes and the exact same cooler as the HD7970 Sapphire TOXIC. That's value for 27-inch screens!
    http://www.superbiiz.com/detail.php?name=AT-7970GH...

    2) Visiontek HD7970 GE with Lifetime Warranty for $430 + $8 shipping = $438
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    Looks like AMD still has the price performance locked down and 660Ti SLI looks awful in comparison for $600 since it can't handle MSAA at 2560x1440/1600.

    Just because you have a 5850 doesn't mean you come off as objective. Stick to the facts, not opinions.
  • TheJian - Tuesday, August 21, 2012 - link

    Read the wall responding to Ryan. It's chock-full of all the facts you need.

    You act like you can't overclock anything but Radeons.
    http://hardocp.com/article/2012/08/21/galaxy_gefor...
    A GeForce 660Ti running at 1.3GHz, memory at 7.71GHz (that's 1.71GHz OVER stock), and the card comes with 3GB for $339 at Newegg.

    Nuff said? Need more? Read my wall, argue if you can. Make sure you read Ryan's initial response to me first though, to keep it in context :)

    I'm not saying Radeons suck. I OWN one and love it. But I don't want a heater for my room, thanks. I already know 7950s are great overclockers. I even pointed to the Guru3D 7950 OC article. I couldn't have used more facts if I tried. :)

    Saying I own a Radeon 5850 is not an opinion. It's a FACT. The insinuation by YOU that I'm not being objective? That is the definition of an OPINION. Which you more than have a right to as long as you can prove it. ;)
    http://hardocp.com/article/2012/08/21/galaxy_gefor...
    Batman Arkham @ 2560x1600 (one of the games you mention the 660Ti gets crushed in, by the way), yet it's the Radeon that spends a LOT of time between 10-15fps while the 660Ti never goes below 28fps.
    Skyrim, 8xMSAA and running 37fps min... Can't handle what? That's 8xMSAA, compared to Ryan running at 4xMSAA. Prove it can't run MSAA, please. Out of the box this card ran at 1188MHz. He did nothing to get that; the card does it.
  • RussianSensation - Tuesday, August 21, 2012 - link

    Heater for your room?

    That's exactly why I noted in the AT forums that people will use HD7950B power consumption to make wrong comments like yours. The HD7950B uses 1.25V because it has to account for the 50-60% ASIC-binned 7950s that were terrible chips.

    How about the MSI TwinFrozr 3, which has >80% ASIC-binned Tahiti XT chips with 0.993-1.03V stock voltage at 880MHz?

    http://www.guru3d.com/imageview.php?image=36567
    HD7950 TF3 = 142W

    http://www.guru3d.com/imageview.php?image=41801
    GTX660Ti PE = 134W

    Just 8W of power separates an 880MHz 7950 TF3 and the MSI PE 660Ti.
  • TheJian - Tuesday, August 21, 2012 - link

    LOL, and how many more cherry-picked examples are left?
    An OLD 7950, NOT the BOOST, at 1.25V at Guru3D, pushing 217 watts versus the regular-clocked 7950 he started with at 138. That's ~80 watts more on the OLD, BINNED, great one.
    http://www.guru3d.com/article/radeon-hd-7950-overc...
    217 watts. Hitting ONLY 1150 stable at 1.25V and 217 watts, 80 extra watts for that 1150. Meanwhile 660s hit 1300 out of the box at 0.987V...ROFL, really dude? Are you going to keep this up?

    http://www.behardware.com/articles/853-18/roundup-...
    11 cards based on the OLD 7950s & 7970s, not the BOOST chips. Only ONE hit 1200 at 1.225V, and the only other to get there took 1.275V. FOUR of those cards are 7970s! The freaking BINNED ones. The BOOST chips didn't exist on MAY 9th 2012, the date of that article! Check that chart!
    http://www.behardware.com/medias/photos_news/00/36...
    Doesn't look good for your old binned chips now does it? Incorrectly pointing to boost chips? Read much?

    So you POINT me to a chart that shows exactly what I'm saying. Your perfect example is already over the 660Ti at 142 watts. It will be OVER the 217 then, like the one I pointed to at the SAME FREAKING JOINT you pointed to, and the one I point to is even better in the OC article, starting at 138 watts...ROFLMAO. It hit 217W at 1150. You are worse than Ryan or Obama...LOL. "Business is good in the private sector"...LOL. "You didn't build that, the govt did"...LOL. OK man... Just 8 watts separate them BEFORE you OC the crap out of it to hit 1150 and hit 217W+ a few, because yours is worse than the one in the OC 7950 article I just pointed to at Guru3D. You sure play fast and loose with the facts, eh?

    http://hardocp.com/article/2012/08/21/galaxy_gefor...
    Batman: 10fps minimum on the 7950... No amount of OC will help that hit the 660Ti's 28fps minimum...ROFL. 2560x1600. But no, these cards are NOT for that res. You still don't get it... Skyrim at 8xMSAA is there too, 37fps minimum. Pretty 'crushed' by MSAA, and Ryan only ran at 4xMSAA. Again, 2560x1600... Nope, I still don't believe you want to run these there. Too many games DIP below 30fps, like your lovely OLD Radeon 7950, which stays between 10-15 fps for a LONG part of that graph. Witcher 2, again: 18 for the 660Ti and 16 for the 7950... Both totally unplayable at 2560x1600. Are you getting the message yet? Still don't understand? Read my response to Ryan on page 4 of the ALL COMMENTS on the 660Ti. Once you run at 1920x1200, bandwidth will NEVER be a problem on the 660Ti (the only advantage, though hard to show, of the 7950 OLD chip). At 1920x1200 the Zotac (easily had at these clocks for $299 out of the box, as already linked time and again for you at Newegg) runs roughshod over your 7950B:
    First the 7870:
    660TI vs. 7870 @1920x1200
    Civ5 >3% faster
    Skyrim >6% faster
    Battlefield3 >37% faster (above 50% or so in FXAA High!!)
    Portal 2 >62% faster (same in 2560x...even though it's useless IMHO)
    Batman Arkham >22% faster
    Shogun 2 >31% faster
    Dirt3 >11% faster
    Metro 2033>15% faster
    Warhead ~wash (all 660 38.8fps-40.4 & 7870 is 39.9) WASH
    STARCRAFT 2: 7970ghz (108FPS) VS. GTX670 (121FPS) @1920x1200
    http://www.anandtech.com/show/6096/evga-geforce-gt...
    So for $20 you won't notice the difference (in this res this card would be bought for) with these phenomenal results vs. the 7870. Faster in everything, 31%, 37%, 62%, 15%, 11%, 22% & 50%+ (starcraft2)...People won't notice a card running this much faster in most of the games you tested? You really want to stand by that statement?
    That's from the wall response to him :)
    And now the 7950B (from the same post, all his benchmarks):
    "while the 7950 is anywhere between a bit faster to a bit slower depending on what benchmarks you favor." HERE WE GO AGAIN PEOPLE: Follow along:
    Civ5 <5% slower
    Skyrim >7% faster
    Battlefield3 >25% faster (above 40% or so in FXAA High)
    Portal 2 >54% faster (same in 2560x...even though it's useless IMHO)
    Batman Arkham >6% faster
    Shogun 2 >25% faster
    Dirt3 >6% faster
    Metro 2033 =WASH (Zotac 51.5 vs. 7950 51...margin of error..LOL)
    Crysis Warhead =WASH (ref 7950 (66.9) lost to ref 660 (67.1), and 7950B 73.1 vs 72.5/70.9/70.2fps for other 3 660's) this is a WASH either way.
    STARCRAFT 2: 7950 (88.2fps) VS. GTX670 (121.2fps) @1920x1200
    So another roughly 37% victory for 660TI extrapolated? I'll give you 5 frames for the Boost version...Which still makes it ~30% faster in Starcraft 2.
    So vs. the 7950B which you MADE UP YOUR MIND ON, here's your quote:
    "If we had to pick something, on a pure performance-per-dollar basis the 7950 looks good both now and in the future"

    we have victories of 25% (bf3), 54%(P2), 7%(skyrim), 25% (shog2) 30%+ (sc2), 6% (dirt3)
    1 loss at <5% in CIV5 and the rest washes (less than 3%)... But YOU think people should buy the card that gets its butt kicked or is a straight-up wash. You said your recommendation was based on 1920x1200... NOT 2560x1600... Well, suck it up, this is the truth here in your OWN benchmarks... Yet you've ignored them and LIED. Let me quote you from ABOVE again lest you MISSED your own words:
    "And 1920x1200 is what we based our concluding recommendations on. "
    Then explain to me how you can have a card that wins in ONE game by <5% and is BEATEN in 6 games by OVER 6%, with 4 of those 6 games BEATEN by >25%, and yet still come up with this ridiculous statement (again, your conclusion):
    "On the other hand due to the constant flip-flopping of the GTX 660 Ti and 7950 on our benchmarks there is no sure-fire recommendation to hand down there. If we had to pick something, on a pure performance-per-dollar basis the 7950 looks good both now and in the future"
    Are you smoking crack, or just BIASED? Being paid by AMD? Well? The 7950 is NOT cheaper than the 660Ti. Can you explain your math, sir? I'm confused. What "flip-flopping of the GTX 660 Ti and 7950"?? You're making these statements based on 1920x1200, right? OR should I quote you again??...

    You really should read his post first, and then the ENTIRE post from me. These are the AnandTech benchmarks used here. Whip out your calculator and prove me wrong if you can. I already showed the 660 can go to 1300 on the GPU / 7.6GHz memory, far above the Zotac. The Zotac is NOT using a lot of watts OR running HOT in this review...LOL. Lots of room to spare, no doubt.
    LOAD @ 322W... the 7950 is 353W before overclocking it...LOL.
    http://www.anandtech.com/show/6159/the-geforce-gtx... Never mind the boost (373W) and the 7970 (391W), neither of which will hit 1200 reliably and may need 1.225-1.275V to do it:
    You read the chart right?
    http://www.behardware.com/medias/photos_news/00/36...
    Read it again: BINNED 7970s (4) and 7950s (7) in there from MAY, no boost, just hot volts to supposedly great chips :) Look at that, your Twin Frozr III is in there too...ROFLMAO. Hell, the Sapphire 7950 OC couldn't even hit 1200. The XFX Black couldn't hit more than 1125, and the PowerColor HD 7950 PCS+ stopped at 1050! LOL. MANY couldn't hit more than 1125! The default for the 7970 is 1.125V, and it hits 391 watts already in AnandTech's 660 article. What do you think it hits at these voltages?
    You are quoting MINIMUM voltages as shown in the same page at behardware.com article above.
    http://www.behardware.com/medias/photos_news/00/36...
    You don't do those volts at 1150MHz! Also shown on the same page; read it again, pal:
    http://www.behardware.com/articles/853-18/roundup-...
    Nice try though.
  • CeriseCogburn - Saturday, August 25, 2012 - link

    Yeah, Russian cheated on the power comparison - it's SAD, pathetic, sorry, lame, etc.
    Hey, they should just say they don't give a crap, they're an amd fanboy no matter what. Having to lie makes it so sickeningly pathetic.
    Be a freakin' fanboy without LYING and spinning - heck, then you actually have a REASON to be one.
    (I still say it's their immense OWS-style hatred of nVidia) - I mean, maybe they're just wacked-out fanboys, but there seems to be some other mental aberration behind it all - hence we always get the "who can profit", "nVidia is going to lose money", "they're scalping", "they held back and the 680 is really a 560Ti replacement"... on and on and on and on, as if nVidia's bottom line is the reason they have their FANGS bared in amd-love-it-or-die fashion...
    YEP - see, so facts and sanity and truth do not matter - it's a freaking mission, man - they're on a mission.
    LOL
    "It OCs to 1300, it's 40%!!!" "It's 8W diff at 880, what power prob?"
    Yeah... oki dokie.
  • RussianSensation - Saturday, August 25, 2012 - link

    I compared power consumption for stock cards to disprove the notion that the HD7950 somehow uses 200-225W of power; people fail to comprehend power consumption charts that use full system power at the wall, without accounting for PSU inefficiency either.

    Power consumption is the key topic of discussion now? 8-10W of difference is a world of difference now? So you must have skipped the GTX400/500 series entirely then.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    Yeah, 8-10 watts, another big fat lie.
    Plus amd screwed you on the 7950GE with higher core voltage; heck, they're converting the WHOLE LINE - SO ALL THE POWER USAGE GOES UP, DUMB DUMB!
  • RussianSensation - Saturday, August 25, 2012 - link

    So much wasted space to say nothing useful.

    Only 1 out of 19 cards on Newegg is a reference HD7950 with GPU Boost. Sapphire DX, MSI TF3, Gigabyte Windforce 3x, HIS ICE-Q2 are all for sale and all overclock to 1100-1150 on stock 1.175V of Tahiti XT. Thanks for wasting a wall of space arguing about the reference 7950 B card no one will buy on this forum.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    The new core voltage on amd's failing process for the 79xx series means all the watt charts are too low.
    LOL
    MOAR POWER, MORE $$$$$$$$, amd epic fail
  • Ananke - Wednesday, August 22, 2012 - link

    :) The GTX660Ti has an asymmetric 192-bit bus, i.e. that thing is effectively a 128-bit card... The GTX670, on the other hand, is good hardware, but too expensive.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    The "Blown away" verified owner doesn't agree with your amd fanboy lies:

    Pros: Out of the box very impressive. Card is well designed, lots of cooling added to the Twin Frozr. After some tweaking, I benchmarked 1330mhz stable while adding ~500mhz to memory as well. Max temp in BF3 overclocked everything Ultra was 76 degrees. Never dipped below 50fps from what I noticed.

    Overclocked, this card outperforms a 670 and is just barely shy of a 680 on benchmarks. $300 is worth every penny, plus with Borderlands 2 that technically drops it down to $240, making it more of that low-range card people were wanting out of it.
  • claysm - Tuesday, August 21, 2012 - link

    I use a 2560x1440 monitor...
  • Flunk - Tuesday, August 21, 2012 - link

    High end graphics cards aren't really necessary unless you're driving screens that big. If they didn't put in those benchmarks everything would be CPU bound and therefore impossible to compare.

    I have a 1080p screen (23") driven by 460 GTX x2 and it's more than enough. You just don't need the GPU power at low resolutions.
  • RussianSensation - Tuesday, August 21, 2012 - link

    Good point. You can turn down a lot of settings in games such as Ultra in BF3 or Extreme in Crysis 2. That setting has a huge performance hit and barely any improvement in image quality. In fact in many games I can't even tell the difference between High and Very High/Ultra.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    " RussianSensation on Tuesday, August 21, 2012
    Good point. You can turn down a lot of settings in games such as Ultra in BF3 or Extreme in Crysis 2. That setting has a huge performance hit and barely any improvement in image quality. In fact in many games I can't even tell the difference between High and Very High/Ultra. "

    Yet you go on about 8xmsaa on previous pages claiming amd wins....
    Which lie is it going to be next time ?
  • TheJian - Tuesday, August 21, 2012 - link

    None of the games in the 660Ti review at AnandTech are CPU-limited at 1920x1200. The closest you come to that is Skyrim, where, with 4xMSAA/16xAF on, the spread of the cards is from 43.6 to 98.2fps. The difference between the Zotac AMP (92.8) and the reference GTX 670 (98.2) is 5.4fps (~5%), and the 7950 Boost only scored 86. Quite the spread. They'd all be pegged and within a fps of each other if CPU-limited.

    Funny to note that the 660 is over 2x faster than the 560Ti (43.6fps) it is replacing, and yet Ryan says it isn't worth the upgrade...LOL.
    From the conclusion page and part of what I attacked in my wall of text :) :
    http://www.anandtech.com/show/6159/the-geforce-gtx...

    "The GTX 660 Ti is actually a great upgrade for the GTX 560 Ti (and similar cards) from a performance standpoint, but despite the similar name it can’t match the GTX 560 Ti’s affordability. This entire generation has seen a smaller than normal performance increase at the standard price points, and the GTX 660 Ti doesn’t change this. If you’re frugal and on Fermi, you’re probably going to want to wait for whatever comes next."

    Nope, not as cheap as the 560, but it's over two times faster in Skyrim, as shown. But you should wait for whatever comes next, because the 660Ti just isn't a good improvement...LOL. A smaller performance gain for the entire generation? So a 112% performance improvement isn't worth the money? A small gain? When was the last time we replaced the last model with a card that's 100% faster? It's still going for ~$150. Double the power, double the price...Hmm...I don't get the conclusion, I guess. It's 50-100% faster depending on the game.
    http://www.anandtech.com/show/4221/nvidias-gtx-550...
    560Ti vs. 460
    49.9 vs. 38.5 Civ5 29%
    61 vs. 44.8 Bad company 2 36%
    44 vs 33.1 Stalker CP 32%
    82.6 vs. 63.6 Dirt =29 %
    74 vs. 60.5 Mass Effect 2 =22%
    119 vs. 93 HAWKX =27%
    31 vs. 24 Metro 2033=29%
    There are more games, I just got bored.
    Pretty much 22-36%, and I'm using the 1920x1200 scores, not the 1680x1050 ones (which would make it less, as the 460 would be less stressed memory-wise), so this is a best case for the gaps.
    There isn't a game in the list that doesn't improve by over 35% in the 660Ti review vs. the 560.
    Civ 5: 660Ti vs. 560Ti in the same review
    51.9 vs. 35.9 = 44%. So, directly comparing to last gen: the 560Ti replacing the 460 only improved 29% in Civ 5 way back then, so the UPGRADE to the 660 is looking like a 15-point HIGHER improvement than the previous gen... But Ryan thinks that 44% is LOWER than 29%. Where did he get his diploma? Skyrim shows a 112% improvement... Again, 112% is apparently worse than the previous gen's average improvement of 22-36%. I'm not sure what planet you are on if 112% is LOWER than 22-36%. Strange conclusions by Ryan...ROFL. (The percentage arithmetic is sketched right after this comment.)

    So: monitors you should buy from fly-by-night joints in Korea/New Zealand, 112% is less than 36% (or apparently 22% too...LOL), 44% is lower than 29% (really?), and the list goes on and on with the inaccuracies of this conclusion in the 660 Ti article... There's just so much that is whacked. It was really too easy to pick apart. People may not like walls of text, but everyone should read page 4 of the All Comments section (in the 660 review) for his LAME excuses and my rebuttal. It's really quite funny. I challenge him to debate it, ANY of it, if he can. Heck, anyone, take a shot at it...LOL. They're his benchmarks and his words. :) Incidentally, RYAN did the review of the 550Ti, which I got the %'s from for the 560Ti vs. 460 numbers...LOL. He can feel free to argue with himself too... :)
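The generational percentages argued over above are plain fps ratios; a minimal sketch of that arithmetic using the numbers quoted from the two reviews (figures as quoted in the comment, not re-checked):

```python
# Percent improvement of one fps figure over another, applied to the numbers
# quoted in the comment above (taken from the cited reviews, not re-benchmarked).
def pct_faster(new_fps: float, old_fps: float) -> float:
    return (new_fps / old_fps - 1.0) * 100.0

# Civ 5 at 1920x1200: 660 Ti vs. 560 Ti in the same review (51.9 vs. 35.9 fps)
print(f"Civ 5, this gen:  {pct_faster(51.9, 35.9):.0f}% faster")   # ~45% (quoted as 44%)
# Civ 5 last gen: 560 Ti vs. 460 in the older 550 Ti review (49.9 vs. 38.5 fps)
print(f"Civ 5, last gen:  {pct_faster(49.9, 38.5):.0f}% faster")   # ~30% (quoted as 29%)
# Skyrim: Zotac AMP 660 Ti (92.8 fps) vs. 560 Ti (43.6 fps)
print(f"Skyrim, this gen: {pct_faster(92.8, 43.6):.0f}% faster")   # ~113% (quoted as 112%)
```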
  • Blazorthon - Tuesday, August 21, 2012 - link

    Didn't affect a thing? That memory bus (beyond what people have said that proves you wrong about it anyway) and the hardware associated with it are the only difference between the 660 Ti and a 670, if I remember correctly. The difference in performance between the two cards shows that the 660 Ti's memory bus does in fact have a huge impact on its performance. The GTX 670 is already a memory-bandwidth-bottlenecked card when its GPU is running code that it doesn't suck at (too much tessellation, DirectC, AA, et cetera), and the 660 Ti proves it.

    If anything, AMD lowered prices because of not wanting to confront the hype/propaganda that is going around about the 660 Ti, not because their cards are worse. They most certainly aren't, anyway, and I have no idea what is going on with Nvidia that would cause them to release such extremely unbalanced GPUs as the Kepler GPUs. Fermi, although hot, was a lot better as an architecture in many ways than Kepler and was also not kept so memory-bandwidth constrained. A die shrink of it with minor reorganizing would probably have been better than Kepler.
  • RussianSensation - Tuesday, August 21, 2012 - link

    It's not just the memory bus width.

    The 660 Ti has 24 ROPs vs. 32 ROPs for the 670. ROPs coupled with memory bandwidth are what support MSAA performance.
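    (A rough back-of-envelope illustration of why that matters - my own numbers, not anything from the review, and it ignores framebuffer compression, caching and overdraw, so read it as a scaling argument only.)

```python
# Rough illustration (my own arithmetic, not from any review): the raw
# color + depth footprint the ROPs have to touch scales linearly with the
# MSAA sample count. Real GPUs compress and cache, so this is a ceiling,
# not a measurement.

def framebuffer_mib(width, height, msaa_samples, bytes_color=4, bytes_depth=4):
    """Uncompressed color + depth storage for one frame, in MiB."""
    samples = width * height * msaa_samples
    return samples * (bytes_color + bytes_depth) / (1024 ** 2)

for s in (1, 4, 8):
    print(f"2560x1600 @ {s}x MSAA: {framebuffer_mib(2560, 1600, s):.0f} MiB")
# ~31 MiB at 1x, ~125 MiB at 4x, ~250 MiB at 8x -- every pass over that
# surface costs ROP throughput and bandwidth, which is where 24 ROPs on a
# 192-bit bus gives ground to 32 ROPs on a 256-bit bus.
```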
  • TheJian - Tuesday, August 21, 2012 - link

    LOL at MSAA...8x MSAA in Skyrim is totally playable even at the useless res of 2560x1600 at HardOCP. Two of the games there were unplayable, just as Anandtech's minimums would have shown if he'd included them - most reviews don't show these. Gee, could it be because many would drop well below 30fps, making them unplayable? Why did Ryan leave out Batman hitting 10fps on the 7950 (the avg is only 35.4!)? Heck, it runs at 10-15fps for a long stretch in HardOCP's charts. That's not going to be playable, but Anandtech acts like you'd run these games there on a $300 card. Witcher 2 hits 16fps on the 7950. Pray to god, and you still can't get an overclock big enough to overcome that and reach 30fps... Never mind the Batman scores being near single digits.

    http://hardocp.com/article/2012/08/21/galaxy_gefor...

    Just keep saying it over and over though...Maybe the cards will slow down and make you right. Note that 8x MSAA is more demanding than the 4x MSAA Ryan ran in Skyrim (noted by the slight drop in fps at HardOCP). So the card can handle MSAA, just not at 2560x1600 all the time, which is why I say there's no point in running it or the 7950 there and quoting victories for either side (if everything isn't playable above 30fps, who would try it?). No 17-27in monitor runs at that res anyway. And if you actually have the cash for an $850 Dell U2711, you probably have the cash for a 7970 GHz/GTX 680/690 or dual cards to get rid of the dips below 30fps altogether. Which is what $500-1000 cards and SLI/CrossFire are for! Get it yet? Still confused?

    MSAA comment from the article:
    "In Skyrim the overclocked GALAXY GTX 660 Ti GC 3GB can't quite reach GTX 670 performance at 2560x1600 with 8X MSAA, but it comes close. So close you couldn't tell the two apart".......LOL. But that 660 sucks right? Kyle Bennet and Hardocp must be idiots and you're just right. :) OK, I give up, no amount of evidence or proof I can tell you fanboy will get you to drop it. Forget the benchmarks, the 660 TI just can't run MSAA no matter what review sites prove it can... ;) I just believe YOU. Russian's word is worth more than some stinking review site's benchmarks...All bow to the great RUSSIAN...Who cares if Kyle and Hardocp.com proved the 660 TI can run memory at 7.6ghz and OC to 1300mhz, he's an idiot and you just can't do what he proved dang it. You can't catch the gtx 670...He's crazy. :) That MSAA stuff just won't run on that crappy 660 TI. Russian said it's nuts.

    TROLL much lately?
  • CeriseCogburn - Saturday, August 25, 2012 - link

    Ahh... well, thanks again. I feel better after trying, with much less success, to hammer that home a few months ago.
    The excuse is always the same - until it flip-flops 100% - first Russian goes wonky over 8x MSAA, then he tells Flunk and his 2x 460s above that he can't tell the difference. LOL
    Dude, it is INCREDIBLE the crap they pull.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    If anything, AMD lowered prices because of not wanting to confront the hype/propaganda that is going around about the 660 TI, not because their cards are worse. They most certainly aren't anyway and I have no idea what is going on

    enuff said
  • Belard - Tuesday, August 21, 2012 - link

    It's shocking that ANYONE games at anything lower than 1920x1080, considering that such monitors start at $120+. And that is considered standard res in my book.
  • Parhel - Tuesday, August 21, 2012 - link

    Right. And yet people post these long, crazy fanboy rants regarding $300+ GPUs, and insist that the reviews need to target only those buying $100 monitors. I would bet that most people are using either integrated graphics or a sub-$100 card, especially those with sub-1080p resolutions.
  • CeriseCogburn - Saturday, August 25, 2012 - link

    The lowest-priced 1920x1200 monitor at the egg, out of the 25 currently listed, is $268.98 delivered.
    So in your kookball lying rant, $100 = the cheapest there at $268.98?
    I don't know, bro - I guess if you didn't lie so large I could agree with you?
  • CeriseCogburn - Saturday, August 25, 2012 - link

    Dude, when the amd fanboy doesn't have 5 or 10 bucks extra - for PhysX, for adaptive v-sync, for frame rate target, for ambient occlusion, for driver upgrades for YEARS, LONG YEARS - to buy the nVidia card over the slower and slightly less expensive (according to the amd fanboy penny pincher) card, they DO NOT have $120 for a monitor, either!
    :-) reality may in fact bite
  • fourzeronine - Monday, August 20, 2012 - link

    For compute, the AMD corner went from a no-brainer to "who's Nvidia?"
  • Patflute - Tuesday, August 21, 2012 - link

    These are gaming cards, no one cares about computing.
  • C'DaleRider - Tuesday, August 21, 2012 - link

    That's not what the nvidia fanbois yelled and bleated when the 4xx and 5xx series of cards came out. Amazing what the green sheeple will say to justify their irrational love of a company.
  • Patflute - Tuesday, August 21, 2012 - link

    I'm not a fanboy ;0
  • TheJian - Tuesday, August 21, 2012 - link

    Stop you're killing me. You're making too much sense. :)

    I don't care about 2560x1600 either when no 27in has that res (only 11 on Newegg are 2560x1440, all over $688; the other 41 are all 1920x1080).

    All 68 newegg 24in monitors are 1920x1200 or less. ZERO above it.

    My Dell 24in is 1920x1200, and my 22in LG is 1680x1050. If I actually buy a 30in or go to dual/triple monitors I'll be running a res FAR above either, or even above 2560x1600 - more like 3840x1200 or 5760x1200. I'll most likely be running TWO cards or a 7970 GHz/GTX 680/690 too, because a $280-350 card just isn't enough without dipping below a playable 30fps in a LOT of games at 2560x1600. I won't try it with a rejected heater of a 7950 OC'ed to hell. 7970s are binned.

    Not saying these results aren't important to some, just that dogging a card in this range for bandwidth issues is pointless. Claiming victory in a game where fps drops below 30fps is pointless.

    Fanboys sure do come out when you point out the facts and it's not going their way. The first thing they do is call you one...LOL.
  • RussianSensation - Tuesday, August 21, 2012 - link

    After-market 7950s are binned too. I already showed you their power consumption earlier in the thread. An after-market HD 7950 @ 1100-1150MHz can be achieved at the Tahiti XT stock voltage of 1.175V, under 190W. That's around what a GTX 680 draws, and yet the 7950 costs $330 - $170 in savings.

    Go to our Forum and ask the owners directly since you don't believe actual real world data I keep providing you. Every single person is getting 1100+ OC on the MSI TF3 7950 using less than 1.175V:
    http://forums.anandtech.com/showthread.php?t=22593...
  • TheJian - Tuesday, August 21, 2012 - link

    http://www.behardware.com/articles/853-18/roundup-...

    These guys must be fools too then, with 11 7950 cards tested, plus 7970s (4 cards!). Nah, ignore the review sites. Russian said review sites are wrong...forum people never lie, etc. Russian is always right. You drag out forum posters, I drag out reviews. But they're crazy.
    http://www.behardware.com/medias/photos_news/00/36...
    MINIMUM VOLTAGES AND CLOCKS @ SAID MINS

    http://www.behardware.com/medias/photos_news/00/36...
    "The first value given for each card represents the starting voltage while the orange boxes represent cases where cards were stable on the bench table but not in the casing, where temperatures get higher."

    Only 7 of the 11 can do what you say here. 5 didn't make it even at 1.2V. NONE made it over 1200MHz and only TWO of the 11 hit that. Those chips will be running out soon and only the Boost models will be left for 7950s. Note you're talking JUST 7950s; my link covers 7970s too, also not going over 1200, and ONLY ONE of the 4 hit 1200! These guys are pretty thorough...but you're right ;) they just had bad chips, because those cards can do no wrong in your fanboy eyes, eh?

    http://www.guru3d.com/article/radeon-hd-7950-overc...
    1.25V just to hit 1150MHz, at 217W JUST FOR THE CARD - the whole system is obviously more. That's ~80 extra watts for the overclock, since the card scored 138W alone at stock. Guru3D are idiots too...You're right, I should just depend on forum users; review sites are all getting bad chips and don't know what they are doing.
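    (Side note: that 138W-to-217W jump is roughly what a crude dynamic-power model predicts. This is my own sketch, not Guru3D's methodology, and the ~40W "fixed" share for memory/VRM/fan plus the 800MHz/1.09V stock point are assumed numbers purely for illustration.)

```python
# Crude sanity check of the Guru3D numbers quoted above: core dynamic power
# scales roughly with frequency * voltage^2, while some share of board power
# (memory, VRM loss, fan) doesn't scale with the core OC at all. The 40 W
# fixed share and the 800 MHz / 1.09 V stock point are assumptions for
# illustration, not measured values.

def oc_card_power(stock_w, fixed_w, f0_mhz, v0, f_mhz, v):
    """Estimate card power after a core overclock from (f0, v0) to (f, v)."""
    dynamic_w = stock_w - fixed_w
    return fixed_w + dynamic_w * (f_mhz / f0_mhz) * (v / v0) ** 2

est = oc_card_power(stock_w=138, fixed_w=40, f0_mhz=800, v0=1.09,
                    f_mhz=1150, v=1.25)
print(f"Estimated card power at 1150 MHz / 1.25 V: {est:.0f} W")  # ~225 W
# In the same ballpark as the ~217 W Guru3D measured -- i.e. that overclock
# genuinely costs a big chunk of extra power; it's not a measurement fluke.
```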

    http://www.anandtech.com/show/6152/amd-announces-n...
    The 7970 GHz Edition (meaning clocked up): 376W vs. 333W for the GTX 680. So 43 watts different. Heck, I won't argue with you if you tell me Ryan can't do math, but I think I can. And you will run worse than that, since this card is only at 1GHz. Anandtech are fools too...Russian is right...He's always right. ;) I believe you, fanboy. These numbers lie. The 7970s are the best out there; they have to be, as they're binned for higher speed. But what do 3 review sites know. Russian's right, dang it.
    "After going through the full validation process we were able to hit an overclock of +150MHz, which pushed our base clock from 1000MHz to 1150MHz, and our boost clock from 1050MHz to 1200MHz. Depending on how you want to count this overclock amidst the presence of the boost clock this is either 25MHz better than our best 7970 card, or 75MHz better. In either case our 7970GE definitely overclocks better than our earlier 7970 cards but not significantly so, which is in-line with our expectations."
    So they never quite hit 1200 before at Anandtech...and the GHz Edition is binned BETTER.
    "As with any overclocking effort based on a single sample our overclocking results are not going to be representative of what every card can do, but they are reasonable. With AMD now binning chips for the 7970GE we’d expect to see some stratification among the 7970 family such that high overclocking chips that would previously show up in 7970 cards will now show up in 7970GE cards instead. For penny-pinching overclockers this is not good news"

    So, like I said, good chips will be going out of the 7950 line (if not already) and moving into the even more expensive 7970 GHz Edition. Blame Ryan, not me. ;) WOW, double binning...LOL. They expect the 7970s to NOT hit 1175 anymore? They never had one hit 1200 before the 7970 GHz Edition. OUCH.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    Yes, big ouch, so the Russian has fled in continuing self-induced perma-blindness.
    LOL
    Fanboy disease.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    nVidia has a huge base of compute software and still kicks the crap out of amd in compute, because amd doesn't have the software to support it.

    amd now has WinZip - WinZip, dude...

    the nVidia fanboy WAS CORRECT, and IS STILL CORRECT

    amd has no drivers and no apps, so "compute" means running some crappy amd-friendly benches and WinZip for amd...that's nearly it

    It's not like it's PhysX or something, where one of the card companies can't do it at all... and that doesn't matter, right, even though PhysX is more used than compute... especially with gamers running gaming cards at a gaming review...

    BUT - the nVidia fanboys are the ones who flip-flopped? LOL
  • RussianSensation - Tuesday, August 21, 2012 - link

    1) Bitcoin mining
    2) Accelerated Photoshop filters and WinZip 16.5 (see the Tom's Hardware review).

    It may not matter but those are "free" features of GCN architecture.

    Also, it's now impacting game performance severely:

    - Dirt Showdown & Sniper Elite V2 runs very slow on GTX600 cards:
    http://www.xbitlabs.com/articles/graphics/display/...

    - Sleeping Dogs, you'll see benchmarks shortly in reviews.

    That's already 3 games where NV seems to be unable to fix the performance.
  • TheJian - Tuesday, August 21, 2012 - link

    Bitcoin mining was over ages ago. Or do you like to run your computer at 100% to try for something that can hardly be found anymore? The quick bitcoins were gone AGES ago. You won't pay for your card mining bitcoins. ROFL. I don't care how many cards you buy, you won't magically find them daily...LOL. By now the bots will beat you to anything :) Also, it costs money to run your GPU all day folding, bitcoin mining, etc. Electricity isn't free. But have fun.

    You keep trying to find bitcoins though...Nvidia junked that capability for a reason. Gamers don't care (not the smart ones). Dedicating gaming-GPU die space to this tech is a waste. Nvidia rightly axed it and gave us a cooler, lower-wattage GAMING card, which is exactly what I plan to use it as. You go right on hashing though. Other stuff? I don't buy a $300 gamer card and expect it to be used for Quadro/FireGL work, or things a quad-core could do just fine. Those cards exist for a reason :) Is Rightware a game?...LOL.
    http://www.tomshardware.com/reviews/adobe-cs5-cuda...
    Never heard of CUDA, huh? That's TWO years old. But only count on the filters, as Adobe seems to have gone OpenGL/OpenCL for the framework itself. However, the filters themselves can do whatever they want. As dumb as it is to use a gaming card for this crap:
    http://www.tomshardware.com/reviews/geforce-gtx-66...
    And the wind blows the other way...Those are GTX680's blowing away AMD's best right? But again...Pointless for a gaming discussion.

    Toms also screwed up by taking every card they had in the 660 Ti review (AMD too!) and clocking them at reference speeds for the benchmarks. Why review the cards at all then? I'm talking both sides, no fanboy talk here. Read the fine print on the test setup page. You see? That's what those pages are for, so you can see how things are set up and discount them when necessary.
    http://www.metacritic.com/game/pc/sniper-elite-v2 SCORE 65...LOL
    I wouldn't PIRATE that gem let alone pay for it or play it. 1 positive review.
    http://www.metacritic.com/game/pc/dirt-showdown Score 72 (user score 4.8)...Victories in games people play count. GAMESPY quote (score 40):
    "DiRT: Showdown delivers bargain-basement entertainment value for the high, high price of $50. With its neutered physics, limited driving venues, clunky multiplayer, and diminished off-road racing options, discerning arcade racing fans should just write this one off as an unanticipated pothole in Codemaster's trailblazing DiRT series."
    You can quote games we'd never play all you want. You play these?
    You realize Metacritic can point you to FPS games scoring 85+, right? Racing games that score 85+ too, like Dirt 3 (86 if memory serves). Showdown sucks.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    Have plenty of friends who FRIED their amd cards trying to find bitcoins... LOL

    They just died - dead as a doornail... one bakejob revived one amd crapper for a little bit then it croaked forever

    Be careful of used amd bitcoin cards, they're on the shaky edge of electromigration and blown with blank screen outputs
  • CeriseCogburn - Thursday, August 23, 2012 - link

    It may not matter but those are free features.... LOL

    nVidia has had Photoshop acceleration for a long, long, long time.
    winzip is a joke

    Since older amd cards excelled at bitcoin while they sucked miserably at compute, THAT DOESN'T COUNT FOR COMPUTE DUMMY.

    So amd has winzip, and finally halfway caught up in photoshop with lousy drivers

    Dirt showdown looks like BF3 for amd, where amd loses badly -

    Sniper elite 2 looks like a normal win

    Sleeping dogs ... uhh okay whenever we get a bench you don't have

    So amd has hacked dirt 3 again, like they hack other gaming evolved games, no problem nVidia will probably pass them up in it soon when they unscrew amd's hacks

    So like, you've got WinZip left, really. That's umm great compute, dude - you can WinZip.
  • philipma1957 - Tuesday, August 21, 2012 - link

    Wrong. If you buy an HD 7950, do some bitcoin mining when you don't game. It makes about 30 bucks a month, so don't say the card costs 320 or 350 vs. 400 for a GTX 670.

    The card effectively costs 250 if you figure in some bitcoin money.
  • RussianSensation - Tuesday, August 21, 2012 - link

    I don't want to start a flame war. Many people ignore bitcoin, the electricity cost isn't worth it for them (Hawaii, Europe) or they aren't interested in learning about it. So I only bring it up from time to time. However, I totally agree with you that with bitcoin mining on the side, HD7900 series cards start to look even better.

    With the value teetering around the $9-10 mark, that starts to add up over the course of 5-6 months, to the point where eventually the card is paid off fully :)
    http://www.bitcoincharts.com/markets/mtgoxUSD.html
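    (To put rough numbers on the electricity caveat - illustrative assumptions only; the 200W draw and the power rates are mine, and the ~$30/month gross is the figure quoted above.)

```python
# Illustrative payback arithmetic only -- the ~$30/month gross is the figure
# quoted above, and the 200 W draw and power rates are assumptions, not
# measurements. BTC difficulty and price move constantly, so treat this as
# a template, not a forecast.

def monthly_net_usd(gross_usd, card_watts, usd_per_kwh, hours_per_day=24.0):
    """Mining revenue minus the electricity the card burns earning it."""
    kwh_per_month = card_watts / 1000.0 * hours_per_day * 30.0
    return gross_usd - kwh_per_month * usd_per_kwh

for rate in (0.10, 0.20, 0.35):   # cheap US power vs. Europe/Hawaii-ish rates
    print(f"${rate:.2f}/kWh -> net ${monthly_net_usd(30.0, 200, rate):+.2f}/month")
# Roughly +$15.60, +$1.20 and -$20.40 -- at cheap power the card slowly pays
# itself off, at expensive power the "free" money disappears, which is the
# Hawaii/Europe point above.
```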
  • CeriseCogburn - Thursday, August 23, 2012 - link

    In 5-6 months you've put years of use on it, and it might just die, if it hasn't already by then - I've seen many cook before that length of time
  • fourzeronine - Tuesday, August 21, 2012 - link

    Wait what? Nobody cares about compute?

    First off, wrong.

    Secondly, nVidia's solution for compute is closing the source and charging more for less performance because CUDA is "special".

    Third, for graphics professionals it's extremely important. When I am not gaming I am simulating and reconstructing geometry. AMD is the absolute choice for users like me and many others in my field. Even the affordability of the modular architecture in the CPUs is VERY good for me (even at this early stage). Just about all my computing at this point is heavily multi-threaded/OpenCL accelerated. People are ignorant of the fact that AMD is currently focusing on a specific audience. But alas, I'm sorry, this is where it's all going. Your comment is ignorant.
  • TheJian - Tuesday, August 21, 2012 - link

    In IT we buy Quadros/FireGLs for this stuff, and it makes users very happy. No argument there.

    Like it or not, CUDA is fast when it's used - even two years ago when Tom's benched it on CS5 filters, which was the easiest thing to google :) You can google CUDA perf all day. Top link in the list below; "Photoshop CUDA acceleration" pops that up.

    Not saying OpenGL/OpenCL are bad...just pointing out CUDA is fast too.
    http://www.tomshardware.com/reviews/adobe-cs5-cuda...
    The filters, though - don't count on PS itself :) as they found out two years ago.
    From page 2 in that article (more than just ps is augmented):
    "Adobe boasts that very large projects can see up to a 10x performance gain from Mercury. Nvidia promises “performance gains of up to 70 times” for visual processing tasks. While we’re more inclined to lean toward Adobe’s number, given some of the GPGPU results we’ve seen in the past, such claims don’t sound infeasible. "

    I tend to believe adobe more too...LOL. But toms wondered 2 years ago.
    "Turn on CUDA, though, and nothing else matters. Our job finishes right around three minutes regardless of HT or core counts. Without CUDA, the best Photoshop can manage is 12 minutes. The difference is like night and day." 4 times faster isn't bad.
  • Patflute - Wednesday, August 22, 2012 - link

    A huge majority of people do NOT buy $300-500 graphics cards to mine, and most people do not know or care about bitcoin mining. Also, why would I leave my computer on and stress the hardware to its limits for such a small profit?

    Most people do not compute, get it through your head.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    bitcoin isn't compute, the rage3d fools lied about it.

    If they could think straight they would have known even 4850's excel in bitcoin mining vs the nVidia equiv....

    IT'S NOT COMPUTE.

    It's another "idiotic lie because amd fanboys are stupid or pathological", or both.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    You don't do crap but play to make flower pics on flikr.

    Another braindead lying amd fanboy with cryptology as an answer.

    Another LIAR.
  • TheJian - Tuesday, August 21, 2012 - link

    ROFL...Score. :)
  • RussianSensation - Monday, August 20, 2012 - link

    http://www.techpowerup.com/170575/Graphics-Shipmen...

    - Nvidia's desktop discrete shipments dropped 10.4% from last quarter
    - AMD saw gains of 2.5% in the discrete desktop category

    AMD is now delivering better performance and price/performance in nearly every price level from $100-$499. Great for competition!
  • TheJian - Tuesday, August 21, 2012 - link

    http://www.techspot.com/news/47593-jpr-discrete-gp...
    Are we talking integrated market share or DISCRETE?
    Nvidia has had 60% for a while.
    You're forgetting Nvidia isn't selling chipsets now:
    http://jonpeddie.com/press-releases/details/graphi...
    "AMD increased its market share to 37.8%, Nvidia’s market share slipped but still retains a large majority at 61.9%. (seeTable 1)"

    Yep, 37.8 is better than 61.9.
    Sales/Profits for the last 12 months:
    AMD= 6.38Billion/-615mil (that's a LOSS - last 10yrs 6bil loss)
    NVDA=3.99Billion/437.8mil (those are profits, last 10yr=2.3Billion GAIN)
    Sales/Income growth last 12 months:
    AMD= 1.1%/5.10%
    NVDA=12.8%/129.5%
    Market cap AMD=2.91bil - NVDA=9.09B
    Debt/equity ratio: AMD = 1.81 (that's BAD), NVDA = 0 (NO debt - good)
    Total liabilities/assets (millions):
    AMD = 3,924 / 5,041
    NVDA = 1,438 / 5,891
    Cash: AMD = 1.5bil, NVDA = 3.278bil
    http://investing.money.msn.com/investments/stock-b...
    http://investing.money.msn.com/investments/stock-p...

    http://www.fool.com/investing/general/2012/07/21/3...
    The Gardners are not stupid. The next trillion-dollar company? Nah, but I'm betting $100 billion, and yes, like a smart man, I own it. Full disclosure. What FOOL wouldn't own it...LOL. I happily admit I am a FOOL... :)
    CAPS rating for NV 4/5. AMD
    http://www.fool.com/investing/general/2012/08/20/c...
    That's from YESTERDAY. OUCH.
    http://www.fool.com/fool/free-report/18/sa-datamin...
    Take a guess...NVDA :)
    But like any good enthusiast...I'm waiting until Black Friday to make my purchases and hoping AMD will pull a trick out of their sleeve for Piledriver...I'm hoping benchmarks will happen before Nov 30. But I'll buy Intel if I have to. The 660 Ti is looking good also (I game at 1920x1200 on my 24in, its native res). Bandwidth is NOT an issue.

    The Steampowered.com hardware survey isn't about sales. It's about what WE ARE USING. Their numbers don't lie. 2560x1600 is a decimal point's worth of users. 1920x1200/1080 is 30%. Below this, 68%; above it, 2%. Nuff said.

    AMD will be out of business or bought by Xmas 2014 (2015? Only if they release some decent CPUs soon). NO, I do NOT like that. I was an AMD reseller for years and pitched them every chance I could, including the FAMOUS white-box Asus P2Bs with no Asus name on them...LOL. I owned one :) Used to sell Step Thermodynamics chips when they were an awesome company :) Anyone remember those bad boys? If our court system didn't suck so bad, they would have given AMD 20bil instead of 1.25billion! Intel earned 60billion over the time they screwed AMD. I know. I watched it as a reseller owning a PC business for 8 years.

    I don't hate AMD. I still have a white T-shirt from my last AMD conference...ROFL (circa 2004 - so what, no holes). It's a nice shirt! Hanes, but I wear it when I can in public...LOL. Seriously, it's nice! And I like advertising for them. HUGE AMD logo on the back, little one on the front. Companies were almost afraid to show their face in the same room as AMD then. Not ASUS! I've got an ASUS blue shirt from the same conference, I think. Another Hanes (but I only sleep in it occasionally...LOL). But I won't buy the AMD stock :)

    My Radeon 5850 smoked the then-HEATER from Nvidia, was much cheaper, and was the only perf card I could live with in TX/AZ. I'm not giving it up either, just passing it to dad - he appreciates NOISELESS GPUs (it is SILENT, you only hear my case fans a bit). Heck, half the time my PC is running I think the GPU's fan isn't spinning...LOL. I almost bought a GTX 285, but the 5850 turned out better all things being equal (especially heat/price).

    AMD has to help themselves or die. They should be clocking their cards higher and charging MORE instead of letting you overclock them on the cheap. People like you loving the low prices is killing them. They need to stop the price wars and charge reasonable prices, or they'll go bankrupt before they have no IP worth selling. They'd look better out of the box and make more money doing it. Whatever. They are about to lose market share until Xmas or longer. Meanwhile NV sells out and makes money while AMD cuts prices to keep hotter/slower cards selling. They need a GTX 690 competitor too! Where is it? Having no "king of the hill" card hurts mind-share, regardless of how stupid that is when either side only sells 10,000 of them.

    That 2.5% blip is 3 months behind (due to how the channel works). The next 3 months will show a flip-flop as NV gets more 600-series cards down to $100 by Xmas. I hope AMD is redesigning something that uses less power and puts out less heat by Xmas, or ouch.

    Economics don't lie. Debt KILLS you - in personal finances AND business AND the country! We'll be bankrupt by 2016 if Obama gets another 4 years. We can't afford another 6 trillion in debt either. The first downgrade of our AAA credit in the history of our holding it, since 1917! If we don't quit spending money on crap like Fisker/Solyndra etc., and don't start DRILLING everywhere we can, we'll be BROKE (oil towns have 3% unemployment). It's simple economics, just like AMD. SADLY. But maybe you live in Russia? :) So you don't care ;)

    AMD is selling 6.38 billion dollars of product a year but losing 629 million on it...That can't go on for much longer. They will be out of cash at this rate in about 2 years (2.25 years from now at 650mil/year, the 1.5B they have in cash is GONE). Check all their financials above...I'm not making this stuff up and I'm NOT happy about it.
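    (The cash-runway math from that last paragraph, spelled out with the figures quoted above; a constant burn rate is obviously a simplification.)

```python
# Spelling out the cash-runway arithmetic from the paragraph above, using
# the figures quoted there (~$1.5B cash, ~$650M/year losses). A constant
# burn rate is a simplification -- real quarters bounce around.

cash_on_hand_usd = 1.5e9
annual_burn_usd = 650e6

runway_years = cash_on_hand_usd / annual_burn_usd
print(f"Runway at current burn: {runway_years:.1f} years")  # ~2.3 years
```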
  • chrnochime - Wednesday, August 22, 2012 - link

    What is it with you and the stupendous drive to make the point that the 660 Ti just makes more sense at that price point/res and that AMD is doomed to fail? If you like Nvidia so much, why aren't you working for them? Or are you their employee already?

    I know people who work for Nvidia (yes, I actually do, unlike the countless others who have imaginary friends who work there) and they sure don't go as far as you do to crow about Nvidia.

  • maximumGPU - Wednesday, August 22, 2012 - link

    He's clearly a troll, and makes a lot of silly remarks and comments. Just ignore him or he'll post another page long answer where he repeats himself.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    He has to correct the lying amd fanboys. Give the guy a break, it's not a nice nor an easy job, and he has provided facts that show the amd fanboys spinning and lying.

    He wouldn't do it at all if he didn't have so many amd trolls blowing it and not getting the facts correct.

    By the way, trolling is the amd fanboy blowing out spun up lies.

    Trolling is not taking the time to show them wrong with links and stats.

    Repeating is required until the lying amd fanboy actually has enough sense or manhood to acknowledge the facts, which as we can see takes quite some time.

    By the way mGPU, what you just did was a total troll, are you an amd fanboy too, or do you want the truth so you can actually get the best deal ?
  • maximumGPU - Sunday, August 26, 2012 - link

    Thanks for offering to enlighten me, but I'm happy with my choice - a GTX 670, if you must know - so I wouldn't exactly describe myself as an amd fanboy.
    And yes, you and that other guy are trolling.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    No you're trolling, a whiner troll, that's what you are.
  • Patflute - Tuesday, August 21, 2012 - link

    I was hoping for a 660, but it seems like it won't be out in the next 30 days. I would really want Sleeping Dogs!!! Hopefully the price of the 7850 goes down to the $190s or less by September 20.
  • TheJian - Tuesday, August 21, 2012 - link

    I don't get it? 660 is available all over. 30 days? Newegg has 12 in stock last I checked.

    Buy now and do AMD a favor. See my post above...They're going bankrupt fast.

    Sleeping dogs does look pretty cool.
  • Pantsu - Tuesday, August 21, 2012 - link

    He meant sans Ti.
  • TheJian - Tuesday, August 21, 2012 - link

    Ah...I get it now...Too tired, didn't notice :) Thanks.

    Well, no matter what he buys at $190 everything is pretty fast these days. If he really can't afford it, wait until black friday like me :)

    I may change my mind on the 660 Ti by then, the way AMD keeps dropping its pricing. They need to re-release some models at higher clocks so they start making some more money.

    As much as I hate that I just said that - it would not make my Black Friday happy...LOL - even $20 times 5-10 million cards can be a TON of money to them per quarter. They have been clocking their cards too low, letting users get too much for free and killing their own business. I think the 7950 should have been clocked at 7970 default speeds, and the 7970 should be at 1GHz already, with a specially binned 1050/1100MHz model at GTX 680 pricing or something. Anything to help, because they aren't getting much help from the CPUs these days. It's amazing they practically sell everything out yet lose 640mil. Who's running their clocking division? :)

    It does you no good to win a price war if you go bankrupt doing it, and they will NEVER price NV to death with 2bil in debt vs. none for NV, which has 3bil in cash. With Intel nibbling from the CPU end (and the GPU end a bit), they just can't keep cutting prices. If they stopped, I think Jen-Hsun would be more than happy to let his stock price rise (I would be happy too..ROFL) and make shareholders happy in the process. He's not the kind of guy who would cut off his nose to spite his face. I think about 40% of his wealth is still in the stock, ~200mil. When it goes down, I'm sure he winces a bit...LOL.
  • Blazorthon - Tuesday, August 21, 2012 - link

    Why even have the 570 in a chart such as the one on the last page, where the cheaper 7870 is far superior?
  • Galcobar - Tuesday, August 21, 2012 - link

    It's a chart displaying what cards are currently available, at what prices. Note the label: price comparison, not performance comparison.

    That said, I didn't find a single 7870 at $250 on Newegg, even when rebates are included. Lowest I could see was a $270 XFX, albeit with a two-game coupon. Did see a 570 at $240 after rebate, however.
  • CeriseCogburn - Thursday, August 23, 2012 - link

    LOL - another amd fanboy dreamer without the facts
  • DinoDinoDino - Wednesday, August 22, 2012 - link

    the 7850 is now hovering at the $190 mark on Newegg :D
    http://www.newegg.com/Product/ProductList.aspx?Sub...
    and I was JUST about to buy the 6950. PFFFFFFFFFFFFFT, no thanks ^O^
  • CeriseCogburn - Saturday, August 25, 2012 - link

    The MSI Twin Frozr GTX 570 is $209.99 AR, and it kicks the ever-loving crap out of the 7850, right...? LOL
    So, like, what's the deal with being 8 months late when AMD can't properly beat nVidia's prior generation, which launched like over a YEAR BEFORE the amd 78xx crap series?
    I mean, amd, like, what's up with that?

  • jfelano - Wednesday, August 22, 2012 - link

    I already see 2GB 7850's at $189 AR
  • CeriseCogburn - Thursday, August 23, 2012 - link

    Are you going to write nVidia a thank you note and tell them how much you appreciate them helping you destroy amd's bottom line even further ?

    Oh sorry, for a second I thought you might care about amd and not just yourself or your fanboyism. With fanboys like those who needs competition?
  • CeriseCogburn - Saturday, August 25, 2012 - link

    Just get a real card and be happy
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    Hey, amd loses again. It's that $200 market nVidia is "ignoring" according to the amd fanboys.
    LOL
  • Targon - Sunday, August 26, 2012 - link

    The GTX 570 came out back at the end of 2010 and cost around $350 at the time. That said, you can ALWAYS find older-generation stuff for less than newer-generation. Performance is a mixed bag when newer-generation products are out: DirectX 10 parts when DirectX 11 is the current version, and when DX 11.1 comes out there will be an automatic depreciation of DX 11.0 and earlier parts, so the price drops even if the performance is high.
  • CeriseCogburn - Wednesday, August 29, 2012 - link

    GTX570 is DX11
