302 Comments

  • Wreckage - Thursday, November 7, 2013 - link

    The 290X = Bulldozer. Hot, loud, power hungry and unable to compete with an older architecture.

    Kepler is still king even after being out for over a year.
  • trolledboat - Thursday, November 7, 2013 - link

    Hey look, it's a comment from a user permanently banned from this website for trolling, posted before anyone could have even read the first page.

    Back in reality, very nice card, but sorely overpriced for such a meagre gain over 780. It also is slower than the cheaper 290x in some cases.

    Nvidia needs more price cuts right now. 780 and 780ti are both badly overpriced in the face of 290 and 290x
  • neils58 - Thursday, November 7, 2013 - link

    I think Nvidia probably have the right strategy. G-Sync is around the corner and it's a game changer that justifies the premium for their brand - AMD's only answer to it at this time is going CrossFire to try and ensure >60FPS at all times for V-Sync. Nvidia are basically offering a single-card solution that even with the brand premium and G-Sync monitors comes out less expensive than CrossFire. The 780 Ti for 1440p gamers, the 780 for 1920p gamers.
  • Kamus - Thursday, November 7, 2013 - link

    I agree that G-Sync is a gamechanger, but just what do you mean AMD's only answer is crossfire? Mantle is right up there with g-sync in terms of importance. And from the looks of it, a good deal of AAA developers will be supporting Mantle.

    As a user, it kind of sucks, because I'd love to take advantage of both.
    That said, we still don't know just how much performance we'll get by using mantle, and it's only limited to games that support it, as opposed to G-Sync, which will work with every game right out of the box.

    But on the flip side, you need a new monitor for G-Sync, and at least at first, we know it will only be implemented on 120Hz TN panels. And not everybody is willing to trade their beautiful-looking IPS monitor for a TN monitor, especially since they will retail at $400+ for 23" 1080p.
  • Wreckage - Thursday, November 7, 2013 - link

    Gsync will work with every game past and present. So far Mantle is only confirmed in one game. That's a huge difference.
  • Basstrip - Thursday, November 7, 2013 - link

    TLDR: When considering G-Sync as a competitive advantage, add the cost of a new monitor. When considering Mantle support, think multiplatform and think of next-gen consoles having AMD GPUs. Another plus side for NVidia is Shadowplay and SHIELD though (but again, added costs if you consider SHIELD).

    Gsync is not such a game changer as you have yet to see both a monitor with Gsync AND its pricing. The fact that I would have to upgrade my monitor, and that the Gsync branding will add another few $$$ to the price tag, is something you guys have to consider.

    So to consider Gsync as a competitive advantage when considering a card, add the cost of a monitor to that. Perfect for those that are going to upgrade soon but for those that won't, Gsync is moot.

    Mantle on its plus side will be used on consoles and PC (as both the PS4 and Xbox One have AMD processors, developers of games will most probably be using it). You might not care about consoles but they are part of the gaming ecosystem and sadly, we PC users tend to get shafted by developers because of consoles. I remember Frankieonpc mentioning he used to play tons of COD back in the COD4 days and said that development tends to have shifted towards consoles so the tuning was a bit more off for PC (paraphrasing slightly).

    I'm in the market for both a new monitor and maybe a new card so I'm a bit on the fence...
  • Wreckage - Thursday, November 7, 2013 - link

    Mantle will not be used on consoles. AMD already confirmed this.
  • althaz - Thursday, November 7, 2013 - link

    Mantle is not used on consoles...because the consoles already have something very similar.
  • Kamus - Thursday, November 7, 2013 - link

    You are right, consoles use their own API for GCN, guess what mantle is used for?
    *spoiler alert* GCN
  • EJS1980 - Thursday, November 7, 2013 - link

    Mantle is irrefutably NOT coming to consoles, so do your due diligence before trying to make a point. :)
  • fewafwwaefwa - Thursday, November 7, 2013 - link

    die.
  • looncraz - Thursday, November 7, 2013 - link

    When game producers author their games they will do it with a mind towards Mantle, exploiting the AMD GPU characteristics that Mantle exposes on PCs for their console games.

    When creating portable software you create as thin an abstraction layer as possible; that layer will now be much closer to the metal, with unoptimized DirectX alternatives being added manually. That could very well mean that AMD hardware will have a noticeable advantage on PCs, and game producers will only need to do a little extra work to become compatible with other DX-10/11 compatible video cards on Windows/Linux - so nVidia will become something of a "don't forget about me!" rather than "let's build to a generic platform and pull in the nVidia GPU extensions..."
  • Basstrip - Friday, November 8, 2013 - link

    I think they've ALWAYS programmed directly to the core. I think it's safe to assume that the processes translate fairly well and that although they might not be the same, they are similar.

    It just seems so economic to streamline the whole process. Less of a headache than constantly trying to optimize things for multiple platforms.

    AMD chips on consoles may not be able to support mantle on the hardware side but programming for consoles and for pc will definitely NOT be 2 completely different things.
  • elajt_1 - Friday, November 8, 2013 - link

    Something I read on Extremetech: Feedback we’ve gotten from other sources continues to suggest that Microsoft’s low-level API for the Xbox One is extremely similar to Mantle, and the difference between the two is basically semantic. This doesn’t square very well with Microsoft’s own statements; we’ll continue to investigate.
    http://www.extremetech.com/gaming/168671-xbox-one-...
  • klmccaughey - Monday, November 11, 2013 - link

    The difference is a couple of header files. Izzy Wizzy! And you have your API-calling code on Xbox transferable to a PC; the header files map the API calls to the Mantle API - but both APIs are essentially the same. It couldn't be easier.
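
    Something like this, conceptually - a minimal sketch, and since neither the console API nor Mantle's entry points are public, every identifier below is invented for illustration:

        typedef struct CmdBuf CmdBuf;   /* hypothetical handle types */
        typedef struct Queue  Queue;

        /* hypothetical native entry points on each side */
        void xbCmdDraw(CmdBuf *c, int first, int count);
        void xbQueueSubmit(Queue *q, CmdBuf *c);
        void mantleCmdDraw(CmdBuf *c, int first, int count);
        void mantleQueueSubmit(Queue *q, CmdBuf *c);

        #ifdef USE_MANTLE              /* PC build */
          #define gfxCmdDraw      mantleCmdDraw
          #define gfxQueueSubmit  mantleQueueSubmit
        #else                          /* console build: same names, native API */
          #define gfxCmdDraw      xbCmdDraw
          #define gfxQueueSubmit  xbQueueSubmit
        #endif

        /* engine code is written once against the gfx* names; the "port"
           is then a recompile with the other branch of the header selected */
        void drawFrame(CmdBuf *cmd, Queue *q, int vertexCount) {
            gfxCmdDraw(cmd, 0, vertexCount);
            gfxQueueSubmit(q, cmd);
        }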
  • polaco - Friday, November 8, 2013 - link

    The point of Mantle, I think, is to provide an easy way to port from PC to console or console to PC, giving the possibility of easier cross-compilation.
  • L33T BEANS - Friday, November 8, 2013 - link

    Judging someone's intelligence by a single statement is unwise.
  • Totally - Sunday, November 10, 2013 - link

    Reading these comments makes me wonder if these people slinging Mantle around like a buzzword actually know what it does, because going by the comments alone, trying to pit it against G-Sync, they clearly don't. Mantle is as relevant to gamers as CUDA is. Yes, it does have a direct impact, but the benefits aren't for the end user.
  • klmccaughey - Monday, November 11, 2013 - link

    You do not understand. The API on the consoles is basically "Mantle". Mantle copies verbatim the API calls for the consoles. They just call it the API on the console. Port the code across, change a few headers, and you have your Mantle calls ;)
  • MonkeyM - Sunday, November 10, 2013 - link

    They will sell DIY kits, you don't need a new monitor, as per the press conference.
  • yuko - Monday, November 11, 2013 - link

    For me neither of them is a game-changer... G-Sync, SHIELD... nice stuff I don't need.
    Mantle: another nice approach to creating a semi-closed standard. It's not as if DirectX and OpenGL already exist and work quite well; no, we need another low-level standard where AMD creates the API (and to be honest, they would be quite stupid not to optimize it for their hardware).

    I can only hope that Mantle will flop; it does no favors to customers or the industry. It's just good for marketing but has no real-world use.
  • Kamus - Thursday, November 7, 2013 - link

    Nope, it's confirmed for every Frostbite 3 game coming out (that's at least a dozen so far), not to mention it's also officially coming to Star Citizen, which runs on CryEngine 3, I believe.
    But yes, even with those titles it's still a huge difference, obviously.

    That said, you can expect that any engine optimized for GCN on consoles could wind up with mantle support, since the hard work is already done. And in the case of star citizen... Well, that's a PC exclusive, and it's still getting mantle.
  • StevoLincolnite - Thursday, November 7, 2013 - link

    Mantle is confirmed for all Frostbite powered games.
    That is, Battlefield 4, Dragon Age 3, Mirror's Edge 2, Need for Speed, Mass Effect, Star Wars Battlefront, Plants vs. Zombies: Garden Warfare and probably others that haven't been announced yet by EA.
    Star Citizen and Thief will also support Mantle.

    So that's EA, Cloud Imperium Games and Square Enix that will support the API, and it hasn't even been released yet.
  • ahlan - Thursday, November 7, 2013 - link

    And for Gsync you will need a new monitor with Gsync support. I won't buy a new monitor only for that.
  • jnad32 - Thursday, November 7, 2013 - link

    http://ir.amd.com/phoenix.zhtml?c=74093&p=irol...
    BOOM!
  • Creig - Friday, November 8, 2013 - link

    Gsync will only work on Kepler and above video cards.

    So if you have an older card, not only do you have to buy an expensive gsync capable monitor, you also need a new Kepler based video card as well. Even if you already own a Kepler video card, you still have to purchase a new gsync monitor which will cost you $100 more than an identical non-gsync monitor.

    Whereas Mantle is a free performance boost for all GCN video cards.

    Summary:
    Gsync cost - Purchase new computer monitor +$100 for gsync module.
    Mantle cost - Free performance increase for all GCN equipped video cards.

    Pretty easy to see which one offers the better value.
  • neils58 - Sunday, November 10, 2013 - link

    As you say Mantle is very exciting, but we don't know how much performance we are talking about yet. My thinking on saying that crossfire was AMD's only answer is that in order to avoid the stuttering effect of dropping below the Vsync rate, you have to ensure that the minimum framerate is much higher, which means adding more cards or turning down quality settings. If Mantle turns out to be a huge performance increase things might work out, but we just don't know.

    Sure, TN isn't ideal, but people with gaming priorities will already be looking for monitors with low input lag, fast refresh rates and features like backlight strobing for motion blur reduction. G-Sync will basically become a standard feature in a brand's lineup of gaming-oriented monitors. I think it'll come down in price a fair bit too once there are a few competing brands.

    It's all made things tricky for me. I'm currently on a 1920x1200 *VA monitor with a 5850 and was considering going up to a 1440p 27" screen (which would have required a new GPU purchase anyway); G-Sync adds enough value to gaming TNs to push me over to them.
  • jcollett - Monday, November 11, 2013 - link

    I've got a large 27" IPS panel so I understand the concern. However, a good high-refresh panel need not cost very much and still looks great. Check out the ASUS VG248QE; I've been hearing good things about the panel and it is relatively cheap at about $270. I assume it would work with G-Sync but I haven't confirmed that myself. I'll be looking for reviews of Battlefield 4 using Mantle this December, as that could make up a big part of the decision on whether my next card comes from Team Green or Red.
  • misfit410 - Thursday, November 7, 2013 - link

    I don't buy that it's a game changer, I have no intention of replacing my three Dell Ultrasharp monitors anytime soon, and even if I did I have no intention of dealing with buggy displayport as my only option to hook up a synced monitor.
  • Mr Majestyk - Thursday, November 7, 2013 - link

    +1

    I've got two high end Dell 27" monitors and it's a joke to think I'd swap them out for garbage TN monitors just to get G Sync.

    I don't see the 780 Ti as being any skin off AMD's nose. It's much dearer for very small gains and we haven't seen the custom AMD boards yet. For now I'd probably get the R9 290, assuming custom boards can greatly improve on cooling and heat.
  • fewafwwaefwa - Thursday, November 7, 2013 - link

    die..
  • althaz - Thursday, November 7, 2013 - link

    G-Sync a game-changer, seriously? I admit to not having seen it in action, but it seems like a small advantage at best, and at worst something that nobody in the whole world has a monitor to support.
  • MonkeyM - Sunday, November 10, 2013 - link

    The 780 isn't nearly as overpriced as the Ti. It's $500 now, not $650. Which, in all honesty, is a pretty fair price for a card that draws almost 70 watts less than the 290 or 290X. Badly overpriced? False. Overpriced? That's more than fair for the Ti, but a bit of a stretch for the 780. Meagre gain is also bullshit. You get the last 3 missing SMXes, an extra 1,000MHz on the GDDR5's clock, and a sizable 576 more stream processors. Other than those, it's a fair comment. I do wish they would feel the need to drop prices more, but you certainly get consistency when you buy from big green...
  • Da W - Thursday, November 7, 2013 - link

    Hey look, an Nvidia fanboy! So happy to get a few frames' advantage, as if he owned the company or worked for it.
    WHO GIVES A DAMN?
    At the end of the day I'm looking at performance/price/temperature/noise. That being said, living in Canada, every degree of heat my video card produces, I save on my heating bill.
  • euskalzabe - Thursday, November 7, 2013 - link

    hahahaha... I totally understand, that is one of the reasons I still keep my GTX 470: the heat it provides during cold Chicago winters is a plus until I move elsewhere next year and buy an 8xx Maxwell :)
  • EzioAs - Thursday, November 7, 2013 - link

    The GTX 780 Ti is also quite power hungry and loud, and you would know that if you had read the review.
  • Wreckage - Thursday, November 7, 2013 - link

    I'm guessing you ignored the "uber mode" setting for the 290x, it is off the charts compared to the 780ti.

    Nothing I said in my above post is wrong. I think it's the truth that is upsetting people.
  • EzioAs - Thursday, November 7, 2013 - link

    You also didn't clarify that it was the Uber mode... and it is still on the charts.

    Without the "uber mode", the 290X is still quite close to the GTX 780ti in terms of gaming performance, power consumption and noise.
  • TheJian - Thursday, November 7, 2013 - link

    You must not be reading anywhere but here, and even then, 290x isn't close:
    Oddly Anandtech doesn’t seem to know it has special tech in it that allows better OCing – power balancing (unbalancing?). You guys not using it or something? :)

    http://www.bit-tech.net/hardware/graphics/2013/11/...
    “A new power management feature for the GTX 780 Ti related to clock speeds and overclocking in particular is called Power Balancing. A card like the GTX 780 Ti draws power across three rails: the PCI-Express lane and the two additional PCI-E power connections. Power is balanced between the three but can become unbalanced when overclocking and possibly limit your overclocks if you max out one rail while having headroom elsewhere. Power Balancing simply allows the balance to be maintained when overclocking, potentially allowing for higher overclocks than previous GK110 cards, on top of the already higher clock speeds.”
    They only hit 1152, but in practice saw it hitting 1230. Mem hit 1950!
    http://www.guru3d.com/articles_pages/geforce_gtx_7...
    More on power balancing. They hit 1276 boost 7948mem.
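
    To picture what "balancing across three rails" buys you, here's a back-of-the-envelope C sketch using the standard limits for this connector layout (75W slot + 75W 6-pin + 150W 8-pin = 300W); the per-rail loads are made-up numbers:

        #include <stdio.h>

        int main(void) {
            const double limit[3] = {75.0, 75.0, 150.0};  /* W: slot, 6-pin, 8-pin */
            const double draw[3]  = {70.0, 74.0, 106.0};  /* hypothetical OC load */

            double total = 0.0, headroom = 0.0;
            for (int i = 0; i < 3; ++i) {
                total    += draw[i];
                headroom += limit[i] - draw[i];
            }
            /* without balancing, the nearly maxed 6-pin rail caps the overclock
               even though ~50W of headroom sits on the other two rails */
            printf("total %.0f W, unused headroom %.0f W\n", total, headroom);
            return 0;
        }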

    http://www.legitreviews.com/nvidia-geforce-gtx-780...
    1289 OC/1900 mem

    https://www.youtube.com/watch?v=m1JOhT015ww
    Linustechtips, as always both cards OC'ed to the wall. He mentions over 1200 on the core (not sure if that's base or boost). But as you can see, when both the 780 Ti and 290X are clocked to the max, the 780 Ti dominates everything. Benchmarks at 8:35 or so. Also note Luke says 1080p will still be tough in upcoming games like Star Citizen, as he shows. Pretty much a landslide by 15-25%, "crushing everything" as Luke says.

    He actually discusses 1080p and shows Far Cry 3 (55 avg, the 290X hits 47 avg) and Crysis 3 (50fps vs. 40fps for the 290X) maxed out not hitting above 55fps, and at 2560 shows they don't even hit 30fps avg - and this is OC'ed to the max and already kicking the crap out of AMD here (24fps in Crysis 3 for the max-OC'ed 290X). So if you like to MAX everything in your games, neither card is even playable in Crysis 3 or Far Cry 3 at 2560, or in many other games. You will constantly be turning stuff down at 1600p, so I'm not quite sure how anyone can say these cards are overkill for 1080p when, as he notes, games like Star Citizen will no doubt slow you down even more than Crysis 3 (same engine, later game, well duh). You'll need 20nm to max out 2560, or always run things on low/medium like Anandtech does. You can play there, but with how many sacrifices?

    http://www.overclockersclub.com/reviews/nvidia_gtx...
    1304 OC/1940 mem
    Note also these guys show the quiet mode dropped 290x to 669mhz!

    While Anandtech still uses very few games, including the useless Warhead:
    Games the 780 Ti wins or dominates at 2560, ALL vs. the UBER 290X (of course all worse for quiet mode; note bit-tech only does 1080p and 5760):
    Skyrim (bit-tech w/hi-res texture packs, techpowerup without)
    Assassin's Creed 3 (techpowerup, 5.3%)
    Splinter Cell Blacklist (techpowerup, blows away the 690, crushes UBER by 36%, also same shown at overclockersclub even at 5760)
    Battlefield 3 (techpowerup, legitreviews, overclockersclub 1080/5760)
    Battlefield 4 (bit-tech, but barely, same at 1080p, tweaktown shows big loss? But guru3d shows big win @2xMSAA...LOL - guru3d shows losses below)
    Batman Arkham City (overclockersclub at both 1080/5760)
    Tomb Raider (legitreviews, techpowerup, tweaktown etc)
    WoW Mists of Pandaria (techpowerup, over 25% faster, over 20% at 5760)
    StarCraft 2 HotS (techpowerup, over 15%, beat the 690 too)
    Diablo 3 (techpowerup, over 15%, 20% at 1080p also)
    COD Black Ops 2 (techpowerup 17%, again over 22% at 1080p also)
    Sleeping Dogs (techpowerup)
    Crysis 3 (techpowerup, bit-tech)
    Bioshock Infinite (techpowerup, bit-tech etc - everyone I guess)
    Phantasy Star Online 2 (tweaktown, 17%+, even beats a 1065MHz OC'ed 290X)
    Lost Planet 2 (tweaktown, over 34%! Same vs. the 1065MHz 290X, same at 1080p)
    F1 2012 (tweaktown, beats the 1065MHz 290X also, at all resolutions)
    Dirt Showdown (tweaktown tie at 2560, but wins 1200p/1680x1050)
    Far Cry 2 (tweaktown, anyone play this? Still they show it over 10% NV)
    Guild Wars 2 (techreport, dominated by the old 780, so the 780 Ti will be better)
    Medal of Honor Warfighter (guru3d, 17%).

    Maybe there's a reason anandtech has chosen their games? Still waiting for the NVIDIA PORTAL.

    The point here? Gsync, GeforceExp, Physx, Cuda, streaming, shadowplay, lower noise, power, heat, 3 AAA games, massive OCing, and all the games above with some major victories (BEFORE an overclock). This is without mentioning all the driver issues, including AMD admitting they have a current problem with "VARIANCE" on the 290/290X and will supposedly fix that with a driver, in response to the Tomshardware and Techreport articles about retail perf being lower than press cards. For anyone thinking $700 is a rip-off, I suggest you look at the numbers/features above. On top, you need a new fan, or to wait for better models, before I'd even touch the 290X/290 due to noise.

    The only disappointment I can see as a buyer is no full DP. Titan still has that and 6GB, though nobody can show a game using more than 3GB and running into the problem while staying OVER 30fps. To force this into a problem (not sure you can; Skyrim modded out?), you will be CRAWLING in fps.
  • Galidou - Thursday, November 7, 2013 - link

    Wow dude, hardcore fan or working for Nvidia or I don't know; you took the time to find every link and type all that to make us realise this: Nvidia's reference cooler is amazing like before, we know how the GTX 780 Ti performs pushed to the max (I don't think custom coolers will go much past 1300MHz on the core), the 290X reference cooler is crap (like we didn't already know), and because of that we still don't know how it performs pushed to the max.

    Oh, and maybe YOU didn't choose your games for comparison... And yes, it is close: from your carefully handpicked games it's averaging 15-20% faster while costing 28% more.

    $700 is not a rip-off for top-of-the-line performance, but it's still too much for 99.8% of us (PC gamers that still use 1080p monitors).

    For the 3GB argument: did you travel to the future to know that games won't use more two years from now? Skyrim with a couple of mods gets close to 2GB at 1080p!! Heavily modded, over 2GB easily. I'm right now at the limit of mods with my GTX 660 Ti 2GB; sometimes it suffers from a slight lack of memory...
  • 1Angelreloaded - Saturday, November 16, 2013 - link

    False. I hit the 3.5GB limit quite a few times due to it being a 32-bit game. Now, if they are 64-bit games, then yes, they will use more than 3GB for textures and draw distance. But meh, you know what you're talking about....... right.
  • ahlan - Friday, November 8, 2013 - link

    Damage control, Nvidia fanboy! Nvidia fanboys are as delusional as MS and Apple fanboys...

    Keep paying more for the same performance...
  • dylan522p - Thursday, November 7, 2013 - link

    Not at all. In quiet mode it runs hotter, is louder 95% of the time and is using more power.
  • dylan522p - Thursday, November 7, 2013 - link

    And performs significantly worse.
  • DMCalloway - Thursday, November 7, 2013 - link

    Definition of upsetting: early GTX 780 adopters now able to purchase a 'true' GTX 780 at the same price point previous GTX 780s were at launch. Nvidia sat back, took everyone's cash, and now, to remain competitive, finally releases a fully enabled chip..... wow
  • Spunjji - Thursday, November 7, 2013 - link

    I think early adopters on both sides got dicked here. The R9 290 makes everything else look like a joke in terms of pricing, for all its manifest flaws.
  • dylan522p - Thursday, November 7, 2013 - link

    I would rather not have the 480 v2 in my machine.
  • Yojimbo - Thursday, November 7, 2013 - link

    And next year they'll release something even faster at the same price point. You can't have both increasing performance/price over time and also not have your new hardware become a comparatively bad deal in the future. People who bought the GTX 780 when it came out got 5 to 6 months of use of the card in exchange for a card which is now ~15% slower than what's available at the same price point.
  • ShieTar - Friday, November 8, 2013 - link

    In other words: Nvidia did what absolutely every other CPU & GPU provider has also done over the last 30 years? Wow indeed.

    Everybody wants to bring the most profitable product possible to the market. That means, you need to be good enough to interest customers and cheap enough to be affordable. And you don't get better or cheaper, unless something changes the market, e.g. competition.
  • extide - Thursday, November 7, 2013 - link

    You stated the 290x is "unable to compete with an older architecture." That is false. LOL
  • TheJian - Thursday, November 7, 2013 - link

    Not sure you're correct. If NV set the 780 Ti to a 95-degree default, how fast would it be going out of the box? 1200-1300MHz (that's 30% free!), judging by OCing with stock fans as I noted in the previous post with all the site links, and it never goes above 83 doing it. They overclock them and don't hit Uber noise. So you can get all the perf from overclocking and SMASH the 290/290X but still be quieter. I don't call that keeping up. AMD put out a good card, but it has lots of issues (heat/noise, and being blown away by stock overclocks from NV that won't drive you crazy with noise).

    From the highest clock I saw so far (1304 at overclockers):
    "On the GTX 780 Ti with the fan spinning at 100% locked in a chassis its not bad and will not wake your "neighbors" compared too the R9 290X."

    So even at 100%, nothing like the 290X. :) I call that not competing too ;) How crappy is your fan/heatsink combo if you can't compare to a guy running at 100%? Out of the box, reference buyers will be much happier with NV, not to mention all the features they have over AMD, 3 AAA games, etc. You release a new card, while your competition just turns on some stuff they've disabled for a year waiting for you to catch them...LOL. On top, your card has "variance" issues you admit you need to fix. You're running so close to crapout that cards have been clocked at 669MHz in QUIET mode. That's UGLY, right? Overclockers got 669 dips on quiet. How usable is that?
  • Da W - Thursday, November 7, 2013 - link

    You done masturbating yet?
  • DMCalloway - Thursday, November 7, 2013 - link

    I think you underestimate just how 'happy' early GTX 780 adopters are with current pricing. For what they paid at launch they should've received a fully enabled chip. The 7970 GHz to R290X is a larger jump forward than the GTX 780 to GTX 780 Ti. We all remember what happened when AMD pushed their 7970 to the GHz version in relation to the GTX 680. It's all relative, except IMO Nvidia profits more from brand loyalty.
  • Galidou - Thursday, November 7, 2013 - link

    If Nvidia had enabled a full chip at the 780 launch, imagine the Titan early adopters... We would have heard their anger from far away in space...
  • Margalus - Friday, November 8, 2013 - link

    People got exactly what they paid for when they bought it. There is no reason to be upset because a better card is now available.
  • Nevk - Thursday, November 7, 2013 - link

    Nvidia fanboyz erererer
  • grayson360 - Thursday, November 7, 2013 - link

    It's so sad. I have a 780 and I basically only buy Nvidia, but that doesn't mean I hate the competition. Any competition is good competition :D
  • OverclockedCeleron - Thursday, November 7, 2013 - link

    Trolls be damned. 290X = Bulldozer? Really? I am growing sick of these PR reps who troll tech sites. And for the record, a properly-cooled "GHz" edition of the R290X will probably beat GTX 780 Ti in many scenarios and still be over $100 cheaper. Enjoy paying more for less :-).
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    Probably not in this case; unless the 290X is fitted with a waterblock I don't see it happening. The problem is we will probably see triple-slot vapor coolers like the Galaxy HOF, which basically means multi-GPU is a non-possibility depending on your motherboard. Another thing is I would not put a stock 290X in a 350D or any other ITX/mATX case, given the thermals and the noise of the stock unit. I'm kinda excited about what Hawaii is, but also disappointed that this should have come out during the release of Nvidia's 600 series, not at the end of the 700 series cycle.
  • PsiAmp - Thursday, November 7, 2013 - link

    Tom's Hardware tested the Accelero Xtreme III with the R9 290 and it made it nearly silent and very cool. It also had better performance.
  • Kamus - Thursday, November 7, 2013 - link

    I wouldn't buy that aftermarket cooler just yet.... I was about to buy one when I read that it doesn't cool the VRM properly. Some guy bought 2 of them for his crossfire setup and started hearing whine from both cards. He concluded that the crappy heatsinks they provide for the VRM aren't up to the task.
    The VRM was exceeding 100 degrees. This resulted in games crashing and the loud whine I mentioned earlier. In the end, he put the regular AMD heatsinks back on, and that fixed the problem. Then he sold both cards and now he says he might just get regular 290's when they ship with custom coolers. Which isn't a bad idea at all, since those will probably outperform the reference 290x easily, for a lot less cash.
  • Margalus - Friday, November 8, 2013 - link

    so in other words, you have to spend another $100 for an aftermarket cooler to make the 290 a reasonable card?
  • milli - Thursday, November 7, 2013 - link

    All you need is an Accelero Xtreme III. It proves that as soon as companies like HIS release better versions, the R9 cards will be silent, fast and cheap. I have a HIS HD 7950, a supposedly loud card, that is completely silent.
    http://www.computerbase.de/artikel/grafikkarten/20...
  • Sivar - Thursday, November 7, 2013 - link

    I gather from the link that you natively speak German. Please note that, in English, "silent" means "absolutely no sound". I think the word you are looking for is "quiet". :)
  • HisDivineOrder - Thursday, November 7, 2013 - link

    True, when custom cooled cards come out, that'll be great. None are announced and as far as I know no one's seen any. So you're waving your hands right now and saying, "When better cards come out, they'll be better!"

    You're talking about unannounced things you expect will come and save the day. I could easily say that overclocked variants of the 780 Ti are going to come and make everything better. Or that the GHZ editions of the 780 (non-Ti) are going to show up and magically make the price point logical again.

    But you know what? I'm waiting until what I think MIGHT happen actually DOES happen. Rather than daydreaming about HIS coolers or MSI coolers or Gigabyte Windforce coolers on the R9 series when they haven't shown anyone a sign of such a card yet. Are they coming? Sure, yeah, someday. Are they coming in 2013?

    ...No one's seen one yet. Hell, there's more evidence of GHZ versions of the standard 780 than there are of custom-cooled R9 290/290x cards...
  • OverclockedCeleron - Thursday, November 7, 2013 - link

    ... and your point is? If you are happy with a 700-dollar card, then good for you. I was only speaking up because I have noticed a significant rise in anti-AMD trolls (i.e. Nvidia PR reps). Just read the first comment on this article and tell me you don't see anything wrong with it.

    So you like logic and you think you've got it all sorted? Well, how about this: try to judge the silicon and not the fan, i.e., install equally superior air-based GPU coolers, then overclock each card to its maximum stable overclock, and re-run the benchmarks. I know who will win, but I am not here to preach or daydream, like you said.

    The point is, just because I realize the *actual* and *real* potential of something doesn't mean I am a daydreamer; it just means I appreciate that piece of silicon (i.e. the R290X) and I realize that it would be superior to the 780 Ti if it were not for that cheap heatsink.
  • TheJian - Thursday, November 7, 2013 - link

    You're wrong, unless you're saying NV will win after both are equipped with equal cooling. Also, a Superclocked 780 Ti has already been announced :)
    http://videocardz.com/47777/evga-launches-geforce-...
    https://www.evga.com/articles/00795/#2884
    1006/1072 out of the box. The Dual Classified has yet to be clocked, but it would have to be faster than the ACX version, and they have already announced the water block model :)
    03G-P4-2889-KR GTX 780 Ti Dual Classified w/ EVGA Hydro Copper
    Currently you void your warranty pulling this on 290x or 290 right? Sure seems like OEM's have no trouble modding NV. I'm guessing AMD has to sort out this variance stuff before the OEM's can jump on board.

    Realizing AMD is getting beaten doesn't make you an NV PR troll...LOL. This doesn't mean AMD's silicon is junk, just that it's beaten. I for one am glad they released it, forcing NV's full-SMX hand. That means we can all look forward to even better 20nm chips, as they now have a much higher bar to beat on both sides. I own a Radeon 5850 BTW, because I realized it was the best silicon when I bought it (8800GT before that). I go wherever the SILICON/features tell me. The 8800GT is still serving my dad's PC from 2007 (over 6yrs old!). I'm wondering if these 95C chips can make it that long ;) My 5850 will be serving him the second I get my hands on 20nm Maxwell (if only for G-Sync, unless NV's chip totally sucks).
  • Galidou - Thursday, November 7, 2013 - link

    No need to be a genius to KNOW that the custom-cooled cards will be WORLDS above this, even if they are not out yet. If you did not realise how crappy that cooler is: it was crappy on the 7970 cards and it's the same one... Come on, a little more than 90 IQ is all it takes to figure that out...
  • Toxicsix - Thursday, November 7, 2013 - link

    More for less, you say... Let's look at that. The 290X excels at 4K gaming, and that's what AMD keeps cramming down our throats: how superior their product is at the highest resolutions. So if you're looking at price/performance you need to pair that 290X with a 4K display. Now let's look at some simple numbers, shall we? 290X $5-600 + 4K display $3K+, so you're sitting at about $3500+ to get that top-end performance. On the other hand, you can spend $700 on a 780 Ti and pair it with an amazing 1080p monitor for well under $1K, and the 780 Ti will stomp all over your 290X. Now you're probably thinking that this isn't a "fair" comparison, but you have to consider that the resolution is where these companies are going to try to convince you that their card is superior. As it stands, the 780 Ti is the king of 1080p gaming, and it pisses me off big time that AnandTech didn't even include 1080p benchmarks, as this is where the majority of gamers are. If you want the best gameplay possible, I would recommend getting the sweetest 1080p monitor you can find, one that will give you great refresh rates, low input lag and 3D capabilities if that's your thing, and pairing it with the fastest 1080p gaming card you can find, which is now Nvidia's 780 Ti. AMD has a great card in the 290X, but we can't just look at price/performance based on the card alone; we have to consider what we're playing it on and look at the complete package as one.
  • Fan_Atic - Thursday, November 7, 2013 - link

    I hate to say it, but most of your points are completely moot. The reason no one runs 1080p benchmarks in these reviews is that it doesn't stress a modern card. Anything above the $200 price point can pull 60 FPS at max settings these days. No one in their right mind would spend $700 on a graphics card to play at 1080p when a $250 card can provide a 60fps experience at the same settings. 1440p, 1600p and 4K are the only things that remotely stress modern mid- to high-end graphics cards.
  • bronopoly - Thursday, November 7, 2013 - link

    Who plays at 60 fps? Surely not people playing games competitively.
  • Tetracycloide - Thursday, November 7, 2013 - link

    They don't include 1080p because they assumed no one would be stupid enough to buy a $700 card for 1080p gaming. I guess you showed them...
  • Ryan Smith - Thursday, November 7, 2013 - link

    We actually have the 1080p data, however we didn't publish it since it's not a resolution we're really expecting this card to be used with. When GPU Bench 14 goes live, that data will be available alongside everything else.
  • bronopoly - Thursday, November 7, 2013 - link

    I need 120 fps for my games when I'm using lightboost or 144 fps when I'm not using it. A lot of people aren't satisfied with 60 fps on good settings. When I go from a high framerate to something as awful as 60 fps, I usually restart my computer because I think something is wrong.
  • Mondozai - Thursday, November 7, 2013 - link

    The vast majority of people play at 60 fps. Including most people buying a high-end card. And the move is towards 4K over the next few years. You'll probably be stuck at 1080p for a long time if you insist on 120-144 fps while we will get 4K monitors next year for relatively affordable prices and even more so in 2015.
    That's your right. But cards like these are not going to be tailored towards people who want as high a fps as possible as opposed to as high a graphics fidelity as possible where 60 fps is the acceptable threshold. You can whine and bitch about it as much as you want. Won't make a difference.
  • HisDivineOrder - Thursday, November 7, 2013 - link

    Where are these "properly-cooled 'GHZ' Editions of the R290X?" If they existed, you might have an argument, but they don't. So talk about what exists.

    Even HardOCP, in between ignoring the ear-piercing sound of the 290X at 55% fan speed, admitted they had seen and heard nothing of custom coolers for the R9 290/290X yet, and that doesn't bode well.

    So if you're waiting for a custom cooler to show up and save the R9 series, it looks like they're far, far away...
  • Tetracycloide - Thursday, November 7, 2013 - link

    "Let's all pretend we suffer from a crippling lack of foresight!"
  • OverclockedCeleron - Thursday, November 7, 2013 - link

    1) Everyone knows what comes after a GPU announcement [hint: an avalanche of custom GPUs based on that GPU].

    2) Many review sites have said that AMD is preventing their partners from producing custom R290X GPUs until the 780 Ti is released. Can you guess why ;-)?
  • bronopoly - Thursday, November 7, 2013 - link

    Can I just make stuff up too?

    3) Insert wild speculation and conspiracy
  • Mondozai - Thursday, November 7, 2013 - link

    Someone's got an NV card and wants to defend it to death :D
  • bigboxes - Thursday, November 7, 2013 - link

    "Doesn't bode well." Just zip it.
  • TheJian - Thursday, November 7, 2013 - link

    https://www.youtube.com/watch?v=m1JOhT015ww
    Fast forward to 8:35 or so... It got smashed, both OC'ed to the max. In Uber, as Anandtech even showed, it hits its max all day. So 1GHz won't get more, and they run higher than that at Linustechtips (both cards OC'ed as far as they could go). Nowhere close in any benchmark they run. Don't forget you can clock the crap out of the 780 Ti, as already shown everywhere. If you're assuming a "properly cooled" GHz edition beats the 780 Ti, well, we've only seen REFERENCE 780 Tis so far too, right? So what does a "properly cooled" 780 Ti do? They hit 1200-1300 on the stock fan. Will they hit 1300-1400 with an aftermarket fan? You won't catch it as it is without water, and you'll need to be way over 1GHz to do it, pulling more watts no doubt.

    What record are you giving? Assumptions and guesswork that goes against all current info from MANY sites. Check out all the sites I listed. That is what we call "for the record".
    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...
    Only 1120/1175 (base/boost) and it's 30.8% faster than the UBER 290X. Not even sure water would catch this, and this is a WEAK OC compared to other sites like Overclockers hitting 1291/1304. They can do this out of the box.
  • whiteswolf - Sunday, June 8, 2014 - link

    Just to let you know, the Nvidia GTX 780 Ti is walking and not breaking a sweat while the AMDs are running. Think of it this way: if you overclock the GTX 780 Ti and make it sprint, it's said it can get a 19% to 20% boost in performance, and it only then reaches the noise levels of AMD's card.
  • Da W - Thursday, November 7, 2013 - link

    The only thing louder than a 290X is a Nvidia Fanboy.
  • HisDivineOrder - Thursday, November 7, 2013 - link

    Actually, I think the louder thing still is the sound of AMD fanboys coming into a review of a new nVidia halo product and crying about "nvidia fanboys." ;)
  • OverclockedCeleron - Thursday, November 7, 2013 - link

    Well, ask your colleague troll not to bring AMD trash talk into an Nvidia article. (See first comment to this article).
  • bigboxes - Thursday, November 7, 2013 - link

    Haha!
  • Flunk - Thursday, November 7, 2013 - link

    Sure, if you like to throw money away. Nvidia could seriously destroy AMD on this generation, but they're choosing not to compete by not pricing in line with performance.

    I personally think AMD has the edge when it comes to cost/unit because their chip is smaller, which is how they can price so much lower.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    See, it's about the feature sets as well that determine price, and of course support. The 289x should have been released with the 600 series, not at the end of the 700 series cycle, on the edge of starting the 800 series in a few months with Maxwell.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    edit "290x"
  • HisDivineOrder - Thursday, November 7, 2013 - link

    Sure, you can save money by buying into the R9 290X, but save that money because you're going to need it in a few years for a hearing aid.
  • OverclockedCeleron - Thursday, November 7, 2013 - link

    As if there won't be any custom-cooled R290X-based GPUs. You make it sound like all GPU vendors and partners have abandoned AMD, and that AMD is going to be stuck with that fan forever. Well done for being short-sighted.
  • HalloweenJack - Thursday, November 7, 2013 - link

    [img]http://assets.diylol.com/hfs/8fa/d87/4bb/resized/y...[/img]
  • halo37253 - Thursday, November 7, 2013 - link

    Personally I find it kinda sad that, given the fact that GK110 is a much bigger chip, it doesn't have a bigger lead at stock. Plus power usage while gaming goes back and forth with the Titan it is competing with. Nvidia just has more aggressive TDP throttling, while AMD's is mainly temp-based.

    The 290X is only hot under the stock cooler; it actually runs pretty cool under water. Also, the 290X is doing more with fewer transistors compared to Nvidia. It sucks that Nvidia needs to scale Kepler to such a higher level just to compete with AMD's lower offerings. AMD would slaughter Nvidia with a die of equal size.

    Also, with the 290 being $399, Nvidia is boned unless they drop the GTX 780's price again. A 290 @ 1300MHz is about the same as a 780 @ 1400MHz.

    G-Sync only works with one monitor so far, and considering I already have a 120Hz monitor and already get a taste of what's to come, I couldn't care less. We won't see it in anything not overpriced for years to come; by that time we will probably have an open standard that both Intel and AMD can use. Plus I want a 1440p IPS with G-Sync, and I doubt that will happen any time soon. Mantle is by far the more interesting option if you ask me. I already play with vsync off and get no tearing, and games are as smooth as ever with AMD's current drivers. Smoother than my old Nvidia card, though I just made the switch to AMD and never really had the chance to use the old "crap" drivers.
  • fewafwwaefwa - Thursday, November 7, 2013 - link

    foad cretin.
  • fewafwwaefwa - Thursday, November 7, 2013 - link

    hope you get your stomach disemboweled.
  • fewafwwaefwa - Thursday, November 7, 2013 - link

    I'd gladly saw your head open.
  • Samus - Friday, November 8, 2013 - link

    The problem with this card is the 25% price premium over AMD's 290X for 11% more performance.

    The only real advantage it has over the 290X is lower noise. Other than that, it lacks next-gen optimizations (Mantle, the EA partnership, console ports, etc.).
  • ahlan - Friday, November 8, 2013 - link

    http://s11.postimg.org/odh7byx3n/amd_N.png

    Lol, most review sites are Nvidia's bitch....
  • tcube - Friday, November 8, 2013 - link

    Erm... the 290X in Uber mode is edged by what, 1-5%? I can't call this a win, plus GK110 has reached its full potential. And the 780 Ti is basically an excellent GK110 chip castrated and sold for $700 instead of the regular $4K (K6000), plus they pushed GDDR5 to its maximum just to edge out the 290X... I don't know... This thing looks like vaporware; let's see some availability, but I doubt it's OK for Nvidia to sell a perfectly good chip for what, 6 times less? Plus the latest tests of the 290 (without the X) show that the 290X has lots of potential left.

    How I see it, the 290X was rushed to market and suffers because of it: a bad cooler, high temperatures and slow memory... The 780 Ti is the best of the Kepler architecture, OC'ed in both memory and GPU, with basically a pro-grade chip and 30% more die space, and it just barely edges out the 290X in Uber mode.

    What AMD managed is to make Nvidia divert perfect GK110s from the pro line to mainstream and shift their focus, which is a bad thing for Nvidia to do at the moment. And Nvidia reacted like a... fanboy, really, by scrambling to bring a lab rat to market... just to barely claim the crown back... instead of focusing on Maxwell and pro-line improvements... They really behaved like little kids with ADHD on this one... but... oh well...
  • will1956 - Monday, December 2, 2013 - link

    troll but true
  • Rajiv Kishore - Thursday, November 7, 2013 - link

    Waiting on a 290 with better cooling; will ask my sis to pick one up when she's coming back to India. The Nvidia 780 Ti is overkill for my 1080p screen. GJ Nvidia, can't wait for Maxwell! Still the single-GPU king.
  • HalloweenJack - Thursday, November 7, 2013 - link

    So it uses as much power as the R290X, its thermal limit is set lower, and it's nearly as noisy under load. But it's faster and costs a lot more.

    The R290 is the winner here - $200 cheaper. Add an aftermarket cooler, or wait for the AIB cards to be unleashed, and for less money you'll have a faster (factory overclocked, or OC it yourself) R290 which beats the 780 Ti.
  • A5 - Thursday, November 7, 2013 - link

    1) A 3dB difference means that the R290X in "standard" mode is twice as loud as the 780 Ti.

    2) In "Uber" mode, which is what the 290X has to use to match performance, it is 8dB louder. That is a huge difference.
  • Traciatim - Thursday, November 7, 2013 - link

    A5, that point 1 is incorrect. 3dB is twice the power but not twice as loud. 3dB is about where anyone can perceive a loudness difference, 10dB is generally what is perceived as twice as loud.
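
    Rough numbers, using the usual rules of thumb (power ratio = 10^(dB/10), perceived loudness roughly doubling every 10dB, i.e. 2^(dB/10)) - a quick C sketch for the 3dB and 8dB gaps mentioned above:

        #include <math.h>
        #include <stdio.h>

        /* power ratio is physics: 10^(dB/10); "twice as loud per 10 dB"
           is only a psychoacoustic rule of thumb: 2^(dB/10) */
        int main(void) {
            const double gaps[] = {3.0, 8.0};   /* the two dB gaps cited above */
            for (int i = 0; i < 2; ++i) {
                printf("+%.0f dB: %.2fx the power, ~%.2fx as loud\n",
                       gaps[i],
                       pow(10.0, gaps[i] / 10.0),
                       pow(2.0,  gaps[i] / 10.0));
            }
            return 0;
        }

    So the 8dB Uber-mode gap is very audible, but still short of "twice as loud".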
  • tedders - Thursday, November 7, 2013 - link

    While the results speak for themselves, I cannot wait to see what a properly air-cooled 290X will be able to do against a 780 Ti. It has pretty much been shown that the stock reference cooler on the 290X is its bottleneck. Will you be revisiting the 290X review once the other manufacturers come out with their properly cooled cards?
  • pyroHusk - Thursday, November 7, 2013 - link

    Completely agree with you; the R9 290X's clock rate is throttled so much by the crap reference cooler. A reviewer from TechSpot replaced the R9 290's reference cooler with the HIS IceQ X2 from an R9 280X and the temperature dropped to 60-70C easily.
  • nathanddrews - Thursday, November 7, 2013 - link

    Tom's also replaced the 290 cooler with a tri-fan setup and saw a 20% fps gain without all the noise. Seems to me that a $400 290 + a high quality cooler ($50?) will get you close to the 780ti for $250 less.
  • hoboville - Thursday, November 7, 2013 - link

    Once reviews from both sides come in (when both sides come out with third-party coolers), then we can see how these cards stack up. Right now, the AMD cards can't be realistically overclocked (or in the case of the 290X realistically reach its innate performance). The numbers from Tom's hardware are promising, but who knows what ASUS and the rest will pull out of their hats.

    Hopefully by mid December we can make good informed decisions, but for now, it's just too early to buy. There's still rumors that Never Settle will come into the picture, so waiting is good.
  • IanCutress - Thursday, November 7, 2013 - link

    A high quality VGA air cooler is more like $100-$120. Or strap on a closed loop liquid cooler
  • Tetracycloide - Thursday, November 7, 2013 - link

    Hardware vendors get much better prices than that which is why you so often find third party coolers on custom cards for a fairly modest markup ($10-20).
  • nathanddrews - Friday, November 8, 2013 - link

    The Arctic Accelero Xtreme III that Tom's used was only $70, but even if it was $100 extra, that's still a $150 gap. For vendors, subtract the cost of the bad cooler from the good cooler and I'll bet we see dual/tri-fan 290s for under $450.

    Also, this is interesting:
    http://www.tomshardware.com/reviews/radeon-r9-290-...
  • Mithan - Thursday, November 7, 2013 - link

    Great card, but about $150 overpriced. I would purchase this for $550 right now, but $700? No.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    Maxwell is due out next year, so TBH this would be a bad bandwagon to jump on; an architecture change and possible die shrink will come next year, and depending on yields I would anticipate a 10-15% jump in the next series with lower TDP.
  • kwrzesien - Thursday, November 7, 2013 - link

    Maybe even at $600. $700? No.
  • Nirvanaosc - Thursday, November 7, 2013 - link

    Great review, but the Overclocking section still has the same text as the R9 290 review.
  • piroroadkill - Thursday, November 7, 2013 - link

    290 and 290X look even better in this when used in CF. They scale better than 780Ti in SLI.

    You can save even more with 290X CF than 780Ti, AND get better performance in almost every test listed.

    With that setup you'd be wise in either case to get a nice custom cooling loop anyway.
  • Gast - Thursday, November 7, 2013 - link

    1st paragraph of the conclusion. "NVIDIA’s high-end cards a bit faster and a big cheaper each time."

    Should be "a bit cheaper each time".
  • Pneumothorax - Thursday, November 7, 2013 - link

    Sad, this goes to show that Nvidia was selling us mid-range Keplers all last year at premium prices. This card is what the GTX 680 should've been all along. OTOH, if the 7970 was priced much better out of the gate, it might've forced the green team not to have ripped us off so much.
  • EJS1980 - Thursday, November 7, 2013 - link

    If Nvidia released this as their answer to the 7970, AMD would have simply gone out of business. Maybe AMD should thank NVidia for showing them mercy, and keeping them afloat...j/k!
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    They can't, because of antitrust/monopoly laws; the penalties for Nvidia from the Gov would be retarded. TBH, since ATI has been lowballing lately, this has caused Nvidia to cap yields for higher prices.
  • Mondozai - Friday, December 13, 2013 - link

    EJS the buttboy for Nvidia keeps entertaining us! Dance monkey, dance!
  • Kodongo - Thursday, November 7, 2013 - link

    Us? Speak for yourself. If you willingly allowed nVidia to rape your wallet, more fool you. Me, I will go for the best price-performance cards which puts me firmly in the Radeon camp at the moment.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    Depends on your perspective; SLI is just better overall, and better supported. I'll gladly pay for a better product versus one at a mainstream budget with fewer features.
  • anubis44 - Thursday, November 7, 2013 - link

    "SLI is just better overall".

    Not anymore. HardOCP said: "We've been telling our readers for years that CrossFire just didn't feel as good as SLI while gaming.

    Those times have changed, at least on the new Radeon R9 290/X series. The new CrossFire technology has improved upon the CrossFire experience in a vastly positive way. Playing games on the Radeon R9 290X CrossFire configuration was a smooth experience. In fact, it was smoother than SLI in some games. It was also smoother on the 4K display at 3840x2160 gaming, and it was noticeably smoother in Eyefinity at 5760x1200."

    Read the whole R9 290X crossfire article here:

    http://www.hardocp.com/article/2013/11/01/amd_rade...

    Finally, ignore the noise about noise on the reference R9 290(X) cards. The custom cooled versions are coming out by the end of November and they'll be as quiet and cool as the nVidia cards, but faster and cheaper.
  • TheJian - Thursday, November 7, 2013 - link

    You need to read balance sheets: paste from another post I made at tomshardware (pre 780ti)-
    Simple economics...NV doesn't make as much as they did in 2007. They are not gouging anyone and should be charging more (so should AMD) and neither side should be handing out free games. Do you want them to be able to afford engineers and good drivers or NOT? AMD currently can't afford them due to your price love, so you get crap drivers that still are not fixed. It's sad people don't understand the reason you have crap drivers is they have lost $6Billion in 10yrs! R&D isn't FREE and the king of the hill gets to charge more than the putz. Why do you think their current card is 10db’s higher in noise, 50-70 watts higher and far hotter? NO R&D money.

    NV made ~$550mil in the last 12 months (made $850mil in 2007). Intel made ~$10 billion (made under $7B in 2007, so profits WAY UP, NV way down). Also Intel had $54B in assets in 2007, now has $84 billion! Who's raping you? The Nvidia hate is hilarious. I like good drivers, always-improving products, and new perf/features. That means they need to PROFIT or we'll get crappy drivers from NV also.

    Microsoft 2007=14B, this year $21B (again UP HUGE!)
    Assets 2007=64B, 2013=146Billion HOLY SHITE.

    Who's raping you...IT isn't Nvidia...They are not doing nearly as well as 2007. So if they were even raping you then, now they're just asking you to show them your boobs...ROFL. MSFT/Intel on the other hand are asking you to bend over and take it like a man, oh and give me your wallet when I'm done, hey and that car too, heck sign over your house please...

    APPLE 2007=~3Bil profits 2013=41Billion (holy 13.5x the raping).
    Assets 2007=25B, wait for it...2013=176Billion!
    bend over and take it like a man, oh and give me your wallet when I'm done, hey and that car too, heck sign over your house please...Did you mention you're planning on having kids?...Name them Apple and I want them as slaves too...LOL

    Are we clear, people? NV makes less now than in 2007 and hasn't come near that $850mil since. Why? Because market forces are keeping them down, which is only hurting them and their R&D (that force is AMD, who by the way make ZERO). AMD is killing themselves, and fools posting crap like this are why (OK, it's management's fault for charging stupidly low prices and giving out free games). You can thank the price of your card for your crappy AMD drivers.

    Doesn't anyone want AMD to make money? Ask for HIGHER PRICES! Not lower. And quit demonizing NV, who doesn't make NEAR what they did in 2007! Intel killed their chipset business and cost them a few hundred million each year. See how that works. If profits for these two companies don't start going up, we're all going to get slower product releases (witness what just happened: no new cards for 2 years, if you can even call AMD's card new, as it just catches OLD NV cards and runs hot doing it), and we can all expect CRAP DRIVERS with those slower-released cards.
  • mohammadm5 - Monday, November 11, 2013 - link

    http://www.aliexpress.com/item/Wholesale-Price-GeF...

    That's the wholesale price. It's not Nvidia that charges so much; it's the resellers. The profit Nvidia makes per GPU is very low, but the resellers make a lot of money. Also, the new AMD R9 290 is going for $255 per unit at wholesale, and the R9 280X is going for $160 per unit. You also have to remember that's the distributor price, not the manufacturer price, which should be a lot lower. I know the GTX 780 at manufacturer price sells from $200 to $280 depending on brand.

    So remember, this is America, where they sell you something made in China for a dollar at ten dollars.
  • RussianSensation - Thursday, November 7, 2013 - link

    Looks overpriced to be honest.

    I'd rather get MSI Lightning 780 or better yet grab 2 after-market R9 290s once they are out for $100-150 more and likely get 50-60% more performance. High resolution gaming advantage over R9 290X melts away to less than 8%. It looks even worse against $399 R9 290 - only a 15% advantage for a 75% price increase. Terrible value proposition. NV should have priced this guy at $599.

    http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_780_T...
  • A5 - Thursday, November 7, 2013 - link

    The article repeatedly points out that it is overpriced. Like every other flagship card ever.

    Anyone looking for price/performance is getting a 280 or 770 (or lower).
  • Dantte - Thursday, November 7, 2013 - link

    Can we remove Battlefield 3 from the benchmarks and add Battlefield 4, please? BF3 is now 2 years old and is no longer current with the genre. When's the last time you heard someone say "hey, I wonder how well this card will perform in BF3"? I bet not for a while, but I have been hearing that exact statement about BF4 for the last year!
  • A5 - Thursday, November 7, 2013 - link

    BF4 has a built-in benchmark too, but I have no idea how good it is. I'd guess they're waiting on a patch?

    If nothing else, there will be BF4 results if/when that Mantle update comes out.
  • IanCutress - Thursday, November 7, 2013 - link

    BF4 has a built in benchmark tool? I can't find any reference to one.
  • Ryan Smith - Thursday, November 7, 2013 - link

    BF3 will ultimately get replaced with BF4 later this month. For the moment with all of the launches in the past few weeks, we haven't yet had the time to sit down and validate BF4, let alone collect all of the necessary data.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

Hell, man, people still run FEAR as a benchmark because of how brutal it is on GPU/CPU/HDD.
  • Bakes - Thursday, November 7, 2013 - link

I think it's better to wait until driver performance stabilizes for new applications before basing benchmarks on them. If you don't, then early benchmark numbers become useless for comparison's sake.
  • TheJian - Thursday, November 7, 2013 - link

I would argue Warhead needs to go. Servers for that game have been EMPTY for ages and ZERO people play it. You can ask to add BF4, but to remove BF3 while Warhead is still included (while claiming BF3 is old) is ridiculous. How old is Warhead? Five years? People still play BF3. A LOT of people. I would argue they need to start benchmarking based on game sales.
StarCraft 2, Diablo 3, World of Warcraft: Pandaria, COD Black Ops 2, Splinter Cell Blacklist, Assassin's Creed 3, etc. IE, Black Ops 2 has over 5x the sales of Hitman Absolution. Which one should you be benchmarking?
Warhead...OLD.
GRID 2: 0.03 mil total PC sales, says VGChartz.
StarCraft 2: 5.2 mil units (just PC).
Which do you think should be benchmarked?

Even Crysis 3 only has 0.27 mil units, says VGChartz.
Diablo 3? ROFL...3.18 mil for PC. So again, 11.5x Crysis 3.

Why are we not benchmarking games that are selling in the MILLIONS of units?
WOW still has 7 million people playing, and it can slow down a lot with tons of people doing raids, etc.
  • TheinsanegamerN - Friday, November 8, 2013 - link

Because any halfway decent machine can run WoW? They use the most demanding games to show how powerful the GPU really is. 5760x1080 with 4xMSAA gets 69 FPS with the 780 Ti.
Why benchmark Hitman over Black Ops? Simple: Black Ops is not what we call demanding.
They use demanding games, not the super popular games that'll run on hardware from 3 years ago.
  • powerarmour - Thursday, November 7, 2013 - link

    Well, that time on the throne for the 290X lasted about as long as Ned Stark...
  • Da W - Thursday, November 7, 2013 - link

I look at 4K gaming since I play in 3x1 Eyefinity (which is +/- 3.5K gaming).
At these resolutions I see an average 1 FPS lead for the 780 Ti over the 290X. For $200 more.
Power consumption is about the same.
And as far as temperatures go, it's the temperature AT THE CHIP level. Both cards will heat your room equally if they consume the same power.

The debate is really about the cooler, and Nvidia has an outright lead as far as cooling goes.
  • JDG1980 - Thursday, November 7, 2013 - link

    It seems to me that both Nvidia and AMD are charging too much of a price premium for their top-end cards. The GTX 780 Ti isn't worth $200 more than the standard GTX 780, and the R9 290X isn't worth $150 more than the standard R9 290.

    For gamers who want a high-end product but don't want to unnecessarily waste money, it seems like the real competition is between the R9 290 ($399) and the GTX 780 ($499). At the moment the R9 290 has noise issues, but once non-reference cards become available (supposedly by the end of this month), AMD should hold a comfortable lead. That said, the Titan Cooler is indeed a really nice piece of industrial design, and I can see someone willing to pay a bit extra for it.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

Physically the 780 and 780 Ti are literally the same unit, minus minor things. The difference is the neutered chip and OC'd VRAM, which means you're paying for the same unit at 2 completely different prices. In fact, how much does it cost to disable an SMX? Shouldn't the overhead on the original unit be higher, with more work having to be done? Or, like AMD's tri-cores, are we paying for defective chips again?
  • TheJian - Thursday, November 7, 2013 - link

Wrong. They have been saving DEFECT-FREE GK110 units for months just to be able to launch this with good quantity (probably only started having better results at B1, which all of these are). I doubt that there are many 780s that have fully working units that are disabled. They are failed Tesla chips (you can say DP is disabled on purpose, but not the SMXs). Do you really think 550mm² chips have a ZERO defect rate?...LOL. I would be surprised if the first runs of Titan had many more working SMXs either, as they were directly failed Teslas. Sure, there are probably a few cards with some working units that are disabled, but yields and history say that with chips this big there just has to be a pretty high defect rate vs. 100% working chips. It is pretty much the largest chip TSMC can make. That's not easy. Both AMD and NV do this to salvage failed chips (heck, everybody does). You come out with a flagship, then anything that fails you mark as a lower model (many models). It allows you to increase your yield and the number of chips that can be sold. You should be thankful they have the tech to do this or we'd all be paying FAR higher prices due to chucking chips by the millions in the trash.
  • TheJian - Thursday, November 7, 2013 - link

    http://www.tomshardware.com/reviews/geforce-gtx-78...
    "Not to be caught off-guard, Nvidia was already binning its GK110B GPUs, which have been shipping since this summer on GeForce GTX 780 and Titan cards. The company won’t get specific about what it was looking for, but we have to imagine it set aside flawless processors with the lowest power leakage to create a spiritual successor for GeForce GTX 580. Today, those fully-functional GPUs drop into Nvidia’s GeForce GTX 780 Ti."

    There, don't have to believe me...confirmed I guess ;)
  • beck2448 - Thursday, November 7, 2013 - link

    Great job Nvidia! I think the partners with custom cooling will get another 15 to 20 % performance out of it with lower temps and less noise, and that is insane for a single GPU. Can't wait to see the Lightning and Windforce editions.
  • aznjoka - Thursday, November 7, 2013 - link

    The crossfire scaling on the 290x is much better than the 780ti. If you are running a dual card set up, getting a 290x is pretty much a no brainer.
  • beck2448 - Thursday, November 7, 2013 - link

    from Benchmark reviews: In conclusion, GeForce GTX 780 Ti is the gamer’s version of GTX TITAN with a powerful lead ahead of Radeon R9 290X. Even if it were possible for the competition to overclock and reach similar frame rate performance, temperatures and noise would still heavily favor the GTX 780 Ti design. I was shocked at how loud AMD’s R9 290X would roar once it began to heat up midway through a benchmark test, creating a bit of sadness for gamers trying to play with open speakers instead of an insulated headset. There is a modest price difference between them, but quite frankly, the competition doesn’t belong in the same class.
    Read more at http://benchmarkreviews.com/8468/nvidia-geforce-gt...
  • 1Angelreloaded - Thursday, November 7, 2013 - link

TBH the stock Nvidia cooler isn't that much better either; they tend to run a lot hotter than ACX/Twin Frozr and other such solutions, so both have cooling headroom. Hawaii, though, is just plain ridiculous.
  • deedubs - Thursday, November 7, 2013 - link

Noticed the graph for ShadowPlay performance has its labels reversed. It makes it look like ShadowPlay increases performance instead of decreasing it.
  • Ryan Smith - Thursday, November 7, 2013 - link

    Whoops. Thanks for that. The multi-series graph tool is a bit picky...
  • Filiprino - Thursday, November 7, 2013 - link

I don't see NVIDIA as a real winner here, really. Its margin is very slim, AMD's drivers still have to mature, and when you talk about CrossFire, AMD is doing clearly better, for $200 less and 6dB more.
  • Filiprino - Thursday, November 7, 2013 - link

And I have to add that with aftermarket coolers the 290X will get better performance, allowing it to overclock even more.
Here you have only compared the 290X without overclocking, only in "Uber mode", which is not the same as overclocking.
  • ludikraut - Thursday, November 7, 2013 - link

    I'm not really on board with the R9 290X. Seems to me that the performance/overclocking of the 290X is a little sketchy, whereas the results for the 290 appear to be more consistent and for $150 less, much more attractive.
  • Skiddywinks - Thursday, November 7, 2013 - link

I think that sketchiness comes from the fact that it gets throttled all the time. With the fan running on a temperature knife edge, the ambient temp and layout of your PC are going to have a massive effect on how well it performs.

    The 290, as we should all know, had a fan speed boost to try and take on the 780 after the price drop, instead of the targeted 770. Once AMD get around to giving the 290X the same treatment (or, alternatively, we start seeing these after market coolers), I would be willing to bet the 290X will start looking much more promising. Probably still not enough to ruin the 290 as the go-to value high end product, but it will certainly not look as pointless as current reviews and benchmarks have it looking.
  • madwolfa - Thursday, November 7, 2013 - link

    Crysis 3 section has BF3 part pasted in it.
  • ludikraut - Thursday, November 7, 2013 - link

    What this review really needs is results for a CF R9 290. Seems to me that a pair of R9 290s will trounce a 780Ti for only $100 more. Actually looking at the overall results and how the R9 290 stacks up, I just don't see being able to justify a $300 premium for the 780ti.

    l8r)
  • smartypnt4 - Thursday, November 7, 2013 - link

    This. 100% this. They'll do it eventually, but I doubt he had a 2nd R9 290 in time to put those results in.

    That said, the R9 290X crossfire results make me very, very hopeful for R9 290 performance. A pair of them for $800 would be a steal to get a level of 4K gaming that wasn't available at anything below $1300 previously (2x780), and the 2x290 should even beat dual 780s handily at 4K based on how the 290X does (unless they give a special 780 some of that 7GHz memory).
  • just4U - Friday, November 8, 2013 - link

From what I've noticed through the years,
with a product launch of this nature Anandtech doesn't rain too much on the featured card's parade. It's the star of the show, after all. They get some criticism for that, but ah well. They do release updated information and head-to-head comparison articles after initial launches. Maybe it's just a time thing.
  • Vorl - Thursday, November 7, 2013 - link

    I can't believe that the reviewer is allowed to be so blatantly biased.

I suppose they assume most of their audience is too stupid to actually think... but still, things like this are really making me start to lose respect for a site I have read for years now.

It is so bad, it is almost to the point of my leaving and recommending that the people I talk to/work with look for a less biased site.
  • nsiboro - Thursday, November 7, 2013 - link

    I was initially worried about the wording/flow too but I think Ryan did the right thing.

    He was comparing 780ti to 780/Titan and only brought up R9-290/X when things mattered.

The takeaway from this review is that the R9 290/X with an AIB custom cooler will beat the 780 Ti and reclaim the crown for AMD.

    The R9-290 non-X with AIB custom cooler (when it gets released) will surely get an Editor's Award.
  • Vorl - Thursday, November 7, 2013 - link

I could see your point if he hadn't blatantly said "don't buy the 290", rather than "look forward to aftermarket cooling".

They downplayed 4K, and the games the 290 series did better in, and used much stronger words in the few areas where the Ti did better.

It's also funny how they downplay the price for a minimal improvement in speed. I remember in past reviews that a price difference like that would have made a huge difference in the recommendation, no matter things like noise levels.
  • Owls - Thursday, November 7, 2013 - link

    Ryan I'm sorry but the video card reviews as of late have been very poor in quality and objectivity. Stop rushing to be the first. I don't go to Anand to read a crappy review, that's what HardOCP is for.

    That said your testing is flawed with old games and comparing the Ti to be faster than a 290x that is in silent mode is disingenuous. We all expect better from this site.
  • nsiboro - Friday, November 8, 2013 - link

True - unfortunate wording. But we gotta learn to read between the lines, e.g. the 780 Ti cannot be compared to the 290/X. :)
  • nunomoreira10 - Thursday, November 7, 2013 - link

Totally.
Why does he even compare the non-"uber" 290X to the 780 Ti?
It's very misleading when he says it's 11% faster than the 290X without pointing out the fact that it was in silent mode.
Also not sure which drivers were used on the 290X.
  • hoboville - Thursday, November 7, 2013 - link

    Sigh, I've always had good experiences with Nvidia products. They have always been good to me, but the pricing nonsense of GK110 has really put me off, a lot. I get it, you have the best so you charge people for the best, but all this does is put performance hardware further out of reach of people.

Most people can't afford even $300 GPUs. A fact Maximum PC editors have commented on many times when they talk about how most people have relatively "low-end" hardware in their systems. AMD, because they haven't had the performance crown, has clearly been going for performance-per-dollar. And that's good, very good for you. Because let's face it, money is a real determining factor for almost everyone.

    And to think, if I did have $700 to spend, I'd spend $100 more and just get 2 R9 290s. 150% performance for 15% more money, not bad...
  • Trenzik - Thursday, November 7, 2013 - link

Very, very true comment. Money determines EVERYTHING. You know what makes something worth buying: price. I agree, I prefer Nvidia due to past experience with AMD, BUT Nvidia is expensiveeeeee. We're talking a 4GB video card for 500 bucks.
  • DominionSeraph - Thursday, November 7, 2013 - link

    "Bloody" price war?
The GK104 was designed to be the successor to the GF114, i.e. the $250 GTX 560 Ti. But as it turned out faster than AMD's high-end chips, it became the $500 GTX 680 and, 14 months later, the $400 GTX 770. The GK110 should've been the replacement for the GTX 580 at $500, but it became the $1000 Titan and $650 GTX 780. We are now 20 months past the release of the GK104, and all AMD's $550 launch price did was push Nvidia's midrange chip to $330.

    The GTX 460 was hailed as the value king at launch at $200. Six months later you could get one for $90 as we saw a real price war between AMD and Nvidia.
20 months now and the GK104 is still going for $330, with Nvidia's back-pocket card here being released at $700? There's no war here.
  • Skiddywinks - Thursday, November 7, 2013 - link

    Yeh, I have to agree with your argument here. This is no "bloody" price war by any stretch.

I can't fault companies for trying to make better margins, but there has not been a well-priced GPU in years. Well priced compared to the competition, sure, but I remember the days when £200 would typically get you the top-end single-GPU card. Hell, the HD 4870X2 cost me £330 only a month or two after launch. Now what does that get me? Probably an aftermarket-cooled 290. Not even the X.
  • TheJian - Thursday, November 7, 2013 - link

You guys are forgetting how much R&D these cost, and how much money they make now compared to when chips were small and had higher yields (simpler), etc. These chips are HUGE and complicated to make. See my "simple economics" post. They haven't made more than their 2007 profit in the last 6yrs, which should immediately explain why the price is high. They are not making as much even though they sell more than in 2007.

The 4870 was 256mm², not 550mm²+. That card came with 956mil transistors vs. 7.1B here on a single chip (even 2x 4870s was smaller, with 5B+ fewer transistors, etc). It came with 512MB as the 4870 (2GB on the 4870X2, I think) vs. 3GB of much faster stuff. The 4870X2 launched at $550 and your price is $530 US. That same $530 today almost gets you a 290X, and it smokes your 4870X2, right? How is that bad?? You act like R&D is free (not just the chips either, software R&D too).
    http://www.techpowerup.com/68231/amd-launches-rade...

I think it was Samsung that said on DailyTech the other day that it costs 20x more in R&D to make a chip today than in 1995. Considering profits, it's amazing they sell this stuff at current pricing - actually it's quite stupid; they're doing you a favor on both sides, or they'd be making more money, right? For instance, AMD: 48mil profit, first time profitable in 5 quarters, after losing 6Billion+ over the last 10yrs. Umm, somebody is pricing crap wrong when you lose 6Billion in 10yrs. :)

There is no other way to say that ;) YOU NEED TO CHARGE MORE. Period. This low pricing has caused them to double their outstanding shares (meaning SERIOUS share dilution), sell their fabs, lose 6B, sell their land, lose the CPU war completely, have all kinds of driver issues (even variance now with the new 290X/290), etc etc...The list of crap low pricing has caused is HUGE. Did I mention the value of the company today (1/4 its worth of just a few years ago)? I digress...
  • DominionSeraph - Thursday, November 7, 2013 - link

    You don't know what a GK104 is, do you?
  • just4U - Friday, November 8, 2013 - link

Jian, they are not doing us any favors... They're in the business of making money (well, mostly... AMD loses money year over year, but not due to their graphics department) and looking for ways to entice you into parting with your coin.
  • chizow - Thursday, November 7, 2013 - link

    Figured Nvidia would not let the crown rest with AMD when they still had their trump card to play (full 2880SP GK110), so this is not unexpected at all.

    I still feel Nvidia comes out of this with a black eye however, while AMD started the price escalation with Tahiti for 28nm, Nvidia took this greed to a new level with 690/Titan pricing. $699 is what the high-end GK110 SKU should have sold for, max, and it should've arrived this time last year. The fact AMD had to undercut Nvidia pricing so badly with the 290/290X will be something Nvidia will have a difficult time living down for years to come.

The fact that we will have 3 SKUs launched in such a short time since Titan released that outperform it at a fraction of the cost is an absolute slap in the face to some of Nvidia's most loyal and spendy customers. Lesson learned for Nvidia, and certainly a lesson learned for many of their customers.
  • TheJian - Thursday, November 7, 2013 - link

    Let me know when EITHER side makes more than they have in the last 10yrs. For NV that was ~800mil in 2007 (Q1 2008 TTM). For AMD it's ~500mil.
    http://investing.money.msn.com/investments/financi...
    http://investing.money.msn.com/investments/financi...

    http://money.msn.com/business-news/article.aspx?fe...
    A year ago 209mil for NV this Q. This year, 119m. Get the point? That's far worse than a year ago right? Lessons for Nvidia? Charge more every chance you get...LOL. Don't forget how much R&D etc costs. They are on Tegra4 now, and they haven't made a DIME on them yet total. Until they break 1Billion in Tegra sales it will continue to lose money robbing gpus cash. I highly doubt AMD's will break even for a while either once it hits. Also note it only took another 150mil or so in revenue to make 209mil last year. So basically they are selling the same crap for less right? 1.2B in revenue last year making 209mil. 1.05B revenue now, making 119.

Anyone thinking either side is ripping them off needs to learn how to read balance sheets. Start expecting slower release schedules, worse drivers, and smaller perf jumps until profits go back up.
  • hero4hire - Sunday, November 10, 2013 - link

    Hail corporate!

Price is just the price. If you find profits, branding, and fatter balance sheets persuasive, then look no further than the '80s American auto manufacturers as the greatest product around.

    Hail corporate!
  • just4U - Thursday, November 7, 2013 - link

    From the Article: "And though GTX Titan is falling off of our radar, we’re glad to see that NVIDIA has kept around Titan’s second most endearing design element, the Titan cooler"
    ---

I am not in the market for a card yet. Quite happy with my 7870 from HIS... but overall, that cooler is the most interesting thing to come out of this generation of video cards. (AMD needs to take note of that too.)

You see people going the aftermarket route because reference coolers simply can't cut it. So it's about damn time NVidia got off their butts and designed something special. Both companies have been guilty of neglect with their shitty coolers, leaving partners to think outside the box and fix the problem. That's helped to set the partners apart from each other, but it doesn't address the fact that we get flooded with subpar reference coolers that stick around like lemons throughout the life of each card.
  • just4U - Thursday, November 7, 2013 - link

    Oh.. forgot to mention..
I'd be more interested in this generation if they'd get temperatures under control. Not interested in any card hitting 80C+ under load... been there, done that, and it can take its toll on your other hardware over time.
  • tomc100 - Thursday, November 7, 2013 - link

I will stick with my Radeon 7970 for now. My room gets pretty hot in the summer, so neither GPU is going to perform well for me then, even with the air conditioner turned on. Also, I'm hoping Mantle will provide another 25-50% increase in performance in some games.
  • Conduit - Thursday, November 7, 2013 - link

    Thanks but no thanks. Who the hell pays $700 for a video card? I have a $200 card that plays all my games on ULTRA with AA at 60 fps.

    If I ever go for a high end card it will be a custom cooled R9 290.
  • nsiboro - Thursday, November 7, 2013 - link

    2 thumbs up + 2 big-toe up !
  • Trenzik - Thursday, November 7, 2013 - link

I have to admit $700 is a lot for a GPU, but it's more of an enthusiast thing. You get obsessed with specs and frames :). I enjoy it.
  • Makaveli - Thursday, November 7, 2013 - link

    lol conduit,

    You mean at 1680x1050 right?
  • Gadgety - Thursday, November 7, 2013 - link

    @Conduit

Those who want the maximum number of CUDA cores (SPs) for the minimum amount of money, and apply those CUDA cores to GPU-based ray trace rendering. Disregarding memory size differences, this is what it looks like:

The Quadro K6000 is $5000 for 2880 CUDA cores: 0.576 CUDA cores/dollar.
The Tesla K20 is $3500 for 2496 CUDA cores: 0.7 CC/dollar.
When the Titan launched it was a bargain at $1000 for 2688 CUDA cores: 2.69 CC/dollar.
When the memory-limited GTX 780 launched it was 3.55 CC/dollar.

Now the GTX 780 Ti provides 2880 CUDA cores for $699: 4.12 CC/dollar. Since it brings a decent 3GB of memory, it's more powerful than the similarly equipped GTX 780 at a better cores-per-dollar rate. I get 576 extra CUDA cores for only 50 bucks!

For larger scenes the Titan is still better, but for those who create smaller renders, the GTX 780 Ti is the absolute value leader. I can fit four of them in my chassis, at a total cost of $2800 for 11520 CUDA cores. If I need the larger 6GB memory, the Titan alternative would be $4000 and provide 10752 CUDA cores. Still cheap compared to four K6000 Quadros, which would be $20,000 for 11520 CUDA cores. Although I would much prefer a 4GB or larger version, of course.

    So that is who the hell pays the ultra bargain $700 for a GTX780Ti. It's not all about gaming.
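If you want to sanity-check those cores-per-dollar figures, here's a trivial sketch in plain C. The prices and core counts are just the launch figures quoted in this thread (including the 780's $649 launch price implied by the 3.55 number), so treat them as this comment's assumptions, not official data:

```c
#include <stdio.h>

/* Cores-per-dollar for the cards discussed above; prices are the
   launch prices quoted in this thread. */
struct card { const char *name; int cuda_cores; double price_usd; };

int main(void) {
    struct card cards[] = {
        { "Quadro K6000", 2880, 5000.0 },
        { "Tesla K20",    2496, 3500.0 },
        { "GTX Titan",    2688, 1000.0 },
        { "GTX 780",      2304,  649.0 },
        { "GTX 780 Ti",   2880,  699.0 },
    };
    for (int i = 0; i < 5; i++)
        printf("%-12s %4d cores / $%4.0f = %.2f cores/dollar\n",
               cards[i].name, cards[i].cuda_cores, cards[i].price_usd,
               cards[i].cuda_cores / cards[i].price_usd);
    return 0;
}
```

It prints 0.58, 0.71, 2.69, 3.55, and 4.12 cores/dollar respectively, matching the figures above.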
  • Ananke - Thursday, November 7, 2013 - link

Why you are still using CUDA instead of OpenCL is beyond me :).
The GTX 780 Ti is overpriced; Nvidia probably doesn't even care, because they care about the HPC market for their GK110 chips. On the other hand, they do feel some financial heat recently - Tegra was a flop, everybody and his grandma is buying Qualcomm today, and it seems Intel tomorrow... HPC computing is moving towards Intel and FPGAs.
Nvidia indeed got EXTREMELY greedy in the last two years, and the industry has punished them.

P.S. 3GB is not enough for "high-end" pricing. It is the minimum size already.
  • Gadgety - Thursday, November 7, 2013 - link

What software are you using? The blazingly fast Octane runs only on CUDA. Blender Cycles runs on CUDA. V-Ray RT runs better on CUDA. Bunkspeed Shot and Bunkspeed Move for moving images: CUDA. 3ds Max: CUDA.

The choice just seems smaller with OpenCL. What is there besides Luxrender? OpenCL seems to have been in development, and there is hope for a brighter future, but I don't currently see it as on par with the CUDA applications, and it needs something to really take a big leap.
  • colonelclaw - Friday, November 8, 2013 - link

I work in archviz, and we can't yet make full use of V-Ray RT because of memory usage. A typical housing development consumes >20GB of RAM on our render nodes, and we don't make nearly enough money to justify buying K6000s or K5000s. Our still image output size is typically 5000x3337, so I think it's gonna be a few years until we can really embrace full GPU rendering, but it's definitely coming. I do use V-Ray RT a lot to set up lighting and materials at much lower resolutions, and it's pretty damn fast on a K4000.
  • TheJian - Thursday, November 7, 2013 - link

    One person out of how many that actually gets it? :)

    Heck you could sell the 3 AAA games for $100 off each card too ;) I just raised your CC/$ right? :)
  • Gadgety - Friday, November 8, 2013 - link

    Yes you did. Thanks. Assuming one can get $100 for the games, it's now 4.8CC/$. I assume the game bundle is temporary, though.
  • hero4hire - Sunday, November 10, 2013 - link

    Sold my 2game bundle for $40. Everybody won.
  • Trenzik - Thursday, November 7, 2013 - link

Lol, why are everyone's comments to Wreckage so MEAN? He made a simple comment and some of the replies are just ridiculous. Is he not allowed to state his opinion? And is it that hard to reply to something you don't agree with, with dignity, class, and without having to cuss?
Good old 'merica at its finest, eh?
  • kyuu - Thursday, November 7, 2013 - link

    Because Wreckage is a blatant troll with nothing useful to contribute. No one is under any obligation to be nice to a useless shill, regardless of which company s/he is shilling for.
  • kyuu - Thursday, November 7, 2013 - link

    Oh, and I'd add that Wreckage posted his/her comment well before anyone could have actually read the article, so it's pretty obvious s/he came specifically to spout off.
  • Hrel - Thursday, November 7, 2013 - link

You talk about Titan as still being plausible as a compute card, yet the AMD cards, all of them, outperform both the Titan and the 780 Ti. Then the 780 Ti outperforms the Titan. Nvidia beats itself here, and AMD beats them by a massive margin. Then you throw in the fact that Nvidia is essentially not even trying to compete on a price/performance basis, and all of a sudden buying an Nvidia card makes absolutely no sense.

Honestly, I'm happy about this. I can't buy AMD CPUs since Intel so completely wallops them; but now, finally, I have no excuse to recommend any GPU except an AMD GPU. Good on ya folks; hopefully your CPU department starts firing on all cylinders like this.
  • Galatian - Thursday, November 7, 2013 - link

I might not be an expert, but I keep wondering what these new chips from AMD and Nvidia mean for their next generation. Clearly, bringing out these full-featured chips (which were once only supposed to be sold as workstation graphics chips) because 20nm keeps being delayed will put pressure on their next chips. For example, I guess the 780 chips are at the performance level Nvidia probably targeted Maxwell at. Maybe they are now pushed into releasing a full-blown Maxwell chip to begin with.
  • TheJian - Thursday, November 7, 2013 - link

This ^^^ Excellent that both have set the bar so much higher now. Realistically, though, it shouldn't be hard to top with the die shrink. It's just that neither side will be able to give us such a gimped card at the launch of 20nm, saving a ton for the refresh. They will be forced to give us something semi-real out of the gate :) I can't wait for 20nm Maxwell. AMD will have to produce an awesome chip (as in, AMD goes all out: low watts/heat/noise, 20% faster than NV - basically the reverse of 780 Ti vs. 290X) for me not to want G-Sync. No lag, stutter, or tearing is worth a ton to me.

    http://wccftech.com/alleged-nvidia-maxwell-archite...
    If this is true, AMD better have some good stuff up their sleeve. 6144 cuda cores? Plus all the other enhancements would be potent. I don't believe this though. Even with a die shrink it would seemingly be a HUGE die but too lazy to do that math right now to see how plausible and too far away...LOL. I could believe 4608 though with 6 GPC/18SMX/256alu's and 6144 maybe held for 16nm/14nm or something.
  • AngelOfTheAbyss - Thursday, November 7, 2013 - link

The difference between Titan and the 780 cards is the FP64 performance (1/3 vs. 1/24 of FP32 rate).
Using 64-bit (double precision) floating point operations simplifies a lot of things when implementing numerical algorithms. If you use 32-bit (single precision) operations, you often have to resort to some numerical skulduggery to get the desired accuracy.
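To make that "skulduggery" concrete, here's a minimal sketch in plain C (illustrative only, nothing GPU-specific): naively accumulating small single-precision values drifts badly, Kahan compensated summation is the classic workaround, and double precision simply sidesteps the problem:

```c
#include <stdio.h>

/* Sum 10 million copies of 0.1 three ways: naive float accumulation,
   Kahan compensated summation in float, and plain double. */
int main(void) {
    const int n = 10000000;
    float naive = 0.0f;
    float kahan = 0.0f, comp = 0.0f;   /* comp carries the lost low-order bits */
    double dsum = 0.0;

    for (int i = 0; i < n; i++) {
        naive += 0.1f;                 /* rounding error accumulates freely */

        float y = 0.1f - comp;         /* Kahan: re-inject what was lost */
        float t = kahan + y;
        comp = (t - kahan) - y;
        kahan = t;

        dsum += 0.1;                   /* double: accurate with no tricks */
    }
    printf("naive float: %.2f\n", naive);  /* drifts visibly from 1000000 */
    printf("kahan float: %.2f\n", kahan);  /* very close to 1000000 */
    printf("double:      %.2f\n", dsum);   /* ~1000000.00 */
    return 0;
}
```

The Kahan version recovers the right answer in single precision, but at the cost of extra operations per element; with fast FP64 hardware like Titan's you just use doubles and move on.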
  • TheJian - Thursday, November 7, 2013 - link

Quit looking at sites like Anandtech/Tom's that don't show much CUDA perf. Quit looking at OpenCL crap on Nvidia (only a fool buys NV cards and doesn't run CUDA whenever a job can be done with a CUDA app or an app that has a CUDA plugin!). None of the crap run here would be done on a Titan. You wouldn't run Sony Vegas either, which has tons of issues running on Nvidia (Google "vegas cuda" - badabing, bad idea to buy this app for NV; go Adobe). You'd buy an Adobe license like the rest of the world for photo or video editing and you'd turn on CUDA.

    I'll bet EVERY penny and object I own that they will run adobe the second AMD gets OpenCL in it (which is coming)...ROFL. How much does AMD pay this site? ;) They won't run AMD vs. NV in adobe until then. Of course if someone else shows it sucks still even after optimizing the upcoming revs of adobe apps, I guess they won't do it even then ;)

    Ask Anandtech why they don't run Cuda vs. AMD (in anything, amd can usually go OpenCL, DirectX or OpenGL in the same apps that use Cuda). You can run any pro app and pick luxrender for AMD and say, Octane/furryball etc for NV, yet anandtech refuses. Or just run adobe and choose cuda for nv and OpenGL for AMD. You can do Adobe tests with a freaking trial download.

    http://www.tomshardware.com/reviews/best-workstati...
    Look at that and the next 3 pages of cuda benchmarks and marvel as a $1000 card (titan) blows away $2000-$5000 cards (W6000 etc).

    Tomshardware does the same crap as anandtech. Note they say "NOT SUPPORTED" for all cuda benchmarks. But all they have to do is use LUXRENDER for all of them and pit them head to head with Cuda. I've asked many times why they do this for all the benchmarks in their forums and they NEVER have responded...ROFL. Why do they run any OpenCL benchmark on NV at either site? Run some real stuff like Adobe AMD vs NV. The world uses premiere and photoshop (largely).

    http://www.tomshardware.com/reviews/geforce-gtx-78...
Marvel now as the 780 Ti blows away the same $2000+ cards, nearly 2x faster than most...LOL. Understand? But why is AMD not included?
    3dsmax+iray (run luxrender for AMD).
    Blender 2.66 (run luxrender for AMD).
    Why tomshardware why?...ROFL. They read here too ;)
    OctaneRender™ for...
    ArchiCAD Cinema4D, Inventor, Maya, Revit, Softimage, 3ds Max, Blender, Daz Studio, Lightwave, Poser, Rhino (sketchup & carrara, autocad etc coming soon)

    Luxrender:
    3dsmax, lightwave3d, blender, dazstudio, poser, cinema4d, softimage, sketchup & carrara

See how they overlap? Lux for AMD vs. Octane for NV. Simple. But that would show how weak AMD is and how strong CUDA is after 7yrs and billions in development ;) Heck, pit any plugin you want for AMD vs. NV CUDA. CUDA is available for the top 200 apps, but Anandtech/Tom's Hardware seem incapable of running them against each other. Well, Anandtech does have an AMD portal page...LOL. :) Just saying...You should never run Lux with Nvidia. Run Lux vs. Octane! Pick an app above and run both plugins against each other for AMD/NV. Simple. I understand hating on CUDA for being proprietary, but these two sites ignoring it and acting like NV is slow due to OpenCL is ridiculous and misleading.

    Instead of the above, Anandtech runs luxmark (pick an app, use plugins instead of dumb opencl benchmark which highlights ONLY AMD), sony vegas (pit it against Adobe/Cuda/premiere easy to render the same vid in both), CLbenchmark? ROFL....How about something we can make money with instead of this fake crap? At least show the other side:
    http://www.ozone3d.net/benchmarks/physx-fluidmark/
    Which runs on both AMD and NV. Fluidmark:
    "This benchmark exploits OpenGL for graphics acceleration and requires an OpenGL 2.0 compliant graphics card: NVIDIA GeForce 5/6/7/8/9/GTX200 (and higher), AMD/ATI Radeon 9600+, 1k/2k/3k/4k (and higher) or a S3 Graphics Chrome 400 series."

    Folding@home can't make you a dime either...waste of time testing this. You like high electric bills for warm fuzzy feelings? Not me. Think some pill company will pay you to solve cancer? NOPE.
SystemCompute + AMP crap homemade benchmark to suit AMD? No thanks...can't make me a dime. NOT REAL. Doesn't anyone find it strange Anandtech only runs ONE thing (Sony Vegas) that can actually be used to make money? And it sucks on NV while Adobe rocks. What the heck is going on here? Nobody at Anandtech knows how to use Adobe products?
    "Last, in our C++ AMP benchmark we see the GTX 780 Ti take the top spot for an NVIDIA card, but like so many of our earlier compute tests it will come up short versus AMD’s best cards."

    LOL...Gee, maybe if you actually ran some REAL STUFF and pit AMD vs. NV (CUDA..DUH!) we'd find out some truth ;) What are you afraid of anandtech? I guess I should just paste this into every article with OpenCL benchmarks here ;) Maybe if they start losing even more traffic (down since 660ti article last sept, about in half, worse since AMD portal probably and AMD personal visit to ONLY this site...LOL), they will start telling it like it is and start running cuda vs. AMD.

    FYI Titan has full DP and 6GB. Come back and say that junk when you try to render something big. Come back when anandtech starts running CUDA vs. AMD. Until then, quit drinking anandtech/amd koolaide ;)
  • hero4hire - Sunday, November 10, 2013 - link

    Hail corporate!

    Cuda!!
Whatever you're getting paid needs to get cut for each ;), LOL, ROFL, and ALL-CAPS!!!! You're not being taken seriously when you sound like a 14-year-old girl texting her BFFs.

    Hail corporate!
  • SBTech86 - Thursday, November 7, 2013 - link

    nvidia thinks we r dumb
  • gordon151 - Thursday, November 7, 2013 - link

    Would they be wrong :)?
  • firewall597 - Thursday, November 7, 2013 - link

    Did you even read this review? CF pooped all over SLI in most scenarios.
  • jigglywiggly - Thursday, November 7, 2013 - link

Why don't the benchmarks include any older cards? Even a 670...
  • Ryan Smith - Thursday, November 7, 2013 - link

    Very limited time. 290 launched on Tuesday, 780 Ti launched on Thursday. If I had a couple more days I would have used the time to collect data from some older cards.
  • undeadpolice - Thursday, November 7, 2013 - link

Thanks Anandtech, I was worried that I'd made the wrong choice by buying two of the weaker R9 290Xs, what with all the GTX 780 Ti hype - money is not an issue for me, but performance is.

At least I was prepared to regret it, but it turns out to be a pleasant surprise!
  • GeorgeH - Thursday, November 7, 2013 - link

    What does the asterisk next to the 280X benchmarks signify?
  • Ryan Smith - Thursday, November 7, 2013 - link

    Non-reference. We don't have a reference 280X, and for 280X CF it's an oddball combination of an XFX card and an underclocked Asus card.
  • ludikraut - Thursday, November 7, 2013 - link

Ryan, how are you coming up with "This [780 Ti performance advantage] will break down to being 11% faster than Radeon R9 290X, 9% faster than GTX Titan, and a full 20% faster than the original GTX 780 that it formally replaces"? I looked through the test results again, and if I focus on the 4K results, what really strikes me is how close the R9 290 stays on the heels of the 780 Ti (often within 1-2 FPS). Heck, in some cases the $400 R9 290 is ahead of the $700 780 Ti. Also, why no results for dual/triple monitor setups?

    l8r)
  • Yojimbo - Thursday, November 7, 2013 - link

Well that's good. Buy a $400 card, $800 on monitors or whatever, and run everything at a high resolution and extremely low quality settings. Either that or suffer through 30 fps average frame rates and minimum frame rates of god knows what.
  • Ryan Smith - Thursday, November 7, 2013 - link

    Our focus is on 2560x1440, and that's where we draw our primary conclusions from. As we've mentioned in the past few weeks in other reviews (290X, etc), we don't consider 4K to be viable for a single card setup right now. You have to make some very significant quality compromises in most cases, which are far more detrimental than the extra resolution is beneficial. 4K gaming really requires a multi-GPU setup right now, which is why we focus on 4K when discussing said multi-GPU setups.

    As for multi-monitor setups, the use of a 4K tiled monitor generally makes those redundant. It tests the same technology, and does so at an even higher resolution than 3x1080p. Since we don't have the time to do 2560 and 4K and multi-monitor, we've opted for the first two on the basis listed above.
  • NewCardNeeded - Thursday, November 7, 2013 - link

    Wow, after reading this review it's become clear what graphics card I'm now going to buy.

    An R9 290 with 3rd party cooler :-)
  • Da W - Thursday, November 7, 2013 - link

For the sake of fairness, I'll add an unbiased tally. Since AMD cards scale better than Nvidia at 4K, and since "uber" mode is noisy but is the mode where the card really runs at full speed (NOT overclocked), I think it's grossly unfair to cite only 1440p gaming, take only the 290X's non-"uber" numbers, and call Nvidia the winner.

So for 4K gaming, where a tie means a 1 or 2 FPS difference or the 780 Ti sitting between the 290X's uber and non-uber modes, we see:
    -Metro: tie
    -CoH2: AMD
    -Bioshock: Nvidia
-Battlefield 3: tie (Battlefield 4 will be a win for AMD, but we don't have that here)
    -Crysis 3: Nvidia
    -Crysis: Tie
    -TW Rome2: Nvidia
    -Hitman: AMD
    -Grid 2: Tie

    That's 2 AMD wins, 3 Nvidia wins and 4 ties at 4K, which is also a good proxy for eyefinity/surround setups.

AMD is louder and Nvidia is $200 more. You essentially pay $200 for a cooler.

    Case closed.
  • Kutark - Friday, November 8, 2013 - link

Nobody cares about 4K. I honestly don't even know why people bother to benchmark it. Like the author said, 4K is only viable in multi (high-end) GPU setups. So we're talking $800+ just for video cards, and the only 4K monitor out right now is $3800, has a 30Hz refresh rate, and fucking terrible input lag. NOBODY IS GAMING AT 4K. If we were having this discussion a year from now, things might be a *little* different. But we're not, we're having it now. So, I state once again: nobody except AMD fanbois trying to stroke their proverbial cocks gives 2 shits about how these cards scale at 4K resolution.
  • FuriousPop - Sunday, November 10, 2013 - link

lol. You do realize that those of us running Surround/Eyefinity need a bench to relate to this. That's what 4K does for us; it's not much, but it's better than just the standard 1600p.
In actual fact I am currently gaming at close to 8K resolution (Eyefinity), so before you rage, take a breath!
  • yeeeeman - Thursday, November 7, 2013 - link

And, after this response from Nvidia, the 290X still seems the right choice. You strap a water block on it, and it goes like a 780 Ti, has more memory (4GB), and consumes approximately the same. You have to be extremely stubborn not to admit that the 290X is the right card to get.
  • Kutark - Friday, November 8, 2013 - link

Right, because the $90-140 for the water block - also assuming you already have all the other requisite shit for water cooling - TOTALLY makes it worth it.

For someone buying a new card who doesn't already have a water cooling setup, water cooling is a COMPLETE non-option.
  • hero4hire - Sunday, November 10, 2013 - link

You can buy aftermarket air coolers for >$100 too, but that's only for the tinkerers. I'd rather just see what an aftermarket cooler does and not pretend I know better. AMD has a laughably bad reference cooler, which is why it's so easy to see the weak link. If we didn't see a large performance jump at the 100% (vs. 60%) fan throttle, I'd just call the 290 a bust and move on. I won't buy this gen, but I am very interested just as an overclocker.
  • scook9 - Thursday, November 7, 2013 - link

AnandTech, PLEASE PLEASE test the 780 Ti against the R9 290X and R9 290 with all of them watercooled at stock clocks. This will be the only real way to tell which card is better than the other with temperature removed from the equation, as temperature clearly has a wild influence on the overall performance of these cards.

    Thanks!
  • Yojimbo - Thursday, November 7, 2013 - link

    Haha if you are going to watercool them, why test them at stock clocks? Because AMD is already more-or-less overclocking their cards and you want to cast AMD in a better light? If you are going to watercool them, then overclock each card aggressively, and test them that way.
  • eanazag - Thursday, November 7, 2013 - link

I'm not sold on the Ti being that strong a champ. I will say that Nvidia's cooler is by far better, and AMD should take note - especially since AMD's temp limit is high. I don't have money for either a 780+ or an R9 290+, but if I were spending, Nvidia's position in any category doesn't justify the price. Their wins are not impressive enough for that. $50 more over the R9 290X would be reasonable. The overclocking options look good; without overclocking, R9 290X + $25.
  • looncraz - Thursday, November 7, 2013 - link

Is it just me or do the performance charts not match up with the words?

    What I mean is that the charts will show the 290X, in uber mode, beating the 780Ti by 2-3FPS almost across the board and then the text in the article will declare the 780Ti the winner. This is most obvious on the Crysis: Warhead page.

    "with the additional performance offered by the GTX 780 Ti NVIDIA is once again at the top, though only by a margin of under 2fps"

    That isn't true any way you shake it. The 290X in quiet mode loses by 1.2FPS - at worst - and in uber mode it wins by 2.2FPS.

    All I see in the charts from the 780Ti is a card with a slight average advantage at lower resolutions and a more significant loss at higher resolutions. Not a bad card, but I'd call it a tie if anything... a performance difference in the range of 2% between the 290X and Titan was considered a tie... why not now?
  • mac2j - Thursday, November 7, 2013 - link

Totally agree - and the reviews on other sites are much more balanced from what I've seen so far. I don't think of Ryan as someone who is generally overtly biased, but if you look at the numbers this looks like a huge win for the 290X. In most games the 2 cards are within +/- 5% of each other, which wouldn't even justify a $100 premium, much less $150. On top of that, the 290X seems to scale better in CF. Just my interpretation based on the games I play, but the "final words" seem very slanted and the "11%" over the 290X seems very biased, as it's not based on Uber mode.
  • venkman - Thursday, November 7, 2013 - link

Maybe this has been asked before, but when are we going to see benchmarks with the 2013 fall games? Battlefield 4/COD: Ghosts/Batman/AC4, etc.?
  • NewCardNeeded - Thursday, November 7, 2013 - link

    Even the R9 280X crossfire beats the GTX 780 Ti in SLI in several cases !!!!!!!! Note that I do mean the 280, it's not a typo. The 290X Crossfire *SLAUGHTERS* the 780 Ti in SLI AND it's a fraction of the price.
  • austinsguitar - Thursday, November 7, 2013 - link

Okay, okay...let's tell this guy what happens after a new Nvidia graphics card comes out, shall we. For the first 2 weeks after a card comes out, it's ALWAYS UNOPTIMIZED FOR SLI; 2 weeks later comes ABSOLUTE SCALING WITH THE NEXT AVAILABLE DRIVER. Happens every time, dude. That little guy will be better in two weeks, just trust me.
  • NewCardNeeded - Thursday, November 7, 2013 - link

    I'm not so sure this time. Nvidia have held back the 780 Ti for months until AMD released their new cards. They've had plenty of time to optimize for SLI. Expect small gains yes, but nothing more.

    Let's see what happens when mantle comes out...
  • austinsguitar - Thursday, November 7, 2013 - link

Temperature, power (wattage), noise...this beats the 290X badly....
Think about this: 95 degrees and the ungodly noise coming from the 290X is ABSOLUTELY "UN"ACCEPTABLE... The card is cheap, yes, but after 2 years of game playing your energy bill will weigh into that factor. I do realize that AMD's drivers are getting better, but come on, people...Mantle?
  • NewCardNeeded - Thursday, November 7, 2013 - link

    Have you never heard of a "third party cooler"?

    Coming this way soon !
  • NewCardNeeded - Thursday, November 7, 2013 - link

    Read the article again without your green tinted glasses on!

    Full load on Crysis 3:

    Power (W):
    780 Ti = 372
    290X = 375

    Does it really beat the 290X bad?
  • austinsguitar - Thursday, November 7, 2013 - link

Oh, I'm sorry, I was speaking about CrossFire and SLI configurations when taking into account the power draw... And everyone knows that when Nvidia plays its games at 60 fps it clocks things lower, and the power draw is very impressive...odds are these cards will never see below 60 for a while now, and Nvidia's power draw at medium loads is phenomenal.
  • Kutark - Thursday, November 7, 2013 - link

Jesus Christ, I've been reading the comments. The AMD fanbois are starting to get worse than Biodrones were during the release of TORtanic. Granted, there is some definite Nvidia fanboyism going on, but the reality is this: Nvidia is on a 9-month-old architecture and is able to put out a card that beats AMD's brand-new architecture's top dawg by roughly 10%, running at a significantly lower temperature, and significantly quieter.

    Does that justify a $200 price gap? Well thats up to the consumer to decide. But to try to suggest this is a "Tie" or anything other than Nvidia reclaiming the fastest single card crown is just being ridiculous.

I just find it hilarious some of these AMD people sitting here spouting off these very specific scenarios where the AMD card comes out on top and acting like that means anything. OK, so CrossFire 290Xs (only a thousand dollars!) beat SLI 780 Tis in several cases, whoopety do. This will affect all of the 1/10th of 1% of people who will pay that kind of money for a graphical solution on their gaming rigs.

The other thing is, Nvidia's architecture is 9 months old, for shit's sake. You don't think they will have something else out in a few months that's going to just crap all over AMD's new...
  • Kutark - Thursday, November 7, 2013 - link

Gah, stupid IE9 (I'm on a computer I can't install a good browser on). Anyways, I was just saying, Nvidia will likely release something early to mid 2014 which will probably blow any current-gen cards out of the water, and then where is AMD? The same spot they've been in?

I'm glad AMD released the 290X; it is overall a HUGE step forward for them, and I'm glad Nvidia has some real competition. That is only a good thing for the consumer. But overblowing the 290X as something it isn't is not doing anyone any favors. We need to stop blowing smoke up AMD's ass so they actually keep pushing themselves and come out with a proper Nvidia smasher, and then Nvidia will be in a position where they can't keep charging 400-800 dollars for cards that should be 250-400 dollars.
  • UpSpin - Friday, November 8, 2013 - link

I don't get your comment, and I'm no GPU fanboy at all (even though I've only bought Nvidia GPUs in the past). I rarely play high-end games and find such high prices (both AMD and Nvidia) for a GPU ridiculous. But I'm interested in tech and am considering a mid-tier GPU because my Nvidia GTX 560 Ti has started acting strange.

What matters is what Nvidia or AMD sells now and what it costs now. It's a fact that the 290X beats the 780 Ti in half of the benchmarks; the other half, the 780 Ti wins. It's a fact that the power draw between both single cards is identical. And it's a fact that the newly released 780 Ti is $150 more expensive than the newly released 290X.

Of course the 290X is too loud, but that's not an issue of the GPU (same power draw), more of the cooler, which should be fixed with a third-party cooler implemented by ASUS etc. The Nvidia reference coolers were always superb (that's why I own a reference EVGA 560 Ti: it was really silent compared to the similarly priced alternatives).

We live in the here and now, and we can only buy the current stuff. So I don't care if Nvidia might release an even better card in the near or far future (by which time AMD might release a new card, too). And if you want to buy a GPU now, the Nvidia is, regarding price, a complete rip-off compared to the AMD.

As an excuse for the 'poor' performance of the Nvidia card, you said it's 9-month-old technology. So let me get this straight:
Nvidia sells you 9-month-old technology for $150 more than AMD asks for the latest bleeding-edge technology they can offer? And you defend this? Are you serious? Nvidia sold the Titan for even more during the last months. So be damn happy that AMD released such a great card at such a low price point, or else you would keep getting ripped off by Nvidia in the following months, too.
  • Kutark - Friday, November 8, 2013 - link

The argument was never whether one or the other was a better value, but rather which is the better card. That's what this discussion is about. Anybody with their head not firmly planted in their ass can see the 290X is a good value. The problem is value changes from person to person. Some people don't care how noisy their video card is. Some people don't care how hot it runs under load. Those people would find that card to be an excellent value. Other people do care. For them it might be worth paying extra for a quieter card that runs 8-10C cooler. Just like some people don't give 2 shits about having leather seats in their car and don't think a $2000 option for leather is a good value. Others think it's great.

Secondly, the Ti only loses to the 290X at 4K resolution, which is a complete non-point, as there is all of one 4K monitor out right now and it costs 2x what most people spend on their entire computer. Let's also not forget that to get decent framerates you need a minimum of two 780/Tis or 290/Xs. So we're talking about a $5k investment beyond the rest of your rig to play at 4K? I'd be willing to bet less than 1 in a million PC gamers have that much money in their rigs.
  • Mondozai - Friday, December 13, 2013 - link

    AMD fucked up with their reference cooler, and they fucked up with not providing data early enough to OEMs for aftermarket coolers. But comparing stock 290/290X to the GTX 780 Ti is misleading. You have to compare both cards when there are aftermarket coolers to both of them.

    But you don't do that. Why? Because you know an aftermarket cooler isn't going to be a big difference to 780 Ti, but it will make a massive difference to 290/290X.

Both cards at stock clocks, each with an aftermarket cooler from the same OEM, will show very little variance in performance at higher (1440p and above) resolutions. Except that the 780 Ti will remain completely overpriced.

    Price to performance ratio isn't just about budget and middle segment cards. You can do the same to high-end cards. A 290X with an aftermarket cooler is simply going to beat out 780 Ti in anybody's but a fanboy's eyes. Sorry, but you're fanboying.

    And I am saying that as an Nvidia card owner myself, but the fact remains that Nvidia has been able to rape the wallets of a lot of people for so long, and I blame AMD for this, that some people like you have internalized the raping and come to defend it.

    I can only look at you with pity.
  • ol1bit - Thursday, November 7, 2013 - link

    First, Fantastic Review as always!

As a side note, it's amazing to me that AMD can't get the same performance at the same heat output as Nvidia. After all, they are a semi-big chip company; what gives?
  • FuriousPop - Thursday, November 7, 2013 - link

Wow! As a current AMD owner, I must say it is impressive. Temps and noise: great! Power consumption: not as good as expected, but hey, a good card indeed. Now if only we could see benchmarks in the higher ranges, 1600p+. If you're going to call SLI of these overkill at 1440p, then why not show 1600p and see how it really matches up!?

I know that in the past it has been AMD for >1440p and Nvidia for <1080p, but as of late we are starting to see that change dramatically.

Can we please see some 1600p and 1600p+ resolutions tested in the benchmarks! If those benchmarks were already out and showed impressive results at higher resolutions, I probably would have gone out and bought 2x of these ASAP. However, I will have to wait and see...
  • AnotherGuy - Thursday, November 7, 2013 - link

So Ryan, you made the R9 290 out to be a total disappointment because it had a little higher noise than the rest at the time, but now when you see the 780 Ti getting close at 52dB, it's "a little high but OK" for Nvidia, and they won... That is not fair.
You need to control those emotions, think many times, and then finally find the right words to describe a product, not let your first thought become the final conclusion of a review.
  • Ma Deuce - Thursday, November 7, 2013 - link

You need to control those emotions. Think many times and then finally find the right words to criticize a review, not let your first thought become the comment...

The 290 is loud and hot and he didn't recommend it. Sorry man, it's not the end of the world, though. You can buy it without their express written consent.
  • sf101 - Friday, November 8, 2013 - link

AnotherGuy, you're right, though: in the 290X review they made it sound like the end of the world over the noise/temp/power levels.

Yet when Nvidia does it, it's OK.

And everyone just points in the other direction - "look, it's Superman!" - and pretends it didn't happen.

If you're going to be nitpicky about noise and heat, then at least be consistent. From what I've seen, both the 290 and 780 Ti are fairly close in wattage now, so there go your TDP arguments.

And just because Nvidia has a better reference cooler doesn't change the fact that it is still using similar wattage to the 290/290X; the only difference is Nvidia's cooler deals with it a bit better.

Just finding the review a tad biased...
  • Morawka - Friday, November 8, 2013 - link

    The 290 got up to 62 dB.
  • formulav8 - Monday, November 11, 2013 - link

    Ryan's reviews have stunk for a while now. I used to defend him and Anand against accusations of bias, but there does seem to be something to it. And the thing is, if it's true, they couldn't care less. Literally.
  • dwade123 - Friday, November 8, 2013 - link

    Only morons will buy current-gen cards on steroids.
  • MLSCrow - Friday, November 8, 2013 - link

    Never in all my years of supporting AnandTech have I been so disgusted by such overly obvious bias toward Nvidia. The GTX 780 Ti is a JOKE at $700.
  • just4U - Friday, November 8, 2013 - link

    As much as I hate to admit this, I "DO" like to see AMD succeed. That being said, I don't favor them over Nvidia, and AnandTech's reviews are fair and balanced. I think all reviewers have a preference, and sometimes it shows in their reviews, but it's really hard to pinpoint what Ryan's is. The guy can't win with either the Nvidia or the AMD die-hards; he gets criticized for being a fanboy of both.
  • Ranger101 - Friday, November 8, 2013 - link

    I hear you. I have followed AnandTech for decades, but this kind of rubbish is definitely making me think about looking to other tech sites for a balanced perspective... Shame on you, AnandTech.
  • Ranger101 - Friday, November 8, 2013 - link

    At 4K resolutions the R9 290X beats the 780 Ti every time. How is it possible to conclude that the 780 Ti is 11% faster than the R9 290X when the former is consistently beaten at 4K, the ultimate test of a card's speed? How much is Nvidia paying you to write this junk?
  • polaco - Friday, November 8, 2013 - link

    Indeed, that performance difference is very tricky. In most cases they are head to head, and for 150 bucks less it's a no-brainer: the 290X is the winner. However, the 290 seems so sweet at its price that it puts me in real doubt whether the 290X is worth it. The Radeon 290 looks lovely.
  • Yojimbo - Friday, November 8, 2013 - link

    He explains his reasoning quite clearly, and I think the reasoning is sound. 4K resolution is still out of the reach of a single-GPU card: to achieve it, one must either accept painfully low frame rates or run at extremely low quality settings, no matter which single-GPU card is chosen. Neither option makes much sense, but if you wish to take advantage of them, the data is there and you are free to ignore his analysis and pursue your own; buy a 290X and a 4K monitor. But in terms of "victory" for AMD, it seems to me that running 4K somewhat faster, yet still not fast enough to be usable, is meaningless.
  • Jaboobins - Friday, November 8, 2013 - link

    The memory frequency for the GTX 770 is wrong. It should be 7GHz, not 6GHz.
    But damn, that GTX 780 Ti is fast!
  • wwwcd - Friday, November 8, 2013 - link

    I've heard a rumour of an R9 290X with 8GB of 7+ GHz GDDR5 VRAM. Wow! If that turns out to be real, the card would have twice the memory bandwidth of the reference R9 290X. The GTX 780 would be driven into the ground. Hardly ;)
  • slickr - Friday, November 8, 2013 - link

    At $700 it's a bit too expensive, especially when you consider it averages only about a 5% performance increase over the Titan and about 9% over the 780, which, translated into raw numbers, is only 3-4 frames.

    I mean, whether a game runs at 50 or 54 frames is of no significance, especially if you have to pay $200 more for it.

    I think the 780 Ti is good in its own right, but it's just not good enough when compared to the competition, and when you consider the price.

    The 290X is $550 and in some cases is still faster than the 780 Ti, all this with a terribly designed cooler, which will be replaced by custom third-party coolers in the next week or two.

    So at that point we are looking at a $550 290X with a custom cooler that will be able to run even faster, which means beating the new 780 Ti in many benchmarks and drawing in others, all at $150 less.

    So yeah, Nvidia may have released a slightly faster card than the 780 and Titan, but considering the price and what the competition is offering, it isn't very appealing.

    If it had launched at $600 it might have been reasonable: you'd have a roughly 10% slower 780 at $500, the 290X in between at $550, and it could make sense to go for the 780 Ti. But right now I don't really see the appeal.
  • polaco - Friday, November 8, 2013 - link

    Indeed, that's exactly the way I see it. AMD has played its cards quite well this time. Nvidia has been brought to its knees on price and can't compete any other way; it needs a new GPU to compete with Hawaii. I don't want to bring Mantle, TrueAudio, G-Sync, or the Shield discount into consideration.
    Mantle and TrueAudio haven't been demoed yet.
    Shield seems like a waste of money and almost useless to me.
    Maybe in the future, if they prove worthy, we will see Mantle, TrueAudio, and G-Sync included as part of an upgrade or become standards.
  • Gigaplex - Sunday, November 10, 2013 - link

    Only 5% faster than a $1000 card? Yeah, totally overpriced at $700.

    Sarcasm aside, it really is overpriced, but comparing it to the Titan to justify that claim doesn't work.
  • polaco - Friday, November 8, 2013 - link

    Well, looking at Anand's benchmarks, I can't find a way to justify spending that amount of money on this card; the AMD 290 and 290X look far more interesting. Unlike other reviews, Anand's focuses only on ultra-high resolutions, and I think that's the way to go, since no one would buy one of these cards (780 Ti, 290X) to game at 1680x1050. At these high resolutions the performance differences between the cards are minimal in most cases, and could be reduced even further by driver updates or settings tuning. I find no reason to pay such a premium for the 780 Ti when the Radeon 290 delivers more than decent performance at 250 bucks less. If I wanted to go to an extreme, instead of acquiring a 780 Ti it would be better to get 2x Radeon 290 for 100 bucks more. So the problem with the 780 Ti is its price, pure and simple. I find it really hard for Nvidia to lower prices further, since it looks like the 290 and 290X launch has brought Nvidia's prices to their knees. I'm still waiting for the 290 and 290X third-party coolers; if they are good, those cards may well match or overtake the 780 Ti. What I have no doubt about is how the Titan and 780 early buyers must be feeling at this moment...
  • j6z7 - Friday, November 8, 2013 - link

    I think the majority of comments here reveal the truth behind the big headlines in reviews.

    The reference 290X, with its 10-year-old reference cooler, still beats the 780 Ti with its best-in-class cooler in CrossFire vs. SLI, and it does so while being $300 cheaper too!

    If anything, Nvidia shot itself in the foot against its own Titan.
    Nvidia fans will continue to support the company in ripping people off, while AMD provides the same performance at affordable prices.

    At the end of the day, people who buy the 290X will be much more satisfied customers.

    The End.
  • Mondozai - Friday, December 13, 2013 - link

    Nvidia fanboys like EJS1980 are like battered spouses: they keep rationalizing the abuse.
  • b3nzint - Friday, November 8, 2013 - link

    You must be pissed at the 290X being priced that low. Anyway, AMD should reconsider using that blower-type fan. Nvidia has a better cooling system than AMD, but at that price I just don't care. $700? Still too much.
  • b3nzint - Friday, November 8, 2013 - link

    What about Mantle, compute power, and TrueAudio? You don't buy a GPU just for the friggin' FPS number!
  • wwwcd - Friday, November 8, 2013 - link

    Wow, Ryan cut my comment. I know he is a really hardcore green fan, but this censorship has no place in a democracy. I can take my revenge on Mr. Anand!
  • Ryan Smith - Friday, November 8, 2013 - link

    Uh, we haven't deleted anything. Are you sure you haven't just misplaced your comment?
  • not_there - Friday, November 8, 2013 - link

    I'm not a gamer, but I like to put a decent video card in my builds to run Folding@Home. It's real science, and it helps heat my basement. Reading the reviews here, I'm confused (someone point out what I'm missing): in a June review of the GTX 760, the Radeon HD 7970 got a 36.1 on the Folding@Home Explicit, Single Precision benchmark, while in this review the same card on the same test got a 29.3. The June number for the GTX 770 on this test was 35.6, while in this review the same card on the same test gets 15.1. Why the difference?
  • r13j13r13 - Friday, November 8, 2013 - link

    To the NVIDIA fans: don't worry, they'll soon release the $1,000 version with 5% more performance.
  • twtech - Friday, November 8, 2013 - link

    An interesting comparison would be a 780 Ti vs. CrossFired 290s. In the previous generation of cards I wouldn't have considered that fair, as any SLI/CrossFire setup was definitely inferior to any single-card setup in a variety of ways. But that's changed in this latest generation.

    I bought two 290Xs on launch day (I knew they would go out of stock by day 3 and not come back into general availability for something like 2 months), but the experience compared to the last time I'd tried CF a few years back was completely different. There are no bridges to worry about; you just plug in the two cards and go. CF doesn't have any sync polarity issues, and the driver support for CF and multi-monitor is actually pretty fleshed out. I didn't notice any of the stuttering or texture corruption I had the previous time I'd given CF a try.

    In fact, the only reason I tried it now is that I have three 2560x1600 monitors, and driving that is too much to ask of a single card. The two 290Xs handle it easily, though.
  • FuriousPop - Sunday, November 10, 2013 - link

    What frame rates are you getting?
    I currently have 2x 7970s in CF and was looking to upgrade to handle the same setup.
  • DMCalloway - Friday, November 8, 2013 - link

    Wow!!! There sure are a lot of used 780's on Eb*y... Meanwhile, in a very luxurious boardroom in Santa Clara, California: "But sir... do you really think they'll sell at that price point?" (laughing) "Of course they'll sell at that price point; our consumer research polls show that our customer base simply can't help themselves." And throughout the world, the rustling of wallets and swishing of credit cards could be heard as green team loyalists geared up to purchase their second almost-$700 GTX 780 of 2013... :)
  • polaco - Friday, November 8, 2013 - link

    This is what we are talking about:
    http://www.tomshardware.com/reviews/radeon-r9-290-...
    When the 290s get their non-reference coolers, Nvidia's 780 Ti will have to pack up its bargains and go home for good. AMD's 290 and 290X series still have headroom to hit even better performance numbers; Nvidia's 780 Ti, however, is at its max.
  • EJS1980 - Friday, November 8, 2013 - link

    " AMD's 290 ans 290X series are full of hopes to hit even better performance numbers, however NVidia's 780Ti are at it's max..."

    OC'ing the 780 Ti will give you around 15-20% more performance, or higher, so what the hell are you talking about? I realize you're in love with all things AMD, but if you can take your goggles off for a second, you'd realize the 780 Ti is actually a really great card (much like the 290(X), obviously).
    I had NO IDEA how many AMD fanboys could be found mouth-breathing on the internet at any given time. Which raises the question: if AMD has so many fanboys, why the f*ck are they doing so poorly in the discrete GPU market?
  • polaco - Saturday, November 9, 2013 - link

    Yes, I do prefer AMD due to its fair pricing. But what I'm saying is that with non-reference coolers, the 290/290X will be able to run pretty cool and have decent overclocking potential too, as shown in Tom's Hardware's chart. Since they cost quite a few bucks less than the Nvidia cards, and the performance gap should by then be pretty much closed (in fact it already is), the AMD cards will sit at an extremely nice price/performance point. What do you mean by "doing poorly in discrete GPUs"? Many APUs have been sold, APUs are replacing discrete GPUs, and every PS4 and Xbox One carries what amounts to a discrete-class GPU. I do have a preference for AMD, but mainly for these reasons: they have always tried to innovate, they have to compete with a giant like Intel, and they bring price balance to the GPU/CPU market. That doesn't mean I will buy whatever they bring to market; I evaluate all the options and buy what fits my needs best. In fact, the 780 Ti is a great card, nobody says otherwise, just quite expensive from my point of view, and I don't want to get into how much Nvidia has been abusing buyers' wallets over these months. I wonder if any Nvidia fan who acquired a 780 or Titan before the 290 entered the market could admit that...
  • Owls - Saturday, November 9, 2013 - link

    OC'ing a Ti is not guaranteed. Why do people parrot this info around as if every card is going to perform the way you describe?
  • EJS1980 - Saturday, November 9, 2013 - link

    Again, what the hell are you people talking about???

    Even though results can vary from card to card, EVERY 780 Ti can be overclocked to boost performance by a significant margin. These chips are the cream of the Kepler crop, and Nvidia is confident enough in their yields that a substantial OC is all but guaranteed with each card, as EVERY review so far has illustrated.
    I personally feel this card is about $100 overpriced, and as such, I will NOT be upgrading at this time. I also believe that even with the significant problems inherent to the new Hawaii chips, they are powerful cards at an EXCELLENT price point.
    However, I'm not going to pretend the 290(X) is faster than the 780 Ti just because it's priced better. So many of you keep claiming that once aftermarket solutions arrive, the 290(X) will take back the crown, and that simply isn't true. Performance will obviously improve, but only to levels comparable to a STOCK 780 Ti, and maybe not even that. That's where OC'ing comes into play, for if we're going to compare an OC'd 290(X) with a better cooling solution, then the same must be applied to the 780 Ti too. I expect the 780 Ti to maintain its 5-15% performance advantage over the 290(X) after BOTH have their aftermarket versions, so the question ultimately returns to whether the consumer finds that performance advantage worthy of the price differential. Just because you don't, or I don't, does NOT mean that no one else will, or that there isn't an advantage to begin with, which there undoubtedly will be...
  • Mondozai - Friday, December 13, 2013 - link

    EJS1980, the mouth-breathing Nvidia fanboy: you're talking about a card (the GTX 780 Ti) whose advantage over an aftermarket-cooled 290X could be as low as 5%, for $200-250 more. Only an Nvidia apologist would think that's a good deal; you've been fleeced by their pricing for so long that you've come to enjoy it.

    Most sane, brand-agnostic buyers will opt for the best price/performance ratio, including for high-end cards. A 290 in CF with aftermarket coolers will crush everything. Even a 290X with an aftermarket cooler is going to do a lot better, especially as we transition to 4K within the next 1-2 years.

    Stop being an apologist for Nvidia.

    (P.S. I'm currently using an Nvidia card, but I always get embarrassed when I see shills for a specific company like yourself.)
  • beck2448 - Tuesday, November 12, 2013 - link

    They live in a dream world. Pros buy Nvidia 80%-plus of the time. That says everything about quality and reliability.
  • Mondozai - Friday, December 13, 2013 - link

    EJS, the Nvidia apologist: most sane people are non-fanboys.
    That means most people, including myself, skipped AMD the last few generations because they did a shoddy job, and we bought Nvidia hardware instead. Now, with aftermarket coolers, the roles will be reversed.

    Also, please don't talk about mouth-breathers when what comes out of your mouth every time you try to say something stinks this badly.
  • xdesire - Saturday, November 9, 2013 - link

    You really don't know what you're talking about. Obviously the R9 290 holds great price/performance value, but the GTX 780 Ti has great OC potential out of the box. I'm afraid AMD shot themselves in the foot with this reference cooler.
  • Grugtuck - Friday, November 8, 2013 - link

    Any reason why the 900,000,000-pound gorilla in the room isn't mentioned here? The 290X spanks the living **** out of the 780 Ti in CF vs. SLI. It makes me think the driver issues are still not fully sorted out.

    Ryan, you sound like an absolute idiot when you say that no one is going to need SLI or CF anymore. I also find it interesting how 60 FPS has suddenly become the standard to live by. I started playing PC games competitively back around 2002, and 80 FPS has always been what people shot for, not 60. 60 is the bare minimum for acceptably smooth play; it's not optimal for competitive or serious FPS gaming.
  • lostsanityreturned - Saturday, November 9, 2013 - link

    Hmmm, I figured I would run a quick bench... My OC'd Gigabyte 780 with stock cooling gets the same average FPS as their OC'd 780 Ti in Metro: 67 fps at 1440p, high preset.

    I imagine it would be even higher if I uninstalled Comodo (which seems to drop my average FPS by 5-14 frames just by being installed, even if everything is disabled and profiles are set up correctly to ignore games; it goes right back up if uninstalled, though).

    I hit 77 degrees after my third run, and it dropped back down to 75 soon after when the fans ramped up again. Keep in mind this is Western Australia, currently at 34 degrees (that is 93.2 Fahrenheit), with all the windows open, no air conditioning, and an air-cooled case.
    It isn't even a demanding overclock: +161 on the core and +189 on the memory... and I usually run it at +181 and +201 with ease and stability (I turned it down to see what the results were for an easy overclock, as they didn't push their 780 Ti much).

    I was feeling like crap about them releasing a new card just 4-5 months after I got the 780... now... not so much.
  • sf101 - Saturday, November 9, 2013 - link

    I think people are obviously just irate with Nvidia thinking it can charge premiums on everything, and not just small premiums; they seem really set on this $750-plus pricing, refusing to cut their customer base a break on the overpricing.

    So they drop the 780 GTX to $500 and everyone cheers "ignorantly"!!! Really, it's just a smoke screen, because they knew the 290 and 290X were outperforming their card while running on poorly performing coolers, and yet the 780 still carries a $100 premium over the 290, which outruns it pretty much everywhere.

    Now down comes the 780 Ti, pooping all over early adopters of the 780 and, even more so, the Titan buyers who thought they were getting a flagship card and foolishly paid $1000+ for it.

    But it's not all bad, because heck, performance is performance, and the 780 Ti is obviously needed to keep Nvidia in this race. So we all look past its release, but we can't look past its premium pricing, which is just another rip-off of the customers at $150 over the competitor's pricing; the competitor competes closely at lower resolutions, and the 780 Ti fails to outperform it at 4K and in SLI/CrossFire configurations.

    Now all that would be just business as usual if reviewers were educating readers about AMD's CrossFire and high-resolution wins over even the new and improved 780 Ti.

    Instead they are quiet as a mouse about all of AMD's wins aside from pricing (because, well, that's obvious and hard to ignore, right?), while ripping AMD a new one over the downsides, i.e. heat and noise, which is totally justified and expected.

    Meanwhile, on the Nvidia side of things it's all Christmas and wins, with the review reading as if the 780 Ti wins in every category, failing to mention that its wattage is getting up there, as well as its heat and fan noise; perhaps not up to 290X heights, but well beyond the GTX 780.

    All of these things have been pointed out on other review sites, the good and the bad.
  • FuriousPop - Sunday, November 10, 2013 - link

    Lol, at the end of the day it's all about target marketing... excluding the fanboys, of course. Fanboys: "omg it's better, faster, cooler, oh I gotta have it." Most normal people will weigh the cost of the GPU against its performance, factor in any other little upgrades to their case if need be, and see whether it all still fits the budget, etc.

    Do your research, read lots of reviews, ask questions (if any), then purchase and don't look back. Pretty simple...

    Most of you come here to rage and fire shots at either side (great entertainment, btw). Reminds me of Halo's Red vs. Blue; more like "fanboys: Red vs. Green." Oh hey, why Red first, eh?!
  • SymphonyX7 - Saturday, November 9, 2013 - link

    Why exactly would I buy a GTX 780 Ti when for $100 more I can get TWO Radeon R9 290s in SLI and get twice the performance? The heat issue is there, but it's nothing an aftermarket cooler like the Accelero Xtreme III can't handle.

    AMD wouldn't have flinched at the GTX 780 Ti's launch had it not been for their utterly terrible reference coolers.
  • SymphonyX7 - Saturday, November 9, 2013 - link

    I meant CrossFire, not SLI. But you get the point. Have you seen those CF 290X vs. SLI 780 Ti numbers? That's a ridiculous beatdown.
  • TheinsanegamerN - Saturday, November 9, 2013 - link

    You know, what I see from this is that the 290X in uber mode is just as fast as the 780 Ti in most scenarios, and often a little faster. That should mean the third-party coolers that get slapped on these things will allow the 290X to soundly beat the 780 Ti. Let's get the WindForce 3X versions of both cards when they come out and bench those for a more equal comparison.
  • IUU - Saturday, November 9, 2013 - link

    So, the high-end video cards can shamelessly run all the "high-end" titles at 2560x1440. What are the game developers doing? So much computing power being wasted on rendering our games at nonsensical resolutions? There's still room to improve game visuals; why don't they take advantage of the cards' muscle?
    I may be eccentric, but for some peculiar reason I don't get excited about playing Pac-Man and Super Mario at Ultra HD.
  • Vortac - Saturday, November 9, 2013 - link

    Folding@home double precision benchmark results are somewhat strange to me. How can a 780 Ti card (with FP64=1/24 FP32) beat a 7970 aka 280X (with FP64=1/4 FP32)?
  • abhishek_takin - Sunday, November 10, 2013 - link

    The 780 Ti is a great card in terms of performance, but $700 is too much to ask. As a gamer, max FPS isn't everything; the game should be smooth and fast with Ultra/High details. I have a 7970 GHz Edition CrossFire setup with dual 27-inch Dell monitors, and my PC smokes all the latest games on the market. Ask me how much I paid: $640, plus a bunch of free games. I know CrossFire has its problems, but they aren't big enough that one should opt for a single $700 card instead.

    I am not a fanboy of Nvidia or AMD. If the card were priced under $550, everyone would be saying it's the best card ever made. It's only because of its big price tag that lots of people are voting for the 290X, 290, and 780 (non-Ti), which is very much fair.
  • nsiboro - Sunday, November 10, 2013 - link

    A 780 Ti burnt:
    http://www.chiphell.com/thread-897838-2-2.html

    NV has issued a stop-sale:
    http://www.chiphell.com/thread-897383-11-1.html

    What's happening?

    Can someone confirm this?
  • polaco - Sunday, November 10, 2013 - link

    Who on earth is going to be able to confirm a post on a page written in Chinese?
  • nsiboro - Sunday, November 10, 2013 - link

    You're right.
    How about in English:

    http://linustechtips.com/main/topic/74131-chiphell...
  • nsiboro - Monday, November 11, 2013 - link

    Could be a hoax. The posted image of the burnt PCIe connector doesn't look like a 780 Ti.
  • nsiboro - Monday, November 11, 2013 - link

    It's confirmed to affect Galaxy-branded GTX 780 Ti cards.

    http://translate.google.com/translate?sl=auto&...

    UPDATE: GALAXY has reportedly issued a formal announcement saying that GTX 780 Ti cards sold between 11/7 and 11/10 of 2013 have a quality defect. A total of 55 cards, with serial numbers between 13B0020705 and 13B0020759, are affected; owners can call the official customer service line at 400-700-3933 for a free replacement.
  • Skr13 - Sunday, November 10, 2013 - link

    Please fix the typo on the Company of Heroes 2 page: http://postimg.org/image/u96hy7skf/62e3b91a/
  • DPOverLord - Monday, November 11, 2013 - link

    What about Surround monitor setups? The main draw of the Titan was that it has 6GB of RAM.
  • mohammadm5 - Monday, November 11, 2013 - link

    http://www.aliexpress.com/item/Wholesale-Price-GeF...

    That's the wholesale price; it's not Nvidia that charges so much, it's the resellers. The profit Nvidia makes per GPU is very low, but the resellers make a lot of money. Also, the new AMD R9 290 is going for $255 per unit at wholesale, and the R9 280X is going for $160 per unit. You also have to remember that's the distributor price, not the manufacturer price, which should be a lot lower. I know the GTX 780 at manufacturer pricing sells from $200 to $280 depending on the brand.

    So remember, this is America, where they sell you something made in China for a dollar at ten dollars.
  • UpSpin - Monday, November 11, 2013 - link

    While those numbers are interesting, your conclusion is wrong. The $700 is what Nvidia wants the customer to pay for the card, not what the reseller wants:
    http://nvidianews.nvidia.com/Releases/NVIDIA-Unvei...
    "Pricing is expected to start at $699"

    So it's not the reseller making a 100% profit; it's Nvidia.
  • polaco - Monday, November 11, 2013 - link

    Remember that the reseller has to buy the card; pay import taxes, transportation, and other costs; and then try to sell it, and if the card goes unsold they have to sell it within a few months at discounted prices. Also, in the case of physical stores, they have to cover building costs, employees, more taxes, etc. What you are describing happens in every industry. Also, and maybe first of all, where are you getting those numbers from?
  • Wade_Jensen - Monday, November 11, 2013 - link

    OK, so either Brian has lost his Nexus 5 or it's benchmark boosting, because something has to be going on here.
  • beck2448 - Monday, November 11, 2013 - link

    Wow, the OC results are impressive. Where are the Lightning and WindForce versions?
  • r13j13r13 - Tuesday, November 12, 2013 - link

    The advantage over a 290X is minimal, but the price difference is not, especially once the 290X's cooling improves. At the end of the day, competition benefits us all.
  • jukkie - Tuesday, November 12, 2013 - link

    The HD 7990 can be had for as low as £400 in the UK at the moment (or £480 with a PSU or a motherboard), so anyone wanting the ultimate in single-CARD performance would be better off buying that anyway.

    Obviously noise and heat will still be an issue, but if we're going to ignore the GTX 780 Ti's price, we can ignore those factors too (thankfully microstuttering is mostly a non-issue these days, since frame pacing has resolved it in everything except DX9 for now).

    If the price of the 780 Ti were to drop 10-20%, I'd consider buying one, but as it is I simply can't justify it, even if affording it isn't a problem.
  • arjunp2085 - Wednesday, November 13, 2013 - link

    Anand / Ryan,

    I would love to see a comparison of the 290, 290X, 780, and 780 Ti with water coolers working efficiently. I believe anyone spending $600 on a graphics card could spend an additional $100-200 on a cooling setup.

    Further, I would love to see whether there is any performance increase from the improved cooling (with respect to boost states on both the Nvidia and AMD sides).

    If you have any comments, please let me know.

    Thanks,
    Arjun
  • GUNN3R - Friday, November 29, 2013 - link

    Is there an update to the scores with the latest drivers?
  • Hrel - Sunday, January 12, 2014 - link

    I really wish you guys had included useful resolutions. You realize the VAST majority of people, even enthusiasts, have 1080p screens, right?

    You can do your uber testing for your own personal interest, but it should be standing policy to include 1080p testing for everything. That's the screen resolution people actually have.
  • IUU - Friday, April 18, 2014 - link

    Can someone explain to me the logic of setting the game to 4x anti-aliasing while playing at 2560x1440?
    Because I can't find any.
  • naseruddin - Tuesday, July 29, 2014 - link

    Looks like Anand is more a fan of AMD than of Nvidia. I can understand that, but check the game benchmarks for aftermarket 780 Ti cards: they kill the AMD 290X. And if game benchmarks are all that matters, I don't understand one thing: my GTX 550 Ti 1GB GDDR5 runs Sniper Elite 3, Far Cry 3, AC3, AC4, and Skyrim at ultra settings. The reason is simple and obvious: Nvidia has, and AMD doesn't have, PhysX software with a chip on the card itself. I know many of you might be thinking, "hey, even AMD cards utilize PhysX," but remember, not with the right hardware. AMD is good at CPUs, not GPUs. One more interesting point: 6 out of 10 games show a splash screen saying "Nvidia, the way it's meant to be played"; I have hardly seen any game that says "AMD Radeon, the way it's meant to be played," lol. Something to think about. In conclusion, Nvidia is for the premium segment; the pricing speaks for itself. Why else would you pay a higher price? Because Nvidia is confident in their quality, while the competitors resort to desperate selling, which puts doubts in your brain.
