
  • ponderous - Thursday, February 21, 2013 - link

    Can't give kudos to what is a well-performing card when its price is so
    grossly out of line with its performance. A $1000 card for 35% more performance
    than the $450 GTX 680. A $1000 card that is 20% slower than the $1000 GTX 690.
    And a $1000 card that is 30% slower than a $900 GTX 680 SLI setup.

    Meet the 'Titan' (aka the overpriced GTX 680).

    Well, here we have it: the 'real' GTX 680 with a special name and a 'special'
    price. Nvidia just trolled us with this card. It was not enough for them to
    sell a mid-range card for $500 as the 'GTX 680'; now we have 'Titan' at twice
    the price, with an unremarkable performance level from the obvious genuine
    successor to GF110 (GTX 580).

    At this irrational price, this 'Titanic' amusement park ride is not one worth
    standing in line to buy a ticket for before it inevitably sinks, along with
    its price.
  • wreckeysroll - Thursday, February 21, 2013 - link

    Now there are some good fps numbers for Titan. We expected to see as much, though it's shocking to see it match the 7970 GHz in that test!

    Much too high a retail MSRP for the card. Unclear what Nvidia was thinking; the MSRP is sitting far too high, unfortunately.
  • quantumsills - Thursday, February 21, 2013 - link

    Wow....

    Some respectable performance turnout here. The compute functionality is formidable, albeit its value is questionable in what is a consumer gaming card.

    A g-note, though? Really, Nvidia? At what degree of inebriation was the conclusion drawn that this justifies a thousand-dollar price tag?

    Signed

    Flabbergasted.
  • RussianSensation - Thursday, February 21, 2013 - link

    Compute functionality is nothing special. It still can't mine Bitcoin well, it sucks at OpenCL (http://www.computerbase.de/artikel/grafikkarten/20... ), and if you need double precision, a $500 Asus Matrix Platinum at 1300MHz gives you 1.33 TFLOPS. You really need to know the specific apps you are going to run on this, like Adobe CS6 or very specific CUDA compute programs, to make it worthwhile as a non-gaming card.
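
    As a sanity check on that 1.33 TFLOPS figure, here is a quick back-of-envelope sketch (assuming the HD 7970's published 2048 stream processors and Tahiti's 1/4-rate double precision):

        # Theoretical double-precision throughput of a Tahiti HD 7970 at 1300MHz.
        # Assumes 2048 stream processors and 1/4-rate FP64 (both public specs).
        shaders = 2048
        clock_hz = 1.3e9
        flops_per_clock = 2              # one fused multiply-add counts as 2 FLOPs
        fp32_tflops = shaders * flops_per_clock * clock_hz / 1e12
        fp64_tflops = fp32_tflops / 4    # Tahiti runs FP64 at 1/4 the FP32 rate
        print(f"FP32: {fp32_tflops:.2f} TFLOPS, FP64: {fp64_tflops:.2f} TFLOPS")
        # -> FP32: 5.32 TFLOPS, FP64: 1.33 TFLOPS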
  • JarredWalton - Thursday, February 21, 2013 - link

    Really? People are going to trot out Bitcoin still? I realize AMD does well there, but if you're serious about BTC you'd be looking at FPGAs or trying your luck at getting one of the ASICs. I hear those are supposed to be shipping in quantity some time soon, at which point I suspect prices of BTC will plummet as the early ASIC owners cash out to pay for more hardware.
  • RussianSensation - Thursday, February 28, 2013 - link

    It's not about Bitcoin mining alone. What specific compute programs outside of scientific research does the Titan excel at? It fails at OpenCL; what about ray-tracing in Luxmark? Let's compare its performance in the many double-precision distributed computing projects (MilkyWay@Home, Collatz Conjecture), run it through DirectCompute benches, etc.
    http://www.computerbase.de/artikel/grafikkarten/20...

    So far this review covers the Titan's performance only in specific scientific work done by universities. But those types of researchers get grants to buy full-fledged Tesla cards. The compute analysis in the review is too brief to conclude that it's the best compute card. Even in the Elcomsoft password hashing, AMD cards perform faster too, but they weren't tested. My point is that it's not true to say this card is unmatched in compute; it's only true in specific apps. And leaving full double-precision compute enabled doesn't justify its price tag either, since AMD cards have had non-gimped DP for 5+ years now.
  • maxcellerate - Thursday, March 28, 2013 - link

    I tend to agree with RussianSensation, though the fact is that the first batch of Titans has sold out. But to whom? There will be the odd mad gamer who must have the latest, most expensive card in their rig, regardless. But I suspect the majority of sales have gone to CG renderers, where CUDA still rules and $1000 for this card is a bargain compared to what they would have paid for it as a Quadro. Once sales to that market have dried up, the price will drop.
    Then I can have one ;)
  • ponderous - Thursday, February 21, 2013 - link

    True. Very disappointing card. Not enough performance for the exorbitant cost.

    Nvidia fumbled on the cost here. It will be interesting to watch where the price drops that are sure to come in the next few months end up placing the actual value of this card.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    LOL - now compute doesn't matter - thank you RS for the 180 degree flip flop, right on schedule...
  • RussianSensation - Thursday, February 28, 2013 - link

    I never said compute doesn't matter. I said the Titan's "out of this world compute performance" needs to be better substantiated. Compute covers a lot of apps: Bitcoin, OpenCL, distributed computing projects. None of these are mentioned.
  • Tetracycloide - Thursday, February 21, 2013 - link

    That's the thing, it's not a 'consumer gaming card.' It's a consumer compute card. Obviously the price for performance for gaming makes no sense but that's not their target market.
  • ronin22 - Thursday, February 21, 2013 - link

    This exactly!

    It's an amazing card for computing.
    I wish I could get one...
  • Blazorthon - Thursday, February 21, 2013 - link

    In reply to both of your comments, I have to ask this: If that is justification for its price, then why is it that AMD doesn't have their Tahiti cards priced like that and why didn't Nvidia price their previous consumer compute cards like that (GTX 280, GTX 285, GTX 480, GTX 580, etc.)?
  • CommandoCATS - Friday, February 22, 2013 - link

    Because this seems like a specialized thing for people who care about compute tasks within NVidia's CUDA universe (and things like iRay, which didn't exist when previous generations first came out).

    The truth is that in academia and research, CUDA is still top dog (just do a Google Scholar search). I'm sure for most gamers the GTX 680 is the way better deal. However, this is essentially a Tesla K20 for 1/3 of the cost, so it's kind of a bargain from that perspective.
  • cheersloss - Saturday, February 23, 2013 - link

    Exactly right. There is nothing about this card that is a value. The same compute functions were there in the older flagships as well, the gtx 580, 480, 280 etc.

    Titan is just an overpriced, overhyped trainwreck. Another attempt at a cashgrab on the gullible.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    The gullible have the several thousand extra they can spend that you don't have and cannot spend.

    Certainly poorboy feels better after having called his superiors gullible. The jelly is seeping through at an extraordinary rate.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    LOL

    Good you run the 280, and I'll run the Titan, and we can be equal and best friends, and I'll tell you over and over all the benchmarks and games and fps scores and compute tests are lying and your 280 is just as good and the same and you're right and I wish so badly that I was as poor as you and just bought a used 3 gen back nVidia card, but they fooled me, the gullible gamer.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    ROFL - aww, poor baby, now tessellation and compute are a total loss for amd too, as you conveniently forgot to include the 680, 670, 660Ti, 660, 650Ti, 650. hahahahha you people suck.

    So I can buy 2 amd cards that crash and don't run CF at all 33% of the time, or I can buy the most awesome top card in the entire world of gaming and play all the titles and be just great, or I can buy 2 nVidia cards and SLI them and have every game except 1 run correctly, while amd CF fails often...

    I can buy the most stable, far more fully featured nVidia card, or I can buy the crashing piece of rotten, unsupported, glitching amd crap from a dying company with no catalyst driver writers worth their salt left (fired for savings, gone to better waters, or headhunted).

    $999 looks like a bargain basement price to me. I can hardly wait to own The Shield, too.
    Innovation. Awesomeness. New features. Unified drivers THAT JUST WORK.
    Features ported BACKWARDS to prior generations.
    Cuda
    PhysX
    Frame Rate Target
    Boost
    Stable dual card setups
    Same game day drivers
    Honest company not cheating liars like amd

    I BUILT THIS - co-founder Jen-Hsun Huang, current CEO, a perfect example of the awesomeness of capitalism and personal success and the American Dream in REAL LIFE. lol
    ( Oh I bet those little OWS activist amd fanboys we have here are shooting blood through every pore)

    Why in the world would I buy an amd card ? There's only one reason - to try to save a dying loser in a kind act of CHARITY - and frankly, what we have for amd fanboys is nothing short of the most evil little penny pinching crying whining baby SCROOGES I have ever seen.

    So we can FORGET IT when it comes to the amd fanboy rabble here supporting AMD - they support only their own selfish fanboy agenda and psychotic public pocketbook panhandling.

    I'd like to thank TheJian for pointing out amd fail coverage, vs the ignoring of the nVidia FINANCIAL SUCCESS STORY:

    amd Q earnings coverage
    "
    http://www.anandtech.com/show/5465/amd-q411-fy-201...
    http://www.anandtech.com/show/5764/amd-q112-earnin...
    http://www.anandtech.com/show/6383/amd-q3-2012-ear...
    http://www.anandtech.com/show/6690/amd-q412-and-fy...
    "
    nVidia Q earnings coverage
    http://www.anandtech.com/show/6746/tegra-4-shipmen...

    LOL - let it burn you crybabies to the CORE, I hope blood shoots from your eyes...
  • xaml - Sunday, March 3, 2013 - link

    Don't leave out the biggest "crybaby" of all, yourself.
  • CeriseCogburn - Monday, March 4, 2013 - link

    That's all you've got?

    Did you at least look at the links, or have a failed brainfart of an idea for a rebuttal?

    No, of course you did not. Another mindless, insulting fool, without a single anything other than, of course, an insult.
    I would feel better about agreeing with you if you had any facts, or even an opinion on anything else.
  • swing848 - Friday, May 17, 2013 - link

    I believe the review was for gaming machines. In that regard, HD 7970s and GTX 680s trade blows: one card is faster in some games and the other is faster in others.

    So, gamers should pick a card that will perform the best for the games they play.

    Microsoft FSX is very old now, yet a person needs a very powerful gaming computer to run it with all of the goodies turned up [with lots of code fixes], including overclocking an Ivy Bridge to 4.5GHz+ [this is because when the game was developed it was believed that Moore's Law would hold and single-core CPUs would be running at 10GHz by 2012]. And, yes, FSX was coded for NVIDIA.
  • coilpower - Thursday, February 21, 2013 - link

    Well, that is where nvidia's attempted marketing falls apart. Compute in their flagship GPU brought to the GeForce lineup is nothing new. Now they think they can charge $1000 for it, doubling the price.

    Techpowerup was apt when they said this is the most overpriced video card in 25 years.

    Nvidia has dropped the ball here on the price; heck, they have thrown it down the street. Too bad; now they will lose more face with the price drops needed to get these off shelves.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    They're sold out, already.

    So much for you being correct, you're already an incorrect failure.

    Nice try amd fansvengaliboy
  • Alucard291 - Sunday, February 24, 2013 - link

    All 100 of them? :)
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    You forgot the k fool.
  • Alucard291 - Friday, March 8, 2013 - link

    Proof? :)

    In any case, please stop shitposting. This is not 4chan or engadget.
  • CeriseCogburn - Tuesday, March 12, 2013 - link

    They're still sold out, ROFL.
  • CeriseCogburn - Thursday, March 21, 2013 - link

    From Anand, Brian, etc. in the current article: " 12:16PM EDT - GK110 in general seems to be supply constrained right now. NVIDIA has previously told us they're selling every Tesla K20 and Titan card they can make"

    "Thank you Cerise, I'm sorry, I Alutard291 was wrong, and you are right. I challenged you and lost miserably. In the future I will shut my lying piehole and learn from you, Cerise, instead."

    LOL - No you won't Alutard, you will never be correct.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    $4500 from appreciative clients, or a grand from whining, disgruntled crybabies?

    Hmmm... what should a company do...?

    http://www.excaliberpc.com/622885/nvidia-tesla-k20...

    I think they should take the extra $3500, and let the crybabies squeal and wail and fill their diapers.
  • cliffnotes - Thursday, February 21, 2013 - link

    The price is a disgrace. Can we really be surprised, though? We saw the 680 release and knew then that they were selling their mid-range card as a flagship with a flagship price.

    We knew then the real flagship was going to come at some point. I admit I assumed they would replace the 680 with it and charge maybe $600 or $700. Can't believe they're trying to pawn it off for $1000. It looks like nvidia has decided to try to reshape what the past flagship performance level is worth. The 8800 GTX, 280, 285, 480, and 580 were all $500-600; we all know the GTX 680 is not a proper flagship and was their mid-range. Here is the real one and..... $1000.

    Outrageous.
  • ogreslayer - Thursday, February 21, 2013 - link

    The problem here is that this gen none of the reviewers chewed out AMD for the 7970. This led Nvidia to think it was totally fine to release GK104 for $500, which was still cheaper than a 7970 but not where that die was originally slotted, and to commit this utter insanity of a $1000 solution that is more expensive than solutions that are faster than it.

    7950 3-way Crossfire, GTX 690, GTX 660 Ti 3-way SLI, GTX 670 SLI and GTX 680 SLI are all better options for anyone who isn't spending $3000 on cards, as even with a dual-card setup you are better off with GTX 690s in SLI. Poor form, Nvidia, poor form. But poor form, too, to every reviewer who gives this an award of any kind. It's time to start taking pricing and availability into the equation.

    I think I'd have much less of an issue if partners had access to GK110 dies binned for slightly lower clocks and limited to 3GB at $750-800. I'd wager you'd hit close to the same performance window at a more reasonable price that people wouldn't have scoffed at. GTX 670 SLI is about $720...
  • HisDivineOrder - Thursday, February 21, 2013 - link

    Pretty much agree. GPU reviewers of late have been so forgiving toward nVidia and AMD for all kinds of crap. They don't seem to have the cojones to put their foot down and say, "This far, no farther!"

    They just keep bowing their head and saying, "Can I have s'more, please?" Pricing is way out of hand, but the reviewers here and elsewhere just seem to be living in a fantasy world where these prices make even an iota of sense.

    That said, the Titan is a halo card and I don't think 99% of people out there are even supposed to be considering it.

    This is for that guy you read about on the forum thread who says he's having problems with quad-SLI working properly. This is for him, to help him spend $1k more on GPUs than he already would have.

    So then we can have a thread with him complaining about how he's not getting optimal performance from his $3k in GPUs. And how, "C'mon, nVidia! I spent $3k on your GPUs! Make me a custom driver!"

    Which, if I'd spent $3k on GPUs, I'd probably want my very own custom driver, too.
  • ronin22 - Thursday, February 21, 2013 - link

    For $3k you can pay a good developer (all costs included) for about 5 days to build your custom driver.

    Good luck with that :D
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    I can verify that programmer pricing personally.

    Here is why we have crap amd crashing and driver problems only worsening still.
    33% CF failure, right frikkin now.
    Driver teams decimated by a losing financial reality.

    "Investing" in an amd card, as our many local amd fanboy retard destroyers like to proclaim, is one sorry bet on the future.
    It's not an investment.

    If it weren't for the constant crybaby whining about price in a laser focused insane fps only dream world of dollar pinching beyond the greatest female coupon clipper in the world's OBSESSION level stupidity, I could stomach an amd fanboy buying Radeons at full price and not WHINING in an actual show of support for the failing company they CLAIM must be present for "competition" to continue.

    Instead, our little hoi polloi amd ragers rip away at amd's failed bottom line, having just shortly before screamed that nVidia would be crushed out of existence by amd's easy-to-do price reductions... it went on and on and on for YEARS as they were presented with the REAL FACTS and ignored them entirely.
    Yes, they are INSANE.
    Perhaps now they have learned to keep their stupid pieholes shut in this area, as their meme has been SILENCED for its utter incorrectness.
    Thank God for small favors YEARS LATE.

    Keep crying crybabies, it's all you do now, as you completely ignore amd's utter FAILURE in the driver department and are STUPID ENOUGH to unconsciously accept "the policy" about dual card usage here, WHEN THE REALITY IS NVIDIA'S CARDS ALWAYS WORK AND AMD'S FAIL 33% OF THE TIME.

    So recommending CROSSFIRE cannot occur, and so the near-perfect SLI is thrown out with the biased bathwater.

    ANOTHER gigantic, insane, lie filled BIAS.

    Congratulations amd fanboys, no one could possibly be more ignorant nor dirtier. That's what lying and failure is all about, it's all about amd and their little CLONES.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Prices have been going up around the world for a few years now.

    Of course mommy's basement has apparently not been affected by the news.
  • trajan2448 - Thursday, February 21, 2013 - link

    Awesome card! Best single GPU on the planet at the moment. Almost 50% better in frame latencies than the 7970. Crossfire? Don't make me laugh; here's an analysis. Many of the frames "rendered" by the 7970, and especially by Crossfire, aren't visible.
    http://www.pcper.com/reviews/G...
  • CeriseCogburn - Thursday, February 21, 2013 - link

    So amd has been lying, and the fraps boys have been jiving for years now....
    It's coming out - the BIG LIE of the AMD top end cards... LOL
    Fraudster amd and their idiot fanboys are just about finished.

    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...

    LOL- shame on all the dummy reviewers
  • Alucard291 - Sunday, February 24, 2013 - link

    What you typed here sounds like sarcasm.

    And you're actually serious aren't you?

    That's really cute. But can you please take your comments to 4chan/engadget where they belong.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Ok troll, you go to wherever the clueless reign. You will fit right in.

    Those aren't suppositions I made, they are facts.
  • Alucard291 - Friday, March 8, 2013 - link

    And once again you spew your b/s out of every orifice.

    But you still haven't said why you think your walls of nonsense make any difference :)

    To 4chan with ya
  • CeriseCogburn - Sunday, February 24, 2013 - link

    You people literally are pathetic. Right now, the cheapest GTX 670 SLI is $720, right?

    Get your SLI motherboard, get ready for extra heat, and get a better PSU (boutiques have already stated they are launching these with 450W PSUs).

    So how many months after the 670's launch, with its reduced prices, are you still only 25% off the single fastest video card in the world, even taking the cheapest version you can find?

    You people are seriously filling your diapers at an unbelievable rate.

    I'll note once again, for all you fools who continuously missed it and still do, because of course your gigantic flapping lips wrapped around the gourd so many times they sealed off oxygen flow to the brain, that you not only don't want to face reality, but choose not to on purpose:

    There was a manufacturing shortage of die space in Jan 2012 when the 79xx had a near paper launch. Availability for that card was short until a day before the comparatively small 680 die hit the shelves far over half a year later, and the SINGLE factory in the entire world for production was busily building out well over 2 BILLION dollars in emergency production space, desperately trying to keep up with bare minimum demand.

    THERE WAS NO CAPACITY to produce a 7.1B transistor chip. The design of such a chip follows a very slow, SEVERAL YEAR slog, and even now the yield on the most complex chip ever is no doubt too low for comfort, and far too low for it to have been "launched" WHEN YOU IDIOT TIN FOIL HAT WEARING CHARLIE D BUTT KISSING MIND SLAVE FOOLS claim the conspiracy against all gamers was undertaken by "the greedy nVidia".

    You people suck.
  • ronin22 - Thursday, February 21, 2013 - link

    Come on, stupid...
    If you were expecting a gaming card, go buy your AMD whatever.

    The real magic of Titan is its compute power.
    You were stupid to expect anything else from a GK110.
  • CeriseCogburn - Saturday, February 23, 2013 - link


    Good to know amd absolutely failed to produce a top-end videocard and has stuck all you tards with their only release, a mid-range, at $579+ in comparison.

    Be a mid-ranger: buy amd's flagship, the half-mast mid-range loser card.
  • piiman - Saturday, February 23, 2013 - link

    Come on, it's priced, for now, for the geeks who have to have the biggest, most badass card out. It will come down once those suckers... oops, I mean enthusiasts... are sucked dry. :-)
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Since amd is a sucked-dry, shriveling corpse (excellent fanboy mistreatment by the tightwad, whining, poorboy amd penny-pinching freaks), your theory may not pan out, if we give a single deformed brain cell of credit to the amd fanboys when they wail that without amd everything will be a thousand bucks.

    AMD is dying, and when it's gone, a thousand bucks will be standard, right, all you amd fanboys?

    Start getting used to it.

    L O L
  • atlr - Thursday, February 21, 2013 - link

    Agreed. I was hoping for an initial price somewhat proportional to the performance, like US$700. Perhaps eBay will be flooded with enough 680s and 690s from the early 'ticket' buyers to push retail prices of the same down.
  • wongwarren - Thursday, February 21, 2013 - link

    Guess you guys didn't read the article properly:

    With a price of $999 Titan is decided out of the price/performance race; Titan will be a luxury product, geared towards a mix of low-end compute customers and ultra-enthusiasts who can justify buying a luxury product to get their hands on a GK110 video card.
  • Alucard291 - Thursday, February 21, 2013 - link

    Wait wait wait, a GPU is now a luxury product?

    There was me thinking that all pc components have long since become commodities...
  • JeffFlanagan - Thursday, February 21, 2013 - link

    I agree that talk of a luxury GPU seems odd. Is there any game that will actually look better with this card rather than a $400 card?

    It may allow the user to up the resolution, but is anyone even shipping textures with detail beyond 1080p these days?

    I haven't bought a video card in several years, and can still select Ultra settings on new games at 1080p.
  • PEJUman - Thursday, February 21, 2013 - link

    Made me wonder:
    7970 - 4.3B trans. - $500 - OK compute, 100% gaming perf.
    680 - 3.5B trans. - $500 - sucky compute, 100% gaming perf.
    Titan - 7.1B trans. - $1000 - OK compute, ~140% gaming perf.

    1. Does compute capability really take that many more transistors to build? As in, 2x the transistors only yields ~140% of the gaming performance.
    I think this was a conscious decision by nVidia to focus on compute and the profit margin required to sustain R&D.

    2. Despite the die-size shrink, I'm guessing it is harder to get functional silicon as the process shrinks; i.e., finding 100mm^2 of functional silicon @ 40nm is easier than @ 28nm, since more transistors are packed into the same area. Which I think is why they designed 15 SMXs.
    Thus it'd be more expensive for nVidia to build the same area at 28nm vs. 40nm, at least until the process matures, and at 7B transistors I doubt that will ever be attainable.

    3. The AMD statement that there will be no updates to the 7970 essentially sealed the $1000 price for Titan. I would bet that if AMD had announced an 8970, Titan would be priced at $700 today, with 3GB of memory.
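
    PEJUman's first point can be eyeballed with a rough gaming-performance-per-transistor figure (a sketch using only the numbers listed above; the perf indices are the commenter's rough estimates, not measured data):

        # Gaming performance per billion transistors, from the figures quoted above.
        cards = {
            "HD 7970": (4.3, 100),   # (transistors in billions, relative gaming perf)
            "GTX 680": (3.5, 100),
            "Titan":   (7.1, 140),
        }
        for name, (trans_b, perf) in cards.items():
            print(f"{name}: {perf / trans_b:.1f} perf points per billion transistors")
        # -> 7970: 23.3, 680: 28.6, Titan: 19.7; GK110's extra transistors
        #    buy compute hardware (FP64 units, registers, cache), not gaming throughput.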
  • JarredWalton - Thursday, February 21, 2013 - link

    Luxury GPU is no more silly than Extreme CPUs that cost $1000 each. And yet, Intel continues to sell those, and what's more the performance offered by Titan is a far better deal than the performance offered by a $1000 CPU vs. a $500 CPU. Then there's the Tesla argument: it's a $3500 card for the K20 and this is less than a third that price, with the only drawbacks being no ECC and no scalability beyond three cards. For the Quadro crowd, this might be a bargain at $1000 (though I suspect Titan won't get the enhanced Quadro drivers, so it's mostly a compute Tesla alternative).
  • chizow - Friday, February 22, 2013 - link

    The problem with this analogy, which I'm sure was floated around Nvidia's Marketing board room in formulating the plan for Titan, is that Intel offers viable alternative SKUs based on the same ASIC. Sure there are the few who will buy the Intel EE CPU (3970K) for $1K, but the overwhelming majority in that high-end market would rather opt for the $500 option (3930K) or $300 option (3820).

    Extend this to the GPU market and you see Nvidia clearly withheld GK100/GK110 as the flagship part for over a year, and instead of offering a viable SKU for traditional high-end market segments based on this ASIC, they created a NEW ultra-premium market. That's the ONLY reason Titan looks better compared to GK104 than Intel's $1K and $500 options, because Nvidia's offerings are truly different classes while Intel's differences are minor binning and multiplier locked parts with a bigger black box.
  • mlambert890 - Saturday, February 23, 2013 - link

    The analogy is fine, you're just choosing to not see it.

    Everything you said about Intel EE vs standard directly applies here.

    You are assuming that the Intel EE parts are nothing more than a marketing ploy, which is wrong, while at the same time assuming that the Titan is orders of magnitude beyond the 680 which is also wrong.

    You're seeing it from the point of view of someone who buys the cheapest Intel CPU, overclocks it to the point of melting, and then feels they have a solution "just as good if not better" than the Intel EE.

    Because the Titan has unlocked stream procs that the 680 lacks, and there is no way to "overclock" your way around missing SPs, you feel that NVidia has committed some great sin.

    The reality is that the EE procs give out of box performance that is superior to out of box performance of the lesser SKUs by a small, but appreciable, margin. In addition, they are unlocked, and come from a better bin, which means they will overclock *even better* than the lesser SKUs. Budget buyers never want to admit this, but it is reality in most cases. Yes you can get a "lucky part" from the lesser SKU that achieves a 100% overclock, but this is an anomaly. Most who criticize the EE SKUs have never even come close to owning one.

    Similarly, the Titan offers a small, but appreciable, margin of performance over the 680. It allows you to wait longer before going SLI. The only difference is you don't get the "roll of the dice" shot at a 680 that *might* be able to appear to match a Titan since the SP's arent there.

    The analogy is fine, it's just that biased perspective prevents some from seeing it.
  • chizow - Saturday, February 23, 2013 - link

    Well you obviously have trouble comprehending analogies if you think 3.6B difference in transistors and ~40% difference in performance is analogous to 3MB L3 cache, an unlocked multiplier and 5% difference in performance.

    But I guess that's the only way you could draw such an asinine parallel as this:

    "Similarly, the Titan offers a small, but appreciable, margin of performance over the 680."

    It's the only way your ridiculous analogy to Intel's EE could possibly hold true, when in reality, it couldn't be further from the truth. Titan holds a huge advantage over GTX 680, but that's expected, its a completely different class of GPU whereas the 3930K and 3960X are cut from the exact same wafer.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    There was no manufacturing capacity you IDIOT LIAR.
    The 680 came out 6 months late, and amd BARELY had 79xx's on the shelves till a day before that.

    Articles were everywhere pointing out nVidia did not have reserve die space as the crunch was extreme, and the ONLY factory was in the process of doing a multi-billion dollar build out to try to keep up with bare minimum demand.

    Now we've got a giant GPU core with perhaps 100 attempted dies per wafer and a yield that isn't high, YET YOU'RE A LIAR NONETHELESS.
  • chizow - Sunday, February 24, 2013 - link

    It has nothing to do with manufacturing capacity; it had everything to do with the 7970's lackluster performance and high price tag.

    GTX 680 was only late (by 3, not 6 months) because Nvidia was too busy re-formulating their high-end strategy after seeing 7970 outperform GTX 580 by only 15-20% but asking 10% higher price. Horrible price:performance metric for a new generation GPU on a new process node.

    This gave Nvidia the opportunity to:

    1) Position mid-range ASIC GK104 as flagship GTX 680 and still beat the 7970.
    2) Push back and most importantly, re-spin GK100 and refine it to be GK110.
    3) Screw their long-time customers and AMD/AMD fans in the process.
    4) Profit.

    So instead of launching and mass-producing their flagship ASIC first (GK100) as they've done in every single previous generation and product launch, they shifted their production allocation at TSMC to their mid-range ASIC, GK104 instead.

    Once GK110 was ready, they've had no problem churning them out; even the mfg dates on these TITANs prove this point, as week-31 chips fall somewhere in the July-August time frame. They were able to deliver some 19,000 K20X units to ORNL for the real TITAN in October 2012. Coupled with the fact that they're using ASICs with the same number of functional units for GTX Titanic, it goes to show yields are pretty good.

    But the real conclusion to be drawn from this is that other SKUs based on GK110 are coming. There's no way GK110 wafer yields are anywhere close to 100% for 15-SMX ASICs. I fully expect a reduced-SMX part, maybe 13 SMXs with 2304 SPs as originally rumored, to show its face as the GTX 780, with a bunch of GK114 refreshes behind it to fill out the line-up.

    The sooner people stop overpaying for TITAN, the sooner we'll see the GTX 700 series, imo, but with no new AMD GPUs on the horizon we may be waiting awhile.
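
    For what it's worth, the week-31 date math above is easy to check (a small sketch; assumes the chip marking is an ISO week number):

        from datetime import date

        # Monday of week 31 of 2012, assuming the marking uses ISO week numbering.
        print(date.fromisocalendar(2012, 31, 1))   # -> 2012-07-30 (Python 3.8+)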
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Chizow I didn't read your stupid long post except for your stupid 1st line.

    you're a brainwashed lying sack of idiocy, so maybe i'll waste my time reading your idiotic lies, and maybe not, since your first line is the big fat frikkin LIE you HAVE TO BELIEVE that you made up in your frikkin head, in order to take your absolutely FALSE STANCE for the past frikkin nearly year now.
  • chizow - Monday, February 25, 2013 - link

    You should read it, you might learn something.

    Until then stfd, stfu, and gfy.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Dear Jeff, a GPU that costs $400 is a luxury GPU.

    I'm not certain you disagree with that; I'd just like to point out that the brainless idiots pretending $1000 for a GPU is luxury and $250 is not are clueless.
  • piiman - Saturday, February 23, 2013 - link

    Yes, a $1000.00 GPU is a luxury. Don't want luxury? Use the onboard GPU and have a blast! :-)
  • CeriseCogburn - Sunday, February 24, 2013 - link

    ANY GPU over about $100 to $150 is a LUXURY PRODUCT.

    Of course, we can get you a brand spankin' new GPU for $20 after rebate, DX11-capable with a gig of RAM, GTX 600 series, ready to rock out, that will even fit in your HTPC.

    So stop playing so stupid and so ridiculous. Why is stupidity so cool and so popular with you puke brains?

    A hundred bucks can get one a very reasonable GPU that will play everything now available with a quite tolerable eye-candy pain level, so the point is, dummy: THERE ARE THOUSANDS OF NON-LUXURY GPUs, JUST AS THERE ARE ALWAYS A FEW DOZEN LUXURY GPUs.

    So your faux-aghast, smarmy idiot comment of "I thought GPUs were commodities" fits right in with the retard liar shortbus stuffed to the brim with the clowns we have here.

    You're welcome, I'm certain that helped.
  • Ankarah - Thursday, February 21, 2013 - link

    Then could you explain why any of those ultra-enthusiasts would choose this card over the GTX 690, which seems to be faster overall?

    And let's leave the power consumption between the two out of this discussion - if you can drop a grand on your graphics card for your PC, then you can afford a big power supply too.
  • sherlockwing - Thursday, February 21, 2013 - link

    You haven't seen the SLI benches yet.

    This card in SLI will perform better than GTX 690s in SLI due to the bad scaling of quad-SLI.
  • sherlockwing - Thursday, February 21, 2013 - link

    Correction: it will be better than GTX 690 SLI if you overclock the Titan to 1GHz; the 690 doesn't really have that much OC headroom.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Did you say overclock?

    " In our testing, NVIDIA’s GK110 core had no issue hitting a Boost Clock of 1162MHz but hit right into the Power Limit, despite it being set at 106%. Memory was also ready to overclock and hit a speed of 6676MHz. As you can see in the quick benchmarks below, this led to a performance increase of about 15%. "

    So, that's the 27MHz MAX the review here claims... LOL

    Yep, a 15% performance increase, or a lousy 27MHz. You decide... amd fanboy fruiter, or real-world overclocker...

    http://www.hardwarecanucks.com/forum/hardware-canu...
  • Alucard291 - Thursday, February 21, 2013 - link

    Well, as Ryan said in the article, it's "removed from the price curve", which in human language means: it's priced badly and is hoping to gain sales from publicity as opposed to quality.
  • Oxford Guy - Thursday, February 21, 2013 - link

    Hence the "luxury product" meme.
  • trajan2448 - Thursday, February 21, 2013 - link

    Learn about latencies and micro-stuttering, driver issues, heat and noise. Almost 50% better in frame latencies than the 7970. Crossfire? Don't make me laugh; here's an analysis.
    http://www.pcper.com/reviews/G...
  • CeriseCogburn - Thursday, February 21, 2013 - link

    correcting that link for you again, that shows the PATHETIC amd crap videocard loser for what it is
    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
  • veppers - Saturday, February 23, 2013 - link

    Grow up man.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Adults like myself face reality. Lying fanboys act like spoiled brats and cannot stand to hear or see the truth.
    You're a child.
  • Alucard291 - Sunday, February 24, 2013 - link

    Yup, and you sound just like the spoilt brat in question. This is not Engadget, mate. Go away.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    So far you've posted 3 attacks against me, and added exactly NOTHING to any discussion here.

    It's clear you're the whining troll with nothing to say, so you are the one who needs to go away, right? Right.
  • chizow - Monday, February 25, 2013 - link

    Oh the irony, you are crying about posting personal attacks and adding nothing to any discussion here? That's what every single one of your posts boils down to.
  • Alucard291 - Friday, March 8, 2013 - link

    The discussion? You spew random offensive insulting nonsense against anyone who dares to point out that slower + more expensive is worse than faster and cheaper (be it amd or nv).

    You then proceed to attack people (on a very personal level I might add) for whatever other reason and go on to say that AMD (did anyone except you even mention amd? - well I'm sure some did but mostly due to your constant stream of butthurt) is terrible.

    Cool don't use them. Calm down, relax, take a breather go for a walk.

    Or of course you can continue whiteknighting some random product that you are unlikely (given the odds) to ever buy for yourself. Who cares. Just get off the neophyte train when you do it. Ok?
  • CeriseCogburn - Tuesday, March 12, 2013 - link

    You have no clue on any odds.
    Like I said, you people are 3rd worlder crybabies.
    Between bragging that hardcore upper-end users frequent Anandtech, you whine and cry about 1/3 the price of a decent PC.
    You're all full of it, and you all act like your budget is personal malaria in sub-Saharan Africa, except of course when you're rig bragging.
    This is the USA, except of course wherever you penniless paupers reside.
  • RussianSensation - Thursday, February 21, 2013 - link

    Yes, yes. Keep eating NV's marketing. 36-38% faster than a $430 HD7970GE for $1000???!!

    http://www.computerbase.de/artikel/grafikkarten/20...

    Heck, you can buy a $500 Asus Matrix Platinum 7970, and those overclock to 1300MHz, which makes the Matrix 30% faster than the GTX 680. Do the math and see where that ends up relative to the Titan.

    http://www.hardwarecanucks.com/forum/hardware-canu...

    This is really a $699 card max.
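
    Doing that math with the numbers given in this thread (a rough sketch; assumes a stock 7970 GHz Edition roughly ties a GTX 680 overall, and takes the quoted percentages at face value):

        # All numbers relative to a stock GTX 680 = 1.0.
        gtx_680   = 1.00
        hd7970_ge = 1.00                  # assumption: roughly even with the 680
        titan     = hd7970_ge * 1.37      # "36-38% faster than a HD7970GE" (midpoint)
        matrix_oc = gtx_680 * 1.30        # "the Matrix 30% faster than the GTX680"

        gap = titan / matrix_oc - 1
        print(f"Titan leads the $500 Matrix by ~{gap:.0%} at 2x the price")
        # -> ~5%, per these rough, game-dependent figures.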
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Why buy a crashing piece of crap amd cannot even write drivers for?
    Forget it.
    AMD is almost gone too, so "future proof" is nowhere except in nVidia's pocket.
    Now and in the future, nVidia wins, period.
    Idiots buy amd.
  • Hrel - Thursday, February 21, 2013 - link

    The problem with the reasoning they're raising here is that the 7970 is almost as fast and costs a lot less. The Titan is competing, based on performance, with the 7970. Based on that comparison, it's a shitty deal.

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    $430. So based on that I'd say the highest price you can justify for this card is $560. We'll round up to $600.

    Nvidia shipping this, at this price, and just saying "it's a luxury product" is bullshit. It's not a luxury product; it's their version of a 7970 GHE. But they want to try to get a ridiculous profit to support their PhysX and CUDA projects.

    Nvidia just lost me as a customer. This is the last straw. This card should be pushing the pricing down on the rest of their lineup. They SHOULD be introducing it to compete with the 7970 GHE. Even at my price range, compare the GTX 660 to the 7870 GHE, or better yet the sub-$200 7850. They just aren't competitive anymore. I'll admit I was a bit of an Nvidia fanboy. Loved their products. Was disappointed by older ATI cards and the issues I had with them (stability, screen fitting properly, audio issues). But ATI has become AMD, and they've improved quality a lot, and Nvidia is USING their customers' loyalty; that's just wrong.

    I'm done with Nvidia on the desktop. By the time I need a new laptop AMD will probably have the graphics switching all sorted; so I'm probably done with Nvidia on laptops too.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    LMHO - yes by the time the amd fanboy actually declares his amd fanboyism, amd won't be around anymore....

    Yes, you're done.
  • Alucard291 - Friday, March 8, 2013 - link

    And naturally you're not a fanboy... Dear god... you cannot be that stupid...
  • MrSpadge - Thursday, February 21, 2013 - link

    Don't buy it, period.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Another amd fanboy control freak loser.
    I won't be taking your "advice".
  • Alucard291 - Saturday, February 23, 2013 - link

    Yup you really DO need to grow up a little. Or a lot. Your choice.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Nope, you crybabies and poorboy whiners are the sad little tropes that need adulthood desperately.

    Adults earn money and have a grand to spend.

    Crybaby children do not.
  • Alucard291 - Sunday, February 24, 2013 - link

    Looking at the way you're expressing your impotent rage all over this review's comment section, you sound roughly old enough to be my son's classmate :)
  • DemBones79 - Thursday, February 21, 2013 - link

    Wow... did no one read the first part of this article? I think it's pretty obvious from the price/performance ratio that NVIDIA is trying to scare away all but the most truly insane enthusiasts and the compute-on-a-budget crowd.

    My guess is that yields are still abysmally low and they're still reeling from the backlog of Tesla orders that resulted from the Titan supercomputer contract win. Given that, they probably do not have sufficient supply yet to meet "enthusiast" demand, so they priced it more into the "you've got to be insane to pay this much for this little" bracket.

    Whereas computer scientists and others who could benefit from the compute tech on the card could probably more readily convince their Finance depts. to loosen the purse strings for this as opposed to a Tesla proper.

    Don't be fooled. This may be labeled as a "consumer" card, and it certainly is a performance bump over the 680, but it was not brought into this world for the express purpose of playing games.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    You people are all lying freaks.
    A day before this, many of you screamed buy 2x of the top end, and when amd was charging $599 for one, you were all good with that.

    Now like stupid crybaby two year olds, you've all copped the same whine.
    You're all pathetic. All liars, too. All sheep that cannot be consistent at all.
  • Alucard291 - Saturday, February 23, 2013 - link

    Stop being so rude and abusive.

    Take a break. Stand up go outside, take some deep breaths. Stay away from the internet for a bit.

    Might do you some good.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Stop whining along with the rest of them, grow a set, get a job, and buy two of them.

    Might do you some good.
  • Alucard291 - Sunday, February 24, 2013 - link

    Unlike you, I have a job :)
  • chizow - Sunday, February 24, 2013 - link

    Good point, I'd tend to agree with that assessment as anyone who actually works for their money would not be so eager to part with it so quickly in $1K denominations for what amounts to a glorified pinball machine.

    He's probably a kid who has never had to work a day in his life or a basement dweller who has no hope of ever buying one of these anyways.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    And now with the pure troll, the lying idiot conspiracist nVidia hater takes on the pure personal attack for a big fat ZERO score.

    Congratulations, you and your pure troll can high-five each other and both be wrong anyway, for another year or two, or the rest of your whining, crybaby-posting, PC-herd idiot-mentality lives.
  • Alucard291 - Friday, March 8, 2013 - link

    No no kid. You're the "pure troll here"

    So yeah go get a job and buy two of them. For yourself. Stop being angry at us for not being able to afford it

    ~lol~
  • wiyosaya - Thursday, February 21, 2013 - link

    While I understand your frustrations, IMHO this is a card aimed at those wanting the compute performance of a Tesla at 1/3 the cost. As I see it, nVidia shot themselves in the foot on compute performance with the 680, and as such, I bet 680 sales were lower than expected, primarily because of its crappy compute performance in comparison to even, say, a 580. This may have been their strategy, though, as they might have expected $3,500 Teslas to fly off the shelf.

    I am also willing to bet that Teslas did not fly off the shelf, and that in order to maintain good sales they have basically dropped the price of the first GK110s to something reasonable with this card. One can now buy 3.5 Titans for the price of the entry-level GK110 Tesla, and I fully expect nVidia to make a profit rather than the killing they might have thought possible on the GK110 Teslas.

    That said, I bet nVidia gets a sht load of orders for this card from new HPC builders and serious CAD/CAE workstation suppliers. Many CAD/CAE software packages like SolidWorks and Maple support GPGPUs in their code, making this card a bargain for their builds.

    My apologies, to all the gamers here but us compute nerds are drooling over this card. I only wish I could afford one to put in my i7-3820 build from July. It is more than 2x what I paid for a 580 back then, and the 580 buy was for its compute performance.
  • atlr - Thursday, February 21, 2013 - link

    wiyosaya, I am trying to come up to speed on comparing compute performance between Nvidia and AMD options. Is the Titan drool-worthy only for software that uses CUDA rather than OpenCL? This reminds me of the days of the Glide versus OpenGL APIs.
  • trajan2448 - Friday, February 22, 2013 - link

    AMD's fps numbers are overstated. They figured out a trick whereby runt frames, frames which are not actually rendered, trigger the fps monitor as real, fully rendered frames. This is a real problem for AMD, much worse than the latency problem. Crossfire is a disaster, which is why numerous reviewers, including Tech Report, have written that Crossfire produces higher fps but feels less smooth than Nvidia.
    Check this article out: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
  • chizow - Saturday, February 23, 2013 - link

    That's an awesome analysis by PCPer, thanks for linking that. Might end up being the biggest driver cheat scandal in history. Runt framesgate lol.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    HUGE amd cheat.

    It's their standard operating procedure.

    The fanboys will tape their mouths, gouge out their eyes and stick fingers in their ears and chant: "I'm not listening".
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Enlightenment comes slow to the angry activist conspiracist tinfoil hatter, but it appears you've made progress.

    Another REASON $999 is the correct price.

    Suck it up loser.
  • Sabresiberian - Tuesday, February 26, 2013 - link

    You know, there are some reasonable arguments against this card, but you have to take it to an AMD fanboy level by calling it an overpriced GTX 680.
  • iSlayer - Saturday, March 30, 2013 - link

    Titan gets pretty close to 690/680 SLI performance while using less power, producing less heat, and freeing up space for 3 additional Titans. With Titan, I can build a machine up to 60% more powerful than was previously possible. Sure, it's going to cost twice as much as getting the same performance from other cards, but that's only theoretical, as you literally cannot match a four-Titan config with any other single computer.

    You seem to have entirely missed the point of this card.
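
    That "up to 60%" figure follows from numbers quoted earlier in this thread (a sketch; assumes a Titan is ~20% slower than a GTX 690, that quad-SLI caps a machine at four GPUs, and ideal scaling):

        # Maximum single-machine GPU horsepower, measured in "GTX 690 units".
        titan_vs_690 = 0.8           # Titan ~20% slower than a 690 (per the thread)
        old_max = 2 * 1.0            # two dual-GPU GTX 690s hit the 4-GPU quad-SLI cap
        new_max = 4 * titan_vs_690   # four single-GPU Titans under the same cap
        print(f"Gain: {new_max / old_max - 1:.0%}")   # -> 60%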
  • klepp0906 - Friday, February 21, 2014 - link

    If you thought the Titan was a troll, I'd hate to see what you call their latest attempt with the Titan Black, free of the artificial limitations put in place just so they could make another money grab (all while the driver support on the older card still falls flat on its face in many cases, i.e. Surround).
  • ehpexs - Thursday, February 21, 2013 - link

    Nvidia has been so greedy with the 600 series. 560s turned into 680s, and now this. Nvidia is not getting my money for years.
  • HighTech4US - Thursday, February 21, 2013 - link

    > Nvidia has been so greedy with the 600 series. 560s turned into 680s, and now this. Nvidia is not getting my money for years.

    So you instead gave your money to greedy AMD and their $549 HD 7970 last January? Or did you wait for AMD to become less greedy, when they had to reduce the price by $120 because greedy Nvidia released the GTX 680?
  • chizow - Thursday, February 21, 2013 - link

    Yes AMD started this all with their ridiculous 7970 prices, but Nvidia has taken it way beyond that. $1K is usury, plain and simple.
  • Hrel - Thursday, February 21, 2013 - link

    Haha, never seen anyone use the word "usury" in real life before. You used it kinda wrong but your point came across. Nice!
  • chizow - Friday, February 22, 2013 - link

    Its contemporary usage has extended beyond the traditional references to loans and interest; if you replace "interest" with "premium" it makes perfect sense. But glad you understood and enjoyed the reference.
  • mlambert890 - Saturday, February 23, 2013 - link

    It's a ridiculous usage, honestly. The sense of entitlement people have is almost incomprehensible: there is moral outrage over an expensive video card.

    If you feel it's a bad deal, don't buy it. If you feel that somehow NVidia has committed a sin against nature by releasing the 680 originally (which is somehow now being viewed as "broken", I guess... amazing) and then releasing this card at $1000, because you feel that this card should have *been* the 680, then you are making assumptions without any evidence.

    Everyone seems to be basing their angst on the notion that NVidia *could* be selling this card right now at $500. How does *anyone* know this? It's faith-based insanity. No one knows what their yields are on this thing, yet all of the disgruntled are going out on some wild limb, screaming because they feel this card could have come out last year for half the price.
  • chizow - Saturday, February 23, 2013 - link

    The usage is fine, as it's in reference to Nvidia's practice of overcharging its customers with exorbitant price increases. That's usury.

    The "entitlement" comes from helping to build Nvidia from a rinky dink GPU company into the global factor it is today by spending a few thousand dollars every few years on their glorified gaming machines.

    The outrage comes from that company thinking it's OK to suddenly up the ante and charge you 2x as much for the same class and level of performance you'd expect from what you've paid every few years in an upgrade.

    It's obvious you've never bought into this market before, because otherwise you'd feel more invested in what has happened in the desktop GPU landscape since the 7970 launch and the rest of the Kepler launch, and you'd understand what is happening here. I don't plan to buy it, like most others I know who have bought in this GPU range before, most of whom have a similar sense of disappointment and disdain for Nvidia's Titan pricing strategy.

    As for the last bit... Nvidia has sold its 500+mm^2 ASIC GPUs for much less than even $500 in the past; hell, the GTX 285 sold for as low as $330, and even past GPUs with reported "terrible" yields like GT200 and GF100 were sold for $500 in quarters where Nvidia still turned a profit. TSMC is charging slightly more per 300mm wafer at 28nm than on previous nodes, but nothing close to the 100% premium being asked for TITAN. So obviously they could sell it for less and still profit; they just chose not to.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    You're an IDIOT.

    nVidia sells these for THOUSANDS EACH, and production is LIMITED, you idiot gasbag fool.

    The fact that they spare a few dies for a grand shows they are being extremely generous with their profit potential and giving you sick, ungrateful, whining losers a big freaking break!

    But you're FAR TOO STUPID to know that. That's why nVidia has a REAL CEO and you're a crybaby whiner FOOL on the internet, who cannot think past his own insane $360 card budget, AND GETS IT 100% INCORRECT.

    When nVidia sells one of these limited-production monsters to one of you whining ungrateful OWS crybaby losers for $1000.00, they are LOSING A COUPLE GRAND IN PROFITS, YOU IDIOT!

    Have a nice, dumbed down, idiot crybaby loser day.
  • chizow - Saturday, February 23, 2013 - link

    Production isn't limited, Nvidia and the article have shot that down from the outset, so please stop trying to use that as some "limited" production excuse.

    Nvidia will make as many of these as demand dictates, but they could just as easily have charged $300-400 less and sold untold magnitudes more GK110-based cards. That's the general impression I'm getting from all over the internet and from the circles of enthusiasts I've encountered, anyway.

    Pricing these at $1K is like stealing candy from a retarded kid (you). But $600-$700 would have been justified based on expected price and performance relative to previous generations, would have been worth a look at 1 or even 2 cards for any enthusiast who has purchased in this range before, and would still have allowed Nvidia to charge a premium for Kepler's overachieving performance numbers.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    No, they shot down the "only 10,000" amd fanboy liar rumor, and claimed there will be "continuing availability".

    So you're a liar and a fool.
  • chizow - Sunday, February 24, 2013 - link

    Exactly, you said these were "limited" when they are not, so you stand corrected.

    Looks like you're the liar and the fool, but why state the obvious?
  • CeriseCogburn - Sunday, February 24, 2013 - link

    They are limited; production is NOT limitless. A gradeB skumbag fool who needs a clue from a real authority to use common sense (to paraphrase your prior stupid remark) would know that.

    nVidia CANNOT produce to whatever demand is, whenever demand goes over production capacity at the price you want, and you have implied it would.

    So go blow it out your tinfoil hat.
  • chizow - Monday, February 25, 2013 - link

    Production is limited by demand with GK110, not some artificial "limited" production warranting the $1K price tag as you implied. But why state the obvious about the law of supply and demand?

    Please stop trying to cover your tracks with dishonesty, you were wrong to say Titan is limited when it was not, now move along.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Moving along would be you checking production capacity and the dates in question; of course, instead you've bolted Reynolds Wrap to your gourd.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Yes, mlambert890, they are spoiled, lying crybaby children.

    Thanks for being another sane voice in the wilderness.

    The crybabies need to become men and earn a living, so they stop embarrassing themselves. Not that they know or understand that, nor that they ever will, but it gets sickeningly redundant when they poo their diapers all over the place every time.

    How about they grow up, get to work, and buy a video card once without whining like spoiled brats for months on end over the evil overlords who just won't give them a huge break because... they demand it.

    Maybe the Illegal Usurper will start enforcing price controls for them in this crappy economic worldwide situation where ALL prices have been rising greatly for a number of years.

    Perhaps the fantasy world the brat fanboy crybabies live in is actually a sheltered virtual unreality absolutely separate from RL.

    Charlie D's slaves, with deranged fantasy brains, and like you said, one of the sickest cases of entitlement the world has ever seen.
  • chizow - Saturday, February 23, 2013 - link

    It's funny that you keep mentioning Charlie D., your asinine rants make him look like a genius and a prophet.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    He's the jerk-off liar you sucked up whole, bud.
  • chizow - Sunday, February 24, 2013 - link

    At least he can put together a coherent sentence; compared to you, he looks like a saint.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    you're his little altar boy, you should know
  • chizow - Monday, February 25, 2013 - link

    Sounds like suppressed scars and emotions from a troubled past, you should see someone about that, can't be healthy for you. Might result in lashing out uncontrollably in public ...oh wait.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    LOL - The big hate-filled price crybaby conspiracist, lashing out for nearly a year now; the OWS mirror must be extremely painful.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    The crybaby fool liar amd fanboy has nothing else to offer but the exemplary issuing of brainfarts on a most regular basis, despite the corrections attempted by others for years on end.
    Forget the facts; the raging amd fanboy is blissful, doesn't mean a word he says, and will never follow through.
    A dry, empty, stupid threat of nothing.
    One can only hope the idiots grasp tightly onto the rump cheeks of amd and never let go, so that when it falls into Gehenna the precious fanboy will be gone with it too.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    They don't need your money; they've got BILLIONS.

    AMD needs your money, red fanboy, but then you don't have any to give them. As the amd fanboy tightwads have proven, they can destroy their own favorite company with their constant demand for free and cheap.

    We don't want crybaby loser tightwads like you supporting nVidia. Your company is the in-debt, failing, firing amd. Enjoy the ghetto.
  • CeriseCogburn - Wednesday, February 27, 2013 - link


    So nVidia is pulling in about 10% profit in the video card area, as recent and past years show.

    So how exactly is that greed? This whole past year of the 600 series left them with that same pathetic profit margin.

    They charge ten percent over their costs on average. Ten percent.

    You people are obviously rage-filled hate bots without a clue. Ten percent is not some unreasonable scalp price.
  • UzairH - Thursday, February 21, 2013 - link

    I am upgrading shortly, and for me Skyrim is a big deal when it comes to graphics performance. Vanilla Skyrim plays great on GTX 580/HD 7850 and up, but one can load a dozen hi-res texture mods and a high-quality ENB mod on top of Skyrim to make it look an order of magnitude better than vanilla, with a consequent massive performance drop. Skyrim is also a game that does not do SLI/Crossfire well, so it would be interesting to see what a single powerful GPU can do in it.
  • Ryan Smith - Thursday, February 21, 2013 - link

    We've tested Skyrim with the high res texture pack and it's still CPU bound on high-end video cards. As for mods, as a matter of editorial policy we do not include mods in our testing due to the very frequent changes and relatively small user base.
  • UzairH - Thursday, February 21, 2013 - link

    Ah ok, thanks for the explanation Ryan. Fair enough if the game is CPU bound, and your policy sounds fair as well. Please note however that at high resolutions enabling SSAO kills the performance, and enabling Transparency Anti-aliasing on top of that even more so, so even without mods Skyrim can still be pretty brutal on cards like the 670 and HD 7970.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    LOL ignore the idiocy and buy the great nVidia card, you'll NEVER have to hear another years-long screed from amd fanboys about 3GB of RAM being future-proof - ESPECIALLY WITH SKYRIM AND ADDONS!!!!

    As they screamed endlessly....
  • CeriseCogburn - Saturday, February 23, 2013 - link

    It's a bunch of HOOEY no matter how reasonable "the policy" excuse sounds...

    http://www.bit-tech.net/hardware/2013/02/21/nvidia...

    There are the Skyrim results, with TITAN 40+% ahead.
  • trajan2448 - Friday, February 22, 2013 - link

    AMD's fps numbers are overstated. They figured out a trick to make runt frames, frames which are not actually rendered, trigger the fps monitor as real, fully rendered frames. This is a real problem for AMD, much worse than the latency problem. Crossfire is a disaster, which is why numerous reviewers, including Tech Report, have written that Crossfire produces higher fps but feels less smooth than Nvidia.
    Check this article out. http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
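    (For readers: a runt frame is one that reaches the screen for only a sliver of time yet still increments the FPS counter. Below is a toy C++ sketch of the idea, not PCPer's actual tooling; the sample durations and the 25%-of-median cutoff are illustrative assumptions, and the per-frame display times would have to come from display-side capture, since FRAPS samples at Present() and cannot see what the display shows:)

        #include <algorithm>
        #include <cstdio>
        #include <numeric>
        #include <vector>

        int main() {
            // Hypothetical capture: full ~16ms frames interleaved with ~1ms runts.
            std::vector<double> ms = {16.5, 1.2, 16.8, 1.1, 17.0, 1.3, 16.4, 1.2};

            // Treat any frame displayed for under 25% of the median as a runt.
            std::vector<double> sorted = ms;
            std::sort(sorted.begin(), sorted.end());
            double cutoff = 0.25 * sorted[sorted.size() / 2];

            double total = std::accumulate(ms.begin(), ms.end(), 0.0);
            size_t real = std::count_if(ms.begin(), ms.end(),
                                        [&](double d) { return d >= cutoff; });

            printf("raw FPS:      %.1f\n", 1000.0 * ms.size() / total);  // ~112
            printf("observed FPS: %.1f (%zu runts dropped)\n",           // ~56
                   1000.0 * real / total, ms.size() - real);
            return 0;
        }

    With this sample data the raw counter reports roughly double the rate of distinct images actually reaching the screen, which is exactly the gap the linked article describes.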
  • Ankarah - Thursday, February 21, 2013 - link

    From a regular consumer's point of view, the hype about it being the fastest 'single' graphics card doesn't really appeal that much; it doesn't make a difference to me how these video cards work, or in what configurations, underneath the big case, as long as the card does its job.

    So I really can't understand why any regular consumer would intentionally choose this over the 690GTX, which seems to be faster overall for the same price, unless you belong to the perhaps 0.5% of their market who absolutely require FP64 execution for their work but don't really need the full power of Tesla.

    And let's face it, if you are willing to shell out a grand for your graphics card for your PC, you aren't worried about the difference their TDP will make on your electric bills.

    So I think it's just a marketing circus specifically engineered to draw in a lucky few, to whom money or price/performance ratio holds no value at all - there's nothing to see here for regular Joes like you and me.

    Let's move along.
  • sherlockwing - Thursday, February 21, 2013 - link

    This card is for people willing to spend at least $2K on their graphics cards who don't want to deal with quad-GPU scaling while also having room for a third card. If you don't have that much cash, you are not in its target audience.
  • Ankarah - Thursday, February 21, 2013 - link

    That makes sense,

    so this card, however we slice it, is only for perhaps 1% of the consumer base, if even that.
  • andrewaggb - Thursday, February 21, 2013 - link

    pretty much. And bragging rights.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    not for the crybabies we have here.

    Yet go to another thread and there will be endless screaming about the 7990 and dual top-end video card setups with thousand-dollar INTEL CPUs.

    It all depends on whose diapers the pooing crybabies are soiling at the moment.
  • cmdrdredd - Thursday, February 21, 2013 - link

    Plus people who want to break world records in benchmarking.
  • Sufo - Thursday, February 21, 2013 - link

    lol, you clearly haven't run a dual gpu setup.
  • Veteranv2 - Thursday, February 21, 2013 - link

    Such a shame. How can you disregard all the dual-GPU cards?
    Another biased review. I miss the time when Anand used to be objective. Now it is just an Intel/Nvidia propaganda site. Not even an objective final thoughts section. It is really a shame. I feel sad.
  • Ryan Smith - Thursday, February 21, 2013 - link

    Um? They're there. Along with 680 SLI and 7970GE CrossFire.
  • processinfo - Thursday, February 21, 2013 - link

    Act surprised? He means that in the final thoughts you downplay the fact that Titan is slower than dual-GPU cards. I agree with him. It seems biased, especially when later you talk about 3-way SLI with Titan, which would have the same issues as dual-GPU cards. They cost the same or less and they are faster. For gaming this card brings nothing to the table. For $500-600 it would be a different story.
  • Ryan Smith - Thursday, February 21, 2013 - link

    Ahh.

    So our editorial stance is that while SLI/CF are great, they should only be used to increase performance beyond what a single large GPU can accomplish. AFR comes with some very notable penalties, and while these are acceptable when you can go no bigger, we do not believe these are good tradeoffs to make otherwise.

    It's the same reason why we didn't recommend cards like the GTX 560 Ti 2Win over the GTX 580.

    http://www.anandtech.com/show/5048/evgas-geforce-g...

    Simply put we'd rather have the more consistent performance and freedom from profiles that having a single GTX Titan provides, over the higher but also more unreliable performance of a GTX 690.
  • Alucard291 - Thursday, February 21, 2013 - link

    Well it's exactly as you said, Ryan. It's overpriced and badly positioned in the market (except you used much kinder words - presumably to keep your paycheck).

    It's a nice, pointless consumer (that's a key word right here) GPU which brings the benefits (what are those benefits, exactly?) of overpriced compute performance to people who don't need compute performance.

    Beautiful move Nvidia.
  • processinfo - Thursday, February 21, 2013 - link

    It is not about recommendation. I prefer a single GPU and no SLI configs myself.
    It is about the fact that it is simply slower than anything with a similar price tag.
    This card is only for people who need both a fast gaming card and a computing card in one (or for those who don't care about price).
  • Hrel - Thursday, February 21, 2013 - link

    I think it's about time you re-assess that stance. SLI/CF has come a long way in the past few years.

    Also, 1000 dollars for one card puts it so far out of consideration it doesn't even count as an option for single-GPU use. Which is why he said "For $500-600 it would be a different story". For gaming this card is useless. For compute it seems the 7970GHE would be a better option too, again based solely on price. Performance is close enough that it makes WAY more sense to just buy two of those for 860 bucks if you really need to do some serious GPU compute.
  • Ryan Smith - Thursday, February 21, 2013 - link

    Actually we did re-assess our stance ahead of this article.

    Far Cry 3 came out and SLI didn't work correctly; it took a few NVIDIA releases to get it up to where it is today. That's exactly the kind of scenario that drives our existing stance.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    Yes, of course, forget mentioning the half decade of AMD epic failure in CF...

    It's just amazing what I read here. The bias is so glaring a monkey couldn't miss it.
  • chizow - Friday, February 22, 2013 - link

    Sorry Ryan, but that's a bit of a cop-out excuse imo. Anyone who can fork out $1K for a few GPUs in SLI should be able to figure out how to google SLI bits and enter them into Nvidia Inspector. They are "enthusiasts" after all, right?

    SLI working on launch day with proper SLI bits:
    http://www.overclock.net/t/1334390/how-to-get-prop...

    Nvidia is usually pretty good about updating these bits or launching optimized drivers ahead of a big game launch, but sometimes they are a few days behind. The same can be said for single-GPU performance optimizations, but obviously less frequently than SLI.

    With few exceptions over the course of my 5 year run with SLI, I've been able to get SLI working on Day 1 for most major game launches. AA is more hit or miss, but generally with Nvidia you don't have to wait for updates if you know how to manipulate SLI/AA bits (or use google).

    In any case, I think Titan has created a new paradigm that may need to be addressed, where 2 performance parts are going to be preferable over a single GPU, when those 2 parts in CF/SLI offer 150% of the performance at 75% of the price.
  • Veteranv2 - Thursday, February 21, 2013 - link

    And that is a subjective view. Not objective.

    You don't have one graph where you prove dual GPUs are inconsistent. You don't prove your point; you just bring up a subjective view defending a $1000 card which is crap compared to the best offer from AMD...
  • ronin22 - Thursday, February 21, 2013 - link

    Haters gonna hate.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    liars gonna lie

    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
  • CeriseCogburn - Thursday, February 21, 2013 - link

    here's a chart showing how amd fails utterly...

    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...

    Have fun amd fanboy, the lies you live by are now exposed.
  • Veteranv2 - Friday, February 22, 2013 - link

    Same comment for you:

    Wow, you base your opinion on one game, on a point that one review site has made, without knowing whether the FRAPS software measures correctly or whether the drivers are good with that game?
    Wow, that blows my mind. How dumb can people be?
    This is the same as saying: wow, this one mosquito didn't give me malaria, so malaria doesn't exist....
    I am not a fanboy, except for my wallet and honesty.

    Anandtech makes a point which it does not prove. That is my point. Hence if you cannot prove a point, but still make it, that makes you subjective. So either prove it, or shut up about it and be objective.
  • veppers - Saturday, February 23, 2013 - link

    I'm not sure you should be calling anyone out for being a fanboy when your childish pro-Nvidia posts are here for all to see. (Happy Nvidia customer here, just in case you wanted to go there.)

    Also, how many times are you going to spam that same link?
  • CeriseCogburn - Saturday, February 23, 2013 - link

    As many times as the lying whiners pull their stupid crap.

    It should be spammed about 300 times in all the articles here since the Jan 2012 release of the amd failship 79xx, shouldn't it?

    Hello amd fanboy.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    " SLI profiles for optimum scaling, NVIDIA has done a very good job here in the past, and out of the 19 games in our test suite, SLI only fails in F1 2012. Compare that to 6 out of 19 failed titles with AMD CrossFire. "

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...

    LOL - yes you sure know what you're talking about...
  • CeriseCogburn - Thursday, February 21, 2013 - link

    Take a look at HOW CRAP amd dual top-end card setups actually are, and how FRAPS, which this site regularly uses, has been SCREWING nVidia for years...

    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...

    Enjoy the years of lies you have swallowed whole, lies that sentient human beings like myself have warned against.
  • Veteranv2 - Friday, February 22, 2013 - link

    Wow, you base your opinion on one game, on a point that one review site has made, without knowing whether the FRAPS software measures correctly or whether the drivers are good with that game?
    Wow, that blows my mind. How dumb can people be?
    This is the same as saying: wow, this one mosquito didn't give me malaria, so malaria doesn't exist....
    I am not a fanboy, except for my wallet and honesty.

    Anandtech makes a point which it does not prove. That is my point. Hence if you cannot prove a point, but still make it, that makes you subjective. So either prove it, or shut up about it and be objective.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Yeah, it's not one site; this info has been around for years.
    Just run FRAPS with your own games vs. other frame-count methods on both cards, idiot.

    Oh that's right, you have no experience.
  • Veteranv2 - Sunday, February 24, 2013 - link

    Wow, you are the saddest person I have ever seen on the internet. You seem like you could use a hug or something.
    Or professional help.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Another internet parrot with the usual line, repeated a thousand times whenever they have no rebuttal.
    You fit right in with the rest of the fools.
  • carlos l - Thursday, February 21, 2013 - link

    Titan's most interesting feature is DP performance. The previous GTX 6xx cards have only 1/24 DP, much less than the GTX 5xx. Titan's announced 1/3 DP performance is, in theory, an amazing improvement for GPGPU tasks. It would be nice if you could add a benchmark on that feature. I suggest something like GeneferCUDA, an app from PrimeGrid (a BOINC project) designed for CUDA-capable cards that uses DP intensively.
  • codedivine - Thursday, February 21, 2013 - link

    Look at the DGEMM and FFT (double-precision) benchmarks. Those are both using fp64.
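    (For anyone wanting to reproduce that kind of FP64 number themselves, here is a minimal sketch of a DGEMM throughput test using cuBLAS. The matrix size and iteration count are arbitrary assumptions, and it would be built with something like "nvcc dgemm.cu -lcublas":)

        // Sketch: estimate sustained FP64 GFLOPS via repeated cuBLAS DGEMM calls.
        #include <cstdio>
        #include <vector>
        #include <cuda_runtime.h>
        #include <cublas_v2.h>

        int main() {
            const int n = 4096;                    // n x n matrices
            const double alpha = 1.0, beta = 0.0;
            const size_t bytes = (size_t)n * n * sizeof(double);

            double *dA, *dB, *dC;
            cudaMalloc(&dA, bytes); cudaMalloc(&dB, bytes); cudaMalloc(&dC, bytes);
            std::vector<double> host((size_t)n * n, 1.0);
            cudaMemcpy(dA, host.data(), bytes, cudaMemcpyHostToDevice);
            cudaMemcpy(dB, host.data(), bytes, cudaMemcpyHostToDevice);

            cublasHandle_t handle;
            cublasCreate(&handle);

            // One warm-up call, then time a batch with CUDA events.
            cublasDgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                        &alpha, dA, n, dB, n, &beta, dC, n);
            cudaEvent_t start, stop;
            cudaEventCreate(&start); cudaEventCreate(&stop);
            const int iters = 10;
            cudaEventRecord(start);
            for (int i = 0; i < iters; ++i)
                cublasDgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                            &alpha, dA, n, dB, n, &beta, dC, n);
            cudaEventRecord(stop);
            cudaEventSynchronize(stop);
            float ms = 0.0f;
            cudaEventElapsedTime(&ms, start, stop);

            // DGEMM costs 2*n^3 floating-point operations per call.
            double gflops = 2.0 * n * (double)n * n * iters / (ms * 1e6);
            printf("DGEMM: %.0f GFLOPS (fp64)\n", gflops);

            cublasDestroy(handle);
            cudaFree(dA); cudaFree(dB); cudaFree(dC);
            return 0;
        }

    DGEMM performs 2*n^3 floating-point operations per call, so sustained GFLOPS falls straight out of the timed loop; on a card rated around 1.3 TFLOPS FP64, a healthy fraction of that is what you would hope to see.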
  • chizow - Thursday, February 21, 2013 - link

    While the DP performance is great for those that would use it, I think the vast majority of potential buyers would've preferred to see the same neutered DP performance if it meant a more reasonable, historic level of pricing in the $500-$650 range.

    As a previous owner of these flagship cards, I know I gave 0 fks about the DP performance of the card even though it was respectable relative to the Tesla parts (about half).
  • alpha754293 - Thursday, February 21, 2013 - link

    Does it have the same PCI devID as the Tesla cards?

    I like how Rahul Garg covered the compute performance using HPC-level code.

    Do you or does he know if MATLAB will work with this card?

    Thanks.
  • codedivine - Thursday, February 21, 2013 - link

    Rahul here. Not sure about the PCI ID. About MATLAB: while we did not test this, I see no reason why it will not work with MATLAB.
  • alpha754293 - Thursday, February 21, 2013 - link

    Rahul:

    I loved your analysis. Very useful.

    One question though - nVidia was saying that the single-precision peak theoretical floating point rate should be 4.5 TFLOPS. How come your SGEMM only shows 3.something TFLOPS? Is it tuning or optimizations?
  • Ryan Smith - Thursday, February 21, 2013 - link

    PCI\VEN_10DE&DEV_1005&SUBSYS_103510DE

    I have no idea what a Tesla card's would be, though.
  • alpha754293 - Thursday, February 21, 2013 - link

    I don't suppose you would know how to tell the computer/OS that the card has a different PCI DevID other than what it actually is, would you?

    NVIDIA Tesla C2075 PCI\VEN_10DE&DEV_1096
  • Hydropower - Friday, February 22, 2013 - link

    PCI\VEN_10DE&DEV_1022&SUBSYS_098210DE&REV_A1

    For the K20c.
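    (If it helps, the driver-reported PCI IDs can also be read programmatically. Here is a hedged sketch using NVML, assuming the nvml.h header and library that ship with the driver/CUDA toolkit, linked with -lnvidia-ml; the field layout is as the NVML docs describe it, with the 16-bit device ID packed above the 16-bit vendor ID:)

        // Sketch: print the PCI vendor/device and subsystem IDs NVML reports
        // for each GPU, so a GeForce can be compared against a Tesla directly.
        #include <cstdio>
        #include <nvml.h>

        int main() {
            if (nvmlInit() != NVML_SUCCESS) return 1;
            unsigned int count = 0;
            nvmlDeviceGetCount(&count);
            for (unsigned int i = 0; i < count; ++i) {
                nvmlDevice_t dev;
                char name[64];
                nvmlPciInfo_t pci;
                nvmlDeviceGetHandleByIndex(i, &dev);
                nvmlDeviceGetName(dev, name, (unsigned int)sizeof(name));
                nvmlDeviceGetPciInfo(dev, &pci);
                // pciDeviceId packs the 16-bit device ID over the 16-bit vendor ID.
                printf("%s: VEN_%04X DEV_%04X SUBSYS_%08X\n", name,
                       pci.pciDeviceId & 0xFFFF, pci.pciDeviceId >> 16,
                       pci.pciSubSystemId);
            }
            nvmlShutdown();
            return 0;
        }

    Note this only reads the IDs; convincing the OS that a GeForce is a Tesla is a different exercise entirely, and one the driver will fight.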
  • brucethemoose - Thursday, February 21, 2013 - link

    "This TDP limit is 106% of Titan’s base TDP of 250W, or 265W. No matter what you throw at Titan or how you cool it, it will not let itself pull more than 265W sustained."

    The value of the Titan isn't THAT bad at stock, but 106%? Is that a joke!?

    Throw in an OC-for-OC comparison, and this card is absolutely ridiculous. Take the 7970 GE... 1250MHz is a good, reasonable 250MHz OC on air, a nice 20%-25% boost in performance.

    The Titan review sample is probably the best-case scenario, and it can go 27MHz past turbo speed, 115MHz past base speed, so maybe 6%-10%. That $500 performance gap starts shrinking really, really fast once you OC, and for god's sake, if you're the kind of person who's buying a $1000 GPU, you shouldn't intend to leave it at stock speeds.

    I hope someone can voltmod this card and actually make use of a waterblock, but there's another issue... Nvidia is obviously setting a precedent. Unless they change this OC policy, they won't be seeing any of my money anytime soon.
  • JarredWalton - Thursday, February 21, 2013 - link

    As someone with a 7970GE, I can tell you unequivocally that 1250MHz on air is not at all a given. My card can handle many games at 1150MHz, but with other titles and applications (say, running some compute stuff) I'm lucky to get stability for more than a day at 1050MHz. Perhaps with enough effort playing with voltage mods and such I could improve the situation, but I'm happier living with a card for a couple of years that doesn't crap out because of excessively high voltages.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    " After a few hours of trial and error, we settled on a base of the boost curve of 9,80 MHz, resulting in a peak boost clock of a mighty 1,123MHz; a 12 per cent increase over the maximum boost clock of the card at stock.

    Despite the 3GB of GDDR5 fitted on the PCB's rear lacking any active cooling it too proved more than agreeable to a little tweaking and we soon had it running at 1,652MHz (6.6GHz effective), a healthy ten per cent increase over stock.

    With these 12-10 per cent increases in clock speed our in-game performance responded accordingly."

    http://www.bit-tech.net/hardware/2013/02/21/nvidia...

    Oh well, 12 is 6 if it's nVidia bash time. Good job, Mr. Know-It-All.
  • Hrel - Thursday, February 21, 2013 - link

    YES! 1920x1080 has FINALLY arrived. It only took 6 years from when it became mainstream but it's FINALLY here! FINALLY! I get not doing it on this card, but can you guys PLEASE test graphics cards, especially laptop ones, at 1600x900 and 1280x720. A lot of the time when on a budget playing games at a lower resolution is a compromise you're more than willing to make in order to get decent quality settings. PLEASE do this for me, PLEASE!
  • JarredWalton - Thursday, February 21, 2013 - link

    Um... we've been testing 1366x768, 1600x900, and 1920x1080 as our graphics standards for laptops for a few years now. We don't do 1280x720 because virtually no laptops have that as their native resolution, and stretching 720p to 768p actually isn't a pleasant result (a 6.7% increase in resolution means the blurring is far more noticeable). For desktop cards, I don't see much point in testing most below 1080p -- who has a desktop not running at least 1080p native these days? The only reason for 720p or 900p on desktops is if your hardware is too old/slow, which is fine, but then you're probably not reading AnandTech for the latest news on GPU performance.
  • colonelclaw - Thursday, February 21, 2013 - link

    I must admit I'm a little bit confused by Titan. Reading this review gives me the impression it isn't a lot more than the annual update to the top-of-the-line GPU from Nvidia.
    What would be really useful to visualise would be a graph plotting the FPS rates of the 480, 580, 680 and Titan along with their release dates. From this I think we would get a better idea of whether or not it's a new stand out product, or merely this year's '780' being sold for over double the price.
    Right now I genuinely don't know if I should be holding Nvidia in awe or calling them rip-off merchants.
  • chizow - Friday, February 22, 2013 - link

    From Anandtech's 7970 Review, you can see relative GPU die sizes:

    http://images.anandtech.com/doci/5261/DieSize.png

    You'll also see the prices of these previous flagships have been mostly consistent, in the $500-650 range (except for a few outliers like the GTX 285, which came in hard economic times, and the 8800 Ultra, which was Nvidia's last ultra-premium card).

    You can check some sites that use easy performance-rating charts, like computerbase.de, to get a quick idea of relative performance increases between generations, and you can quickly see that going to a new generation (not half-node) like G80 > GT200 > GF100 > GK100/110 should offer a 50%+ increase, generally closer to the 80% range over the predecessor flagship.

    Titan would probably come a bit closer to 100%, so it does outperform expectations (all of Kepler line did though), but it certainly does not justify the 2x increase in sticker price. Nvidia is trying to create a new Ultra-premium market without giving even a premium alternative. This all stems from the fact they're selling their mid-range part, GK104, as their flagship, which only occurred due to AMD's ridiculous pricing of the 7970.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    You seem to keep forgetting that nearly all other computer parts, since our illustrious communist usurped the perch, have also not fallen in price, as had traditionally been the case.
    Computer parts across the board are staying the same or rising in price.

    It's called a crappy world money printing inflationary mess.

    If you haven't noticed it, you're clueless.
  • chizow - Saturday, February 23, 2013 - link

    Yeah once again you must be living in a parallel universe.

    PCs, and electronics in general, are all getting cheaper in price and increasing in performance with each iteration. Hell even Apple products have lower price points than they did 3-4 years ago across the board.

    Just look at laptops for example. You can get a solid "Ultrabook" laptop for $500-800. Same class of laptop would've cost you $1200-$1500 5 years ago.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    A finished product is not a computer part, you fool.

    Nice try, liar.
  • colonelpepper - Thursday, February 21, 2013 - link

    Vue
    Avid
    Maya
    Autocad
    3DS Max
    After Effects
    Adobe Creative Suite

    This card is touching the boundary between gaming & QUADRO cards, and yet there are ZERO benchmarks for any of this software - just page after page after page dedicated to various games.

    What gives?
    Perhaps Tom's will have a relevant review.
  • Hrel - Thursday, February 21, 2013 - link

    I was wondering about this as well. Dustin even says it's designed as a "cheap compute card".

    I'd add PowerDirector to the list though.
  • Hrel - Thursday, February 21, 2013 - link

    Ryan*
  • JarredWalton - Thursday, February 21, 2013 - link

    Keep in mind the time constraints. Ryan and Anand received the Titan hardware late last week; adding additional testing for this article is a bit much, considering Ryan already had to write the initial Part 1 plus this Part 2. I'm guessing that performance in professional apps where Quadro is usually the go-to choice isn't going to benefit all that much from Titan relative to the GTX 680, though, unless NVIDIA is reversing their stance on Quadro getting special drivers (which doesn't seem to be the case).
  • dbr1 - Thursday, February 21, 2013 - link

    Exactly!

    How does this card perform in Adobe Premiere Pro???
  • atlr - Thursday, February 21, 2013 - link

    Seeing performance of a benchmark like http://ppbm5.com/ with Creative Suite v6.03 across the set of Nvidia and AMD cards would be interesting. I have not found runs of this benchmark with an AMD 7970 yet. CUDA rules the Adobe roost, although there is some OpenCL support in current products.
  • Ryan Smith - Monday, February 25, 2013 - link

    Due to the disparity in GeForce and Quadro drivers, not to mention the difference in support, these are programs where for the most part it wouldn't make much sense to benchmark them on a GeForce Titan. Now if NVIDIA rolls out a GK110 Quadro, then that would be far more useful.

    If you're in the market for a Quadro, you're probably not in the market for a GeForce. NVIDIA keeps those market segments rather rigorously separated.
  • vps - Thursday, February 21, 2013 - link

    For compute benchmarks you might want to take a look at FAHBench.
    FAHBench is the official Folding@Home GPU benchmark. It measures the compute performance of GPUs for Folding@Home.
    http://proteneer.com/blog/?page_id=1671

    Some reference scores are here:
    http://foldingforum.org/viewtopic.php?f=38&t=2...
  • Ryan Smith - Thursday, February 21, 2013 - link

    FAHBench is primarily an OpenCL benchmark (there's a CUDA path, but it's effectively on its way out). It's on our list, and is one of the things we couldn't run due to the fact that OpenCL is not currently working on Titan.
  • Hrel - Thursday, February 21, 2013 - link

    PowerDirector still uses CUDA
  • atlr - Thursday, February 21, 2013 - link

    Not sure if this helps. I found CLBenchmark results of a Titan versus a 7970 here.
    http://clbenchmark.com/compare.jsp?config_0=144702...
  • atlr - Thursday, February 21, 2013 - link

    Ville Timonen posted results running his own code on a Tesla K20 versus the usual suspects. Might be helpful to folks considering options for GPGPU computation.
    http://wili.cc/blog/gpgpu-faceoff.html
  • chizow - Thursday, February 21, 2013 - link

    That's the first thing that comes to mind now when I think of Nvidia, which is a shame, because that name used to be synonymous with Awesome. That's gone, replaced with a disdain for the ridiculous levels of usury in their last two high-end product launches. I'm not going to be disingenuous and claim I'm going AMD, because the fact of the matter is, Nvidia products are still in a class of their own for my usage needs, but I will certainly not be spending as much on Nvidia parts as I used to.

    Kepler has basically set Nvidia's product stack back by half a generation, but my price:performance metrics will stay the same. Nvidia has their ultra-premium "Xtreme Edition" GPU this round, but that only came about as a result of AMD's ridiculous pricing and the overall lackluster performance of the 7970 for a "flagship" card. Either way, I think it will be difficult for Nvidia to sustain this price point, as expectations just got higher at that $1K range.

    @Ryan: I'm disappointed you didn't write a harsher commentary on the fact Nvidia is now charging 2x for the same class of GPU, pricing that has stayed true since 2006. Even the G80 Ultra didn't approach this $1K mark. Given how many times Nvidia has had to backpedal and apologize about the Ultra's pricing, you would think they would learn from their mistakes. I guess not, I hope they are prepared to deal with the long-term ramifications, backlash, and loss of goodwill stemming from this pricing decision.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    7.1 billion transistors and 6GB of RAM.

    I for one am sick of you people constantly whining.

    If we check the whine log from the ATI32 days you were doing it then, too.

    It's all you people do. Every time, all the time.
  • chizow - Friday, February 22, 2013 - link

    And all you do is post inane, barely intelligible nonsense in defense of Nvidia. If you check your "whine logs" you'll see I've done my fair share of defending Nvidia, but I can't and won't give them a pass for what they've done with Kepler. AMD started it for sure with the terribad price:performance of Tahiti, but Nvidia has taken it to a new level of greed.

    And for all the idiots who are going to reply "herr dueerr Nvidai need make money not a charity derrr", my advanced reply is that Nvidia has made money in all but 2-3 quarters since 2006 without selling a single $1K desktop GPU. In fact, they enjoyed record profits, margin and revenue on the back of a $250 GPU, the 8800GT in 2007-2008.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Nope, you are the crying whining baby who says the same thing in 100 different posts here, and has no clue what "the economy" of "the world" has been doing for the past several years.

    Whatever.

    Please go cry about all the other computer part prices that are doing the same thing.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    I mean you idiots want the same price for more perf over a DECADE. Meanwhile, the rest of the world's pricing has DOUBLED.

    Now, computer prices used to drop across the board, but they just aren't doing it anymore, and IDIOTS like yourself continue your stupid frikkin rants, ignoring the world itself, not to mention the LACK of production in that world for the GPUs you whine about. It's really funny how stupid you are.
  • chizow - Saturday, February 23, 2013 - link

    I haven't used this rebuttal in a long time, I reserve it for only the most deserving, but you, sir, are an utter fool.

    Everything you've written above is anti-progress; you've set Moore's Law and semiconductor progress back 30 years with your asinine rants. If idiots like you were running the show, no one would own any electronic devices, because we'd be paying $50,000 for toaster ovens.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Yeah, that's a great counter, you idiot... As usual, when reality barely glints a tiny bit through your lying tinfoil dunce cap, another sensationalistic pile of bunk is all you have.
    A great cover for a cornered doofus.
    When you finally face your immense error, you'll get over it.

  • hammer256 - Thursday, February 21, 2013 - link

    Not to sound like a broken record, but for us in scientific computing using CUDA, this is a godsend.
    The GTX 680 release was a big disappointment for compute, and I was worried that this was going to be the trend going forward with Nvidia: nerfed compute on consumer cards that focus on graphics, and compute-heavy professional cards for the HPC space.
    I was worried that the days of cheap compute were gone. Those days might still be numbered, but at least for this generation Titan keeps them going.
  • ronin22 - Thursday, February 21, 2013 - link

    +1
  • PCTC2 - Thursday, February 21, 2013 - link

    For all of you complaining about the $999 price tag. It's like the GTX 690 (or even the 8800 Ultra, for those who remember it). It's a flagship luxury card for those who can afford it.

    But that's beside the real point. This is a K20 without the price premium (and without some of the valuable Tesla features). For researchers on a budget, using homegrown GPGPU compute code that isn't validated to run only on Tesla cards, these are a godsend. I mean, some professional programs will benefit from having a Tesla over a GTX card, but these days researchers are trying to reach into HPC space without the price premium of true HPC enterprise hardware. The GTX Titan is a good middle point. For the price of a Quadro K5000 and a single Tesla K20c card, they can purchase 4 GTX Titans and still have some money to spare. They don't need SLI. They just need the raw compute power these cards are capable of. So as entry GPU-compute workstation cards, these hit the mark for those wanting to enter GPU compute on a budget. As a graphics card for your gaming machine, average gamers need not apply.
  • ronin22 - Thursday, February 21, 2013 - link

    "average gamers need not apply"

    If only people had read this before posting all this hate.

    Again, gamers, this card is not for you. Please get the cr*p out of here.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    You have to understand, the review sites themselves have pushed the blind fps mentality for years now, not to mention insanely declared statistical percentages, ripe with over-interpretation, fed to the now contorted and controlled crybaby whiners. It's what they do every time; they feel it gives them the status of consumer advisor, Nader protege, fight-the-man activist, and knowledgeable enthusiast.

    Unfortunately that comes down to the ignorant demands we see here, twisted with as many lies and conspiracies as are needed to inflate the personal faux outrage.
  • Dnwvf - Thursday, February 21, 2013 - link

    In absolute terms, this is the best non-Tesla compute card on the market.

    However, looking at FLOPS/$, you'd be better off buying two 7970GHz Radeons, which would run around $60 less and give you more total FLOPS. Look at the compute scores - Titan is generally not 2x a single 7970. And in some of the compute scores, the 7970 wins.

    Two 7970GHz cards (not even in CrossFire mode; you don't need that for OpenCL) will beat the crap out of Titan and cost less. They couldn't run AOPR on the AMD cards... but everybody knows from bitcoin that AMD cards rule over nvidia for password hashing (just google bitcoin bit_align_int to see why; a sketch of the trick follows this comment).

    There's an article on Tom's Hardware where they put a bunch of nvidia and amd cards through a bunch of compute benchmarks, and when amd isn't winning, the GTX 580 generally beats the 680... most likely due to its wider memory bus (384-bit versus the 680's 256-bit). Titan is still a 384-bit bus... can't really compare on price because Phi costs an arm and a leg like Tesla, but you have to acknowledge that Phi is probably gonna rock out with its 512-bit vector units.

    Gotta give Nvidia kudos for finally not crippling fp64, but at this price point, who cares? If you're looking to do compute and have a GPU budget of $2K, you could buy:

    An older Tesla
    2 Titans
    -or-
    Build a system with 2 7970GHz and 2 GTX 580s.

    And the last system would be the best... compute on the amd cards for certain algorithms, on the nvidia cards for the others, and, PCI bandwidth issues aside, running multiple complex algorithms simultaneously will rock because you can enqueue and execute 4 OpenCL kernels simultaneously. You'd have to shop around for a while to find some 580s, though.

    Gamers aren't gonna buy this card unless they're spending Daddy's money, and serious compute folk will quickly realize that if they buy a mobo that will fit 2 or 4 double-width cards, depending on GPU budget, they can get more FLOPS per dollar with a multiple-card setup (think of it as a micro-sized GPU compute cluster). Don't believe me? Google Jeremi Gosney oclHashcat.

    I'm not much for puns, but this card is gonna flop. (sorry)
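    (For the curious, the bit_align_int point comes down to a few SHA primitives. A plain C++ illustration follows; the instruction mapping is as reported by the mining community rather than verified here, the claim being that on AMD's VLIW GPUs of that era each helper compiles to a single BFI_INT or BIT_ALIGN_INT, versus several logic ops on NVIDIA hardware:)

        #include <cstdint>
        #include <cstdio>

        // SHA-256 "choose": take bits from y where x is set, from z elsewhere.
        // This is a bit-select, i.e. reportedly one BFI_INT on AMD hardware,
        // versus three or four logic ops when spelled out.
        static inline uint32_t Ch(uint32_t x, uint32_t y, uint32_t z) {
            return z ^ (x & (y ^ z));
        }

        // SHA-256 "majority": also reducible to one bit-select, via the
        // identity Maj(x, y, z) == Ch(x ^ z, y, z).
        static inline uint32_t Maj(uint32_t x, uint32_t y, uint32_t z) {
            return (x & y) | (z & (x | y));
        }

        // Right-rotate (n in 1..31): BIT_ALIGN_INT reportedly performs this
        // funnel shift in a single instruction.
        static inline uint32_t rotr(uint32_t x, unsigned n) {
            return (x >> n) | (x << (32 - n));
        }

        int main() {
            printf("Ch=0x%08X Maj=0x%08X rotr=0x%08X\n",
                   Ch(0xF0F0F0F0u, 0xAAAAAAAAu, 0x55555555u),
                   Maj(0xF0F0F0F0u, 0xAAAAAAAAu, 0x55555555u),
                   rotr(0x80000000u, 7));
            return 0;
        }

    SHA-256's inner loop leans on these constantly, so saving a couple of operations per call adds up across billions of hash attempts.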
  • DanNeely - Thursday, February 21, 2013 - link

    Has any ETA leaked out yet on when the rest of the Kepler refresh is due?
  • HisDivineOrder - Thursday, February 21, 2013 - link

    It's way out of my price range, first and foremost.

    Second, I think the pricing is a mistake, but I know where they are coming from. They're using the same Intel school of thought on SB-E compared to IB. They price it out the wazoo and only the most luxury of the luxury gamers will buy it. It doesn't matter that the benchmarks show it's only mostly better than its competition down at the $400-500 range and not the all-out destruction you might think it capable of.

    The cost will be so high it will be spoken of in whispers and with wary glances around, fearful that the Titan will appear and step on you. It'll be rare, and rare things are seen as legendary, just so long as they can make the case that it's the fastest single GPU out there.

    And they can.

    So in short, it's like those people buying hexacore CPUs from Intel. You pay through the nose, you get little real gain and horrible performance per dollar, but it is more marketing than common sense.

    If nVidia truly wanted to use this product to serve all users, they would have priced it at $600-700 and moved a lot more. They don't want that. They're fine with the 670/680 being the high end for a majority of users. Those cards have to be cheap to make by now, and with AMD's delays/stalls/whatevers, they can keep them the way they are or refresh them with a firmware update and perhaps a minor retooling of the fab design to give them GPU Boost 2.

    They've already set the stage for that imho. If you read the way the article is written about GPU Boost 2 (both of them), you can see nVidia is setting up a stage where they introduce a slightly modified version of the 670 and 680 with "minor updates to the GPU design" and GPU Boost 2, giving them more headroom to improve consistency with the current designs.

    Which again would be stealing from Intel's playbook of supplement SB-E with IB mainstream cores.

    The price is obscene, but the only people who should actually care are the ones who worship at the altar of AA. Start lowering that and suddenly even a 7950 is way ahead of what you need.
  • varg14 - Thursday, February 21, 2013 - link

    I will hang on to my SLI 560 Tis for a while longer. Since I game at 1080p they perform very well.
  • mayankleoboy1 - Thursday, February 21, 2013 - link

    Some video conversion benchmarks please.
  • mayankleoboy1 - Thursday, February 21, 2013 - link

    Oh, and the effect of PCIe 2.0 vs. PCIe 3.0 also. Let's see how much the Titan is gimped by PCIe 2.0.
  • Ryan Smith - Thursday, February 21, 2013 - link

    This isn't something we can do at this second, but it's definitely something we can follow up on once things slow down a bit.
  • mayankleoboy1 - Thursday, February 21, 2013 - link

    Sure. I am looking forward to a part three of the Titan review
  • Hrel - Thursday, February 21, 2013 - link

    The problem with the reasoning they're using here is that the 7970 is almost as fast and costs a lot less. The Titan is competing, based on performance, with the 7970. Based on that comparison it's a shitty deal.

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    $430. So based on that I'd say the highest price you can justify for this card is $560. We'll round up to $600.

    Nvidia shipping this, at this price, and just saying "it's a luxury product" is bullshit. It's not a luxury product, it's their version of a 7970GHE. But they want to try and get a ridiculous profit to support their PhysX and CUDA projects.

    Nvidia just lost me as a customer. This is the last straw. This card should be pushing the pricing down on the rest of their lineup. They SHOULD be introducing it to compete with the 7970GHE. Even at my price range, compare the GTX 660 to the 7870GHE, or better yet the sub-$200 7850; they just aren't competitive anymore. I'll admit, I was a bit of an Nvidia fanboy. Loved their products. Was disappointed by older ATI cards and the issues I had with them (stability, screens fitting properly, audio issues). But ATI has become AMD, they've improved quality a lot, and Nvidia is USING their customers' loyalty; that's just wrong.

    I'm done with Nvidia on the desktop. By the time I need a new laptop AMD will probably have the graphics switching all sorted; so I'm probably done with Nvidia on laptops too.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    LOL - be done, and buy the alternative crap - amd.

    You'll be sorry, and when you have to hold back admitting it, I'll be laughing the whole time.

    Poor baby can't pony up the grand, so he's boycotting the whole line.
    You know you people are the sickest freaks the world has ever seen, and frankly I don't believe you, and consider you insane.

    You're all little crybaby socialist activists. ROFL You're all pathetic.

    nVidia won't listen to you, so go blow on your amd crash monkey; you and two other people will do it before amd disappears into bankruptcy, and then we can laugh at your driverless video cards.

    I've never seen bigger crybaby two-year-olds in my entire life. You all live in your crybaby world together, in solidarity - ROFL

    No one cares if you lying turds claim you aren't buying nVidia - they have billions and are DESTROYING amd because you cheapskate losers cost amd money - LOL

    YOU ARE A BURDEN AND CANNOT PAY FOR THE PRODUCTION OF A VIDEO CARD!

    Enjoy your false amd ghetto loser lifestyle.
  • Soulnibbler - Thursday, February 21, 2013 - link

    Hey, I'm excited about the fp64 performance, but I'm not going to have time to write code for a bit, so I'll ask the question that would let me justify buying a card like this:

    How much acceleration should I expect using this card with Capture One as compared to AMD/software rendering? I've heard anecdotal evidence that the OpenCL code paths in version 7 make everything much faster, but I'd like a metric before I give up my current setup (Windows in VMware) and dual-boot to get OpenCL support.

    I know OpenCL is not yet working on this card, but when you revisit it, could we see a little Capture One action?

    Preferably the benchmark sets would be high-resolution images at both high and low ISO.
  • Ryan Smith - Monday, February 25, 2013 - link

    I'm afraid I don't know anything about Capture One. Though if you could describe it, that would be helpful.
  • Soulnibbler - Monday, February 25, 2013 - link

    Capture One is a raw developer for digital cameras:
    http://www.phaseone.com/en/Imaging-Software.aspx
    It is notable for medium-format digital backs but also supports 35mm and APS sensors. It could be considered a competitor to Adobe's Lightroom and ACR software, but the medium-format camera support and workflow are the major differentiators.

    The last two releases have had OpenCL support for both previews and exporting, which I've heard has led to reductions in the time it takes to get an image through post.

    I'd imagine that one could benchmark on a large library of photos and determine if this card as a compute card is any improvement over standard gaming cards in this use scenario.

    I'd imagine this is part of the market that NVIDIA is aiming at, as I know at least one user who switched to an AMD FirePro W7000 for OpenCL support with Capture One.
  • rcabor - Thursday, February 21, 2013 - link

    Why is the voltage higher at 996MHz than at 992MHz? Is that a typo?
  • rcabor - Thursday, February 21, 2013 - link

    meant 966 MHz not 996 MHz
  • Ryan Smith - Thursday, February 21, 2013 - link

    Bingo. Fixed. Thank you.=)
  • HollyDOL - Thursday, February 21, 2013 - link

    Hi,
    it seems a typo sneaked into the voltages mentioned (page 2) for the three operating frequencies just below the 1GHz mark... it should say 1.1625V (I guess) but says 1.625V, for example. 1.625 also appears in the text, not just the table.
  • just4U - Thursday, February 21, 2013 - link

    I am interested in these luxury coolers for some of their lower-end GPUs. I don't know what they cost, but I'd pay a premium for one if it wasn't too far out there...
  • BrokenCrayons - Thursday, February 21, 2013 - link

    Viewed outside of consideration for the cost, it's quite impressive. Inexpensive (relatively) GPU compute seems like its focus, which makes sense given the original design intentions. On the gaming side of the equation, the performance benefits don't stack up well against the costs. A 680 or 7970 seems like a much more reasonable purchase, unless the whole intention is fairly pointless bragging rights about how much money you can afford to drop on a GPU, aimed at people who don't care, won't understand the significance, or are just random sorts you ramble at on the Internet; the card only returns the ability to waste time playing games at higher-end display settings. So yeah, the performance is interesting, but the price puts it firmly into "yeah, whatever" territory for gaming.

    -BC
  • Hood6558 - Thursday, February 21, 2013 - link

    This is another piece of the puzzle I call my Ultimate Dream Machine, in which every component is the best or fastest available. It joins such parts as the i7-3970X, Asrock X79 Extreme11, Corsair 900D, Corsair Dominator Platinum 64 GB kits, and Corsair AX 1200i. I'll probably never have the money to build it, but that's okay, the dream is sometimes better than the reality. Anyway by the time I build it all the above parts will be obsolete, slow museum pieces, and the new stuff will be 10 times faster.
  • silverblue - Thursday, February 21, 2013 - link

    What's going on with the 7970's framerate in Crysis: Warhead's 1080p E Shaders/G Quality tests? It's massively behind the 7970GE.

    Also, the minimum framerate table for Crysis: Warhead appears to have been placed on top of the Far Cry 3 page in addition to the bottom of the Crysis: Warhead page.
  • Ryan Smith - Thursday, February 21, 2013 - link

    I'll have to look into that and get back to you. It's not a transcription error, so it may have been a configuration error on the benchmark.
  • JlHADJOE - Thursday, February 21, 2013 - link

    1/3 FP64 is awesome. This is a bargain if you need the compute.

    If you wanna play games on it... not so much. =P
  • ronin22 - Thursday, February 21, 2013 - link

    That's the point, it's not a gamerz card
  • Finally - Thursday, February 21, 2013 - link

    "Titan delivers the kind of awe-inspiring performance we have come to expect from NVIDIA’s most powerful video cards."
    If you hear unfiltered Nvidia marketing speak like this, you know that AT isn't fooling around when it comes to earning their PR dollars. Well done!
  • Scritty - Thursday, February 21, 2013 - link

    Paper launch? Fine. I get that. But I suspect stock levels will be seriously limited. Rumour has it that only 10,000 of these will be made, which seems very odd, as even with a substantial profit margin, the ROI on development costs is going to be hard to recoup with a potential sales level that low.

    I'm looking to buy a couple of these as soon as they are available for SLI, maybe 3 for a triple setup if possible, but I can see there being real issues with stock. A decent solution for 3 screens at 2560x1440 for sure - if you can get hold of them anywhere.
  • Ryan Smith - Thursday, February 21, 2013 - link

    Note that NVIDIA specifically shot down the 10K card rumor. As far as we've been advised and as best as we can tell, card availability will be similar to what we saw with the GTX 690. Which is to say tight at first, but generally available and will continue to be available.
  • Egg - Thursday, February 21, 2013 - link

    The chart on page 1 is missing a 'GB' under GTX Titan's VRAM listing. There aren't any 5760*1200 non-GE 7970 benchmarks. Also, on the Power, Temperature, and Noise page, "temperate" should be "temperature" just before the first chart.

    Additionally, the voltage issue HollyDOL and the strange Crysis Warhead 1080p E Shader/G Quality issue silverblue mentioned should be clarified as well. (I'm just repeating them here so they have a higher chance of being seen.)

    Also, Wolfram|Alpha interprets "gigaflops" as "billion floating point operations per second" by default, while offering an alternative interpretation that doesn't have the seconds unit. Wikipedia also defines flops as already having the time unit. By their standards, flops/s is technically incorrect. I'm not a scientist, and I actually didn't notice this until I typed gigaflops into Wolfram|Alpha, so take this for what little it's worth.

    It's silly to suggest that this card needs a voltmod and a waterblock. Very few people doing scientific work are going to have time for that. This card isn't intended as a gaming card. Yes, there will undoubtedly be people on hwbot who would love to do such a thing, but relative to the population of scientists living on meager grants, they're few.

    It's also silly to say that Titan is a bad card because it isn't as efficient as other cards at password hashing or bitcoin mining. These embarrassingly parallel workloads aren't representative of scientific workloads. Besides, the most dedicated people have custom FPGAs or ASICs for those workloads.

    Saying that it shows Nvidia jacking up prices on its flagship is misleading. Yes, it's technically true. But I saw someone say that the GTX 680 was only a "midrange" card. The GTX 680 still competes with the Radeon 7970 GE. It isn't outright winning anymore - in certain games, it loses - and it's often substantially more expensive. But it's still reasonably competitive. Why did anyone expect Titan to push down GTX 680 prices? If anything, it might push down Tesla 20X prices, but I'm not holding my breath.
    Would anyone have complained about Nvidia being outrageously greedy if Titan didn't exist in the consumer space at all?

    (Moreover, the GTX 580 had FP64 performance at 1/8 FP32 performance, not Titan's 1/3. (http://www.anandtech.com/show/4008/nvidias-geforce...

    Simply looking at the specs partially explains why the card is so damn expensive. It's 7.1 billion transistors, compared to the GTX 690's 2*3.5 billion transistors. (Page 1 on this article). Going purely by transistor count, Titan is underpriced, because it's just as expensive as the GTX 690. Looking at die area instead is less forgiving, but don't forget that squeezing 7 billion transistors on a single die is more difficult than having two 3.5 billion transistor dies. Titan also has 2 extra gigabytes of GDDR5.

    The only valid criticism I've seen is that Titan can be outperformed by two 7970 GEs in certain, mostly FP32 compute workloads, which are a cheaper solution, especially for scientists who probably aren't as concerned with heat production as those working on the Titan supercomputer. After all, you can fit bigger fans in an EATX case than in most racks. 689 Gflops is still greater than 50% of 1309 Gflops; it's 53%. When you can find the cheapest 7970 GEs at a bit over $400, two 7970s will be about $200 cheaper.
    But figure in power: http://www.wolframalpha.com/input/?i=200+W+*+1+yea... . After a year of continuous usage (or two years of 50% utilization), and assuming that two 7970 GEs will use 200 more watts than a Titan (a fairly reasonable estimate in my opinion), Wolfram|Alpha informs us that we'll have saved $216; the arithmetic is worked out after this comment.
    Not to mention the fact that two 7970s occupy around twice as much space as a Titan. That means you need more individual systems if you're looking to scale beyond a single workstation.
    And finally, anyone who needs as much memory per GPU as they can get will need Titan.
    It's hard to draw any real conclusions right now, though, with DirectCompute dubious and OpenCL broken. Great work on Nvidia's part, getting the drivers working...

    There's also the fact that Nvidia is marketing this as a gaming card, which is disingenuous and poor practice. But come on, we all read Anandtech for a reason. Overhyped marketing is nothing new in technology.

    So in conclusion: treat the GTX 680 as the flagship single-GPU consumer card. (They did call it a 680; see the GTX 580, 480, and 280.) It's roughly in the 7970GE's ballpark when it comes to price and performance. For gamers, Titan can effectively be ignored.
    If you need FP32 compute performance, consider multiple 7970 GEs as well as Titan.
    If you need FP64 compute performance, Titan is unparalleled, assuming you run it for a decent amount of time.
    And if you're trying to set a world record, well, I guess you can pay through the nose for Titan too.
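    (For reference, the electricity arithmetic above works out as follows, assuming a rate of roughly $0.12/kWh; the exact rate is an assumption, and the linked query presumably bakes in something similar:

    200 W x 8,760 h/year = 1,752 kWh; 1,752 kWh x $0.123/kWh ≈ $216.)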
  • Insomniator - Thursday, February 21, 2013 - link

    Thank you. So many here just sound like butthurt kids who do not understand these concepts or maybe didn't even read the article. Few of them would buy it at the $700 they cry about wanting it to be.

    This card is not just for gamers, and even if it were, performance-wise it crushes the next-closest single-GPU competitor. Remember when Intel EE editions were $1K? The best always costs extra... and in this case the card isn't even being marketed solely at gamers anyway.

    Until AMD puts out a new card that can beat it for less, this will remain a $1K card. Until then, the 680, 670, and 660 are all competitive products.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Don't expect the crybaby fools to respond. They'd prefer to pretend your post isn't here.

    If they do say anything, it will just be another repetitious pile of tinfoil-hat lies Charlie D would be proud of.
  • Olaf van der Spek - Thursday, February 21, 2013 - link

    Still only average framerates? :(
    I had hoped you'd move to minimum framerate / max frametime based benchmarking. Averages are (and were) kinda meaningless.
  • Ryan Smith - Thursday, February 21, 2013 - link

    Actually we have some FRAPS data for a few of our games as a trial of some ideas. Unfortunately you won't see it for this article as there simply wasn't enough time to put that together on top of everything else. But keep your eyes peeled.
  • GiantPandaMan - Thursday, February 21, 2013 - link

    The Titan was a compute part, first and foremost. Gamers have much better alternatives in the 7970/680 route.

    Personally I think it's a pretty impressive piece of hardware, though there's no way in hell I'd ever buy it. That's because I'm a value oriented buyer and I don't have that much disposable income.

    I just don't get all the indignation and outrage. It's not like nVidia screwed you over in some way. They had an expensive piece of hardware designed for compute and said to themselves, what the hell, why not release it for gamers?
  • chizow - Thursday, February 21, 2013 - link

    You must not have followed the development of GPUs, and particularly flagship GPUs very closely in the last decade or so.

    G80, the first "Compute GPGPU" as Nvidia put it, was first and foremost a graphics part and a kickass one at that. Each flagship GPU after, GT200, GT200b, GF100, GF110 have continued in this vein...driven by the desktop graphics market first, Tesla/compute market second. Hell, the Tesla business did not even exist until the GeForceTesla200. Jensen Huang, Nvidia's CEO, even got on stage likening his GPUs to superheroes with day jobs as graphics cards while transforming into supercomputers at night.

    Now Nvidia flips the script, holds back the flagship GPU from the gaming market that *MADE IT POSSIBLE* and wants to charge you $1K because it's got "SuperComputer Guts"??? That's bait and switch, stab in the back, whatever you want to call it. So yes, if you were actually in this market before, Nvidia has screwed you over to the tune of $1K for something that used to cost $500-$650 max.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    You only spend at max $360 for a video card as you stated, so this doesn't affect you and you haven't been screwed.

    Grow up, crybaby. A company may charge what it desires, and since you're never buying, who cares how many times you scream that they screwed everyone?
    NO ONE CARES, not even you, since you never even pony up $500, as you yourself stated in this long, continuous crybaby whine you made here, and have been making, since the 680 was released, or rather, since Charlie fried your brain with his propaganda.

    Go get your 98 cent a gallon gasoline while you're at it, you fool.
  • chizow - Saturday, February 23, 2013 - link

    Uh no, I've spent over $1K in a single GPU purchasing transaction, have you? I didn't think so.

    I'm just unwilling to spend *$2K* for what cost $1K in the past, for less than the expected increase in performance. I spent $700 this round instead of the usual $1K because that's all I was willing to pay for a mid-range ASIC in GK104, and while it was still a significant upgrade over my last set of $1K worth of graphics cards, I wasn't going to plunk down $1K for a set of mid-range GK104 GTX 680s.

    It's obvious you have never bought in this range of GPUs in the past, otherwise you wouldn't be posting such retarded replies about what is clearly usurious pricing by Nvidia.

    Now go away, idiot.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Wrong again, as usual.
    So what it boils down to is you're a cheapskate, still disgruntled, still believing Charlie D's lie, and angry you won't have the current top card at a price you demand.
    I saw your whole griping list in the other thread too, but nothing you purchase or don't purchase makes a single bit of difference when it comes to the insane tinfoil-hat lies you have used for your entire argument.

    Once again, pretending you aren't aware of production capacity leaves you right where your brainless rant started a long time ago.

    You cover your tracks whining about ATI's initial price, which wasn't out of line either, and ignore nVidia's immediate crushing of it when the 680 came out, as you still complained about the performance increase there. You're a crybaby, that's it.

    That's what you have done now for months on end: whined and whined and whined, and got caught over and over in exaggerations and lies, demanding a perfectly increasing price/perf line slanting upwards for years on end, lying about its past, which I caught you on in the earlier reviews.

    Well dummy, that's not how performance/price increases work in any area of computer parts, anyway.
    Glad you're just another freaking parrot, as the reviewers have trained you fools to automaton levels.
  • Pontius - Thursday, February 21, 2013 - link

    My only interest at the moment is OpenCL compute performance. Sad to see it's not working at the moment, but once they get the kinks worked out, I would really love to see some benchmarks.

    Also, as any GPGPU programmer knows, the number one bottleneck for GPU computing is randomly accessing memory. If you are working only within the on-chip local memory, then yes, you get blazingly fast speeds on a GPU. However, the second you do something as simple as a += on a global memory location, your performance grinds to a screeching halt. I would really like to see the performance of these cards on random memory heavy OpenCL benchmarks. Thanks for the review!
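
    (To make the global-vs-local memory point concrete, here is a rough sketch, written in CUDA only because OpenCL is broken in Titan's launch drivers; the kernel names and sizes are illustrative assumptions, not anything from the review. The first kernel does its "+=" straight into one global location; the second accumulates in fast on-chip shared memory first:)

        // Naive: every thread's "+=" lands on a single global address.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void naive_sum(const float* in, float* out, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n)
                atomicAdd(out, in[i]);  // global-memory "+=": these serialize, the slow path described above
        }

        // Better: reduce within on-chip shared memory, one global atomic per block.
        __global__ void shared_sum(const float* in, float* out, int n) {
            __shared__ float partial[256];
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            partial[threadIdx.x] = (i < n) ? in[i] : 0.0f;
            __syncthreads();
            for (int s = blockDim.x / 2; s > 0; s >>= 1) {  // tree reduction (block size is a power of two)
                if (threadIdx.x < s)
                    partial[threadIdx.x] += partial[threadIdx.x + s];
                __syncthreads();
            }
            if (threadIdx.x == 0)
                atomicAdd(out, partial[0]);
        }

        int main() {
            const int n = 1 << 20;
            float *in, *out;
            cudaMalloc(&in, n * sizeof(float));
            cudaMalloc(&out, sizeof(float));
            cudaMemset(in, 0, n * sizeof(float));
            cudaMemset(out, 0, sizeof(float));
            naive_sum<<<n / 256, 256>>>(in, out, n);   // time each kernel separately in practice
            shared_sum<<<n / 256, 256>>>(in, out, n);
            cudaDeviceSynchronize();
            printf("%s\n", cudaGetErrorString(cudaGetLastError()));
            cudaFree(in); cudaFree(out);
            return 0;
        }

    On most GPUs the second version is dramatically faster, for exactly the reason described above: the per-thread global atomics serialize, while the shared-memory version pays for only one global atomic per block.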
  • codedivine - Thursday, February 21, 2013 - link

    We may do this in the future if I get some time off from univ work. Stay tuned :)
  • Pontius - Thursday, February 21, 2013 - link

    Thanks codedivine, I'll keep an eye out.
  • Bat123Man - Thursday, February 21, 2013 - link

    The Titan is nothing more than a proof-of-concept; "Look what we can do! Whohoo! Souped up to the max!" Nvidia is not intending this card to be for everyone. They know it will be picked up by a few well-moneyed enthusiasts, but it is really just a science project so that when people think about "the fastest GPU on the market", they think Nvidia.

    How often do you guys buy the best of the best as soon as it is out the door anyway ? $1000, $2000, it makes no difference, most of us wouldn't buy it even at 500 bucks. This is all about bragging rights, pure and simple.
  • Oxford Guy - Thursday, February 21, 2013 - link

    Not exactly. The chip isn't fully enabled.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    I really don't understand that mentality you have. I'm surrounded by thousands of dollars of computer parts and I certainly don't consider myself some sort of hardware enthusiast or addicted overclocker, or insane gamer.

    Yet this card is easily a consideration, since several other systems have far more than a thousand dollars in them on just the basics. It's very easy to spend a couple thousand even being careful.

    I don't get what the big deal is. The current crop of top end cards before this are starkly inadequate at common monitor resolutions.
    One must nearly ALWAYS turn down features in the popular benched games to be able to play.

    People just don't seem to understand that I guess. I have untold thousands of dollars in many computers and the only thing that will make them really gaming capable at cheap monitor resolutions is a card like this.

    Cripes my smartphone cost a lot more than the former top two cards just below Titan.

    This is the one area that comes to mind (the only one that exists as far as I can tell) where the user is left with "my modern computer can't do it" - and that means, take any current taxing game (lots of those - let's say 50% of those reviewed as a rough rule of thumb) and you're stuck unable to crank it up.

    Now 120hz monitors are becoming common, so this issue is increased.
    As you may have noticed, another poster exclaimed:
    " Finally ! 1920x1080 a card that can do it ! "

    That's about as close to the flat-out truth as it gets, and I agree with it entirely, at least for this moment. As I stated here before, the 7970 didn't do it when it was released, doesn't now, and won't ever (neither does the 680).

    I'm trying to deny it, but really it is already clear that the Titan doesn't cut it for everything at the above rez either, not really, and not at higher refresh rates.

    More is still needed, and this is the spot that is lacking for gamers, the video card.

    This card is the card to have, and it's not about bragging, it's about firing up your games and not being confronted with the depressing "turn off the eyecandy" and check the performance again... see if that is playable...

    I mean ****, that apparently does not bother any of you, and I do not know why.
    Everything else in your system is capable...
    This is an IMPORTANT PART that actually completes the package, where the end user isn't compromising.
  • HighTech4US - Thursday, February 21, 2013 - link

    If it does, could we see a new story on performance using NVENC across the entire Kepler line, along with any FREEware/PAYware software that utilizes it? I have an older Intel Q8300 that is used as my HTPC/living room gaming system, and encoding videos takes a long time using just the CPU cores.

    If getting a Kepler GPU and using NVENC can speed up encoding significantly I would like to know. As that would be the lowest cost upgrade along with getting a Gaming card upgrade.

    Thanks
  • Ryan Smith - Thursday, February 21, 2013 - link

    Yes, NVEnc is present.
  • lkuzmanov - Thursday, February 21, 2013 - link

    excellent! now make it 30-40% cheaper and I'm on board.
  • Zink - Thursday, February 21, 2013 - link

    Rahul Garg picked the lowest HD 7970 scores in both cases from the Matsumoto et al. paper. The other higher GFLOPS scores represent performance using alternate kernels performing the same calculation on the same hardware as far as I can tell. Rahul needs to justify choosing only the lowest HD 7970 numbers in his report or I can only assume he is tilting the numbers in favor of Titan.
  • JarredWalton - Thursday, February 21, 2013 - link

    Picking the highest-scoring results that use optimized kernels and run on different hardware in the first place (e.g. not the standard test bed) would be tilting the results very far in AMD's favor. A default run is basically what Titan gets to do, so the same for the 7970 would make sense.
  • codedivine - Thursday, February 21, 2013 - link

    The different algorithms are actually not performing the exact same calculation. There are differences in matrix layouts and memory allocations. We chose the ones that are closest to the layouts and allocations we were testing on the Titan.

    In the future, we intend to test with AMD's official OpenCL BLAS. While Matsumoto's numbers are good for illustrative purposes, we would prefer running our own benchmarks on our own testbeds, and on real-world code, which will typically use AMD's BLAS for AMD cards. AMD's OpenCL BLAS performance is actually a little bit lower than Matsumoto's numbers, so I don't think we tilted the numbers in AMD's favour. If anything, we gave AMD a bit of benefit-of-the-doubt here.

    In the same vein, faster results than Nvidia's CUBLAS have been demonstrated on Nvidia hardware. However, we chose to test only using CUBLAS as all production code will typically use CUBLAS due to its reliability and support from Nvidia.

    AMD's OpenCL BLAS is a bit complicated to setup correctly and in my research, I have had problems with stability with it on Windows. Thus, we avoided it in this particular review but we will likely look at it in the future.
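
    (For readers curious what "testing with CUBLAS" amounts to in practice, here is a minimal sketch of the kind of timed SGEMM call such a benchmark is built around; the matrix size, warm-up call, and event timing are my own assumptions, not AnandTech's actual harness:)

        // Minimal cuBLAS SGEMM timing sketch (v2 API, column-major storage).
        // Matrices are left uninitialized; contents don't matter for throughput.
        #include <cublas_v2.h>
        #include <cuda_runtime.h>
        #include <cstdio>

        int main() {
            const int n = 4096;  // one square size; a real benchmark sweeps this
            float *A, *B, *C;
            cudaMalloc(&A, (size_t)n * n * sizeof(float));
            cudaMalloc(&B, (size_t)n * n * sizeof(float));
            cudaMalloc(&C, (size_t)n * n * sizeof(float));

            cublasHandle_t handle;
            cublasCreate(&handle);
            const float alpha = 1.0f, beta = 0.0f;

            // Warm-up: C = alpha*A*B + beta*C
            cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                        &alpha, A, n, B, n, &beta, C, n);
            cudaDeviceSynchronize();

            cudaEvent_t t0, t1;
            cudaEventCreate(&t0); cudaEventCreate(&t1);
            cudaEventRecord(t0);
            cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                        &alpha, A, n, B, n, &beta, C, n);
            cudaEventRecord(t1);
            cudaEventSynchronize(t1);

            float ms = 0.0f;
            cudaEventElapsedTime(&ms, t0, t1);
            printf("SGEMM n=%d: %.0f GFLOPS\n", n, 2.0 * n * (double)n * n / (ms * 1e6));

            cublasDestroy(handle);
            cudaFree(A); cudaFree(B); cudaFree(C);
            return 0;
        }

    A DGEMM run swaps in cublasDgemm and double, which is where Titan's 1/3-rate FP64 (vs. 1/24 rate on GTX 680) shows up.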
  • Zink - Thursday, February 21, 2013 - link

    Thanks, shouldn't have doubted you :)
  • Nfarce - Thursday, February 21, 2013 - link

    ...about my 680 purchase last April (nearly a year ago already, wow). Was so worried I made the wrong decision replacing two 570s knowing the Kepler was less than a year away. The news on this card has firmed up my decision to lock in with a second 680 now for moving up to a 2560x1440 monitor.

    Very *very* disappointing, Nvidia.
  • CeriseCogburn - Thursday, February 21, 2013 - link

    The new top card has been near the same as two of the former cards FOREVER.

    You people are nothing short of stupid nut jobs.

    There are not enough tampons at Johnson and Johnson warehouses for this thread.

    THE VERY SAME RATIO has occurred every time for all the prior launches.
  • chizow - Friday, February 22, 2013 - link

    Idiot...has the top end card cost 2x as much every time? Of course not!!! Or we'd be paying $100K for GPUs!!!
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Stop being an IDIOT.

    What is the cost of the 7970 now, vs what I paid for it at release, you insane gasbag ?
    You seem to have a brainfart embedded in your cranium, maybe you should go propose to Charlie D.
  • chizow - Saturday, February 23, 2013 - link

    It's even cheaper than it was at launch, $380 vs. $550, which is the natural progression....parts at a certain performance level get CHEAPER as new parts are introduced to the market. That's called progress. Otherwise there would be NO INCENTIVE to *upgrade* (look this word up please, it has meaning).

    You will not pay the same money for the same performance unless the part breaks down, and semiconductors under normal usage have proven to be extremely durable components. People expect progress, *more* performance at the same price points. People will not pay increasing prices for things that are not essential to life (like gas, food, shelter); this is called price inelasticity of demand.

    This is a basic lesson in business, marketing, and economics applied to the semiconductor/electronics industry. You obviously have no formal training in any of the above disciplines, so please stop commenting like a ranting and raving idiot about concepts you clearly do not understand.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    They're ALREADY SOLD OUT STUPID IDIOT THEORIST.

    LOL

    The true loser, an idiot fool, wrong before he's done typing, the "education" is his brainwashed fried gourd Charlie D OWNZ.
  • chizow - Sunday, February 24, 2013 - link

    And? There's going to be some demand for this card just as there was demand for the 690, it's just going to be much lower based on the price tag than previous high-end cards. I never claimed anything otherwise.

    I outlined the expectations, economics, and buying decisions in general for the tech industry and in general, they hold true. Just look around and you'll get plenty of confirmation where people (like me) who previously bought 1, 2, 3 of these $500-650 GPUs are opting to pass on a single Titanic at $1000.

    Nvidia's introduction of an "ultra-premium" range is an unsustainable business model because it assumes Nvidia will be able to sustain this massive performance lead over AMD. Not to mention they will have a harder time justifying the price if their own next-gen offering isn't convincingly faster.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    You're not the nVidia CEO nor their bean counter, you whacked out fool.

    You're the IDIOT that babbles out stupid concepts with words like "justifying", as you purport to be an nVidia marketing hired expert.

    You're not. You're a disgruntled indoctrinated crybaby who can't move on with the times, living in a false past, and waiting for a future not here yet.
  • Oxford Guy - Thursday, February 21, 2013 - link

    The article's first page has the word luxury appearing five times. The blurb, which I read prior to reading the article's first page has luxury appearing twice.

    That is 7 uses of the word in just a bit over one page.

    Let me guess... it's a luxury product?
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    It's stupid if you ask me. But that's this place, not very nVidia friendly after the little "didn't get the new 98xx" fiasco, just like Tom's.

    A lot of these top tier cards are a luxury, not just the Titan, as one can get by with far less. The problem is, the $500 cards often fail at 1920x resolutions, and this one can perhaps be said to have conquered just that, so here we have a "luxury product" that really can't do its job entirely, or let's just say barely, maybe, as 1920x is not a luxury resolution.
    You turn OFF and down SOME in-game features, and that's generally, not just in extreme cases.

    People are fools though, almost all the time. Thus we have this crazed "reviews" outlook distortion, and certainly no such thing as Never Settle.
    We're ALWAYS settling when it comes to video card power.
  • araczynski - Thursday, February 21, 2013 - link

    too bad there's not a single game benchmark in that whole article that I give 2 squirts about. throw in some RPG's please, like witcher/skyrim.
  • Ryan Smith - Thursday, February 21, 2013 - link

    We did test Skyrim only to ultimately pass on it for a benchmark. The problem with Skyrim (and RPGs in general) is that they're typically CPU limited. In this case our charts would be nothing but bar after bar at roughly 90fps, which wouldn't tell us anything meaningful about the GPU.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Here you are arac, some places can do things this place claims it cannot.

    See the massive spanking amd suffers.

    http://www.bit-tech.net/hardware/2013/02/21/nvidia...

    That's beyond a 40% lead for the nvidia Titan above and beyond the amd flagship. LOL

    No problem. No cpu limited crap. I guess some places know how to test.

    TITAN 110 min 156 max
    7970ghz 72 min 94 max
  • TheJian - Sunday, February 24, 2013 - link

    Jeez, I wish I had read your post before digging up my links. Yours is worse than mine making my point on skyrim even more valid.

    In your link the GTX670 takes out the 7970ghz even at 2560x1200. I thought all these dumb NV cards were bandwidth limited ;) Clear separation on all cards in this "cpu limited" benchmark on ALL resolutions.

    Hold on, let me wrap my head around this...So with your site, and my 3 links to skyrim benchmarks in my posts (one of them right here at anandtech telling how to add gfx, their 7970ghz article), 3/4 of them showing separations according to their GPU class...Doesn't that mean they are NOT cpu bound? Am I missing something here? :) Are you wondering if Ryan benched skyrim with the hi-res pack after it came out, found it got smacked around by NV, and dropped it? I mean he's claiming he tested it right above your post and found skyrim cpu limited. Is he claiming he didn't think adding an official HI-RES PACK would add graphical slowdowns? This isn't a LOW-RES pack, right?

    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    Isn't that Ryan's article:
    "We may have to look at running extra graphics effects (e.g. TrSSAA/AAA) to thin the herd in the future."...Yep I think that's his point. PUT IN THE FREAKIN PACK. Because Skyrim didn't just become worthless as a benchmark as TONS are playing it, unlike Crysis Warhead and Dirt Showdown. Which you can feel free to check the server link I gave, nobody playing Warhead today either. I don't think anyone ever played Showdown to begin with (unlike warhead which actually was fun in circa 2008).

    http://www.vgchartz.com/game/23202/crysis-warhead/
    Global sales .01mil...That's a decimal point right?
    http://www.vgchartz.com/game/70754/dirt-showdown/
    It hasn't reached enough sales to post the decimal point. Heck xbox360 only sold 140K units globally. Meanwhile:
    http://www.vgchartz.com/game/49111/the-elder-scrol...
    2.75million sold (that's not a decimal any more)! Which one should be in the new game suite? Mods and ratings are keeping this game relevant for a long time to come. That's the PC sales ONLY (which is all we're counting here anyway).
    http://elderscrolls.wikia.com/wiki/Official_Add-on...
    The high-res patch is an OFFICIAL addon. Can't see why it's wrong to benchmark what EVERYONE would download to check out that bought the game, released feb 2012. Heck benchmark dawnguard or something. It came Aug 2012. I'm pretty sure it's still selling and being played. PCper, techpowerup, anandtech's review of the 7970ghz and now this bit-tech.net site. Skyrim's not worth benching but all 4 links show what to do (up the gfx!) and results come through fine and 3 sites show NV winning (your site of course the one of the four that ignores the game - hmm, sort of shows my bias comment doesn't it?). No cpu limit at 3 other sites who installed the OFFICIAL pack I guess, but you can't be bothered to test a HI-RES pack that surely stresses a gpu harder than without? What are we supposed to believe here?

    Looks like you may have a point Cerise.
    Thanks for the link BTW:
    http://www.bit-tech.net/hardware/2013/02/21/nvidia...
    You can consider witcher 2 added as a 15th benchmarkable game you left out, Ryan. Just wish they'd turn on ubersampling, as mins are ~55 for titan here even at 2560x1600. Clearly with it on this would be a NON cpu-limited game too (it isn't cpu limited even with it off). Please refrain from benchmarking games with fewer than 100K units in sales. By definition that means nobody is playing them OR buying them, right? And further we can extrapolate that nobody cares about their performance. Can anyone explain why skyrim with hires (and an addon that came after) is excluded, but TWO games with basically ZERO sales are in here as important games that will be hanging with us for a few years?
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Yes, appreciate it thanks, and your links I'll be checking out now.

    They already floated the poster vote article for the new game bench lineup, and what was settled upon already was Never Settle heavily flavored, so don't expect anything but the same or worse here.
    That's how it goes: there's a lot of pressure, PC populism, that great 2-week yearly vacation, and certainly the attempt to prop up a dying amd ship that "enables" this whole branch of competition for review sites is not ignored. A hand up, a hand out, give 'em a hand!
    lol

    Did you see where Wiz there at TPU in Titan review mentioned nVidia SLI plays 18 of 19 in house game tests and amd CF fails on 6 of them... currently fails on 6 of 19.

    " NVIDIA has done a very good job here in the past, and out of the 19 games in our test suite, SLI only fails in F1 2012. Compare that to 6 out of 19 failed titles with AMD CrossFire. "
    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...

    So the amd fanboys have a real problem recommending 79xx, or 7xxx or 6xxx doubled or tripled up, as an alternative with equal or better cost and "some performance wins" when THIRTY THREE PERCENT OF THE TIME AMD CF FAILS.

    I'm sorry, I was supposed to lie about that and claim all of amd's driver issues are behind it and it's all equal and amd used to have problems and blah blah blah the green troll company has driver issues too and blah blah blah...
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Oh man, investigative reporting....lol

    " http://www.vgchartz.com/game/23202/crysis-warhead/
    Global sales .01mil...That's a decimal point right?
    http://www.vgchartz.com/game/70754/dirt-showdown/
    It hasn't reached enough sales to post the decimal point. Heck xbox360 only sold 140K units globally. Meanwhile:
    http://www.vgchartz.com/game/49111/the-elder-scrol...
    2.75million sold (that's not a decimal any more)! Which one should be in the new game suite? "

    Well it's just a mad, mad, amd world ain't it.

    You have a MASSIVE point there.

    Excellent link, that's a bookmark.
  • Zingam - Thursday, February 21, 2013 - link

    GeForce Titan "That means 1/3 FP32 performance, or roughly 1.3TFLOPS"
    Playstation 4 "High-end PC GPU (also built by AMD), delivering 1.84TFLOPS of performance"

    Can somebody explain to me how the above could be? The GeForce Titan, a $999 graphics card, has much less performance than what would be delivered by basically (if I understand properly) an APU by AMD in a $500 full system??? I doubt that Sony will accept a $1000 or more loss, but what I find even more doubtful is that an APU could have that much performance.

    Please, somebody clarify!
  • chizow - Thursday, February 21, 2013 - link

    1/3 FP32 is the double-precision FP64 throughput for Titanic. The PS4 must be quoting single-precision FP32 throughput, and 1.84TFlops is nothing impressive in that regard. I believe GT200/RV770 were producing numbers in that range for single-precision FLOPs.
  • Blazorthon - Thursday, February 21, 2013 - link

    You are correct about PS4 quoting single precision and such, but I'm sure that you're wrong about GT200 being anywhere near 1.8TFLOPS in single precision. That number is right around the Radeon 7850.
  • chizow - Saturday, February 23, 2013 - link

    GT200 was around 1TFlop. I was confused because the same-gen cards (RV770) were in the 1.2-1.3TFLOP range due to AMD's somewhat overstated VLIW5 theoretical peak numbers. Cypress, for example, was ~2.7TFlops, so I wasn't too far off the mark in quoted TFLOPs.

    But yes if PS4 is GCN the performance would be closer to a 7850 in an apples to apples comparison.
  • frogger4 - Thursday, February 21, 2013 - link

    Yep, the quoted number for the PS4 is the single-precision performance. It's just over the single-precision FP32 rate of the HD7850 at 1.76TFLOPS, and it has two more compute units, so that makes sense. The double precision for Pitcairn GPUs is 1/16th of that.

    The single precision performance for the Titan is (more than) three times the 1.3Tflop double precision number. Hope that clears it up!
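
    (In back-of-the-envelope form, all of these single-precision figures come from the same formula; the ALU counts and clocks below are the commonly cited specs, with the PS4 numbers being what was reported at the time:

        FP32 peak = shader ALUs x 2 FLOPs/clock (multiply-add) x clock

        HD 7850: 1024 x 2 x 0.86 GHz  = ~1.76 TFLOPS
        PS4 GPU: 1152 x 2 x 0.80 GHz  = ~1.84 TFLOPS
        Titan:   2688 x 2 x 0.837 GHz = ~4.5 TFLOPS FP32

    Titan's FP64 rate is 1/3 of that, ~1.5 TFLOPS at base clock; NVIDIA's ~1.3TFLOPS figure is lower because the card runs reduced clocks when full-speed FP64 is enabled.)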
  • StealthGhost - Thursday, February 21, 2013 - link

    Why are the settings/resolution used for, at least Battlefield 3, not consistent with those used in previous tests on GPUs, most directly those in Bench? Makes it harder to compare.

    Bench is such a great tool, it should be constantly updated and completely relevant, not discarded like it seems to be with these tests.
  • JeBarr - Thursday, February 21, 2013 - link

    I would guess because as time goes by the reviewers here (and elsewhere) think they need to bench at settings used by the "majority", even when that majority doesn't frequent, or even know the existence of, Anandtech.com. Go figure.

    I don't like it any more than you do...but for different reasons.

    I for one was happy to have a review site still benching at 16:10...which is what the long-time hardware enthusiasts/gamers prefer, that is, when they can't find a good CRT monitor ;)

    Just think of this review as the new bench standard going forward. A new starting point, if you will.
  • Ryan Smith - Monday, February 25, 2013 - link

    Bench 2013 will be going live soon. The backend is done (it's what I used to store and generate the charts here), but the frontend is part of a larger project...

    As for why the settings change, when we refresh our suite we sometimes change our settings to match what the latest generation of cards can do. When Titan sets the high bar for example, running 2560 at Ultra with 4xMSAA is actually practical.
  • TheJian - Thursday, February 21, 2013 - link

    NO Borderlands 2 (~6 million copies sold, rated 89! not counting the addons, rated high also)
    No Diablo 3 (I hate the DRM but 10million+ sold, of course rated high, but not by users)
    No Guild Wars 2 (MMO with 3million copies sold, rated 90!) Even WOW Mists of Pandaria has 3million or so now, and 11 million playing the game's total content. I don't play WOW but it's still got a TON of users.
    No Assassin's Creed 3 (brings 680/7970 to low 30's at 2560x1600)
    Crysis 3; warhead needs to die, and this needs to replace it (at the very LEAST). As shown below NOBODY is playing warhead. Wasted page space, and time spent benching it.

    Instead we get Crysis warhead...ROFL. Well, what can we expect? Ryan still loves AMD.
    http://www.gametracker.com/search/warhead/
    Notice all the empty servers? Go ahead, list them by players; only 3 had over 10!...Most are ZERO players...LOL...Why even waste your time benchmarking this ignored game? Just to show NV weakness?
    Dirt Showdown - Raise your hand if you play this...Nope, you're all playing Dirt3 (wisely, or F1 etc anything that rates better than showdown)
    User ratings on metacritic of 70/4.7 (out of TEN not 5) and best summarized by gamespy (rated it a 40/100 on the frontpage of the metacritic site: http://www.metacritic.com/game/pc/dirt-showdown
    "DiRT: Showdown delivers bargain-basement entertainment value for the high, high price of $50. With its neutered physics, limited driving venues, clunky multiplayer, and diminished off-road racing options, discerning arcade racing fans should just write this one off as an unanticipated pothole in Codemaster's trailblazing DiRT series. "
    If you're going to use a racing game, at least make it a good one, not just the one AMD wins in. Why not F1 2012 (scored 80 at metacritic/6.8 from users). AMD wins in warhead which is also why crysis warhead is chosen even though nobody plays it (it's from 2008!). Again check the server list, who are you testing this for? What does it represent today? What other game based on it's engine? It's representing nothing correct? Nobody plays showdown either.

    How about adding some games people actually PLAY. I thought the whole point of benchmarking is to show us how games WE PLAY will run, is that not true at anandtech?

    Also no discussion of the frame delay ala Techreport:
    http://techreport.com/review/24381/nvidia-geforce-...
    No discussion of the frame latency issues that AMD is working on game by game. Their current beta, I think, just fixed the skyrim/borderlands/guild wars 2 issues, which were awful.
    http://techreport.com/review/24218/a-driver-update...
    This has been an ongoing problem Anandtech (Ryan?) seems to just ignore. AMD is just getting to fixing this stuff in Jan...LOL. You can read more about it in the rematch of the 660TI/7950 here:
    http://techreport.com/review/23981/radeon-hd-7950-...
    Of course you can start at the beginning but this is where they recommend the 660TI and why (dec 2012 article).
    "The FPS average suggests near-parity performance between the 7950 and the GTX 660 Ti, with a tiny edge to the GeForce. The 99th percentile frame time, though, captures the impact of the Radeon's frame latency issues and suggests the GTX 660 Ti is easily the superior performer."
    More:
    "Instead, we have a crystal clear recommendation of the GeForce GTX 660 Ti over the Radeon HD 7950 for this winter's crop of blockbuster games. Perhaps AMD will smooth out some of the rough patches in later driver releases, but the games we've tested are already on the market—and Nvidia undeniably delivers the better experience in them, overall. "
    Even Tomshardware reports on delays now (albeit the wrong metric...LOL). Read the comments at techreport for why they're using the wrong one.
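
    (For reference, since the metric keeps getting argued past: average FPS and the 99th-percentile frame time both come from the same per-frame log that FRAPS or similar tools dump. A toy sketch of the arithmetic, assuming input of one frame time in milliseconds per line:)

        // Average FPS vs. 99th-percentile frame time from a frame-time dump.
        #include <algorithm>
        #include <cstdio>
        #include <vector>

        int main() {
            std::vector<double> ms;
            double t;
            while (std::scanf("%lf", &t) == 1) ms.push_back(t);
            if (ms.empty()) return 1;

            double total = 0;
            for (double v : ms) total += v;
            std::printf("average FPS: %.1f\n", 1000.0 * ms.size() / total);

            // 99% of frames finish at or under this time; spikes the average hides.
            std::sort(ms.begin(), ms.end());
            std::printf("99th percentile frame time: %.1f ms\n",
                        ms[(size_t)(0.99 * (ms.size() - 1))]);
            return 0;
        }

    A card can post a healthy average while the 99th-percentile number exposes the stutter, which is the whole point of that approach.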

    No wonder they left out the xmas blockbusters and diablo3 (which will still sell probably 15million over it's life even though I would never buy it). I can name other games that are hot and new also:
    Dishonored, Dead Space 3, Max Payne 3, all highly rated. Max Payne 3 barely hits 50's on top cards at 2560x1600 (7970ghz, 680 even lower), an excellent test game, and those are NOT the minimums (which can bring you to 20's/teens on lower cards). Witcher 2 (Witcher 3 is coming), with ubersampling ENABLED, is a taxer also.

    Dragon Age 2 at 2560x1600 will bring 7970/680 to teens/20's at minimums also, barely hits 40's avg (why use ONLY AVG at techspot I don't know, but better than maxes).
    http://www.techspot.com/review/603-best-graphics-c...

    START reporting MIN FPS for every game benched! There should be more discussion of the fact that in a lot of these games you hit teens for even $500 cards at 2560x1600 maxed out. Max fps means NOTHING. IF you hit 10-20fps a lot in a game your max means nothing. You won't want to play at that res, so what have you shown me? NOTHING. You should ALWAYS report MIN FPS as that dictates our gameplay experience and if it isn't always above 30 life sucks usually. Farcry 3 hits below 30 on both 680/7970 at 2560x1600.
    http://www.hardocp.com/article/2013/02/21/nvidia_g...
    And they don't have them on ULTRA; only titan is, and none are on 4xmsaa. At least they're giving the max details/res you can expect to play and what its min will be (better, you at least have USEFUL info after reading their benchmarks).

    From your article:
    "This is enough to get Titan to 74fps at 2560 with 4xMSAA, which is just fast enough to make BF3 playable at those settings with a single GPU."
    Why didn't you just report the minimums so we can see when ALL cards hit 30fps or less in all resolutions tested? If the game doesn't give a way to do this use fraps while running it (again, for ALL games). So it takes 74fps to get playable in BF3? It's easier to just give the minimums so people can see, otherwise are we supposed to attempt to extrapolate every one of your games without MINS listed? You did it for us in this sentence, but for ONE card and even then it's just a comment, not a number we can work with. It's YOU extrapolating your own guess that it would be playable given 74fps. What kind of benchmarking is this? I won't even get into your other comments throughout the articles on titan, It's more important to me to key on what you totally ignore that is VERY important to anyone picking ANY gpu. SMOOTHNESS of gameplay (latency testing) and MIN FPS so we know where we have no prayer of playing or what to expect playable on a given gpu. This is why Hardocp actually points to you guys as why your benchmarks suck. It's linked in most of their articles...LOL. FIX IT.
    http://www.hardocp.com/article/2008/02/11/benchmar...
    They have that in nearly every gpu article including the titan article. It's a valid point. But if you're not going to use IN GAME play, at least give min fps for canned etc. That link is in the test setup page of nearly every article on hardocp, you'd think you'd fix this so they'd stop. Your benchmarks represent something that doesn't reflect gameplay in most cases. The maxfps doesn't dictate fun factor. MIN does.

    One comment on Titan: I'd think about it at $800-850. Compute isn't important today at home for me, and won't be until more games use it like civ5 (they're just scratching the surface here). At that point this card could become a monster compared to the 690, without the heat, noise etc. One day it may be worth $1000 to me, but for now it's not worth more than $800 (to me; no SFF needed, no compute needed). I don't like any dual-chip cards or running multiple cards (see microstutter, latency delays etc), so once cheaper this would be tops on my list, but I don't usually spend over $360 on a card anyway...LOL. Most of the first run will go to boutique shops (20K first run I think). Maybe they'll drop it after that.

    LOL at anyone thinking the price sucks. Clearly you are NOT the target market. If your product sells out at a given price, you priced it right. That's good business, and actually you probably should have asked more if it's gone in hours. You can still run SLI Titans in SFF; what other card can do that? You always pay a premium for the TOP card. Intel's extreme chips are $1000 too...No surprise. Same thing on the pro side is $2500 and not much different. It's 20% slower than the 690, but the 690 can't go into SFF for the most part and certainly isn't as quiet or controllable. Also blows away the 690 in compute if someone is after that. Though they need APPS that test this, not some homemade anandtech benchmark. How about testing something I can actually USE that is relevant (no I don't count folding@home or bitcoin mining either, they don't make me money - a few coins?...LOL).
  • JeBarr - Thursday, February 21, 2013 - link

    I'm pretty sure Ryan has mentioned the benches you want are forthcoming. Maybe they haven't figured it all out yet...i dunno....but like you, I've been waiting what seems like a year or more for Anandtech to catch up with reality in GPU benching.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Yes, well I've found Frame Rate Target to be an absolute GEM in this area:

    " START reporting MIN FPS for every game benched! There should be more discussion of the fact that in a lot of these games you hit teens for even $500 cards at 2560x1600 maxed out. Max fps means NOTHING. IF you hit 10-20fps a lot in a game your max means nothing. "

    If you crank to max settings and then have frame-drop issues, FRAME RATE TARGET, by nVidia of course, is excellent for minimizing or eliminating that issue.
    It really is a great and usable feature, and of course it is for the most part now already completely ignored.

    It was ported back to at least the top 500-series cards (I don't remember exactly which ones right now), but that feature should have an entire article dedicated to it at every review site. It is AWESOME, and directly impacts minimum frame rates, lofting nVidia to absolutely playable vs amd.

    I really think the bias won't ever be overcome. We used to hear nothing but eyefinity, yet now, with nvidia cards capable of 4 monitors out of the box, it has suddenly become very unpopular for reviewers to mention eyefinity, surround, and surround plus ONE MORE in the nVidia case, without the need for any special adapters in many of nVidia's partners' card releases.

    So, it's really a sick situation.
  • Urbanos - Friday, February 22, 2013 - link

    he went through all the trouble of benchmarking in order to show the entry points budget-conscious users can reach with Titan, but it doesn't actually prove that Titan is even worth the money without comparing it to at least 1 of its bigger competitors in the GPGPU market. Can you please consider adding that, or doing a new review based on compute only?
  • codedivine - Friday, February 22, 2013 - link

    I am certainly interested in looking at the Xeon Phi if I can find the time and if we can arrange the resources to do so.

    My performance expectation (based on Intel white papers) is about 1700-1800 GFlops for SGEMM and 800-900 GFlops for DGEMM on the Xeon Phi 5110P. However, there are also a few benchmarks where I am expecting them to win as well, thanks to the large cache on the Phi. Stay tuned.
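
    (For reference, the 5110P's theoretical peak works out to 60 cores x 1.053 GHz x 16 double-precision FLOPs per cycle, roughly 1.01 TFLOPS DP, and double that in single precision, so the SGEMM/DGEMM expectations above amount to roughly 80-90% of peak.)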
  • Ryan Smith - Monday, February 25, 2013 - link

    This is really a consumer/prosumer level review, so the cards we're going to judge it against need to be comparable in price and intended audience. Not only can we not get some of those parts, but all of them cost many times more than Titan.

    If we were ever able to review K20, then they would be exactly the kinds of parts we'd try to include though.
  • kivig - Friday, February 22, 2013 - link

    There is a whole community of 3D people interested.
    When will it get added to the bench table?
  • etriky - Saturday, February 23, 2013 - link

    +1
    Since this card at this price point is pointless for gaming, I figured the article would be heavy on compute applications in order to give us a reason for its existence.

    But then, nothing. No SmallLuxGpu or Cycles. Not even any commercial packages like Octane, or any of the Adobe products. I know LuxGPU and Blender used to be in the test suite. What happened?
  • etriky - Sunday, February 24, 2013 - link

    OK, after a little digging I guess I shouldn't be too upset about not having Blender benches in this review. Tesla K20 and GeForce GTX TITAN support was only added to Blender on 2/21 and requires a custom build (it's not in the main release). See http://www.miikahweb.com/en/blender/svn-logs/commi... for more info
  • Ryan Smith - Monday, February 25, 2013 - link

    As noted elsewhere, OpenCL was broken in the Titan launch drivers, greatly limiting what we could run. We have more planned including SLG's LuxMark, which we will publish an update for once the driver situation is resolved.
  • kukreknecmi - Friday, February 22, 2013 - link

    If you look at Azui's PDF, using a different type of kernel, the results for the 7970 are:

    SGEMM : 2646 GFLOP
    DGEMM : 848 GFLOP

    Why did you take the lowest numbers for the 7970?
  • codedivine - Friday, February 22, 2013 - link

    This was answered above. See one of my earlier comments.
  • gwolfman - Friday, February 22, 2013 - link

    ASUS: http://www.newegg.com/Product/Product.aspx?Item=N8...
    OR
    Titan gfx card category (only one shows up for now): http://www.newegg.com/Product/ProductList.aspx?Sub...

    Anand and staff, post this in your news feed please! ;)
  • extide - Friday, February 22, 2013 - link

    PLEASE start including Folding@home benchmarks!!!
  • TheJian - Sunday, February 24, 2013 - link

    Why? It can't make me any money and isn't a professional app. It tells us nothing. I'd rather see photoshop, premiere, some finite element analysis app, 3D Studio Max, some audio or content creation app, or anything that can be used to actually MAKE money. They should be testing some apps that are actually used by those this is aimed at (gamers who also make money on their PC but don't want to spend $2500-3500 on a full-fledged pro card).

    What does any card prove by winning folding@home (same with bitcoin crap, botnets get all that now anyway)? If I cure cancer is someone going to pay me for running up my electric bill? NOPE. Only a fool would spend a grand to donate electricity (cpu/gpu cycles) to someone else's next Billion dollar profit machine (insert pill name here). I don't care if I get cancer, I won't be donating any of my cpu time to crap like this. Benchmarking this proves nothing on a home card. It's like testing to see how fast I can spin my car tires while the wheels are off the ground. There is no point in winning that contest vs some other car.

    "If we better understand protein misfolding we can design drugs and therapies to combat these illnesses."
    Straight from their site...Great, I'll make them a billionaire drug and get nothing for my trouble or my bill. FAH has to be the biggest sucker pitch I've ever seen. Drug companies already rip me off every time I buy a bottle of their pills. They get huge tax breaks on my dime too, no need to help them, or for me to find out how fast I can help them...LOL. No point in telling me synthetics either. They prove nothing other than that your stuff is operating correctly and your drivers are set up right. Their perf has no effect on REAL use of products as they are NOT a product, thus not REAL world. Every time I see the word synthetic and benchmark in the same sentence it makes me want to vomit. If they are limited on time (usually reviewers are) I want to see something benchmarked that I can actually USE for real.

    I feel the same way about max fps. Who cares? You can include them, but leaving out MIN is just dumb. I need to know when a game hits 30fps or less, as that means I don't have a good enough card to get the job done and either need to spend more or turn things down if using X or Y card.
  • Ryan Smith - Monday, February 25, 2013 - link

    As noted elsewhere, FAHBench is in our plans. However, we cannot do anything further until NVIDIA fixes OpenCL support.
  • vanwazltoff - Friday, February 22, 2013 - link

    the 690, 680 and 7970 have had almost a year to brew and improve with driver updates. i suspect that after a few drivers and an overclock, titan will creep up on a 690 and will probably see a price reduction after a few months. dont clock out yet, just think what this could mean for 700 and 800 series cards, its obvious nvidia can deliver
  • TheJian - Sunday, February 24, 2013 - link

    It already runs 1150+ everywhere. Most people hit around 1175 max stable OC on titan. Of course this may improve with aftermarket cooling solutions, but it looks like they hit 1175 or so around the world. And that does hit 690 perf, and in some cases it wins. In compute it's already a winner.

    If there is no die shrink on the next gens from either company, I don't expect much. You can only do so much with 250-300w before needing a shrink to really see improvements. I really wish they'd just wait until 20nm or something to give us a real gain. Otherwise we'll end up with an Ivy/Haswell deal, where you don't get much (5-15%). Intel won't wow again until 14nm. Graphics won't wow again until the next shrink either (a full shrink, not the halves they're talking about now).
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Yes, and this is the core situation the radical chizow and others like him have chosen to completely ignore.

    Ivy is 22nm, and only 14nm now appears possible beyond it; channel widths are down to approximately 30 atoms, and electromigration/leakage hits a critical stage.

    So the slowdown has already occurred, Moore's law is broken (the deceleration has been occurring for a long time), and the reality is nearly upon us with the "largest possible" die at Titan's node.

    The number of atoms across the "electric wire channel" and the insulator beside it is down to countable on fingers and toes, and it appears there's nearly no place left to go.
    That's why we keep hearing about quantum computing dreams, and why shrinkage steps have become less beneficial approaching this wall.

    So, expect the crybabies to go a few notches higher into an ever more shrill pitch over the next couple of releases. It's coming, or rather it's here.
  • initialised - Friday, February 22, 2013 - link

    When are you guys going to start posting 4K performance for high end graphics?
  • iceman-sven - Friday, February 22, 2013 - link

    I am also wondering. Anandtech needs to get the Sharp PN-K321, fast. I will upgrade from my 2560x1600 to 4K in the next 12 months.

    I hope Anandtech does a rerun of some benchmarks with 4k and Titan SLI configurations. I am planning to buy 2 Titan for this.
  • Ryan Smith - Monday, February 25, 2013 - link

    When someone releases a suitable desktop monitor and we can acquire it on a long-term basis. Something like Sharp's 32-incher is the right resolution, but it really pushes the boundary for what can be called a "desktop" monitor.
  • ElminsterTCOM - Friday, February 22, 2013 - link

    I was wondering if you could pop this card into a Mac Pro and let us know if it is compatible? This would be a great card for 3D modeling!
  • Saxie81 - Friday, February 22, 2013 - link

    I'm wondering why the other websites that give reviews, benchmarks etc, have missed the mark with this card. Everywhere I look, they are posting nothing but game benchmarks, this is why I keep coming to Anandtech. This clearly is meant for more than that. I'm looking @ it for gaming and 3d rendering. I would have loved to have seen maybe Rendering times on a simple image in Blender etc, but the compute benchmarks gives a pretty good indication of what the Titan is capable of. Great article as always, Ryan, and welcome Rahul!
  • Zoeff - Friday, February 22, 2013 - link

    Looking at the Crysis 1080p at highest settings benchmark. I guess they're the wrong way around? :)
  • Ryan Smith - Monday, February 25, 2013 - link

    D'oh. Fixed.
  • realneil - Friday, February 22, 2013 - link

    Seems like whenever `anyone` releases the ~newest/best~ video card, they always price it at a grand. So this isn't surprising to me at all. How much were the Matrix cards from ASUS when they were new?

    I just can't see spending that much for it though. A pair of 680s or 7970s would get the job done for me.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    $800 or $900 is close enough to a grand that the distinction seems silly.

    Two 7970's at the $579 release price, which held for months, is nearer $1200, and we have endless amd fanboy braggarts here claiming they did the right thing and went for it, or certainly would, since future-proofing and value are supreme.

    Now not a single one has said in this entire near 20 pages of comments they'd love to see the FUTURE PROOF ! of the 6 GIGS of ram onboard...
    Shortly ago it was all we ever heard, the absolute reason the 79xx series MUST be purchased over the 600 nVidia series...

    ROFL - the bare naked biased stupidity is almost too much to bear.

    Now the "futureproof" amd cards the crybaby liars screeched must be purchased for the 3G of ram future wins, ARE LOSERS TO THIS NEW TITAN PERIOD, AND FOREVERMORE.

    I guess the "word" "futureproof" was banned from the techtard dictionary just before this article posted.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    Thank you nVidia, 3 monitors, and a 4th, ultra rezz 5,760 x 1,080, and playable maxxed !

    ROFL -

    Thank you all the little angry loser fanboys who never brought this up over 22 pages of ranting whines.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    " After a few hours of trial and error, we settled on a base of the boost curve of 9,80 MHz, resulting in a peak boost clock of a mighty 1,123MHz; a 12 per cent increase over the maximum boost clock of the card at stock. "
    http://www.bit-tech.net/hardware/2013/02/21/nvidia...
    That's 27mhz according to here...

    LOL

    Love this place.
  • TheJian - Sunday, February 24, 2013 - link

    Here's why they don't test more of the games I mentioned previously and others:
    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...
    Crysis 2, with the DX11 & HIRES packs added, @1920x1200: it beats 3 radeons...Note you have to go to a game NV doesn't care about (warhead) to show it losing so badly. C2 shows much closer to 2 or 3 radeons than warhead, which I don't think NV has spent a second on in probably 4 years.

    Page 11 has Diablo 3 scores.

    Page 4 for AC3
    Assassins Creed 3, beats 1,2 or 3 Radeon 7970's at all tested resolutions...ROFL
    http://techreport.com/review/24381/nvidia-geforce-...
    Showing the same 20fps diff at 2560x1600, and showing the same CF crapout, losing even to a single 7970, on both websites. Clearly AMD has per-game problems. Which they allude to on page 16 of the review:
    "Just know that waiting for driver updates to fix problems has become a time-honored tradition for owners of CrossFire rigs."

    techpowerup.com titan review page 5
    Batman Arkham City, same story...You see this is why SLI/CF isn't all it's cracked up to be...Every game needs work, and if company X doesn't do the work, well, AC3/Bat AC etc is what happens...Crysis 2 seems almost the same also.

    techpowerup.com titan article page 8
    COD Black ops2, 2 titans handily beat 1/2/3 7970's.

    techpowerup page 13:
    F1 2012...ROFL, 1 titan to rule all cards...1/2/3 CF or SLI all beaten by ONE CARD. It seems they limit the games here for a reason at anandtech...Always pitching how fast two 7970's are in this article vs a titan, even though they half recommend ONE titan but 'not at these prices, dual cards will always win'.
    ...ummm, I beg to differ. It should win, if drivers are done correctly, but as shown not always.

    Note at anandtech, dirt showdown shows 3% for NV Titan vs. 7970ghz, but if you run the FAR better Dirt3:
    http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
    It's a ~20% win for Titan vs. 7970ghz. Crappy showdown game picked for a reason?

    Wait we're not done...
    techpowerup titan review page 15
    Max Payne3, 1 titan closing on 2 or 3 radeon 7970ghz's no matter the res...Not always good to get more cards I guess?

    techpowerup.com page 18 for starcraft 2
    Oh, snap...This is why they don't bench Starcraft 2...ROFL...1, 2 or 3 7970's, all beaten by 1 titan.
    But then, even a GTX 680 beats 3 7970's in all resolutions here...Hmmm...But then this is why you dropped it right? You found out a 680 beat 7970ghz way back here, even the 670 beat 7970ghz:
    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    Totally explains why you came up with an excuse shortly after claiming a patch broke the benchmark. Most people would have just run with the patch version from a week earlier for the 660ti article. But as bad as 7970ghz lost to 670@1920x1200 it was clear the 660TI would beat it also...LOL. Haven't seen that benchmark since, just a comment it would be back in the future when patch allowed...NOPE. It must really suck for an AMD lover to have to cut out so many games from the new test suite.

    techpowerup.com titan review page 7
    Crap, someone benched Borderlands 2...LOL...Almost the same story: a titan looks good vs. 3 7970's (it only loses at 5760x1080, which a single card isn't really for anyway).
    Again, proving that adding more cards in some cases even goes backwards...LOL. It shouldn't, but you have to have the money to fix your drivers. Tough to do when you're cutting 30% of your workforce & losing 1.18B.

    techpowerup page 20 titan article has WOW mists of pandaria.
    Dang those techpowerup guys. They had the nerve to bench the most popular game in the world, WOW Mists of Pandaria...Oh snap, 1 titan beats 3 7970's again, at all res. OUCH, even a SINGLE 680 does it...See why they don't bench other games here, and seem to act as though we all play pointless crap like warhead and Dirt Showdown? Because if you bench a ton of today's hits (anything in 2012), except for a special few, you'll get results like techpowerup's.

    That's ELEVEN, count them, 11 situations that kind of show a LOT MORE of the picture than they do here correct? I just wish I knew if they used mins or max at techpowerup (too lazy to ask for now), but either way it shows the weakness of multi-card setups without proper driver dev. It also shows why you need a wide range of TODAY's games tested for an accurate picture. Anandtech has really begun to drop the ball over the years since ryan took over card reviews. These games just add to the missing latency discussion issues that affect all radeons and are still being fixed on a GAME BY GAME basis. The driver fix doesn't affect them all at once. The last driver fixed 3 games (helped anyway), and every other game seems to need it's own fix. BUMMER. Ryan totally ignores this discussion. Techreport has done quite a few articles on it, and cover it in detail again in the titan review. PCper does also.

    Same Techpowerup article (since this site is puking on my links calling it spam) pg 19
    Skyrim, with all 3 radeons at the bottom again. 1, 2 & 3 7970's beaten by ONE TITAN! So I guess that's 11 situations Ryan ignores. Does this make anyone take another look at the conclusions here on anandtech?
    PCper titan article shows the same in skyrim.
    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    I kind of see why you dropped skyrim, even in your own tests at 1920x1200 670 was beating 7970ghz also, so even after 13.11 you'll still likely have a loss to 680 as shown at the other two links here, this was 4xmsaa too, which you complained about being a weakness in the 660ti article if memory serves...This kind of score short circuits comments like that huh? I mean 580/670 & 680 all pegged at 97.5fps clearly cpu bound not a memory issue I think, since all radeons are below 86. Well, shucks, can't have this benchmark in our next suite...ROFL. Anyone seeing a pattern here?

    Want more bias? Read the 660TI review's comments section where Ryan and I had a discussion about his conclusions in his article...ROFL. The fun starts about page 17 if memory serves (well, not if you list all comments, different page then). I only had to use HIS OWN benchmarks for the most part to prove his conclusions BS in that case. He stooped so low as to claim a 27in monitor (bought from ebay...ROFL, even amazon didn't have a seller then, which I linked to) was a reason why the 660ti's bandwidth etc sucked. Enthusiasts buy these apparently (cheapest was $500 from korea, next stop was $700 or so). Of course this is why they leave out mins here, as they would hit TEENS or single digits in that article if he posted them. All of the games he tested in that article wouldn't hit 30fps at 2560x1600 on EITHER amd or nv on a 660ti. So why claim a victor?

    What about Crysis 3? Titan at or near top:
    http://www.guru3d.com/articles_pages/crysis_3_grap...
    Note he's telling you 40 min, and you really need 60 for smooth gameplay throughout, as he says he uses avg. Also note at 2560x1600 with everything on, 7970/680 won't be playable as he's only averaging 30. But see the point: only WARHEAD sucks on NV. And as shown before nobody plays it, as the servers are empty. The 7970 wins over the 680 by 20% in ryans warhead tests. But as soon as you go to Crysis 2 dx11/hires textures or Crysis 3 it's suddenly a tie or a loss.
    Page 8 in the same article
    Note the comment about 2560x1600, dipping to 25 or so even on gtx 680, and only fastest cards on the planet handle it fine:
    "At 2560x1600 with Very High Quality settings only the most expensive cards on the globe can manage. Please do bear in mind that our tests are based on averages, so YES there will be times your FPS drops to 25 fps in big fire fights and explosions, even with say a GTX 680."
  • TheJian - Sunday, February 24, 2013 - link

    Sorry, this site pukes on a certain amount of links, so I had to change them all to just page refs for the most part: 2nd part here :)
    Ryans warhead comment from this article: "In the meantime, with GTX 680’s LANGUID performance, this has been a game the latest Radeon cards have regularly cleared. For whatever reason they’re a good match for Crysis, meaning even with all its brawn, Titan can only clear the 7970GE by 21%."
    No Ryan, just in this game...Not crysis 2 or 3...LOL. He gives yet another dig on the same page, because this 5yr old game is majorly important even though nobody plays it:
    "As it stands AMD multi-GPU cards can already cross 60fps, but for everything else we’re probably a generation off yet before Crysis is completely and utterly conquered."

    Jeez, if you'd just put down the 5-year-old game and get with the times (Crysis 2 or 3 will do, Ryan, or any of the games above, what, 11 of them I gave?), you'll find the only LANGUID performer is AMD. So Titan is a gen behind if you believe him on all CRYSIS games? If NV is a gen behind, how come nobody else shows this in Crysis 2 DX11/hi-res pack, or Crysis 3? Oh, that's right, NV actually optimizes for games that are less than 5 years old... ROFL. Honestly I don't think AMD has done anything in their driver for Warhead for 5 years either... They just happen to be faster in a 5-year-old game. :) And NV doesn't care. Why would they, with the non-existent player base shown above on the servers? Is CryEngine 2 being used in a bunch of games I don't know about? Nope, just WARHEAD. I've never heard of the other 3 on the list, but Crysis 1 is not quite the same engine and, as shown above, performs quite well on Kepler (1fps difference between 680 and 7970GHz @1920x1200), same for Crysis 2 & 3. Only Warhead sucks on Kepler.
    Search wikipedia.org for CryEngine.
    You can do this for any game, people, to find out what is out now and what is coming. Look up the Unreal 3 engine, for instance, and take a gander at the number of games running it vs. Warhead.
    Search wikipedia.org for "List of Unreal Engine games".
    Complete list of U3-based games there.

    http://techreport.com/review/24381/nvidia-geforce-...
    Guild Wars 2: Titan beating a single 7970 AND 7970 CF at 2560x1600 by a lot... another ignored game with 3 million sold. Titan beats the 7970 CF by ~20%. OUCH.

    http://www.guru3d.com/articles_pages/crysis_3_grap...
    Reminder: in Crysis 3 at 2560x1600, the GTX 680 (that languid card in Warhead, according to Ryan) TIES the 7970GHz in guru3d's benchmarks. Mind you, neither can really run there, as it's 30fps average for both; you'll dip to 10-20fps... ROFL. But point proven, correct? RYAN is misrepresenting the facts. Unless you play 3-gen-old Warhead instead of Crysis 2 or Crysis 3 (or even updated Crysis 1, now on CryEngine 3 according to the site, probably why it does well on Kepler too)? Who does that? Do you still play Serious Sam 1 or Far Cry 1 too? Still playing Doom 1?

    Is that 14 games I'm up to now? That's a lot of crap you couldn't use in the new suite, huh?

    http://www.anandtech.com/show/6159/the-geforce-gtx...
    The comments section for Ryan's 660Ti article. Bearing in mind what I said above, go back and read our conversation. Read as he attempts to defend the biased conclusions in that article, and read the data from his OWN article that I used then to prove those cards were made for 1920x1200 and below, not the 2560x1600 or 2560x1440 Ryan defended. Look at the monitor he was pitching, and me laying out how you had to EBAY it from KOREA to even make his statements make sense (I gave links, showed that the ebay company in Korea didn't even have an about page, etc... ROFL). Not that you'd actually order a monitor direct from some DUDE in Korea, giving up your visa like that; how risky is that for a $500 monitor? But it was humorous watching him and Jarred defend the opinions (Jarred basically called me an ahole and said I was uninformed... LOL). The links and the data said otherwise then, and above I just did it again. This hasn't changed much with dual cards or Titan. You still need those to play above 1920x1200 at above 30fps, and some games still bring the top cards to their knees at 2560x1600 etc. That's why they don't post minimums here. All the arguments about bandwidth being an issue go out the window when you find out you'll be running 10-20fps to prove they're true. One of the pages in the 660Ti article is titled something like "that darned memory bandwidth"... Really? I also pointed out the number of monitors on newegg.com selling at 1920x1200 or less (68 if memory serves) versus above it at the time. I pointed out that steampowered.com showed less than 2% market share above 1920x1200 (and almost all of those had dual cards according to their survey, NOT a 660Ti or below). I doubt it's much higher now.

    Hopefully one day soon Anand will stop this junk. It's hard to believe this is the new game suite... I mean, seriously? That's just sad. But then Anand himself ignored basically the entire freakin' earnings report for NVDA and didn't even respond to the only comment on his NON-informational post (mine... LOL).
    http://www.anandtech.com/show/6746/tegra-4-shipmen...
    I'm the only comment... $20 says nobody from Anandtech addresses this post either... :) What can they say? The data doesn't lie. Don't believe me? I provided links to everything so you can judge it all yourselves (and what they've said or done, or not done, in all these cases). They didn't address last Q's financial/market-share whipping NVDA gave AMD either. I love AMD myself. I currently run a 5850, and I put off my 660Ti purchase as I'm not really impressed with either side currently and can wait for now (had a Black Friday purchase planned but passed), but the BIAS here has to stop. Toms, Techreport, PCPer etc. are reporting heavily on latency problems on Radeons (at least one other user already mentioned it in this comment section), and AMD is redoing their memory manager to fix it all! AMD released a driver just last month fixing 3 games for this (it helped Borderlands 2, Guild Wars 2 and one other). PCPer.com (Ryan Shrout) is still working out exactly how to accurately test it (others have already decided, I guess, but more will come out about this). He's calling it frame rating capture:
    http://www.pcper.com/reviews/Graphics-Cards/Frame-...
    Note his comment on situation:
    "This is the same graph with data gathered from our method that omits RUNT frames that only represent pixels under a certain threshold (to be discussed later). Removing the tiny slivers gives us a "perceived frame rate" that differs quite a bit - CrossFire doesn't look faster than a single card."
    Is AMD cheating here or what (they've both done tricks at some point in their history)? I look forward to seeing Ryan Shrout's data shortly. He used to run AMDMB.com, so I'm pretty sure he's pro-AMD. :)
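    For anyone wondering what omitting RUNT frames actually means, here's a rough Python sketch of the idea as I understand it from his description (the capture format, cutoff and numbers are all hypothetical):

        # Each captured frame is (rows_on_screen, ms_displayed). A 'runt'
        # occupies only a sliver of scanlines: FRAPS counts it as a frame,
        # but it adds no visible animation.
        def perceived_fps(frames, screen_rows=1600, runt_cutoff=0.2):
            visible = sum(1 for rows, ms in frames
                          if rows >= runt_cutoff * screen_rows)
            total_ms = sum(ms for rows, ms in frames)
            return 1000.0 * visible / total_ms

        # Hypothetical CrossFire capture: every other frame is a 30-row sliver.
        frames = [(1400, 8.0), (30, 8.0)] * 30

        fraps_fps = 1000.0 * len(frames) / sum(ms for rows, ms in frames)
        print("counter says %.0f fps, perceived %.0f fps"
              % (fraps_fps, perceived_fps(frames)))
        # -> 125 fps on paper, ~62 fps you actually see

    That gap is exactly why CrossFire "doesn't look faster than a single card" in his graph even though the raw counter doubles.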
    http://www.tomshardware.com/reviews/geforce-gtx-ti...
    More latency stuff. Note AMD is reportedly working on a new memory manager for GCN to fix this. I wonder if it will lower their average fps.

    I didn't form my opinion by making stuff up here. AMD has great stuff, but I provided a LOT of links above that say it's not quite like Anandtech would have you believe. I can find benchmarks where AMD wins, but that's not the point. Ryan always claims AMD wins (check his 660Ti article conclusions for example). At best you could call this even; at worst it looks pretty favorable to NV cards here IMHO. If you toss out the 2 crap/old games nobody plays (Warhead, Dirt Showdown) and add in the 14 above, this is pretty grim for AMD, correct? Pretty grim for Anandtech's opinion too, IMHO. If you can argue with the data, feel free; I'd like to see it. None of the benchmarked cards are what you'd buy either; they are all reference-clocked cards, which nobody in their right mind would purchase. Especially the 660Tis; who buys reference-clocked 660Tis? Toms/Anand/HardOCP seem to love to use them even though it's not what we'd buy, as the same price easily gets you another 100MHz out of the box.

    I'd apologize for the wall, but it's not an opinion; all of the benchmarks above are facts and NOT from me. You can call me crazy for saying this site has an AMD bias, but that won't change the benchmarks, or the ones Anandtech decided to REMOVE from their test suite (Skyrim, Borderlands 2, Diablo 3, StarCraft 2 - all have been in previous tests here, but removed as of the 660Ti+ articles). Strange turn of events?
  • Ryan Smith - Monday, February 25, 2013 - link

    "I'm the only comment... $20 says Nobody from Anandtech addresses this post either... :) What can they say?"

    Indeed. What can we say?

    I want to respond to all user comments, but I am not going to walk into a hostile situation. You probably have some good points, but if we're going to be attacked right off the bat, how are we supposed to have a meaningful discussion?
  • TheJian - Monday, February 25, 2013 - link

    If that's what you call an attack, it has to be the most polite one I've seen. The worst I called you was BIASED.

    Please, feel free to defend the 14 missing games, and the choice of Warhead (which, as shown, doesn't behave the same as Crysis 1, 2 or 3) and Dirt Showdown. Also explain why StarCraft 2 was in but is now out, when a launch event for the next one is coming within the next few weeks. Not an important game? The rest above are all top sellers too. Please comment on Skyrim, as with the hi-res pack it is OFFICIAL, as I noted in response to CeriseCogburn (where, right above his post, you call it CPU limited; his link and mine show it is NOT, and AMD was losing in his by 40fps out of ~110 - if that isn't GPU separation, I don't know what is). Are you trying to say you have no idea what the HI-RES pack for Skyrim, out for over a year now, is for? Doesn't the term HI-RES instantly mean more GPU-taxing than before?

    Nice bail... I attacked your data and your credibility here, not YOU personally (I don't know you, and don't care either way what you're like outside your reviews). Still waiting for you to attack my data. Still waiting for an explanation of the game choices, and why all the ones I listed are left out in favor of 2 games that sold 100,000 units or less (total failures), one of them (Warhead) from 5 years ago and unrepresentative of the Crysis 1, 2 or 3 benchmarks shown in all the Titan articles (where all the Keplers did very well, with a lot of victories at 1920x1200 and some above, not just Titan).

    This isn't hostile, nor have any of my posts been. Is it hostile because I point out you misrepresenting the facts? Is it hostile because I backed it with a lot of links showing it's NOT like you're saying (which reinforces the misrepresentation-of-the-facts comment)? It would (perhaps) be hostile if I insinuated you were an ahole with an "uninformed opinion", like Jarred Walton said about me in the 660Ti comments section (which I never did to either of you), even after I provided boatloads of PROOF and information like I did here. So basically it appears that if I provide ample proof and in any way say you're not being factual, I'm labelled hostile. I was even polite in my response to Jarred after that :)

    How does one critique your data without being hostile? :)

    Never mind, I don't want an answer to your distraction comment. Defend your data, and rebut mine. I'm thinking there are curious people after all I provided. It won't be meaningful until you defend your data and rebut the data from all the sites I provided (heck, any of them; they all show the same: 14 games where NV does pretty well and the Radeons or CF do not, in some cases even against SLI). I've done all the work for you; all you have to do is explain the results of said homework, or just change your "suite of benchmarks" for gaming. Clearly you're leaving out a lot of the story, which slants heavily to NV if added. The games in the links are the most popular games out today and over the last 15 months. Why are they missing? All show clear separation in scores (within the same family of GPUs or across them). These are great GPU test games, as shown. So please, defend your data and game choices, then do some rebuttal of the evidence. If someone said this much about my data, and I thought I had a leg to stand on, I'd certainly take some time to rebut their comments. Politely, just as all my comments have been, including this one. I can't think of a defense here, but if you can and it makes sense, I'll acknowledge it on the spot. :)
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    I appreciate that, and read all the words and viewed all the links and then some.

    I have noted extreme bias in the wording of many past articles, far too obvious to miss, and friends and I have had quite a time rolling through it.
    I commented a few reviews back pointing a bit of it out, yet there are plenty of other comments that reek as well.
    I am disappointed, but this site is larger than just video cards, so I roll with it, so to speak.

    Now that you've utterly cracked open the factual can and exposed the immense AMD-favored bias, the benchmark suite is scheduled to change - lol - that's how the timing of things seems to work and coincide so often.

    Anyway, you not only "probably have some points", you absolutely have a lot of unassailable points. But people do have certain "job pressures", so I don't expect any changes at all. I am very appreciative, though, and believe you have done everyone a world of good with your posts.
    The 4 dropped benchmarks was just such a huge nail, lol.

    It's all good; some people like to live in a blissful, fantasy-type fog and feel good, and just the same, when reality shines the light, that's all good too, and even better.

    I absolutely appreciate it, know that.
    You did not waste your time nor anyone else's.
  • thralloforcus - Monday, February 25, 2013 - link

    Please test folding@home and bitcoin mining performance! Those would be my main justifications for getting a new video card to replace the 570 Classifieds I have in SLI.
  • Ryan Smith - Monday, February 25, 2013 - link

    As noted elsewhere, OpenCL is currently non-functional on Titan. Until it's fixed we can't run FAHBench. As for BitCoin, Jarred has already gone into some good reasons why it's not a very useful GPU benchmark, and why GPUs are becoming less than useful for it.
  • justaviking - Wednesday, February 27, 2013 - link

    On Feb 22, the review closed with this teaser:
    "Wrapping things up, on Monday we’ll be taking a look at the final piece of the puzzle"

    Monday was two days ago. Am I impatient? Yes. I am really looking forward to seeing what you have to say about Origin’s tri-SLI full tower Genesis PC.

    Did I miss it somehow?
  • avel - Wednesday, February 27, 2013 - link

    I've been thinking the same thing. While I was waiting, I found that Tom's Hardware has a tri-SLI Titan review up. Maybe Anand will have theirs up today.
  • Ryan Smith - Wednesday, February 27, 2013 - link

    Unfortunately it's going to be a few more days. I'm currently out of commission with the flu, so I haven't been able to finish my work on the Genesis system yet.
  • justaviking - Friday, March 1, 2013 - link

    Oh, sorry to hear about that.
    Get well soon.
  • CiccioB - Monday, March 4, 2013 - link

    It would be nice if you could also address TheJian's post. In particular, the reasons for choosing these games instead of those he listed and, most of all, whether the list of games will change in the future to include at least some of the more modern ones.

    If you made a choice, there must be a reason. It would be nice to let us know what it is. Avoiding giving reasons for your choices gives many people cause to doubt your impartiality and/or professionalism.

    Thanks in advance
  • CeriseCogburn - Monday, March 4, 2013 - link

    Dream on; you're playing as many games as the person you have a problem with.

    The site is AMD-GPU biased out the wazoo, and every blind pig knows it. They failed to get a card from nVidia years ago (a certain G92) and it's been open hatred ever since. Same thing happened to Tom's.

    I'm sure there are other reasons - I've seen some stated - "the confident and arrogant nVidia reps" was one theme.
    There's also the intense "hatred" right now for anyone profitable, especially the pined-for "take down the giants (Intel and nVidia)" AMD-underdog dream of these fantasy activists.

    And the desire for the "competitive war" to continue so this site has a reason to exist and do video card reviews; thus the failing piece-of-crap company AMD must be propped up continuously. It is, after all, fully compliant with "self-interest", even if it is - and it is - extremely unethical and completely immoral.

    So don't expect any answers, and there's exactly ZERO chance that "fair and equitable" is the answer.
  • CeriseCogburn - Monday, March 4, 2013 - link

    Don't get me wrong, the site is great. I've been reading it forever, since before it was even on the map, and of course people are human and have certain pressures and personal tastes.
    That won't ever change.

    They have many sections, the podcasts are a recent cool addition for some added flavor, and as with anything, especially evaluating two competing entities, perfection is impossible.
  • CiccioB - Monday, March 4, 2013 - link

    I like this site for GPU reviews. I have always found its reviews better than those done by many other sites.
    They are rich in technical description and answer many questions other sites don't even imagine asking.
    Or that they ask and answer only by copy & pasting from here, sometimes without understanding much of what they are copying.
    The computational tests done here, even in past years, are not to be found anywhere else. Others use silly synthetic benchmarks, mostly OpenCL-based, that a two-minute hack can double the performance of, or that are biased depending on who sponsored the tests (see AMD and the SiSandra benchmark suite).

    However, I have been thinking that the game choice has always seemed "random".
    Review after review, some good games suddenly disappear to make room for others with no real meaning (games that do 150+ FPS on high-end systems are quite ridiculous to bench). The same goes for very old games recently superseded by a new release, and for some games never reviewed at all.
    For example, I would like to know how games like StarCraft 2, which had big problems with SLI/Crossfire when it was published, run now on the latest GPUs with the latest drivers. Or games like Arma 2 that were unplayable. But I still see Crysis Warhead benches, which are not exactly interesting nor indicative of anything, while other sites already have Crysis 3 benches.
    It would also be good to add a PhysX option when possible. For example, with such a beast as Titan, many games have enough headroom to run PhysX at high levels. How does that compare with an SLI solution? Or with no PhysX at all? How does it impact these GPUs compared to GK104 or older Fermi?

    But apart from these requests, it would really be nice to understand the choice of reviewed games. It is well known that certain games favor certain architectures, and choosing mostly those that favor one side or the other, with no apparent reason, makes these tests look quite cheap next to others, for example those done by Techpowerup, as has been pointed out before.

    Not answering really just feeds the doubts. And for many, those may soon stop being doubts at all.

    Sorry for my English, it is not my native language
  • clickonflick - Thursday, March 7, 2013 - link

    I agree that the price of this GPU is really high; one could easily assemble a mainstream laptop or desktop online with Dell for this price tag. But for gamers, to whom performance matters more than price, it is a boon.

    For more pics and the Titan's specifications, check this out:

    http://clickonflick/nvidia-geforce-gtx-titan/
  • enigz - Thursday, March 7, 2013 - link

    CeriseCogburn, you shit from your mouth, don't you? I've owned both nVidia and AMD cards; I go for performance and I most certainly do not care about spending. It is not about the company. I don't go around slamming the other team online like the bloody ball-less keyboard warrior you are. Do you not realise that your comments make you look like those "fanboys" you go around insulting? Go grab a paper towel to clean off all that shit dripping down your chin, then sit down and try to absorb what I've just said while I'll be off to get my Titan. At least AMD and nVidia are capable of producing graphics and computing solutions for consumers worldwide, while you, sir, are just capable of being an asshole right here at Anandtech.
  • CeriseCogburn - Tuesday, March 12, 2013 - link

    ROFL another amd fanboy having a blowout. Mommie will be down to the basement with the bar of soap, don't wet your pants.
    When amd dies your drivers will still suck, badly.
  • trajan2448 - Saturday, March 16, 2013 - link

    Until you guys start showing latencies, these reviews based primarily on fps numbers don't tell the whole story. Titan is 4x faster than multi GPU solutions in real rendering.
  • IUU - Wednesday, March 20, 2013 - link

    Just a thought: if they priced Titan at, say, $700 or $500 (the old price point for flagship cards), how on earth would they market game consoles, and the brave "new" world of the mobile "revolution"?
    Like it or not, high-tech companies have found a convenient way to escape the cutthroat competition of PC-land (hence their hatred and slogans like "post-PC" and the rest) and get a breath of fresh (money) air!

    Whether this is also good for the consumer in the long run remains to be seen, but the fact is we will pay more to get less, unless something unexpected happens.
  • paul_59 - Saturday, June 15, 2013 - link

    I would appreciate any intelligent opinions on the merits of buying a 690 card versus a Titan, considering they retail for the same price
  • bravegag - Tuesday, August 13, 2013 - link

    I bought the EVGA nVidia GTX Titan - actually two of them - instead of the Tesla K20, thanks to the benchmark results posted in this article. However, the performance results I got are nowhere close to the ones shown here. Running DGEMM from CUDA 5.5 and the CUBLAS example matrixMulCUBLAS, my EVGA nVidia GTX Titan reaches no more than 220 GFlop/s, which is nowhere close to 1 TFlop/s. My question is then: are the results presented here a total fake?

    I created the following project, where some additional HPC benchmarks of the nVidia GTX Titan are included; the benchmark computing environment is also detailed there:
    https://github.com/bravegag/eigen-magma-benchmark
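    For reference, here is the arithmetic for converting kernel time into GFlop/s, plus the peak rates to expect, as a minimal Python sketch (the timing number is hypothetical, and the peaks are rough figures, not measurements):

        # DGEMM on n x n matrices does 2*n^3 floating-point ops
        # (n multiplies + n adds per output element).
        n = 8192
        kernel_seconds = 4.9        # hypothetical measured cublasDgemm time

        gflops = 2.0 * n ** 3 / kernel_seconds / 1e9
        print("achieved: %.0f GFlop/s" % gflops)        # -> ~224

        # Very rough GTX Titan peaks for comparison:
        sp_peak = 4500.0            # single-precision GFlop/s, approximate
        print("FP64 at default 1/24 rate: ~%.0f GFlop/s" % (sp_peak / 24))  # ~188
        print("FP64 toggle on, 1/3 rate: ~%.0f GFlop/s" % (sp_peak / 3))    # ~1500 on paper

    One plausible explanation for ~220 GFlop/s is the FP64 default: if I recall the review correctly, Titan ships with double precision at the reduced rate unless "CUDA - Double precision" is enabled in the NVIDIA Control Panel (and even then boost clocks drop, so ~1.3 TFlop/s is the realistic ceiling). Also, unless it has been modified, I believe the stock matrixMulCUBLAS sample times single precision rather than DGEMM.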
  • bravegag - Wednesday, August 14, 2013 - link

    Has anyone tried replicating the benchmark results shown here? How did it go?
  • Tunnah - Wednesday, March 18, 2015 - link

    It feels like nVidia are just taking the pee out of us now. I was semi-miffed at the 970 controversy; I know for business reasons etc. it doesn't make sense to truly trounce the competition (and your own products) when you can instead hold something back, keep things tighter, and have something in reserve in case they surprise you.

    And I was semi-miffed when I heard it would be more like a 33% improvement over the current cream of the crop, instead of the closer-to-50% increase the original Titan was over the 680, because they have to worry about the 390X and leave room for a Titan X White Y Grey SuperHappyTime version.

    But to still charge $1000 even though they are keeping the DP performance low - this is just too far. The whole reasoning for the high price tag was that you were getting a card that was not only a beast of a gaming card but would hold its own as a workstation card too, as long as you didn't need the full Quadro service. Now it is nothing more than a high-end card, a halo product... that isn't actually that good!

    When it comes down to it, you're paying 250% of the cost for 33% more performance, and that is disgusting. Don't even bring RAM into it; it's not only super cheap and in no way a justification for the cost, but in fact useless, because NO GAMER WILL EVER NEED THAT MUCH; IT WAS THE FLIM FLAMMING WORKSTATION CROWD WHO NEEDED THAT FLIM FLAMMING AMOUNT OF FLOOMING RAM, YOU FLUPPERS!

    This feels like a big juicy gob of spit in our faces. I know most people bought these purely for gaming and didn't use the DP capability, but that's not the point - it was WORTH the $999 price tag. This simply is not, not in the slightest. $650, $750 tops because it's the best, after all... but $999? Not in this lifetime.

    I've not had an AMD card since way back in the days of ATi; I am well and truly part of the nVidia crowd - even when they had the better card I'd wait for the green team's reply. But this is actually insulting to consumers.

    I was never going to buy one of these; I was waiting on the 980Ti for the 384-bit bus and the bumps that come along with it... but now I'm not only hoping the 390X is better than people say, so that nVidia has to make their answer extra good - I'm hoping it's better than they say so I can actually buy it.

    For shame, nVidia. What you're doing with this card is unforgivable.
