32 Comments

  • Shadow7037932 - Wednesday, September 16, 2015 - link

    Interesting partnership. Makes a lot of sense for both companies I think.
  • Despoiler - Wednesday, September 16, 2015 - link

    That Fury X price is now looking pretty good.
  • Kutark - Wednesday, September 16, 2015 - link

    I don't see why. The Fury X is a TERRIBLE OC'er; basically you can get another 50-75MHz before you start running into horrible issues. The 980 Ti, on the other hand, is seeing 35-40% overclocks with watercooling. So a $100-200 price premium for a 40-45% performance gain over a Fury X seems like good math to me.
  • RussianSensation - Wednesday, September 16, 2015 - link

    While the overall idea of your post is solid, the math behind it is all wrong. A stock GTX 980 Ti boosts to 1176-1202MHz. That means a 1500MHz 980 Ti is really a 25-27.5% overclock over a reference 980 Ti, not a 35-40% overclock. 40-45% performance gain over Fury X? Now you are literally pulling stuff out of your rear end. A heavily overclocked 980 Ti with boost near 1.4GHz is 22% faster at 1440p and 16% faster at 4K:
    http://www.techpowerup.com/reviews/Palit/GeForce_G...

    That's using a review very favourable towards the 980 Ti, with skewed benchmarks like Project CARS, WoW, and Wolfenstein. Other sites show even less of an advantage. The Gigabyte 980 Ti G1 Gaming boosts to 1.4GHz as well -- 17% faster at 1440p, 12% at 4K:

    http://www.sweclockers.com/test/20730-amd-radeon-r...

    While it's true without a doubt that the 980 Ti and 980 Ti OC are far better values and products than a $650 Fury X, your performance data is grossly inaccurate -- the sort of thing I would expect from someone like rollo or wreckage.
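    To put actual numbers on the overclock-percentage point above, here's a quick arithmetic sketch (clocks in MHz; the 1176-1202MHz reference boost range and the 1500MHz OC figure come from this thread, and the 1000MHz reference base clock is an assumption, not benchmark data):

```python
def oc_percent(oc_clock, stock_clock):
    """Overclock expressed as a percentage gain over the stock clock."""
    return (oc_clock / stock_clock - 1) * 100

base_clock = 1000                     # nominal reference base clock (assumption)
boost_low, boost_high = 1176, 1202    # reference boost range cited above
oc_clock = 1500                       # heavily overclocked 980 Ti

# Measuring against the base clock overstates the gain:
print(f"vs base:  +{oc_percent(oc_clock, base_clock):.1f}%")   # +50.0%
# Measuring against what the card actually boosts to is the honest number:
print(f"vs boost: +{oc_percent(oc_clock, boost_high):.1f}% to "
      f"+{oc_percent(oc_clock, boost_low):.1f}%")              # +24.8% to +27.6%
```
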
  • at80eighty - Thursday, September 17, 2015 - link

    glad you're around to stomp out posts like that
  • TallestJon96 - Thursday, September 17, 2015 - link

    Your fact reviewing and criticism is well founded and level headed. You are both a gentleman and a scholar.
  • tviceman - Thursday, September 17, 2015 - link

    He's exaggerating, but not by much. The Palit card in that TPU review has another 12.5% of OC performance in it, bringing it up to 37% faster than a stock Fury X at 1440p. Fury Xs get what, 5-7% real-world OC gains? An OC'd 980 Ti is still 30% faster than an OC'd Fury X. Even with the price premium of the MSI/Corsair combo, it's still a better perf/$ deal than the Fury X.
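    The perf/$ claim can be sketched the same way. The performance ratios are the ones quoted in this subthread; the dollar figures (the $650 Fury X and a ~$800 placeholder street price for the MSI/Corsair card) are assumptions, not actual quotes:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance (stock Fury X = 1.0) divided by price in USD."""
    return relative_perf / price_usd

fury_x      = perf_per_dollar(1.00, 650)   # stock Fury X baseline
fury_x_oc   = perf_per_dollar(1.06, 650)   # Fury X with a ~6% real-world OC
sea_hawk_oc = perf_per_dollar(1.37, 800)   # 980 Ti OC, ~37% faster, assumed price

print(sea_hawk_oc > fury_x_oc)  # True: the 980 Ti wins perf/$ even at a premium
```
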
  • hahmed330 - Thursday, September 17, 2015 - link

    I know trolling is your profession, but be polite, okay? Or your rear end will start grunting. A 40 percent overclock soundly beats the Fury X's measly 10-12%.
  • Samus - Thursday, September 17, 2015 - link

    Yeah, I'm surprised they went so conservative with the OC. Especially only 100MHz on the memory!? I have the same generation SK Hynix GDDR5 on my 970 and I have it clocked at 7.8GHz. Some people get 8GHz.

    SK Hynix has been having trouble yielding 8GHz GDDR5 for over a year, and odds are most of the 7GHz GDDR5 is just batches of 8GHz memory that wouldn't run at 8GHz plus whatever headroom is required to qualify it.

    Anyway, chances are this card could overclock to 1300MHz/8000MHz, assuming the front cooling plate and backplate are well installed to cool the VRM and memory. Knowing MSI and Corsair, this is probably a top-notch implementation down to the TIMs and paste.

    I'm glad to see the Fury X creating competition in the CLLC market. Liquid cooling GPUs makes a hell of a lot more sense than liquid cooling a CPU that typically has HALF the TDP and almost never runs maxed out. Most games max out the GPU and are relatively uninterested in hammering the CPU.
  • MisterAnon - Sunday, September 20, 2015 - link

    A 980 Ti is an absolute waste of money when 390 CrossFire costs the same and is 90% faster.
  • WorldWithoutMadness - Wednesday, September 16, 2015 - link

    This generation is an awkward one: half-done HBM, rebadging, etc. It should be fun next year with HBM2; they might announce a new form factor, smaller and liquid-cooled, so it can be bundled with a PC 'console' like Steam's or AMD's Quantum.
  • jtrdfw - Thursday, September 17, 2015 - link

    this a joke? lol
  • Kutark - Wednesday, September 16, 2015 - link

    I'm kind of wishing I hadn't been in such a big rush to get my 980 Ti, and had waited for one of these closed-loop liquid cooling versions. Mine is an EVGA Superclocked, but it throttles so damn much it might as well not be OC'd.

    I would buy a water block for it, but then I'd have to buy a radiator and all the other assorted hardware, and at that point I might as well liquid cool the CPU too, and then I'm out another 600 bucks between all of that.

    I wonder if EVGA's Step-Up program applies to the CLLC version of the card?
  • RaistlinZ - Wednesday, September 16, 2015 - link

    You're not bound to the card. Sell it and buy a CLLC. You'll take a hit to your wallet, but if you want it go for it. YOLO, as the kids say. =)
  • Shadow7037932 - Wednesday, September 16, 2015 - link

    You only need to make that initial watercooling investment once and it'll last a very, very, very long time as long as it's cared for properly. I'm still rocking a Swiftech GTX, MCR320, D5, and Bitspower fittings. This is around 8 years old now. The only additional investments you'll need to make are the mounting adapters for newer sockets and tubing.
  • Samus - Thursday, September 17, 2015 - link

    I'm still running a first-gen Corsair Asetek cooler from 2010! It's seriously old-school, the waterblock/pump assembly is like 4" tall!

    And it has always worked flawlessly! It's also very quiet compared to some newer coolers like the Coolermaster 120, but pretty much on par with the current Corsair H-series equivalents. I originally had it on my X58/i7-920 OC'd to 3.4GHz.

    http://media.bestofmicro.com/U/M/217534/original/c...
  • HollyDOL - Thursday, September 17, 2015 - link

    This looks like a slightly cheaper and faster version of the EVGA GeForce GTX 980 Ti HYBRID (http://www.evga.com/Products/Product.aspx?pn=06G-P...).
  • edzieba - Thursday, September 17, 2015 - link

    Get a Corsair HG10 and a Corsair CLC, and you will have an identical card + cooler to this, just without the branding.
  • rtho782 - Thursday, September 17, 2015 - link

    EVGA will sell you their hybrid cooler as an aftermarket part if you have a reference card.

    I did that with my 980s and they were not even EVGA models.
  • Makaveli - Thursday, September 17, 2015 - link

    The $90 markup seems about right.

    Paid $100 for a G10 kraken and H55 setup for my 7970Ghz.
  • meacupla - Thursday, September 17, 2015 - link

    Seeing as it has a fan on the card itself, it looks like they didn't bother with an extensively custom design to cool the VRMs.

    I do, however, like the fact they kept it to dual slot size, because most aftermarket CLLC blocks end up being 3 slots tall.
  • PPalmgren - Thursday, September 17, 2015 - link

    Hmm, Corsair and MSI, huh. These two companies fill the gaps in each other's product lines. Corsair makes just about everything in a computer except PCB parts like the motherboard and the graphics card, whereas MSI focuses on PCB parts and all-in-one solutions like its PC and laptop lineup.

    What I'm saying is I think the two companies would make good partners for a merger. I think MSI is (surprisingly) the bigger of the two, and I don't think it would happen, but both companies bring a lot to the table and would likely make a very solid merger should they ever go that route.
  • TallestJon96 - Thursday, September 17, 2015 - link

    Not totally surprising, but not expected.

    I trust corsair quite a bit. I recognize that they are basically only redistributing products from other companies, but they usually have a good combination of reasonable price, high (or high enough) quality, and value. At the end of the year, my new build will have a PSU, case, and possibly ran from corsair, and I would be interested in their pascal cards if they make some.
  • TallestJon96 - Thursday, September 17, 2015 - link

    I'm going to join the ranks of those asking for an edit button.

    In my particular case, getting Safari to not autocorrect gpu to "you" or ram to "ran" is quite bothersome, but at least on websites like Tom's I can fix it.
  • Assimilator87 - Thursday, September 17, 2015 - link

    This is an unnecessary, already-existing solution and a big missed opportunity. To make a noticeable splash and turn heads, Corsair and MSI should have made a CLLC-cooled Lightning.
  • hammer256 - Thursday, September 17, 2015 - link

    CLCs seem to make a lot of sense for GPUs (probably more so than for CPUs), considering how much power they can pull and the limited space they have for dissipating it. Personally, I would love to see some dual-slot dual-GPU cards with CLCs; that way I could pack 4 of them into a workstation and not have to worry about the tight spacing and inefficient cooling. Right now I pretty much have to ram air into the tiny gaps between the GPUs with 4 super-loud fans (pushing 300+ CFM each), and even then the airflow is barely adequate. The 4 GPUs sandwiched in the middle get to around 83°C under load, which is just borderline acceptable.
  • SpartyOn - Thursday, September 17, 2015 - link

    I'm not sure why OEM CLC designs for graphics cards haven't happened before now; there has definitely been a huge market for them at the enthusiast end. Back in 2012 I put a Dwood bracket (before he sold out to NZXT) on my GTX 770 4GB and hooked it up to a Zalman LQ320 CLC, and I'm still rocking it to this day.

    Even in a mITX case with lots of stuff crammed in, I was able to overclock to 1304 core with a maximum burst boost of 1398 (it usually settles around 1350-something for most boost clocks) and hit 8GHz on the VRAM; all told, I'm getting about 25% more performance with this OC. Of course I modded the VBIOS as well, but this setup has allowed me to extend the lifespan of this card phenomenally. I even played The Witcher 3 with everything at maximum, including HairWorks, on this out-of-date setup at a 30 fps lock on my 55" 1080p TV. It could actually get around 40-something fps, but I found limiting it to 30 produced a more consistent visual experience when it occasionally dipped into the high 20s. I was pleasantly surprised by this 3-year-old, not-even-top-of-the-line graphics card.

    I think OEM CLC is the future at the enthusiast end, and I'm ready to welcome it with open arms. It allows me a three-year upgrade cycle instead of every two, which also seems to align better with technological advances anyway.

    Sure, the GTX 770 is getting a little long in the tooth, but I'm gonna see this thing through until Pascal is released and HBM2 is out.
  • garadante - Thursday, September 17, 2015 - link

    I'm also in agreement that this is a good direction for GPU vendors to take their cards. I've long wanted to do exactly what you did and slap a CLC on my GPU, but it's tough to justify the ~$100 cost of doing so on my current, dated card, which has no overclocking headroom due to being a nonreference card with no VCORE control.

    But here's a thought: considering that brackets to mount a CPU CLC on a GPU have been on the market for a while now, I wonder if the CLCs used in these graphics cards will be the same as the CPU variants. If so, when the card outlasts its useful life (or dies outright), you could pop off the CLC, buy a CPU mounting bracket or another GPU bracket, and reuse a potentially still-strong CLC.
  • SpartyOn - Thursday, September 17, 2015 - link

    By the time a video card dies, the pump in your CLC will either have failed or otherwise be on its last legs.

    I think it'd be more practical for the OEMs just to release a universal bracket for their particular cards that, at upgrade time, could be removed from the outgoing GPU and transferred to the new one. It'd be a win for consumers, since at upgrade time you'd buy a board-only card, and also a win for the OEMs, because they'd be locking you into their bracket system unless you ponied up for a full new setup from another manufacturer.

    Since people tend to upgrade graphics cards more often than CPUs, this seems more beneficial; plus, hardcore CPU overclockers aren't going to use a 120x120x25mm rad on their CPUs anyway -- which is probably what most of these would come with for practicality and budget purposes.
  • th3rdpartynation - Thursday, September 17, 2015 - link

    Would really like to see them use their expertise to make a single-slot high-end graphics card. It seems to me many living-room-friendly mini-ITX cases can fit a full-length graphics card but almost never have two expansion slots.

    Give me a single-slot graphics card that can handle 1080p with maxed-out settings and I will gladly pay a premium for it.
  • mrcaffeinex - Friday, September 18, 2015 - link

    As far as I know, no one ever released a single-slot 750 Ti either. The most interesting part is that there were a number of single-slot 8800/9800/250 video cards with various configurations and higher TDPs (I have a mountain of them that I've ripped from dead boxes over the years). It was possible, but no one seemed to bother. While those cards aren't going to max out the settings in newer games, they actually handle titles like GTA V and Dying Light at 1080p with what I would consider to be acceptable fidelity.

    The nVidia lineup should be really interesting with HBM2 allowing a variety of form factors next year. Combined with the improved GPU core efficiency, I think there is a good possibility we may see something just like what you're looking for.
  • Oxford Guy - Saturday, September 19, 2015 - link

    Galaxy Razor is one of at least two single-slot 750 Ti cards released.
