177 Comments

  • Harry Lloyd - Tuesday, February 18, 2014 - link

    20 nm Maxwell will be epic. Gimme.
  • TheinsanegamerN - Tuesday, February 18, 2014 - link

    Imagine. OCed Geforce 690 level performance, out of a single chip, with 8 GB of RAM on a 512 bit bus, pulling the same amount of power as a geforce 770. One can dream....
  • ddriver - Tuesday, February 18, 2014 - link

    LOL, epic? Crippling FP64 performance further from 1/24 to 1/32 - looks like yet another nvidia architecture I'll be skipping due to abysmal compute performance per $ ratio...
  • JDG1980 - Tuesday, February 18, 2014 - link

    This card is designed for gaming and HTPC. Only a tiny fraction of users need FP64.
  • nathanddrews - Tuesday, February 18, 2014 - link

    So I guess we'll have to wait for the 750TIB before we can see SLI benchmarks. Two of these would be within reach of 770 while using considerably less power. Hypothetically, that is.
  • ddriver - Tuesday, February 18, 2014 - link

    You do realize the high end GPUs on the same architecture will have the same limitation?
  • Morawka - Tuesday, February 18, 2014 - link

    I thought the higher-end Maxwell cards would have Denver/ARM cores on the PCB as well.
  • Mr Perfect - Wednesday, February 19, 2014 - link

    It might be a software/firmware limitation though. From what the compute enthusiasts have said, the only difference between the Titan's full compute and 780Ti's cut down compute is firmware based. They've got the same chip underneath, and some people hack their 780s for full compute. They're probably doing the same thing with the Maxwell stack.
  • chrnochime - Wednesday, February 19, 2014 - link

    Got link for the hack? Sounds interesting.
  • Mr Perfect - Thursday, February 20, 2014 - link

    I don't myself, but if you're interested look up IvanIvanovich over at bit-tech.net. He was talking about vbios mods and resistor replacement tweaks that can do that.
  • Harag - Thursday, March 6, 2014 - link

    Not true at all. The release of the Titan showed they could unlock FP64 performance on a specific architecture. The Titan Black also has amazing FP64 performance. You may also want to look into their Quadro line.
  • kwrzesien - Tuesday, February 18, 2014 - link

    Cards are available on Newegg! Check out this EVGA Superclocked (1268MHz) with a dual-fan ACX cooler and 6-pin PCIe power connector: http://www.newegg.com/Product/Product.aspx?Item=N8...
  • Frenetic Pony - Tuesday, February 18, 2014 - link

    Maxwell is designed for mobile gaming, in which case who cares? Broadwell looks to improve performance per watt at least as much as Maxwell, if Intel's initial hints of a 30% power improvement for 14nm and a 40% improvement in GPU power efficiency pan out. And they were already damned good.

    But Maxwell isn't designed for high end, in which case GCN 1.1 and AMD are already beating them for price for performance. Congrats Nvidia, you're second place in both categories if this card is anything to go by. I hope to hell your Titan 2 or whatever kicks more ass than this card.
  • varad - Wednesday, February 19, 2014 - link

    @FreneticPony, statements like "Maxwell is designed for mobile gaming" and "But Maxwell isn't designed for high end" tell us you know precious little. Maxwell is an architecture that will span across all of Nvidia's products [Tegra, GeForce, Quadro and Tesla].
  • Frenetic Pony - Thursday, February 20, 2014 - link

    Err... they intend to produce it as such, yes. But it's obvious the architecture itself is targeted squarely at mobile. Power constraints don't actually get in the way as much as other constraints do on the high end. Who really cares if it's a 150+ W TDP if it's for gaming? You get constrained by memory latency and other things no matter how high you clock it.

    This appears to be Nvidia's version of Haswell, concentrating solely on improving performance per watt rather than raw performance. Which is bad timing, as Intel is doing the same, but integrates its GPUs right onto the chip, making them cheaper and smaller than any dedicated card for a laptop is going to be. Meanwhile AMD is crushing Nvidia in both compute and high-end gaming performance on the desktop for performance per $.

    True, this will help mitigate electricity costs for compute-based work. But as others pointed out, not by much. Meaning Nvidia stuck itself with the wrong focus at the wrong time. Maybe it will help with their Tegra SoCs; if they're lucky they'll get back into the game, as Qualcomm soundly crushed Tegra 4 for third-party ARM SoCs over the last year.

    So, no, it's not designed for high end. That doesn't mean they're not going to ship it there anyway.
  • Frenetic Pony - Thursday, February 20, 2014 - link

    I.e., it really doesn't matter how well they did at what they're doing. Because Intel has done just as well and has built-in advantages for its market, what they're doing doesn't help that much against AMD in the high-end market, and this leaves their only chance for financial success with it being next year's Tegra SoCs.
  • ninjaquick - Thursday, February 20, 2014 - link

    Plus, AMD is easily capable of taking Nvidia on at the low end with better hardware across the board, more integrated designs, etc.
  • willis936 - Thursday, February 20, 2014 - link

    Pro tip: you're always TDP limited. Increasing performance per watt IS increasing performance.
  • Harag - Thursday, March 6, 2014 - link

    Broad statements like "AMD is crushing Nvidia..." only prove @Varad correct: you know precious little.
  • HisDivineOrder - Wednesday, February 19, 2014 - link

    nVidia fits a lot more performance in a little more space at a lot less power and you think they're doing poorly? This is on the same node.

    Imagine what they'll pack into a smaller node.

    Their focus is probably the right one, given the fact they want to migrate these cores into Tegra.
  • TheJian - Wednesday, February 19, 2014 - link

    You are only able to say AMD is winning with the 265 because of magical pricing that probably won't exist, just like the 290x/290 are not $550/400 - we now know they are really $709/550 (amazon's lowest pricing, which is below newegg on the 290, about even on the 290x - amazon has the 290x in stock). At least most are out of stock now, so maybe they're selling better, or it really is just a shortage of chips like PNY says. I'm guessing a shortage of chips that can run 1GHz is causing the problems and higher pricing on the 290/290x, not them selling like crazy. If they were selling like crazy to miners etc, AMD would have had a quarter like NV had, where GPU revenues rose 14% on the backs of HIGH-END gpu sales in an 11% down PC market. The 290/290x are AMD's high end, yet selling out means zero profits? So high end isn't selling much and it's just a shortage of 1GHz chips then, right? 10mil console chips were all of AMD's gpu profits (10mil x $12 each = 120mil, pretty much exactly AMD's profits).

    I don't see how Anandtech etc can say crypto mining is insane, when AMD's quarterly report shows miners must not be buying them much at all after the first rush at launch. Otherwise they would have made more than just console money. AMD said they get low double digits on consoles now (so not mid, which would be 15%; if closer to 15%, profits would be like 150mil), which is 8.2mil units already purchased in retail consoles, and another ~2mil in transit or already at MS/Sony being boxed up for more retail boxes. AMD gets their money long before we see it on the shelf in a console (hence the ~2mil in transit). MS/Sony don't pay AMD AFTER the console sales, they pay them BEFORE it gets anywhere near a box on a shelf. So AMD has already sold more than the shelf sales show.

    On the flipside, NV has a quarter with ~$145mil in profits and basically ALL of that is from GPUs. So again, if AMD is selling out (in any volume, that is), how come they didn't have $240mil in profits or something like that? Why are no profits from GPUs showing up? There must be a real shortage that isn't due to them selling out like crazy, but instead due to manufacturing chips that can do 1GHz without throttling. This is the ONLY assumption that fits the financial data. Miners are NOT buying these in massive quantities. AMD just can't make enough to satisfy anyone, thus the price goes through the roof and companies like PNY say they can't get chips. In turn this causes AMD's MSRP to basically be magical fairy dust pricing which may not be REALITY for many months to come :( Your price/perf story doesn't fit. AMD is winning nothing.

    Let me know when you can buy a 290 or 290x for $400 or $550. Let me know when you can buy a 265 for $149. This may turn out to be real for 265 but it isn't now and Anandtech shouldn't be comparing cards that don't even exist yet and pricing is unknown. I mean the reviews of 290/290x said "these are awesome buys", blah blah, but at $709 for 290x isn't it a terrible deal with 780TI OC models going for the same exact price but winning by 20%?
    http://hardocp.com/article/2014/02/10/msi_geforce_...
    OC 290x vs. OC 780ti.
    "The current street price of the ASUS R9 290X DirectCU II OC is $699.99 at several etailers (if you can find it in stock), representing a significant bump from its MSRP of $569.99. If you were to compare the two cards at MSRP, then the 20% performance difference between these could easily be accounted for with the 20% difference in price. However, at the current street pricing, the MSI GeForce GTX 780 Ti GAMING 3G simply slaps the ASUS R9 290X DirectCU II OC around with a large rainbow trout."

    Slaps AMD around like a rainbow trout? OK, OC & price contest settled then. Custom cooling won't magically trump the 780TI, and the gap between the 290x's $550 MSRP and the reality for a card that can actually do AMD's magical ref speeds and up is $150. I don't know why anyone would buy a 265 for compute (or any card in this category - it's stupid to benchmark this for these low-end models), so maybe AMD will actually get to $150 on them. But giving reviews based on pricing we now know may not happen, on cards that aren't even available yet, vs. a HARD LAUNCH with OC models already out far above what is tested here by anandtech, is a bit of a pipe dream at best.

    http://www.tomshardware.com/reviews/radeon-r7-265-...
    "In short, you'll have to pardon our skepticism that Radeon R7 265 will show up on time and at the price point AMD is claiming. We've seen fingers pointed at gun-shy add-in board partners, performance-thirsty cryptocurrency miners, price-gouging retailers, and foundries unable to keep up with supply. But at the end of the day, we're left wondering why AMD is setting prices if it can't control what you pay for its hardware? After piling praise onto the Radeon R9 280X at $300 and 290X at $550, it's our credibility on the line now, and we've been burnt too many times to give you guidance on a card you can't buy yet."

    Just one of 3-4 of their paragraphs outlining the pricing problems and AMD's magical prices :) You get a whole page dedicated to AMD's pricing issues at tomshardware...LOL. Tomshardware is worried about credibility claiming AMD's pricing is real.

    Even anandtech says it's probably magical pricing, so why compare the 265 to 750ti as if 265 will actually be $149?
    http://www.anandtech.com/show/7754/the-amd-radeon-...
    "but unless something changes to bring the other Pitcairn cards back down to their MSRPs, then $149 for 265 may be an unreasonable expectation"

    "The lack of selection has done no favors for the pricing, leading to 260 prices starting at $125. This is $15 above MSRP – a significant difference for this segment of the market – and just a stone’s throw away from the 260X at current prices."

    More comments about lack of 250's etc also. AMD can't seem to put a card out at MSRP. How do you come to the conclusion AMD wins at price perf, when no card is MSRP? Reality check please pal. If 260x is supposed to be $120 for new MSRP how is this possible given we already have regular 260 at $125? Again magical pricing is used for your statements not REALITY.

    Current pricing on the 290x/780TI is the same, and the 780TI smokes it, slapping it around like a rainbow trout. :) Not sure how you get AMD is winning from all of these comments. Yeah, if you include magical pricing that may ONE DAY exist, but not REALITY right now. You keep living in your fantasy world, I'll just stay in reality thanks. I fail to see how AMD will be able to keep up with the R&D NV is clearly investing in GPUs. I'm not sure why anandtech even bothered to benchmark the ref design, when they admit NOBODY will be shipping it.
    "NVIDIA’s partners will be launching with custom cards from day-one, and while NVIDIA has put together a reference board for testing and validation purposes, the partners will not be selling that reference board. Instead we’ll be seeing semi-custom and fully-custom designs; everyone has their own cooler, a lot of partners will be using the NVIDIA reference PCB, and others will be rolling out their own PCBs too."

    So why test them? That isn't reality as they clearly point out.
    http://www.anandtech.com/show/7764/the-nvidia-gefo...
    I don't get it. Further showing their AMD love they left the Zotac out of most high end benchmarks and only used ref. What? After clearly stating REF won't even be sold why KEY on REF designs in your benchmarks? Oh right, they only have an AMD portal on anandtech...Never mind...LOL. I get it. :)

    http://www.newegg.com/Product/ProductList.aspx?Sub...
    For $5-10 more over a REAL MSRP on 750ti you get 1176/1255 or 1202/1281 which are both WAY over stock.

    If Maxwell is designed for mobile gaming so "who cares", then AMD is designed for compute/mining crap that has just about NOTHING to do with gaming, so "who cares" too, right? I mean, if you want to win synthetic crap buy AMD. If you want to win in gaming buy NV. It would seem NV has the right idea for their audience. Also, I wasn't aware Intel has ever put out anything in GPUs that is damned good. LOL. They can't even catch BROKE AMD's Kaveri.

    You're basing your price perf comment on pricing that is not REAL. Get back to us when AMD puts out something that sells at their MSRP. Until then their PRICE is FAKE, and price to performance crap is meaningless unless you talk in terms of REAL pricing. In which case NV looks great as hardocp shows.

    "On a pure price/performance basis, the GTX 750 series is not competitive. If you’re in the sub-$150 market and looking solely at performance, the Radeon R7 260 series will be the way to go."
    Fantasy pricing makes this comment moot.

    "With that said however, we will throw in an escape clause: NVIDIA has hard availability today, while AMD’s Radeon R7 265 cards are still not due for about another 2 weeks. Furthermore it’s not at all clear if retailers will hold to their $149 MSRP due to insane demand from cryptocoin miners; if that happens then NVIDIA’s competition is diminished or removed entirely, and NVIDIA wins on price/performance by default."

    That comment is REALITY. We know they won't be MSRP if recent history is any indication. They have been so wrong that tomshardware can't recommend anything on MSRP now. Anyone making comparisons on MSRP for AMD at this point is not credible. Hardocp, tomshardware etc all note it's currently FAKE pricing. Until that changes any site should be writing reviews with REAL pricing in the recommendations/conclusions and people like you should just avoid using phrases like "price performance" ;) AMD is losing everything on price performance when using REAL pricing. I don't know why anyone even quotes MSRP. It's merely a SUGGESTED price. You should just quote the lowest newegg or amazon price as that is the lowest you can get MSRP or not. IF it's NOT a hard launch so you can get that pricing, you shouldn't be reviewing something (since I can't REALLY buy it and have no idea of the REAL price). Get it?

    NV needs to say Titan Black is MSRP of $550 I guess and start acting like AMD. Reviews would have to be written with MSRP conclusions for them then too right? Has anyone gotten a 290 for $400 or 290x for $550? Doubtful. Time for NV to join the lying game and soft launches with pricing that may not come for months?

    From anandtech's 265 article:
    "Finally, for the time being NVIDIA’s competition is going to be spread out, leaving a lack of direct competition for AMD’s latest arrivals. With the retirement of GTX 650 Ti Boost, NVIDIA doesn’t currently have a product directly opposite 265 at $149, with GTX 660 well above it and GTX 650 Ti well below it. On the other hand, NVIDIA’s closest competition for 260 as it stands is GTX 650 Ti, though this would shift if 260X cards quickly hit their new $119 MSRP."

    Totally false considering all AMD pricing is fake right? If 260x QUICKLY hits $119 new MSRP? ROFL. Until they HIT that pricing shut up please. NV doesn't have a direct competitor to $149 265? Yeah because it won't REALLY be $149...LOL. 270 shows it, 260 shows it, 290, 290x, 280, 280x...jeez. Does AMD have a card that really is MSRP? So anandtech is comparing the NV stack to a magically priced fairy dust AMD stack right? That seems a bit unfair considering all the data on AMD pricing currently and EVERY site commenting on it. They continue to heap praise on AMD in reviews based on fake pricing. You can't write a whole page on how bad AMD's pricing situation is then go ahead and write conclusions and recommendations on that fake pricing as if it is real or will be real at some point MAYBE. Misleading the public at best, which is why tomshardware/hardocp etc backed off now. NV wins by default, until AMD puts out a real MSRP card. Anandtech still gives AMD the benefit of the doubt giving NV an escape clause...It should be the other way around. AMD needs that clause as no card they've released recently is MSRP.
  • ninjaquick - Thursday, February 20, 2014 - link

    AMD's MSRP is very real; the issue is supply cannot meet demand. AMD only sees the money they make selling cards to vendors. If they had direct supply control (AMD-badged products) they could be making money hand over fist with the inflation, but AMD isn't in the position to do that. They sell their chipsets at the stipulated price point to their vendors, and the vendors then either pass on the savings, or squeeze supply and drive up prices for retailers. If they pass on the savings, then ultimately retailers are capitalizing on the consumer's willingness to pay more, but AMD, ultimately, will see no profits from this, and their sales will be hurt by lower flow due to high prices.

    Plenty of people bought 2XX series cards at their MSRP before Litecoin made the prices skyrocket. To date, only the 290 series is still hyper-inflated. And the MSRPs being determined there are from the vendors, not AMD.
  • chiechien - Friday, February 28, 2014 - link

    The 280x is still priced 50% to 100% over MSRP, too. It's supposed to be $300, but you can't find one cheaper than $450, with $500-600 quite common. The R9 270x runs about $50-$100 over MSRP (25-50%).
  • Zetbo - Thursday, February 20, 2014 - link

    I just bought a 4096MB Asus Radeon R9 290 DirectCU II OC Aktiv PCIe 3.0 x16 for €397.53. I think it's a fair price. http://www.mindfactory.de/product_info.php/4096MB-...
  • vision33r - Sunday, March 9, 2014 - link

    That tiny fraction currently buys more GPUs than the avg consumer. Thus the demand for AMD's high end GPUs.
  • A5 - Tuesday, February 18, 2014 - link

    If you really need full FP64, get whoever is paying you to buy a Tesla card.
  • extide - Tuesday, February 18, 2014 - link

    Or go with a big-GCN card :)
  • A5 - Tuesday, February 18, 2014 - link

    Or that, assuming your code isn't locked in to CUDA.
  • ddriver - Tuesday, February 18, 2014 - link

    Thank god it is not. Running about 50 TFLOPS here on nice cheap Radeons - no overpriced Tesla junk, thank you very much Nvidia.
  • Morawka - Tuesday, February 18, 2014 - link

    Where are you buying your Radeons? They're overpriced, price-gouged to hell, with steaming hot thermals. But sure, they do FP64 great - go get 'em tiger!!!
  • Mondozai - Wednesday, February 19, 2014 - link

    Anywhere outside of NA gives normal prices. Get out of your bubble.
  • ddriver - Wednesday, February 19, 2014 - link

    Yes, prices here are pretty much normal; no one rushes to waste electricity on something as stupid as bitcoin mining. Anyway, I got most of the cards even before that craze began.
  • R3MF - Tuesday, February 18, 2014 - link

    At ~1bn transistors for 512 Maxwell shaders, I think a 20nm enthusiast card could afford the 10bn transistors necessary for 4096 shaders...
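
    As a back-of-the-envelope check of that scaling (the ~1bn-per-512-shaders figure is the estimate above, not an official spec, and uncore is ignored):

    ```cpp
    #include <iostream>

    // Linear scaling of the transistor budget with shader count, using the
    // rough ~1bn transistors per 512 Maxwell shaders estimate above.
    int main() {
        const double transistors_per_shader = 1.0e9 / 512.0; // ~1.95M each
        const double estimate = transistors_per_shader * 4096.0;
        std::cout << "4096 shaders: ~" << estimate / 1.0e9
                  << "bn transistors\n"; // 8bn; ~10bn once uncore is added
        return 0;
    }
    ```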
  • Krysto - Tuesday, February 18, 2014 - link

    If Maxwell has 2x the perf/W, and Tegra K2 arrives at 16nm with 2 SMX (which is a very reasonable expectation), then Tegra K2 will have at least 1 teraflop of performance, if not more than 1.2 teraflops, which would already surpass the Xbox One.

    Now THAT's exciting.
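
    For reference, a rough sketch of where that teraflop figure comes from, using the standard peak-FLOPS formula (cores x 2 FLOPs per clock for FMA x clock); the SMX count is the speculation above and the clock is a hypothetical:

    ```cpp
    #include <iostream>

    // Peak single-precision FLOPS = cores * 2 (fused multiply-add) * clock.
    int main() {
        const int smx = 2;              // speculated SMX count
        const int cores_per_smx = 192;  // Kepler-style SMX
        const double clock_ghz = 1.3;   // hypothetical clock at 16nm
        const double gflops = smx * cores_per_smx * 2.0 * clock_ghz;
        std::cout << "Peak: ~" << gflops / 1000.0 << " TFLOPS\n"; // ~1.0
        return 0;
    }
    ```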
  • chizow - Tuesday, February 18, 2014 - link

    It probably won't be Tegra K2, will most likely be Tegra M1 and could very well have 3xSMM at 20nm (192x2 vs. 128x3), which according to the article might be a 2.7x speed-up vs. just a 2x using Kepler's SMX arch. But yes, certainly all very exciting possibilities.
  • grahaman27 - Wednesday, February 19, 2014 - link

    The Tegra M1 will be on 16nm FinFET if they stick to their roadmap. But since they are bringing the 64-bit version sooner than expected, I don't know what to expect. BTW, it has yet to be announced what manufacturing process the 64-bit version will use... we can only hope TSMC 20nm will arrive in time.
  • Mondozai - Wednesday, February 19, 2014 - link

    Exciting or f%#king embarrassing for M$? Or for the console industry overall.
  • RealiBrad - Tuesday, February 18, 2014 - link

    Looks to be an OK card when you consider that mining has caused AMD cards to sell out and pushed up prices.

    It looks like the R7 265 is fairly close on power, temp, and noise. If AMD supply could meet demand, then the 750Ti would need to be much cheaper and would not look nearly as good.
  • Homeles - Tuesday, February 18, 2014 - link

    Load power consumption is clearly in Nvidia's favor.
  • DryAir - Tuesday, February 18, 2014 - link

    Power consumption is way higher... take a look at TPU's review. But price/perf is a lot better, yeah.

    Personally I'm a sucker for low power, and I will gladly pay for it.
  • RealiBrad - Tuesday, February 18, 2014 - link

    If you were to run the AMD card 10 hours a day at the average cost of electricity in the US, you would pay around $22 more a year in electricity. The AMD card gives a 19% boost in performance for a 24.5% boost in power usage. That means that the Nvidia card is around 5% more efficient. It's nice that they got the power envelope so low, but if you look at the numbers, it's not huge.
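
    A quick sketch of that electricity math (the ~50W load-power gap and the ~$0.12/kWh rate are assumptions for illustration):

    ```cpp
    #include <iostream>

    // Annual electricity cost difference for a given wattage gap,
    // daily usage, and electricity rate.
    int main() {
        const double watts_delta = 50.0;   // assumed load-power gap
        const double hours_per_day = 10.0;
        const double usd_per_kwh = 0.12;   // rough US average
        const double kwh_per_year = watts_delta * hours_per_day * 365.0 / 1000.0;
        std::cout << "Extra cost per year: $" << kwh_per_year * usd_per_kwh
                  << "\n"; // ~$22
        return 0;
    }
    ```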

    The biggest factor is the supply coming out of AMD. Unless they start making more cards, the 750Ti will be the better buy.
  • Homeles - Tuesday, February 18, 2014 - link

    Your comment is very out of touch with reality, in regards to power consumption/efficiency:

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...

    It is huge.
  • mabellon - Tuesday, February 18, 2014 - link

    Thank you for that link. That's an insane improvement. Can't wait to see 20nm high end Maxwell SKUs.
  • happycamperjack - Wednesday, February 19, 2014 - link

    That's for gaming only; its compute performance/watt is still horrible compared to AMD though. I wonder when Nvidia can catch up.
  • bexxx - Wednesday, February 19, 2014 - link

    http://media.bestofmicro.com/9/Q/422846/original/L...

    260 kH/s at 60 watts is actually very high - that is basically matching the 290x in kH/watt (~1000/280 watts), and beating out the r7 265 or anything else... if you only look at kH/watt.
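
    Working those numbers through (the hash rates are the figures quoted above, not independently verified):

    ```cpp
    #include <iostream>

    // Mining efficiency comparison in kH/s per watt.
    int main() {
        const double gtx750ti_khs = 260.0, gtx750ti_w = 60.0;
        const double r9_290x_khs = 1000.0, r9_290x_w = 280.0;
        std::cout << "GTX 750 Ti: " << gtx750ti_khs / gtx750ti_w
                  << " kH/s per watt\n"; // ~4.3
        std::cout << "R9 290X:    " << r9_290x_khs / r9_290x_w
                  << " kH/s per watt\n"; // ~3.6
        return 0;
    }
    ```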
  • ninjaquick - Thursday, February 20, 2014 - link

    To be honest, all nvidia did was increase the granularity of power gating and core states, so in the event of pure burn, the TDP is hit, and the perf will (theoretically) droop.

    The reason the real world benefits from this is simply the way rendering works, under DX11. Commands are fast and simple, so increasing the number of parallel queues allows for faster completion and lower power (Average). So the TDP is right, even if the working wattage per frame is just as high as any other GPU. AMD doesn't have that granularity implemented in GCN yet, though they do have the tech for it.

    I think this is fairly silly, Nvidia is just riding the coat-tails of massive GPU stalling on frame-present.
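
    A toy model of that averaging effect (all numbers hypothetical): if the GPU only burns full power for the busy part of each frame and gates down during the stall, average board power drops well below the burn figure.

    ```cpp
    #include <iostream>

    // Average power when the GPU gates aggressively during per-frame stalls:
    // peak burn wattage is unchanged, but idle gaps stop costing full power.
    int main() {
        const double frame_ms = 16.7;  // 60 fps frame budget
        const double busy_ms = 10.0;   // hypothetical render time per frame
        const double burn_w = 60.0;    // full-tilt board power
        const double gated_w = 15.0;   // hypothetical gated/idle power
        const double busy = busy_ms / frame_ms;
        std::cout << "Average: " << busy * burn_w + (1.0 - busy) * gated_w
                  << " W\n"; // ~42 W against a 60 W burn
        return 0;
    }
    ```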
  • elerick - Tuesday, February 18, 2014 - link

    Since the performance charts have the 650TI Boost, I looked up its TDP: 140W. Compared to the Maxwell 750TI with its 60W TDP, I am in awe of the performance per watt. I sincerely hope the 20nm 760/770/780 give performance a sharper edge, but even if they don't, it will still give people with older graphics cards more of a reason to finally upgrade, since driver performance tuning will start favoring Maxwell over the next few years.
  • Lonyo - Tuesday, February 18, 2014 - link

    The 650TI/TI Boost aren't cards designed to be efficient. They are cut-down cards with sections of the GPU disabled. While 2x perf per watt might be somewhat impressive, it's not that impressive given the comparison is made to inefficient cards.
    Comparing it to something like a GTX650 regular, which is a fully enabled GPU, might be a more apt comparison, and probably wouldn't give the same perf/watt increases.
  • elerick - Tuesday, February 18, 2014 - link

    Thanks, I haven't been following lower end model cards for either camp. I usually buy $200-$300 class cards.
  • bexxx - Thursday, February 20, 2014 - link

    Still just over 1.8x higher perf/watt: http://www.techpowerup.com/reviews/NVIDIA/GeForce_...
  • MrSpadge - Tuesday, February 18, 2014 - link

    To be fair GTX650Ti Boost consumes ~100 W in the real world. Still a huge improvement!
  • NikosD - Tuesday, February 18, 2014 - link

    Hello.

    I have a few questions regarding HTPC and video decoding.

    Can we say that we have a new video processor from Nvidia - one that merits a new name like VP6, or is it more like a VP5.x?

    What is Nvidia calling the new video decoder?

    Why don't you add a 4K 60fps clip in order to test the soon-to-be-released HDMI 2.0 output?

    If you run a benchmark using DXVA Checker between VP5 and VP6 (?), how much faster is VP6 with H.264 1080p and 4K clips?

    Thanks!
  • Ryan Smith - Thursday, February 20, 2014 - link

    NVIDIA doesn't have a name for it; at least not one they're sharing with us.
  • NikosD - Thursday, February 20, 2014 - link

    Thanks.
    Is it possible to try a 4K 60fps clip with Maxwell?

    I wonder if it can decode it in realtime...
  • Flunk - Tuesday, February 18, 2014 - link

    I think these will be a lot more exciting in laptops, even if they're nowhere near Nvidia's claimed 2x Kepler efficiency per watt. On the desktop it's not really that big a deal. The top-end chip will probably be ~40% faster than the 780TI, but that will be a while.
  • dylan522p - Tuesday, February 18, 2014 - link

    The 880 will be much more powerful than the 780ti - more than 40% even. They could literally die-shrink it, throw in a few more SMXs, and the 40% would be achieved. I would imagine either they are gonna have a HUGE jump (80%+), or they are gonna do what they did with Kepler and release a 200W SKU that is about 50% faster, and when 20nm yields are good enough have the 900 series come with 250W SKUs.
  • Kevin G - Tuesday, February 18, 2014 - link

    Very impressive performance for its power consumption. I can see an underclocked version of this card coming with a passive cooler for HTPC solutions. Perhaps that'd be a hypothetical GT740? I'm surprised that nVidia hasn't launched a mobile version of this chip. It seems like it'd be ideal for midrange laptops that still have discrete graphics.

    I suspect that the extra overclocking headroom is in reserve for a potential rebrand to a GTX 800 series product. (Though a straight die shrink of this design to 20 nm would provide even more headroom for a GTX 800/900 card.) nVidia could have held back to keep it below the more expensive GTX 660.

    Though ultimately I'm left wanting the bigger GM100 and GM104 chips. We're going to have to wait until 20 nm is ready but considering the jump Maxwell has provided in the low end of the market, I'm eager to see what it can do in the high end.
  • DanNeely - Tuesday, February 18, 2014 - link

    ASUS has a 65W TDP GT 640 with a big 2-slot passive heatsink (GT640-DCSL-2GD3); with the 750 Ti only hitting 60W, a passive version of it should be possible at near-stock performance. I suspect the 740 will be a further cut-down 3 SMM model, which might allow a single-slot passive design.
  • PhoenixEnigma - Tuesday, February 18, 2014 - link

    Passive cooling was my first thought as well - I've been looking for something to replace the 6770 in my HTPC, and I wanted something both faster and passively cooled. There are already passive 7750s on the market, and the numbers in Bench put the 750Ti at about 9W more than the 7750 under real-world load, so a vanilla 750 with a passive cooler should be entirely possible. Even a 750Ti might be doable, but that could be pushing things a little far.
  • evilspoons - Tuesday, February 18, 2014 - link

    I need a new half-height HTPC card, my 2.5 year old Asus Radeon 6570 bit the dust last month (sparkly picture, one particular shade of grey turned random colours). If they can work out the kinks in this thing and underclock it a bit, it sounds like a good candidate.

    It feels like it's been a long time since anything new showed up in the half-height video card game.
  • TheinsanegamerN - Tuesday, February 18, 2014 - link

    Look at Sapphire's 7750: superior in every way to the 6570, single-slot low-profile, and it overclocks like a champ.
  • dj_aris - Tuesday, February 18, 2014 - link

    Sure, but its cooler is kind of loud. Definitely NOT a silent HTPC choice. Maybe a LP 750 would be better.
  • evilspoons - Tuesday, February 18, 2014 - link

    Thanks for pointing that out. None of my local computer stores sell that, but I took a look on MSI's site and sure enough, there it is. They also seem to have an updated version of the same card being sold as an R7 250, although I'm not sure there's any real difference or if it's just a new sticker on the same GPU. Clock speeds, PCB design, and heat sink are the same, anyway.
  • Sabresiberian - Tuesday, February 18, 2014 - link

    I'm hoping the power efficiency means the video cards at the high end will get a performance boost because they are able to cram more SMMs on the die than the SMXs used in Kepler solutions. This of course assumes the lower power spec means less heat as well.

    I do think we will see a significant performance increase when the flagship products are released.

    As far as meeting DX11.1/11.2 standards - it would be interesting to hear from game devs how much this affects them. Nvidia has never been all that interested in actually meeting all the requirements for Microsoft to give them official status for DX versions, but that doesn't mean the real-world visual quality is reduced. In the end what I care about is visual quality; if it causes them to lose out compared to AMD's offerings, I will jump ship in a heartbeat. So far that hasn't been the case though.
  • Krysto - Tuesday, February 18, 2014 - link

    Yeah, I'm hoping for a 10-teraflop Titan, so I can pair it with my Oculus Rift next year!
  • Kevin G - Tuesday, February 18, 2014 - link

    nVidia has been quite aggressive with the main DirectX version. They heavily pushed DX10 back in the day with the Geforce 8000/9000 series. They do tend to de-emphasize smaller updates like 8.1, 10.1, 11.1 and 11.2. This is partially due to their short life spans on the market before the next major update arrives.

    I do expect this to have recently changed, as Windows is moving to a rapid release schedule and it'll be increasingly important to adopt these smaller iterations.
  • kwrzesien - Tuesday, February 18, 2014 - link

    Cards on Newegg are showing DirectX 11.2 in the specs list along with OpenGL 4.4. Not that I trust this more than the review - we need to find out more.
  • JDG1980 - Tuesday, February 18, 2014 - link

    The efficiency improvements are quite impressive considering that they're still on 28nm. TDP is low enough that AIBs should be able to develop fanless versions of the 750 Ti.

    The lack of HDMI 2.0 support is disappointing, but understandable, considering that it exists virtually nowhere. (Has the standard even been finalized yet?) But we need to get there eventually. How hard will it be to add this feature to Maxwell in the future? Does it require re-engineering the GPU silicon itself, or just re-designing the PCB with different external components?

    Given the increasing popularity of cryptocoin mining, some benchmarks on that might have been useful. I'd be interested to know if Maxwell is any more competitive in the mining arena than Kepler was. Admittedly, no one is going to be using a GPU this small for mining, but if it is competitive on a per-core basis, it could make a big difference going forward.
  • xenol - Tuesday, February 18, 2014 - link

    I'm only slightly annoyed that NVIDIA released this as a 700 series and not an 800 series.
  • DanNeely - Tuesday, February 18, 2014 - link

    I suspect that's an indicator that we shouldn't expect the rest of the Maxwell line to launch in the immediate future.
  • dylan522p - Tuesday, February 18, 2014 - link

    They are waiting for 20nm for the entire 800 series.
  • MugatoPdub - Tuesday, February 18, 2014 - link

    Interestingly, it seems Nvidia has simply followed Intel in the "mobile first" market race. It is starting to feel as if the enthusiast will be left in the dust within the next few years =(
  • Krysto - Tuesday, February 18, 2014 - link

    Not likely, thanks to the boom in VR that we'll be seeing, which at 4K and 120fps will require 16x the performance we get now, just to play the same games, in a few years.

    So if anything, Nvidia should be making GPUs at the high end that are a level or two ABOVE Titan (think 20-30 TF GPUs in 2015).
  • A5 - Tuesday, February 18, 2014 - link

    They probably will? I'm guessing we won't see stuff below the top end (or SLI) targeted at 4K until late 2015/spring 2016, though.
  • madmilk - Tuesday, February 18, 2014 - link

    I doubt enthusiasts will be left behind, simply because HPC users will demand a 225W Tesla card. That in turn can easily be sold as a 250W enthusiast card, perhaps under the Titan line.
  • Mondozai - Wednesday, February 19, 2014 - link

    Also, Nvidia's desktop business is contributing to their profits and is seeing revenue growth. Their Tegra business revenue is falling almost 50% year over year.

    The desktop high-end GPU market will grow in good health for years to come. Their discrete laptop GPUs, however, will face doom in a relatively short period of time as integrated GPU performance rises to a level where most people are satisfied. Laptops specifically for gaming continue to be an insignificant market.
  • jkauff - Tuesday, February 18, 2014 - link

    madVR NNEDI3 uses OpenCL, and works fine on Intel and AMD boards. NVIDIA OpenCL support has been broken for the last couple of driver iterations. Please use your influence with the NVIDIA developers to get this fixed in the next driver release.
  • IKeelU - Tuesday, February 18, 2014 - link

    ugh, cryptocoin
  • texasti89 - Tuesday, February 18, 2014 - link

    Wow! Substantially faster than the 260x while consuming less than 60W on the same process node. Really impressive. I can't wait to see how Maxwell arch performance & power scale at 20nm. I'm really convinced now that AMD GCN is not as efficient as many reviewers think. AMD will very likely have a hard time this round.
  • g101 - Tuesday, February 18, 2014 - link

    *Slower... almost always slower than the 260x, all for power savings in the range of 5-10%. How... exciting?

    You should try reading the actual words that have been written in the article.
  • texasti89 - Tuesday, February 18, 2014 - link

    http://media.bestofmicro.com/4/R/422667/original/F...
  • texasti89 - Tuesday, February 18, 2014 - link

    Also, I was referring to the 750ti (60W) in my comment, not the 750 (55W). Words in the article reflect the reviewer's opinions; benchmark results from various tech websites give the same conclusion.
  • texasti89 - Tuesday, February 18, 2014 - link

    Another one to look at : http://www.techpowerup.com/reviews/NVIDIA/GeForce_...
  • tspacie - Tuesday, February 18, 2014 - link

    [Coming soon to a flu near you]

    This is a caching error or similar on page 4, right?
  • mindbomb - Tuesday, February 18, 2014 - link

    Hello Ryan and Ganesh. I'd like to point out for your video tests that there is no luma upscaling or image doubling for a 1080p video on a 1080p display, since the luma is already at the display resolution. You need to test those with a 720p video, and they are mutually exclusive, since image doubling will convert 1280x720 to 2560x1440, where you will need to downscale rather than upscale.
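
    To make the geometry concrete, a minimal sketch of why doubling a 720p source overshoots a 1080p panel (resolutions as named above):

    ```cpp
    #include <iostream>

    // Image doubling a 720p frame overshoots a 1080p display, so the
    // doubled frame must then be scaled down, not up.
    int main() {
        const int dbl_w = 1280 * 2, dbl_h = 720 * 2; // 2560x1440 after doubling
        const int disp_w = 1920, disp_h = 1080;      // 1080p display
        std::cout << dbl_w << "x" << dbl_h << " -> " << disp_w << "x" << disp_h
                  << ", a downscale factor of "
                  << static_cast<double>(disp_w) / dbl_w << "\n"; // 0.75
        return 0;
    }
    ```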
  • ganeshts - Tuesday, February 18, 2014 - link

    Luma upscaling is present for 480i / 576i / 720p videos and downscaling for the 4Kp30 video. We have nine different sample streams.
  • jwcalla - Tuesday, February 18, 2014 - link

    I'd like to see AT adopt some OpenGL benchmarks in the future.

    Us OpenGL consumers are out here. :)
  • Ryan Smith - Thursday, February 20, 2014 - link

    So would I. But at the moment there aren't any meaningful games using OpenGL that are suitable for benchmarking. After Wolfenstein went out of date and Rage was capped at 60fps, we ended up stuck in that respect.
  • Roland00Address - Tuesday, February 18, 2014 - link

    Feel better Ryan, don't let the flu get you down! (Or is it Ganesh T S?)

    Looks like Nvidia has an 8800GT/9800GT on its hands (for different reasons than the original 8800GT)
  • Hrel - Tuesday, February 18, 2014 - link

    Seriously impressive performance/watt figures in here. Makes me wonder when we're going to see this applied to their higher-end GPUs.

    Looking at TSMC's site they are already producing at 20nm in 2 fabs. Starting in May of this year they'll have a 3rd up. Do you think it's likely May/June is when we'll see Maxwell make its way into higher-end GPUs, accompanied by a shift to 20nm?

    That approach would make sense to me, they'd have new product out in time for Summer Sales and have enough time to ramp production and satiate early adopters before back to school specials start up.

    On a personal note: I'm still running a GTX460 and the GTX750ti seems to be faster in almost every scenario at lower power draw in a smaller package. So that's pretty cool. But since TSMC is already producing 20nm chips I'm going to wait until this architecture can be applied at a smaller manufacturing process. That GPU is in a media PC, so gaming is a tertiary concern anyway.
  • kwrzesien - Tuesday, February 18, 2014 - link

    Seems coincidental that Apple is going to use TSMC for all production of the A8 chip with Samsung not ready yet; maybe Apple is getting priority on 20nm? Frankly, what nVidia is doing with 28nm is amazing, and if the yields are great on this mature process, maybe the price isn't so bad on a big die. Also keep in mind that the larger the die, the more surface area there is to dissipate heat; Haswell proved that moving to a very dense and small die can create even more thermal limitations.
  • DanNeely - Tuesday, February 18, 2014 - link

    Wouldn't surprise me if they are; all the fab companies other than Intel are wailing about the agonizingly high costs of new process transitions and Apple has a history of throwing huge piles of its money into accelerating the build up of supplier production lines in trade for initial access to the output.
  • dylan522p - Tuesday, February 18, 2014 - link

    Many rumors point to Apple actually making a huge deal with Intel for 14nm on the A8.
  • Mondozai - Wednesday, February 19, 2014 - link

    Maybe 14nm for the iPhone to get even better power consumption and 20nm for the iPad? Or maybe give 14nm to the premium models of the iPad over the mini to differentiate further and slow/reverse cannibalization.
  • Stargrazer - Tuesday, February 18, 2014 - link

    So, what about Unified Virtual Memory?
    That was supposed to be a major new feature of Maxwell, right? Is it not implemented in the 750s (yet - waiting for drivers?), or is there currently a lack of information about it?
  • A5 - Tuesday, February 18, 2014 - link

    That seems to be a CUDA-focused feature, so they probably aren't talking about it for the 750. I'm guessing it'll come up when the higher-end parts come out.
  • Ryan Smith - Thursday, February 20, 2014 - link

    Bingo. This is purely a consumer product; the roadmaps we show are from NV's professional lineup, simply because NV doesn't produce a similar roadmap for their graphics lineup (despite the shared architecture).
  • dragonsqrrl - Tuesday, February 18, 2014 - link

    "Meet The Reference GTX 750 Ti & Zotac GTX 750 Series"

    "This is the cooler style that most partners will mimic, as the 60W TDP of the GTX 650 Ti does not require a particularly large cooler"

    You mean 750 Ti right?
  • chizow - Tuesday, February 18, 2014 - link

    The performance and efficiency of this chip and Maxwell is nothing short of spectacular given this is still on 28nm. Can't wait to see the rest of the 20nm stack.

    Ryan, are you going to replace the "Architectural Analysis" at some point? Really looking forward to your deep-dive on that, or is it coming at a later date with the bigger chips?
  • dgingeri - Tuesday, February 18, 2014 - link

    In the conclusion, the writer talks about the advantages of the AMD cards, but after my experiences with my old 4870X2, I'd rather stick with Nvidia, and I know I'm not alone. Has AMD improved their driver quality to a decent level yet?
  • formulav8 - Tuesday, February 18, 2014 - link

    Overall their drivers have been fine for a long time. NVidia and AMD run into bugs from time to time. I'm sure you know NVidia has bugs as well. Just because you don't see them doesn't mean there are none and that NVidia is perfect.
  • dgingeri - Wednesday, February 19, 2014 - link

    yeah, well, I've never had bugs with my Nvidia cards to the degree of the bugs with my two AMD/ATI video cards. I had that 4870X2 for over a year and a half, quietly waiting for them to fix the bugs in the drivers to get it to properly use the second GPU, only to have them completely abandon me at the 10.1 driver and beyond.
  • extide - Tuesday, February 18, 2014 - link

    So, really, not even a single mention of Maxwell's marquee feature, "Unified Virtual Memory"? I think you guys got a little bit caught up in the performance/watt increase, and entirely forgot about that!
  • kwrzesien - Tuesday, February 18, 2014 - link

    It will probably be unveiled with the 800 series cards. It might be baked into the die but not enabled in this revision or it might be there but not enabled in the drivers. What we want to wait for is the 860/880 which will have the muscle to do something amazing with UVM.
  • jwcalla - Tuesday, February 18, 2014 - link

    I don't think UVM will provide much in the way of benefits other than making some CUDA programming [marginally] easier.
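
    For context, the "easier" part is mostly that one managed allocation replaces the usual host-buffer/device-buffer/cudaMemcpy dance. A minimal host-side sketch of CUDA's managed memory (CUDA 6-era API, compiled with nvcc; the buffer size is arbitrary and no kernel is launched here):

    ```cpp
    #include <cuda_runtime.h>
    #include <cstdio>

    // With unified (managed) memory, one allocation is visible to both CPU
    // and GPU; the runtime migrates pages on demand, so the explicit
    // cudaMemcpy() staging of the classic model goes away.
    int main() {
        const int n = 1 << 20;
        float* data = nullptr;
        cudaMallocManaged(&data, n * sizeof(float)); // one pointer, host + device

        for (int i = 0; i < n; ++i) data[i] = 1.0f;  // host writes directly

        // A kernel launched here could read/write 'data' with no cudaMemcpy;
        // cudaDeviceSynchronize() would then make results visible to the host.

        cudaFree(data);
        std::printf("managed buffer of %d floats allocated and freed\n", n);
        return 0;
    }
    ```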
  • doubledeej - Tuesday, February 18, 2014 - link

    Is anybody ever going to produce a decent single slot video card again? I'm getting a little tired of either having low-end, slow cards, or having all of my PCIe slots being covered up.
  • DanNeely - Tuesday, February 18, 2014 - link

    Probably not. Making a quiet card is much easier in a 2 slot form factor; and the number of people who care about performance and need the slot immediately below their GPU is a very small fraction of the total.

    On a tangent, legacy PCI is facing extinction because it's been removed from Intel's current chipsets and is only available via a bridge chip. If you have a long-term need for it, I'd advise buying a current-generation mobo with an unobstructed slot while they're still on the market.
  • TheinsanegamerN - Tuesday, February 18, 2014 - link

    you mean like the geforce 680 single slot edition that never came out? yeah, I wish companies would actually make those things.
  • FelixDraconis - Tuesday, February 18, 2014 - link

    I really like the direction this card is going in. Especially if there's a DisplayPort variant coming. It's tough building a low power ITX system with DP without using onboard graphics.

    But what's up with the 750 name? That's just blatantly misleading, as the 760 has been out for almost a year now and is an older process, while originally Maxwell was slated to be called 8XX. Nothing new, but boy is this confusing. I guess it makes short term sense for marketing, as it always does.
  • jwcalla - Tuesday, February 18, 2014 - link

    Obviously they're reserving the 800 series for the 20nm parts.
  • EdgeOfDetroit - Tuesday, February 18, 2014 - link

    The EVGAs have Displayport, but they might be the only ones. I ordered the Superclocked 750 Ti with the $5 rebate from Newegg because it had a DisplayPort and the competitors did not.
  • Death666Angel - Tuesday, February 18, 2014 - link

    "the 760 has been out for almost a year now and is an older process" -> Still the same 28nm process for the 760 and 750 alike. :)
  • MrPoletski - Tuesday, February 18, 2014 - link

    This jump in cache from 128K to 2MB... I wonder what that does for cryptocurrency mining?
  • The Von Matrices - Tuesday, February 18, 2014 - link

    Unless the integer shift operation has been improved, not much.
  • g101 - Tuesday, February 18, 2014 - link

    Nothing. Nvidia is fundamentally deficient at integer compute; these are architectural decisions that Nvidia made in hopes of squeezing out slightly better FPS. Think: anti-GPGPU, or more of a classic ASIC.

    So no, this arch isn't going to change their position with regards to the actual algorithms. Perhaps there will be a moderate increase in scrypt/SHA2 performance (due to the memory-hard nature of that implementation); however, nvidia's extreme (and sometimes intentional) incompetence with GPGPU leads me to believe that they still do not understand that GPGPU is the reason AMD's cards are above MSRP. It's not due to one specific hashing function, it's due to their superiority in over 11 specific functions, superior general OpenCL performance and comparatively greater performance for many SP-compute-intensive CUDA applications. For instance, cross-comparison between CUDA and OpenCL raycasting yields some very interesting results, with the OpenCL/AMD solutions outperforming CUDA 2:1, often with greater accuracy.

    CUDA is easy, NVidia has zero compute advantage beyond 'ease'.
  • oleguy682 - Tuesday, February 18, 2014 - link

    AMD receives nothing for their cards being sold over MSRP. Their channel partners likely have agreements in place for this generation of processors that are locked in at a specific price or price range. Perhaps if they signed new partners, or revised their processors substantially enough to warrant a new agreement, they could take advantage of the higher-than-MSRP situation, but I doubt it. And even the ASUSes and Gigabytes of the world are likely unable to capitalize much on the demand. At best, they are able to sell boards to retailers as fast as they come off the line.

    Only the Neweggs are profiting handsomely off of this.
  • HighTech4US - Wednesday, February 19, 2014 - link

    Von and g101, you are both wrong, as Maxwell has now greatly improved integer compute. Check out the following review page from Tom's:

    http://www.tomshardware.com/reviews/geforce-gtx-75...

    Quote: Historically, Nvidia's cards came up short against competing Radeons, which is why you see R9 290X boards selling for $700 and up. But the Maxwell architecture's improvements allow the 60 W GeForce GTX 750 Ti to outperform the 140 W GeForce GTX 660 and approach AMD's 150 W Radeon R7 265, which just launched, still isn't available yet, but is expected to sell for the same $150. On a scale of performance (in kH/s) per watt, that puts Nvidia way out ahead of AMD. Today, four GM107-based cards in a mining rig should be able to outperform a Radeon R9 290X for less money, using less power.
  • Yojimbo - Wednesday, February 19, 2014 - link

    Which is good for NVidia, maybe just lucky. Increasing gamer market share in exchange for some short-term profits is probably a good trade-off for Nvidia. If AMD can't maintain their market share, they'll have less muscle behind their Mantle initiative.
  • hpvd - Tuesday, February 18, 2014 - link

    Does this first small Maxwell bring support for Unified Virtual Memory Management IN HARDWARE? If yes: it would be really interesting to see how efficiently it could work...
    details see:
    http://www.anandtech.com/show/7515/nvidia-announce...
  • willis936 - Tuesday, February 18, 2014 - link

    I would like very much to see a comparison of GM107 in SLI to other $300 graphics card options. Us 560 Ti owners are in a tough position because it's upgradin' time and there's no decent, quiet solution. SLI is still a bit of a hack and from what I can tell can be more of a compatibility headache than a performance gain. These cards may be the exception though.
  • EdgeOfDetroit - Tuesday, February 18, 2014 - link

    This card (Evga 750 Ti OC) is replacing a 560Ti for me. It's slower, but it's not my primary game machine anymore anyway. I'll admit I was kinda bummed when the 700 series stopped at the 760, and now that the 750 is here, it's like they skipped the true successor to the 560 and 660. I can probably still get something for my 560Ti, at least.
  • rhx123 - Tuesday, February 18, 2014 - link

    I wonder if we'll get the 750Ti or even the 750 in a half height config.

    It would be nice for HTPCs given the power draw, but I'm not optimistic.
    There's still nothing really decent in the half height Nvidia camp.
  • Frenetic Pony - Tuesday, February 18, 2014 - link

    "it is unfortunate, as NVIDIA carries enough market share that their support (or lack thereof) for a feature is often the deciding factor whether it’s used"

    Not this time. Both the Xbone and PS4 are fully feature-compliant, as are GCN 1.1 cards; heck, even GCN 1.0 has a lot of the features required. With the new consoles, especially the PS4, selling incredibly well, these are going to be the baseline, and if you buy an NVIDIA card without it, you'll be SOL for the highest-end stuff.

    Just another disappointment with Maxwell, when AMD is already beating Nvidia very solidly on price for performance. Which is a shame - I love their steady and predictable driver support and well-designed cooling setups. But if they're not going to compete, especially with the rumors of how much Broadwell supposedly improves on Intel's mobile stuff, well then I just don't know what to say.
  • Rebel1080 - Tuesday, February 18, 2014 - link

    Can we all come to a consensus by declaring the 8th console generation an epic bust!!! When the seventh-generation consoles (PS3/XB360) made their debut, it took Nvidia and AMD 12-18 months to ship a mainstream GPU that could match or exceed their performance. This generation it only took 3 months, at 2/3rds the price those cards (3870/8800GT) sold at.

    It's pretty condemning that both Sony's and MSFT's toy boxes are getting spanked by $119-149 cards. Worst of all, the cards are now coming from both GPU companies, which I'm sure gives Nvidia all smiles.
  • FearfulSPARTAN - Tuesday, February 18, 2014 - link

    Really, an epic bust? Come on now, we all knew from the start they were not going to be bleeding edge, based on the specs. They were not going for strong single-threaded performance; they were aiming for well-threaded, good-enough CPU performance, and the GPUs they had were average for their time. However, considering the PS4 and X1 are selling very well, calling the entire gen a bust already is just stupid. You don't need high performance in consoles when you have developers coding to scrape every bit of performance they can out of your hardware; that's something we don't have in the PC space, and it's why most gamers are not using those cards that just met last-gen console performance seven years ago.
  • Rebel1080 - Tuesday, February 18, 2014 - link

    They're selling well for the same reasons iTards keep purchasing Apple products even though they only offer incremental updates on hardware and even less on software. It's something I like to call "The Lemming Effect".

    Developers code to the metal, but that only does so much, and then you end up having to compromise the final product via lower res, lower fps, lower texture detail. Ironically, I was watching several YouTube videos of current-gen games (BF3&4, Crysis 3, Grid 2, AC4) running at playable fps between 720p and 900p on a Radeon 3870.
  • oleguy682 - Tuesday, February 18, 2014 - link

    Except that unlike Apple, Sony and Microsoft are selling each unit at a loss once the BOM, assembly, shipping, and R&D are taken into consideration. The PS3 was a $3 billion loss in the first two years it was available. The hope is that licensing fees, add-ons, content delivery, etc. will result in enough revenue to offset the investment, subsidize further R&D, and leave a bit left over for profit. Apple, on the other hand, is making money on both the hardware and the services.

    And believe it or not, there are a lot more console gamers than PC gamers. Gartner estimates that in 2012, PC gaming made up only $14 billion of the $79 billion gaming market. This does include hardware, in which the consoles and handheld devices (likely) get an advantage, but 2012 was before the PS4 and Xbone were released.

    So while it might be off-the-shelf for this generation, it was never advertised as anything more than a substantial upgrade over the previous consoles, both of which were developed in the early 2000s. In fact, they were designed for 1080p gaming, and that's what they can accomplish (well, maybe not the Xbone if recent reports are correct). Given that 2160p TVs (because calling it 4K is dumb and misleading) are but a pipe dream for all but the most well-heeled of the world and that PCs can't even come close to the performance needed to drive such dense displays (short of spending $1,000+ on GPUs alone), there is no need to over-engineer the consoles to do something that won't be asked of them until they are near EOL.
  • Rebel1080 - Tuesday, February 18, 2014 - link

    PC gaming is growing faster globally than the console market because purchasing consoles in many nations is extremely cost-prohibitive due to crushing tariffs. Figure that in 3 years' time both Intel and AMD will have IGPs that will trounce the PS4 and will probably sell for under $99 USD. PC hardware is generally much more accessible to people living in places like Brazil, China and India compared to consoles. It would actually cost less to build a gaming PC if you live there.

    The console market is the USA, Japan and Western Europe; as the economies of these nations continue to decline (all 3 are still in recession), people who want to game without spending a ton will seek lower-cost alternatives. With low-wattage cards like the 750Ti, suddenly every Joe with a 5-year-old Dell/HP desktop can have console-level gaming for a fraction of the cost without touching any of his other hardware.
  • Rebel1080 - Tuesday, February 18, 2014 - link

    http://www.gamespot.com/articles/sony-says-brazil-...
  • oleguy682 - Wednesday, February 19, 2014 - link

    Brazil is only Brazil. It does not have any bearing on China or India or any other developing nation as they all choose their own path on how they tax and tariff imports. Second, throwing a 750Ti into a commodity desktop (the $800-1,200 variety) from 3 years ago, let alone 5, is unlikely to result in performance gains that would turn it into a full-bore 1080p machine that can run with the same level of eye-candy as a PS4 or XBone. The CPU and memory systems are going to be huge limiting factors.

    As far as the PC being a faster growing segment, the Gartner report from this fall thinks that PC gaming hardware and software will rise from the 2012 baseline of 18.3% of spending to 19.4% of spending in 2015. So yes, it will grow, but it's such a small share already that it barely does anything to move the needle in terms of where gaming goes. In contrast, consoles are expected to grow from 47.4% to 49.6% of spending. The losing sectors are going to be handheld gaming, eaten mostly by tablets and smartphones. PCs aren't dying, but they aren't thriving, regardless of what Brazil does with PS4 imports in 2014.
  • Mondozai - Wednesday, February 19, 2014 - link

    USA in recession? You are either ignorant or use your own home-cooked economics for "special" people like yourself.

    As for consoles: sure, you can get low-end cards for a PC cheaply, but people buy consoles for their games, their simplicity, and the fact that they are increasingly multimedia machines for a low cost.

    These factors will not change with these new cards.
  • Yojimbo - Wednesday, February 19, 2014 - link

    I thought I remembered reading a headline a while back saying that Sony or Microsoft, or both, were not planning on selling their hardware at a loss this time...
  • madmilk - Tuesday, February 18, 2014 - link

    The launch PS3 cost over $800 to manufacture, and Sony lost something like $3 billion in the first two years from hardware sales even though the PS3 wasn't even selling that well. To a lesser extent, Microsoft had the same problem with the Xbox 360. Of course Sony and Microsoft would go for cheaper, mid-range off-the-shelf components this time around. No one wants to make the same mistake twice.
  • Antronman - Tuesday, February 18, 2014 - link

    Wow. What do they think, that everybody here is an OC pro who has (or had) world records and a monster closed-loop browsing/gaming/work setup? I don't give a damn about lower power consumption if it means I have to OC the balls off the card!
  • moozoo - Tuesday, February 18, 2014 - link

    Please include at least one FP64 benchmark in the compute section.
    It is great that you found out and reported the FP64 ratio.
    It's a pity there isn't at least one low-power, low-profile card with good DP GFLOPS (at least enough to beat the CPU and form a compelling argument to switch APIs).
    At work we only get small-form-factor PCs, and asking for anything that looks different ends in politics.
  • Ryan Smith - Thursday, February 20, 2014 - link

    For the moment FP64 data is available via Bench. This being a mainstream consumer card, it's purposely not built for high FP64 performance; FP64 is there for compatibility purposes rather than being able to do much in the way of useful work.

    This is a purposeful market segmentation move that won't be going anywhere. So cards such as the 750 Ti will always be very slow at FP64.
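
    For anyone who wants to ballpark the gap, here's a minimal sketch (my own arithmetic, not Bench data), assuming the reference 1085MHz boost clock and the 1/32 ratio discussed above:

    # Theoretical peak throughput for the GTX 750 Ti.
    # Assumes reference boost clock and the 1/32 FP64 ratio;
    # sustained real-world numbers will come in lower.
    cuda_cores = 640
    boost_clock_ghz = 1.085
    fp64_ratio = 1 / 32

    fp32_gflops = cuda_cores * 2 * boost_clock_ghz  # 2 FLOPs/core/cycle (FMA)
    fp64_gflops = fp32_gflops * fp64_ratio

    print(f"Peak FP32: {fp32_gflops:.0f} GFLOPS")   # ~1389 GFLOPS
    print(f"Peak FP64: {fp64_gflops:.1f} GFLOPS")   # ~43 GFLOPS

    At roughly 43 GFLOPS peak, a modern quad-core CPU with AVX can reach the same ballpark or beyond, which is exactly why a card like this makes a poor FP64 compute argument.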
  • jrs77 - Tuesday, February 18, 2014 - link

    Now we need a manufacturer to release a GTX750 with single-slot cooler.
  • koolanceGamer - Tuesday, February 18, 2014 - link

    While all of this "low power" stuff is a little boring to me (not that anything is really pushing the high-end cards), I hope that in the not-too-distant future even video cards like the 780/Titan will be able to be powered by the PCIe slot alone.

    I would love to do a gaming build with a PCIe-based SSD and no cables coming off the video cards. It would be so clean!
  • EdgeOfDetroit - Tuesday, February 18, 2014 - link

    Well I want laser light circuit cables. So much faster than copper and they would look so clean, you wouldn't even know there was a cable there unless you put your hand into the laser beams to see the pretty lights...

    ... Ahh crap another BSOD, these laser cables suck!
  • Devo2007 - Wednesday, February 19, 2014 - link

    Starting to wonder what a good card to replace a GTX 560 Ti would be (that's still relatively affordable). Would I have to step up to something like the R9 270 or GTX 760 to make things worthwhile? The power savings of the GTX 750 Ti aren't really a big factor since I'm currently using a 650W PSU, but I also don't want to spend a ton of money.
  • Mondozai - Wednesday, February 19, 2014 - link

    Wait for the 800-series budget cards if you have the patience. Hopefully no more than 4-5 months if TSMC does very well on 20nm.
  • Jeffrey Bosboom - Wednesday, February 19, 2014 - link

    I understand the absolute hashrate on these cards will be low, but I'm interested to know how the focus on power consumption improves mining performance per watt. (Though I can't imagine these lowish-end cards would be used, even if efficient, due to the fixed cost of the motherboards to put them in.)
  • Antronman - Wednesday, February 19, 2014 - link

    Nvidia's best cards have tiny hash rates compared to 95% of every AMD GPU ever released.
  • JarredWalton - Wednesday, February 19, 2014 - link

    Apparently you're not up to speed on the latest developments. GTX 780 Ti as an example is now hitting about 700 KHash in scrypt, and word is the GTX 750 will be pretty competitive with 250-260 KHash at stock and much lower power consumption. Some people have actually put real effort into optimizing CUDAminer now, so while AMD still has an advantage, it's not nearly as large as it used to be. You could even make the argument that based on perf/watt in mining, some of NVIDIA's cards might even match AMD's top GPUs.
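
    To put rough numbers on the perf/watt point, here's a quick sketch using the hash rates quoted above and board TDPs as a stand-in for actual mining power draw (a simplification; real wall power will differ):

    # KHash per watt, using TDP as a proxy for mining power draw.
    cards = {
        "GTX 780 Ti": {"khash": 700, "tdp_w": 250},
        "GTX 750 Ti": {"khash": 255, "tdp_w": 60},  # midpoint of the 250-260 estimate
    }
    for name, c in cards.items():
        print(f"{name}: {c['khash'] / c['tdp_w']:.2f} KHash/W")
    # GTX 780 Ti: 2.80 KHash/W
    # GTX 750 Ti: 4.25 KHash/W

    On those (admittedly rough) numbers, the 750 Ti's efficiency is what makes it interesting for mining, even if the absolute hash rate is modest.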
  • darthrevan13 - Wednesday, February 19, 2014 - link

    Why did they choose to retire the 650 Ti Boost and replace it with the 750 Ti? The 650 Ti Boost is a much better card for high-end games because of its memory interface. They should have marketed the 750 Ti as the 750, and the 750 as the 740.

    And why on earth did they not include full support for HEVC and DX11.2? You're limiting the industry's adoption for years to come because of your move. I hope they fix this in the next-generation 800 cards, or when they transition to 20nm.
  • Ryan Smith - Thursday, February 20, 2014 - link

    Not speaking for NV here, but keep in mind that 650 Ti Boost is a cut-down GK106 chip. All things considered, 750 Ti will be significantly cheaper to produce for similar performance.

    NVIDIA really only needed it to counter Bonaire, and now that they have GM107 that's no longer the case.
  • FXi - Wednesday, February 19, 2014 - link

    No DX 11.2 or even 11.1 support? For THAT price??
    Pass...
  • rish95 - Wednesday, February 19, 2014 - link

    According to GeForce.com it supports 11.2. Not sure what's up with this:

    http://www.geforce.com/hardware/desktop-gpus/gefor...
  • willis936 - Wednesday, February 19, 2014 - link

    You don't need to be compliant to support something. Compliance means you meet all the required criteria; support means you can run it without necessarily having all the bells and whistles. If console hardware has DX compliance, the devs will take advantage of that, and when those games are ported you'll lose some of the neat graphics tricks. They might still be doable in software, you'll just need a bigger GPU to get the same frame rates :p Some things might not be possible in software, though. I don't know enough about DX to say.
  • sourav - Wednesday, February 19, 2014 - link

    Will it work in a PCIe 2.0 slot?
  • rish95 - Wednesday, February 19, 2014 - link

    Yes. You can run PCIe 3.0 cards on 2.0 slots.
  • rish95 - Wednesday, February 19, 2014 - link

    This card is quite literally jesus for me. I've been waiting for something like this for a few years now.

    Currently I'm running an Athlon II X4 with a GT 240 on an OEM 250W PSU. I know it sounds like that may be a bit much for the PSU, but it's been working fine for years.

    There haven't been any cards without external power connectors released since the GT 240 that have been significantly faster. I know I could have jumped to an HD 7750, but it's still not that much of an improvement. Now I can get a massive 3-4X performance boost without upgrading my PSU.

    I hope this was worth the wait. I've had a copy of Crysis 3 for some time that I couldn't use because the 240 doesn't support DX11.
  • cbrownx88 - Wednesday, February 19, 2014 - link

    Good for you man! This does sound like quite the fit! Hope your power supply keeps hangin in there!
  • Antronman - Wednesday, February 19, 2014 - link

    These cards will barely be able to run Crysis 3.
  • rish95 - Wednesday, February 19, 2014 - link

    Did you even read the review? It seems to manage 36 FPS at high settings at 1080p.

    I don't need Very High and I don't need 60 FPS. I just need it to look pretty good and run at a playable frame rate at native res.

    This does seem to fit the bill.
  • Qwertilot - Thursday, February 20, 2014 - link

    The only worry, in some ways, is that the 20nm version of this is inevitably going to be non-trivially better at the same sort of power draw. I guess it isn't at all certain they'll do a roughly equivalent one, though. Obviously not terribly soon; they might end up skipping to 16nm or something.
  • HighTech4US - Wednesday, February 19, 2014 - link

    Quote: NVIDIA is making a major effort to target GTX 550 Ti and GTS 450 owners as those cards turn 3-4 years old, with the GTX 750 series able to easily double their performance while reducing power consumption.

    And they have gotten me to upgrade my HTPC GPU from an EVGA GTS 450 to a brand-new EVGA 02G-P4-3753-KR GeForce GTX 750 Ti Superclocked 2GB, which I just purchased on Newegg for $160.38.

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    This card is factory overclocked (core clock: 1176MHz, boost clock: 1255MHz), has a DisplayPort connector, and sports a better copper-assisted heatsink and a fan shroud that exits some of the heat out the back bracket.

    I will be selling my old EVGA GTS 450 on eBay and should clear $60 so I will have a very nice card for the next 3-4 years for an upgrade price of around $100. Not bad at all.
  • pierrot - Wednesday, February 19, 2014 - link

    Awesome that this ITX-friendly size is the standard; just need the power to go with it now.
  • ninjaquick - Thursday, February 20, 2014 - link

    As neat as this is, it only proves that Maxwell scales within its TDP. It's consistent where it sits.
  • [email protected] - Friday, February 21, 2014 - link

    wow..............
  • jukkie - Friday, February 21, 2014 - link

    I see the GTX 750 Ti as a direct competitor to the HD 7770, so why was AMD's card left out of the list?
    Hmmm...
  • Novaguy - Saturday, February 22, 2014 - link

    I thought AMD's plan was to put the 7850/R7 265 up against the 750 Ti, not the 7770. The HD 7770 really isn't the direct competitor to the 750 Ti; it usually sells for around $110. I would guess that if there's anything the HD 7770 competes against, it would be the upcoming 750.
  • th3parasit3 - Friday, February 21, 2014 - link

    I'm still running a GTX 460 768MB with an E8500 at stock (built in 2010); mind you, my display is only 1680x1050. To me, Maxwell is a huge advancement -- not because of its ability to deliver great FPS at 1080p, but because of its power requirements, or lack thereof.

    AMD burned me on a faulty 5770, so I have much love for NVIDIA's driver support and performance boosts. Looks like after a four-year holding pattern, 2014-15 is the year I upgrade my GPU and rebuild. Sign me up for a 750 Ti and an 860/870.
  • Grandal - Saturday, February 22, 2014 - link

    These seem to be ready made Steam Box drop-ins to me. Will hit the thermal requirements at the perfect time to win the "reference" Steam Box GPU battle.
  • Novaguy - Saturday, February 22, 2014 - link

    Hmm, beyond using this to upgrade my OEM boxes from Radeon 7750s, I'd love to see this turned into a mid-range mobile card. A 750 Ti downclocked for mobile (maybe this is the 850M/860M) would be a nice upgrade over the 750M/755M, and possibly even the 760M/765M. It's already below the 75W TDP those 760M/765M MXM cards call for....
  • Novaguy - Saturday, March 1, 2014 - link

    Just broke down and bought a 750 Ti to upgrade from a 7750. Really nice; runs really cool. Definitely worth it for those of you who want to upgrade OEM boxes without dealing with the PSU, especially if you flip the 7750 at the usual places.
  • dr_sn0w - Wednesday, February 26, 2014 - link

    So, gurus, please tell me whether the GTX 750 Ti OC will support 4K resolution or not. Thanks.
  • av30 - Friday, March 7, 2014 - link

    I really would have liked to see how the vanilla 750 performed in the HTPC environment in relation to the GT 640. Any chance of updating that section of the review?
  • kamlesh - Wednesday, March 12, 2014 - link

    I'm really curious about Tegra K1 and its successor... Leaving K1 aside for a moment: the GTX 750 has 512 CUDA cores and draws 55W, while the GTX 750 Ti has 640 cores and draws 60W. The difference works out to roughly 0.039W per additional CUDA core (when clocked at 1GHz or above). That means if the next Tegra uses two Maxwell SMMs (256 cores), it might use only ~4W for the GPU (assuming 20nm and a ~600MHz clock) and at most ~5W for the entire SoC.
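
    Spelling that marginal-cost estimate out (my sketch of the arithmetic above; it ignores fixed costs like memory, VRM, and display logic, so it's a lower bound per core):

    # Marginal per-core power from the two GM107 TDPs.
    gtx750 = {"cores": 512, "tdp_w": 55}
    gtx750ti = {"cores": 640, "tdp_w": 60}

    w_per_core = (gtx750ti["tdp_w"] - gtx750["tdp_w"]) / (gtx750ti["cores"] - gtx750["cores"])
    print(f"~{w_per_core:.3f} W per core")          # ~0.039 W
    print(f"256 cores: ~{256 * w_per_core:.0f} W")  # ~10 W at desktop clocks

    # The 20nm shrink plus ~600MHz clocks is where the ~4-5W SoC
    # estimate comes from; that part is speculation, not measurement.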
  • Gadgety - Saturday, March 22, 2014 - link

    Yep, me too. Especially the K1 successor, even though the K1 itself is barely out. The GPU performance per watt is likely to yield amazing mobile graphics.
  • Gadgety - Saturday, March 22, 2014 - link

    Great review. Thank you. Maxwell looks promising for a small HTPC build capable of gaming. I'd like to see what a 100-120W version could do...
  • Asukichan - Wednesday, December 16, 2020 - link

    I got mine to 1410MHz on the core and 6.4GHz on the VRAM, on an ASUS GTX 750 Ti 2GB.
  • Asukichan - Wednesday, December 16, 2020 - link

    That's not even the max overclock; I can go even higher on it.
  • Asukichan - Wednesday, December 16, 2020 - link

    Update: I got it to its max speed of 1413MHz on the core and 6412MHz on the VRAM, with +31mV core voltage.
