Who cares about the connector? How about HDMI 2.1 to drive 8K monitors at 60 Hz with full chroma? Samsung and the graphics card manufacturers just don't dance together in this respect. The latter have DisplayPort, which is capable of driving such monitors, but Samsung's 8K monitors have no DisplayPort connections at all, only HDMI.
That's because they're not monitors, they're consumer TVs. Samsung monitors have DP, but I agree it would make sense for Samsung to add a DP 2.0 input to high-end TVs.
I'm excited about this card, but concerned about the rumored $1400 MSRP for the 3090, and then, on top of that, what the final TDP will be for the card itself, let alone whatever added power will be required by whatever motherboard and CPU I choose to run it with.
I haven't gamed in a long time, so I don't know what the current status is, but the last time I checked, most of the quality games were being released for consoles first and for PCs several months later, if at all, and the games that were ported to PC weren't optimized and were buggy half the time on release.
Has this changed at all in the last year or so? If I actually can get the money to invest in this card and a new PC, is my money going to be well spent or am I going to be frustrated with a lot of poor console ports that make me feel as if my money was wasted?
For me, building a new pc and then having to worry that the TDP for playing a game a few hours a couple of days a week might skyrocket the power bill, dampens my enjoyment. I need to know that this is a worthwhile investment and not just me buying something "cool" for the sake of buying something "cool".
Nobody has the answers right now, I know, because the card hasn't been released yet, but I'm hoping for a TDP for the actual card that is reasonable and not totally insane, and on the flip side of that I am hoping for a performance increase from the card that justifies the over a grand price that it seems Nvidia will be asking for.
How many games are lined up that will actually support ray tracing and use it evenly throughout the game and not just sporadically in a level, here and there? I'm seriously asking, because I don't know.
I haven't heard people raving online about all these awesome ray traced games that are must buy.
I hear a lot about Fortnite, which I think will play on most PCs with just average graphics cards.
You "haven't gamed in a long time" but still have hopes for an ultra-enthusiast card, and you mix it up with Fortnite while worrying about the few cents it'll cost in power? I smell a troll.
This is an odd post, but to summarise: There are still a bunch of solid games developed PC-first. There aren't really any "must-buy" ray-traced games. It's still very much optional. High-end cards have become a stupid race to the limits of affordability.
12.5 A is for the dual-pin Micro-Fit; the maximum for the 12-pin is 9.5 A with 16-gauge wires (ca. 1.3 mm). Still, at 684 W that's a lot of headroom compared to the ATX-standard 300 W over two 8-pin connectors.
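That 684 W figure checks out; here's a quick sketch of the arithmetic (assuming a nominal 12 V rail and 6 power-carrying contacts, as described above):

```python
# Rough capacity check for the rumored 12-pin GPU connector.
# Assumptions from the thread: 6 of the 12 contacts carry +12 V
# (the rest are grounds), and the Micro-Fit connector is rated
# 9.5 A per contact with 16 AWG wire.
RAIL_VOLTAGE = 12.0      # volts, nominal ATX +12 V rail
AMPS_PER_CONTACT = 9.5   # Micro-Fit rating with 16 AWG wire
POWER_CONTACTS = 6       # +12 V pins in a 12-pin connector

connector_limit_w = RAIL_VOLTAGE * AMPS_PER_CONTACT * POWER_CONTACTS
atx_two_8pin_w = 2 * 150  # 150 W per 8-pin PCIe connector

print(connector_limit_w)                   # 684.0
print(connector_limit_w - atx_two_8pin_w)  # 384.0 W of headroom
```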
So what I gathered from this is that Nvidia has gone full Apple - by which I mean, they're claiming to have invented things they didn't (the connector), are focusing on the herculean efforts required by their engineers to make an absurdly impractical design work to distract from the fact that it's an absurdly impractical design (300W+ GPU), and are doing all of that in service of hyping up a product which will be overpriced and yet somehow still released to rave reviews and bought by legions of dedicated fans.
And to think I just bought a Seasonic PX-750 not that long ago. I think the 850w recommendation is a bit absurd but I guess it all depends on just how power hungry or inefficient these cards happen to be. I'm thinking I would be fine with a Seasonic sourced cable and that the power requirement would be less on anything 3080 or below. Still, it sucks to have to either use an adapter and the potential risk that comes with it or to buy a new PSU that includes it.
How hard would it be to route the power cables of every motherboard and component THROUGH the motherboard? Thus placing the pcie aux power connector next to the pcie slot, improving aesthetics and airflow?
79 Comments
PeachNCream - Wednesday, August 26, 2020 - link
Ugh, that's disgusting! Between the potential physical durability problems of positioning the connector that way and the idea that needing this many pins to deliver power to a graphics card is such an outstanding example of how much energy and effort we waste chasing ways to keep ourselves amused so we need not be bothered with thinking for ourselves, it's just another statement about the sad state of being human.
paulharmo - Wednesday, August 26, 2020 - link
Counterpoint: haha rtx graphics go brrrrrr
dgingeri - Wednesday, August 26, 2020 - link
What does "go brrrr" mean, anyway?
69369369 - Wednesday, August 26, 2020 - link
cooming and shidding itself
TheinsanegamerN - Thursday, August 27, 2020 - link
https://brrr.money/
nathanddrews - Thursday, August 27, 2020 - link
https://knowyourmeme.com/memes/money-printer-go-br...
So long as GPUs come with an adapter, I can't say that I care about this at all. Durability shouldn't be an issue any more than with any other pin on any given motherboard. To put it another way, if a GPU vendor can't properly attach a power connector to the PCB, you're gonna have other problems in need of RMA.
Alexvrb - Sunday, August 30, 2020 - link
He was referring to the perpendicular mounting of the new, even longer connector. Look at the picture, the way it's mounted. Now visualize someone sticking their hand into a chassis to plug said connector into their new GPU upgrade. Hope they don't apply too much force!
My main issue, though, is that it's an Nvidia creation; at this point it's proprietary. I prefer to avoid adapters when possible.
Kjella - Wednesday, August 26, 2020 - link
Please go back to wccftech. Reading the forums there is just painful.
ATC9001 - Wednesday, August 26, 2020 - link
Well, if you look at the card real estate two 8-pins take up, and the fact that cards are getting longer and longer, my guess is this is necessary so the PCB doesn't simply fail to fit in many cases. Also, if you look at two 8-pin connectors, there are 6 wires for ground and 6 for power... so there's no need for the extra 4 "sense" pins. If you hook a 12-pin card up with two 6-pins, though, you can potentially start fires (those only have effectively 2 positive wires each).
DanNeely - Wednesday, August 26, 2020 - link
Also, the new 12-pin is the length of a single old 8-pin connector; so even mounted flat, this will save more space than a naive 12-vs-16-pin comparison would imply.
imaheadcase - Wednesday, August 26, 2020 - link
Are you sure? From the picture here and on Seasonic's site, it looks like they are just normalizing the connector instead of doing the old standard of attaching two 6-pin ones together.
Which would make sense, because most power supplies now (at least modular ones) come with TWO 6-pin connectors in the box, which would make upgrading easier for people IF it's not included with the card... though I'm sure it will be.
DanNeely - Wednesday, August 26, 2020 - link
From Tom's, the 12 is within a hair of being the same length as the 8.
https://cdn.mos.cms.futurecdn.net/5YcitNp9eMSUC4VF...
Full article: https://www.tomshardware.com/news/seasonic-outs-nv...
DanNeely - Wednesday, August 26, 2020 - link
This is the image I was looking for:
https://hardforum.com/data/attachment-files/2020/0...
Lord of the Bored - Thursday, August 27, 2020 - link
"If you hook a 12 pin card up thought with 2 6 pins, you can start fires potentially"
I think you've underestimated the value of fuses. The power supply should cut power if excessive current draws are seen.
That's not a "nice design addition" thing, that's a "can't legally sell it without that feature" thing. Most regulatory agencies look very dimly upon devices that run enough power through underspec'ed cables and connectors to start fires.
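As a toy illustration of the over-current protection being described (the 13 A trip point is a made-up figure; real PSUs implement this in supervisor circuitry, not software):

```python
# Toy model of per-rail over-current protection (OCP).
# HYPOTHETICAL threshold: a real supervisor IC's trip point
# depends on the PSU's rail rating, not this made-up value.
OCP_TRIP_AMPS = 13.0

def rail_ok(current_draw_amps: float) -> bool:
    """Return False (i.e. shut the rail down) when the draw exceeds the trip point."""
    return current_draw_amps <= OCP_TRIP_AMPS

# A 6-pin's two effective +12 V wires can be pushed far past their
# rating by a 12-pin load; OCP is what turns a melted cable into
# a mere shutdown.
print(rail_ok(9.5))   # True  - within spec
print(rail_ok(25.0))  # False - trip, cut power
```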
mode_13h - Wednesday, August 26, 2020 - link
You might dial back the outrage a little bit, Peach. The point about mechanical stability is understandable, but higher-end graphics cards have long featured a 6-pin + 8-pin connector pair or even 2x 8-pin. I think I might've even seen a card with 3x 8-pin, but perhaps I'm imagining that.
Anyway, it sounds like it's equivalent to 2x 8-pin, which really isn't moving the bar. As ATC9001 points out, it's probably more about saving board real estate, which is in keeping with its unusual orientation.
intelati - Wednesday, August 26, 2020 - link
You're not imagining the 3x 8-pin card.
I distinctly remember a triple-width card with 3 connectors. I'm thinking it was a special halo GTX 590.
intelati - Wednesday, August 26, 2020 - link
ASUS ROG MARS II
Ian Cutress - Wednesday, August 26, 2020 - link
3x 8-pin cards do exist, although they're usually AIB super-powered-up versions, like Powercolor's Devil 13 and the like. Or overclockers who want to mod on a completely different power delivery to the GPU; those things often have 5-6x 8-pin connectors.
evilpaul666 - Wednesday, August 26, 2020 - link
"Well, that's going to break" was what I immediately thought when I saw the connector. If the photos floating around of the "3090" next to a 2080 Ti are real, then I'd like to know how big the thing would have been without that space savings.
nikaru - Monday, August 31, 2020 - link
On internet forums, one plugs and unplugs his GPU on a daily basis. In real life, you put it in the motherboard and don't touch it for years.
KompuKare - Wednesday, August 26, 2020 - link
Vertical sounds like asking for trouble, with all the physical stress coming when pushing in the cable.
There is a very good reason why most laptops have the power socket on a lead and not attached directly to the PCB.
Still, a quick re-flow should sort it if it breaks, and Nvidia do love re-flows, as shown by bumpgate back in 2009 or so.
xenol - Wednesday, August 26, 2020 - link
xenol - Wednesday, August 26, 2020 - link
It's a connector that's not going to see a lot of insertions over its lifetime. Even with a WAG of 100 insertions, you could take the card out once a month for cleaning and still be good for over 7 years. At which point, if history holds up, the card will be obsolete anyway.
frbeckenbauer - Wednesday, August 26, 2020 - link
The connector is probably mechanically supported by the cooler.
Holliday75 - Wednesday, August 26, 2020 - link
Playing devil's advocate here.
Maybe the vertical design in that picture is more of a visual aid to show the 12 contacts clearly, and not really Nvidia's engineering recommendation. I suspect someone from marketing or another team created that, and it's not an official recommendation.
catavalon21 - Wednesday, August 26, 2020 - link
The narrated video suggests this is the solution for FE cards; vertical, and at a swept angle rather than 90 degrees.
cjb110 - Wednesday, September 2, 2020 - link
If it's at an angle toward the 'front' of the card, that kinda makes sense cable-management-wise. Rather than the thick cable sticking straight out, it's angled toward the back, out of the way of airflow.
xenol - Wednesday, August 26, 2020 - link
So you'd rather take two 8-pin connectors where over half of the pins aren't used for power delivery than a 12-pin one where it's likely at least half of them are?
I can see the concern about the orientation and durability problems, but this isn't a USB connector. Most likely the end user isn't going to be reinserting the plug more than a handful of times. And if you're one of those people who will want to take the card out frequently for some reason, just unplug it from the 8+8-pin end.
Fataliity - Wednesday, August 26, 2020 - link
I actually think this connector is a good idea. The 6/8-pins are old and outdated. There are so many faulty splitters that cause people problems. Half the pins are useless.
So why not standardize a new connector?
shabby - Wednesday, August 26, 2020 - link
You sound offended...
Achaios - Wednesday, August 26, 2020 - link
? The only things I see are:
1. Progress.
2. Brilliant Engineering.
3. A new gen of GPUs.
...and I'm not even a NVIDIA fanboi, the only thing I am is anti-Intel.
euler007 - Wednesday, August 26, 2020 - link
If you're going to go the humanist route, you have more important things to worry about than video card connectors. And more important places to be.
damianrobertjones - Wednesday, August 26, 2020 - link
Totally true. All for an extra 10-15 fps over the previous gen; it's absolute madness. The second thing that's annoying is that 124 fps is pointless when a game dips below 60 fps. Make ALL games, where possible, run at at least a SOLID 60 fps.
nikaru - Monday, August 31, 2020 - link
Well, not everyone has pro equipment to run games at a solid 60 fps all the time. The vast majority of gamers run 3-6 year old mid-range hardware. Only a tiny number of enthusiasts actually buy RTX Titan levels of hardware. If your game cannot run properly on most hardware and consoles, you will not make many sales, and therefore, even if the game may be great, nobody will play it.
michael2k - Wednesday, August 26, 2020 - link
People playing games aren't going to be fighting, stealing, destroying, or running for office.
I think more people should be playing games.
DigitalFreak - Wednesday, August 26, 2020 - link
I know, right? That 12-pin connector is taking food away from starving children in Africa!
MrVibrato - Wednesday, August 26, 2020 - link
What about durability problems with positioning the connector that way?
Have you ever looked at the connector stacks at the back I/O area of motherboards and thought the same? How many of those connector stacks have you ever wrecked because they stand tall?
Purpose - Wednesday, August 26, 2020 - link
REALITY CHECK.
This connector is going to have a dongle to the back of the GPU. We've already seen pictures of it. So you're plugging into the dongle on the back, which is mechanically supported by the heatsink/shroud. This is also nothing new; NV did it with a bunch of the half-length-PCB, full-length-cooler cards in the past.
There's literally zero risk of breaking this connector unless you take the card apart to watercool it. And if you're doing that (I sure will be), you should already know what you're getting into.
catavalon21 - Wednesday, August 26, 2020 - link
I too am curious about the durability, or rather the mechanism of support. The diagram doesn't seem to show any support behind the connector. Without something, it seems there would be a lot more stress at the PCB interface.
EdgeOfDetroit - Wednesday, August 26, 2020 - link
I had an adapter catch fire before. Fortunately I was right there and powered the thing off, but things melted. I don't like this.
DigitalFreak - Wednesday, August 26, 2020 - link
If you weren't using some Chinese knockoff, it was most likely a fluke.
catavalon21 - Wednesday, August 26, 2020 - link
And you're sure authentic Seasonic-branded adapters aren't made in China?
EdgeOfDetroit - Wednesday, August 26, 2020 - link
It _was_ a Seasonic adapter, not a knockoff. I didn't look at where it was manufactured or assembled. Bought from Newegg. The power supply was undamaged; the damage was limited to what it was plugged into, the wire, and the adapter.
@DigitalFreak
Based on your comment, you're most likely an American, and like most Americans, you have a tendency to blame others for your mistakes and problems. But the next thing you buy will still be made in China.
Quantumz0d - Wednesday, August 26, 2020 - link
Hoping AIB cards do not push this non-standard BS. The Founders trash can do it and keep it to themselves.
Guspaz - Wednesday, August 26, 2020 - link
It's smaller and involves fewer wires going to the GPU, allowing for cleaner installations. I hope it gets adopted industry-wide. We'll probably start seeing them included with all PSUs soon.
limitedaccess - Wednesday, August 26, 2020 - link
There doesn't seem to be much will to actually iterate the very dated ATX standard as a whole. So I guess the best we can do is maybe chip away at it a little bit at a time.
TheinsanegamerN - Wednesday, August 26, 2020 - link
Largely due to inertia. The gamign market is rich but also niche; it doesn't have the huge multi-million sales every quarter to support an evolving standard the way the smartphone market does. A big change like that would require a lot of cooperation from all the PSU, mobo, and GPU AIBs to change the standard at the same time and smooth the operation out.
Even then, you'd still have lasting demand from previous systems that need new power supplies for whatever reason. And you'd have to get the motherboard manufacturers to agree to take on the additional complexity of including power circuitry on board, which could be an issue for mini-ITX and micro-ATX boards. Even full ATX boards are stuffed already.
damianrobertjones - Wednesday, August 26, 2020 - link
"The gamign market is rich but also niche" - Blimey! It's so 'niche' that I've never heard of this 'gamign' market.
DanNeely - Wednesday, August 26, 2020 - link
Agreed. The real question is whether NVidia plans to push this into ATX to get it added after the fact, or to pull proprietary BS and force AMD to come up with a different high-density power plug. The latter would result in PSU makers having to include separate NVidia and AMD power cables. A few years later, the EU would beat both of them over the head with an e-waste club and force them to agree on a shared standard that would probably obsolete both proprietary cables.
ikjadoon - Wednesday, August 26, 2020 - link
Technically, it's not proprietary to NVIDIA. It's actually a 12-pin Molex Micro-Fit 3.0 connector. It's readily available even on Amazon (not bulk pricing).
Molex Micro-Fit 3.0™ dual row (12 Circuits) Male & Female receptacle plug, w/Terminal sockets, (Pack of 1 Complete Set) https://www.amazon.com/dp/B0799GQR6G/ref=cm_sw_r_c...
But, agreed. This should be pushed to ATX. Perhaps all these changes can be bundled into the ATX12VO standard.
If more compact Micro-Fit 3.0 is available, let's bring it to the entire ATX plug set.
xenol - Wednesday, August 26, 2020 - link
Technically the 8+8-pin configuration isn't standard in the PCIe spec either. So AIBs have been pushing "non standard BS" for a while.
Kevin G - Wednesday, August 26, 2020 - link
Newer iterations of the spec permit up to 525 W, split across three 8-pin power connectors (150 W each) plus the PCIe slot (75 W).
MrVibrato - Wednesday, August 26, 2020 - link
I am sure you said the same when the ATX power connector was introduced, replacing the AT and LPX connectors. And I am also certain you said the same when the SATA power connector replaced Molex.
What, you didn't? Pfft, lousy amateur...
DoomSlayer - Wednesday, August 26, 2020 - link
First Nvidia increases their GPUs to ridiculous prices, then changes the power pin, which upset many people but they're willing to tolerate it. And now the vertical power pin mount, which is stupid.
There are many computer case designs it wouldn't fit; take ITX builds, for example.
I hope they drop the vertical pin and switch to a horizontal pin. Otherwise, it's one more reason for me to switch to AMD GPUs if Nvidia doesn't use its brains for once.
DigitalFreak - Wednesday, August 26, 2020 - link
How do you know this isn't just for the RTX 3090? You aren't going to fit an FE 3090 in most mini-ITX cases anyway.
DoomSlayer - Thursday, August 27, 2020 - link
I can fit a water-cooled RTX Titan in my ITX case...
Spunjji - Friday, August 28, 2020 - link
But that's not an FE 3090, which by all accounts appears to be huge.
It would probably be quite compact if you swapped the stock cooler for a water block, but at that point it would make more sense just to buy a water-cooled version up front instead of throwing $100 worth of 300 W TDP air cooling in the trash.
MrVibrato - Wednesday, August 26, 2020 - link
Do it. Switch over. Don't just yap about what you are going to do without actually doing it...
DoomSlayer - Friday, August 28, 2020 - link
Oh, I will switch if all the AIBs follow the same vertical design.
Spunjji - Thursday, August 27, 2020 - link
Dude, if the ridiculous pricing didn't persuade you to switch, then I'm highly doubting this will.
DoomSlayer - Friday, August 28, 2020 - link
If it won't fit in my case, I will switch.
It's the same logic: if it's too big for your house, you wouldn't buy it.
GNUminex_l_cowsay - Wednesday, August 26, 2020 - link
GNUminex_l_cowsay - Wednesday, August 26, 2020 - link
I have a box full of adapters from when graphics cards first started using 6-pin PCIe power, and from the transition to SATA power. The adapters came with the devices that required them for a time extending well beyond when PSUs without the 'new' connectors were commonplace. The changes in power connectors were not a big deal then, and they won't be now. The real problem I see is that, despite the prevalence of (semi/fully) modular power supplies that are fully capable of providing power to the new standard as-is, you can't just get a new cable for your PSU, because there is no way to know what the pin-out on the PSU side is without a multimeter.
UltraWide - Wednesday, August 26, 2020 - link
People are always afraid of change; the initial reaction always follows the Kübler-Ross model, with the following emotions: denial, anger, bargaining, depression, and acceptance. Someone has to innovate! At least Nvidia is trying!
DigitalFreak - Wednesday, August 26, 2020 - link
If some of these people were in charge of tech companies, we'd still be running single-core CPUs with GeForce 256-class GPUs!
MrVibrato - Wednesday, August 26, 2020 - link
Ha, you're an optimist. If these people were in charge, something like a GeForce 256 would still be future tech, and the industry would still hesitate to move from the XT platform to the AT...
jeremyshaw - Wednesday, August 26, 2020 - link
Seriously, people were whining and griping about HW T&L back during the GeForce 256 launch. "Muh CPU can do all of that in software rendering mode!" "No game uses this nowadays, why waste precious silicon on it?" Not exact quotes, but that sort of general malarkey. Fuddy-duddy FUDDs exist in all sorts of environments, sadly holding back ideas and progress beyond what they had (n-1) years ago, because "things were better back then."
MrVibrato - Wednesday, August 26, 2020 - link
You are wrong. People are not afraid of change. They are hysterically psychotic about it.
SanX - Wednesday, August 26, 2020 - link
Who cares about the connector? How about HDMI 2.1 to drive 8K monitors at 60 Hz with full gamma? Damn, Samsung and the graphics card manufacturers do not dance together in this respect. The latter have DisplayPort, which is capable of driving such monitors, but Samsung 8K monitors do not have DisplayPort connections at all, only HDMI.
Zoolook - Thursday, August 27, 2020 - link
That's because they are not monitors, they are consumer TVs. Samsung monitors have DP, but I agree it would make sense for Samsung to add one DP 2.0 input to high-end TVs.
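To put rough numbers on why HDMI 2.1 matters here, the uncompressed bandwidth for 8K at 60 Hz can be sketched as below. This is a payload-only estimate (it ignores blanking intervals and link-encoding overhead), and it assumes 10-bit RGB is what "full gamma" is getting at:

```python
# Rough uncompressed video bandwidth for 8K at 60 Hz (payload only;
# real links add blanking and encoding overhead on top of this).

width, height, refresh = 7680, 4320, 60
bits_per_pixel = 30  # assumed 10-bit RGB / 4:4:4

gbps = width * height * refresh * bits_per_pixel / 1e9
print(round(gbps, 1))  # ~59.7 Gbps of raw pixel data

# HDMI 2.0 tops out at 18 Gbps and HDMI 2.1 FRL at 48 Gbps,
# so 8K60 at 10-bit 4:4:4 relies on DSC compression even over 2.1.
```

Even as a lower bound, this is triple what HDMI 2.0 can carry, which is why only HDMI 2.1 (with DSC) or DP 2.0 inputs are workable for these panels.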
Zoolook - Thursday, August 27, 2020 - link
Darn edit function. You can get HDMI 2.0 functionality via a cheap passive adapter with video cards supporting DP++, however.
Rictorhell - Wednesday, August 26, 2020 - link
I'm excited about this card, but concerned about the rumored $1400 MSRP for the 3090, and then, on top of that, what the final TDP will be for the card itself, let alone whatever added power will be required for whatever motherboard and CPU I would choose to run it with. I haven't gamed in a long time so I don't know what the current status is, but last time I checked into it most of the quality games were being released for consoles first, and then PCs several months later, if at all, and the games that were ported to PC weren't optimized at all, and were buggy half of the time upon release.
Has this changed at all in the last year or so? If I actually can get the money to invest in this card and a new PC, is my money going to be well spent or am I going to be frustrated with a lot of poor console ports that make me feel as if my money was wasted?
For me, building a new pc and then having to worry that the TDP for playing a game a few hours a couple of days a week might skyrocket the power bill, dampens my enjoyment. I need to know that this is a worthwhile investment and not just me buying something "cool" for the sake of buying something "cool".
Nobody has the answers right now, I know, because the card hasn't been released yet, but I'm hoping for a TDP for the actual card that is reasonable and not totally insane, and on the flip side of that I am hoping for a performance increase from the card that justifies the over a grand price that it seems Nvidia will be asking for.
How many games are lined up that will actually support ray tracing and use it evenly throughout the game and not just sporadically in a level, here and there? I'm seriously asking, because I don't know.
I haven't heard people raving online about all these awesome ray traced games that are must buy.
I hear a lot about FortNite, which I think will play on most PCs with just average graphics cards.
Anyway.....
Kjella - Wednesday, August 26, 2020 - link
You "haven't gamed in a long time" but still have hopes for an ultra-enthusiast card, mix it up with Fortnite, and worry about the few cents it'll cost in power? I smell troll.
Spunjji - Thursday, August 27, 2020 - link
This is an odd post, but to summarise:
There are still a bunch of solid games developed PC-first.
There aren't really any "must-buy" ray-traced games. It's still very much optional.
High-end cards have become a stupid race to the limits of affordability.
blzd - Monday, August 31, 2020 - link
In the last year or so? That hasn't been the case for 10+ years. If you don't know why you would want a $1400 graphics card, chances are you don't need one lol.
edzieba - Thursday, August 27, 2020 - link
"NVIDIA states in the video that this 12-pin design is of its own creation"
It's Molex's Micro-Fit, almost certainly Micro-Fit+ (12.5A per pin, so potentially 900W per connector).
Zoolook - Thursday, August 27, 2020 - link
12.5A is for the dual-pin Micro-Fit; the maximum for a 12-pin is 9.5A with 16-gauge wires (ca. 1.3 mm). Still, at 684W that's a lot of headroom compared to the ATX standard's 300W over two 8-pins.
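The arithmetic behind both figures in this sub-thread can be sketched as a quick back-of-the-envelope calculation. Note these pin ratings are the Micro-Fit numbers quoted by the commenters above, not an official NVIDIA or ATX specification:

```python
# Back-of-the-envelope power budget for the 12-pin connector.
# Assumption: 6 of the 12 pins carry +12 V, the other 6 are ground returns.

POWER_PINS = 6          # half of a 12-pin connector
RAIL_VOLTS = 12.0       # standard PCIe aux power rail

def max_power(amps_per_pin: float) -> float:
    """Theoretical ceiling: power pins x amps per pin x rail voltage."""
    return POWER_PINS * amps_per_pin * RAIL_VOLTS

print(max_power(12.5))  # 900.0 W -- dual-pin Micro-Fit+ rating, best case
print(max_power(9.5))   # 684.0 W -- 12-circuit rating with 16 AWG wire
# Either way, far above the 300 W two ATX 8-pin connectors are rated for.
```

Even with the more conservative 9.5A figure, the connector's theoretical ceiling leaves plenty of headroom over what any current card draws.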
Spunjji - Thursday, August 27, 2020 - link
So what I gathered from this is that Nvidia has gone full Apple. By which I mean: they're claiming to have invented things they didn't (the connector); they're playing up the herculean efforts required by their engineers to make an absurdly impractical design work, to distract from the fact that it's an absurdly impractical design (a 300W+ GPU); and they're doing all of that in service of hyping up a product which will be overpriced and yet somehow still released to rave reviews and bought by legions of dedicated fans.
SkyBill40 - Thursday, August 27, 2020 - link
And to think I just bought a Seasonic PX-750 not that long ago. I think the 850W recommendation is a bit absurd, but I guess it all depends on just how power-hungry or inefficient these cards happen to be. I'm thinking I would be fine with a Seasonic-sourced cable, and that the power requirement would be lower on anything 3080 or below. Still, it sucks to have to either use an adapter, with the potential risk that comes with it, or buy a new PSU that includes the connector.
dj_aris - Wednesday, September 2, 2020 - link
How hard would it be to route the power cables of every motherboard and component THROUGH the motherboard, thus placing the PCIe aux power connector next to the PCIe slot and improving aesthetics and airflow?