25 Comments
Ken_g6 - Wednesday, June 21, 2017
Triple 8-pin? Who needs more than double, which can provide 375W? Triple would be...525?!
ingwe - Wednesday, June 21, 2017
People who live in cold climates and want their GPUs to double as space heaters.
nathanddrews - Wednesday, June 21, 2017
Is there a heat/voltage advantage to spreading out the power delivery across three plugs? Maybe that's why?
mickulty - Wednesday, June 21, 2017
Doubt it, the real capability of an 8-pin (in terms of the Molex Mini-Fit Jr's physical specifications) is well over 200A. But it does force the user to run it with a good power supply, plus it means people who don't know much in-depth will add up the connectors and assume the VRM is 150W better than the EVGA Kingpin.
blahsaysblah - Wednesday, June 21, 2017
Source? When I was researching, 9A was common per pin, with I think 11A being available. An 8-pin has 3 hot pins, 3 ground, and 2 sense. 3 times 9 is 27 amps.
FLHerne - Wednesday, June 21, 2017
I'm guessing that was a typo intended as 200W rather than amps, which would be quite alarming through those little pins! 3 × 9A × 12V is 324W, greatly exceeding the 150W spec for GPU connectors.
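[To make the arithmetic in this sub-thread concrete, here's a minimal Python sketch of the numbers being debated. The 9A per-pin rating, the 150W per-connector limit, and the 75W slot contribution are taken from the comments in this thread, not verified against the Molex datasheet or the PCIe spec.]

```python
# Back-of-the-envelope numbers from this thread (figures as cited by the
# commenters, not verified against the Molex datasheet or PCIe spec).
HOT_PINS = 3             # an 8-pin PCIe connector carries 3 +12V pins
AMPS_PER_PIN = 9.0       # Mini-Fit Jr per-pin rating cited above
VOLTS = 12.0
SPEC_PER_8PIN_W = 150.0  # PCIe spec allowance per 8-pin connector
SLOT_POWER_W = 75.0      # power available from the PCIe slot itself

physical_limit_w = HOT_PINS * AMPS_PER_PIN * VOLTS     # 3 * 9 * 12 = 324 W
in_spec_budget_w = 3 * SPEC_PER_8PIN_W + SLOT_POWER_W  # 3 * 150 + 75 = 525 W

print(f"Physical limit of one 8-pin connector: {physical_limit_w:.0f} W")
print(f"In-spec budget, triple 8-pin plus slot: {in_spec_budget_w:.0f} W")
```

[The 324W result matches the figure in the comment above, and the 525W result matches the total in the thread's opening comment.]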
blahsaysblah - Wednesday, June 21, 2017
The 9A is for the Mini-Fit female pin itself, not accounting for the wire used or what the PCB traces can handle. http://www.molex.com/molex/products/datasheet.jsp?...
DanNeely - Wednesday, June 21, 2017
"You see, most blokes will be playing at 2. You’re on 2, all the way up, all the way up...Where can you go from there? Nowhere. What we do, is if we need that extra push over the cliff...Three. One faster."shabby - Wednesday, June 21, 2017 - link
Classic
vladx - Wednesday, June 21, 2017
Ha, that reminds me somehow of an old TheOnion article about Gillette.
Foeketijn - Saturday, June 24, 2017
Thanks. That made my day.
WinterCharm - Wednesday, June 21, 2017
Only Palpatine would need Unlimited Power!!!!!!!!
AllIDoIsWin - Wednesday, June 21, 2017
Probably heaps more money, but for a 1-4 fps difference?
peterfares - Wednesday, June 21, 2017
Not a fan of 2xHDMI and 2xDP; I think 3xDP is better.
DanNeely - Wednesday, June 21, 2017
IIRC, when more recent cards started going to 2 and 2, the marketing argument was that the 2nd HDMI was for a VR headset in addition to an HDMI monitor.
peterfares - Wednesday, June 21, 2017
Why not have 2xHDMI and 3xMiniDP then? They could easily be fit on there.
DanNeely - Wednesday, June 21, 2017
Maintaining legacy support for DVI. I wouldn't be surprised if that goes away in the next few years, since AMD and NVidia finally killed VGA last summer (vs an original commitment to do so in 2013), with DVI end-of-life scheduled for a few years afterwards (but with the original deadline already blown).
peterfares - Thursday, June 22, 2017
But they could easily fit the 2xHDMI and 3xmDP on the bottom row and keep the DVI port. Not all ports have to work simultaneously - I don't think the 5 current ports all work simultaneously; can't you only use 4 displays per card right now?
DanNeely - Thursday, June 22, 2017
OK, that'd potentially work. I'd assumed (apparently incorrectly) the card was capable of driving all 5 outputs at once; if that's not the case, adding a 6th port is probably feasible. I'm caveating myself because it's possible the GPU itself has 5 outputs, with a few that can be configured as either DP or HDMI, and adding a 6th port would require some sort of extra switch/multiplexer hardware (or on-card MST and all the limitations that would impose). OTOH, I don't think I've ever seen a mini port on a full-size desktop GPU; there seems to be some resistance to doing so, probably because it'd require the use of dongles (or new cables) to connect.
BurntMyBacon - Friday, June 23, 2017
@DanNeely: " I'd assumed (apparently incorrectly) the card was capable of driving all 5 outputs at once."Incorrect for this card, but the confusion is understandable. ATi can drive up to six outputs from a single card. nVidia can drive up to four.
@DanNeely: "I don't think I've ever seen a mini port on a full size desktop GPU; there seems to be some resistance to doing so. Probably because it'd require the use of dongles (or new cables) to connect."
I'm not aware of a consumer nVidia card that employs MiniDP, but they've been around on ATi cards since the 5000 series. Initially, they were mostly used for six-MiniDP Eyefinity edition cards, and they did require dongles. However, it became fairly commonplace to see two of them (in addition to other standard ports) in the 6000 and 7000 series. MiniDP cables were also much more available at that point. Several monitors (notably enthusiast and professional models from NEC) launched that make use of MiniDP inputs as well. There are still some consumer cards in the R9 200 series that make use of MiniDP, but it has become rare again. I haven't seen an RX 400 or later series card that uses them. Here are links to a few video cards I found in a 5-minute search that can use MiniDP:
AMD Radeon 5870 Eyefinity
http://www.tomshardware.com/reviews/radeon-5870-ey...
MSI Radeon 6970
https://www.newegg.com/Product/Product.aspx?Item=N...
VisionTek Radeon 7870
https://www.amazon.com/VisionTek-Radeon-GDDR5-Mini...
MSI Radeon 7870
https://www.newegg.com/Product/Product.aspx?Item=N...
Club3D Radeon R9 295X2
http://www.club-3d.com/index.php/products/reader.e...
I think they shelved MiniDP, not because of cables or dongles, but because (at least for a while) MiniDP was behind on supporting the latest DP standard. Not sure if they've caught up yet, but a quick search only shows DP 1.2 compliant MiniDP cables. Standard size DP is at DP 1.4.
vladx - Wednesday, June 21, 2017
Amazing, this might actually challenge my Titan Xp once overclocked.
LauRoman - Wednesday, June 21, 2017
F**k RGB
SkOrPn - Wednesday, June 21, 2017
Yay, yet another card impossible to purchase thanks to mining.
BrokenCrayons - Thursday, June 22, 2017
Eh, I'll wait the 8-10 years it takes for iGPUs to offer similar performance, or maybe the 6 years it takes for bottom-feeder dGPUs to do the same. I just can't see stuffing something that absurdly huge and power-hungry into a computer just so I can play an early-access garbage fire of a game that's been overhyped and pushed out the door without optimization, adequate bug stomping, or half the promised features. It just seems imprudent to spend a ton on purchase price, and more on power and air conditioning, only to have the privilege of spending release-day prices on games so that I can be a beta tester for EA's latest steamer.
BurntMyBacon - Friday, June 23, 2017
This philosophy works for single-player games and even some low-requirement online multiplayer games that are popular enough to still have a following after a decade (e.g. CounterStrike). Online multiplayer games with a yearly release cadence and a subsequent mass exodus to the new game in the series (e.g. Call of Duty) force players to upgrade to a system that can run it before it gets abandoned (or to deal with limited gameplay types and long matchmaking due to low player count).
I personally spend a bit more (but not up to the flagship products) on the video card and look to play most releases between one and two years out, when I can find a good discount. There are very few game series that I'm tempted to purchase at or near release. In the last decade, I've only purchased XCOM: Enemy Unknown and Deus Ex: Human Revolution at or near launch. I do miss out on some pretty nice titles for a little while, but I also miss out on a lot of disappointment.