31 Comments
Jhlot - Thursday, July 12, 2018 - link
Nice, but where are the inexpensive 5-8 port 10GbE switches to go with it?
CaedenV - Thursday, July 12, 2018 - link
I just need 2 ports, one for my desktop and one for my home server... it will make video editing and converting sooooo much better when I can just run everything natively from the server instead of needing to transfer it locally first!
kgardas - Thursday, July 12, 2018 - link
So why not connect your desktop directly to your server? Obviously you don't need any switch in your setup...
DanNeely - Thursday, July 12, 2018 - link
Because he has more than 2 computers on his network; it's just that the third and beyond have much less pressing need for faster connections.
lightningz71 - Thursday, July 12, 2018 - link
Then go cable to cable from that PC to the server, set up RRAS (Routing and Remote Access) on the server to forward requests as needed, and use cheaper gear to connect the rest of the machines to the server.
Gigaplex - Friday, July 13, 2018 - link
A 2-port switch won't give them the ability to connect to the other machines, though.
azazel1024 - Friday, July 13, 2018 - link
What Lightningz71 suggested would do that. Or you can just play with the packet routing/route metrics a little on the machines and do a direct connect, and each one can run a gig port to the switch that the other machines connect to (see the quick check sketched below).
But yes, ideally I'd like to see some inexpensive 8-24+2/4 switches that have 8-24 ports of GbE and 2-4 ports of 2.5, 5 or 10GbE to go along with them. Preferably with some SFP+ slots also, if I am going to have my cake and eat it too.
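If you do go the direct-connect route, this is the check I mean: a connected UDP socket sends nothing, so it just shows which local address (and hence which NIC) the OS picked for the server's direct-link IP. The address below is a hypothetical placeholder; substitute whatever you assigned to the 10G ports.

import socket

SERVER_DIRECT_IP = "10.10.10.2"   # assumed address of the server's end of the direct link

# A connected UDP socket transmits nothing; the OS just runs route selection,
# so getsockname() shows which local address (and hence which NIC) was chosen.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect((SERVER_DIRECT_IP, 9))
print("traffic to", SERVER_DIRECT_IP, "leaves from", s.getsockname()[0])
s.close()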
danielfranklin - Thursday, July 12, 2018 - link
It can be done quite cheaply now. Pick up Intel X520 SFP+ NICs off eBay for well under $100, and you can pick up 24-port switches with 4 10Gb ports for around $300 US, converted from Australian dollars, where gear is usually overpriced...
FS.com sells the SFP+ modules and cables for barely more than the price of a Cat6 cable from your local computer store...
Vatharian - Saturday, July 14, 2018 - link
I went this way, except I prefer Mellanox ConnectX-2 or -3 cards; a pair can be had for $25-$30, and yes, a 5m SFP+ 10G cable from fs.com cost me something like $13 with shipping. For $300, a pair of Mellanox ConnectX-4 cards with 40G or 100G can be had if such bandwidth is required; a QSFP+ cable is a little bit pricier ($20-30?), but that should suffice for everything.
Switches, though... well, if you have to ask the price, you can't afford it.
I do have both computers hooked to a regular network, and only video, NFS and iSCSI traffic goes through the 10G link - it's good enough.
You can also buy a cheap 1G managed switch (a used 24-port can often be had for below $50), grab a pair of 4x1G cards (here I suggest Intel cards), and configure teaming. It's not that much better, but still, 400 MB/s is better than 100 MB/s.
dgingeri - Monday, July 16, 2018 - link
Teaming isn't very useful, since each stream is limited to just one link. So max transfer rates would be limited to just 112 MB/s, unless both machines were running Windows 10 and/or Windows Server 2012 R2 or 2016.
NIC teaming is useful for servers where multiple workstations are accessing resources at once, or between switches where multiple streams are passing between them, but for a single point-to-point link it does not enhance performance, as it only allows a connection between 2 endpoints to use a single link. This is the case with both software (round robin) and hardware (LACP) methods.
However, with Windows 10 and Windows Server 2012 R2 and up, it does allow each file or session being transferred to go across a different link, so two 4-NIC teams could copy bunches of files much faster than a single link, but each individual stream is still limited to just one link, so a single file copy or a streamed video would still be limited to 112 MB/s.
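For anyone wondering where that ~112 MB/s ceiling comes from, the rough arithmetic for one gigabit link (assuming a standard 1500-byte MTU and plain IPv4/TCP) looks like this:

# 1 Gbit/s line rate, standard 1500-byte MTU, IPv4 + TCP headers without options
line_rate_bytes = 1_000_000_000 / 8   # 125,000,000 bytes/s of raw capacity
payload = 1500 - 20 - 20              # 1460 bytes of file data per frame
on_wire = 1500 + 38                   # frame + preamble, Ethernet header/FCS, inter-frame gap
goodput = line_rate_bytes * payload / on_wire
print(round(goodput / 1e6, 1), "MB/s")   # ~118.7 MB/s ideal; ~112 MB/s is typical in practice
# A 4-NIC team still pins each TCP stream to one link, so a single file copy
# sees this same ceiling; extra links only help when several streams run at once.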
a351must2 - Wednesday, July 18, 2018 - link
Screw teaming, look into SMB 3.0 multichannel... Windows 10 supports it by default. I was using 3 x 1Gb links to copy from Windows 10 to Server 2016 and regularly saw above 300 MB/s transfer speeds without configuring anything more than unique IP addresses on the same subnet for each connection. I've since switched to 10GbE with a Buffalo switch and eBay Intel cards and only bumped up to ~360 MB/s due to other limitations (it'll hit 10Gb speeds if I use RAM drives on both sides).
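A toy sketch of the idea in plain Python (this is not SMB itself, and the addresses are hypothetical placeholders): open one TCP stream per local address and push data over all of them at once. Real SMB Multichannel negotiates this automatically; with raw sockets the routing also has to cooperate so each source IP really leaves its own NIC.

import socket
import threading

LOCAL_IPS = ["192.168.1.11", "192.168.1.12", "192.168.1.13"]   # one (assumed) IP per NIC
RECEIVER = ("192.168.1.2", 5001)                               # assumed listener on the server
CHUNK = b"\0" * (1 << 20)                                      # 1 MiB per send

def push(local_ip, total_mib=1024):
    # source_address pins this stream to one local IP, and hence (ideally) one link
    with socket.create_connection(RECEIVER, source_address=(local_ip, 0)) as s:
        for _ in range(total_mib):
            s.sendall(CHUNK)

threads = [threading.Thread(target=push, args=(ip,)) for ip in LOCAL_IPS]
for t in threads:
    t.start()
for t in threads:
    t.join()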
piroroadkill - Friday, July 13, 2018 - link
The only one I know of that's relatively inexpensive (I have one) is the Netgear MS510TX. It has multi-gigabit ports, so it'll work with these cards at 10, 5 or 2.5Gbit/s depending on which port you use. It has two 10Gbit ports, but only one of those uses a standard 8P8C connector.
piroroadkill - Friday, July 13, 2018 - link
Oh, I should just add a caveat: I've had massively unstable operation with this switch + Aquantia AQN108s. In the end, I gave up, and went back to onboard gigabit, which works flawlessly. I have a feeling it is because of crappy drivers, but who knows.
lightningz71 - Thursday, July 12, 2018 - link
You want inexpensive? That's relative. The best I've seen is around $200 for a 10-port switch with 2 x 2.5/5/10GbE and 8 x 1GbE, unmanaged. Want management? That'll be an extra hundred. Want routing and advanced features? That'll be another hundred plus.
edzieba - Friday, July 13, 2018 - link
If price/port is more than this card's $90, then it would make sense to build your own Software Defined Router instead. They don't need to be large: https://www.youtube.com/watch?v=ylgV5TUdErU
colinstu - Thursday, July 12, 2018 - link
THIS. Please, a $200-340 8-port 10GbE switch. Not asking for much.
Preferably fanless too. And stable. lol
chaos215bar2 - Thursday, July 12, 2018 - link
Make sure it's multigigabit (i.e. supporting 802.3bz). You don't really want to end up with a lovely new <something> that tops out at 5GBASE-T, paired with a switch that only supports 1000BASE-T and 10GBASE-T.
This, of course, adds a lovely extra layer of complexity when trying to search for just the right 10G switch.
rahvin - Thursday, July 12, 2018 - link
Most of the newer switches designed in the last year or so are multi-gigabit. Buffalo in particular has a couple at almost a reasonable price, with an 8-port that's about $500. If they could get the price down another $100-200, they would fly off the shelves.
ewilliams28 - Wednesday, July 18, 2018 - link
A 10-port Netgear with 2 10Gig ports is $199.00. The more interesting one for me, though, has 4 1Gig ports, 2 2.5Gig, 2 5Gig and 2 10Gig for a little over $300 (the Netgear MS510TX). Not cheap, but coming down for sure.
ComputerGuy2006 - Saturday, July 21, 2018 - link
I don't like the idea of spending $200+ on a switch with only two 10Gbit ports, though. That means only one PC in the house can connect to one NAS at 10Gbit... At that point I'd rather just have a direct connection from my PC to the NAS and hope a cheap 10Gb switch will be released in the future.
tech6 - Thursday, July 12, 2018 - link
The bandwidth capacity of the NIC is rarely the limiting factor for online gaming. It is almost always the quality of your Internet connection or your graphics card.
But I guess there is money to be made selling gamers excess capacity they will never use. Now all they need is an RGB edition and this thing will be a success.
GeorgeH - Thursday, July 12, 2018 - link
In this context "gamer" primarily means "demanding consumer". You can't use words like "professional" or "workstation" without conjuring up $$$, and there really isn't a better alternative nomenclature.
Gigaplex - Friday, July 13, 2018 - link
"Enthusiast"
CheapSushi - Friday, July 13, 2018 - link
It's a few bucks more for a black PCB and router-like software. That's all. Aquantia makes the cheapest 5G and 10G NIC controllers out there now. They have the regular green PCB version too. There's nothing $$$$$ about it. I prefer black PCBs... shocking.
dgingeri - Monday, July 16, 2018 - link
It's not gaming that matters, or even video. It's file transfers, so people can keep their stuff on a NAS instead of on their gaming computer, removing hard drives from the power load and heat load the gaming system would have to contend with, while still allowing them to access the files fast enough for editing and/or installation.
I know this because that is how I've done it. I have Intel X520 NICs and a D-Link DGS-1510-28X switch, to set up 10G interconnects between my 2 training VM hosts, file server, and main workstation.
oRAirwolf - Friday, July 13, 2018 - link
I really, really wish AnandTech would do some real-world benchmarks of the Aquantia AQC107 versus an Intel i219-V, an Intel i350, an Intel X550, a Killer E2500 with their software, and a Realtek 8111. Throughput tests, latency tests, CPU and RAM usage tests, file copy tests, and in-game latency tests would actually give people a better idea of what is best suited for their needs. I am using an i219-V for my regular LAN and gaming and an AQC107 directly connected to an Intel X520 for 10GbE connectivity between my NAS and desktop. I have no idea if what I am using is the best or not, because nobody will actually compare them all.
CheapSushi - Friday, July 13, 2018 - link
Same! There's a lot of kind-of-sort-of-maybe hearsay on a lot of NICs. There hasn't really been a big benchmark roundup like they do for CPUs.
dgingeri - Monday, July 16, 2018 - link
Did you know that Cisco and many other switch vendors include in their software license agreements a clause saying users cannot publish performance comparisons? Their performance is so dependent on a variety of environmental circumstances that such comparisons can be unreliable. A comparison between network adapters can be troublesome too, and almost always needs to be done without a switch.
CheapSushi - Friday, July 13, 2018 - link
I love that they have a black PCB version now. I bought their 5G version when it was on sale a few months ago. I know most on here are cynical as hell and don't care. But even in my 6 whitebox builds, I have all black components. I don't mind the router-like software. I won't be using it, but it's not a bad value add-on. It's basically what Qualcomm offers with their Atheros / Killer NIC variant. Aquantia has been kicking butt lately on bringing more of these affordable chips to market. I'm seeing more HEDT boards offering it too. But again, just glad there's a black PCB option now. Options are part of what makes this hobby great.
lmcd - Friday, July 13, 2018 - link
The card itself is bandwidth-limited to 4GBps. I know Ethernet real-world throughput is lower than advertised, but isn't 4GBps still a bottleneck before a 10G Ethernet connection?
CheapSushi - Friday, July 13, 2018 - link
You're mixing Gbps and GB/s. The PCIe 3.0 x4 link's max throughput is 3.94 GB/s, while 10G Ethernet is 1.25 GB/s, so there's plenty of room left. An x2 link would have been 1.97 GB/s.
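The quick arithmetic, for anyone who wants to check (PCIe 3.0 uses 128b/130b encoding; protocol overhead shaves off a bit more in practice):

# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding
per_lane_bytes = 8e9 * 128 / 130 / 8                       # ~0.985 GB/s per lane
print(round(4 * per_lane_bytes / 1e9, 2), "GB/s for x4")   # 3.94
print(round(2 * per_lane_bytes / 1e9, 2), "GB/s for x2")   # 1.97
print(10e9 / 8 / 1e9, "GB/s line rate for 10GbE")          # 1.25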