7 Comments
PyroHoltz - Monday, February 24, 2014
While I'm always excited about increased theoretical bandwidth, I don't see a huge need for this now. But as always: create the tech and someone will figure out a use for it.

jeffkibuule - Monday, February 24, 2014
Increasing transmission speed also increases power efficiency as the wireless radio doesn't have to be in an active state for quite as long.

Bob Todd - Monday, February 24, 2014
While gigabit internet service is still rare in most markets, for those lucky enough to have it the increases are nice. Even if it is just to avoid concerns like "OMG my phone can only download at half the speed of my AC router!" But for the most part I agree. Until we get to the point where a phone form factor has enough power to replace laptops/desktops and you are grabbing a 6GB ISO while it is docked to your 4K display, the differences in real world usage will be relatively minimal. My biggest gripe with AC on phones is really the same issue as on all other devices: range. Pulling 220Mb/s on my Nexus 5 doesn't matter much when I can only do so from the study where the router is located.

DanNeely - Monday, February 24, 2014
Having 2 streams of data instead of only 1 should allow you to get double the download rate anywhere you've got a signal. If you're several rooms away your download rate will be degraded by the same proportion, but 20% of 866Mbps is still twice as fast as 20% of 433Mbps.
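A minimal back-of-the-envelope sketch of that proportional scaling, in Python (the 20% link-quality factor is just the illustrative number from the comment, not a measured value):

    # Doubling spatial streams doubles the peak PHY rate; a degraded link
    # scales both peaks by the same factor, so the ratio stays 2x.
    def effective_rate(peak_mbps: float, link_quality: float) -> float:
        return peak_mbps * link_quality

    one_stream = effective_rate(433.0, 0.20)   # ~87 Mbps
    two_streams = effective_rate(866.0, 0.20)  # ~173 Mbps
    print(two_streams / one_stream)            # 2.0 - still double at range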

DarkXale - Wednesday, February 26, 2014
Most people seem to forget that efficient use of a wireless network is just as important as peak throughput.

When we say that a wifi network can push 450Mbps, we mean that's the amount of data that can travel across that spectrum at its peak. However, most people forget or don't understand what this implies.
This doesn't mean 450Mbps to a single client - it's 450Mbps total for everyone on that channel: both the clients connected to your own AP and those connected to other APs on the same channel.
With the 450Mbps example above, four APs sharing the channel leaves only about 112Mbps each. All of a sudden, that's not a lot.
Note that your router and computer will -not- report the value after congestion; they report the value before it.
Then there's another important factor as well: client consumption. Let's use a really basic example: I have a phone that can only handle 65Mbps (e.g. an iPhone), but because it's a bit far away it's only capable of negotiating around 30Mbps.
However, the phone is only streaming data at 5Mbps. Now, a naive user would assume that means only 5Mbps of the 450Mbps total is being used, but it's actually much worse. That iPhone is using the equivalent of 75Mbps! If we're further out and can only negotiate 15Mbps, it's using a whopping 150Mbps of capacity (33% of the channel's capacity) to transfer 5Mbps of data!
The reason, somewhat simplified, is that transfer time is divided into time blocks. You can't have multiple stations sending at the same time, so in order to maintain a 5Mbps stream to a station that can only receive at 15Mbps, the AP has to dedicate 333ms of every second to the iPhone. That only leaves 667ms for the remaining stations, which is a lot of lost airtime.
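A quick sketch of that airtime arithmetic, in Python, using only the example numbers from this comment (a 450Mbps channel, a 5Mbps stream, and 15/30Mbps negotiated rates):

    # A client that consumes `demand` Mbps over a link negotiated at
    # `link_rate` Mbps occupies demand / link_rate of the channel's
    # airtime, i.e. that same fraction of the whole channel's capacity.
    def airtime_share(demand_mbps: float, link_rate_mbps: float) -> float:
        return demand_mbps / link_rate_mbps

    def equivalent_capacity(demand_mbps, link_rate_mbps, channel_mbps=450.0):
        return airtime_share(demand_mbps, link_rate_mbps) * channel_mbps

    print(equivalent_capacity(5, 30))   # 75.0 Mbps used on a 30 Mbps link
    print(equivalent_capacity(5, 15))   # 150.0 Mbps used on a 15 Mbps link
    print(airtime_share(5, 15) * 1000)  # ~333 ms of every second
    # Doubling the link rate with 2x2 MIMO (15 -> 30 Mbps) halves the
    # airtime the same 5 Mbps stream needs: ~333 ms -> ~167 ms.
    print(airtime_share(5, 30) * 1000)  # ~167 ms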
What 2x2 MIMO allows is essentially for the transfer rate in all situations to double. So the phone goes from using 333ms down to about 167ms, which leaves a fair chunk more time for other clients to receive data.
As you can see, the implications of this are rather important. It's not the fact that your own phone can receive 150Mbps of data that matters; it's that it requires significantly less airtime, and therefore causes significantly less congestion - that's the benefit.
It's the same with cellular networks. No, your own phone doesn't need 300Mbps of constant receive capability, but by having it you'll cause much less congestion for other clients on the same network. It means you can receive much more data in the time slot you are given, so when you can only receive 10Mbps due to congestion, doubling the theoretical maximum from, say, 150 to 300 means you can receive 20Mbps while consuming the same amount of resources.
toyotabedzrock - Monday, February 24, 2014
Is it just me, or does it seem silly for them to chain together the different wireless technologies using PCIe, USB and UART? Wouldn't it save power if they were unified?

DanNeely - Monday, February 24, 2014
I'm guessing that the wifi's peak performance exceeds what can be pushed over USB (presumably 2.0) or the UART, so they need a higher speed connection for it. From the other end, I think it's a combination of USB/UART operating at lower power levels - so when only BT is active they can gate off the high speed bus to save power - and of not needing to write new drivers for BT over PCIe/SDIO.