50 Comments
scmorange16 - Friday, October 28, 2016 - link
When are we going to see Gbps WiFi performance in smartphones?
Morawka - Friday, October 28, 2016 - link
or even notebooks or laptops for that matter.
boeush - Friday, October 28, 2016 - link
Or even broadband internet?
close - Saturday, October 29, 2016 - link
Or some sort of wireless tech that can reliably stream a movie from the router to the living room TV... Some AT readers were really hit hard by such issues.
Lerianis - Sunday, October 30, 2016 - link
close, if you have anything equal to or better than Wireless-N equipment, you should not be having a problem reliably streaming movies from the internet to your TV, unless your internet speeds before they hit your router are not up to par.
5Mbps minimum for 1080p content, 25Mbps minimum for 4K content.
CrimsonKnight - Tuesday, November 1, 2016 - link
I stream 4k + DolbyDigital from my FiOS quantum gateway to my TV just fine using the gateway's 802.11ac 5GHz band.
Lerianis - Sunday, October 30, 2016 - link
We already have broadband internet everywhere except in the uber-boonies in this country, or at least what the feds define as 'broadband internet'.
If you want 1Gbps speeds, you are going to have to harass Congress to stop allowing de-facto monopolies, mandate line-sharing, and get some true competition into the markets of the United States.
marvdmartian - Wednesday, November 2, 2016 - link
But that would require Congress to turn down the (no doubt) large "political contributions" they receive from those very same monopolistic internet service provider companies (cable, phone, satellite, etc.), contributions paid to ensure Congress doesn't meddle in their business, wouldn't it?
Might make it a bit more difficult for some of our elected officials to become millionaires while in office, darn it!
Oxford Guy - Tuesday, November 8, 2016 - link
Princeton and Northwestern researchers have proven that Congress completely ignores what 90% of the public wants. Don't assume they pay attention much to the lower part of that top 10% either.
Morawka - Sunday, October 30, 2016 - link
Whether broadband has Gbps bandwidth is irrelevant. There are still so many use cases that WiGig solves, wireless VR and video being the main attractions here. Latency is also a concern.
supdawgwtfd - Friday, October 28, 2016 - link
Point being? It's a phone. It can't make use of all the bandwidth.
lilmoe - Saturday, October 29, 2016 - link
It can benefit a lot. Screen mirroring, for starters, can use all the bandwidth it can get. I wouldn't mind a solution where I can use my smartphone to stream high-quality video from the back camera directly to my laptop. The possibilities are endless; it's not only about storage.
prisonerX - Saturday, October 29, 2016 - link
"wireless docking stations, wireless AR/VR head-mounted displays, wireless high-performance storage devices, wireless displays"Seriously, every single one of the uses mentioned in the article applies to phones, and the greater the bandwidth the faster transfers happen.
prisonerX - Saturday, October 29, 2016 - link
640K should be enough for anyone.
damianrobertjones - Saturday, October 29, 2016 - link
One day you might read the rest of that comment and realise what he meant.
BrokenCrayons - Monday, October 31, 2016 - link
Bill Gates has said multiple times that he's never said that line.
http://www.computerworld.com/article/2534312/opera...
mkozakewich - Monday, October 31, 2016 - link
I'll bet someone was like, "640K? Is that enough for me to do my quaint 80s work?" and he probably said something like, "Oh, yeah, 640K ought to be enough."
jtgmerk - Sunday, October 30, 2016 - link
They said that about 640K RAM as well, and people were stupid for installing 1MB RAM modules.
Mr_Bird_Man - Sunday, October 30, 2016 - link
Yeah, but with that extra 384K you could create a RAM drive which command.com and other key OS components could be copied over to upon bootup. You could also use some of it as a HD cache, so no, 1MB wasn't bad if you used it properly.
mkozakewich - Monday, October 31, 2016 - link
No, those were 1MB modules. The graphics adapter used the memory from 640K onwards, but there were tricks to get around some of that. So it was 640K available memory, even with 1MB modules.
CoreLogicCom - Friday, November 4, 2016 - link
HiMem.sys and emm386.exe are the two that Microsoft included in DOS. Then there was the really good Quarterdeck QEMM-386. You could cram large amounts of stuff into upper memory, freeing as much memory below 640KB as possible. Really made playing popular games at the time easier.
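For anyone who missed the DOS era, a minimal CONFIG.SYS of the sort being described might look like this (a sketch only; the paths and the SETVER example are assumptions, not a specific historical setup):

```
REM Load the XMS driver first, then EMM386 to map upper memory blocks (UMBs)
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
REM Put the DOS kernel in the HMA and make UMBs available
DOS=HIGH,UMB
REM DEVICEHIGH loads a driver into upper memory, freeing conventional RAM below 640KB
DEVICEHIGH=C:\DOS\SETVER.EXE
```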
jtgmerk - Sunday, October 30, 2016 - link
Presently I can't use all of the bandwidth. But if you build it, someone will make use of it. Bandwidth is like storage. Everyone claims you will never use it all, but it also seems that we never have enough.
Jumangi - Sunday, October 30, 2016 - link
Pointless for them to have it.
13Gigatons - Saturday, October 29, 2016 - link
So much hype. What's the real world speed?
prisonerX - Saturday, October 29, 2016 - link
Typically it's have the rated speed.
prisonerX - Saturday, October 29, 2016 - link
*half
BurntMyBacon - Monday, October 31, 2016 - link
It hasn't been half the rated speed since 802.11g. That was rated for 54Mbps, but only ever got an effective ~20-22Mbps. Sure you can set up some nice point to point, single direction, single device tests that make a nice benchmark for marketing, but in the real world, more than one device is accessing the wireless and often times, more than one wireless network is talking within the 20MHz side-lobes. If you are using 40MHz, 80MHz, or 160MHz channels to get the higher bandwidth offered by higher end routers, you raise that probability exponentially.
BoyBawang - Saturday, October 29, 2016 - link
So what's the effect of these 60GHz signals passing through our brains?
PixyMisa - Sunday, October 30, 2016 - link
Seems to be quite safe at the planned energy levels, because of the big limitation of 60GHz signals - they don't go through solid objects. Won't go through walls, and won't go through your skull either. Sitting in a room with a 60GHz access point should be only slightly more hazardous than moonlight. Much safer than going outside during the day.
name99 - Sunday, October 30, 2016 - link
60GHz = 0.00025 eV = 2.9K
Given that we're living in a sea of 300K photons, I think our fragile little bodies can handle being hit by a 3K photon...
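Those numbers follow directly from E = hf and T = E/k_B; a quick check in Python (constants rounded):

```python
# Energy of a single 60 GHz photon, in eV and as an equivalent temperature.
h = 6.626e-34    # Planck constant, J*s
k_B = 1.381e-23  # Boltzmann constant, J/K
eV = 1.602e-19   # joules per electron-volt

f = 60e9                   # 60 GHz
E = h * f                  # photon energy in joules
print(f"{E / eV:.1e} eV")  # ~2.5e-04 eV, the 0.00025 eV quoted above
print(f"{E / k_B:.1f} K")  # ~2.9 K, versus the ~300 K thermal photons around us
```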
mkozakewich - Monday, October 31, 2016 - link
A microwave oven throws out over 1000 watts of radiation, and it takes a minute or two to bring a cup of water up to boiling temperatures. If you strapped 500 routers together in some kind of horrible internet-inspired Hallowe'en mask, you'd notice a bunch of heat, which would be caused by the loss through all the other circuitry. I think each router is limited to about 1 watt, while they might dissipate ten times that in heat through the other components.
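The arithmetic roughly holds up; a back-of-envelope sketch (the cup size and the ~1 W per-router limit are assumptions carried over from the comment):

```python
# Energy to boil a cup of water vs. the RF output of 500 ~1 W transmitters.
cup_g = 250      # assumed cup of water, in grams
delta_T = 80.0   # 20 C to 100 C
c_water = 4.186  # specific heat of water, J/(g*K)

energy_J = cup_g * c_water * delta_T
print(f"Boil time at 1000 W: {energy_J / 1000:.0f} s")  # ~84 s: "a minute or two"

routers, tx_W = 500, 1.0  # assumed per-router transmit power limit
print(f"Mask RF output: {routers * tx_W:.0f} W")  # half a microwave oven's output
```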
Lerianis - Sunday, October 30, 2016 - link
Why the big leap from 5GHz to 60GHz? Would not 6-59GHz penetrate walls while allowing higher speeds?
A5 - Sunday, October 30, 2016 - link
60GHz is the next major block of unlicensed spectrum after 5GHz.
Meteor2 - Monday, October 31, 2016 - link
That's only an arbitrary human designation. I wonder what all that spectrum is being used for?
Guspaz - Monday, October 31, 2016 - link
https://upload.wikimedia.org/wikipedia/commons/thu...
ddarko - Sunday, October 30, 2016 - link
You shouldn't think of 802.11ad as a successor to 802.11g/n/ac. The article refers to 802.11ad as a "short range communication standard", and the chart puts a number on what that means: 10 meters or 32.8 feet. In other words, 802.11ad is a line-of-sight wireless standard. It really has no ability to penetrate walls. It's intended to replace wires where the devices are physically close and need lots of bandwidth, for example, the video cords on monitors or televisions. Goodbye DisplayPort/HDMI, hello 802.11ad. Because of its limitations, 802.11ad isn't going to show up in mobile devices.
Meteor2 - Monday, October 31, 2016 - link
Indeed, ax is the successor to ac, covered here on Anandtech earlier this week. There's also af, low-power/longer range (but slow) for IoT stuff.
Meteor2 - Monday, October 31, 2016 - link
But you most certainly will see ad in mobile devices, to allow mobile docking. Think wireless Continuum, for example.
The_Assimilator - Sunday, October 30, 2016 - link
8Gbps = 1GB/s = PCIe x16 3.0. Even if the quoted 8Gbps is full duplex and the actual rate is only 4Gbps both ways, that's still PCIe x8 3.0, which is still more than enough to allow every current graphics card in existence to perform at its best (difference between 3.0 x16 and x8 is ~1%, well within the margin of error). So the prospect of a "wireless" graphics card is there, depending on the latency.
renovich31 - Sunday, October 30, 2016 - link
Except that stating 8Gbps = 1GB/s = PCIe x16 3.0 is outrageously false; PCIe x16 3.0 is theoretically rated at 16GB/s.
eldakka - Sunday, October 30, 2016 - link
As renovich31 said, and you don't need to do the conversion from Gbps to GBps, as a SINGLE PCIe 3.0 lane (i.e. x1) == 8Gbps.
So 802.11ad == 8Gbps == PCIe 3.0 x1,
therefore an x16 is 128Gbps, 16x faster.
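For the record, the per-lane math behind this (a sketch using PCIe 3.0's 8 GT/s signaling rate and its 128b/130b line coding):

```python
# PCIe 3.0 effective throughput per lane and for an x16 slot.
gt_per_s = 8e9          # 8 GT/s raw signaling rate per lane
efficiency = 128 / 130  # 128b/130b line coding overhead

lane_bps = gt_per_s * efficiency
print(f"x1:  {lane_bps / 1e9:.2f} Gbps")       # ~7.88 Gbps, i.e. roughly 8 Gbps
print(f"x16: {lane_bps * 16 / 1e9:.1f} Gbps")  # ~126 Gbps (~15.8 GB/s)
```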
hahmed330 - Monday, October 31, 2016 - link
Terrible analogy, since 16 WiFi links cannot be combined to provide the magnified bandwidth simultaneously to just a single client. It doesn't work that way.
Guspaz - Monday, October 31, 2016 - link
They certainly can be, if both peers have 16 wireless radios operating on non-overlapping frequencies. Layer 2 bonding would let you do that if you really wanted to.
Wwhat - Sunday, October 30, 2016 - link
60GHz WiFi is only usable in the same room; it can't really go through walls and has a very limited range. It's more like fast Bluetooth than WiFi in that respect.
So unless you have that phone in the same room as a router, it's not that useful. So how much do people need it?
Meteor2 - Monday, October 31, 2016 - link
You've never docked a laptop?
Meteor2 - Monday, October 31, 2016 - link
To be fair though, it's an interesting point in the tech industry. You could wirelessly dock using WiGig and Rezence or higher-power Qi for charging. But you can also achieve the same with just a single reversible plug (USB-C and its alt modes). I suspect the cable will win, probably just because it's cheaper and simpler (no charging plate installation). But I do love the concept of just placing a device down somewhere and boom, it's docked and charging. With Hello, you just have to look at it to unlock it securely. Awesome.
Guspaz - Monday, October 31, 2016 - link
It's not really just about using it for one device though. My desktop has a ton of wires coming out of it, a bunch of which go to devices that already have their own power source that are in the same room. Off the top of my head, I could eliminate wires for my speakers, my projector, my monitor, my printer, and my GigE network.
Decluttering wires isn't really the point though, enabling use cases that are impossible now is. Virtual reality currently has to decide between two bad situations: either you have a bunch of wires hanging off your head, or you severely limit the quality of your experience by using onboard mobile-class hardware. Using WiGig to stream the imagery to the headset is the best of both worlds: you get the wire-free approach that makes setting up and using the headset so much better, and you get massively improved battery life because you're not doing any processing on the headset itself.
BoyBawang - Sunday, October 30, 2016 - link
This is the solution to the VR wire problem.
hahmed330 - Monday, October 31, 2016 - link
Not even close, since even 1ms latency would make it quite a jarring experience, and bandwidth is still too low.
Guspaz - Monday, October 31, 2016 - link
The "presence" threshold (the point below which your brain can't perceive the latency and accepts it as real) is commonly accepted to be 20ms. 1ms isn't going to make or break it alone.The 8Gbps of bandwidth is more than is the raw video bandwidth required for the Oculus Rift (which requires around 5.6 Gbps today), and the nature of VR (high detail in the middle, lower detail away from the centre) lends itself quite well to lossless compression. An optimized pipeline that takes eye tracking into account for transmission and not just foveated rendering could dramatically reduce the bandwidth requirements.