secretmanofagent - Wednesday, December 14, 2016 - link
No USB hub is a little disappointing.
chaos215bar2 - Wednesday, December 14, 2016 - link
Where does LG and/or AnandTech claim that? It would be pretty silly to make a display supporting charging and video over USB-C, but not provide any downstream USB ports to allow the display to act as a complete hub.
sorten - Wednesday, December 14, 2016 - link
In the specification chart. 1 x USB-C, 1 x HDMI
Devo2007 - Thursday, December 15, 2016 - link
That's inputs - nothing about outputs.
bji - Wednesday, December 14, 2016 - link
16:9 is disappointing. That's a TV aspect ratio. 16:10 is better for a computer monitor.
jsntech - Wednesday, December 14, 2016 - link
If I had a nickel for every time I've said that to myself for the past 5+ years, I'd have enough money to wallpaper my house with craptastic 16:9 displays.
Sarah Terra - Tuesday, December 20, 2016 - link
Nice, super good deal: now you can buy a $1,500 display to charge your MacBook Pro, LOL. Of course you'll need two for work, so only 3 grand, no biggie....
Before you whiners start complaining about my price estimate, I'd say the odds of this thing costing less than $1250 USD are 0.0000000001%.
sor - Wednesday, December 14, 2016 - link
I keep hearing that, but I don't think it really means as much these days. I care about screen real estate, pixels, DPI, not so much about aspect ratio; I'm not watching much 16:10 media.
Let's put it this way: I'd rather have a 3840x1600 34" screen than a 2560x1600 (16:10) 27" screen. Same vertical real estate, more width to work with.
close - Thursday, December 15, 2016 - link
I keep hearing that but I don't think it means what you think it means. Maybe you should know that "screen real estate" and aspect ratio kind of go hand in hand, especially since for the same diagonal you have less "real estate" the higher the aspect ratio. So a 16:9 screen has less surface area than a 16:10 one with the same diagonal. And with production costs being important and all that, manufacturers started to convince you that you *love* even 21:9 screens.
And your comparison is pointless, to say the least. You picked a ~25% larger diagonal AND a 50% higher resolution screen to prove that one AR is better than the other? I guess that's you unknowingly admitting that 16:9 is better than 16:10 only if you really pump up everything else.
Goats are like mushrooms: if you shoot a duck, I'm afraid of toasters...
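The surface-area point above is easy to verify: at a fixed diagonal, area drops as the aspect ratio stretches. A minimal Python sketch (27" is just an example diagonal):

    import math

    def panel_area(diagonal, ar_w, ar_h):
        # Split the diagonal along the aspect ratio, then multiply width by height.
        d = math.hypot(ar_w, ar_h)
        return (diagonal * ar_w / d) * (diagonal * ar_h / d)

    for ar in [(16, 10), (16, 9), (21, 9)]:
        print(ar, round(panel_area(27, *ar), 1), "sq in")
    # (16, 10) 327.6 sq in
    # (16, 9)  311.5 sq in  (~5% less at the same diagonal)
    # (21, 9)  263.9 sq in  (~19% less at the same diagonal)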
niva - Friday, December 16, 2016 - link
He's right though, just like you also have valid points. Vertical resolution is extremely important for productivity on the computer. Let's take a simpler example: take two 24" screens, one a 1920x1080 (FHD) resolution, the other a 1920x1200.
If you're going to use the computer for anything other than just watching videos, the screen with the higher vertical resolution is so much better. This applies across the board.
The ultrawide monitors being put out now, like the 34" LG he mentioned, do everything well so long as they have enough vertical resolution. They basically serve to replace two-monitor setups while eliminating the annoying border in the middle of the screen. Amazing for gaming and videos, even better for actual work. So 21:9 screens with enough vertical real estate serve an amazing function in all aspects.
You are also correct about the manufacturing costs and how the industry attempted to convince everyone that what's cheaper for them to produce is better for the consumers to own. Don't fall for it!
close - Sunday, December 18, 2016 - link
@niva, I'm not arguing whether or not others should use one specific format. Just that "real estate" means what it means and that meaning doesn't change based on our personal preference, so the remarks about diagonal and AR are very contradictory. Also, a comparison becomes more irrelevant the more parameters you change. So to keep it relevant you change one, like diagonal, resolution, refresh rate, etc. Of course he's going to pick the "bigger and better" screen, and he'll do it regardless of AR; he just doesn't know it yet.
But I have to agree with him in one respect: reading the post again I realize that it's only about what he personally likes, not about objectivity or relevance. Of course, I am in no position to contradict him on what he likes.
alistair.brogan - Wednesday, December 14, 2016 - link
That is only true with small screen sizes. I don't want to wreck my neck using a 16:10 32 inch display.
HollyDOL - Thursday, December 15, 2016 - link
My observations are the same... while a 24" 1920x1080 16:9 feels a bit short (tbh, narrow as well), I never noticed the same issues on a 27" 2560x1440 16:9 screen... using both screen sizes daily. So I guess you get past the breaking point somewhere in between those two screens...
Molor - Thursday, December 15, 2016 - link
16:9 isn't bad for a single monitor, but when I put 3 together for my office setup I really start to notice how short they are compared to 3 16:10.
Azethoth - Thursday, December 15, 2016 - link
LOL, so if I multiply an issue by 3 then it becomes obvious? You are talking about 16:27 vs 16:30, which is totally irrelevant here. The guy already said "I don't want to kill my neck" but here you are tripling the neck breaking.
I say this as someone with a 32" UHD who first tried a 36" UHD. The 36 was stupid on the neck. The 32 is just about perfect, not too large, not too small. 16:10 would suck: too much vertical head travel, because your eyeballs need head movement at that point.
close - Thursday, December 15, 2016 - link
I will start by stating the obvious: 16 is the long (horizontal) side, so when putting three 16:9 or 16:10 screens together you will end up with 48:9 vs 48:10... I haven't yet met the person who would pivot all 3 screens vertically, so 16:27 or 16:30 would mean stacking 3 screens on top of each other.
Higher ARs are chosen simply because at any diagonal they provide less screen area, and thus lower manufacturing costs. This is why manufacturers are trying to push the even more unholy 21:9 AR now, as if you're working on a cinema screen. When 16:9 became commonplace it wasn't because it's optimal for large diagonals.
And your 32" vs. 36" issue is more one of diagonal and sheer size rather than of AR. In other words there's probably no 36" screen that most people can use comfortably as a computer screen regardless of AR. Most computer desks were never designed to be used with a screen this size. And when you factor in the high resolution that that without scaling forces you to "get in there" you have a recipe for an uncomfortable experience. Just remember that even for older 17"-24" screens the recommended viewing distance is 20" to 40".
Morawka - Thursday, December 15, 2016 - link
I know, right... when's the last time you watched a movie on your computer monitor? Last time for me was when Netflix first went digital. Most films are 21:9 anyways.
programcsharp - Wednesday, December 14, 2016 - link
When are we going to see 4K @ 120Hz? Going to 4K is a big investment across 2 or 3 monitors; I want to make sure that lasts.
Huacanacha - Wednesday, December 14, 2016 - link
4K 120Hz OLED ~30" with smooth motion tech (BFI or other to mitigate/avoid the sample-and-hold effect) and I'd be prepared to make a sizeable investment. Until then it's best value for decent specs to hold me over. Btw, single cable support for 4K@120Hz would require a new DisplayPort (or, less likely, HDMI) standard with higher bitrate.
DanNeely - Wednesday, December 14, 2016 - link
4K@120 or 5K@60 single cable are supported by DP 1.3, which has been available since the new GPUs started coming out this summer. Fitting 30-bit HDR in too would require DP 1.4's compression or dropping back to only 100Hz.
OTOH DisplayPort is still using 8b/10b encoding instead of 128b/130b like in PCIe 3 or 128b/132b like in USB 3, which means it's still leaving almost 20% of its theoretical bandwidth on the table. I'm a bit surprised that the most recent DP standards haven't followed other ultra-high-bandwidth data links in adopting a more efficient encoding scheme. Doing so would've allowed lossless single cable 4K/120Hz/30-bit or 5K/70Hz/30-bit displays.
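Rough numbers for the encoding-overhead point, as a sketch (the CVT-R2 blanking totals below are approximations, not exact spec values):

    raw = 4 * 8.1e9                        # DP 1.3/1.4 HBR3: 4 lanes x 8.1 Gbps
    after_8b10b    = raw * 8 / 10          # 25.92 Gbps usable today
    after_128b132b = raw * 128 / 132       # ~31.42 Gbps with USB3-style coding

    # 3840x2160 @ 120 Hz, 30 bpp, assuming ~CVT-R2 totals of 3920x2222
    needed = 3920 * 2222 * 120 * 30 / 1e9  # ~31.4 Gbps

    print(needed <= after_8b10b)           # False: doesn't fit over 8b/10b
    print(needed <= after_128b132b)        # True, barely: the lossless 4K/120/30-bit case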
Huacanacha - Wednesday, December 14, 2016 - link
Thanks for the correction. Turns out I was thinking of 144Hz, which needs DP 1.4 compression for 4K 24-bit. It's been a while since I checked the details. It looks like HDR10 needs DP 1.4 (spec finalised this year but I haven't heard of monitor support yet... latest GPUs say they are 1.4 "ready"), so I guess this LG display must have 1.4 support... in which case the signalling could handle 120Hz 30-bit with DSC (compression).
r3loaded - Thursday, December 15, 2016 - link
4K, 120Hz, OLED, HDR10 and FreeSync would be the killer combination that would get me to run out and drop a grand on a monitor.
prime2515103 - Thursday, December 15, 2016 - link
I would be happy with 85Hz. My eyes and/or brain don't go any faster than that as far as I can tell.
p1esk - Thursday, December 15, 2016 - link
We saw one almost a year ago: http://www.techradar.com/reviews/pc-mac/monitors-a...
A better question: when will we buy one?
Huacanacha - Wednesday, December 14, 2016 - link
Xbox One S has HDR. It's one of the key selling points.
Devo2007 - Thursday, December 15, 2016 - link
That's what I thought as well.
nathanddrews - Thursday, December 15, 2016 - link
Xbone supports 10-bit 4:2:0 HDR playback of UHD Blu-ray, but none of the consoles render using wide color gamut, so the HDR of games is limited to Rec. 709.
nathanddrews - Thursday, December 15, 2016 - link
At least, they didn't a month ago...
usernametaken76 - Monday, December 19, 2016 - link
It also supports HDR in the Netflix app. So, essentially, any app coded to support HDR can work on the Xbox One S. This should also include games.
http://www.windowscentral.com/here-are-games-suppo...
Shadowmaster625 - Wednesday, December 14, 2016 - link
Search: $
No results.
Fail.
I probably didn't want to know anyway...
cheinonen - Wednesday, December 14, 2016 - link
LG didn't put a price in the details they sent out to everyone. Possibly at CES but not for sure.
A5 - Thursday, December 15, 2016 - link
Can you just email all your PR contacts and say "1440p HDR HDR HDR HDR" over and over? :P
Xajel - Thursday, December 15, 2016 - link
We're closer; now bring an ultrawide version of this (3440x1440 or, better, 5120x2160) with a 120~144Hz refresh rate and keep the HDR/DCI colors.
A 2160p ultrawide @120Hz with 30-bit color will require a little more than DP 1.3/1.4 bandwidth (32.4Gbit/s), so it will need DSC compression, which will require DP 1.4.
A 1440p ultrawide with similar 30-bit color @120Hz can be supported with DP 1.3/1.4 without DSC compression even at 144Hz; HDMI 2.0 will be able to handle this @100Hz and in theory also at 120Hz.
A 1600p ultrawide with HDR/30-bit can also be handled by either DP 1.3 or 1.4 without DSC compression; no HDMI can support this at 100Hz, and 96Hz will be at the edge of what HDMI 2.0 can handle in theory.
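Those figures can be sanity-checked with some quick math; a sketch assuming a flat ~6% blanking overhead (actual CVT-R2 overhead varies by mode):

    def gbps(h, v, hz, bpp, blanking=1.06):   # blanking factor is an assumption
        return h * v * hz * bpp * blanking / 1e9

    dp13_effective   = 32.4 * 8 / 10          # 25.92 Gbps after 8b/10b
    hdmi20_effective = 18.0 * 8 / 10          # 14.4 Gbps after 8b/10b

    print(gbps(3440, 1440, 144, 30))          # ~22.7 Gbps: fits DP 1.3 uncompressed
    print(gbps(5120, 2160, 120, 30))          # ~42.2 Gbps: needs DSC, hence DP 1.4
    print(gbps(3840, 1600, 100, 30))          # ~19.5 Gbps: DP yes, HDMI 2.0 no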
DesktopMan - Thursday, December 15, 2016 - link
"because Sony’s PlayStation 4 Pro and NVIDIA’s SHIELD ATV (the only HDR-capable game consoles available today"Regular PlayStation 4 and Xbox One S also support HDR, so this statement is quite incorrect.
Valantar - Thursday, December 15, 2016 - link
Now, have them make a 34" 21:9 100Hz FreeSync version with a USB hub built in. For ... $6-700? Pretty please? Also: seriously, that thing needs a DisplayPort or two. Come on.
edzieba - Thursday, December 15, 2016 - link
"Moreover, neither game console currently support DCI-P3 color space, which means that the display would need to support sRGB alongside HDR10."Or just do the exact same thing every 'SDR' monitor with a non-sRGB gamut does: take an sRGB input and smear it over the wider gamut resulting in oversaturation, because there was no provision for transmitting your colourspace when these connections were implementation.
edzieba - Thursday, December 15, 2016 - link
*Implemented. Damn autocorrect.
milkod2001 - Thursday, December 15, 2016 - link
What is up with HDR? Is it the real deal or just another marketing trick to make monitors more expensive? Slightly boosted colours and contrast from default? What about HDR content? How does non-HDR content look on an HDR monitor? Is it better, or no difference? Can somebody explain?
DanNeely - Thursday, December 15, 2016 - link
Sam Machkovech from Ars Technica considers it a killer app if you've got content available.
http://arstechnica.com/gadgets/2016/12/high-dynami...
nathanddrews - Thursday, December 15, 2016 - link
It's pretty damn amazing, to be honest. It's all about color volume - volume that is a combination of not only more colors, but much darker colors and brighter colors. This is a good little image to sum up the capabilities of SDR vs HDR:
http://www.ctoic.net/wp-content/uploads/2016/01/So...
halcyon - Thursday, December 15, 2016 - link
More like luminosity (or brightness) volume. Not more chroma, but more absolute brightness and more absolute contrast.
A5 - Thursday, December 15, 2016 - link
If you have HDR content available, it's the real deal. In TVs, it is the real reason to buy a 4K set from a PQ perspective.
Assimilator87 - Thursday, December 15, 2016 - link
This article is an absolute mess, not unlike the actual HDR standards themselves, the two main contenders being HDR10 Media Profile and Dolby Vision. After a couple hours of research, I'll admit I'm actually more confused. Mainly, the confusion appears to stem from people misinterpreting HDR (a single feature, which I'd best describe as a significantly expanded contrast ratio over current standards) as being synonymous with the specific HDR standards mentioned above, which encompass guidelines for HDR, color bit depth, and color gamut as a package. As others have commented already, the Xbox One S and PlayStation 4 also support HDR, specifically HDR10. The HDR10 format specifies a baseline of the Rec. 2020 (ITU-R Recommendation BT.2020) color space, 10 bits per sample color depth, and a few other criteria.
As a bit of background, the CIE 1931 color space represents the spectrum that the human visual system can physically see. Rec. 2020 encompasses 75.8% of that spectrum, versus 53.6% for DCI-P3, 52.1% for Adobe RGB, and 35.9% for Rec. 709/sRGB. Taking into consideration the vast range in quality of displays that all claim to support HDR, with even the top models only just hovering around 100% coverage of DCI-P3, it may initially be confusing how each display would correctly reproduce the source material. That's where a transfer function comes into play.
The transfer function is basically an algorithm stored in metadata that plots the range and intensity of color and luminance. The display manufacturers, knowing the capabilities of each tier of their products, incorporate what I would imagine is an inverse decoder of the TF to correctly map/compress the data. Both HDR10 and Dolby Vision use a TF called Perceptual Quantizer, which has been specified formally as SMPTE ST 2084. Factoring in all that mind-numbing, soul-crushing information, ratified by a bazillion different consortiums, it's not correct that "...the display would need to support sRGB..." for the aforementioned HDR-capable consoles. They all process in Rec. 2020 and the content would be mapped accordingly to each system and display.
Please correct or clarify me if any information I've presented is incorrect or misleading. This whole HDR ecosystem is incredibly exciting as a concept, but astronomically complex in terms of execution.
TL;DR: Article needs a couple corrections:
1. Xbox One S and Playstation 4 also have HDR.
2. Consoles' max output of DCI-P3 ≠ Display fallback to sRGB
Home theater sucks.
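The Perceptual Quantizer mentioned above is compact enough to write out. A minimal sketch of the ST 2084 EOTF, using the constants published in the spec:

    # SMPTE ST 2084 (PQ) constants
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375 (= c3 - c2 + 1)
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    def pq_eotf(code):
        # Map a normalized PQ code value in [0, 1] to absolute luminance (cd/m^2).
        p = code ** (1 / m2)
        return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    print(round(pq_eotf(1.0)))     # 10000 nits: the format's ceiling
    print(round(pq_eotf(0.5), 1))  # ~92.2 nits: half the code range stays near SDR levels

Unlike gamma curves, which are relative to a display's own peak, PQ codes map to absolute luminance, which is why the metadata-driven tone mapping described above is needed on less capable displays.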
nathanddrews - Thursday, December 15, 2016 - link
Don't forget about BT.2100. ;-)
BrokenCrayons - Thursday, December 15, 2016 - link
"The LG 32UD99 will be aimed at creative professionals, prosumers and gamers..."I can't see them aiming that screen at anyone else. The rest of us aren't as willing to be milked for cash because we like lounging on our money like dragons atop our hordes. Those in their target audience are either obtaining their hardware with money that's invisible to them because it's coming out of a company expense account or are lined up on the milking carousel willingly because 4K displays make them more l33t.
Holliday75 - Thursday, December 15, 2016 - link
I am going to the bank after work to withdraw all my cash so I can sleep on it tonight.
BrokenCrayons - Friday, December 16, 2016 - link
That's a slightly more literal interpretation than I was trying to convey, but why not? Physical currency is very sanitary stuff. :)
KoolAidMan1 - Thursday, December 22, 2016 - link
Good lord, that's tempting. No Thunderbolt 3 ports is the only thing that's missing; otherwise it seems great.
stun - Thursday, December 22, 2016 - link
All of you forgot to notice that there are NO graphics cards on the market with a USB-C port output! That means we can expect to hear about new cards from AMD and Nvidia with USB-C ports.
quickcorrect - Tuesday, January 3, 2017 - link
The article incorrectly states that the PS4 and Nvidia Shield are the only HDR-capable consoles currently available. The Xbox One S also supports HDR.
scaramoosh - Friday, January 6, 2017 - link
PPI?
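For what it's worth, pixel density follows directly from resolution and diagonal. Assuming the usual ~31.5" panel behind a "32-class" label (an assumption; LG's spec sheet would confirm):

    import math

    # 3840x2160 over an assumed 31.5" diagonal
    ppi = math.hypot(3840, 2160) / 31.5
    print(round(ppi, 1))   # ~139.9 PPI (137.7 if the panel is a full 32")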