nathanddrews - Friday, June 20, 2014 - link
4:2:0 color is good enough for Blu-ray most of the time, but banding becomes an issue more often than not. I could see this as a nice option for every 6-bit, TN-based 4K TV out there that already suffers from banding. Not ideal, but nice. Of course, it sounds like the manufacturer would have to put out a firmware update to make it work if it doesn't inherently support the option.

I'm not upgrading my GPU until DP1.3 hits in 2015.
http://www.brightsideofnews.com/2013/12/03/display...
crimsonson - Friday, June 20, 2014 - link
The primary cause of banding is color bit depth, or the lack of it. Specifically, the human eye can see banding at 8-bit color depth, while at 10 bits it is almost gone.

Chroma subsampling tends to create edge bleed and blocking because of the lack of color accuracy.
nathanddrews - Friday, June 20, 2014 - link
Yes, bit depth is the primary cause.

MrSpadge - Monday, June 23, 2014 - link
Yeah, with all those TN panels 4:2:0 won't matter.

Dave G - Tuesday, August 12, 2014 - link
As an example, professional HD-SDI connections use 10 bits with 4:2:2 chroma subsampling.

kwrzesien - Friday, June 20, 2014 - link
Just knowing this difference is enlightening, thanks!

MikeMurphy - Friday, June 20, 2014 - link
Great article. Too bad DisplayPort isn't the chosen successor to HDMI for all 4K devices. HDMI bitstream tech is far inferior, yet the industry seems to be stuck on it.

UltraWide - Friday, June 20, 2014 - link
All consumer electronics devices are entrenched in HDMI, so it's hard to make the switch to DP at this point. They just need to keep improving HDMI.

brightblack - Friday, June 20, 2014 - link
Or they could include DP ports along with HDMI.

surt - Friday, June 20, 2014 - link
OMG, are you kidding? That could add 50 or even 60 cents to the price of a device!

xdrol - Friday, June 20, 2014 - link
You mean to the cost of a device. To the price, it would add the same in dollars.

dragonsqrrl - Friday, June 20, 2014 - link
lol, yep pretty much.

willis936 - Saturday, June 21, 2014 - link
How is the DP PHY superior to HDMI's?

Darkstone - Sunday, June 22, 2014 - link
In every way except display-side hardware complexity.

HDMI requires licensing costs ($10k plus 4 cents per device IF you market the fact that your device supports HDMI and your device supports all anti-piracy measures), whereas DP is free.
HDMI supports only fixed refresh rates, whereas DP supports any refresh rate, even variable refresh rates.
Display-side buffering, used in some mobile phones, is not possible with HDMI's fixed refresh rate.
DP supports daisy-chaining, although it is often not implemented.
HDMI's standardization process is awful. According to Wikipedia, HDMI 1.0 supports 1200p, but on at least one Ivy Bridge laptop the maximum supported resolution is 1080p. HDMI 1.3 is also supposed to support 1440p, but only two Ivy Bridge laptops support that feature: the Alienware M17x with AMD GPU and the M18x. This situation has improved somewhat with Haswell/Kepler, but not much.
DP is physically smaller; it is possible to use mini-HDMI, but that is usually limited to 1080p and requires an adapter anyway.
HDMI provides no support for high bit depth in most implementations.
willis936 - Monday, June 23, 2014 - link
Very few of those points have anything to do with the PHY. Refresh rates and display buffers are closer to protocol than PHY. I just don't understand why someone would claim that the DP PHY is somehow superior to HDMI's, as if the people writing the spec didn't know what they were doing.

leliel - Friday, June 20, 2014 - link
As long as they give you a choice between low-hertz/high-chroma and high-hertz/low-chroma, this is a perfectly cromulent solution. Anyone buying a 4K setup at this juncture ought to be well aware of the limitations already, and the workaround is more than anyone had the right to expect.

thewhat - Friday, June 20, 2014 - link
Here's an (extreme) example of how 4:2:0 affects the image quality:
http://i.imgur.com/RY3YrFn.png
Blu-ray and most currently used "consumer" videos are stuck at 4:2:0 anyway. Hopefully this will change in the future, with the wider adoption of new video formats.
JlHADJOE - Sunday, June 22, 2014 - link
Well, that's rather disappointing.

Most TV/movie content is only 24-30 fps, so there's really no disadvantage with the current 30 Hz limit, and going to 4:2:0 pretty much compromises usability as a monitor, making 60 Hz rather pointless.
bernstein - Friday, June 20, 2014 - link
As per Samsung's spec sheet, the UE40HU6900 is HDMI 2.0 compliant, yet you are implying it is not! So do you know this for a fact? Thanks.

mczak - Friday, June 20, 2014 - link
That's the same crap as it always was with HDMI: you are not required to support the new high-bandwidth modes to claim HDMI 2.0 compliance (in fact, this even makes sense if it's not a UHD device). So these devices can claim HDMI 2.0 compliance yet miss the only truly interesting feature of it. Well, OK, not quite: they at least support the new HDMI 2.0 YCbCr 4:2:0 feature, thus making it possible to support the full resolution with a YCbCr 4:2:0 format.

surt - Friday, June 20, 2014 - link
Yawn. Wake me when I can run 4K 120p.

SirKnobsworth - Saturday, June 21, 2014 - link
DisplayPort 1.3 will allow this, and the spec is due out some time this year. Hopefully we'll start seeing it in actual devices next year.

haardrr - Sunday, June 22, 2014 - link
Funny, for 120 Hz 4K... 12 GHz... 3840x2160x12x120... (x * y resolution) * bit depth * frame rate = 12 GHz, because it is digital. So unless some magic modulation scheme is applied, it's not going to happen.

willis936 - Monday, June 23, 2014 - link
I'd like to point out that it's 24 bits per pixel, so the required bit rate is closer to 24 Gbps, and with typical two-state NRZ you need at least 12 GHz. You had the right number, but for the wrong reasons.
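Putting numbers on the exchange above, here is a quick back-of-the-envelope sketch (a simplification: blanking intervals and line-coding overhead are ignored, so real link budgets are somewhat higher):

```python
# Rough uncompressed video bandwidth: pixels/frame * bits/pixel * frames/s.
# Blanking and line-coding overhead are ignored, so real links need headroom.

def bandwidth_gbps(width, height, bits_per_pixel, refresh_hz):
    return width * height * bits_per_pixel * refresh_hz / 1e9

# 4:4:4 / RGB is 24 bpp; 4:2:0 averages 12 bpp (full luma, quarter chroma).
print(bandwidth_gbps(3840, 2160, 24, 120))  # ~23.9 Gbps -> ~12 GHz with 2-level NRZ
print(bandwidth_gbps(3840, 2160, 24, 60))   # ~11.9 Gbps: beyond HDMI 1.4's capacity
print(bandwidth_gbps(3840, 2160, 12, 60))   # ~6.0 Gbps: 4K60 at 4:2:0 fits HDMI 1.4
```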
sheh - Friday, June 20, 2014 - link

What about TVs finally supporting (officially) 1080p at 120 Hz? I still don't understand why almost all TVs with >60 Hz panels don't support it when there's no lack of bandwidth.

eio - Friday, June 20, 2014 - link
Great workaround... but I wonder, what would be the benefit of 60 Hz in video without 3D?

knightspawn1138 - Friday, June 20, 2014 - link
The benefit is that the TV doesn't have to invent those in-between frames to fill the screen during those extra refresh cycles. Right now, when a 120 Hz TV gets a signal at 30 or 60 Hz, it is supposed to just re-display the same image until the next new one comes in. In a 30-to-120 situation, the same image is displayed 4 times in a row until the next one comes to the TV, while a 60-to-120 situation displays the same image 2 times in a row. When the TV has extra processing (like "Auto Motion Plus," "MotionFlow," or "TruMotion") being done to the signal, what's happening is the TV is guessing what the next frame will look like, and filling the in-between refreshes with those best guesses instead of the same unaltered frame. So, if your computer can send more frames per second to the TV, the TV does less guessing (which means the image you see will have fewer mistakes), and that is where the real benefit is.

Death666Angel - Friday, June 20, 2014 - link
What about the only scenario where this doesn't involve massive picture quality loss, i.e. 24 fps video? You only have 24 frames per second of video there, so whether you display it at 30 Hz or 60 Hz (or higher), you don't get any benefit.

Mr Perfect - Friday, June 20, 2014 - link
If I'm understanding correctly, you're just going to have lower panel lag. The panel switching at 60 Hz will change the pixel state twice as fast as the one switching at 30 Hz (naturally), so the pixels will be at the correct color sooner. You can see the effect in 120 Hz monitor reviews, where the 120 Hz panels have all finished changing color before the new frame comes in, but the 60 Hz panels still have ghosts of the previous frame. See the following monitor review for examples:
http://www.tftcentral.co.uk/reviews/asus_vg278he.h...
SlyNine - Saturday, June 21, 2014 - link
Unless I'm mistaken, it doesn't affect the pixel response. The pixels attempt to change as fast as they can. What it changes is how many requests are made per second.

You could attempt to drive an IPS panel at 120 Hz, but since the pixels will be changing no faster than before (which was barely fast enough for 60 fps), you get an old image that persists across multiple frames.
That's my understanding.
Mr Perfect - Saturday, June 21, 2014 - link
Yes, you're right.

I was thinking he was comparing a 30 Hz panel (with a pixel response that is only fast enough to support 30 Hz) to a 60 Hz panel (with a pixel response fast enough to support 60 Hz).
cheinonen - Saturday, June 21, 2014 - link
That's incorrect. In a 60 Hz 4K display, the pixels are always going to refresh at 60 Hz. If it has a 30 Hz signal, then it refreshes each frame twice. If you have a 24 Hz film signal, then you have 3:2 pull-down applied and get judder. Panels that run at 120 Hz in TVs work the same way, except you can eliminate judder by repeating each frame 5 times.

The benefit here is that you can do 4K video content that is at 60 Hz (like the World Cup right now) natively. Since all consumer video content is already 4:2:0, you don't lose any resolution. As many displays convert everything back to 4:2:0 (even if they receive a 4:2:2, 4:4:4 or RGB signal) before sending it to the screen, you won't notice a difference in color detail either.
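For readers following the cadence math above, here is a small illustrative sketch of how the repeat counts fall out of the source and panel rates (a simplified model, not any TV's actual pulldown logic):

```python
# How many panel refreshes each source frame occupies on a fixed-rate panel.
# 24 fps on 60 Hz alternates 2 and 3 repeats (3:2 pulldown -> judder);
# 24 fps on 120 Hz repeats every frame exactly 5 times (even cadence).

def repeat_cadence(source_fps, panel_hz, frames=6):
    cadence, shown = [], 0
    for i in range(1, frames + 1):
        elapsed = int(i * panel_hz / source_fps)  # refreshes elapsed so far
        cadence.append(elapsed - shown)
        shown = elapsed
    return cadence

print(repeat_cadence(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven: judder
print(repeat_cadence(24, 120))  # [5, 5, 5, 5, 5, 5] -> even, no judder
print(repeat_cadence(30, 60))   # [2, 2, 2, 2, 2, 2] -> each frame shown twice
```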
Mr Perfect - Saturday, June 21, 2014 - link
Hmm, so would the 60 Hz have more or less judder on a 24 fps film than 30 Hz? I'd think the higher refresh would be more likely to be ready when the frame changes.

crimsonson - Sunday, June 22, 2014 - link
You have a fundamental misunderstanding of chroma subsampling and how it works.

If you subsample a video that has already been subsampled, you further degrade color accuracy and quality.
And I am unsure what you mean by "convert everything back to 4:2:0". That is not how chroma subsampling works.
crimsonson - Sunday, June 22, 2014 - link
BTW, you actually lose overall resolving power with chroma subsampling.

Sivar - Friday, June 20, 2014 - link
Interesting stuff, and a great article. Packed with useful info and an example in only 7 paragraphs. It's great to feel like I've learned something important and only spent 5 minutes doing it.

Why does 4:2:0 work so well for video but not on the desktop? Is it because the video quality is already compromised at 4:2:0, or does video inherently need less color information because of its tendency toward smooth gradations and fast motion, compared to the sharp contrasts in desktop content?
Ryan Smith - Friday, June 20, 2014 - link
For a better explanation I recommend the Wikipedia article on chroma subsampling, but basically, when it comes to images/video, the human eye is less sensitive to chroma information than it is to luma. Whether someone's skin is sampled 4:4:4 or 4:2:0 is unlikely to be noticed except in extreme circumstances.

However, as you correctly note, text and GUIs have sharp contrasts that natural images do not. This is further exacerbated by the fact that we use subpixel rendering techniques (e.g. ClearType) to improve the readability of text on these relatively low-DPI displays. Since 4:2:0 reduces contrast at the edges, it essentially blurs text and makes it harder to read.
Also of note: JPEG uses chroma subsampling too, which is part of the reason it's so hard on text, especially at higher compression ratios where 4:2:0 is used.
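To visualize why subsampling smears sharp edges, here is a minimal NumPy sketch (a simplification: real encoders subsample the chroma planes of YCbCr, not a raw channel, and use better filters than a box average):

```python
import numpy as np

# Simulate 4:2:0-style subsampling on a single channel: average each 2x2
# block, then replicate it back up to full resolution.
def subsample_420(channel):
    h, w = channel.shape
    blocks = channel.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)

on_grid = np.zeros((2, 8))
on_grid[:, 4:] = 255    # hard edge that happens to sit on a 2x2 boundary
off_grid = np.zeros((2, 8))
off_grid[:, 3:] = 255   # the same edge shifted off the boundary

print(subsample_420(on_grid)[0])   # [0 0 0 0 255 255 255 255] - still crisp
print(subsample_420(off_grid)[0])  # [0 0 127.5 127.5 255 ...] - edge smeared
```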
dmytty - Friday, June 20, 2014 - link
I was surprised that there were no active display adapters shown at Computex '14 for DP 1.2 to HDMI 2.0 with support for 4K 60 fps. Any input?

Ryan Smith - Saturday, June 21, 2014 - link
There's a multitude of potential reasons. At the end of the day, the HDMI 2.0 spec was finalized less than a year ago (you usually need a year to bring new silicon like that to market), and I'm not sure if one could do it with just the power provided by DP; you may need to use external USB power, like DL-DVI adapters (which would drive up the cost significantly).

dmytty - Saturday, June 21, 2014 - link
The HDMI 2.0 TX/RX silicon has been ready for some time. As far as power goes, Thunderbolt would have that solved... and think of all those Retina-loving Mac fans with Thunderbolt and Retina display needs.

dmytty - Friday, June 20, 2014 - link
I was surprised that there were no active display adapters shown at Computex '14 for DP 1.2 to HDMI 2.0 with support for 4K 60 fps. Did any 'journalists' look for these kinds of novel little things, or were they too busy filing the press releases for another 28" LCD monitor? Seriously, coverage from IT websites has really been a letdown (and that includes this site).

Regarding the active adapter, it seems that with a few hundred thousand TV sets and computer monitors being sold with HDMI 2.0, there would be a nice market for enabling PC connections to these monitors.
Methinks there might be some underhanded dealings in Taiwan and Korea whereby the big display manufacturers (and the HDMI cartel) are restricting the use of TVs as computer monitors... and limiting refresh rate has been the way to do it.
Of course, we can also thank AMD and Nvidia for selling $1k+ GPUs which are handicapped by a little HDMI transmitter chip.
To those who wonder, I'm typing this on LG's 2014 55" 4K entry (55UB8500 series). A desktop GTX 570 and the laptop GT 650M in an Asus NV56 both handle things well, via a DP 1.2 to HDMI 1.4 active dongle and straight HDMI output on the laptop (both at 30 Hz).
Sitting 40" from the screen, my eyes have never felt better. Yes, it's 30 Hz, but I'm doing zero gaming. Lots of CAD work and web research. CAD productivity has improved (I worked on a 30" monitor prior). Firefox, Autodesk Inventor 2014 and OneNote 2010 are great with the massive screen real estate, and the resulting <90 PPI (yet still 'Retina' at 40") means everything works pretty well via simple Windows 7 scaling at 150%.
Still gotta find me that adapter just to see the difference between 30 Hz and 60 Hz. To those who are hesitating, though: please don't. Sitting close to a 4K screen is awesome.
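For anyone wanting to sanity-check the PPI and 'Retina' figures in the comment above, the arithmetic is straightforward (using the common ~60 pixels-per-degree rule of thumb for 'Retina'; a rough sketch, not an exact vision model):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_value, distance_in):
    # Pixels subtended by one degree of visual angle at this distance.
    return ppi_value * distance_in * math.tan(math.radians(1))

p = ppi(3840, 2160, 55)            # ~80 PPI: low for a desktop monitor
ppd = pixels_per_degree(p, 40)     # viewed from 40 inches away
print(round(p, 1), round(ppd, 1))  # ~80.1 PPI, ~55.9 px/deg: near 'Retina'
```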
Pork@III - Saturday, June 21, 2014 - link
Dammit! HDMI 2.0 has already been available for a year. Nvidia is playing fucking games with consumers.

OrphanageExplosion - Saturday, June 21, 2014 - link
Neat solution. 4:2:0 is around half the bandwidth of 4:4:4 and RGB24, so a good match for HDMI 1.4. Not a good idea for the desktop, but in a living room, where you're situated some way off the screen, you won't notice the difference.

crimsonson - Sunday, June 22, 2014 - link
You won't notice the difference because human vision is comparative. So unless you see the original video, you will not be able to tell.

Second, subsampling does the worst damage to video that is already degraded. It exacerbates noise, blockiness and other quality symptoms.
Hisma1 - Saturday, June 21, 2014 - link
The article title is slightly misleading. Maxwell-based cards (the 750 Ti) also support this feature, not just Kepler. I have confirmed it working with my UHD TV connected to an HTPC with a 750 Ti on board.

Still testing, but so far so good. The display does appear "dimmer" and less vibrant though, which is a slight concern.
Nuno Simões - Saturday, June 21, 2014 - link
It's not implying that only Kepler will get this; it's informing that Kepler will get it.

godrilla - Saturday, June 21, 2014 - link
The best 4K TVs have about 3 times more input lag than Sony's Bravia Full HD TVs. Gaming on a 4K TV is still far, far away from perfect. Perfect would be an HDMI 2.0 TV (with the new codec) at 65 inches, to appreciate those extra pixels, and then there is the hardware to process all of that, which is still way too expensive. I'm predicting another few years post-Maxwell to be closer to an ideal experience.

Hawk269 - Saturday, June 21, 2014 - link
I have tested this as well, and it does work. On the desktop, the first issue I noticed is that my mouse pointer, which is a white arrow with a black outline, lost its outline. So on a white background it was hard to see where the mouse was. However, upon booting up a game, it looked fantastic. 4K at 60 fps was a real treat.

I currently own a Sony 4K set and a new Panasonic AX800 in my gaming room. The Panasonic has a DisplayPort, and I am using 2 GTX Titan Blacks to get 4K/60fps in games. Comparing the image in a game (Elder Scrolls Online), they look almost identical. But when going to the desktop, it is apparent that the one using HDMI is not of the same quality as the DisplayPort, which is sending a true 4:4:4 image to the screen.
I am going to test and compare them some more. My Panasonic suffers from banding issues, so I need to decide if I will exchange it or get a refund. I had last year's Panasonic 4K due to them being the only ones including DisplayPort, but I went through 2 sets with really bad banding issues, so I'm not sure I want to get another one now.
kalita - Sunday, July 13, 2014 - link
I have a Samsung F9000 UHD TV (with the updated connect box that supports HDMI 2.0) and an NVIDIA 750 Ti connected to it. Once I updated to the 343 beta driver, the 4K 60 Hz resolution appeared. When I switch to it, both the TV and the card report 60 Hz. Unfortunately, I can see that the display is still updated at 30 Hz.

I did some testing by writing a simple Direct3D program and figured out that only every other frame is displayed. Basically, the refresh rate is 60 Hz but only every other Present call is visible on the TV. I don't know whether it's the card or the TV dropping half the frames.
Has anyone actually tried enabling this 4:2:0 mode and experienced smooth 60 Hz in UHD?
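A frame-drop check like the one described above can also be done without Direct3D. Here is a rough pygame sketch of the same idea (assumes pygame 2.x, where set_mode accepts a vsync flag; vsync is not guaranteed on every platform). On a true 60 Hz chain, alternating black/white frames fuse into rapid flicker; if every other frame is dropped, the screen sits on one steady color:

```python
import pygame

# Flip between black and white every frame with vsync on. If the display
# really updates at 60 Hz you see fast gray-ish flicker; if only every
# other frame reaches the panel, you see a single solid color instead.
pygame.init()
screen = pygame.display.set_mode((1280, 720), pygame.SCALED, vsync=1)
clock = pygame.time.Clock()
colors = [(0, 0, 0), (255, 255, 255)]

frame, running = 0, True
while running and frame < 600:  # ~10 seconds at 60 Hz
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill(colors[frame % 2])
    pygame.display.flip()       # roughly analogous to D3D's Present
    clock.tick()                # no frame cap; vsync paces the loop
    frame += 1

print("GPU-side average fps:", round(clock.get_fps(), 1))
pygame.quit()
```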
Semore666 - Sunday, June 28, 2015 - link
Hey mate, I've got the same TV as you and a GTX 780 OC. I've connected it up with a few different HDMI cables, including one from eBay claiming to be 2.0, and I'm only getting 30 fps. I'm going crazy! What's the solution to get 60 fps, since there doesn't seem to be any DisplayPort to HDMI 2.0 adapter and the TV doesn't take DisplayPort? Thanks in advance. I've currently got driver 353.30 - surely this would support it a year later...

R1CHY_RICH - Wednesday, July 30, 2014 - link
Can anyone please help me with this question?

Currently, let's say the GTX 680 2GB can run Metro 2033 (4K, max settings) at only 12 fps (4:4:4). Now, with this driver update and the output colour reduced to 4:2:0, would you expect to have better FPS? If so, what could you realistically expect?
blah238 - Wednesday, July 30, 2014 - link
That would have absolutely no bearing on the framerate.

R1CHY_RICH - Wednesday, July 30, 2014 - link
Okay, thank you.

smygarn - Monday, September 15, 2014 - link
I've got some strange issues with my new UHD TV:
https://www.youtube.com/watch?v=9uAkjv3NJgs
I didn't think it was a 4:2:0 problem. It looked like a pixel-mapping issue.
Can someone verify this issue? Thanks.