16 Comments
zepi - Monday, December 12, 2016 - link
It seems odd that a display targeted at 4K video doesn't support the color space that is supposedly most appropriate for mastering that video at the moment.

This seems like the first DP1.4 (or 1.3) display I've seen. I suppose that means display controllers are starting to appear that could conceivably support higher bit rates than the ancient DP1.2 controllers! Interesting times ahead.
nathanddrews - Monday, December 12, 2016 - link
First - yes, very exciting to see DP1.4 in the wild!

Second - the colorspace support is more than sufficient to do the work needed. A native 10-bit panel with a 14-bit LUT is pretty common in the industry. When calibrated and configured for the specific needs of the project, monitors like this are perfect.
The difference between all those panels currently available and this one is the addition of native HDR support. It would be interesting to hear from an industry expert about how using an HDR monitor affects workflows. If only there were a tech website with a lot of clout that could interview someone in the business...
*hint*wink*nudge*
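For a rough sense of why the internal LUT runs at a higher bit depth than the 10-bit panel it feeds, here is a minimal Python sketch. The 2.2 gamma curve and the `through_lut` helper are illustrative assumptions, not the monitor's actual processing; the point is only that a correction curve quantised at the panel's own precision merges distinct input codes, while a 14-bit intermediate preserves them before the final step down to the panel.

```python
# Illustrative only: push all 10-bit input codes through a simple
# correction curve and count how many distinct outputs survive when the
# LUT is quantised to 10 bits versus 14 bits.
GAMMA = 2.2  # assumed example curve

def through_lut(code, in_bits=10, lut_bits=10):
    x = code / ((1 << in_bits) - 1)            # normalise the input code
    y = x ** (1.0 / GAMMA)                     # apply the correction curve
    return round(y * ((1 << lut_bits) - 1))    # quantise to LUT precision

codes = range(1 << 10)
print("distinct outputs, 10-bit LUT:", len({through_lut(c, lut_bits=10) for c in codes}))
print("distinct outputs, 14-bit LUT:", len({through_lut(c, lut_bits=14) for c in codes}))
```

The 14-bit case keeps all 1024 input codes distinct; the 10-bit case does not, which shows up on screen as banding after calibration.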
nathanddrews - Monday, December 12, 2016 - link
Addendum: monitors like this aren't "perfect" when compared against Sony's PVM OLED line or other SDI-equipped professional displays... but we're talking $1,500 vs $25,000...

nagi603 - Monday, December 12, 2016 - link
I wish there was a 27" 1440p model. (I already have a 27" + 34" ultrawide setup.)

sanf780 - Monday, December 12, 2016 - link
The following questions might be useful for consumers with deep pockets. May I assume you still need workstation GPUs for HDR10? Do you need specific software that uses OpenGL overlays (Adobe software)?
JoeyJoJo123 - Monday, December 12, 2016 - link
I think the key is that GPUs with HDMI 2.0b ports support HDR10 output, which should include RX400 series and GTX 1000 series GPUs.

sanf780 - Monday, December 12, 2016 - link
I know that the PS3 can deliver 12-bit Deep Colour, a feature introduced with HDMI 1.3. However, it never got used, as even Blu-rays were 8-bit. It was USD 600 well spent at the time, as it was the first source with that feature you could buy. Also, remember that even HDMI 1.2 can do 12-bit 4:2:2, but that is clearly geared to non-computer graphics.

So, getting back to my question: I know Adobe Photoshop can output 10 bits per channel, but it requires a Quadro or FirePro GPU, apart from the monitor. And it was only Photoshop and a few other relatively expensive applications. Has anything changed in the last five years? Which software and which OSes support 10-bit rendering? Do we even have a software video player available that can handle HDR10? Or is it only the consoles and a few streaming set-top boxes?
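For anyone curious what that 10-bit path looks like from the application side, here is a minimal Python sketch using the `glfw` and `PyOpenGL` packages; the window size, title, and hints are illustrative assumptions. It requests a 10-10-10-2 framebuffer and then asks the driver what it actually granted, which is historically where workstation and consumer drivers have differed.

```python
# Minimal sketch: request a 10-bit-per-channel framebuffer and report
# what the driver grants. Requires the PyPI packages "glfw" and "PyOpenGL".
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS

if not glfw.init():
    raise RuntimeError("GLFW failed to initialise")

# Ask for 10 bits each of red/green/blue plus 2 bits of alpha (10-10-10-2).
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(1280, 720, "10-bit framebuffer test", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("No suitable pixel format / OpenGL context")
glfw.make_context_current(window)

# GL_RED_BITS is a legacy query, but it is valid in the default
# compatibility context GLFW creates. Drivers that only expose 8-bit
# formats will report 8 here even though 10 was requested.
print("Red bits granted:", glGetIntegerv(GL_RED_BITS))

glfw.terminate()
```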
Daniel Egger - Monday, December 12, 2016 - link
31.5"... I wonder how many people have a suitable desk for that giving that the optimum viewing distance is around 1.5x the diagonal size...DanNeely - Monday, December 12, 2016 - link
It's only 2 inches wider (and half an inch shorter) than a 30" 16:10 display: 25.4 x 15.9" vs 27.5 x 15.4".

Now that DP1.4 is finally starting to show up, I'm looking forward to getting either a 4K@120Hz or 5K@60Hz display at that size to replace my venerable NEC 3090 sometime next year.
(I'd've preferred 32.5" for a closer height match with my other displays; but the panel industry appears to've skipped that size increment.)
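Those figures are easy to sanity-check, since width and height follow directly from the diagonal and the aspect ratio; the 1.5x-diagonal viewing distance from the previous comment falls out of the same numbers. A short Python sketch (the `panel_dimensions` helper is just for illustration):

```python
import math

def panel_dimensions(diagonal_in, aspect_w, aspect_h):
    """Width and height in inches for a given diagonal and aspect ratio."""
    unit = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

for name, diag, aw, ah in [('30" 16:10', 30.0, 16, 10), ('31.5" 16:9', 31.5, 16, 9)]:
    w, h = panel_dimensions(diag, aw, ah)
    # Prints roughly 25.4 x 15.9 in and 27.5 x 15.4 in respectively.
    print(f"{name}: {w:.1f} x {h:.1f} in, ~{1.5 * diag:.0f} in viewing distance")
```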
Lolimaster - Monday, December 12, 2016 - link
IPS, with its pathetic contrast and washed-out colors? Any pro monitor should be RGB-OLED, 3:2 aspect ratio, 10-bit, HDR.

lefenzy - Monday, December 12, 2016 - link
The only thing that seems out of date to me is the use of CCFL backlighting. I don't understand that decision. I look forward to more HDR 4K monitors coming out to match what's available with TVs these days.

DanNeely - Monday, December 12, 2016 - link
After doing a check of NEC/EIZO's current top of the line, it is a bit surprising. The absolute best CCFLs continued to outperform the absolute best LED backlights for a number of years after LED backlighting had swept the mainstream market; but the top-of-the-line LCD vendors appear to've switched their halo lines over to LED backlighting in their current generation.

UlfertG - Wednesday, December 14, 2016 - link
This may be an error in the article. The German press release refers to the display as "the LED expert". And I don't think that a power consumption of 50 W is possible with CCFL for a display this size.

Frenetic Pony - Monday, December 12, 2016 - link
HDR10*

*Does not include the required 750-nit brightness, only covers 87% of the P3 gamut, not in any way suitable for professionals, but hey, it's a nice buzzword, so get to it!
Chaotic42 - Monday, December 12, 2016 - link
Oooh, I may have found my next monitor. Let's hope the reviews are good.

Laststop311 - Monday, December 19, 2016 - link
Until they can solve the static burn-in problems of OLED (which make it very unsuitable for a computer display) and deliver that solution at a reasonable price point, I will continue to use VA monitors, which have the best static contrast ratio you can get in LCD screens at 3000+:1. CR is the most important aspect to me: the better it gets, the clearer everything looks, like a veil of fog has been lifted off your screen. I wish more attention were put into VA monitors, with more size and refresh rate choices available from more companies, but almost every monitor is either TN or IPS, both limited to 1000:1 CR. Bleh.
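As a quick illustration of what that contrast difference means in practice: at the same peak white, the black level scales inversely with the static contrast ratio. A minimal sketch, assuming a typical 350 cd/m² desktop brightness:

```python
PEAK_WHITE = 350.0  # cd/m^2 -- assumed typical desktop setting

for panel, contrast in [("TN/IPS ~1000:1", 1000), ("VA ~3000:1", 3000)]:
    black = PEAK_WHITE / contrast  # black level = peak white / contrast ratio
    print(f"{panel}: black level ~{black:.2f} cd/m^2")
```

At 1000:1 the blacks sit around 0.35 cd/m², versus roughly 0.12 cd/m² at 3000:1, which is the "veil of fog" difference described above.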