44 Comments
virtuastro - Monday, April 10, 2017 - link
What percentage of Adobe RGB does it cover?
sotti - Monday, April 10, 2017 - link
DCI covers very nearly 100% of Adobe RGB; the greens are subtly different, but DCI's green is very saturated.
r3loaded - Monday, April 10, 2017 - link
Very close to my ideal monitor, now we just need 120Hz and FreeSync 2!
blahsaysblah - Monday, April 10, 2017 - link
You'll also need HDMI 2.1 (raw bandwidth for 4K@120), which is not available anywhere yet; it was only just announced. DisplayPort hasn't even announced its next spec that would allow 4K@120. By the time the cables/ports exist, hopefully the displays will support the Rec. 2020 color space that UHD Blu-ray movies use.
DigitalFreak - Monday, April 10, 2017 - link
DisplayPort 1.4 supports 4K@120Hz. Nvidia Pascal cards already support it, but monitors don't even support DP 1.3 yet.
powerincarnate - Monday, April 10, 2017 - link
But not DisplayPort 1.4, 120Hz, AND HDR with the expanded color gamut. You have to sacrifice something. The IPS gaming monitors that will come out with 4K and 120Hz will sacrifice the HDR/color; professional monitors will sacrifice the 120Hz in order to get the nice colors and HDR. Simple as that.
hubick - Monday, April 10, 2017 - link
Yeah, but I think you want HDR most for movies/television, which are normally at 24/30Hz, and then you can have 120Hz for gaming, where I don't think you need HDR as much.
fallaha56 - Friday, April 14, 2017 - link
Er, hubick, no. The whole point is that gaming is going HDR(!)
DanNeely - Monday, April 10, 2017 - link
Gaming monitors can just turn on the optional 2:1 visually lossless compression mode (like video compression, but at a much lower ratio so no artifacts are visible; not the chroma sub-sampling that TVs do, which makes text look like crap), or drop back to 90Hz to keep HDR uncompressed.
Meteor2 - Tuesday, April 11, 2017 - link
Now, 60 Hz over 30 I get, but 120 Hz over 60? Really? Anyone A/B tested that?
blahsaysblah - Monday, April 10, 2017 - link
It's not magic: DP 1.4 and DP 1.3 are physically the same; 1.4 just added stream compression and more feature support. DP 1.3 has enough raw bandwidth for around 4K@90 with 10 bits per color in a perfect world.
The next DP and HDMI specs will have enough bandwidth for "uncompressed" 4K@120Hz with 10-bit color/HDR.
Anything above that goes through mathematically lossy compression, 8-bit color, etc. Read the fine print.
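A rough sanity check on those numbers (a back-of-the-envelope sketch; it assumes roughly 8% CVT-R2-style blanking overhead and the usual post-8b/10b payload rates of 14.4 Gbit/s for HDMI 2.0 and 25.92 Gbit/s for DP 1.3/1.4, and ignores audio and other secondary-data overhead):

def video_rate_gbps(h, v, hz, bpc, blanking=1.08):
    # Uncompressed RGB / 4:4:4 data rate in Gbit/s (3 color components per pixel).
    return h * v * hz * bpc * 3 * blanking / 1e9

links = {"HDMI 2.0": 14.4, "DP 1.3/1.4": 25.92}  # Gbit/s usable after 8b/10b coding

for name, args in [("4K@120, 8-bit",  (3840, 2160, 120, 8)),
                   ("4K@120, 10-bit", (3840, 2160, 120, 10)),
                   ("4K@90, 10-bit",  (3840, 2160, 90, 10))]:
    need = video_rate_gbps(*args)
    fits = [link for link, cap in links.items() if need <= cap] or ["nothing, uncompressed"]
    print(f"{name}: ~{need:.1f} Gbit/s -> fits on {', '.join(fits)}")

4K@120 with 10-bit color works out to roughly 32 Gbit/s, which is why the only ways to carry it over DP 1.4 are DSC or chroma subsampling.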
fallaha56 - Friday, April 14, 2017 - link
Exactly! This monitor is a pointless waste of money with high-latency HDR and high minimum frame rates.
xchaotic - Monday, April 10, 2017 - link
and it's NOT curved, hooray!!!
Meteor2 - Tuesday, April 11, 2017 - link
Having recently had the pleasure of using a curved display regularly, I'm rather sold on them.
Dug - Monday, April 17, 2017 - link
I was thinking the opposite. I just tried a Samsung 32" curved and loved it (except the 1080p resolution).
euskalzabe - Monday, April 10, 2017 - link
If it doesn't support 1000 nits... it's pretty much useless. It won't give a nice, full HDR effect.
Sarchasm - Monday, April 10, 2017 - link
This is the most important point - HDR10 is mastered at 1000 nits, bare minimum (and often higher)... so activating HDR on this monitor will just make everything unusably dark.
Hard pass.
cheinonen - Tuesday, April 11, 2017 - link
This is incorrect. A display with a nits cap doesn't make everything darker unless the display is working wrong. It implements tone mapping that either clips some detail in the highlights or pushes down the brightness of content that's close to the cap. Since tone mapping has no standard, this will vary from display to display.
Also, the 50% point on 1000-nit HDR content comes in at around 100 nits. Almost all onscreen content is at this level or below; very little actually surpasses 350 nits. No, I wouldn't call this HDR either with a 350-nit cap, but it isn't going to suddenly provide a worse image either.
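For reference, HDR10's PQ curve (SMPTE ST 2084) maps signal level to absolute nits, so the "50% comes in around 100 nits" point can be checked directly; a quick sketch using the standard constants from the spec:

# SMPTE ST 2084 (PQ) EOTF: converts a normalized HDR10 signal value (0..1)
# into absolute luminance in nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal):
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for s in (0.25, 0.50, 0.75, 1.00):
    print(f"{s:.0%} signal -> {pq_to_nits(s):7.1f} nits")
# 50% of the signal range is only ~92 nits, and it takes ~75% to reach ~1000 nits,
# so a 350-nit cap only touches the top portion of the curve.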
Sarchasm - Tuesday, April 11, 2017 - link
If that's the case, then there's something seriously wrong with this monitor: https://www.youtube.com/watch?v=9JIoZ_u8y3E
Dug - Monday, April 17, 2017 - link
It needs a firmware update. I've seen that in some TVs before they were updated to match the source. For instance, even on Sony's TVs there were problems with the PS4 and HDR, but a Samsung Ultra HD Blu-ray player was fine.
Frenetic Pony - Monday, April 10, 2017 - link
Yep, this is PR "HDR!" rather than even any industry-standard HDR. What a pile of junk. Not to mention you'd want FreeSync 2 for HDR content.
Dug - Monday, April 17, 2017 - link
If you haven't compared, then you don't know. You don't need 1000 nits to give a nice, full HDR effect.
8steve8 - Monday, April 10, 2017 - link
Wish it had DP 1.4, but also, does it have optical audio out?
Can't imagine anyone would be happy just using the analog audio output at this price point... in the case where you might plug this in with a Chromecast Ultra for 4K video.
hubick - Monday, April 10, 2017 - link
I'd like an analog out for my external powered speakers (QSC K8's), but I doubt this thing offers a volume control which will vary the output levels :-(
Arbie - Monday, April 10, 2017 - link
In the interest of non-silliness, let's report a $999.99 price as $1000, or even as $1K. The very next sentence has a 50% or greater imprecision in the delivery date, and time is money, right?
powerincarnate - Monday, April 10, 2017 - link
While it is not one of the crappy TN panels various companies try to sell... in 2017, there is one misleading flaw, and that is the "HDR". It's a 350-nit panel; on LG's website they state that in spurts it can reach 500 nits, but even with that it doesn't qualify as a real HDR panel, which for LCD requires 1000 nits. 500 nits would be OK if it were an OLED screen where the black level is really low, but seeing that it is an LCD at 350 nits typical, it's hardly any better than my 2011 Dell monitor.
As for refresh rate: we all knew it wouldn't support 4K at 120Hz. There is no such thing at this time; there is a bandwidth constraint with current DisplayPort 1.2 and HDMI 2.0. At best I was hoping for DisplayPort 1.4 and 75Hz. Also keep in mind that true HDR adds bandwidth, so even with DisplayPort 1.4 you can't have it and 120Hz. You've got to wait for HDMI 2.1 for the necessary bandwidth for that.
The other option is to do what the 5K or 8K displays do, and that is connect via two DisplayPort cables.
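For what it's worth, that is exactly why the first 5K monitors shipped with two DP 1.2 inputs; a rough sketch of the arithmetic (same ~8% blanking assumption as the earlier example, DP 1.2 payload taken as 17.28 Gbit/s after 8b/10b coding):

# The panel is driven as two 2560x2880 tiles, one per DP 1.2 cable.
DP12_PAYLOAD = 17.28  # Gbit/s

def rate(h, v, hz=60, bpc=8, blanking=1.08):
    return h * v * hz * bpc * 3 * blanking / 1e9

print(rate(5120, 2880))  # ~22.9 Gbit/s: too much for a single DP 1.2 link
print(rate(2560, 2880))  # ~11.5 Gbit/s: each half-panel tile fits comfortably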
lordmocha - Tuesday, April 11, 2017 - link
Yep, I'm waiting for the AU Optronics 27" HDR 1000-nit 3840 x 2160 144Hz IPS due in July 2017 in the Asus ROG Swift PG27UQ and Acer Predator XB272-HDR.
Source: http://www.tftcentral.co.uk/articles/high_refresh_...
Sarchasm - Tuesday, April 11, 2017 - link
Keep in mind this will only be supporting HDR10 through NVIDIA's G-Sync implementation over DisplayPort. It will not be the solution you're looking for if you intend to use it for console-based HDR.
azulon1 - Friday, April 14, 2017 - link
Could you elaborate on this please, sir? What exactly do you mean, it doesn't support real HDR?
edgineer - Monday, April 10, 2017 - link
From the LG specs: USB Type-C™ ... 4K + Power Charging (~60W) + Data
YES! It's HERE! Does anyone else not understand that THIS is the dream we've all had? ONE cable into your laptop... Lag-free 4K, mouse, keyboard, thumb drives, AND CHARGING!
This is the one, folks. THIS IS IT!
Eden-K121D - Tuesday, April 11, 2017 - link
This is not HDR10. HDR10 requires max brightness above 1000 nits.
cheinonen - Tuesday, April 11, 2017 - link
That's completely false; there is no required HDR10 brightness level. If you want certification as an Ultra HD Premium display with an LCD screen, you need to have peak highlights above 1000 nits, but most companies don't pay for certification. There are lots of HDR10- and Dolby Vision-capable displays that can't hit 1000 nits. HDR10 only requires that you can read and use the metadata.
Sarchasm - Tuesday, April 11, 2017 - link
Are you sure about Dolby Vision? Everything I've read says the DV spec requires mastering at 1,000-2,000 nits, optimally 4,000, with the specification allowing up to a theoretical 10,000. It seems a waste of their certification if they allow HDTV makers to incorporate DV without being able to match the necessary brightness levels.
cheinonen - Tuesday, April 11, 2017 - link
That's mastering. The upcoming P-Series from TCL is Dolby Vision certified, and when we asked at CES, they said it can produce around 350 nits peak. Dolby said that as long as a display can provide a good experience and meet certain (unnamed) criteria it can be certified, but this is the lowest output we know of.
HDR10 has no such certification. All you have to do to be an HDR display is know how to read the metadata, even if you then throw it all away.
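For anyone wondering what "the metadata" actually amounts to: HDR10's static metadata is just the SMPTE ST 2086 mastering-display colour volume plus the MaxCLL/MaxFALL content light levels. A minimal sketch of what a sink receives; the field values below are illustrative for a typical 1000-nit, P3/D65 master, not taken from any particular title:

from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # SMPTE ST 2086 mastering-display colour volume (CIE xy + luminance range)
    red_xy: tuple
    green_xy: tuple
    blue_xy: tuple
    white_xy: tuple
    max_mastering_nits: float
    min_mastering_nits: float
    # CTA-861.3 content light level info
    max_cll: float   # brightest single pixel anywhere in the content
    max_fall: float  # brightest frame-average light level in the content

meta = HDR10StaticMetadata(
    red_xy=(0.680, 0.320), green_xy=(0.265, 0.690), blue_xy=(0.150, 0.060),
    white_xy=(0.3127, 0.3290),
    max_mastering_nits=1000.0, min_mastering_nits=0.005,
    max_cll=950.0, max_fall=400.0,  # illustrative values, not from a real disc
)
# A 350-nit display tone-maps (or simply ignores) anything above its own peak;
# nothing in the format obliges it to reach max_mastering_nits.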
tspacie - Wednesday, April 12, 2017 - link
Please let there be an upcoming review of this thing. A 4K HDR monitor/TV that works with a PC and a game console is my grail.
medi03 - Tuesday, April 11, 2017 - link
I wonder, how do they achieve a wider color gamut on TFTs?
PS
And with 55" 4k OLED TVs under 2000$ mark, shouldn't 1000$ OLED 32" OLED monitor be doable? (much smaller surface area)
DanNeely - Tuesday, April 11, 2017 - link
You use a backlight with a wider color spectrum, either directly or by using phosphors to broaden the emissions.
RamboCommando - Tuesday, April 11, 2017 - link
I was hoping to use this with the new Scorpio Xbox, but it is missing HDMI 2.1 for the newly announced FreeSync support on Scorpio, so I will have to pass on this one.
gfkBill - Wednesday, April 12, 2017 - link
"Consumer" DCI-P3 vs Cinema? DCI-P3 is DCI-P3, no? Whatchoo talkin' bout, Anton?DanNeely - Wednesday, April 12, 2017 - link
https://en.wikipedia.org/wiki/DCI-P3
It's not. DCI-P3 D65 has the same white point as existing computer displays; DCI-P3 Cinema uses the standard white point for existing cinema projection systems. IIRC the differences come down to what the optimum settings for film projection were the better part of a century ago versus the optimum settings for CRTs at the end of the last century. Moving forward, new standards for each kept the white points the same for backwards-compatibility reasons. I'm not sure where TVs generally fall in the mix.
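To put concrete numbers on that: the two DCI-P3 variants share identical primaries and differ only in white point (CIE 1931 xy coordinates, as published in the respective specs):

P3_PRIMARIES = {"red": (0.680, 0.320), "green": (0.265, 0.690), "blue": (0.150, 0.060)}

P3_WHITE_POINTS = {
    "DCI-P3 (cinema)":       (0.3140, 0.3510),  # ~6300 K, slightly greenish projector white
    "DCI-P3 D65 (displays)": (0.3127, 0.3290),  # D65, same as sRGB / Rec. 709 / Rec. 2020
}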
gfkBill - Thursday, April 13, 2017 - link
Thanks Dan - I had even checked there, but missed the white point info at the bottom. Good to know!
gfkBill - Thursday, April 13, 2017 - link
Oh, and to add something myself: TVs' Rec. 709 is also D65. That I did know :)
vicbee - Wednesday, April 12, 2017 - link
Could this be the last model with its innards behind the monitor instead of in a separate box (or the base)?
DanNeely - Wednesday, April 12, 2017 - link
I doubt we'll ever see that on a large scale. The innards are the low-level panel controller that takes an HDMI/DP signal and passes it to the panel interface, so you're not going to be able to swap in a new box with different connectors to upgrade an old monitor. Moving them farther away than somewhere behind the panel would increase costs (extra enclosure, more cabling, etc.) for no real benefit, and putting them in the base itself would make the screen unusable with wall-mount kits, monitor arms, or other aftermarket stands.