25 Comments
madwolfa - Monday, January 6, 2014 - link
"72% color gamut", "true 10-bit color", uhm, what? And 72% of sRGB, Adobe RGB?extide - Monday, January 6, 2014 - link
I can't wait until *I* can game at 4K at home. I think one more vid card gen will make it optimal (although 290x does a GREAT job, esp in CF). Plus, we need some cheap-ish 4K monitors to become available.
madwolfa - Monday, January 6, 2014 - link
Well, I want NEC PA272W with G-Sync - not gonna happen though.
baii9 - Monday, January 6, 2014 - link
I think 72% NTSC is typical when it is worded like this, so expect ~100% sRGB.
DingoJunior - Friday, May 23, 2014 - link
I just got an email from Lenovo that this monitor is available. According to this document (http://shopap.lenovo.com/au/en/common/pdf/ThinkVis...) it's 72% Adobe RGB.
Ian Cutress - Monday, January 6, 2014 - link
It didn't specify, oddly enough.
patrickjchase - Monday, January 6, 2014 - link
Gamut percentages for displays are typically specified relative to the NTSC RGB gamut unless otherwise stated. NTSC is slightly larger than AdobeRGB - it shares AdobeRGB's green primary, with somewhat different red and blue primaries.
The sRGB gamut is roughly 72% of NTSC, so this monitor has an sRGB-sized gamut. It's impossible to tell if it actually *is* sRGB without knowing the primary chromaticities. Most "72%" monitors are fairly close to sRGB.
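A rough way to sanity-check that figure is to compare the areas of the triangles spanned by the primaries in CIE 1931 xy space. This is only a quick sketch (the chromaticities below are the standard sRGB and NTSC 1953 primaries), and the area ratio lands around 71%, in the same ballpark as the commonly quoted 72%:

```python
# Rough sketch: sRGB vs. NTSC 1953 gamut area in CIE 1931 xy space.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of the triangle spanned by three xy primaries."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # sRGB R, G, B chromaticities
ntsc = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]  # NTSC 1953 R, G, B chromaticities

ratio = triangle_area(*srgb) / triangle_area(*ntsc)
print(f"sRGB area / NTSC area = {ratio:.1%}")  # ~70.8%
```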
madwolfa - Monday, January 6, 2014 - link
Why do you need 10 bit color with sRGB gamut? Better gradients?
patrickjchase - Monday, January 6, 2014 - link
Contrary to popular belief, the need (or lack thereof) for higher bit depth doesn't change much between sRGB and AdobeRGB.
Gamut is a *volume* metric, while quantization is a function of step size along a single primary/axis. The ratio of step sizes between two gamuts is roughly the square root of the ratio of their volumes (sRGB/ARGB have equivalent black/white points such that the gamut can only grow in 2 dimensions, hence square root rather than cube root).
IMO the most compelling reason for high bit depth is for profiling - if you're implementing the profile on the host (i.e. in the video HW LUTs) and your monitor is only 8 bits, then that remapping may cause "double steps". For example you might end up with a pair of mappings such as 10->10 and 11->12 in your LUT, which may lead to a visible transition. 10bpp fixes that by allowing host-side LUTs to more precisely specify the desired output levels.
The other way around the same issue is what NEC and Eizo do: Put the LUTs in the monitor. I use a pair of NECs (3090W and 301W, both with SpectraView) and quantization is a non-issue at 8bpp when their LUTs are properly configured.
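To make the double-step effect concrete, here is a minimal sketch. It assumes, purely for illustration, that the host-side calibration curve is a small gamma tweak (2.2 to 2.4) applied to 8-bit input codes; the point is how the output quantization behaves, not the particular curve:

```python
# Minimal sketch: quantizing a host-side calibration LUT to 8-bit vs. 10-bit output.
# The 2.2 -> 2.4 gamma tweak is just an illustrative stand-in for a real profile curve.

def lut_step_range(out_bits):
    """Build a 256-entry LUT and return (min, max) step between consecutive outputs,
    expressed in 8-bit-equivalent units so the two cases are comparable."""
    out_max = (1 << out_bits) - 1
    lut = [round(((code / 255) ** (2.4 / 2.2)) * out_max) for code in range(256)]
    steps = [(b - a) * 255 / out_max for a, b in zip(lut, lut[1:])]
    return min(steps), max(steps)

print("8-bit output (min, max step):", lut_step_range(8))    # ~(0.0, 2.0): merged and doubled codes
print("10-bit output (min, max step):", lut_step_range(10))  # ~(0.5, 1.25): much more uniform
```

The 8-bit case produces steps of 0 (two inputs collapsing onto one output) and 2 (a skipped code, the 10->10, 11->12 situation above); with 10-bit output levels every step stays close to one 8-bit code.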
DanNeely - Monday, January 6, 2014 - link
On one hand, $800 is less than I paid for my 1600p panel. On the other I'm kinda meh about only sRGB and would just as soon skip the built-in MST kludge generation. On the gripping hand, part of me wants to wait a bit to see if the rumors of 5K panels have any substance to them so I can upgrade to a 2:1 scaling mode.
psuedonymous - Monday, January 6, 2014 - link
"On the other I'm kinda meh about only sRGB"Unless you're editing photos, then a wide-gamut is more of a liability than a bonus. You'll either end up with inaccurate colours for all your media, games etc, you'll be losing some of your bit-depth, or the monitor will have a half-decent sRGB setting (that doesn't sacrifice bit depth) and you'll have paid extra for nothing. Same with 10bit: photo-editing applications are 10bit aware, pretty much everything else you're likely to encounter is not.
ThanatosOmega - Monday, January 6, 2014 - link
It seems that 4K and Ultra-HD are being used interchangeably. 4K is 4096 x 2160, and Ultra-HD, the successor to 1080p, is 3840 x 2160.
I think we need to nip this thing in the bud before it gets out of hand, especially since CES is this week and we are seeing new "4K" TVs and monitors all over the place.
Flunk - Monday, January 6, 2014 - link
4K is not a defined standard, it's a generic term, so it's going to get thrown around a lot with little meaning.
Here's a wackypedia link: http://en.wikipedia.org/wiki/4K_resolution
psuedonymous - Monday, January 6, 2014 - link
No, 4K IS a defined DCI standard, that's why it's so annoying to see it misapplied.
http://dcimovies.com/specification/index.html
Check spec version 1.2, section 4.2.
euler007 - Monday, January 6, 2014 - link
VESA recognizes both 3840x2160 and 4096x2160 as 4K resolutions. Who cares about DCI, created in 2002.
Sergio526 - Monday, January 6, 2014 - link
Heh, I'm afraid that ship has very much sailed. I too am a stickler for things to be named what they mean, but it seems that we won't be seeing ANY screens with 4096 horizontal pixels and every 4K screen at CES has only 3840 across. In fact, they went ahead and called 7680 pixels across 8K, so having the nK number actually mean something went out the window for good now.
It would have been so much easier to call them 4X and 8X. Plus they would have actually been accurate designations that way. Oh well.
CalaverasGrande - Monday, January 6, 2014 - link
Kind of like how hard drive manufacturers are sticking to that "We count the bits not the bytes" crap, so a 512GB hard drive ends up being about 7% less capacity than you think.
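For what it's worth, the gap comes from decimal gigabytes (10^9 bytes) versus the binary gibibytes (2^30 bytes) the OS reports, and at this size it works out to just under 7%. A quick check:

```python
# Quick check: a "512 GB" drive in decimal gigabytes vs. binary gibibytes.
advertised_bytes = 512 * 10**9           # manufacturers count decimal gigabytes
reported_gib = advertised_bytes / 2**30  # what the OS typically reports as "GB"
shortfall = 1 - reported_gib / 512
print(f"{reported_gib:.1f} GiB reported, about {shortfall:.1%} less than advertised")
# -> 476.8 GiB reported, about 6.9% less than advertised
```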
Flunk - Monday, January 6, 2014 - link
$800.00 is starting to get close to reasonable. I might bite at $400, but probably not; the price of the GPUs to drive it would push it up considerably.
stingerman - Monday, January 6, 2014 - link
Current 4K pro displays are over $3K; I'd say this is a good price and bodes well for a 4K Thunderbolt display soon...
Guspaz - Monday, January 6, 2014 - link
It's a great price, but can you really use a TN panel and call it a "professional" display? I'm assuming it's TN, since the viewing angles in their PDF (not linked from here) are listed as 170/160, whereas IPS would be 178/178 (and I've seen PVA at 178 too).
I get the feeling that they're taking a consumer 4K display and marketing it as a professional display to justify the higher price: it's a great price for a 4K display, but it's priced like a professional 28" display.
errorr - Monday, January 6, 2014 - link
There are no 10-bit TN displays that I know of, so it can't be that. Underlying tech would depend on the panel maker.
baii9 - Monday, January 6, 2014 - link
Any professional spreadsheet user around?
ther00kie16 - Tuesday, January 7, 2014 - link
Some MVA panels were 170/160. I thought TN was usually 160/160.CalaverasGrande - Monday, January 6, 2014 - link
I like the industrial design of this MUCH better than the fugly Dells and LGs. Thinking this will match up nicely with a new Mac Pro.
toke - Wednesday, January 8, 2014 - link
Could someone ask Lenovo to also make a real display for pros? Including wide gamut (9X% AdobeRGB) and not including Apple's famous double-glassy reflection (i.e. just a simple matte finish)?
Would this triple the price or just double?