52 Comments
lilkwarrior - Wednesday, November 20, 2019 - link
Kinda silly to not just have a Thunderbolt 3 port instead of the 65 W USB-C. Much more useful in 2019 to replace the USB 3.1 USB-C port with Thunderbolt 3, plus the fact that USB4 includes Thunderbolt 3.
lilkwarrior - Wednesday, November 20, 2019 - link
Appreciate the true 4K resolution; however, it's super strange there's not much info about its HDR support. If it doesn't have Dolby Vision & HLG on top of HDR10, that is incredibly disappointing, and this monitor wouldn't be that great for a lot of professionals accordingly (particularly video pros).
I also would've hoped for HDMI 2.1 for the HDMI ports to leverage QuickSync & other affordances, in addition to perhaps VRR for computer graphics work in interactive entertainment content production.
Seems 2020 is a more opportune time to buy something along those lines, starting w/ the Asus PA32UCG (VRR; support for all the major HDR formats used in film & streaming, w/ only HDR10+ support TBD; 4K@120Hz; Thunderbolt 3).
edzieba - Wednesday, November 20, 2019 - link
This is a pro monitor. There are two choices there: do HDR PROPERLY (i.e. either OLED or a full-array backlight) or don't do it at all. Here, they've gone for don't do it at all.
The half-arsed "accepts HDR inputs but only displays SDR" crap you find in the consumer space does not fly in the pro world, where people often actually have a clue.
lilkwarrior - Wednesday, November 20, 2019 - link
I’m very familiar with OLED HDR reference monitors; OLED isn’t mandatory for a pro display, as Apple & Asus proved with what the Pro Display XDR & PA32UCG enable (as well as dual-LCD panels from the likes of Panasonic).
Given the existence of those monitors (and the PA32UCX already existing at pretty much the same price, ~$3800), I’m of the opinion this monitor’s feature set is questionable this late in 2019.
Canam Aldrin - Saturday, November 23, 2019 - link
The PA32UCX has 1D LUTs, not 3D. For professional photographers, colorists (like myself), etc., that is a big difference; I cannot get the accuracy needed without 3D LUTs. Also, NEC has a history of serving the professional imaging market; Asus is still trying to prove itself but hasn't yet. They've made good efforts, but the company is clearly still learning that market.
The Apple Pro Display XDR hasn't proven anything yet, as nobody has had hands-on time to evaluate it. It's also unclear if it will support 3D LUTs with third-party calibration, or if it uses 3D LUTs at all. The specs just say it has a Thunderbolt input, so it's still unclear. It also costs twice the price.
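For readers unfamiliar with the distinction, here is a minimal sketch (Python with NumPy; illustrative identity tables, not a real calibration) of the structural difference between the two LUT types:

```python
import numpy as np

# A 1D LUT holds one curve per channel: R out depends only on R in, etc.
# Enough to fix gamma and grey balance, but not mixed-color errors.
def apply_1d_lut(rgb, lut):              # lut: (N, 3) table, rgb in [0, 1]
    n = lut.shape[0]
    idx = np.clip(np.rint(rgb * (n - 1)).astype(int), 0, n - 1)
    return np.stack([lut[idx[..., c], c] for c in range(3)], axis=-1)

# A 3D LUT indexes by the full (R, G, B) triple, so a correction can target
# e.g. "saturated oranges only". Nearest-neighbor lookup here for brevity;
# real calibration pipelines interpolate (trilinear or tetrahedral).
def apply_3d_lut(rgb, cube):             # cube: (N, N, N, 3) table
    n = cube.shape[0]
    idx = np.clip(np.rint(rgb * (n - 1)).astype(int), 0, n - 1)
    return cube[idx[..., 0], idx[..., 1], idx[..., 2]]

# Identity tables as a starting point (17^3 is a common 3D LUT grid size):
N = 17
ramp = np.linspace(0.0, 1.0, N)
lut1d = np.stack([ramp] * 3, axis=-1)
cube = np.stack(np.meshgrid(ramp, ramp, ramp, indexing="ij"), axis=-1)

orange = np.array([[0.8, 0.4, 0.1]])     # a saturated orange
print(apply_1d_lut(orange, lut1d))       # ~unchanged (quantized to the grid)
print(apply_3d_lut(orange, cube))        # ~unchanged (quantized to the grid)
```

The point is the indexing: the 1D table can never make the red output depend on the green or blue input, while the 3D cube can, which is why cross-channel color errors need a 3D LUT.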
umano - Monday, November 25, 2019 - link
I agree with both of you, and I get your consumer frustration. I would like a 4K DCI Dolby Vision monitor with a gamut bigger (even slightly) than DCI-P3. But it has to stay color/bleeding consistent through years of usage. In my experience, even NEC cannot assure pro-level consistency over 3-5 years of daily usage, except in the reference models.
The Asus PA32UCX lacks 4K DCI, but with 89% Rec.2020 it looks like a solid option; I just do not believe it will be consistent enough to do the job. I have experience with Apple monitors as well, and I think they won't be much better in the long run. As many people have already said, I would go for an SDR working monitor and check the results on an HDR 50+ inch premium TV.
I hope I am wrong and both Apple's and Asus's monitors will be amazing, but we need to wait at least a year to evaluate consistency. Luckily I do not need an HDR color grading monitor anytime soon, and I can wait for Eizo to release a more affordable version of their $40k Prominence monitor.
andychow - Tuesday, November 26, 2019 - link
Real question: isn't 1D LUT vs 3D LUT a software thing? If the monitor is properly calibrated, isn't a LUT just a transformation table that is handled in software and doesn't use the monitor's internal tables (which I'm assuming are just there for calibration)?
Death666Angel - Wednesday, November 20, 2019 - link
Not for those without a TB3 device, though.lilkwarrior - Wednesday, November 20, 2019 - link
Why wouldn’t you, with a pro computing device to be used alongside this?
Also, USB4 has TB3, so it just doesn’t make much sense to not have TB3, given the longevity this should have with the amount you’re paying for it, plus it being a pro device.
There’s no shortage of pro devices with TB3
imaheadcase - Wednesday, November 20, 2019 - link
Because not every device has Thunderbolt, and every device has USB-C. Simple.
crimsonson - Wednesday, November 20, 2019 - link
TB requires Intel licensing. The overwhelming majority of PCs do not have TB ports, or at best it's optional and requires a PCIe card.
At this point, DP and HDMI are the safest port types for a monitor.
lilkwarrior - Wednesday, November 20, 2019 - link
It’s not an either-or. You can have TB3 & the other ports. Most pros are using TB3 for the bandwidth & I/O perks alone.
It’s no coincidence all pro laptops have Thunderbolt 3; only weird consumer-level “pro” laptops & desktops skip TB3.
lilkwarrior - Wednesday, November 20, 2019 - link
I meant to say high-end pro hardware.
imaheadcase - Thursday, November 21, 2019 - link
But that is still the MINORITY of people. Most people still use a monitor like this on a desktop. Laptops don't run this monitor.
USB4 has royalty-free Thunderbolt 3. Intel kept their promise to eventually make it royalty-free via that method.
Accordingly, pros increasingly expect TB3.
imaheadcase - Wednesday, November 20, 2019 - link
Other way around. Thunderbolt 3 is not very useful for most people anymore. I don't see many devices with Thunderbolt anymore.
melgross - Thursday, November 21, 2019 - link
You’re blind.imaheadcase - Thursday, November 21, 2019 - link
I'm blind? Let's look at the current number of monitors with TB3 vs USB-C. OK, I win, USB-C still wins.
Speednet - Wednesday, November 20, 2019 - link
It would be really great to know what kind of HDR support this monitor has, although with its peak brightness of 350 nits I'm guessing it has no support. Which is a shame. I can't understand why manufacturers continue marketing high-end monitors that don't have HDR support.imaheadcase - Wednesday, November 20, 2019 - link
Because HDR is not really a popular option, and not really something worth including in a Pro monitor. Well, any monitor for that matter.crimsonson - Wednesday, November 20, 2019 - link
Trolling dude.imaheadcase - Thursday, November 21, 2019 - link
I'm not trolling. No one thinks "Oh man, let's get that fad HDR tech and play with it." No one notices a difference with it. People lap up bullshit tech like it's new again. lol
lilkwarrior - Wednesday, November 20, 2019 - link
HDR is certainly a popular option pros are looking for; the most-sold tech devices last Black Friday alone were 4K HDR devices.
Even the iPhone has supported Dolby Vision HDR for some time; the Apple TV & Chromecast also support various HDR content.
Sports streaming supports HLG HDR.
How else can pros master content for consumers if they don’t have an HDR pro monitor? These monitors are DOA to most pros these days.
imaheadcase - Thursday, November 21, 2019 - link
You are 100% wrong on most devices sold last Black Friday. The most-sold TV in the USA was a $199 TCL 1080p TV. No HDR.
The most-sold phone last Black Friday was an $85 prepaid phone (fun fact: it outsold iPhone devices by over TWO YEARS of sales).
HDR isn't even advertised as a selling point on TVs. It's a footnote, like "supports HDR".
Canam Aldrin - Saturday, November 23, 2019 - link
Consumer display of HDR is one thing, since you can just approximate it and get an HDR effect. Mastering HDR requires constant high brightness without drift or burn-in, usually >1000 nits. Those devices cost >$30K right now. Apple marketing says the Pro Display XDR will be able to compete with these for a fraction of the price, but they'll have to prove it. Apple has a history of overstating their capabilities and glossing over the finer points.
Duncan Macdonald - Wednesday, November 20, 2019 - link
Why have touch support in such a monitor? Fingerprints will rapidly reduce the 10 bit color resolution to 8 bits or less and cause non-uniformity across the screen.melgross - Wednesday, November 20, 2019 - link
Not as badly as you say. Fingerprints don’t prove to be a real problem on phones and tablets; you don’t notice them until you turn the screen off. If you have really greasy or dirty hands, you should clean them first, or your keyboard/mouse/trackpad or trackball will get filthy fast. Not very professional.
dontlistentome - Wednesday, November 20, 2019 - link
So ... how many small cars does this cost?melgross - Wednesday, November 20, 2019 - link
The only thing I find impressive is the 14-bit LUTs. The rest is pretty much mid-range. Where is the definite support for DCI-P3? Without that, forget pro-grade video. 4K’s color standard is DCI-P3. So is cinema’s. So without stated support, this monitor isn’t useful in those categories.
350 nits? Seriously, in a new graphics/video pro monitor? What are they thinking? Sure, this isn’t an Apple Pro Display for double the price (at 6K) that can do 1200 nits. But to get to HDR you need at least 650 nits to even come close.
This seems to be the perfect monitor for the last generation of users.
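For context on the nits debate: HDR10 and Dolby Vision encode absolute luminance with the SMPTE ST 2084 "PQ" curve. A quick sketch (Python; constants are from the published standard, the panel numbers are illustrative) of how much of the PQ signal range a given peak brightness can reproduce:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal [0, 1].
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_signal(nits: float) -> float:
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# A 350-nit panel tops out around 64% of the PQ range; everything a
# 1000- or 4000-nit master places above that level simply clips.
for nits in (100, 350, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> {pq_signal(nits):.3f}")
```

Because PQ is so heavily weighted toward the dark end, the step from 350 to 1000 nits covers a meaningful slice of signal range where HDR highlights actually live.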
melgross - Wednesday, November 20, 2019 - link
That is, a 6K price for 6K resolution.
imaheadcase - Wednesday, November 20, 2019 - link
Considering it IS a pro monitor, HDR is not needed. HDR is not a must-have for any monitor; it's not even something most people consider.
melgross - Wednesday, November 20, 2019 - link
For a pro graphics/video monitor, HDR is very important. It’s required.
Pro-competition - Wednesday, November 20, 2019 - link
If you really need HDR, then you should only consider per-pixel dimming (not even the new Apple Pro Display XDR qualifies).
ColorEdge PROMINENCE CG3145:
https://www.eizo.com/products/coloredge/cg3145/
https://www.anandtech.com/show/11286/eizo-announce...
"The key feature of the ColorEdge Prominence CG3145 display is its ability to properly reproduce both very bright and very dark areas on the scene without artefacts caused by local dimming (used on many IPS-based televisions and on some monitors) or an auto brightness limiter. EIZO does not reveal many details about the IPS panel it uses for the CG3145, but it claims that it has control of backlight intensity in every pixel. The latter means that the company either uses Panasonic’s IPS panels with a special layer of light-modulating cells that enable pixel-by-pixel control of backlight intensity, or a similar technology it has developed in-house." (Quoted from Anandtech)
melgross - Thursday, November 21, 2019 - link
I’ve seen that monitor in a recent presentation. It’s good, but still not up to Apple’s standards, much less those of the even more expensive ones.
Canam Aldrin - Saturday, November 23, 2019 - link
There isn't really a more expensive or better technology than the dual-layer LCD used in the Prominence. It is per-pixel and costs about $30K; the Apple Pro Display XDR is inferior in all regards except resolution, and the Apple display is a proprietary 6K resolution that nobody uses, in a same-size panel where you can't see that much detail anyway. So if you're expecting the XDR, with its 576 zones, to compete with something like the Prominence's per-pixel backlight, you're going to be very disappointed. 576 zones is not very much for professional use, and I'm very skeptical that I could live with it (I'm a colorist).
crimsonson - Wednesday, November 20, 2019 - link
Because HDR increases the cost significantly, and if you are pitching yourself as a product for the professional content creator (print, video, etc.), you will not be implementing VESA DisplayHDR 400 or something low-end, especially at that price. You would need to achieve at least 600 if not 1000, which makes it more expensive and requires more complex processing to achieve more accurate colors than your standard consumer monitor.
And anyway, HDR is not needed for every monitor. Not every monitor is for HDR.
Yes, it does limit the market for this monitor, but that is no different than deciding whether a monitor is widescreen or not. Not everyone needs or wants a widescreen monitor.
melgross - Wednesday, November 20, 2019 - link
This is a $3200 monitor. It is expensive. Considering that much less expensive ones do offer full DCI-P3 and over 500 nits, there is no excuse for this one not offering them.
Pro-competition - Wednesday, November 20, 2019 - link
The only properly implemented HDR monitor I am aware of is the ColorEdge PROMINENCE CG3145 (see my post above). That costs about $30,000. It's unrealistic to expect proper HDR on a $3200 monitor.
melgross - Thursday, November 21, 2019 - link
350 nits is barely acceptable in a non-HDR monitor. Many monitors can output 600 nits and even higher.
Canam Aldrin - Saturday, November 23, 2019 - link
The competition for this is the Eizo CG3145 at about $5600. This NEC is very nicely priced for its market. Anything less than 1000 nits is useless for professional HDR mastering and only good for consumer playback.
Canam Aldrin - Saturday, November 23, 2019 - link
With 3D LUTs, you can calibrate to whatever colorspace you need. So there is your P3 support. People who don't know how to do this shouldn't be working in P3 anyway.
Pro-competition - Wednesday, November 20, 2019 - link
75 Hz. Is no one commending NEC for having a higher-than-usual refresh rate? Sure, 120 Hz would be far more welcome, but even 75 Hz is noticeably smoother than 60 Hz.
melgross - Thursday, November 21, 2019 - link
For what professional purpose? If it’s used for graphics, then it’s not useful. For video, it’s not a standard refresh rate.
For gaming, it might be better, but this is supposed to be a pro graphics/video monitor, not a gaming monitor.
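On the video point, what matters is less whether a rate is "standard" than whether it divides evenly by the content's frame rate. A small sketch (Python, illustrative) of the frame-hold cadence a display must use:

```python
from fractions import Fraction

# Each source frame is held on screen for a whole number of refreshes; when
# the refresh rate isn't an integer multiple of the frame rate, the hold
# counts come out uneven, which is visible as judder.
def cadence(source_fps: Fraction, refresh_hz: Fraction, frames: int = 8):
    holds, acc = [], Fraction(0)
    per_frame = refresh_hz / source_fps   # refreshes available per source frame
    for _ in range(frames):
        acc += per_frame
        whole = int(acc)                  # refreshes spent showing this frame
        acc -= whole
        holds.append(whole)
    return holds

film = Fraction(24000, 1001)              # "23.976" fps film content
print(cadence(film, Fraction(60)))        # uneven 2, 3, 2, 3, ... ("3:2 pulldown")
print(cadence(film, 3 * film))            # even 3, 3, 3, ... at ~71.93 Hz
```

So a panel that can sync to an exact multiple of 23.976 fps shows every film frame for the same number of refreshes, while a fixed 60 Hz panel cannot.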
Canam Aldrin - Saturday, November 23, 2019 - link
If they can do fractional framerates below 72Hz, it is useful, because then you can do 71.94Hz, which is 3x 23.98, giving you nice smooth playback of "24p" video. 60Hz is messy for this since it has to repeat frames unevenly (well, that's over-simplifying how it works, but you get the idea).
hennes - Thursday, November 21, 2019 - link
"including a USB-C input that supports 65 W power delivery". I read this as the power to the monitor is supplied via USB-C (input to the monitor). Not that it could supply either devices with up to 65W (power output).USB-C port and USB-C input are almost the same. But since this is not an input "port" might be clearer.
yetanotherhuman - Thursday, November 21, 2019 - link
Huh, that's a novelty: an actual 4K monitor as opposed to Ultra HD.
DBsuper - Thursday, November 21, 2019 - link
3000 dollars at only a 75Hz refresh rate? Hahaha, are you joking, NEC? 31 INCH TOO!! CAN'T STOP LAUGHING.
imaheadcase - Thursday, November 21, 2019 - link
Are you stupid or something?
Canam Aldrin - Saturday, November 23, 2019 - link
Then you're laughing in the mirror, because you are completely ignorant. This is probably the least important spec for the target market. But of course, just assume that NEC, with all their engineers and years of success in this market, are the idiots who don't know how to spec or price their product. Maybe consider understanding things before you speak on them?
we_are_theBorg - Sunday, December 8, 2019 - link
I see a lot of videographers commenting on here but no photographers. As far as I can tell, this series has always been produced for the photography, pre-press and medical imaging markets, not video, hence not marketing a P3 color gamut or HDR. I am a photog and do fine art printing for other clients too. I have exclusively used NEC PA series monitors for many years. For me, the only specs that really matter are the 100% Adobe RGB gamut and the 14-bit lookup table. It's nice to get bumps in resolution and contrast in this series, but nothing else really matters. When doing print proofing, the standard is between 80 and 120 nits brightness, so 350 is super overkill. The same is true for the DICOM standards in the medical market that NEC sells lots of these into.
Another big advantage that NEC has over many other brands (Asus, LG, ViewSonic, BenQ... all the newbies) is that their SpectraView II calibration software supports some higher-end spectrophotometers directly, like the X-Rite i1 Pro, whereas the less pro offerings only support a very small number of lower-end colorimeters, which aren't as accurate and are particularly less sensitive near the lower end of the brightness range.
If the above is all gibberish to you, then, put simply, this series isn't targeted at you, and luckily something far more affordable will probably do you just fine and make you way happier... unless you're grading HDR video content, in which case I'm sorry, and I hope you've got a spare kidney, because then yes, I agree with the other commenters who have tried to explain why you need 1000 nits, 100% P3 gamut, and none of this prosumer gobbledygook.
Homework:
https://www.dpbestflow.org/color/monitor-calibrati...
https://siim.org/page/displays_chapter3
https://www.xrite.com/blog/colorimeter-vs-spectrop...
stevejayd - Sunday, December 8, 2019 - link
I'm a photographer who also profiles paper to try to print what I want. My older ViewSonic is dying. What would you recommend for under $1500 and 32 inches?
we_are_theBorg - Wednesday, December 11, 2019 - link
Honestly, probably the 27" NEC PA271Q, though it's smaller and lower resolution than you'd probably like... if you're okay just using a colorimeter, the ASUS PA329Q is pretty durn good. At least it's 99.5% Adobe RGB and can be hardware calibrated. Uniformity control is lacking, but at least it doesn't fry itself like the BenQ :-P