41 Comments
peterfares - Tuesday, October 22, 2013 - link
Would you really count $475 for this as a steal? It seems quite expensive for a rebranded cheapo WQHD monitor. The Dell is probably worth the extra money, especially considering its three-year advanced exchange warranty versus the one year here. The Microcenter monitor also has the same inputs for $400.
piroroadkill - Tuesday, October 22, 2013 - link
I'd agree that a far more premium look, more inputs, a better stand, etc. are worth $75 alone, let alone the three-year advanced exchange warranty. The Dell is definitely worth the extra money.
Fergy - Tuesday, October 22, 2013 - link
I would gladly pay $50 not to have glossy plastic bezels, and $50 to have it calibrated for me. I have had my current Dell monitors since 2006, and I am not going to pay $100 less for a cheap-looking monitor. It would just irritate me every day.
DanNeely - Tuesday, October 22, 2013 - link
To a large extent these monitors are aimed at people who consider $600-700 crazy, but are willing to make compromises to stretch up from a 1080p screen. They're the same people who bought the low-end 1920x1200 monitors a half dozen years ago, when good ones cost $500 and most people bought $200 1680x1050 screens if they were stepping up from the cheapest common denominator.
LancerVI - Thursday, October 31, 2013 - link
...or they're just gamers who don't need/want that kind of color fidelity, but want the resolution and decent response time. Now that GPUs are getting beefy enough to push beyond 1080p maxed out, it's only natural for gamers to look beyond 1080p monitors.
Flunk - Tuesday, October 22, 2013 - link
With a $75 delta over a $400 base, I would get the UltraSharp every time. Dell's monitor is not only better out of the box, but you've also got a much better history of quality with their high-end monitors. I was going to post that I would rather have a 24" UltraSharp than this 27" cheapie, but the price difference is much less than I expected.

Maybe if they priced this at $350 it would look like a deal to some. I still wouldn't buy it; LCDs last too long to buy a cheap one.
CaedenV - Thursday, October 24, 2013 - link
That is what I learned the hard way. Four years ago I picked up a 1200p display for $300 because it was what I could afford, instead of spending the $500-600 on one that would really be nice. But now I am stuck with a monitor that has a faint but noticeable buzzing sound, backlight bleeding, horrible color, and huge pixels (1200p on a 28" monitor), and displays have improved so much that there is no possible way to resell the thing to help me move up. So now I am stuck with this thing for another couple of years, painfully aware every day that I made a bad call.

Next time around I will be waiting for a non-tiled 4K 60 Hz display in the 35-42" range. It will cost a pretty penny, but if I am going to have to look at it 4-10 hours a day for 7-10 years, then the price will be more than justifiable. Monitors, power supplies, and hard drives are things that cost a bit more up front for quality, but more than pay for themselves in reliability and longevity.
blau808 - Tuesday, October 22, 2013 - link
Sorry, but that thing is hideous.
imsabbel - Tuesday, October 22, 2013 - link
Okay, this monitor is just as unusual as the other Monoprice one. Over 150 cd/m² MINIMUM brightness? I know people like "brighter is better", but 100 cd/m² is the recommended brightness in a well-lit workplace, for a reason.

At night, in a dark room, that's already too bright. A 163 cd/m² minimum means you are messing up your eyes big time if you are a nighttime gamer. In a dark room, 20-30 cd/m² is perfectly fine.
Infy102 - Tuesday, October 22, 2013 - link
Would it not be neat if monitors had ambient light sensors that would automatically adjust the brightness? My Sony TV from 2006 already has such a feature!
cjb110 - Tuesday, October 22, 2013 - link
Phones have this, yet there are tons of problems with the various implementations: how fast it reacts, how much it adjusts, user limits, etc. So I can't see this coming to displays like this, especially as the environmental conditions are more static (and/or controllable) than in TV/phone usage.
purerice - Wednesday, October 23, 2013 - link
The fact that conditions are more static may make for a less commonly used feature, but it would also reduce the inaccuracies due to constantly changing light conditions that you get when using a cell phone. Over the course of the average day, the light in the room where I work gradually shifts from dark to bright and back to dark before the high-powered lights go on. If a screen like this would adjust with the slowly changing ambient light, without me having to adjust any settings from my keyboard, I would be thrilled.
CaedenV - Thursday, October 24, 2013 - link
Sure, if it is like a phone screen or TV then there could be issues, but if it was software controlled with preset user settings (like a good fan controller) then it would not be an issue.
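For illustration, a minimal sketch of that software-controlled approach, assuming a DDC/CI-capable monitor, the third-party Python monitorcontrol package, and a hypothetical read_ambient_lux() helper standing in for whatever light sensor is available:

```python
# Minimal auto-brightness sketch: map ambient light to user-preset
# backlight levels and push them to the monitor over DDC/CI.
# Assumes the third-party "monitorcontrol" package; read_ambient_lux()
# is a hypothetical stand-in for a real ambient light sensor.
import time
from monitorcontrol import get_monitors

# User presets, like a fan-controller curve: (max lux, backlight %)
PRESETS = [
    (50, 10),     # dark room -> very dim
    (200, 35),    # evening lighting
    (500, 60),    # normal office
    (10**9, 90),  # bright daylight
]

def read_ambient_lux() -> float:
    raise NotImplementedError("replace with a real sensor read")

def pick_level(lux: float) -> int:
    for max_lux, level in PRESETS:
        if lux <= max_lux:
            return level
    return PRESETS[-1][1]

def main() -> None:
    while True:
        level = pick_level(read_ambient_lux())
        for monitor in get_monitors():
            with monitor:  # opens the DDC/CI handle
                if monitor.get_luminance() != level:
                    monitor.set_luminance(level)
        time.sleep(10)  # adjust slowly; no phone-style flicker

if __name__ == "__main__":
    main()
```

The slow polling interval is the point: stepping between a few user-chosen presets every ten seconds avoids the twitchy behavior people dislike in phone implementations.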
LordOfTheBoired - Tuesday, October 22, 2013 - link
2006? That's nothing. We had two TVs with this feature when I was growing up in the dark ages of the 1980s. I used to play with them by putting my hand over the photosensor and watching the display go dim and bright again. :)

It's somewhat absurd that three decades later, the feature is less common than ever, especially with all the intelligence you could design into it using modern electronics. Put a few photosensors in at different locations to avoid a transient shadow, and let the microprocessor figure out WTF is going on when a shadow passes across the screen.
foxalopex - Tuesday, October 22, 2013 - link
I have to agree it's kind of silly. My Sony VPC-Z11 series laptop from 2010 had auto-dimming and it worked pretty much perfectly. Fast forward to 2013, and my Asus tablet sort of has this feature: it will brighten in response to too much light, but it won't dim without powering it off and on. I set up an Asus laptop and auto-dimming didn't work. My NEC PA242W has this feature as well (and it works), but it is turned off by default, because it would likely mess with the calibration on a monitor this precise. So oddly, it seems it's not an easily implementable feature.
DanNeely - Tuesday, October 22, 2013 - link
My 2008 NEC 3090 has an auto-dimming feature, but I was never happy with it. Part of it was that, with light coming from 3-5 directions beside/behind me, my shadows were complex enough that adjusting my position a few inches generally changed the amount of light hitting its sensor, triggering an adjustment. The bigger problem was that it was one of three monitors on my desk; even if all three had the option, unless they had a way to communicate with each other, there's zero chance of them all getting the same amount of light falling on them.
Moricon - Tuesday, October 22, 2013 - link
I have an X-Rite ColorMunki attached to my PC which automatically adjusts my three Dell U2412M monitors. Most of the time when I am working it's late at night with minimal lighting, so the X-Rite automatically backs off the brightness for me while still keeping everything calibrated. It's an eye saver for sure.
BrightCandle - Tuesday, October 22, 2013 - link
It's a problem with a lot of monitors: the minimum brightness is still too high for realistic usage. Once you have to turn down the contrast to reduce the brightness further, you are already in trouble, and the quality of the image really suffers for it.
Death666Angel - Tuesday, October 22, 2013 - link
My Samsung has an ambient light sensor, but I don't use it; on most phones it has been pretty annoying. On my Nexus 7 (2013) it does not bother me. But ambient light changes a lot more when I use my Nexus than in the room where my PC display is. Also, it doesn't save me any battery on my PC display. :P
jbm - Tuesday, October 22, 2013 - link
The ASUS PB278Q is $553 on amazon.com right now. I'd buy that for sure over the Monoprice (in fact, I have bought it and I am very happy with it). The PB278Q has a matte screen, is calibrated well, has all the inputs you will ever need, AND comes with all the cables in the box (VGA, HDMI, DVI, DisplayPort) - which also needs to be figured into the price difference.
Nfarce - Tuesday, October 22, 2013 - link
Nice monitor - if you are lucky enough to get one with no dead pixels or massive light-bleed problems. I tried three of them and returned them all. Two had dead pixels that were toward the middle of the screen and noticeable, and the third had a massive light-bleed problem in the lower right and left, probably an assembly defect with the bezel not fitting correctly. I gave up and am now spending time researching other 1440p options.
jabber - Tuesday, October 22, 2013 - link
Buy some carbon fibre vinyl sheeting (or whatever) and cover the bezel in that.
l_d_allan - Tuesday, October 22, 2013 - link
> Considering the color accuracy of this display after calibration, it seems like a cheap option for an image professional that wants color accuracy.

I infer that by "image professional" you would include a serious Photoshop user. At that level, I think they would expect closer to 100% coverage of the Adobe RGB (1998) gamut, rather than sRGB.
Or not?
foxalopex - Tuesday, October 22, 2013 - link
Last I recall, Adobe RGB is a wider colour space than standard sRGB, which is closer to what most consumer monitors are tuned to. Displaying it usually requires a wide-spectrum backlight system, which you are not going to find in a cheap monitor.

From what I recall, it depends on the application. Image professionals who publish primarily to the Internet or to a consumer's computer will never need more than sRGB, because that's all your customer is capable of. Using Adobe RGB would likely throw off the picture quite a bit, because it won't look remotely correct in the sRGB colorspace. I believe the Adobe RGB users are mostly printing images, where there's a very wide colorspace, or just archiving the pictures and trying to capture as much as possible.
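To make the mismatch concrete, here is a minimal sketch of how the same pixel values land on different colors in the two spaces, using the published RGB-to-XYZ matrices for sRGB and Adobe RGB (1998); the gamma handling is approximated, so treat it as illustrative only:

```python
# Why Adobe RGB images look wrong when shown unmanaged as sRGB: the
# same numeric triplet encodes a different color in each space. This
# converts an Adobe RGB color to its correct sRGB rendering and
# compares it to the "unmanaged" result (same numbers, reinterpreted).
import numpy as np

# Published RGB->XYZ matrices (D65 white point)
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2974, 0.6273, 0.0753],
                         [0.0270, 0.0707, 0.9911]])
XYZ_TO_SRGB = np.linalg.inv(np.array([[0.4124, 0.3576, 0.1805],
                                      [0.2126, 0.7152, 0.0722],
                                      [0.0193, 0.1192, 0.9505]]))

def adobe_to_srgb(rgb):
    linear = np.asarray(rgb) ** 2.2           # Adobe RGB gamma ~2.2
    srgb_linear = XYZ_TO_SRGB @ ADOBE_TO_XYZ @ linear
    srgb_linear = np.clip(srgb_linear, 0, 1)  # out-of-gamut values clip
    return srgb_linear ** (1 / 2.2)           # approx. sRGB gamma

saturated_green = [0.2, 0.8, 0.2]  # an Adobe RGB-encoded pixel
print("correct sRGB rendering:", adobe_to_srgb(saturated_green))
print("unmanaged (wrong):     ", saturated_green)
```

The gap between the two printed triplets is exactly the washed-out or over-saturated look you get when color management is missing.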
piroroadkill - Tuesday, October 22, 2013 - link
I think he's inferring that someone who wants colour accuracy probably wouldn't be looking at a cheap-ass monitor.
JDG1980 - Tuesday, October 22, 2013 - link
I'm hoping that Monoprice or one of the Korean vendors will soon release a 4K monitor that uses the inexpensive panel found in Seiki's 4K TVs, but supports 60 Hz via DisplayPort. (The panel in the Seiki TVs can do that; it's just that they are limited to HDMI input, which only supports 30 Hz at that resolution.)

2560x1440 is OK, but surely we can do better now.
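The 30 Hz ceiling is simple link-bandwidth arithmetic. A back-of-the-envelope check, using the standard effective data rates for HDMI 1.4 and DisplayPort 1.2 and an assumed ~20% blanking overhead:

```python
# Back-of-the-envelope: why HDMI 1.4 tops out at 4K@30 while
# DisplayPort 1.2 can carry 4K@60. Blanking overhead is approximated;
# real CEA/CVT timings differ slightly.
def link_gbps(width, height, hz, bpp=24, blanking=1.2):
    return width * height * hz * bpp * blanking / 1e9

HDMI_1_4_GBPS = 8.16  # effective video bandwidth (10.2 Gbps raw, 8b/10b)
DP_1_2_GBPS = 17.28   # effective HBR2 bandwidth (21.6 Gbps raw, 8b/10b)

for hz in (30, 60):
    need = link_gbps(3840, 2160, hz)
    print(f"4K@{hz}: ~{need:.1f} Gbps -> "
          f"HDMI 1.4 {'OK' if need <= HDMI_1_4_GBPS else 'no'}, "
          f"DP 1.2 {'OK' if need <= DP_1_2_GBPS else 'no'}")
```

4K@30 squeaks in at roughly 7 Gbps, while 4K@60 needs around 14 Gbps - past HDMI 1.4 but comfortably inside DisplayPort 1.2.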
Nfarce - Tuesday, October 22, 2013 - link
Have you looked at the performance hit that 4K monitors put on modern high-end graphics cards? See Tom's review from Sept. 18 about it. At high quality settings in games, a 4K monitor (2160p) brings a Titan GPU to its knees, barely making 30 fps in games like BF3 - and with Crysis 3, forget about it unless you go with two Titans. At some point the law of diminishing returns steps in for what the eye can appreciate as resolutions move up anyway. But if you've got the money, sure, you *can* do better than 1440p - you just need to pony up for the GPU power to run it.
iamlilysdad - Tuesday, October 22, 2013 - link
Not everybody is in it just for gaming.
DanNeely - Wednesday, October 23, 2013 - link
The few games they benched with no AA gave good results on the single Titan. I'd like to see more tests like that with a single 780. While 140 DPI isn't high enough to eliminate the benefit of AA, it's enough of an improvement over 100 DPI that AA is not as important.

That said, my budgeting assumes that when I jump on the 4K bandwagon I'll need to add a second GPU to feed it at native resolution.
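For reference, the DPI figures quoted here follow from a quick pixel-density calculation (assuming standard 16:9 panels):

```python
# Pixel density for the sizes discussed: ~109 PPI for 27" 1440p versus
# ~140 PPI for a 31.5" 4K panel (and ~163 PPI for 4K at 27").
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(f'27" 2560x1440:   {ppi(2560, 1440, 27):.0f} PPI')    # ~109
print(f'31.5" 3840x2160: {ppi(3840, 2160, 31.5):.0f} PPI')  # ~140
print(f'27" 3840x2160:   {ppi(3840, 2160, 27):.0f} PPI')    # ~163
```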
shaolin95 - Tuesday, October 22, 2013 - link
Why even get that? QNIX is much cheaper and looks awesome. AnandTech, come on now... tons of cheaper Korean monitors out there... what's going on?
DanNeely - Tuesday, October 22, 2013 - link
In general, if a company isn't willing to provide a review sample, their products don't get reviewed.
Byte - Wednesday, October 23, 2013 - link
My fairly ancient Soyo 24" MVA is finally dying, so I need to get something soon. What's the consensus on Samsung PVA vs. LG IPS (Crossover/Catleap/Shimian vs. QNIX/X-Star)? Googling doesn't help much. I want a glossy panel.
kedesh83 - Tuesday, October 22, 2013 - link
I just purchased the Catleap Crossover with the included P-blade stand. For $399 I can't complain, even with the lack of an OSD. The professional stand is enough, IMO, to choose the Catleap.
joelypolly - Wednesday, October 23, 2013 - link
I wonder how much variation there is in panel uniformity between samples.
wffurr - Wednesday, October 23, 2013 - link
Chris, does this monitor have audio passthrough? A headphone jack would be acceptable.

I'd prefer to use a decent 2.1 setup instead of the built-in speakers for audio over DisplayPort or HDMI. I have an Auria EQ276W, and it lacks this feature, which is pretty annoying, especially with a device like a Chromecast that doesn't have a separate audio out.
The 3.5mm jack on the Auria only works as an audio *input* alongside the DVI or VGA video inputs, and won't reverse into an output when using DisplayPort or HDMI.
twinclouds - Wednesday, October 23, 2013 - link
I got this one for a little over $300 from Rakuten, so it was a great deal. Before buying it, I checked with Monoprice and they told me it should have zero dead pixels, and when I got it the second day there were indeed no dead pixels. I like this monitor. However, there are two "major" problems for me. The first is the stand, as everybody has noticed; I have to use one of my good old Asus monitor stands, which is much sturdier. The second issue is not too bad, but very annoying: once my computer turns the monitor off and I wake it back up with the mouse or keyboard, the monitor light sometimes turns on but there's no display. The only ways to bring it back are to turn the monitor off and on, or to let the KVM switch go through a full cycle. I contacted their customer service, but they couldn't really fix the problem and gave me some useless suggestions that just wasted a bunch of my time. BTW, I tested on two computers and both had the same problem. Both use Intel graphics, so I don't know if it is graphics-board specific.
Wall Street - Wednesday, October 23, 2013 - link
Chris, I know that IPS displays are all the rage. However, any chance that you could test the 144 Hz displays from ASUS and BenQ? Specifically, the input lag and pixel response at both 60 Hz and 144 Hz, compared to 60 Hz TN and IPS displays, would make for a really good read, IMHO.
k9cj5 - Friday, October 25, 2013 - link
I think it's funny when people talk about paying a lot for a monitor when they will spend 400 bucks on a video card and 350 bucks on a processor, but when it comes to the monitor, they'll buy the 100-dollar special on Newegg. The monitor and sound are what make your games pop and allow you to enjoy the content. When you get a good one, you will notice a difference. I bought the ZR24w about two years ago, and at first I didn't notice much of a difference, but when I go over to my friend's house or use the computer at work, it's night and day.
fathomit - Tuesday, October 29, 2013 - link
If I connected the dual-link DVI cable directly to a Mini DisplayPort on a laptop, and the USB part of the cable directly to a USB hub that's connected to the laptop... would it work?
az060693 - Tuesday, November 3, 2015 - link
Insanely late comment on an old review, but the part about the HDMI port being unable to support 1440p@60 Hz is false. It runs fine off my GTX 960 via HDMI; I later switched over to DisplayPort to use the HDMI port for my Xbox, but it ran fine for months on HDMI.

Interestingly, because of the above-1080p resolution, Nvidia didn't automatically default to HDTV colorspace settings like it does for 1080p and lower-resolution monitors.
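That matches the link math: 1440p60 needs a pixel clock well under HDMI 1.4's 340 MHz TMDS ceiling, so the bottleneck is source/EDID support rather than the cable. A rough check, with reduced blanking approximated at ~5%:

```python
# Sanity check: 2560x1440@60 fits comfortably within HDMI 1.4's
# 340 MHz TMDS limit, so whether it works depends on what the GPU and
# monitor EDID advertise, not on raw link capacity.
def pixel_clock_mhz(width, height, hz, blanking=1.05):
    return width * height * hz * blanking / 1e6

clock = pixel_clock_mhz(2560, 1440, 60)
print(f"2560x1440@60 needs ~{clock:.0f} MHz (HDMI 1.4 limit: 340 MHz)")
```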