On the Windows side of things, what video card and driver were used? How AMD and NVIDIA handle MST support varies slightly, so you might have better luck with one over the other.
... or a wall of U28D590D for $699 each. 60Hz TN 4K. I'm glad to see the major players offer up affordable 60Hz 4K. Of course, I'd rather have 120Hz 4K DP1.3. Even if you couldn't play games at it, it would be of tremendous value to me just for desktop operations.
If I'm correct in assuming you're talking about the 39" Seiki TV, then I must inform you that those are not 120Hz screens in the same sense that a computer monitor is 120Hz. Those Seiki TVs only take an HDMI input, which is currently limited to 4K@30Hz. They then interpolate frames between frames of the source material to give the illusion of 120Hz. A 120Hz monitor takes in a 120Hz signal and displays it natively. There are currently no 4K 120Hz monitors on the market (there aren't even any 2560x1600 120Hz monitors that I'm aware of).
The 50" Seiki 4K TVs do native 120Hz 1080p over HDMI 1.4, but it seems to be a lottery as to whether it needs to be hacked or not. While this resolution is not officially supported, creating a custom EDID makes it possible. In addition, several people have been loading the 50" firmware on the 39", making native 120Hz 1080p possible there as well. So you can have your desktop and videos at 30Hz 4K (not ideal, but still razor sharp) while also gaming at 120Hz 1080p. Some are claiming 720p at native 240Hz... but I'd have to see that to believe it.
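The bandwidth arithmetic backs this up: 1080p@120Hz and 4K@30Hz need roughly the same pixel clock, which is why both fit through the same HDMI 1.4 link while 4K@60Hz does not. A rough sketch (the ~10% blanking overhead is an approximation; the exact figure depends on the EDID timings in use):

```python
# Rough TMDS pixel-clock estimates (MHz). Approximate: actual blanking
# intervals depend on the specific EDID timings.
def pixel_clock_mhz(h, v, hz, blanking=1.1):
    """Active pixels times refresh rate, padded ~10% for blanking."""
    return h * v * hz * blanking / 1e6

HDMI_1_4_LIMIT = 340  # MHz, max TMDS clock for high-speed HDMI 1.4

for mode in [(1920, 1080, 120), (3840, 2160, 30), (3840, 2160, 60)]:
    clk = pixel_clock_mhz(*mode)
    verdict = "fits" if clk <= HDMI_1_4_LIMIT else "exceeds"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {clk:.0f} MHz ({verdict} HDMI 1.4)")
```

Both 1080p@120 and 2160p@30 land around 274 MHz, comfortably under the 340 MHz ceiling, while 2160p@60 needs roughly double that.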
All Seiki 50-inch displays will natively display 1080p@120Hz. The 39-inch models all will too (with a firmware update). The 39-inch monitors with the firmware update do pixel doubling, which means ideal scaling for gaming (almost identical to gaming on a big 1080p display, with no scaling artifacts).
Both the 50-inch (and the 39 with the right firmware) will accept 720p@240Hz. It still only displays 120Hz, but this does halve the input lag from around 9ms to 4.5ms, which is why for games where it really matters (only Quake Live for me) I ran at 1280x720@240Hz.
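The halving makes sense arithmetically: even though the panel still refreshes at 120Hz, a 240Hz input signal scans each frame out in half the time. A quick back-of-envelope:

```python
# Time for the source to transmit one frame at a given input rate (ms).
# The panel may still refresh at 120Hz, but a 240Hz signal delivers
# each frame in half the time, which is where the lag reduction comes from.
def frame_time_ms(input_hz):
    return 1000.0 / input_hz

print(f"120Hz input: {frame_time_ms(120):.2f} ms/frame")  # 8.33 ms
print(f"240Hz input: {frame_time_ms(240):.2f} ms/frame")  # 4.17 ms
```

That lines up with the quoted drop from ~9ms to ~4.5ms once the display's own processing overhead is added on top.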
Actually, the 39" Seiki *does* accept 1080p@120Hz over HDMI (when flashed with the firmware from the 50" Seiki) and outputs all 120 individual frames per second (it looks very smooth!). You're right that it's limited to 4K@30Hz.
For <$400 and 2560x1440, I'll take a thick (matte finish) bezel with an IPS panel. In fact, I'll take 6 of them and keep the change instead of buying a "Dell".
Samsung just announced a direct-competitor to this monitor for $800 starting price. So maybe just 3 of those and keep the change, and they have a razor thin bezel and aluminum frame construction.
I mean...I just can't get over the price here. $2800 bucks. It's not even worth $1000.
Are you referring to the UD590? It's hardly a "direct competitor." It's not only 4" diagonal smaller, but it's a TN panel and lacks quite a few input types. Or are you referring to the UD970, in which case would you please post a source for the $800 price tag, which sounds unlikely.
On top of that, remember that you then need 2 video cards to power 4/6 monitors (NVIDIA supports 4, but only in span mode with the fourth as an "extra" display), so you won't even be able to tile them properly by default. On top of that, have fun trying to play a game across all the monitors. At least with a single 4K display you can render at 1080p and have seamless scaling across the entire display. With 4 or 6 monitors you'd have to run it in the standard "Eyefinity" style 3x1 configuration, which is nice, but will also cost you resolution. At that point you're going to need not just two GPUs, but likely two high-end GPUs. When all is said and done, I don't think you'll save much once you take into account the disadvantages of the setup on top of the price.
You have never been truly productive with a computer, have you? A multi-monitor setup enables me to do things just that much faster. While you could argue that a 4K monitor would allow me to do stuff side by side on the same panel, at the end of the day, when the new resolution becomes the norm, you have to develop content for 4K. And the only way to see your results is to see them at 4K, so you have to have another panel. Single-display setups are meant for lazy office people and content consumers. And even then, playing a game on a surround monitor setup is much nicer than having one large monitor.

People used the same kind of software back in the days of 1280 by 1024, 1024 by 768 and 800 by 600. And even though most of the software of today is written for 1080p or 1200p panels, the fact still remains: you cannot operate Visual Studio in half of that. Hell, it's a nightmare to run Visual Studio on 1366 by 768, let alone something like 960 by 1080. Even though application UIs appear scalable, scaling them is a last resort. So yes, when we catch up to 4K on the software side of things, we will need multiple monitors again. But yes, of course, for a web developer of today, being able to edit multiple stylesheets, JavaScript and HTML files at the same time AND see the result largely at scale is truly great. Unfortunately, most enterprise developers rarely need to edit countless files at once, and the PHP/CSS people simply will not be able to afford these displays.
What do you mean by "run 4K at 1080p"? That makes no sense.
If you are talking about running those 4 monitors to get a total screen space of 3840x2160, sure, you could do that, but you entirely miss the point of having a high-density monitor. The pixels on a 39" 1080p screen are the size of a truck. Maybe suitable for a TV viewed across your living room, but not for a monitor a couple of feet from your face.
Yes, that information could help, since not every test was done on Nvidia/AMD and the same driver. Also, it would help a lot to hear if those problems also happen with ASUS and Sharp, to compare those products. Anandtech made a review already: http://www.anandtech.com/show/7157/asus-pq321q-ult...
It takes more than 6 months to go from a finished standard to fully certified working silicon, let alone shipping it in quantities that enable mass production of consumer devices.
I'm not sure this is right. Companies usually are making and testing IP while a standard is in the works. In some cases they're out before the standard is done.
This is correct. There is currently no full HDMI 2.0 silicon out there that I'm aware of, and since the Dell started shipping last fall, it certainly didn't have access to it then. There are currently devices shipping in the AV world that claim "HDMI 2.0" support, but that isn't full HDMI 2.0. It is support for 4:2:0 chroma subsampling, which is part of the HDMI 2.0 spec and enables UltraHD resolution at 60 Hz. Since computers don't use chroma subsampling, this isn't relevant, and there is no HDMI 2.0 silicon right now.
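To see why 4:2:0 subsampling is what makes those "HDMI 2.0" UltraHD@60 claims possible on older silicon, compare the raw video payload each scheme needs (8-bit components; link-layer overhead ignored):

```python
# Average samples per pixel under common chroma subsampling schemes.
# 4:4:4 keeps full chroma; 4:2:0 shares one chroma pair per 2x2 block,
# so each pixel averages 1 luma + 0.5 chroma samples.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def uhd60_payload_gbps(subsampling, bits=8):
    """Raw video payload for 3840x2160 @ 60 Hz, in Gbps."""
    return 3840 * 2160 * 60 * SAMPLES_PER_PIXEL[subsampling] * bits / 1e9

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"UHD@60 {s}: ~{uhd60_payload_gbps(s):.1f} Gbps")
```

That puts 4:4:4 around 12 Gbps and 4:2:0 around 6 Gbps, against roughly 8.2 Gbps of video payload on a 340 MHz TMDS link, so only the subsampled mode fits; fine for video, but useless for desktop text, as the comment notes.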
Where did you get that idea from? It's false; you need a GeForce 600 "Kepler" graphics card or newer to drive a display at up to 4096x2160.
Hell, even the ChromeOS guys have merged this Linux UHD patch into their tree now... so Intel Haswell/Iris Graphics work at "UHD-1" 3840x2160p if you are not gaming: http://lists.x.org/archives/xorg-devel/2014-Januar...
You can do that resolution at 24 Hz, or 3840x2160 at 30 Hz, but you can't do it at 60 Hz without MST right now. HDMI 2.0 allows it at 60 Hz but that isn't available yet on a product.
I was speaking about 600MHz HDMI, not ~300MHz. 300MHz HDMI has been around since GCN 1.0 and Kepler. It's also available in Haswell and works fine in Windows, OS X, or GNU/Linux at that resolution, but it's limited to 30Hz for 3840x2160. That's not HDMI 2.0 spec. You can't use anything other than DisplayPort for 60Hz 4K/UHD, and DisplayPort receivers only do that via MST too. You would need two 300MHz HDMI ports to do UHD @ 60Hz. So gaming in UHD over HDMI is out regardless of GPU/source.
Maxwell doesn't do H.265/HEVC, for that matter, either. You only need ~300MHz HDMI 1.4 to do 4096x2160 @ 24Hz, not HDMI 2.0; HDMI 2.0 is what can do it @ 60Hz.
As far as things that still aren't there, I'd throw in color space (both gamut and bit depth) as well. Official UHDTV (see Rec. 2020), beyond bumping the resolution standards to 4K or 8K, also at last features a significantly larger color space, plus the depth necessary to go with it (either 10-bit or 12-bit). That's another marquee feature of HDMI 2.0: 12-bit 4:2:2 4K@60fps. Without the increased depth, a wider gamut isn't a straight upgrade, since the delta between colors increases too; 8-bit AdobeRGB, say, isn't a clear superset of 8-bit sRGB. It's exciting that, as well as HiDPI, we'll finally see an industry-wide shift to a color space that will be a strict improvement and is large enough to basically be "done" as far as human vision goes.
There are still a lot more pieces needed on the PC side, though, including both hardware (video cards, interconnect) and OS/applications. High DPI is slowly improving, but even Apple has slipped a bit in terms of color management and support. That said, given the economies of scale that'll come with the general UHDTV push, the market pressure should be there at least.
Correct, 8.0 has static scaling across all displays. 8.1 introduced different scaling for each display. This works very well for Surface Pros that are docked: it will scale the Surface Pro display at 150% and the extra monitors at 100%. If you move a window between the displays, the screen with the majority of the window will decide the scaling for that window. As you pull it from one screen to the next, you will see the window change its scaling factor.
Hey Chris, I have the similar UP2414Q and have the same issues of having to "reboot" the monitor frequently when waking my computer up from sleep. You said you were able to update your firmware for the 32" monitor. Is there one for the 24" model as well? I was not able to find one.
I'll update the article but I got the firmware from Dell. You would need to return your monitor to them and exchange it for a refurbished model with the updated firmware it sounds like.
Where did you get the firmware update from? I own a UP2414Q and experience the issues you described in your article (the need to power cycle the monitor most of the time after the computer goes to sleep is driving me crazy), but I couldn't find a firmware update on Dell's site.
Windows can scale each display independently. Most desktop apps don't support that yet and get blurry. But Windows itself is definitely there and works just fine. Now it's the app developers who have to step up.
They claim it can, but it doesn't work properly. I have two displays connected to my computer, a 2414Q (2160p) and a 2414H (1200p). I can either say I want to use the same scaling for both displays, in which case I am allowed to specify the DPI scaling, or I can choose to use separate scalings for the monitors, which is utterly broken: Windows guesses ("detects") the proper DPI scaling, and if I am not satisfied it allows me to scale both displays simultaneously, but not separately, which is kind of brain-dead, especially since it doesn't guess (ehm.. detect) the DPI properly on the 4K display. Aside from that, using the same (200%) scaling on both screens breaks some DPI-unaware applications, but letting Windows use separate DPI scalings breaks most DPI-unaware apps. In either case, the 1200p screen is useless as everything is too big, or the 4K display has elements so tiny you can't work with it without a magnifying glass. Why I can't simply state that I want 180 DPI on screen 1 and 100 DPI on screen 2 is beyond me.
If you set your primary display to 120% and your secondary to 200%, the 200% will actually be 120% bitmap-scaled, or vice versa. One screen will always look rubbish and blurry. Read what Microsoft says instead of making broad claims!
" With PremierColor, your monitor provides superb color accuracy and 99% AdobeRGB and 100% sRGB coverage."
This is an April fool, right? At $3,499/$2,800, under no circumstances should any real certified Rec. 2020 color space, 10-bit-per-channel "UHD-1" 3840x2160p 16:9 panel give you only 99% AdobeRGB at best.
"In coverage of the CIE 1931 color space the Rec. 2020 color space covers 75.8%, the digital cinema reference projector color space covers 53.6%, the Adobe RGB color space covers 52.1%, and the Rec. 709 color space covers 35.9%."
I don't believe I ever said it has full Rec. 2020 coverage. Nothing has Rec. 2020 coverage as aside from lasers (I've been told), nothing can produce the gamut required. Not LEDs, not quantum dots, nothing. The display claims AdobeRGB and sRGB gamut which is what it has.
"Nothing has Rec. 2020 coverage as aside from lasers (I've been told), nothing can produce the gamut required. "
:) He is wrong, of course, as they would not and could not ratify the 2020 spec if they could not already produce the results it lays out. The guy was probably trying to sell the Mitsubishi LaserVue panels, which use a laser for red and regular LEDs for blue and green...
For the less interested, I'll just add this paste... "Rec. 2020
Much like Rec. 470, Rec. 601 and Rec. 709, the Rec. 2020 standard is more than just a color space. It is the standard for Ultra High Definition Television (UHD-TV), which comes in two versions: 4K (3840x2160 and 4096x2160) and 8K (7680x4320 and 8192x4320). Apart from the obvious improvement in resolution over the Rec. 709 standard, Rec. 2020 also improves upon its predecessor in many other ways. The maximum frame rate doubles from 60 Hz progressive to 120 Hz progressive (interlaced resolutions are no longer supported, which is good). The color depth also increases by at least 2 bits per channel, with the possibility of 4 bits. Because Rec. 709 and Rec. 2020 both use studio swing / narrow band, this does not result in the usual 16.8 million, 1.07 billion and 68.7 billion colors though.
Bit depth per channel | Reference black level | Reference white level | Usable combinations per channel | Total number of colors
8                     | 16                    | 235                   | 220                             | 10,648,000
10                    | 64                    | 940                   | 877                             | 674,526,133
12                    | 256                   | 3760                  | 3505                            | 43,059,012,625
Last but not least, Rec. 2020 offers a significant improvement over the Rec. 709 standard when it comes to the color gamut: nearly twice the size of its predecessor. It uses three monochromatic primaries with wavelengths of 630, 532 and 467 nm. This results in a very large gamut, but without most of the drawbacks of even larger color spaces like the Adobe Wide Gamut RGB color space.
Most of the thought process behind the origin of these primaries can be found in ITU Recommendation BT 2246-2:2012. The summarized version is that UHD-TV should have a larger color gamut in order to cover the real surface colors (based on Pointer’s gamut and SOCS database) as much as possible using real primaries."
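The color counts in that paste can be reproduced directly from the studio-swing code ranges:

```python
# Studio swing ("narrow range") reserves codes below reference black and
# above reference white, so usable levels per channel = white - black + 1,
# and the total color count is that value cubed (one factor per channel).
def usable_colors(black, white):
    levels = white - black + 1
    return levels, levels ** 3

for bits, black, white in [(8, 16, 235), (10, 64, 940), (12, 256, 3760)]:
    levels, total = usable_colors(black, white)
    print(f"{bits}-bit: {levels} levels/channel -> {total:,} colors")
```

This yields 220, 877 and 3505 usable levels, and 10,648,000, 674,526,133 and 43,059,012,625 total colors, matching the table above.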
What is clear is that, much like the PCI-SIG, the Ethernet SIG, etc., the panel entities are also not fit for purpose, as they dragged their feet actually producing new panels and end-to-end kit to the latest specs, which were in effect made by NHK and BBC R&D...
For my data point: using the Mac 10.9.3 with HiDPI on an UltraHD display is gorgeous. If you have a Mac that can handle it (I have a 2012 Macbook Retina, as a reference point), you should definitely try it. It's like the retina display but huge. My eyes are finally happy.
tjoynt, find the best visual quality "2001: A Space Odyssey Opening in 1080 HD" or higher clip you can find (https://www.youtube.com/watch?v=e-QFj59PON4) and play it, then tell us you can't see the ringing due to the 8-bit-per-channel pseudocolor. That's why we need the Rec. 2020 color space. In fact, 10-bit isn't really enough, as it takes 11 bits or more to get true real color, but consumers have to make do for a reasonable price (this is not reasonable).
I guess displays like this are for the niche market where people will buy it anyway to show off. There simply isn't much engineering behind a screen like this anyway; they just slap a 4K panel and some electronics into a box and call it a 4K display. The many unacceptable flaws listed in this review prove this point.
This monitor is not ultra-high dpi. It is just high dpi. The 4k is just a good resolution for 32".
150% is a suitable dpi setting in Windows for this monitor.
People using this monitor will typically not be using it with other screens at the same time, or low res monitors. The typical uses will be as a single screen connected to a desktop, or to a laptop. 150% might also be suitable for a good laptop screen, say 1080p 13".
Also most software has worked well with high dpi settings in Windows for several years.
The only problem with Windows is the lack of per-screen DPI, but the extent to which this poses a problem with this screen is exaggerated. The pixel-war 4K resolutions for small screens, e.g. 24", are more likely to pose a problem, because they would require a DPI setting close to 200%, which would be very inappropriate for most other screens.
The problem is that if you use this as the primary monitor in Windows at 150%, then your 27" 2560x1440 monitor that you run at 100% will be 150% bitmap-scaled back to 100% in Windows 8.1. If you choose your 100% screen as primary, the results will be really disastrous. One screen will always look blurry and bad if you do not use the same scaling. Plenty of Microsoft's own software doesn't work decently with DPI scaling, and stuff like the browser ignores the native scaling and just scales by zooming. 24" at 200% still produces some oddities even if it's your only screen. You can't really speak of any improvements here in Windows yet. OS X seems to do multi-screen better at least. Having different scaling on your laptop screen and external screen seems like a given to me.
I agree that this is a problem in principle, but not so much here because:
- Using 32" and 27" together is unlikely given the size of the screens. (And using non-identical dual screens is not recommended anyway because of the differing color profiles needed.)
- 27" 1440p is quite high dpi. So if 150% is preferable for this monitor then 125% is preferable for 27". So you'd end up with things only slightly too large on the 27".
Coping well with screens at different dpis should be done but it is quite challenging for OSes and software and will take many years.
A gradual increase in dpi (as in this Dell) is the best approach at the moment IMO.
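The pairing of 150% for this panel and 125% for a 27" 1440p screen can be sanity-checked by dividing physical density by the scale factor; both land near the classic ~96 DPI baseline Windows was designed around:

```python
# Effective (perceived) density after DPI scaling = physical DPI / factor.
def effective_dpi(physical_dpi, scale_percent):
    return physical_dpi * 100.0 / scale_percent

print(f'32" UHD (~138 DPI) at 150%: {effective_dpi(138, 150):.0f} effective DPI')
print(f'27" 1440p (~109 DPI) at 125%: {effective_dpi(109, 125):.0f} effective DPI')
```

Around 92 and 87 effective DPI respectively, so UI elements end up only slightly larger on the 27", as the comment above argues.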
Hey Chris, other reviewers found input lag to be less than 20ms. How come your results are so skewed? Aren't you supposed to use the best settings to test this? Why are you testing at a non-native resolution? TBH you're better off not testing for it.
He already mentioned why he's testing at non-native resolution: because most graphics cards (even the higher-end ones) can't drive all games at high settings at 4K resolution.
Makes sense to me, if I'm going to be playing at <4k resolution, that's where I want the input lag tested.
There are many games that can perfectly well be played at native resolution, like Civ 5, FIFA, WoW, or any 2-3-year-old game. Also, he used HDMI at 30Hz instead of DP at 60Hz.
I do not see the point of giving figures if they are not the best the monitor can do.
The HDMI input is being driven at 1080p at 60 Hz, not 30Hz. 30Hz would be if it were using a 4K signal. Tom's Hardware is measuring using a 1080p signal as well, so their results should be similar, but they're using a high-speed camera instead of the Leo Bodnar device. TFT Central measures it using SMTT, which I find slightly odd, as that would require a CRT that can do UltraHD resolution, and I'm not aware of any.
I'm working on a better way for input lag but for right now it's a static test that is as accurate, and comparable, as I can make it.
Regarding the unexpectedly high case temperature the author measured, I'd expect that to be a function of high pixel density, which blocks a greater proportion of the backlight. This in turn requires more backlight intensity to produce a given panel illumination.
I don't know what is considered HiDPI by many, but this monitor is most definitely not HiDPI at just 140 DPI! I know that most high-end monitors are ~100 DPI and common ones are even lower, but I don't see why 140 DPI would be such a big deal. Are the icons that much smaller? Are the alphanumeric characters unreadable?
I believe the ridiculously low DPI of generations of monitors has made the expectation of huge icons and lettering the norm, and they are just not needed. You can see the icons and characters perfectly fine at 140 DPI; no scaling is needed!
Yeah, I think it's just people set in their ways. Even when I had an out-of-date prescription and saw worse than 20/20, I still had no problem with that size. The only thing I can think of is that this is just how most people have used computers, and they are stuck in their ways. I used 1600x1200 on a 17-inch CRT way back in the day (pre-Windows 2000), and soon after, when I switched to Linux, I was at 2560x1920 on a 22-inch CRT. It was even a bit blurry, but it was still not a problem, and that was 160 PPI. With a super sharp 140 PPI display, why do people need scaling? I don't use scaling even on a 200+ PPI display.
Personally, I'd like to see a 39" 4K monitor, using the same VA panel in the Seiki TV but with a 60 Hz input. The Seiki TV is OK for productivity apps, but if you play any games or watch videos, as I do, then the low frame rate is a deal-breaker.
A 39" monitor at 4K would provide an absolutely huge workspace - you would no longer need a multiple monitor setup. And the DPI isn't much higher than a standard 27" 2560x1440 monitor, so you don't need to use the Windows scaling that so many applications still don't do properly. (Microsoft really needs to do something about this - right now they seem content just to hope everyone eventually moves to Metro, which they aren't and won't.)
Chris H. - does the Dell preset for Game exist? On my 24", the Game mode has less lag (by many milliseconds), with a tradeoff that the color is a bit too overdriven.
May be UltraHD in comparison to a TV, but 138 DPI is something I would sneeze at. A 50% increase in pixel density compared to a standard 16:10 24" monitor with 94 DPI is not enough.
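For reference, the densities quoted in this thread come straight out of the diagonal formula (note that "32-inch" panels are commonly 31.5" viewable, which is why figures of both 138 and 140 DPI appear in these comments):

```python
import math

# Pixel density = diagonal pixel count / diagonal size in inches.
def dpi(h_px, v_px, diagonal_in):
    return math.hypot(h_px, v_px) / diagonal_in

print(f'24" 1920x1200:   {dpi(1920, 1200, 24.0):.0f} DPI')  # ~94
print(f'32" 3840x2160:   {dpi(3840, 2160, 32.0):.0f} DPI')  # ~138
print(f'31.5" 3840x2160: {dpi(3840, 2160, 31.5):.0f} DPI')  # ~140
```

So the jump from a standard 24" 16:10 monitor to this panel is roughly a 1.5x increase in linear density, matching the "50% increase" figure above.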
You mention the usefulness of contrast over brightness in this instance. As LED monitors do lose a notable amount of brightness over time, are you able to re-test the brightness of a monitor that you have previously tested and recorded the numbers on and report the differences?
I suspect over the long term having a monitor that can go brighter than needed may be more useful than suspected.
Lag does not cause motion blur. Lag is how long a display takes to react to an input, and is usually (in the case of displays like this) caused by a delay in the image processing circuitry in the display itself. Motion blur on the other hand would be caused by a slow pixel response time (where the pixels themselves take a long time to change states after the display has already begun to refresh).
I'm more looking for the 28" Dell one, much better price/performance for me. 600€ is nearly as much as I paid for my 1440p monitor not that long ago. Incredible.
On the topic of HiDPI and scaling: I have an 11.6" 1080p laptop (XE700T1C) and have no issue running it at 125%, and that is while using my finger most of the time (I only use the pen when I am already holding it for note-taking). 11.6" @ 1080p is 190 DPI; this is just 140. Unless you are using multiple monitors and suffer issues because of that, you need to get your eyesight fixed if you are having "high DPI" trouble with modern Windows.
I tested this monitor on Linux. It works better on Linux than on all the rest because of the MST BS: on Linux you just set it in a config file and never have problems, and you don't have to worry about drivers 'dancing' their way around the problem. Not only that, only Linux allows both GPU and monitor scaling of all resolutions while the display is in MST mode. People can't get scaling working at all on this one in Windows when the display is in MST mode.
Just FYI - these issues have all been known since the monitor went on sale last December. Dell's current stance is that if you want one with fixed firmware, you have to give up your new monitor and accept a refurbished replacement.
170 watts power usage! Are you kidding me? Typical 32-inch LED monitors use around 30 watts. Somebody has to tell manufacturers that power usage is supposed to go down, not increase by a factor of almost six.
Hey, it's still April Fools' Day... that price is a grotesque joke. Just save your hard-earned $$$ for the Vizio P-series and forget that Dell even offers this HDMI 1.4 embarrassment of a UHD monitor.
"Sure, you can run a desktop at full resolution with no scaling but that is almost impossible for anyone to actually use."
I use a 27" WQHD (2560x1440) monitor without any scaling, and it works beautifully. That's ~109 DPI. It's not that much harder for a 32" UltraHD monitor to do the same; the DPI for that monitor calculates out to about 137 DPI. I don't agree that it would be "almost impossible for anyone to actually use."
The key is having good OS support for HiDPI. This is at least coming to OSX in 10.9.3.
I have a 2013 Retina MBP driving a Dell UP2414Q on 10.9.3. It works really, really well. I have it set to 2x scaling most of the time ("Retina" in Apple parlance) and so have a super crisp display for software development/text all day long. I don't game much, but all apps work perfectly. It is like looking at a recent smartphone or iPad at 24". The only issue I have is that on occasion the display will not wake from sleep, but a quick cycle of the monitor power button resolves it and all windows return to where they were. Once Dell fixes the firmware for this I will likely exchange it. Otherwise the build quality and image are superb.
Before people chortle that it is a waste to have an OS scale a 4K monitor to 1080p, remember that even though the effective resolution is 1080p for text with OS X scaling, the full 4K resolution is still available for use by imaging apps, games, etc. And sometimes I simply change the OS scaling to provide a 2560x1440 desktop if I need more 'real estate'. But my primary goal with HiDPI is to finally have crisp, sharp, readable text on a big screen. 4K @ 32" is not about HiDPI, it's about desktop real estate. So it depends on what your needs are.
FYI, I was recently able to buy the 24" Dell UltraSharp for <$1000 all in with some careful shopping. That's a few hundred more than the recently announced Samsung UD590, but it's a far nicer IPS panel.
I don't know what is wrong at Dell. They never had the industrial design chops of Apple or IBM/Lenovo. But they were still head and shoulders above the other PC and display makers. The current Dell displays are just ugly. I'm not talking about the picture quality, rather the chassis and stand. Seriously, what is up with that hideous stand? There is not one angle that looks right on it. And the chassis with the silver and dark grey is very out of place on a $3K monitor. The Lenovo UHD 4K designs are far more professional-looking. The Asus QHD/4K/UHD displays are also more pleasing, in a Honda CRX kind of way. Samsung and LG's professional offerings are similarly far less ugly. I know it may sound trite, but hey, if I am sitting in front of it 40+ hours a week, it matters.
If I disable sleep mode in the control panel, am I still going to experience those problems?
And if I use this monitor only for work (I need a big desktop area for multitasking), am I still going to experience MST issues? Because I need 60Hz for my work.
Samus - Tuesday, April 1, 2014 - link
Wow, $2800 bucks... you can have a whole WALL of ZR2740Ws for that price.
Gunbuster - Tuesday, April 1, 2014 - link
Or you could run 4 39" 4K's at 1080p 120hz in multi-monitor and still have 4k resolution and some change left over...Bad pricing is bad.
inighthawki - Tuesday, April 1, 2014 - link
Enjoy your massive bezel and spanning content across monitors.
For <$400 and 2560x1440, I'll take a thick (matte finish) bezel with an IPS panel. Infact, I'll take 6 of them and keep the change instead of buying a "Dell"Samsung just announced a direct-competitor to this monitor for $800 starting price. So maybe just 3 of those and keep the change, and they have a razor thin bezel and aluminum frame construction.
I mean...I just can't get over the price here. $2800 bucks. It's not even worth $1000.
inighthawki - Tuesday, April 1, 2014 - link
Are you referring to the ud590? It's hardly a "direct competitor." It's not only 4" diagonal smaller but it's a TN panel, and lacks quite a few input types. Or are you referring to the ud970, in which case would you please post a source to the $800 price tag, which sounds unlikely.On top of that, remember that you then need 2 video cards to power 4/6 monitors (nvidia supports 4 but only in span with the fourth as an "extra" display) so you won't even be able to tile them properly by default. On top of that have fun trying to play a game across all the monitors. At least with a single 4K display you can render at 1080p and have a seamless scaling across the entire display. With 4 or 6 monitors you'd have to run it in the standard "eyefinity" style 3x1 configuration which is nice, but also will cost you with resolution. At that point you're going to need not just two GPUs, but likely two high end GPUs. When all is said and done I don't think you'll save much once you take into account the disadvantages of the setup on top of the price.
nevertell - Tuesday, April 1, 2014 - link
You have never been truly productive with a computer, have you?
A multi-monitor setup enables me to do things just that much faster. While you could argue that a 4K monitor would allow me to do stuff side by side on the same panel, at the end of the day, when the new resolution becomes the norm, you have to develop content for 4K. And the only way to see your results is to see them at 4K, so you have to have another panel. Single-display setups are meant for lazy office people and content consumers. And even then, playing a game with a surround monitor setup is much nicer than having one large monitor.
People used the same kind of software back in the days of 1280 by 1024, 1024 by 768 and 800 by 600. And even though most of the software of today is written for 1080p or 1200p panels, the fact still remains: you cannot operate Visual Studio in half of that. Hell, it's a nightmare to run Visual Studio at 1366 by 768, let alone something like 960 by 1080. Even though application UIs appear scalable, scaling them is a last resort. So yes, when we catch up to 4K on the software side of things, we will need multiple monitors again. But yes, of course, for a web developer of today, being able to edit multiple stylesheets, JavaScript and HTML files at the same time AND seeing the result largely at scale is truly great. Unfortunately, most enterprise developers rarely need to edit countless files at once, and the PHP/CSS people simply will not be able to afford these displays.
Sabresiberian - Sunday, April 6, 2014 - link
What do you mean by "run 4K at 1080p"? That makes no sense.
If you are talking about running those 4 monitors to get a total screen space of 3840x2160, sure, you could do that, but you entirely miss the point of having a high-density monitor. The pixels on a 39" 1080p screen are the size of a truck. Maybe suitable for a TV viewed across your living room, but not for a monitor a couple of feet from your face.
mczak - Tuesday, April 1, 2014 - link
A wall of TN panels is fairly pointless if you can only see the center one due to the poor viewing angles :-).
cheinonen - Tuesday, April 1, 2014 - link
AdobeRGB vs. sRGB, HiDPI vs. Regular DPI, Apples to Oranges
cheinonen - Tuesday, April 1, 2014 - link
I tested with both Nvidia and AMD cards and had the same issues with both.
toncij - Sunday, April 6, 2014 - link
Yes, that information could help, since not every test was done on Nvidia/AMD and the same driver. Also, it would help a lot to hear if those problems also happen with the ASUS and Sharp, to compare those products. Anandtech made a review already: http://www.anandtech.com/show/7157/asus-pq321q-ult...
jasonelmore - Tuesday, April 1, 2014 - link
Where the hell is HDMI 2.0? It's done and being produced; why isn't it in these new devices?
SunLord - Tuesday, April 1, 2014 - link
It takes more than 6 months to go from a finished standard to fully certified working silicon, let alone shipping it in quantities to enable mass production of consumer devices.
willis936 - Tuesday, April 1, 2014 - link
I'm not sure this is right. Companies usually are making and testing IP while a standard is in the works. In some cases they're out before the standard is done.
cheinonen - Tuesday, April 1, 2014 - link
This is correct. There is currently no full HDMI 2.0 silicon out there that I'm aware of, and since the Dell started shipping last fall it certainly didn't have access to it then. There are currently devices shipping in the AV world that claim "HDMI 2.0" support, but that isn't full HDMI 2.0. It is support for 4:2:0 chroma subsampling, which is part of the HDMI 2.0 spec and enables UltraHD resolution at 60 Hz. Since computers don't use chroma subsampling, this isn't relevant, and there is no HDMI 2.0 silicon right now.
Penti - Tuesday, April 1, 2014 - link
Not even Maxwell can output it, so what sources are you supposed to use?
BMNify - Tuesday, April 1, 2014 - link
Where did you get that idea from? It's false; you need a GeForce 600 "Kepler" graphics card or newer to drive a display up to 4096x2160. Hell, even the ChromeOS guys have merged this Linux UHD patch into their tree now, so Intel Haswell/Iris Graphics works at "UHD-1" 3840x2160p if you are not gaming: http://lists.x.org/archives/xorg-devel/2014-Januar...
cheinonen - Tuesday, April 1, 2014 - link
You can do that resolution at 24 Hz, or 3840x2160 at 30 Hz, but you can't do it at 60 Hz without MST right now. HDMI 2.0 allows it at 60 Hz, but that isn't available yet in a product.
Penti - Tuesday, April 1, 2014 - link
I was speaking about 600MHz HDMI, not ~300MHz. 300MHz HDMI has been around since GCN 1.0 and Kepler. It's also available in Haswell and works fine in Windows, OS X or GNU/Linux at that res, but that limits it to 30Hz for 3840x2160. That's not HDMI 2.0 spec. You can't use anything other than DisplayPort for 60Hz 4K/UHD, and DisplayPort receivers only do that via MST too. You would need two 300MHz HDMI ports to do UHD @ 60Hz, so gaming in UHD over HDMI is out regardless of GPU/source. Maxwell doesn't do H.265/HEVC for that matter either. You only need ~300MHz HDMI 1.4 to do 4096x2160 @ 24Hz; it's HDMI 2.0 that can do it @ 60Hz.
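For reference, the ~300MHz vs. ~600MHz distinction above is just pixel-clock arithmetic: active pixels times refresh rate, plus blanking overhead. A rough sketch (the ~20% blanking factor is an assumption for illustration; real CVT/CEA-861 timings differ slightly):

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.20):
    """Ballpark pixel clock in MHz for an active width x height mode.

    Assumes ~20% of each frame is blanking; actual CVT/CEA-861
    timings vary, so treat the results as estimates, not spec values.
    """
    return width * height * refresh_hz * blanking / 1e6

# UHD at 30 Hz squeaks in under the ~300 MHz TMDS clock of HDMI 1.4:
print(round(approx_pixel_clock_mhz(3840, 2160, 30)))  # ~299
# At 60 Hz it needs roughly the 600 MHz clock that HDMI 2.0 specifies:
print(round(approx_pixel_clock_mhz(3840, 2160, 60)))  # ~597
```

Which is why, at the time of these comments, 60Hz UHD meant DisplayPort (and MST).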
zanon - Tuesday, April 1, 2014 - link
As far as things that still aren't there, I'd throw in color space (both gamut and bit depth) as well. Official UHDTV (see Rec. 2020), beyond bumping the resolution standards to 4K or 8K, also at last features a significantly larger color space and the bit depth necessary to go with it (either 10-bit or 12-bit). That's another marquee feature of HDMI 2.0: 12-bit 4:2:2 4K@60fps. Without the increased depth, a wider gamut isn't a straight upgrade, since the delta between colors increases too; 8-bit AdobeRGB, say, isn't a clear superset of 8-bit sRGB. It's exciting that, as well as HiDPI, we'll finally see an industry-wide shift to a color space that will be a strict improvement and is large enough to basically be "done" as far as human vision goes.
There are still a lot more pieces needed on the PC side though, including both hardware (video cards, interconnect) and OS/applications. High DPI is slowly improving, but even Apple has slipped a bit in terms of color management and support. That said, given the economies of scale that'll come with the general UHDTV push, the market pressure should be there at least.
peterfares - Tuesday, April 1, 2014 - link
Did you test it on a Windows computer other than the one you pictured? Because that one is 8.0, not 8.1, which added multi-DPI support.
datobin1 - Tuesday, April 1, 2014 - link
Correct, 8.0 has static scaling across all displays. 8.1 introduced different scaling for each display.
This works very well for Surface Pros that are docked. It will scale the Surface Pro display at 150% and the extra monitors at 100%. If you move a window between the displays, the screen with the majority of the window decides the scaling for that window. As you pull it from one screen to the next you will see the window change its scaling factor.
cheinonen - Tuesday, April 1, 2014 - link
Yes, I tested with both Windows 8.0 and 8.1. I just happened to have rebooted into Windows 8.0 when I took the photos, but I tested both.
MrPete123 - Tuesday, April 1, 2014 - link
Hey Chris, I have the similar UP2414Q and have the same issue of having to "reboot" the monitor frequently when waking my computer up from sleep. You said you were able to update your firmware for the 32" monitor. Is there one for the 24" model as well? I was not able to find one.
cheinonen - Tuesday, April 1, 2014 - link
I'll update the article, but I got the firmware from Dell. It sounds like you would need to return your monitor to them and exchange it for a refurbished model with the updated firmware.
aron9621 - Tuesday, April 1, 2014 - link
Where did you get the firmware update from? I own a UP2414Q and experience the issues you described in your article (the need to power cycle the monitor most of the time after the computer goes to sleep is driving me crazy), but I couldn't find a firmware update on Dell's site.
davepermen - Tuesday, April 1, 2014 - link
Windows can scale each display independently. Most desktop apps don't support that yet and get blurry. But Windows itself is definitely there and works just fine. Now it's the app developers who have to step up.
aron9621 - Tuesday, April 1, 2014 - link
They claim it can, but it doesn't work properly. I have two displays connected to my computer, a 2414Q (2160p) and a 2414H (1200p). I can either say I want to use the same scaling for both displays, in which case I am allowed to specify the DPI scaling, or I can choose to use separate scalings for the monitors, which is utterly broken: Windows guesses ("detects") the proper DPI scaling, and if I am not satisfied it allows me to scale both displays simultaneously, but not separately, which is kind of brain-dead, especially since it doesn't guess (ehm... detect) the DPI properly on the 4K display. Aside from that, using the same (200%) scaling on both screens breaks some DPI-unaware applications, but letting Windows use separate DPI scalings breaks most DPI-unaware apps. In either case, the 1200p screen is useless as everything is too big, or the 4K display has elements so tiny you can't work with it without a magnifying glass. Why I can't simply state I want 180dpi on screen 1 and 100dpi on screen 2 is beyond me.
Penti - Tuesday, April 1, 2014 - link
That's still not true. If you set your primary display to 120% and your secondary to 200%, the 200% will actually be 120% bitmap-scaled, or vice versa. One screen will always look rubbish and blurry. Read what Microsoft says instead of making broad claims!
BMNify - Tuesday, April 1, 2014 - link
"With PremierColor, your monitor provides superb color accuracy and 99% AdobeRGB and 100% sRGB coverage."
This is an April Fools' joke, right? At $3,499/$2,800, under no circumstances should any real certified UHD-1 Rec. 2020 color space, 10-bit-per-channel "UHD-1" 3840x2160p 16:9 panel give you only 99% AdobeRGB at best.
"In coverage of the CIE 1931 color space
the Rec. 2020 color space covers 75.8%,
the digital cinema reference projector color space covers 53.6%,
the Adobe RGB color space covers 52.1%, and
the Rec. 709 color space covers 35.9%."
cheinonen - Tuesday, April 1, 2014 - link
I don't believe I ever said it has full Rec. 2020 coverage. Nothing has Rec. 2020 coverage, as aside from lasers (I've been told), nothing can produce the gamut required. Not LEDs, not quantum dots, nothing. The display claims AdobeRGB and sRGB gamut, which is what it has.
BMNify - Wednesday, April 2, 2014 - link
"Nothing has Rec. 2020 coverage as aside from lasers (I've been told), nothing can produce the gamut required."
:) He is wrong, of course, as they would not and could not ratify the 2020 spec if they could not already produce the results it lays out; the guy was probably trying to sell the Mitsubishi LaserVue panels, which use a laser for red and regular LEDs for blue and green...
Rather than just state that they can, I'll point you to this page, as you will probably find it interesting... the quantum dots and the M3 film are cool too:
http://www.tftcentral.co.uk/articles/content/point...
For the less interested, I'll just add this paste...
"Rec. 2020
Much like with Rec. 470, Rec. 601 and Rec. 709 the Rec. 2020 standard is more than just a color space. It is the standard for Ultra High Definition Television (UHD-TV), which knows two versions: 4K (3840x2160 and 4096x2160) and 8K (7680x4320 and 8192x4320). Apart from the obvious improvement in resolution over the Rec. 709 standard, the Rec. 2020 standard also improves upon its predecessor in many other ways. The maximum frame rate doubles from 60 Hz progressive to 120 Hz progressive (interlaced resolutions are no longer supported, which is good). The color depth also increases by at least 2 bits per channel, with the possibility of 4 bits. Because Rec. 709 and Rec. 2020 both use studio swing /narrow band, this does not result in the usual 16.8 million, 1.07 billion and 68.7 billion colors though.
Bit depth per channel | Reference black level | Reference white level | Usable combinations per channel | Total number of colors
8                     | 16                    | 235                   | 220                             | 10,648,000
10                    | 64                    | 940                   | 877                             | 674,526,133
12                    | 256                   | 3760                  | 3505                            | 43,059,012,625
Last but not least, the Rec. 2020 offers significant improvement over the Rec. 709 standard when it comes to the color gamut: nearly twice the size of its predecessor. It uses three monochromatic primaries with wavelengths of 630, 532 and 467 nm. This results in a very large gamut, but without most of the drawbacks of even larger color spaces like the Adobe Wide Gamut RGB color space.
Most of the thought process behind the origin of these primaries can be found in ITU Recommendation BT 2246-2:2012. The summarized version is that UHD-TV should have a larger color gamut in order to cover the real surface colors (based on Pointer’s gamut and SOCS database) as much as possible using real primaries."
What is clear is that, much like the PCI-SIG, the Ethernet SIG, etc., the panel entities are also not fit for purpose, as they dragged their feet actually producing new panels and end-to-end kit to the latest specs, which were in effect made by NHK and BBC R&D...
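The studio-swing color counts in the pasted table follow directly from the quoted reference black/white levels: only the codes from reference black up to reference white are usable on each of the three channels. A quick check:

```python
def studio_swing_colors(black, white):
    """Usable code values per channel, and total RGB colors, for a
    studio-swing ("narrow range") signal with the given reference
    black and white levels."""
    per_channel = white - black + 1
    return per_channel, per_channel ** 3

# Levels as quoted above for narrow-range signals:
print(studio_swing_colors(16, 235))    # 8-bit:  (220, 10648000)
print(studio_swing_colors(64, 940))    # 10-bit: (877, 674526133)
print(studio_swing_colors(256, 3760))  # 12-bit: (3505, 43059012625)
```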
tjoynt - Tuesday, April 1, 2014 - link
For my data point: using Mac OS X 10.9.3 with HiDPI on an UltraHD display is gorgeous. If you have a Mac that can handle it (I have a 2012 MacBook Retina, as a reference point), you should definitely try it. It's like the Retina display but huge. My eyes are finally happy.
BMNify - Tuesday, April 1, 2014 - link
tjoynt, find the best visual quality "2001: A Space Odyssey Opening in 1080 HD" or higher clip you can find (https://www.youtube.com/watch?v=e-QFj59PON4) and play that, then tell us you can't see the ringing due to the 8-bit pseudocolor. That's why we need the Rec. 2020 color space; in fact 10-bit isn't really enough, as it takes 11 bits or more to get true real color, but consumers have to make do for a reasonable price (this is not reasonable).
Panzerknacker - Tuesday, April 1, 2014 - link
Way too high input lag; useless display, tbh.
I guess displays like this are for the niche market where people will buy it anyway to show off. There simply isn't much engineering behind a screen like this anyway; they just slap a 4K panel and some electronics into a box and call it a 4K display. The many unacceptable flaws listed in this review prove this point.
CSMR - Tuesday, April 1, 2014 - link
The Windows DPI comments are exaggerated.
This monitor is not ultra-high DPI. It is just high DPI. 4K is just a good resolution for 32".
150% is a suitable dpi setting in Windows for this monitor.
People using this monitor will typically not be using it with other screens at the same time, or low res monitors. The typical uses will be as a single screen connected to a desktop, or to a laptop. 150% might also be suitable for a good laptop screen, say 1080p 13".
Also most software has worked well with high dpi settings in Windows for several years.
The only problem with Windows is the lack of per-screen DPI, but the extent to which this poses a problem with this screen is exaggerated. The pixel-war 4K resolutions for small screens, e.g. 24", are more likely to pose a problem, because they would require a DPI setting close to 200%, which would be very inappropriate for most other screens.
Penti - Tuesday, April 1, 2014 - link
The problem is that if you use this as the primary monitor in Windows at 150%, then, say, your 27" 2560x1440 monitor that you run at 100% will be 150% bitmap-scaled down to 100% in Windows 8.1. If you choose your 100% screen as primary, the results will be really disastrous. One screen will always look blurry and bad if you do not use the same scaling. Plenty of Microsoft's own software doesn't work decently with DPI scaling, and stuff like the browser ignores the native scaling and just scales by zooming. 24" at 200% still produces some oddities even if it's your only screen. You can't really speak of any improvements here in Windows yet. OS X seems to do multi-screen better at least. Having different scaling on your laptop screen and external screen seems like a given to me.
CSMR - Tuesday, April 1, 2014 - link
I agree that this is a problem in principle, but not so much here, because:
- Using 32" and 27" together is unlikely given the size of the screens. (And using non-identical dual screens is not recommended anyway because of the differing color profiles needed.)
- 27" 1440p is quite high dpi. So if 150% is preferable for this monitor then 125% is preferable for 27". So you'd end up with things only slightly too large on the 27".
Coping well with screens at different dpis should be done but it is quite challenging for OSes and software and will take many years.
A gradual increase in dpi (as in this Dell) is the best approach at the moment IMO.
Hxx - Tuesday, April 1, 2014 - link
Hey Chris, other reviewers found input lag to be less than 20ms. How come your results are so skewed? Aren't you supposed to use the best setting to test this? Why are you testing at a non-native resolution? TBH, you're better off not testing for it.
BinaryTB - Tuesday, April 1, 2014 - link
He already mentioned why he's testing at non-native resolution: because most graphics cards (even the higher-end ones) can't drive all games at high settings at 4K resolution.
Makes sense to me. If I'm going to be playing at <4K resolution, that's where I want the input lag tested.
apertotes - Tuesday, April 1, 2014 - link
There are many games that can perfectly well be played at native resolution, like Civ 5, or FIFA, or WoW, or any 2-3 year old game. Also, he used HDMI at 30 Hz instead of DP at 60 Hz.
I do not see the point of giving figures if they are not the best the monitor can do.
cheinonen - Tuesday, April 1, 2014 - link
The HDMI input is being driven at 1080p at 60 Hz, not 30 Hz. 30 Hz would be if it was using a 4K signal. Tom's Hardware measures using a 1080p signal as well, so their results should be similar, but they're using a high-speed camera instead of the Leo Bodnar device. TFT Central measures it using SMTT, which I find slightly odd, as that would require a CRT that can do UltraHD resolution, of which I'm not aware of any.
I'm working on a better way to measure input lag, but for right now it's a static test that is as accurate, and comparable, as I can make it.
NCM - Tuesday, April 1, 2014 - link
NCM - Tuesday, April 1, 2014 - link
Regarding the unexpectedly high case temperature the author measured, I'd expect that to be a function of high pixel density, which blocks a greater proportion of the backlight. This in turn requires more backlight intensity to produce a given panel illumination.
Taracta - Tuesday, April 1, 2014 - link
I don't know what is considered by many as HiDPI, but this monitor is most definitely not HiDPI at just 140 DPI! I know that most high-end monitors are ~100 DPI and common ones are even lower, but I don't see why 140 DPI would be such a big deal. Are the icons that much smaller? Are the alphanumeric characters unreadable?
I believe the ridiculously low DPI of generations of monitors has made expectations of huge icons and lettering the norm, and they are just not needed. You can see the icons and characters perfectly fine at 140 DPI; no scaling is needed!
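The pixel densities being debated throughout this thread follow from resolution and diagonal alone. A quick sketch (the panel sizes are the nominal ones mentioned in these comments, so treat the third figure in particular as approximate):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch given a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 31.5)))  # ~140: this Dell
print(round(ppi(2560, 1440, 27)))    # ~109: a typical 27" WQHD panel
print(round(ppi(1920, 1080, 11.6)))  # ~190: an 11.6" 1080p laptop
```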
houkouonchi - Friday, April 4, 2014 - link
Yeah, I think it's just people set in their ways. Even when I had an outdated prescription and saw worse than 20/20, I still had no problem with that size. The only thing I can think of is that this is just how most people have used computers and they are stuck in their ways. I used 1600x1200 on a 17-inch CRT way back in the day (pre-Windows 2000), and soon after, when I switched to Linux, I was at 2560x1920 on a 22-inch CRT. It was even a bit blurry, but it was still not a problem, and that was 160 PPI. With a super sharp 140 PPI display, why do people need scaling? I don't use scaling even on a 200+ PPI display.
JDG1980 - Tuesday, April 1, 2014 - link
Personally, I'd like to see a 39" 4K monitor using the same VA panel as the Seiki TV, but with a 60 Hz input. The Seiki TV is OK for productivity apps, but if you play any games or watch videos, as I do, then the low frame rate is a deal-breaker.
A 39" monitor at 4K would provide an absolutely huge workspace; you would no longer need a multiple-monitor setup. And the DPI isn't much higher than a standard 27" 2560x1440 monitor, so you don't need to use the Windows scaling that so many applications still don't do properly. (Microsoft really needs to do something about this; right now they seem content just to hope everyone eventually moves to Metro, which they aren't and won't.)
sk317bge - Tuesday, April 1, 2014 - link
Chris H. - does the Dell preset for Game exist? On my 24", the Game mode has less lag (by many milliseconds), with the tradeoff that the color is a bit too overdriven.
GTVic - Tuesday, April 1, 2014 - link
It may be UltraHD in comparison to a TV, but 138 DPI is something I would sneeze at. A 50% increase in pixel density compared to a standard 16:10 24" monitor at 94 DPI is not enough.
lokitx - Tuesday, April 1, 2014 - link
Everyone should read this before purchasing this monitor: http://en.community.dell.com/support-forums/periph...
praeses - Tuesday, April 1, 2014 - link
You mention the usefulness of contrast over brightness in this instance. As LED monitors do lose a notable amount of brightness over time, are you able to re-test the brightness of a monitor that you have previously tested and recorded the numbers on, and report the differences?
I suspect that over the long term, having a monitor that can go brighter than needed may be more useful than suspected.
Human Bass - Tuesday, April 1, 2014 - link
It was looking quite decent, but that lag, wow. Seems like motion blur city.
cjl - Wednesday, April 2, 2014 - link
Lag does not cause motion blur. Lag is how long a display takes to react to an input, and is usually (in the case of displays like this) caused by a delay in the image processing circuitry in the display itself. Motion blur, on the other hand, would be caused by a slow pixel response time (where the pixels themselves take a long time to change state after the display has already begun to refresh).
Death666Angel - Tuesday, April 1, 2014 - link
I'm more interested in the 28" Dell one; much better price/performance for me. 600€ is nearly as much as I paid for my 1440p monitor not that long ago. Incredible.
Death666Angel - Tuesday, April 1, 2014 - link
On the topic of HiDPI and scaling: I have an 11.6" 1080p laptop (XE700T1C) and have no issue with it running at 125%, and that is with using my finger most of the time (I only use the pen when I am already holding it for note taking). 11.6" @ 1080p is 190 DPI; this is just 140. Unless you are using multiple monitors and suffer issues because of that, you need to get your eyesight fixed if you are having "high DPI" trouble with modern Windows.
Bob-o - Tuesday, April 1, 2014 - link
Would love to see Anandtech evaluate all products on Linux... at least a one-paragraph "I tried it" kind of thing...
houkouonchi - Friday, April 4, 2014 - link
I tested this monitor on Linux. It works better on Linux than all the rest because of the MST BS. On Linux you just set it in a config file and never have problems, and you don't have to worry about drivers 'dancing' their way around the problem. Not only that, only Linux allows both GPU and monitor scaling of all resolutions while the display is in MST mode. People can't get scaling working at all on this under Windows when the display is in MST mode.
https://www.youtube.com/watch?v=f8oPyKDriiQ
lord solar macharius - Tuesday, April 1, 2014 - link
Just FYI - these issues have all been known since the monitor went on sale last December. Dell's current stance is that if you want one with fixed firmware, you have to give up your new monitor and accept a refurbished replacement.
http://en.community.dell.com/support-forums/periph...
Darrenn - Tuesday, April 1, 2014 - link
170 watts power usage! Are you kidding me? Typical 32-inch LED monitors use around 30 watts. Somebody has to tell manufacturers that power usage is supposed to go down, not increase by a factor of almost six.
MrSmartyAss - Tuesday, April 1, 2014 - link
Hey, it's still April Fools' Day... that price is a grotesque joke. Just save your hard-earned $$$ for the Vizio P-series and forget that Dell even offers this HDMI 1.4 embarrassment of a UHD monitor.
AnnonymousCoward - Tuesday, April 1, 2014 - link
OK, seriously. The picture on anandtech.com is stretched to give the screen an aspect ratio of 2.24! But no, this isn't a 21:9 screen.
dgingeri - Wednesday, April 2, 2014 - link
"Sure, you can run a desktop at full resolution with no scaling but that is almost impossible for anyone to actually use."
I use a 27" WQHD (2560x1440) monitor without any scaling, and it works beautifully. That's ~109 DPI. It's not that much harder for a 32" UltraHD monitor to do the same; the DPI for that monitor calculates out to about 137 DPI. I don't agree that it would be "almost impossible for anyone to actually use."
nquery - Wednesday, April 2, 2014 - link
The key is having good OS support for HiDPI. This is at least coming to OS X in 10.9.3.
I have a 2013 Retina MBP driving a Dell UP2414Q on 10.9.3. It works really, really well. I have it set to 2x scaling most of the time ("Retina" in Apple parlance) and so have a super crisp display for software development/text all day long. I don't game much, but all apps work perfectly. It is like looking at a 24" recent smartphone or iPad. The only issue I have is that on occasion the display will not wake from sleep, but a quick cycle of the monitor power button resolves it and all windows return to where they were. Once Dell fixes the firmware for this I will likely exchange it. Otherwise the build quality and image are superb.
Before people chortle that it is a waste to have an OS scale a 4K monitor to 1080p, remember that even though the effective resolution is 1080p for text with OS X scaling, the full 4K resolution is still available for use by imaging apps, games, etc. And sometimes I simply change the OS scaling to provide a 2560x1440 desktop if I need more 'real estate'. But my primary goal with HiDPI is to finally have crisp, sharp, readable text on a big screen. 4K @ 32" is not about HiDPI, it's about desktop real estate. So it depends on what your needs are.
FYI, I was recently able to buy the 24" Dell UltraSharp for <$1000 all in with some careful shopping. That's a few hundred more than the recently announced Samsung UD590, but it's a far nicer IPS panel.
CalaverasGrande - Wednesday, April 2, 2014 - link
I don't know what is wrong at Dell. They never had the industrial design chops of Apple's or IBM/Lenovo's products, but they were still head and shoulders above the other PC and display makers. The current Dell displays are just ugly. I'm not talking about the picture quality, rather the chassis and stand.
Seriously, what is up with that hideous stand? There is not one angle that looks right on it. And the chassis with the silver and dark grey is very out of place on a $3K monitor.
The Lenovo UHD-4k designs are far more professional looking. Asus qhd/4k/uhd displays are also more pleasing in a Honda CRX kind of way.
Samsung and LG's professional offerings are similarly far less ugly.
I know it may sound trite, but hey, if I am sitting in front of it 40+ hours a week it matters.
themeinme75 - Wednesday, April 2, 2014 - link
I see that with the Apple Mac Pro you can buy a 32-inch 4K display for $3,600.
Clorex - Friday, June 6, 2014 - link
No mention of the IGZO panel and how it differs from IPS?
KURGAN999 - Sunday, September 14, 2014 - link
If I disable sleep mode in Control Panel, am I still going to experience those problems? And if I use this monitor only for work (I need a big desktop area for multitasking), am I still going to experience MST issues? Because I need 60Hz for my work.
Ty