It depends on how close you are to the display and the size of the display. 4K will definitely help in a computing environment. It will do no good in common TV viewing scenarios. 4K will, however, sell TVs, because you typically buy a TV at the store, where you're standing right up close to it. http://carltonbale.com/does-4k-resolution-matter/
Well, doesn't that depend on what you mean by "benefit"? (I know the eyes have an upper limit when it comes to resolution, which implies a maximum useful viewing distance from your TV/monitor.)
I have never experienced a High DPI screen - like a Retina - in real life, but I know a great deal about the difference when it comes to sound.
What is it exactly that makes a 2000 dollar stereo set better than a 500 dollar one? It's very hard to put your finger on one parameter and say: "It's that!" or "It's this!" "Better transients" or "better bass" are just subjective expressions.
At the end of the day, it very often comes down to: "It simply sounds "softer" to my ears". Or: "You can turn the volume up way higher, before it starts to sound harsh or rough".
I don't know, I just presume it's the same with screens: the higher the res, the "softer" the picture will feel to your eyes, even if we actually exceed the resolution capabilities of the eye.
It's that your eyes physically cannot resolve the difference in resolution at a certain distance combined with a certain screen size, i.e. 720p, 1080p, and 4K look the same if you are sitting far enough back.
720p is roughly the maximum visual acuity for a moving image. For static images, the more resolution the better. For movies or gaming (save for the occasional sniper shot, where most of the screen is still), 720p is the most your brain can process.
Additionally, motion actually tightens a lot of the acuity requirements. You can actually see motion (especially if it's repetitive) that is on the order of one arc minute or less.
I can 100% tell the difference between 720p and 1080p on my 24" monitor at a reasonable distance.
The fuzzy edges and aliasing are a dead giveaway.
Film, on the other hand, tends to soften the crap out of edges anyway, has natural motion blur built in, and at only 23.976 frames per second gives little in the way of real resolution when motion is occurring.
However, games are not film. They are not rendered at a low framerate, and objects are absolutely, perfectly crisp rendered geometry. You can easily tell the difference.
Integr8d, that's only because the display has motion blur. On a CRT, motion clarity is the same for stationary images and fast motion (this is known as the "CRT effect"). You get that on LightBoost LCDs as well. So for fast-panning motion at a constant speed (e.g. horizontally strafing left/right while your eyes track moving objects), the panning image is as clear as a stationary image. This is the "CRT effect", and you don't get it on most TVs except in certain modes like Sony's new low-latency, interpolation-free "Motionflow IMPULSE" mode (Game Mode compatible) found in some TVs such as the KDL55-W802A -- it's essentially Sony's version of LightBoost.
Your statement would be fine if we all came from the same mold, but we don't. People vary, and the capabilities of their bodies vary too, which is part of why certain army personnel are chosen to become snipers.
People need to look at the recommended viewing distances for HD TVs. Most people sit way too far away to take advantage of HD content. Recommended distances are between 5.5 and 6.5 feet for 42" to 50" TVs, the whole idea being to move close enough to replicate the feel of a large movie screen.
Audiophiles are just terrible at objective testing, but the differences between a $500 stereo and a $2K one are definitely measurable and not terribly hard to pin down... They're also not very large in many cases (audiophiles are also amongst the worst hobbyists when it comes to paying for diminishing returns).
Huh? You could have fooled me; vintage stereo equipment goes for thousands over the original retail. Cars have the worst diminishing returns of any hobby that exists, btw.
"Diminishing returns" != "depreciation." He's saying that if speakers that cost $500 would rate 90/100, and $5,000 would rate 95/100, and $50,000 98/100, audiophiles would spend the $50,000 or more if they had the money, even though each 10-fold price increase gets you a smaller increment of quality. Average people would say that they all sound pretty good.
Even putting aside your syntax issues, almost all audio equipment depreciates in value quickly. The rare exceptions would be (the minority of) tube based amplification, and maybe a very few speakers (Quad ESL 63's).
What's even worse, judging from the pictures of audiophiles' homes, the same person spending 10-20k or more on his stereo spends little to no thought on how the room affects the sound, something we all know is incredibly important for the way sound waves behave.
As an audiophile as well as a video guy, I don't think the problem is that audiophiles are the worst hobbyists when it comes to paying for diminishing returns. I think the problem is that the press around audio focuses far too much on those diminishing return pieces. I'm considering writing a piece on budget phono amps, as more and more people buy turntables, but it's going to be hard. You can find 100 reviews of a $2,500 phono stage, but none of a $130 one that most people might buy. I think audio has a bad, bad marketing problem that the press reinforces.
Diminishing returns is an understatement. Whenever I hear the word "audiophile", it always reminds me of those numerous sound-clarifying snakeoil products (e.g. the magnificent Bedini Clarifier, http://www.bedini.com/clarifier.htm) and the praising reviews they get around the web.
The low end of what anyone with any knowledge of the subject says is *easily* discernible by a person is one arc minute per pixel. (There are things like vernier acuity, rotational acuity and such that can push that by a factor of 10 or more -- to 0.1 arc minute per pixel or less.)
A commonly accepted, comfortable viewing angle is 60 degrees. (Some "experts" put it at 90 degrees or more.)
Combining the minimum 1 arc minute per pixel with the minimum 60 degrees gives a horizontal pixel count of 3,600 as the minimum that the average person can easily discern given an optimum viewing distance. (If you take that to the other extreme of 0.1 arc minute and 90 degrees this becomes 54,000 horizontal pixels at the optimum viewing distance.)
So is 2160 (v) x 3840 (h) worth it to the average person with good eyesight? Absolutely. It just barely crosses the 3,600 horizontal pixel count.
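To put rough numbers on that argument, here is a quick back-of-the-envelope sketch; the 1 arcmin/pixel and 60-degree figures are the assumptions quoted above, not hard physiological limits:

```python
# Horizontal pixels needed so that each pixel subtends a given angle across
# a given total viewing angle, at the optimum viewing distance.
def pixels_needed(viewing_angle_deg, arcmin_per_pixel):
    total_arcmin = viewing_angle_deg * 60   # 1 degree = 60 arc minutes
    return total_arcmin / arcmin_per_pixel

print(pixels_needed(60, 1.0))   # 3600.0  -> just under UHD's 3840 columns
print(pixels_needed(90, 0.1))   # 54000.0 -> the extreme vernier-acuity case above
```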
If you can't tell the difference between UHD (2160p) and HD (1080p) then I humbly submit you need to get your eyes checked. If you can't tell the difference between 720p and 1080p then you REALLY need to get your eyes checked.
No, believe me, playing games for long enough on low-end and high-end screens definitely makes you more aware of PPI. In fact, it's funny that most HDTVs look horrible to me, even at the optimized distance. There are things you just notice; if anything, I think being around PCs too long makes us somewhat sensitive, just like the difference between 30fps and 60fps: typically you shouldn't be able to tell a difference, but as so many have said, yes you can.
Anyone with knowledge of the subject knows that 60 Hz versus 30 or 24 Hz is easily discernible, and 120 vs 60 is also easily discernible. The confusion here stems from 24 fps being the standard for film, but the difference is that film has built-in artifacts like motion blur that make 24 Hz the bare MINIMUM for smooth motion.
If you've seen IMAX presentations, you know that even for true film, 60 vs. 24 is a huge difference.
Sound is as quantifiable as video. The accuracy of each can be measured and known beyond question. It's just that nobody does it because they don't want to admit their $2000 stereo is measurably terrible despite how good they've convinced themselves it sounds.
I'm glad I don't have to worry about yanking $3.5k from somewhere, because years of computer use have weakened my 20/20 vision to the point where 1080p on a 27" screen works just fine...
I feel like this is an easy position to take with very few 4k TVs in the wild, very little content delivered in 4k, and, maybe most importantly, *even less* content being finished at 4k (as opposed to upscaled 2.5k or 3k).
When you see native 4k content distributed at 4k on a good 4k display, you can see the difference at normal viewing distances.
I don't think it's worth adopting until 2015 or 2016, but it will make a difference.
I see no reason to upgrade until 1080p becomes 10800p. Kept the CRT for 30 years (inherited the damn thing). That's how long I intend to keep my 1080p TV, whether TV makers like it or not.
My parents no-name 19" CRT tv lasted from the early '80's to ~2000; the no-name ~30" CRT tv they replaced it with was still working fine ~3 years ago when they got a used ~35-40" 720p LCD for free from someone else. I'm not quite sure how old that TV is; IIRC it was from shortly after prices in that size dropped enough to make them mass market.
You must be trolling. My top-of-the-line Mitsubishi CRT started having issues in 2006, its seventh year. I replaced it with an NEC LCD panel that I'm still using today. It could go at any time and I'd update to the latest technology. I'm picky about image quality and couldn't care less about screen thinness, but there are always options if you are looking for quality. I'm sure your 1080p TV won't make it 30 years. For that matter, I don't believe your CRT made it 30 years without degradation issues; it's just not possible. Maybe you are just a cheap ass. At least man up about it. I want my 1080p TV to last at least ten years. Technology will have long passed it by at that time.
Of course, this is coming from a man who replaced his bedroom CRT TV after almost 25 years. Even so, the tube was much dimmer before the "green" stopped working, and the tuner had long since died. That TV had migrated from the living room, where it was the primary set, to the bedroom until it finally gave up the ghost. I miss it, but I'm not going to kid myself into believing that 1988 tech is the same as 2012. It's night and day.
Basically, your living room TV is the main place where you don't see a benefit from 4K. And I've seen all the 4K demos with actual 4K content in person. I did like how, at CES this year, companies got creative and arranged their booths so you could sometimes only be 5-6' away from a 4K set, as if most people ever watched from that distance.
That site sadly perpetuates (by inference) the old myth of 1 arcminute/pixel being the limit of human acuity. This is totally false. Capability of the Human Visual System (http://www.itcexperts.net/library/Capability%20of%...) is a report from the AFRL that nicely summarises how we are nowhere even CLOSE to what could actually be called a 'retina' display.
A lot of people seem to confuse cycles/deg with pixels/deg. The commonly accepted value for the practical limit of human visual acuity is 60 cycles/deg, or 1 cycle/min. The paper you posted supports this limit by the way: Section 3.1 states "When tested, the highest sinusoidal grating that can be resolved in normal viewing lies between 50 and 60 cy/deg..."
To represent a 60 cycle/deg modulation you need an absolute minimum of 120 pixels/deg (Nyquist's sampling theorem). Assuming unassisted viewing and normal vision (not near-sighted) this leads to an overall resolution limit of 500 dpi or so.
With that said, the limit of acuity is actually not as relevant to our subjective perception of "sharpness" as many believe. There have been several studies arguing that our subjective perception is largely driven by modulations on the order of 20 cycles/degree (i.e. we consider a scene to be "sharp" if it has strong modulations in that range). Grinding through the math again we get an optimal resolution of 200-300 dpi, which is right about where the current crop of retina displays are clustered.
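As a sketch of that arithmetic (the viewing distances below are my assumptions for close inspection and a typical desktop setup, not figures from the paper):

```python
import math

# DPI required to deliver a given angular sampling rate at a given viewing
# distance. 60 cycles/deg needs at least 120 pixels/deg (Nyquist).
def required_dpi(pixels_per_deg, viewing_distance_inches):
    inches_per_degree = viewing_distance_inches * math.tan(math.radians(1))
    return pixels_per_deg / inches_per_degree

print(required_dpi(120, 14))   # ~491 dpi at a ~14" close-inspection distance
print(required_dpi(120, 28))   # ~245 dpi at a more typical ~28" desktop distance
```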
The paper also mentions that cycles/degree is only ONE of the ways that the eyes perceive 'detail'. When it comes to line misalignment (i.e. aliasing), we can see right down to the arcsecond level. If you want a display that does not exhibit edge aliasing, you're looking at several tens of thousands of DPI.
Well, let's be honest, it's only useful to us if the PPI is high enough to throw AA out the window, or at least drop it down to 2x of any iteration. I can see some uses in productivity or workstation applications. As for the TV market, they aren't even fully at a standard 1080p in content, and they invested a lot into upgrading content as Hollywood started upgrading cameras for higher resolutions, so I don't see the industry jumping on a bandwagon to keep upgrading.
720p is about as good as you need if you have a 50" TV and you sit 10 feet away from it. If you have a 30" display that you sit 18 inches from, it makes a huge difference.
I suppose since I work in broadcast I am special, but 4K, HD, and 720p are all clearly distinguishable when you have a decently sharp display, even from several feet away.
I just had an argument with my friend over why laptops around 15" are getting 3200x1800 displays while we still have <100 ppi on desktop displays. We both agreed that it would be nice to have high-DPI desktop monitors, but I insisted that they're too expensive and more niche than laptops and tablets. It's crazy to see the first 4K monitor ever get such a nice reward; what do you think is keeping the cost from coming down?
Displays, like IC's, get exponentially more expensive as the size increases, especially for newer technologies. It's mostly due to the defect ratio. A 30" screen is 4 times as large as a 15" one, but it's way more than 4x as expensive. Suppose that there's a single fatal defect; the 30" screen would have to be discarded, but 3/4 of the 15" panels would be fine.
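A toy Poisson yield model illustrates the point; the defect density here is an arbitrary illustrative number, not real fab data:

```python
import math

# Probability that a panel of a given area contains zero fatal defects,
# assuming randomly scattered defects (classic Poisson yield model).
def panel_yield(area_m2, defects_per_m2):
    return math.exp(-defects_per_m2 * area_m2)

defect_density = 2.0          # fatal defects per m^2 (made up for illustration)
area_15in = 0.062             # rough area of a ~15" 16:9 panel, m^2
area_30in = 4 * area_15in     # a 30" panel covers ~4x the glass

print(panel_yield(area_15in, defect_density))  # ~0.88 of the small panels survive
print(panel_yield(area_30in, defect_density))  # only ~0.61 of the large panels survive
```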
And then after that you're going to sell far fewer, so your profit margins are going to have to change to adapt for that as well, and it really winds up making them far more expensive. It really is the best looking display I've used and the one I most want to keep around after the review period. Companies should be rewarded for taking the risk in releasing niche products that help push the market forward, and really are a breakthrough.
They've done so in the past, and IIRC still do bin GPU levels that way; but in all their recent generations the dual and quad core CPUs that make up 99% of their sales have been separate dies.
Your analogy breaks down even for the handful of exceptions (single-core Celeron, quad-core LGA2011): the LCD equivalent would be to sell you a 15" screen in a 30" case with a huge asymmetric bezel covering 3/4 of the panel area.
It's not just the parts getting more expensive to manufacture; it's also that the manufacturer knows it's a high-margin product. The difference in price between an APS-C and an FF sensor is an order of magnitude smaller than the difference in price between the complete cameras, i.e. $500 vs $2500, even if the FF camera obviously also includes faster processing, a higher-quality body, etc.
Companies would like to milk users as this is brought to the desktop and marketed as NEW TECH; that's the only reason it's so pricey. And don't forget that in the coming months other companies will bring their products into competition, which will help greatly in reducing prices.
No worries, there's a 4K 39" TV on Amazon for $700. Since that TV has the same number of pixels and isn't a whole lot bigger, I think we will soon be seeing these 32" displays fall into that sub-$1000 range as well.
It's pretty easy to tell why they're getting 3200x1800 panels. The super-high DPI at that size and distance is unnecessary, but it has one HUGE advantage: you can use 200% scaling. That means things that aren't DPI-aware can run at standard DPI and then be doubled in width and height. This avoids the fuzzy effect the article was complaining about when you use 150% on unsupported programs. With 200% scaling they'll look just like they do on a standard-DPI display, not worse, which is what happens when you use a non-integer scaling factor.
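A tiny sketch of why integer scaling stays crisp while fractional scaling gets fuzzy; nearest-neighbour doubling and linear interpolation are used here as stand-ins for whatever filters Windows actually applies:

```python
# One row of pixels across a hard black/white edge (0 = black, 255 = white).
row = [0, 0, 0, 255, 255, 255]

# 200% scaling: every source pixel is simply repeated, so the edge stays sharp.
doubled = [p for p in row for _ in range(2)]

# 150% scaling: output samples fall between source pixels, so linear
# interpolation produces in-between grey values at the edge (the "fuzz").
def scale_linear(src, factor):
    out = []
    for i in range(round(len(src) * factor)):
        x = i / factor
        lo, t = int(x), x - int(x)
        hi = min(lo + 1, len(src) - 1)
        out.append(round(src[lo] * (1 - t) + src[hi] * t))
    return out

print(doubled)                  # [0, 0, 0, 0, 0, 0, 255, 255, 255, 255, 255, 255]
print(scale_linear(row, 1.5))   # [0, 0, 0, 0, 170, 255, 255, 255, 255]
```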
I'm concerned since you mention the "low" contrast numbers. The last time I checked, any number over 100:1 was due to software enhancements, and was equivalent to a genuinely higher-quality contrast figure that ranked lower simply because it used little or no software enhancement, meaning that the contrast numbers currently thrown around are just a bunch of marketing jumble.
So wouldn't it be better to just measure the contrast with a third-party tool? The numbers provided by the manufacturer are pretty much a product of fiction plus the marketing team. Just my two cents.
The number on the specs page is the one that the manufacturer quotes. In the test results you can see what we actually found. Desktop LCD numbers fall far behind what is possible with plasma TVs and the best LCOS projectors, or even rear-array LCD TVs. 100:1 would be insanely low and a poor design. Desktop LCDs are capable of 1000:1 without any sort of trick.
I see, I could have been confusing the data with that from projectors. Regardless, I'm glad the test results are posted, as the definition of "contrast" can be a varying thing. Thanks for the swift response.
Yeah, you were definitely thinking of those "dynamic" contrast ratios that are complete BS. They are usually more in the 100,000:1 or 1,000,000:1 range, though. Completely unreasonable.
31.5 inches is still too big for me. I'm not putting anything over 24" on my desk. I've tried it, and I just can't like it.
Unfortunately it seems that vendors are producing either tiny portable screens or gigantic TVs, and no real midsize desk monitors anymore. At least not outside the same old 1080p that we've been getting for the past 4 years.
As someone who's been in love with his 30" 2560x1600 display for the past 3.5 years the only thing seriously wrong with this display is it's still about twice what I'm willing to spend.
Ideally I'd like another inch in the diagonal just to give it the same vertical height as the pair of 20" 1200x1600 screens I'll probably be flanking it with. (I don't have enough desk space to keep the old 30 as a flanker.)
I'd prefer if I could avoid the flankers, and just get a screen that is natively wide. 36" 21:9 with 4096 horizontal pixels would be a good start. And going wider wouldn't hurt either, 3:1 - 4:1 h:v ratios should work on most desks.
Of course, by then horizontal resolution would reach into the 8k pixels, and display port would have a little cry about required bandwidth, and it would take >1000W of GPU power to render anything halfway complex, at 16MP.... With pixel doubling we're back down to 4MP though, much like a current 30" screen.
I know, pipe dreams, but I just bought new screens, so I can wait another decade or so....I hope people have been buying those 29" 21:9 screens en-masse though, so that manufacturers get it, that there's a market for wide screens, if they have enough vertical pixels.
At a 3:1 width (4960/1600), 16" tall, and a normal sitting distance a flat display wouldn't work well. If curved screens ever go mainstream a monolithic display might make sense; until then 3 separate monitors lets me angle the side two so my viewing distance is roughly constant across the entire array.
For gaming, it has to be flat, until proper multi-head rendering gets implemented. Otherwise the distortion will mess things up. And for films, the central 2.35:1 area should also be flat.
I'm getting farsighted (and can't tolerate reading glasses due to the horrible lighting at work (I know, they should fix it)), and am considering moving to a 27" monitor and putting it further back on my desk to reduce eyestrain.
Chris, can you please test out scaling in Windows 8.1 with the DPI setting on 200% for us? That enables pixel-doubling and that may also make more applications and websites look a lot clearer. If Anand can try out the same thing with his RMBP and Windows 8.1, it would be interesting to see the results.
Does 200% DPI on Windows 8.1 actually do nearest-neighbor scaling on legacy applications? The other scaling factors use GPU scaling (probably bilinear or bicubic) if I'm not mistaken, resulting in the fuzzy results described by the reviewer.
On Windows 8 vanilla you have a choice between Vista-style (GPU) and XP-style DPI scaling, and the XP method doesn't appear to use the GPU scaling methods described, but only text scales and not images and other non-text UI elements, leading to layout issues in most legacy apps.
At 200% the poorly scaled text is more readable than before. Things that do scale correctly are incredibly sharp, though I wouldn't keep it here as I miss the desktop space too much. It's certainly better than 150% on those poorly scaled items, but just too large IMO.
Great review. While dual HDMI 4k doesn't work at the moment, NVIDIA are working on hacks to their driver (pcper has a beta copy for testing) so you should see this functionality soon.
If NVIDIA actually supported 2x1 and 2x2 Surround with any monitor, they wouldn't have to resort to such hacks but apparently artificially crippling their Windows driver to preserve Quadro revenue is more important.
Tiled 4K displays are going to be more common with all the delays HDMI 2.0 is facing. 10-bit color is also going to be standard with all these displays. So I have to wonder how long they can keep crippling their Windows driver, and how scalable an EDID whitelist is for these types of monitors.
On the plus side, at least the GeForce Linux drivers aren't crippled like this.
Great, finally. I also find it irritating that tech built for disposable, non-productive consumers is being given priority for improvements over professional desktop hardware (which would give tangible benefits to people doing actual work).
You mention the new tech uses a more responsive chemical composition, and I can't see a refresh rate in your spec list. Are we likely to ever see these screens run above 60hz? Probably not.
Right now I'm not sure that DisplayPort can handle this resolution at refresh rates above 60 Hz. HDMI 2.0 should allow for up to 120 Hz at that resolution, at least if they follow the full Rec. 2020 UHD spec, but that keeps getting delayed.
Is HDMI2.0 4k@120hz a dual cable solution? I looked at what's written up in Wikipedia and it's listed as maxing out at 4k@60hz; the same limit as DP 1.2.
HDMI 2.0 isn't final or announced yet. Any specs that are out there for it are rumors right now. The UHD spec, Rec. 2020, calls for up to 120Hz at 8K resolutions. I don't think we'll see that, but I'd think we see 120Hz at 4K because you need at least 96 Hz to support high frame rate 3D, like The Hobbit, if that ever comes to the home.
4k@120hz would be nice; but even at only 24bit color that's a 24 gigabit datastream. Short of going stealth-dual cable by adding additional data lines I don't think the technology is here to do that at an affordable cost in the near future.
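The raw numbers behind that estimate, counting pixel data only and ignoring blanking intervals and link-encoding overhead (so treat these as lower bounds):

```python
# Uncompressed video bandwidth in Gbit/s for a given display mode.
def raw_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

print(raw_gbps(3840, 2160, 60, 24))    # ~11.9 Gbit/s - fits DP 1.2's ~17.3 Gbit/s payload
print(raw_gbps(3840, 2160, 120, 24))   # ~23.9 Gbit/s - beyond DP 1.2 or HDMI 1.4
print(raw_gbps(3840, 2160, 120, 30))   # ~29.9 Gbit/s with 10-bit color
```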
Thunderbolt shows that it is possible to have 40Gbps in a DisplayPort socket. Certainly not cheap though. I don't see the active cables being a necessity though, so long as fiber is not required.
I think 31.5 inches is slightly too big and 30Hz WAY too low. Just like Chris says, it's hard to go back to a lower-resolution display; for me, it is hard to give up the amazing IPS colors and 120Hz refresh rate. And I don't think there are any 27-inch 4K, 120Hz monitors in the pipeline for the next 5 years (and we're not even talking affordable yet). Looks like I'm going to be stuck at 27-inch 1440p, 120Hz for some time to come.
The ASUS runs at 60Hz with either a DisplayPort 1.2 connection using MST (how I tested) or dual HDMI 1.4 outputs, which I don't have on my graphics card and couldn't test.
I think the individual DPI scaling in the Win 8.1 preview is broken. I don't know anyone who has gotten it to work properly, so I'm not surprised it didn't work for you either. Hopefully it's fixed in the final release.
It is interesting that the benefits of that working or not hinge greatly on what you're planning to do with this monitor.
My first instinct is, "Awesome! Finally some great PPI for the desktop crowd," but after a bit of thinking my response is quickly subdued.
How would I use this thing, and for what? It's very nice having additional real estate, but using a 31.5" monitor at native resolution w/o PPI scaling (to fully utilize the extra real estate) means sitting closer to it than I would to a lower-res monitor that's smaller in size. I guess that's one question I have for Chris: what is it like sitting 2' away from this thing for hours on end when using native resolution? I had a 27" monitor that I returned for a 1080p one because I found it too big and needed to sit farther away from it.
If you sit farther back, then the DPI scaling would have to be set higher and you'd be losing real estate - a trade-off that many would gladly live with for a sharper image. The thing is, there's still a vast amount of software out there, particularly in the Windows landscape, that can't handle scaling or doesn't scale well, especially legacy professional applications, which tend to have cluttered menus.
I guess my point is that there's only so much screen real estate one needs until it starts to become a hindrance and scaling is required, and that's heavily influenced by personal preference and screen size. I would rather have three 21-24" x1200 monitors for the screen real estate than a single 31.5" 4K display. When it comes to gaming, though, I'd love the 31.5" 4K monitor.
First, games. They still need some kind of anti-aliasing, whether a shader-based method like FXAA or hardware MSAA, but the resolution is amazing - I really enjoy 2560x1600 with a pair of GTX 670s.
Second, any application that is currently resolution limited. Think CAD, photo editing, and video editing; anything that requires a rendered visualization. For instance, I'm editing 20MP files shot from a Canon 6D on a 4MP monitor that has UI elements on it, reducing the effective workspace. A monitor with over 7MP would improve my productivity.
Out of curiosity, did you try running Half-Life 2 at 1080p? Running at 4K is going to crush most GPUs, and I am just curious about the clarity of the scaling when running a non-native resolution that keeps the same pixel uniformity (1080p maps exactly 2x2 onto this panel).
Very costly... hope these displays become mainstream soon. Higher resolution/PPI does make a big difference, at least for people using their computers all day. Even when I jumped from a 1366x768 laptop to a 1920x1080 laptop and then to a rMBP, the difference was truly there. Once you go to the higher resolution, working on the lesser one really is a pain...
And where can I buy monitors with the panels you speak of? I'd like a 4k monitor for about 800 €, maybe even 1000 € (I paid 570 for my Samsung 27" 1440p, so that seems fair if the panels only cost slightly more)....
Because I had to change from SMTT (which shows input lag and response time) as our license expired and they're no longer selling new licenses. The Leo Bodnar shows the overall lag, but can't break it up into two separate numbers.
It can accept 10-bit, but I have nothing that uses 10-bit data as I don't use Adobe Creative Suite or anything else that supports it.
The ASUS has a video mode, with a full CMS, to go with the dual HDMI inputs. Since that would indicate they expect some video use for it, testing for 2:2 and 3:2 cadence is fair IMO.
Some of the current generation of high-end 2560x1600/1440 panels are 14-bit internally and have a programmable LUT to do calibration in the monitor instead of at the OS level. (The latter is an inherently lossy operation; the former is much less likely to be.)
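A minimal sketch of why an 8-bit video-card LUT is lossy while a high-bit-depth LUT inside the monitor need not be (assuming the monitor's processing can actually use the extra precision, e.g. via a higher-bit panel drive or dithering); the small gamma tweak is just an arbitrary example correction:

```python
# Apply a mild gamma correction through a LUT of the given bit depth and
# count how many distinct output levels the 256 possible inputs map to.
def distinct_levels(lut_bits, gamma=1.1):
    max_code = 2 ** lut_bits - 1
    return len({round((v / 255) ** gamma * max_code) for v in range(256)})

print(distinct_levels(8))    # < 256: some input levels merge -> visible banding
print(distinct_levels(14))   # 256: every input level stays distinct
```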
I'd like to know how a Retina MacBook Pro and a new MacBook Air hold up to the 4K display. The Verge published a demo a while back and the results were not spectacular, although they didn't go into depth as to WHY the results were so poor (weak video card, bad DisplayPort drivers, something else?).
Could you connect up the new Haswell MacBook Air to see performance?
Holy snaps man, edit this article! It's absolutely painful to read. It's like pulling teeth just trying to get to the next awkwardly chosen and/or wrong word.
It's for your desk; soon enough we'll have three of these on our desks. Only thing: is 31" too large to look at without panning your head from left to right? I currently use three 24" monitors.
I'm confused: "While not truly 4K, it is a 3840x2160 LCD display that can accept an Ultra High Definition (UHD) signal over HDMI and DisplayPort"; isn't 3840x2160 4K?
It's the standard marketing rebranding effort. 4K cinema has been around for years at 4096x2160. As much as I dislike that sort of game in general, a 2:1 scaling ratio with 1080p makes new systems play much nicer with old ones and is a reasonable tradeoff.
Search around on eBay for an IBM T220 or T221. These have a 3840x2400 resolution (though only a 48 Hz refresh rate), and usually cost about $800-$1500. They aren't always there, but show up on a semi-regular basis.
Technically it's UHD, though everyone uses 4K for 3840x2160 anyway. I'm trying to avoid it to be more accurate, but since everyone refers to their display as a 4K model, I often fall back to it. UHD would be more accurate, though.
I'd love it if you could test with a Mac Pro and see how it does with the "Retina" display mode, i.e. effectively the workspace of a 1080p display but with double the sharpness.
I think you'll see a little bit of both in terms of using scaling, and the physical size of elements onscreen. Things will have to be scaled somewhat, but text for example won't have to be just as big as it was before.
Unfortunately for gamers, there isn't a video card that can handle 60fps at 4K with maximum video settings. Not even with 3 Titans, as shown in this video -> http://www.youtube.com/watch?v=Wa-DRVqPJRo
Just think of the videocards that will sell ... the next "big thing" for AMD and nVidia; because let's face it, Intel is catching up far too quickly for their comfort at low resolutions.
140 DPI at desktop monitor distances isn't high enough to do without AA; and if you can't run at native resolution with all the other settings maxed, you'd be better off running at 2560x or 1920x on a panel that natively supports that resolution, to avoid scaling artifacts and scaling lag in the panel itself.
I don't understand the people hating on 4K and saying they intend to stay with 1080p. I mean come on, everybody wants something new, right? I think LCD technology and LCD displays are far from perfect yet, and while they clearly have their advantages over CRT, they still also clearly have their disadvantages.
I see this as one step closer to beating CRT. Now that with 4K we are finally at a higher pixel density, a level of sharpness that will be hard to improve on, I hope the focus will shift towards improving black levels, response times and overall picture 'feeling' (watching an LCD is still like staring at an LED lamp, while a CRT gives the much nicer light-bulb feeling), and bringing back the nice glow effects in games we enjoyed on CRTs that appear like washed-out colored spots on an LCD.
This is great news! I've been wanting to replace my ancient Dell 2009WFPs with something larger, and I feel like experimenting with that Seiki SE39UY04 for $700 that got announced last month. Hopefully you guys can get your hands on one of those soon and have something to compare with this ASUS model.
can't wait to do Photoshop and Lightroom work on a giant 4k display and use a more expensive/high quality uniformity display for color accuracy of prints and media.
Btw, why is everybody worrying so much about gaming and graphics cards not handling 4K? I mean, when you have that many pixels available it should be no problem to run the game at 1080p and simply upscale to 4K. I doubt there will be much quality loss, and it will probably still look better than on a native 1080p screen.
How does this work, btw? Is it possible to let the screen do this by itself, like with a TV? So if you input 1080p, 1024x768, whatever res, will it be upscaled by the screen to 4K and displayed fullscreen? This is really important for me because I would use the screen for everything, including playing older games that do not support 4K.
Of course. Like any current monitor, this one does scaling. Some do it better, some worse, some let you configure more scaling options, some don't. It's probably best to handle scaling with the graphics card (/drivers), because that gives you, at least potentially, the most control.
"The ASUS PQ321Q is pricey, and I can’t say that getting three or four 30” 2560x1600 panels isn’t a better deal, but it’s not the same as having one display that looks like this. "
I don't mean to be harsh, but this story needs more careful copy editing. There are run on sentences and other pretty amateurish errors.
Probably never since margins are higher on 16:9 screens; a 21" 2560x1440 screen could be made from the same line just by cutting the panels smaller.
In that general size bucket, though, I'd like to see them jump directly to 4K for ~200 DPI. This is high enough to make AA mostly unneeded when gaming and to allow for 2:1 scaling for legacy apps.
UHD or 4k is gonna be good for the living room simply because TV's will get bigger, and that's where 4k really shines.
Back in the day, I remember 32" was massive. Then when HD first came out, 42" was the standard when talking about big screens. Now 55" is the new standard, and 70-inch will probably be commonplace in the next couple of years. 4K on a 70-inch will look great.
Maybe. 70/80" TVs require rearranging your living room in ways that smaller sizes don't; and take up enough space that in smaller houses they've often impossible just because you don't have any blank wall sections that big that you can orient furniture around.
The PixPerAn test looks great, you’ve done nothing wrong; on the contrary, the idea is to show the realistic best and worst case scenario out of which the former should be more representative for most users. However, some of your results, namely pictures 2 and 3, seem very optimistic. In my opinion, and from my experience, pictures 4 and 5 show the real best (4) and worst (5) case scenarios. How did you capture the results and under what settings? I could try to replicate the test on my U2311H and compare the results to those that I did when I purchased the monitor and upload the results.
It might be a good idea to crop the relevant pictures (those above), pick out the two that are most representative of the best and worst case scenarios, and create a single picture instead of a gallery. If you decide that more cases should be shown, that's perfectly alright, but an organized single picture would still be easier to read. Also, some context for readers unfamiliar with what this test shows and how to read it would be quite useful.
It’s a common misconception that monitors with higher response time than 2ms are not fast enough. Believe it or not, a lot of people, especially gamers, steer clear of monitors with higher response time. In reality things are much different; the ms rating alone says little. Not all response time ratings are created equal. One might write an article solely about the response time of a monitor, however, without going into technicalities such as RTC, PixPerAn can give an easy to read and understand representation of what a user can actually expect to experience in terms of RT in a monitor. For example, a lighter ghost image or a pale overshoot behind the moving car is usually harder to spot in actual use. PixPerAn is also good for testing stereoscopic 3D ghosting and S3D performance in general; you can clearly see the benefits of NVIDIA LightBoost in the test.
A picture is worth a thousand words, especially in this case. Just by looking at a PixPerAn result you can derive almost everything you need to know about how fast a monitor should actually feel like (and what kind of visual artifacts you might experience). I would be really glad if you make this test standard on all your future monitor reviews, at least here on AnandTech. It wouldn’t take much space in the reviews and should be relatively quick to take. Please, let me know whether you think it’s a useful test, etc.
Everyone who writes about IGZO (including Chris in this article) talks about how IGZO allows for higher electron mobility, which makes for a faster display. Then why is the GtG this bad?
Does it do 1080p@120hz? Might not matter if it really is this slow though.
True 4K, when it comes to digital cinema, is 4096 pixels wide, but the height can vary depending on the content. Since this is only 3840 pixels wide, it falls under the UHD heading, but everyone uses 4K anyway because it sounds good.
You can find 4K TVs in stores now. I can clearly see the difference between 1080p and 4K. You should try it and you may change your thoughts about 4K TV. To me, the issue is that you cannot find enough 4K material to watch now. I hope this situation will change in 2-3 years.
One of my "pet peeves" is the fact that the display industry goes out of its way to blur the monitor/TV line. They aren't the same, and selling them as though they were is disingenuous at best. It has given us a 16:9 standard where it doesn't belong - and now we are crashing up against the limits of standards essentially made with low-res and TV in mind.
Putting speakers in a $3500 display is like putting a $5 tuner in a Marantz amplifier. Certainly if I have the money to pay for one of these things I'm not going to put up with crap for speakers! It is just about as useless a thing as can be done.
I think the best thing I can say about this monitor is that finally someone has broken the ice, and I give Asus a lot of credit for that. Not saying it's bad, just saying for $3500 I'd like to have seen more polish (better OSD) and better calibrated performance. Nice job Asus, but you could have done better.
On the other hand, if you are building a $3,500 display you may as well throw some speakers in there. A graphics artist might like the picture but only want to play the occasional email or IM notification sound.
At the same time, for that sort of use an optional speakerbar is a reasonable accessory without driving the price up for those of us who'll never need it.
I am so jealous of anyone with a spare $3500 laying around to spend on this screen. I am just dying to move from my 27" 1440p panel to a 4k 60 Hz screen... I'm really hoping the 39" type VA panel that ASUS has coming out next year is a lot cheaper.
"We have seen non-IGZO panels in smartphones with higher pixel densities, but we don’t have any other current desktop LCDs that offer a higher pixel density than this ASUS display"
Wha??? I have been using 22 inch 4k LCD's since 2005-ish that have way higher pixel density than this thing does...
Can the "NVIDIA GeForce GT 650M 1024 MB" graphics card that the Macbook Pro retina has drive this display? If not then do I have to buy a separate desktop to do the job?
Reading this there does not seem to be a reason to buy it. Someone is better off buying 3 30" 1600p monitors for 3k than 1 31.5" 4k monitor for the same price. Thoughts?
You can't really compare multiple monitors to a single one. Some people are comfortable with 3x30" monitors on their desk, other people may not even have space for that many 30 inch monitors or might think it awkward. The pixel density of this monitor is much higher than a 30" 1600p monitor, and that matters to many people.
Not to mention, driving three 1600p screens (7680x1600) for gaming is brutal without turning settings down and running multiple high-end cards. 3840x2160 consumes enough GPU power as it is, and 3x1600p is roughly 50% more pixels than that! And then there's the issue of bezels, which some people hate. If you're comfortable with multiple-monitor gaming, this screen probably isn't for you. So it's all down to personal preference.
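For reference, the raw pixel counts being compared here, using the panel resolutions mentioned in the thread:

```python
# Megapixel counts for the setups discussed above.
setups = {
    "single 30-inch 2560x1600": 2560 * 1600,
    "31.5-inch UHD 3840x2160": 3840 * 2160,
    "three 30-inch panels side by side (7680x1600)": 3 * 2560 * 1600,
}
for name, pixels in setups.items():
    print(f"{name}: {pixels / 1e6:.1f} MP")
# 4.1 MP, 8.3 MP and 12.3 MP respectively.
```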
How do you find images and videos are handled? I have a rMBP and I find that neither is output pixel-for-pixel; rather, they're scaled with the rest of the desktop. Is this the same on Windows, or are images and videos rendered pixel-for-pixel?
Mondozai - Tuesday, July 23, 2013 - link
BUT BUT BUT, we were told everything above 720p was overkill and stupid!Where are Anandtechs resident armchair experts now?!
Despoiler - Tuesday, July 23, 2013 - link
It depends on how close you are to the display and the size of the display. 4k will definitely help in a computing environment. It will do no good in common TV viewing scenarios. 4k will however sell TVs because you typically buy TVs at the store where you are up close to it. http://carltonbale.com/does-4k-resolution-matter/beginner99 - Tuesday, July 23, 2013 - link
Exactly. Most people either have a too small tv or sit too far away to even benefit from ful HD or even 720p.kevith - Tuesday, July 23, 2013 - link
Well, doesn´t that depend on what you mean by "benefit"? (I know the eyes have an upper limit, when it comes to resolution, giving a max distance from your TV/monitor)I have never experienced a High DPI screen - like a Retina - in real life, but I know a great deal about the difference when it comes to sound.
What is it excactly, that makes a 2000 dollar stereo set better than a 500 dollar one? It´s very hard to put your finger on one parameter and say.: "It´s that!" or "It´s this!" "Better transients" or "better bass" are just subjective expressions.
At the end of the day, it very often comes down to: "It simply sounds "softer" to my ears". Or: "You can turn the volume up way higher, before it starts to sound harsh or rough".
I don´t know, I just presume it´s the same with screens: The higher the res, the "softer" the picture wil feel to your eyes. Even if we actually exceed the resolution capabilities of the eye.
Despoiler - Tuesday, July 23, 2013 - link
It's that your eyes physically cannot resolve the difference in resolution at a certain distance combined with a certain size of screen. ie 720, 1080, or 4k look the same if you are sitting far enough back.Integr8d - Wednesday, July 24, 2013 - link
720p is rougly the maximum visual acuity for a moving image. For static images, the more resolution the better. For movies or gaming (save for the occasional sniper shot, where most of the screen is still) 720p is most your brain can process.shaurz - Wednesday, July 24, 2013 - link
That sounds like bollocks to me. I can easily tell the difference between a game running at 720p and 1080p.doobydoo - Thursday, July 25, 2013 - link
He said 'if you are sitting far enough back'. Can you tell the difference between a game running at 720p and 1080p from 1 mile away? No.Ortanon - Wednesday, July 24, 2013 - link
...at a certain viewing angle [distance].Shadowself - Wednesday, July 24, 2013 - link
Absolutely not. Please read my earlier post.Additionally, motion actually enhances a lot of the acuity requirements. You can actually see motion (especially if it's repetitive) that is on the order of one arc minute or less.
1Angelreloaded - Wednesday, July 24, 2013 - link
Your brain can process more than that from your optical socket.Kamus - Wednesday, July 24, 2013 - link
What a load of crap. You have no clue what you are talking about do you?piroroadkill - Thursday, July 25, 2013 - link
I can 100% tell the difference between 720p and 1080p on my 24" monitor at a reasonable difference.The fuzzy edges and aliasing are a dead giveaway.
Film on the other hand, tends to soften the crap out of edges anyway, and has natural motion blur built in, and at only 23.976 frames per second, tends to give little in the way of real resolution when motion is occurring.
However, games are not film. They are not rendered at a low framerate, and objects and absolutely, perfectly crisp. Rendered geometry. You can easily tell the difference.
SlyNine - Friday, July 26, 2013 - link
Where did you come up with that?? You need to substantiate your comments with some sources and objective tests.I can DEFENETLY tell the difference between 720 and 1080 on SOME moving content. Even if it is not as noticeable.
mdrejhon - Wednesday, July 31, 2013 - link
Integr8d, that's only because the display has motion blur. On a CRT, motion clarity is the same during stationary motion and fast motion (this is known as the "CRT effect"). You get that on LightBoost LCD's too as well. So fast-panning motion of a constant speed (e.g. horizontally strafing left/right while you track eyes on moving objects), the panning image is as clear as stationary image. This is the "CRT effect", and you don't get that on most TV's except for certain modes like Sony's new low-latency interpolation-free "Motionflow IMPULSE" mode (Game Mode compatible) found in some TV's such as KDL55-W802A -- it's essentially Sony's version of LightBoost.1Angelreloaded - Wednesday, July 24, 2013 - link
Your statement would be fine if we came from the same mold, but we don't people vary and the capabilities of their bodies also vary, kind of why certain army personel are chosen to become snipers.random2 - Friday, July 26, 2013 - link
People need to look a the the recommended viewing distances on HD TV's. Most people sit way too far away to take advantage of HD content. Distances are recommended between 5.5 ft to 6.5 feet for 42" to 50" TVs. The whole idea being to move close enough to replicate the feel of a large movie screen.Impulses - Wednesday, July 24, 2013 - link
Audiophiles are just terrible at objective testing, but the differences between a $500 stereo and a $2K one are definitely measurable and not terribly hard to pin down... They're also not very large in many cases (audiophiles are also amongst the worst hobbyists when it comes to paying for diminishing returns).1Angelreloaded - Wednesday, July 24, 2013 - link
Huh? you could have fooled me vintage stereo equipment goes for thousands over the original retail. Cars have the worst diminishing return of any other hobby that exists btw.cremefilled - Wednesday, July 24, 2013 - link
"Diminishing returns" != "depreciation." He's saying that if speakers that cost $500 would rate 90/100, and $5,000 would rate 95/100, and $50,000 98/100, audiophiles would spend the $50,000 or more if they had the money, even though each 10-fold price increase gets you a smaller increment of quality. Average people would say that they all sound pretty good.cremefilled - Wednesday, July 24, 2013 - link
Even putting aside your syntax issues, almost all audio equipment depreciates in value quickly. The rare exceptions would be (the minority of) tube based amplification, and maybe a very few speakers (Quad ESL 63's).Calista - Thursday, July 25, 2013 - link
What's even worse, judging from the pictures of the homes of audiophiles the same person spending 10-20k or more on his stereo spend little to no thought on how the room effect the sound. Something we all know is incredibly important for the way sound-waves behave.cheinonen - Thursday, July 25, 2013 - link
As an audiophile as well as a video guy, I don't think the problem is that audiophiles are the worst hobbyists when it comes to paying for diminishing returns. I think the problem is that the press around audio focuses far too much on those diminishing return pieces. I'm considering writing a piece on budget phono amps, as more and more people buy turntables, but it's going to be hard. You can find 100 reviews of a $2,500 phono stage, but none of a $130 one that most people might buy. I think audio has a bad, bad marketing problem that the press reinforces.vgroo - Monday, July 29, 2013 - link
Diminishing returns is an understatement. Whenever I hear the word "audiophile", it always reminds me of those numerous sound-clarifying snakeoil products (e.g. the magnificent Bedini Clarifier, http://www.bedini.com/clarifier.htm) and the praising reviews they get around the web.Shadowself - Wednesday, July 24, 2013 - link
The low end of what anyone with any knowledge of the subject says is *easily* discernible by a person is one arc minute per pixel. (There are things like vernier acuity, rotational acuity and such that can push that by a factor of 10 or more -- to 0.1 arc minute per pixel or less.)A commonly accepted, comfortable viewing angle is 60 degrees. (Some "experts" put it at 90 degrees or more.)
Combining the minimum 1 arc minute per pixel with the minimum 60 degrees gives a horizontal pixel count of 3,600 as the minimum that the average person can easily discern given an optimum viewing distance. (If you take that to the other extreme of 0.1 arc minute and 90 degrees this becomes 54,000 horizontal pixels at the optimum viewing distance.)
So is 2160 (v) x 3820 (h) worth it to the average person with good eye sight? Absolutely. It just barely crosses the 3600 horizontal pixel count.
If you can't tell the difference between UHD (2160p) and HD (1080p) then I humbly submit you need to get your eyes checked. If you can't tell the difference between 720p and 1080p then you REALLY need to get your eyes checked.
1Angelreloaded - Wednesday, July 24, 2013 - link
No believe me playing games for long enough, on low end and high end screens def makes you more aware to PPI, in fact it is funny that most HDTVs look horrible to me, even at the optimized distance. There are things you just notice, if anything I thing being around PC for 2 long makes us somewhat sensitive just like the difference between 30fps and 60fps typically you shouldn't be able to tell a difference but as so many had said yes you can.cremefilled - Wednesday, July 24, 2013 - link
Anyone with knowledge of the subject knows that 60 Hz versus 30 or 24 Hz is easily discernible, and 120 vs 60 is also easily discernible. The confusion here stems from 24 fps being the standard for film, but the difference is that film has built-in artifacts like motion blur that make 24 Hz the bare MINIMUM for smooth motion.If you've seen IMAX presentations, you know that even for true film, 60 vs. 24 is a huge difference.
entrigant - Friday, May 30, 2014 - link
Sound is as quantifiable as video. The accuracy of each can be measured and known beyond question. It's just that nobody does it because they don't want to admit their $2000 stereo is measurably terrible despite how good they've convinced themselves it sounds.ImSpartacus - Tuesday, July 23, 2013 - link
They sit too far away ON AVERAGE.Sitting distance is a random variable and it has non-trivial variance.
I made a spreadsheet to measure this effect: http://goo.gl/dNkj6
n13L5 - Thursday, July 25, 2013 - link
I'm glad I don't have to worry about yanking $ 3.5k from somewhere, cause years of computer use has caused my 20/20 vision to weaken to the point where 1080p on a 27" screen works just fine...ninjaburger - Tuesday, July 23, 2013 - link
I feel like this is an easy position to take with very few 4k TVs in the wild, very little content delivered in 4k, and, maybe most importantly, *even less* content being finished at 4k (as opposed to upscaled 2.5k or 3k).When you see native 4k content distributed at 4k on a good 4k display, you can see the difference at normal viewing distances.
I don't think it's worth adopting until 2015 or 2016, but it will make a difference.
Hrel - Tuesday, July 23, 2013 - link
I see no reason to upgrade until 1080p becomes 10800p. Kept the CRT for 30 years, inherited the damn thing. That's how long I intend to keep my 1080p TV; whether tv makers like it or not.Sivar - Tuesday, July 23, 2013 - link
30 years? I hope you don't have a Samsung TV.althaz - Tuesday, July 23, 2013 - link
It doesn't matter what brand the TV is, good TVs last up to about 7 years. Cheaper TVs last even less time.DanNeely - Tuesday, July 23, 2013 - link
My parents no-name 19" CRT tv lasted from the early '80's to ~2000; the no-name ~30" CRT tv they replaced it with was still working fine ~3 years ago when they got a used ~35-40" 720p LCD for free from someone else. I'm not quite sure how old that TV is; IIRC it was from shortly after prices in that size dropped enough to make them mass market.Maybe you just abuse your idiotboxes.
bigboxes - Wednesday, July 24, 2013 - link
You must be trolling. My top of the line Mitsubishi CRT started having issues in 2006 in year seven. I replaced it with an NEC LCD panel that I'm still using today. It could go at any time and I'd update to the latest technology. I'm picky about image quality and could care less about screen thinness, but there is always options if you are looking for quality. I'm sure your 1080p tv won't make it 30 years. Of course, I don't believe your CRT made it 30 years without degradation issues. It's just not possible. Maybe you are just a cheap ass. At least man up about it. I want my 1080p tv to last at least ten years. Technology will have long passed it by at that time.bigboxes - Wednesday, July 24, 2013 - link
Of course, this coming from a man who replaced his bedroom CRT tv after almost 25 years. Even so, the tube was much dimmer before the "green" stopped working. Not to mention the tuner had long given up the ghost. Of course, this tv had migrated from the living room as the primary set to bedroom until it finally gave up the ghost. I miss it, but I'm not going to kid myself into believing that 1988 tech is the same as 2012. It's night and day.cheinonen - Tuesday, July 23, 2013 - link
I've expanded upon his chart and built a calculator and written up some more about it in other situations, like a desktop LCD here:http://referencehometheater.com/2013/commentary/im...
Basically, your living room TV is the main area that you don't see a benefit from 4K. And I've seen all the 4K demos with actual 4K content in person. I did like at CES this year when companies got creative and arranged their booths so you could sometimes only be 5-6' away from a 4K set, as if most people ever watched it from that distance.
psuedonymous - Tuesday, July 23, 2013 - link
That site sadly perpetuates (by inference) the old myth of 1 arcminute/pixel being the limit of human acuity. This is totally false. Capability of the Human Visual System (http://www.itcexperts.net/library/Capability%20of%... is a report from the AFRL that nicely summarises how we are nowhere even CLOSE to what could actually be called a 'retina' display.patrickjchase - Tuesday, July 23, 2013 - link
A lot of people seem to confuse cycles/deg with pixels/deg. The commonly accepted value for the practical limit of human visual acuity is 60 cycles/deg, or 1 cycle/min. The paper you posted supports this limit by the way: Section 3.1 states "When tested, the highest sinusoidal grating that can be resolved in normal viewing lies between 50 and 60 cy/deg..."To represent a 60 cycle/deg modulation you need an absolute minimum of 120 pixels/deg (Nyquist's sampling theorem). Assuming unassisted viewing and normal vision (not near-sighted) this leads to an overall resolution limit of 500 dpi or so.
With that said, the limit of acuity is actually not as relevant to our subjective perception of "sharpness" as many believe. There have been several studies arguing that our subjective perception is largely a driven by modulations on the order of 20 pairs/degree (i.e. we consider a scene to be "sharp" if it has strong modulations in that range). Grinding through the math again we get an optimal resolution of 200-300 dpi, which is right about where the current crop of retina displays are clustered.
psuedonymous - Wednesday, July 24, 2013 - link
The paper also mentions that cycles/degree is only ONE of the ways that the eyes perceive 'detail' When it comes to line misalignment (i.e. aliasing), we can see right down to the arcsecond level. If you want a display that does not exhibit edge aliasing, you're looking at several tens of thousands of DPI.twtech - Tuesday, July 23, 2013 - link
Even if you can't see the individual pixels, you'll still notice a difference in the clarity of the display.EnzoFX - Tuesday, July 23, 2013 - link
I cannot believe people are saying 4K is a waste on TVs; this is asinine. 1080p on a large TV is terrible: the pixels are clearly visible.1Angelreloaded - Wednesday, July 24, 2013 - link
Well, let's be honest, it's only useful to us if the PPI is high enough to throw AA out the window, or at least down to 2x of whatever method is used. I can see some uses in productivity or workstation applications. As for the TV market, they aren't even fully at a standard of 1080p content, and they invested a lot into upgrading content as Hollywood started moving to higher-resolution cameras, so I don't see the industry jumping on a bandwagon to keep upgrading.SodaAnt - Tuesday, July 23, 2013 - link
720p is about as good as you need if you have a 50" TV and you sit 10 feet away from it. If you have a 30" display that you sit 18 inches from, it makes a huge difference.smartthanyou - Tuesday, July 23, 2013 - link
No person has ever made such a blanket statement. It has always been in the context of what was being viewed and the distance to the display. In the future, consider your posts more carefully before you put in writing that you are an idiot.
NCM - Tuesday, July 23, 2013 - link
So evidently you didn't make it even to the end of the article's first paragraph?CalaverasGrande - Thursday, December 26, 2013 - link
I suppose working in broadcast makes me special, but the differences between 4K, HD, and 720p are all apparent when you have a decently sharp display. Even from several feet away.karasaj - Tuesday, July 23, 2013 - link
I just had an argument with my friend over why laptops around 15" are getting 3200x1800 displays while we still have < 100 ppi on desktop displays. We both agreed that it would be nice to have high-DPI desktop monitors, but I insisted that they're too expensive and more niche than laptops and tablets. It's crazy to see the first 4K monitor get such a nice review; what do you think is keeping the cost from coming down?
bryanlarsen - Tuesday, July 23, 2013 - link
Displays, like IC's, get exponentially more expensive as the size increases, especially for newer technologies. It's mostly due to the defect ratio. A 30" screen is 4 times as large as a 15" one, but it's way more than 4x as expensive. Suppose that there's a single fatal defect; the 30" screen would have to be discarded, but 3/4 of the 15" panels would be fine.cheinonen - Tuesday, July 23, 2013 - link
And then after that you're going to sell far fewer, so your profit margins have to adapt for that as well, which winds up making them far more expensive. It really is the best-looking display I've used and the one I most want to keep around after the review period. Companies should be rewarded for taking the risk of releasing niche products that help push the market forward and really are a breakthrough.Sivar - Tuesday, July 23, 2013 - link
Ideally they can cut 3 good 15" displays from the failed 30" material. Whether the process actually works this way, I don't know.
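To make the defect-rate argument above concrete, here is a toy Poisson yield model in Python; the defect density and panel dimensions are invented for illustration and are not real manufacturing numbers.

```python
import math

def yield_fraction(area_cm2, defects_per_cm2):
    # Poisson model: probability that a panel of this area has zero fatal defects
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.0002                  # assumed fatal-defect density per cm^2 (illustrative only)
area_15 = 33.2 * 18.7       # rough active area of a 15" 16:9 panel, cm^2
area_30 = 4 * area_15       # doubling the diagonal quadruples the area

y15, y30 = yield_fraction(area_15, D), yield_fraction(area_30, D)
print(f"15 inch yield {y15:.0%}, 30 inch yield {y30:.0%}")
# Glass cost per *good* panel scales with area/yield, so the big panel ends up
# well over 4x the cost of the small one even before margins are applied.
print(f"cost per good 30 inch panel vs 15 inch: {(area_30 / y30) / (area_15 / y15):.1f}x")
```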
madmilk - Tuesday, July 23, 2013 - link
It doesn't work that way. That's like saying Intel can cut a quad core CPU into two dual core CPUs.sunflowerfly - Wednesday, July 24, 2013 - link
Where do you think Intel gets lower core count CPU's? They actually do disable cores and sell them for lower spec parts.DanNeely - Thursday, July 25, 2013 - link
They've done so in the past, and IIRC still bin GPU levels that way; but in all their recent generations the dual- and quad-core CPUs that make up 99% of their sales have been separate dies. Your analogy breaks down even for the handful of exceptions (single-core Celeron, quad-core LGA2011), since the LCD equivalent would be to sell you a 15" screen in a 30" case with a huge asymmetric bezel covering 3/4 of the panel area.
Calista - Thursday, July 25, 2013 - link
It's not just the parts getting more expensive to manufacture; it's also that the manufacturer knows it's a high-margin product. The difference in price between an APS-C and a full-frame sensor is an order of magnitude smaller than the difference in price between the complete cameras, i.e. $500 vs $2500, even if the full-frame camera obviously also includes faster processing, a higher quality body, etc.YazX_ - Tuesday, July 23, 2013 - link
Companies would like to milk users as it's brought to the desktop and marketed as NEW TECH; this is the only reason it's so pricey. And don't forget that in the coming months other companies will bring their products into competition, which will help greatly in reducing prices.Fleeb - Tuesday, July 23, 2013 - link
This reply is better than yours: http://www.anandtech.com/comments/7157/asus-pq321q...madmilk - Tuesday, July 23, 2013 - link
No worries, there's a 4K 39" TV on Amazon for $700. Since that TV has the same number of pixels and isn't a whole lot bigger, I think we will soon be seeing these 32" displays fall into that sub-$1000 range as well.peterfares - Wednesday, July 24, 2013 - link
That screen is lower quality and doesn't have an input capable of driving it at 60Hz at 4K.peterfares - Wednesday, July 24, 2013 - link
It's pretty easy to tell why they're getting 3200x1800 panels. The super-high DPI at that size and distance is unnecessary, but it has one HUGE advantage: you can use 200% scaling. That means things that aren't DPI-aware can run at standard DPI and then be doubled in width and height. This avoids the fuzzy effect this article was complaining about when you use 150% on unsupported programs. With 200% scaling they'll look just like they do on a standard-DPI display, not worse, which is what happens when you use a fractional scaling factor.
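A tiny Pillow sketch of why an integer factor behaves so differently from a fractional one; the test pattern and sizes are arbitrary, and this only illustrates the resampling, not anything Windows specifically does:

```python
from PIL import Image

# 8x8 test pattern of alternating black/white columns (hard, crisp edges)
src = Image.new("L", (8, 8))
src.putdata([255 if x % 2 else 0 for y in range(8) for x in range(8)])

doubled = src.resize((16, 16), Image.NEAREST)      # 200%: every pixel becomes a 2x2 block
fractional = src.resize((12, 12), Image.BILINEAR)  # 150%: pixels must be interpolated

print(sorted(set(doubled.getdata())))     # [0, 255] -> edges stay perfectly crisp
print(sorted(set(fractional.getdata())))  # intermediate greys appear -> the fuzzy look
```
APassingMe - Tuesday, July 23, 2013 - link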
I'm concerned since you mention the "low" contrast numbers. The last time I checked, any number over 100:1 was due to software enhancements, and was equivalent to a higher-quality contrast that racked lower due to not having much or any software enhancement, meaning that the contrast numbers currently thrown around are just a bunch of marketing jumble. So wouldn't it be better to just measure the contrast with a third-party tool? The numbers provided by the manufacturer are pretty much a product of fiction plus the marketing team. Just my two cents.
APassingMe - Tuesday, July 23, 2013 - link
my bad, that's ranked* not rackedcheinonen - Tuesday, July 23, 2013 - link
The number on the specs page is the one that the manufacturer quotes. In the test results you can see what we actually found. Desktop LCD numbers fall far behind what is possible with plasma TVs, the best LCOS projectors, or even rear-array LCD TVs. 100:1 would be insanely low and a poor design. Desktop LCDs are capable of 1000:1 without any sort of trick.
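For reference, the static number being discussed is simply full-field white luminance divided by full-field black luminance at the same backlight setting; the values below are made up but in a typical range:

```python
def static_contrast(white_nits, black_nits):
    # Native contrast: full-field white over full-field black, no dynamic dimming
    return white_nits / black_nits

print(f"{static_contrast(200, 0.21):.0f}:1")   # ~950:1, plausible for a good desktop IPS
print(f"{static_contrast(200, 0.002):.0f}:1")  # ~100,000:1-style numbers require dynamic tricks
```
APassingMe - Tuesday, July 23, 2013 - link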
I see, I could have been confusing the data with that from projectors; regardless, I'm glad the test results are posted, as the definition of "contrast" can be a varying thing. Thanks for the swift response.freedom4556 - Tuesday, July 23, 2013 - link
Yeah, you were definitely thinking of those "dynamic" contrast ratios that are complete BS. They are usually more in the 100,000:1 or 1,000,000:1 range, though. Completely unreasonable.sheh - Tuesday, July 23, 2013 - link
The official specs table lists the contrast as "800:01:00".cheinonen - Tuesday, July 23, 2013 - link
Great, I'll fix that since apparently Excel decided to change that to a time or something else. But it's fixed now.Streetwind - Tuesday, July 23, 2013 - link
31.5 inches is still too big for me. I'm not putting anything over 24" on my desk. I've tried it, and I just don't like it. Unfortunately it seems that vendors are producing either tiny portable screens or gigantic TVs, and no real mid-size desk monitors anymore. At least not outside the same old 1080p that we've been getting for the past 4 years.
cheinonen - Tuesday, July 23, 2013 - link
Well rumors are that Dell might be coming out with a 24" UHD display using a panel from LG. If that happens then I'll be happy to review that as well.DanNeely - Tuesday, July 23, 2013 - link
As someone who's been in love with his 30" 2560x1600 display for the past 3.5 years, the only thing seriously wrong with this display is that it's still about twice what I'm willing to spend. Ideally I'd like another inch in the diagonal just to give it the same vertical height as the pair of 20" 1200x1600 screens I'll probably be flanking it with. (I don't have enough desk space to keep the old 30" as a flanker.)
Rick83 - Tuesday, July 23, 2013 - link
I'd prefer to avoid the flankers and just get a screen that is natively wide. 36" 21:9 with 4096 horizontal pixels would be a good start. And going wider wouldn't hurt either; 3:1 - 4:1 h:v ratios should work on most desks. Of course, by then horizontal resolution would reach into the 8k pixels, DisplayPort would have a little cry about the required bandwidth, and it would take >1000W of GPU power to render anything halfway complex at 16MP... With pixel doubling we're back down to 4MP though, much like a current 30" screen.
I know, pipe dreams, but I just bought new screens, so I can wait another decade or so... I hope people have been buying those 29" 21:9 screens en masse though, so that manufacturers get that there's a market for wide screens if they have enough vertical pixels.
DanNeely - Tuesday, July 23, 2013 - link
At a 3:1 width (4960/1600), 16" tall, and a normal sitting distance a flat display wouldn't work well. If curved screens ever go mainstream a monolithic display might make sense; until then 3 separate monitors lets me angle the side two so my viewing distance is roughly constant across the entire array.Rick83 - Wednesday, July 24, 2013 - link
For gaming, it has to be flat until proper multi-head rendering gets implemented. Otherwise the distortion will mess things up. And for films, the central 2.35:1 area should also be flat.
sheh - Tuesday, July 23, 2013 - link
ASUS hinted at 24" hi-DPI monitors in about a year. At the end here: http://www.tomshardware.com/reviews/asus-ama-toms-...
bobbozzo - Tuesday, July 23, 2013 - link
I'm getting farsighted (and can't tolerate reading glasses due to the horrible lighting at work (I know, they should fix it)), and am considering moving to a 27" monitor and putting it further back on my desk to reduce eyestrain.Cataclysm_ZA - Tuesday, July 23, 2013 - link
Chris, can you please test out scaling in Windows 8.1 with the DPI setting on 200% for us? That enables pixel-doubling and that may also make more applications and websites look a lot clearer. If Anand can try out the same thing with his RMBP and Windows 8.1, it would be interesting to see the results.JDG1980 - Tuesday, July 23, 2013 - link
Does 200% DPI on Windows 8.1 actually do nearest-neighbor scaling on legacy applications? The other scaling factors use GPU scaling (probably bilinear or bicubic) if I'm not mistaken, resulting in the fuzzy results described by the reviewer.freedom4556 - Tuesday, July 23, 2013 - link
On Windows 8 vanilla you have a choice between Vista-style (GPU) and XP-style DPI scaling, and the XP method doesn't appear to use the GPU scaling methods described, but only text scales and not images and other non-text UI elements, leading to layout issues in most legacy apps.cheinonen - Wednesday, July 24, 2013 - link
At 200% the poorly scaled text is more readable than before. Things that do scale correctly are incredibly sharp, though I wouldn't keep it here as I miss the desktop space too much. It's certainly better than 150% on those poorly scaled items, but just too large IMO.NLPsajeeth - Tuesday, July 23, 2013 - link
Great review. While dual-HDMI 4K doesn't work at the moment, NVIDIA is working on hacks to their driver (PCPer has a beta copy for testing), so you should see this functionality soon.
If NVIDIA actually supported 2x1 and 2x2 Surround with any monitor, they wouldn't have to resort to such hacks but apparently artificially crippling their Windows driver to preserve Quadro revenue is more important.
Tiled 4K displays are going to be more common with all the delays HDMI 2.0 is facing. 10-bit color is also going to be standard with all these displays. So I have to wonder how long they can keep crippling their Windows driver, and how scalable an EDID whitelist for these types of monitors really is.
On the plus side, at least the GeForce Linux drivers aren't crippled like this.
Steveymoo - Tuesday, July 23, 2013 - link
Great, finally. I also find it irritating that tech built for disposable, non-productive consumer devices is being given priority for improvements over professional desktop hardware (which would give tangible benefits to people doing actual work). You mention the new tech uses a more responsive chemical composition, and I can't see a refresh rate in your spec list. Are we likely to ever see these screens run above 60Hz? Probably not.
cheinonen - Tuesday, July 23, 2013 - link
Right now I'm not sure that DisplayPort can handle this resolution at refresh rates above 60 Hz. HDMI 2.0 should allow for up to 120 Hz at that resolution, at least if they follow the full Rec. 2020 UHD spec, but that keeps getting delayed.DanNeely - Tuesday, July 23, 2013 - link
Is HDMI2.0 4k@120hz a dual cable solution? I looked at what's written up in Wikipedia and it's listed as maxing out at 4k@60hz; the same limit as DP 1.2.cheinonen - Tuesday, July 23, 2013 - link
HDMI 2.0 isn't final or announced yet. Any specs that are out there for it are rumors right now. The UHD spec, Rec. 2020, calls for up to 120Hz at 8K resolutions. I don't think we'll see that, but I'd think we see 120Hz at 4K because you need at least 96 Hz to support high frame rate 3D, like The Hobbit, if that ever comes to the home.DanNeely - Tuesday, July 23, 2013 - link
4K@120Hz would be nice, but even at only 24-bit color that's a 24-gigabit data stream. Short of going stealth dual-cable by adding additional data lines, I don't think the technology is here to do that at an affordable cost in the near future.
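The arithmetic behind that figure, as a quick sketch (raw pixel rate only; real links add blanking and line-coding overhead on top):

```python
def raw_pixel_rate_gbps(width, height, bits_per_pixel, refresh_hz):
    # Uncompressed pixel data only; blanking intervals and 8b/10b coding add more
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(f"3840x2160 @ 60 Hz, 24 bpp:  {raw_pixel_rate_gbps(3840, 2160, 24, 60):.1f} Gbit/s")
print(f"3840x2160 @ 120 Hz, 24 bpp: {raw_pixel_rate_gbps(3840, 2160, 24, 120):.1f} Gbit/s")
# DisplayPort 1.2 offers roughly 17.28 Gbit/s of usable payload, which is why
# UHD at 60 Hz fits but 120 Hz does not without more lanes or a faster link.
```
madmilk - Tuesday, July 23, 2013 - link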
Thunderbolt shows that it is possible to have 40Gbps in a DisplayPort socket. Certainly not cheap, though. I don't see active cables being a necessity, so long as fiber is not required.sheh - Tuesday, July 23, 2013 - link
Would be nice if it supported 120Hz at least at 1920x1080. It certainly supports the bandwidth already.dishayu - Tuesday, July 23, 2013 - link
I think 31.5 inches is slightly too big and 30Hz WAY too low. Just like Chris says, it's hard to go to a lower resolution display. For me it is hard to give up the amazing IPS colors and 120Hz refresh rate. And I don't think there are any 27 inch 4K, 120Hz monitors in the pipeline for the next 5 years. (And we're not even talking affordable yet.) Looks like I'm going to be stuck at 27 inch 1440p, 120Hz for some time to come.cheinonen - Tuesday, July 23, 2013 - link
The ASUS runs at 60Hz with either a DisplayPort 1.2 connection using MST (how I tested) or dual HDMI 1.4 outputs, which I don't have on my graphics card and couldn't test.B3an - Tuesday, July 23, 2013 - link
I think the individual DPI scaling in the Win 8.1 preview is broken. I don't know anyone who has got it to work properly, so I'm not surprised it didn't work for you either. Hopefully it's fixed in the final release.mrdude - Tuesday, July 23, 2013 - link
Isn't it interesting that the benefits of that working or not hinge greatly on what you're planning to do with this monitor? My first instinct is, "Awesome! Finally some great PPI for the desktop crowd," but after a bit of thinking my response is quickly subdued.
How would I use this thing, and for what? It's very nice having additional real estate, but using a 31.5" monitor at native resolution without PPI scaling (to fully utilize the extra real estate) means sitting closer to it than I would to a smaller, lower-res monitor. I guess that's one question I have for Chris: what is it like sitting 2' away from this thing for hours on end at native resolution? I had a 27" monitor that I returned for a 1080p because I found it too big and needed to sit farther away from it.
If you sit farther back, then the DPI scaling would have to be set higher and you'd be losing real estate - a trade-off that many would gladly live with for a sharper image. The thing is, there's still a vast amount of software out there, particularly in the Windows landscape, that can't handle scaling or doesn't scale well, especially legacy professional applications, which tend to have cluttered menus.
I guess my point is that there's only so much screen real estate one needs before it starts to become a hindrance and scaling is required, and that's heavily influenced by personal preference and screen size. I would rather have three 1200p 21-24" monitors for the screen real estate than a single 31.5" 4K display. When it comes to gaming, though, I'd love the 31.5" 4K monitor.
airmantharp - Tuesday, July 23, 2013 - link
I have two examples for you. First, games. They still need some kind of anti-aliasing, like FXAA or MSAA, but the resolution is amazing; I really enjoy 2560x1600 with a pair of GTX 670s.
Second, any application that is currently resolution-limited. Think CAD, photo editing, and video editing; anything that requires a rendered visualization. For instance, I'm editing 20MP files shot from a Canon 6D on a 4MP monitor that has UI elements on it, reducing the effective workspace. A monitor with over 7MP would improve my productivity.
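Rough numbers for that workflow (the 5472x3648 frame size is the Canon 6D's; the rest is simple arithmetic that ignores UI chrome):

```python
photo_w, photo_h = 5472, 3648   # ~20 MP Canon 6D frame
for name, (w, h) in {"2560x1600 (~4 MP)": (2560, 1600),
                     "3840x2160 (~8 MP)": (3840, 2160)}.items():
    visible = (w * h) / (photo_w * photo_h)
    print(f"{name}: {visible:.0%} of the image visible at 1:1 zoom")
```
thurst0n - Tuesday, July 23, 2013 - link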
thurst0n - Tuesday, July 23, 2013 - link
Wait, you have a 1440p 120hz IPS?tviceman - Tuesday, July 23, 2013 - link
Hey Chris, out of curiosity, did you try to run Half-Life 2 at 1080p? Performance at 4K is going to crush most GPUs, and I am just curious as to the clarity of the scaling when running a non-native resolution that keeps the same pixel uniformity.Spunjji - Tuesday, July 23, 2013 - link
Spunjji - Tuesday, July 23, 2013 - link
@ Chris: Out of interest, did you get any of the POST / boot failure problems with your setup that were experienced by the chaps at PCPer?cheinonen - Tuesday, July 23, 2013 - link
No, I did not. A firmware update is supposed to be coming that will fix those, but I haven't seen them.Assimilator87 - Tuesday, July 23, 2013 - link
Chris, if HDMI supports 2160p30 and DisplayPort 1.2 has double the bandwidth of HDMI, then why is MST used instead of a single signal/stream?rpsgc - Tuesday, July 23, 2013 - link
Basically, apparently there aren't timing controllers (Tcon) capable of 4K@60Hz so Sharp (and Asus) cheated and used two controllers.thurst0n - Tuesday, July 23, 2013 - link
This was supposed to be in reply to dishayu.msahni - Tuesday, July 23, 2013 - link
Very costly... hope these displays become mainstream soon... Higher resolution/PPI does make a big difference, at least for people using their computers all day...
Even when I jumped from a 1366x768 laptop to a 1920x1080 laptop and then to a rMBP, the difference was truly there... Once you go to the higher resolution, working on the lesser one really is a pain...
Cheers
airmantharp - Tuesday, July 23, 2013 - link
The cost is IGZO; 4K panels cost only slightly more than current panels when using other panel types like IPS, VA or PLS.Death666Angel - Tuesday, July 23, 2013 - link
And where can I buy monitors with the panels you speak of? I'd like a 4k monitor for about 800 €, maybe even 1000 € (I paid 570 for my Samsung 27" 1440p, so that seems fair if the panels only cost slightly more)....airmantharp - Tuesday, July 23, 2013 - link
Look up Seiko, they're all over the place. 30Hz only at 4k for now, but that's an electronics limitation; the panels are good for 120Hz.Gunbuster - Monday, July 29, 2013 - link
Seiki
sheh - Tuesday, July 23, 2013 - link
Why does the response time graph show no input lag for the monitor? Can it accept 10-bit input? Does 10-bit content look any better than 8-bit?
"Common film and video cadences of 3:2 and 2:2 are not properly picked up upon and deinterlaced correctly."
Why expect a computer monitor to have video-specific processing logic?
cheinonen - Tuesday, July 23, 2013 - link
Because I had to change from SMTT (which shows input lag and response time) as our license expired and they're no longer selling new licenses. The Leo Bodnar shows the overall lag, but can't break it up into two separate numbers. It can accept 10-bit, but I have nothing that uses 10-bit data as I don't use Adobe Creative Suite or anything else that supports it.
The ASUS has a video mode, with a full CMS, to go with the dual HDMI outputs. Since that would indicate they expect some video use for it, testing for 2:2 and 3:2 cadence is fair IMO.
sheh - Wednesday, July 24, 2013 - link
Thanks. Alas. It'd be interesting to know the lag breakdown. If most of it is input lag, there's hope for better firmware. :)
Are 10-bit panels usually true 10-bit or 8-bit with temporal dithering?
DanNeely - Thursday, July 25, 2013 - link
Some of the current generation of high-end 2560x1600/1440 panels are 14-bit internally and have a programmable LUT to do in-monitor calibration instead of at the OS level. (The latter is an inherently lossy operation; the former is much less likely to be.)
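A small numpy illustration of why the OS-level path loses levels while a high-bit-depth monitor LUT does not; the correction curve is an arbitrary stand-in for a real calibration:

```python
import numpy as np

ramp = np.arange(256) / 255.0          # the 256 input levels of an 8-bit channel
corrected = ramp ** (2.2 / 2.4)        # assumed calibration curve (illustrative only)

levels_8bit = np.unique(np.round(corrected * 255))      # OS/GPU path: re-quantized to 8 bits
levels_14bit = np.unique(np.round(corrected * 16383))   # in-monitor path: 14-bit internal LUT

print(len(levels_8bit), "of 256 levels survive the 8-bit pipeline")   # collisions -> banding
print(len(levels_14bit), "of 256 levels survive the 14-bit pipeline") # all 256 preserved
```
mert165 - Tuesday, July 23, 2013 - link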
I'd like to know how a Retina MacBook Pro and a new MacBook Air hold up to the 4K display. The Verge a while back published a demo and the results were not spectacular, although in their demo they didn't go into depth as to WHY the results were so poor (weak video card, bad DisplayPort drivers, other?). Could you connect up the new Haswell MacBook Air to see performance?
Thanks!
cheinonen - Tuesday, July 23, 2013 - link
I have a vintage 2010 MacBook Air myself and no access to a Haswell one, so that's beyond my reach right now unfortunately.Treckin - Tuesday, July 23, 2013 - link
Holy snaps man, edit this article! It's absolutely painful to read. It's like pulling teeth just trying to get to the next awkwardly chosen and/or wrong word.LemmingOverlord - Tuesday, July 23, 2013 - link
When you copied and pasted the data from Excel (?) the contrast ratio got messed up... I guess.vLsL2VnDmWjoTByaVLxb - Tuesday, July 23, 2013 - link
30 pounds. Wow. That's a seriously large number. I'd like to see something 5-10 pounds less before I would consider.noeldillabough - Tuesday, July 23, 2013 - link
It's for your desk; soon enough we'll have three of these on our desks. Only thing: is 31" too large to look at without panning your head from left to right? I currently use 3 24" monitors.DanNeely - Tuesday, July 23, 2013 - link
I have no problem with my 30" monitor.airmantharp - Tuesday, July 23, 2013 - link
Me either! And mine has CCFLs, not LEDs. The LED monitors feel like Frisbees.DanNeely - Tuesday, July 23, 2013 - link
My NEC 3090 weighs in at 40 pounds. A replacement that much lighter is very attractive.noeldillabough - Tuesday, July 23, 2013 - link
I'm confused: "While not truly 4K, it is a 3840x2160 LCD display that can accept an Ultra High Definition (UHD) signal over HDMI and DisplayPort"; isn't 3840x2160 4K?DanNeely - Tuesday, July 23, 2013 - link
It's the standard marketing rebranding effort. 4K cinema has been at 4096x2160 for years. As much as I dislike that sort of game in general, a 2:1 scaling ratio with 1080p makes new systems play much nicer with old ones and is a reasonable tradeoff.noeldillabough - Tuesday, July 23, 2013 - link
Damn my current monitors are 1920x1200 and I was hoping "real 4K" was 2x2 of that.JDG1980 - Tuesday, July 23, 2013 - link
Search around on eBay for an IBM T220 or T221. These have a 3840x2400 resolution (though only a 48 Hz refresh rate), and usually cost about $800-$1500. They aren't always there, but show up on a semi-regular basis.cheinonen - Tuesday, July 23, 2013 - link
Technically it's UHD, though everyone uses 4K for 3840x2160 anyway. I'm trying to avoid it to be more accurate, but since everyone refers to their display as a 4K model, I often fall back to it. UHD would be more accurate, though.Synaesthesia - Tuesday, July 23, 2013 - link
I'd love it if you could test with a Mac Pro and see how it does with the "Retina" display mode, i.e. effectively the workspace of a 1080p display but with double the sharpness.twtech - Tuesday, July 23, 2013 - link
I think you'll see a little bit of both in terms of using scaling, and the physical size of elements onscreen. Things will have to be scaled somewhat, but text for example won't have to be just as big as it was before.BubbaJoe TBoneMalone - Tuesday, July 23, 2013 - link
Unfortunately, for gamers, there isn't a video card that can handle 60fps at 4K with maximum video settings. Not even with 3 Titans, as shown in this video -> http://www.youtube.com/watch?v=Wa-DRVqPJRo
noeldillabough - Tuesday, July 23, 2013 - link
Just think of the videocards that will sell ... the next "big thing" for AMD and nVidia; because let's face it, Intel is catching up far too quickly for their comfort at low resolutions.airmantharp - Tuesday, July 23, 2013 - link
Why would you run it at maximum settings? Gotta love the FUD peddlers.DanNeely - Thursday, July 25, 2013 - link
140 DPI at desktop monitor distances isn't high enough to do without AA; and if you can't run at native resolution with all the other settings maxed, you'd be better off running at 2560x or 1920x on a panel that natively supports that resolution, to avoid scaling artifacts and scaling lag in the panel itself.Panzerknacker - Tuesday, July 23, 2013 - link
I don't understand the people hating on 4K and saying they intend to stay with 1080p. I mean come on, everybody wants something new, right? I think LCD technology and LCD displays are far from perfect yet, and although they clearly have their advantages over CRT, they still also clearly have their disadvantages. I see this as one step closer to beating CRT. Now that with 4K we are finally at a higher pixel density, a level of sharpness that will be hard to improve on, I hope the focus will shift towards improving black levels, response times and overall picture 'feeling' (watching an LCD is still like staring at an LED lamp, while CRT gives the much nicer light-bulb feeling), and bringing back the nice glow effects in games we enjoyed on CRTs that appear like washed-out colored spots on an LCD.
Good review btw.
chewbyJ - Tuesday, July 23, 2013 - link
This is great news! I've been wanting to replace my ancient Dell 2009WFPs with something larger, and I feel like experimenting with that Seiki SE39UY04 for $700 that got announced last month. Hopefully you guys can get your hands on one of those soon and have something to compare with this ASUS model. Can't wait to do Photoshop and Lightroom work on a giant 4K display and use a more expensive, high-quality/high-uniformity display for color accuracy of prints and media.
Panzerknacker - Tuesday, July 23, 2013 - link
By the way, why is everybody worrying so much about gaming and graphics cards not handling 4K? I mean, when you have that many pixels available it should be no problem to upscale: run the game at 1080p and simply upscale to 4K. I doubt there will be much quality loss from this and it will probably still look better than on a native 1080p screen. How does this work, by the way? Is it possible to let the screen do this by itself, like with a TV? So if you input 1080p, 1024x768, whatever resolution, will it be upscaled by the screen to 4K and displayed fullscreen? This is really important for me because I would use the screen for everything, including playing older games that do not support 4K.
sheh - Wednesday, July 24, 2013 - link
Of course. Like any current monitor, it does scaling. Some do it better, some worse; some let you configure more scaling options, some don't. It's probably best to handle scaling with the graphics card (/drivers), because that gives you, at least potentially, the most control.
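One detail worth noting: 1080p and 720p divide evenly into 3840x2160, so in principle they can be scaled up with no interpolation at all, while other resolutions cannot; whether a given monitor's scaler actually does a clean integer duplication rather than a blur-prone filter varies by model. A quick sketch:

```python
panel_w, panel_h = 3840, 2160
for w, h in [(1920, 1080), (1280, 720), (2560, 1440), (1024, 768)]:
    fw, fh = panel_w / w, panel_h / h
    clean = fw == fh and fw.is_integer()
    verdict = "integer scale, can stay crisp" if clean else "fractional/uneven, needs interpolation"
    print(f"{w}x{h} -> {fw:g}x / {fh:g}x: {verdict}")
```
pattycake0147 - Tuesday, July 23, 2013 - link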
The paragraph describing the black levels is missing a zero after the decimal and before the seven. Confused me until I looked at the graph.pandemonium - Wednesday, July 24, 2013 - link
Nice spreadsheet you've got there. It clearly shows the resolution needed for a given distance and size of display. More people need to be aware of such things!LordSegan - Wednesday, July 24, 2013 - link
"The ASUS PQ321Q is pricey, and I can’t say that getting three or four 30” 2560x1600 panels isn’t a better deal, but it’s not the same as having one display that looks like this. "I don't mean to be harsh, but this story needs more careful copy editing. There are run on sentences and other pretty amateurish errors.
Mondozai - Wednesday, August 21, 2013 - link
I don't mean to be harsh, but your comment needs more careful copy editing. You should spell it run-on sentences, not run on sentences. It helps to have correct grammar when trying to correct others.
Just a tip.
bill5 - Wednesday, July 24, 2013 - link
Heh, what a surprise, the reviewer loved a $3,500 monitor... It's almost like you get what you pay for.Confusador - Wednesday, July 24, 2013 - link
Confusador - Wednesday, July 24, 2013 - link
So when can I get a 23" 2560x1600 display? 32 is a bit much for me, but I'd love the dpi.sheh - Wednesday, July 24, 2013 - link
2014, probably.DanNeely - Thursday, July 25, 2013 - link
Probably never, since margins are higher on 16:9 screens; a 21" 2560x1440 screen could be made from the same line just by cutting the panels smaller. In that general size bucket, though, I'd like to see them jump directly to 4K for ~200 DPI. This is high enough to make AA mostly unneeded when gaming and to allow for 2:1 scaling for legacy apps.jasonelmore - Wednesday, July 24, 2013 - link
jasonelmore - Wednesday, July 24, 2013 - link
UHD or 4K is going to be good for the living room simply because TVs will get bigger, and that's where 4K really shines. Back in the day, I remember 32" was massive. Then when HD first came out, 42" was the standard when talking about big screens. Now 55" is the new standard, and 70 inch will probably be commonplace in the next couple of years. 4K on a 70 inch will look great.DanNeely - Thursday, July 25, 2013 - link
DanNeely - Thursday, July 25, 2013 - link
Maybe. 70/80" TVs require rearranging your living room in ways that smaller sizes don't, and take up enough space that in smaller houses they're often impossible simply because you don't have any blank wall sections that big to orient furniture around.yhselp - Wednesday, July 24, 2013 - link
PixPerAn Feedback
The PixPerAn test looks great; you've done nothing wrong. On the contrary, the idea is to show the realistic best and worst case scenarios, of which the former should be more representative for most users. However, some of your results, namely pictures 2 and 3, seem very optimistic. In my opinion, and from my experience, pictures 4 and 5 show the real best (4) and worst (5) case scenarios. How did you capture the results and under what settings? I could try to replicate the test on my U2311H, compare the results to those I took when I purchased the monitor, and upload the results.
It might be a good idea to crop the relevant pictures (those above), pick out the two that are most representative of the best and worst case scenarios, and create a single picture instead of a gallery. If you decide that more cases should be shown, that's perfectly alright, but an organized single picture would still be easier to read. Also, some context for readers unfamiliar with what this test shows and how to read it would be quite useful.
It’s a common misconception that monitors with a response time higher than 2ms are not fast enough. Believe it or not, a lot of people, especially gamers, steer clear of monitors with higher response times. In reality things are much different; the ms rating alone says little, and not all response time ratings are created equal. One might write an article solely about the response time of a monitor; however, without going into technicalities such as RTC, PixPerAn can give an easy-to-read representation of what a user can actually expect to experience in terms of response time from a monitor. For example, a lighter ghost image or a pale overshoot behind the moving car is usually harder to spot in actual use. PixPerAn is also good for testing stereoscopic 3D ghosting and S3D performance in general; you can clearly see the benefits of NVIDIA LightBoost in the test.
A picture is worth a thousand words, especially in this case. Just by looking at a PixPerAn result you can derive almost everything you need to know about how fast a monitor should actually feel like (and what kind of visual artifacts you might experience). I would be really glad if you make this test standard on all your future monitor reviews, at least here on AnandTech. It wouldn’t take much space in the reviews and should be relatively quick to take. Please, let me know whether you think it’s a useful test, etc.
Thank you for the great review.
sheh - Wednesday, July 24, 2013 - link
3D graphs of response times for all combinations of start/end pixel lightnesses, which were in Xbitlabs reviews, are also interesting.DesktopMan - Wednesday, July 24, 2013 - link
Response time: 8ms GTG. Everyone who writes about IGZO (including Chris in this article) talks about how IGZO allows for higher electron mobility, which makes a faster display. Then why is the GTG this bad?
Does it do 1080p@120hz? Might not matter if it really is this slow though.
ChristianLG - Wednesday, July 24, 2013 - link
Hello. Why isn't the monitor you are reviewing a true 4K monitor? What is a true 4K monitor? Thanks.cheinonen - Friday, July 26, 2013 - link
True 4K, when it comes to digital cinema, is 4096 pixels wide, but the height can vary depending on the content. Since this is only 3840 pixels wide, it falls under the UHD heading, but everyone uses 4K anyway because it sounds good.joshu zh - Wednesday, July 24, 2013 - link
You can find 4K TVs in stores now. I can clearly see the difference between 1080p and 4K. You should try it and you may change your thoughts about 4K TV. To me, the issue is that you cannot find enough 4K material to watch right now. I hope this situation will change in 2-3 years.Sabresiberian - Wednesday, July 24, 2013 - link
One of my "pet peeves" is the fact that the display industry goes out of its way to blur the monitor/TV line. They aren't the same, and selling them as though they were is disingenuous at best. It has given us a 16:9 standard where it doesn't belong - and now we are crashing up against the limits of standards essentially made with low-res and TV in mind. Putting speakers in a $3500 display is like putting a $5 tuner in a Marantz amplifier. Certainly if I have the money to pay for one of these things I'm not going to put up with crap speakers! It is just about as useless a thing as can be done.
I think the best thing I can say about this monitor is that finally someone has broken the ice, and I give Asus a lot of credit for that. Not saying it's bad, just saying for $3500 I'd like to have seen more polish (better OSD) and better calibrated performance. Nice job Asus, but you could have done better.
Zan Lynx - Thursday, July 25, 2013 - link
On the other hand, if you are building a $3,500 display you may as well throw some speakers in there. A graphics artist might like the picture but only want to play the occasional email or IM notification sound.DanNeely - Thursday, July 25, 2013 - link
At the same time, for that sort of use an optional speakerbar is a reasonable accessory without driving the price up for those of us who'll never need it.kenour - Thursday, July 25, 2013 - link
*yawn* Can someone wake me when a 1600p @ 24" @ 120hz monitor is released.tackle70 - Thursday, July 25, 2013 - link
I am so jealous of anyone with a spare $3500 laying around to spend on this screen. I am just dying to move from my 27" 1440p panel to a 4k 60 Hz screen... I'm really hoping the 39" type VA panel that ASUS has coming out next year is a lot cheaper.skrewler2 - Friday, July 26, 2013 - link
Pretty disappointing a $3500 monitor doesn't come with a stand that pivots (portrait mode). Wonder if the 4k Dells will come with it...houkouonchi - Monday, July 29, 2013 - link
"We have seen non-IGZO panels in smartphones with higher pixel densities, but we don’t have any other current desktop LCDs that offer a higher pixel density than this ASUS display"Wha??? I have been using 22 inch 4k LCD's since 2005-ish that have way higher pixel density than this thing does...
dj christian - Tuesday, July 30, 2013 - link
What do gamma, contrast, and dE2000 all mean?path_doc - Wednesday, July 31, 2013 - link
Can the "NVIDIA GeForce GT 650M 1024 MB" graphics card in the Retina MacBook Pro drive this display? If not, do I have to buy a separate desktop to do the job?path_doc - Thursday, August 1, 2013 - link
Anyone out there?DPOverLord - Wednesday, August 7, 2013 - link
Reading this there does not seem to be a reason to buy it. Someone is better off buying 3 30" 1600p monitors for 3k than 1 31.5" 4k monitor for the same price. Thoughts?Sancus - Saturday, August 10, 2013 - link
You can't really compare multiple monitors to a single one. Some people are comfortable with 3x30" monitors on their desk, other people may not even have space for that many 30 inch monitors or might think it awkward. The pixel density of this monitor is much higher than a 30" 1600p monitor, and that matters to many people.Not to mention, driving 7680x4800 for gaming is basically impossible without turning settings down to low in every game and having 4x Titans. 3840x2160 consumes enough GPU power as it is, and 3x1600p is more than 4 times the number of pixels! And then there's the issue of bezels, which some people hate. If you're comfortable with multiple monitor gaming, this screen probably isn't for you. So it's all down to personal preference.
twtech - Sunday, August 11, 2013 - link
4k will be a bigger deal for the 100" TV sets that will be mainstream a few years from now.EclipsedAurora - Thursday, August 15, 2013 - link
Hey! There's no need for HDMI 2.0. HDMI 1.4 was already 4K-capable long before DisplayPort supported it!pauljowney - Sunday, January 5, 2014 - link
Hi,
That's a great gaming monitor site, and every new habit begins with mental shifts. Thank you very much for your instructions; they're very helpful. If you want to know more, here is where you can get good information:
http://www.bestgamingmonitorshq.com
platinumjsi - Friday, January 17, 2014 - link
How do you find images and videos are handled? I have a rMBP and I find that neither is output pixel-for-pixel but rather scaled with the rest of the desktop. Is this the same on Windows, or are images and videos rendered pixel-for-pixel?