29 Comments

  • fishfishfish - Tuesday, October 1, 2013 - link

    I just bought a QNIX - from one of those Korea-based eBay sellers you mentioned. Build quality is cheap and the panels are hit and miss. There is a serious lack of reasonably priced, quality-built 1440p & 1600p monitors IMO. With next-gen GPUs claiming gaming capabilities at 4K, I wonder if most gamers will skip 1440p and 1600p in favour of 4K. Although again that comes down to price; I can't see 4K monitors dropping below $1,000 for at least a few years. I know I'll be ready when it happens though :)
  • inighthawki - Tuesday, October 1, 2013 - link

    My understanding is that both Xbox One and PS4 still have some trouble rendering at 1080p @ 60Hz, let alone 4K. Maybe at best, they have the capability to drive a 4K display for their main UI, but I would be astonished if there were any games that would support it. Even high-end PC GPUs like the Titan and the 290X are going to have trouble doing any real gaming at 4K with high-quality shaders.
  • psuedonymous - Wednesday, October 2, 2013 - link

    Compare 1080p displayed on a native 1080p panel with 1080p upscaled to UHD (4K is a cinema thing, let's not mix things up) plus one of the many varieties of computationally cheap post-AA: the UHD panel will look better. UHD is even an exact integer multiple of 1080p, so you can pixel-double with no overhead for the game itself, render the UI/HUD at full resolution, and have the best of both worlds.
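    The integer-multiple point is easy to sketch: UHD (3840x2160) is exactly 2x 1080p in each dimension, so a nearest-neighbour pixel-double maps every source pixel to a clean 2x2 block with no filtering. A minimal illustration (not any console's actual scaler):

```python
def pixel_double(frame):
    """Nearest-neighbour 2x upscale: each source pixel becomes a 2x2 block."""
    doubled = []
    for row in frame:
        wide = [px for px in row for _ in (0, 1)]  # repeat each pixel horizontally
        doubled.append(wide)
        doubled.append(list(wide))                 # repeat the whole row vertically
    return doubled

# A 1920x1080 frame pixel-doubles to exactly 3840x2160 -- no fractional scaling.
src_w, src_h = 1920, 1080
assert (src_w * 2, src_h * 2) == (3840, 2160)
```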
  • Origin64 - Wednesday, October 2, 2013 - link

    The consoles are going to render the UI at 1080p and upscale to 4K (which might work decently with 1:2 pixel mapping, but it'll be advertised as true 4K just like current consoles are supposed to be 1080p), while games will be 720-1080p and might not even upscale. If you want 4K compatibility, go PC, and shell out the big bucks.

    If 2 Titans can run 5760*1080 they can run 3840*2160 almost as well, so dual high-end cards seem to be the minimum for relatively smooth (40+ fps) play.
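    For what it's worth, the raw pixel counts don't quite line up, so "almost as well" is optimistic. Quick arithmetic (ignoring per-frame overheads like geometry and draw calls, which don't scale with resolution):

```python
surround = 5760 * 1080  # triple 1920x1080 surround: 6,220,800 pixels per frame
uhd      = 3840 * 2160  # UHD: 8,294,400 pixels per frame

# UHD pushes exactly 4/3 the pixels of triple-1080p, i.e. about 33% more shading work.
print(uhd / surround)
```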
  • nathanddrews - Wednesday, October 2, 2013 - link

    Smooth play at max/ultra settings, yes. I don't know about the rest of the world, but I generally don't play with max settings even when I have the option. For me, high frame rates trump candy every time. I'm not saying I only play CS at 800x600 with low settings, but I'll usually target the native resolution of a display and tweak accordingly.

    I'm all for pushing the candy to the max as they make for pretty screenshots, but I'd really like to see some 4K benchmarks with low and medium settings just to see what is realistic for the sub-$600 GPU crowd. Honestly, nothing is more annoying than swaying grass or exploding debris making it harder to track targets. Pretty to look at, but not helpful in many games.
  • dalingrin - Tuesday, October 1, 2013 - link

    Once you go 120Hz, there's no going back. I won't touch 4K or 1440p until we get monitors that can handle 120Hz at those resolutions.
    While I definitely want higher-density monitors, I would rather see non-TN panels capable of greater than 60Hz refresh rates first.
  • tackle70 - Tuesday, October 1, 2013 - link

    I disagree... once you go to higher resolution, you never go back!
  • Samus - Wednesday, October 2, 2013 - link

    Likewise, I picked up an HP ZR2740 2560x1440 monitor on ebay for $450 used. Excellent monitor, especially in games; once you go big you never go back. BF3/4 in 1440P is just ridiculous.
  • Rainman11 - Wednesday, October 2, 2013 - link

    You can get a QNIX/X-Star 1440p PLS that can do 120Hz for cheaper than most 120Hz 1080p TN monitors. In other words, a 1080p TN 120Hz monitor is a waste of money.
  • Novulux - Wednesday, October 2, 2013 - link

    Some Korean 1440p monitors can overclock to 120Hz; you can also buy 120Hz-capable PCBs for ~$200.
  • Origin64 - Wednesday, October 2, 2013 - link

    As a gamer, I can't agree more. Upon switching to 120Hz it gets so much easier to land precision snipes in FPSs with double the framerate and half the latency, and clicking individual zerglings in SC2 is hardly a challenge anymore. Playing more casual games in 3D helps massively with immersion, although with games like Rome II: Total War I do notice I'm running into the limits of Full HD resolution; zoom out a quarter of the way and you don't see more than 2 blocky pixels of all those beautifully modeled units.

    My next screen has got to be at least 2560*1440 at 120Hz, and preferably both of those numbers will keep increasing over the next few years.

    Now there's just the long wait until cinema figures out 24Hz isn't smooth if you're used to seeing 5 times that. Every movie I watch is a slide show and it's driving me crazy. 48fps is better, but that's about the same as TV (50/60Hz) and we can do even better than that.
  • EzioAs - Wednesday, October 2, 2013 - link

    If you have a 120Hz monitor and would like to watch your videos at 120fps, give SVP (Smooth Video Project) a try. There are settings that let you play your videos at your monitor's native refresh rate. I only have a 60Hz monitor so I can't speak to the 120fps experience, but even then, for me there is no going back to watching videos at 24fps.
  • nathanddrews - Thursday, October 3, 2013 - link

    SVP works well for 72Hz and 96Hz, too. Personally, I prefer the pure 24fps of film over the artifacts that interpolation brings. Even at maximum quality (minimum artifacts), SVP still has disruptive artifacts - in addition to running your CPU+GPU at max power! With synchronized pull-down or frame-multiples (48/72/96/120), the "film look" works fine for me. Some films are better than others, however. YMMV
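    The synchronized pull-down point comes down to integer multiples: a refresh rate shows 24fps film without judder only if it divides evenly by 24, so each film frame is held for a whole number of refreshes. A quick sketch:

```python
def judder_free(refresh_hz, fps=24):
    """True if each film frame can be shown a whole number of refreshes."""
    return refresh_hz % fps == 0

# 48/72/96/120 are clean multiples of 24; 60Hz needs uneven 3:2 pulldown instead.
rates = [48, 50, 60, 72, 96, 120]
print([hz for hz in rates if judder_free(hz)])  # [48, 72, 96, 120]
```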
  • abhaxus - Tuesday, October 1, 2013 - link

    Bought a QNIX QX2710 glossy recently myself. Very pleased with the quality, and got an overclock to 96Hz out of the box with the included DVI cable. Could probably go higher, but I don't feel like investing more money on the slight chance I get higher frame rates. One dead pixel in a hardly noticeable place, and better black levels than my Asus 23" IPS. Arrived in less than two days from the opposite side of the world.
  • HisDivineOrder - Wednesday, October 2, 2013 - link

    I suspect that 1440p and 1600p will become the new 1080p and 1200p. Then 4K will slot in where 1440/1600p were.

    Then I imagine 1080p will drop to where 720/768p were and 720p will fall to the bottom of the barrel where basically the giveaways are.
  • ShieTar - Wednesday, October 2, 2013 - link

    Given the state of current TV content, one could also assume that 1080p will be the new 1080p, and remain so for another decade. 1440p, 4K, and multi-monitor solutions will then remain firmly outside the mainstream's field of view.
  • Sunrise089 - Tuesday, October 1, 2013 - link

    Make it 16:10 for $500 or even $550 and it's an easy sale. I have no desire to deal with 16:9 in a display I need to use for more than just content consumption.
  • Samus - Wednesday, October 2, 2013 - link

    You'll barely notice the difference between 16:10 and 16:9 at 27"+ 1440p.
  • JPForums - Wednesday, October 2, 2013 - link

    Perhaps you barely notice the difference between 16:10 and 16:9 at 27"+ 1440p.
    However, neither size, nor pixel density can do a thing to change the shape of the 16:9 to look more like a 16:10. For people like myself and (apparently) Sunrise, 16:10 is preferred.
    Interestingly, 16:10 was originally chosen as the aspect ratio that most closely approximates the most precise and active portion of the human visual range. The movie and broadcast industries were the ones to champion 16:9. The hilarious part is, as soon as monitor manufacturers started adopting 16:9 (1.7778:1), movies moved on to wider aspect ratios up to 2.35:1. To give a few examples from different eras: The Matrix, X-Men Origins: Wolverine / First Class, The Dark Knight (which switches between ratios), and Fast Five all use 2.35:1 framing. So now, people give up the extra vertical resolution and still have to deal with black borders.
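    The letterboxing complaint is easy to quantify: when 2.35:1 content is scaled to fill the panel's width, some fraction of the height goes to black bars on either panel shape. Rough arithmetic (the resolutions are just example 27"/30" panels):

```python
def bar_fraction(panel_w, panel_h, content_ratio=2.35):
    """Fraction of panel height lost to black bars, scaling content to full width."""
    image_h = panel_w / content_ratio
    return 1 - image_h / panel_h

print(bar_fraction(2560, 1440))  # 16:9 panel: roughly a quarter of the height is bars
print(bar_fraction(2560, 1600))  # 16:10 panel: slightly more, roughly a third
```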
  • JaBro999 - Wednesday, October 2, 2013 - link

    I also despise 16:9, but the cost of a 30" 16:10 monitor is really prohibitive. With the price of Korean imports and off-brand 30" monitors hovering around $900-$1000, that's just too darn much of a gamble. Despite working on 24" 16:10 monitors every day, I find myself leaning towards a 27" purchase for my home use.
  • DarkXale - Wednesday, October 2, 2013 - link

    1440p monitors are usually used as (at the least) a dual 1280x1440 setup. As most software and websites are designed for 1280-pixel widths, the benefit of any extra width beyond that is minimal. A 1920-pixel width is in many cases too narrow to run the equivalent side-by-side layout without losing elements.

    The difference is also pronounced with open PDF documents, which become much easier to read.
  • ShieTar - Wednesday, October 2, 2013 - link

    You are talking from a somewhat specific viewpoint here. I do know a couple of web-designers, and people in related fields of work, and I know most of them would fully agree with you.

    For engineering, on the other hand, most people tend to use the full monitor surface for their primary software, be it a CAD tool, an IDE, or just plain old Excel. 90% of the time, even a 30" 1600p monitor won't be wide enough to show everything you're working on at a readable magnification.

    A lot of the time, 2-6 of us will stand around a monitor anyway, so we need to increase magnification even more so that everybody can see/read the details. In my own experience, for any monitor below 50" in size, full HD works out just fine for our purposes. More would be appreciated, but it would never make a difference in the way of using the monitor that you describe.
  • DarkXale - Wednesday, October 2, 2013 - link

    Certainly, though such users in my experience tend to be less concerned about the aspect ratio. It's the surface area we need, regardless of the ratio. Much engineering software still uses sub-windows, but I was referring to fully individual windows.

    More monitors is the usual solution to the surface-area problem, but 1920-wide monitors often waste much more space than 2560-wide monitors (if you don't want a compromised experience), which was the point of my post.
  • JPForums - Wednesday, October 2, 2013 - link

    More monitors is the usual solution to solve the surface area problem, but 1920 width monitors often waste much more space than 2560 monitors

    True, but I'd still prefer 2560x1600 rather than 2560x1440.

    Certainly, though such users in my experience tend to be less concerned about the aspect ratio.

    I'm an engineer who happens to be concerned with aspect ratio. True, it isn't quite as critical when displaying a schematic or CAD drawing, but when writing code in an IDE or using Excel, I find vertical space to be at a premium. Some of our dedicated software guys even go as far as rotating their monitors so that they can see more lines of code at once. Even with a rotated screen, though, they still want more vertical (rotated horizontal) resolution, since it can limit the length of the lines they write (or force word wrap, which defeats the purpose of rotating).
  • CSMR - Wednesday, October 2, 2013 - link

    I wish more manufacturers would go DP-only. Less circuitry, less power, and a more efficient signal path resulting in lower latency.
  • FwFred - Wednesday, October 2, 2013 - link

    16:9 has been fine for me at home on a 2560x1440. At work I use two 24" 1920x1200 monitors in portrait, so I do like vertical real estate.

    I would shop around before purchasing this Nixeus @ $450. I bought a Dell U2713HM refurb for ~$400. I couldn't tell the difference between a new monitor and this refurb. I'm only losing the 5 year warranty, but at the price I can afford to replace it in the unlikely event it fails.
  • jackstar7 - Wednesday, October 2, 2013 - link

    Completely agree. Glad someone was willing to put this out in the market. Hopefully it does well and serves as a signal to other makers that there is demand for Displayport.
  • sulu1977 - Wednesday, October 2, 2013 - link

    Personally, I'm eagerly waiting for the SWXQXATZGA monitor.

    (sorry, just had to say that) :)
  • geok1ng - Friday, October 4, 2013 - link

    On Alibaba these things sell for USD 180-210, and the seller offers replacement parts for up to 3% of units.
    The real question is: can the Nixeus overclock?
