29 Comments

  • Opencg - Friday, May 31, 2019 - link

    I'm so tired of full array. It does nothing for local contrast.
  • quiksilvr - Friday, May 31, 2019 - link

    I'm ready for OLED. With sufficient pixel shifting it would be ideal for gaming.
  • nathanddrews - Friday, May 31, 2019 - link

    It's already ideal for gaming - you just have to be willing to accept image retention as a part of life and the likelihood of eventual, permanent burn-in. If you're the type of person who upgrades displays every few years, then you have far more to gain than to lose.
  • Kamus - Friday, May 31, 2019 - link

    Burn-in is possible, but mostly FUD. Plasmas had more burn-in potential, yet my 8-year-old plasma doesn't have any.

    My Note 4, which I used for 4 years before retiring it, never got any either.
  • Opencg - Friday, May 31, 2019 - link

    Anyone heard much about these double-layered displays? Supposedly they're an alternative to OLED that offers high contrast on a per-pixel basis.
  • nathanddrews - Saturday, June 1, 2019 - link

    Nothing you can buy quite yet. Panasonic introduced a proof of concept back in 2016 that used two layers of IPS. HiSense recently showed something too, I think. Basically, you use a cheap, monochrome 1080p IPS panel behind the color 4K IPS panel and shine a SH!T-TON of light through it. The 1080p panel effectively acts like 2-million-zone FALD behind the 4K panel that further assists with blocking light to reduce blooming and increase contrast.

    So you get better-than-VA contrast (won't beat OLED) with the viewing angles of IPS. From articles and interviews I've seen, it's the power efficiency that's holding it back. By the time it reaches the market (2021?), we may also see MicroLED hit the scene, who knows.
  • reckless76 - Friday, May 31, 2019 - link

    Well, that's just not going to fit on my desk.
    What exactly is the use case for this? I kinda thought if I wanted my PC hooked up to a TV in the living room, I'd buy a TV. Speaking of which, this screen looks a lot like a Samsung Q series in game mode with G-Sync instead of FreeSync.
  • Flunk - Friday, May 31, 2019 - link

    You'd buy it because the specs vastly exceed any TV on the market. The main difference between a current-gen TV and a current-gen monitor is the inclusion of a tuner (and maybe some picture tuning).
  • Flunk - Friday, May 31, 2019 - link

    Although I imagine most people would never buy this anyway because the specs on this monitor imply that it will be absurdly expensive.
  • Alistair - Friday, May 31, 2019 - link

    How does it "vastly" exceed a current gen OLED TV from LG? Or the Samsung Q90? My understanding is that those TVs would be similar once an HDMI 2.1 source is available. Spec tables don't mean much without a review.

    Samsung's Q90 hits 10,000:1 contrast with full-array local dimming, 1300 nits brightness in HDR, and 4K 120Hz on port 4 (currently limited to 4:2:0 color with HDMI 2.0, but it is supposedly an HDMI 2.1 port).

    It also has an optical layer that Samsung calls 'Ultra Viewing Angle,' which greatly improves the viewing angles at the expense of some native contrast ratio, basically giving a VA panel viewing angles similar to IPS while still keeping much better contrast, like most VA TVs.

    Samsung's TV is very expensive ($3000), but we are hearing the Asus will cost a lot more than that. Buying an LG OLED is still my preference, since it also supposedly has an HDMI 2.1 port. We need an HDMI 2.1 video card first, from NVIDIA or AMD.
  • boeush - Friday, May 31, 2019 - link

    Ever tried using a TV as a computer monitor? The lack of contrast on text tends to make it nigh-unusable. In general, TVs play fast and loose with the image they're trying to display; they are designed to produce a visually 'equivalent' image at a large viewing distance.

    Monitors aren't allowed to mess with the picture content - so no lossy compression, no hinky games with local edge or contrast enhancement to "improve picture quality", etc. Monitors would have DisplayPort and other PC-oriented I/O ports, whereas a TV won't have any such things.

    TVs will probably also have worse GtG and worse ghosting artifacts in general, because they're geared toward displaying movie content, where frame-to-frame differences tend to be small and gradual - which isn't the case with the type of content that computers tend to generate.
  • GrandTheftAutoTune - Friday, October 4, 2019 - link

    You know nothing about modern TVs, it seems. Modern TVs have a Game Mode and PC Mode that disable all the nonsense that screws up text sharpness and accuracy. Heck, most of that is caused by post-processing effects and sharpness settings that any intelligent person who wishes to calibrate their display to studio accuracy disables anyway.

    There is zero compression used outside of chroma subsampling with HDR - which all displays, including HDR monitors, use. If you actually look at the raw GtG measurements of both the best TVs and monitors, you would realise they're both near the edge of what's possible with LED/LCD. Don't ever listen to manufacturers' quoted response times; they all measure GtG differently. Using RTINGS measurements, you will realise that you can achieve low blur with a high-end Samsung TV, for example. You can actually achieve much less blur than a monitor by using BFI (Black Frame Insertion), which is much better than overdrive. You can also disable and/or reduce dimming if you don't like it.

    On a properly set up Samsung, colour calibrated with the nonsense off, it looks as clear and as accurate as my phone screen for web browsing. Movies and video games look way better due to higher brightness, contrast, and lower levels of blur. On a Sony OLED, colour calibrated with the nonsense turned off in Game Mode and with BFI enabled in HDR, it was the best visual experience I've had. It easily destroys any monitor in every category. No issues with sharpness or dimming, plasma levels of motion clarity, studio-level colour accuracy, insane pitch-black-room contrast for horror games, high brightness levels, superb UHD clarity, no banding in 10-bit colour. It looked perfect.

    Next time educate yourself before you comment nonsense.
  • colecodez - Sunday, June 2, 2019 - link

    Lots of reasons - the exact reasons why all TVs are not even close to a good monitor:
    - No Displayport
    - Max refresh rate: 120Hz
    - Input lag 4k@120Hz: 18.4ms
    - 4k Freesync maxes out at 60Hz
    - 4k@120Hz skips frames in game mode. Without game mode we're talking 57ms input lag.
    - TVs are optimized for TV formats, PCs are not.
    - Ads

    TL;DR: If you don't see input lag advertised on the box, pass.

    https://www.rtings.com/tv/reviews/samsung/q90-q90r...
  • NunayaBiz - Thursday, September 19, 2019 - link

    The best TVs are extremely close to a good monitor, and at a fraction of the price
    - No DisplayPort - yes, that's why they said to wait until GPUs get HDMI 2.1
    - Max refresh rate: 120Hz is not that different from 144Hz. This is mostly a non-issue, and the ROG is also 120Hz; it can only reach 144Hz with an overclock.
    - Input lag 4K@120Hz: 18.4ms - after an update, it's now 7.1ms
    - 4K FreeSync maxes out at 60Hz - yes, this is one thing you'll be giving up
    - 4K@120Hz skips frames in game mode - this is fixed.
    - TVs are optimized for TV formats - explain why this is an issue?
    - Ads - definitely a problem with TV use, but I don't know if they're visible during use as a monitor

    If you don't see input lag advertised on the box - they NEVER put input lag on the box. The advertised 1ms or 2ms on the box refers to GtG, NOT input lag. Even good gaming monitors have ~4-5ms of input lag, basically indistinguishable from the 7.1ms of input lag seen in the Q90.

    OR you could go with an LG C9 OLED, with sub-7ms input lag (hasn't been tested at 4K yet, as there aren't GPUs with HDMI 2.1)

    TL;DR: The ROG Swift has a 22Hz faster refresh rate after overclock and FreeSync that goes to 120Hz. For that you're paying ~$5,000-$6,599 vs $2,000.
    Up to you if the expanded FreeSync range and 22Hz faster refresh are worth $3,000-$4,599.
  • Metalingus - Friday, October 4, 2019 - link

    1. DisplayPort isn't needed; HDMI 2.1 is good enough and will be for many years.
    2. The perceptual difference between 120Hz and 144Hz is minute. Even if you really care, many Samsung TVs can be overclocked to 144Hz.
    3. Actually, new updates for TVs have reduced input lag further. Some are as low as 7.1ms at 120Hz, which is a minuscule 2ms slower than the best PC monitor for input lag right now. I forget the name, but it's a 240Hz monitor with a measured input lag of 5ms. You won't feel a 2ms difference.
    4. That's true. I would imagine next year's TVs will support VRR up to 120Hz.
    5. That's irrelevant. The only use for 120Hz is video games, and you're supposed to play in Game Mode. Self-explanatory, and it's a non-issue.
    6. TVs by standard are optimised for 3 formats: Rec.709/SDR colour, DCI-P3 SDR/HDR, and Rec.2020 HDR. The first is almost identical to web colour standards, so your point is moot with regards to internet content. The second is for movies, and the last is for movies and video games. PC games target Rec.2020 HDR too, so your uneducated point is false. Plus, even for niche standards that many colour artists use, you CAN tune the TV for them. You can buy a colorimeter and calibrate to whatever standard you like and save it as a preset.
    7. Relevant point, but not all TVs have ads. Plus you only ever see them when accessing the Smart menus, so it's a moot point for PC use.
    8. No monitor companies state the input lag of their products either. Reviewers have to use a Leo Bodnar device to measure it and post the results. Another moot point.
  • Guspaz - Friday, May 31, 2019 - link

    Which specs vastly exceed any TV on the market? Compared to a standard LG OLED, the peak brightness is slightly higher, and the "overclocked" framerate of 144Hz is a bit higher than the 120Hz on a TV, but this display has a worse contrast ratio, worse pixel response time, worse coverage of the DCI-P3 colour space... The only metric where this display beats conventional OLED TVs by a significant margin is its support for G-Sync HDR, except game consoles don't support G-Sync; they support FreeSync... which the LG OLEDs do support.
  • jeremyshaw - Friday, May 31, 2019 - link

    The LG OLEDs support HDMI 2.1 VRR, not Freesync.

    This is important, since no FreeSync-only source seems to be able to activate FreeSync over HDMI on them. The Xbox One S/X implemented a more correct HDMI 2.1 VRR setup (compatible with both FreeSync and HDMI VRR), but AMD's unofficial HDMI modification for FreeSync over HDMI doesn't work.

    Note: neither the Xbox One S/X nor the PS4 uses AMD's own HDMI implementation. Sony uses their own logic, fabricated at Panasonic. Microsoft uses a TI DP-to-HDMI converter.

    Either way, they have more input lag (tested) than my U2711.
  • jeremyshaw - Friday, May 31, 2019 - link

    Whoops, I meant to remove my last line before posting. Later retests showed the high input lag was a fault in the input lag test, not actual high input lag.
  • colecodez - Sunday, June 2, 2019 - link

    No, the input lag is way higher than monitors' in all tests I looked at. Game mode cuts it to 18ms but drops frames at 4K.
  • Metalingus - Friday, October 4, 2019 - link

    Well, that's changed. Look at the results now with the latest firmware; input lag is below 10ms.
  • Metalingus - Friday, October 4, 2019 - link

    There is no advantage to using FreeSync over VRR. Newer updates have reduced input lag to below 10ms in VRR mode on many Samsung TVs.
  • colecodez - Sunday, June 2, 2019 - link

    You can't go off advertisements; they neglect to mention the most important specs for gaming (cough, input lag).

    The TV sucks for gaming because:
    - No Displayport
    - Max refresh rate: 120Hz
    - Input lag 4k@120Hz: 18.4ms
    - 4k Freesync maxes out at 60Hz
    - 4k@120Hz skips frames in game mode. Without game mode we're talking 57ms input lag.
    - TVs are optimized for TV formats, PCs are not.
    - Ads
  • Metalingus - Friday, October 4, 2019 - link

    I'll post this again, as you seem to post your nonsense everywhere too:

    1. DisplayPort isn't needed; HDMI 2.1 is good enough and will be for many years.
    2. The perceptual difference between 120Hz and 144Hz is minute. Even if you really care, many Samsung TVs can be overclocked to 144Hz.
    3. Actually, new updates for TVs have reduced input lag further. Some are as low as 7.1ms at 120Hz, which is a minuscule 2ms slower than the best PC monitor for input lag right now. I forget the name, but it's a 240Hz monitor with a measured input lag of 5ms. You won't feel a 2ms difference.
    4. That's true. I would imagine next year's TVs will support VRR up to 120Hz.
    5. That's irrelevant. The only use for 120Hz is video games, and you're supposed to play in Game Mode. Self-explanatory, and it's a non-issue.
    6. TVs by standard are optimised for 3 formats: Rec.709/SDR colour, DCI-P3 SDR/HDR, and Rec.2020 HDR. The first is almost identical to web colour standards, so your point is moot with regards to internet content. The second is for movies, and the last is for movies and video games. PC games target Rec.2020 HDR too, so your uneducated point is false. Plus, even for niche standards that many colour artists use, you CAN tune the TV for them. You can buy a colorimeter and calibrate to whatever standard you like and save it as a preset.
    7. Relevant point, but not all TVs have ads. Plus you only ever see them when accessing the Smart menus, so it's a moot point for PC use.
    8. No monitor companies state the input lag of their products either. Reviewers have to use a Leo Bodnar device to measure it and post the results. Another moot point.
  • tazius - Sunday, June 2, 2019 - link

    As one of the whales that bought the Omen X Emperium: the picture and color quality are worse than my Samsung QF6 TV and worse than my X34 Predator. That said, I love the display; it's an amazing experience playing at high framerates with G-Sync keeping the picture smooth on drops. The remote could be better. The sound bar is great, though it should really come with simulated surround sound rather than stereo; it's best coupled with an existing system.
  • zodiacfml - Friday, May 31, 2019 - link

    Because they don't have plans to bring the price down. This could easily have been a 43-50" display, which fits more desks. Typing this on a 43" 4K TV.
  • DigitalFreak - Friday, May 31, 2019 - link

    The HP version of this was $5000, so I don't expect the Asus to be much, if any, cheaper.
  • DigitalFreak - Friday, May 31, 2019 - link

    They must not be selling well either. Microcenter has them for sale for $1000 off MSRP.
  • godrilla - Friday, May 31, 2019 - link

    Lol, the HP Omen is already selling for $1000 off at my local Microcenter. Like I said, when you hear HP, do you think 'enthusiast' anything?
  • isthisavailable - Saturday, June 1, 2019 - link

    Gaming speakers? Hmm..
