Original Link: https://www.anandtech.com/show/13060/asus-pg27uq-gsync-hdr-review



Delayed past its original late 2017 timeframe, and then past the April and May estimates, NVIDIA’s G-Sync HDR technology finally arrived over the last couple of months courtesy of Asus’ ROG Swift PG27UQ and Acer’s Predator X27. First shown as prototypes at Computex 2017, the 27-inch displays bring what are arguably the most desired and visible aspects of modern gaming monitors: ultra high resolution (4K), high refresh rates (144Hz), and variable refresh rate technology (G-Sync), all in a reasonably-sized quality panel (27-inch IPS-type). On top of that, of course, come the various HDR-related capabilities for brightness and color gamut.

Individually, these features are just some of the many modern display technologies, but where resolution and refresh rate (and also input latency) are core to PC gaming, those elements typically work as tradeoffs, with 1440p/144Hz being a notable middle ground. So by the basic 4K/144Hz standard, we have not yet had a true ultra-premium gaming monitor. But today, we look at one such beast with the Asus ROG Swift PG27UQ.

ASUS ROG Swift PG27UQ G-SYNC HDR Monitor Specifications
| | ROG Swift PG27UQ |
|---|---|
| Panel | 27" IPS (AHVA) |
| Resolution | 3840 × 2160 |
| Refresh Rate | OC Mode: 144Hz (HDR, 4:2:2), 144Hz (SDR, 4:2:2); Standard: 120Hz (HDR, 4:2:2), 98Hz (HDR, 4:4:4), 120Hz (SDR, 4:4:4); Over HDMI: 60Hz |
| Variable Refresh Rate | NVIDIA G-Sync HDR module (actively cooled) |
| Response Time | 4 ms (GTG) |
| Brightness | Typical: 300-600 cd/m²; Peak: 1000 cd/m² (HDR) |
| Contrast | Typical: 1000:1; Peak: 50,000:1 (HDR) |
| Backlighting | FALD, 384 zones |
| Quantum Dot | Yes |
| HDR Standard | HDR10 support |
| Viewing Angles | 178°/178° horizontal/vertical |
| Pixel Density | 163 pixels per inch; 0.155 mm pixel pitch |
| Color Depth | 1.07 billion colors (8-bit with FRC) |
| Color Gamut | sRGB: 100%; Adobe RGB: 99%; DCI-P3: 97% |
| Inputs | 1 × DisplayPort 1.4; 1 × HDMI 2.0 |
| Audio | 3.5 mm audio jack |
| USB Hub | 2-port USB 3.0 |
| Stand Adjustments | Tilt: +20°~-5°; Swivel: +160°~-160°; Pivot: +90°~-90°; Height: 0~120 mm |
| Dimensions (with stand) | 634 × 437-557 × 268 mm |
| VESA Mount | 100 × 100 |
| Power Consumption | Idle: 0.5 W; Peak: 180 W (HDR) |
| Price | $1999 |

As an ultra-premium gaming monitor of that caliber, the PG27UQ also has an ultra-premium price of $1999. For reasons we’ll soon discuss, the pricing very much represents the panel’s HDR backlighting unit, quantum dot film, and G-Sync HDR module. The full-array local dimming (FALD) backlighting system delivers the brightness and contrast needed for HDR, while the quantum dot film enhances the representable colors to a wider gamut, another HDR element. The new generation G-Sync HDR module deals with the variable refresh implementation, but with HDR, high refresh rate, and high resolution combined, bandwidth constraints require chroma subsampling beyond 98Hz.

In terms of base specifications, the PG27UQ is identical to Acer’s Predator X27 as it uses the same AU Optronics panel, and both monitors are essentially flagships for the G-Sync HDR platform, which includes the curved ultrawide 35-inch models and 4K 65-inch Big Format Gaming Displays (BFGD). Otherwise, there isn’t anything new here that we haven’t already known about in the long run-up.

NVIDIA G-SYNC HDR Monitor Lineup
| | Acer Predator X27 / ASUS ROG Swift PG27UQ | Acer Predator X35 / ASUS ROG Swift PG35VQ | Acer Predator BFGD / ASUS ROG Swift PG65 / HP OMEN X 65 BFGD |
|---|---|---|---|
| Panel | 27" IPS-type (AHVA) | 35" VA, 1800R curve | 65" VA? |
| Resolution | 3840 × 2160 | 3440 × 1440 (21:9) | 3840 × 2160 |
| Pixel Density | 163 PPI | 103 PPI | 68 PPI |
| Max Refresh Rate | 144Hz (60Hz over HDMI) | 200Hz (60Hz over HDMI) | 120Hz (60Hz over HDMI) |
| Backlighting | FALD (384 zones) | FALD (512 zones) | FALD |
| Quantum Dot | Yes | Yes | Yes |
| HDR Standard | HDR10 support | HDR10 support | HDR10 support |
| Color Gamut | sRGB, DCI-P3 | sRGB, DCI-P3 | sRGB, DCI-P3 |
| Inputs | 2 × DisplayPort 1.4, 1 × HDMI 2.0 | DisplayPort 1.4, HDMI 2.0 | DisplayPort 1.4, HDMI 2.0, Ethernet |
| Price | $1999 | TBA | TBA |
| Availability | Present | 2H 2018? | 2H 2018? |

Furthermore, Asus maintained a rather candid update thread for the ROG Swift PG27UQ on their ROG forums, which offers some insight into the panel-related firmware troubles they've been having.

How We Got Here: Modern Gaming Monitors and G-Sync HDR

One of the more interesting aspects of the PG27UQ is how its headlining features intersect. The 3840 x 2160 ‘4K’ resolution and 144Hz refresh rate are very much in the mix, and so is the monitor being not just G-Sync but G-Sync HDR. Then there is the HDR side itself, with an IPS-type panel that has localized backlighting and a quantum dot film. G-Sync HDR means both a premium tier of HDR monitor and the new generation of G-Sync that works with high dynamic range gaming.

Altogether, the explanation isn’t very succinct for gamers, especially compared to a non-HDR gaming monitor, and it has everything to do with the vast number of moving parts involved in consumer monitor features, something more thoroughly covered by Brett. For some context, recent display trends include:

  • Higher resolutions (e.g. 1440p, 4K, 8K)
  • Higher refresh rates (e.g. 120Hz, 165Hz, 240Hz)
  • Variable refresh rate (VRR) (e.g. G-Sync, FreeSync)
  • Panel size, pixel density, curved and/or ultrawide formats
  • Better panel technology (e.g. VA, IPS-type, OLED)
  • Color bit depth
  • Color compression (e.g. chroma subsampling)
  • Other high dynamic range (HDR) relevant functions for better brightness/contrast ratios and color space coverage, such as local dimming/backlighting and quantum dot films

These features obviously overlap, and much of their recent development is not so much ‘new’ as it is now ‘reasonably affordable’ to the broader public. Monitors for professional visualization have offered many of the same specifications for some time, albeit at professional-class prices. And most elements are ultimately limited by PC game support, even uncapped refresh rates and 4K+ resolutions. This is, of course, not including connection standards, design (i.e. bezels and thinness), or gaming monitor features (e.g. ULMB). All these bits, and more, are served up to consumers in a bevy of numbers and brands.

Why does all of this matter? All of these are points of discussion with the Asus ROG Swift PG27UQ, and especially with G-Sync HDR at the heart of this display. Gaming monitors are moving beyond resolution and refresh rate in their feature sets, especially as games start to support HDR technologies (i.e. HDR10, Dolby Vision, FreeSync 2 tone-mapping). Implementing those overlapping features has much more to do with the panel than with the VRR hardware/specification, which has become the de facto identifier of a modern gaming monitor. The goal is no longer summarized by ‘faster frames filled with more pixels’ and becomes more difficult to communicate, let alone market, to consumers. And this has much to do with where G-Sync (and VRR) started and what it is now aspiring to be.



From G-Sync Variable Refresh To G-Sync HDR Gaming Experience

The original FreeSync and G-Sync were solutions to a specific and longstanding problem: fluctuating framerates would cause either screen tearing or, with V-Sync enabled, stutter/input lag. The result of VRR has been a considerably smoother experience in the 30 to 60 fps range. And an equally important benefit was compensating for dips and peaks over the wide ranges introduced with higher refresh rates like 144Hz. So they were very much tied to a single specification that directly described the experience, even if the numbers sometimes didn’t do the experience justice.

Meanwhile, HDR in terms of gaming is a whole suite of things that essentially allows for greater brightness, blacker darkness, and better/more colors. More importantly, this requires developer support for applications and production of HDR content. The end result is not nearly as static as VRR, as much depends on the game’s implementation – or in NVIDIA’s case, sometimes with Windows 10’s implementation. Done properly, even with simply better brightness, there can be perceived enhancements with colorfulness and spatial resolution, which are the Hunt effect and Stevens effect, respectively.

So we can see why both AMD and NVIDIA are pushing the idea of a ‘better gaming experience’, though NVIDIA is explicit about this with G-Sync HDR. The downside of this is that the required specifications for both FreeSync 2 and G-Sync HDR certifications are closed off and only discussed broadly, deferring to VESA’s DisplayHDR standards. Their situations, however, are very different. For AMD, their explanations are a little more open, and outside of HDR requirements, FreeSync 2 also has a lot to do with standardizing SDR VRR quality with mandated LFC, wider VRR range, and lower input lag. Otherwise, they’ve also stated that FreeSync 2’s color gamut, max brightness, and contrast ratio requirements are broadly comparable to those in DisplayHDR 600, though the HDR requirements do not overlap completely. And with FreeSync/FreeSync 2 support on Xbox One models and upcoming TVs, FreeSync 2 appears to be a more straightforward specification.

For NVIDIA, their push is much more general and holistic with respect to feature standards, and more purely focused on the specific products. At the same time, they discussed the need for consumer education on the spectrum of HDR performance. While there are specific G-Sync HDR standards as part of their G-Sync certification process, those specifications are only known to NVIDIA and the manufacturers. Nor was much detail provided on minimum requirements outside of HDR10 support, peak 1000 nits brightness, and unspecified coverage of DCI-P3 for the 4K G-Sync HDR models, with NVIDIA citing their certification process and deferring detailed capabilities to whatever other certifications G-Sync HDR monitors may have – in this case, UHD Alliance Premium and DisplayHDR 1000 certifications for the Asus PG27UQ. Which is to say that, at least for the moment, the only G-Sync HDR displays are those that adhere to some very stringent standards; there aren't any monitors under this moniker that offer limited color gamuts or subpar dynamic contrast ratios.

At least with UHD Premium, the certification is specific to 4K resolution, so while the announced 65” 4K 120Hz Big Format Gaming Displays almost surely will be eligible, the 35” curved 3440 × 1440 200Hz models won’t. Practically speaking, all the capabilities of these monitors are tied to the AU Optronics panels inside them, and we know that NVIDIA worked closely with AUO as well as the monitor manufacturers. As far as we know, those AUO panels are only coupled with G-Sync HDR displays, and vice versa. No other standardized specification was disclosed, with NVIDIA only referring back to their own certification process and the ‘ultimate gaming display’ ideal.

As much as NVIDIA mentioned consumer education on the HDR performance spectrum, the consumer is hardly any more educated on a monitor’s HDR capabilities with the G-Sync HDR branding. Detailed specifications are left to monitor certifications and manufacturers, which is the status quo. Without a specific G-Sync HDR page, NVIDIA lists G-Sync HDR features under the G-Sync page, and while those features are specified as G-Sync HDR, there is no explanation on the full differences between a G-Sync HDR monitor and a standard G-Sync monitor. The NVIDIA G-Sync HDR whitepaper is primarily background on HDR concepts and a handful of generalized G-Sync HDR details.

For all intents and purposes, G-Sync HDR is presented not as specification or technology but as branding for a premium product family, and right now for consumers it is more useful to think of it that way.



When DisplayPort 1.4 Isn’t Enough: Chroma Subsampling

One of the key elements that even makes G-Sync HDR monitors possible – and yet still holds them back at the same time – is the amount of bandwidth available between a video card and a monitor. DisplayPort 1.3/1.4 increased this to just shy of 26Gbps of video data, which is a rather significant amount of data to send over a passive, two-meter cable. Still, the combination of high refresh rates, high bit depths, and HDR metadata pushes the bandwidth requirements much higher than DisplayPort 1.4 can handle.

All told, DisplayPort 1.4 was designed with just enough bandwidth to support 3840x2160 at 120Hz with 8bpc color, coming in at 25.81Gbps out of 25.92Gbps of available bandwidth. Notably, this isn’t enough bandwidth for any higher refresh rates, particularly not 144Hz. Meanwhile, when using HDR paired with the P3 color space, where you’ll almost certainly want 10bpc color, there’s only enough bandwidth to drive the panel at 98Hz.

DisplayPort Bandwidth

| Standard | Raw Bandwidth | Effective Bandwidth |
|---|---|---|
| DisplayPort 1.1 (HBR1) | 10.8 Gbps | 8.64 Gbps |
| DisplayPort 1.2 (HBR2) | 21.6 Gbps | 17.28 Gbps |
| DisplayPort 1.3/1.4 (HBR3) | 32.4 Gbps | 25.92 Gbps |
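To illustrate where these limits come from, here is a quick back-of-the-envelope sketch of the underlying timing math. The blanking figures are assumptions loosely based on VESA's reduced-blanking timings (actual negotiated timings vary), so treat the output as approximate rather than authoritative:

```python
# Back-of-the-envelope DisplayPort bandwidth math (a sketch, not a full
# CVT-R2 implementation). H/V blanking figures are assumptions roughly in
# line with VESA reduced blanking; real negotiated timings differ slightly.

H_BLANK, V_BLANK = 80, 126           # assumed blanking for 3840x2160 modes
DP14_EFFECTIVE_GBPS = 25.92          # HBR3 x4 lanes after 8b/10b overhead

def required_gbps(h, v, refresh_hz, bits_per_color, components):
    """components: 3.0 for RGB/4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0."""
    pixel_clock = (h + H_BLANK) * (v + V_BLANK) * refresh_hz
    return pixel_clock * bits_per_color * components / 1e9

modes = [
    ("4K 120Hz  8bpc RGB/4:4:4", 120,  8, 3.0),
    ("4K 144Hz  8bpc RGB/4:4:4", 144,  8, 3.0),
    ("4K 144Hz 10bpc RGB/4:4:4", 144, 10, 3.0),
    ("4K 144Hz 10bpc 4:2:2    ", 144, 10, 2.0),
]
for name, hz, bpc, comp in modes:
    need = required_gbps(3840, 2160, hz, bpc, comp)
    verdict = "fits" if need <= DP14_EFFECTIVE_GBPS else "exceeds DP 1.4"
    print(f"{name}: {need:5.2f} Gbps -> {verdict}")
```

Notably, 144Hz at 10bpc with 4:2:2 needs exactly the same bandwidth as 120Hz at 8bpc RGB (20 bits × 144 = 24 bits × 120), which is why the 4:2:2 mode slots so neatly into DP 1.4's limits.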

As a result, for this first generation of monitors at least, NVIDIA has resorted to a couple of tricks to make a 144Hz 4K monitor work within the confines of current display technologies. Chief among these is support for chroma subsampling.

Chroma subsampling is a term that has become a little better known in the last few years, but the odds are most PC users have never really noticed the technique. In a nutshell, chroma subsampling is a means to reduce the amount of chroma (color) data in an image, allowing images and video data to either be stored in less space or transmitted over constrained links. I’ve seen it referred to as compression at some points, and while the concept is indeed similar, it’s important to note that chroma subsampling doesn’t try to recover lost color information, nor does it even intelligently discard color information; it’s perhaps better thought of as a semi-graceful means of throwing out color data. In any case, the use of chroma subsampling is as old as color television; however, its use in anything approaching mainstream monitors is much newer.

So how does chroma subsampling work? To understand chroma subsampling, it’s important to understand the Y'CbCr color space it operates on. As opposed to tried and true (and traditional) RGB – which stores the intensity of each color subpixel in a separate channel – Y'CbCr instead stores luma (light intensity) and chroma (color) separately. While the transformation process is not important, at the end of the day you have one channel of luma (Y) and two channels of color (CbCr), which add up to an image equivalent to RGB.

Chroma subsampling, in turn, is essentially a visual hack on the human visual system. Humans are more sensitive to luma than chroma, so as it goes, some chroma information can be discarded without significantly reducing the quality of an image.

The technique covers a range of different patterns, but by far the most common, in order of image quality, are 4:4:4, 4:2:2, and 4:2:0. 4:4:4 is a full chroma image, equivalent to RGB. 4:2:2 is a half chroma image that discards half of the horizontal color information, and requires just 66% of the data of 4:4:4/RGB. Finally, 4:2:0 is a quarter chroma image, which discards half of the horizontal and half of the vertical color information. In turn it achieves a full 50% reduction in the amount of data required versus 4:4:4/RGB.


Wikipedia: diagram on chroma subsampling
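To make the data-reduction arithmetic concrete, here is a minimal NumPy sketch of the Y'CbCr transform (BT.709 coefficients) followed by 4:2:0 decimation. It is illustrative only, as real encoders filter the chroma planes rather than simply discarding samples:

```python
# A minimal sketch of 4:2:0 chroma subsampling with NumPy, using the
# BT.709 RGB -> Y'CbCr coefficients. Illustrative only: real encoders
# low-pass filter the chroma planes rather than naively dropping samples.
import numpy as np

def rgb_to_ycbcr709(rgb):
    """rgb: float array in [0, 1], shape (H, W, 3) -> (Y', Cb, Cr) planes."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # luma (light intensity)
    cb = (b - y) / 1.8556                      # blue-difference chroma
    cr = (r - y) / 1.5748                      # red-difference chroma
    return y, cb, cr

def subsample_420(chroma):
    """Keep one chroma sample per 2x2 block (half horizontal, half vertical)."""
    return chroma[::2, ::2]

rgb = np.random.rand(720, 1280, 3)             # stand-in for a video frame
y, cb, cr = rgb_to_ycbcr709(rgb)
cb420, cr420 = subsample_420(cb), subsample_420(cr)

full = y.size + cb.size + cr.size              # 4:4:4 sample count
sub = y.size + cb420.size + cr420.size         # 4:2:0 sample count
print(f"4:2:0 keeps {sub / full:.0%} of the 4:4:4 data")  # -> 50%
```

Swapping the `[::2, ::2]` decimation for `[:, ::2]` gives the 4:2:2 case, which keeps two-thirds of the data, matching the 66% figure above.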

In the PC space, chroma subsampling is primarily used for storage purposes. JPEG employs various modes to save on space, and virtually every video you’ve ever seen, from YouTube to Blu-rays, has been encoded with 4:2:0 chroma. In practice chroma subsampling is bad for text because of the fine detail involved – which is why PCs don’t use it for desktop work – but for images it works remarkably well.

Getting back to the matter of G-Sync then, the same principle applies to bandwidth savings over the DisplayPort connection. If DP 1.4 can only deliver enough bandwidth to get to 98Hz with RGB/4:4:4 subsampling, then going down one level, to 4:2:2, can free up enough bandwidth to reach 144Hz.

Users, in turn, are given a choice between the two options. When using HDR, they can either stick with a 98Hz refresh rate and full 4:4:4 chroma, or drop to 4:2:2 for 144Hz.

In practice, for desktop usage most users are going to be running without HDR due to Windows’ shaky color management, so the issue is moot and they can run at 120Hz without any colorspace compromises. It’s in games and media playback where HDR will be used, and at that point the quality tradeoffs for 4:2:2 subsampling will be less obvious, or so NVIDIA’s reasoning goes. Adding an extra wrinkle, even on an RTX 2080 Ti, few high-fidelity HDR-enabled games will be able to pass 98fps to begin with, so the higher refresh rate isn’t likely to be needed right now. Still, if you want HDR and access to 120Hz+ refresh rates – or SDR and 144Hz for that matter – then there are tradeoffs to be made.

On that note, it’s worth pointing out that to actually go past 120Hz, the current crop of G-Sync HDR monitors require overclocking. This appears to be a limitation of the panel itself; with 4:2:2 subsampling there’s enough bandwidth for 144Hz even with HDR, so it’s not another bandwidth limitation that’s stopping these monitors at 120Hz. Rather the purpose of overclocking is to push the panel above its specifications (something it seems plenty capable of doing), allowing the panel to catch up with the DisplayPort connection to drive the entire device at 144Hz.

Meanwhile on a quick tangent, I know a few people have asked why NVIDIA hasn’t used the VESA’s actual compression technology, Display Stream Compression (DSC). NVIDIA hasn’t officially commented on the matter, and I don’t really expect they will.

However from talking to other sources, DSC had something of a rough birth. The version of the DSC specification used in DP 1.4 lacked support for some features manufacturers wanted like 4:2:0 chroma subsampling, while DP1.4 itself lacked a clear definition of how Forward Error Correction would work with DSC. As a result, manufacturers have been holding off on supporting DSC. To that end, the VESA quietly released the DisplayPort 1.4a specification back in April to resolve the issue, with the latest standard essentially serving as the “production version” of DisplayPort with DSC. As a result, DSC implementation and adoption is just now taking off.

As NVIDIA controls the entire G-Sync HDR ecosystem, they aren’t necessarily reliant on common standards. Nonetheless, if DSC wasn’t in good shape to use in 2016/2017 when G-Sync HDR was being developed, then it’s as good a reason as any that I’ve heard for why we’re not seeing G-Sync HDR using DSC.



The (Asus) G-Sync HDR Experience: Premium Panel for Premium Price

In the end, gamers are given the ultimate guidance with the price point: $2000. The cost doesn't pull any punches, and while it may not be explicitly communicated to consumers, the price is all about the panel functionality, while everything else takes a backseat. Though we can only say this directly about the Asus PG27UQ, this is presumably the case for Acer's Predator X27, which shares the connectivity, large physical design, and active cooling setup.

Some of this is out of Asus's hands, particularly the G-Sync HDR module's capabilities and limitations, which they can only package up and support as best they can. Manufacturers on the display design side are limited in how far they can expand the basic range of use of G-Sync HDR. Some aspects are even out of NVIDIA's hands when it comes to HDR support in the OS, which goes back to Windows' historically poor management of anything non-SDR and non-sRGB; had the monitors been ready before the Windows 10 April 2018 Update, ease-of-use would have been a big issue.

As one of two current G-Sync HDR implementations, the Asus PG27UQ is also just one of three VESA DisplayHDR 1000 certified products, alongside the Acer counterpart and a Philips 4K display, and one of three UHDA Premium certified monitors, alongside two proviz monitors. So by certifications alone, it would be one of the best HDR PC monitors on the consumer market, G-Sync or otherwise. It seems more likely than not that the 35-inch and 65-inch models are not imminently ready, although resolving firmware issues with FALD backlighting should be a shared investment between them. But for now, G-Sync HDR can only truly stretch its legs in a niche case: single-monitor, non-silent PC gaming with HDR titles on NVIDIA G-Sync HDR supporting hardware powerful enough to hit a 4Kp144 target. The last bit is already niche on its own: the GeForce GTX 1080 Ti was the first card to really hit 60fps on no-compromises 4K, and both AMD and NVIDIA have stepped back from multi-GPU and multi-card solutions.

As an aside, we now know 144fps is perhaps even further out, given that NVIDIA's next generation GeForce RTX 2080 Ti is more-or-less in the Titan V gaming performance bracket, which is to say it's only about 37% faster than the GTX 1080 Ti. The majority of this review was done prior to the RTX 2080 Ti and RTX 2080 launch, but their arrival doesn't fundamentally alter the core premise of 4Kp144 being out of reach.

And when you're paying more dollars than most people have horizontal pixels on their screen, especially when that price is baked into that use case, that niche becomes extremely relevant. There's no price tiering right now in terms of non-4K G-Sync HDR or non-HDR 4Kp144 G-Sync, so pursuing either combination still leaves you at the $2000 price point. So let's find out if the prospect of playing PC games with the cutting edge of 2018 visuals measures up.



Physical Design and Features

Starting with appearances, the design of the PG27UQ is, to put it colloquially, quite thick. This would typically be an area where different companies can distinguish between similar products, in this case meaning Asus and Acer, but the dimensions are a consequence of the AUO LCD/backlight unit, the G-Sync HDR module, and the HSF assembly.

And yes, a heatsink and fan. Topping the bulkiness off is the aforementioned active cooling, with the fan located behind the stand/VESA mount point. The fan's behavior is not really documented, but it runs during standby and sometimes when the monitor is powered off - the latter behavior had me slightly confused on first use, as the fan spun up once I plugged in the monitor. Thankfully, the noise levels are low enough that it should only be a concern for fanless/silent configurations.


PCPerspective's photo of the FPGA module and fan

A teardown by PCPer revealed an Altera Arria 10 GX 480 FPGA with 3GB DDR4-2400 RAM, a substantial upgrade from the Altera Arria V GX FPGA with 768MB DDR3L of the original G-Sync module. NVIDIA stated that, like previous iterations, the module does not replace the TCON and does not support VESA DSC. The latter has been suggested as a solution to the bandwidth limitations of combined high res/high refresh/HDR, and we know that DisplayPort 1.4 includes the DSC standard. Implementing DSC for G-Sync may or may not add latency, but NVIDIA probably explored that option before going with the current implementation of chroma subsampling past 98Hz.

More interestingly, NVIDIA also mentioned that the G-Sync HDR module uses eDP to interface with the LCD panel, as opposed to the first generation’s LVDS, which is an aged standard nowadays. In general, eDP provides higher bandwidth, requiring fewer PCB traces and signal wires overall, and so consumes less power. Except in this case, the overall power usage and/or heat generation requires a blower fan.

It’s been reported that the 27-inch panel will come in a non-HDR variant without the FALD backlight, but the price reduction is harder to guess, since the G-Sync HDR module and quantum dot film would likely still be used. The panel will presumably have an eDP interface, which wouldn’t be compatible with the LVDS-only capability of the first generation G-Sync modules. At the least, there likely wouldn’t be a need for active cooling anymore.

So in contrast with the modern trend of smaller screen borders, the PG27UQ's bezels are noticeable at around 15mm on the sides and around 20mm on the top and bottom. The three-point stand is large, and the unit as a whole is on the heavier side, at a little over 20 pounds. The stand actually allows for underside LEDs, which can project a logo on the desk below, and the monitor comes with customizable blank plastic covers for this purpose. This falls under the "LIGHT IN MOTION" OSD option, while a separate "Aura RGB" option governs LEDs for the ROG logo at the back of the stand. Alternatively, Aura Sync can be enabled to control the "Aura RGB" lighting.

Similarly, the ROG logo can be projected rearwards by the "ROG Light Signal," the last bit in the monitor's bling kit. The power LED also turns red, but this is to indicate that the monitor is in G-Sync mode; it is white during standard operation and amber during standby.

At the top of the monitor is an ambient light sensor, which works with the auto-adjusting SDR brightness ('Auto SDR Brightness') and black level ('Auto Black Level') settings in the OSD.

Connectivity is as minimal as it gets without being a big issue: 1 × DisplayPort 1.4, 1 × HDMI 2.0, an audio jack, and a 2-port USB 3.0 hub. By the standards of a premium monitor, it’s certainly not ideal; even if the panel specifications and features are the main attraction over connectivity, the $2000 price point hardly suggests minimal connections. The configuration is identical to Acer's X27, so I'm not sure there was much Asus could do, unless the reasoning was primarily about margins (if so, then it might indicate that development/panel/module expenses are higher).

The stand and mount combine to offer a good range of adjustment options.

In terms of the on-screen display (OSD), the PG27UQ comes with several SDR picture mode presets under 'GameVisual', courtesy of its GameVisual Video Intelligence technology. The modes are as follows:

  • Racing (default): intended for input lag reduction
  • Scenery: intended for more contrast gradations. Also sets monitor to 100% brightness and locks gamma and Dark Boost (auto gamma curve adjustment)
  • Cinema: intended for saturated and cool colors. Also sets monitor to 'Cool' color temperature and locks gamma and Dark Boost
  • RTS/RPG: intended to enhance contrast sharpness and color saturation. Also sets gamma to 2.4 and Dark Boost to Level 1
  • FPS: intended for higher contrast. Also sets Level 3 Dark Boost
  • sRGB: intended for viewing photos and graphics on PCs. Also locks color temperature, brightness, contrast, and gamma

Switching to HDR mode disables GameVisual, and locks gamma, dark boost, variable backlight, and Auto SDR Brightness.

Meanwhile, a separate 'GamePlus' button brings up options for gaming-oriented OSD overlays: crosshair, timer, FPS counter, and screen alignment markers for multi-monitor setups.



Brightness and Contrast

Going straight into the brightness and contrast metrics, we know HDR's calling card is permitting those bright whites and dark blacks. For DisplayHDR 1000 and UHD Premium, the requirement is 1000 nits (the nit being a common non-SI name for cd/m²). So for white level brightness, it's no surprise that in HDR mode the PG27UQ reaches that coveted mark.

Outside of HDR, brightness is also useful in gauging visibility/usability under bright and direct ambient light, e.g. outdoors. For moderately lit indoor use, the typical 200 to 300 nit range of desktop monitors is more than sufficient. In terms of factory defaults, the PG27UQ is set at 80 for brightness, which is around 266 nits. 1000 nits is much too bright for day-to-day usage, as is 500 nits.

White Level -  i1DisplayPro
*In HDR mode, there is an adjustable 'reference white' setting, defaulted at 80 nits, instead of a brightness setting. At that default setting, the PG27UQ displayed the HDR test pattern at 1032 nits.

Because HDR has a static 'reference white' level instead of a brightness setting, there isn't really an equivalent to a minimum brightness white level, as it isn't utilized in the same way; on the PG27UQ, reference white can be set between 20 and 300 nits.

Otherwise, enabling the Windows 10 'HDR and Wide Color Gamut' mode puts the monitor into its HDR mode; additionally, that Windows 10 setting provides a brightness slider governing SDR content while in HDR mode. At 100% SDR brightness, the display pushes past 500 nits. For users, this works nicely to avoid SDR dimness when HDR is enabled, while also providing the option to boost to much higher brightness if desired.

Black Level - i1DisplayPro
*Represents black levels corresponding to the default 'reference white' setting of 80 nits.

IPS-type panels are often known for their 'backlight bleed' and thus relatively higher black levels. Without its local dimming capability, the PG27UQ isn't much of an exception. Enabling variable backlighting (FALD) in the OSD brings the black levels to HDR-tier performance, and it can be enabled in SDR mode at the cost of maximum white levels.

Contrast Ratio -  i1DisplayPro
*Contrast ratios calculated from default reference white of 80 nits

The good range between bright whites and dark blacks translates into high contrast ratios for the PG27UQ. While we don't have any other HDR monitors for comparison, the contrast ratios are really in a class of their own, especially as only HDR content will take advantage of the brightness and be (properly) displayed. Otherwise, in pure SDR mode, the PG27UQ resembles a solid IPS-type SDR monitor. Just enabling the direct LED backlighting in SDR mode improves contrast considerably on the strength of its black levels.
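Since contrast ratio is simply the measured white level divided by the measured black level, the arithmetic behind these charts is easy to reproduce. The readings below are hypothetical stand-ins in the spirit of the measurements above, not our exact data:

```python
# Contrast ratio = white luminance / black luminance. The example readings
# below are hypothetical placeholders, not the review's measured values.
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

print(f"SDR, FALD off: {contrast_ratio(266, 0.26):,.0f}:1")   # ~1000:1, IPS-typical
print(f"HDR, FALD on:  {contrast_ratio(1032, 0.02):,.0f}:1")  # tens of thousands:1
```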



SDR Color Modes: sRGB and Wide Gamut

Pre-calibration and calibration steps are done with SpectraCal’s CalMAN 5 suite. For contrast and brightness, the X-Rite i1DisplayPro colorimeter is used, but for the actual color accuracy readings we use the X-Rite i1Pro spectrophotometer. Pre-calibration measurements were done at 200 nits for sRGB and Wide Gamut, with gamma set to 2.2.

The PG27UQ comes with two color modes for SDR input: 'sRGB' and 'Wide Gamut.' Advertised as DCI-P3 coverage, the actual 'Wide Gamut' sits somewhere between DCI-P3 and BT.2020, which is right in line with the minimum coverage required by DisplayHDR 1000 and UHD Premium. That being the case, the setting isn't directly calibrated to a single color gamut, unlike sRGB.

Out-of-the-box, the monitor defaults to 8 bits per color, which can be changed in NVIDIA Control Panel. Either way, sRGB accuracy is very good, as the monitor comes factory-calibrated. To note, 10bpc for the PG27UQ is with dithering (8bpc+FRC).


SpectraCal CalMAN sRGB color space for PG27UQ, with out-of-the-box default 8bpc (top) and default with 10bpc (bottom)

In 8bpc or 10bpc, the average delta E is around 1.5, which corresponds with the included factory calibration result of 1.62; for reference, a dE below 1.0 is generally imperceptible, while a dE below 3.0 is considered accurate.
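For readers curious how a delta E number is derived, the sketch below uses the simple CIE76 formula, which is just the Euclidean distance between the measured and reference colors in CIELAB space. CalMAN typically reports a more elaborate metric such as dE2000, and the patch values here are hypothetical, so treat this as a conceptual illustration rather than a reproduction of our results:

```python
# CIE76 delta E: Euclidean distance between two CIELAB colors. The Lab
# values below are hypothetical examples, not measured data.
import math

def delta_e_cie76(lab1, lab2):
    return math.dist(lab1, lab2)

measured = (54.2, 80.1, 66.5)   # hypothetical measured red patch (L*, a*, b*)
reference = (53.2, 80.1, 67.2)  # hypothetical sRGB target for that patch
print(f"dE = {delta_e_cie76(measured, reference):.2f}")
# Rule of thumb: dE < 1.0 is imperceptible; dE < 3.0 is considered accurate.
```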

SpectraCal CalMAN DCI-P3 (above) and BT.2020 (below) color spaces for PG27UQ, on default settings with 10bpc and 'wide color gamut' enabled under SDR Input

The 'wide gamut' option is not mapped to either DCI-P3 or BT.2020, sitting somewhere in between, but then again, it doesn't need to be, the way a professional or prosumer monitor would.

Grayscale and Saturation

Looking at color accuracy more thoroughly, we turn to grayscale and saturation readings with respect to the sRGB gamut. The gamma tracking isn't perfect, showing some dips, and the white points are a little on the warm side.

SpectraCal CalMAN sRGB color space grayscales with out-of-the-box default 8bpc (top) and default with 10bpc (bottom)

The saturation numbers are better, and in fact the dE is around 1.4 to 1.5, which is impressive for a gaming monitor.

SpectraCal CalMAN sRGB color space saturation sweeps for PG27UQ, with out-of-the-box default 8bpc (top) and default with 10bpc (bottom)

 

Gretag Macbeth (GMB) and Color Comparator

The last color accuracy test is the most thorough, and again the PG27UQ shines, with dE's of 1.53 and 1.63.

SpectraCal CalMAN sRGB color space GMB for PG27UQ, with out-of-the-box default 8bpc (top) and default with 10bpc (bottom)

Considering that this monitor was not designed for professional use, it's very well calibrated out-of-the-box for gamers, and there's little need for further calibration. If anything, users should just be sure to select 10bpc in the NVIDIA Control Panel, though even then most games use 8bpc anyhow.

SpectraCal CalMAN sRGB relative color comparator graphs for PG27UQ, with out-of-the-box default 8bpc (top) and default with 10bpc (bottom). Each color column is split into halves; the top half is the PG27UQ's reproduction and the bottom half is the correct value



HDR Color And Luminance

New to our monitor workflow, we've utilized some of CalMAN's recent HDR Analysis additions, again in the SpectraCal suite. As I don't currently have a hardware HDR test pattern generator, we've utilized madTPG as the HDR pattern generator. To note, in SDR mode the monitor reports an "SDR-BT1886" EOTF (gamma) instead of "SDR-sRGB" when in YCbCr mode, but in HDR mode the EOTF is "HDR-ST2084".
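For reference, "HDR-ST2084" is the SMPTE ST 2084 perceptual quantizer (PQ) used by HDR10. As a sketch of what that EOTF actually does, the function below maps a normalized 0-1 code value to absolute luminance in nits, using the constants from the ST 2084 specification:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized code value -> absolute luminance.
def st2084_eotf(code: float) -> float:
    m1 = 2610 / 16384          # = 0.1593017578125
    m2 = 2523 / 4096 * 128     # = 78.84375
    c1 = 3424 / 4096           # = 0.8359375
    c2 = 2413 / 4096 * 32      # = 18.8515625
    c3 = 2392 / 4096 * 32      # = 18.6875
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# PQ allocates code values perceptually: roughly 75% of the range already
# covers 0-1000 nits, leaving the rest for highlights up to 10,000 nits.
print(f"{st2084_eotf(0.75):.0f} nits")  # ~984
```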

Checking back on HDR brightness, we see that the peak brightness is around 1236 nits, reached when a 10% area of the screen was white. So there's definitely room to spare with respect to the 1000 nit minimum requirement.


SpectraCal CalMAN

The HDR gamut coverage matches what we saw with the SDR 'wide gamut' mode, confirming that it covers around 76% BT.2020 and 92% DCI-P3.


SpectraCal CalMAN

CalMAN originally reports these metrics in delta ICtCp, which reportedly describes HDR errors more accurately, but they are presented here in delta E for ease of comparison. The HDR10 media profile targets the BT.2020 space, so it's important to see how well devices designed for the more limited P3 gamut do inside this subset. In that respect the PG27UQ does quite well, with dE's below 3 on color saturation sweeps.


SpectraCal CalMAN

As for chroma subsampling, the 4:2:2 mode kicks in when in HDR mode at 120Hz or above. The lack of full horizontal color data can be noticed in certain cases, such as colored text on colored backgrounds, where text can become blurry.


4:2:2 (top) and 4:4:4 (bottom) as seen on 12 point font



HDR Gaming Impressions

In the end, a monitor like the PG27UQ is really destined for one purpose: gaming. And not just any gaming, but the type of quality experience that does not compromise between resolution and refresh rate, let alone HDR and VRR.

That being said, not many games support HDR, which for G-Sync HDR means HDR10 support. Even for games that do support an HDR standard of some kind, the quality of the implementation naturally varies from developer to developer. And because of console HDR support, some games only feature HDR in their console incarnations.

The other issue is that the HDR gaming experience is hard to communicate objectively. In-game screenshots won't replicate how the HDR content is delivered on the monitor with its brightness, backlighting, and wider color gamut, while photographs are naturally limited by the capturing device. And any HDR content will obviously be limited by the viewer's display. On our side, this makes it all too easy to gush generally about glorious HDR vibrance and brightness, especially as on-the-fly blind A/B testing is not so simple (duplicated SDR and HDR output is not currently possible).

As for today, we are looking at Far Cry 5 (HDR10), F1 2017 (scRGB HDR), Battlefield 1 (HDR10), and Middle-earth: Shadow of War (HDR10), which covers a good mix of genres and graphics intensity. Thanks to in-game benchmarks for three of them, they also provide a static point of reference; in the same vein, Battlefield 1's presence in the GPU game suite means I've seen and benchmarked the same sequence enough times to dream about it.

For such subjective-but-necessary impressions like these, we'll keep ourselves grounded by sticking to a few broad questions:

  • What differences are noticeable from 4K with non-HDR G-Sync?
  • What differences are noticeable from 4:4:4 to 4:2:2 chroma subsampling at 98Hz?
  • What about lowering resolution to 1440p HDR or lowering details with HDR on, for higher refresh rates? Do I prefer HDR over high refresh rates?
  • Are there any HDR artifacts? e.g. halo effects, washed out or garish colors, blooming due to local dimming

The 4K G-Sync HDR Experience

From the beginning, we expected that targeting 144fps at 4K was not really plausible for graphically intense games, and that still holds true. On a reference GeForce GTX 1080 Ti, none of the games averaged past 75fps, and even the brand-new RTX 2080 Ti won't come close to doubling that.

Ubiquitous black loading and intro screens make the local dimming bloom easily noticeable, though this is a commonly known phenomenon and somewhat unavoidable. The majority of the time, it is fairly unobtrusive. Because the local backlighting zones can only get so small on LCD displays – in the case of this monitor, each zone is roughly 5.2 cm² in area – anything that is smaller than the zone will still be lit up across the zone. For example, a logo or loading throbber on a black background will have a visible glow around it. The issue is not specific to the PG27UQ, only that the higher maximum brightness makes it a little more obvious. One of the answers to this is OLED, where subpixels are self-emitting and thus lighting can be controlled on an individual subpixel basis, but because of burn-in it's not well suited to PCs.
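The ~5.2 cm² figure follows directly from the panel geometry. As a quick sketch, assuming the 384 zones evenly tile a 27-inch 16:9 panel:

```python
# Approximate per-zone area for a 384-zone FALD backlight on a 27" 16:9 panel.
import math

diag_cm = 27.0 * 2.54                        # 27-inch diagonal in cm
scale = diag_cm / math.hypot(16, 9)          # cm per aspect-ratio unit
width_cm, height_cm = 16 * scale, 9 * scale  # ~59.8 x ~33.6 cm

zone_area = width_cm * height_cm / 384
print(f"{width_cm:.1f} x {height_cm:.1f} cm -> {zone_area:.1f} cm² per zone")
# -> 59.8 x 33.6 cm -> 5.2 cm² per zone
```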


Loading throbbers for Shadow of War (left) and Far Cry 5 (right) with the FALD haloing effect

Much has been said about describing the sheer brightness range, but the closest analogy that comes to mind is dialing up smartphone brightness to maximum after a day of nursing a low battery at 10% brightness. It's still up to the game to take full advantage of it with HDR10 or scRGB. Some games will also offer to set gamma, maximum brightness, and/or reference white levels, thereby allowing you to adjust the HDR settings to the brightness capability of the HDR monitor.

The most immediate takeaway is the additional brightness and how fast it can ramp up. The former has a tendency to make things clearer and more colorful - the Hunt effect in play, essentially. The latter is very noticeable in transitions, such as sudden sunlight, looking up at the sky, and changes in lighting. Of course, the extra color vividness works hand-in-hand with the better contrast ratios, but again this can be game- and scene-dependent; Far Cry 5 seemed to fare the best in that respect, though Shadow of War, Battlefield 1, and F1 2017 still looked better than in SDR.

In-game, I couldn't perceive any quality differences going from 4:4:4 to 4:2:2 chroma subsampling, though the games couldn't push past 98Hz at 4K anyway. So at 50 to 70fps averages, the experience reminded me more of a 'cinematic' one, because HDR made the scenes look more realistic and brighter while the gameplay was the typical pseudo-60fps VRR experience. With that in mind, it would probably be better for exploration-heavy games where you 'stop-and-look' a lot - and unfortunately, we don't have Final Fantasy XV at the moment to try out. NVIDIA themselves say that increased luminance actually increases the perception of judder at low refresh rates, but luckily VRR mitigates judder in the first place.

4K HDR Gaming Performance - GTX 1080 Ti & ROG Swift PG27UQ

What was interesting to observe was a performance impact with HDR enabled (and with G-Sync off) on the GeForce GTX 1080 Ti, which seems to corroborate last month's findings by ComputerBase.de. For the GTX 1080 Ti, Far Cry 5 was generally unaffected, but Battlefield 1 and F1 2017 took clear performance hits, appearing to stem from 4:2:2 chroma subsampling in HDR (YCbCr422). Shadow of War also seemed to fare worse. Our early results also indicate that even HDR with full 4:4:4 chroma (RGB444) may incur a slight performance hit in affected games.

It's not clear what the root cause is, and we'll be digging deeper as we revisit the GeForce RTX 20-series. Taking a glance at the RTX 2080 Ti Founders Edition, the performance hit of 4:2:2 subsampling is reduced to negligible margins in these four games.

4K HDR Gaming Performance - GeForce RTX 2080 Ti FE & PG27UQ

On Asus' side, the monitor does everything that is asked of it: it comfortably reaches 1000 nits, and as long as FALD backlighting is enabled, the IPS backlight bleed is pretty much non-existent. There was no observed ghosting or other such artifacts.

The other aspect of the HDR gaming 'playflow' is that enabling HDR can be slightly different per game, and Alt-Tabbing is hit-or-miss - that is on Microsoft/Windows 10, not on Asus - but it's certainly much better than before. For example, Shadow of War had no in-game HDR toggle and relied on the Windows 10 toggle. And with G-Sync and HDR now in the mix, adjusting resolution and refresh rate (and for 144Hz, putting the monitor in OC mode) to get the exact desired configuration can be a bit of a headache. Lowering the in-game resolution to 1440p while keeping G-Sync HDR and the 144Hz OC mode, for instance, was very finicky.

At the end of the day, not all HDR is made equal, which goes for the game-world and scene construction in addition to HDR support. So although the PG27UQ is up to the task, you may not see the full range of color and brightness translated into a given game, depending on its HDR implementation. I would strongly recommend visiting a brick-and-mortar outlet that offers an HDR demo, or looking into the specific HDR content that you would want to play or watch.



Display Uniformity and Power Usage

Especially with its localized dimming, the PG27UQ's panel uniformity was solid. In the default out-of-the-box configuration (FALD enabled), the maximum local difference in white levels is around 5% of the center brightness.

Black levels were more uneven, with a general trend of brighter blacks towards the top and darker blacks towards the bottom.

Color reproduction across the panel, however, is excellent, with variations virtually imperceptible between different parts of the display.

Power Use

As far as power usage goes, the PG27UQ is specified for a peak of 180W with HDR on. Stand-by was specified at 0.5W, but in practice the monitor often idled around 27W for some time in the power-off state before finally dropping to sub-1W power draw. The fan remains on during that time, and it's not exactly clear how this state is governed.

Power Draw (Wall Measurements)

With G-Sync and HDR enabled, readings of around 150W to 160W were observed during gaming, with a peak of 162W. In SDR mode, power consumption is more-or-less in line with typical monitors.



Closing Thoughts

Bringing this review to a close, the ROG Swift PG27UQ has some subtleties, as it is just as much a ‘G-Sync HDR’ monitor as it is an ROG Swift 4Kp144 HDR monitor. In terms of panel quality and color reproduction, the PG27UQ is excellent by our tests. As a whole, the monitor comes with some slight design compromises: bulkiness, active cooling, and limited connectivity. However, those aspects aren’t severe enough to be dealbreakers except in very specific scenarios, such as silent PC setups. Given the pricing and capabilities, the PG27UQ is destined to be paired with the highest-end graphics cards; for a 4K 144Hz target, multi-GPU with SLI is the only – and pricey – solution for more intensive games.

And on that note, therein lies the main nuance with the PG27UQ. The $2000 price point is firmly in the ultra-high-end based on the specific combination of functionalities that the display offers: 4K, 144Hz, G-Sync, and DisplayHDR 1000-level HDR.

For ‘price is no object’ types, this is hardly a concern if the ROG Swift PG27UQ can hit all those marks well – and it does. But if price is at least somewhat of a consideration – and for the vast majority, it still is – then not using all those features simultaneously means not extracting the full value of the monitor, and at $2000 that value already includes a hefty premium. The use-cases where all those features would be used simultaneously, that is, HDR games, are somewhat limited due to the nature of HDR support in PC games, as well as the horsepower of graphics cards currently on the market.

The graphics hardware landscape brings us to the other idea behind getting a monitor of this caliber: futureproofing. At this time, even the GeForce RTX 2080 Ti is not capable of reaching much beyond 80fps or so, and with NVIDIA stepping back from SLI, especially with 2+ way configurations, multi-GPU options are somewhat unpredictable and require more user configuration. This could be particularly problematic depending on the nature of the HDR 4:2:2 chroma subsampling performance hit for Pascal cards. Though this could go both ways, as some gamers expect minimal user configuration for products at the upper end of ‘premium’.

On the face of it, this is the type of monitor that demands ‘next-generation graphics’, and fortunately we have the benefit of NVIDIA’s announcement – and now launch – of Turing and the GeForce RTX GPUs. In looking to that next generation, G-Sync HDR monitors are put in an awkward position. We still don’t know the extent of Turing’s performance in hybrid rendering with real-time ray tracing effects, but that feature is clearly the primary focus, if the ‘GeForce RTX’ branding wasn’t already clear enough. For traditional rendering at 4K (i.e. ‘out-of-the-box’ performance in most games), we saw the RTX 2080 Ti as 32% faster than the GTX 1080 Ti, reference-to-reference, and the RTX 2080 as around 8% faster than the GTX 1080 Ti and 35% faster than the GTX 1080. In the mix is the premium pricing of the GeForce RTX 2080 Ti, 2080, and 2070, of which only the 2080 Ti and 2080 support SLI.

Although this is really a topic to revisit after RTX support rolls out in games, Turing and its successors matter if only because this is a forward-looking monitor, and G-Sync (and thus its VRR) means using NVIDIA cards. And for modern ultra-high-end gaming monitors, VRR is simply mandatory. Given that Turing’s focus is on new feature sets rather than purely on raw traditional performance over Pascal, it somewhat plays against the idea of ‘144Hz 4K HDR with variable refresh’ as the near-future 'ultimate gaming experience', presumably in favor of real-time raytracing and the like. So enthusiasts might be faced with a quandary where enabling real-time raytracing effects means forgoing 4K resolution and/or ultra-high refresh rates, even as the framerate for traditional non-raytraced rendering is still lacking. Again, details won’t become clear until we see the intensity of hybrid rendered game workloads, but this is absolutely something to keep in mind, because not only are ultra-high-end gaming monitors and ultra-high-end graphics cards tied at the hip, but the former also tends to have longer upgrade/replacement cycles than the latter.

With futureproofing, and to a lesser extent early adoption, consumers are paying a premium on the expectation that they will fully utilize those features at some point, and that the device in question will still be viable when they do. But if there is a hard divergence from that vision of the future, then some of those features might not be fully utilized for quite some time. For the PG27UQ, it’s clear that the panel quality and HDR capability will keep it viable for quite some time, but right now the rest of the situation is unclear.

Returning to the here-and-now, there are a few general caveats for a prospective buyer. HDMI will work with HDR input sources (limited to 60Hz max), but the G-Sync functionality goes unused with current-generation HDR consoles, which support FreeSync. The monitor is not intended for double-duty as a professional visualization display; for business/productivity purposes the HDR/VRR features are not generally useful, and the 4:2:2 chroma subsampling modes may be an issue for clear text reproduction.

On the brightness side, the HDR white and black levels and the resulting contrast ratios are excellent, and with Windows 10's HDR mode these capabilities can be utilized even outside of HDR content. The ROG Swift PG27UQ is also well-calibrated out-of-the-box, which can’t be overstated, as most people don’t calibrate their monitors. The FALD operates with good uniformity, and color reproduction matches well under both HDR and SDR gamuts.

As for the $2000 price point, and the monitor itself, it all comes down to gaming with all the bells and whistles of PC display technology: 4K, 144Hz, G-Sync, HDR10 with FALD and peak 1000 nits brightness. Market-wise, there isn’t a true option that is a step below this, as right now, the PG27UQ and Acer variant are the only choices if gamers are looking for either high refresh rates on a 4K G-Sync monitor, or a G-Sync monitor that supports HDR.

So seeking either combination leaves consumers having to step up to the G-Sync HDR products. Nonetheless, Acer did recently announce a DisplayHDR 400 variant without quantum dots or FALD, set at $1299 and due to launch in Q4. However, without QD, FALD, or DisplayHDR 1000/600 capabilities, its HDR functionality is on the minimal side, and it’s telling that the monitor is specced as a G-Sync monitor rather than G-Sync HDR. As far as we know, there isn’t an upcoming intermediate panel in the vein of a 1440p 144Hz G-Sync HDR product, which would be less able to justify a premium margin.

But because the monitor is focused on HDR gaming, the situation with OS and video game support needs to be noted, though again we should reiterate that this is outside Asus’ control. There is a limited selection of games with HDR support, which doesn’t always equate to HDR10, and of those games, not all are developed or designed to utilize HDR’s breadth of brightness and color. Windows 10 support for HDR displays has improved considerably, but is still a work-in-progress. All of this is to say that HDR gaming is baked into the $2000, and purchasing the monitor primarily for high refresh rate 4K gaming effectively increases the premium that the consumer would be paying.

So essentially, gamers will not get the best value of the PG27UQ unless they:

  • Regularly play or will play games that support HDR, ideally ones that use HDR very well
  • Have or will have the graphics horsepower to go beyond 4K 60fps in those games, and are willing to deal with SLI if necessary to achieve that
  • Are willing to deal with maturing HDR support in video games, software, and Windows 10

Again, if price is no object, then these points don't matter from a value perspective. And if consumers fit the criteria, then the PG27UQ deserves serious consideration, because presently there is no other class of monitor that can provide the gaming experience that G-Sync HDR monitors like the ROG Swift PG27UQ can. Asus's monitor packs in every bell and whistle imaginable on a PC gaming monitor, and the end result is that, outside of Acer's twin monitor, the PG27UQ is unparalleled with respect to its feature set, and is among the best monitors out there in terms of image quality.

But if price is still a factor – as playing on the bleeding edge of technology so often is – consumers will have to keep in mind that they might be paying a premium for features they may not regularly use, will use much later than anticipated, or will cost more than expected to use (i.e. the cost of dual RTX 2080 Ti's). In the case of GeForce RTX cards, you might end up waiting for titles to release with HDR and/or RTX support, whereupon the card would still not push the PG27UQ's capabilities to the max.

On that note, the situation relies a lot on media consumption habits, not only in terms of HDR games or HDR video content, but also in terms of console usage, preference for indie over AA/AAA games, and preference for older versus newer titles. If $2000 is an affordable amount, that budget could encompass two quality displays that, combined, may better suit individual use-cases: for example, Asus' $600 to $700 PG279Q (1440p 165Hz G-Sync IPS 27-in) paired with a $1300 4K HDR 27-in professional monitor with peak 1000 nit luminance. Or, instead of a professional HDR monitor, an entry or mid-level 4K HDR TV in the $550 to $1000 range.

Wrapping things up, if it sounds like this is equal parts a conclusion about G-Sync HDR and about the ROG Swift PG27UQ, that's because it is. G-Sync HDR currently exists as the Asus and Acer 27-in models, and those G-Sync HDR capabilities are what drive the price; NVIDIA’s G-Sync HDR is not just context or a single feature, it is intrinsically intertwined with the PG27UQ.

Though this is not to say the ROG Swift PG27UQ is flawed. It’s not the perfect package, but the panel provides a combination of qualities that no other gaming monitor, excluding the Acer variant, can offer. As a consumer-focused site we can never ignore the importance of price in evaluating a product, but putting that aside for the briefest of moments, it really is an awesome monitor that is well beyond what any other monitor can deliver today. It just costs more than what most gamers will ever consider paying for a monitor, and the nuances of the monitor, G-Sync HDR, and HDR gaming mean that $2000 might be more than expected for how you use it.

Ultimately, the PG27UQ is the first of many G-Sync HDR monitors. And as the technology matures, hopefully we'll see these monitors further improve and prices drop. For the near future, however, the schedule slip of the 27-inch G-Sync HDR models doesn’t bode well for the indeterminate timeframe of the 35-in ultrawides and 65-in BFGDs. So if you want the best right now – and what's very likely to be the best 27-inch monitor for at least the next year or two to come – this is it.
