20 Comments
neo_1221 - Tuesday, May 23, 2017 - link
Hey Anandtech, feel like giving one of these away? In seriousness, this is pretty much everything I want in a monitor, aside from the lack of HDR support.
Solidstate89 - Tuesday, May 23, 2017 - link
God am I sick of the aggressive "gamer" aesthetic. I'd literally pay a grand for this display hardware, but placed in the body of a Dell Ultrasharp.
BrokenCrayons - Tuesday, May 23, 2017 - link
The stand and the logo are the only things that really give it that silly look. As far as video game-oriented products are concerned, this one seems notably subdued compared to other monitors produced in the recent past. Despite that, I still do agree with you that it looks rather silly.
Kvaern1 - Tuesday, May 23, 2017 - link
The pictures give you the wrong impression of the stand's effect on aesthetics because the display is in the absolute top position. In any sensible ergonomic setup the display is located so it covers the entire center of the stand from view.
DanNeely - Tuesday, May 23, 2017 - link
Only if you're short. It's an inch taller than a 24" 1920x1200 display. Putting the top edge of the screen in line with my (6'2") eyes, I'd probably have it near the top of the stand. My 24" one is at the top of its stand, with the bottom ~6.5" above my desk surface. If I slammed the screen down to the bottom to hide the stand structure, I'd get neck strain after a few hours from always having to bend to look down, just like I do if forced to use my laptop directly on my desk for an extended period of time.
The high res images make it appear that the stand is removable, so you can probably swap it for any VESA 100mm alternative if you want to.
Kvaern1 - Tuesday, May 23, 2017 - link
Well, I'm 184cm and I never see the center of the stand on my X34A and my aim point is also the top of the screen.
nathanddrews - Tuesday, May 23, 2017 - link
Check out Linus's review of the new 8K Dell monitor. That thing oozes aesthetic. It ought to for $4,999.
Diji1 - Tuesday, May 23, 2017 - link
Why? No one serious is going to buy that for gaming; it's a terrible choice for gamers.
WinterCharm - Sunday, May 28, 2017 - link
Seriously. How about something that looks like this: https://9to5mac.files.wordpress.com/2016/12/lg-5k-...
Zak - Tuesday, May 23, 2017 - link
I would expect a higher price. That is not so bad.
Xajel - Tuesday, May 23, 2017 - link
First, that design is nah... not to say it's bad, but I prefer more elegant designs over these gaming-centric ones. Even though this is a gaming screen, not everyone will like the design of it.
Second... why is there no FreeSync version of it?
DanNeely - Tuesday, May 23, 2017 - link
Probably in a few months. Regardless of which comes first, most OEMs launch the GSync and FreeSync versions a few months apart to get a second ride on the hype train.
If I had to guess why GSync first: probably because until Vega is available, Nvidia has a stranglehold on the top of the GPU market needed to drive a panel like this at high refresh rates.
maximumGPU - Tuesday, May 23, 2017 - link
Is a monitor without HDR worth buying when those supporting high refresh and HDR are only a few months away?
A5 - Tuesday, May 23, 2017 - link
Depends on whether or not those first-gen HDR monitors are any good, I'd think. And it may be a bit before one comes out at this size/res, but I'm not current on what's been announced.
DanNeely - Tuesday, May 23, 2017 - link
HDR needs 10bit color instead of 8. That means 25% more data on the IO link and a 20% reduction in maximum framerate. Staying with 8 bit color lets them fit 120Hz into DP 1.2. 10bit color would either require going to DP 1.3/1.4 or dropping to 100Hz. Assuming their panel controller didn't top out on throughput first.
p1esk - Tuesday, May 23, 2017 - link
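A rough sketch of DanNeely's bandwidth arithmetic in the comment above (the 3440x1440 resolution and the flat 20% blanking overhead are illustrative assumptions, not figures from the article; real reduced-blanking timings differ):

```python
# Rough link-bandwidth estimate: pixels/s * 3 color channels * bits/channel,
# padded by an assumed 20% blanking overhead.
def link_gbps(width, height, refresh_hz, bits_per_channel, blanking=1.20):
    return width * height * refresh_hz * 3 * bits_per_channel * blanking / 1e9

bw_8bit = link_gbps(3440, 1440, 120, 8)
bw_10bit = link_gbps(3440, 1440, 120, 10)

# 10-bit carries 10/8 = 25% more data than 8-bit at the same refresh rate...
print(f"10-bit vs 8-bit: {bw_10bit / bw_8bit:.2f}x")      # 1.25x
# ...so at a fixed link budget only 8/10 of the refresh rate fits.
print(f"max refresh at 10-bit: {120 * 8 / 10:.0f} Hz")    # 96 Hz
```

The ratio is what matters here: regardless of the exact panel timings, moving from 8-bit to 10-bit color scales the data rate by 10/8, so a link that is saturated at 120Hz/8-bit can only carry about 96Hz at 10-bit.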
I'm confused: if it can be overclocked to 120Hz, why didn't they overclock it at the factory? What are the dangers of doing so? Or does it mean not every monitor they sell will work at 120Hz?
reckless76 - Tuesday, May 23, 2017 - link
If it's anything like the PG348Q, the OSD will show the option to overclock it up to 120Hz, but each individual monitor will have its own limit. Some just won't make it that high.
kaesden - Tuesday, May 23, 2017 - link
I wish we could get a monitor that supports both GSync and FreeSync, so you can buy a top end screen and not be locked to a single video card. Since FreeSync doesn't require any special hardware, I would think it's a technical possibility, unless the GSync module somehow prevents FreeSync from being possible to implement (which wouldn't entirely surprise me).
DanNeely - Tuesday, May 23, 2017 - link
I think it effectively does. AIUI it's an Nvidia chip (an Nvidia-programmed FPGA?) that takes over some/all of the work to decode the video signal and send it to the panel. The original version failed to play nicely with anything else, which is why the GSync 1.0 monitors only had the single input that the GSync controller worked on. Since GSync is a proprietary tech and FreeSync is optional, Nvidia is free to continue to decline to support the latter in their controller, and to make not adding 3rd party FreeSync support a condition for anyone either using their GSync controller or licensing the IP to make one of their own (if that's available at all).
wyewye - Wednesday, May 24, 2017 - link
It is a complete pain to read Anton's subjective and idiotic opinions. Please just stick to the specs next time.