33 Comments
Chugworth - Thursday, January 3, 2019 - link
I'm still waiting on a 4K FreeSync 2 monitor that's not goofy.
nathanddrews - Friday, January 4, 2019 - link
I believe that all the current-gen Samsung Q-series 4K displays do what you want. Variable refresh rates and resolutions up to 4K60 with HDR.
Chugworth - Friday, January 4, 2019 - link
It looks like those are TVs. 55" would be a bit large for a desktop display, don't you think?
godrilla - Thursday, January 3, 2019 - link
So far 2019 is starting strong; hopefully this will be close to the previous model in terms of price.
godrilla - Saturday, February 2, 2019 - link
https://www.bhphotovideo.com/c/product/1449627-REG... It's available for preorder, FYI.
boozed - Thursday, January 3, 2019 - link
So what do you call that, "half five kay"?
Morawka - Thursday, January 3, 2019 - link
As a lifelong Nvidia user, I'm about to swap to AMD graphics if Nvidia can't get these monitor manufacturers to include G-Sync models at launch. It's ridiculous having to wait a full year for a comparable G-Sync model.
Alexvrb - Friday, January 4, 2019 - link
The better solution would be for them to support both FS and GS.
Shinshin - Friday, January 4, 2019 - link
The even better solution is for Nvidia to support the free* FreeSync tech with their cards...
* Free of license fees, that is; it will still cost money to develop support...
DrKlahn - Friday, January 4, 2019 - link
FreeSync, which is just a brand name for VESA Adaptive-Sync, is free. Nvidia can support it today on their current cards. But they prefer to fleece their buyers for their proprietary solution.
lilkwarrior - Friday, January 4, 2019 - link
How is Nvidia fleecing buyers if it's better & held to a far higher standard, particularly G-SYNC HDR vs FreeSync 2?
DrKlahn - Friday, January 4, 2019 - link
Because any difference is marginal at best in any blind test I've seen. Also, Nvidia could support both its proprietary interface and the free standard at the same time on the same hardware; its customers could then decide whether it's worth it. Instead, it fleeces its customers by forcing them to buy into its premium standard.
Hubb1e - Friday, January 4, 2019 - link
Another reason to support AMD on this. They've historically been very open with their standards, going back as far as I can remember.
nevcairiel - Friday, January 4, 2019 - link
The sad part is that there is no AMD GPU that could even properly drive a 5K 120Hz screen. No AMD GPU has enough power to render any recent game anywhere close to making use of that screen.
yeeeeman - Friday, January 4, 2019 - link
Neither can Nvidia, really.
piroroadkill - Friday, January 4, 2019 - link
It does contain fewer pixels than an actual 4K screen, though, by 921,600 pixels. So a 2080 Ti that's already over 60 FPS at 4K could drive this lower-resolution screen at a higher frame rate.
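The arithmetic behind that pixel-count figure, as a minimal illustrative check (Python):

```python
# Pixel counts for the two resolutions being compared (illustrative check only).
uhd_4k = 3840 * 2160   # "true" 4K UHD: 8,294,400 pixels
crg9 = 5120 * 1440     # 32:9 1440p ultrawide: 7,372,800 pixels

print(uhd_4k - crg9)            # 921600 -- the difference quoted above
print(round(crg9 / uhd_4k, 3))  # 0.889  -- roughly 11% fewer pixels to render
```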
lilkwarrior - Friday, January 4, 2019 - link
That's patently false; Nvidia cards can definitely and easily support monitors with these aspect ratios. Even true 4K ultrawides (LG's 5K2K monitor) are best run with Nvidia GPUs, at rates that max out what those monitors can do today for professional and gaming use.
AMD has not prioritized such monitor configurations at and beyond 4K with its mainstream GPUs (binned from its workstation cards with performance in mind), whereas Nvidia has cards that unapologetically handle them. That said, an AMD card should have no problem running this monitor, which is ultimately a 1440p monitor with a common ultrawide aspect ratio.
godrilla - Saturday, January 26, 2019 - link
On paper, two Vega VIIs in CrossFire should be able to handle this just fine, and be cheaper and more powerful than a Titan RTX and an MSI Lightning Z 2080 Ti!
imaheadcase - Friday, January 4, 2019 - link
Yeah, because replacing a video card based on a monitor is totally worth it... said no one ever.
haukionkannel - Friday, January 4, 2019 - link
A good monitor costs $1200-2000 and lasts 8-15 years. A good graphics card costs $250-2500 and lasts 2 to 5 years...
You definitely buy GPUs more often and keep the monitor, so the monitor affects which GPU you will buy.
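A rough annualized-cost sketch of that comparison; the specific figures below are assumptions picked from within the ranges above, not anything stated in the comment:

```python
# Amortized cost per year, using assumed representative points from the ranges above.
monitor_cost, monitor_life = 1600, 10  # within $1200-2000, lasting 8-15 years
gpu_cost, gpu_life = 900, 3            # within $250-2500, lasting 2-5 years

print(monitor_cost / monitor_life)  # 160.0 -> ~$160/year on the monitor
print(gpu_cost / gpu_life)          # 300.0 -> ~$300/year on GPUs
```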
philehidiot - Saturday, January 5, 2019 - link
I've done that. I upgraded my monitor and my GPU was fairly marginal on the old one. I decided to try the new monitor on the old GPU and see what happened. The frame rate was low enough to push me into upgrading the GPU to better support the much higher resolution of the new monitor.
YOU may never have been in that position, but that certainly doesn't mean you have the right to deride those who are, or to assume it doesn't happen. If you upgrade from 1080p to 1440p or 4K then you're possibly gonna need a new card unless you want to drop texture quality, etc. And as above, GPU upgrade cycles tend to be quicker than monitor cycles.
Mr.Vegas - Wednesday, January 9, 2019 - link
Now that Nvidia supports FreeSync, you can get this monitor. In any case, there are no AMD video cards that can push this resolution even at 40-50 fps.
Confesor - Saturday, January 12, 2019 - link
https://www.anandtech.com/show/13797/nvidia-to-sup...
GreenReaper - Friday, January 4, 2019 - link
This is just what I need for viewing the 10,000px × 3,520px furry porn on my art site. We already have the art filling the width of the screen; we just need wider screens!
Admittedly, another doubling of the resolution would be even better, but I can wait…
yannigr2 - Friday, January 4, 2019 - link
It's a real pity to see this kind of monitor and then remember how far behind AMD is in GPUs. If only AMD were producing GPUs as fast as Nvidia's, these monitors would have been great. Of course, hell will have to freeze over before Nvidia supports Adaptive-Sync, so forget about that possibility. Especially now that its share price has bottomed and its GPU revenues have hit the ceiling, it would never even consider abandoning its G-Sync revenues for something free.
Pyrostemplar - Friday, January 4, 2019 - link
Finally - I've been waiting for this monitor to replace my two monitors ever since Samsung launched the CHG90; the low vertical resolution of the latter was a no-no. Interestingly, a few sites pegged Q4 2018 as the launch date, also because Philips launched a similar 5K monitor based on a Samsung panel during 2018, but this one seems to be based on a different panel. Now I just need to get a new graphics card... and a Vive 2 / Rift 2...
Btw, about AMD GPUs' ability (or lack thereof) to drive this monitor: it is quite true that currently no AMD card can run this monitor's full resolution in next-gen games at high frame rates (90+...). But one may not use the whole area to play (I actually intend to use just half of it for most games, as I currently play on my 27"), and the lifetime of a monitor should cover three generations of graphics cards. So while 2019 GPUs may not be able to fully use this monitor and FreeSync 2, 2020+ GPUs may do so, whether from AMD or Intel.
And *maybe* Nvidia will stop being asinine and support FreeSync... ;) (one may dream)
Pyrostemplar - Friday, January 4, 2019 - link
A small thing: the Samsung press release states that the monitor model is CRG9, but Samsung's CES website refers to it as CRG90; the latter is more in line with the previous naming (CHG90 => CRG90).
Hxx - Friday, January 4, 2019 - link
This is gonna be cool for folks looking for a cinematic display with that high peak brightness and local dimming. No HDR? I hope they bring that as well. It may not be ideal for twitch shooters due to its massive size.
BubbaJoe TBoneMalone - Friday, January 4, 2019 - link
Only three DisplayHDR 1000 certified monitors - https://displayhdr.org/certified-products/
Hope the CRG9 is added to the list.
rsandru - Friday, January 4, 2019 - link
I don't see a mention of FALD anywhere, so it most likely won't offer the best HDR experience...
godrilla - Friday, February 1, 2019 - link
The previous version is on the 600-nit list, so I do not see why this model would not make the 1000-nit list.
lilkwarrior - Friday, January 4, 2019 - link
I cringe every time these monitors are claimed to be 5K; I suppose LG had no choice but to call their true 4K ultrawide a "5K2K" monitor if we're gonna call 32:9 1440p monitors 5K.
philehidiot - Saturday, January 5, 2019 - link
Indeed, that caught me out. This isn't a true 5K monitor in my book. It's a 1440P ultrawide monitor. I find that incredibly misleading and also possibly something that will lose them sales. If I'm looking for a new monitor on a website, I do not include 5K monitors in my search. If they pop this in the 5K category, I will likely miss it or overlook it as soon as I see "5K" in the title. For my use, desk setup and system config, I want 1440P.