beisat - Wednesday, May 16, 2018 - link
Generally cool, I am quite interested in a BFGD display, but no HDMI 2.1 is a no-go for me, as I would use it as my main living room screen, and at the latest when a PS5 arrives sometime in 2021 I would be annoyed at not having full RGB 4:4:4 with my HDR.
Sivar - Wednesday, May 16, 2018 - link
The next best thing is always coming in a few years. HDMI 2.1 products are on that list, but by then there will be an announcement for another. The future is bright and exciting, but we can only buy that which is available in the present. :)
beisat - Wednesday, May 16, 2018 - link
Generally agree, but here we are not talking about products which can do more, just a connector. I guess this is just what is so annoying for me - the product can do everything, but the silly connector and cable cannot get the data over fast enough. Similar to 4K at 30fps over HDMI 1.4 - it's just not a balanced product somehow.
Fallen Kell - Wednesday, May 16, 2018 - link
Exactly. The panels are fully capable of running at 4:4:4 120 Hz (or above); the problem is that there is no interface to feed them at that bandwidth. It is also possible that some of the internal controller buses might not support that bandwidth either, since they knew that there was no connection interface that could provide it.
But this is one of those cases where they should have simply waited to use HDMI 2.1. With them already delaying these monitors until now, which is 8 months after HDMI 2.1 was official, they should have been looking at a redesigned controller to handle the new HDMI standard, which would have unlocked the full potential of the panels. People are already expecting the first batch of HDMI 2.1 devices to be released in the next few months. With those knowns, and the fact that this is such a high-end panel targeting a specific consumer base (which happens to know and follow the fact that these panels are gimped out of the door without HDMI 2.1), these companies shouldn't expect very good sales. I know for a fact that I won't buy one if it does not have HDMI 2.1...
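For reference, a rough back-of-the-envelope estimate of why 4K 120 Hz 4:4:4 doesn't fit into HDMI 2.0 (the blanking totals and the 10-bit color depth below are assumptions; exact figures depend on the timing used):

```python
# Approximate video bandwidth needed for 4K 120 Hz 10-bit RGB/4:4:4,
# versus the usable data rate of HDMI 2.0.
H_TOTAL, V_TOTAL = 4000, 2222      # assumed totals including reduced blanking
REFRESH_HZ = 120
BITS_PER_PIXEL = 30                # 10 bpc RGB/4:4:4

required_gbps = H_TOTAL * V_TOTAL * REFRESH_HZ * BITS_PER_PIXEL / 1e9   # ~32 Gbit/s
hdmi20_usable_gbps = 18.0 * 8 / 10                                      # 18 Gbit/s raw, 8b/10b -> 14.4 Gbit/s

print(f"Required:  ~{required_gbps:.0f} Gbit/s")
print(f"HDMI 2.0:  ~{hdmi20_usable_gbps:.1f} Gbit/s usable")
print("Fits" if required_gbps <= hdmi20_usable_gbps else "Does not fit - hence the subsampling or lower refresh rates")
```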
cmdrdredd - Wednesday, May 16, 2018 - link
Actually, from what I hear, consumer displays with HDMI 2.1 won't make it to retail for a couple of years yet.
rtho782 - Thursday, May 17, 2018 - link
Why they didn't just make you use two DP cables (as 4K monitors used to), I don't know. Then they could hit 120 Hz 4:4:4.
I have completely lost interest in this product now.
professional - Thursday, May 17, 2018 - link
This is utterly stupid. Don't expect the next PS to cover 4K 144 Hz... it won't.
+ You have two DisplayPorts, so you will be able to use both at the same time, which is effectively the same as having more bandwidth than DisplayPort 1.5 or HDMI 2.1.
+ There is no game that can run at 4K 144 Hz with HDR and 100% chroma (only older, non-AAA games can run at these specs, and they weren't made with HDR or more than 8-bit color).
Ahnilated - Sunday, May 20, 2018 - link
You mean, there is no video card that can.
Glenwing - Saturday, May 19, 2018 - link
It's not really "just a connector" we are waiting for.The capability to signal at those speeds comes from the chip generating the signals, not from the physical port. We are waiting for the next generation of HDMI control chips that can signal at 12 GHz per channel as opposed to HDMI 2.0 chips which only needed to signal at 6 GHz per channel. It's no simple feat to create these systems and bring them to mass production. It really isn't so different from waiting for the next generation of GPUs and CPUs after all.In addition, I should point out that the publication of the HDMI 2.1 specification doesn't mean "products will be coming out any time now"; that's when the designs for new products BEGINS, and it takes a lot longer than 6 months to get these things rolling. At the current moment, the HDMI 2.1 CTS (compliance testing specification) isn't even finished yet, so it would be impossible to for any HDMI 2.1 device to be available, as the HDMI creators haven't even finished writing the testing procedures for HDMI 2.1 (we expect it to be finished in late 2018). Don't expect HDMI 2.1 products for at least a year at minimum. Development cycles for new interfaces are very long from the time the specification is first published.
Ikefu - Wednesday, May 16, 2018 - link
I still think the price premium for G-Sync is stupid when FreeSync is a thing and royalty-free. Seems like it would be easy for them to add, so I could have a better choice of monitors for my 1070. This is a straight-up money grab by Nvidia. Is there really any benefit to G-Sync over FreeSync 2 in terms of performance?
flashbacck - Wednesday, May 16, 2018 - link
Nvidia cards are all technically capable of FreeSync too, but no way in hell are they ever going to enable it *sigh*
Dr. Swag - Thursday, May 17, 2018 - link
Not true... You need hardware support for it. That's why cards like the 280 didn't support it while the 290 and 285 did.
foxtrot1_1 - Wednesday, May 16, 2018 - link
Yes, there is a benefit. G-Sync still has a wider refresh rate range and, because of Nvidia's control over the scaler, will deliver more consistent performance than a simple certification. Is it worth the premium? Probably not, but nobody buying a $3,000 monitor is going to quibble. You want the very best of everything at that price point, and the fastest GPUs available today are all G-Sync compatible.
imaheadcase - Thursday, May 17, 2018 - link
How is it a money grab when you literally have the card that works with it? rofl
Ikefu - Thursday, May 17, 2018 - link
They make the G-Sync add-in card that goes in the monitor, not the video card. It's around a $100 price premium for the add-in card that gets added to the cost of the monitor. FreeSync is a free spec that anyone can implement without paying royalties.
Yojimbo - Thursday, May 17, 2018 - link
Yeah, I've been disappointed that NVIDIA hasn't yet brought out anything more with G-Sync that justifies their method of implementing it.
I don't have experience comparing Freesync and G-Sync. I have heard that G-Sync does some things slightly better, but it doesn't seem to be enough to make the difference in price worth it. NVIDIA says that the prices are higher because monitor manufacturers are able to charge a premium for G-Sync monitors, and there may be some truth to that, but I think the manufacturers' component costs for a G-Sync monitor are likely higher than for a Freesync monitor.
On the other hand, I think there wouldn't be any Freesync without G-Sync, and NVIDIA probably wouldn't have made G-Sync, either, if it meant implementing it the way Freesync is implemented. Without G-sync already existing, there just isn't much incentive for anyone to develop something like Freesync. Freesync was a reactive scramble by AMD to neutralize any competitive advantage NVIDIA might get with G-Sync.
Manch - Friday, May 18, 2018 - link
FreeSync was a reaction to G-Sync, but it is just the implementation of a VESA standard that had already been in place long before G-Sync ever came out. Not a lot of work on AMD's part at all. They just implemented support for it in their GPUs (again, not a lot of work). There was even talk that the monitors current at the time could technically support FreeSync, but good luck getting manufacturers to upgrade the monitors' firmware to support it. Then not long after, "new" monitors with FREESYNC!!! G-Sync is just vendor lock-in.
Yojimbo - Sunday, May 20, 2018 - link
Listing facts that are all incorrect is highly counterproductive.
From everything I remember, the VESA standard was worked on and created because of G-Sync. There's no such thing as "just an implementation of a standard," as if a standard were some magic ether that is free and effortless. Designing Freesync cost money and implementing it similarly costs money. Making sure that implementations are up to standard also costs money. I am under the impression that part of the extra cost of G-Sync is the stricter and more costly validation program that NVIDIA makes monitor makers participate in, compared with AMD's Freesync designation.
Monitors need more than just firmware to support Freesync (or the VESA standard that Freesync implements). Freesync monitors became available significantly after G-Sync monitors were available.
Manch - Monday, May 21, 2018 - link
No, the VESA standard known as Adaptive Sync was established before, but rarely if ever implemented until Nvidia came out with G-Sync. As a counter, AMD released FreeSync, which is VESA's Adaptive Sync implemented to work on their cards, aka not a lot of work. AMD has even admitted that FreeSync is basically Adaptive Sync. I never said it was free or magic, but AMD didn't put a lot of work into it as the standard was already fleshed out (that's why it's called a standard ;) ). AMD simply asked how far they could push this with current tech without additional HW like G-Sync. The answer? 48 Hz-75 Hz, which is why you got what you got. Now FreeSync 2 with LFC or whatever is an extension of that standard, and that requires changes to HW.
As far as monitors go, there were discussions about whether or not some panels current at the time could utilize FreeSync, but alas it would require upgrading the firmware, as there was no HW restriction preventing them from implementing the standard. Some have even attempted, with limited success, to enable it on various non-FreeSync monitors.
Yojimbo - Wednesday, May 23, 2018 - link
"No, the VESA standard known as Adaptive Sync was established before but rarely if ever implemented until Nvidia came out with G-Sync. As a counter AMD released Free-Sync, which is VESA's Adaptive Sync implemented to work on their cards aka(not a lot of work)."Companies either do work and submit it to a standards group to vote on inclusion, or various companies send people into a SIG to hammer out a standard. In this case, AMD did the work on Freesync in response to NVIDIA's G-Sync. They then submitted it to VESA and VESA accepted it and named is Adaptive Sync, as part of an extension to the DisplayPort standard. So, yes, Freesync is basically AdaptiveSync. But AMD did do the work for it. A simple web search will lead you to multiple sources talking about this. Here's one:
https://www.pcper.com/news/Graphics-Cards/AMD-Demo...
https://www.pcper.com/news/General-Tech/Rumor-VESA...
https://www.pcper.com/news/General-Tech/Move-over-...
The difference between Adaptive Sync and Freesync is that "Freesync" includes an AMD certification procedure, although apparently that certification procedure is not as strict as NVIDIA's certification procedure for G-Sync.
The original Freesync does require changes to hardware; it requires a special scaler chip. From https://www.kitguru.net/components/graphic-cards/a...
"To build monitors supporting Adaptive Sync and FreeSync technologies special display scalers are required."
There was a lot of confusion about Freesync when it first came out, and a lot of misinformation being bandied about. That confusion was caused, or at least exacerbated, by AMD's PR. But what I am telling you in my posts is what seems to be the reality of the situation. Go research it for yourself if you don't believe me.
"Some have even attempted with limited success to enable it on various nonfreesync monitors. "
Some have attached lawnmower engines to bicycles but that doesn't make them Harley-Davidsons.
Manch - Wednesday, May 23, 2018 - link
Variable refresh rate, aka Adaptive Sync, aka Freesync, is about a decade old now. AMD may have expanded it, but they are not the sole inventor/proprietor of variable refresh rate tech.
In regards to the scaler chips in said monitors: no, special HW was not required. Did the HW need to meet a certain spec? Of course. But there is no additional component like G-Sync. The scaler only needs to be fast enough and capable of accepting a variable refresh rate command from the GPU, which some unsupported monitors at the time were already capable of doing. G-Sync and *Sync may accomplish relatively the same thing, but they take different approaches, which is borne out in their respective capabilities.
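To make the behaviour being argued about concrete, here is a minimal sketch of variable refresh rate with low-framerate compensation (LFC). It is an illustration only, not AMD's or Nvidia's actual logic; the 40-144 Hz range is an assumed example, and the frame-repeating step is the LFC idea mentioned earlier in the thread:

```python
import math

# Minimal sketch of variable refresh rate plus low-framerate compensation (LFC):
# the panel tracks the GPU's frame rate while it is inside the supported range;
# below the minimum, each frame is repeated so the effective rate lands back in range.
# LFC only works cleanly when max >= 2 * min, which is why narrow ranges like
# 48-75 Hz could not support it.

PANEL_MIN_HZ = 40.0    # assumed example range (not a specific monitor)
PANEL_MAX_HZ = 144.0

def panel_refresh(frame_rate_hz: float) -> tuple[float, int]:
    """Return (refresh rate the panel runs at, number of times each frame is shown)."""
    if frame_rate_hz >= PANEL_MAX_HZ:
        return PANEL_MAX_HZ, 1                      # cap at the panel's maximum
    if frame_rate_hz >= PANEL_MIN_HZ:
        return frame_rate_hz, 1                     # in range: refresh follows the game
    repeats = math.ceil(PANEL_MIN_HZ / frame_rate_hz)
    return frame_rate_hz * repeats, repeats         # LFC: repeat frames back into range

for fps in (160, 90, 45, 30, 18):
    hz, n = panel_refresh(fps)
    print(f"{fps:>3} fps -> {hz:.0f} Hz, each frame shown {n}x")
```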
Yojimbo - Tuesday, May 29, 2018 - link
Please link me to a source that Adaptive Sync is a decade old, rather than being included as an extension to DisplayPort 1.2a, which the rest of the world seems to be under the impression is true: https://www.anandtech.com/show/8008/vesa-adds-adap...
Yes, special hardware is required, and I linked you to something saying that it's so. A special scaler chip is required to support the standard.
You are just ignoring my evidence and asserting your own "facts" without providing your own evidence for these "facts". Stop wasting my time.
Diji1 - Thursday, May 17, 2018 - link
G-Sync is better than FreeSync technically and has much tighter standards. There are no low-quality G-Sync monitors.
Yojimbo - Sunday, May 20, 2018 - link
Yes I agree, I am just questioning whether the differences in quality and features are worth the price difference.
Krteq - Wednesday, May 16, 2018 - link
C'mon nV, what are the G-Sync ranges? Even AMD has FreeSync ranges listed on their site for supported monitors!
Sttm - Wednesday, May 16, 2018 - link
Doesn't G-Sync work from 0 Hz to max? At least that was what I always thought.
DC_Khalid - Wednesday, May 16, 2018 - link
The beauty of G-Sync is that all monitors are certified to work from 30 Hz to max. Ranges are for untested products such as FreeSync, where each manufacturer can offer as much or as little range as they like, confusing consumers.
Manch - Friday, May 18, 2018 - link
That no verifiable minimum specs are released to the public is disconcerting.
rocky12345 - Wednesday, May 16, 2018 - link
In their top picture it looks like they washed out the picture on the SDR side for sure. On the HDR side the colors look way overdone, almost cartoon-like, on the character.
With that said, I use my Samsung 60" HDTV for movie watching and gaming for now, until I decide to go back downstairs and use my 125" projector screen again. I found on the Samsung TV that if I set Dynamic Contrast to low (level 1), turn HDMI Black to Normal (low makes the black squash the greys), set Color input to "Native" mode, and set the PC to RGB 4:4:4 full range (PC full color), I get really deep blacks while the detail in the shadows is also there, just like HDR does it. The colors are full and bright but not cartoon-like or overdone, and like I said the blacks are pure black. Even when watching movies where you get those top and bottom bars, they are totally black with no light bleed or grey tinge, and the whites on the screen are a pure white, not cream-colored or dark.
I am not sure why, but when I use Native color on the TV and set RGB 4:4:4 full PC color range in the video card, the AMD software switches my color depth to 12-bit instead of 10-bit. I have not looked up why it did that; I just left it alone since the software chose that setting. Anyway, my point is that my SDR TV is giving great, non-washed-out color, pure blacks, and rich colors, so I see no need to upgrade until I go 4K or higher down the road in the near future, now that 4K is within reach of the pocketbook. Then again, my Sapphire Tri-X 390X 8GB, even though it is OC'ed to the max, won't handle 4K gaming very well unless I turn the settings down, but then what's the point if it looks like crap. I see a GeForce 1180 or AMD Navi in my near future for sure.
Hxx - Wednesday, May 16, 2018 - link
Time to save up for that Acer X35. It should hopefully be a significant upgrade over my current X34, although I'm not sure how I feel about "upgrading" from an IPS to a VA panel. Hopefully, with the curve being more pronounced, I won't notice the narrower viewing angles.
Dug - Wednesday, May 16, 2018 - link
Still going to wait. 27" is too small for 4K, and I'll be curious whether 1000 cd/m² is real on an IPS monitor.
Really want a 32" 4K with G-Sync.
Or even better would be if AMD could match a 1080 Ti or more; then I could just purchase a 4K monitor with FreeSync and save a buttload of money.
foxtrot1_1 - Wednesday, May 16, 2018 - link
27” is not too small for 4K. I have a 28-inch 4K monitor and it’s both noticeable and great. Don’t know why this weird idea got started.
Flying Aardvark - Thursday, May 17, 2018 - link
27" is too big for 1080P, that's a fact unless someone is blind or hasn't done many comparisons. I believe you on 27" working well with 4K. It's probably on the small side, but I'm not sure I want a single monitor larger than 27" on my desk. I use three 24" 1080P (had 27" 1080P before), but my system is used for making money, not gaming alone. So productivity for me comes first while most people posting here are looking for 144Hz/GSync/HDR.rocky12345 - Thursday, May 17, 2018 - link
Heck, I'm typing this on a Samsung 60" TV; granted, I am sitting across the room from it and don't have my nose pressed against the screen. My point is it looks fine, and so would that 27" 1080p if you do not have your nose on the screen all of the time.
sibuna - Thursday, May 17, 2018 - link
It is if you want to use it for non-gaming tasks without some ridiculous display scaling.
milkod2001 - Thursday, May 17, 2018 - link
Same here, make it 32" 4K with G-Sync and I'm in. Well, if the price is not crazy - somewhere around $1200 max.
sibuna - Wednesday, May 16, 2018 - link
27" is too small for 4k. And not everyone wants a 21:9. There needs to be a 32-34" 4k with these specscmdrdredd - Wednesday, May 16, 2018 - link
Yeah I agree, it needs a middle ground. Personally, I've been happy gaming at 4K 60 Hz when I can hit 60 fps averages on my 65" OLED TV. When 4K simply isn't going to be an option (right now I can do 4K on nearly everything), I can do 1080p 120 Hz and crank the AA, which isn't bad at max settings.
sibuna - Thursday, May 17, 2018 - link
Acer makes a 32" 4k Gsync monitor now the XB321HK. its main issue is that it cant do 4k over HDMI because its only v1.4. all they really need to do is up date it to HDMI 2.x and DP 1.4 and it would be fine4k GSync that would also accept a 4k signal from a Xbox or PS4
godrilla - Wednesday, May 16, 2018 - link
First 4K 120 Hz monitor available now. Unfortunately, no adaptive sync.
https://www.pcgamer.com/the-first-4k-120hz-monitor...
mobutu - Thursday, May 17, 2018 - link
If you wanna see availability, then you look at panel manufacturers:
http://www.tftcentral.co.uk/news_archive/39.htm#pa...
http://www.tftcentral.co.uk/news_archive/39.htm#pa...
http://www.tftcentral.co.uk/news_archive/39.htm#pa...
Good luck.
adambomb13 - Thursday, May 17, 2018 - link
I will never buy another Acer product again. They are terrible at customer service and have bad quality products, especially the XB321HK Acer Predator 32". Read this forum:
https://community.acer.com/en/discussion/441879/xb...
DanNeely - Friday, May 18, 2018 - link
The Predator X27 has surfaced on Newegg as a $2k preorder. (Don't buy monitors from them, though; they have an 8-dead-pixel return policy.)
https://www.newegg.com/Product/Product.aspx?Item=N...
Glenwing - Saturday, May 19, 2018 - link
Lol @ 4:2:2 subsampling :P
I mean, it's not like DisplayPort 1.4 has a compression algorithm that would allow for 4K 144 Hz HDR with much less quality loss than chroma subsampling... (*cough* DSC)
Just for reference:
DisplayPort 1.3/1.4 can do up to 4K 120 Hz with 24 bit/px (non-HDR) RGB/4:4:4 color uncompressed. To get HDR you would need to drop to around 97–98 Hz. To get 144 Hz with or without HDR, you need DSC or chroma subsampling.
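A quick sketch of where those numbers come from (assuming HBR3's roughly 25.92 Gbit/s usable rate and approximate reduced-blanking totals, so these are ballpark figures, not exact spec limits):

```python
# Ballpark maximum refresh rate over DisplayPort 1.3/1.4 (HBR3) for 3840x2160.
LINK_GBPS = 4 * 8.1 * 8 / 10      # 4 lanes x 8.1 Gbit/s, 8b/10b encoding -> ~25.92 Gbit/s usable
H_TOTAL, V_TOTAL = 3920, 2222     # assumed totals including reduced blanking

def max_refresh_hz(bits_per_pixel: int) -> float:
    return LINK_GBPS * 1e9 / (H_TOTAL * V_TOTAL * bits_per_pixel)

print(f"8 bpc RGB (24 bit/px):  ~{max_refresh_hz(24):.0f} Hz")  # ~124 Hz, so 120 Hz fits
print(f"10 bpc RGB (30 bit/px): ~{max_refresh_hz(30):.0f} Hz")  # ~99 Hz, near the ~97-98 Hz figure
```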
Ahnilated - Sunday, May 20, 2018 - link
Who really cares about a 27" 4K monitor? I sure don't; I want something 30-36" at 4K. I guess I will be waiting for a while still.
CharonPDX - Tuesday, May 22, 2018 - link
Now if only it was possible to purchase an Nvidia GPU setup capable of running that 4K 144 Hz display for less than $2000.