24 Comments

  • bubblyboo - Thursday, January 3, 2019 - link

    Another scam by VESA for a pointless certification and price hike on monitors? Come back when they get true 10-bit on any specification, or make it so 600 and up don't require shitty local dimming.
  • JoeyJoJo123 - Thursday, January 3, 2019 - link

    Someday you'll grow into an adult.

    >pointless certification
    These standards are meant to give end consumers who aren't highly technical a baseline expectation of what the display will provide. Advanced users like you and me won't find a single certification to be enough information to base a purchase on, but the fact of the matter is that the majority of people walking into a Best Buy or Wal-Mart aren't highly technical and will walk away that day with one display or another. This certification is supposed to provide a baseline, comparable specification handled by an impartial party, which is already way better than displays "claiming" they're capable of a 1,000,000:1 dynamic contrast ratio. Again, you and I are capable and very willing to do our own research and read reviews before buying displays; the majority of end users don't do this, and the certification is vastly more helpful to those consumers than to technical ones.

    >price hike
    The DisplayHDR certifications aren't mandated for the sale of displays. If a manufacturer deems it worthwhile to test and certify a particular display model, it can do so. An OLED display being manufactured for sale could far exceed the DisplayHDR 500 True Black certification, but it's still at the manufacturer's discretion whether to get it certified at all, or whether the certification label is even helpful in moving/selling more display units.

    That being said, VESA standards are pretty damn tame. DisplayPort is still a royalty-free VESA standard, and if they're not charging royalties like HDMI or other tech standards groups, then I highly doubt that certifying a particular display model costs a company so much that it needs to drastically increase the price to the end user.

    >Come back when they get true 10-bit on any specification
    Again with this "TRUE 10-BIT" meme.
    http://www.tftcentral.co.uk/faq.htm
    "In fact on many modern panels these FRC are very good and in practice you’d be hard pressed to spot any real difference between a 6-bit + FRC display and a true 8-bit display. Colour range is good, screens show no obvious gradation of colours, and they show no FRC artefacts or glitches in normal everyday use. Most average users would never notice the difference and so it is more important to think about the panel technology and your individual uses than get bogged down worrying about 6-bit vs. 8-bit arguments."
    I trust and value TFT Central's opinions far more than some armchair tech complainer with no better occupation than shitposting in AnandTech comment sections that 8-bit + FRC displays sold as 10-bit monitors are "terrible" by any standard. At the end of the day, you'd need a fully 10-bit pipeline to see any meaningful difference, from software to display adapter to cabling to display, and the vast majority of users aren't doing any 10-bit work. 10-bit literally doesn't even matter for posting your inane comment here, where sRGB (an 8-bit color space in practice) is the default color space of the web and of the very AnandTech article you're posting under.
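
    To put rough numbers on it (a back-of-the-envelope sketch of the idea only, not any vendor's actual FRC algorithm): FRC just alternates the two nearest native panel levels over successive frames so the time-averaged output lands near an in-between tone the panel can't show natively.

    def frc_frames(target_10bit, frames=4):
        # Map a 10-bit target level (0-1023) onto a short sequence of 8-bit
        # frame values (0-255) whose time average approximates the target.
        scaled = target_10bit / 1023 * 255      # ideal (fractional) 8-bit level
        low = int(scaled)
        frac = scaled - low
        high = min(low + 1, 255)
        n_high = round(frac * frames)           # frames that show the higher level
        return [high] * n_high + [low] * (frames - n_high)

    for target in (511, 512, 513):              # three adjacent 10-bit levels
        seq = frc_frames(target)
        print(target, "->", seq, "average", sum(seq) / len(seq))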
  • StevoLincolnite - Thursday, January 3, 2019 - link

    I have a Kogan 6-bit + FRC TV and an 8-bit LG TV in my home.
    There is a difference, especially with black gradients... You notice it less with film/moving images and brighter contrast.

    Obviously, every time you increase the bit depth there will be diminishing returns... 8-bit should be the minimum, with a transition to 10-bit.
  • HollyDOL - Friday, January 4, 2019 - link

    I have 6-bit + FRC at work and a full 10-bit system at home... While it's not possible to compare them side by side, the difference is quite big (both in reproduction and in eye strain, though the latter might also be caused by environmental factors; light conditions at work are suboptimal). I have no 8-bit at hand to test where the curve breaks, but I would avoid 6-bit + FRC unless the usage is only occasional (for reading emails it's more than enough).
  • krazyfrog - Sunday, January 6, 2019 - link

    How are you people powering these 10-bit panels? Isn't 10-bit support limited only to Quadro and Radeon Pro graphics cards?
  • HollyDOL - Tuesday, January 8, 2019 - link

    GeForce (10xx) can do that in DirectX 11 fullscreen (likely it can in DX12 as well; I haven't tested).
    For Adobe stuff you still need a Quadro + its drivers. I wasn't too happy about that investment, but it allowed my wife to work from home comfortably...

    Absolutely no idea about the situation with AMD cards, though.
  • bubblyboo - Thursday, January 3, 2019 - link

    DisplayHDR 600 and up require local dimming or a 4000:1 contrast ratio (impossible for IPS), which in practice means shitty edge-lit, sub-10-zone crap that only increases the cost of the monitor while providing no improvement for the end user.

    Yes, it is a pointless specification when you can get a monitor certified for DisplayHDR 1000 with only 32 zones of EDGE-LIT dimming.

    Yes, the monitors end up costing far more for the end user when you force OEMs to add useless edge-lit dimming that needs to be disabled or else it screws up everything you're looking at, plus everything needed for their own testing and VESA's certification tests.

    Yes, 10-bit matters. Just because monitors are a decade behind TVs doesn't mean the highest level of HDR on monitors should be 8-bit. 6-bit + FRC vs. true 8-bit is a night-and-day difference unless you're blind.

    Before you start telling people to "grow up" and that their comments are "inane", you should be asking yourself why you go so far to defend these companies' bullshit, which only ends up harming consumers with blatantly false information.
  • krazyfrog - Sunday, January 6, 2019 - link

    This whole DisplayHDR business is surprisingly scammy, especially DisplayHDR 400 and 600. These are supposedly meant to help lay people (who in reality don't even know what HDR is), but they only mislead by implying that DisplayHDR 400 and 600 panels are true HDR, when these things are barely any better than any half-decent SDR monitor. Most of the HDR monitors on the market now meet these bare-minimum standards just so they can be passed off as HDR. The only people these standards help are the monitor manufacturers.
  • JoeyJoJo123 - Monday, January 7, 2019 - link

    >Yes 10-bit matters.
    I didn't say it didn't. HDR fundamentally needs that many more tones to work with to do what it needs to do. What I'm saying doesn't matter is the use of modern FRC in displays, which you're making out to be a big deal when it really isn't. 8-bit + FRC vs. native 10-bit literally does not matter.

    >6-bit + FRC vs true 8-bit is a night and day difference unless you're blind.
    No, it's not. You're either an outright liar or the typical online commenter who can't hold back the hyperbole.

    >Before you start telling people to "grow up" and that their comments are "inane" you should be asking yourself why you go so far to defend these companies' bullshit.
    I'm not defending anything. The certification is just that: a certification. You need to wake up to the reality that a certification does not ALWAYS match reality. An electrician can be certified to work on homes, but that's not to say they aren't lazy or that they won't get electrocuted. This is exactly why I stressed that technical users like you or me can't, and rightfully don't, look at a single certification and put SOOO MUCH weight and trust in it.

    You keep calling this VESA certification a scam when that's not the problem. You're the problem, for putting too much weight and emphasis on a single aggregate certification. No matter which way you slice it, there will always be products designed to the minimum possible specifications to game that certification, just as there are electricians who will do the least work possible to get certified.
  • leo_sk - Friday, January 4, 2019 - link

    Idk. I know a couple of guys who bought HDR 400 monitors expecting to get an HDR 1000-like display. Even the salesman continuously repeated that they were HDR monitors when I raised objections. I don't think they simplify it for normal end users at all. It does make it easier to loot them, though.
  • a5cent - Friday, January 4, 2019 - link

    ^ someone who gets it!

    DisplayHDR 400 is blatant marketing BS. Any SDR monitor that covers the same gamut will offer pretty much the same experience. Edge-lit monitors with a DisplayHDR 600 certification are probably even worse, as they cost much more while (for most content) not actually delivering more.

    These certifications don't mean these monitors are garbage. They might be great monitors, but they are highly unlikely to deliver anything resembling an HDR experience. Until FALD and MiniLED become much less expensive and more widespread, that is unlikely to change.
  • Jorgp2 - Friday, January 4, 2019 - link

    The whole point of HDR is dynamic range, which is why 10-bit is desired.
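
    As a rough back-of-the-envelope illustration (using the standard PQ / SMPTE ST 2084 constants, with an arbitrarily chosen mid-range code value): quantizing the PQ signal to 8 bits leaves far coarser luminance steps than 10 bits, which is exactly where banding over a wide brightness range comes from.

    # Standard SMPTE ST 2084 (PQ) constants.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_to_nits(e):
        # Inverse PQ EOTF: normalized signal (0..1) -> luminance in cd/m^2 (nits).
        p = e ** (1 / m2)
        return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    for bits in (8, 10):
        levels = 2 ** bits
        code = levels // 2                      # a mid-range code value
        step = pq_to_nits((code + 1) / (levels - 1)) - pq_to_nits(code / (levels - 1))
        print(bits, "bit: step of roughly", round(step, 2), "nits between adjacent codes")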
  • Kamus - Thursday, January 3, 2019 - link

    This is kind of dumb... the display should just tell the OS the max peak brightness it's capable of, and have the OS tone-map to that.
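
    Something like the minimal sketch below, assuming a simple Reinhard-style rolloff and a made-up 600-nit peak; a real OS pipeline would be more involved and would get the peak from the display's reported metadata rather than a hard-coded number.

    def tonemap(scene_nits, display_peak_nits):
        # Simple Reinhard-style compression: roughly linear for dark content,
        # rolls highlights off so output never exceeds the display's peak.
        return scene_nits / (1.0 + scene_nits / display_peak_nits)

    # 600 here is just an assumed DisplayHDR 600-class peak for illustration.
    for nits in (50, 400, 1000, 4000, 10000):
        print(nits, "nit scene ->", round(tonemap(nits, 600), 1), "nits on the display")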
  • Billy Tallis - Thursday, January 3, 2019 - link

    That doesn't really address the problem of deciding which monitor to buy, which is what DisplayHDR certifications are supposed to help consumers with.
  • MrCommunistGen - Thursday, January 3, 2019 - link

    I think the most interesting part here is that they've announced an OLED spec. Maybe a bit of wishful thinking, but maybe this is a precursor to actually getting OLED monitors in the PC industry that aren't ultra-niche panels.
  • rsandru - Thursday, January 3, 2019 - link

    That's probably the most important takeaway. :-)
  • mode_13h - Thursday, January 3, 2019 - link

    I've given up on an OLED PC monitor. I'd settle for HDR + adaptive sync at a decent size, resolution, and price. Still looking...
  • doubledeej - Thursday, January 3, 2019 - link

    They've got to solve the burn-in issues with OLED before we'll see monitors meant for computer use.
  • mode_13h - Thursday, January 3, 2019 - link

    This.

    What kind of emissive alternatives are there?
  • a5cent - Friday, January 4, 2019 - link

    MicroLED is the only viable alternative on the horizon. MicroLED combines the benefits of TFT and OLED technology (emissive = perfect contrast, no organic degradation or burn in). It's proven to work but so far is too expensive to mass produce.

    We really should stop waiting for OLED to go mainstream in the PC monitor space. It's not going to happen. The flaws/issues are inherent to the organic nature and can't be engineered away. Anything that solves those flaws/issues won't be OLED.
  • lilkwarrior - Saturday, January 5, 2019 - link

    OLED monitors will do just fine. Pros insist on them being available because they know the value; it's becoming extremely annoying to have to keep an OLED reference monitor on top of a 4K ultrawide or true 5K monitor to confirm work for pros.

    OLED monitors are used in the health industry with no problem as well.
  • a5cent - Sunday, January 6, 2019 - link

    For MOST use cases OLED is unproblematic. Why cite the "health industry" niche when you can just point to the gigantic consumer television market? OLED works great there too. That's not the point.

    The point is that specifically for PC monitors, where large portions of the screen remain unchanged for long periods of time, over months or even years, and where color accuracy is often critical (more so than in most other areas, including the health industry), OLED's degradation issues aren't acceptable.

    Of course there are always exceptions, like portable monitors intended to be used for "shortish" presentations. But for most PC monitor use cases, the monitor industry is justifiably hesitant to shoulder the risk of mass returns as a result of degradation and/or burn-in.

    It's not economically feasible, so no, OLED won't do just fine. It's not happening. If OLED were viable, there would be no need for the huge investments being made in MicroLED. That's where the PC monitor market is headed.
  • lilkwarrior - Friday, January 4, 2019 - link

    OLED burn-in is exaggerated, much like SSDs running out of write cycles.

    You do the obvious common-sense things to protect them, but they both deliver can't-go-back experiences worth the hassle for most who can afford it. Just like a luxury car vs. owning a Honda Civic.

    OLED monitors already exist, and Asus & JOLED will add pro and gaming OLED monitors of the kind mainstream consumers can buy for everyday use.
  • Xex360 - Thursday, January 3, 2019 - link

    I think true blacks are more important than high brightness (as long as the screen can achieve reasonable brightness levels), and watching films is better in a dark room, so LCDs are more game-oriented while OLEDs are better suited for films.
