
  • Xajel - Monday, July 29, 2019 - link

    AFAIK, AMD does support 10-bit color on OpenGL... but only their FireGL/Radeon Pro cards expose a 10-bit OpenGL buffer, which is what professional apps require.

    To be clear, AMD's 10-bit support works on the desktop: consumer cards can output 10-bit color, and newer ones can also do 12-bit for HDR. But that support doesn't extend to professional applications. At least, that's what I know; I can't say whether it's still accurate, since my information is about 1-2 years old.
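    For reference, a "10-bit OpenGL buffer" means the application asks the driver for a 10-bit-per-channel default framebuffer. Below is a minimal sketch of that request (my own illustration, using the glfw and PyOpenGL Python packages, not anything from AMD's tooling); on cards without the pro-driver support the request is typically ignored and you silently get 8 bits back:

        import glfw
        from OpenGL.GL import glGetIntegerv, GL_RED_BITS

        assert glfw.init()

        # Ask for a 10-bit-per-channel (30-bit) color buffer plus 2 bits of alpha.
        glfw.window_hint(glfw.RED_BITS, 10)
        glfw.window_hint(glfw.GREEN_BITS, 10)
        glfw.window_hint(glfw.BLUE_BITS, 10)
        glfw.window_hint(glfw.ALPHA_BITS, 2)

        window = glfw.create_window(640, 480, "10 bpc test", None, None)
        glfw.make_context_current(window)

        # In a legacy/compatibility context this reports what the driver actually
        # granted (on a core profile you would query the framebuffer attachment
        # instead). Expect 10 on a workstation card, often 8 on a consumer one.
        print("red bits granted:", glGetIntegerv(GL_RED_BITS))

        glfw.terminate()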
  • krazyfrog - Monday, July 29, 2019 - link

    AFAIK only the Radeon Pro series of graphics cards supports 10-bit color; the usual Radeon cards will only do 10-bit in DirectX, similar to GeForce.
  • nathanddrews - Monday, July 29, 2019 - link

    This was true last I checked.
  • Ryan Smith - Monday, July 29, 2019 - link

    So I asked AMD about this years ago, and they told me that they dropped any 30-bit restrictions earlier this decade. That said, I don't have a suitable monitor for testing this, so I can't immediately do any confirmation testing.
  • CKing123 - Tuesday, July 30, 2019 - link

    AMD likely only meant that they support 10-bit displays, not that they allow rendering in 10-bit color. See amdmatt's statement from 2016: https://community.amd.com/thread/203751

    "Agree with the folks above.

    Radeon Settings and Catalyst Control Center (CCC) show the display output colour depth in Bits per colour/channel. This is to match the colour depth on the display for output.

    Enabling 10 bit in Radeon Settings/CCC for consumer products will match the colour depth on the display for output, but the surface being rendered can be 6/8 bit and is dithered to 10 bit output if this feature is enabled.

    In consumer products (non Firepro) 10 bit rendering is not supported. This is not a per product support but rather class. Workstations versus Consumer. Currently only Workstation products support 10 bit rendering. If you are looking for viewing/working with 10 bit content and looking for full support (for example using Photoshop for 10 Bit rendering) you cannot use any consumer card for it and would instead require a Firepro product."
  • mode_13h - Sunday, August 4, 2019 - link

    "the surface being rendered can be 6/8 bit and is dithered to 10 bit output if this feature is enabled."

    That makes no sense. You don't normally create additional information through dithering. Dithering is used when starting with a higher bit depth and converting to a lower one.
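    For example, here's a minimal NumPy sketch of the usual direction (my own illustration, nothing to do with AMD's driver): a smooth 10-bit gradient has 1024 levels, the 8-bit output only 256, and dithering trades the lost precision for noise instead of visible banding.

        import numpy as np

        def dither_10bit_to_8bit(img10, seed=0):
            # img10 holds 10-bit values (0..1023); the result is 8-bit (0..255).
            rng = np.random.default_rng(seed)
            scaled = img10.astype(np.float32) * (255.0 / 1023.0)
            noise = rng.uniform(-0.5, 0.5, size=scaled.shape)  # breaks up banding
            return np.clip(np.round(scaled + noise), 0, 255).astype(np.uint8)

        # Smooth 10-bit horizontal gradient: 1024 distinct input levels,
        # at most 256 distinct output levels.
        gradient10 = np.tile(np.arange(1024, dtype=np.uint16), (16, 1))
        out = dither_10bit_to_8bit(gradient10)
        print(out.dtype, len(np.unique(out)))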
  • eroneko - Monday, July 29, 2019 - link

    "NVIDIA’s Studio drivers, which can be installed on any GeForce/Titan card, desktop and mobile."
    I think they only support Pascal cards and later?
  • Freakie - Monday, July 29, 2019 - link

    Officially yes it's only Pascal and later. But installing it on older cards is a simple .ini change in Notepad.
  • NikoTodd - Monday, July 29, 2019 - link

    Can't seem to find any .ini file after extracting the 431.70 Studio driver (the GTX 1080 package) into a folder. I'm on a GTX 980, btw. Is there a tutorial somewhere on how to do the trick? Thanks.
  • Freakie - Monday, July 29, 2019 - link

    Oops, it's an .inf apparently. Here's a guide: https://forums.guru3d.com/threads/nvidia-inf-drive...
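    The gist (illustrative only; section names and install targets differ between driver releases, so follow the guide rather than copying this, and note that editing the .inf invalidates the driver's signature) is to add your card's PCI device ID next to the officially supported ones in nv_dispi.inf, e.g. for a GTX 980:

        ; Hypothetical excerpt from a modded nv_dispi.inf -- "Section094" is a
        ; placeholder; reuse whatever install section an existing entry points to.
        [NVIDIA_Devices.NTamd64.10.0]
        %NVIDIA_DEV.13C0% = Section094, PCI\VEN_10DE&DEV_13C0

        [Strings]
        NVIDIA_DEV.13C0 = "NVIDIA GeForce GTX 980"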
  • NikoTodd - Tuesday, July 30, 2019 - link

    Thank you. After a few attempts I managed to get it to work, but to be honest, I don't see much of a difference in PS. :) Anyway, thank you for the help!
  • Freakie - Tuesday, July 30, 2019 - link

    Happy to help! It's a hack that laptop owners have had to use for a loooong time if they wanted the latest drivers instead of waiting for their OEMs to release their "approved" versions.

    And of course the 10-bit support for non-Quadro cards will only make a difference if 1) your screen supports 10-bit color depth and 2) you enable it within Windows/NVIDIA Control Panel and within Photoshop itself (there is a separate option inside PS that you have to enable). But even then it isn't always immediately obvious unless you work with gradients a lot.

    The largest benefit for work that doesn't use gradients is the finer color precision: the screen can display the exact color you want more accurately, so it's as close as possible for print/production. Having your work come off the printer with a slightly different hue can be an expensive mistake if you send it to a printing house!
  • shabby - Monday, July 29, 2019 - link

    Did nvidia use a 3dfx voodoo card to come up with that 24bit image?
  • HollyDOL - Monday, July 29, 2019 - link

    It's written right there that it's simulated...
  • 0ldman79 - Tuesday, July 30, 2019 - link

    Don't knock the Voodoo...

    Mine topped out at 16 bit color anyway...
  • eva02langley - Monday, July 29, 2019 - link

    So funny of NVIDIA, trying to show a 30-bit image versus a 24-bit image over the internet, to people with monitors that cannot show the difference.
  • FreckledTrout - Monday, July 29, 2019 - link

    It's like they made a 4-bit vs 8-bit image to show 24-bit vs 30-bit. Cute.
  • BINARYGOD - Monday, July 29, 2019 - link

    What's actually going on is that neither you nor the first person to respond to you understands that this was a slide from a presser, where people in the audience could see the difference; it wasn't made specifically for the internet (though of course they share the entire slide show online afterwards, because they nearly always do).

    A better question is why AnandTech chose to include it, actually.
  • Awful - Monday, July 29, 2019 - link

    S I M U L A T E D
  • Oliseo - Tuesday, July 30, 2019 - link

    If you don't have a monitor that can show the difference, there's little point in even reading the article, let alone complaining that you don't have a monitor that can show the difference.

    I imagine that for those who do have a monitor that can show the difference, this is good news.
  • FXi - Monday, July 29, 2019 - link

    Intel.
    I suspect Intel will bring HDMI 2.1 (which will be widespread by the time they release their cards), VRR and, I'll bet, 10-bit to the regular user.

    The same thing happened when users started considering AMD for VRR because FreeSync was becoming an "everywhere" feature with no cost premium. Facing a looming loss of competitive advantage, NVIDIA brought features it wasn't going to offer before that point.

    But yes, with the advent of HDR and 10- or 12-bit hitting more product areas, this makes good sense. It just should have happened years ago.
  • 0ldman79 - Tuesday, July 30, 2019 - link

    How are they going to market this to normal folks?

    The familiar terms for color are 8-bit, 16-bit, 24-bit and 32-bit, with 24-bit being what we truly have and alpha thrown in to make 32-bit.

    By that terminology, 30-bit sounds like a downgrade. What are they going to call it? A billion colors? 40-bit? HDR?
  • Oliseo - Tuesday, July 30, 2019 - link

    Why do you think they will?

    And if they do, perhaps they can pay someone on Love Island to do a demo.
  • crimsonson - Tuesday, July 30, 2019 - link

    You are making a mountain out of a molehill here.

    10-bit video has been a mainstay of professional video production for 2+ decades now.

    On the consumer side, since high-end LCDs and OLEDs arrived, it's something that is often quoted in spec sheets and confirmed by reviewers.

    Video bit depth DOES NOT need to come in chunks of 8. You can argue that the container is likely encoded in 8/16/24/32-bit words, but 10-bit or 12-bit is technically accurate.
  • Freakie - Tuesday, July 30, 2019 - link

    Yes, 10-bit does get marketed by its color count (about 1.07 billion colors), but what you're mixing up is total bit depth vs. bits per channel. When they refer to 10-bit here, they are referring to 10 bits per channel, whereas the 24 and 32 bits that you refer to are combined from all 3 color channels. So your 24-bit is 6 bits per channel, and 32-bit is 8 bits per channel.
  • mode_13h - Sunday, August 4, 2019 - link

    WTF? Nobody uses 6 bits per channel. I was with you 'till that point. You're confusing 3 vs 4 channel specifications. Anyone talking about 24-bit color means 8-bits per channel, for 3 channels.

    32-bit color is *practically* the same as 24-bit color - just that they're counting a 4th channel that's usually used to hold alpha (transparency) values that can be used for compositing.
  • D. Lister - Friday, August 2, 2019 - link

    @0ldman79: 24-bit color is 8 bits(red)+8 bits(blue)+8 bits(green), giving you a total of 16.7 million colors. 32-bit color is actually just 24-bit color + 8 bits for alpha transparencies, and still only gives you 16.7 mil colors.

    30-bit color OTOH is actually 10 bits each for the red, green and blue channels, giving you a total of 1.07 billion colors. And it is definitely no gimmick. Properly set up, it can provide noticeably smoother gradients in compatible videos and image types (e.g. PNG or TIFF). For gamers that means significantly better smoke/fog/explosions/god rays/ambient occlusion.
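    For anyone who wants to check the arithmetic, a quick sketch in Python:

        # bits per channel -> total displayable colors (RGB only, alpha excluded)
        for bpc in (8, 10):
            print(f"{bpc} bpc = {3 * bpc}-bit color: {(2 ** bpc) ** 3:,} colors")
        # 8 bpc = 24-bit color: 16,777,216 colors
        # 10 bpc = 30-bit color: 1,073,741,824 colors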
  • Agent Smith - Tuesday, July 30, 2019 - link

    So if I selected 10-bit in the NVIDIA control panel, that was just for DirectX applications all along, is that right?
  • Freakie - Tuesday, July 30, 2019 - link

    DirectX and full-screen OpenGL applications. So if you watched a 10-bit movie using MPC-HC in full screen, it was supposed to work before this change. Games that use OpenGL HDR could also theoretically work, as long as they ran in full-screen mode. It was a weird mess; glad it's finally being sorted.
  • s.yu - Wednesday, July 31, 2019 - link

    So uh...besides the slightly larger VRAM and so-called stability, what's left for Quadro again?
  • diehardmacfan - Wednesday, July 31, 2019 - link

    Certain OpenGL capabilities are still restricted to Quadros; for an extreme example, look at Siemens NX.
  • mode_13h - Sunday, August 4, 2019 - link

    Some of the Quadro RTX cards support ECC memory.

    Where there's a difference in RAM, it's usually substantial - like 2x or 4x. For instance, the Quadro RTX 5000 is basically the same as the GeForce RTX 2080 Super, but the Quadro has 16 GB of RAM while the GeForce has only 8 GB.

    Oh, and the Quadro RTX cards also had better specs than GeForce cards built on the same GPUs, but the Supers now seem to have surpassed them.
  • mode_13h - Sunday, August 4, 2019 - link

    Also, they nerfed tensor FP16 multiply with 32-bit accumulate on the GeForce cards. On GeForce it runs at half speed, as a nudge for people to use the Quadro RTX or Titan RTX cards for training deep learning models.
  • mode_13h - Sunday, August 4, 2019 - link

    And the Quadros have auxiliary power connectors on the short edge, so you can fit them in a 3U case.
  • mode_13h - Sunday, August 4, 2019 - link

    And the lower-end Quadros are single-slot width, but I think these models typically run slower than their GeForce counterparts. For instance, consider the 160 Watt Quadro RTX 4000 vs. the 215 Watt RTX 2080 or 250 Watt RTX 2080 Super.
