
  • zeeBomb - Tuesday, December 8, 2015 - link

    Wow. Life's gonna be good for the AMD user in 2016!
  • JoeMonco - Tuesday, December 8, 2015 - link

    +5 funny.
  • ImSpartacus - Tuesday, December 8, 2015 - link

    Yeah, their roadmap is looking impressive.

    I wonder if Nvidia will be able to catch up.
  • DEADLIFT - Friday, December 11, 2015 - link

    Yes, I wonder if NVIDIA will ever catch up to the technology that AMD hasn't even released yet. Who can say?
  • Joserin - Monday, February 15, 2016 - link

    Pretty sure Nvidia will
    http://wccftech.com/nvidia-pascal-gpu-gtc-2015/
  • Zeus Hai - Tuesday, December 8, 2015 - link

    Yeah, if there's still any.
  • SolMiester - Thursday, December 10, 2015 - link

    2016?..LOL, AMD users are good at waiting!
  • Michael Bay - Thursday, December 10, 2015 - link

    If I were an AMD user, I'd dread the next driver release, not dream about how life is going to be good in 2016.
  • Spunjji - Friday, December 11, 2015 - link

    Blah blah. Happily running nVidia and AMD here in various machines, like always.
  • JonnyDough - Saturday, January 23, 2016 - link

    Likewise. Fanboys. :\ I've had issues with both, and both, for the most part, work.
  • JonnyDough - Monday, January 25, 2016 - link

    I'm sure it will be great come 2017 if it all works as projected. At least for those with a lot of wealth.
  • nathanddrews - Tuesday, December 8, 2015 - link

    My 2016 wishlist:
    40" OLED, 4K, 120Hz w/FS
  • wiak - Tuesday, December 8, 2015 - link

    change 40" to 65" and we agree =)
  • nathanddrews - Tuesday, December 8, 2015 - link

    Change "65"" to "ALL OLEDs" all will be right with the world.
  • tential - Wednesday, December 9, 2015 - link

    Seriously man. I don't get what the aversion is to having large monitors. I understand you need small monitors, but some of us want 50+ inch monitors. I'll take 65" although I really want 70-80. Anything over 50 though. Hmmm, maybe I'll get the Wasabi Mango UHD550 now to hold me over until these new displays come. I'm going to guess 2017 is when we'll see large freesync displays above 40-50 inches.
  • slickr - Friday, December 11, 2015 - link

    If you want the best possible gaming experience, a relatively "smaller" monitor is better, especially for certain types of games, since the bigger the monitor is, the further away you need to sit, making your mouse-to-monitor coordination worse.

    Sitting between 60 and 120cm is the optimal range for mouse-monitor hand-eye coordination.

    For watching movies and general work though, of course a larger monitor is almost always better!
  • jasonelmore - Tuesday, December 8, 2015 - link

    OLEDs burn in and are horrible for video games, which commonly have static UI elements for health, mini-map, inventory, etc...

    While the color is good, OLED is not the answer to HDR due to its inherent design flaws. If you buy an OLED panel, its colors will only be at peak performance for 1-2 years, and then they will start to dull.

    Samsung is innovating in this segment with Quantum Dot technology, and I'd like to see it move over to monitors.
  • nathanddrews - Tuesday, December 8, 2015 - link

    You should probably use the product before spreading FUD. OLED makes for an amazing monitor.
  • Stuka87 - Wednesday, December 9, 2015 - link

    Sure, if you like poor color accuracy and don't mind burn-in. But if you like over-saturated, unrealistic colors, then sure, it's great.
  • Shadow7037932 - Wednesday, December 9, 2015 - link

    Burn in really isn't an issue on modern OLED screens. As for colors, that depends on calibration.
  • SirGCal - Thursday, December 10, 2015 - link

    Yes it still is. I have 3 that can prove that, all modern and popular brands/models. I won't be buying another OLED.
  • nathanddrews - Thursday, December 10, 2015 - link

    Good, more for the rest of us. ;-)
  • jospoortvliet - Sunday, December 13, 2015 - link

    wow, where did you find OLED monitors?
  • medi03 - Thursday, December 10, 2015 - link

    Look at PS Vita's and get a clue, please.
    While burn in is a problem, whining about color is ridiculous.
  • Asomething - Wednesday, December 9, 2015 - link

    Samsung isn't the first to use quantum dot tech. Sony was first out of the gate with it.
  • nathanddrews - Wednesday, December 9, 2015 - link

    Also, considering Samsung just shut down a major LCD factory in November and is ramping up OLED production to compete with LG for 2016, QD tech isn't going anywhere soon. Sony is replacing its production/studio monitors with OLED, so it's only a matter of time before they begin shipping to consumers. Just like plasma, CRT, or LCD, you can get permanent burn-in if you abuse it. Image retention - which is more common on emissive displays - is different and is not permanent. OLED is the future... and the future is NOW! :D
  • milkod2001 - Friday, December 11, 2015 - link

    OLED is the future... and the future is NOW!

    Heard that 10 years ago and still waiting LOL
  • TelstarTOS - Thursday, December 10, 2015 - link

    not going to happen in 2016 with a 10bit panel.
  • Drumsticks - Tuesday, December 8, 2015 - link

    Is AMD moving off of GCN anytime soon? If they aren't, how are they going to keep up with Nvidia's architectural improvements, especially since they're already, in the best case, slightly behind in power efficiency? I like AMD and I'd be interested in a Nano 2 for a small build next year, but I wonder if they'll be able to keep up without architectural changes.
  • wiak - Tuesday, December 8, 2015 - link

    Well, when all consoles are GCN and most of the current AMD stack is GCN, why move away from it? Even Nvidia started to move to a GCN-like architecture in their latest GPUs.

    My GCN-based Radeon HD 7970 still kicks ass, even though it's 4 years old this winter.
  • pt2501 - Tuesday, December 8, 2015 - link

    Agreed. My R9 280 (aka 7950 Boost) was sold to an individual with an Alienware X51 with only a 330W supply. I thought it was going to be a lost cause, because the card's recommended power requirement is a 500W power supply.

    Well, it turns out that if you underclock the R9 280 by at least 250 MHz it works fine, and it can still play World of Warships on ultra at 1080p.

    GCN is an impressive architecture that has scaled very well since its release.
  • ImSpartacus - Tuesday, December 8, 2015 - link

    You know that the 380 has a TDP of 190W, right? Toss in 90W for the CPU and you have 50-ish watts for other stuff on that 330W PSU.

    You could probably plug a 980 (advertised 165w) in there and it would do just fine.
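
    A quick sanity check of that budget (a rough Python sketch using the TDP figures quoted above; actual board power under load can differ from TDP, so treat it as ballpark math):

    ```python
    # Rough power budget for a 330W PSU using the TDP numbers mentioned above.
    psu_w = 330
    gpu_tdp_w = 190   # R9 380 (TDP as quoted)
    cpu_tdp_w = 90    # assumed mainstream desktop CPU
    print(f"Left for everything else: {psu_w - gpu_tdp_w - cpu_tdp_w} W")  # ~50 W
    ```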
  • Yorgos - Thursday, December 10, 2015 - link

    Or use a more high-tech and efficient Fury Nano, and get for free all the goodies like FreeSync and the next-generation features that only GCN offers.
  • zodiacsoulmate - Tuesday, December 8, 2015 - link

    500W is the recommended spec, considering other components also draw power from the PSU.
    The 280 itself should draw no more than 250W.
  • Mr Perfect - Tuesday, December 8, 2015 - link

    That's part of the problem though, isn't it? If a four-year-old card isn't much behind the current card, then things have stagnated. We need to start moving forward again, and AMD needs people to buy a new card more than twice a decade.
  • Samus - Tuesday, December 8, 2015 - link

    They can only milk it for so long. The R9 380 is basically the same die and configuration as the 7950, a 4 year old card. NVidia is about to release their third architectural leap over AMD since GCN came out. Not good. GCN scales well, but not for performance. Fury is their future.
  • DragonJujo - Tuesday, December 8, 2015 - link

    The R9 380 is an update of the R9 285; the Tahiti Pro chip in the HD 7950 was rebadged as the R9 280 and then dropped in the 300 series.
  • e36Jeff - Tuesday, December 8, 2015 - link

    Fury is an updated GCN, the same generation as Tonga (GCN 1.2). The real issue has been that they've been stuck at 28nm for years now because the 20nm node fell through. This year, when they (and Nvidia) move down to the 14/16nm node, we should see some pretty good jumps in performance from both sides of the aisle. Add HBM to that, and 2016 looks like a pretty sweet year for GPUs.
  • beginner99 - Wednesday, December 9, 2015 - link

    I doubt we will actually see these jumps in 2016, and my reference point of measurement is performance/$. You can get a 290 off eBay for $200, or if you bought one before the 300 series came out, they were about that price new. You will never get that performance/$ at 14nm because the process is more expensive.
  • frenchy_2001 - Friday, December 11, 2015 - link

    The process is more expensive, but the dies will be significantly smaller.
    We will have to wait and see how the 2016 crop of GPUs ends up, but 14/16nm FinFET will give:
    - smaller dies
    - lower power consumption
    - possibly faster clocks
    Now to see if those improvements from the process can actually be harnessed by huge GPU dies (20nm planar, for example, did not scale well). We will have to wait, as there is currently no such big and power-hungry die made on a 14/16nm node (Samsung/TSMC).
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @Samus: "GCN scales well, but not for performance. Fury is their future."

    Fury is GCN. Their issue isn't GCN as GCN is actually a relatively loose specification that allows for plenty of architectural leeway in its implementation. Also note that GCN 1.0, GCN 1.1, and GCN 1.2 are significantly different from each other and should not be considered a single architecture as you seem to take it.

    ATi's current issue is the fact that they are putting out a third generation of products on the same manufacturing node. My guess is that many of the architectural improvements they were working on for the 20nm chips can't effectively be brought to the 28nm node. You see a bunch of rebadges because they decided they would rather wait for the next node than spend cash that they probably didn't have on new top to bottom architecture updates to a node that they can't wait to get off of and probably won't recoup the expense for. They opted to update the high end where the expenses could be better covered and they needed a test vehicle for HBM anyways.

    On the other hand, nVidia, with deeper pockets and greater marketshare decided that it was worth the cost. Though, even they took their sweet time in bringing the maxwell 2.0 chips down to the lower end.
  • slickr - Friday, December 11, 2015 - link

    Nvidia's products are pretty much slight improvements over their 600 series graphics architecture. They haven't had any significant architectural improvements since basically their 500 series. This is because both companies have been stuck on 28nm for the past 5 years!

    Maxwell is pretty much a small update to the same technology that Nvidia has already been using since the 600 series.
  • Budburnicus - Wednesday, November 16, 2016 - link

    That is TOTALLY INCORRECT! Maxwell is a MASSIVE departure from Kepler! Not only does it achieve FAR higher clock speeds, but it does more with less!

    A GTX 780 Ti is effectively SLOWER than a GTX 970, even at 1080p where the extra memory makes no difference, and where the 780 Ti has 2880 CUDA cores, the 970 has just 1664!

    There are FAR too many differences to list, and that is WHY Kepler has not been seeing ANY performance gains with newer drivers! Because the programming for Kepler is totally different from Maxwell or Pascal!

    Also, now that Polaris and Pascal are released: LMFAO! The RX 480 cannot even GET CLOSE to the 1503 MHz I have my 980 Ti running on air! And if you DO get it to 1400 it throws insane amounts of heat!

    GCN is largely THE SAME ARCHITECTURE IT HAS ALWAYS BEEN! It has seen incremental updates such as memory compression, better branch prediction, and stuff like the Primitive Discard Accelerator - but otherwise is TOTALLY unchanged on a functional level!

    Kind of like how Pascal is an incremental update to Maxwell, adding further memory compression, Simultaneous Multi Projection, better branch prediction and so on. Simultaneous Multi Projection adds an extra 40% to 60% performance for VR and surround monitor setups, when Maxwell - particularly the GTX 980 and 980 Ti - is already FAR better at VR than even the Fury X! Don't take my word for it, go check the Steam Benchmark results on LTT forums! https://linustechtips.com/main/topic/558807-post-y...

    See, UNLIKE the Kepler-to-Maxwell transition, Pascal is BASICALLY just Maxwell on speed, a higher-clocked Maxwell chip! And it sucks FAR less power, creates FAR less heat and provides FAR more performance, as the RX 480 is basically tied with a GTX 970 running a 1400 core! And FAR behind a 980 at the same or higher!

    Meanwhile the GTX 1060 beats it with ease, while the GTX 1070 (which even at 2100 MHz is just a LITTLE less powerful than the 980 Ti at 1500 MHz), the 1080, and the Pascal Titan SHIT ALL OVER THE FURY X!

    Hell the GTX 980 regular at 1500 MHZ kicks the ASS of the Fury X in almost every game at almost every resolution!

    Oh and Maxwell as well as Pascal are both HDR capable.
  • Furzeydown - Tuesday, December 8, 2015 - link

    Both companies have been rather limited by the same manufacturing node for the past four years as well, though. It limits things to tweaks, efficiency improvements, and minor features. As far as performance goes, both companies are neck and neck with monster ~600mm² dies.
  • ImSpartacus - Tuesday, December 8, 2015 - link

    But Nvidia's monster die is generally considered superior to amd's monster die despite using older memory tech. Furthermore, amd's monster die only maintains efficiency because it's being kept very chilly with a special water cooler.

    It's not neck and neck.
  • Asomething - Wednesday, December 9, 2015 - link

    That is down to transistor density; AMD is packing more into the same space, which drives the minimum requirements for the cooler up.
  • Dirk_Funk - Wednesday, December 9, 2015 - link

    Neck and neck as in there's hardly a difference in how many frames are rendered per second. It's not like either one has any big advantages over the other, and they are made almost exclusively for gaming so if fps is the same then yes it is neck and neck as far as most people are concerned.
  • OrphanageExplosion - Thursday, December 10, 2015 - link

    Not at 1080p and 1440p they aren't...
  • RussianSensation - Wednesday, December 23, 2015 - link

    The reference cards are very close.

    1080p - 980Ti leads by 6.5%
    1440p - 980Ti leads by just 1.1%
    4k - Fury X leads by 4.5%

    Neither card is fast enough for 4K, while both are a waste of money for 1080p without using VSR/DSR/Super-sampling. That leaves 1440p resolution where they are practically tied.
    http://www.techpowerup.com/reviews/ASUS/R9_380X_St...

    The only reason 980Ti is better is due to its overclocking headroom. As far as reference performance goes, they are practically neck and neck as users above noted.
  • zodiacsoulmate - Tuesday, December 8, 2015 - link

    What do u mean nvidia is moving to a gcn-like architecture?
  • Michael Bay - Thursday, December 10, 2015 - link

    He's drunk or crazy. Typical state for an AMD user.
  • RussianSensation - Wednesday, December 23, 2015 - link

    It's actually correct. GCN-like implies Pascal will be more oriented towards GPGPU/compute functions -- i.e., graphics cards are moving towards general purpose processing devices that are good at performing various parallel tasks well. GCN is just a marketing name but the main thing about it is focus on compute + graphics functionality. NV is re-focusing its efforts heavily on compute with Pascal. For example, they are aiming to increase neural network performance by 10X.
  • extide - Tuesday, December 8, 2015 - link

    While nVidia picks a new name for each generation it's not like they are tossing the old design in the trash and building an entirely new GPU ... I would imagine we will see "GCN 2.0" next year, and I would be surprised if better power efficiency was not one of the main features.
  • Refuge - Tuesday, December 8, 2015 - link

    That has been their trend for the last two architecture updates they've made. Granted, they were small adjustments, but all in the name of power and efficiency.
  • Jon Irenicus - Tuesday, December 8, 2015 - link

    Apparently in Maxwell they got that power efficiency by stripping out a lot of the hardware schedulers AMD still has in GCN, so the efficiency boost and power decrease were not "free." It will mean that Maxwell cards are less capable of context switching for VR, and can't handle mixed graphics/compute workloads as well as GCN cards. That was fine with DX11 and it worked well for them, but I don't think those cards will age well at all. But that may have been part of the point.
  • haukionkannel - Tuesday, December 8, 2015 - link

    They only need to upgrade some parts of GCN and they are just fine!
    Nvidia did a very good job on the compression architecture of their GPUs, and that leads to much better energy usage because they can use a smaller (and cheaper) memory pathway. (There are other factors too, but that one is quite important.)
    AMD has a higher-bandwidth version of their 380, but the card does not benefit from it, so they are not releasing it, because the 380 also has relatively good compression built in. Make it better, increase ROPs, and GCN is competitive again.
  • WaltC - Tuesday, December 8, 2015 - link

    Odd, considering that nVidia is very much in catch-up mode presently concerning HBM deployment and even D3d12 hardware compliance...;) But, I don't do mobile at all, so I can't see it from that *cough* perspective...
  • Michael Bay - Thursday, December 10, 2015 - link

    HBM does not offer any real advantage to the end user presently, so there is literally no catch-up on nV's part. Same with DX12.
  • Macpoedel - Tuesday, December 8, 2015 - link

    Nvidia changes the name of their architecture for every little change they make; that doesn't mean AMD has to do so as well. GCN 1.0 and GCN 1.2 are almost as far apart as Maxwell 2 and Kepler are. GPU architectures haven't changed all that much since both companies started using the 28nm node.
  • Frenetic Pony - Tuesday, December 8, 2015 - link

    Supposedly next year will bring GCN 2.0. Also, it's already confirmed that there are basically no architectural improvements from Nvidia next year. Pascal is almost identical to Maxwell in most ways except for a handful of compute features.
  • Azix - Wednesday, December 9, 2015 - link

    GCN is just a name. It does not mean there aren't major improvements. Nvidia constantly changing their architecture name is not necessarily an indication it's better; it's usually the same improvements over an older arch.

    Also it seems AMD is ahead of the game with GCN and nvidia is playing catchup, having to cut certain things out to keep up.
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @Azix: "Also it seems AMD is ahead of the game with GCN and nvidia is playing catchup, having to cut certain things out to keep up."

    I wouldn't go that far. nVidia is simply focused differently than ATi at the moment. ATi gave up compute performance for gaming back in the 6xxx series and brought compute performance back with the 7xxx series. Given AMD's HSA initiative, I doubt we'll see them make that sacrifice again.

    nVidia on the other hand decided to do something similar going from Fermi to little Kepler (6xx series). They brought compute back to some extent for big Kepler (high end 7xx series), but dropped it again for Maxwell. This approach does make some sense as the majority of the market at the moment doesn't really care about DP compute. The ones that do can get a Titan, a Quadro, or if the pattern holds, a somewhat more DP capable consumer grade card once every other generation. On the other hand, dropping the DP compute hardware allows them to more significantly increase gaming performance at similar power levels on the same process. In a sense, this means the gamer isn't paying as much for hardware that is of little or no use to gaming.

    At the end of the day, it is nVidia that seems to be ahead in the here and now, though not by as much as some suggest. It is possible that ATi is ahead when it comes to DX12 gaming and their cards may age more gracefully than Maxwell, but that remains to be seen. More important will be where the tech shakes out when DX12 games are common. Even then, I don't think that Maxwell will have as much of a problem as some fear, given that DX11 will still be an option.
  • Yorgos - Thursday, December 10, 2015 - link

    What are the improvements that Maxwell offers?
    That it runs the binary blobs from crapworks better?
    e.g. lighting effects http://oi64.tinypic.com/2mn2ds3.jpg
    or efficiency
    http://www.overclock.net/t/1497172/did-you-know-th...
    or 3.5 GB of VRAM,
    or an obsolete architecture for the current generation of games (which has already started)?

    Unless you have money to waste, there is no other option in the GPU segment.
    A lot of GTX 700 series owners say so (amongst others).
  • Michael Bay - Thursday, December 10, 2015 - link

    I love how you conveniently forgot to mention not turning your case into a working oven, and then grasped sadly for muh 3.5 GBs as if it matters in the real world with overwhelming 1080p everywhere.

    And then there is the icing on the cake in the form of hopeless wailing about the "current generation of games (which has already started)". You sorry amdheads really don't see irony even if it hits you in the face.
  • slickr - Friday, December 11, 2015 - link

    Nvidia still pretty much uses the same techniques/technology as they had in their old 600 series graphics. Just because they've named the architecture differently doesn't mean it is different.

    AMD's major architectural change will be in 2016 when they move to 14nm FinFET, as will Nvidia's when they move to 16nm FinFET.

    AMD already has design elements like HBM in their current GCN Fiji architecture that they can more easily implement for their new GPUs, which are supposed to start arriving in late Q2 2016.
  • Zarniw00p - Tuesday, December 8, 2015 - link

    FS-HDMI for game consoles, DP 1.3 for Apple who would like to update their Mac Pros with DP 1.3 and 5k displays.
  • Jtaylor1986 - Tuesday, December 8, 2015 - link

    This is a bit strange since almost the entire presentation depends on action by the display manufacturer industry and industry standard groups. I look forward to them announcing what they are doing in 2016, not what they are trying to get the rest of the industry to do in 2016.
  • bug77 - Tuesday, December 8, 2015 - link

    Well, most of the stuff a video card does depends on action by the display manufacturer...
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @Jtaylor1986: "This is a bit strange since almost the entire presentation depends on action by the display manufacturer industry and industry standard groups."

    That's how it works when you want a non-proprietary solution that allows your competitors to use it as well. ATi doesn't want to give away their tech any more than nVidia does. However, they also realize that Intel is the dominant graphics manufacturer in the market. If they can get Intel on board with a technology, then there is some assurance that the money invested isn't wasted.

    @Jtaylor1986: "I look forward to them announcing what they are doing in 2016, not what they are trying to get the rest of the industry to do in 2016."

    Point of interest: It is hard to get competitors to work together. They don't just come together and do it. Getting Acer, LG, and Samsung to standardize on a tech (particularly a proprietary one) means that there has already been significant effort invested in the tech. Also, getting Mstar, Novatek, and Realtek to make compatible hardware is similar to getting nVidia and ATi or AMD and Intel to do the same. IBM forced Intel's hand. Microsoft's DirectX standard forced ATi and nVidia (and 3DFX for that matter) to work in a compatible way.

    Beyond that, it isn't as if ATi is doing nothing. It is simply that their work requires cooperation from all parties involved. Cooperation that they've apparently obtained. This is what is required when you think about the experience beyond just your hardware. nVidia does something similar with their Tesla partners, The Way It's Meant to be Played program, and CUDA support. Sure, they built the chip themselves with G-Sync, but they still had to get support from monitor manufacturers to get the chip into a monitor.
  • TelstarTOS - Thursday, December 10, 2015 - link

    And the display industry has been EXTREMELY slow at producing BIG, quality displays, not even counting 120 and 144Hz. I don't see them embracing HDR (10-12 bit panels) anytime soon :(
    The monitor I bought last year is going to last a while, until they catch up with the video card makers.
  • wiak - Tuesday, December 8, 2015 - link

    " an important development especially given the fact that DisplayPort support is non-existent on consumer AMD laptops."
    All laptops these days have eDP aka Embedded DisplayPort to the display panel... :P
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @wiak: "all laptops these days have eDP aka Embedded DisplayPort to the display panel... :P"

    I think they were talking external ports. Though that still raises the question of why there aren't more adaptive sync laptops available as it shouldn't be that hard to implement.
  • Blitzvogel - Tuesday, December 8, 2015 - link

    Nice to see AMD supporting new display tech as usual, especially the push for Freesync on both DP and HDMI, but some updates on the actual GPU architecture would've been nice too!
  • A5 - Tuesday, December 8, 2015 - link

    I've seen some HDR video content and it is the real deal. Very excited to see it come to the home video space soon...maybe in a decade we'll get HDR TV shows :-p
  • Agent_007 - Tuesday, December 8, 2015 - link

    "to directly support the HDCP 2.2 standard, which is being required for all 4K/HDR content"
    This claim is presented in multiple 4K video related articles, but it is incorrect. Netflix and Ultra HD Blu-ray may require HDCP 2.2 support for 4K/HDR, but you can, for example, download Big Buck Bunny in 4K60 and watch it without HDCP 2.2, or download Life of Pi in 4K UHD HDR.

    That also means that those Gaming & Photos parts in the image are incorrect.
  • Murloc - Tuesday, December 8, 2015 - link

    I think it's implied that they're talking about traditional and paying home cinema customers, who will be using UHD blu-rays and netflix.
  • Mr Perfect - Tuesday, December 8, 2015 - link

    If RTG is talking, ask them about motion blur compensation! As much as I'd like to use an open standard like Freesync/Adaptivesync, the Low Motion Blur compensation in GSync is really tempting, especially for FPS games. I'd love to see some feature parity between AMD and Nvidia here.
  • Jon Irenicus - Tuesday, December 8, 2015 - link

    The only sad part of this presentation is that it looks like we will have to wait much longer to get a holy grail display / port.

    Displayport 1.3 has enough bandwidth for 4k @ 120Hz, but apparently NOT enough to ALSO include HDR.

    The dream computer display tops out at 40-43" in size (to keep a similar PPI at 4K as the current 27" displays at 1440p) and is:

    OLED
    4K
    VRR range of 35Hz - 120Hz
    low input latency
    HDR - and here is the rub: DisplayPort 1.3 can't give all of this at once. Disappointed.
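
    The numbers behind that HDR complaint are easy to sanity-check. A rough sketch (Python, uncompressed video, ignoring blanking intervals, and assuming DP 1.3's ~25.92 Gbps payload after 8b/10b coding): 4K at 120Hz fits at 8 bits per channel but not at the 10 bits per channel HDR wants.

    ```python
    def required_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
        """Uncompressed video bandwidth in Gbps, ignoring blanking intervals."""
        return width * height * refresh_hz * channels * bits_per_channel / 1e9

    DP13_PAYLOAD_GBPS = 25.92  # HBR3: 4 lanes x 8.1 Gbps, minus 8b/10b coding overhead

    for bpc in (8, 10):
        need = required_gbps(3840, 2160, 120, bpc)
        fits = "fits" if need <= DP13_PAYLOAD_GBPS else "does not fit"
        print(f"4K@120Hz, {bpc}-bit: {need:.1f} Gbps ({fits} in DP 1.3)")
    ```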
  • Beany2013 - Tuesday, December 8, 2015 - link

    Are there even any games that you can play at 4k/120hz that can be driven to that level by the current generation (or upcoming) of GPUs, full stop?

    You're not wrong, but I don't think we're there yet, are we?
  • Jon Irenicus - Tuesday, December 8, 2015 - link

    Not the latest games, no. This is more about next year's GPUs though, the first die shrink in several years. We can almost sort of push 4K @ 60fps with certain settings turned down on top-end cards, but next year we should be able to more reliably go above 60fps on top-end single-GPU solutions, and then we'll want 4K displays that go beyond 60fps... It's a pity we can't get HDR too. Though to be honest, unless it's an OLED I don't even want an HDR display; what's the point without the deep blacks? And local dimming is not good enough.
  • Nintendo Maniac 64 - Wednesday, December 9, 2015 - link

    I wouldn't be surprised if you could do 4k HDR at 90 or 100Hz.
  • Frenetic Pony - Tuesday, December 8, 2015 - link

    RE HDR rendering: HDR rendering for games is, in some ways, really easy. Any halfway decent high-end game today already tonemaps from 16 bits per channel HDR, though some hacks like 10-bit render targets for less important steps may have to go (tests will need to be done).

    No, the hard part will be content creation. Textures will need to be done in a higher colorspace, meaning capture equipment like cameras etc. will need to support it, then monitors will need to support it, then art tools will need to support it, then the game engine will need to support it. Considering how long high-end games take to make, it's going to take quite a while before it shows up, unfortunately.
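
    For what it's worth, here is a minimal sketch of the tonemapping step being described (a toy Reinhard curve in Python standing in for the filmic curves real engines use; true HDR output would also involve a PQ/ST.2084 transfer function and wide-gamut primaries, not just more bits):

    ```python
    import numpy as np

    def reinhard_tonemap(hdr_linear):
        """Collapse unbounded linear HDR values into [0, 1) for display
        (simple Reinhard curve; real engines use fancier filmic curves)."""
        return hdr_linear / (1.0 + hdr_linear)

    def quantize(img01, bits=8):
        """Quantize a [0, 1] image to the target bit depth (8-bit SDR vs 10-bit HDR)."""
        levels = 2 ** bits - 1
        return np.round(np.clip(img01, 0.0, 1.0) * levels).astype(np.uint16)

    # A toy 16-bit-float style render target: values above 1.0 are bright highlights.
    frame = np.array([0.05, 0.5, 1.0, 4.0, 16.0], dtype=np.float32)

    print(quantize(reinhard_tonemap(frame), bits=8))    # today's 8-bit SDR output path
    print(quantize(reinhard_tonemap(frame), bits=10))   # same curve, more output levels
    ```

    The point being that the engine already works with HDR data internally; what changes for HDR displays is mainly the back end of this pipeline (and, as noted above, the source art).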
  • Mr Perfect - Wednesday, December 9, 2015 - link

    Actually, I think the graphics professionals are already ahead of us here. If you look at high-end displays meant for graphics work (think $2k IPS screens from NEC), they're all 10-bit with internal color processing of 14 bits. It's just us plebeians who are stuck with 8-bit panels.
  • HollyDOL - Wednesday, December 9, 2015 - link

    Got an Eizo CX271 myself and can confirm, more bits really make a difference, esp. when my wife works with raw photos. Alas, for movies/games there is so far no difference I could see (esp. for movies with their 16-235 range). The few apps that more or less support rendering in HDR (like 10 bits per channel) seem to have an impact especially with darker tones, which look more like you would expect compared to 8 bits per channel. I don't expect 10+ bits per channel to be a mature standard in games or general desktop apps for at least 2 more years, though... Except for professional software built directly for it, the implementations look more like school/PoC/experiment/learning projects, and there is so far no really solid consistency between the results. I'd expect it to become a generally supported standard in about 5 yrs, give or take.
  • Oxford Guy - Wednesday, December 9, 2015 - link

    1000+ nit brightness? Get ready for flashing ads to sear your eyeballs. Commercials are going to be even more irritating.
  • Oxford Guy - Wednesday, December 9, 2015 - link

    "
    Meanwhile RTG is also working on the software side of matters as well in conjunction with Microsoft. At this time it’s possible for RTG to render to HDR, but only in an exclusive fullscreen context, bypassing the OS’s color management. Windows itself isn’t capable of HDR rendering, and this is something that Microsoft and its partners are coming together to solve."

    What about OS X?
  • Razdroid - Wednesday, December 9, 2015 - link

    So, 'balkanization' is actually a word in US/UK !?
  • Oxford Guy - Wednesday, December 9, 2015 - link

    yes, if you capitalize it
  • Midwayman - Wednesday, December 9, 2015 - link

    This might finally be the year I go 4K. I've been waiting for 120Hz monitors, and it looks like the pieces are falling into place.
  • StevoLincolnite - Wednesday, December 9, 2015 - link

    All I ask of you AMD is to release something amazing on 14nm at an affordable price. (So I can get two.)

    Also, update your legacy drivers so that old cards (that are more than capable enough for their intended needs) have Windows 10 support. (Like the Radeon 2000, 3000, and 4000 series.)
  • medi03 - Thursday, December 10, 2015 - link

    "while you can find DisplayPort in many monitors, you would be hard pressed not to find HDMI"

    Try to find HDMI in any g-sync monitor.
  • BurntMyBacon - Thursday, December 10, 2015 - link

    @medi03: "Try to find HDMI in any g-sync monitor."

    Try to find FreeSync/Adaptive Sync on any G-Sync monitor. Given their mutual exclusivity, I'm pretty sure they can simply count these monitors out.
  • bizude - Thursday, December 10, 2015 - link

    Something is funky with the math, or I'm missing information. It says that DP 1.3 will support 3440x1440@190Hz, yet based on the bandwidth increase, the math is telling me the maximum would be 165Hz.
  • sheh - Saturday, December 12, 2015 - link

    I get 218Hz = DP1.3's 26Gbps bandwidth (32Gbps - overhead) / 3440 / 1440 / 3 (RGB) / 8 (assuming 8-bit color)
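
    For reference, a back-of-envelope version of that calculation (a rough Python sketch using DP 1.3's ~25.92 Gbps payload; it ignores blanking intervals, which is likely why quoted real-world maximums such as 190Hz come out lower):

    ```python
    def max_refresh_hz(width, height, bits_per_channel=8, channels=3, payload_gbps=25.92):
        """Upper bound on refresh rate over DP 1.3 (HBR3: 4 lanes x 8.1 Gbps,
        minus 8b/10b coding), ignoring blanking intervals."""
        bits_per_frame = width * height * channels * bits_per_channel
        return payload_gbps * 1e9 / bits_per_frame

    print(round(max_refresh_hz(3440, 1440)))       # ~218 Hz at 8-bit, matching the number above
    print(round(max_refresh_hz(3440, 1440, 10)))   # ~174 Hz at 10-bit color
    ```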
  • Harry Lloyd - Friday, December 11, 2015 - link

    I would like to see a truthful HDR comparison on actual displays. Those comparison pictures say absolutely nothing. The colors I see on normal displays do not seem dull to me, but maybe it is because I do not know what not dull looks like.
  • johnpombrio - Saturday, December 12, 2015 - link

    Reluctant to talk? AMD? They cannot shut up! They have been having press conferences and press releases and tons of PPT slides as fast as they can set them up. The massively overhyped Fury cards were a good example. AMD talked about them for so long that everyone thought AMD was going to release a blockbuster. When it turned out to be a good card but not what they were promising ("Fastest card on the planet!"), there were a lot of disappointed people.
    Now we have ZEN MANIA! For a chip that will not be released in volume until 2017, AMD has been hyping it up for over a year now. That will be close to two YEARS of talks, presentations, and even more slides while Intel releases almost 3 generations of chips during the same period.
    Those that can, release product. Those that can't, PowerPoint.
  • geeks - Thursday, December 17, 2015 - link

    Really useful blog. It looks impressive.

    <a href="https://geeksonrepair.com/">Online Computer Repair</a>
  • quickbunnie - Tuesday, January 26, 2016 - link

    For Brandon, the argument that you need a proper viewing environment is imho a straw man argument. None of the image quality factors on either OLED or LCD are really worth looking at in poor viewing environments, other than arguably peak brightness.
  • euskalzabe - Wednesday, July 20, 2016 - link

    The HDR discussion of this review was super interesting, but as always, there's one key piece of information missing: WHEN are we going to see HDR monitors that take advantage of these new GPU abilities?

    I myself am stuck at 1080p IPS because more resolution doesn't entice me, and there's nothing better than IPS. I'm waiting for HDR to buy my next monitor, but being 5 years old, my Dell ST2220T is getting long in the tooth...
