58 Comments
Oxford Guy - Wednesday, June 24, 2020 - link
Companies are really dropping the ball with product naming. Once upon a time, if a serious revision occurred it would be coupled with a clear name change.
DirectX 10 became DirectX 10.1
But now everything is "super" and "awesome" and "ultimate". It's both asinine and confusing.
DigitalFreak - Wednesday, June 24, 2020 - link
You forgot "turbo" and "extreme".
tamalero - Wednesday, June 24, 2020 - link
GT, ULTRA, DELUXE, GOLD, PLATINUM, XT
Byte - Thursday, June 25, 2020 - link
Millennium, XP, Vista, 8.1
close - Thursday, June 25, 2020 - link
Half of your examples make no sense. Windows Me was named as such for the turn of the millennium; that was the whole point, it makes sense. And 8.1... came after 8, pretty straightforward. Vastly different versions, vastly different names. Windows Vista wasn't just called XP Ultimate.

But I would argue that when it comes to operating systems, brands lasting upwards of a decade, it matters a lot less. The brand name of the OS is intentionally used as such so it's easily recognizable by regular people and not a confusing mess of 12_1_2.35-type versions. And from 7 on (so for over a decade) it's more than clear: 7 > 8 > 8.1 > 10, with 10 having versions clearly based on launch date (1507, 1909, 2004).
Now Direct X 12 Ultimate is most definitely not in the same category.
FlukeLSX - Thursday, June 25, 2020 - link
lol, I seriously have never met or talked to anyone in the IT field who referred to Windows Me in a positive way.
close - Thursday, June 25, 2020 - link
You still haven't. I was clearly talking about the chosen *name* and the occasion for which it was chosen. I have to ask, if someone did speak highly of Windows Me, are you sure you would have understood what they meant? ;)
NZLion - Friday, June 26, 2020 - link
It could be argued that even if Millennium made sense as a name for the successor to Win98, it only got called that because the ball was dropped in renaming NT5 to release it as 2000. Both Windows lineages would have made better sense from a naming perspective if 2K/ME were switched.

'95 > '98 > 2000 makes sense. NT, ME, XP are all 2-letter names that would have worked as a set.
eek2121 - Sunday, June 28, 2020 - link
Windows 2000 represented a huge attempt to unify Win9x compatibility with the NT line. They did a fantastic job as well; I ran it on my desktop for years with little issue. I expect that they were forced to cut a release early for continued business support, while they needed extra time to make the additional improvements that went into XP. They renamed XP away from a year because quite frankly it would have been confusing to have Windows 2000 and then Windows 2001.

Honestly, we need another Windows 2000 right now: a slim, stable, fast Windows-based OS without any BS.
MFinn3333 - Friday, June 26, 2020 - link
It introduced System Restore to the Windows family. Of course, one could argue it introduced the need for it as well.
RadiclDreamer - Saturday, June 27, 2020 - link
I have been in the IT field for over 20 years, and I never minded ME. The problem was terrible-quality hardware with terrible drivers trying to run it. If you ran it on decent hardware it was as solid as, if not more so than, Windows 98/SE.
Beaver M. - Sunday, June 28, 2020 - link
Now you haven't met one more.
Flunk - Wednesday, June 24, 2020 - link
I guess you're not a fan of the next DirectX release, DirectX 12 Even More Ultimate + Infinity?
nevcairiel - Wednesday, June 24, 2020 - link
It's still 12.2 basically; Ultimate is just marketing.
Jorgp2 - Wednesday, June 24, 2020 - link
Yeah, literally the same naming scheme they've had since 10.1
brontes - Wednesday, June 24, 2020 - link
DirectX 12 THICC
Samus - Wednesday, June 24, 2020 - link
I think they are just scared of going to 13.
Eliadbu - Wednesday, June 24, 2020 - link
DX12 is a relatively unsuccessful API; you can see it in its slow adoption by the market and mostly lackluster performance. You can blame developers, or say it does things differently by giving developers more hardware control, but the end result is that some DX12 implementations are awful compared to their DX11 counterparts. Not to mention how it pretty much killed SLI and CFX and gave us a useless explicit mGPU feature that makes things harder to implement. Microsoft needs to think about how to give developers the features of DX12 while also helping with optimization, or rethink it all with DX13.
Hul8 - Wednesday, June 24, 2020 - link
Rather than DX12, isn't it the modern deferred rendering techniques (in both DX12 and DX11) that killed multi-GPU for gaming?
Hul8 - Wednesday, June 24, 2020 - link
They're also the reason good ol' MSAA isn't effective for many applications anymore (only some of the jaggies will be touched, while still incurring the performance penalties), and we're stuck with post-processing (vaseline) and temporal AA.
Flunk - Wednesday, June 24, 2020 - link
DX12 was never intended to replace DX11. It's really only for developers who want to write much closer to the metal.
brantron - Wednesday, June 24, 2020 - link
I'm holding out for Super DirectX 12 Turbo HD Remix. At least everyone knows that means it's the official successor to Hyper Direct X 12 Anniversary Edition.
Oxford Guy - Saturday, June 27, 2020 - link
Now with LED lighting.
bharatwd - Thursday, June 25, 2020 - link
To make it more confusing, once platinum was a more expensive metal than gold & now gold costs more than platinum. Does that mean my gold cards will somehow perform "better" than my platinum cards? Maybe a software update? :P
lilkwarrior - Thursday, June 25, 2020 - link
Not to be nitpicky, but the naming before wasn't helpful at all either. It violated semantic versioning, which definitely confused the people who actually *used* the APIs. Serious revisions of any software conventionally should not be point releases at all.

Rules are meant to be broken, and Microsoft has a notorious history of doing that w/ point releases instead of releasing a major version.
It's accurate that Direct X 12 has gotten this marketing to combat the stagnation of full compliance by Nvidia & especially AMD. Rather than saying this is the final point release of what they imagine DX12 to be, they've called it DX12 Ultimate for marketing. It's not really that egregious to me.
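To make the semantic-versioning complaint concrete, here is a toy Python sketch; the marketing-name mapping at the end is an assumption for illustration, not how Microsoft versions DirectX internally. The point: numeric versions compare unambiguously, while a suffix like "Ultimate" carries no ordering on its own.

```python
# Toy semantic-version comparison: dotted numeric versions order unambiguously.
def parse(version: str) -> tuple:
    """Split a dotted version string into a tuple of ints for comparison."""
    return tuple(int(part) for part in version.split("."))

assert parse("10.1") < parse("10.2") < parse("12.0")
# Point releases sort below the next major version, as semver intends:
assert parse("12.1") < parse("12.2") < parse("13.0")

# A marketing name like "12 Ultimate" has no inherent ordering; you need
# out-of-band knowledge that it corresponds to feature level 12_2 (assumed
# mapping here, purely for illustration):
marketing_to_feature_level = {"12 Ultimate": (12, 2)}
assert marketing_to_feature_level["12 Ultimate"] > parse("12.1")
```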
FlukeLSX - Thursday, June 25, 2020 - link
It's also mostly BS. When is Nvidia going to get back to their roots and the brute-force performance gains older generations of video cards were known for?

Example: the 3080 is not going to be any better than a 2080 Ti. Same shiz, different scoop, as far as Nvidia is concerned. They jack up the prices on the 2080 Tis so they can justify the waste of money they put into the 3080. Just like they did with the 1080 Ti vs the 2080.
MFinn3333 - Friday, June 26, 2020 - link
Seriously? DirectX got its name because it was asinine and confusing. DirectX was named by reporters because Microsoft was adding "Direct" to things like DirectPlay or DirectSound. Of course, before that there was WinG, which should never, ever be spoken of in front of polite company.

Nothing has changed.
Oxford Guy - Saturday, June 27, 2020 - link
DirectX 10.0 going to 10.1 is more clear than supercalifragilisticexpialidocious.
eddman - Wednesday, June 24, 2020 - link
If current RTX games (which use DXR 1.0 + nvidia's extensions) were to be updated to support DXR 1.1, does it mean their ray-tracing options can be enabled on RDNA2 cards?
Yojimbo - Wednesday, June 24, 2020 - link
RTX is an implementation of DXR, not the other way around. So it depends on the developers. If they implemented DXR in their games then, assuming AMD releases a DXR driver for RDNA2, which they should, the games should run the ray tracing on the AMD hardware with perhaps some minor tweaks the developers have to make. But if they used code outside DXR to target NVIDIA's RTX more directly, then they would need more major rewriting of their code to get it to work on AMD's hardware.
brontes - Wednesday, June 24, 2020 - link
So to be clear, RTX is a superset of DXR? Any idea how "super" it is, in actual practice?
Yojimbo - Wednesday, June 24, 2020 - link
It isn't necessarily a superset; it only has the potential to be. RTX must implement all of DXR, but it might include more functionality, or it might include an alternate method of implementing the same functionality. I have no idea if RTX does or does not do that at the moment, only that it's technically possible for them to do it.

The question is how the early-adopter developers chose to implement the hardware ray tracing acceleration. NVIDIA may have created libraries that exposed some things in a different way. As a non-ray-tracing example: NVIDIA implemented mesh shaders in Turing, and they must have created some way for developers to target them. Now mesh shaders are part of DirectX 12 Ultimate, so there should be a DirectX API way of using them. Most likely those are two different ways and at least require a change in syntax in the code. But the change could be more significant than that, depending on how the two different groups decided to implement the feature, and how much NVIDIA knew of the upcoming Microsoft implementation when they were making their own. So the functionality is the same, but I would assume you could not just take your game targeting NVIDIA's mesh shaders pre-DX12 Ultimate and run it on RDNA2, even though RDNA2 has the functionality. On the other hand, if your game implemented the mesh shaders through a DX12 Ultimate API call, then it should run on RDNA2.
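The vendor-path-versus-standard-API distinction described above can be sketched abstractly. Every class and method name below is invented for illustration; this is not actual RTX, DXR, or mesh-shader code, just a toy model of why one porting path is cheap and the other is not.

```python
# Hypothetical sketch: a game written against a cross-vendor API runs on any
# conforming driver, while one written against a vendor-only library has
# different call sites that must be rewritten to port.

class StandardRayTracingAPI:
    """Stand-in for a cross-vendor API like DXR: any conforming driver works."""
    def __init__(self, driver):
        self.driver = driver

    def trace(self, scene):
        return self.driver.dispatch_rays(scene)

class VendorADriver:
    def dispatch_rays(self, scene):
        return f"traced {scene} on vendor A"

class VendorBDriver:
    def dispatch_rays(self, scene):
        return f"traced {scene} on vendor B"

class VendorAExtensions:
    """Stand-in for a vendor library: same feature, different entry point."""
    def trace_rays_fast_path(self, scene):
        return f"traced {scene} via vendor-only path"

# Game code targeting the standard API runs on either driver unchanged:
for driver in (VendorADriver(), VendorBDriver()):
    StandardRayTracingAPI(driver).trace("level1")

# Game code targeting the vendor library calls a different signature entirely,
# so moving it to the standard API means rewriting these call sites:
VendorAExtensions().trace_rays_fast_path("level1")
```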
nathanddrews - Wednesday, June 24, 2020 - link
Nice explanation, thanks.
rocky12345 - Wednesday, June 24, 2020 - link
Great way to explain it, thank you. If there were a way to give an upvote here, I would have done that as well.
catavalon21 - Wednesday, June 24, 2020 - link
+1 is about as good as it gets around here.
Yojimbo - Wednesday, June 24, 2020 - link
Haha, that graphics preference box tripped me out. I looked away while scrolling the article and looked back and thought something had popped up from my laptop. Then I looked closer and saw that it didn't list the graphics options that I actually had, so I thought it was some advertisement or spam pop-up from anandtech.com. Then I finally realized that it was an embedded graphic for the article.
MyRandomUsername - Wednesday, June 24, 2020 - link
"WDDM 2.7 [..] allows GPUs to more directly manage their VRAM"Hopefully this open ways to fully utilize VRAM applications under Windows 10.
Since WDDM 2.0 a single VRAM-heavy app (e.g. CUDA/CG) can only use about 80% of VRAM (or more accurately... 90% of 90%) as opposed to >95% (?) in Win7~8.1... because reasons.
See discussions on e.g.: https://forums.developer.nvidia.com/t/windows-10-u...
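As a back-of-envelope sketch of that budget (the "90% of 90%" figure is the commenter's observation, not a documented limit, and the 8 GB card is an arbitrary example):

```python
# WDDM reportedly reserves a slice of VRAM, and the runtime reserves a slice
# of what remains; "90% of 90%" lands near the ~80% the commenter observes.
total_vram_gb = 8.0  # example card
usable_gb = total_vram_gb * 0.90 * 0.90

assert round(usable_gb / total_vram_gb, 2) == 0.81  # ~81% of the card
```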
brontes - Wednesday, June 24, 2020 - link
> NVIDIA is enabling support for hardware accelerated GPU scheduling ... I find it interesting that NVIDIA lumps video playback in here as a beneficiary as well, since video playback is rarely an issue these days

Isn't this the second part of the multi-monitor + different refresh rate fix? Or did the recent WDDM update fix that itself?
I dropped my second monitor for the time being, in part because having video running on it (60Hz) would screw with my main monitor (120Hz), and it was super irritating.
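For what it's worth, the awkwardness of mixed refresh rates is easy to see numerically: the two displays' vsync ticks only coincide at an interval set by the greatest common divisor of their rates, so on most frames the compositor is off-cadence for one display. This is a simplified model; real scheduling is more involved.

```python
from math import gcd

def vsync_alignment_ms(hz_a: int, hz_b: int) -> float:
    """Milliseconds until the two displays' refresh ticks coincide again."""
    return 1000.0 / gcd(hz_a, hz_b)

# 120 Hz + 60 Hz: every 60 Hz tick lines up with a 120 Hz tick (every 16.7 ms),
# so the pair is at least harmonically related:
assert vsync_alignment_ms(120, 60) == 1000.0 / 60

# 144 Hz + 60 Hz: ticks only coincide every ~83 ms (12 frames vs 5 frames),
# so most frames are presented off-cadence on one of the displays:
assert round(vsync_alignment_ms(144, 60), 1) == 83.3
```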
Destoya - Wednesday, June 24, 2020 - link
Personally, I had quite a few issues with a 144Hz main display and 60Hz secondary around the release of Turing, but they've been completely fixed with driver updates for quite a while now. Might be worth another try for your setup, though I realize that it's not a guarantee that everything is fixed.
yeeeeman - Wednesday, June 24, 2020 - link
Greeeat, Navi users that cheered for saving 50 bucks over Nvidia are now missing out. Nice
Jiai - Wednesday, June 24, 2020 - link
I don't own a Navi or Turing GPU, but since Minecraft and Quake 2 are the only games with real benefits from RT, because it is just too slow, I don't expect to see modern games really using it before Nvidia 4000 / Radeon 7000.
Destoya - Wednesday, June 24, 2020 - link
There's quite a few other titles that benefit. It's still kind of the situation where you're paying $50 extra for a functional tech demo, but it's not just Quake and MC. Control is not really playable at 1440p with high graphics settings on a 5700/5700XT, but DLSS is basically magic and means you can upscale to 1440p with no significant quality penalty, plus run full RT effects even on the 6GB 2060. Metro looks fantastic with RT turned on as well.
Jiai - Wednesday, June 24, 2020 - link
I saw videos of all the RT games, and I failed to see a meaningful upgrade: an improved reflection here, better shadows there... Maybe if I owned Control and Metro I would see a difference. The lighting improvements in Q2 and MC make them feel like completely new games.

DLSS 2.0 impressed me on the other hand; I can imagine so many applications of the same concept (photo, music, etc...)
lilkwarrior - Thursday, June 25, 2020 - link
Your comment is kinda laughable. Of course you don't notice a difference if you don't own a game that uses the revolutionary features. That said, AMD's lack of support & the delay of next-gen consoles for them to catch up definitely limited the effectiveness of Turing, ironically.

Most PC games are console ports; of course most of the dramatic amount of next-gen gaming support in Turing won't really be noticed till next-gen console games release.
nievz - Wednesday, June 24, 2020 - link
You should fire up your Ryzen rig and run your benchmarks to know the difference with HAGS. Also, Assassin's Creed Odyssey is now at 97% utilization, from 90-92% previously. I'm psyched!
RDR2
Settings: MAX
Before: min fps 29, avg 51
Now: min fps 46, avg 74
2070S, 3700X at 4.25GHz, 3600 CL16
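Plugging those numbers in, the reported uplift works out to roughly +59% on minimums and +45% on averages (simple before/after arithmetic on the commenter's figures, not an independent benchmark):

```python
def pct_uplift(before: float, after: float) -> float:
    """Percentage improvement from a before/after pair."""
    return (after - before) / before * 100

assert round(pct_uplift(29, 46)) == 59  # minimum fps: 29 -> 46
assert round(pct_uplift(51, 74)) == 45  # average fps: 51 -> 74
```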
david slayer - Wednesday, June 24, 2020 - link
Hi. Is it true hardware accelerated GPU scheduling is only for 10-series+ Nvidia GPUs? I only got a 970. :(
lilkwarrior - Thursday, June 25, 2020 - link
It's a DX12 Ultimate feature, so of course the 970 doesn't have it. Turing and Ampere were made in conjunction w/ Microsoft to finally realize DX12's original vision, which is Windows 10+ only.

Maxwell was supposed to be the swan song of rendering DX11 games really well, with proper DX12 rendering slowed down by Windows 7 still being prevalent.
Things like variable rate shading & so on are things your GPU won't support either. That said, you were wise to adopt a Ti owner's usual upgrade path of buying every other generation this time around.
Dribble - Monday, June 29, 2020 - link
Not a massively helpful reply. The simple answer is that hardware scheduling works on Pascal and newer, so not just Turing/Ampere; it works on Nvidia 10-series or newer. The rest of what you wrote is not relevant.
06GTOSC - Wednesday, June 24, 2020 - link
"At this point no previously-announced games have confirmed that they’ll be using DX12U, though this is just a matter of time, especially with the Xbox Series X launching in a few weeks."I'm sorry what now?
Ryan Smith - Wednesday, June 24, 2020 - link
Sorry. I keep forgetting I'm in the timeline where the Xbox Series X doesn't launch until the fall. Typo corrected!
catavalon21 - Wednesday, June 24, 2020 - link
Well played.
Todaynottoday - Thursday, June 25, 2020 - link
Bug-ridden garbage! I applied the Windows 10 update, turned on the Windows GPU scheduler, and downloaded and installed the new Nvidia Studio driver 451.48. But the driver breaks HDR playback in MPC-HC (using madVR as the renderer)! WTF???

I installed the Studio driver for my RTX 2060 Super, and it broke HDR functionality in video playback (the Windows HDR settings remain fine). Studio drivers are supposed to be double-checked for this kind of nonsense!

I reverted to driver 442.92 (clean install/uninstall both times) and HDR is working just fine in movie playback. I left the Windows update alone, so it's clearly not that breaking the HDR playback!
Tried to alert Nvidia via their reporting tool by speaking to an agent @ https://nvidia.custhelp.com/app/chat/chat_landing
but when my turn came, my wait time in the queue went back to 9 mins (twice!), Nvidia agents clearly 'hanging up' the chat calls!
Anyway, this update is broken and a waste of time; God alone knows what else it breaks!
ipkh - Thursday, June 25, 2020 - link
Maybe you should file a bug with MadVR instead. That, and post a driver feedback bug report with Nvidia. Those are the best ways to get your corner case resolved.
eddman - Thursday, June 25, 2020 - link
While it's possible that it's a bug in the new driver, it's also possible that madvr or MPC-HC are handling HDR video through a non-standard method that was working with previous nvidia drivers.

Try asking in the MPC-HC and madvr official threads in the doom9 forum and see if others have this issue.
eddman - Thursday, June 25, 2020 - link
Here: https://forum.doom9.org/showthread.php?t=146228&am...

Apparently madvr doesn't know how to handle HDR with the latest driver.
FYI, madvr hasn't been updated in more than a year.
Rookierookie - Thursday, June 25, 2020 - link
I feel like that Windows GPU selection is going to break a ton of games.
Gigaplex - Sunday, June 28, 2020 - link
"In short, Windows 10 2004 has done away with the “Run with graphics processor” contextual menu option within NVIDIA’s drivers, which prior to now has been a shortcut method of forcing which GPU an application runs on it an Optimus system. In fact, it looks like control over this has been removed from NVIDIA’s drivers entirely. As noted in the support document, controlling which GPU is used is now handled through Windows itself, which means laptop users will need to get used to going into the Windows Settings panel to make any changes."I am fed up with Microsoft constantly removing features from Windows that I actually use.
Beaver M. - Sunday, June 28, 2020 - link
Tell me about it. I can't count the times I got stuck in a game and had to hard reset the PC because Windows doesn't offer task switching to the desktop anymore. Coupled with Windows sometimes not task switching at all (at least not caring about focus anymore), when that happens it feels like I am using Windows 95 again.