willis936 - Tuesday, August 20, 2019 - link
Ah, finally a just-in-time rendering option.
1000 Hz mouse: $60.
144 Hz display: $600.
Waiting until 2019 for a basic GPU driver option: priceless.
Yojimbo - Tuesday, August 20, 2019 - link
They apparently had it before and no one seemed to care about it...
prophet001 - Tuesday, August 20, 2019 - link
Integer scaling would be amazing on my 4K laptop. However, it doesn't work on the GTX 1080, or any 10xx-series cards?
JeffFlanagan - Tuesday, August 20, 2019 - link
He mentions that it might come later to the last-generation cards, but only Nvidia really knows if that's the case.
sheh - Tuesday, August 20, 2019 - link
> According to NVIDIA’s announcement, the feature hinges on the
> “hardware-accelerated programmable scaling filter available in Turing”
Riiight. Because integer scaling is a super complex and taxing operation.
If the ratio between the source and target resolutions isn't a whole number, any idea if it scales to the largest possible size, and centers? For example, 640x480 to 1920x1080 would be 1280x960.
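For reference, the "scale to the largest whole multiple, then center" behavior being asked about is simple arithmetic. A minimal sketch (a hypothetical helper, not NVIDIA's actual driver code) that reproduces the 640x480 → 1280x960 example:

```python
def integer_fit(src_w, src_h, dst_w, dst_h):
    """Largest whole-number upscale of the source that fits the target,
    with the result centered and the remainder left as black borders.
    (Illustrative helper, not NVIDIA's implementation.)"""
    scale = min(dst_w // src_w, dst_h // src_h)
    out_w, out_h = src_w * scale, src_h * scale
    x, y = (dst_w - out_w) // 2, (dst_h - out_h) // 2
    return scale, (x, y, out_w, out_h)

# 640x480 on a 1920x1080 display: 3x is too tall (1440 > 1080), so 2x wins.
print(integer_fit(640, 480, 1920, 1080))  # -> (2, (320, 60, 1280, 960))
```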
willis936 - Tuesday, August 20, 2019 - link
I mean, actual dedicated hardware blocks for scaling are a bit of a high-end feature. For instance, the Nintendo 3DS has one.
voicequal - Sunday, August 25, 2019 - link
Raspberry Pi 3 can also do nearest-neighbor scaling in the GPU to at least 1920x1080. Texture scaling is one of the most basic GPU capabilities, so it's not really a HW limitation. OpenGL or Vulkan can do it, although it probably won't be as efficient (power, fps) as doing it closer to the display output blocks.
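At a whole-number factor, nearest-neighbor scaling reduces to pixel replication, which is why it's so cheap. A CPU sketch in NumPy of what the GPU filter computes (illustrative only; the real work happens in a shader or the display pipeline):

```python
import numpy as np

def nearest_neighbor_upscale(img, factor):
    """Integer scaling = replicating each pixel into a factor x factor block.
    img is an (H, W, C) array; equivalent to GL_NEAREST sampling at a
    whole-number ratio. (Sketch, not the actual driver path.)"""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # a 640x480 source frame
print(nearest_neighbor_upscale(frame, 2).shape)  # (960, 1280, 3)
```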
Alistair - Tuesday, August 20, 2019 - link
I'd assume integer scaling would be an aspect-ratio scaling system, not a "full screen" one.
Phynaz - Tuesday, August 20, 2019 - link
Riiiight. Because you know what you’re talking about. Wait, no you don’t, because you just had to ask a bunch of stupid questions.
Monstieur - Wednesday, August 21, 2019 - link
> it scales to the largest possible size, and centers
That's exactly what it does.
evilspoons - Tuesday, August 20, 2019 - link
Doh. I just finished replaying a game that'd look way better with integer scaling vs bilinear. Oh well. (Red Alert 2 - with some modded exes it runs at native resolution during gameplay, but cutscenes and menus are switched to 640x480, 800x600, or 1024x768, all of which look nasty on my setup.)
AshlayW - Tuesday, August 20, 2019 - link
Having just switched to nvidia for the first time in 4 years (2060 Super), this is a pretty nice treat for me. Especially the integer scaling. Performance gains are always nice, and it's good to see AMD pressing competition hard to stir up new features as standard.
Now if nvidia could pull the control panel interface out of being stuck in 1999, that would be great ^.^
Alistair - Tuesday, August 20, 2019 - link
As long as they keep the left-and-right design. The new AMD panel is very responsive, but I hate the layout.
nVidia could use a Win10 application for settings, like Realtek etc.; I actually like that. It opens and works instantly.
StrangerGuy - Tuesday, August 20, 2019 - link
NV's driver control panel has always been sluggish to load and to apply settings, for no good reason. I cannot even remember when it wasn't.
lilkwarrior - Wednesday, August 21, 2019 - link
Intel actually pushed them this time more than anything else, as far as integer scaling goes.
Mr Perfect - Tuesday, August 20, 2019 - link
Image sharpening looks interesting. I've been using Reshade's image sharpening, but if the driver can do it, then all the better.
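Driver-level and Reshade-style sharpening filters are variations on a small convolution. As a rough illustration of the general technique (an unsharp-mask sketch, not NVIDIA's or Reshade's actual kernel):

```python
import numpy as np
from scipy.ndimage import convolve

def sharpen(img, amount=0.5):
    """Unsharp mask on a single-channel image (run per channel for RGB):
    boost the difference between each pixel and its local average.
    (Generic sketch; real filters use more sophisticated kernels.)"""
    img = img.astype(np.float32)
    blurred = convolve(img, np.full((3, 3), 1.0 / 9.0), mode='nearest')  # 3x3 box blur
    out = img + amount * (img - blurred)  # add back the high-frequency detail
    return np.clip(out, 0, 255).astype(np.uint8)
```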
TristanSDX - Tuesday, August 20, 2019 - link
Hopefully you (Anandtech site) will review integer scaling soon.
Monstieur - Tuesday, August 20, 2019 - link
I just tested integer scaling in several games on my Predator X27 with the new driver. 1080p should be perfectly upscaled to 2160p, but it still looks blurry and much worse than a 27" 1080p panel.
Phynaz - Tuesday, August 20, 2019 - link
You’ve done something wrong.Monstieur - Wednesday, August 21, 2019 - link
It does seem to be working with integer scaling. If I select 1440p on my 2160p monitor, it just centers the image without scaling (2160/1440 = 1.5, not a whole multiple), but 1080p and 720p (2x and 3x) are scaled to the full screen.
Monstieur - Wednesday, August 21, 2019 - link
1080p with integer scaling is definitely more "boxy" than filtered scaling, but the 2x2 pixel blocks do not have sharp edges like a native 1080p monitor.
I have an RTX, but absolutely will never log in to nvidia, so image sharpening and a lot of goodies are out of my reach, because they require, for no reason, GeForce Experience, which doesn't work unless you log in.
I hate nvidia. If AMD had support for tensorflow, I would have chosen an RX 5700 XT, which has much better bang for the buck.
Beaver M. - Tuesday, August 20, 2019 - link
You hate Nvidia, yet you support them with one of their worst generations?
Andy Chow - Thursday, August 22, 2019 - link
Same here. Hate Nvidia, but bought the RTX 2070, because tensorflow. And the RTX generation is the best so far for AI/ML. 16-bit training and TensorRT are a bonus.
Andy Chow - Thursday, August 22, 2019 - link
AMD does support tensorflow with ROCm, but it's not that fast, plus you need a PCIe platform that supports atomic operations. Also, as of ROCm 2.7, Navi is not yet supported. I feel your pain.
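For anyone trying the ROCm route, a quick sanity check that TensorFlow actually sees the card (assuming the tensorflow-rocm package is installed; the same check works on NVIDIA builds):

```python
# pip install tensorflow-rocm   # AMD's ROCm build (NVIDIA users: tensorflow-gpu)
import tensorflow as tf

# True only if the runtime was built with GPU support and a device is visible.
print("GPU available:", tf.test.is_gpu_available())
```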