27 Comments
Stochastic - Monday, March 12, 2018 - link
Long-term, this could be a big deal in the console space (I'm thinking next-gen consoles). Variable refresh rates along with dynamic resolution scalers make targeting 60 FPS a lot more feasible. Right now the weak Jaguar cores are the bottleneck that keeps most Xbox and PS4 games locked to 30 FPS, but with Ryzen cores coming in next-gen systems that will no longer be the case. The auto low latency mode would also let non-techies benefit from the low-latency modes offered by modern TVs. The end result could be large swaths of people enjoying games with greatly reduced input lag.
MonkeyPaw - Monday, March 12, 2018 - link
Considering both PS and Xbox use all-AMD SoCs, I wonder if Sony will also add the feature?
Ryan Smith - Monday, March 12, 2018 - link
The PS4 Pro's display controller is a newer generation, since it supports HDMI 2.0. However, it's not guaranteed, since we don't know the inner workings, and ultimately it's up to Sony whether they want to pursue it.
close - Tuesday, March 13, 2018 - link
I thought most gamers actually connect their console to a big screen TV (probably 50+"). How many of those support FreeSync? HDMI 2.1's adaptive sync is the closest thing most people will have to this, but it's still not really in the hands of consumers.
If I have to connect my console to a "PC" LCD and play from the comfort of the desk chair, I'd rather use a PC for it.
jnemesh - Friday, April 6, 2018 - link
Just found out that all of Samsung's "Q" Series 2018 sets (the ones JUST hitting stores) support FreeSync and VRR, and support up to 120Hz at 1080p! (We have to wait for HDMI 2.1 for 4K@120Hz.) These are out now, and competitively priced against other premium sets on the market.
jeremyshaw - Monday, March 12, 2018 - link
A lot of people talk about the current gen of Xbox/PS consoles as if they were x86 PCs, just because they use the x86 ISA. There is very little architectural similarity to a common PC (I'm not speaking of the x86 ISA!), especially the PS4.
Pertaining to the specific topic at hand, Sony does not use AMD's display output controller, instead using a proprietary chip (IIRC, it was a Sony-designed chip from Panasonic, I may be wrong on that). Xbox One/S/X use the APU's built-in AMD display output controller.
Alexvrb - Tuesday, March 13, 2018 - link
What evidence do you have that console-specific builds target 30 FPS because they are bottlenecked by the CPU? I'm pretty sure they optimize heavily, use all available threads, and offload as much work as possible to the GPU and custom blocks. Dynamic scaling has allowed Fortnite BR to hit 60 FPS pretty consistently (once you're on the ground) even on the base Xbox One. Sounds more like a graphics bottleneck to me.
Trixanity - Monday, March 12, 2018 - link
Any word on whether FreeSync 2 support only entails HDR, or if it also includes the other parts of the spec? I'm more specifically fishing for LFC support.
jordanclock - Monday, March 12, 2018 - link
Looks like FreeSync already plays well with LFC: https://www.amd.com/Documents/freesync-lfc.pdf
Ryan Smith - Monday, March 12, 2018 - link
LFC is actually a native FreeSync feature. It's simply not available if a display doesn't support a wide enough range of refresh rates (>2x).
All FreeSync 2 certification does is require that a monitor supports a wide range. It doesn't actually have any implication for the source device (video card/console).
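The ">2x range" requirement Ryan describes can be sketched numerically. This is a rough illustration of the frame-multiplication idea behind LFC, not AMD's actual implementation; the function name and sample ranges are made up for the example:

```python
import math

def lfc_refresh_rate(fps, vrr_min, vrr_max):
    """Pick a refresh rate for a given game frame rate.

    If fps falls below the display's VRR floor, repeat each frame an
    integer number of times so the effective refresh rate lands back
    inside the supported range. This only always works when
    vrr_max >= 2 * vrr_min, which is why LFC needs a wide range.
    """
    if fps >= vrr_min:
        return fps  # frame rate is inside the range; drive it directly
    multiplier = math.ceil(vrr_min / fps)  # how many repeats per frame
    refresh = fps * multiplier
    if refresh > vrr_max:
        return None  # range too narrow for LFC to find a multiple
    return refresh
```

On a hypothetical 48-144Hz panel, a 25 FPS game gets each frame scanned out twice at an effective 50Hz; on a narrow 48-75Hz panel, 40 FPS has no workable multiplier (80Hz overshoots the max), which is exactly the less-than-2x failure case.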
piroroadkill - Tuesday, March 13, 2018 - link
LFC isn't a property of FreeSync support on the card, but rather the monitor. The monitor needs to have a wide enough refresh rate range (it used to be that the largest value had to be at least 2.5× the smallest, but it could be down to 2× today), and then LFC works.
hybrid2d4x4 - Monday, March 12, 2018 - link
This is great news! I don't own consoles but play most action/adventure PC games on a TV, and hopefully this will spur the inclusion of variable-refresh-rate in future TVs. I'm still on a 1080p TV and didn't really plan to upgrade until VRR is included (or my current TV fails).
A5 - Monday, March 12, 2018 - link
I suspect we'll see a lot of it in mainstream displays when HDMI 2.1 hits next year.
The Auto Low Latency Mode talked about here is part of HDMI 2.1 as well.
mdrejhon - Tuesday, March 13, 2018 - link
"Low Latency Mode" is mainly applicable to VSYNC ON, which is usually the case on consoles. But it also works on some PC monitors via a Custom Resolution Utility, using a Quick Frame Transport technique of Large Vertical Totals -- basically, a 144Hz monitor can sometimes be tricked into a 60Hz low-latency fixed-Hz mode by using a 60Hz signal with the pixel clock of 144Hz (large padding in the Vertical Front Porch). This accelerates refresh cycle delivery of lower refresh rates to the full dotclock rate.
Chad - Monday, March 12, 2018 - link
This is a pretty big deal. Not now, but for the future of console gaming as a whole.
nevcairiel - Monday, March 12, 2018 - link
I wonder if many TVs will actually bother to implement either FreeSync directly (unlikely, imho), or HDMI 2.1 VRR (still a big question mark for me on TV adoption). Or are you supposed to get a big gaming screen for this?
c1979h4life - Monday, March 12, 2018 - link
It will probably be prevalent on 55+ inch mid- to high-end TVs with 120Hz displays; 60Hz displays dominate the mid to low end. Don't let the writing on the box fool you: people will have to do some research to determine if their display has a true 120Hz panel or if it's just marketing on the box.
StevoLincolnite - Monday, March 12, 2018 - link
Don't forget native 1440p support, finally.
Bought the Xbox One X on release and have been extremely disappointed about being stuck at only a paltry 1080p. So it's just sitting there gathering dust.
Manch - Tuesday, March 13, 2018 - link
LOL, even paltry 1080p looks better with the XB1X. Frame rate is better in several games too. If you spent $500 on a console to gather dust, you're an idiot. You're probably a troll. Definitely an idiot.
PeachNCream - Tuesday, March 13, 2018 - link
I can't imagine that a person as concerned with display resolution as you appear to be wouldn't do a few seconds of research to discover that a console runs games at 1080p, rather than unknowingly buying one, finding out after the fact, and letting that one thing decide whether or not it gets used. And then not bothering to return it for a refund, resell it, or give it away to someone else. I think if you actually do own an XB1X and really don't use it at all, you have other reasons, and the comment you're making is merely meant to smooth down your own feathers over some personal choice to spend your time or money on some other entertainment alternative.
Alexvrb - Tuesday, March 13, 2018 - link
You run out and plunk down $500 on a console at launch but 1080p is a dealbreaker? Seems a discerning fellow like yourself who can't tolerate pedestrian resolutions would clearly own at LEAST a 4K display... which it supported on day one. 1440p, how primitive.
prashplus - Tuesday, March 13, 2018 - link
I think FreeSync is not AMD-proprietary. FreeSync was developed by AMD in collaboration with VESA; it's royalty-free and free to use, per the Wikipedia page: https://en.wikipedia.org/wiki/FreeSync
Manch - Tuesday, March 13, 2018 - link
Well yeah, it's in the name: FREEsync. When it was announced by AMD, it was made known that it was a free solution vs Nvidia's G-SYNC, which requires hardware in the monitor to support it. I think the VESA model already had an implementation for variable refresh rate that no one until AMD took advantage of.
mdrejhon - Tuesday, March 13, 2018 - link
The main part of the variable refresh rate protocol of FreeSync, VESA Adaptive-Sync and HDMI 2.1 VRR is actually darn near identical.
They are simply variable-height blanking intervals -- if you're familiar with an analog TV, back in the 1970s, where the picture rolled with a misadjusted VHOLD knob -- that thick black bar is the VSYNC interval. Variable refresh rate is simply varying the height of that blanking interval to vary the time between refresh cycles -- diagram at https://www.blurbusters.com/photo-of-analog-vsync-...
FreeSync has some additional enhancements above and beyond, but the root protocol is fully interoperable. ToastyX CRU is capable of configuring a Radeon graphics card to output a FreeSync signal over HDMI, and using an HDMI-to-VGA adaptor to make FreeSync work on an analog CRT display! It works, if the CRT is (A) multisync and (B) doesn't have blankout electronics for refresh-rate changes. There are some hacker threads that confirmed FreeSync worked successfully on certain high-end old multisync CRTs.
The technique of refresh-rate varying is actually relatively gentle to these CRTs, though the refresh-rate slewing can cause picture-position/size distortions on the fly if the refresh interval changes too quickly, but the picture was stable (except varying rate of flicker).
It's rather neat how FreeSync commandeered a creative modification of the digital version of a 1930s scanning technique -- by simply inserting/deleting hidden scanlines from the VBI (that black bar between refresh cycles) -- which is apparently backwards compatible with analog so FreeSync worked on certain 'strong' MultiSync CRTs.
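The variable-height blanking interval mdrejhon describes boils down to one formula: refresh rate = pixel clock ÷ (horizontal total × vertical total). A small sketch; the function is mine, but the base raster numbers are the standard 1080p60 CEA-861 timing, and the padded figure is illustrative:

```python
def refresh_hz(pixel_clock_hz, h_total, v_total):
    """Instantaneous refresh rate for a raster of h_total x v_total
    pixels scanned out at a fixed pixel clock. VRR varies v_total on
    the fly by padding the vertical blanking interval (the old analog
    "black bar") with extra hidden scanlines."""
    return pixel_clock_hz / (h_total * v_total)

# Standard 1080p60 timing: 148.5 MHz pixel clock, 2200x1125 total raster
# (1920x1080 visible; the rest is blanking).
base = refresh_hz(148_500_000, 2200, 1125)  # 60.0 Hz

# Pad the blanking interval with 1125 extra hidden lines: same clock,
# same visible picture, but the time between refreshes doubles.
slow = refresh_hz(148_500_000, 2200, 2250)  # 30.0 Hz
```

A VRR source slews v_total frame by frame within the panel's supported window; the Quick Frame Transport trick mentioned earlier in the thread uses the same math, keeping a fast pixel clock and a large fixed vertical total so each frame's scanout completes at the cable's full speed.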
Manch - Wednesday, March 14, 2018 - link
OK, that is cool! I knew VESA had a variable refresh rate protocol no one ever implemented, and I knew AMD used it to build FreeSync because it would enable manufacturers to implement the spec if their controllers were already up to standard. Didn't know about the CRTs!
HStewart - Tuesday, March 13, 2018 - link
I am curious about two things related to the Xbox One S:
1. What games will take advantage of FreeSync?
2. Will a Samsung 40-inch 4K 7 Series monitor with HDR, purchased this year, work with FreeSync?
3. I also have a Samsung 4K monitor purchased about 2 years ago.
Hixbot - Wednesday, March 14, 2018 - link
Only displays that explicitly say they support FreeSync or VRR. There are no consumer televisions on the market yet that support it. Many computer monitors support it; you need to look up the specs on your monitor.