44 Comments
Soulwager - Thursday, June 5, 2014 - link
Did they let you play with the tech demo and verify it's demonstrating what they say it's capable of? There was a report that freesync mode was running at a fixed framerate between 47 and 48 fps, which sounds very similar to what they had to show at CES.
r3loaded - Thursday, June 5, 2014 - link
If AMD pulled off variable refresh rates so easily, then what does Nvidia's G-Sync do or offer that requires expensive hardware (particularly that FPGA board)? Does G-Sync have some innate technical advantage over FreeSync?
nathanddrews - Thursday, June 5, 2014 - link
There's a lot of misinformation surrounding these technologies, but as far as I've been able to decipher, any monitor that wants to do variable refresh needs to have *some kind* of hardware inside to coordinate that process. NVIDIA saw an opportunity to provide a solution (proprietary, but possibly licensable) via dedicated hardware upgrades to existing and future DP monitors. In response, AMD/VESA sought out a method that used existing technology (eDP) in order to bring it to future DP1.2a monitors. Either way, the hardware has to be there alongside software/firmware.
I just hope we see variable sync cover all refresh/frame rates at all resolutions. Confining it to sub-60fps is nice, but tearing and stutter are also an issue at higher refresh/frame rates.
Alexvrb - Thursday, June 5, 2014 - link
I believe the primary point of these technologies is to deal with sub-<display refresh rate> framerates. If the demo monitor in question is a 60hz display, then it isn't going to display more than 60 FPS. If you're already getting >60 FPS on a 60hz display, then you can just cap the framerate or enable v-sync (unless you don't mind tearing).
However, if you are either consistently below 60 FPS, or are dipping below it periodically, then this kicks in. Being able to handle sub-60 FPS on a 60hz display fluidly is a huge deal. Previously you'd get nasty artifacts or you'd have to enable v-sync, which is often even less desirable. In either case they seem to feel future retail units will have a larger range than 40-60 without tearing. How well this functions (like most things) might also vary from display to display.
Now, just because they're only demoing the technology on a 60hz display doesn't mean it won't work on a 120hz display. A 120hz display with Freesync might be able to smoothly handle a much larger range of framerates, too. The best part? It won't be a costly addition and leverages existing technology, thus every monitor with DP support will have it eventually.
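To illustrate the behavior Alexvrb describes, here is a rough, hypothetical sketch (not any vendor's actual algorithm) of how a pipeline might decide when to scan out a frame on a panel with a variable refresh window; the 40-60Hz limits mirror the demo monitor and are assumptions, not published specs.

```python
# Hypothetical sketch of variable-refresh pacing on a 40-60Hz panel.
# Not AMD's or NVIDIA's actual algorithm; numbers are illustrative only.

MIN_HZ, MAX_HZ = 40.0, 60.0          # demo monitor's reported window
MIN_INTERVAL = 1.0 / MAX_HZ          # ~16.7 ms: can't refresh faster than this
MAX_INTERVAL = 1.0 / MIN_HZ          # 25.0 ms: must refresh before this

def next_refresh(frame_render_time):
    """Return (scanout_delay, repeated) for a frame that took
    frame_render_time seconds to render."""
    if frame_render_time < MIN_INTERVAL:
        # GPU is faster than the panel: behave like v-sync / a frame cap.
        return MIN_INTERVAL, False
    if frame_render_time <= MAX_INTERVAL:
        # Inside the variable window: scan out as soon as the frame is done.
        return frame_render_time, False
    # Too slow for the panel to wait: repeat the previous frame once,
    # then show the new one when it finally arrives.
    return MAX_INTERVAL, True

for ms in (12.0, 18.5, 22.0, 30.0):
    delay, repeated = next_refresh(ms / 1000.0)
    print(f"render {ms:5.1f} ms -> refresh after {delay*1000:5.1f} ms"
          f"{' (previous frame repeated)' if repeated else ''}")
```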
Samus - Monday, June 9, 2014 - link
The difference is GSYNC can run your refresh rate in 1Hz increments between, say, 30-144Hz, 144 times a second. So every frame is in sync with every refresh.
Strobing and/or high refresh rates don't do that. They allow higher refresh rates (which prevents tearing, as "V-SYNC" will work beyond 60FPS), but they don't prevent stutter since the refresh isn't variable.
There are utilities that allow you to run 4 refresh rates "simultaneously" on compatible hardware with any video card that supports the maximum refresh rate you specify, but they are crude implementations that still don't completely prevent tearing, though they help.
Soulwager - Monday, June 9, 2014 - link
G-sync doesn't do 1Hz increments like that; it doesn't even make sense to think of those increments in terms of Hz. Better to think of it in terms of how long you wait before checking for a new frame. The first check might be made assuming maximum refresh rate, so 6.9ms. After that you might just let your loop spin until it hits a new frame. Obviously there's some polling rate here, but the exact number could be anything from a few kHz to a few MHz. In any case, you'd probably measure the increment between possible g-sync refresh intervals in microseconds.
Soulwager - Thursday, June 5, 2014 - link
My guess is AMD is having a harder time of it than they're letting on, and that the FPGA+onboard RAM was necessary to get the desired performance in a short amount of time.
Long term, there may be advantages to keeping the RAM in the display that have to do with reducing memory usage and overhead in the GPU, but the FPGA will certainly be replaced by an ASIC at some point.
Kevin G - Thursday, June 5, 2014 - link
RAM in the display could also be used for panel self refresh, where the display tells itself to refresh when it does not receive an update. Enabling variable timing with panel self refresh sounds like a natural extension.
Soulwager - Thursday, June 5, 2014 - link
Well, you either have to do panel self refresh, or you have to re-send frames before the panel fades. G-sync does panel self refresh, but it's unclear how freesync will work. It's possible to use an extra framebuffer on the GPU to shift cost from the display to the video card, but there may be some performance and total system price considerations that make RAM inside the monitor a better option.
Death666Angel - Thursday, June 5, 2014 - link
Not really. Current LCDs can hold their image for long enough that you don't have to redraw it at low Hz settings. I can set my displays to 24Hz and they work fine.
The reason I heard for why nVidia needs proprietary tech and AMD can use existing stuff is that AMD supported the blanking in eDP while nVidia didn't, and had to install a board inside the monitor that could communicate with the graphics card to arrange the blanking interval.
If this makes it inside next gen 4K panels I might be interested. In the meantime, 110Hz 1440p IPS is fine. :D
Soulwager - Thursday, June 5, 2014 - link
You might get a few seconds before the image disappears completely; choosing the minimum refresh rate is a question of color accuracy and contrast ratio. 30Hz was chosen for the VG248QE, but different panels will fade at different rates, so that tradeoff will change based on panel technology and process.
SlyNine - Friday, June 6, 2014 - link
110Hz and IPS... The display may be trying to refresh at the overclocked 110Hz, but that doesn't mean the panel is keeping up. For now I'm still convinced you need to sacrifice either speed or image quality. Or get an old CRT.
invinciblegod - Friday, June 6, 2014 - link
What the, do they exist? I've never seen a 110hz, IPS, 1440p panel!
Soulwager - Sunday, June 8, 2014 - link
Some IPS panels have controllers that can be overclocked that high, but it doesn't make the actual panel transition any faster, so the frames tend to blur together. There's a high speed video of it drifting around somewhere.
HighTech4US - Thursday, June 5, 2014 - link
Good questions. I hope Anandtech does a side by side comparison of the two technologies.
But only G-Sync monitors are available now and F-Sync ones are still out 6 to 12 more months (which makes no sense if it was so easy to do).
tuxRoller - Thursday, June 5, 2014 - link
It's easy because it's part of a spec already. It doesn't require licensing anything. It will be as ubiquitous as DP1.2.
The problem in the meantime is that there is lag between the spec and implementation. We're in that valley, but it isn't forever, and once we're out everyone gets the benefits of this and not just Nvidia.
SlyNine - Friday, June 6, 2014 - link
The real question is which one offers the least latency. Input tests will be interesting.
Sabresiberian - Sunday, June 8, 2014 - link
Why do some people assume G-Sync is going to be expensive? There is no reason it should be (beyond monitor manufacturers trying to charge all the proverbial traffic will bear).
Soulwager - Monday, June 9, 2014 - link
Performance compared to a high refresh rate fixed refresh monitor, availability of g-sync, availability and potential performance constraints with freesync, cost of development and implementation, and brand value (licensing fees).
I expect g-sync prices to stay high until freesync ships, at which point I expect the price to be within 10% of a freesync monitor with equivalent performance.
HighTech4US - Thursday, June 5, 2014 - link
If this is so easy and FREE, why does it take 6-12 more months for availability, and why can't existing monitors get a firmware upgrade to have it work on them?
Seems like something is not free to the monitor manufacturers.
Spunjji - Thursday, June 5, 2014 - link
It's free as in free-specification, not free as in "we will upgrade the existing display hardware in your monitor that was not designed for this specification"... If the monitor was not designed with appropriate hardware to flexibly modify the refresh rate then no amount of firmware will fix that!
Soulwager - Thursday, June 5, 2014 - link
Is the spec free to anyone, or just VESA members? I'm definitely curious about the details.
ninjaquick - Thursday, June 5, 2014 - link
Either way, almost everyone is a VESA member.
akbo - Thursday, June 5, 2014 - link
Free as in free speech, not free beer. Hardware costs money for, you know, "binned screens" verified to run 10-100Hz and compute and stuff.
TheJian - Thursday, June 5, 2014 - link
You already know the answer. It isn't going to be EASY or FREE. I wonder how long they had to look to find a monitor that would even do it between 40-60hz, which, by the way, nobody would buy. They're also clear here you won't be doing this on monitors with a firmware upgrade. They can hint you could do it, but that is BS if they can't even get ONE to go outside 40-60hz. That isn't a tech that is WORKING, it's what we commonly call jerry-rigging something, and it seems just good enough to show a demo with. It's becoming more clear by the day NV didn't screw anyone, they just did what was necessary to get it to market. What takes 6-12 months (err...12 months+, for AMD fudge factor)? Convincing someone to make a scaler/tech (whatever is needed) etc. that can go far outside 40-60hz and be done reasonably cheaply. Nobody will do this for free, which is exactly what NV ran into. They gave NV the bird, and NV made their own solution, and even then it would only work on ONE monitor for months, as we're just starting to see more gsync stuff coming in the pipe.
AMD has a long way to go here and it won't be free. Free for AMD maybe, but not for someone involved in making the monitor, and not for us after they pass it on. Also, just like NV, you'll need specific cards to get the job done. So for most (like me with a Radeon 5850), you need a card+monitor. ZERO manufacturers will go back and make new firmware to let you put off purchasing a NEW monitor. That is stupid for them and they are not stupid. I can hear the conversation now:
"So AMD, you'd like us to make new firmware that lets people live with their monitors longer and avoid buying a new one for another year or two, since they'll love how fluid their games are on the OLD one they have now? FU." Followed by "Ring...ring...Hello Nvidia? Can we start making some gsync stuff so we can sell more monitors TODAY instead of a few years from now on AMD's free plan?"...LOL. There is a price to pay for awesome tech, and it isn't FREE. One day it gets cheaper so us mere mortals may be able to buy it, but it is VERY expensive for first adopters of pretty much anything great or cool.
We can only hope the AMD solution gets close enough that NV has to license the REAL deal for a decent price to move it into everything (tablets, phones, TVs, monitors, etc.), and that should bring costs down for all.
That being said, competition is great. We wouldn't have anything being looked at if they weren't trying to top each other. Gsync led to an AMD response. However it happened, we now have Mantle, DX11 drivers from NV that catch Mantle, OpenGL speeches from NV showing it has been in there for years (why nobody talked about this 3 years ago is beyond me), DX12 coming to solve the same problems, etc. It's amazing game devs needed a speech to tell them OpenGL has had ways around driver overhead for years. We really don't even need DX12 or Mantle if they'd just use what is ALREADY in OpenGL. Whatever caused it all, at least the conversation is REALLY happening now. Since Valve's SteamOS won't run DX, we should start seeing people take advantage of OpenGL stuff soon, especially with DX12 so far off (how many can use it when it hits? How long before games use it massively?). OpenGL works for everyone today including Intel/AMD/NV/ARM crap etc.
Spazturtle - Thursday, June 5, 2014 - link
They have said they don't expect display vendors to make new firmware; this was just a demo to show that it could be done.
Why would display vendors pay NVIDIA for something that is worse than what is provided for free as part of the DP 1.3 spec?
tuxRoller - Thursday, June 5, 2014 - link
Dp 1.2a, apparently.
edzieba - Friday, June 6, 2014 - link
Because Nvidia will sell them a module that they can drop in and have a working product. AMD expects them to either develop and manufacture their own controller, or pester a panel controller manufacturer to develop and sell one to them.
Antronman - Thursday, June 12, 2014 - link
Please actually read about Freesync the next time you decide to post about it sir.
antimoron - Friday, August 1, 2014 - link
They talk about video cards... free to use with any video card, whereas you need an Nvidia card to use G-Sync. What a moron, writing one page of useless words. lol That means, like PhysX, this can be proprietary in software and not just in hardware.
shivabowl - Thursday, June 5, 2014 - link
It looks like a Nixeus 27" DisplayPort monitor that AMD is using to demonstrate Freesync.
Septor - Tuesday, June 10, 2014 - link
Yup, cause it is. Apparently with updated firmware however.
Anonymous Blowhard - Thursday, June 5, 2014 - link
> The monitor in question operates with a fairly narrow range of rates of just 40Hz to 60Hz
Works for me. If you're dropping below that, you should adjust game detail until you're closer to 60 again. IMO G-Sync/FreeSync should be a way to minimize the impact of minor dips in framerate, not as an excuse to accept lower performance.
Guspaz - Thursday, June 5, 2014 - link
Doesn't work for me. Games should support 30-60, but you need to support as low as ~23 if you want to work with films.
Technically you can set a monitor to 23.976 Hz... but only if the timing of all components works out (many GPUs don't allow timing that precise, or make it very difficult to set/use), and you need really complex software on the player side to synchronize that precisely.
With FreeSync/G-Sync, all these problems go away. Everything becomes super simple: the player decodes the video at what it thinks is real-time, and updates the monitor every time it wants to present a frame. The exact timing is less important, because the average framerate will be 23.976 even if frame present times aren't super precise, and dropped/duplicated frames simply won't be a problem at all.
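As a rough illustration of the playback model Guspaz describes (a sketch only, not any real player's code; `present()` is a hypothetical stand-in for whatever call hands a finished frame to a variable-refresh display), the player simply paces frames off the decode clock and presents each one when it is due:

```python
import time

# Minimal sketch of film playback on a variable-refresh display.
# present() is a placeholder, not a real library function.

FILM_FPS = 24000 / 1001            # 23.976... fps
FRAME_TIME = 1.0 / FILM_FPS        # ~41.7 ms per frame

def play(frames, present):
    start = time.perf_counter()
    for i, frame in enumerate(frames):
        due = start + i * FRAME_TIME          # when this frame *should* appear
        delay = due - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
        # The display refreshes when told, so small timing errors don't cause
        # dropped or duplicated frames, and the average rate stays 23.976 fps.
        present(frame)

# Example: "present" just logs timestamps for ten dummy frames.
if __name__ == "__main__":
    play(range(10), lambda f: print(f"frame {f} at {time.perf_counter():.3f}s"))
```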
tuxRoller - Thursday, June 5, 2014 - link
http://www.forbes.com/sites/jasonevangelho/2014/05...
According to that person, there are around 4 supported ranges: 36-240, 21-144, 17-120, 9-60. My guess is that the standard, or the AMD hardware, has a limited number of modes it can handle (or, these ranges are defined for faster transition between modes).
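If those ranges are accurate, picking one for a given panel might look something like this rough sketch (purely speculative; the ranges come from the comment above, not from the spec):

```python
# Speculative sketch: choose which reported variable-refresh range a panel
# would use, given its maximum refresh rate. Ranges (in Hz) are the ones
# quoted in the Forbes piece linked above, not official spec values.

RANGES = [(36, 240), (21, 144), (17, 120), (9, 60)]

def pick_range(panel_max_hz):
    """Return the (min, max) window with the lowest ceiling that still
    covers the panel's top refresh rate."""
    candidates = [r for r in RANGES if r[1] >= panel_max_hz]
    if not candidates:
        raise ValueError(f"No defined range reaches {panel_max_hz} Hz")
    return min(candidates, key=lambda r: r[1])

for panel in (60, 120, 144, 240):
    lo, hi = pick_range(panel)
    print(f"{panel:3d} Hz panel -> variable refresh from {lo} to {hi} Hz")
```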
Gigaplex - Thursday, June 5, 2014 - link
You still need to make sure the frame times sync with the audio, so it isn't entirely super simple.
yasamoka - Friday, June 6, 2014 - link
...which we have been doing since 3D gaming with audio became the norm...
You just need to display the frames as they get completed, directly to the monitor, instead of combining parts of frames with parts of other frames and displaying that combination whenever the monitor has to refresh (every fixed interval).
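A toy model of the "combining parts of frames" yasamoka mentions: on a fixed-refresh display with v-sync off, a buffer swap partway through a scanout leaves the old frame above the tear line and the new frame below it. This is only an illustration of the geometry, with made-up numbers:

```python
# Toy model of tearing on a fixed 60Hz display with v-sync off.
# A scanout takes one refresh interval; swapping buffers mid-scanout puts the
# tear line wherever the beam has already reached. Numbers are illustrative.

REFRESH_HZ = 60
SCANOUT = 1.0 / REFRESH_HZ                      # ~16.7 ms top-to-bottom

def tear_line(swap_time_ms, screen_height=1080):
    """Row at which the image switches from the old frame to the new one,
    for a buffer swap at swap_time_ms after the scanout started."""
    progress = (swap_time_ms / 1000.0) / SCANOUT
    if progress >= 1.0:
        return None                             # swap missed this scanout
    return int(progress * screen_height)

for swap in (4.0, 9.0, 14.0):
    print(f"swap at {swap:4.1f} ms -> tear at row {tear_line(swap)} of 1080")
```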
edzieba - Friday, June 6, 2014 - link
As-they-arrive frame updating actually makes audio sync a LOT easier. If you can assume that 24 frames per second will be displayed as 24 equally spaced frames per second (23.976, you get the idea) rather than the current case of having to double and shift frames to fit into the 60fps monitor refresh rate, keeping audio in sync is made dramatically simpler.
Same with game audio. You run audio at game logic speed, and display updates happen as soon as a frame is rendered. You don't need to worry about delaying certain sound effects (but not background music or environmental sound) if a frame is late and you need to repeat the last frame to keep in sync with the display refresh.
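A bare-bones sketch of the loop shape edzieba is describing (an assumed structure, not any particular engine's code): simulation and audio advance on a fixed timestep, while each rendered frame is handed to the display the moment it is finished rather than being held for the next fixed refresh.

```python
import time

# Sketch of a fixed-timestep game loop with present-on-completion display
# updates. update(), play_audio(), render() and present() are placeholders
# for an engine's real calls; only the structure matters here.

TICK = 1.0 / 60.0                       # fixed simulation/audio timestep

def game_loop(update, play_audio, render, present, run_for=1.0):
    sim_time = 0.0
    acc = 0.0
    prev = start = time.perf_counter()
    while time.perf_counter() - start < run_for:
        now = time.perf_counter()
        acc += now - prev
        prev = now
        while acc >= TICK:              # logic and audio advance on the clock,
            update(TICK)                # independent of how long frames take
            play_audio(sim_time)
            sim_time += TICK
            acc -= TICK
        frame = render()                # however long this takes...
        present(frame)                  # ...the frame goes out as soon as it's
                                        # ready, with no fixed refresh to wait for

if __name__ == "__main__":
    game_loop(lambda dt: None, lambda t: None,
              lambda: "frame", lambda f: None, run_for=0.1)
```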
Senti - Friday, June 6, 2014 - link
> you need to support as low as ~23 if you want to work with films
Nonsense. 48 (47.952), 72 or 120 work even better than 24 for film refreshes, as in case of a random glitch it would last only a fraction of a frame, unlike a full frame time in 24 fps mode.
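The arithmetic behind that claim, for anyone who wants to check it: each of those rates is an integer multiple of ~23.976, so every film frame is shown a whole number of times, and a single bad refresh costs only one of those repeats.

```python
# Each candidate refresh rate is an integer multiple of the film rate, so a
# 23.976fps frame is shown exactly 2, 3, or 5 times with no judder. One
# glitched refresh then lasts far less than a full film frame.

FILM = 24000 / 1001                      # 23.976... fps

for refresh in (47.952, 71.928, 119.880):
    repeats = refresh / FILM             # refreshes per film frame (2, 3, 5)
    glitch_ms = 1000.0 / refresh         # duration of one bad refresh
    frame_ms = 1000.0 / FILM             # duration of one film frame (~41.7 ms)
    print(f"{refresh:7.3f} Hz: {repeats:.0f} refreshes/frame, "
          f"glitch ~{glitch_ms:.1f} ms vs {frame_ms:.1f} ms at 24 Hz")
```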
HisDivineOrder - Friday, June 6, 2014 - link
Eh. Adaptive Sync sounds great, but is unproven. FreeSync is just AMD's brand name for that. Again, unproven.
Meanwhile, Gsync is in shipping product. AMD really needed to show up with the fully armed and operational battlestation if they wanted to come out looking great today.
Otherwise, they're just telling us what we already know. In a year or so, we'll see a FEW monitors with Adaptive Sync (not to be confused with Adaptive V-sync, dig at nVidia) that are overpriced to account for the "extra certification" and are more broadly compatible than Gsync. But we won't know how WELL they work until later (than today) because all we're seeing from AMD is a vague proof of concept that doesn't really prove the concept.
Prove the concept by showing us the lower range and upper range you're advertising or go home, AMD. Because nVidia has said for a while now that going as low as AMD is promising is not ideal and yet there's AMD promising it without being able to actually SHOW it.
Antronman - Wednesday, June 11, 2014 - link
Antronman - Wednesday, June 11, 2014 - link
Displayport 1.2a and 1.3 and onwards will all be Freesync enabled.
That means once monitor makers decide to move on, every new monitor being made will have Freesync.
AMD just did show it. Read the article. Oh sure, they didn't show a ready to sell product. They just needed to show it works.
Freesync is a VESA standard for DP1.2a and onwards.
loopingz - Friday, June 6, 2014 - link
loopingz - Friday, June 6, 2014 - link
Is there any chance we'll see some GSync monitors updated to, or already working with, adaptive sync?
I mean, if I buy a gsync device I might not want to get stuck with Nvidia for around 5 to 10 years (my mean time between screen changes).
formulav8 - Friday, June 6, 2014 - link
You NVidia stooges are so pathetic and annoying.
Novaguy - Saturday, June 7, 2014 - link
IIRC, weren't laptops more likely to support adaptive sync? Any chance an AMD partner has a Kaveri APU + 1866/2133MHz DRAM + 1080p + adaptive sync laptop model coming to the market?
Such a system could be a decent budget $650~700 gamer laptop (especially if paired with a $100 or $200 aftermarket Crucial SSD).