eastyy123 - Thursday, November 20, 2014 - link
The price for G-Sync I think is too much, and I really thought there would have been a lot more choice of monitors with G-Sync by now.
EdgeOfDetroit - Thursday, November 20, 2014 - link
What I want is a 21:9 aspect ratio monitor with G-Sync support, a high refresh rate (120 or 144 Hz), and low latency (gamer focused). I don't want to run a triple-monitor setup; bezels are bad. Anything incoming like this?
MartinT - Friday, November 21, 2014 - link
Problem is you need the angle-independent colours of IPS or VA displays for good image quality in 21:9, and those aren't known to do more than 60 Hz.
Hrel - Friday, November 21, 2014 - link
IPS also isn't low latency.
meacupla - Friday, November 21, 2014 - link
PLS and IPS can do 120Hz, as already proven by the numerous cheap Korean 1440/1600p screens on the market.
The only limitation seems to be the controller chip on the monitor and brightness degradation at high refresh rates.
Morawka - Friday, November 21, 2014 - link
PLS can, IPS cannot. PLS is a poor man's "IPS" and does not always match its performance.
MrSpadge - Friday, November 21, 2014 - link
I'd be interested in G-Sync with a 21:9 xPS or xVA (anything not TN, basically) at 60 Hz. Give it a decent PPI of ~130, so I might get away without scaling.
chizow - Thursday, November 20, 2014 - link
G-Sync carries about a $200 premium, and we already know how much these Samsung models cost ($600 for non-ASync), so how much do you think they will cost with ASync? Pretty good chance it won't be "Free"Sync anyways.
As a comparison, the Acer G-Sync 4K display and the Asus Swift 144Hz 1440p G-Sync display cost $800. Not a whole lot of wiggle room left for any significant ASync discount given AMD estimates their version will be around $100 cheaper than G-Sync.
In any case, it will most likely come down to the quality and support of the solution. Having used G-Sync for 4 months already, it is a seamless experience in single-GPU mode and well worth the investment whether you are an AMD or Nvidia user.
Flunk - Thursday, November 20, 2014 - link
There is a big cost difference. Adaptive VSync is royalty-free technology that's part of the DisplayPort standard, G-Sync requires the monitor manufacturer include Nvidia hardware into the monitor design. I don't think it will be long before every monitor supports adaptive vsync, it really is that cheap to implement. Nvidia will have to give in eventually, and eventually will probably be sooner rather than later.
I personally don't care which technology wins because both technologies are basically interchangeable, but I do know that whichever is easiest for the industry will win. Even Intel gave in to the easier-to-implement technology. Remember IA86? No, well that's because we're all using AMD's AMD64 technology for our 64-bit x86 processors now.
Flunk - Thursday, November 20, 2014 - link
IA86 => IA64, too many stupid acronyms.
dagnamit - Thursday, November 20, 2014 - link
If you think that monitor manufacturers are going to price it cheap just because it's cheap to implement, then I've got a nice bridge.... ASync is free profit outta nowhere for monitor manufacturers. Good job AMD! Way to make money off your ideas!
eanazag - Friday, November 21, 2014 - link
They will make money on video cards. The situation was Nvidia only with G-Sync, and they had a time-to-market lead. That left AMD with the choice of making it work for everyone, or delivering a solution that is cheaper as an incentive to manufacturers. It could potentially eliminate G-Sync from the market, as it is part of DisplayPort, and so in a couple of years it will be a feature all GPUs leverage.
Deelron - Saturday, November 22, 2014 - link
How does AMD making money on video cards benefit the monitor manufacturer?
I'm with Dag, just because it's way cheaper to implement doesn't mean that monitors with this feature aren't going to cost more than ones without it, regardless of how little it costs the manufacturer, and I wouldn't be surprised if the price difference was around 10% or less.
haukionkannel - Saturday, November 22, 2014 - link
FreeSync aka Adaptive-Sync is an industry standard, so Intel, AMD, and Nvidia could all use it for free. Pure competition will bring the price down. G-Sync is a pure Nvidia-only system, so it will always be like Apple products. No competition, no need to reduce the price...
The really interesting thing is how good Adaptive-Sync will be compared to G-Sync. If it is as good or even close, it will become very popular. It will definitely help those poor-performing Intel GPUs produce a reasonably good picture in simple games. It can also be used with ARM-based processors, so there is definitely going to be a market for this system!
medi02 - Monday, November 24, 2014 - link
It's not only that G-Sync is not free, it's also that nVidia openly stated they weren't going to share it with anyone (for money or not).
medi02 - Monday, November 24, 2014 - link
Monitor manufacturers would benefit from not having to deal with market segmentation.
basroil - Friday, November 21, 2014 - link
"Adaptive VSync is royalty-free technology that's part of the DisplayPort standard, G-Sync requires the monitor manufacturer include Nvidia hardware into the monitor design. I don't think it will be long before every monitor supports adaptive vsync, it really is that cheap to implement"
Adaptive-Sync == G-Sync == eDP 1.3+; ALL of those require controllers built into the panel, with more expensive chips and a substantial frame buffer.
chizow - Friday, November 21, 2014 - link
You are looking at a max of $100 difference using AMD's own numbers, actual G-Sync panel costs, actual Samsung panel costs, and a little bit of common sense.
$800 for Nvidia's 4K Display. $600+ for Samsung's UHD non-ASync Display. Purported premium of $100 for the ASync-capable scalers. That's $700 for ASync and $800 for G-Sync. Is that really a big difference when you are spending $700+ either way?
The biggest difference, of course, being that G-Sync's benefits are a known commodity, while ASync is still TBD.
Shadow7037932 - Friday, November 21, 2014 - link
$100 is a noticeable difference. That's $100 I can spend on a better GPU or CPU or just save.
chizow - Friday, November 21, 2014 - link
Or you can leave the $100 there and get the better adaptive frame syncing technology, the trade-off may very well be the same as spending on a better GPU or CPU, quite possibly even more noticeable than $100 towards a GPU or CPU upgrade.
RussianSensation - Friday, November 21, 2014 - link
Oh, now the truth comes out. "Or you can leave the $100 there and get the better adaptive frame syncing technology,"
0 real-world testing has been done by any professional review site in the world to conclude that a G-Sync monitor is superior to FreeSync, yet here we have a typical pro-NV supporter shouting this as if it's a fact. If this were true, why wouldn't NV support both G-Sync and FreeSync? Since, as you say, G-Sync is "better", then the market would rule and people with NV cards would still buy G-Sync monitors.
For starters, 4K IPS + FreeSync is in no way comparable to the Asus G-Sync TN ROG. If we see FreeSync on 4K IPS monitors, that's the death of G-Sync, since G-Sync/FreeSync are most beneficial in the <60 fps area, not from 60-144 Hz, and naturally a modern 4K IPS at 60 fps or below >>>> any 1080P/1440P/4K TN panel.
Secondly, since FreeSync is an open standard, it should be available on a lot more monitors, giving users choices instead of the 2-3 monitors for G-Sync.
If NV was on board with FreeSync like any normal company supporting open industry standards would have been, then we could be buying NV or AMD GPUs for our next-gen 4K monitor, but because NV wants to lock us into buying NV-only GPUs, they are purposely not supporting FreeSync. In some ways NV has become worse than Apple in their business practices.
chizow - Friday, November 21, 2014 - link
LMAO, and why has there been 0 real-world testing done by professional reviewers? What we do know, and what has been corroborated by professional reviewers, is that G-Sync is available TODAY and does what it says it does, and is by all accounts: AWESOME.
But how do we know AMD's solution is probably inferior? Because their VP of Graphics said it months ago. I know, I know, this guy has said a lot of things about FreeSync that turned out to be false (being free, being supported on at least 3 gens of AMD displays, not requiring additional hardware, being supported by just a firmware update), but hey, he is AMD's VP of Graphics, right? He should have some insight on what they were putting together?
http://www.techpowerup.com/196557/amd-responds-to-...
"According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware. ****Although the results of FreeSync will be close to those of G-Sync, NVIDIA's technology will have an edge with its output quality****, because the two are implemented differently, and by that we don't just mean how the hardware is laid out on a flow-chart, although the goals for both technologies is the same - to make a display's refresh rate slave to the GPU's frame-rate, rather than the other way around (with V-Sync)."
And for starters, you do realize those Samsung panels are also TN, right? And that Acer also has a 4K TN G-Sync panel available TODAY, that again, from all accounts is awesome. So why are you trying to compare it to a 1440p ROG panel? They both serve slightly different markets and use cases, but I am sure that won't stop you from speaking ignorantly on the topic.
Secondly, who cares if it is an Open Standard if no one besides AMD supports it? You still get a minority of the market supporting it in the end. And 2-3 G-Sync monitors? It's hard to tell if you are being obtuse or dishonest here. There are new G-Sync panels being announced daily, but even 2-3 would be way ahead of what AMD has shown us so far, and even these Samsung panels are what, 5-6 months away?
LMAO like any normal company, is this a joke? If Nvidia waited for an open standard to magically pop out of a hole in the ground like AMD and their fanboys tend to hope for, we STILL wouldn't have adaptive frame sync technology, because Nvidia invented it. So yes, AMD and their fanboys are welcome, again, because thanks to Nvidia and their proactive research and development in making gaming better, even AMD fans like yourself have a (hopefully) similar avenue to enjoy this technology, because it is truly impressive.
And for the last bit, hahaha, I love it when AMD fans try to push the lock-out card regarding Adaptive Sync. I guess it's time to pull the Raja Koduri card again:
"According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware."
So I guess it was actually AMD that decided to pursue a standard that locked out Nvidia's hardware, huh?
Pwnstar - Saturday, November 22, 2014 - link
Raja didn't say it was worse, he said it was different. Your biases are showing.
chizow - Saturday, November 22, 2014 - link
And your poor reading comprehension is showing. Truncated and asterisked for emphasis:
"Although the results of FreeSync will be close to those of G-Sync, *******NVIDIA's technology will have an edge with its output quality*******."
Alexey291 - Sunday, November 23, 2014 - link
So how much do you get paid per post by NV's marketing? :)
chizow - Monday, November 24, 2014 - link
Nothing, and certainly less than what you happily and willingly pay to consume AMD's BS, lies, and half-truths.
Horza - Wednesday, November 26, 2014 - link
You mean "pay to consume" a product that's not for sale yet? A product you are certain is inferior even though it isn't available to be purchased or reviewed yet? I'm finding it hard to see what else you could mean, but I'm certain you're aware that this product hasn't been sold to anyone.
If the product doesn't deliver when they actually start selling it, then sure, rally against the liars and the BS. Until then it just reads like someone with an oddly emotional distaste for something they can't know enough about to make an objective judgement on.
dragonsqrrl - Monday, November 24, 2014 - link
...so much misinformation about FreeSync. It's NOT part of the VESA standard. FreeSync does not = adaptive-sync. Adaptive vsync does not = adaptive-sync. Not all DP 1.2a/1.3 monitors will support adaptive-sync, and thus FreeSync. While cheaper than G-Sync, FreeSync is not free and will have an additional cost to implement in DP monitors (right now it's looking like it'll be ~$100). The tech is not interchangeable. As much as people like to tout FreeSync as a free and open standard, it's essentially proprietary.
xdrol - Sunday, January 4, 2015 - link
It's part of the DisplayPort standard. An optional secion in 1.3a. It's exactly as proprietary as any other VESA standard: it's free for any VESA consortium member.
xdrol - Sunday, January 4, 2015 - link
*section. WTB edit button.
dragonsqrrl - Wednesday, January 14, 2015 - link
No, FreeSync is not a part of DP 1.2a/1.3. It's not a VESA standard. It's an AMD layer on top of adaptive-sync that's only supported by a subset of GCN cards. So much misinformation about FreeSync... sigh.
Creig - Friday, November 21, 2014 - link
These are only the first series of monitors that have been officially announced to support FreeSync. As we get closer to the retail release date of FreeSync, I expect we'll learn that there will be cheaper models available as well.
Creig - Tuesday, November 25, 2014 - link
The Samsung announcement is big news. Building on that is the report that BenQ, ViewSonic and LG are all getting ready to release Adaptive-Sync capable monitors in Q4 2014 or Q1 2015. FreeSync is obviously gathering major support from the industry's largest monitor manufacturers.
TiGr1982 - Thursday, November 20, 2014 - link
These are 4K and TN, if I understand correctly.
IMHO, there should also be IPS FullHD or 1920x1200 monitors for ~$350 with FreeSync on the market (these, like my ASUS PA248Q, can be driven by a single R9 290(X), and with FreeSync the picture will be tearless and flawless).
TiGr1982 - Thursday, November 20, 2014 - link
By "driven" I mean games can be played at High/Ultra settings. Of course, at lower quality settings, a single R9 290(X) can do much higher resolutions reasonably.
bobbozzo - Thursday, November 20, 2014 - link
Please review these UD590 monitors; I wasn't aware a 28" 4K monitor was currently available for anywhere in that price range ($600).
Thanks!
r3loaded - Thursday, November 20, 2014 - link
Does G-Sync have any technical advantage over FreeSync? What makes the two standards different?
Gigaplex - Thursday, November 20, 2014 - link
I think the only current advantage of G-SYNC over FreeSync is that it currently has a wider range of supported frequencies. The downsides are extra cost and proprietary nature. Plus, if you're an NVIDIA fan, it's your only option for getting adaptive refresh rates at this time.
chizow - Thursday, November 20, 2014 - link
Well, that's certainly not the ONLY advantage. I'd say the bigger advantage is that it actually exists, and does an awesome job at doing what it says it's doing. We really have no idea how effective AMD's solution will be.
In the end, both solutions will feature vendor lock-in, as Nvidia will push their own initiative, G-Sync, and only AMD will support A-Sync, regardless of whether it is free or open.
HunterKlynn - Friday, November 21, 2014 - link
Hush, fanboy.
chizow - Saturday, November 22, 2014 - link
Oh that's clever, would expect nothing less from a fanboy.
haukionkannel - Saturday, November 22, 2014 - link
Intel and all the ARM producers are going to use Adaptive-Sync. They have such poor GPUs in their systems that they need all the help async is going to give. And it does not even cost them anything, just new GPU drivers to support the feature!
But we don't know the quality yet, so next spring will be interesting!
chizow - Saturday, November 22, 2014 - link
Why would Intel and ARM have any interest in supporting this given it is almost strictly going to be used for gaming and may require additional hardware and software on their end to support Adaptive Sync on their GPUs?
Given AMD has come forward and claimed Nvidia GPUs cannot support Adaptive Sync the same way their own GPUs can, who is to say Intel or ARM have the requisite support in their own display adapters?
AMD claiming this is an open standard that everyone is free to use is just another half-truth, a ploy/stall tactic to try and hinder the adoption of G-Sync, but in the end the score will still be:
Nvidia ~65% +/-5% * % of GeForce Kepler or newer graphics cards
AMD 30% +/-5% * % of GCN 1.1+ GPUs (Hawaii, Bonaire, Tonga)
Because no one else will care enough about it to implement and support it.
DiHydro - Wednesday, December 3, 2014 - link
DiHydro - Wednesday, December 3, 2014 - link
"Why would Intel and ARM have any interest in supporting this given it is almost strictly going to be used for gaming and may require additional hardware and software on their end to support Adaptive Sync on their GPUs?"
Intel and ARM also service very different markets than NVidia and AMD. They can implement an asynchronous frame rate technology in phones, laptops, commercial displays, and integrated displays in cars and other areas. I could see this being helpful on a mobile display, because then you can have the CPU/GPU skip work when the screen does not need to be updated. This frees up CPU/GPU cycles for more performance, or you can idle the CPU/GPU to save power if the screen is not being updated. For embedded solutions, this would allow a larger, higher-resolution screen to be driven for a given CPU or GPU. If it is displaying text, the refresh could be ~1 Hz, and if it is video it could bump up to ~24 Hz.
Therefore, your opinion that Intel and ARM would have no incentive to use this tech is not correct.
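As a rough illustration of that power-saving idea, here is a minimal sketch (hypothetical driver logic in Python, not any real API; the content table and panel limits are assumptions) of picking a refresh rate from the content being shown and clamping it to what the panel supports:

# Hypothetical sketch: choose a variable refresh rate from the content type,
# clamped to the panel's supported range; static content gets ~1 Hz, video ~24 Hz.
PANEL_MIN_HZ = 9      # assumed lower bound of the panel's variable range
PANEL_MAX_HZ = 60     # assumed upper bound for a mobile/embedded panel

CONTENT_RATES_HZ = {
    "static_text": 1,   # barely refresh while nothing on screen changes
    "video": 24,        # match typical film content
    "ui_scroll": 60,    # smooth interaction
}

def choose_refresh_hz(content_type):
    desired = CONTENT_RATES_HZ.get(content_type, PANEL_MAX_HZ)
    return max(PANEL_MIN_HZ, min(PANEL_MAX_HZ, desired))

for kind in ("static_text", "video", "ui_scroll"):
    hz = choose_refresh_hz(kind)
    print(f"{kind}: {hz} Hz, one refresh every {1000 / hz:.1f} ms")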
Despoiler - Friday, November 21, 2014 - link
FreeSync has a far wider frequency range than G-Sync. FreeSync is 9-240 Hz; G-Sync is 30-144 Hz.
chizow - Friday, November 21, 2014 - link
Still have no idea if it's true 240 Hz input or fake TruMotion a la HDTVs, not to mention anything over 60 Hz on the UHD panels these are going into will be useless for about 5 more years, as there's not going to be the GPU horsepower to drive such refresh rates.
My bet is it's just marketing jargon for fake refresh rates, unlike G-Sync which gets actual 120+ FPS inputs.
MrSpadge - Friday, November 21, 2014 - link
With such sync technology there's really no need for those super high display frame rates. And if you're concerned about the latency of your inputs: those shouldn't be coupled to the display frame rate anyway.
chizow - Thursday, November 20, 2014 - link
G-Sync makes the display a slave to the GPU with a direct link, which is part of the reason there is an expensive FPGA and memory buffer (acting as a lookaside buffer), so that the monitor only refreshes when the GPU tells it to.
The way AMD explained their adaptive sync is that the monitor is still reactively predicting how to adjust frame rates, but there is no direct signaling for each frame. It's hard to say for sure though, given AMD has changed their tune so many times and never actually demonstrated the tech completely (their live demos were fixed sub-60 Hz refresh rates, and not adaptive, as they claimed).
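For what it's worth, the "display slaved to the GPU" behaviour described above can be pictured with a small sketch (purely illustrative Python, not Nvidia's actual module logic; the 30-144 Hz window is an assumption): the panel refreshes when the GPU signals a finished frame, never faster than its minimum interval, and repeats the previous frame if the GPU stays silent past the maximum hold time.

# Illustrative event loop for a panel slaved to the GPU (values are assumptions,
# not G-Sync specifications).
MIN_INTERVAL_MS = 1000 / 144   # at most one refresh every ~6.9 ms
MAX_HOLD_MS = 1000 / 30        # must redraw something within ~33.3 ms

def panel_refreshes(frame_ready_times_ms):
    events = []
    last_refresh = 0.0
    for t in frame_ready_times_ms:
        # If the GPU is silent too long, the panel repeats the previous frame.
        while t - last_refresh > MAX_HOLD_MS:
            last_refresh += MAX_HOLD_MS
            events.append((last_refresh, "repeat previous frame"))
        # Otherwise it refreshes on the GPU's signal, respecting the minimum interval.
        last_refresh = max(t, last_refresh + MIN_INTERVAL_MS)
        events.append((last_refresh, "new frame"))
    return events

for when, what in panel_refreshes([12.0, 20.0, 60.0, 66.0]):
    print(f"{when:6.1f} ms: {what}")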
jimjamjamie - Thursday, November 20, 2014 - link
23.6" 4K display? Yes please.
Gigaplex - Thursday, November 20, 2014 - link
With the current state of Windows desktop DPI scaling? No thanks.
Spectrophobic - Thursday, November 20, 2014 - link
It'll be great for UHD gaming eye-candy though... as long as you only play games, that is.
JarredWalton - Friday, November 21, 2014 - link
As long as you only play games that can run at 4K without choking as well. :p
Spectrophobic - Friday, November 21, 2014 - link
Well, depending on your definition of "without choking", current GPUs are moderately capable of playing games at UHD on moderate settings.
TheJian - Thursday, November 20, 2014 - link
Shouldn't we be calling it "not-so-freesync" by now? This article blatantly shows it is NOT free, as I've said all along here, at tomshardware, etc. Ridiculous. Scalers will charge for the R&D they had to do to get this to work, certification testing will cost monitor makers, etc. etc. If you still think this will be FREE, you're plumb crazy. FREE would mean it would be the EXACT SAME PRICE as the current model without it, right?
"I'd expect the Adaptive-Sync enabled monitors to have at least a moderate price premium, but we'll see when they become available some time around March 2015."
"AMD notes that the manufacturing and validation requirements to support variable refresh rates without visual artifacts are higher than traditional LCDs"
Yeah, let me know when FREE really means FREE.
"Over time, however, if Adaptive-Sync catches on then economies of scale come into play and we could see widespread adoption."
You could say the same for G-Sync, especially since NV owns 68% of discrete and there are FAR more owners of 600-series and up cards on the NV side already that will work if you go with a G-Sync monitor, rather than on AMD's side. Meaning fewer people on the NV side need more than a monitor.
http://support.amd.com/en-us/search/faq/219
"The AMD Radeon™ R9 295X2, 290X, R9 290, R9 285, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming."
If you don't own one of the above, you won't be using your card with FreeSync for gaming (only vid playback). Considering their much smaller market share, far more people need a card and a monitor on the AMD side, as simple math on sales numbers would show. I'll reserve judgement for when AMD actually gives it to reviewers to TEST, but even for a $100 premium over whatever premium is on AMD's side, I'll go NV, because we know it works already, if they can't prove it is just as good as NV's solution (and I really mean AS GOOD, not good enough).
My monitors are over 7 years old now, so that works out to $1 or so per month for "it just works". I have no problem with proprietary stuff if it's better and not much more cost, like G-Sync seems to be over time. AMD also isn't leading in anything on the GPU side vs. Maxwell, and seems to have far more driver issues after cutting ~35% of their employees over the last 4 years or so. I don't mind paying for BETTER stuff. I.e., I like AMD the company and many products over the years (not management, but for example I own a Radeon 5850 currently), but I'll be buying an Intel CPU barring a miracle CPU from AMD next year.
ViRGE - Friday, November 21, 2014 - link
"Shouldn't we be calling it "not-so-freesync" by now?"
The "free" in "freesync" has always referred to royalties. Nothing more, nothing less.
chizow - Friday, November 21, 2014 - link
That's not true; Koduri was telling anyone who would listen that FreeSync would essentially be free and that many monitors on the market could support FreeSync with just a firmware update. Needless to say, we have not seen Koduri speak on this or any other matter lately; instead we get the even more dishonest Huddy.
Either way, it was a wholly misleading misnomer that many AMD fanboys have incorrectly parroted since. In the end they're just pawns as usual in AMD's stall tactics to try and delay adoption of G-Sync while they scrambled to formulate their own solution.
kyuu - Friday, November 21, 2014 - link
Don't you ever get tired of this endless posturing? FFS, unless Nvidia is paying you I just don't understand what it is you get out of it.
We get it. NVIDIA can do no wrong, while AMD is a pile of shit. We would all be much better off if Intel and Nvidia had a monopoly on the CPU and GPU markets and AMD was shot behind the barn to put it out of its misery.
chizow - Friday, November 21, 2014 - link
Do you like being lied to? Do you like being treated as an uninformed moron that doesn't know any better when AMD makes these ridiculous statements? Unless AMD is paying you or you're a braindead AMD zombie, I just don't understand what you get out of defending their endless posturing, excuses and flat-out lies. It's called accountability; AMD loves to just spitball and throw this nonsense out here, but it's OK as long as people like you don't hold them accountable for it.
I've never once stated Nvidia can do no wrong; in fact, I've been critical of many of their pricing/SKU initiatives with Kepler, but that doesn't change the fact they are a SOLUTIONS-driven company that actually produces product over bullshit slidedecks, and spends more of their time and effort developing and improving their own tech rather than pointing the finger and QQing over how their premature/stillborn tech will somehow be better when it eventually makes it to market.
And Intel vs. AMD isn't already a virtual monopoly in the CPU market? Seems to have been business as usual over the last 8 years since Conroe launched. Has anyone really even noticed a lack of AMD competitiveness in this market? I doubt we would see much different if a similar lopsided victory occurred for Nvidia on the GPU end of things. In fact, the last time Nvidia dominated AMD so badly, we got the $230 8800GT, as AT stated "the only GPU that matters".
legokill101 - Friday, November 21, 2014 - link
And look what we got in the CPU market: Intel can now set pretty much any price they want for their CPUs, and enthusiasts can do little more than grumble since there are no alternatives. I also think you are confusing cause and effect with the 8800GT; we did not get it BECAUSE Nvidia was dominating AMD (actually ATI at the time, I believe), but rather it was the reason they dominated. So please don't try and argue a monopoly is a good thing, or get cause and effect mixed up.
chizow - Friday, November 21, 2014 - link
Intel sets any price they want? This is nonsense and an internet tech meme that needs to die. What is the most you have ever spent on a CPU? What would compel you to spend considerably more than that over just using the CPU that you already have? What kind of performance gain would you need Intel to show just to buy another CPU at that same price point? There are very simple economic factors that dictate Intel's pricing even in a virtual monopoly, such as discretionary income, substitutes and the price elasticity of demand. You can see it in the comments section every time they release a new CPU; every single consumer is making those decisions in real time, and the discussion doesn't end in "Intel can make a $1000 CPU and I would be compelled to buy it", given that has been happening for years and that obviously isn't the case.
And regarding the 8800GT, you're wrong; Nvidia was dominating AMD for over a year with the 8800GTX and 8800GTS, with R600 repeatedly delayed and, ultimately, non-competitive. In the face of no actual competition, Nvidia did the unthinkable: they launched nearly flagship performance at a fraction of the cost, $230, which led them to even new heights. So yes, even in the face of no competition, the statement that competition is always good and necessary is not universally true.
dragonsqrrl - Monday, November 24, 2014 - link
@ legokill101: Intel has essentially maintained the same price points since Conroe, price points AMD used to occupy, all while consistently improving performance and efficiency. No one is saying monopoly is a good thing, but your assertion that Intel can and is charging whatever they want for their CPUs is simply baseless. That same argument, parroted by AMD fanboys for the past ~8 years, is in fact getting cause/effect mixed up.
Alexey291 - Sunday, November 23, 2014 - link
You certainly sound like a preaching moron, i.e. the worst kind.
A moron who thinks he knows some TRUTH and must spread it to the masses.
Go away please.
chizow - Monday, November 24, 2014 - link
What am I preaching? Reality? That's the best kind; maybe a bigger dose of it for AMD fanboys will diminish having to wade through all these nonsensical half-truths and memes in the future.
Feel free to post something worthwhile; until then I'll be right here, thanks.
BillyHerrington - Tuesday, November 25, 2014 - link
Since when did AnandTech turn into the likes of Fudzilla & WCCFtech?
Where everyone is calling each other fanboys and stupid.
Horza - Wednesday, November 26, 2014 - link
@Alex291
Seems like you've got his number, straight into a rant about giving everyone a dose of reality.
DanaGoyette - Friday, November 21, 2014 - link
23.6 inches and "4K" (assuming 3840x2160) -- that's 186.69 PPI. Sounds amazing!
Yes, some software breaks under scaling, but I'm okay with that. I've been happy with 144 PPI laptops (150% scaling) since 2008, and this 200% scaling should get rid of the blurriness.
I hope these will also have a 120Hz strobed backlight mode, and 8-bit color depth. My current XL2420TE has amazing motion in strobed mode, but also has horrible banding.
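For anyone who wants to check the 186.69 PPI figure, it's just the diagonal pixel count divided by the diagonal size in inches; a quick throwaway calculation (Python) with the numbers quoted in these comments:

import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f"{ppi(3840, 2160, 23.6):.2f} PPI")   # ~186.69 for the 23.6" panel
print(f"{ppi(3840, 2160, 28.0):.2f} PPI")   # ~157 for a 28" UD590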
Wolfpup - Friday, November 21, 2014 - link
From interviews with both companies, it really sounds like G-Sync has some serious advantages... but with it costing much more to implement and not being royalty-free, I can't see it going anywhere.
And where are the non-TN monitors that support it?
FreeSync at least seems like it might happen... except, practically speaking, if Nvidia doesn't support it (I can't remember if they are) it does me no good, as I need to see AMD doing YEARS of Nvidia-quality drivers before I consider them again... and I need to see them supporting old hardware like Nvidia does.
MrSpadge - Friday, November 21, 2014 - link
MrSpadge - Friday, November 21, 2014 - link
FreeSync might really take off if Intel adopted it. Not because their GPUs would be so strong, but because they might be able to combine it with any GPU via Lucid Hydra.
chizow - Saturday, November 22, 2014 - link
Hydra? Really? Haven't heard that name in a while. Dead tech is dead. I almost thought about using Lucid VirtuMVP on my 4770K, almost.
But yeah, if you want to take the chance, Intel is selling Larrabee Xeon Phis for dirt cheap right now!
https://software.intel.com/en-us/articles/special-...
yannigr2 - Sunday, November 23, 2014 - link
G-Sinking...
poohbear - Wednesday, November 26, 2014 - link
I love how nobody's addressing the elephant in the room... 4K gaming is far too demanding for anyone to take seriously!!! New games can't even run @ 1440p without SLI, let alone 4K!!!
DiHydro - Wednesday, December 3, 2014 - link
Asynchronous frame rate technology should help that. Then, if a frame takes longer than 16.6 milliseconds to render (60 frames/sec), it can hold the previous frame until it is done. There would need to be smoothing enabled, otherwise jitter like what happens in AMD CrossFire can occur. If the GPU can put out 30-60 FPS, and the frame syncing tech smooths out delivery in that range, current GPUs at the ~$300 price point should be able to push 4K. Basically, you can trick the user into feeling that the GPU is pushing buttery smooth 60 FPS without really having to get it. Input lag may be an issue, but that would need to be handled by the game itself and not the display or GPU.
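Putting rough numbers on that reasoning: with a fixed 60 Hz display, a frame that misses the 16.7 ms deadline waits for the next tick, while a variable-refresh display can show it as soon as it's ready, within whatever window the panel supports. A small illustrative calculation (the frame times and the 30-144 Hz window are assumptions for the example, not measurements):

# Illustrative comparison of fixed 60 Hz vsync vs. a variable-refresh panel.
FIXED_TICK_MS = 1000 / 60     # 16.7 ms refresh slots at 60 Hz
VRR_MIN_MS = 1000 / 144       # panel can refresh as soon as ~6.9 ms
VRR_MAX_MS = 1000 / 30        # and must refresh by ~33.3 ms

def shown_after_fixed(frame_ms):
    # With fixed vsync the frame waits for the next whole 16.7 ms slot.
    slots = -(-frame_ms // FIXED_TICK_MS)   # ceiling division
    return slots * FIXED_TICK_MS

def shown_after_vrr(frame_ms):
    # With variable refresh the frame is shown when ready, clamped to the panel's range.
    return min(max(frame_ms, VRR_MIN_MS), VRR_MAX_MS)

for frame_ms in (14.0, 18.0, 27.0):   # GPU rendering at roughly 37-71 FPS
    print(f"{frame_ms:.0f} ms frame -> fixed 60 Hz: {shown_after_fixed(frame_ms):.1f} ms, "
          f"variable refresh: {shown_after_vrr(frame_ms):.1f} ms")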