Not sure why people are bashing G-Sync, it's still fantastic. So it adds a little to the hardware price; you're still forgetting the main reason why G-Sync is better. It works on Nvidia hardware today, while FreeSync STILL requires a driver on Nvidia hardware, which they have already stated won't happen.
If you noticed, the displays they announced don't even push the specs FreeSync is capable of. They are the same as, if not worse than, the G-Sync ones. Notice the G-Sync IPS 144Hz monitors are coming out this month..
To add to the above, FreeSync is not better than G-Sync because it's "open". Costs still trickle down somehow..and that cost is you having to stick with AMD hardware when you buy a new monitor. So the supposed openness really amounts to closed tech...just like G-Sync.
Nvidia will someday have to provide FreeSync to comply with the spec of a more recent DisplayPort version. Right now they are just keeping the older 1.2 version on their GPUs simply so they don't have to adopt it.
It's optional. Outside of gaming it doesn't have any real value; so forcing mass market displays to use a marginally more complex controller/more capable panel doesn't add any value.
That is definitely not true. Tearing happens between the graphics card and the display; it doesn't matter what the source is. If you watch a movie at 24fps on a screen that is running at 60Hz connected to a PC, there is a chance for tearing.
Tearing is just less common with movies because the frame rates are usually much lower than the refresh rate of the display, so it's less likely to be visible.
If vsync isn't working, this is just as wrong. Since the rates are fairly close, at 24, 25, or almost 30 fps, you will see the tear line creeping up or down the image if vsync isn't on. It's exceptionally obvious. Usually you will just see skipped frames on Windows, since the compositor forces vsync for the desktop, and this is generally well supported by any video player's playback mechanism. The skipped frames become more noticeable as you watch, but aren't nearly as bad as tearing.
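To put rough numbers on that creeping tear line, here is a back-of-the-envelope sketch (my own illustration, assuming an idealized 60 Hz scanout and a 23.976 fps video with vsync off, not measured data):

```python
# With vsync off, the tear sits wherever the buffer swap lands inside the scanout.
# At 23.976 fps on an idealized 60 Hz panel the swap phase drifts a little every
# frame, so the tear slowly walks up or down the picture (at exactly 24 fps it
# ping-pongs between two fixed spots instead). Assumed numbers, not measurements.
REFRESH_MS = 1000 / 60
FRAME_MS = 1000 / 23.976

t = 0.0
for i in range(8):
    phase = (t % REFRESH_MS) / REFRESH_MS   # 0 = top of the frame, 1 = bottom
    print(f"swap {i}: tear roughly {phase:.1%} of the way down the screen")
    t += FRAME_MS
```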
I'm writing a compositing engine for HaikuOS and I would LOVE to be able to control the refresh timing! When a small update occurs and the frame buffer is ready, I'd swap it, trigger a monitor refresh, and be on my way right away.
As it stands, I have to either always be a frame behind, or try to guess how long composing the frame buffer from the update stream will take, before I know anything about what that update stream will look like, so that I know when to wake up the composite engine control loop.
That means that even for normal day-to-day stuff (opening a menu, dragging icons, playing solitaire, browsing the web, etc.), FreeSync would be quite useful. As it stands, the best I can do is hope the frame is ready for the next interval, or wait until the next refresh is complete to swap frame buffers, which means the data on screen is always a frame out of date (or more).
At 60 Hz that is a fixed delay multiplier of 16.7 ms, with a minimum multiplicand of 1. Going with higher refresh rates on the desktop is just wasteful (we don't really need 60, except for things to feel smooth due to the delay-multiplier effect of the refresh rate).
If I could use the whole range from 45 Hz to 75 Hz, the (virtual) multiplicand could be 0.75-1.33, instead of exactly 1 or 2. That makes a significant difference in jitter.
Everything would be smoother - and we could drop down to a 45 Hz refresh by default, saving energy in the process, instead of being stuck at a fixed cadence.
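To make that delay-multiplier point concrete, here is a tiny simulation (my own rough sketch with made-up uniform frame-ready times, not HaikuOS code) comparing how long a freshly composited frame waits for scanout at a fixed 60 Hz versus a hypothetical 45-75 Hz variable window:

```python
# How long a freshly composited frame sits around before it can be scanned out,
# fixed 60 Hz vs a hypothetical 45-75 Hz variable-refresh window. Frame-ready
# times are uniform random samples; the point is the shape, not exact numbers.
import random

FIXED_PERIOD_MS = 1000 / 60     # 16.7 ms between scanouts at a fixed 60 Hz
VRR_MIN_PERIOD_MS = 1000 / 75   # 13.3 ms: the panel can't refresh sooner than this

def wait_fixed(ready_ms):
    # next scanout is at the next multiple of 16.7 ms (last refresh at t = 0)
    return FIXED_PERIOD_MS - (ready_ms % FIXED_PERIOD_MS)

def wait_vrr(ready_ms):
    # the compositor can trigger a refresh as soon as the minimum period passes
    return max(0.0, VRR_MIN_PERIOD_MS - ready_ms)

random.seed(1)
ready_times = [random.uniform(0, FIXED_PERIOD_MS) for _ in range(100_000)]
print("avg wait, fixed 60 Hz  :", sum(map(wait_fixed, ready_times)) / len(ready_times))
print("avg wait, 45-75 Hz VRR :", sum(map(wait_vrr, ready_times)) / len(ready_times))
```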
Wrong. It is generating the visuals, and doing so in exactly the same way as far as any of this technology is concerned, and screen tearing does happen, because source frame rates vary from our common refresh rates.
Considering the power-saving impact it's had on the mobile sector (no sense rendering pixels that haven't changed, just render the ones that have), it most definitely would have a significant impact on the laptop market and would be a great 'green' tech in general.
No value, except to the consumer who doesn't have to pay the (current) $160+ premium for G-Sync. Now, if AMD had a graphics card competitor to the GTX 980, it'd be marvelous, and a no-brainer. Given that the cost to implement is apparently minimal, I don't see that as a problem. Even if you think it's not value added, panel manufacturers shoved pointless 3D down everyone's throat, so clearly they're not averse to that kind of behavior.
Inside of gaming it has plenty of value - who even cares about the rest? Gaming was a $25.1 billion market in 2010 (ESA annual report). I'd take a billionth of that pie and go out for a nice meal, wouldn't you?
... No current or upcoming DP spec ...requires... adaptive sync. It's optional; not sure how else you could interpret that, especially when you take the comment I responded to into consideration.
Read the article. Freesync monitors are less expensive. Plus, they have a much better chance of getting Intel support or even Nvidia support (wanna bet it's going to happen? they're simply going to call it DisplayPort variable refresh or something like that...)
Intel support? I doubt you will find anyone buying these to use with an Intel GPU. Why would Nvidia support it with its investment already in G-Sync..with new IPS G-Sync monitors shipping this month? Makes no sense.
Laptop displays. Laptop displays. Laptop displays. Being able to lower the refresh rate when you don't need it higher is something nearly every laptop could use. Currently there are no implementations of Freesync/Async that go down to 9Hz, but, well... That's power savings!
And tablet and phone chipsets go as far as having no refresh at all. The display only updates when there is a new frame. The tablets even use a simple power-saving frame buffer/LCD driver and turn off the render hardware entirely.
nVidia will buckle. It's inevitable. They can't stand against the entire industry, and AMD has the entire industry behind them with this. Jen Hsun knows he's already lost this battle, and he's just out to milk G-Sync for whatever he can get, for as long as he can get it. It's only a short matter of time before somebody hacks the nVidia drivers and makes them work with FreeSync, a la the old custom Omega ATI drivers. How appealing will it be to pay extra for G-Sync monitors once custom nVidia drivers exist that work with the much wider range of FreeSync monitors?
LOL, yes the thought of having to use a hacked driver to use an inferior solution leading to Nvidia reversing course on G-Sync is a thought only an AMD fan could possibly find palatable.
G-Sync isn't going anywhere, especially in light of all the problems we are seeing with FreeSync.
Of course FreeSync is better than Gsync because it's open. Royalty cost to GPU-maker to support FreeSync is literally $0. That makes future Intel or nVidia GPUs a driver update away from supporting FreeSync. Compare to Gsync, at a royalty cost greater than zero, assuming nVidia would license it at all. Scaler cost to LCD-makers to support FreeSync appears to be a maximum of $50 now, quite likely $0 in the long run as it becomes a default feature in all scalers. Compare to Gsync at $200+.
Take off your fanboy blinders for a moment. Capabilities being equal (as they so far seem to be), a royalty-free solution that's supported by default on future standard hardware is clearly better than a royalty-encumbered solution that requires costly additional hardware, no matter which team is supporting which one.
Again, please link confirmation from Nvidia that G-Sync carries a penny of royalty fees. The BoM for the G-Sync module is not the same as a royalty fee, especially because, as we have seen, that G-Sync module may very well be the secret sauce FreeSync is missing in providing an experience equivalent to G-Sync.
"I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. "
"It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."
Of course, royalties for using the G-SYNC brand are the real question -- not royalties for using the G-SYNC module. But even if NVIDIA doesn't charge "royalties" in the normal sense, they're charging a premium for a G-SYNC scaler compared to a regular scaler. Interestingly, if the G-SYNC module is only $40-$60, that means the LCD manufacturers are adding $100 over the cost of the module.
Why is there a need to finish the quote? If you get a spoiler and turbo charger in your next car, are they charging you a royalty fee? It's not semantics to anyone who actually understands the simple notion: better tech = more hardware = higher price tag.
@AnnihilatorX, how is "making profit from" suddenly the key determining factor for being a royalty? Is AMD charging you a royalty every time you buy one of their graphics cards? Complete and utter rubbish. Honestly, as much as some want to say it is "semantics", it really isn't, it comes down to clear definitions that are specific in usage particularly in legal or contract contexts.
A royalty is a negotiated fee for using a brand, good, or service, paid continuously per use or at predetermined intervals. That is completely different from charging a set price for the bill of materials of an actual component that is integrated into a product you can purchase at a premium. It is obvious to anyone that the additional component adds value to the product and is reflected in the higher price. This is in no way a royalty.
Jarred, Chizow is the most diehard Nvidia fanboy on Anandtech. There is nothing you could say to him to convince him that Freesync/Adaptive Sync is in any way better than G-Sync (pricing or otherwise). Just being unavailable on Nvidia hardware makes it completely useless to him. At least until Nvidia adopts it. Then it'll suddenly be a triumph of open standards, all thanks to VESA and Nvidia and possibly some other unimportant company.
And Alexvrb is one of the staunchest AMD supporters on Anandtech. There is quite a lot that could convince me FreeSync is better than G-Sync, that it actually does what it sets out to do without issues or compromise, but clearly that isn't covered in this Bubble Gum Review of the technology. Unlike the budget-focused crowd that AMD targets, I'm not going to let price be the driving factor, especially if one solution is better than the other at achieving what it sets out to do. So yes, while FreeSync is cheaper, to me it is obvious why: it's just not as good as G-Sync.
But yes, I'm honestly ambivalent about whether or not Nvidia supports Adaptive Sync; as long as they continue to support G-Sync as their premium option, then it's no problem. Supporting Adaptive Sync as their cheap/low-end solution would just be one less reason for anyone to buy an AMD graphics card, which would probably be an unintended consequence for AMD.
The cost means nothing; keep in mind the people buying this stuff already pay through the nose for hardware. Given that most people who buy Nvidia cards are going to get G-Sync, cost has no meaning.
This is a doubly bad thing for AMD..first, it's still tied to ITS graphics card (NV already said no to supporting it), and second, the monitors announced are already below the specs FreeSync is supposed to do, and worse than the next-gen G-Sync monitors.
I mean, I love competition like the next person, but this is just PR making it seem like it's a good thing when it's not.
Your name says it all. Do you really think manufacturers will beg Nvidia to come and mess with their manufacturing process just to include something that only they support? The time will come when phone makers join in, and they mostly don't use Nvidia GPUs. So now you have Nvidia vs. AMD and Intel (for ultrabooks) and ARM (Mali) and PowerVR. You think Nvidia can hold them off with overpricing and PR?
uhm no? I'd want my next monitor to be GPU agnostic ideally. And I'd want to use an nvidia card with it because right now AMD cards are still ovens compared to nvidia. Not because I like paying through the nose, a 750 Ti doesn't cost much at all.
I'll hold out since I'm trusting that this thing will solve itself (in favour of the industry standard, adaptive sync) sooner or later.
The reported GPU temp means nothing. That is just an indication of the heatsink/fan's ability to remove heat from the GPU. You need to look at power draw. The AMD GPUs draw significantly more power than current nVidia cards for less performance. That power generates heat, heat that needs to go somewhere. So while the AMD cards may be 10 degrees Celsius hotter, which isn't minimal in and of itself, they are having to dissipate quite a bit more generated heat. The end result is AMD GPUs are putting out quite a bit more heat than nVidia GPUs.
FreeSync today is only open to people with Radeon cards. AMD made the better deal: they let the monitor builders take on the cost for FreeSync. Nvidia made the hardware themselves.
Huh? Who cares if it's open/closed/upside down/inside out? G-Sync is better because it is BETTER at what it set out to do. If it is the better overall solution, as we have seen today it is, then it can and should command a premium. This will just be another bullet-point pro/con for Nvidia vs. AMD. You want better, you have to pay for it, simple as that.
Better? Did we read the same article? I can't find where it says that G-Sync is better than FreeSync. In what respect is it better? And the answer to your first question is: anyone with a brain. The thing is that Nvidia could enable support for FreeSync if it wouldn't hurt their pride, which would be a big benefit for their customers (you wouldn't be restricted in monitor selection), but they chose what is better for them: pushing G-Sync and milking more money from you. This is pretty stupid; while you may be some average gamer who thinks it is fine to have your monitor selection restricted to 2%, normal people probably wouldn't be that happy. The way it should be is that every monitor supports FreeSync (or whatever you call it), as this is a display feature and should have been developed by the LCD makers in the first place, but they don't give a shit about providing excellent displays as long as they can sell the shit people are buying like crazy (not referring to G-Sync monitors now). Vendor lock-in is always a bad thing. Oh, and the article says the G-Sync monitor doesn't provide the advanced OSD that common panels have today...so yeah, G-Sync is clearly better.
The ghosting problem actually has nothing to do with the G-Sync and FreeSync technologies, as the article said, but has more to do with the components in the monitor. So if Asus made a FreeSync version of the same ROG Swift monitor, there would've been no ghosting, just like the G-SYNC version. So your example is invalid.
@happycamperjack. Again, incorrect. Why is it that monitors from the SAME manufacturers, possibly even using the same panels, built on the same prevailing panel technologies of the day, exhibit widely different characteristics under variable refresh? Maybe that magic G-Sync module that AMD claims is pointless is actually doing something....like controlling the drive electronics that vary pixel response in response to changing framerates. Maybe AMD needs another 18 months to refine those scalers with the various scaler mfgs?
"Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."
BenQ, for example, makes a fine G-Sync monitor, and multiple high-refresh 3D Vision monitors well known for their lack of ghosting. Are you going to tell me that suddenly they are using inferior panel tech that can't control ghosting? This is 2015 and TN panels we are talking about here, right? This kind of ghosting has not been seen since circa 2007, when PMVA was all the rage.
I will summarize it for you in case your prejudice clouds your comprehension:
1) At no point does the article find any performance advantage for FreeSync or G-Sync (AMD claims a 0.5-1% advantage, but that's too small to detect, so we disregard it).
2) FreeSync has better monitor choices, including IPS and ones with better specs in general.
3) FreeSync monitors are about USD 200 cheaper, almost half the cost of a decent graphics card.
4) FreeSync monitors have on-screen displays (OSDs) that work; the G-Sync monitor's doesn't, due to its implementation.
5) FreeSync has better potential for future support, especially in laptops, because of zero royalty fees and only a minor hardware update.
6) FreeSync allows users the option to choose whether they want to enable Vsync or not; G-Sync locks Vsync on. This means the user can have better latency if they can stand tearing. The important thing is the option; having the option is always advantageous.
7) AMD claims FreeSync works from 9Hz-240Hz whereas G-Sync only works from 30Hz to 144Hz.
1) You assume the tests conducted here are actually relevant.
2) No, they don't. Nvidia has an IPS in the works that may very well be the best of all, but in the meantime it is obvious that, for whatever reason, the FreeSync panels are subpar compared to the G-Sync offerings. Courtesy of PCPer: http://www.pcper.com/image/view/54234?return=node%...
3) Sure they are cheaper, but they also aren't as good, and certainly not "Free", as there is a clear premium compared to non-FreeSync panels, and certainly no firmware flash is going to change that. Also, that $200 is going to have to be spent on a new AMD GCN1.1+ graphics card anyway, as anyone who doesn't already own a newer AMD card will have to factor that into their decision. Meanwhile, G-Sync supports everything from Nvidia from Kepler on. Nice and tidy (and dominant in terms of installed user base).
4) OSDs, scalers and such add input lag; while having multiple inputs is nice, OSDs are a feature gaming purists can live without (see: all the gaming direct-input modes on newer LCDs that bypass the scalers).
5) Not if they're tied to AMD hardware. They can enjoy a minor share of the dGPU graphics market as their TAM.
6) Uh, this is nonsense. FreeSync is still tied to Vsync in ways THIS review certainly doesn't cover in depth, but that's certainly not going to be a positive, since Vsync inherently adds latency. Meanwhile, Vsync is never enabled with G-Sync, and while there is more latency at the capped FPS, it is a driver-side cap, not Vsync.
7) Well, AMD can claim all they like that it goes as low as 9Hz, but as we have seen, the implementation is FAR worse, falling apart below 40FPS, where blurring, tearing, basically the whole image falls apart and everything you invested hundreds of dollars in basically becomes a huge waste. Meanwhile, G-Sync shows none of these issues, and I play some MMOs that regularly dip into the 20s in crowded cities with no sign of any of this.
So yes, as I've shown, there are still many issues with FreeSync that need to be addressed, which show it is clearly not as good as G-Sync. But like I said, this is a good introduction to the tech that Nvidia invented some 18 months ago; maybe with another 18 months AMD will make more refinements and close the gap?
5) What? Where did you get that Adaptive Sync is tied to AMD hardware? That's pretty much bullshit; if it were, it wouldn't be standardized by VESA, right? The fact that today only AMD hardware can support it (because they implemented it first) doesn't validate your claim that it is AMD-tied. Intel/Nvidia/anyone can implement it in their products if they want. It's like saying that if, for example, LG released the first monitor to support DP 1.3, that would imply DP 1.3 is LG-tied, lol. On the other hand, G-Sync is Nvidia-tied. But you know this, right?
@lordken, who else supports FreeSync? No one but AMD. Those monitor makers can ONLY expect to get business from a minor share of the graphics market given that is going to be the primary factor in paying the premium for one over a non-FreeSync monitor. This is a fact.
VESA supports FreeSync, which means Intel will probably support it, too. Intel graphics drive far more computers than AMD or nVidia, which means that if Intel does support it, nVidia is euchred, and even if Intel doesn't support it, many more gamers will choose free over paying an extra $150-$200 for a gaming setup. Between the 390-series coming out shortly and the almost guaranteed certainty that some hacked nVidia drivers will show up on the web to support FreeSync, G-Sync is a doomed technology. Period.
Intel has no reason to support FreeSync, and they have shown no interest either. Hell, they showed more interest in Mantle, but as we all know, AMD denied them (so much for being the open, hands-across-the-globe company).
But yes I'm hoping Nvidia does support Adaptive Sync as their low-end solution and keeps G-Sync as their premium solution. As we have seen, FreeSync just isn't good enough but at the very least it means people will have even less reason to buy AMD if Nvidia supports both lower-end Adaptive Sync and premium G-Sync monitors.
@lordken and yes, I am well aware G-Sync is tied to Nvidia, lol, but like I said, I will bet on the market leader with ~70% market share and installed user base (actually much higher than this, since Kepler is 100% coverage vs. GCN1.1 at maybe 30% of the cards sold since 2012) over the solution that holds a minor share of the dGPU market and an even smaller share of the CPU/APU market.
And why don't you drop your biased preconceptions and actually read some articles that don't just take AMD's slide decks at face value? Read a review that actually tries to tackle the real issues I am referring to, while actually TALKING to the vendors and doing some investigative reporting:
It's not a "major issue" so much as a limitation of the variable refresh rate range and how AMD chooses to handle it. With NVIDIA it refreshes the same frame at least twice if you drop below 30Hz, and that's fine but it would have to introduce some lag. (When a frame is being refreshed, there's no way to send the next frame to the screen = lag.) AMD gives two options: VSYNC off or VSYNC on. With VSYNC off, you get tearing but less lag/latency. With VSYNC on you get stuttering if you fall below the minimum VRR rate.
The LG displays are actually not a great option here, as 48Hz at minimum is rather high -- 45 FPS for example will give you tearing or stutter. So you basically want to choose settings for games such that you can stay above 48 FPS with this particular display. But that's certainly no worse than the classic way of doing things where people either live with tearing or aim for 60+ FPS -- 48 FPS is more easily achieved than 60 FPS.
The problem right now is we're all stuck comparing different implementations. A 2560x1080 IPS display is inherently different than a 2560x1440 TN display. LG decided 48Hz was the minimum refresh rate, most likely to avoid flicker; others have allowed some flicker while going down to 30Hz. You'll definitely see flicker on G-SYNC at 35FPS/35Hz in my experience, incidentally. I can't knock FreeSync and AMD for a problem that is arguably the fault of the display, so we'll look at it more when we get to the actual display review.
As to the solution, well, there's nothing stopping AMD from just resending a frame if the FPS is too low. They haven't done this in the current driver, but this is FreeSync 1.0 beta.
Final thought: I don't think most people looking to buy the LG 34UM67 are going to be using a low-end GPU, and in fact with current prices I suspect most people that upgrade will already have an R9 290/290X. Part of the reason I didn't notice issues with FreeSync is that with a single R9 290X in most games the FPS is well over 48. More time is needed for testing, obviously, and a single LCD with FreeSync isn't going to represent all future FreeSync displays. Don't try and draw conclusions from one sample, basically.
How is it not a major issue? You think that level of ghosting is acceptable and comparable to G-Sync!?!?! My, have your standards dropped. If that is the case, I do not think you are qualified to write this review; at the very least post it under Editorial, or even better, under the AMD-sponsored banner.
Fact is, below the stated minimum refresh, FreeSync is WORSE than a non-VRR monitor would be, as all the tearing and input lag is there AND you get awful flickering and ghosting too.
And how do you know it is a limitation of panel technology when Nvidia's solution exhibits none of these issues at refresh rates as low as 20Hz, and especially at the higher refresh rates where AMD starts to experience it? Don't you have access to the sources and players here? I mean, we know you have AMD's side of the story, but why don't you ask these same questions of Nvidia, the scaler makers, and the monitor makers as well? It could certainly be a limitation of the spec, don't you think? If monitor makers are just designing a monitor to AMD's FreeSync spec, and AMD is claiming they can alleviate this via a driver update, it sounds to me like the limitation is in the specification, not the technology, especially when Nvidia's solution does not have these issues. In fact, if you had asked Nvidia, as PCPer did, they may very well have explained to you why FreeSync ghosts/flickers and their solution does not. From PCPer, again:
" But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."
Science and hardware trumps hunches and hearsay, imo. :)
Also, you might need to get with Ryan to fully understand the difference between G-Sync and FreeSync at low refresh. G-Sync simply displays the same frame twice. There is no sense of input lag; input lag would be if the next refreshed frame were tied to different input I/O. That is not the case with G-Sync, because the second (held) frame is still tied to the input of the first frame, and the next live frame has live input. All you perceive is low FPS, not input lag. There is a difference. It would be like playing a game at 30FPS on a 60Hz monitor with no Vsync. Still, certainly much better than AMD's solution of having everything fall apart at a framerate that is still quite high and hard to attain for many video cards.
The LG is a horrible situation; who wants to be tied to a solution that is only effective in such a tight framerate band? If you are actually going to do some "testing", why don't you test something meaningful, like a gaming session that shows the % of frames, in any particular game with a particular graphics card, that fall outside the "supported" refresh rates? I think you will find the amount of time spent outside these bands is actually pretty high in demanding games and titles at the higher-than-1080p resolutions on the market today.
And you definitely see flicker at 35fps/35Hz on a G-Sync panel? Prove it. I have an ROG Swift and there is no flicker as low as 20FPS which is common in the CPU-limited MMO games out there. Not any noticeable flicker. You have access to both technologies, prove it. Post a video, post pictures, post the kind of evidence and do the kind of testing you would actually expect from a professional reviewer on a site like AT instead of addressing the deficiencies in your article with hearsay and anecdotal evidence.
Last part: again, I'd recommend running the test I suggested on multiple panels with multiple cards and mapping out the frame rates to see the % that fall outside or below these minimum FreeSync thresholds. I think you would be surprised, especially given many of these panels are above 1080p. Even that LG is only ~1.35x 1080p, but most of these panels are premium 1440p panels, and I can tell you for a fact a single 970/290/290X/980-class card is NOT enough to maintain 40+ FPS in many recent demanding games at high settings. And as of now, CF is not an option. So another strike against FreeSync: if you want to use it, your realistic option is a 290/X at the minimum, or there's a real possibility you are below the minimum threshold.
Hopefully you don't take this too harshly or personally; while there are some pointed comments in there, there's also a lot of constructive feedback. I have been a fan of some of your work in the past, but this is certainly not your best effort or an effort worthy of AT, imo. The biggest problem I have, and we've gotten into it a bit in the past, is that you repeat many of the same misconceptions that helped shape and perpetuate all the "noise" surrounding FreeSync. For example, you mention it again in this article, yet do we have any confirmation from ANYONE that existing scalers and panels can simply be flashed to FreeSync with a firmware update? If not, why bother repeating the myth?
"G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It’s a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module."
Especially those last few sentences. You say AMD can just duplicate frames like G-Sync but according to this article it's actually something in the G-Sync module that enables it. Is there truth to that?
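For what it's worth, the numbers in that quote (29 FPS -> 58 Hz, 25 -> 50, 14 -> 56) are consistent with a simple "repeat the frame until the refresh is back above a comfortable rate" rule. Here is a rough guess at that logic; this is purely my reconstruction from the quoted figures, not NVIDIA's documented algorithm, and the comfort threshold is an assumed value:

```python
# Purely a guess at the kind of logic described above -- NOT NVIDIA's published
# algorithm. It picks the smallest "draw the frame k times" multiple that lifts
# the effective refresh above an assumed comfort threshold, which happens to
# reproduce the figures PCPer measured (29->58, 25->50, 14->56).
PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 144
COMFORT_HZ = 45        # assumed per-panel tuning value, not a published spec

def refresh_for(fps: float) -> float:
    if fps >= PANEL_MIN_HZ:
        return fps                      # inside the VRR window: refresh tracks FPS
    k = 2                               # below the window: repeat each frame k times
    while fps * k < COMFORT_HZ and fps * (k + 1) <= PANEL_MAX_HZ:
        k += 1
    return fps * k

for fps in (29, 25, 14):
    print(f"{fps} FPS -> panel refreshes at {refresh_for(fps):.0f} Hz")
```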
Not worth the typing effort. Chizow is a well-known Nvidia fanboy or possibly a shill for them. As long as it is green, it is best to him. Bent over, cheeks spread, and ready for Nvidia's next salvo all the time.
@Socketofpoop, I'm well known among AMD fanboys! I'm so flattered!
I would ask this of you and the lesser-known AMD fanboys out there. If a graphics card had all the same great features, performance, support, and prices that Nvidia offers, but had an AMD logo and a red cooler on the box, I would buy the AMD card in a heartbeat. No questions asked. Would you, if the roles were reversed? Of course not, because you're an AMD fan, and obviously brand preference matters to you more than what is actually the better product.
I hate to break it to you, but history has not been kind to the technically superior but proprietary and/or higher cost solution. HD-DVD, miniDisc, Laserdisc, Betamax... the list goes on.
Something else interesting to note is that there are 11 FreeSync displays already in the works (with supposedly nine more unannounced), compared to seven G-SYNC displays. In terms of numbers, FreeSync on the day of launch has nearly caught up to G-SYNC.
Did you pull that off AMD's slide deck too, Jarred? What's interesting to note is that you list the FreeSync displays "in the works" without counting the G-Sync panels "in the works". And 3 monitors is now "nearly caught up to" 7? Right.
A brand-new panel is a big investment (not really); I guess everyone should place their bets carefully. I'll bet on the market leader that holds a commanding share of the dGPU market, consistently provides the best graphics cards, great support and features, and isn't riddled with billions in debt and a gloomy financial outlook.
Try counting again: LG 29", LG 34", BenQ 27", Acer 27" -- that's four. Thanks for playing. And the Samsung displays are announced and coming out later this month or early next. For NVIDIA, there are six displays available, and one coming next month (though I guess it's available overseas). I'm not aware of additional G-SYNC displays that have been announced, so there's our six/seven. I guess maybe we can count the early moddable LCDs from ASUS (and BenQ?) and call it 8/9 if you really want to stretch things.
I'm not saying G-SYNC is bad, but the proprietary nature and price are definitely not benefits for the consumer. FreeSync may not be technically superior in every way (or at least, the individual implementations in each LCD may not be as good), but open and less expensive frequently wins out over closed and more expensive.
@Jarred, thanks for playing, but you're still wrong. There are 8 G-Sync panels on the market, and even adding the one for AMD, that's still double, so how that is "nearly caught up" is certainly an interesting lens.
Nvidia also has panels in the works, including 2 major breakthroughs: the Acer 1440p IPS, the first 144Hz 1440p IPS VRR panel, and the Asus ROG Swift 4K IPS, the first 4K IPS VRR monitor. So yes, while AMD is busy "almost catching up" with low-end panels, Nvidia and their partners are continuing to pioneer the tech.
As for FreeSync bringing up the low end, I personally think it would be great if Nvidia adopted AdaptiveSync for their low end solutions and continued to support G-Sync as their premium solution. It would be great for the overwhelming majority of the market that owns Nvidia already, and would be one less reason for anyone to buy a Radeon card.
You sure have a lot of excuses. This is beta 1.0, it's the LCD's fault (PCPer didn't think so), and the assumption that open/free is frequently the winner (this isn't free: $50 by your own account for FreeSync, which is about the same as $40-60 for the G-Sync module, right? You even admit they're hiking prices on the vendor side by $100+). Ummm, tell that to CUDA and NV's generally more expensive cards. There is a reason they have pricing power (they are better) and own 70% of the discrete and ~75% of the workstation market. I digress...
"I'll bet on the market leader that holds a commanding share of the dGPU market, consistently provides the best graphics cards, great support and features, and isn't riddled with billions in debt with a gloomy financial outlook."
You mean you'll bet on the crooked, corrupt, anti-competitive, money-grubbing company that doesn't compensate their customers when they rip them off (bumpgate), has no qualms about selling them a bill of goods (GTX 970 has 4GB of RAM! Well, 3.5GB of 'normal' speed RAM, and 0.5GB of much slower, shitty RAM), and likes to pay off game makers to throw in trivial nVidia proprietary special effects (Batman franchise and PhysX, I'm looking right at you)? That company? OK, you keep supporting the rip-off GPU maker, and see how this all ends for you.
@Anubis44: Yeah, again, I don't bother with any of that noise. The GTX 970 I bought for my wife in December had no adverse impact from the paper spec change made a few months ago; it is still the same fantastic value and performer it was the day it launched.
But yes, I am sure ignoramuses like yourself are quick to dismiss all the deceptive and downright deceitful things AMD has said in the past about FreeSync, now that we know it's not really free, can't be implemented with a firmware flash, does in fact require additional hardware, and doesn't even work with many of AMD's own GPUs. And how about CrossFireX? How long did AMD steal money from end users like yourself on a solution that was flawed and broken for years on end, even denying there was a problem until Nvidia and the press exposed it with that entire runt-frame FCAT fiasco?
And bumpgate? LMAO. AMD fanboys need to be careful who they point the finger at; especially in the case of AMD, there are usually 4 more fingers pointing back at them. How about that Llano demand-overstatement lawsuit, still ongoing, that specifically names most of AMD's exec board, including Read? How about that Apple extended warranty and class-action lawsuit regarding the same package/bump issues on AMD's MacBook GPUs?
LOL, it's funny because idiots like you think "money-grubbing" is some pejorative and greedy companies are inherently evil, but then you look at AMD's financial woes and you understand they can only attract the kind of cheap, ignorant, and obtusely stubborn customers LIKE YOU who won't even spend top dollar on their comparably low-end offerings. Then you wonder why AMD is always in a loss position, bleeding money from every orifice, saddled with debt. Because you're waiting for that R9 290 to have a MIR and drop from $208.42 to $199.97 before you crack that dusty wallet open and fork out your hard-earned money.
And when there is actually a problem with an AMD product, you would rather make excuses for them and sweep those problems under the rug than demand a better product!
So yes, in the end, you and AMD deserve one another, for as long as it lasts anyways.
HD-DVD was technically superior and higher cost? It seems Blu-ray/HD-DVD is a counterexample to what you are saying, but you include it in the list in your favor. LaserDisc couldn't record whereas VCRs could. MiniDisc was smaller and offered recording, but CD-R came soon after and then all it had was the smaller size. Finally MP3 players came along and did away with it.
There's another difference in this instance, though, which doesn't apply to any of those situations that I am aware of, other than MiniDisc: G-Sync/FreeSync are linked to an already installed user base of requisite products. (MiniDisc was going up against CD libraries, although people could copy those. In any case, MiniDisc wasn't successful and was going AGAINST an installed user base.) NVIDIA has a dominant position in the installed GPU base, which is probably exactly the reason that NVIDIA chose to close off G-Sync and the "free" ended up being in FreeSync.
Assuming variable refresh catches on, if after some time G-Sync monitors are still significantly more expensive than FreeSync ones, it could become a problem for NVIDIA and they may have to either work to reduce the price or support FreeSync.
Uh, HD-DVD was the open standard there, guy, and it lost to the proprietary one: Blu-ray. But there are plenty of other instances of proprietary winning out and dominating; let's not forget Windows vs. Linux, DX vs. OpenGL, CUDA vs. OpenCL, the list goes on and on.
Fact remains, people will pay more for the better product, and better means better results, better support. I think Nvidia has shown time and again, that's where it beats AMD, and their customers are willing to pay more for it.
See: Broken Day 15 CF FreeSync drivers as exhibit A.
FYI, ghosting is a factor of the display and firmware, not of the inherent technology. So while it's valid to say, "The LG FreeSync display has ghosting..." you shouldn't by extension imply FreeSync in and of itself is the cause of ghosting.
So are you saying a firmware flash is going to fix this, essentially for free? Yes, that is a bit of a troll, but you get the picture. Stop making excuses for AMD and ask these questions of them and the panel makers, on record, for real answers. All this conjecture and excuse-making is honestly a disservice to your readers, who are going to make some massive investment (not really) in a panel that I would consider completely unusable.
You remember that Gateway FPD2485W that you did a fantastic review of a few years ago? Would you go back to that as your primary gaming monitor today? Then why dismiss this problem with FreeSync circa 2015?
You're assuming G-Sync stays the same price forever. So scalers can improve pricing (in your mind) to zero over time, but NV's will never shrink, get better revs, etc...LOL. OK. Also you assume they can't just lower the price any day of the week if desired. Microsoft just decided to give away Windows 10 (only to slow Android, but still). This is the kind of thing a company can do when they have $3.7B in the bank and no debt (NV has debt, but if it were paid off they'd have ~$3.7B left). They could certainly put out a better rev that is cheaper, or subsidize $50-100 of it for a while until they can put out a cheaper version, just to slow AMD down.
They are not equal. See other sites' reviews besides an AMD portal site like AnandTech ;)
http://www.pcper.com/reviews/Displays/AMD-FreeSync... There is no licensing fee from NV according to PCPer. "It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic." Which basically shows VENDORS must be marking things up quite a lot. But that is to be expected with ZERO competition until this week.
"For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself." Not the case on the AMD side as he says. So again not so free if you don't own a card. NV people that own a card already are basically covered, just buy a monitor.
The specs on this are misleading too, which AnandTech just blows by: "The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates, AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560x1440) or 48-75 Hz (IPS, 2560x1080); neither of which is close to the 9-240 Hz seen in this table."
Again, read a site that doesn't lean so heavily to AMD. Don't forget to read about the GHOSTING on AMD. One more point, PCper's conclusion: "My time with today’s version of FreeSync definitely show it as a step in the right direction but I think it is far from perfect." "But there is room for improvement: ghosting concerns, improving the transition experience between VRR windows and non-VRR frame rates and figuring out a way to enable tear-free and stutter-free gaming under the minimum variable refresh rate." "FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet."
You must be a headcase or, more likely, paid by Nvidia to publicly shill. G-Sync requires a proprietary Nvidia chip installed in the monitor that comes from, and only from, Nvidia.
It's much easier to simply set a flag-byte in the DisplayPort data stream that says "ok render everything since the last render you rendered, now". There's nothing closed about that.
And? Who cares, if it results in a better solution? LOL, only a headcase or a paid AMD shill would say removing hardware for a cheaper but worse solution is actually better.
Oh darn... so what you're saying is that I have to purchase the card that costs less, then I have to purchase the monitor that costs less too? Sounds like a raw deal... ROFL!!
And as far as your bogus openness argument goes: there is nothing preventing Nvidia from supporting Adaptive Sync. NOTHING. Well, unless you count hubris and greed. In fact, Mobile G-Sync already doesn't even require the module! I guess that expensive module really wasn't necessary after all...
And lastly, Nvidia has no x86 APU to offer, so they can't offer what AMD can with their Freesync-supporting APUs. Nvidia simply has nothing to compete with there. Even gamers on a tight budget can enjoy Freesync! The same simply cannot be said for GSync.
That's the thing: this is clearly a tech that can help lower-end GPUs provide a better gaming experience, and it is a patch away for nVidia.
Them saying they won't, or them just not doing it, is honestly a slap in the face to every customer of theirs. Me included: I don't want my GPU to work better with one monitor than another because of a branding issue.
If nVidia doesn't support FreeSync, I'll just never buy their products again. I honestly don't see why they wouldn't support it. Then their GPUs would work with everything while AMD is still stuck with FreeSync.
Not only is it insulting to me as a customer, it is also stupid from a business standpoint as well.
AdaptiveSync - an open VESA industry standard available for free to any company that wishes to utilize it.
G-Sync - Nvidia's proprietary solution that they collect royalties on and refuse to allow any other company to use.
So you are arguing that the cost is that you have to stick with AMD hardware? For one, how is this a cost?? My point is that the only reason you would have to stick with AMD hardware is that NVIDIA chooses not to support DisplayPort 1.2a. So it is NVIDIA costing you. Secondly, you are not limited to AMD hardware; rather, NVIDIA is excluding itself from your next purchase. FreeSync is not closed tech... Intel graphics chips could adopt it tomorrow since it is an open standard. It is NVIDIA that is closing down options, not AMD.
Costs will trickle down somehow...yes, in good PR. AMD spent a ton developing this only to give it away for free. It will pay off because it makes Nvidia look bad. I'm not a fanboy; I usually prefer Nvidia's drivers. I just like AMD better because they compete on cost, rather than matching it like an oligopoly.
"Costs still trickle down somehow..and that cost is you having to stick with AMD hardware when you buy a new monitor."
That's not a cost AMD is imposing on us, it's a cost nVidia is imposing on us, by stubbornly refusing to give their customers Freesync compatible drivers. nVidia is simply trying to grab as much cash as possible, and people like you are helping them rip us all off.
How is it not a cost AMD is imposing? LOL. FreeSync panels carry a premium over non-FreeSync panels; this is a fact. AMD has said it is the panel mfgs charging a premium for higher-quality components, specifications, and engineering/QA costs. No one has a problem with this, even AMD fanboys like you.
Yet when Nvidia does the same, especially when their G-Sync module is clearly doing a better job at what it needs to do relative to the new FreeSync scalers, all while offering more features (3D and ULMB), suddenly there's a problem and Nvidia has no right?
LOL, idiots. Nvidia and their manufacturing partners are charging more because the market sees value in their superior products, simple as that. These are the same manufacturers, btw; if they thought they could charge more they would, but clearly they also see that the Nvidia solution commands the higher price tag.
BS. As mentioned in the article, FreeSync support is no big deal and is already supported by most upscaler chips out there. Had there been a "hidden cost," they wouldn't do it.
How is it better because it works with Nvidia hardware? I mean, if you have an Nvidia card you don't have a choice. That doesn't make G-Sync better in any meaningful way.
... If you currently have an AMD card, you have much less of a choice. Actually given the restriction of GCN 1.1 or later, there's a decent possibility you have no choice.
Mmh, your point is? Of course if you have AMD you can only get FreeSync, because if nothing else Nvidia kept G-Sync for themselves. What were you trying to say? Nvidia is fragmenting the monitor market.
My point is that Nvidia currently has more options for variable refresh rate tech, on top of a much larger install base, than AMD. It often helps to read a response in the context of the comment it's responding to. If you can't see how that's a relevant response to FriendlyUser's comment, then I can't help you.
Exactly, yet AMD fans and surprisingly, even the author Jarred (who should know better), would have you believe G-Sync somehow faces the uphill battle?
Having Nvidia refuse to embrace a standard does not make overpriced Gsync devices "better." It's just Nvidia failing their users yet-again.
They screwed me on stereoscopic 3D by dropping support for the $1K eMagin HMD when changing business partners, making it clear that they do not care to support their customers if not supporting them will drive sales of new displays. I won't get fooled again.
Nvidia failing their users, that's interesting. So they failed their users by inventing a tech the world had never seen before and bringing it to market some 18 months before the competition. Having owned and used an ROG Swift for the past 7 months, which completely changed my gaming experience, I'd disagree with you.
Nvidia once again did exactly what I expect them to do: introduce great new technology to improve their ecosystem for their users.
For those 18 months, yes, Nvidia was good. But now it fails its customers because, by refusing to support the VESA standard, it is effectively limiting their choice of monitors and forcing customers to pay a premium if they want smooth gameplay.
No, they're reinforcing their position and rewarding their customers by sticking to and developing a technology they're invested in. Their customers will pay the premium as long as their solution is better, and the only way to continue to ensure it remains better is to continue investing and developing it. Nvidia has already said they aren't done improving G-Sync, given how awesome it has been in its first incarnation, I can't wait to see what they have in store.
Meanwhile, FreeSync is a good introduction into VRR for AMD, let's hope they continue to invest the same way Nvidia has to make sure they produce the best product they can for their users.
You can't possibly believe that, it's ridiculous! Rewarding their customers by making them pay a premium for an end result that's clearly not noticeably different from the free alternative?? It's really business 101: they had the market cornered and could charge whatever they wanted, fair play to them and well done. But now, when an equally good, free, open-standard alternative comes into play, not adopting it IS a complete disregard for their customers. I own Nvidia GPUs (SLI) now, and I DON'T want to pay for their solution after seeing what FreeSync can do. Not providing me with that option simply makes me a disgruntled customer who'll take his business elsewhere. The problem is people like you who can't see that and continue to blindly buy into it, making them reluctant to change their stance as long as the money rolls in. They'd drop G-Sync in an instant if no one bought their overpriced tech, and we'd all be better for it.
The fact that Jarred and AT not only missed this, but have actively made excuses for it or denied it, is pretty appalling as more and more reviewers are taking note of the ghosting and flickering problems with FreeSync.
@chizow: "Read some actual reviews that clearly show right now FreeSync is the inferior solution."
You mean, as opposed to all the 'virtual' reviews we've been reading that all say the same thing: that FreeSync does exactly what G-Sync does, except for free? Clearly, a green goblin lover like yourself must respect techreport.com, a blatantly pro-nVidia website plain and simple, and even their response to FreeSync? I quote Scott Wasson:
"Now, I have only had a few hours with the BenQ XL2730Z (our review sample arrived yesterday afternoon), but my first impressions are just this: AMD has done it. They've replicated that sense of buttery smooth animation that a fast G-Sync display will get you, and they've done it in a way that squeezes the extra costs out of the monitors. This is a very good thing."
I'll repeat that operative part of the quote so you can't possibly overlook or ignore it. AMD has "DONE IT. They've REPLICATED THAT SENSE OF BUTTERY SMOOTH ANIMATION that a fast G-Sync display will get you, and they've done it in a way that SQUEEZES THE EXTRA COSTS OUT OF THE MONITORS."
This is all that the vast legions of gamers are going to care about: same buttery smooth performance, less cost. QED.
Yeah, they are virtual reviews, done in controlled test environments over a limited period of time, just as noted in that review. Now that actual live samples are coming in, however, a LOT of problems are cropping up outside of the controlled test environments.
And before you attack the author's credibility, as I'm sure you will, keep in mind this guy has been a huge AMD advocate in the past:
"Note to readers: I have seen this firsthand on the Acer FreeSync monitor I’m reviewing, and PC Perspective noticed it with 2 additional FreeSync monitors, the BenQ XL2730Z and LG 34UM67. To illustrate the problem they recorded the aforementioned monitors running AMD’s Windmill demo, as well as the same demo running on a G-Sync enabled Asus ROG Swift. Ignore the stuttering you see (this is a result of recording at high speed) and pay attention to the trailing lines, or ghosting. I agree that it’s jarring by comparison."
@maximumGPU: Oh, but chizow DOES believe it. He believes that the green goblin on the box is the hallmark of quality, and that nVidia NEVER makes any mistakes, NEVER rips off its customers, and NEVER cheats or lies, or acts deceptively, because 'it's all business'. In other words, he's a sociopath. He BELIEVES G-Sync MUST be better than FreeSync because there's a green goblin-labelled chip in the G-Sync monitor that MUST be doing SOMETHING the FreeSync monitor just cannot do. It just HAS to be better, because he PAID for it. LOL
Who cares about all that noise? They still offer the best product at any given price point if you can afford the slightly better product. Only sycophants like you go into the stupid and inane morality of it, the same way you begrudge Intel for putting their boot to AMD's neck in the CPU market.
Buy what gives you the best product for your money and needs, who cares about the rest.
G-Sync is a shameless attempt to ban competition. AMD couldn't use it even if it were willing to PAY for it. On the contrary, FreeSync, being a VESA standard, can be freely used by any GPU manufacturer, including Intel and nVidia.
I would never buy a Gsync monitor because I want GPU flexibility down the road. Period. Unfortunately, with no assurance from Nvidia that they might support Freesync down the road, I'm afraid to invest in a Freesync monitor either. Both sides lose.
NVidia can state what they want. So can Apple. Companies that refuse to create standards with other companies for the benefit of the consumer will pay the price with actual money. This is part of why I buy AMD and not Nvidia products. I got tired of the lies, the schemes, and the confusion that they like to throw at those that fund them.
@DigitalFreak: So your response to an intelligently argued point about not supporting a company that imposes proprietary standards is to say 'your mom called?' You're clearly a genius, aren't you?
I'm disappointed Anandtech is forgetting this: there's a fundamental difference between G-Sync and FreeSync which justifies the custom hardware and makes it worth it, compared to something I'd rather do without: FreeSync requires V-Sync to be active and is plagued by the additional latency that comes with it, while G-Sync replaces it entirely.
Considering that the main reason anyone would want adaptive sync is to play decently even when the framerate dips to 30-40fps, where the 3 v-synced, pre-rendered frames account for a huge 30-40ms of latency, AMD can shove its free solution up its arse as far as I'm concerned.
I'd rather have tearing and no latency, or no tearing and acceptable latency at lower settings (to allow me to play at 60fps), neither of which requires a new monitor.
For the considerable benefit of playing at lower-than-60 fps without tearing AND no additional latency, I'll gladly pay the Nvidia premium (as soon as 4K 120Hz IPS is available).
Did you read the article? Of course not! VSync On or Off only comes into play when you are outside the refresh rate range of the monitor, and it is an option that G-Sync does not have. If you have G-Sync, you are forced into VSync On when outside the refresh range of the monitor.
No, it's the other way around. FreeSync does not require V-Sync; you can actually choose, and the choice affects stuttering/tearing versus display latency. OTOH, G-Sync does exactly what you said: V-Sync on when outside of the dynamic sync range. Read more carefully, pal.
The AMD and Nvidia haters all come out of the wood work for these type articles.
Intel needs to chime in. I suspect they will go the FreeSync route since it is part of the spec and there are no costs.
I understand Nvidia has some investment here. I fully expect them to support adaptive sync - at least within 5 years. They really need to do something about PhysX; as a customer I see it as irrelevant. I know it isn't their style to open up their tech.
Not to go off-topic too much, but physx as a CPU physics engine, like havok, etc., is quite popular. There are hundreds of titles out there using it and more are coming.
As for GPU physx, which is what you had in mind, yes, it'd never become widely adopted unless nvidia opens it up, and that would probably not happen, unless someone else comes up with another, open GPU accelerated physics engine.
Minor nitpick: Intel's solution won't be called FreeSync - that name is reserved for AMD-certified solutions. Pretty sure it's going to be technically the same, though, just using the adaptive sync feature of DP 1.2a. (My guess would be that at some point nvidia follows suit, first with notebooks, because gsync is more or less impossible there - though even then it will initially be restricted to notebooks that drive the display directly from the nvidia gpu, which aren't many; everything else will require intel to support it first. I'm quite confident they'll do this with desktop gpus too, though I suspect they'd continue to call it GSync. Let's face it, requiring a specific nvidia gsync module in the monitor just isn't going to fly with anything but the high-end gaming market, whereas adaptive sync should trickle down to a lot more markets, so imho there's no way nvidia's position on this doesn't have to change.)
@eanazag: nVidia will be supporting FreeSync about 20 minutes after the first hacked nVidia driver to support FreeSync makes it onto the web, whether they like it or not.
There's no need to be disappointed, honestly. Jarred just copy/pasted half of AMD's slide deck and then posted a Newegg review. Nothing wrong with that, Newegg reviews have their place in the world; it's just unfortunate that people will take his conclusions and actually believe FreeSync and G-Sync are equivalents, when there are already clear indications this is not the case.
- 40 to 48 minimums are simply unacceptably low thresholds before things start falling apart, especially given many of these panels are higher than 1080p. A 40 fps minimum at 4K, for example, is DAMN hard to accomplish; in fact the recently launched Titan X can't even do it in most games. CrossFireX isn't going to be an option either until AMD fixes FreeSync + CF, if ever.
- The tearing/ghosting/blurring issues at low frame rates are significant. AMD mentioned issues with pixel decay causing problems at low refresh, but honestly, this alone shows us G-Sync is worth the premium because it is simply better. http://www.pcper.com/files/imagecache/article_max_... Jarred has mused multiple times that these panels may use the same panel as the one in the Swift, so why are the FreeSync panels failing so badly at low refresh? Maybe that G-Sync module is actually doing something, like actively syncing with the monitor to drive overdrive correctly without breaking the kind of guesswork frame sync FreeSync is using?
- Input lag? We can show AMD's slide and take their word for it without even bothering to test? High-speed camera, one mouse's USB input duplicated to both setups, scroll and see which one responds faster. FreeSync certainly seems to work within its supported frequency bands in preventing tearing, but that was only half of the problem related to Vsync on/off; the other trade-off for Vsync ON was how much input lag it introduced.
-A better explanation of Vsync On/Off and tearing? Is this something the driver handles automatically? Is Vsync being turned on and off by the driver dynamically, similar to Nvidia's Adaptive Vsync? When it is on, does it introduce input lag?
In any case, AnandTech's Newegg review of FreeSync is certainly a nice preview and proof of concept, but I wouldn't take it as more than that. I'd wait for actual reviews to cover the aspects of display technology that actually matter, like input lag, blurring, image retention, etc., which can only really be captured and quantified with equipment like high-speed cameras and a sound testing methodology.
Newegg review, LOL! In defense of Jarred, he's probably working within the confines of the equipment made available to him by the parent company of this place. TFTCentral and PRAD have really expensive equipment they use to quantify the metrics in their reviews.
G-Sync isn't going anywhere, but its nice to see AMD provide their fans with an inferior option as usual. Works out well, given their customers are generally less discerning anyways. Overall its a great day for AMD fans that can finally enjoy the tech they've been downplaying for some 18 months since G-Sync was announced.
AMD offers an option that's indistinguishable in actual use from nVidia's, and significantly cheaper to boot. Sure, it's not enough for the "discerning" set who are willing to pay big premiums for minuscule gains just so they can brag that they have the best, but who other than nVidia stockholders cares who gets to fleece that crowd?
Frankly, I wish that AMD could pull the same stunt in the CPU market. Intel could use a price/performance kick in the pants.
Well, unfortunately for less discerning customers - the type that would just take such a superficial review as gospel and declare equivalency - the issues with input lag come down to minor differences that are not easily quantified or identified per frame, but over thousands of frames the differences are much more apparent.
If you're referring to differences in FPS charts, you've already failed to see the value Nvidia provides to end users in their products; graphics cards have become much more than just spitting out frames on a bar graph. FreeSync and G-Sync are just another example of this, going beyond the "minuscule gains" vs. price tag that less discerning individuals might prioritize.
You're so discerning that I'm sure you could wax poetical on how your $3K monocrystalline speaker cables properly align the electrons to improve the depth of your music in ways that aren't easily quantifiable.
No, but I can certainly tell you why G-Sync and dozen or so other features Nvidia provides as value-add features for their graphics cards make them a better solution for me and the vast majority of the market.
Np, always nice mental exercise reminding myself why I prefer Nvidia over AMD:
1. G-Sync
2. 3D Vision (and soon VRDirect)
3. PhysX
4. GeForce Experience
5. Shadowplay
6. Better 3rd party tool support (NVInspector, Afterburner, Precision) which gives control over SLI/AA settings in game profiles and overclocking
7. GameWorks
8. Better driver support and features (driver-level FXAA and HBAO+), profiles as mentioned above, better CF profile and Day 1 support.
9. Better AA support, both in-game and forced via driver (MFAA, TXAA, and now DSR)
10. Better SLI compatibility and control (even if XDMA and CF have come a long way in terms of frame pacing and scaling).
11. Better game bundles
12. Better vendor partners and warranty (especially EVGA).
13. Better reference/stock cooler, acoustics, heat etc.
Don't particularly use these, but they are interesting to me either at work or for the future:
14. CUDA - we only use CUDA at work, period.
15. GameStream - this has potential, but not enough for me to buy a $200-300 Android device for PC gaming, yet.
16. GRID - another way to play your PC games on connected mobile devices.
I can certainly let you off most of those, but third party activities shouldn't count, so you can subtract 6 and 12. Additionally, 13 can be picked apart as the 295X2 showed that AMD can present a high quality cooler, and because I believe lumping the aesthetic qualities of a cooler in with heat and noise is a partial falsehood (admit it - you WILL have been thinking of metal versus plastic shrouds). I also don't agree with you on 11; at least, not if you move back past the 2XX generation as AMD had more aggressive bundles back then. 8 is subjective but NVIDIA usually gets the nod here.
Also, some of your earlier items are proprietary tech, which I could always tease you about, as it's not as if they couldn't license any of this out. ;)
I'll hand it to you and credit you with your dozen.
And I thank you for not doing the typical dismissive approach of "Oh I don't care about those features" that some on these forums might respond with.
I would still disagree on 6 and 12 though, ultimately they are still a part of Nvidia's ecosystem and end-user experience, and in many cases, Nvidia affords them the tools and support to enable and offer these value-add features. 3rd party tools for example, they specifically take advantage of Nvidia's NVAPI to access hardware features via driver and Nvidia's very transparent XML settings to manipulate AA/SLI profile data. Similarly, every feature EVGA offers to end users has to be worth their effort and backed by Nvidia to make business sense for them.
And 13, I would absolutely disagree on that one. I mean we see the culmination of Nvidia's cooling technology, the Titan NVTTM cooler, which is awesome. Having to resort to a triple slot water cooled solution for a high-end graphics card is terrible precedent imo and a huge barrier to entry for many, as you need additional case mounting and clearance which could be a problem if you already have a CPU CLC as many do. But that's just my opinion.
AMD did make a good effort with their Gaming Evolved bundles and certainly offered better than Nvidia for a brief period, but it's pretty clear their marketing dollars dried up around the same time they cut that BF4 Mantle deal, and their current financial situation hasn't allowed them to offer anything compelling since. But I stand by that bullet point: Nvidia typically offers the more relevant and attractive game bundle at any given time.
One last point in favor of Nvidia, is Optimus. I don't use it at home as I have no interest in "gaming" laptops, but it is a huge benefit there. We do have them on powerful laptops at work however, and the ability to "elevate" an application to the Nvidia dGPU on command is a huge benefit there as well.
@chizow: But hey kids, remember, after reading this 16 point PowerPoint presentation where he points out the superiority of nVidia using detailed arguments like "G-Sync" and "GRID" as strengths, chizow DOES NOT WORK FOR nVidia! He is not sitting in the marketing department in Santa Clara, California, with a group of other marketing mandarins running around, grabbing factoids for him to type in as responses to chat forums. No way!
Repeat after me, 'chizow does NOT work for nVidia.' He's just an ordinary, everyday psychopath who spends 18 hours a day at keyboard responding to every single criticism of nVidia, no matter how trivial. But he does NOT work for nVidia! Perish the thought! He just does it out of his undying love for the green goblin.
But hey, remember AMD fantards, there's no reason the overwhelming majority of the market prefers Nvidia; those 16 things I listed don't actually mean anything if you prefer a subpar product and don't demand better, and you can keep choosing to ignore the obvious: one product supports more features and the other doesn't. Just keep accepting subpar products and listening to AMD fanboys like anubis44, and don't give in to the reality the rest of us accept as fact.
False, it's indistinguishable "within the supported refresh rate range" as per this review. What happens outside the VRR window, however, and especially under it, is incredibly different. With G-Sync, if you get 20 fps it'll actually duplicate frames and tune the monitor to 40Hz, which means smooth gaming at sub-30fps frame rates (well, as smooth as 20fps can be). With FreeSync, it'll just fall back to v-sync on or off, with all the stuttering or tearing that involves. That means that if your game ever falls below the VRR window on FreeSync, image quality falls apart dramatically. And according to PCPer, this isn't just something AMD can fix with a driver update, because it requires the frame buffer and logic on the G-Sync module!
Take note that the LG panel tested actually has a VRR window lower bound of 48Hz, so image quality starts falling apart if you dip below 48fps, which is clearly unacceptable.
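To make the low-framerate behaviour concrete, here's a rough sketch in Python of the frame-multiplication idea PCPer describes for G-Sync. The window bounds and the exact policy are illustrative assumptions, not Nvidia's actual algorithm:

    def effective_refresh(game_fps, vrr_min=40, vrr_max=144):
        # Inside the VRR window: one panel refresh per rendered frame.
        if game_fps >= vrr_min:
            return min(game_fps, vrr_max), 1
        # Below the window: repeat each frame until the panel rate is legal again,
        # instead of falling back to fixed-rate v-sync as the FreeSync panels here do.
        repeats = 1
        while game_fps * repeats < vrr_min:
            repeats += 1
        return game_fps * repeats, repeats

    print(effective_refresh(20))   # (40, 2): 20 fps shown at 40 Hz, each frame scanned twice
    print(effective_refresh(15))   # (45, 3)
    print(effective_refresh(60))   # (60, 1)

The point being: with frame repetition the panel never has to leave its VRR window, so timing stays tied to the game's frames even at very low fps.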
Yeah, just like "R.I.P. DirectX, Mantle will rule the day" - and now AMD is telling developers to ignore Mantle. G-Sync is great, and those of us who prefer drivers being updated the same day games are released will stick with Nvidia, while months later AMD users will still be crying that games don't work right for them.
Company of Heroes 2 worked like shit for nVidia users for months after release, while my Radeon 7950 was pulling more FPS than a Titan card. To this day, Radeons pull better, and smoother FPS than equivalently priced nVidia cards in this, my favourite game. The GTX970 is still behind the R9 290 today. Is that the 'same day' nVidia driver support you're referring to?
More BS from one of the biggest AMD fanboys on the planet, an AMD *CPU* fanboy no less. CoH2 ran faster on Nvidia hardware from Day 1, and it also runs much faster on Intel CPUs, so yeah, as usual, you're running the slower hardware in your favorite game simply because you're a huge AMD fanboy.
Good review, helps clear up a lot with respect to these new features. I've long thought that achieving a sufficiently high FPS and refresh rate would take care of things, but it's not always possible to do that with how games have pushed the limits of card abilities.
I'm still on the fence about whether or not I should upgrade my monitor. These days I do a lot of my gaming on my TV by running it through my AV receiver. However, there are some games (like Civilization V) that just don't translate well to a couch-based experience.
I guess the likelihood of someone coming out with a 16:10 FreeSync IPS monitor is pretty low? Sadly it seems like 16:10 monitors in general are becoming an endangered species these days.
Agreed; on Newegg right now 98% of 16:10 displays use DP while only 40% of 16:9 displays do, so the likelihood may be greater... but that 40% makes up a larger pool of monitors overall. However, with 2160p (16:9) and 21:9 displays on the rise for gaming, fewer 16:10 displays will probably be produced in general.
I think NVIDIA made a mistake in closing off G-sync. The death of Betamax should have taught them that you really can't go it alone in developing industry standards.
If they opened it up to monitor manufacturers, but still required GPU makers (AMD) to pay a royalty on that side, they might have gotten a lot more adoption by now and may have made the roll-out of FreeSync very difficult for AMD.
I'm so sick of this proprietary shit. When companies want to do something new, why not work with the other companies in the industry to come up with the best solution for the problem and make it universal? Customers NEVER jump onto a technology that only works with one company; we aren't going to commit to a monopoly. I get that they want a competitive edge in the market, but it literally never works that way. What happens is they spend all this money on R&D, marketing, and prototyping, waste everyone's time with marketing and reviews, only to have a handful of people pick it up (probably too few to even break even), and then it stops growing right there until a universal standard comes out that the industry adopts as a whole.
Just fucking stop wasting everyone's time and money and choose cooperation from the start!
I'm sure AMD will be happy to give you the masks to the 290X and 390X, if you only ask. "I'm sick of there only being two players in the market. Why don't you let me in? I'm sure I could sell your products for less than your prices!"
Next time you try to make someone look stupid, try not to look like a fool yourself. Another dogma believer who thinks the world will end without patents, same as the copyright believers who think music & entertainment would cease to exist without copyright... If you think about it, you can see that patents can actually slow down technological advancement quite a bit. You can even see that today with Intel CPUs: because AMD cannot catch up and CPUs are patent-locked, we are left to be milked by Intel with minimal performance gains between generations. If AMD either had better CPUs or could simply copy the good parts of Intel's design into their own, imho we would be much further along in performance. Also, look around at the 3D printing boom. Guess what: a few years back, and again last year, the key patents expired, and that's what allowed it. 3D printing was invented 30 years ago, yet it only reaches your desk today. So much for patent believers.
Btw, even if AMD gave you their blueprints, what would you do? Start selling the R390X tomorrow, manufactured out of thin air? By the time you could actually ship an R390X we would be at the 590 generation. Only nvidia/intel could possibly benefit from it any sooner (which isn't necessarily a bad thing).
@Hrel: a) because most corporations are run by greedy bastards imho, b) the managers of said corporations can't employ common sense and are disconnected from reality, making stupid/bad decisions. I see it in the big corporation I work for... using your brain and thinking in a "pro-consumer" way is forbidden.
Please just support the DisplayPort Adaptive-Sync standard, Nvidia; your G-Sync implementation doesn't have any benefits over it. You don't need to call it FreeSync, but we all need to settle on one standard, because if I have to be locked into one brand of GPU based on my monitor, it's not going to be the one that isn't using industry standards.
It would be interesting to see input lag comparisons and videos of panning scenes in games that would typically cause tearing captured at higher speed played in slow motion.
Until we have video showing the two of them running side by side, we can't declare a winner. There might be a lot of metrics for measuring "tearing", but this is not about "hard metrics", it's about how the bloody frame sequences look on your screen. Smooth or not.
The difference cannot be shown on video. How can a medium like video which has a limited and constant frame-rate be used to demonstrate a dynamic, variable frame rate technology?
This is one of those scenarios where you can experience it only on a real monitor.
I am one of those who switch every time I upgrade my GPU (which is every few years). Sometimes AMD is on top, other times Nvidia is better. Now I must either be locked into one forever or buy 6 monitors (3 for Eyefinity and 3 for Nvidia Surround)!
If they can put out a confirmed 1440p 21:9 w/Freesync they will get my money. The rumors around the Acer Predator are still just rumors. Please... someone... give me the goods!
It's pretty likely that LG will do just that. They already make two 1440p 21:9 monitors, and since it sounds like FreeSync will be part of new scalers going forward, you can probably count on the next LG 1440p 21:9 picking up that ability.
I'm right there with you. I'm already preparing to get the updated LG 1440 21:9 and a 390X, because if the card is anything like the rumors, it's going to be fantastic, and after getting a 21:9 for work I can't make myself use any other aspect ratio.
Same deal here. If nVidia supported FreeSync and priced the Titan X (or impending 980 Ti) in a more sane manner I'd consider going that way because I have no great love for either company.
But so long as they expect to limit my monitor choices to their price-inflated special options and pretend that $1K is a reasonable price for a flagship video card, they've lost my business to someone with neither of those hangups.
I have no experience with 144Hz screens. I've been waiting for FreeSync, but you're saying the difference is negligible with a static 144Hz monitor? Is that at any FPS, or does the FPS also have to be very high? (In regards to the 4th paragraph on the last page.) Thanks
I'd have to do more testing, but 144Hz redraws the display every 6.9ms compared to 60Hz redrawing every 16.7ms. With pixel response times often being around 5ms in the real world (not the marketing claims of 1ms), the "blur" between frames will hide some of the tearing. And then there's the fact that things won't change as much between frames that are 7ms apart compared to frames that are 17ms apart.
Basically at 144Hz tearing can still be present but it ends up being far less visible to the naked eye. Or at least that's my subjective experience using my 41 year old eyes. :-)
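For the curious, the raw numbers behind that (the ~5 ms pixel response figure is the rough real-world estimate from above, not a measurement of any specific panel):

    for hz in (60, 120, 144):
        print(f"{hz:3d} Hz -> {1000.0 / hz:.1f} ms between redraws")
    # 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms

A tear line at 144Hz gets overwritten in well under half the time it would persist at 60Hz, and with ~5 ms pixel transitions a good chunk of that is already blurred away.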
If you have a display with backlight strobing (the newest LightBoost, BenQ blur reduction, etc.) the difference is readily apparent. Motion clarity is way, way better than without it. The issue is it's like a CRT, and strobing is annoying at low rates; 75Hz is about the absolute minimum, and 90Hz and above are better. I doubt any of these displays support strobing and adaptive sync at the same time currently, but when you can push the frames it's totally worth it. The new BenQ mentioned in the article will do both, for example (maybe not at the same time). That way you can have adaptive sync for games with low FPS and strobing for games with high FPS.
I'm not a fan of closed, expensive solutions, but this hate towards g-sync that some here are showing is unwarranted.
nvidia created g-sync at a time when no alternative existed, so they built it themselves, and it works. No one was/is forced to buy it.
It was the only option and those who had a bit too much money or simply wanted the best no matter what, bought it. It was a niche market and nvidia knew it.
IMO, their mistake was to make it a closed, proprietary solution.
Those consumers who were patient can now enjoy a cheaper and, in certain aspects, better alternative.
Now that DP Adaptive-Sync exists, nvidia will surely drop the g-sync hardware and introduce a DP-compatible software g-sync. I don't see anyone buying a hardware g-sync monitor anymore.
you don't understand the hate because you think nvidia will drop g-sync immediately. It's likely you're right but it's not a given. Maybe it will be a while before the market forces nvidia to support adaptive sync.
nVidia will protect manufacturers that invested resources into G-Sync. They will continue support for G-Sync and later introduce added support for Freesync.
The fact that only AMD cards work with Freesync now is not because Freesync is closed but because Nvidia refuses to support it. It takes a perverse kind of Alice in Wonderland logic to use the refusal of certain company to support an open standard in its hardware as proof that the open standard is in fact "closed."
Freesync is open because it is part of the "open" Displayport standard and any display and GPU maker can take advantage of it by supporting that relevant Displayport standard (because use of the Displayport standard that Freesync is part of is free). Nvidia's Gsync is "closed" because Nvidia decides who and on what terms gets to support it.
Whatever the respective technical merits of Freesync and Gsync, please stop trying to muddy the water with sophistry about open and closed. Nvidia GPUs could work with Freesync monitors tomorrow if Nvidia wanted it - enabling Freesync support wouldn't cost Nvidia a dime in licensing fees or require the permission of AMD or anyone else. The fact that they choose not to support it is irrelevant to the definition of Displayport 1.2a (of which Freesync is a part) as an open standard.
Are NVIDIA's partners able to modify their cards BIOS and/or provide customized drivers to support FreeSync or do they have to rely on NVIDIA to adopt the feature? I know different manufacturers have made custom cards in the past with different port layouts and such. I never investigated to see if they required a custom driver from the manufacturer, though. Is it possible that this could be an obstacle that an EVGA, ASUS, MSI, etc. could overcome on their own?
How is this even remotely a fact when AMD themselves have said Nvidia can't support FreeSync, and even many of AMD's own cards in relevant generations can't support it? Certainly Nvidia has said they have no intention of supporting it, but there's also the possibility AMD is right and Nvidia can't support it.
So in the end, you have two effectively closed and proprietary systems, one designed by AMD, one designed by Nvidia.
Nvidia cannot use FreeSync as it is AMD's implementation of VESA's Adaptive-Sync; they would have to come up with their own implementation of the specification.
Are you sure? They would only have to come up with a different name (if they even want one). Just as both amd/nvidia call and use "DisplayPort" as DisplayPort, they didn't have to come up with their own implementations of it, because DP is standardized by VESA, so they simply used that. Or am I missing the point you wanted to make?
The question is whether it becomes a core/regular part of, let's say, DP 1.4 onwards. Right now it is only optional, aka 1.2a, and not even in DP 1.3 - if I understand that correctly.
So it is also closed/proprietary on an open spec? Gotcha, so I guess Nvidia should just keep supporting their own proprietary solution. Makes sense to me.
You mean like repeating FreeSync can be made backward compatible with existing monitors with just a firmware flash, essentially for Free? I can't remember how many times that nonsense was tossed about in the 15 months it took before FreeSync monitors finally materialized.
Btw, it is looking more and more like FreeSync is a proprietary implementation based on an open-spec just as I stated. FreeSync has recently been trademarked by AMD so there's not even a guarantee AMD would allow Nvidia to enable their own version of Adaptive-Sync on FreeSync (TM) branded monitors.
From the PC Perspective article you've been parroting around like gospel all day today:
"That leads us to AMD’s claims that FreeSync doesn’t require proprietary hardware, and clearly that is a fair and accurate benefit of FreeSync today. Displays that support FreeSync can use one of a number of certified scalars that support the DisplayPort AdaptiveSync standard. The VESA DP 1.2a+ AdaptiveSync feature is indeed an open standard, available to anyone that works with VESA while G-Sync is only offered to those monitor vendors that choose to work with NVIDIA and purchase the G-Sync modules mentioned above."
@ddarko, it says a lot that you quote the article but omit the actually relevant portion:
"Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."
And more on that G-Sync module AMD claims isn't necessary (but we in turn have found out a lot of what AMD said about G-Sync turned out to be BS even in relation to their own FreeSync solution):
"But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."
In summary, AMD's own proprietary spec just isn't as good as Nvidia's.
AMD's spec is not proprietary, so stop lying. Also, I love how you just quoted in context what you quoted out of context in an earlier comment. The only argument you have against FreeSync is ghosting, and as many people have pointed out, that is not an inherent issue with FreeSync but with the monitors themselves. The example given there shows three different displays that are all affected differently: the LG and BenQ both show ghosting differently yet use the same FreeSync standard, so something else is different here, not FreeSync.
On top of that, the LG is $100 less than the Asus and the BenQ $150 less, for the same features and more inputs. I don't see how a better, more well-rounded monitor that offers variable refresh rates with more features at a lower price is a bad thing. From the consumer side of things that is great! A few ghosting issues that I'm sure are hardly noticeable to the average user are not a major problem. The videos shown there were taken at a high frame rate and slowed down, then put into a compressed format and thrown on YouTube in what is a very jerky, hard-to-see video - great example for your only argument.
If the tech industry could actually move away from proprietary/patented technology, and maybe try to offer better products instead of "good enough" products that force customers into choosing and being locked into one thing, we could be a lot further along.
Huh? How do you know Nvidia can use FreeSync? I am pretty sure AMD has said Nvidia can't use FreeSync, if they decide to use something with DP 1.2a Adaptive Sync they have to call it something else and create their own implementation, so clearly it is not an Open Standard as some claim.
And how is it not an issue inherent with FreeSync? Simple test that any site like AT that actually wants answers can do:
1) Run these monitors with Vsync On.
2) Run these monitors with Vsync Off.
3) Run these monitors with FreeSync On.
Post results. If these panels only exhibit ghosting in 3), then obviously it is an issue caused by FreeSync - especially when these same panel makers (the 27" 1440p BenQ is apparently the same AU Optronics panel as the ROG Swift) have panels on the market, both non-FreeSync and G-Sync, that show no such ghosting.
And again you mention the BenQ vs. the Asus - well guess what? Same panel, VERY different results. Maybe it's that G-Sync module doing its magic, and it actually justifies its price. Maybe that G-Sync module isn't bogus as AMD claimed; maybe it's actually the Titan X of monitor scalers and worth every penny it costs over AMD FreeSync, if it succeeds at preventing the kind of ghosting we see on the AMD panels while allowing VRR to go as low as 1 FPS.
Same panel, different scalers. AMD just uses the standard built into DisplayPort; the scaler handles the rest there, so it isn't necessarily FreeSync but the variable refresh rate technology in the scaler that would be causing the ghosting. So again, not AMD but the manufacturer.
"If these panels only exhibit ghosting in 3), then obviously it is an issue caused by FreeSync" Haven't seen this and you haven't shown us either.
"Maybe its that G-Sync module doing its magic" This is the scaler so the scaler, not made by AMD, that supports the VRR standard that AMD uses is what is controlling that panel not freeSync itself. Hence an issue outside of AMDs control. Stop lying and saying it is an issue with AMD. Nvidia fanboys lying, gotta keep them on the straight and narrow.
Yes..the scaler that AMD worked with scaler makers to design, using the Spec that AMD designed and pushed through VESA lol. Again it is hilarious that AMD and their fanboys are now blaming the scaler and monitor makers already. This really bodes well for future development of FreeSync panels. /sarcasm
Viva La Open Standards!!!! /shoots_guns_in_air_in_fanboy_fiesta
So the AMD fanboys like you can't keep shoving your head in the sand and blaming Nvidia for your problems. Time to start holding AMD accountable, something AMD fanboys should've done years ago.
That the display controller in Nvidia video cards doesn't support variable refresh intervals? First of all, that's an AMD executive's speculation on why Nvidia has to use an external module. It's never been confirmed to be true by Nvidia. If this is the source of your claim, then it's laughable - you're taking as gospel what an AMD exec says he thinks the hardware capabilities of Nvidia cards are. Whatever.
Second, even if it was true for arguments sake, that still means Freesync is open while Gsync is closed because Nvidia can add display controller hardware support without anyone's approval or licensing fee. AMD or Intel cannot do the same with Gsync. It's really that simple.
Really, grasping at straws only weakens your arguments. Everyone understands what open and closed means and your attempts to creatively redefine them are a failure. The need to add hardware does not make a standard closed - USB 3.1 is an open standard even though vendors must add new chips to support it. It is open because every vendor can add those chips without license fee to anyone else. Freesync is open - Nvidia, AMD or Intel. Gsync is not. Case closed.
Hey, AMD designed the spec, so they should certainly know better than anyone what can and cannot support it, especially given MANY OF THEIR OWN RELEVANT GCN cards CANNOT support FreeSync. I mean, if it were such a trivial matter to support FreeSync, why can't any of AMD's GCN 1.0 cards support it? Good question, huh?
2nd part, for arguments sake, I honestly hope Nvidia supports FreeSync, because they can just keep supporting G-Sync as their premium option allowing their users to use both monitors. Would be bad news for AMD however, as that would be even 1 less reason to buy a Radeon card.
Not really; Radeon cards compete well with Nvidia. The two leapfrog each other, and both offer better value depending on when you look at what is available. Also, the older GCN 1.0 cards most likely don't have the hardware to support it - like the unconfirmed Nvidia story above, I am assuming here. Nvidia created a piece of hardware, added into the monitor, that gets them more money and supports their older cards; it's hard to change an architecture that's already released. Nvidia did well by making it the way they did - it offered a larger selection of usable cards, even if at a higher price.
But now that there is an open standard, things will shift, and the next gen from AMD, I'm sure, will all support FreeSync, broadening the pool of compatible cards. The fact that G-Sync still has a very limited number of very expensive monitors makes it a tough argument that it is in any way winning, especially when by next month FreeSync will have just as many options at a lower price. You just can't ever admit that AMD possibly did something well and that Nvidia is going to be fighting a very steep uphill battle to compete on this currently niche technology.
Also, let's add the fact that the article you cited works against you.
"The question now is: why is this happening and does it have anything to do with G-Sync or FreeSync?...It could be panel technology, it could be VRR technology or it could be settings in the monitor itself. We will be diving more into the issue as we spend more time with different FreeSync models.
For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."
Even the article you posted and used for your argument says it is not FreeSync but the monitor technology itself. You have a knack for trying to pin anything that goes wrong on AMD when it is the monitor manufacturer that caused this problem.
So you do agree, it is very well possible Nvidia can't actually support FreeSync for the same reasons most of AMD's own GPUs can't support it then? Is AMD just choosing to screw their customers by not supporting their own spec? Maybe they are trying to force you to buy a new AMD card so you can't see how bad FreeSync is at low FPS?
Last bit of the first paragraph: LOL, it's like AMD fans just don't get it. Nvidia, the market leader that holds ~70% of the dGPU market (check any metric: JPR, Steam survey, your next LAN party, online guild, whatever), supports 100% of their GPUs since 2012 with G-Sync, and has at least 8 panels on the market, has the uphill battle - while AMD, the one fighting for relevance, supports FreeSync on only a fraction of their ~30% dGPU share, has half the monitors, and ultimately has the worse solution. But somehow Nvidia has the uphill battle in this niche market? AMD fans willing to actually buy expensive niche equipment is as niche as it gets, lol.
And last part, uh, no. They are asking a rhetorical question they hope to answer, and they do identify G-Sync and FreeSync as possible culprits with a follow-on catch-all of VRR, but as I said, the test is simple enough on these panels: run Vsync ON/OFF and FreeSync and see under what conditions the panel ghosts. I would be shocked if they ghost with standard fixed refresh, so if the problem only shows up with FreeSync enabled, it is clearly a problem inherent to FreeSync.
Also, if you read the comments by PCPer's co-editor Allyn, you will see he also believes it is a problem with FreeSync.
Well, if that is true - not saying it is - then that would be an issue with the VRR technology built into the monitor's scaler, not a FreeSync issue. FreeSync uses an open-standard technology, so it is not FreeSync itself but the tech in the monitor and how the manufacturer's scaler handles the refresh rates. Also, your argument that it falls apart at low refresh rates is again not really an AMD issue; the FreeSync spec can go down as low as 9Hz, but since FreeSync relies on the manufacturer to make a monitor that can go that low, there is a limitation there. Obviously an open standard will have more issues than a more closed, proprietary system - i.e. Mac vs. Windows. FreeSync has to support different hardware and scalers across the board, not the exact same proprietary thing every time.
The fact that the number of monitors available for G-Sync in 18 months has only reached 8 models should tell you something there. Obviously manufacturers don't want to build monitors that cost them more to make and have to sell at a higher price for a feature that isn't selling much at all yet. But hey, the fact that there's a new standard being built into new monitors, that can be supported by several large brand names, that costs them almost nothing extra to build, and that still works with non-DisplayPort devices? Well, that opens a much larger customer base for those monitors. Smart business there.
I'm sure as time goes on monitor manufacturers will build higher-quality parts that solve the ghosting issue, but the fact remains that AMD was able to accomplish essentially the same thing as Nvidia using an open standard that is working its way into monitors. Also, those monitors are still able to target a larger audience. Proprietary never wins out in cases like this.
G-Sync is not enough of a benefit over FreeSync to justify the cost, and in time FreeSync will be more widely adopted as the companies who make monitors stop wanting to pay the extra to Nvidia. Although BenQ has a hybrid monitor for G-Sync, I could see a hybrid G-Sync/FreeSync monitor in the future (graphics agnostic, sort of).
The fact the number of G-Sync panels is only 8 reinforces what Nvidia has said all along, doing it right is hard, and instead of just hoping Mfgs. pick the right equipment (while throwing them under the bus like AMD is doing), Nvidia actually fine tunes their G-Sync module to every single panel that gets G-Sync certified. This takes engineering time and resources.
And how are you sure things will get better? It took AMD 15 months to produce what we see today and it is clearly riddled with problems that do not plague G-Sync. What if the sales of these panels are a flop and there is a high RMA rate attached to them due to the ghosting issues we have seen? Do you think AMD's typical "hey its an open standard, THEY messed it up" motto is doing any favors as they are now already blaming their Mfg. partners for this problem? Honestly, take a step back and watch history repeat itself. AMD once again releases a half-baked, half-supported solution, and then when problems arise, they don't take accountability for it.
Also, it sounds like you are acknowledging these problems do exist, so would you, in good conscience recommend that someone buy these, or wait for FreeSync panels 2.0 if they are indeed problems tied to hardware? Just wondering?
And how is G-Sync not worth the premium, given it does exactly as it said it would from Day 1, and has for some 18 months now without exhibiting these very same issues? Do you really think an extra $150 on a $600+ part is really going to make the difference if one solution provides what you want TODAY vs. the other that is a completely unknown commodity?
It has one issue that you have pointed out - ghosting - so it's hardly riddled with problems. Also, AMD had to wait for the standard to get into DisplayPort 1.2a; they had it working in 2013, but until VESA got it into DisplayPort they couldn't ship monitors that supported it, hence the 15 months you say it took to 'develop'.
So far all the reviews I have seen of FreeSync have been great, so yes. The only one that even mentions ghosting is the one you posted, and I'm sure it's only noticeable with high-speed cameras. Obviously the ghosting is not a big enough issue that it even warranted a mention elsewhere, since under normal use it isn't noticeable, and you still get smooth playback on the monitor, same as G-Sync.
Last, yes, $150 is a 25% increase on $600, so that is a significant increase in price - very relevant, because some people have a thing called a budget.
No, it's not the only issue; ghosting is just the most obvious and most problematic, but you also get flickering at low FPS, because FreeSync does not have the onboard frame buffer to simply double frames as needed and prevent the kind of pixel decay and flickering you see with FreeSync. It's really funny, because these are all things (the G-Sync module as a smart scaler and frame/lookaside buffer) that AMD quizzically wondered why Nvidia needed extra hardware for. Market leader, market follower - just another great example.
Another issue, apparently, is a perceivable difference in smoothness as the AMD driver kicks in and out of Vsync On/Off mode at the edges of its supported frame bands. This again is mentioned in these previews but not covered in depth. My guess is that the press had limited access to the machines for these previews and they were done in controlled test environments, with their own shipped units only arriving around the time the embargo was lifted.
But yes, you and anyone else should certainly wait for follow-up reviews, because the ghosting and flickering were certainly visible without any need for high-speed cameras.
A 25% increase is nothing if you want a working solution today, but yes, a $600 investment in what AMD has shown would be a dealbreaker and a waste of money imo, so if you are in the AMD camp, or also need to spend another $300 on an AMD card for FreeSync, you will certainly want to wait a bit longer.
I would like an actual look at the added input latency from these adaptive sync implementations. Nobody has even mentioned it, but there's a very real possibility that either the graphics card's output or the monitor's scaler has to do enough thinking to cause a significant delay from when pixels come in to when they're displayed on the screen. Why isn't the first thing to be scrutinized the very problem these technologies seek to solve?
My monitors last longer than 5 years. Basically I keep them till they die. I have a 19" 1280x1024 on the shared home computer I'm considering replacing. I'd be leaning towards neither or Freesync monitors.
I currently am sporting AMD GPUs, but I am one of those who go back and forth between vendors, and I don't think we're as small a minority as was assumed. I bought two R9 290's from AMD last February. If I were buying right now, I'd be getting a GTX 970. I do like the GeForce Experience software. I'm still considering a GTX 750 Ti.
I'm not totally sold on what AMD has in the market at the moment. I have a lot of heat concerns running them in Crossfire, and the wattage is higher than I like. The original 290 blowers sucked. I'd like quality blower cards again, like Nvidia's.
It's not "better" but it is roughly equivalent. I've got benchmarks from over 20 games. Average for 290 X at 2560x1440 "Ultra" across those games is 57.4 FPS while the average for 970 is 56.8 FPS. Your link to Crysis: Warhead is one title where AMD wins, but I could counter with GRID 2/Autosport and Lord of the Fallen where NVIDIA wins. And of the two GPUs, 970 will overclock more than 290X if you want to do that.
I'm an NVIDIA user, but I'm happy to see the proprietary G-SYNC get beat down. I've got a 1080p 144Hz non-G-SYNC panel, so I won't be upgrading for 3-5 years, and hopefully 4K and FreeSync will both be standard by then.
It's background information that's highly pertinent, and if "the first page" means "the first 4 paragraphs" then you're right... but the last two talk mostly about FreeSync.
I love how the pricing page doesn't do anything to address a big problem with both FreeSync and G-Sync - the assumption that people want to replace the monitors they already have, or have money to throw away doing so.
I bought an $800 BenQ BL3200PT 32" 1440 A-MVA panel and I am NOT going to just buy another monitor in order to get the latest thing graphics card companies have dreamt up.
Companies need to step up and offer consumers the ability to send in their panels for modification. Why haven't you even thought of that and mentioned it in your article? You guys, like the rest of the tech industry, just blithely support planned obsolescence at a ridiculous speed -- like with the way Intel never even bothered to update the firmware on the G1 ssd to give it TRIM support.
People have spent even more money than I did on high-quality monitors -- and very recently. It's frankly a disservice to the tech community to neglect to place even the slightest pressure on these companies to do more than tell people to buy new monitors to get basic features like this.
You guys need to start advocating for the consumer not just the tech companies that send you stuff.
While we're at it: Why don't companies allow you to send in your old car to have it upgraded with a faster engine? Why can't I send in my five year old HDTV to have it turned into a Smart TV? I have some appliances that are getting old as well; I need Kenmore to let me bring in my refrigerator to have it upgraded as well, at a fraction of the cost of a new one!
But seriously, modifying monitors is hardly a trivial affair and the only computer tech that allows upgrades without replacing the old part is... well, nothing. You want a faster CPU? Sure, you can upgrade, but the old one is now "useless". I guess you can add more RAM if you have empty slots, or more storage, or an add-in board for USB 3.1 or similar...on a desktop at least. The fact is you buy technology for what it currently offers, not for what it might offer in the future.
If you have a display you're happy with, don't worry about it -- wait a few years and then upgrade when it's time to do so.
Apple offered a free upgrade for Lisa 1 buyers to the Lisa 2 that included replacement internal floppy drives and a new fascia. Those sorts of facts, though, are likely to escape your attention because it's easier to just stick with the typical mindset the manufacturers, and tech sites, love to endorse blithely --- fill the landfills as quickly as possible with unnecessary "upgrade" purchases.
Macs also used to be able to have their motherboards replaced to upgrade them to a more current unit. "The only computer tech that allows upgrades without replacing the old part is... well, nothing." And whose mindset is responsible for that trend? Hmm? Once upon a time people could actually upgrade their equipment for a fee.
The silence about my example of the G1 ssd's firmware is also deafening. I'm sure it would have taken tremendous resources on Intel's part to offer a firmware patch!
The G1 question is this: *could* Intel have fixed it via a firmware update? Maybe, or maybe Intel looked into it and found that the controller in the G1 simply wasn't designed to support TRIM, as TRIM didn't exist when the G1 was created. But "you're sure" it was just a bit of effort away, and since you were working at Intel's Client SSD department...oh, wait, you weren't. Given they doubled the DRAM from 16MB to 32MB and switched controller part numbers, it's probable that G1 *couldn't* be properly upgraded to support TRIM: http://www.anandtech.com/show/2808/2
So if that's the case, it's sounds a lot like Adaptive Sync -- the standard didn't exist when many current displays were created, and it can't simply be "patched in". Now you want existing displays that are already assembled to be pulled apart and upgraded. That would likely cost more money than just selling the displays at a discount, as they weren't designed to be easily disassembled and upgraded.
The reality of tech is that creating a product that can be upgraded costs time and resources, and when people try upgrading and mess up it ends up bringing in even more support calls and problems. So on everything smaller than a desktop, we've pretty much lost the ability to upgrade components -- at least in a reasonable and easy fashion.
Is it possible to make an upgradeable display? I suppose so, but what standards do you support to ensure future upgrades? Oh, you can't foresee the future so you have to make a modular display. Then you might have the option to swap out the scaler hardware, LCD panel, maybe even the power brick! And you also have a clunkier looking device because users need to be able to get inside to perform the upgrades.
"Now you want existing displays that are already assembled to be pulled apart and upgraded. That would likely cost more money than just selling the displays at a discount, as they weren't designed to be easily disassembled and upgraded."
If that's the case... I wonder why that is? Could it be the blithe acceptance of ridiculous cases of planned obsolescence like this?
Manufacturers piddle out increments of tech constantly to try to keep a carrot on a stick in front of consumers. Just like with games and their DLC nonsense, the new mindset is replace, replace, replace... design the product so it can't be upgraded. Fill up the landfills.
Sorry, but my $800 panel isn't going to just wear out or be obsolete in short order. People who spent even more are likely to say the same thing. And, again, many of these products are still available for purchase right now. The industry is doing consumers a disservice enough by not having standards (incompatible competing G-Sync and FreeSync) but it's far worse to tell people they need to replace otherwise perfectly satisfactory equipment for a minor feature improvement.
You say it's not feasible to make monitors that can be upgraded in a relatively minor way like this. I say it is. It's not like we're talking about installing DisplayPort into a panel that didn't have it, or something along those lines. It's time for the monitor industry to stop spewing out tiny incremental changes and expecting wholesale replacement.
This sort of product and the mindset that accompanies it is optional, not mandatory. Once upon a time things were designed to be upgradable. I suppose the next thing you'll fully endorse are motherboards with the CPUs, RAM, and everything else soldered on (which Apple likes to do) to replace DIY computing... Why not? Think of how much less trouble it will be for everyone.
"it's probable that G1 *couldn't* be properly upgraded to support TRIM" "since you were working at Intel's Client SSD department...oh, wait, you weren't." So, I assume I should use the same retort on you with your "probable", eh?
The other thing you're missing is that Intel never told consumers that TRIM could not be added with a firmware patch. It never provided anyone with a concrete justification. It just did what is typical for these companies and for publications like yours: told people to buy the latest shiny to "upgrade".
The troll is strong in this one. You keep repeating how this is technically worse than G-SYNC and have absolutely nothing to back it up. You claim forced V-SYNC is an issue with FreeSync, but it's the other way around - you can't turn V-SYNC off with G-SYNC but you can with FreeSync. You don't address the fact that G-SYNC monitors need the proprietary scaler that doesn't have all the features of FreeSync capable scalers (eg more input ports, OSD functionality). You accuse everyone who refutes your argument with AMD fanboy sentimentality, when you yourself are the obvious NVIDIA fanboy. No doubt you'll accuse me of being an AMD fanboy too. How wrong you are.
Technically the G-SYNC scaler supports an OSD... the options are just more limited as there aren't multiple inputs to support, and I believe NVIDIA doesn't bother with supporting multiple *inaccurate* color modes -- just sRGB and hopefully close to the correct values.
Actually you're wrong again, Vsync is always off, there is a frame cap turned on via driver but that is not Vsync as the GPU is still controlling frame rate.
Meanwhile, FreeSync is still clearly tied to Vsync, which is somewhat surprising in its own right since AMD has historically had issues with driver-level Vsync.
I've never once glossed over the fact G-Sync requires proprietary module, because I've clearly stated the price and tech is justified if it is a better solution and as we saw yesterday, it clearly is.
I've also acknowledged that multiple inputs and an OSD are amenities that are a bonus, but certainly not more important than these panels excelling at what they are purchased for. I have 2x U2410 companion panels with TONS of inputs for anything I need beyond gaming.
I have to give it to AMD here - I was skeptical this could be accomplished without dedicated hardware to buffer the video frames on the display, but they've done it. I still wouldn't buy one of their power-hungry video cards, but it's good for AMD fans. This is good news for G-Sync owners too, as it should drive down the artificially inflated price (partly due to lack of competition, partly due to the early adoption premium).
After fiddling around with triple buffering and triple buffering overrides for years (granted, less of a problem on DX10/11, as many modern engines seem to have some form of "free" triple buffering), it's good to get to perfect refresh rates. As a big emulation fan, with many arcade games using refresh rates anywhere from 50 to 65 Hz, these displays are also great.
Was input lag tested? AMD don't claim to have Vsync-off-like input lag reduction. This would be superb in a laptop, where displaying every last frame is important (Optimus provides a sort of "free" triple buffering of its own, but it's not the smoothest and often requires you to set a 60 FPS frame cap).
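On the emulation point, a tiny sketch of why VRR helps there. The game names and rates below are made-up illustrative values (many arcade boards really do sit at odd rates between 50 and 65 Hz), and the 48-75 Hz window is an assumed panel spec, not any specific monitor:

    def fits_vrr(rate_hz, vrr_min=48, vrr_max=75):
        # True if the display can simply run at the content's native rate.
        return vrr_min <= rate_hz <= vrr_max

    for name, rate in {"pal_title": 50.0, "odd_arcade_board": 54.7, "fast_jamma_board": 61.7}.items():
        print(name, rate, "native sync" if fits_vrr(rate) else "needs 60 Hz resampling / judder")

On a fixed 60 Hz panel every one of those rates needs frame dropping or duplication; inside a VRR window the display just follows the emulated machine.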
same here, I can just wait a year or so before upgrading monitor and gpu, their loss. If in the meanwhile AMD comes up with something competitive (i.e. also not an oven please), they win.
Hats off to Nvidia for delivering G-Sync and getting the ball rolling on this thing. They were the first to create a solution for a very real problem.
Because of NVidia's pioneering, and because NVidia won't license the technology to AMD, AMD had to find their own solution in re-purposing an existing DP1.2a feature to provide the same function.
It makes sense for NVidia to refuse to support adaptive refresh, until these displays become commonplace. They had the only card and the only display module that could do this, and they needed to sell as many as they could before the competition's technology was viable.
Soon NVidia needs to reverse that decision, because I'm not going to buy an inferior monitor, just so that I can slap "The Way It's Meant to Be Played" on the side of my computer.
I fully expect that both will come together on this one. NVidia had a good run with G-Sync. But now it needs to jump on the bandwagon or risk losing out on GPU sales.
Unfortunately, I doubt it. While they are great first movers, look at their track record of good tech that could be great tech with industry-wide adoption via less proprietary measures: PhysX, CUDA, 3D Surround, Gsync, etc. They also have a poor history of working with more open platforms like Linux. "Our way or the highway" is the vibe I get.
What about actually testing the fallback cases, where framerate is outside the monitor's range of refresh rates? We need an input lag comparison when both monitors are maxed out in v-sync mode, and a gpu utilization comparison when framerates dip below the monitor's minimum refresh rate.
"One final topic to address is something that has become more noticeable to me over the past few months. While G-SYNC/FreeSync can make a big difference when frame rates are in the 40~75 FPS range, as you go beyond that point the benefits are a lot less clear. Take the 144Hz ASUS ROG Swift as an example. Even with G-SYNC disabled, the 144Hz refresh rate makes tearing rather difficult to spot, at least in my experience. "
Sure. It draws the frames 3-4 times between updates, so even if half of the frame showed tearing on the first pass it gets cleaned up on the second and third passes. And with VSYNC enabled, you can fall back to 72Hz and 48Hz before you are at ~30 Hz.
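For anyone who wants the arithmetic behind that fallback claim, here is a rough sketch. It assumes an idealized double-buffered Vsync pipeline with constant frame times, so the presented rate simply snaps to an integer divisor of the panel's 144 Hz refresh; real games with varying frame times will bounce between divisors.

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double refresh = 144.0;  // fixed panel refresh with Vsync on
    for (double fps : {120.0, 90.0, 60.0, 40.0, 30.0}) {
        // Each rendered frame is held for a whole number of refresh
        // intervals, so the presented rate snaps to 144/n.
        double presented = refresh / std::ceil(refresh / fps);
        std::printf("render %.0f fps -> presented at %.1f Hz\n", fps, presented);
    }
    return 0;
}
```

That is where the 72 Hz and 48 Hz steps come from, with ~36 Hz and ~28.8 Hz below them.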
Really different reviews between AnandTech and PC Perspective. You conclude that FreeSync performs as well as G-Sync - if not better, because of the option to disable V-sync. PC Perspective, on the other hand, noticed that their FreeSync monitors performed badly compared to the G-Sync monitors when the frame rate dropped below the lowest refresh rate of the monitor.
You give the impression that they would behave the same - or FreeSync would be potentially better because you could choose your poison: stutter or tearing - when with G-Sync you would always get stuttering. PC Perspective, on the other hand, says that G-Sync monitors handle this gracefully by refreshing the display twice or more during those longer frames - and as such G-Sync avoids both stutter and tearing at those lower frame rates. Their FreeSync monitors did not do that - and the stuttering or tearing was very noticeable. The frame rate dropping below 48 fps is not uncommon, and the display's behavior in those situations is very important. That makes G-Sync the superior technology. Unless the tearing or stuttering at speeds lower than the display's lowest refresh rate is only a problem with that specific monitor and not with the FreeSync / Adaptive Sync technology in general. (The LG monitor is incapable of doubling its slowest refresh rate - other monitors that are capable could maybe handle the situation differently. If not, FreeSync is the inferior technology.)
I don't know how G-Sync and FreeSync actually would handle full screen movies at 24 fps. G-Sync could easily display it at a 48 Hz refresh rate. Your LG monitor would probably also show it at 48 Hz - because it is the lowest it could go. But would the LG monitor with FreeSync be smart enough to show a 25 fps movie in 50 Hz - or would it display it in 48 Hz with unnecessary tearing or stuttering?
"PC Perspective, on the other hand, tells that G-Sync monitors handle this gracefully by refreshing the display twice or more during those longer frames - and as such G-Sync avoids both stutter and tearing at those lower fram rates."
That would drastically reduce the effects of tearing, but it would not do much, if anything, for stutter.
It would reduce stutter in the sense that if the frame rate were, for example, constantly 30 fps, G-Sync would give you every frame when it is ready - keeping the motion fluid. FreeSync with V-Sync on, on the other hand, would force that into the lowest refresh rate of the monitor. It would double some frames and not others - making the timing of the frames different and making a constant 30 fps motion jerky where G-Sync would not. I would call that jerky motion 'stutter' - FreeSync (currently) has it, G-Sync does not.
In short, G-Sync retains its variable refresh rate technology when going under the display's min refresh rate. FreeSync does not, but switches to a constant refresh rate at the monitor's min refresh rate - introducing either tearing or stutter. Within the display's refresh rate range they perform the same. When going faster than the refresh rate range, FreeSync gives the option of disabling V-Sync and choosing tearing instead of stuttering. There it is better. I just think that the low fps range is probably more important than the high. I would not buy any FreeSync / Adaptive Sync displays before they demonstrate that they can handle those low fps situations as gracefully as G-Sync does.
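To make the frame-doubling idea concrete, here is a minimal sketch of the kind of driver-side logic being described. It is not AMD's or NVIDIA's actual algorithm; the panel ranges and frame rates are only illustrative. The point is that a repeat count exists only when the panel's VRR window is wide enough (roughly max >= 2x min).

```cpp
#include <cstdio>
#include <initializer_list>

struct VrrWindow { double minHz, maxHz; };  // panel's variable refresh range

// Pick how many times to scan out the current frame so the effective
// refresh rate stays inside the panel's VRR window. Returns false when no
// repeat count fits, i.e. the driver would have to fall back to a fixed
// refresh rate (tearing or classic Vsync stutter).
bool planScanout(double fps, VrrWindow p, int* repeats, double* refreshHz) {
    for (int k = 1; fps * k <= p.maxHz; ++k) {
        if (fps * k >= p.minHz) { *repeats = k; *refreshHz = fps * k; return true; }
    }
    return false;
}

int main() {
    VrrWindow narrow{48.0, 75.0};   // e.g. a 48-75 Hz panel
    VrrWindow wide{30.0, 144.0};    // e.g. a 30-144 Hz panel
    for (double fps : {60.0, 45.0, 30.0, 24.0}) {
        int n; double hz;
        bool okNarrow = planScanout(fps, narrow, &n, &hz);
        std::printf("48-75 Hz panel, %2.0f fps: %s\n", fps,
                    okNarrow ? "fits the VRR window" : "falls out of the VRR window");
        if (planScanout(fps, wide, &n, &hz))
            std::printf("30-144 Hz panel, %2.0f fps: draw %dx at %.0f Hz\n", fps, n, hz);
    }
    return 0;
}
```

A 48-75 Hz window cannot double 45 fps without overshooting 75 Hz, which lines up with the note above about the LG panel being unable to double its slowest refresh rate; a 30-144 Hz window has no such gap, and 24 fps maps neatly onto 48 Hz.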
And as Ryan said, it is a beast, but one question: if you buy an XB270HU and plug in your 290X, does the Acer fall back to a standard scaler to display video data, since the video card doesn't support G-SYNC? And if the Acer uses a scaler from one of the four main manufacturers listed in the article, is there a chance it would support FreeSync? (Acer obviously wouldn't advertise that, since the monitor is a G-SYNC branded monitor....)
So there are a few assumptions above about the operation of G-SYNC, but I'm curious if this will be the case, as it would keep the red and green camps happy...
One other question, if anyone is happy to answer: will high-refresh-rate monitors maintain their peak refresh rate in portrait mode, or are they limited to a lower refresh rate - and does G-SYNC still work in that orientation, for that matter? I'm thinking of a triple monitor portrait setup for my next build.
Adaptive Sync is a DisplayPort-specific standard. What current gen console supports DisplayPort? None to my knowledge. HDMI is a different standard, and I don't think there have even been any rumors about putting adaptive sync technology into the HDMI standard. And even if it someday did come, would the current HDMI hardware on the consoles be able to support it after a driver update from AMD? Probably not.
It's not likely to happen any time soon, since video, STBs, etc. revolve around the usual frame rates and TVs do the same, so there's no need for this kind of flexibility; tearing is not an issue.
Too bad that TV standards like HDMI spill over in the computer world (audio, projectors, laptops, etc.) and hamstring progress.
On the specs page for the BenQ XL2730Z (http://gaming.benq.com/gaming-monitor/xl2730z/spec...) it states a 54Hz min vertical refresh. This could be a copy/paste issue since it's the same as the min horizontal refresh.
I was already shopping for a 21:9 monitor for my home office. I'm now planning to order a 29UM67 as soon as I see one in stock. The GPU in that machine is an R7/260X, which is on the compatible list. :-)
I have had my reservations about claims made by AMD these days, and my opinion of 'FreeSync' was no exception. If this actually works at least as well as G-Sync (as claimed by this rather brief review) with various hardware/software setups, then it is indeed a praiseworthy development. I personally would certainly be glad that the rivalry of two tech giants resulted (even if only inadvertently) in something that benefits the consumer.
I love the arguments about "freesync is an open standard" when it doesn't matter. 80% of the market is Nvidia and won't be using it. Intel is a non-issue because not many people are playing games that benefit from adaptive v-sync. Think about it, either way you're stuck. If you buy a GSync monitor now you likely will upgrade your GPU before the monitor goes out. So your options are only Nvidia. If you buy a freesync monitor your options are only AMD. So everyone arguing against gsync because you're stuck with Nvidia, have fun being stuck with AMD the other way around.
Best to not even worry about either of these unless you absolutely do not see yourself changing GPU manufacturers for the life of the display.
NVidia is 71% of the AIB market, as of the latest released numbers from Hexus. That doesn't include AMD's APUs, which also support Freesync and are often used by "midrange" gamers.
The relevance of being an open standard though, is that monitor manufacturers can add it with almost zero extra cost. If it's built into nearly every monitor in a couple of years, then NVidia might have a reason to start supporting it.
@Jarred Walton You disappoint me. What you said about G-Sync below the minimum refresh rate is not correct, and there also seem to be issues with ghosting on FreeSync. I encourage everyone to go to PCper(dot)com and read a much more in-depth article on the subject. Get rekt anandtech.
If you're running a game and falling below the minimum refresh rate, you're using settings that are too demanding for your GPU. I've spent quite a few hours playing games on the LG 34UM67 today just to see if I could see/feel issues below 48 FPS. I can't say that I did, though I also wasn't running settings that dropped below 30 FPS. Maybe I'm just getting too old, but if the only way to quantify the difference is with expensive equipment, perhaps we're focusing too much on the theoretical rather than the practical.
Now, there will undoubtedly be some that say they really see/feel the difference, and maybe they do. There will be plenty of others where it doesn't matter one way or the other. But if you've got an R9 290X and you're looking at the LG 34UM67, I see no reason not to go that route. Of course you need to be okay with a lower resolution and a more limited range for VRR, and be willing to go with a slower response time IPS (AHVA) panel rather than dealing with TN problems. Many people are.
What's crazy to me is all the armchair experts reading our review and the PCPer review and somehow coming out with one or the other of us being "wrong". I had limited time with the FreeSync display, but even so there was nothing I encountered that caused me any serious concern. Are there cases where FreeSync doesn't work right? Yes. The same applies to G-SYNC. (For instance, at 31 FPS on a G-SYNC display, you won't get frame doubling but you will see some flicker in my experience. So that 30-40 FPS range is a problem for G-SYNC as well as FreeSync.)
I guess it's all a matter of perspective. Is FreeSync identical to G-SYNC? No, and we shouldn't expect it to be. The question is how much the differences matter. Remember the anisotropic filtering wars of last decade where AMD and NVIDIA were making different optimizations? Some were perhaps provably better, but in practice most gamers didn't really care. It was all just used as flame bait and marketing fluff.
I would agree that right now you can make the case the G-SYNC is provably better than FreeSync in some situations, but then both are provably better than static refresh rates. It's the edge cases where NVIDIA wins (specifically, when frame rates fall below the minimum VRR rate), but when that happens you're already "doing it wrong". Seriously, if I play a game and it starts to stutter, I drop the quality settings a notch. I would wager most gamers do the same. When we're running benchmarks and comparing performance, it's all well and good to say GPU 1 is better than GPU 2, but in practice people use settings that provide a good experience.
Example: Assassin's Creed: Unity runs somewhat poorly on AMD GPUs. Running at Ultra settings or even Very High in my experience is asking for problems, no matter if you have a FreeSync display or not. Stick with High and you'll be a lot happier, and in the middle of a gaming session I doubt anyone will really care about the slight drop in visual fidelity. With an R9 290X running at 2560x1080 High, ACU typically runs at 50-75FPS on the LG 34UM67; with a GTX 970, it would run faster and be "better". But unless you have both GPUs and for some reason you like swapping between them, it's all academic: you'll find settings that work and play the game, or you'll switch to a different game.
Bottom Line: AMD users can either go with FreeSync or not; they have no other choice. NVIDIA users likewise can go with G-SYNC or not. Both provide a smoother gaming experience than 60Hz displays, absolutely... but with a 120/144Hz panel only the high speed cameras and eagle eyed youth will really notice the difference. :-)
Haha love it, still feisty I see even in your "old age" there Jarred. I think all the armchair experts want is for you and AT to use your forum on the internet to actually do the kind of testing and comparisons that matter for the products being discussed, not just provide another Engadget-like experience of superficial touch-feely review, dismissing anything actually relevant to this tech and market as not being discernable to someone "your age".
It's easy to point out flaws in testing; it's a lot harder to get the hardware necessary to properly test things like input latency. AnandTech doesn't have a central location, so I basically test with what I have. Things I don't have include gadgets to measure refresh rate in a reliable fashion, high speed cameras, etc. Another thing that was lacking: time. I received the display on March 17, in the afternoon; sometimes you just do what you can in the time you're given.
You however are making blanket statements that are pro-NVIDIA/anti-AMD, just as you always do. The only person that takes your comments seriously is you, and perhaps other NVIDIA zealots. Mind you, I prefer my NVIDIA GPUs to my AMD GPUs for a variety of reasons, but I appreciate competition and in this case no one is going to convince me that the closed ecosystem of G-SYNC is the best way to do things long-term. Short-term it was the way to be first, but now there's an open DisplayPort standard (albeit an optional one) and NVIDIA really should do everyone a favor and show that they can support both.
If NVIDIA feels G-SYNC is ultimately the best way to do things, fine -- support both and let the hardware enthusiasts decide which they actually want to use. With only seven G-SYNC displays there's not a lot of choice right now, and if most future DP1.2a and above displays use scalers that support Adaptive Sync it would be stupid not to at least have an alternate mode.
But if the only real problem with FreeSync is when you fall below the minimum refresh rate you get judder/tearing, that's not a show stopper. As I said above, if that happens to me I'm already changing my settings. (I do the same with G-SYNC incidentally: my goal is 45+ FPS, as below 40 doesn't really feel smooth to me. YMMV.)
You can test absolute input latency to sub millisecond precision with ~50 bucks worth of hobby electronics, free software, and some time to play with it. For example, an arduino micro, a photoresistor, a second resistor to make a divider, a breadboard, and a usb cable. Set the arduino up to emulate a mouse, and record the difference in timing between a mouse input and the corresponding change in light intensity. Let it log a couple minutes of press/release cycles, subtract 1ms of variance for USB polling, and there you go, full chain latency. If you have access to a CRT, you can get a precise baseline as well.
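As a concrete illustration of that rig, here is a minimal Arduino-style sketch. It assumes an Arduino Micro (or any board with native USB HID), a photoresistor voltage divider on A0 taped over a screen region that brightens in response to a click (a muzzle flash, or a white square in a test app), and a brightness threshold you tune by hand; the pin and threshold values are placeholders, not a tested configuration.

```cpp
#include <Mouse.h>

const int SENSOR_PIN = A0;   // photoresistor + fixed resistor divider
const int THRESHOLD  = 600;  // tune for your sensor, monitor, and scene

void setup() {
  Serial.begin(115200);
  Mouse.begin();             // Micro/Leonardo enumerate as a USB mouse
}

void loop() {
  unsigned long t0 = micros();
  Mouse.press(MOUSE_LEFT);   // inject the input event
  // Spin until the on-screen response is bright enough, or give up after 2 s.
  while (analogRead(SENSOR_PIN) < THRESHOLD && micros() - t0 < 2000000UL) {}
  unsigned long dt = micros() - t0;
  Mouse.release(MOUSE_LEFT);
  Serial.println(dt);        // click-to-photon latency in microseconds
  delay(1000);               // let the scene settle before the next sample
}
```

Log a few minutes of samples over serial and, as the comment above notes, subtract roughly 1 ms of USB polling variance to get a full-chain latency distribution.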
As for sub-VRR behavior, if you leave v-sync on, does the framerate drop directly to 20fps, or is AMD using triple buffering?
You seem to be taking my comments pretty seriously Jarred, as you should, given they raise a lot of questions about your credibility and capabilities in writing a competent "review" of the technology being discussed. But it's np, no one needs to take me seriously, this isn't my job, unlike yours, even if it is part time. The downside is, reviews like this make it harder for anyone to take you or the content on this site seriously, because as you can see, there are a number of other individuals that have taken issue with your Engadget-like review. I am sure there are a number of people that will take this review as gospel, go out and buy FreeSync panels, discover ghosting issues not covered in this "review" and ultimately lose trust in what this site represents. Not that you seem to care.
As for being limited in equipment, that's just another poor excuse and asterisk you've added to the footnotes here. It takes a max $300 camera, far less than a single performance graphics card, and maybe $50 in LEDs, diodes and USB input doublers (hell, you can even make your own if you know how to splice wires) from Digikey or RadioShack to test this. Surely Ryan and your new parent company could foot this bill for a new test methodology if there was actually interest in conducting a serious review of the technology. Numerous sites have already given the methodology for input lag and ghosting with a FAR smaller budget than AnandTech; all you would have to do is mimic their test set-up with a short acknowledgment, which I am sure they would appreciate from the mighty AnandTech.
But it's OK; like the FCAT issue, it's obvious AT had no intention of actually covering the problems with FreeSync. I guess if it takes a couple of Nvidia "zealots" to get to the bottom of it and draw attention to AMD's problems to ultimately force them to improve their products, so be it. It's obvious the actual AMD fans and spoon-fed press aren't willing to tackle them.
As for blanket statements, lol, that's a good one. I guess we should just take your unsubstantiated points of view, which are, unsurprisingly, perfectly aligned with AMD's, at face value without any amount of critical thinking and skepticism?
It's frankly embarrassing to read some of the points you've made from someone who actually works in this industry, for example:
1) One shred of confirmation that G-Sync carries royalties. Your "semantics" mean nothing here.
2) One shred of confirmation that existing, pre-2015 panels can be made compatible with a firmware upgrade.
3) One shred of confirmation that G-Sync somehow faces the uphill battle compared to FreeSync, given known market indicators and factual limitations on FreeSync graphics card support.
All points you have made in an effort to show FreeSync in a better light, while downplaying G-Sync.
As for the last bit, again, if you have to sacrifice your gaming quality in an attempt to meet FreeSync's minimum standard refresh rate, the solution has already failed, given one of the major benefits of VRR is the ability to crank up settings without having to resort to Vsync On and the input lag associated with it. In your example, if you have to drop settings from Very High to High just so that your FPS doesn't drop below 45FPS for 10% of the time, you've already had to sacrifice your image quality for the other 90% it stays above that. That is a failure of a solution if the alternative is to just repeat frames for that 10% as needed. But hey, to each their own; this kind of testing and information would be SUPER informative in an actual comprehensive review.
As for your own viewpoints on competition, who cares!?!?!? You're going to color your review and outlook in an attempt to paint FreeSync in a more favorable light, simply because it aligns with your own viewpoints on competition? Thank you for confirming your reasoning for posting such a biased and superficial review. You think this is going to matter to someone who is trying to make an informed decision, TODAY, on which technology to choose? Again, if you want to get into the socioeconomic benefits of competition and why we need AMD to survive, post this as an editorial, but to put "Review" in the title is a disservice to your readers and the history of this website, hell, even your own previous work.
Thanks Jarred. I really appreciate your work on this. However, I do disagree to some extent on the low-end FPS issue. The biggest potential benefit to Adaptive Refresh is smoothing out the tearing and judder that happens when the frame rate is inconsistent and drops. I also would not play at settings where my average frame rate fell below 60 fps. However, my settings will take into account the average FPS, where most scenes may be glassy-smooth, while in a specific area the frame rate may drop substantially. That's where I really need adaptive sync to shine. And from most reports, that's where G-Sync does shine. I expect low-end flicker could be solved with a doubling of frames, and I understand you cannot completely solve judder if the frame rate is too low.
I own a G-Sync ASUS ROG PG278Q display and while it's fantastic, I'd prefer NVIDIA just give up on G-Sync and go with the flow and adopt ASync/FreeSync. It's clearly working as well (which was my biggest hesitation) so there's no reason to continue forcing users to pay a premium on proprietary technology that more and more display manufacturers will not support. If LG or Samsung push out a 34" widescreen display that is AHVA/IPS with low response time and 144 Hz support, I'll probably sell my ROG Swift and switch, even if it is a FreeSync display. Like Jarred said in his article, you don't notice tearing with a 144 Hz display so G-Sync/FreeSync make little to no impact.
And what if going to Adaptive Sync results in a worse experience? Personally I have no problems if Nvidia uses an inferior Adaptive Sync based solution, but I would still certainly want them to continue developing and investing in G-Sync, as I know for a fact I would not be happy with what FreeSync has shown today.
Inferior / worse experience? Meanwhile, the reviews from AnandTech, Guru3D, HotHardware, Overclock3D, Hexus, TechSpot, HardwareHeaven - and the list could go on forever - clearly state that the experiences with FreeSync and G-Sync are equal / comparable, but FreeSync costs less as an added bonus. Before you accuse me of being an AMD fanboy, I own an Intel Haswell CPU and a Zotac GTX 750 (should I take pics as proof?). Based on the reviews from numerous tech sites, my conclusion is: either G-Sync will end up just like Betamax, or Nvidia will be forced to adopt Adaptive Sync.
These tests were done in some kind of limited/closed test environment apparently, so yes, all of these reviews are what I would consider incomplete and superficial. There are a few sites however that delve deeper and notice significant issues. I have already posted links to them in the comments; if you made it this far to comment you would've come upon them already and either chose to ignore them or missed them. Feel free to look them over and come to your own conclusions, but it is obvious to me that the high refresh rate floor and the ghosting make FreeSync worse than G-Sync without a doubt.
Yeah, since PCPer's gospel review was apparently written by Jesus, and the 95% of reviewers around the world who praised FreeSync are heretics, isn't that right?
V-Sync can be turned off or on as you wish if the fps surpasses the monitor's refresh cap, unlike G-Sync. And nobody has reported a ghosting effect SO FAR - are you daydreaming or what? Still waiting for the Tom's Hardware review, even though I already know what their verdict will be.
Who needs god when you have actual screenshots and video of the problems? This is tech we are talking about, not religion, but I am sure to AMD fans they are one and the same.
@chizow anything that doesn't say Nvidia is God is incomplete and superficial to you. You are putting down a lot of hard work put into a lot of reviews for one review that pointed out a minor issue.
Show us more than one and maybe we will look at it, but your paper gospel means nothing when there are a ton more articles that contradict it. Also, what makes you the expert here when all these reviews say it is the same/comparable and you yourself have not seen FreeSync in person? If you think you can do a better job, start a blog and show us. Otherwise stop with your anti-AMD, pro-Nvidia campaign and get off your high horse. In these comments you even attacked Jarred, who works hard in the short time he gets with hardware to give us as much relevant info as he can. You don't show respect to others' work here and you make blanket statements with nothing to support yourself.
I think what would clear up your misgivings with FreeSync would be if a very high quality panel with a very wide frequency range (especially dealing with those lower frequencies) was tested. I shouldn't blame the underlying technology for too much yet, especially considering there are downsides to GSync as well.
There have been numerous reports that the BenQ shares the same high quality AU Optronics panel as the ROG Swift, so again, it is obvious the problem is in the spec/implementation rather than the panel itself. But yes, it is good confirmation that the premium associated with G-Sync is, in fact, worth it if the BenQ shows these problems at $630 while the Swift does not at $780.
AMD claims the spec goes as low as 9Hz, but as we have seen it has a hard enough time working properly at the current minimums of 40 and 48Hz without exhibiting serious issues, so until those problems directly tied to low refresh rates and pixel decay are resolved, it doesn't really matter what AMD claims on a slide deck.
I disagree, the holy grail to get me to upgrade is going to be a 32" 1440p monitor (since 30" 1600p ones are non-existent), and maybe one that's just 60fps.
I'd really love a good 34" 3440x1440 display with a 120Hz refresh rate. Too bad that's more bandwidth than DP provides. And too bad gaming at 3440x1440 generally requires more than any single GPU other than the GTX Titan X. I've got CrossFire 290X and SLI 970 incidentally; when I'm not testing anything, it's the GTX 970 GPUs that end up in my system for daily use.
I'd totally go for a 34" 3440x1440... Wouldn't be any harder to drive than my 3x 24" 1080p IPS displays in Eyefinity. I've resigned myself to CF/SLI, it's not like I play very many games at launch anyway. Anything lower res, smaller, or not as wide would feel like a side grade.
Seiki will be releasing DP 1.3 on a 4K monitor in Q3 2015; however, for whatever reason, it's still going to be 60Hz. Now if other monitors follow suit and the R300 also includes DisplayPort 1.3, we can get out of this bottleneck!
Hmm.. All the listed screens are budget ones. They are all either TN film or low-res IPS screens. I hope it's just time needed for manufacturers to churn out their top models (e.g. 1440p 34" IPS or 27" 1440p 144Hz AHVA) and not some trend.
By your logic all the G-SYNC displays are equally lousy (TN), with the exception of the upcoming Acer XB270HU. It's one of the primary complaints I have with G-SYNC to date: TN is usable, but IPS/AHVA/MVA are all better IMO.
Side note: anyone else notice that IPS panels (esp. on laptops) seem to have issues with image persistence? I swear, I've played with laptops that are only a few months old and if you have a static window like Word open and then switch to a solid blue background or similar you'll see the Word outline remain for several minutes. Maybe it's just less expensive panels, but I don't know....
I have an Intel CPU and an nVIDIA graphics card, but bravo AMD: despite being something of an underdog, they keep pushing the market in the right direction - Mantle, FreeSync, what's next?
I'm convinced Intel will support FreeSync in the near future. I don't see why not: it's free and easy to implement on the graphics circuit, and monitors can benefit from it at almost no cost (as the firmware upgrades prove)...
I think that since AMD has gotten the FreeSync details built into DisplayPort specifications, then all NVidia would have to do to be compatible with a FreeSync monitor is update their DisplayPort to the same revision. G-Sync will have to come down in price, or offer a significant benefit that isn't matched by FreeSync in order to stay viable.
Is the table with the displays correct? Other sites say that at least the 23.6" and the 31.5" versions of the Samsung UE850 will come with a PLS (Samsung's name for IPS) panel, not with a TN panel. It would be nice to have a 4K display >= 30" with FreeSync, so my hopes would be on the 31.5" UE850.
Either the other sites are wrong or you got updated information that those will come with TN panels after all, which would be a shame :(
I think people are assuming they'll be PLS; I'm assuming they'll be TN. The reason is that Samsung specifically notes the PLS status on displays that use it (e.g. SE650), but they say nothing about panel type when it's TN, because everyone knows TN is a negative marketing term. Here's at least one place suggesting TN as well: http://www.businesswire.com/news/home/201501050060...
Heise is a very respectable site, owner of the c't magazine, probably the most reputable German computer magazine still left that usually doesn't just wildly spread incorrect information (ok, nowadays you never know, but they certainly are on the more reliable and trustworthy side than most others).
Someone at Samsung knows, but they haven't publicly stated anything that I can find. Given they're all 4K panels, TN or PLS/IPS/AHVA are certainly possible. I've added "TN?" on the UE850, as it is an unknown. Here's hoping they are PLS and not TN!
It's the usual Korean monitors from eBay. It's a horse that's been beaten to death already, so nobody elaborates on it anymore. They are overclockable, but since they're cheap and made with second-choice panels, you can probably get dead pixels (unless it has a guarantee of no dead pixels, but you pay for that), plus there is no guarantee that it will overclock to where you'd like it to; it's sold as a 60 Hz monitor and anything else is a bonus.
Monoprice offers a 30" IPS 120Hz display that's worth a look -- they "guarantee" that it will overclock to 120Hz at least. I saw it at CES and it looked good. I'm sure there's still a fair amount of pixel lag (most IPS are about 5ms), but it's better than 60Hz.
To add to my last post, I think 1440p FreeSync screens will be good for people with only one higher-end GPU, since they won't have to deal with the micro stuttering that multi-card setups bring - a smooth experience at 40+ fps. Good news.
I was just about ready to praise AMD but then I see "must have and use DisplayPort"... Well, at least it appears AMD got it to work on their crap, and without the massive Hz restrictions there were in earlier reviews. So, heck, AMD might have actually done something right for once? I am cautious - I do expect some huge frikkin problem we can't see right now --- then we will be told to ignore it, then we will be told it's nothing, then there's a workaround fix, then after a couple years it will be fully admitted and AMD will reluctantly "fix it" in "the next release of hardware". Yes, that's what I expect, so I'm holding back the praise for AMD because I've been burned to a crisp before.
I actually went to the trouble to make an account to say sometimes I come here just to read the comments; some of the convos have me rolling on the floor busting my guts laughing. Seriously, this is free entertainment at its best! Aside from that, the cost of this Nvidia e-penis would feed 10 starving children for a month. I mean seriously, at what point is it overkill? By that I mean, is there any game out there that would absolutely not run well enough on a slightly lesser card at half the price? When I read this card alone requires 250W, my eyes popped out of my head - holy electric bill batman - but I guess if somebody has a grand to throw away on an e-penis, they don't have electric bill worries. One more question: what kind of CPU/motherboard would you need to back this sucker up? I think this card would be wasted without at least the latest i7 Extreme(ly overpriced); can you imagine some tool dropping this in an i3? What I'm saying is, this sucker would need an expensive 'bed' too, otherwise you'd just be wasting your time and money.
Btw, I've tested 1/2/3x GTX 980 on a P55 board; it works a lot better than one might think. Also tested 1/2x 980 with an i5 760 - again, works quite well. Plus, the heavier the game, the less it tends to rely on main CPU power, especially as the resolution/detail rises.
Go to 3dmark.com, Advanced Search, Fire Strike, enter i7 870 or i5 760 & search; my whackadoodle results come straight back. :D I've tested with a 5GHz 2700K and 4.8GHz 3930K as well; the latter are quicker of course, but not that much quicker - less than most would probably assume.
Btw, the Titan X is more suited to solo pros doing tasks that are mainly FP32, like After Effects. However, there's always a market for the very best, and I know normal high street stores make their biggest profit margins on premium items (and the customers who buy them), so it's an important segment - it drives everything else in a way.
I am a big fan of open standards. The more the merrier. I stay with nVidia because they support their products better. I was a big fan of AMD GPUs but the drivers were so buggy. nVidia updates their drivers quickly and the support is just a lot better. I like G-Sync. It's worth the extra money. I hope my monitor can support FreeSync with a firmware upgrade. (Not that I have an AMD GPU.)
Lot of G-Sync versus AMD bashing here. Personally it all comes down to whether or not I am being confined to an eco-system when going for either technology. If nVidia starts to become like Apple in that respect, I'm not very comfortable with it. However, I wonder if to adapt to FreeSync, does it take a lot of addition or modification of hardware on the GPU end. That might be one reason that nVidia didn't have to change much of their architecture on the GPU during the G-Sync launch and confined that to the scaler. AMD worked with VESA to get this working on GCN 1.1, but not on GCN 1.0. This may be another area where the technologies are radically different- one is heavily reliant on the scaler while the other may be able to divide the work to a certain extent? Then again, I'm quite ignorant of how scalers and GPUs work in this case.
To be honest I don't give half a fudge about FreeSync or G-Sync. What gets my attention is ULMB / strobed backlight. An IPS display (or, well, OLED, but...), WQHD, and strobing that works on a range of refresh rates, including some that are multiples of 24 - that would be MY holy grail. The announced Acer XB270HU comes close, but ULMB apparently only works at 85Hz and 100Hz.
Why will it not work with the R9 270? That is BS! To hell with you AMD! I paid good money for my R9 series card! And it was supposed to be current GCN not GCN 1.0! Not only do you have to deal with crap drivers that cause artifacts! Now AMD is pulling off marketing BS!
Anandtech, have you seen the PCPerspective article on G-Sync vs FreeSync? PCPer was seeing ghosting with FreeSync. Can you guys corroborate their findings?
The problem I have is "syncing" is a relic of the past. The only reason why you needed to sync with a monitor is because they were using CRTs that could only trace the screen line by line. It just kept things simpler (or maybe practical) if you weren't trying to fudge with the timing of that on the fly.
Now, you can address each individual pixel. There's no need to "trace" each line. DVI should've eliminated this problem because it was meant for LCD's. But no, in order to retain backwards compatibility, DVI's data stream behaves exactly like VGA's. DisplayPort finally did away with this by packetizing the data, which I hope means that display controllers only change what they need to change, not "refresh" the screen. But given they still are backwards compatible with DVI, I doubt that's the case.
Get rid of the concept of refresh rates and syncing altogether. Stop making digital displays behave like CRTs.
Why do I need either FreeSync or G-Sync when I already get over 100fps in all games at 2560x1440? All I want is a 144Hz 2560x1440 monitor without the G-Sync tax, as G-Sync and FreeSync are only useful if you drop below 60fps.
LCD is a memory array: if you don't use it you lose it. You need to physically refresh each pixel the same number of times a second. You could save on average bitrate by only sending changed pixels, but that requires more work on the GPU and adds latency. What's more, it doesn't change what your max bitrate needs to be - and don't even bother suggesting multiple frame buffers, as that adds TV-tier latency.
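As a toy illustration of the trade-off being argued here (not any real display protocol), the sketch below diffs two frames in 8x8 tiles and counts how many would need to be retransmitted. Average traffic drops sharply for a mostly static desktop, but the worst case - every tile dirty - still needs the full-frame link bandwidth, which is the point about max bitrate above.

```cpp
#include <cstdint>
#include <cstring>
#include <cstdio>
#include <vector>

constexpr int W = 1920, H = 1080, TILE = 8;

// Count 8x8 tiles that differ between the previous and current frame.
size_t dirtyTiles(const std::vector<uint32_t>& prev,
                  const std::vector<uint32_t>& cur) {
    size_t dirty = 0;
    for (int ty = 0; ty < H; ty += TILE) {
        for (int tx = 0; tx < W; tx += TILE) {
            bool changed = false;
            for (int y = ty; y < ty + TILE && !changed; ++y)
                changed = std::memcmp(&prev[y * W + tx], &cur[y * W + tx],
                                      TILE * sizeof(uint32_t)) != 0;
            dirty += changed ? 1 : 0;
        }
    }
    return dirty;
}

int main() {
    std::vector<uint32_t> a(size_t(W) * H, 0xFF000000u), b = a;
    b[100 * W + 100] = 0xFFFFFFFFu;  // a one-pixel change (e.g. a cursor blink)
    std::printf("dirty tiles: %zu of %d\n",
                dirtyTiles(a, b), (W / TILE) * (H / TILE));
    return 0;
}
```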
And more evidence of FreeSync's (and AnandTech's) shortcomings, again from PCPer. I remember a time AnandTech was willing to put in the work with the kind of creativeness needed to come to such conclusions, but I guess this is what happens when the boss retires and takes a gig with Apple.
FreeSync is not meant to increase fps. The whole point is visuals. It stops visual tearing, which is why frame rates drop to match the monitor. FPS has no effect on what FreeSync is meant to do - it's all visuals, not performance. I hate when people write reviews and don't know what they're talking about. You're going to get dropped frame rates because that means the frame isn't ready yet, so the GPU doesn't give it to the display and holds onto it a tiny bit longer to make sure the monitor and GPU are both ready for that frame.
Keysisitchy - Thursday, March 19, 2015 - link
RIP Gsync. The 'gsync module' was all smoke and mirrors and worthless.
RIP gsync. Freesync is here
imaheadcase - Thursday, March 19, 2015 - link
Not sure why people are bashing Gsync, it still is fantastic. So it puts a little more on price on hardware..you are still forgetting the main drive why gsync is better. It still works on nvidia hardware and freesync STILL requires a driver on nvidia hardware which they have already stated won't happen.If you noticed, the displayed they are announced don't even care about the specs freesync could do. They are just the same if worse than Gsync. Notice the Gsynce IPS 144Hz monitors are coming out this month..
imaheadcase - Thursday, March 19, 2015 - link
To add to the above, freesync is not better than Gsync because its "open". Costs still trickle down somehow..and that cost is you having to stick with AMD hardware when you buy a new monitor. So the openness about is is really it is closed tech...just like Gsync.iniudan - Thursday, March 19, 2015 - link
Nvidia will someday to have to provide freesync to comply with spec of more recent displayport version. Right now they are just keeping older 1.2 version on their gpu, simply as to not have to adopt it.dragonsqrrl - Thursday, March 19, 2015 - link
No current or upcoming DP spec (including DP 1.3) requires adaptive sync.
eddman - Thursday, March 19, 2015 - link
Do you mean to say that it's available for 1.3 but optional, or that 1.3 doesn't support adaptive sync at all? It does support it in 1.3, according to their FAQ, but the FAQ doesn't say whether it's mandatory or not.
http://www.displayport.org/faq/#DisplayPort 1.3 FAQs
DanNeely - Thursday, March 19, 2015 - link
It's optional. Outside of gaming it doesn't have any real value; so forcing mass market displays to use a marginally more complex controller/more capable panel doesn't add any value.mutantmagnet - Friday, March 20, 2015 - link
It has value for movies. The screen tearing that can occur would be nice to remove.
D. Lister - Friday, March 20, 2015 - link
Movies can't have screen tearing, because the GPU is just decoding them not actually generating the visuals like in games.
ozzuneoj86 - Friday, March 20, 2015 - link
That is definitely not true. Tearing happens between the graphics card and the display, it doesn't matter what the source is. If you watch a movie at 24fps on a screen that is running at 60Hz connected to a PC, there is a chance for tearing.Tearing is just less common on movies because the frame rates are usually much lower than the refresh rate of the display, so its less likely to be visible.
Cerb - Saturday, March 21, 2015 - link
If it's not working, this is just as wrong. Since it's fairly close, at 24, 25, or almost 30, you will see the tear line creeping up or down the image, if vsync isn't on. It's exceptionally obvious. Usually, you will just see skipped frames on Windows, since the compositor forces vsync for the desktop, and this is generally well-supported by any video player's playback mechanisms. The skipped frames become more noticeable as you watch, but aren't nearly as bad as tearing.looncraz - Saturday, March 21, 2015 - link
Tearing can happen anytime.I'm writing a compositing engine for HaikuOS and I would LOVE to be able to control the refresh timing! when a small update occurs, and the frame buffer is ready, I'd swap it, trigger a monitor refresh, and then be on my way right away.
As it stands, I have to either always be a frame behind, or try and guess how long composing the frame buffer from the update stream will take before I know anything about what the update stream will be like so I know when to wake up the composite engine control loop.
That means, even on normal day-to-day stuff, like opening a menu, dragging icons, playing solitaire, browsing the web, etc. FreeSync would be quite useful. As it stands, the best I can do is hope the frame is ready for the next interval, or wait until the next refresh is complete to swap frame buffers - which means that the data on screen is always a frame out of date (or more).
At 60hz that is a fixed delay multiplier of 16.7, with a minimum multiplicand of 1. Going with higher refresh rates on the desktop is just wasteful (we don't really need 60, except for things to feel smooth due to the delay multiplier effect of the refresh rate).
If I could use the whole range from 45hz to 75 hz, our (virtual) multiplicand could be 0.75-1.33, instead of exactly 1 or 2. That make a significant difference in jitter.
Everything would be smoother - and we could drop down to a 45hz refresh interval by default, saving energy in the process, instead of being stuck at at a fixed cadence.
Cerb - Saturday, March 21, 2015 - link
Wrong. it is generating the visuals, and doing so the exact same way, as far as any of this technology is concerned, and screen tearing does happen, because refresh rates vary from our common ones.soccerballtux - Friday, March 20, 2015 - link
considering the power saving impact it's had on the mobile sector (no sense rendering to pixels that haven't changed, just render to the ones that have), it most definitely would have a significant impact on the laptop market and would be a great 'green' tech in general.
erple2 - Friday, March 20, 2015 - link
No value, except to the consumer that doesn't have to pay the (current) $160+ premium for g-sync. Now, if amd had a gfx card competitor to the gtx980, it'd be marvelous, and a no brainer. Given that the cost is apparently minimal to implement, I don't see that as a problem. Even if you think it's not value added, panel manufacturers shoved the pointless 3d down everyone's throat, so clearly, they're not averse to that behavior.
mdriftmeyer - Sunday, March 22, 2015 - link
It has value for any animated sequence.
JonnyDough - Monday, March 23, 2015 - link
Inside of gaming it has plenty of value - who even cares about the rest? Gaming was a $25.1 billion market in 2010 (ESA annual report). I'd take a billionth of that pie and go out for a nice meal wouldn't you?
dragonsqrrl - Thursday, March 19, 2015 - link
... No current or upcoming DP spec ...requires... adaptive sync. It's optional, not sure how else you could interpret that, especially when you take the comment I responded to into consideration.
eddman - Friday, March 20, 2015 - link
Wait a minute; that only applies to monitors, right? It'd suck to buy a DP 1.2a/3 video card and find out that it cannot do adaptive-sync.
tobi1449 - Friday, March 20, 2015 - link
Plus FreeSync != Adaptive Sync
Flunk - Friday, March 20, 2015 - link
Free Sync = Adaptive Sync + AMD GPU + AMD Drivers.
Gigaplex - Thursday, March 19, 2015 - link
Adaptive Sync as part of the DP spec is optional. It's not required for certification.
JonnyDough - Monday, March 23, 2015 - link
Yep.
FriendlyUser - Thursday, March 19, 2015 - link
Read the article. Freesync monitors are less expensive. Plus, they have a much better chance of getting Intel support or even Nvidia support (wanna bet it's going to happen? they're simply going to call it DisplayPort variable refresh or something like that...)
imaheadcase - Thursday, March 19, 2015 - link
Intel support? I doubt you will find anyone buying these with an Intel GPU. Why would nvidia support it with its investment already in Gsync, with new IPS Gsync monitors shipping this month? Makes no sense.
testbug00 - Thursday, March 19, 2015 - link
Laptop displays. Laptop displays. Laptop displays. Being able to lower the refresh rate when you don't need it higher is something nearly every laptop could use. Currently there are no implementations of Freesync/Async that go down to 9Hz, but, well... That's power savings!
Zan Lynx - Wednesday, March 25, 2015 - link
http://www.intel.com/content/dam/doc/white-paper/p...
Refresh rate switching is definitely part of it.
And tablet and phone chipsets go as far as having no refresh at all. The display only updates when there is a new frame. The tablets even use a power-saving simple frame buffer / LCD driver and turn off the render hardware entirely.
anubis44 - Tuesday, March 24, 2015 - link
nVidia will buckle. It's inevitable. They can't stand against the entire industry, and AMD has the entire industry behind them with this. Jen Hsun knows he's already lost this battle, and he's just out to milk G-Sync for whatever he can get, for as long as he can get it. It's only a short matter of time before somebody hacks the nVidia drivers and makes them work with FreeSync, a la the old custom Omega ATI drivers. How appealing will it be to pay extra for G-Sync monitors once custom nVidia drivers exist that work with the much wider range of FreeSync monitors?
chizow - Tuesday, March 24, 2015 - link
LOL, yes the thought of having to use a hacked driver to use an inferior solution leading to Nvidia reversing course on G-Sync is a thought only an AMD fan could possibly find palatable.
G-Sync isn't going anywhere, especially in light of all the problems we are seeing with FreeSync.
Black Obsidian - Thursday, March 19, 2015 - link
Of course FreeSync is better than Gsync because it's open.
Royalty cost to GPU-maker to support FreeSync is literally $0. That makes future Intel or nVidia GPUs a driver update away from supporting FreeSync. Compare to Gsync, at a royalty cost greater than zero, assuming nVidia would license it at all.
Scaler cost to LCD-makers to support FreeSync appears to be a maximum of $50 now, quite likely $0 in the long run as it becomes a default feature in all scalers. Compare to Gsync at $200+.
Take off your fanboy blinders for a moment. Capabilities being equal (as they so far seem to be), a royalty-free solution that's supported by default on future standard hardware is clearly better than a royalty-encumbered solution that requires costly additional hardware, no matter which team is supporting which one.
medi03 - Thursday, March 19, 2015 - link
nVidia's royalty cost for GSync is infinity.
They've stated they were not going to license it to anyone.
chizow - Thursday, March 19, 2015 - link
It's actually nil, they have never once said there is a royalty fee attached to G-Sync.
Creig - Friday, March 20, 2015 - link
TechPowerup
"NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties"
WCCF Tech
"AMD FreeSync, unlike Nvidia G-Sync is completely and utterly royalty free"
The Tech Report
"Like the rest of the standard—and unlike G-Sync—this "Adaptive-Sync" feature is royalty-free."
chizow - Friday, March 20, 2015 - link
@Creig
Again, please link confirmation from Nvidia that G-Sync carries a penny of royalty fees. BoM for the G-Sync module is not the same as a royalty fee, especially because, as we have seen, that G-Sync module may very well be the secret sauce FreeSync is missing in providing an equivalent experience to G-Sync.
Indeed, a quote from someone who didn't just take AMD's word for it: http://www.pcper.com/reviews/Displays/AMD-FreeSync...
"I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. "
JarredWalton - Friday, March 20, 2015 - link
Finish the quote:"It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."
Of course, royalties for using the G-SYNC brand is the real question -- not royalties for using the G-SYNC module. But even if NVIDIA doesn't charge "royalties" in the normal sense, they're charging a premium for a G-SYNC scaler compared to a regular scaler. Interestingly, if the G-SYNC module is only $40-$60, that means the LCD manufacturers are adding $100 over the cost of the module.
chizow - Friday, March 20, 2015 - link
Why is there a need to finish the quote? If you get a spoiler and turbo charger in your next car, are they charging you a royalty fee? It's not semantics to anyone who actually understands the simple notion: better tech = more hardware = higher price tag.
AnnihilatorX - Sunday, March 22, 2015 - link
Nvidia is making a profit on the Gsync module; how's that different from a royalty?
chizow - Monday, March 23, 2015 - link
@AnnihilatorX, how is "making a profit" suddenly the key determining factor for being a royalty? Is AMD charging you a royalty every time you buy one of their graphics cards? Complete and utter rubbish. Honestly, as much as some want to say it is "semantics", it really isn't; it comes down to clear definitions that are specific in usage, particularly in legal or contract contexts.
A royalty is a negotiated fee for using a brand, good, or service that is paid continuously per use or at predetermined intervals. That is completely different than charging a set price for the bill of materials for an actual component that is integrated into a product you can purchase at a premium. It is obvious to anyone that the additional component adds value to the product and is reflected in the higher price. This is in no way a royalty.
Alexvrb - Monday, March 23, 2015 - link
Jarred, Chizow is the most diehard Nvidia fanboy on Anandtech. There is nothing you could say to him to convince him that Freesync/Adaptive Sync is in any way better than G-Sync (pricing or otherwise). Just being unavailable on Nvidia hardware makes it completely useless to him. At least until Nvidia adopts it. Then it'll suddenly be a triumph of open standards, all thanks to VESA and Nvidia and possibly some other unimportant company.
chizow - Monday, March 23, 2015 - link
And Alexvrb is one of the staunchest AMD supporters on Anandtech. There is quite a lot that could convince me FreeSync is better than G-Sync, that it actually does what it sets out to do without issues or compromise, but clearly that isn't covered in this Bubble Gum Review of the technology. Unlike the budget-focused crowd that AMD targets, price is not going to be the driving factor for me, especially if one solution is better than the other at achieving what it sets out to do, so yes, while Freesync is cheaper, to me it is obvious why: it's just not as good as G-Sync.
But yes, I'm honestly ambivalent about whether or not Nvidia supports Adaptive Sync; as long as they continue to support G-Sync as their premium option, then it's np. Supporting Adaptive Sync as their cheap/low-end solution would just be one less reason for anyone to buy an AMD graphics card, which would probably be an unintended consequence for AMD.
imaheadcase - Thursday, March 19, 2015 - link
The cost means nothing; keep in mind the people buying this stuff pay out the nose already for hardware. Given that most people who buy nvidia cards are going to get a Gsync monitor, cost has no meaning.
This is a doubly bad thing for AMD: first, it's still tied to ITS graphics cards (NV already said no to supporting it), and second, the monitors announced are already below the specs Freesync is supposed to do, and worse than next gen Gsync monitors.
I mean, I love competition like the next person, but this is just PR making it seem like it's a good thing when it's not.
SleepyFE - Thursday, March 19, 2015 - link
Your name says it all. Do you really think manufacturers will beg NVidia to come and mess with their manufacturing process just to include something that only they support? Time will come when phone makers will join and they mostly don't use NVidia GPU's. So now you have NVidia vs AMD and Intel (for ultrabooks) and ARM (Mali) and PowerVR. You think NVidia can hold them off with overpricing and PR?
Murloc - Thursday, March 19, 2015 - link
uhm no?
I'd want my next monitor to be GPU agnostic ideally.
And I'd want to use an nvidia card with it because right now AMD cards are still ovens compared to nvidia.
Not because I like paying through the nose, a 750 Ti doesn't cost much at all.
I'll hold out since I'm trusting that this thing will solve itself (in favour of the industry standard, adaptive sync) sooner or later.
Ranger101 - Friday, March 20, 2015 - link
So a difference of 10 degrees celsius under load makes an AMD gpu an OVEN and an Nvidia gpu presumably a Fridge by comparison....LOL.
Lakku - Wednesday, May 6, 2015 - link
The reported GPU temp means nothing. That is just an indication of the heatsink/fans ability to remove heat from the GPU. You need to look at power draw. The AMD GPUs draw significantly more power than current nVidia cards for less performance. That power generates heat, heat that needs to go somewhere. So while the AMD cards may be 10 degrees Celsius more, which isn't minimal in and of itself, it is having to dissipate quite a bit more generated heat. The end result is AMD GPUs are putting out quite a bit more heat than nVidia GPUs.
althaz - Thursday, March 19, 2015 - link
There are a bunch of 4k monitors announced. Yet there are no 4k G-Sync monitors available - how is that worse specs?
I'd buy a 4k 27" G-Sync display at a reasonable price in a heartbeat. In fact I'd buy two.
thejshep - Thursday, March 19, 2015 - link
Acer XB280HK: http://www.newegg.com/Product/Product.aspx?Item=N8...
arneberg - Thursday, March 19, 2015 - link
Freesync today is only open to people with Radeon cards. AMD made the better deal: they let the monitor builders take the cost for freesync. Nvidia made the hardware themselves.
chizow - Thursday, March 19, 2015 - link
Huh? Who cares if it's open/closed/upside down/inside out? G-Sync is better because it is BETTER at what it set out to do. If it is the better overall solution, as we have seen today it is, then it can and should command a premium. This will just be another bulletpoint pro/con for Nvidia vs. AMD. You want better, you have to pay for it, simple as that.
lordken - Thursday, March 19, 2015 - link
Better? Did we read the same article? I can't find where it says that gsync is better than freesync. In what aspect is it better?
And the answer to your 1st question is: anyone with a brain. The thing is that nvidia could enable support for freesync if it wouldn't hurt their pride, which would be a big benefit for their customers (you wouldn't be restricted in monitor selection), but they chose what is better for them: pushing gsync & milking more money from you.
This is pretty stupid. While you may be some average gamer that thinks it is fine to have your monitor selection restricted to 2%, normal people probably wouldn't be that happy. The way it should be is that every monitor supports freesync (or whatever you call it), as this is a display feature and should have been developed by the LCD makers in the first place, but they don't give a shit about providing excellent displays as long as they can sell shit that people are buying like crazy (not referring to gsync monitors now).
Vendor lock-in is always a bad thing.
Oh, and the article says that the gsync monitor doesn't provide an advanced OSD like common panels today... so yeah, gsync is clearly better.
chizow - Thursday, March 19, 2015 - link
See link: http://www.pcper.com/image/view/54234?return=node%...
Also: still unaddressed concerns with how and why FreeSync is still tied to Vsync and how this impacts latency.
happycamperjack - Thursday, March 19, 2015 - link
The ghosting problem actually has nothing to do with the G-Sync and FreeSync technologies like the article said, but more to do with the components in the monitor. So if Asus made a ROG Swift FreeSync version of the same monitor, there would've been no ghosting just like the G-SYNC version. So your example is invalid.
chizow - Friday, March 20, 2015 - link
@happycamperjack. Again, incorrect. Why is it that panels from the SAME manufacturers, that possibly use the same panels even, using the same prevailing panel technologies of this time, exhibit widely different characteristics under variable refresh? Maybe that magic G-Sync module that AMD claims is pointless is actually doing something....like controlling the drive electronics that control pixel response variably in response to changing framerates. Maybe AMD needs another 18 months to refine those scalers with the various scaler mfgs?
http://www.pcper.com/reviews/Displays/AMD-FreeSync...
"Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."
BenQ for example makes a fine G-Sync monitor, and multiple high refresh 3D Vision monitors well known for their lack of ghosting. Are you going to tell me that suddenly they are using inferior panel tech that can't handle ghosting? This is 2015 and TN panels we are talking about here right? This kind of ghosting has not been seen since circa 2007 when PMVA was all the rage.
AnnihilatorX - Thursday, March 19, 2015 - link
chizow, stop your biased preconceptions and actually read the article.
AnnihilatorX - Thursday, March 19, 2015 - link
I will summarize it for you in case your prejudice clouds your comprehension:
1) At no point does the article find any performance advantage for FreeSync or G-Sync (AMD claims a 0.5-1% advantage, but that's too small to detect, so we disregard it)
2) FreeSync has better monitor choices, including IPS and ones with better specs in general
3) FreeSync monitors are about USD 200 cheaper, almost half the cost of a decent graphics card
4) FreeSync monitors have on-screen displays (OSDs) that work; the G-Sync monitor's doesn't, due to its implementation
5) FreeSync has better potential for future support, especially in laptops, because of zero royalty fees and only a minor hardware update
6) FreeSync allows users the option to choose whether they want to enable Vsync or not; G-Sync locks Vsync to on. This means the user can have better latency if they can stand tearing. The important thing is the option; having the option is always advantageous
7) AMD claims FreeSync works from 9Hz-240Hz, whereas G-Sync only works from 30Hz to 144Hz.
chizow - Thursday, March 19, 2015 - link
@AnnihilatorX
1) You assume the tests conducted here are actually relevant.
2) No, they don't. Nvidia has an IPS in the works that may very well be the best of all, but in the meantime, it is obvious that for whatever reason the FreeSync panels are subpar compared to the G-Sync offerings. Courtesy of PCPer: http://www.pcper.com/image/view/54234?return=node%...
3) Sure they are cheaper, but they also aren't as good, and certainly not "Free," as there is a clear premium compared to non-FreeSync panels, and certainly no firmware flash is going to change that. Also, that $200 is going to have to be spent on a new GCN 1.1+ AMD graphics card anyway, as anyone who doesn't already own a newer AMD card will have to factor that into their decision. Meanwhile, G-Sync supports everything from Nvidia from Kepler on. Nice and tidy (and dominant in terms of installed user base).
4) OSDs, scalers and such add input lag. While having multiple inputs is nice, OSDs are a feature gaming purists can live without (see: all the direct-input gaming modes on newer LCDs that bypass the scalers).
5) Not if they're tied to AMD hardware. They can enjoy a minor share of the dGPU graphics market as their TAM.
6) Uh, this is nonsense. FreeSync is still tied to Vsync in ways THIS review certainly doesn't cover in depth, and that's not going to be a positive since Vsync inherently adds latency. Meanwhile, Vsync is never enabled with G-Sync, and while there is more latency at the capped FPS, it is a driver-side cap, not Vsync.
7) Well, AMD can claim all they like that it goes as low as 9Hz, but as we have seen, the implementation is FAR worse, falling apart below 40FPS where blurring, tearing... basically the image falls apart and everything you invested hundreds of dollars in basically became a huge waste. Meanwhile, G-Sync shows none of these issues, and I play some MMOs that regularly dip into the 20s in crowded cities, with no sign of any of this.
So yes, as I've shown, there are still many issues with FreeSync that need to be addressed that show it is clearly not as good as G-Sync. But like I said, this is a good introduction to the tech that Nvidia invented some 18 months ago, maybe with another 18 months AMD will make more refinements and close the gap?
lordken - Thursday, March 19, 2015 - link
5) What? Where did you get the idea that Adaptive-Sync is tied to AMD hardware? That's pretty much bullshit; if it were, it wouldn't be standardized by VESA, right? The fact that today only AMD hardware can support it (because they implemented it first) doesn't validate your claim that it is AMD-tied. Intel/Nvidia/... can implement it in their products if they want.
It's like saying that if, for example, LG released the first monitor to support DP 1.3, that would imply DP 1.3 is LG-tied, lol.
On the other hand, G-Sync is Nvidia-tied. But you know this, right?
chizow - Thursday, March 19, 2015 - link
@lordken, who else supports FreeSync? No one but AMD. Those monitor makers can ONLY expect to get business from a minor share of the graphics market given that is going to be the primary factor in paying the premium for one over a non-FreeSync monitor. This is a fact.
anubis44 - Tuesday, March 24, 2015 - link
VESA supports FreeSync, which means Intel will probably support it, too. Intel graphics drive far more computers than AMD or nVidia, which means that if Intel does support it, nVidia is euchred, and even if Intel doesn't support it, many more gamers will choose free over paying an extra $150-$200 for a gaming setup. Between the 390-series coming out shortly and the almost guaranteed certainty that some hacked nVidia drivers will show up on the web to support FreeSync, G-Sync is a doomed technology. Period.
chizow - Tuesday, March 24, 2015 - link
Intel has no reason to support FreeSync, and they have shown no interest either. Hell, they showed more interest in Mantle, but as we all know, AMD denied them (so much for being the open, hands-across-the-globe company).
But yes, I'm hoping Nvidia does support Adaptive-Sync as their low-end solution and keeps G-Sync as their premium solution. As we have seen, FreeSync just isn't good enough, but at the very least it means people will have even less reason to buy AMD if Nvidia supports both lower-end Adaptive-Sync and premium G-Sync monitors.
chizow - Thursday, March 19, 2015 - link
@lordken and yes, I am well aware G-Sync is tied to Nvidia, lol, but like I said, I will bet on the market leader with ~70% market share and installed user base (actually much higher than this, since Kepler is 100% vs. GCN 1.1 at maybe 30% of the cards sold since 2012) over the solution that holds a minor share of the dGPU market and an even smaller share of the CPU/APU market.
chizow - Thursday, March 19, 2015 - link
And why don't you stop your biased preconceptions and actually read some articles that don't just take AMD's slidedecks at face value? Read a review that actually tries to tackle the real issues I am referring to, while actually TALKING to the vendors and doing some investigative reporting:
http://www.pcper.com/reviews/Displays/AMD-FreeSync...
You will see, there are some major issues still with FreeSync that still need to be answered and addressed.
JarredWalton - Thursday, March 19, 2015 - link
It's not a "major issue" so much as a limitation of the variable refresh rate range and how AMD chooses to handle it. With NVIDIA it refreshes the same frame at least twice if you drop below 30Hz, and that's fine but it would have to introduce some lag. (When a frame is being refreshed, there's no way to send the next frame to the screen = lag.) AMD gives two options: VSYNC off or VSYNC on. With VSYNC off, you get tearing but less lag/latency. With VSYNC on you get stuttering if you fall below the minimum VRR rate.The LG displays are actually not a great option here, as 48Hz at minimum is rather high -- 45 FPS for example will give you tearing or stutter. So you basically want to choose settings for games such that you can stay above 48 FPS with this particular display. But that's certainly no worse than the classic way of doing things where people either live with tearing or aim for 60+ FPS -- 48 FPS is more easily achieved than 60 FPS.
The problem right now is we're all stuck comparing different implementations. A 2560x1080 IPS display is inherently different than a 2560x1440 TN display. LG decided 48Hz was the minimum refresh rate, most likely to avoid flicker; others have allowed some flicker while going down to 30Hz. You'll definitely see flicker on G-SYNC at 35FPS/35Hz in my experience, incidentally. I can't knock FreeSync and AMD for a problem that is arguably the fault of the display, so we'll look at it more when we get to the actual display review.
As to the solution, well, there's nothing stopping AMD from just resending a frame if the FPS is too low. They haven't done this in the current driver, but this is FreeSync 1.0 beta. (A rough sketch of that idea follows below.)
Final thought: I don't think most people looking to buy the LG 34UM67 are going to be using a low-end GPU, and in fact with current prices I suspect most people that upgrade will already have an R9 290/290X. Part of the reason I didn't notice issues with FreeSync is that with a single R9 290X in most games the FPS is well over 48. More time is needed for testing, obviously, and a single LCD with FreeSync isn't going to represent all future FreeSync displays. Don't try and draw conclusions from one sample, basically.
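(To make the "resending a frame" idea above concrete, here is a minimal sketch of what a driver-side fallback could look like. The function names, the callback structure, and the 48Hz floor are illustrative assumptions, not AMD's actual implementation.)

    import time

    MIN_REFRESH_HZ = 48                        # e.g. the LG 34UM67's lower bound
    MAX_FRAME_INTERVAL = 1.0 / MIN_REFRESH_HZ  # longest the panel can wait for a new frame

    def present_loop(render_frame, flip):
        # Hypothetical driver-side fallback: if the game has not produced a new
        # frame before the panel would drop below its minimum refresh rate,
        # re-present the previous frame so the display never idles below 48Hz.
        last_flip = time.monotonic()
        last_frame = None
        while True:
            budget = max(0.0, MAX_FRAME_INTERVAL - (time.monotonic() - last_flip))
            frame = render_frame(timeout=budget)   # assumed to return None on timeout
            if frame is None:
                frame = last_frame                 # no new frame in time: repeat the old one
            if frame is not None:
                flip(frame)                        # triggers an immediate panel refresh
                last_flip = time.monotonic()
                last_frame = frame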
chizow - Friday, March 20, 2015 - link
@Jarred, how is it not a major issue? You think that level of ghosting is acceptable and comparable to G-Sync!?!?! My, have your standards dropped. If that is the case I do not think you are qualified to write this review, or at least post it under Editorial, or even better, post it under the AMD sponsored banner.
Fact is, below the stated minimum refresh, FreeSync is WORSE than a non-VRR monitor would be, as all the tearing and input lag is there AND you get awful flickering and ghosting too.
And how do you know it is a limitation of panel technology when Nvidia's solution exhibits none of these issues at typical refresh rates as low as 20Hz, and especially at the higher refresh rates where AMD's solution starts to exhibit them? Don't you have access to the sources and players here? I mean, we know you have AMD's side of the story, but why don't you ask these same questions of Nvidia, the scaler makers, and the monitor makers as well? It could certainly be a limitation of the spec, don't you think? If monitor makers are just designing a monitor to AMD's FreeSync spec, and AMD is claiming they can alleviate this via a driver update, it sounds to me like the limitation is in the specification, not the technology, especially when Nvidia's solution does not have these issues. In fact, if you had asked Nvidia, as PCPer did, they may very well have explained to you why FreeSync ghosts/flickers and their solution does not. From PCPer, again:
" But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."
Science and hardware trump hunches and hearsay, imo. :)
Also, you might need to get with Ryan to fully understand the difference between G-Sync and FreeSync at low refresh. G-Sync simply displays the same frame twice. There is no sense of input lag; input lag would be if the next panel refresh were tied to a different input. That is not the case with G-Sync, because the held 2nd frame is still tied to the input of the 1st frame, while the next live frame has live input. All you perceive is low FPS, not input lag. There is a difference. It would be like playing a game at 30FPS on a 60Hz monitor with no Vsync. Still, certainly much better than AMD's solution of having everything fall apart at a framerate that is still quite high and hard to attain for many video cards.
The LG is a horrible situation; who wants to be tied to a solution that is only effective in such a tight framerate band? If you are actually going to do some "testing", why don't you test something meaningful, like a gaming session that shows the % of frames in any particular game, with a particular graphics card, that fall outside of the "supported" refresh rates. I think you will find the amount of time spent outside of these bands is actually pretty high in demanding titles at the higher-than-1080p resolutions on the market today.
And you definitely see flicker at 35fps/35Hz on a G-Sync panel? Prove it. I have an ROG Swift and there is no flicker as low as 20FPS which is common in the CPU-limited MMO games out there. Not any noticeable flicker. You have access to both technologies, prove it. Post a video, post pictures, post the kind of evidence and do the kind of testing you would actually expect from a professional reviewer on a site like AT instead of addressing the deficiencies in your article with hearsay and anecdotal evidence.
Last part: again, I'd recommend running the test I suggested on multiple panels with multiple cards and mapping out the frame rates to see the % that fall outside or below these minimum FreeSync thresholds. I think you would be surprised, especially given many of these panels are above 1080p. Even that LG is only ~1.35x 1080p, but most of these panels are premium 1440p panels, and I can tell you for a fact a single 970/290/290X/980-class card is NOT enough to maintain 40+ FPS in many recent demanding games at high settings. And as of now, CF is not an option. So another strike against FreeSync: if you want to use it, your realistic options start at a 290/X, or there's the real possibility you are below the minimum threshold. (A sketch of that test follows below.)
Hopefully you don't take this too harshly or personally; while there are some pointed comments in there, there's also a lot of constructive feedback. I have been a fan of some of your work in the past, but this is certainly not your best effort or an effort worthy of AT, imo. The biggest problem I have, and we've gotten into it a bit in the past, is that you repeat many of the same misconceptions that helped shape and perpetuate all the "noise" surrounding FreeSync. For example, you mention it again in this article, yet do we have any confirmation from ANYONE that existing scalers and panels can simply be flashed to FreeSync with a firmware update? If not, why bother repeating the myth?
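(The test being proposed above is easy to state concretely: log per-frame render times and report how many fall outside the panel's variable refresh window. A minimal sketch, using the LG 34UM67's 48-75Hz window and made-up frame times purely as assumptions:)

    def pct_outside_vrr(frame_times_ms, min_hz=48, max_hz=75):
        # Share of frames whose instantaneous FPS falls outside the VRR window.
        outside = sum(1 for t in frame_times_ms
                      if not (min_hz <= 1000.0 / t <= max_hz))
        return 100.0 * outside / len(frame_times_ms)

    # Hypothetical frame times in milliseconds: 25ms and 30ms are 40 and 33 FPS,
    # both below a 48Hz floor, so 2 of 5 frames (40%) fall outside the window.
    print(pct_outside_vrr([16.7, 20.0, 25.0, 30.0, 14.0]))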
Darkito - Friday, March 20, 2015 - link
@Jarred, what do you make of this PC Perspective article?
http://www.pcper.com/reviews/Displays/AMD-FreeSync...
"
G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It’s a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module."
Especially those last few sentences. You say AMD can just duplicate frames like G-Sync but according to this article it's actually something in the G-Sync module that enables it. Is there truth to that?
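(Working through the figures PCPer quotes: 29 FPS is shown at 58Hz, 25 FPS at 50Hz, and 14 FPS at 56Hz. One simple rule that reproduces all three is to keep doubling the redraw rate until it clears the panel's 30Hz floor; whether the G-Sync module literally works this way is an assumption on my part, this is only a sketch of the arithmetic described in the quote.)

    def gsync_redraw_rate(fps, min_hz=30):
        # Repeat each frame 1, 2, 4, ... times until the effective refresh rate
        # is back above the panel's minimum, per PCPer's description.
        multiplier = 1
        refresh = fps
        while refresh < min_hz:
            multiplier *= 2
            refresh = fps * multiplier
        return multiplier, refresh

    # Matches the figures in the quote above:
    print(gsync_redraw_rate(29))  # (2, 58) -> each frame drawn twice, panel at 58Hz
    print(gsync_redraw_rate(25))  # (2, 50)
    print(gsync_redraw_rate(14))  # (4, 56) -> quadruple-drawn, panel at 56Hz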
Socketofpoop - Thursday, March 19, 2015 - link
Not worth the typing effort. Chizow is a well-known nvidia fanboy or possibly a shill for them. As long as it is green, it is best to him. Bent over, cheeks spread and ready for nvidia's next salvo all the time.
chizow - Friday, March 20, 2015 - link
@Socketofpoop, I'm well known among AMD fanboys! I'm so flattered!
I would ask this of you and the lesser-known AMD fanboys out there: if a graphics card had all the same great features, performance, and support at existing prices that Nvidia offers, but had an AMD logo and a red cooler on the box, I would buy the AMD card in a heartbeat. No questions asked. Would you, if roles were reversed? Of course not, because you're an AMD fan and obviously brand preference matters to you more than what is actually the better product.
Black Obsidian - Thursday, March 19, 2015 - link
I hate to break it to you, but history has not been kind to the technically superior but proprietary and/or higher cost solution. HD-DVD, miniDisc, Laserdisc, Betamax... the list goes on.
JarredWalton - Thursday, March 19, 2015 - link
Something else interesting to note is that there are 11 FreeSync displays already in the works (with supposedly nine more unannounced), compared to seven G-SYNC displays. In terms of numbers, FreeSync on the day of launch has nearly caught up to G-SYNC.
chizow - Thursday, March 19, 2015 - link
Did you pull that off AMD's slidedeck too, Jarred? What's interesting to note is that you list the FreeSync displays "in the works" without counting the G-Sync panels "in the works". And 3 monitors is now "nearly caught up to" 7? Right.
A brand new panel is a big investment (not really), I guess everyone should place their bets carefully. I'll bet on the market leader that holds a commanding share of the dGPU market, consistently provides the best graphics cards, great support and features, and isn't riddled with billions in debt with a gloomy financial outlook.
JarredWalton - Thursday, March 19, 2015 - link
Try counting again: LG 29", LG 34", BenQ 27", Acer 27" -- that's four. Thanks for playing. And the Samsung displays are announced and coming out later this month or early next. For NVIDIA, there are six displays available, and one coming next month (though I guess it's available overseas). I'm not aware of additional G-SYNC displays that have been announced, so there's our six/seven. I guess maybe we can count the early moddable LCDs from ASUS (and BenQ?) and call it 8/9 if you really want to stretch things.
I'm not saying G-SYNC is bad, but the proprietary nature and price are definitely not benefits for the consumer. FreeSync may not be technically superior in every way (or at least, the individual implementations in each LCD may not be as good), but open and less expensive frequently wins out over closed and more expensive.
chizow - Friday, March 20, 2015 - link
@Jarred, thanks for playing, but you're still wrong. There are 8 G-Sync panels on the market, and even adding the 1 for AMD that's still double, so how that is "nearly caught up" is certainly an interesting lens.
Nvidia also has panels in the works, including 2 major breakthroughs: the Acer 1440p IPS (the 1st 144Hz 1440p IPS VRR panel) and the Asus ROG Swift 4K IPS (the 1st 4K IPS VRR monitor). So yes, while AMD is busy "almost catching up" with low-end panels, Nvidia and their partners are continuing to pioneer the tech.
As for FreeSync bringing up the low end, I personally think it would be great if Nvidia adopted AdaptiveSync for their low end solutions and continued to support G-Sync as their premium solution. It would be great for the overwhelming majority of the market that owns Nvidia already, and would be one less reason for anyone to buy a Radeon card.
TheJian - Sunday, March 22, 2015 - link
You sure have a lot of excuses: this is beta 1.0, it's the LCD's fault (PCPer didn't think so), the assumption that open/free is frequently the winner (this isn't free: $50 by your own account for FreeSync, which is about the same as $40-60 for the G-Sync module, right? You even admit they're hiking prices at the vendor side by $100+). Ummm, tell that to CUDA and NV's generally more expensive cards. There is a reason they have pricing power (they are better) and own 70% of the discrete and ~75% of the workstation market. I digress...
anubis44 - Tuesday, March 24, 2015 - link
@chizow:"I'll bet on the market leader that holds a commanding share of the dGPU market, consistently provides the best graphics cards, great support and features, and isn't riddled with billions in debt with a gloomy financial outlook."
You mean you'll bet on the crooked, corrupt, anti-competitive, money-grubbing company that doesn't compensate their customers when they rip them off (bumpgate), has no qualms about selling them a bill of goods (GTX 970 has 4GB of RAM! Well, 3.5GB of 'normal' speed RAM, and .5GB of much slower, shitty RAM.), and likes to pay off game-makers to throw in trivial nVidia proprietary special effects (Batman franchise and PhysX, I'm looking right at you)? That company? Ok, you keep supporting the rip-off GPU maker, and see how this all ends for you.
chizow - Tuesday, March 24, 2015 - link
@Anubis44: Yeah, again, I don't bother with any of that noise. The GTX 970 I bought for my wife in Dec had no adverse impact from the paper spec change made a few months ago; it is still the same fantastic value and perf it was the day it launched.
But yes, I am sure ignoramuses like yourself are quick to dismiss all the deceptive and downright deceitful things AMD has said in the past about FreeSync, now that we know it's not really free, can't be implemented with a firmware flash, does in fact require additional hardware, and doesn't even work with many of AMD's own GPUs. And how about CrossFireX? How long did AMD steal money from end-users like yourself on a solution that was flawed and broken for years on end, even denying there was a problem until Nvidia and the press exposed it with that entire runt-frame FCAT fiasco?
And bumpgate? LMAO. AMD fanboys need to be careful who they point the finger at; especially in the case of AMD, there are usually 4 more fingers pointed back at them. How about that Llano demand overstatement lawsuit, still ongoing, that specifically names most of AMD's exec board, including Read? How about that Apple extended warranty and class action lawsuit regarding the same package/bump issues on AMD's MacBook GPUs?
LOL, it's funny because idiots like you think "money-grubbing" is some pejorative and greedy companies are inherently evil, but then you look at AMD's financial woes and you understand they can only attract the kind of cheap, ignorant and obtusely stubborn customers LIKE YOU who won't even spend top dollar on their comparably low-end offerings. Then you wonder why AMD is always in a loss position, bleeding money from every orifice, saddled with debt. Because you're waiting for that R9 290 to have a MIR and drop from $208.42 to $199.97 before you crack that dusty wallet open and fork out your hard-earned money.
And when there is actually a problem with an AMD product, you would rather make excuses for them and sweep those problems under the rug than demand a better product!
So yes, in the end, you and AMD deserve one another, for as long as it lasts anyways.
Yojimbo - Thursday, March 19, 2015 - link
HD-DVD was technically superior and higher cost? It seems Blu-ray/HD-DVD is a counterexample to what you are saying, but you include it in the list in your favor. Laserdisc couldn't record, whereas VCRs could. Minidisc was smaller and offered recording, but CD-R came soon after and then all it had was the smaller size. Finally MP3 players came along and did away with it.
There's another difference in this instance, though, which doesn't apply to any of those situations that I am aware of (other than minidisc): G-Sync/FreeSync are linked to an already installed user base of requisite products. (Minidisc was going up against CD libraries, although people could copy those. In any case, minidisc wasn't successful and was going AGAINST an installed user base.) NVIDIA has a dominant position in the installed GPU base, which is probably exactly the reason that NVIDIA chose to close off G-Sync and the "free" ended up being in FreeSync.
Assuming variable refresh catches on, if after some time G-Sync monitors are still significantly more expensive than FreeSync ones, it could become a problem for NVIDIA and they may have to either work to reduce the price or support FreeSync.
chizow - Thursday, March 19, 2015 - link
Uh, HD-DVD was the open standard there, guy, and it lost to the proprietary one: Blu-ray. But there are plenty of other instances of proprietary winning out and dominating; let's not forget Windows vs. Linux, DX vs. OpenGL, CUDA vs. OpenCL, the list goes on and on.
Fact remains, people will pay more for the better product, and better means better results and better support. I think Nvidia has shown time and again that's where it beats AMD, and their customers are willing to pay more for it.
See: Broken Day 15 CF FreeSync drivers as exhibit A.
at80eighty - Thursday, March 19, 2015 - link
Keep paddling away, son. The ship isn't sinking at all.
chizow - Thursday, March 19, 2015 - link
If you buy a FreeSync monitor you will get 2-3 paddles for every 1 on a G-Sync panel. That will certainly help you paddle faster.
http://www.pcper.com/image/view/54234?return=node%...
at80eighty - Friday, March 20, 2015 - link
Oh hey, great example - no ghosting there at all. Brilliant!
JarredWalton - Friday, March 20, 2015 - link
FYI, ghosting is a factor of the display and firmware, not of the inherent technology. So while it's valid to say, "The LG FreeSync display has ghosting..." you shouldn't by extension imply FreeSync in and of itself is the cause of ghosting.
chizow - Friday, March 20, 2015 - link
So are you saying a firmware flash is going to fix this, essentially for free? Yes, that is a bit of a troll, but you get the picture. Stop making excuses for AMD and ask these questions of them and the panel makers, on record, for real answers. All this conjecture and excuse-making is honestly a disservice to your readers, who are going to make some massive investment (not really) in a panel that I would consider completely unusable.
You remember that Gateway FPD2485W that you did a fantastic review of a few years ago? Would you go back to that as your primary gaming monitor today? Then why dismiss this problem with FreeSync circa 2015?
chizow - Friday, March 20, 2015 - link
Who said no ghosting? lol. There's lots of ghosting on the FreeSync panels.
TheJian - Sunday, March 22, 2015 - link
You're assuming G-Sync stays the same price forever. So scalers can improve pricing (in your mind) to zero over time, but NV's module will never shrink, get better revs, etc... LOL. OK. Also you assume they can't just lower the price any day of the week if desired. Microsoft just decided to give away Windows 10 (only to slow Android, but still). This is the kind of thing a company can do when they have 3.7B in the bank and no debt (NV has debt, but if it were paid off they'd have ~3.7B left). They could certainly put out a better rev that is cheaper, or subsidize $50-100 of it for a while until they can put out a cheaper version, just to slow AMD down.
They are not equal. See other site reviews besides an AMD portal site like AnandTech ;)
http://www.pcper.com/reviews/Displays/AMD-FreeSync...
There is no licensing fee from NV, according to PCPer.
"It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."
Which basically shows VENDORS must be marking things up quite a lot. But that is to be expected with ZERO competition until this week.
"For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself."
Not the case on the AMD side as he says. So again not so free if you don't own a card. NV people that own a card already are basically covered, just buy a monitor.
The specs here are misleading too, which AnandTech just blows by:
"The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates, AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560x1440) or 48-75 Hz (IPS, 2560x1080); neither of which is close to the 9-240 Hz seen in this table."
Again, read a site that doesn't lean so heavily to AMD. Don't forget to read about the GHOSTING on AMD. One more point, PCper's conclusion:
"My time with today’s version of FreeSync definitely show it as a step in the right direction but I think it is far from perfect."
"But there is room for improvement: ghosting concerns, improving the transition experience between VRR windows and non-VRR frame rates and figuring out a way to enable tear-free and stutter-free gaming under the minimum variable refresh rate."
"FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet."
Ok then...Maybe Freesync rev2 gets it right ;)
soccerballtux - Friday, March 20, 2015 - link
You must be a headcase or, more likely, are paid by NVidia to publicly shill. G-Sync requires a proprietary NVidia chip installed in the monitor that comes from, and only from, NVidia.
It's much easier to simply set a flag byte in the DisplayPort data stream that says "ok, render everything since the last render you rendered, now". There's nothing closed about that.
chizow - Friday, March 20, 2015 - link
And? Who cares, if it results in a better solution? LOL, only a headcase or a paid AMD shill would say removing hardware for a cheaper solution that results in a worse solution is actually better.
soccerballtux - Friday, March 20, 2015 - link
wellll, if it's cheaper and a better solution, then the market cares.
chizow - Friday, March 20, 2015 - link
Except it's cheaper and worse, therefore it should be cheaper.
bloodypulp - Friday, March 20, 2015 - link
Oh darn... so what you're saying is that I have to purchase the card that costs less, then I have to purchase the monitor that costs less too? Sounds like a raw deal... ROFL!!
And as far as your bogus openness argument goes: there is nothing preventing Nvidia from supporting Adaptive-Sync. NOTHING. Well, unless you count hubris and greed. In fact, Mobile G-Sync already doesn't even require the module! I guess that expensive module really wasn't necessary after all...
And lastly, Nvidia has no x86 APU to offer, so they can't offer what AMD can with their Freesync-supporting APUs. Nvidia simply has nothing to compete with there. Even gamers on a tight budget can enjoy Freesync! The same simply cannot be said for GSync.
Denithor - Friday, March 20, 2015 - link
ONLY closed because NVIDIA refuses to support freesync. Much like OpenCL. And PhysX.
Refuge - Friday, March 20, 2015 - link
That's the thing: this is clearly a tech that can help lower-end GPUs provide a better gaming experience, and it is a patch away for nVidia.
Them saying they won't, or them just not doing it, is honestly a slap in the face to every customer of theirs. Me included; I don't want my GPU to work better with one monitor than another because of a branding issue.
If nVidia doesn't support FreeSync, I'll just never buy their products again. I honestly don't see why they wouldn't support it. Then their GPUs work with everything and AMD is still stuck with FreeSync.
Not only is it insulting to me as a customer, it is also stupid from a business standpoint.
Creig - Friday, March 20, 2015 - link
AdaptiveSync - An open VESA industry standard available for free to any company that wishes to utilize it.
G-Sync - Nvidia's proprietary solution that they collect royalties on and refuse to allow any other company to use.
Big difference.
tobi1449 - Friday, March 20, 2015 - link
Plus I doubt the companies producing the scalers don't want their cut for additional features like this one.
cbutters - Friday, March 20, 2015 - link
So you are arguing that the cost is that you have to stick with AMD hardware? For one, how is this a cost?? My point is that the only reason you would have to stick with AMD hardware is that NVIDIA chooses not to support DisplayPort 1.2a. So it is NVIDIA costing you.
Secondly, you are not limited to AMD hardware; rather, NVIDIA is excluding itself from your next purchase. FreeSync is not closed tech... Intel graphics chips could adopt it tomorrow since it is an open standard. It is NVIDIA that is closing down options, not AMD.
JonnyDough - Monday, March 23, 2015 - link
Costs will trickle down somehow... yes, with good PR. AMD spent a ton developing this only to give it away for free. It will pay off because it makes NVidia look bad. I'm not a fanboy; I usually prefer NVidia's drivers. I just like AMD better because they compete on cost, not match it like an oligopoly.
anubis44 - Tuesday, March 24, 2015 - link
"Costs still trickle down somehow..and that cost is you having to stick with AMD hardware when you buy a new monitor."That's not a cost AMD is imposing on us, it's a cost nVidia is imposing on us, by stubbornly refusing to give their customers Freesync compatible drivers. nVidia is simply trying to grab as much cash as possible, and people like you are helping them rip us all off.
chizow - Tuesday, March 24, 2015 - link
How is it not a cost AMD is imposing? LOL. FreeSync panels carry a premium over non-FreeSync panels; this is a fact. AMD has said it is the panel mfgs charging a premium for higher quality components, specifications, and engineering/QA costs. No one has a problem with this, even AMD fanboys like you.
Yet when Nvidia does the same, especially when their G-Sync module is clearly doing a better job at what it needs to do relative to the new FreeSync scalers, all while offering more features (3D and ULMB), suddenly there's a problem and Nvidia has no right?
LOL, idiots. Nvidia and their mfg partners are charging more because the market sees value in their superior products, simple as that. These are the same mfgs btw, if they thought they could charge more they would, but clearly, they also see the Nvidia solution commands the higher price tag.
medi03 - Wednesday, March 25, 2015 - link
BS.
As mentioned in the article, FreeSync support is no big deal and is already supported by most upscaler chips out there. Had there been a "hidden cost," they wouldn't do it.
FriendlyUser - Thursday, March 19, 2015 - link
How is it better because it works with Nvidia hardware? I mean, if you have an Nvidia card you don't have a choice. That doesn't make G-Sync better in any meaningful way.
dragonsqrrl - Thursday, March 19, 2015 - link
... If you currently have an AMD card, you have much less of a choice. Actually, given the restriction of GCN 1.1 or later, there's a decent possibility you have no choice.
lordken - Thursday, March 19, 2015 - link
Mmh, your point is? Of course if you have AMD you can only get FreeSync, because, if nothing else, Nvidia kept G-Sync for themselves. What were you trying to say? Nvidia is fragmenting the monitor market.
dragonsqrrl - Thursday, March 19, 2015 - link
My point is that Nvidia currently has more options for variable refresh rate tech, on top of a much larger install base, than AMD. It often helps to read a response in the context of the comment it's responding to. If you can't see how that's a relevant response to FriendlyUser's comment, then I can't help you.
chizow - Tuesday, March 24, 2015 - link
Exactly, yet AMD fans and, surprisingly, even the author Jarred (who should know better), would have you believe G-Sync somehow faces the uphill battle?
JeffFlanagan - Thursday, March 19, 2015 - link
Having Nvidia refuse to embrace a standard does not make overpriced G-Sync devices "better." It's just Nvidia failing their users yet again.
They screwed me on stereoscopic 3D by dropping support for the $1K eMagin HMD when changing business partners, making it clear that they do not care to support their customers if not supporting them will drive sales of new displays. I won't get fooled again.
chizow - Thursday, March 19, 2015 - link
Nvidia failing their users, that's interesting. So they failed their users by inventing a tech the world had never seen before and bringing it to market some 18 months before the competition. Having owned and used an ROG Swift for the past 7 months, which completely changed my gaming experience, I'd disagree with you.
Nvidia once again did exactly what I expect them to do: introduce great new technology to improve their ecosystem for their users.
AnnihilatorX - Thursday, March 19, 2015 - link
For those 18 months, yes, Nvidia was good. But now it fails its customers because, by refusing to support the VESA standard, it is effectively limiting their choice of monitors and forcing customers to pay a premium if they want smooth gameplay.
No, they're reinforcing their position and rewarding their customers by sticking to and developing a technology they're invested in. Their customers will pay the premium as long as their solution is better, and the only way to ensure it remains better is to continue investing in and developing it. Nvidia has already said they aren't done improving G-Sync; given how awesome it has been in its first incarnation, I can't wait to see what they have in store.
Meanwhile, FreeSync is a good introduction to VRR for AMD; let's hope they continue to invest the same way Nvidia has to make sure they produce the best product they can for their users.
maximumGPU - Friday, March 20, 2015 - link
You can't possibly believe that, it's ridiculous!
Rewarding their customers by making them pay a premium for an end result that's clearly not noticeably different from the free alternative??
It's really business 101: they had the market cornered and they could charge whatever they want, fair play to them and well done.
But now, when an equally good, free, open-standard alternative comes into play, not adopting it IS a complete disregard for their customers. I own nvidia GPUs (SLI) now, and I DON'T want to pay for their solution after seeing what FreeSync can do. Not providing me with that option simply makes me a disgruntled customer who'll take my business elsewhere.
The problem is people like you who can't see that and continue to blindly buy into it, making them reluctant to change their stance as long as the money rolls in. They'd drop G-Sync in an instant if no one bought their overpriced tech, and we'd all be better for it.
chizow - Friday, March 20, 2015 - link
And you can't possibly believe that, can you? Read some actual reviews that clearly show right now FreeSync is the inferior solution.
silverblue - Friday, March 20, 2015 - link
Besides the PC Perspective review (which, like this one, is a work-in-progress anyway), please provide links to these reviews (plural, as stated).
chizow - Friday, March 20, 2015 - link
Yeah np, 2nd paragraph after the CCC picture, it clearly states FreeSync panels begin exhibiting ghosting below 60Hz.
https://translate.google.com/translate?sl=auto&...
You can thank me later for saving you $500+ on a 1st Gen FreeSync panel with ghosting problems that would frankly make it unusable to most.
I am sure we will see more reviews once reviewers actually get samples they can keep and aren't forced to use in AMD's controlled test environment...
silverblue - Saturday, March 21, 2015 - link
I said plural only because you implied it. More links still required.
chizow - Saturday, March 21, 2015 - link
You can use Google just as well as I can, I am sure. 2 links are plenty, especially when they actually give photo evidence of the problem.
chizow - Monday, March 23, 2015 - link
Another confirmation of the problem from Forbes, with quotes from Petersen:
http://www.forbes.com/sites/jasonevangelho/2015/03...
The fact Jarred and AT not only missed this, but have actively made excuses/denied it is pretty appalling as more and more reviewers are making note of the ghosting and flickering problems with FreeSync.
anubis44 - Tuesday, March 24, 2015 - link
@chizow: "Read some actual reviews that clearly show right now FreeSync is the inferior solution."You mean, as opposed to all the 'virtual' reviews we've been reading that all say the same thing: that FreeSync does exactly what G-Sync does, except for free? Clearly, a green goblin lover like yourself must respect techreport.com, a blatantly pro-nVidia website plain and simple, and even their response to FreeSync? I quote Scott Wasson:
"Now, I have only had a few hours with the BenQ XL2730Z (our review sample arrived yesterday afternoon), but my first impressions are just this: AMD has done it. They've replicated that sense of buttery smooth animation that a fast G-Sync display will get you, and they've done it in a way that squeezes the extra costs out of the monitors. This is a very good thing."
I'll repeat that operative part of the quote so you can't possibly overlook or ignore it. AMD has "DONE IT. They've REPLICATED THAT SENSE OF BUTTERY SMOOTH ANIMATION that a fast G-Sync display will get you, and they've done it in a way that SQUEEZES THE EXTRA COSTS OUT OF THE MONITORS."
This is all that the vast legions of gamers are going to care about: same buttery smooth performance, less cost. QED.
chizow - Tuesday, March 24, 2015 - link
Yeah, they are virtual reviews, done in controlled test environments in a limited period of time, just as noted in that review. Now that actual live samples are coming in, however, a LOT of problems are creeping up outside of the controlled test environments.
And before you attack the author's credibility, as I'm sure you will, keep in mind this guy has been a huge AMD advocate in the past:
http://www.forbes.com/sites/jasonevangelho/2015/03...
"Note to readers: I have seen this firsthand on the Acer FreeSync monitor I’m reviewing, and PC Perspective noticed it with 2 additional FreeSync monitors, the BenQ XL2730Z and LG 34UM67. To illustrate the problem they recorded the aforementioned monitors running AMD’s Windmill demo, as well as the same demo running on a G-Sync enabled Asus ROG Swift. Ignore the stuttering you see (this is a result of recording at high speed) and pay attention to the trailing lines, or ghosting. I agree that it’s jarring by comparison."
anubis44 - Tuesday, March 24, 2015 - link
@maximumGPU: Oh, but chizow DOES believe it. He believes that the green goblin on the box is the hallmark of quality, and that nVidia NEVER makes any mistakes, NEVER rips off its customers, and NEVER cheats or lies, or acts deceptively, because 'it's all business'. In other words, he's a sociopath. He BELIEVES G-Sync MUST be better than FreeSync because there's a green goblin-labelled chip in the G-Sync monitor that MUST be doing SOMETHING the FreeSync monitor just cannot do. It just HAS to be better, because he PAID for it. LOL
chizow - Tuesday, March 24, 2015 - link
Who cares about all that noise? They still offer the best product at any given price point if you can afford a slightly better product. Only sycophants like you go into the stupid and inane morality of it, the same way you begrudge Intel for putting their boot to AMD's neck in the CPU market.
Buy what gives you the best product for your money and needs; who cares about the rest.
medi03 - Thursday, March 19, 2015 - link
G-Sync is a shameless attempt to ban competition.
AMD couldn't use it even if it were willing to PAY for it.
On the contrary, FreeSync, being VESA standard, can be freely used by any GPU manufacturer, including Intel and nVidia.
Frenetic Pony - Thursday, March 19, 2015 - link
It's adaptive sync but worse in every way. What is possibly good about it?
testbug00 - Thursday, March 19, 2015 - link
Huh? It's just the brand that AMD is putting on adaptive sync for their GPUs. ???
farhadd - Friday, March 20, 2015 - link
I would never buy a Gsync monitor because I want GPU flexibility down the road. Period. Unfortunately, with no assurance from Nvidia that they might support Freesync down the road, I'm afraid to invest in a Freesync monitor either. Both sides lose.
just4U - Friday, March 20, 2015 - link
"Not sure why people are bashing Gsync"--------
It's not an open standard. That's the main reason to bash it.
JonnyDough - Monday, March 23, 2015 - link
NVidia can state what they want. So can Apple. Companies that refuse to create standards with other companies for the benefit of the consumer will pay the price with actual money. This is part of why I buy AMD and not Nvidia products. I got tired of the lies, the schemes, and the confusion that they like to throw at those that fund them.
medi03 - Wednesday, March 25, 2015 - link
Paying a $100+ premium, being locked in to a single manufacturer, having only a single input port on a monitor is "fantastic" indeed.
DigitalFreak - Thursday, March 19, 2015 - link
Your mom called. She said you need to clean your room.
anubis44 - Tuesday, March 24, 2015 - link
@DigitalFreak: So your response to an intelligently argued point about not supporting a company that imposes proprietary standards is to say 'your mom called?' You're clearly a genius, aren't you?
SetiroN - Thursday, March 19, 2015 - link
I'm disappointed AnandTech is forgetting this: there's a fundamental difference between G-Sync and FreeSync, which justifies the custom hardware and makes it worth it compared to something I'd rather go without: FreeSync requires V-Sync to be active and is plagued by the additional latency that comes with it, while G-Sync replaces it entirely.
Considering that the main reason anyone would want adaptive sync is to play decently even when the framerate dips to 30-40fps, where the 3 v-synced, pre-rendered frames account for a huge 30-40ms of latency, AMD can shove its free solution up its arse as far as I'm concerned.
I'd rather have tearing and no latency, or no tearing and acceptable latency at lower settings (to allow me to play at 60fps), both of which don't require a new monitor.
For the considerable benefit of playing at lower-than-60 fps with no tearing AND no additional latency, I'll gladly pay the nvidia premium (as soon as 4K 120Hz IPS is available).
FriendlyUser - Thursday, March 19, 2015 - link
Did you read the article? Of course not! VSync On or Off only comes into play when outside the refresh rate range of the monitor, and it is an option that G-Sync does not have. If you have G-Sync you are forced into VSync On when outside the refresh range of the monitor.
200380051 - Thursday, March 19, 2015 - link
No, it's the other way around. FreeSync does not require V-Sync; you can actually choose, and it will impact the stuttering/tearing or display latency. OTOH, G-Sync does exactly what you said: V-Sync on, when outside of the dynamic sync range. Read more carefully, pal.
eanazag - Thursday, March 19, 2015 - link
The AMD and Nvidia haters all come out of the woodwork for these types of articles.
Intel needs to chime in. I suspect they will go the FreeSync route since it is part of the spec and there are no costs.
I understand Nvidia has some investment here. I fully expect them to support adaptive sync - at least in 5 years. They really need to do something about PhysX. As a customer I see it as irrelevant. I know it isn't their style to open up their tech.
eddman - Thursday, March 19, 2015 - link
Not to go off-topic too much, but PhysX as a CPU physics engine, like Havok, etc., is quite popular. There are hundreds of titles out there using it and more are coming.
As for GPU PhysX, which is what you had in mind, yes, it'd never become widely adopted unless nvidia opens it up, and that would probably not happen unless someone else comes up with another, open GPU-accelerated physics engine.
mczak - Thursday, March 19, 2015 - link
Minor nitpick: Intel's solution won't be called FreeSync - that name is reserved for AMD-certified solutions. Pretty sure, though, that it's going to be technically the same, just using the adaptive sync feature of DP 1.2a.
(My guess would be that at some point in the future nvidia is going to follow suit, first with notebooks, because G-Sync is more or less impossible there. Though even then it will initially be restricted to notebooks which drive the display from the nvidia GPU, which aren't many; everything else is going to require Intel to support it first. I'm quite confident they are going to do this with desktop GPUs too, though I would suspect they'd continue to call it G-Sync. Let's face it, requiring a specific nvidia G-Sync module in the monitor just isn't going to fly with anything but the high-end gaming market, whereas adaptive sync should trickle down to a lot more markets, thus imho there's no way nvidia's position on this doesn't have to change.)
anubis44 - Tuesday, March 24, 2015 - link
@eanazag: nVidia will be supporting FreeSync about 20 minutes after the first hacked nVidia driver to support FreeSync makes it onto the web, whether they like it or not.
chizow - Tuesday, March 24, 2015 - link
Cool, I welcome it, one less reason to buy anything AMD related.
chizow - Thursday, March 19, 2015 - link
There's no need to be disappointed, honestly; Jarred just copy/pasted half of AMD's slide deck and then posted a Newegg Review. Nothing wrong with that, Newegg Reviews have their place in the world, it's just unfortunate that people will take his conclusions and actually believe FreeSync and G-Sync are equivalents, when there are already clear indications this is not the case.
- 40 to 48Hz minimums are simply unacceptable thresholds before things start falling apart, especially given many of these panels are higher than 1080p. A 40 FPS minimum at 4K, for example, is DAMN hard to accomplish; in fact the recently launched Titan X can't even do it in most games. CrossFireX isn't going to be an option either until AMD fixes FreeSync + CF, if ever.
- The tearing/ghosting/blurring issues at low frame rates are significant. AMD mentioned issues with pixel decay causing problems at low refresh, but honestly, this alone shows us G-Sync is worth the premium because it is simply better. http://www.pcper.com/files/imagecache/article_max_...
Jarred has mused multiple times that these panels may use the same panel as the one in the Swift, so why are the FreeSync panels failing so badly at low refresh? Maybe that G-Sync module is actually doing something, like actively syncing with the monitor to force overdrive without breaking the kind of guesswork frame sync FreeSync is using?
- Input lag? We can show AMD's slide and take their word for it without even bothering to test? High-speed camera, USB input double-attached to a mouse, scroll and see which one responds faster. FreeSync certainly seems to work within its supported frequency bands in preventing tearing, but that was only half of the problem related to Vsync on/off. The other trade-off for Vsync ON was how much input lag it introduced.
- A better explanation of Vsync On/Off and tearing? Is this something the driver handles automatically? Is Vsync being turned on and off by the driver dynamically, similar to Nvidia's Adaptive Vsync? When it is on, does it introduce input lag?
In any case, AnandTech's Newegg Review of FreeSync is certainly a nice preview and proof of concept, but I wouldn't take it as more than that. I'd wait for actual reviews that cover the aspects of display technology that actually matter, like input lag, blurring, image retention, etc., that can only really be captured and quantified with equipment like high-speed cameras and a sound testing methodology.
at80eighty - Thursday, March 19, 2015 - link
Waaachizow - Thursday, March 19, 2015 - link
Another disappointed AMD user, I see. I agree, FreeSync certainly isn't as good as one might have hoped.
at80eighty - Friday, March 20, 2015 - link
I've had more nvidia cards than amd; so keep trying.
chizow - Friday, March 20, 2015 - link
Doubt it, but keep trying.
5150Joker - Friday, March 20, 2015 - link
Newegg review, LOL! In defense of Jarred, he's probably working within the confines of the equipment made available to him by the parent company of this place. TFTCentral and PRAD have really expensive equipment they use to quantify the metrics in their reviews.
chizow - Thursday, March 19, 2015 - link
G-Sync isn't going anywhere, but it's nice to see AMD provide their fans with an inferior option as usual. Works out well, given their customers are generally less discerning anyway. Overall it's a great day for AMD fans, who can finally enjoy the tech they've been downplaying for the 18 months since G-Sync was announced.
Black Obsidian - Thursday, March 19, 2015 - link
AMD offers an option that's indistinguishable in actual use from nVidia's, and significantly cheaper to boot. Sure, it's not enough for the "discerning" set who are willing to pay big premiums for minuscule gains just so they can brag that they have the best, but who other than nVidia stockholders cares who gets to fleece that crowd?
Frankly, I wish that AMD could pull the same stunt in the CPU market. Intel could use a price/performance kick in the pants.
chizow - Thursday, March 19, 2015 - link
Well, unfortunately for less discerning customers, the type who would just take such a superficial review as gospel to declare equivalency, the issues with input lag come down to minor differences that are not easily quantified or identified, but over thousands of frames the differences are much more apparent.
If you're referring to differences in FPS charts, you've already failed to see the value Nvidia provides to end-users in their products, as graphics cards have become much more than just spitting out frames on a bar graph. FreeSync and G-Sync are just another example of this, going beyond the "minuscule gains" vs. price tag that less discerning individuals might prioritize.
Ranger101 - Friday, March 20, 2015 - link
My heartfelt thanks to chizow. Your fanboy gibberings are a constant source of amusement :)
chizow - Friday, March 20, 2015 - link
Np, without the nonsense posted by AMD fanboys there wouldn't be a need to post at all!
Black Obsidian - Friday, March 20, 2015 - link
You're so discerning that I'm sure you could wax poetical on how your $3K monocrystalline speaker cables properly align the electrons to improve the depth of your music in ways that aren't easily quantifiable.
chizow - Friday, March 20, 2015 - link
No, but I can certainly tell you why G-Sync and the dozen or so other value-add features Nvidia provides for their graphics cards make them a better solution for me and the vast majority of the market.
silverblue - Friday, March 20, 2015 - link
A dozen? Please.
No really, I mean PLEASE tell us this "dozen or so other features".
chizow - Friday, March 20, 2015 - link
Np, always nice mental exercise reminding myself why I prefer Nvidia over AMD:
1. G-Sync
2. 3D Vision (and soon VRDirect)
3. PhysX
4. GeForce Experience
5. Shadowplay
6. Better 3rd party tool support (NVInspector, Afterburner, Precision) which gives control over SLI/AA settings in game profiles and overclocking
7. GameWorks
8. Better driver support and features (driver-level FXAA and HBAO+), profiles as mentioned above, better CF profile and Day 1 support.
9. Better AA support, both in-game and forced via driver (MFAA, TXAA, and now DSR)
10. Better SLI compatibility and control (even if XDMA and CF have come a long way in terms of frame pacing and scaling).
11. Better game bundles
12. Better vendor partners and warranty (especially EVGA).
13. Better reference/stock cooler, acoustics, heat etc.
Don't particularly use these but they are interesting to me at either work or in the future:
14. CUDA, we only use CUDA at work, period.
15. GameStream. This has potential but not enough for me to buy a $200-300 Android device for PC gaming, yet.
16. GRID. Another way to play your PC games on connected mobile devices.
Damn, was that 16? No sweat.
silverblue - Saturday, March 21, 2015 - link
I can certainly let you off most of those, but third-party activities shouldn't count, so you can subtract 6 and 12. Additionally, 13 can be picked apart, as the 295X2 showed that AMD can present a high quality cooler, and because I believe lumping the aesthetic qualities of a cooler in with heat and noise is a partial falsehood (admit it - you WILL have been thinking of metal versus plastic shrouds). I also don't agree with you on 11; at least, not if you move back past the 2XX generation, as AMD had more aggressive bundles back then. 8 is subjective but NVIDIA usually gets the nod here.
Also, some of your earlier items are proprietary tech, which I could always tease you about, as it's not as if they couldn't license any of this out. ;)
I'll hand it to you and credit you with your dozen.
chizow - Saturday, March 21, 2015 - link
And I thank you for not doing the typical dismissive approach of "Oh I don't care about those features" that some on these forums might respond with.
I would still disagree on 6 and 12 though, ultimately they are still a part of Nvidia's ecosystem and end-user experience, and in many cases, Nvidia affords them the tools and support to enable and offer these value-add features. 3rd party tools for example, they specifically take advantage of Nvidia's NVAPI to access hardware features via driver and Nvidia's very transparent XML settings to manipulate AA/SLI profile data. Similarly, every feature EVGA offers to end users has to be worth their effort and backed by Nvidia to make business sense for them.
And 13, I would absolutely disagree on that one. I mean, we see the culmination of Nvidia's cooling technology, the Titan NVTTM cooler, which is awesome. Having to resort to a triple-slot, water-cooled solution for a high-end graphics card is a terrible precedent imo and a huge barrier to entry for many, as you need additional case mounting and clearance, which could be a problem if you already have a CPU CLC as many do. But that's just my opinion.
AMD did make a good effort with their Gaming Evolved bundles and certainly offered better than Nvidia for a brief period, but it's pretty clear their marketing dollars dried up around the same time they cut that BF4 Mantle deal, and their current financial situation hasn't allowed them to offer anything compelling since. But I stand by that bullet point: Nvidia typically offers the more relevant and attractive game bundle at any given time.
One last point in favor of Nvidia is Optimus. I don't use it at home, as I have no interest in "gaming" laptops, but it is a huge benefit there. We do have it on powerful laptops at work, however, and the ability to "elevate" an application to the Nvidia dGPU on command is a huge benefit there as well.
anubis44 - Tuesday, March 24, 2015 - link
@chizow: But hey kids, remember, after reading this 16-point PowerPoint presentation where he points out the superiority of nVidia using detailed arguments like "G-Sync" and "GRID" as strengths, chizow DOES NOT WORK FOR nVidia! He is not sitting in the marketing department in Santa Clara, California, with a group of other marketing mandarins running around, grabbing factoids for him to type in as responses to chat forums. No way!
Repeat after me, 'chizow does NOT work for nVidia.' He's just an ordinary, everyday psychopath who spends 18 hours a day at the keyboard responding to every single criticism of nVidia, no matter how trivial. But he does NOT work for nVidia! Perish the thought! He just does it out of his undying love for the green goblin.
chizow - Tuesday, March 24, 2015 - link
But hey, remember AMD fantards, there's no reason the overwhelming majority of the market prefers Nvidia, those 16 things I listed don't actually mean anything if you prefer a subpar product and don't demand better, and you continually choose to ignore the obvious: one product supports more features and the other doesn't. But hey, just keep accepting subpar products and listening to AMD fanboys like anubis44; don't give in to the reality the rest of us all accept as fact.
sr1030nx - Saturday, March 21, 2015 - link
Only if they were NVIDIA-branded speaker cables 😉
Darkito - Friday, March 20, 2015 - link
False
Darkito - Friday, March 20, 2015 - link
False, it's indistinguishable "Within the supported refresh rate range" as per this review. What happens outside the VRR window however, and especially under it, is incredibly different. With G-Sync, if you get 20 fps it'll actually duplicate frames and tune the monitor to 40Hz, which means smooth gaming at sub-30Hz refresh rates (well, as smooth as 20fps can be). With FreeSync, it'll just fall back to v-sync on or off, with all the stuttering or tearing that involves. That means that if your game ever falls below the VRR window on FreeSync, image quality falls apart dramatically. And according to PCPer, this isn't just something AMD can fix with a driver update, because it requires the frame buffer and logic on the G-Sync module!
http://www.pcper.com/reviews/Displays/AMD-FreeSync...
Take note that the LG panel tested actually has a VRR window lower bound of 48Hz, so image quality starts falling apart if you dip below 48fps, which is clearly unacceptable.
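As an aside, here is a minimal sketch of the frame-multiplication behavior described above. This is not NVIDIA's actual algorithm, just the general idea, and it assumes a hypothetical panel with a 40-144Hz variable refresh window:

    # Illustrative only: when the game's frame rate drops below the panel's
    # minimum VRR rate, redraw each frame an integer number of times so the
    # panel itself stays inside its supported refresh window.
    def effective_refresh(fps, vrr_min=40.0, vrr_max=144.0):
        if fps >= vrr_max:
            return 1, vrr_max              # cap at the panel maximum
        multiplier, refresh = 1, fps
        while refresh < vrr_min:           # double, triple, ... as needed
            multiplier += 1
            refresh = fps * multiplier
        return multiplier, refresh

    for fps in (20, 25, 35, 45, 60, 120):
        m, hz = effective_refresh(fps)
        print(f"{fps:3d} fps -> redraw each frame {m}x, panel runs at {hz:.0f} Hz")

At 20 fps this works out to each frame being shown twice with the panel at 40Hz, which matches the behavior described in the PCPer piece; a panel without that logic simply drops back to v-sync on or off.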
AdamW0611 - Sunday, March 22, 2015 - link
Yah, just like "R.I.P DirectX, Mantle will rule the day" - and now AMD is telling developers to ignore Mantle. G-Sync is great, and those of us who prefer drivers being updated the same day games are released will stick with Nvidia, while months later AMD users will be crying that games still don't work right for them.
anubis44 - Tuesday, March 24, 2015 - link
Company of Heroes 2 worked like shit for nVidia users for months after release, while my Radeon 7950 was pulling more FPS than a Titan card. To this day, Radeons pull better and smoother FPS than equivalently priced nVidia cards in this, my favourite game. The GTX 970 is still behind the R9 290 today. Is that the 'same day' nVidia driver support you're referring to?
chizow - Tuesday, March 24, 2015 - link
More BS from one of the biggest AMD fanboys on the planet, an AMD *CPU* fanboy no less. CoH2 ran faster on Nvidia hardware from Day 1, and also runs much faster on Intel CPUs, so yeah, as usual, you're running the slower hardware in your favorite game simply because you're a huge AMD fanboy.
http://www.techspot.com/review/689-company-of-hero...
http://www.anandtech.com/show/8526/nvidia-geforce-...
ggg000 - Thursday, March 26, 2015 - link
Freesync is a joke:
https://www.youtube.com/watch?feature=player_embed...
https://www.youtube.com/watch?v=VJ-Pc0iQgfk&fe...
https://www.youtube.com/watch?v=1jqimZLUk-c&fe...
https://www.youtube.com/watch?feature=player_embed...
https://www.youtube.com/watch?v=84G9MD4ra8M&fe...
https://www.youtube.com/watch?v=aTJ_6MFOEm4&fe...
https://www.youtube.com/watch?v=HZtUttA5Q_w&fe...
ghosting like hell.
Mangolao - Friday, May 22, 2015 - link
Heard FreeSync still has issues... on TFT Central, the G2G is actually at 8.5ms instead of 1ms when connected with FreeSync over DP... http://www.tftcentral.co.uk/reviews/benq_xl2730z.h...
cgalyon - Thursday, March 19, 2015 - link
Good review, helps clear up a lot with respect to these new features. I've long thought that achieving a sufficiently high FPS and refresh rate would take care of things, but it's not always possible to do that with how games have pushed the limits of card abilities.
I'm still on the fence about whether or not I should upgrade my monitor. These days I do a lot of my gaming on my TV by running it through my AV receiver. However, there are some games (like Civilization V) that just don't translate well to a couch-based experience.
Owls - Thursday, March 19, 2015 - link
So G-Sync is overhyped garbage? Who would have thought?
invinciblegod - Thursday, March 19, 2015 - link
Based on that statement, freesync is also garbage.
LancerVI - Thursday, March 19, 2015 - link
I wouldn't say gsync was garbage. I would say gsync was DRM. Expensive DRM at that.
FriendlyUser - Thursday, March 19, 2015 - link
At least it's not overhyped and costs less.
fatpenguin - Thursday, March 19, 2015 - link
What's Intel's plan? Will they be supporting this in the future?
duploxxx - Thursday, March 19, 2015 - link
Intel will bring another version: INTnosync, since the iGPU is not capable of running anything at decent FPS on the display res shipping today.....
MikeMurphy - Thursday, March 19, 2015 - link
Which is precisely why Intel will benefit most from adaptive sync.
Denithor - Friday, March 20, 2015 - link
LOL
jimjamjamie - Thursday, March 19, 2015 - link
They teamed up with Microsoft to create their own adaptive sync solution appropriate for Intel HD Graphics. It's called PowerPoint.
Refuge - Thursday, March 19, 2015 - link
Bwuahahaha!!!! +1 good sir!
Crunchy005 - Friday, March 20, 2015 - link
Oh man, that was a beautiful comment, +1.
sr1030nx - Saturday, March 21, 2015 - link
Shouldn't have been drinking when I read this, haha.
ltcommanderdata - Thursday, March 19, 2015 - link
I guess the likelihood of someone coming out with a 16:10 FreeSync IPS monitor is pretty low? Sadly it seems like 16:10 monitors in general are becoming an endangered species these days.
Sanidin - Thursday, March 19, 2015 - link
I would think you'd have a better chance of a 16:10 with FreeSync than G-Sync, seeing as the scalers all support it, it's a free standard, etc.
nathanddrews - Thursday, March 19, 2015 - link
Agreed, on Newegg right now 98% of 16:10 displays use DP while only 40% of 16:9 displays do, so that could mean the likelihood is greater... but that 40% makes up a larger pool of monitors overall. However, with 2160p (16:9) and 21:9 displays on the rise for gaming, that probably means fewer 16:10 displays will be produced in general.
TL;DR: There will be at least one, someday.
maecenas - Thursday, March 19, 2015 - link
I think NVIDIA made a mistake in closing off G-sync. The death of Betamax should have taught them that you really can't go it alone in developing industry standards.
If they opened it up to monitor manufacturers, but still required GPU makers (AMD) to pay a royalty on that side, they might have gotten a lot more adoption by now and may have made the roll-out of FreeSync very difficult for AMD.
chizow - Friday, March 20, 2015 - link
And the life of Blu-Ray should tell you that going it alone and taking everyone else with you works just fine?
Hrel - Thursday, March 19, 2015 - link
I'm so sick of this proprietary shit. When companies want to do something new, why not work with the other companies in the industry to come up with the best solution for the problem and make it universal? Customers NEVER jump onto a technology that only works from one company; we aren't going to commit to a monopoly. I get that they want a competitive edge in the market, but it literally never works that way. What happens is they spend all this money on R&D, marketing, and prototyping, waste everyone's time with marketing and reviews, only to have a handful of people pick it up (probably too few to even break even), and then it stops growing right there until a universal standard comes out that the industry adopts as a whole.
Just fucking stop wasting everyone's time and money and choose cooperation from the start!
HunterKlynn - Thursday, March 19, 2015 - link
For the record, AMD's solution here *is* an attempt at an open standard. GSync is the proprietary one.
dragonsqrrl - Thursday, March 19, 2015 - link
...
Black Obsidian - Thursday, March 19, 2015 - link
It's hard to be more open than being an official (albeit optional) part of the DisplayPort spec itself.
DominionSeraph - Thursday, March 19, 2015 - link
I'm sure AMD will be happy to give you the masks to the 290X and 390X, if you only ask. "I'm sick of there only being two players in the market. Why don't you let me in? I'm sure I could sell your products for less than your prices!"
lordken - Thursday, March 19, 2015 - link
Next time you try to make someone look stupid, try not to look like a fool yourself. Another dogma believer who thinks the world will end without patents. Same as the copyright believers who think music & entertainment would stop existing without copyright...
If you think about it, you can see that patents can actually slow down technology advancement quite a bit. You can even see that today with Intel CPUs: AMD cannot catch up, CPUs are patent-locked, and we are left to be milked by Intel with minimal performance gains between generations. If AMD had better CPUs, or could simply copy the good parts from Intel and put them into their design today, imho we would be much further along in performance. Also, look around you and see that 3D printing boom? Guess what: a few years back and last year patents expired and allowed this. Yes, 3D printing was invented 30 years ago, yet it only gets to your desk today. So much for patent believers.
Btw, even if AMD gave you their blueprints, what would you do? Start selling an R390X tomorrow, right? Manufactured out of thin air. By the time you could sell an R390X we would be at the 590 generation. Possibly only Nvidia/Intel would be able to benefit from it sooner (which isn't necessarily a bad thing).
lordken - Thursday, March 19, 2015 - link
@Hrel: a) because most corporations are run by greedy bastards, imho; b) today's managers of said corporations can't employ common sense and are disconnected from reality, making stupid/bad decisions. I see it in the big corporation I work for... so using your brain and a "pro-consumer" way of thinking is forbidden.
Flunk - Thursday, March 19, 2015 - link
Please just support the DisplayPort Adaptive-Sync standard, Nvidia; your G-Sync implementation doesn't have any benefits over it. You don't need to call it FreeSync, but we all need to settle on one standard, because if I have to be locked in to one brand of GPU based on my monitor, it's not going to be the one that isn't using industry standards.
Murloc - Thursday, March 19, 2015 - link
A monitor lasts much longer than a GPU and costs more too for most users out there, so yeah, standards win.
They can call it Adaptive-Sync, as that's what it is: a DisplayPort standard.
praeses - Thursday, March 19, 2015 - link
It would be interesting to see input lag comparisons, and videos of panning scenes in games that would typically cause tearing, captured at high speed and played back in slow motion.
YukaKun - Thursday, March 19, 2015 - link
Until we have video showing the two of them running side by side, we can't declare a winner. There might be a lot of metrics for measuring "tearing", but this is not about "hard metrics"; it's about how the bloody frame sequences look on your screen. Smooth or not.
Cheers!
eddman - Thursday, March 19, 2015 - link
The difference cannot be shown on video. How can a medium like video, which has a limited and constant frame rate, be used to demonstrate a dynamic, variable frame rate technology?
This is one of those scenarios where you can only experience it on a real monitor.
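To put that point in concrete terms, here is a rough sketch (with made-up frame times) of what happens when variable-rate output is captured at a fixed 60 fps: every frame gets snapped to the capture grid, so the variable timing being demonstrated is lost.

    # Hypothetical VRR frame times (ms) resampled onto a fixed 60 fps capture.
    capture_dt = 1000.0 / 60.0                 # ms per captured frame
    frame_times = [16.7, 21.3, 18.2, 25.0, 14.9, 19.6]

    t = 0.0
    for ft in frame_times:
        t += ft
        shown_at = (int(t / capture_dt) + 1) * capture_dt   # next capture slot
        print(f"frame ready at {t:6.1f} ms, appears in the video at {shown_at:6.1f} ms"
              f" (error {shown_at - t:4.1f} ms)")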
Murloc - Thursday, March 19, 2015 - link
Putting it on video makes the comparison kinda useless.
invinciblegod - Thursday, March 19, 2015 - link
I am one of those who switch every time I upgrade my GPU (which is every few years). Sometimes AMD is on top, while other times Nvidia is better. Now I must be locked into one forever or buy 6 monitors (3 for Eyefinity and 3 for Nvidia Surround)!
jackstar7 - Thursday, March 19, 2015 - link
If they can put out a confirmed 1440p 21:9 w/Freesync they will get my money. The rumors around the Acer Predator are still just rumors. Please... someone... give me the goods!
Black Obsidian - Thursday, March 19, 2015 - link
It's pretty likely that LG will do just that. They already make two 1440p 21:9 monitors, and since it sounds like FreeSync will be part of new scalers going forward, you can probably count on the next LG 1440p 21:9 picking up that ability.
xthetenth - Thursday, March 19, 2015 - link
I'm right there with you. I'm already preparing to get the update on the LG 1440 21:9 and a 390X, because if the latter is anything like the rumors, it's going to be fantastic, and after getting a 21:9 for work I can't make myself use any other resolution.
Black Obsidian - Thursday, March 19, 2015 - link
Same deal here. If nVidia supported FreeSync and priced the Titan X (or impending 980 Ti) in a more sane manner I'd consider going that way because I have no great love for either company.
But so long as they expect to limit my monitor choices to their price-inflated special options and pretend that $1K is a reasonable price for a flagship video card, they've lost my business to someone with neither of those hangups.
kickpuncher - Thursday, March 19, 2015 - link
I have no experience with 144Hz screens. I've been waiting for FreeSync to come, but you're saying the difference is negligible with a static 144Hz monitor? Is that at any FPS, or does the FPS also have to be very high? (In regards to the 4th paragraph on the last page.) Thanks
JarredWalton - Thursday, March 19, 2015 - link
I'd have to do more testing, but 144Hz redraws the display every 6.9ms compared to 60Hz redrawing every 16.7ms. With pixel response times often being around 5ms in the real world (not the marketing claims of 1ms), the "blur" between frames will hide some of the tearing. And then there's the fact that things won't change as much between frames that are 7ms apart compared to frames that are 17ms apart.
Basically, at 144Hz tearing can still be present but it ends up being far less visible to the naked eye. Or at least that's my subjective experience using my 41-year-old eyes. :-)
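For reference, a quick sketch of the numbers behind that: the refresh interval at common rates, plus the half/third-rate steps that traditional V-sync falls back to on a 144Hz panel (the 144Hz figures follow from the arithmetic; the rest of the comment above is subjective experience).

    # Refresh interval at common rates, plus the V-sync fallback steps at 144Hz.
    for hz in (60, 75, 120, 144):
        print(f"{hz:3d} Hz -> {1000.0 / hz:5.1f} ms per refresh")

    print("144Hz V-sync fallback:", ", ".join(f"{144 // d} Hz" for d in (1, 2, 3, 4)))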
Midwayman - Thursday, March 19, 2015 - link
If you have a display with backlight strobing (the newest LightBoost, BenQ blur reduction, etc.), the difference is readily apparent. Motion clarity is way, way better than without. The issue is it's like a CRT, and strobing is annoying at low rates; 75Hz is about the absolute minimum, but 90Hz and above are better. I doubt any of the displays support strobing and adaptive sync at the same time currently, but when you can push the frames, it's totally worth it. The new BenQ mentioned in the article will do both, for example (maybe not at the same time). That way you can have adaptive sync for games with low FPS and strobing for games with high FPS.
Games at 100+ FPS look much smoother. Think of it like perfect motion blur. If you can keep your game between 72-144 Hz it's gaming nirvana.
eddman - Thursday, March 19, 2015 - link
I'm not a fan of closed, expensive solutions, but this hate towards G-Sync that some here are showing is unwarranted.
Nvidia created G-Sync at a time when no alternative existed, so they built it themselves, and it works. No one was/is forced to buy it.
It was the only option and those who had a bit too much money or simply wanted the best no matter what, bought it. It was a niche market and nvidia knew it.
IMO, their mistake was to make it a closed, proprietary solution.
Those consumers who were patient can now enjoy a cheaper and, in certain aspects, better alternative.
Now that DP Adaptive-Sync exists, Nvidia will surely drop the G-Sync hardware and introduce a DP-compatible software G-Sync. I don't see anyone buying a hardware G-Sync monitor anymore.
Murloc - Thursday, March 19, 2015 - link
You don't understand the hate because you think nvidia will drop g-sync immediately.
It's likely you're right, but it's not a given.
Maybe it will be a while before the market forces nvidia to support adaptive sync.
MikeMurphy - Thursday, March 19, 2015 - link
nVidia will protect manufacturers that invested resources into G-Sync. They will continue support for G-Sync and later introduce added support for Freesync.
ddarko - Thursday, March 19, 2015 - link
The fact that only AMD cards work with Freesync now is not because Freesync is closed but because Nvidia refuses to support it. It takes a perverse kind of Alice in Wonderland logic to use the refusal of a certain company to support an open standard in its hardware as proof that the open standard is in fact "closed."
Freesync is open because it is part of the "open" Displayport standard and any display and GPU maker can take advantage of it by supporting that relevant Displayport standard (because use of the Displayport standard that Freesync is part of is free). Nvidia's Gsync is "closed" because Nvidia decides who and on what terms gets to support it.
Whatever the respective technical merits of Freesync and Gsync, please stop trying to muddy the water with sophistry about open and closed. Nvidia GPUs could work with Freesync monitors tomorrow if Nvidia wanted it - enabling Freesync support wouldn't cost Nvidia a dime in licensing fees or require the permission of AMD or anyone else. The fact that they choose not to support it is irrelevant to the definition of Displayport 1.2a (of which Freesync is a part) as an open standard.
mrcaffeinex - Thursday, March 19, 2015 - link
Are NVIDIA's partners able to modify their cards' BIOS and/or provide customized drivers to support FreeSync, or do they have to rely on NVIDIA to adopt the feature? I know different manufacturers have made custom cards in the past with different port layouts and such. I never investigated to see if they required a custom driver from the manufacturer, though. Is it possible that this could be an obstacle that an EVGA, ASUS, MSI, etc. could overcome on their own?
It would at the very least require driver-level modifications, which the card manufacturers wouldn't be able to provide.
chizow - Thursday, March 19, 2015 - link
How is this even remotely a fact when AMD themselves have said Nvidia can't support FreeSync, and even many of AMD's own cards in relevant generations can't support it? Certainly Nvidia has said they have no intention of supporting it, but there's also the possibility AMD is right and Nvidia can't support it.
So in the end, you have two effectively closed and proprietary systems, one designed by AMD, one designed by Nvidia.
iniudan - Thursday, March 19, 2015 - link
Nvidia cannot use FreeSync, as it is AMD's implementation of VESA's Adaptive-Sync; they have to come up with their own implementation of the specification.
Are you sure? They only have to come up with a different name (if they want). Just as both AMD/Nvidia call and use "DisplayPort" as DisplayPort, they didn't have to come up with their own implementations of it, since DP is standardized by VESA, so they used that.
Or am I missing the point of what you wanted to say?
The question is whether it becomes a core/regular part of, let's say, DP 1.4 onwards, as right now it is only optional (aka 1.2a) and not even in DP 1.3 - if I understand that correctly.
iniudan - Thursday, March 19, 2015 - link
Well, the implementation needs to be in their driver; they're not gonna give that to Nvidia. =p
chizow - Thursday, March 19, 2015 - link
So it is also closed/proprietary on an open spec? Gotcha, so I guess Nvidia should just keep supporting their own proprietary solution. Makes sense to me.
ddarko - Thursday, March 19, 2015 - link
You know repeating a falsehood 100 times doesn't make it true, right?
chizow - Tuesday, March 24, 2015 - link
You mean like repeating FreeSync can be made backward compatible with existing monitors with just a firmware flash, essentially for Free? I can't remember how many times that nonsense was tossed about in the 15 months it took before FreeSync monitors finally materialized.
Btw, it is looking more and more like FreeSync is a proprietary implementation based on an open spec, just as I stated. FreeSync has recently been trademarked by AMD, so there's not even a guarantee AMD would allow Nvidia to enable their own version of Adaptive-Sync on FreeSync (TM) branded monitors.
ddarko - Thursday, March 19, 2015 - link
From the PC Perspective article you've been parroting around like gospel all day today:
"That leads us to AMD’s claims that FreeSync doesn’t require proprietary hardware, and clearly that is a fair and accurate benefit of FreeSync today. Displays that support FreeSync can use one of a number of certified scalars that support the DisplayPort AdaptiveSync standard. The VESA DP 1.2a+ AdaptiveSync feature is indeed an open standard, available to anyone that works with VESA while G-Sync is only offered to those monitor vendors that choose to work with NVIDIA and purchase the G-Sync modules mentioned above."
http://www.pcper.com/reviews/Displays/AMD-FreeSync...
That's the difference between an open and closed standard, as you well know but are trying to obscure with FUD.
chizow - Friday, March 20, 2015 - link
@ddarko, it says a lot that you quote the article but omit the actually relevant portion:
"Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."
And more on that G-Sync module AMD claims isn't necessary (but we in turn have found out a lot of what AMD said about G-Sync turned out to be BS even in relation to their own FreeSync solution):
"But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."
In summary, AMD's own proprietary spec just isn't as good as Nvidia's.
Crunchy005 - Friday, March 20, 2015 - link
AMD's spec is not proprietary, so stop lying. Also, I love how you just quoted in context what you quoted out of context in an earlier comment. The only argument you have against FreeSync is ghosting, and as many people have pointed out, it is not an inherent issue with FreeSync but with the monitors themselves. The example given there shows three different displays that are all affected differently. The LG and BenQ both show ghosting differently but use the same FreeSync standard, so something else is different here, not FreeSync. On top of that, the LG is $100 less than the Asus and the BenQ $150 less, for the same features and more inputs. I don't see how a better, more well-rounded monitor that can offer variable refresh rates with more features and is cheaper is a bad thing. From the consumer side of things that is great! A few ghosting issues that I'm sure are hardly noticeable to the average user are not a major issue. The videos shown there are taken at a high frame rate and slowed down, then put into a compressed format and thrown on YouTube in what is a very jerky, hard-to-see video - great example for your only argument. If the tech industry could actually move away from proprietary/patented technology, and maybe try to actually offer better products and not "good enough" products that force customers into choosing and being locked into one thing, we could be a lot farther along.
chizow - Friday, March 20, 2015 - link
Huh? How do you know Nvidia can use FreeSync? I am pretty sure AMD has said Nvidia can't use FreeSync; if they decide to use something with DP 1.2a Adaptive-Sync they have to call it something else and create their own implementation, so clearly it is not an Open Standard as some claim.
And how is it not an issue inherent with FreeSync? Simple test that any site like AT that actually wants answers can do:
1) Run these monitors with Vsync On.
2) Run these monitors with Vsync Off.
3) Run these monitors with FreeSync On.
Post results. If these panels only exhibit ghosting in 3), then obviously it is an issue caused by FreeSync. Especially when these same panel makers (the 27" 1440p BenQ apparently uses the same AU Optronics panel as the ROG Swift) have panels on the market, both non-FreeSync and G-Sync, that have no such ghosting.
And again, you mention the BenQ vs. the Asus, well guess what? Same panel, VERY different results. Maybe it's that G-Sync module doing its magic, and that it actually justifies its price. Maybe that G-Sync module isn't bogus as AMD claimed, and it is actually the Titan X of monitor scalers and is worth every penny it costs over AMD FreeSync if it is successful at preventing the kind of ghosting we see on AMD panels, while allowing VRR to go as low as 1 FPS.
Just a thought!
Crunchy005 - Monday, March 23, 2015 - link
Same panel, different scalers. AMD just uses the standard built into DisplayPort; the scaler handles the rest, so it isn't necessarily FreeSync but the variable refresh rate technology in the scaler that would be causing the ghosting. So again, not AMD but the manufacturer.
"If these panels only exhibit ghosting in 3), then obviously it is an issue caused by FreeSync"
Haven't seen this and you haven't shown us either.
"Maybe its that G-Sync module doing its magic"
This is the scaler. So the scaler (not made by AMD) that supports the VRR standard AMD uses is what controls that panel, not FreeSync itself. Hence an issue outside of AMD's control. Stop lying and saying it is an issue with AMD. Nvidia fanboys lying; gotta keep them on the straight and narrow.
chizow - Monday, March 23, 2015 - link
Yes... the scaler that AMD worked with scaler makers to design, using the spec that AMD designed and pushed through VESA, lol. Again, it is hilarious that AMD and their fanboys are already blaming the scaler and monitor makers. This really bodes well for future development of FreeSync panels. /sarcasm
Viva La Open Standards!!!! /shoots_guns_in_air_in_fanboy_fiesta
So the AMD fanboys like you can't keep shoving your head in the sand and blaming Nvidia for your problems. Time to start holding AMD accountable, something AMD fanboys should've done years ago.
https://www.youtube.com/watch?v=-ylLnT2yKyA
ddarko - Thursday, March 19, 2015 - link
Is this where you have pulled the fanciful notion that Nvidia can't support fresync?http://techreport.com/news/25867/amd-could-counter...
That the display controller in Nvidia video cards doesn't support variable refresh intervals? First of all, that's an AMD executive's speculation on why Nvidia has to use an external module. It's never been confirmed to be true by Nvidia. If this is the source of your claim, then it's laughable - you're taking as gospel what an AMD exec says he thinks the hardware capabilities of Nvidia cards are. Whatever.
Second, even if it was true for argument's sake, that still means Freesync is open while Gsync is closed, because Nvidia can add display controller hardware support without anyone's approval or licensing fee. AMD or Intel cannot do the same with Gsync. It's really that simple.
Really, grasping at straws only weakens your arguments. Everyone understands what open and closed mean, and your attempts to creatively redefine them are a failure. The need to add hardware does not make a standard closed - USB 3.1 is an open standard even though vendors must add new chips to support it. It is open because every vendor can add those chips without paying a license fee to anyone else. Freesync is open to Nvidia, AMD, or Intel. Gsync is not. Case closed.
chizow - Friday, March 20, 2015 - link
Hey, AMD designed the spec; they should certainly know better than anyone what can and cannot support it, especially given MANY OF THEIR OWN RELEVANT GCN cards CANNOT support FreeSync. I mean, if it was such a trivial matter to support FreeSync, why can't any of AMD's GCN 1.0 cards support it? Good question, huh?
2nd part: for argument's sake, I honestly hope Nvidia supports FreeSync, because they can just keep supporting G-Sync as their premium option, allowing their users to use both monitors. Would be bad news for AMD however, as that would be one less reason to buy a Radeon card.
Crunchy005 - Friday, March 20, 2015 - link
Not really; Radeon cards compete well with Nvidia, the two leapfrog each other, and both offer better value depending on when you look at what is available. Also, the older GCN 1.0 cards most likely don't have the hardware to support it - like the unconfirmed Nvidia story above, I am assuming here. Nvidia created a piece of hardware that gets them more money and has to be added into a monitor, but it does support older cards. Hard to change an architecture that's already released. Nvidia did well by making it the way they did; it offered a larger selection of cards to use, even if at a higher price. But now that there is an open standard things will shift, and the next gen from AMD, I'm sure, will all support FreeSync, broadening the available cards for FreeSync. The fact that G-Sync still has a very limited number of very expensive monitors makes it a tough argument that it is in any way winning, especially when by next month FreeSync will have just as many options at a lower price. You just can't ever admit that AMD possibly did something well and that Nvidia is going to be fighting a very steep uphill battle to compete on this currently niche technology.
Also, let's add the fact that the article you posted works against you.
"The question now is: why is this happening and does it have anything to do with G-Sync or FreeSync?...It could be panel technology, it could be VRR technology or it could be settings in the monitor itself. We will be diving more into the issue as we spend more time with different FreeSync models.
For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area."
Even the article you posted and used for your argument says it is not FreeSync but the monitor technology itself. You have a knack for trying to pin anything that goes wrong on AMD when it is the monitor manufacturer that has caused this problem.
chizow - Friday, March 20, 2015 - link
So you do agree it is very well possible Nvidia can't actually support FreeSync, for the same reasons most of AMD's own GPUs can't support it? Is AMD just choosing to screw their customers by not supporting their own spec? Maybe they are trying to force you to buy a new AMD card so you can't see how bad FreeSync is at low FPS?
Last bit of the first paragraph: LOL, it's like AMD fans just don't get it. Nvidia, the market leader that holds ~70% of the dGPU market (check any metric: JPR, Steam survey, your next LAN party, online guild, whatever), that supports 100% of their GPUs since 2012 with G-Sync, and that has at least 8 panels on the market, has the uphill battle, while AMD, the one that is fighting for relevance, supports FreeSync on only a fraction of their ~30% dGPU share, has half the monitors, and ultimately has the worse solution? But somehow Nvidia has the uphill battle in this niche market? AMD fans willing to actually buy expensive niche equipment is as niche as it gets, lol.
And the last part: uh, no. They are asking a rhetorical question they hope to answer, and they do identify G-Sync and FreeSync as possible culprits with a follow-on catch-all of VRR, but as I said the test is simple enough on these panels: run VSync ON/OFF and FreeSync and see under what conditions the panel ghosts. I would be shocked if they ghost with a standard fixed refresh, so if the problem only exhibits itself with FreeSync enabled, it is clearly a problem inherent to FreeSync.
Also, if you read the comments by PCPer's co-editor Allyn, you will see he also believes it is a problem with FreeSync.
Crunchy005 - Saturday, March 21, 2015 - link
Well, if that is true - not saying it is - then that would be an issue with the VRR technology built into the monitor's scaler, not a FreeSync issue. FreeSync uses an open standard technology, so it is not FreeSync itself but the tech in the monitor, and how the manufacturer's scaler handles the refresh rates. Also, your argument that it falls apart at low refresh rates is again not really an AMD issue. The FreeSync implementation can go down as low as 9Hz, but since FreeSync relies on the manufacturer to make a monitor that can go that low, there is a limitation there. Obviously an open standard will have more issues than a more closed proprietary system, i.e. Mac vs. Windows. FreeSync has to support different hardware and scalers across the board and not the exact same proprietary thing every time.
The fact that the number of monitors available for G-Sync in 18 months has only reached 8 models should tell you something there. Obviously manufacturers don't want to build monitors that cost them more to build and that they have to sell at a higher price for a feature that is not selling much at all yet. But hey, the fact that there is a new standard being built into new monitors, one that can be supported by several large brand names, that costs them almost nothing extra to build, and that leaves the monitor usable with non-DisplayPort devices? Well, that opens up a much larger customer base for those monitors. Smart business there.
I'm sure as time goes on monitor manufacturers will build higher quality parts that will solve the ghosting issue, but the fact remains that AMD was able to accomplish essentially the same thing as Nvidia using an open standard that is working its way into monitors. Also, those monitors are still able to target a larger audience. Proprietary never wins out in cases like this.
G-Sync is not enough of a benefit over FreeSync to justify the cost, and in time FreeSync will be more widely adopted as companies who make monitors stop wanting to pay the extra to Nvidia. Although BenQ has a hybrid monitor for G-Sync, I could see a hybrid G-Sync/FreeSync monitor in the future (graphics agnostic, sort of).
Crunchy005 - Saturday, March 21, 2015 - link
Wow, spelling errors galore; I wrote this on my phone, so ya, autocorrect is stupid.
chizow - Saturday, March 21, 2015 - link
The fact that the number of G-Sync panels is only 8 reinforces what Nvidia has said all along: doing it right is hard, and instead of just hoping Mfgs. pick the right equipment (while throwing them under the bus like AMD is doing), Nvidia actually fine-tunes their G-Sync module for every single panel that gets G-Sync certified. This takes engineering time and resources.
And how are you sure things will get better? It took AMD 15 months to produce what we see today and it is clearly riddled with problems that do not plague G-Sync. What if the sales of these panels are a flop and there is a high RMA rate attached to them due to the ghosting issues we have seen? Do you think AMD's typical "hey, it's an open standard, THEY messed it up" motto is doing any favors, as they are already blaming their Mfg. partners for this problem? Honestly, take a step back and watch history repeat itself. AMD once again releases a half-baked, half-supported solution, and then when problems arise, they don't take accountability for it.
Also, it sounds like you are acknowledging these problems do exist, so would you, in good conscience, recommend that someone buy these, or wait for FreeSync panels 2.0, if they are indeed problems tied to hardware? Just wondering.
And how is G-Sync not worth the premium, given it does exactly as it said it would from Day 1, and has for some 18 months now without exhibiting these very same issues? Do you really think an extra $150 on a $600+ part is really going to make the difference if one solution provides what you want TODAY vs. the other that is a completely unknown commodity?
Just curious.
Crunchy005 - Monday, March 23, 2015 - link
It has one issue that you have pointed out, ghosting, so not riddled with problems. Also, AMD had to wait for the standard to get into DisplayPort 1.2a; they had it working in 2013, but until VESA got it into DisplayPort they couldn't ship monitors that supported it, hence the 15 months you say it took to 'develop'.
So far all the reviews I have seen of FreeSync have been great, so yes. The only one that even mentions ghosting is the one that you posted, and I'm sure it's only noticeable with high-speed cameras. Obviously the ghosting is not a big enough issue that it even required a mention, when under normal use it isn't noticeable and you still get smooth playback on the monitor, same as G-Sync.
Last, yes, $150 is a 25% increase on $600, so that is a significant increase in price - very relevant, because some people have a thing called a budget.
chizow - Monday, March 23, 2015 - link
No, it's not the only issue; ghosting is just the most obvious and most problematic, but you also get flickering at low FPS because FreeSync does not have the onboard frame buffer to just double frames as needed to prevent the kind of pixel decay and flickering you see with FreeSync. It's really funny, because these are all things (G-Sync module as smart scaler and frame/lookaside buffer) that AMD quizzically wondered why Nvidia needed extra hardware for. Market leader, market follower; just another great example.
Another issue, apparently, is that there is a perceivable difference in smoothness as the AMD driver kicks in and out of Vsync On/Off mode at the edges of its supported frame bands. This again is mentioned in these previews but not covered in depth. My guess is because the press had limited access to the machines for these previews, and they were done in controlled test environments, with their own shipped units only arriving around the time the embargo was lifted.
But yes, you and anyone else should certainly wait for follow-up reviews, because the ghosting and flickering were certainly visible without any need for high-speed cameras.
https://www.youtube.com/watch?v=-ylLnT2yKyA
A 25% increase is nothing if you want a working solution today, but yes, a $600 investment in what AMD has shown would be a dealbreaker and a waste of money imo, so if you are in the AMD camp, or also need to spend another $300 on an AMD card for FreeSync, you will certainly want to wait a bit longer.
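For what it's worth, the low-frame-rate complaint in this exchange comes down to simple hold times. The sketch below just does the arithmetic against the 48Hz floor of the LG panel mentioned earlier in the thread; whether a given LCD actually flickers or ghosts visibly at those hold times depends on the panel and scaler, not on this math.

    # How long a frame sits on screen below the VRR floor if nothing repeats it.
    vrr_floor_hz = 48.0                        # e.g. the LG panel discussed above
    max_hold_ms = 1000.0 / vrr_floor_hz

    for fps in (45, 40, 30, 20):
        hold_ms = 1000.0 / fps
        print(f"{fps:2d} fps -> frame held {hold_ms:5.1f} ms,"
              f" {hold_ms - max_hold_ms:4.1f} ms past the {max_hold_ms:.1f} ms floor")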
willis936 - Thursday, March 19, 2015 - link
I would like an actual look at added input latency from these adaptive sync implementations. Nobody has even mentioned it, but there's a very real possibility that either the graphics TX or the monitor's scaler has to do enough thinking to cause a significant delay from when pixels come in to when they're displayed on the screen. Why isn't the first issue to be scrutinized the thing that these technologies seek to solve?
Acer already posted the MSRP:
http://us.acer.com/ac/en/US/content/model/UM.HB0AA...
$800
mutantmagnet - Thursday, March 19, 2015 - link
I forgot to mention it's already on sale in Europe.JarredWalton - Thursday, March 19, 2015 - link
Google was failing me last night, though granted I haven't slept much in the past two days.
ezridah - Thursday, March 19, 2015 - link
It's odd that on their product page they don't mention G-Sync or the refresh rate anywhere... It's like they don't want to sell it or something.
eanazag - Thursday, March 19, 2015 - link
My monitors last longer than 5 years. Basically I keep them till they die. I have a 19" 1280x1024 on the shared home computer I'm considering replacing. I'd be leaning towards either a plain monitor or a FreeSync one.
I currently am sporting AMD GPUs, but I am one of those who go back and forth between vendors, and I don't think that is as small a minority as was assumed. I bought two R9 290s last February. If I was buying right now, I'd be getting a GTX 970. I do like the GeForce Experience software. I'm still considering a GTX 750 Ti.
I'm not totally sold on what AMD has in the market at the moment. I have a lot of heat concerns running them in Crossfire, and the wattage is higher than I like. The original 290 blowers sucked. I'd like quality blower cards again, like Nvidia's.
Dorek - Thursday, March 19, 2015 - link
Wait, you didn't just say that you use two R9 290s on a 1280x1024 monitor, right?
medi03 - Thursday, March 19, 2015 - link
I don't get how the 970 is better than the 290X. It is slower and more expensive:
http://www.anandtech.com/show/8568/the-geforce-gtx...
And total system consumption is lower by about 20-25% (305w on 970 vs 365 on 290x). No big deal
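For reference, the same two system-power figures expressed both ways, since the percentage depends on which system you take as the baseline:

    # 305W (GTX 970 system) vs 365W (R9 290X system), per the comment above.
    gtx970_w, r290x_w = 305.0, 365.0
    print(f"970 system: {(r290x_w - gtx970_w) / r290x_w:.1%} less than the 290X system")
    print(f"290X system: {(r290x_w - gtx970_w) / gtx970_w:.1%} more than the 970 system")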
JarredWalton - Thursday, March 19, 2015 - link
It's not "better" but it is roughly equivalent. I've got benchmarks from over 20 games. Average for 290 X at 2560x1440 "Ultra" across those games is 57.4 FPS while the average for 970 is 56.8 FPS. Your link to Crysis: Warhead is one title where AMD wins, but I could counter with GRID 2/Autosport and Lord of the Fallen where NVIDIA wins. And of the two GPUs, 970 will overclock more than 290X if you want to do that.TallestJon96 - Thursday, March 19, 2015 - link
I'm an NVIDIA user, but I'm happy to see the proprietary G-SYNC get beat down. I've got a 1080p 144Hz non-G-SYNC panel, so I won't be upgrading for 3-5 years, and hopefully 4K and FreeSync will both be standard by then.
junky77 - Thursday, March 19, 2015 - link
Girls, what about laptops..
medi03 - Thursday, March 19, 2015 - link
It's lovely how the first page of the article about FreeSync talks exclusively about nVidia.
JarredWalton - Thursday, March 19, 2015 - link
It's background information that's highly pertinent, and if "the first page" means "the first 4 paragraphs" then you're right... but the last two talk mostly about FreeSync.
Oxford Guy - Thursday, March 19, 2015 - link
I love how the pricing page doesn't do anything to address a big problem with both FreeSync and G-Sync -- the assumption that people want to replace the monitors they already have or have money to throw away to do so.
I bought an $800 BenQ BL3200PT 32" 1440 A-MVA panel and I am NOT going to just buy another monitor in order to get the latest thing graphics card companies have dreamt up.
Companies need to step up and offer consumers the ability to send in their panels for modification. Why haven't you even thought of that and mentioned it in your article? You guys, like the rest of the tech industry, just blithely support planned obsolescence at a ridiculous speed -- like with the way Intel never even bothered to update the firmware on the G1 ssd to give it TRIM support.
People have spent even more money than I did on high-quality monitors -- and very recently. It's frankly a disservice to the tech community to neglect to place even the slightest pressure on these companies to do more than tell people to buy new monitors to get basic features like this.
You guys need to start advocating for the consumer not just the tech companies that send you stuff.
JarredWalton - Thursday, March 19, 2015 - link
While we're at it: Why don't companies allow you to send in your old car to have it upgraded with a faster engine? Why can't I send in my five year old HDTV to have it turned into a Smart TV? I have some appliances that are getting old as well; I need Kenmore to let me bring in my refrigerator to have it upgraded as well, at a fraction of the cost of a new one!
But seriously, modifying monitors is hardly a trivial affair and the only computer tech that allows upgrades without replacing the old part is... well, nothing. You want a faster CPU? Sure, you can upgrade, but the old one is now "useless". I guess you can add more RAM if you have empty slots, or more storage, or an add-in board for USB 3.1 or similar...on a desktop at least. The fact is you buy technology for what it currently offers, not for what it might offer in the future.
If you have a display you're happy with, don't worry about it -- wait a few years and then upgrade when it's time to do so.
Oxford Guy - Friday, March 20, 2015 - link
"Old" as in products still being sold today. Sure, bud.Oxford Guy - Friday, March 20, 2015 - link
Apple offered a free upgrade for Lisa 1 buyers to the Lisa 2 that included replacement internal floppy drives and a new fascia. Those sorts of facts, though, are likely to escape your attention because it's easier to just stick with the typical mindset the manufacturers, and tech sites, love to endorse blithely -- fill the landfills as quickly as possible with unnecessary "upgrade" purchases.
Oxford Guy - Friday, March 20, 2015 - link
Macs also used to be able to have their motherboards replaced to upgrade them to a more current unit. "The only computer tech that allows upgrades without replacing the old part is... well, nothing." And whose mindset is responsible for that trend? Hmm? Once upon a time people could actually upgrade their equipment for a fee.
Oxford Guy - Friday, March 20, 2015 - link
The silence about my example of the G1 ssd's firmware is also deafening. I'm sure it would have taken tremendous resources on Intel's part to offer a firmware patch!
JarredWalton - Friday, March 20, 2015 - link
The G1 question is this: *could* Intel have fixed it via a firmware update? Maybe, or maybe Intel looked into it and found that the controller in the G1 simply wasn't designed to support TRIM, as TRIM didn't exist when the G1 was created. But "you're sure" it was just a bit of effort away, and since you were working at Intel's Client SSD department...oh, wait, you weren't. Given they doubled the DRAM from 16MB to 32MB and switched controller part numbers, it's probable that G1 *couldn't* be properly upgraded to support TRIM:
http://www.anandtech.com/show/2808/2
So if that's the case, it sounds a lot like Adaptive Sync -- the standard didn't exist when many current displays were created, and it can't simply be "patched in". Now you want existing displays that are already assembled to be pulled apart and upgraded. That would likely cost more money than just selling the displays at a discount, as they weren't designed to be easily disassembled and upgraded.
The reality of tech is that creating a product that can be upgraded costs time and resources, and when people try upgrading and mess up it ends up bringing in even more support calls and problems. So on everything smaller than a desktop, we've pretty much lost the ability to upgrade components -- at least in a reasonable and easy fashion.
Is it possible to make an upgradeable display? I suppose so, but what standards do you support to ensure future upgrades? Oh, you can't foresee the future so you have to make a modular display. Then you might have the option to swap out the scaler hardware, LCD panel, maybe even the power brick! And you also have a clunkier looking device because users need to be able to get inside to perform the upgrades.
Oxford Guy - Friday, March 20, 2015 - link
"Now you want existing displays that are already assembled to be pulled apart and upgraded. That would likely cost more money than just selling the displays at a discount, as they weren't designed to be easily disassembled and upgraded."If that's the case... I wonder why that is? Could it be the blithe acceptance of ridiculous cases of planned obsolescence like this?
Manufacturers piddle out increments of tech constantly to try to keep a carrot on a stick in front of consumers. Just like with games and their DLC nonsense, the new mindset is replace, replace, replace... design the product so it can't be upgraded. Fill up the landfills.
Sorry, but my $800 panel isn't going to just wear out or be obsolete in short order. People who spent even more are likely to say the same thing. And, again, many of these products are still available for purchase right now. The industry is doing consumers enough of a disservice by not having standards (incompatible competing G-Sync and FreeSync), but it's far worse to tell people they need to replace otherwise perfectly satisfactory equipment for a minor feature improvement.
You say it's not feasible to make monitors that can be upgraded in a relatively minor way like this. I say it is. It's not like we're talking about installing DisplayPort into a panel that didn't have it or something along those lines. It's time for the monitor industry to stop spewing out tiny incremental changes and expecting wholesale replacement.
This sort of product and the mindset that accompanies it is optional, not mandatory. Once upon a time things were designed to be upgradable. I suppose the next thing you'll fully endorse are motherboards with the CPUs, RAM, and everything else soldered on (which Apple likes to do) to replace DIY computing... Why not? Think of how much less trouble it will be for everyone.
Oxford Guy - Friday, March 20, 2015 - link
"it's probable that G1 *couldn't* be properly upgraded to support TRIM" "since you were working at Intel's Client SSD department...oh, wait, you weren't." So, I assume I should use the same retort on you with your "probable", eh?Oxford Guy - Friday, March 20, 2015 - link
The other thing you're missing is that Intel never told consumers that TRIM could not be added with a firmware patch. It never provided anyone with an actual concrete justification. It just did what is typical for these companies and for publications like yours = told people to buy the latest shiny to "upgrade".
Gunbuster - Thursday, March 19, 2015 - link
So G-Sync has been available to purchase for, what, a year now? And AMD comes to the table with something exactly the same. How impressive.
Oh, and the Crossfire driver: the traditional "trust us, Coming Soon™".
chizow - Thursday, March 19, 2015 - link
18 months later, and not exactly the same - still worse. But yes, we must give it to AMD, at least they brought something to the table this time.
Gigaplex - Friday, March 20, 2015 - link
The troll is strong in this one. You keep repeating how this is technically worse than G-SYNC and have absolutely nothing to back it up. You claim forced V-SYNC is an issue with FreeSync, but it's the other way around - you can't turn V-SYNC off with G-SYNC but you can with FreeSync. You don't address the fact that G-SYNC monitors need the proprietary scaler, which doesn't have all the features of FreeSync-capable scalers (e.g. more input ports, OSD functionality). You accuse everyone who refutes your argument of AMD fanboy sentimentality, when you yourself are the obvious NVIDIA fanboy. No doubt you'll accuse me of being an AMD fanboy too. How wrong you are.
JarredWalton - Friday, March 20, 2015 - link
Technically the G-SYNC scaler supports an OSD... the options are just more limited as there aren't multiple inputs to support, and I believe NVIDIA doesn't bother with supporting multiple *inaccurate* color modes -- just sRGB and hopefully close to the correct values.chizow - Friday, March 20, 2015 - link
Actually, you're wrong again: Vsync is always off; there is a frame cap turned on via the driver, but that is not Vsync, as the GPU is still controlling the frame rate.
I've never once glossed over the fact G-Sync requires a proprietary module, because I've clearly stated the price and tech are justified if it is a better solution, and as we saw yesterday, it clearly is.
I've also acknowledged that multiple inputs and an OSD are amenities that are a bonus, but certainly not more important than these panels excelling at what they are purchased for. I have 2x U2410 companion panels with TONS of inputs for anything I need beyond gaming.
darkfalz - Thursday, March 19, 2015 - link
I have to give it to AMD here - I was skeptical this could be accomplished without dedicated hardware to buffer the video frames on the display, but they've done it. I still wouldn't buy one of their power-hungry video cards, but it's good for AMD fans. This is good news for G-Sync owners too, as it should drive down the artificially inflated price (partly due to lack of competition, partly due to the early adoption premium). After fiddling around with triple buffering and triple buffering overrides for years (granted, less of a problem on DX10/11, as it seems many modern engines have some form of "free" triple buffering), it's good to go to perfect refresh rates. As a big emulation fan, with many arcade games using various refresh rates from 50 to 65 Hz, these displays are also great. Was input lag tested? AMD doesn't claim to have Vsync-off-like input lag reduction. This would be superb in a laptop where displaying every last frame is important (Optimus provides a sort of "free" triple buffering of its own, but it's not the smoothest and often requires you to set a 60 FPS frame cap).
darkfalz - Thursday, March 19, 2015 - link
By G-Sync owners, I guess I mean NVIDIA fans / prospective G-Sync buyers. G-Sync owners (like me) have already paid the premium.
marraco - Thursday, March 19, 2015 - link
I have only ever owned nVidia GPUs (not due to fanboyism, but the coincidence of GeForces being the sweet spot each time I needed a new card).
Still, I will not pay for G-SYNC. I don't want to be tied to a company.
I also can't buy a FreeSync monitor, because it's not supported by nVidia.
Also, hardware-supported features tend to become obsolete at a faster rate than software ones, so I do not trust G-Sync.
Murloc - Thursday, March 19, 2015 - link
Same here. I can just wait a year or so before upgrading my monitor and GPU; their loss. If in the meantime AMD comes up with something competitive (i.e. also not an oven, please), they win.
Norseman4 - Friday, March 20, 2015 - link
But you can buy an Adaptive Sync monitor and use it with any GPU. You won't get the benefits of FreeSync without AMD, but that is all.Tikcus9666 - Thursday, March 19, 2015 - link
I ain't overly worried; tearing does not bother me. I can't say I really notice it when playing; however, I am only playing at 1080p with a Radeon 280.
steve4king - Thursday, March 19, 2015 - link
Hats off to Nvidia for delivering G-Sync and getting the ball rolling on this thing. They were the first to create a solution for a very real problem.
Because of NVidia's pioneering, and because NVidia won't license the technology to AMD, AMD had to find their own solution in re-purposing an existing DP1.2a feature to provide the same function.
It makes sense for NVidia to refuse to support adaptive refresh, until these displays become commonplace. They had the only card and the only display module that could do this, and they needed to sell as many as they could before the competition's technology was viable.
Soon NVidia needs to reverse that decision, because I'm not going to buy an inferior monitor, just so that I can slap "The Way It's Meant to Be Played" on the side of my computer.
I fully expect that both will come together on this one. NVidia had a good run with G-Sync. But now it needs to jump on the bandwagon or risk losing out on GPU sales.
PPalmgren - Friday, March 20, 2015 - link
Unfortunately, I doubt it. While they are great first movers, look at their track record of good tech that could be great tech with industry-wide adoption via less proprietary measures: PhysX, CUDA, 3D Surround, Gsync, etc. They also have a poor history of working with more open platforms like Linux. "Our way or the highway" is the vibe I get.
Soulwager - Thursday, March 19, 2015 - link
What about actually testing the fallback cases, where framerate is outside the monitor's range of refresh rates? We need an input lag comparison when both monitors are maxed out in v-sync mode, and a GPU utilization comparison when framerates dip below the monitor's minimum refresh rate.
ncsaephanh - Thursday, March 19, 2015 - link
Finally, some competition up in here.
czesiu - Thursday, March 19, 2015 - link
"One final topic to address is something that has become more noticeable to me over the past few months. While G-SYNC/FreeSync can make a big difference when frame rates are in the 40~75 FPS range, as you go beyond that point the benefits are a lot less clear. Take the 144Hz ASUS ROG Swift as an example. Even with G-SYNC disabled, the 144Hz refresh rate makes tearing rather difficult to spot, at least in my experience. "Does 144hz monitor help when the FPS is ~40?
JarredWalton - Thursday, March 19, 2015 - link
Sure. It draws the frames 3-4 times between updates, so even if half of the frame showed tearing on the first pass it gets cleaned up on the second and third passes. And with VSYNC enabled, you can fall back to 72Hz and 48Hz before you are at ~30 Hz.
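To make that fallback arithmetic concrete, here is a minimal sketch (the 144 Hz panel is the example from the comment above; the divisor range and the code itself are my own illustration) of the only rates a fixed-refresh panel can effectively present at with V-SYNC enabled:

```cpp
#include <cstdio>

// With V-SYNC on a fixed-refresh panel, a frame is held for a whole number
// of refreshes, so the effective rate is the native rate divided by an integer.
int main() {
    const double native_hz = 144.0;                 // assumed panel refresh rate
    for (int divisor = 1; divisor <= 5; ++divisor) {
        std::printf("held for %d refresh(es) -> %.1f Hz effective\n",
                    divisor, native_hz / divisor);
    }
    return 0;                                       // prints 144, 72, 48, 36, 28.8 Hz
}
```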
SleepModezZ - Thursday, March 19, 2015 - link
Really different reviews between AnandTech and PC Perspective. You conclude that FreeSync performs as well as G-Sync - if not better, because of the option to disable V-Sync. PC Perspective, on the other hand, noticed that their FreeSync monitors performed badly compared to the G-Sync monitors when the frame rate dropped below the lowest refresh rate of the monitor.
You give the impression that they would behave the same - or that FreeSync would be potentially better because you could choose your poison: stutter or tearing - when with G-Sync you would always get stuttering. PC Perspective, on the other hand, reports that G-Sync monitors handle this gracefully by refreshing the display twice or more during those longer frames - and as such G-Sync avoids both stutter and tearing at those lower frame rates. Their FreeSync monitors did not do that - and the stuttering or tearing was very noticeable. The frame rate dropping below 48 fps is not uncommon, and the display's behavior in those situations is very important. That makes G-Sync the superior technology - unless the tearing or stuttering at speeds lower than the display's lowest refresh rate is only a problem with that specific monitor and not with the FreeSync / Adaptive Sync technology in general. (The LG monitor is incapable of doubling its slowest refresh rate - other monitors that are capable could maybe handle the situation differently. If not, FreeSync is the inferior technology.)
I don't know how G-Sync and FreeSync actually would handle full-screen movies at 24 fps. G-Sync could easily display it at a 48 Hz refresh rate. Your LG monitor would probably also show it at 48 Hz - because that is the lowest it can go. But would the LG monitor with FreeSync be smart enough to show a 25 fps movie at 50 Hz - or would it display it at 48 Hz with unnecessary tearing or stuttering?
Gigaplex - Friday, March 20, 2015 - link
"PC Perspective, on the other hand, tells that G-Sync monitors handle this gracefully by refreshing the display twice or more during those longer frames - and as such G-Sync avoids both stutter and tearing at those lower fram rates."That would drastically reduce the effects of tearing, but it would not do much, if anything, for stutter.
SleepModezZ - Friday, March 20, 2015 - link
It would reduce stutter in the sense that if the frame rate were, for example, constantly 30 fps, G-Sync would give you every frame when it is ready - keeping the motion fluid. FreeSync with V-Sync on, on the other hand, would force that into the lowest refresh rate of the monitor. It would double some frames and not others - making the timing of the frames uneven and making a constant 30 fps motion jerky where G-Sync would not. I would call that jerky motion 'stutter' - FreeSync (currently) has it, G-Sync does not.
In short, G-Sync retains its variable refresh rate behavior when going under the display's minimum refresh rate. FreeSync does not; it switches to a constant refresh rate at the monitor's minimum - introducing either tearing or stutter. Within the display's refresh rate range they perform the same. When going faster than the refresh rate range, FreeSync gives the option of disabling V-Sync and choosing tearing instead of stuttering. There it is better. I just think that the low fps range is probably more important than the high. I would not buy any FreeSync / Adaptive Sync displays before they demonstrate that they can handle those low fps situations as gracefully as G-Sync does.
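To illustrate the frame-multiplication behavior being described, here is a rough sketch of the general idea; it is my own illustration, not NVIDIA's or AMD's actual algorithm, and the function name, parameters, and the 48-75 Hz example range are assumptions:

```cpp
#include <algorithm>

struct RefreshPlan {
    int repeats;      // how many panel refreshes show this frame
    double panel_hz;  // resulting panel refresh rate
};

// Given how long the GPU took to render a frame and the panel's supported
// refresh range, repeat the frame enough times to keep the panel in range.
RefreshPlan PlanRefresh(double frame_ms, double min_hz, double max_hz) {
    double source_hz = 1000.0 / frame_ms;   // e.g. a 33.3 ms frame is ~30 fps
    int repeats = 1;
    while (source_hz * repeats < min_hz) {  // double, triple, ... the refresh
        ++repeats;
    }
    double panel_hz = std::min(source_hz * repeats, max_hz);
    return {repeats, panel_hz};
}

// Example: a 30 fps frame on a 48-75 Hz panel would be shown twice at 60 Hz,
// keeping frame pacing even, instead of being clamped to a fixed 48 Hz.
```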
WatcherCK - Thursday, March 19, 2015 - link
TFTCentral have done a review of the soon-to-be-released Acer monitor:
http://www.tftcentral.co.uk/reviews/acer_xb270hu.h...
And as Ryan said it is a beast. But one question: say you buy an XB270HU and plug in your 290X - since the video card doesn't support G-SYNC, does the Acer fall back to a standard scaler to display video data? And if the Acer uses a scaler from one of the four main manufacturers listed in the article, is there a chance it would support FreeSync? (Acer wouldn't advertise that, obviously, since the monitor is a G-SYNC branded monitor....)
So there are a few assumptions above about how G-SYNC operates, but I'm curious if this will be the case, as it would keep both the red and green camps happy...
One other question, if anyone is happy to answer: will high-refresh monitors maintain their peak refresh rate in portrait mode, or are they limited to a lower refresh rate - and does the same apply to G-SYNC? I'm thinking of a triple-monitor portrait setup for my next build.
cheers
sonicmerlin - Thursday, March 19, 2015 - link
Will Freesync work with the current gen consoles?
SleepModezZ - Thursday, March 19, 2015 - link
No. Adaptive Sync is a DisplayPort-specific standard. What current gen console supports DisplayPort? None to my knowledge. HDMI is a different standard, and I don't think there have been even any rumors about putting adaptive sync technology into the HDMI standard. And if it did some day come, would the current HDMI hardware on the consoles be able to support it after a driver update from AMD? Probably not.
Murloc - Thursday, March 19, 2015 - link
It's not likely to happen any time soon, since video, STBs, etc. revolve around the usual framerates and TVs do the same, so there's no need for this kind of flexibility - tearing is not an issue.
Too bad that TV standards like HDMI spill over into the computer world (audio, projectors, laptops, etc.) and hamstring progress.
sonicmerlin - Friday, March 20, 2015 - link
Well what if MS and Sony released hardware refreshes (like a slimmed down PS4) that included DisplayPort?
Gigaplex - Friday, March 20, 2015 - link
I'm pretty sure that both Xbox One and PS4 use GCN 1.0 hardware, so no, a DisplayPort refresh probably wouldn't help.
Norseman4 - Thursday, March 19, 2015 - link
Can you please verify some information:
On the specs page for the BenQ XL2730Z (http://gaming.benq.com/gaming-monitor/xl2730z/spec... it states a 54Hz min vertical refresh. This could be a copy/paste issue since it's the same as the min horizontal refresh.
barleyguy - Thursday, March 19, 2015 - link
I was already shopping for a 21:9 monitor for my home office. I'm now planning to order a 29UM67 as soon as I see one in stock. The GPU in that machine is an R7/260X, which is on the compatible list. :-)
boozed - Thursday, March 19, 2015 - link
"the proof is in the eating of the pudding"Thankyou for getting this expression right!
Oh, and Freesync looks cool too.
D. Lister - Thursday, March 19, 2015 - link
I have had my reservations about claims made by AMD these days, and my opinion of 'FreeSync' was no exception. If this actually works at least as well as G-Sync (as claimed by this rather brief review) with various hardware/software setups, then it is indeed a praiseworthy development. I personally would certainly be glad that the rivalry of two tech giants resulted (even if only inadvertently) in something that benefits the consumer.
cmdrdredd - Thursday, March 19, 2015 - link
I love the arguments about "freesync is an open standard" when it doesn't matter. 80% of the market is Nvidia and won't be using it. Intel is a non-issue because not many people are playing games that benefit from adaptive v-sync. Think about it, either way you're stuck. If you buy a GSync monitor now you likely will upgrade your GPU before the monitor goes out. So your options are only Nvidia. If you buy a freesync monitor your options are only AMD. So everyone arguing against gsync because you're stuck with Nvidia, have fun being stuck with AMD the other way around.
Best to not even worry about either of these unless you absolutely do not see yourself changing GPU manufacturers for the life of the display.
barleyguy - Friday, March 20, 2015 - link
NVidia is 71% of the AIB market, as of the latest released numbers from Hexus. That doesn't include AMD's APUs, which also support Freesync and are often used by "midrange" gamers.
The relevance of being an open standard though, is that monitor manufacturers can add it with almost zero extra cost. If it's built into nearly every monitor in a couple of years, then NVidia might have a reason to start supporting it.
tsk2k - Thursday, March 19, 2015 - link
@Jarred Walton
You disappoint me.
What you said about G-Sync below the minimum refresh rate is not correct, and there also seem to be issues with ghosting on FreeSync. I encourage everyone to go to PCper(dot)com and read a much more in-depth article on the subject.
Get rekt anandtech.
JarredWalton - Friday, March 20, 2015 - link
If you're running a game and falling below the minimum refresh rate, you're using settings that are too demanding for your GPU. I've spent quite a few hours playing games on the LG 34UM67 today just to see if I could see/feel issues below 48 FPS. I can't say that I did, though I also wasn't running settings that dropped below 30 FPS. Maybe I'm just getting too old, but if the only way to quantify the difference is with expensive equipment, perhaps we're focusing too much on the theoretical rather than the practical.
Now, there will undoubtedly be some that say they really see/feel the difference, and maybe they do. There will be plenty of others where it doesn't matter one way or the other. But if you've got an R9 290X and you're looking at the LG 34UM67, I see no reason not to go that route. Of course, you need to be okay with a lower resolution and a more limited range for VRR, and you also need to be willing to go with a slower-response-time IPS (AHVA) panel rather than dealing with TN problems. Many people are.
What's crazy to me is all the armchair experts reading our review and the PCPer review and somehow coming out with one or the other of us being "wrong". I had limited time with the FreeSync display, but even so there was nothing I encountered that caused me any serious concern. Are there cases where FreeSync doesn't work right? Yes. The same applies to G-SYNC. (For instance, at 31 FPS on a G-SYNC display, you won't get frame doubling but you will see some flicker in my experience. So that 30-40 FPS range is a problem for G-SYNC as well as FreeSync.)
I guess it's all a matter of perspective. Is FreeSync identical to G-SYNC? No, and we shouldn't expect it to be. The question is how much the differences matter. Remember the anisotropic filtering wars of last decade where AMD and NVIDIA were making different optimizations? Some were perhaps provably better, but in practice most gamers didn't really care. It was all just used as flame bait and marketing fluff.
I would agree that right now you can make the case the G-SYNC is provably better than FreeSync in some situations, but then both are provably better than static refresh rates. It's the edge cases where NVIDIA wins (specifically, when frame rates fall below the minimum VRR rate), but when that happens you're already "doing it wrong". Seriously, if I play a game and it starts to stutter, I drop the quality settings a notch. I would wager most gamers do the same. When we're running benchmarks and comparing performance, it's all well and good to say GPU 1 is better than GPU 2, but in practice people use settings that provide a good experience.
Example:
Assassin's Creed: Unity runs somewhat poorly on AMD GPUs. Running at Ultra settings or even Very High in my experience is asking for problems, no matter if you have a FreeSync display or not. Stick with High and you'll be a lot happier, and in the middle of a gaming session I doubt anyone will really care about the slight drop in visual fidelity. With an R9 290X running at 2560x1080 High, ACU typically runs at 50-75FPS on the LG 34UM67; with a GTX 970, it would run faster and be "better". But unless you have both GPUs and for some reason you like swapping between them, it's all academic: you'll find settings that work and play the game, or you'll switch to a different game.
Bottom Line: AMD users can either go with FreeSync or not; they have no other choice. NVIDIA users likewise can go with G-SYNC or not. Both provide a smoother gaming experience than 60Hz displays, absolutely... but with a 120/144Hz panel only the high speed cameras and eagle eyed youth will really notice the difference. :-)
chizow - Friday, March 20, 2015 - link
Haha, love it - still feisty I see, even in your "old age" there Jarred. I think all the armchair experts want is for you and AT to use your forum on the internet to actually do the kind of testing and comparisons that matter for the products being discussed, not just provide another Engadget-like experience of a superficial, touchy-feely review, dismissing anything actually relevant to this tech and market as not being discernible to someone "your age".
JarredWalton - Friday, March 20, 2015 - link
It's easy to point out flaws in testing; it's a lot harder to get the hardware necessary to properly test things like input latency. AnandTech doesn't have a central location, so I basically test with what I have. Things I don't have include gadgets to measure refresh rate in a reliable fashion, high speed cameras, etc. Another thing that was lacking: time. I received the display on March 17, in the afternoon; sometimes you just do what you can in the time you're given.
You however are making blanket statements that are pro-NVIDIA/anti-AMD, just as you always do. The only person that takes your comments seriously is you, and perhaps other NVIDIA zealots. Mind you, I prefer my NVIDIA GPUs to my AMD GPUs for a variety of reasons, but I appreciate competition and in this case no one is going to convince me that the closed ecosystem of G-SYNC is the best way to do things long-term. Short-term it was the way to be first, but now there's an open DisplayPort standard (albeit an optional one) and NVIDIA really should do everyone a favor and show that they can support both.
If NVIDIA feels G-SYNC is ultimately the best way to do things, fine -- support both and let the hardware enthusiasts decide which they actually want to use. With only seven G-SYNC displays there's not a lot of choice right now, and if most future DP1.2a and above displays use scalers that support Adaptive Sync it would be stupid not to at least have an alternate mode.
But if the only real problem with FreeSync is when you fall below the minimum refresh rate you get judder/tearing, that's not a show stopper. As I said above, if that happens to me I'm already changing my settings. (I do the same with G-SYNC incidentally: my goal is 45+ FPS, as below 40 doesn't really feel smooth to me. YMMV.)
Soulwager - Saturday, March 21, 2015 - link
You can test absolute input latency to sub-millisecond precision with ~50 bucks worth of hobby electronics, free software, and some time to play with it. For example, an Arduino Micro, a photoresistor, a second resistor to make a divider, a breadboard, and a USB cable. Set the Arduino up to emulate a mouse, and record the difference in timing between a mouse input and the corresponding change in light intensity. Let it log a couple minutes of press/release cycles, subtract 1ms of variance for USB polling, and there you go, full chain latency. If you have access to a CRT, you can get a precise baseline as well.
As for sub-VRR behavior, if you leave v-sync on, does the framerate drop directly to 20fps, or is AMD using triple buffering?
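For anyone curious what that rig looks like in code, here is a minimal sketch along the lines described above; it assumes an Arduino Micro acting as a USB mouse with the photoresistor divider on pin A0, and the threshold and timeout values are my own guesses that would need tuning per setup (you would still subtract ~1 ms of USB polling variance from the logged samples):

```cpp
#include <Mouse.h>

const int LIGHT_PIN = A0;      // photoresistor divider taped to the screen
const int THRESHOLD = 40;      // ADC delta that counts as "screen changed"

void setup() {
  Serial.begin(115200);
  Mouse.begin();
}

void loop() {
  int baseline = analogRead(LIGHT_PIN);
  unsigned long start = micros();
  Mouse.press(MOUSE_LEFT);                      // inject the input

  // Wait for the patch of screen under the sensor to change brightness.
  while (true) {
    int delta = analogRead(LIGHT_PIN) - baseline;
    if (delta > THRESHOLD || delta < -THRESHOLD) break;
    if (micros() - start > 500000UL) break;     // 500 ms timeout, no change seen
  }
  unsigned long latency_us = micros() - start;
  Mouse.release(MOUSE_LEFT);

  Serial.println(latency_us);                   // one click-to-photon sample
  delay(1000);                                  // settle before the next cycle
}
```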
chizow - Saturday, March 21, 2015 - link
You seem to be taking my comments pretty seriously, Jarred, as you should, given they raise a lot of questions about your credibility and capabilities in writing a competent "review" of the technology being discussed. But it's np, no one needs to take me seriously, this isn't my job, unlike yours, even if it is part time. The downside is, reviews like this make it harder for anyone to take you or the content on this site seriously, because as you can see, there are a number of other individuals that have taken issue with your Engadget-like review. I am sure there are a number of people that will take this review as gospel, go out and buy FreeSync panels, discover ghosting issues not covered in this "review" and ultimately lose trust in what this site represents. Not that you seem to care.
As for being limited in equipment, that's just another poor excuse and asterisk you've added to the footnotes here. It takes at most a $300 camera, far less than a single performance graphics card, and maybe $50 in LEDs, diodes and USB input doublers (hell, you can even make your own if you know how to splice wires) at Digikey or RadioShack to test this. Surely Ryan and your new parent company could foot this bill for a new test methodology if there was actually interest in conducting a serious review of the technology. Numerous sites have already given the methodology for input lag and ghosting with a FAR smaller budget than AnandTech; all you would have to do is mimic their test set-up with a short acknowledgment, which I am sure they would appreciate from the mighty AnandTech.
But it's OK - like the FCAT issue, it's obvious AT had no intention of actually covering the problems with FreeSync. I guess if it takes a couple of Nvidia "zealots" to get to the bottom of it and draw attention to AMD's problems to ultimately force them to improve their products, so be it. It's obvious the actual AMD fans and spoon-fed press aren't willing to tackle them.
As for blanket statements, lol, that's a good one. I guess we should just take your unsubstantiated points of view, which are, unsurprisingly, perfectly aligned with AMD's, at face value without any amount of critical thinking and skepticism?
It's frankly embarrassing to read some of the points you've made from someone who actually works in this industry, for example:
1) One shred of confirmation that G-Sync carries royalties. Your "semantics" mean nothing here.
2) One shred of confirmation that existing, pre-2015 panels can be made compatible with a firmware upgrade.
3) One shred of confirmation that G-Sync somehow faces the uphill battle compared to FreeSync, given known market indicators and factual limitations on FreeSync graphics card support.
All points you have made in an effort to show FreeSync in a better light, while downplaying G-Sync.
As for the last bit, again, if you have to sacrifice your gaming quality in an attempt to meet FreeSync's minimum standard refresh rate, the solution has already failed, given one of the major benefits of VRR is the ability to crank up settings without having to resort to Vsync On and the input lag associated with it. For example, in your example, if you have to drop settings from Very High to High just so that your FPS don't drop below 45FPS for 10% of the time, you've already had to sacrifice your image quality for the other 90% it stays above that. That is a failure of a solution if the alternative is to just repeat frames for that 10% as needed. But hey, to each their own, this kind of testing and information would be SUPER informative in an actual comprehensive review.
As for your own viewpoints on competition - who cares!?!?!? You're going to color your review and outlook in an attempt to paint FreeSync in a more favorable light, simply because it aligns with your own viewpoints on competition? Thank you for confirming your reasoning for posting such a biased and superficial review. You think this is going to matter to someone who is trying to make an informed decision, TODAY, on which technology to choose? Again, if you want to get into the socioeconomic benefits of competition and why we need AMD to survive, post this as an editorial, but to put "Review" in the title is a disservice to your readers and the history of this website - hell, even your own previous work.
steve4king - Monday, March 23, 2015 - link
Thanks Jarred. I really appreciate your work on this. However, I do disagree to some extent on the low-end FPS issue. The biggest potential benefit of adaptive refresh is smoothing out the tearing and judder that happen when the frame rate is inconsistent and drops. I also would not play at settings where my average frame rate fell below 60 fps. However, my settings will take into account the average FPS, where most scenes may be glassy-smooth, while in a specific area the frame rate may drop substantially. That's where I really need adaptive sync to shine. And from most reports, that's where G-Sync does shine. I expect low-end flicker could be solved with a doubling of frames, and I understand you cannot completely solve judder if the frame rate is too low.
tsk2k - Friday, March 20, 2015 - link
Thanks for your reply Jarred.
I was just throwing a tantrum cause I wanted a more in-depth article.
5150Joker - Friday, March 20, 2015 - link
I own a G-Sync ASUS ROG PG278Q display and while it's fantastic, I'd prefer NVIDIA just give up on G-Sync, go with the flow, and adopt ASync/FreeSync. It's clearly working as well (which was my biggest hesitation), so there's no reason to continue forcing users to pay a premium for proprietary technology that more and more display manufacturers will not support. If LG or Samsung push out a 34" widescreen display that is AHVA/IPS with low response time and 144 Hz support, I'll probably sell my ROG Swift and switch, even if it is a FreeSync display. Like Jarred said in his article, you don't notice tearing with a 144 Hz display, so G-Sync/FreeSync make little to no impact.
chizow - Friday, March 20, 2015 - link
And what if going to Adaptive Sync results in a worst experience? Personally I have no problems if Nvidia uses an inferior Adaptive Sync based solution, but I would still certainly want them to continue developing and investing in G-Sync, as I know for a fact I would not be happy with what FreeSync has shown today.
Inferior / worst experience?
Meanwhile, the reviews from AnandTech, Guru3D, HotHardware, Overclock3D, Hexus, TechSpot, HardwareHeaven - and the list could go on forever - clearly state that the experiences with both FreeSync & G-Sync are equal / comparable, but FreeSync costs less as an added bonus.
Before you accuse me of being an AMD fanboy, I own an Intel Haswell CPU & a Zotac GTX 750 (should I take pics as proof?).
Based on the reviews from numerous tech sites, my conclusion is: either G-Sync will end up just like Betamax, or Nvidia will be forced to adopt Adaptive Sync.
chizow - Friday, March 20, 2015 - link
These tests were done in some kind of limited/closed test environment apparently, so yes, all of these reviews are what I would consider incomplete and superficial. There are a few sites however that delve deeper and notice significant issues. I have already posted links to them in the comments; if you made it this far to comment you would've come upon them already and either chose to ignore them or missed them. Feel free to look them over and come to your own conclusions, but it is obvious to me, the high floor refresh and the ghosting make FreeSync worst than G-Sync without a doubt.
wira123 - Friday, March 20, 2015 - link
Yeah, since the PCPer gospel review was apparently made by Jesus, and the 95% of reviewers around the world who praised FreeSync are heretics - isn't that right?
V-Sync can be turned off or on as you wish if the fps surpasses the monitor's refresh cap range, unlike G-Sync. And no one has reported a ghosting effect SO FAR - are you daydreaming or what?
Still waiting for the Tom's Hardware review, even though I already know what their verdict will be.
chizow - Saturday, March 21, 2015 - link
Who needs god when you have actual screenshots and video of the problems? This is tech we are talking about, not religion, but I am sure to AMD fans they are one and the same.
Crunchy005 - Saturday, March 21, 2015 - link
@chizow Anything that doesn't say Nvidia is God is incomplete and superficial to you. You are putting down a lot of hard work that went into a lot of reviews over one review that pointed out a minor issue.
Show us more than one and maybe we will look at it, but your paper gospel means nothing when there are a ton more articles that contradict it. Also, what makes you the expert here when all these reviews say it is the same/comparable and you yourself have not seen FreeSync in person? If you think you can do a better job, start a blog and show us. Otherwise stop with your anti-AMD, pro-Nvidia campaign and get off your high horse. In these comments you even attacked Jarred, who works hard in the short time he gets with hardware to give us as much relevant info as he can. You don't show respect to others' work here and you make blanket statements with nothing to support yourself.
Crunchy005 - Saturday, March 21, 2015 - link
Sorry, tried to play nice higher up...
chizow - Saturday, March 21, 2015 - link
Again, there is nothing religious about technology; it seems you are the ones clinging to faith despite photographic evidence to the contrary.
Worse, my dear chizow, worse. 'Worst' doesn't go with 'a', but it does go with 'the', as it denotes an extreme.
HTH. :)
silverblue - Saturday, March 21, 2015 - link
I think what would clear up your misgivings with FreeSync would be if a very high quality panel with a very wide frequency range (especially dealing with those lower frequencies) was tested. I shouldn't blame the underlying technology for too much yet, especially considering there are downsides to GSync as well.chizow - Saturday, March 21, 2015 - link
There have been numerous reports that the BenQ uses the same high-quality AU Optronics panel as the ROG Swift, so again, it is obvious the problem is in the spec/implementation rather than the panel itself. But yes, it is good confirmation that the premium associated with G-Sync is in fact worth it, if the BenQ shows these problems at $630 while the Swift does not at $780.
FreeSync works down to 9Hz; it'd be nice to see a panel that gets even remotely close to this.chizow - Monday, March 23, 2015 - link
AMD claims the spec goes as low as 9Hz, but as we have seen, it has a hard enough time working properly at the current minimums of 40 and 48Hz without exhibiting serious issues. So until those problems directly tied to low refresh rates and pixel decay are resolved, it doesn't really matter what AMD claims on a slide deck.
I disagree, the holy grail to get me to upgrade is going to be a 32" 1440p monitor (since 30" 1600p ones are non-existent), and maybe one that's just 60fps.
JarredWalton - Friday, March 20, 2015 - link
I'd really love a good 34" 3440x1440 display with a 120Hz refresh rate. Too bad that's more bandwidth than DP provides. And too bad gaming at 3440x1440 generally requires more than any single GPU other than the GTX Titan X. I've got CrossFire 290X and SLI 970 incidentally; when I'm not testing anything, it's the GTX 970 GPUs that end up in my system for daily use.
Impulses - Friday, March 20, 2015 - link
I'd totally go for a 34" 3440x1440... Wouldn't be any harder to drive than my 3x 24" 1080p IPS displays in Eyefinity. I've resigned myself to CF/SLI, it's not like I play very many games at launch anyway. Anything lower res, smaller, or not as wide would feel like a side grade.
steve4king - Tuesday, March 24, 2015 - link
Seiki will be releasing a 4K monitor with DP 1.3 in Q3 2015; however, for whatever reason, it's still going to be 60Hz. Now if other monitors follow suit and the R300 also includes DisplayPort 1.3, we can get out of this bottleneck!
This is not a review, it's a blatant brown-nosing-AMD fest.
Nvidia can at any time add support for Adaptive Sync in their drivers, thus supporting both G-Sync and Adaptive Sync.
AMD will never be able to support both.
Looks like Nvidia has all the options.
silverblue - Friday, March 20, 2015 - link
They can, but why would they? As for AMD using NVIDIA tech, well... that's a well-trodden path and we all know what's at the end.
ijozic - Friday, March 20, 2015 - link
Hmm.. All the listed screens are budget ones. They are all either TN film or low-res IPS screens. I hope it's just the time needed for manufacturers to churn out their top models (e.g. 1440p 34" IPS or 27" 1440p 144Hz AHVA) and not some trend.
By your logic all the G-SYNC displays are equally lousy (TN), with the exception of the upcoming Acer XB270HU. It's one of the primary complaints I have with G-SYNC to date: TN is usable, but IPS/AHVA/MVA are all better IMO.
Side note: anyone else notice that IPS panels (esp. on laptops) seem to have issues with image persistence? I swear, I've played with laptops that are only a few months old and if you have a static window like Word open and then switch to a solid blue background or similar you'll see the Word outline remain for several minutes. Maybe it's just less expensive panels, but I don't know....
Strunf - Friday, March 20, 2015 - link
I have an Intel CPU and an nVIDIA graphics card, but bravo AMD - despite being something of an underdog, they keep pushing the market in the right direction: Mantle, FreeSync, what's next?
I'm convinced Intel will support FreeSync in the near future. I don't see why not: it's free and easy to implement on the graphics circuit, and monitors can benefit from it at almost no cost (proved by the firmware upgrades)...
ComputerGuy2006 - Friday, March 20, 2015 - link
No talk at all about ULMB (Ultra Low Motion Blur)? I am guessing it's just not supported?
JarredWalton - Friday, March 20, 2015 - link
It's a display feature, not a FreeSync feature.
zodiacfml - Friday, March 20, 2015 - link
Just get 120Hz displays and be done with it.
knightspawn1138 - Friday, March 20, 2015 - link
I think that since AMD has gotten the FreeSync details built into the DisplayPort specification, all NVidia would have to do to be compatible with a FreeSync monitor is update their DisplayPort support to the same revision. G-Sync will have to come down in price, or offer a significant benefit that isn't matched by FreeSync, in order to stay viable.
p1nky - Friday, March 20, 2015 - link
Is the table with the displays correct?
Other sites say that at least the 23.6" and the 31.5" versions of the Samsung UE850 will come with a PLS (Samsung's name for IPS) panel, not with a TN panel.
It would be nice to have a 4K display >= 30" with FreeSync, so my hopes are on the 31.5" UE850.
Either the other sites are wrong or you got updated information that those will come with TN panels after all, which would be a shame :(
JarredWalton - Friday, March 20, 2015 - link
I think people are assuming they'll be PLS; I'm assuming they'll be TN. The reason is that Samsung specifically notes the PLS status on displays that use it (e.g. SE650), but they say nothing about panel type when it's TN, because everyone knows TN is a negative marketing term. Here's at least one place suggesting TN as well:
http://www.businesswire.com/news/home/201501050060...
p1nky - Friday, March 20, 2015 - link
Hmm, the table there doesn't mention the UE850, just the 590, and in the text I also can't find a hint indicating TN for the 850.
Here are two (German, however) sites that say PLS:
http://www.heise.de/newsticker/meldung/CES-Kommend...
http://www.hardwareluxx.de/index.php/news/hardware...
Heise is a very respectable site and the owner of c't, probably the most reputable German computer magazine still left, and they usually don't just wildly spread incorrect information (OK, nowadays you never know, but they are certainly on the more reliable and trustworthy side compared to most others).
Maybe even Samsung doesn't know yet... :)
JarredWalton - Friday, March 20, 2015 - link
Someone at Samsung knows, but they haven't publicly stated anything that I can find. Given they're all 4K panels, TN or PLS/IPS/AHVA are certainly possible. I've added "TN?" on the UE850, as it is an unknown. Here's hoping they are PLS and not TN!
yefi - Friday, March 20, 2015 - link
If they come out with either a 30" 1600p or 40" 4k IPS monitor, I'll sell my 970 and be all over this.peevee - Friday, March 20, 2015 - link
"there are overclockable 27” and 30” IPS displays that don’t cost much at all."Can you elaborate on that? Maybe a comparison test?
Murloc - Friday, March 20, 2015 - link
It's the usual Korean monitors from eBay. It's a horse that's been beaten to death already, so nobody elaborates on it anymore.
They are overclockable, but since they're cheap and made with second-choice panels, you can probably get dead pixels (unless it has a guarantee of no dead pixels, but you pay for that). There's also no guarantee that it will overclock to where you'd like it to: it's sold as a 60 Hz monitor and anything else is a bonus.
JarredWalton - Friday, March 20, 2015 - link
Monoprice offers a 30" IPS 120Hz display that's worth a look -- they "guarantee" that it will overclock to 120Hz at least. I saw it at CES and it looked good. I'm sure there's still a fair amount of pixel lag (most IPS are about 5ms), but it's better than 60Hz.yefi - Saturday, March 21, 2015 - link
I commented on that thread recently. The monitor was tested and is apparently only capable of 60Hz :(
Welsh Jester - Friday, March 20, 2015 - link
I agree with the article. I've had a 120Hz screen for a few years now, and it is definitely smoother, with the tearing barely noticeable.
However, I'll upgrade when a well-priced 27" 1440p FreeSync screen comes around. Probably TN, for the better response time and no glow.
Welsh Jester - Friday, March 20, 2015 - link
To add to my last post, I think 1440p FreeSync screens will be good for people with only one higher-end GPU, since they won't have to deal with the micro-stuttering that multiple cards bring - a smooth experience at 40+ fps. Good news.
I was just about ready to praise AMD, but then I see "must have and use DisplayPort"...
Well, at least it appears AMD got it to work on their crap, and without the massive Hz restrictions that appeared in earlier reviews.
So, heck, AMD might have actually done something right for once?
I am cautious - I do expect some huge frikkin problem we can't see right now --- then we will be told to ignore it, then we will be told it's nothing, then there will be a workaround fix, and then after a couple of years it will be fully admitted and AMD will reluctantly "fix it" in "the next release of hardware".
Yes, that's what I expect, so I'm holding back the praise for AMD because I've been burned to a crisp before.
cykodrone - Saturday, March 21, 2015 - link
I actually went to the trouble of making an account to say sometimes I come here just to read the comments; some of the convos have me rolling on the floor busting my guts laughing. Seriously, this is free entertainment at its best! Aside from that, the cost of this Nvidia e-penis would feed 10 starving children for a month. I mean seriously, at what point is it overkill? By that I mean, is there any game out there that would absolutely not run well enough on a slightly lesser card at half the price? When I read this card alone requires 250W, my eyes popped out of my head - holy electric bill, Batman - but I guess if somebody has a grand to throw away on an e-penis, they don't have electric bill worries. One more question: what kind of CPU/motherboard would you need to back this sucker up? I think this card would be retarded without at least the latest i7 Extreme(ly overpriced); can you imagine some tool dropping this in an i3? What I'm saying is, this sucker would need an expensive 'bed' too, otherwise you'd just be wasting your time and money.
cykodrone - Saturday, March 21, 2015 - link
This got posted to the wrong story, was meant for the NVIDIA GeForce GTX Titan X Review, my humble apologies.mapesdhs - Monday, March 23, 2015 - link
No less amusing though. ;D
Btw, I've tested 1/2/3x GTX 980 on a P55 board; it works a lot better than one might think. I've also tested 1/2x 980 with an i5 760, which again works quite well. Plus, the heavier the game, the less they tend to rely on main CPU power, especially as the resolution/detail rises. Go to 3dmark.com, Advanced Search, Fire Strike, enter i7 870 or i5 760 & search - my whackadoodle results come straight back. :D I've tested with a 5GHz 2700K and a 4.8GHz 3930K as well; the latter are quicker of course, but not that much quicker, less than most would probably assume.
Btw, the Titan X is more suited to solo pros doing tasks that are mainly FP32, like After Effects. However, there's always a market for the very best, and I know normal high street stores make their biggest profit margins on premium items (and the customers who buy them), so it's an important segment - it drives everything else in a way.
Ian.
mapesdhs - Monday, March 23, 2015 - link
(Damn, still no edit, I meant to say the 3-way testing was with an i7 870 on a P55)
Vinny DePaul - Sunday, March 22, 2015 - link
I am a big fan of open standards - the more the merrier. I stay with nVidia because they support their products better. I was a big fan of AMD GPUs, but the drivers were so buggy; nVidia updates their drivers quickly and the support is just a lot better. I like G-Sync. It's worth the extra money. I hope my monitor can support FreeSync with a firmware upgrade. (Not that I have an AMD GPU.)
Teknobug - Sunday, March 22, 2015 - link
Now if only I could find a 24" monitor with these features - anything bigger than 24" is too large for me.
gauravnba - Monday, March 23, 2015 - link
Lots of G-Sync versus AMD bashing here. Personally, it all comes down to whether or not I am being confined to an ecosystem when going for either technology. If nVidia starts to become like Apple in that respect, I'm not very comfortable with it.
However, I wonder whether adapting to FreeSync requires much additional or modified hardware on the GPU end. That might be one reason nVidia didn't have to change much of their GPU architecture during the G-Sync launch and confined the changes to the scaler.
AMD worked with VESA to get this working on GCN 1.1, but not on GCN 1.0. This may be another area where the technologies are radically different - one is heavily reliant on the scaler while the other may be able to divide the work to a certain extent? Then again, I'm quite ignorant of how scalers and GPUs work in this case.
PixelSupreme - Monday, March 23, 2015 - link
To be honest I don't give half a fudge about FreeSync or G-Sync. What gets my attention is ULMB / strobed backlight. An IPS display (or, well, OLED, but...), WQHD, and strobing that works on a range of refresh rates, including some that are multiples of 24 - that would be MY holy grail. The announced Acer XB270HU comes close, but ULMB apparently only works at 85Hz and 100Hz.
P39Airacobra - Monday, March 23, 2015 - link
Why will it not work with the R9 270? That is BS! To hell with you AMD! I paid good money for my R9 series card! And it was supposed to be current GCN not GCN 1.0! Not only do you have to deal with crap drivers that cause artifacts! Now AMD is pulling off marketing BS!
Morawka - Tuesday, March 24, 2015 - link
Anandtech, have you seen the PCPerspective article on G-Sync vs FreeSync? PCPer was seeing ghosting with FreeSync. Can you guys corroborate their findings?
shadowjk - Tuesday, March 24, 2015 - link
Am I the only one who would want a 24"-ish 1080p IPS screen with G-Sync or FreeSync?
xenol - Tuesday, March 24, 2015 - link
FreeSync and GSync shouldn't have ever happened.
The problem I have is "syncing" is a relic of the past. The only reason why you needed to sync with a monitor is because they were using CRTs that could only trace the screen line by line. It just kept things simpler (or maybe practical) if you weren't trying to fudge with the timing of that on the fly.
Now, you can address each individual pixel. There's no need to "trace" each line. DVI should've eliminated this problem because it was meant for LCD's. But no, in order to retain backwards compatibility, DVI's data stream behaves exactly like VGA's. DisplayPort finally did away with this by packetizing the data, which I hope means that display controllers only change what they need to change, not "refresh" the screen. But given they still are backwards compatible with DVI, I doubt that's the case.
Get rid of the concept of refresh rates and syncing altogether. Stop making digital displays behave like CRTs.
Mrwright - Wednesday, March 25, 2015 - link
Why do I need either FreeSync or G-Sync when I already get over 100fps in all games at 2560x1440? All I want is a 144Hz 2560x1440 monitor without the G-Sync tax, as G-Sync and FreeSync are only useful if you drop below 60fps.
ggg000 - Thursday, March 26, 2015 - link
Freesync is a joke:
https://www.youtube.com/watch?feature=player_embed...
https://www.youtube.com/watch?v=VJ-Pc0iQgfk&fe...
https://www.youtube.com/watch?v=1jqimZLUk-c&fe...
https://www.youtube.com/watch?feature=player_embed...
https://www.youtube.com/watch?v=84G9MD4ra8M&fe...
https://www.youtube.com/watch?v=aTJ_6MFOEm4&fe...
https://www.youtube.com/watch?v=HZtUttA5Q_w&fe...
ghosting like hell.
willis936 - Tuesday, August 25, 2015 - link
LCD is a memory array; if you don't use it, you lose it. You need to physically refresh each pixel the same number of times a second. You could save on average bitrate by only sending changed pixels, but that requires more work on the GPU and adds latency. What's more, it doesn't change what your max bitrate needs to be - and don't even begin suggesting multiple frame buffers, as that adds TV-tier latency.
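A quick back-of-the-envelope sketch of that bandwidth point (the 2560x1440 @ 144 Hz panel and the "10% of pixels changed" figure are my own assumptions, purely for illustration):

```cpp
#include <cstdio>

int main() {
    const double width = 2560, height = 1440, hz = 144, bits_per_pixel = 24;

    // The link has to be sized for the worst case: every pixel changes and the
    // panel still gets refreshed 'hz' times per second.
    double peak_gbps = width * height * bits_per_pixel * hz / 1e9;

    // Sending only changed pixels would lower the *average* rate, not the peak.
    double changed_fraction = 0.10;  // assumed typical desktop workload
    double avg_gbps = peak_gbps * changed_fraction;

    std::printf("peak:    %.1f Gbps (what the link must handle)\n", peak_gbps);
    std::printf("average: %.1f Gbps (what partial updates could save)\n", avg_gbps);
    return 0;
}
```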
chizow - Monday, March 30, 2015 - link
And more evidence of FreeSync's (and AnandTech's) shortcomings, again from PCPer. I remember a time AnandTech was willing to put in the work with the kind of creativeness needed to come to such conclusions, but I guess this is what happens when the boss retires and takes a gig with Apple.
http://www.pcper.com/reviews/Graphics-Cards/Dissec...
PCPer is certainly the go-to now for any enthusiast that wants answers beyond the superficial spoon-fed vendor stories.
ZmOnEy132 - Saturday, December 17, 2016 - link
FreeSync is not meant to increase fps. The whole point is visuals. It stops visual tearing, which is why it drops the refresh rate to match the frame delivery. FPS has no effect on what FreeSync is meant to do - it's all about visuals, not performance. I hate when people write reviews who don't know what they're talking about. You're going to get dropped frame rates because that means the frame isn't ready yet, so the GPU doesn't send it to the display and holds onto it a tiny bit longer to make sure the monitor and GPU are both ready for that frame.