LordSojar - Wednesday, June 3, 2015 - link
Only trouble is, since these panels aren't vetted by AMD and just have a badge slapped on them if they fall within the spec for Freesync, the blurring issue and other reported issues remain. These issues do not exist within the GSync ecosystem because nVidia vets each panel itself, individually with each manufacturer, before it can receive approval as a GSync approved panel for production. So, despite the price difference, you do indeed pay for what you get.
Hopefully, GSync will see some major expansion this year and subsequent price drops. I own one myself and it's the best thing I've purchased for a PC since a solid state drive, period. Don't regret it one bit. Still have my S-IPS ProArt for photo editing as a second monitor as well, so no sacrifice on that front, but it would be nice to have 3-5ms GSync enabled S-IPS panels from good manufacturers this year. I wouldn't even care about 4K on a panel like that... a 2560x1440, 5ms response time, 120Hz S-IPS GSync monitor? I will literally throw you my money.
Nagorak - Wednesday, June 3, 2015 - link
Hopefully GSync will just die off completely. We don't need any more of these BS closed, proprietary implementations. We need open standards. If Freesync has problems, then hopefully they can be improved on. None of us benefit from having to match our monitors to a specific graphics card manufacturer.
meacupla - Wednesday, June 3, 2015 - link
There's no point in open standards if these 'standards' have such low quality.
Such is the case with the Philips screwdriver. It may be standard, but it's easily the worst screw drive available, as it easily damages itself and you have to apply a lot of force to keep it mated, instead of having a secure contact and letting the screw do the work.
bloodypulp - Wednesday, June 3, 2015 - link
I, for one, am glad the Philips standard exists! It isn't difficult to determine which drivers are the better ones. And if you want to be a cheap@$$, you can get a cheap one!
Flunk - Wednesday, June 3, 2015 - link
The Philips screwdriver was never meant to be used with motorized screwdrivers, so perhaps that's the cause of your issues. I've never had an issue with it using hand tools.
I'm not a big fan of Robertson or Torx screw heads because you need exactly the right size for anything. I have 12 different Torx screwdrivers and that just covers the non-security versions of the common sizes.
Yojimbo - Wednesday, June 3, 2015 - link
The Philips screwdriver was specifically designed to be used with motorized screwdrivers.
From http://charlesandhudson.com/the-history-of-the-phi... :
"Phillips wasn’t trying to make life with hand tools easier. He was trying to solve an industrial problem. To drive a slot screw, you need hand-eye coordination to line up the screwdriver and the slot. If you’re a machine — especially a 1930s machine — you ain’t got no eye, and your hand coordination may depend on humans.
The Phillips-head screw and Phillips screwdriver were designed for power tools, especially power tools on assembly lines. "
mdriftmeyer - Thursday, June 4, 2015 - link
As a mechanical engineer: whether you're advocating Torx or Philips designed heads, it is the strength of the materials used in the screw that matters most.
DanNeely - Wednesday, June 3, 2015 - link
Slotted is even worse than philips.
The problem with philips is that we keep trying to use it for things it wasn't designed to do. It was never intended for high torque uses (like driving a screw into a block of wood without a pilot hole). The intent of the sloped hole and bit was that the driver was supposed to torque out to prevent over tightening and/or snapping off the screw head, at a time (circa a century ago) when torque wrenches were prohibitively expensive for the factory floor and machinery for automatically installing screws wasn't consistent enough in the level of torque it applied.
akamateau - Wednesday, June 3, 2015 - link
@Dan Neely
You are just SOOOOOOOO WROOONG. Where do I even begin?
Why should I bother. Just go here and read.
http://www.acontinuouslean.com/2013/04/05/a-better...
Then learn how to use a screw driver.
Murloc - Wednesday, June 3, 2015 - link
you say he's wrong and yet you post a link that explains the exact same thing he said.
Shadow7037932 - Wednesday, June 3, 2015 - link
You should really read the articles you link to...
nathanddrews - Wednesday, June 3, 2015 - link
Ever since I bought an impact driver, all my problems with Philips-head screws vanished. They just don't strip anymore.
akamateau - Wednesday, June 3, 2015 - link
Hmmm.... perhaps you should learn how to use a Philips screwdriver. Do you even know why the Philips style drive was developed, or are you just ranting? The Philips self-centering screw solves several assembly issues that flat screwdrivers could not resolve. How many recessed precision machine screws do you see using flat blades? None.
Your attitude clearly shows a lack of a technical education.
Murloc - Wednesday, June 3, 2015 - link
The philips screwdriver was the first self-centering screwdriver; that's why it's not a low-quality standard.
The fact that it cams out was apparently intentional, so that the assembly speed of cars could be increased without using torque drivers, which weren't available. A source I've read disputes this claim but honestly, it doesn't matter; it was still easier to use and is good enough to build cars by hand.
The pozidriv was invented later as an upgrade for situations where you don't want this effect.
There's also the Torx.
These are all used.
The problem is that wood screws and other hobbyist stuff are stuck on the philips head because it's the first thing after the single-slot head that was decent and it took over as a standard.
Nothing you can do about it....
Morawka - Wednesday, June 3, 2015 - link
why are we talking about screw drivers in a free sync article.. stay on topic.
dsraa - Thursday, June 4, 2015 - link
lol
yefi - Saturday, June 6, 2015 - link
Screw you!
Zak - Wednesday, June 3, 2015 - link
I like my G-Sync monitor: smooth gameplay, no ghosting, no issues whatsoever. I'd take existing proprietary tech over inferior open standards that are yet to be fully implemented. If Freesync proves to be superior I'll switch, but considering AMD's track record the drivers will be riddled with bugs and problems. They need to work a lot harder on their software. That is the reason many people jumped ship over to Nvidia. It's not enough to look good on paper; the tech must work well too.
Flunk - Wednesday, June 3, 2015 - link
The actual technology is functionally identical; the initial implementation is the issue with AMD's solution. That's the sort of thing that gets worked out quickly.
Yojimbo - Wednesday, June 3, 2015 - link
AMD saying vague things doesn't convince me of that. Time will tell if it's functionally identical. Also, NVIDIA's implementation allows backlight strobing to be controlled as well. It's possible that, even if the problems with FreeSync can technically be worked out with their current implementation, by the time they do it, NVIDIA will have worked out combining G-SYNC with ULMB into a continuous blend.
MobiusPizza - Wednesday, June 3, 2015 - link
Your argument about AMD software or hardware support is totally invalid. Freesync is just a specification; there is no hardware or software by AMD. The ghosting issue is up to the monitor manufacturer to solve; they can include a hardware buffer, for example, which will make Freesync on par with Gsync with regards to behavior when falling below the minimum supported refresh rate.
Shark321 - Wednesday, June 3, 2015 - link
This is 100% wrong. If there were no hardware or software by AMD, then why does only a tiny fraction of AMD video cards support Freesync? You need both AMD hardware and drivers on the GPU side in order for Freesync to work. There is no working solution without AMD, even if one is possible in theory.
caqde - Thursday, June 4, 2015 - link
You are wrong. The implementation requires hardware on both sides: the GPU and the monitor. The monitor's side of the implementation is based on standards. On the GPU side, the GPU manufacturer (AMD/Nvidia/Intel) needs to support talking to the monitor using this standard (Adaptive Sync). This part is software and hardware. AMD's implementation is Freesync, but Intel and Nvidia are FREE to implement their own solution. Nothing is stopping them from implementing this, and AMD's solution is not the only answer possible. (Nvidia GSync does not implement Adaptive Sync, but it could in the future.)
chizow - Thursday, June 4, 2015 - link
@MobiusPizza: Please stop spreading misinformation and writing checks that AMD can't or isn't willing to cash, specifically the bit about the hardware buffer.
Refuge - Wednesday, June 3, 2015 - link
If everyone supported it (nVidia and Intel) then it would get a lot more attention from a lot more people. It's the age-old Kumbaya tale.
If these companies would work together for the betterment of all customers rather than just their own shareholders, then Freesync would be agnostic of AMD or nVidia or Intel, every monitor would have it, and it would scale from 9Hz to 2000Hz.
Because, to be honest, this should have been a standard long ago. I love green team; I appreciate their ingenuity and skill. But they are doing this wrong: they are doing well, but they are doing it wrong.
chizow - Wednesday, June 3, 2015 - link
Who was doing it right, before Nvidia invented VRR and brought it to market, and got it right the first time?
Michael Bay - Thursday, June 4, 2015 - link
You don't get it: they are wrong because they are doing it well, like, totally wrong.
Refuge - Wednesday, June 3, 2015 - link
Also, with it being an open standard it isn't really subject to AMD's "Terrible driver track record".
chizow - Thursday, June 4, 2015 - link
They spec'd the standard and the implementation, and it has to go through their logo program, so of course it is subject to AMD's terrible support record. In fact, it is this approach and mentality, clearly echoed by their supporters, that cements the notion: AMD loves to throw out open standards and then just expects work from ?SOMEONE? to make them good. Ridiculous lol.
pogostick - Tuesday, June 9, 2015 - link
Yeah. Like HBM. Ridiculous.
chizow - Wednesday, June 3, 2015 - link
@Zak, well said, I am glad others see the value in proprietary and don't just regurgitate all this open standards nonsense.
Zak - Wednesday, June 3, 2015 - link
I really have no problem with proprietary technologies if done well. Open source is good when it comes to software that can be easily patched, but not hardware, where features are pretty much hardwired and very little can be done in terms of firmware patches. All I know is that right now G-Sync is superior to Freesync because Nvidia put tons of resources into it. So I'm in the Nvidia camp and it doesn't matter if I am locked into a proprietary system, because I am, by choice. I've been enjoying my G-Sync display for a few months now without any problems.
chizow - Thursday, June 4, 2015 - link
Yep, same, I enjoy results, the best product for my dollar. Leave all the philosophical nonsense at the door, and let some guy who isn't in the market anyways espouse the benefits of poorly supported products.
JTWrenn - Monday, June 15, 2015 - link
At this point I just don't like that Nvidia doesn't have a freesync option for their cards. You must go gsync or no VRR for you. That seems silly to me, as supporting both would allow them to have the best of both worlds: support the open option and offer a better option through proprietary hardware. You get the cheaper guys who will go freesync, you get the more expensive ones who will go gsync, and you have the only cards that support both. Win win win.
akamateau - Wednesday, June 3, 2015 - link
What AMD is solving now with the HDMI solution is that laptops will soon have Freesync monitors. While DisplayPort has advantages, the HDMI port is still used in laptops as well as external monitors.
Freesync is for the masses as well. And that means more eyeballs for game studios to sell games to.
akamateau - Wednesday, June 3, 2015 - link
Oh yeah, HDMI is also used in tablets and mobiles. So Freesync could be transportable to the ARM universe as well.
Shark321 - Wednesday, June 3, 2015 - link
I think you didn't read the article at all. They are using a proprietary protocol over HDMI with customized HDMI firmware on the receiver side. It will NEVER work with existing HDMI connections.
chizow - Thursday, June 4, 2015 - link
Get ready for another 3 years of every HDTV with HDMI ever invented being able to upgrade to HDMISync for essentially free, with a firmware flash.
Alexvrb - Friday, June 5, 2015 - link
That's how they started off with DP FreeSync, and now it's part of the current standard as DPAS. The way I look at it, it's not about upgrading existing displays, but about making it a part of the next-gen HDMI standard so it sees wider-spread adoption. Once it improves to a certain point I would also like to see Intel supporting Adaptive Sync standards. Particularly in mobile where you can't just swap out the graphics card.
HarryMannbach - Wednesday, June 3, 2015 - link
This, this, a thousand times: this. I like AMD as a company because over the years they have worked to implement new and open standards to try to improve tech and make the world of computing a better place. nVidia seems primarily motivated by making their bank account a better place. Not digging on a company for trying to make money, but proprietary development is a massive hindrance on the proliferation of new technologies.
TheJian - Monday, June 8, 2015 - link
Wake me when they FIX Freesync. Currently you should avoid it unless you like ghosting etc as PCper etc have shown. Not ready for primetime, maybe they'll get it right at rev2. Heck, both screenshots here have ghosting/blurring...LOL. Does it matter how fast you rolled it out if nobody should really buy it because it still sucks?
bloodypulp - Wednesday, June 3, 2015 - link
With mobile GSync not needing the module Nvidia originally stated as requisite for the tech, they have already exposed GSync to be a scam. I have to give them credit. They've done a great job getting the true believers to throw money at them for implementing what AMD is giving away for free.
Eidigean - Wednesday, June 3, 2015 - link
Cute.
Don't you think it's possible, since it's all in one box, that they don't need to re-buffer the last frame just a few inches away from the GPU, in the case of running below the minimum refresh rate? It's not necessary to run DisplayPort internally (though some do, like Apple in the retina MBP) so NVIDIA could be talking directly to the TCON and simplify the system.
chizow - Wednesday, June 3, 2015 - link
The scam comes in when anyone who believes this buys a FreeSync monitor and sees all the problems that are resolved/mitigated by that "useless" G-Sync module.
Laptops don't need the module because the GPU is in direct control of the display using eDP standards. Such standards don't exist on desktop panels, hence the need for a scaler/G-Sync module.
Knowledge is power, and always superior to ignorance.
darth415 - Wednesday, June 3, 2015 - link
Jet fuel can't melt steel beams.
Kuad - Wednesday, June 3, 2015 - link
Mobile doesn't need it because there is no scaler in the mobile implementation; the display is directly connected to the GPU, as has been explained previously by NVidia.
akamateau - Wednesday, June 3, 2015 - link
Any new product release has growing pains. But Freesync has a potential that GSync does not. Hmmm.... it's free! And is not hardware dependent.
Kind of a no-brainer.
Shark321 - Wednesday, June 3, 2015 - link
It's free, but unfortunately it does suck. You get a free meal in a soup kitchen; it's still not comparable to a 5-course menu in a Michelin restaurant.
chizow - Wednesday, June 3, 2015 - link
@LordSojar: Exactly, which is why I will always prefer a proprietary supported product over a laissez-faire open standards one, even if the proprietary one carries a premium.
yannigr2 - Wednesday, June 3, 2015 - link
That's why we have hardware sites to test monitors and inform us about their quality. Being locked into one manufacturer's GPUs and paying over $100, close to $150 extra, for someone to give the OK to the product we are going to buy is not always the best thing.
It's also too soon to judge Freesync/Adaptive Sync and accept as a necessity having to keep paying $150 indefinitely for a sticker and also be locked into one manufacturer's GPUs.
chizow - Thursday, June 4, 2015 - link
It's not too soon. FreeSync has been on the market almost 3 months, 21 months since G-Sync launched, and a number of issues still exist, none of which exist with G-Sync. There is no doubt that today, G-Sync is the superior option.
People will continue to choose G-Sync because it is simply better.
Morawka - Wednesday, June 3, 2015 - link
forget IPS, i wanna see OLED pro art monitors.. IPS just can't compete.
Mark_gb - Thursday, June 4, 2015 - link
With Nvidia defining everything, maybe it will seem best for a little while. But there are real costs to limiting something like this to a small subset of the market. Over time, Freesync will work its way into most monitors and now most TVs, and that will keep the costs down. And the monitor manufacturers will find a way to eliminate the negatives you mentioned. It's in their interest to not have those issues, because they want to sell more monitors too.
FlyBri - Thursday, June 4, 2015 - link
@LordSojar an easy way to remedy that situation is for AMD to have "AMD Certified" Freesync monitors. The standard can still remain open source, but Freesync monitor manufacturers can choose to send in their monitors to AMD for certification so that they can have an "AMD Certified" badge slapped on them for buyers who want assurances that the monitor is of a certain standard and quality, especially when it comes to the implementation of Freesync.
medi03 - Friday, June 5, 2015 - link
The only chance for GSync is for AMD to quit the GPU business altogether.
It's an expensive piece of proprietary crap that, even ignoring price, doesn't offer ANYTHING at all on top of what FreeSync gives literally for free (it's so cheap, it's in all new upscaler chips already).
Oh, and limiting it to only a single port, "GSync DVI", isn't helping things either.
PS
And, oh, did nVidia finance a FUD campaign about FreeSync's "issues", eh?
What a pathetic company, good lord...
K_Space - Wednesday, June 3, 2015 - link
All fair and well, but without many high end panels out there to demonstrate the new technology it simply won't fly.
Marc GP - Wednesday, June 3, 2015 - link
They just need to convince the HDMI consortium to adopt it into the standard so every new monitor/TV will incorporate it.
I can't see why they wouldn't adopt it; it doesn't have any cost for them.
TristanSDX - Wednesday, June 3, 2015 - link
AMD's statement related to overdrive is nonsense. The monitor's scaler needs to know when the next frame arrives, and the GPU should provide this information to the monitor. Without overdrive FreeSync is useless. They made a broken standard - typical AMD level.
Ryan Smith - Wednesday, June 3, 2015 - link
Because you have to set the overdrive values at the time of refresh, it's impossible to know exactly when the next frame will be done. With a variable refresh display, the GPU has most likely just started rendering the next frame at the time you're painting your most recently complete frame to the monitor.
All you can do is guess based on how long the last few frames took, and perhaps some rendering stats from the GPU (though I'm not sure anything above the frame history is all that useful).
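For illustration only, here is a minimal sketch of the kind of guess being described: estimate the next frame's arrival from the last few frame times and use that estimate to pick an overdrive strength. The class name, window size, and weighting below are hypothetical; this is not NVIDIA's or any scaler vendor's actual algorithm.

```python
# Illustrative sketch: predict the next frame time from recent frame history,
# the main signal a variable-refresh overdrive controller would have to rely on.
from collections import deque

class FrameTimePredictor:
    def __init__(self, window=8):
        self.history = deque(maxlen=window)  # last few frame times, in ms

    def record(self, frame_time_ms):
        self.history.append(frame_time_ms)

    def predict_next_ms(self):
        if not self.history:
            return 16.7                      # no history yet: assume ~60 Hz
        # Weight recent frames more heavily than older ones.
        weights = range(1, len(self.history) + 1)
        return sum(w * t for w, t in zip(weights, self.history)) / sum(weights)

predictor = FrameTimePredictor()
for t in (16.5, 18.0, 21.3, 19.8):           # measured frame times in ms
    predictor.record(t)
print(predictor.predict_next_ms())            # estimate used to tune overdrive
```

A mispredicted frame time means the overdrive is tuned for the wrong hold duration, which is one plausible source of the ghosting differences discussed in this thread.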
Zak - Wednesday, June 3, 2015 - link
I believe that's what Nvidia does: they approximate the time the next frame arrives, and that is better than nothing at all; it still reduces ghosting apparently.
chizow - Wednesday, June 3, 2015 - link
The G-Sync module also communicates back to the GPU. AMD claimed this was bad because it introduced possible latency, but it looks like it was worth it!
The mobile implementation is definitely speculative however, but the GPU is in direct control of the monitor TCONs and overdrive.
Zak - Wednesday, June 3, 2015 - link
I have the 27" Asus ROG Swift which is supposed to be very fast and I don't notice any latency issues. Adaptive sync makes such an enormous difference that even if there is some minor lag related to the communication, its till better than without it. I have absolutely no complaints about G-Sync performance, seems to work exactly as advertised.chizow - Thursday, June 4, 2015 - link
I agree, Nvidia had a slide that basically debunked this (info came from Huddy, which is 99% guaranteed to be BS anyways) but of course the biggest proof was in the pudding where sites like TFTCentral/BlurBusters tested actual latency and found very little (and better than most panels).
I have the Swift as well, and I was skeptical during the G-Sync run-up until actual products launched and reviews were done that ended up selling me on the tech and like you, I found it was everything that was advertised and now, even more as Nvidia continues adding features to it.
praeses - Thursday, June 4, 2015 - link
Dead wrong regarding the latency.
The polling is causing significant issues when approaching the maximum supported by the monitor, to the point where G-Sync is causing problems with competitive gaming and is being disabled. Look up "G-Sync Input Lag" or "G-Sync Latency".
If you're lazy try this:
http://www.blurbusters.com/gsync/preview2/
chizow - Friday, June 5, 2015 - link
It is already disabled, so yeah non-issue. Unlike AMD, Nvidia listens to feedback and fixes their problems. ;)
Alexvrb - Friday, June 5, 2015 - link
Nixeus has a DPAS range of 30-144 and overdrive. If we can see this implemented more widely in the next generation of monitors that would be great. I'd like to see a review of this model in the near future! Decent budget options for DPAS are a good thing; it's the only way the technology will become mainstream.
edzieba - Wednesday, June 3, 2015 - link
ARGH! It would be preferable to simply encourage display manufacturers to switch to DisplayPort as the 'default' interface for all price ranges of monitor. HDMI is a consumer AV interconnect that gets bastardised for use as a 'DVI replacement'. This causes no end of problems with compatibility: anything other than a set selection of timings is technically outside of the HDMI spec, in the wild no-man's-land of "it might work, maybe, depending on who implemented what outside of spec", and will rarely behave in exactly the same way between different devices (e.g. Cisco VDI boxes that default to overscan on HDMI but only on certain monitors, depending on an arcane combination of EDID flags); issues with 'autodetection' of TV levels and chroma subsampling have gone unfixed (and in some cases remain unfixed) for years; etc.
HDMI is not for desktop monitors. It is an ugly hack. Stop bloody using it in them!
Senti - Wednesday, June 3, 2015 - link
I hope HDMI would just die so everyone can live happily on DP and connect everything with something like USB Type-C without any adapters/converters/botched colorspace issues.
I never understood why in low end products you mostly see HDMI, for which you have to pay royalties, and DP in high end products, for which you don't.
meacupla - Wednesday, June 3, 2015 - link
because some people want to connect a console to their monitor?
Flunk - Wednesday, June 3, 2015 - link
In a perfect world, your TV would be DisplayPort over USB-C as well. I don't see it happening, but it would make things easier.
chizow - Wednesday, June 3, 2015 - link
Different markets, different cash cows. I'm frankly happy they are mostly cross-compatible to be honest.
Zak - Wednesday, June 3, 2015 - link
As long as HDMI is offered along with DP I don't see a problem. More choices is always better.
TristanSDX - Wednesday, June 3, 2015 - link
All of this FreeSync/G-Sync is basically worthless and overhyped. You can get the same results with Adaptive V-Sync (capping FPS to the monitor refresh rate) and setting the quality level in game to match the monitor's refresh rate. That's all: fast, simple and free. G-Sync/FreeSync is useless for shooters, because 30-50 FPS is just too low for accurate tracking, aiming and shooting. The same goes for high FPS like 80-120. G-Sync/FreeSync provide minimal advantage, barely noticeable.
Zak - Wednesday, June 3, 2015 - link
OMG, another "use VSYNC, it's the same and it's free" troll. You have absolutely no idea what you're talking about. It clearly shows your fundamental lack of understanding what variable sync does, G-Sync or Freesync. There is no way Adaptive VSYNC works as well as variable sync. It sounds good but in reality is far from effective. It's a hack. I have two GTX980s that can run all my games at 60Hz or more and I have been doing exactly what you suggested with a standard 60Hz display: I would cap all my games to 60Hz using Adaptive VSYNC. But frame limiting isn't perfect, you may still get 59 or 61fps so I'd still get tearing an stuttering. Besides, you can't expect that your card will run everything at 60fps all the time. When I hooked up my G-Sync display, it's like day and night difference. You have to experience variable sync to understand. Now, when playing a game the framerates are all over the place and I still get smooth video with no tearing, no stutter regardless of my framerate. Also, since G-Sync monitors also have high refresh rates I enjoy my games at framerates higher than 60 without any glitches. My system can easily run all my games at above 60fps so I had to limit myself to 60 knowing I don't get the performance I should. But at the same time it can't run them at 120 or 144 so a faster monitor would not have helpe. Variable sync solves that. Variable sync does to games what SSDs did to storage. It completely changes your experience for the better.chizow - Thursday, June 4, 2015 - link
This reminds me a lot of the uninformed "SSDs aren't that big a deal" comments I read 5-6 years ago. Try it before commenting, or be prepared to put foot in mouth years later when you finally do see it for yourself.
ant6n - Wednesday, June 3, 2015 - link
What's "TCON"monstercameron - Wednesday, June 3, 2015 - link
Timing controller?
chizow - Wednesday, June 3, 2015 - link
@ Ryan & Jarred:Can we get a mea culpa from AT that retracts the previous stance FreeSync is an equivalent to G-Sync now? Honestly I hope you guys do a better job in the future empirically weighing the implementation before just going through marketing slides and declaring a tie, its a disservice to your readers and the pedigree of this site.
Personally I can't believe AMD is just shuffling on to the next poorly supported "open standard" without first fixing and addressing the issues with FreeSync. It's, frankly, appalling.
Eidigean - Wednesday, June 3, 2015 - link
I second this motion.
Shark321 - Wednesday, June 3, 2015 - link
agreed.
kyuu - Wednesday, June 3, 2015 - link
Your inability to interpret any article regarding AMD without your "everything AMD does sucks because I have a weird emotional reaction to that series of letters" filter is really irritating. Nothing about the previous analysis of the technologies was inaccurate and this article does not say what you and others with your mindset seem to think.
Wouldn't it be more productive for you to go and start your own totally awesome tech blog IHateAMD.com where you could cover every issue from your own warped viewpoint and leave AT in peace?
chizow - Thursday, June 4, 2015 - link
Are you done fanboying/apologizing for AMD?
Great, if so I'd like to know how long you've been reading AT, or using the internet as your primary source of information, for that matter. AT, when Anand founded it, was built on the principle of NEVER taking what the vendors said at face value. To poke, dive, dig into what was said and, y'know, test the validity of those claims, in doing so often finding some cool, unusual and interesting results of that tech, but ultimately FORCING VENDORS TO ADDRESS ISSUES, which led to better products for the consumer.
Nvidia made a major breakthrough with VRR, few will discount this fact, and all the major press sites covered it and agreed it was fantastic new tech as it rolled out on the market exactly as described. Meanwhile, AMD began ramping up their usual anti-proprietary jargon, while making a ton of promises that mostly turned out to be bogus anyways.
Then they finally launch some 18 months later, and not only declare an equivalent solution, but even go as far as to say it is SUPERIOR. Now, the uninformed, like you, take it at face value and believe it, but sites other than AT actually noticed problems going back as far as CES, and did what AT *USED* to be known to do: more in-depth testing that clearly IDENTIFIED THE PROBLEMS SO THEY COULD BE ADDRESSED. What I never understand, and a large part of the reason I simply can't get behind AMD products, is the proclivity of their users to try and sweep problems under the rug, as if they didn't exist. The same exact thing happened with FCAT; it is as if YOU WANT an inferior product.
In the end, you get what you pay for and yes, for as long as their products come out with these kinds of issues and their supporters, fanboys, apologists, like you, continue to meekly pretend nothing is wrong, my views and buying habits won't change, at all. Needless to say the market agrees with me, which is why we may not have to concern ourselves with AMD for much longer.
Southrncomfortjm - Wednesday, June 3, 2015 - link
Sooo, maybe we will see FreeSync on TVs in a few years? That would be epic.
Shark321 - Wednesday, June 3, 2015 - link
no, since AMD is using a proprietary protocol for Freesync over HDMI. You will not be able to connect normal devices, only AMD GPUs.
ant6n - Wednesday, June 3, 2015 - link
I think the point is that they're trying to create a standard here, just like they did with DisplayPort 1.2a.
User.Name - Wednesday, June 3, 2015 - link
This is huge!
With LG already having some monitors which support FreeSync via DisplayPort, hopefully they will be able to update their televisions to support FreeSync via HDMI.
Though there are some issues with the first generation of FreeSync panels having high minimum refresh rates and overdrive either not working at all, or only working well within an even narrower range of framerates, I am sure those will be worked out.
Minimum refresh rate seems as though it could be solved in software as long as the maximum supported refresh rate is at least double its minimum: have the GPU store the current frame and update the panel twice if the framerate is below the minimum refresh rate for that display. E.g. display 25 FPS at 50Hz on a panel which supports a variable range of 40-120Hz.
Overdriving seems like it may have been an oversight with the first-generation displays which could be solved inside the display itself - though that might raise the cost a bit.
But if they can get FreeSync into the official HDMI spec, we could potentially be seeing OLED televisions that support say 0-120Hz with no need for overdriving at all.
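For illustration, here is a minimal sketch of the frame-multiplication idea described in the comment above: when the frame rate falls below the panel's minimum variable refresh rate, repeat each frame enough times that the effective refresh rate lands back inside the supported range. The function and values are hypothetical examples, not AMD's or NVIDIA's actual low-framerate handling.

```python
# Illustrative sketch: pick how many times to repeat each frame so the panel's
# effective refresh rate stays within its variable refresh range.
def refresh_multiplier(fps, panel_min_hz, panel_max_hz):
    if fps >= panel_min_hz:
        return 1                      # already in range: one refresh per frame
    multiplier = 2
    while fps * multiplier < panel_min_hz:
        multiplier += 1
    if fps * multiplier > panel_max_hz:
        raise ValueError("frame rate too low for this panel's VRR range")
    return multiplier

# The example from the comment above: 25 FPS on a 40-120 Hz panel.
print(refresh_multiplier(25, 40, 120))   # 2 -> panel refreshes at 50 Hz
print(refresh_multiplier(90, 40, 120))   # 1 -> panel refreshes at 90 Hz
```

As the comment notes, this only works cleanly when the panel's maximum refresh rate is at least roughly double its minimum; otherwise there is a band of frame rates that no multiplier can cover.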
Shark321 - Wednesday, June 3, 2015 - link
Since Freesync over HDMI does not use the HDMI protocol at all, I highly doubt it will be a part of the HDMI standard. Additionally, the HDMI consortium needs 5 years to implement even the slightest changes; see HDMI 2.0, which was in development for an eternity.
Ryan Smith - Wednesday, June 3, 2015 - link
It's a custom protocol in as much as it's not bog-standard HDMI. However it is developed as an extension to HDMI since AMD is seeking to have it integrated into the standard.
Shark321 - Thursday, June 4, 2015 - link
I still doubt it will be a part of the standard within the next 3-4 years, as the HDMI consortium is the slowest organization in the world.
chizow - Thursday, June 4, 2015 - link
Agreed, especially when HDMI SIG members are not the same as DP; their primary interests are in the Home Theater/consumer electronics market rather than the PC market.
It will certainly create an interesting dichotomy if HDMISync ends up being proprietary to AMD. :)
bbarrett - Thursday, June 4, 2015 - link
If I can run freesync on a 4k tv I'm sold and jumping ship to AMD. Price and size of monitors gets me every time because I enjoy the 1080p experience on a 42in or greater TV. Gsync has been relegated to only specific monitors and brands which means waiting...and waiting for the perfect monitor to come to market.