No, there wasn't. I didn't take proper notes, but I believe the Samsung panel was TN, the BenQ was TN as well (maybe?), and the LG display was IPS (or AHVA or similar).
Is it likely that the BenQ monitor was using the AU Optronics panel M270DAN02.3? It's the only panel I can think of that is 27" 1440p 144Hz. It's likely this panel is also being used in the upcoming Acer XB270HU, which is another 27" 1440p 144Hz IPS with G-Sync.
Entirely possible -- I am quite sure Samsung was 4K TN, but it may be that both the BenQ and LG were wide viewing angle. I know one of them was, but that's all I can remember. (Too many products seen in too few days....)
I am really not liking this "FreeSync" vs. "G-Sync" differentiation in the monitor segment. Monitors last for donkeys' years, and graphics cards seem to get replaced every time a new fruit fly hatches (around here, anyway...)! I DO NOT want any lock-in between my monitor and a brand of GPU. In fact, I am tempted to not buy a monitor that has any kind of lock-in. How long will it be before the smart monitor manufacturers decide to start making a monitor that will do both...?
Keep in mind that FreeSync uses DisplayPort Adaptive-Sync and should thus work with any GPU that supports the standard. Of course, G-SYNC displays work fine with AMD GPUs as well, just without the variable refresh aspect. G-SYNC will remain an NVIDIA exclusive I'm sure, but if FreeSync gains enough support NVIDIA might be forced to abandon G-SYNC and support FreeSync instead. Time will tell if FreeSync will get enough support, but it seems more likely than G-SYNC taking over.
Again, it will come down to which solution is better. Why would Nvidia need to abandon G-Sync when their solution holds an overwhelming majority of the potential TAM for variable refresh rate monitors that matches their high-end dGPU marketshare?
I suspect many expected the same of CUDA as soon as OpenCL rolled out a few years ago (y'know, open standards, multiple disinterested industry parties backing it, etc.), and yet CUDA is still alive and dominating today. Because it's the better solution.
I expect nothing less when it comes to G-Sync and FreeSync. The market will respond, and the better solution will prevail.
Silly, nVidia doesn't care about "potential" – all they need is vendor lock-in. I'm quite sure that the technical differences for video cards are minimal.
About CUDA: almost everyone who writes it hates nVidia for that. It's far from the "better solution", but you get great developer tools for it (including very important ones, like the profiler), while for OpenCL nVidia provides you almost nothing (even AMD's profiler is more informative on an nVidia GPU than nVidia's own). And it's all pure marketing, as both CUDA and OpenCL compile to the same PTX on nVidia, so there are no technical reasons for such disparity in support. With equal support and tools, CUDA would have been dropped and forgotten long ago.
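To illustrate that last point, here's a minimal sketch: the same trivial SAXPY kernel written in CUDA, with the equivalent OpenCL C kernel shown in a comment. The two are nearly line-for-line identical, and on NVIDIA hardware both front-ends are lowered to PTX, so the gap really is tooling, not language. (The host code is just a toy harness of my own, not from any vendor sample.)

#include <cstdio>
#include <cuda_runtime.h>

// SAXPY in CUDA C++: y = a*x + y.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) y[i] = a * x[i] + y[i];
}

/* The equivalent OpenCL C kernel (normally handed to clBuildProgram as a
   string) is nearly identical:

   __kernel void saxpy(int n, float a, __global const float* x,
                       __global float* y) {
       int i = get_global_id(0);
       if (i < n) y[i] = a * x[i] + y[i];
   }
*/

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory, CUDA 6+
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();
    printf("y[0] = %f\n", y[0]);                // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}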
You forgot the fact that CUDA came first and therefore already had developer tools created for it. Creating new tools requires time and money. If CUDA works fine for nVidia, why would they bother making new tools?
Also, Nvidia has been more successful selling professional cards due to the effort they have made, specifically in virtualization features. AMD has been playing catch-up in the enterprise for video cards for years.
Previous gen NVidia cards had poor OpenCL performance too.
Yay, more nonsense. I guess Nvidia shouldn't care about creating new, innovative technologies that didn't pre-exist, simply because everyone would love them more for just waiting for some standards board to invent the same thing years later?
And almost everyone who writes for CUDA hates it? What kind of nonsense is this? People love and use CUDA because IT WORKS and makes THEIR JOBS, and therefore their LIVES, easier. I administer and support a number of users and machines that use CUDA and Nvidia solutions, and at no point have I ever heard any of this gibberish; in fact, they make it a point that the solution MUST be Nvidia CUDA-based for their needs.
CUDA works best because of the ~$7 billion invested in it, ~8 years of building the ecosystem for it, the 500+ schools that teach it, the fact that every popular app for content creation etc. uses it, and so on. The reason OpenCL sucks is there is no company backing it like NV backed CUDA for years. You get what you pay for (mostly), and NV paid and paid and paid for CUDA, and they're getting what they paid for now, year after year.
Please show a test where CUDA loses to OpenCL. Find the best OpenCL AMD app and pit it against the best NV CUDA-enabled app that does the same thing. CUDA will win pretty much always. There really is a REASON CUDA owns 80% or so of the workstation market. People who are risking money (i.e., their business) don't buy crap for tools on purpose. I could hate NV/CUDA and I'd still buy it today due to nothing being better. If you don't, the guy who does will blow your doors off until you go bankrupt.
As long as Cuda is faster, NV has no reason to do anything but develop new versions of cuda. No point in helping the enemy catch up. That would be stupid business and cause me to drop their stock like a hot rock...LOL.
I think Intel may weigh in here, and if their integrated graphics support FreeSync (since it is in the DisplayPort spec), then FreeSync will dominate.
I don't care for vendor lock-in because I do go back and forth between vendors. I have multiple machines in the house. Currently my house is AMD R9 290, but it was Nvidia GTX 660. Integrated and discrete in laptops. Generally discrete GPUs in laptops I lean towards Nvidia because I don't want to play the Enduro "does it work?" game.
They will not abandon G-Sync. But if there are a lot of FreeSync monitors in the wild (and with Samsung and LG backing, there will be), they will just enable FreeSync compatibility on their cards. So they will support both.
Where do you get $150 monitors being the same as $800 monitors? Excluding G-Sync on any panels in the market, do you think you "get the same" on a $150 monitor as a $600 non-G-Sync equivalent?
I know you're trying to perpetuate the hopeful myth all monitors will be FreeSync compatible simply because Adaptive Sync is an optional component of the DP1.2a standard, but let's be real here, not all monitors that support DP1.2a are going to be Adaptive/FreeSync compatible.
No, there's not a single DP 1.2 monitor that supports Adaptive Sync or FreeSync, or more specifically, variable refresh and dynamic frame rates. No firmware or software update on the planet is going to replace the missing hardware necessary to make FreeSync a reality.
Haven't you read the articles? DP1.2a was ratified in May 2014 and is an OPTIONAL part of the standard that is not backward compatible with existing displays, because that OPTIONAL functionality requires new scaler ASICs that were not developed until after the spec was ratified; news of them came sometime in September 2014, and pre-production samples were finally displayed this week at CES.
I don't blame you for being confused on this though, I completely understand all the noise regarding FreeSync's development has been very misleading, which is why I think most companies choose to develop the tech and begin production in working devices before making a ton of unsubstantiated claims about it. :D
We will hear for YEARS the prior falsehoods the AMD fanatics just spewed in comments over the last 2 pages, and it will become their always proclaimed belief system, despite the facts. The chip on their shoulders will grow, they will never have any valid counterargument, and of course the rest of us will have to put up with it. When freesync doesn't pan out properly, is less effective with less flexible features, has problems working with many games, and requires hacks and has driver issues with AMD, the same crowd will blame nVidia for not "supporting it" and "causing AMD problems". They will then deeply desire a huge lawsuit and a payment from nVidia directly to AMD, while having exactly the OPPOSITE stance when it comes to AMD's Mantle.
Amen. I'm hoping G-Sync dies out pretty quickly so we don't have two competing implementations for too long. It's not likely G-Sync will be the better option for interoperability due to NVIDIA's reluctance to license their technology.
I always find this to be an interesting dichotomy. AMD fans will frequently parrot the internet meme that the world needs AMD for the sake of competition against the likes of Intel and Nvidia, or we'll all end up somehow perishing under the oppression of $10,000 CPUs and GPUs and that competition is always good and necessary for the consumer.
But when Nvidia comes out with an innovative new technology that AMD doesn't have in an attempt to differentiate and better their products for their customers, these same AMD fans want them to fail. Because they're competing too hard?
G-Sync is a proprietary standard. Freesync is just a brand name for DisplayPort Adaptive Sync. The difference is Nvidia could easily implement Freesync if they wanted to, but AMD cannot implement G-Sync. Would you rather have a single standard that only one company is allowed to use, or a single standard that everyone can use? :) If there wasn't an open alternative standard you'd have a point, nothing wrong with good technology, but the fact is a world where everyone implements adaptive sync would be a world that's better for everyone, with the possible exception of nvidia's marketing department.
Also, I wouldn't really compare the GPU world to the CPU world. While I'm sure you'll respond to this with something pointless about the state of the market over the last couple of months, AMD STILL have the fastest single card and have come out with faster cards than Nvidia for years. Nvidia is not like the graphics version of Intel - frankly they do as much to make AMD keep competing as AMD does to make them keep competing.
It's DisplayPort because it was given to VESA for free by AMD and VESA accepted the IP. This was probably a matter of NVIDIA being months ahead with G-Sync and controlling over 60% of the discrete desktop graphics chip market share. AMD couldn't compete with NVIDIA on the matter so their best strategy was to be defensive and give it away to take the advantage away from NVIDIA. However as long as you have a current NVIDIA video card, FreeSync won't work with it, so if you want that feature, you'll have to get a G-Sync monitor instead. If FreeSync works just as well as G-Sync it would be nice if NVIDIA would support FreeSync, but I wouldn't count on it. They spent money developing it first and on their own, and I believe it or something similar has already been available for their Quadro cards since 2012, so it would mean supporting both standards at once for a while. Unless they are forced to, they probably don't want to incur extra cost overhead from an innovation they developed because a competitor decided to develop something similar and give it away.
Hmmm... Adaptive-Sync is from VESA and is a much older "thing" than the FreeSync rebranding made by AMD. AMD just gave a new name to already existing technology, and did show a new way of using existing technology to achieve a similar effect to what can be done using G-Sync. All in all, the open standard is better. That is why DX12 will beat Mantle in the long run (even though DX is not open, it is used by all competitors). That is why Adaptive-Sync will win in the long run. Intel will use it to make their IGP look less bad, and AMD has to use it because they want to compete with Nvidia. Nvidia will wait and make money as long as they can by selling G-Sync products. When G-Sync stops selling, Nvidia will release new super drivers that allow Nvidia adaptive sync with all their products, because they don't want to use the same name as AMD, who will keep branding the exact same system as FreeSync because it sounds better than Adaptive-Sync, even though it is exactly the same thing...
Actually, according to AMD, Nvidia can't just support FreeSync. In fact, even most of AMD's recent GPUs can't support it. Who knows if Intel can? FreeSync has an even smaller addressable market than G-Sync right now, and this is a fact.
http://techreport.com/news/25867/amd-could-counter... According to AMD's Raja Koduri: "The exec's puzzlement over Nvidia's use of external hardware was resolved when I spoke with him again later in the day. His new theory is that the display controller in Nvidia's current GPUs simply can't support variable refresh intervals, hence the need for an external G-Sync unit."
Off the bat, I have both an Asus 970 and an Asus R9 290: subjectively they are the same for gaming, but for OpenCL AMD kicks ass... I have had the Universe Sandbox 2 alpha for the last year and a bit (beta testing it), and the performance of AMD cards is huge when lots of stuff is happening in the simulation. In my experience, AMD is the champ for anything that uses OpenCL that isn't a benchmark... However, the 290 is a reference card that I picked up when it was released, and yes, it gets much louder than the 970... Both cards are great. On the CPU side, I like their idea of the APU and HSA, but it needs to be faster and on par with Intel's IPC before I ever go back to them.
Consider that only a handful of latest AMD cards support Freesync.
nVidia will not have issues supporting it in their next generation of cards (along with G-Sync). Intel could do the same with Skylake (and mind you, this tech is best for low-performing gfx cards).
With Samsung and LG backing this technology (and Asus having a non-branded, but probably compatible, monitor as well), this is bound to get some serious market share. I mean, it looks to me like the FreeSync monitor offering is going to be wider than G-Sync's, even though that one had a one-year head start.
Let me know when AMD shows it running GAMES (a lot of them) to prove it works. I'll take the one that is BEST, no matter who owns it if they also have a great balance sheet (higher R&D, profits, cash on hand etc) and can keep investing in the best drivers for the gpu I'll need to buy to use with the monitors. While I currently run a 5850, it will be my last AMD card/product for a while if Gsync is BETTER. I already have zero interest in their cpus due to Intel. I hate that, but again, why buy 2nd when they're so far behind? I just can't do that anymore these days. AMD has to beat NV in the gpu perf with at least the same power AND at least MATCH Gsync (no loss in quality/perf results) or they lose my money for ages this time.
NV owns 67% of discrete, AMD owns 33% or so. NV owns 80% of workstations. Umm, NV is like Intel to AMD in GPUs. AMD owns next to nothing these days due to selling everything off to stay afloat. Profits are rare, the balance sheet is in shambles, etc. The complete opposite of NV. NV wins in every financial category. These two companies are not even on the same playing field today (they used to be, but not now). Please take a look at their financials for the last decade, then explain to me how they are even. NV has put as much money (~$7B) into CUDA over the last 8 years or so as AMD has LOST in the last decade (about $7B of losses). If that isn't night and day, I don't know what is. You're not making sense, sir. I don't care about everyone, I care about having the best on my desk that I can get ;) If that means proprietary, I'll go that way unless 2nd is so close you can't tell the difference. But we can't say that here yet, since AMD seems afraid to show it GAMING.
You talk like you've seen freesync running 100 games. NOBODY has. I'm more worried about being stuck with a CRAP solution if NV caves while having the BEST solution already in hand and done (don't think they'd do that, but...). Maybe AMD would have a marketing dept if they'd start making some money so they could afford a REAL one. They need to start putting out GREAT products instead of "good enough for most", so they can charge a PREMIUM and make some profits for a few years in a row. The only reason AMD has the fastest single card is NV didn't want one ;) It's not like they don't have the cash to put out whatever they want to win. I'm fairly certain AMD isn't making wads of cash on that card ;) Their balance sheet doesn't show that anyway :( Whatever they spent to dev that card should have been spent on making BETTER cards that actually sell more volume (along with better drivers too!). NV could probably buy AMD if desired at this point (stupid, but they could). NV's market cap is now 5.5x AMD's and they spend more on R&D with less products (AMD makes a lot of cpus, etc but spends less on R&D). NV is beating AMD for the same reasons Intel is. Smarter management (should have paid 1/3 for ATI, should have passed on consoles like NV, etc etc), better products, more cash, less debt, more profits. That's a LOT for any small company to overcome and we're not even talking the coming ARM wave to steal even more low-end crap on the cpu front where AMD lives (getting squished by ARM+Intel sides). You should get the point. You can't compete without making money for long.
@Antronman, this is the 2nd statement you've made that is more or less, nonsense.
Are you some kind of Hope Merchant? Are you really trying to say HBM is automagically going to result in 144fps minimums? Is HBM even confirmed for AMD's next cards? Bit of a stretch since you're kinda going 2 degrees of FUD separation here, don't you think?
Back in reality, many people will benefit from less-than-max refresh rates without having to use triple buffering, while solving the issue of tearing and minimizing input lag/stutter.
But several multi-card setups (and even now some single cards) can attain 144fps.
Adaptive refresh rates are only useful if you have a card that can't maintain a minimum fps above the refresh rate of a fixed-refresh panel. That's assuming maximum refresh rates don't go any higher than they are now (if we can even see the difference at higher refresh rates).
Ah, so you posted a bunch of rubbish and only clarified once called on it, gotcha. Why am I not surprised? Sounds like more "noise" within a distinctly pro-AMD diatribe.
There are very few multi-card set-ups that can achieve and maintain 144fps minimums in modern games, and also, what resolution are you talking about? 1080p? 720p? This new wave of FreeSync and G-Sync monitors is moving beyond 1080p and pushing 1440p, which is nearly 2x as many pixels as 1080p, meaning it is that much harder to maintain those FPS.
Same as 4K: it's 4x the pixels/resolution of 1080p, so yeah, you're looking at video cards and combinations that don't exist yet that can maintain 60FPS minimums at 4K for modestly demanding, recent games.
In every single one of these situations, variable refresh would work wonders for end-users. In fact, spending a little extra money on a variable refresh monitor may end up saving you from having to spend 2-4x as much on video cards to try to achieve the unrealistic goal of minimum FPS that meets or exceeds maximum monitor refresh rates of 144Hz, or 60Hz at 4K.
Because it's the Apple way of doing things. You have to buy into the NVidia ecosystem and stay there. Nothing prevents NVidia from supporting both their standard and FreeSync, or Intel from offering it in their APUs. Since there's nothing to license, it should eventually be easy for monitor manufacturers to build it into every monitor they offer, even if in limited form for cheaper panels that wouldn't support the same range of refresh rates. No one argues Apple doesn't make good stuff, and they have helped drive the mobile and desktop market in beneficial directions for a while, but it's still Apple, and most of us don't want to live under their roof. Same goes for Nvidia.
It's also the AMD way of doing things (TrueAudio), and just about every company's way of doing things. AMD simply felt they couldn't compete here, so they played defensively instead of going head-to-head on it with NVIDIA.
What's defensive is that AMD is penniless, so they go the second-hand generic route and demand, like all their fans do, that it become the universal monopoly. Thus, when Bill Gates and Microsoft OWN the OS on every computer in the world, you AMD fans need to be reminded: that's the way you like it.
And thus is the nature of competition, isn't it? You make your products better than the competition, to benefit your existing customers and attract new ones?
I guess Nvidia, Apple, Intel and everyone else should just stop innovating and developing new tech, or just give it away for free so that their competitors can catch up? Or even more laughably, give up on their original, pre-existing, and superior tech just because their competitor is finally offering an inferior analogue over a year later?
There are still a lot of unanswered questions about FreeSync beyond what is covered here, such as cost, lag/latency, and framerates beyond max refresh. In PCPer's analysis, they mentioned a potential deal-breaker with FreeSync when framerates exceed the monitor's refresh rate. They mention it was a "design decision" by AMD to either force the user to enable Vsync in this situation, or to suffer from screen tearing, thus negating the benefit of variable/dynamic refresh. I'm really not sure why AMD didn't implement a soft/driver framerate cap like Nvidia did with G-Sync.
There are also questions about input lag/latency, as G-Sync has held up favorably when tested independently against the fastest panels with V-Sync off. We will have to see how FreeSync fares against them.
In the end, it will simply come down to which solution is better, as usual. As I expected months ago, and as it looks to be shaping up from these early demos, G-Sync appears to be the superior solution, and as such, it will command a justified premium price. These features on monitors will just end up being another checkbox feature that develop each vendor's ecosystem and influence a buying decision, such is the nature of competition.
@Jarred: did AMD mention when you might be getting some review samples? Look forward to seeing some tests and general impressions!
"In PCPer's analysis, they mentioned a potential deal-breaker with FreeSync when framerates exceed the monitor's refresh rate. They mention it was a "design decision" by AMD to either force the user to enable Vsync in this situation, or to suffer from screen tearing, thus negating the benefit of variable/dynamic refresh. I'm really not sure why AMD didn't implement a soft/driver framerate cap like Nvidia did with G-Sync."
Uh, what do you want it to do? How is a framerate cap any better than v-sync capping the framerate?
@Gigaplex: the answer is quite simple, you can have a framerate cap WITHOUT Vsync, because in no way are they mutually dependent. The downside of course on traditional displays is that you still get screen tearing, but on a dynamic refresh rate monitor, you won't get tearing; you get the most recent, full frame WITHOUT the associated input lag.
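To make that concrete, here's a minimal sketch (a hypothetical host-side C++ render loop of my own, not any vendor's actual driver code) of a frame cap WITHOUT V-Sync: the loop paces its own frame submission and never blocks on the display's vertical blank, so no V-Sync-style back-pressure or input lag is introduced.

#include <chrono>
#include <thread>

// Stub standing in for "draw one frame and present it immediately"
// (tear-free on a variable refresh display, tearing on a fixed one).
void render_and_present_frame() { /* draw + present */ }

void run_with_frame_cap(double cap_hz, int frames_to_run) {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / cap_hz));
    auto deadline = clock::now();
    for (int f = 0; f < frames_to_run; ++f) {
        render_and_present_frame();   // newest frame goes out right away
        deadline += budget;
        if (deadline < clock::now())  // GPU slower than the cap: don't
            deadline = clock::now();  // accumulate "frame debt"
        // Unlike V-Sync, we only ever wait on our own timer, never on the
        // monitor, so latency stays bounded by roughly one frame.
        std::this_thread::sleep_until(deadline);
    }
}

int main() { run_with_frame_cap(60.0, 600); }  // e.g. cap at 60fps for ~10s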
G-Sync does this; with FreeSync it is less clear, because it appears they are still relying on some form of V-sync (most likely triple buffering for sub-native refresh) which reverts and breaks above scaler-supported refresh rates, forcing them to enable V-Sync at the application level.
Honestly, to me it looks like they've basically implemented Nvidia's older frame rate control technology, Adaptive Vsync, in hardware at the scaler level, without the "automatic" switching of Vsync On/Off. As a refresher, Adaptive Vsync was introduced by Nvidia with Kepler: it automatically enables Vsync when the frame rate is at or above the monitor's native refresh rate (to eliminate tearing) and disables it when the frame rate drops below native refresh (to avoid the stutter of snapping down to a lower sync interval). I am sure this work was instrumental in coming out with their ultimate solution, G-Sync, which bypasses Vsync and the problems associated with it altogether.
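As a toy illustration of that policy (my reading of the description above, not NVIDIA's actual driver logic):

// Toy sketch of the Adaptive V-Sync policy: V-Sync on while the renderer
// can match the panel's fixed refresh (eliminates tearing), V-Sync off once
// it falls below (avoids the stutter of snapping down to refresh/2).
bool adaptive_vsync_enabled(double measured_fps, double panel_refresh_hz) {
    return measured_fps >= panel_refresh_hz;
}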
G-sync only works within a certain frequency range as well. It's no different than FreeSync in that respect. And AMD is developing a Dynamic Frame Rate Control feature for their drivers. So G-sync and FreeSync do appear to be fairly closely matched.
Well, except for the fact that G-sync requires a custom FPGA board to be installed (extra cost), requires this FPGA board to be tuned to the panel of the monitor it's going to be installed in (extra cost) and requires a licensing fee to be paid to Nvidia (extra cost). Since the only additional cost for FreeSync is an updated scaler, G-sync will always be more expensive than FreeSync.
Not to mention G-sync locks you into nvidia GPUs only while FreeSync is based off an industry standard which any company is free to develop. And Adaptive-Sync (which FreeSync is based off) also has power savings benefits. Since this is an industry standard, I would expect Intel to eventually jump on the bandwagon as well.
Right out of the gate, FreeSync is looking to eclipse G-sync. AMD has reported that there could be as many as eleven different FreeSync capable monitors on the market by the end of March. How many models of G-sync monitors have been released in the 12 months since G-sync came out? Seven?
Yes, we need objective reviews between G-sync and FreeSync before we can make a meaningful comparison. But G-sync seems far from being "the superior solution" as you put it.
There's no point in touting FreeSync as an open standard until others adopt it. There are so many open standards that are not used, and a standard is basically proprietary if only one company uses it. Micro-USB Type-A, for example.
G-Sync works in a wider frequency range, and by design, it can't go out of this range, never forcing the user to make a choice between negating the benefit of dynamic refresh by enabling V-Sync or suffering tearing, like FreeSync does. And yes, AMD has recently been touting its DFRC feature as a built-in framecap, but it's curious why they didn't just make this a soft cap enabled globally for FreeSync. It's most likely because FreeSync is still tied to and limited by the monitor refresh, so exceeding the refresh intervals reverts back to tearing, but I am sure we will get more in-depth info on this once they actually hit the market or review samples go out.
The G-Sync module with custom FPGA and onboard DRAM is necessary, but it's also most likely why Nvidia ends up with the better solution in G-Sync. It's hard to emulate in software what is being done in hardware, which is what I think AMD is running into with their FreeSync solution. Not to mention, we still don't know the price difference between the "expensive" G-Sync module vs. the custom, premium scalers AMD and their partners took 8-12 months to bring to market, but they certainly have changed their tune from the days they were claiming FreeSync would be an essentially free upgrade that might only need a firmware update on existing monitors. :D
And FreeSync locks you in just as much as G-Sync; as long as no one else supports FreeSync, it's in the exact same boat. A worse boat, actually, since Nvidia commands an overwhelming majority of the dGPU market at a 70/30 rate for supported GPUs, and not even all AMD GPUs in the same generational span (again, I hope I don't need to redefine this for you) are capable of supporting FreeSync. Meanwhile, all Nvidia GPUs from Kepler onward support G-Sync. Intel has shown no interest in FreeSync, which is really no surprise given its main benefit is gaming, a market Intel doesn't really address with their IGPs, and they got all they needed out of the eDP spec when they invented and developed it for reduced-refresh power saving years ago in their laptop displays.
And right out of the gate FreeSync is looking to eclipse G-Sync? Haha, how so? Nvidia has just as many G-Sync panels on the market TODAY as AMD has ANNOUNCED, and from what we have seen already, FreeSync is inferior to G-Sync in every way but *POSSIBLY* cost. We don't even know if FreeSync addresses the other major problem associated with V-Sync, input lag/latency, until reviewers and interested sites get their hands on them. So yes, I guess a solution that is already demonstrably inferior, is still an unknown commodity in terms of cost, is still not available for purchase, and still has major questions about its compatibility and implementation has somehow eclipsed a solution that is available today and has done everything it claimed to do from Day 1 in G-Sync? Fanboy fantasy at its finest.
But yes, we do need more objective reviews to further dissect and expose the differences between G-Sync and FreeSync. G-Sync has been on the market for about a year and has held up against such scrutiny; all the questions are surrounding FreeSync, but from what we have already seen, it is falling short.
The stated compatible refresh rates for FreeSync are from 9Hz to 240Hz. I haven't been able to find any upper or lower limits listed for G-sync, but I fail to see how it could possibly work "in a wider frequency range" than FreeSync as you claim it can.
FreeSync does not require a custom FPGA board, does not require custom tuning for each panel, and does not require that royalties be paid back to AMD. Therefore, given two otherwise identical monitors, the FreeSync display will always be less expensive than the G-sync alternative. That's a huge win for FreeSync right there. And you are wrong about AMD claiming that it would be a free upgrade for existing monitors. They explicitly stated that although their test monitor was able to utilize FreeSync with only a firmware update, this firmware would most likely NOT be made available by manufacturers.
From the Anandtech "FreeSync Monitor Prototype" article: "At this point AMD is emphasizing that while they were able to get FreeSync up and running on existing hardware, owners shouldn't be expecting firmware updates as this is very unlikely to happen (though this is ultimately up to monitor manufacturers). Instead AMD is using it to demonstrate that existing panels and scalers already exist that are capable of variable refresh, and that retail monitors should not require significant/expensive technology upgrades."
You can claim that AMD's FreeSync "locks" you in to their hardware, but only because Nvidia has stated that they refuse to support Adaptive-Sync in favor of their proprietary version. G-sync is not an industry standard. Adaptive-Sync is an industry standard. We don't know yet whether or not Intel plans to support AS in the future, but it would be almost inconceivable for them not to. Adaptive-Sync benefits lower end systems more than dedicated gaming machines. Laptops and business computers (more often than not) fall into the "lower end system" category. And Intel sells a LOT of hardware that ends up in laptops and entry level computers.
Naturally not all AMD video cards will support FreeSync. Just as not all Nvidia cards support G-sync. There will always be a cutoff line for hardware that simply will not be compatible with new technology. For G-sync, it's Kepler and newer. For AMD, I believe it's Tahiti and newer. I don't know where you're trying to go with this one. It's simply the way of things in the technology world that not all old models will support new features.
And I stand by my assertion that FreeSync appears to have a better potential for success than G-sync. It's taken Nvidia an entire year to get as many models out as AMD has announced that will support FreeSync within months. And while Nvidia may have a majority in the dGPU market, it has only a 20% share in the overall GPU market with AMD taking 20% and Intel holding 60%. And as it's the lower end systems that will be typically running at lower frame rates, it's these systems that will be utilizing the benefits of Adaptive-Sync more often than dedicated gaming computers. Intel has not yet weighed in on whether or not they will eventually support Adaptive-Sync, but being an industry standard, they may do so whenever they wish. Nvidia is free to adapt their GPUs for AS as well. That's the beauty of it being an industry standard.
I haven't read any FreeSync reviews yet and I'm almost certain you haven't either. In fact, you have absolutely NO IDEA if there is any visual difference whatsoever between G-sync and FreeSync. To make a claim such as "FreeSync is inferior to G-Sync in every way but *POSSIBLY* cost" without any reviews at all to back yourself up simply makes you look daft. I plan to wait until the professional reviewers get done with their comparisons before I'll make any claims as to the performance of FreeSync vs Gsync. But I'll stand by my previous assertion that FreeSync's future is already looking brighter than Gsync.
Well, HotHardware had a crew at CES who looked at the AMD FreeSync displays and had the following to say:
"AMD expects there to be a total of 11 FreeSync displays available by March at varying refresh rates, resolutions, and panel sizes, including IPS panel options and the aforementioned 144Hz gaming panels. Obviously a full comparison between G-Sync and FreeSync will have to wait for head-to-head hardware, but our team reports that the two standards appeared to perform identically.
Assuming that continues to be true, AMD could have an advantage with this feature -- FreeSync reportedly doesn't add any additional cost to display panels, whereas the ASIC hardware required for G-Sync reportedly increases panel retail cost by ~$150. Of course, it's ultimately up to the manufacturers themselves whether or not to charge a premium for FreeSync monitors -- there's just no baked-in cost increase from specialized hardware."
As I said, FreeSync is already looking pretty good. Just waiting on the official reviews now.
Interesting, so is AMD once again perpetuating the myth FreeSync has no additional cost? LMAO. Sticker shock incoming for AMD fanboys, but yes, I am looking forward to further dissection of their half-baked, half-assed solution.
The one company that is offering both G-Sync and FreeSync monitors has priced the FreeSync model $150 less. So even if FreeSync isn't free, it is the more reasonable of the two.
You do understand that the "crew" has not seen it running a GAME yet, correct? The "crew" saw the same things anandtech did. Windmill demos etc. Get back to us when we all see them running GAMES and NOBODY can tell the difference at THAT point in time. Currently we know specific demos set up to show the effects work, but we have ZERO idea how it works in games, because for some reason (umm, inferior tech?) they shied away from showing that even at a CES show, 3 months before they supposedly hit. Hmmf... Not much confidence in the gaming part, right? Or why wouldn't you have shown a dozen games working? How hard is it to have games running?
Freesync is looking pretty shaky to me, or they'd be showing games running. Anandtech mentions the costs (and forgets testing for certification also costs; panel makers must pay this, and testing equipment costs too), so it isn't free. Free for AMD maybe, but that's it, and they have to pay to R&D the cards to comply as well, or all their cards would work (Nvidia's too). There is a reason NV won't support it for at least another generation (cards not compatible, as AMD even suggests), and also a reason OLD AMD cards won't work either. Compliance from all sides is NOT free. Scaler tech had to be modified (so that required some "SPECIALIZED HARDWARE", correct?), monitors need to be VESA certified to wear that label, and AMD/NV have to mod their cards. I could go on, but you should get the point. NONE of that is free.
"FreeSync is royalty free, but that doesn’t mean that there are not additional costs involved with creating a display that works with FreeSync. There’s a need for better panels and other components which will obviously increase the BoM (Bill of Materials), which will be passed on to the consumers."
Did you miss that part of the article? The standard is free, but that is it. The rest COSTS money and WE will be paying it. The only question is HOW MUCH?
@Creig: why are you still quoting dated spec sheets that clearly do not reflect reality, now that we have seen ACTUAL FreeSync monitors whose real specs show those numbers are inaccurate, if not purposefully misleading? AMD can say 6 months ago that FreeSync *CAN* support anywhere from 9 to 240Hz, but if the actual scalers that go into production only support FreeSync in the 40-60Hz band, or the 30-144Hz band (on the BenQ), is that 9 to 240Hz statement accurate? Of course not; we can just file it under more nonsense AMD said about FreeSync prior to actually, y'know, doing the work and producing a working product.
And no, I am not wrong about AMD claiming monitors might essentially get a free upgrade to FreeSync, because Raja Koduri was telling anyone who would listen this time last year that there might be existing monitors on the market that could do this with just a firmware upgrade. But I know, just more FUD and misinformation from AMD as they scrambled to throw together a competing solution when Nvidia completely caught them with their pants down by introducing an awesome, innovative new feature in G-Sync.
http://techreport.com/news/25867/amd-could-counter... "The lack of adoption is evidently due to a lack of momentum or demand for the feature, which was originally pitched as a power-saving measure. Adding support in a monitor should be essentially "free" and perhaps possible via a firmware update. The only challenge is that each display must know how long its panel can sustain the proper color intensity before it begins to fade. The vblank interval can't be extended beyond this limit without affecting color fidelity."
What's even more interesting is that TechReport subsequently edited their story to exclude mention of Koduri by name. Wonder why? I guess he probably asked them to redact his name as he tried to distance himself from comments that were obvious misinformation at a time they had no solution and were just grasping at straws. But revisionist history aside, you and every other AMD fanboy were screaming from the rooftops a year ago saying FreeSync would be better because it would be "Free" and Nvidia's G-Sync was charging an overpriced $200 premium. Now it's a $50-100 premium, maybe, yet still better. Interesting what a year and actual product will do to change the situation.
And yes, FreeSync locks you in to AMD solutions only, and only newer AMD solutions at that. Who else supports FreeSync other than AMD right now? What makes you think Intel has any interest in FreeSync, or that their display controllers can even support it, given even many of AMD's own cannot? What's even funnier is that Intel has shown more interest in Mantle than FreeSync, and yet AMD was happy to deny them there; but FreeSync, which they are freely giving away, has drawn no interest from Intel whatsoever. So yes, if you buy a FreeSync monitor today, you lock yourself into AMD and should have no expectation whatsoever that any other GPU vendor will support it, simple as that.
And where was I going with support? Where I was going is simple: you can't naively assume Nvidia, Intel, Qualcomm or anyone else can simply support FreeSync when it was a spec that AMD designed to work with their hardware, and it only works with SPECIFIC AMD hardware at that. Again, you assume many things but as usual, you are wrong; the only question is whether you are attempting to deceive purposefully or are just speaking ignorantly on the topic. Tahiti and other GCN 1.0 cards are NOT supported for 3D games with FreeSync, per AMD's own FAQ; only a handful of newer GCN 1.1+ ASICs (Hawaii, Tonga, Bonaire) and a few newer APUs are. But this is par for the course for AMD and their half-baked, half-supported solutions, like TrueAudio, Virtual Super Resolution, CF Frame Pacing etc.; only SOME of their cards in any given timeframe or generation are supported in all modes, and there is no clean cut-off. Nvidia on the other hand is easy: anything Kepler and newer. The way of the world is, Nvidia is much better at supporting new features on legacy hardware. AMD does a much worse job, but to say anyone can just support FreeSync if they want to is a bit laughable, given AMD can't even support FreeSync on all of their still-relevant cards, and I am SURE they want to. :D
Of course you take the position that AMD FreeSync has a better position in the market, but unfortunately, reality says otherwise. Nvidia holds a commanding lead, even bigger in the last few months when these relevant SKUs were sold, in the only TAM that matters for the use-cases these displays will be sold and deployed in: gaming. That means mobile and dGPU. Overall graphics market share means absolutely nothing here, because over 50% of that is Intel, which might as well not exist in this discussion. That brings us back to over 70% of the market for Nvidia (Kepler + Maxwell) vs. 30% or less for AMD (3 ASICs + a few APUs over the same last 3 years). Oh, and 0 monitors on the market, while Nvidia has already brought at least 1 model to market from 7 different partners, with even more on the horizon. But yes, some idiot will sit here and tell you that something you can't even buy yet, that is inferior to the established solution in G-Sync, and that is only supported by the market underdog, is somehow better suited to succeed! Amazing.
You haven't read any reviews because you choose ignorance, that's all it comes down to. Honestly, is your reading comprehension so poor that you didn't read my reference to PCPer's assessment, based on THEIR DISCUSSIONS WITH AMD, live from CES? Ryan Shrout said plenty to indicate FreeSync is already inferior to G-Sync (worse minimums, tearing or Vsync above max refresh), and that's not even addressing the remaining questions about whether FreeSync even improves latency over V-sync.
Here you go, so you can't continue to feign ignorance, you can also update your frame of reference regarding nonsensical claims of 9-240Hz FreeSync support, too, while you are at it. https://www.youtube.com/watch?v=8rY0ZJJJf1A#t=2m20...
Mark Rejhon over at blurbusters.com got a chance at some FreeSync monitors at CES today, and his initial impression is that AMD's implementation is on par with G-sync. "Now, my initial impressions of FreeSync is that it's on an equal footing to GSYNC in motion quality. At least by first impression, without looking closely at them "under a microscope". FreeSync certainly eliminated stutters and tearing, just like GSYNC does, even if the methods/technologies work somewhat differently. A future article will probably compare GSYNC and FreeSync. Many sources have reported various pros and cons of GSYNC and FreeSync, but a major one that sticks out: Lower cost of implementing FreeSync." http://forums.blurbusters.com/viewtopic.php?f=16&a...
Of course, we will have to wait for more detailed analysis, but early impressions are encouraging.
A bit more from his post: "...I played around with the options of the windmill. It had an option to sweep the framerate. The sweep was seamless, seeing framerate bounce around from 40fps through 60fps without stutter. It looked every bit as good looking at G-SYNC at the same rates (40-60fps in Pendulum Demo). Disable FreeSync and VSYNC brought about ugly tearing and stutters, so it was certainly cool to see FreeSync doing its job.
Thanks for the link, but again, this picture here should be a huge red flag and reason for concern for anyone interested in FreeSync, or G-Sync for that matter:
Why is Vsync on at all? Why does the associated quote confirm VSync is enabled when it says "Disable FreeSync and Vsync brought about ugly tearing...."
To me, it sounds like they are still relying on triple-buffered Vsync to achieve the incremental framerates between 40-60 (rather than just whole-number divisors of the max refresh), and then using the Adaptive-Sync/VBlank signal to change the refresh rate on the monitor's scaler as the framerate changes (most likely with 1 or 2 frames of latency here, too).
But the key is, all that Vsync lag is still going to be present, even if the tearing and stuttering associated with no VSync is gone. That was only half the problem and compromise re: Vsync On/Off, but I guess we will need to observe it under a microscope to see if FreeSync addresses latency at all over Vsync On as well as G-Sync does. My bet is, it does not.
I am also interested to see what granularity the scalers on the monitors are capable of. Is it 1Hz frequency differences, or is it larger chunks?
"I played around with the options of the windmill."
Let us know when someone plays around with GAMES. Windmills mean nothing IMHO. We don't play windmills ;) They saw the same crap demos as everyone else at CES. NO GAMES. Until we see that running on many sites (which will mean we'll see many games as they don't all use the same games), we really don't know how good the tech is.
I challenge anyone to explain why AMD would NOT run games at CES if everything was working as advertised?
Uhh, AMD didn't use games because (accepting YOUR challenge) at CES, and everywhere else, thousands of drooling idiot AMD fans don't care! So long as they can scream that nVidia must NOT ever have proprietary technology no matter what, that nVidia "IS RUINING GAMING FOR EVERYONE!", and that "nVidia will cost more and everyone will be unfairly locked in!", they couldn't care less whether freesync actually works as well, or works at all. No matter how limited (40-60 fps only, for instance) or crappy it is, the AMD fanboys will scream deliriously, forever, that it is perfect and just as good as, and better than, nVidia's "greed driven proprietary locked in payware"! Since even that level of astounding partisanship is not enough for the drooling amd fans, they will go one step further, and AMD knows it: if it doesn't work at all, they will all scream in unison "I don't want it and no one can tell at those framerates anyway!" As if that still weren't enough, it could go on for literally YEARS not working correctly while anyone who objected was told they were full of it, as happened with crossfire and its massive frametime stuttering, dropped and runt frames, and false framerates in all those game tests at all the official websites, including here, which meant totally unfair, unwitting LIES pumping up AMD beyond its true FPS values. And when those years pass and it's FINALLY EXPOSED as broken and fraudulent, it could get fixed... and all those amd fans that hate nVidia with a passion couldn't care less; they will be so HAPPY that AMD finally fixed the broken junk peddled as perfect for YEARS that they will instantly renew their fan fervor for AMD and launch it to its highest personal emotional peak ever, while simultaneously complaining that nVidia caused all the trouble with its money-grubbing Gsync crap. So, there's why AMD hasn't a worry in the world.
I know this won't matter to you as you've made your position clear, but there's a difference between a specification and an implementation. The problem seems to be that most desktop monitors don't work below 30Hz due to short pixel memory (IGZO can be better, but that's hardly standard).
@tuxroller, yes, I fully understand this, but it is clear Creig is quoting this dated spec info in a misleading attempt to back his point that FreeSync is superior to G-Sync, when in reality, FreeSync's supported frequency bands are in fact worse. What is the purpose of this if not to mislead and misinform? It's a disservice to everyone involved and interested in these products to try and claim this as an advantage when, in actual implementation, the displays are not capable of these low refresh rates, nor are desktop (or even HDTV) displays capable of driving up to 240Hz.
Would it not be misleading to say "Hey, this product is better because it is capable of time travel"?
First of all, there is nothing at all wrong with the specs I listed. According to the information that has been released, FreeSync is capable of operating anywhere from 9Hz to 240Hz. It does so in ranges.
9Hz - 60Hz
17Hz - 120Hz
21Hz - 144Hz
36Hz - 240Hz
As I highly doubt that any one panel out there will be capable of operating in the full range of 9Hz - 240Hz, I don't see what the problem is. The monitor manufacturer simply chooses which range will cover the specs of the panel they intend to produce. The fact that no panels out there today can go as low as 9Hz or as high as 240Hz yet is irrelevant. FreeSync will be ready for them if and when they eventually make it to market. Your "issue" is a non-issue.
From your quote: "there MIGHT be existing monitors on the market", "PERHAPS possible via a firmware update". See the words "MIGHT" and "PERHAPS"? He didn't say "WILL". Monitor firmware updates are beyond AMD's ability to control. It was obvious that the monitor they used was capable of FreeSync with an updated firmware. But it is up to the manufacturer to offer the update, not AMD. You may as well blame the weather on the television forecasters while you're at it. It makes just about as much sense as what you just said.
Only AMD will support "FreeSync" because "FreeSync" is simply AMD's implementation of Adaptive-Sync. As far as other companies such as Intel, they are free to develop their own version of FreeSync because Adaptive-Sync is a VESA industry standard. The VESA board evidently considered the spec to be of great enough benefit to include it in their latest version. So it's not only AMD who found merit in its design. Intel has had eDP for a couple of years now so it's entirely possible that they already have Adaptive-Sync capability built into their shipping products. If they don't already possess it, I can't see why they wouldn't want to include it in the future. It's an industry standard spec, there are no licensing costs and it gives the end user an overall better experience.
I was pulling from memory the GPUs that will be FreeSync capable. I thought I read somewhere that Tahiti will have partial FreeSync support in that they will handle the video playback aspect, but not the 3D rendering. I'll see if I can find that info. And even if it turns out that it's only Hawaii and newer, what of it? There will always be new technology that isn't compatible with old hardware. There has to be a cutoff line somewhere. Are you raging because there are no more AGP slots on motherboards? Are you upset because you can't find a new laptop that comes with a floppy drive? Newer technology isn't always compatible with older technology. Does that mean we should simply stop innovating?
Both websites of PCPER and Blur Busters have personally been to the AMD booth at CES and both websites have reported no visual difference between FreeSync and G-Sync. Obviously we'll have to wait for official reviews to get the final word, but I fail to see why you are still trying to claim that FreeSync is inferior to G-sync in nearly every way when people who have actually seen both in operation are saying otherwise.
Really chizow, you might be taken a bit more seriously around here if you would simply tone down your pro-nvidia "RAH RAH RAH" eight or nine levels.
So you can admit that, because the frequency ranges FreeSync monitor mfgs chose to support are inferior to what G-Sync supports, FreeSync is an inferior solution to G-Sync, correct? Because I would hate for someone to get the impression FreeSync is better than G-Sync based on some dated specs you pulled off an AMD whitepaper when, in reality, there are no monitors on the market that support anything close to these ranges on either the top or bottom end. Just making sure. :)
So after that quote about FreeSync being free, and after we have seen there were in fact no displays on the market that could just support FreeSync with a firmware update, for free, you can admit what AMD said was misleading, and that their entire naming structure is really just a misnomer. Do all the people AMD misled with their claims deserve an apology, in your opinion? Do you think AMD should have made these claims without first verifying any of it being true, first? Just wondering. :)
LOL, gotta love the "it is open, but they can't use AMD's implementation, but as usual, anyone is free to develop their own" take you are heading towards here with FreeSync. Keep toeing that company line though! I know that is the favored mantra for AMD and their fanboys when they develop something proprietary under the guise of "Openness", just so the dim-witted and misinformed can parrot it and find no fault with AMD. Just like Mantle, right? Intel, Nvidia and anyone else are free to develop their own Mantle implementation? :D Wonder who spread that bit of noise/FUD around...
But yeah, you were wrong about Tahiti, so you really should be more careful when referencing major points if you are going to try and counter my point, which in this case was market share. You are now of course trying to downplay the fact that only a tiny fraction of AMD cards even support FreeSync, cards which are only a minority share of the dGPU market to begin with, but it reinforces my point that FreeSync is the technology that has the huge uphill battle, because so few GPUs can even make use of it. It's also quite funny that you are now trying to downplay the importance of hardware install base. Who is going on about floppy drives and AGP slots? We are talking about relevant DX11 hardware from the last 3 years, most of which can't support AMD's own FreeSync standard. If install base and legacy support aren't important, what chances would you give FreeSync to succeed if they started the ticker at 1 with their next 14 or 20nm GPU, against the tens of millions of Kepler and newer GPUs that support G-Sync? You don't think the fact that many AMD users will have to upgrade both their GPU *AND* their monitor will be a barrier to entry, and an additional cost of adoption for FreeSync? Maybe it's time to reassess the fees attached and the total cost of ownership once you factor in a new AMD GPU, too?
And reported no visual difference? Wrong, they reported no visual difference until the demos went out of the supported frequency band, at which point everything fell apart. This cannot happen with G-Sync, by design. Also, visual difference in the form of tearing and stutter was only part of the equation and problem with Vsync that was solved by G-Sync; the other half of the equation was input lag/latency, which we have no insight on because the demos weren't truly interactive. But again, the impressions and various screenshots indicate AMD's solution is still tied to V-Sync, so there is a strong possibility they were not able to resolve this input lag as G-Sync does.
And tone down the RAH RAH tone? Haha, that's funny coming from the guy who is now forced to re-scream all the nonsense of the past 12 months from the rooftops, but I fully understand: you backed yourself into this position long ago when you referenced and gave credence to all the nonsense AMD claimed about FreeSync that ultimately ended up being BS.
I'll enjoy watching you eat humble pie over this. Be sure to be man enough to admit your rants were wrong and heckling everyone over their rebuttals was juvenile.
Are you enjoying that humble pie defending all the FUD/misinformation AMD said about Not-So-FreeSync before they actually did the work and productized it? Certainly you are "man enough" to admit much of what AMD said over the past year regarding FreeSync was in fact misleading?
I hope to soon SUE the lying AMD company to THE HILT OF THEIR EMPTY BANK ACCOUNT, because they have lied to me about Freesync and my Hawaii core AMD GPU! Yes, it has 4GB of ram, but when the screen is shredding and tearing apart, what good is it?!
What about future hardware? Users have the choice to purchase a GPU that supports monitors $150-$200 cheaper than a GPU that requires a more expensive monitor to get similar performance. Only hardcore team green loyalists will choose the latter. AMD will hold onto their loyalists. And those with common sense who go back and forth between green and red will have one more reason to go to AMD, especially in the back half of this year, where it appears team Red will have the best performing cards for at least six months. How Red prices the new GPUs should dictate the success of market share gains. I suspect a $125 premium over comparable Nvidia cards will be the case.
Or more likely, the 70% of the market that already owns an Nvidia dGPU from the last 3 years (Kepler launched in Mar 2012) can just buy a new G-Sync monitor, the same set of users that already saw a benefit from Nvidia products independent of the relatively new innovation of G-Sync.
But yes, if both Nvidia/AMD hold onto their "loyalists" or repeat buyers, you can already see, AMD is going to run up against a huge uphill battle where they control an extremely minor share of the dGPU market (desktop and mobile) at ~70/30 clip. What's the point of referencing future GPUs of unknown commodity at this point? You don't think Nvidia is going to release another high-end GPU to combat AMD's next offering?
And you can't say for sure these users will have 1 more reason to go AMD, because there is a premium and value for better technology. G-Sync may just be better than FreeSync, and while we don't know this for sure right now, we do know G-Sync does everything Nvidia said it would for over a year, which has held up against the test of time from both reviewers and consumers alike.
We simply can't say the same about FreeSync right now, can we?
@medi03, actually 70% would be a generous number for AMD GPUs in this discussion, because again, Nvidia supports G-Sync with all Kepler and Maxwell-based GPUs, which goes back to March 2012. AMD GPUs that can support FreeSync are far fewer, with only GPUs based on Hawaii, Tonga, and Bonaire and any APU from Kaveri onwards.
While AMD has stated all new GPUs based on new ASICs will support FreeSync, the most recent market data shows they are getting destroyed in the marketplace, which is no surprise given the reception Maxwell has received: discrete market share reports put Nvidia at around 72%. And that was with only 1 month of sales for the new Maxwell cards; market reports are expecting similar, if not more pronounced, results in favor of Nvidia for Q4, though I wouldn't be surprised to see a slight decline with AMD's price cuts, as 72% is REALLY hard to improve upon.
blind and ignorant AMD fan bloviates again... " Not to mention G-sync locks you into nvidia GPUs only while FreeSync is based off an industry standard which any company is free to develop. "
The only other company is AMD, so AMD is locking you in with freesync, SO YOU AMD FANS NEED TO STOP THIS BULL. Thanks for not THINKING AT ALL. Not.
And what do you think G-Sync is doing when the computer draws frames faster than the max refresh rate? It just waits for the next sync, effectively working as FreeSync with V-Sync on. Unlike with nVidia, you can choose here, so that's a plus point.
The difference, however, is at low frame rates, where it looks like AMD gives a choice between tearing and V-Sync frame drops (it would be ideal to choose V-Sync independently for the top/bottom frame rates), while nVidia lets pixels dim down (effectively letting them fall below the designed minimum refresh rate for the panel), and thus you get blinking. None of those three options is ideal, and I am not certain which one is optimal.
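To make those three trade-offs concrete, here is a minimal toy sketch (Python, illustrative only; the 40-144Hz window and the policy names are assumptions for the example, not vendor code):

# Toy model of what a display pipeline can do when the frame rate leaves
# the panel's supported variable-refresh window. Bounds are hypothetical.
PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def out_of_window_behavior(fps, low_policy):
    """Describe the pipeline's action for a given instantaneous frame rate."""
    if fps > PANEL_MAX_HZ:
        # Above the window, frames must wait for the next refresh:
        # variable refresh degenerates into V-Sync at the max rate.
        return "hold frame until next refresh (V-Sync-like)"
    if fps < PANEL_MIN_HZ:
        if low_policy == "tear":
            return "scan out immediately mid-refresh (tearing)"
        if low_policy == "vsync":
            return "snap back to a fixed refresh (judder/frame drops)"
        if low_policy == "decay":
            return "refresh below the panel minimum (pixels dim/blink)"
    return "refresh exactly when the frame is ready (the ideal VRR case)"

for fps in (200, 90, 25):
    print(fps, "->", out_of_window_behavior(fps, "decay"))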
The jury is still out until independent reviewers get a chance to review both panels side by side. Given the breadth of the FreeSync lineup shown at CES and the manufacturers backing it, it is certain to get some real traction. And then nVidia could enable it in a later generation of cards (while keeping G-Sync as well, of course).
No, G-Sync can't draw frames faster than max refresh, because it uses a driver soft cap (which makes it curious that AMD didn't do this by default). But even at the max refresh of 120 or 144Hz, the GPU is still the one calling the shots, telling the G-Sync module to draw a new frame only when the monitor's refresh is ready. If the monitor isn't ready, the G-Sync module with its onboard DRAM acts as a lookaside buffer that allows the monitor to simply hold and continue to display the last frame until the monitor's refresh is ready, at which point it displays the next live frame (not an old frame, as with V-Sync) sent from the GPU to the G-Sync module. The end result is just a perceived reduction in FPS rather than an input-laggy/juddery one, as you would see with V-Sync.
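A rough sketch of that hold-and-redisplay behavior, under stated assumptions (the class, timings, and callback names are invented for illustration and are not NVIDIA's actual module interface):

# The module scans out new frames on demand, but if the panel's
# minimum-refresh deadline expires first, it re-displays the frame held
# in its buffer instead of falling back to a stale V-Sync-style queue.
class VRRModuleModel:
    def __init__(self, panel_min_hz=30):
        self.held_frame = None                 # lookaside buffer (module DRAM)
        self.max_hold_s = 1.0 / panel_min_hz   # longest the panel may wait

    def on_frame_ready(self, frame):
        # GPU finished a frame: buffer it and drive a refresh immediately.
        self.held_frame = frame
        self._scan_out(frame, "live frame drove the refresh")

    def on_hold_deadline(self):
        # No new frame arrived in time: re-show the held frame so the
        # panel never runs below its minimum refresh rate.
        if self.held_frame is not None:
            self._scan_out(self.held_frame, "re-displayed held frame")

    def _scan_out(self, frame, reason):
        print("refresh: frame", frame, "(" + reason + ")")

m = VRRModuleModel()
m.on_frame_ready(1)    # fast case: the refresh follows the GPU
m.on_hold_deadline()   # slow case: frame 1 is simply shown again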
There are certainly a lot of questions for AMD, because 40Hz is still a very questionable minimum, especially on a 4K display; I am honestly not sure what kind of AMD set-up you would need to always meet this minimum (3x290X?). Nvidia's minimum only really becomes an issue around 20Hz, and realistically it only manifests itself in the menus of certain games that cap their menu frame rates very low (SC2, GTA4, etc.), because running menus uncapped tended to cause overheating.
But yes, independent reviews I am sure will give us more answers, but realistically, there's no reason to expect anything other than vendor lock-in for a tiny niche of specialty monitors that cost more for this premium feature. Any AMD fan who doesn't think they are going to pay a significant price premium (maybe not as much as G-sync, but definitely not free!) is deluding themselves.
More to the point, when are laptop users going to see FreeSync or G-Sync on laptop screens?!
If ever there was a need for these technologies, it's the laptop market, where even gaming laptops only feature 60Hz screens and the most powerful mobile GPU is only 75% the power of the most powerful desktop cards, which greatly increases the chances of dipping below 60fps...
Do you think laptop users are willing to pay the premium for something that won't be portable between GPU/hardware upgrades? At least on the desktop side of things, I think many will be more willing to invest in one of these panels if they cost $600+ if they know it will survive at least 1-2 upgrade cycles.
But maybe this is a question for AMD at some point, since they were making some pretty bold claims this time last year at CES about FreeSync. Since they claimed this was always a feature of laptops using the eDP specs along with the comments they made about being free and needing only a simple firmware update, maybe they can get FreeSync working on these laptops, for "Free"?
Well yes, have you seen the price of gaming laptops, and also the news that they are selling in far greater numbers than ever before? The users of gaming laptops are often desktop gamers themselves, and being included in those numbers myself, seeing laptop gaming reach the standard of desktop gaming is exactly what we want to see, and would be willing to pay for. FreeSync/G-Sync is something I personally see as a must for laptop gaming to be fully embraced.
The early demonstrations of FreeSync were actually done with regular laptops that didn't require additional hardware. From what I believe, the technology is there; presumably no one has yet embraced the market and provided an option. I'm hoping the royalty-free FreeSync spurs the laptop manufacturers on.
Does not require additional hardware? Are you sure? It requires no additional hardware beyond the DisplayPort Adaptive-Sync standard, but to support this standard the monitors seem to need additional hardware, because otherwise why would AMD be showing off these new FreeSync monitors?
Exactly. There's no extra licensing fee for FreeSync, but the hardware required to run it will certainly be more expensive than the hardware required for a standard static refresh rate display. Besides the scaler, you'll need a panel that can actually handle dynamic refresh rates, and in most cases the panel will also need to be able to handle higher than normal refresh rates (e.g. 144Hz instead of only 60Hz). We need to see final pricing on shipping FreeSync and then compare that with G-SYNC equivalents to say how much NVIDIA is actually charging.
And for all the hate on G-SYNC, remember this: FreeSync wouldn't even exist if NVIDIA hadn't made G-SYNC. They first demonstrated G-SYNC in 2013, and hardware has been shipping for most of last year (though it wasn't until the second half of the year that it was readily available). Considering the R&D cost, hardware, and need to get display vendors to buy in to G-SYNC it's pretty amazing how fast it was released.
Where is there hate for FreeSync? If it does what G-Sync does, that's great! AMD fans may finally get to enjoy the tech they've been downplaying for nearly a year!
I have a disdain for FUD and misinformation, that's it, and as we have seen, AMD has a strong penchant for making unsubstantiated claims not only about their unreleased/undeveloped solutions, but about their competitor's solutions as well.
Do you appreciate being lied to? Just wondering and trying to understand how some of AMD's most devout and loyal fans don't seem to be misled or lied to, at all. :)
Yeah, it's unfortunate; this is probably based on the early misinformation AMD provided about FreeSync, but they've since changed their tune, as FreeSync clearly requires a premium custom scaler in order to work.
Interesting, I won't disagree here because I have heard similar sentiments in other gaming notebook discussions, and there is clearly a huge premium attached to slower performing high-end notebook parts. I will caution however, that many notebooks initially implemented 3D Vision capability and 120+Hz ability and it did carry a hefty premium. 3D and 3D Vision interest has since subsided unfortunately, so I am not sure if these panels are still including it or not as a feature. Just something to consider.
AMD showed FreeSync working on laptops at last year's CES as their initial FreeSync demonstration. Adaptive-Sync would be a great addition to any laptop, as it includes the ability to improve graphics quality at lower frame rates. As laptops generally have less powerful graphics subsystems, they would benefit from this technology. In addition, Adaptive-Sync has power-saving abilities as well, all of which would be to the benefit of laptop users.
Agreed. I think it makes a lot of sense for laptops, and for desktops with 4K screens or cheap graphics cards, as they will likely have lower framerates and benefit the most.
Hey dude, could you provide references to this said "free" upgrade ur talking about... like a link to an AMD media article, or a URL from an official statement? My take and comment was that if the existing laptops already had the needed DisplayPort 1.2a standard, then it would be up to the manufacturer to implement some type of update if they wish, "wish" being the operative word. Now with that said, funny thing about the universe: it's relative to what the subject wants to interpret... so keep on saying what ur heart wants...
I've already linked it in other comments, and it was obviously these comments from Koduri that gave rise to the misinformation many AMD fans clung to for months until AMD reversed course and backed off these statements.
"Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready and delivering on all of AMD's feature goals, and it should be available in the next few months."
Umm, so this was shown running tons of games and they all worked fine then? If not, you shouldn't make claims like this. The whole point of this tech is to MAKE GAMES BETTER, not test windmill demos etc. I think we need to say the jury is still out until GAME TESTED and APPROVED. THEN you can say the type of statement such as the one above. Just another reason NV doesn't send you info early ;) No cake a while back like others got, no 965m info today...LOL.
What would have been NEW, is if they showed it WORKING in games ;) Also we'll see how long it takes to get them here, and as noted (for the first time on AT?) pricing won't be FREE ;) Anyone who thought it would be FREE was smoking some really good stuff :) For the first time I see you actually hinting here it may cost the same as gsync...Well duh. R&D isn't free (for scaler/monitor makers, testing it all etc), and it WILL be passed on to consumers (AMD or NV solution).
I don't believe NV will support this in any case. They'll just give away Gsync crap cheaper if forced (they only need a break even here, and it sells their gpus) to entrench it if at all possible. There is also a good chance NOT-SO-FREE sync might not be as good as NV's solution (or they'd be showing games running at CES right?), in which case they can keep charging a premium no matter what the price for NOT-SO-freesync ends up being. Many gamers won't accept 2nd best in this case nor "good enough". IMHO it may be "good enough" for some (why not showing 2 dozen games at CES?), but that will allow NV to stay Gsync anyway claiming the premium solution is only on NV hardware. I can deal with that considering my monitors make it 7yrs+ anyway and that just means I'll buy 2 NV cards or so during the monitors life (I upgrade gpus about every 3yrs).
Agree with Chizow. The market will keep G-Sync alive if it is BETTER, period. Considering the market shares of both (1/3 AMD, 2/3 NV, and NV gaining; monitor makers know these numbers too), they have no reason to favor a smaller player with a possibly WORSE solution that still hasn't been shown GAMING. I'm worried at this point since it's been ages since they showed NOT-SO-freesync, and they have yet to show a bunch of GAME DEMOs running. How bad is this tech if we're supposedly Q1 for monitors and NOBODY will show them running GAMES?
Nobody else finds this odd? Anandtech just acts like it works with no games tested? Nvidia put it in TESTER HANDS to run GAMES (review sites etc). Smell anything fishy people? Having said all that, I'll wait until the next black friday to decide the victor assuming my monitors can live that long (in year 8 or so now...LOL).
Actually, Intel is what matters, since they have more share than anyone else. While adaptive sync is useful for gaming, that's far from its best use: having a 100% tear-free desktop, perfectly synced videos, and lower power usage are all at least as useful, and will certainly be useful to more people.
For the record, AMD showed at least two games running with FreeSync. You'll note that there's no "G-SYNC compatible" list of games from NVIDIA for a reason: if the technology works, nothing more is required of the games to enable it! Spouting FUD and posting long anti-AMD diatribes does nothing but create noise. FreeSync was shown running games, and that proves that it can work. I don't know that it's necessarily 100% ready today, but the remaining work is going to be mostly in fine tuning the drivers over the next couple of months.
If you want to really complain about something, it's that FreeSync requires at least GCN 1.1 to enable the full functionality, so R9 280/280X (7950/7970) and earlier GPUs won't support it AFAICT.
So posting pro-AMD diatribes and throwing out misinformation doesn't create noise? That's an interesting view of things, since we ARE STILL trying to filter out all the noise AMD threw out there for the past 12 months, perpetuated and regurgitated by their pro-AMD fanboys. Shall we recount?
1) Koduri telling everyone at CES last year FreeSync would "effectively be free" and might even be supported on existing monitors with just a firmware update. Only much later do we see them reverse course and say they were only referencing "royalties" that no one, including Nvidia, has ever confirmed, existing. Clearly this lie has legs, because there are STILL tech sites perpetuating this myth even to this day!
2) Koduri and AMD saying FreeSync wouldn't need additional hardware, that it could work with existing DP specs that supported eDP. Months later, after actually doing the work, pushing a spec, and working with scaler makers, we see these monitors will in fact need more expensive scalers, and the "Free" in FreeSync is no longer "essentially free", it's "royalty free".
3) Koduri telling everyone that Nvidia needed expensive hardware because the display controllers in their GPUs couldn't handle Adaptive Sync as well as their own display controllers. Yet months later, we find that even many of AMD's own display controllers weren't quote AMDawesome enough to handle this!
4) AMD saying FreeSync would support 9-240Hz bands, spurring fanboys like Creig to repeatedly quote this misinformation even in light of the fact FreeSync in actual pre-production samples is only supporting 40-60Hz and 30-144Hz in the models on display.
5) AMD claiming G-Sync will die because it is proprietary, while FreeSync is open and "royalty free", when in reality, AMD is the only one supporting FreeSync but with a much smaller share of the total addressable market for these products than Nvidia, with Nvidia having shipped and sold their product for close to a year already. Oh, and we STILL don't know how much "Free'er" these monitors will be, do we?
So now let's compare that with G-Sync and the mysteries surrounding its launch:
1) Nvidia announces, demos, and launches G-Sync live for all the world to see and ships it in actual product 2 months later, and it does everything they said it does, from Day 1.
So again Jarred, where is all the noise coming from, again? :) It seems to me, if AMD said nothing at all about FreeSync until it was actually done, there wouldn't be all the noise surrounding it.
1) "MIGHT" be supported on existing monitors, not "WILL" be supported. It's up to the monitor manufacturer to release a firmware update to support FreeSync, not AMD. Therefore, AMD did not lie.
2) FreeSync was already shown to work on laptops with the required specs without any additional hardware. However, laptop displays are not desktop displays. It is necessary to have a panel that is capable of variable refresh rates. This is why we have had to wait for monitor manufacturers to produce desktop displays with the VRR ability of certain laptop displays. Therefore, AMD did not lie.
3) AMD certainly does have GPUs that are Adaptive-Sync capable while Nvidia does not. Therefore, AMD did not lie.
4) FreeSync reportedly can operate anywhere from 9Hz to 240Hz. Just because current panels cannot go that high or that low does not mean that FreeSync is not capable of operating at those frequencies (a sketch after this list spells out the arithmetic). Therefore, AMD did not lie.
5) Whether or not G-sync will die remains to be seen. It is a fact, however, that AMD charges no royalties connected with FreeSync while it is reported that Nvidia does collect fees for every G-sync monitor sold.
The noise, chizow, is coming from you distorting the facts to fit your twisted view of FreeSync.
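For what it's worth, the disputed range claim is easy to state precisely: the usable window on any given monitor is just the overlap of the protocol's range and the panel's own range. A minimal sketch in Python, using only the numbers already quoted in this thread (the panel labels are placeholders, not product names):

# Spec range vs. the two pre-production panel ranges mentioned above.
SPEC_RANGE_HZ = (9, 240)
demo_panels = {
    "demo panel A": (40, 60),
    "demo panel B": (30, 144),
}

for name, (panel_lo, panel_hi) in demo_panels.items():
    lo = max(SPEC_RANGE_HZ[0], panel_lo)   # tighter lower bound wins
    hi = min(SPEC_RANGE_HZ[1], panel_hi)   # tighter upper bound wins
    print(name + ": usable window " + str(lo) + "-" + str(hi) + " Hz")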
Just to clarify #2). It appears that the only difference between a FreeSync capable monitor and a non-FreeSync capable monitor is the scaler. Most current scalers are v1.2 and FreeSync requires 1.2a. As old monitors get replaced with new versions, it will be a simple and inexpensive matter for manufacturers to update them with DP1.2a or DP1.3 scalers which will make them FreeSync compatible and give them the necessary variable refresh rate capability.
I am not claiming infallibility with the points I bring up. It's possible that I may make a mistake and state something that is in error. But I am trying to be as factual as possible.
See, this is a perfect example of how all the misinformation and FUD AMD has put out there over the last year regarding FreeSync just dies hard. You now have all these half-truths, lies, myths, and straight nonsense put out there by AMD, perpetuated by their fanboys, who of course feel compelled to continue the myths, FUD, lies, and half-truths simply because they backed themselves into these untenable positions months ago, coupled with the fact they simply can't acknowledge AMD lied or spread misinformation regarding FreeSync this whole time.
1) Creig, you are simply arguing semantics here when it is obvious that what AMD said about FreeSync being essentially free, or possibly being supported with just a firmware update, was misinformation, plain and simple. That you as an AMD fan and supporter aren't disappointed by this dishonesty is somewhat unsurprising, but to continue covering for them and thus perpetuating the lie is somewhat shocking. Is there a single monitor on the market that is upgradeable via a firmware update, essentially free, that can support FreeSync? No, there is not; therefore AMD lied, whether intentionally or not.
2) No, FreeSync was not shown to work, unless you consider a fixed refresh demo of a windmill a working version of FreeSync. But thanks again for providing another example where AMD was less than honest and forthcoming about what they showed and what they said they demonstrated.
3) Great! So I guess that debunks your claims that Nvidia is the one choosing not to support FreeSync, when in reality, it's certainly possible AMD designed a spec that only their hardware could support, knowing Nvidia and Intel GPUs could not. While FreeSync may not be proprietary in name, it certainly is in practice, is it not? Which brings us back to my original point: AMD is currently the only GPU vendor that supports FreeSync, just as Nvidia is the only GPU vendor that supports G-Sync, but of course, that also means Nvidia commands an overwhelming % of the TAM for these displays. The rest is just "noise".
4) No, it just means G-Sync as currently implemented is BETTER than FreeSync, just as I originally stated. You can claim FreeSync can lower your mortgage on paper, but if it doesn't do it in reality, who gives a rat's ass? 9-240Hz on a piece of paper is just a way to deceive the ignorant and non-technical into thinking FreeSync is better because it "supports" a wider range of frequencies, when we see in reality the supported band is MUCH smaller. Mission accomplished, it seems!
5) Again, reported by whom? AMD? LOL. Again, the noise regarding royalties and fees has come from AMD and AMD only, but as we have seen, they have continually backed off this stance, saying there is now additional BoM cost due to better scalers and better displays capable of handling these refresh rates and LCD decay times. Yet somehow the $200 G-Sync premium for Nvidia's BoM, R&D, and QA costs per board is unjustified??? And we STILL don't know how much more AMD's solution will cost, so again, why is AMD saying anything at all until they know for sure?
And your clarification is wrong too: there are more differences than just the scalers; the panels themselves are higher quality also, as they need to support lower decay times to address the minimum refresh rates. 4K, IPS, and 120+Hz will also command premium panel prices.
So yes, as usual, the noise originated from AMD and has been echoed and parroted by AMD and their fanboys like you, Creig. If AMD simply shut their mouths and waited til they introduced actual product this week at CES, you wouldn't feel the need for all this backpedaling and revisionist history to cover all the misinformation they've been spreading over the last 12 months, but thanks for proving my point with your elaborate attempt to cover-up all of AMD's missteps.
1) Are there monitors out there that can be made FreeSync compatible with nothing but a firmware flash? Yes. End of story.
2) From what I understand, the laptop FreeSync demo was to showcase the fact that they could display a variable refresh rate. Full implementation of FreeSync requires dynamic variable refresh rates. The demo simply showed that FreeSync was possible, even if it didn't have all the features yet. Try to keep in mind that it was a demonstration of a beta work-in-progress just to show that FreeSync was possible.
3) So AMD shouldn't have come out with FreeSync simply because Nvidia cards might not currently have the capability of utilizing it? And we don't know whether or not Intel currently has hardware that is Adaptive-Sync compatible. But since it's an industry standard now, Nvidia or Intel are free to incorporate it into their own hardware. That's more than can be said about G-sync.
4) First you said AMD's support of 9Hz to 240Hz was a lie. Now you've admitted that it isn't a lie. It isn't AMD's fault that current monitors don't go that low or that high. But when they do, FreeSync will be ready to support them. How can you possibly twist that into a BAD thing?
5) However you want to look at it, FreeSync will be cheaper than G-sync. It's an inescapable truth. FreeSync simply needs an updated scaler while G-sync requires the entire scaler to be replaced! And that replacement scaler has to be custom tuned to the panel in question. And don't forget about Nvidia's royalty fees. FreeSync will end up being cheaper than G-sync. No question about it.
There is no backpedaling or revisionist history going on here. In fact, the only thing going on (and on and on and on) is you. I realize that you're upset that AMD appears to have beaten Nvidia at its own game and that the industry is excited about the forthcoming release of FreeSync. But no amount of ranting on your part is going to change that. So please just calm yourself down and try to stick to facts.
LMAO, again, it is amazing you're not ashamed to continue perpetuating these lies and myths. And for what? To try and defend the accumulated lies and misinformation from AMD, or as Jarred would say, "noise" that has piled up over the last year regarding FreeSync?
1) No, there are not any monitors that can be made FreeSync compatible with just a firmware flash. Will you personally guarantee this level of support out of pocket for anyone misled by this statement? Will AMD stand by this? Will the monitor mfg? No. No one wants to guarantee this because there ARE costs associated with "Free"Sync and no one is willing to guarantee this.
2) Well, you understood incorrectly, because AMD *WAS* telling people this was FreeSync with dynamic refresh rates when in fact it was not. I mean, how can you even sit here and call this a demonstration of FreeSync and a worthy analogue to G-Sync when it was not even close to feature complete, missing the MOST important aspect, which is actual DYNAMIC/ADAPTIVE frame rate adjustment? Only someone intent on deception or misinformation would throw this out there, as you did, as a counterpoint to prove AMD had already shown working demos of FreeSync and to back AMD's original lie about existing panels on the market being able to support FreeSync with just firmware updates. So again, now that FreeSync is complete, why can't these older panels just support a feature-complete FreeSync with just a firmware update? Oh right, because they can't. They lack the necessary hardware, hardware which AMD originally claimed wasn't necessary. AMD subsequently developed more advanced scalers because they found out you couldn't actually just upgrade to FreeSync for free with a firmware update. Conclusion: AMD lied and fed the public misinformation, whether intentional or not, and fanboys like you CHOOSE to continue to perpetuate this "noise" rather than just admit AMD was wrong and move on.
3) Who said anything of the sort? AMD is always free to develop whatever they like to improve their products for existing customers and to entice future customers, but what they shouldn't be doing is making MISLEADING statements about what their competitors can or cannot do. I mean, it would be just as disingenuous as Nvidia saying, well, AMD can support G-Sync at any time if they want to, they just have to invest a huge amount of R&D to ensure their display controllers work with a custom board and FPGA ASIC. And as for Intel, we do know they have shown no interest whatsoever in supporting FreeSync. We also don't know if they are even capable, again, given not even all AMD GPUs have the secret display controller sauce to support FreeSync. But neither is "free" to implement, because again, this may take a considerable level of effort and R&D, and in Intel's case, they may not even care. In Nvidia's case, why bother when they already have a better solution they brought to market before FreeSync? And again, who cares if it's an open standard if only AMD supports it? I guess we can give it the same chances as other failed standards, like HD-DVD? How'd that go?
4) Where did I admit it wasn't a lie? LMAO. Quoting AMD's bullshit is one thing, but please don't misquote me, thanks. But back on topic, do any of the demonstrated displays support 9Hz on the low end in FreeSync mode, or 240Hz on the high end, in FreeSync mode? No, they do not. Continuing to quote this lie as a benefit in favor of FreeSync is dishonest, simple as that, but I fully expect AMD and their disingenuous fanboys like you Creig, to continue to perpetuate this lie for years to come, because even now that we have actual demonstrated FreeSync monitors, none of them support these ranges.
5) No, it's not really an inescapable truth, Creig. If you are one of the >70% of dGPU owners who own an Nvidia Kepler or Maxwell based GPU, you have 2 options:
a) Buy a G-Sync monitor, and game happily on a solution that does everything it says it does.
b) Buy an AMD GPU that supports FreeSync and buy a FreeSync monitor.
Are you and AMD willing to back your claim that FreeSync is the cheaper solution out of your own pocket for this subset of users just to perpetuate a myth and your flawed analysis? And what royalty fees are you talking about again? The ones AMD associated to G-Sync? LOL. Again, more "noise" as Jarred would say.
Hahah, no revisionist history? That's classic; you've backpedaled on every point once it was shown AMD did NOT demonstrate or make good on what they said, and on the original lies, myths, and FUD you continue to try to perpetuate even after we see FreeSync in its final form has moved beyond most of these claims. That, folks, is "noise".
And AMD has beaten Nvidia at its own game? LOL. Yes, once again getting ahead of ourselves aren't we Creig? Because AMD sure has a great track record when it comes to supporting their initiatives, how's that Open Mantle SDK coming along btw? But I am sure in a few months once FreeSync actually makes it to market, we can throw this up there as another not-quite-as-good reactionary half-baked solution from AMD in an attempt to match Nvidia, the industry leader that developed and introduced this tech:
SLI > CF
Shadowplay > some AMD junk in their bloated adware client
DSR > VSR
GPU Boost > Turbo Core
CUDA > OpenCL
3D Vision > HD3D
PhysX > BulletPhysics
Optimus > Enduro
I'm sure there's more, but that's just a small sample of how AMD has "beaten Nvidia at its own game" in the past. Don't worry, there's plenty more room on the list for G-Sync > FreeSync, too! :D
If there's a fanboy here, Chizow, it's you. I'm not going to bother with your rantings any longer as it's obvious that you refuse to acknowledge facts. As more and more FreeSync capable monitors hit the market, everybody will just laugh at you all the harder. So just keep on tilting at those AMD windmills. I'm sure you'll stop them in their tracks single-handed.
Haha the difference Creig, is that I actually use their products because they are the best at satisfying my informed demands as a user, and I'm not willing to perjure myself to suit their agenda, as you CLEARLY are.
What is your excuse again for buying inferior tech? To save a few bucks? What's your excuse for defending all of these accumulated lies and misinformation, i.e. "noise"? I'm simply trying to set the record straight here, y'know, filter out all the "noise", because it is clear where that noise originated (AMD), and it is clear there are certain folks whose agenda is to perpetuate that noise in order to confuse the market and distort the reality that FreeSync is the one facing the greater obstacles in the market, not G-Sync.
But yes, until then, we will just continue enjoying our G-Sync monitors, laughing as FreeSync falls further and further away from AMD's original claims because we have the benefit and luxury of knowing G-Sync does everything it said it would and has been for close to a year!
SLI is better than Crossfire? LOL... get current, dude. The XDMA bus is a far better and more elegant solution, as demonstrated in its total ownership of SLI for both frame rates and frame pacing. Take off the green goggles before you go blind.
LMAO, ah yes, frame pacing, the problem AMD fanboys like you spent months, years, downplaying and sweeping under the rug as if it didn't exist, right? FCAT was just an Nvidia viral marketing scam until it actually forced AMD to go back and fix their broken CF implementation YEARS later, right?
But its OK, I do recognize progress and superior tech when it is appropriate and XDMA is certainly a better approach than Nvidia's aging SLI bridge.
But unfortunately for you and AMD users, XDMA and hardware are only part of the problem, and they have only fixed part of their CF implementation by sorting out the frame pacing/microstutter problems AMD fanboys spent years downplaying.
The biggest problem for AMD and their CF implementation is that the end-user is STILL bound to AMD's driver updates, because they don't expose the compatibility bits in their CF profiles, as Nvidia has done for years. There are some half-baked workarounds that require you to copy profiles from other games, but these have a chance of breaking other features, like AA, because you don't have the granularity to change individual bit settings as you can with Nvidia profiles via a simple XML change using something like Nvidia Inspector.
But yes, all in all, AMD fans can once again thank Nvidia and their supporters for bringing about positive change for your AMD products. Because we sure as hell know nothing would've gotten fixed on the CF front otherwise! Not surprising when you have a fan base that is content with mediocrity and would rather downplay and sweep a problem under the rug, rather than demand support and a fix for it!
Who are you referring to again? I've made no such claim, they showed non-interactive demos of games at CES, but they did not do so before then, so again, if you're going to try to catch me in a lie, you're going to have to try a lot harder.
Whereas I can just go through pretty much any of these pro-AMD posts and pull out oodles of bullshit that you will subsequently have to backtrack on.
Thanks Chizow - and the TheJian as well secondarily. It really is true it doesn't matter one whit what AMD actually does, the fanboys are so deranged and so locked in, if anything doesn't work they either totally deny that fact for YEARS, or just say "I don't even care about that crap nVidia has, it's no good anyway." - then they march on over the cliff in absolute bliss, screaming and pointing their finger before the splat sound. After the splat, of course they are still ranting and screeching in ethereal form. I just want to say: 1. If just one single holy souled earth loving honest and not greedy AMD fanboy in the whole world gets one single firmware upgrade on a single monitor for his lifetime savings Hawaii gpu to use freesync with, well then AMD's marketing was 100% honest and reasonable.
See, 1/10,000,000th of what was promised is good enough if you're a good guy. You may have gotten it wrong nine million nine hundred and ninety nine thousand times, but that single AMD fanboy made happy is worth it, it's OK, no it's GREAT, even if it's the son of AMD's CEO.
On the other hand, if you are the evil demon the AMD fans war against in the name of all that is good and honorable and as worthy top level knights in shining armor, nVidia getting it right on 7 out of 8 memory octants is not good enough, and by golly, that R290x price destroying MASSIVE release of a card MUST BE PUNISHED IN THE MOST SEVERE METHODS AND WAYS ACHIEVABLE...
Yes, we won't hear a single word about "AMD LIED, and it will take some time for them to earn back the trust of their loyal customers..." repeated over and over and over, ad nauseam... NOPE - that is a fantasy (meaning fair) world that just does not exist (because lying con artists and AMD fools are common).
Yes you're right, there's multiple variations of IPS, but my main point is, the more these technologies evolve in an attempt to match TN for gaming, the further they deviate from the characteristics that made them more attractive to some users (viewing angles, IQ, color reproduction).
I won't care until all the dick waving ends and we have a clear picture of which solution will stand the test of time. Hopefully by then the panel I want at the res/refresh/size I want will be semi-affordable and I can buy two or three to replace my 3x U2412M; the majority of the market will take the same stance...
The BEST tech or standard doesn't always win, because it's often not the market itself that picks the winner; there are countless past examples dating back to Betamax/VHS. Not making a case for FreeSync mind you, I think it'd be a crying shame if it's actually inferior and gains critical mass...
But I'm sure as heck not gonna sink one or two grand into displays that won't last me a good five years at least, and I'm not rushing out to swap $600 worth of GPUs over it either.
I can understand the hesitation and I think many will be in the same boat as you, because you want a "true" upgrade across the board for all of your monitors. Unfortunately, I think we can see the trend will be a near exponential increase in cost for some time. IPS only really became affordable in recent years, most likely due to the fact TN gaming panels have become more popular and the emphasis and premium on IPS just hasn't been the same, especially given the decrease in price for IPS HDTVs into that $500 range.
I guess as someone who never really got into the whole multi-display gaming (I do have multiple monitors, just game on a single panel), I would never consider tying myself to 3x or 4x of that resolution and price, since it also increases your GPU demands. Instead, I just go for the biggest, highest resolution for my main panel at the time with the best gaming characteristics. Still use 2xU2410 for companion panels and they're still great for that. ROG Swift for my main display/gaming.
And the converse of what you stated holds too: the OPEN tech or standard doesn't always win either, because in the end the market will tend to favor the better implemented and better supported option, regardless of whether it costs more or not.
But realistically, I do hope Nvidia adopts its own version of Adaptive Sync and puts what AMD has said to the test (again), because this would give the market one less reason to buy AMD. Nvidia would support both their own version of Adaptive-Sync (N-Sync?) and a superior proprietary G-Sync. I guess then we will see just how genuine AMD was when they encouraged Intel and Nvidia to adopt their open standard when it results in stripping them of any competitive advantage.
But yeah, the commitment of 3x displays will make it 3x harder for you to invest in this kind of upgrade, while for me, I see it as spending the same on a single, fantastic monitor for the same price as 3 monitors that don't really improve or do anything better than what I already own.
In fact, my brother recently bought 3xU2415 but returned them when he came over and saw my Swift. The U2415s looked really slick on a 3 monitor stand, very little bezel, but they still had all the negatives of IPS when it came to gaming. Maybe these new FreeSync and G-Sync panels will address some of these issues but since these panels tend to come from the same factories and share the same issues/benefits, I am not going to be overly optimistic even if they claim 144Hz support.
On PCPer they show the Asus MG279Q and claim it works with FreeSync, since according to AMD's Robert Hallock, "AMD will not have a whitelist/blacklist policy for FreeSync displays and that as long as a monitor adheres to the standards of DP 1.2a+ then they will operate in the variable refresh rate window as defined by the display's EDID."
The MG279Q is a 27-in 2560x1440 display with an IPS panel that supports 40-120Hz. It ships in Q1 2015 for $599.
I'd love AMD to wave a magic wand and make my 9-year-old Dell UltraSharp 2405 screen FreeSync compliant; after all, I paid £600 for it back in the day!
The new LG 34" ultrawide with freesync is only 2560*1080? Well, that's incredibly disappointing.
The only reason I haven't already bought their 3440*1440 unit was that I'd heard a FreeSync version was imminent. So much for that.
Why is there never a monitor from any manufacturer that ticks all the feature boxes at any given time? I am prepared to spend more and compromise less for the right monitor than for pretty much any other component in my PC, as it's likely to last through multiple machines. But the last time I felt like I could get everything I wanted in one display was about 2007.
Even though G-Sync is more expensive than FreeSync, from what I've seen G-Sync works better with regard to completely eliminating tearing. I believe G-Sync will drop in cost considerably over the next few months, especially when FreeSync goes into production. One reason it costs so much is that there is simply no alternative/competing product at the moment.
If FreeSync does take off, I'm sure Nvidia will have no choice but to implement FreeSync.
GuniGuGu - Thursday, January 8, 2015 - link
Was there a 4k IPS monitor?
Azusis - Thursday, January 8, 2015 - link
Article on AU Optronics: http://techreport.com/news/27019/au-optronics-pane...
Azusis - Thursday, January 8, 2015 - link
Hah, I can only imagine. Thanks for the input! Regardless, I think a monitor upgrade in Q1 is in my future. :)
chizow - Thursday, January 8, 2015 - link
And almost everyone who writes for CUDA hates it? What kind of nonsense is this? People love and use CUDA because IT WORKS and makes THEIR JOBS, and therefore their LIVES, easier. I administrate and support a number of users and machines that use CUDA and Nvidia solutions, and at no point have I ever heard any of this gibberish; in fact, they make it a point that the solution MUST be Nvidia CUDA-based for their needs.
TheJian - Thursday, January 8, 2015 - link
Cuda works best because of $7Billion invested in it, ~8 years of building the ecosystem for it, 500+ schools teach it, every popular app for content creation etc uses it, etc etc. The reason OpenCL sucks is there is no company backing it like NV backed Cuda for years. You get what you pay for (mostly) and NV paid and paid and paid for cuda, and they're getting what they paid for now year after year.
Please show a test where Cuda loses to OpenCL. Find the best OpenCL AMD app and pit it against the best NV Cuda enabled app that does the same thing. Cuda will win pretty much always. There really is a REASON Cuda owns 80% or so of the workstation market. People who are risking money (IE, their business) don't buy crap for tools on purpose. I could hate NV/Cuda and I'd still buy it today due to nothing being better. If you don't, the guy who does do that will blow your doors off until you go bankrupt.
As long as Cuda is faster, NV has no reason to do anything but develop new versions of cuda. No point in helping the enemy catch up. That would be stupid business and cause me to drop their stock like a hot rock...LOL.
eanazag - Thursday, January 8, 2015 - link
I think Intel may weigh in here, and if their integrated graphics support FreeSync (since it is in the DisplayPort spec), then FreeSync will dominate.
I don't care for vendor lock-in because I do go back and forth between vendors. I have multiple machines in the house. Currently my house is AMD R9 290, but it was Nvidia GTX 660 before. I have integrated and discrete in laptops. For discrete GPUs in laptops, I generally lean towards Nvidia because I don't want to play the Enduro "does it work?" game.
ppi - Thursday, January 8, 2015 - link
They will not abandon G-Sync. But if there are a lot of FreeSync monitors in the wild (and with Samsung and LG backing, there will be), they will just easily enable FreeSync compatibility on their cards. So they will support both.
Antronman - Saturday, January 10, 2015 - link
But if FreeSync works just as well, why would consumers buy ~$800 monitors when they could get the same on ~$150 monitors?
chizow - Sunday, January 11, 2015 - link
@Antronman, where do you get $150 monitors being the same as $800 monitors? Excluding G-Sync on any panels in the market, do you think you "get the same" on a $150 monitor as a $600 non-G-Sync equivalent?
I know you're trying to perpetuate the hopeful myth all monitors will be FreeSync compatible simply because Adaptive Sync is an optional component of the DP1.2a standard, but let's be real here, not all monitors that support DP1.2a are going to be Adaptive/FreeSync compatible.
Antronman - Sunday, January 11, 2015 - link
There's a lot of DP 1.2 monitors that have Adaptive Sync.
Freesync is software that enables GPUs to utilize it for variable refresh rates. G-sync is hardware that enables GPUs to use it.
Haven't you read the articles? Freesync is a standard feature of DP 1.2a and onwards. Every DP 1.2a and onwards monitor will have it.
chizow - Monday, January 12, 2015 - link
No, there's not a single DP 1.2 monitor that supports Adaptive Sync or FreeSync, or more specifically, variable refresh and dynamic frame rates. No firmware or software update on the planet is going to replace the missing hardware necessary to make FreeSync a reality.
Haven't you read the articles? DP1.2a was ratified in May 2014 and is an OPTIONAL part of the standard that is not backward compatible with existing displays, because that OPTIONAL functionality requires new scaler ASICs that were not developed until after the spec was ratified, with news of them sometime in September 2014, and pre-production samples finally being displayed this week at CES.
http://www.amd.com/en-us/press-releases/Pages/supp...
I don't blame you for being confused on this though, I completely understand all the noise regarding FreeSync's development has been very misleading, which is why I think most companies choose to develop the tech and begin production in working devices before making a ton of unsubstantiated claims about it. :D
FlushedBubblyJock - Tuesday, February 24, 2015 - link
We will hear for YEARS the prior falsehoods the AMD fanatics just spewed in comments over the last 2 pages, and it will become their always proclaimed belief system, despite the facts.
The chip on their shoulders will grow, they will never have any valid counterargument, and of course the rest of us will have to put up with it.
When freesync doesn't pan out properly, is less effective with less flexible features, has problems working with many games, and requires hacks and has driver issues with AMD, the same crowd will blame nVidia for not "supporting it" and "causing AMD problems".
They will then deeply desire a huge lawsuit and a payment from nVidia directly to AMD, while having exactly the OPPOSITE stance when it comes to AMD's Mantle.
Gigaplex - Thursday, January 8, 2015 - link
Amen. I'm hoping G-Sync dies out pretty quickly so we don't have two competing implementations for too long. It's not likely G-Sync will be the better option for interoperability, due to NVIDIA's reluctance to license their technology.
chizow - Thursday, January 8, 2015 - link
I always find this to be an interesting dichotomy. AMD fans will frequently parrot the internet meme that the world needs AMD for the sake of competition against the likes of Intel and Nvidia, or we'll all end up somehow perishing under the oppression of $10,000 CPUs and GPUs, and that competition is always good and necessary for the consumer.
But when Nvidia comes out with an innovative new technology that AMD doesn't have in an attempt to differentiate and better their products for their customers, these same AMD fans want them to fail. Because they're competing too hard?
Just find it a bit ironic. :)
mickulty - Thursday, January 8, 2015 - link
G-Sync is a proprietary standard. Freesync is just a brand name for DisplayPort Adaptive Sync. The difference is Nvidia could easily implement Freesync if they wanted to, but AMD cannot implement G-Sync. Would you rather have a single standard that only one company is allowed to use, or a single standard that everyone can use? :) If there wasn't an open alternative standard you'd have a point, nothing wrong with good technology, but the fact is a world where everyone implements adaptive sync would be a world that's better for everyone, with the possible exception of nvidia's marketing department.
Also, I wouldn't really compare the GPU world to the CPU world. While I'm sure you'll respond to this with something pointless about the state of the market over the last couple of months, AMD STILL have the fastest single card and have come out with faster cards than Nvidia for years. Nvidia is not like the graphics version of Intel - frankly they do as much to make AMD keep competing as AMD does to make them keep competing.
Yojimbo - Thursday, January 8, 2015 - link
It's in DisplayPort because it was given to VESA for free by AMD and VESA accepted the IP. This was probably a matter of NVIDIA being months ahead with G-Sync and controlling over 60% of the discrete desktop graphics chip market share. AMD couldn't compete with NVIDIA on the matter, so their best strategy was to be defensive and give it away to take the advantage away from NVIDIA. However, as long as you have a current NVIDIA video card, FreeSync won't work with it, so if you want that feature, you'll have to get a G-Sync monitor instead. If FreeSync works just as well as G-Sync, it would be nice if NVIDIA would support FreeSync, but I wouldn't count on it. They spent money developing it first and on their own, and I believe it or something similar has already been available for their Quadro cards since 2012, so it would mean supporting both standards at once for a while. Unless they are forced to, they probably don't want to incur extra cost overhead from an innovation they developed because a competitor decided to develop something similar and give it away.
haukionkannel - Thursday, January 8, 2015 - link
Hmmm... Adaptive Sync is from VESA and a much older "thing" than the FreeSync rebranding made by AMD. AMD just gave a new name to already existing technology and did show a new way of using existing technology to achieve a similar effect to what can be done using G-Sync.
All in all, the open standard is better. That is why DX12 will beat Mantle in the long run (even though DX is not open, it is used by all competitors). That is why Adaptive Sync will win in the long run. Intel will use it to make their IGPs look less bad, and AMD have to use it because they want to compete with Nvidia. Nvidia will wait and make money as long as they can by selling G-Sync products. When G-Sync is not selling, Nvidia will release new super drivers that allow Nvidia adaptive sync with all their products, because they don't want to use the same name as AMD, who will keep on branding the exact same system as FreeSync because it sounds better than Adaptive Sync, even though it is exactly the same thing...
chizow - Thursday, January 8, 2015 - link
Actually, according to AMD, Nvidia can't just support FreeSync. In fact, even most of AMD's recent GPUs can't support it. Who knows if Intel can? FreeSync has an even smaller addressable market than G-Sync right now, and this is a fact.
http://techreport.com/news/25867/amd-could-counter...
According to AMD's Raja Koduri:
"The exec's puzzlement over Nvidia's use of external hardware was resolved when I spoke with him again later in the day. His new theory is that the display controller in Nvidia's current GPUs simply can't support variable refresh intervals, hence the need for an external G-Sync unit."
medi03 - Thursday, January 8, 2015 - link
Dear anandtech.com god.
For the love of reader's sanity, please ban this troll.
dcoca - Thursday, January 8, 2015 - link
Off the bat, I have both a 970 Asus and an R9 290 Asus: subjectively they are the same for gaming; for OpenCL, AMD kicks ass... I have had the Universe Sandbox 2 alpha for the last year and a bit (beta testing it), and the performance of AMD cards is huge when lots of shitz is happening in the simulation. In my perspective, AMD is the champ for anything that uses OpenCL that isn't a benchmark... however, the 290 is a reference card that I picked up when it was released, and yes it gets much louder than the 970... both cards are great. On the CPU side I like their idea of the APU unit and HSA, but it needs to be faster and on par with Intel IPC before I ever go back to them.
FlushedBubblyJock - Tuesday, February 24, 2015 - link
Nothing hurts a rude fanboy more than the truth.
1st amendment, try to not be a book burner.
ppi - Thursday, January 8, 2015 - link
Consider that only a handful of the latest AMD cards support FreeSync.
nVidia will not have issues supporting it in their next generation of cards (along with G-Sync). Intel could do the same with Skylake (and mind you, this tech is best for low-performing gfx cards).
With Samsung and LG backing this technology (and Asus having a non-branded but probably compatible monitor as well), this is bound to get some serious market share. I mean, it looks to me like the FreeSync monitor offering is going to be wider than G-Sync's, even though that one had a one-year head start.
TheJian - Friday, January 9, 2015 - link
Let me know when AMD shows it running GAMES (a lot of them) to prove it works. I'll take the one that is BEST, no matter who owns it, if they also have a great balance sheet (higher R&D, profits, cash on hand, etc) and can keep investing in the best drivers for the GPU I'll need to buy to use with the monitors. While I currently run a 5850, it will be my last AMD card/product for a while if G-Sync is BETTER. I already have zero interest in their CPUs due to Intel. I hate that, but again, why buy 2nd when they're so far behind? I just can't do that anymore these days. AMD has to beat NV in GPU perf with at least the same power AND at least MATCH G-Sync (no loss in quality/perf results) or they lose my money for ages this time.
NV owns 67% of discrete, AMD owns 33% or so. NV owns 80% of workstations. Umm, NV is like Intel to AMD in GPUs. AMD owns next to nothing these days due to selling everything off to stay afloat. Profits are rare, the balance sheet is in shambles, etc. The complete opposite of NV. NV wins in every financial category. These two companies are not even on the same playing field today (used to be, but not now). Please take a look at their financials for the last decade, then explain to me how they are even. NV has put as much money (~$7B) into Cuda over the last 8 years or so as AMD has LOST in the last decade (about $7B of losses). If that isn't night and day, I don't know what is. You're not making sense, sir. I don't care about everyone, I care about having the best on my desk that I can get ;) If that means proprietary, I'll go that way unless 2nd is so close you can't tell the difference. But we can't say that here yet, since AMD seems afraid to show it GAMING.
You talk like you've seen freesync running 100 games. NOBODY has. I'm more worried about being stuck with a CRAP solution if NV caves while having the BEST solution already in hand and done (don't think they'd do that, but...). Maybe AMD would have a marketing dept if they'd start making some money so they could afford a REAL one. They need to start putting out GREAT products instead of "good enough for most", so they can charge a PREMIUM and make some profits for a few years in a row. The only reason AMD has the fastest single card is NV didn't want one ;) It's not like they don't have the cash to put out whatever they want to win. I'm fairly certain AMD isn't making wads of cash on that card ;) Their balance sheet doesn't show that anyway :( Whatever they spent to dev that card should have been spent on making BETTER cards that actually sell more volume (along with better drivers too!). NV could probably buy AMD if desired at this point (stupid, but they could). NV's market cap is now 5.5x AMD's and they spend more on R&D with less products (AMD makes a lot of cpus, etc but spends less on R&D). NV is beating AMD for the same reasons Intel is. Smarter management (should have paid 1/3 for ATI, should have passed on consoles like NV, etc etc), better products, more cash, less debt, more profits. That's a LOT for any small company to overcome and we're not even talking the coming ARM wave to steal even more low-end crap on the cpu front where AMD lives (getting squished by ARM+Intel sides). You should get the point. You can't compete without making money for long.
Antronman - Saturday, January 10, 2015 - link
What about HBM? What if (it's very likely) the R9 300 series cards have HBM? What then?
Because who really needs adaptive refresh rates when you can just buy a 144Hz monitor and have a minimum 144fps?
Adaptive refresh rates are a feature for budget systems and 4k resolutions on the current GPUs. There really is little other use.
chizow - Sunday, January 11, 2015 - link
@Antronman, this is the 2nd statement you've made that is, more or less, nonsense. Are you some kind of Hope Merchant? Are you really trying to say HBM is automagically going to result in 144fps minimums? Is HBM even confirmed for AMD's next cards? Bit of a stretch since you're kinda going 2 degrees of FUD separation here, don't you think?
Back in reality, many people will benefit at less-than-max refresh rates without having to use triple buffering, solving the issue of tearing with minimal input lag/stutter.
Antronman - Sunday, January 11, 2015 - link
That was an exaggeration. But several multi-card setups (and even now some single cards) can attain 144fps.
Adaptive refresh rates are only useful if you have a card that can't hold a minimum fps above the refresh rate of a fixed-refresh panel. That is, assuming maximum refresh rates don't go any higher than they are now (if we can even see the difference at higher refresh rates).
chizow - Monday, January 12, 2015 - link
Ah, so you posted a bunch of rubbish and only clarified once called on it, gotcha. Why am I not surprised? Sounds like more "noise" within a distinctly pro-AMD diatribe.
There are very few multi-card set-ups that can achieve and maintain 144fps minimums in modern games, and also, at what resolution are you talking about? 1080p? 720p? This new wave of FreeSync and G-Sync monitors is moving beyond 1080p and pushing 1440p, which is nearly 2x as many pixels as 1080p, meaning it is that much harder to maintain those FPS.
Same with 4K: it's 4x the pixels/resolution of 1080p, so yeah, you're looking at video cards and combinations that don't exist yet that can maintain 60FPS minimums at 4K for modestly demanding, recent games.
In every single one of these situations, variable refresh would work wonders for end-users. In fact, spending a little bit of extra money on a variable refresh monitor may end up saving you from having to spend 2-3-4x as much on video cards trying to achieve the unrealistic goal of minimum FPS that meets or exceeds maximum monitor refresh rates, whether 144Hz or 60Hz at 4K.
djc208 - Thursday, January 8, 2015 - link
Because it's the Apple way of doing things. You have to buy into the NVidia ecosystem and stay there. Nothing prevents NVidia from supporting both their standard and FreeSync, or Intel from offering it in their APUs. Since there's nothing to license, it should eventually be easy for monitor manufacturers to build it into every monitor they offer, even if in limited form for cheaper panels that wouldn't support the same range of refresh rates.
No one argues Apple doesn't make good stuff, and they have helped drive the mobile and desktop market in beneficial directions for a while, but it's still Apple, and most of us don't want to live under their roof. Same goes for Nvidia.
Yojimbo - Thursday, January 8, 2015 - link
It's also the AMD way of doing things (TrueAudio), and just about every company's way of doing things. AMD simply felt they couldn't compete here, so they played defensively instead of going head-to-head on it with NVIDIA.
medi03 - Thursday, January 8, 2015 - link
So what is "defensive" about FreeSync? The fact that it is available to all vendors (with no royalty fee) to implement?FlushedBubblyJock - Tuesday, February 24, 2015 - link
What's defensive is that AMD is penniless, so they will do the second-hand generic and demand, like all their fans do, that it become the universal monopoly. Thus, when Bill Gates and Microsoft OWN the OS on every computer in the world, you AMD fans need to be reminded: that's the way you like it.
chizow - Thursday, January 8, 2015 - link
And thus is the nature of competition, isn't it? You make your products better than the competition, to benefit your existing customers and attract new ones?
I guess Nvidia, Apple, Intel and everyone else should just stop innovating and developing new tech, or just give it away for free so that their competitors can catch up? Or even more laughably, give up on their original, pre-existing, and superior tech just because their competitor is finally offering an inferior analogue over a year later?
dcoca - Thursday, January 8, 2015 - link
Sup dude, something makes me think u work at Nvidia: happy new years
FlushedBubblyJock - Tuesday, February 24, 2015 - link
Sup coca - makes me think you wish you had an AMD card but can't afford it yet, so you hate rich nVidia users like me.
chizow - Thursday, January 8, 2015 - link
There are still a lot of unanswered questions about FreeSync beyond what is covered here, such as cost, lag/latency, and framerates beyond max refresh. In PCPer's analysis, they mentioned a potential deal-breaker with FreeSync when framerates exceed the monitor's refresh rate. They mention it was a "design decision" by AMD to either force the user to enable Vsync in this situation, or to suffer from screen tearing, thus negating the benefit of variable/dynamic refresh. I'm really not sure why AMD didn't implement a soft/driver framerate cap like Nvidia did with G-Sync.
There are also questions about input lag/latency, as G-Sync has held up favorably when tested independently against the fastest panels with V-Sync off. We will have to see how FreeSync fares against them.
In the end, it will simply come down to which solution is better, as usual. As I expected months ago, and as it looks to be shaping up from these early demos, G-Sync appears to be the superior solution, and as such, it will command a justified premium price. These monitor features will just end up being another checkbox that develops each vendor's ecosystem and influences a buying decision; such is the nature of competition.
@Jarred: did AMD mention when you might be getting some review samples? Look forward to seeing some tests and general impressions!
Gigaplex - Thursday, January 8, 2015 - link
"In PCPer's analysis, they mentioned a potential deal-breaker with FreeSync when framerates exceed the monitor's refresh rate. They mention it was a "design decision" by AMD to either force the user to enable Vsync in this situation, or to suffer from screen tearing, thus negating the benefit of variable/dynamic refresh. I'm really not sure why AMD didn't implement a soft/driver framerate cap like Nvidia did with G-Sync."Uh, what do you want it to do? How is a framerate cap any better than v-sync capping the framerate?
chizow - Friday, January 9, 2015 - link
@Gigaplex: the answer is quite simple, you can have a framerate cap WITHOUT Vsync, because in no way are they mutually dependent. The downside of course on traditional displays is that you still get screen tearing, but on a dynamic refresh rate monitor, you won't get tearing; you get the most recent, full frame WITHOUT the associated input lag.
G-Sync does this; with FreeSync it is less clear, because it appears they are still relying on some form of V-sync (most likely triple buffering for sub-native refresh) which reverts and breaks above scaler-supported refresh rates, forcing them to enable V-Sync at the application level.
Honestly, to me it looks like they've basically implemented Nvidia's older frame rate control technology, Adaptive VSync, in hardware at the scaler level, without the "automatic" switching of Vsync On/Off. As a refresher, Adaptive VSync was introduced by Nvidia with Kepler; it automatically enables V-Sync at the monitor's native refresh rate (to prevent tearing) and disables it below native refresh (to avoid V-Sync's stutter and lag penalty). I am sure this work was instrumental in coming out with their ultimate solution, G-Sync, which bypasses Vsync and the problems associated with it altogether.
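For what it's worth, a frame cap is conceptually independent of V-Sync: you just throttle how often frames are submitted, while presentation stays immediate. A minimal C++ sketch of the idea (my own illustration only, not actual driver code; the render/present calls are hypothetical placeholders):

#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    // Cap at 144 fps: one frame every ~6.94 ms, no V-Sync involved.
    const auto frame_interval = std::chrono::microseconds(1000000 / 144);
    auto next_deadline = clock::now() + frame_interval;
    for (int frame = 0; frame < 600; ++frame) {
        // renderFrame(); presentImmediately();  // hypothetical placeholders
        std::this_thread::sleep_until(next_deadline); // waits only if we finished early
        next_deadline += frame_interval;              // a slow frame just starts late
    }
}

Note the loop never blocks on the display's refresh, which is the whole point: the cap bounds the framerate without introducing V-Sync's back-pressure.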
Creig - Thursday, January 8, 2015 - link
G-sync only works within a certain frequency range as well. It's no different than FreeSync in that respect. And AMD is developing a Dynamic Frame Rate Control feature for their drivers. So G-sync and FreeSync do appear to be fairly closely matched.
Well, except for the fact that G-sync requires a custom FPGA board to be installed (extra cost), requires this FPGA board to be tuned to the panel of the monitor it's going to be installed in (extra cost), and requires a licensing fee to be paid to Nvidia (extra cost). Since the only additional cost for FreeSync is an updated scaler, G-sync will always be more expensive than FreeSync.
Not to mention G-sync locks you into nvidia GPUs only, while FreeSync is based on an industry standard which any company is free to develop for. And Adaptive-Sync (which FreeSync is based on) also has power-saving benefits. Since this is an industry standard, I would expect Intel to eventually jump on the bandwagon as well.
Right out of the gate, FreeSync is looking to eclipse G-sync. AMD has reported that there could be as many as eleven different FreeSync capable monitors on the market by the end of March. How many models of G-sync monitors have been released in the 12 months since G-sync came out? Seven?
Yes, we need objective reviews between G-sync and FreeSync before we can make a meaningful comparison. But G-sync seems far from being "the superior solution" as you put it.
invinciblegod - Thursday, January 8, 2015 - link
There's no point in touting that FreeSync is an open standard until others adopt it. There are plenty of open standards that go unused and end up basically proprietary when only one company uses them. Micro-USB Type-A, for example.
chizow - Thursday, January 8, 2015 - link
G-Sync works in a wider frequency range, and by design, it can't go out of this range, never forcing the user to make a choice between negating the benefit of dynamic refresh by enabling V-Sync or suffering tearing, like FreeSync does. And yes, AMD has recently been touting its DFRC feature as a built-in framecap, but it's curious why they didn't just make this a softcap for FreeSync enabled globally. It's most likely because FreeSync is still tied to and limited by the monitor refresh, so exceeding the refresh intervals reverts back to tearing, but I am sure we will get more in-depth info on this once they actually hit the market or review samples go out.
The G-Sync module with custom FPGA and onboard DRAM is necessary, but also most likely why Nvidia ends up with the better solution with G-Sync. It's hard to emulate in software what is being done in hardware, which is what I think AMD is running into with their FreeSync solution. Not to mention, we still don't know the price difference between the "expensive" G-Sync module vs. the custom, premium scalers AMD and their partners took 8-12 months to bring to market, but they certainly have changed their tune from the days they were claiming FreeSync would be an essentially free upgrade that might only need a firmware update on existing monitors. :D
And FreeSync locks you in just as much as G-Sync; as long as no one else supports FreeSync, it's in the exact same boat. Worse boat, actually, since Nvidia commands an overwhelming majority of the dGPU market at a 70/30 rate for supported GPUs, and not even all AMD GPUs in the same generational span (again, I hope I don't need to redefine this for you) are even capable of supporting FreeSync. Meanwhile, all Nvidia GPUs from Kepler onward support G-Sync. Intel has shown no interest in FreeSync, which is really no surprise given its main benefit is gaming, a market Intel doesn't really address with their IGPs, and they got all they needed out of the eDP spec when they invented and developed it for reduced-refresh power saving years ago in their laptop displays.
And right out of the gate FreeSync is looking to eclipse G-Sync? Haha, how so? Nvidia has just as many G-Sync panels on the market TODAY as AMD has ANNOUNCED, and from what we have seen already, FreeSync is inferior to G-Sync in every way but *POSSIBLY* cost. We don't even know if FreeSync addresses the other major problem associated with V-Sync, input lag/latency, until reviewers and interested sites get their hands on them. So yes, I guess a solution that is already demonstrably inferior, is still an unknown commodity in terms of cost, is still not available for purchase and still has major questions about its compatibility and implementation has somehow eclipsed a solution that is available today, and does everything it claimed to do, from Day 1 in G-Sync? Fanboy fantasy at its finest.
But yes, we do need more objective reviews to further dissect and expose the differences between G-Sync and FreeSync. G-Sync has been on the market for about a year and has held up against such scrutiny; all the questions are surrounding FreeSync, but from what we have already seen, it is falling short.
Creig - Thursday, January 8, 2015 - link
The stated compatible refresh rates for FreeSync are from 9Hz to 240Hz. I haven't been able to find any upper or lower limits listed for G-sync, but I fail to see how it could possibly work "in a wider frequency range" than FreeSync as you claim it can.
FreeSync does not require a custom FPGA board, does not require custom tuning for each panel, does not require that royalties be paid back to AMD. Therefore, given two otherwise identical monitors, the FreeSync display will always be less expensive than the G-sync alternative. That's a huge win for FreeSync right there. And you are wrong about AMD claiming that it would only be a free upgrade for existing monitors. They explicitly stated that although their test model monitor was able to utilize FreeSync with only a firmware update, most likely this firmware would NOT be available through the manufacturer.
From the Anandtech "FreeSync Monitor Prototype" article:
"At this point AMD is emphasizing that while they were able to get FreeSync up and running on existing hardware, owners shouldn’t be expecting firmware updates as this is very unlikely to happen (though this is ultimately up to monitor manufacturers. Instead AMD is using it to demonstrate that existing panels and scalers already exist that are capable of variable refresh, and that retail monitors should not require significant/expensive technology upgrades."
You can claim that AMD's FreeSync "locks" you in to their hardware, but only because Nvidia has stated that they refuse to support Adaptive-Sync in favor of their proprietary version. G-sync is not an industry standard. Adaptive-Sync is an industry standard. We don't know yet whether or not Intel plans to support AS in the future, but it would be almost inconceivable for them not to. Adaptive-Sync benefits lower end systems more than dedicated gaming machines. Laptops and business computers (more often than not) fall into the "lower end system" category. And Intel sells a LOT of hardware that ends up in laptops and entry level computers.
Naturally not all AMD video cards will support FreeSync. Just as not all Nvidia cards support G-sync. There will always be a cutoff line for hardware that simply will not be compatible with new technology. For G-sync, it's Kepler and newer. For AMD, I believe it's Tahiti and newer. I don't know where you're trying to go with this one. It's simply the way of things in the technology world that not all old models will support new features.
And I stand by my assertion that FreeSync appears to have a better potential for success than G-sync. It's taken Nvidia an entire year to get as many models out as AMD has announced that will support FreeSync within months. And while Nvidia may have a majority in the dGPU market, it has only a 20% share in the overall GPU market with AMD taking 20% and Intel holding 60%. And as it's the lower end systems that will be typically running at lower frame rates, it's these systems that will be utilizing the benefits of Adaptive-Sync more often than dedicated gaming computers. Intel has not yet weighed in on whether or not they will eventually support Adaptive-Sync, but being an industry standard, they may do so whenever they wish. Nvidia is free to adapt their GPUs for AS as well. That's the beauty of it being an industry standard.
I haven't read any FreeSync reviews yet and I'm almost certain you haven't either. In fact, you have absolutely NO IDEA if there is any visual difference whatsoever between G-sync and FreeSync. To make a claim such as "FreeSync is inferior to G-Sync in every way but *POSSIBLY* cost" without any reviews at all to back yourself up simply makes you look daft. I plan to wait until the professional reviewers get done with their comparisons before I'll make any claims as to the performance of FreeSync vs G-sync. But I'll stand by my previous assertion that FreeSync's future is already looking brighter than G-sync's.
Creig - Thursday, January 8, 2015 - link
Well, HotHardware had a crew at CES who looked at the AMD FreeSync displays and had the following to say:
"AMD expects there to be a total of 11 FreeSync displays available by March at varying refresh rates, resolutions, and panel sizes, including IPS panel options and the aforementioned 144Hz gaming panels. Obviously a full comparison between G-Sync and FreeSync will have to wait for head-to-head hardware, but our team reports that the two standards appeared to perform identically.
Assuming that continues to be true, AMD could have an advantage with this feature -- FreeSync reportedly doesn't add any additional cost to display panels, whereas the ASIC hardware required for G-Sync reportedly increases panel retail cost by ~$150. Of course, it's ultimately up to the manufacturers themselves whether or not to charge a premium for FreeSync monitors -- there's just no baked-in cost increase from specialized hardware."
As I said, FreeSync is already looking pretty good. Just waiting on the official reviews now.
chizow - Thursday, January 8, 2015 - link
Interesting, so is AMD once again perpetuating the myth FreeSync has no additional cost? LMAO. Sticker shock incoming for AMD fanboys, but yes, I am looking forward to further dissection of their half-baked, half-assed solution.
Intel999 - Friday, January 9, 2015 - link
The one company that is offering both G-Sync and FreeSync monitors has priced the FreeSync model $150 less. So even if FreeSync isn't free, it is the more reasonable of the two.
TheJian - Friday, January 9, 2015 - link
You do understand that the "crew" has not seen it running a GAME yet, correct? The "crew" saw the same things anandtech did. Windmill demos etc. Get back to us when we all see them running GAMES and NOBODY can tell the difference at THAT point in time. Currently we know specific demos set up to show the effects work, but have ZERO idea how it works in games, because for some reason (umm, inferior tech?), they even shy away from showing it at a CES show 3 months before they supposedly hit. Hmmf... Not much confidence in the gaming part, right? Or why wouldn't you have shown a dozen games working? How hard is it to have games running?
Freesync is looking pretty shaky to me, or they'd be showing games running. Anandtech mentions the costs (and forgets testing for certification also costs; panel makers must pay this, and testing equipment costs also), so it isn't free. Free for AMD maybe, but that's it, and they have to pay to R&D the cards to comply also, or all their cards would work (Nvidia also). There is a reason NV won't support this for at least another generation (cards not compatible, as AMD even suggests), and also a reason OLD AMD cards won't work either. Compliance from all sides is NOT free. Scaler tech had to be modified (so that required some "SPECIALIZED HARDWARE", correct?), monitors need to be VESA certified to wear that label, and AMD/NV have to mod their cards. I could go on but you should get the point. NONE of that is free.
"FreeSync is royalty free, but that doesn’t mean that there are not additional costs involved with creating a display that works with FreeSync. There’s a need for better panels and other components which will obviously increase the BoM (Bill of Materials), which will be passed on to the consumers."
Did you miss that part of the article? The standard is free, but that is it. The rest COSTS money and WE will be paying it. The only question is HOW MUCH?
chizow - Thursday, January 8, 2015 - link
@Creig: why are you still quoting dated spec sheets that clearly do not reflect reality now that we have seen ACTUAL FreeSync monitors announced that clearly show those specs are inaccurate, if not purposefully misleading? AMD can say 6 months ago that FreeSync *CAN* support anywhere from 9 to 240Hz, but if the actual scalers that go into production only support FreeSync in the 40-60Hz band, or the 30-144Hz band (on the BenQ), is that 9 to 240Hz statement accurate? Of course not; we can just file it under more nonsense AMD said about FreeSync prior to actually, y'know, doing the work and producing an actual working product.
And no, I am not wrong about AMD claiming monitors might essentially get a free upgrade to FreeSync, because Raja Koduri was telling anyone who would listen this time last year that there might be existing monitors on the market that could do this with just a firmware upgrade. But I know, just more FUD and misinformation from AMD as they scrambled to throw together a competing solution when Nvidia completely caught them with their pants down by introducing an awesome, innovative new feature in G-Sync.
http://techreport.com/news/25867/amd-could-counter...
"The lack of adoption is evidently due to a lack of momentum or demand for the feature, which was originally pitched as a power-saving measure. Adding support in a monitor should be essentially "free" and perhaps possible via a firmware update. The only challenge is that each display must know how long its panel can sustain the proper color intensity before it begins to fade. The vblank interval can't be extended beyond this limit without affecting color fidelity."
What's even more interesting is that TechReport subsequently edited their story to exclude mention of Koduri by name. Wonder why? I guess he probably asked them to redact his name as he tried to distance himself from comments that were obvious misinformation at a time they had no solution and were just grasping at straws. But revisionist history aside, you and every other AMD fanboy were screaming from the rooftops a year ago saying FreeSync would be better because it would be "Free" and Nvidia's G-Sync was charging an overpriced $200 premium. Now, it's a $50-100 premium, maybe, yet still better. Interesting what a year and actual product will do to change the situation.
And yes, FreeSync locks you in to AMD solutions only, and only newer AMD solutions at that. Who else supports FreeSync other than AMD right now? What makes you think Intel has any interest in FreeSync, or that their display controllers can even support it, given even many of AMD's own cannot? What's even more funny is Intel has shown more interest in Mantle than FreeSync, and yet, AMD was happy to deny them there, but FreeSync which they are freely giving away, no interest from Intel whatsoever. So yes, if you buy a FreeSync monitor today, you lock yourself into AMD and should have no expectation whatsoever that any other GPU vendor will support it, simple as that.
And where was I going with support? Where I was going is simple: you can't naively assume Nvidia, Intel, Qualcomm or anyone else can simply support FreeSync when it was a spec that AMD designed to work with their hardware, and only works with SPECIFIC AMD hardware at that. Again, you assume many things but as usual, you are wrong; the only question is whether or not you are attempting to deceive purposefully or you are just speaking ignorantly on the topic. Tahiti and other GCN 1.0 cards are NOT supported for 3D games with FreeSync, per AMD's own FAQ, only a handful of newer GCN 1.1+ ASICs (Hawaii, Tonga, Bonaire) and a few newer APUs. But this is par for the course for AMD and their half-baked, half-supported solutions, like TrueAudio, Virtual Super Resolution, CF Frame Pacing etc.; only SOME of their cards in any given timeframe or generation are supported in all modes and there is no clean cut-off. Nvidia on the other hand is easy: anything Kepler and newer. The way of the world is, Nvidia is much better at supporting new features on legacy hardware. AMD does a much worse job, but to say anyone can just support FreeSync if they want to is a bit laughable given AMD can't even support FreeSync on all of their still-relevant cards, and I am SURE they want to. :D
Of course you take the position AMD FreeSync has a better position in the market, but unfortunately, reality says otherwise. Nvidia holds a commanding lead, even bigger in the last few months when these relevant SKUs were sold, in the only TAM that matters for the use-cases these displays will be sold and deployed: gaming. That means mobile and dGPU. Overall graphics market share means absolutely nothing here because over 50% of that is Intel, which might as well not exist in the discussion. That brings us back to over 70% of the market for Nvidia (Kepler + Maxwell) vs. 30% or less for AMD (3 ASICs + a few APUs in the same 3-year span). Oh, and 0 monitors on the market, while Nvidia has already brought at least 1 model to market from 7 different partners, with even more on the horizon. But yes, some idiot will sit here and tell you that something you can't even buy on the market, that is inferior to the established solution G-Sync, and that is only supported by the market underdog, is somehow better suited to succeed! Amazing.
You haven't read any reviews because you choose ignorance, that's all it comes down to. Honestly, is your reading comprehension so poor that you didn't read my reference to PCPer's assessment, based on THEIR DISCUSSIONS WITH AMD, live from CES? Ryan Shrout said plenty to indicate FreeSync is already inferior to G-Sync (worse minimums, tearing above refresh or Vsync), and that's not even addressing the remaining questions of whether or not FreeSync even improves latency over V-sync.
Here you go, so you can't continue to feign ignorance; you can also update your frame of reference regarding nonsensical claims of 9-240Hz FreeSync support while you are at it.
https://www.youtube.com/watch?v=8rY0ZJJJf1A#t=2m20...
MTRougeau - Thursday, January 8, 2015 - link
Mark Rejhon over at blurbusters.com got a chance at some FreeSync monitors at CES today, and his initial impression is that AMD's implementation is on par with G-sync. "Now, my initial impressions of FreeSync is that it's on an equal footing to GSYNC in motion quality. At least by first impression, without looking closely at them "under a microscope". FreeSync certainly eliminated stutters and tearing, just like GSYNC does, even if the methods/technologies work somewhat differently. A future article will probably compare GSYNC and FreeSync. Many sources have reported various pros and cons of GSYNC and FreeSync, but a major one that sticks out: Lower cost of implementing FreeSync." http://forums.blurbusters.com/viewtopic.php?f=16&a...
Of course, we will have to wait for more detailed analysis, but early impressions are encouraging.
MTRougeau - Thursday, January 8, 2015 - link
A bit more from his post: "...I played around with the options of the windmill. It had an option to sweep the framerate. The sweep was seamless, seeing framerate bounce around from 40fps through 60fps without stutter. It looked every bit as good as looking at G-SYNC at the same rates (40-60fps in Pendulum Demo). Disable FreeSync and VSYNC brought about ugly tearing and stutters, so it was certainly cool to see FreeSync doing its job."
chizow - Friday, January 9, 2015 - link
Thanks for the link but again, this picture here, should be a huge red flag and reason for concern for anyone interested in FreeSync, or G-Sync for that matter:
http://www.blurbusters.com/wp-content/uploads/2015...
Why is Vsync on at all? Why does the associated quote confirm VSync is enabled when it says "Disable FreeSync and Vsync brought about ugly tearing...."
To me, it sounds like they are still relying on triple-buffered Vsync to achieve the incremental framerates between 40-60 (rather than only the fixed divisors of the max refresh) and then using the Adaptive-Sync/VBlank signal to change the refresh rate on the monitor's scaler as the framerate changes (most likely with 1 or 2 frames of latency here too).
But the key is, all that Vsync lag is still going to be present, even if the tearing and stuttering associated with no VSync is gone. That was only half the problem and compromise re: Vsync On/Off, but I guess we will need to observe it under a microscope to see if FreeSync addresses latency at all over Vsync On as well as G-Sync does. My bet is, it does not.
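Conceptually, here is the behavior I am describing, based on my reading of PCPer's account (a rough C++ sketch only, with a hypothetical 40-60Hz panel band; not AMD's actual code):

#include <algorithm>
#include <cstdio>

int main() {
    const double min_hz = 40.0, max_hz = 60.0; // hypothetical shipping panel band
    const bool vsync_on = true;                // the "design decision": lag vs. tearing
    for (double fps : {35.0, 45.0, 58.0, 75.0}) {
        if (fps >= min_hz && fps <= max_hz) {
            // In band: the refresh interval simply tracks the frame time.
            std::printf("%5.1f fps: panel refreshes every %.1f ms\n", fps, 1000.0 / fps);
        } else {
            // Out of band: refresh pins to the band edge and you pick your poison.
            const double hz = std::max(min_hz, std::min(fps, max_hz));
            std::printf("%5.1f fps: pinned to %.0f Hz with %s\n",
                        fps, hz, vsync_on ? "V-Sync lag" : "tearing");
        }
    }
}

The else branch is the problem area: everything outside the band reverts to the old V-Sync On/Off tradeoff.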
I am also interested to see what granularity the scalers on the monitors are capable of. Is it 1Hz frequency differences, or larger chunks?
TheJian - Friday, January 9, 2015 - link
"I played around with the options of the windmill."Let us know when someone plays around with GAMES. Windmills mean nothing IMHO. We don't play windmills ;) They saw the same crap demos as everyone else at CES. NO GAMES. Until we see that running on many sites (which will mean we'll see many games as they don't all use the same games), we really don't know how good the tech is.
I challenge anyone to explain why AMD would NOT run games at CES if everything was working as advertised.
FlushedBubblyJock - Tuesday, February 24, 2015 - link
Uhh, AMD didn't use games because (accepting YOUR challenge), at CES and everywhere else, thousands of drooling idiot AMD fans don't care! So long as they can scream that nVidia must NOT ever have proprietary technology no matter what, and nVidia "IS RUINING GAMING FOR EVERYONE!", AND "nVidia will cost more and everyone will be unfairly locked in!" - they couldn't care less if FreeSync actually works as well, or works at all - because no matter how limited (40-60 fps only, for instance) or crappy it is, the AMD fanboys will scream deliriously, forever, that it is perfect and just as good and better than nVidia's "greed driven proprietary locked in payware"!
Since even that level of astounding partisanship is not enough for the drooling AMD fans, they will go one step further, and AMD knows it - if it doesn't work at all they will all scream in unison "I don't want it and no one can tell at those framerates anyway!"
As if that still weren't enough, it could go on for literally YEARS not working correctly while anyone who objected was told they were full of it, as happened with Crossfire: massive frametime stuttering, dropped and runt frames, and false framerates on all those game tests at all the official websites, including here, that amounted to totally unfair, unwitting LIES pumping up AMD beyond its true FPS values. Then, when those years passed and it was FINALLY EXPOSED as broken and fraudulent, it could get fixed... and all those AMD fans that hate nVidia with a passion couldn't care less; they will be so HAPPY that AMD finally fixed the broken junk peddled as perfect for YEARS that they will instantly renew their fan fervor for AMD and launch it to its highest emotional peak ever, while simultaneously complaining that nVidia caused all the trouble with its money-grubbing Gsync crap.
So, that's why AMD hasn't a worry in the world.
tuxRoller - Friday, January 9, 2015 - link
I know this won't matter to you, as you've made your position clear, but there's a difference between a specification and an implementation. The problem seems to be that most desktop monitors don't work below 30Hz due to short pixel memory (IGZO can be better, but that's hardly standard).
chizow - Friday, January 9, 2015 - link
@tuxroller, yes I fully understand this, but it is clear Creig is quoting this dated spec info in a misleading attempt to back his point that FreeSync is superior to G-Sync, when in reality, FreeSync's supported frequency bands are, in fact, worse. What is the purpose of this if not to mislead and misinform? It's a disservice to everyone involved and interested in these products to try and claim this as an advantage when in actual implementation, the displays are not capable of these low refresh rates, nor are desktop (or even HDTV) displays capable of driving up to 240Hz.
Would it not be misleading to say "Hey, this product is better because it is capable of time travel*"?
*Time travel not available anytime soon, if ever.
Creig - Friday, January 9, 2015 - link
@chizow: First of all, there is nothing at all wrong with the specs I listed. According to the information that has been released, FreeSync is capable of operating anywhere from 9Hz to 240Hz. It does so in ranges:
9Hz - 60Hz
17Hz - 120Hz
21Hz - 144Hz
36Hz - 240Hz
As I highly doubt that any one panel out there will be capable of operating in the full range of 9Hz - 240Hz, I don't see what the problem is. The monitor manufacturer simply chooses which range will cover the specs of the panel they intend to produce. The fact that no panels out there today can go as low as 9Hz or as high as 240Hz yet is irrelevant. FreeSync will be ready for them if and when they eventually make it to market. Your "issue" is a non-issue.
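Just to run the numbers on the ranges above, here are the frame-time windows each could match 1:1 (spec figures only; as I said, shipping panels will sit inside narrower bands):

#include <cstdio>

int main() {
    const double ranges[][2] = { {9, 60}, {17, 120}, {21, 144}, {36, 240} };
    for (const auto& r : ranges) {
        // A panel at X Hz displays one frame every 1000/X milliseconds.
        std::printf("%3.0f-%3.0f Hz covers frame times from %.1f ms down to %.1f ms\n",
                    r[0], r[1], 1000.0 / r[0], 1000.0 / r[1]);
    }
}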
From your quote:"there MIGHT be existing monitors on the market", "PERHAPS possible via a firmware update". See the words "MIGHT" and "PERHAPS"? He didn't say "WILL". Monitor firmware updates are beyond AMD's abililty to control. It was obvious that the monitor they used was capable of FreeSync with an updated firmware. But it is up to the manufacturer to offer the update, not AMD. You may as well blame the weather on the television forecasters while you're at it. It makes just about as much sense as what you just said.
Only AMD will support "FreeSync" because "FreeSync" is simply AMD's implementation of Adaptive-Sync. As far as other companies such as Intel, they are free to develop their own version of FreeSync because Adaptive-Sync is a VESA industry standard. The VESA board evidently considered the spec to be of great enough benefit to include it in their latest version. So it's not only AMD who found merit in its design. Intel has had eDP for a couple of years now so it's entirely possible that they already have Adaptive-Sync capability built into their shipping products. If they don't already possess it, I can't see why they wouldn't want to include it in the future. It's an industry standard spec, there are no licensing costs and it gives the end user an overall better experience.
I was pulling from memory the GPUs that will be FreeSync capable. I thought I read somewhere that Tahiti will have partial FreeSync support in that they will handle the video playback aspect, but not the 3D rendering. I'll see if I can find that info. And even if it turns out that it's only Hawaii and newer, what of it? There will always be new technology that isn't compatible with old hardware. There has to be a cutoff line somewhere. Are you raging because there are no more AGP slots on motherboards? Are you upset because you can't find a new laptop that comes with a floppy drive? Newer technology isn't always compatible with older technology. Does that mean we should simply stop innovating?
People from both PCPer and Blur Busters have personally been to the AMD booth at CES, and both websites have reported no visual difference between FreeSync and G-Sync. Obviously we'll have to wait for official reviews to get the final word, but I fail to see why you are still trying to claim that FreeSync is inferior to G-sync in nearly every way when people who have actually seen both in operation are saying otherwise.
Really chizow, you might be taken a bit more seriously around here if you would simply tone down your pro-nvidia "RAH RAH RAH" eight or nine levels.
chizow - Friday, January 9, 2015 - link
@Creig: So you can admit that, because the frequency ranges FreeSync monitor mfgs chose to support are inferior to what G-Sync supports, FreeSync is an inferior solution to G-Sync, correct? Because I would hate for someone to get the impression FreeSync is better than G-Sync based on some dated specs you pulled off an AMD whitepaper when in reality, there are no monitors on the market that support anything close to these ranges on either the top or bottom end. Just making sure. :)
So after that quote about FreeSync being free, and after we have seen there were in fact no displays on the market that could just support FreeSync with a firmware update, for free, you can admit what AMD said was misleading, and that their entire naming structure is really just a misnomer. Do all the people AMD misled with their claims deserve an apology, in your opinion? Do you think AMD should have made these claims without first verifying any of it was true? Just wondering. :)
LOL, gotta love the "it is open but they can't use AMD's implementation, but as usual, anyone is free to develop their own" take you are heading towards here with FreeSync. Keep toeing that company line though! I know that is the favored mantra for AMD and their fanboys when they develop something proprietary under the guise of "openness", just so the dim-witted and misinformed can parrot it and find no fault with AMD. Just like Mantle, right? Intel, Nvidia and anyone else are free to develop their own Mantle implementation? :D Wonder who spread that bit of noise/FUD around...
But yeah, you were wrong about Tahiti, so you really should be more careful when referencing major points if you are going to try and counter my point, which in this case was market share. You are now of course trying to downplay the fact that only a tiny fraction of AMD cards even support FreeSync, which are only a minority share of the dGPU market to begin with, but it reinforces my point that FreeSync is the technology that has the huge uphill battle because so few GPUs can even make use of it. It's also quite funny that you are now trying to downplay the importance of hardware install-base. Who is going on about floppy drives and AGP slots? We are talking about relevant DX11 hardware from the last 3 years, most of which can't support AMD's own FreeSync standard. If install-base and legacy support aren't important, what chances would you give FreeSync to succeed if they started the ticker at 1 with their next 14 or 20nm GPU, against the tens of millions of Kepler and newer GPUs that support G-Sync? You don't think the fact many AMD users will have to upgrade both their GPU *AND* their monitor will be a barrier to entry, and an additional cost of adoption for FreeSync? Maybe it's time to reassess the fees attached and total cost of ownership once you factor in a new AMD GPU too?
And reported no visual difference? Wrong, they reported no visual difference until the demos went out of the supported frequency band, at which point everything fell apart. This cannot happen with G-Sync, by design. Also, visual difference in the form of tearing and stutter was only part of the equation and problem with Vsync that was solved by G-Sync; the other half of the equation was input lag/latency, which we have no insight on because the demos weren't truly interactive. But again, the impressions and various screenshots indicate AMD's solution is still tied to V-Sync, so there is a strong possibility they were not able to resolve this input lag, as G-Sync does.
And tone down the RAH RAH tone? Haha, that's funny from the guy who is now forced to re-scream all the nonsense of the past 12 months from the rooftops, but I fully understand: you backed yourself into this position long ago when you referenced and gave credence to all the nonsense AMD claimed about FreeSync that ultimately ended up being BS.
Will Robinson - Sunday, January 11, 2015 - link
I'll enjoy watching you eat humble pie over this. Be sure to be man enough to admit your rants were wrong and heckling everyone over their rebuttals was juvenile.
chizow - Sunday, January 11, 2015 - link
@Will Robinson, are you enjoying that humble pie defending all the FUD/misinformation AMD said about Not-So-FreeSync before they actually did the work and productized it? Certainly you are "man enough" to admit much of what AMD said over the past year regarding FreeSync was in fact misleading?
FlushedBubblyJock - Tuesday, February 24, 2015 - link
I hope to soon SUE the lying AMD company to THE HILT OF THEIR EMPTY BANK ACCOUNT - because they have lied to me about Freesync and my Hawaii core AMD GPU! Yes, it has 4GB of ram, but when the screen is shredding and tearing apart - what good is it?!
Intel999 - Friday, January 9, 2015 - link
What about future hardware? Users have the choice to purchase a GPU that supports monitors $150-$200 cheaper than a GPU that requires a more expensive monitor to get similar performance. Only hardcore team green loyalists will choose the latter. AMD will hold onto their loyalists. And those with common sense who go back and forth between green and red will have one more reason to go to AMD, especially in the back half of this year, where it appears team Red will have the best performing cards for at least six months. How Red prices the new GPUs should dictate the success of market share gains. I suspect a $125 premium to Nvidia cards will be the case.
chizow - Sunday, January 11, 2015 - link
@Intel999: Or more likely, the 70% of the market that already owns an Nvidia dGPU from the last 3 years (Kepler launched in Mar 2012) can just buy a new G-Sync monitor, the same set of users that already saw a benefit from Nvidia products independent of the relatively new innovation of G-Sync.
But yes, if both Nvidia/AMD hold onto their "loyalists" or repeat buyers, you can already see AMD is going to run up against a huge uphill battle where they control an extremely minor share of the dGPU market (desktop and mobile) at a ~70/30 clip. What's the point of referencing future GPUs, an unknown commodity, at this point? You don't think Nvidia is going to release another high-end GPU to combat AMD's next offering?
And you can't say for sure these users will have 1 more reason to go AMD, because there is a premium and value for better technology. G-Sync may just be better than FreeSync, and while we don't know this for sure right now, we do know G-Sync does everything Nvidia said it would for over a year, which has held up against the test of time from both reviewers and consumers alike.
We simply can't say the same about FreeSync right now, can we?
medi03 - Sunday, January 11, 2015 - link
70% eh?
http://www.anandtech.com/show/8446/the-state-of-pc...
chizow - Sunday, January 11, 2015 - link
@medi03, actually 70% would be a generous number for AMD GPUs in this discussion, because again, Nvidia supports G-Sync with all Kepler and Maxwell-based GPUs, which goes back to March 2012. AMD GPUs that can support FreeSync are far fewer, with only GPUs based on Hawaii, Tonga, and Bonaire and any APU from Kaveri onwards.
While AMD has stated all new GPUs based on new ASICs will support FreeSync, the most recent market data shows they are getting destroyed in the marketplace, which is no surprise given the reception Maxwell has received:
Source: Jon Peddie Research
http://jonpeddie.com/publications/add-in-board-rep...
"Nvidia continues to hold a dominant market share position at 72%."
http://jonpeddie.com/images/uploads/publications/A...
That was only with 1 month of sales for the new Maxwell cards; market reports expect similar, if not more pronounced, results in favor of Nvidia for Q4, but I wouldn't be surprised to see a slight decline with AMD's price cuts, as 72% is REALLY hard to improve upon.
FlushedBubblyJock - Tuesday, February 24, 2015 - link
Blind and ignorant AMD fan bloviates again... "Not to mention G-sync locks you into nvidia GPUs only while FreeSync is based off an industry standard which any company is free to develop."
The only other company is AMD, so AMD is locking you in with freesync, SO YOU AMD FANS NEED TO STOP THIS BULL.
Thanks for not THINKING AT ALL.
Not.
ppi - Thursday, January 8, 2015 - link
And what do you think G-Sync is doing when the computer draws frames faster than the max refresh rate? It just waits for the next sync, effectively working as FreeSync with V-Sync on. Unlike with nVidia, you can choose here, so a plus point.
The difference is, however, in low frame rates, where it looks like AMD gives a choice between tearing and V-Sync frame drops (it would be ideal to choose V-Sync independently for top/bottom frame rates), while nVidia lets pixels dim down (effectively letting them go below the designed minimum refresh rate for the panel) and thus you get blinking. None of those three options is ideal and I am not certain which one is optimal.
The jury is still out till the independent reviewers get a chance to review both panels side by side. Given the breadth of the FreeSync offering shown at CES and the manufacturers backing it, it is certain to get some real traction. And then nVidia could enable it in a future generation of cards (while keeping G-Sync as well, of course).
chizow - Friday, January 9, 2015 - link
No, G-Sync can't draw frames faster than max refresh, because it uses a driver soft cap, which makes it curious why AMD didn't do this by default. But even at the max refresh of 120 or 144Hz, the GPU is still the one calling the shots, telling the G-Sync module to draw a new frame only when the monitor's refresh is ready. If the monitor isn't ready, the G-Sync module with the onboard DRAM acts as a lookaside buffer that allows the monitor to simply hold and continue to display the last frame until the monitor's refresh is ready, which then displays the next live frame (not an old frame, like with Vsync) that is sent from the GPU to the G-Sync module. The end result is just a perceived reduction in FPS rather than an input-laggy/juddery one, as you would see with Vsync.
There are a lot of questions for certain with AMD, because 40Hz is still a very questionable minimum, especially on a 4K display; I am honestly not sure what kind of AMD set-up you would need to ensure you always meet this minimum (3x290X?). While Nvidia's minimum really only becomes an issue at ~20Hz, realistically, it only really manifests itself on menus in certain games that drop their frame rate because some games tended to overheat (SC2, GTA4 etc) when running uncapped frames.
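Roughly, the hold-and-redisplay behavior works like this (my sketch of the concept only, not Nvidia's actual module logic):

#include <cstdio>

int main() {
    int held_frame = 0; // last completed frame, kept in the module's onboard DRAM
    const bool new_frame_ready[] = {true, false, true, true, false, false, true};
    for (bool ready : new_frame_ready) { // one iteration per panel refresh window
        if (ready) {
            ++held_frame;                // latch the freshly delivered frame
            std::printf("scan out new frame %d\n", held_frame);
        } else {
            // GPU missed this window: repeat the held frame rather than tearing
            // (V-Sync off) or showing a queued stale frame (V-Sync on).
            std::printf("re-scan held frame %d\n", held_frame);
        }
    }
}

The key is that a missed refresh just repeats the held frame; nothing tears and no stale frame gets queued ahead of a live one.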
But yes, independent reviews I am sure will give us more answers, but realistically, there's no reason to expect anything other than vendor lock-in for a tiny niche of specialty monitors that cost more for this premium feature. Any AMD fan who doesn't think they are going to pay a significant price premium (maybe not as much as G-sync, but definitely not free!) is deluding themselves.
Stonedofmoo - Thursday, January 8, 2015 - link
More to the point, when are laptop users going to see FreeSync or G-Sync on laptop screens?
If ever there was a need for these technologies, it's the laptop market, where even gaming laptops only feature 60Hz screens and the most powerful GPU is only 75% the power of the most powerful desktop cards, which greatly increases the chances of dipping below 60fps...
chizow - Thursday, January 8, 2015 - link
Do you think laptop users are willing to pay the premium for something that won't be portable between GPU/hardware upgrades? At least on the desktop side of things, I think many will be more willing to invest in one of these panels if they cost $600+ if they know it will survive at least 1-2 upgrade cycles.
But maybe this is a question for AMD at some point, since they were making some pretty bold claims this time last year at CES about FreeSync. Since they claimed this was always a feature of laptops using the eDP specs, along with the comments they made about being free and needing only a simple firmware update, maybe they can get FreeSync working on these laptops, for "Free"?
Stonedofmoo - Thursday, January 8, 2015 - link
Well yes, have you seen the price of gaming laptops, and also the news that they are selling in far greater numbers than ever before?
The users of gaming laptops are often desktop gamers themselves, and being included in those numbers myself, seeing laptop gaming reach the standard of desktop gaming is exactly what we want to see, and would be willing to pay for. Freesync/Gsync is something I personally see as a must for laptop gaming to be fully embraced.
The early demonstrations of FreeSync were actually done with regular laptops that didn't require additional hardware. From what I understand, the technology is there; presumably no one has yet embraced the market and provided an option. I'm hoping the royalty-free FreeSync spurs the laptop manufacturers on.
Yojimbo - Thursday, January 8, 2015 - link
Does not require additional hardware? Are you sure? It requires no additional hardware above the "DisplayPort Adaptive-Sync standard", but to support this standard the monitors seem to need additional hardware, because otherwise why would AMD be showing off these new FreeSync monitors?
JarredWalton - Thursday, January 8, 2015 - link
Exactly. There's no extra licensing fee for FreeSync, but the hardware required to run it will certainly be more expensive than the hardware required for a standard static refresh rate display. Besides the scaler, you'll need a panel that can actually handle dynamic refresh rates, and in most cases the panel will also need to be able to handle higher than normal refresh rates (e.g. 144Hz instead of only 60Hz). We need to see final pricing on shipping FreeSync and then compare that with G-SYNC equivalents to say how much NVIDIA is actually charging.
And for all the hate on G-SYNC, remember this: FreeSync wouldn't even exist if NVIDIA hadn't made G-SYNC. They first demonstrated G-SYNC in 2013, and hardware has been shipping for most of last year (though it wasn't until the second half of the year that it was readily available). Considering the R&D cost, hardware, and need to get display vendors to buy in to G-SYNC it's pretty amazing how fast it was released.
Will Robinson - Sunday, January 11, 2015 - link
I keep re-reading this thread Jarred, but the only hate I see is for FreeSync by hardcore NV fans. Or is that what you meant to type?
chizow - Sunday, January 11, 2015 - link
Where is there hate for FreeSync? If it does what G-Sync does, that's great! AMD fans may finally get to enjoy the tech they've been downplaying for nearly a year!
I have a disdain for FUD and misinformation, that's it, and as we have seen, AMD has a strong penchant for making unsubstantiated claims not only about their unreleased/undeveloped solutions, but about their competitor's solutions as well.
Do you appreciate being lied to? Just wondering, and trying to understand how some of AMD's most devout and loyal fans don't seem to feel misled or lied to at all. :)
chizow - Thursday, January 8, 2015 - link
Yeah, it's unfortunate; this is probably based on the early misinformation AMD provided about FreeSync, but they've since changed their tune, since FreeSync clearly requires a premium custom scaler in order to work.
chizow - Thursday, January 8, 2015 - link
Interesting, I won't disagree here because I have heard similar sentiments in other gaming notebook discussions, and there is clearly a huge premium attached to slower performing high-end notebook parts. I will caution, however, that many notebooks initially implemented 3D Vision capability and 120+Hz ability, and it did carry a hefty premium. 3D and 3D Vision interest has since subsided unfortunately, so I am not sure if these panels are still including it or not as a feature. Just something to consider.
Creig - Thursday, January 8, 2015 - link
AMD showed FreeSync working on laptops at last year's CES as their initial FreeSync demonstration. Adaptive-Sync would be a great addition to any laptop as it includes the ability to improve graphics quality at lower frame rates. As laptops generally have less powerful graphics subsystems, they would benefit from this technology. In addition, Adaptive-Sync has power-saving abilities as well. All of which would be to the benefit of laptop users.
andrewaggb - Thursday, January 8, 2015 - link
Agreed. I think it makes a lot of sense for laptops and for desktops with 4k screens or cheap graphics cards, as they will likely have lower framerates and benefit the most.
dcoca - Thursday, January 8, 2015 - link
Hey dude, could you provide references to this said "free" upgrade ur talking about... like a link to an AMD media article, or a URL from an official statement. My take and comment was that if the existing laptops already had the needed standard of DisplayPort 1.2a, then it would be up to the manufacturer to implement some type of update if they wish, "wish" being the operative word. Now with that said, funny thing about the universe: it's relative to what the subject wants to interpret... so keep on saying what ur heart wants...
chizow - Sunday, January 11, 2015 - link
I've already linked it in other comments, and it was obviously these comments from Koduri that gave rise to the misinformation many AMD fans clung to for months until AMD reversed course and backed off these statements.
codylee - Thursday, January 8, 2015 - link
It's so nice to see a company put the time into testing products to make sure they are ready for release!
TheJian - Thursday, January 8, 2015 - link
"Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready and delivering on all of AMD's feature goals, and it should be available in the next few months."Umm, so this was shown running tons of games and they all worked fine then? If not, you shouldn't make claims like this. The whole point of this tech is to MAKE GAMES BETTER, not test windmill demos etc. I think we need to say the jury is still out until GAME TESTED and APPROVED. THEN you can say the type of statement such as the one above. Just another reason NV doesn't send you info early ;) No cake a while back like others got, no 965m info today...LOL.
What would have been NEW, is if they showed it WORKING in games ;) Also we'll see how long it takes to get them here, and as noted (for the first time on AT?) pricing won't be FREE ;) Anyone who thought it would be FREE was smoking some really good stuff :) For the first time I see you actually hinting here it may cost the same as gsync...Well duh. R&D isn't free (for scaler/monitor makers, testing it all etc), and it WILL be passed on to consumers (AMD or NV solution).
I don't believe NV will support this in any case. They'll just give away Gsync crap cheaper if forced (they only need a break even here, and it sells their gpus) to entrench it if at all possible. There is also a good chance NOT-SO-FREE sync might not be as good as NV's solution (or they'd be showing games running at CES right?), in which case they can keep charging a premium no matter what the price for NOT-SO-freesync ends up being. Many gamers won't accept 2nd best in this case nor "good enough". IMHO it may be "good enough" for some (why not showing 2 dozen games at CES?), but that will allow NV to stay Gsync anyway claiming the premium solution is only on NV hardware. I can deal with that considering my monitors make it 7yrs+ anyway and that just means I'll buy 2 NV cards or so during the monitors life (I upgrade gpus about every 3yrs).
Agree with Chizow. The market will keep G-Sync alive if it is BETTER, period. Considering the market shares of both (1/3 AMD, 2/3 NV and NV gaining; monitor makers know these numbers too), they have no reason to favor a smaller player with a possibly WORSE solution that still hasn't been shown GAMING. I'm worried at this point, since it's been ages since they showed NOT-SO-freesync, and they have yet to show a bunch of GAME DEMOs running. How bad is this tech if we're supposedly Q1 for monitors and NOBODY will show them running GAMES?
Nobody else finds this odd? AnandTech just acts like it works with no games tested? Nvidia put it in TESTER HANDS to run GAMES (review sites etc.). Smell anything fishy, people? Having said all that, I'll wait until the next Black Friday to decide the victor, assuming my monitors can live that long (in year 8 or so now...LOL).
tuxRoller - Friday, January 9, 2015 - link
Actually, Intel is what matters, since they have more share than anyone else. While adaptive sync is useful for gaming, that's far from its best use. Having a 100% tear-free desktop, perfectly synced videos, and lower power usage are all at least as useful, and will certainly be useful to more people.
JarredWalton - Friday, January 9, 2015 - link
For the record, AMD showed at least two games running with FreeSync. You'll note that there's no "G-SYNC compatible" list of games from NVIDIA for a reason: if the technology works, nothing more is required of the games to enable it! Spouting FUD and posting long anti-AMD diatribes does nothing but create noise. FreeSync was shown running games, and that proves that it can work. I don't know that it's necessarily 100% ready today, but the remaining work is going to be mostly in fine-tuning the drivers over the next couple of months.

If you want to really complain about something, it's that FreeSync requires at least GCN 1.1 to enable the full functionality, so R9 280/280X (7950/7970) and earlier GPUs won't support it AFAICT.
chizow - Friday, January 9, 2015 - link
Interesting take on this, Jarred.

So posting pro-AMD diatribes and throwing out misinformation doesn't create noise? That's an interesting view on things, since we ARE STILL trying to filter out all the noise AMD threw out there for the past 12 months, perpetuated and regurgitated by their pro-AMD fanboys. Shall we recount?
1) Koduri telling everyone at CES last year FreeSync would "effectively be free" and might even be supported on existing monitors with just a firmware update. Only much later do we see them reverse course and say they were only referencing "royalties" that no one, including Nvidia, has ever confirmed exist. Clearly this lie has legs, because there are STILL tech sites perpetuating this myth even to this day!
2) Koduri and AMD saying FreeSync wouldn't need additional hardware, that they could work with existing DP specs that supported eDP. Months later, after actually doing the work, pushing a spec, and working with scaler makers, we see these monitors will in fact need more expensive scalers, and the Free in FreeSync is no longer "essentially free", it's "royalty free".
3) Koduri telling everyone that Nvidia needed expensive hardware because the display controllers in their GPUs couldn't handle Adaptive Sync as well as AMD's own display controllers. Yet months later, we find that even many of AMD's own display controllers weren't, quote, "AMD-awesome" enough to handle this!
4) AMD saying FreeSync would support 9-240Hz bands, spurring fanboys like Creig to repeatedly quote this misinformation, even in light of the fact that FreeSync in actual pre-production samples only supports 40-60Hz and 30-144Hz in the models on display.
5) AMD claiming G-Sync will die because it is proprietary, while FreeSync is open and "royalty free", when in reality, AMD is the only one supporting FreeSync but with a much smaller share of the total addressable market for these products than Nvidia, with Nvidia having shipped and sold their product for close to a year already. Oh, and we STILL don't know how much "Free'er" these monitors will be, do we?
So now let's compare that with G-Sync and the mysteries surrounding its launch:
1) Nvidia announces, demos, and launches G-Sync live for all the world to see and ships it in actual product 2 months later, and it does everything they said it does, from Day 1.
So, Jarred, where is all the noise coming from, again? :) It seems to me, if AMD had said nothing at all about FreeSync until it was actually done, there wouldn't be all the noise surrounding it.
Creig - Friday, January 9, 2015 - link
1) "MIGHT" be supported on existing monitors, not "WILL" be supported. It's up to the monitor manufacturer to release a firmware update to support FreeSync, not AMD. Therefore, AMD did not lie.2) FreeSync was already shown to work on laptops with the required specs without any additional hardware. However, laptop displays are not desktop displays. It is necessary to have a panel that is capable of variable refresh rates. This is why we have had to wait for monitor manufacturers to produce desktop displays with the VRR ability of certain laptop displays. Therefore, AMD did not lie.
3) AMD certainly does have GPUs that are Adaptive-Sync capable while Nvidia does not. Therefore, AMD did not lie.
4) FreeSync reportedly can operate anywhere from 9Hz to 240Hz. Just because current panels cannot go that high or that low does not mean that FreeSync is not capable of operating at those frequencies. Therefore, AMD did not lie.
5) Whether or not G-sync will die remains to be seen. It is a fact, however, that AMD charges no royalties connected with FreeSync while it is reported that Nvidia does collect fees for every G-sync monitor sold.
The noise, chizow, is coming from you distorting the facts to fit your twisted view of FreeSync.
Creig - Friday, January 9, 2015 - link
Just to clarify #2): It appears that the only difference between a FreeSync capable monitor and a non-FreeSync capable monitor is the scaler. Most current scalers are v1.2 and FreeSync requires 1.2a. As old monitors get replaced with new versions, it will be a simple and inexpensive matter for manufacturers to update them with DP1.2a or DP1.3 scalers, which will make them FreeSync compatible and give them the necessary variable refresh rate capability.

I am not claiming infallibility with the points I bring up. It's possible that I may make a mistake and state something that is in error. But I am trying to be as factual as possible.
chizow - Friday, January 9, 2015 - link
See, this is a perfect example of how all the misinformation and FUD AMD has put out there over the last year regarding FreeSync just dies hard. You now have all these half-truths, lies, myths and straight nonsense put out there by AMD, perpetuated by fanboys who feel compelled to continue the myths simply because they backed themselves into untenable positions months ago, and because they simply can't acknowledge AMD lied or spread misinformation regarding FreeSync this whole time.

1) Creig, you are simply arguing semantics here when it is obvious that what AMD said about FreeSync being essentially free, or possibly being supported with just a firmware update, was misinformation, plain and simple. How you as an AMD fan and supporter aren't disappointed by this dishonesty is somewhat unsurprising, but to continue covering for them and thus perpetuating the lie is somewhat shocking. Is there a single monitor on the market that can support FreeSync via a firmware update, essentially free? No, there is not; therefore, AMD lied, whether intentionally or not.
2) No, FreeSync was not shown to work, unless you consider a fixed refresh demo of a windmill a working version of FreeSync. But thanks again for providing another example where AMD was less than honest and forthcoming about what they showed and what they said they demonstrated.
3) Great! So I guess that debunks your claim that Nvidia is the one choosing not to support FreeSync, when in reality it's certainly possible AMD designed a spec that only their hardware could support, knowing Nvidia and Intel GPUs could not. While FreeSync may not be proprietary in name, it certainly is in practice, is it not? Which brings us back to my original point: AMD is currently the only GPU vendor that supports FreeSync, just as Nvidia is the only GPU vendor that supports G-Sync, but of course, that also means Nvidia commands an overwhelming % of the TAM for these displays. The rest is just "noise".
4) No, it just means G-Sync as currently implemented is BETTER than FreeSync, just as I originally stated. You can claim FreeSync can lower your mortgage on paper, but if it doesn't do it in reality, who gives a rat's ass? 9-240Hz on a piece of paper is just a way to deceive the ignorant and non-technical into thinking FreeSync is better because it "supports" a wider range of frequencies, when we see that in reality the supported band is MUCH smaller. Mission accomplished, it seems!
5) Again, reported by whom? AMD? LOL. Again, the noise regarding royalties and fees has come from AMD and AMD only, but as we have seen, they have continually backed off this stance, saying there is now additional BoM cost due to better scalers and better displays capable of handling these refresh rates and LCD decay times. Yet, somehow, the $200 G-Sync premium for Nvidia's BoM, R&D and QA costs per board is unjustified??? And we STILL don't know how much more AMD's solution will cost, so again, why is AMD saying anything at all until they know for sure?
And your clarification is wrong too: there are more differences than just the scalers. The panels themselves are higher quality as well, since they need to support lower decay times to address the minimum refresh rates. 4K, IPS and 120+Hz will also command premium panel prices.
So yes, as usual, the noise originated from AMD and has been echoed and parroted by AMD and their fanboys like you, Creig. If AMD had simply shut their mouths and waited until they introduced actual product this week at CES, you wouldn't feel the need for all this backpedaling and revisionist history to cover all the misinformation they've been spreading over the last 12 months, but thanks for proving my point with your elaborate attempt to cover up all of AMD's missteps.
Creig - Friday, January 9, 2015 - link
1) Are there monitors out there that can be made FreeSync compatible with nothing but a firmware flash? Yes. End of story.

2) From what I understand, the laptop FreeSync demo was to showcase the fact that they could display a variable refresh rate. Full implementation of FreeSync requires dynamic variable refresh rates. The demo simply showed that FreeSync was possible, even if it didn't have all the features yet. Try to keep in mind that it was a demonstration of a beta work-in-progress, just to show that FreeSync was possible.
3) So AMD shouldn't have come out with FreeSync simply because Nvidia cards might not currently have the capability of utilizing it? And we don't know whether or not Intel currently has hardware that is Adaptive-Sync compatible. But since it's an industry standard now, Nvidia or Intel are free to incorporate it into their own hardware. That's more than can be said about G-sync.
4) First you said AMD's support of 9Hz to 240Hz was a lie. Now you've admitted that it isn't a lie. It isn't AMD's fault that current monitors don't go that low or that high. But when they do, FreeSync will be ready to support them. How can you possibly twist that into a BAD thing?
5) However you want to look at it, FreeSync will be cheaper than G-sync. It's an inescapable truth. FreeSync simply needs an updated scaler while G-sync requires the entire scaler to be replaced! And that replacement scaler has to be custom tuned to the panel in question. And don't forget about Nvidia's royalty fees. FreeSync will end up being cheaper than G-sync. No question about it.
There is no backpedaling or revisionist history going on here. In fact, the only thing going on (and on and on and on) is you. I realize that you're upset that AMD appears to have beaten Nvidia at its own game and that the industry is excited about the forthcoming release of FreeSync. But no amount of ranting on your part is going to change that. So please just calm yourself down and try to stick to facts.
chizow - Saturday, January 10, 2015 - link
LMAO, again, it is amazing you're not ashamed to continue perpetuating these lies and myths. And for what? To defend the accumulated lies and misinformation from AMD, or as Jarred would say, the "noise" that has piled up over the last year regarding FreeSync?

1) No, there are not any monitors that can be made FreeSync compatible with just a firmware flash. Will you personally guarantee this level of support out of pocket for anyone misled by this statement? Will AMD stand by this? Will the monitor mfg? No. No one is willing to guarantee this, because there ARE costs associated with "Free"Sync.
2) Well, you understood incorrectly, because AMD *WAS* telling people this was FreeSync with dynamic refresh rates when, in fact, it was not. How can you even sit here and call this a demonstration of FreeSync and a worthy analogue to G-Sync when it was not even close to feature complete, missing the MOST important aspect: actual DYNAMIC/ADAPTIVE refresh rate adjustment? Only someone intent on deception or misinformation would throw this out as a counterpoint to prove AMD had already shown working demos of FreeSync, in order to back AMD's original lie that existing panels on the market could support FreeSync with just firmware updates. So again, now that FreeSync is complete, why can't these older panels support feature-complete FreeSync with just a firmware update? Oh right, because they can't. They lack the necessary hardware, hardware which AMD originally claimed wasn't necessary. AMD subsequently developed more advanced scalers because they found out you couldn't actually just upgrade to FreeSync for free with a firmware update. Conclusion: AMD lied and fed the public misinformation, whether intentionally or not, and their fanboys like you CHOOSE to continue to perpetuate this "noise" rather than just admit AMD was wrong and move on.
3) Who said anything of the sort? AMD is always free to develop whatever they like to improve their products for existing customers and to entice future customers, but what they shouldn't be doing is making MISLEADING statements about what their competitors can or cannot do. It would be just as disingenuous as Nvidia saying AMD can support G-Sync at any time if they want to, they just have to invest a huge amount of R&D to ensure their display controllers work with a custom board and FPGA ASIC. And as for Intel, we do know they have shown no interest whatsoever in supporting FreeSync. We also don't know if they are even capable, again, given that not even all AMD GPUs have the secret display controller sauce to support FreeSync. But neither is "free" to implement, because this may take a considerable level of effort and R&D, and in Intel's case, they may not even care. In Nvidia's case, why bother when they already have a better solution they brought to market before FreeSync? And again, who cares if it's an open standard if only AMD supports it? I guess we can give it the same chances as other failed open standards, like HD-DVD? How'd that go?
4) Where did I admit it wasn't a lie? LMAO. Quoting AMD's bullshit is one thing, but please don't misquote me, thanks. But back on topic: do any of the demonstrated displays support 9Hz on the low end, or 240Hz on the high end, in FreeSync mode? No, they do not. Continuing to quote this lie as a benefit in favor of FreeSync is dishonest, simple as that, but I fully expect AMD and their disingenuous fanboys like you, Creig, to continue to perpetuate this lie for years to come, because even now that we have actual demonstrated FreeSync monitors, none of them support these ranges.
5) No, it's not really an inescapable truth, Creig. If you are one of the >70% of dGPU owners who own an Nvidia Kepler or Maxwell based GPU, you have 2 options:
a) Buy a G-Sync monitor, game happily on a solution that does everything it says it does.
b) Buy an AMD GPU that supports FreeSync and buy a FreeSync monitor.
Are you and AMD willing to back your claim that FreeSync is the cheaper solution out of your own pocket for this subset of users, just to perpetuate a myth and your flawed analysis? And what royalty fees are you talking about, again? The ones AMD associated with G-Sync? LOL. Again, more "noise", as Jarred would say.
Hahah, no revisionist history? That's classic. You've backpedaled on every point once it was shown AMD did NOT demonstrate or make good on what they said, on the original lies, myths and FUD you continue to perpetuate even after we see that FreeSync in its final form has moved beyond most of these claims. That, folks, is "noise".
And AMD has beaten Nvidia at its own game? LOL. Yes, once again getting ahead of ourselves, aren't we, Creig? Because AMD sure has a great track record when it comes to supporting their initiatives; how's that open Mantle SDK coming along, btw? But I am sure in a few months, once FreeSync actually makes it to market, we can throw this up there as another not-quite-as-good, reactionary, half-baked solution from AMD in an attempt to match Nvidia, the industry leader that developed and introduced this tech:
SLI > CF
Shadowplay > some AMD junk in their bloated adware client
DSR > VSR
GPU Boost > Turbo Core
CUDA > OpenCL
3D Vision > HD3D
PhysX > BulletPhysics
Optimus > Enduro
I'm sure there's more, but that's just a small sample of how AMD has "beaten Nvidia at its own game" in the past. Don't worry, there's plenty more room on the list for G-Sync > FreeSync, too! :D
Creig - Saturday, January 10, 2015 - link
If there's a fanboy here, Chizow, it's you. I'm not going to bother with your rantings any longer, as it's obvious that you refuse to acknowledge facts. As more and more FreeSync capable monitors hit the market, everybody will just laugh at you all the harder. So just keep on tilting at those AMD windmills. I'm sure you'll stop them in their tracks single-handedly.

chizow - Sunday, January 11, 2015 - link
Haha, the difference, Creig, is that I actually use their products because they are the best at satisfying my informed demands as a user, and I'm not willing to perjure myself to suit their agenda, as you CLEARLY are.

What is your excuse again for buying inferior tech? To save a few bucks? What's your excuse for defending all of these accumulated lies and misinformation, i.e. "noise"? I'm simply trying to set the record straight here, y'know, filter out all the "noise", because it is clear where that noise originated (AMD), and it is clear there are certain folks whose agenda is to perpetuate that noise in order to confuse the market and distort the reality that it is FreeSync, not G-Sync, that faces the greater obstacles on the market.
But yes, until then, we will just continue enjoying our G-Sync monitors, laughing as FreeSync falls further and further away from AMD's original claims because we have the benefit and luxury of knowing G-Sync does everything it said it would and has been for close to a year!
Will Robinson - Tuesday, January 13, 2015 - link
SLI is better than Crossfire? LOL... get current, dude. The XDMA bus is a far better and more elegant solution, as demonstrated in its total ownership of SLI for both frame rates and frame pacing.
Take off the green goggles before you go blind.
chizow - Tuesday, January 13, 2015 - link
LMAO, ah yes, frame pacing, the problem AMD fanboys like you spent months, years, downplaying and sweeping under the rug as if it didn't exist, right? FCAT was just an Nvidia viral marketing scam until it actually forced AMD to go back and fix their broken CF implementation YEARS later, right?

But it's OK, I do recognize progress and superior tech where appropriate, and XDMA is certainly a better approach than Nvidia's aging SLI bridge.
But unfortunately for you and AMD users, XDMA and hardware is only part of the problem, and they have only fixed part of their CF implementation by sorting out the frame pacing/microstutter problems AMD fanboys spent years downplaying.
The biggest problem for AMD and their CF implementation is that the end user is STILL bound to AMD's driver updates, because they don't expose the compatibility bits in their CF profiles, as Nvidia has done for years. There are some half-baked workarounds that require you to copy profiles from other games, but this has the chance to break other features, like AA, because you don't have the granularity to change individual bit settings like you can with Nvidia profiles via a simple XML change using something like Nvidia Inspector.
But yes, all in all, AMD fans can once again thank Nvidia and their supporters for bringing about positive change for your AMD products. Because we sure as hell know nothing would've gotten fixed on the CF front otherwise! Not surprising when you have a fan base that is content with mediocrity and would rather downplay and sweep a problem under the rug, rather than demand support and a fix for it!
FlushedBubblyJock - Tuesday, February 24, 2015 - link
AMEN! Thank you, a MILLION THANK YOUS!
Will Robinson - Sunday, January 11, 2015 - link
Speaking of FUD... where is your apology for claiming that no games were shown using FreeSync...?
(As confirmed by Jarred)
chizow - Sunday, January 11, 2015 - link
@Will Robinson: Who are you referring to, again? I've made no such claim. They showed non-interactive demos of games at CES, but they did not do so before then, so if you're going to try to catch me in a lie, you're going to have to try a lot harder.
Whereas I can just go through pretty much any of these pro-AMD posts and pull out oodles of bullshit that you will subsequently have to backtrack on.
Will Robinson - Monday, January 12, 2015 - link
Go right ahead.. I'll wait....

chizow - Tuesday, January 13, 2015 - link
You won't need to wait for anything; I've already called out all the nonsensical BS from like-minded AMD fanboys like yourself.

FlushedBubblyJock - Tuesday, February 24, 2015 - link
Thanks Chizow - and TheJian as well, secondarily.

It really is true: it doesn't matter one whit what AMD actually does. The fanboys are so deranged and so locked in that if anything doesn't work, they either totally deny that fact for YEARS, or just say "I don't even care about that crap nVidia has, it's no good anyway" - then they march on over the cliff in absolute bliss, screaming and pointing their finger before the splat sound. After the splat, of course, they are still ranting and screeching in ethereal form.
I just want to say:
1. If just one single holy-souled, earth-loving, honest, not-greedy AMD fanboy in the whole world gets one single firmware upgrade on a single monitor to use FreeSync with his life-savings Hawaii GPU, well then AMD's marketing was 100% honest and reasonable.
See, 1/10,000,000th of what was promised is good enough if you're a good guy. You may have gotten it wrong the other 9,999,999 times, but that single AMD fanboy made happy is worth it; it's OK, no, it's GREAT, even if it's the son of AMD's CEO.
On the other hand, if you are the evil demon the AMD fans war against in the name of all that is good and honorable, as worthy top-level knights in shining armor, then nVidia getting it right on 7 out of 8 memory octants is not good enough, and by golly, that price-destroying MASSIVE release of the R9 290X MUST BE PUNISHED IN THE MOST SEVERE METHODS AND WAYS ACHIEVABLE...
Yes, we won't hear a single word about "AMD LIED, and it will take some time for them to earn back the trust of their loyal customers..." repeated over and over and over, ad nauseam... NOPE - that is a fantasy (meaning fair) world that just does not exist (because lying con artists and AMD fools are common).
I mean, yeah, it's amazing
Death666Angel - Friday, January 9, 2015 - link
Unless we get IPS x-Sync monitors, I don't care. TN panels are just not worth it for me anymore. For the time being, 105Hz IPS will do the trick.

chizow - Friday, January 9, 2015 - link
There are IPS versions of both panel techs coming, although they may be PLS or AHVA in actual implementation and not true "IPS".

Oxford Guy - Saturday, January 10, 2015 - link
What is "true" IPS? S-IPS? H-IPS? e-IPS?There are plenty of variations on IPS. The subpixels aren't even shaped the same way. S-IPS has chevrons. H-IPS has rectangles.
chizow - Tuesday, January 13, 2015 - link
Yes, you're right, there are multiple variations of IPS, but my main point is that the more these technologies evolve in an attempt to match TN for gaming, the further they deviate from the characteristics that made them more attractive to some users (viewing angles, IQ, color reproduction).

Impulses - Saturday, January 10, 2015 - link
I won't care until all the dick waving ends and we have a clear picture of which solution will stand the test of time. Hopefully by then the panel I want at the res/refresh/size I want will be semi-affordable and I can buy two or three to replace my 3x U2412M; the majority of the market will take the same stance...

The BEST tech or standard doesn't always win, because it's often not the market itself that picks the winner; there are countless past examples dating back to Betamax/VHS. Not making a case for FreeSync, mind you; I think it'd be a crying shame if it's actually inferior and gains critical mass...
But I'm sure as heck not gonna sink one or two grand into displays that won't last me a good five years at least, and I'm not rushing out to swap $600 worth of GPUs over it either.
chizow - Tuesday, January 13, 2015 - link
I can understand the hesitation, and I think many will be in the same boat as you, because you want a "true" upgrade across the board for all of your monitors. Unfortunately, I think we can see the trend will be a near-exponential increase in cost for some time. IPS only really became affordable in recent years, most likely because TN gaming panels have become more popular and the emphasis and premium on IPS just hasn't been the same, especially given the decrease in price of IPS HDTVs into that $500 range.

I guess as someone who never really got into the whole multi-display gaming thing (I do have multiple monitors, I just game on a single panel), I would never consider tying myself to 3x or 4x of that resolution and price, since it also increases your GPU demands. Instead, I just go for the biggest, highest-resolution panel with the best gaming characteristics as my main display at the time. I still use 2x U2410 as companion panels, and they're still great for that. ROG Swift for my main display/gaming.
And the converse of what you stated holds as well: the OPEN tech or standard doesn't always win either, because in the end the market will tend to favor the better-implemented and better-supported option, regardless of whether it costs more or not.
But realistically, I do hope Nvidia adopts its own version of Adaptive-Sync and puts what AMD has said to the test (again), because this would give the market one less reason to buy AMD. Nvidia would then support both its own version of Adaptive-Sync (N-Sync?) and the superior, proprietary G-Sync. I guess then we will see just how genuine AMD was when they encouraged Intel and Nvidia to adopt their open standard, once it results in stripping them of any competitive advantage.
But yeah, the commitment to 3x displays will make it 3x harder for you to invest in this kind of upgrade, while for me, it means spending the same on a single fantastic monitor as on 3 monitors that don't really improve on or do anything better than what I already own.
In fact, my brother recently bought 3xU2415 but returned them when he came over and saw my Swift. The U2415s looked really slick on a 3 monitor stand, very little bezel, but they still had all the negatives of IPS when it came to gaming. Maybe these new FreeSync and G-Sync panels will address some of these issues but since these panels tend to come from the same factories and share the same issues/benefits, I am not going to be overly optimistic even if they claim 144Hz support.
Scutter42 - Friday, January 9, 2015 - link
On PCPer they show the Asus MG279Q and claim it works with FreeSync, since according to AMD's Robert Hallock, "AMD will not have a whitelist/blacklist policy for FreeSync displays and that as long as a monitor adheres to the standards of DP 1.2a+ then they will operate in the variable refresh rate window as defined by the display's EDID."

The MG279Q is a 27-in 2560x1440 display with an IPS panel that supports 40-120Hz. It ships Q1 2015 for $599.
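For what it's worth, that "variable refresh rate window" comes straight out of the monitor's EDID, specifically the Display Range Limits descriptor (tag 0xFD) in the 128-byte base block. Here's a minimal sketch in Python of how one might read the vertical refresh range from a raw EDID dump; the /sys path and the sample output are illustrative assumptions, and for simplicity it ignores the EDID 1.4 offset flags that can extend rates above 255Hz:

    def vertical_refresh_range(edid):
        """Return (min_hz, max_hz) from the Display Range Limits descriptor, or None."""
        for offset in (54, 72, 90, 108):           # the four 18-byte descriptor slots
            d = edid[offset:offset + 18]
            # Display descriptors start with a zero pixel clock; tag 0xFD = Range Limits.
            if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
                return d[5], d[6]                  # min/max vertical rate in Hz
        return None

    # e.g. on Linux (connector name is hypothetical):
    with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
        print(vertical_refresh_range(f.read(128)))  # a panel like the MG279Q might report (40, 120)

So if a scaler ships an EDID advertising a 40-120Hz window, that, per Hallock's statement, is the window the driver will honor; no per-model whitelist involved.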
Oxford Guy - Saturday, January 10, 2015 - link
It totally sucks that people are expected to buy new monitors just to get G-Sync and FreeSync.

For people like myself who just spent quite a bit on a good monitor, the budget for replacing a monitor just isn't there.
R3MF - Monday, January 12, 2015 - link
I'm not sure I see a solution to that.

I'd love AMD to wave a magic wand and make my 9-year-old Dell UltraSharp 2405 screen FreeSync compliant; after all, I paid £600 for it back in the day!
Oxford Guy - Monday, January 12, 2015 - link
Cute, but my monitor is a few months old. That, and there are no A-MVA panels that offer either technology, let alone 32" 1440p models.

NZLion - Monday, January 12, 2015 - link
The new LG 34" ultrawide with freesync is only 2560*1080? Well, that's incredibly disappointing.The only reason I haven't already bought their 3840*1440 unit was that I'd heard a freesync version was imminent. So much for that.
Why is there never a monitor from any manufacturer that ticks all the feature boxes at any given time? I am prepared to spend more and compromise less for the right monitor than pretty much any other component in my PC, as it's likely to last through multiple machines. But the last time I felt like I could get everything I wanted at the time in one display was about 2007
Rock1m1 - Monday, January 12, 2015 - link
Even though G-Sync is more expensive than FreeSync, from what I've seen G-Sync works better with regard to completely eliminating tearing. I believe G-Sync will drop in cost considerably over the next few months, especially once FreeSync goes into production; one reason it costs so much is that there is simply no alternative/competing product at the moment.

If FreeSync does take off, I'm sure Nvidia will have no choice but to implement FreeSync.