NVIDIA G-Sync Review

by Anand Lal Shimpi on 12/12/2013 9:00 AM EST

  • tipoo - Thursday, December 12, 2013 - link

    Good to hear it mostly works well, if you can keep the framerate high enough. This plus a high-end computer and an Oculus Rift would be an amazing combination; I hope both take off.
  • smunter6 - Thursday, December 12, 2013 - link

    Want to make any wild guesses as to what John Carmack's working on over in Oculus Rift's secret labs?
  • GiantPandaMan - Thursday, December 12, 2013 - link

    Except the Oculus Rift probably won't have it. They love non-proprietary stuff and G-Sync lands firmly in the proprietary category.

    Make it a standard, make it cost about $10 more to implement rather than $120, and this will take off. I don't see this happening, though. NVidia just doesn't operate in that manner, unfortunately. It would make gaming so much better for the people who really need it--those with sub-par video cards.

    No display maker is going to make a key component (the scaler) beholden to a single manufacturer (nVidia). The technology needs to be licensed so it becomes an industry standard and manufacturers can put it into their displays without having to rely on a single OEM.
  • psuedonymous - Thursday, December 12, 2013 - link

    Carmack himself mentioned at the panel after the G-sync reveal that the first consumer release of the Oculus would NOT contain G-sync, but that is definitely something they want to incorporate.
    My guess is that the reason is the use of LVDS as the sole panel interface. There simply AREN'T any decent 5.6"-6" panels using LVDS. Nobody makes them. The relatively bad (6-bit FRC, crummy colours compared to modern panels, low fill factor, low resolution, too big to be used efficiently) panel was a compromise, in that it was the only one readily available in volume and compatible with the existing LVDS board. Phone/tablet panels in the correct size, resolution and quality range are all MIPI DSI, with the exception of the Retina iPad Mini, which uses an eDP panel like the iPad 3 onwards. Except that panel is still too large, and will be unavailable in volume until Apple decide to reduce their orders in 6 months or so. The current 1080p prototype uses one of the early DVI->MIPI chips (probably on an in-house board) because it's the only way to actually drive the panels available.
  • GiantPandaMan - Thursday, December 12, 2013 - link

    Interesting information. Thanks for posting it.

    As useful as G-Sync would be for something like Oculus (especially for reducing motion sickness), it's still far too expensive to implement. Oculus itself wants to hold the line at $300. There's simply no way to squeeze a $120 module into that price.

    Then there's the fact that Oculus would benefit far more from 120Hz panels than it would from G-Sync. Honestly, I can't imagine Carmack or Oculus ever bad-mouthing a new technology they could benefit from in the future, but the fact remains there are so many other things that would be more cost effective for Oculus to do first: higher resolution, say 1920x2160; higher refresh rates, 120Hz. Personally I hope they think about using some of the projector panels. They're smaller, lighter, and already have both the color depth and refresh rates. The only problem, of course, is they're probably too small and may be too expensive.
  • psuedonymous - Friday, December 13, 2013 - link

    They specifically avoid making a microdisplay-based HMD because of the tradeoffs that every previous microdisplay HMD has had to make. Because the displays are small, you need some hefty optics to view the image, and these must be complicated in order to correct for distortion (since, unlike the large-panel software-corrected approach the Rift uses, distortion with a much smaller display would be so great it could not be effectively corrected). This means the optics are bulky, heavy and expensive. And that goes doubly so if you want a large field of view (compare the Oculus 90° horizontal FoV to the HMD-1/2/3's 45° hFoV, and the HMD series were praised for their unusually large FoV compared to competing models). In fact, the only large-FoV HMD I know of using microdisplays is the Sensics Pisight (http://sensics.com/head-mounted-displays/technolog... a huge 24-display monster that costs well in excess of $20,000.

    And anything other than a tristimulus subpixel microdisplay (a tiny transmissive LCD) will have chromatic fringing when you look around due to sequential colour (http://blogs.valvesoftware.com/abrash/why-virtual-...
  • GiantPandaMan - Friday, December 13, 2013 - link

    Ahh, so I guess my fears about using projector panels are true. Damn. I guess we're going to be stuck with 60Hz on the Oculus for a while. I just don't see phone displays moving up in refresh rates anytime soon.

    I really want the Pisight now, but, unfortunately, I need to do things like eat and have shelter. :P
  • JoannWDean - Saturday, December 14, 2013 - link

    my buddy's aunt earned 14958 dollar past week. she been working on the laptop and got a 510900 dollar home. All she did was get blessed and put into action the information leaked on this site... http://cpl.pw/OKeIJo
  • Black Obsidian - Thursday, December 12, 2013 - link

    G-Sync seems to live in a very small niche. How many people both:
    A) Need better performance
    *and*
    B) Need a new monitor as well
    ?

    Absent those two conditions, aren't people simply better off investing the ~$400 a G-Sync monitor would cost in, you know, a better video card instead? I experience neither tearing nor stuttering, because my absurd triple-slot, factory-overclocked R7970 has no problem pushing any game I play well beyond 60FPS. A special monitor would cost 80% what that card did at launch, so G-Sync seems like a bit of a non-starter to me, unless there's something I'm missing here.
  • IanCutress - Thursday, December 12, 2013 - link

    For the gamer that has it all?

    I'm interested in G-Sync at 4K, where the need for AA is reduced and you're battling against 30-60 FPS numbers. And for users in the mid-range GPU market, a good monitor that will last 5-10 years might be cheaper than a large GPU or system upgrade.

    It's just another piece in the puzzle, one which will hopefully become standard. Think about it - in an ideal world, shouldn't this have been implemented from the start?
  • dagnamit - Thursday, December 12, 2013 - link

    Agreed. You would think that getting the display and the thing that talks to the display speaking the same language would be close to first on the list.
  • DanNeely - Thursday, December 12, 2013 - link

    Same here. I'm not going to rush out and buy a 1080p gsync monitor; but even in a year or two an extra $120 on a 4k monitor isn't going to be a large hit relatively speaking and gaming at <60 FPS will be a lot more common there than at 1080p.
  • Black Obsidian - Thursday, December 12, 2013 - link

    4K is a really good point; I hadn't considered the utility of this sort of thing on much higher-resolution monitors.
  • TheHolyLancer - Thursday, December 12, 2013 - link

    Here is the thing: if you have it all, then an SLI/CF or Titan/290X setup means you will more than likely be able to max out the graphics, and G-Sync and V-Sync become more or less the same.

    The target market is people on a budget playing on mid-range cards that cannot push 60/120/144 fps (or 30 fps, if that is your thing...) consistently at 1080p or 4K. Which means price becomes an issue: if you are buying a mid-range card, you are likely going to reuse your existing monitor, or maybe get a nice cheap one - unless the G-Sync-enabled models (and cards) carry so little premium that you couldn't instead step up to a better card that runs everything nicely at full speed via V-Sync.

    So if they can price it so that a new monitor + new NV GPU costs the same as a new monitor of the same size and speed + new AMD GPU + say 20 dollars, then that is fine. But if they can't, then for a mid-range GPU, dropping 20 or 30 dollars more can mean a lot more performance for the buck - unless you are already at the upper end of midrange, since going from upper midrange to high end is a large jump in cost. And even then, if people want to keep the monitor they have, there will likely be NO way this takes off, because even a cheap 1080p monitor is ~100 dollars, and that money buys a huge jump in GPU quality if you spend it on the card itself rather than on the monitor.

    The killer app would be if G-Sync worked with any bog-standard monitor (or if every future monitor shipped with this controller enabled). Then it would become a nice new feature that is good for many people.
  • Kamus - Friday, December 13, 2013 - link

    "Here is the thing, if you have it all. Then a SLI/CF or Titan / 290X setup will mean that you will more than likely able to max out the graphics, and G-sync and V-Sync becomes more or less the same."

    This is just flat out wrong...

    I play BF4 on a 290X on a 120Hz monitor, and there are very few maps that maintain a consistent framerate. As soon as the framerate dips below 120 I start seeing stuttering - and that's on the smooth maps. There are maps, like "Siege of Shanghai", where the framerate hovers from 80 all the way down to 30-40 FPS... G-Sync would be a HUGE deal in situations like that.

    TL;DR= Gsync is a big deal, even for high end rigs.
  • Da W - Thursday, December 12, 2013 - link

    The gamer that has it all certainly won't invest in a 1080p TN panel.
    Here's the problem right now: it's a bunch of things that will only get implemented later. The solution is in hardware, so will I have to replace my panel next year? And then my panel will be tied to NVIDIA?
    Not just yet. AMD will surely come up with an open solution next year, as usual.
  • SlyNine - Thursday, December 12, 2013 - link

    Lots of gamers will invest in TN panels because that technology is actually better for games, though it does come with compromises.
  • rarson - Sunday, December 15, 2013 - link

    I just bought a 27" QHD IPS monitor for $285. From the games I've played on it so far, I'd say you're nuts if you buy a 1080p TN panel over a monitor like this.
  • tlbig10 - Tuesday, December 17, 2013 - link

    And I'll counter by saying you're nuts for overlooking 120/144Hz TN panels *if* the main use for your machine is gaming. I have the VG248QE, have enabled LightBoost on it, and I would *never* use my wife's 27" QHD IPS for gaming because I would lose the butter smoothness a 120Hz LightBoost monitor gets me. Yes, her display has better color reproduction, but it is a mess in BF4 with all its ghosting and 60Hz choppiness. Until you've seen what LightBoost and 120Hz are like in a first-person shooter, you can't call us "nuts".

    And those of you on 120 or 144hz monitors who aren't using LightBoost, do yourself a favor and check it out. There is a substantial difference between LB 120hz and plain 144hz.
  • ZKriatopherZ - Thursday, December 12, 2013 - link

    I think a lot of this is leftover garbage from the way CRT displays needed to be implemented. Seems like we should be removing hardware here, not adding it. Flat panels, when introduced to that ecosystem, needed to output on a frame-by-frame basis even though the only real limitation seems to be the pixel's color-to-color refresh. Since LCD pixels are more like a switch, wouldn't a video card output and display system that updated on an independent per-pixel basis be more efficient and better suited to modern displays? I understand games have frame buffers you would need to interpret, but that can all be addressed in the video card hardware. If you have a card capable of drawing to the screen in such a way, wouldn't that make this additional hardware unnecessary and eliminate the tearing problem?
  • extide - Thursday, December 12, 2013 - link

    I think you are on to something here - something like a modification of the packet-based DP protocol. Right now, the rate of the packets in DP depends on the resolution and refresh rate. Why not make the packets come whenever they are ready (in 3D) and at a regular rate (in 2D)? Then have a monitor with an LCD panel expecting a signal like that.

    I mean, I think in the end the way nVidia did it here is just a way to make it work right now within the constraints of the existing LCD panels and DP protocols. In the future, I could easily see this sort of tech being built into future protocols, video cards, and monitors, and probably all done without needing an expensive FPGA and additional RAM.
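
    A very rough sketch of that idea (purely hypothetical pseudocode, not how DisplayPort or G-Sync actually work): the source sends a frame packet whenever rendering finishes, subject only to the panel's own limits.

        import time

        MIN_INTERVAL = 1 / 144   # fastest the panel can redraw (assumed figure)

        def present_loop(render_frame, send_packet):
            """Send a frame packet when rendering finishes, instead of on a fixed clock."""
            last_sent = time.monotonic()
            while True:
                frame = render_frame()                 # takes however long it takes
                # Respect the panel's maximum redraw rate...
                wait = MIN_INTERVAL - (time.monotonic() - last_sent)
                if wait > 0:
                    time.sleep(wait)
                send_packet(frame)                     # ...but otherwise refresh on arrival
                last_sent = time.monotonic()
                # A real link would also need a timeout that re-sends the last frame if
                # rendering stalls, so the panel never goes too long without a refresh.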
  • Egg - Friday, December 13, 2013 - link

    I don't see how drawing pixel by pixel solves anything. The issue arises whenever you do not draw a full frame.
  • ZKriatopherZ - Saturday, December 14, 2013 - link

    Yes, and if you aren't drawing frame by frame on the monitor there is no issue :D I'm saying remove the frame-by-frame drawing on the display side completely. This would create an effective "refresh" rate that is only limited by how many pixels the card can push and how fast the pixels can change on the display side. You could also take it a step further and move away from frame-by-frame output on the software side. A fantastic example is the desktop, where 50% of the time the only change on the display is the movement of the mouse cursor. Even an FPS game, where most of the screen changes constantly, would still benefit from the lack of a frame holdup, since what we call refresh rate would no longer exist.

    Truthfully this is all speculative; I don't have enough experience to tell if there are shortcomings I'm overlooking, but nothing blatant stands out at the moment.
  • blitzninja - Saturday, December 21, 2013 - link

    Here's one problem with the video card pushing out data on a per-pixel basis: it would take a LOT more data.

    When a scene is rendered, the instructions essentially target a set of pixels and apply an effect to them (colour, saturation, brightness, etc.), and when all calculations are done the final product is sent to the screen to be displayed (tearing is when not all of the new image has been copied into the monitor's frame buffer by the time it refreshes).

    The problem is that the GPU is blind to any upcoming draw calls, in that it does not know which pixels will be affected until the calculations are done. This means there is no way for the GPU to know when a particular pixel is fully computed or "rendered" and ready to be sent to the monitor's frame buffer.

    A better solution would be to, for 2D applications, check for pixel changes from one frame to another and simply send the change.

    For 3D I see this as impossible, since any small change (camera movement or otherwise) requires a complete re-render due to the nature of 3D and how the calculations are done (the GPU has no idea how to render a scene, it simply follows instructions laid out by the developer, so it can't figure out which pixels it can skip).
  • blitzninja - Saturday, December 21, 2013 - link

    Quick clarification: The increase in data would be the need for the GPU to continuously overwrite pixels in the monitor's frame buffer in 3D mode until all draws are complete.
  • ZKriatopherZ - Saturday, December 28, 2013 - link

    Does this have to do with the developer or with the established rendering API (OpenGL or DirectX)? If the API were designed to output that way, wouldn't that make both development and implementation easier? I get what you are saying about camera movement, but there is a speed limitation caused by rendering frame by frame as well. If you are dealing with something like a TN panel that has a quick color-to-color change, the effective rate on a per-pixel basis becomes closer to 300 updates per second, even if you are drawing full screens.

    I'm also still wrapping my head around what you are saying about 3D applications. Ultimately they are still outputting to a 2D display. I understand there are shaders and other effects that may require a full-screen write, and it sounds like the graphics card, OS, APIs and display are all set up that way. It may take some serious effort to really step back and take a more efficient approach based on current display technology. Ultimately, though, if the changes could be made to allow this to take place, I do feel it would end up being a much more efficient and faster approach. It may just not be as easy as it first seemed to me.
  • otherwise - Thursday, December 12, 2013 - link

    In the future, we're all going to need a new monitor. Depending on the price premium, and assuming these make their way into IPS displays, it might be hard to justify buying a non-G-Sync monitor over a G-Sync monitor. I doubt many are going to run out to buy a new $400 display right now, but this will have a powerful effect on consumer behavior down the line.
  • oranos - Thursday, December 12, 2013 - link

    What's your point? Maybe this site should stop posting all tech articles that don't fit wide mainstream demographics?
  • Black Obsidian - Thursday, December 12, 2013 - link

    My point was obviously that this seemed to be a technology currently useful to virtually nobody.

    Ian pointed out future applications that I hadn't considered, which was exactly the sort of feedback I was hoping for.
  • BeVar - Thursday, December 12, 2013 - link

    @Black Obsidian> No! This is just a marketing gimmick - a costly one at that. Like you, I would just buy a better video board. But, as Art Linkletter said, "people are funny".
  • nathanddrews - Thursday, December 12, 2013 - link

    "Can't keep a constant XXfps at XXXXp because our GPUs are too slow? Here, buy this thing that makes your display go slower!"

    I'm a bit torn on G-Sync. On the one hand, it removes some glaring issues that have plagued gamers for years. On the other, it's basically a beard. 15 years ago, you could play a game at insane FPS and refresh rates on a CRT. Games were simple, with small textures and almost no particle effects. 10 years ago, LCDs became affordable and suddenly everyone was capped at 60Hz and consoles were locked at 30fps or 60fps. Games were more complex, requiring faster hardware, but the slow LCDs made it less noticeable. Now we're moving on to LCDs that operate at 144Hz and 4K displays capped at 60Hz. G-Sync is a band-aid. The REAL problem is that GPU makers (NVIDIA/AMD) have not kept up with the pace of resolution requirements and game complexity.

    Like most reviews point out, it all comes down to what you're used to. I'm still using a CRT, 1920x1200@96Hz (sometimes lower, sometimes higher). I have all my games set up to maximize FPS for the target resolution and usually don't use vsync. Screen tearing is not as noticeable due to the high frame rate, instant response time, and the nonexistent lag that comes from CRT tech. G-Sync appeals to me because it would allow me to avoid the most glaring pitfalls of LCD tech and my inability to turn up eye candy to the max without buying all the highest-end hardware. But like I said, this is really just a band-aid and I'm not sure I want to reward this laziness.

    G-Sync hasn't earned my dollar yet. I know my next display purchase will be 4K, but I'm not content with 60Hz LCD. DP 1.3 is on the way, bringing with it 4K and 8K support at significantly higher refresh rates along with 3D and all that jazz. Will AMD have a response to G-Sync, or will they be able to license it for Hawaii 2.0? Will someone develop an open spec that requires minimal hardware to implement for broader adoption? Will GPU makers push performance enough to make G-Sync obsolete? My CRT hopefully has a couple years left in her, so I hope I can weather the oncoming storm (not a DW reference).
  • Yojimbo - Friday, December 13, 2013 - link

    Obviously there's a limit to how good of a video card you can get. This pushes the upper bound on the experience by making frame rates down to 35fps acceptable instead of only down to 60fps. As for the cost analysis of buying a faster card for those not in the market for top-tier cards, remember that most users upgrade video cards far more often than monitors. Over the life of a monitor, one must keep buying more expensive video cards at each upgrade to equal the experience a G-Sync-enabled monitor offers with less expensive video cards.
  • hoboville - Thursday, December 12, 2013 - link

    It's kind of a stopgap device for those who don't want to shell out the extra cash for a better/second GPU. But even then, the cost of getting a new monitor would seem to offset the cost of a better/second GPU.

    Anand hit the nail on the head when he pointed out that if you are getting a minimum FPS of 60, then vsync should be fine for you. At 1440p+ resolution, even dual GPU will start to encounter slow downs, so it makes sense to invest in Gsync, because minimum frames will be lower. Also, as your hardware ages in relation to the games you play, having Gsync will be good because you'll get a smooth experience without having to buy a new GPU / CPU. Old hardware will retain its relevance longer.
  • Mr Perfect - Thursday, December 12, 2013 - link

    Don't forget C!

    C) Have, or are willing to go buy, an nvidia GPU to use with the screen.

    It's always a little disappointing when a manufacturer spends a lot of time and money making some cool new feature, only to have it die because it's proprietary. If this were part of DirectX or some other industry standard, maybe it would take off.
  • kwrzesien - Thursday, December 12, 2013 - link

    They might as well make an entirely new connector and cable.
  • Dribble - Friday, December 13, 2013 - link

    Got to lol there - DirectX is proprietary. And from the manufacturer's point of view (in this case nVidia), if they have spent all that time and money, how do they get it back if they just give the tech away for free?
  • Sadrak85 - Thursday, December 12, 2013 - link

    As a person who outgrew 1080p a while back and now has a 3x1 setup, I'm certainly hungry for more features in my monitors, if they're going to continue to cost the same (and they appear to have no intention of having a race to the bottom).
  • Yojimbo - Friday, December 13, 2013 - link

    It may be niche for the next couple years, but it seems like a technology which is destined to eventually become ubiquitous. It's common sense and has real world results. As the industry matures, the holes in the experience will be filled in.
  • Samus - Saturday, December 14, 2013 - link

    For what probably costs $20 in hardware, they can charge a $50-$100 price premium on a high end display for G-Sync. It's definitely niche, but so are video cards costing over $200 and they sell quite well.
  • ArmedandDangerous - Sunday, December 15, 2013 - link

    Well, an FPGA isn't cheap, RAM on it isn't cheap, and it definitely doesn't cost $20 in materials.
  • mwildtech - Thursday, December 12, 2013 - link

    What is the likelihood of this G-Sync module being retrofitted into the current IPS and PLS Korean displays?
  • Stuka87 - Thursday, December 12, 2013 - link

    As I recall, ASUS has an exclusivity agreement with nVidia for the first year. But I cannot find the article that mentioned this now.
  • Teizo - Thursday, December 12, 2013 - link

    That article was completely wrong as well. It was the Asus monitor used in this article that was going to receive the G-Sync module first - not that Nvidia had an exclusivity agreement with Asus. No need to spread false information.
  • Jinru - Thursday, December 12, 2013 - link

    The article states you need a Displayport connector. Which rules out the Qnix/X-Star overclockable monitors. Which is a shame... since I have a Qnix and was wondering the same thing, lol.
  • extide - Thursday, December 12, 2013 - link

    Actually, it doesn't necessarily rule them out. The module drives the panel with LVDS, so the panel needs LVDS. As long as the panel in those Korean monitors is LVDS, then theoretically it should work, even if the monitor did not originally have DP. Your video card does need DisplayPort, but all of the cards that actually support G-Sync have DP anyway, so that's not an issue.
  • B3an - Thursday, December 12, 2013 - link

    Few questions...

    Are there any 2560x1600 120Hz monitors around? Can DP 1.2 handle it?

    Is DP 1.3 even close to being released? Any news on it?
  • Ryan Smith - Thursday, December 12, 2013 - link

    I haven't seen any such monitors, but DP 1.2 should have enough bandwidth to handle 2560x1600@120Hz. The bandwidth requirements are virtually identical to 4k@60Hz. Though I'm not sure whether that configuration is fully defined in the standard or not.
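
    A rough back-of-the-envelope check of that comparison (raw pixel rates only; this ignores blanking intervals and link encoding overhead):

        # Illustrative arithmetic only -- not a full DisplayPort timing calculation.
        def pixel_rate(width, height, refresh_hz):
            return width * height * refresh_hz

        wqxga_120 = pixel_rate(2560, 1600, 120)   # ~491.5 Mpix/s
        uhd_60    = pixel_rate(3840, 2160, 60)    # ~497.7 Mpix/s

        # At 24 bits per pixel both land just under ~12 Gbit/s of payload, which is
        # why the two configurations stress DP 1.2 about equally.
        for name, rate in [("2560x1600@120", wqxga_120), ("3840x2160@60", uhd_60)]:
            print(name, round(rate / 1e6, 1), "Mpix/s", round(rate * 24 / 1e9, 1), "Gbit/s")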
  • bondfc11 - Thursday, December 12, 2013 - link

    Overlord Tempest - best IPS on the planet - and hand-tested, shipped from a kickass US company in California.
  • Mr Perfect - Thursday, December 12, 2013 - link

    It's an... interesting display. Somewhat disappointing that they don't guarantee 120Hz, though - it just says "up to 120Hz". That, and I am not impressed by a $600 DVI-only screen. Considering HDMI carries the same signal as DVI, why wouldn't they include HDMI? I guess you could just get a DVI-to-HDMI adapter at Newegg.
  • extide - Thursday, December 12, 2013 - link

    Because it is Dual-Link DVI, which is not the same as HDMI.
  • JimmiG - Thursday, December 12, 2013 - link

    What about triple buffering? Seems we already have a solution to this problem in software.
  • tipoo - Thursday, December 12, 2013 - link

    Stutter/lag.
  • NicePants42 - Thursday, December 12, 2013 - link

    I came here to post this, and am quite surprised that there was no mention of triple buffering in the article. It seems very disingenuous on the part of both nVidia (assuming there were no slides mentioning triple buffering) and Anandtech to omit this issue.

    In fact, it was Anandtech who did an excellent (IMHO) job of informing me about the advantages of triple buffering back in 2009, in this article: http://www.anandtech.com/show/2794/2

    I'm glad nVidia is bringing more hardware solutions to improve gaming, but not addressing triple buffering here makes me think that nVidia's marketing department wasn't impressed with the comparison.

    What gives, Anand?
  • Zink - Thursday, December 12, 2013 - link

    Triple buffering is a type of v-sync; the first page of this review explains the issue. The buffers hold on to what the GPU rendered for anywhere from 0 ms to 17 ms, so there is no way for the rate of motion of objects in the frames coming from the GPU to actually match up with when the frames are displayed.
  • PEJUman - Thursday, December 12, 2013 - link

    My thoughts exactly: triple buffering is a software solution, while this is a hardware-based solution. If I understand correctly, the input lag for both should be very similar. The distinction comes from the memory requirements; on already-taxed hardware (think 4K), G-Sync would work better.

    Ultimately, unless Nvidia can get G-Sync-compatible LCDs under a ~$30 premium, G-Sync will only make sense for ultra-high-res 4K monitors; if you have middle-of-the-road stuff, your money is better spent on a better GPU, IMHO.
  • Traciatim - Thursday, December 12, 2013 - link

    Triple Buffering causes lag, since you never get to see anything that happens in the game until the third frame is scanned on the screen. That's a pretty huge deal when you are playing games where reflexes affect the outcomes. If you get a chance to react up to 33ms faster than the next player, all else being equal you win.
  • JimmiG - Friday, December 13, 2013 - link

    Of course triple buffering has drawbacks (input lag), just like G-Sync, but it's a perfectly viable solution to the same problem this hardware+software combination tries to solve. I doubt competitive twitch-FPS players are going to jump on G-Sync anyway, as it may cause input lag too - they will keep playing with double buffering + no VSync.

    With triple-buffering, as long as your actual frame rate is over 60 FPS, there should always be a frame ready in one of the back buffers when the screen redraws. I've always found that in the few games that support it, the frame rate is very even and smooth.
  • PEJUman - Friday, December 13, 2013 - link

    So does G-Sync - isn't telling the screen to wait until the next frame is ready = lag? The only difference is that G-Sync waits for the next frame, while buffering grabs the latest finished frame. Both create extra lag.
  • psuedonymous - Friday, December 13, 2013 - link

    Triple-buffering solves the issue when you have performance to spare (i.e. spend most of your time rendering at above the display refresh rate) and a very high tolerance for update delay (lag). When performance constrained, triple-buffering offers little to no benefit over double-buffering (as you're never filling that other buffer before display update), and you still get that frame-by-frame variance between render time and display time when performance varies.
  • oranos - Thursday, December 12, 2013 - link

    just because it's niche doesn't mean it's not worth an article.
  • Black Obsidian - Thursday, December 12, 2013 - link

    I never suggested that it wasn't worth an article, and I am in fact a longtime reader who appreciates Anand's articles on niche subjects. I merely wanted to understand if there is or might one day be a non-niche application for the technology.
  • drewp - Thursday, December 12, 2013 - link

    It's less niche than, say, Mantle/TrueAudio. Speaking of which, where are Mantle and BF4?
  • SlyNine - Thursday, December 12, 2013 - link

    Seems about as niche as mantle if you ask me.
  • bondfc11 - Thursday, December 12, 2013 - link

    With about 60K VG248QE monitors sold, and over 5K selling a month on average, team green is going after the mod-kit circuit. Once other options, i.e. other manufacturers, release their G-Sync monitors, this will become much less "niche" than you think.
  • SlyNine - Thursday, December 12, 2013 - link

    My hope is AMD or Intel will come out with something similar and make it a standard.
  • Jumangi - Thursday, December 12, 2013 - link

    This will be something only hardcore tech enthusiasts look for. Unless Nvidia is willing to let go of the proprietary nature of the tech, it will be stuck as a niche feature like PhysX is.
  • Exodite - Thursday, December 12, 2013 - link

    What about input lag?

    I don't often find time to game these days but when I do I tend to prefer MMOs or similar games where UI lag is completely unacceptable.

    I'm not particularly bothered by tearing, often I don't even notice it, and while V-Sync fixes that it also creates absolutely unworkable amounts of input lag.

    Presumably, by my understanding of the technology, G-Sync would not be as bad as V-Sync but it seems it would still introduce more input lag than the bog-standard "V-Sync off" option.
  • bo3b johnson - Thursday, December 12, 2013 - link

    I'm looking at that screen shot with "Set up stereoscopic 3D" and wondering how well it works with 3D Vision?

    Any chance of testing with 3D Vision? S3D typically has the problems seen here, with half the frame rate, and locked vSync to synchronize with shutter glasses, so G-Sync has the potential to dramatically improve S3D.
  • Hixbot - Friday, December 27, 2013 - link

    Display refresh frequency must remain rock solid in 3D mode in order for shutter glasses to work at the same frequency. Gsync will never work in 3D mode.
  • Bal - Thursday, December 12, 2013 - link

    Could you revisit this with multiple monitors? Knowing that going to three monitors triples the workload, and adding 3D doubles it, I see this having a large impact when trying to push 3 1080p monitors running 3D. You should be able to finally crank the settings up.
  • colonelclaw - Thursday, December 12, 2013 - link

    Do current games run fine or do they need to be patched to be 'G-Sync aware'?
  • Traciatim - Thursday, December 12, 2013 - link

    Most games shouldn't even notice.
  • Hixbot - Friday, December 27, 2013 - link

    True, but if you have G-Sync, you are best off turning off triple buffering in your game (turn off double buffering too). Some games do not offer the user the ability to disable triple buffering.
  • zsero - Thursday, December 12, 2013 - link

    Very nice, detailed article! One thing I missed is how it compares with nVidia's sync technology found on Quadro cards, which can sync to an external SDI source.

    Can you explain a bit about it, and how it compares to the new G-Sync? It has been around for a couple of years now, but only on high-end Quadro cards.
  • Kevin G - Thursday, December 12, 2013 - link

    I think nVidia/AMD need to do a round of cross licensing. AMD would get G-Sync + PhysX while nVidia would get Mantle and an AMD x86 CPU core to build an nVidia branded SoC. An enthusiast can dream can't they?
  • extide - Thursday, December 12, 2013 - link

    That would be nice but Intel would never let the nVidia x86 part happen, heh.
  • Kevin G - Friday, December 13, 2013 - link

    Well, AMD has opened up to using other IP in a custom SoC. So actually it wouldn't be an nVidia x86 core; rather, AMD would also be responsible for the overall layout of the SoC. Of course, individual components in the SoC (mainly graphics) would be nVidia IP.

    Of course this would never happen given that AMD and nVidia are absolutely fierce competitors with each other along with Intel (a three way stand off of hate). Like I said though, one can dream.
  • Montago - Thursday, December 12, 2013 - link

    Any chance you might compare this to LucidLogix VirtuMVP Virtual Vsync?

    I've been using VirtuMVP for some time, and although it might not be as perfect as G-Sync, it does offer a similar feature of syncing the display while letting the GPU run as if V-Sync were off = better gaming response at a 60Hz refresh rate...
  • Arbie - Thursday, December 12, 2013 - link

    Were those wood screws holding the panel on?
  • bondfc11 - Thursday, December 12, 2013 - link

    Right?!? I think it's weird that the component parts are held to the panel with TAPE! High-end approach there.
  • kepstin - Thursday, December 12, 2013 - link

    It would be interesting to see if the same approach can be used in a media player to perfectly sync screen refresh to video frame rate. You'd get perfect frame timing on a media center box, without worrying about display timing, and it could adapt to different frame rates as needed.
  • Ryan Smith - Thursday, December 12, 2013 - link

    So long as something used full screen exclusive mode (remember, windowed mode doesn't work), that should work. However keep in mind that G-sync has a minimum refresh rate of 30Hz, so you'd have to double up sub-30fps framerates to stay above the minimum.
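
    A quick illustration of that doubling rule (assuming the 30Hz floor described in the review and common film/video rates):

        MIN_HZ = 30  # G-Sync's minimum refresh rate, per the review

        def display_rate(content_fps):
            """Smallest whole-number repeat of each frame that clears the 30Hz floor."""
            n = 1
            while content_fps * n < MIN_HZ:
                n += 1
            return n, content_fps * n

        for fps in (23.976, 24, 25, 29.97):
            repeats, hz = display_rate(fps)
            print(fps, "fps content -> show each frame", repeats, "times ->", round(hz, 3), "Hz")
        # 24 fps film ends up shown twice per frame at 48Hz, 25 fps at 50Hz, and so on.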
  • Pastuch - Thursday, December 12, 2013 - link

    It blows my mind no input lag testing was done. If the game looks smooth, that's nice. If G-Sync adds even 10ms to current input lag numbers then it's useless to competitive FPS gamers.

    I would rather play games with NO AA, lower smoke settings, etc. to get 100FPS @ 100Hz. My Qnix 2560x1440 runs at a 110Hz refresh rate (after overclocking).

    To enable Gsync I'd have to down grade my monitor dramatically. I'd have to go back to a shitty TN panel that "officially" supports 100hz+.

    Thank you Korea for shipping us A- 2560x1440 PLS & IPS panels with 100hz capable refresh rates for $300! These monitors are the best deal I've found in PC gaming since the 9500pro to 9700pro unlock. The Qnix qx2710 is awesome.

    I hate the idea that 30 to 60 fps "is enough"... It's not. It never will be.
  • Sm0kes - Thursday, December 12, 2013 - link

    I was hoping for comments on the same topic -- FPS games. From what I gather, it doesn't seem to have any real benefit for anyone that can easily hit over 120fps (e.g., source engine).
  • vicbdn - Thursday, December 12, 2013 - link

    I have a Qnix and an ASUS VG248QE. To be fair, while it is a TN with terrible colors, the VG248QE shows much less blur when playing FPS games. Blur Busters did a lot of analysis on that with their LightBoost hack. Even at 110Hz+, the Qnix still looks much worse than the Asus when turning around or aiming quickly in an FPS. For general use, of course, the Qnix has been the best bang for the buck in recent years, but it's still not the same. It's not only about officially supporting 100Hz+; there is some value in TN panels for now.
  • Traciatim - Thursday, December 12, 2013 - link

    In theory it should actually improve your reaction times, even if just slightly if you are using a 144hz panel. I'll use a 60hz panel with v-sync on and off vs gsync just to show the advantage.

    Say for instance you have a 60 hz panel with v-sync and triple buffering on and you are doing a test where a pixel appears on screen and you have to click the mouse button in response.

    When the machine makes the pixel appear, it is in frame 3 of the buffer. On average you will be halfway through the currently synced frame, so you have to wait 8ms for that to finish, 16ms for the second frame to finish, and then the third frame will draw as fast as your display can flip pixels, and then you click.

    With VSync off you won't have triple buffering, so on average you'd have to wait half a screen refresh to see the pixels flip, and then click.

    With G-Sync, as soon as the frame is done the video card tells the monitor to draw, so all you wait for is the pixels to flip... it will be the same every time - not sometimes 16.2ms and sometimes 1ms before the pixels appear, just pixels every time.

    I haven't read anything that it causes any different input lag than a regular monitor though, but it still should be a net advantage in almost any scenario to user experience and reaction ability.
  • Traciatim - Thursday, December 12, 2013 - link

    I kind of messed up the triple buffering example since it should just skip frame 2 if a third frame is ready before the second frame is even requested... but the point still stands.
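
    A toy version of that timing argument (numbers purely illustrative; assumes a 60Hz panel and ignores render time and pixel response):

        REFRESH_MS = 1000 / 60           # ~16.7 ms per refresh on a 60Hz panel (assumed)

        # Average wait from "frame finished rendering" until it starts to appear on screen:
        vsync_wait   = 0.5 * REFRESH_MS  # must wait for the next fixed refresh boundary
                                         # (anywhere from 0 to 16.7 ms, ~8.3 ms on average)
        gsync_wait   = 0.0               # the display refreshes the moment the frame is ready
        novsync_wait = 0.0               # also immediate, but arrives as a torn partial update

        print("v-sync :", round(vsync_wait, 1), "ms average added wait")
        print("g-sync :", round(gsync_wait, 1), "ms (and no tearing)")
        print("no sync:", round(novsync_wait, 1), "ms (with tearing)")
        # Add another full refresh (or more) on top of the v-sync figure when back buffers
        # queue up, which is the extra lag the triple-buffering example above describes.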
  • bondfc11 - Thursday, December 12, 2013 - link

    The Overlord Tempest is better! PLS panels look horrible when OC compared to IPS.
  • blackoctagon - Sunday, December 15, 2013 - link

    Based on previous reviews, I'm pretty sure Anand doesn't have an oscilloscope, so any input lag testing he did would need to be taken with several pinches of salt.
  • Krysto - Thursday, December 12, 2013 - link

    So why do you need to tell others about it?
  • Wreckage - Thursday, December 12, 2013 - link

    It's a spam link - do NOT click it.
  • skiboysteve - Thursday, December 12, 2013 - link

    I've learned these aren't actually meant to fool commenters but to increase the Google page rank of the linked site, since a bunch of sites will link to it if this spam comment is placed all over the place.
  • jibberegg - Thursday, December 12, 2013 - link

    Pretty sure nofollowed comment spam hasn't been a valid SEO tactic in quite a while.
  • Krysto - Thursday, December 12, 2013 - link

    Another big improvement for games would be 120 Hz displays (paired with 120fps games). Use G-sync at such a framerate and THEN we're talking.
  • extide - Thursday, December 12, 2013 - link

    What, 144fps isn't enough for you, or did you forget to read the article ? ;)
  • jigglywiggly - Thursday, December 12, 2013 - link

    Does it increase the input lag vs. no G-Sync?
    Is the input lag the same as V-Sync?
  • drewp - Thursday, December 12, 2013 - link

    It's available now, but from where? If this was on newegg I'd buy it. The GeForce page says the DIY modules will be sold before the year is up.

    Also, does this mean that the "gold standard" of 60fps is irrelevant now? All I need is to maintain 35+ fps?
  • blanarahul - Thursday, December 12, 2013 - link

    Why can't we have a 2560x1600 IPS monitor that ranges from 24 to 60 Hz?
  • skiboysteve - Thursday, December 12, 2013 - link

    I'm super happy that someone solved this problem, but I feel it was solved the wrong way. Why isn't this technology just a revision to current display standards? Why isn't the vertical blank sync just removed and a new 'frame sync' introduced to the standard? Then every display would have it and every GPU vendor could support it.
  • Raniz - Thursday, December 12, 2013 - link

    Yeah, there've been numerous opportunities to standardise something like this.

    First with DVI, then with HDMI (although that might not count since it's the same as DVI), and then with DisplayPort.

    DVI might have been too early for this and HDMI is compatible with DVI, but DP really could have innovated here.
  • extide - Thursday, December 12, 2013 - link

    I totally agree. This could seemingly be solved with a much simpler solution that included a new display protocol/standard. Hopefully this is just the tip of the iceberg, and a more sensible solution will come in the future.
  • Pastuch - Thursday, December 12, 2013 - link

    Solid points... If someone gave me a LightBoost monitor I wouldn't use it, though, because I love 1440p. I was close to buying a 27" LightBoost monitor, but when I saw my friend's Qnix I changed my mind instantly. The tradeoffs to get LightBoost are too drastic. G-Sync looks to make lower fps feel better; I'd rather have higher FPS.
  • fade2blac - Thursday, December 12, 2013 - link

    Shouldn't Adaptive V-Sync be thrown into the mix as well? I thought this was also supposed to be a way to improve the user experience in situations where framerates drop below your display's refresh rate. G-Sync seems to be a better and more direct solution to the problem, but it requires one to buy new specialized (a.k.a. more expensive) hardware and also (currently) limits connectivity options.

    Ideally, I would much rather that this pushes development of an open standard that leverages DVI/HDMI/DP, which will likely require "smarter" displays, but doesn't discriminate on the GPU and connectivity side. Further fragmenting the market by implementing yet another proprietary solution to an otherwise universal problem will severely limit the adoption. I assume there are patents, etc. that likely prevent anyone from implementing similar solutions without having to license this "novel" idea from nVidia.
  • eanazag - Thursday, December 12, 2013 - link

    I think this may make more sense on gaming laptops. I say this because high res panels on laptops could be better supported with less than a top of the line GPU that costs $300-400.
  • ant6n - Thursday, December 12, 2013 - link

    On the first page, the second diagram illustrates "V-Sync off causes 'tearing'". But in the image, why does the GPU wait until the monitor hits refresh to start drawing the new frame? I thought the point was that the GPU waits to display the new frame until the monitor refreshes; there's no reason to wait for the refresh to start drawing. And for the given example there would be no lag, because the three frames could be drawn in time if the GPU didn't wait.

    Another question: if your card can't push 60Hz, why not just run the game at 40Hz and v-sync to that? If the GPU can push at least 40Hz, there will be no stuttering.
  • Traciatim - Thursday, December 12, 2013 - link

    Because monitors generally have fixed refresh rates, and you can't easily just sync to 40Hz. 40Hz content on a 60Hz panel would mean judder, since some frames would be displayed for different lengths of time, which you pick up as unnatural movement even if your frame rate is pretty high. A better option would be to sync to evenly divisible numbers: on a 144Hz panel you draw every refresh for 144fps, every 2 for 72fps, every 3 for 48fps, etc.

    That or you could just use G-Sync and tell the monitor when the frame is ready so you don't ever have to wait around for the monitor.
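
    For reference, the "evenly divisible" rates being described are simple arithmetic on a fixed 144Hz scan:

        PANEL_HZ = 144

        # Frame rates that map cleanly onto a fixed 144Hz refresh: each frame is held
        # for a whole number of refreshes, so every frame is on screen equally long.
        print([PANEL_HZ / n for n in range(1, 7)])   # [144.0, 72.0, 48.0, 36.0, 28.8, 24.0]

        # Anything in between (say 100 fps) forces some frames to persist for one refresh
        # and others for two, which is the judder being described; G-Sync sidesteps it by
        # moving the refresh to the frame instead of the frame to the refresh.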
  • ant6n - Thursday, December 12, 2013 - link

    Are you saying that panels have a fixed internal refresh rate? I know they have a fixed native resolution, but the refresh rate should be whatever comes in from the pc.
  • SlyNine - Friday, December 13, 2013 - link

    Umm, that's kinda the whole point of G-Sync: monitors run at a fixed Hz. In fact, that's the whole point of V-Sync.
  • ant6n - Friday, December 13, 2013 - link

    Of course they run at a fixed Hz at any given time. But Traciatim seems to claim an LCD panel will always display at 60Hz, even if the GPU drives it at 40Hz, resulting in judder.
  • Floflo81 - Monday, December 16, 2013 - link

    It highly depends on the monitor, and is rarely mentioned in the technical specs.
    My ASUS PA238Q accepts any refresh rate between 40 and 70Hz, but anything other than 60 Hz indeed results in judder (skipped or missing frames...) because the panel always runs at 60 Hz.
    But some other screens behave differently.
  • Cellar Door - Thursday, December 12, 2013 - link

    Nvidia and partners NEED to bring this for under $50 as a premium over a regular monitor or it will be a flop like 3D.
  • bondfc11 - Thursday, December 12, 2013 - link

    NEVER gonna happen - don't hold your breath. The kits will be in the $150-200 range. Costs have to cover R&D, marketing, etc. $50 is an insane expectation on your part.
  • SlyNine - Thursday, December 12, 2013 - link

    "The combination of fast enough hardware to keep the frame rate in the G-Sync sweet spot of 40 - 60 fps and the G-Sync display itself produced a level of smoothness that I hadn’t seen before"

    See, you kind of lose me here. You're basically saying that it's smoother than a solid 60/120/144 fps - unless you have never experienced those.
  • Friendly0Fire - Saturday, December 14, 2013 - link

    Good luck getting that solid 60/120/144 at all times.

    That's the *entire* point of G-sync.
  • praeses - Thursday, December 12, 2013 - link

    power consumption?
  • Hrel - Thursday, December 12, 2013 - link

    You guys need to lay off the IPS fandom. It's stupid and vastly overstated. The advantage is viewing angles; that's it, it stops there. For gamers, IPS is actually a poor fit given the increased response time. Seriously, stop harping on it constantly. The only people who need IPS are professional image creators/editors - NO ONE else outside of mobile.
  • bondfc11 - Thursday, December 12, 2013 - link

    You have never played on an IPS running at 120Hz at 1440p, then. Once you do, you will eat your words. Lag is not an issue with the Tempest, and the color/clarity/brightness plus screen size and high refresh rate kill, absolutely KILL, any other panel on the market. Get one - then comment.
  • SlyNine - Friday, December 13, 2013 - link

    What IPS monitor runs at 120Hz at 1440p? Are there a few to choose from? Can I buy one from Newegg?
  • blackoctagon - Sunday, December 15, 2013 - link

    He's talking about the overclockable IPS (and PLS) 2560x1440 monitors that Anandtech has mysteriously failed to cover in their articles, despite the very significant enthusiast fanbase for these monitors
  • fade2blac - Thursday, December 12, 2013 - link

    Does G-Sync have any potential benefit for the use case of film/video playback, especially media encoded at sub-30Hz framerates? This seems to have been conspicuously absent from the discussions I have seen thus far. My HDTV can already sync with a 24p source to display native film framerates by essentially rendering each frame 5 times @ 120Hz. G-Sync should be able to trivially match video framerates (using frame duplication if needed) and eliminate any need for 3:2 pulldown or similar approximations. This, of course, speaks to a niche (A/V-philes) within a niche market (PC enthusiasts) who would currently be limited to relatively smaller desktop displays unless this sort of tech works its way into projectors and larger TVs.
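
    To make the pulldown comparison concrete (standard cadence arithmetic; nothing assumed beyond a display that can match the source rate):

        # 24 fps film on a fixed 60Hz display needs 3:2 pulldown: frames are held
        # alternately for 3 and 2 refreshes, so on-screen durations alternate.
        pulldown = [n * (1000 / 60) for n in (3, 2, 3, 2)]
        print([round(d, 1) for d in pulldown])   # [50.0, 33.3, 50.0, 33.3] ms

        # A display that matches the source (120Hz showing each frame 5x, or a
        # variable-refresh display pacing itself to the video) holds every frame
        # for the same duration, removing the uneven cadence.
        print(round(1000 / 24, 1))               # 41.7 ms per frame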
  • haukionkannel - Thursday, December 12, 2013 - link

    On TV there is a constant 24 frames per second standard, so the display is always in sync if it supports 24p... The problem with games is that there is variance in the frame rate; with movie and TV material the rate is always the same, so the problem isn't there.
    The problem with movies is that 24 FPS is too little. Even the "super smooth" 48 FPS used in The Hobbit is not so great, and some people think it destroys the "movie feel"...
    I think Peter Jackson should have shot it at 96 FPS so it would be on par with good ole CRT screens! IMHO
  • nirvgorilla - Thursday, December 12, 2013 - link

    "If you’re not overly bothered by tearing or are ok with v-sync stuttering, there’s really nothing G-Sync can offer you."

    Huh? This is an incomplete review. You didn't even run any emulated games, which would actually benefit the most from G-Sync.

    Mortal Kombat in MAME: http://i.imgur.com/Jk6i8Tb.jpg

    Also, you didn't answer any of these important questions:

    1. Test Mortal Kombat in MAME. It runs at 54Hz. We need to see non-60Hz games running smoothly.

    2. Test R-Type in MAME. It runs at 55Hz. We need to see non-60Hz games running smoothly.

    3. Using GeForce Experience 1.8 and ShadowPlay, does it record the H.264 .mp4 file and play back in VLC at the exact refresh rate you were playing at?

    4. Mortal Kombat II in MAME runs at 54.7Hz, and we want to know whether G-Sync understands decimal refresh rates or only integers. Does it run at 54.7Hz or 55Hz?

    5. Is full screen needed for G-Sync, or can it be used in full-screen windowed (borderless) mode?

    6. Is G-Sync Direct3D only? Do OpenGL games like Quake III Arena work?
  • bondfc11 - Thursday, December 12, 2013 - link

    The article stated it must be full screen - windowed defaults to vsuck - I mean vsync
  • jramskov - Thursday, December 12, 2013 - link

    So that's why the rMBP reviews are late - you've been gaming ;)
  • sheh - Thursday, December 12, 2013 - link

    I wonder why it requires at least 30Hz to work. Kinda unfortunate. Supporting as low as 20Hz would still be very useful, and even lower might help every now and then.
  • Traciatim - Thursday, December 12, 2013 - link

    Below 30Hz the pixels on the screen would fade, causing flickering, so the screen is force-refreshed with the latest complete frame if no new frame is ready.
  • sheh - Thursday, December 12, 2013 - link

    Backlight is static. Aren't TFT pixels also static until controlled by voltage?
  • CoolBOBob1 - Thursday, December 12, 2013 - link

    According to another article, it has to do with pixel degradation; a pixel has to be refreshed within 33 ms or so.
  • Hrel - Thursday, December 12, 2013 - link

    Based on this, it seems like we'd all be better served by 240Hz 1080p displays than by G-Sync. They'd be more affordable, they'd let us use more affordable GPUs (instead of attempting to power a 4K display), and on TVs they would allow for that AMAZING feature that apparently no one uses, where you can have split-screen multiplayer with each person getting the full screen.
  • DesktopMan - Friday, December 13, 2013 - link

    Higher static refresh rate requires more interface bandwidth. Abrash talks about 1000hz panels, which would be great, but the current interfaces aren't anywhere near there in bandwidth yet.

    You can have splitscreen multiplayer with two people at 120Hz (with glasses, of course). At 240Hz you could have four players, but then each player would have the same V-Sync issues a regular 60Hz monitor has today.
  • Autisticgramma - Thursday, December 12, 2013 - link

    This sounds like an attempt to recover some $$ from system builders by overcharging for mid-range hardware. It's a separate 'piece' and perceived as a separate sale. If this tech were licensed and not 'made by nVidia' I could see an industry standard here. And since no one else was going to do it, I'll give them credit for market making. However, my inner geek is appalled: why wouldn't I just get more horsepower into the computer and enable 120Hz or 144Hz V-Sync in the driver, if my monitor can handle it? In addition to the polling, I see this being more of a distraction from a 'build your own' standpoint.

    I also have to ask: why now? None of the consoles have NVidia chips, so none of the next-gen TVs will use it. (Probably.)

    I can only guess that this is a first step in delivering data-center-rendered graphics to whatever device at whatever refresh rate and resolution.

    Considering that since Kepler the upgrades have only been incremental for gamers (not so sure about data center cards), just keeping up the AMD tit-for-tat, is G-Sync someone's attempt to convince executives that there is still a market in PC gaming? A get-it-out-the-door-before-it's-over product?

    Nothing here makes me want to buy this instead of a water block, pump, reservoir, etc. for a similar price. So we're left with non-builder PC gamers. Do people like that even exist?
  • extide - Thursday, December 12, 2013 - link

    Even with high end hardware you can have stuttering. This still has use even in those cases.
  • tackle70 - Thursday, December 12, 2013 - link

    Can't wait to get a 4k gsync monitor - hopefully in 2014 sometime :)
  • r13j13r13 - Thursday, December 12, 2013 - link

    This technology, or an evolution of it, will soon be available for AMD, Intel, or Nvidia, integrated into the display and supported through the drivers.
  • ibex333 - Thursday, December 12, 2013 - link

    Great. Another way to get morons to spend extra money. Another reason to trick people into buying a new monitor or a video card.

    First of all, tearing and stuttering are not that big an issue. Certainly not big enough to get a reasonable person to shell out $100+ for a new monitor. It barely ever happens, and when it does happen, I barely notice it, because I am used to it from the old days of gaming. Honestly, with my current setup (670 GTX + 120Hz Acer monitor) I don't have any tearing or stuttering. I just don't know what these people are talking about! But hey, if they just keep telling us that we NEED a new monitor to have "smoother" gameplay, we just might believe it if they keep forcing it on us. Great way to create hordes of mindless consumers who buy a new product they don't really need.
  • ibex333 - Thursday, December 12, 2013 - link

    "G-Sync also lowered my minimum frame rate requirement to not be distracted by stuttering. Dropping below 30 fps is still bothersome, but in all of the games I tested as long as I could keep frame rates north of 35 fps the overall experience was great. "

    Anand, what are you talking about?! Are you really that spoiled? Why is it that when I play at 30 fps the experience is butter smooth for me, and you need to be north of 35fps? Seriously...
  • DesktopMan - Friday, December 13, 2013 - link

    People are different. Just because 30fps is fine for you doesn't mean it's fine for everybody else.
  • tackle70 - Friday, December 13, 2013 - link

    lolololol "30fps = butter smooth"

    Troll/console peasant confirmed
  • blackoctagon - Sunday, December 15, 2013 - link

    After 3 days on a 120Hz monitor I could not go back to 60Hz and in fact started demanding a 90fps minimum in-game in order for it to feel 'smooth.' 30fps feels butter smooth to you because you have only experienced 1 single, disgusting, Soviet Republic brand of butter
  • ArmedandDangerous - Sunday, December 15, 2013 - link

    Your definition of 30fps on a 60hz panel as butter smooth just proves that it doesn't affect YOU. Hell, at 50fps it already affects me.
  • Jelic - Thursday, December 12, 2013 - link

    Hmm, interesting article. I actually have the ASUS VG248QE. While G-Sync sounds intriguing, what I find even more promising is the use of LightBoost to give CRT-like quality to the panel. With my current setup I have a GTX 680 with the max framerate limited via EVGA's OC tool to 120fps. On a 1080p screen, with a 120Hz refresh rate and 2D LightBoost enabled, you get absolutely no motion blur, very little tearing, and overall an amazing gaming experience. Since you already have the hardware, I'd be interested in hearing your opinion on 2D LightBoost + G-Sync (at 120Hz), and whether that makes any difference. Also, I'd love it if Anandtech did an article on LightBoost monitors as well! My ideal monitor would be something like a 27in 2560x1600 IPS panel with 120Hz LightBoost support... of course I'd need something like dual 780s to get the most out of it, but it'd be well worth it to me, heh.
  • DesktopMan - Friday, December 13, 2013 - link

    Lightboost doesn't work well on low framerates since you'd see the backlight flicker. If you flicker it more than once per frame you introduce retina blur again. It works best at high, stable framerates. G-Sync would still be useful with lightboost if your framerate hovers between 60 and 120 though.
  • mdrejhon - Friday, December 13, 2013 - link

    Just so all readers know, the great news is there are several different strobe backlights now:

    - LightBoost
    - Official sequel to LightBoost (coming with G-SYNC monitors), mentioned by John Carmack
    - EIZO's FG2421
    - BENQ Blur Reduction (behaves like LightBoost, but via monitor menus)
    - Sony's Motionflow "Impulse" (GAME MODE strobe backlight, low lag, no interpolation)

    Some of them darken a lot, and others darken less. Some have better contrast ratios, and much better colors. Some of them (BENQ Z-series) can strobe at 75Hz and 85Hz, if you want zero motion blur with a bit less GPU horsepower. Some of them are zero-ghost (no double-image effect). But you can't "get it all" simultaneously.

    From my experience playing on the EIZO FG2421 (warmed up after 30 mins to reduce VA ghosting on cold panels), it's lovely to have a bright and colorful picture, something that LightBoost has difficulty with. The VA panel ghosts a bit more (until it warms up), but when I sustain 120fps@120Hz (Bioshock Infinite, VSYNC ON on a GeForce Titan), it produces spectacular motion quality, the most CRT-like quality I have ever seen.

    Now, if I fall below 100fps a lot, like Battlefield 4, I prefer G-SYNC because it does an amazing job of eliminating stutters during fluctuating framerates.
  • blackoctagon - Sunday, December 15, 2013 - link

    And does G-Sync offer any benefit if you're ALREADY at 120fps@120Hz? Because, if so, surely someone needs to review the VG248QE with both G-Sync and LightBoost enabled at the same time :)
  • web-cyborg - Thursday, December 12, 2013 - link

    All of those articles focus on the variable-Hz function of G-Sync and not the supposedly "superior to LightBoost" backlight strobing option. The articles say "30 to 40 fps is fine", with 40 being the sweet spot. I would disagree. These same people complain about marginal milliseconds of input lag, yet accept long "freeze-frame" milliseconds with open arms in order to get more eye candy. I think people will be cranking up their graphics settings and getting 30-40fps. At 30fps you are frozen on the same frame of world action for 33.3ms, while the 120Hz+120fps user sees 4 game-world update "slices" in that time. At 40fps you are seeing the same frozen slice of game-world action for 25ms, while the 120Hz+120fps user sees 3 updates. This makes you see new action later and gives you fewer opportunities to initiate action (fewer "dots per dotted line length"), and then you add input lag to the already out-of-date game-world state you are acting on. Additionally, higher Hz plus higher frame rates provide aesthetically smoother control and higher motion and animation definition. Of course 120Hz also cuts the continual FoV-movement blur of the entire viewport by 50% (vs the 60Hz baseline of full smearing "outside of the lines" blur), and backlight strobing at high Hz essentially eliminates FoV blur (the Eizo FG2421 now, the "superior to LightBoost" strobing mode of G-Sync monitors in the future, supposedly).
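
    To put rough numbers on that, here is a minimal sketch of the frame-hold arithmetic (it simply assumes the 120Hz player also sustains 120fps; the example rates are arbitrary):

        #include <stdio.h>

        int main(void) {
            const double fast_hz = 120.0;                    /* reference: 120Hz display fed at 120fps */
            const double rates[] = { 30.0, 40.0, 60.0, 120.0 };

            for (int i = 0; i < 4; i++) {
                double hold_ms = 1000.0 / rates[i];          /* how long each frame stays frozen on screen */
                double slices  = fast_hz / rates[i];         /* updates the 120fps player sees in that window */
                printf("%6.0f fps: frame held %5.1f ms, 120fps player sees %.0f update(s)\n",
                       rates[i], hold_ms, slices);
            }
            return 0;
        }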
  • web-cyborg - Thursday, December 12, 2013 - link

    60Hz vs 120Hz vs backlight strobing. Note that newer monitors like the Eizo FG2421, and the future "superior to LightBoost" backlight functionality of G-Sync's strobe mode (unfortunately mutually exclusive with the variable-Hz mode), do not/will not suffer the lowered brightness and muted colors of the LightBoost "hack" shown in these examples, yet they still eliminate the blur shown in them.
    http://www.blurbusters.com/faq/60vs120vslb/
    Now remember that in reality it's not just a single simple cel-shaded cartoon object moving across your screen; rather, your entire 1st/3rd-person viewport of high-detail textures, depth via bump mapping, "geography"/terrain, architecture and creatures is smeared "outside of the lines" or "shadow masks" of everything on screen every time you move your FoV at 60Hz. At 120Hz the smear stays more within the "shadow masks" of on-screen objects but still loses all detail, textures and bump mapping, and there is essentially zero blur when using backlight strobing over 100Hz.
  • web-cyborg - Thursday, December 12, 2013 - link

    I'm more interested in high fps and zero blur obviously, even if I have to turn down the ever-higher, arbitrarily-set-by-devs graphics ceiling "carrot" that people keep chasing (that ceiling could be magnitudes higher if they wanted).
    I still play some "dated" games too.. fps is high.

    You are seeing multiple frames skipped and are behind a 120Hz+120fps user, watching "freeze-frames" for 25ms to 33.3ms at 40fps and 30fps respectively, and every time you move your FoV you are smearing the entire viewport into what can't even be defined as a solid-grid resolution to your eyes/brain. So much for high rez.
    I think people are sacrificing a lot of motion, animation, and control aesthetics, as well as sacrificing seeing action sooner and being given more (and earlier) opportunities to initiate actions, just to reach for higher still-detail eye candy.
    You don't play a screen shot :b
  • mdrejhon - Thursday, December 12, 2013 - link

    Hello fellow guys at AnandTech -- I've created a new TestUFO animation (via software interpolation) that simulates the smooth framerate ramping that G-SYNC can do:

    http://www.testufo.com/stutter#demo=gsync

    It shows off stutter-free frame rate variances as well. I created this unique animation (I think, the only one of its kind on the Internet), for the Blur Busters preview of G-SYNC.
  • HisDivineOrder - Thursday, December 12, 2013 - link

    The "biggest issue with what we have here today" is not that it's nVidia only. That's a big issue, to be sure.

    The biggest issue is that there are a LOT of us who have fantastic displays that we paid high dollar for and will not go down to 16:9 or TN panels. Hell, a lot of us won't even go and spend the same money we just spent on our incredibly expensive and incredibly hard to resell monitors to get this technology that should have 1) been included from the start in the LCD spec and 2) should have a way of being implemented that involves something other than tossing our old monitor in the bin.

    They need to make an adapter box for monitors without built-in scalers that translates what they're doing to DVI-D. Otherwise there are a LOT of people who won't see any use from this technology until they get around to making 4K monitors that include it with IPS at an even semi-reasonable price.

    Really, the biggest problem is they didn't find a way to adapt it for all monitors.
  • web-cyborg - Friday, December 13, 2013 - link

    In regard to the backlight strobing functionality, the Eizo FG2421 is a high-Hz VA panel whose backlight-strobing "zero blur" capability is independent of the GPU camps.

    We are talking about gaming usage. Practically all 1st/3rd-person games use HOR+ / virtual cinematography, which means you see more of a scene in 16:9 mode, even if you have to run 16:9 mode on a 16:10 monitor; 16:10 mode basically cuts the sides off.
    http://www.web-cyb.org/images/lcds/HOR-plus_scenes...

    GPU upgrades can run $500-$1000 now for high end, and somewhere in between (or double) for dual GPUs. 16:10 vs 16:9 is really a bigger deal at 1080 vs 1200 for desktop use. A 16:10 30" is not as much of a real-estate difference over a 2560x1440 27" as the size suggests; the 30" pixels are a lot larger. Here is a graphic I made to show three common resolutions compared at the same ppi, or equivalent perceived ppi at their respective viewing distances.
    http://www.web-cyb.org/images/lcds/4k_vs_27in_vs_3...

    Imo, for the time being you are better off using two different monitors, one for gaming and one for desktop/apps, instead of trying to get both in one monitor and ending up with worse performance and greater trade-offs combined in one (i.e. 60Hz vs 120Hz, lack of backlight strobing or G-Sync, resolutions too high to maintain high fps at high/ultra graphics settings relative to your GPU budget, resolutions too low for quality desktop/app usage, lots of tradeoffs, etc).

    Upgrades to display and GPU technology are the nature of the beast really. Up until now you would have been better off getting a Korean knock-off 2560x1440 IPS, or one of the American-branded versions, for $400 or less, and putting a good 120Hz gaming monitor next to it imo. The Eizo FG2421 24" VA backlight-strobing model is around $500, so for $900+ (and a good GPU of course) you could have the better of both worlds pretty much for the time being. Going forward we know G-Sync will have backlight strobing functionality, but we don't know if any of the higher-resolution monitors due to come out with G-Sync will have the 100Hz+ required to support strobing adequately. If they don't, again we are back to major tradeoffs for gaming vs desktop use (low Hz means low motion/animation definition, far fewer game-action updates shown per second, lower control definition, and the full 60Hz-baseline smear blurring out all detail and textures during continual FoV movement).
  • repoman27 - Friday, December 13, 2013 - link

    The G-Sync module is a daughter card for another custom NVIDIA board which replaces the inputs and scaler (if present) on whatever monitor they decide to build one for. Theoretically, any panel with a suitable LVDS connection for the TCON (i.e. most of them) can be supported. Also, NVIDIA only has to provide the specs for the daughter card socket and/or a reference design for their scaler replacement board in order for any display manufacturer to implement it and create a "G-Sync ready" product.
  • MarkHunt - Friday, December 13, 2013 - link

    What an exaggerated screen shot demonstrating 'tearing' - never seen anything remotely like that whilst playing a video game.
  • ArmedandDangerous - Sunday, December 15, 2013 - link

    I don't see it as exaggerated. I see this very often if I turn V-Sync off. Two 7950s at 1080p running anything with V-Sync off introduce tearing.
  • poohbear - Friday, December 13, 2013 - link

    If they make a 1440p IPS display then count me in, otherwise i just don't see this as an option.
  • web-cyborg - Friday, December 13, 2013 - link

    I'll leave it at this for now because I've been posting too much..
    To review,
    - every time you move your FoV faster than a snail's pace on a sub-100Hz, non-backlight-strobing monitor, you drop to such a low rez that it isn't even definably a solid grid to your eyes and brain. So you get continual bursts/paths of more or less the worst resolution possible, the entire viewport dropping all high-detail geometry and textures (including depth via bump mapping) into a blur. So much for high rez.

    - 1080p shows the same exact scene framing as any other 16:9 resolution in HOR+; HOR+/virtual cinematography is used in essentially every 1st/3rd-person-perspective game and every virtual camera render. All of the on-screen objects and the perspective are the same size on a 27" 1080p and a 27" 1440p, for example (in a 1st/3rd-person game). The difference is the number of pixels in the scene providing greater detail, obviously. That is a big difference, but a much bigger difference for desktop/app real estate, where the usable area and display-object sizes change, especially considering GPU budget limits and fps in games.

    - at low Hz and low fps, you have greatly reduced motion definition and control definition:
    far fewer new action/animation/world-state slices shown, and longer "freeze-frame" periods during which a high-Hz + high-fps player sees up to several newer updates.
    1/2 the motion+control definition and opportunities to initiate actions in response at 60hz-60fps
    1/3 the motion+control definition and opportunities to initiate actions in response at 40.
    1/4 the motion+control definition and opportunities to initiate actions in response at 30.

    -you need at least 100hz to support backlight strobing for essentially zero blur (120hz better).
    - you can upscale 1080p fairly cleanly (4x integer scaling) on higher-rez 3840x2160 monitors if you have to; it's not optimal but it can work
    (so you can game at higher fps/lower rez on demanding games yet still use a high rez monitor for desktop/apps for example)

    -the eizo FG2421 is a high hz 1080p VA panel that uses backlight strobing, it isn't TN.
    - we know that nvidia is still supposed to support backlight strobing function as part of g-sync monitors, just that it won't work with the dynamic hz function (at least not for now). So "the industry" is still addressing backlight strobing for zero blur in both the eizo and the g-sync strobe option (which again, requires higher hz to make the strobing viable).

    -We know there are higher rez and likely ips g-sync monitors due out, but we do not know if they will have the max hz bumped up which is necessary to utilize the backlight strobe function adequately.

    There is more to a game than a screen shot resolution/definition.
    There is continual FoV-movement blur (a resolution so undefined that you could perhaps only equate it to an extremely bad visual acuity number, i.e. "out of focus").
    There is otherwise essentially zero blur using high hz and backlight strobing,
    and there is high or low action updates and motion definition, animation definition, and control definition.
  • mdrejhon - Friday, December 13, 2013 - link

    "-you need at least 100hz to support backlight strobing for essentially zero blur (120hz better)."

    Great reply, just minor clarification.
    Not necessarily, if you can strobe at rates below 100Hz.
    Some strobe backlights (e.g. BENQ Z-Series, such as XL2720Z) can strobe lower, like 75Hz or 85Hz.

    You only need one strobe per refresh, since framerate=refreshrate on strobed displays leads to zero motion blur. (Also why CRT 60fps@60Hz has less motion blur than non-strobed LCD 120fps@120Hz). 100Hz is simply because of less flicker, and because of LightBoost's rate limitation. But nothing prevents zero motion blur at 85Hz, if you have an 85Hz strobe backlight (with low-persistence; aka brief backlight flash times).
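
    The underlying arithmetic: tracked motion blur is roughly eye-tracking speed multiplied by how long each frame stays lit. A minimal sketch, with the strobe lengths chosen only as plausible examples:

        #include <stdio.h>

        int main(void) {
            const double speed_px_s = 1000.0;   /* eye-tracked motion speed, pixels per second */
            /* persistence = how long each frame is lit; sample-and-hold = the full refresh period */
            const double persistence_ms[] = { 16.7, 8.3, 2.0, 1.4 };
            const char  *label[]          = { "60Hz hold", "120Hz hold", "2ms strobe", "1.4ms strobe" };

            for (int i = 0; i < 4; i++) {
                double blur_px = speed_px_s * persistence_ms[i] / 1000.0;
                printf("%-12s -> ~%.1f px of tracking blur\n", label[i], blur_px);
            }
            return 0;
        }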

    Mark Rejhon
    Chief Blur Buster
  • web-cyborg - Saturday, December 14, 2013 - link

    Yes, I didn't mean that it was impossible; I meant that for people like me with "fast eyesight"/visual acuity, 100Hz sounds like a good minimum for not seeing flicker. I know from your posts on other forums that there are even 60Hz Sony TVs with some form of strobing, but that would drive me crazy personally. Thanks for the input though, so everyone reading knows the rest.
  • Deepo - Friday, December 13, 2013 - link

    27" VA panel and I'm in. Need this!
  • beginner99 - Friday, December 13, 2013 - link

    This $120 could also be invested in a better GPU which can easily hit 60 FPS. $120 is just way too pricey for this. I never play FPS games with vsync; especially in BC2 the effect is terrible and I can't hit anyone. The difference is night and day. However, I never notice tearing...
  • ArmedandDangerous - Sunday, December 15, 2013 - link

    You need a GPU that can do a minimum of 60 FPS, not an "average" of 60 FPS, because any time your FPS drops below 60 you WILL experience stutter.
  • just4U - Friday, December 13, 2013 - link

    The only stumbling block I really have is being tied to one video chip maker because of the Monitor I buy. That's a problem for me...Stuff like this has to be a group effort.
  • butdoesitwork - Friday, December 13, 2013 - link

    "The interval remains today in LCD flat panels, although it’s technically unnecessary."

    Technically unnecessary from whose perspective? Anand, I'm sure you meant the monitor's perspective, but this otherwise benign comment on VBLANK is misleading at best and dangerous at worst. The last thing we need is folks going around stirring the pot saying things aren't needed. Some bean counter might actually try to muck things up.

    VBLANK most certainly IS "technically" needed on the other end ---- every device from your Atari VCS to your GDDR5 graphics card!

    VBLANK is the only precious time you have to do anything behind the display's back.
    On the Atari VCS, that was the only CPU time you had to run the game program.
    On most consoles (NES on up), that was the only time you had to copy finished settings for the next frame to the GPU. (And you use HBLANK for per-line effects, too. Amiga or SNES anyone?)

    On most MODERN consoles (GameCube through XBox One), you need to copy the rendered frame from internal memory to some external memory for display. And while you can have as many such external buffers as you need (meaning the copy could happen any time), you can bet some enterprising programmers use only one (to save RAM footprint). In that case VBLANK is the ONLY time you have to perform the copy without tearing.

    On any modern graphics card, VBLANK is the precious time you have to hide nondeterministic duration mode changes which might cause display corruption otherwise. Notably GDDR5 retraining operations. Or getting out of any crazy power saving modes. Of course it's true all GPU display controllers have FIFOs and special priority access to avoid display corruption due to memory bandwidth starvation, but some things you just cannot predict well enough, and proper scheduling is a must.
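
    For the single-external-buffer case above, a minimal sketch of why the blanking interval matters; wait_for_vblank(), render_frame() and the buffer names are hypothetical placeholders, not any real platform API:

        #include <stdint.h>
        #include <string.h>

        #define FRAME_BYTES (1280 * 720 * 4)

        extern void     wait_for_vblank(void);       /* blocks until the blanking interval starts */
        extern void     render_frame(uint8_t *dst);  /* draws the next frame into working memory */

        static uint8_t  work_buffer[FRAME_BYTES];    /* internal render target */
        extern uint8_t *scanout_buffer;              /* the single buffer the display controller reads */

        void frame_loop(void) {
            for (;;) {
                render_frame(work_buffer);
                /* The display reads scanout_buffer continuously during the visible part of each
                 * refresh; copying mid-scan would tear.  The blanking interval is the only window
                 * where the copy is invisible to the viewer. */
                wait_for_vblank();
                memcpy(scanout_buffer, work_buffer, FRAME_BYTES);
            }
        }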
  • DesktopMan - Friday, December 13, 2013 - link

    G-Sync isn't VBLANK, so, yeah, if you have G-Sync you don't need VBLANK. You can take your time rendering, not worrying about what the monitor is doing, and push your updated frame once it is ready. This moves the timing responsibility from the monitor to the GPU, which obviously is a lot more flexible.

    If you need time to do GPU configuration or other low-level stuff as you mention, then just do it and push the next frame when it's done. None of it will result in display corruption, because you are not writing to the display. You really can rethink the whole setup from the bottom up with this. Comparing it to systems that don't work this way is kind of meaningless.
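
    As a rough illustration of that shift in timing responsibility, a sketch of the two loops; present_now() and wait_for_next_refresh() are placeholder names, not any real API:

        extern void render(void);
        extern void present_now(void);            /* hands the finished frame to the display */
        extern void wait_for_next_refresh(void);  /* blocks until the fixed refresh boundary */

        void fixed_refresh_loop(void) {           /* classic VSYNC: the display dictates timing */
            for (;;) {
                render();
                wait_for_next_refresh();          /* miss the boundary by 1ms? wait a whole extra interval */
                present_now();
            }
        }

        void variable_refresh_loop(void) {        /* G-Sync-style: the GPU dictates timing */
            for (;;) {
                render();
                present_now();                    /* the display refreshes when the frame arrives */
            }
        }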
  • mdrejhon - Friday, December 13, 2013 - link

    Although true -- especially for DisplayPort packet data and LCD panels -- this is not the only way to interpret GSYNC.

    Scientifically, GSYNC can be interpreted as a variable-length VBLANK.

    Remember the old analog TVs -- the rolling picture when VHOLD was bad -- that black bar is the VBLANK (vertical blanking) interval, which contains VSYNC. With variable refresh rates, that black bar becomes variable-height, padding the time between refreshes. This is one way to conceptually understand G-SYNC if you're an old-timer television engineer. You could theoretically do G-SYNC over an analog cable this way, via a variable-length blanking interval.
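
    In those terms the arithmetic is simple: the scan-out time stays fixed and the blanking interval stretches to fill whatever remains of the frame time. A small sketch with made-up frame times:

        #include <stdio.h>

        int main(void) {
            const double scanout_ms = 1000.0 / 144.0;             /* fixed scan-out time, ~6.9 ms */
            const double frame_ms[] = { 8.0, 12.5, 20.0, 33.3 };  /* example render times */

            for (int i = 0; i < 4; i++) {
                double blank_ms = frame_ms[i] - scanout_ms;       /* the "black bar" padding between scans */
                printf("frame ready after %5.1f ms -> blanking stretched to %5.1f ms (%.1f Hz effective)\n",
                       frame_ms[i], blank_ms, 1000.0 / frame_ms[i]);
            }
            return 0;
        }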
  • mdrejhon - Friday, December 13, 2013 - link

    Yeah, put this into perspective:
    "Refresh rates" is an artificial invention
    "Frame rate" is an artifical invention

    We had to invent them about century ago, when the first movies came out (19th century), and then the first televisions (Early 20th century). There was no other way to display recorded motion to framerateless human eyes, so we had to come up with the invention of a discrete series of images, which necessitates an interval between them. Continuous, real-life motion has no "interval", no "refresh rate", no "frame rate".
  • darkfalz - Friday, December 13, 2013 - link

    Will this polling performance hit be resolvable by future driver versions, or only by hardware changes?
  • Strulf - Friday, December 13, 2013 - link

    While you can still see the effects at 120Hz, tearing or lag is barely visible at all at 144Hz. At that point, I can easily do without G-Sync.
    G-Sync is certainly a nice technology if you use a 60Hz monitor and an Nvidia card. But nearly the same effect can be achieved with lots of Hz. A 27" 1440p@144Hz monitor might be quite expensive though. ;)
  • emn13 - Friday, December 13, 2013 - link

    The article states that at any framerate below 30fps G-Sync doesn't work, since the panel forces a refresh at least every 1/30 of a second. That conclusion doesn't quite follow; unlike a "true" 30Hz refresh rate, after every forced refresh G-Sync can allow a quicker refresh again.

    Since the refresh (scan-out) time is 1/144s, roughly 7ms on this panel, and a 30Hz refresh interval is ~33ms, a frame whose rendering takes longer than 33ms but shorter than 40ms will finish during the forced refresh and have to wait for it to complete. Translated, that means you only get stuttering when the instantaneous frame rate is between 25 and 30 fps. In practice I'd expect frame rates to rarely be that consistent; you'll get some >30fps and some <25fps moments, so even in a bad case I'd expect the stuttering to be reduced somewhat, at least were it not for the additional overhead due to polling.
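
    Working that window out explicitly, a small sketch using the numbers above (144Hz scan-out, 30Hz forced refresh):

        #include <stdio.h>

        int main(void) {
            const double scanout_ms     = 1000.0 / 144.0;   /* ~6.9 ms to scan out one refresh */
            const double forced_refresh = 1000.0 / 30.0;    /* ~33.3 ms: the panel self-refreshes here */

            /* A frame that takes longer than ~33.3 ms collides with the forced refresh.  If it also
             * finishes before that refresh completes (~33.3 + 6.9 = ~40 ms), it has to wait for the
             * scan-out to end -- that wait is the visible stutter. */
            double band_lo_fps = 1000.0 / (forced_refresh + scanout_ms);  /* ~25 fps */
            double band_hi_fps = 1000.0 / forced_refresh;                 /* 30 fps  */

            printf("stutter-prone band: %.1f to %.1f fps\n", band_lo_fps, band_hi_fps);
            printf("maximum added wait: %.1f ms\n", scanout_ms);
            return 0;
        }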
  • mdrejhon - Friday, December 13, 2013 - link

    And since the frame scan-out time is 1/144sec, that's one very tiny stutter (6.9ms) in a forest of high-latency frames (>33ms+)
  • 1Angelreloaded - Friday, December 13, 2013 - link

    Just a thought, but the one thing you really didn't take into account is Nvidia's next GPU series, Maxwell, which supposedly will have the VRAM right next to the die and will share memory with direct access from the CPU. If you take that into account, along with G-Sync, you can see what will happen to framerates if they can pull off a die shrink as well.
    At this point I think monitor technology is so far behind, and profit-milked to the point of stifling the industry, that 1080p has had a far longer cycle than we could have expected, partially due to mainstream HDTVs. We should have had the next jump in resolution as a standard over a year ago in the $200-$300 price range, and standard IPS technology with lower response times in that range as well, or been introduced to other display types. LED backlighting carried a heavy price premium when in reality it is much cheaper to produce, since the hazardous-waste CCFL bulbs, which cost more and carry disposal fees, are taken out of the equation.
  • Ryan Smith - Sunday, December 15, 2013 - link

    That's Volta (2016?), not Maxwell.

    http://images.anandtech.com/doci/6846/GPURoadmap.j...
  • Totally - Friday, December 13, 2013 - link

    I just don't see why they can't integrate this into the video card completely and work out a method with display manufacturers to bypass the scaler. Closest guess I have is Nvidia doesn't want to add that $120, if it's even that much, to their cards.
  • MrSpadge - Saturday, December 14, 2013 - link

    Thanks for that very interesting review, Anand! Glad you're not only doing iWhatever by now ;)

    Some people have brought up the point that one could simply get more raw GPU horsepower and push for high frame rates with VSync on. I think GSync is superior, in fact I'd formulate it the other way around: it could let you get away with a smaller GPU, since 30 - 60 fps is fine with GSync on. Apart from buying the GPUs this also saves on power consumption, cooling requirements, noise etc.

    And this could go really well with mechanisms built into the game engines to ensure a certain minimum frame rate by dynamically skipping or reducing the complexity of less important stuff.

    And decoupling of the AI and interface from the display refresh, of course.
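
    That kind of decoupling is a well-known pattern; here is a minimal sketch of a fixed-timestep loop (the function names are placeholders), where the simulation ticks at a fixed rate while frames are presented whenever the GPU finishes them:

        extern double now_seconds(void);
        extern void   update_simulation(double dt);   /* AI, physics, input: fixed timestep */
        extern void   render_frame(double alpha);     /* draw, interpolated between sim steps */

        void game_loop(void) {
            const double dt = 1.0 / 60.0;             /* simulation runs at a fixed 60 steps/s */
            double accumulator = 0.0;
            double previous = now_seconds();

            for (;;) {
                double current = now_seconds();
                accumulator += current - previous;
                previous = current;

                while (accumulator >= dt) {            /* catch up in fixed increments */
                    update_simulation(dt);
                    accumulator -= dt;
                }
                render_frame(accumulator / dt);        /* present whenever the frame is ready */
            }
        }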
  • godihatework - Saturday, December 14, 2013 - link

    My question is: will this ever make it to laptops? I think the potential benefit is much higher in a mid-range laptop scenario than in a high-end gaming desktop.
  • mutantmagnet - Saturday, December 14, 2013 - link

    http://www.youtube.com/watch?v=KhLYYYvFp9A&t=4...

    When will you guys talk about G-Sync's alternative modes? I was curious about what improvements they made to LightBoost with their Low Persistence Mode. I was a little shocked to see you say G-Sync won't have much benefit for those who can push out a lot of FPS, which is correct, but LPM is supposed to address exactly those people, and it wasn't mentioned.
  • nand - Saturday, December 14, 2013 - link

    When are the DIY modding kits coming out?
  • coolhund - Saturday, December 14, 2013 - link

    Stuttering? Stuttering? What?
    I have never experienced stuttering with Vsync on, except when I turn the pre-rendered frames too high. They should always be on 0 or 1, never more, or you GET stuttering and input lag. Also LCDs have far too much motion blur to notice slight stuttering.
  • Murloc - Sunday, December 15, 2013 - link

    I've never experienced tearing nor stuttering, vsync enabled or not.
    Weird.
  • frag85 - Sunday, December 15, 2013 - link

    Unfortunately, with this being a proprietary hardware/software 'gimmick' I don't see it taking off. A standard needs to be created for this to really take off and be viable for everyone. If it can be adopted by everyone (ATI, Intel, Matrox, VIA, etc.) it will no longer be a gimmick.
  • lilkwarrior - Tuesday, December 17, 2013 - link

    This is hardly a gimmick. This is absolutely a game changer that gives Nvidia all the cards as far as capturing high-end gamers goes.

    Of course it makes sense for ALL high-end gamers (people willing to buy Nvidia's and ATI's flagship GPUs around this time of year) for it to be licensed out eventually, but it also makes sense for Nvidia not to do that for a year or so.

    The benefits are a no-brainer, even though the problem it solves isn't critical unless you're already in the niche part of the audience that values high-quality entertainment.

    It's not too different from Apple's Retina displays early on, if you ignore the closed access to this technology: you have to SEE it to believe it, but the problems it solves are a no-brainer to capture if you're already willing to pay that much for a laptop/desktop.

    If you're deciding between the best tier of Nvidia card or ATI card right now for gaming, this technology, along with PhysX, makes Nvidia the rational choice to side with.

    They undisputedly have the best cards this year, they provide the better proprietary technology that enhances games in a progressively-enhancing way (if you don't have it, you don't notice it; if you do have it, it's very satisfying), and their SLI is more consistent in what it delivers than Crossfire, which has had issues ATI is still solving to bring parity back to that discussion.

    G-Sync being exclusive to Nvidia graphics cards for a year isn't that big of a deal.

    I wouldn't be surprised, nonetheless, if they license it after a year or so to head off anyone researching their own answer to it.

    My only concern is the component that replaces the monitor scaler: Would it pose a problem with non-nvidia cards in the future?
  • spejr - Sunday, December 15, 2013 - link

    This seems to be perfect for mobile gaming: only one integrated unit, and a constant lack of performance. And I don't think the cost of implementation has to be $120; that's such a random figure, taken out of context.
  • spejr - Sunday, December 15, 2013 - link

    And I also don't get the argument "but with lots of power you don't need it, blah blah". It's one more thing that makes the experience even better, enabling better results in even more demanding situations.
  • rms - Monday, December 16, 2013 - link

    I'd love to see a report on how this behaves with a title like Deus Ex:HR-DC, which has an egregious stuttering issue in hub areas. It's not clear this stuttering is a rendering issue, so how would g-sync respond?
  • Farfle - Monday, December 16, 2013 - link

    No mention of mouse lag in the article. Somehow, it goes unnoticed by some folks, while for others ( me) it downright makes the game unplayable, at least competitively.

    With V-Sync on and the system capped at 60fps, mouse lag in some games is just terrible. Forget stuttering and screen tearing; for me this is much worse. That's why I bought a 120Hz LCD. I can either turn VSync off and enjoy zero mouse lag and relatively no tearing in modern games that are FPS-bound, or I can turn VSync on and experience considerably less lag than at 60Hz plus a beautiful artifact-free picture.

    But if G-Sync eliminates mouse lag entirely, as well as any visual issues, that seems like an awesome tech that I wish were implemented in all setups. Although it wouldn't be as useful for someone with a 120Hz+ display as for someone at 60Hz, it'd still be a better overall experience.
  • beck2050 - Wednesday, December 18, 2013 - link

    Awesome tech. I saw a demo of it and it is silky. I want a 780 ti ACX to go with it and should be good for a while.
  • nadroj1485 - Wednesday, December 18, 2013 - link

    I wonder what the prices will be like once this goes mainstream.
  • cedarson - Friday, December 20, 2013 - link

    I am so saving up for a 27+ inch version of this.
  • mlmcasual - Monday, December 23, 2013 - link

    One of the BIGGEST FAILS at introducing a new tech that I have ever witnessed.
    1080p = automatic fail. As the author mentioned, the entire point of the technology would be better suited to a higher res. Investing $400 in a 1080p monitor? No thanks. This is a complete mismatch of technology. If you can't run over 60fps on a 1080p monitor, you don't need to spend $400 on a proprietary monitor that can look good at 30Hz; you would start with a better video card.
  • somatzu - Friday, January 10, 2014 - link

    Hi, everyone. I have a gtx 670 paired with an ASUS VG248qe. I've always thought that I've been quite sensitive to screen tearing, and once I started playing games with vsync off at 120/144hz refresh rates, tearing seemed to have disappeared altogether. Whether it's bioshock or spelunky, the buttery smooth motion flits past my monitor with no tearing. Of course, when I can't match my output FPS to the high refresh rate of my monitor, there's some motion blur (with LB on or off), but no tearing.

    So, it seems like G-Sync is unnecessary for me, because my graphics card doesn't need to throttle itself to match my monitor's high refresh rate. I probably won't see any difference; in fact, as the article stated, I'll see a slight decrease in fps. Anand was using a 760, which seems like a better fit for G-Sync. But would someone with a GTX 670 and a high-refresh-rate monitor (120Hz+) really need this technology?
  • NavasC - Thursday, January 16, 2014 - link

    Somatzu - I just recently completed a DIY on this myself and I have a GTX Titan. There are still some pretty demanding games like Metro: Last Light and BF4 multiplayer that occasionally bring frame dips down to the 40-60 fps range. For those who have higher end PCs, the G-SYNC completely removes the occasional stutter that comes from big frame drops during high-intensity games. And for those games where you tend to hold back on extra features like AntiAliasing, you can now crank them up a little bit and still get really smooth performance. Is it 100% necessary if you've already got a high end rig? Not necessarily, but it's definitely more pleasing to the eye when playing games where frame drops are common (i.e. BF4 multiplayer) and stuttering when you need to twitch is the difference between life and death.

    Also, the GSYNC module comes equipped with a toggle-able improved "Lightboost" mode called ULMB, so no more ToastyX hack is required for Lightboost.
  • raidmaxx - Tuesday, January 14, 2014 - link

    So, are stutter and lag the same thing?
  • Zarzou - Sunday, October 19, 2014 - link

    Would be interesting to see a comparison between adaptive mode and G-Sync @ 60/120/144 Hz.
  • kirisaki - Saturday, December 19, 2015 - link

    Can I enable G-Sync with the BenQ XL2730Z monitor?

    answer please.
    Thanks
