
Nvidia G-SYNC aims to eliminate stutter, tearing and input lag

Monday, 21st October 2013 00:57 GMT By Brenna Hillier

Nvidia has found a way to sync monitor refresh rates with GPU output – not the other way round, as is usual – in order to reduce or eliminate some of the most persistent problems of graphics output.

Called G-SYNC, the tech relies on modules installed in monitors, either pre-installed or modded in by savvy home users.

As detailed in a Geforce blog post, the module makes the monitor refresh at the same rate as the GPU outputs frames, as opposed to syncing the GPU to the monitor’s refresh rate – a relic of the days when displays needed to use a standardised refresh rate.

When the monitor’s refresh rate determines when content is displayed on screen, you face a trade-off: with VSync on, you get input lag and stuttering, and with it off, you get screen tearing. G-SYNC apparently avoids all three problems.
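To make the trade-off concrete, here is a minimal, purely illustrative timing sketch in Python (not Nvidia’s implementation; the 60Hz panel, render times and function names are all invented for the example). It compares when frames reach the screen on a fixed-refresh display with VSync against a display whose refresh follows the GPU, G-SYNC style.

    import math

    # Illustrative sketch only, not Nvidia's implementation. A fixed 60Hz panel is
    # assumed for the VSync case; the per-frame render times below are made up.
    REFRESH_INTERVAL_MS = 1000 / 60          # one refresh tick on a 60Hz panel (~16.7 ms)
    render_times_ms = [14, 18, 15, 22, 16]   # hypothetical GPU render times per frame

    def present_with_vsync(render_times):
        """Double-buffered VSync, simplified: a finished frame waits for the next
        refresh tick, and the GPU starts the next frame only after that flip."""
        now = 0.0
        flips = []
        for t in render_times:
            finish = now + t
            flip = math.ceil(finish / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
            flips.append(flip)
            now = flip                       # GPU blocks until the flip happens
        return flips

    def present_with_adaptive_refresh(render_times):
        """G-SYNC-style, simplified: the panel refreshes the moment a frame is
        ready, so display time tracks GPU output (panel min/max limits ignored)."""
        now = 0.0
        flips = []
        for t in render_times:
            now += t
            flips.append(now)
        return flips

    def frame_gaps(flips):
        """Frame-to-frame gaps in ms; even gaps look smooth, uneven ones stutter."""
        return [round(b - a, 1) for a, b in zip([0.0] + flips[:-1], flips)]

    print("VSync gaps (ms):   ", frame_gaps(present_with_vsync(render_times_ms)))
    print("Adaptive gaps (ms):", frame_gaps(present_with_adaptive_refresh(render_times_ms)))

With these made-up numbers the VSync gaps snap to multiples of the 60Hz tick (alternating roughly 16.7ms and 33.3ms, which reads as stutter and added latency), while the adaptive gaps simply follow the render times. Letting the panel refresh whenever the GPU finishes removes that quantisation, which is essentially Nvidia’s pitch for G-SYNC.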

At an event in Montreal, Nvidia roped in John Carmack, Johan Andersson, Tim Sweeney and Mark Rein to talk about the tech, which they seem to have judged very fine indeed, and the company has had great success testing it with pro gamers.

G-SYNC is just one of a number of recent technologies from Nvidia, such as GameStream and Shadowplay.

Thanks, games.on.net.


32 Comments

  1. Panthro

Hmmm, cool. I could see how this would help a lot of people if they were to buy one of these new monitors that include G-Sync…

But in my past experience, when I had another monitor and rig, I used to get screen tearing when the V-sync setting was off, and once I forced the option to always be on it stopped the tearing…

And if I used the V-sync option in the settings menu of some games it used to cause some mouse lag.

    So I think the article may be wrong…

    V-sync off = Screen tearing.

    V-sync on = No screen tearing, possible mouse lag.

    #1 9 months ago
  2. Erthazus

I saw G-Sync and it is amazing. Even at 50 frames per second the difference between standard 50 fps and G-Sync 50 fps is huge.
And monitor makers said that they will start using this tech as early as next year and will include it in all future monitors.

And in reality, if next-gen consoles had this tech, we could have more games without screen tearing on consoles. Sony and MS took the cheapest and shittiest approach with AMD.

    well, good luck console players. In 2014+ you will start your experience with screen tearing and 30 fps.

Nvidia, with Shadowplay + G-Sync + GameStream + Tegra chips, is basically a leader in the videogame industry.

    #2 9 months ago
  3. Erthazus

    http://www.youtube.com/watch?v=Lg5TAUrtXxo – here is the demo.

And anyone who thinks that 30 fps is enough for gaming is a total idiot. Even the difference between 50 frames and 60 frames is a freakin huge deal in gaming. You can see it in the demo, and the guy presenting it tells everyone that and proves it.

    #3 9 months ago
  4. Juice_Man101

    @2

    Do you even play videogames?

    #4 9 months ago
  5. Erthazus

    @4, Yes.

    #5 9 months ago
  6. Panthro

There is a difference between 30fps and 60fps; most of the people who don’t think there is a difference haven’t experienced it properly…

    Anybody who actually cares knows 60 frames per second is better so you don’t have to be so aggressive about it Erthazus.

    I only game on PC now unless a good console exclusive comes out, and when they do I can see past the FPS limits, the lower resolution, the mucky textures and little to no anti aliasing since what is underneath is still usually a fun game regardless of how it looks.

Shit, sometimes I even play some games on my old SNES and PlayStation 1; there are some seriously kick-ass games on those systems.

I only bought a PC for the indie games and the vast choice of other games created for PCs, and I’m happy; I’ve found some truly incredible games on this platform. But I still don’t choose to rag on console gamers solely because their platform of choice doesn’t have some of the features mine does, fuck that. We all play games, so we should all technically get along.

    #6 9 months ago
  7. rrw

Erthazus, what would happen if I told you I cannot see the difference in that video?

    #7 9 months ago
  8. Juice_Man101

    @6

    +1

    #8 9 months ago
  9. sagtlthl

    @6

    +1

    #9 9 months ago
  10. DSB

    This is gonna be huge. Not looking forward to throwing even more buckets of money at more expensive monitors… But it’s definitely gonna make a difference.

    Clever bastards.

    #10 9 months ago
  11. Lounds

Why so much hate for AMD? Radeon is pretty good these days. I’ve used both cards over the years, and apart from my ATI 4850, which had a few bad drivers, I had a pretty good run with it. The 7970 can be grabbed pretty cheap at the moment, so I would say they’re a good buy. Fuck it, buy 2 if you have a big enough PSU.

    #11 9 months ago
  12. TheWulf

    What #6 said. There’s absolutely no need to be nasty towards console owners. And really, do you think that nVidia won’t offer this to television manufacturers, or do you think that AMD won’t come up with something similar for the consoles?

    I keep saying this, but — what makes the true differences between platforms isn’t hardware, it’s philosophy. So, one platform has one bit of shiny hardware, another platform has another, who gives a toss? I have a PC for the exact reasons put forth by #6 — because there are experiences on the PC that you can’t find anywhere else. And that’s due to the openness of the platform and that people can develop what they want to.

There are more people creating on the PC, because they can: the PC is a development platform. So you get these crazy things on the PC that you wouldn’t get on non-development platforms. So, yeah, the PC is a great platform that’s always going to be around, due to it actually being necessary for development in the first place.

    And that’s great.

    I don’t think that there’s any point in going on and on about the hardware differences, because eventually consoles and tablets are going to slowly catch up. They’re doing that right now, whether you care to admit it or not. And they’re becoming more and more like PCs. They’re only missing the integral element that the PC has — being open platforms and being development platforms.

    When consoles and tablets can be those things, only then will the PC become irrelevant. And goodness knows when that will be, due to greed and proprietary nonsense.

    But still, hardware? Who cares! And this is just more proprietary nonsense, anyway, hoping to lock people into specific types of monitors.

    So, yeah.

    It’s a shame that you’re something of a token PC gamer around these parts, Erth, because you don’t seem to understand what the platform is about, really, beyond just having better hardware than the next guy. It’s kind of like a wannabe car enthusiast who believes that he is such just because he has a better car than the next guy.

    It’s never about that. Never.

    #12 9 months ago
  13. Kabby

    I’ll reserve judgement until retail devices are available.

    #13 9 months ago
  14. sagtlthl

    @12 +1

    #14 9 months ago
  15. Phoenixblight

    @7

Did you watch the entire video? It’s not apparent in the beginning, especially with YouTube compression, but then they zoom in on the monitor and the difference is quite obvious. This is huge and I will be anxiously awaiting the kit and price to put this on my monitor.

    #15 9 months ago
  16. Brenna Hillier

What I think is really interesting is that in the full Geforce blog post they mention that pro gamers struggled with it at first, because they’d learned to compensate for input lag, and then started getting great results. One step closer to genuine one-to-one controls…

    #16 9 months ago
  17. Dragon

Read the DF preview a few days ago; it looks good, to say the least:
    http://www.eurogamer.net/articles/digitalfoundry-nvidia-g-sync-the-end-of-screen-tear-in-pc-gaming
    A true game-changing innovation.

It currently seems to support only Kepler cards, but unless Kepler is equipped with some special hardware (which I doubt, but someone can correct me on that; the tech looks to work exclusively on the monitor side, not the GPU one), it’s quite possible for Nvidia to license it to others.
    This will potentially be a good new revenue stream for Nvidia, since all PC monitor manufacturers will want this.

    #17 9 months ago
  18. Chief Thunder

This is really good stuff, but coming from a 144Hz panel it doesn’t look that appealing… btw a monitor with G-Sync will cost almost $400 as the ASUS VG248QE costs about $279 and the G-Sync add-in card they said will retail for $175… at least that’s what I heard.

Once you game at 144Hz w/ vsync, even 100fps looks like crap. But that’s just my eyes… vsync works provided you have the GPU power to pull it off.

@ Erth, why in the world do you always bash consoles? This is not even related to console gaming… is Nvidia paying you or something? I have both a 360 and a PS3 and I just love the exclusives like The Last of Us or Halo or Beyond: Two Souls or whatever.. on PC, I have a 3×1 portrait monitor setup, 3240 x 1920, all at 144Hz, and have the GPU power to play almost any game at that FPS too with vsync on… Do I see a difference between that and the console games that sometimes dip to 20 fps? Of course yes, it is beyond ANYTHING that this G-Sync is or 4K or 30 fps or 60 or whatever… but is it the same experience? NO

    just don’t try to look special, cuz you’re not…

    #18 9 months ago
  19. Phoenixblight

    @18

” btw a monitor with G-Sync will cost almost $400 as the ASUS VG248QE”

Where did you get that? I actually doubt the kit will be that much; what you are paying for is for modders to put the device in, so much like with a mechanic, you are paying for the hour of work along with the device.

    #19 9 months ago
  20. Chief Thunder

@19 I think I read it on the Geforce forums, I’ll try to find a link… my guess is IPS panels with G-Sync will be even more expensive, as the VG248QE is an LED LCD. I’m actually looking forward to this, but I can’t see a point in it if you don’t already see any stutter… maybe it’s one of those things you have to see to really understand… :)

    #20 9 months ago
  21. Hcw87

    @3
The difference between 50 and 60 fps today is impossible for the average human being to see. Just saying.

    But then again, you’re a troll so it might be different for your kind.

    #21 9 months ago
  22. TheBlackHole

    I really don’t need to have state of the art tech to enjoy video games.

    Console frame rates have served me well up until now. I’m sure I’ll enjoy 60fps when they catch up.

    I’m really not in any rush.

    Erth, your aggressiveness is very childish. Sort it out.

    #22 9 months ago
  23. DrDamn

I can see this being very useful in PC gaming, where there is a lot of variety in hardware and devs don’t have a set config to develop for, hence frame rates will always be variable. On consoles the devs should aim to lock at 30 or 60 fps to give a smooth, consistent experience – that’s where the tech in TVs is. If that changes for the majority then maybe look at it again.

    #23 9 months ago
  24. Erthazus

I don’t bash consoles, relax. I bash Sony and Microsoft. There is a difference, guys.

And G-Sync has nothing to do with a crazy good-looking picture. It has to do with a common issue in videogames, SCREEN TEARING, and with one-to-one control. G-Sync is the ultimate thing that will help gamers see a proper image.

But G-Sync makes the monitor a slave to the GPU, which means that even if you are playing at 30 or 40 fps the result is going to be much smoother and you will still get your one-to-one control.

@7, Because you can’t. You can only judge it when you have the monitor or TV in front of you.

    #24 9 months ago
  25. Erthazus

And btw, if you have a GTX 660 or better your PC is ready for G-Sync. All you need is a monitor in the future.
So get ready for some freaking good stuff for your eyes.

I wonder what Valve is doing with Steam Machines. :D And console manufacturers could at least do something similar, cause you know the most common thing in console game development is that you need to hit 30 frames per second even if you can’t use V-sync.

    #25 9 months ago
  26. CyberMarco

    @Erth, I wonder how much money you burn on electronics, seriously! Do you have a life?

This G-Sync is a nice feature, but seeing as it’s from Nvidia, it’s sure to be expensive. The ASUS VG248QE monitor alone costs 350€, so I can’t see how the new monitors with G-Sync are going to cost less than 400€.

Personally I’m happy with my 17″ 75Hz LG monitor from 2003/4 @1280×1024, without any dead pixels, so yeah, I can’t see why I’d have to ditch it to get the next best thing.

    I suppose people have some serious money to burn and want to feed their consumerism.

    #26 9 months ago
  27. monkeygourmet

Eliminating screen tear was one of my wishes for next gen (along with 1080p / 60fps). From a hardware perspective, that’s all I wanted, and by not hitting those benchmarks, going PC as my main gaming choice next gen has been an easy decision.

    Games like Far Cry 3 on Xbox 360 had so much screen tear it was unreal. Ruined the experience for me in many ways.

@CyberMarco

    “I suppose people have some serious money to burn and want to feed their consumerism”

That’s a bit unfair. If you have been gaming for over 20 years and now have a better income, you generally gravitate towards better equipment for your hobby.

I started off last gen with a 360 and a CRT TV. I quickly realised GOW looked a mess, and it made me want to upgrade to an HD set. I’ve had 3 TVs since owning my 360 now.

    #27 9 months ago
  28. CyberMarco

@27 I see your point; actually I was referring to the no-brain big spenders that don’t miss the chance to get the next shiny piece of electronic equipment while they still have the previous version, which is good to go for the next 3-5 years, if not more.

    Those who don’t have the “sense of preservation” for their consumerist habits.

As I said, I play on a 17″ 75Hz 1280×1024 LG monitor and probably wouldn’t say no to a new 1080p monitor, but why get rid of a monitor that has been going strong for almost 10 years?

    I mean do I need to get a new monitor? Does this monitor satisfy my needs? Personally speaking, yes. But I’m also the type of guy who wants to “squeeze” every bit of performance out of his products before getting rid of them.

    I too bought a new 32″ HDTV when I got my PS3 in Sept 2012, but I’m planning to keep my TV for as long as possible, presumably for the next 5-7+ years, if not longer.

    #28 9 months ago
  29. traumadisaster

CyberMarco, I have some clothes that are 20 years old; I could buy Walmart jeans or $200 jeans. Nobody says there is anything wrong with your old monitor. Generally, as you get older you make more money. At some point you may choose to eat steak vs noodles, or if you can’t afford it now, others can, so good for them.

    #29 9 months ago
  30. CyberMarco

    @29 I think you missed my point. I didn’t say it’s wrong to buy better stuff, I always try to get the best products in terms of quality when money isn’t really an issue.

I’m not saying one must live like a peasant. Also, we are talking about consumer habits, not vital needs like quality of food and stuff like that.

    All I’m trying to say is that some people don’t realize how they “waste” their money.

    “Metron Ariston”. Moderation is best!

    #30 9 months ago
  31. AmiralPatate

    Ain’t nuthin but a G-SYNC baby. Because this thread needed comic relief.

    #31 9 months ago
  32. TheWulf

    @30

    That’s exactly how I feel. And this is what creates the “PC elitist” that some console owners have an inferiority complex about. It’s all very unnecessary.

    I mean, let’s say I spend five thousand quid on the very best hardware. How many games actually take advantage of that? One? Two? Okay, let’s say that I spent three thousand quid on really good hardware, how many games will actually take advantage of that, let alone need it? Under one per cent?

    You only need to be as good as the current console generation, because that tends to be the benchmark. Having a slightly better processor is nice because sometimes you’ll have this weird European developer who likes to do good AI (see: Piranha Bytes), but other than that?

This leads to consumers having cognitive dissonance, talking up their computer as being able to do amazing things when it actually can’t. There are only a few games out there that would even make use of such a computer, so they realise they’ve wasted their money.

    Then console owners believe this, then they get jealous, then they get pissed. This leads to console owners having a writhing hatred for PC owners because PC owners get to experience hawt grafix that they can’t have. So they actually get infected with this, even though it’s all fake.

    Really, show me a game which uses even a three thousand quid computer to its fullest, one that looks so amazing. It’s all just a fantasy.

    I like the PC best, yeah, but it’s never about the hardware. In fact, I always stay on the lower rungs of the hardware curve, because I have no need to be any higher than that. I can play 99 per cent of the games out there and I’m perfectly happy with that. And honestly? The remaining 1 per cent is grey-brown, homogeneous nonsense anyway.

    I personally just prefer the different experiences available to the PC due to there not being gatekeepers. When you look at Kickstarter, the PC is the platform they always aim for first, because there are gatekeepers on every other platform. So the PC is where they can most immediately earn money.

    Just look at the Obduction Kickstarter, which is already half way to its goal.

    And the only people that are to blame for this not happening on consoles are Microsoft and Sony, with their proprietary, closed systems which they are gatekeepers and toll men for. An open system allows something like Obduction to thrive, which is why I’m an open platform owner.

    I feel that’s more important to say. I don’t want to say I’m a PC owner any more because I feel like I’m in the same camp as Erth. I just want to say I’m an open platform owner, because I prefer that philosophy. I prefer it that anyone can make and release the kinds of games they want, with no one telling them that they can’t.

    And the Steambox may change everything for console owners. I hope it does.

If the Steambox is successful, it might mean that there are no gatekeepers for consoles then, either. And it would mean mods and Steam Workshop access for everyone. That’d be glorious, wouldn’t it? When that happens, we can all just be open platform users, and that would be that. No more segregation.

    So yeah, I’m not liking thinking of myself as a PC owner now, because of those who have too much money and too little sense. What they want out of it isn’t what I want.

    I’ll just call myself an open platform owner. And I hope everyone will be able to call themselves that, eventually, too.

    #32 9 months ago

Comments are now closed on this article.