Nvidia Ends Screen Tearing With G-Sync Display Technology

Andy Chalk

One Flag, One Fleet, One Cat
Nov 12, 2002
45,698
1
0


Nvidia's new G-Sync technology promises to end once and for all the ugliness and hassles of screen tearing and sluggish frame rates.

You know how it goes: In-game images look best with v-sync turned on, but games can get sluggish and laggy, and if the GPU can't keep up with the monitor's refresh rate, stuttering becomes an issue too. Turning v-sync off offers the best performance, but then you have to deal with "screen tearing," where the top and bottom parts of the display aren't in sync. In short, v-sync on looks best, v-sync off plays best, but neither is ideal.

Enter G-Sync, the new technomancy from the folks at GPU maker Nvidia, which is actually built around a fairly simple idea. Conventional LCD monitors have fixed refresh rates, typically 60Hz, which the GPU must work with; with G-Sync, a module inside the monitor hands control of the refresh timing over to the GPU. Because the display adapter controls the timing, the two are always synchronized, eliminating screen tearing without sacrificing performance.
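To make that tradeoff concrete, here's a toy Python model of display latency. Everything in it is hypothetical: a 60Hz panel, randomized render times, and a G-Sync stand-in that simply starts a refresh the instant a frame is ready; real v-sync also back-pressures the GPU, which this sketch ignores.

```python
import math
import random

REFRESH = 1000 / 60  # fixed 60 Hz panel: a scanout starts every ~16.7 ms

def average_wait(render_ms, mode):
    """Average delay (ms) between a frame finishing and being displayed."""
    t, total = 0.0, 0.0
    for r in render_ms:
        t += r  # the GPU finishes this frame at absolute time t
        if mode == "fixed":
            # fixed refresh + v-sync: the swap waits for the next boundary
            total += math.ceil(t / REFRESH) * REFRESH - t
        else:
            # "gsync": the panel begins a scanout the moment the frame is ready
            total += 0.0
    return total / len(render_ms)

random.seed(0)
renders = [random.uniform(12, 22) for _ in range(10_000)]  # a ~45-83 FPS workload
print(f"v-sync wait: {average_wait(renders, 'fixed'):.1f} ms per frame")
print(f"G-Sync wait: {average_wait(renders, 'gsync'):.1f} ms per frame")
```

With swap times effectively random relative to the refresh clock, the fixed-refresh wait averages out to roughly half a refresh interval, while the refresh-on-demand model pays nothing; that waiting is exactly the v-sync lag and stutter described above.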

Tom Petersen, Nvidia's director of technical marketing, said G-Sync will be compatible with most GeForce GTX GPUs, but while the potential benefits are obvious, it likely won't be an easy sell to mainstream consumers. Unless I'm completely misunderstanding how it works, G-Sync will require specific monitors in order to operate, and it's unlikely that "average" PC users will be willing to fork over extra bucks for technology that has no bearing on them anyway. Is the anti-screen-tearing segment of the enthusiast market sufficient to support proprietary monitors? It's a great idea, but some pretty big questions remain unanswered.

Source: Nvidia [http://blogs.nvidia.com/blog/2013/10/18/g-sync/]


 

GoaThief

Reinventing the Spiel
Feb 2, 2012
1,229
0
0
I'd certainly shell out for it if it works; I can only play with VSync on if I'm using a controller. The input lag with a mouse renders it unplayable for me, even if it's set to render only one frame ahead. I do prefer higher-than-average sensitivity, so it may be more apparent for me.

Compatibility with HDTVs would be the icing on a very sweet cake.
 

Doom972

New member
Dec 25, 2008
2,312
0
0
I don't see myself investing in a new monitor just for this purpose. Maybe when I get a new one for some other reason.

In the meantime, Adaptive V-Sync and framerate limiting seem to do the job when V-Sync slows down the game.
 

Easton Dark

New member
Jan 2, 2011
2,366
0
0
Say goodbye to the evil whore known as input lag then... V-sync is off in every game where I can stand the screen tearing.

Unfortunately, it won't fix the strange problem some games have where you need V-sync on to continue playing. The most prominent example is The Darkness 2.
 

Boris Goodenough

New member
Jul 15, 2009
1,428
0
0
Well, according to Steam, 51.98% use nVidia cards (32.66% for AMD), and according to AnandTech it works with most nVidia cards.
People buy 120 and 144 Hz screens to avoid tearing (and overclockable 60 Hz IPS screens), so I'm sure people are willing to cash out on something that makes visible tearing a thing of the past.
Pricing is everything in this case, though, and if they opt for IPS or OLED the price might be too high, at least initially.

Edit: they should be out Q1 2014.
 
KingsGambit

Apr 5, 2008
3,736
0
0
If it's only available as an integral component of a new monitor, it will take a while to get off the ground. I would absolutely buy a monitor based on this feature, but in my case only if it's in a monitor that's also top-end in every other regard.

It's also tragic that I just bought a top-end widescreen gaming monitor less than six months ago, so I couldn't justify upgrading to a G-Sync one any time soon. Plus my brand new gaming rig (as of Aug/Sept) has no issues with performance/tearing in anything I've tried so far :) With V-Sync on and triple buffering where available, I have no issues in any game with everything at maximum settings (GTX 780 Classified).

But it'll certainly factor into my next monitor purchase. It's a great benefit that directly improves the gaming experience; I'm all for it and would gladly spend money on it.
 

Boris Goodenough

New member
Jul 15, 2009
1,428
0
0
KingsGambit said:
It's also tragic that I just bought a top-end widescreen gaming monitor less than six months ago, so I couldn't justify upgrading to a G-Sync one any time soon. Plus my brand new gaming rig (as of Aug/Sept) has no issues with performance/tearing in anything I've tried so far :) With V-Sync on and triple buffering where available, I have no issues in any game with everything at maximum settings (GTX 780 Classified).
What resolution is it?
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
Not very useful, but good to have, I guess. This is why I use D3DOverrider: it lets you limit your FPS and use triple buffering even with DirectX games, not just OpenGL ones. That pretty much does the job.
 

Ne1butme

New member
Nov 16, 2009
491
0
0
This is good and all, but will it fix the tearing issues with Silverlight and Netflix? There's no way to turn on a v-sync option for those applications.
 

thiosk

New member
Sep 18, 2008
5,410
0
0
Unless G-Sync chips catch on and monitor manufacturers buy the modules from Nvidia for direct incorporation. It's like a Trojan horse approach for getting chips into every product manufactured worldwide, and then building the video cards to go along with it.

Nvidia: Has all the FLOPs
 

Boris Goodenough

New member
Jul 15, 2009
1,428
0
0
thiosk said:
Unless G-Sync chips catch on and monitor manufacturers buy the modules from Nvidia for direct incorporation. It's like a Trojan horse approach for getting chips into every product manufactured worldwide, and then building the video cards to go along with it.

Nvidia: Has all the FLOPs
ASUS, BenQ, Philips, and ViewSonic have already signed up for it.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Why not triple buffering? Doesn't that solve the same problem, while also being possible on pretty much every video card released in the last two decades?
 

lacktheknack

Je suis joined jewels.
Jan 19, 2009
19,316
0
0
But I tend to stop noticing the framerate at 50+, so my 60Hz refresh rate isn't an issue... no sell over here...
 

lacktheknack

Je suis joined jewels.
Jan 19, 2009
19,316
0
0
Bad Jim said:
Why not triple buffering? Doesn't that solve the same problem, while also being possible on pretty much every video card released in the last two decades?
No, because V-synch synchs the GPU to the monitor. Triple buffering doesn't change the hardware specs.
 

Rad Party God

Party like it's 2010!
Feb 23, 2010
3,560
0
0
Thanks but no thanks, I already have triple buffering turned on by default and I use MSI Afterburner to limit my FPS when necessary.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
lacktheknack said:
Bad Jim said:
Why not triple buffering? Doesn't that solve the same problem, while also being possible on pretty much every video card released in the last two decades?
No, because V-synch synchs the GPU to the monitor. Triple buffering doesn't change the hardware specs.
But why do we want to change the hardware specs? Triple buffering already eliminates tearing without hurting your fps, and you can use it right now. How does this G-Sync make our gaming experience better if we have already turned on triple buffering?
 

lacktheknack

Je suis joined jewels.
Jan 19, 2009
19,316
0
0
Bad Jim said:
lacktheknack said:
Bad Jim said:
Why not triple buffering? Doesn't that solve the same problem, while also being possible on pretty much every video card released in the last two decades?
No, because V-synch synchs the GPU to the monitor. Triple buffering doesn't change the hardware specs.
But why do we want to change the hardware specs? Triple buffering already eliminates tearing without hurting your fps, and you can use it right now. How does this G-Sync make our gaming experience better if we have already turned on triple buffering?
You're misinformed.

Triple buffering does not fix screen tearing.

Tearing happens when (for example) a GPU fires 70 frames in one second at a screen that can only output 60 Hz. This means that before the screen has even finished drawing the first frame, it has already started drawing a new one at the top. When things are moving, you're looking at two different frames at once (or more, at 120 FPS and higher). The place where the two frames are smushed together is a screen tear.
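That arithmetic can be sketched in a few lines of Python (using the 70-on-60 numbers from the example above and a hypothetical 1080-line panel): each unsynced buffer swap lands partway through a scanout, and the scanline where it lands is where the visible tear appears.

```python
REFRESH_MS = 1000 / 60  # one 60 Hz scanout takes ~16.7 ms, top to bottom
FRAME_MS = 1000 / 70    # the GPU delivers a new frame every ~14.3 ms
LINES = 1080            # vertical resolution of a hypothetical 1080p panel

def tear_rows(n_frames):
    """Scanline at which each unsynced buffer swap lands mid-scanout."""
    rows = []
    for i in range(1, n_frames + 1):
        t = i * FRAME_MS                       # when frame i's swap happens
        phase = (t % REFRESH_MS) / REFRESH_MS  # fraction of the scanout done
        rows.append(int(phase * LINES))        # the row the tear appears on
    return rows

print(tear_rows(6))  # the tear line drifts up the screen, frame after frame
```

Because the frame interval is shorter than the refresh interval, each swap lands a little earlier in the scanout than the last, so the tear visibly crawls across the screen instead of staying put.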

http://en.wikipedia.org/wiki/Multiple_buffering

What buffering does is allow the PROGRAM to do its thing however fast it wants while allowing the GPU to finish what it's doing before taking the next set of instructions. This means the software and GPU are decoupled. If you play an old game (particularly a DOS game) with no buffering on a modern computer, you may notice that the animations run insanely fast. That's from a lack of buffering.

Conversely, if the software is having momentary issues and isn't sending out any drawing instructions, the GPU can take a previous set of instructions from the buffer and work on them instead while it waits.
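The decoupling described here, where the program draws at its own pace and only complete frames are handed over for display, is the essence of double buffering. A minimal sketch, with all names made up:

```python
class DoubleBuffer:
    """Game code draws into the back buffer; the front buffer is displayed."""
    def __init__(self):
        self.front = []  # the complete frame currently being shown
        self.back = []   # the frame the program is still drawing

    def draw(self, primitive):
        self.back.append(primitive)  # drawing never touches the visible frame

    def swap(self):
        # flip: the finished frame becomes visible, and drawing
        # starts over on a fresh back buffer
        self.front, self.back = self.back, []

buf = DoubleBuffer()
buf.draw("triangle")
buf.draw("sprite")
buf.swap()  # only now do the new primitives appear on screen
```

Triple buffering adds a second back buffer so the program can keep rendering while a finished frame waits its turn, but none of this changes when the monitor itself refreshes; that's the part G-Sync addresses.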

EDIT: Added extra.

As long as you don't use V-Sync and your FPS is higher than your screen's refresh rate, you'll get screen tearing. Until now the only fix was V-sync. Buffering doesn't even acknowledge the monitor's existence.

G-Sync is a chip (if I'm reading this right) that overrides the screen's refresh rate by handing control to the GPU. It... seems a bit dangerous, really. I imagine you'd need specialized monitors to avoid explosions, but if that's the case, why not just invest in a 120Hz monitor and be done with it?
 

Smooth Operator

New member
Oct 5, 2010
8,162
0
0
Well, I've only been saying that fixed refresh rates on LCDs are the dumbest idea in human history since they started selling them, but hey, better late than never.
Sadly, this being yet another proprietary standard makes for a very shit solution; hopefully it sparks some work on creating a proper standard for general use.

Oh, and triple buffering is a system to solve inherent problems with graphics cards; synchronizing with your screen is a completely separate matter.