Nvidia Ends Screen Tearing With G-Sync Display Technology
Nvidia's new G-Sync technology promises to end once and for all the ugliness and hassles of screen tearing and sluggish frame rates.
You know how it goes: In-game images look best with v-sync turned on, but games can get sluggish and laggy, and if the GPU can't keep up with the monitor's refresh rate, stuttering becomes an issue too. Turning v-sync off offers the best performance, but then you have to deal with "screen tearing," where the top and bottom parts of the display aren't in sync. In short, v-sync on looks best, v-sync off plays best, but neither is ideal.
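(For the code-inclined: the v-sync toggle in a game's options menu usually boils down to a single swap-interval call in the renderer. Here's a bare-bones GLFW/OpenGL sketch of our own, not anything from Nvidia, showing where that switch lives.)

/* Minimal GLFW sketch of the v-sync trade-off described above.
   A swap interval of 1 makes buffer swaps wait for the monitor's refresh
   (no tearing, but the game can stall and feel laggy); 0 presents frames
   as fast as they're rendered (best responsiveness, but frames can tear). */
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit())
        return -1;

    GLFWwindow *window = glfwCreateWindow(1280, 720, "v-sync demo", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);

    /* 1 = v-sync on (looks best); 0 = v-sync off (plays best). */
    glfwSwapInterval(1);

    while (!glfwWindowShouldClose(window)) {
        /* ... draw the frame here ... */
        glfwSwapBuffers(window);  /* blocks until the next refresh when v-sync is on */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}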
Enter G-Sync, the new technomancy from the folks at graphics company Nvidia, which is actually built around a fairly simple idea. Conventional LCD monitors have fixed refresh rates, typically 60 Hz, that the GPU must work around; with G-Sync, a module inside the monitor hands control of the refresh timing to the GPU. Because the display adapter controls the timing, the two are always synchronized, eliminating screen tearing without sacrificing performance.
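(To make the timing difference concrete, here's a toy C model of our own devising, not Nvidia's implementation: on a fixed 60 Hz panel a frame that misses a refresh boundary waits for the next one, while a GPU-driven panel simply refreshes the moment the frame is ready.)

/* Toy timing model: fixed 60 Hz refresh vs. GPU-driven refresh.
   The frame times below are hypothetical, chosen only to illustrate the idea. */
#include <stdio.h>

int main(void) {
    const double refresh_ms = 1000.0 / 60.0;              /* fixed 60 Hz panel: ~16.7 ms per refresh */
    const double frame_ms[] = { 12.0, 20.0, 15.0, 25.0 }; /* hypothetical per-frame render times */
    double gpu_done = 0.0;

    for (int i = 0; i < 4; i++) {
        gpu_done += frame_ms[i];

        /* Fixed refresh: the frame waits for the next refresh boundary after it's ready. */
        int ticks = (int)(gpu_done / refresh_ms) + 1;
        double fixed_display = ticks * refresh_ms;

        /* GPU-driven refresh (the G-Sync idea): the panel refreshes as soon as the frame is ready. */
        double vrr_display = gpu_done;

        printf("frame %d ready at %6.1f ms | fixed 60 Hz shows it at %6.1f ms | GPU-driven shows it at %6.1f ms\n",
               i + 1, gpu_done, fixed_display, vrr_display);
    }
    return 0;
}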
Tom Petersen, Nvidia's director of technical marketing, said G-Sync will be compatible with most GeForce GTX GPUs, but while the potential benefits are obvious, it likely won't be an easy sell to mainstream consumers. Unless I'm completely misunderstanding how it works, G-Sync will require specific monitors to operate, and it's unlikely that "average" PC users will be willing to fork over extra bucks for technology that has no bearing on them anyway. Is the anti-screen-tearing segment of the enthusiast market big enough to support proprietary monitors? It's a great idea, but some pretty big questions remain unanswered.
Source: Nvidia [http://blogs.nvidia.com/blog/2013/10/18/g-sync/]