
Nvidia's G-SYNC Will Render Screen Tearing Completely Obsolete

A lot of current graphics hardware has a rather irritating problem. See, most monitors are stuck at a refresh rate of 60 Hz - that's 60 screen updates per second. Pretty much an industry standard, that. Some of you might be wondering why that's a problem - after all, 60 frames per second already looks perfectly smooth to most people. What's the big deal?

GPUs. Even moderately powerful graphics processors are capable of pumping out well over 60 frames per second. Trouble is, most monitors simply aren't set up to refresh that quickly. As a result, screen tearing - a phenomenon in which a single refresh ends up displaying parts of two or more different frames - can occur quite readily.
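To make that concrete, here's a small, purely illustrative Python sketch (the line count, timings, and function names are my own assumptions, not anything from Nvidia): a 60 Hz display scans its lines out over roughly 16.7 ms, and if the GPU swaps to a new frame partway through, everything above the swap point comes from one frame and everything below it from the next - that boundary is the visible tear.

```python
# Toy model of screen tearing: a 60 Hz display scans out its image line by
# line over ~16.7 ms. If the GPU swaps buffers mid-scanout, a single refresh
# ends up showing parts of two different frames. All values are illustrative.

REFRESH_INTERVAL_MS = 1000 / 60   # duration of one 60 Hz refresh
SCREEN_LINES = 1080               # lines scanned out per refresh (assumed)

def scanout(swap_time_ms, frame_a="frame N", frame_b="frame N+1"):
    """Return which frame each scanline shows if the buffer swap lands
    swap_time_ms into the refresh."""
    shown = []
    for line in range(SCREEN_LINES):
        line_time = (line / SCREEN_LINES) * REFRESH_INTERVAL_MS
        shown.append(frame_a if line_time < swap_time_ms else frame_b)
    return shown

lines = scanout(swap_time_ms=10.0)      # swap arrives 10 ms into the refresh
tear_line = lines.index("frame N+1")    # first line showing the newer frame
print(f"Tear visible at scanline {tear_line} of {SCREEN_LINES}")
```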

The effect is called 'tearing' because it gives the screen a torn look, almost as if someone took a piece of paper and put part of it through a shredder. One of the most common solutions to this problem (alongside double and triple buffering, which hold completed frames until the display is ready for them) is Vertical Sync (V-Sync for short), which forces the GPU to deliver new frames only on the monitor's fixed refresh schedule.

Unfortunately, V-Sync is an incredibly inelegant solution, and it brings with it a whole host of new problems; among them are input lag, benchmarking complications, and stuttering whenever the GPU can't quite keep pace with the refresh rate.
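To see where that stutter and lag come from, here's another rough, illustrative sketch (the numbers are assumptions, not measurements): with V-Sync and double buffering, a finished frame can only appear at the next fixed refresh boundary, so a frame that takes even slightly longer than 16.7 ms to render has to wait for the following refresh.

```python
# Toy model of V-Sync with double buffering: a finished frame can only be
# shown at the next fixed 60 Hz refresh boundary, so anything slower than
# ~16.7 ms per frame waits an entire extra refresh. Illustrative numbers only.
import math

REFRESH_MS = 1000 / 60

def vsync_display_time(render_ms):
    """Time (ms, measured from the last refresh) at which a frame that took
    render_ms to draw actually appears on screen."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (10.0, 16.0, 18.0, 25.0):
    shown = vsync_display_time(render_ms)
    print(f"render {render_ms:5.1f} ms -> shown at {shown:5.1f} ms "
          f"(+{shown - render_ms:.1f} ms of waiting)")
```

Note how an 18 ms frame isn't shown a fraction of a millisecond late - it waits until the 33.3 ms mark, which is exactly the stutter and added input lag described above.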

In short, ever since PC gaming first started gaining in popularity, gamers have effectively had to choose between visual quality and input response time. It's not a pleasant choice to have to make, particularly if you're gaming competitively. It's a problem that's bothered Nvidia's Tom Petersen almost since he started gaming twenty years ago.

Now, with the help of some of Nvidia's brightest minds, Petersen and Jen-Hsun Huang - Nvidia's CEO - have found the solution.

As it turns out, the refresh rate of many modern computer displays is actually linked to the standard that was common in television back in the 1940s. Displays back then almost universally used a 60 Hz refresh rate, largely because matching the 60 Hz frequency of the AC power grid made the electronic components far easier to design. The computing industry just sort of inherited that refresh rate without asking too many questions.

Nvidia decided that this was far from ideal, and after countless hours of work, their engineers came forward with the manufacturer's newest product: the Nvidia G-SYNC module. This small chip is designed to fit inside a display and works directly with the hardware and software present in recent GeForce GTX graphics cards, causing the monitor to refresh each time the GPU finishes rendering a new frame, rather than on a fixed schedule.
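To contrast that with the V-Sync sketch above, here's an equally rough sketch of the variable-refresh idea (a conceptual model only - the panel limits and function name are my assumptions, not Nvidia's actual protocol): the display refreshes whenever the GPU signals that a frame is complete, clamped to whatever refresh range the panel supports.

```python
# Toy model of variable refresh (the G-SYNC idea): the display refreshes
# whenever the GPU finishes a frame, within the panel's supported range.
# A conceptual sketch only, not Nvidia's implementation.

MIN_REFRESH_MS = 1000 / 144   # fastest the hypothetical panel can refresh
MAX_REFRESH_MS = 1000 / 30    # slowest interval before it must refresh anyway

def gsync_display_time(render_ms):
    """The frame is shown as soon as it's ready, clamped to panel limits."""
    return min(max(render_ms, MIN_REFRESH_MS), MAX_REFRESH_MS)

for render_ms in (10.0, 16.0, 18.0, 25.0):
    shown = gsync_display_time(render_ms)
    print(f"render {render_ms:5.1f} ms -> shown at {shown:5.1f} ms "
          f"(+{shown - render_ms:.1f} ms of waiting)")
```

Run against the same frame times as the V-Sync example, every frame appears essentially as soon as it's rendered - no mid-refresh swap, so no tearing, and no waiting for a fixed boundary, so no quantization stutter.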

I'll put it simply: Nvidia has, in essence, killed both screen tearing and stuttering. 

The technology is currently slated for general availability next year. Asus, BenQ, Philips, and ViewSonic are already on board; I find it likely that even more manufacturers and distributors will join in before too much longer.

Comments
Oct 21, 2013
by Anonymous

Just an explanation in layman's terms:

Vsync:
#1 With Vsync on, the GPU needs to keep up with the monitor's refresh rate; if it can't, there is lag/stutter.
#2 If you turn off Vsync, there is tearing when the monitor is refreshing faster than the GPU is drawing, or when the monitor refreshes in the middle of a draw.

Gsync:
#1 The GPU and the monitor draw and refresh in step - the monitor refreshes each time the GPU finishes a frame.