
Nvidia G-Sync is a smooth move for PC games

The right graphics card and G-Sync monitor can make games look better.

Joseph Kaminski, Senior Associate Technology Editor


Playing video games on a PC rather than a living room game console has numerous advantages, from better textures to higher resolutions to tighter mouse-and-keyboard controls. But even on a $3,000-or-more desktop gaming PC with the latest processors and graphics cards, games can still display annoying visual artifacts, such as screen tearing and stutter.

Tearing is a horizontal distortion across the screen that appears when playing a PC game, making it look like one frame of animation has been half-written over another. It's something many PC gamers have simply learned to live with.

Nvidia, maker of the popular GeForce line of graphics chips, has developed a display technology called G-Sync that promises to eliminate tearing and screen stutter, and to reduce input lag (where input commands fall out of sync with the action on-screen). We've tested the technology on several games, using a high-end desktop PC and a G-Sync monitor from Asus.

Screen tearing without G-Sync in an Nvidia demo. Sarah Tew / CNET

Previously, to minimize tearing, gamers had to go into the game settings, or the Nvidia control panel app, and turn on V-Sync (vertical synchronization), a technology that dates back to the CRT monitor days. It stops the graphics card's output from outpacing the refresh rate of the display, but at the potential cost of a serious performance hit and added input lag.

So most people leave V-Sync off, which creates a different problem: the next rendered frame is sent to the monitor even if the previous frame hasn't finished drawing. This is what causes tearing, other visual artifacts and screen stutter.
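
To make the timing concrete, here's a rough back-of-the-envelope simulation (plain Python with made-up frame times, not real graphics code) of a 60Hz display fed by a GPU, with V-Sync off and then on:

    # Rough timing sketch, not real graphics code: a 60Hz display fed by a
    # GPU with uneven (hypothetical) frame times, with V-Sync off and on.

    REFRESH_HZ = 60
    SCAN_MS = 1000.0 / REFRESH_HZ          # one refresh cycle: ~16.7 ms

    def simulate(frame_times_ms, vsync):
        clock = 0.0                        # elapsed time in ms
        tears = 0
        for ft in frame_times_ms:
            clock += ft                    # GPU finishes rendering a frame
            phase = clock % SCAN_MS        # how far into the current scan we are
            if vsync:
                # V-Sync on: hold the frame until the next refresh boundary.
                # The GPU sits idle during the wait -- the performance hit.
                clock += (SCAN_MS - phase) % SCAN_MS
            elif phase > 0.0:
                # V-Sync off: the new frame replaces the old one mid-scan,
                # so part of the screen shows each frame -- a tear.
                tears += 1
        return clock, tears

    frames = [14.0, 18.0, 15.0, 21.0, 13.0]    # hypothetical render times (ms)
    for vsync in (False, True):
        total, tears = simulate(frames, vsync)
        print(f"V-Sync {'on' if vsync else 'off'}: "
              f"{total:.1f} ms elapsed, {tears} torn frames")

With V-Sync off, every frame that lands mid-scan tears; with it on, nothing tears, but the five frames take noticeably longer to get through because the GPU keeps waiting on the display.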

G-Sync synchronizes the monitor's refresh rate to the GPU's render rate, so images are displayed the moment they're ready. A G-Sync-compatible graphics card (any GeForce GTX desktop card from the 600 series through the current 900 series) sends a signal to a G-Sync controller chip physically built into the monitor (yes, G-Sync requires a new, specially compatible monitor). After the GPU renders a frame and sends it to the display, the monitor puts it on screen at its next refresh cycle; and instead of waiting on the monitor's vertical blanking period, the GPU is free to send the next frame as soon as it's available.
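
Here's the same sort of rough sketch for the variable-refresh idea: the panel waits for the GPU rather than the other way around. The 144Hz ceiling here is our illustrative assumption about the panel, not an Nvidia specification:

    # Sketch of variable refresh: the panel rescans when a frame is ready,
    # not on a fixed clock. The 144Hz ceiling is an assumed panel limit.

    MIN_INTERVAL_MS = 1000.0 / 144     # fastest the panel can rescan (~6.9 ms)

    def gsync_sketch(frame_times_ms):
        clock = 0.0
        last_refresh = 0.0
        for ft in frame_times_ms:
            clock += ft                # GPU finishes rendering a frame
            gap = clock - last_refresh
            if gap < MIN_INTERVAL_MS:
                clock += MIN_INTERVAL_MS - gap   # panel not ready to rescan yet
            # Refresh now: the frame goes up whole (no tear), and the GPU
            # can start on the next frame immediately instead of idling.
            last_refresh = clock
            print(f"frame displayed at {clock:6.1f} ms")

    gsync_sketch([14.0, 18.0, 15.0, 21.0, 13.0])

Each frame appears essentially the moment it's done, which is why motion looks smoother even when the underlying frame rate hasn't changed.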

Our dual-monitor test setup, with the G-Sync display on the right. Sarah Tew / CNET

All of this is possible because of the direct communication between the display's built-in logic board and the Nvidia graphics card, which are connected via DisplayPort (for now, G-Sync works only over DisplayPort, not HDMI).

In practice, the effect is visually similar to watching a big-screen TV with a dejudder filter turned on, which is a form of video smoothing some call the "soap opera effect." It's generally unwanted on TVs, but here it's a plus. Motion is smoother, screen tearing is nonexistent, and each of the games we tried, from Metro: Last Light to the new Dying Light, looked great.

Using G-Sync (which must be turned on via the Nvidia control panel on the PC, and may require an Nvidia driver update to add the required checkbox) caused a not-insignificant drop in performance in some PC gaming benchmarks. Running Metro: Last Light with the display set to 60Hz (60 refresh cycles per second), resolution set to 1,920x1,080, and G-Sync turned off, the game averaged 70.29 frames per second. With G-Sync on, and the other settings unchanged, the game ran at an average of 58.0 frames per second.
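
Put in per-frame terms, the gap is easy to work out:

    # Converting the averages above into per-frame render times.
    for label, fps in (("G-Sync off", 70.29), ("G-Sync on", 58.0)):
        print(f"{label}: {fps} fps = {1000.0 / fps:.1f} ms per frame")
    # G-Sync off: 70.29 fps = 14.2 ms per frame
    # G-Sync on: 58.0 fps = 17.2 ms per frame

That's roughly 3 milliseconds more per frame, in exchange for frames that always land cleanly on a refresh.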

Tests were conducted using an Asus ROG Swift PG278Q monitor, one of the first to support G-Sync, and a Maingear Shift desktop equipped with three Nvidia GeForce GTX 980 graphics cards and an overclocked Intel Core i7-5960X CPU.

The G-Sync monitor on the right avoids screen tearing in Dying Light. Dan Ackerman

In an interesting paradox, while turning G-Sync off produced a higher frame rate, the run played with G-Sync on, at the lower frame rate, actually looked better. In a sense, G-Sync gave us the illusion of a better frame rate, thanks to its especially smooth motion. Comparing the two side by side, anyone would pick the G-Sync version.

The same was true for the new game Dying Light, played at very high settings at 2,560x1,440 resolution. Walls and backgrounds tore on a non-G-Sync monitor connected to the same desktop, but looked perfect on the G-Sync monitor running alongside it.

G-Sync displays include this sticker. Sarah Tew / CNET

Currently, several display makers offer G-Sync monitors, but most cost a couple of hundred dollars more than comparable non-G-Sync versions. The Asus we used sells for $799 (as does an Acer model), and versions from BenQ and Philips run about $599, all for 27- or 28-inch screens.

AMD offers a similar technology called FreeSync that's compatible with that company's current R7 and R9 series desktop graphics cards. It, too, uses DisplayPort, but requires a monitor that supports the DisplayPort Adaptive-Sync specification. That spec is free to use, so FreeSync could eventually be more widely available than branded G-Sync monitors (although Adaptive-Sync monitors for FreeSync are not yet on sale).

G-Sync isn't a must-have, especially as it involves the significant added expense of a new premium-priced monitor, but in our hands-on and eyes-on tests, there is a definite difference when using it. It will be very interesting to see if this comes to more mainstream monitors or even to laptop displays.