
FreeSync vs G-Sync: AMD and Nvidia face off for adaptive sync dominance

Here's everything you need to know about AMD FreeSync and Nvidia G-Sync

Screen tearing is one of the biggest irritations facing PC gamers today. It’s a particular nuisance for players who want quick response times in fast-paced games such as FPS and RTS titles, but the problem affects gamers across the board, from budget PCs to high-end monsters. It’s an issue that graphics card and monitor makers have finally come together to fix.

Nvidia and AMD have two different solutions to this problem, collectively known as adaptive sync (or dynamic refresh rates). The two firms market their technologies differently, but they solve the same problem in a very similar way; it’s the hardware implementations that vary slightly. In this article, we’ll explain how the technology works and what to consider if you’re in the market for a monitor or graphics card.

However, it’s not all rosy: most FreeSync monitors won’t give you adaptive sync with an Nvidia card, and AMD graphics cards won’t work with G-Sync monitors. This leaves consumers with a difficult choice, as your pick of monitor will potentially lock you to one manufacturer or the other for the life of your display. It’s worth knowing that in early 2019, Nvidia announced G-Sync support for a select few FreeSync monitors – the first time Nvidia owners have had the chance to eliminate screen tearing on AMD-centric monitors. To understand how this works, read our dedicated article on the subject.

Available monitors

At first, AMD FreeSync and Nvidia G-Sync monitors were rather hard to come by, but these days almost every monitor, gaming-centric or not, supports AMD FreeSync. More premium monitors carry the Nvidia G-Sync module, though a G-Sync monitor will cost you an additional £100-300 over an otherwise identical FreeSync model.

Compatibility

Most modern Nvidia and AMD graphics cards support G-Sync and FreeSync respectively. If you’re running a particularly old system, we’d urge you to check the manufacturer’s website to see whether your card supports the technology. As a general rule of thumb, if your card was made after 2015, it’s safe to assume it does.

What is screen tearing?

Gamers with high-performance systems often run into the problem of screen or frame tearing. This is caused by the refresh rate of the monitor being out of sync with the frames being produced by the graphics card.


A 60Hz monitor refreshes 60 times per second, but your graphics card’s output will vary, because the rendering load changes with what’s happening onscreen. As a result, when your screen refreshes, the graphics card may have only drawn part of a frame, so you end up with parts of two or more frames on screen at once – which results in distracting, jagged-looking images during fast-paced action.

^ Screen tearing caused by an out-of-sync graphics card and monitor panel (Nvidia diagram)
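
To make the timing concrete, here’s a minimal Python sketch of the mismatch – the frame times and the tear model are deliberately simplified assumptions, not measurements from any real hardware or driver:

```python
# A minimal sketch of why fixed refreshes tear (illustrative numbers only).
import random

R = 1 / 60                                  # 60Hz: one refresh every ~16.7ms
random.seed(1)

# The GPU finishes frames at irregular times (12-25ms apart here)
frame_done, t = [], 0.0
for _ in range(120):
    t += random.uniform(0.012, 0.025)
    frame_done.append(t)

# Without Vsync, the card swaps buffers the moment a frame is ready.
# Any refresh interval containing a swap shows parts of two frames: a tear.
tears = 0
n_refreshes = int(frame_done[-1] / R)
for i in range(n_refreshes):
    start, end = i * R, (i + 1) * R
    if any(start < done <= end for done in frame_done):
        tears += 1

print(f"{tears} of {n_refreshes} refreshes showed a torn image")
```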

This can easily be solved by turning on vertical sync (Vsync) in-game, which forces the graphics card to match the refresh rate of the monitor, typically producing 60 complete frames per second. However, many cards can’t keep up; because the monitor still has to display 60 full frames each second, some frames are repeated until the next one has been fully drawn. This leads to input lag and stuttering that many find even more unpleasant than screen tearing.

^ Stuttering caused by Vsync (Nvidia diagram)
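
As another simplified sketch (the frame times below are made up for illustration), this is roughly what Vsync’s blocking behaviour does to how long each frame stays on screen:

```python
# A minimal sketch of Vsync stutter. With Vsync on, a finished frame waits
# for the next refresh; a frame that misses its ~16.7ms deadline is
# therefore held on screen for ~33.3ms. Illustrative numbers only.
import math

R = 1 / 60                                    # 60Hz refresh interval
frame_times = [0.015, 0.019, 0.014, 0.021, 0.016, 0.020]

t = 0.0                                       # when the GPU starts a frame
swaps = []
for ft in frame_times:
    t += ft                                   # frame finishes rendering
    t = math.ceil(t / R) * R                  # blocking swap: wait for vblank
    swaps.append(t)

held_for = [b - a for a, b in zip(swaps, swaps[1:])]
print([f"{d * 1000:.1f}ms" for d in held_for])  # mix of 16.7ms and 33.3ms = stutter
```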

Because graphics cards and monitors don’t really talk to each other beyond sharing basic information, there has traditionally been no way to sync a card’s frame output with a monitor’s refresh rate. G-Sync and FreeSync solve this problem in the same way, although each uses slightly different technology to do so.

^ Adaptive sync controls when your monitor refreshes (Nvidia diagram)

With G-Sync and FreeSync, the graphics card and monitor can communicate with one another, with the graphics card able to control the refresh rate of the monitor, meaning your 60Hz monitor could become, say, a 49Hz, 35Hz or 59Hz screen, changing dynamically from moment to moment depending on how your graphics card is performing.
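
As a rough illustration of that idea – the panel’s supported range and the frame times below are assumptions, not a real protocol – the refresh interval simply follows the frame time within the limits the panel supports:

```python
# A minimal sketch of adaptive sync (panel range and frame times are
# illustrative assumptions).
MIN_INTERVAL = 1 / 60          # the panel can refresh at most at 60Hz...
MAX_INTERVAL = 1 / 30          # ...and must refresh at least every 1/30s

frame_times = [0.0169, 0.0204, 0.0286, 0.0172]   # ~59, ~49, ~35, ~58fps

for ft in frame_times:
    # The refresh interval tracks the frame time, clamped to the range
    # the panel supports, so every frame is shown exactly once.
    interval = min(max(ft, MIN_INTERVAL), MAX_INTERVAL)
    print(f"frame took {ft * 1000:.1f}ms -> panel runs at {1 / interval:.0f}Hz")
```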

This eliminates both the stuttering of Vsync and screen tearing itself, because the monitor only ever refreshes when it’s been sent a fully drawn frame. The impact is obvious and incredibly impressive, and it’s particularly strong on mid-range machines with fluctuating frame rates. High-end machines will benefit too, although not to the same extent.

Differences between AMD FreeSync and Nvidia G-Sync

Nvidia was first to market with its G-Sync technology, with launch partners including AOC, Asus and Acer. The technology is impressive, but it has a significant drawback: to be G-Sync compatible, a screen needs G-Sync-specific hardware that’s rather expensive, unofficially adding around £100-300 to the price, depending on the spec of the monitor.

G-Sync monitors require a proprietary Nvidia G-Sync scaler module in order to function, which means all G-Sync monitors have similar on-screen menus and options, and carry a price premium. Manufacturers building FreeSync monitors, by contrast, are free to choose scalers from whichever companies produce hardware that supports the technology.

FreeSync, which is an AMD technology, uses the Adaptive Sync standard built into the DisplayPort 1.2a specification. Because it’s part of the DisplayPort standard decided upon by the VESA consortium, any monitor with a DisplayPort 1.2a (or above) input is potentially compatible. That’s not to say that it’s a free upgrade; specific scaler hardware is required for FreeSync to work, but the fact that multiple third-party scaler manufacturers (Realtek, Novatek and MStar) have signed up to make FreeSync-compatible hardware should keep pricing competitive.

One clear difference between Nvidia G-Sync and AMD FreeSync is how they handle graphics cards that produce higher frame rates than a monitor can handle. G-Sync locks frame rates to the upper limit of the monitor while FreeSync (with in-game Vsync turned off) will allow the graphics card to produce a higher frame rate. This introduces tearing, but also means that input lag is at an absolute minimum, which is important for twitch gamers such as those who play FPS titles.
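
As a back-of-the-envelope model of that difference (the figures and the lag proxy below are purely illustrative assumptions, not measurements):

```python
# A rough model of the above-maximum-refresh behaviour described here.
PANEL_MAX_HZ = 144
gpu_fps = 200                                  # the card outruns the panel

gsync_shown_fps = min(gpu_fps, PANEL_MAX_HZ)   # G-Sync: clamped, no tearing
freesync_flip_fps = gpu_fps                    # FreeSync, Vsync off: may tear

# Worst case, a finished frame waits one whole flip interval to appear,
# so a higher flip rate means a shorter wait (lower input lag).
print(f"G-Sync:   {gsync_shown_fps}fps shown, "
      f"up to {1000 / gsync_shown_fps:.1f}ms wait")
print(f"FreeSync: {freesync_flip_fps}fps flipped, "
      f"up to {1000 / freesync_flip_fps:.1f}ms wait")
```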

Other benefits: Input lag

Aside from the visual experience, there’s a hidden benefit of using G-Sync or FreeSync over Vsync: reduced input lag. This is the delay between your input – a mouse click or swipe, for example – and the result appearing on screen.

When using AMD’s or Nvidia’s technology, unwanted input lag is kept to a minimum; it still exists, but it’s nowhere near as bad as with Vsync, which eliminates screen tearing at the cost of noticeable lag. In a fast-paced FPS, a Vsync’d monitor will feel sluggish and less responsive. It’s a night-and-day difference even for a casual gamer, let alone for those in the competitive scene.