
Screen tearing is one of the biggest irritations facing PC gamers today. It's a huge annoyance for players who want quick response times in fast-paced games such as FPS and RTS titles, but the problem affects games and gamers across the board, from budget PCs to high-end monsters. It's a problem that graphics card and monitor makers have finally set out to fix.
Fear not, though: Nvidia and AMD have you covered with two competing solutions to this problem, both falling under the umbrella of adaptive sync (also known as dynamic refresh rates). The two firms market their technologies differently, but they solve the same problems in precisely the same way; it's the hardware implementations that vary slightly. In this article, we'll explain how the technology works and give you plenty to think about if you're in the market for a monitor or graphics card.
Nvidia got the jump on AMD last year when it launched its first consumer monitors with built-in adaptive sync, which it calls G-Sync. AMD, meanwhile, has been talking about its own adaptive sync technology, called FreeSync, for a long time and finally looks set to get a flurry of FreeSync-compatible monitors onto the market this year.
However, it's not all rosy. While consumers will soon have a choice of technologies (and competition is always a good thing), the two camps are very much divided: Nvidia cards won't work with FreeSync monitors, and AMD cards won't work with G-Sync monitors. This leaves buyers with a difficult decision, as your choice of monitor could lock you to one graphics card manufacturer for the life of your display.
Before we get into the details of each system, though, let's take a look at the problems solved by adaptive sync.
FRAME TEARING
Gamers with high-performance systems often run into the problem of frame tearing, which is caused by the monitor's refresh rate being out of sync with the frames being produced by the graphics card.
A 60Hz monitor refreshes 60 times per second, but your graphics card's output varies with the load that on-screen events place on it. As a result, when your screen refreshes, the graphics card may only have partially drawn the next frame, so you end up with portions of two or more frames on screen at once. The result is a distracting, jagged-looking split in the image that's most noticeable during fast-paced action.
^ Frame tearing caused by an out-of-sync graphics card and monitor panel (Nvidia diagram)
This can easily be solved by turning on vertical sync (Vsync) in-game, which forces the graphics card to wait for the monitor's refresh, typically delivering 60 complete frames per second. However, many cards can't keep up, and because the monitor still has to display a full frame on every refresh, it repeats the previous frame until the next one has been fully drawn. This leads to input lag and stuttering, which for many people is even more unpleasant than screen tearing.
^ Stuttering caused by Vsync (Nvidia diagram)
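If you'd like to see the mismatch in numbers rather than diagrams, the toy Python simulation below is purely illustrative (the 14-26ms frame times, the half-second window and the counting logic are our own assumptions, not anything Nvidia or AMD ship). It models a fixed 60Hz panel fed by a graphics card with fluctuating output, counting how many refreshes tear without Vsync and how many repeat a frame with Vsync turned on.

```python
import random

REFRESH_HZ = 60
REFRESH_PERIOD = 1000.0 / REFRESH_HZ   # ms between refreshes on a fixed 60Hz panel
SIM_TIME = 500.0                       # simulate half a second of gameplay

def gpu_frame_times(seed=1):
    """Hypothetical frame-completion timestamps for a GPU whose render time
    fluctuates between roughly 14ms and 26ms per frame (about 38-71fps)."""
    random.seed(seed)
    t, times = 0.0, []
    while t < SIM_TIME:
        t += random.uniform(14.0, 26.0)    # per-frame workload varies with the scene
        times.append(t)
    return times

frames = gpu_frame_times()
refresh_count = int(SIM_TIME / REFRESH_PERIOD)

# No vsync: the card flips buffers the moment a frame is ready. If that flip
# lands while the panel is mid-scan, the top of the screen shows the old frame
# and the bottom shows the new one - a visible tear.
tears = sum(
    1 for i in range(refresh_count)
    if any(i * REFRESH_PERIOD < f < (i + 1) * REFRESH_PERIOD for f in frames)
)

# Vsync on: flips only happen on a refresh boundary. If no new frame finished
# in time, the previous frame is displayed again - the stutter described above.
repeats, last_shown = 0, None
for i in range(refresh_count):
    ready = [f for f in frames if f <= i * REFRESH_PERIOD]
    newest = ready[-1] if ready else None
    if newest is not None and newest == last_shown:
        repeats += 1
    last_shown = newest

print(f"No vsync: {tears} of {refresh_count} refreshes show a torn image")
print(f"Vsync on: {repeats} of {refresh_count} refreshes repeat the previous frame")
```

With the GPU averaging around 20ms per frame, most refreshes tear when Vsync is off, and a good chunk of them repeat the previous frame when it's on; that's the trade-off adaptive sync is designed to remove.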
Because graphics cards and monitors traditionally exchange little more than basic display information, there's no way to synchronise the card's frame output with the monitor's refresh rate. G-Sync and FreeSync solve this problem in the same way, although each uses slightly different hardware to do so.
^ Adaptive sync controls when your monitor refreshes (Nvidia diagram)
With G-Sync and FreeSync, the graphics card and monitor communicate with one another, and the graphics card controls the monitor's refresh rate. Your 60Hz monitor could become, say, a 49Hz, 35Hz or 59Hz screen, changing dynamically from moment to moment depending on how your graphics card is performing.
This eliminates both the stuttering introduced by Vsync and the tearing, because the monitor only ever refreshes when it has been sent a fully drawn frame. The effect is obvious and incredibly impressive, and it's particularly strong on mid-range machines with fluctuating frame rates. High-end machines benefit too, although not to the same extent.
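Continuing the hypothetical simulation from above, here's how the same fluctuating frame times play out when the refresh is driven by the graphics card instead. The 30Hz lower limit is an assumption for illustration; real panels each have their own supported range.

```python
# Adaptive sync: the panel refreshes when the GPU says a frame is ready, as long
# as the interval stays inside the panel's supported range (assumed 30-60Hz here).
MIN_INTERVAL = 1000.0 / 60    # the panel can't refresh faster than its 60Hz maximum
MAX_INTERVAL = 1000.0 / 30    # ...or wait longer than an assumed 30Hz minimum

last_refresh, refresh_times, forced_repeats = 0.0, [], 0
for f in frames:                                # refresh when each frame is ready...
    t = max(f, last_refresh + MIN_INTERVAL)     # ...but no sooner than the panel allows
    if t - last_refresh > MAX_INTERVAL:         # GPU slower than the panel's minimum rate,
        forced_repeats += 1                     # so the panel must refresh with an old frame
    refresh_times.append(t)
    last_refresh = t

# Every refresh now starts on a single, fully drawn frame, so tearing never occurs.
rates = [1000.0 / (b - a) for a, b in zip(refresh_times, refresh_times[1:])]
print(f"Adaptive sync: {len(refresh_times)} refreshes, rate varies between "
      f"{min(rates):.0f}Hz and {max(rates):.0f}Hz, {forced_repeats} forced repeats, 0 tears")
```

Because the refresh rate now tracks the card's output, the stuttering and tearing counted in the first sketch simply disappear as long as the frame rate stays within the panel's supported range.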
DISPLAYPORT DIVIDE
Nvidia was first to market with its G-Sync technology, with launch partners including AOC, Asus and Acer. The technology is impressive, but it has a major and expensive drawback: to be G-Sync compatible, a monitor needs dedicated G-Sync hardware, which adds around £75 to the price of any given model.
FreeSync, which is an AMD technology, uses the Adaptive-Sync standard built into the DisplayPort 1.2a specification. Because it's part of the DisplayPort standard decided upon by the VESA consortium, any monitor with a DisplayPort 1.2a input is potentially compatible. That's not to say it's a free upgrade; specific scaler hardware is required for FreeSync to work, but with multiple third-party scaler manufacturers signed up to make FreeSync-compatible hardware (Realtek, Novatek and MStar), pricing should be kept competitive. Furthermore, AMD has told us that there are no additional licence fees required from manufacturers in order to use FreeSync (hence 'Free').
While DisplayPort 1.2a is an open standard that can be used by anyone, Nvidia's latest 900-series graphics cards don't use it, with the firm saying it will continue to focus on G-Sync instead. Some monitor manufacturers are sticking with Nvidia for now, too. Acer, which has already announced a few G-Sync monitors, currently has no plans to launch DisplayPort 1.2a monitors, and Asus hasn't announced any 1.2a-compatible monitors either. AOC has told Expert Reviews that it plans to launch FreeSync-compatible monitors later this year.
Meanwhile, at this year's CES, FreeSync/DisplayPort 1.2a monitors were announced by LG, Samsung, Nixeus, BenQ and ViewSonic, with no fewer than seven new models due to launch this year. These include some seriously impressive panels, such as BenQ's 144Hz 2,560x1,440 model and a 31.5in Ultra HD panel from Samsung.
Manufacturer | Model | Size | Resolution | Refresh rate
BenQ | XL2730Z | 27in | 2,560x1,440 | 144Hz
LG Electronics | 29UM67 | 29in | 2,560x1,080 | 75Hz
LG Electronics | 34UM67 | 34in | 2,560x1,080 | 75Hz
Nixeus | NX-VUE24 | 24in | 1,920x1,080 | 144Hz
Samsung | UE590 | 23.6in, 28in | 3,840x2,160 (4K) | 60Hz
Samsung | UE850 | 23.6in, 28in, 31.5in | 3,840x2,160 (4K) | 60Hz
ViewSonic | VX2701mh | 27in | 1,920x1,080 | 144Hz
WHICH CARDS SUPPORT G-SYNC AND FREESYNC?
With such similar technology, your choice of monitor and graphics card may ultimately come down to your current situation. Older Nvidia cards and AMD cards (including APUs) can be updated to work with G-Sync and FreeSync monitors respectively, so your current setup may be ready without the need to buy a new graphics card.
AMD: According to AMD, the following existing GPUs will be able to use FreeSync for dynamic refresh rates in games (after a software update): the Radeon R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260. Other cards and chipsets will support FreeSync only for "video playback and power-saving purposes"; these include the Radeon HD 7000, HD 8000, R7 and R9 series cards, and APUs from the Kaveri, Kabini, Temash, Beema and Mullins lines.
Nvidia: Plenty of older Nvidia cards are compatible with G-Sync. The full list is as follows: GeForce GTX 980, 970, 780 Ti, 780, 770, 760, 750 Ti, 750, 745, 690, 680, 670, 660 Ti, 660, 650 Ti BOOST, TITAN and TITAN Black.
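If you'd rather not scroll back through those lists, here's the same information collected into a tiny Python lookup. The card names are taken straight from the lists above; the helper itself is just an illustrative convenience, not an official compatibility checker.

```python
# FreeSync for dynamic refresh in games, per AMD's list above
FREESYNC_GAMING = {"R9 295X2", "R9 290X", "R9 290", "R9 285", "R7 260X", "R7 260"}
# G-Sync compatible GeForce cards, per Nvidia's list above
GSYNC = {"GTX 980", "GTX 970", "GTX 780 Ti", "GTX 780", "GTX 770", "GTX 760",
         "GTX 750 Ti", "GTX 750", "GTX 745", "GTX 690", "GTX 680", "GTX 670",
         "GTX 660 Ti", "GTX 660", "GTX 650 Ti BOOST", "GTX TITAN", "GTX TITAN Black"}

def adaptive_sync_support(card: str) -> str:
    """Report which flavour of adaptive sync a card supports, per the lists above."""
    if card in FREESYNC_GAMING:
        return "FreeSync (dynamic refresh in games)"
    if card in GSYNC:
        return "G-Sync"
    return "Not on either gaming list (older Radeons may still get video-only FreeSync)"

print(adaptive_sync_support("GTX 970"))   # -> G-Sync
print(adaptive_sync_support("R9 290"))    # -> FreeSync (dynamic refresh in games)
```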
The most important thing to take away from this is that even fairly old mid-range cards from both AMD and Nvidia support adaptive sync, which means you don't need to buy a new card to reap the benefits of either technology. The line-up of monitors supporting FreeSync is already enviable (on paper, at least), but there are some great G-Sync monitors on the market right now from Acer, Asus (the Swift in particular is staggering) and AOC.
The problem, of course, is the incompatibility of the two systems. If adaptive sync is important to you and you're looking to buy a new graphics card or monitor, it's worth waiting to see how the market develops, how much new G-Sync and FreeSync monitors cost, and how different the two technologies turn out to be in practice. It's a confusing time for consumers, which is a shame because the technology itself is so incredibly useful.
If you're buying a brand-new graphics card and have an adaptive sync monitor in mind, AMD looks to be in a strong position. Adaptive sync benefits modest hardware the most, and those with mid-range cards will appreciate the lower prices that DisplayPort 1.2a monitors look set to have over G-Sync models. However, those monitors aren't available yet, and G-Sync monitors might take a tumble in price once there's some real competition on the market.
Our advice for now is to wait, as buying either a card or a monitor today will lock you into one system or another. Once the FreeSync monitors are on sale, we can then test everything as it would be used, and compare total prices - monitor and card - to see which is better value. Either way, the future is bright for adaptive sync technologies, and screen tearing in PC games should be a thing of the past very soon.
This article was updated on the 20th of January after contact with Nvidia, AMD and monitor manufacturers.