Freesync vs G-Sync Input Lag Test

Adaptive refresh rate technology is nothing new. VESA officially adopted it for DisplayPort 1.2a in 2014; NVIDIA had already launched G-Sync with its dedicated hardware module in late 2013, while AMD followed with FreeSync in 2015, essentially sticking a label on VESA’s standard and making it work with its GCN 2.0 architecture GPUs. The way it works hasn’t changed much either: in principle, the monitor doesn’t refresh what’s on screen until a new frame is drawn and ready, which helps eliminate tearing – the visible ‘tear’ line you get when one frame is only half drawn before the next one starts.
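
To make that principle concrete, here’s a minimal sketch in Python – the frame times are invented, and a real panel’s minimum refresh window is ignored for simplicity – showing how a fixed refresh makes a finished frame wait for the next scan-out, while adaptive sync shows it as soon as it’s ready:

```python
import math

REFRESH_HZ = 144
CYCLE_MS = 1000 / REFRESH_HZ  # ~6.94 ms between scan-outs at 144 Hz

# Hypothetical times (ms) at which the GPU finishes rendering each frame.
frame_ready_ms = [3.0, 11.5, 16.2, 25.9, 33.1]

for ready in frame_ready_ms:
    # Fixed refresh: the frame sits in the buffer until the next scheduled scan.
    fixed = math.ceil(ready / CYCLE_MS) * CYCLE_MS
    # Adaptive sync: the monitor starts a new scan the moment the frame is done.
    vrr = ready
    print(f"ready {ready:5.1f} ms | fixed shows at {fixed:5.1f} ms "
          f"(+{fixed - ready:4.2f}) | VRR shows at {vrr:5.1f} ms")
```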

Both NVIDIA and AMD have added new features over the years, namely HDR with adaptive sync enabled, and both have tweaked and refined the tech. But there’s one question a lot of people shopping for these monitors are asking – how does it affect input lag? Especially for high-FPS competitive games, are you better off leaving VRR disabled, or does it make no difference? Well, I’ve got this Gigabyte G27Q, a 1440p 144Hz IPS monitor with FreeSync, and this, the Asus PG259QNR, a 360Hz 1080p 25” monitor with G-Sync Ultimate AND NVIDIA’s Reflex Latency Analyzer built in – plus NVIDIA’s LDAT tool, a high-speed camera, and clearly too much time on my hands.

Let’s start off with FreeSync, as that’s generally the more common option these days, plus FreeSync monitors can now qualify as “G-Sync Compatible”. So, what’s the plan? Well, I’m using my under-desk PC, which is now rocking a GTX 1080 Ti, I’ve got CSGO open running at over 600FPS, and I’ve got LDAT’s sensor positioned over the flash from the bullet hitting the metal sign. It’ll fire 20 shots at 0.3 second intervals, and I’ll do that with adaptive sync both enabled and disabled.
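
The analysis side is simple once LDAT hands over the per-shot numbers. Here’s a rough sketch of how a run gets crunched – the latency values below are invented, and a real run has 20 shots, but the averages, spreads and margin-of-error check work the same way:

```python
from statistics import mean

# Hypothetical per-shot click-to-photon latencies in ms (real runs use 20 shots).
vrr_on = [38.2, 40.1, 37.5, 39.0, 38.7, 37.9, 39.4, 38.6]
vrr_off = [38.0, 39.2, 38.9, 37.7, 38.4, 39.8, 37.6, 38.5]

for label, run in (("VRR on", vrr_on), ("VRR off", vrr_off)):
    print(f"{label:7s}: avg {mean(run):.3f} ms, min {min(run):.1f} ms, "
          f"max {max(run):.1f} ms")

# Anything smaller than the run-to-run spread is noise; ~5 ms is my threshold
# for a difference you could plausibly notice.
delta = abs(mean(vrr_on) - mean(vrr_off))
print(f"difference: {delta:.3f} ms -> {'plausible' if delta > 5 else 'noise'}")
```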

The results? Remarkably, there is no difference. Really. VRR on nets 38.679ms of input latency, versus 38.599ms with VRR off. Even looking at the minimums and maximums it’s well within the margin of error, and tracing each shot on a graph doesn’t reveal much either. At least on this monitor, with an NVIDIA GPU, there doesn’t seem to be much if any difference in input lag with FreeSync Premium enabled or disabled.

I also tried disabling it on the monitor itself, rather than through software, which came back at a 36.6ms average – but that’s close enough to fall within the variance I’ve seen from run to run. I’d want to see over 5ms before calling it a plausible difference, and one you’d have a hope in hell of noticing or truly benefiting from.

So that’s FreeSync, what about NVIDIA’s shiny new G-Sync Ultimate? That still uses a dedicated G-Sync module, rather than the standard VESA-spec scaler you’ll find in FreeSync options like this. G-Sync Ultimate, much like FreeSync Premium Pro, offers HDR with variable refresh rate enabled, and is rated to hit the astonishing 360Hz this Asus PG259QNR can reach. Full review of this one coming very soon, by the way, so stay tuned!

So, how does this one compare? Well, it’s rather interesting. This is the exact same LDAT test in CSGO, running at the full 360Hz refresh rate. With G-Sync off it averaged 26.3ms of input lag, whereas with G-Sync on it averaged 28.865ms. Interestingly, this setup was much more consistent than the FreeSync monitor, so the difference is more certain. It’s still not a significant or noticeable gap – it’s about one frame at 360Hz, which is insanely quick. I should note the minimums and maximums were better with G-Sync off too, with the maximum being almost 10ms faster.
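
For context on why that gap is “about one frame”: frame time is just the reciprocal of the refresh rate, so a quick sanity check looks like this:

```python
# One frame at 360 Hz is under 3 ms, so the 28.865 - 26.3 = ~2.57 ms gap
# between G-Sync on and off is roughly a single frame of extra latency.
for hz in (60, 144, 360):
    print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms per frame")
```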

But that’s not the whole story. See, this PG259QNR has a display mode called “G-SYNC esports”, and I couldn’t not test that too. With G-Sync off the results were almost identical, at 26.349ms average latency, essentially indistinguishable from the standard G-Sync off result – but remarkably, with G-Sync on it ran just 24.6ms of average latency. Yes, almost 2ms FASTER with G-Sync on than with it off, and over 4ms faster in this mode than in any of the other display modes. It also had the tightest grouping, with just a 16ms spread between min and max compared to almost 18ms with G-Sync off.

At this point, you might be wondering why I’m testing at the maximum FPS possible, well above both monitors’ refresh rates. Well, this is what you’ll experience in the kind of fast-paced shooter where input lag really matters. But I get it, some games aren’t as lightweight as CSGO, so let’s test them both again with CSGO capped to 100FPS (via its fps_max console command). This isn’t perfect, but it should give a rough idea of what running below each monitor’s maximum refresh rate looks like.

Starting with FreeSync, with it disabled it produced an almost identical result to the original run at 38.827ms average, albeit with a much higher peak of nearly 90ms versus 53ms – that’s what happens when a frame doesn’t line up with the fixed refresh cycle and has to wait, or doesn’t get drawn at all. With VRR enabled, the average actually drops to just 33.27ms. That’s potentially significant enough to be a noticeable, or at least functional, improvement while gaming.
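
A rough sketch of the arithmetic behind that peak – this ignores render and pipeline time, which is why the numbers are smaller than the measured ones, but it shows where the extra waiting comes from when the frame rate sits below a fixed refresh:

```python
REFRESH_HZ = 144
cycle_ms = 1000 / REFRESH_HZ  # ~6.94 ms per scan-out at 144 Hz

# Without VRR, a frame that finishes just after a scan starts waits almost a
# full cycle before it's shown; a skipped frame pushes the visible update out
# by another whole cycle on top of that.
for missed_cycles in (1, 2, 3):
    print(f"{missed_cycles} missed cycle(s) -> up to "
          f"{missed_cycles * cycle_ms:.2f} ms of added wait")

# With VRR the panel scans out when the frame is ready, which is consistent
# with the 100 FPS average falling from ~38.8 ms to ~33.3 ms here.
```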

What about G-Sync? With it off, that ran pretty similar to its stock run too, at 26.7265ms average, with only slightly worse maximums. G-Sync on? 26.662ms average. Yeah... no difference. The minimums and maximums are a little worse, with the maximum a full 10ms slower, but I wouldn’t call that a significant disadvantage. This was in the G-SYNC esports mode too, so it’s interesting to see it not improve much over having it disabled.

So, rather conclusively, if you have a G-Sync Ultimate monitor with a dedicated G-Sync mode... use it. It’s legitimately better, and running 4ms faster on average is enough to call it a decent improvement. But how does G-Sync stack up against FreeSync? Well, in the max-FPS results, even the worst case with G-Sync on but outside the dedicated mode was still around 8ms faster than the best result from the G27Q. But these monitors aren’t exactly comparable. This is a 1080p 360Hz monster, whereas this is a more budget-friendly 1440p 144Hz option, so odds are Gigabyte didn’t go all out on the fastest possible scaler for this model, while the Asus is the very top of the line and costs well over double what the Gigabyte does.

With that said, the fact that this G-Sync mode can actually decrease input lag with variable refresh rate enabled is clearly impressive, and shows a technological advantage to having a dedicated module, even if it means spending a hell of a lot more. That’s not to shame FreeSync at all. It’s fantastic to know it has no sizable impact on your gaming experience whether enabled or disabled, and at lower frame rates it might actually improve things – neither technology will meaningfully hurt your performance.

Of course, these results won’t necessarily speak for every monitor, game or setup. Your results will vary, so take this with a pinch of salt and treat it as a good sign, but far from the complete picture. Different GPUs – hell, even CPUs in some games – and in-game settings can swing these numbers, and both of these monitors are pretty new, so older versions of FreeSync and G-Sync may perform differently.