Monitor Input Lag – Testing with LDAT, Time Sleuth & 1000FPS Camera


When it comes to monitors, there are a few vital metrics we gamers care about. One is response time – how quickly a panel can change colour or brightness – and the other is input lag. Manufacturers quote response time figures – although they are often misleading – but they don't quote input lag, so as a reviewer it's really important that I test it and report on it in my reviews so you, the prospective buyer, know what you are getting.

Let’s take a quick step back though – what is input lag? Well, in a perfect world, the instant you click your left mouse button to fire a shot at an enemy in-game, that shot would instantly appear on your screen. OK, maybe accounting for the speed of electricity, which is roughly 90% the speed of light, and making a rough approximation of 5m of total wiring between your mouse, PC and monitor, it would appear in about 18.5 nanoseconds. Pretty fast then.
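If you want to sanity-check that wiring delay, it's a quick back-of-the-envelope calculation – 5 metres of cable with signals travelling at roughly 90% of the speed of light:

```python
# Rough propagation-delay estimate: ~5 m of total wiring between mouse,
# PC and monitor, signals travelling at ~90% of the speed of light.
SPEED_OF_LIGHT = 299_792_458  # m/s

wiring_length_m = 5.0
signal_speed = 0.9 * SPEED_OF_LIGHT  # m/s

delay_s = wiring_length_m / signal_speed
print(f"{delay_s * 1e9:.1f} ns")  # ~18.5 ns
```

In other words, the wires themselves contribute essentially nothing – everything that follows is where the real latency lives.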

Unfortunately, since your mouse runs on USB, the best case right now is one of the 8000Hz options from the likes of Razer or Corsair, where it can take up to 125µs for the USB controller to receive your input. Then your PC needs to process that input, then the graphics card needs to draw a new frame, which even in a game like CSGO with an RTX 3080 at 380FPS takes 2.6ms. Then, assuming you are not using a 360Hz monitor, the GPU likely has to wait until the monitor is ready to refresh, which at 144Hz can take up to 6.9ms – by which time the GPU will have drawn at least two more frames that are ever so slightly newer. Then, and only then, can the monitor process the image, converting the digital frame into voltages for each of the pixels and subpixels, then applying those voltages to change what is being displayed.
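Putting those stages together, here's a rough worst-case latency budget using the figures above. Treat it as a sketch – real pipelines overlap some of these stages, and it doesn't include the panel's own response time:

```python
# Hypothetical worst-case latency budget, using the figures from the text.
usb_poll_ms = 1000 / 8000   # 8000 Hz mouse: up to 0.125 ms between polls
render_ms   = 1000 / 380    # CSGO at 380 FPS on an RTX 3080: ~2.6 ms per frame
refresh_ms  = 1000 / 144    # 144 Hz monitor: up to ~6.9 ms waiting for scan-out

total_ms = usb_poll_ms + render_ms + refresh_ms
print(f"{total_ms:.1f} ms before the panel even starts responding")  # ~9.7 ms
```

And that's before the monitor's own processing and pixel transitions add their share on top.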

It’s a lot, I know. Now what I just described is total system latency, and is one of the two ways to measure a monitor’s input lag. The other is to cut out the PC and measure just the monitor on its own. While less of a ‘real world’ measurement, it provides an easy and direct comparison between monitors and takes any performance issues with the system out of the equation. Personally, I like to quote both figures in my reviews so you can get the best understanding possible before making a purchasing decision.

When it comes to total system input lag, there are a few different ways you can measure it. Potentially the most obvious is with a camera. Just recording at 30 or 60 FPS won’t give you very good results, as each recorded frame will be 33ms or 16.7ms apart – for measuring something that can be just a couple of milliseconds, that’s not very helpful. Recording at 240FPS is better, with a frame captured every 4.2ms, but that’s still not great resolution, so the usual go-to is 1000FPS. That means a frame captured every 1ms, and it’s what I use with my Sony RX100 Mark 5A – technically 960FPS, but I factor that into my measurements. It’s no good just spamming a mouse button and recording it though, because you won’t know when the mouse actually registers the click and sends it to the PC – hence why I soldered an LED directly to the switch.
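For reference, the measurement resolution at each capture rate is just the reciprocal of the frame rate:

```python
# Frame interval (measurement resolution) for each capture rate mentioned above.
for fps in (30, 60, 240, 960, 1000):
    print(f"{fps:>5} FPS -> one frame every {1000 / fps:.2f} ms")
```

At 960 or 1000FPS the error from the capture itself drops to around a millisecond, which is why high-speed cameras became the standard tool for this.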

That way works fine and gives reasonably accurate results. The trouble is that it’s time-consuming to record enough button presses to make it an accurate measurement, then to sit in Premiere cutting the clip up to work out how long each one took. On top of that, there’s a couple of milliseconds’ variance in when I decide the button has been pressed and when I count it as being ‘registered’ on screen. Not a deal breaker, but there is a better way.

As ApertureGrille showed in his video on input lag – one you should definitely go watch, by the way; I’ll leave it in the cards above – you can use an LDR, or light-dependent resistor (my A Level Electronics is finally paying off…), with an Arduino to measure brightness changes on screen. He made a UE4 project that changes the entire display colour from black to white on a keypress, and has the Arduino repeat that a load of times and record the results. That’s great, but very much home-made, and while it’s more accurate for testing the monitor itself, it’s not the same ‘real world’ measurement that testing in a production game would give – it’s not what you, the end user, would actually experience.
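The core idea behind that kind of rig is simple: sample the light sensor at a fixed rate from the moment the keypress is sent, and record how long it takes the brightness to cross a threshold between the black and white levels. Here's a hypothetical sketch of that logic in Python (the sample data, rate and threshold are made up for illustration – the real thing runs on the Arduino):

```python
# Hypothetical sketch of the LDR method: given brightness samples taken at a
# fixed rate starting at the keypress, latency is the time of the first sample
# that crosses a threshold between the black and white levels.
def latency_ms(samples, sample_rate_hz, threshold):
    for i, level in enumerate(samples):
        if level >= threshold:
            return i * 1000 / sample_rate_hz  # time of first over-threshold sample
    return None  # screen never changed

# Made-up 10 kHz readings: dark panel, then a transition to white.
readings = [5, 5, 6, 5, 40, 120, 200, 250, 255, 255]
print(latency_ms(readings, 10_000, threshold=128))  # first crossing -> 0.6 ms
```

The measurement resolution is then set by the sampling rate rather than a camera's frame rate, which is what makes the sensor approach both faster and more precise.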

So where am I going with all this? Well, NVIDIA have a tool they use internally to test exactly this. It’s called LDAT, short for Latency & Display Analysis Tool, and it has a little LDR on the bottom, some elastic to strap it to a monitor, and a little button you use to start the test. This is the LDAT V2, so unlike the model you may have seen Linus showcase last year, this one no longer requires a mouse to be connected, although using this two-pin plug you can connect one if you want to. Now, using their software, you can automate as many shots as you like, log the results, and instantly see a graph displaying concentrations of results.

There are a few funky ways you can use this, including having it trigger via sound, but for now I’ll focus on using it the “standard” way with the sensor on the bottom. Now I’ve got this AOC 24G2U – the original version no less – and I’ve got LDAT positioned in the middle of the screen, as I’m using CSGO to test here. LDAT will capture the muzzle flash and report the latency from that. I’ve got it set to fire 20 shots at 0.2 second intervals. Let’s have a go.

Right, so LDAT has recorded the latency of those shots, and you can see the graph shows how many shots fell within each 5ms range. It also shows the average, standard deviation, minimum, maximum and shot count (n), which are handy for plotting on graphs to compare monitors. This one is reporting a hair over 25ms on average, which is about right from my previous testing – but this only took a minute to set up and run, rather than easily 30 minutes, if not an hour, with the camera and mouse, and I can vouch for the accuracy of this data more than before.
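Those summary statistics are easy to reproduce yourself if you log raw per-shot latencies – here's a quick example with made-up shot data, just to show what the tool is computing:

```python
# The same summary statistics LDAT reports, over made-up latencies (in ms).
from statistics import mean, stdev

shots = [24.1, 25.3, 26.0, 24.8, 25.6, 25.2, 24.9, 25.7, 25.4, 25.0]
print(f"n={len(shots)}  avg={mean(shots):.2f} ms  std={stdev(shots):.2f} ms  "
      f"min={min(shots)}  max={max(shots)}")
```

The standard deviation is arguably the most interesting number of the lot – a monitor with low average latency but wild shot-to-shot variance will still feel inconsistent.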

You don’t have to rely on muzzle flashes either – games like Fortnite, with NVIDIA Reflex built in, have a latency flash option that flashes a white box on the left side of the screen when the game receives a left click, so you can use that instead if you’d rather.

Thanks for coming with me on this journey of explaining input lag, testing it, and NVIDIA’s sweet new tool. I hope it’s been interesting and maybe you learned something. If you want to see more videos like this, or monitor reviews for that matter, hit that subscribe button and the bell notification icon.