My LATENCY TOOL just got WAY BETTER!

The Open Source Latency Testing Tool I’ve been building for almost a year now has just had a major update – something I’ve been working on for MONTHS – and I’m really excited to show you what’s new. I have not one, not two, NOT EVEN THREE, BUT FOUR NEW FEATURES to show you! Yeah, I’ve been busy…

The first feature is the most obvious, and is something I already hinted at including: the ability to pre-test your system’s latency and subtract it from game results. This helps isolate the game from your system, which matters especially if you’re a reviewer who’s going to be quoting these figures – you’re reporting the game’s results, not just your system’s. Of course things like game settings and the sort of performance you’re getting will affect the latency too, but it gives you the option. Now it isn’t perfect, as all light-based measurements are subject to the monitor’s refresh rate, and what I’m doing is subtracting the average on-display latency from each game result. There will still be some variability from the display’s refresh window, but it should give you a good idea of what’s going on. Let me show you how it works.
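The correction itself can be sketched in a few lines. This is a minimal illustration of the idea described above, not the tool’s actual code – the function names and sample numbers are mine:

```python
def apply_pretest(game_results_ms, pretest_results_ms):
    """Subtract the average pre-test (system) on-display latency
    from each game result, isolating the game's contribution."""
    baseline = sum(pretest_results_ms) / len(pretest_results_ms)
    return [round(r - baseline, 2) for r in game_results_ms]

# Hypothetical readings: pre-test run against the ~1000 FPS DirectX window,
# then a handful of in-game samples
pretest = [4.1, 3.9, 4.0]   # averages to 4.0 ms of system latency
game = [28.5, 31.2, 26.9]
print(apply_pretest(game, pretest))  # each result minus the 4.0 ms average
```

Because the average is subtracted rather than a per-sample measurement, the refresh-window variability mentioned above still shows up in individual results – only the systematic offset is removed.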

On the desktop app you’ll find two new buttons for this – the pre-test toggle in the test settings, and the run pre-test button up top. If you want the pre-test data to be included in your game testing, you’ll need to hit the run pre-test button first – BEFORE OPENING ANY OTHER GAMES – strap the sensor to the display, and once the DirectX window opens, wait for the FPS counter to read around 1000 FPS, then hit the button on the sensor. Let it run, and once it closes you’ll see the status message at the top now says “Pretest data saved”. Now you can switch the pre-test toggle on, launch your game of choice, and test away. You can still use F10 to start and stop the test, or click the button instead. Once you’ve tested your game, you’ll get the results viewer with all the data you’ve collected, and you’ll see the on-display latency has had the system latency removed.

Sticking with game testing, one of the most requested features was the ability to swap from mouse clicks to mouse movement. The idea here is that a mouse click, say for firing a gun, often includes an animation delay, which can add somewhat artificial latency to your inputs. Also, not all games have an obvious mouse click action that registers as a significant change for the light sensor, so why not just move the mouse? Now you do need to be a bit careful with this one, as you’ll need to line the camera up with a sharp edge, ideally something dark, with something considerably brighter next to it that the camera can move to. As an example, I use the training mode in Rainbow Six Siege, lining up the camera with the edge of this dark box, and I’ve set the mouse action to move left to the much brighter wall there. Then I just hit the button and let it take as many samples as I fancy. One thing you might notice is how little the camera moves. As much as I’d like this to be controllable, this is the maximum movement the board can do. If you find you need more of a difference, head to your game’s settings and bump the mouse sensitivity up until you’re happy with the amount of motion. You’ll find the action setting in the test settings area, with three options: left click, move the mouse left, or move the mouse right.

As a side note, I technically added an extra feature here: the processing now works for light-to-dark transitions too, so if it ends up being easier for you to test looking at a light area and moving to a dark area, that’s now going to work fine!
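The direction-agnostic detection boils down to comparing against the baseline with an absolute difference. Again, this is just an illustrative sketch with made-up sensor values and my own naming, not the actual processing code:

```python
def transition_detected(baseline, sample, threshold):
    """True once the light sensor reading moves far enough away from
    the baseline, in EITHER direction (dark-to-light or light-to-dark)."""
    return abs(sample - baseline) >= threshold

# dark box -> bright wall (dark-to-light)
print(transition_detected(baseline=120, sample=900, threshold=300))  # True
# bright wall -> dark box (light-to-dark)
print(transition_detected(baseline=900, sample=120, threshold=300))  # True
```

A signed comparison (`sample - baseline >= threshold`) would only ever catch the dark-to-light case; wrapping it in `abs()` is what makes both directions work.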

Ok, moving away from game testing to mouse and keyboard testing – this has been quite the ride for me. What started as an innocent suggestion led to a full month of testing, programming and confusion. I’ll stick a timecode on screen if you want to jump ahead of the technical stuff, but this has given me too many grey hairs not to explain. In short, the way this test works is relatively simple. The board reads analogue data from the microphone, waiting to hear a sharp spike. There’s a whole detection scheme I set up for that – take a baseline measurement, then add eighty percent of the difference between the baseline and the maximum possible value – and once it ‘hears’ a sound loud enough to get past that threshold, it enters a loop waiting for the desktop program to confirm it received a click or keypress. The desktop app has a low-level mouse or keyboard hook running, so as soon as Windows receives the click, it fires a message back to the board.
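The threshold calculation described above is worth seeing written out. This is a sketch under my own assumptions – I’m assuming a 12-bit ADC (so a maximum reading of 4095), and the names are illustrative, not the firmware’s:

```python
ADC_MAX = 4095  # assumed 12-bit analogue reading; full scale = maximum value

def mic_threshold(baseline):
    """Baseline plus 80% of the headroom between the baseline and full scale."""
    return baseline + 0.8 * (ADC_MAX - baseline)

def heard_click(baseline, sample):
    """True once a microphone sample is loud enough to pass the threshold."""
    return sample >= mic_threshold(baseline)

print(mic_threshold(95))      # 95 + 0.8 * 4000 = 3295.0
print(heard_click(95, 3400))  # True: loud enough to trigger
print(heard_click(95, 2000))  # False: below the trigger level
```

Scaling the threshold off the measured baseline rather than hard-coding a level means the trigger adapts to ambient noise: a louder room raises the baseline, which raises the bar a click has to clear.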

Now that includes some of the changes I’ve made, namely the low-level hooks. Previously I was using the WinForms mouse down or key down events, which turned out to be a little unreliable, especially mouse down. I ran tests where the board itself did the clicking, then waited for the response, and I was getting results all over the place. Like, six-millisecond swings kind of mad. Here’s the graph for that. Yeah, not good. Swapping to the low-level hooks solved that part. The other problem was that the serial bus had a hard floor of one millisecond. That’s the smallest timeout I could use, which meant every result was at least one millisecond – not great for anything faster than 1000 Hz peripherals. Happily I solved that problem too, or at least improved it greatly. It turns out there is an Arduino serial function called “readBytesUntil”, which exits as soon as its terminator condition is met, skipping the rest of the timeout window. So for the click test, I now use that to wait for the single char to come back, and that now returns well under one millisecond. I’m not sure you’ll get a 0.125 ms result even with an 8000 Hz mouse, but it’s going to be a lot closer now, and is about as good as I can get without significant hardware changes.
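To make the readBytesUntil win concrete, here’s a Python sketch of the same idea – not the Arduino code itself, just a simulation of “return as soon as the terminator byte arrives instead of sitting out the full timeout window”:

```python
import time

def read_until(next_byte, terminator=b"\n", timeout_s=0.001):
    """Poll for bytes until the terminator shows up or the timeout expires.
    next_byte() returns one byte, or None if nothing is waiting yet."""
    deadline = time.monotonic() + timeout_s
    buf = bytearray()
    while time.monotonic() < deadline:
        b = next_byte()
        if b is None:
            continue
        if b == terminator:
            return bytes(buf)  # early exit: skip the rest of the timeout window
        buf += b
    return bytes(buf)          # timed out with no terminator seen

# The ack byte is already waiting, so this returns almost immediately
queue = [b"k", b"\n"]
start = time.monotonic()
data = read_until(lambda: queue.pop(0) if queue else None)
print(data, time.monotonic() - start)  # b'k', and far less than the 1 ms timeout
```

A plain fixed-timeout read would always burn the full millisecond; exiting on the terminator is what lets the round-trip measurement drop below it.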

So, in short, the test is now significantly more accurate, uses a low-level mouse or keyboard hook, and can report sub-millisecond results. Using the new method is pretty much the same process as before, although you’ll notice there are now separate entries for mice and keyboards in the test source selection – if I registered both low-level hooks at the same time, it would likely mess with the results, so I’ve separated them out. Once you start the test you’ll notice the text box is gone. That’s because the low-level hook doesn’t need you to actually save what you’re typing anywhere; it just registers the key down event. Technically you don’t need my desktop app focused, but it’s probably best to leave it selected. When you get the results, you’ll notice they now have decimal places, and if you have a fast enough mouse, you should see sub-millisecond results.

The final feature is part of the same peripherals test – if you want the ultimate accuracy, you can now solder the two-pin fly leads directly to your peripheral’s switch and use the two-pin input to trigger the test. This replaces the microphone as the trigger, taking its signal from the switch itself rather than listening for the sound of you clicking. If I’m honest I don’t expect many to do this, but it’s there if you want it!
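Conceptually the two trigger sources slot into the same test loop. This is a hypothetical sketch with my own naming – it assumes the switch pulls the input pin low when pressed, which is a common wiring but not confirmed by anything above:

```python
def triggered(source, mic_sample, mic_threshold, pin_level):
    """Start the timing run from either trigger source:
    - "two_pin": the switch contacts pull the input pin low when pressed
    - "microphone": a sample crosses the computed loudness threshold"""
    if source == "two_pin":
        return pin_level == 0           # direct electrical signal from the switch
    return mic_sample >= mic_threshold  # acoustic trigger

# Direct wiring: pin goes low the instant the contacts close
print(triggered("two_pin", mic_sample=0, mic_threshold=3295, pin_level=0))        # True
# Microphone path: needs the click to be loud enough
print(triggered("microphone", mic_sample=3400, mic_threshold=3295, pin_level=1))  # True
```

The appeal of the two-pin path is that it removes the acoustic link entirely – there’s no sound propagation or threshold detection between the switch closing and the timer starting.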

I think that’s about it for now. I expect you’ll have plenty of questions and suggestions, so please jump on our Discord or leave them in a GitHub issue and I’ll do my best to get back to you. This update should be live already, so if you’ve already got a unit it should prompt you to update the next time you open the desktop app – both the desktop app itself and the board’s firmware. If you don’t have a unit yet, head to OSRTT.com/osltt and pick one up. I still have a handful of units left from the second batch, and I’ll be building another wave soon too.