OSRTT – Measuring Monitor Response Times & Latency
This is the open source response time tool, or OSRTT for short. It’s primarily a monitor response time tester, although you can do input latency testing with it too. It’s now in the hands of a pretty crazy number of reviewers, companies and enthusiasts around the world – people like Linus, Kitguru, Techtesters, CNN Underscored, NYT Wirecutter and so, so many more. I’m honestly blown away by the reception, and I’m incredibly happy to see it being put to good use literally the world over. Since the videos I’ve done thus far have been a bit spread out, I thought it’d be good to do a condensed, hopefully easy-to-understand video to give you a rough idea of what this thing is all about.
First up, it’s literally in the name: response times. Monitor response times are a complicated concept, especially for someone without an in-depth knowledge of how liquid crystal displays work. In short, what we’re measuring is how fast the panel can change colours – or, more precisely, brightness rather than colour. The faster the change, the sharper and more accurate the image on screen will look. Too slow and you get “ghosting”, where previous frames are still visible on screen at the same time as the current one. Conversely, driving the pixels too hard can cause them to “overshoot”, letting too much (or too little) light through for a short period and extending the time it takes for the frame to visibly stabilise.
You measure response times by capturing how much light is being emitted over a transition – in my case, over 30 of them. You can then plot those results in a heatmap table to visualise how good or bad a panel is at changing brightness, and therefore colour. To actually measure those figures, though, for a number of reasons you can check out in the much more detailed video in the cards above, you want to use a reasonable tolerance level, effectively trimming off some of the transition. I personally use and recommend an RGB 5 tolerance, although there are plenty of other valid options too.
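To make the tolerance idea concrete, here’s a minimal Python sketch of how a single transition’s response time could be measured from sampled light levels. The function name, the sample format and the simple threshold-crossing logic are my own illustrations, not OSRTT’s actual processing code; the point is just that an RGB 5 tolerance trims a few steps off each end of the transition before timing it.

```python
# Hypothetical sketch: time one brightness transition with a tolerance band.
# `samples` is a list of (time_ms, light_level) pairs; levels are on an
# RGB 0-255 scale. This is an illustration, not OSRTT's exact algorithm.

def response_time(samples, start_level, end_level, tolerance=5):
    """Return the time spent between the trimmed start and end levels."""
    direction = 1 if end_level > start_level else -1
    # Trim `tolerance` RGB steps off both ends of the transition.
    lower = start_level + direction * tolerance
    upper = end_level - direction * tolerance

    t_start = t_end = None
    for t, level in samples:
        if t_start is None and (level - lower) * direction >= 0:
            t_start = t  # signal has left the start tolerance band
        if (level - upper) * direction >= 0:
            t_end = t    # signal has entered the end tolerance band
            break
    if t_start is None or t_end is None:
        return None  # transition never completed within tolerance
    return t_end - t_start

# Example: a 0 -> 255 transition sampled every 1 ms.
trace = [(0, 0), (1, 2), (2, 40), (3, 120), (4, 200),
         (5, 248), (6, 251), (7, 255)]
print(response_time(trace, 0, 255))  # times the move from level 5 to 250
```

With this trace, the clock starts at the sample that first clears level 5 and stops at the one that first reaches level 250, so the slow tails at either end don’t inflate the number.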
In this case, the light sensor, a Melexis MLX75305, turns the light input into a voltage output that the onboard microcontroller captures and sends to the desktop program to be processed. The program runs the test multiple times and averages the results, rejecting any outliers along the way, then loads up the results viewer to show you the relevant heatmaps. If you want to check out the raw data in graph form – and see where it thinks the transitions start and end – you can hit the “Raw Data Graphs” button. It’s a fully interactive graph, so feel free to zoom, or manually measure anything by dragging the edges of the block around!
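The averaging step described above can be sketched like this. The median-based cutoff here is an illustrative outlier rule of my own choosing, not necessarily the one OSRTT’s processing actually uses; it just shows the shape of the idea: run the test several times, throw away results that are wildly off, and average the rest.

```python
# Hypothetical sketch of averaging repeated runs of one transition,
# rejecting outliers first. The 0.5x-median cutoff is illustrative only.
from statistics import mean, median

def average_runs(run_results_ms):
    """Average repeated measurements, dropping results far from the median."""
    med = median(run_results_ms)
    kept = [r for r in run_results_ms if abs(r - med) <= 0.5 * med]
    return mean(kept)

runs = [4.1, 4.3, 4.0, 9.8, 4.2]   # one run glitched to 9.8 ms
print(round(average_runs(runs), 2))  # the 9.8 ms outlier is rejected
```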
One important thing to note is that the program saves all the raw and processed data in a folder per test. This is so you can easily refer back to the raw data, and even re-process it in future with any other methodologies you like. This is part of the openness I really like about the tool: everything is open source and freely available. Nothing is hidden away.
On the flip side, you can test input lag with this too. It’s less suited to that natively, but it works just fine. I’ve built a dedicated mode for it, and I’ll be pushing a pretty major update just for this fairly soon. But first it’s worth knowing what “input lag” actually is – and what it isn’t. Input lag, when talking about a monitor, generally means the “on-display” latency: the time from a new frame arriving at the input to it being shown on screen. A load of you might know it a bit differently though, as “click to photon” or “total system latency”, the more real-world measurement. The trouble is that while that measurement is more real world, it’s a little too real world: it depends on the specific system used for testing, which game, and even which settings, so it’s not an overly useful metric for reviewers to quote without adequate context. For example, leaving VSYNC enabled can add two or three frames’ worth of latency thanks to double or triple buffering.
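The VSYNC point is easy to put numbers on: each buffered frame adds roughly one refresh interval of latency. A quick back-of-envelope calculation, with illustrative numbers:

```python
# Back-of-envelope for the VSYNC point: each buffered frame adds about
# one refresh interval of latency. Numbers here are illustrative.
def vsync_penalty_ms(refresh_hz, buffered_frames):
    frame_time_ms = 1000 / refresh_hz
    return buffered_frames * frame_time_ms

print(round(vsync_penalty_ms(60, 2), 1))  # double buffering at 60 Hz
print(round(vsync_penalty_ms(60, 3), 1))  # triple buffering at 60 Hz
```

At 60 Hz that’s roughly 33 ms to 50 ms of extra latency, which is exactly why an uncontrolled “total system latency” figure can swamp the display’s own contribution.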
With enough information – specifically, how long the USB polling delay is, plus the frame render time – you can work out how long the display then took to actually process that image, which is what I’m working on with the new feature update for OSRTT. A few people have asked if I’ll be implementing NVIDIA LDAT style in-any-game testing, and for the time being the answer is no. I’m only one person, not only running this YouTube channel but a second channel dedicated to car videos and a global link building platform, and doing literally everything from hand soldering the units to writing all the code, all while being mentally and physically broken. So I’m afraid that’s just not something I can do right now, although it’s not something I’m against at all, and in due time I’d very much like to make it happen.
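The subtraction described above is simple arithmetic: take the total measured latency and remove the parts you can account for, and what’s left approximates the display’s own processing time. A sketch, with made-up numbers – the function name and values are mine, not OSRTT’s actual output fields:

```python
# Sketch of isolating on-display latency: subtract the known USB polling
# delay and frame render time from the total measured latency.
# All names and numbers here are illustrative, not OSRTT's real output.

def on_display_latency_ms(total_ms, usb_poll_ms, render_ms):
    """Approximate the display's own processing time."""
    return total_ms - usb_poll_ms - render_ms

# e.g. 24 ms total, 0.5 ms USB polling delay, 6.9 ms render time
print(round(on_display_latency_ms(24.0, 0.5, 6.9), 1))
```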
That note also extends to any bugs you may run into, or the delay between ordering a unit and getting it shipped to you. I’m hand-wrapping each unit, printing the shipping labels on my home printer, the whole thing, so please bear with me. I’m not a company, and there’s no support staff – it’s just me. With that said, if you do want to pick up an OSRTT unit for yourself – or your company – I’ve built a store page just for that. It’s OSRTT.com, so head there and pick one up. You can also leave your email address to be kept up to date with the project. I won’t be sharing it with anyone, and trust me, I will not be sending you a barrage of emails – anyone who’s emailed me thus far knows I’m terrible at actually sending anything, so no worries there. On that note, something on the horizon I’m incredibly excited to share is going to be a really useful tool for anyone looking to buy a monitor, so do stay tuned!
So that’s pretty much… oh, actually, there is one more thing… Meet the OSRTT Pro. This is still very much in development, so don’t expect it for a little while yet, but believe me when I say this is a massive upgrade in usability and features. The standard OSRTT model is limited to around 160 nits or so of peak brightness, and you pretty much have to hit that, whereas the Pro can range from around 80 nits to… honestly, I’m not actually sure. My napkin maths says over 2,000 nits… so, plenty. It also uses a much more accurate, faster-responding sensor arrangement, which means it’s even better at measuring lightning-fast panels like OLED and MiniLED options. Oh, and it even has an OLED of its own! So again, sign up to be notified when this beast is available – that’s OSRTT.com.