What Makes A Gaming Monitor Good?
Is this monitor good? What about this one? Or this one? Or this one? Or this one? Or this one? Or…. How would you find out? I mean the obvious way is reviews like mine, right? But those reviews are chock-full of tests and technical jargon that I know I'm guilty of not explaining well – I mean what does a "Delta E average of under two" even mean?? Sure, you can probably infer from the excited tone of my voice and the smile on my face when I say it that it's a good thing, but if you don't engage with monitor reviews on a regular basis, you won't have a good frame of reference to know what's truly good or not. So, by the end of this video you'll have a foundational understanding of what makes a monitor good, so that even just looking at the specs you can get a proper idea of what you're looking at, and when watching or reading reviews like mine you'll get the most out of them. Without further ado, let's get into it, starting with the specs.
Specs come in two forms: observable facts and measured values. Observable facts are the basics, like this is a 27 inch 1440p QD-OLED panel. Technically it's something like 26.5 inches, but still. This one is a 32 inch 1440p IPS panel. Those are observable facts. Stuff like "1000 nits of peak brightness" or "1ms grey-to-grey response time" are measured values. Those are performance figures. To put it in car terms, the observable facts of my car are that it has a 3 litre supercharged V6 engine with a 7 speed dual clutch gearbox. The measured values are the power, at 333 horsepower, and a 0-62 time of 5.1 seconds. Those are things the manufacturer says are true, but they need to be independently verified before being believed. Knowing what is a hard fact and what's a measured value is really important. Let's use this monitor's Amazon listing as an example. So, 500Hz? Fact. 0.03ms? Measurement. Adaptive Sync? Fact (that occasionally needs verifying). DisplayHDR True Black 500? This is a certification, so while it's really a measured value, you can treat it as fact. Image contrast ratio? Measured value. Screen surface? Fact. Hopefully that's helpful.
Since those observable facts tend to speak for themselves, let's instead work through those measurements, the sort of stuff I, and other monitor reviewers, include in our reviews. Let's start with colours – or more specifically you can think of this as the quality of light the monitor outputs. That means what colours, how bright they are, the contrast between the brightest and darkest, and the uniformity (evenness) of them. A number of these metrics are often touted, either on the retailer's page, or on the specs page for the monitor. In fact, I've already mentioned a few – brightness and contrast – so let's start there. Brightness is generally measured in a value called "nits". "Nits" is actually "candelas per square metre", and a "candela"… yeah, I know it's already a bit much… is the SI unit for luminous intensity. Think of this like pounds versus PSI. Pounds is a weight; PSI is pounds per square inch, a pressure. Nits is the same idea: light intensity spread over an area. Anyway, 100 nits is pretty dim. Amazingly, 80 nits is the standard for the sRGB colour space (we'll come onto that in a minute), but yeah, 100 nits is dark. 250-300 nits is generally pretty usable, especially indoors without direct sunlight. You'll only need more than that if you've got a sun-facing window behind you, then you'll want 500 at a minimum. 1000 nits is very bright. The key thing to know, though, is how the monitor gets to that brightness. This QD-OLED lists itself as a 1000 nit monitor, but it'll only hit 1000 nits in a very small window size (so only a small part of the display actually gets that bright) and only in its HDR mode. In regular usage, not in HDR, it runs at 300 nits.
The other brightness metric you'll often hear about is the contrast ratio. This is the difference between the brightest and darkest elements. The more contrast, the better the display looks. Dark areas are truly dark, and bright areas are truly bright. This is expressed as a ratio: an IPS panel is generally around 1000:1, meaning the brightest white it can show is a thousand times brighter than its darkest black. VA panels are generally 3000:1, with good ones being 4000:1, and OLEDs… man, they are the best. They have infinite contrast ratios, because they physically do not produce any light when you tell them not to. A higher contrast ratio generally means the panel produces less light when dark, rather than being brighter at the top end. That helps the image pop, obviously, but mostly it tells you how dark the display can get.
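To make the ratio concrete, here's a tiny Python sketch of how a contrast ratio falls out of two luminance measurements. The nit values are illustrative, not measurements from any specific panel.

```python
# Contrast ratio is just peak white luminance divided by black luminance,
# both in nits (cd/m^2). All values here are illustrative.
def contrast_ratio(white_nits: float, black_nits: float) -> str:
    if black_nits == 0:
        return "infinite:1"  # OLED: pixels emit no light at all when black
    return f"{round(white_nits / black_nits)}:1"

print(contrast_ratio(300, 0.3))  # typical IPS -> 1000:1
print(contrast_ratio(300, 0.1))  # decent VA -> 3000:1
print(contrast_ratio(300, 0.0))  # OLED -> infinite:1
```

Notice that the "better" ratios come almost entirely from the black level dropping, not the white level rising.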
The final brightness related metric you might see – although it's a lot less common and only found in more detailed reviews – is uniformity. This is how evenly the panel outputs light. It's all well and good if a display can output 300, 400, or 500 nits of brightness, and get really dark, but if it can only do that in the middle of the screen while, say, the bottom edge is permanently really bright, or one corner is weirdly dimmer, that's no good. IPS panels in particular used to be really bad for that, especially on the edges, although they've gotten a lot better since then, and with OLEDs… again, man, they are basically perfect. All of that is to say you want a uniform monitor.
The final light quality metric actually comes in two forms, both describing colour performance. Displays, at least our current designs anyway, can only make varying levels of red, green and blue light (except W-OLEDs, which also produce pure white, because there always has to be a stupid exception to the rule), so we generally map their ability to produce colours like that. How much red, how much green, and how much blue. This shows how much of the total visible spectrum the display can output. We often reference this against a few standard colour spaces, or gamuts – we've already met one, sRGB, the smallest one – so we can gauge how impressive the display's colour performance is. The wider the gamut coverage, the better and more vibrant it is. The other common gamuts are DCI-P3, Adobe RGB and the newest, Rec.2020. That's the widest, and you'll struggle to find a display that covers 100% of that one. In short though, the higher the percentage, the better.
The other colour metric you'll hear about is accuracy. This is how, well, accurate the display is when you ask it to show you a specific colour. This is really important for designing stuff, be that websites or videos: you want to know that what you're seeing is what everyone else will see. Because monitor colour is described with three components, R, G and B, it's a 3D space. That means the accuracy figures get weird. The figure we use is called "Delta E", with 'Delta' being 'difference' and E being the German word "Empfindung", meaning 'sensation'. Really what we're saying is how far off (in 3D space) the colour we got was from the colour we wanted. Generally a Delta E of less than two is considered ideal, as that's close to the smallest difference our eyes can detect. Here are two colours that are a Delta E of 2 apart. And now 5 apart. Five is really obvious, right? But two is really close. Obviously the lower the better, but if none of the tests return a result higher than two, that's a really good display. If the average is less than two, that's good too. If it's above two? Well, it gradually gets worse. Here's a little gradient so you can see, going from a Delta E of 0 to 6. Pause if you want to look through this more.
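For the curious, the original Delta E formula (CIE76) really is just the straight-line distance between two points in that 3D colour space (CIELAB, to be precise). This little Python sketch uses made-up Lab values; most modern test tools use the fancier Delta E 2000 formula, which weights the components to better match human vision, but the core intuition is the same.

```python
import math

# CIE76 Delta E: Euclidean distance between two colours in CIELAB space.
# Newer formulas (Delta E 2000) refine this, but the idea is identical:
# how far apart are the colour we asked for and the colour we measured?
def delta_e_76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target   = (50.0, 20.0, 30.0)   # L*, a*, b* of the colour we requested
measured = (51.0, 21.0, 28.5)   # what the panel showed (invented numbers)
print(round(delta_e_76(target, measured), 2))  # ~2.06: right on the edge
```

A result around 2 is sitting right at that "can our eyes even tell?" threshold mentioned above.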
Now we get to move onto my specialty, how the light changes, which we call the "response time". I built a tool to test response times, the very creatively named… open source response time tool. I'm a creative genius, I know. Response times are basically the time it takes for the monitor to change colour. That is important because as the frames change and stuff moves on screen, you need the pixels to be able to keep up and actually show you those changes. You want to see where you're aiming, or enemies' positions, right? Yeah, kinda important. The important thing to know about response times is that it's basically impossible to truly represent the change in light level with one number – and every manufacturer grossly misrepresents their response times.
Here is what a single response time transition looks like. While I could go on for hours about this, I'm gonna keep it tight and give you just three examples. First, this one. This is a rising transition, so going from dark to light. How you measure the response time varies depending on who's doing the testing – there are differing methodologies – so for example manufacturers measure by just cutting 10% of the light level change off the top and bottom (leaving you with just 80% of the change). If you're watching one of my reviews, I test from here, with a fixed RGB 10 offset off the top and bottom, and if you're watching a Monitors Unboxed video, he tests like this, taking 3% of the RGB value change. This is why numbers you see from me or the other OSRTT users might not line up exactly with data from Tim or other sites that have their own tools, like RTINGS.
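To show how much the cutoff choice matters, here's a rough Python sketch measuring the same made-up rising transition with two different percentage thresholds. The trace and numbers are invented purely for illustration, not taken from OSRTT (a fixed RGB offset works similarly, just with an absolute offset instead of a percentage of the change).

```python
# A toy rising transition: (time in ms, light level from 0 to 100).
trace = [(0, 0), (1, 5), (2, 20), (3, 50), (4, 80), (5, 95), (6, 100)]

def response_time(trace, start, end, cutoff_frac):
    """Time spent between the lower and upper cutoff thresholds."""
    span = end - start
    lo = start + span * cutoff_frac  # ignore the first bit of the change
    hi = end - span * cutoff_frac    # ...and the last bit
    t_start = next(t for t, v in trace if v >= lo)
    t_end = next(t for t, v in trace if v >= hi)
    return t_end - t_start

print(response_time(trace, 0, 100, 0.10))  # 10% off each end -> 3 ms
print(response_time(trace, 0, 100, 0.03))  # 3% off each end  -> 5 ms
```

Same pixel, same transition, and one methodology calls it 3 ms while the other calls it 5 ms. That's why cross-comparing numbers from different reviewers is risky.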
This is example two – see, I'm being good! This shows what's called "overshoot". Thanks to a feature normally called overdrive, the panel actually misses the target and has to come back down to the correct value – or come back up if it's the other way, going from bright to dark. This happens because overdrive tries to get the pixels to move faster than they otherwise would by initially setting them a much higher (or lower) target, then changing the target once it's close. Obviously, if you see this, the overdrive hasn't done its job well. We quantify overshoot in two ways: how far it missed at its peak, and how much extra time it took to actually go from your starting level to the target. An overshoot of a couple of RGB values is perfectly fine. Honestly, anything up to 10 RGB values is fine. You'll struggle to notice. More than that though, that's a problem. I (and therefore OSRTT) include that extra time in the response time, so instead of showing you the initial response time (the first rise or fall, not including overshoot time), I show you the perceived response time (which includes the overshoot time).
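Here's a rough sketch of how those two overshoot numbers could be pulled out of a trace. The trace values are made up, and this is a simplification of what a real tool like OSRTT actually does.

```python
# Toy overdriven transition: (time in ms, RGB level). The target is 200,
# but the panel peaks at 230 before settling back. All values invented.
trace = [(0, 0), (1, 120), (2, 215), (3, 230), (4, 210), (5, 200), (6, 200)]
target = 200

def overshoot_amount(trace, target):
    """How far past the target the panel peaked, in RGB values."""
    return max(v for _, v in trace) - target

def perceived_response_time(trace, target, tol=10):
    """Time until the level settles within `tol` of the target for good."""
    for i, (t, _) in enumerate(trace):
        if all(abs(v - target) <= tol for _, v in trace[i:]):
            return t

print(overshoot_amount(trace, target))         # 30 RGB values: noticeable
print(perceived_response_time(trace, target))  # 4 ms, vs ~2 ms initially
```

The gap between the initial and perceived figures is exactly the penalty the overshoot cost you.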
Finally, I want to show you what a perfect response time, at least theoretically anyway, looks like. This is from an OLED, and basically you can see it changes instantly, and stays there. No overshoot, just a straight vertical line. This is why OLEDs are so great: instead of an LCD slowly drawing your new frame, such that it only just finishes by the time the next one is ready – or worse, doesn't even finish before drawing the next one, so they all blur together – OLEDs draw every frame, instantly, giving them a level of sharpness that not even resolution can give. Motion is crisp, not just the image, and that's really cool. I should make it clear that us reviewers test a bunch of transitions – OSRTT tests 30 different shades – so when looking at heatmaps like this there are a few key things to look at. Obviously, the colours in the heatmaps make it pretty easy to see if it's any good, but the three most important response time numbers to look for are the initial and perceived response times (and the difference, if any, between them) and the refresh rate compliance percentage. For OLEDs this is basically always going to be 100 percent. For, say, an IPS or VA panel, depending on the overdrive mode it might be anywhere from 20 percent to 80 percent. This is basically just how many of those transitions complete within the refresh rate window. A truly good display should return 100%. Anything less and you'll get at least some ghosting – that's previous frames on screen at the same time as the current frame.
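The compliance figure itself is simple maths: count how many transitions finish within one refresh window. These response times are invented, purely to show the calculation.

```python
# Refresh rate compliance: the percentage of measured transitions that
# complete within one refresh window. Example response times are made up.
def compliance(response_times_ms, refresh_hz):
    frame_time = 1000 / refresh_hz  # 165 Hz -> ~6.06 ms per frame
    passed = sum(1 for rt in response_times_ms if rt <= frame_time)
    return 100 * passed / len(response_times_ms)

times = [2.1, 4.8, 6.5, 3.2, 7.9, 5.5, 6.0, 1.8]  # ms, invented
print(compliance(times, 165))  # 6 of 8 make the ~6.06 ms window -> 75.0
```

The two transitions that blow past the window are the ones you'd see as ghosting in motion.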
One bit of info that is widely forgotten, and may seem like pedantry, but I'd argue is actually pretty important: while an instant response time IS better than one that takes the whole frame time (on a 165Hz display you get new frames roughly every 6 milliseconds, so a response time of 6 milliseconds would be the 'whole frame'), there's actually a quirk to how our eyes perceive motion on monitors like these that means, depending on the refresh rate, there's an argument to be made that the slight smoothing LCDs accidentally do because they can't keep up with their own damn refresh rates can actually be perceptually smoother. If I'm being honest, even my relatively slow 175Hz QD-OLED (relative to this thing, which is 500Hz, so new frames every 2 milliseconds instead of 5.7) looks miles better than the 640Hz TN monitor I tested recently, at least to my admittedly failing eyes. Still, it's worth considering how refresh rate plays a part in perceived motion smoothness.
Finally, we have latency. OSRTT tests latency too. The difference between latency and response time is that response times are how quickly the physical pixels change colour, while latency is how long the monitor takes to receive a new frame over HDMI or DisplayPort and start putting it on screen. Displays actually do quite a lot to that image – they've got to convert the image data into instructions for the actual pixel grid, apply any effects like brightness or colour changes, and do the overshoot calculations based on the last frame. All of that is almost always done in around one millisecond. The biggest delay is normally actually the refresh rate window. See, if you've got a 60Hz display, that means you only get new frames every 16.7 milliseconds. Let's say your graphics card draws you a new frame and sends it to the monitor when the current frame has only been on screen for one millisecond. You've now got to wait 15.7 milliseconds for the display to refresh and show you that frame. That isn't the monitor being slow, that's just the refresh rate. If your graphics card sends it when the current frame has been on screen for, say, 14 milliseconds, now you only have to wait 2.7 milliseconds! That's why a great OSRTT latency measurement averages out to half the frame time, with no results above the frame time. That means the display's processing happens fast enough to basically never miss the next frame window.
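That 60Hz example works out like this in a quick Python sketch (the roughly one millisecond of display processing is ignored to keep it simple):

```python
# How long a freshly rendered frame waits for the next refresh, ignoring
# the monitor's ~1 ms of internal processing. Illustrative maths only.
def wait_for_refresh(refresh_hz, arrival_ms):
    frame_time = 1000 / refresh_hz           # 60 Hz -> ~16.7 ms per refresh
    return frame_time - (arrival_ms % frame_time)

print(round(wait_for_refresh(60, 1), 1))   # arrives 1 ms in  -> waits 15.7
print(round(wait_for_refresh(60, 14), 1))  # arrives 14 ms in -> waits 2.7
```

Since frames arrive at effectively random points in the refresh window, the average wait lands at half the frame time, which is exactly why that's the benchmark for a great latency result.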
There are a few other things you might see, and if you read incredibly detailed reviews from sites like RTINGS, you'll notice I've skipped a whole load of their tests for sure, but for me what I've covered is the (in depth) basics of monitors, and if you understand the basics of colour, brightness and contrast, response times and latency, you're in a really good place to understand what is a good gaming monitor, and what isn't worth your time or money. I hope this has been useful for you, and if you have any specific questions, please do leave them in the comments below, or jump on our discord and ask there. Of course, if you fancy testing monitors for response times and latency, you are more than welcome to pick up an open source response time tool from OSRTT.com, linked in the description. I build them right here at home in the UK and ship them worldwide.
