NVIDIA G-Sync Pulsar Explained – Better than OLED??
NVIDIA’s latest addition to the monitor market is this: G-Sync Pulsar. Much like G-Sync Ultimate, it’s an NVIDIA-GPU-only feature, and it supposedly gives this 360 hertz monitor 1000 hertz worth of motion clarity. Sounds great, right? Well, let me explain exactly what it’s doing, and do some in-depth testing on it, before we come to any conclusions. This is NVIDIA G-Sync Pulsar explained.
First, a bit of background. NVIDIA first launched G-Sync well over a decade ago now, aiming to ‘revolutionise’ the way we experience our games – specifically by eliminating tearing. Adaptive sync had been bouncing around in the VESA standard for a while before that, but the hardware didn’t really exist to let your graphics card control when your display actually refreshes (within limits, anyway), so NVIDIA made their own custom chip. Specifically, the original G-Sync monitors came with a G-Sync module, made up mostly of an FPGA, or field-programmable gate array – a kind of software-defined chip that can do a remarkable number of things, all thanks to being able to program how the circuits inside it are connected and operate. That was an expensive way to do it, meaning early G-Sync monitors held a hefty price premium over their non-G-Sync counterparts. AMD’s FreeSync came along soon after, using more standard hardware, meaning it didn’t cost an arm and a leg to get a variable refresh rate.
The other key bit of tech you need to understand is what NVIDIA calls ULMB – ultra low motion blur. Most monitor manufacturers call it something different – MPRT, MBR, ELMB, Aim Stabilizer – but it’s all the same thing. It’s often called “Black Frame Insertion”, but it’s really just backlight strobing. That’s where the monitor’s backlight turns on and off each frame so that only around one millisecond of light per frame actually reaches your eyes. Why would you want your display flashbanging you hundreds of times per second? Well, I wouldn’t, but the theory boils down to the simple fact that our eyes and brains are dumb. Displays already exploit how dumb our eyes and brains are: persistence of vision means we can be shown a remarkably small number of still images per second and perceive it as a continuous stream of motion. The catch is that the longer each frame sits on screen, the more our brain smears any motion across it. It’s a weird thing, but it’s why ‘cinematic’ 24 FPS films look so blurry in motion – even though if you step through frame by frame the individual frames aren’t that blurry – your eyes and brain are smearing it for you. By only showing you the current frame for one millisecond, then turning the backlight off, you get much sharper motion without having to increase the frame rate.
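To put some rough numbers on that, here’s a minimal sketch of the usual persistence-blur arithmetic: the perceived smear is roughly how long each frame stays lit multiplied by how fast your eye is tracking the motion. The tracking speed and persistence values here are illustrative assumptions, not measurements from this monitor.

```python
# Rough illustration (not NVIDIA's maths): perceived motion blur on a
# sample-and-hold display is roughly the persistence (how long each frame
# stays lit) multiplied by how fast your eye is tracking motion on screen.

def blur_width_px(persistence_ms: float, tracking_speed_px_per_s: float) -> float:
    """Approximate smear width in pixels for eye-tracked motion."""
    return persistence_ms / 1000 * tracking_speed_px_per_s

speed = 3600  # assumed tracking speed: an object crossing ~3840 px in about a second

for label, persistence in [
    ("360 Hz sample-and-hold (~2.78 ms)", 2.78),
    ("~1 ms strobe pulse", 1.0),
    ("1000 Hz sample-and-hold (1 ms)", 1.0),
]:
    print(f"{label}: ~{blur_width_px(persistence, speed):.1f} px of smear")
```

The point being that a roughly one millisecond strobe on a 360 hertz panel smears about as little as a true 1000 hertz sample-and-hold display would – which is where NVIDIA’s “1000 hertz worth of motion clarity” framing comes from.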
The trouble with those two things is that it turns out it’s pretty hard to time when the backlight turns on and off when the frame rate can, well, vary. Variable refresh rate and backlight strobing are kind of mutually exclusive experiences – or at least they were. Recently some monitor manufacturers have begun offering their own adaptive sync backlight strobing modes – Asus has ELMB Sync, and AOC has MBR Sync – although G-Sync Pulsar, coming from the trillion-dollar titan of the industry, should be the ultimate version of that. It comes, in part, through a collaboration with MediaTek to make a dedicated scaler for the job. Pulsar, much like variable refresh rate in general, has a floor – here it’s 90 FPS – as in if your game runs at 89 FPS or lower, the monitor will start repeating frames instead of waiting. That means this has a 90 to 360 hertz window.
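As an aside on that frame-repeating behaviour: NVIDIA hasn’t published the exact logic, but it reads like low framerate compensation, where frames are shown multiple times to keep the panel inside its refresh window. Here’s a rough, assumption-laden sketch of how a driver might pick a repeat count – the thresholds come from the monitor’s stated 90 to 360 hertz range, everything else is illustrative.

```python
# Rough sketch of low-framerate-compensation style frame repetition, as an
# illustration of the idea only - the actual Pulsar/driver logic isn't public.

MIN_HZ, MAX_HZ = 90, 360

def refresh_for(fps: float) -> tuple[int, float]:
    """Return (repeat count, refresh rate) that keeps the panel in its window."""
    if fps >= MIN_HZ:
        return 1, fps              # inside the window: refresh tracks the game
    repeats = 2
    while fps * repeats < MIN_HZ:  # show each frame multiple times until in range
        repeats += 1
    return repeats, min(fps * repeats, MAX_HZ)

for fps in (200, 100, 89, 45, 30):
    n, hz = refresh_for(fps)
    print(f"{fps} FPS -> each frame shown {n}x at ~{hz:.0f} Hz")
```

The real implementation could behave differently, but the repeat-until-in-range idea is how low framerate compensation generally works.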
One key advantage Pulsar has over the existing adaptive sync backlight strobing modes is its rolling backlight. Instead of just strobing the whole backlight on and off at once, it rolls the backlight on from top to bottom, theoretically improving motion clarity. In theory this gives the pixels – which are drawn top to bottom as well – time to settle, giving you the crispest motion possible. In theory. This is done by splitting the backlight into horizontal strip zones, although the precise number is hard to pin down.
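Since that zone count isn’t public, here’s a purely illustrative sketch of what rolling the strobe top to bottom looks like in timing terms – the zone count and pulse width are placeholder assumptions, not specs.

```python
# Illustrative timing for a rolling (scanned) strobe backlight. The real zone
# count and offsets aren't published, so these numbers are placeholders.

NUM_ZONES = 8            # assumed number of horizontal backlight strips
FRAME_MS = 1000 / 360    # frame time at 360 Hz
PULSE_MS = 0.7           # assumed strobe pulse width per zone

for zone in range(NUM_ZONES):
    # Each strip fires slightly later than the one above it, roughly
    # following the top-to-bottom scan-out of the LCD panel.
    start = zone / NUM_ZONES * FRAME_MS
    print(f"zone {zone}: strobe from {start:.2f} ms to {start + PULSE_MS:.2f} ms")
```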
So, what does this actually look like? Well, first off: warning, flashing images. Second, it’s dim and flickery. Because the backlight is only on for around one millisecond – roughly 25 percent of the frame time – you only get around 25 percent of the brightness. The backlight tries to compensate by pumping out way more light than usual during each pulse so your brain perceives a brighter overall image, but the second you turn Pulsar on the brightness noticeably drops. Like, a lot. Plus, at least for me, I get a stonking headache. But let’s look at the light data from my very own open source response time tool to better understand what’s going on here. With the monitor in Pulsar mode, but just sitting on the desktop (ie not actively adaptive syncing, or running at sub-monitor refresh rates), the pulses line up 2.78 milliseconds apart and are on for around one millisecond – 25 percent of 2.78 milliseconds is about 0.7 milliseconds, which is roughly what we’re seeing, so that’s good. It’s also worth seeing this with my 1000 FPS camera: the backlight strobe happens right before the start of a new frame, which means the display should have finished drawing the current frame – giving you the clearest image possible – before the backlight turns off again and the next frame starts to be drawn. This is the designed, and theoretically ideal, behaviour.
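For the sake of checking the arithmetic, here’s that desktop measurement written out – nothing clever, just the frame time at 360 hertz and what a 25 percent duty cycle implies for the pulse width.

```python
# Sanity-checking the measured numbers from the response time tool:
# at 360 Hz the pulses should be one frame time apart, and a ~25% duty
# cycle puts the pulse width at around 0.7 ms.

refresh_hz = 360
frame_ms = 1000 / refresh_hz   # ~2.78 ms between pulses
duty = 0.25                    # fraction of the frame the backlight is on
pulse_ms = duty * frame_ms     # ~0.69 ms, matching the ~0.7 ms measured

print(f"frame time: {frame_ms:.2f} ms, pulse width at 25% duty: {pulse_ms:.2f} ms")
```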
Now let’s look at what happens when it is adaptive syncing, starting at 165 FPS. As you can see, the distance between each spike is now around 6 milliseconds, although interestingly the main pulse itself looks about the same width (or duration) as at 360 hertz – instead there’s a second, much smaller, pulse a few milliseconds later. If I drop the framerate to just 100 FPS – only just above the 90 FPS minimum, mind you – the whole frame time grows to 10 milliseconds, as expected, and there are still two pulses, although now the main pulse is noticeably wider. The backlight stays on for a lot more of the frame, to match the much lower framerate. Interesting!
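My read on those wider (and extra) pulses – and this is my interpretation, not anything NVIDIA has confirmed – is that they’re compensating for brightness: if the pulse width stayed fixed at the 360 hertz value, the fraction of each frame the backlight is lit would collapse as the frame time grows. A quick sketch of that:

```python
# Why the backlight has to adapt: with a fixed ~0.7 ms pulse, the fraction of
# each frame the backlight is lit (and so perceived brightness) falls as the
# frame time grows. The wider/extra pulses at lower frame rates look like
# compensation for that - my interpretation, not something NVIDIA has confirmed.

PULSE_MS = 0.7  # pulse width measured at 360 Hz

for fps in (360, 165, 100):
    frame_ms = 1000 / fps
    duty = PULSE_MS / frame_ms
    print(f"{fps} FPS: frame {frame_ms:.2f} ms, fixed-pulse duty {duty:.0%}")
```

Hold the pulse at around 0.7 milliseconds and the duty cycle falls from roughly 25 percent at 360 FPS to about 7 percent at 100 FPS, so the backlight has to stay on longer (or fire again) to stop the image visibly dimming as the frame rate drops.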
Theoretically then, this is the perfect gaming monitor, right? I mean, you do get worse-than-OLED brightness, and a searing headache afterwards, but if you get 1000 hertz worth of motion clarity – as NVIDIA promises – what’s not to love? Well, I think it’s worth talking about the OLED-phant in the room. God, that was a bad pun. Anyway, OLEDs already offer near-perfect motion clarity thanks to their near-instant pixel response, and with 500 hertz OLEDs being common (and actually cheaper than this thing…), I’m struggling to work out why you’d pay this much money to get flashbanged every time you want to play a game. Theoretically this does give you better motion clarity, but I’m not convinced OLEDs don’t already offer the same – or better – and without the literal headache. Still, that’s G-Sync Pulsar!
