Asus STRIX SCAR 16 Review – MiniLED MADNESS! 

There is so much to talk about with this STRIX SCAR 16, including possibly fake G-Sync support, that we have to get straight into this. I want to start with the performance – this is a pretty beefy machine. This one has an i9-13980HX with 8 P-cores and 16 E-cores, 32GB of DDR5-4800 RAM, an RTX 4080 Laptop GPU and 2TB of RAID 0 PCIe Gen 4×4 SSD space. Again, beefy. Performance from the CPU is, as you’d expect, pretty top notch. Interestingly, it’s actually matched quite well by the 13900HX in the XMG FOCUS 16 – technically bested in fact, despite the FOCUS 16 not being in its most extreme performance mode. Still, the SCAR 16 offers top-of-the-pack performance here. In Blender it’s the same story, with the SCAR taking a slight lead over the FOCUS 16, although not by all that much. Interestingly, the 13980HX seems to be considerably less efficient, pumping a whopping 170 watts through itself to get those minor improvements over the FOCUS 16 – a good 50 watts more, or around 15 watts more once it stabilised.

Gaming performance on the other hand is more mixed. Native 1600p resolution performance isn’t too bad with an average of 172 FPS across these seven games. That isn’t bad, although not quite enough to match the refresh rate. Even Starfield, admittedly on low settings, managed well over 60 FPS, so decent job there. When it comes to comparative figures, at 1080p in Cyberpunk the SCAR 16 is down at the bottom of the pack. I can’t say I understand this result, seeing as the configuration should allow for a lot more performance, but I’ve tested and retested this, and these are the sorts of figures I get every time. Shadow of the Tomb Raider shows a tale of two machines. In Performance mode you get a somewhat sedate machine – it’s keeping pace with lower TDP RTX 4060 machines, but that’s about it. Kick it into Turbo though, and you get a beast. Almost 10 FPS clear of anything else – that’s what I’d expect to see! Fortnite is the same story: Performance mode is fine, but Turbo is actually impressive. Flight didn’t respond all that well to Turbo – it does gain 8 FPS on average, which is good, but it doesn’t touch even 4060-based machines. Hitman’s built-in benchmark allows me to split out the CPU and GPU performance, and as you can see the GPU CAN rip when you let it. Even in Performance mode it bests an RTX 4090 Laptop chip – admittedly a much, much lower TDP variant – and Turbo only kicks that further up a notch. Lastly, in Rainbow Six Siege you get fairly middling performance, although it’s the best I’ve tested recently.

One downside of using the Turbo mode is this…

It’s painfully loud – the sort of loud where you’ll need headphones to use it. I found myself talking noticeably louder while it was running a game. It’s distractingly loud.

Something I couldn’t help but notice while testing was a stuttering that persisted across multiple games, restarts and driver versions. I think the Shadow of the Tomb Raider benchmark shows it well – it’ll just hang every few seconds. It does this a LOT, and it makes the gaming experience far from ideal. What’s even stranger is that I think it’s the display, because while doing some of the display testing you’ll see in a second, I captured high-speed footage of it happening. This is the frog pursuit test from Aperture Grille, and as you can see the whole image will stop moving, then it’ll really badly overshoot, then it’ll carry on. See, most frames don’t seem to have any overshoot at all, but then it just stops, overshoots, then moves on. Yeah, I have absolutely no idea what’s going on here, but I couldn’t go without mentioning it!

Now speaking of the display, that’s a 2560×1600 240Hz MiniLED-backlit panel that offers up to almost 1000 nits of peak whole-screen brightness in SDR! Yeah, seriously, this thing is insane! It’s ridiculously bright, and of course with that MiniLED backlight and full array local dimming you get a functionally infinite contrast ratio. You do have some haloing – in fact it’s pretty obvious even in a bright room, so that’s a bit of a turn off for me – but at least you have the option to swap between the “Multi-zone” and “One-zone” settings in the Armoury Crate software. Colours-wise, this panel is absolutely stunning. To the eyes it’s easily one of the most vibrant, colour-pop displays I’ve seen. It’s just beautiful. That’s reflected in the gamut coverage, with exactly 100% coverage of the DCI-P3 spectrum. That’s incredible, and one of the best results I’ve seen. Accuracy seems off with the local dimming on, but I have a sneaking suspicion that’s a bug with the SpyderX rather than an actual result from this display. Still, disabling local dimming and running the test again reveals a ridiculously good average DeltaE of just 0.83, with a maximum around 1.6. That’s fantastic.
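If you’re wondering what a DeltaE figure actually represents, it’s essentially the distance between the colour the display was asked to show and the colour it actually produced, measured in CIELAB space. Here’s a minimal sketch using the simple 1976 formula – calibration tools like the SpyderX software typically use a newer variant such as DeltaE 2000, and the values below are made-up examples, not measurements from this panel.

```python
import math

def delta_e_76(target_lab, measured_lab):
    """Euclidean distance between two CIELAB colours (L*, a*, b*)."""
    return math.sqrt(sum((t - m) ** 2 for t, m in zip(target_lab, measured_lab)))

# Hypothetical patch: target vs measured, comfortably under the DeltaE ~1 mark
# where most people stop being able to see a difference.
print(delta_e_76((50.0, 20.0, -30.0), (50.3, 20.4, -30.5)))  # ~0.71
```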

What’s considerably less impressive is the response times. I spent an awfully long time testing and retesting this to be sure, but I have quite a few revelations to share. First, the response times themselves. The best result I could get was around 8 milliseconds on average, with the worst results being over 17 milliseconds. That’s with the “3 millisecond overdrive” feature enabled, and local dimming off. With local dimming AND overdrive on, the average jumps to 16 milliseconds, which for a 240 Hz panel is effectively 60 Hz-class pixel response – dreadful. Even at 8 milliseconds that’s only around 120 Hz, or half the speed it should be changing. Just to show you what’s happening here, this is the RGB 0 to RGB 102 transition, and it’s really pretty slow. Even with this tolerance, it’s reporting 10 milliseconds there, which is really quite slow. It’s worth noting too that because this is a MiniLED backlight, it uses PWM to control those LEDs. That means the actual raw data looks like this – it’s pulsing on and off thousands of times per second. It’s only when you add my denoising function on top that you get usable, readable data.
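If you’re curious what that denoising step roughly involves, here’s a minimal sketch: a moving average with a window wider than one PWM period, followed by a standard rise-time measurement. To be clear, this isn’t my exact script – the window size, PWM frequency and the 10–90% tolerance here are illustrative assumptions only.

```python
import numpy as np

def smooth_pwm(samples, sample_rate_hz, pwm_freq_hz):
    """Average over roughly 3 PWM periods so the backlight pulses cancel out."""
    window = max(1, int(sample_rate_hz / pwm_freq_hz * 3))
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="same")

def rise_time_ms(smoothed, sample_rate_hz, lo=0.10, hi=0.90):
    """10-90% rise time of a single rising transition, in milliseconds."""
    start, end = smoothed[0], smoothed[-1]
    span = end - start
    t_lo = np.argmax(smoothed > start + lo * span)
    t_hi = np.argmax(smoothed > start + hi * span)
    return (t_hi - t_lo) / sample_rate_hz * 1000
```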

Which I think brings us nicely onto the adaptive sync problem… Let’s look at another graph. The transition doesn’t matter here, what matters is those pulses. They are about 4.2 milliseconds apart, which just so happens to match the 240 Hz refresh rate window. Now this is a G-Sync display – there’s even a sticker to say so right on the machine. So, you’d think that if your game was running at, say, 60 FPS, you’d see these pulses every 16.7 milliseconds, right? Well, Asus doesn’t seem to think so! The slowest it seems to run is 8.3 milliseconds, or 120 Hz. It occasionally bumps between 8.3 and 4.2 milliseconds, but it doesn’t seem able to refresh at anything other than 240 Hz or 120 Hz. But, hey, maybe 60 FPS is below the supported refresh rate window – it shouldn’t be, but let’s give Asus the benefit of the doubt and say it is – so let’s try this again at 170 FPS. That’s right between the 120 Hz and 240 Hz it seems able to switch between… Anndddd… Nope. It’s just running at 4.2 milliseconds again. Bugger. Also, to be clear, I tested this with both Optimus enabled and the NVIDIA dGPU-only mode, I confirmed adaptive sync was enabled in the Windows display settings, and when on the dGPU only I confirmed G-SYNC was enabled in the NVIDIA Control Panel. So, as far as I can tell anyway, this doesn’t do adaptive sync. Yikes.
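To put some numbers on why those pulse gaps give the game away: a refresh interval is just 1000 divided by the refresh rate, so a panel genuinely tracking a 170 FPS game should show pulses roughly 5.9 milliseconds apart, not 4.2. A quick sketch of that arithmetic:

```python
def frame_interval_ms(rate_hz):
    """Time between refreshes for a given refresh rate or frame rate."""
    return 1000 / rate_hz

for rate in (240, 170, 120, 60):
    print(f"{rate} Hz -> {frame_interval_ms(rate):.1f} ms between refreshes")
# 240 Hz -> 4.2 ms, 170 -> 5.9 ms (what adaptive sync should show at 170 FPS),
# 120 Hz -> 8.3 ms, 60 -> 16.7 ms
```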

Strangely, input lag is also pretty poor – in both Optimus and dGPU-only modes. With just the dGPU it is better, with an average of 7 milliseconds or so, but that’s almost two frames at 240 Hz – and some of the results are consistently above 10 or even 15 milliseconds, which is really quite poor. It gets worse with Optimus enabled, running at just shy of 15 milliseconds on average, or three and a half frames’ worth of latency, with some spiking as high as 25 milliseconds. That’s really quite bad.
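For reference, converting those latency figures into frames is just the latency divided by the 4.17 millisecond frame time at 240 Hz. A quick sketch using the rounded figures from my testing:

```python
def frames_of_lag(latency_ms, refresh_hz=240):
    """How many refresh cycles a given input latency spans."""
    return latency_ms * refresh_hz / 1000

for label, ms in [("dGPU-only average", 7), ("dGPU-only spikes", 15),
                  ("Optimus average", 15), ("Optimus spikes", 25)]:
    print(f"{label}: {ms} ms ~ {frames_of_lag(ms):.1f} frames at 240 Hz")
```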

All of that has a noticeable effect on the gaming experience. I’d be lying if I said it was anything more than decent – it is noticeably more difficult to hit shots and aim at targets. I mean, it looks incredible, but that’s about it. The keyboard feels pretty nice for gaming, although I do miss the mechanical switches in the XMG Core 16 I reviewed recently. Still, this is nice enough for sure.

Happily the SCAR 16 seems to have broken the trackpad curse I’ve had for the last couple of machines, as this one works without a problem. It even has a numpad built in – a rather strange addition I can’t see myself using much, but it’s there if you want it! IO-wise it’s a little strange too – you’ve only got two USB-A ports, both of which are annoyingly on the right-hand side, right where your mouse is. That’s it for the right-side IO too, as everything else can be found on the left: DC in, Ethernet, HDMI, two USB-C ports and a headphone jack. Inside is where the real magic is. In here you’ve got the two M.2 slots, populated with Samsung PM9A1 1TB drives in RAID 0, alongside two DDR5 SODIMM slots, populated with SK hynix 16GB 1Rx8 modules. Below that you’ve got the 90Wh battery, and above you’ve got the absolutely monstrous cooling package. You get not one, not two, but three fans here, and the entire width of the laptop is a heatsink, with more at both sides. That makes sense, since the RTX 4080 Laptop GPU is the 175W variant and seems to sink up to 225W, and we know the CPU can be dumping near-on 170W into this cooler too. Now you see why it’s so damn loud!

This one is rather hard to summarise. For as beautiful as this display is, it has far, far too many flaws to be a worthwhile purchase. The seemingly lacking G-Sync support – despite the branding – is worrying, and my god, these response times were terrible. The performance is somewhat lacklustre for the spec – I was honestly expecting chart-topping performance across the board, but games generally seemed to be limited, I assume by the CPU, seeing as when that limit was lifted it did offer fantastic performance. Still, between the stuttering, the display in general, and the noise, I can’t say this one is for me. While I’m not really looking for a desktop replacement style machine anyway, this one has a few too many blemishes to make it worth the three-and-a-half grand you’d need to drop to get one. Considering the XMG FOCUS 16 is more like £2,500 or so and performed remarkably similarly, I’d get that instead.

  • TechteamGB Score: 3