Aorus 16X Review – The AI Gaming Laptop
The Aorus 16X is the gaming laptop of the future – or at least Gigabyte would hope you’d think so. Aorus has gone all in on “AI”, with these machines literally being called “AORUS AI Gaming Laptops” as part of their “RE:DEFYNE” scheme. Just as a side note – all these buzzword/deliberate-misspelling tags annoy me, I don’t know about you – but anyway, despite such a hefty marketing presence, the actual “AI” features Gigabyte offers here are pretty minimal. It’s just three items in the Gigabyte Control Center app, under the “AI Nexus” section. If I’m being blunt, the only thing remotely close to “AI” here is the Stable Diffusion GUI that’s an optional extra download. The other two “AI” things are basically just two versions of the standard laptop performance modes: one for GPU management, and one for power limits and fan curves as usual. These existing modes now have “AI” options which… just switch between modes based on what programs are open. Yeah. That’s not “AI”. That’s bog-standard programming – it’s pretty likely they just maintain a whitelist of apps that trigger a change in mode, and maybe monitor things like GPU and webcam usage to decide whether to put you in “gaming” or “meeting” mode.
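To show just how little “AI” is needed for that, here’s a minimal sketch of the kind of rules-based switcher I’m describing. This is purely my own speculation – the process names, mode names, and logic are all hypothetical, not Gigabyte’s actual implementation.

```python
# Hypothetical sketch of a rules-based mode switcher - not Gigabyte's code,
# just an illustration that no machine learning is required for this feature.

# Whitelist mapping known process names to a performance mode (all names made up).
MODE_WHITELIST = {
    "cs2.exe": "gaming",
    "Cyberpunk2077.exe": "gaming",
    "Zoom.exe": "meeting",
    "Teams.exe": "meeting",
    "blender.exe": "creator",
}

def pick_mode(running_processes, webcam_active=False):
    """Return a performance mode based on open apps and webcam state."""
    for proc in running_processes:
        if proc in MODE_WHITELIST:
            return MODE_WHITELIST[proc]
    # Fall back on simple signals like webcam usage, as speculated above.
    if webcam_active:
        return "meeting"
    return "balanced"
```

Call `pick_mode(["explorer.exe", "cs2.exe"])` and you get `"gaming"` back – a lookup table and a couple of if-statements, which is exactly the sort of thing I suspect is happening under the marketing label.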
The “AI Generator” tab is pretty funny. The only “module” currently available is InvokeAI’s Stable Diffusion GUI. You download the module and the Stable Diffusion model, then you get access to what is a very limited version of the standard Stable Diffusion GUIs already available. Just for fun I made it create an image of an Asus gaming laptop, and I’m 98% sure it actually drew an Acer Nitro 5 but stuck an Asus logo on it. That’s really, really funny to me. In theory Gigabyte will partner with other suppliers to provide more tools here, although if the Stable Diffusion one is anything to go by, this isn’t a useful addition. I don’t see a way to easily find or swap models, or access the hundreds of extra controls something like Automatic1111’s tool provides. For a quick play once, this is OK, but if you are even remotely interested in image or text generation, you’ll want to use the incredibly popular and increasingly easy-to-set-up dedicated tools instead. If I’m being honest, I’m not sure this is actually much of a value add.
Part of that comes from the specs too – this has an i9-14900HX with 32GB of DDR5-5600 RAM, an RTX 4070 Laptop GPU with 8GB of VRAM, and two 1 TB SSDs. While that spec is pretty decent for gaming, as we’ll soon see, for AI tools, 8GB of VRAM isn’t all that much. For image generation it’s normally fine – at least for smaller image sizes and lower detail – but for text generation that’s very much on the low end. Happily, for gaming you get a pretty decent experience. At the native 1600p resolution you get an average of 152 FPS – at least with the games I test, at the settings I test at. Esports titles hit in the 200 to 300 FPS range, and more intensive games are more like 100 FPS or so.
For the sake of some comparison, at 1080p you’ll find mixed results. CS2 has the 16X at the top, above the 13980HX and RTX 4080 of the STRIX Scar 16, whereas Cyberpunk is a bit of an L at a 113 FPS average, compared to up to 131 FPS from other machines. Flight Simulator has it in the middle at a 99 FPS average, which is down from the 121.5 FPS of the XMG PRO 15 but higher than, say, the XMG Core 16. Fortnite is basically the same, with the 16X in the middle, above the Helios 16 and not too far behind the Scar 16. Hitman’s built-in benchmark lets me split out the CPU and GPU performance, and just looking at the GPU data you can see the 4070 stretch its legs a little, basically tied with the XMG PRO 15 and its very similar spec. Siege has it in the middle too, again tied with the Scar 16, albeit with a lower minimum. That’s not too big a deal though, as the minimum can be a single hitch rather than something that persists throughout the whole run. As for Shadow of the Tomb Raider, that has the 16X up near the top, only behind the XMG PRO 15 and Scar 16, at a 170 FPS average. Not bad! Finally, in Starfield the 16X is actually at the top, albeit with a lower 1% low result, which is a little more significant. Happily, that seems to be down to playing at 1080p, as when I ran my tests at 1600p that hitching went away.
As for more creative tasks, Cinebench has the 14900HX a touch faster than the 13900HX, but not by all that much: slightly faster single-core performance, and a more substantial lead in multithreading. Blender isn’t quite as flattering, with the 13900HX in the XMG PRO 15 and XMG Focus 16 actually coming out ahead of the 14900HX here. Clearly there isn’t much between the two generations – although I think it’s safe to say we already knew that from the desktop chips! Power usage is about the same too, with this chip sucking back around 120W during the Blender testing. That’s around the same as the XMG PRO 15, although Aorus’s cooling solution seems to be doing a better job, keeping the chip at just 82°C at peak. That’s actually really quite impressive for a laptop!
As for the display, that’s a fairly standard 1600p 165Hz IPS panel. To the eye it looks pretty nice, colours pop well, and it’s obviously plenty sharp. To the SpyderX2 it’s still pretty decent. 79% coverage of the DCI-P3 gamut is decent, if not amazing these days, and a colour accuracy DeltaE of 1.73 isn’t too bad either. It isn’t quite production-ready out of the box, but it’s decent enough, and it takes a calibration well. Gigabyte quotes this as a 400-nit display, but my unit pushed well past that, up to 560 nits at peak, with around a 1400:1 contrast ratio. That’s great for an IPS panel – although you won’t find those ‘deep blacks’ here, as black is more of a grey, as it is with all IPS panels.
The more gaming-related metrics require the use of my very own Open Source Response Time Tool – available at OSRTT.com, by the way – and with it we see an average response time of 5.5 milliseconds. That’s slower than the 3 milliseconds Gigabyte quotes, but with 77% of the transitions falling within the refresh rate window, it isn’t too bad. In fact, high speed footage shows there is functionally no ghosting here, although new frames do take the whole frame time to finish drawing, which isn’t perfect. Still, for an IPS panel this is perfectly serviceable, and actually pretty smooth to the eye. As for latency, that’s about right too, at 4 milliseconds on average.
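If you’re wondering what “within the refresh rate window” means in practice, the arithmetic is simple: at 165Hz each frame lasts 1000 ÷ 165 ≈ 6.06 milliseconds, and a transition “makes it” if it completes inside that time. Here’s a quick sketch of that calculation – note the sample transition times are made-up illustrative numbers, not my actual measurements from this panel.

```python
# The maths behind the "refresh rate window" claim: a transition counts as
# in-window if it finishes within one frame time at the panel's refresh rate.

def refresh_window_ms(refresh_hz):
    """One frame time in milliseconds at a given refresh rate."""
    return 1000.0 / refresh_hz

def pct_in_window(transitions_ms, refresh_hz):
    """Percentage of response-time measurements completing within one frame."""
    window = refresh_window_ms(refresh_hz)
    within = sum(1 for t in transitions_ms if t <= window)
    return 100.0 * within / len(transitions_ms)

window = refresh_window_ms(165)  # ~6.06 ms per frame at 165 Hz
# Illustrative sample data only - not real measurements from the 16X.
sample = [3.2, 4.8, 5.5, 6.0, 7.1, 8.4, 5.9, 4.1, 6.5, 5.0]
print(round(window, 2), pct_in_window(sample, 165))
```

So a 5.5 millisecond average sits comfortably inside the 6.06 millisecond window, which is why the panel still looks smooth despite missing Gigabyte’s 3 millisecond quote.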
That makes for a pretty decent gaming experience. I played quite a few games on this, and even fast-paced FPS games like Siege were a great time. The display could perhaps be a higher refresh rate – and I still don’t quite understand why we’re putting 1600p displays on machines that sometimes can’t even manage medium settings at 1600p – but it’s smooth enough and plenty responsive, and overall a pretty good time.
The keyboard actually lets it down a little. The transparent keys and illegible font don’t help, but I found the keyboard wouldn’t register key presses fairly regularly. Like, fairly deep presses, and they just didn’t register. Bottoming the keys out does sort it, but it’s not like I have a particularly light touch. Happily, the trackpad is spot on. It’s large, feels good to use, and has great palm rejection. IO-wise you only get two USB-A ports, although you do get two USB-C ports, a microSD card slot, a combo audio jack, HDMI, Ethernet, and DC in. That isn’t exactly ‘packing’, but it isn’t bad.
Inside you’ll find a pretty beefy cooling solution, alongside two DDR5 SODIMM modules and two M.2 slots, which in my case came with “Solid State Drive” brand SSDs. I’m joking – these are Gigabyte’s own drives – but it’s pretty funny that these are the most generic-looking drives you could imagine. They are Gen4x4, although they aren’t exactly top-end performers. They also aren’t in RAID 0, which is kind of a surprise. I don’t think I mind that, but I know Asus does RAID 0 on their dual-drive machines – let me know in the comments what you think Gigabyte should do here. Anyway, you also get a 99Wh battery, which with the right usage can give you a pretty decent run time, although of course gaming drains it like a champion.
On the whole the 16X is a pretty decent machine. The AI stuff isn’t worth considering, as the vast majority isn’t even tangentially related to AI – it’s just a marketing label stuck on top of existing features to trick you into thinking they’re new and fancy. In fact, this machine has an annoying Optimus bug I’ve not seen on other Advanced Optimus machines: when opening or closing a game, the whole system freezes for a good 5 to 10 seconds while it switches between Optimus and dGPU-only mode. That’s actively annoying, and it was only a problem with the AI mode enabled – leaving Optimus to its own devices solves it. If you look past the AI stuff, this is a decent machine. It’s priced in line with comparable machines, just shy of £2,000 as of writing, which while not a bargain isn’t a rip-off either. I’d be pretty happy to have this as my gaming laptop – with the AI stuff disabled, of course.