Helldivers 2: BEST Settings for MAXIMUM FPS & Graphics (Benchmarked!)

Welcome, Helldivers! I’ll keep this short so you can get back to blasting bugs and bots – this video is all about the best settings to play on. Of course, the final answer to that depends an awful lot on what your system’s specs are, but the load of testing I’ve done should still be relevant regardless. Settings-wise, the main ones I want to look at are the straight quality presets and the render scale. Between those two you can get over 300% more performance, so let’s dive straight into it!

I’m going to start with the quality presets – all at native resolution, which for this XMG Core 16 I’m testing with is 2560×1600. That’s on the tougher side for the RTX 4060 Laptop GPU to handle, but let’s see… Naturally, starting on the Ultra preset – and I’ll include High in here too for comparison’s sake – we don’t get the best performance possible. At 50.4 FPS average on Ultra, or 56.6 FPS average on High, this isn’t amazing. It’s playable, for sure, but we can do better. Medium jumps us up a touch more to over 60 FPS – 61.7 specifically – but if I add Low into the mix, well now we’re talking! 97.6 FPS average is excellent, especially for this 4060 Laptop at 1600p.
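If you want to sanity-check those uplift figures yourself, the math is just the ratio of the averages. Here’s a quick sketch using the numbers from my runs – nothing official, just arithmetic on the results above:

```python
# Average FPS from my preset runs at native 2560x1600 (RTX 4060 Laptop).
results = {
    "Ultra": 50.4,
    "High": 56.6,
    "Medium": 61.7,
    "Low": 97.6,
}

baseline = results["Ultra"]
for preset, fps in results.items():
    # Percentage uplift over the Ultra baseline: (new / old - 1) * 100.
    uplift = (fps / baseline - 1) * 100
    print(f"{preset:>6}: {fps:5.1f} FPS ({uplift:+5.1f}% vs Ultra)")
```

Running that shows Low sitting roughly 94% above Ultra – nearly double the frames just from one dropdown.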

Something I want to highlight is the visual quality difference too. Now, because someone is going to leave a smooth-brain comment otherwise: the footage you’re seeing was captured separately from the benchmark runs – same mission, but the recording never ran during the benchmark passes, so it didn’t affect the numbers. With that said, looking at the Ultra quality clip, you can see how good this looks. The shadows cast beautifully, the bugs are perfectly rendered – it’s great. Contrast that with the Low preset, which looks flat, bland and kinda jagged thanks to no Anti-Aliasing. If I swap Ultra for Medium, obviously this is a lot closer to the Low preset’s settings, but it’s noticeably better looking. It isn’t quite as beautiful, but it’s perfectly serviceable. So far Medium is my preferred option – although we are missing a fair bit of performance here, so on lower-end hardware, Low might be a better shout.

By far the most glaring issue with the Low preset is the lack of Anti-Aliasing, but that’s just a setting we can enable – so how does that affect the performance, and the visual quality? Well, in the performance department we do drop a bit, going from 97.6 FPS average down to 90.7 FPS. That’s really not bad though, and the visual quality difference is well, well worth it. The shimmering on any fine details disappears, as does the saw-toothing on every damn edge. I can’t express just how much I recommend keeping Anti-Aliasing on, especially with such a small performance hit. It really is worth it.

As for the render scale settings, this is rather interesting. This is the usual upscaling tech, much like DLSS and FSR – and considering this is a PlayStation game, I wouldn’t be surprised to find it’s more AMD’s side of things than anything. One unique option – or actually two options – are the SuperSampling and Ultra SuperSampling settings. These render each frame at higher-than-native resolution, then downscale it to fit your display. You’d really have to be a masochist to willingly set this to SuperSampling, and the Ultra version is just ridiculous. Even on the Low preset with Anti-Aliasing, it feels like a slideshow. I’ve rendered this video at 60 FPS so you can see the experience better, because the playing experience was awful. The crazy thing is I can’t see any major quality difference. Let me know in the comments if you can notice any, but really all you’re left with is the lowest performance result I captured: 31 FPS average, or 51.5 FPS average for the regular SuperSampling – which is basically the same as playing at native res on ULTRA! I know which of those I’d rather use…

Flipping the render scale setting the other way – so the game is actually rendering frames at lower-than-native resolution and then upscaling the image – well, that starts to make some more sense. Even the Ultra Quality option, which is the closest to native resolution rendering possible, adds a full 20 FPS average over the Low-plus-AA run at native res. Hell, its 1% lows are actually higher than the AVERAGE at native render scale, so it’s pretty clear this is a worthwhile choice. There might be the tiniest bit of visual quality difference between native and Ultra Quality, but while actually playing I can’t say I noticed. Here they are side by side so you can see how they compare. To me there isn’t much in it, which makes running at Ultra Quality a no-brainer.

Moving down to the Quality render scale setting, on the performance front that’s a significant jump up again – this time an average of 124 FPS, another 15 FPS higher than Ultra Quality and a full 35 FPS higher than native res. The real question is: is there a significant quality difference that’d make you not use that setting? No. No there isn’t. While I can discern a quality difference by carefully studying the footage, when you’re actually playing it’s no big deal at all. Totally playable, and probably the best setting outright.

Once you start moving down to Balanced, the quality difference starts to get more noticeable. Balanced is serviceable, but even in-game it’s noticeably kinda blurry – certainly less sharp than any of the previous settings. The performance does improve slightly, to 127.6 FPS average, up from 124 FPS, but that’s such a marginal difference that I don’t think it’s worth it.

Unsurprisingly, the Performance mode is even worse visually – verging on unplayably blurry. Details are smeared together, because, well, they aren’t being rendered. The upscaler is having a hell of a time trying to stretch what I have to assume is roughly a 720p frame up to 1600p. I don’t think I need to show you a comparison for this one – it’s just bad. And for what? 1.6 FPS average more? Yeah, no thanks.
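For context on why these lower modes look the way they do, here’s the rough math. Arrowhead doesn’t publish the exact render-scale ratios Helldivers 2 uses, so the scale factors below are assumptions based on the standard FSR-style tiers – but they show why a 1600p output can be working from something close to a 720p internal frame:

```python
# Approximate internal render resolutions at a 2560x1600 output.
# NOTE: the scale factors are ASSUMED (typical FSR-style tiers);
# Helldivers 2's exact ratios aren't documented.
OUTPUT_W, OUTPUT_H = 2560, 1600

scale_factors = {
    "Ultra Quality": 0.77,
    "Quality": 0.67,
    "Balanced": 0.59,
    "Performance": 0.50,
    "Ultra Performance": 0.33,
}

for mode, scale in scale_factors.items():
    # Internal resolution is just the output scaled per axis.
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{mode:>17}: ~{w}x{h} internal -> {OUTPUT_W}x{OUTPUT_H} output")
```

Under those assumptions, Performance lands at roughly 1280×800 – right in 720p territory – while Ultra Performance drops to around 845×528, which lines up with just how rough it looks.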

The award for the worst setting imaginable goes to… Ultra Performance! Look at this, man… It’s trash! It legitimately looks like you’re playing at 144p – it’s dreadful. You can’t make out the parts of the bugs at all. Want to headshot them? Good luck. Want to hit the squishy weak spot? Hahahaha, good luck! Look at the jets from the extraction shuttle – it’s like we’re actually playing Minecraft with how blocky that is. Compare that to the Quality setting and, while there are a couple of blocky artefacts, it’s otherwise perfectly usable. And again, for what? In my testing I got LESS performance, albeit slightly, when running at Ultra Performance. Even the 1% lows are on the lower end, likely thanks to some instability from the upscaler having to do so much heavy lifting.

So, what are the best settings? Well, personally I’m going to be gaming on the Medium preset with the Render Scale set to Quality. That should provide the best balance of visual quality and playability on pretty much any system. If you’ve got a higher-end GPU though, you might want to try Ultra Quality, or bump the preset up to High if you don’t mind a few frames less. I think Ultra is a bit too cinematic for me – both in framerates and in visuals – and the slightly lower settings provide the best balance of competitive advantage and visual quality. Those are my thoughts though, and I’d love to hear yours in the comments down below!