GTX 750 Ti in 2021 – Still playable?


This bad boy is a GTX 750 Ti, a now 7-year-old GPU with a whopping 2GB of VRAM, 640 CUDA cores, a 1085MHz boost clock at stock, and what feels like a massive 28 nm process node. This specific one is a Palit StormX OC model, meaning it runs at up to 1163MHz on boost. Thanks to its just 60W TDP, this has no PCIe power connector – no 6 or 8 pins in sight, and NVIDIA’s funky 12 pin was just a dream in Jensen’s mind when this launched. By all accounts this was a very mid-range card, which explains why even in 2021 more gamers on Steam have one of these than gamers with RTX 3060 Tis and gamers with RX 480s combined! So, how has it held up? How does it perform in today’s titles? Let’s test it, and find out.

I’m under no illusion here – this card wasn’t exactly a powerhouse when it was new, so I’m not expecting it to magically offer record-breaking performance at 1440p or 4K. No, I think 1080p will be our “high resolution option”, with 720p as our backup if the 1080p performance is dreadful. I also don’t expect it to do all that well at higher settings, so I’ll be running each game through its various settings presets, all the way from low to ultra, to see how it stacks up. So, let’s get started.

Watch Dogs Legion is up first, and at 1080p there really wasn’t what I’d call a playable setting. The best we got was 22 FPS, with high running at 18 FPS, very high at just 15 FPS, and ultra… well, that blue screened. Yeah. I suspect this is down to our lack of VRAM – the card has just 2GB available, but even at low the game reported using 3.16GB, with 3.25GB on medium, 3.97GB on high and a full 6.22GB on very high, meaning those last two runs were spilling more data into system memory than the card could hold in VRAM. I’ve only got 16GB in my test system (alongside the Ryzen 5600X I’m testing with), and the game uses a good 8GB or so; Windows and open apps need a bit more, so it’s feasible all that overflow had it dipping into the damn page file.
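For the curious, the spill arithmetic above can be sketched in a few lines. This is just a back-of-the-envelope illustration using the usage figures the game reported – the variable names and the script itself are my own, not anything from the benchmark tooling:

```python
# Back-of-the-envelope VRAM spill estimate for the Watch Dogs Legion runs.
# Figures are the game-reported memory usage per preset; the 750 Ti has 2GB.
VRAM_GB = 2.0

reported_usage_gb = {
    "low": 3.16,
    "medium": 3.25,
    "high": 3.97,
    "very high": 6.22,
}

for preset, used in reported_usage_gb.items():
    # Anything the game wants beyond the card's 2GB has to live in system RAM
    spill = max(0.0, used - VRAM_GB)
    print(f"{preset}: {used:.2f}GB requested, ~{spill:.2f}GB overflowing to system RAM")
```

Even on low the game is asking for over 1GB more than the card has, and on very high the overflow alone is more than double the card’s entire VRAM – which lines up with those runs falling apart.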

Moving on to Cyberpunk, I’m afraid the picture isn’t any prettier. Just 16 FPS average on low quality and textures, and 12 FPS in the 1% lows. Medium quality and textures hurts further at 13 FPS average, high nets less than 10 FPS, and high textures plus ultra quality nets just 7 FPS average. Here’s what that looks like, by the way. The game is running so slowly that the physics engine has the AI literally walking in slow motion, and your inputs can take literal seconds to register. It’s fully unplayable – in fact, even at low everything it was almost impossible to use.

CSGO, as you might expect, is much more playable. Even on its highest settings you should be netting nearly 100 FPS average, with 1% lows over 60 FPS. A medium setup offers more, at around 140 FPS average, and everything on low pumps that to 231 FPS average, with 160 FPS in the 1% lows. That’s pretty fantastic, considering the card gets at best 16 FPS in Cyberpunk. You can expect this sort of performance across most esports titles, which really is what this card is best for.

Moving back to a more demanding title in Microsoft Flight, the card is so old and low end that when you start the game up it gives you a message box saying the system doesn’t meet the minimum specification. It still lets you run the game – but at 1080p at least, not well. You get 23 FPS average on low, which is definitely verging on unplayable, and to top it off the visual quality is pretty bad, so I can’t say I’d personally be enjoying my time playing this on this card. It only gets worse from there: medium runs at just 16 FPS average, high-end at just 12 FPS, and ultra at only 8 FPS average.

Finally, in Fortnite we get a rather impressive range. On low, albeit with proper potato quality, you get a mental 239 FPS average and 135 FPS in the 1% lows. Stepping up to medium offers the best balance at 99 FPS average and 57 FPS in the 1% lows, while still offering enough visual quality to see what’s actually going on around your character. High pushes a little too hard, dropping to just 33 FPS average and a dreadful 15 FPS in the 1% lows, and ultra… just 19 FPS average, and 8 FPS in the 1% lows. Let’s just say you’d be at a very significant competitive disadvantage playing at ultra on one of these.

So, at 1080p the lighter-weight games like CSGO – and, at the right settings, Fortnite – are perfectly playable even at high refresh rates. The more modern, intensive titles, though, are where you’ll struggle. Getting a little over 20 FPS average even on the lowest settings makes for an uncomfortable, or in some cases downright unplayable, experience. But that’s where 720p comes in, right?

Well, sort of. At 720p in Watch Dogs Legion it didn’t blue screen, although at ultra you only get 14 FPS average, so it’s a small consolation. But at medium you now get 34 FPS average, and low nets 36 FPS, and the 1% low numbers aren’t too bad either at 27 and 28 FPS respectively. It’s not ideal, but if you really wanted to play it at 720p, you actually could.

Cyberpunk is a similar story, as on low you get just a hair under 30 FPS average. It’s still not perfect, and the image quality really takes a hit: at low you get blurry, blocky rendering on anything not right in front of you, and there’s even a weird shimmer behind moving objects, as each pixel tries to show half a person and half the road or building behind them, promptly bugging out in a pretty obvious way. Still, it’s somewhat playable, and that’s great.

CSGO unsurprisingly runs just fine, now getting 140 FPS average on the highest settings, 218 FPS on medium and 326 FPS on the lowest. I’d personally prefer to run this one at 1080p, though, as much like in Cyberpunk, objects (or enemies) at a distance fade into a blur of pixels, and that can cost you if you can’t even see an enemy, let alone fight them.

MS Flight also sees a welcome boost at 720p, with the low-end preset netting 36 FPS average and 32 FPS in the 1% lows. Visual quality is still pretty poor – poorer, in fact, thanks to the resolution – but if you really want to give it a go on your 750 Ti, 720p low is what you need.

Lastly, in Fortnite we again have an impressive range, netting a touch more low-preset performance at just shy of 250 FPS average and 145 FPS in the 1% lows, plus a couple more FPS at medium, now running at 103 FPS average. The star, though, is high, running at just shy of 60 FPS average. I did have some rather strange and annoying texture/model loading bugs, which could be a problem in matches, although strangely switching to epic settings fixed them, albeit by halving the performance too.

So, the 750 Ti in 2021 – what’s it good for? Absolutely noth… Well, actually… For esports titles like CSGO, or Fortnite on lower settings, this is still perfectly fine. Even at 1080p you should be netting at or over 100 FPS on high or medium settings in CSGO, and if you want more, drop it to low and get over 200 FPS; the same goes for Fortnite’s lower settings. If you do want to branch out into more demanding titles, you will need to sacrifice image quality for playable performance, and you might even have to concede to a lower resolution too, but if that lets you play the game, that’s the part that matters, right?

I suppose the next question to answer is: is it time to upgrade? Well, if all you play is CSGO and you aren’t seeing any significant frame drops or stuttering, it’s probably fine as is. If you want to play a wider selection of games, then yeah, it’s probably time to move up a tier or two, even if that’s just generationally, as even a GTX 1060 6GB will net you a comfortable 60 FPS (on low) in pretty much any game at 1080p.

Just so I cover all the bases, I should mention that while I have been testing with a modern CPU, a Ryzen 5600X, I’ve tested older GPUs with older CPUs and found that the GPU’s power is normally low enough that the CPU isn’t a significant limiting factor, meaning upgrading your CPU while keeping an old GPU like this won’t net you much in the way of performance gains.