RTX 3080 with an OLD CPU – Should you upgrade?


Nvidia created a lot of hype with the RTX 3000 launch, prompting a lot of people to think about upgrading, potentially for the first time in a while. But one question I’ve been asked a lot recently is “should I upgrade my older CPU for a 3080?”, and what they are really asking is “how much of a bottleneck will my CPU be?” So in this video I want to answer that question and compare performance with this ASUS TUF 3080, using a Ryzen 9 3900X and an Intel i7-4790K. But first, if you haven’t already, consider subscribing for more videos like this one every Monday, Wednesday and Friday.

Bottlenecking is a pretty common term to hear these days, but most people who use it kind of miss the point. You will always have a bottleneck somewhere in your system, be it your CPU, GPU, storage, heck even your internet connection, depending on what you’re doing. Most people are talking about gaming performance, but even then it can vary a lot from game to game. The other thing to remember is that even with a 10900K, playing at 1080p with a 3080, you are still going to have a CPU bottleneck, because I guarantee that if you overclock it, or the next generation of CPUs comes along, that same GPU will give you more performance.

All of that is to say, whether you need to upgrade really depends on what you are doing. Now let’s get into the testing. I’m using newer DirectX 12 games here, COD Modern Warfare, Battlefield V and Fortnite, all in DX12 mode, all on pretty much max settings, and all three were tested at 1080p, 1440p and 4K. I also kept an eye on GPU power draw, temperatures and clock speeds to make sure they didn’t play a part in any performance differences, and the systems were as similar as I could make them: both with 16GB of RAM (obviously DDR3 vs DDR4, but still), the same OS, both cooled by a 240mm AIO, and of course the same GPU. So, what’s the difference?

Quite a lot.
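
A quick note on the two metrics in the tables below: “AVG” is the average framerate over the whole run, and “1% Low” is based on the slowest 1% of frames, which is closer to what you actually feel as stutter. Here is a minimal sketch of how figures like these are typically derived from a frametime log; the exact 1% low definition varies between capture tools, so treat this as one common interpretation rather than the precise method behind these runs.

```python
# Minimal sketch: deriving average FPS and a "1% low" figure from a list of
# frametimes in milliseconds (e.g. exported from a frametime capture tool).
# NOTE: the exact 1% low definition varies by tool; here it is the average FPS
# over the slowest 1% of frames, which is one common interpretation.

def summarize(frametimes_ms):
    total_ms = sum(frametimes_ms)
    avg_fps = 1000 * len(frametimes_ms) / total_ms

    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = 1000 * len(worst_1pct) / sum(worst_1pct)

    return avg_fps, low_1pct_fps

# Example with made-up frametimes (ms):
avg, low = summarize([6.9, 7.1, 6.8, 12.4, 7.0, 6.7, 15.2, 7.3, 6.9, 7.1] * 100)
print(f"AVG: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```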

| COD MW | 3900X (FPS) | 4790K (FPS) | Difference (FPS) | % Diff |
|---|---|---|---|---|
| 1080p AVG | 202.43 | 159.79 | 42.64 | 21.06% |
| 1080p 1% Low | 152.91 | 119.62 | 33.29 | 21.77% |
| 1440p AVG | 162.80 | 156.21 | 6.59 | 4.05% |
| 1440p 1% Low | 126.90 | 111.61 | 15.30 | 12.05% |
| 4K AVG | 102.83 | 102.43 | 0.40 | 0.39% |
| 4K 1% Low | 87.03 | 78.25 | 8.78 | 10.09% |

| BFV | 3900X (FPS) | 4790K (FPS) | Difference (FPS) | % Diff |
|---|---|---|---|---|
| 1080p AVG | 191.44 | 155.95 | 35.49 | 18.54% |
| 1080p 1% Low | 118.48 | 91.32 | 27.16 | 22.92% |
| 1440p AVG | 164.50 | 155.86 | 8.64 | 5.25% |
| 1440p 1% Low | 113.12 | 91.16 | 21.96 | 19.42% |
| 4K AVG | 104.22 | 107.18 | -2.96 | -2.84% |
| 4K 1% Low | 91.16 | 77.16 | 14.00 | 15.35% |

| Fortnite | 3900X (FPS) | 4790K (FPS) | Difference (FPS) | % Diff |
|---|---|---|---|---|
| 1080p AVG | 180.71 | 160.58 | 20.13 | 11.14% |
| 1080p 1% Low | 114.16 | 97.75 | 16.40 | 14.37% |
| 1440p AVG | 136.02 | 136.97 | -0.95 | -0.70% |
| 1440p 1% Low | 79.05 | 86.81 | -7.75 | -9.81% |
| 4K AVG | 92.55 | 95.97 | -3.12 | -3.36% |
| 4K 1% Low | 68.40 | 61.61 | 6.79 | 9.92% |
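
The Difference and % Diff columns are derived straight from the two FPS figures; judging by the numbers, % Diff is taken relative to the 3900X result, i.e. how much performance the 4790K gives up. A quick worked example using the COD MW 1080p averages:

```python
# Worked example for the Difference / % Diff columns, using the COD MW 1080p
# averages from the table above. % Diff appears to be relative to the 3900X
# result, i.e. the share of FPS the 4790K gives up.
fps_3900x = 202.43
fps_4790k = 159.79

difference = fps_3900x - fps_4790k        # 42.64 FPS
pct_diff = 100 * difference / fps_3900x   # ~21.06 %

print(f"{difference:.2f} FPS slower, {pct_diff:.2f}% behind the 3900X")
```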

So if you are gaming at 1080p on a 3080, first of all, what are you doing? Unless you’ve got a 240Hz monitor, 1440p 144Hz is the way to go, trust me on that. But if you are, your older CPU is going to be a fairly big bottleneck. Going from 200FPS in COD down to 160, especially if you are using an ultra high refresh rate monitor, including the new 360Hz ones coming out, is a big deal. Even in Fortnite, dropping 11% of your average FPS isn’t great, although for the average gamer, especially one with a 144Hz monitor instead, it really doesn’t matter much.

If you are playing at 1440p, the difference was negligible most of the time. The only thing to consider is the 1% lows, which were 10-20% lower on the 4790K, but they were still high enough that you wouldn’t notice. And at 4K the GPU becomes the bottleneck entirely, so aside from the 1% lows again, the results are within margin of error, which is even better news for the older CPU.

For a card like the 3080, it’s a little too high end for most people to care: getting 180FPS with a new CPU versus 160FPS with an older one won’t change most people’s gaming experience. So once the 3070 is available and in my hands, I’ll do this test again and see if upgrading makes more sense then, where losing 20% of your average FPS can matter a lot more. Hit that subscribe button if you want to see that, by the way!