RTX 3080 with an OLD CPU – Should you upgrade?
Nvidia created a lot of hype with the RTX 3000 launch, prompting a lot of people to think about upgrading, potentially for the first time in a while. But one question I’ve been asked a lot recently is “should I upgrade my older CPU for a 3080?”, and what they are really asking is “how much of a bottleneck will my CPU be?” So in this video I want to answer that question and compare performance with this Asus TUF 3080, using a Ryzen 3900X and an Intel i7-4790K. But first, if you haven’t already, consider subscribing for more videos like this one every Monday, Wednesday and Friday.
Bottlenecking is a pretty common term to hear these days, but most people who use it kind of miss the point. You will always have a bottleneck in your system, be it your CPU, GPU, storage, heck, even your internet, depending on what you’re doing. Most people are talking about gaming performance, but even then it can vary a lot from game to game. The other thing to remember is that even if you’ve got a 10900K, playing at 1080p with a 3080, you are going to have a CPU bottleneck there, because I guarantee that if you overclock it, or pair that same GPU with the next generation of CPUs, you’ll get more performance.
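If you want to put rough numbers on that idea, here’s a quick back-of-the-envelope sketch in Python. The frame times are completely made up, just to illustrate that whichever part takes longer each frame is what sets your frame rate:

```python
# Back-of-the-envelope illustration only: the frame times below are made up,
# not measurements. Each frame needs CPU work (game logic, draw calls) and GPU
# work (rendering); the slower of the two sets the pace, i.e. the bottleneck.

def delivered_fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """FPS you actually see when every frame needs both CPU and GPU work."""
    slowest_ms = max(cpu_frame_ms, gpu_frame_ms)  # the bottleneck each frame
    return 1000.0 / slowest_ms

# At 1080p the GPU finishes its share quickly, so the CPU is the limit.
print(delivered_fps(cpu_frame_ms=6.0, gpu_frame_ms=4.0))   # ~167 FPS, CPU-bound
# At 4K the GPU has far more pixels to push, so it becomes the limit instead.
print(delivered_fps(cpu_frame_ms=6.0, gpu_frame_ms=10.0))  # 100 FPS, GPU-bound
```

Notice that in the second case a faster CPU wouldn’t change anything, which is exactly what the 4K numbers further down show.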
All of that is to say, whether you need to upgrade really depends on what you are doing. Now let’s get into the testing. I’m using newer DirectX 12 games here: COD Modern Warfare, Battlefield V and Fortnite, all in DX12 mode, all on pretty much max settings, and all three were tested at 1080p, 1440p and 4K. I also kept an eye on GPU power draw, temperatures and clock speeds to make sure those didn’t play a part in any performance differences, and the systems were as similar as I could make them: both with 16GB of RAM, obviously DDR3 vs DDR4, but still, the same OS, both cooled by a 240mm AIO, and of course the same GPU. So, what’s the difference?
Quite a lot.
COD MW (FPS) | 3900X | 4790K | Difference | % Diff |
1080p Avg | 202.4 | 159.8 | 42.6 | 21.1% |
1080p 1% Low | 152.9 | 119.6 | 33.3 | 21.8% |
1440p Avg | 162.8 | 156.2 | 6.6 | 4.0% |
1440p 1% Low | 126.9 | 111.6 | 15.3 | 12.1% |
4K Avg | 102.8 | 102.4 | 0.4 | 0.4% |
4K 1% Low | 87.0 | 78.2 | 8.8 | 10.1% |
BFV (FPS) | 3900X | 4790K | Difference | % Diff |
1080p Avg | 191.4 | 156.0 | 35.5 | 18.5% |
1080p 1% Low | 118.5 | 91.3 | 27.2 | 22.9% |
1440p Avg | 164.5 | 155.9 | 8.6 | 5.3% |
1440p 1% Low | 113.1 | 91.2 | 22.0 | 19.4% |
4K Avg | 104.2 | 107.2 | -3.0 | -2.8% |
4K 1% Low | 91.2 | 77.2 | 14.0 | 15.4% |
Fortnite (FPS) | 3900X | 4790K | Difference | % Diff |
1080p Avg | 180.7 | 160.6 | 20.1 | 11.1% |
1080p 1% Low | 114.2 | 97.8 | 16.4 | 14.4% |
1440p Avg | 136.0 | 137.0 | -1.0 | -0.7% |
1440p 1% Low | 79.1 | 86.8 | -7.8 | -9.8% |
4K Avg | 92.6 | 96.0 | -3.1 | -3.4% |
4K 1% Low | 68.4 | 61.6 | 6.8 | 9.9% |
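For anyone curious how numbers like these are derived: the averages and 1% lows come out of per-frame frame-time logs, and the % Diff column is just the drop relative to the newer CPU. Different tools define “1% low” slightly differently; the little Python sketch below uses one common approach, averaging the slowest 1% of frames, with a made-up log just to show the maths rather than my exact tool chain:

```python
# Sketch of how average FPS, 1% lows and the % Diff column can be computed
# from a frame-time log (one value per frame, in milliseconds). The log here
# is made up; definitions of "1% low" vary slightly between tools.

def average_fps(frame_times_ms: list[float]) -> float:
    # Average FPS over the whole run = total frames / total time.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms: list[float]) -> float:
    # Average FPS of the slowest 1% of frames.
    worst_first = sorted(frame_times_ms, reverse=True)
    count = max(1, len(worst_first) // 100)
    slowest = worst_first[:count]
    return 1000.0 * len(slowest) / sum(slowest)

def percent_diff(newer_cpu_fps: float, older_cpu_fps: float) -> float:
    # The "% Diff" column: the drop relative to the newer CPU's result.
    return 100.0 * (newer_cpu_fps - older_cpu_fps) / newer_cpu_fps

frame_log_ms = [5.0, 5.2, 4.9, 5.1, 12.0, 5.0, 5.3, 4.8, 5.1, 5.0] * 20
print(round(average_fps(frame_log_ms), 1))          # ~174.2 FPS average
print(round(one_percent_low_fps(frame_log_ms), 1))  # ~83.3 FPS, dragged down by the stutters
print(round(percent_diff(200.0, 160.0), 1))         # a 200 -> 160 FPS drop is 20.0%
```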
So if you are gaming at 1080p on a 3080, first of all, what are you doing? Unless you’ve got a 240Hz monitor, 1440p 144Hz is the way to go, trust me on that. But if you are, your older CPU is going to be a fairly big bottleneck. Going from 200FPS in COD to 160, especially if you are using an ultra-high-refresh-rate monitor, including the new 360Hz ones coming out, is a big deal. Even in Fortnite, dropping 11% of your average FPS isn’t great, although for the average gamer, especially one with a 144Hz monitor, it really doesn’t matter much.
If you are playing at 1440p, the difference was negligible most of the time; the only thing to consider is the 1% lows, which were 10-20% lower on the 4790K, but they were still high enough that you wouldn’t notice. And at 4K, the GPU becomes the bottleneck entirely, and aside from the 1% lows again, the results are within margin of error, with the 4790K actually coming out slightly ahead in a couple of the averages.
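When I say “margin of error”, I just mean the normal run-to-run variation you get benchmarking the same setup twice. A rough way to judge whether a 2-3% gap is real is to repeat the run a few times and compare the gap to the spread, along these lines (the numbers here are invented, not my actual logs):

```python
# Rough sanity check: is a small FPS gap bigger than normal run-to-run noise?
# The run results below are invented for illustration, not my benchmark logs.
from statistics import mean, stdev

runs_3900x = [104.2, 106.1, 102.8]  # average FPS from repeated 4K runs
runs_4790k = [107.2, 103.9, 105.6]

gap = mean(runs_4790k) - mean(runs_3900x)
spread = max(stdev(runs_3900x), stdev(runs_4790k))

print(f"gap: {gap:.1f} FPS, run-to-run spread: ~{spread:.1f} FPS")
# If the gap isn't clearly bigger than the spread, call it margin of error.
```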
For a card like the 3080, it’s a little too high end for most people to care: getting 180FPS with a new CPU versus 160FPS with an older one won’t change most people’s gaming experience. So I think once the 3070 is available and in my hands, I’ll run this test again and see if it makes more sense to upgrade then, where losing 20% of your average FPS can matter a lot more. Hit that subscribe button if you want to see that, by the way!