At first I was going to say that the CPU wouldn't really matter once you're past 3 GHz on a single core. That's also what our leading PC magazine recently claimed. And I'm quite happy with my i7-4790, which is two generations (and years) older than yours.
BUT
Then I got to thinking about my buddy who plays CS:GO semi-seriously. He used to have a rig similar to mine, equipped with a GTX 1080, but in tight situations he said that although his frame rates didn't drop significantly (1440p, 144 Hz monitor), he felt some sort of lag. So off shopping we went and got him a package built around a 10th- or 11th-generation Intel CPU. According to him everything now feels more fluid, and he seems to see the enemy a fraction of a second sooner, which gives him more time to react. So...
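That "felt lag despite decent average FPS" pattern usually points to frame-time spikes: a handful of slow frames barely move the average, but they dominate the 1% lows that benchmark sites measure. Here's a minimal Python sketch with made-up frame times (purely illustrative numbers, not real measurements from his rig) showing how the two figures can diverge:

```python
# Why average FPS can look fine while the game still "feels" laggy:
# a few slow frames barely move the average, but they dominate the 1% lows.
# Frame times below are invented for illustration only.

frame_times_ms = [7.0] * 990 + [40.0] * 10  # mostly ~143 FPS, plus 10 stutter frames

n = len(frame_times_ms)
avg_fps = 1000 * n / sum(frame_times_ms)

# "1% low FPS": average FPS computed over only the slowest 1% of frames
worst_1pct = sorted(frame_times_ms)[-max(1, n // 100):]
low_1pct_fps = 1000 * len(worst_1pct) / sum(worst_1pct)

print(f"average FPS: {avg_fps:.0f}")   # ~136, looks fine for a 144 Hz monitor
print(f"1% low FPS:  {low_1pct_fps:.0f}")  # 25, and this is the lag you feel
```

A faster CPU mainly helps those worst-case frames, which is consistent with what my buddy reported even though his average frame rate was already high.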
A quick and dirty Google search shows that various benchmarks tell a similar story. It seems that something happened after the 8th generation; there's not much performance difference between the i5-9600, 10600, 11600 and 12600. Maybe the extra cores take care of background activity, I don't know. AH and many other games can only use a couple of cores, most likely no more than four. But Windows has grown fatter and seems to use more resources than it used to. I've noticed the same with hard disks: an older mediocre laptop with an HDD can suffer tremendously from high disk activity caused by updates. Apparently the programmers don't bother optimizing their code for slower computers...
Here's some further information on how a newer CPU affects gaming:
https://youtu.be/hxII7cWH7dI