I wasn't criticizing anything you said, I agree with all of it.
Maybe Skuzzy, or someone else, can explain this: I frequently wonder why GPU companies can make really decent generational jumps. The 980/980 Ti to the 1080/Titan Pascal jump was pretty significant no matter which card you pick or how you break it down; the cost/performance ratio took a big green up-arrow jump IMO, and the new cards used much less power and made less heat to boot. Yet on the CPU side, take whatever Intel platform you like (AMD too, I guess, I'm just not that familiar with their CPUs/motherboards anymore), compare it to the next or previous generation, and it's often single-digit real performance gains. WTH?! Why can the graphics companies put out faster, cooler, more efficient, and even cheaper products, while the CPU side of the house can't? Or is it more of a "doesn't" issue than a "can't" issue?
An accurate, researched, and fact-checked answer would make a fantastic article, as I've not really seen anything that great which specifically covers this.