Or you can argue that all performance past a certain point is "wasted": beyond some threshold (which differs for each user, of course), any extra speed simply isn't noticeable. By that measure, either AMD or Intel is good enough. And really, the graphics card, not the CPU, is the bottleneck for most people playing games.
The exceptions are people who continuously peg their CPU at 100% because they are grinding 24/7 on those CPU-hungry (and electric-bill-hungry) distributed computing projects. I wish distributed computing projects would also track the COST of the project to planet Earth by letting users enter the approximate wattage their PC draws with the CPU at 100%, along with their cost of electricity per kilowatt-hour. Each user's cost could then be added to the project's running total.
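The arithmetic behind that idea is simple: watts times hours gives watt-hours, divide by 1000 for kilowatt-hours, multiply by the local price. A minimal sketch, with entirely hypothetical users and prices (no real project reports data this way):

```python
# Sketch of the per-user cost tracking idea: each user reports their
# PC's wattage at 100% CPU load and their local electricity price,
# and the project sums a running total. All names and figures below
# are made-up illustrations, not any real project's data or API.

def user_cost(watts, hours, price_per_kwh):
    """Electricity cost of running a PC drawing `watts` for `hours`."""
    kwh = watts * hours / 1000.0  # watt-hours -> kilowatt-hours
    return kwh * price_per_kwh

# Hypothetical crunchers: (wattage at 100% load, hours run, $/kWh)
users = [
    (350, 720, 0.15),  # desktop pegged 24/7 for a month
    (120, 300, 0.22),  # laptop, part-time
]

project_total = sum(user_cost(w, h, p) for w, h, p in users)
print(f"Running total cost to participants: ${project_total:.2f}")
```

A desktop drawing 350 W around the clock for a month burns about 252 kWh, so even at modest rates the "donated" electricity adds up fast.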
Usopp: Hey, this item A displays results in 0.000000000001 of a second, which is TEN TIMES faster than this other thing that takes 0.00000000001 of a second! These specs are great! It's a big difference!
Nami: Usopp, do you really notice the difference between 0.000000000001 seconds and 0.00000000001 seconds?
Usopp: ??
Luffy: ??
Zoro: ??
Nami: Okay, okay, it's one of those mystery things where they're more or less the same, because they're both good enough.
Crew: Yes!
Horo: *reaches for item A* That's an Apple product. I prefer Apple products because they are premium and because I am Horo the Intelligent Wolf.
Nami: You mean sage wolf, don't you?
Horo: *wrinkling her nose* No, sage is a herb used for making sausages.