Originally posted by mrblack
FPS is silly anyway as the human eye cannot discern anything after 26 FPS.
*sigh*
WRONG.
And I won't elaborate because I'm tired of correcting this misconception maybe 40 times a year on different forums.
-edit-
*sigh*
I'm such a sucker for always trying to cure people's ignorance. Here it goes, copy-pasted from many other places:
The human eye can actually perceive differences *way* past 200 fps (this has reportedly been verified in military testing). It's a common misconception that the limit is much lower, because film runs at only 24-25 fps.
Film has motion blur; games (normally!) do not. 120 fps can 'feel' better than 60, despite people telling you otherwise. The question of how many fps the eye can perceive is complex, and really I think the other answer is closer to what you are asking.
When you drop to 10-15 fps in a fast-paced game, it becomes very difficult to do anything because of the visual lag. It is incredibly frustrating and not desirable at all. If you can get 200 fps in an empty room, you've got a good chance of getting a decent frame rate in a room full of people, rockets, gibs, etc.
It's important when talking about FPS to realize that your maximum frame rate is not really what matters! Far more important is your minimum frame rate! When we say we want more FPS, what we really mean is more FPS at the bottom of the range. It's no good to have a system that draws D3 at 100 FPS while you're cruising around but plunges to 15 FPS when you get into heavy combat (see the little sketch below).
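To make that concrete, here's a toy illustration with made-up frame times (nothing measured, just plausible numbers) showing how a healthy-looking average can hide the dips that actually hurt:

[code]
# Toy illustration with made-up frame times: the average hides the dips.

# Hypothetical per-frame render times in milliseconds:
# smooth cruising at 10 ms/frame, then a heavy-combat spike.
frame_times_ms = [10, 10, 10, 10, 10, 10, 10, 10, 67, 67]

total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds  # frames / total time
minimum_fps = 1000.0 / max(frame_times_ms)         # slowest single frame

print(f"average: {average_fps:.0f} FPS")  # ~47 FPS - sounds fine on paper
print(f"minimum: {minimum_fps:.0f} FPS")  # ~15 FPS - a slideshow in combat
[/code]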
The human eye can easily detect the flicker of a 60Hz refresh rate, and any time your video card produces fewer than 60 FPS it becomes obviously jerky. At rates above 60 FPS the situation changes: if your PC is delivering 60 FPS or more, your eye is largely fooled into perceiving smooth, non-flickering motion. Many people believe that frame rates above 60 per second are a waste of time and money because the eye can't detect all those extra frames. However, when frame rates climb into the triple digits, the human eye can still detect differences in quality, not quantity. Ultimately, only you can decide how many FPS is enough...
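If you want to see why the returns diminish, just look at how much screen time each frame actually gets (a quick back-of-the-envelope sketch):

[code]
# Per-frame display time at common frame rates. Each step up buys less
# absolute time than the last, which is why gains past 60 FPS are
# harder (though not impossible) to notice.
for fps in (15, 30, 60, 120, 200):
    print(f"{fps:>3} FPS -> {1000.0 / fps:5.1f} ms per frame")
# 15 FPS -> 66.7 ms, 30 -> 33.3, 60 -> 16.7, 120 -> 8.3, 200 -> 5.0
[/code]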
Frames per second matter because each "frame" drawn on your screen is a rock-solid, stand-alone image: fully detailed, with no "motion blur". You can confirm this by taking a screenshot (with the PRINT SCREEN key) while things on screen are moving fast. Television can survive on 30 fps because each image on a TV screen is blurry. Hit PAUSE on your VCR when things are moving fast to confirm this for yourself.
Some new video cards have introduced motion blur (or their version of it), which is meant to address the high fps requirements of games. Purists scoff and say it's a poor substitute for producing enough "un-blurry" images to fool the eye into believing continuous, high-definition movement.
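For the curious, the basic idea behind that kind of effect is temporal accumulation: sample the scene several times within one display frame and average the results. Here's a minimal sketch of the idea (purely illustrative; the names in it are made up, and it's not how any particular card actually implements it):

[code]
import numpy as np

# Minimal sketch of accumulation-style motion blur: sample a moving
# object several times within one display interval and average the
# samples, so fast movement smears just like it does on film.

WIDTH, SUBFRAMES = 32, 4  # hypothetical 1-D "screen" and sample count

def render_subframe(x_position):
    """Hypothetical renderer: a 1-D image with a bright 2-pixel object."""
    frame = np.zeros(WIDTH)
    x = int(x_position) % WIDTH
    frame[x:x + 2] = 1.0
    return frame

def blurred_frame(x_start, velocity):
    # Sample the object at several instants inside one frame interval,
    # then average: the object leaves a dimmer streak along its path.
    subframes = [render_subframe(x_start + velocity * (i / SUBFRAMES))
                 for i in range(SUBFRAMES)]
    return np.mean(subframes, axis=0)

sharp = render_subframe(10)              # one crisp sample: no blur
blurry = blurred_frame(10, velocity=8)   # four samples along the path
print("sharp  peak intensity:", sharp.max())   # 1.0  - hard-edged object
print("blurry peak intensity:", blurry.max())  # 0.25 - smeared streak
[/code]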
Conclusion:
YES, the human eye can tell the difference up to around 100 fps on average. I can easily distinguish between 30 and 60. In fact, I find 30 barely tolerable.
60 is the sweet spot. Forget about how high your PC can go - just make sure it NEVER goes below 60 and you will then have bragging rights...