Yawn...ok...let me add my 2¢ (BTW, I did some work on this in college, so please don't quote some no-name newsgroup post as fact.)
~24 FPS is the approximate minimum at which the eye can no longer tell the difference between a stream of still pictures and normal 'smooth' video. This does not mean the eye cannot see a difference...it just means that's the point where you are no longer distracted by a frame-by-frame look. A good example is an inexpensive digital camera that takes movies....they typically do 15 FPS, which looks OK, but you can tell it's a little off. Bear in mind that this number does not mean 100% realism...it only means you can't discern individual frames.
~72 Hz is the minimum at which you can no longer discern FLICKER (at 60 Hz and below, everyone sees flicker....60-68, most people see flicker....68-72, some people see flicker....above 72, almost no one can see flicker). Run a strobe at 72 Hz and it will look like a solid light to you. This is why monitor refresh rates should be kept at 72 Hz or better.
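To put concrete numbers on the rates mentioned above, here's a quick sketch (plain Python, just arithmetic) converting the FPS/Hz figures in this post into per-frame times:

```python
# Per-frame (or per-refresh) duration in milliseconds for the rates discussed.
# 1000 ms divided by the rate gives how long each frame sits on screen.
for rate in (15, 24, 30, 60, 72):
    print(f"{rate:3d} Hz/FPS -> {1000 / rate:.1f} ms per frame")
# 24 FPS works out to ~41.7 ms per frame; 72 Hz to ~13.9 ms per refresh.
```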
Now...let's remember that the 72 Hz figure has NOTHING to do with discerning smooth motion; it does, however, indirectly affect your ability to feel smooth motion (see below).
The key issue with video games, and why 24 FPS can't be called the golden target, is twofold. First, there is the issue of control input and framerate consistency. There is very little chance of holding a 30 FPS average without dropping (even if just for a moment) well into the low 20s or teens, and that is immediately noticed by the player. Second, there is little or no motion blur in video games. This accentuates your ability to 'feel' that something isn't smooth, because all through your life you are used to motion blur. Higher frame rates compensate for this by approaching the 72 Hz flicker tolerance, at which point the missing motion blur is overcome.
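The point about averages hiding drops is easy to show with a toy sketch (the frame times below are made up for illustration): a run that averages around 30 FPS can still contain individual frames far, far slower than that.

```python
# Hypothetical per-frame render times in ms for a short run of gameplay:
# mostly fast ~28 ms frames (about 36 FPS) plus two 75 ms hitches.
frame_times_ms = [28] * 18 + [75, 75]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000 / max(frame_times_ms)

# The average looks fine (~30 FPS), but the hitches dip to ~13 FPS,
# which is exactly what the player notices.
print(f"average: {avg_fps:.1f} FPS, worst frame: {worst_fps:.1f} FPS")
```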
As an example, please take a look at one of the old stop-motion Christmas specials or old monster movies. You don't feel the characters are really moving because there is NO MOTION BLUR....just like in computer games. This is because the animation is done frame by frame with the moving appendages in perfect focus. This has been overcome in recent years with techniques that induce motion blur, e.g. 'go-motion' (an ILM technique developed by Phil Tippett). All new high-end computer animation has this induced motion blur.
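The induced-motion-blur idea can be sketched in code: instead of sampling a moving object at one instant per frame, sample it several times across the frame's exposure window and average the results. This toy 1-D example is my own illustration, not from any particular renderer:

```python
def render_sharp(position):
    """One instantaneous sample: a 1-D 'image' with the object in one cell."""
    image = [0.0] * 10
    image[int(position) % 10] = 1.0
    return image

def render_blurred(start_pos, end_pos, subframes=4):
    """Average several samples taken across the frame's exposure window,
    smearing the object along its path -- the essence of induced motion blur."""
    accum = [0.0] * 10
    for i in range(subframes):
        t = i / subframes  # where in the exposure window this sample falls
        pos = start_pos + (end_pos - start_pos) * t
        for cell, value in enumerate(render_sharp(pos)):
            accum[cell] += value / subframes
    return accum

print(render_sharp(2))       # one bright cell: the stop-motion look
print(render_blurred(2, 6))  # energy spread along the path: the blurred look
```

The same trick (temporal supersampling) is how offline renderers fake a camera shutter; real-time games usually approximate it more cheaply with velocity-based post-process blur.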
In general, a 30 FPS average will provide a good game experience. Get to 40-50 FPS and most people are very, very happy; even the most elite hardcore "I can tell, damnit" gamers are unable to see anything past 60 FPS in double-blind testing.
ok
I'm done