OK. I admit it. I suck as a statistician.
This isn't scientific, nor is it particularly accurate. It's just designed to let people estimate their expected framerate to within an order of magnitude.
So, this may be the worst, most inaccurate, fudged up pile of fantasy since the U.S. Census, but here it is. ;0)
NOTE: I picked a sampling of the most representative data points and graphed them. It looks pretty linear. The data point at 400 MHz seems a little low, but I chalk that up to it being a non-Intel chip. In reviewing these numbers I would suggest applying the following modifiers (there's a rough sketch of the arithmetic after the list):
1. There didn't seem to be much difference between 16 and 32 MB video cards, but there did seem to be a drop between 8 and 16 MB. So, if you only have an 8 MB video card, subtract 5 fps from what's charted.
2. Non-Intel chips seemed slower. Subtract 5 fps if you have a non-Intel chip (except at the 400 MHz data point).
3. Subtract 3 fps if you run a resolution higher than 1024x768. Add 3 fps if you run lower than 1024x768.
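
If it helps, here's a rough Python sketch of that arithmetic. The baseline number is whatever you read off the chart at the link below; it isn't in this post, so the 30 fps in the example (and the function/parameter names) are just made up for illustration. Only the modifiers come from the list above.

    def estimate_fps(baseline_fps, vid_mem_mb, intel_cpu, cpu_mhz, res_width):
        # Apply the post's modifiers to a baseline fps read off the chart.
        fps = baseline_fps
        if vid_mem_mb <= 8:                   # 8 MB cards fell behind 16/32 MB
            fps -= 5
        if not intel_cpu and cpu_mhz != 400:  # non-Intel chips ran slower,
            fps -= 5                          # except at the 400 MHz data point
        if res_width > 1024:                  # above 1024x768
            fps -= 3
        elif res_width < 1024:                # below 1024x768
            fps += 3
        return fps

    # Example (made-up baseline): chart says ~30 fps, 8 MB card, AMD chip at 350 MHz, 800x600
    print(estimate_fps(30, vid_mem_mb=8, intel_cpu=False, cpu_mhz=350, res_width=800))  # -> 23
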
http://www.digitalsim.com/downloads/fps.gif

For what it's worth,
Wab