Not quite right, streakeagle. The CPU is always important, but you will find those with high-end video cards running higher resolutions with AA and AF, which mask the overall potential gains from this test.
Without knowing what those settings are, you cannot really get an idea of how much the video card matters in this test.
Take a look at this example.
1280x1024 = 1,310,720 pixels per frame
1024x768 = 786,432 pixels per frame
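To make the arithmetic explicit, here is a quick Python sketch of the pixels-per-frame figures, using just the two resolutions quoted above:

```python
# Pixels drawn per frame at each resolution (width x height)
resolutions = {
    "1280x1024": 1280 * 1024,  # 1,310,720 pixels per frame
    "1024x768": 1024 * 768,    # 786,432 pixels per frame
}

for name, pixels in resolutions.items():
    print(f"{name} = {pixels:,} pixels per frame")
```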
Now, let's take a look at the numbers for one of the low-end video cards, using the ground vis. An FX5200 ran it from 23 to 50 FPS. On the surface that indicates a 54% increase in performance (taking the spread as a fraction of the higher frame rate).
But the real performance gain can be expressed as a pixels-per-second number. At 23 FPS and 1024x768 resolution, that works out to 18,087,936 pixels per second. At 50 FPS it would be 39,321,600 pixels per second. I happen to know this particular system was running at that resolution.
Now, let's take an FX6600. It ran from 31 to 75 FPS, a 59% gain in performance, and it happens to be running at 1280x1024. At 31 FPS that would be 40,632,320 pixels per second. At 75 FPS, it would be 98,304,000 pixels per second.
What do these systems have in common? Both are 3.2GHz Pentium 4s with 1GB of PC3200 RAM.
FX5200: 23 FPS / 18,087,936 PPS to 50 FPS / 39,321,600 PPS.
FX6600: 31 FPS / 40,632,320 PPS to 75 FPS / 98,304,000 PPS.
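Anyone who wants to check those pixel-per-second figures can reproduce them with a minimal sketch like this (Python, using only the resolutions and frame rates quoted above):

```python
# Pixels per second = pixels per frame x frames per second
cards = {
    # card: (width, height, low FPS, high FPS)
    "FX5200": (1024, 768, 23, 50),
    "FX6600": (1280, 1024, 31, 75),
}

for card, (w, h, low_fps, high_fps) in cards.items():
    pixels_per_frame = w * h
    low_pps = pixels_per_frame * low_fps
    high_pps = pixels_per_frame * high_fps
    print(f"{card}: {low_fps} FPS = {low_pps:,} PPS, {high_fps} FPS = {high_pps:,} PPS")
```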
Now if you look strictly at frame rate gains, the FX6600 gained about 26% over the FX5200 at the low end. Looking at the PPS gains shows a very different number: a 56% gain over the FX5200 at the low end, and a 60% PPS gain at the high end.
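Here is a rough sketch of that comparison. The gain() helper is just my shorthand for how the percentages above were figured, with the spread taken as a fraction of the larger number; the inputs are the FPS and PPS values quoted above.

```python
def gain(smaller, larger):
    """Spread between two figures, expressed as a fraction of the larger one."""
    return (larger - smaller) / larger

# Frame rate comparison, FX5200 vs FX6600
fps_gain_low = gain(23, 31)    # low end, about 26%
fps_gain_high = gain(50, 75)   # high end, about 33%

# Pixel rate comparison, FX5200 vs FX6600
pps_gain_low = gain(18_087_936, 40_632_320)    # low end, about 56%
pps_gain_high = gain(39_321_600, 98_304_000)   # high end, 60%

print(f"FPS gain: {fps_gain_low:.1%} (low end), {fps_gain_high:.1%} (high end)")
print(f"PPS gain: {pps_gain_low:.1%} (low end), {pps_gain_high:.1%} (high end)")
```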
Bottom line, you cannot do a proper comparison using only frame rates. Pixel rates are more important. And I have not attempted to factor in the AA or AF impact either.