Aces High Bulletin Board
General Forums => Aces High General Discussion => Topic started by: NHawk on October 18, 2003, 10:50:38 AM
-
Those of you who think you're getting 200+ fps are most likely kidding yourselves, simply because of a limiting factor which I'll try to explain as simply as possible, using nice even numbers to keep it that way.
First, take into account screen refresh rate. Let's say you have your screen refresh rate set to 100Hz. That means your screen is being redrawn 100 times per second. It can't be redrawn any faster than that. Period.
Now, you check your frame rate and your system says you're getting 200fps. You think WOW this is great! Well, guess again. :)
What the system is telling you is that the video card is sending information to the screen at 200fps. BUT, what you are actually seeing is 100fps. The screen is incapable of redrawing the screen at 200fps because of the limiting refresh rate.
What's happening to the other 100 frames? They are most likely being lost completely. The system is displaying every other frame.
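To put rough numbers on it, here's a purely illustrative sketch (real drivers vary; the 200/100 figures are just the ones above):

```python
# Illustrative only: a 100 Hz monitor can show at most the latest completed
# frame at each refresh, so a card finishing 200 fps gets half of them seen.
RENDER_FPS = 200   # frames the card completes per second (assumed)
REFRESH_HZ = 100   # times per second the monitor redraws (assumed)

shown = set()
for refresh in range(REFRESH_HZ):
    # index of the last frame finished before this refresh fires
    latest = int((refresh + 1) * RENDER_FPS / REFRESH_HZ) - 1
    shown.add(latest)

print(f"rendered: {RENDER_FPS}, actually displayed: {len(shown)}")
# rendered: 200, actually displayed: 100 -- every other frame is lost
```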
Are there monitors able to refresh at 200Hz? Probably, but most of us don't own one.
Now, I'll leave this to debate and bow out of the conversation.
:rolleyes:
-
k
-
NHAWK, what you say is very true, but can you explain this?
(http://www.onpoi.net/ah/pics/users/ah_154_1066498221.jpg) :confused:
-
zmeg, you probably have vsync disabled, which raises the in-game frame rate but is generally not recommended (according to Skuzzy).
-
Photoshop:aok
-
Originally posted by NHawk
Those of you who think you're getting 200+ fps are most likely kidding yourselves, simply because of a limiting factor which I'll try to explain as simply as possible, using nice even numbers to keep it that way.
First, take into account screen refresh rate. Let's say you have your screen refresh rate set to 100Hz. That means your screen is being redrawn 100 times per second. It can't be redrawn any faster than that. Period.
Now, you check your frame rate and your system says you're getting 200fps. You think WOW this is great! Well, guess again. :)
What the system is telling you is that the video card is sending information to the screen at 200fps. BUT, what you are actually seeing is 100fps. The screen is incapable of redrawing the screen at 200fps because of the limiting refresh rate.
What's happening to the other 100 frames? They are most likely being lost completely. The system is displaying every other frame.
Are there monitors able to refresh at 200Hz? Probably, but most of us don't own one.
Now, I'll leave this to debate and bow out of the conversation.
:rolleyes:
You're right and you're wrong. You're right if the player has Vsync enabled, which ties the frame rate to the refresh rate of the monitor. But if the player has Vsync disabled, then the frame rate can be higher than the refresh rate, since it's no longer tied to the refresh rate. That is why you see stuff like texture ripping and other graphic anomalies with Vsync disabled.
Lesson over.
Ack-Ack
-
It's all semantics.
The number of frames displayed can never exceed the monitor's refresh rate.
The number of frames processed is limited only by the CPU/video card. (Vsync waits in between frames long enough for the monitor to finish refreshing.)
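A minimal game-loop sketch of that vsync wait, with made-up timings:

```python
import time

REFRESH_HZ = 100
SLOT = 1.0 / REFRESH_HZ        # 10 ms between refreshes at 100 Hz

def render_frame():
    time.sleep(0.005)          # pretend rendering takes 5 ms (assumed)

for _ in range(100):           # one second of simulated frames
    start = time.monotonic()
    render_frame()
    # the vsync wait: idle until the monitor finishes its current refresh
    leftover = SLOT - (time.monotonic() - start)
    if leftover > 0:
        time.sleep(leftover)   # caps displayed fps at the refresh rate
```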
For the most part the only use of vsync off is for benchmarking. As mentioned, it will cause annoying artifacts, especially with a lot of movement.
-
Actually...
What happens when you turn vsync off is that the video card doesn't synchronize its output to the monitor's refresh rate. The result is that those extra frames are being drawn, just as fragments. So, for example, if you have 200 fps at a 100 Hz (100 fps) refresh rate, every frame the monitor draws would contain 2-3 pieces. In the simplest case the top half would be from the first frame the video card rendered, and the bottom half would be from the second frame. The downside to this is that fast motion in the scene (or rapidly changing colors) would leave a noticeable separation between the halves of the screen. Of course if your refresh rate is high enough you'd be unlikely to notice it during gameplay, but screenshots would catch it.
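A small sketch of where those fragments land, using the 200 fps / 100 Hz numbers above (the scanline count is invented):

```python
# With vsync off, the buffer can be swapped mid-scanout, so different
# bands of one displayed frame come from different rendered frames.
REFRESH_HZ, RENDER_FPS, SCANLINES = 100, 200, 1000   # assumed numbers

scanout = 1.0 / REFRESH_HZ       # 10 ms to paint the whole screen
frame_time = 1.0 / RENDER_FPS    # a new frame is ready every 5 ms

bands = {}
for line in range(SCANLINES):
    t = line / SCANLINES * scanout        # moment this scanline is painted
    source = int(t / frame_time)          # rendered frame current at time t
    bands.setdefault(source, []).append(line)

for frame, lines in sorted(bands.items()):
    print(f"rendered frame {frame}: scanlines {lines[0]}..{lines[-1]}")
# rendered frame 0: scanlines 0..499
# rendered frame 1: scanlines 500..999   <- the boundary is the visible tear
```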
The point is there isn't any real benefit to turning vsync off other than benchmarking, so the maximum effective framerate is capped by your monitor's refresh rate. Adding to that, typical monitor refresh rates meet or exceed the speed at which the human eye can process images anyway.
-
Claiming that fps over the monitor refresh rate would be useless is total BS. The true fps varies a lot during gameplay, and the fluctuations may cause noticeable lag regardless of how high the fps seems to be at any given time.
Some players think that the eye can't see anything past 30fps. However, 30fps is way too slow during the fast movements which are typical in gaming. The eyesight part is not all there is to it, though.
When your fps varies, your game performance varies too. Therefore when your fps drops noticeably, it also affects the way the game responds to your commands. It affects the way your packets are being sent to the server; it affects your whole game performance.
Therefore a player who gets a constant 200fps is FAR better off than a player who hardly gets 85 (with or without monitor refresh) and freezes up during a heavy battle.
-
FPS rates over your refresh rate won't make a difference visibly (most framerates over ~30 won't make much of a difference), it's just a measure of performance.
The whole point of getting as many frames per second drawn to the backbuffer as possible is that framerates are going to drop when more polygons and effects are being rendered, or if more complex physics calculations are occurring. If you're getting 200 fps in a low detail situation, it doesn't matter if you don't see a difference between 200 and 60 fps. 200 fps is still much preferable, because it means that your system is less likely to drop below the 30 fps threshold during a stressful high-detail scene.
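To put rough numbers on that headroom argument (all values assumed):

```python
# If a light scene renders at 200 fps, each frame costs 5 ms, so the scene
# can get ~6-7x heavier before frame time crosses the ~33 ms (30 fps) line.
for baseline_fps in (200, 60):
    cost_ms = 1000 / baseline_fps          # per-frame cost in the light scene
    limit_ms = 1000 / 30                   # budget before it feels choppy
    print(f"{baseline_fps} fps baseline: scene can get "
          f"{limit_ms / cost_ms:.1f}x heavier before dropping below 30 fps")
# 200 fps baseline: 6.7x headroom; 60 fps baseline: only 2.0x
```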
-
Originally posted by Siaf__csf
Some players think that the eye can't see anything past 30fps. However, 30fps is way too slow during the fast movements which are typical in gaming. The eyesight part is not all there is to it, though.
It is true that the eye can't see more than 30fps. That's actually why a higher framerate is preferable.
With film running at 24fps, each frame represents (almost) 1/24th of a second. Shutter opens, and any movement during that fraction of a second is recorded as a blur. That blur gives the impression of smooth movement. However, in a computer game, each frame only represents one discrete moment in time, making the transitions between frames appear jumpy. With higher framerates, multiple frames add up over those fractions of a second to produce what to our eyes is a motion blur effect.
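A toy example of that accumulation idea, blending several rendered instants into one displayed image the way an open film shutter does (all values assumed):

```python
import numpy as np

SCREEN, SUBFRAMES, SPEED = 32, 8, 2    # 1-D 'screen', assumed values

displayed = np.zeros(SCREEN)
for i in range(SUBFRAMES):
    frame = np.zeros(SCREEN)
    frame[(i * SPEED) % SCREEN] = 1.0  # a sharp dot at this instant
    displayed += frame / SUBFRAMES     # accumulate, like an open shutter

print(np.flatnonzero(displayed))       # [ 0  2  4  6  8 10 12 14]
# one discrete dot per frame becomes a smeared streak -- perceived blur
```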
-
Actually Siaf, what has been said is fairly accurate.
There are a number of factors involved with accurate frame draws as it relates to FPS. A video card has many options available to determine what should be drawn on the display at any given moment.
When a game displays the FPS, it is displaying the rate at which it can render a frame. Just because a game renders it, does not mean that frame is actually being displayed on the video display.
Now it gets complicated. When vsync is enabled, you are assured that every frame rendered is displayed, as long as the video subsystem properly honors vsync. So what is a game doing when it cannot shove data to the video card during a frame draw?
Well, it depends on how many frame buffers a game allocates and if the frame buffers are full or not when the game wants to render a frame. It also depends on how the video subsystem (hardware/driver) decides it wants to handle it.
Sometimes a video card can stall the data delivery so a frame will not be missed, sometimes it might throw away a frame buffer to allow the game to continue. Really hard to know as it is dependent on the video subsystem.
Most of the time a video subsystem will make the best attempt to make sure the video frames are rendered so they do not miss any data, which keeps the video smooth and accurate.
Without vsync, a video subsystem has several options available. It can overwrite the video frame, which will typically cause 'tearing' in the video display due to mismatched frames being partially overlaid.
The video subsystem may opt to finish the current frame buffer and overwrite previous buffers, which can cause some jerkiness in the motion of objects, but this method eliminates the 'tearing' effect.
A game could also send the same frame over and over again, if the update information is not available for the next frame. This last item can lead to some interesting visuals. For instance, in an online game, the object in your view needs a packet update from a server to be placed accurately in its environment. If you are running insane frame rates, your player/object could get updated many times while the game code simply extrapolates the remote object's position. Suddenly a packet update arrives, the extrapolation may not have been accurate, and the remote object jumps.
Your position, which the remote player has not gotten yet, could also jump: while you are running insane frame rates and your object has been moving around, the remote player does not see it until he gets a packet update.
Now, both of you are out of sync, which can cause a perception of lag when, in fact, it is a video synchronization problem and may not have a thing to do with lag. This can be more exaggerated with very high CPU speeds as well.
Keep in mind, I am not specifically talking about Aces High. Just the overall effect running without vsync can have on a multi-player game.
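A toy sketch of that extrapolate-then-snap behavior (invented numbers and units, not any real game's netcode):

```python
# Between packets the client guesses the remote plane's position from its
# last known velocity; when the real update lands, the plane snaps.
pos, vel = 0.0, 10.0           # last packet: position and velocity (units/s)
FPS = 200                      # client frame rate (assumed)
dt = 1.0 / FPS

for _ in range(FPS // 10):     # 100 ms with no packet from the server
    pos += vel * dt            # extrapolate a little every rendered frame

true_pos = 0.7                 # where the next packet says the plane is
print(f"extrapolated {pos:.2f}, packet says {true_pos:.2f}, "
      f"snap of {true_pos - pos:+.2f}")   # the visible 'jump'
```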
-
Skuzzy, I think you misinterpreted my message. I was saying that 30fps is not enough for a fluid gameplay experience.
Even in the movies the low shutter rate really bothers me when there are fast movements; you can clearly see flickering in the picture during fast camera pans.
When a game of mine runs at 30fps, I feel it to be sluggish, unresponsive, and generally unplayable. That doesn't even have anything to do with the actual visual information. It doesn't bother me if the screen is a tad jumpy, but response problems do.
I know all about the technology you explained; I did some studying already back in WB when I noticed large differences between 2D and 3D hit accuracy, as well as between vsync enabled and disabled.
What I didn't know was that the game can actually draw prediction-calculated movements if the screen needs refreshing and is not dependent on some fixed synchronisation. So basically, what you're saying is that if a person runs at 700fps, a moment of netlag can cause 7 times the jump on his screen compared to a person who runs at 100fps? That would explain the 'rubber bullet' syndrome with vsync off, though.
That being said, it's downright wrong to state gamers do not need fps rates past 30, 60 or whatever in order to get good performance from a game. The truth is that (with vsync disabled) your framerates will need to be sky-high in most if not all games in order to handle the parts of the game where a lot of action, polygons and movement is being displayed.
You play with vsync on - but you need to have that 'hidden' reserve in performance.
-
Originally posted by Siaf__csf
You play with vsync on - but you need to have that 'hidden' reserve in performance.
Right on the spot!
-
I would lend my .02 in support of Siaf.
Try holding a Quake LAN party and defeating anyone with a higher FPS than you. Gonna be WAY hard if s/he is halfway decent.
In AH or Q3A, if they see you before you see them, you're dead. Bandwidth issues share this problem. I've noticed that faster connects see you spawn first, and nail you before all the cons have even appeared on your end.
The faster the better. No exceptions. Get HT to upgrade your workstation and stop feeding you a line.
:D
Here's a question for everyone tho...
Were the screenshots based on CRT screens or LCD? I haven't played with them much, but aren't LCD displays uninhibited by a refresh rate, since their function can be directly manipulated rather than relying on a directed electron stream painting a pigmented phosphor screen?
-
Better, Siaf. Yes, that is what the rubber bullet syndrome is about. And yes, having the capacity to run at insane frame rates means your CPU has enough time to handle all the events required in a massive online flight sim.
2Hawks, games like Quake are entirely different than a flight sim. Too different to compare. FPS games place trivial demands on a CPU versus a flight sim. And the bots in a FPS game are not moving at 400 MPH. Makes a huge difference.
-
Originally posted by Skuzzy
And the bots in a FPS game are not moving at 400 MPH. Makes a huge difference.
Do units actually matter in a game's engine? I mean, 400 miles/hour vs. 400 feet/second. Does the placement of a decimal take a greater/lesser toll on the CPU?
Like you said, comparing quake to AH is very difficult. Take this strange picture for example: AH could be modeled into a FPS. Take a quake character and shove it into AH. The size of said quake character would be such that the character's head is approximately 6,000ft tall in AH. In quake, the character would stand 6 ft tall. A difference in a decimal place.
I think I lost my train of thought and have forgotten what I'm trying to ask... if i make any sense at all, see what you can make of this :) lol
-
And about vsync and rubber bullet syndrome.
With Vsync enabled, the CPU/GPU frame rendering is halted and synchronized with the refresh rate. If your hardware is capable of producing 200fps and your refresh rate is 100Hz, when the rendering is halted, does that allow a greater capacity for the hardware to run different processes?
I ask this because I've always run with v-sync disabled (I like seeing that 200 fps in the corner). If I sync with my refresh rate (85Hz/85fps), can the capacity the hardware used to produce the extra 115fps (200fps - 85fps) then be used for other stuff, such as anti-aliasing, anisotropic filtering, or other processes such as computing the FM or tracking packets and whatnot?
... just my attempt at understanding this.
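Putting rough numbers on that question (using the figures from the post; whether the driver actually hands the idle time back to the CPU depends on the video subsystem, as noted above):

```python
capable_fps, refresh_hz = 200, 85      # numbers from the post above
frame_cost_ms = 1000 / capable_fps     # 5 ms to render one frame
slot_ms = 1000 / refresh_hz            # ~11.8 ms between 85 Hz refreshes

idle_ms = slot_ms - frame_cost_ms      # time per frame spent waiting on vsync
print(f"~{idle_ms:.1f} ms idle per frame, "
      f"~{idle_ms * refresh_hz:.0f} ms/sec potentially free for other work")
# ~6.8 ms idle per frame, ~575 ms/sec potentially free for other work
```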
-
how do u turn off v-sync? because my computer's average fps is 10
-
v-sync won't make a difference below 60-75 fps
-
punt!
-
Holy Geek Batman!
I thought I was an ub3rn3rd, but you guys seriously need to get some sunshine.
GO PLAY OUTSIDE!
-
Funny, my monitor can do 200 Hz at lower resolutions.
-
So which is better...
30fps or 30mm?
...actually 29.97 fps, but I didn't want to point this out since I would look like a SUPERDORK.
-
Originally posted by Innominate
It is true that the eye can't see more than 30fps. That's actually why a higher framerate is preferable.
Bollocks.
The BRAIN begins to perceive motion at about 20fps and upwards. The eye has nothing to do with it. The BRAIN quite happily perceives what could loosely be called frames up to about 60 frames per second. The maximum thought to be perceived is no more than 100fps.
-
Ever wonder why 120-volt AC lightbulbs usually operate at 60Hz (cycles per second)? It's because MOST human eyes won't see them flickering at that level. Ergo the human eye really won't see any noticeable difference in anything above 60Hz. So I don't see why anyone would care if their frame rate was higher than that.
-
Originally posted by Gunslinger
Ever wonder why 120-volt AC lightbulbs usually operate at 60Hz (cycles per second)? It's because MOST human eyes won't see them flickering at that level. Ergo the human eye really won't see any noticeable difference in anything above 60Hz. So I don't see why anyone would care if their frame rate was higher than that.
60 Hertz was chosen for transmitting power via alternating current. Tesla found that power transmitted over greater distances with less loss when using AC rather than DC (direct current). -- Thomas Edison did not arbitrarily decide that he should use 60 cycles because he didn't see it "flicker".
Refresh rates used to be much slower and, in older models, fixed-frequency. The refresh rate is controlled by the speed and precision with which the electron stream can be directed to "paint" the phosphor inside the CRT, mostly dependent on the hardware available at the time. Not for anything resembling cinematics.
As for flicker, it's mostly noticeable when viewed under a fluorescent light source. -- Same reason that saws and other power equipment may appear motionless. If the movement is constant, and at the right speed, a fluorescent light will flicker at a rate that illuminates the equipment exactly once every complete revolution. -- Needless to say, this has cost people many extremities.
Refresh rate is completely a matter of how fast they can make the screen repaint. The tech industry is always trying to make things go farther and faster. I really don't think they would want to stop, or would stop, just because "60 should be more than anyone would need".
I compared Quake to Aces High, and in the context of perceived motion I will say it again: when measured in frame rates or frames per second (FPS), the difference between 100 and 125 is stark.
-
I need to understand something here. I'm hoping for a straightforward answer. ::Puts away all fishing gear::
OK, my eyes see around 30fps, right? Yet my brain can see at 100fps? That seems a contradiction. Could that be clarified?
Next, if I'm flying AH and am getting 55 fps in an average situation, and my opponent is getting 180fps, does he have an advantage, and how?
I know I seem to be able to fly on a computer getting 25 frames without any difficulty. If I'm on one with 45 fps, it's smoother looking, but to me the game seems to be basically the same. Does the lower frame rate make a difference in A/C performance and applied damage?
I do understand that if I am getting, say, 60 frames normally, I'm likely to stay above 30fps in a really graphics-intense area, which is preferable.
Is there a point where additional fps really don't do anything for you?
Lots of questions, but this thread has me wondering.
Thank you in advance for the clarifications. :)
-
This seems to be more a point of debate about human physiology than simulation.
For myself, the faster it is, the more real it feels to me. If I was comparing frames per second (limited by your refresh rate), then I would draw on the analogy I made in an earlier post regarding fluorescent lights at 60 Hertz, or "cycles per second".
If a saw looks stationary under a fluorescent light operating at 60 Hertz, an object would have to be spinning at the speed of light to appear motionless under the sun. Ergo, since the speed of light is 186,282 miles per second, in my opinion real life moves at 186,282 frames per second.
Doctors don't seem to know either, and for everyone the experience is different. -- What's important is how it looks and feels to you.
As for advantages or disadvantages, if FPS really made a big difference in the game then HT would simply lock the refresh rate on all instances of the game.
There is another discussion on light at:
ClanBaker Forums (http://www.clanbaker.com/forums/viewtopic.php?t=5)
-
Going along with what Grimm said, I've got a question too.
I run AH with Vsync enabled, refreshing at 75Hz. It almost never drops below that (only in a 40-plane furball over a smoking base with tracers flying everywhere).
Should I be tweaking my settings somehow, or is this good? Does a guy who has an identical rig, with Vsync off and showing, say, 140 fps on a 75hz monitor have some kind of advantage?
-
Tarmac, I would consider your system good for the game. Your FPS can only go up to, but never exceed, your monitor's refresh frequency.
-
2Hawks... just for clarification... I never said that's why they chose 60Hz... I said you won't see a light bulb flicker at that rate... or you could say your brain won't see it... either way (and I think we are in agreement here), SUPER HIGH frame rates really don't make a difference in the visual aspects of the game.
-
I can see the difference in a monitor's refresh between 60, 75, and 85, but not more than that. A friend and I tested it: he would set the refresh on my monitor randomly and I would guess the refresh, and I was right most of the time.
I learned this as a PC repair guy in a previous life. I would look at someone else's screen (90% of people have their refresh at 60) and notice that it was 60. A quick test I use is to wave your hand in front of the monitor; see the strobing? The faster the refresh, the less strobing. I would postulate that when the refresh is high enough the strobing effect will go away, or at least match the normal strobing you perceive when looking at your waving hand.
Now here's the kicker: what is the refresh of your eyes/brain? It must have one, as you can see when an airplane propeller (or car wheel, or fan) speeds up, it looks like it slows, stops, then spins the other way. This can only be attributed to some sort of refresh in your optic center.
hmmmmm...
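If the eye/brain really did sample at a fixed rate, the propeller illusion would be ordinary temporal aliasing. A sketch with an assumed 60 Hz sample rate:

```python
def apparent_rps(true_rps, sample_hz):
    # fold the true rotation rate into [-sample_hz/2, +sample_hz/2)
    return (true_rps + sample_hz / 2) % sample_hz - sample_hz / 2

for rps in (10, 55, 60, 65):
    print(f"{rps} rev/s looks like {apparent_rps(rps, 60):+.0f} rev/s")
# 10 -> +10 (normal), 55 -> -5 (backwards!), 60 -> 0 (frozen), 65 -> +5 (slow)
```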
-
Originally posted by g00b
Now here's the kicker: what is the refresh of your eyes/brain? It must have one, as you can see when an airplane propeller (or car wheel, or fan) speeds up, it looks like it slows, stops, then spins the other way. This can only be attributed to some sort of refresh in your optic center.
hmmmmm...
Another trick for figuring out the refresh is looking at the monitor with what astronomers refer to as "averted vision". Watch the screen with your peripheral vision and you will notice a perceptible flicker. This would suggest a lot about the ability of the human mind to change the perception of its focus. -- But here I could go into all kinds of topics.
I will agree with the strobing effect with the hand, but for watching the tires or a propeller, try that during the day with NO artificial lighting, and not in some electronically reproduced fashion.
If there is a perceptible difference in the view of the object in question, then it would confirm my previous statement that everyone is different, but only because their optic centers run at a different refresh rate. The mind is, after all, the most complicated electrical device we know of on this planet. Try teaching a robot to catch a ball or breakdance...
Pick a refresh rate that you like and let that be it. Save the technical jargon for boring the spouse into submitting to the upgrade expenditure. :)
-
Yeah, if you can't see more than 30 fps, and 60 Hz is double that, why do my eyes hurt and I get a headache after an hour or two at 60 Hz, but if I have my refresh rate at 85 Hz, I can use it for hours?
-
Originally posted by BenDover
Yeah, if you can't see more than 30 fps, and 60 Hz is double that, why do my eyes hurt and I get a headache after an hour or two at 60 Hz, but if I have my refresh rate at 85 Hz, I can use it for hours?
Heh, I guess your "Optical Refresh Rate" is between 70 and 85 then huh? :) heheh
-
so.... which is better for improved gameplay/fr... vsync On.. or Off?..(or that nifty auto-select)
-
Vsync on will be best.
I've never seen any good reasons to leave it off except for testing.
I saw a discussion once that started out at 30fps; later everyone concluded that 60fps (or above) is the magic number. That was in relation to FPS games.
-
Originally posted by Desl0ck
so.... which is better for improved gameplay/fr... vsync On.. or Off?..(or that nifty auto-select)
Turn Vsync off if the game is unplayable; otherwise turn it on if you don't notice the difference. -- Some people don't have the hottest hardware, and have to do what it takes to get what they want.
-
Vsync off won't boost performance.
If you have lower framerates than your monitor refresh rate, then Vsync off will do nothing to raise your framerates.
I say again, there is no point in turning Vsync off, unless for testing.
-
Originally posted by jodgi
Vsync off won't boost performance.
If you have lower framerates than your monitor refresh rate, then Vsync off will do nothing to raise your framerates.
I say again, there is no point in turning Vsync off, unless for testing.
Guess you've never played Natural Selection 1.04?
-
:confused:
-
The game is not playable with v-sync off; the tearing effect makes for a very ugly picture. Use this setting only for tweaking your video settings, then turn it on to play.
-
In the past, when the refresh rate was capable and everything else was capable except the video card, turning off vsync (WM2/3) made the game playable. -- Vsync has its place; otherwise it wouldn't be there.
-
I have had enough. I am having bad dreams at night about big ugly Vsync monsters attacking my CV :D
Ok, how come this topic died when it got to the best part? Rubber Bullets!!
Who gives a rat's butt about graphics? I want 100% hit registering!
Explain that one for me!!!!
-
Frame rates also depend on the number of people in an arena too, right?
I was in the dueling arena the other day getting 100s straight forward! Usually I'm lucky to get half that in the MA.
-
Originally posted by Innominate
It is true that the eye can't see more than 30fps. That's actually why a higher framerate is preferable.
more bull ;)
30fps is horrible on a 70Hz TFT.
-
So I guess the consensus is to play with vsync on, then?
B00000 - no more looking straight up and seeing 540fps lol.
-
how come when i turn my Refresh rate up in control panel the screen size gets so much smaller?
-
Originally posted by 2shad4u
how come when i turn my Refresh rate up in control panel the screen size gets so much smaller?
huh?
I'm guessing you've got a CRT monitor, and when you select a higher refresh it's putting more border on the screen? If so, find out how to stretch it back out via the buttons on the screen.
Or have you mixed up refresh and resolution change? :confused: