Author Topic: nVidia G-Sync technology  (Read 3173 times)

Offline LCADolby

  • Platinum Member
  • ******
  • Posts: 7173
Re: nVidia G-Sync technology
« Reply #15 on: December 17, 2013, 06:36:30 AM »
Like anything, Aces High does suffer from stutter, but I highly doubt G-Sync will make enough of a difference for it to be worthwhile.
May as well invest in better core PC parts.
JG5 "Eismeer"
YouTube-20Dolby10
Twitch - Glendinho

Offline 2bighorn

  • Gold Member
  • *****
  • Posts: 2829
Re: nVidia G-Sync technology
« Reply #16 on: December 17, 2013, 03:43:06 PM »
That's not entirely correct. If vsync is enabled and your framerate drops below 60, let's say to 59 fps, then due to the nature of vsync your framerate will instantly dip to 30 fps. This is very noticeable and drastic. If your frames dip below 30 they immediately drop again, to 20 and then 15. G-Sync removes this limitation by dynamically adjusting the refresh rate, if I understood the thing correctly.

Also, when your framerate dips below 60 (or whatever refresh rate your monitor supports), vsync forces the display card to wait for the slower refresh and becomes a bottleneck: even if your card could push 59 fps, you only get 30 out of it as long as you can't achieve the full 60.

Of course you know this already; I just don't understand why you don't see the removal of this halving as a positive step.

In a sense I agree with Skuzzy, however. G-Sync is not very exciting to me, as it requires buying new hardware and is vendor-locked.

Adaptive V-sync fixes the problem below the monitor's refresh rate, i.e. there vsync is simply switched off.

Offline Changeup

  • Persona Non Grata
  • Platinum Member
  • ******
  • Posts: 5688
      • Das Muppets
Re: nVidia G-Sync technology
« Reply #17 on: December 17, 2013, 07:10:56 PM »
For Xmas I'm buying an SSD, as that's my current bottleneck.

I'll watch G-Sync as it evolves.

Changeup runs an i7-950 on a Sabertooth X58 board with a 660 Ti.  Changeup had 21 fps in clouds and on busy fields until he bought a Samsung Pro 256 GB SATA 6Gb/s SSD.  Now he flies around in busy clouds at 52.

Changeup happy :aok
"Such is the nature of war.  By protecting others, you save yourself."

"Those who are skilled in combat do not become angered.  Those who are skilled at winning do not become afraid.  Thus, the wise win before the fight, while the ignorant fight to win." - Morihei Ueshiba

Offline MrRiplEy[H]

  • Persona Non Grata
  • Plutonium Member
  • *******
  • Posts: 11633
Re: nVidia G-Sync technology
« Reply #18 on: December 18, 2013, 12:12:24 AM »
Changeup runs an i7-950 on a Sabertooth X58 board with a 660 Ti.  Changeup had 21 fps in clouds and on busy fields until he bought a Samsung Pro 256 GB SATA 6Gb/s SSD.  Now he flies around in busy clouds at 52.

Changeup happy :aok

Uh, that would indicate that AH does not buffer all textures to RAM (not even main RAM, from which a copy to VRAM would be quick) and/or does not use non-blocking code in its texture loads, which will make the client freeze whenever it needs disk I/O.
Definiteness of purpose is the starting point of all achievement. –W. Clement Stone

Offline BaldEagl

  • Plutonium Member
  • *******
  • Posts: 10791
Re: nVidia G-Sync technology
« Reply #19 on: December 18, 2013, 12:21:13 AM »
That's not entirely correct. If vsync is enabled and your framerate drops below 60, let's say to 59 fps, then due to the nature of vsync your framerate will instantly dip to 30 fps. This is very noticeable and drastic. If your frames dip below 30 they immediately drop again, to 20 and then 15. G-Sync removes this limitation by dynamically adjusting the refresh rate, if I understood the thing correctly.

Also, when your framerate dips below 60 (or whatever refresh rate your monitor supports), vsync forces the display card to wait for the slower refresh and becomes a bottleneck: even if your card could push 59 fps, you only get 30 out of it as long as you can't achieve the full 60.

Of course you know this already; I just don't understand why you don't see the removal of this halving as a positive step.
In a sense I agree with Skuzzy, however. G-Sync is not very exciting to me, as it requires buying new hardware and is vendor-locked.

Huh?  My computer doesn't do that.  Rarely, but on occasion, I'll see frame rates in the 45-49 range with vsync on.  I also never see stuttering.
« Last Edit: December 18, 2013, 12:23:03 AM by BaldEagl »
I edit a lot of my posts.  Get used to it.

Offline MrRiplEy[H]

  • Persona Non Grata
  • Plutonium Member
  • *******
  • Posts: 11633
Re: nVidia G-Sync technology
« Reply #20 on: December 18, 2013, 01:29:36 AM »
Huh?  My computer doesn't do that.  Rarely, but on occasion, I'll see frame rates in the 45-49 range with vsync on.  I also never see stuttering.


Triple buffering, or adaptive vsync (which is Nvidia proprietary tech), gives more flexibility to the rates.

Basically, however, it goes like this (partially ripped from another forum):

Vsync synchronizes the buffer swap with your monitor's vertical refresh rate. With the usual two buffers (double buffering), your GPU is held up until a buffer swap can be made (if your GPU is fast enough, that is). So if your GPU can't fill the buffer by the next refresh, it has to skip that refresh entirely. If you're refreshing at 16 ms intervals but a frame needs 17 ms to draw, it will miss the sync interval. Because the buffer swap is locked to the refresh rate, it can't take place at the 17 ms mark; it has to wait until the next interval, at 32 ms. Likewise, if a frame needs 33 ms it will miss the intervals at 16 ms and 32 ms and has to wait until the 48 ms interval.
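The interval arithmetic above can be sketched in a few lines of Python (a toy model of double-buffered vsync on a 60 Hz display, not anything from the AH source):

```python
import math

REFRESH_MS = 1000.0 / 60.0  # one refresh interval on a 60 Hz display, ~16.67 ms

def effective_fps(frame_ms):
    """A frame that misses a refresh waits for the next one, so its
    on-screen time rounds up to a whole number of refresh intervals."""
    intervals = math.ceil(frame_ms / REFRESH_MS)
    return 60.0 / intervals

# A card rendering at 59 fps (~16.9 ms/frame) gets snapped down to 30 fps,
# and one rendering at 29 fps (~34.5 ms/frame) down to 20 fps:
print(effective_fps(1000.0 / 59))  # 30.0
print(effective_fps(1000.0 / 29))  # 20.0
```

Triple buffering relaxes this snapping, since the GPU can keep rendering into a third buffer instead of stalling until the next swap.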

This is the why and how of vsync's effect on framerates. AH2 is supposed to enable triple buffering automatically, AFAIK, but I have noticed that in some cases I had to force it on from the graphics settings.

Judging from Changeup's post, though, it seems that some of AH's stuttering does not result from a lack of rendering power; instead, AH has trouble with its I/O implementation. If software code is made 'blocking', as naturally happens when you use loops etc., any operation forces the code to hang for its duration. Since disk I/O is inherently very slow, making I/O operations blocking can cause very visible stutters: the software basically stops working for as long as it takes to load a texture. This also plays together with vsync, so that even a small texture block needing I/O is enough to make the rendering skip a frame or two, leading to visual problems.

There is a solution to this, called 'non-blocking code', which essentially performs operations in parallel so that loading textures does not hang the main code at all. But this type of code is much more demanding to write, and any software title with roots going back years usually doesn't make use of it.
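As a rough illustration of the difference (purely hypothetical names and timings, not AH's actual code), a texture load pushed onto a worker thread lets the render loop carry on with a placeholder instead of freezing:

```python
import threading
import time

# Hypothetical stand-in for a slow disk read of a texture file.
def load_texture_from_disk(name):
    time.sleep(0.05)               # pretend disk I/O takes 50 ms
    return f"pixels of {name}"

textures = {}                      # texture cache shared with the render loop
lock = threading.Lock()

def request_texture(name):
    """Non-blocking: kick off the load on a worker thread and return
    immediately, so the render loop never stalls on disk I/O."""
    def worker():
        data = load_texture_from_disk(name)
        with lock:
            textures[name] = data
    threading.Thread(target=worker, daemon=True).start()

def render_frame(name):
    with lock:
        data = textures.get(name)  # use a placeholder until loaded
    return data if data is not None else "placeholder"

request_texture("tree.bmp")
print(render_frame("tree.bmp"))    # "placeholder" while the I/O is in flight
time.sleep(0.1)
print(render_frame("tree.bmp"))    # "pixels of tree.bmp" once the load finishes
```

A blocking version would call `load_texture_from_disk` directly inside `render_frame`, hanging the whole loop for the 50 ms and (under vsync) missing one or more refresh intervals.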

The problem with old software projects can be that the main codebase is so massive that reworking the legacy parts is just too resource-heavy a task on a tight budget and time frame.

I'm not saying this is the case with AH as I have no knowledge of it, but it can be the case.
« Last Edit: December 18, 2013, 01:33:09 AM by MrRiplEy[H] »
Definiteness of purpose is the starting point of all achievement. –W. Clement Stone

Offline zack1234

  • Plutonium Member
  • *******
  • Posts: 13182
Re: nVidia G-Sync technology
« Reply #21 on: December 18, 2013, 01:59:28 AM »
I have a GTX 680, an i7 at 4.3 GHz, an SSD, a hard drive and loads of Corsair RAM in my gaming PC :old:

I use TrackIR and get a stutter every now and then; might triple buffering sort this out?

Obviously it might be something in the background, but is TrackIR a big culprit for an occasional stutter?

Rise of Flight has a stutter as well.

I was told the Nvidia Experience software is gibberish.

There are no pies stored in this plane overnight

                          
The GFC
Pipz lived in the Wilderness near Ontario

Offline MrRiplEy[H]

  • Persona Non Grata
  • Plutonium Member
  • *******
  • Posts: 11633
Re: nVidia G-Sync technology
« Reply #22 on: December 18, 2013, 02:26:42 AM »
I have a GTX 680, an i7 at 4.3 GHz, an SSD, a hard drive and loads of Corsair RAM in my gaming PC :old:

I use TrackIR and get a stutter every now and then; might triple buffering sort this out?

Obviously it might be something in the background, but is TrackIR a big culprit for an occasional stutter?

Rise of Flight has a stutter as well.

I was told the Nvidia Experience software is gibberish.



Some of the stuttering may be caused by Windows 7 problems with multithreading. At least BF4 users suffer from a 'core parking' bug, which causes stutters in any software that uses multithreading heavily. Core parking is a power-saving feature that tries to confine processes to certain processor cores so that one or more cores can be left idle when full power is not needed. The system is supposed to spread processes back onto the idle cores automatically when needed, but it seems to be either not fast enough or not working correctly. That's why users either switch to Win8 or apply the 'core unpark' patch to Win7, which fixes the stutters.
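For reference, the usual manual route on Win7 (as I understand it; the third-party 'unpark' tools flip the same knob) is the hidden "core parking minimum cores" power setting. A sketch, run from an elevated command prompt and assuming the stock power-plan aliases:

```shell
:: Unhide the core-parking "minimum cores" setting, then tell the active
:: power plan to keep 100% of cores unparked, and re-apply the plan.
powercfg -attributes SUB_PROCESSOR CPMINCORES -ATTRIB_HIDE
powercfg -setacvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
powercfg -setactive SCHEME_CURRENT
```

I haven't verified this on every Win7 install, so treat it as a starting point rather than a guaranteed fix.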
Definiteness of purpose is the starting point of all achievement. –W. Clement Stone

Offline guncrasher

  • Plutonium Member
  • *******
  • Posts: 17315
Re: nVidia G-Sync technology
« Reply #23 on: December 18, 2013, 03:08:28 AM »
Triple buffering, or adaptive vsync (which is Nvidia proprietary tech), gives more flexibility to the rates.

Basically, however, it goes like this (partially ripped from another forum):

Vsync synchronizes the buffer swap with your monitor's vertical refresh rate. With the usual two buffers (double buffering), your GPU is held up until a buffer swap can be made (if your GPU is fast enough, that is). So if your GPU can't fill the buffer by the next refresh, it has to skip that refresh entirely. If you're refreshing at 16 ms intervals but a frame needs 17 ms to draw, it will miss the sync interval. Because the buffer swap is locked to the refresh rate, it can't take place at the 17 ms mark; it has to wait until the next interval, at 32 ms. Likewise, if a frame needs 33 ms it will miss the intervals at 16 ms and 32 ms and has to wait until the 48 ms interval.

This is the why and how of vsync's effect on framerates. AH2 is supposed to enable triple buffering automatically, AFAIK, but I have noticed that in some cases I had to force it on from the graphics settings.

Judging from Changeup's post, though, it seems that some of AH's stuttering does not result from a lack of rendering power; instead, AH has trouble with its I/O implementation. If software code is made 'blocking', as naturally happens when you use loops etc., any operation forces the code to hang for its duration. Since disk I/O is inherently very slow, making I/O operations blocking can cause very visible stutters: the software basically stops working for as long as it takes to load a texture. This also plays together with vsync, so that even a small texture block needing I/O is enough to make the rendering skip a frame or two, leading to visual problems.

There is a solution to this, called 'non-blocking code', which essentially performs operations in parallel so that loading textures does not hang the main code at all. But this type of code is much more demanding to write, and any software title with roots going back years usually doesn't make use of it.

The problem with old software projects can be that the main codebase is so massive that reworking the legacy parts is just too resource-heavy a task on a tight budget and time frame.

I'm not saying this is the case with AH as I have no knowledge of it, but it can be the case.

Great posting, Ripley. But could it be something else with Changeup's setup, since most players with similar systems haven't reported this as a problem? I haven't had 20 fps on a regular basis since I switched from dial-up AOL to Verizon DSL seven years ago; my fps went from around 20 to 35. And since I built my own computer, starting with the 8400 CPU and a 9800 GTX+ video card, I haven't had anything less than the mid 50s to 90, even in heavy CV ack.

I don't know about Changeup's setup that much, to be honest, but I think if it was AH code it would have hit way more systems than just his.


semp
you dont want me to ho, dont point your plane at me.

Offline MrRiplEy[H]

  • Persona Non Grata
  • Plutonium Member
  • *******
  • Posts: 11633
Re: nVidia G-Sync technology
« Reply #24 on: December 18, 2013, 03:38:28 AM »
Great posting, Ripley. But could it be something else with Changeup's setup, since most players with similar systems haven't reported this as a problem? I haven't had 20 fps on a regular basis since I switched from dial-up AOL to Verizon DSL seven years ago; my fps went from around 20 to 35. And since I built my own computer, starting with the 8400 CPU and a 9800 GTX+ video card, I haven't had anything less than the mid 50s to 90, even in heavy CV ack.

I don't know about Changeup's setup that much, to be honest, but I think if it was AH code it would have hit way more systems than just his.


semp

After writing the post it dawned on me that the reason for this behaviour may be that he's running his graphics settings too high and running out of video RAM. When that happens, parts of video RAM are constantly flushed to main RAM and back, and the result is of course a major slowdown.

D3D9 also natively keeps a shadow copy of video RAM in main RAM (to my knowledge; correct me if I'm wrong), so if the end result is the system swapping, the SSD may save the day in the end. Only HTC really knows (or can know) the answer to these things.

But to me the most logical conclusion is that if the SSD affects his framerates, his stutters were I/O-related: for one reason or another his AH client was fetching or storing data on disk in those situations, and the access-time improvement of the SSD removed the visual stuttering. Of course there can be any number of variables, such as whether he reinstalled Windows when switching to the SSD, etc. ;)
« Last Edit: December 18, 2013, 03:42:28 AM by MrRiplEy[H] »
Definiteness of purpose is the starting point of all achievement. –W. Clement Stone

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: nVidia G-Sync technology
« Reply #25 on: December 18, 2013, 06:27:22 AM »
If you force triple buffering on, in the video card driver, you will also cause stutters as Aces High already triple buffers.

An SSD cannot directly impact frame rates unless something else, such as the video card, is sharing an interrupt with the hard drive.  While I have never seen that happen, it is possible.  I cannot imagine a motherboard manufacturer allowing that combination.

The other possibility is the video card is no longer running textures from its local RAM, and instead is using system RAM which has been swapped out.  Very possible as most games (including Aces High) do preload textures before they are actually needed.
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline Brooke

  • Aces High CM Staff
  • Plutonium Member
  • *******
  • Posts: 15462
      • http://www.electraforge.com/brooke/
Re: nVidia G-Sync technology
« Reply #26 on: December 18, 2013, 12:11:31 PM »
Changeup runs an i7-950 on a Sabertooth X58 board with a 660 Ti.  Changeup had 21 fps in clouds and on busy fields until he bought a Samsung Pro 256 GB SATA 6Gb/s SSD.  Now he flies around in busy clouds at 52.

Changeup happy :aok

That's odd.  I have an i5-2400 and a GTX 550 Ti, and I have 60 fps in clouds and busy fields.  However, while I have all other settings maxed out including even antialiasing forced on in my card's settings, I have the reflections slider (or whatever the name is -- don't have access to the game at the moment) at "none".  Maybe that's the difference.

Offline Changeup

  • Persona Non Grata
  • Platinum Member
  • ******
  • Posts: 5688
      • Das Muppets
Re: nVidia G-Sync technology
« Reply #27 on: December 18, 2013, 01:05:41 PM »
That's odd.  I have an i5-2400 and a GTX 550 Ti, and I have 60 fps in clouds and busy fields.  However, while I have all other settings maxed out including even antialiasing forced on in my card's settings, I have the reflections slider (or whatever the name is -- don't have access to the game at the moment) at "none".  Maybe that's the difference.

It's 100% the difference.  Mine's one notch from full.  Ratchet yours up and see what happens, lol
"Such is the nature of war.  By protecting others, you save yourself."

"Those who are skilled in combat do not become angered.  Those who are skilled at winning do not become afraid.  Thus, the wise win before the fight, while the ignorant fight to win." - Morihei Ueshiba

Offline guncrasher

  • Plutonium Member
  • *******
  • Posts: 17315
Re: nVidia G-Sync technology
« Reply #28 on: December 18, 2013, 01:21:35 PM »
It's 100% the difference.  Mine's one notch from full.  Ratchet yours up and see what happens, lol

Man, Changeup, the problem with your low fps was that you were really pushing your video card above its limits.  Glad the SSD worked for you, but a new video card will probably help too.  I would also get around 30 fps if I pushed the EM slider to full, but it's something you normally won't have time to see, so I just lowered it to nothing.  Other than EM I can play with everything on and shadows at 4096, at full fps, using a lower card than yours (SLI EVGA 465s).


semp
you dont want me to ho, dont point your plane at me.

Offline zack1234

  • Plutonium Member
  • *******
  • Posts: 13182
Re: nVidia G-Sync technology
« Reply #29 on: December 18, 2013, 02:04:36 PM »
Will gaming mode on my Samsung stop lag?
There are no pies stored in this plane overnight

                          
The GFC
Pipz lived in the Wilderness near Ontario