Author Topic: nVidia G-Sync technology  (Read 3171 times)

Offline Rich46yo

  • Platinum Member
  • ******
  • Posts: 7358
Re: nVidia G-Sync technology
« Reply #30 on: December 18, 2013, 03:21:33 PM »
Well, I just bought a new Asus 2560x1440 monitor, so I can't imagine spending the kind of dough a comparable-size G-Sync monitor will end up costing after they come out, nor am I going to buy a 24" monitor in order to kit it out, no matter how promising the new technology looks. However, had I known about G-Sync before I bought this new screen, I'd probably have held off, since I already had a 1920x1080 HP. I'm not sure buying that Asus was the wisest thing to do with that money. In some games the extra resolution translates into lighting problems; indoor scenes often seem too dark.
"flying the aircraft of the Red Star"

Offline Ack-Ack

  • Radioactive Member
  • *******
  • Posts: 25260
      • FlameWarriors
Re: nVidia G-Sync technology
« Reply #31 on: December 18, 2013, 07:05:42 PM »


The nVidia Experience software, I was told, is gibberish.



It mostly is, geared towards the gamer who wants everything done for them so they don't have to fiddle with graphics settings in each game they play.  But I would still recommend that those with a GTX 600 series card or better install GeForce Experience just for the ShadowPlay video capture utility.  It's probably (though still in beta) the best game video capture program out; it's got a very small resource footprint, and the video quality is far superior to that of Fraps and of MSI's capture utility in Afterburner.

ack-ack
"If Jesus came back as an airplane, he would be a P-38." - WW2 P-38 pilot
Elite Top Aces +1 Mexican Official Squadron Song

Offline Arlo

  • Radioactive Member
  • *******
  • Posts: 24759
Re: nVidia G-Sync technology
« Reply #32 on: December 18, 2013, 07:11:18 PM »
I haven't flown a sortie in Aces High in around two years, so I don't know. 

QFE  :lol :cheers:

Offline MrRiplEy[H]

  • Persona Non Grata
  • Plutonium Member
  • *******
  • Posts: 11633
Re: nVidia G-Sync technology
« Reply #33 on: December 18, 2013, 10:55:14 PM »
Will gaming mode on my Samsung stop lag?

It may help a little, but no, it won't stop the lag. Google the input lag for your Samsung; if you're lucky you may find reviews. Good screens have an input lag lower than 16 ms; the best ones approach 1-2 ms.
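(To put those numbers in perspective, a minimal arithmetic sketch in C++: at 60 Hz one refresh lasts about 16.7 ms, so a screen with under 16 ms of input lag adds less than one frame of delay.)

#include <cstdio>

int main() {
    const double refreshHz = 60.0;
    const double frameMs = 1000.0 / refreshHz;  // ~16.67 ms per refresh at 60 Hz
    printf("One frame at %.0f Hz lasts %.2f ms\n", refreshHz, frameMs);
    // A display with <16 ms input lag therefore delays the picture by less
    // than one refresh; a 1-2 ms display is practically instantaneous.
    return 0;
}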
Definiteness of purpose is the starting point of all achievement. –W. Clement Stone

Offline zack1234

  • Plutonium Member
  • *******
  • Posts: 13182
Re: nVidia G-Sync technology
« Reply #34 on: December 19, 2013, 01:47:20 AM »
I think you posted a list :old:

I saw an article yesterday about a projector which had a good response time :old:
There are no pies stored in this plane overnight

                          
The GFC
Pipz lived in the Wilderness near Ontario

Offline MrRiplEy[H]

  • Persona Non Grata
  • Plutonium Member
  • *******
  • Posts: 11633
Re: nVidia G-Sync technology
« Reply #35 on: December 19, 2013, 03:02:39 AM »
I think you posted a list :old:

I saw an article yesterday about a projector which had a good response time :old:

If your TV happens to be on the list, then yes. Be aware that just a one-digit difference in the model number may mean a 1000% jump in rated specs.
Definiteness of purpose is the starting point of all achievement. –W. Clement Stone

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: nVidia G-Sync technology
« Reply #36 on: December 22, 2013, 01:29:13 PM »
If you force triple buffering on in the video card driver, you will also cause stutters, as Aces High already triple buffers.

An SSD cannot directly impact frame rates unless something else, such as the video card, is sharing an interrupt with the hard drive.  While I have never seen that happen, it is possible.  I cannot imagine a motherboard manufacturer allowing that combination.

The other possibility is that the video card is no longer running textures from its local RAM and is instead using system RAM which has been swapped out.  Very possible, as most games (including Aces High) do preload textures before they are actually needed.

Questions for Skuzzy:

Does AH continue to triple buffer frames even if the AA slider is set to None?
Is triple buffering in AH independent of AA?

I'd like to know the answer to this for graphics-settings purposes.

As far as the SSD thing goes, I remember in another thread a long time back that you had stated that using an SSD w/ a page file on it was not good w/ AH, due to AH making a lot of small writes to the SSD. I had figured then that AH preloads textures (loaded into either system memory or the HDD, depending upon availability), which will eventually require some writes to the page file; this behavior is written into all Windows OSes for compatibility w/ older software & older hardware configurations where graphics/system memory can still be limited. The issue as I see it is that MS needs to 1) write some logic into the OS that can turn off some of this legacy coding once it recognizes the capabilities of newer hardware, or 2) give the end user the OPTION of changing settings within the OS to turn off this legacy coding & make full use of newer hardware once the OS recognizes its capabilities.

Sometimes the OS just won't recognize large amounts of dedicated graphics memory onboard a vid card & will set up an amount of system memory or HDD space for this purpose regardless; and because it set that memory up, the OS will most likely use it.

If this is wrong then please correct me, but it is apparent when I pull up the system info on my GTX TITAN: a dedicated graphics card w/ 6144 MB of onboard graphics memory, yet Windows also shows a 7907 MB shared system memory allocation for graphics, for a total of 14051 MB of graphics memory available. That 7907 MB of allocated system memory has to be set aside for buffering drawn frames for the CPU/GPU to pull from, right? If I remember correctly, the Windows OS will still do this as if there were an integrated graphics chip on the mobo... again for compatibility purposes.
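(For reference, those two numbers can be read straight from DXGI; a minimal sketch, assuming the Windows SDK headers. Note that SharedSystemMemory is a ceiling Windows is willing to lend the GPU, not RAM that is permanently walled off from applications.)

#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;
    IDXGIAdapter* adapter = nullptr;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter))) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // DedicatedVideoMemory = the card's onboard VRAM (the 6144 MB figure).
        // SharedSystemMemory   = system RAM the OS is *willing* to lend the
        //                        GPU (the 7907 MB figure) -- an upper bound,
        //                        not memory held back for graphics.
        printf("Dedicated VRAM: %llu MB\n",
               (unsigned long long)desc.DedicatedVideoMemory >> 20);
        printf("Shared system:  %llu MB\n",
               (unsigned long long)desc.SharedSystemMemory >> 20);
        adapter->Release();
    }
    factory->Release();
    return 0;
}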

I am currently using Win 7 HP SP1 OS w/ this vid card.

Just saying............

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Changeup

  • Persona Non Grata
  • Platinum Member
  • ******
  • Posts: 5688
      • Das Muppets
Re: nVidia G-Sync technology
« Reply #37 on: December 22, 2013, 05:03:45 PM »

If this is wrong then please correct me, but it is apparent when I pull up the system info on my GTX TITAN: a dedicated graphics card w/ 6144 MB of onboard graphics memory, yet Windows also shows a 7907 MB shared system memory allocation for graphics, for a total of 14051 MB of graphics memory available. That 7907 MB of allocated system memory has to be set aside for buffering drawn frames for the CPU/GPU to pull from, right? If I remember correctly, the Windows OS will still do this as if there were an integrated graphics chip on the mobo... again for compatibility purposes.

I am currently using Win 7 HP SP1 OS w/ this vid card.

Just saying............

 :salute

I would very much like to know if this GPU will run EVERYTHING in AH maxed (all sliders full right, everything running wide open) at 60 fps no matter what is going on around you.
"Such is the nature of war.  By protecting others, you save yourself."

"Those who are skilled in combat do not become angered.  Those who are skilled at winning do not become afraid.  Thus, the wise win before the fight, while the ignorant fight to win." - Morihei Ueshiba

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: nVidia G-Sync technology
« Reply #38 on: December 25, 2013, 03:02:39 PM »
Hi Changeup,

If you have supporting hardware (CPU, memory, monitor, etc.) that will not bottleneck it, then the answer is yes, & it will do it easily. But this goes for most of the offerings out there today. Due to some personal choices, I don't usually use this card w/ totally maxed graphics settings (only reflections aren't fully maxed, as I can't visually tell any difference in-game; all other in-game graphics settings are fully maxed out), but I have tested & run this card in AH flat out & steady at my monitor's max FPS (59) w/ fully maxed-out graphics settings, regardless of what is going on in-game.

I will also say this again as I have done a LOT of testing w/ this GTX Titan along w/ a GTX 670 FTW vid card on my box running AHII:

Nvidia's GPU Boost--whether version 1 or 2--will not fully boost a Kepler GPU on my box to max boost clocks when AA is applied through the in-game AH slider; this is due to the in-game SETTING level only, regardless of slider position. The game runs excellently, but not w/ the Kepler GPU at max boost clocks... most of the time it ran BELOW the base boost clock speeds, even w/ all in-game graphics settings at max & GPU temps/power WELL below the throttling thresholds. This can cause FPS fluctuation when running on a Kepler GPU.

When I turn off the in-game AA setting (slider set to None) & set the AF, AA, transparency AA & TF settings to max at the Nvidia driver level (after setting the driver to override any application settings; the graphics settings within the game itself do not affect GPU Boost, only the AA slider does), GPU Boost will run the GPU at the max boost clocks allowed in BIOS as long as GPU temps/power are below the throttling thresholds, & the game runs hiccup-free w/ all other in-game settings maxed out. This is w/o any OC'ing of the GPU or CPU on my box.

The only time I have seen this Titan clock back (w/ a subsequent FPS drop) was when the GPU occasionally hit the 100% power threshold set in BIOS (verified using Precision X); GPU temps have never exceeded 67 C under full load. That happens when the game loads up at a large field under attack w/ a LOT of stuff burning & a LOT of friendlies/cons present. Backing reflections off full by 1-2 bumps, or upping the GPU power threshold to 110%, usually fixes it. I will usually just back off the reflections setting, as I can't see any visual difference in the reflections when set less than full. The only thing I haven't tried yet is disabling Intel SpeedStep on my i7 3820 to lock the CPU at its max speed (the CPU will throttle its speeds due to load/power levels, just like the Kepler GPU does). Fixing the potential CPU throttling may help as well, but I haven't tested it, so I can't say.

Note on GPU Boost: if I set the Nvidia driver AF below 16x, the AA below 16xQ CSAA, the transparency AA below 8x supersampling, or the TF below High Quality, GPU Boost will not boost the GPU to max boost clocks, but to a % below max determined by the SETTING used, regardless of GPU temps/power being below the throttling thresholds; it may still reach max boost clocks under in-game graphical load.

In short, if you want to ensure full GPU Boost clocks w/ a Nvidia Kepler card in AHII, I recommend maxing the Nvidia driver settings mentioned above, then tailoring the rest to your liking w/ the in-game graphics settings.
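(If you'd rather watch the boost behavior yourself instead of relying on Precision X, here's a minimal sketch using NVIDIA's NVML library -- assuming nvml.h & the NVML library that ship w/ the driver/CUDA toolkit -- that reads the live graphics clock & GPU temp:)

#include <nvml.h>
#include <cstdio>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;  // attaches to the NVIDIA driver
    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
        unsigned int clockMHz = 0, tempC = 0;
        // Live graphics clock: compare against the card's rated max boost
        // clock while AH is running to see whether Boost is holding it.
        nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &clockMHz);
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC);
        printf("Graphics clock: %u MHz, GPU temp: %u C\n", clockMHz, tempC);
    }
    nvmlShutdown();
    return 0;
}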

I can easily OC this Titan and/or the CPU & NEVER hit any graphics wall playing AHII in its current configuration, but since it very seldom reaches this limitation in stock trim, I can accept leaving it there. When HTC releases their updated software w/ the updated graphics engine, all this may change...

I can't say 60 FPS, as I set my monitor to 59 Hz to avoid the rounding issues, so my FPS stays at 59 running at 2560x1440x32. This way I don't need any of the gadget software solutions.

Is it worth the $1,000.00 to get this kind of performance? The answer is no. Since this card came out, other models have become available in both camps that can do this for a lot less money. Heck, my GTX 670 FTW could do it on my box, but I wanted a Titan, so I got one... & I don't regret buying it to this day.

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Triton28

  • Gold Member
  • *****
  • Posts: 2248
Re: nVidia G-Sync technology
« Reply #39 on: December 25, 2013, 04:18:23 PM »
My new rig (i7 4770K, 780 Classified) runs everything maxed out, but not at 60 fps. When it's real busy I'll dip to the low 40s, but I find that very playable.
Fighting spirit one must have. Even if a man lacks some of the other qualifications, he can often make up for it in fighting spirit. -Robin Olds
      -AoM-


Offline BoilerDown

  • Silver Member
  • ****
  • Posts: 1926
Re: nVidia G-Sync technology
« Reply #40 on: January 13, 2014, 09:02:00 PM »
How-to video in case you have the right Asus monitor and order the add-on kit:

http://www.youtube.com/watch?v=FMKpJr0KTC8
Boildown

This is the Captain.  We have a lil' problem with our entry sequence so we may experience some slight turbulence and then... explode.

Boildown is Twitching: http://www.twitch.tv/boildown

Offline Delirium

  • Platinum Member
  • ******
  • Posts: 7276
Re: nVidia G-Sync technology
« Reply #41 on: January 14, 2014, 01:17:20 PM »
Does AH continue to triple buffer frames even if the AA slider is set to None?
Is triple buffering in AH independent of AA?

Skuzzy, it would be fantastic if you could post pictures of the Nvidia and AMD control panels with the settings you would recommend for Aces High. I understand it would be system-specific to some degree, but then the triple-buffer information would be in a single place for all to benefit from.

Just sayin'.   :D
Delirium
80th "Headhunters"
Retired AH Trainer (but still teach the P38 selectively)

I found an air leak in my inflatable sheep and plugged the hole! Honest!

Offline Changeup

  • Persona Non Grata
  • Platinum Member
  • ******
  • Posts: 5688
      • Das Muppets
Re: nVidia G-Sync technology
« Reply #42 on: January 14, 2014, 02:54:03 PM »
Skuzzy, it would be fantastic if you could post pictures of the Nvidia and AMD control panels with the settings you would recommend for Aces High. I understand it would be system-specific to some degree, but then the triple-buffer information would be in a single place for all to benefit from.

Just sayin'.   :D

+1
"Such is the nature of war.  By protecting others, you save yourself."

"Those who are skilled in combat do not become angered.  Those who are skilled at winning do not become afraid.  Thus, the wise win before the fight, while the ignorant fight to win." - Morihei Ueshiba

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: nVidia G-Sync technology
« Reply #43 on: January 14, 2014, 03:43:23 PM »
The game is programmed to make use of the default settings in either the AMD/ATI or NVidia control panels.

Here, we usually do not install the control panels at all and just let the drivers work at whatever defaults they are set to.
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: nVidia G-Sync technology
« Reply #44 on: January 18, 2014, 11:04:08 AM »
The game is programmed to make use of the default settings in either the AMD/ATI or NVidia control panels.

Here, we usually do not install the control panels at all and just let the drivers work at whatever defaults they are set to.

Ahhhh... that explains a lot. Also a smart business move, as, aside from most who post here, most folks just load & go, and most driver installs default to looking for the application settings.

Thanks, Skuzzy!

I did test this out after my 1st post in this thread:

Since I run on the Nvidia driver settings (set to override any application settings), I have the in-game AA slider set to None (the game doesn't try to send an AA level to the vid card).
I did have the Nvidia drivers set to triple buffer & to use standard V-sync (not Adaptive V-sync).

After posting, the wanna-be engineer in me  :D went into the NV CP & made the following setting changes to test AHII:

I turned triple buffering off & set the NV driver's vertical sync to "Use the 3D application setting," since these settings tell the NV driver to "look" for the game's own V-sync and/or triple-buffering instructions (depending on whether the game's coding ties the two together; V-sync can be done w/o triple buffering, but I was pretty positive AH ties them together) & execute them if they exist.

Then I ran the game, picked a runway, spawned my trusty Mk IX (any prop-equipped plane in the game will do) & started the engine to watch the prop spin up thru the gunsight reticle, as this is a good place to see/test whether v-sync/triple buffering is being used in the rendering, regardless of the vid card's GPU or CPU speed/power. I have tested & known this for quite some time: if v-sync is not being used, the prop graphics will most certainly tear as they pass thru the reticle while the prop spins up from dead stop to engine idle speed; they won't tear if v-sync is being used.

The results of this test clearly show that, w/ the NV driver set as stated, AHII is natively written to instruct the vid card to use v-sync/triple buffering: the prop graphics were smooth & the blades never distorted, proving the graphics frame sequencing was synched, all w/ the in-game AA slider set to None (the game sending no AA instructions to the vid card drivers, AND the NV driver set to override/ignore any application AA settings).

I then flew around watching the graphics for any tearing or abnormal sequencing... saw none.
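(For anyone curious what "natively written to use v-sync/triple buffering" looks like at the API level: AH's source isn't public, but below is a generic D3D9 sketch of how any game requests both at device-creation time -- which is also why forcing the same thing again at the driver level fights the game.)

#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Generic D3D9 present parameters requesting v-sync + triple buffering.
// (Illustrative only -- not AH's actual code.)
D3DPRESENT_PARAMETERS MakePresentParams(HWND hwnd) {
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;                      // fullscreen would also set width/height/refresh
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferCount      = 2;                         // 2 back buffers + front buffer = triple buffering
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;            // match the desktop format in windowed mode
    pp.hDeviceWindow        = hwnd;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;   // present on vblank = v-sync
    return pp;
}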

Quote
From Skuzzy's quote: If you force triple buffering on in the video card driver, you will also cause stutters, as Aces High already triple buffers.

I can see now that this is a true statement, & the same goes for enabling v-sync at the vid card driver level w/ AHII.

I now run w/ triple buffering turned off & vertical sync set to "Use the 3D application setting" in the NV CP.

Thanks again Skuzzy.

 :salute
« Last Edit: January 18, 2014, 11:25:05 AM by Pudgie »
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd