Aces High Bulletin Board

General Forums => Hardware and Software => Topic started by: MADe on February 12, 2017, 10:38:58 AM

Title: question
Post by: MADe on February 12, 2017, 10:38:58 AM
In the Nvidia profile options, does AH3 prefer shader cache on or off?
Same question for automatic thread handling: on or off?

Title: Re: question
Post by: Skuzzy on February 13, 2017, 06:00:36 AM
We already cache shaders and we already manage our own threads.
Title: Re: question
Post by: Pudgie on February 13, 2017, 11:08:09 PM
We already cache shaders and we already manage our own threads.

Was not aware of the game creating its own shader cache............I assumed this was done solely at the vid card driver level.........

Since the client is actually building a shader cache to store frequently used shaders (in system memory, I assume) to be reused when called for, and the vid card driver is set up by default to do the same thing but store its cache on an HDD\SSD (both the Nvidia and AMD drivers ship with this setting enabled by default), is this duplicate caching of the AHIII client's shaders the potential smoking gun behind the screen pauses\freezes? I can envision this causing a TDR event................

Gonna run some tests w\ this disabled in the Crimson driver to see what happens.................

 :salute
Title: Re: question
Post by: MADe on February 14, 2017, 08:01:37 PM
no

I have run with and without; no perceived difference.
Title: Re: question
Post by: MADe on February 14, 2017, 08:02:27 PM
cc
skuzzy
 :salute
Title: Re: question
Post by: Pudgie on February 14, 2017, 09:24:31 PM
So far, so good...

Haven't witnessed a single screen freeze\pause since disabling Shader Cache in the Crimson driver and running AHIII Patch 21 under Dx11.

Game runs very well and feels somewhat more responsive on my end to boot, and that's not just perception.

Continuing w\ testing...

 :salute
Title: Re: question
Post by: Pudgie on February 19, 2017, 07:57:10 AM
Update:

I finally saw the screen freeze\pause issue creep back in after playing the game for over 3 days, so this setting wasn't the main cause of the anomaly...........but the change did noticeably slow the occurrences of these screen freezes\pauses, and I've seen them happen less often than before. That makes sense: anything that reduces GPU processing time and keeps it within the 2 sec threshold before Windows initiates a TDR event is a step in the right direction.
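
(For reference, the 2 sec figure above is the Windows TDR timeout. If anyone wants to check whether their box has it overridden, here's a minimal C++ sketch that just reads the TdrDelay registry value; when the value isn't present, Windows uses its 2 second default. Reading it is harmless; changing it is a different matter and not something I'm suggesting.)

// Sketch: read the Windows TDR timeout override (TdrDelay).
// If the value is absent, Windows falls back to its 2 second default
// before resetting the GPU.  Link with advapi32.lib.
#include <windows.h>
#include <iostream>

int main() {
    DWORD delay = 0;
    DWORD size = sizeof(delay);
    LSTATUS rc = RegGetValueW(HKEY_LOCAL_MACHINE,
                              L"SYSTEM\\CurrentControlSet\\Control\\GraphicsDrivers",
                              L"TdrDelay", RRF_RT_REG_DWORD, nullptr, &delay, &size);
    if (rc == ERROR_SUCCESS)
        std::cout << "TdrDelay override: " << delay << " seconds\n";
    else
        std::cout << "No TdrDelay override; 2 second default applies\n";
    return 0;
}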

But the much improved game responsiveness is very noticeable on my box since disabling the shader cache at the vid card driver level......which would also make sense, as the GPU no longer has to waste precious processing time deciding which set of cached shaders to use and in what sequence. It will always be the better choice to use the shader cache the game client sets up in system mem and runs through the CPU (especially a multi-core CPU) over the shader cache the vid card driver sets up on an HDD\SSD to be managed\used by the GPU.....as long as AHIII (or any game, for that matter) provides one.

IMHO, the fact that AHIII creates and uses its own shader cache, so players know they can disable this setting in the vid card driver if they choose to, would be a valid item to add to the Hints & Tips section.

Just a thought......................

Appreciate the info, Skuzzy.

 :salute
Title: Re: question
Post by: Skuzzy on February 19, 2017, 08:26:51 AM
Is that feature on, by default?

We keep telling everyone the best settings for the video card driver are the default settings.  We design around those settings.
Title: Re: question
Post by: Pudgie on February 19, 2017, 08:45:02 AM
Yes, Shader Cache is enabled by default in the AMD Crimson Global driver settings, set to AMD Optimized.

I believe it is also on by default in the Nvidia drivers.....at least the last time I had my GTX 780Ti installed it was.

Can't say for sure whether that's still the case, though.

 :salute

PS--Also, to let you know: w\ the Crimson driver, the tessellation setting defaults to AMD Optimized, so the driver is also trying to apply tessellation in the game. I have witnessed it causing some stuttering, especially when flying around the tops of snow-tiled mountains, and it is probably trying to apply it in other areas as well (like the trees). Setting the driver to either "Use Application Settings" or "Override Application Settings" with the maximum tessellation level set to Off stops this from occurring, since AHIII doesn't apply tessellation at this time.

As you may have already surmised, this is while running under Dx11...............

FYI.......................

 :salute
Title: Re: question
Post by: Pudgie on February 19, 2017, 10:49:56 AM
1 thing to note on AMD's Shader Cache..................

When AHIII was being developed, AMD hadn't yet included this in the original Catalyst drivers or the Radeon Crimson 15.x driver series..................

If memory serves, I believe Shader Cache was added starting w\ the Radeon Crimson 16.3 series drivers.............

Nvidia drivers have had this feature for quite some time..........even before the development of AHIII.

FYI.

 :salute

PS--Checked & found it was w\ Crimson 15.11.2 that Shader Cache was added.....................
Title: Re: question
Post by: Bizman on February 19, 2017, 01:16:46 PM
Interesting, Pudgie...

Since I'm still using the good old HD6970, I suppose I wouldn't actually miss anything by reverting to drivers older than the 15.11.2. If the older drivers add stability by not featuring Shader Cache, it would be an easy fix for many of us using older Radeons.

Mind you, I haven't installed the control panel at all since I like to run the card at default settings. Plus, it kept asking me to check for non-existent driver updates... Now I only get a small error message window every now and then, carrying the new logo.
Title: Re: question
Post by: Rich46yo on February 19, 2017, 01:27:12 PM
Nvidia is set to the default "Global" profile, which is the setting the game uses, right? Anyway, I just disabled it under DX11 and I'll see if that does anything to the frame stutter issue I've had. I've been running DX11 since the version came out and this stutter is a relatively new issue. The game has run well in the past.

BTW, going to DX9 and dialing down the settings did get rid of it. I would still like to isolate the exact cause, however. I kinda smell an Nvidia driver thing; their drivers have gone to pot the last year or so.
Title: Re: question
Post by: Pudgie on February 19, 2017, 03:08:01 PM
Here is a good article that gives as good an explanation of AMD's Shader Cache and how it works as you'll find...

http://www.anandtech.com/show/9811/amd-crimson-driver-overview/2

I went & checked my DxCache folder on the C:\ drive & found only 1 file, dated 2-18-17, so I deleted it & enabled Shader Cache in the Crimson driver to actually see what it is caching.

Started the game up & noticed that it hesitated a little before bringing up the initial screen w\ the clipboard, then went in game & flew around awhile, then got out & rechecked the DxCache folder. Found 1 .bin file created today, 2-19-17 @ 12:07 hrs (when I initially started up AHIII), so the vid card driver did cache the compiled shaders for the start-up screen to disk to speed that part of AHIII up. When I started the game up the 2nd time it started pretty quick and processed into the hangar faster than it did the 1st time....so there is 1 instance of this Shader Caching working w\ AHIII. Flew around for a while, engaged some cons until I got shot down, then got out of the game & checked the DxCache folder again & found no other .bin files had been created beyond that 1st one.
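
(Side note: here's a minimal C++17 sketch of the kind of folder check I did by hand. It just lists whatever .bin files are sitting in a DxCache folder along with their sizes. The path in it is only a placeholder, since the actual DxCache location varies by driver and install, so point it at wherever your driver keeps the cache.)

// Minimal C++17 sketch: list the .bin files a driver-side shader cache has
// written.  The folder path is a placeholder -- adjust it to wherever your
// driver keeps its DxCache folder.
#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

int main() {
    const fs::path cacheDir = "C:/DxCache";   // placeholder path, not a real default

    if (!fs::exists(cacheDir)) {
        std::cout << "No folder found at " << cacheDir << '\n';
        return 0;
    }
    for (const auto& entry : fs::directory_iterator(cacheDir)) {
        if (entry.is_regular_file() && entry.path().extension() == ".bin")
            std::cout << entry.path().filename().string() << "  "
                      << entry.file_size() << " bytes\n";
    }
    return 0;
}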

I know the AHIII client pre-loads graphics calls while running, and now I know the client will also pre-cache all compiled graphics\shader calls, using multiple threads across multiple CPU cores, into system mem for the GPU to use when called for. So the GPU shouldn't really have to compile any graphics draw\shader calls per se, only execute them as it receives them......as long as the CPU\mobo subsystem can keep up......which my Intel i7-5820K 6-core CPU (HT disabled, clocked at 4.0 GHz on the Intel X99 platform) is showing to easily do (so much for the real benefits of SMT for playing this game, as long as there are sufficient physical CPU cores available), judging from the lack of any additional .bin files being created by the vid card driver. There's no paging out to disk either, thanks to the 16 GB of system mem onboard (ran MS Performance Monitor in the background 1 more time, set up to detect any data moving from system mem to virtual mem, & got 0%)............. So w\ the capabilities of the hardware that I have, unless there is something covert going on, I should never see another .bin file created in my DxCache folder beyond the 2-19-17 @ 12:07 hrs one, even w\ Shader Cache enabled...

Gonna test this out from here to see if my box can make this "theory" come true.....................

So it appears that what I noted earlier when I turned this setting off in the vid card driver was the driver reading a .bin file that was already in storage; once created & stored on disk, it will always be there for the vid card driver to use unless it is deleted from the DxCache folder. That would explain the improved responsiveness I noted w\ Shader Cache disabled................

Interesting....................

 :salute

Nvidia is set to the default "Global" profile, which is the setting the game uses, right? Anyway, I just disabled it under DX11 and I'll see if that does anything to the frame stutter issue I've had. I've been running DX11 since the version came out and this stutter is a relatively new issue. The game has run well in the past.

BTW, going to DX9 and dialing down the settings did get rid of it. I would still like to isolate the exact cause, however. I kinda smell an Nvidia driver thing; their drivers have gone to pot the last year or so.

As you have stated, I also suspect the AMD driver stack needs some "tuning" as well to better handle (read: interpret) the new AHIII graphics calls being sent through D3DCompiler_47.dll under Dx11, but that will have to be done by AMD as well as Nvidia...........and AHIII ain't very high on either AMD's or Nvidia's game compatibility list for Day 1 driver compatibility\performance.

Let's hope that w\ AHIII making the certification grade\favored status for the Oculus Rift VR headset & software, these 2 vid card manufacturers will start to take notice and show some interest.

We can only hope...

 :salute


Title: Re: question
Post by: Skuzzy on February 20, 2017, 07:07:57 AM
Yes, NVidia ships with the setting at "Global" which means it is not going to do anything for Aces High III.

If AMD is indeed forcing it on, they are hurting our performance on their product.
Title: Re: question
Post by: 715 on February 20, 2017, 11:46:17 AM
Yes, NVidia ships with the setting at "Global" which means it is not going to do anything for Aces High III.

If AMD is indeed forcing it on, they are hurting our performance on their product.

Skuzzy: could you explain that please?  Shaders are little programs, and if I understand correctly, shader caching just saves compiled versions of these programs so they don't have to be re-compiled each time they are reloaded.  If AH does its own caching, then it will be using the compiled versions and AMD's shader caching won't be given any uncompiled shaders to cache?
Title: Re: question
Post by: Skuzzy on February 20, 2017, 12:15:38 PM
We already have all the shaders compiled and manage them.  From what Pudgie was saying it seems AMD is saving the compiled versions into a file on the drive, which is going to slow the deployment of our shaders.

I can see it speeding up games which ship plain text shader code.  That compile can take some time to do and you would not want to do it real time each time the shader needed to load.
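
Roughly, the difference looks like this in Direct3D 11. This is an illustrative sketch only, not our actual code: a game that ships plain-text HLSL has to call D3DCompile at load time (the step a driver-side cache saves on later runs), while a game that ships precompiled bytecode hands the blob straight to the device, leaving nothing for the driver's cache to speed up.

// Illustrative sketch only (not AHIII's actual code): the two ways a D3D11
// game can get a pixel shader onto the device.  Link with d3d11.lib and
// d3dcompiler.lib.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <vector>

// Path 1: shader shipped as plain-text HLSL, compiled at load time.
// This compile is what a driver-side shader cache can skip on the next run.
HRESULT CreateShaderFromSource(ID3D11Device* device, const char* hlslSource,
                               size_t sourceLen, ID3D11PixelShader** outShader)
{
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;
    HRESULT hr = D3DCompile(hlslSource, sourceLen, nullptr, nullptr, nullptr,
                            "main", "ps_5_0", 0, 0, &bytecode, &errors);
    if (errors) errors->Release();           // compiler messages, unused here
    if (FAILED(hr)) return hr;

    hr = device->CreatePixelShader(bytecode->GetBufferPointer(),
                                   bytecode->GetBufferSize(), nullptr, outShader);
    bytecode->Release();
    return hr;
}

// Path 2: shader shipped already compiled to bytecode.  No compile step at
// all, so there is nothing for a driver-side cache to save.
HRESULT CreateShaderFromBytecode(ID3D11Device* device,
                                 const std::vector<unsigned char>& blob,
                                 ID3D11PixelShader** outShader)
{
    return device->CreatePixelShader(blob.data(), blob.size(), nullptr, outShader);
}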
Title: Re: question
Post by: Easyscor on February 20, 2017, 12:42:48 PM
Excellent timing for a related question I found this morning.

I opened the sysinfo.cfg file and saw that SHADER0 and SHADER1 are both set to zero. Is this what it should be, and if not, is there something I need to be doing? Upgrading? LOL

sysinfo.cfg

VERSION,1
SURVEY,1
DEVICE0,NVIDIA GeForce GTX 1070
SHADER0,00000000
DEVICE1,NVIDIA GeForce GTX 1070
SHADER1,00000000
CPU,Intel(R) Core(TM) i7-2600K CPU @ 3.40GHz (8 CPUs), ~3.4GHz (That's a PCIe2 bus)
CNTRL0,CH PRO THROTTLE USB
CNTRL1,CH PRO PEDALS USB
CNTRL2,CH FIGHTERSTICK USB
OP_SYS,Major 6 Minor 1 (Win 7)
Title: Re: question
Post by: Pudgie on February 20, 2017, 09:52:38 PM
Update:

Just ran the game some this evening & caught my vid card driver actually caching shaders while playing........
Now, the .bin file wasn't very large (64 K), but this will cause a slowdown, so now that I have witnessed it going on I have disabled Shader Cache in the Crimson Global driver & deleted all the .bin files in the DxCache folder on my SSD.

Good to go on this now.

FYI.......................... ....

 :salute

Title: Re: question
Post by: zack1234 on February 21, 2017, 02:03:20 AM
For the PC illiterate: what do I have to do to my GTX 680 to run DX11 without the stutters?

Title: Re: question
Post by: Skuzzy on February 21, 2017, 06:43:43 AM
The GTX 680 is about the same performance as a 750Ti, so there should be a number of players who can comment on what they have done to maintain performance levels.

Pretty sure you will not be able to run with the default settings and maintain a steady frame rate in all conditions.

The DX9 version would run better for that card.