Author Topic: question  (Read 1486 times)

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
question
« on: February 12, 2017, 10:38:58 AM »
In the Nvidia profile options,
does AH3 prefer shader cache on or off?
And automatic thread handling on or off?

ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: question
« Reply #1 on: February 13, 2017, 06:00:36 AM »
We already cache shaders, and we already manage our own threads.
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: question
« Reply #2 on: February 13, 2017, 11:08:09 PM »
Quote from: Skuzzy
We already cache shaders, and we already manage our own threads.

I wasn't aware of the game creating its own shader cache... I assumed this was done solely at the video card driver level.

Since the client is actually building a shader cache to store frequently used compiled shaders (in system memory, I assume) for reuse when called for, and the video card driver is set up by default to do the same thing but store its cache on an HDD/SSD (both Nvidia and AMD drivers ship with this setting enabled), is this duplicate caching of the AHIII client's shaders the potential smoking gun behind the screen pauses/freezes? I can envision it causing a potential TDR event...

Gonna run some tests with this disabled in the Crimson driver to see what happens...

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: question
« Reply #3 on: February 14, 2017, 08:01:37 PM »
No.

I have run with and without; no perceived difference.
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: question
« Reply #4 on: February 14, 2017, 08:02:27 PM »
cc
skuzzy
 :salute
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: question
« Reply #5 on: February 14, 2017, 09:24:31 PM »
So far, so good...

I haven't witnessed a single screen freeze/pause since disabling Shader Cache in the Crimson driver and running AHIII Patch 21 under DX11.

The game runs very well and feels somewhat more responsive on my end to boot.

Continuing with testing...

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: question
« Reply #6 on: February 19, 2017, 07:57:10 AM »
Update:

I finally saw the screen freeze/pause issue creep back in after playing the game for over three days, so this setting wasn't the main cause of the anomaly... but the change did slow the occurrences of these screen freezes/pauses somewhat, so I've witnessed them less often than before. That would make sense: anything that reduces GPU processing time, keeping it within the 2-second threshold before a TDR event is initiated, is a step in the right direction.
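For reference, the 2-second threshold mentioned above is Windows' default TDR (Timeout Detection and Recovery) timeout. It is controlled by the documented TdrDelay registry value; the fragment below is only an illustration of where the key lives (Microsoft documents it for test/debug scenarios, and changing it on a production box is at your own risk):

```
Windows Registry Editor Version 5.00

; TdrDelay = seconds the GPU may spend on one workload before Windows
; resets the display driver (a TDR event). The default is 2 seconds;
; the value 8 below is purely an example, not a recommendation.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:00000008
```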

But the much-improved game responsiveness on my box since disabling the shader cache at the video card driver level is very noticeable... which would also make sense, as the GPU no longer has to waste precious processing time working out which set of cached shaders to use, or in what sequence to use them. And it will always be the better choice to use the shader cache set up by the game client in system memory and run through the CPU (especially a multi-core CPU) over the shader cache set up by the video card driver on an HDD/SSD to be managed and used by the GPU... as long as AHIII (or any game, for that matter) provides one.
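The caching idea both the client and the driver implement can be sketched in a few lines: compile once, key the result by a hash of the shader source, and hand back the stored blob on later requests. This is a minimal illustration, not HTC's or AMD's actual code; `compile_shader` is a hypothetical stand-in for the real compiler:

```python
import hashlib

def make_cached_compiler(compile_shader):
    """Wrap a compiler so each unique shader source is compiled only once."""
    cache = {}  # hash of source -> compiled blob

    def compile_cached(source: str):
        key = hashlib.sha256(source.encode()).hexdigest()
        if key not in cache:            # first sighting: compile and store
            cache[key] = compile_shader(source)
        return cache[key]               # later sightings: reuse the blob

    return compile_cached
```

If the application already hands the driver pre-compiled blobs this way, a second cache at the driver level has little new work to store, which fits the single .bin file observed later in this thread.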

IMHO, the fact that AHIII creates and uses its own shader cache, so players know they can disable this setting in the video card driver if they choose, would be a valid item to add to the Hints & Tips section.

Just a thought...

Appreciate the info, Skuzzy.

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: question
« Reply #7 on: February 19, 2017, 08:26:51 AM »
Is that feature on, by default?

We keep telling everyone the best settings for the video card driver are the default settings.  We design around those settings.
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: question
« Reply #8 on: February 19, 2017, 08:45:02 AM »
Yes, Shader Cache is enabled by default in the AMD Crimson global driver settings, set to AMD Optimized.

I believe it is also on by default in the Nvidia drivers... at least it was the last time I had my GTX 780Ti installed.

Can't say for sure whether that's still the case, though.

 :salute

PS: Also, be aware that the Crimson driver is trying to apply tessellation in the game, as this is set to AMD Optimized by default. (I have witnessed it causing some stuttering, especially when flying around the tops of mountains with snow tiles, and it is probably being applied in other areas as well, like the trees.) Setting the driver to either "Use Application Settings" or "Override Application Settings" with the maximum level set to Off stops this from occurring, since AHIII doesn't apply tessellation at this time.

As you may have already surmised, this is while running under DX11...

FYI...

 :salute
« Last Edit: February 19, 2017, 10:14:54 AM by Pudgie »
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: question
« Reply #9 on: February 19, 2017, 10:49:56 AM »
One thing to note on AMD's Shader Cache...

When AHIII was being developed, AMD hadn't yet included this in the original Catalyst drivers or the Radeon Crimson 15.x driver series...

If memory serves, I believe Shader Cache was added starting with the Radeon Crimson 16.3 series drivers...

Nvidia drivers have had this feature for quite some time... even before the development of AHIII.

FYI.

 :salute

PS: Checked and found it was actually with Crimson 15.11.2 that Shader Cache was added...
« Last Edit: February 19, 2017, 11:39:09 AM by Pudgie »
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Bizman

  • Plutonium Member
  • *******
  • Posts: 9606
Re: question
« Reply #10 on: February 19, 2017, 01:16:46 PM »
Interesting, Pudgie...

Since I'm still using the good old HD6970, I suppose I wouldn't actually miss anything by reverting to drivers older than 15.11.2. If the older drivers add stability by not featuring Shader Cache, it would be an easy fix for many of us using older Radeons.

Mind you, I haven't installed the control panel at all, since I like to run the card at default settings. Plus, it kept asking me to check for non-existent driver updates... Now I only get a small error-message window every now and then, carrying the new logo.
Quote from: BaldEagl, applies to myself, too
I've got an older system by today's standards that still runs the game well by my standards.

My home page

Offline Rich46yo

  • Platinum Member
  • ******
  • Posts: 7358
Re: question
« Reply #11 on: February 19, 2017, 01:27:12 PM »
Nvidia is set to the default "Global", which means game settings, right? Anyway, I just disabled it in DX11, and I'll see if that does anything to the frame-stutter issue I had. I've been running DX11 since the version came out, and this stutter is only a relatively new issue. The game has run well in the past.

BTW, going to DX9 and dialing down settings did get rid of it. I would still like to isolate the exact cause, however. I kinda smell an Nvidia driver thing; their drivers have gone to pot over the last year or so.
"flying the aircraft of the Red Star"

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: question
« Reply #12 on: February 19, 2017, 03:08:01 PM »
Here is a good article that gives as good an explanation of AMD's Shader Cache and how it works as any...

http://www.anandtech.com/show/9811/amd-crimson-driver-overview/2

I went and checked my DxCache folder on the C:\ drive and found only one file, dated 2-18-17, so I deleted it and enabled Shader Cache in the Crimson driver to see what it is actually caching.

Started the game up and noticed it hesitated a little before bringing up the initial clipboard screen. I then went in game, flew around awhile, got out, and rechecked the DxCache folder: one .bin file had been created today, 2-19-17 at 12:07 (when I initially started AHIII). So the video card driver did write a shader cache of the compiled startup-screen shaders to disk to speed that part of AHIII up. When I started the game the second time, it started up quickly and processed into the hangar faster than the first time... so there is one instance of this Shader Caching working with AHIII. I flew around for a while, engaged some cons until I got shot down, exited the game, and checked the DxCache folder again: no other .bin files had been created beyond that first one.
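The folder check above can be scripted. This sketch lists the .bin files in a cache directory along with their modification times; the DxCache path varies by driver version, so it is taken as a parameter rather than hard-coded:

```python
import os
import time

def list_cache_files(cache_dir, ext=".bin"):
    """Return (filename, modified-time string) pairs for cache files."""
    entries = []
    for name in sorted(os.listdir(cache_dir)):
        if name.endswith(ext):
            mtime = os.path.getmtime(os.path.join(cache_dir, name))
            stamp = time.strftime("%Y-%m-%d %H:%M", time.localtime(mtime))
            entries.append((name, stamp))
    return entries
```

Running it against the driver's cache folder before and after a play session shows whether any new .bin files appeared, which is exactly the comparison made in this post.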

I know the AHIII client pre-loads graphics calls while running, and now I know it also pre-caches all compiled graphics/shader calls, using multiple threads across multiple CPU cores, into system memory for the GPU to use when called for. So the GPU shouldn't really have to compile any draw/shader calls itself, only execute them as it receives them... as long as the CPU/motherboard subsystem can keep up. My Intel i7 5820K six-core CPU with HT disabled, clocked at 4.0 GHz on the Intel X99 platform, easily does (so much for the real benefits of SMT in this game, as long as there are enough physical CPU cores available), judging from the lack of any additional .bin files being created by the video card driver, much less any paging out to disk with the 16 GB of system memory onboard (I ran MS Performance Monitor in the background once more, set up to detect any data moving from system memory to virtual memory, and got 0%). So with the capabilities of the hardware I have, unless something covert is going on, I should never see another .bin file created in my DxCache folder beyond the 2-19-17 12:07 one with Shader Cache enabled...

Gonna test this out from here to see if my box can make this "theory" come true...

So it appears that what I noted earlier, when I turned this setting off in the video card driver, was the driver reading the .bin file already in storage: once created and stored on disk, it remains available to the video card driver unless it is deleted from the DxCache folder. Hence the improved responsiveness I had noted with Shader Cache disabled...

Interesting...

 :salute

Quote from: Rich46yo
Nvidia is set to the default "Global", which means game settings, right? Anyway, I just disabled it in DX11, and I'll see if that does anything to the frame-stutter issue I had. I've been running DX11 since the version came out, and this stutter is only a relatively new issue. The game has run well in the past.

BTW, going to DX9 and dialing down settings did get rid of it. I would still like to isolate the exact cause, however. I kinda smell an Nvidia driver thing; their drivers have gone to pot over the last year or so.

As you have stated, I also suspect the AMD driver stack needs some "tuning" to better handle (read: interpret) the new AHIII graphics calls being sent through D3DCompiler_47.dll under DX11, but this will have to be done by AMD as well as Nvidia... and AHIII isn't very high on either AMD's or Nvidia's game-compatibility lists for Day 1 driver compatibility/performance.

Let's hope that with AHIII making the certification grade/favored status for the Oculus Rift VR headset and software, these two video card manufacturers will start to take notice and show some interest.

We can only hope...

 :salute


Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: question
« Reply #13 on: February 20, 2017, 07:07:57 AM »
Yes, NVidia ships with the setting at "Global" which means it is not going to do anything for Aces High III.

If AMD is indeed forcing it on, they are hurting our performance on their product.
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline 715

  • Silver Member
  • ****
  • Posts: 1835
Re: question
« Reply #14 on: February 20, 2017, 11:46:17 AM »
Quote from: Skuzzy
Yes, NVidia ships with the setting at "Global" which means it is not going to do anything for Aces High III.

If AMD is indeed forcing it on, they are hurting our performance on their product.

Skuzzy: could you explain that, please? Shaders are little programs, and if I understand correctly, shader caching just saves compiled versions of those programs so they don't have to be recompiled each time they are reloaded. If AH does its own caching, then it will be using the compiled versions, and AMD's shader caching won't be given any uncompiled shaders to cache?