Author Topic: CPU Core Affinity Effect AHIII  (Read 997 times)

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
CPU Core Affinity Effect AHIII
« on: December 03, 2016, 11:08:53 AM »
Hi All,

I just got done running AHIII Patch 12 Dx11 at A28 to A47 using the same settings in each session (Intel CEIST off and AMD PowerEfficiency off so both CPU and GPU run at max power and clock speeds; AMD driver and in-game graphics settings identical). The only difference: in one session Windows set the CPU core affinity for AHIII to use all 6 CPU cores, and in the other I set the affinity myself in Task Manager's Processes tab so AHIII could use only core 0 and core 1 (2 CPU cores).

I was tangling w\ the same Typhoon con in each session over A47, so the scenarios were about as apples-to-apples as you can get.

Here are snippets of my MSI AB graphs from each session... please note the GPU frametime graph line in each.

From these you can clearly see how much better AHIII runs on my box when the game uses only 2 CPU cores vs 6.
The micro variations in the GPU frametime graph line on the 6-core snippet are caused not by the GPU but by the OS parsing game threads across the other 5 lightly loaded CPU cores. Those cores sit in a lot of idle time, which slightly throws off the CPU's signals telling the GPU when to fetch work and display graphics frames to the monitor, and that shows up on screen as stutter. It's most noticeable in the plane's prop rotation animation, w\ an occasional screen stutter becoming visible as well.

The GPU frametime graph line on the 2-core snippet, OTOH, is a nearly clean line, w\ no visible screen stutter and no noticeable variation in the plane's prop rotation animation at all during the session.

Note the game FPS was near stable at 78-79 in both graphs, so this isn't about the CPU\GPU maintaining FPS, or even about Internet packet streaming per se. This is really about signal timing between the CPU and GPU. These graphs show me that w\ the game spread across all 6 cores of my Intel i7 5820K, the OS parsing game threads across 1 loaded and 5 lightly loaded cores creates too much CPU core latency (read: CPU core idle\wait time) to stay in sync w\ the GPU, so the GPU slightly hesitates. With the game threads parsed across only the 2 more heavily loaded cores, the CPU stays in sync w\ the GPU much better because there is far less core idle\wait time, so the GPU hardly hesitates at all.
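
For anyone who wants to put a number on what those graph lines show, here's a rough Python sketch of measuring frame-to-frame variation from a logged frametime series. The values below are made up for illustration; export your own log from whatever monitoring tool you use.

    # Hypothetical frametime series in ms -- substitute real logged data.
    six_core = [12.5, 14.1, 11.2, 15.0, 12.8, 11.9, 14.6, 12.2]  # jittery run
    two_core = [12.6, 12.5, 12.7, 12.5, 12.6, 12.6, 12.5, 12.7]  # steady run

    def jitter(frametimes):
        # Mean absolute change between consecutive frames, in ms.
        deltas = [abs(b - a) for a, b in zip(frametimes, frametimes[1:])]
        return sum(deltas) / len(deltas)

    print(f"6-core jitter: {jitter(six_core):.2f} ms/frame")  # noticeably larger
    print(f"2-core jitter: {jitter(two_core):.2f} ms/frame")  # near zero

Both series average out to nearly the same FPS; the jitter number is what separates them.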

The game runs very sweet on my box under Dx11 now... it just runs even better when I'm using only 2 CPU cores instead of all 6.

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: CPU Core Affinity Effect AHIII
« Reply #1 on: December 03, 2016, 02:40:24 PM »
Question is tho, ignoring the graphs, could you really tell/see a diff in gameplay?
I never could regarding AH.
 :joystick:
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline morfiend

  • AH Training Corps
  • Plutonium Member
  • *******
  • Posts: 10435
Re: CPU Core Affinity Effect AHIII
« Reply #2 on: December 03, 2016, 05:58:16 PM »
Pudg,

  If you had, say, a 4-core 2700K, do you think it would make a noticeable difference?

 Next question: how exactly do you go about setting the affinity so the CPU uses only the 2 cores?

 I ask because in some frames of FSO, when there are large groups of planes, I see large frame-rate drops if I don't reduce my settings. I use a 7950 AMD card, which I hope to upgrade soon, and I'm not sure if it's a map thing or a computer-resource thing. I find AH3 runs much smoother for me than AH2 did when you drop a few frames, but I would like to set my detail and terrain to a higher setting for better visuals. If I do this in FSO I risk a slideshow effect ATM.



   :salute

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: CPU Core Affinity Effect AHIII
« Reply #3 on: December 03, 2016, 08:26:12 PM »
Question is tho, ignoring the graphs, could you really tell/see a diff in gameplay?
I never could regarding AH.
 :joystick:

Hi MADe,

The short answer is yes, I can see a difference in gameplay. But the difference is not always due to FPS alone; too many times this is misunderstood, IMHO. In general terms, FPS measures how fast a graphics card flips finished frames to the display. But within this realm it is just as important, if not more so, that the frame-flipping sequence is consistent so the scene animation is consistent, which shows up as SMOOTH SCENE MOVEMENT. It is this aspect of computer graphics that affects gameplay the most, over and above FPS, as it impacts control input, gunnery, views, etc.... any part of the scene that imparts motion and thus creates a timing scenario within the scene. FPS alone is not the end-all answer. In fact, FPS can mask these issues just as easily as show them up.
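
Here's a quick hypothetical illustration (Python, made-up numbers, not AHIII data) of how an FPS counter can mask bad frame pacing:

    # Two frame sequences with identical average FPS but different pacing.
    even   = [12.5] * 8           # ms per frame, perfectly paced
    uneven = [8.0, 17.0] * 4      # same total time, alternating short/long frames

    for name, frames in (("even", even), ("uneven", uneven)):
        avg_ms = sum(frames) / len(frames)
        print(f"{name:6s}: {1000.0 / avg_ms:.0f} FPS, worst frame {max(frames):.1f} ms")

Both runs report 80 FPS, but the uneven one holds some frames more than twice as long as others. That held frame is the stutter the FPS counter never shows.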

I use the MSI graphs to back this up because I can type my findings here all day, but unless a person has a good understanding of GPU frametiming and its effect on what you see on screen, those findings are most likely gonna be missed or dismissed.

This is the main reason both Nvidia and AMD are very big on GPU frametiming (the FCAT process) for determining overall graphics performance, and also why, IMHO, it should be closely monitored on a computer being used to play games.

GPU FPS and GPU frametiming are not the same thing, especially when you start adding in other methods to achieve the same end, such as Vsync, VRR (G-Sync or FreeSync), or high-Hz monitors. Those are all viable methods for achieving graphics scene consistency, but my testing has shown these issues are not just centered around the GPU or monitor. The CPU also has a part in this, especially when it has more than 4 physical cores on die and is running software that isn't written to make full use of all of them. The extra cores start to exhibit excessive idle\wait time (read as CPU latency) because Windows will try to use ALL of them to parse game threads. That throws off the CPU's signal timing to the GPU, whether it's telling the GPU to fetch work from the system mem cache to continue drawing\rendering frames, or to flip finished frames to display in sequence, and it WILL show up as a stutter or a screen tear. We have always just associated these w\ the graphics card or the Internet, but CPU latency can affect the operation of ALL of these devices.
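
A quick way to watch this for yourself (a Python sketch, assuming the third-party psutil package is installed):

    import psutil

    # Sample per-core load once a second for ~10 seconds while the game runs.
    for _ in range(10):
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        print("  ".join(f"core{i}: {p:4.0f}%" for i, p in enumerate(per_core)))

On a 6-core box running a game threaded for 2 cores, you'd expect to see one or two busy cores and the rest mostly idle... those are the lightly loaded cores I'm describing.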

Just to make it very clear, my box IS using Vsync, FreeSync and AMD FRTC set @ 80 FPS, all of which have worked to smooth out the graphics frametiming to the point that any CPU anomalies will show up in the GPU frametime graph line on my box... as I have shown. The differences are easily noted in the SMOOTHNESS of the game, which includes control inputs, gunnery angles\bullet trajectories, etc., as the CPU does all of that work and it has to stay in sync w\ the graphics frames being displayed; it's not necessarily about the SPEED or FPS of the GPU alone. This is most noticeable when viewing a plane's prop rotation animation while playing. If the prop arc rotation is smooth w\o any hitching, then all is in sync; if it shows variations in the rotation animation, then something is out of sync, and I have seen stutters and hitches in the screen animation as well. These show up on the GPU frametime graph line... as I have shown thru these graph snippets... but they aren't ALWAYS the fault of the GPU... as I have also shown.

As CPU core counts continue to grow, this is gonna become a bigger problem, IMHO, if game developers don't compensate for it in their software.

Pudg,

  If you had, say, a 4-core 2700K, do you think it would make a noticeable difference?

 Next question: how exactly do you go about setting the affinity so the CPU uses only the 2 cores?

 I ask because in some frames of FSO, when there are large groups of planes, I see large frame-rate drops if I don't reduce my settings. I use a 7950 AMD card, which I hope to upgrade soon, and I'm not sure if it's a map thing or a computer-resource thing. I find AH3 runs much smoother for me than AH2 did when you drop a few frames, but I would like to set my detail and terrain to a higher setting for better visuals. If I do this in FSO I risk a slideshow effect ATM.



   :salute

Hi Morfiend,

From my testing on my box, I have found that AHIII will run pretty well on 2-4 CPU cores w\ little variation, but here is the process to set this if you want...

http://www.techrepublic.com/blog/windows-and-office/change-the-processor-affinity-setting-in-windows-7-to-gain-a-performance-edge/
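
If you'd rather script it than click thru Task Manager every launch (affinity set in Task Manager doesn't persist across restarts of the process), here's a minimal Python sketch of the same idea, assuming the third-party psutil package. "aceshigh.exe" is a placeholder; check the game's real process name in Task Manager, and you may need to run this as administrator:

    import psutil

    TARGET = "aceshigh.exe"  # hypothetical name -- verify in Task Manager

    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == TARGET:
            proc.cpu_affinity([0, 1])  # restrict to core 0 and core 1
            print(f"Pinned PID {proc.pid} to cores {proc.cpu_affinity()}")
            break
    else:
        print(f"{TARGET} not found -- start the game first")

Run it after the game is up; the change takes effect immediately and resets the next time the process starts.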

HTC has said several times that AH was written to use 2 CPU cores, meaning they optimized the game's threading across 2 cores. It will run on a single core or across many cores as well, but the results may not be as good as on the 2 cores the game was designed and optimized for. This is due to how Windows handles CPU core affinity, not due to HTC. My testing has shown the validity of what HTC stated only to some extent; more than that, it has exposed what happens when you run AHIII on a multi-core CPU w\ more than 4 cores (like my Intel i7 5820K 6-core). I'm finding the extra cores induce too much overall CPU latency because they experience excessive idle\wait time and so aren't as efficient at processing game threads, causing timing issues w\ my box's GPU and culminating in GPU stutter seen both visually in-game AND in the GPU frametime graph line of MSI AB.

This kinda fits the scenario given in the blog link about setting CPU core affinity for older software that wasn't written to make good use of all the extra CPU cores.

But your mileage may vary, as your particular issues may have other, more mainstream causes, so don't get your hopes up...

Hope this helps you out.

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: CPU Core Affinity Effect AHIII
« Reply #4 on: December 03, 2016, 09:24:36 PM »
Here is another set of graphs from me playing at A3 just now, 1 session using all 6 CPU cores, the other using 2. Please note the GPU frametime line on both graphs...

Pictures are worth a thousand words...

 :salute

PS: This is where OC'ing a CPU's core speeds can make the issue worse instead of better, so be mindful of that as well when using a multi-core CPU w\ more than 4 cores and assuming the core speeds are just too slow...

 :salute
« Last Edit: December 03, 2016, 09:49:07 PM by Pudgie »
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: CPU Core Affinity Effect AHIII
« Reply #5 on: December 04, 2016, 12:10:58 AM »
OK, I'm gonna set up Afterburner to monitor as you have, but with W10. Let's see if W10 deals with thread allocation any better; W7 was new when quad cores were first introduced, so possibly MS got better with age.

I see no anomalies other than players warping. All animation appears fine; I get a little landscape scintillation in trees and towns, but it's minor.
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: CPU Core Affinity Effect AHIII
« Reply #6 on: December 04, 2016, 06:25:10 AM »
Let me explain something about multi-core CPUs.  You can have a million cores, but 99.99% of the time only one can run.  If all the cores need to fetch from memory, write to memory, or do any I/O, or if an interrupt occurs, then only one core at a time will run until the previous core is done with its operation.

Is it possible for 2 cores to perform better than 10, 20, or 100 cores at the same clock rate?  Yes, it certainly is.  It is very application-specific, though, and will depend greatly on the configuration of the computer.  As one example, someone using a USB audio device, which interrupts for each byte of data, will probably see smaller gains from a bigger set of cores than a person using a decent sound card.  That is due to the constant invalidation of all the cores' cache memory, which happens at each interrupt.
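
A rough way to see this kind of contention yourself (a hypothetical Python sketch; the numbers vary wildly by machine, so treat it as an illustration, not a benchmark): run the same memory-heavy job as 1 copy and then as one copy per core, and compare wall-clock time. If the cores were fully independent, N concurrent copies would take about as long as 1; the shared memory path makes them slower.

    import multiprocessing as mp
    import time

    def memory_bound_job(_):
        data = bytearray(64 * 1024 * 1024)   # allocate and zero a 64 MB buffer
        total = 0
        for i in range(0, len(data), 4096):  # touch every page
            total += data[i]
        return total

    if __name__ == "__main__":
        for workers in (1, mp.cpu_count()):
            start = time.perf_counter()
            with mp.Pool(workers) as pool:
                pool.map(memory_bound_job, range(workers))
            elapsed = time.perf_counter() - start
            print(f"{workers} concurrent copies: {elapsed:.2f}s")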
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: CPU Core Affinity Effect AHIII
« Reply #7 on: December 04, 2016, 10:06:23 AM »
OK, I'm gonna set up Afterburner to monitor as you have, but with W10. Let's see if W10 deals with thread allocation any better; W7 was new when quad cores were first introduced, so possibly MS got better with age.

I see no anomalies other than players warping. All animation appears fine; I get a little landscape scintillation in trees and towns, but it's minor.

Hi MADe,

That is the major point of all of this: if all screen\scene animation is smooth w\ little to no variation while running the game, regardless of the GPU FPS, then all is working fine. Whether the in-game FPS is 30, 40, 60, 80, 120, or 144 is not the main measurement here. Now, does FPS matter? I would say yes it does, especially IF smooth scene animation isn't achieved at a lower FPS; the faster the graphics frames are flipped, the less visible and\or noticeable the variations become, up to a point.

What this testing of mine has shown is that the CPU has a part to play in the graphics scene animation being\staying smooth, just as the GPU, sound card, Internet, etc. do, especially if the CPU has fewer than 2 or more than 4 physical cores on die and is running AHIII.

All this testing was done on my box, which uses a 6-core CPU, a GPU using VRR and an FPS limiter, and the game using Vsync. I tested w\ VRR on\off and FRTC on\off and saw no changes until I set the CPU core affinity to use fewer cores, which exposed the CPU latency on my box w\ all 6 cores in use. I also noted how Intel's CEIST was invoked due to the low CPU core loading from running AHIII under Dx11 (from my understanding, most likely influenced by Dx11 having multithreading capability at the API level, which takes advantage of a multi-core CPU\GPU and is more efficient than running under Dx9), and that didn't help the CPU either.

All this was a by-product of me going thru the process of trying to solve the game screen freezes\pauses I was seeing on my end, to help HTC identify the cause(s) on their end (if any) so the game would run correctly and clean.

In the process I actually found several items amiss on my end (the DSL modem\router went bad, a DSL filter had been removed) that weren't helping either, and fixing them gained some performance: a newer, more modern ADSL modem\router gave me true Gigabit LAN speed, which allowed me to increase the transmit\receive buffers on my Killer NIC, smoothing out the Internet packet flow to my box and reducing the issue.

 :salute
« Last Edit: December 04, 2016, 10:10:34 AM by Pudgie »
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: CPU Core Affinity Effect AHIII
« Reply #8 on: December 04, 2016, 01:35:32 PM »
Just for giggles, here is a snippet of my box running AHIII Patch 12 Dx11 using the 3 CPU cores that I've just discovered AHIII can\will use:

Looks good!

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd