Author Topic: Ryzen gen 3  (Read 3866 times)

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Ryzen gen 3
« Reply #15 on: December 17, 2019, 01:32:44 PM »
Update:

After getting all sorted out (all pertinent drivers updated), a couple of notes. 1st, the RTG group has changed the Radeon Adrenalin interface software as of the latest drivers for my RX Vega 64 vid card. IMHO this kinda sucks as it's not as intuitive to me as the original interface was, but I'm an old guy so there  :D. 2nd, if anyone else is considering buying a MSI AMD mobo....use the Dragon Center interface software at your own risk as this software is simply the pits! It installs fine but just won't start up. I tried several times to get it to work before giving up & deleting it, so I had to set up my fan\pump control thru the UEFI to manage the cooling duties, which sucks as now I can't make any necessary changes quickly. Other than this all is well.

With that done I went & flew around for quite a while last night (until 5:07 MST) to fully settle the thermal paste on the CPU & to see how all goes...

All I'm gonna say is that if the goal is to build a machine that can do all things & do them well (including gaming) & be affordable to most consumers, this Ryzen 9 3900X CPU on an X570 mobo is the one to get as a top-tier build. (The more sensible all-around build IMHO would be a Ryzen 7 3700X on an X570 mobo, as that gets you comparable performance for a little less money.)

While flying, this CPU just wasn't being pushed in any load scenario. It only sat between 4%-6% CPU usage the entire time w\ PBO kicking in when needed, using only 4 of the 12 available CPU cores, w\ 3 cores (thus 6 threads) clocking up to the advertised 4.6 GHz max boost on occasion but mostly clocking between base (3.8 GHz) & 4.5 GHz boost, regardless of the in-game load. I have the CPU in its default setup--AMD Cool'n'Quiet enabled, PBO enabled, SMT enabled, etc.--& removed all affinity settings in my AH CMD shortcuts, so Windows is doing all the CPU core thread scheduling, w\ the rest of the CPU cores\threads mostly shut down\rarely used.

FPS stayed pegged consistently at the 90 FPS mark (90 Hz set in Windows for my Asus MG279Q 27" FreeSync gaming monitor, which is the max freq w\ FreeSync enabled...older model), w\ the CPU throttling core speeds back & forth as needed but staying very cool: the CPU never exceeded 50*C & my RX Vega 64 graphics card never exceeded 42*C regardless of graphics load. That's a testament to the new tech AMD has incorporated into these gen 3 CPUs to deliver just the necessary amount of power to keep CPU operations fast & stable while reducing total power usage & wasted heat, so my loop is even more effective at holding temps down & the CPU & GPU can max out their power delivery when needed to maintain performance. So I find absolutely no need to manually OC anything w\ this setup...just keep the temps consistently below the temp\power delta threshold & the CPU\GPU will boost themselves up to the max clocks as\if needed.

'Tis the main reason why I chose to go to a properly sized closed-loop WC setup in my box: very consistent heat removal to maintain very consistent operating temps below the temp\power deltas, so the CPU\GPU won't get power throttled due to temps, only due to load changes. The gameplay is very, very smooth w\o hiccups...the frametime graphs verify this is occurring, as they're now essentially a straight line w\ hardly any noticeable variation while in game.
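
For anyone wondering what that "straight line" represents in numbers, here's a quick back-of-the-envelope sketch (just ratio math on the refresh rates mentioned above, nothing measured):

```python
# Sketch: the frame-time budget a fixed frame rate implies.
# At a steady cap, every frame must complete in the same time slice,
# which is why a consistent setup shows up as a flat frametime graph.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

print(round(frame_budget_ms(90), 1))   # 11.1 ms per frame at the 90 Hz FreeSync cap
print(round(frame_budget_ms(144), 1))  # 6.9 ms per frame at a 144 Hz monitor
```

So a flat frametime graph at 90 FPS means every frame is landing right around that ~11 ms mark.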

Just typed this to give a quick observation on how this Ryzen 9 3900X CPU is performing w\ AH. Will do some in flight recording to capture the inflight overlay readouts on my box's CPU\GPU & LAN ingame operations later on for my records.

More to come...

 :salute

PS--Just got done running some load tests in CINEBENCH R15 on this AMD Ryzen 9 3900X CPU. I ran 4 full all-core load tests back to back & the score never wavered from the 3097 it ran on the 1st run. Then I ran a single-core test & she scored a 211, which blew away the best scores I got w\ my AMD Ryzen 7 1800X (1625 all-core & 148 single-core)--that works out to roughly 1.9x faster in all-core load & 1.4x faster in single-core load. I then immediately ran another all-core test after the single-core test & she again repeated the 3097 score!
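
Checking the ratio math on those scores (just the numbers quoted above):

```python
# Comparing the Cinebench R15 scores quoted above:
# Ryzen 9 3900X vs the old Ryzen 7 1800X bests.
r3900x = {"multi": 3097, "single": 211}
r1800x = {"multi": 1625, "single": 148}

multi_speedup = r3900x["multi"] / r1800x["multi"]
single_speedup = r3900x["single"] / r1800x["single"]

print(f"all-core:    {multi_speedup:.2f}x")   # 1.91x
print(f"single-core: {single_speedup:.2f}x")  # 1.43x
```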

Now that is some consistency in performance demonstrated there & some more proof of this WC loop maintaining consistent temps to allow full power delivery to the CPU cores during testing to maintain consistent operation.

 :salute
« Last Edit: December 17, 2019, 02:02:28 PM by Pudgie »
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline SilverZ06

  • Silver Member
  • ****
  • Posts: 1727
Re: Ryzen gen 3
« Reply #16 on: December 18, 2019, 12:07:30 PM »
I just downloaded C15 to compare 3900x scores.
OpenGL: 167.16
CPU: 3123
Single Core: 203

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Ryzen gen 3
« Reply #17 on: December 18, 2019, 03:57:22 PM »
I just downloaded C15 to compare 3900x scores.
OpenGL: 167.16
CPU: 3123
Single Core: 203

Nice!

I ran all my tests back to back trying to see if this CPU would remain stable over a simulated "extended" usage. When CPUs get heat-saturated they tend to slow down some as they get hotter & settle into a lower score than the initial one, & this is the 1st CPU I've run thru CINEBENCH R15 that consistently held its initial score throughout, w\ the only interruption being the single-core load test. It will most likely take a while longer to heat-soak this CPU enough to get it to start throttling, but w\ a CPU core Tjunction temp threshold of 95*C I don't see that happening on my setup anytime soon!

After seeing all these results I went back into my UEFI & reset the MSI Hardware Monitor to slow my pump from 2448 RPM to 1457 RPM & reset my 2 rad fan speed curves to further reduce fan noise, since this CPU has such a high Tjunction threshold. Now when flying in AH the Ryzen 9 3900X only heats up to a max of 62*C @ 4%-6% usage (it gets coolant coming off my RX Vega 64 vid card after it passes thru a 120mm rear case rad to remove some of the heat pulled from the vid card), pulling a total of 55.2W of power (rated for 105W). Meanwhile my RX Vega 64 reaches a max of 42*C @ 99% GPU load (it gets coolant entering the block at close to ambient temps after passing from the CPU thru a 240mm front case rad & a 140mm reservoir before the pump intake, plus cool front room air passing thru the case along w\ a bottom 120mm case fan blowing cool air in from the bottom), pulling a total of 220.3W of power. See, these Radeon Vega series vid cards don't really use more power than their Nvidia counterparts IF you can get the heat buildup away from the GPU readily enough...the reference air coolers just won't do it. She's flatlining the FPS at the full 90 FPS (90 Hz) FreeSync RR of my monitor (FreeSync enabled), so I haven't crossed the threshold of my vid card's GPU for stuttering to appear yet, but it's really close!

All this is being fed from a SeaSonic PRIME Gold 850W 80 PLUS Gold certified PSU that runs at 90%+ efficiency at 50% load (my box's total power usage is just under\at the 395W design point of my EKWB closed-loop WC setup), so all is optimally sized to work well together.
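
The PSU sizing claim is easy to sanity-check w\ simple arithmetic (using the wattages quoted above):

```python
# Sanity-checking PSU headroom from the numbers above:
# ~395W total system draw on an 850W unit.
psu_capacity_w = 850
system_draw_w = 395

load_pct = 100 * system_draw_w / psu_capacity_w
headroom_w = psu_capacity_w - system_draw_w

print(f"load: {load_pct:.0f}% of capacity")  # 46% -- right near the ~50% efficiency sweet spot
print(f"headroom: {headroom_w} W")           # 455 W to spare
```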

This is what I wanted MSI's Dragon Center software for: to make quick, easy adjustments to the Hardware Monitor fan\pump control w\o having to shut down Windows. But to date I haven't figured out why the software won't start up after install...so gotta do it the old-fashioned way!

I think this CPU is a winner but it's not for everyone's usage\price point.

One thing is for certain though: I really can't see myself upgrading to another CPU for any reason going forward, outside of physical damage making it inoperable, as this thing can really do it all and do it very, very well!

The only items left for me to consider getting are one of them Intel Optane 905P 480Gb 2.5" SSDs (mostly cause I want it, don't need it), one of the new Radeon 5700 XT or better vid cards using the new RDNA architecture (again, mostly cause I want one, don't need it) & an external USB DVD drive w\ enclosure, as this Fractal Design Meshify C case doesn't have a slot for one (only to play back some of my old music DVDs from time to time...starting to miss listening to them while online).

I haven't had this much fun messing w\ a computer in a while.....

 :D  :aok  :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline SilverZ06

  • Silver Member
  • ****
  • Posts: 1727
Re: Ryzen gen 3
« Reply #18 on: December 19, 2019, 03:47:38 PM »
I have the XFX Thicc 3 version of the 5700XT and absolutely love it so far. I also purchased this external DVD drive for myself since my case also doesn't have a slot for an internal one. This one works well so far and for $25 I can't complain. https://www.amazon.com/gp/product/B07DLRG9VH/ref=ppx_yo_dt_b_asin_title_o09_s00?ie=UTF8&psc=1

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Ryzen gen 3
« Reply #19 on: December 20, 2019, 08:07:56 PM »
I have the XFX Thicc 3 version of the 5700XT and absolutely love it so far. I also purchased this external DVD drive for myself since my case also doesn't have a slot for an internal one. This one works well so far and for $25 I can't complain. https://www.amazon.com/gp/product/B07DLRG9VH/ref=ppx_yo_dt_b_asin_title_o09_s00?ie=UTF8&psc=1

As for the Radeon 5xxx XT series vid cards: if they make one using the AMD reference design w\ the same PCB layout & cooler mounting as this Vega vid card I've got (so I can swap on my EKWB full-coverage water block), I'll be all over that one. If not, then I'm looking at the PowerColor Liquid Devil 5700XT, as it comes w\ the EKWB full-coverage water block already attached, but I'll wait it out to see if the price comes down some...

Thanks for providing the link to that external DVD drive. I'm gonna get one of these as well, as I like the looks of it & the thin design...and it's at my new favorite shopping source too...  :D

 :aok

PS---Now this is some geeky info that I forgot to type earlier. I found out that even w\ this much CPU, Windows still won't make as good use of this CPU's cores when assigning out game threads (it still sometimes assigns threads as if a 4-core CPU w\ HT enabled were installed), so it still tends to pile thread load onto too few CPU cores for my tastes, even w\ the Windows scheduler update installed to help Windows better schedule threads across these AMD Ryzen CPUs. The CPU can handle it due to the higher clock speeds & IPC improvements vs the 1st gen Ryzen CPUs, but I asserted the CPU affinity command back in my CMD-created AH shortcut & set it up to instruct Windows to run the game threads exclusively on the 2nd CPU chiplet's cores--all 6 of them--to see if there was any improvement.

When I ran the game w\ this config, it showed use of all 6 of the 2nd chiplet's CPU cores (so 12 threads in total, as I also have SMT enabled), which freed the 1st chiplet's 6 cores for Windows to assign everything else to (vid card drivers, LAN, Windows itself, any running 3rd party software, etc.), meaning the cores in the 1st chiplet are hardly being loaded at all. This usage resulted in the CPU actually cutting the boost speeds needed to maintain performance (dropped from 4.6 GHz max on 3-4 cores in the 1st chiplet to 4.2 GHz max across all 6 cores in the 2nd chiplet) w\o any power increase across the CPU (it stayed in the same 50.2W-55.1W power usage range as before) and no loss of performance either.

(I use the MSI Afterburner\RTSS overlay w\ HWiNFO64 in tandem to hook the readouts MSI AB can't & show them thru MSI AB\RTSS's overlay...this is how I can see how all this is working\responding while flying the game in real time.) This is the fun part for me!  :D
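
For anyone wanting to try the same trick: the value you hand to `start /affinity` in a shortcut is a hex bitmask of logical processors. Assuming Windows numbers the 3900X's logical processors in core order w\ SMT enabled (2 threads per core, so physical cores 6-11 map to logical CPUs 12-23 -- worth double-checking on your own box w\ Task Manager's "Set affinity" dialog), the mask for the 2nd chiplet works out like this (the game path shown is just a placeholder, not my actual install location):

```python
# Build the /affinity bitmask for physical cores 6-11 (the 2nd chiplet
# on a 3900X), assuming logical CPUs 2n and 2n+1 belong to core n (SMT on).
second_chiplet_cores = range(6, 12)

mask = 0
for core in second_chiplet_cores:
    mask |= 1 << (2 * core)      # first SMT thread of this core
    mask |= 1 << (2 * core + 1)  # second SMT thread of this core

print(f"{mask:X}")  # FFF000 -> bits 12-23 set, i.e. logical CPUs 12-23

# The shortcut target would then look something like this
# (placeholder path; /high sets the priority class):
#   cmd /c start "" /high /affinity FFF000 "C:\Games\AcesHigh\aceshigh3.exe"
```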

 :salute
« Last Edit: December 20, 2019, 09:06:36 PM by Pudgie »
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Ryzen gen 3
« Reply #20 on: February 03, 2020, 11:43:36 PM »
Update:

After running this new AMD Ryzen 9 3900X CPU on default settings in the MSI 7C37vA4 BIOS that came on this MSI X570 Gaming Plus mobo (which was running very well w\ no hiccups anywhere), I finally got around to flashing the mobo BIOS up to the latest non-beta version (7C37vA6) to see how she does now, as there has been a lot of info on these gen 3 Ryzen chips having various relatively minor issues since launch on various mobos from various manuf's (memory compatibilities, PCI-E compatibilities, etc....I never had any issues), along w\ some performance improvements.

For reference, before the flash AHIII was running smoothly using approx 52W-55W of CPU power, cycling core clock speeds across the 2nd chiplet's 6 CPU cores between the 3.8 GHz base & 4.2 GHz boost under game load, while the 1st chiplet's cores remained at the 3.8 GHz base clocks (Windows using those 6 cores to run everything else, including itself). I do this thru a Windows CMD-created shortcut for AHIII: the CPU affinity command instructs Windows to run the game client on the 2nd chiplet's cores & the CPU priority command gives the AHIII game client high priority (min 75% of processing time) on those 6 cores. The GPU runs flat out at the max 218W-225W power curve @ 98%-99% GPU usage @ 144 FPS (using AMD's Enhanced Sync, which will max out the GPU as long as temps can be controlled & kept below the 85*C threshold). My loop is maintaining the GPU at 46*C-48*C max under this kind of load, so the GPU gets the full power limit applied, while the CPU reads 68*C-72*C--but that isn't due to the CPU's load. It's mainly due to the load on the RX Vega 64 GPU, since the coolant exiting the vid card block only passes thru the 120mm radiator to pull some heat out before it enters the CPU block....actual CPU temps are approx 10*C-15*C lower than reported.

One thing I've found w\ this MSI X570 Gaming Plus mobo: it takes a WHILE for a BIOS flash to complete (I timed it on my setup at 32 mins....I kid you not....I had gotten spoiled w\ the Gigabyte mobos I had prior, which completed a BIOS flash in 8-10 mins), but the good thing was that it flashed in good order and all was good. I went into the BIOS, set back up the Hardware Monitor fan\pump control settings, enabled the mobo's XMP setting to use the SPD setup info on the mem modules as before, then went thru the rest to ensure all was still set to default & rebooted into Windows in good order. All good.

While doing all this I noted that MSI had a new version of their Dragon Center utility software out, so I downloaded & installed it to see if they fixed it....they DID! So now I have the ability back to monitor all mobo functions & tweak the fan\pump speed profiles thru Windows w\ MSI's software. That gives me MSI Dragon Center, MSI Afterburner & HWiNFO64 running in the background handling all system monitoring (power, temps, clock speeds, mem speeds, LAN upload\download speeds, CPU\GPU\mem usage %, etc.) plus in-game CPU\GPU real-time monitoring thru MSI Afterburner & HWiNFO64 via RTSS (all as before this CPU\mobo upgrade, just w\ MSI Dragon Center in place of the Gigabyte APP Center utility I was using alongside the other 2).

Running the game, I've found that CPU performance has improved due to what appears to be some tweaking on AMD's part thru the new BIOS, as the CPU's power usage has increased in game to approx 64W-67W under load. The CPU now runs all 12 cores at 4.1 GHz base boost clocks, w\ the 2nd chiplet's 6 cores cycling between the 4.1 GHz base boost & as high as 4.5 GHz on as many as 4 of the 6 cores at a time, depending on CPU loading from the game client. The good news is that operating temps have not changed at all from what's listed above during game play, so my little closed-loop water-cooling system is hanging in there just fine (the 120mm radiator fan is turning at max speed @ 2,200 RPM, the 240mm radiator fans cycle at 1,874-1,912 RPM--all are EKWB's newer 120mm Vardar high-pressure cooling fans--w\ the EKWB 3.2L PWM DDC pump running at 1,485 RPM, which is the optimal pump speed for adequate coolant retention time in the radiators for maximum heat transfer), even though it wasn't sized w\ this Ryzen 9 3900X in mind (it was sized for the Ryzen 7 1800X).

Gonna run some more tests to see how she fares since mobo BIOS upgrade.

PS......I'm also using Win 10's Game Mode settings, which are supposed to help CPU clock rates while a game is running. I have it set to recognize AHIII as a game....I'm assuming it's working. Forgot to mention that before I posted initially...

FYI...

 :salute
« Last Edit: February 03, 2020, 11:47:43 PM by Pudgie »
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Vinkman

  • Gold Member
  • *****
  • Posts: 2884
Re: Ryzen gen 3
« Reply #21 on: February 04, 2020, 12:00:16 PM »
Pudge,
Thanks for the update. I have to admit I don't understand half of what you're talking about because I'm not a very knowledgeable computer guy, but it gives me a road map of things to research and teach myself.  :salute
« Last Edit: February 04, 2020, 12:02:11 PM by Vinkman »
Who is John Galt?

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Ryzen gen 3
« Reply #22 on: February 05, 2020, 08:03:39 PM »
Pudge,
Thanks for the update. I have to admit I don't understand half of what you're talking about because I'm not a very knowledgeable computer guy, but it gives me a road map of things to research and teach myself.  :salute

Hi Vinkman,
No problem. If it can help you out, use the search feature in here, as I have posted on a lot of what you've read in my last post, along w\ links to some of the specific articles & files you can use to learn & apply the same stuff I'm doing now. I've been doing all this over the last 4-5 yrs as I've taught myself how to take better advantage of what these >4-core, multi-core CPUs can provide to the usage experience (including gaming) thru a lot of capability already built into the MS Windows OSes (available since the Vista days) that can help existing apps (which includes games) & hardware run better--or at least run the way you want them to, according to your definition of "better"--w\o needing 3rd party software to access it.

The object is to make everything run smoother w\ as much reduced latency in the entire process as can be achieved. (Thus the phrase "your definition of better, faster, etc.": anything that reduces latency speeds a process up, so the CPU can process "faster" & thus the GPU can process "faster", even though the onscreen FPS numbers may not look any faster. The on-screen picture flows more smoothly w\ less latency at the current frame flip rate--or FPS\monitor RR, take your pick--& the difference can be seen in a GPU frametime graph running in the background.) The goal is for the GPU to be fully optimized to process as fast as it can w\o having to wait on any data it needs. The answer lies within the Windows OS itself: optimize the CPU side in both power delivery & efficiency by making the most efficient use of all the extra CPU cores & the much larger on-die L3 cache, tie it together w\ system mem capacity AND mem speed optimizations to keep the GPU fed, then take advantage of any coding within the vid card drivers to make the most of the GPU's performance (both power & efficiency, which can increase the actual FPS numbers) running w\ a game's software.
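
To illustrate that "same FPS numbers, smoother delivery" point w\ made-up numbers (these are not measurements, just a sketch): two runs can average the exact same frame rate while one has far steadier frametimes, and it's the spread, not the average, that a frametime graph exposes.

```python
import statistics

# Hypothetical frametimes in ms for 10 frames. Both runs average 10 ms
# per frame (~100 FPS), but one delivers frames far more evenly.
steady  = [10.0, 10.1, 9.9, 10.0, 10.0, 10.1, 9.9, 10.0, 10.0, 10.0]
jittery = [6.0, 14.0, 7.0, 13.0, 8.0, 12.0, 6.0, 14.0, 10.0, 10.0]

for name, times in (("steady", steady), ("jittery", jittery)):
    avg_fps = 1000 / statistics.mean(times)      # same for both runs
    spread = statistics.pstdev(times)            # what the graph makes visible
    print(f"{name}: {avg_fps:.0f} FPS average, {spread:.2f} ms frametime spread")
```

An FPS counter shows both runs as identical; only the frametime spread tells you which one actually feels smooth.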

Then anything a software developer can do thru their own coding to make better use of the OS AND these newer >4-core CPUs can make it even better. But I don't blame any software developer that doesn't take the time & effort to do this, as in reality it should all be done thru the OS itself. MS is only gonna cater to the areas where the majority of their money is made, & that ain't us gamers. BUT MS has built into their OSes the coding necessary to do all this; they just haven't taken the time to work out how to LOGICALLY apply it so that it fits the myriad of system configurations AND the myriad of software development processes used to write all the software in use.

But if a CONSUMER will take the time & make the effort to learn where all this is located within Windows & how to properly apply it to their software of choice on their specific platform configuration, MS & other outlets have made all of it available to the public on the Internet (where I found it all), along w\ instruction\direction on how to apply it on a Windows-based platform--& it was available even before I found it some 4-5 yrs ago. So in some sense you really can't blame MS either....at least not & call yourself genuine about it in all fairness....at least that's the way I look at it from my POV.

This is not an AMD vs Intel thing, as every Intel multi-core CPU w\ >4 physical cores on die will benefit from the exact same stuff applied thru Windows just as AMD's will. Most of what I use now was initially validated on an Intel i7 5820K 6-core X99 platform, then carried over to the AMD Ryzen AM4 platforms I've used since, & it's still valid today--just reworked to make it specific to AMD Ryzen's newer chiplet layout. (The base SMP design is still the same between the 2, so the Windows OS still looks at & manages them both in the same manner & by the same logical criteria as a 4-core-or-less SMP CPU using SMT--or HT if you prefer--which is where some of the optimization issues arise when an SMP CPU w\ >4 physical cores is used w\ a consumer-level Windows OS, which includes the current version of Win 10.)

So if I can do it, you can too, as I started from the same level of understanding as you, & if I can help you in any way along the way just shout...

 :salute

Now after all this I have gone back into the BIOS & enabled the ErP efficiency setting (disabled by default) that a lot of mobo manuf's provide thru their BIOSes to give a system energy-efficiency capability. This was mainly intended for enterprise operations initially but is making its way down to consumers as well. I wanted to see what effects it would have on overall system performance & stability, since I've always skipped it in times past, and since this particular platform has some similarity to some low-level server equipment used in business environments...

Upon rebooting, my box initially seems more responsive &, judging by my WC loop, is using less energy overall at the desktop (the loop is much quieter w\ less fan oscillation). I can't see any effects on the Ryzen 9 3900X's operation at the desktop level that would explain what I sense is happening outside of my WC loop, but when I run AHIII I can clearly see some effects of the ErP setting. The CPU still runs at the base boost clock of 4.1 GHz all-core, BUT the 2nd chiplet's 6 cores now show a max boost of 4.2 GHz on all 6 cores running the AHIII game client, instead of allowing random cores to clock higher when core load\temps allowed, as was the case w\ ErP disabled. AND the other 6 cores in the 1st chiplet are held to 4.1 GHz max or allowed to clock down as low as 3.4 GHz. So the ErP setting at the mobo level is influencing power delivery across the mobo to align component power w\ the amount of work needed to achieve a result.
The game runs flawless & very smooth. The GPU runs flat out @ 96%-98% usage w\ no oscillation or stuttering, so the CPU is adequately providing all the data to the GPU & it's not having to wait on anything to do its thing (this was also true before enabling ErP, just w\ the CPU cores allowed to clock higher, all else being equal). The game runs at the monitor's native RR of 144 Hz (144 FPS) 96% of the time using AMD's Enhanced Sync thru the drivers w\ FreeSync still enabled, w\ all in-game AHIII graphics settings at max except tree detail (around 2\3 max) & environmental reflections (default level of 1), the box checked to not allow downloading of skins, & all the rest of the in-game post-processing graphics controls in use, along w\ some graphics-enhancing settings on the AMD driver side. That puts a very sizeable graphics load on this RX Vega 64's GPU &, to a lesser extent, this Ryzen 9 3900X, but both handle it w\o any adverse operational issue--the CPU easily, as it's only using the same 64-67W of power (rated max 105W), whereas the GPU needs ALL of the 218-225W it's calling for to maintain this level of performance. That's only possible because my closed WC loop keeps her cooled down sufficiently, along w\ my SeaSonic PRIME Gold 850W PSU giving her all she needs w\ ease (just near\at 50% of total available PSU power over all the rails in use). So I'm assuming this ErP setting is trimming whatever other power needs across the mobo can be cut w\o affecting component effectiveness during operations.

So far this seems to work as intended, and as long as it isn't affecting gameplay in any negative way I'm gonna leave it enabled, as this CPU really doesn't need to run any faster than 4.0 GHz to maintain this level of performance from my perspective. My prior AMD Ryzen 7 1800X would almost allow the GPU to run flat out using Enhanced Sync, but I would've had to manually OC the 1800X to get 4.1 GHz on a core or 2 to have any chance of getting there (it would only go to its 3.8 GHz base all-core on default settings, which wasn't quite the level of smoothness I wanted w\ the RX Vega 64 running flat out--it couldn't quite keep up @ 144 Hz, but cut the GPU down under 120 Hz & it could maintain the GPU smoothly). This is why I wanted to move up to this Ryzen 9 3900X anyway: the Ryzen 7 1800X was the only item in the way, due to its core clock speeds not being high enough across all cores. My Gigabyte GA-AX370 Gaming K5 mobo's BIOS chips failing gave me a reason to start the upgrade process.

Now AMD has come out w\ a BIOS update that will allow a 1st gen Ryzen CPU to run on an X570 chipset AM4 mobo. Glad this happened after I finished upgrading mine...  :D

Now once I pick up one of these PowerColor Radeon 5700XT Liquid Devil vid cards, I should be able to run AHIII at full graphics load at the monitor's full native RR on my current platform w\ ease.....and use less power overall to do it than what I'm using now. A win-win situation using all Team Red...

Or wait on Navi 21.......but this particular PowerColor vid card is a niche product & may not be around much longer, so I'll have to weigh this out...

 :salute

Then I think I'll be satisfied for a while...
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Vinkman

  • Gold Member
  • *****
  • Posts: 2884
Re: Ryzen gen 3
« Reply #23 on: February 11, 2020, 09:31:46 AM »
Thanks Pudge,
I did have Game mode and activated it...mapping it to AH3. Game runs a lot smoother. Great tip, thanks for posting.
Who is John Galt?

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Ryzen gen 3
« Reply #24 on: February 13, 2020, 03:45:32 PM »
Thanks Pudge,
I did have Game mode and activated it...mapping it to AH3. Game runs a lot smoother. Great tip, thanks for posting.

No problem.

MS has been working on Game Mode to improve its effectiveness since they implemented it some 3-4 yrs back, but the coding at this time will only auto-recognize games written to the Windows Store code std--when any of those games start up, the OS will auto-enable Game Mode for them. To use it w\ any other game that doesn't have the Windows Store coding embedded, you have to manually set up Game Mode to recognize the game's .exe, as you have done; then Game Mode should start up whenever you launch the game from there on. Then you can set up the rest of the features within Game Mode if desired. Since I've been using my Windows CMD-created shortcut for AHIII w\ CPU Priority & CPU Affinity enabled & configured to my liking, I sometimes forget I have Game Mode enabled too...

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd