Author Topic: AMD Radeon R9 290X from XFX  (Read 2629 times)

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #15 on: February 28, 2015, 06:37:55 PM »
Ahhhh, mannnn...!

I had to cut off part of the previous post to fit within the 10,000-character limit, then open another post so I could paste the cut portion into it, as I figured would happen...

Except I tried to paste the text onto the desktop from the Clipboard instead of just opening the 2nd post and pasting it there, so the short of it is that I have lost the rest of my post. That part was the conclusion of the 2nd comparison along w/ the summary of both comparison tests that I wanted to share w/ all y'all (it was just as long as the prior post)... this sucks!

Well here is the short of this all IMO:
The best-performing vid card for AHII is the 780Ti IF both cards are pushed to the limits of their capabilities. The 780Ti is a more powerful card w/ its GK110 Kepler GPU: it will handle a much more stringent graphics feature load in AHII while maintaining the desired FPS and damn good imagery, more so than the 290X w/ its AMD Hawaii-based GPU can. The Nvidia GeForce WHQL drivers also offer a myriad more features & more frequent driver updates to enhance the user experience in AHII than the AMD Catalyst WHQL drivers do, & that is a plus in the 780Ti's favor & a big selling point for Nvidia.

OTOH, the best graphics imagery in color reproduction, contrast, sharpness, clarity, and ease on the eyes over extended gaming time, the kind that enhances the immersion experience & is just downright beautiful in AHII, goes to the 290X w/o a doubt over the 780Ti. I have to admit that AMD just does a better job of providing 1st-class image quality than Nvidia does, even though Nvidia does a damn good job w/ its raw graphics performance. This is a big deal for a lot of folks, myself included. I love clean & beautiful graphics, as the visuals are what drive the player's immersive experience, & right now image quality matters more to me than raw power since both vid cards have more than enough muscle to run the game at or near max settings smoothly & stay cool doing it. For years I had thought that Nvidia had finally caught up w/ or surpassed AMD/ATI in this category... the data says otherwise for me. Heck, I was able to clearly see & track dots outside of icon range, not only against the sky but against the background of ground-object detail, w/ this 290X that I could never pick out w/ the 780Ti, & that spoke volumes to me!

When price is brought into the discussion, the price/performance curve for AHII goes to the AMD Radeon R9 290X hands down, & that is a deciding factor for a lot of folks as well.

So in the end I'm going to stick w/ the XFX BE Radeon R9 290X in my box & keep the 780Ti as a backup card for the time being. The price/performance curve of the 290X is not the main reason for this decision; the image quality is, along w/ the idea of the Radeon's 512-bit memory bus providing maximum bandwidth potential. I didn't say it was superior to Nvidia's, so don't go there. I just know that a wider bus can move data w/ less chance of bottlenecking than a narrower one, & do it at lower GPU & mem clock speeds, & this appeals to me. The control response & feel of my HOTAS relative to the in-game visuals is most certainly influenced by how well the vid card's GPU & memory bus keep up w/ the graphics frames being displayed...

The XFX Double Dissipation Dual Fan Cooler is for real & will keep this card cool on air alone & stay pretty quiet in the process, just be aware that it is MASSIVE!

In the future I will be paying a lot more attention to AMD's offerings...

Sorry I lost the rest of my original posting, as I had numbers & other data to present for your consideration... too much to retype.

 :salute  :cheers:
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #16 on: February 28, 2015, 06:59:55 PM »
Update!

I finally found the rest of the original post & here it is... YAY!

Whew! 

 :D  :x

Conclusion:

Both vid cards met the optimization criteria stated above when set up as shown. Both cards needed the vid card driver's alpha graphics settings to achieve full GPU utilization in AHII, which makes sense: the respective companies would have tuned their GPU control programs against their own max settings to extract absolute max GPU performance, so a game developer would have to code equally or more aggressive alpha/beta graphics settings for the 3D game to do the same. From my testing, the only AHII in-game graphics setting that can push either of these cards into submission is the EM slider, & it can only do that when both vid cards are run as set up in comparison #1, as both were just 1 step below full EM updates.

Both cards exhibited very similar operating profiles. GPU clock speeds varied w/ load as noted in the 1st comparison. Mem speeds were pegged: the 780Ti at 3499 MHz across a 384-bit bus, the 290X at 1250 MHz across a 512-bit bus (IMO the 290X's 512-bit bus performance is the reason Nvidia sped up the mem from the original Titan design's 3005 MHz to counter the 290X w/o having to redesign the PCB). GPU temp loads stayed close to the same, averaging 2-3°C above the 1st comparison, but to do this the fan speeds were pegged at 100% on both cards; fan noise was noticeably lower on the 290X vs the 780Ti due to the different cooler designs (XFX DD dual-fan cooler vs the Nvidia Titan reference blower, both using a vapor chamber w/ heatpipes & aluminum fins on the GPU), though neither was anywhere near unbearable. GPU power levels were also similar, averaging 65%-74%, but GPU usage spread apart some, w/ the 780Ti on the lower end (780Ti avg 57%-62%, 290X avg 65%-75%, & again the 780Ti drew a much cleaner & tighter graph line than the 290X). This is where, IMO, either Nvidia's GPU Boost 2.0 does a better job of controlling the GPU than AMD's PowerTune does, or the power of the fully fleshed-out GK110 Kepler GPU over AMD's Hawaii GPU is realized. The driver setting comparison shown above also points to the GK110's performance advantage over Hawaii when both GPUs are pushed to their limits; the data I have seen makes this very clear to me. The feature-richness of the Nvidia drivers vs the AMD drivers also shows that Nvidia gives the consumer a LOT more control over the operation of their product than AMD does.
These feature-rich driver sets, along w/ the frequent driver updates vs AMD, also demonstrate to me the confidence Nvidia has in the quality (read: superior) of their products to give the consumer the satisfaction they desire when the money is spent, & this is worth something to consumers, which is why IMO they are more than willing to pay more for it.
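The bus-width point above can be put in numbers. A quick back-of-the-envelope sketch (my own arithmetic, not from the post): GDDR5 moves data at an effective 4x the 290X's reported 1250 MHz clock (5 Gbps per pin) and 2x the 780Ti's reported 3499 MHz (about 7 Gbps per pin), so the wider 512-bit bus nearly offsets the much slower clock:

```python
def gddr5_bandwidth_gbps(bus_width_bits: int, effective_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * data rate per pin."""
    return bus_width_bits / 8 * effective_gbps_per_pin

# 780Ti: 3499 MHz reported (double data rate) -> ~7 Gbps effective per pin, 384-bit bus
gtx_780ti = gddr5_bandwidth_gbps(384, 7.0)   # 336.0 GB/s
# 290X: 1250 MHz reported (quad-pumped GDDR5) -> 5 Gbps effective per pin, 512-bit bus
r9_290x = gddr5_bandwidth_gbps(512, 5.0)     # 320.0 GB/s
print(gtx_780ti, r9_290x)
```

So despite the 290X's far lower memory clock, its peak bandwidth lands within about 5% of the 780Ti's, which is exactly the "wider bus at slower clocks" trade-off described above.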

Now when graphics image quality is examined, here is where the 290X really separated itself from the 780Ti. Both cards produced superb graphics in both comparisons, but the 290X showed slightly better in the 1st test & really pulled away in the 2nd, w/ the sharper, cleaner, crisper & easiest-on-the-eyes images once the vid card driver's alpha graphics settings were applied. This was an eye-opener for me, as I have heard this stated in several threads on the AH BBS over the years, but after this head-to-head test I now see that it is very true. This is a big deal, especially if one sits in front of a computer & plays games for extended hours at a time. Now I know there are other settings in the Nvidia drivers that can be adjusted to match or exceed AMD here, but that is not the point, as the very same case can be made for the AMD drivers; kept at a base level, the 290X produces the overall better imagery than the 780Ti on my box, period. Both vid cards' frame timing was measured using the FCAT process at the hardware level: the 780Ti sequenced frames faster (16ms-18ms) than the 290X (22ms-28ms) & smoother as well (very tight graph lines w/ sporadic spiking on the 780Ti vs more variation & more regular spiking on the 290X), but this didn't provide a noticeable enough advantage in image reproduction & sequencing to make up for the 290X's image quality advantage. To some this can be a deal breaker, especially when price is brought into the discussion...

Summary:

Both cards are stellar performers in AHII & will be good choices to use even in future 3D applications.
The best performer goes to the 780Ti IF both cards are pushed to their limits in AHII, otherwise the performance will be essentially the same & thus a moot point.
The best graphics imagery quality goes to the 290X w/o any doubt vs the 780Ti across the spectrum, even though the 780Ti's image quality is damn good. The big caveat for the 290X vs the 780Ti, though, will be the quality of the vid card driver in use, as Nvidia has historically been much better in this area than ATI was in the past, & AMD has to improve on that to give consumers more confidence in their product. I had to find the latest Catalyst WHQL drivers to fix the 290X's issues, but the same can also be said for the 780Ti; I have installed a lot of driver updates for my 780Ti that didn't make the card perform any better in AHII but DID add more usable FEATURES to enhance the user experience if desired.

Personal choice & conviction will make the final choice in the end as to which vid card one chooses to buy. I did this to see for myself on my platform & have typed all this to share the results w/ the AH community.

So due to the better graphics imagery performance I'm gonna keep the 290X in my box & have the 780Ti as a backup card for the time being.

Ok I'm done...

Enjoy!

 :salute  :cheers:


Offline Getback

  • Platinum Member
  • ******
  • Posts: 6364
Re: AMD Radeon R9 290X from XFX
« Reply #17 on: February 28, 2015, 08:32:41 PM »
That Afterburner is an awesome tool. I miss that on my Nvidia card.

Both of those cards have an awe effect.


Offline 38ruk

  • Gold Member
  • *****
  • Posts: 2121
Re: AMD Radeon R9 290X from XFX
« Reply #18 on: March 06, 2015, 03:51:05 PM »
Great write up! You had problems with the 14.12's ... ? That sucks!   They were really supposed to add a bunch of features and performance to the 290.
« Last Edit: March 06, 2015, 03:54:08 PM by 38ruk »

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #19 on: March 06, 2015, 05:49:37 PM »
Great write up! You had problems with the 14.12's ... ? That sucks!   They were really supposed to add a bunch of features and performance to the 290.

Thanks for the kind words... :salute

After all this I was snooping around on the AMD site & took another looksee at the Cat 14.12 Omega drivers, & then I saw my mistake. I didn't pay attention to the listing the 1st time & didn't realize there are 2 versions of the Catalyst drivers: one for desktop discrete vid cards & one for the new APUs w/ CPU-integrated graphics. I had downloaded the APU version by mistake... :old:

I d'ld the right version of the Cat 14.12 Omega drivers & all is righteous now!
 :aok

The one thing I still frown about is the graphics driver controls that AMD won't let consumers access thru the CCC, unlike Nvidia does thru the Nvidia Control Panel. Just about every driver feature Nvidia exposes is available in the AMD drivers, but you have to use a 3rd-party utility like Radeon Pro to access them. That utility is what the CCC should be. I've been half tempted to delete the CCC, install Radeon Pro & use it in its place so I could really tune the 290X, but I haven't... yet. Sure would like AMD to change their tune on this one, especially w/ the Omega driver packages.

Outside of that though, I have really enjoyed this XFX BE Radeon R9 290X vid card & the graphics performance that it puts out.

 :salute

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #20 on: March 08, 2015, 06:46:17 PM »
Went back & deleted drivers & all else.

D'ld Display Driver Uninstaller (known as Driver Sweeper in its earlier incarnation), ran it in Safe Mode, cleaned out all Nvidia & ATI/AMD files & rebooted.

Reinstalled Cat 14.12 Omega drivers again (installed all except AMD Gaming Evolved & AMD HDMI Sound Driver) & set all back up.

Tested in game... performance is even better after cleaning out the old driver bits. Shoulda done this from the jump, but I was lazy.

 :D

FYI...

 :salute


Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #21 on: March 12, 2015, 11:43:43 PM »
Update:

For those who may take an interest in this...

I have been doing a LOT of reading up on this AMD R9 290X vid card, especially around the drivers & what is coded in them to work w/ the Hawaii GPU to optimize it...

First off, the AMD Cat 14.12 Omega drivers are the set that provides most of the same features Nvidia does in the 347.x drivers (including VSR, the same idea as DSR but not as configurable), but AMD doesn't let users access most of them thru the CCC; you have to use a 3rd-party utility such as Radeon Pro (which kinda sucks, BTW), so most are locked at levels AMD sets in the background. I wiped the Omega driver & reinstalled it w/o the CCC, then d'ld & installed Radeon Pro & made an attempt at setting up a profile for AH & tweaking these settings. The issue soon became clear: when I changed one of these tweaks it broke another setting, which messed up the AMD PowerTune 2.0 coding, which created more work to try to find a balance. I gave up after about 5-6 hrs of fiddling, deleted Radeon Pro, wiped the Omega drivers once more & reinstalled them, this time installing the drivers & CCC only, nothing else (I don't do video work of any kind or watch/record movies, & I sure ain't gonna let the AMD Gaming Evolved utility do anything on my box), so the rest of the driver pack is not needed.

I set everything back up as I had it before thru the CCC, & this time started reading up on AMD's PowerTune 2.0 GPU control & found out some interesting things...

According to AMD (& as I understand the science behind it... may not say much :D), this control coding is designed to maximize GPU usage from power & temp readouts, but not w/ the same methodology as Nvidia's GPU Boost (or AMD's own PowerTune 1.0 on the Tahiti GPU). GPU Boost is coded to use GPU power & temp readouts to optimize GPU clock speeds, boosting power & upping clocks to a BIOS-set max frequency as long as the GPU temp stays below a set threshold (80°C), according to a predetermined graphics load level influenced by the actual alpha graphics settings. PowerTune 2.0 instead works on the GPU's thermal dynamics: it reads the GPU's operating temp & power levels under the present graphics load, then runs an algorithm to determine the power level needed to boost the GPU clocks into the thermal range that promotes maximum heat transfer from the GPU to its cooling solution, using the cooling fan speed to hold the GPU at the temps that maintain those maximum heat-transfer rates. This is why Hawaii GPUs run hot: they are designed to run at a max temp of 95°C w/ minimal power leakage across the die, & thus stay stable at this temp. AMD claims that to achieve maximum heat-transfer rates from a GPU to a cooler, the temperature delta (the drop from the GPU heat spreader to the cooler plate) needs to be as high as possible, so the Hawaii GPU performs at its best when it is as close to this max heat range as is feasible, giving the user the best overall mix of performance, power usage & quiet.
So in short, if this GPU is overcooled & never reaches this optimal thermal range, PowerTune will actually work against the GPU optimizing itself to its fullest potential: the algorithm's calculations get thrown off & tend to cap GPU power at lower levels than are needed to hold the clock speeds under load, causing the GPU to run w/ less stability & become erratic in operation...
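As a toy model only (the real PowerTune 2.0 algorithm lives in the GPU BIOS and is not public, and all constants here are made up for illustration), the temperature-targeting behavior described above can be sketched as a feedback loop: step the clock up in fine increments while there is thermal headroom, throttle back once the target is exceeded:

```python
def powertune_step(temp_c: float, clock_mhz: int,
                   target_c: float = 95.0, step_mhz: int = 3) -> int:
    """One control tick of a PowerTune-style thermal governor (toy model)."""
    if temp_c > target_c:
        return clock_mhz - step_mhz          # over target: throttle back
    if temp_c < target_c - 1.0:
        return clock_mhz + step_mhz          # headroom: step back up
    return clock_mhz                         # in band: hold

# Crude thermal model: steady-state temp rises with clock; the governor
# walks the clock up until the card sits just under its temperature target.
clock, temp = 1000, 70.0
for _ in range(300):
    clock = min(max(powertune_step(temp, clock), 300), 1050)  # BIOS clock limits
    temp = 0.9 * temp + 0.1 * (40.0 + 0.05 * clock)           # lagged heating

print(clock, round(temp, 1))  # settles at the 1050 MHz cap, temp around 92.5
```

The point of the sketch is the same as the post's: the governor drives the card toward a hot steady state on purpose, and overcooling the GPU just means the loop never settles where it was tuned to.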

I can see why most users just won't buy into this, & so AMD claims that most folks misunderstand how it's supposed to work. I have read a myriad of reviews & write-ups on this & they all say pretty much the same thing.

So I ran some tests to see if I could actually watch this work as AMD claims. I reset the cooling fan profile in AB to run essentially a flat line from 40°C to 78°C w/ fan speed set at 40%-45% (AMD states the max fan speed needed to reach optimal heat-transfer rates is 47%, which is why the default fan speed in AMD OverDrive is 50%), then spike almost straight up from 78°C to 80°C w/ fan speed going from 45% to 100%. The profile looks like an "L" laid backwards. Then I went into AH & found the field where the activity was heaviest to tax the GPU (A16) & got in the thick of the action...
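The backwards-"L" profile described above is just a piecewise function of temperature. A sketch (my own simplification to straight-line segments; the AB curve editor works on the same idea):

```python
def fan_percent(temp_c: float) -> float:
    """Backwards-'L' fan profile: flat ~40% until 78 C, then a near-vertical
    ramp to 100% by 80 C (segments simplified to straight lines)."""
    if temp_c <= 78.0:
        return 40.0                                # flat leg: let the GPU warm up
    if temp_c >= 80.0:
        return 100.0                               # at the 80 C ceiling: full speed
    return 40.0 + (temp_c - 78.0) / 2.0 * 60.0     # steep 78-80 C ramp

print(fan_percent(60), fan_percent(79), fan_percent(85))  # 40.0 70.0 100.0
```

The flat leg is what lets the GPU climb into its hot operating band instead of being fanned back down, which is the whole trick being tested here.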

Lo & behold, the card performed excellently, maintaining FPS between 51-60 w/ all AH beta graphics options checked & sliders maxed out, w/ the EM slider set 1 notch off None, using the Omega driver alpha graphics settings as laid out in the earlier posting. GPU temps leveled off at 74°C w/ fan speeds staying below 52%, up from 62°C-65°C w/ fan speeds at 95%-100%. Looking at the graph lines in AB, I could see that the GPU had stopped clocking down to 0% usage on the low end of the usage swings (lows around 20%), & GPU clock speeds were much more stable across the playing time, holding in the 1000 MHz range. Went back into AH, upped the EM slider to 2 notches off None & ran the test at the same field; again the card performed excellently, maintaining FPS in the same range as before. Looking at the AB graphs after I got shot down: GPU temps leveled off at 78°C w/ fan speeds staying below 52%, GPU usage swings closed up some more w/ the lows at 40%, & GPU clock speeds increased to the 1038 MHz range on a steady line w/ much smaller variation.

So from what I have seen so far, it looks like what the AMD folks said about PowerTune 2.0 has merit: the hotter this GPU is allowed to get, the better it performs...

For the time being I ain't gonna let her exceed 80°C, to maintain some sort of thermal "safe zone", as I'm not ready to push this GPU to its max thermal range of 95°C... yet.

This is a concept that is not easy to grasp or accept & seems counterintuitive, but the results I got seem to agree w/ the AMD engineers, so I'm gonna run her hot from here forward.

Hope this helps.

 :salute

« Last Edit: March 13, 2015, 12:04:10 AM by Pudgie »

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #22 on: March 14, 2015, 02:23:58 PM »
Update:

Well, I have gone on & taken the next step...

I reset my fan profile in AB as follows:
Maintained a flat line at 40% fan speed from 0°C to 80°C, then ran the fan speed line straight up from 40% to 100% between 80°C & 90°C. The fans stay spinning at 40% until the GPU temp exceeds 80°C, then ramp up as needed, hitting 100% at 90°C. This provides maximum cooling near the GPU's max temp threshold & lets PowerTune 2.0 use the GPU's full temp headroom if needed.
To add to this, I then went into the CCC & set the driver AA sampling rate to the max setting of 24xEQ using Edge Detect, so now the drivers are at full max alpha graphics settings, which should prompt the PowerTune software (which lives in the GPU BIOS & runs on the GPU itself, not on a separate chip on the PCB) to push power & GPU clock speeds to maximum & manage from there.

Went into AH & tried to find a furball, but I couldn't at the time, so I just went up & flew around where a couple of bogeys were in the area. That way, when the dance got started it would work the card some, & if I got shot down or RTB'd I would check the AB graphs...

Once again the results proved out the AMD engineers' explanation of how this GPU & its control algorithm are supposed to work. The hottest this GPU got was 84°C, held within a tight band of less than 3°C of variation, essentially a straight line. FPS held between 50-60 the entire time, & the variation in FPS was due to PowerTune 2.0 performing GPU clock stepping to keep the GPU at max power & clock speeds while staying at/below the optimal temp threshold, along w/ the measured cooling fan speeds. (From what I understood from all the reading, PowerTune 2.0 steps the clock in approx 3 MHz increments vs the standard 13 MHz step that PowerTune 1.0 & Nvidia GPU Boost use; this finer clock-stepping granularity, along w/ improved power-step switching, is what AMD claims as superior to PowerTune 1.0 & GPU Boost 2.0.) I noticed this control dance occurring as the graphics scenes changed while flying. The gameplay was very smooth thruout, except for the occasional packet issue across my connection (I could see this happen in game when my wife was doing her thing online, Facebook & Pogo Club gaming; both computers sit beside each other, connected to the router via Cat 6e Ethernet, so when she streamed some YouTube stuff in Facebook I would begin to see slight hitches from time to time in my gameplay, & when she stopped, the hitches disappeared). GPU clock speeds held right at the max advertised 1050 MHz, varying down approx 12 MHz across the playing time (effective range 1038-1050 MHz), w/ fan speeds mostly at 40% & the occasional uptick to no more than 52%. Control response in game was excellent: very crisp, smooth & responsive. This is where I believe the 512-bit memory bus performance is showing up.
Had no issues w/ control spiking while riding the very edges of my Spitty's flight envelope, & won many a duel simply by waiting for my opponent to falter & give me the advantage while their tracer rounds were just missing me on the merges... very harrowing, I must say. Takes some gonads to do that in RL! :salute
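The granularity difference mentioned above (roughly 3 MHz steps vs the older 13 MHz steps) just means the governor can land closer to whatever clock target it computes. A small illustration (hypothetical numbers, not AMD's actual code):

```python
def nearest_step_clock(target_mhz: float, base_mhz: int, step_mhz: int) -> int:
    """Snap a target clock to the nearest discrete step the hardware can set."""
    steps = round((target_mhz - base_mhz) / step_mhz)
    return base_mhz + steps * step_mhz

# The same 1043 MHz target, quantized by a fine vs a coarse stepper:
fine   = nearest_step_clock(1043, 1000, 3)   # 1042 MHz, 1 MHz off target
coarse = nearest_step_clock(1043, 1000, 13)  # 1039 MHz, 4 MHz off target
print(fine, coarse)
```

A finer step size keeps the achieved clock closer to the computed target, which is consistent with the small, smooth 1038-1050 MHz variation observed above.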

Whenever HTC trusts us enough to allow attachments in the Hardware and Software section of the BBS, I have saved snippets of all this data to prove what I'm saying, for y'all to see for yourselves... just saying...

I have to admit that AMD has a damn good product here, but you can screw up its performance big time if you don't understand, believe in, or accept how they've designed this card to operate optimally. The hardest thing to see & accept is the GPU running at these high operating temps for extended periods w/o damage, but it also shows me that if you want these latest AMD cards to perform at their absolute best, you HAVE to let PowerTune 2.0 do its thing, & you HAVE to maintain low cooling fan speeds, since PowerTune 2.0 uses those in its calculations. So I wouldn't recommend a reference-cooler version (squirrel-cage blower fan) of this card, as those are the ones the complaints of fan noise & uncontrollable overheating stem from. This XFX card w/ its Double Dissipation cooler design (dual 120mm fans w/ a vapor chamber/heatpipe combination heatsink) handles this GPU's heat output easily w/ PLENTY of spare cooling capacity in reserve, & a box w/ superb airflow is just icing on the cake. So the 290Xs to get are the ones w/ the manufacturers' custom cooling designs...
 
So when you "overclock" a 290X, you're only setting a higher GPU max clock threshold above the norm (represented by the 0% clock speed readout), & so you will need to set a higher power % above the norm (the 0% power readout) for PowerTune 2.0 to use in trying to reach those clocks while holding GPU temps at/near the 95°C threshold. This temp threshold is set in stone: AMD won't allow it to be changed or accessed thru the CCC (as opposed to PowerTune 1.0) or thru 3rd-party software (like AB or Radeon Pro). The GPU's ability to hit higher performance levels will only be as good as the card's cooling capacity to hold the GPU as near to 95°C as it can w/o exceeding it, as GPU speeds will be throttled back as much as needed to bring the temp back under 95°C, then readjusted upward to find the level that maintains the threshold. The results may not reach "your" expected ideal performance level, depending on your understanding of how all this works, so I can clearly understand AMD's position that most consumers misunderstand how these cards are designed to operate: they will do things w/ the settings that, per their understanding, should improve performance, when in reality they will retard it & cause other undesirable issues, & then blame the product as faulty. GPU max clock speed, max power & max fan speed depend on one thing & one thing only: PowerTune 2.0's ability to hold the Hawaii GPU at its 95°C operating threshold to deliver optimal performance, power usage & noise levels. I can vouch that this XFX BE Radeon R9 290X can deliver in AH IF it is used as it was intended to be used. I also know that it won't if it ain't.

Caveat:

For the AMD folks in here, the next issue is that only certain versions of the R-series & earlier-series GPUs can use the features available in the Cat 14.12 Omega drivers. This is true even within the R9 series, so you've got to do some research to know whether your flavor of Radeon can take advantage of them, or which parts it can & cannot use, as the GPUs aren't exactly the same. This is also a reason I chose the R9 290X: it is AMD's flagship card at the moment, so it gets ALL the goodies the Omega drivers can activate that other AMD GPUs (Southern Islands, Tahiti & even some Hawaiis) can't use due to GPU design limitations; the 290X's Hawaii GPU has a little extra architecture over even the R9 290's, so it can take advantage of driver coding the 290 can't. This in itself sucks a big one for AMD users. It's not any different for Nvidia users EXCEPT that Nvidia is more forward in informing users WHICH vid cards their driver features will & won't work with, & Nvidia stays within a GPU class when doing so, so the user is far less confused. The info is there on the AMD web site, but you have to do a little digging to find it.

So don't take all this that I posted for the R9 290X & assume you'll get the same results using these drivers on your R9 270-and-up or your HD 7000 series; you need to check. Your card should run w/ this driver, but some or most of the driver features will not work depending on the particular GPU your card uses.

You have been warned... :D

I have to say it has been a lot of fun playing w/ this XFX BE Radeon R9 290X since I installed it in my box & worked to tune it the best I could, but along the way I found I had to adjust my thinking as to what max performance on it should look like & how to go about getting it...

Hope this can help someone out...............

 :salute



Offline MrRiplEy[H]

  • Persona Non Grata
  • Plutonium Member
  • *******
  • Posts: 11633
Re: AMD Radeon R9 290X from XFX
« Reply #23 on: March 14, 2015, 02:28:25 PM »

Whenever HTC will trust us enough to allow attachments to be attached to posts in the Hardware and Software section of the BBS I have saved snippets of all this data to prove what I'm saying for y'all to see for yourselves......just saying............

Google for 'free image host' and you'll get a plethora of options where to upload your images. They provide you with a link you can paste to the forum inside the img tags.
Definiteness of purpose is the starting point of all achievement. –W. Clement Stone

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #24 on: March 14, 2015, 04:37:37 PM »
Google for 'free image host' and you'll get a plethora of options where to upload your images. They provide you with a link you can paste to the forum inside the img tags.

Appreciate the info, MrRiplEy.

I'll take it into consideration...

 :salute

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #25 on: March 22, 2015, 12:29:53 AM »
Update:

Overclocking..........

After some time using this vid card, & having read in the myriad of reviews that the 290X was a poor overclocker, I made several attempts to see for myself by overclocking this card's GPU & mem:

Since AMD's PowerTune 2.0 GPU control determines where the GPU clocks & power levels can go on the 290X, just as GPU Boost 2.0 does on the 780Ti, all you are really setting is a higher max GPU clock speed\power level above stock; it is then up to PowerTune 2.0 to apply it all according to the operating GPU temp & where that temp sits relative to the max GPU TDP level.
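For anyone curious, the behavior described above can be sketched roughly in code. This is a toy illustration of a PowerTune/GPU Boost-style controller, NOT AMD's actual algorithm — the base clock, throttle temp, & linear scaling are all made-up numbers I picked to show the idea that you only raise the ceiling & the controller decides the actual clock:

```python
# Toy sketch of a PowerTune/GPU Boost-style controller: the user only
# raises the *ceiling*; the controller picks the actual clock each tick
# based on temperature headroom. All numbers are illustrative.

def boost_clock(temp_c, user_max_mhz, base_mhz=727, throttle_temp=94):
    """Return the clock (MHz) the controller would allow at this temp."""
    if temp_c >= throttle_temp:
        return base_mhz                   # hard throttle back to base
    # below the threshold, scale the clock headroom with thermal headroom
    headroom = (throttle_temp - temp_c) / throttle_temp
    clock = base_mhz + (user_max_mhz - base_mhz) * min(1.0, headroom * 2)
    return round(min(clock, user_max_mhz))

print(boost_clock(30, 1106))   # cool GPU: full user-set ceiling (1106)
print(boost_clock(93, 1106))   # near threshold: clocks pulled way back
```

This is why raising the slider to 1206MHz doesn't guarantee the card ever holds 1206MHz — the controller only grants what the temp/TDP budget allows.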
I did all this as follows:

All settings were made w/ the Cat 14.12 Omega driver set, as noted 10 posts back during round 2 testing........

1st

I went into AB & upped the GPU clock speed from 1050MHz to 1106MHz, left the mem at stock speed & power set @ 0%, & went in game....all operated OK & PowerTune maxed the GPU to the new max speed level w/ FPS holding 59-60, but I didn't really notice any effective performance boost. Upped the max GPU level to 1206MHz & went back in the game....didn't even get off the tower before the 290X started artifacting badly, then AH shut down. Had to reboot the box, then reset the AB GPU clock speed back to the default of 1050MHz. Went in AH & flew around...all operating smoothly as before.

2nd

I then went into AB & upped the mem clock speed from 1250MHz to 1350MHz, left the GPU clock speed at default, & went in game....all operated OK & PowerTune maxed the mem speed to the new max level w/o issue, holding FPS at 59-60. Here is where I noted a definite performance improvement, as the game ran much smoother & response improved. Upped the EM slider from 1/4 update to 2/3 update w/o any slowdown, stepped EM up to 3/4 update & all ran OK w/ intermittent slowdowns to 50-51 FPS, so I reset the EM slider to 2/3 update. Encouraged, I upped the mem speed further from 1350MHz to 1450MHz, & as soon as I saved this in AB the card started artifacting at idle. Reset the mem speed back to default & saved...artifacting stopped. Upped the power setting from 0% to 50% (max setting), then set the mem speed back to 1450MHz, & as soon as I saved it again in AB the card started artifacting, so I had to settle on 1350MHz....that was as high as the mem speed would go.

3rd

I then went into AB & upped the GPU clock speed from 1050MHz to 1150MHz w/ the mem speed set at 1350MHz & power set @ 50% (max), then went into AH & flew around. At 1st all looked OK, then I began seeing little artifacts after approx. 30 mins of flying....they were random in nature & very few, so I didn't figure on what was to come. After approx. 1 hr of flying the artifacts suddenly got more severe, then the game just shut down to a dark screen....no BSOD, & the sound kept running as if nothing was wrong. After 2 mins of this I hit the reset button & rebooted my box. When all came back up I went into AB, reset the GPU speed back to 1050MHz & the power back to 0%, & left the mem speed at 1350MHz, as the 290X ran best w/ the mem speed OC.

Conclusion

As was noted in the reviews, this 290X is indeed a poor overclocker, but I'm not so sure the issue is the GPU or the mem themselves....it sure makes 1 think that the Hawaii GPU & GDDR5 mem used on this card just don't have sufficient headroom for a good overclock. The way AMD's PowerTune 2.0 actually applies power to these chips may have something to do with it, but I couldn't tell. The 1 item I did note is that the stock mem speed isn't fast enough to fully exploit the 512-bit GPU\mem bus on the 290X, & that is a shame, as I clearly noticed a performance gain when I upped the mem speed over stock....which points to a potential graphics performance bottleneck on the card itself.
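As a back-of-the-envelope check on that 512-bit bus point, here is the simple bandwidth arithmetic (GDDR5 moves 4 bits per pin per quoted memory clock, which is why 1250MHz is advertised as 5Gbps effective):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    """Peak memory bandwidth in GB/s for a GDDR5 bus.
    GDDR5 transfers 4 bits per pin per quoted memory clock."""
    effective_gbps_per_pin = mem_clock_mhz * 4 / 1000    # Gb/s per pin
    return effective_gbps_per_pin * bus_width_bits / 8   # bits -> bytes

stock = gddr5_bandwidth_gbs(1250)   # 290X stock: 320.0 GB/s
oc    = gddr5_bandwidth_gbs(1350)   # stable OC:  345.6 GB/s
print(f"{stock} -> {oc} GB/s, {(oc / stock - 1) * 100:.0f}% gain")
```

So the 1250→1350MHz mem OC is an 8% bandwidth bump, which lines up w/ the smoother gameplay noted above.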

Hopefully the new leaked specs on the new R9 390X are indicative that AMD may have realized this & did something about it...............if only.........

Time will tell....................

 :salute

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #26 on: March 23, 2015, 09:15:33 PM »




Here are a couple of Afterburner snips of the XFX BE Radeon R9 290X (red) running AHII at full load on Cat 14.12 Omega drivers w/ all driver settings maxed out, all maxed out in game, & EM set 1 notch above None, & the EVGA GTX 780Ti (green) running AHII at full load on Geforce 347.25 drivers w/ all driver settings maxed out except TransAA (set @ multisample to match the 290X's setting), all maxed out in game, & EM set at Full Updates. Note: the FPS on the 780Ti was 60 FPS....the 535 FPS occurred when I got out of AHII to grab the snip, & the max GPU clock speed on the 290X was 1050MHz...covered up by the pause note...

Both were run on the box in the sig below at stock setup, w/ Intel Turbo Boost clocking the i7 4820K CPU to a steady 3.9GHz while running AHII....................

This is why I said that if both of these cards are loaded to their limits in AHII, the Nvidia GTX 780Ti wins w/ a fair amount of ease, displaying the raw power of a fully fleshed-out GK110 GPU optimized for single-precision operations....& this is a reference card at stock out-of-the-box settings going against an OC vers of the AMD R9 290X at stock out-of-the-box settings.

Nvidia has to date come up w/ 2 generations of vid cards since this 780Ti was top of the line (GTX 980 & now Titan X), so if a person really wants to max out AHII graphics-wise w/ very playable FPS in most/all game situations, Nvidia is the only way to go right now.

But as you can also see, AMD can get the job done as well, just not to the level that Nvidia can.......at the moment. The game is very playable, as you can also see, w/ nearly all settings maxed out.

Enjoy!

 :salute

PS--I tried to upload snips of both these cards w/ their drivers set up to use the AHII ingame alpha video settings w/ all ingame beta graphics settings at max, to show that both GPU's were not fully utilized, but the free image host I was using went on the fritz while uploading the snips so I'll try to upload those at a later time.................
« Last Edit: March 23, 2015, 10:07:12 PM by Pudgie »

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #27 on: March 24, 2015, 07:23:37 PM »


Ok folks, here are the 2 snips: the XFX BE Radeon R9 290X (red) w/ Cat 14.12 Omega drivers set to use the AHII ingame video settings, & the AHII video settings set as follows: Disable V-sync unchecked, AA slider set to Most, then all ingame beta graphics settings set to max, including EM; & the EVGA GTX 780Ti (green) w/ 347.25 drivers set to use the AHII ingame video settings, w/ the same AHII video settings: Disable V-sync unchecked, AA slider set to Most, & all ingame beta graphics settings set to max, including EM.

As you can see, the AHII alpha graphics settings could not push either GPU to full clocks....this is due to AMD's PowerTune 2.0 AND Nvidia's GPU Boost 2.0 algorithms NOT recognizing the AHII alpha graphics levels as a heavy load (I have stated this repeatedly in prior threads on this BBS from testing my Nvidia cards...nice to see that AMD's PowerTune 2.0 does the same....here's the proof), & so both GPU's were only loaded as necessary to maintain FPS....which neither card had any problem doing. So if a person calls using ALL the AHII graphics configurations max graphics, then either card will perform damn good in this setup & will satisfy all parties, & in that case I would recommend you save yourself some dough & go w/ AMD at this point....unless you just HAVE to have Nvidia, or you play more games than AHII, especially more modern games, that will require a more powerful video card.........

In the end it all comes down to personal choice....................... .............................

Well here it all is folks..............got some more stuff but the gist of it all is shown w/ these 4 snips.........

Enjoy!

 :cheers: :salute

PS--I keep forgetting to post this, but all was done at my monitor's native 2560x1440 res (DoubleSight DS-279W 27" 2560x1440 60Hz S-IPS panel LCD monitor....a purpose-built business monitor, but fast enough for gaming).
« Last Edit: March 24, 2015, 07:30:28 PM by Pudgie »

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #28 on: April 02, 2015, 01:41:55 PM »
As far as how I explained AMD's PowerTune 2.0 working on this XFX BE Radeon R9 290X.................
I set up the exact same type of fan profile w/ my EVGA GTX 780Ti ref vid card to test Nvidia's GPU Boost 2.0 & see if I got a similar performance profile (left a low fan speed of 40% set until temps exceeded 70°C, then set 100% fan speed at 75°C to give a 5°C cushion below Nvidia's GPU throttling threshold of 80°C....even though the GPU is rated to run at 97°C......hmmmm)..........
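The fan profile above (40% up to 70°C, 100% at 75°C) can be sketched as a simple curve. The linear ramp between the two set points is my assumption of how Afterburner interpolates — the 2 endpoints are the ones actually set in the post:

```python
def fan_speed_pct(gpu_temp_c):
    """Fan profile from the post: hold 40% up to 70C, 100% by 75C.
    The linear ramp between 70 & 75C is an assumed interpolation."""
    if gpu_temp_c <= 70:
        return 40.0
    if gpu_temp_c >= 75:
        return 100.0
    return 40.0 + (gpu_temp_c - 70) / 5 * 60.0

print(fan_speed_pct(65), fan_speed_pct(72.5), fan_speed_pct(80))
# 40.0 70.0 100.0
```

The steep 5°C ramp is what keeps the GPU sitting just under the throttle threshold instead of drifting across it.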

Well here's a sample of what this looks like....................



Here's another w/ DSR enabled & set @ 3840x2160 res.....................



As you can see Nvidia's GPU Boost 2.0 works in the same manner as AMD's PowerTune 2.0 does.........

The closer you run the GPU to the threshold temp, the more performance you will get out of the vid card............
Notice just how stable the GPU temps are maintained when run this way......................

Just putting this out there for those who are interested................... .......

 :salute

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: AMD Radeon R9 290X from XFX
« Reply #29 on: April 04, 2015, 03:26:42 AM »
1 last item to post......................... ...............

Thru all this testing of this Radeon (& the 780Ti as well) it never occurred to me that I was running both of these vid cards at a monitor native res of 2560x1440, essentially a HIGH RES monitor, & that due to the higher pixel density (smaller dot pitch), the amount of AA I was applying to both cards was gross overkill that would greatly use up the vid card's resources. Yes, maybe good for load testing vid cards to find the point where they fall on their knees, but in real practice it is utter nonsense: applying heavy AA to remove jaggies that can't be seen on a high resolution monitor, because the monitor's fine dot pitch is already doing the "anti-aliasing work", is just making the vid card do extra work for absolutely no gain or effect.
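To put a number on that pixel-density point, here is the standard pixels-per-inch calculation for this 27" 1440p panel next to a typical 24" 1080p panel (the 24"/1080p comparison panel is just an example size, not one from this thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution & diagonal screen size."""
    diag_px = math.hypot(width_px, height_px)   # diagonal in pixels
    return diag_px / diagonal_in

print(round(ppi(2560, 1440, 27), 1))   # ~108.8 PPI (this monitor)
print(round(ppi(1920, 1080, 24), 1))   # ~91.8 PPI (typical 1080p panel)
```

Roughly 19% more pixels per inch means each jaggie step is physically smaller on screen, which is why less AA is needed to hide them.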

So I ran tests on both my XFX BE Radeon R9 290X & EVGA GTX 780Ti vid cards, slowly dropping the AA level to find the point at which the jaggies would noticeably show up, then applying just enough AA to clear them..................

At my monitor's native res, both cards showed that the amount of AA needed to effectively clean up the jaggies was only 2x....................this is the LOWEST AA setting you can select in the drivers of both cards.

And as you can imagine, this unloaded both cards' GPU's, freeing that "extra power" for other duties to enhance the experience even further. Here is a snip of both cards w/ the AA set @ 2x & VSR\DSR enabled, running in AHII......290X (red) using a VSR res of 3200x1800 (the max VSR setting for my monitor's native res...available in the Cat 14.12 Omega driver only to date) & 780Ti (green) using a DSR res of 3840x2160 (2.25x the native res pixel count, w/ 100% smoothness...DSR started for the 780Ti w/ the 347.25 driver):
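The VSR/DSR render loads above work out as straight pixel-count arithmetic against the 2560x1440 native res:

```python
def supersample_factor(render_w, render_h, native_w=2560, native_h=1440):
    """How many rendered pixels per native screen pixel."""
    return (render_w * render_h) / (native_w * native_h)

print(supersample_factor(3840, 2160))   # DSR 4K: 2.25x the native pixels
print(supersample_factor(3200, 1800))   # VSR 3200x1800: 1.5625x
```

So dropping from 8x AA to 2x AA & spending the freed-up headroom on a 1.5x-2.25x supersample is trading one form of edge smoothing for another, cheaper one at this pixel density.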





As you can see they both performed superbly.........this is the area they were really designed & optimized for.....to be used in conjunction w/ a high res monitor. The gameplay was excellent on both cards. I kinda expected the 780Ti to run well, but the 290X made a pretty big performance jump when the AA level was backed down to the minimum & VSR was added....here is where the 512-bit GPU\mem bus came to bear. I now see that this R9 290X runs best on high res monitors, which squares w/ the advertising about performing well at 4K res, as you really don't need much AA at all at that res.

So from here on out I will run this 290X at 2x AA & a VSR res of 3200x1800, as this is where it shines the most performance-wise.

Enjoy!

 :salute