Aces High Bulletin Board
General Forums => Hardware and Software => Topic started by: BoilerDown on December 15, 2013, 03:53:51 PM
-
The first comprehensive review is out: http://www.anandtech.com/show/7582/nvidia-gsync-review
Suffice it to say, I want it, and I will buy a G-Sync monitor sooner rather than later. Watching Anand's YouTube clips, this sort of stuttering is what bothers me the most, particularly when I turn the view left or right. This awesome technology can't get here soon enough.
-
As long as they are a stupid price :old:
-
Not really that awesome. It is more along the lines of NVidia not being able to figure out how to make SLI work properly and honor vertical sync.
-
Not really that awesome. It is more along the lines of NVidia not being able to figure out how to make SLI work properly and honor vertical sync.
That's ridiculous and this has nothing to do with SLI. I must assume you didn't even bother to read the article.
-
I did read the article. I also read beyond it.
-
I did read the article. I also read beyond it.
In a year, all your high-end users will be buying new monitors to support this, and it won't be a support nightmare for you; in fact, you won't have to do a thing to "support" it. Unlike SLI, which isn't even mentioned in the article. Next time, try to read beyond your own cynicism.
-
Isn't G-Sync transparent from the game side? Then does it matter if Skuzzy supports it or not?
I've read the news as 'hey, we found a new way to charge you 150e for something we should be doing right in the first place'.
Also, G-Sync requires you to run at a fixed 60fps (with a 60Hz monitor) in order to be advantageous.
It is not a game changer for sure.
-
Isn't G-Sync transparent from the game side? Then does it matter if Skuzzy supports it or not?
I've read the news as 'hey, we found a new way to charge you 150e for something we should be doing right in the first place'.
Also, G-Sync requires you to run at a fixed 60fps (with a 60Hz monitor) in order to be advantageous.
It is not a game changer for sure.
Skuzzy apparently has a bunker mentality from SLI, so he's concerned that this will be another Nvidia headache. I merely mentioned that it won't be, as you're correct that it's transparent to the games.
If "should be doing right in the first place" means the way everyone on the planet has done it since the invention of LCD displays, then you're right. :huh Nvidia is the first to do it right; they should be given credit, not mocked.
How you read the article and came to the conclusion that it requires you to run at a fixed fps is beyond me. The technology updates the refresh rate every frame; it can change 100 times per second if needed. Each frame arrives on your screen when it's done instead of waiting for the refresh timer. That's the essence of the technology. You seem to have missed the whole point.
Most definitely a game changer.
-
It takes a 3 to 5% hit on performance, whatever that means.
But out of curiosity, BoilerDown, do you get a lot of stutters? I hardly do, and they only started when I got my new 1440 monitor. I just spent 600 bucks on that monitor and will be spending another 400 bucks on another video card. I don't see myself spending another 600+ dollars next year on a monitor that will fix something I don't notice. Just my opinion.
semp
-
But out of curiosity, BoilerDown, do you get a lot of stutters? I hardly do, and they only started when I got my new 1440 monitor. I just spent 600 bucks on that monitor and will be spending another 400 bucks on another video card. I don't see myself spending another 600+ dollars next year on a monitor that will fix something I don't notice.
If you take a look at Anand's YouTube videos in that review, they show exactly the kind of stuttering that bothers me the most, eliminated on the G-Sync side and present on the V-Sync side. G-Sync will allow you to increase graphics fidelity to the point where the framerate sometimes dips well below the nominal refresh rate of your monitor, be it set to 60Hz, 120Hz, or 144Hz, without stuttering. The effect is to make it feel like the framerate is much higher.
Right now, if you want your game stutter-free, you have to set graphics settings super low so that the FPS stays super high, so that even a highly complex scene won't reduce your framerate below your monitor's refresh rate. With G-Sync you'll be able to make more reasonable selections while keeping the gameplay smooth.
-
In a year, all your high-end users will be buying new monitors to support this, and it won't be a support nightmare for you; in fact, you won't have to do a thing to "support" it. Unlike SLI, which isn't even mentioned in the article. Next time, try to read beyond your own cynicism.
Has nothing to do with cynicism. It is the engineer in me.
I could write a white paper about this, but I do not have the time right now. If you think it is all that great, then fine. Go for it. From an engineering perspective it really is not all that special.
By the way, most stutters, when frame rates drop, are due to the computer not being able to generate the artwork fast enough to keep it smooth. Nothing in this is going to help computer-induced stutters be smooth.
-
If you take a look at Anand's YouTube videos in that review, they show exactly the kind of stuttering that bothers me the most, eliminated on the G-Sync side and present on the V-Sync side. G-Sync will allow you to increase graphics fidelity to the point where the framerate sometimes dips well below the nominal refresh rate of your monitor, be it set to 60Hz, 120Hz, or 144Hz, without stuttering. The effect is to make it feel like the framerate is much higher.
Right now, if you want your game stutter-free, you have to set graphics settings super low so that the FPS stays super high, so that even a highly complex scene won't reduce your framerate below your monitor's refresh rate. With G-Sync you'll be able to make more reasonable selections while keeping the gameplay smooth.
BoilerDown, not trying to put you on the spot, just asking a simple question: do you get stutters in the game right now? Talking about Aces High? Just curious; like I said, I hardly ever get them and I have my settings way up high.
semp
-
Has nothing to do with cynicism. It is the engineer in me.
I could write a white paper about this, but I do not have the time right now. If you think it is all that great, then fine. Go for it. From an engineering perspective it really is not all that special.
By the way, most stutters, when frame rates drop, are due to the computer not being able to generate the artwork fast enough to keep it smooth. Nothing in this is going to help computer-induced stutters be smooth.
That's not entirely correct. If vsync is enabled and your framerate drops below 60 - let's say to 59fps - then due to the nature of vsync your framerate will instantly dip to 30fps. This is very noticeable and drastic. If your frames dip below 30 they immediately halve to 15, etc. G-Sync removes this limitation by dynamically adjusting the refresh rate, if I understood the thing correctly.
Also, when your framerate dips below 60 (or whatever your monitor's supported refresh is), vsync forces the display card to wait for the slower refresh, becoming a bottleneck: even if your card could push 59fps you only get 30 out of it as long as you can't achieve the full 60fps.
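To make the halving concrete, here is a tiny standalone sketch (my own illustration, not something from the review): with double-buffered vsync the buffer swap can only happen on a refresh boundary, so the displayed frame time is the render time rounded up to the next whole refresh interval, while a variable-refresh display simply shows each frame as soon as it is finished.

#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double refresh_ms = 1000.0 / 60.0;  // 60 Hz monitor: one refresh every ~16.7 ms
    for (double render_ms : {10.0, 16.0, 17.0, 25.0, 34.0}) {
        // Double-buffered vsync: the swap waits for the next refresh boundary, so the
        // displayed frame time is the render time rounded up to a whole number of refreshes.
        double vsync_ms = std::ceil(render_ms / refresh_ms) * refresh_ms;
        // Variable refresh (G-Sync style): the frame is scanned out as soon as it is done.
        std::printf("render %4.1f ms -> vsync %4.1f fps, variable refresh %4.1f fps\n",
                    render_ms, 1000.0 / vsync_ms, 1000.0 / render_ms);
    }
    return 0;
}

A 17ms frame (59fps worth of work) comes out at 30fps under vsync but roughly 59fps with variable refresh, which is exactly the jump described above.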
Of course you know this already; I just don't understand why you don't see the removal of this halving as a positive step.
In a sense I agree with Skuzzy, however. G-Sync is not very exciting to me as it both requires buying new hardware and is vendor-limited.
-
BoilerDown, not trying to put you on the spot, just asking a simple question: do you get stutters in the game right now? Talking about Aces High? Just curious; like I said, I hardly ever get them and I have my settings way up high.
I haven't flown a sortie in Aces High in around two years, so I don't know. I just pay the sub so that this game doesn't perish from this Earth.
And I've been more active on the forums as of late because I'm getting bored of Planetside 2 and the Star Citizen Dogfighting Module has been pushed back to February, so it's probable I'll resume playing Aces High again soon (tm). (But I haven't yet.)
-
For Xmas I'm buying an SSD as that's my current bottleneck.
I'll watch G-Sync as it evolves.
-
Like anything, Aces High does suffer from stutter, but I highly doubt G-Sync will make enough of a difference for it to be worthwhile.
May as well invest in better core PC parts.
-
That's not entirely correct. If vsync is enabled and your framerate drops below 60 - let's say to 59fps - then due to the nature of vsync your framerate will instantly dip to 30fps. This is very noticeable and drastic. If your frames dip below 30 they immediately halve to 15, etc. G-Sync removes this limitation by dynamically adjusting the refresh rate, if I understood the thing correctly.
Also, when your framerate dips below 60 (or whatever your monitor's supported refresh is), vsync forces the display card to wait for the slower refresh, becoming a bottleneck: even if your card could push 59fps you only get 30 out of it as long as you can't achieve the full 60fps.
Of course you know this already; I just don't understand why you don't see the removal of this halving as a positive step.
In a sense I agree with Skuzzy, however. G-Sync is not very exciting to me as it both requires buying new hardware and is vendor-limited.
Adaptive V-sync fixes the problem below the monitor refresh rate, i.e. vsync is simply turned off there.
-
For Xmas I'm buying an SSD as that's my current bottleneck.
I'll watch G-Sync as it evolves.
Changeup runs an i7-950 on a Sabertooth X58 board with a 660Ti. Changeup had 21 fps in clouds and on busy fields until he bought a Samsung Pro 256 Gig SATA 6 SSD. Now he flies around in busy clouds at 52.
Changeup happy :aok
-
Changeup runs an i7-950 on a Sabertooth X58 board with a 660Ti. Changeup had 21 fps in clouds and on busy fields until he bought a Samsung Pro 256 Gig SATA 6 SSD. Now he flies around in busy clouds at 52.
Changeup happy :aok
Uh, that would indicate that AH does not buffer all textures to RAM (even main RAM, from where a copy to VRAM would be quick) and/or does not use non-blocking code in the texture loads, which will make the client freeze whenever it needs disk i/o.
-
That's not entirely correct. If vsync is enabled and your framerate drops below 60 - let's say to 59fps - then due to the nature of vsync your framerate will instantly dip to 30fps. This is very noticeable and drastic. If your frames dip below 30 they immediately halve to 15, etc. G-Sync removes this limitation by dynamically adjusting the refresh rate, if I understood the thing correctly.
Also, when your framerate dips below 60 (or whatever your monitor's supported refresh is), vsync forces the display card to wait for the slower refresh, becoming a bottleneck: even if your card could push 59fps you only get 30 out of it as long as you can't achieve the full 60fps.
Of course you know this already; I just don't understand why you don't see the removal of this halving as a positive step.
In a sense I agree with Skuzzy, however. G-Sync is not very exciting to me as it both requires buying new hardware and is vendor-limited.
Huh? My computer doesn't do that. Rarely, but on occasion, I'll see FR's in the 45-49 range with vsync on. I also never see stuttering.
-
Huh? My computer doesn't do that. Rarely, but on occasion, I'll see FR's in the 45-49 range with vsync on. I also never see stuttering.
Triple buffering, or adaptive vsync (which is Nvidia proprietary tech), gives more flexibility to the rates.
Basically, however, it goes like this (partially ripped from another forum):
Vsync synchronizes the buffer swap with your monitor's vertical refresh rate. If there are the usual two buffers (double buffering), your GPU is held up until a buffer swap can be made (if your GPU is fast enough, that is). So if your GPU can't finish drawing the buffer by the next refresh, it has to skip that refresh entirely. If you're refreshing at 16ms intervals but a frame needs 17ms to draw, it will miss the sync interval. Because the buffer swap is locked to the refresh rate it can't take place at the 17ms mark - it needs to wait until the next interval, which will be at 32ms. Likewise if a frame needs 33ms it will miss the intervals at 16ms and 32ms, and will need to wait until the 48ms interval.
This is why and how vsync affects framerates. AH2 is supposed to enable triple buffering automatically AFAIK, but I have noticed that in some cases I had to force it on from the graphics settings.
Judging from Changeup's post, though, it seems that some of AH's stuttering does not result from a lack of rendering power; instead AH has trouble with its i/o implementation. If software code is made 'blocking', as naturally happens when you use loops etc., any operation will force the code to hang for its duration. Since i/o is inherently very slow, making i/o operations blocking can cause very visible stutters, since the software basically stops working for the duration that some texture is loaded. This also plays together with vsync, so that even a small texture block needing i/o is enough to make the rendering skip a frame or two, leading to visual problems.
There is a solution to this, called 'non-blocking code', which essentially performs the operations in parallel so that loading textures does not hang the main code at all - but this type of code is much more demanding to write, and any software title that has roots years back usually doesn't make use of it.
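As a rough sketch of the non-blocking idea (a generic pattern with a made-up file name, not HTC's actual code): the slow disk read happens on a worker thread while the render loop keeps drawing, and the result is only picked up once it is ready.

#include <chrono>
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <future>
#include <iterator>
#include <string>
#include <vector>

// The slow, blocking part: read raw texture bytes from disk. Runs on a worker thread.
static std::vector<uint8_t> LoadTextureFile(const std::string& path) {
    std::ifstream file(path, std::ios::binary);
    std::istreambuf_iterator<char> begin(file), end;
    return std::vector<uint8_t>(begin, end);
}

int main() {
    // Kick off the disk i/o asynchronously; the render loop below is never blocked by it.
    // "terrain_tile_042.tex" is a made-up name, purely for illustration.
    auto pending = std::async(std::launch::async, LoadTextureFile,
                              std::string("terrain_tile_042.tex"));

    for (int frame = 0; frame < 600; ++frame) {
        // ... render this frame with whatever textures are already resident ...

        // Poll without waiting: if the load has finished, take the data (a real engine
        // would upload it to video RAM here); otherwise just keep drawing.
        if (pending.valid() &&
            pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
            std::vector<uint8_t> bytes = pending.get();
            std::printf("texture ready during frame %d (%zu bytes)\n", frame, bytes.size());
        }
    }
    return 0;
}

A blocking version would simply call LoadTextureFile() in the middle of the render loop and stall every frame that needed a file, which is the stutter being described.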
The problem with old software projects can be that the main codebase is so massive that reworking the legacy stuff in there is just too resource-heavy a task to do on a tight budget and time frame.
I'm not saying this is the case with AH as I have no knowledge of it, but it can be the case.
-
I have a GTX 680, an i7 @ 4.3, an SSD, a hard drive and loads of Corsair RAM in my gaming PC :old:
I use TrackIR and get a stutter every now and then; might triple buffering sort this out?
Obviously it might be something in the background, but is TrackIR a big culprit for a stutter every now and then?
Rise of Flight has a stutter as well.
The Nvidia Experience software, I was told, is gibberish.
-
I have a GTX 680, an i7 @ 4.3, an SSD, a hard drive and loads of Corsair RAM in my gaming PC :old:
I use TrackIR and get a stutter every now and then; might triple buffering sort this out?
Obviously it might be something in the background, but is TrackIR a big culprit for a stutter every now and then?
Rise of Flight has a stutter as well.
The Nvidia Experience software, I was told, is gibberish.
Some of the stuttering may be caused by Windows 7 problems with multithreading. At least BF4 users suffer from a core parking bug which causes stutters in any software that uses multithreading heavily. Core parking is a power saving function that aims to confine work to certain processor cores so that one or more cores can be left idle when full power is not needed. The system is supposed to automatically expand processes onto the idle cores when needed, but it seems the system is either not fast enough or doesn't work correctly. That's why users either switch to Win8 or apply the 'core unpark' patch to Win7, which fixes the stutters.
-
Triple buffering, or adaptive vsync (which is Nvidia proprietary tech), gives more flexibility to the rates.
Basically, however, it goes like this (partially ripped from another forum):
Vsync synchronizes the buffer swap with your monitor's vertical refresh rate. If there are the usual two buffers (double buffering), your GPU is held up until a buffer swap can be made (if your GPU is fast enough, that is). So if your GPU can't finish drawing the buffer by the next refresh, it has to skip that refresh entirely. If you're refreshing at 16ms intervals but a frame needs 17ms to draw, it will miss the sync interval. Because the buffer swap is locked to the refresh rate it can't take place at the 17ms mark - it needs to wait until the next interval, which will be at 32ms. Likewise if a frame needs 33ms it will miss the intervals at 16ms and 32ms, and will need to wait until the 48ms interval.
This is why and how vsync affects framerates. AH2 is supposed to enable triple buffering automatically AFAIK, but I have noticed that in some cases I had to force it on from the graphics settings.
Judging from Changeup's post, though, it seems that some of AH's stuttering does not result from a lack of rendering power; instead AH has trouble with its i/o implementation. If software code is made 'blocking', as naturally happens when you use loops etc., any operation will force the code to hang for its duration. Since i/o is inherently very slow, making i/o operations blocking can cause very visible stutters, since the software basically stops working for the duration that some texture is loaded. This also plays together with vsync, so that even a small texture block needing i/o is enough to make the rendering skip a frame or two, leading to visual problems.
There is a solution to this, called 'non-blocking code', which essentially performs the operations in parallel so that loading textures does not hang the main code at all - but this type of code is much more demanding to write, and any software title that has roots years back usually doesn't make use of it.
The problem with old software projects can be that the main codebase is so massive that reworking the legacy stuff in there is just too resource-heavy a task to do on a tight budget and time frame.
I'm not saying this is the case with AH as I have no knowledge of it, but it can be the case.
Great posting, Ripley. But could it be something else with Changeup's setup, since most of the players with similar systems haven't reported this as a problem? I haven't had 20 fps on a regular basis since I switched from dial-up AOL 7 years ago to Verizon DSL; my fps went from around 20 to 35. And since I built my own computer, starting with the 8400 CPU and a 9800 GTX+ video card, I haven't had anything less than mid 50's to 90, even in heavy CV ack.
I don't know about Changeup's setup that much, to be honest, but I think if it was AH code it would have hit way more systems than just his.
semp
-
Great posting, Ripley. But could it be something else with Changeup's setup, since most of the players with similar systems haven't reported this as a problem? I haven't had 20 fps on a regular basis since I switched from dial-up AOL 7 years ago to Verizon DSL; my fps went from around 20 to 35. And since I built my own computer, starting with the 8400 CPU and a 9800 GTX+ video card, I haven't had anything less than mid 50's to 90, even in heavy CV ack.
I don't know about Changeup's setup that much, to be honest, but I think if it was AH code it would have hit way more systems than just his.
semp
After writing the post it dawned on me that the reason for his behaviour may be that he's running his graphics settings too high and running out of video RAM. When that happens, parts of video RAM are flushed to main RAM and back constantly, and the result is of course a major slowdown.
D3D9 also natively keeps a shadow copy of video RAM in main RAM (to my knowledge, correct me if I'm wrong), so if the end result is system swapping, the SSD may save the day in the end. Only HTC really knows (or can know) the answer to these things.
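For reference, that shadow-copy behaviour is essentially what the D3DPOOL_MANAGED resource pool does in D3D9: the runtime keeps a system-RAM backing copy and pages textures into video RAM on demand. A minimal fragment showing the distinction (my own illustration, assuming an already-created device - not AH's code):

#include <windows.h>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// 'device' is assumed to be an already-created IDirect3DDevice9*.
IDirect3DTexture9* CreateManagedTexture(IDirect3DDevice9* device) {
    IDirect3DTexture9* texture = nullptr;
    // D3DPOOL_MANAGED: the D3D9 runtime keeps a system-RAM copy and uploads the texture
    // to video RAM as needed, so it survives a lost device but roughly doubles the
    // memory footprint. D3DPOOL_DEFAULT would place it in video RAM only, with no
    // automatic backing copy.
    device->CreateTexture(1024, 1024, 0, 0, D3DFMT_A8R8G8B8,
                          D3DPOOL_MANAGED, &texture, nullptr);
    return texture;
}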
But to me the most logical conclusion is that if the SSD affects his framerates, his stutters were i/o related, so for one reason or another his AH client was fetching or storing data from disk during those situations, and the i/o access time improvement of the SSD removed the visual stuttering. Of course there can be any number of variables, such as whether he reinstalled Windows when switching to the SSD, etc ;)
-
If you force triple buffering on, in the video card driver, you will also cause stutters as Aces High already triple buffers.
An SSD cannot directly impact frame rates unless there is something else sharing the interrupt with the hard drive, such as the video card. While I have never seen that happen, it is possible. I cannot imagine a motherboard manufacturer allowing that combination.
The other possibility is the video card is no longer running textures from its local RAM, and instead is using system RAM which has been swapped out. Very possible as most games (including Aces High) do preload textures before they are actually needed.
-
Changeup runs an i7-950 on a Sabertooth X58 board with a 660Ti. Changeup had 21 fps in clouds and on busy fields until he bought a Samsung Pro 256 Gig SATA 6 SSD. Now he flies around in busy clouds at 52.
Changeup happy :aok
That's odd. I have an i5-2400 and a GTX 550 Ti, and I have 60 fps in clouds and busy fields. However, while I have all other settings maxed out including even antialiasing forced on in my card's settings, I have the reflections slider (or whatever the name is -- don't have access to the game at the moment) at "none". Maybe that's the difference.
-
That's odd. I have an i5-2400 and a GTX 550 Ti, and I have 60 fps in clouds and busy fields. However, while I have all other settings maxed out including even antialiasing forced on in my card's settings, I have the reflections slider (or whatever the name is -- don't have access to the game at the moment) at "none". Maybe that's the difference.
It's 100% the difference. Mine's one notch from full. Ratchet yours up and see what happens, lol
-
It's 100% the difference. Mine's one notch from full. Ratchet yours up and see what happens, lol
Man, Changeup, the problem with you having low fps was that you were really pushing your video card above its limits. Glad the SSD worked for you, but a new video card will probably help too. I would also get around 30 fps if I pushed the EM to full, but it's something you normally won't have time to see, so I just lowered it to nothing. Other than EM I can play with everything on and shadows at 4096, at full fps, using a lower card than you (SLI EVGA 465s).
semp
-
Will gaming mode on my Samsung stop lag?
-
Well, I just bought a new Asus 2560x1440 monitor, so I can't imagine spending the kind of dough a comparable-size G-Sync monitor will end up costing after they come out, nor am I going to buy a 24" monitor in order to kit it out, no matter how promising the new technology looks. However, had I known about G-Sync before I bought this new screen I'd probably have held off, since I already had a 1920x1080 HP. I'm not sure buying that Asus was the wisest thing to do with that money. In some games the extra resolution equates to lighting problems; games indoors often seem too dark.
-
The Nvidia Experience software, I was told, is gibberish.
It mostly is, geared towards the gamer who wants everything done for them so they don't have to fiddle with graphics settings in each game they play. But I would still recommend that those with a GTX 600 series or better install GeForce Experience just for the ShadowPlay video capture utility. It's probably (though still in beta) the best game video capture program out; it's got a very small resource footprint and the video quality is far superior to that of Fraps and MSI's video capture utility in Afterburner.
ack-ack
-
I haven't flown a sortie in Aces High in around two years, so I don't know.
QFE :lol :cheers:
-
Will gaming mode on my Samsung stop lag?
It may help a little, but no, it won't stop the lag. Google the input lag for your Samsung; if you're lucky you may find reviews. Good screens have an input lag lower than 16ms; the best ones approach 1-2ms.
-
I think you posted a list :old:
I saw an article yesterday about a projector which had a good response :old:
-
I think you posted a list :old:
I saw an article yesterday about a projector which had a good response :old:
If your TV happens to be on the list then yes. Be aware that just a 1-digit difference in the model number may mean a 1000% jump in rated specs.
-
If you force triple buffering on, in the video card driver, you will also cause stutters as Aces High already triple buffers.
An SSD cannot directly impact frame rates unless there is something else sharing the interrupt with the hard drive, such as the video card. While I have never seen that happen, it is possible. I cannot imagine a motherboard manufacturer allowing that combination.
The other possibility is the video card is no longer running textures from its local RAM, and instead is using system RAM which has been swapped out. Very possible as most games (including Aces High) do preload textures before they are actually needed.
Question Skuzzy:
Does AH continue to triple buffer frames even if the AA slider is set to None?
Is triple buffering in AH independent of AA?
Would like to know the answer to this for graphics setting purposes.
As far as the SSD thing goes, I remember in another thread a long time back that you had stated that using an SSD w/ a page file on it w/ AH was not good due to AH making a lot of small writes to the SSD. I had figured then that AH will preload textures (held in either system memory or on the HDD depending upon availability), which will eventually require some writes to the page file, as this is directly written into all Windows OSes for compatibility purposes w/ older software & older hardware configurations where graphics/system memory can still be limited. The issue as I see it is that MS needs to 1.) write some logic into the OS that can turn off some of this legacy coding upon recognizing the capabilities of newer hardware, or 2.) give the end user the OPTION of making the setting changes within the OS to turn off this legacy coding to make full usage of newer hardware once the OS recognizes its capabilities.
Sometimes the OS just won't recognize large amounts of dedicated graphics memory onboard a vid card & will set up an amount of system memory or HDD space for this purpose regardless, & because it does, the OS will most likely use what it set up.
If this is wrong then please correct me, but it is apparent when I pull up the system info on my GTX TITAN...........a dedicated graphics card w/ 6144 MB of dedicated onboard graphics memory, but there is also a 7907 MB shared system memory allocation for graphics, for a total of 14051 MB of graphics memory available......that 7907 MB of system memory has to be set aside for the CPU/GPU to buffer drawn frames in & fetch them from, right? If I remember correctly the Windows OS will still do this as if there were an integrated graphics chip on the mobo......again for compatibility purposes.
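For what it's worth, those two numbers are just the adapter description fields DXGI reports; here's a small sketch (my own, purely illustrative) that prints the same dedicated/shared figures Windows adds together into the "total available graphics memory":

#include <windows.h>
#include <dxgi.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // DedicatedVideoMemory is the on-card VRAM; SharedSystemMemory is the chunk of
        // system RAM the OS is willing to let the GPU borrow.
        std::wprintf(L"%s\n  dedicated video memory: %llu MB\n  shared system memory:   %llu MB\n",
                     desc.Description,
                     (unsigned long long)desc.DedicatedVideoMemory / (1024 * 1024),
                     (unsigned long long)desc.SharedSystemMemory / (1024 * 1024));
        adapter->Release();
    }
    factory->Release();
    return 0;
}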
I am currently using Win 7 HP SP1 OS w/ this vid card.
Just saying............
:salute
-
If this is wrong then please correct me, but it is apparent when I pull up the system info on my GTX TITAN...........a dedicated graphics card w/ 6144 MB of dedicated onboard graphics memory, but there is also a 7907 MB shared system memory allocation for graphics, for a total of 14051 MB of graphics memory available......that 7907 MB of system memory has to be set aside for the CPU/GPU to buffer drawn frames in & fetch them from, right? If I remember correctly the Windows OS will still do this as if there were an integrated graphics chip on the mobo......again for compatibility purposes.
I am currently using Win 7 HP SP1 OS w/ this vid card.
Just saying............
:salute
I would very much like to know if this GPU will run EVERYTHING in AH maxed (no sliders backed off, everything running wide open) with 60fps no matter what is going on around you.
-
Hi Changeup,
If you have the supporting hardware (CPU, mem, monitor, etc.) that will not bottleneck it, then the answer is yes, & it will do it easily. But this goes for most of the offerings out there today. Due to some personal choices I make, I don't usually use this card w/ totally maxed graphics settings (only reflections aren't fully maxed, as I can't visually tell any difference in-game...all other in-game graphics settings are fully maxed out), but I have tested & run this card in AH flat out & steady at the max FPS of my monitor (59 FPS), at fully maxed-out graphics settings, regardless of what is going on in-game.
I will also say this again as I have done a LOT of testing w/ this GTX Titan along w/ a GTX 670 FTW vid card on my box running AHII:
Nvidia's GPU Boost--whether it is version 1 or 2--will not fully boost a Nvidia Kepler GPU on my box to max boost clocks when using the in-game AH graphics settings (AA slider), due to the in-game SETTING level only, regardless of slider position. The game runs excellently, but not w/ the Kepler GPU running at max boost clocks....most of the time it ran BELOW the BASE boost clock speeds even w/ all in-game graphics settings at max & GPU temps/power ranges WELL below the throttling thresholds. This can cause the game's FPS to fluctuate when running on a Nvidia Kepler GPU.
When I turn off the in-game AA setting (slider set to None) & set the AF, AA, Trans AA & TF settings to max at the Nvidia driver level (after setting the driver to override any application settings...the graphics settings that are set within the game itself do not affect GPU Boost, only the AA slider does), GPU Boost will run the GPU at the max boost clock settings allowed in BIOS while playing the game, as long as the GPU temps/power ranges are below the throttling thresholds, & the game runs hiccup-free w/ all other in-game settings maxed out. This is w/o any OC'ing of the GPU or CPU on my box. The only time I have noted this Titan clocking back (w/ a subsequent FPS drop) was due to the GPU power range occasionally hitting the 100% power threshold set in the BIOS (I have verified this using Precision X)...GPU temps have never exceeded 67*C under full load conditions. This happens due to the game loading when at a large field that is under attack w/ a LOT of stuff burning & a LOT of friendlies/cons present. Backing 1-2 bumps off full reflections updates or upping the GPU power threshold to 110% usually fixes this issue. I will usually just back off the reflections setting for my tastes, as I visually can't see any difference in the reflections when set to less than full. The only thing that I haven't tried yet is disabling Intel SpeedStep on my i7 3820 to lock the CPU at its max speed setting, to see if this has any effect on the game running (the CPU will throttle its speeds due to load/power levels....just like the Kepler GPU does). That potential CPU throttling fix may help as well, but I haven't tested it yet so I can't say.
Note on GPU Boost: if I lower the Nvidia driver AF setting below 16x, or the AA below 16QxCSAA, or the TAA below 8xSS, or the TF below High Quality, GPU Boost will not boost the GPU to max boost clocks, but will boost to a % below max boost due to the SETTING used, regardless of the GPU temps/power ranges being below the throttling thresholds; it may then go to max boost clocks due to in-game graphical loading.
In short, if you want to ensure full GPU boost clocks are being used w/ a Nvidia Kepler card using GPU Boost in AHII, I recommend setting the Nvidia driver settings mentioned above to max, then tailoring the rest using the in-game graphics settings to your liking.
I can easily OC this Titan and/or the CPU & NEVER hit any graphics wall playing AHII in its current configuration, but since it very seldom reaches this limitation in stock trim I can accept leaving it in stock trim. Now when HTC releases their updated software w/ the updated graphics engine all this may change....................
I can't say 60 FPS as I set my monitor to use 59 Hz to avoid the rounding issues & so my FPS stays at 59 FPS running at 2560x1440x32 res. This way I don't need any of the gadget software solutions.
Is it worth the $1,000.00 to get this kind of performance.....the answer is no. Since this card came out there are other models available in both camps that can do this for a lot less money. Heck my GTX 670 FTW could do it on my box but I wanted a Titan so I got 1...........& I don't regret buying it to this day.
:salute
-
My new rig (i7 4770k, 780 Classified) runs everything maxed out, but not at 60 fps. When it's real busy I'll dip to the low 40's, but I find that very playable.
-
How-to video in case you have the right Asus monitor and order the add-on kit:
http://www.youtube.com/watch?v=FMKpJr0KTC8
-
Does AH continue to triple buffer frames even if the AA slider is set to None?
Is triple buffering in AH independent of AA?
Skuzzy, it would be fantastic if you could post pictures of the Nvidia and AMD control panels with the settings you would recommend for Aces High. I understand it would be system-specific to some degree, but that triple buffer information would be in a single place for all to benefit from.
Just sayin'. :D
-
Skuzzy, it would be fantastic if you could post pictures of the Nvidia and AMD control panels with the settings you would recommend for Aces High. I understand it would be system-specific to some degree, but that triple buffer information would be in a single place for all to benefit from.
Just sayin'. :D
+1
-
The game is programmed to make use of the default settings in either the AMD/ATI or NVidia control panels.
Here, we usually do not install the control panels at all and just let the drivers work at whatever defaults they are set to.
-
The game is programmed to make use of the default settings in either the AMD/ATI or NVidia control panels.
Here, we usually do not install the control panels at all and just let the drivers work at whatever defaults they are set to.
Ahhhh.........that explains a lot. Also a smart business move as, aside from most who post here, most folks just load & go, & most driver installs do default to looking for the application settings.....................
Thanks, Skuzzy!
I did test this out after I posted the 1st post in this thread:
Since I use the Nvidia drivers to run on (drivers set to override any application settings), I have the in-game AA slider set to None (so the game doesn't try to set an AA level on the vid card).
I did have the Nvidia drivers set to triple buffer & to use std V-synch (not Adaptive V-synch).
After posting, the wanna-be engineer in me :D went into the NV CP & made the following setting changes to test AHII:
I turned triple buffering off & set the NV driver's Vertical synch to "Use the 3D application setting", since what these settings were telling me is that the NV driver would "look" for the in-game coding instructions for V-synch and/or triple buffering (depending on whether the game coding ties the two together, as V-synch can be done w/o triple buffering....I was pretty positive that AH would tie the two together) & then execute them if they exist.
Then I ran the game, picked a runway, selected my trusty Mk IX (or any prop-equipped plane in the game) & started the engine to watch the prop spin up thru the gun sight reticle, as this is a good place to see/test whether v-synching/triple buffering is being used in the graphical rendering, regardless of the vid card's GPU or CPU speed/power. I have tested & known this for quite some time. If v-synch is not being used, the prop graphics will most certainly tear as they pass thru the reticle while the prop is "spinning up" from dead stop to engine idle speed, & they won't tear if v-synch is being used.
The results of this test clearly show that, with the NV driver set as I had stated, AHII is natively written to instruct the vid card to use v-synching/triple buffering, as the prop graphics were smooth & the prop blades never distorted...proving that the graphics frame sequencing was synched....w/ the in-game AA slider set to None (the game is not sending AA instructions to the vid card drivers AND the NV drivers are set to override any application settings for AA...ignoring them).
I then flew around watching the graphics looking for any tearing or abnormal sequencing............saw none.
From Skuzzy's quote: If you force triple buffering on, in the video card driver, you will also cause stutters as Aces High already triple buffers.
I can see where this is a true statement now....along w/ enabling v-synch from the vid card's driver level as well w/ AHII.
I now run w/ triple buffer turned off & vertical synch set to "Use the 3D application settings" in NV CP.
Thanks again Skuzzy.
:salute
-
Back on topic...............................
I probably won't look at this new tech for quite some time since I've bought this DoubleSight DS-279W 27" monitor already, & w/ the hardware I have now stuttering isn't a concern right now.
About 2 weeks back I got an i7 4820K CPU for my rig which, w/ a mobo BIOS update, gave me native PCI-E 3.x specs on my Asus RIVG for my TITAN to use. This CPU is natively about 100 MHz faster in stock trim than the i7 3820 but runs 9*C cooler at full load (the 3820 ran at 43*C, the 4820K runs at 34*C using the same Arctic Freezer i30 Extreme HSF). This alone ensured that the CPU ran at full speed (using Turbo Mode in BIOS the CPU topped out at 3907 MHz).
Now my setup is deemed complete.
I'll be interested in how this tech fares so let us know how it goes if anyone gets this.
:salute
-
I recently found a really nice compilation of all things G-Sync over at Blur Busters: http://forums.blurbusters.com/index.php
The upgrade module for the one Asus monitor seems like quite the risky proposition, but if you are the type that de-lids your CPUs, it's probably not that big of a deal. Unfortunately my 120Hz monitor didn't turn out to be the right one. The good news is that the upcoming Asus 27-inch looks to be the most impressive pure-gaming monitor in a long, long time.