Aces High Bulletin Board

General Forums => Hardware and Software => Topic started by: DCCBOSS on November 06, 2009, 08:17:34 AM

Title: ANTIALIASING
Post by: DCCBOSS on November 06, 2009, 08:17:34 AM
What does it change in the system's performance? And is a higher setting preferred, or does a higher setting have a limitation tied to other settings in your system?  :headscratch:
Title: Re: ANTIALIASING
Post by: Denholm on November 06, 2009, 09:10:29 AM
Antialiasing reduces the "blocky" appearance of edges and lines. It will cost you some overall video performance, though depending on your video card and processor the decrease may not be noticeable. Typically I would use a 4x or higher antialiasing setting.
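If it helps to picture it: with 4x antialiasing the card takes several samples inside each pixel and shades the pixel by the fraction the polygon covers, instead of making a hard on/off decision. A toy sketch of the idea (made-up code, not what any card actually runs):

    #include <cstdio>

    // Toy example: shade pixels along a diagonal edge y = 0.7*x.
    // Anything below the edge is "covered" (part of a polygon).
    // Without AA: 1 sample per pixel -> hard 0 or 1 (jaggies).
    // With 4x AA: a 2x2 grid of samples per pixel -> fractional coverage.
    double coverage(int px, int py, int grid) {
        int hits = 0;
        for (int sy = 0; sy < grid; ++sy)
            for (int sx = 0; sx < grid; ++sx) {
                // Sample position inside the pixel, offset to cell centers.
                double x = px + (sx + 0.5) / grid;
                double y = py + (sy + 0.5) / grid;
                if (y < 0.7 * x) ++hits;     // below the edge = covered
            }
        return (double)hits / (grid * grid); // 0.0 .. 1.0 shade
    }

    int main() {
        const char* ramp = " .:-=+*#%@";     // darker as coverage rises
        for (int y = 7; y >= 0; --y) {       // print top row first
            for (int x = 0; x < 10; ++x)
                printf("%c", ramp[(int)(coverage(x, y, 2) * 9.0)]);
            printf("\n");
        }
    }

Change grid to 1 and you get the plain stair-step back; those in-between shades are what make the edge look smooth.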
Title: Re: ANTIALIASING
Post by: DCCBOSS on November 06, 2009, 09:29:11 AM
Well, I'm using an Nvidia 8800GTS. What do you recommend, settings-wise?
Title: Re: ANTIALIASING
Post by: BaldEagl on November 06, 2009, 10:23:47 AM
Quote from: DCCBOSS on November 06, 2009, 09:29:11 AM
Well, I'm using an Nvidia 8800GTS. What do you recommend, settings-wise?

I've got a 512MB 8800GTS and I'm using either 8x or 16x, but I don't remember for sure.
Title: Re: ANTIALIASING
Post by: Anodizer on November 06, 2009, 11:24:18 AM
(http://www.snowbound.com/images/antialias.gif)
Helps alleviate the stair-stepping effect..  Your 8800 should be able to handle a setting of 4x at least..  Be sure to set this up through the game and leave your video card's AA setting on application-controlled (with regard to AH)..
Title: Re: ANTIALIASING
Post by: Masherbrum on November 06, 2009, 01:18:47 PM
My 512MB 8800GTS KO is at 16xQ AA for Left 4 Dead, Joint Operations: Typhoon Rising, and other games.
Title: Re: ANTIALIASING
Post by: Vulcan on November 06, 2009, 03:27:05 PM
Antialiasing is a funny beast: 50% useful, 50% marketing. The higher you go in resolution, the less you need it. As the output of the graphics card approaches the dot-pitch limit of the monitor, it has no benefit.
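To put a rough number on that (illustrative figures for a 22" 16:10 panel, not any particular monitor):

    #include <cmath>
    #include <cstdio>

    int main() {
        // Pixel pitch of a 22" 16:10 LCD running 1680x1050.
        const double diag_in = 22.0, w = 1680.0, h = 1050.0;
        const double diag_px = std::sqrt(w * w + h * h);   // ~1981 pixels
        const double pitch_mm = diag_in * 25.4 / diag_px;  // mm per pixel
        printf("pixel pitch: %.3f mm\n", pitch_mm);        // ~0.282 mm
        // Once the pixels get down near the monitor's dot pitch, detail
        // finer than that physically can't be displayed.
    }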
Title: Re: ANTIALIASING
Post by: Fulmar on November 06, 2009, 04:08:49 PM
Quote from: Masherbrum on November 06, 2009, 01:18:47 PM
My 512MB 8800GTS KO is at 16xQ AA for Left 4 Dead, Joint Operations: Typhoon Rising, and other games.
Resolution plays a large factor in AA settings as well.  As you go up from, say, 1024x768 (786,432 pixels) to 1680x1050 (1,764,000 pixels), the load AA puts on the GPU climbs steeply.  Just between those two resolutions, that's over double the number of pixels it has to anti-alias.
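The arithmetic, if you want to compare other resolutions:

    #include <cstdio>

    int main() {
        // Pixels the GPU has to fill (and anti-alias) at each resolution.
        const long lo = 1024L * 768;   //   786,432
        const long hi = 1680L * 1050;  // 1,764,000
        printf("1024x768  = %ld pixels\n", lo);
        printf("1680x1050 = %ld pixels\n", hi);
        printf("ratio     = %.2fx\n", (double)hi / lo);  // ~2.24x the pixels
        // ...and every AA sample multiplies that work again.
    }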
Title: Re: ANTIALIASING
Post by: eagl on November 06, 2009, 08:34:21 PM
Quote from: Vulcan on November 06, 2009, 03:27:05 PM
Antialiasing is a funny beast: 50% useful, 50% marketing. The higher you go in resolution, the less you need it. As the output of the graphics card approaches the dot-pitch limit of the monitor, it has no benefit.

This is 50% truth and 50% untrue scoffing :)

Even at higher resolutions, antialiasing will help get rid of the irritating "crawling" you see at transitions from one polygon or texture to another as you turn or move closer to or farther from the transition.  Any line, whether it's a fence, a rooftop, a tree, a runway, or the top of a hill, will crawl a bit as the angle and distance change.  Antialiasing can dramatically reduce this crawling effect, and the improvement makes everything seem more realistic.

To find out what antialiasing setting looks best, first bump the resolution up to the native resolution of your monitor (if you're on an LCD).  Then start increasing the antialiasing until framerates seem to bog down or stutter a bit, and then back down one or two antialiasing settings to ensure framerates stay high.
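Put another way, you're just picking the highest AA level that still holds your target framerate. As a sketch with made-up numbers (you'd plug in your own measurements):

    #include <cstdio>

    int main() {
        // Hypothetical framerates measured in-game at each AA level,
        // all at the monitor's native resolution.
        const int levels[] = { 0, 2, 4, 8, 16 };
        const int fps[]    = { 60, 60, 60, 52, 38 };  // made-up measurements
        int best = 0;
        for (int i = 0; i < 5; ++i)
            if (fps[i] >= 59) best = levels[i];       // still smooth? keep it
        printf("use %dx AA\n", best);                 // -> 4x in this example
    }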

In AH, I have found that pretty much any card from the 6800GT on up should be able to run 4x antialiasing at almost any resolution.  A few AH versions ago, I had no problem staying maxed at 60 fps at 1280x1024 with 4x antialiasing on my nvidia 6800GT.  An 8800GTS ought to do the same or better on the current AH version, but there is no way to know for sure until you try it out.

First things first though, make sure you're running at the native resolution of your LCD monitor, or at least 1280x1024 if you're on an old-school tube monitor.  There is no benefit to lowering resolution just so you can increase antialiasing.  Max out the resolution first and then add AA until framerates start to dip.

Title: Re: ANTIALIASING
Post by: DCCBOSS on November 07, 2009, 11:55:05 AM
How do I find the native resolution of my monitor?
Title: Re: ANTIALIASING
Post by: MrRiplEy[H] on November 07, 2009, 12:25:31 PM
Quote from: DCCBOSS on November 07, 2009, 11:55:05 AM
How do I find the native resolution of my monitor?

Consult the manual or the manufacturer's page, or try setting the resolution as high as it goes (this is ONLY necessary if you have a flat LCD monitor).
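If you'd rather ask Windows than dig out the manual: the mode list the driver reports tops out at the panel's native resolution on most LCDs, so a little Win32 program can find it. A sketch (assumes Windows; this is just stock EnumDisplaySettings, nothing monitor-specific):

    #include <windows.h>
    #include <cstdio>

    int main() {
        // Walk every display mode the driver reports; on most LCDs the
        // largest one is the panel's native resolution.
        DEVMODE dm = {0};
        dm.dmSize = sizeof(dm);
        DWORD bestW = 0, bestH = 0;
        for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i) {
            if (dm.dmPelsWidth * dm.dmPelsHeight > bestW * bestH) {
                bestW = dm.dmPelsWidth;
                bestH = dm.dmPelsHeight;
            }
        }
        printf("highest mode (likely native): %lux%lu\n", bestW, bestH);
        return 0;
    }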

Title: Re: ANTIALIASING
Post by: DCCBOSS on November 07, 2009, 12:42:29 PM
Yes, I have an Acer 22" flat screen.
Title: Re: ANTIALIASING
Post by: AirFlyer on November 07, 2009, 12:49:30 PM
My guess would be that the native resolution is 1920x1200, 1680x1050, or 1440x900. Granted, a model number would allow us to confirm that.
Title: Re: ANTIALIASING
Post by: 1701E on November 07, 2009, 12:54:38 PM
Assuming it's the same Acer 22" I'm using, the native is 1680x1050.  The model is the X223W; just look on the back of the monitor and it will say.
Title: Re: ANTIALIASING
Post by: Pudgie on November 07, 2009, 03:03:07 PM
Hi All,

What I have noticed is this: if your monitor's native refresh rate (RR) at native resolution is over 60 Hz, AH II will perform better if you use the video card driver's AA settings instead of AH II's AA settings. I'm not referring to a graphics improvement; graphics are the same either way, and I could not see any difference. I'm referring to game performance: the game feels crisper, control inputs register much more quickly, and views pan much faster. This was very noticeable.

AH II's AA settings will lock the RR at 60 Hz (60 FPS due to vsync) regardless of the monitor's native RR, whether it is above that or not. Since AH II by default instructs the video card's driver to vsync FPS to RR (unless you turn it off in AH II's video settings), the driver will run at whatever settings you have set up in its control panel, and the in-game FPS will follow the monitor's native RR and hold there as long as you've got enough video card/CPU performance to maintain it (no different from using AH II's settings).

If you are running an Nvidia video card (I am using a GTX 260), just set the Vsync setting to "Use the 3D application setting" and AH II will instruct the driver to vsync the video card to the monitor's RR regardless. Set the Antialiasing Mode setting to "Override the application setting", set the AA level in the control panel, then go into AH II's Video Settings and set the AA slider to None. You will now be able to run AA and stay vsynced at the faster RR of your monitor, whatever it is above 60 Hz. If it ain't above 60 Hz, don't bother, 'cause it will be exactly the same.
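For the curious, here is roughly what "Use the 3D application setting" means under the hood: a Direct3D 9 game asks the driver for multisampling and vsync when it creates its device. The sketch below is a generic D3D9 example, not AH's actual code (which isn't public), just the standard way a D3D9 app requests both:

    #include <windows.h>
    #include <d3d9.h>   // link with d3d9.lib

    int main() {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        // Ask whether the card can do 4x multisampling in this mode.
        DWORD quality = 0;
        d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8, TRUE, D3DMULTISAMPLE_4_SAMPLES, &quality);

        D3DPRESENT_PARAMETERS pp = {0};
        pp.Windowed             = TRUE;
        pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;    // required for MSAA
        pp.BackBufferFormat     = D3DFMT_UNKNOWN;
        pp.MultiSampleType      = D3DMULTISAMPLE_4_SAMPLES; // the in-game AA setting
        pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // vsync: wait 1 refresh

        IDirect3DDevice9* dev = 0;
        d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, GetDesktopWindow(),
            D3DCREATE_SOFTWARE_VERTEX_PROCESSING, &pp, &dev);

        if (dev) dev->Release();
        d3d->Release();
        return 0;
    }

When the driver control panel is set to "Override the application setting", it simply ignores the MultiSampleType the game asked for and forces its own, which is why you then set the in-game AA slider to None.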


Some may already know this, so please ignore it if you do.

This method provides the only exception to what Eagl has stated concerning antialiasing at a lower res than native, IF you have a CRT monitor. Most CRTs will run at a higher RR at lower res, so you "could" use AA at a lower res to "clean up" the jaggies AND get the improved vsynced FPS from the higher RR. You would do this to gain vsynced FPS above 60 FPS and still have some graphical clarity. I have done this as well with the same card, as I have a 21" CRT. It does work, and works well.

Otherwise what Eagl has stated is what you'd want to do.

The goal here is not to start an argument or belittle anyone. I am just posting an alternative method that provides a way to utilize AA and get more vsynced FPS. If you have a monitor that can refresh above 60 Hz and you want AA and vsync on, IMHO you should use this to get the best of it all.

 :salute
Title: Re: ANTIALIASING
Post by: Vulcan on November 07, 2009, 09:15:54 PM
Quote from: eagl on November 06, 2009, 08:34:21 PM
This is 50% truth and 50% untrue scoffing :)

A monitor, LCD or CRT, cannot render something smaller than its dot pitch. At a certain point anti-aliasing becomes blurring, and you lose clarity.
Title: Re: ANTIALIASING
Post by: eagl on November 07, 2009, 09:55:37 PM
Quote from: Vulcan on November 07, 2009, 09:15:54 PM
A monitor, LCD or CRT, cannot render something smaller than its dot pitch. At a certain point anti-aliasing becomes blurring, and you lose clarity.

At those high resolutions, the clarity loss is VASTLY outweighed by the elimination of digital artifacts.  I learned graphics programming as a pup, including actually writing the code for a half-dozen different antialiasing techniques, math and all (matrix manipulation, ugh!).  I spent a lot of time on the subject, and the simple fact is that eliminating digital artifacts is critical to achieving realistic imaging and getting that "suspension of disbelief" in gaming.

If you believe that image sharpness is best, then by all means turn off image processing and advanced graphics techniques.  But you're missing out.  If you don't believe me, take a peek at some of the DX10 and DX11 "reviews" that look at the actual technology and algorithms being used.  To a great degree, many of the newest graphics features dramatically reduce sharpness, especially the ones dealing with volumetric fog effects and HDR lighting.  In a static screenshot, the images sometimes even look pretty horrible compared to the same scene without the new techniques.  But put them into dynamic motion, and they look far more realistic.

If you want to see the jaggies, then keep all the sharpness you want.  Heck, some monitors and cards will let you apply additional sharpness processing so you can see even more "detail".  It'll look like crap no matter how high your resolution is.  If you want to eliminate digital artifacts, including moiré effects, crawling edges, and fences with portions that turn invisible when you look at them from certain angles, you're going to have to apply antialiasing and sacrifice a bit of sharpness to make the image look better.  And I say "better" in the sense that at any resolution you can display, an entire fence blurred by 4x SSAA looks far more realistic than a fence that shimmers when you move, has sections that disappear when you stop, and has jaggies on every visible portion.  I'll take realism and reduced digital artifacting over unrealistic "sharpness" any day.  And I did the ugly, ugly math and wrote the code to prove that what I was seeing wasn't just wishful thinking, back when I first came to hate digital artifacts.