Antialiasing is a funny beast: 50% useful, 50% marketing. The higher you go in resolution, the less you need it. As the output of the graphics card approaches the dot-pitch limits of the monitor, it has no benefit.
This is 50% true and 50% unfounded scoffing.
Even at higher resolutions, antialiasing helps get rid of the irritating "crawling" you see on transitions from one polygon or texture to another as you turn, move closer to, or move farther away from the transition. Any line, whether it's a fence, a rooftop, a tree, a runway, or the top of a hill, will crawl a bit as the angle and distance change. Antialiasing can dramatically reduce this crawling effect, and the improvement makes everything seem more realistic.
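If it helps to see the mechanism, here's a rough sketch (my own illustration, not anything from AH) of why a hard edge crawls at one sample per pixel but calms down at four. The numbers are made up; the point is that sub-pixel motion flips whole pixels on and off unless each pixel averages several sample points:

```python
# Rasterize a vertical edge at 1 sample per pixel vs. 4 samples,
# then nudge the edge by a sub-pixel amount, as happens every frame
# when you move in the sim. Illustration only; made-up numbers.

def coverage(edge_x, px, samples):
    """Fraction of this pixel's sample points that fall inside the edge."""
    inside = 0
    for i in range(samples):
        sx = px + (i + 0.5) / samples  # spread samples evenly across the pixel
        if sx < edge_x:
            inside += 1
    return inside / samples

for shift in (0.0, 0.3):               # sub-pixel motion between two frames
    edge = 4.4 + shift                 # edge position, in pixels
    one = [coverage(edge, x, 1) for x in range(8)]
    four = [coverage(edge, x, 4) for x in range(8)]
    print(f"shift={shift}: 1x {one}  4x {four}")
```

With one sample, pixel 4 pops from empty to full between the two frames (that pop, repeated along a whole fence line or rooftop, is the crawl); with four samples it shades from 0.5 to 0.75, which is why 4x AA makes edges stop shimmering as you move.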
To find the antialiasing setting that looks best, first bump the resolution up to the native resolution of your monitor (if you're on an LCD). Then start increasing the antialiasing until framerates bog down or stutter a bit, and back off one or two antialiasing settings to keep framerates high.
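In spirit, that procedure is just a loop. Here's a sketch of it in Python; apply_settings and measure_avg_fps are hypothetical stand-ins for the game's video options and an FPS counter, and the numbers are examples, not recommendations:

```python
# A sketch of the tuning procedure described above. apply_settings and
# measure_avg_fps are hypothetical stand-ins for the in-game video
# options and an FPS counter; the thresholds are examples only.

AA_LEVELS = [0, 2, 4, 8]           # typical multisample AA steps
NATIVE_RES = (1280, 1024)          # substitute YOUR monitor's native resolution
MIN_SMOOTH_FPS = 55                # below this, call it "bogging down"

def tune_aa(apply_settings, measure_avg_fps):
    best = 0
    for aa in AA_LEVELS:
        apply_settings(resolution=NATIVE_RES, antialiasing=aa)
        if measure_avg_fps() < MIN_SMOOTH_FPS:
            break                  # this level stutters; stop raising AA
        best = aa                  # this level still ran smoothly
    # Settle on the last smooth level, i.e. one step back from the
    # level that bogged down.
    apply_settings(resolution=NATIVE_RES, antialiasing=best)
    return best
```

In practice you do this by hand in the video settings menu, of course, but the order of operations is the same: resolution is fixed at native, and only AA moves.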
In AH, I have found that pretty much any card from the 6800GT on up should be able to run 4x antialiasing at almost any resolution. A few AH versions ago, I had no problem staying maxed at 60 fps at 1280x1024 with 4x antialiasing on my NVIDIA 6800GT. An 8800GTS ought to do the same or better on the current AH version, but there's no way to know for sure until you try it.
First things first, though: make sure you're running at the native resolution of your LCD monitor, or at least 1280x1024 if you're on an old-school tube monitor. There is no benefit to lowering resolution just so you can increase antialiasing. Max out the resolution first, and only then add AA until framerates start to dip.