Originally posted by Skuzzy
The quality of the video card has nothing to do with the resources needed to run anti-aliasing. Anti-aliasing uses a significant amount of video RAM.
If a texture only takes 1MB of video RAM, and then you set the AA level to 8, that same texture will use 8MB of video RAM. That is just one 1024x1024 texture.
Oh, that's ludicrous. Put a GeForce 7600GT next to a GeForce 8800GTX and tell me there isn't going to be a difference in anti-aliasing performance (or overall performance, for that matter). I understand the multiples of video RAM consumed by FSAA and transparency AA; I agree with you there. But as I'm confident you're aware, a better video card HAS the resources (more memory, higher memory bandwidth, a higher GPU clock, etc.) to perform well while running AA, anisotropic filtering, higher resolutions, physics, and so on. So of course a better quality video card has EVERYTHING to do with the resources needed to run AA. A rough sketch of the memory arithmetic is below.
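To put some numbers on that: the multiplication really applies to the render targets the GPU draws into. With N-sample MSAA each pixel stores N samples, so the color and depth buffers grow roughly N-fold; ordinary textures aren't duplicated. Here's a minimal back-of-the-envelope sketch, assuming 32-bit color, 32-bit depth/stencil, and a naive uncompressed layout (real drivers compress, so treat these as upper bounds; none of these figures come from Skuzzy's post):

```python
# Rough upper-bound estimate of MSAA render-target memory.
# Assumptions (mine, not from the post): RGBA8 color, D24S8 depth,
# every sample stored in full with no compression.

BYTES_PER_COLOR_SAMPLE = 4   # RGBA8
BYTES_PER_DEPTH_SAMPLE = 4   # D24S8

def msaa_framebuffer_bytes(width, height, samples):
    """Uncompressed color + depth storage for an N-sample render target."""
    per_pixel = (BYTES_PER_COLOR_SAMPLE + BYTES_PER_DEPTH_SAMPLE) * samples
    return width * height * per_pixel

if __name__ == "__main__":
    for samples in (1, 2, 4, 8):
        size = msaa_framebuffer_bytes(1600, 1200, samples)
        print(f"{samples}x MSAA at 1600x1200: {size / 2**20:.0f} MB")
```

At 1600x1200 that works out to roughly 15 MB with no AA, climbing to about 117 MB at 8x, before counting the resolved front/back buffers. A card with more VRAM and a wider memory bus (a 768 MB 8800GTX versus a 256 MB 7600GT, say) simply has more headroom for those buffers, which is exactly the point about hardware resources.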
I don't want to make this a pissing contest. You're the boss here. But what you're saying is a bit, shall we say, misleading.
And why do you seem so anti-anti-aliasing? As one of the architects of this game, I'd have thought you would champion better graphics and performance to enhance the gaming experience. Am I wrong about that?
By the way, for anyone wanting to see the differences between anti-aliasing settings, here's a handy little tool I found that lets you view the same image at different AA levels.
Anti-Aliasing presentation

Good Day.