Ok, I can follow the formulas easily enough; they're fairly straightforward and a logical progression. Using your formula with a max texture size of 256 and no other settings changed (plus the extra at the end), I'd only need around 75MB, and with max textures at 512 I'd need around 125MB. So assuming I turn my max textures down to 512, theoretically I have enough video RAM to cover the settings I'm using.

What is the end benefit? Or I guess the more appropriate question is: what part of my performance am I hurting now, with settings that exceed my available video RAM? And how do the in-game "performance sliders" affect this? Obviously the game plays, and I get decent frame rates. Not great, but decent. It also stands to reason that when I DO get frame rate hits, it's over an area with more objects to be drawn (over land). So are you saying that if my system is within the bounds prescribed by the formulas given, plus the extra "scratch pad" RAM, I can max out those in-game sliders without hurting performance?

The formulas give me a good idea of how important each component is to the overall performance picture, but obviously there's more to it than just this, or AH would be damn near unplayable on my comp at its current settings. So is it by having textures preloaded into memory that I'm circumventing this limit? I've tried lowering my max textures before, and I haven't noticed any appreciable increase in performance.
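For anyone else following the arithmetic, here's a rough sketch of how the per-texture cost scales with max texture size. The constants (4 bytes per pixel for 32-bit color, a 4/3 factor for the mipmap chain) are my own assumptions, not the exact formula from earlier in the thread:

```python
def texture_bytes(size, bytes_per_pixel=4, mipmaps=True):
    """Rough VRAM cost of one square texture of the given size.

    Assumes 32-bit color by default; the 4/3 multiplier approximates
    the extra memory used by the full mipmap chain.
    """
    base = size * size * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# Halving the max texture size cuts per-texture memory to roughly a quarter,
# which is why dropping from 512 to 256 shaves so much off the total.
mb = lambda b: b / (1024 * 1024)
print(f"512x512: {mb(texture_bytes(512)):.2f} MB each")  # ~1.33 MB
print(f"256x256: {mb(texture_bytes(256)):.2f} MB each")  # ~0.33 MB
```

The fixed difference between my 75MB and 125MB totals would then be the texture pool alone; everything else (frame buffers, geometry, the "scratch pad") stays constant across the two settings.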
Appreciate the info; stuff like this is hard to find.