When you bring up the frame rate info (Ctrl+I), it shows your video card's total memory (e.g. 64 or 128 MB, whatever the card has) and the amount of memory in use (e.g. 25.5 MB). Is it my computer making the determination of how much video memory to use, or is it the server? And why wouldn't it use more of the memory? My card is a 128 MB card, and when I check, it only seems to use around 25 to 27 MB out of the 128 MB of video memory. My frame rate is fine 99% of the time; I just wondered why it's not using more of the available memory. Does it simply not need it?
I have 512 MB of system RAM too. If the answer is too involved, that's cool; there's probably already a thread somewhere on the subject, maybe yes, maybe no. But thanks in advance for any replies.
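My guess is that the counter is just adding up whatever textures the game has loaded on my end, something like this rough sketch in C (the texture names and sizes here are made-up example numbers, not the game's actual assets):

```c
/* Rough sketch of how a "video memory used" counter could work:
 * the client sums the sizes of the textures it currently has loaded.
 * The texture list below is hypothetical, just to show the math. */
#include <stdio.h>

struct texture {
    const char *name;
    int width, height;     /* pixels */
    int bytes_per_pixel;   /* e.g. 2 for 16-bit, 4 for 32-bit */
};

int main(void)
{
    /* Made-up textures a game might keep resident for one scene. */
    struct texture loaded[] = {
        { "terrain", 1024, 1024, 2 },
        { "sky",      512,  512, 2 },
        { "plane",    512,  512, 4 },
        { "cockpit", 1024, 1024, 4 },
        { "hud",      256,  256, 4 },
    };
    int n = sizeof(loaded) / sizeof(loaded[0]);
    long total = 0;

    /* footprint of each texture = width * height * bytes per pixel */
    for (int i = 0; i < n; i++)
        total += (long)loaded[i].width * loaded[i].height
               * loaded[i].bytes_per_pixel;

    printf("video memory used: %.1f MB\n", total / (1024.0 * 1024.0));
    return 0;
}
```

If that's roughly how it works, the number only climbs when the scene actually needs more textures, which would explain why it sits around 25 to 27 MB no matter how big the card is.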