An A64 3300 ought to run most games faster than almost any P4. If you already have a very high-end P4, though, you may as well stick with it until AMD and Intel figure out their next gen and settle on things like the memory standard and socket.
From what I've read, if you're upgrading now then an AMD socket 939 rig is still a good bet, because their next-gen socket and chipset are still quite a ways out. But if you're happy now, then wait at least 6-12 months to see how the dual core cpu war progresses. If AMD can improve the yields on their X2 cpus and get the price down closer to Intel's dual core cpus, then AMD is going to set the standard. Right now, though, the price difference heavily favors Intel.

The real volume is in home and business rigs, and as soon as a business IT guy can get a dual core cpu for a premium of $10-$15 per cpu - like he could when hyperthreading made it to the sub-$300 P4s - you'll see them take off fast. A dual core cpu would help me at work a LOT, because the military loads a crapload of useless-but-necessary junk onto every computer: just sitting there, the computer uses nearly 256 meg of ram and 60% of the cpu in a P4 box. In general use, that means the computer simply fails to respond smoothly and occasionally hangs for up to 20 seconds as it gonks on whatever it thinks is more important than the work you're doing. Dual core would let the computer do its thinking on one cpu while you do your work on the other.
If it's a $150-per-box difference, they simply won't make it into the office. At $15 per box, they will. Right now Intel is closest, with a low-end dual core cpu priced under $300, while AMD's admittedly better low-end X2 will run well over $400. So for general acceptance of dual core, Intel is pushing it the same way AMD pushed the 64-bit transition.
For AH specifically, we'll see 2 things. First, denials and forecasts of doom and gloom aside, games will simply be forced to find ways to use multiple cpu cores. It was only 7-8 years ago that certain game developers scoffed at the idea of offloading 3D graphics to a GPU on a card in an expansion slot (just think of the bus latencies!), but look at the state of the art now. The games that jump on multiple cores now will reap the benefits; those that don't will be stuck trying to advance using half or less of a cpu, with core clock speeds no greater than we have today.
Although Skuzzy talks down dual core cpus, I find it heartening that the original CK took advantage of certain graphics speedups found only in S3-chipped video cards, so maybe HT will be willing to step toward the leading edge again with this dual core business. Maybe it's apples and oranges, but I really think single core cpu speeds are going to stagnate badly, so it's time to get with the program and make it work.
Second, we'll continue to see a discrepancy between framerate advances and image quality. The type of box that gets 40 fps today (high end? Godbox? Value?) will still get 40 fps in 2-3 years, but the image quality and screen resolution will dramatically improve. That's because a virtual world game engine has a finite set of tasks that must be done in its processing cycle - network transactions, physics calculations, graphics, etc. - and any improvement in a computer subsystem only speeds up one portion of that cycle at a time. I don't know what percentage of the standard game cycle AH uses for graphics vs. everything else, but it seems like 80% or so of that cycle is purely cpu dependent. In practice, that means buying into any single technology will probably never boost your AH framerates more than 20% or so, no matter how advanced that technology is - even an infinitely fast video card can't touch the other 80% of the cycle.
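That 80/20 reasoning is basically Amdahl's law. A quick back-of-the-envelope sketch (the 80% cpu-bound split is my guess from above, not a measured number):

```python
def max_speedup(accelerated_fraction, factor):
    """Amdahl's law: overall speedup when only part of the
    per-frame work is accelerated by the given factor."""
    return 1.0 / ((1.0 - accelerated_fraction) + accelerated_fraction / factor)

# Guess: 80% of the frame cycle is pure cpu work, so a new
# video card can only touch the remaining 20%.
cpu_bound = 0.80
gpu_part = 1.0 - cpu_bound

# Even an infinitely fast video card caps out around +25% fps:
print(max_speedup(gpu_part, 1e9))   # ~1.25

# A realistic card that's 2x faster buys you about +11%:
print(max_speedup(gpu_part, 2.0))   # ~1.11
```

Run the numbers with any split you like; the point stands as long as no single upgrade touches more than a slice of the cycle.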
I could be waaaay off base, but I've had 10 years of playing HT's games and I don't remember ever seeing an upgrade cycle that "revolutionized" how the game played. Wait 2 years and spend $2000, wait 6 months and spend $500, and it's all the same - you'll see only an incremental improvement from any purchase unless you're upgrading everything from a horribly obsolete system.
The wildcard may be that physics processor... If anyone can actually get useful work out of one and solve the latency issues, it could help a lot in a game like AH. If drawing the screen can be done relatively independently of calculating what actually gets sent to the GPU for display, then why can't the cpu just act as the timekeeper, feeding on-time events to and from the physics processor and the vid card? Again, lots of people are saying it won't help, just like they said dual cpus won't work and a GPU wouldn't work, but I think those who make it work will reap huge benefits, while the game developers who don't jump onboard will be left behind in a completely stagnant market of single core cpu speeds that never increase.
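The "cpu as timekeeper" idea amounts to a pipelined frame loop: while frame n is being drawn, frame n+1's physics is already running on another core (or a physics card, in this speculation). A toy sketch - `simulate` and `render` are stand-ins I made up for work the real engine would do:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(frame):
    # stand-in for physics work (would live on the second core
    # or a physics processor in the scenario above)
    return {"frame": frame}

def render(state):
    # stand-in for feeding the finished state to the vid card
    return f"drew frame {state['frame']}"

def run(frames):
    """Pipelined loop: physics for frame n+1 overlaps drawing of frame n."""
    drawn = []
    with ThreadPoolExecutor(max_workers=1) as physics_core:
        pending = physics_core.submit(simulate, 0)
        for frame in range(1, frames + 1):
            state = pending.result()                        # wait for frame n's physics
            pending = physics_core.submit(simulate, frame)  # kick off frame n+1
            drawn.append(render(state))                     # draw frame n meanwhile
    return drawn

print(run(3))   # ['drew frame 0', 'drew frame 1', 'drew frame 2']
```

The timekeeper's job is just that handoff: collect the finished physics state on time, hand it to the renderer, and keep the other unit busy with the next step.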