It seems to me that technology is changing faster and faster. I'm currently at the 1.7 GHz and GeForce 3 level and have been there for a year. I have no plans to upgrade; I look at the ever-faster CPUs and the GeForce 4 technology as nice, but what program or game actually requires them?
The hardware guys are way out in front of just about any requirement of the mass computer market. As fewer and fewer people purchase the newest gear, there is a danger that this will start to thin out the competition, and that will kill research into new technology! Competition is what has gotten the end user the finest hardware at a low cost.
My question is: how will the cutting-edge companies keep producing better and better products and keep the market stimulated enough to purchase this new technology?
Why hasn't the programming end kept pace with the hardware guys? That's an easy question to answer: they are forced to program to the lowest common denominator, meaning a slow CPU and a medium-priced video card.
Now that high-speed connections are becoming commonplace and hardware keeps progressing while also getting cheaper, application developers need to start taking advantage of all the new features offered by the hardware side of the equation.
Developers need to start writing for 1 GHz and GeForce 3. The low-end users will upgrade if application developers will just push them!
I feel developers are too conservative and are holding back the growth of the industry. I'll also blame the end user. End users should open their eyes and see that one upgrade now, to at least 1 GHz and a GeForce 3, won't require another upgrade for a long time!
This is all just my opinion :) I'm still amazed at how much has changed since 1997, when I discovered the personal computer!