Actually, given the transistor count (222 million) of NV40 (GeForce 6800 Ultra), its power consumption specs are not all that bad. (For reference, that's about 2x the number of transistors in an Athlon 64 and roughly 4x that of a Northwood Pentium 4.)
Power consumption for ICs depends on a few things: switched capacitance, clockspeed, leakage current losses, and voltage (dynamic power actually scales with voltage squared, roughly P ≈ C × V² × f, plus leakage). Adding more transistors increases both leakage current and capacitance. Capacitance is a real killer too, since the higher the capacitance, the lower the maximum clockspeed of the chip.
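To put rough numbers on that, here's a quick back-of-envelope sketch of the standard dynamic-plus-leakage power model (P ≈ activity × C × V² × f + V × I_leak). Every figure in it is a made-up placeholder, not an actual NV40 spec:

def chip_power(c_switched, v_core, f_hz, activity, i_leak):
    # Rough CMOS power model: dynamic switching power plus static leakage.
    # c_switched: total switched capacitance (farads)
    # v_core:     core voltage (volts) -- note the squared term
    # f_hz:       clockspeed (hertz)
    # activity:   fraction of capacitance switching each cycle (0..1)
    # i_leak:     total leakage current (amps)
    dynamic = activity * c_switched * v_core**2 * f_hz
    static = v_core * i_leak
    return dynamic + static

# Hypothetical numbers, just to show the voltage-squared effect:
print(chip_power(200e-9, 1.4, 400e6, 0.15, 10))  # ~37.5 W
print(chip_power(200e-9, 1.5, 400e6, 0.15, 10))  # ~42 W at the same clock, higher V

Note how a 7% bump in voltage adds several watts without touching the clock or the transistor count; that's the V² term at work.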
Skuzzy, I also doubt NV40 is built using SOI. It's simply too large a die, and IMO it probably doesn't run at a high enough clockspeed to really warrant it. SOI wafers are extremely expensive, and coupled with a massive die (around 300 mm²; the ~40x40mm figure is closer to the package than the die itself), that would probably make the chip too expensive to produce. Nvidia is also probably not all that concerned with power consumption, as anyone who can afford to pay $500 for a video card can probably afford $100 for an Antec TruePower 480W supply. IBM's SOI .13u process would probably yield a 20-25% reduction in power consumption over their standard .13u process; that's hardly worth what would likely be close to a 2x increase in production cost for a chip the size of NV40.
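For a sense of why the wafer premium stings so badly on a big die, here's a crude dies-per-wafer and cost-per-good-die estimate, using the usual gross-die approximation with an edge-loss term. The wafer size, wafer costs, die dimensions, and yield below are all guesses for illustration, not actual IBM pricing or NV40 data:

import math

def dies_per_wafer(wafer_diam_mm, die_w_mm, die_h_mm):
    # Gross dies: usable wafer area over die area, minus an edge-loss term.
    die_area = die_w_mm * die_h_mm
    wafer_area = math.pi * (wafer_diam_mm / 2) ** 2
    edge_loss = math.pi * wafer_diam_mm / math.sqrt(2 * die_area)
    return int(wafer_area / die_area - edge_loss)

def cost_per_good_die(wafer_cost, gross_dies, yield_frac):
    return wafer_cost / (gross_dies * yield_frac)

gross = dies_per_wafer(300, 18, 16)          # ~290 mm^2 die on a 300mm wafer (assumed)
print(gross)                                  # ~206 gross dies
print(cost_per_good_die(5000, gross, 0.5))    # bulk wafer guess:   ~$49/die
print(cost_per_good_die(10000, gross, 0.5))   # 2x SOI wafer guess: ~$97/die

Doubling the wafer cost falls straight through to the per-die cost, and on a die this size there aren't many dies per wafer to spread it over.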
(Skuzzy, did you see the news today that ATI will be releasing a chipset for the Athlon 64 in Q3 2004?)