Semiconductors work by shifting the electron energy levels in a material when a voltage is applied. Those same energy levels are also affected by temperature. When the temperature rises above the design value, thermal effects compete with the applied voltage for control of the device. When that happens you lose bits, clock cycles, etc., and the device stops operating as intended.
Modern semiconductors operate at lower voltages to reduce the heat generated during operation. For the electronically inclined: power dissipation follows P = I²R, and since V = IR, dropping the voltage drops the current and therefore the heat generated per clock cycle. The higher the clock rate, and the more instructions completed per cycle, the more heat is generated. Both AMD and Intel chips run the same x86 instruction set; the real difference is the microarchitecture. The Athlon 64 does more work per clock cycle (and so generates more heat per cycle), while the Pentium 4's design traded work-per-cycle for higher raw clock speeds. That is why a 2.2 GHz AMD 64 can perform like a 3.2 GHz P4.
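As a back-of-the-envelope illustration, the relation usually quoted for CMOS logic is P ≈ C·V²·f (dynamic switching power), which makes the same point: power falls with the square of the voltage and linearly with clock rate. A minimal sketch, where the capacitance value is made up purely for illustration:

```python
# Rough sketch of CMOS dynamic power: P ~ C * V^2 * f.
# The effective capacitance below is a made-up illustrative number.

def dynamic_power(c_farads, volts, hertz):
    """Approximate dynamic (switching) power of a CMOS chip in watts."""
    return c_farads * volts ** 2 * hertz

C = 1e-9  # hypothetical effective switched capacitance (1 nF)
print(dynamic_power(C, 1.5, 2.2e9))  # ~4.95 W at 1.5 V, 2.2 GHz
print(dynamic_power(C, 1.3, 2.2e9))  # ~3.72 W: same clock, lower voltage
```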
Lowering the CPU temperature will in no way guarantee you can overclock (run the CPU at speeds above its design frequency). However, since you typically need to raise the CPU voltage to get clean signals at higher clock rates, you need extra cooling capacity: the processor generates more heat from both the higher voltage and the higher clock speed.
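To get a feel for the numbers, you can estimate the extra heat from an overclock with the same P ∝ V²·f scaling. The stock and overclocked figures below are hypothetical:

```python
def relative_power(v_stock, f_stock, v_oc, f_oc):
    """Ratio of overclocked to stock dynamic power, assuming P ~ V^2 * f."""
    return (v_oc / v_stock) ** 2 * (f_oc / f_stock)

# Hypothetical numbers: 1.40 V @ 2.0 GHz stock, 1.55 V @ 2.4 GHz overclocked.
ratio = relative_power(1.40, 2.0e9, 1.55, 2.4e9)
print(f"Overclocked chip dissipates about {ratio:.2f}x its stock heat")
# -> roughly 1.47x, so the cooling needs ~50% more headroom
```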
Temperature also produces what is called 'dark current' in photoelectronic semiconductors (like the CCD in a digital camera). Sensitive laboratory instruments typically use cryogenic cooling to minimize this effect. Digital cameras compensate for it in software, and more expensive cameras have less temperature-sensitive CCDs (better electronics).
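For silicon sensors, a common rule of thumb is that dark current roughly doubles for every 6 °C or so of temperature rise (the exact figure varies by device). A quick sketch, with an arbitrary reference current:

```python
def dark_current(temp_c, ref_temp_c=25.0, ref_current=1.0, doubling_deg=6.0):
    """Rule-of-thumb dark current scaling: roughly doubles every ~6 deg C
    in silicon. ref_current is in arbitrary units (e.g. e-/pixel/second)."""
    return ref_current * 2 ** ((temp_c - ref_temp_c) / doubling_deg)

print(dark_current(25))   # 1.0    (reference)
print(dark_current(37))   # ~4.0   (two doublings, 12 deg C warmer)
print(dark_current(-50))  # ~0.0002 (why labs cool sensors cryogenically)
```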
To get back on topic: AMD, Intel, ATI, and Nvidia processors all have internal temperature sensors. Monitoring software (which on some boards may involve reflashing the BIOS) can get you that information.
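For example, on Linux the kernel exposes these sensors under /sys/class/thermal; which zones show up (and what they measure) depends on your board and drivers, so treat this as a sketch:

```python
from pathlib import Path

# Each thermal_zone* directory holds a 'type' label and a 'temp' reading.
for zone in sorted(Path("/sys/class/thermal").glob("thermal_zone*")):
    kind = (zone / "type").read_text().strip()
    millideg = int((zone / "temp").read_text())  # reported in millidegrees C
    print(f"{kind}: {millideg / 1000:.1f} C")
```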
Regards,
Malta