Author Topic: WOW GF 6800 card  (Read 842 times)

Offline bloom25

  • Silver Member
  • ****
  • Posts: 1675
WOW GF 6800 card
« Reply #15 on: April 16, 2004, 11:58:22 PM »
Actually, given the transistor count (222 million) of NV40 (GeForce 6800 Ultra), its power consumption specs are not all that bad.  (For reference, that's about 2x the number of transistors in an Athlon 64 and about 3x that of a Northwood Pentium 4.)

Power consumption for ICs depends on a few things: switched capacitance, clock speed, leakage current losses, and voltage (dynamic power actually scales with voltage squared).  Adding more transistors increases both leakage current and capacitance.  Capacitance is a real killer, as the higher the capacitance, the lower the maximum clock speed of the chip will be as well.
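For what it's worth, the standard first-order CMOS relation captures all of this: P = a*C*V^2*f for dynamic switching power, plus I_leak*V for static leakage.  A minimal Python sketch, using normalized made-up constants (nothing here is NV40's actual spec, only the ratios matter):

Code:
# First-order CMOS power model: dynamic switching power plus static leakage.
# All values are normalized, illustrative assumptions.

def dynamic_power(c, v, f, activity=0.15):
    # P_dynamic = a * C * V^2 * f  -- the V^2 term is why voltage matters so much
    return activity * c * v ** 2 * f

def total_power(c, v, f, i_leak, activity=0.15):
    # Static leakage burns I_leak * V even when nothing is switching.
    return dynamic_power(c, v, f, activity) + i_leak * v

base    = total_power(c=1.0, v=1.0, f=1.0, i_leak=0.2)
doubled = total_power(c=2.0, v=1.0, f=1.0, i_leak=0.4)  # 2x transistors: 2x C, 2x leakage
lower_v = total_power(c=1.0, v=0.9, f=1.0, i_leak=0.2)  # same chip, 10% lower voltage
print(doubled / base)  # 2.0 -- power doubles with transistor count in this model
print(lower_v / base)  # ~0.86 -- the V^2 term makes even small voltage cuts pay off

The point is that a 222M-transistor part is fighting that equation on every front: more transistors means more C and more leakage at the same time.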

Skuzzy, I also doubt NV40 is built using SOI.  It is simply too large a die, and IMO it probably doesn't run at a high enough clock speed to really warrant it.  SOI wafers are extremely expensive, and coupled with a nearly 40x40mm die (massive), that would probably make the chip too expensive to produce.  Nvidia is also probably not all that concerned with power consumption, as those who can afford to pay $500 for a video card can probably afford to pay $100 for an Antec 480W True Power supply.  IBM's SOI .13u process would probably yield a 20-25% reduction in power consumption over their standard .13u process, hardly worth what is probably close to a 2x increase in production costs for a chip the size of NV40.
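To put the die-size point in numbers, here's a back-of-envelope gross-dies-per-wafer estimate using a common textbook approximation.  The 300mm wafer size is my assumption, and scribe lines plus defects would cut the real number further:

Code:
import math

# Common approximation for gross dies per wafer:
#   DPW = (pi * d^2) / (4 * A)  -  (pi * d) / sqrt(2 * A)
# where d = wafer diameter (mm) and A = die area (mm^2).
def gross_dies_per_wafer(die_w_mm, die_h_mm, wafer_d_mm=300):
    die_area = die_w_mm * die_h_mm
    wafer_area = math.pi * (wafer_d_mm / 2) ** 2
    edge_loss = math.pi * wafer_d_mm / math.sqrt(2 * die_area)
    return int(wafer_area / die_area - edge_loss)

print(gross_dies_per_wafer(40, 40))  # ~27 gross dies before any yield loss

With only a couple dozen candidate dies per wafer, a ~2x premium on the wafer itself (SOI) translates almost directly into a much more expensive chip.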

(Skuzzy, did you see the news today that ATI will be releasing a chipset for the Athlon 64 in Q3 2004?)

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
WOW GF 6800 card
« Reply #16 on: April 17, 2004, 07:39:29 AM »
Yes, I had heard a few months back about ATI building an AMD chipset.

I agree about the SOI process.  I have not heard that IBM has been able to produce a wafer with reasonable yields using SOI.

The thing that might hurt NVidia is that ATI's next-gen part uses a low-k dielectric, which they have already used in the 9600XT.  It has proven to be very stable at high clock speeds while reducing power consumption by a considerable amount.

I have to wonder why NVidia did not use a cooling solution that vents the heat outside the case.  It is already a two-slot solution, so not venting it outside makes no sense to me.
Possibly they were more concerned about the noise levels an external vent is prone to raising.  I would be concerned about that much heat being dumped back into the case.

Overall though it appears to be a much better design than the NV3x line was.

It was interesting to note that the samples sent out for reviews had the RAM overclocked.  I just wonder what the real clocks will be for this card when it goes retail.
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline Nilsen

  • Plutonium Member
  • *******
  • Posts: 18108
WOW GF 6800 card
« Reply #17 on: April 18, 2004, 08:03:43 AM »
Now gamers are gonna be blamed for global warming.

Offline Siaf__csf

  • Gold Member
  • *****
  • Posts: 2213
WOW GF 6800 card
« Reply #18 on: April 21, 2004, 02:18:49 PM »
As long as nvidia lacks image quality, it's a no-go for me.

Offline Connection

  • Copper Member
  • **
  • Posts: 141
WOW GF 6800 card
« Reply #19 on: April 22, 2004, 12:10:18 PM »
Quote
Originally posted by Siaf__csf
As long as nvidia lacks image quality, it's a no-go for me.


Watch the benchmarks and image comparisons. Image quality has improved quite a bit and now seems on par with ATI.

That's with beta drivers. It can only get better.

Offline Siaf__csf

  • Gold Member
  • *****
  • Posts: 2213
WOW GF 6800 card
« Reply #20 on: April 23, 2004, 01:48:13 PM »
I'm not talking about 3D quality (which still lags behind ATI's, according to a comparison I saw today).

I mean 2D quality, which has been severely impacted in all recent GeForce series. The image quality is fuzzy as hell, even on professional Quadro cards. I can't understand why.

The main reason for switching to ATI was the crisp 2D; 3D performance came as a secondary reason.