Aces High Bulletin Board
General Forums => Hardware and Software => Topic started by: Roscoroo on April 15, 2004, 02:44:45 AM
-
http://gear.ign.com/articles/506/506325p1.html
April 14, 2004 - Today NVIDIA announces the GeForce 6800 and 6800 Ultra. These two new cards, based on the architecture previously referred to as "NV40," mark the arrival of the next generation of graphics cards. I've taken a GeForce 6800 Ultra through our full set of benchmarks, but before we get to those performance numbers I want to go over the new features of this brand-new GPU.
For starters, the GeForce 6800 Ultra has 6 vertex units and a whopping 16 render pipelines, with one texture unit on each. It can render 16 pixels per clock, giving it a theoretical max fillrate of 6400 Mpixels/sec for both single- and multi-texturing. This isn't a conditional 16 pixels per clock either; it will always be at least 16 pixels/clock, unlike NV35, which hit 8 pixels/clock only on some occasions.
In comparison, the Radeon 9800 Pro/XT can render 8 pixels/clock and the GeForce FX 5900 can do 4, although its multi-textured performance makes the card look like the same 4 pipeline / 2 texture unit architecture of the GeForce 4 Ti series.
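As a quick sanity check on that fillrate figure, here is a back-of-the-envelope Python calc; the ~400 MHz core clock is an assumption based on the announced 6800 Ultra launch specs, not something stated in the article above:

# Rough fillrate sanity check; the 400 MHz core clock is assumed from launch specs.
core_clock_hz = 400e6       # assumed ~400 MHz core clock
pixels_per_clock = 16       # 16 pipelines, one pixel each per clock
fillrate_mpixels = core_clock_hz * pixels_per_clock / 1e6
print(fillrate_mpixels)     # ~6400 Mpixels/sec, matching the quoted figure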
(Yes, for $500 US you too can own one of these puppies.)
:aok
-
Sorry, but I draw the line at the $300 US mark on video cards. I don't mind staying slightly behind the tech curve on video, since I can nearly rebuild my entire system for $500 and I don't want to drop my entire upgrade budget into one purchase.
-
Everything on hold till full 64-bit and PCI Express have had all the bugs ironed out.
New system about spring 2006, I reckon. ;)
6 GHz 64-bit
2 GB RAM
GF 80000 512 MB / ATI 12000
:aok
-
It also requires dual molex power plugs on individual rails and a 480W power supply, per NVidia's documentation.
-
LOL, there goes the electricity bill......
-
I would think the 9600XT would get the job done for most games, and that's all I intend to play for the next year or two.
-
Just why in the hell are power "requirements" going up so much for video cards? I can almost understand an increase in power supply needs, but with some of these needing different power connectors and such, I honestly don't understand why they have to build the parts that way.
-
The new GPUs from both NVidia and ATI have over 200 million transistors. That is roughly 4 times the number in the Pentium 4.
ATI's new R420 will be much more power efficient due to its use of a low-k dielectric process. NVidia never could get that process to work.
-
They did test it with a 400 watt PS according to the article.
The good part is the price always goes down on the current cards as the new ones come out.
The plain Jane model of the 6800 is only $300 US, but they haven't tested that card yet.
-
We'll get to see how ATI's Radeon X800 Pro performs on April 26th. Hopefully it's up to the task, as NV40 is pretty incredible.
This new trend of graphics cards that consume more power than an entire computer system did 5 years ago sure isn't a good thing though.
(Skuzzy, NV40 is currently being fabbed by IBM.)
-
Yes, I am aware of IBM being the FAB house. Rumors are saying that the NV40 is SOI, but I have not been able to confirm that.
The amount of power they need and the cooling would seem to indicate it is not SOI based.
NV40 is a much better product than NV3x ever was. NVidia will be glad to drop those products due to all the hardware/yield problems with them.
Performance-wise, it should be close. I think NVidia will win in some areas and ATI in others, as far as raw performance goes. It really depends on the RAM speeds ATI goes with.
With what is known, ATI will maintain its edge in AA quality over NVidia, and in lower power consumption (less heat).
Other than that, it is a grab bag right now.
-
Do either of you guys (bloom or Skuzzy) have any idea why we are seeing such a drastic increase in power consumption for video cards? Being the total ID10T that I am about how the hardware works, I don't see the need for so much power being sent to a video card. As it sounds now, the next time I update my system with a new video card (which is generally every two years, and that falls this year) I will also have to get a flipping 500W power supply to go with it, in the hope that I don't screw something up.
-
Transistor counts have skyrocketed, Reschke, hence the need for more power. The reason for the rapid escalation in counts is that the competition between NVidia and ATI has never been higher.
They both want to be king of the hill and are pulling out all the stops to get there.
These new generation cards have doubled the parallel pipes of the previous generation, which means a lot more transistors. The move from integer math to full floating point throughout the pipeline has served to increase the transistor counts significantly as well.
-
So this means prices on mid-range cards that will be fine for a while yet are gonna go down... perfect...
-
Yeah Vorticon.. with the gigantic "leap" I've seen the new card make in reviews, the 9800XT/Pro should get even cheaper very soon :aok
Maybe even I will treat myself to a new Windows box for AH2 and solitaire :)
-
Actually, given the transistor count (222 million) of NV40 (GeForce 6800 Ultra), its power consumption specs are not all that bad. (For reference, that's about 2x the number of transistors in an Athlon 64 and about 3x that of a Northwood Pentium 4.)
Power consumption for ICs depends on a few things: capacitance, clock speed, leakage current losses, and voltage (actually voltage squared, for the dynamic part). Adding more transistors increases both leakage current and capacitance. Capacitance is a real killer, as the higher the capacitance, the lower the maximum clock speed of the chip will be as well.
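To make that concrete, here is a rough back-of-the-envelope sketch of how those factors combine; every number below is an illustrative guess, not an actual NV40 figure:

# CMOS power, roughly: P_dynamic ~ activity * C * V^2 * f, plus static leakage.
# All values are illustrative assumptions, NOT real NV40 specs.
activity    = 0.2       # fraction of the chip switching each clock (assumed)
capacitance = 320e-9    # total switchable capacitance in farads (assumed)
voltage     = 1.4       # core voltage in volts (assumed)
frequency   = 400e6     # core clock in Hz (assumed)
leakage_w   = 10.0      # static leakage power in watts (assumed)

p_dynamic = activity * capacitance * voltage ** 2 * frequency
print(round(p_dynamic + leakage_w, 1))   # roughly 60 W with these guesses

The voltage-squared term is why even a small bump in core voltage is so expensive, and piling on transistors raises both the capacitance and the leakage terms at the same time.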
Skuzzy, I also doubt NV40 is built using SOI. It is simply too large a die, and IMO it probably doesn't run at a high enough clock speed to really warrant it. SOI wafers are extremely expensive, and coupled with such a massive die, that would probably make the chip too expensive to produce. NVidia is also probably not all that concerned with power consumption, as those who can afford to pay $500 for a video card can probably afford to pay $100 for an Antec 480W True Power supply. IBM's SOI 0.13u process would probably yield a 20-25% reduction in power consumption over their standard 0.13u process, hardly worth what is probably close to a 2x increase in production cost for a chip the size of NV40.
(Skuzzy, did you see the news today that ATI will be releasing a chipset for the Athlon 64 in Q3 2004?)
-
Yes, I had heard a few months back about ATI building an AMD chipset.
I agree about the SOI process. I have not heard that IBM has been able to produce a wafer with reasonable yields using SOI.
The thing that might hurt NVidia is that ATI's next-gen part uses a low-k dielectric, which they have already used in the 9600XT; it has proven to be very stable at high clock speeds while reducing power consumption by a considerable amount.
I have to wonder why NVidia did not use a cooling solution that vents the heat outside the case. It is a two-slot solution, so why they did not vent it outside makes no sense to me.
Possibly they were more concerned about the noise levels that an external vent is prone to raising. I would be concerned about that amount of heat being put back into the case.
Overall though it appears to be a much better design than the NV3x line was.
It was interesting to note that the samples sent out for reviews had the RAM overclocked. I just wonder what the real clocks will be for this card when it goes retail.
-
Now gamers are gonna be blamed for global warming.
-
As long as nvidia lacks image quality, it's a no go for me.
-
Originally posted by Siaf__csf
As long as nvidia lacks image quality, it's a no go for me.
Watch the benchmarks and image comparisons. Image quality has improved quite a bit and now seems on par with ATI.
That's with beta drivers. It can only get better.
-
I'm not talking about 3D quality (which still lags behind ATI, according to a comparison I saw today).
I mean 2D quality, which has been severely impacted in all recent GeForce series. The image quality is fuzzy as hell, even on professional Quadro cards. I can't understand why.
The main reason for switching to ATI was the crisp 2D; 3D performance came as a secondary reason.