Aces High Bulletin Board

General Forums => Hardware and Software => Topic started by: Chalenge on February 21, 2010, 07:06:01 PM

Title: Nvidia GF100
Post by: Chalenge on February 21, 2010, 07:06:01 PM
I'm just a little worried that Nvidia will release the GF100 next month while I'm in the PI. The Rocket Sled demo they released makes it look good, but I think Skuzzy said they have issues to iron out yet. I'm betting it will be an ATI-killer...

http://www.youtube.com/watch?v=HjIzoGnkCZs&feature=player_embedded
Title: Re: Nvidia GF100
Post by: Kazaa on February 21, 2010, 08:04:36 PM
I really hope Nvidia bring this card out on time because...

the price of the ATI HD5850s will drop and I'll go buy a second one.

 :D
Title: Re: Nvidia GF100
Post by: Spikes on February 21, 2010, 08:25:04 PM
I really hope Nvidia bring this card out on time because...

the price will drop on the 5850s and I'll go buy a second one.

 :D
+1
Title: Re: Nvidia GF100
Post by: Chalenge on February 21, 2010, 09:03:59 PM
Sell out!  :P They are claiming March 27 for the GTX 480s, but I wouldn't hold my breath on any prices dropping.
Title: Re: Nvidia GF100
Post by: Kazaa on February 21, 2010, 10:23:30 PM
Competitive products, competitive prices.

It took Nvidia 6 months to become competitive again. The HD5850 actually rose in price as the months passed, about £50 here in the UK.
Title: Re: Nvidia GF100
Post by: maddafinga on February 21, 2010, 11:55:24 PM
I really hope Nvidia bring this card out on time because...

the price of the ATI HD5850s will drop and I'll go buy a second one.

 :D

I just ordered one yesterday!  I can't wait.  If they go down, I'll have to get a second one and two more monitors!!  You have any good setup or config tips for me?

Title: Re: Nvidia GF100
Post by: Chalenge on February 21, 2010, 11:59:43 PM
Competitive products, competitive prices.

It took Nvidia 6 months to become competitive again. The HD5850 actually rose in price as the months passed, about £50 here in the UK.

I don't know how ATI has anything stronger than the 295s. DX11 games are few and far between, and I haven't found anything that the 295s don't run at refresh rate.
Title: Re: Nvidia GF100
Post by: Kazaa on February 22, 2010, 10:28:48 AM
When it was time for me to buy a new graphics card I spent a lot of time looking around and reading benchmark reviews. The only Nvidia card that came close to the HD5850's performance was the 280GTX, but the HD5850 was still 10-15FPS faster and around £80-£90 cheaper before the prices got bumped up £40 (just checked, they have dropped back down again).

Two HD5850's would cost me little over £30 more than a single stock GTX295; in addition, I would get two free DX11 games (one comes with each card) that I could sell on for £20-£30 each. Performance wise, game depending ofc, I've seen on average an additional 10-30FPS, in some cases even more. Also note that at the time, these benchmarks were done on beta drivers. The ATI 10.2 drivers are promising to support CrossFireX even better!

So, two HD5850's would give me better performance, 2 free games, DX11, built in multi-display support with eyefinity and in a month's time 3D multi-display support for an extra £30!

Now you know.
Title: Re: Nvidia GF100
Post by: Chalenge on February 22, 2010, 01:58:17 PM
I'm not sure how you can get 10-30 frames more than refresh rate, but at least you can play the few DX11 games that are out. Also you seem to be biased against the 295 as if it's a single-GPU card, yet it is not, which is one reason it costs more. When the 480s are released I would bet the 295s come down in price, but I doubt the ATIs will change much.

You know you can get 3D out of 8800s? I don't think it's worth trying without the 120Hz monitors though.
Title: Re: Nvidia GF100
Post by: Skuzzy on February 22, 2010, 02:21:37 PM
I am not in any hurry to see people rush to buy 'Fermi' based video cards.  With all the manufacturing issues NVidia has had with this part, I think it would be wise to wait and see if they really will be able to hold up in real-world use.

It also appears retail products will not be available until April.

I do not see this as any type of ATI killer.  They take turns.  ATI has been kicking NVidia's butt for a few months now.  If NVidia can build Fermi, reliably, in quantities, then they might take number one back again, for a while.  It will never be as cheap as ATI's current cards though.  It is a very large piece of silicon.  That translates directly to the cost of manufacturing.

If NVidia's costs compete with ATI's, they will lose their shirt on this design cycle.
Title: Re: Nvidia GF100
Post by: Kazaa on February 22, 2010, 06:39:01 PM
Chalenge, I have nothing against the GTX295 or Nvidia, I'm also not an ATI fanboi.

When a game is used for benchmarking, V-Sync will be turned off to record the maximum FPS achieved. The more overhead room you have the better.
Title: Re: Nvidia GF100
Post by: Chalenge on February 22, 2010, 11:53:41 PM
They (Nvidia) throw out a lot of buzz phrases like 'enhanced tessellation' and 'live ray tracing,' which sounds far beyond anything I have heard mentioned about ATI. I know it won't do any good for any of the games I play, but I do have high hopes for the future of games just the same. Of course I have to buy at least one 'Fermi' to test out.  :D
Title: Re: Nvidia GF100
Post by: Skuzzy on February 23, 2010, 06:17:40 AM
The "enhanced tessellation" is part of the DX11 spec and all of ATI's DX11 cards support it.  ATI made all the big noises about that 6 months ago.  The ray trace features are just making use of the advanced floating point processors in all video cards.  

The only difference is ATI is using an open standard for that compute power and NVidia is using a proprietary API.

You really have to read between the lines of a video card company's marketing spiel.

NVidia is actually behind the curve, by quite a bit.  They still have not said if they are going to have anything to compete with ATI's Eyefinity feature.  Right now, Fermi hopes to be faster than ATI's cards, but with fewer features and costing more.

You have to remember that Fermi was supposed to ship in 10/2009, and at that time the feature set would have been state-of-the-art.  However, in 4/2010, that feature set is a bit blasé.
Title: Re: Nvidia GF100
Post by: Kazaa on February 23, 2010, 09:40:36 AM
You would need to buy at least 12 Fermis in SLI to run "Ray Trace" in today's games. "Ray Trace" is 10 years away from being a usable effect.
Title: Re: Nvidia GF100
Post by: Skuzzy on February 23, 2010, 10:15:13 AM
Depends on the number of "rays" and the bounce/reflection depth Kazaa.  1 ray, with a bounce depth of 1 would render pretty quickly.  Several frames a second could easily be done, but it would look like crap.
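To put rough numbers on that trade-off, here is a toy sketch of how total ray count scales with rays per pixel and bounce depth. All figures (resolution, branching factor) are invented for illustration, and it assumes each bounce spawns a fixed number of secondary rays:

```python
def rays_per_frame(width, height, primary, branch, depth):
    """Total rays cast for one frame: `primary` rays per pixel to start,
    then each ray spawns `branch` secondary rays per bounce, `depth` times."""
    rays_at_level = width * height * primary
    total = 0
    for _ in range(depth + 1):
        total += rays_at_level
        rays_at_level *= branch
    return total

# 1 ray per pixel, 1 bounce, no branching, at 1024x768: cheap, but looks like crap
print(rays_per_frame(1024, 768, 1, 1, 1))
# 4 rays per pixel, branching factor 3, depth 4: hundreds of times more work
print(rays_per_frame(1024, 768, 4, 3, 4))
```

The workload grows geometrically with bounce depth, which is the gap between "several frames a second" and anything that actually looks good.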
Title: Re: Nvidia GF100
Post by: Chalenge on February 23, 2010, 02:50:49 PM
From what you are saying, then, the only benefit from this availability will be the price drop on 285s (since 295s have virtually disappeared) and other 200 class GPUs? Do you think these Fermi cards will come in priced far above the current line?
Title: Re: Nvidia GF100
Post by: Skuzzy on February 23, 2010, 03:09:48 PM
There is a lot of speculative pricing running around the Internet at the moment.  I would not trust any of it.

The price will all depend on how much NVidia is willing to give away.  NVidia got caught flat-footed by ATI's move to modular chips, which reduced the costs significantly for ATI/AMD.

NVidia is faced with manufacturing the largest die ever done for a video card during a time when yields at the 40nm production node are not good (TSMC claims they are getting a handle on it though).  The larger the die, the worse the yields are to begin with.  The larger the die, the higher the costs are, as you only get so many chips from one wafer.  The wafer costs are pretty much fixed, so you always want to get as many chips from one wafer as you can.  These are facts, which cannot be ignored nor marketed away.  It is going to cost more for Fermi than anything ATI will produce in the foreseeable future.
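A back-of-the-envelope sketch of that wafer arithmetic, using the standard dies-per-wafer approximation. The wafer cost, die sizes, and yield figures below are invented for illustration, not actual TSMC or NVidia numbers:

```python
import math

WAFER_COST = 5000.0     # dollars per 300mm wafer (assumed figure)
WAFER_DIAMETER = 300.0  # mm

def dies_per_wafer(die_area_mm2):
    """Standard approximation: wafer area over die area, minus edge losses."""
    d = WAFER_DIAMETER
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, yield_fraction):
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction
    return WAFER_COST / good_dies

# A big die at poor yield vs. a smaller die at decent yield:
print(f"~530mm2 die, 20% yield: ${cost_per_good_die(530, 0.20):.0f} per good chip")
print(f"~330mm2 die, 60% yield: ${cost_per_good_die(330, 0.60):.0f} per good chip")
```

Under these made-up numbers the big die costs several times as much per working chip, which is the "cannot be ignored nor marketed away" part.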

NVidia is faced with two basic choices.  1)  Price themselves much higher than ATI/AMD and hope like heck the performance is that much better (not looking all that hot at the moment) or 2) Try to stay price competitive and give up making any money on Fermi (hang on to marketshare) and get the next generation design done as soon as possible (basically a 5000 series oops, as that is exactly what happened there).

ATI has made a couple of very good decisions which puts them in a position to simply blow NVidia away.  If NVidia tries to compete on price, ATI can ruin their day very quickly, as it is costing ATI/AMD much less to produce their cards than it is costing NVidia, at the moment.

On the other hand, if NVidia holds their prices high, ATI can simply ride the higher net profit curve and be in a better position to turn designs around faster while still making money.

If ATI plays its cards right, this could be a banner year for them and a really bad one for NVidia.  Fermi is either going to be the boat anchor NVidia will be sorry they ever started, or it is going to be something they can ride out until the next generation.  Being caught with their shorts around their ankles, by ATI, has never sat well with NVidia.
Title: Re: Nvidia GF100
Post by: Chalenge on February 23, 2010, 03:34:27 PM
When you say '5000 series oops' are you talking Geforce or Quadro? I absolutely love the FX 5800... is there something I don't know about it?
Title: Re: Nvidia GF100
Post by: Skuzzy on February 23, 2010, 03:51:55 PM
The FX 5000 series was a terrible series.  NVidia could not disassociate itself fast enough away from those parts.  It was a panic release to catch back up with ATI as ATI caught NVidia napping with the 9800 cards.

They ran too hot, and the interconnects in the silicon were prone to high leakage, which caused premature deaths for many, many of those cards.  It was a bad design and NVidia knew it.
Title: Re: Nvidia GF100
Post by: Chalenge on February 23, 2010, 04:12:16 PM
You must be talking several years ago for another line and not about this gem?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814133253
Title: Re: Nvidia GF100
Post by: Skuzzy on February 23, 2010, 04:15:22 PM
Any 5000 series cards made today are not using the original parts they released.  The NV30 chips have all gone the way of the dodo.
Title: Re: Nvidia GF100
Post by: Chalenge on February 23, 2010, 04:30:07 PM
I was going to say... I have not found any card that can do what this does for AutoCAD, and while I would never try to use it with AH or any other game inside of a 64-bit OS, I don't think there is another card on the planet that can touch it!

I was really hoping the GF100 would do for games what the FX 5800 did for my CAD system!
Title: Re: Nvidia GF100
Post by: Pudgie on February 28, 2010, 05:36:17 PM
That is exactly why any new build that I do from now on will have an Intel chipset on the mobo... so that I can go with either vid card platform.

Right now I want to go with an ATI vid card.

Performance/price ratio is very hard to beat.

But what's worse is that I have a NForce mobo & 260GTX vid card setup that ROCKS so I don't have a good enough excuse to jump out & build new.

 :D
Title: Re: Nvidia GF100
Post by: Skuzzy on March 01, 2010, 11:25:20 AM
Interesting tidbits are starting to leak out about the NVidia GFX480.

42A on the 12V power rail (600W minimum) will be needed for the single GPU version of the card. It has to have a 6 pin and 8 pin power connector from the power supply.  Curious to see what kind of cooling solution this card will need.  Dissipating 500W of power will be no small feat.

It will not have "DisplayPort" support.  It will have the mini-HDMI port and twin DVI ports.

1.5GB of GDDR5 RAM.  Forget using this card in a 32 bit operating system.  That amount of video RAM puts usable memory at 2.5GB for a 32bit OS.  Would be tight with Windows XP, but no way with Vista/7.
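The arithmetic behind both of those points, sketched out. The 504W figure is just volts times amps; the memory figure assumes the full 1.5GB VRAM aperture gets mapped into the 4GB 32-bit address space, which is how the ~2.5GB number falls out:

```python
# Power on the 12V rail: just volts times amps.
volts, amps = 12.0, 42.0
print(f"12V rail draw: {volts * amps:.0f} W")  # 504 W

# A 32-bit OS can address 4GB total; the video aperture (VRAM mapping, plus
# other MMIO) is carved out of that, shrinking the system RAM the OS can see.
address_space_gb, vram_gb = 4.0, 1.5
print(f"usable memory on a 32-bit OS: ~{address_space_gb - vram_gb:.1f} GB")  # ~2.5 GB
```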
Title: Re: Nvidia GF100
Post by: Chalenge on March 01, 2010, 01:37:15 PM
So it's a 64-bit OS card only then? I got no problem with that!  :D

Power and cooling isn't a problem either, or at least I don't think it will be. I have the Coolermaster ATCS 840 with two Seasonic X750 Gold PSUs powering 295s... and really... do I even need a GFX480? No, but I will get one anyway.

Is there an advantage to 'DisplayPort' versus 'HDMI?'
Title: Re: Nvidia GF100
Post by: Skuzzy on March 01, 2010, 02:02:12 PM
In a nutshell: HDMI is based on legacy CRT raster-scan architecture, while DisplayPort is designed for modern flat panel displays and computer based chipsets.

DisplayPort is set to replace VGA, DVI, and LVDS connections, on video cards, as the dominant computer video interface.  DisplayPort is a networking type of bus architecture where you can add monitors to the single connection to the video card without inhibiting the performance of the video system.

DisplayPort can work over copper or fiber and has a bandwidth of 10.8Gb/s, making it substantially faster than HDMI.  It can carry audio and USB connectivity as well.

It is just a better all-round computer video interface.
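For a sense of what 10.8Gb/s buys you, here is a quick check of raw pixel bandwidth for a couple of common modes. This ignores blanking intervals and protocol overhead, so real usable limits are somewhat lower:

```python
def gbps_needed(width, height, bits_per_pixel, hz):
    """Raw (uncompressed) pixel bandwidth for a given display mode, in Gb/s."""
    return width * height * bits_per_pixel * hz / 1e9

print(f"2560x1600 @ 60Hz, 24bpp:  {gbps_needed(2560, 1600, 24, 60):.1f} Gb/s")
print(f"1920x1080 @ 120Hz, 24bpp: {gbps_needed(1920, 1080, 24, 120):.1f} Gb/s")
# both fit comfortably inside 10.8 Gb/s
```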
Title: Re: Nvidia GF100
Post by: Denholm on March 01, 2010, 04:10:46 PM
What happens when the gaming industry hits the ceiling of graphical, 3D, fully immersive gameplay? Will NVIDIA and ATI continue making new and "more powerful" technology, or will they refine the technology of that day to make it more efficient?

Not trying to hijack this topic, I just find it interesting that we continue making more powerful technology although there's nothing available to tax these resources.
Title: Re: Nvidia GF100
Post by: Skuzzy on March 01, 2010, 04:15:07 PM
<snip>Not trying to hijack this topic, I just find it interesting that we continue making more powerful technology although there's nothing available to tax these resources.

Oh, I don't know.  There is still plenty of poorly written software out there that requires a brute force approach to work.  Every year seems to bring more of it to the market.

Give Microsoft a few more years and they will require this type of product just to run the desktop.  Everyone will ooh and ahh, and ignore the fact there is more bloat in it than there are M&M's in New Jersey.
Title: Re: Nvidia GF100
Post by: Denholm on March 01, 2010, 04:18:28 PM
That I can agree with! :lol
Title: Re: Nvidia GF100
Post by: Chalenge on March 01, 2010, 07:25:08 PM
Okay so now the question I have is if you have a video card that does have a DisplayPort output can you avoid the TripleHead2Go device and still get the same function?
Title: Re: Nvidia GF100
Post by: maddafinga on March 01, 2010, 08:23:25 PM
Okay so now the question I have is if you have a video card that does have a DisplayPort output can you avoid the TripleHead2Go device and still get the same function?

These new ati cards allow you to plug several monitors into them without anything extra.  I only have one good monitor though, so I don't mess with it, but all those ports are just there, staring at me....

Title: Re: Nvidia GF100
Post by: Skuzzy on March 02, 2010, 06:14:54 AM
Okay so now the question I have is if you have a video card that does have a DisplayPort output can you avoid the TripleHead2Go device and still get the same function?

Essentially, yes.  It will depend on the video card hardware and drivers.  ATI's Eyefinity support does exactly that.  They currently support 6 monitors (I think they are moving to 9, if they have not already done so) connected at once, and they appear as one monitor.  They also allow various configurations of the monitors: 2x3, 3x2, 1x6, 6x1 and possibly others.

ATI really caught NVidia napping on this feature.
Title: Re: Nvidia GF100
Post by: Krusty on March 02, 2010, 09:26:30 AM
Heck it's been in pop culture (movies, cartoons, comics) and even video games (Half-Life 2) for quite some time now! It's about time the software/hardware caught up to let us have 9 monitors functioning as one!!

 :O :O :O

(*now if only the monitor prices would catch up -- I mean drop -- so that we could do this without a second mortgage!)
Title: Re: Nvidia GF100
Post by: Skuzzy on March 02, 2010, 09:35:54 AM
DisplayPort has been slow to be accepted as it adds costs to the video card and the monitors, over a VGA/DVI port.  You want monitor prices to drop, yet add new features?  Ok. See, that is one reason why it has been slow to be accepted as a standard.  No one wants to pay for the feature.  It does add a network controller to the video card and monitor as well.

DisplayPort will not work with regular monitors either.  It requires a monitor which is not dependent on scan lines (LCD, Plasma, CMOS, OLED...).  The only way a scan-line based monitor could work is if an internal decoder was added to decode the data packets back into scan lines for the display, thus adding more costs.

Quite frankly, the whole HDMI thing should have never happened.  It was a very, very poor decision to implement HD content using scan-line technology.  How many CRT based HD televisions do you see around these days?
Title: Re: Nvidia GF100
Post by: Krusty on March 02, 2010, 12:22:50 PM
You want monitor prices to drop, yet add new features?  [...] No one wants to pay for the feature.

I know, I know (*sheepish look*)

From the consumer perspective, I'm just thinking wishfully.
Title: Re: Nvidia GF100
Post by: Kazaa on March 03, 2010, 04:45:09 AM
The sooner OLED displays hit the market the better.

Eyefinity resolutions on a single, curved monitor = win.

Title: Re: Nvidia GF100
Post by: Chalenge on March 03, 2010, 02:34:13 PM
OLEDs are already out but $2700 for an 11" screen?
Title: Re: Nvidia GF100
Post by: Skuzzy on March 18, 2010, 02:18:37 PM
And the beat goes on.....

Latest rumblings have the 480GTX card outperforming the stock ATI 5870 card by 5 to 10%, costing $120 U.S. more, and sucking down 300W of power.
Title: Re: Nvidia GF100
Post by: Reschke on March 19, 2010, 01:37:51 PM
Also, I have read that it is going to be an extremely low production run of around 9000 units because they had a major problem with the chip wafers... at least that is what I read over on www.simhq.com (http://www.simhq.com).

http://simhq.com/forum/ubbthreads.php/topics/2960029/Nvidia_GTX480_Whoops_Rumor.html#Post2960029

which points to this article...
http://www.semiaccurate.com/2010/02/17/nvidias-fermigtx480-broken-and-unfixable/
Title: Re: Nvidia GF100
Post by: Kazaa on March 19, 2010, 02:35:55 PM
Benchmarks are in.

Price of the 5850 went back up £20.

http://hardware-infos.com/news.php?news=3476
Title: Re: Nvidia GF100
Post by: Skuzzy on March 20, 2010, 05:14:24 PM
The production numbers are now at 5,000 units worldwide for both the 480 and 470.
Title: Re: Nvidia GF100
Post by: Pudgie on March 20, 2010, 07:08:41 PM
I think that Little Red Riding Hood is about to slay the Jolly Green Giant if she puts out another bowl of some good red porridge to best the next bowl of green  pea soup.


 :D
Title: Re: Nvidia GF100
Post by: skribetm on March 20, 2010, 10:56:41 PM
(http://www.techpowerup.com/img/10-03-19/106a.jpg)

(http://www.techpowerup.com/img/10-03-19/106b.jpg)

Quote
The XFX GeForce GTX 480 comes with the usual feature set of 1536 MB of GDDR5 memory, 480 CUDA cores, and a broad feature set that includes support for DirectX 11, CUDA, PhysX, 3D Vision Surround, and 3-way SLI. The GeForce GTX 470 retains this feature set, albeit with 448 CUDA cores, and 1280 MB of GDDR5 memory. The two will be released on the 26th of March.

Title: Re: Nvidia GF100
Post by: Boozeman on March 22, 2010, 07:19:16 AM
I think that Little Red Riding Hood is about to slay the Jolly Green Giant if she puts out another bowl of some good red porridge to best the next bowl of green  pea soup.


 :D


Actually, if the performance rumors on the GTX480 are true, ATI only has to release an "HD5890" with a 1 GHz GPU clock and 2 GB of RAM to get on par with the GTX480, or maybe even beat it performance wise. Considering the tiny amounts of 480 cards available, ATI should have no problems handpicking similar numbers of the finest Cypress dies and stealing the show from NV once again.
Title: Re: Nvidia GF100
Post by: Skuzzy on March 22, 2010, 08:06:41 AM
These dies are exhibiting an abnormal amount of transistor leakage, which is why the power requirements are so high.  I would not touch one with a 10 foot pole.  Make sure you get a really good warranty if you decide to get one of these cards.  The lifespan is going to be suspect.

When you have yields in the 7% area, that tells you the design has problems.  It is not just the process at that low of a figure, regardless of how NVidia's marketing would spin it.

You'd think NVidia would have learned after eating all those other dies for the same problems.  My confidence in NVidia's ability to turn out a quality product is eroding.
Title: Re: Nvidia GF100
Post by: Knite on March 22, 2010, 11:26:44 AM
I'm with you on that Skuzzy. 3 years ago if you would have told me ATI was going to beat nVidia at their own game, I may have laughed, but damn if ATI hasn't really socked nVidia right in the jaw with this gen of cards.

I'm a bit bummed the Fermi cards aren't more competitive though, as I was keeping my fingers crossed that at release it'd put some pressure on ATI to get those 58xx series card prices back down to the announced MSRPs instead of what they're floating at now.

Title: Re: Nvidia GF100
Post by: Skuzzy on March 22, 2010, 01:05:14 PM
It just got worse.  Due to the horrible yields, they revised the specifications again.  The 480GTX cards will have 480 cores, instead of 512.  The 470 will have 448.

This means the benchmarks everyone has seen are void, and the performance gains are less than before.  Probably on par with the ATI 5870 cards now.  Only the 480 costs more and uses far more power.

What they are doing is blocking dead cores in the bad chips to get the production yields higher.  It will take them from 7% to around 40% yields.  Well, at least they will not be losing as much per card as they would have if they stuck with 512 cores.
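A toy model of the salvage math. The per-core defect probability below is invented for illustration; interestingly, a figure around 0.995 reproduces the ~7% yield for perfect dies. Real yields also depend on defects in the non-core logic, which is why salvaging lands the actual figure around 40% rather than near 100%:

```python
# Probability a die works if at least `needed` of its cores are defect-free.
from math import comb

def die_yield(total_cores, needed, p_core_good):
    """Binomial tail: chance that `needed` or more cores survive."""
    return sum(comb(total_cores, k)
               * p_core_good ** k * (1 - p_core_good) ** (total_cores - k)
               for k in range(needed, total_cores + 1))

p = 0.995  # assumed chance any single core is defect-free
print(f"all 512 cores must work: {die_yield(512, 512, p):.1%}")  # ~7.7%
print(f"any 480 of 512 suffice:  {die_yield(512, 480, p):.1%}")
```

Allowing 32 dead cores makes core defects almost a non-issue in this model, so the remaining yield loss has to be coming from elsewhere on the die.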
Title: Re: Nvidia GF100
Post by: Ack-Ack on March 23, 2010, 01:46:37 AM
These dies are exhibiting an abnormal amount of transistor leakage, which is why the power requirements are so high.  I would not touch one with a 10 foot pole.  Make sure you get a really good warranty if you decide to get one of these cards.  The lifespan is going to be suspect.

When you have yields in the 7% area, that tells you the design has problems.  It is not just the process at that low of a figure, regardless of how NVidia's marketing would spin it.

You'd think NVidia would have learned after eating all those other dies for the same problems.  My confidence in NVidia's ability to turn out a quality product is eroding.

It seems that Nvidia is more interested in making "strategic partnerships" than in making quality product these days.  Also, am I the only one that has noticed that Nvidia's decline has coincided with their purchase of the makers of PhysX and the incorporation of PhysX into their cards?

ack-ack
Title: Re: Nvidia GF100
Post by: Boozeman on March 23, 2010, 05:27:07 AM
It seems that Nvidia is more interested in making "strategic partnerships" than in making quality product these days.  Also, am I the only one that has noticed that Nvidia's decline has coincided with their purchase of the makers of PhysX and the incorporation of PhysX into their cards?

ack-ack

Yes, they have been very busy promoting the proprietary gimmicks of their cards instead of real innovation. And probably the enormous success of their G80 GPU made them lazy too... a bad combination that resulted in what we see now.

BTW, more and more benchmarks leak... the one I am looking at now has the 480 a "massive" 6.7% ahead of the 1GB 5870 and 5% ahead of the 2GB 5870. If true, NV really only has CUDA and PhysX left to promote their cards. And the lead as the fastest single GPU card is very slim. I'm almost 100% certain that ATI will reclaim the crown in short order.

   
Title: Re: Nvidia GF100
Post by: Skuzzy on March 23, 2010, 05:59:01 AM
Boozeman, any benchmarks you see right now are based on parts which will never be shipped.  NVidia is reducing the core counts of the parts in order to be able to get higher yields.  There is no way of knowing what the negative impact to the benchmarks will be.  What was 5-6% is now less than those figures.
Title: Re: Nvidia GF100
Post by: Boozeman on March 23, 2010, 06:23:06 AM
Boozeman, any benchmarks you see right now are based on parts which will never be shipped.  NVidia is reducing the core counts of the parts in order to be able to get higher yields.  There is no way of knowing what the negative impact to the benchmarks will be.  What was 5-6% is now less than those figures.

Skuzzy, I'm almost 100% certain that the latest leaks are from the final spec GTX480. Those have already arrived at the reviewers and they should have actual results by now. With the NDA expiring at the end of the week, it's likely that we see more and more leaks. I just cannot imagine that Nvidia supplies them with 512 Core Fermis and then leaves the customer with 480.
Title: Re: Nvidia GF100
Post by: Ghastly on March 23, 2010, 06:36:49 AM
Skuzzy, I'm almost 100% certain that the latest leaks are from the final spec GTX480. Those have already arrived at the reviewers and they should have actual results by now. With the NDA expiring at the end of the week, it's likely that we see more and more leaks. I just cannot imagine that Nvidia supplies them with 512 Core Fermis and then leaves the customer with 480.

How soon we forget.  It wasn't all that long ago that some bright lads found that the drivers were detecting common benchmarking software, and adjusting operations accordingly to post artificially inflated performance.

Somehow, an "Oops" like benchmarking the top bin cards and shipping the others doesn't seem so unlikely.

<S>
Title: Re: Nvidia GF100
Post by: Skuzzy on March 23, 2010, 06:38:19 AM
Skuzzy, I'm almost 100% certain that the latest leaks are from the final spec GTX480. Those have already arrived at the reviewers and they should have actual results by now. With the NDA expiring at the end of the week, it's likely that we see more and more leaks. I just cannot imagine that Nvidia supplies them with 512 Core Fermis and then leaves the customer with 480.

Boozeman, NVidia just announced the changed spec two days ago.  They have not built any final spec cards yet.  The cards out there right now were built to the old spec, until NVidia finally figured out there was no way they were going to be able to build them that way without losing a lot of money on every card they shipped.
Title: Re: Nvidia GF100
Post by: Boozeman on March 23, 2010, 06:52:52 AM
Boozeman, NVidia just announced the changed spec two days ago.  They have not built any final spec cards yet.  The cards out there right now were built to the old spec, until NVidia finally figured out there was no way they were going to be able to build them that way without losing a lot of money on every card they shipped.

AFAIK, those changed specs are the finalized specs. Everything we heard about earlier were WIP specs, where shader cores, clock speeds, TDP etc. were all not set in stone yet. If they have not built any final spec 480s yet, then there is no chance to see any test results by the time the NDA is lifted. But hey, we will know very soon.  ;)
Title: Re: Nvidia GF100
Post by: Skuzzy on March 23, 2010, 07:02:58 AM
Actually, someone claims to have gotten a final spec (everything is theoretical) card and it ran 10% slower than the ATI 5870.

At this point in time, who knows what anyone will have, when it comes time to lift the NDA.  It is going to be a mess.
Title: Re: Nvidia GF100
Post by: Boozeman on March 23, 2010, 07:14:34 AM
I agree, if we consider all the cons (power consumption, price, low availability etc.) it should absolutely blow the 5870 out of the water performance wise to have any justification. But it's rather safe to say that this is not going to happen. So the only thing left is CUDA and PhysX - if you need/want those features.
Title: Re: Nvidia GF100
Post by: Skuzzy on March 23, 2010, 07:19:00 AM
But if it is only a couple of percentage points faster than the GTX295, then why bother at all?  The 295 card is a better card for the price and power consumption.
Title: Re: Nvidia GF100
Post by: Boozeman on March 23, 2010, 07:40:26 AM
But if it is only a couple of percentage points faster than the GTX295, then why bother at all?  The 295 card is a better card for the price and power consumption.

Well, I'd choose a single GPU over a multi-GPU solution if performance is the same.
Then there is DX11 too.

BTW, isn't the 295 EOL'd ? 
 
EDIT: I forgot, the 295 is short on VRAM too. 896MB per GPU does not cut it for such a high performance card.
Title: Re: Nvidia GF100
Post by: Reschke on March 23, 2010, 09:31:22 AM
It's all a cycle no matter who makes what in the video card business. I remember when S3 based cards were kicking butt way back in the day... anyway, when AMD and ATI went in together it choked off ATI for a while. The same is happening with PhysX and nVidia now.
Title: Re: Nvidia GF100
Post by: Ack-Ack on March 23, 2010, 02:38:16 PM
So the only thing left is CUDA and PhysX - if you need/want those features.

Even those are going to stop being selling points for Nvidia cards, because ATI's new line of cards (I think the 4xxx line has the feature as well) features a GPU that can handle game physics using the Havok API, which, from what I've heard, is far easier to program for than the PhysX API.


ack-ack
Title: Re: Nvidia GF100
Post by: Skuzzy on March 23, 2010, 02:43:39 PM
Specifically, ATI has gone with open standards and NVidia has chosen proprietary API's.  Proprietary is all well and good when you are on top of your game, but when you drop the ball, ala Fermi, it can hurt you.
Title: Re: Nvidia GF100
Post by: skribetm on March 24, 2010, 08:15:32 PM
(http://cdn.i.haymarket.net.au/Utils/ImageResizer.ashx?n=http%3a%2f%2fi.haymarket.net.au%2fGalleries%2f20100324113534_IMG_0211+copy.jpg&h=450&w=665)


http://www.atomicmpc.com.au/Gallery/170368,nvidia-gtx480-disassembly-guide.aspx/17
Title: Re: Nvidia GF100
Post by: Boozeman on March 26, 2010, 07:33:01 AM
Yet another delay:

http://www.fudzilla.com/content/view/18208/34/

Skuzzy, it looks like you might have been right after all... meaning the test samples that have been sent out are not representative of the cards that will hit the stores.
Maybe we are looking at a pretty marketing-stunt release with little to no relevance to the actual product.
Title: Re: Nvidia GF100
Post by: Bronk on March 26, 2010, 07:35:45 AM
Yet another delay:

http://www.fudzilla.com/content/view/18208/34/

Skuzzy, it looks like you might have been right after all... meaning the test samples that have been sent out are not representative of the cards that will hit the stores.
Maybe we are looking at a pretty marketing-stunt release with little to no relevance to the actual product.

Marketing stunt? Looks more like fraud IMO.
Title: Re: Nvidia GF100
Post by: Boozeman on March 26, 2010, 07:49:52 AM
Well, at least Nvidia keeps up the confusion. We will see how these changes actually affect the cards - maybe minor things, maybe major things. But I think it's safe to say that the cards the test sites got are very much cherry-picked to give the best picture - maybe not performance wise, but in regard to the core voltage / heat / noise needed to run at that performance level, they will probably be better than the actual retail cards.
Title: Re: Nvidia GF100
Post by: Ghastly on March 26, 2010, 11:31:24 AM
Marketing stunt? Looks more like fraud IMO.

Remember, up to this point EVERYTHING has been under an NDA  - with everything subject to change prior to release.  Any information out there has been "leaked". In short, NVidia has promised nothing, and delivered nothing - and so there's no basis for fraud.

Disappointment - Sure! Distrust, perhaps. But Fraud?  No.

<S>

Title: Re: Nvidia GF100
Post by: Bronk on March 26, 2010, 11:54:03 AM
Remember, up to this point EVERYTHING has been under an NDA  - with everything subject to change prior to release.  Any information out there has been "leaked". In short, NVidia has promised nothing, and delivered nothing - and so there's no basis for fraud.

Disappointment - Sure! Distrust, perhaps. But Fraud?  No.

<S>

Good point... Wonder how many fan bois will be sucked in though.

Title: Re: Nvidia GF100
Post by: Boozeman on March 26, 2010, 06:56:00 PM
Reviews are out!

Well, there we have it. It's a bit faster overall than the 5870, but that comes with extreme power consumption, extreme noise, and very hot hardware. I think "Thermi" nails it better than "Fermi". On the upside, under DX11 with lots of tessellation, this card really kicks 5870 bellybutton - but in all other scenarios... not so much.


 
Title: Re: Nvidia GF100
Post by: Ack-Ack on March 26, 2010, 08:26:38 PM
Reviews are out!

Well, there we have it. It's a bit faster overall than the 5870, but that comes with extreme power consumption, extreme noise, and very hot hardware. I think "Thermi" nails it better than "Fermi". On the upside, under DX11 with lots of tessellation, this card really kicks 5870 bellybutton - but in all other scenarios... not so much.


  

Its performance advantage over the ATI 5870 isn't all that much, and I wouldn't consider it to be kicking its ass.  

DX11 test comparisons (http://www.hexus.net/content/item.php?item=24000&page=8) show it only beat the ATI 5970 in one area, when using extreme tessellation.

While the GTX 480 can be called the world's fastest GPU, the ATI 5970 is still the world's fastest video card and pretty much beats the GTX 480 in most areas. The price point is also going to be up to 40% above that of ATI's flagship card.  All in all, ATI still retains the crown.

GTX and ATI chart comparison (http://www.hexus.net/content/item.php?item=24000&page=4)


ack-ack
Title: Re: Nvidia GF100
Post by: BoilerDown on March 26, 2010, 11:30:42 PM
Here's another review that correlates with what everyone has been predicting:

(Linked directly to the conclusions page)
http://www.hardocp.com/article/2010/03/26/nvidia_fermi_gtx_470_480_sli_review/8

They review the 470, 480 and 480 SLI... only one of the three breaks any new ground, and it breaks the bank along the way.
Title: Re: Nvidia GF100
Post by: oneway on March 26, 2010, 11:33:59 PM
AnandTech has a write up on the 480/470....

"NVIDIA’s GeForce GTX 480 and GTX 470: 6 Months Late, Was It Worth the Wait?"

http://www.anandtech.com/video/showdoc.aspx?i=3783 (http://www.anandtech.com/video/showdoc.aspx?i=3783)

(http://images.anandtech.com/reviews/video/NVIDIA/GTX480/480card.jpg)
Title: Re: Nvidia GF100
Post by: Kazaa on March 27, 2010, 02:10:59 AM
Dammmmmmmmm, not bad at all.

The 480 is £50-£70 cheaper than the 5970 and is close in performance.
Title: Re: Nvidia GF100
Post by: Skuzzy on March 27, 2010, 05:52:22 AM
Just keep in mind these cards may not represent what will actually be shipped.  It is still a design with many problems.
Title: Re: Nvidia GF100
Post by: Bronk on March 27, 2010, 07:20:04 AM
Shades of geforce 5800.... :noid
Title: Re: Nvidia GF100
Post by: Boozeman on March 28, 2010, 05:01:27 AM
Shades of geforce 5800.... :noid

More like 5800 reloaded. ;)
Funny though, ATI's 5800 series is a full success...

What worries me most about Fermi are the temps. Note that most reviewers do not have their test system installed in a case - unlike the customer. But even in such an unrealistic and favorable situation, the Fermis run very hot. 95 C is reached regularly in heavy gaming, and the GPU throttles power at 105 C. Now add summer-like ambient temperatures and a non-Fermi-certified case (yes, there are Fermi-certified cases - apparently for good reason) with less than optimal airflow, and you have a card that hits its limiter all the time - or worse.

A German review site did indeed break a Fermi during testing, and almost broke another, simply by running Folding@home. There was a bug in the fan control software which prevented the fan from spooling up properly. The result was a 112 C GPU temp and a black screen, but the card was still crunching and heating up. Probably only the editor's quick manual reset saved that card. And this was a GTX470, which runs a bit cooler than the 480.       

Fermi is a mess.
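The limiter described above amounts to simple threshold throttling with hysteresis: cut power at the hard limit, and stay throttled until the GPU cools back below a lower threshold. A hypothetical sketch of that behavior (the 105 C limit and ~95 C gaming temps are the figures quoted in the post; the controller logic itself is illustrative, not NVIDIA's actual firmware):

```python
# Illustrative throttle controller with hysteresis. The thresholds mirror
# the temps reported for Fermi; the logic is an assumption, not NVIDIA's.
THROTTLE_TEMP = 105.0   # hard limit where the GPU cuts power (per the post)
RESUME_TEMP = 95.0      # assumed temperature at which full clocks resume

def next_power_state(temp_c: float, throttled: bool) -> bool:
    """Return True if the GPU should be throttled at this temperature."""
    if temp_c >= THROTTLE_TEMP:
        return True            # over the hard limit: always throttle
    if throttled and temp_c > RESUME_TEMP:
        return True            # still hot: stay throttled (hysteresis)
    return False               # cool enough: back to full clocks

# Walk a heating-then-cooling curve through the controller.
throttled = False
for temp in [90, 100, 106, 103, 97, 94, 90]:
    throttled = next_power_state(temp, throttled)
```

The hysteresis band is what produces the "frame rate drops after 30 minutes, then recovers" pattern users would see in a hot case.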
Title: Re: Nvidia GF100
Post by: Spikes on March 28, 2010, 01:20:28 PM
 Think I'll still stick to crossfire 5770s.
Title: Re: Nvidia GF100
Post by: MrRiplEy[H] on March 29, 2010, 06:49:26 AM
Fermi = Fermented = Stale/rotten.. Nomen est omen?  :D
Title: Re: Nvidia GF100
Post by: Chalenge on March 29, 2010, 02:54:27 PM
The 470s and 480s are alleged to be the first Nvidia cards to be 100% scalable, which, if true, will mean I will have three of them... NOT because you need them for AH though. I want to see what these cards can do for FSX.

Since they're not out yet, it's time to WAIT AND SEE instead of jumping to conclusions... right, Ripley?  :aok
Title: Re: Nvidia GF100
Post by: Skuzzy on March 29, 2010, 03:12:18 PM
Well, the 480 did not scale well in resolution against the ATI model. The higher the video resolution, the faster the 480 dropped in performance against the ATI card.

Three of them?  I hope you have a really good set of earplugs (3 cards @ 70dB each is uh,..loud), a really open case with lots of fans, and a 1500W power supply.  For the 5 minutes they will run before dialing back their performance due to heat, it might be neat.  Unless you have some type of refrigerated case in mind?
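The noise figure is worth a quick sanity check: identical incoherent sources add as 10·log10(n) dB, so three 70 dB coolers land near 75 dB rather than anything additive - still plenty loud. A minimal sketch of the standard acoustics calculation (the 70 dB per-card figure is the one quoted above):

```python
import math

def combined_spl(levels_db):
    """Sum incoherent sound sources: convert each dB level to relative
    power, add, and convert back (L = 10*log10(sum 10^(Li/10)))."""
    total_power = sum(10 ** (level / 10) for level in levels_db)
    return 10 * math.log10(total_power)

# Three identical ~70 dB cards running together:
three_cards = combined_spl([70, 70, 70])   # ~74.8 dB - a clear step up
```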

If you really are hellbent on buying an NVidia product, I really think it would be smarter to wait until they do a refresh of the parts before touching them.  This generation is fraught with physical design problems I am sure they will cure in the next 3 to 6 months, but it is your money.

I just do not see these cards as an option for anyone due to the heat, noise, power consumption, lack of overall performance gains, and costs.  That is putting aside the design issues.

From my perspective, I see a support nightmare coming at me.

"Is there a way to turn up the VOX?  I cannot hear it over the video card!"
"Your game melted my video card!"
"Why is my frame rate dropping after I have been playing for 30 minutes?"

And so on, and so forth.
Title: Re: Nvidia GF100
Post by: Chalenge on March 29, 2010, 05:13:36 PM
That's my plan exactly, Skuzzy - to wait... probably until I see something about the recall rate, or how much heat actually comes off the EVGA models. Right now the reviews I have found are not well executed, and none of them were done with retail cards. I think my case is good as far as air movement, but if the cards are just high-dollar soldering irons then I don't see EVGA selling them at all. The guys at EVGA are not in business to sell high-dollar boat anchors.  :D

I'm using two Seasonic X750 Gold PSUs in a Coolermaster ATCS 840 case, and I think that will be enough.  :aok
Title: Re: Nvidia GF100
Post by: Ack-Ack on March 31, 2010, 01:00:15 PM
(http://www.legitreviews.com/images/reviews/1264/Nvidia_grill.jpg)


ack-ack
Title: Re: Nvidia GF100
Post by: Skuzzy on March 31, 2010, 01:38:15 PM
Well, it seems another black eye has been awarded.  XFX and BFG are not going to offer any of these new cards.

From what I can gather, the fanbois are all saying these cards are not designed to play games, but to run CUDA applications faster than ATI cards can.  Uh,..hehe,...CUDA is a proprietary API NVidia created.  It does not run on ATI cards at all.  ATI cards use open standards for GPGPU programming.  I really do not think the fanbois are helping much.
Title: Re: Nvidia GF100
Post by: Kazaa on April 01, 2010, 02:44:29 AM
I hate fanbois.
Title: Re: Nvidia GF100
Post by: Skuzzy on April 01, 2010, 06:03:15 AM
Seems most of the fanbois are actually being smart and waiting.  The latest from the hard core fans is NVidia will probably release a driver that will fix the heat and noise problems.  They are just waiting on ATI's response so they can lower the boom on ATI again.

The little buggers can tap dance really well.
Title: Re: Nvidia GF100
Post by: Knite on April 01, 2010, 10:35:46 AM
Honestly, I'm just glad this thing finally got released. Maybe by the time nVidia gets their 2nd round of 470/480 hardware out, with better cooling and lower power consumption, they'll lower the price too, forcing AMD to follow. That's only a good thing.
nVidia failing this badly is not good for consumers.


That being said, I just installed a 5770 myself, and other than an odd quirk where it decides to run in 24p mode on occasion (not an AH problem), it runs AMAZING. Full detail, full vis range, 1080p resolution, 4096 self-shadowing, and it stays LOCKED at 60fps.
So pretty. Now if only I could hit the broadside of a barn =)
Title: Re: Nvidia GF100
Post by: Krusty on April 01, 2010, 11:02:02 AM
Mildly curious, what is "24p mode"?
Title: Re: Nvidia GF100
Post by: Skuzzy on April 01, 2010, 11:07:01 AM
24 progressive (24 FPS).  It is a movie mode.
Title: Re: Nvidia GF100
Post by: cattb on April 01, 2010, 12:02:27 PM
If ATI took the 5770 and went from 128-bit to 256-bit, wouldn't it make that card that much faster and unlock more potential, since the memory bandwidth would be larger? It would make the card more expensive, but then, due to the way the card is made, maybe the performance gains would not be that much?
Title: Re: Nvidia GF100
Post by: Krusty on April 01, 2010, 01:02:19 PM
If ATI took the 5770 and went from 128-bit to 256-bit, wouldn't it make that card that much faster

It's called the 5870, and it has over twice the memory bandwidth of the 5770.

http://www.gpureview.com/show_cards.php?card1=615&card2=613
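The gap in that comparison falls straight out of the arithmetic: peak memory bandwidth is the bus width in bytes times the effective transfer rate. A rough sketch (the ~4.8 GT/s effective GDDR5 clock is approximate and from memory - check the linked comparison for exact specs):

```python
def memory_bandwidth_gb_s(bus_width_bits, effective_clock_mt_s):
    """Peak memory bandwidth in GB/s: bus width in bytes times
    millions of transfers per second."""
    return (bus_width_bits / 8) * effective_clock_mt_s * 1e6 / 1e9

# Approximate launch specs; both cards run ~4.8 GT/s effective GDDR5:
hd5770 = memory_bandwidth_gb_s(128, 4800)   # ~76.8 GB/s
hd5870 = memory_bandwidth_gb_s(256, 4800)   # ~153.6 GB/s
```

At the same memory clock, doubling the bus width doubles the bandwidth, which is exactly the 5770-to-5870 jump being discussed.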
Title: Re: Nvidia GF100
Post by: cattb on April 01, 2010, 02:30:02 PM
Oh!! I didn't realize the 5870 and 5770 were basically the same card, but with the 5870 having the bandwidth. But then, I didn't look at the specs either.
Title: Re: Nvidia GF100
Post by: Krusty on April 01, 2010, 03:11:51 PM
There is also a 5890.
Title: Re: Nvidia GF100
Post by: cattb on April 01, 2010, 04:31:07 PM
cool!! maybe then they will make a 128-bit 5790
Title: Re: Nvidia GF100
Post by: Krusty on April 01, 2010, 05:12:33 PM
Doesn't really work that way....

The 5870 is their premier line. The 5670 is their lesser line (not quite budget, but more towards the budget end). The 5770 is a step between. The 5890 is more like an up-boosted model (like the X in GTX).

Folks who want a "5790" can just buy a 5870 and be done with it :)
Title: Re: Nvidia GF100
Post by: Chalenge on April 01, 2010, 06:47:27 PM
Seems most of the fanbois are actually being smart and waiting.  The latest from the hard core fans is NVidia will probably release a driver that will fix the heat and noise problems.  They are just waiting on ATI's response so they can lower the boom on ATI again.

The little buggers can tap dance really well.

I read that the first delivery is sold out already? I don't think the heat issue can be handled by drivers, and I thought I read that the pathways have to be reduced (from, say, 90nm to 60nm) to fix the issue?

Anyway, the scaling possibilities make me think even a state-changing device would be worth it, if it's possible to find a dual-head system (considering SLI). GPUs are already the primary source of heat in all of my systems, so I'm headed in that direction already.
Title: Re: Nvidia GF100
Post by: Krusty on April 01, 2010, 06:53:08 PM
I've heard some good things about the Vapor-X system. Looks rather amazing, really, and runs super cool. I haven't seen it on many cards straight from the factory (only one line had some a while back), but they must be available separately, no?
Title: Re: Nvidia GF100
Post by: Chalenge on April 01, 2010, 07:27:35 PM
I should have said 'phase-change' - what I really mean is a refrigeration-type cooling system that cools the GPU directly, to the neighborhood of -22C. OCZ at one time had a dual-head cooler, or at least I thought it was them. The noise would go up, but again, you could always cabinet the thing.
Title: Re: Nvidia GF100
Post by: Ghastly on April 01, 2010, 10:51:39 PM
Have you worked with it before, Chalenge?  In the late 90's I used the Kryotech Renegade room-temperature phase-change cases for servers and workstations that had to operate in uncooled spaces where the ambient would at times reach 125°F - and they worked well.  But when Kryotech discontinued the room-temperature products, I was forced to switch to -20C units (I used the VapoChill units). 

For an overclocker who wants to piddle and get the most out of the system for a few minutes to a few hours at a time, they could possibly work (reasonably) well, I suppose.  But as a full-time cooling solution, they are just miserable to work with, because you have to seal everything (thermally as well as hermetically) to make sure you get no condensation anywhere - none, zero, nada - tough to be sure of at +90 degrees and 100% humidity ambient. 

Bottom line is I never got it to work satisfactorily - I eventually refused to place any new equipment where it would not be able to operate in a climate-controlled zone.

But I can't imagine that trying to implement such a solution on two Fermis is going to be any easier.  And all it takes is ONE drop of condensation to destroy a PCB - and hose your system.

<S>
Title: Re: Nvidia GF100
Post by: Skuzzy on April 02, 2010, 07:22:42 AM
I really do not care what the fanbois/NVidia marketing machine is saying.  I can safely state this part is a flawed design.  It was not designed to match the process TSMC has to manufacture the silicon, which is the primary reason it is running so hot.  It is leaking current all over the part.

ATI learned the hard way on their first 40nm part manufactured by TSMC, and they went back to the drawing board to design specifically for TSMC's process.  That is why ATI parts are running so much cooler now.  It is also why ATI's yields are so much better.

This has been well documented and there is no amount of marketing that will cover it up.  Ignoring the facts does not make them magically disappear.
Title: Re: Nvidia GF100
Post by: Ack-Ack on April 02, 2010, 12:39:39 PM
I've heard some good things about the Vapor-X system. Looks rather amazing, really, and runs super cool. I haven't seen it on many cards straight from the factory (only one line had some a while back), but they must be available separately, no?

Sapphire is the company that has the Vapor-X card cooling system.

ack-ack
Title: Re: Nvidia GF100
Post by: Skuzzy on April 02, 2010, 02:01:21 PM
The only thing that bothers me about Vapor-X is long-term viability.  It depends on a vacuum to lower the boiling point of water so the water will vaporize much more quickly.  If the part suffers an internal increase in air pressure, then it will stop working.

After a few years of constantly changing temperatures, I wonder how well it would hold up.

Overall, it is a clever design.
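The vacuum's effect on the boiling point can be put in rough numbers with the Antoine equation for water. A sketch (the constants are the standard 1-100 °C fit for water; the 0.1 atm chamber pressure is a made-up illustration, not Sapphire's actual spec):

```python
import math

# Antoine constants for water, P in mmHg, T in deg C (valid roughly 1-100 C).
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(pressure_mmhg):
    """Temperature at which water's vapor pressure equals the ambient
    pressure - i.e. its boiling point - via the inverted Antoine equation:
    T = B / (A - log10(P)) - C."""
    return B / (A - math.log10(pressure_mmhg)) - C

sea_level = boiling_point_c(760)      # ~100 C, the familiar figure
partial_vacuum = boiling_point_c(76)  # at 0.1 atm, water boils around 46 C
```

That drop is the whole trick: at reduced pressure the fluid evaporates at GPU temperatures and carries the heat away as latent heat, which is also why losing the vacuum kills the cooler.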