Aces High Bulletin Board
General Forums => Hardware and Software => Topic started by: Chalenge on February 21, 2013, 05:08:43 PM
-
They call it Titan, but it is really the GK110 GPU. It is the fastest single-GPU video card available as of this moment. The Titan comes with 6GB of memory as standard and is clocked lower than the GTX 680. Will you be able to buy one anytime soon? Well... if you want to buy a pre-built system, then yes, but it's pricey at more than $3,000.
Review (http://www.anandtech.com/show/6763/highend-meets-small-form-factor-geforce-titan-in-falcon-northwests-tiki) at AnandTech. Any Nvidia SLI members will be getting an email about it shortly with more pre-built vendors listed.
-
I'd get this before going SLI for sure. I've never been sold on SLI, least of all now, when every game I play is either CPU bound or already GPU bound at more than 60 FPS.
Speaking of which, Aces High could use a GPU/CPU indicator à la hitting Ctrl-F in PlanetSide 2. It would take a lot of the guesswork out of which upgrade to get next for a lot of people.
-
I'd get this before going SLI for sure. I've never been sold on SLI, least of all now, when every game I play is either CPU bound or already GPU bound at more than 60 FPS.
Speaking of which, Aces High could use a GPU/CPU indicator à la hitting Ctrl-F in PlanetSide 2. It would take a lot of the guesswork out of which upgrade to get next for a lot of people.
Running MSI Afterburner will log your CPU/GPU usage history.
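If you'd rather roll your own logger, here's a minimal Python sketch of the same idea (assuming the psutil and pynvml packages are installed; the CSV file name is just an example):
[code]
# Minimal CPU/GPU usage logger - a sketch, not a replacement for Afterburner.
import time
import psutil   # pip install psutil
import pynvml   # pip install nvidia-ml-py (Python bindings for Nvidia's NVML)

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

psutil.cpu_percent(interval=None)            # prime the CPU counter; first call returns 0
try:
    with open("usage_log.csv", "w") as log:  # example file name
        log.write("time,cpu_pct,gpu_pct,gpu_mem_pct\n")
        while True:
            time.sleep(1.0)
            cpu = psutil.cpu_percent(interval=None)           # CPU % since the last call
            util = pynvml.nvmlDeviceGetUtilizationRates(gpu)  # .gpu / .memory in percent
            log.write(f"{time.time():.0f},{cpu},{util.gpu},{util.memory}\n")
            log.flush()
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
[/code]
It won't give you Afterburner's graphs, but the CSV makes it easy to see whether the CPU or the GPU was the busy one when the frame rate dipped.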
-
I think if you have an i5/i7 at greater than 3.0GHz and a 600 series, especially a GTX 680, then you have no need to upgrade. Not for AH at least. Since I also do other things on this computer I will have to see what it does for transcoding videos, 3ds Max renders, Revit, and so on, but for AH I don't see a need. I don't really feel a need to use ultimate shadows either, because when you play with shadows on you are not actually focusing on them, and your brain fills in the little details that the game renders a bit rough.
Now, if someone is playing on an older card and needs an upgrade, or their card suddenly burns out (I just had one do that, actually - covered by the EVGA lifetime warranty), then upgrading to the Titan is a no-brainer. And if the older cards suddenly see a drastic price drop? SLI, baby!
-
I have an i7 overclocked to 4.3GHz and a GTX 680 :)
The guy at the shop advised me to get an 860i power supply and in two years' time get another graphics card and go SLI :old:
On the Island yesterday I went down to 43 FPS with local reflections on, which I found odd :cry
It's always 60 FPS otherwise; I presume it's something to do with the reflections off the water and coast.
As Chalenge has stated, you can switch lots of things off if you want; I have everything up high because I can.
The most important thing to stop distraction is that I minimise the text buffer; in RoF it's very annoying as the text buffer pops up all the time.
-
Yes, what I tried to say, and said poorly, is that even a single GTX 680 will run AH and allow smooth shadows at 4096 texture size (if you have the right CPU already). I did notice that the shadows are slightly smoother with the latest drivers (or maybe HTC changed something). I never noticed when the change occurred, but I compared a few older recordings with some I made recently and I can see a difference.
-
I never use the shadows, as Chalenge has said. In a fight the concentration is on the fight, not the shadows. Just playing the game, SA is so important that I lose track of shadows and don't pay attention to them, so why bother?
On a long cruise where boredom can set in, it's kind of fun to mess with.
-
I noticed the shadow improvement with Nvidia's new drivers as well, especially at 8192. I will say one thing: I can run everything at max EXCEPT the environment slider and see 59 or 60 no matter what I do in game. If I leave the slider two notches from the left, so two shy of max, my FPS stays maxed out continuously; however, if I max it full left, it'll drop to the low 40s in a high-traffic area, as a previous poster just said.
For my 3820 system with a single 4GB 680 FTW+ LE, shadows being right off or maxed out doesn't cause my FPS to drop below 60, but as I said, the environment slider being full left makes things seem a bit more sensitive when I roll around and look around. It will stay at 59 or 60 unless I get into a lot of GVs and cons and smoke combined, and then it'll drop 20 FPS or so. Even my 3930K system with two of those 680s in SLI will see a drop to 50 or so FPS in the same circumstances with the slider full left.
I really don't see a whole lot of difference in how the game looks - it's still fantastic with it two notches right - so I just leave it there and leave shadows at max, since as I said they don't seem to have a major cost for my setup. Normally I would just switch shadows off as well, as I flew for years without them and they kind of bugged me at first, but I've been recording a lot and want to make films, and I want them to look as pretty as possible, so on they stay for now. I've never really taken note of it, but can you tell whether shadows are on or off in film? I'll check that out sometime today, I guess.
I wonder when Intel will release their new CPUs for socket 2011; it's got to be soon. It would be nice to see some benchmarks of this new video card Chalenge is speaking of with one of those.
big edit: Chal, it's $1,000, not $3,000, retail for the Titan - must have been a typo. I was like, wth, nobody except an extremely select few will pay that for a retail gaming card; that's more than some of those crazy high-end graphic designer cards, haha. It's actually cheaper than the 690 I bought, and $100 cheaper than the 680 SLI setup I replaced it with. Good times. I can't wait until they come in; my SLI cards are already spoken for.
-
big edit: Chal, it's $1,000, not $3,000, retail for the Titan - must have been a typo. I was like, wth, nobody except an extremely select few will pay that for a retail gaming card; that's more than some of those crazy high-end graphic designer cards, haha. It's actually cheaper than the 690 I bought, and $100 cheaper than the 680 SLI setup I replaced it with. Good times. I can't wait until they come in; my SLI cards are already spoken for.
No, I was talking about systems with the Titan included. Right now, unless you know something I don't, you can't buy a Titan.
You can always turn shadows on when you go to make a film into a movie.
-
The Ivy Bridge-E processors will be on socket 2011 but won't be out until Q3 of this year, from what I recall reading. And the Titan will be available from Newegg, Amazon, etc. as a single card - no requirement to buy a $3,000 PC just to get one.
From what I've seen, the i7 3770 benches just as well as the Extreme processors in games, and it's not much better than my overclocked 2600K, even if you overclock it as well. Haswell is supposed to be out June 2nd but will only be up to 10% faster than Ivy Bridge. Despite the small improvement, unless they don't overclock at all, Haswell will be my next CPU upgrade. I'm not going for today's or tomorrow's extreme CPUs; they run too hot and they aren't really any better than the ~$400 CPUs for what I do 99% of the time.
I wish AMD would push Intel's and Nvidia's high end, but they seem to be taking a year off, so we get jack for speed improvements for a while. I'm actually fairly impressed by Titan, mainly because I don't see it being upstaged for a long, long time. You could probably buy one and not feel the need to upgrade the GPU again for four years. On the other hand, GPUs are so far ahead of CPUs right now that upgrading to the Titan can hardly be warranted. Just get a 670 and wait for the GTX 700 series in December.
-
It was not local reflections restricting FPS; I had the update slider at max :old:
-
Sooooo...
Has anybody jumped off the pier on this one yet since the 2-24-13 posting and used it in AH?
Evidence says no, but you never know if someone has gone all covert about it...
I'm waiting for the price to drop some myself before I start getting serious about it...
Or I hit the lottery first...
:D
Just curious...
:salute
-
I bought a GTX 680, which is awesome :old:
-
This has me so excited! :banana: I think I will go off and 'crack one off'
-
I think if you have an i5/i7 at greater than 3.0GHz and a 600 series, especially a GTX 680, then you have no need to upgrade. Not for AH at least.
I have an i7 3960X and 3-way Titan SLI, only playing COD2 sitting in a corner waiting for the lemmings to show up, but I would like to upgrade. Could you help me please? ;)
-
:)
-
I have an i7 3960X and 3-way Titan SLI, only playing COD2 sitting in a corner waiting for the lemmings to show up, but I would like to upgrade. Could you help me please? ;)
There is no help for you.
-
Income tax refund is now in the bank & the struggle to resist just got harder!
I read today that there is a Titan LE version in the making...
:x :pray :D :salute
-
Buy TrackIR instead; it's awesomesauce in game :old:
-
I have an i7 3960X and 3-way Titan SLI, only playing COD2 sitting in a corner waiting for the lemmings to show up, but I would like to upgrade. Could you help me please? ;)
:rofl :rofl
-
Well, the wait is over...
I, Pudgie, am now the proud owner of an EVGA GTX TITAN vid card. Bought it direct from EVGA for the grand sum of $1,009.23.
:D :salute
Couldn't resist the temptation to have it... especially when I saw that Newegg had stopped carrying the vanilla card at one stretch (it only had the Superclocked, Signature & Hydro Copper series; I thought Nvidia had stopped making the vanillas). I checked the EVGA site, saw that they were still showing the vanilla Titans, so I ordered one.
Now the price is going up on them all...
Lemmings like me... :x
Popped her in & loaded the 314.22 WHQL drivers & all is well.
This card is quiet & is the SMOOTHEST-running vid card that I've ever owned.
Stutter-free operation w/ all graphics settings at max & I mean all of them.
Since it's a Kepler-based GPU w/ GPU Boost 2.0, I tested this card using the in-game graphics setting in Video Settings set to Most, w/ the Nvidia driver set to Use the 3D Application Settings for AF, AA & V-sync, TF set at High Quality, & ran the game. This TITAN showed the exact same pattern my GTX 670 FTW did when run the same way: the card's GPU Boost set the GPU clocks some 224 MHz BELOW the card's base clock (actual in-game GPU clocks around 614-627 MHz; base clock was 837 MHz) w/ mem clocks pegged at 3005 MHz. As w/ the 670, the game ran flawlessly, but the GPU clocks were below the base clock.
I then went into the game Video Settings, set the in-game graphics slider to None, went into the Nvidia driver, set it to Override Any Application Settings, set AF & AA to the driver's max settings (16x AF, 32x CSAA), V-sync to On w/ TF at High Quality, & ran the game. Just as the 670 did then, this TITAN's GPU Boost set the GPU clocks at 980 MHz (pegged 104 MHz above the advertised max boost clock of 876 MHz) w/ the mem clock pegged at 3005 MHz. GPU temp wasn't even an issue (between 59°C & 62°C regardless, w/ max power usage at 74% against a 100% power target). Again the game ran flawlessly, but the GPU clocks were MUCH higher, as GPU Boost did what it should do when the GPU is running well below the temp & power limits. Something about the in-game graphics setting is influencing the Nvidia GPU Boost algorithm to underclock the Kepler GPUs on my box...
Hmmmm... 2 different Kepler GPUs w/ 2 different versions of the GPU Boost algorithm running on 2 different driver versions, but exhibiting the EXACT same behavior in the EXACT same game on the EXACT same platform... so it's got to be some issue w/ the i7 3820 SB-E platform I'm using?
Naw, I don't think so...
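If anyone wants to watch GPU Boost do this on their own card without a monitoring utility, here's a rough Python sketch (assuming the pynvml bindings for Nvidia's NVML library are installed) that polls the clocks, temperature & power once a second while the game runs:
[code]
# Poll GPU clock / temperature / power once a second via NVML - a rough sketch.
import time
import pynvml   # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

print(" sec  core_MHz  mem_MHz  temp_C  power_W")
start = time.time()
try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        mem  = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        watt = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # NVML reports milliwatts
        print(f"{time.time() - start:4.0f}  {core:8d}  {mem:7d}  {temp:6d}  {watt:7.1f}")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
[/code]
Run it alongside the game, flip the in-game graphics slider between Most and None, & you can see for yourself where the boost clock lands.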
Anyway, this TITAN is way too much card for what I use this box for... but this box is way too much for what I use it for as well, so I'm all set for a while.
Well, gotta go play some more...
I do love this card!
:aok :salute
-
I bought three and threw one away :rofl
-
If you're interested,
There is an interview you can watch at PC Perspective's web site where Ryan Shrout has Tom Petersen, an Nvidia rep, on, & the topic is the advent of the FCAT process of using frame rating to further refine graphics performance... not just by GPU power & speed alone, but by being able to "read" each graphics frame that is written to the frame buffer AND flipped to be displayed onscreen, to ensure that every frame is rendered properly & fully, that all the frame inputs arrive at the proper time within the window the vid card is given, & that it is displayed in its proper time sequence. The discussion is a good listen IMHO, as it demonstrates why using FRAPS (or similar software) to evaluate graphics performance misses too much: it picks up the frames from the game engine (CPU) but doesn't account for the frame rendering & flip-to-display process (GPU), so we may see errant graphics performance on screen that isn't necessarily due to GPU/mem FPS performance.
This process is not proprietary to Nvidia, as AMD is working w/ it as well; Nvidia has been working w/ it for the last 3 years.
This is one of the features incorporated in the TITAN: this vid card has Nvidia's first attempt, from all their research & testing, implemented at the hardware level of the card (GPU) & controlled thru the driver, so the card is self-checking the GPU rendering & display process as well as actually doing the work of rendering & displaying the frames.
The goal is to have the highest FPS the card can run AND the lowest frame timings possible, so we users can have the fastest, smoothest graphics experience that can be provided.
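Just to illustrate why frame timing matters more than average FPS, here's a rough Python sketch (the file name & log format - one frame time in milliseconds per line - are just assumptions; export the data however your tool allows) that turns a frame-time log into the kind of numbers these reviews quote:
[code]
# Summarise a per-frame-time log (one time in milliseconds per line) - a sketch.
import statistics

def summarise(path):
    with open(path) as f:
        frame_ms = sorted(float(line) for line in f if line.strip())
    avg_fps = 1000.0 / statistics.mean(frame_ms)      # average FPS over the whole run
    p99 = frame_ms[int(0.99 * (len(frame_ms) - 1))]   # 1% of frames are slower than this
    print(f"average FPS        : {avg_fps:6.1f}")
    print(f"99th pct frame time: {p99:6.1f} ms ({1000.0 / p99:5.1f} FPS)")
    print(f"worst single frame : {frame_ms[-1]:6.1f} ms")

summarise("frametimes.txt")   # hypothetical log file
[/code]
Two cards can post the same average FPS while one of them has a far worse 99th-percentile frame time, & that is the one that feels like it stutters.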
I'm guessing that both AMD & Nvidia have hit the same wall w/ GPUs that Intel & AMD hit w/ CPUs some time back... pure processing speed & power, though necessary, isn't enough to fully achieve the goal of providing the ideal user EXPERIENCE w/ these products.
I'm also guessing this is why Nvidia needed the GK110 GPU instead of the GK104: to use the FCAT process on a computer you would need a video capture card to capture the frame flipping from the frame buffer to the display, look at the frame sequencing & quality, & make adjustments at the GPU level to correct any errors found, all at high GPU rendering rates. The GK110 GPU is more than capable of handling both processes at the same time - thus the creation of the TITAN as a single-GPU version.
The GTX 690 is a dual-GPU version in which they can implement this process & manage it across the 2 GK104 GPUs thru the driver.
When you see the data that Ryan Shrout presents from using the FCAT tools to check this performance across the latest Nvidia & AMD products, you will see that both are tuning their products thru this process, & that Nvidia has invested a LOT of time in using FCAT to refine their product line's performance... TITAN is the flagship card built w/ FCAT in mind, & it shows.
From what I gathered, this is going to be the basis for the GTX 700 series cards.
http://www.pcper.com/
:salute
-
Did they finally do something with the 400MHz RAMDAC chip? I can't find any reference on any of the Titan cards...
-
Frame rates beyond the ability of the monitor to display them are pretty worthless. They need to quit obsessing over frame rates above the refresh rates of the monitors.
-
lol, no way that's going to happen, Skuzzy. It's one of the greatest marketing tools they have stumbled on... pseudo frame rates. Put enough memory on the board for a big buffer and let the GPU show 200 FPS. A little anti-aliasing here, a little tessellation there... as long as it appears seamless, nobody cares how it got there.
-
lol, no way that's going to happen, Skuzzy. It's one of the greatest marketing tools they have stumbled on... pseudo frame rates. Put enough memory on the board for a big buffer and let the GPU show 200 FPS. A little anti-aliasing here, a little tessellation there... as long as it appears seamless, nobody cares how it got there.
If enough people are educated about it, then that can change.
The only time it is helpful to know what the raw frame rate would be is when you are trying to assess whether the CPU or the video card is the bottleneck in your computer.
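To put some numbers on that, here is a little back-of-the-envelope sketch (illustrative figures only; real V-sync quantises below the refresh rate, which the min() glosses over):
[code]
# Frame-budget arithmetic: why raw FPS above the refresh rate is wasted effort.
refresh_hz = 60
budget_ms = 1000.0 / refresh_hz               # 16.7 ms per displayed frame at 60 Hz

for raw_fps in (45, 60, 120, 200):
    render_ms = 1000.0 / raw_fps              # time the GPU needs to draw one frame
    shown_fps = min(raw_fps, refresh_hz)      # simplified: the monitor caps what you see
    idle_ms = max(0.0, budget_ms - render_ms) # time the GPU sits waiting each refresh
    print(f"GPU at {raw_fps:3d} FPS -> {render_ms:5.1f} ms/frame, "
          f"{shown_fps:3d} FPS displayed, {idle_ms:4.1f} ms idle per refresh")
[/code]
Everything rendered past the refresh rate turns into idle time (or torn frames with V-sync off); the raw number mostly tells you how much headroom the GPU has over the CPU.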
-
For those who are interested, again...
On the issue of a GTX Kepler-series GPU w/ GPU Boost downclocking below the base 3D clocks when using the in-game AA settings vs using the driver settings alone:
I thought about this some more & realized that when I was using the driver settings I was using the MAX settings for AF, AA & TAA (16x AF, 32x CSAA & TAA set at 8x supersampling) w/ V-sync set to On. This clearly had GPU Boost setting the GPU to the max boost level as set in the vBIOS... but what if the actual in-game settings aren't that high? Would lower settings cause GPU Boost to react differently?
So I reran this TITAN using the driver settings alone but set at the LOWEST settings that can be run (2x AF, 2x AA w/ TAA set for multisample) & left V-sync set at On (as well as the rest of the driver settings) to see what the card would do. Since I didn't know what level in the driver range the actual in-game settings match, the results from this run should give me some indication of the in-game setting level if GPU Boost did cut the clock speeds below the base clock setting...
I checked afterwards & saw that GPU Boost had indeed downclocked the GPU below the base 3D clock (837 MHz) to roughly the same clock speed the in-game settings showed in earlier tests (614-627 MHz range), regardless of the GPU temps or voltage levels (which were very low & nowhere close to any throttling ranges). Also, as in earlier tests, the game ran flawlessly.
So from these results I'm gathering that the Nvidia GPU Boost feature is using the reported AF, AA & TAA levels - whether from the game or at the driver - as a workload indicator to determine where to boost/deboost (if that's a word) the GPU 3D clock speeds, in addition to the GPU temp & voltage levels. This was never mentioned in any of the articles I read about GPU Boost & how it operates.
Hmmmm... interesting.
For the record, I NEVER buy a vid card for FPS capability, as I have fully understood for years that the monitor's refresh rate is the actual viewing speed, so I have ALWAYS used V-sync to lock the GPU rendering/flipping speed & the monitor's refresh rate together, regardless of the game I'm running on my box(es). I've always bought based on the GPU's ability to render at the highest visual QUALITY (I LOVE eye candy) & MAINTAIN the monitor refresh rate regardless of game activity levels (I want to get as close to virtual reality as I can sensibly afford). To do this requires BOTH the CPU & its platform as well as the GPU & its platform to perform at levels that complement each other & not hinder one over the other. FWIW, this is why I went w/ the Intel X79 platform using an Intel i7 3820 SB-E CPU w/ Win 7 HP & 16GB of memory in quad-channel configuration running off an SSD... takes care of the CPU side of things.
I had noticed, before I decided to buy, that the Asus site listed frame timing metering at the hardware level of the card as one of the features of the TITAN. This was not mentioned on EVGA's site, but I know that both are offering the Nvidia reference TITAN, so if one does it they ALL do it (or should). This capability has NEVER been mentioned on any of the prior Kepler-class GPUs at the hardware level. That speaks towards rendering QUALITY to me, so I wanted to see it for myself, & I only want a SINGLE card solution (the main reason I went w/ the Asus Rampage IV Gene mobo... didn't want to pay for the extra board space that I was NEVER gonna use, but wanted a high-quality mobo w/ high-quality on-board Intel Gbit LAN & on-board Creative X-Fatality SS sound). I have never bought any computer part at the cutting edge before this current build, & since for once I had the extra money to burn, I went for it w/ this vid card (I've preferred EVGA Nvidia cards since the ATI 9700 / Radeon X800 / HD 3870 days & haven't found a good enough reason to switch back, outside of cost alone).
From the results I've seen so far, this TITAN has met every expectation I was looking for & then some, so I'm a very happy camper.
So Skuzzy, if y'all at HTC want to improve the graphical quality in AHII some more, you ain't waiting on me... I'm waiting on y'all!
:D
This lemming is a happy one!
:salute
-
Did they finally do something with the 400MHz RAMDAC chip? I can't find any reference on any of the Titan cards...
Don't know the answer to that, Gyrene, but now that you've brought it up, I don't recall reading anything about it either.
Another interesting question to check on...
:)
-
Did they finally do something with the 400MHz RAMDAC chip? I can't find any reference on any of the Titan cards...
I did a quick check on the EVGA site on the current product line of cards & I had to go all the way back to the 6 series (6000 series) of vid cards to find any reference to 400MHz RAMDAC chips. The 7 series seems to be fully discontinued, & from the 8 series (8000 series) forward the listings show CUDA cores for shader purposes & no mention of RAMDAC chips at all.
I got this data off the product spec sheets listed w/ the products.
Given this info, Gyrene, I would have to say that the 400MHz RAMDAC chips were done away with some time ago on Nvidia cards, after the 6-7 series... unless the CUDA core lingo is covert code to mask the continued use of RAMDAC chips...
:D :salute
-
I did a quick check on the EVGA site on the current product line of cards & I had to go all the way back to the 6 series (6000 series) of vid cards to find any reference to 400MHz RAMDAC chips. The 7 series seems to be fully discontinued, & from the 8 series (8000 series) forward the listings show CUDA cores for shader purposes & no mention of RAMDAC chips at all.
I got this data off the product spec sheets listed w/ the products.
Given this info, Gyrene, I would have to say that the 400MHz RAMDAC chips were done away with some time ago on Nvidia cards, after the 6-7 series... unless the CUDA core lingo is covert code to mask the continued use of RAMDAC chips...
:D :salute
CUDA and RAMDAC are two different things. The RAMDAC is the component that converts the digital image signal to analog for analog monitors to display. Nvidia was notorious for poor RF filtering in the analog circuit, which caused a very soft and blurry image on VGA monitors. This is the reason why I stopped buying Nvidia back in the day, and I guess the decision has stuck even though nowadays there would be no reason for it.
-
CUDA and RAMDAC are two different things. The RAMDAC is the component that converts the digital image signal to analog for analog monitors to display. Nvidia was notorious for poor RF filtering in the analog circuit, which caused a very soft and blurry image on VGA monitors. This is the reason why I stopped buying Nvidia back in the day, and I guess the decision has stuck even though nowadays there would be no reason for it.
Thanks, MrRiplEy, for the explanation.
After I typed that I went & did a search on RAMDAC chips & found that out, but I appreciate your response.
Yep, I remember the soft & blurry images when I had my ViewSonic 21PS CRT monitor (from the 5, 6 & 7 series Nvidia cards). I didn't jump on the 8 series when they came out, as I needed a platform upgrade then to make that seem sensible (I had an EVGA 7900 GTX+ card on an Intel P4 Northwood setup then... once I upgraded to the AMD Athlon X2 system using an Nvidia-based Asus A8-N mobo, I jumped to the EVGA 9800 GTX+ vid card).
I stayed w/ EVGA Nvidia cards over ATI (now AMD) as they always seemed to perform better on my systems from a graphical smoothness-under-load standpoint. Yes, the ATI cards did look better graphically overall, but w/ the Athlon upgrade I also got to liking the Nvidia-chipset mobos as opposed to Intel, so that was more reason to stick w/ Nvidia then.
I haven't noticed any blurry images from Nvidia cards since the 9800 GTX+ vid card. I went w/ an HP 2710m LCD monitor after my CRT gave up shortly thereafter & went to a fully DVI signal then instead of VGA... maybe that's the reason why. When the CRT (ViewSonic UltraBrite A91f+ 17", approx 10 yrs old now) gives up on my wife's box I'll give her this HP & I'll be looking for a 2560 x 1600 IPS-panel LCD monitor then, probably around a 30"-32" size. 120Hz is nice if the price is right, but 60Hz is good enough for me.
With my current Intel X79 setup I can go either way, so we'll see what AMD comes up with to compete after the TITAN/GTX 700 series. The AMD Tahiti-based cards are looking interesting...
But that will depend on what Hitech & crew do going forward w/ AHII as far as graphics are concerned, cause as of right now I don't EVEN need to worry about graphics performance in AH for quite some time w/ this EVGA GTX TITAN, as the graphics are absolutely beautiful in every aspect...
:salute
-
I haven't noticed any blurry images from Nvidia cards since the 9800 GTX+ vid card. I went w/ an HP 2710m LCD monitor after my CRT gave up shortly thereafter & went to a fully DVI signal then instead of VGA... maybe that's the reason why.
Yes the digital DVI/HDMI/DP connections do not suffer from the analog filtering problem.
-
For those interested...
I've run some different setting levels in AHII at the driver level to determine where GPU Boost puts the GPU clock speeds according to the setting load:
I set all the other driver settings - AF & TAA - to max (16x AF, 8x SS TAA) w/ TF at High Quality (max) & V-sync On... this left the AA setting to determine the load, as follows. The card was left at stock settings:
At 8x CSAA, GPU Boost put the GPU around the base clock speed of 837 MHz (814-827 MHz).
At 8x AA, GPU Boost put the GPU slightly above the base clock speed of 837 MHz (840-852 MHz).
At 16x CSAA, GPU Boost put the GPU roughly between the base clock speed of 837 MHz & the advertised boost speed of 876 MHz (852-872 MHz).
At 16x QCSAA & up, GPU Boost put the GPU at the max vBIOS boost speed of 980 MHz.
GPU temps held around 57°C-62°C & power at 52%-74% throughout the testing, so I experienced no GPU throttling in any of these runs because of temp/power... only due to the AA setting levels.
The game ran flawlessly throughout this process, but the experience was noticeably crisper as the GPU speeds boosted upwards.
I then started messing w/ the GPU offset setting alone to see where it would go on stock temp/power settings (80°C / 100% power):
At a +100 offset the GPU hit 1072 MHz, flatlined, w/ no temp/power increases measured.
At a +200 offset the GPU hit 1172 MHz, again flatlined, w/ no temp/power increases measured.
Stock mem speed flatlined at 3005 MHz (I had posted 3502 MHz earlier; that was a typing error, sorry) throughout the whole process.
All data came from PrecisionX 4.10.
:salute
I stopped here as I didn't want to go any higher.
-
The GTX 780 just went live.
Hard release, 10% slower than the Titan at 65% of its price. And after overclocking they are essentially equal (unless you need more than 3GB of RAM).
If I had to guess, when both are overclocked they will still be very close, closer than the stock 10% difference, because the limitation will be heat, and the GTX 780 produces slightly less of it, allowing it to be pushed slightly harder.
http://hardocp.com/article/2013/05/23/nvidia_geforce_gtx_780_video_card_review
http://www.anandtech.com/show/6973/nvidia-geforce-gtx-780-review
If I hadn't bought a 4GB 680 about 6 months ago, I'd be all over this. For now I'm going to play it cool for a month, then make a decision; the 680 can still be resold for a good price. I might wait for a double-memory 780 like the 680 I currently have.
Skipping this generation isn't unreasonable either. $650 is a lot for a GTX x80 at release; normally they're $500. AMD hasn't been holding up their end of the bargain, so Nvidia gets to price as high as they want. It's funny: without the Titan, people would be pissed about this price point, but now everyone is going to say "it's a $350 cheaper Titan, so it's worth it." That's certainly what my instinct is telling me.
-
So this baby Titan is the 780; I'm guessing the 680 will transform into the 770, the 670 into the 760 Ti, and so on.
A fast one for sure, but if money doesn't count, the winner is the Titan or the 7990; and if price/performance comes into the equation, a 7970 GHz Edition is just 10-15% slower...
And I bet there won't be a GTX 790; the GK110 is just that unbelievably huge :)
-
If they price the 770, 760 Ti, 760, etc. the same as the equivalent 600 series cards at launch, AMD could be really, really screwed. If they price them higher, which they could do because of the lack of competition, IMO it means they actually want AMD to remain a competitor instead of kicking them out of the market entirely.
-
If they price the 770, 760 Ti, 760, etc. the same as the equivalent 600 series cards at launch, AMD could be really, really screwed. If they price them higher, which they could do because of the lack of competition, IMO it means they actually want AMD to remain a competitor instead of kicking them out of the market entirely.
What are you drinking, sir? The 7970 GHz Edition is a good 10% faster yet cheaper than the 680... and for the same $1,000, the 7990 is quite a bit faster than the Titan or the 690...
Now, if Nvidia could cut the prices by a good 20% they would have a nice advantage, but honestly this 700 series offers nothing new, only a slightly slower version of the Titan and some renamings. The same thing is quite possible with the AMD 8000 series, though.
Still, I hope the competition forces prices down a bit, as nowadays video cards are just amazingly expensive; just compare the 2009 upper-mid-range HD 4850 / GTX 260 to the 2013 mid-range 7870 / 660 Ti...
-
I just sold my two 4GB GTX 680s, and my 780 will be here tomorrow. I'll post some results with AH2 sometime in the evening. The 3GB card with a moderate overclock using Afterburner will exceed the superclocked Titan by about 3% in most of the games I'm playing; I saw it up close and personal at the shop I deal with, which had sold out of 780s in a couple of hours.
The 780 is easily the best performance for the money among the higher-end cards. That said, AMD is waiting until after the summer to release its really new tech, but the 7990 tests so far show it to be faster than the Titan and the 780, though no accurate pricing exists yet to compare it to the Nvidia cards. The 780 will be my last card swap until fall, when the new AMD stuff will be out.
-
The Nvidia GTX 770 is out, and as I predicted its MSRP is $400. It is essentially a souped up GTX 680. It benches better than the 7970 GE at a lower price (although if you sell the 7970's game bundle, they end up the same).
I would expect the GTX 760 Ti, when it comes out, to win its price performance battle vs AMD as well. Considering AMD doesn't have new parts due until the end of Q4 this year (if not later), the red team could be hurting; expect prices to drop or game bundles / rebates to get better.
If you're looking for a video card and are spending less than $400, it would be wise to hold off for another month or two to give time for the price wars to work in your favor. At $400 or above, Nvidia is the way to go.
-
The only difference between the 680 and the 770 is that the 770 runs at 1048MHz stock instead of the 680's 1006MHz - 4-5%. More or less the same performance as the 7970 GE (a little more with some of the new effects disabled, a bit less on full). And yes, it's a tiny bit cheaper than the 680, even though it's a "new product," and prices are likely to go down a bit in a couple of weeks. Most importantly, it's finally cheaper than the 7970 GE!
I'm happy and not. Happy because I can buy the same product cheaper, and AMD has to do something, as the 770 now has a better price/performance ratio than their card. And not happy because an x70, and also an x80, GPU has never been this expensive before. We are getting very far from the "golden age" when a 4850 or a GTX 260+ was no more than $200-220.
-
I'm happy and not. Happy because I can buy the same product cheaper, and AMD has to do something, as the 770 now has a better price/performance ratio than their card. And not happy because an x70, and also an x80, GPU has never been this expensive before.
Not true. The GTX 670 debuted at $399, same as the GTX 770. The GTX 280 debuted at $650, same as the GTX 780. There are no historical highs here; adjusted for inflation, they are actually cheaper than the examples I just cited.
People were actually surprised the 770 was priced as low as it was; they were expecting $450. AMD needs to step up so that the next x80 part will be priced at the lower end of its historical range rather than the higher end - I'll agree with that.