Author Topic: Titan goes live  (Read 2346 times)

Offline zack1234

  • Plutonium Member
  • *******
  • Posts: 13214
Re: Titan goes live
« Reply #15 on: April 08, 2013, 07:42:10 AM »
 :)
There are no pies stored in this plane overnight

                          
The GFC
Pipz lived in the Wilderness near Ontario

Offline Chalenge

  • Plutonium Member
  • *******
  • Posts: 15179
Re: Titan goes live
« Reply #16 on: April 08, 2013, 04:03:54 PM »
I have an i7 3960X and 3-way Titan SLI, only playing COD2 sitting in a corner waiting for the lemmings to show up, but I would like to upgrade. Could you help me please?  ;)

There is no help for you.
If you like the Sick Puppy Custom Sound Pack, then please consider contributing to future updates by sending a month's dues to Hitech Creations for account "Chalenge." Every little bit helps.

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Titan goes live
« Reply #17 on: April 14, 2013, 05:15:24 PM »
Income tax refund is now in the bank & the struggle to resist just got harder!

I read today that there is a Titan LE version in the making...

 :x :pray :D :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline zack1234

  • Plutonium Member
  • *******
  • Posts: 13214
Re: Titan goes live
« Reply #18 on: April 15, 2013, 01:45:46 AM »
Buy TrackIR instead, it's awesomesauce in game :old:
There are no pies stored in this plane overnight

                          
The GFC
Pipz lived in the Wilderness near Ontario

Offline Dragon

  • Platinum Member
  • ******
  • Posts: 7055
      • AH JUGS
Re: Titan goes live
« Reply #19 on: April 15, 2013, 11:40:59 AM »
I have an i7 3960X and 3-way Titan SLI, only playing COD2 sitting in a corner waiting for the lemmings to show up, but I would like to upgrade. Could you help me please?  ;)

 :rofl :rofl
SWchef  Lieutenant Colonel  Squadron Training Officer  125th Spartan Warriors

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Titan goes live
« Reply #20 on: May 12, 2013, 01:17:39 AM »
Well, the wait is over...

I, Pudgie, am now the proud owner of an EVGA GTX TITAN vid card. Bought it direct from EVGA for the grand sum of $1,009.23.
 :D :salute
Couldn't resist the temptation to have it... especially when I saw that Newegg had stopped carrying the vanilla card at one point (they only had the Superclocked, Signature & Hydro Copper series, so I thought Nvidia had stopped making the vanillas). I checked the EVGA site, saw that they were still showing the vanilla Titans, so I ordered one...

Now the price is going up on them all...

Lemmings like me... :x

Popped her in & loaded the 314.22 WHQL drivers & all is well.

This card is quiet & is the SMOOTHEST-running vid card that I've ever owned...

Stutter-free operation w/ all graphics settings at max & I mean all of them.

I tested this card (since it's a Kepler-based GPU with GPU Boost 2.0) with the in-game graphics setting in Video Settings set to Most and the Nvidia driver set to Use the 3D Application settings for AF, AA & V-sync, with TF at High Quality, and ran the game. This TITAN showed the exact same pattern as my GTX 670 FTW did when run the same way: the card's GPU Boost set the GPU clocks some 224 MHz BELOW the card's base clock setting (actual in-game GPU clocks around 614-627 MHz; base clock was 837 MHz), with the memory clock pegged at 3005 MHz. As with the 670, the game ran flawlessly, but the GPU clocks were below the base clock.

I then went into the game's Video Settings, set the in-game graphics slider to None, went into the Nvidia driver, set it to Override any Application Settings, set AF & AA to the driver's max settings (16x AF, 32x CSAA) with V-sync On and TF at High Quality, and ran the game again. Just as the 670 did then, this TITAN's GPU Boost set the GPU clocks at 980 MHz (pegged 104 MHz above the max boost clock of 876 MHz), with the memory clock again at 3005 MHz. GPU temp wasn't even an issue (between 59°C-62°C regardless, with max power usage at 74% against a 100% power target). Again the game ran flawlessly, but the GPU clocks were MUCH higher, as GPU Boost did what it should do when the GPU is running well below its temp & power limits.

Something in the in-game graphics setting is influencing the Nvidia GPU Boost algorithm to underclock the Kepler GPUs on my box...
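
For anyone who wants to repeat this kind of test, here's a rough sketch of how the clocks could be logged while the game is running. It assumes Python and that nvidia-smi (which ships with the Nvidia driver) is on the PATH; the query fields, file name & timings are just placeholders for illustration, not anything from HTC or Nvidia:

[code]
import csv
import subprocess
import time

FIELDS = "clocks.gr,clocks.mem,temperature.gpu,power.draw,utilization.gpu"

def sample():
    """Grab one snapshot of the queried GPU counters from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader,nounits"],
        text=True,
    )
    return [v.strip() for v in out.strip().split(",")]

def log(path="gpu_clocks.csv", seconds=120, interval=1.0):
    """Poll the GPU once per interval and write the samples to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s"] + FIELDS.split(","))
        start = time.time()
        while time.time() - start < seconds:
            writer.writerow([round(time.time() - start, 1)] + sample())
            time.sleep(interval)

if __name__ == "__main__":
    log()  # start this, alt-tab into the game, then look at the logged clocks
[/code]

Run it once with the in-game slider at Most and once with the driver override, then put the two logs side by side & it should show whether the downclocking follows the settings.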

Hmmmm... two different Kepler GPUs with two different versions of the GPU Boost algorithm, running on two different driver versions, but exhibiting the EXACT same behavior in the EXACT same game on the EXACT same platform... so it's got to be some issue with the i7 3820 SB-E platform I'm using?

Naw, I don't think so...

Anyway, this TITAN is way too much card for what I use this box for... but this box is way too much for what I use it for as well, so I'm all set for a while...

Well, gotta go play some more...

I do love this card!

 :aok :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline zack1234

  • Plutonium Member
  • *******
  • Posts: 13214
Re: Titan goes live
« Reply #21 on: May 12, 2013, 03:47:01 AM »
I bought three and threw one away :rofl

There are no pies stored in this plane overnight

                          
The GFC
Pipz lived in the Wilderness near Ontario

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Titan goes live
« Reply #22 on: May 17, 2013, 02:07:20 PM »
If you're interested,

There is an interview you can watch at PC Perspective's web site where Ryan Shrout has Tom Petersen, an Nvidia rep, on, and the topic is the advent of the FCAT process of using Frame Rating to further refine graphics performance: not just by GPU power & speed alone, but by being able to "read" each graphics frame that is written to the frame buffer AND flipped to be displayed onscreen, to ensure that every frame is rendered properly and fully, has all its inputs applied at the proper time within the window the vid card is given, and is displayed in its proper sequence. The discussion is a good listen IMHO, as it demonstrates why using FRAPS (or other similar software) to evaluate a video card's graphics performance misses too much: FRAPS picks up the frames from the game engine (CPU side) but doesn't account for the frame rendering & flip-to-display process (GPU side), so we may see errant graphics performance on screen that isn't necessarily due to GPU/memory FPS performance issues.
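
To make the FRAPS limitation he talks about concrete, here's a rough sketch of the kind of per-frame math frame rating is getting at. It's my own illustration, not Nvidia's or PC Perspective's tooling, and it assumes a FRAPS-style frametimes log with one cumulative millisecond timestamp per frame (the file name & layout are assumptions):

[code]
import statistics

def frame_time_stats(path="frametimes.csv"):
    """Average FPS plus the 99th-percentile frame time, in milliseconds."""
    with open(path) as f:
        stamps = [float(line.rsplit(",", 1)[-1])
                  for line in f if line.strip() and line.strip()[0].isdigit()]
    # Turn cumulative timestamps into per-frame intervals.
    frame_times = [b - a for a, b in zip(stamps, stamps[1:])]
    avg_fps = 1000.0 / statistics.mean(frame_times)
    p99 = sorted(frame_times)[int(0.99 * (len(frame_times) - 1))]
    return avg_fps, p99

if __name__ == "__main__":
    fps, p99 = frame_time_stats()
    # A high average FPS paired with a fat 99th-percentile frame time is the
    # "fast on paper, stuttery on screen" case frame rating is meant to catch.
    print(f"average FPS: {fps:.1f}, 99th-percentile frame time: {p99:.1f} ms")
[/code]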

This process is not proprietary to Nvidia as AMD is working w/ this process as well. Nvidia has been working w/ this for the last 3 years.

This is one of the features incorporated in the TITAN, as this vid card is Nvidia's first attempt at implementing all that research & testing at the hardware level of the card (GPU), controlled through the driver, so the card self-checks the GPU rendering & display process in addition to actually doing the work of rendering & displaying the frames.

The goal is to have the highest FPS that the card can run AND the lowest frame times possible, so we the users can have the fastest, smoothest graphics experience that can be provided.

I'm guessing that both AMD & Nvidia have hit the same wall with GPUs that Intel & AMD hit with CPUs some time back... pure processing speed & power--though necessary--isn't enough to fully achieve the goal of providing the ideal user EXPERIENCE with these products.

I'm also guessing this is why Nvidia needed the GK110 GPU instead of the GK104: to use the FCAT process on a computer you need a video capture card to capture the frame flipping from the frame buffer to the display, look at the frame sequencing & quality, and make adjustments at the GPU level to correct any errors found, all at high GPU rendering rates. The GK110 GPU is more than capable of handling both processes at the same time, thus the creation of the TITAN as a single-GPU version.

The GTX 690 is a dual-GPU version on which they can implement this process & manage it across the two GK104 GPUs through the driver.

When you see the data that Ryan Shrout presents from using the FCAT tools to check this performance across the latest Nvidia & AMD products, you will see that both are tuning their products through this process, and that Nvidia has invested a LOT of time in using FCAT to refine their product line's performance... the TITAN is the flagship card built with FCAT in mind & it shows.

From what I gathered, this is going to be the basis for the GTX 700 series cards.

 http://www.pcper.com/

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline gyrene81

  • Plutonium Member
  • *******
  • Posts: 11629
Re: Titan goes live
« Reply #23 on: May 17, 2013, 02:17:58 PM »
Did they finally do something with the 400 MHz RAMDAC chip? I can't find any reference on any of the Titan cards...
jarhed  
Build a man a fire and he'll be warm for a day...
Set a man on fire and he'll be warm for the rest of his life. - Terry Pratchett

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: Titan goes live
« Reply #24 on: May 17, 2013, 02:48:37 PM »
Frame rates beyond the ability of the monitor to display them are pretty worthless.  They need to quit obsessing over frame rates above the refresh rates of the monitors.
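
Just to put rough numbers on that (back-of-the-envelope figures, not a benchmark of anything):

[code]
REFRESH_HZ = 60    # what the monitor can actually display
RENDER_FPS = 200   # what the card claims to be rendering
SECONDS = 10

rendered = RENDER_FPS * SECONDS
displayed = min(RENDER_FPS, REFRESH_HZ) * SECONDS

print(f"frames rendered:   {rendered}")                      # 2000
print(f"frames displayed:  {displayed}")                     # 600
print(f"frames never seen: {rendered - displayed} "
      f"({(rendered - displayed) / rendered:.0%})")          # 1400 (70%)
[/code]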
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline gyrene81

  • Plutonium Member
  • *******
  • Posts: 11629
Re: Titan goes live
« Reply #25 on: May 17, 2013, 03:06:24 PM »
LOL, no way that's going to happen, Skuzzy. It's one of the greatest marketing tools they have stumbled on... pseudo frame rates. Put enough memory on the board for a big buffer and let the GPU show 200 fps. A little anti-aliasing here, a little tessellation there... as long as it appears seamless, nobody cares how it got there.
jarhed  
Build a man a fire and he'll be warm for a day...
Set a man on fire and he'll be warm for the rest of his life. - Terry Pratchett

Offline Skuzzy

  • Support Member
  • Administrator
  • *****
  • Posts: 31462
      • HiTech Creations Home Page
Re: Titan goes live
« Reply #26 on: May 17, 2013, 04:37:36 PM »
LOL, no way that's going to happen, Skuzzy. It's one of the greatest marketing tools they have stumbled on... pseudo frame rates. Put enough memory on the board for a big buffer and let the GPU show 200 fps. A little anti-aliasing here, a little tessellation there... as long as it appears seamless, nobody cares how it got there.

If enough people are educated about it, then that can change.

The only time it is helpful to know the raw frame rate is when you are trying to assess whether the CPU or the video card is the bottleneck in your computer.
Roy "Skuzzy" Neese
support@hitechcreations.com

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Titan goes live
« Reply #27 on: May 18, 2013, 02:54:40 PM »
For those who are interested, again...

On the issue of a GTX Kepler-series GPU's GPU Boost downclocking below the base 3D clocks when using the in-game AA settings vs the driver settings alone:

I thought about this some more & realized that when I was using the driver settings, I was using the MAX settings for AF, AA & TAA (16x AF, 32x CSAA & TAA set at 8x Supersampling) with V-sync set to On. That clearly had GPU Boost setting the GPU to the max boost level set in the vBIOS... but what if the actual in-game setting(s) aren't that high? Would lower settings cause GPU Boost to react differently?

So I reran this TITAN using the driver settings alone, but set at the LOWEST settings that can be run (2x AF, 2x AA with TAA set to Multisample) & left V-sync set to On (as well as the rest of the driver settings), to see what the card would do. I didn't know what level in the driver's range the actual in-game settings would match, so the results from this run should give me some indication of the in-game setting level if GPU Boost did cut the clock speeds under the base clock setting...

I checked afterwards & saw that GPU Boost had indeed downclocked the GPU below the base 3D clocks (837 MHz) to approximately the same clock speed that the in-game settings showed in earlier tests (614-627 MHz range)... regardless of the GPU temps or voltage levels (which were very low & nowhere close to any throttling ranges). Also, as in the earlier tests, the game ran flawlessly.

So from these results I'm gathering that the Nvidia GPU Boost feature is using the reported AF, AA & TAA levels--whether from the game or from the driver--as a workload indicator to determine where to boost/deboost (if that's a word) the GPU 3D clock speeds, in addition to the GPU temp & voltage levels... this was never mentioned in any article I read concerning GPU Boost & how it operates.

Hmmmmmm.....interesting.
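
If anyone wants to sanity-check that guess on their own box, here's a rough sketch of the comparison, assuming clock logs like the ones from the nvidia-smi logger I sketched a few posts back (the file names & the clocks.gr column are just placeholders):

[code]
import csv
import statistics

def median_core_clock(path):
    """Median graphics clock (MHz) over one logged run."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return statistics.median(float(r["clocks.gr"]) for r in rows)

if __name__ == "__main__":
    low = median_core_clock("run_2xAA.csv")       # driver forced to 2x AF / 2x AA
    high = median_core_clock("run_32xCSAA.csv")   # driver forced to 16x AF / 32x CSAA
    print(f"median core clock, low-AA run:  {low:.0f} MHz")
    print(f"median core clock, high-AA run: {high:.0f} MHz")
    # If the low-AA run sits well under the 837 MHz base clock while the
    # high-AA run boosts past it, that supports the "AA level steers GPU
    # Boost" guess; if both behave the same, the cause lies elsewhere.
[/code]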

For the record, I NEVER buy a vid card for FPS capability, as I have fully understood for years that the monitor's refresh rate is the actual viewing speed, so I have ALWAYS used V-sync to lock the GPU rendering/flipping speed to the monitor's refresh rate, regardless of the game I run on my box(es). I've always bought based on the GPU's ability to render at the highest visual QUALITY (I LOVE eye candy) & MAINTAIN the monitor's refresh rate regardless of game activity levels (I want to get as close to virtual reality as I can sensibly afford). To do this requires BOTH the CPU & its platform and the GPU & its platform to perform at levels that complement each other & not hinder one over the other. FWIW this is why I went with the Intel X79 platform using an Intel i7 3820 SB-E CPU with Win 7 HP OS & 16Gb of memory in quad-channel configuration, running off an SSD... that takes care of the CPU side of things.

I had noticed before I decided to buy that the Asus site listed frame-time metering at the hardware level of the card as one of the features of the TITAN. This was not mentioned on EVGA's site, but I know that both are offering the Nvidia reference TITAN, so if one does it they ALL do it (or should). This capability has NEVER been mentioned for any of the prior Kepler-class GPUs at the hardware level (GPU). That speaks towards rendering QUALITY to me, so I wanted to see it for myself, & I only want a SINGLE-card solution (the main reason why I went with the Asus Rampage IV Gene mobo... I didn't want to pay for the extra board that I was NEVER gonna use, but wanted a high-quality mobo with high-quality on-board Intel Gbit LAN & on-board Creative X-Fatality SS sound).

I have never bought any computer part at the cutting edge before this current build, & since I for once had the extra money to burn, I went for it with this vid card (I've preferred EVGA Nvidia cards since the ATI 9700 to Radeon 800 to HD 3870 days & haven't found a good enough reason to switch back... outside of cost alone, that is).

From the results I've seen so far, this TITAN has met every expectation I was looking for & then some, so I'm a very happy camper.
So Skuzzy, if y'all at HTC are wanting to improve the graphical quality in AHII some more, you ain't waiting on me... I'm waiting on y'all!

 :D

This lemming is a happy one!

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Titan goes live
« Reply #28 on: May 18, 2013, 02:58:19 PM »
Did they finally do something with the 400 MHz RAMDAC chip? I can't find any reference on any of the Titan cards...

Don't know the answer to that, Gyrene, but now that you've brought it up, I don't recall reading anything on this either.

Another interesting question to check on...

 :)
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: Titan goes live
« Reply #29 on: May 18, 2013, 03:58:01 PM »
Did they finally do something with the 400 MHz RAMDAC chip? I can't find any reference on any of the Titan cards...

I did a quick check on the EVGA site on the current product line of cards, & I had to go all the way back to the 6 series (6000 series) of vid cards to find any reference to 400 MHz RAMDAC chips. The 7 series seems to be fully discontinued, & from the 8 series (8000 series) onward the listed cards show the CUDA cores being used for shader purposes, with no mention of RAMDAC chips at all.

I got this data off the product spec sheets that are listed w/ the product.

Given this info, Gyrene, I would have to say that the 400 MHz RAMDAC chips were done away with some time ago on Nvidia cards, after the 6-7 series... unless the CUDA core lingo is covert code to mask the continued use of the RAMDAC chips...

 :D :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd