Author Topic: PCIE 3.0  (Read 978 times)

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
PCIE 3.0
« on: August 08, 2015, 12:38:32 PM »
So AH3, looks like I will need to upgrade my vid cards when things come to fruition.

I will not build a new PC for AH3 because my CPU cruises at 4GHz with 4 cores, so I see no need. But it's a 1st gen X58 mobo and the PCIE slot is only 2.0.

When I do upgrade I will forgo the 2 card SLI path because AH does not support it and ultimately, it did not seem to get me much since I use a single monitor. However the vid cards are PCIE 3.0 now.

My question is will I lose a lot of performance putting a single 3.0 card in a 2.0 slot?

Any links or info appreciated.
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: PCIE 3.0
« Reply #1 on: August 08, 2015, 01:40:59 PM »
http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review/14#.VcZEeJBFC6I

http://www.enthusiastpc.net/articles/00003/3.aspx

There's more info out there, MADe but the consensus is that it will work just fine & most likely not be constrained performance-wise.

According to what I've read & studied, the performance difference between PCI-E 2.x and PCI-E 3.x is more of a total-package issue than a single-device issue. So if you're happy w/ the performance of the rest of your box, I wouldn't sweat it; the link will simply train to match the PCI-E specs of your mobo's slot.

The only issue that I can deduce from all this is trying to install a PCI-E 3.x spec device on a mobo that is PCI-E 1.x max spec'd, as most don't take the PCI-E backwards compatibility specs that far back, and from looking at the data you could run into lane throughput issues. Some manufacturers might provide it, but most may not.

My 2 cents given.....................

Hope this helps you out.

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: PCIE 3.0
« Reply #2 on: August 08, 2015, 01:51:36 PM »
ty

answered exactly in a way I had hoped for.
 I will of course spend some time Googling, but you have given me a place to start from.
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline Chalenge

  • Plutonium Member
  • *******
  • Posts: 15179
Re: PCIE 3.0
« Reply #3 on: August 08, 2015, 01:57:40 PM »
. . .When I do upgrade I will forgo the 2 card SLI path because AH does not support it and ultimately, it did not seem to get me much since I use a single monitor.

Your choice, but SLI DOES work with AH whether AH "supports" (not really sure what you mean by saying that) SLI or not.
If you like the Sick Puppy Custom Sound Pack then please consider contributing for future updates by sending a month's dues to Hitech Creations for account "Chalenge." Every little bit helps.

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: PCIE 3.0
« Reply #4 on: August 08, 2015, 02:38:12 PM »
Your choice, but SLI DOES work with AH whether AH "supports" (not really sure what you mean by saying that) SLI or not.

It does not get you an overall performance increase when it comes to AH.

I currently run 2 cards now, 1600MB of DDR3 RAM combined, but the 2 cards cannot handle AH3; the RAM is not additive. I have tried AH with a single GPU as well, no difference in AH performance.

The best thing I discovered that improves the game is SweetFX. It improves the look tremendously in AH 2 and 3, but it's not a performance increase. It just enriches the colors, contrasts and depth of the game. My AH profile has everything off; anti-aliasing is off in game video settings. I let SweetFX do all that, and it does it so much better it's sick.

Sorry Hitech, but when AH3 is up and running I will use SweetFX to get the depth, and a good card to do the rendering of objects. Believe me, I tinkered with every possible combination; SweetFX is the bomb when it comes to the game's look.
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: PCIE 3.0
« Reply #5 on: August 08, 2015, 02:48:31 PM »
Had to post this; it's from one of Pudgie's links.
PCI express basics




PCI Express 1.x, 2.x and 3.x

 Okay so by now we all know that PCI Express connections come in varying numbers of lanes and that the number of lanes is the first thing that determines how fast your device (graphics card, onboard LAN, etc.) can communicate with the rest of your PC.

 As most of you will have noticed however, PCI Express connections also come with a version and revision number, there's:

•PCI Express 1.0 and 1.1 (a slight revision of 1.0) collectively referred to as 1.x
•PCI Express 2.0 and 2.1 (a slight revision of 2.0) collectively referred to as 2.x
•PCI Express 3.0


PCI Express 2.0 vs 2.1

 This is a frequently asked question and most likely because a lot of videocards specify that they are version 2.1 compliant while a lot of the mainboards specify that they are version 2.0 compliant. To answer what most of you probably really want to know: A PCI Express 2.1 videocard should run just fine in a PCI Express 2.0 motherboard. The PCI Express version that motherboards support is determined by either the mainboard chipset (AMD and Intel X58) or the CPU (Intel Sandy Bridge and Sandy Bridge E).

 A PCI Express 2.1 videocard should run just fine in a PCI Express 2.0 motherboard / system because the standard is designed to be both upwards and downwards compatible. The datarate remains unchanged between 2.0 and 2.1 so there should not be any major performance hurdles either when mixing and matching these two revisions. We'll leave the details of the revisions aside for now because that will force us off what's most important (well to most of you anyway).

 What separates 1.x, 2.x and 3.x mostly is the transfer speed per lane:

•A PCI Express 1.x lane can transfer up to 250MB/s
•A PCI Express 2.x lane can transfer up to 500MB/s
•A PCI Express 3.x lane can transfer up to 1GB/s

 These are Megabytes and Gigabytes not bits, so quite fast, even on just a single lane :-)

 Obviously a 16 lane connection is still 16 times as fast as a single lane so:

•PCI Express 1.x does 16 x 250MB/s = 4GB/s on a x16 connection
•PCI Express 2.x does 16 x 500MB/s = 8GB/s on a x16 connection
•PCI Express 3.x does 16 x 1GB/s = 16GB/s on a x16 connection
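The per-lane and x16 numbers above are just multiplication; a quick sketch (the generation-to-bandwidth table is copied from the list above, function name is my own):

```python
# Per-lane bandwidth in MB/s for each PCI Express generation,
# as listed in the article above.
LANE_MB_S = {"1.x": 250, "2.x": 500, "3.x": 1000}

def link_bandwidth_mb_s(gen: str, lanes: int) -> int:
    """Total one-direction bandwidth of a PCIe link in MB/s."""
    return LANE_MB_S[gen] * lanes

# Reproduce the x16 figures from the list above.
for gen in ("1.x", "2.x", "3.x"):
    print(f"PCIe {gen} x16: {link_bandwidth_mb_s(gen, 16) // 1000} GB/s")
```

Running it prints 4, 8 and 16 GB/s for the three generations at x16, matching the bullets above.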


PCI Express 1.x, 2.x and 3.x are designed to be both upwards and downwards compatible.

 The standard that is used when you mix different generations of PCI Express in a system is the lowest standard that both components can understand. This means that if you put a PCI Express 3.0 card in a PCI Express 2.x mainboard, communication will be done using the 2.x standard and speed. This works both ways: if you insert a PCI Express 2.x card in a PCI Express 3.0 system, it will use 2.x.
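The "lowest common standard" rule is simple enough to model in one line; a toy sketch (the version numbers are illustrative, this is not how hardware link training actually works):

```python
def negotiated_gen(card_gen: float, slot_gen: float) -> float:
    """A mixed-generation PCIe link runs at the highest generation
    BOTH ends understand, i.e. the lower of the two versions."""
    return min(card_gen, slot_gen)

# A PCIe 3.0 card in a 2.x slot runs at 2.x speed...
print(negotiated_gen(3.0, 2.0))
# ...and it works the same the other way around.
print(negotiated_gen(2.0, 3.0))
```

Both calls yield 2.0, which is exactly the situation MADe is asking about: a 3.0 card in an X58 board's 2.0 slot.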


Normally this should work and I wouldn't anticipate any problems in these examples. It is however possible that cards that depend on advanced features of a new standard will perform sub-optimally or work incorrectly. So far, however, I have personally not encountered any problems like this.

 PCI Express 1.x is not really used anymore but those of you with a mainboard that still has 1.x are obviously at a speed disadvantage when it comes to talking to graphics cards or other peripherals that connect through PCI Express, by a factor of 2 (compared to 2.x).

 Most current systems at the time of writing support PCI Express 2.x maximum. The brand new Sandy Bridge E platform has been recently confirmed to work at PCIe 3.0 transfer speeds though!

•For AMD systems this speed limit is imposed by the chipset. Modern AMD chipsets support 2.x speeds.
•For Intel socket 1155 with a Sandy Bridge processor this speed limit is imposed by the CPU as well as the southbridge for the few lanes that run from there.
The above means that even if you have a "Gen3" board you will never run at 3.x speeds as long as you have a Sandy Bridge processor in your socket 1155!
 Ivy Bridge will have 16 lanes of PCI Express 3.0 running from the processor, however these will only do 3.0 speeds if all the other components on your mainboard and in the slots also support 3.0. If you have a board with an NF200 chip or any other switch that doesn't support 3.0 your PCI Express connections will be 2.x. The lanes running from the P67 or Z68 southbridge will obviously always be PCI Express 2.x so even with an Ivy Bridge 1155 processor only your processor lanes will be upgraded to PCIe 3.0.

 So to put it bluntly once again: most of you will never reap any benefit from a Gen3 board. Unless you have a thoroughly PCIe 3.0 ready board, a PCIe 3.0 graphics card and an Ivy Bridge processor you will be using PCIe 2.x. If just one of these components does not support 3.0 this will be the case and there is no unlocking, hacking or any other way around this.

 In the long run Ivy Bridge might make up for the x8/x8 graphics limitation on 1155 boards by doubling the speed per lane, but this will require new graphics cards in addition to a Gen3-capable mainboard. I'm not a fortune teller by any means, but I have never before upgraded every part of my computer except the mainboard, so I find this a strange upgrade path to say the least....

 In any case, those of you that still have an X58 board or the lucky few that can afford an X79 board will not have any problems for some time to come because you can easily run 2 graphics cards at x16/x16 speed on virtually any board out there. Even graphics cards that only support PCI Express 2.x will have no problem with this full complement of lanes.

 The same goes for users that have any of the AMD high end chipsets (790FX, 890FX, 990FX) although this has the downside of a slower CPU. This is much less of a problem however than people make it out to be:

 A final word on "feeding graphics cards" and the speed of your CPU.

 Every so often I hear people talk about a CPU not being able to feed a graphics card fast enough. This really is roadkill. Transfers of data to and from a PCI Express device, be it a disk controller, LAN controller, graphics card or whatever else, are done from the PC's main memory to the PCI Express device without the processor doing anything. This is done via a mechanism called "DMA", an abbreviation of "Direct Memory Access".

 Now for the fun part. Dual Channel DDR3 1333 memory (most of you have this or faster memory in your PC) has a transfer rate of just over 20GB/s. Given that most of you have PCI Express 2.x with a maximum of 16 lanes for your graphics card, this means that the transfer speed to (and from) your graphics card is limited to 8GB/s (x16 PCI Express 2.x = 8GB/s).

 This means that your memory can "saturate" the PCI Express connection to your graphics card twice over with room to spare!!!! Over 12GB/s of room to spare to be more precise. Even if you have 2 graphics cards at 8GB/s each only one memory transfer at a time will happen so that still leaves you with 4GB/s of free overhead.
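The saturation arithmetic above checks out in a few lines (both bandwidth figures are taken straight from the text; the variable names are mine):

```python
# Figures quoted in the article above.
MEM_BW_GB_S = 20       # dual-channel DDR3-1333, "just over 20GB/s"
PCIE2_X16_GB_S = 8     # one x16 PCIe 2.x link

# Headroom left after feeding one card, then two cards at once.
headroom_one_card = MEM_BW_GB_S - PCIE2_X16_GB_S
headroom_two_cards = MEM_BW_GB_S - 2 * PCIE2_X16_GB_S

print(headroom_one_card)   # the "over 12GB/s of room to spare"
print(headroom_two_cards)  # the "4GB/s of free overhead" with SLI
```

So even a two-card SLI rig on PCIe 2.x cannot out-demand dual-channel DDR3-1333 memory, which is the article's point.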


So why have a fast PCI Express connection?

 Well this is actually quite simple: the higher your bandwidth, the sooner the receiving device (video card or any other device) can work with the data. In effect, the faster the transfer to your first graphics card completes, the earlier it can go about its business rendering while the next transfer to your second graphics card is underway. Besides the graphics card, transactions to other devices are happening too: transfers to your audio chip, your disk controller, your network controller, etc. The faster your PCI Express connection, the faster your machine will be at a whole slew of different tasks.

 This is why an x16/x16 Bulldozer machine can outrun a Core i7 in framerates even though the Bulldozer processor is slower. As long as the processor is fast enough, the better PCI Express on the Bulldozer system can make a difference that compounds with the CPU speed. Of course, badly written game software that doesn't use the multiple cores effectively can still slow the entire game down; it just does not mean that there is no reason to have a fast PCI Express connection.
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline Chalenge

  • Plutonium Member
  • *******
  • Posts: 15179
Re: PCIE 3.0
« Reply #6 on: August 08, 2015, 02:56:49 PM »
It does not get you an overall performance increase when it comes to AH.

I currently run 2 cards now, 1600MB of DDR3 RAM combined, but the 2 cards cannot handle AH3; the RAM is not additive. I have tried AH with a single GPU as well, no difference in AH performance.

What cards are you using?
If you like the Sick Puppy Custom Sound Pack then please consider contributing for future updates by sending a month's dues to Hitech Creations for account "Chalenge." Every little bit helps.

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: PCIE 3.0
« Reply #7 on: August 08, 2015, 03:51:55 PM »
sig says it all.
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline caldera

  • Platinum Member
  • ******
  • Posts: 6448
Re: PCIE 3.0
« Reply #8 on: August 08, 2015, 04:14:14 PM »
sig says it all.

You forgot to include the make and model of your desk and chair.   :D
"Then out spake brave Horatius, the Captain of the gate:
 To every man upon this earth, death cometh soon or late.
 And how can man die better, than facing fearful odds.
 For the ashes of his fathers and the temples of his Gods."

Offline Chalenge

  • Plutonium Member
  • *******
  • Posts: 15179
Re: PCIE 3.0
« Reply #9 on: August 08, 2015, 04:53:16 PM »
Well, you are right that your cards do not have the 1GB minimum that Skuzzy has up to this point recommended. However, your cards are as powerful as a GTX 9800, and even more so with SLI. Since AH does not have its own SLI profile you will have to force alternate frame rendering mode 2 manually. To confirm scaling just go into the top menu of Nvidia Control Panel, under "3D Settings" and then select the first option "Show SLI visual indicator." Leave everything else on either "Off" or "Application-controlled," and then adjust AH settings for the highest scaling you can get.
If you like the Sick Puppy Custom Sound Pack then please consider contributing for future updates by sending a month's dues to Hitech Creations for account "Chalenge." Every little bit helps.

Offline Pudgie

  • Silver Member
  • ****
  • Posts: 1280
Re: PCIE 3.0
« Reply #10 on: August 08, 2015, 06:05:15 PM »
MADe,

After looking at your sig, about the only thing I might look at, if I were in your shoes, in addition to adding a better vid card is upping the system mem from the 3x2Gb Tri-Channel kit to at least a 3x4Gb or larger Tri-Channel kit (assuming that you have your mobo set up in Tri-Channel configuration, and within whatever your mobo's max capacity limit is). I could see that helping the rest of your system out performance-wise as well as a better vid card would.....but that's me looking at it from my perspective.

I myself almost went the X58 direction back when I built my current box in sig but couldn't pass up on the advent of an Intel X79 box as this was cutting-edge tech at that time that sprang out of the X58 platform's success & I wanted to go w/ a cutting-edge build for once in my "career" (been building off 1-2 yr old tech prior due to funds availability & untrusting of cutting edge tech's build quality\reliability....at that time).

The Intel X58 platform is 1 of the best all around PC gaming platforms ever made IMHO.............

 :salute
Win 10 Home 64, AMD Ryzen 9 3900X, MSI MPG X570 Gaming Plus, GSkill FlareX 32Gb DDR4 3200 4x8Gb, XFX Radeon RX 6900X 16Gb, Samsung 950 Pro 512Gb NVMe PCI-E SSD (boot), Samsung 850 Pro 128Gb SATA SSD (pagefile), Creative SoundBlaster X7 DAC-AMP, Intel LAN, SeaSonic PRIME Gold 850W, all CLWC'd

Offline Chalenge

  • Plutonium Member
  • *******
  • Posts: 15179
Re: PCIE 3.0
« Reply #11 on: August 08, 2015, 06:30:59 PM »
What I read is 6GB, or 2GB per channel. I seem to remember a problem with the tri-channel method, but offhand I don't remember what it was.

Concerning the PCIe 2.0 vs PCIe 3.0: The MB I use has the ability to limit the slots to gen2, or gen 1 even. I do not get any additional frames moving from gen2 to gen3, but I do get better benchmarks in Unigine Valley, by about 500 pts. Since I use two GTX 980s in SLI I should be able to best a single 980 Ti in the same benchmark, but that is not the case. This leads me to conclude that Unigine Valley is somehow cutting back on SLI performance.
If you like the Sick Puppy Custom Sound Pack then please consider contributing for future updates by sending a month's dues to Hitech Creations for account "Chalenge." Every little bit helps.

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: PCIE 3.0
« Reply #12 on: August 08, 2015, 06:45:37 PM »
You forgot to include the make and model of your desk and chair.   :D


In a way I did; I posted pics of my DIY Spitfire control column in another thread after I finished it. Chair's attached. :joystick:
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline MADe

  • Silver Member
  • ****
  • Posts: 1117
Re: PCIE 3.0
« Reply #13 on: August 08, 2015, 07:10:51 PM »
I could add more RAM for sure, but as far as the game goes it ain't gonna get me much really. If I was using the machine for anything other than what it is, a toy, more RAM would be a must. I think 6GB is enough for the game, correct me if I'm wrong. The RAM is underclocked but the timings are tight and low. My little system has been rock solid for a while now. IMO the X58 is great. It just keeps going, all on air. I fully expect to get at least another 10 years out of it, except the SSDs anyway. I will build something else before that, but still.

Chalenge, you prolly remember some kind of bottlenecking issue. I read about it; it's not layman-oriented chat....you prolly understood it better than me. I have always got good numbers with benchmarking, but I have come to realize that at my level of understanding it don't mean much, just another number. Does it do what I want it to do at all times? Answer: yes. It will run all the stuff. Like I said, I pretty much don't use the profile selections to alter the game's look. I have everything off, disabled, or set to 3D application; SweetFX is the way to go.
 If I ran 3 cards I'd get x8 PCIe lanes on each of the 3; with 2 I get x16 on each. 1 card.........

Thing is, no matter what vid card I get, it will be a 3.0. That's just what's out there. The article about the GTX 680, that's 3.0 I believe.
Price-wise for me it will be an EVGA GTX XXX with a single GPU. The actual performance increase you get from SLI is not that great unless I was running multiple monitors, doing bitcoin mining.............the question still is, do I go 700 series or 900 series? Ti models are out. There's also the fact that it's 1 less piece of hardware attached to the mobo. Everything on the X58 uses the PCIe lanes. Less power, a little less overall heat, GDDR5 RAM....I have to research what will work best with my system. 900 series prolly too much....

I really liked Pudgie's reads; they were written at my level.
« Last Edit: August 08, 2015, 07:20:46 PM by MADe »
ASROCK X99 Taichi, INTEL i7 6850@4.5GHz, GIGABYTE GTX 1070G1, Kingston HyperX 3000MHz DDR4, OCZ 256GB RD400, Seasonic 750W PSU, SONY BRAVIA 48W600B, Windows 10 Pro /64

Offline Chalenge

  • Plutonium Member
  • *******
  • Posts: 15179
Re: PCIE 3.0
« Reply #14 on: August 08, 2015, 07:48:14 PM »
Well, here's the thing. Even with manual manipulation of the SLI settings we will never match an actual, Nvidia provided profile. You can get SLI working, but with a manually created profile it will never be optimum. You will know if there is a provided profile because when you select "Nvidia recommended" it will say "Nvidia recommended (SLI)" once it is set. Years back I would sit here and tweak the settings for hours to find the best results, but today is different.

As to the version of PCIe the video card is. . . it won't matter. Even PCIe 2.0 is fast enough. If it influences your gaming at all it would amount to 1 or 2 fps, and benchmarks performed in the past have even indicated that a PCIe 2.0 board can have as much as a 5 fps advantage. It's a sync rate issue, which might actually be why you are running your memory under the rated clock speed (i.e. stability). Individual results will oftentimes vary because we set our systems up differently, with different hardware and software both. It all makes a difference.

I would recommend a GTX 970 as long as it doesn't break the bank. Even a 750Ti is a great card, though I have not seen what it can do in AH.
If you like the Sick Puppy Custom Sound Pack the please consider contributing for future updates by sending a months dues to Hitech Creations for account "Chalenge." Every little bit helps.