People have benchmarked graphics cards in PCIe x16 and x8 slots before, and the results have shown little to no difference in performance even with a top-of-the-line card like the GTX 1080 Ti. This is one of the reasons people often claim there is no benefit to running PCIe x16 over x8. But what happens when you use an even faster graphics card, and two of them at that? Here you can see the stress that dual Nvidia Titan V graphics cards put on the PCIe slots.
GamersNexus was able to get their hands on two Nvidia Titan V graphics cards, and the findings are pretty interesting: the pair of cards can actually saturate the bandwidth of a PCIe x16 slot. Because these cards do not support SLI, DirectX 12 explicit multi-GPU was used to make all of this happen.
The Nvidia Titan V is not a gaming GPU, so you can imagine that most games do not support two of these graphics cards, and that is exactly what happened here. That said, Ashes of the Singularity is one game that does support a pair of these monsters, and you can check out the findings of the experiment below:
The following hardware was used in order to carry out this experiment:
- 2x Titan V cards
- EVGA X299 Dark motherboard
- Intel i9-7980XE
- 32GB G.Skill Trident Z Black DDR4-3866
PCIe 3.0 was introduced in 2010 and has been the standard for the last seven years. With 2017 coming to an end, we are now seeing why PCIe gen 4 matters. While PCIe gen 4 adoption will take its time, one thing seems certain: if upcoming graphics cards are as powerful as the Nvidia Titan V, the current standard's bandwidth will run out for sure.
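To put the x8 versus x16 debate into numbers, here is a quick sketch of the theoretical link bandwidth per generation, using the per-lane transfer rates and line encodings from the published PCIe specifications (the function name is my own, and real-world throughput is lower due to protocol overhead):

```python
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate theoretical bandwidth in GB/s for a PCIe link."""
    # Per-generation transfer rate (GT/s) and line-encoding efficiency.
    specs = {
        1: (2.5, 8 / 10),     # 8b/10b encoding
        2: (5.0, 8 / 10),     # 8b/10b encoding
        3: (8.0, 128 / 130),  # 128b/130b encoding
        4: (16.0, 128 / 130), # 128b/130b encoding
    }
    gt_per_s, efficiency = specs[gen]
    # GT/s * efficiency gives usable Gb/s per lane; divide by 8 for GB/s.
    return gt_per_s * efficiency / 8 * lanes

# Gen3 x16 offers roughly 15.75 GB/s; an x8 link is exactly half of that.
print(f"Gen3 x8:  {pcie_bandwidth_gbps(3, 8):.2f} GB/s")
print(f"Gen3 x16: {pcie_bandwidth_gbps(3, 16):.2f} GB/s")
# Gen4 doubles the per-lane rate again.
print(f"Gen4 x16: {pcie_bandwidth_gbps(4, 16):.2f} GB/s")
```

As the numbers show, gen 4 doubles the per-lane rate, which is exactly the headroom that cards on the level of the Titan V would need.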
Let us know what you think about all this, and whether you believe gaming needs a new PCIe standard.