GTX 280 - XPS 630 - PCIe 8x Performance?
Hello,
I have recently purchased an XPS 630 before I knew about the PCIe 8x problem. I have also ordered an Nvidia GTX 280 card to use with this system.
My question is: how much will performance be affected by the PCIe bus running at only half its rated width?
I have noticed a few users with the above setup, any input would be appreciated.
Also, have all the other problems with this system been resolved, apart from the LED blinking problem? (I'm hoping Dell may send me a system with an updated I/O board to solve that.)
Kind Regards,
MatexShip
MatexShip
July 30th, 2008 13:00
I emailed Nvidia to see what they had to say on the issue;
My question:
Hello,
I recently purchased a Dell XPS 630 system, which has two PCIe slots that are locked to x8 each. I was wondering how much of a performance drop would be expected, if any, when running an Nvidia GTX 280 under PCIe 1.0 x8 rather than x16?
I would appreciate any help with this matter.
Thank you.
Nvidia's answer:
Hello,
Thank you for contacting NVIDIA Customer Care.
This is Ganesh and I shall answer your query.
I understand that you would like to know the performance loss when you run a graphics card in PCI Express x16 running x8 mode. You have GeForce GTX 280 graphics card.
For an average end user using the regular desktop applications or games the difference will be very little. However, the difference can be significant for Scientists or Developers developing or using CUDA applications.
Please feel free to contact us for any further information.
Regards,
Ganesh,
NVIDIA Customer Care
So there we have it, straight from Nvidia, who state that there will be a performance drop; the scale of it will depend on what the system is used for.
Thanks,
MatexShip
DELL-Chris M
Community Manager
July 30th, 2008 13:00
The way you worded your conclusion makes it sound like there is a HUGE performance drop. That is not what Nvidia said. He said, "For an average end user using the regular desktop applications or games the difference will be very little."
ElkWapiti
July 30th, 2008 14:00
I believe the Folding@home GPU client is a CUDA application. I am running this currently and would be delighted if additional lanes made this client even more productive.
I am disappointed that Chris was apparently told that running a PCIe x16 card in an x8 slot would not make a difference because no such applications currently exist. That is simply not true. As a non-gamer who runs applications that take advantage of four CPU cores and a powerful GPU, this matters to me.
Not only does this make Dell look bad, it puts Chris in a very difficult position; the poor guy has to deal with frustrated customers while being provided with less than the best information.
Davet50
July 30th, 2008 14:00
Seems to me I have heard this before. Now that's two for little difference.
MatexShip
July 30th, 2008 19:00
Chris,
I in no way meant this to be an attack on you; I apologise if it came across that way. I do believe, however, that Dell should make this fact known to its end users, as they may be running CUDA applications that are business critical, which is one example of where this would affect performance.
Thanks,
MatexShip
devjonfos
July 31st, 2008 00:00
DELL-Chris_M: Do you think we could get that corrected?
http://www1.ca.dell.com/content/products/productdetails.aspx/xpsdt_630?c=ca&cs=cadhs1&l=en&s=dhs
Spastik_Plastik
July 31st, 2008 00:00
Great discussion... I too just bought an XPS 630 on Monday before reading these and other posts.
I have to agree with the issue of misrepresentation, as the product tech specs clearly state:
2 PCIe X16, 1 PCIe X8, 1 PCIe X1, and 2 PCI
http://www1.ca.dell.com/content/products/productdetails.aspx/xpsdt_630?c=ca&cs=cadhs1&l=en&s=dhs
I am going to wait until the machine arrives and check it out before I make a decision. The machine will strictly be used at work for Flash production and a lot of web development. We shall see!
shdbcamping1
July 31st, 2008 04:00
All,
There will be a difference between x8 and x16 mode. Think of the lanes like DDR memory channels: when one is busy, the others can carry data instead of waiting, or they can simply be used together to increase throughput. The video card is wired to the lanes: with 8 lanes there are only eight ways in and out; with 16 lanes, twice as many. DDR memory got its advantage the same way that more lanes do. And even if there is only five lanes' worth of data at a given moment, that data is not restricted to travelling over just five lanes. (A layman's explanation of PCIe.)
P.S. Multi-GPU graphics cards will exceed 8-lane throughput very soon, not three years down the road.
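As a rough sketch of the arithmetic behind this (assuming PCIe 1.x's nominal 250 MB/s per lane, per direction, after 8b/10b encoding overhead):

```python
# Rough PCIe 1.x bandwidth arithmetic.
# Assumption: nominal 250 MB/s per lane, per direction (2.5 GT/s
# with 8b/10b encoding), as per the PCIe 1.x spec.
PCIE1_MB_PER_LANE = 250

def pcie1_bandwidth_mb(lanes: int) -> int:
    """Nominal one-direction bandwidth in MB/s for a PCIe 1.x link."""
    return lanes * PCIE1_MB_PER_LANE

x8 = pcie1_bandwidth_mb(8)    # 2000 MB/s
x16 = pcie1_bandwidth_mb(16)  # 4000 MB/s
print(f"x8:  {x8} MB/s")
print(f"x16: {x16} MB/s ({x16 // x8}x the link bandwidth)")
```

Whether a given game ever saturates even the x8 figure is a separate question, which is why Nvidia said the difference is small for typical desktop use.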
Sean
dontexpectmetob
July 31st, 2008 04:00
CUDA applications are currently used mostly by scientists and tech labs. CUDA is basically a C-language platform that lets you run general-purpose computation on your GPU, and for highly parallel workloads the GPU can outperform any current Intel CPU.
http://en.wikipedia.org/wiki/CUDA
With that said, I don't think any applications you are using now, or probably for the next five years, are CUDA-based. CUDA is still being explored by scientists and universities and is not widely used by the public.
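For the curious, the kind of workload CUDA targets is simple data-parallel math over large arrays, where the GPU runs one lightweight thread per element. A plain-Python sketch of the classic example (SAXPY, y = a*x + y) shows the idea; in CUDA each loop iteration below would be its own GPU thread:

```python
# SAXPY (y = a*x + y), the textbook data-parallel example.
# In a CUDA kernel, each index i would be handled by a separate GPU
# thread; here we just iterate, to show that every element is
# computed independently of the others.
def saxpy(a: float, x: list, y: list) -> list:
    return [a * xi + yi for xi, yi in zip(x, y)]

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))
# [12.0, 24.0, 36.0]
```

It is exactly this per-element independence that lets hundreds of GPU cores work at once, and why such jobs move so much data over the PCIe bus.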
Durrthock
July 31st, 2008 13:00
According to CPU-Z and GPU-Z, my new Dell XPS 630 is running a 750i chipset with PCIe 2.0 at x16.
So problem solved; my HDD light blinks, though.
http://i55.photobucket.com/albums/g139/Durrthock/Proof.jpg
Sorry it's a little hard to read.
MatexShip
August 1st, 2008 00:00
Chris,
Could you look into the fact that some people are now getting 750i boards? Is this a change, or a build error? My 630i has not arrived yet; I hope it may come with a 750i board. I live in the UK.
Cheers,
MatexShip