July 29th, 2008 11:00

GTX 280 - XPS 630 - PCIe 8x Performance?

Hello,

 

I have recently purchased an XPS 630 before I knew about the PCIe 8x problem. I have also ordered an Nvidia GTX 280 card to use with this system.

 

My question is: how much will performance be affected by the PCIe bus running at only half the speed it should?

 

I have noticed a few users with the above setup, any input would be appreciated.

 

Also, have all the other problems with this system been resolved, apart from the LED blinking problem? (I'm hoping Dell may send me a system with an updated I/O board to solve that.)

 

 

Kind Regards,

MatexShip

Message Edited by MatexShip on 07-29-2008 07:53 AM

10 Posts

July 30th, 2008 13:00

I emailed Nvidia to see what they had to say on the issue:

 

My question:

 

Hello,

 

I recently purchased a DELL XPS 630 system which has two PCIe slots that are locked to 8x each. I was wondering how much of a performance drop would be expected, if any, when running this card under PCIe 1.0 8x rather than 16x?

 

I would appreciate any help with this matter.

 

Thank you.

 

 

Nvidia's answer:

Hello,

 

Thank you for contacting NVIDIA Customer Care.

 

This is Ganesh and I shall answer your query.

 

I understand that you would like to know the performance loss when you run a graphics card in a PCI Express x16 slot running in x8 mode. You have a GeForce GTX 280 graphics card.

 

For an average end user using regular desktop applications or games, the difference will be very little. However, the difference can be significant for scientists or developers developing or using CUDA applications.

 

Please feel free to contact us for any further information.

 

Regards,
Ganesh,
NVIDIA Customer Care

 

 

 

So there we have it, straight from Nvidia, who state that there will be a performance drop; however, the scale of that drop will depend on what the system is used for.

 

 

Thanks,

MatexShip

Message Edited by MatexShip on 07-30-2008 09:16 AM

Community Manager


54.3K Posts

July 30th, 2008 13:00

MatexShip,

The way you worded your conclusion makes it sound like there is a HUGE performance drop. That is not what Nvidia said. He said, "For an average end user using the regular desktop applications or games the difference will be very little."

513 Posts

July 30th, 2008 14:00

I believe the folding@home GPU client is a CUDA application. I am running this currently and would be delighted if additional lanes would make this client even more productive.

 

I am disappointed that Chris was apparently told that running a PCIe x16 card in an x8 slot would not make a difference because no applications exist that could use the extra bandwidth. It appears that this is simply not true. As a non-gamer who runs applications that take advantage of four CPU cores and powerful GPUs, this matters to me.

 

Not only does this make Dell look bad, it puts Chris in a very difficult position; the poor guy has to deal with frustrated customers while being provided with less than the best information.

14.4K Posts

July 30th, 2008 14:00


@Anonymous-Chris_M wrote:
MatexShip,

The way you worded your conclusion makes it sound like there is a HUGE performance drop. That is not what Nvidia said. He said, "For an average end user using the regular desktop applications or games the difference will be very little."

Seems to me I have heard this before. Now that's two for little difference.

10 Posts

July 30th, 2008 19:00

Chris,

 

I in no way meant this to be an attack on you, and I apologise if it came across that way. I do believe, however, that Dell should make this fact known to its end users, as they may be using CUDA applications that are business critical, which is one example of where this would affect performance.

 

 

Thanks,

MatexShip

1 Message

July 30th, 2008 22:00

Not sure what I'm doing wrong here. I ran CPU-Z and GPU-Z: CPU-Z said I have a 750 chipset on the mainboard, and GPU-Z says PCI-E 2.0 x16 @ x16 2.0. I have the newest Dell BIOS. Let me know if any ideas come up. Also, now I'm wondering if I have a different mainboard? I thought it was a 650i with only 8 lanes.

202 Posts

July 30th, 2008 23:00

My Dell XPS 630i plays most open-source Linux Ubuntu computer games just fine. (I removed my copy of Windows Vista Service Pack 1 from the computer and installed a free copy of open-source Linux Ubuntu 8.10 Alpha 3, in order to take advantage of all the free open-source computer games available for Linux and to alpha-test a pre-release operating system as well. I am always bored when I don't have any new alphas or betas to test, and right now the only thing Microsoft has to test is IE 8 Beta 1. Very boring. Yawn.)

105 Posts

July 31st, 2008 00:00

DELL-Chris_M: Do you think we could get that corrected?

 

http://www1.ca.dell.com/content/products/productdetails.aspx/xpsdt_630?c=ca&cs=cadhs1&l=en&s=dhs

 

XPS 630 PCIe Slots 

July 31st, 2008 00:00

Great discussion... I too just bought an XPS 630 on Monday before reading these and other posts.

 

I have to agree about the misrepresentation, as the product tech specs clearly state:

2 PCIe X16, 1 PCIe X8, 1 PCIe X1, and 2 PCI

http://www1.ca.dell.com/content/products/productdetails.aspx/xpsdt_630?c=ca&cs=cadhs1&l=en&s=dhs

 

I am going to wait until the machine arrives and check it out before I make a decision. The machine will strictly be used at work for Flash production and a lot of web development. We shall see!!!

27 Posts

July 31st, 2008 01:00

So what does this mean if I have a 9800 GX2?

155 Posts

July 31st, 2008 02:00

I think the fundamental problem is that Dell wants to appeal to the "enthusiast" crowd with the XPS 630 but isn't advertising accordingly. A lot of these issues wouldn't have come up if Dell had followed the basic principle of advertising: know your target audience. If Dell had done that, there would have been far more detailed specs shown instead of the bare minimum we got. Unfortunately for Dell, advertising claims are judged by the perception of a reasonable customer in the product's target market, and knowing that audience is something Dell has failed miserably to do. That being said, it still doesn't explain the whole no-LightFX-2.0 fiasco.

475 Posts

July 31st, 2008 04:00

All,

There will be a difference between x8 and x16 mode. Think of the lanes like DDR memory channels: when one is full, the others can be used instead of waiting, or they may simply be used to increase throughput. The video card is wired to the lanes: 8 lanes means only eight ways in and out; 16 lanes, twice as many. DDR memory got its advantage the same way that more lanes do. And just because there are only five lanes' worth of data doesn't mean it will go out on only five lanes. (A layman's explanation of PCIe.)
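The lane arithmetic above can be made concrete with a quick sketch (my own figures, not from this thread: PCIe 1.x signals at 2.5 GT/s per lane with 8b/10b encoding, which works out to roughly 250 MB/s usable per lane, per direction):

```python
# PCIe 1.x: 2.5 GT/s per lane, 8b/10b encoding (8 data bits per 10 bits on the wire).
# Usable bytes/s per lane = 2.5e9 * 8/10 / 8 = 250 MB/s (decimal megabytes).
PER_LANE_MBPS = 2.5e9 * (8 / 10) / 8 / 1e6  # 250.0

def link_bandwidth_mbps(lanes: int) -> float:
    """Theoretical peak one-direction bandwidth of a PCIe 1.x link."""
    return lanes * PER_LANE_MBPS

print(f"x8:  {link_bandwidth_mbps(8):.0f} MB/s")   # 2000 MB/s
print(f"x16: {link_bandwidth_mbps(16):.0f} MB/s")  # 4000 MB/s
```

So x16 doubles the theoretical peak; whether an application notices depends on how often it actually saturates the link.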

PS: multi-GPU graphics cards will surpass the 8-lane data throughput very soon, not 3 years down the road.

 

Sean

July 31st, 2008 04:00

CUDA applications are currently used mostly by scientists and tech labs. CUDA is basically a C-language platform that lets your GPU run general-purpose code, and for parallel workloads it is much more powerful than any current Intel CPU.

GPU + CUDA = Intel CPU

http://en.wikipedia.org/wiki/CUDA

 

With that said, I don't think any applications you are using now, or will be using for the next 5 years, are CUDA based. CUDA is still being experimented with by scientists and universities and is not widely available to the public.
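To see why a transfer-heavy CUDA job could still feel the x8 limit, here's a back-of-the-envelope estimate. The 512 MB payload is a hypothetical figure of my own, and the rates are theoretical PCIe 1.x peaks that ignore protocol overhead:

```python
def transfer_seconds(megabytes: float, lanes: int, mbps_per_lane: float = 250.0) -> float:
    """Best-case time to move a buffer over a PCIe 1.x link (no overhead modeled)."""
    return megabytes / (lanes * mbps_per_lane)

payload_mb = 512.0  # hypothetical dataset streamed to the GPU each iteration
print(f"x8:  {transfer_seconds(payload_mb, 8) * 1000:.0f} ms")   # 256 ms
print(f"x16: {transfer_seconds(payload_mb, 16) * 1000:.0f} ms")  # 128 ms
```

For a game that moves little data per frame, that gap is invisible; for code that streams large buffers every iteration, x16 halves the copy time.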

27 Posts

July 31st, 2008 13:00

According to CPU-Z and GPU-Z, my new Dell XPS 630 is running a 750i chipset, with PCIe 2.0 at x16.

So, problem solved; my HDD light blinks, though.

 

 

http://i55.photobucket.com/albums/g139/Durrthock/Proof.jpg

Sorry, it's a little hard to read.

10 Posts

August 1st, 2008 00:00

Chris,

 

Could you look into the fact that some people are now getting 750i boards? Is this a change or a build error? My 630i has not arrived yet; I hope it may come with a 750i board. I live in the UK.


Cheers,

MatexShip
