
October 18th, 2016 01:00

GPU Installation on R730

Dear Sir or Madam,

I bought a PowerEdge R730 with a single E5-2640 v4 (10C/20T) and a 750W 80+ Platinum PSU, and it performs well. I now wish to add a GPU card for extra processing performance, e.g. a GTX 1080 with a typical 170W TDP. I have searched the Dell community and external forums and found there may be an auxiliary power problem with a custom GPU installation.

So I need to ask: do I have to order the GPU Installation Kit, or any other part, to make this custom configuration work (an optional second GPU should also be considered)? I understand that the installed GPU would be used only for processing tasks, not for display.

Thank you in advance for your kind assistance.

Sincerely,
Worapong Wilairat

Moderator

 • 

8.4K Posts

October 18th, 2016 07:00

Zyxzenze,

An issue you may run into is that this specific GPU isn't supported and may not function. The supported GPU list for the R730 is below:

R730 GPUs supported at RTS:
- Nvidia: K80, M60, M40, K40M, GRID K1, GRID K2
- AMD: S9150, S7150, S7150x2, S/W9100
- Intel Phi: 7120P, 3120P

For reference, the R730 can support either two 300W, full-length, double-wide GPUs or four 150W, single-wide GPUs. The GPUs are installed in the PCIe x16 Gen3 slots available on Riser 2 and the GPU-optional Riser 3. To install two internal GPUs in the system, the GPU-optional Riser 3 must be present.
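If you want to confirm from the OS which PCIe slots your riser configuration actually exposes, `dmidecode` can list them. A minimal sketch, assuming a Linux host; the sample output below is illustrative only, not captured from an R730, and exact slot designations depend on which risers are fitted:

```shell
# On the server itself you would run (as root):
#   dmidecode -t slot
# Sample fragment of that output for one slot, for illustration only:
sample='Designation: PCIe Slot 4
Type: x16 PCI Express 3
Current Usage: Available'

# Pull out the slot width/generation and whether it is occupied:
slot_type=$(printf '%s\n' "$sample" | grep 'Type:' | sed 's/.*Type: //')
usage=$(printf '%s\n' "$sample" | grep 'Current Usage:' | sed 's/.*Usage: //')
echo "$slot_type / $usage"   # prints: x16 PCI Express 3 / Available
```

An "In Use" slot with nothing you recognise in it usually means a riser-mounted device, so this is a quick way to sanity-check the riser layout before opening the case.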

It also looks like the cables are GPU specific.

Let me know if this helps.

2 Posts

October 19th, 2016 04:00

Dear Chris,

Thank you for your help. May I ask for more detail on GPUs for the R730?

The question is: do I need to choose only cards Manufactured by Dell (in the catalogue: accessories.ap.dell.com/.../category.aspx), or will other OEM cards fit my R730, based on its TDP?

Thanks in advance for your answer.

1 Message

October 23rd, 2020 01:00

Hi Chris,

You wrote "It also looks like the cables will be GPU specific as well. "

I have an R730 with a 1100W PSU and the GPU installation kit.

I just bought a Nvidia Tesla K80. Do I need any extra cable specific to that card?

Thanks a lot

Moderator

 • 

3.4K Posts

October 23rd, 2020 05:00

Hello,

Yes, you need a specific 8-pin cable designed for the NVIDIA Tesla K80 cards, as a standard GPU power connector is not compatible.

Thanks

Marco

1 Rookie

 • 

16 Posts

October 24th, 2020 14:00

As far as I can make out, the standard GPU power cable (part no. 09H6FV) won't power a K80, as the connectors are wrong; at least the ones I have won't. If you are handy with a soldering iron (or have the correct crimp tool kit), you can get a Molex Mini-Fit Jr 4x2 plug, snip the ends off the Dell part (it can power two single-height cards, but the K80 is double height), and fit the new power plug. I don't have the crimp kit (it's expensive), so I crimp with needle-nose pliers and then, because that's not great, use a small amount of solder to make sure the electrical connection is good. Too much and the pin won't fit in the housing properly.

You will also need a special Dell-specific metal bracket to support the end of the K80 card; these are not cheap to get hold of, as you generally have to buy an older card to get one. Here is a picture of a P100 with one of them attached, sitting on top of a PowerEdge R730.

nVidia_P100.jpeg

 

 

Any unused PCI slots need covering with solid fillers, as opposed to the ventilated ones that are standard, presumably to force additional airflow over the card.

You are also limited to processors with a TDP of 135W or less, and the maximum inlet air temperature is reduced to 30°C.

Where the R730 owner's manual talks about "low profile" heat sinks being part of the GPU enablement kit, it is, as far as I can make out, talking rubbish. I think it's a holdover from the R720 owner's manual that should have been removed; at least I have not come across an R730 that needs different heat sinks. On the R720 you also need to replace the ducting shroud, which means changing to low-profile heat sinks (and more powerful fans too). However, from what I have seen empirically, the R730 already comes with "low profile" heat sinks that sit under the standard ducting shroud.

A P100 works in an R730 with the above, and a V100 does too, as it uses the same amount of PCI base address register (BAR) space. Presumably the V100S does as well, but I can't attest to that. Later this week I will be able to let you know if the latest A100s work, as I will be getting a loan of a sample. They use more BAR space, significantly more if you make use of the virtual GPU feature, and the R730 BIOS might not reserve enough space (hence the loan of an A100 before we make a significant purchase).
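The BAR-space concern can be checked from a running Linux host before buying. A minimal sketch, assuming a Linux host with `pciutils` installed; the sample `lspci` line below is illustrative only, not captured from an R730:

```shell
# On the server you would run:
#   lspci -d 10de: -vv | grep -i region
# (10de is NVIDIA's PCI vendor ID.) Sample "Region" line as a large
# data-centre card might report it, for illustration only:
sample='Region 1: Memory at 38bfc0000000 (64-bit, prefetchable) [size=32G]'

# Extract the BAR size the card asks the BIOS to map:
bar=$(printf '%s\n' "$sample" | grep -oE 'size=[0-9]+[KMG]')
echo "$bar"   # prints: size=32G
# If the BIOS cannot reserve a region, the kernel logs lines such as
# "BAR 1: no space for [mem size ... 64bit pref]" -- check dmesg.
```

Large 64-bit prefetchable BARs generally also need "Memory Mapped I/O above 4GB" enabled in the BIOS, which is worth verifying before concluding a card is incompatible.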

Finding all this out has taken a significant amount of research over the last couple of weeks, which is why I have an R730 on my lounge floor: to work out exactly what the GPU enablement kit includes. I already knew the power lead was duff from a while back. We just need some extra R730s to expand our offering; we want a heterogeneous cluster and have limited money for new servers anyway. We'll just have to live with the A100s on a PCIe 3.0 bus, assuming it works.

 

November 1st, 2020 09:00

Hi Sir,

I have purchased RTX 3070 cards. Will they work in a Dell PowerEdge R730 / R730xd server? Please help with this.

 

Moderator

 • 

3.6K Posts

November 1st, 2020 17:00

Hi,

Unfortunately, client graphics cards are not supported in servers.

Internal GPU cards are supported on the PowerEdge R730, but not on the PowerEdge R730xd.

If you're looking to put a GPU card in a server for hardware acceleration, do not use a GTX card; they are not supported by Dell. Use one of the Quadro line.

Moderator

 • 

3.4K Posts

November 2nd, 2020 06:00

Hello,

Thanks for the update.

Yes, we don't have this kind of documentation, unfortunately.
Marco

1 Rookie

 • 

16 Posts

November 2nd, 2020 06:00

Just an update on the A100-in-an-R730 status: apart from nVidia changing the mounting holes for the support brackets again, requiring a new hole to be drilled, they work.

I suspect you could even do NVLink between two cards, as there is sufficient room in the case to run a ribbon cable between the two slots; I tried with a spare PCIe x16 flexible extender I had lying around. However, I imagine one would need access to some technical documentation from nVidia to make the appropriate PCBs, and that won't be forthcoming.

1 Message

January 19th, 2021 16:00

I have an R730 with 2 Nvidia RTX 6000 cards, using the optional Riser 3 adapter.

1 Message

May 3rd, 2021 20:00

For the record, I'm running dual 2070 Supers in an R730 in my homelab. They work just fine. You can use "unsupported" cards, it's just... well... unsupported. They don't guarantee that they'll work properly, and if you call they'll say they can't help you, but if you're willing to absorb that risk, it seems like for the most part, a good number of consumer GPUs will work.

1 Message

May 8th, 2021 03:00

Hello. I solved the problem with an RTX 4000. You can install any graphics card in this Dell R730 server.

Just make a jumper wire between pin 5 and pin 6 (on the GPU-side connector). The sense pin is not actually sensed by the Dell R730.

You can install any graphics card with this method.

 

3 Posts

July 29th, 2021 02:00

Hi,

Can you share more information, as well as some specs and photos? I have an R730 and I'm still trying to configure the system with a test K80 card.

I just moved to Proxmox, as the feature is possible in the free version.

Thanks.


September 22nd, 2021 08:00

I am trying to install an Nvidia Tesla M40 24GB or a K80; I currently have both. That first link you posted states "Ensure that both the processors are installed." Does this mean a single-CPU installation will not work at all?

Also, I ordered a power cable from Amazon that claimed it would work with the Tesla cards and an R730. Is there a wiring diagram for the connection between the riser and the video card that I can compare against, to see whether the cable is even close to what it is supposed to be?

I have tried installing each of the cards with the cable, and when I try to power the server back up, the status lights on the back of the power supplies go from solid green to blinking orange. Any ideas why that would happen?
