
December 21st, 2009 17:00

Area 51 PCI-E 2.0 16x issues

Hey Guys,

I have two ATI 5870s in Crossfire, and for some reason both Catalyst Control Center and GPU-Z are reporting that one of the cards is running at PCI-E 2.0 x16 bandwidth while the other is running at only x8. The third PCI-E slot is empty. I know these motherboards have three PCI-E 2.0 x16 slots, so why am I having this problem? I ordered an exchange computer and it exhibits the exact same problem. Is anyone else experiencing this? Thanks for the help.

49 Posts

December 21st, 2009 22:00

Well, after two hours with tech support we finally figured out a solution. It's true that these motherboards have three PCI-E 2.0 x16 slots; however, only the top two slots are capable of running at x16 bandwidth with two graphics cards installed.

For some silly reason Alienware ships Crossfire/SLI configurations using the 1st and 3rd slots (most likely because of heat buildup when the cards sit on top of one another). I know that for high-end cards like the 5870 the difference between x8 and x16 is negligible, but if anyone wants the full bandwidth, make sure to move the second card from the 3rd slot to the 2nd. Now if only I could get the Command Center working with Windows 7......
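For anyone wondering why x8 vs. x16 is often called negligible, here's a quick back-of-the-envelope sketch (plain Python, my own variable names) of the theoretical per-direction bandwidth of a PCIe 2.0 link:

```python
# Back-of-the-envelope PCIe 2.0 bandwidth, per direction.
# PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding,
# so each lane carries 4 Gbit/s (0.5 GB/s) of usable data.
GT_PER_LANE = 5.0  # gigatransfers per second, per lane
ENCODING = 0.8     # 8b/10b: 8 data bits per 10 bits on the wire

def pcie2_bandwidth_gb_s(lanes: int) -> float:
    """Theoretical usable bandwidth per direction, in GB/s."""
    return lanes * GT_PER_LANE * ENCODING / 8  # bits -> bytes

print(pcie2_bandwidth_gb_s(16))  # x16 -> 8.0 GB/s
print(pcie2_bandwidth_gb_s(8))   # x8  -> 4.0 GB/s
```

So x8 still offers 4 GB/s each way, which a single 2009-era card rarely saturates in games; that's consistent with the small benchmark differences reported later in this thread.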

kiax0001

30 Posts

December 22nd, 2009 04:00

Why wouldn't you be able to get the Command Center to work with Windows 7? Just get the latest version from the downloads page (not the version that came with the rest of the drivers on the DVD).

I installed the latest version and my PC has never had a problem with it!

244 Posts

December 22nd, 2009 06:00


I agree; it works fine for me now that I don't use "sleep" mode anymore. With sleep mode enabled, several times a week when coming out of sleep I'd end up with an "Unknown USB Device" splat in Device Manager, after which the Command Center features began acting strangely. Uninstalling/reinstalling Command Center (from Dell's Support site), along with a full shutdown and pulling the power cord out for 30 seconds, always fixed the problem. Again, I don't use sleep mode anymore and all is well.

The Alienware Support folks (Costa Rica team) agree there's a Windows 7 compatibility issue with Command Center and that Dell's engineers are working on a fix.

49 Posts

December 22nd, 2009 12:00

DiMiannay,

Thanks for your reply. I think you and I are having two completely separate issues with the Command Center. Your problem is an unknown USB device that pops up in Device Manager every now and again, plus some intermittent issues with the lights after sleep. I have never once seen this unknown USB device in Device Manager.

My problem is that when I shut down the computer and turn it back on, the Command Center doesn't work (i.e., the side and top vent lights don't turn on, and the internal fans run on high and never slow down; thermal management) until I restart the computer, after which everything works normally again. I believe there's a communication problem with the MIO board during the first boot following a system shutdown.

Let me know if you're experiencing this problem as well, because I know some other people on the forums are having the same issue that I'm having.

I'm hoping this will get fixed in a future command center update.

kiax0001

244 Posts

December 22nd, 2009 21:00


No, I am not experiencing your problem. Command Center has run flawlessly for me since I stopped using sleep mode.

7 Posts

December 22nd, 2009 21:00


Did you notice a speed bump when you moved your card to the 16-lane slot? And how are your cards breathing with one on top of the other?

I currently have a 5870 and purchased a second one that will be here next week. I am trying to figure out which slot to plug it into.

Thanks

Carl G.

5 Practitioner • 274.2K Posts

December 23rd, 2009 01:00


I would like to know that as well.

49 Posts

December 24th, 2009 08:00

Hey Guys,

I did notice a speed bump when I moved the card to an x16 slot. I wish I could tell you exactly how much, and I wish I had run more benchmark tests before I made the change, but I think everyone should put these cards in slots 1 and 2 regardless, to get the most performance out of their machine.

Carl, I don't think breathing room is an issue with the 5870s, because the cards pull air in from the front, and the fan on the bottom of the card is internal, so it doesn't interfere with the second card. If it did, Alienware wouldn't stack them the way they do in the Aurora machines. Obviously the cards will run hotter, but these cards run relatively cool and the internal GPU fan helps a lot.

My only suggestion for people interested in stacking these cards on top of one another is to get shorter Crossfire/SLI bridges. The ones that came with my Area 51 were long enough to work in slots 1 and 3, but they bend a lot when I use slots 1 and 2. It actually forced me to take out the plastic GPU zone cover.

I would be interested to see some benchmarks from before and after the slot change, so if someone could post some Vantage scores or something, that would be great!

kiax0001

UPDATE: I have done a little more testing with these cards in Crossfire in the x16/x16 slots, and it doesn't look like there's much of a performance boost. The Vantage score increase is minimal (~360 points), and I've noticed that the cards can get really hot stacked on top of one another. After playing some COD I realized the temps of the cards were reaching a whopping 85 degrees!

Using slots 1 and 3 yields temps in the mid 70s, and the cards run noticeably cooler. Plus, this case has been designed with "cooling" zones, so removing the plastic PCI-E cover makes things worse: the big PCI-E fan just blows air that spreads throughout the case instead of directly onto the two cards.

Hope this info helps.

Community Manager • 54.3K Posts

December 30th, 2009 00:00

When looking at the motherboard card slots starting from the CPU -
PCIe x16 slot PCI_E1 (16 PCIe bus lanes)
PCIe x1 slot PCI_E2 (1 PCIe bus lane)
PCIe x16 slot PCI_E3 (8 PCIe bus lanes)
PCIe x1 slot PCI_E4 (1 PCIe bus lane)
PCIe x16 slot PCI_E5 (16 PCIe bus lanes)
PCI slot

You're saying Dell shipped it with the video cards in PCI_E1 and PCI_E5, and you then moved the card from PCI_E5 to PCI_E3 so both video cards could use 16 PCIe bus lanes. On my XPS 730x, I never saw a difference in game performance when the 2nd video card was moved from the 8-lane slot to the 16-lane slot.

244 Posts

December 30th, 2009 08:00


OK, so now I'm confused... from what you posted, the optimal PCIe x16 slots would be E1 and E5, since those are the two 16-lane x16 slots. That would let two cards in Crossfire be separated, reducing the extra heat from sitting next to each other. It would also mean that moving a card from E5 to E3 throttles the card in E3 a bit, reducing overall Crossfire performance while increasing proximity heat. Am I missing something?

49 Posts

December 30th, 2009 10:00

Hey Guys,

Chris made a mistake: PCI_E5 runs at 8 PCIe bus lanes, not 16. So the configuration is as follows:

PCIe x16 slot PCI_E1 (16 PCIe bus lanes)
PCIe x16 slot PCI_E3 (16 PCIe bus lanes)
PCIe x16 slot PCI_E5 (8 PCIe bus lanes)

Basically, using slots 1 and 5 (the furthest apart) gives you x16/x8, while using slots 1 and 3 (5870s on top of one another) yields x16/x16.

kiax0001

Community Manager • 54.3K Posts

December 30th, 2009 10:00

I might have it incorrect. I am trying to find a way to verify which slot is running at x8.
PCIe x16 slot PCI_E1 (16 PCIe bus lanes)
PCIe x16 slot PCI_E3 (8 or 16 PCIe bus lanes?)
PCIe x16 slot PCI_E5 (8 or 16 PCIe bus lanes?)

kiax0001 says it is this way -
PCIe x16 slot PCI_E1 (16 PCIe bus lanes)
PCIe x16 slot PCI_E3 (16 PCIe bus lanes)
PCIe x16 slot PCI_E5 (8 PCIe bus lanes)

7 Posts

December 30th, 2009 12:00

With one on top of the other, does that create heat problems? I am looking at my 5870, and the intake fan would sit right next to the other 5870 that I plan to put in there. It seems to me that I would be choking the intake air of at least one 5870.

244 Posts

December 30th, 2009 13:00

OK, feeling better about this now. My Area-51 has two 5870s, one in E1 and the other in E3. Assuming these are the 16-lane PCIe slots, the delivered configuration has been optimized for maximum performance. This does put the cards next to each other, but heat hasn't been a concern. I've been playing Dragon Age, Modern Warfare 2, and Arkham Asylum, all at max settings, without the slightest slowdown or hesitation. Performance is like a red hot knife through butter! Simply awesome!

Community Manager • 54.3K Posts

December 30th, 2009 23:00

Confirmed tonight.

One video card installed =
PCIe x16 slot PCI_E1 (16 PCIe bus lanes)
PCIe x16 slot PCI_E3 (8 PCIe bus lanes)
PCIe x16 slot PCI_E5 (8 PCIe bus lanes)

Two video cards installed =
PCIe x16 slot PCI_E1 (16 PCIe bus lanes)
PCIe x16 slot PCI_E3 (16 PCIe bus lanes)
PCIe x16 slot PCI_E5 (4 PCIe bus lanes)
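To make the reallocation easy to see at a glance, here is a small Python sketch (my own variable names; the figures simply restate the confirmed numbers above):

```python
# Lane widths confirmed above for the three physical x16 slots,
# with one vs. two video cards installed. The totals just make
# the chipset's lane reallocation visible.
one_card = {"PCI_E1": 16, "PCI_E3": 8, "PCI_E5": 8}
two_cards = {"PCI_E1": 16, "PCI_E3": 16, "PCI_E5": 4}

for label, cfg in (("one card", one_card), ("two cards", two_cards)):
    widths = ", ".join(f"{slot}=x{lanes}" for slot, lanes in cfg.items())
    print(f"{label}: {widths} ({sum(cfg.values())} lanes across the x16 slots)")
```

In short: with two cards installed, PCI_E3 is promoted to x16 and PCI_E5 drops to x4, so slots 1 and 3 are the full-bandwidth pair.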
