
VMware Connectivity Dropping on M8024 Switches

We currently have a VMware vSphere 5 cluster made up of M610 blades in an M1000e chassis. The M1000e has been tested at firmware 3.21 and 4.1. Each blade connects through:

M6220 switches in fabrics A1 & A2;

M8024-K switches in fabrics B1 & B2, using the Intel X520-x/k 10Gb dual-port I/O mezzanine card;

Brocade 5424 switches in fabrics C1 & C2, with the Emulex LPE1205-M 8Gbps Fibre Channel mezzanine card.

We are using NIC teaming on the VMware hosts for the two 10Gb connections. We have been experiencing issues with VMware hosts losing connectivity on one side of the B fabric. For instance, the blade in slot 3 will have connectivity on switch B1 port 3 but not on B2 port 3. We have seen similar symptoms running both firmware versions on the M8024-K switches.

During troubleshooting we rebooted the VMware server with no resolution. We then re-seated the mezzanine card in fabric B, also with no resolution. Only after rebooting the switch were we able to re-establish connectivity on B2 port 3. On one occasion we re-seated the switch, which initiated a reboot and cleared the condition. We have seen the same symptoms on other ports in other chassis.
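For anyone hitting the same symptom, the quickest way we found to tell a host-side NIC problem from a switch-side one is to check link state from the ESXi shell before touching the hardware. This is only a sketch; the vmnic numbering is an assumption for our setup, so map your own uplinks first with the list command:

```
# From the ESXi shell (SSH or DCUI). vmnic names are an assumption --
# identify which vmnics sit on the B-fabric mezzanine from the list output.
esxcli network nic list            # shows link state (Up/Down) for every uplink
esxcli network nic get -n vmnic2   # driver and link detail for one uplink
```

If the vmnic still shows "Up" while traffic is dead through the B2 port, that points at the switch rather than the mezzanine card, which matches what we saw: only a switch reboot restored the port.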

I am not sure if others are encountering similar issues; we have cases open with Dell in an effort to resolve this. It has been very frustrating to pin down this specific problem.



Re: VMware Connectivity Dropping on M8024 Switches

I recommend stacking your M8024-K switches when you use NIC teaming. That helped me before with a similar issue.
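For reference, stacking on this class of switch is done from the stack context of the CLI. The outline below is illustrative only: the member/port numbers are assumptions for a two-switch stack, and the exact syntax varies by firmware, so verify every command against the M8024-K CLI guide for your version before applying it:

```
! Illustrative sketch, not verified against any specific firmware.
! Unit numbers and port IDs are assumptions for a two-member stack.
configure
stack
member 1 1              ! declare stack member 1
member 2 1              ! declare stack member 2
exit
! Dedicate 10GbE ports on each switch as stacking ports, cable them
! between B1 and B2, then reload both switches to form the stack.
```

With the two B-fabric switches stacked, the team's uplinks terminate on what the hosts see as a single logical switch, which is why it can help with one-sided teaming failures like the one described above.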