I've got a pair of M1000 blade enclosures, each filled with 16 x M630 blades, and 2 x B22 modules for networking. The blades have Emulex OCm14102B-N6-D network cards.
The blades are running ESXi. Today we implemented a change to remove the EtherChannel configuration that had been set up on the Nexus switches. This is something we've done many times before, and it has always worked without issue.
The change impacted all of the blades, plus an additional 8 x R630 rack servers.
After the network team disabled EtherChannel, 3 of the blades lost their networking. All the other blades and rack servers are fine.
From the switch side, the network team are seeing nothing connected. From ESXi, the link shows as down. In iDRAC, under Hardware -> Network Devices -> Integrated NIC 1, nothing is shown against Link Status, and both Link Speed and Auto Negotiation show as "UnKnown". In the BIOS setup for the Integrated NIC, Link Status shows "Disconnected" and Physical Link Speed shows "Link Down".
We've tried power cycling the affected blades, but that hasn't changed the status.
I've never seen anything like this before. The network team say there's nothing more they can do, and I can't see anything else to try on the M1000/iDRAC/ESXi side of things. In ESXi I've already tried forcing the vmnics down and back up.
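For reference, the vmnic bounce was done roughly like this from the ESXi shell (vmnic0 here is just a placeholder; substitute whichever vmnics map to the Emulex ports on the affected blades):

```shell
# Show all physical NICs with their current link state, speed, and driver
esxcli network nic list

# Force the uplink down, then bring it back up
esxcli network nic down -n vmnic0
esxcli network nic up -n vmnic0
```

On the affected blades the NICs still report link down after this, which matches what iDRAC and the BIOS are showing.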
Any suggestions? Many Thanks...