December 4th, 2015 13:00

Isilon - Does it do port Bonding?

Community,

the question was asked:

Can I configure a layer 1 connection to test connectivity? I'd like to know whether the ports on the Isilon can be bonded to support 20Gb.

How should the 10GbE interfaces be configured to achieve this?

Thank you,

December 7th, 2015 07:00

Hello chjatwork,

Check out our guide for external network connectivity:

https://support.emc.com/docu58740_Isilon-External-Network-Connectivity-Guide---Routing,-Network-Topologies,-and-Best-Pra…

(this link DOES require a login to EMC Online Support.)

OneFS does support link aggregation; however, link aggregation does not increase overall throughput for a single client. Load balancing is performed on either a source/destination IP address hash or a source/destination MAC address hash, depending on how the port channel is configured on the switch, so each individual stream remains limited to 10Gbps.
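To make the hashing behavior concrete, here is a toy sketch (not OneFS code; the XOR-based hash is a simplified stand-in for whatever policy the switch actually uses) of how a src/dst IP hash pins every packet of a flow to one member link:

```python
# Toy illustration of hash-based LACP load balancing (NOT OneFS code).
# A simplified src/dst IP hash stands in for the switch's real policy.
import ipaddress

def select_link(src_ip: str, dst_ip: str, num_links: int) -> int:
    """Pick a member link by hashing the source/destination IP pair.

    Every packet of a given flow hashes to the same link, which is why
    one client-to-node stream never exceeds a single link's bandwidth.
    """
    src = int(ipaddress.ip_address(src_ip))
    dst = int(ipaddress.ip_address(dst_ip))
    return (src ^ dst) % num_links

# The same client/node pair always lands on the same link.
link = select_link("10.0.0.5", "10.0.0.100", 2)
assert link == select_link("10.0.0.5", "10.0.0.100", 2)
```

Because the hash inputs never change for a given flow, adding a second link helps only when there are multiple distinct flows to spread across the members.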

Please let me know if there is anything else I can do for you.

December 14th, 2015 07:00

chjatwork,

To expand a bit (I wrote the LACP section in the guide Katie mentioned above): yes, you can certainly bond 2 x 10GbE interfaces with LACP on Isilon (from the same node, not across nodes). But you'll never get 20Gbps from a single Bonnie++, iozone, or iperf test. Why?

1. LACP load-balances based on either a source/destination MAC address hash or a source/destination IP address hash, meaning no single stream will ever use more than one of the two links at any point in time.

2. Bonnie++, iozone, etc. are tests not just of the network interfaces but also of the underlying spindles. You would need some seriously fast disks, and a lot of them, to fully saturate a single 10Gbps link, let alone two at the same time. iperf does take the disks out of the equation and makes it purely a network test, which can be helpful for finding bottlenecks; but especially on older nodes you'll find that the PCI bus itself isn't capable of 20Gbps (this is not an issue on newer nodes).

3. The only time a single client will ever use both 10GbE links on a single Isilon node at the same time is with SMBv3 Multichannel, and even then you need a client that supports it and can go that fast.
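The flip side of point 1 is that aggregate throughput across many clients can still benefit from both links, since different client IPs hash to different members. A toy sketch under the same simplified XOR-hash assumption as above (not Isilon code, and the addresses are hypothetical):

```python
# Toy illustration (NOT Isilon code): distinct client IPs spread across
# the two member links, so aggregate throughput from many clients can
# use both links even though each single stream is capped at one link.
import ipaddress

def select_link(src_ip: str, dst_ip: str, num_links: int) -> int:
    # Simplified src/dst IP hash standing in for the switch's policy.
    src = int(ipaddress.ip_address(src_ip))
    dst = int(ipaddress.ip_address(dst_ip))
    return (src ^ dst) % num_links

node_ip = "10.0.0.100"                               # hypothetical node IP
clients = [f"10.0.0.{i}" for i in range(1, 21)]      # 20 hypothetical clients
links_used = {select_link(c, node_ip, 2) for c in clients}
print(links_used)  # with clients spread out, both links carry traffic
```

This is why LACP bonding pays off for a busy cluster serving many clients, even though it does nothing for one client running a single benchmark stream.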

Hope this helps,

~Chris Klosterman

Advisory Solution Architect

EMC Emerging Technologies Enablement Team

chris.klosterman@emc.com

twitter: @croaking
