sjones51 · 252 Posts · July 14th, 2016 15:00
Hi Daryn,
Typically there is a management pool. The most common setup I have seen, when using the 10 GbE interfaces for client traffic, is to have a 1 GbE pool for management, but it will work with whichever external interface you choose.
Keep in mind that while you may be able to ping, you can't SSH to an IP address that is outside the System access zone. In earlier versions of OneFS you could log in, but performing administrative tasks from a non-System access zone would return errors.
OneFS: Cluster Administration from Access Zone other than System may return errors https://support.emc.com/kb/334103
Here is the networking best practice guide for Isilon. While it doesn't get into much regarding access zones, it does talk a little about management of the cluster starting on page 9:
https://support.emc.com/docu58740
sluetze · 2 Intern · 300 Posts · July 17th, 2016 23:00
Why are you using two different access zones? As long as you only have to create the shares/exports and don't have different authentication providers, I would stay in the same zone. You could even just use System.
It depends on your security mindset: of course, this way clients could "see" the GUI or SSH to a node, but they (hopefully) won't be able to log in.
So, as an overview, these are your possibilities:
System
- smb
- nfs
- management
(recommended if you don't mind having management and SMB/NFS in the same access zone; it keeps things simple)
or
System
- management
AccessZone1
- smb
- nfs
(recommended if you have the same authentication providers for NFS and SMB)
or
System
- management
AccessZone1
- smb
AccessZone2
- nfs
(recommended if you have different authentication providers)
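If you go with one of the separate-zone layouts above, creating the zone itself is a one-liner. A minimal sketch, assuming OneFS 8.x CLI syntax (the zone name and base path are example values, and exact flags vary between OneFS versions):

```sh
# Create a non-System access zone rooted at its own directory tree.
# Zone name and path are placeholders for your environment.
isi zone zones create --name=AccessZone1 --path=/ifs/data/zone1

# Confirm the zone and the authentication providers assigned to it.
isi zone zones view AccessZone1
```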
addisdaddy20 · 65 Posts · July 18th, 2016 08:00
To echo sjones51:
What I have seen at most of my customers' sites is that they create separate access zones for security purposes and use the System zone solely for administration. The benefit is that it restricts things like SSH to the System zone, which helps you maintain security. Beyond that, there is no issue with running both NFS and SMB in the same zone; only if you need different authentication providers would you create separate zones for SMB and NFS.
If you implement something like RFC 2307 so that NFS and SMB clients can access the same files, you would want them in the same zone. Just a little food for thought.
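For the RFC 2307 route, the UNIX-attribute mapping is configured on the AD provider rather than on the zone. A hedged sketch, assuming your OneFS release supports the `--sfu-support` option on `isi auth ads modify` (`example.com` is a placeholder for your AD provider name):

```sh
# Have the AD provider read RFC 2307 UNIX attributes
# (uidNumber/gidNumber) from Active Directory, so SMB and NFS
# users resolve to the same identities in a shared zone.
isi auth ads modify example.com --sfu-support=rfc2307
```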
cheers,
D_Tracy
Clarkson14 · 1 Rookie · 89 Posts · July 18th, 2016 09:00
We have Active Directory in the CIFS zone, and even though it wouldn't be a big deal having the Unix servers in the same zone, the Unix admins tend to like to keep things separate from the Windows world, so that's why we'll end up with two zones there.
Is there any reason why we couldn't use replication and management on the same pool in the System zone? The only reason I ask is that we already have a replication network/VLAN that spans to our DR site, which makes setting up replication really easy since it's already configured. Normally we keep management on a separate network/VLAN, but that's a different subnet, which would make configuring the System zone more complicated since we'd have two separate subnets. Just curious what others do when you have replication to worry about.
addisdaddy20 · 65 Posts · July 18th, 2016 10:00
I see no reason why you can't have replication and administration in the System zone. The only things that really determine whether you need separate zones are authentication and any SSH restrictions.
So, short answer: I see no reason why you shouldn't be able to do what you're talking about (having replication in with administration on the System zone).
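For what it's worth, a combined management/replication setup in the System zone boils down to one network pool plus a SyncIQ source restriction. A sketch under stated assumptions: OneFS 8.x syntax, with the subnet/pool IDs, IP range, interface names, and policy name all placeholders (SyncIQ flag names may differ on your release):

```sh
# Pool in the System zone carrying management + replication traffic.
isi network pools create groupnet0.subnet1.repl-mgmt \
    --ranges=10.20.30.10-10.20.30.20 \
    --ifaces=1-4:ext-2 \
    --access-zone=System

# Restrict an existing SyncIQ policy's source traffic to that pool
# (flag names are an assumption; check your version's CLI reference).
isi sync policies modify mypolicy \
    --source-subnet=subnet1 --source-pool=repl-mgmt
```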
dynamox · 9 Legend · 20.4K Posts · July 18th, 2016 10:00
I have two zones:
System: used for NFS/CIFS, Administration and Replication
SecureZone: used for CIFS with enabled auditing (Varonis used for audit log processing)
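A zone-scoped auditing setup like the SecureZone example above can be sketched as follows (assuming OneFS 8.x `isi audit` syntax; the zone name matches the example and the flags may differ on earlier releases):

```sh
# Enable protocol auditing for only the SecureZone access zone;
# the resulting audit log can then be forwarded (e.g. via CEE)
# to a processor such as Varonis.
isi audit settings global modify \
    --protocol-auditing-enabled=yes \
    --audited-zones=SecureZone
```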