February 28th, 2013 21:00

Questions about gatekeeper properties for ECC and SMC on a VM

Hello

I have some questions about the gatekeeper configuration required for SMC and ECC servers running on VMs.

I have installed SMC on a Windows 2008 VM and have converted the ECC Windows 2003 server to a VM (P2V).

I need to know which properties and options should be chosen for the RDMs used by these two servers.

Shall I make a separate masking view with only one path to these devices? Is any special configuration required for them, and is that configuration required from the vCenter side or from the SMC/storage side?

Please advise

Thanks and Regards
Sami

March 1st, 2013 02:00

> I need to know which properties and options should be chosen for the RDMs used by these two servers.

I recommend adding 6 GKs to the Solutions Enabler VM and 8 GKs to the ECC VM.

The goal is to have enough gatekeepers to service all concurrent Symmetrix commands and queries without commands being delayed or rejected for lack of a gatekeeper resource.

> shall I make a separate masking view with only one path to these devices?

> any special configuration for them?

It is fine to add the GKs to the existing VMware hosts' masking view.

> and is that configuration required from the vCenter side or from the SMC/storage side?

Create the GKs and add them to the existing VMware hosts' masking view.
Gatekeepers require only a small amount of space: 3 MB (3 cylinders) for Enginuity levels 57xx, 58xx and higher, and 3 MB (6 cylinders) for Enginuity levels 56xx and lower.
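As a rough sketch, creating the gatekeepers and adding them to an existing storage group might look like the following SYMCLI session. The SID `1234`, the group name `esx_sg`, and the device range `0F00:0F05` are placeholders, and the `create gatekeeper` form is 5876-era syntax; on earlier Enginuity you would create small devices with `create dev` instead. Verify against your Solutions Enabler documentation.

```shell
# Create six small gatekeeper devices (placeholder SID 1234;
# "create gatekeeper" requires Enginuity 5876 / SE 7.x)
symconfigure -sid 1234 -cmd "create gatekeeper count=6, emulation=FBA;" commit

# Add the new devices (placeholder range 0F00:0F05) to the
# ESX hosts' existing storage group
symaccess -sid 1234 -type storage -name esx_sg add devs 0F00:0F05
```

You can then confirm the group contents with `symaccess -sid 1234 show esx_sg -type storage`.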

No specific settings or requirements are needed at the vCenter level.

I would also recommend following the Primus knowledgebase articles below:

emc255976 : Gatekeepers - All You Need to Know

16. Virtualization/Cluster/Migration or Application Specific Gatekeeper Requirements.

Virtual Server Environments:
Virtual server environments often permit the movement of a guest operating system instance from one physical server to another. If that instance is doing Symmetrix management or control, the gatekeepers being used must be visible to that instance on whatever physical server it is running upon. As a result, the gatekeeper devices must be made visible to the various physical servers the guest may operate upon. Note that the rule for not sharing gatekeepers remains: the guest operating system image with its Solutions Enabler instance must run on only one physical machine at any given time, with only one instance using the gatekeeper at any one time.

emc248427 : Can gatekeepers be shared on virtual machines or physical ESX servers in a VMware Cluster?

Sharing gatekeepers across different virtual machines is NOT supported.
Sharing gatekeepers at the physical clustered ESX server level IS supported.

On VMware ESX, multipathed gatekeepers using VMware NMP require a path selection policy of FIXED; the Round Robin (RR) policy is NOT supported.
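On the ESXi side, the FIXED policy can be set per device with esxcli; a minimal sketch, where the `naa.` identifier is a placeholder for your gatekeeper LUN:

```shell
# Force the Fixed path selection policy on a gatekeeper LUN
esxcli storage nmp device set \
    --device naa.60000970000192601234533030304630 \
    --psp VMW_PSP_FIXED

# Confirm the active policy for that device
esxcli storage nmp device list \
    --device naa.60000970000192601234533030304630
```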

Gatekeepers must be presented to the virtual machine as RDM in physical compatibility mode for Solutions Enabler to operate properly.

emc193692 : How should I present gatekeepers to a VMware virtual machine?

Gatekeepers should be presented as physical RDM devices.
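If you prefer the command line over the vSphere Client, a physical-mode RDM mapping file can be created on the ESX host with `vmkfstools -z` and then attached to the VM as an existing disk. The datastore path and `naa.` name below are placeholders:

```shell
# Create a physical compatibility mode (pass-through) RDM pointer file
# for the gatekeeper LUN; -z = physical mode, -r would be virtual mode
vmkfstools -z /vmfs/devices/disks/naa.60000970000192601234533030304630 \
    /vmfs/volumes/datastore1/ECC-VM/gk1-rdm.vmdk
```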


March 1st, 2013 02:00

Starting with Enginuity 5876 and SE 7.4, gatekeepers no longer need to be presented down a single path. On earlier Enginuity versions, if you have PowerPath installed on the management host and the GKs are multipathed, PowerPath will take care of this.

regards,

Saurabh


March 1st, 2013 03:00

If no PowerPath is installed and Enginuity is below 5876, I would recommend a separate masking view for the GKs so that they are available through a single path.
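A dedicated masking view for the gatekeepers could be sketched roughly as below with symaccess. All the names, the HBA WWN, the director:port, and the device range are placeholders; check the exact syntax against your Solutions Enabler release documentation.

```shell
# Storage group holding only the gatekeeper devices
symaccess -sid 1234 -type storage -name gk_sg create devs 0F00:0F05

# Initiator group with a single HBA, so the GKs see one path
symaccess -sid 1234 -type initiator -name gk_ig create -wwn 10000000c9abcdef

# Port group with a single FA port
symaccess -sid 1234 -type port -name gk_pg create -dirport 7E:0

# Tie them together into the masking view
symaccess -sid 1234 create view -name gk_mv -sg gk_sg -ig gk_ig -pg gk_pg
```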

regards,

saurabh


March 1st, 2013 03:00

Great answers.

Yes, I need to check the multipathing versus single-path question.

OK, the boxes are on 5875 and SE is 7.4 for now,

but we are planning to go to 5876 in two weeks or so.

ECC has PowerPath but SMC doesn't.

Shall I make a masking view for the GKs with one path until I have upgraded to 5876?


March 1st, 2013 04:00

Make sure the RDMs are presented using physical compatibility mode.
