97 Posts

May 17th, 2007 17:00

It appears the other DAEs on the loop do become inaccessible.
Primus - emc152751

Thanks

2.2K Posts

May 18th, 2007 06:00

Bowling,
It sounds like you are planning for the possibility of losing both power supplies in a DAE at the same time. I don't think that is a very likely scenario.

That being said, building RAID10 groups that span enclosures does build higher levels of fault tolerance into your LUNs. The recommendation, though, is to split the RAID10 groups across not only enclosures but buses as well. So with a RAID10 4+4, you would have the four primary disks on one enclosure and bus and the four secondary disks on another enclosure and bus. This will protect you from both bus and enclosure failures. To create a RAID10 with that configuration, however, you have to use navicli to create the group; that is the only way to designate the primary and secondary disks in a RAID10 group.
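A minimal sketch of what that CLI-only step might look like. The SP address (`spa_ip`), the RAID group number, and the bus/enclosure locations are all hypothetical, and the disk-order convention for pairing primaries with secondaries varies by FLARE release, so verify the generated command against the Navisphere CLI reference before running it:

```python
# Hedged sketch: compose a navicli createrg call for a RAID10 4+4 whose
# primary disks sit on bus 0 / enclosure 1 and secondaries on bus 1 / enclosure 2.
# Disk IDs use CLARiiON Bus_Enclosure_Disk notation; "spa_ip" is a placeholder.

def raid10_disk_list(primary_bus_enc, secondary_bus_enc, n_pairs):
    """Interleave primaries and secondaries so each mirror pair spans the
    two bus/enclosure locations (pairing-order assumption; confirm the
    ordering rule against your array's CLI documentation)."""
    pb, pe = primary_bus_enc
    sb, se = secondary_bus_enc
    disks = []
    for slot in range(n_pairs):
        disks.append(f"{pb}_{pe}_{slot}")   # primary member of this mirror pair
        disks.append(f"{sb}_{se}_{slot}")   # secondary member of this mirror pair
    return disks

disks = raid10_disk_list((0, 1), (1, 2), 4)
cmd = "navicli -h spa_ip createrg 10 " + " ".join(disks)
print(cmd)
```

The point is only that the disk list you hand to `createrg` encodes the primary/secondary split, which Navisphere Manager does not let you control.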

Using this setup, I had a failure in our CX3-80 that would have brought down a production LUN if not for the fact that the RAID10 set was split between buses and enclosures. If you can afford it, and if the application warrants it, it is worth it.

Aran

2.2K Posts

May 18th, 2007 07:00

Yikes... I had never heard of having to shut down a DAE for disk problems before. That sucks. I don't see how you could plan for that kind of outage using RAID5.

97 Posts

May 18th, 2007 07:00

Thanks... We had a scenario where we had to shut down a DAE twice in the past couple of months due to a disk problem they couldn't figure out.

This, of course, took down everything on the DAE because of the way the RGs were configured. RAID 10 LUNs usually go on our DMX systems, so I was trying to design a better RAID 5 configuration that would give us "higher" availability.

I wanted to create RG sets of 5 disks (4+1) across 5 DAEs, but that leaves me with two DAEs on the same back-end bus. In that layout, the only way we would suffer an outage is if the first DAE went out and caused the second DAE to go offline (or another DAE lower in the chain went out)...
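One way to sanity-check that layout is to simulate enclosure loss. This is a rough sketch with made-up DAE names: a 4+1 group striped one-disk-per-DAE survives any single enclosure failure, but not the chained two-enclosure failure described above:

```python
# Hedged sketch: a RAID5 4+1 group with one member disk per DAE loses exactly
# one disk when a single DAE fails (survivable), but two disks when a DAE
# failure also takes out a downstream enclosure on the same loop (fatal).

raid_group = {"DAE0": 1, "DAE1": 1, "DAE2": 1, "DAE3": 1, "DAE4": 1}  # disks per DAE

def survives(failed_daes, parity_disks=1):
    """RAID5 tolerates the loss of at most one member disk."""
    lost = sum(raid_group.get(d, 0) for d in failed_daes)
    return lost <= parity_disks

print(survives({"DAE2"}))           # single enclosure down -> True
print(survives({"DAE0", "DAE1"}))   # chained pair down -> False
```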

I could do RAID 5 (3+1), but then the cost jumps and we incur a 25% capacity penalty vs. 20%. Some RGs could be designed in an (8+1) configuration, further complicating things...
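The capacity-penalty figures above follow from the parity overhead of a RAID5 N+1 group being 1/(N+1); a quick check of the three layouts mentioned:

```python
# RAID5 N+1 parity overhead: one disk's worth of every N+1 holds parity.
def raid5_penalty(n_data):
    return 1 / (n_data + 1)

for n in (3, 4, 8):
    print(f"{n}+1: {raid5_penalty(n):.1%}")
# 3+1 -> 25.0%, 4+1 -> 20.0%, 8+1 -> 11.1%
```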