I have a PowerEdge 4400 (Service Tag: 87Z600B) with a PERC 3/DI controller running NetWare 5.0 SP6a. I have a 3 x 36 GB RAID5 config. I need to add more storage space, so we ordered 3 additional hard drives from Dell. The new drives are the same size as the originals, but the new ones are U320s vs. U160s. Dell told me they were compatible and that the drives would just "dumb down" to run at the slower speed. I intend to set up the three new drives as a separate RAID5 (via OpenManage Array Manager) instead of adding them to the existing array.
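For what it's worth, the capacity trade-off of two separate arrays vs. one big one is easy to work out: a RAID5 array of N drives gives (N - 1) x drive-size of usable space, since one drive's worth goes to parity. A quick sketch of the arithmetic for 36 GB drives (the 6-drive figure is only for comparison; it assumes you could instead expand the existing array to all six disks):

```shell
# RAID5 usable capacity = (N - 1) * drive_size; parity costs one drive's worth.
# Two separate 3 x 36 GB RAID5 arrays:
echo $(( 2 * (3 - 1) * 36 ))   # 144 GB usable
# One 6 x 36 GB RAID5 array, for comparison:
echo $(( (6 - 1) * 36 ))       # 180 GB usable
```

So two 3-disk arrays cost you an extra drive's worth of parity compared with one 6-disk array, in exchange for keeping the new array independent of the existing one.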
My problem is that when I added the three new drives to the SCSI backplane, the server would no longer boot NetWare. NetWare locks up at the stage where it is "Scanning for new devices and partitions". Going into the PERC controller's BIOS, the new drives are listed, so I know the drives are being detected. The new drives have not been initialized or set up for the new RAID5 yet, since I need NetWare to boot so I can use OpenManage Array Manager to do that. NetWare boots fine when the new drives are removed.
The Perc driver is PERC2.HAM (Dated 11/20/2001 version 2.70)
The Perc 3/DI firmware version is: 2.7 build 3546
Anyone have any ideas what's causing this?
Would you need to format the drives if you are just going to put them into a RAID 5? Doesn't creating the RAID 5 wipe any formatting that is on the drives anyway?
The controller does initialize and format the drives when it creates the RAID 5, but if there is a preexisting RAID configuration on the drives it may have issues. So it is good practice to do a quick format of the drives prior to configuration.
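On the PERC itself you would normally do this low-level format or clear from the controller's BIOS utility. As a general illustration of the idea (not the PERC procedure itself): stale RAID signatures usually live at the very start and very end of a disk, so zeroing those regions is enough to clear them. The sketch below runs against a scratch image file so it is safe to try; on a real drive you would substitute the device node, which is destructive, so triple-check the target first.

```shell
# Illustration using a scratch image file as a stand-in for a drive.
IMG=/tmp/scratch-disk.img
dd if=/dev/urandom of="$IMG" bs=1M count=8 2>/dev/null

# Zero the first 1 MB (partition table and most RAID signatures live here):
dd if=/dev/zero of="$IMG" bs=1M count=1 conv=notrunc 2>/dev/null

# Some controllers also keep metadata at the end of the disk;
# zero the last 1 MB as well (seek is in units of bs=1M):
SIZE_MB=$(( $(wc -c < "$IMG") / 1048576 ))
dd if=/dev/zero of="$IMG" bs=1M count=1 seek=$(( SIZE_MB - 1 )) conv=notrunc 2>/dev/null
```

The `conv=notrunc` flag keeps dd from truncating the target at the end of the write, which matters when you are patching regions in the middle or at the end of an existing image or device.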
Hope this helps.