RAID 1 on CentOS won't boot with one drive removed
I have just installed a fresh copy of CentOS 5 and created a RAID 1 array across 2 drives.
Both drives are 40 GB but different brands. I successfully created the RAID 1 setup by following guides, which gave me this layout:
/dev/md0   ext3           [check mark]     100 MB
/dev/md1   swap           [check mark]     1024 MB
/dev/md2   ext3           [check mark]     [remaining space]
/dev/sda1  /dev/md0       software RAID    [no check mark]  100 MB
/dev/sda2  /dev/md1       software RAID    [no check mark]  1024 MB
/dev/sda3  /dev/md2       software RAID    [no check mark]  [remaining space]
/dev/sdb1  /dev/md0       software RAID    [no check mark]  100 MB
/dev/sdb2  /dev/md1       software RAID    [no check mark]  1024 MB
/dev/sdb3  /dev/md2       software RAID    [no check mark]  [remaining space]
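To confirm the arrays were assembled with both members active, I checked them roughly like this (commands only, paraphrased from memory; device names match the layout above):

```
# Show all arrays and their member state at a glance
cat /proc/mdstat

# Detailed view of the root and boot arrays
mdadm --detail /dev/md0
mdadm --detail /dev/md2
```

Everything reported both drives as active, no degraded arrays.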
I also installed GRUB on both drives.
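If it matters, I installed GRUB to each drive's MBR from the GRUB console, something along these lines (the exact (hdX,Y) mappings are from memory, so they are an assumption on my part):

```
grub> root (hd0,0)
grub> setup (hd0)
grub> root (hd1,0)
grub> setup (hd1)
```

Both setup commands reported success.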
So everything was good and the system was up and running, but then I decided to test the RAID: I unplugged one of the drives and tried to boot from the remaining one alone.
I get the GRUB screen, then CentOS starts up, but it reports a bad superblock, prints a few lines about directories that can't be found, and finally kernel panics. When I plug the drive back in so both drives are connected, the system boots fine.
What am I doing wrong? I just want to make sure the system will keep running if one drive fails. Thank you for any help, guys!