mdadm fail, disk won't show up

One of the disks in my RAID array failed:

cat /proc/mdstat:

Personalities : [raid1] 
md1 : active raid1 sda2[2](F) sdb2[1]
      488086720 blocks [2/1] [_U]

md0 : active raid1 sda1[0] sdb1[1]
      192640 blocks [2/2] [UU]

When I tried to remove and then re-add the drive, the following happened:

mdadm: hot removed /dev/sda2
host:# mdadm /dev/md1 --add /dev/sda2
mdadm: add new device failed for /dev/sda2 as 2: Invalid argument

fdisk -l /dev/sda returns nothing.
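In case it helps, here are a few ways to check whether the kernel still sees the disk at all (smartctl needs smartmontools installed):

# Does the kernel still have a block device for sda?
ls /sys/block/sda

# Any recent SATA/SCSI errors mentioning the disk?
dmesg | grep -i sda | tail -20

# SMART health, if the disk still answers at all
smartctl -H /dev/sda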

Can you tell me what's happening, especially since md0 still looks fine? I tried Googling it but nothing interesting came up.

Update (verbatim output): now md0 has failed too:

Personalities : [raid1]
md1 : active raid1 sdb2[1]
      488086720 blocks [2/1] [_U]

md0 : active raid1 sda1[2](F) sdb1[1]
      192640 blocks [2/1] [_U]

Solution 1:

Sounds like the sda drive is dead. I'm guessing md0 is your boot array, which probably doesn't get accessed much, so md didn't notice that sda1 was dead too until later (as your update shows).
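If md0 still shows sda1 as active at this point, it's worth telling md explicitly that the disk is gone before you reboot. A minimal sketch, using the device names from the question:

# Mark the member as failed, then remove it from the array
mdadm /dev/md0 --fail /dev/sda1
mdadm /dev/md0 --remove /dev/sda1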

Before you reboot, make sure you've installed grub on sdb, since the BIOS probably won't be able to boot off the failed first drive.

If you haven't already done this, run grub, then at grub's command prompt do:

root (hd1,0)
setup (hd1)

(From the guide here for grub 0.95; I'm not sure whether this has changed in newer versions of grub. Note: hd1 assumes that sdb is the second drive in the system. If you've mixed IDE (hda) and SATA/SCSI drives, you might have to figure this one out on your own.)
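Once you've physically swapped in a replacement drive, the usual recovery is to copy the partition table from the surviving disk and re-add the partitions, after which md resyncs on its own. A sketch, assuming the new disk comes up as /dev/sda again and sdb is still the good one:

# Clone sdb's partition table onto the blank replacement disk
sfdisk -d /dev/sdb | sfdisk /dev/sda

# Re-add each partition to its array
mdadm /dev/md0 --add /dev/sda1
mdadm /dev/md1 --add /dev/sda2

# Watch the resync progress
cat /proc/mdstat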