I'll start with the system specs:
Gigabyte AM2+ 785G mATX motherboard
2x Western Digital 640GB Caviar Blue
Phenom X6 1055T
4GB DDR2
Radeon 4770
Windows 7 (/dev/sda1, /dev/sdb1)
Ubuntu 10.04 (/dev/sda2, /dev/sdb2)
I have the two HDs set up in a chipset RAID 1 array, which is working fine in Windows. I have Grub2 installed as the primary boot loader.
The problem: When I update grub for a new kernel install (e.g. 2.6.35 from mainline PPA), the update-grub process sees the new kernel, but when I reboot, I only get the pre-existing kernels as boot options.
I'm pretty sure this is because Ubuntu is only writing the new kernel and grub configuration to one of the drives, but grub is booting from the other. When I list /dev/sd*, I get separate entries for both /dev/sda* and /dev/sdb*, which tells me that Ubuntu sees the two disks individually and is mounting just one of them, rather than respecting the RAID array I have set up.
When I go into a grub command prompt at boot time, all I get is (hd0,*) listed, so it seems that grub only sees the primary drive in the array.
The big question: Is there any way that I can force Linux to treat both drives as a true RAID 1 array while still leaving Windows 7 bootable using the chipset RAID setup? I've looked at a few cheap hardware RAID cards (well, as cheap as those things get), but I'd like to avoid throwing hardware at this if I can avoid it. Will using mdadm and the other raid tools only work for a Linux-bootable drive, or can I use this to get Linux to cooperate without interfering with Windows?
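For what it's worth, a first diagnostic step would be to check whether Linux can see the chipset's RAID metadata at all. The commands below are a sketch, not a verified fix for this board: dmraid is the tool that handles BIOS "fakeraid" metadata on 10.04, and if it recognizes the set, activating it exposes the mirror as a device-mapper node that both Linux and the BIOS-level Windows setup can coexist with.

```shell
# Sketch: inspect and activate a BIOS/chipset ("fakeraid") mirror with dmraid.
# Assumes the dmraid package is installed (sudo apt-get install dmraid).

sudo dmraid -r     # list raw devices that carry BIOS RAID metadata
sudo dmraid -s     # show discovered RAID set names and their status

# Activate all discovered sets; the mirror should then appear under
# /dev/mapper/ (the exact set name depends on the chipset vendor)
sudo dmraid -ay
ls /dev/mapper/

# From then on, Ubuntu would need to mount (and GRUB to boot) the
# /dev/mapper/* device instead of /dev/sda2, so that every write goes
# through the mirror and reaches both physical disks.
```

Note that plain mdadm software RAID would not help here: it uses its own on-disk metadata, which the chipset BIOS (and therefore Windows) would not understand, so dmraid is the usual route when the Windows side must keep using the BIOS array.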