I had this happen to me last month. My setup is similar, with two
drives in RAID1. In addition, I'm using LVM on top of everything
else, but I don't think that affects the root issue. My workaround was
basically to use cfdisk to create the primary partitions by hand. The
exact steps were:
* zeroed out both drives and removed both partition tables (roughly the
dd sketch after this list)
* ran the Ubuntu installer to create a normal primary 20 GB partition on
drive A and installed the OS
* booted into Linux, ran cfdisk on drive B (cfdisk -z /dev/sdb), wrote the
empty partition table, and rebooted just for good measure (see the note on
-z after this list)
* then ran cfdisk /dev/sdb again and created two partitions (one 2 GB boot
partition, and the rest in another primary). I created an ext4 filesystem
on the larger partition (one-liner after this list).
* next, booted off the USB drive, mounted the new partition, and tar-piped
everything from the installed partition into the new one (the tar pipe is
sketched after this list)
* then zeroed out the partition table on drive A (/mnt/sbin/cfdisk -z
/dev/sda), rebooted into the installer, mounted drive B, and ran cfdisk to
create the partitions on drive A.
* once the primary partitions were created, I used the installer to set up
RAID1 and LVM (a quick way to verify the result is sketched below).
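
In case it helps, here's roughly what the hands-on parts looked like as
commands. I don't have my shell history anymore, so treat these as
sketches; the device names match my layout above (drive A = /dev/sda,
drive B = /dev/sdb), so double-check yours before running anything
destructive. Zeroing the partition tables:

    # overwrite the first 512 bytes (MBR + partition table) of each drive
    dd if=/dev/zero of=/dev/sda bs=512 count=1
    dd if=/dev/zero of=/dev/sdb bs=512 count=1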
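
The -z flag just makes cfdisk start from a zeroed partition table instead
of reading the one on disk. After writing it, you can confirm the drive
looks empty:

    fdisk -l /dev/sdb    # should list no partitions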
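
Making the filesystem on the larger partition, assuming cfdisk left it
as /dev/sdb2:

    # ext4 on the second (larger) primary partition of drive B
    mkfs.ext4 /dev/sdb2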
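
The tar pipe was along these lines (the mount points are arbitrary, and
/dev/sda1 as the installed root is an assumption based on the single
20 GB primary):

    mkdir -p /mnt/old /mnt/new
    mount /dev/sda1 /mnt/old    # the installed system
    mount /dev/sdb2 /mnt/new    # the fresh ext4 partition
    # copy everything, preserving permissions and ownership
    (cd /mnt/old && tar cf - .) | (cd /mnt/new && tar xpf -)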
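
And to sanity-check the arrays and volumes once the installer finished:

    cat /proc/mdstat    # each md device should show [UU]
    pvs && lvs          # LVM physical and logical volumes
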
After I manually created all my primary partitions, I was able to use
the installer for everything else. It certainly seems that the
installer's disk partitioner is broken in this case.
https://bugs.launchpad.net/bugs/591721
Title:
installer fails creating two separate RAID devices and boot fails
(regression)