This only started happening with oVirt Node 4.3; 4.2 did not have this issue.  Since 
I updated to 4.3, the host drops into emergency mode on every reboot.  The first few 
times this happened I reinstalled the OS from scratch, but after some digging I 
found that the drives mounted via /etc/fstab cause the problem, 
specifically the mounts below.  All three are single drives: one is an SSD and the 
other two are individual NVMe drives.

UUID=732f939c-f133-4e48-8dc8-c9d21dbc0853 /gluster_bricks/storage_nvme1 auto defaults 0 0
UUID=5bb67f61-9d14-4d0b-8aa4-ae3905276797 /gluster_bricks/storage_ssd auto defaults 0 0
UUID=f55082ca-1269-4477-9bf8-7190f1add9ef /gluster_bricks/storage_nvme2 auto defaults 0 0

To get the host to actually boot, I have to go to the console, delete 
those mounts from /etc/fstab, reboot, and then re-add them, after which they end up 
with new UUIDs.  All of these hosts rebooted reliably on 4.2 and earlier, but every 
4.3 release has had this same problem (I keep updating in the hope that it is fixed).
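
For reference, the recovery sequence from the emergency shell looks roughly like 
this (device names are just examples; blkid reports whatever UUIDs the bricks 
currently carry):

    # comment out the three gluster brick lines, then reboot
    vi /etc/fstab
    systemctl reboot

    # once the host is up, check the current UUIDs of the brick devices
    blkid /dev/nvme0n1 /dev/nvme1n1 /dev/sdb

    # re-add the fstab entries using the UUIDs reported above, then remount
    mount -a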
 