LVM drives are not activated at boot and must be activated manually.

TM876
New Member · Jul 8, 2021
Hello,

For reference, I believe this started after upgrading to Proxmox 7 (I also replaced my PSU at the time, though I'm not sure that would cause this). Before the upgrade, my two HDDs took a few minutes to activate on boot. Now I have to manually deactivate the thin pool sub-volumes with lvchange -an HDD-#/HDD-#_tmeta and lvchange -an HDD-#/HDD-#_tdata, then activate the volume group with vgchange -ay.
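The manual workaround above can be sketched as a shell session. This is a sketch only: the volume group name HDD-1 is a placeholder for the actual HDD-# names (which I've left elided as in the post), and the commands need root on a host with LVM thin pools:

```shell
# Sketch of the manual recovery, assuming a volume group "HDD-1"
# containing a thin pool whose hidden sub-volumes are
# HDD-1_tmeta and HDD-1_tdata (all names are placeholders).

# Deactivate the stuck thin-pool metadata and data sub-volumes
lvchange -an HDD-1/HDD-1_tmeta
lvchange -an HDD-1/HDD-1_tdata

# Activate all logical volumes in the volume group
vgchange -ay HDD-1

# Verify the pool and volumes are now active
# (an "a" in the fifth position of the attr column means active)
lvs -a -o lv_name,vg_name,lv_attr HDD-1
```
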

Write-up on my blog of what I do:
Proxmox 7 - Activating LVM volumes after failure to attach on boot

For reference, here is what lsblk shows:
Screenshot 2021-08-13 at 14.04.56.png

Boot verbose messages:
2021-08-22-boot-error-messages.jpg

After manually activating:
Screenshot 2021-08-13 at 15.07.25.png

If anyone has any ideas on what should be my next steps then I would greatly appreciate it.
 

Attachments

  • IMG_7198.jpg (432.1 KB)
I am also having the exact same issue, but I only have a single HDD: my main drive is an NVMe and the second drive is a 6 TB HDD. If I reboot, I need to do exactly the same. Proxmox 7 here as well.
 
Yeah, since I posted this thread I was considering downgrading to 6.4, since I also had some kernel panic issues. First, though, I ran the rescue boot from the 6.4 installation ISO and let it run through. It didn't fully complete (it crashed or errored out at some point), but when I booted back into 7.0 the earlier errors were gone and it booted normally twice. So I can't say anything for certain, and make sure you have backups beforehand if you try this.