So I ran into an issue. I have a 6-disk RAID 50 config, 20TB after RAID. I figured I'd use 19TB as a NAS and run VMs on the last 1TB. It's far more data than I have on any other machine, so I can't viably back it up elsewhere and restore... which was kinda the point of it. About a month ago, the server started randomly going down, with Proxmox usually unresponsive. I can reboot and get Proxmox back up, but then nothing will start.
My setup is a Dell R620, 384GB DDR3, dual Xeon E5-2697 v2, and 6x 4TB HDDs in RAID 50.
I can go into iDRAC to reboot, and then everything goes back to working except the LVM. My pool is just Main/Main.
When I login, none of my VMs start and I get the error "TASK ERROR: activating LV 'Main/Main' failed: Activation of logical volume Main/Main is prohibited while logical volume Main/Main_tmeta is active."
I can sometimes get it running again with:
lvchange -an Main/Main_tdata
lvchange -an Main/Main_tmeta
lvchange -ay Main/Main
but sometimes it returns an error that Main_tmeta or Main_tdata is in use. I checked lvs and it shows that the NAS volume has expanded to 21.44TB. I'm sure that's what's causing the issue, but how would I trim it back down and stop it from expanding in the future? Once I get it back up, my NAS now normally only runs about a day before crashing again.
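In case it helps, this is roughly the sequence I've been running to check the pool and recover (assuming the VG is named Main; the -o column list is just what I use to see thin pool fill levels):

```shell
# Show all LVs in the VG, including hidden _tmeta/_tdata components,
# with thin pool data and metadata fill percentages
lvs -a -o lv_name,lv_size,data_percent,metadata_percent Main

# Deactivate the stuck pool components, then reactivate the pool
lvchange -an Main/Main_tdata
lvchange -an Main/Main_tmeta
lvchange -ay Main/Main
```

When the deactivation fails with "in use", I haven't found anything that releases it short of another reboot.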
Any help would be greatly appreciated