Hello,
I am trying to repair my Proxmox installation after replacing my disks, and I'm failing to do so.
For background:
I installed Proxmox on my laptop-converted-to-a-server, with two SSDs and a ZFS pool mirrored across the two disks.
Not much, just:
- One LXC container with TurnKey Linux and Portainer/Docker installed, holding most of my stuff
- One VM with HomeAssistant-OS
- A bunch of expendable LXC tests
As the disks were getting old, I acquired two new disks of the same size to replace them. I followed this message from a thread and replaced my secondary disk. All went fine.
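From memory, the secondary-disk swap boiled down to something like the following. This is only a sketch: the device names are illustrative, the pool name is shown as the Proxmox default rpool, and I may be misremembering details.

```
# Copy the partition layout from a healthy disk onto the new disk, then give
# the copy fresh random GUIDs (this is what the Proxmox admin guide describes):
sgdisk /dev/disk/by-id/ata-OLD-SECONDARY -R /dev/disk/by-id/ata-NEW-SECONDARY
sgdisk -G /dev/disk/by-id/ata-NEW-SECONDARY

# Swap the old device out of the mirror for the matching partition on the new
# disk (partition 3 is where ZFS lives on a default Proxmox install):
zpool replace rpool /dev/disk/by-id/ata-OLD-SECONDARY-part3 /dev/disk/by-id/ata-NEW-SECONDARY-part3

# Wait for the resilver to finish before touching anything else:
zpool status rpool

# (The admin guide also has proxmox-boot-tool format/init steps for the ESP on
# bootable disks; not reproducing those here.)
```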
Then, foolishly, I did the same for the main disk, without considering that it was the boot drive. Obviously I couldn't boot at all after that, as no disk contained the bootloader.
Strange thing N°1: after popping the original boot disk back in, it is no longer recognized as a boot drive, and I am unable to boot into the old installation. I still can't figure out why. When booting a live Ubuntu USB, I can see both disks just fine, bootloader included, but the disk is simply not picked up by the BIOS.
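If it helps, I can post the output of things like the following from the live USB (assuming the machine boots in UEFI mode; I'm honestly not sure which mode it uses):

```
# Overview of disks, partitions and filesystems as the live USB sees them:
lsblk -o NAME,SIZE,FSTYPE,LABEL

# Partition table of the old main disk (placeholder device name):
sgdisk -p /dev/sdX

# Firmware boot entries, if the machine is booted in UEFI mode:
efibootmgr -v
```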
Reading the former data
From the live USB, I'm absolutely able to see the pool, with its mirrored data, and mount it to get at the data. But not the host data: I can't get anything out of /etc/pve.
As I can't boot into the old installation anymore, I have limited insight and don't know how to retrieve more. I was expecting to find an LVM partition on one of the old disks, but I don't see any at the moment; maybe I inadvertently deleted it? Right now I just see zfs_member partitions on both, plus a BIOS boot and an EFI partition on the old main disk.
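For reference, this is roughly how I've been looking at the old pool from the live USB. The pool name is shown as the Proxmox default rpool and the mountpoint is just one I picked, so both are assumptions:

```
# Import the old pool under an alternate root (read-only to be safe):
zpool import -f -o readonly=on -R /mnt/oldroot rpool

# See which datasets exist and where they are mounted:
zfs list -o name,mountpoint,mounted

# Guest data is reachable, but the host config directory is empty:
ls /mnt/oldroot/etc/pve
```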
Restoring in a new install
After that, I installed both new drives again (one of which now contains the replica of the ZFS pool) and made a fresh install of Proxmox on the other new disk.
I can now boot, but I obviously lost all configuration. I stumbled upon this thread, but as I can't access the /etc/pve of the old installation anymore, I don't know how to proceed.
I created a new ZFS storage through the GUI, using the existing pool (the pool on the secondary drive was picked up by Proxmox). But now I can't restore any of the previous configuration, or set up proper, healthy ZFS replication again.
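As far as I can tell, the storage I added through the GUI is equivalent to something like this on the command line (the storage ID and pool name below are placeholders for what they're actually called on my system):

```
# ZFS storage entry pointing at the surviving pool, for guest disks and
# container volumes ("old-data" and "tank" are placeholder names):
pvesm add zfspool old-data --pool tank --content images,rootdir

# Check that Proxmox sees it:
pvesm status
```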
TL;DR
1. How do I recover the host configuration from the old disks (if I understand correctly, the /etc/pve folder is crucial)?
2. How do I restore that configuration to get my container and VM back?
3. How do I get the ZFS pool back to a redundant setup? I tried to replace the former main disk with the new main one, but that new disk already has one big LVM partition, so ZFS can't just take space from it (see the sketch just below for what I had in mind).
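On question 3, what I had in mind was roughly the following, but I don't see how it can work while the new main disk is one big LVM physical volume. Everything below is hypothetical: "tank" stands in for my actual pool name and the device paths are placeholders.

```
# Current (degraded) state of the pool:
zpool status tank

# What I would expect to run to restore the mirror, if the new main disk had a
# partition free for ZFS (hypothetical device paths):
zpool replace tank ata-OLD-MAIN-part3 /dev/disk/by-id/ata-NEW-MAIN-part3

# But the fresh install put one big LVM physical volume on the new main disk,
# so there is no such partition to point ZFS at; pvs/lvs show the installer's
# LVM layout instead:
pvs
lvs
```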
I've been struggling with this situation for the past day, and I'm very anxious about it.
Any help appreciated, and I'm happy to provide more information as needed.
Generally confident with GNU/Linux, but not as much with Proxmox or ZFS.