[SOLVED] zpool replace configuration error (lose ssh & web console when replaced drive removed)

Adams-j

New Member
May 5, 2023
I replaced one of my RAID 1 (mirror) rpool drives with the commands below (rpool has autoexpand=on):

sgdisk /dev/sdb -R /dev/nvme1n1                  # copy the partition table from the healthy drive to the new one
sgdisk -G /dev/nvme1n1                           # randomize the GUIDs on the new drive
proxmox-boot-tool format /dev/nvme1n1p2 --force  # format the new drive's ESP
proxmox-boot-tool init /dev/nvme1n1p2 --force    # install the bootloader and register the ESP
zpool replace -f rpool nvme-eui.0000000001000000e4d25c54db2a4c00-part3 nvme-eui.6479a7735000001f-part3   # resilver onto the new drive's ZFS partition
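
For anyone replicating this, the follow-up checks would look something like the lines below (standard ZFS and Proxmox commands; the pool name matches my setup above, and the first line is what the autoexpand note refers to):

zpool set autoexpand=on rpool    # lets the mirror grow if the new drive is larger
zpool status rpool               # watch the resilver progress
proxmox-boot-tool status         # confirm both ESPs are registered for booting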

Everything seems to be in order; when I check zpool status I get:

[screenshot: Screenshot_20230510_211215.png]

Everything works great until I remove the replaced drive; at that point I lose the ability to SSH in and the Proxmox web console. When I add the removed drive back in, everything works as it should.


It looks like the problem shows up when proxmox-boot-tool status is run: "Re-executing '/usr/sbin/proxmox-boot-tool' in new private mount namespace.."
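
From what I can tell, the ESPs that proxmox-boot-tool manages are tracked by UUID in /etc/kernel/proxmox-boot-uuids, so I assume a stale entry left over from the old drive could be inspected and pruned with something like:

cat /etc/kernel/proxmox-boot-uuids   # UUIDs of the ESPs the boot tool knows about
proxmox-boot-tool status             # shows which of those partitions are actually present
proxmox-boot-tool clean              # drops UUIDs whose partitions no longer exist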

Any ideas where I messed up, or what can be done to get the rpool working as it should?

Thanks for your help :)
 
Adams-j said:
zpool replace -f rpool nvme-eui.0000000001000000e4d25c54db2a4c00-part3 nvme-eui.6479a7735000001f-part3
Everything works great until I remove the replaced drive; at that point I lose the ability to SSH in and the Proxmox web console. When I add the removed drive back in, everything works as it should.

You likely need to adapt the network configuration:
https://forum.proxmox.com/threads/removed-nvme-drive-lost-networking.109337/post-470076
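
If it is the same issue as in that thread, pulling the NVMe drive renumbers the PCIe bus, so the NIC's predictable name changes (e.g. enp5s0 becoming enp4s0) and the name hard-coded in /etc/network/interfaces no longer matches anything. A minimal sketch of the fix, with hypothetical interface names and addresses:

ip link    # list NICs to find the interface's current name after the hardware change

# /etc/network/interfaces (example values, adjust to your setup)
auto vmbr0
iface vmbr0 inet static
        address 192.168.1.10/24
        gateway 192.168.1.1
        bridge-ports enp4s0    # was enp5s0 before the PCIe renumbering
        bridge-stp off
        bridge-fd 0

Then apply with ifreload -a (or reboot).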