could not activate storage 'r1pool', zfs error: cannot open 'r1pool': no such pool (500)

proxdejan

New Member
Nov 18, 2022
Hi,
I woke up this morning to this error on my Proxmox server, which has been working just fine for the past 4 years.
I'm not very skilled with configuration, so can anyone help?
zpool list
NAME    SIZE  ALLOC   FREE  CKPOINT  EXPANDSZ   FRAG    CAP  DEDUP  HEALTH  ALTROOT
rpool   111G  37.9G  73.1G        -         -    50%    34%  1.00x  ONLINE        -
lsblk
NAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINT
sda 8:0 0 931.5G 0 disk
└─sda1 8:1 0 931.5G 0 part
└─storage1-vm--101--disk--0 253:2 0 930G 0 lvm
sdb 8:16 0 111.8G 0 disk
├─sdb1 8:17 0 1007K 0 part
├─sdb2 8:18 0 512M 0 part
└─sdb3 8:19 0 111.3G 0 part
sdc 8:32 0 931.5G 0 disk
└─sdc1 8:33 0 931.5G 0 part
└─storage2-vm--102--disk--0 253:1 0 930G 0 lvm
sdd 8:48 0 953.9G 0 disk
└─sdd1 8:49 0 953.9G 0 part
└─WindowsServer-vm--103--disk--0 253:0 0 953G 0 lvm
sr0 11:0 1 1024M 0 rom
zd0 230:0 0 1M 0 disk
zd16 230:16 0 32G 0 disk
├─zd16p1 230:17 0 2.4G 0 part
├─zd16p2 230:18 0 2G 0 part
└─zd16p3 230:19 0 27.4G 0 part

cat /etc/pve/storage.cfg
dir: local
    path /var/lib/vz
    content iso,vztmpl,backup

zfspool: local-zfs
    pool rpool/data
    content images,rootdir
    mountpoint /rpool
    sparse 1

lvm: storage1
    vgname storage1
    content rootdir,images
    nodes overnetvm
    shared 0

lvm: storage2
    vgname storage2
    content images,rootdir
    shared 0

lvm: local-WindowServer
    vgname WindowsServer
    content rootdir,images
    shared 0

zfspool: r1pool
    pool r1pool
    content rootdir,images
    nodes overnetvm

zpool status -v
pool: rpool
state: ONLINE
scan: scrub repaired 0B in 0 days 00:04:35 with 0 errors on Sun Nov 13 00:28:38 2022
config:

NAME                                           STATE     READ WRITE CKSUM
rpool                                          ONLINE       0     0     0
  ata-KingDian_S400_120GB_2018042610284-part3  ONLINE       0     0     0

errors: No known data errors


ls -l /dev/disk/by-id
total 0
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284 -> ../../sdb
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284-part1 -> ../../sdb1
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284-part2 -> ../../sdb2
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284-part3 -> ../../sdb3
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-Optiarc_DVD_RW_AD-7270H -> ../../sr0
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-P3-1TB_979073190060 -> ../../sdd
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-P3-1TB_979073190060-part1 -> ../../sdd1
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-WDC_WD10EZRX-00A8LB0_WD-WMC1U4110284 -> ../../sdc
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-WDC_WD10EZRX-00A8LB0_WD-WMC1U4110284-part1 -> ../../sdc1
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-WDC_WD10EZRX-00A8LB0_WD-WMC1U4145708 -> ../../sda
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-WDC_WD10EZRX-00A8LB0_WD-WMC1U4145708-part1 -> ../../sda1
lrwxrwxrwx 1 root root 10 Nov 18 10:19 dm-name-storage1-vm--101--disk--0 -> ../../dm-2
lrwxrwxrwx 1 root root 10 Nov 18 10:19 dm-name-storage2-vm--102--disk--0 -> ../../dm-1
lrwxrwxrwx 1 root root 10 Nov 18 10:19 dm-name-WindowsServer-vm--103--disk--0 -> ../../dm-0
lrwxrwxrwx 1 root root 10 Nov 18 10:19 dm-uuid-LVM-djcq0XC4Lx2m2EwdeMjENBoB2fEuxiGEenlNUMaEmtxObUwcy5YJVFML7TlIE0zb -> ../../dm-0
lrwxrwxrwx 1 root root 10 Nov 18 10:19 dm-uuid-LVM-dQUV3JDaInpF0RudgOQpAPNFhm901QaB4MzU8ybDbWef5GnRDBT9Hcg9IoQeYWRu -> ../../dm-2
lrwxrwxrwx 1 root root 10 Nov 18 10:19 dm-uuid-LVM-tYrxTXqd0h12yrdsRQzsz8bvdBOF20eAfiCdSaMegpIeHNE3E3OW1kS8BndupJzY -> ../../dm-1
lrwxrwxrwx 1 root root 10 Nov 18 10:19 lvm-pv-uuid-3D9dn4-hWoQ-sWpy-Vd9X-QkWL-aTx8-LbKWJz -> ../../sda1
lrwxrwxrwx 1 root root 10 Nov 18 10:19 lvm-pv-uuid-bB74eT-IdXv-MQIU-qevQ-Tcm0-AKp1-l0qy02 -> ../../sdc1
lrwxrwxrwx 1 root root 10 Nov 18 10:19 lvm-pv-uuid-UzEEog-NsAi-vb0Q-k0TN-WAw2-gh7A-sPuhIX -> ../../sdd1
lrwxrwxrwx 1 root root 9 Nov 18 10:19 wwn-0x5000000000000000 -> ../../sdb
lrwxrwxrwx 1 root root 10 Nov 18 10:19 wwn-0x5000000000000000-part1 -> ../../sdb1
lrwxrwxrwx 1 root root 10 Nov 18 10:19 wwn-0x5000000000000000-part2 -> ../../sdb2
lrwxrwxrwx 1 root root 10 Nov 18 10:19 wwn-0x5000000000000000-part3 -> ../../sdb3
lrwxrwxrwx 1 root root 9 Nov 18 10:19 wwn-0x5000000000000107 -> ../../sdd
lrwxrwxrwx 1 root root 10 Nov 18 10:19 wwn-0x5000000000000107-part1 -> ../../sdd1
lrwxrwxrwx 1 root root 9 Nov 18 10:19 wwn-0x50014ee25c2acddc -> ../../sda
lrwxrwxrwx 1 root root 10 Nov 18 10:19 wwn-0x50014ee25c2acddc-part1 -> ../../sda1
lrwxrwxrwx 1 root root 9 Nov 18 10:19 wwn-0x50014ee25c2ace13 -> ../../sdc
lrwxrwxrwx 1 root root 10 Nov 18 10:19 wwn-0x50014ee25c2ace13-part1 -> ../../sdc1
 
Those four drives appear to contain your Proxmox installation and your LVM storages. Maybe the drive(s) you used for r1pool are missing, broken, or disconnected, or the hardware controller they are connected to is missing or broken?
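If the disk(s) behind r1pool were still attached, a plain zpool import (with no pool name) should scan all devices and list any importable pools it finds, so that is worth a try; you can also point it explicitly at the stable device names:

zpool import
zpool import -d /dev/disk/by-id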
 
I just checked the disk presence:
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284 -> ../../sdb
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-P3-1TB_979073190060 -> ../../sdd
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-WDC_WD10EZRX-00A8LB0_WD-WMC1U4110284 -> ../../sdc
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-WDC_WD10EZRX-00A8LB0_WD-WMC1U4145708 -> ../../sda
Which of those four detected drives did you use for r1pool?
 
The controller is OK, as the disks are present:
NAME                                           STATE     READ WRITE CKSUM
rpool                                          ONLINE       0     0     0
  ata-KingDian_S400_120GB_2018042610284-part3  ONLINE       0     0     0
 
lrwxrwxrwx 1 root root 9 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284 -> ../../sdb
That looks like the drive that contains the Proxmox installation and rpool:
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284-part1 -> ../../sdb1
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284-part2 -> ../../sdb2
lrwxrwxrwx 1 root root 10 Nov 18 10:19 ata-KingDian_S400_120GB_2018042610284-part3 -> ../../sdb3
/dev/sdb1 is the GRUB BIOS boot partition, /dev/sdb2 is the ESP boot partition, and /dev/sdb3 is most likely the Proxmox rpool. Together, they use the whole 120GB drive. Where is the space for your missing r1pool? How did you create the r1pool on that drive?
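If you think the pool once lived on one of the visible partitions, zdb -l prints the ZFS label of a device (pool name, GUID, vdev tree) when one exists, so you could check each partition in turn, for example:

zdb -l /dev/sdb3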
 
I'm pretty sure that it was created there. How can I find out where it was created, if I'm wrong?
 
I'm pretty sure that it was created there. How can I find out where it was created, if I'm wrong?
Did you create it manually on the command line? Then the zpool command might still be in /root/.bash_history (or the .bash_history of the user you used to create the ZFS pool). I don't know how to check the history if it was done via the Proxmox GUI.
Or did you create the r1pool inside (the virtual disk(s) of) a VM that was later removed? I'm just guessing, as I don't see any drives that appear to contain the vdev(s) of the pool.
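If it was created from a root shell, something like this might still turn up the original command:

grep 'zpool create' /root/.bash_history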
 
Did you create it manually on the command line?
Yes.
Did you create the r1pool inside (the virtual disk(s) of) a VM that was later removed?
No. No disks were removed; this happened without any power failure or any intervention on the machine.
I'm just guessing, as I don't see any drives that appear to contain the vdev(s) of the pool.
I also don't understand how this is possible, as all the disks that are installed are present and accounted for.
Maybe you can give me some hint on how to overcome this situation?
 
