[SOLVED] zpool import -a : no pools available

cvega

New Member
Oct 30, 2019
hi forum,

I have been running a Proxmox machine for a while now, using an SSD as the OS drive and 2 x 4TB drives in ZFS RAID1 as main storage.
I've now gotten a new machine and moved the raid card (Dell PERC H310 in IT mode) to the new motherboard along with the drives, onto a fresh Proxmox install.
However, I cannot seem to import the ZFS pool. It's called "main" but I cannot get ZFS to find it. What did I do wrong when migrating?
The versions on both servers are identical (Proxmox and zfs/zfs-utils).

Disks are physically showing up in the node interface (see attached screenshot of the node's Disks view).

/dev/sde and /dev/sdg are the two members of the raid pool that previously worked.
HELP!
 

Stoiko Ivanov

Proxmox Staff Member
May 2, 2018
please post the outputs of:
Code:
zpool status
zpool list
zpool import
lsblk

Thanks!
 

cvega

New Member
Oct 30, 2019
root@zenon:~# zpool status
no pools available
root@zenon:~# zpool list
no pools available
root@zenon:~# zpool import
no pools available to import
root@zenon:~# lsblk
NAME                                                 MAJ:MIN RM   SIZE RO TYPE MOUNTPOINT
sda                                                    8:0    0 931.5G  0 disk
sdb                                                    8:16   0 931.5G  0 disk
sdc                                                    8:32   0 931.5G  0 disk
sdd                                                    8:48   0 931.5G  0 disk
sde                                                    8:64   0   3.7T  0 disk
├─sde1                                                 8:65   0   3.7T  0 part
└─sde9                                                 8:73   0     8M  0 part
sdf                                                    8:80   0 931.5G  0 disk
└─sdf1                                                 8:81   0 931.5G  0 part
sdg                                                    8:96   0   3.7T  0 disk
├─sdg1                                                 8:97   0   3.7T  0 part
└─sdg9                                                 8:105  0     8M  0 part
sdh                                                    8:112  0 298.1G  0 disk
├─sdh1                                                 8:113  0  1007K  0 part
├─sdh2                                                 8:114  0   512M  0 part
└─sdh3                                                 8:115  0 297.6G  0 part
  ├─pve-root                                         253:0    0  74.3G  0 lvm  /
  ├─pve-swap                                         253:3    0     8G  0 lvm  [SWAP]
  ├─pve-data_tmeta                                   253:4    0     2G  0 lvm
  │ └─pve-data                                       253:8    0 195.4G  0 lvm
  └─pve-data_tdata                                   253:5    0 195.4G  0 lvm
    └─pve-data                                       253:8    0 195.4G  0 lvm
sdi                                                    8:128  0 465.8G  0 disk
├─vm--ssd-vm--ssd_tmeta                              253:1    0   120M  0 lvm
│ └─vm--ssd-vm--ssd-tpool                            253:6    0 465.5G  0 lvm
│   ├─vm--ssd-vm--ssd                                253:7    0 465.5G  0 lvm
│   ├─vm--ssd-vm--202--disk--0                       253:9    0    64G  0 lvm
│   ├─vm--ssd-vm--500--state--HP_Sureclick_working   253:10   0   8.5G  0 lvm
│   ├─vm--ssd-vm--500--state--HP_SUreclick_421       253:11   0   8.5G  0 lvm
│   └─vm--ssd-vm--500--disk--0                       253:12   0    64G  0 lvm
└─vm--ssd-vm--ssd_tdata                              253:2    0 465.5G  0 lvm
  └─vm--ssd-vm--ssd-tpool                            253:6    0 465.5G  0 lvm
    ├─vm--ssd-vm--ssd                                253:7    0 465.5G  0 lvm
    ├─vm--ssd-vm--202--disk--0                       253:9    0    64G  0 lvm
    ├─vm--ssd-vm--500--state--HP_Sureclick_working   253:10   0   8.5G  0 lvm
    ├─vm--ssd-vm--500--state--HP_SUreclick_421       253:11   0   8.5G  0 lvm
    └─vm--ssd-vm--500--disk--0                       253:12   0    64G  0 lvm
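
For anyone reading along later: the 3.7T + 8M partition pair on sde/sdg is the layout ZFS normally creates on whole-disk vdevs. A quick way to check whether a partition actually carries a ZFS label (a sketch only, assuming zfsutils is installed and using the device names above as the example) is:
Code:
# dump the ZFS vdev label - shows pool name, hostid, hostname, ...
zdb -l /dev/sde1
# or just check the filesystem signature
blkid /dev/sde1 /dev/sdg1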
 

Stoiko Ivanov

Proxmox Staff Member
May 2, 2018
please use code tags for pasting command line output - it makes reading it much easier.

sde and sdg look like they could belong to a zpool - on a hunch does:
Code:
zpool import -a -d /dev/disk/by-id
work?
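
If you're not sure which by-id names correspond to sde and sdg, something like this should show the mapping (the grep pattern is only an example, assuming the usual udev by-id symlinks are present):
Code:
ls -l /dev/disk/by-id/ | grep -E 'sde|sdg'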
 

cvega

New Member
Oct 30, 2019
Interesting - yes I moved those drives from another (now dead) node called erebus (previous server).

Code:
root@zenon:~# zpool import -a -d /dev/disk/by-id/ata-WDC_WD40PURX-64GVNY0_WD-WCC4E6KE8J64-part1
cannot import 'main': pool was previously in use from another system.
Last accessed by erebus (hostid=f0f6e005) at Mon Jul 13 14:02:29 2020
The pool can be imported, use 'zpool import -f' to import the pool.
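
That message just means the pool's label still records the hostid of the old machine (erebus, f0f6e005 per the output above), which no longer matches the hostid of the fresh install; you can compare against the new host with:
Code:
hostid
Importing with -f, as the message suggests, tells ZFS to take over the pool despite the mismatch.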
 

cvega

New Member
Oct 30, 2019
Code:
root@zenon:~# zpool import -a -f -d /dev/disk/by-id/ata-WDC_WD40PURX-64GVNY0_WD-WCC4E6KE8J64-part1
root@zenon:~# zpool list
NAME   SIZE  ALLOC   FREE  CKPOINT  EXPANDSZ   FRAG    CAP  DEDUP    HEALTH  ALTROOT
main  3.62T   238G  3.39T        -         -    12%     6%  1.00x    ONLINE  -
root@zenon:~# zfs list
NAME                     USED  AVAIL     REFER  MOUNTPOINT
main                    1.34T  2.17T       96K  /main
main/base-999-disk-0    17.6G  2.19T     1.10G  -
main/vm-201-disk-0      4.11G  2.17T     4.96G  -
main/vm-201-disk-1      1.03T  3.04T      166G  -
main/vm-204-disk-0      21.8G  2.19T     5.38G  -
main/vm-204-state-test  4.63G  2.18T     1.23G  -
main/vm-205-disk-0       132G  2.28T     26.5G  -
main/vm-207-disk-0      33.0G  2.19T     12.8G  -
main/vm-210-disk-0      33.0G  2.20T     2.07G  -
main/vm-210-disk-1      33.0G  2.20T     1.15G  -
main/vm-210-disk-2      33.0G  2.19T     14.3G  -
main/vm-501-disk-0      1.51G  2.17T     2.21G  -
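
A side note, assuming the standard systemd ZFS import units that ship with Proxmox's ZFS packages: after a forced import it's worth checking that the pool ends up in the cachefile so it is imported again automatically at boot, for example:
Code:
# show where the pool's cachefile currently points
zpool get cachefile main
# (re)write it to the default location if it is unset
zpool set cachefile=/etc/zfs/zpool.cache main
# confirm the import-on-boot service is active
systemctl status zfs-import-cache.service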
 

Stoiko Ivanov

Proxmox Staff Member
May 2, 2018
seems that worked ;)
please mark the thread as 'SOLVED'
Thanks!
 
