I have a problem similar to the one here:
http://forum.proxmox.com/threads/23...-zfs-install-cannot-import-rpool-no-such-pool
but in my case it is NOT due to leftover RAID metadata (I've zeroed the entire disk and also tried with a brand-new disk),
and that's why I've abandoned that thread and opened this new one.
At boot I have:
Code:
Loading, please wait...
Command: /sbin/zpool import -N rpool
Message: cannot import 'rpool': no such pool available
Error: 1
No pool imported. Manually import the root pool
at the command prompt and then exit.
Hint: Try: zpool import -R /root -N rpool
BusyBox v1.22.1 ....
/ #
If I do:
Code:
# zpool import
   pool: rpool
     id: 4282105346604124069
  state: ONLINE
 action: The pool can be imported using its name or numeric identifier.
 config:

        rpool       ONLINE
          sda2      ONLINE
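One thing I noticed: the plain "zpool import" finds the pool via sda2, while the import that actually works for me (below) uses the by-id paths, so from BusyBox I also listed the disk's by-id names (diagnostic only):
Code:
# ls -l /dev/disk/by-id/ | grep sda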
From BusyBox I'm able to boot once by doing this (isn't it strange that I have to issue "exit" twice?):
Code:
# zpool import -d /dev/disk/by-id rpool -R /root
# exit
failed to mount root filesystem 'rpool/ROOT/pve-1'
# exit
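My guess about why the first "exit" fails (I may be wrong): the pool gets imported, but the root dataset isn't mounted yet. Next time I land there I want to try mounting it by hand before exiting, something like this from the (initramfs) prompt (untested sketch; the dataset name is taken from the error above):
Code:
# zpool import -N -d /dev/disk/by-id -R /root rpool
# zfs mount rpool/ROOT/pve-1
# exit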
I've tried this, as suggested in the previous thread:
Code:
# zpool set cachefile= rpool
# update-initramfs -k `uname -r` -u
# reboot
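The next thing on my list is the rootdelay workaround I've seen mentioned for controllers that are slow to present their disks (a sketch of what I'd change, assuming the stock GRUB setup; the 10-second value is an arbitrary guess):
Code:
# in /etc/default/grub, give the disks more time before the initramfs imports rpool
GRUB_CMDLINE_LINUX_DEFAULT="quiet rootdelay=10"
# then regenerate the GRUB config and reboot
update-grub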
In any case, the cachefile trick did not solve the problem; I land in BusyBox again.
I don't know if it's relevant, but if I enter "exit" twice from BusyBox, the second time I get:
Code:
Target filesystem doesn't have requested /sbin/init.
mount: mounting /dev/ on /root/dev failed: No such file or directory
No init found. Try passing init= bootarg.
modprobe: module ehci-orion not found in modules.dep
BusyBox ...
(initramfs)
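I don't know if the "ehci-orion" stuff is a problem (once booted I don't see such a module used by the system). From what I can tell, ehci-orion is a USB host driver for Marvell Orion (ARM) SoCs, so it probably doesn't exist for x86 at all and the message is harmless; this is what I ran after booting to double-check (diagnostic only):
Code:
root@pve4:~# find /lib/modules/$(uname -r) -name 'ehci*'
root@pve4:~# lsmod | grep ehci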
The motherboard is an Intel S2400SC, which has 8 HD bays: 4 connected with a mini-SAS to mini-SAS cable to the onboard controller (set to AHCI mode; I guess that's the "C602"), the other 4 with a SATA to mini-SAS cable (I guess the "C600").
I have the same problem in every bay.
With lspci I get (keeping more or less only the HD controller info):
Code:
root@pve4:~# lspci | grep controller
00:16.0 Communication controller: Intel Corporation C600/X79 series chipset MEI Controller #1 (rev 05)
00:16.1 Communication controller: Intel Corporation C600/X79 series chipset MEI Controller #2 (rev 05)
00:1f.2 SATA controller: Intel Corporation C600/X79 series chipset 6-Port SATA AHCI Controller (rev 06)
09:00.0 Serial Attached SCSI controller: Intel Corporation C602 chipset 4-Port SATA Storage Control Unit (rev 06)
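If it helps, the driver bound to each controller can be checked like this (-k shows the kernel driver in use, -s picks the slot from the listing above):
Code:
root@pve4:~# lspci -k -s 00:1f.2
root@pve4:~# lspci -k -s 09:00.0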
and for kernel modules (removing for brevity what I think is unrelated, like the IP stack):
Code:
root@pve4:~# lsmod
Module Size Used by
iscsi_tcp 20480 0
libiscsi_tcp 24576 1 iscsi_tcp
libiscsi 53248 3 libiscsi_tcp,iscsi_tcp,ib_iser
scsi_transport_iscsi 98304 4 iscsi_tcp,ib_iser,libiscsi
snd_pcm 102400 0
lrw 16384 1 aesni_intel
gf128mul 16384 1 lrw
glue_helper 16384 1 aesni_intel
snd_timer 32768 1 snd_pcm
snd 86016 2 snd_timer,snd_pcm
ablk_helper 16384 1 aesni_intel
cryptd 20480 3 ghash_clmulni_intel,aesni_intel,ablk_helper
joydev 20480 0
lpc_ich 24576 0
ioatdma 65536 0
dca 16384 1 ioatdma
8250_fintek 16384 0
wmi 20480 0
mac_hid 16384 0
autofs4 40960 2
zfs 2789376 7
zunicode 331776 1 zfs
zcommon 57344 1 zfs
znvpair 90112 2 zfs,zcommon
spl 102400 3 zfs,zcommon,znvpair
zavl 16384 1 zfs
hid_generic 16384 0
usbkbd 16384 0
usbmouse 16384 0
ahci 36864 1
usbhid 49152 0
libahci 32768 1 ahci
isci 135168 0
e1000e 233472 0
hid 118784 2 hid_generic,usbhid
ptp 20480 1 e1000e
libsas 81920 1 isci
pps_core 20480 1 ptp
scsi_transport_sas 45056 2 isci,libsas
atl1c 49152 0
I started with 4 disks in RAID10, but since I was unable to boot I did a lot of experiments. To reduce things to the minimum, I now have just one WD disk in a bay (I've tried moving it from the first bay to the second, since they are under different controllers, but no luck), and I've tried ext4 (which works fine) and ZFS (which does not).
At the moment I have only one HD (as a single-disk ZFS "RAID0") and I boot / install Proxmox from a USB stick.
Any help is very appreciated; if I can't solve this soon, I'll have to abandon the idea of using ZFS.