ZFS boot stuck at (initramfs), no root device specified

FourtyTwo · Jan 27, 2020
Hello!
I have read everything I could find about my question, but nothing helps :(

I tried to make a bootable ZFS system using two different SSDs for now, because I can't find a second SSD of the same size. I'm planning to buy two identical SSDs in the future, but for now this is what I have.

I downloaded the PVE 6.1 ISO, used Etcher to make a bootable USB stick, and then followed this post to get started:
https://forum.proxmox.com/threads/d...s-raidz-1-root-file-system.22774/#post-208998
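
For completeness, on Linux the stick can also be written with plain dd. A rough example, assuming the downloaded file is named proxmox-ve_6.1-1.iso and the stick shows up as /dev/sdX (double-check the device with lsblk first):

Code:
# WARNING: this overwrites /dev/sdX completely - make sure it really is the USB stick
dd if=proxmox-ve_6.1-1.iso of=/dev/sdX bs=1M status=progress conv=fsync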

After a successful installation I can't boot; the initramfs tells me:
"Reading all physical volumes. This may take a while..."
"No root device specified. Boot arguments must include a root= parameter."
"(initramfs)"

So I found many posts about this, and I also found a wiki page with Tips and Tricks, and tried this:
"(initramfs) zpool import -R /rpool -N rpool" (I see a path like /rpool here, so let's try)
"The ZFS modules are not loaded."
"Try running '/sbin/modprobe zfs' as root to load them." (OK, let's do this)
"(initramfs) /sbin/modprobe zfs"
"(initramfs) zpool import -R /rpool -N rpool"
"(initramfs) exit"
"mount: mounting on /root failed: Invalid Argument"
"Failed to mount as root file system"

OK, I rebooted and tried other advice from here: https://forum.proxmox.com/threads/stuck-at-initramfs.56158/post-258736
"(initramfs) zpool import -R / rpool " (I see the path / here and rpool as a parameter, so let's try)
"The ZFS modules are not loaded."
"Try running '/sbin/modprobe zfs' as root to load them." (OK, let's do this)
"(initramfs) /sbin/modprobe zfs"
"(initramfs) zpool import -R / rpool"
"cannot mount '/': directory is not empty"

So the first one did not help, and the second one is maybe a typo? Anyway, I'm stuck here, please help :(
I believe I configured UEFI boot, because there is an "EFI Only" boot type in my BIOS.
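
For reference, the recovery sequence I was trying to reproduce from the wiki (assuming the default pool name rpool and the default root dataset rpool/ROOT/pve-1, which I can't verify right now) should look roughly like this:

Code:
# import with /root as the alternate root so the root dataset lands where the
# initramfs expects it, then mount it and continue booting
(initramfs) /sbin/modprobe zfs
(initramfs) zpool import -N -R /root rpool
(initramfs) zfs mount rpool/ROOT/pve-1
(initramfs) exit

But since my boot arguments are missing root= completely, I suspect even a correct manual import would only paper over the real problem.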
 
If I understand you correctly you installed on one SSD (ZFS RAID0) and then tried attaching a second SSD as a mirror?

* Did installing with one SSD work? Could you boot afterwards?
* If yes: which boot-loader was used by the installer? (For UEFI+ZFS PVE uses systemd-boot, else GRUB; you should see the difference when booting: GRUB has a blue screen, systemd-boot a black one.)

* Please paste/screenshot the output of `cat /proc/cmdline` when you reach the initramfs.
 
Yes, I used ZFS RAID0 on one (the smallest) SSD first, installed Proxmox, and tried to boot from that single SSD.

1. The install seems OK; I cannot boot from the single SSD afterwards.
2. It looks like it's systemd-boot, because I didn't see any blue screens, and `cat /proc/cmdline` also points at the EFI loader.

3. Paste from `cat /proc/cmdline`:
Code:
initrd=\EFI\proxmox\5.3.10-1-pve\initrd.img-5.3.10-1-pve BOOT_IMAGE=/boot/linux26 ro ramdisk_size=16777216 rw quiet splash=silent
 
hmm - the 'root=' parameter is missing.
Also the BOOT_IMAGE looks like the one you'd see in the installer, not in a booted PVE...

Any messages during the installation? - did you run the installer in Debug mode?
 
did you continue after taking the screenshot? - any messages in the final debug-shell before the reboot?
 
Yes, but nothing interesting: DHCP discover, reboot, etc.
Nothing like an error or strange behaviour.
 
hmm - did you unplug the USB-key before booting into the freshly installed PVE?
 
Of course :)

I know I look like a newbie, but I'm familiar with Linux and IT. I've just been trying to fix this for about a whole day, so I'm feeling a bit confused and writing my answers like a newbie.

No offence :)

UPD:
I just tried another physical disk for the installation, same error :(

UPD2:
OK, I found 2 disks of the same size, installed PVE on ZFS RAID1 the way it is meant to be installed, and got the same error...
 
sorry - did not want to imply that you're unfamiliar with Linux! It's just that these things happen to me quite often (and I would consider myself familiar with Linux)...

in any case the kernel-commandline looks odd - I just installed a system locally with zfs on root:
Code:
cat /proc/cmdline 
initrd=\EFI\proxmox\5.3.10-1-pve\initrd.img-5.3.10-1-pve root=ZFS=rpool/ROOT/pve-1 boot=zfs

This is how it looks when it works. The BOOT_IMAGE=/boot/linux26 you pasted looks wrong, and this might be the reason why it's not working for you.
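
For the record: on a UEFI+ZFS install the commandline comes from /etc/kernel/cmdline and gets written to the ESP(s) by pve-efiboot-tool, so on a working system it should roughly look like this (assuming the default root dataset pve-1):

Code:
cat /etc/kernel/cmdline
root=ZFS=rpool/ROOT/pve-1 boot=zfs
# after changing that file the boot entries on the ESP get regenerated with:
pve-efiboot-tool refresh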
 
Haha, my small old Intel SSD finally just died :)
Somehow that SSD was the problem, even when I tried to install PVE on other disks...
Maybe the SATA controller was failing because of it, or something like that...

I will reconfigure my test server, install PVE on working disks, and immediately come back to you with an answer!
Anyway, many thanks for your support and fast answers :)

P.S.
"sorry - did not want to imply that you're unfamiliar with Linux!"
English is not my native language, so I think I'm the one who needs to say sorry! I didn't mean to make you apologize, man :)
 
Nope, still unlucky, I still can't get ZFS to work. I thought it was the broken SSD, but I was wrong :(
I tried other disks, only one disk connected, different combinations of BIOS SATA/RAID/EFI settings, etc.
I tried to set up ZFS the way the devs intended, I mean installing on 2 identical disks, and got the same error every time.

I'm starting to be sure that my motherboard (GA-Z68AP-D3, rev. 1.0) is buggy somewhere :(

Btw, I forgot to mention before that a standard install with the ext4 filesystem works without problems.
 
They differ by about 8 GB, but I don't think that's the problem. I can't even get RAID0 on a single disk to boot, and I'm sure ZFS can do that.

My mobo supports 2 SATA3 ports, but does so through an additional Intel chip. So I can say that SATA3 on my board (where the SSDs are) is internal and external at the same time :D. Anyway, I tried connecting the SSDs to other ports, no luck :(

P.S.
Today at work (and after work) I will try to check whether the USB stick is prepared correctly, using a VMware lab and also an old laptop.
 
hmm - maybe there's a BIOS-update available for the mainboard? (sometimes these do help with UEFI boot problems)
 
Okay, I tested my USB stick using my VMware home lab, and here is the info I got:
1. VMware can't boot from USB using the BIOS boot method, so I tested only UEFI boot.

2. When I started the installation I noticed that PVE knows it booted from UEFI:
[screenshot: Аннотация 2020-01-28 191740.png]
When I tried to install PVE yesterday on my bare-metal machine I definitely didn't see this message. I had selected EFI boot in the BIOS, but the installer still used BIOS boot every time.

3. Once it was installed I could see these options on boot:
[screenshot: Аннотация 2020-01-28 191822.png]
When I tried to install PVE yesterday I could see only the first option.

4. PVE installed successfully on ZFS RAID0 using one virtual drive:
[screenshot: Аннотация 2020-01-28 194418.png]
So now I'm sure that my USB stick was prepared correctly with Rufus. Or at least in UEFI mode.


Next time I will try to install PVE on my old laptop and will give you new info, guys :)
 
Hello again!
So, here is what I've understood over these days:
1. My problem is definitely caused by my motherboard. It can't boot from UEFI devices; my BIOS just doesn't support this.
2. Also, I can't install PVE on ZFS using the old BIOS boot method.
3. The problem is not in the USB device and not in the SSDs.

OK, not a big problem, I will try another setup in the future, but I still have one main question:
I can't install PVE on ZFS using the old BIOS boot method. Is this intended, or is something going wrong with GRUB?
From different sources (wiki, forum, Google) I know that this should be possible.

Or did I miss something, and does the latest PVE on ZFS work only from UEFI?
If I'm right and it should be possible to install PVE with BIOS boot too, what could be going wrong with my setup? Any new ideas?
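
Next time I will also check from the installer's debug shell whether both a BIOS boot partition and an EFI system partition were created on the target disk (assuming it shows up as /dev/sda), since legacy GRUB on a GPT disk needs the BIOS boot partition:

Code:
# either of these should show the partition layout the installer created
lsblk -o NAME,SIZE,FSTYPE,TYPE /dev/sda
fdisk -l /dev/sda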

Anyway, thank you for trying to help, guys, especially Stoiko ;)
 
Just to validate this, I'm having the same issue. I'm trying to install root ZFS RAID-Z1 on 3x 120GB SATA3 SSDs. The boot fails with "no root device found" and I'm thrown into busybox.

System:
Gigabyte SKT-AM3 78LMT-USB3 / AMD 6300 6-core sh1theap / 32GB DDR3 RAM / 3x SATA SanDisk 120GB SSDs

Troubleshooting:
* cat on /proc/cmdline returns the same as @FourtyTwo 's system
* Switching between BIOS/EFI doesn't fix the issue (the system is in AHCI mode with native IDE disabled)
* Switching to ext4 fixes the issue (but that just gives me LVM partitions across 3 disks)
* Using October's 6.04 / December's 6.1 ISO images gives the same error
* Manually loading ZFS via modprobe allows me to list the pool via zpool list, but mounting via the tips in this thread does not work; I get a critical BIOS error as I leave busybox with a ZFS pool mounted but no root to load (see the sanity check after this list, and https://forum.proxmox.com/threads/proxmox-ve-6-0-4-zfs-root-ends-in-busybox.56311/)
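
A sanity check I still want to run from busybox (assuming the installer named the pool rpool and the root dataset should be rpool/ROOT/pve-1, which I can't confirm yet) would be something like:

Code:
# load ZFS, import the pool without mounting anything, then look at what the installer created
(initramfs) /sbin/modprobe zfs
(initramfs) zpool import -N rpool
(initramfs) zpool get bootfs rpool
(initramfs) zfs list -o name,mountpoint -r rpool

If bootfs is empty or the ROOT dataset is missing, the installation itself went wrong rather than just the boot entry.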
 

Attachments: zfs-error.jpg
Hi @rearend-dev
I semi-fixed this problem by changing my hardware, and I'm still pretty sure it is caused by some UEFI boot issue.

Even with a new MB+CPU+RAM I can reproduce it with these steps:
1. Burn the USB key with Rufus using either of 2 methods: GPT+DD or MBR+DD
2. Choose "UEFI only (no CSM)" boot in the BIOS
3. Install PVE on ZFS RAID
4. It will try to boot via UEFI using systemd-boot (the black one) and then fail with the "no root device" error

I only managed to install my PVE on ZFS RAID successfully with these steps:
1. Burn the USB key with Rufus using the MBR+DD method with 2.02-pve GRUB
2. Choose "UEFI plus Legacy (CSM)" boot in the BIOS
3. Install PVE on ZFS RAID
4. Boot in Legacy mode from any of the ZFS RAID disks using GRUB (the blue one)
5. Works!

So, I still don't know what the problem was, but you can try this:
1. Update the BIOS to the latest version
2. Try burning the USB key with all methods one by one: GPT+DD, MBR+DD+PVE GRUB 2.02-pve, MBR+DD+latest GRUB (Rufus will ask you about it)
3. Try installing PVE by booting from the USB key with both methods one by one: UEFI, Legacy
4. Try booting into the freshly installed PVE with both methods one by one: UEFI, Legacy (you can verify which mode actually booted with the check below)
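
This is a generic check, nothing PVE-specific: the directory only exists when the running system was booted via UEFI.

Code:
[ -d /sys/firmware/efi ] && echo "booted via UEFI" || echo "booted via legacy BIOS"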

Yeah, it looks like a dumb iteration over all the variables, and it's not meant to fix the problem itself, but it can lead you to a working combination and maybe to new info that will help the community fix this error :)
I found a working one, good luck :)
 
