We are running into issues getting a ZFS rpool to boot after the PVE installation completes. We have been using ZFS more and more recently and are setting up a new server to use ZFS instead of a hardware RAID card. The server we are installing onto has the following specs:

Code:
CPU: 2 x Intel Xeon Gold 6140, 2.3GHz (18-Core, HT, 2666 MT/s, 140W) 14nm
RAM: 192GB (12 x 16GB DDR4-2666 ECC Registered 1R 1.2V RDIMMs) operating at 2666 MT/s max
I/O Controller: Dual-Port Intel X540 10GbE Controller (RJ45) LP PCIe 3.0 x8
Internal: Supermicro 12Gb/s SAS HBA (LSI 3008), 8-Port Internal, RAID 0,1,10 - up to 63 devices
Drive Set 1: 4 x Intel 480GB DC S4600 Series 3D TLC (6Gb/s, 3 DWPD) 2.5" SATA SSD
Motherboard: Supermicro X11DPU

The LSI 3008 HBA is flashed to IT mode, and the BIOS is set to legacy boot only.

When we follow the PVE installer, set the target disks to ZFS RAIDZ2 across our 4 disks, and reboot, we land at the grub rescue prompt with an error saying it is unable to find a device.

Since the install fails to boot, we entered the debug shell through the installer; from there we are able to import the zpool and inspect the partitions the installer created. We see the /dev/sda1, /dev/sdb1, ... partitions it created as type EF02 (BIOS boot partitions), as expected, but those appear unable to actually boot the OS. It seems to me the BIOS boot manager cannot boot from the ZFS partitions, since /boot lives inside ZFS, much like how Proxmox 3.4 keeps /boot inside the LVM setup on ext4.

I attached screenshots showing the grub rescue screen, the zpool setup, and the block device IDs and partition layout as created during a regular install. Any help on this would be appreciated.