Hey guys!
First post here, so go easy on me
I'm new to Proxmox and to Linux administration in general, so maybe this is simple, but from my research I don't seem to be the only one struggling with it. Here's the gist of my situation:
I've been tasked with setting up a new Proxmox 4.x server. The goal is to install it to a ZFS RAID10 pool (8x 1.2TB disks) attached to an HP Smart Array P440ar controller. The controller is set to HBA mode, and the firmware is set to UEFI boot (which Proxmox says is supported in the latest release).
The behavior is as follows:
With UEFI, I can boot the VE ISO and install it to a newly created ZFS pool consisting of the eight disks above in RAID10. After installation, however, I'm presented with a number of what I assume are GRUB errors. It acts as if there's no OS to boot and the machine boot-loops as if no disks were installed.
With Legacy Boot enabled, I get the same behavior as above. However, if I remove all of the disks but one and install VE to that, it boots right into the VE console without issue. I figured at that point I could simply shut the machine down, add the remaining disks, and expand the ZFS pool, but adding the disks back to the machine puts us right back in the boot loop.
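For reference, the pool-expansion step I had in mind would look something like this. This is only a sketch: the pool name (`rpool`) is what the Proxmox installer uses by default, but the device names (`sda`..`sdh`) and the exact partition GRUB boots from are assumptions about my box.

```shell
# Sketch only -- pool name and device names are assumptions.
# After installing to a single disk (sda), mirror the root vdev
# onto a second disk:
zpool attach rpool sda sdb

# Then grow the pool into a RAID10 layout by adding three more
# mirror vdevs (this stripes new writes across all four mirrors):
zpool add rpool mirror sdc sdd
zpool add rpool mirror sde sdf
zpool add rpool mirror sdg sdh

# Verify the resulting layout:
zpool status rpool
```

One caveat I'm aware of: the disks added this way wouldn't carry boot partitions or a GRUB install, so GRUB would still have to be able to read the whole pool from the first disk, which may be exactly where my boot loop comes from.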
I'm not sure if this is some kind of limitation in the Proxmox installer. If so, would a workaround be to install Proxmox on top of a Debian install? Would that involve using a Debian live CD to create the ZFS pool, installing Debian to it, and then installing Proxmox over that? Or am I dealing with some kind of hardware incompatibility or misconfiguration? My lack of experience is to blame on that one, but I'm enjoying the learning opportunity thus far!
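From what I can tell, the documented "Proxmox VE on top of Debian" route (for PVE 4.x on Debian Jessie) boils down to adding the Proxmox repository to an existing Debian install and pulling in the meta-package. Roughly (untested by me, taken from the wiki's Jessie instructions):

```shell
# Sketch of the documented PVE 4.x-on-Debian-Jessie route; run as root
# on an already-bootable Debian jessie system.

# Add the Proxmox no-subscription repository:
echo "deb http://download.proxmox.com/debian jessie pve-no-subscription" \
    > /etc/apt/sources.list.d/pve-install-repo.list

# Add the Proxmox repository signing key:
wget -O- "http://download.proxmox.com/debian/key.asc" | apt-key add -

# Update and install Proxmox VE:
apt-get update && apt-get dist-upgrade
apt-get install proxmox-ve
```

The catch, as I understand it, is that this still requires Debian itself to boot from the ZFS pool first, so it may not sidestep the GRUB problem at all.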
So, kinda tired of scratching my head on this one, I figured I'd reach out to the local gurus.
Thanks all