I am installing Proxmox 6.3 onto a dual-Xeon Supermicro server with 10 SSDs.
This is a proof-of-concept build.
Basic details of the server:
-1U
-Dual 8-core Xeon
-192 GB RAM
-2 SATA controllers
-- SATA controller A handles 6 drive bays
-- SATA controller B handles 4 drive bays
-1 PCIe dual-port 10 Gb Solarflare card
-2 onboard 1 Gb Ethernet ports
-10 x 50 GB SSDs
This exact same configuration installs Proxmox 6.3 onto ZFS perfectly fine using 2, 4, 6, or 8 drives.
As the drive count changed, I switched between a ZFS mirror, RAIDZ1, and RAIDZ2 as appropriate.
I did not create multiple volumes or pools.
For every install I let the installer configure ZFS.
When attempting to install using all 10 drives in a RAIDZ2 configuration, everything during the install seems fine.
On reboot, disks sda through sdg are detected properly, and loading stops at sdh with the following lines:
Code:
sdh: sdh1 sdh2 sdh3
sd 8:0:0:0: [sdh] Attached SCSI disk
ata9: SATA link down (SStatus 0 SControl 300)
ata10: SATA link down (SStatus 0 SControl 300)
qla2xxx [0000:20:00.0]-8083:4: cable is unplugged...
qla2xxx [0000:20:00.1]-8083:11: cable is unplugged...
Boot stops at that point and will not continue, even after leaving it for 18+ hours.
If I hit [Enter] it drops to the (initramfs) prompt.
**NOTE: I had to retype the output from a picture, so there may be typos.**
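For reference, these are the kinds of commands I can run at that prompt to check whether the pool is even visible there. This is just a sketch, assuming the default Proxmox pool name rpool:
Code:
# List pools the initramfs can see, without importing anything
zpool import

# If rpool shows up intact, import it without mounting its datasets,
# then let the boot continue
zpool import -N rpool
exit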
I have tried moving disks around, and essentially the same thing happens, just at a different drive letter.
The 10 Gb card is currently not connected to any cables, so those final lines make sense.
However, it only stops loading when configured with all 10 drives; 2, 4, 6, and 8 drives all work just fine.
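My working guess is that with all 10 drives the initramfs tries to import the pool before the disks on the second controller have finished settling, so a root delay might help. A sketch of what I would try, assuming a UEFI install (Proxmox 6.x boots ZFS root via systemd-boot on UEFI) and the stock single-line /etc/kernel/cmdline:
Code:
# Append rootdelay=10 to the kernel command line so all disks can
# settle before the rpool import, then sync the change to the ESPs
sed -i 's/$/ rootdelay=10/' /etc/kernel/cmdline
pve-efiboot-tool refresh
On a legacy BIOS install, the equivalent would be adding rootdelay=10 to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub and running update-grub.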
Can someone explain what is going on, and whether there is a fix?
If I can assist the project by providing logs, I will, as long as I can get them without too much trouble.
I appreciate any input as long as it is relevant to solving the above issue.