I'm using a couple of PCs as a test cluster, with everything updated to the latest version on both nodes. I set up Ceph from the GUI to provide the storage for this cluster, using drives inside these two PCs.
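For context, this is roughly how I confirm that both nodes are up to date and that Ceph is healthy before testing (plain CLI, nothing special; exact versions will differ on your setup):

```
# confirm PVE/Ceph package versions on each node
pveversion -v

# check Ceph cluster health and that the pool shows up as an active storage
ceph -s
pvesm status
```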
Creating new VMs with UEFI and placing the EFI disk on a Ceph pool works: I can install Windows, reboot, and everything still works fine.
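For reference, the affected VM's config looks roughly like this (VM ID, storage name and sizes are placeholders, taken from a `qm config` dump):

```
# qm config 100   (VM ID and storage name are placeholders)
bios: ovmf
efidisk0: ceph-pool:vm-100-disk-0,size=128K
scsi0: ceph-pool:vm-100-disk-1,size=64G
ostype: win10
```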
But as soon as I change a hardware setting of the VM, such as the Display (from Default to SPICE), or migrate the VM from one node of the cluster to another, something in the UEFI breaks.
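In case it helps to reproduce, these are roughly the CLI equivalents of the two triggers I use from the GUI (VM ID and target node name are placeholders):

```
# 1) change the Display from Default to SPICE
qm set 100 --vga qxl

# 2) or migrate the VM to the other node (--online if the VM is running)
qm migrate 100 pve2 --online
```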
When starting the VM again, it hangs in the UEFI firmware (CPU usage is near zero and RAM usage is around 60-70 MB); the console shows the UEFI boot splash with the white progress bar at the bottom, and Windows never takes over.
The VM can remain in this state for more than 20 minutes, and I can only kill it with the "Stop" command from the GUI.
Reverting the setting (moving the Display back from SPICE to Default) does not fix the issue, nor does migrating the VM back to its original node; the VM's UEFI disk appears to be permanently broken.
On the same VM, if I delete the EFI disk and create a new one on local storage (so it is not on Ceph), the VM boots fine and I can change hardware settings without breaking it.
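For anyone hitting the same thing, this is roughly the CLI equivalent of that workaround (VM ID and storage names are placeholders; in the GUI it is detach + remove, then add a new EFI disk):

```
# drop the broken EFI disk from the VM config
qm set 100 --delete efidisk0

# create a fresh EFI disk on local storage instead of the Ceph pool
qm set 100 --efidisk0 local-lvm:1
```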
If I install a new VM with BIOS (SeaBIOS) firmware, everything also works fine: I can migrate it and change settings without breaking anything.
Similar issues have been reported by others in this thread: https://forum.proxmox.com/threads/cant-start-vm-with-ovmf-and-uefi-disk-on-ceph.82367/#post-365709 although there the VMs failed to start regardless of any settings change, so it may not be the same issue.
Pinging @Alwin since he responded to that other, similar issue.