Hello folks. I'm stumped, and this feels "wave a rubber chicken at your PC" odd.
I recently changed from VMware ESXi to Proxmox. I had GPU passthrough working on VMware, and after following the Ultimate Beginner's Guide to Proxmox GPU Passthrough I do have GPU passthrough working with Proxmox... with some weirdness!
To get GPU passthrough to work I have to have my VM configured with the Display set to default (so I can use the noVNC console) and the "Primary GPU" checkbox unchecked on the PCIe passthrough configuration. Then I power on my VM and switch to the console in the Proxmox WebUI and let the VM boot. As soon as the VM is past UEFI post and hands things off to Windows, my display connected to the GPU will switch from the Proxmox console to Windows. If I don't do that Windows will boot fine but the only access to the VM is via RDP.
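For reference, this is roughly what the relevant part of my VM config looks like in the state that works (the VMID and the PCI address are placeholders; yours will differ, and I've left the Display at default so there's no explicit vga: line):

```
# /etc/pve/qemu-server/100.conf (excerpt; VMID 100 and address 01:00 are placeholders)
bios: ovmf
# No vga: line here -- Display is "Default", which is what gives me the noVNC console
# "Primary GPU" unchecked, so no x-vga=1 option on the hostpci line
hostpci0: 0000:01:00,pcie=1
```

Checking "Primary GPU" in the WebUI is what adds x-vga=1 to that hostpci0 line, which is the part that breaks booting for me.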
I've tried changing the host server's BIOS to use the onboard GPU instead of the PCIe GPU, but in that configuration, the Radeon GPU isn't available for passthrough. Would be ideal to get the Proxmox console via the on-board VGA output and my Windows VM via the GPU, but accessing Proxmox via SSH or the WebUI only is just fine. I'm going for making GPU passthrough work reliably without manual intervention.
But honestly, I'll accept the manual intervention for the GPU passthrough VM for the nice boost in storage performance I'm getting from switching from hardware RAID to HBA mode + ZFS. (The TS440's controller performs much better with the HBA firmware than with the RAID firmware.)
Hardware:
Lenovo ThinkServer TS440
GPU: AMD Radeon R7 200 series
Processor: Intel(R) Xeon(R) CPU E3-1245 v3
Yes, old hardware, but it's hanging in there well for what I use it for. (Lots of storage for Plex, and a Windows VM with GPU passthrough that can play Rocket League.)
Thank you all!