This is a write-up based on an aggregation of several great posts I found online over the course of 4 days. You can find numerous instructions for GPU passthrough basics for all versions of Proxmox, so I won't go into that here. This is just the last step in the process, to fix what has probably been a very frustrating problem where you see your Windows VM begin to boot on your external monitor, only for it to immediately freeze. This applies to you if you see your OVMF UEFI and then a black screen/broken image/no signal on your external monitor, and/or if noVNC shows the VM bluescreening (due to video driver failure) or fully booting only when the GPU is removed from the passthrough list.
This post was greatly assisted by TorqeWrench's post here, which provided the foundation I needed to solve my issue and streamline the process. To recap, in brief:
Since Proxmox 6.1 there has apparently been a disconnect between Proxmox and Windows, in which Windows does not enable the flag for Message Signaled Interrupts (MSI). When the UEFI hands the GPU to the OS and the video driver loads, the card stops getting proper IRQ handling.
The solution is to manually enable MSI in the registry. The GPU in this case was from Nvidia. My method is as follows:
1. Read through the instructions and download the necessary software in advance. Obviously back up any data you can't afford to lose; usual disclaimer applies.
2. Set the VM display to Default and remove the GPU from the hardware list. This will allow you to use noVNC for the following steps.
3. Boot the Windows VM and run msconfig to set startup to safe mode with networking. Restart.
4. Run DDU to remove any trace of the video driver. You can also use DDU to quickly set Windows Update not to automatically download drivers, or you can do it manually.
5. Restart (it will reboot into safe mode again).
6. Run NVCleanstall, a program that lets you customize the Nvidia driver installer and has an option to enable the correct MSI flag in the registry during installation. Pick your settings of choice, making sure to open the advanced menu and enable the MSI flag. You can download your driver from the manufacturer, or select it through NVCleanstall.
7. Run msconfig and set it back to normal boot. Restart or shut down.
8. Add your GPU back to the hardware list with the proper hardware flags.
9. Boot up and breathe a heavy sigh of relief. Make a post here if this helped you, to add visibility for others searching for help.
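For reference, the MSI flag that NVCleanstall toggles lives under the GPU's device key in the Windows registry. A minimal .reg sketch is below; the device instance path shown is hypothetical, so replace it with your own GPU's path (Device Manager > your display adapter > Details > "Device instance path"):

```
Windows Registry Editor Version 5.00

; HYPOTHETICAL device instance path, shown for illustration only.
; Find your GPU's real path in Device Manager before applying anything.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\VEN_10DE&DEV_1B81&SUBSYS_00000000&REV_A1\4&0&0008\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties]
; 1 = MSI enabled, 0 = line-based interrupts
"MSISupported"=dword:00000001
```

This is the same MSISupported value the MSI-mode utilities floating around the forums edit; NVCleanstall just sets it for you at install time so you don't have to dig through the registry by hand.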
P.S. There are some concerns about needing to re-enable the MSI flag every time the video driver is updated. If you follow my method completely, drivers will only be installed when you manually update them, and by using NVCleanstall for each update you can re-enable the MSI flag every time. That means there should be absolutely no unexpected downtime or need to redo this.
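If you ever want to confirm that MSI actually took effect, you can check from the Proxmox host while the VM is running. A sketch, assuming your GPU sits at PCI address 01:00.0 (yours will likely differ; find it with `lspci`):

```
# Hypothetical PCI address; find your GPU's with: lspci -nn | grep -i vga
lspci -vv -s 01:00.0 | grep 'MSI:'
# A line like "MSI: Enable+ Count=1/1" means message signaled interrupts
# are active; "Enable-" means the card is still on line-based interrupts.
```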