UPDATE
I have the VM booting with the GPU attached in PCIe mode and with the Primary GPU flag set.
Things I changed:
-Rolled the kernel back to 5.15.53-1-pve, per this post
-Added the GPU romfile to the VM config
The VM now boots, but I get no output on the monitor I have attached...
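For reference, the passthrough-related lines in a VM config end up looking roughly like this. This is a sketch, not my exact config: the PCI address (01:00) and romfile name are placeholders for your own hardware, and it assumes an OVMF/q35 VM:

```
# /etc/pve/qemu-server/<vmid>.conf -- hypothetical IDs and filenames
# Pass the GPU through as a PCIe device, flagged as primary GPU (x-vga),
# with a dumped vBIOS image placed in /usr/share/kvm/
hostpci0: 01:00,pcie=1,x-vga=on,romfile=gpu-vbios.rom
machine: q35
bios: ovmf
```

If your Proxmox version ships proxmox-boot-tool, the kernel rollback can be made to survive upgrades with `proxmox-boot-tool kernel pin 5.15.53-1-pve`.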
I've been trying to get this to work for the past week with no luck. I've tried multiple different guides, but still can't seem to get this working.
I'll start with symptoms:
-Attaching GPU to VM with PCIe and Primary GPU results in a Windows blue screen.
-Attaching GPU to VM with PCIe and no...
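One thing worth double-checking before anything else is that IOMMU is actually enabled on the host. On an Intel board that means a kernel parameter along these lines in /etc/default/grub (amd_iommu=on on AMD), followed by `update-grub` and a reboot — this is the standard Proxmox passthrough prerequisite, not something specific to my setup:

```
# /etc/default/grub -- enable IOMMU for PCIe passthrough (Intel example)
GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"
```

After rebooting, `dmesg | grep -e DMAR -e IOMMU` should show the IOMMU being initialized.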
So I decided to change from hardware RAID to ZFS on my homelab Proxmox node. The hardware is:
Dell R710
120GB RAM
2x 6-core/12-thread CPUs
Dell H200 RAID controller flashed to LSI 9211-8i IT mode
2x Seagate Constellation ES.1 2TB 7.2K 6Gb/s SAS drives in a ZFS mirror
I've limited ZFS to...
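For anyone following along, limiting ZFS memory use is normally done by capping the ARC via a module option. This is a sketch with an example value of 8 GiB, not necessarily the limit I chose:

```
# /etc/modprobe.d/zfs.conf -- cap the ZFS ARC (example: 8 GiB)
# 8 GiB = 8 * 1024^3 = 8589934592 bytes
options zfs zfs_arc_max=8589934592
```

On Proxmox the option takes effect at boot after running `update-initramfs -u` and rebooting; it can also be changed at runtime by writing to /sys/module/zfs/parameters/zfs_arc_max.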
This is solved. I ended up running an update/upgrade, which didn't fix the problem by itself, but after a reboot of Proxmox everything worked fine again.
So I recently upgraded to Proxmox 6.1 on the same hardware (R710) I was running 5.4 on, but ever since the update my Windows VMs WILL NOT start, whether they were restored from backup or newly created. The syslog isn't very helpful either. The error I keep getting is:
TASK ERROR...