A question for those with experience with Proxmox (I'm starting fresh; the experience so far has been both fascinating and frustrating, though the frustrations seem to be more in Nvidia's house, as I'll get to). I may be on a fool's errand here, but my end goal is:
2 Windows VMs, both with GPU passthrough of the same GeForce GTX 1080 (only one running at a time)
1 instance of FreeNAS managing 8 HDDs
Outside of the 8 mentioned HDDs, I have 2 NVMe sticks (250GB each) housing Proxmox. I also have a 1TB SSD where I'm hoping to keep the 2 Windows VMs (approx. 500GB for one, ~400GB for the other).
In terms of priority, the FreeNAS environment probably trumps all, so if this is a bad idea, please let me know.
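For the FreeNAS VM, the plan is to pass each HDD through individually by its stable ID rather than as image files (I gather passing through a whole HBA is cleaner for ZFS, but per-disk passthrough seems to be the common fallback). A minimal sketch, with a hypothetical VM ID of 101 and a made-up disk serial:

# attach a physical disk to the FreeNAS VM by its by-id path (repeat per disk, bumping scsiN)
qm set 101 -scsi1 /dev/disk/by-id/ata-WDC_WD40EFRX-HYPOTHETICAL_SERIAL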
-------
To the issue at hand: I got the VM up and running, which is great, but for the life of me I can't get the GTX 1080 to behave (from what I've read, Nvidia really doesn't want GeForce cards used in VMs, so I suspect that's where the problem lies).
Modified GRUB line (ran update-grub after saving):
GRUB_CMDLINE_LINUX_DEFAULT="quiet amd_iommu=on iommu=pt pcie_acs_override=downstream,multifunction nofb nomodeset video=vesafb:off,efifb:off"
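To confirm the IOMMU actually came up after rebooting (on AMD it reports as AMD-Vi in dmesg):

dmesg | grep -i -e iommu -e amd-vi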
VM conf file:
agent: 1
bios: ovmf
bootdisk: scsi0
cores: 4
efidisk0: windows:vm-100-disk-1,size=1M
hostpci0: 0f:00,pcie=1,x-vga=1,romfile=GTX1080-patched.rom
machine: q35
memory: 16384
name: gaming-main
net0: e1000=CE:F4:14:E2:2E:2C,bridge=vmbr0
numa: 1
ostype: win10
scsi0: windows:vm-100-disk-0,iothread=1,replicate=0,size=500G,ssd=1
scsihw: virtio-scsi-single
smbios1: uuid=74963b60-ba8b-459a-91dd-16a0c3ba4d85
sockets: 1
vga: none
vmgenid: 9b0ff464-bab1-417d-89ca-bea86c434414
args: -cpu 'host,+kvm_pv_unhalt,+kvm_pv_eoi,hv_vendor_id=NV43FIX,kvm=off'
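Host-side, I followed the usual vfio steps from the passthrough guides; roughly the following (10de:1b80 / 10de:10f0 should be the GTX 1080's GPU and HDMI audio functions, but verify with lspci -nn, as your card's IDs may differ):

# /etc/modules
vfio
vfio_iommu_type1
vfio_pci
vfio_virqfd

# /etc/modprobe.d/vfio.conf
options vfio-pci ids=10de:1b80,10de:10f0 disable_vga=1

# /etc/modprobe.d/blacklist.conf
blacklist nouveau
blacklist nvidia

# then rebuild the initramfs and reboot
update-initramfs -u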
VM Hardware is:
Memory 16GB
Processors 4 (1 sockets, 4 cores) [numa=1]
BIOS OVMF (UEFI)
Display none (none)
Machine q35
SCSI Controller VirtIO SCSI single
Hard Disk (scsi0) windows:vm-100-disk-0,iothread=1,replicate=0,size=500G,ssd=1
Network Device (net0) e1000,bridge=vmbr0
EFI Disk windows:vm-100-disk-1,size=1M
PCI Device (hostpci0) pcie=1,x-vga=1,romfile=GTX1080-patched.rom
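One thing worth noting on the romfile route: unless you give an absolute path, Proxmox expects the ROM under /usr/share/kvm/, so that's where I put it. rom-parser (github.com/awilliam/rom-parser) can sanity-check the dump:

scp GTX1080-patched.rom root@<proxmox-host>:/usr/share/kvm/
# optional sanity check that the image parses as a valid ROM
./rom-parser /usr/share/kvm/GTX1080-patched.rom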
Things I've tried:
1) Multiple rounds of DDU -> restart -> reinstall drivers manually.
2) Switched to a romfile. Used GPU-Z to dump the ROM, passed it to the host via scp, and used that ROM file. No luck. Then installed Python, ran Nvidia_vbios_vfio_patcher against the GPU-Z dump, passed the patched version to the host via scp, and used that instead (see the sketch after this list). No luck.
2a) One oddball thing about this route: the monitor hooked up to the server from the initial setup was never disconnected, so before this it mostly showed host console output. After using the patched ROM, it started showing the Windows VM's lock screen. When I unlocked via Remote Desktop, the monitor stayed at the lock screen, and the lock screen would be replaced with gibberish while the graphics drivers installed.
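For reference, the patcher invocation looked roughly like this (script name per the Nvidia_vbios_vfio_patcher repo; double-check the flags against its README):

python nvidia_vbios_vfio_patcher.py -i GTX1080.rom -o GTX1080-patched.rom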
The main thing I've noticed is that the graphics drivers never seem to take. If I install the driver plus GeForce Experience via the manual method, Experience shows a "new" driver available; if I express-install that, it shows the same "new" driver available again. Essentially, it looks like the driver never applies. After installation, I do see the GTX 1080 in Device Manager, but with a caution flag saying it was stopped due to Code 43.
One likely difference between my setup and others is that I'm using an AMD Ryzen 2700X CPU (I know Intel seems to be the norm in the guides I was reading, understandably).
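Since Ryzen boards often lump the GPU in with other devices (the reason for the ACS override above), this is the usual sysfs walk to check the grouping; ideally only the GPU and its audio function share a group:

for g in /sys/kernel/iommu_groups/*; do
    echo "IOMMU group ${g##*/}:"
    for d in "$g"/devices/*; do
        lspci -nns "${d##*/}"
    done
done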
Any ideas of how I can get this GPU working?
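Happy to post host-side logs if they'd help; this is what I've been watching while starting the VM:

dmesg | grep -i -e vfio -e vga -e nvidia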