[Solved] VM start kills local display terminal (gpu output)

valk

New Member
Dec 2, 2022
I have 2 VMs and 3 LXC containers on this node. When I start the 1st VM or any of the LXCs I keep my local display terminal (via the PCIe GPU), but starting the 2nd VM kills the output, and it doesn't come back even when I shut that VM down. The 1st VM has one HDD passed through to it and the 2nd has a PCIe RAID controller passed through, so I guess this is somehow related to the issue. Is there a way I can keep my terminal alongside the VM that has the PCIe RAID controller passed through? I really need it for troubleshooting - my Proxmox sometimes loses the web interface and SSH login, and I get nothing in the logs after a reboot (but the VMs keep working).
 
Probably a passed-through device and that GPU share an IOMMU group. You cannot share devices from the same IOMMU group between VMs, or between a VM and the Proxmox host; doing so would break the security isolation between the VM(s) and/or the Proxmox host. The groups are determined by the motherboard and its BIOS, and you can view them with this command:

for d in /sys/kernel/iommu_groups/*/devices/*; do n=${d#*/iommu_groups/*}; n=${n%%/*}; printf 'IOMMU group %s ' "$n"; lspci -nns "${d##*/}"; done
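If you only care about two specific devices (here: the GPU and the RAID controller), a minimal sketch like the following compares their iommu_group symlinks directly. The PCI addresses are placeholders - find yours with lspci - and the SYSFS_ROOT override is an assumption added only so the logic can be exercised outside a real sysfs; on a live host just leave it at the default /sys.

```shell
#!/bin/sh
# Sketch: do two PCI devices share an IOMMU group?
# SYSFS_ROOT defaults to /sys; overridable for testing (assumed convention,
# not part of any standard tool).
SYSFS_ROOT="${SYSFS_ROOT:-/sys}"

same_iommu_group() {
    # Each PCI device exposes an iommu_group symlink pointing at its group
    # directory under /sys/kernel/iommu_groups/<n>.
    g1=$(readlink -f "$SYSFS_ROOT/bus/pci/devices/$1/iommu_group")
    g2=$(readlink -f "$SYSFS_ROOT/bus/pci/devices/$2/iommu_group")
    [ -n "$g1" ] && [ "$g1" = "$g2" ]
}

# Placeholder addresses - replace with your GPU and RAID controller BDFs:
if same_iommu_group "0000:01:00.0" "0000:05:00.0"; then
    echo "same group - cannot split between host and VM"
else
    echo "different groups - passthrough should be possible"
fi
```

If the two devices resolve to the same group directory, one of them has to move to a different slot (or the whole group goes to the VM), which is exactly the situation described above.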

EDIT: It might help if you could tell which motherboard you have and which PCIe slots you are using. Sometimes using a different slot helps.
 
You are right, same IOMMU group. The motherboard is a Maximus_IV_Extreme-Z. Top slot for the GPU and the last (x16) slot for the RAID controller.
 
I had a look at pages 2-14 and 2-15 of the manual. Please try the fourth PCIe slot from the CPU (instead of the fifth you are using now). It might be in a separate IOMMU group.
 
Thanks for your help! That exact move didn't help, but then I tried the RAID card in the last (x1) slot and everything works now - the devices are in different groups. Now, though, some onboard USB controllers refuse to work :D. I guess I can live with that until I resolve the host login suicide issue...