GPU Sharing Issue (PCI Device Already In Use)

ibrahimswati

New Member
Jan 23, 2025
Hi,

I've been struggling for multiple days but have been unable to resolve an issue with GPU sharing. I'm fairly new to Proxmox, and this VE was set up by a DevOps team member who has since resigned. Last week my server went through a reboot, and now I am unable to start multiple VMs at the same time that have the same GPU assigned. I can start them if I detach the GPU; only one VM can actively use the GPU at a time.

Here's the error that I get:
TASK ERROR: PCI device '0000:86:00.0' already in use by VMID '261'

I was not getting this before the reboot; I was able to share the GPU without any issues. Also, my host does not have NVIDIA drivers installed, only the VMs do. I'm still not sure how that works.

My VM 261, which currently has the GPU running, uses the following settings for the GPU PCI device:
Raw device 0000:86:00.0 which is my GPU
All functions toggled off
Primary GPU toggled off
ROM-Bar enabled but no vendor ID or device ID

Information:
CPU: Intel(R) Xeon(R) Platinum 8168 CPU @ 2.70GHz
GPU: Nvidia A40
Kernel Version: Linux 6.2.16-3
Proxmox VE version: 8.0.3
IOMMU is enabled

/etc/modules file:
vfio
vfio_iommu_type1
vfio_pci
vfio_virqfd

Grub file:
GRUB_DEFAULT=0
GRUB_TIMEOUT=5
GRUB_DISTRIBUTOR=`lsb_release -i -s 2> /dev/null || echo Debian`
GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on"
GRUB_CMDLINE_LINUX=""
iommu=pt
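For reference, kernel parameters in /etc/default/grub only take effect when they sit inside the quoted GRUB_CMDLINE_LINUX_DEFAULT value; a bare `iommu=pt` on its own line does nothing. If that stray line was meant as a kernel parameter, the intended layout would presumably be (a sketch, followed by `update-grub` and a reboot to apply it):

```
# /etc/default/grub - parameters belong inside the quotes
GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"
GRUB_CMDLINE_LINUX=""
# then run: update-grub  (and reboot)
```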

What am I missing here? Please let me know if you need any more information, and please forgive my ignorance as I'm fairly new.

Thank you in advance.
 
PCI(e) passthrough (which you appear to be using) can only work with one VM at a time. The GPU is given to the VM when the VM starts and cannot be shared. You can have multiple VMs with the device but not run them at the same time. I don't understand how this could have worked before.

I think Proxmox supports NVIDIA GRID for sharing a GPU, but you would set that up differently (I have no personal experience with it) and you would need a paid license from NVIDIA.

Also, you might want to update your Proxmox to the latest 8.3.
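For context, the passthrough settings described in the question map to a hostpciN line in the VM's config file. A sketch of what /etc/pve/qemu-server/261.conf likely contains (the exact options are an assumption based on the settings listed above):

```
# /etc/pve/qemu-server/261.conf (excerpt, hypothetical)
# raw device, all functions off, not primary GPU, ROM-Bar enabled
hostpci0: 0000:86:00.0,rombar=1
```

When such a VM starts, vfio-pci claims the entire physical device, which is exactly why a second VM referencing the same PCI address fails with the "already in use" error.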
 
Thank you for clarifying that. So it's impossible to have a Windows machine and a Linux machine use the same GPU at the same time? I'm not sure how they were running before. As of right now, I can't even boot a second VM while one VM is running with the GPU assigned.

Also, I will update ASAP. Thanks for letting me know.
 
Hello ibrahimswati! My guess is that since you are using a data center GPU, you are trying to use Nvidia's vGPU to share the GPU between multiple VMs. This is possible with Nvidia's data center GPUs, but you need to make sure that the proprietary Nvidia driver runs on both the host (PVE) and the guest VMs.

First, I would recommend updating your PVE and kernel to the latest versions. I would then follow the guide on NVIDIA vGPU on Proxmox VE.
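As a quick sanity check once the host driver is installed (a sketch; the path assumes the PCI address from the error message), a working vGPU host driver exposes mediated-device (mdev) types for the GPU under sysfs:

```shell
#!/bin/sh
# Check whether the NVIDIA vGPU host driver exposes mdev types for the GPU.
# 0000:86:00.0 is the PCI address from the error message in this thread.
GPU=/sys/bus/pci/devices/0000:86:00.0

if [ -d "$GPU/mdev_supported_types" ]; then
    # Each entry is a vGPU profile that can be assigned to a VM.
    ls "$GPU/mdev_supported_types"
else
    echo "No mdev types found - the vGPU host driver is not loaded for this device."
fi
```

If the directory is missing, the host driver (not just the guest drivers) still needs to be installed and loaded.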
 
So it's impossible to have a Windows machine and a Linux machine use the same GPU at the same time?
It's not possible with most GPUs, no. But you are using an NVIDIA A40, which does offer that possibility if you set it up correctly (see my previous post).

In general, sharing a GPU is otherwise possible using Intel's GVT-g or NVIDIA's vGPU, but this requires explicit support from the GPU manufacturer, which you do have with NVIDIA's vGPU in your case. Another possibility would be to use VirGL for host offloading. However, if I were you, I would simply try to get vGPU working again.
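With vGPU, sharing works by assigning each VM a mediated device (a vGPU profile) instead of the raw PCI device. A sketch of the VM config line (the profile name nvidia-XXX is a placeholder; the real names are listed by the host driver once it is installed):

```
# /etc/pve/qemu-server/<vmid>.conf (excerpt, hypothetical)
# assign a vGPU profile rather than the whole card
hostpci0: 0000:86:00.0,mdev=nvidia-XXX
```

Multiple VMs can then run concurrently, each bound to its own mdev instance on the same physical GPU.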
 
Thank you so much for your help. I'm looking into the vGPU now. I was mainly curious to see how he had it running before without the vGPU.
 
