[SOLVED] Can't get mixed mode vGPU to work on NVIDIA A40

espenu

New Member
May 14, 2024
I have an NVIDIA A40 card (570 driver) with vGPU working with a couple of VMs. But I can't get mixed mode vGPU to work.
If I select for example a 4Q profile on one VM and start it, the only options for other VMs are 4Q and 4A.
What do I need to do to enable mixed mode on this card?

nvidia-smi shows vGPU Heterogeneous Mode = Disabled.
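Heterogeneous (mixed-size) mode can be toggled per GPU with nvidia-smi, but only while no vGPU instances are active on that card. A sketch, assuming the A40 is GPU index 0:

Bash:
```shell
# Enable heterogeneous (mixed-size) vGPU profiles on GPU 0.
# This only works while no vGPU instances are active on the card.
nvidia-smi vgpu -i 0 -shm 1

# Verify the setting; the vGPU Heterogeneous Mode field should now read Enabled.
nvidia-smi -q -i 0 | grep -i heterogeneous
```

Note that this takes effect immediately but does not survive a host reboot.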
 
Does anyone know how to make this setting apply across reboots? It is lost on reboot, and the VMs then fail to start.
 
Does anyone know how to make this setting apply across reboots? It is lost on reboot, and the VMs then fail to start.
I've ended up giving one large profile to a VM instead of splitting it, so I'm not actually using heterogeneous mode any more.
But it should be possible to run the command at Linux startup using systemd; just make sure it runs after the NVIDIA vGPU services have started.
 
Does anyone know how to make this setting apply across reboots? It is lost on reboot, and the VMs then fail to start.
I created a systemd oneshot service for this:
/etc/systemd/system/nvidia-shm.service
Code:
[Unit]
Description=Enable NVIDIA Heterogeneous Time-Slice Sizes
Before=pve-guests.service
After=nvidia-vgpud.service nvidia-vgpu-mgr.service

[Service]
Type=oneshot
ExecStart=/usr/bin/nvidia-smi vgpu -shm 1 -i 0

[Install]
WantedBy=multi-user.target

and then just enable it like any other service:
Bash:
systemctl daemon-reload
systemctl enable nvidia-shm.service
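Enabling the unit only registers it for subsequent boots. To apply it right away (again, only possible while no vGPU-backed VMs are running), start it once and confirm the mode flipped:

Bash:
```shell
systemctl start nvidia-shm.service
nvidia-smi -q -i 0 | grep -i heterogeneous
```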

The Before= directive makes sure the command runs before Proxmox attempts to start VMs that might use vGPU resources, since the setting can only be changed while no vGPUs are in use. The After= directive is there because, at least according to NVIDIA's documentation, the vGPU manager services need to be running for the command to succeed.
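If the host has more than one vGPU-capable card, the same oneshot unit can carry one ExecStart line per GPU index, and systemd runs them in order. A sketch (the second index assumes a second card is present):

Code:
```
[Service]
Type=oneshot
ExecStart=/usr/bin/nvidia-smi vgpu -shm 1 -i 0
ExecStart=/usr/bin/nvidia-smi vgpu -shm 1 -i 1
```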