So I have four RTX 3090 GPUs in a Proxmox server, and I'm having real trouble getting them all to work together.
I have set up PCI passthrough but can't seem to get them working in a VM (not a container). Would I need to
install the drivers on the Proxmox host first, before installing CUDA on any of the VMs?
I have installed the NVIDIA driver for the RTX 3090 and then used the official TensorFlow Docker image, but I can still
only use one GPU, or it just crashes.
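For context, a minimal check along these lines (the model and data are just placeholders) is the sort of thing I'm running inside the container to see how many GPUs TensorFlow actually picks up, and this is where I only ever end up with one GPU or a crash:

    import numpy as np
    import tensorflow as tf

    # List the GPUs TensorFlow can actually see inside the container.
    gpus = tf.config.list_physical_devices("GPU")
    print("Visible GPUs:", gpus)

    # MirroredStrategy should report one replica per visible GPU (4 in my case).
    strategy = tf.distribute.MirroredStrategy()
    print("Replicas in sync:", strategy.num_replicas_in_sync)

    # Tiny throwaway model/data just to confirm all cards get work.
    with strategy.scope():
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
            tf.keras.layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")

    x = np.random.rand(1024, 32).astype("float32")
    y = np.random.rand(1024, 1).astype("float32")
    model.fit(x, y, epochs=1, batch_size=256)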
Anyone got a similar config working?
Thanks.