Hi,
Apologies if this has been answered before, but I wasn't able to find a solution; my scenario might be a bit of an unusual use case.
My setup is:
- A fairly old computer: Phenom II X6, 24 GB RAM, AMD/ATI Radeon HD 4290
- Debian 11 with the GNOME desktop environment
On top of that, Proxmox VE 7.1 is installed (done according to the description here in the wiki).
I use the machine for two totally different things (and I know that's not how Proxmox is intended to be used):
1) As a TV with Tvheadend/Kodi and for watching content online (YouTube, Netflix) - the machine is connected to a TV screen over HDMI
2) As a host for some VMs (e.g. Windows for work purposes) that I connect to from a client via RDP
I have noticed a high CPU load when watching videos online, so I assume the CPU is being used as a software renderer instead of the GPU.
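For reference, this is roughly how I observe the load while a video is playing (with llvmpipe the player/browser process typically sits at several hundred percent CPU across the six cores):

# Sort processes by CPU usage while playback is running
top -o %CPU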
The graphics card has the radeon driver installed according to https://wiki.debian.org/AtiHowTo.
Output of lspci -v:
VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] RS880 [Radeon HD 4290] (prog-if 00 [VGA controller])
Subsystem: ASRock Incorporation RS880 [Radeon HD 4290]
Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR+ FastB2B- DisINTx-
Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast >TAbort- <TAbort- <MAbort- >SERR- <PERR- INTx-
Latency: 0, Cache Line Size: 64 bytes
Interrupt: pin A routed to IRQ 18
NUMA node: 0
Region 0: Memory at d0000000 (32-bit, prefetchable) [size=256M]
Region 1: I/O ports at c000 [size=256]
Region 2: Memory at fe5f0000 (32-bit, non-prefetchable) [size=64K]
Region 5: Memory at fe400000 (32-bit, non-prefetchable) [size=1M]
Expansion ROM at 000c0000 [virtual] [disabled] [size=128K]
Capabilities: <access denied>
Kernel driver in use: radeon
Kernel modules: radeon
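In case it helps: this is how I would check whether the radeon driver found its firmware at boot (a rough sketch; the exact messages may look different on other systems):

# Look for firmware load messages/errors from the radeon driver.
# If the RS880 microcode is missing, there are usually lines like
# "Direct firmware load for radeon/... failed" in here.
dmesg | grep -iE 'radeon|firmware'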
I was unable to install the package firmware-amd-graphics: according to another topic here in the forum, it would conflict with the Proxmox firmware package, which already contains this firmware.
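I haven't verified myself whether the Proxmox package really ships the RS880 blobs; assuming the package is called pve-firmware, something like this should show it:

# List the radeon firmware files shipped by the Proxmox firmware package
# (the package name pve-firmware is an assumption on my side).
dpkg -L pve-firmware | grep -i radeon
# And check that the blobs actually exist on disk:
ls /lib/firmware/radeon/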
glxinfo shows me:
glxinfo | grep render
direct rendering: Yes
GLX_MESA_query_renderer, GLX_MESA_swap_control, GLX_OML_swap_method,
GLX_EXT_visual_rating, GLX_MESA_copy_sub_buffer, GLX_MESA_query_renderer,
Extended renderer info (GLX_MESA_query_renderer):
OpenGL renderer string: llvmpipe (LLVM 11.0.1, 128 bits)
GL_ARB_conditional_render_inverted, GL_ARB_conservative_depth,
GL_NV_conditional_render, GL_NV_copy_image, GL_NV_depth_clamp,
GL_ARB_conditional_render_inverted, GL_ARB_conservative_depth,
GL_NV_conditional_render, GL_NV_copy_depth_to_color, GL_NV_copy_image,
GL_EXT_render_snorm, GL_EXT_robustness, GL_EXT_sRGB_write_control,
GL_MESA_shader_integer_functions, GL_NV_conditional_render,
GL_OES_element_index_uint, GL_OES_fbo_render_mipmap,
Full renderer summary (glxinfo -B):
name of display: :1
display: :1 screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
Vendor: Mesa/X.org (0xffffffff)
Device: llvmpipe (LLVM 11.0.1, 128 bits) (0xffffffff)
Version: 20.3.5
Accelerated: no
Video memory: 23728MB
Unified memory: no
Preferred profile: core (0x1)
Max core profile version: 4.5
Max compat profile version: 3.1
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.2
OpenGL vendor string: Mesa/X.org
OpenGL renderer string: llvmpipe (LLVM 11.0.1, 128 bits)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 20.3.5
OpenGL core profile shading language version string: 4.50
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL version string: 3.1 Mesa 20.3.5
OpenGL shading language version string: 1.40
OpenGL context flags: (none)
OpenGL ES profile version string: OpenGL ES 3.2 Mesa 20.3.5
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
As far as I know, llvmpipe means the CPU is doing the rendering.
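Before posting I also wanted to check the Xorg side; I'd grep the log for DRI/AIGLX errors roughly like this (log path assumed; on some setups it lives under ~/.local/share/xorg/ instead):

# Look for (EE) errors and AIGLX/DRI initialisation messages.
# A line like "AIGLX: reverting to software rendering" would confirm
# that X could not enable acceleration on the radeon driver.
grep -iE '\(EE\)|AIGLX|DRI' /var/log/Xorg.0.log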
So how can I get the GPU to do the rendering on the host machine?
I don't want to pass it through to a VM; I want to use it on the host itself.
Is that possible, and if so, how?
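For reference, this is what I'd hope to see once the GPU is actually used (the renderer string is a guess based on other RS880 reports, not something I've confirmed):

# Hoped-for result once hardware rendering works, roughly:
#   Accelerated: yes
#   OpenGL renderer string: Gallium ... on AMD RS880 ...
glxinfo -B | grep -E 'Accelerated|renderer string'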
Thanks in advance