AMD iGPU in Docker?

Can’t get my iGPU to work in apps. Seems to be fine in Linux.
Anyone have any tips on how to get the GPU to work in apps that need it?

  • amdgpu (kernel driver) – OK
  • /dev/dri/card0 (crw-rw---- root:video) – OK
  • /dev/dri/renderD128 (crw-rw-rw- root:render) – OK
  • Docker access to /dev/dri – OK (compose snippet below for reference)
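For reference, the passthrough is the usual Docker Compose pattern (the service name and image below are just placeholders; only the devices/group_add part matters):

```yaml
services:
  someapp:                      # placeholder service/image, not a specific app
    image: example/someapp:latest
    devices:
      - /dev/dri:/dev/dri       # exposes card0 and renderD128 to the container
    group_add:
      - "44"                    # GID of the host "video" group (check with: getent group video)
```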
Thanks

It’s a good idea to provide your AMD CPU model, as well as the application you want to use the AMD iGPU for.
This way we can do some testing.
The initial driver adaptation of ZimaOS revolves around Intel iGPUs and Nvidia GPUs.
AMD GPUs don’t have a specific adaptation yet.


Same problem for me.

I want to use my AMD iGPU (AMD Ryzen 5 pro 4650G with AMD Vega 6) for Ollama.

I tried all the different versions of Ollama (CPU-only, Ollama AMD with ROCm), but none of them worked with the iGPU.
I passed through /dev/dri, but it didn’t work in Ollama.
I didn’t pass through /dev/kfd because it doesn’t even exist in the ZimaOS console.

I tried it with openSUSE MicroOS and Podman and it seemed to work there (at least Ollama was running waaaaay faster there, so I assume the iGPU was used).

Maybe you have some info for me on how to get it to work.

Greets, Tobias

Is there any Ollama/ROCm version that works with AMD Vega 6?
There is Vulkan support coming; my guess is it will probably work with most AMD iGPUs.
If it is the gfx90c, it looks like it will work with the HSA_OVERRIDE_GFX_VERSION=9.0.0 workaround.

Try:

Not sure if it works in Docker though, but it’s always worth a try.
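If it helps, a rough sketch of what that could look like in Docker Compose, assuming the official ollama/ollama:rocm image (untested on ZimaOS; the override only makes sense for gfx90c-class iGPUs, and it still needs /dev/kfd on the host):

```yaml
services:
  ollama:
    image: ollama/ollama:rocm
    devices:
      - /dev/kfd:/dev/kfd                   # ROCm compute node (missing on ZimaOS, see above)
      - /dev/dri:/dev/dri                   # render nodes
    environment:
      - HSA_OVERRIDE_GFX_VERSION=9.0.0      # report the iGPU as gfx900 so ROCm accepts it
    ports:
      - "11434:11434"                       # Ollama's default API port
    volumes:
      - ollama:/root/.ollama                # model storage
volumes:
  ollama:
```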

There’s no support for AMD ROCm yet in ZimaOS.

Context: AMD Radeon 890M GPU Passthrough/Hardware Acceleration Support - #6 by Zima-Giorgio

Soon it will be possible to use Vulkan; not sure about Docker though.
If it works in Docker once the changes land in the stable build, that would be great. Comparing Vulkan and ROCm in Ollama and/or LM Studio on my main machine, Vulkan is easier on my hardware. So great news.

For information: I tried an old fork of Package ollama-vulkan · GitHub for my iGPU and it looks like it is working. Can’t use anything newer than Llama 3.2, but it did work.

Good morning @Zima-Jerry,
I’d like to request support for the following hardware, since it isn’t supported yet: AMD Ryzen 7 255 w/ Radeon 780M Graphics.
I’m planning to use the iGPU for Plex transcoding.

Thx and Best Regards

It will for sure work with Plex. Use the right device in Docker Compose. My older card works fine for Plex.

Hi @zztop007
I’m a newbie and I don’t know what you’re referring to or how to configure it; would you be so kind as to explain?
Thx and Best Regards

I don’t have this device at the moment, so let me investigate whether I need the latest kernel to drive the 780M.

Use this one, and maybe also /dev/kfd (add - /dev/kfd to the devices list); see the sketch below.
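Something like this in the compose file, using the stock plexinc/pms-docker image as an example (your image/tag may differ):

```yaml
services:
  plex:
    image: plexinc/pms-docker   # example image; linuxserver/plex works the same way
    devices:
      - /dev/dri:/dev/dri       # VA-API hardware transcoding
      - /dev/kfd:/dev/kfd       # optional; drop this line if the node doesn't exist or Plex won't start
```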

Plex:

Plex stops running (inside & outside the home).
I had to delete the /dev/kfd line.
Even changing the “hardware transcoding device” doesn’t do anything…


So no cigar at all? /dev/dri or renderD128 used to do the trick, and works like a charm on my older hardware. It could be a problem with this slimmed-down version of Linux; I think you can’t update or add drivers that aren’t already there. Good luck anyway.