AMD

In my opinion, the attempt to run AI models with an AMD card has proven to be a dead end. Therefore (with a heavy heart, given the 48 GB of VRAM) I decided to return the AMD card. In its place, an Nvidia 3090, which I was able to pick up relatively cheaply despite the Christmas season, is now doing the job.

This also opens up the option for me to use the current version of Linux Mint or Windows on my second PC as well.

The following pages remain here only for “historical reasons”.


After some trial and error, I can say: AI workloads are no picnic with AMD graphics cards and, at best (if anything works at all), fragile. This seems to me to be primarily the result of a version chaos that is unmatched.

So, one day I seemingly got “Oobabooga text-generation-webui” running without any problems, and could use models in the GGUF, GPTQ, and EXL2 formats. The next day, I had to reinstall Linux “for reasons” and couldn't get it working again: as soon as I try to load a GGUF model, the application crashes completely. For now, I can only assume it's due to a change in the llama.cpp library, which seems to receive several commits/merges every day.

In addition, the text-generation-webui is quite behind with the libraries it uses. For AMD support, it wants ROCm 6.1. The latest official 6.1.x release appears to be 6.1.2, so that is what I'll be using from now on.

The problem with 6.1.2 is that the latest officially supported Ubuntu version is 22.04.4, and the kernel version must not exceed 6.5. I was able to get it running with Ubuntu 22.04.4, but neither with a correspondingly old version of Linux Mint (I tried 21.3 Edge) nor with Xubuntu 22.04.4.
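To check whether a given install meets these constraints before going any further, the release and the running kernel can be inspected like this (a minimal sketch; the “6.6” cut-off simply encodes “must not exceed 6.5”):

```shell
# Show the OS release and the running kernel
lsb_release -ds
kernel="$(uname -r)"
echo "Running kernel: ${kernel}"

# Warn if the kernel is 6.6 or newer, i.e. beyond the 6.5 limit for ROCm 6.1.2.
# sort -V compares version strings; if "6.6" sorts first, the kernel is too new.
if [ "$(printf '%s\n' "6.6" "${kernel}" | sort -V | head -n1)" = "6.6" ]; then
    echo "WARNING: kernel ${kernel} is newer than 6.5 - ROCm 6.1.2 may not work"
fi
```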

Unfortunately, 22.04 ships with both the 6.5 and the 6.8 kernel by default. The latter needs to be removed first...

Use these commands to remove the unwanted kernel and re-initialise GRUB:


sudo apt purge -y linux*6.8*
sudo update-grub
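On 22.04, the 6.8 kernel usually arrives via the HWE metapackages, so after purging it I would also put those on hold so that the next upgrade does not pull 6.8 straight back in. A sketch, assuming the standard Ubuntu 22.04 package names (check yours first):

```shell
# Prevent the HWE metapackages from reinstalling a 6.8 kernel on upgrade.
# Package names are an assumption for Ubuntu 22.04; list yours with:
#   apt list --installed 'linux-*hwe*'
sudo apt-mark hold linux-generic-hwe-22.04 linux-image-generic-hwe-22.04

# Verify which kernel packages remain installed
dpkg --list 'linux-image-*' | grep '^ii'
```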