
ai-suite-rocm-local

This is a simple project that makes it easy to host multiple AI tools locally on Linux with AMD GPUs using ROCm (without Docker).

Warning

This project is currently being rewritten to be more modular and easier to use; it is a work in progress.
Ignore the Python files and the services directory for now, the existing shell scripts are not affected.

To get started, clone the repository and run the install script of the service you want to use:

git clone https://git.broillet.ch/mathieu/ai-suite-rocm-local.git
cd ai-suite-rocm-local/<service name>/
./install.sh
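
For reference, here is a minimal sketch of the kind of setup such an install script performs, assuming each service creates its own Python virtual environment with the ROCm build of PyTorch; the actual scripts may differ per service and pin other versions:

# Sketch only: per-service Python venv with ROCm PyTorch (assumption, not the exact script)
python3 -m venv venv
source venv/bin/activate
pip install --upgrade pip
# ROCm wheel index; adjust the rocmX.Y suffix to match the ROCm release installed on your system
pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm6.0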

Then you can start whichever service you want using its respective run.sh script.

./run.sh
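
For example, to set up and launch ComfyUI (using the comfyui-rocm directory from the repository layout):

cd ai-suite-rocm-local/comfyui-rocm/
./install.sh   # first run only
./run.sh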

This has been tested on Fedora 40 (kernel 6.9.6) with an AMD RX 6800 XT.
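
If a service fails to see the GPU, it can help to first check that ROCm detects it, assuming the ROCm userspace tools (rocminfo, rocm-smi) are installed:

rocminfo | grep -i gfx   # should list the GPU architecture (gfx1030 for the RX 6800 XT)
rocm-smi                 # shows utilization, VRAM usage and temperature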