# ai-suite-rocm
**(Deprecated: use [this repository](https://git.broillet.ch/mathieu/ai-suite-rocm-local) instead.)**
This is a simple project that makes it easy to host local LLM and AI tools on Linux with AMD GPUs using ROCm.

To use it, clone the repository and build the Docker image; the same image is used for all the services.
```bash
git clone https://github.com/M4TH1EU/ai-suite-rocm.git
cd ai-suite-rocm/
docker build . -t 'ai-suite-rocm:6.0' -f Dockerfile
```
Then you can start and stop whichever service you want using its respective start/stop script.
For example, you can start Stable Diffusion with:
```bash
# Start
./start_stablediffusion.sh
# Stop
./stop_stablediffusion.sh
```
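The start scripts themselves are not reproduced here, but a script like `start_stablediffusion.sh` presumably boils down to a `docker run` with the standard ROCm device-passthrough flags. The sketch below is an assumption, not the repository's actual script; the container name, port, and volume mount are illustrative:

```bash
#!/bin/bash
# Hypothetical sketch of a start script (not the repo's actual contents).
# /dev/kfd and /dev/dri expose the ROCm compute interface and the GPU render
# nodes to the container; seccomp=unconfined is commonly needed for ROCm
# workloads. Port and volume paths are illustrative assumptions.
start_stablediffusion() {
  docker run -d --name stablediffusion \
    --device=/dev/kfd \
    --device=/dev/dri \
    --security-opt seccomp=unconfined \
    -p 7860:7860 \
    -v "$PWD/models:/models" \
    ai-suite-rocm:6.0
}
```

A matching stop script would then just be `docker stop stablediffusion && docker rm stablediffusion`.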
If, like me, you prefer storing all your models and large files on another disk, have a look at the `make_folders.sh` script (it creates symlinks).
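A symlink setup along these lines is what such a script might do; the paths and directory names below are illustrative assumptions, not the actual ones used by `make_folders.sh`:

```bash
#!/bin/bash
# Hypothetical example: keep large model files on a secondary disk and
# symlink them into the working directory. Paths are illustrative.
BIG_DISK="${BIG_DISK:-$HOME/ai-models}"   # assumed location on the larger disk
mkdir -p "$BIG_DISK/stablediffusion"
# -s: symbolic link, -f: replace an existing link, -n: do not follow an
# existing directory symlink at the destination
ln -sfn "$BIG_DISK/stablediffusion" ./stablediffusion-models
```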
*This has been tested on Fedora 39 with kernel 6.7.4, the latest Docker version, and an AMD RX 6800 XT.*