add readme
commit ff96143ac5 (parent 2e8975c9a6)
README.md (new file, 27 lines)
@@ -0,0 +1,27 @@
# ai-suite-rocm
This is a simple project that makes it easy to host local LLM and AI tools on Linux with AMD GPUs using ROCm.
To use it, clone the repo and build the Docker image; the same image is used for all the services.
```bash
git clone https://github.com/M4TH1EU/ai-suite-rocm.git
cd ai-suite-rocm/
docker build . -t 'ai-suite-rocm:6.0' -f Dockerfile
```
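The containers need access to the host's ROCm devices, so it is worth checking those exist before starting anything. This quick host-side check is not part of the repo, just a sketch using the standard ROCm device paths and groups:

```bash
# The ROCm kernel devices the containers will be given access to
ls -l /dev/kfd /dev/dri

# Your user normally needs to be in the video (and, on newer distros, render) group
groups | grep -E 'video|render'
```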
Then you can start and stop whichever service you want using its respective start/stop script.
For example, you can start stablediffusion using:
```bash
# Start
./start_stablediffusion.sh
# Stop
./stop_stablediffusion.sh
```
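The start/stop scripts are thin wrappers around docker-compose, so the usual docker-compose commands work for inspecting a service. For example (these are standard docker-compose commands, not scripts shipped in the repo; stablediff-rocm is the service name used by the stop script):

```bash
# Show the state of the compose services
docker-compose ps

# Follow the logs of a running service
docker-compose logs -f stablediff-rocm
```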
If, like me, you prefer storing all your models and large files on another disk, have a look at the make_folders.sh script (it creates symlinks).
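The idea is simply one symlink per service pointing at wherever the models really live. A minimal sketch of adapting it to your own disk (the target path below is a placeholder, not something the repo defines):

```bash
# Same pattern as make_folders.sh, with the target adjusted to your own mount point
mkdir -p stablediffusion
ln -s /path/to/your/disk/SD_MODELS ./stablediffusion/models
```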
*This has been tested on Fedora 39 with kernel 6.7.4, using the latest Docker version, with an AMD RX 6800 XT.*
make_folders.sh
@@ -1,4 +1,4 @@
-mkdir -p stablediffusion koboldai llamacpp
+mkdir -p stablediffusion koboldai llamacpp koyhass xtts
ln -s '/mnt/DATA/SD_MODELS/' ./stablediffusion/models
ln -s '/mnt/DATA/LLM_MODELS/' ./koboldai/models
ln -s '/mnt/DATA/LLM_MODELS/' ./llamacpp/models
start_koyhass.sh (new executable file)
@@ -0,0 +1 @@
/usr/bin/docker-compose up -d koyhass-rocm
start_xtts.sh (new executable file)
@@ -0,0 +1 @@
/usr/bin/docker-compose up -d xtts-rocm
stop_koboldai.sh (new executable file)
@@ -0,0 +1 @@
/usr/bin/docker-compose down kobold-rocm
stop_koyhass.sh (new executable file)
@@ -0,0 +1 @@
/usr/bin/docker-compose down koyhass-rocm
stop_llamacpp.sh (new executable file)
@@ -0,0 +1 @@
/usr/bin/docker-compose down llamacpp-rocm
stop_stablediffusion.sh (new executable file)
@@ -0,0 +1 @@
/usr/bin/docker-compose down stablediff-rocm
stop_xtts.sh (new executable file)
@@ -0,0 +1 @@
/usr/bin/docker-compose down xtts-rocm