Will webui support AMD graphics cards in the future? #287
It does support it, just replace:

with

and start.
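The two commands being swapped were lost in this archive of the thread. Judging from the tutorial later in the thread, the substitution is the torch install line; a minimal sketch, assuming the ROCm 5.1.1 wheel index that the tutorial uses (this is a reconstruction, not the original comment's text):

```shell
# Assumed reconstruction: override the default CUDA torch install
# with the ROCm wheel index via the TORCH_COMMAND variable.
export TORCH_COMMAND='pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/rocm5.1.1'
echo "$TORCH_COMMAND"
```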
Thank you for your answer! Where can I find that line?
Out of my scope; if someone wants to add a section to the README, please send a PR.
Forget ROCm; only a few cards support it anyway, so calling it 'AMD support' is quite the overstatement.
Works fine on my AMD GPU, but thanks for the advice.
Anyways, a short tutorial for those who want to run it on their AMD GPU (should work on Linux and Windows hosts, but I didn't test on Windows myself). Pull the latest rocm/pytorch Docker image, then execute the following inside the container:

```
cd /dockerx
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
cd stable-diffusion-webui
python -m venv venv
source venv/bin/activate
python -m pip install --upgrade pip wheel
TORCH_COMMAND='pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/rocm5.1.1' REQS_FILE='requirements.txt' python launch.py --precision full --no-half
```

Subsequent runs only require you to restart the container (find its name from the container listing), attach to it again, and execute the following inside it:

```
cd /dockerx/stable-diffusion-webui
# Optional: "git pull" to update the repository
source venv/bin/activate
TORCH_COMMAND='pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/rocm5.1.1' REQS_FILE='requirements.txt' python launch.py --precision full --no-half
```

Works without issues for me, and the performance is around an RTX 3090 for my RX 6900 XT (according to some graph I saw floating around).
@AUTOMATIC1111 If you want to, I think you can just copy and paste these instructions into the README for AMD users.
Can't we get this to work with pytorch-directml (on Windows)?
Look carefully at the photo; it shows the parameters of the video card and computer.
The Docker solution outlined above does NOT work on Windows. The GPU can't be passed through.
I can't get it to work on Linux, natively or in Docker. When running natively, it keeps complaining about NVIDIA, even though I followed the AMD wiki page.
You need to supply an environment variable, as by default ROCm doesn't recognize consumer cards as compatible.
Also, you can drop the "--precision full --no-half" flags now; they're no longer needed, and removing them makes things run much faster.
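The comment doesn't name the variable; for illustration, a sketch of the override commonly cited for RDNA 2 consumer cards is below. The variable name and value are an assumption based on common ROCm usage, and the right value depends on your GPU family:

```shell
# HSA_OVERRIDE_GFX_VERSION makes ROCm treat the card as a supported gfx
# target; 10.3.0 is the value commonly used for RDNA 2 (gfx103x) cards.
# Assumption for illustration -- not quoted from the comment above.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
echo "$HSA_OVERRIDE_GFX_VERSION"
```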
Having issues getting it started too; it looks like it still wants to use CUDA. Specifically:
Tried following this for Windows, but it didn't work for me. I pulled rocm/pytorch:latest, but when I run `docker run -it --network=host --device=/dev/kfd --device=/dev/dri --group-add=video --ipc=host --cap-add=SYS_PTRACE --security-opt seccomp=unconfined -v $HOME/dockerx:/dockerx rocm/pytorch` I get this. Any fix, or do I try a Linux VM?
As of Diffusers 0.6.0, the Diffusers ONNX pipeline supports Txt2Img, Img2Img, and Inpainting for AMD cards. Examples: https://gist.github.com/averad/256c507baa3dcc9464203dc14610d674 Would it be possible to include the ONNX pipeline now that Img2Img and Inpainting are working?
Just so folks have proper expectations: the ONNX pipeline on AMD on Windows, while better than just running on your CPU, still takes 2-3 times as long as the ROCm pipeline on Linux.
And it has less functionality.
Works out of the box without compiling ROCm for specific cards, installing Docker, or dual booting. Helps people like:
Doesn't work: `Error response from daemon: error gathering device information while adding custom device "/dev/kfd": no such file or directory.`
I had the same issue; fixed it by installing Docker Engine instead of Docker Desktop and running the command with sudo.
Reference: What is the difference between Docker Desktop for Linux and Docker Engine.
I just want to add this for documentation purposes: it works perfectly fine for me with the AMD MI25. However, you have to export TORCH_COMMAND beforehand; it does not work inline for me. I used the latest 5.4.2.
If you don't export it first, or export a wrong version, it will download the CUDA PyTorch libs and you get "GPU not found".
❤️ PS: I am not using Docker; I'm running it natively with ROCm, following the official guide on AMD's website. I am using ROCm 5.4.2 since PyTorch has support for it. Hope it helps, guys!
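A sketch of the export-before-launch pattern this comment describes. The rocm5.4.2 wheel index URL is an assumption matching the ROCm 5.4.2 version the commenter mentions, not something quoted in the comment:

```shell
# Export TORCH_COMMAND in the shell first; per the comment above, setting
# it inline on the launch line did not take effect for this user.
# The rocm5.4.2 index URL is an assumption, not quoted from the comment.
export TORCH_COMMAND='pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/rocm5.4.2'
echo "$TORCH_COMMAND"
# then, from inside the stable-diffusion-webui checkout: python launch.py
```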
Created a slightly more detailed guide for AMD ROCm, just validated on an MI210 and the latest release. You could give it a try.
Now I regret not buying an NVIDIA graphics card…