
CUDA 12 required? #783

Closed
sidharthrajaram opened this issue Apr 8, 2024 · 4 comments

@sidharthrajaram
Contributor

I'm using the CUDA Docker image (tag: nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu22.04) recommended in the README, but I'm encountering the following error with faster-whisper (version 1.0.1):

File "/usr/local/lib/python3.10/dist-packages/faster_whisper/transcribe.py", line 344, in transcribe
    encoder_output = self.encode(segment)
  File "/usr/local/lib/python3.10/dist-packages/faster_whisper/transcribe.py", line 762, in encode
    return self.model.encode(features, to_cpu=to_cpu)
RuntimeError: Library libcublas.so.12 is not found or cannot be loaded

Does this mean that cuBLAS for CUDA 12 is required?

@trungkienbkhn
Collaborator

@sidharthrajaram, hello. FW has supported CUDA 12 since version 1.0.0, so you should update your CUDA image, for example to nvidia/cuda:12.0.0-runtime-ubuntu20.04. For a Dockerfile example, see this comment.
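
Roughly, such a Dockerfile could look like the sketch below. This is only a minimal illustration, not the exact file from the linked comment; depending on your setup, cuDNN 8 may also need to be present (e.g. a cudnn8 variant of the CUDA 12 image, mirroring the cudnn8 tag the README uses for CUDA 11):

FROM nvidia/cuda:12.0.0-runtime-ubuntu20.04

# Python and pip are not included in the bare runtime image
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# faster-whisper >= 1.0.0 pulls in ctranslate2 4.x, which loads libcublas.so.12
RUN pip3 install faster-whisper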

@sidharthrajaram
Contributor Author

Hi @trungkienbkhn, thanks for the note. Yes, I understand that CUDA 12 is supported, but does that mean that CUDA 11 is no longer supported? I followed the README's guidance for CUDA 11 (https://github.com/SYSTRAN/faster-whisper?tab=readme-ov-file#gpu), but ran into the error above.

However, as mentioned in this issue (#717 (comment)), simply downgrading FW resolved the error.
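
For reference, the downgrade route is roughly the following; the pin is only illustrative (see #717 for the exact versions people used), assuming pre-1.0 releases still resolve to a ctranslate2 3.x build:

pip install "faster-whisper<1.0"   # pre-1.0 releases depend on ctranslate2 3.x, which still ships CUDA 11 builds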

@Purfview
Contributor

Purfview commented Apr 9, 2024

I understand that CUDA 12 is supported, but does that mean that CUDA 11 is not supported anymore?

The latest versions of ctranslate2 are CUDA 12 only; CUDA 11 support would be nice.

At the moment, for CUDA 11 you need to downgrade it:
pip install --force-reinstall ctranslate2==3.24.0

@sidharthrajaram
Contributor Author

Ah, got it @Purfview. Thanks!
