[Bugfix] Fix unsupported FA version check for Turing GPU (vllm-project#12828)

Signed-off-by: Felix Marty <[email protected]>
Isotr0py authored and fxmarty-amd committed Feb 7, 2025
1 parent c279d6f commit 4588bac
Showing 1 changed file with 1 addition and 1 deletion.
vllm/attention/backends/utils.py (2 changes: 1 addition & 1 deletion)

@@ -612,5 +612,5 @@ def flash_attn_version():
         return fa_version

     VLLM_FLASH_ATTN_VERSION = flash_attn_version()
-except ImportError:
+except (ImportError, AssertionError):
     VLLM_FLASH_ATTN_VERSION = None
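The fix widens the exception handler around the version probe: hardware checks inside the probe can fail with an `AssertionError` (e.g. on a Turing GPU), not only an `ImportError` when flash-attn is missing. A minimal sketch of this fallback pattern, using hypothetical helper names (`probe_fa_version`, `resolve_fa_version`) that stand in for vllm's actual probe:

```python
def probe_fa_version(available: bool, supported: bool) -> int:
    """Stand-in for a flash-attn version probe (hypothetical helper)."""
    if not available:
        # Library not installed at all.
        raise ImportError("flash-attn is not installed")
    # Hardware-support checks inside a probe may use plain asserts,
    # which raise AssertionError rather than ImportError.
    assert supported, "FA version not supported on this GPU"
    return 2


def resolve_fa_version(available: bool, supported: bool):
    try:
        return probe_fa_version(available, supported)
    except (ImportError, AssertionError):
        # Catching only ImportError would let the AssertionError from the
        # unsupported-GPU check propagate; fall back to None instead.
        return None


print(resolve_fa_version(True, True))    # 2
print(resolve_fa_version(False, True))   # None (not installed)
print(resolve_fa_version(True, False))   # None (unsupported GPU)
```

With the narrower `except ImportError:` of the original code, the third call would crash instead of degrading gracefully to `None`.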
