[Bug]: module 'torch_npu' has no attribute '_npu_flash_attention' #211
@pjgao Sorry for the confusion. Currently, the doc link points to the latest release version.

TLDR: I will close this issue today.
Your current environment
The output of `python collect_env.py`
🐛 Describe the bug
Running the script below raises an error:

Log:

Root cause:
After PR #187 was merged, the torch_npu dependency was bumped from 2.5.1.dev20250218 to 2.5.1.dev20250226, but the README and the installation guide at https://vllm-ascend.readthedocs.io/en/latest/installation.html were not updated to match. Following those instructions installs 2.5.1.dev20250218, which fails with the error above.
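As a quick sanity check, the snippet below verifies that the installed torch_npu build exposes the op before launching vllm-ascend. This is a sketch, not part of the project: it assumes `torch_npu.__version__` is exposed and that the op name matches the error message above.

```python
# Sketch: confirm the installed torch_npu build provides the fused
# attention kernel that vllm-ascend calls. The attribute name is taken
# from the error in this issue; the required dev build is from PR #187.
import torch_npu

# Assumption: torch_npu exposes __version__ like torch does.
print("torch_npu version:", torch_npu.__version__)

if not hasattr(torch_npu, "_npu_flash_attention"):
    raise RuntimeError(
        "torch_npu build is too old: '_npu_flash_attention' is missing. "
        "Install the 2.5.1.dev20250226 build referenced in PR #187."
    )
print("OK: _npu_flash_attention is available.")
```

If the check fails, reinstalling the newer torch_npu dev build referenced in PR #187 should resolve the error.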