Issue Description:
I'm currently using a local modification of modules_forge/bnb_installer.py in which I manually changed the bitsandbytes dependency from version 0.43.3 to 0.45.1 to support PyTorch 2.6.0+cu126, because the older bitsandbytes did not work with Flux. However, this change could break installations for users who are not on CUDA 12.6, so unconditionally bumping to the newer version is risky. Rather than requiring users to adjust the version by hand, it would be ideal if the installation process automatically selected the bitsandbytes version based on the CUDA version that PyTorch was compiled with.
Current Behavior:
The dependency is statically set to a specific version.
Manually changing the version to 0.45.1 locally supports CUDA 12.6, but it may break installations for those on earlier CUDA versions.
Desired Behavior:
If PyTorch is compiled with CUDA 12.6 (e.g., when torch.version.cuda starts with "12.6"), then the installation should automatically use bitsandbytes 0.45.1.
Otherwise, it should retain or revert to the older, compatible version (e.g., 0.43.3).
Proposed Implementation:
Integrate a dynamic check in the installation script that retrieves the CUDA version from PyTorch.
Based on the CUDA version, automatically select the appropriate bitsandbytes version.
Alternatively, if a dynamic update isn't feasible, updating the documentation to inform users about which bitsandbytes version to use based on their CUDA version would also be helpful.
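The dynamic check described above could look roughly like the following. This is a hypothetical sketch, not the actual code in modules_forge/bnb_installer.py; the function names are invented, and the version mapping simply mirrors the two versions discussed in this issue (0.43.3 as the known-compatible default, 0.45.1 for CUDA 12.6 builds).

```python
# Hypothetical sketch of a dynamic bitsandbytes version check.
# The version mapping below reflects only the versions discussed in this
# issue; a real implementation might maintain a broader compatibility table.
import subprocess
import sys


def select_bnb_version() -> str:
    """Pick a bitsandbytes version based on the CUDA version PyTorch was built with."""
    try:
        import torch
        cuda = torch.version.cuda  # e.g. "12.6"; None for CPU-only builds
    except ImportError:
        cuda = None
    if cuda is not None and cuda.startswith("12.6"):
        return "0.45.1"  # needed for PyTorch 2.6.0+cu126 per this issue
    return "0.43.3"  # older, known-compatible default


def install_bnb() -> None:
    """Install the selected bitsandbytes version via pip."""
    version = select_bnb_version()
    subprocess.check_call(
        [sys.executable, "-m", "pip", "install", f"bitsandbytes=={version}"]
    )
```

Since torch.version.cuda reports the CUDA version PyTorch was compiled against (not the system toolkit), this check works even when no system-wide CUDA install is present.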
This enhancement would ensure a smoother experience for all users, allowing those with CUDA 12.6 to run Flux without impacting users on other CUDA versions.
Do you have reason to think v0.45.1 (or latest .2) could cause compatibility issues with older CUDA?
I've been testing v0.45.2 with CUDA 12.1 and Torch 2.4, getting identical results and no issues. But, I am only sample size 1.