How to enable MULTI_DEVICE_SAFE_MODE? #121
In vs-trt, how do I enable multi-device safe mode?
torch_tensorrt.runtime.multi_device_safe_mode
It was enabled by default in earlier versions, but since 10.2(?) it's disabled by default, which makes it impossible to use my Voltas together with my Turings.
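The setting being asked about is a Torch-TensorRT runtime flag (as the maintainer notes further down in the thread), not anything in vs-trt. A minimal sketch of toggling it there, assuming a recent Torch-TensorRT 2.x release that exposes set_multi_device_safe_mode:

```python
# Minimal sketch: multi_device_safe_mode is a Torch-TensorRT runtime setting,
# not a vs-trt option. Assumes a recent Torch-TensorRT 2.x release.
import torch_tensorrt

# Re-enable the per-inference device/CUDA-context consistency checks
# that newer releases turn off by default.
torch_tensorrt.runtime.set_multi_device_safe_mode(True)
```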
vs-trt does not rely on PyTorch at all, so I don't know what you mean by enabling it. You might be confusing it with HolyWu's plugins? Speaking of multi-device inference, you should be able to do that in vs-trt using static scheduling.
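A minimal sketch of what that static scheduling can look like with vs-mlrt's vsmlrt.py, assuming two GPUs (device 0 and 1), an arbitrary ONNX model path, and the vsmlrt.inference() helper with Backend.TRT(device_id=...); these details are illustrative assumptions, not the exact setup from this thread:

```python
# Minimal sketch of static multi-GPU scheduling with vs-mlrt's vsmlrt.py.
# Assumptions (not from the thread): two GPUs (device 0 and 1), an arbitrary
# ONNX model path, and vsmlrt.inference() with Backend.TRT(device_id=...).
import vapoursynth as vs
from vsmlrt import Backend, inference

core = vs.core

def multi_gpu_inference(clip: vs.VideoNode, onnx_path: str) -> vs.VideoNode:
    # Statically split the clip: even frames go to GPU 0, odd frames to GPU 1.
    halves = [core.std.SelectEvery(clip, cycle=2, offsets=i) for i in (0, 1)]
    # Run one independent TensorRT backend instance per device.
    outputs = [
        inference(half, onnx_path, backend=Backend.TRT(device_id=i))
        for i, half in enumerate(halves)
    ]
    # Interleave the two result streams back into the original frame order.
    return core.std.Interleave(outputs)
```

Because every frame is assigned to a device up front, each engine is built and run on a single GPU, so no cross-device checks are needed at runtime.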
When using multiple GPUs from different generations it "errors out" with:
MULTI_DEVICE_SAFE_MODE is a variable in TensorRT and, if true, it checks for CUDA compatibility every time it is called. Its default is … Have a look here:
So my question is: where do I set this for the VapourSynth plugin? I guess when building …
That's a very interesting error. Could you try to set the environment variable …?
I already tried that. Same error. Even when I change to the V100 on stream0 and the 2080 Ti on stream1, it says …
I believe the flag you mentioned is internal to pytorch-tensorrt only, but let me check it carefully. EDIT: yep, I believe it is only a pytorch-tensorrt warning that has nothing to do with the TensorRT library itself. Anyway, how did you produce the engines exactly? Streams have nothing to do with devices in this case.
Yes, that's true, but it shows me that the SM "level" is set to the first device being called, and it then errors out on the next device with a different SM "level".
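For context on that SM mismatch: the V100 is Volta (compute capability 7.0) while the 2080 Ti is Turing (7.5), and a serialized TensorRT engine targets one specific compute capability. A quick, illustrative way to see what each device reports (using PyTorch only for the query; this is not part of vs-trt):

```python
# Illustrative check (not part of vs-trt): list each GPU's compute capability.
# The V100 (Volta) reports SM 7.0 and the 2080 Ti (Turing) reports SM 7.5;
# a serialized TensorRT engine targets one specific SM version.
import torch

for i in range(torch.cuda.device_count()):
    major, minor = torch.cuda.get_device_capability(i)
    print(f"device {i}: {torch.cuda.get_device_name(i)} -> SM {major}.{minor}")
```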
Thanks.
For now I have set up a new environment with TensorRT 8.6.1 and CUDA 11.8 and it works "again". I don't know exactly in which version they set MULTI_DEVICE_SAFE_MODE to false by default, so I went back to the versions I know work.
That's probably because the Myelin optimiser is not that aggressive in older TensorRT. Also, Volta is not supported in TensorRT 10.5 and later.