Hi,

I'm trying to convert an ONNX model from FP32 to FP16, but I'm hitting this message in the function `remove_unnecessary_cast_node`:

"The downstream node of the second cast node should be graph output" (onnxconverter_common/float16.py:557)

As far as I can tell, the message is raised because the downstream node of that second Cast node is None.
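Roughly, the conversion I'm running looks like this (the model paths are placeholders; I'm using the standard `convert_float_to_float16` entry point):

```python
# Minimal sketch of the FP32 -> FP16 conversion that triggers the error.
import onnx
from onnxconverter_common import float16

model = onnx.load("model_fp32.onnx")  # FP32 source model (placeholder path)

# This call fails inside remove_unnecessary_cast_node with the message above.
model_fp16 = float16.convert_float_to_float16(model)

onnx.save(model_fp16, "model_fp16.onnx")
```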
I'd still like to keep as much of the model as possible in FP16. Is there any other method I can try? Would using TensorRT instead of ONNX improve the situation?
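For example, I also came across the auto mixed precision helper in the same package. A rough sketch of how I understand it is called (the input name, shapes, and tolerances below are placeholders, and I haven't verified that it avoids this error):

```python
# Sketch of the auto mixed precision path in onnxconverter-common.
import numpy as np
import onnx
from onnxconverter_common import auto_mixed_precision

model = onnx.load("model_fp32.onnx")  # placeholder path

# Sample inputs used to decide which nodes can safely run in FP16.
feed = {"input": np.random.rand(1, 3, 224, 224).astype(np.float32)}

model_fp16 = auto_mixed_precision.auto_convert_mixed_precision(
    model, feed, rtol=0.01, atol=0.001, keep_io_types=True
)
onnx.save(model_fp16, "model_mixed_fp16.onnx")
```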
Thank you.
Macsim2 changed the title from "bug in converting fp32 to fp16" to "problem in converting fp32 to fp16" on Nov 18, 2024.