export ONNX and onnxsim #18

Open
pcb9382 opened this issue May 24, 2022 · 0 comments
pcb9382 commented May 24, 2022

@zjjMaiMai
```python
import argparse

import torch

# Assumed import: PredictModel as defined in this repository's prediction script.
from predict import PredictModel

parser = argparse.ArgumentParser()
parser.add_argument("--images", nargs=2, required=False)
parser.add_argument("--model", type=str, default="HITNet_SF")
parser.add_argument("--ckpt", type=str, default="ckpt/hitnet_sf_finalpass.ckpt")
parser.add_argument("--width", type=int, default=None)
parser.add_argument("--output", default="./")
args = parser.parse_args()

# Build the model and load the pretrained checkpoint.
model = PredictModel(**vars(args))
model.eval()
ckpt = torch.load("ckpt/hitnet_sf_finalpass.ckpt")
model.load_state_dict(ckpt["state_dict"])
device = torch.device("cuda")
model = model.to(device)

# Name both stereo inputs and the single disparity output.
input_names = ["input0", "input1"]
output_names = ["output_%d" % i for i in range(1)]
print(output_names)

# Dummy left/right images used for tracing the graph.
left = torch.randn(1, 3, 480, 640).to(device)
right = torch.randn(1, 3, 480, 640).to(device)

export_onnx_file = "./HITNet_SF_5.onnx"
torch.onnx.export(
    model,
    args=(left, right),
    f=export_onnx_file,
    verbose=False,
    input_names=input_names,
    output_names=output_names,
    export_params=True,
    opset_version=11,
)
```
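Not part of the script above, but as a quick sanity check on the exported file, something like the following can confirm the graph is structurally valid and that the input/output names came through before running the simplifier (a minimal sketch; it only assumes the `HITNet_SF_5.onnx` file produced above):

```python
import onnx

# Load the exported model and run ONNX's structural checker.
model_proto = onnx.load("./HITNet_SF_5.onnx")
onnx.checker.check_model(model_proto)

# Print graph inputs/outputs and the opset actually recorded in the file.
print([i.name for i in model_proto.graph.input])
print([o.name for o in model_proto.graph.output])
print("opset:", model_proto.opset_import[0].version)
```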

With the code above I can export an ONNX model of about 1.9 MB. I then try to simplify it with:

`python3 -m onnxsim HITNet_SF_5.onnx HITNet_SF_5-smi.onnx`
[screenshot: 飞书20220524-100301]
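For reference, the same simplification can also be driven from Python via onnx-simplifier's `simplify` API, which sometimes gives a fuller traceback than the CLI (a sketch, assuming the same `onnxsim` package that provides the command above):

```python
import onnx
from onnxsim import simplify

model_proto = onnx.load("./HITNet_SF_5.onnx")

# simplify() returns the simplified model plus a flag indicating whether the
# simplified graph numerically matched the original on random inputs.
model_simp, check = simplify(model_proto)
assert check, "simplified ONNX model could not be validated"

onnx.save(model_simp, "./HITNet_SF_5-smi.onnx")
```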

When I try to simplify the ONNX model, the following error occurs. After investigation, it appears the Max operation is what breaks the optimization:
[screenshot: 飞书20220524-100518]
How can I solve this? Thank you!
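To narrow down whether the Max node is broken in the exported graph itself or only trips up the simplifier's optimization passes, the unsimplified model can be run directly with ONNX Runtime (a sketch, assuming the 1x3x480x640 input shapes used at export time):

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("./HITNet_SF_5.onnx", providers=["CPUExecutionProvider"])

# Feed random stereo inputs matching the export-time shapes.
feeds = {
    inp.name: np.random.randn(1, 3, 480, 640).astype(np.float32)
    for inp in sess.get_inputs()
}
outputs = sess.run(None, feeds)
print([o.shape for o in outputs])
```

If this runs cleanly, the export itself is fine and the failure is specific to onnxsim's handling of the Max op.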
