
Onnx multi output #13390

Merged
merged 4 commits into apache:master on Nov 26, 2018

Conversation

safrooze (Contributor)

Description

ONNX export does not create a correct graph when a model has multiple outputs. This PR fixes that bug.

Checklist

Essentials

Please feel free to remove inapplicable items for your PR.

  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage:
  • Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
  • Code is well-documented:
  • To the best of my knowledge, examples are either not affected by this change or have been fixed to be compatible with it

Changes

  • Fixed ONNX export to handle graphs with multiple outputs.
  • Added unit tests for ONNX export covering both single-output and multi-output graphs

Comments

Before this change, export_onnx treated the graph as a sequential graph, with the output assumed to be the last node in the sequence. This change fixes that behavior by using sym.list_outputs() to record all output nodes.
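For context, here is a minimal sketch (the layer names and sizes are hypothetical, not taken from this PR) of what sym.list_outputs() reports for a grouped, multi-output symbol; the fixed export walks this list instead of assuming the last node is the only output:

```python
import mxnet as mx

# Hypothetical two-headed network sharing one input.
data = mx.sym.Variable('data')
head1 = mx.sym.FullyConnected(data, num_hidden=10, name='head1')
head2 = mx.sym.FullyConnected(data, num_hidden=5, name='head2')
sym = mx.sym.Group([head1, head2])

# list_outputs() names every output node of the graph, not just the last one,
# so the exporter can emit one ONNX graph output per entry.
print(sym.list_outputs())  # ['head1_output', 'head2_output']
```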

@safrooze safrooze requested a review from szha as a code owner November 24, 2018 05:32
safrooze (Contributor, PR author)

@Roshrini Please review.

Resolved review threads (outdated):
  • tests/python/unittest/test_onnx.py
  • python/mxnet/contrib/onnx/mx2onnx/export_onnx.py (3 threads)
anirudhacharya (Member)

Thanks for doing this.

stu1130 (Contributor) commented on Nov 25, 2018

@mxnet-label-bot add [pr-awaiting-response]
Thanks for your contribution @safrooze

@marcoabreu marcoabreu added the pr-awaiting-response label on Nov 25, 2018
- Removed unnecessary forward_pass() call
- Modified infer_output_shape to return the shapes of all outputs, along with their names (see the sketch below)
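A rough sketch of the shape-inference idea from that second commit (illustrative only: infer_output_shapes is a stand-in name and a single input variable named 'data' is assumed, so this is not the exact helper in export_onnx.py):

```python
import mxnet as mx

def infer_output_shapes(sym, in_shape):
    """Return (output_names, output_shapes) for every output of `sym`.

    Illustrative stand-in for the PR's helper: assumes one input variable
    named 'data' with shape `in_shape`.
    """
    # Symbol.infer_shape returns (arg_shapes, out_shapes, aux_shapes);
    # out_shapes holds one entry per name in sym.list_outputs().
    _, out_shapes, _ = sym.infer_shape(data=in_shape)
    return sym.list_outputs(), out_shapes

# Example with a two-output symbol and batch size 32:
data = mx.sym.Variable('data')
sym = mx.sym.Group([mx.sym.FullyConnected(data, num_hidden=10, name='a'),
                    mx.sym.FullyConnected(data, num_hidden=5, name='b')])
print(infer_output_shapes(sym, (32, 100)))
# (['a_output', 'b_output'], [(32, 10), (32, 5)])
```
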
safrooze (Contributor, PR author)

@anirudhacharya all issues addressed. Please review ASAP; I'd like this to be part of tonight's nightly if possible.

anirudhacharya (Member) left a comment:

Minor comments/questions; mostly LGTM.

safrooze (Contributor, PR author)

@mxnet-label-bot remove [pr-awaiting-response]

@marcoabreu marcoabreu removed the pr-awaiting-response label on Nov 26, 2018
ThomasDelteil (Contributor) left a comment:

Spoke with Sina offline. Let's make sure that the infer-shape solution works with networks that have loss layers; otherwise LGTM.

@ThomasDelteil ThomasDelteil merged commit cb627bc into apache:master Nov 26, 2018
safrooze (Contributor, PR author)

The current implementation for handling training labels that are part of the symbolic graph is very broken. There are multiple issues with it:

  1. It assumes that all layers that depend on a label are 'Output' layers and hence don't need the label during the forward pass. This isn't necessarily true if someone implements an actual loss as part of the forward pass.
  2. It assumes there is a single label and that it is named in a very specific way, which can easily be incorrect (both assumptions are illustrated in the sketch below).

My PR does not change this fragile behavior, but it needs to be fixed. I created #13407 to address this issue.
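A minimal, hypothetical symbol (not taken from this PR or MXNet's tests) where both assumptions break: the loss is computed in the forward pass itself, and the label variable does not follow the usual '*_label' naming convention, so a heuristic that filters inputs by name or treats only 'Output' layers as label consumers misclassifies it:

```python
import mxnet as mx

data = mx.sym.Variable('data')
# (2) A "label" that does not follow the '*_label' naming convention.
target = mx.sym.Variable('reconstruction_target')

pred = mx.sym.FullyConnected(data, num_hidden=10, name='decoder')
# (1) The loss is part of the forward computation, not an 'Output' layer,
# so the graph genuinely needs `target` during the forward pass.
loss = mx.sym.MakeLoss(mx.sym.mean(mx.sym.square(pred - target)))

# Nothing in the argument list marks which input is a label:
print(loss.list_arguments())
# e.g. ['data', 'decoder_weight', 'decoder_bias', 'reconstruction_target']
```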
