This repository has been archived by the owner on Oct 25, 2023. It is now read-only.

Support reusing Relay ONNX operator convertors in the Relax ONNX frontend #8

Merged

gigiblender merged 5 commits into TUZ-6 from TUZ-6-relay-onnx on Jan 31, 2023

Conversation

@gigiblender (Contributor) commented Jan 26, 2023

This PR adds support for reusing the Relay ONNX operator converters in the Relax ONNX frontend.

This is achieved by calling the Relay ONNX converters and then using the Relay-to-Relax translator to obtain the equivalent op. The output of the translator (the VarBindings and the PrimFuncs) is then appended to the block builder currently used by the frontend.
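For illustration, a minimal sketch of that flow under stated assumptions: convert_via_relay, relay_to_relax, and append_bindings are hypothetical stand-ins for the frontend internals, not its actual API (only get_converter appears in the real code):

# Illustrative sketch only -- relay_to_relax and append_bindings are
# hypothetical names for the translator and block-builder plumbing.
def convert_via_relay(block_builder, convert_class, inputs, attrs, opset):
    # 1. Reuse the existing Relay ONNX converter to build a Relay expression.
    relay_expr = convert_class.get_converter(opset)(inputs, attrs)
    # 2. Translate the Relay expression to Relax, yielding VarBindings and
    #    any TIR PrimFuncs produced by lowering.
    var_bindings, prim_funcs = relay_to_relax(relay_expr)
    # 3. Append both to the block builder the frontend is already using, so
    #    the converted op becomes part of the Relax function being built.
    append_bindings(block_builder, var_bindings, prim_funcs)
    # The last binding's var holds the converted op's output.
    return var_bindings[-1].var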

Dynamic input shapes should work -- this PR includes dynamic examples with (see the sketch after this list):

  • a matmul
  • a reshape
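
As a concrete illustration of such a dynamic case, a matmul with a symbolic leading dimension can be built with the onnx helper API (a sketch; the actual test helpers in this PR may differ):

import onnx
from onnx import TensorProto, helper

# MatMul whose leading dimension "n" is symbolic, i.e. dynamic.
node = helper.make_node("MatMul", ["a", "b"], ["c"])
graph = helper.make_graph(
    [node],
    "dynamic_matmul",
    inputs=[
        helper.make_tensor_value_info("a", TensorProto.FLOAT, ["n", 32]),
        helper.make_tensor_value_info("b", TensorProto.FLOAT, [32, 64]),
    ],
    outputs=[helper.make_tensor_value_info("c", TensorProto.FLOAT, ["n", 64])],
)
model = helper.make_model(graph)
onnx.checker.check_model(model)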

Co-authored-by: @mbaret and @mikepapadim

@gigiblender gigiblender self-assigned this Jan 26, 2023
@mbaret (Contributor) left a comment


We should aim for a more explicit commit message that details exactly what support has been added.

python/tvm/relax/frontend/onnx_frontend.py
op_function = convert_class.get_converter(opset)
# If the op_function is a subclass of RelayOnnxOpConverter then it is a Relay op.
if (convert_class.__bases__[0] == RelayOnnxOpConverter) or (
    hasattr(convert_class.__bases__[0], "__bases__")
    and convert_class.__bases__[0].__bases__[0] == RelayOnnxOpConverter
):
@mbaret (Contributor) commented:

this second condition is very opaque - what's it handling?

@gigiblender (Contributor Author) replied:

It was checking for second-degree subclassing. I replaced this with issubclass(convert_class, RelayOnnxOpConverter).
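
For illustration, a standalone sketch of that simplification (the converter subclasses here are hypothetical):

class RelayOnnxOpConverter:
    """Marker base for converters that reuse Relay's ONNX importer."""

class MatMulRelay(RelayOnnxOpConverter):  # hypothetical direct subclass
    pass

class MatMulRelayV13(MatMulRelay):  # hypothetical second-degree subclass
    pass

# The original check only looked one or two levels up the class hierarchy:
def is_relay_op_manual(convert_class):
    base = convert_class.__bases__[0]
    return base == RelayOnnxOpConverter or (
        hasattr(base, "__bases__") and base.__bases__[0] == RelayOnnxOpConverter
    )

# issubclass() walks the full inheritance chain, covering any depth:
def is_relay_op(convert_class):
    return issubclass(convert_class, RelayOnnxOpConverter)

assert is_relay_op(MatMulRelay) and is_relay_op(MatMulRelayV13)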

Comment on lines 97 to 99
Before:
assert int(in_dim) == int(
    red_dim
), "Inner dimensions of dense do not match. {in_dim} vs {red_dim}."

After (guarded so that symbolic dimensions skip the check):
if isinstance(in_dim, tvm.tir.expr.IntImm) and isinstance(red_dim, tvm.tir.expr.IntImm):
    assert int(in_dim) == int(red_dim), \
        f"Inner dimensions of dense do not match. {in_dim} vs {red_dim}."

We need to think about the correct way to handle this sort of (non-additive) change in our workflow. Ideally it would go straight to apache/tvm, but we'd need to find a way to test the change there.

gigiblender and others added 4 commits January 31, 2023 10:17

Co-authored-by: Matthew Barrett <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>
@gigiblender gigiblender merged commit 866eaac into TUZ-6 Jan 31, 2023
@gigiblender gigiblender deleted the TUZ-6-relay-onnx branch January 31, 2023 08:48
jwfromm pushed a commit that referenced this pull request Feb 3, 2023
Support reusing Relay ONNX operator convertors in the Relax ONNX frontend (#8)

* [WIP] Support using Relay ops in the Relax ONNX frontend

Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>

* [WIP] small fixes

Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>

* [WIP] Support dynamic matmul and reshape

Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>

* Address PR comments

---------

Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>
jwfromm pushed a commit that referenced this pull request Feb 14, 2023
* Relax pretty printer initial prototype

* call into TVMScriptPrinter for PrimFuncs

* most round-trip tests pass

* address comments

* fix typo
jwfromm pushed a commit that referenced this pull request Feb 14, 2023
jwfromm pushed a commit that referenced this pull request Feb 15, 2023
jwfromm pushed a commit that referenced this pull request Feb 16, 2023
jwfromm pushed a commit that referenced this pull request Feb 17, 2023
* Initial importer and testing scaffolding.

* Implement matmul operator and tests.

* Add a bunch of new operators.

* Add new ops 

* [Relax][Onnx] Implement Div, Sigmoid, Softmax, Transpose and Unsqueeze ops

* skip test_reshape

* [Relax][ONNX] Implement BiasGelu and Gelu ops

* [Relax][ONNX] Implement Where op

* [Relax][ONNX] Add Multiple ONNX Frontend Support for Clip / Equal / Shape / Not / Tanh (#3)

* Rebase w/ Equal, Not, Tanh, Sqrt, Relu, Clip, Conv, Pow, Erf.

* Fix cumsum but still needs work.

* Fix initializer for CumSum. (#9)

* Add Constant, Squeeze & Sub (#10)

* Add squeeze.

* Add Constant.

* Add sub.

* Support reusing Relay ONNX operator convertors in the Relax ONNX frontend (#8)

* [WIP] Support using Relay ops in the Relax ONNX frontend

Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>

* [WIP] small fixes

Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>

* [WIP] Support dynamic matmul and reshape

Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>

* Address PR comments

---------

Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>

* Add more ops (including all Reduce ops) using the relay frontend (#11)

* [WIP] add more ops. Some fail at the moment

* skip some tests

* Remove duplicate tests for squeeze

* Add Split op in the Relax ONNX frontend (#12)

* [Relax][ONNX] Add Split op

* Remove tmp

* Fix layer normalizations and Shape operator.

* Replace main loop with tvm testing.

* Simplify Slice for opset 13.

* [Relax][ONNX] Implement pad op

* Incorporate pad op, add static constantofshape op.

* Changes to shape to temporarily enable constantofshape in our models.

* Add initial tensor_to_shape implementation.

* Implemented dynamic broadcast_to to support expand and constantofshape.

* Changes sufficient for vortex end to end run.

* Formatting.

* Format tests.

* Re-add broadcast_to shape checking.

* Fix formatting.

* Remove overly strict manipulate check.

* Fix typing

* [Relax][Onnx] Implement Tile operator

* Switch to native relax attention importer.

* Address some of the PR comments

* Check for the imported model IR version

* switch from torch to numpy due to some incompatibility

* Fix make format.

* Clean up typing issues.

* Clarify variable name.

* Remove unneeded comprehension.

* Remove circular dependency.

* Add name sanitization for inputs

* Disable reshape rewrite pass until fixed.

* Fix long comment

* Update cpu image.

---------

Co-authored-by: Florin Blanaru <[email protected]>
Co-authored-by: Xiyou Zhou <[email protected]>
Co-authored-by: Matthew Barrett  <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>
Co-authored-by: Florin Blanaru <[email protected]>
Co-authored-by: sung <[email protected]>
jwfromm pushed a commit that referenced this pull request Feb 22, 2023
jwfromm pushed a commit that referenced this pull request Feb 25, 2023
jwfromm pushed a commit that referenced this pull request Feb 28, 2023
vinx13 pushed a commit to vinx13/relax-octo that referenced this pull request Mar 29, 2023