Support reusing Relay ONNX operator convertors in the Relax ONNX frontend #8
Conversation
We should aim for a more explicit commit message to detail exactly what support has been added.
```python
op_function = convert_class.get_converter(opset)
# If the op_function is a subclass of RelayOnnxOpConverter then it is a relay op.
if (convert_class.__bases__[0] == RelayOnnxOpConverter) or (
    hasattr(convert_class.__bases__[0], "__bases__")
    and convert_class.__bases__[0].__bases__[0] == RelayOnnxOpConverter
):
```
this second condition is very opaque - what's it handling?
It was checking for second-degree subclassing. I replaced this with issubclass(convert_class, RelayOnnxOpConverter).
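A minimal, self-contained sketch of why issubclass is the better check here (the MatMul/QuantizedMatMul class names are illustrative, not from the frontend): issubclass walks the full inheritance chain, so direct and second-degree subclasses are covered by one condition.

```python
class RelayOnnxOpConverter:
    pass

class MatMul(RelayOnnxOpConverter):  # direct subclass
    pass

class QuantizedMatMul(MatMul):  # second-degree subclass
    pass

# issubclass handles both levels of inheritance.
assert issubclass(MatMul, RelayOnnxOpConverter)
assert issubclass(QuantizedMatMul, RelayOnnxOpConverter)

# The old __bases__[0] comparison misses the second-degree case,
# which is what the opaque second condition was patching over.
assert QuantizedMatMul.__bases__[0] is not RelayOnnxOpConverter
```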
python/tvm/topi/nn/dense.py
Outdated
```diff
-    assert int(in_dim) == int(
-        red_dim
-    ), "Inner dimensions of dense do not match. {in_dim} vs {red_dim}."
+    if isinstance(in_dim, tvm.tir.expr.IntImm) and isinstance(red_dim, tvm.tir.expr.IntImm):
+        assert int(in_dim) == int(red_dim), \
+            "Inner dimensions of dense do not match. {in_dim} vs {red_dim}."
```
we need to think about the correct way to handle these sorts of (non-additive) changes in our workflow. Ideally they'd go straight to apache/tvm, but we'd need to find a way to test the change there.
Force-pushed from 69af71a to 78a688d (compare)
Support reusing Relay ONNX operator convertors in the Relax ONNX frontend (#8)

* [WIP] Support using Relay ops in the Relax ONNX frontend
* [WIP] small fixes
* [WIP] Support dynamic matmul and reshape
* Address PR comments

Co-authored-by: Matthew Barrett <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>
Relax pretty printer initial prototype

* call into TVMScriptPrinter for PrimFuncs
* most round-trip tests pass
* address comments
* fix typo
* Initial importer and testing scaffolding.
* Implement matmul operator and tests.
* Add a bunch of new operators.
* Add new ops
* [Relax][Onnx] Implement Div, Sigmoid, Softmax, Transpose and Unsqueeze ops
* skip test_reshape
* [Relax][ONNX] Implement BiasGelu and Gelu ops
* [Relax][ONNX] Implement Where op
* [Relax][ONNX] Add Multiple ONNX Frontend Support for Clip / Equal / Shape / Not / Tanh (#3)
* Rebase w/ Equal, Not, Tanh, Sqrt, Relu, Clip, Conv, Pow, Erf.
* Fix cumsum but still needs work.
* Fix initializer for CumSum. (#9)
* Add Constant, Squeeze & Sub (#10)
* Add squeeze.
* Add Constant.
* Add sub.
* Support reusing Relay ONNX operator convertors in the Relax ONNX frontend (#8)
  * [WIP] Support using Relay ops in the Relax ONNX frontend
  * [WIP] small fixes
  * [WIP] Support dynamic matmul and reshape
  * Address PR comments
* Add more ops (including all Reduce ops) using the relay frontend (#11)
* [WIP] add more ops. Some fail at the moment
* skip some tests
* Remove duplicate tests for squeeze
* Add Split op in the Relax ONNX frontend (#12)
* [Relax][ONNX] Add Split op
* Remove tmp
* Fix layer normalizations and Shape operator.
* Replace main loop with tvm testing.
* Simplify Slice for opset 13.
* [Relax][ONNX] Implement pad op
* Incorporate pad op, add static constantofshape op.
* Changes to shape to temporarily enable constantofshape in our models.
* Add initial tensor_to_shape implementation.
* Implemented dynamic broadcast_to to support expand and constantofshape.
* Changes sufficient for vortex end to end run.
* Formatting.
* Format tests.
* Re-add broadcast_to shape checking.
* Fix formatting.
* Remove overly strict manipulate check.
* Fix typing
* [Relax][Onnx] Implement Tile operator
* Switch to native relax attention importer.
* Address some of the PR comments
* Check for the imported model IR version
* switch from torch to numpy due to some incompatibility
* Fix make format.
* Clean up typing issues.
* Clarify variable name.
* Remove unneeded comprehension.
* Remove circular dependency.
* Add name sanitization for inputs
* Disable reshape rewrite pass until fixed.
* Fix long comment
* Update cpu image.

Co-authored-by: Florin Blanaru <[email protected]>
Co-authored-by: Xiyou Zhou <[email protected]>
Co-authored-by: Matthew Barrett <[email protected]>
Co-authored-by: Michalis Papadimitriou <[email protected]>
Co-authored-by: Florin Blanaru <[email protected]>
Co-authored-by: sung <[email protected]>
This PR adds support for reusing the Relay ONNX operator converters in the Relax ONNX frontend.
This is achieved by calling the existing Relay ONNX converters and then running the Relay-to-Relax translator to obtain the equivalent Relax op. The translator's output (the VarBindings and the PrimFuncs) is then appended to the block builder currently used by the frontend.
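A rough sketch of that flow, assuming the relay_translator utility from this repo's testing module (the exact entry point and signature may differ):

```python
import tvm
from tvm import relay
from tvm.relax.testing import relay_translator  # assumed module path

# 1. Build the op with the existing Relay ONNX converter; here a plain
#    Relay dense op stands in for the converter's output.
x = relay.var("x", shape=(1, 32), dtype="float32")
w = relay.var("w", shape=(64, 32), dtype="float32")
mod = tvm.IRModule.from_expr(relay.nn.dense(x, w))

# 2. Translate Relay to Relax; the resulting VarBindings and PrimFuncs
#    are what the frontend appends to its current BlockBuilder.
relax_mod = relay_translator.from_relay(mod["main"], target="llvm")
```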
Dynamic input shapes should work -- this PR includes dynamic examples with matmul and reshape.
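A hedged sketch of importing an ONNX matmul with a symbolic input dimension; the from_onnx entry point is an assumed import path for this frontend and may differ from the actual module layout.

```python
import onnx
from onnx import TensorProto, helper
from tvm.relax.frontend.onnx import from_onnx  # assumed module path

# MatMul whose first input has a symbolic batch dimension "n".
node = helper.make_node("MatMul", ["a", "b"], ["c"])
graph = helper.make_graph(
    [node],
    "dyn_matmul",
    inputs=[
        helper.make_tensor_value_info("a", TensorProto.FLOAT, ["n", 32]),
        helper.make_tensor_value_info("b", TensorProto.FLOAT, [32, 64]),
    ],
    outputs=[helper.make_tensor_value_info("c", TensorProto.FLOAT, ["n", 64])],
)
model = helper.make_model(graph)

# Import into Relax; the output shape stays symbolic as (n, 64).
mod = from_onnx(model)
```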
Co-authored-by: @mbaret and @mikepapadim