This repository has been archived by the owner on Oct 25, 2023. It is now read-only.

[TUZ-6] Add a direct Onnx to Relax Importer #14

Merged: 39 commits, merged on Feb 17, 2023
Changes from 1 commit
Commits (39)
16bc26d  Initial importer and testing scaffolding. (Jan 18, 2023)
ed7ea1f  Implement matmul operator and tests. (Jan 18, 2023)
e9cafd7  Add a bunch of new operators. (Jan 19, 2023)
54a495a  Add new ops (Jan 25, 2023)
ba6ccf2  [Relax][ONNX] Add Multiple ONNX Frontend Support for Clip / Equal / S… (zxybazh, Jan 30, 2023)
99ed8d9  Fix initializer for CumSum. (#9) (zxybazh, Jan 31, 2023)
c492c54  Add Constant, Squeeze & Sub (#10) (zxybazh, Jan 31, 2023)
c15b219  Support reusing Relay ONNX operator convertors in the Relax ONNX fron… (Jan 31, 2023)
c59cfb1  Add more ops (including all Reduce ops) using the relay frontend (#11) (Jan 31, 2023)
7bd93bb  Add Split op in the Relax ONNX frontend (#12) (Feb 2, 2023)
10dd45b  Fix layer normalizations and Shape operator. (Feb 2, 2023)
909473d  Replace main loop with tvm testing. (Feb 2, 2023)
4205426  Simplify Slice for opset 13. (Feb 2, 2023)
cec2760  [Relax][ONNX] Implement pad op (gigiblender, Feb 2, 2023)
a6f756c  Incorporate pad op, add static constantofshape op. (Feb 2, 2023)
c49e2e1  Changes to shape to temporarily enable constantofshape in our models. (Feb 2, 2023)
68dae06  Add initial tensor_to_shape implementation. (Feb 3, 2023)
879b6c0  Implemented dynamic broadcast_to to support expand and constantofshape. (Feb 3, 2023)
3fae07b  Changes sufficient for vortex end to end run. (Feb 3, 2023)
370ed01  Formatting. (Feb 3, 2023)
359615d  Format tests. (Feb 3, 2023)
06f1114  Re-add broadcast_to shape checking. (Feb 3, 2023)
acf84b9  Fix formatting. (Feb 3, 2023)
1d594e4  Remove overly strict manipulate check. (Feb 3, 2023)
c3dfe1d  Fix typing (Feb 4, 2023)
03d840e  [Relax][Onnx] Implement Tile operator (gigiblender, Feb 3, 2023)
eca82df  Switch to native relax attention importer. (Feb 8, 2023)
bf9695c  Address some of the PR comments (gigiblender, Feb 9, 2023)
96331c4  Check for the imported model IR version (gigiblender, Feb 9, 2023)
853873e  switch from torch to numpy due to some incompatibility (sunggg, Feb 9, 2023)
b0d7cd4  Fix make format. (Feb 10, 2023)
609718b  Clean up typing issues. (Feb 10, 2023)
cb0156e  Clarify variable name. (Feb 10, 2023)
79de976  Remove unneeded comprehension. (Feb 10, 2023)
bf792ee  Remove circular dependency. (Feb 11, 2023)
84ccc8e  Add name sanitization for inputs (gigiblender, Feb 12, 2023)
20e3b92  Disable reshape rewrite pass until fixed. (Feb 15, 2023)
8ce334c  Fix long comment (Feb 16, 2023)
9f0e3b1  Update cpu image. (Feb 17, 2023)
Remove circular dependency.
Josh Fromm committed Feb 16, 2023
commit bf792ee1a6cbcd4a94d8097b3db91020e1a9da12
python/tvm/relax/frontend/onnx_frontend.py (7 changes: 3 additions & 4 deletions)
@@ -48,7 +48,6 @@
 from tvm.relax import testing, PyExprMutator
 from tvm.relay.expr import TupleWrapper, Var, GlobalVar
 from tvm.relay.frontend.onnx import OnnxOpConverter as RelayOnnxOpConverter
-from tvm.script import relax as R


 def new_var(var_name: str, shape: List, dtype: str = "float32"):
@@ -556,7 +555,7 @@ def _impl_v9(cls, bb, inputs, attr):
         shape_vars = []
         for i in range(shape_ndim):
             shape_vars.append(tvm.tir.Var("x_%d" % i, "int64"))
-        bb.match_cast(shape_dataflow_var, R.Shape(shape_vars))
+        bb.match_cast(shape_dataflow_var, relax.ShapeStructInfo(shape_vars))
         return bb.normalize(relax.op.broadcast_to(const_value, relax.ShapeExpr(shape_vars)))


@@ -706,7 +705,7 @@ def _impl_v13(cls, bb, inputs, attr):
         shape_vars = []
         for i in range(shape_ndim):
             shape_vars.append(tvm.tir.Var("x_%d" % i, "int64"))
-        bb.match_cast(shape_dataflow_var, R.Shape(shape_vars))
+        bb.match_cast(shape_dataflow_var, relax.ShapeStructInfo(shape_vars))
         return bb.normalize(relax.op.broadcast_to(data, relax.ShapeExpr(shape_vars)))


@@ -987,7 +986,7 @@ def from_onnx(
         outputs = outputs[0] if len(outputs) == 1 else relax.Tuple(outputs)

         # Create a function from our output expression and all input variables.
-        param_list = [v for k, v in self._inputs.items()]
+        param_list = [v for k, v in self._inputs.items() if isinstance(v, relax.Var)]
         output_var = self.bb.emit_output(outputs)
         self.bb.emit_func_output(output_var, params=param_list)
         return self.bb.get()
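
For context on the recurring hunk: dropping `from tvm.script import relax as R` and constructing `relax.ShapeStructInfo(shape_vars)` directly means the frontend no longer imports `tvm.script`, which is presumably the edge that closed the import cycle named in the commit title. The sketch below reconstructs the same match_cast-then-broadcast_to pattern as a standalone program against the early-2023 unity-branch Relax API; the names `data` and `shape`, the rank of 2, and the BlockBuilder scaffolding are illustrative assumptions, not code from this PR.

# Illustrative sketch only, not code from this PR. Assumes the early-2023
# unity-branch Relax API (relax.BlockBuilder, relax.ShapeStructInfo).
from tvm import relax, tir

bb = relax.BlockBuilder()
shape_ndim = 2  # assumed rank, for illustration

# A tensor to broadcast and a shape whose concrete dimensions are unknown.
data = relax.Var("data", relax.TensorStructInfo([1], "float32"))
shape = relax.Var("shape", relax.ShapeStructInfo(ndim=shape_ndim))

with bb.function("main", [data, shape]):
    with bb.dataflow():
        # Fresh symbolic dims, mirroring the importer's "x_0", "x_1", ...
        shape_vars = [tir.Var("x_%d" % i, "int64") for i in range(shape_ndim)]
        # match_cast binds the runtime shape to the symbolic variables;
        # relax.ShapeStructInfo(shape_vars) replaces the removed R.Shape(...).
        bb.match_cast(shape, relax.ShapeStructInfo(shape_vars))
        out = bb.emit_output(relax.op.broadcast_to(data, relax.ShapeExpr(shape_vars)))
    bb.emit_func_output(out)

mod = bb.get()  # IRModule whose "main" performs the dynamic broadcast_to

The `param_list` change in the last hunk is independent of the import fix: filtering `self._inputs` down to its `relax.Var` entries keeps inputs that were materialized as constants (e.g. initializers) from being emitted as parameters of the generated function.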