
Update Resize (opset 11) layer to support scales option when dims are… #2137

Merged · 3 commits · Mar 18, 2023

Conversation

@qant-um (Contributor) commented Mar 15, 2023

The Resize layer (opset 11) conversion automatically generates multiple ops (Shape, Slice, Concat), even though the op also directly supports the scales and sizes inputs for this operation. When the input shape is undefined (-1), that subgraph approach is needed, but otherwise inference is faster with a single Resize node. The existing backend unit tests already cover this use case.
reference: https://github.com/onnx/onnx/blob/main/docs/Changelog.md#Resize-11
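For reference, the Resize-11 spec linked above defines each output dimension as floor(input_dim * scale) when the scales input is used. A minimal sketch of that semantics (plain Python, not the tf2onnx implementation; the helper name is hypothetical):

```python
import math

# Sketch of ONNX Resize-11 semantics for the `scales` input (not the
# tf2onnx code): output_dim = floor(input_dim * scale), per dimension.
def resize_output_shape(input_shape, scales):
    """Output dims a single Resize node produces from constant scales."""
    return [math.floor(d * s) for d, s in zip(input_shape, scales)]
```

For example, a (1, 3, 4, 4) input with scales (1, 1, 2, 2) yields (1, 3, 8, 8); when the input dims are known, no Shape/Slice/Concat subgraph is needed to compute this.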

@qant-um qant-um force-pushed the op_resize_opset11_optimized branch from c42652a to d821b11 Compare March 15, 2023 21:28
@qant-um qant-um force-pushed the op_resize_opset11_optimized branch from d821b11 to 87ac305 Compare March 15, 2023 21:42
concat_shape.output[0]
]
shape = ctx.get_shape(node.input[0])
if shape and shape[2] != -1 and shape[1] != -1 and node.inputs[1].is_const():
Collaborator:

The shape comes from the input of Resize. Is it possible the rank is less than 3, so that shape[2] doesn't exist?

@qant-um (Contributor, author):

Hi,
This line is a copy of a condition from version 10... but you are right: if that happens, there will be an issue.
From my understanding, shape is (B, C, H, W), so you mean the graph could have a different rank. A condition should check that shape has 4 dims before entering this instruction block, for example:
if shape and len(shape) == 4 and shape[2] != -1 and shape[1] != -1 and node.inputs[1].is_const()
or "playing" with the array indexes:
if shape and shape[-2] != -1 and shape[-1] != -1 and node.inputs[1].is_const()

Thanks for your review again,
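The suggested guard can be sketched as a standalone predicate. This is a hypothetical helper, not the tf2onnx source: `shape` stands in for `ctx.get_shape(node.input[0])` (a list of ints with -1 marking unknown dims, NCHW expected), and `scales_is_const` stands in for `node.inputs[1].is_const()`:

```python
# Hypothetical sketch of the proposed rank guard (not tf2onnx code).
# `shape`: list of ints, -1 for unknown dims, expected layout NCHW.
# `scales_is_const`: stand-in for node.inputs[1].is_const().
def can_fold_to_single_resize(shape, scales_is_const):
    if not shape or len(shape) != 4:
        return False  # indexing shape[2] on a lower-rank shape would fail
    return shape[1] != -1 and shape[2] != -1 and scales_is_const
```

The explicit len(shape) == 4 check fails closed on lower-rank inputs instead of raising an IndexError inside the conversion.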

Collaborator:

Sorry, my mistake. Such a check is not necessary, since this shape is required by these 3 ops.

@qant-um (Contributor, author):

Do you mean this is OK for you? I do not know exactly how the process works. Do you expect an action from me, or will the merge be done by you once the tests have passed?

@fatcat-z fatcat-z enabled auto-merge (squash) March 17, 2023 08:12
@fatcat-z (Collaborator) left a comment:

Thanks for your contribution!

@fatcat-z fatcat-z merged commit 5708e10 into onnx:main Mar 18, 2023