
[RELAY][FRONTEND][TENSORFLOW] Fix FuseBatchNorm output cast error if need_cast is True #4894

Merged · 1 commit · Feb 19, 2020

1 change: 1 addition & 0 deletions in python/tvm/relay/frontend/tensorflow.py

```diff
@@ -897,6 +897,7 @@ def _impl(inputs, attr, params):
                       disables=['momentum'])(inputs, attr)
 
         if need_cast:
+            out = _expr.TupleGetItem(out.astuple(), 0)
             out = _op.cast(out, dtype=attr['T'].name)
         return out
     return _impl
```

Conversation

Contributor:

Is this related to float16?

Contributor Author:

I converted a float16 TensorFlow model containing the FusedBatchNorm operator. For a float16 model, the output of the FusedBatchNorm layer is cast back to float16 (there may be other quantization types as well). But the FusedBatchNorm layer has three outputs, so passing it directly to the Cast layer causes an exception. This change therefore takes only the first output (the normalized tensor) and passes it to the Cast.
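
For illustration, here is a minimal standalone sketch (not part of the PR) of the failure mode and the fix, using the public tvm.relay Python API; the shapes and the float16 target dtype are invented for the example:

```python
from tvm import relay

# FusedBatchNorm is converted to relay.nn.batch_norm, which returns a
# 3-tuple: (normalized data, moving mean, moving variance).
data = relay.var("data", shape=(1, 16, 32, 32), dtype="float32")
gamma = relay.var("gamma", shape=(16,))
beta = relay.var("beta", shape=(16,))
mean = relay.var("mean", shape=(16,))
var = relay.var("var", shape=(16,))

bn = relay.nn.batch_norm(data, gamma, beta, mean, var)

# Casting the whole 3-tuple is what used to fail. Extracting element 0
# (the normalized tensor) first is what the added line does via
# _expr.TupleGetItem(out.astuple(), 0).
normalized = relay.TupleGetItem(bn.astuple(), 0)
out = relay.cast(normalized, dtype="float16")
```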

Contributor:

That's true. I've seen this recently as well, and the change above should fix it.
