[Bug] Type inference of nn.softmax does not reject invalid axis #11684
Comments
I didn't reproduce any error on my Linux platform.

Test code:

```python
x = relay.var('x', shape=(4, ))
```

Output: |
@lileidev Thanks for your participation in this issue. This test case is invalid, so an error is actually expected. However, type inference in Relay does not report any error. This is somewhat different from common bug issues, where we typically expect no errors. To actually reproduce an error, you can try the following code:

```python
from tvm import relay, transform, IRModule

x = relay.var('x', shape=(4,))
y = relay.nn.softmax(x, axis=1)
mod = IRModule.from_expr(y)
mod = relay.transform.InferType()(mod)
with transform.PassContext(opt_level=0):
    lib = relay.build(mod, target='llvm')
```

And you may see the following error report:

The cause of this error is the invalid `axis`.
Expected behavior

The following Relay program should NOT pass type inference:

The input tensor `%x` of `nn.softmax` has only one dimension. The valid range of `axis` is [-1, 1). `axis=1` is obviously invalid in this case.

Actual behavior

This program passes type inference of Relay.
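For a rank-`ndim` input, the valid axis range described above generalizes to [-ndim, ndim). A minimal pure-Python sketch of that range check (independent of TVM; the function name is illustrative):

```python
def is_valid_softmax_axis(axis: int, ndim: int) -> bool:
    # A softmax axis is valid iff -ndim <= axis < ndim,
    # i.e. it lies in the half-open range [-ndim, ndim).
    return -ndim <= axis < ndim

# The 1-D tensor %x from this issue has ndim == 1:
print(is_valid_softmax_axis(0, 1))   # True: the only dimension
print(is_valid_softmax_axis(-1, 1))  # True: -1 refers to the last dimension
print(is_valid_softmax_axis(1, 1))   # False: out of range, should be rejected
```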
Environment

macOS 12.4. Compiled using Clang 13.1.6 with LLVM support. TVM commit 0df6961.

Steps to reproduce
Possible fix

tvm/src/relay/op/nn/nn.cc, lines 409 to 423 in 0df6961

In the operator registration of `nn.softmax`, its type relation is set to `IdentityRel`. However, `nn.softmax` has an `axis` attribute that is not checked by `IdentityRel`. A possible fix is to implement a new type relation function that checks the `axis` attribute in `SoftmaxAttrs`. The same type relation also applies to `nn.fast_softmax` and `nn.log_softmax`.
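As a rough sketch of what such a type relation would do (this is not the actual TVM C++ code; `softmax_rel_check` and its signature are hypothetical, modeled in Python for illustration), the relation would validate `axis` against the input rank before returning the identity output type:

```python
def softmax_rel_check(in_shape, axis):
    # Hypothetical Python model of a SoftmaxRel-style type relation:
    # validate that axis lies in [-ndim, ndim) before treating the
    # output type as identical to the input type (as IdentityRel does).
    ndim = len(in_shape)
    if not (-ndim <= axis < ndim):
        raise ValueError(
            f"softmax: axis {axis} is out of range for input of rank {ndim}"
        )
    return tuple(in_shape)  # output shape equals input shape

# The invalid case from this issue: shape (4,) with axis=1 is rejected.
try:
    softmax_rel_check((4,), 1)
except ValueError as e:
    print("rejected:", e)
```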