This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
Following #16279, another example of inconsistency between HybridBlock and Block:
```python
import mxnet as mx
from mxnet.gluon import HybridBlock

class Foo(HybridBlock):
    def hybrid_forward(self, F, a, b):
        return a + b

b1 = Foo(prefix='non_hybrid')
b2 = Foo(prefix='hybrid')
b2.hybridize()

print(b1(mx.nd.ones((10,)), mx.nd.ones((1,))))
print(b2(mx.nd.ones((10,)), mx.nd.ones((1,))))
```
An `MXNetError` is raised in the hybridized case:

```
MXNetError: MXNetError: Error in operator hybrid_plus0: [14:24:15] src/operator/numpy/linalg/../../tensor/../elemwise_op_common.h:135: Check failed: assign(&dattr, vec.at(i)): Incompatible attr in node hybridized_plus0 at 1-th input: expected [10], got [1]
```
I think this is because the symbol `__add__` uses `elemwise_add`, while the ndarray `__add__` takes the input shapes into account and applies broadcasting. Can we add broadcasting to symbols as well, so there is no need to differentiate between `elemwise_add` and `broadcast_add`?

Note that the same problem exists for the subtract, multiply, and divide operations, too.