
[Numpy] expand_dims throws delay_alloc error #16850

Closed
stu1130 opened this issue Nov 19, 2019 · 5 comments · Fixed by #16856

Comments

@stu1130
Contributor

stu1130 commented Nov 19, 2019

MXNet version: installed via pip install mxnet-mkl --pre as of today

>>> import mxnet
>>> from mxnet import np, npx
>>> npx.set_np()
>>> a = np.array([]).reshape(2, 1, 0)
>>> np.expand_dims(a, 2).wait_to_read()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/leecheng/anaconda3/lib/python3.7/site-packages/mxnet/ndarray/ndarray.py", line 2350, in wait_to_read
    check_call(_LIB.MXNDArrayWaitToRead(self.handle))
  File "/Users/leecheng/anaconda3/lib/python3.7/site-packages/mxnet/base.py", line 255, in check_call
    raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: [23:11:49] src/ndarray/ndarray.cc:502: Check failed: delay_alloc:
Stack trace:
  [bt] (0) 1   libmxnet.so                         0x000000011c7cc029 mxnet::op::MKLDNNLeakyReluBackward(nnvm::NodeAttrs const&, mxnet::OpContext const&, std::__1::vector<mxnet::NDArray, std::__1::allocator<mxnet::NDArray> > const&, mxnet::OpReqType const&, mxnet::NDArray const&) + 4265
  [bt] (1) 2   libmxnet.so                         0x000000011ea457c4 mxnet::NDArray::Chunk::SetMKLMem(mxnet::TShape const&, int) + 1668
  [bt] (2) 3   libmxnet.so                         0x000000011ea45cc0 mxnet::NDArray::GetMKLDNNData() const + 560
  [bt] (3) 4   libmxnet.so                         0x000000011c80e941 mxnet::op::MKLDNNReshapeFwd::MKLDNNReshapeFwd(mxnet::OpReqType const&, mxnet::NDArray const&, mxnet::NDArray const&) + 225
  [bt] (4) 5   libmxnet.so                         0x000000011c80fe65 mxnet::op::GetReshapeForward(mxnet::OpReqType const&, mxnet::NDArray const&, mxnet::NDArray const&) + 245
  [bt] (5) 6   libmxnet.so                         0x000000011c810cd7 mxnet::op::MKLDNNReshapeForward(nnvm::NodeAttrs const&, mxnet::OpContext const&, mxnet::NDArray const&, mxnet::OpReqType const&, mxnet::NDArray const&) + 951
  [bt] (6) 7   libmxnet.so                         0x000000011e969daa mxnet::imperative::PushFComputeEx(std::__1::function<void (nnvm::NodeAttrs const&, mxnet::OpContext const&, std::__1::vector<mxnet::NDArray, std::__1::allocator<mxnet::NDArray> > const&, std::__1::vector<mxnet::OpReqType, std::__1::allocator<mxnet::OpReqType> > const&, std::__1::vector<mxnet::NDArray, std::__1::allocator<mxnet::NDArray> > const&)> const&, nnvm::Op const*, nnvm::NodeAttrs const&, mxnet::Context const&, std::__1::vector<mxnet::engine::Var*, std::__1::allocator<mxnet::engine::Var*> > const&, std::__1::vector<mxnet::engine::Var*, std::__1::allocator<mxnet::engine::Var*> > const&, std::__1::vector<mxnet::Resource, std::__1::allocator<mxnet::Resource> > const&, std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, std::__1::vector<mxnet::OpReqType, std::__1::allocator<mxnet::OpReqType> > const&)::'lambda'(mxnet::RunContext)::operator()(mxnet::RunContext) const + 490
  [bt] (7) 8   libmxnet.so                         0x000000011e96a7ad std::__1::__function::__func<mxnet::imperative::PushFComputeEx(std::__1::function<void (nnvm::NodeAttrs const&, mxnet::OpContext const&, std::__1::vector<mxnet::NDArray, std::__1::allocator<mxnet::NDArray> > const&, std::__1::vector<mxnet::OpReqType, std::__1::allocator<mxnet::OpReqType> > const&, std::__1::vector<mxnet::NDArray, std::__1::allocator<mxnet::NDArray> > const&)> const&, nnvm::Op const*, nnvm::NodeAttrs const&, mxnet::Context const&, std::__1::vector<mxnet::engine::Var*, std::__1::allocator<mxnet::engine::Var*> > const&, std::__1::vector<mxnet::engine::Var*, std::__1::allocator<mxnet::engine::Var*> > const&, std::__1::vector<mxnet::Resource, std::__1::allocator<mxnet::Resource> > const&, std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, std::__1::vector<mxnet::OpReqType, std::__1::allocator<mxnet::OpReqType> > const&)::'lambda'(mxnet::RunContext), std::__1::allocator<mxnet::imperative::PushFComputeEx(std::__1::function<void (nnvm::NodeAttrs const&, mxnet::OpContext const&, std::__1::vector<mxnet::NDArray, std::__1::allocator<mxnet::NDArray> > const&, std::__1::vector<mxnet::OpReqType, std::__1::allocator<mxnet::OpReqType> > const&, std::__1::vector<mxnet::NDArray, std::__1::allocator<mxnet::NDArray> > const&)> const&, nnvm::Op const*, nnvm::NodeAttrs const&, mxnet::Context const&, std::__1::vector<mxnet::engine::Var*, std::__1::allocator<mxnet::engine::Var*> > const&, std::__1::vector<mxnet::engine::Var*, std::__1::allocator<mxnet::engine::Var*> > const&, std::__1::vector<mxnet::Resource, std::__1::allocator<mxnet::Resource> > const&, std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, std::__1::vector<mxnet::NDArray*, std::__1::allocator<mxnet::NDArray*> > const&, std::__1::vector<mxnet::OpReqType, std::__1::allocator<mxnet::OpReqType> > const&)::'lambda'(mxnet::RunContext)>, void (mxnet::RunContext)>::operator()(mxnet::RunContext&&) + 29
  [bt] (8) 9   libmxnet.so                         0x000000011e8de6d7 dmlc::ThreadLocalStore<mxnet::engine::ThreadedEngine::BulkStatus>::Get() + 16263
@reminisce
Contributor

ExpandDimEx needs to skip zero-size cases. I will send in a PR for the fix.
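
To make the condition concrete, here is an illustrative sketch of the check being described (the actual guard belongs in the C++ ExpandDimEx implementation, so this Python version is only for illustration):

>>> def is_zero_size(shape):
...     # an array is zero-size when any of its dimensions is 0, i.e. it holds no elements
...     return 0 in shape
...
>>> is_zero_size((2, 1, 0))   # shape of the failing input above: nothing to compute, skip the MKL-DNN reshape
True
>>> is_zero_size((2, 1, 3))   # non-empty array: take the normal expand_dims path
False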

reminisce self-assigned this Nov 19, 2019
@TaoLv
Member

TaoLv commented Nov 19, 2019

The MKL-DNN path has been fixed by #16837.

@reminisce
Contributor

@TaoLv #16837 only deals with the zero-dim case, while here it's a zero-size array, which is different.
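
For readers following along, a quick illustration of the difference using the same mxnet.numpy frontend as the reproduction above (not taken from either PR):

>>> from mxnet import np, npx
>>> npx.set_np()
>>> np.array(1.0).ndim, np.array(1.0).size   # zero-dim: a scalar has ndim == 0 but size == 1
(0, 1)
>>> a = np.array([]).reshape(2, 1, 0)
>>> a.ndim, a.size                           # zero-size: ndim == 3 but size == 0, since one dim is 0
(3, 0)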

@wuxun-zhang
Contributor

@reminisce Yes, we also need to check whether any dim is zero.

@TaoLv
Member

TaoLv commented Nov 19, 2019

Thank you for clarifying, @reminisce. Let me know if there is anything I can help with.
