
I use OpenVINO 2019 R3 to build a ConstLayer, but it cannot be connected with ReshapeLayer and EltwiseLayer #1694

Closed
wmf0929 opened this issue Aug 10, 2020 · 6 comments
Labels
category: tools OpenVINO C++ / Python tools

Comments

@wmf0929

wmf0929 commented Aug 10, 2020

My test case code is as follows (OpenVINO version is 2019 R3):

#include <ie_builders.hpp>
#include <inference_engine.hpp>
#include <vector>

namespace Builder = InferenceEngine::Builder;
using Layer = InferenceEngine::Builder::Layer;
using PortInfo = InferenceEngine::PortInfo;
using TensorDesc = InferenceEngine::TensorDesc;
using Precision = InferenceEngine::Precision;
using Layout = InferenceEngine::Layout;
using Blob = InferenceEngine::Blob;
using idx_t = InferenceEngine::idx_t;
using Port = InferenceEngine::Port;

int main() {
  // float inputData[1001 * 1 * 1 * 1] = {1.2};
  // float outputData[1001 * 1 * 1 * 1] = {0};
  float bias[1001 * 1 * 1 * 1] = {2.1};
  Builder::Network *network = new Builder::Network("openvino");

  // InputLayer
  Layer input = Builder::InputLayer("input").setPort(Port({1, 1, 1, 1001}));
  idx_t input_id = network->addLayer(input);

  // ConstLayer
  TensorDesc tDesc(Precision::FP32, {1001}, Layout::ANY);
  Blob::Ptr blob =
      InferenceEngine::make_shared_blob<float>(tDesc, (float *)bias);

  Layer weight =
      Builder::ConstLayer("weight").setPort(Port({1001})).setData(blob);

  auto weight_id = network->addLayer(weight);

  // ReshapeLayer
  Layer reshape = Builder::ReshapeLayer("reshape")
                      .setInputPort(Port({1001}))
                      .setOutputPort(Port({1, 1, 1, 1001}))
                      .setDims({1, 1, 1, 1001});
  idx_t reshape_id = network->addLayer({weight.getId()}, reshape);

  // EltwiseLayer
  Layer add = Builder::EltwiseLayer("add")
                  .setEltwiseType(Builder::EltwiseLayer::SUM)
                  .setInputPorts({Port({1, 1, 1, 1001}), Port({1, 1, 1, 1001})})
                  .setOutputPort(Port({1, 1, 1, 1001}));
  idx_t add_id = network->addLayer({input.getId(), reshape.getId()}, add);

  return 0;
}

build command:
c++ openvino.cc -I /opt/intel/openvino/deployment_tools/inference_engine/include -L /opt/intel/openvino/deployment_tools/inference_engine/lib/intel64 -linference_engine -g -O0

The runtime error information is as follows:

$ gdb a.out
GNU gdb (Ubuntu 8.1-0ubuntu3.2) 8.1.0.20180409-git
Copyright (C) 2018 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later http://gnu.org/licenses/gpl.html
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law. Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-linux-gnu".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:
http://www.gnu.org/software/gdb/bugs/.
Find the GDB manual and other documentation resources online at:
http://www.gnu.org/software/gdb/documentation/.
For help, type "help".
Type "apropos word" to search for commands related to "word"...
Reading symbols from a.out...done.
(gdb) b main
Breakpoint 1 at 0x166ed: file openvino_3.cc, line 15.
(gdb) r
Starting program: /host/workspace/halo_tflite/a.out
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".

Breakpoint 1, main () at openvino_3.cc:15
15 int main() {
(gdb) n
18 float bias[1001 * 1 * 1 * 1] = {2.1};
(gdb) n
19 Builder::Network *network = new Builder::Network("openvino");
(gdb) n
22 Layer input = Builder::InputLayer("input").setPort(Port({1, 1, 1, 1001}));
(gdb) n
23 idx_t input_id = network->addLayer(input);
(gdb) n
26 TensorDesc tDesc(Precision::FP32, {1001}, Layout::ANY);
(gdb) n
28 InferenceEngine::make_shared_blob(tDesc, (float *)bias);
(gdb) n
31 Builder::ConstLayer("weight").setPort(Port({1001})).setData(blob);
(gdb) n
33 auto weight_id = network->addLayer(weight);
(gdb) n
36 Layer reshape = Builder::ReshapeLayer("reshape")
(gdb) n
37 .setInputPort(Port({1001}))
(gdb)
38 .setOutputPort(Port({1, 1, 1, 1001}))
(gdb)
39 .setDims({1, 1, 1, 1001});
(gdb)
40 idx_t reshape_id = network->addLayer({weight.getId()}, reshape);
(gdb)
terminate called after throwing an instance of 'InferenceEngine::details::InferenceEngineException'
what(): Cannot find layer with id: 18446744073709551615
/host/code/openvino/inference-engine/src/inference_engine/builders/ie_network_builder.cpp:710

Program received signal SIGABRT, Aborted.
__GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:51
51 ../sysdeps/unix/sysv/linux/raise.c: No such file or directory.
(gdb) bt
#0 __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:51
#1 0x00007ffff5d9b8b1 in __GI_abort () at abort.c:79
#2 0x00007ffff63fe257 in ?? () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#3 0x00007ffff6409606 in ?? () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#4 0x00007ffff6409671 in std::terminate() () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#5 0x00007ffff6409905 in __cxa_throw () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#6 0x00007ffff72a3d78 in InferenceEngine::Builder::Network::getLayer (this=0x5555555a3560, layerId=18446744073709551615)
at /host/code/openvino/inference-engine/src/inference_engine/builders/ie_network_builder.cpp:710
#7 0x00007ffff729d0c6 in InferenceEngine::Builder::Network::<lambda()>::operator()(void) const (__closure=0x7fffffffc390)
at /host/code/openvino/inference-engine/src/inference_engine/builders/ie_network_builder.cpp:333
#8 0x00007ffff729dc05 in InferenceEngine::Builder::Network::connect (this=0x5555555a3560, input=..., output=...)
at /host/code/openvino/inference-engine/src/inference_engine/builders/ie_network_builder.cpp:380
#9 0x00007ffff729c127 in InferenceEngine::Builder::Network::addLayer (this=0x5555555a3560,
inputs=std::vector of length 1, capacity 1 = {...}, layer=...)
---Type to continue, or q to quit---
at /host/code/openvino/inference-engine/src/inference_engine/builders/ie_network_builder.cpp:272
#10 0x000055555556af34 in main () at openvino_3.cc:40
(gdb)

@ilya-lavrenov
Contributor

@wmf0929 this NN builder API has been deprecated since 2019 R4 and is no longer supported. It was even removed in the 2020.4 release.
So the proposal is to use the nGraph API to construct networks from code.

@jgespino jgespino added the category: tools OpenVINO C++ / Python tools label Aug 10, 2020
@wmf0929
Author

wmf0929 commented Aug 11, 2020 via email

@wmf0929
Author

wmf0929 commented Aug 11, 2020

#include "ie_core.hpp"
#include "ngraph/opsets/opset.hpp"
#include "ngraph/opsets/opset3.hpp"
using namespace std;
using namespace ngraph;

int main() {
  auto arg0 = make_shared<opset3::Parameter>(element::f32, Shape{7});
  auto arg1 = make_shared<opset3::Parameter>(element::f32, Shape{7});

  // Create an 'Add' operation with two inputs 'arg0' and 'arg1'
  auto add0 = make_shared<opset3::Add>(arg0, arg1);
  auto abs0 = make_shared<opset3::Abs>(add0);

  // Create a node whose inputs/attributes will be specified later
  auto acos0 = make_shared<opset3::Acos>();

  // Create a node using opset factories
  auto add1 = shared_ptr<Node>(get_opset3().create("Add"));

  // Set inputs to nodes explicitly
  acos0->set_argument(0, add0);

  add1->set_argument(0, acos0);
  add1->set_argument(1, abs0);

  // Run shape inference on the nodes
  NodeVector ops{arg0, arg1, add0, abs0, acos0, add1};
  validate_nodes_and_infer_types(ops);

  // Create a graph with one output (add1) and two inputs (arg0, arg1)
  auto ng_function =
      make_shared<Function>(OutputVector{add1}, ParameterVector{arg0, arg1});
  InferenceEngine::CNNNetwork net(ng_function);
  return 0;
}

c++ ngraph.cc -I /host/docker/opt/intel-2020-4-dev/include -L /host/docker/opt/intel-2020-4-dev/lib -lngraph -I /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include -L /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/lib/intel64 -linference_engine -g -O0 -std=c++11

Using the OpenVINO 2020.4 library to build the nGraph demo you provided (https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_nGraphTutorial.html) produces the following errors:

/usr/local/bin/ld: /tmp/ccDiHoAS.o: in function `main':
/host/workspace/halo_tflite/ngraph.cc:42: undefined reference to `InferenceEngine::CNNNetwork::CNNNetwork(std::shared_ptr<ngraph::Function> const&)'
/usr/local/bin/ld: /tmp/ccDiHoAS.o: in function `InferenceEngine::details::extract_exception(InferenceEngine::StatusCode, char*)':
/host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/details/ie_exception_conversion.hpp:64: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/details/ie_exception_conversion.hpp:64: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(InferenceEngine::details::InferenceEngineException const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/details/ie_exception_conversion.hpp:64: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/details/ie_exception_conversion.hpp:64: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/details/ie_exception_conversion.hpp:64: undefined reference to `typeinfo for InferenceEngine::details::InferenceEngineException'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/details/ie_exception_conversion.hpp:64: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /tmp/ccDiHoAS.o: in function `InferenceEngine::CNNNetwork::getOutputsInfo[abi:cxx11]() const':
/host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:73: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:73: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(InferenceEngine::details::InferenceEngineException const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:73: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:73: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:73: undefined reference to `typeinfo for InferenceEngine::details::InferenceEngineException'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:73: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /tmp/ccDiHoAS.o: in function `InferenceEngine::CNNNetwork::getInputsInfo[abi:cxx11]() const':
/host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:87: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:87: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(InferenceEngine::details::InferenceEngineException const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:87: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:87: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:87: undefined reference to `typeinfo for InferenceEngine::details::InferenceEngineException'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:87: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /tmp/ccDiHoAS.o: in function `InferenceEngine::CNNNetwork::setBatchSize(unsigned long)':
/host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:126: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:126: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(InferenceEngine::details::InferenceEngineException const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:126: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:126: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:126: undefined reference to `typeinfo for InferenceEngine::details::InferenceEngineException'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:126: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /tmp/ccDiHoAS.o: in function `InferenceEngine::CNNNetwork::getBatchSize() const':
/host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:137: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:137: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(InferenceEngine::details::InferenceEngineException const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:137: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:137: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:137: undefined reference to `typeinfo for InferenceEngine::details::InferenceEngineException'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:137: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /tmp/ccDiHoAS.o: in function `InferenceEngine::CNNNetwork::getInputShapes[abi:cxx11]() const':
/host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:208: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:208: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(InferenceEngine::details::InferenceEngineException const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:208: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:208: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:208: undefined reference to `typeinfo for InferenceEngine::details::InferenceEngineException'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:217: undefined reference to `InferenceEngine::Data::getTensorDesc() const'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:217: undefined reference to `InferenceEngine::Data::getName[abi:cxx11]() const'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:208: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /tmp/ccDiHoAS.o: in function `InferenceEngine::CNNNetwork::reshape(std::map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::vector<unsigned long, std::allocator<unsigned long> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const, std::vector<unsigned long, std::allocator<unsigned long> > > > > const&)':
/host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:230: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:230: undefined reference to `InferenceEngine::details::InferenceEngineException::InferenceEngineException(InferenceEngine::details::InferenceEngineException const&)'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:230: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:230: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:230: undefined reference to `typeinfo for InferenceEngine::details::InferenceEngineException'
/usr/local/bin/ld: /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include/cpp/ie_cnn_network.h:230: undefined reference to `InferenceEngine::details::InferenceEngineException::~InferenceEngineException()'
collect2: error: ld returned 1 exit status

Can you help me take a look? Thanks.

@ilya-lavrenov
Contributor

@wmf0929 please also link against inference_engine_legacy; it still contains some symbols of the public API.
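A plausible adjusted build command following that suggestion (a sketch — paths are taken from the earlier command in this thread, not verified here):

```shell
# Add -linference_engine_legacy so the remaining public-API symbols resolve
c++ ngraph.cc \
  -I /host/docker/opt/intel-2020-4-dev/include \
  -L /host/docker/opt/intel-2020-4-dev/lib -lngraph \
  -I /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/include \
  -L /host/docker/opt/intel-2020-4-dev/deployment_tools/inference_engine/lib/intel64 \
  -linference_engine -linference_engine_legacy \
  -g -O0 -std=c++11
```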

@wmf0929
Author

wmf0929 commented Aug 11, 2020

OK, thank you very much.

@jgespino
Contributor

@wmf0929 I am closing the issue, please feel free to re-open or start a new one if additional assistance is needed.
