input-order-mismatch issue in multi-input layer. #2660
Labels: enhancement (New feature or request)

Comments
cibot: Thank you for posting issue #2660. The person in charge will reply soon.
EunjuYang added a commit to EunjuYang/nntrainer that referenced this issue on Oct 2, 2024:
- This commit is related to issue nnstreamer#2660
- When using multi-inputs, users must feed the data in reverse order due to a known bug that needs fixing. In the current version, the input must be provided in reverse order, which was not shown in the previous example where random data with the same dimensions were used.
- To provide a more accurate example to NNTrainer users, I have temporarily updated this example.
- Once the issue is handled, further updates will be necessary.
Signed-off-by: Eunju Yang <[email protected]>
jijoongmoon pushed a commit that referenced this issue on Oct 4, 2024:
- This commit is related to issue #2660
- When using multi-inputs, users must feed the data in reverse order due to a known bug that needs fixing. In the current version, the input must be provided in reverse order, which was not shown in the previous example where random data with the same dimensions were used.
- To provide a more accurate example to NNTrainer users, I have temporarily updated this example.
- Once the issue is handled, further updates will be necessary.
Signed-off-by: Eunju Yang <[email protected]>
DonghakPark pushed a commit to DonghakPark/nntrainer that referenced this issue on Oct 8, 2024:
- This commit is related to issue nnstreamer#2660
- When using multi-inputs, users must feed the data in reverse order due to a known bug that needs fixing. In the current version, the input must be provided in reverse order, which was not shown in the previous example where random data with the same dimensions were used.
- To provide a more accurate example to NNTrainer users, I have temporarily updated this example.
- Once the issue is handled, further updates will be necessary.
Signed-off-by: Eunju Yang <[email protected]>
(cherry picked from commit 2807f69)
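The commit messages above describe the temporary workaround: with multiple inputs, the data currently has to be fed in reverse order. Below is a minimal, hypothetical sketch of what such a multi-input data generator callback could look like; it is not the actual example code changed by the commits. The callback signature follows the usual ML training generator convention, and the buffer sizes, sample arrays, and the exact reversed mapping are illustrative assumptions based on the ini example in the comment below.

```cpp
// Hypothetical sketch of a multi-input data generator callback under the
// current behavior: input data is supplied in reverse of the declaration order.
// Not the actual example code from the referenced commits.
#include <cstring>

// Illustrative sizes matching the ini example in the comment below:
// in1 = 1:2:3 (6 floats), in2 = 1:3:3 (9 floats).
constexpr unsigned int IN1_SIZE = 1 * 2 * 3;
constexpr unsigned int IN2_SIZE = 1 * 3 * 3;

int multi_input_cb(float **input, float **label, bool *last, void *user_data) {
  static const float in1_sample[IN1_SIZE] = {0.0f};
  static const float in2_sample[IN2_SIZE] = {0.0f};

  // Workaround: slot 0 receives in2's data and slot 1 receives in1's data,
  // i.e. the reverse of the order in which the input layers were declared.
  std::memcpy(input[0], in2_sample, sizeof(in2_sample));
  std::memcpy(input[1], in1_sample, sizeof(in1_sample));

  label[0][0] = 0.0f; // placeholder label; shape depends on the model output
  *last = false;      // a real callback would signal the end of an epoch here
  return 0;           // 0 on success
}
```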
Q1: Is it correct to think that the current order follows something like this?

auto res = model->inference(batch, {second, first}, {});

When I create a model using a configuration file (ini file), it also has the reverse order relative to the configuration file.

Configuration file:

[model]
type = NeuralNetwork
epochs = 200
loss = mse
batch_size = 32

[optimizer]
type = adam

[in1]
type = input
input_shape = 1:2:3

[in2]
type = input
input_shape = 1:3:3

[concat]
type = concat
input_layers = in1,in2
axis = 2

[output]
type = fully_connected
input_layers = concat
unit = 1

Input dimension:

auto input_dim = model->getInputDimension();
for (size_t i = 0; i < input_dim.size(); i++) {
  const auto &dim = input_dim[i];
  std::cout << "INPUT_SHAPE[" << i << "] :" << dim.channel() << ", "
            << dim.height() << ", " << dim.width() << std::endl;
}

INPUT_SHAPE[0] :1, 3, 3
INPUT_SHAPE[1] :1, 2, 3

Model summarize:

model->summarize(std::cout, ml_train_summary_type_e::ML_TRAIN_SUMMARY_LAYER);

================================================================================
Layer name          Layer type          Output dimension    Input layer
================================================================================
in2                 input               1:1:3:3
--------------------------------------------------------------------------------
in1                 input               1:1:2:3
--------------------------------------------------------------------------------
concat              concat              1:1:5:3             in1
                                                            in2
--------------------------------------------------------------------------------
output              fully_connected     1:1:5:1             concat
--------------------------------------------------------------------------------
mse0                mse                 1:1:5:1             output
================================================================================
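Regarding Q1, one defensive pattern today is to query getInputDimension() and build the input vector in the order the model actually reports, rather than the declaration order in the ini file. The sketch below is only an illustration, not an official recommendation: the helper name run_inference and the buffer handling are hypothetical, while the inference() and getInputDimension() calls are the same ones used in the comment above.

```cpp
// Sketch only: size and order the inference inputs from getInputDimension(),
// so the call keeps working even while the reported order is reversed.
#include <vector>
#include <model.h> // nntrainer C++ (ccapi) header; include path may differ

std::vector<float *> run_inference(ml::train::Model &model, unsigned int batch,
                                   std::vector<std::vector<float>> &buffers) {
  // With the ini above this reports in2 (1:3:3) before in1 (1:2:3).
  auto input_dims = model.getInputDimension();

  buffers.resize(input_dims.size());
  std::vector<float *> inputs;
  for (size_t i = 0; i < input_dims.size(); ++i) {
    const auto &dim = input_dims[i];
    // Assumption: one flat buffer per input, sized batch * C * H * W.
    buffers[i].assign(batch * dim.channel() * dim.height() * dim.width(), 0.0f);
    inputs.push_back(buffers[i].data());
  }

  // Inputs are passed in the reported (currently reversed) order.
  return model.inference(batch, inputs, {});
}
```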
Original issue description:

While I was implementing an example app, I faced an issue of an order mismatch between addLayer() and the input tensors (multi-input layer case). Here are some descriptions of this issue.

proto_net: the protonet consists of four layers; the last layer, proto, takes two inputs, l2norm and input_major_label (protonet.cpp). This bugfix burden is passed to the user, which needs to be resolved.

(+) The order of input tensors of a layer is only affected by the order of addLayer, but it is not coherent with that order (affected, but randomly decided).
(+) I tested with various combinations to boil down the issue; I changed the input_layers option in layer creation as well. The input_layers option is irrelevant to the actual input tensors' order. The following code change works in the same situation:

Summary of Issue
- The order of input tensors is decided by the order of addLayer when a model is created.
- The input_layers property does not match the actual order.

Candidate to resolve
- Decide the input tensor order from input_layers, not the order of addLayer.
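For anyone trying to reproduce the report without the original protonet.cpp (which is not included above), here is a minimal, hypothetical sketch of a two-input model built through the C++ API. The layer names and properties mirror the ini example from the comment above; createModel/createLayer/addLayer/compile/initialize are standard ccapi calls, used here only to show where the addLayer order and input_layers diverge.

```cpp
// Hypothetical reproduction sketch (not the original protonet.cpp).
// Layer names follow the ini example earlier in this thread.
#include <memory>
#include <layer.h> // nntrainer C++ (ccapi) headers; include paths may differ
#include <model.h>

std::unique_ptr<ml::train::Model> build_two_input_model() {
  auto model = ml::train::createModel(ml::train::ModelType::NEURAL_NET,
                                      {"loss=mse", "batch_size=32"});

  // Declaration order: in1 first, then in2.
  model->addLayer(ml::train::createLayer("input",
                                         {"name=in1", "input_shape=1:2:3"}));
  model->addLayer(ml::train::createLayer("input",
                                         {"name=in2", "input_shape=1:3:3"}));

  // input_layers asks for in1,in2 ...
  model->addLayer(ml::train::createLayer(
    "concat", {"name=concat", "input_layers=in1,in2", "axis=2"}));
  model->addLayer(ml::train::createLayer(
    "fully_connected", {"name=output", "input_layers=concat", "unit=1"}));

  model->compile();
  model->initialize();

  // ... but, per this issue, the model's actual input tensor order is not
  // guaranteed to match input_layers (it is tied to addLayer and may come out
  // reversed), so callers currently have to check getInputDimension() and
  // feed data in that reported order.
  return model;
}
```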