Getting strange split and Lambda function when converting onnx #140
Comments
I managed to narrow down the problem: the split and the new Lambda function are generated because of the "Add" layer (the last one in the graph). This "Add" originates from a FullyConnected (Dense) layer that was exported as MatMul+Add (as far as I know, ONNX has no FullyConnected/Dense op). onnx2keras converts the MatMul into a Keras Dense, but it does not recognize the following Add as the bias half of that Dense, so the Add is emitted as a Lambda layer. Thanks,
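For anyone hitting the same thing, here is a hedged sketch (my own illustration, not onnx2keras code; the file name is a placeholder) of spotting the MatMul -> Add pattern that a Dense layer typically becomes in an ONNX export. It only checks consecutive nodes, which is how exporters usually emit the pair:

```python
# Hedged sketch: detect the MatMul -> Add pair that a Dense layer
# becomes in many ONNX exports. "model.onnx" is illustrative.
import onnx

graph = onnx.load("model.onnx").graph
for node, nxt in zip(graph.node, graph.node[1:]):
    if (node.op_type == "MatMul" and nxt.op_type == "Add"
            and node.output[0] in nxt.input):
        print(f"Dense pattern: {node.name} followed by {nxt.name}")
```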
I managed to fix it and add the bias as part of the Dense layer! Thanks,
After getting to know your code better, I think I managed to improve support for the 'channels_last' option (called change_ordering=True in the code), which is Keras's default (channels_last = NHWC). When doing that, I added a Transpose at the beginning of the model to convert the input from 'channels_first' (ONNX) to 'channels_last' (Keras default). I also check whether the model already has a Transpose layer at the beginning and, if so, merge it with the input-conversion Transpose I add (some ONNX models start with a 'channels_last' to 'channels_first' Transpose because they were themselves created from a 'channels_last' model). If the two Transposes cancel each other (i.e. 'channels_last' -> 'channels_first' -> 'channels_last'), I remove them altogether instead of keeping two Transpose layers; see the sketch after this comment.

I also managed to merge Activation layers that follow a Conv layer into the Conv layer itself, and to merge a ZeroPadding2D that precedes a Conv into the Conv as well (only to some extent, because Keras supports only 'same' and 'valid' padding on Conv layers, whereas ONNX is more flexible with its pads array).

See the screenshot below with all of my changes (you can compare it to the previous screenshot I sent), and let me know if you would like me to submit them to your repository for review and, hopefully, a merge.

p.s.: In the screenshot you can see that the Transpose layers were removed; that is because the ONNX model I used contained exactly the opposite channel-conversion Transpose, so after converting to 'channels_last' the two canceled each other and no Transpose was needed. Thanks,
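A minimal sketch of the cancellation check mentioned above (my illustration, not the actual patch): two back-to-back Transpose permutations cancel exactly when their composition is the identity permutation.

```python
# Hedged sketch of the Transpose-cancellation check: applying p1 then
# p2 is a no-op when the composed permutation is the identity.
def compose(p1, p2):
    """Permutation equivalent to transposing with p1 first, then p2."""
    return [p1[i] for i in p2]

def cancels(p1, p2):
    return compose(p1, p2) == list(range(len(p1)))

# NCHW -> NHWC (inserted for Keras) followed by the model's own
# NHWC -> NCHW Transpose cancels out, so both can be dropped:
print(cancels([0, 2, 3, 1], [0, 3, 1, 2]))  # True
```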
@kobygold, hi, can you share your code for merging MatMul and Add into a Dense layer?
Hi @tea1528, I tried to push my change but got "remote: Permission to gmalivenko/onnx2keras.git denied to kobygold.", so here it is instead. My change adds a few lines in onnx2keras/converter.py, right after line 120:
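Roughly, the idea is the following (a hedged sketch of the approach, not the exact lines; fold_add_into_dense and its arguments are hypothetical names standing in for the converter's internals):

```python
# Hedged sketch: fold an ONNX Add node's constant into the preceding
# bias-free Dense layer that was created from a MatMul node.
# fold_add_into_dense is a hypothetical helper, not converter.py code.
import numpy as np
from tensorflow import keras

def fold_add_into_dense(dense_layer, bias_vector):
    """Rebuild a bias-free Dense as an equivalent Dense with bias.

    dense_layer : keras.layers.Dense built from an ONNX MatMul
                  (use_bias=False), already holding its kernel.
    bias_vector : 1-D np.ndarray taken from the Add node's initializer.
    """
    kernel = dense_layer.get_weights()[0]
    assert bias_vector.shape == (kernel.shape[1],), "bias must match units"

    merged = keras.layers.Dense(units=kernel.shape[1], use_bias=True,
                                name=dense_layer.name)
    merged.build((None, kernel.shape[0]))
    merged.set_weights([kernel, bias_vector])
    return merged
```

With this in place, the Add node is consumed by the Dense instead of being emitted as a Lambda layer.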
p.s.: My other changes regarding the treatment of channels_last and the extra Transpose layers (including merging two sequential Transpose layers into one, or none) are not generic enough and will not always work correctly. I need to think about how to make them work for all networks: basically, only convolution-based layers need such a Transpose, while Dense does not, so I would have to add a Transpose before each Conv layer and another one after it, while avoiding redundant Transpose layers as much as possible; see the sketch below.
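A hedged sketch of that per-Conv wrapping idea (my illustration, not working converter code): convert to NHWC only around the convolution and back again, leaving Dense paths untouched.

```python
# Hedged sketch: wrap a single Conv in channel-order conversions so the
# rest of the graph can stay 'channels_first'. Illustration only.
from tensorflow import keras

def nchw_conv(x, conv_layer):
    """Apply a channels_last Keras conv to a channels_first tensor x."""
    x = keras.layers.Permute((2, 3, 1))(x)      # NCHW -> NHWC
    x = conv_layer(x)                           # Keras conv expects NHWC
    return keras.layers.Permute((3, 1, 2))(x)   # NHWC -> NCHW
```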
Hi,
Trying to convert a pretty simple ONNX model to Keras, I'm getting a strange split from the input and two Lambda functions...
I would expect the output h5 to be similar to the ONNX file, i.e. without the split and without those Lambda functions that are not in the source ONNX file.
Attached is a screenshot, plus both the onnx and h5 files in a zip:
onnx+h5.zip
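For reference, the conversion was run along these lines (a hedged sketch; the file names and input name are placeholders, not the actual ones from the attached model):

```python
# Minimal reproduction of the conversion that shows the issue.
import onnx
from onnx2keras import onnx_to_keras

onnx_model = onnx.load("model.onnx")
k_model = onnx_to_keras(onnx_model, ["input"])
k_model.save("model.h5")
```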
Thanks,
Koby