This repository has been archived by the owner on May 5, 2020. It is now read-only.

Mobilenet_v2_quantized predict_net.pb file possible error #53

Open
TerryTsao opened this issue Nov 14, 2018 · 2 comments


@TerryTsao

TerryTsao commented Nov 14, 2018

TL;DR

Can't run the net with Caffe2 following the 'Load Pretrained Net' tutorial; it keeps raising an error saying that 325 is not a CPU tensor. Fixed by editing the deserialized ASCII prototxt and re-serializing it.


I had posted this issue here before.

https://discuss.pytorch.org/t/caffe2-mobilenetv2-quantized-using-caffe2-blobistensortype-blob-cpu-blob-is-not-a-cpu-tensor-325/29065

After some experiments, I've discovered the problem and come up with a fix.

After deserializing the predict_net.pb file to an ASCII prototxt, I found that at the very end the network is declared to output an Int8CPUTensor called 325 instead of the blob softmax. The problem, although I'm not entirely sure at the source-code level, is probably that an Int8CPUTensor somehow fails the CAFFE_ENFORCE check.

When I changed the very last line of the dumped ASCII prototxt from exported_output: 325 to exported_output: softmax, everything worked just fine. So I'm thinking the file given in the official repo is not correct, at least for me.
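The text-level edit described above can be sketched as follows. This is a stdlib-only illustration, not official Caffe2 tooling: `patch_output_blob` is a hypothetical helper, and the field name and blob names are taken from this report as-is. A real fix would still need to re-serialize the patched prototxt back to a binary predict_net.pb with the Caffe2 protobuf API (not shown here).

```python
# Sketch of the fix: rewrite the network-output declaration in a net
# that has already been dumped to an ASCII prototxt. The field and blob
# names are placeholders from the report above, not verified constants.

def patch_output_blob(prototxt_text, old_blob, new_blob,
                      field="exported_output"):
    """Replace e.g.  exported_output: "325"  with  exported_output: "softmax".

    Operates purely on the text dump; re-serializing the result back to a
    binary predict_net.pb is a separate step done with protobuf tooling.
    """
    old_line = '%s: "%s"' % (field, old_blob)
    new_line = '%s: "%s"' % (field, new_blob)
    if old_line not in prototxt_text:
        raise ValueError("output declaration %r not found" % old_line)
    return prototxt_text.replace(old_line, new_line)

# Minimal illustration on a fragment of a dumped prototxt:
dump = 'op {\n  type: "Softmax"\n  output: "softmax"\n}\nexported_output: "325"\n'
patched = patch_output_blob(dump, "325", "softmax")
print(patched.splitlines()[-1])  # exported_output: "softmax"
```

Raising when the expected declaration is missing makes a silent no-op (patching the wrong file, or one already fixed) fail loudly instead.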

I'm not sure if this happens to anyone else. Thought I'd put it here in case anyone encounters the same situation.

@dzung-hoang

You can find the fixed predict_net.pb here.

@TerryTsao

@dzung-hoang Great. I've fixed MobileNet myself as well. But it's good to know there is a bug in resnet_quant too... I can't believe this.
