This repository has been archived by the owner on May 5, 2020. It is now read-only.

How can I quantize my own model? #51

Open
E1eMenta opened this issue Oct 30, 2018 · 6 comments

Comments

@E1eMenta

No description provided.

@KaiHu-KH

Same question. How can I quantize my own model in Caffe2?

@E1eMenta
Author

I've only found an example in PyTorch:
https://github.com/eladhoffer/quantized.pytorch
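For reference, a minimal sketch of post-training dynamic quantization using PyTorch's built-in `torch.quantization` API (added in later PyTorch releases; this is not the API of the repo linked above). The toy model below is only an illustration.

```python
import torch
import torch.nn as nn

# A small example model; any nn.Module containing Linear layers
# can be quantized the same way.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)
model.eval()

# Post-training dynamic quantization: weights are converted to int8,
# activations are quantized on the fly at inference time.
quantized_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Run inference with the quantized model.
x = torch.randn(1, 128)
with torch.no_grad():
    out = quantized_model(x)
print(out.shape)
```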

@KaiHu-KH

> I've only found an example in PyTorch:
> https://github.com/eladhoffer/quantized.pytorch

Got it, thanks a lot.

@liujingcs

Same question. I haven't found any documentation about this.

@wangshankun

Same question.

@wangshankun

Replace gemmlowp with QNNPACK in TFLite, then use TensorFlow to quantize the model into a TFLite model.
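A minimal sketch of the TensorFlow side only: post-training quantization through `tf.lite.TFLiteConverter`, assuming TensorFlow 2.x and a hypothetical SavedModel path. Swapping gemmlowp for QNNPACK inside the TFLite runtime is a separate build-level change and is not shown here.

```python
import tensorflow as tf

# Hypothetical path to an exported TensorFlow SavedModel.
saved_model_dir = "path/to/saved_model"

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
# Enable post-training quantization (dynamic-range by default).
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Write the quantized flatbuffer to disk.
with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```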
