Support min/max carry over for eager mode from_float method #2046
Conversation
This pull request was exported from Phabricator. Differential Revision: D57747749
Summary:
X-link: pytorch/pytorch#127309
Pull Request resolved: #2046

Once QAT has completed, or when a pre-tuned weight observer is supplied by a tunable PTQ algorithm, `from_float` should not overwrite that observer's statistics by re-observing the weight; for static QAT this must never happen, and dynamic QAT likewise does not need to re-run the weight observer, by design. This PR fixes that.

Test Plan: Signals
```
buck2 run caffe2/test:test_mobile_optimizer
```

Reviewed By: jerryzh168

Differential Revision: D57747749

fbshipit-source-id: 231937f64c6dc53cc79b35bb94534fdaa84e7da1
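To make the intent concrete, here is a minimal, hypothetical sketch of the carry-over behavior: if the weight observer already holds calibrated min/max state (e.g. from QAT or a tuned PTQ pass), `from_float` reuses that state instead of re-running observation on the weight. `MinMaxObserver`, `has_state`, and this `from_float` are simplified stand-ins for illustration, not the actual PyTorch/torchrec implementation.

```python
import torch

class MinMaxObserver:
    """Tracks the running min/max of tensors it sees (simplified stand-in)."""
    def __init__(self):
        self.min_val = torch.tensor(float("inf"))
        self.max_val = torch.tensor(float("-inf"))

    def __call__(self, x: torch.Tensor) -> torch.Tensor:
        self.min_val = torch.minimum(self.min_val, x.min())
        self.max_val = torch.maximum(self.max_val, x.max())
        return x

    def has_state(self) -> bool:
        # True once the observer has been calibrated at least once.
        return bool(torch.isfinite(self.min_val)) and bool(torch.isfinite(self.max_val))

def from_float(weight: torch.Tensor, weight_observer: MinMaxObserver):
    # The fix, in spirit: only re-run the observer when it has no calibrated
    # state. A pre-tuned observer keeps its min/max instead of being
    # overwritten by a fresh pass over the weight.
    if not weight_observer.has_state():
        weight_observer(weight)
    # Affine uint8 quantization params from the (possibly carried-over) range.
    scale = (weight_observer.max_val - weight_observer.min_val).clamp(min=1e-8) / 255.0
    zero_point = (-weight_observer.min_val / scale).round().to(torch.int)
    return scale, zero_point

# Usage: an observer calibrated during QAT keeps its range at conversion time.
obs = MinMaxObserver()
obs(torch.randn(64, 32))          # calibration pass (e.g. during QAT)
scale, zp = from_float(torch.randn(64, 32), obs)  # no re-observation here
```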