Fix MPFuture failing outside inference mode #521

Merged
merged 1 commit into master from fix-mpfuture-inference-mode on Nov 26, 2022

Conversation

borzunov
Member

@borzunov borzunov commented Nov 26, 2022

Fixes this error discovered while working on bigscience-workshop/petals#91:

[Screenshot 2022-11-26 at 07 40 22: RuntimeError traceback]

@codecov

codecov bot commented Nov 26, 2022

Codecov Report

Merging #521 (1f49d50) into master (94c985d) will increase coverage by 0.02%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #521      +/-   ##
==========================================
+ Coverage   75.95%   75.97%   +0.02%     
==========================================
  Files          81       81              
  Lines        7946     7947       +1     
==========================================
+ Hits         6035     6038       +3     
+ Misses       1911     1909       -2     
Impacted Files               Coverage Δ
hivemind/utils/mpfuture.py   89.33% <100.00%> (+0.04%) ⬆️
hivemind/dht/protocol.py     93.15% <0.00%> (+0.91%) ⬆️

@borzunov
Member Author

Technically, this happens when MPFuture and its underlying shared state tensor are created inside inference_mode(), and the state is then changed outside of it:

In [1]: import torch

In [2]: with torch.inference_mode():
   ...:     t = torch.zeros(3)
   ...: 

In [3]: t[...] = torch.ones(3)
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Input In [3], in <cell line: 1>()
----> 1 t[...] = torch.ones(3)

RuntimeError: Inplace update to inference tensor outside InferenceMode is not allowed.You can make a clone to get a normal tensor before doing inplace update.See https://github.com/pytorch/rfcs/pull/17 for more details.
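
For context, the usual way around this restriction (a minimal sketch of the general fix, not necessarily the exact diff in this PR) is to create the shared tensor with inference mode explicitly disabled, so it remains a normal tensor that can be updated in place later:

import torch

with torch.inference_mode():
    # Temporarily disable inference mode so the shared state tensor is a
    # regular tensor, even though this code runs inside inference_mode()
    with torch.inference_mode(mode=False):
        shared_state = torch.zeros(3).share_memory_()

# Outside inference mode, in-place updates now succeed
shared_state[...] = torch.ones(3)

Here, share_memory_() stands in for how MPFuture shares its state between processes; the key point is that the inner mode=False context keeps the tensor mutable outside inference mode.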

@borzunov borzunov merged commit 8f258b4 into master Nov 26, 2022
@borzunov borzunov deleted the fix-mpfuture-inference-mode branch November 26, 2022 17:24
@borzunov borzunov mentioned this pull request Nov 27, 2022
borzunov added a commit that referenced this pull request Nov 27, 2022
This is necessary for #521 to work. The minimal version where `torch.inference_mode()` works is 1.9.0.
mryab pushed a commit that referenced this pull request Nov 29, 2022
mryab pushed a commit that referenced this pull request Nov 29, 2022
This is necessary for #521 to work. The minimal version where `torch.inference_mode()` works is 1.9.0.

(cherry picked from commit 1242cfb)