The parallelism tutorial mentions this code to forward the attributes of a `DataParallel` object to its wrapped module:

```python
class MyDataParallel(nn.DataParallel):
    def __getattr__(self, name):
        return getattr(self.module, name)
```
This, however, leads to a recursion error: `nn.Module` stores `module` in `self._modules` rather than in the instance `__dict__`, so accessing `self.module` invokes the same `__getattr__` again. I think it should be:

```python
class MyDataParallel(DataParallel):
    def __getattr__(self, name):
        try:
            return super().__getattr__(name)
        except AttributeError:
            return getattr(self.module, name)
```
As was discussed here.
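For context, here is a minimal runnable sketch of the fixed wrapper in use; the `Net` module and its `custom_flag` attribute are illustrative names I made up, not part of the tutorial:

```python
import torch.nn as nn
from torch.nn import DataParallel

class MyDataParallel(DataParallel):
    def __getattr__(self, name):
        try:
            # nn.Module.__getattr__ looks in _parameters, _buffers and
            # _modules; this is what finds 'module' without recursing.
            return super().__getattr__(name)
        except AttributeError:
            # Anything DataParallel itself does not have is forwarded
            # to the wrapped module.
            return getattr(self.module, name)

class Net(nn.Module):  # illustrative module, not from the tutorial
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)
        self.custom_flag = True  # plain attribute on the wrapped module

    def forward(self, x):
        return self.fc(x)

model = MyDataParallel(Net())
print(model.custom_flag)  # True: forwarded to the wrapped Net
print(model.fc)           # the wrapped module's Linear layer
```

The key point is that `super().__getattr__(name)` resolves to `nn.Module.__getattr__`, which finds `module` in `self._modules`, so the fallback `getattr(self.module, name)` no longer triggers infinite recursion.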
/assigntome
Linked commits:
- 4b6497c: Updates pytorch#836 as suggested in pytorch/pytorch#16885 (comment)
- d078756: Patch 3 (#2389), which updates #836 as suggested in pytorch/pytorch#16885 (comment)