Add transforms' forward_shape and inverse_shape #887
Conversation
Thanks for porting this! My only suggestion is to add a comment about .forward_shape() versus .forward_event_shape().
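To make the suggested comment concrete, here is a small sketch of the naming distinction being discussed. The shapes below are for a hypothetical reshape-style transform (event shape (6,) -> (2, 3)); the exact numbers are illustrative assumptions, not taken from this PR.

```python
# Hypothetical reshape-style transform, only to illustrate the naming:
# .forward_event_shape() maps just the event part of the shape, while
# .forward_shape() maps the full shape, batch dims included.
event_in = (6,)
event_out = (2, 3)    # what .forward_event_shape(event_in) would return
full_in = (5, 6)      # batch (5,) + event (6,)
full_out = (5, 2, 3)  # what .forward_shape(full_in) would return

# The full output shape is the batch part of the input plus the
# transformed event part:
assert full_out == full_in[:-len(event_in)] + event_out
```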
Thanks for porting this!
```python
def forward_shape(self, shape):
    out_shape = self.bijector.forward_event_shape(shape)
    in_event_shape = self.bijector.inverse_event_shape(out_shape)
    batch_shape = shape[:len(shape) - len(in_event_shape)]
    return batch_shape + out_shape
```
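A self-contained sketch of how this logic plays out, using a toy stand-in for the wrapped bijector (ToyReshape and Wrapper are hypothetical names invented here, not the classes in this PR; the pass-through behavior of the event-shape methods on batch dims is an assumption for illustration):

```python
class ToyReshape:
    """Toy bijector-like object reshaping a trailing event dim (6,) -> (2, 3)."""

    def forward_event_shape(self, shape):
        # (..., 6) -> (..., 2, 3); leading batch dims pass through unchanged.
        assert shape[-1] == 6
        return shape[:-1] + (2, 3)

    def inverse_event_shape(self, shape):
        # (..., 2, 3) -> (..., 6)
        assert shape[-2:] == (2, 3)
        return shape[:-2] + (6,)


class Wrapper:
    """Minimal wrapper exposing forward_shape via the snippet's logic."""

    def __init__(self, bijector):
        self.bijector = bijector

    def forward_shape(self, shape):
        # Same logic as the snippet above: recover the batch part of the
        # input shape by comparing lengths, then prepend it.
        out_shape = self.bijector.forward_event_shape(shape)
        in_event_shape = self.bijector.inverse_event_shape(out_shape)
        batch_shape = shape[:len(shape) - len(in_event_shape)]
        return batch_shape + out_shape


print(Wrapper(ToyReshape()).forward_shape((5, 6)))  # (5, 2, 3)
```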
Nice trick!
This PR steals @fritzo's implementation in pytorch/pytorch#50581 and resolves various issues that couldn't be addressed previously. Thanks for your effort, @fritzo!