This repository has been archived by the owner on Aug 1, 2023. It is now read-only.

return empty tensor instead of None #332

Open
jhcross wants to merge 2 commits into master

Conversation

@jhcross (Contributor) commented on Feb 6, 2019

Summary:
To allow efficient use of fork/join annotation, we return an empty tensor instead of `None` for `encoder_padding_mask` from the transformer encoder in the unmasked/inference case.

Note that this slight hack is preferable to more far-reaching changes in, e.g., Fairseq's `multihead_attention`.

Differential Revision: D13969691
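
Since the diff itself is not shown in this conversation, here is a minimal sketch of the pattern the summary describes. The helper name, `src_tokens`, and `padding_idx` are illustrative assumptions; only the empty-tensor-instead-of-`None` behavior for `encoder_padding_mask` comes from the summary.

```python
import torch

# Hypothetical sketch, not the actual diff: compute the encoder padding mask,
# but return an empty tensor (rather than None) when nothing is padded.
def compute_encoder_padding_mask(src_tokens: torch.Tensor,
                                 padding_idx: int) -> torch.Tensor:
    encoder_padding_mask = src_tokens.eq(padding_idx)
    if not encoder_padding_mask.any():
        # Unmasked/inference case: returning an empty tensor keeps the return
        # type a Tensor in every branch, which is what makes the fork/join
        # annotation straightforward.
        return encoder_padding_mask.new_empty(0)
    return encoder_padding_mask
```

Downstream code would then presumably branch on `encoder_padding_mask.numel() > 0` instead of testing for `None` before applying the mask.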

jhcross added a commit to jhcross/translate-1 that referenced this pull request on Feb 6, 2019
Summary:
Pull Request resolved: pytorch#332

To allow efficient use of fork/join annotation, we return an empty tensor instead of `None` for `encoder_padding_mask` from transformer encoder in the unmasked/inference case.

Note that this slight hack is preferable to more far-reaching changes in, e.g., Fairseq multihead_attention.

Differential Revision: D13969691

fbshipit-source-id: 5b6106d8f4ac311ca4a5708898639b18ab2be07d
@facebook-github-bot

Hi @jhcross!

Thank you for your pull request.

We require contributors to sign our Contributor License Agreement, and yours needs attention.

You currently have a record in our system, but the CLA is no longer valid, and will need to be resubmitted.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g., your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!
