
add LFMMI loss #1725

Merged · 1 commit merged on Mar 10, 2023
Conversation

aluminumbox (Collaborator)

Add LFMMI loss; add torch.jit.ignore for the LFMMI function.

@@ -89,6 +94,9 @@ def forward(
text: (Batch, Length)
text_lengths: (Batch,)
"""
if self.lfmmi_dir != '':
Collaborator
Why load it in forward? I think we should load it in construction.

Collaborator
Then there is no need to use hasattr when loading the resource.

Collaborator (Author)

> Why load it in forward? I think we should load it in construction.

Because I need to decorate it with torch.jit.ignore, and I don't want to decorate the whole forward function with torch.jit.ignore.
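The point above can be sketched as follows. This is a minimal, hypothetical module (the class and method names are illustrative, not the PR's actual code) showing why decorating only a small helper with `@torch.jit.ignore(drop=True)` lets the rest of `forward` remain scriptable:

```python
import torch


class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)

    @torch.jit.ignore(drop=True)
    def _calc_extra_loss(self, x: torch.Tensor) -> torch.Tensor:
        # Non-scriptable code (e.g. a call into k2) can live here.
        # With drop=True, TorchScript replaces this method with a stub
        # that raises if called, instead of failing to compile the class.
        return x.sum()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.linear(x)
        # Guard so the dropped stub is never hit in scripted execution.
        if not torch.jit.is_scripting():
            _ = self._calc_extra_loss(y)
        return y


# Scripting succeeds even though _calc_extra_loss is not scriptable.
scripted = torch.jit.script(Model())
out = scripted(torch.zeros(2, 4))
```

Had `forward` itself carried the decorator, the whole method would be excluded from the scripted module rather than just the LFMMI helper.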

Collaborator (Author)

We have now moved load_mmi_resource into init and removed the hasattr check.
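A minimal sketch of that restructuring, with hypothetical names (`ASRModel`, `load_lfmmi_resource`, `lfmmi_ready` are illustrative, not necessarily the PR's exact identifiers): the resource is loaded once at construction, so `forward` no longer needs a `hasattr` check.

```python
import torch


class ASRModel(torch.nn.Module):
    def __init__(self, lfmmi_dir: str = ''):
        super().__init__()
        self.lfmmi_dir = lfmmi_dir
        # Load once at construction; forward() can assume the resource
        # exists whenever lfmmi_dir was given.
        if self.lfmmi_dir != '':
            self.load_lfmmi_resource()

    @torch.jit.ignore(drop=True)
    def load_lfmmi_resource(self):
        # Placeholder for reading the MMI graph/lexicon from lfmmi_dir;
        # kept out of TorchScript via torch.jit.ignore(drop=True).
        self.lfmmi_ready = True


model = ASRModel('some_lfmmi_dir')
```

In eager mode the decorated method runs normally; the decorator only matters once the module is scripted for export.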

@torch.jit.ignore(drop=True)
def _calc_lfmmi_loss(self, encoder_out, encoder_mask, text):
ctc_probs = self.ctc.log_softmax(encoder_out)
supervision_segments = torch.stack(
Collaborator

Why should we move it to CPU?

aluminumbox (Collaborator, Author) commented Mar 9, 2023

k2 requires supervision_segments to be on the CPU.
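Concretely, k2 expects `supervision_segments` as an int32 CPU tensor of shape `(num_sequences, 3)`, with each row holding `(sequence_index, start_frame, num_frames)`, even when the encoder activations live on the GPU. A hedged sketch of building it (the variable names are illustrative):

```python
import torch

# Frame counts per utterance; in training these would typically be
# a CUDA tensor derived from the encoder mask.
encoder_out_lens = torch.tensor([57, 43])
batch_size = encoder_out_lens.size(0)

# Rows: (sequence_index, start_frame, num_frames); must be int32 on CPU.
supervision_segments = torch.stack(
    (torch.arange(batch_size, dtype=torch.int32),
     torch.zeros(batch_size, dtype=torch.int32),
     encoder_out_lens.to(torch.int32).cpu()),
    dim=1,
)
```

The explicit `.cpu()` is the step the reviewer asked about: without it, constructing k2's dense FSA from GPU-resident segment metadata fails.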

@robin1001 robin1001 merged commit 6ad405b into wenet-e2e:main Mar 10, 2023
kli017 (Contributor) commented Apr 19, 2023

@aluminumbox Can LFMMI be used in runtime inference now?
