
Fixed bug: replaced bce_loss_with_logits with bce_loss #7096

Merged
merged 5 commits into Lightning-AI:master on Apr 28, 2021

Conversation

@amisev (Contributor) commented Apr 19, 2021

What does this PR do?

Fixed a bug in pl_examples/domain_templates/computer_vision_fine_tuning.py:
replaced binary_cross_entropy_with_logits with binary_cross_entropy.

See #7081
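For context, a minimal sketch of the mismatch reported in #7081 (toy tensors, not the template's exact code): the template's forward pass ended with torch.sigmoid, so the returned probabilities were then fed into a loss that applies sigmoid a second time internally.

```python
import torch
import torch.nn.functional as F

# Toy stand-ins for the model output and labels (illustrative only).
logits = torch.tensor([2.0, -1.5, 0.3])
targets = torch.tensor([1.0, 0.0, 1.0])

# The template's forward() ended with torch.sigmoid, so callers received probabilities.
probs = torch.sigmoid(logits)

# Bug: the *_with_logits loss applies sigmoid internally, so this squashes the output twice.
buggy_loss = F.binary_cross_entropy_with_logits(probs, targets)

# This PR's original change: use plain BCE on the probabilities instead.
fixed_loss = F.binary_cross_entropy(probs, targets)

# Mathematically equivalent, more numerically stable alternative discussed below:
# pass the raw logits to the *_with_logits loss.
reference = F.binary_cross_entropy_with_logits(logits, targets)
print(buggy_loss.item(), fixed_loss.item(), reference.item())
```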

@amisev changed the title from "Fixed bug: replaced bce_loss_with_logits with bec_loss" to "Fixed bug: replaced bce_loss_with_logits with bce_loss" on Apr 19, 2021
@carmocca (Contributor)

Why not remove the torch.sigmoid in forward instead?

@codecov (bot) commented Apr 19, 2021

Codecov Report

Merging #7096 (9df85f7) into master (6be0a85) will decrease coverage by 5%.
The diff coverage is n/a.

@@           Coverage Diff           @@
##           master   #7096    +/-   ##
=======================================
- Coverage      91%     87%    -5%     
=======================================
  Files         198     198            
  Lines       12634   12790   +156     
=======================================
- Hits        11541   11067   -474     
- Misses       1093    1723   +630     

@amisev (Contributor, Author) commented Apr 19, 2021

Some metrics in torchmetrics require probabilities rather than scores.

@awaelchli (Contributor) commented Apr 19, 2021

I would go for the more numerically stable option.
Can't we recompute sigmoid outputs where we need them?
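For reference, a rough illustration of the numerical-stability point (toy values, not from the template): once sigmoid saturates in float32, plain binary_cross_entropy can no longer recover the true loss, whereas binary_cross_entropy_with_logits folds the sigmoid into the loss and stays accurate.

```python
import torch
import torch.nn.functional as F

logit = torch.tensor([20.0])
target = torch.tensor([0.0])

# sigmoid(20) rounds to exactly 1.0 in float32, so log(1 - p) diverges and
# PyTorch clamps it; the reported loss (~100) is far from the true value (~20).
p = torch.sigmoid(logit)
print(F.binary_cross_entropy(p, target))                  # ~100 (clamped)

# The fused variant works on the logit directly and returns the true loss.
print(F.binary_cross_entropy_with_logits(logit, target))  # ~20
```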

@amisev (Contributor, Author) commented Apr 20, 2021

I removed the sigmoid activation from the forward pass. However, torchmetrics.Accuracy still requires probabilities, so I had to add torch.sigmoid back in two places.
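A hedged sketch of the resulting pattern (variable names and the Accuracy constructor arguments are assumptions, not the template's exact code): forward now returns raw logits, the loss consumes the logits directly, and sigmoid is recomputed only where torchmetrics.Accuracy needs probabilities.

```python
import torch
import torch.nn.functional as F
from torchmetrics import Accuracy

# Newer torchmetrics releases require the task argument; the version used in
# this PR accepted a plain Accuracy().
accuracy = Accuracy(task="binary")

logits = torch.randn(8)                 # stand-in for forward(), which now returns raw logits
targets = torch.randint(0, 2, (8,))

# The loss takes the logits directly (the numerically stable option).
loss = F.binary_cross_entropy_with_logits(logits, targets.float())

# torchmetrics.Accuracy expects probabilities (or hard labels),
# so the sigmoid is recomputed only for the metric.
acc = accuracy(torch.sigmoid(logits), targets)
print(loss.item(), acc.item())
```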

@awaelchli added the bug (Something isn't working), example, and ready (PRs ready to be merged) labels on Apr 21, 2021
@awaelchli added this to the v1.3 milestone on Apr 21, 2021
@awaelchli (Contributor)

great fix!

@tchaton enabled auto-merge (squash) on April 22, 2021 15:07
@tchaton merged commit 6b29211 into Lightning-AI:master on Apr 28, 2021