
Add a SoftLaplace distribution #2791

Merged · fehiepsi merged 4 commits into dev from smooth-laplace on Apr 2, 2021
Conversation

@fritzo (Member) commented Apr 1, 2021

From the docs:

This distribution corresponds to the log-concave density::

   z = (value - loc) / scale
   log_prob = log(2 / pi) - log(scale) - logaddexp(z, -z)

Like the Laplace density, this density has the heaviest possible tails
(asymptotically) while still being log-concave. Unlike the Laplace
distribution, this distribution is infinitely differentiable everywhere,
and is thus suitable for constructing Laplace approximations.

import torch
import matplotlib.pyplot as plt
import pyro.distributions as dist

# Compare log-densities: SoftLaplace matches Laplace in the tails but is
# smooth at the mode.
x = torch.linspace(-4, 4, 200)
plt.plot(x, dist.Laplace(0, 1).log_prob(x), "k--", label="Laplace")
plt.plot(x, dist.SoftLaplace(0, 1).log_prob(x), "r-", label="SoftLaplace")
plt.legend(loc="best")
plt.show()

[Figure: log-densities of Laplace (dashed) and SoftLaplace (solid) over [-4, 4]; the curves agree in the tails and differ only near the mode.]
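
As a sanity check on the normalization constant (an illustrative sketch, not part of the PR), the density can be integrated numerically:

import math
import torch

# The log(2 / pi) term should make the density integrate to 1.
z = torch.linspace(-20.0, 20.0, 100001)
log_prob = math.log(2 / math.pi) - torch.logaddexp(z, -z)
total = torch.trapz(log_prob.exp(), z)  # trapezoidal rule over the grid
assert abs(total.item() - 1.0) < 1e-4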

Tested

  • added to standard tests, including goftests
  • added a cdf-vs-icdf test that is now run on all continuous univariate distributions (a sketch follows below)
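
A minimal sketch of what that round-trip check might look like (the actual test in the Pyro suite is parameterized over many distributions):

import torch
import pyro.distributions as dist

# icdf should invert cdf up to numerical error.
d = dist.SoftLaplace(0.0, 1.0)
x = torch.linspace(-5.0, 5.0, 101)
assert torch.allclose(d.icdf(d.cdf(x)), x, atol=1e-4)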

@fritzo fritzo changed the title Add a SmoothLaplace distribution Add a SoftLaplace distribution Apr 1, 2021
@fritzo fritzo requested a review from fehiepsi April 2, 2021 14:42
def log_prob(self, value):
    if self._validate_args:
        self._validate_sample(value)
    # Standardize, then evaluate log(2/pi) - log(scale) - log(e^z + e^-z),
    # using logaddexp for numerical stability.
    z = (value - self.loc) / self.scale
    return math.log(2 / math.pi) - self.scale.log() - torch.logaddexp(z, -z)
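
A quick spot check of this implementation (illustrative, not from the PR): at value == loc we have z = 0 and logaddexp(0, 0) = log 2, so the log-density reduces to log(2/pi) - log 2 = -log(pi):

import math
import torch
import pyro.distributions as dist

# At the mode, the log-density should equal -log(pi) ≈ -1.1447.
lp = dist.SoftLaplace(0.0, 1.0).log_prob(torch.tensor(0.0))
assert torch.isclose(lp, torch.tensor(-math.log(math.pi)))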
@fehiepsi (Member) commented Apr 2, 2021
Makes sense to me: logaddexp(z, -z) ~ |z| in the tails, and Wolfram Alpha confirms the normalization factor.
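
Numerically, that tail claim shows up as a constant gap between the SoftLaplace and Laplace log-densities (an assumed illustration, not from the review):

import math
import torch
import pyro.distributions as dist

# As |z| grows, logaddexp(z, -z) -> |z|, so the two log-densities differ
# by the constant log(2/pi) + log(2) = log(4/pi) ≈ 0.2416 in the tails.
x = torch.tensor([-10.0, 10.0])
gap = dist.SoftLaplace(0.0, 1.0).log_prob(x) - dist.Laplace(0.0, 1.0).log_prob(x)
assert torch.allclose(gap, torch.full_like(gap, math.log(4 / math.pi)), atol=1e-5)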

@fehiepsi (Member) left a comment

LGTM! It would be nice to include some gradient tests, as you taught me previously, but let's add those when more distributions need them.

@fehiepsi fehiepsi merged commit 52623a8 into dev Apr 2, 2021
@fritzo (Member, Author) commented Apr 2, 2021

Thanks for reviewing, @fehiepsi!

@fritzo fritzo mentioned this pull request Jul 5, 2021
@fritzo fritzo deleted the smooth-laplace branch September 27, 2021 14:46