Improve sampling from GP predictive posteriors.
In `marginal`, the covariance matrix is now a
`PsdSumLinearOperator` rather than a plain sum (`covar + noise_covar`). This change improves the samples from GP predictive posteriors.
Rather than applying a low-rank approximation to `covar + noise_covar`, the
operator now only applies a low-rank approximation to `covar` for sampling, and then adds on i.i.d.
noise.
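The idea behind this change can be sketched outside of GPyTorch with plain NumPy (a minimal illustration, not the commit's implementation; the toy RBF kernel, grid, and variable names are all assumptions): approximate only the noiseless covariance `K` with a truncated eigendecomposition when drawing samples, then add exact i.i.d. observation noise on top.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: RBF kernel on a 1-D grid (illustrative values, not from the commit).
x = np.linspace(0.0, 1.0, 50)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1**2)
mean = np.zeros(50)
noise_var = 0.25
rank = 10

# Low-rank root of K alone (truncated eigendecomposition) -- the noise term
# is deliberately NOT folded into the approximation.
evals, evecs = np.linalg.eigh(K)
evals, evecs = evals[-rank:], evecs[:, -rank:]           # keep largest eigenvalues
root = evecs * np.sqrt(np.clip(evals, 0.0, None))        # shape (50, rank)

# Sample f from the low-rank approximation of K, then add exact i.i.d. noise.
n_samples = 20000
z = rng.standard_normal((rank, n_samples))
f = mean[:, None] + root @ z                             # latent GP samples
y = f + np.sqrt(noise_var) * rng.standard_normal((50, n_samples))
```

Because the noise is added exactly rather than pushed through the low-rank root, the sampled `y` retains the full i.i.d. noise variance at every point even when `rank` is much smaller than the number of grid points.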
gpleiss committed Mar 18, 2024
1 parent fcbf685 commit 703689d
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions gpytorch/likelihoods/gaussian_likelihood.py
@@ -5,7 +5,7 @@
 from typing import Any, Optional, Tuple, Union

 import torch
-from linear_operator.operators import LinearOperator, MaskedLinearOperator, ZeroLinearOperator
+from linear_operator.operators import LinearOperator, MaskedLinearOperator, PsdSumLinearOperator, ZeroLinearOperator
 from torch import Tensor
 from torch.distributions import Distribution, Normal

@@ -114,7 +114,7 @@ def log_marginal(
     def marginal(self, function_dist: MultivariateNormal, *params: Any, **kwargs: Any) -> MultivariateNormal:
         mean, covar = function_dist.mean, function_dist.lazy_covariance_matrix
         noise_covar = self._shaped_noise_covar(mean.shape, *params, **kwargs)
-        full_covar = covar + noise_covar
+        full_covar = PsdSumLinearOperator(covar, noise_covar)
         return function_dist.__class__(mean, full_covar)
