[Bug] botorch.optim.utils.sample_all_priors fails to sample priors that GPyTorch can sample #780
So the sampling will work if you instantiate your prior with scalar parameters. I will leave this open for now; hopefully this unblocks you. To solve this properly we'll need to fix the root cause in gpytorch.
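For illustration, a minimal sketch of the distinction, assuming a `GammaPrior` with made-up values:

```python
import torch
from gpytorch.priors import GammaPrior

# Scalar parameters: the prior has an empty batch shape, and
# sample_all_priors handles it correctly.
scalar_prior = GammaPrior(3.0, 6.0)

# Tensor parameters: the prior acquires a batch shape of (5,), which is
# what trips up the shape handling in sample_all_priors.
batched_prior = GammaPrior(torch.full((5,), 3.0), torch.full((5,), 6.0))
```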
I see, even though my prior might be mathematically equivalent to one instantiated with scalar parameters, the tensor-parameter version fails to sample. In case someone else is also having this issue, and all of their priors have their event shape equal to the hyperparameter's shape, this workaround seems to work until this gets disambiguated in GPyTorch :)

```diff
 def sample_all_priors(model: GPyTorchModel) -> None:
     r"""Sample from hyperparameter priors (in-place).

     Args:
         model: A GPyTorchModel.
     """
     for _, module, prior, closure, setting_closure in model.named_priors():
         if setting_closure is None:
             raise RuntimeError(
                 "Must provide inverse transform to be able to sample from prior."
             )
         try:
-            setting_closure(module, prior.sample(closure(module).shape))
+            setting_closure(module, prior.sample())
         except NotImplementedError:
             warnings.warn(
                 f"`rsample` not implemented for {type(prior)}. Skipping.",
                 BotorchWarning,
             )
```
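A quick shape check of why this works when the prior's shape already matches the hyperparameter's, with hypothetical sizes:

```python
import torch
from gpytorch.priors import GammaPrior

# Per-dimension parameters of shape (1, 3), matching e.g. an ARD
# lengthscale of shape (1, 3)
prior = GammaPrior(torch.full((1, 3), 3.0), torch.full((1, 3), 6.0))

# With no sample_shape argument, the draw already has the right shape
assert prior.sample().shape == torch.Size([1, 3])
```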
Indeed - if you want priors with different (prior) parameters for different parameter dimensions, then this is the way to do it (for now, until we've found a proper solution on the gpytorch end).
It is worth noting that if you're using a 1d prior for an ARD lengthscale (e.g. one instantiated with scalar parameters, as in the default models), the workaround above will draw a single value and broadcast it across all lengthscale dimensions rather than sampling each dimension independently. Instead, you should define the prior with per-dimension parameters so its shape matches the lengthscale's shape.
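For example, a sketch of a per-dimension lengthscale prior on an RBF kernel (hypothetical dimensionality and Gamma parameters):

```python
import torch
from gpytorch.kernels import RBFKernel
from gpytorch.priors import GammaPrior

d = 5  # number of input dimensions (hypothetical)

# One set of Gamma parameters per ARD dimension, shaped (1, d) like the
# kernel's lengthscale, so each dimension gets its own draw
lengthscale_prior = GammaPrior(torch.full((1, d), 3.0), torch.full((1, d), 6.0))
kernel = RBFKernel(ard_num_dims=d, lengthscale_prior=lengthscale_prior)
```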
As of today, the example in the OP still produces the same error.
I'm running into this same issue. It occurs if you use a prior with non-scalar parameters. Note that the workaround above won't behave as expected in batch mode. For example, when doing cross-validation, it will sample the same set of lengthscales for every CV fold, rather than different ones. Here's an altered workaround that works for this case.

```diff
 def sample_all_priors(model: GPyTorchModel) -> None:
     r"""Sample from hyperparameter priors (in-place).

     Args:
         model: A GPyTorchModel.
     """
     for _, module, prior, closure, setting_closure in model.named_priors():
         if setting_closure is None:
             raise RuntimeError(
                 "Must provide inverse transform to be able to sample from prior."
             )
         try:
-            setting_closure(module, prior.sample(closure(module).shape))
+            setting_closure(module, prior.sample(closure(module).shape[:-1]))
         except NotImplementedError:
             warnings.warn(
                 f"`rsample` not implemented for {type(prior)}. Skipping.",
                 BotorchWarning,
             )
```
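To see the difference, a small sketch with hypothetical shapes (10 CV folds, d = 5 ARD dimensions, prior parameters of shape `(5,)`):

```python
import torch
from gpytorch.priors import GammaPrior

prior = GammaPrior(torch.full((5,), 3.0), torch.full((5,), 6.0))
param_shape = torch.Size([10, 1, 5])  # batched raw lengthscale: fold x 1 x dim

# Earlier workaround: one (5,)-shaped draw, broadcast to every fold
shared = prior.sample()
assert shared.shape == torch.Size([5])

# Altered workaround: independent draws per fold
per_fold = prior.sample(param_shape[:-1])
assert per_fold.shape == param_shape
```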
Summary: Addresses pytorch#780. Previously, this would pass in `closure(module).shape` as the `sample_shape`, which only worked if the prior was a univariate distribution. `Distribution.sample` produces samples of shape `Distribution._extended_shape(sample_shape) = sample_shape + Distribution._extended_shape()`, so we can calculate the `sample_shape` required to support both univariate and multivariate / batched priors.

Differential Revision: D58377495
#2371 implements a fix that supports both univariate priors (e.g. the default scalar-parameter priors) and multivariate / batched priors.
Summary: Pull Request resolved: #2371. Addresses #780. Previously, this would pass in `closure(module).shape` as the `sample_shape`, which only worked if the prior was a univariate distribution. `Distribution.sample` produces samples of shape `Distribution._extended_shape(sample_shape) = sample_shape + Distribution._extended_shape()`, so we can calculate the `sample_shape` required to support both univariate and multivariate / batched priors.

Reviewed By: dme65

Differential Revision: D58377495

fbshipit-source-id: 17510505012838a3fe670492656be4d13bc0db5e
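The shape arithmetic the fix relies on can be sketched as follows; this is a simplified illustration of the idea, not the exact botorch source, and `infer_sample_shape` is a hypothetical helper name:

```python
import torch
from torch.distributions import Distribution, Gamma

def infer_sample_shape(param_shape: torch.Size, prior: Distribution) -> torch.Size:
    # Distribution.sample(sample_shape) returns a tensor of shape
    # sample_shape + prior._extended_shape(), so the sample_shape that
    # reproduces param_shape is just its leading dimensions.
    extended = prior._extended_shape()
    return param_shape[: len(param_shape) - len(extended)]

# Works for a univariate prior (empty extended shape) ...
assert infer_sample_shape(torch.Size([1, 5]), Gamma(3.0, 6.0)) == torch.Size([1, 5])
# ... and for a batched prior with per-dimension parameters
batched = Gamma(torch.full((5,), 3.0), torch.full((5,), 6.0))
assert infer_sample_shape(torch.Size([10, 1, 5]), batched) == torch.Size([10, 1])
```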
This should be fixed in the 0.11.3 release.
`botorch.optim.utils.sample_all_priors` is failing to sample priors that GPyTorch can sample with `module.sample_from_prior`. The reason is that the shape of the parameter is being passed to the `sample` method:

botorch/botorch/optim/utils.py, line 43 in 913aa0e

However, it seems that GPyTorch's priors do not expect to receive the event shape as a parameter.
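A minimal sketch of the resulting shape mismatch (hypothetical prior and sizes):

```python
import torch
from gpytorch.priors import GammaPrior

# A prior with per-dimension parameters over a hyperparameter of shape (3,)
prior = GammaPrior(torch.full((3,), 3.0), torch.full((3,), 6.0))
param = torch.rand(3)

# Passing the parameter's shape as sample_shape yields
# param.shape + prior batch shape = (3, 3), which can no longer be
# assigned back to the (3,)-shaped hyperparameter.
draw = prior.sample(param.shape)
assert draw.shape == torch.Size([3, 3])
```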
To reproduce
Stack trace/error message
Expected Behavior
No errors thrown.
System information