Bug Report

GPJax version: 0.6.7

Current behaviour:

Currently, the `Zero` mean function doesn't necessarily stay at zero once the posterior model has been optimised. This is because, under the hood, it is modelled using the `Constant` mean function with its `constant` field set to 0. However, the `constant` field is trainable, so upon inspecting a GP posterior whose hyperparameters have been optimised, the `Zero` mean function can in fact be non-zero.
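To illustrate the mechanism, here is a paraphrased sketch, not the verbatim GPJax source: `Zero` delegates to `Constant`, whose `constant` field is an ordinary trainable parameter, so nothing prevents an optimiser from moving it away from 0.

```python
from dataclasses import dataclass, field

import jax.numpy as jnp


@dataclass
class Constant:
    # Stands in for GPJax's trainable parameter field: nothing here (or in
    # the real param_field) marks the value as fixed.
    constant: jnp.ndarray = field(default_factory=lambda: jnp.array([0.0]))

    def __call__(self, x: jnp.ndarray) -> jnp.ndarray:
        # Broadcast the constant to one mean value per input point.
        return jnp.ones((x.shape[0], 1)) * self.constant


def Zero() -> Constant:
    # "Zero" is just Constant initialised at 0; since `constant` is an
    # ordinary trainable parameter, an optimiser may later update it.
    return Constant(constant=jnp.array([0.0]))
```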
Expected behaviour:
I believe that if a user explicitly chooses the `Zero` mean function then it should always return zero (so, in effect, the mean is not trainable). This is also consistent with existing GP modelling libraries such as GPflow (https://github.com/GPflow/GPflow/blob/develop/gpflow/functions.py#L187).
Steps to reproduce:
Create a GP model with a `Zero` mean function prior, and generate an optimised posterior on some data with non-zero mean. Inspecting the posterior model's "prior" mean will reveal that the `constant` field of the `Zero` mean function has changed from 0. This can be seen in the regression notebook, and a rough reproduction is sketched below.
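A reproduction sketch, assuming the GPJax 0.6.x API roughly as used in the regression notebook; exact module paths and argument names may differ between versions:

```python
import jax.numpy as jnp
import jax.random as jr
import optax
import gpjax as gpx

key = jr.PRNGKey(0)
x = jnp.linspace(0.0, 1.0, 50).reshape(-1, 1)
y = 5.0 + jnp.sin(10.0 * x)  # targets with a clearly non-zero mean
D = gpx.Dataset(X=x, y=y)

# Prior with an explicit Zero mean function.
prior = gpx.Prior(mean_function=gpx.mean_functions.Zero(), kernel=gpx.kernels.RBF())
posterior = prior * gpx.Gaussian(num_datapoints=D.n)

# Optimise the marginal log-likelihood.
opt_posterior, history = gpx.fit(
    model=posterior,
    objective=gpx.objectives.ConjugateMLL(negative=True),
    train_data=D,
    optim=optax.adam(learning_rate=0.01),
    num_iters=500,
    key=key,
)

# Expected 0.0, but the "Zero" mean's constant has drifted during training:
print(opt_posterior.prior.mean_function.constant)
```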
Feel free to let me know if this behaviour is in fact intended!
Thomas-Christie changed the title twice on Jul 4, 2023, settling on "bug: Zero mean function doesn't necessarily return zero after optimising a GP posterior".
Changing `param_field`'s `trainable=False` won't fix this issue in general. The `constant` field really needs to be a `static_field`, as it should never change value under any operation, and we should have `init=False` to ensure it cannot be set to a different value upon initialisation, e.g. the sketch below.
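A minimal sketch of what that could look like; it assumes `static_field` (as exposed from `gpjax.base`) forwards standard `dataclasses.field` keywords, and the class body is illustrative rather than the actual GPJax source:

```python
from dataclasses import dataclass

import jax.numpy as jnp
from gpjax.base import Module, static_field  # assumed import path


@dataclass
class Zero(Module):
    # Static and excluded from __init__: never a trainable leaf, and
    # callers cannot override the value at construction time.
    constant: float = static_field(init=False, default=0.0)

    def __call__(self, x: jnp.ndarray) -> jnp.ndarray:
        # Always return exactly zero, one value per input point.
        return jnp.full((x.shape[0], 1), self.constant)
```

With `constant` kept out of the trainable leaves, gradient transformations never produce an update for it, so the mean is zero both before and after optimisation.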
#339 feels slightly similar to this, having just put in that request: there, a parameter (`c`) defined in terms of a static field that should remain constant is nevertheless being 'trained' (in that case resulting in an invalid Polar GP).