Add Multi Information Source Augmented GP (#2152)
Summary:

## Motivation

This pull request introduces the implementation of the **Augmented Gaussian Process** (AGP) and a related acquisition function based on UCB, namely the **Augmented UCB** (AUCB), for multi-information-source problems.

> Candelieri, A., Archetti, F. Sparsifying to optimize over multiple information sources: an augmented Gaussian process based algorithm. Struct Multidisc Optim 64, 239–255 (2021). [https://doi.org/10.1007/s00158-021-02882-7](https://doi.org/10.1007/s00158-021-02882-7)

### AGP and AUCB in a nutshell

The key idea of the AGP is to fit a GP model for each information source and *augment* the observations on the high-fidelity source with those from *cheaper* sources that can be considered *reliable*. The GP model fitted on this *augmented* set of observations is the AGP. The AUCB is a modification of the standard UCB, computed on the AGP, that also accounts for the source-specific query cost.

### Example

This is what the AGP and AUCB look like on the Forrester problem with 2 sources.

<p align="center"> <img src="https://github.com/pytorch/botorch/assets/59694427/72c43b56-e08b-4b47-aea9-00925345890c" height="500"> </p>

### Some implementation details

The Augmented GP implementation is based on the `SingleTaskGP`. Each source is modelled as an independent `SingleTaskGP`, and all the _reliable_ observations are used to fit the `SingleTaskGP` representing the AGP, namely `SingleTaskAugmentedGP`. A key difference from the Multi-Fidelity approaches in BoTorch is that the dimension representing the source is not directly modelled by the AGP. In addition, a fixed-noise version of the AGP has been implemented based on the `FixedNoiseGP`. The Augmented UCB implementation is based on the `UpperConfidenceBound`, penalized by the query cost of the source and by its discrepancy from the AGP. A minimal usage sketch is included at the end of this summary.

### Have you read the [Contributing Guidelines on pull requests](https://github.com/pytorch/botorch/blob/main/CONTRIBUTING.md#pull-requests)?

Yes, I have read the Contributing Guidelines on pull requests.

Pull Request resolved: #2152

Test Plan:

A set of unit tests for both the AGP and the AUCB has been implemented, inspired by the `SingleTaskGP` and `UpperConfidenceBound` tests, respectively. In addition, a notebook tutorial shows how to use the AGP model together with the AUCB acquisition function and compares them against Multi-Fidelity MES on the Augmented Branin test function.

Finally, we plan to release an arXiv paper soon with an extensive comparison of Multi-Fidelity and Multi Information Source approaches in BoTorch. Here are some preliminary results on the Augmented Hartmann test function, considering three sources (with fidelities $[0.50, 0.75, 1.00]$). The AGP has been compared with the discrete Multi-Fidelity versions of Knowledge Gradient (KG), Max-value Entropy Search (MES), and GIBBON. The figure on the left shows the best seen with respect to the query cost, while the figure on the right shows the best seen with respect to the wall-clock time.

<p align="center"> <img src="https://github.com/pytorch/botorch/assets/59694427/77cfcebc-8123-4802-8045-5d9b359775ec" height="300"> </p>

## Related PRs

The docs have been updated in this PR.
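### A minimal sketch of the augmentation idea

For readers who want to experiment with the idea before looking at the new classes, here is a minimal sketch of the augmentation step and of a plain UCB on the resulting model, written only with standard BoTorch APIs (`SingleTaskGP`, `fit_gpytorch_mll`, `UpperConfidenceBound`). The toy sources, the `fit_gp` helper, and the reliability rule (keep a cheap-source observation if it lies within one posterior standard deviation of the high-fidelity GP) are illustrative assumptions; they are not the constructor or the exact discrepancy criterion of the new `SingleTaskAugmentedGP`, and the plain UCB at the end omits the cost and discrepancy penalties that the AUCB adds.

```python
import torch
from botorch.acquisition import UpperConfidenceBound
from botorch.fit import fit_gpytorch_mll
from botorch.models import SingleTaskGP
from gpytorch.mlls import ExactMarginalLogLikelihood

torch.manual_seed(0)


def fit_gp(train_X, train_Y):
    """Fit a plain single-task GP on one source's observations."""
    model = SingleTaskGP(train_X, train_Y)
    mll = ExactMarginalLogLikelihood(model.likelihood, model)
    fit_gpytorch_mll(mll)
    return model


# Toy 1D problem: source "hi" is the expensive high-fidelity source,
# source "lo" is a cheap, biased approximation of it.
def f_hi(x):
    return torch.sin(6 * x)


def f_lo(x):
    return torch.sin(6 * x) + 0.3 * torch.cos(12 * x)


X_hi = torch.rand(5, 1, dtype=torch.double)
Y_hi = f_hi(X_hi)
X_lo = torch.rand(20, 1, dtype=torch.double)
Y_lo = f_lo(X_lo)

# One independent GP per information source.
gp_hi = fit_gp(X_hi, Y_hi)
gp_lo = fit_gp(X_lo, Y_lo)

# Augment the high-fidelity data with "reliable" cheap observations:
# here, those falling inside the high-fidelity GP's +/- 1 std. dev. band
# (an assumed, simplified reliability rule for illustration only).
with torch.no_grad():
    post = gp_hi.posterior(X_lo)
    mean, std = post.mean, post.variance.sqrt()
reliable = ((Y_lo - mean).abs() <= std).squeeze(-1)

X_aug = torch.cat([X_hi, X_lo[reliable]])
Y_aug = torch.cat([Y_hi, Y_lo[reliable]])

# The AGP is a single-task GP fitted on the augmented observation set.
agp = fit_gp(X_aug, Y_aug)

# Plain UCB on the AGP; the AUCB additionally penalizes this value by the
# query cost of the chosen source and its discrepancy from the AGP.
ucb = UpperConfidenceBound(agp, beta=3.0)
X_cand = torch.rand(1, 1, 1, dtype=torch.double)  # one candidate, q=1
print(ucb(X_cand))
```

The actual `SingleTaskAugmentedGP` follows the reliability criterion from Candelieri & Archetti (2021) rather than this simplified band check.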
Reviewed By: Balandat

Differential Revision: D52256404

Pulled By: sdaulton

fbshipit-source-id: 863437b488dcee6b37306dcd6c1ee6b63ca9c55f