Add infrastructure for changing acquisition function optimizers #482

Open · wants to merge 1 commit into main

Conversation

sleweke-bayer

The BotorchStrategy is expanded to modularize the acquisition function optimizer. Every acquisition function optimizer must derive from the newly created base class AcquisitionFunctionOptimizer. The current Botorch-based optimizer is moved into the separate class BotorchAcqfOptimizer.

This PR includes a breaking change in the data model: a new data model for acquisition function optimizers is created, and the Botorch optimizer params stored in BotorchStrategy are moved to the new BotorchAcqfOptimizer.

This modularization makes it possible to later add different optimizers for the acquisition function (e.g., a genetic algorithm).

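A rough sketch of the split described above. Beyond the class names AcquisitionFunctionOptimizer, BotorchAcqfOptimizer, and BotorchStrategy, everything here (the data_model attributes, the optimize signature) is illustrative and may differ from the actual diff:

```python
from abc import ABC, abstractmethod

from botorch.optim import optimize_acqf


class AcquisitionFunctionOptimizer(ABC):
    """Base class every acquisition function optimizer derives from."""

    def __init__(self, data_model):
        # data_model carries the optimizer params that previously lived on
        # BotorchStrategy (attribute names below are hypothetical)
        self.data_model = data_model

    @abstractmethod
    def optimize(self, acqfs, bounds, **kwargs):
        """Optimize the acquisition function(s) and return candidate points."""


class BotorchAcqfOptimizer(AcquisitionFunctionOptimizer):
    """The former in-strategy Botorch optimizer, factored into its own class."""

    def optimize(self, acqfs, bounds, **kwargs):
        # delegate to botorch's optimize_acqf with the params moved out of
        # BotorchStrategy
        candidates, _ = optimize_acqf(
            acq_function=acqfs[0],
            bounds=bounds,
            q=1,
            num_restarts=self.data_model.num_restarts,
            raw_samples=self.data_model.raw_samples,
        )
        return candidates
```

With this split, a future optimizer (e.g. a genetic algorithm) only needs to subclass AcquisitionFunctionOptimizer and ship its own data model, leaving BotorchStrategy untouched.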
@jduerholt (Contributor)

Thanks @sleweke-bayer! I will have a detailed look at this in the days between Christmas and New Year!

@jduerholt (Contributor) left a comment:

Hi @sleweke-bayer,

thank you very much. I have not had a detailed look yet, but focused more on the architecture, and I have one comment there in particular. What do you think?

Best and a happy new year,

Johannes

The comment below is attached to this diff context in the new optimizer base class:

```python
        return cls(data_model=data_model)

    @abstractmethod
    def optimize(
```

From my understanding, the signature of the optimize method should be a bit simpler and maybe also more torch-agnostic. From my perspective, we only need the domain, the bounds, and the acqf(s); the rest can be generated on the fly from the domain and is optimizer-specific anyway, right?

I mean that, for example, a GA wants the constraints in a different format than botorch does. In principle, the bounds could also be generated on the fly from the domain, but this would create problems with the local and global bounds, as those are controlled on a different level. Of course, one could also think about putting the LSR logic into the generic optimizer class; then we would only need the acqfs and the domain, and maybe also the input processing specs.

What do you think about making it more agnostic here?
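
A minimal sketch of what such an agnostic signature could look like, assuming Botorch's AcquisitionFunction and a BoFire-style Domain; the parameter set mirrors the suggestion above (domain, bounds, acqfs), and the import path for Domain is an assumption:

```python
from abc import ABC, abstractmethod

import torch
from botorch.acquisition import AcquisitionFunction

from bofire.data_models.domain.api import Domain  # assumed import path


class AcquisitionFunctionOptimizer(ABC):
    @abstractmethod
    def optimize(
        self,
        domain: Domain,
        bounds: torch.Tensor,  # passed in explicitly: local vs. global bounds are decided by the caller
        acqfs: list[AcquisitionFunction],
    ) -> torch.Tensor:
        """Return candidate points; each subclass translates the domain's
        constraints into its optimizer's native format (e.g. botorch
        constraint tensors vs. a GA's penalty/repair functions)."""
```

Each subclass would then derive whatever representation it needs on the fly from the domain, keeping the shared interface small.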
