Document how to implement custom models #2306
Comments
Yes, this is generally possible. BoTorch's […] That said, note that in order to use gradient-based optimization of the acquisition function (via the standard […])
Great! Could you include a tutorial notebook for this? I'm fairly new to BoTorch.
@esantorella @Balandat here is an attempt at documenting a custom model that I'd like to submit a PR for: https://github.com/jakee417/botorch/blob/main/tutorials/custom_model.ipynb One question I have is trying to get […]
But when I try:

```python
import torch
from typing import Optional

from botorch.posteriors.torch import TorchPosterior
from botorch.sampling.base import MCSampler
from botorch.sampling.get_sampler import GetSampler, get_sampler


# Attempt: register a sampler for TorchPosterior by delegating
# back to get_sampler.
@GetSampler.register(TorchPosterior)
def _get_sampler_torch(
    posterior: TorchPosterior,
    sample_shape: torch.Size,
    *,
    seed: Optional[int] = None,
) -> MCSampler:
    return get_sampler(
        posterior=posterior,
        sample_shape=sample_shape,
        seed=seed,
    )
```

it doesn't seem to resolve the issue. Any tips here?
Hi @jakee417. The […]
I tried following the example for TransformedPosterior, which I now see is a mistake. But the infinite loop never even occurred, due to the dispatching in _posterior_to_distribution_encoder (had it worked, I do see your point about the cyclic dependency). Instead, referencing the […]:

```python
import torch
from typing import Optional
from torch import distributions

from botorch.posteriors.torch import TorchPosterior
from botorch.sampling.base import MCSampler
from botorch.sampling.get_sampler import GetSampler
from botorch.sampling.stochastic_samplers import StochasticSampler


@GetSampler.register(distributions.Distribution)
def _get_sampler_torch(
    posterior: TorchPosterior,
    sample_shape: torch.Size,
    *,
    seed: Optional[int] = None,
) -> MCSampler:
    return StochasticSampler(sample_shape=sample_shape, seed=seed)
```

In this case, the original error […] is a little misleading, since you don't need to register […]
Ah yes, I forgot that we register them based on the distribution class for TorchPosterior. This is the right way to do it, but ideally we'd register them for the specific distribution class rather than the base class (in case there are custom behaviors for different distributions, like MVN). Left a comment on this on the PR as well. We can continue the discussion there.
Summary:

## Motivation

Issue #2306

### Have you read the [Contributing Guidelines on pull requests](https://github.com/pytorch/botorch/blob/main/CONTRIBUTING.md#pull-requests)?

Yes. Added a tutorial which can be used for smoke tests.

Pull Request resolved: #2474

Test Plan: Probabilistic linear regression, Bayesian linear regression, and ensemble linear regression all yield optimization results close to (0, 0), which is the ground-truth answer. Random forest doesn't seem to achieve the ground-truth answer, likely due to its inability to incorporate gradient information into the optimization of the acquisition function.

## Related PRs

N/A

Reviewed By: esantorella

Differential Revision: D61612799

Pulled By: jakee417

fbshipit-source-id: 63d26c048dc4544cae37e89767e14caf732e7749
Issue description
I wonder if we could implement a custom model instead of a GP-class model.
Specifically, suppose we build a random forest model that also provides a mean and variance for data points. Is it possible to do Bayesian optimization with this model under the BoTorch framework?
Code example
For example:
```python
model = CustomModel(train_X=init_x, train_Y=init_y)
```