
[Bug] Arguments passed to BoTorch optimize_acqf don't work (in an intuitive way) or don't make sense #2467

Open
esantorella opened this issue May 16, 2024 · 4 comments
Labels
bug Something isn't working wishlist Long-term wishlist feature requests

Comments

@esantorella
Contributor

esantorella commented May 16, 2024

I wanted to use nonlinear inequality constraints, which seems like it should be doable because BoTorch's optimize_acqf supports them. However, optimize_acqf runs after Ax has applied its transforms, so arguments such as nonlinear_inequality_constraints and batch_initial_conditions operate in the transformed space, which causes surprising behavior.

Example:

import torch
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.utils.testing.core_stubs import get_branin_data, get_branin_experiment

inequality_constraints = [(lambda x: 3 - (x**2).sum(), True)]

botorch_gen_step = GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={
        "model_gen_options": {
            "optimizer_kwargs": {
                "nonlinear_inequality_constraints": inequality_constraints,
                "batch_initial_conditions": torch.ones(((1, 1, 2))),
            }
        }
    },
)

constrained_gs = GenerationStrategy(steps=[botorch_gen_step])

# Fit the strategy on Branin data and generate a candidate.
# (Sketch: the exact arguments of the testing stubs are assumed.)
experiment = get_branin_experiment(with_trial=True)
experiment.attach_data(get_branin_data(trial_indices=[0]))
generator_run = constrained_gs.gen(experiment=experiment, n=1)

# {'x1': 10.0, 'x2': 15.0} -- does not obey the constraint
generator_run.arms[0].parameters
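
To see concretely why this happens: assuming Ax's usual normalization of the Branin domain [-5, 10] x [0, 15] to the unit cube before calling optimize_acqf, the raw point (10, 15) becomes (1, 1) in the space the constraint callable actually sees:

import torch

def constraint(x):  # written for raw parameter values
    return 3 - (x**2).sum()

raw_point = torch.tensor([10.0, 15.0])       # what the user gets back
normalized_point = torch.tensor([1.0, 1.0])  # what optimize_acqf actually evaluated

print(constraint(raw_point))         # tensor(-322.) -> violated in the raw space
print(constraint(normalized_point))  # tensor(1.)    -> satisfied in the transformed space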

Suggested resolution:

I suggest not surfacing arguments to optimize_acqf to the user, possibly with a few exceptions added as needed. Although some of the arguments can be helpful when constructed by Ax, almost all are nonsensical, redundant with Ax, or will behave surprisingly when passed by the user. Redundant arguments include acq_function, bounds, q, and inequality_constraints. return_best_only is nonsensical when used with Ax. And others, such as nonlinear_inequality_constraints and batch_initial_conditions, operate in the transformed space and thus are nearly impossible to use correctly without a detailed understanding of what Ax does under the hood. Users with such a detailed understanding might as well use BoTorch.
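
For comparison, here is a minimal sketch of the direct-BoTorch route, where the constraint really is evaluated on raw parameter values. This assumes a recent BoTorch version; the model, data, and objective are stand-ins, not part of the original report:

import torch
from botorch.acquisition import ExpectedImprovement
from botorch.fit import fit_gpytorch_mll
from botorch.models import SingleTaskGP
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

bounds = torch.tensor([[-5.0, 0.0], [10.0, 15.0]], dtype=torch.double)  # raw Branin bounds
train_X = bounds[0] + (bounds[1] - bounds[0]) * torch.rand(10, 2, dtype=torch.double)
train_Y = -(train_X**2).sum(dim=-1, keepdim=True)  # toy objective

model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(model.likelihood, model))

# Nonlinear constraints require explicit initial conditions and batch_limit=1;
# the callable below sees raw values, not transformed ones.
candidate, _ = optimize_acqf(
    acq_function=ExpectedImprovement(model=model, best_f=train_Y.max()),
    bounds=bounds,
    q=1,
    num_restarts=1,
    options={"batch_limit": 1},
    nonlinear_inequality_constraints=[(lambda x: 3 - (x**2).sum(), True)],
    batch_initial_conditions=torch.full((1, 1, 2), 0.5, dtype=torch.double),
)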

I think this can be achieved by not constructing opt_options here, and instead erroring when optimizer_kwargs are present in model_gen_options.

opt_options = checked_cast(

@Balandat
Contributor

Similar considerations arise when passing arguments to the acquisition function constructors, see #2401. It would be great if we could automatically apply the transforms to the arguments for which they are needed, but doing this in a generic fashion seems very challenging.
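
As a rough illustration of what applying the transforms would mean for this one argument (unnormalize is real BoTorch; Ax's transform stack can involve more than normalization, and doing this generically across all optimizer arguments is the hard part):

import torch
from botorch.utils.transforms import unnormalize

raw_bounds = torch.tensor([[-5.0, 0.0], [10.0, 15.0]])  # 2 x d Branin bounds

def to_transformed_space(raw_constraint):
    # Wrap a raw-space constraint so it can be evaluated on unit-cube points.
    def transformed_constraint(x):
        return raw_constraint(unnormalize(x, raw_bounds))
    return transformed_constraint

constraint_tf = to_transformed_space(lambda x: 3 - (x**2).sum())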

@bernardbeckerman bernardbeckerman added the bug Something isn't working label May 17, 2024
@esantorella esantorella added the wishlist Long-term wishlist feature requests label Jul 10, 2024
@esantorella esantorella removed their assignment Jul 10, 2024
@lena-kashtelyan
Contributor

@esantorella, sounds like you were possibly working on this? I wonder if we want to just error with UnsupportedError if these are passed in Ax, since it seems that they will do nothing but confuse? I understand that there may be a contrived case where we are not using any Ax transforms, but does that ever happen in reality? If not, I think we could just validate against this.
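
A sketch of that validation (the helper name is hypothetical; UnsupportedError is the real Ax exception class):

from ax.exceptions.core import UnsupportedError

def _validate_model_gen_options(model_gen_options: dict) -> None:
    # Hypothetical check: reject user-supplied optimizer_kwargs outright rather
    # than silently forwarding them to optimize_acqf in the transformed space.
    if "optimizer_kwargs" in model_gen_options:
        raise UnsupportedError(
            "optimizer_kwargs are not supported in model_gen_options; they "
            "would be applied in the transformed space and behave unexpectedly."
        )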

@esantorella esantorella removed their assignment Aug 19, 2024
@esantorella
Contributor Author

I haven't been working on this. My preferred solution is

not constructing opt_options here, and instead erroring when optimizer_kwargs are present in model_gen_options.

That may allow for more simplification, in that optimizer_kwargs will no longer need to be passed around, and the only allowed key for model_gen_options will then be acqf_kwargs:

# current
GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={
        "model_gen_options": {"acqf_kwargs": {"eta": 1e-2}},
    },
)
# new syntax after removing support for optimizer_kwargs
GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={"acqf_kwargs": {"eta": 1e-2}},
)

