I wanted to use nonlinear inequality constraints, which seems like it should be doable because BoTorch's optimize_acqf supports nonlinear inequality constraints. However, optimize_acqf happens after Ax has applied transforms, so arguments such as nonlinear_inequality_constraints and batch_initial_conditions operate in the transformed space, causing surprising behavior.
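For concreteness, here is a minimal sketch (hypothetical bounds and constraint, not Ax's actual transform code) of how a constraint written for the raw parameter space goes wrong once a unit-cube normalization has been applied:

```python
import torch

# A toy Ax-style normalization: parameters are mapped to [0, 1]
# before optimize_acqf runs.
lower, upper = 0.0, 10.0  # raw bounds for a single parameter

def normalize(x_raw: torch.Tensor) -> torch.Tensor:
    return (x_raw - lower) / (upper - lower)

# User intent: feasible iff x >= 4. BoTorch's convention for
# nonlinear_inequality_constraints is callable(x) >= 0 means feasible.
def constraint(x: torch.Tensor) -> torch.Tensor:
    return x[..., 0] - 4.0

x_raw = torch.tensor([6.0])
x_transformed = normalize(x_raw)  # 0.6

feasible_raw = constraint(x_raw)            # 2.0: feasible, as intended
feasible_after = constraint(x_transformed)  # ~ -3.4: wrongly "infeasible"
```

Because the optimizer only ever sees the transformed points, the user's constraint silently evaluates on the wrong scale.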
I suggest not surfacing arguments to optimize_acqf to the user, possibly with a few exceptions added as needed. Although some of the arguments can be helpful when constructed by Ax, almost all are nonsensical, redundant with Ax, or will behave surprisingly when passed by the user. Redundant arguments include acq_function, bounds, q, and inequality_constraints. return_best_only is nonsensical when used with Ax. And others, such as nonlinear_inequality_constraints and batch_initial_conditions, operate in the transformed space and thus are nearly impossible to use correctly without a detailed understanding of what Ax does under the hood. Users with such a detailed understanding might as well use BoTorch.
I think this can be achieved by not constructing opt_options here, and instead erroring when optimizer_kwargs are present in model_gen_options.
Similar considerations arise when passing arguments to the acquisition function constructors; see #2401. It would be great if we could automatically apply the transforms to the arguments that need them, but doing this in a generic fashion seems very challenging.
@esantorella, sounds like you were possibly working on this? I wonder if we should just error with UnsupportedError when these are passed in Ax, since it seems they will do nothing but confuse users. I understand that there may be a contrived case where we are not using any Ax transforms, but does that ever happen in reality? If not, I think we could just validate against this.
I haven't been working on this. My preferred solution is
not constructing opt_options here, and instead erroring when optimizer_kwargs are present in model_gen_options.
That may allow for more simplification, in that optimizer_kwargs will no longer need to be passed around, and the only allowed key for model_gen_options will then be acqf_kwargs:
# current
GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={
        "model_gen_options": {"acqf_kwargs": {"eta": 1e-2}},
    },
)

# new syntax after removing support for optimizer_kwargs
GenerationStep(
    model=Models.BOTORCH_MODULAR,
    num_trials=-1,
    model_gen_kwargs={"acqf_kwargs": {"eta": 1e-2}},
)
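The proposed erroring behavior could look something like this minimal sketch (validate_model_gen_options is a hypothetical helper, not an existing Ax function):

```python
class UnsupportedError(Exception):
    """Stand-in for Ax's UnsupportedError."""

def validate_model_gen_options(model_gen_options: dict) -> None:
    # Reject optimizer_kwargs outright rather than forwarding them to
    # optimize_acqf, where they would operate in the transformed space.
    if "optimizer_kwargs" in model_gen_options:
        raise UnsupportedError(
            "optimizer_kwargs are not supported in model_gen_options; "
            "arguments to optimize_acqf would operate in the transformed "
            "space and behave surprisingly."
        )

validate_model_gen_options({"acqf_kwargs": {"eta": 1e-2}})  # passes
```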
Referenced code: Ax/ax/models/torch/botorch_modular/utils.py, line 171 in d97a80e