
[Bug] fixed_features does not support negative indices #2602

Closed
slishak-PX opened this issue Oct 31, 2024 · 3 comments
@slishak-PX (Contributor)

🐛 Bug

If fixed_features has a negative index, the initial conditions will not be constructed with the correct reduced dimensionality.

To reproduce

**Code snippet to reproduce**

import torch
from botorch.acquisition import qLogExpectedImprovement
from botorch.fit import fit_gpytorch_mll
from botorch.models import SingleTaskGP
from botorch.models.transforms import Normalize
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

n_inputs = 4
n_outputs = 1
n_train = 256
n_test = 16
device = torch.device("cpu")

train_x = torch.rand(n_train, n_inputs, dtype=torch.float64, device=device)
train_y = torch.randn(n_train, n_outputs, dtype=torch.float64, device=device)

model = SingleTaskGP(train_x, train_y, input_transform=Normalize(n_inputs))

mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)

acqf = qLogExpectedImprovement(model, best_f=train_y.max())

bounds = torch.vstack([torch.zeros(1, n_inputs), torch.ones(1, n_inputs)])

candidates, value = optimize_acqf(
    acqf,
    bounds,
    q=2,
    num_restarts=4,
    raw_samples=8,
    fixed_features={-1: 0.5},
)

**Stack trace/error message**

Traceback (most recent call last):
  File "repro.py", line 27, in <module>
    candidates, value = optimize_acqf(
  File ".../lib/python3.10/site-packages/botorch/optim/optimize.py", line 547, in optimize_acqf
    return _optimize_acqf(opt_acqf_inputs)
  File ".../lib/python3.10/site-packages/botorch/optim/optimize.py", line 568, in _optimize_acqf
    return _optimize_acqf_batch(opt_inputs=opt_inputs)
  File ".../lib/python3.10/site-packages/botorch/optim/optimize.py", line 332, in _optimize_acqf_batch
    batch_candidates, batch_acq_values, ws = _optimize_batch_candidates()
  File ".../lib/python3.10/site-packages/botorch/optim/optimize.py", line 316, in _optimize_batch_candidates
    ) = opt_inputs.gen_candidates(
  File ".../lib/python3.10/site-packages/botorch/generation/gen.py", line 159, in gen_candidates_scipy
    clamped_candidates, batch_acquisition = gen_candidates_scipy(
  File ".../lib/python3.10/site-packages/botorch/generation/gen.py", line 251, in gen_candidates_scipy
    res = minimize_with_timeout(
  File ".../lib/python3.10/site-packages/botorch/optim/utils/timeout.py", line 83, in minimize_with_timeout
    return optimize.minimize(
  File ".../lib/python3.10/site-packages/scipy/optimize/_minimize.py", line 731, in minimize
    res = _minimize_lbfgsb(fun, x0, args, jac, bounds,
  File ".../lib/python3.10/site-packages/scipy/optimize/_lbfgsb_py.py", line 347, in _minimize_lbfgsb
    sf = _prepare_scalar_function(fun, x0, jac=jac, args=args, epsilon=eps,
  File ".../lib/python3.10/site-packages/scipy/optimize/_optimize.py", line 288, in _prepare_scalar_function
    sf = ScalarFunction(fun, x0, args, grad, hess,
  File ".../lib/python3.10/site-packages/scipy/optimize/_differentiable_functions.py", line 222, in __init__
    self._update_fun()
  File ".../lib/python3.10/site-packages/scipy/optimize/_differentiable_functions.py", line 294, in _update_fun
    fx = self._wrapped_fun(self.x)
  File ".../lib/python3.10/site-packages/scipy/optimize/_differentiable_functions.py", line 20, in wrapped
    fx = fun(np.copy(x), *args)
  File ".../lib/python3.10/site-packages/scipy/optimize/_optimize.py", line 79, in __call__
    self._compute_if_needed(x, *args)
  File ".../lib/python3.10/site-packages/scipy/optimize/_optimize.py", line 73, in _compute_if_needed
    fg = self.fun(x, *args)
  File ".../lib/python3.10/site-packages/botorch/generation/gen.py", line 208, in f_np_wrapper
    loss = f(X_fix).sum()
  File ".../lib/python3.10/site-packages/botorch/generation/gen.py", line 249, in f
    return -acquisition_function(x)
  File ".../lib/python3.10/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File ".../lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File ".../lib/python3.10/site-packages/botorch/acquisition/fixed_feature.py", line 155, in forward
    X_full = self._construct_X_full(X)
  File ".../lib/python3.10/site-packages/botorch/acquisition/fixed_feature.py", line 192, in _construct_X_full
    raise ValueError(
ValueError: Feature dimension d' (4) of input must be d - d_f (3).

Expected Behavior

A negative index should be treated as indexing from the end of the feature dimension (so -1 refers to the last feature), or the docs should make it clear that negative indices are not allowed.

System information

Please complete the following information:

  • BoTorch Version 0.12.0
  • GPyTorch Version 1.13
  • PyTorch Version 2.5.1+cu124
  • Computer OS: Linux

Additional context

This line is one reason why it doesn't work; I don't know yet whether there are other places where the indices are assumed to be non-negative. A small illustration follows the quoted line below.

unfixed_indices = sorted(set(range(d)) - set(sorted_keys))
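
A minimal standalone sketch (not BoTorch code) of why a negative key slips through the set arithmetic above: -1 is not an element of range(d), so it is never removed and the problem keeps the full dimensionality d instead of d - d_f.

# Sketch: a negative fixed-feature key is never subtracted out (d = 4 here).
d = 4
fixed_features = {-1: 0.5}

sorted_keys = sorted(fixed_features)  # [-1]
unfixed_indices = sorted(set(range(d)) - set(sorted_keys))
print(unfixed_indices)  # [0, 1, 2, 3] -- still 4 dims; -1 was not removed

# With the equivalent non-negative key, the last dimension is dropped as expected:
unfixed_indices = sorted(set(range(d)) - {3})
print(unfixed_indices)  # [0, 1, 2]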

@slishak-PX slishak-PX added the bug Something isn't working label Oct 31, 2024
@esantorella esantorella self-assigned this Oct 31, 2024
esantorella added a commit to esantorella/botorch that referenced this issue Oct 31, 2024
Summary:
Context: See pytorch#2602

This PR:
* Adds a check for negative fixed_features keys to input validation for optimizers. This applies to all of the optimizers that take fixed_features.
* Updates docstrings

Differential Revision: D65272024
@esantorella (Member)

Thanks for reporting. I put in #2603 to update docstrings to clarify that negative indices are not allowed and raise an exception if they are provided.
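
For illustration, a hypothetical sketch of what such an input-validation guard could look like; the actual check added in #2603 may be named and placed differently.

# Hypothetical validation guard (illustrative only, not the #2603 implementation).
def _validate_fixed_features(fixed_features, d):
    """Raise if any fixed_features key is outside [0, d - 1]."""
    if fixed_features is None:
        return
    for key in fixed_features:
        if not 0 <= key < d:
            raise ValueError(
                f"All `fixed_features` keys must be integers in [0, {d - 1}]; got {key}."
            )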

@Balandat (Contributor)

I feel like it shouldn't be too hard to allow this by canonicalizing the indices to index % num_features? Maybe we can make a backlog task for this?
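
A rough sketch of that canonicalization, assuming d is the full feature dimension; the helper name and placement are hypothetical, not BoTorch API.

# Hypothetical helper mapping negative keys onto their non-negative equivalents.
def _canonicalize_fixed_features(fixed_features, d):
    canonical = {}
    for key, value in fixed_features.items():
        if not -d <= key < d:
            raise ValueError(f"fixed_features key {key} is out of range for d={d}.")
        canonical[key % d] = value  # e.g. -1 -> d - 1
    return canonical

# Example: with d=4, {-1: 0.5} becomes {3: 0.5}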

facebook-github-bot pushed a commit that referenced this issue Oct 31, 2024
…2603)

Summary:
Pull Request resolved: #2603

Context: See #2602

This PR:
* Adds a check for negative fixed_features keys to input validation for optimizers. This applies to all of the optimizers that take fixed_features.
* Updates docstrings

Reviewed By: Balandat

Differential Revision: D65272024

fbshipit-source-id: f9da998a7308390358d22c768093685c587b664c
@esantorella (Member)

To keep things organized, I opened #2605 for that feature request. It would be a good task for a newcomer.
