
Add Inequality Constraints to SEBO #2938

Closed
wants to merge 11 commits

Conversation

CompRhys
Contributor

@CompRhys CompRhys commented Oct 22, 2024

Following pytorch/botorch#2588 being merged, there is no longer any reason to prevent people from using inequality constraints with SEBO. This PR removes the ValueError check and adjusts the tests accordingly.

Connected to #2790

@facebook-github-bot facebook-github-bot added the CLA Signed label Oct 22, 2024
Contributor

@Balandat Balandat left a comment

Thanks - if you don't mind fixing that one small linter nit the rest lgtm!

Review thread on ax/models/torch/tests/test_sebo.py (outdated, resolved)
Contributor

@Balandat Balandat left a comment

Actually there are a couple of other lint failures as well: https://github.com/facebook/Ax/actions/runs/11468956837/job/31951921724?pr=2938

@codecov-commenter

codecov-commenter commented Oct 23, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 95.82%. Comparing base (81bb9de) to head (f5ba7fd).
Report is 3 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2938      +/-   ##
==========================================
- Coverage   95.82%   95.82%   -0.01%     
==========================================
  Files         503      503              
  Lines       50792    50806      +14     
==========================================
+ Hits        48671    48684      +13     
- Misses       2121     2122       +1     


@CompRhys
Contributor Author

Merged the changes from #2935 and then fixed the flake8 complaints. I went ahead and fixed everything flake8 complained about in the repo, rather than just the files I touched.

I'm so used to relying on pre-commit that I always forget to lint manually when there isn't a hook.

@facebook-github-bot
Contributor

@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

Contributor

@Balandat Balandat left a comment

Actually, you will want to make sure that you take the inequality constraints into account in get_batch_initial_conditions() when you call it inside _optimize_with_homotopy()

@CompRhys
Contributor Author

The naive solution would be to set it to None when inequality constraints are given, which would cause it to fall back to the default behavior of botorch's gen_batch_initial_conditions. Would that be an acceptable solution here? Whether it is okay depends on how important the additional sampling around the Pareto front is to performance (ping @dme65, as this was changed in the last few days).

Adding something more involved would require a bit more thought.
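As a sketch of that naive fallback (the function and argument names below are hypothetical, not the actual Ax API), the idea is simply:

```python
# Illustrative sketch only: names here are hypothetical, not the real Ax API.
def choose_batch_initial_conditions(pareto_points, inequality_constraints):
    """Return starting points for acquisition optimization, or None."""
    if inequality_constraints:
        # Returning None defers to botorch's gen_batch_initial_conditions,
        # which knows how to draw raw samples that respect the constraints.
        return None
    # Otherwise keep the SEBO behavior of seeding from the Pareto front.
    return pareto_points
```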

@souravdey94

Is SEBO available with parameter constraints in the latest version of Ax?

@CompRhys
Contributor Author

No, this was closed because I did a force-push and it broke the connection to this repo. I hope to get this sorted this week.

@CompRhys CompRhys reopened this Nov 20, 2024
@CompRhys
Contributor Author

Actually, you will want to make sure that you take the inequality constraints into account in get_batch_initial_conditions() when you call it inside _optimize_with_homotopy()

@Balandat Finally getting back to this. In order to reduce the behaviour change and stay backward compatible for people using SEBO + L0, I added pytorch/botorch#2636 to botorch; we can then use the botorch method to sample with constraints and concatenate the points from the Pareto front to that.

With the L1 setup, I believe that changing the behaviour to also make use of the Pareto points + noise should help there, so I shifted the logic such that the same initial batch conditions are used for both the L1 and L0 choices.
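A minimal, torch-free sketch of that combined scheme (all names here are illustrative; the real code draws the feasible samples via botorch's constrained initializers):

```python
import random

def combine_initial_conditions(feasible_samples, pareto_points, noise_scale=0.01):
    """Concatenate constraint-feasible random samples with jittered Pareto points.

    Illustrative only: in the actual PR, feasible_samples come from botorch's
    constrained sampling, and the Pareto front is perturbed with Gaussian noise.
    """
    jittered = [
        [x + random.gauss(0.0, noise_scale) for x in point]
        for point in pareto_points
    ]
    return feasible_samples + jittered
```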

@CompRhys
Contributor Author

CompRhys commented Dec 2, 2024

The failing tests here are due to the PR now depending on pytorch/botorch#2636

facebook-github-bot pushed a commit to pytorch/botorch that referenced this pull request Dec 3, 2024
(#2610)

Summary:
## Motivation

See #2609

### Have you read the [Contributing Guidelines on pull requests](https://github.com/pytorch/botorch/blob/main/CONTRIBUTING.md#pull-requests)?

Yes

Pull Request resolved: #2610

Test Plan:
Basic testing of the code is easy; the challenge is working out what the knock-on implications might be: will this break people's code?

## Related PRs

facebook/Ax#2938

Reviewed By: Balandat

Differential Revision: D66102868

Pulled By: saitcakmak

fbshipit-source-id: b3491581a205b0fbe62edd670510e95f13e08177
facebook-github-bot pushed a commit to pytorch/botorch that referenced this pull request Dec 6, 2024
Summary:
## Motivation

In order to get facebook/Ax#2938 over the line with initial candidate generation that obeys the constraints, we want to use the existing tooling within `botorch`. The hard-coded logic currently in Ax uses top-k to downselect the Sobol samples. To make a change there that will not impact existing users, we then need to implement top-k downselection in `botorch`.
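The top-k downselection being described can be sketched in plain Python (illustrative only; the actual botorch helper operates on tensors of acquisition values):

```python
def topk_downselect(samples, acq_values, k):
    """Keep the k candidate samples with the highest acquisition values."""
    order = sorted(range(len(samples)), key=lambda i: acq_values[i], reverse=True)
    return [samples[i] for i in order[:k]]
```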

Pull Request resolved: #2636

Test Plan:
TODO:
- [x] add tests for initialize_q_batch_topk

## Related PRs

facebook/Ax#2938. (#2610 was initially intended to be part of this solution, but then I realized that the pattern I wanted to use was conflating repeats and the batch dimension.)

Reviewed By: Balandat

Differential Revision: D66413947

Pulled By: saitcakmak

fbshipit-source-id: 39e71f5cc0468d554419fa25dd545d9ee25289dc
@CompRhys
Contributor Author

CompRhys commented Dec 9, 2024

@saitcakmak @Balandat, I think this is now ready to be upstreamed now that the related botorch PRs have merged. It should make no difference to existing users of L0 SEBO, but there is a difference for anyone using L1 SEBO, as I moved the initial candidate selection based on the Pareto front to apply to both variants. I haven't benchmarked that change, but it seemed strange to me not to give already-sparse solutions to the L1 optimizer if doing so was deemed necessary for good L0 performance.

@CompRhys
Contributor Author

@saitcakmak I would really appreciate it if this could be looked at/merged before the holidays. The requested change is out of date now and can be cleared based on the work done in pytorch/botorch#2636, which lets us use hit-and-run polytope sampling so that the random samples satisfy the constraints.

Contributor

@saitcakmak saitcakmak left a comment

Hi @CompRhys. Sorry, I haven't been following this PR, since I'm not super familiar with SEBO and @Balandat had reviewed it earlier. Let me import it and get some eyes on it.

Review thread on ax/models/torch/botorch_modular/sebo.py (outdated, resolved)
@facebook-github-bot
Contributor

@saitcakmak has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

Review threads on ax/models/torch/botorch_modular/sebo.py (outdated, resolved)
@Balandat
Contributor

Sorry about the delay here @CompRhys, I'm at NeurIPS, so it's been a bit hectic. Sait can help get this over the finish line.

Review thread on ax/models/torch/botorch_modular/sebo.py (outdated, resolved)
ssd = search_space_digest
bounds = _tensorize(ssd.bounds).t()

optimizer_options["batch_initial_conditions"] = get_batch_initial_conditions(
Contributor

Should this be set on optimizer_options_with_defaults? We now construct optimizer_options_with_defaults but don't pass it down to the optimizers.

I think this will also address the pyre failure:
fbcode/ax/models/torch/botorch_modular/sebo.py:246:8 Undefined attribute [16]: Optional type has no attribute __setitem__.

Suggested change:
- optimizer_options["batch_initial_conditions"] = get_batch_initial_conditions(
+ optimizer_options_with_defaults["batch_initial_conditions"] = get_batch_initial_conditions(

Contributor Author

For this, it looks like there is a pyre pre-commit hook available, so I will look at adding it to the pre-commit hooks: https://github.com/facebook/pyre-check

Review thread on ax/models/torch/botorch_modular/sebo.py (outdated, resolved)
Review threads on ax/models/torch/botorch_modular/sebo.py (resolved)
@facebook-github-bot
Contributor

@saitcakmak has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@saitcakmak merged this pull request in b259f1a.

Labels: CLA Signed, Merged
6 participants