
Nonlinear constraints to BoFire (integration with acqf + tests + tutorials)#740

Open
MrzvskK wants to merge 29 commits into experimental-design:main from MrzvskK:nonlinearC

Conversation

@MrzvskK
Contributor

@MrzvskK MrzvskK commented Mar 4, 2026

Motivation

This is an early attempt, and I am sure a few changes will be needed along the way, but here is a start. Nonlinear constraints (NonlinearInequalityConstraint and NonlinearEqualityConstraint) were defined in BoFire's domain model but were not integrated into the optimization pipeline: the constraints were extracted from the domain but never passed to BoTorch's optimize_acqf, so the optimizer ignored them and proposed infeasible candidates.

This PR:

  1. Fixes the constraint integration pipeline in acqf_optimization.py to properly pass constraints to BoTorch
  2. Corrects a critical bug where the is_equality flag was inverted (all constraints were treated as equality constraints)
  3. Enhances torch_tools.py to properly format constraints for BoTorch compatibility
  4. Adds feasibility verification in botorch.py to ensure proposed candidates satisfy all constraints
  5. Provides tests and a tutorial demonstrating usage
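As a minimal, torch-free sketch of the bug fixed in point 2 (the class and field names below are illustrative stand-ins, not BoFire's actual data model):

```python
# Minimal sketch of the constraint hand-off described above.
# Class names are illustrative stand-ins for BoFire's data model.

class NonlinearInequalityConstraint:
    is_equality = False

class NonlinearEqualityConstraint:
    is_equality = True

def split_constraints(constraints):
    """Partition constraints the way the optimizer needs them.

    The bug fixed by this PR: the is_equality flag was read inverted,
    so every constraint ended up in the equality bucket.
    """
    inequalities = [c for c in constraints if not c.is_equality]
    equalities = [c for c in constraints if c.is_equality]
    return inequalities, equalities

ineq, eq = split_constraints(
    [NonlinearInequalityConstraint(), NonlinearEqualityConstraint()]
)
```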

Have you read the Contributing Guidelines on pull requests?

Yes

Have you updated CHANGELOG.md?

I will do that later

New Test Suite

Created tests/bofire/strategies/test_nonlinear_constraints.py with 16 comprehensive test cases:

Constraint Types Tested:

  • Single and multiple inequality constraints (2-5 simultaneous)
  • Equality constraints
  • Mixed equality/inequality constraints

Optimization Scenarios:

  • Different acquisition functions (qLogEI, qLogNEI, qUCB)
  • Tight constraints (small feasible regions)
  • Boundary conditions (initial data on constraint manifold)
  • Single-variable constraints
  • Empty feasible regions (proper error handling)

Results:

  • All 16 nonlinear constraint tests passing
  • 352/354 total strategy tests passing (99.4% pass rate)
  • All integration tests passing

Tutorial

Created docs/tutorials/basic_examples/nonlinear_constraints_basic_usage.py demonstrating:

  • Competing reactions optimization with nonlinear constraints
  • Constraint definition using string expressions
  • Initial feasible sampling
  • Multi-objective optimization with constraints
  • Constraint verification utilities
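As a rough, hypothetical illustration of the string-expression idea (this helper is not BoFire's actual implementation), a constraint expression can be evaluated against named inputs like this:

```python
def evaluate_expression(expression: str, row: dict) -> float:
    """Evaluate a constraint expression against named inputs.

    By the usual convention, a value <= 0 means the row is feasible.
    """
    # Restrict eval to the input variables only (no builtins).
    return eval(expression, {"__builtins__": {}}, dict(row))

g = evaluate_expression("x1**2 + x2**2 - 1", {"x1": 0.5, "x2": 0.5})
print(g)  # -0.5: inside the unit circle, so feasible
```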

Verification Steps

To verify the changes work:

# Run nonlinear constraint tests
pytest tests/bofire/strategies/test_nonlinear_constraints.py -v
 
# Run full strategy test suite
pytest tests/bofire/strategies/ -v
 
# Run tutorial examples (they also exist as .qmd; after review we can delete the .py files)
python docs/tutorials/advanced_examples/nonlinear_advanced.py
python docs/tutorials/advanced_examples/nonlinear_constraints_maximizing_yield.py
 

@LukasHebing
Contributor

Hi @MrzvskK,
nonlinear constraints are not handled by the standard BoFire optimizer, but the GA handles those:
https://experimental-design.github.io/bofire/docs/tutorials/advanced_examples/genetic_algorithm.html

Does this help?

@MrzvskK
Contributor Author

MrzvskK commented Mar 5, 2026

@LukasHebing, this PR adds functionality to the BotorchOptimizer to handle nonlinear constraints.

@LukasHebing
Contributor

@LukasHebing, this PR adds functionality to the BotorchOptimizer to handle nonlinear constraints.

Ok, great :)
I just wanted to make sure that this is not overlooked

@jduerholt jduerholt self-requested a review March 5, 2026 14:04
@MrzvskK
Contributor Author

MrzvskK commented Mar 17, 2026

I fixed most of the issues, except for the last failing test, which seems to be related to the conda setup.

@MrzvskK
Contributor Author

MrzvskK commented Mar 20, 2026

@jduerholt Hi, I fixed all the bugs; it should be ready to merge.

@MrzvskK
Contributor Author

MrzvskK commented Apr 10, 2026

@jduerholt Hello, a quick reminder: I wanted to check whether you approve the changes.

@jduerholt
Contributor

Ah sorry, I did not see that the tests are now passing. I will review over the weekend or at the beginning of next week ;)

@MrzvskK
Contributor Author

MrzvskK commented Apr 17, 2026

@jduerholt kind reminder, did you have time to take a look at the PR?

@jduerholt
Contributor

@jduerholt kind reminder, did you have time to take a look at the PR?

Sorry, too much travelling. I will do it over the weekend or on Monday at the latest.

Contributor

@jduerholt jduerholt left a comment

Hi @MrzvskK,

thank you very much for the PR. For now I went over the actual implementation (ignoring the tests) and provided inline feedback. Regarding the botorch-based suggestions, I also created an issue there: meta-pytorch/botorch#3280

Best,

Johannes

return hessian_expression

def __call__(self, experiments: pd.DataFrame) -> pd.Series:
def __call__(
Contributor

From my perspective, this is highly botorch-dependent code, so I propose moving it to utils/torch_tools.py, where the other constraints are also prepared to be botorch-ready. Then we also do not need the option here to pass torch tensors into the function etc., and it is much cleaner to read. You can look there at how the ProductConstraint is built up.


type: Literal["NonlinearEqualityConstraint"] = "NonlinearEqualityConstraint"

def is_fulfilled(self, experiments: pd.DataFrame, tol: float = 1e-6) -> pd.Series:
Contributor

Honestly, I would not support these directly out of the box in BoFire, as they are not naturally supported in botorch. If somebody wants to do something like this, they should do the breakdown into individual nonlinear constraints by hand.
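The by-hand breakdown suggested here is the standard trick: an equality h(x) = 0 becomes the pair of inequalities h(x) <= tol and -h(x) <= tol. A minimal, torch-free sketch (names are illustrative):

```python
def equality_to_inequalities(h, tol=1e-6):
    """Rewrite h(x) == 0 as two inequality callables g(x) <= 0."""
    return [
        lambda x: h(x) - tol,   # enforces h(x) <= tol
        lambda x: -h(x) - tol,  # enforces h(x) >= -tol
    ]

h = lambda x: x[0] + x[1] - 1.0  # feasible iff x0 + x1 == 1
g1, g2 = equality_to_inequalities(h)
point = [0.4, 0.6]
print(all(g(point) <= 0 for g in (g1, g2)))  # True: point satisfies the equality
```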

constraints.NonlinearEqualityConstraint,
]:
return False
return True # was False
Contributor

I would only support the NonlinearInequalityConstraint and let the rest be handled by the user.

if self.local_search_config is not None:
if has_local_search_region(domain) is False:
warnings.warn(
logger.info(
Contributor

Why this change from warning to logger? The logging mechanism in BoFire is not well implemented, so I would like to keep it as a warning for now. We need an overhaul there ...

Contributor Author

sorry, I forgot to revert it.

Contributor Author

will do that now.

)
# Use object.__setattr__ to bypass Pydantic's frozen model behavior
object.__setattr__(self, "batch_limit", 1)
if self.n_restarts != 1:
Contributor

I do not think that we need n_restarts == 1; it should also work with more restarts, but it will run them sequentially. So this can go, I think.

return OptimizerEnum.OPTIMIZE_ACQF_MIXED
return OptimizerEnum.OPTIMIZE_ACQF_MIXED_ALTERNATING

def _get_nonlinear_constraint_setup(
Contributor

Hmm, I do not understand why this is necessary. Currently we use our random strategy as the initial candidate generator, and the random strategy can already handle nonlinear constraints via rejection sampling. Why not keep it as is?

Contributor Author

The motivation for _get_nonlinear_constraint_setup is that once we pass nonlinear_inequality_constraints to BoTorch, it validates feasibility in tensor space with its own convention/tolerances, and ICs produced via BoFire rejection sampling can still be rejected there. If we decide not to pass nonlinear constraints into BoTorch and rely purely on BoFire’s rejection sampling, then I agree we can drop most of this setup and keep the implementation simpler. I will double check the RandomStrategy, since it might be doing the same thing.
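To illustrate the tolerance mismatch described above in the abstract (the numeric tolerances below are made up for illustration, not the actual values used by either library):

```python
def is_feasible(g_value, tol):
    """Feasible iff g(x) <= tol under a given tolerance."""
    return g_value <= tol

g_value = 5e-7        # a point sitting just outside g(x) <= 0
sampler_tol = 1e-6    # rejection sampler's tolerance (illustrative)
optimizer_tol = 1e-8  # downstream feasibility check's tolerance (illustrative)

print(is_feasible(g_value, sampler_tol))    # True: the sampler accepts it
print(is_feasible(g_value, optimizer_tol))  # False: the stricter check rejects it
```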

Contributor Author

I will explore get_initial_conditions_generator

domain, constraint=LinearEqualityConstraint
)
if len(nonlinear_constraints := get_nonlinear_constraints(domain)) == 0:

Contributor

I also do not understand this; it should all be handled by the original implementation, which uses the random strategy as the generator for feasible candidates.

def has_sufficient_experiments(
self,
) -> bool:
"""Check if sufficient feasible experiments are available.
Contributor

Hmm, I am not convinced that this is a good idea. From my perspective, there is no need for past/historical experiments that were only used to train the surrogate models to fulfill the constraints. We often have use cases where new constraints are added over the course of a project, and this should keep working.

What was your intention in adding this?

Contributor Author

I can revert this change.

If the domain is compatible with polytope sampling, it uses the polytope sampling to generate
candidate samples. Otherwise, it performs rejection sampling by repeatedly generating candidate
samples until the desired number of valid samples is obtained.
def _ask(self, candidate_count: int) -> pd.DataFrame:
Contributor

Hmm, I do not get why this change was necessary, can you elaborate a bit?

# The constraint should evaluate to <= 0
def make_constraint_callable(c):
def constraint_fn(x: Tensor) -> Tensor:
# c.__call__ expects x with shape (batch_size, n_features)
Contributor

I would move all the logic for handling torch tensors here.
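For reference, BoTorch's nonlinear_inequality_constraints treat a point as feasible when the callable evaluates to >= 0, while the hunk above uses the g(x) <= 0 convention, so the wrapper has to flip the sign. A torch-free sketch of that conversion (names are illustrative):

```python
def to_botorch_convention(g):
    """Wrap a constraint that is feasible iff g(x) <= 0 as a
    BoTorch-style callable that is feasible iff callable(x) >= 0."""
    def constraint_fn(x):
        return -g(x)
    return constraint_fn

g = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0  # unit-circle constraint, g <= 0
fn = to_botorch_convention(g)
print(fn([0.5, 0.5]))  # 0.5: positive, so feasible under BoTorch's convention
```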
