
feat: model based bracket optimizers #181

Merged: 3 commits into master from feat-model-based-bracket-optimizers on Jan 27, 2025

Conversation

@eddiebergman (Contributor) commented Jan 24, 2025

Re-introduces model-based sampling for the various bracket optimizers we have, but using the BoTorch GPs which back our BO. This should make it easier to later incorporate BO for graphs, or even multi-objective GPs and acquisition functions, along with all their performance benefits.

Should be able to support:

We treat 10 · z_max as the total budget (in epochs) exhausted during multi-fidelity optimization, after which a GP model is activated for the search. The model is fit on all observations made during the optimization, across all available fidelities; that is, the fidelity is an extra dimension modelled along with the search space. During acquisition, since the fidelity z at which the new sample will be evaluated is known from the optimization state, a two-step optimization is performed when maximizing EI. In the first step, a set of configurations (10 in our experiments) is selected for fidelity z through Monte Carlo estimates of Equation 6, returning configurations likely to improve over the best configuration found so far at z. Following this, the EI score is calculated for the selected set of configurations using Equation 6 but with z = z_max; at this stage, y_min is chosen to be the best score obtained across all observations. The idea is to choose a configuration that is likely to maximize performance at the fidelity level where it is being queried and is also likely to improve at the target fidelity z_max.
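
A rough sketch of that two-step acquisition using plain BoTorch primitives is below, purely for illustration: it is not the actual NePS/GPSampler code, the function and variable names are made up, maximization is assumed, and analytic EI stands in for the Monte Carlo estimates mentioned above.

import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from gpytorch.mlls import ExactMarginalLogLikelihood


def two_step_ei(train_x, train_y, candidates, z, z_max, n_select=10):
    """Illustrative two-step EI selection (hypothetical helper, not NePS API).

    train_x: (n, d+1) observed configs with the fidelity as the last column.
    train_y: (n, 1) observed scores (maximization assumed).
    candidates: (m, d) candidate configs, without the fidelity column.
    """
    # One GP over (config, fidelity): the fidelity is just an extra input dimension.
    gp = SingleTaskGP(train_x, train_y)
    fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

    # Step 1: score candidates with EI at the fidelity z they would actually run at,
    # taking the incumbent as the best observation made at z so far.
    fid_z = torch.full((len(candidates), 1), float(z), dtype=candidates.dtype)
    best_at_z = train_y[train_x[:, -1] == z].max()  # assumes something was observed at z
    ei_at_z = ExpectedImprovement(gp, best_f=best_at_z)(
        torch.cat([candidates, fid_z], dim=-1).unsqueeze(1)
    )
    shortlist = candidates[ei_at_z.topk(min(n_select, len(candidates))).indices]

    # Step 2: re-score the shortlist with EI at z_max, with the incumbent now the
    # best score across *all* observations, and return the winner.
    fid_max = torch.full((len(shortlist), 1), float(z_max), dtype=candidates.dtype)
    ei_at_zmax = ExpectedImprovement(gp, best_f=train_y.max())(
        torch.cat([shortlist, fid_max], dim=-1).unsqueeze(1)
    )
    return shortlist[ei_at_zmax.argmax()]

The point of the second step is the re-ranking at z_max: EI at the cheap fidelity proposes a shortlist, and EI at the target fidelity picks from it.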
Not sure how flexible we want the actual GPSampler to be.


Also made passing custom optimizers a bit more explicit, as this is not a regular user-facing feature.

NOTE: The function below is too simple and the BO part solves it almost instantly on the first try, so you'll start getting numerical error warnings.

from __future__ import annotations

import neps

# logging.basicConfig(level=logging.INFO)


# Toy objective (see the NOTE above): the BO part solves it almost immediately,
# which is where the numerical error warnings come from.
def evaluate_pipeline(float1, float2, categorical, integer1, integer2):
    x1 = float1 * float2
    x2 = -int(categorical) * float1 / (float2 + 11)
    x3 = integer2 / 2
    x4 = -integer1
    return sum([x1, x2, x4]) * x3


space = neps.SearchSpace(
    {
        "float1": neps.Float(lower=0, upper=1),
        "float2": neps.Float(lower=-10, upper=10),
        "categorical": neps.Categorical(choices=[0, 1]),
        "integer1": neps.Integer(lower=0, upper=1),
        "integer2": neps.Integer(lower=1, upper=9, is_fidelity=True),
    }
)

kwargs = {
    "bracket_type": "hyperband",
    "eta": 3,
    "sample_prior_first": True,
    "sampler": "priorband",
    "early_stopping_rate": None,
    "bayesian_optimization_kick_in_point": 10,
    "device": None,
}
optimizer = neps.algorithms._bracket_optimizer(space, **kwargs)

neps.run(
    evaluate_pipeline=evaluate_pipeline,
    pipeline_space=space,
    optimizer=neps.algorithms.custom(
        name="example",
        optimizer=optimizer,
        kwargs=kwargs,
        initialized=True,
    ),
    root_directory="deleteme",
    overwrite_working_directory=True,
    max_evaluations_total=100,
)

frame, short = neps.status("deleteme", print_summary=True)
print(frame)  # noqa: T201
print(short)  # noqa: T201

@eddiebergman force-pushed the feat-model-based-bracket-optimizers branch from a602c66 to 75e12b7 on January 26, 2025 20:08
@eddiebergman changed the title from "feat: model based bracket optimizers (wip)" to "feat: model based bracket optimizers" on Jan 27, 2025
@eddiebergman merged commit 04c4cbe into master on Jan 27, 2025
15 checks passed
@eddiebergman deleted the feat-model-based-bracket-optimizers branch on January 27, 2025 09:37