Comments (4)
Hi Martin, I am closing this one as we have not heard from you. Feel free to reopen if you still have a question.
Hi Martin,
Most of the searchers in syne-tune do not support conditional search spaces.
The easiest way would be to write your own scheduler that enforces your constraint; here is one example:
import logging
from typing import Optional, List, Dict, Any

import numpy as np

from syne_tune.backend import PythonBackend
from syne_tune.backend.trial_status import Trial
from syne_tune.config_space import randint
from syne_tune.optimizer.scheduler import (
    TrialScheduler,
    SchedulerDecision,
    TrialSuggestion,
)
from syne_tune.stopping_criterion import StoppingCriterion
from syne_tune.tuner import Tuner
class ConditionalScheduler(TrialScheduler):
    def __init__(
        self, config_space: Dict[str, Any], metric: str, mode: Optional[str] = None
    ):
        super(ConditionalScheduler, self).__init__(config_space=config_space)
        self.metric = metric
        self.mode = mode if mode is not None else "min"
        self.sorted_results = []

    def _suggest(self, trial_id: int) -> Optional[TrialSuggestion]:
        # Sample a new candidate under an illustrative constraint:
        # if x >= 5, then y can only be 1, 2 or 3.
        x = self.config_space["x"].sample()
        if x >= 5:
            y = np.clip(self.config_space["y"].sample(), a_min=1, a_max=3)
        else:
            y = self.config_space["y"].sample()
        config = dict(x=x, y=y)
        return TrialSuggestion.start_suggestion(config)

    def on_trial_result(self, trial: Trial, result: Dict[str, Any]) -> str:
        # Given a new result, decide whether the trial should stop or continue.
        # Here we implement a naive strategy that stops a trial if its result is
        # worse than 80% of previous results. It is naive because it does not
        # account for the fact that trials improve with more steps.
        new_metric = result[self.metric]
        # Insert the new metric into the sorted list of previous results.
        index = np.searchsorted(self.sorted_results, new_metric)
        self.sorted_results = np.insert(self.sorted_results, index, new_metric)
        normalized_rank = index / float(len(self.sorted_results))
        if self.mode == "max":
            normalized_rank = 1 - normalized_rank
        if normalized_rank < 0.8:
            return SchedulerDecision.CONTINUE
        else:
            logging.info(
                f"got new result {new_metric} ranking at {normalized_rank * 100}%, "
                f"stopping the trial as it is not in the top 80%"
            )
            return SchedulerDecision.STOP

    def metric_names(self) -> List[str]:
        return [self.metric]
def f(x, y):
    from syne_tune import Reporter

    reporter = Reporter()
    reporter(error=(x / 10.0) ** 2 + (y / 10.0) ** 2)
if __name__ == "__main__":
    logging.getLogger().setLevel(logging.DEBUG)
    n_workers = 4
    config_space = {
        "x": randint(0, 10),
        "y": randint(0, 10),
    }
    trial_backend = PythonBackend(tune_function=f, config_space=config_space)
    scheduler = ConditionalScheduler(
        config_space=config_space, metric="error", mode="min",
    )
    stop_criterion = StoppingCriterion(max_wallclock_time=20)
    tuner = Tuner(
        trial_backend=trial_backend,
        scheduler=scheduler,
        stop_criterion=stop_criterion,
        n_workers=n_workers,
    )
    tuner.run()
It should be straightforward to adapt this example to ASHA or to other early-stopping rules; see also the related issue #743, which might help.
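For instance, here is a rough ASHA-flavoured sketch (not syne-tune's built-in ASHA): it extends the ConditionalScheduler above, but only makes stop/continue decisions at fixed rung levels of the resource, comparing a trial against its peers at the same rung. The rung levels, the resource attribute name "epoch" and the median rule are illustrative assumptions, and it presumes the objective reports intermediate results with an epoch-like resource field.

from collections import defaultdict

class ConditionalRungScheduler(ConditionalScheduler):
    def __init__(
        self,
        config_space: Dict[str, Any],
        metric: str,
        resource_attr: str = "epoch",  # assumed name of the reported resource
        rungs: tuple = (1, 3, 9),      # illustrative rung levels
        mode: Optional[str] = None,
    ):
        super().__init__(config_space=config_space, metric=metric, mode=mode)
        self.resource_attr = resource_attr
        self.rungs = set(rungs)
        # metric values recorded at each rung level
        self.rung_results = defaultdict(list)

    def on_trial_result(self, trial: Trial, result: Dict[str, Any]) -> str:
        resource = result[self.resource_attr]
        if resource not in self.rungs:
            # Between rungs, always let the trial continue.
            return SchedulerDecision.CONTINUE
        value = result[self.metric]
        if self.mode == "max":
            value = -value  # compare everything as "lower is better"
        peers = self.rung_results[resource]
        peers.append(value)
        # Stop the trial if it is strictly worse than the median of its rung.
        if len(peers) > 1 and value > float(np.median(peers)):
            return SchedulerDecision.STOP
        return SchedulerDecision.CONTINUE

Note that the toy objective f above reports only once, so this variant only makes sense for training functions that report once per epoch or similar.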
Hope this helps!
If you just need conditional random search, a simpler idea is to define several variables, like y_a: randint(0, 10) and y_b: randint(1, 3), and then, in the objective, select the right y variable depending on the value of x.
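A minimal sketch of that workaround, reusing the toy objective from above (the use of RandomSearch from syne_tune.optimizer.baselines is my choice here for illustration; any searcher would do, since both y variables are always active in the search space):

from syne_tune.backend import PythonBackend
from syne_tune.config_space import randint
from syne_tune.optimizer.baselines import RandomSearch
from syne_tune.stopping_criterion import StoppingCriterion
from syne_tune.tuner import Tuner


def f(x, y_a, y_b):
    from syne_tune import Reporter

    reporter = Reporter()
    # Select the right y depending on x: the restricted y_b when x >= 5.
    y = y_b if x >= 5 else y_a
    reporter(error=(x / 10.0) ** 2 + (y / 10.0) ** 2)


if __name__ == "__main__":
    config_space = {
        "x": randint(0, 10),
        "y_a": randint(0, 10),  # used when x < 5
        "y_b": randint(1, 3),   # used when x >= 5
    }
    tuner = Tuner(
        trial_backend=PythonBackend(tune_function=f, config_space=config_space),
        scheduler=RandomSearch(config_space, metric="error", mode="min"),
        stop_criterion=StoppingCriterion(max_wallclock_time=20),
        n_workers=4,
    )
    tuner.run()

The price of this trick is that the inactive variable is still sampled and logged, so two configs differing only in the inactive variable are duplicates from the objective's point of view.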
Thank you, sorry I completely forgot to reply!