
hyperopt-gpsmbo's Introduction

Hyperopt: Distributed Hyperparameter Optimization


Hyperopt is a Python library for serial and parallel optimization over awkward search spaces, which may include real-valued, discrete, and conditional dimensions.

Getting started

Install hyperopt from PyPI

pip install hyperopt

then run your first example:

# define an objective function
def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

# define a search space
from hyperopt import hp
space = hp.choice('a',
    [
        ('case 1', 1 + hp.lognormal('c1', 0, 1)),
        ('case 2', hp.uniform('c2', -10, 10))
    ])

# minimize the objective over the space
from hyperopt import fmin, tpe, space_eval
best = fmin(objective, space, algo=tpe.suggest, max_evals=100)

print(best)
# -> {'a': 1, 'c2': 0.01420615366247227}
print(space_eval(space, best))
# -> ('case 2', 0.01420615366247227)
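To build intuition for what fmin does, here is a simplified pure-Python random-search sketch over the same two-case space. This is not hyperopt's actual sampler; sample_space is a hypothetical stand-in for drawing from the hp.choice space above:

```python
import random

# The same objective as above: 'case 1' returns val, 'case 2' returns val ** 2.
def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

# Hypothetical stand-in for sampling from the search space above
# (NOT hyperopt's actual sampler -- just an illustration).
def sample_space(rng):
    if rng.random() < 0.5:
        return ('case 1', 1 + rng.lognormvariate(0, 1))
    return ('case 2', rng.uniform(-10, 10))

rng = random.Random(0)
samples = [sample_space(rng) for _ in range(100)]
best = min(samples, key=objective)
print(best)
```

Because 'case 1' always yields a value above 1 while 'case 2' can square a value near zero, random search lands in 'case 2'. TPE does the same job, but adaptively, concentrating trials in promising regions.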

Contributing

If you're a developer and wish to contribute, please follow these steps.

Setup

  1. Create an account on GitHub if you do not already have one.

  2. Fork the project repository: click on the ‘Fork’ button near the top of the page. This creates a copy of the code under your GitHub user account. For more details on how to fork a repository, see the GitHub forking guide.

  3. Clone your fork of the hyperopt repo from your GitHub account to your local disk:

    git clone https://github.com/<github username>/hyperopt.git
    cd hyperopt
  4. Create an environment with:
    $ python3 -m venv my_env (or $ python -m venv my_env)
    or with conda:
    $ conda create -n my_env python=3

  5. Activate the environment:
    $ source my_env/bin/activate
    or with conda:
    $ conda activate my_env

  6. Install dependencies for extras (you'll need these to run pytest). On Linux/UNIX:

    $ pip install -e '.[MongoTrials, SparkTrials, ATPE, dev]'

    or on Windows:

    pip install -e .[MongoTrials]
    pip install -e .[SparkTrials]
    pip install -e .[ATPE]
    pip install -e .[dev]
  7. Add the upstream remote. This saves a reference to the main hyperopt repository, which you can use to keep your repository synchronized with the latest changes:

    $ git remote add upstream https://github.com/hyperopt/hyperopt.git

    You should now have a working installation of hyperopt, and your git repository properly configured. The next steps now describe the process of modifying code and submitting a PR:

  8. Synchronize your master branch with the upstream master branch:

    git checkout master
    git pull upstream master
  9. Create a feature branch to hold your development changes:

    $ git checkout -b my_feature

    and start making changes. Always use a feature branch. It’s good practice to never work on the master branch!

  10. We recommend using Black (installed automatically in step 6) to format your code before submitting a PR.

  11. Ensure that git hooks are activated when you commit (PyCharm, for example, has an option to skip them). This can be done using pre-commit, which is installed automatically in step 6:

    pre-commit install

    This will run Black automatically on the files you modified whenever you commit, and the commit will fail if any files still need formatting. If Black does not run, format the files manually:

    black {source_file_or_directory}
  12. Develop the feature on your feature branch on your computer, using Git for version control. When you’re done editing, add the changed files using git add and then git commit:

    git add modified_files
    git commit -m "my first hyperopt commit"
  13. The tests for this project use PyTest and can be run by calling pytest.

  14. Record your changes in Git, then push the changes to your GitHub account with:

    git push -u origin my_feature

Note that dev dependencies require python 3.6+.
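The pytest step above (step 13) can be illustrated with a minimal test module. The file name and the objective are hypothetical, mirroring the quick-start example; run it with pytest test_objective.py:

```python
# test_objective.py -- a hypothetical test module for the
# quick-start objective function.

def objective(args):
    case, val = args
    if case == 'case 1':
        return val
    else:
        return val ** 2

def test_case_1_returns_val():
    # 'case 1' passes the value through unchanged
    assert objective(('case 1', 3.0)) == 3.0

def test_case_2_returns_val_squared():
    # 'case 2' squares the value, so the sign disappears
    assert objective(('case 2', -2.0)) == 4.0
```

pytest discovers any function whose name starts with test_, so no extra wiring is needed.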

Algorithms

Currently three algorithms are implemented in hyperopt:

  • Random Search
  • Tree of Parzen Estimators (TPE)
  • Adaptive TPE

Hyperopt has been designed to accommodate Bayesian optimization algorithms based on Gaussian processes and regression trees, but these are not currently implemented.

All algorithms can be parallelized in two ways, using:

  • Apache Spark
  • MongoDB

Documentation

Hyperopt documentation can be found here, but is partly still hosted on the wiki. Here are some quick links to the most relevant pages:

Related Projects

Examples

See projects using hyperopt on the wiki.

Announcements mailing list

Discussion mailing list

Cite

If you use this software for research, please cite the paper (http://proceedings.mlr.press/v28/bergstra13.pdf) as follows:

Bergstra, J., Yamins, D., Cox, D. D. (2013) Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures. Proc. of the 30th International Conference on Machine Learning (ICML 2013), June 2013, pp. I-115 to I-123.

Thanks

This project has received support from

  • National Science Foundation (IIS-0963668),
  • Banting Postdoctoral Fellowship program,
  • National Science and Engineering Research Council of Canada (NSERC),
  • D-Wave Systems, Inc.

hyperopt-gpsmbo's People

Contributors

jaberg


hyperopt-gpsmbo's Issues

AttributeError: 'DomainGP_EI' object has no attribute '_cost_deriv'

Hi @jaberg,
Thanks for the package! This extension on hyperopt can be very handy.

I was trying to run fmin with algo=ei.suggest and got the error message above. The full error message is as follows:

AttributeError                            Traceback (most recent call last)
~/anaconda/envs/ldsa/lib/python3.6/site-packages/hp_gpsmbo/hpsuggest_ei.py in init_fns(self)
     18         try:
---> 19             self._cost_deriv
     20         except AttributeError:

AttributeError: 'DomainGP_EI' object has no attribute '_cost_deriv'

During handling of the above exception, another exception occurred:

TypeError                                 Traceback (most recent call last)
<ipython-input-146-b79d81f8328e> in <module>()
      5                 #algo=ucb.suggest,
      6                 max_evals = 30,
----> 7                 trials=trials)
      8 rf_best

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hyperopt/fmin.py in fmin(fn, space, algo, max_evals, trials, rstate, allow_trials_fmin, pass_expr_memo_ctrl, catch_eval_exceptions, verbose, return_argmin)
    305             verbose=verbose,
    306             catch_eval_exceptions=catch_eval_exceptions,
--> 307             return_argmin=return_argmin,
    308         )
    309 

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hyperopt/base.py in fmin(self, fn, space, algo, max_evals, rstate, verbose, pass_expr_memo_ctrl, catch_eval_exceptions, return_argmin)
    633             pass_expr_memo_ctrl=pass_expr_memo_ctrl,
    634             catch_eval_exceptions=catch_eval_exceptions,
--> 635             return_argmin=return_argmin)
    636 
    637 

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hyperopt/fmin.py in fmin(fn, space, algo, max_evals, trials, rstate, allow_trials_fmin, pass_expr_memo_ctrl, catch_eval_exceptions, verbose, return_argmin)
    318                     verbose=verbose)
    319     rval.catch_eval_exceptions = catch_eval_exceptions
--> 320     rval.exhaust()
    321     if return_argmin:
    322         return trials.argmin

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hyperopt/fmin.py in exhaust(self)
    197     def exhaust(self):
    198         n_done = len(self.trials)
--> 199         self.run(self.max_evals - n_done, block_until_done=self.async)
    200         self.trials.refresh()
    201         return self

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hyperopt/fmin.py in run(self, N, block_until_done)
    155                                                   d['result'].get('status')))
    156                 new_trials = algo(new_ids, self.domain, trials,
--> 157                                   self.rstate.randint(2 ** 31 - 1))
    158                 assert len(new_ids) >= len(new_trials)
    159                 if len(new_trials):

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hp_gpsmbo/hpsuggest_ei.py in suggest(new_ids, domain, trials, seed, warmup_cutoff, n_buckshots, n_finetunes, stop_at, plot_contours, gp_fit_method, failure_loss, max_ei_thresh)
    200         n_buckshots=n_buckshots,
    201         n_finetunes=n_finetunes,
--> 202         rng=rng,
    203         )
    204     t1 = time.time()

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hp_gpsmbo/hpsuggest_ei.py in optimize_over_X(self, n_buckshots, n_finetunes, rng)
    137                                                 n_finetunes,
    138                                                 rng,
--> 139                                                 ret_raw=True)
    140             if len(self.gpr._params_list) == 1:
    141                 Ks = self._K_new(np.atleast_2d(rval_raw),

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hp_gpsmbo/hpsuggest.py in optimize_over_X(self, n_buckshots, n_finetunes, rng, ret_raw, ret_results)
    360         # -- sample a bunch of points
    361         buckshot = self.draw_n_feature_vecs(n_buckshots, rng)
--> 362         buckshot_crit = self.crit(buckshot)
    363         best_first = np.argsort(buckshot_crit)
    364         #print 'buckshot stats', buckshot_crit.min(), buckshot_crit.max()

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hp_gpsmbo/hpsuggest_ei.py in crit(self, X)
     95 
     96     def crit(self, X):
---> 97         self.init_fns()
     98         #return -self.gpr.logEI(X,
     99                                #self._EI_thresh,

~/anaconda/envs/ldsa/lib/python3.6/site-packages/hp_gpsmbo/hpsuggest_ei.py in init_fns(self)
     53                 on_unused_input='ignore',
     54                 allow_input_downcast=True,
---> 55                 profile=0)
     56             op_Kcond.use_lazy_cholesky = None
     57             op_Kcond.use_lazy_cholesky_idx = None

~/anaconda/envs/ldsa/lib/python3.6/site-packages/theano/compile/function.py in function(inputs, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input)
    315                    on_unused_input=on_unused_input,
    316                    profile=profile,
--> 317                    output_keys=output_keys)
    318     return fn

~/anaconda/envs/ldsa/lib/python3.6/site-packages/theano/compile/pfunc.py in pfunc(params, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input, output_keys)
    484                          accept_inplace=accept_inplace, name=name,
    485                          profile=profile, on_unused_input=on_unused_input,
--> 486                          output_keys=output_keys)
    487 
    488 

~/anaconda/envs/ldsa/lib/python3.6/site-packages/theano/compile/function_module.py in orig_function(inputs, outputs, mode, accept_inplace, name, profile, on_unused_input, output_keys)
   1839                   name=name)
   1840         with theano.change_flags(compute_test_value="off"):
-> 1841             fn = m.create(defaults)
   1842     finally:
   1843         t2 = time.time()

~/anaconda/envs/ldsa/lib/python3.6/site-packages/theano/compile/function_module.py in create(self, input_storage, trustme, storage_map)
   1713             theano.config.traceback.limit = theano.config.traceback.compile_limit
   1714             _fn, _i, _o = self.linker.make_thunk(
-> 1715                 input_storage=input_storage_lists, storage_map=storage_map)
   1716         finally:
   1717             theano.config.traceback.limit = limit_orig

~/anaconda/envs/ldsa/lib/python3.6/site-packages/theano/gof/link.py in make_thunk(self, input_storage, output_storage, storage_map)
    697         return self.make_all(input_storage=input_storage,
    698                              output_storage=output_storage,
--> 699                              storage_map=storage_map)[:3]
    700 
    701     def make_all(self, input_storage, output_storage):

~/anaconda/envs/ldsa/lib/python3.6/site-packages/theano/gof/vm.py in make_all(self, profiler, input_storage, output_storage, storage_map)
   1089                                                  compute_map,
   1090                                                  [],
-> 1091                                                  impl=impl))
   1092                 linker_make_thunk_time[node] = time.time() - thunk_start
   1093                 if not hasattr(thunks[-1], 'lazy'):

TypeError: ('The following error happened while compiling the node', <hp_gpsmbo.op_Kcond.LazyCholesky object at 0x1364166d8>(Elemwise{Composite{((i0 * exp((((-maximum(((i1 + i2) - i3), i4)) / i5) + (i6 * maximum(((i7 + i8) - i9), i4)) + log(i10)))) + (i11 * i12))}}[(0, 3)].0, reuse_cholesky, reuse_cholesky_idx), '\n', "make_thunk() got an unexpected keyword argument 'impl'") 

Any suggestions on what could be going wrong?

Much appreciated!
Miguel

Does this work?

It would be nice to see real Bayesian optimization and GPs in hyperopt. Does this package work? Do we need hyperopt installed? It would also be nice to have some examples in the readme. Thanks!
