
The maintenance of ZOOpt has shifted to https://github.com/polixir/ZOOpt . The new version is compatible with Ray.

ZOOpt


ZOOpt is a Python package for Zeroth-Order Optimization.

Zeroth-order optimization (a.k.a. derivative-free optimization/black-box optimization) does not rely on the gradient of the objective function, but instead learns from samples of the search space. It is suitable for optimizing functions that are non-differentiable, have many local minima, or are even unknown but testable.

ZOOpt implements some state-of-the-art zeroth-order optimization methods and their parallel versions. Users only need to add several keywords to use parallel optimization on a single machine. For large-scale distributed optimization across multiple machines, please refer to Distributed ZOOpt.

Documents: Tutorial of ZOOpt

Citation:

Yu-Ren Liu, Yi-Qi Hu, Hong Qian, Yang Yu, Chao Qian. ZOOpt: Toolbox for Derivative-Free Optimization. CoRR abs/1801.00329

(The features described in this article are from version 0.2.)

Installation

The easiest way to install ZOOpt is to run `pip install zoopt` in the terminal/command line.

Alternatively, to install ZOOpt from source, download this repository and run the following commands in your terminal/command line:

$ python setup.py build
$ python setup.py install
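
To verify the installation, you can import the package and inspect the installed version (`pip show` is a standard pip feature, not part of ZOOpt):

$ python -c "import zoopt"
$ pip show zoopt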

A simple example

We define the Ackley function for minimization (note that this function works for an arbitrary number of dimensions, determined by the solution):

import numpy as np

def ackley(solution):
    # Shifted Ackley function: the global minimum value 0 is reached when every x_i equals bias.
    x = solution.get_x()
    bias = 0.2
    value = -20 * np.exp(-0.2 * np.sqrt(sum([(i - bias) * (i - bias) for i in x]) / len(x))) - \
            np.exp(sum([np.cos(2.0 * np.pi * (i - bias)) for i in x]) / len(x)) + 20.0 + np.e
    return value
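
As an optional sanity check, the function can be evaluated at its optimum with a minimal stand-in object; `FakeSolution` below is a hypothetical helper used only for illustration and is not part of ZOOpt:

class FakeSolution:
    # Hypothetical stand-in that mimics the get_x() interface of a ZOOpt solution.
    def __init__(self, x):
        self._x = x

    def get_x(self):
        return self._x

print(ackley(FakeSolution([0.2] * 100)))  # prints a value numerically close to 0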

The Ackley function is a classical function with many local minima. In two dimensions, it looks like the following (image from Wikipedia):

(Figure: the Ackley function in two dimensions)
Then, use ZOOpt to optimize a 100-dimensional Ackley function:

from zoopt import Dimension, ValueType, Dimension2, Objective, Parameter, Opt, ExpOpt

dim_size = 100  # dimension size
dim = Dimension(dim_size, [[-1, 1]]*dim_size, [True]*dim_size)  # dim = Dimension2([(ValueType.CONTINUOUS, [-1, 1], 1e-6)]*dim_size)
obj = Objective(ackley, dim)
# perform optimization
solution = Opt.min(obj, Parameter(budget=100*dim_size))
# print the solution
print(solution.get_x(), solution.get_value())
# parallel optimization for time-consuming tasks
solution = Opt.min(obj, Parameter(budget=100*dim_size, parallel=True, server_num=3))
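
`Dimension2` describes the same kind of search space in terms of value types. Below is a minimal sketch of a mixed continuous/discrete space, assuming the `ValueType.CONTINUOUS` and `ValueType.DISCRETE` tuple formats described in the ZOOpt tutorial (exact formats may differ across versions):

from zoopt import Dimension2, ValueType, Objective, Opt, Parameter

# 10 continuous variables in [-1, 1] with precision 1e-6, plus
# 5 discrete variables in [-10, 10] (True marks them as ordered).
mixed_dim = Dimension2([(ValueType.CONTINUOUS, [-1, 1], 1e-6)] * 10 +
                       [(ValueType.DISCRETE, [-10, 10], True)] * 5)
mixed_obj = Objective(ackley, mixed_dim)
mixed_solution = Opt.min(mixed_obj, Parameter(budget=1500))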

The optimization finishes within a few seconds. Then, we can visualize the optimization progress:

import matplotlib.pyplot as plt
plt.plot(obj.get_history_bestsofar())
plt.savefig('figure.png')

The resulting figure looks like this:

(Figure: experiment results)
We can also use `ExpOpt` to repeat the optimization for performance analysis. It reports the mean and standard deviation over multiple runs and can automatically plot the optimization progress:

solution_list = ExpOpt.min(obj, Parameter(budget=100*dim_size), repeat=3,
                           plot=True, plot_file="progress.png")
for solution in solution_list:
    print(solution.get_x(), solution.get_value())

More examples are available in the example folder.

Releases

  • Add a parallel implementation of SRACOS, which accelerates the optimization by asynchronous parallelization.
  • Add a function that enables users to set a customized stopping criterion for the optimization.
  • Rewrite the documentation to make it easier to follow.
  • Add the noise handling strategies Re-sampling and Value Suppression (AAAI'18), and the subset selection method with noise handling PONSS (NIPS'17); a hedged parameter sketch follows this list.
  • Add the high-dimensionality handling method Sequential Random Embedding (IJCAI'16).
  • Rewrite the Pareto optimization method. Bugs fixed.
  • Include the general optimization methods RACOS (AAAI'16) and Sequential RACOS (AAAI'17), and the subset selection method POSS (NIPS'15).
  • The algorithm selection is automatic. See examples in the example folder.
  • Default parameters work well on many problems, while all parameters remain fully controllable.
  • Running speed optimized for Python.
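
As a rough illustration of how such features are enabled through `Parameter` keywords, here is a minimal sketch of the Re-sampling noise-handling setup; the keyword names (`noise_handling`, `resampling`, `resample_times`) follow the ZOOpt tutorial and may differ across versions, so treat them as assumptions and consult the documentation for your installed release:

from zoopt import Dimension, Objective, Opt, Parameter

dim_size = 100
dim = Dimension(dim_size, [[-1, 1]] * dim_size, [True] * dim_size)
obj = Objective(ackley, dim)  # in practice, a noisy objective would be used here
# Re-evaluate promising solutions several times to average out observation noise.
par = Parameter(budget=100 * dim_size, noise_handling=True,
                resampling=True, resample_times=10)
solution = Opt.min(obj, par)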

Distributed ZOOpt

Distributed ZOOpt consists of a server project and a client project. Details can be found in the Tutorial of Distributed ZOOpt.

