Dragonfly is an open source python library for scalable Bayesian optimisation.

Overview

Bayesian optimisation is used for optimising black-box functions whose evaluations are usually expensive. Beyond vanilla optimisation techniques, Dragonfly provides an array of tools to scale up Bayesian optimisation to expensive large scale problems. These include features/functionality that are especially suited for high dimensional optimisation (optimising for a large number of variables), parallel evaluations in synchronous or asynchronous settings (conducting multiple evaluations in parallel), multi-fidelity optimisation (using cheap approximations to speed up the optimisation process), and multi-objective optimisation (optimising multiple functions simultaneously).

Dragonfly is compatible with Python2 (>= 2.7) and Python3 (>= 3.5) and has been tested on Linux, macOS, and Windows platforms. For documentation, installation, and a getting started guide, see our readthedocs page. For more details, see our paper.

 

Installation

See here for detailed instructions on installing Dragonfly and its dependencies.

Quick Installation: If you have done this kind of thing before, you should be able to install Dragonfly via pip.

$ sudo apt-get install python-dev python3-dev gfortran # On Ubuntu/Debian
$ pip install numpy
$ pip install dragonfly-opt -v

Testing the Installation: You can import Dragonfly in python to test if it was installed properly. If you have installed via source, make sure that you move to a different directory to avoid naming conflicts.

$ python
>>> from dragonfly import minimise_function
>>> # The first argument below is the function, the second is the domain, and the third is the budget.
>>> min_val, min_pt, history = minimise_function(lambda x: x ** 4 - x**2 + 0.1 * x, [[-10, 10]], 10);  
...
>>> min_val, min_pt
(-0.32122746026750953, array([-0.7129672]))

Due to stochasticity in the algorithms, the above values for min_val, min_pt may be different. If you run it for longer (e.g. min_val, min_pt, history = minimise_function(lambda x: x ** 4 - x**2 + 0.1 * x, [[-10, 10]], 100)), you should get more consistent values for the minimum.

If the installation fails or if there are warning messages, see detailed instructions here.

 

Quick Start

Dragonfly can be used directly from the command line by calling dragonfly-script.py, or imported in python code via the maximise_function and minimise_function APIs in the main library, or driven in ask-tell mode. To help get started, we have provided some examples in the examples directory. See our readthedocs getting started pages (command line, Python, Ask-Tell) for examples and use cases.

Command line: Below is an example usage in the command line.

$ cd examples
$ dragonfly-script.py --config synthetic/branin/config.json --options options_files/options_example.txt

In Python code: The main APIs for Dragonfly are defined in dragonfly/apis. For their definitions and arguments, see dragonfly/apis/opt.py and dragonfly/apis/moo.py. You can import the main API in python code via,

from dragonfly import minimise_function, maximise_function
func = lambda x: x ** 4 - x**2 + 0.1 * x
domain = [[-10, 10]]
max_capital = 100
min_val, min_pt, history = minimise_function(func, domain, max_capital)
print(min_val, min_pt)
max_val, max_pt, history = maximise_function(lambda x: -func(x), domain, max_capital)
print(max_val, max_pt)

Here, func is the function to be optimised, domain is the domain over which it is to be optimised, and max_capital is the capital available for optimisation. The domain can be specified via a JSON file or in code. See the examples directory and our readthedocs pages for more detailed examples.
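
Specifying the domain in code for non-trivial cases goes through the load_config helper. The following is a minimal sketch based on the domain format described on our readthedocs pages; the variable names, bounds, and toy objective are purely illustrative, and the exact dictionary keys and the config keyword argument should be checked against the documentation.

from dragonfly import load_config, minimise_function

# Declare a mixed domain in code; each variable is described by a dictionary.
# (Keys follow the readthedocs domain documentation; names and bounds here
# are purely illustrative.)
domain_vars = [
    {'name': 'lr', 'type': 'float', 'min': 1e-5, 'max': 1e-1},
    {'name': 'num_layers', 'type': 'int', 'min': 2, 'max': 10},
]
config = load_config({'domain': domain_vars})

# The objective receives one value per variable, in the order declared above.
def objective(x):
    lr, num_layers = x
    return (lr - 0.01) ** 2 + (num_layers - 5) ** 2   # toy objective

min_val, min_pt, history = minimise_function(
    objective, config.domain, 20, config=config)
print(min_val, min_pt)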

In Ask-Tell Mode: Ask-tell mode gives you finer control over your experiments: instead of handing Dragonfly a function to call, you supply past results to the API and receive a recommendation for the next evaluation. A sketch is given below; see our readthedocs Ask-Tell page and the examples directory for complete examples.
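
A minimal sketch of the ask-tell loop follows. It assumes the EuclideanFunctionCaller and EuclideanGPBandit classes, the ask_tell_mode flag, and the initialise/ask/tell methods shown on our readthedocs Ask-Tell page; the toy objective and budget are illustrative, so check that page for the exact interface.

from dragonfly import load_config
from dragonfly.exd.experiment_caller import EuclideanFunctionCaller
from dragonfly.opt.gp_bandit import EuclideanGPBandit

# Describe the domain, then create an optimiser in ask-tell mode
# (no function is passed to the caller; we evaluate points ourselves).
domain_vars = [{'name': 'x', 'type': 'float', 'min': -10, 'max': 10}]
config = load_config({'domain': domain_vars})
func_caller = EuclideanFunctionCaller(None, config.domain)
opt = EuclideanGPBandit(func_caller, ask_tell_mode=True)
opt.initialise()

def objective(x):
    # x is a point in the domain: a list with one value per variable.
    return x[0] ** 4 - x[0] ** 2 + 0.1 * x[0]

for _ in range(20):
    x = opt.ask()          # ask for the next point to evaluate
    y = objective(x)       # evaluate it yourself
    opt.tell([(x, -y)])    # report the result; negated here on the assumption
                           # that the optimiser maximises the reported values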

For a comprehensive list of use cases, including multi-objective optimisation, multi-fidelity optimisation, neural architecture search, and other optimisation methods (besides Bayesian optimisation), see our readthedocs pages (command line, Python, Ask-Tell).
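
As one example, the multi-objective APIs live in dragonfly/apis/moo.py. Below is a minimal sketch assuming the multiobjective_maximise_functions API and a (pareto_values, pareto_points, history) return order; the two toy objectives and the budget are illustrative.

from dragonfly import multiobjective_maximise_functions

# Two toy objectives over the same 1-D domain (purely illustrative).
f1 = lambda x: -(x - 1) ** 2
f2 = lambda x: -(x + 1) ** 2
domain = [[-10, 10]]
max_capital = 60

# Assumed to return the Pareto-optimal values and points found within the budget.
pareto_vals, pareto_points, history = multiobjective_maximise_functions(
    (f1, f2), domain, max_capital)
print(pareto_vals, pareto_points)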

 

Contributors

Kirthevasan Kandasamy: github, webpage
Karun Raju Vysyaraju: github, linkedin
Anthony Yu: github, linkedin
Willie Neiswanger: github, webpage
Biswajit Paria: github, webpage
Chris Collins: github, webpage

Acknowledgements

Research and development of the methods in this package were funded by DOE grant DESC0011114, NSF grant IIS1563887, the DARPA D3M program, and AFRL.

Citation

If you use any part of this code in your work, please cite our JMLR paper.

@article{JMLR:v21:18-223,
  author  = {Kirthevasan Kandasamy and Karun Raju Vysyaraju and Willie Neiswanger and Biswajit Paria and Christopher R. Collins and Jeff Schneider and Barnabas Poczos and Eric P. Xing},
  title   = {Tuning Hyperparameters without Grad Students: Scalable and Robust Bayesian Optimisation with Dragonfly},
  journal = {Journal of Machine Learning Research},
  year    = {2020},
  volume  = {21},
  number  = {81},
  pages   = {1-27},
  url     = {http://jmlr.org/papers/v21/18-223.html}
}

License

This software is released under the MIT license. For more details, please refer to LICENSE.txt.

For questions, please email [email protected].

"Copyright 2018-2019 Kirthevasan Kandasamy"
