OptiCL

An end-to-end framework for mixed-integer optimization with data-driven learned constraints.

Overview

OptiCL is an end-to-end framework for mixed-integer optimization (MIO) with data-driven learned constraints. We address a problem setting in which a practitioner wishes to optimize decisions according to some objective and constraints but has no known functions relating the decisions to the outcomes of interest. We propose to learn predictive models for these outcomes using machine learning, and to subsequently optimize decisions by embedding the learned models in a larger MIO formulation.

The framework and full methodology are detailed in our manuscript, Mixed-Integer Optimization with Constraint Learning.

How to use OptiCL

You can install the OptiCL package locally by cloning the repository and running pip install . from the root directory of the repo. This allows you to import opticl in Python; see the example notebooks for specific usage of the functions.
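
As a quick check that the installation worked, the package can be imported directly (a minimal sketch; the function-level API is demonstrated in the example notebooks):

  # Quick smoke test after running `pip install .` in the cloned repository.
  # The package name is opticl; the training and embedding functions are
  # covered in the example notebooks.
  import opticl

  print("opticl imported successfully")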

The OptiCL pipeline

Our pipeline requires two inputs from a user:

  • Training data, with features classified as contextual variables, decisions, and outcomes (an illustrative layout is sketched after this list).
  • An initial conceptual model, which is defined by specifying the decision variables and any domain-driven fixed constraints or deterministic objective terms.
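
As a concrete illustration of the first input, the training data can be viewed as a table whose columns are labeled by role. The column names below are hypothetical and not part of the OptiCL API:

  import pandas as pd

  # Hypothetical training data for a diet-style problem: each row is one
  # observed instance. The user classifies each column as a contextual
  # variable, a decision, or an outcome; the names here are invented.
  data = pd.DataFrame({
      "season":       ["winter", "summer", "winter"],  # contextual variable
      "qty_beans":    [0.4, 0.1, 0.3],                 # decision
      "qty_rice":     [0.2, 0.5, 0.4],                 # decision
      "palatability": [0.62, 0.48, 0.71],              # outcome to be learned
  })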

Given these inputs, we implement a pipeline that:

  1. Learns predictive models for the outcomes of interest using a model training and selection pipeline with cross-validation.
  2. Efficiently characterizes the feasible decision space, or "trust region," using the convex hull of the observed data.
  3. Embeds the learned models and trust region into a MIO formulation, which can then be solved using a Pyomo-supported MIO solver (e.g., Gurobi); a simplified sketch of this embedding step follows the list.
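
To make the embedding step concrete, the sketch below writes the idea out directly in scikit-learn and Pyomo rather than through the OptiCL API; the data, threshold, and objective are invented for illustration, and the trust region is omitted:

  import numpy as np
  from sklearn.linear_model import LinearRegression
  import pyomo.environ as pyo

  # Learn a simple model for one outcome as a function of two decisions.
  X = np.array([[0.4, 0.2], [0.1, 0.5], [0.3, 0.4], [0.2, 0.3]])
  y = np.array([0.62, 0.48, 0.71, 0.55])
  reg = LinearRegression().fit(X, y)

  # Embed the fitted model as a constraint requiring the predicted outcome
  # to be at least a threshold tau (an illustrative choice).
  tau = 0.5
  m = pyo.ConcreteModel()
  m.x = pyo.Var(range(2), bounds=(0, 1))
  m.obj = pyo.Objective(expr=m.x[0] + m.x[1], sense=pyo.minimize)
  m.learned = pyo.Constraint(
      expr=sum(reg.coef_[i] * m.x[i] for i in range(2)) + reg.intercept_ >= tau
  )
  # pyo.SolverFactory("gurobi").solve(m)  # any Pyomo-supported MIO solver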

OptiCL requires no manual specification of a trained ML model, although the end-user can optionally restrict to a subset of model types to be considered in the selection pipeline. Furthermore, we expose the underlying trained models within the pipeline, providing transparency and allowing for the predictive models to be externally evaluated.

Examples

We illustrate the full OptiCL pipeline in three notebooks:

  • A case study on food basket optimization for the World Food Programme (notebooks/WFP/The Palatable Diet Problem.ipynb): This notebook presents a simplified version of the case study in the manuscript. It shows how to train and select models for a single learned outcome, define a conceptual model with a known objective and constraints, and solve the MIO with an additional learned constraint.
  • A general pipeline overview (notebooks/Pipeline/Model_embedding.ipynb): This notebook demonstrates the general features of the pipeline, including the procedure for training and embedding models for multiple outcomes, the specification of each outcome as either a constraint or an objective term, and the incorporation of contextual features and domain-driven constraints.
  • Model verification (notebooks/Pipeline/Model_Verification_Regression.ipynb, notebooks/Pipeline/Model_Verification_Classification.ipynb): These notebooks show the training and embedding of a single model and compare the sklearn predictions to the MIO predictions to verify the MIO embeddings. The classification notebook also provides details on how we linearize constraints for the binary classification setting; one common linearization is sketched after this list.
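
As background for the classification case, a constraint of the form P(y = 1) >= tau on a logistic-regression outcome can be rewritten as the linear constraint w·x + b >= log(tau / (1 - tau)), which lets the classifier enter the MIO without any nonlinearity. The snippet below illustrates this generic transformation with invented data; consult the classification notebook for OptiCL's actual formulation:

  import numpy as np
  from sklearn.linear_model import LogisticRegression
  import pyomo.environ as pyo

  # Fit a binary classifier on toy data (two decisions -> acceptable yes/no).
  X = np.array([[0.1, 0.2], [0.8, 0.7], [0.2, 0.9], [0.9, 0.1]])
  y = np.array([0, 1, 1, 0])
  clf = LogisticRegression().fit(X, y)

  # P(y = 1 | x) >= tau  is equivalent to  w.x + b >= log(tau / (1 - tau)).
  tau = 0.8
  rhs = np.log(tau / (1 - tau))

  m = pyo.ConcreteModel()
  m.x = pyo.Var(range(2), bounds=(0, 1))
  m.obj = pyo.Objective(expr=m.x[0] + m.x[1], sense=pyo.minimize)
  m.cls = pyo.Constraint(
      expr=sum(clf.coef_[0][i] * m.x[i] for i in range(2)) + clf.intercept_[0] >= rhs
  )
  # pyo.SolverFactory("gurobi").solve(m)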

The package currently fully supports model training and embedding for continuous outcomes across all ML methods, as demonstrated in the example notebooks. Binary classification is fully supported for learned constraints. Multi-class classification support is in development.

Citation

Our software can be cited as:

  @misc{OptiCL,
    author = "Donato Maragno and Holly Wiberg",
    title = "OptiCL: Mixed-integer optimization with constraint learning",
    year = 2021,
    url = "https://github.com/hwiberg/OptiCL/"
  }

Get in touch!

Our package is under active development. We welcome any questions or suggestions. Please submit an issue on GitHub, or reach us at [email protected] and [email protected].
