NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem

Liang Xin, Wen Song, Zhiguang Cao, Jie Zhang. NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem, 35th Conference on Neural Information Processing Systems (NeurIPS), 2021. [pdf]

Please cite our paper if this code is useful for your work.

@inproceedings{xin2021neurolkh,
    author = {Xin, Liang and Song, Wen and Cao, Zhiguang and Zhang, Jie},
    booktitle = {Advances in Neural Information Processing Systems},
    title = {NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem},
    volume = {34},
    year = {2021}
}

Quick start

To connect the deep learning model, the Sparse Graph Network (Python), with the Lin-Kernighan-Helsgaun heuristic (C), we provide two versions.

  • subprocess version. This version connects the two languages by writing and reading data files (a minimal sketch of this file-based pattern is shown after the commands below). To compile and test with our pretrained models for TSP instances with 100 nodes:
make
python data_generate.py -test
python test.py --dataset test/100.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000
  • Swig (http://www.swig.org) version. The C code is wrapped for Python. To compile and test with our pretrained models for TSP instances with 100 nodes:
bash setup.sh
python data_generate.py -test
python swig_test.py --dataset test/100.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000
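
For orientation, the subprocess version amounts to exchanging plain files between Python and the LKH executable. The sketch below illustrates that general pattern with a standalone LKH-style binary; the binary path, file names, and parameter values are placeholders, not the exact interface used by test.py.

# Minimal sketch of the file-based (subprocess) pattern for driving an
# LKH-style solver from Python. The binary path, file names, and parameter
# values are hypothetical; the actual interface in test.py differs.
import subprocess
import numpy as np

def write_tsplib(path, coords):
    # Write node coordinates as a TSPLIB EUC_2D instance.
    with open(path, "w") as f:
        f.write("NAME : sample\nTYPE : TSP\n")
        f.write(f"DIMENSION : {len(coords)}\n")
        f.write("EDGE_WEIGHT_TYPE : EUC_2D\nNODE_COORD_SECTION\n")
        for i, (x, y) in enumerate(coords, start=1):
            f.write(f"{i} {x:.6f} {y:.6f}\n")
        f.write("EOF\n")

coords = np.random.rand(100, 2) * 1000  # 100 random nodes
write_tsplib("sample.tsp", coords)

# LKH reads its settings from a parameter file.
with open("sample.par", "w") as f:
    f.write("PROBLEM_FILE = sample.tsp\n")
    f.write("OUTPUT_TOUR_FILE = sample.tour\n")
    f.write("MAX_TRIALS = 1000\nRUNS = 1\n")

# Run the compiled solver and read back the tour file it writes.
subprocess.run(["./LKH", "sample.par"], check=True)
with open("sample.tour") as f:
    print(f.read())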

Usage

Generate the training dataset

Because training the edge scores requires edge labels, generating the training dataset takes a relatively long time (a couple of days).

python data_generate.py -train
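
To sanity-check the generated files, the test pickles can be loaded directly. A minimal sketch, assuming each test file stores the 2-D node coordinates of all instances (see data_generate.py for the authoritative format):

# Inspect a generated test dataset. The assumed layout is
# (n_instances, n_nodes, 2) node coordinates; adjust if data_generate.py
# stores something richer.
import pickle
import numpy as np

with open("test/100.pkl", "rb") as f:
    data = pickle.load(f)

coords = np.asarray(data)
print("instances:", coords.shape[0], "nodes per instance:", coords.shape[1])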

Train the NeuroLKH Model

Training the node penalties in the Sparse Graph Network requires Swig; the subprocess version is not currently supported for this step. With one RTX 2080Ti GPU, the model converges in approximately 4 days.

CUDA_VISIBLE_DEVICES="0" python train.py --file_path train --eval_file_path val --eval_batch_size=100 --save_dir=saved/tsp_neurolkh --learning_rate=0.0001

Finetune the node decoder for large sizes

The finetuning process takes less than 1 minute for each size.

CUDA_VISIBLE_DEVICES="0" python finetune_node.py

Testing

Test with the pretrained model on TSP with 500 nodes:

python test.py --dataset test/500.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000

We test on the TSPLIB instances with two NeuroLKH models: NeuroLKH, trained on uniformly distributed TSP instances, and NeuroLKH_M, trained on uniform, clustered, and uniform-clustered instances (please refer to the paper for details).

python tsplib_test.py
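
tsplib_test.py runs on the standard TSPLIB benchmark files. If you want to look at an instance yourself, the NODE_COORD_SECTION of a EUC_2D instance can be read in a few lines; this is a standalone sketch, independent of the loader the script uses, and kroA100.tsp is just an example file name:

# Minimal reader for the NODE_COORD_SECTION of a EUC_2D TSPLIB instance.
# Standalone sketch, not the loader used by tsplib_test.py.
import numpy as np

def read_tsplib_coords(path):
    coords, reading = [], False
    with open(path) as f:
        for line in f:
            token = line.strip()
            if token == "NODE_COORD_SECTION":
                reading = True
                continue
            if reading:
                if token in ("EOF", ""):
                    break
                _, x, y = token.split()[:3]
                coords.append((float(x), float(y)))
    return np.array(coords)

print(read_tsplib_coords("kroA100.tsp").shape)  # e.g. (100, 2)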

Other Routing Problems (CVRP, PDP, CVRPTW)

Testing with pretrained models

Test for CVRP with 100 customers, and for PDP and CVRPTW with 40 customers (a wrapper chaining all three runs is sketched after the commands):

# Capacitated Vehicle Routing Problem (CVRP)
python CVRPdata_generate.py -test
python CVRP_test.py --dataset CVRP_test/cvrp_100.pkl --model_path pretrained/cvrp_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
# Pickup and Delivery Problem (PDP)
python PDPdata_generate.py -test
python PDP_test.py --dataset PDP_test/pdp_40.pkl --model_path pretrained/pdp_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
# CVRP with Time Windows (CVRPTW)
python CVRPTWdata_generate.py -test
python CVRPTW_test.py --dataset CVRPTW_test/cvrptw_40.pkl --model_path pretrained/cvrptw_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
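
To run all three variants back to back, the commands above can be chained from a small Python wrapper; this is only a convenience sketch that invokes exactly the documented scripts and flags:

# Convenience sketch: run the documented generation and test commands
# for the three routing variants sequentially.
import subprocess

runs = [
    ("CVRPdata_generate.py", "CVRP_test.py", "CVRP_test/cvrp_100.pkl", "pretrained/cvrp_neurolkh.pt"),
    ("PDPdata_generate.py", "PDP_test.py", "PDP_test/pdp_40.pkl", "pretrained/pdp_neurolkh.pt"),
    ("CVRPTWdata_generate.py", "CVRPTW_test.py", "CVRPTW_test/cvrptw_40.pkl", "pretrained/cvrptw_neurolkh.pt"),
]
for gen, test, dataset, model in runs:
    subprocess.run(["python", gen, "-test"], check=True)
    subprocess.run(["python", test, "--dataset", dataset, "--model_path", model,
                    "--n_samples", "1000", "--lkh_trials", "10000",
                    "--neurolkh_trials", "10000"], check=True)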

Training

Train for CVRP with 100-500 customers, and for PDP and CVRPTW with 40-200 customers:

# Capacitated Vehicle Routing Problem (CVRP)
python CVRPdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python CVRP_train.py --save_dir=saved/cvrp_neurolkh
# Pickup and Delivery Problem (PDP)
python PDPdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python PDP_train.py --save_dir=saved/pdp_neurolkh
# CVRP with Time Windows (CVRPTW)
python CVRPTWdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python CVRPTW_train.py --save_dir=saved/cvrptw_neurolkh

Dependencies

  • Python >= 3.6
  • PyTorch
  • scikit-learn
  • NumPy
  • tqdm
  • Swig (optional; required for the Swig version and for training the node penalties)

Acknowledgements
