NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem

Liang Xin, Wen Song, Zhiguang Cao, Jie Zhang. NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem, 35th Conference on Neural Information Processing Systems (NeurIPS), 2021. [pdf]

Please cite our paper if this code is useful for your work.

@inproceedings{xin2021neurolkh,
    author = {Xin, Liang and Song, Wen and Cao, Zhiguang and Zhang, Jie},
    booktitle = {Advances in Neural Information Processing Systems},
    title = {NeuroLKH: Combining Deep Learning Model with Lin-Kernighan-Helsgaun Heuristic for Solving the Traveling Salesman Problem},
    volume = {34},
    year = {2021}
}

Quick start

To connect the deep learning model, the Sparse Graph Network (written in Python), with the Lin-Kernighan-Helsgaun heuristic (written in C), we implement two versions.

  • subprocess version. This version writes and reads data files to exchange information between the two languages (a minimal sketch of this file-based pattern follows this list). To compile and test with our pretrained models for TSP instances with 100 nodes:
make
python data_generate.py -test
python test.py --dataset test/100.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000
  • Swig (http://www.swig.org) version. The C code is wrapped for Python. To compile and test with our pretrained models for TSP instances with 100 nodes:
bash setup.sh
python data_generate.py -test
python swig_test.py --dataset test/100.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000
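
The subprocess version follows a standard file-based pattern: Python writes the instance (and the network's guidance) to disk, invokes the compiled LKH solver, and reads the resulting tour back. The sketch below illustrates only this pattern with plain LKH; the binary name ./LKH, the coordinate scaling, and the parameter keys are assumptions for illustration, not necessarily what test.py does internally.

import subprocess
import numpy as np

def run_lkh_once(coords, runs=100):
    # Write the instance in TSPLIB format (coordinates scaled to integers).
    n = len(coords)
    with open("instance.tsp", "w") as f:
        f.write("NAME : instance\nTYPE : TSP\n")
        f.write(f"DIMENSION : {n}\nEDGE_WEIGHT_TYPE : EUC_2D\nNODE_COORD_SECTION\n")
        for i, (x, y) in enumerate(coords, 1):
            f.write(f"{i} {x * 1000000:.0f} {y * 1000000:.0f}\n")
        f.write("EOF\n")
    # Write the LKH parameter file and call the compiled C solver;
    # the tour is written to instance.tour and can be read back afterwards.
    with open("instance.par", "w") as f:
        f.write("PROBLEM_FILE = instance.tsp\n")
        f.write("OUTPUT_TOUR_FILE = instance.tour\n")
        f.write(f"RUNS = {runs}\n")
    subprocess.run(["./LKH", "instance.par"], check=True)

run_lkh_once(np.random.uniform(size=(100, 2)))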

Usage

Generate the training dataset

Because training the edge scores requires ground-truth edge labels, generating the training dataset takes a relatively long time (a couple of days).

python data_generate.py -train
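
The training instances are essentially sets of random node coordinates; the sketch below generates uniformly distributed instances in the unit square, matching the uniform setting described in the paper. The on-disk layout that data_generate.py actually writes, and the edge-labelling step that dominates the runtime, are not reproduced here.

import pickle
import numpy as np

# Illustrative layout only: 1000 TSP instances with 100 nodes uniform in [0, 1]^2.
n_instances, n_nodes = 1000, 100
instances = np.random.uniform(size=(n_instances, n_nodes, 2))
with open("uniform_tsp100.pkl", "wb") as f:
    pickle.dump(instances, f)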

Train the NeuroLKH Model

Training the node penalties in the Sparse Graph Network requires Swig; the subprocess version is currently not supported. With one RTX 2080Ti GPU, the model converges in approximately 4 days.

CUDA_VISIBLE_DEVICES="0" python train.py --file_path train --eval_file_path val --eval_batch_size=100 --save_dir=saved/tsp_neurolkh --learning_rate=0.0001

Finetune the node decoder for large sizes

The finetuning process takes less than 1 minute for each size.

CUDA_VISIBLE_DEVICES="0" python finetune_node.py

Testing

Test with the pretrained model on TSP with 500 nodes:

python test.py --dataset test/500.pkl --model_path pretrained/neurolkh.pt --n_samples 1000 --lkh_trials 1000 --neurolkh_trials 1000

We test on the TSPLIB instances with two NeuroLKH models: NeuroLKH, trained on uniformly distributed TSP instances, and NeuroLKH_M, trained on a mixture of uniform, clustered, and uniform-clustered instances (please refer to the paper for details).

python tsplib_test.py
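
TSPLIB instances store node coordinates in a NODE_COORD_SECTION. A minimal reader for the common EUC_2D case is sketched below for reference; tsplib_test.py does its own parsing and handles more cases, so this is illustrative only.

import numpy as np

def read_tsplib_coords(path):
    # Collect the (x, y) coordinates listed between NODE_COORD_SECTION and EOF.
    coords, in_section = [], False
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line == "NODE_COORD_SECTION":
                in_section = True
            elif line == "EOF" or not line:
                in_section = False
            elif in_section:
                _, x, y = line.split()[:3]
                coords.append((float(x), float(y)))
    return np.array(coords)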

Other Routing Problems (CVRP, PDP, CVRPTW)

Testing with pretrained models

Test CVRP with 100 customers, and PDP and CVRPTW with 40 customers:

# Capacitated Vehicle Routing Problem (CVRP)
python CVRPdata_generate.py -test
python CVRP_test.py --dataset CVRP_test/cvrp_100.pkl --model_path pretrained/cvrp_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
# Pickup and Delivery Problem (PDP)
python PDPdata_generate.py -test
python PDP_test.py --dataset PDP_test/pdp_40.pkl --model_path pretrained/pdp_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
# CVRP with Time Windows (CVRPTW)
python CVRPTWdata_generate.py -test
python CVRPTw_test.py --dataset CVRPTW_test/cvrptw_40.pkl --model_path pretrained/cvrptw_neurolkh.pt --n_samples 1000 --lkh_trials 10000 --neurolkh_trials 10000
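
For reference, a CVRP instance couples customer coordinates and demands with a depot location and a vehicle capacity. The sketch below builds such an instance with illustrative values; the demand range, the capacity, and the pickle layout expected by CVRP_test.py are assumptions, not the repository's actual format.

import numpy as np

# Illustrative CVRP instance: one depot, 100 customers, integer demands, one capacity.
rng = np.random.default_rng(0)
depot = rng.uniform(size=2)
customers = rng.uniform(size=(100, 2))
demands = rng.integers(1, 10, size=100)   # assumed demand range
capacity = 50                             # assumed vehicle capacity
print(customers.shape, int(demands.sum()), capacity)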

Training

Train CVRP with 100-500 customers, and PDP and CVRPTW with 40-200 customers:

# Capacitated Vehicle Routing Problem (CVRP)
python CVRPdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python CVRP_train.py --save_dir=saved/cvrp_neurolkh
# Pickup and Delivery Problem (PDP)
python PDPdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python PDP_train.py --save_dir=saved/pdp_neurolkh
# CVRP with Time Windows (CVRPTW)
python CVRPTWdata_generate.py -train
CUDA_VISIBLE_DEVICES="0" python CVRPTW_train.py --save_dir=saved/cvrptw_neurolkh

Dependencies

  • Python >= 3.6
  • PyTorch
  • scikit-learn (sklearn)
  • NumPy
  • tqdm
  • Swig (optional)
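
One way to install the Python dependencies is via pip (the package names below are the usual PyPI names; pick the PyTorch build matching your CUDA version, and install Swig through your system package manager if you want the Swig version):

pip install torch scikit-learn numpy tqdm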

Acknowledgements
