CATE: Computation-aware Neural Architecture Encoding with Transformers

Overview


Code for paper:

CATE: Computation-aware Neural Architecture Encoding with Transformers
Shen Yan, Kaiqiang Song, Fei Liu, Mi Zhang.
ICML 2021 (Long Talk).

Figure: Overview of CATE. CATE takes computationally similar architecture pairs as input and is trained to predict masked operators given the pairwise computation information. Apart from the cross-attention blocks, the pretrained Transformer encoder is used to extract architecture encodings for the downstream search.
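To make that setup concrete, the sketch below shows the general shape of masked-operator prediction with cross-attention over a computationally similar pair. It is a minimal illustrative toy, not the code in this repository: the class name TinyCATE, all layer sizes, and the flat operator-sequence layout are assumptions.

import torch
import torch.nn as nn

class TinyCATE(nn.Module):
    # Toy model: per-architecture Transformer encoding, one cross-attention
    # step that injects information from the paired (computationally similar)
    # architecture, and a classifier over operator ids.
    def __init__(self, n_vocab=5, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(n_vocab, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, n_vocab)

    def forward(self, ops_a, ops_b):
        # ops_a, ops_b: (batch, seq_len) integer operator sequences; some
        # positions of ops_a stand in for [MASK] tokens to be recovered.
        h_a = self.encoder(self.embed(ops_a))
        h_b = self.encoder(self.embed(ops_b))
        fused, _ = self.cross_attn(h_a, h_b, h_b)  # attend to the paired architecture
        return self.head(fused)                     # (batch, seq_len, n_vocab) logits

model = TinyCATE()
ops_a = torch.randint(0, 5, (8, 7))   # 8 toy pairs, 7 operator slots each
ops_b = torch.randint(0, 5, (8, 7))
logits = model(ops_a, ops_b)
# A real setup would score only the masked positions; this toy scores them all.
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 5), ops_a.reshape(-1))
loss.backward()

In the actual pipeline, it is the pretrained encoder (without the cross-attention blocks) that produces the architecture encodings consumed by the search scripts below.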

The repository is built upon pybnn and nas-encodings.

Requirements

conda create -n tf python=3.7
source activate tf
cat requirements.txt | xargs -n 1 -L 1 pip install

Experiments on NAS-Bench-101

Dataset preparation on NAS-Bench-101

Install nasbench and download nasbench_only108.tfrecord into the ./data folder.

python preprocessing/gen_json.py

Data will be saved in ./data/nasbench101.json.
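If you want to sanity-check the generated file, a small illustrative snippet follows; the exact schema of nasbench101.json is an assumption here, so inspect a record and adapt the keys.

import json

# Load the generated NAS-Bench-101 records and report how many were written.
with open("data/nasbench101.json") as f:
    data = json.load(f)
print(type(data), len(data))
# Peek at one record to learn the actual schema before relying on any keys.
sample = data[0] if isinstance(data, list) else next(iter(data.values()))
print(sample)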

Generate architecture pairs

python preprocessing/data_generate.py --dataset nasbench101 --flag extract_seq
python preprocessing/data_generate.py --dataset nasbench101 --flag build_pair --k 2 --d 2000000 --metric params

The corresponding training data and pairs will be saved in ./data/nasbench101/.

Alternatively, you can download the data train_data.pt, test_data.pt and pair indices train_pair_k2_d2000000_metric_params.pt, test_pair_k2_d2000000_metric_params.pt from here.

Pretraining

bash run_scripts/pretrain_nasbench101.sh

The pretrained models will be saved in ./model/.

Alternatively, you can download the pretrained model nasbench101_model_best.pth from here.

Extract the pretrained encodings

python inference/inference.py --pretrained_path model/nasbench101_model_best.pth.tar --train_data data/nasbench101/train_data.pt --valid_data data/nasbench101/test_data.pt --dataset nasbench101

The extracted embeddings will be saved in ./cate_nasbench101.pt.
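Before running the search, you can take a quick look at the extracted encodings with torch.load, as in the illustrative snippet below; the internal layout of cate_nasbench101.pt is an assumption, so print it and adapt.

import torch

# Load the extracted CATE encodings on CPU and inspect their layout.
emb = torch.load("cate_nasbench101.pt", map_location="cpu")
print(type(emb))
if isinstance(emb, dict):
    # Show a few entries; values are expected (but not guaranteed) to be tensors.
    for key, value in list(emb.items())[:3]:
        print(key, getattr(value, "shape", type(value)))
elif torch.is_tensor(emb):
    print(emb.shape)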

Alternatively, you can download the pretrained embeddings cate_nasbench101.pt from here.

Run search experiments on NAS-Bench-101

bash run_scripts/run_search_nasbench101.sh

Search results will be saved in ./nasbench101/.

Experiments on NAS-Bench-301

Dataset preparation

Install nasbench301 and download the xgb_v1.0 and lgb_runtime_v1.0 files. You may need to make pytorch_geometric compatible with your PyTorch and CUDA versions.

python preprocessing/gen_json_darts.py # randomly sample 1,000,000 archs

Data will be saved in ./data/nasbench301_proxy.json.

Alternatively, you can download the json file nasbench301_proxy.json from here.

Generate architecture pairs

python preprocessing/data_generate.py --dataset nasbench301 --flag extract_seq
python preprocessing/data_generate.py --dataset nasbench301 --flag build_pair --k 1 --d 5000000 --metric flops

The corresponding training data and pairs will be saved in ./data/nasbench301/.

Alternatively, you can download the data train_data.pt, test_data.pt and pair indices train_pair_k1_d5000000_metric_flops.pt, test_pair_k1_d5000000_metric_flops.pt from here.

Pretraining

bash run_scripts/pretrain_nasbench301.sh

The pretrained models will be saved in ./model/.

Alternatively, you can download the pretrained model nasbench301_model_best.pth from here.

Extract the pretrained encodings

python inference/inference.py --pretrained_path model/nasbench301_model_best.pth.tar --train_data data/nasbench301/train_data.pt --valid_data data/nasbench301/test_data.pt --dataset nasbench301 --n_vocab 11

The extracted encodings will be saved in ./cate_nasbench301.pt.

Alternatively, you can download the pretrained embeddings cate_nasbench301.pt from here.

Run search experiments on NAS-Bench-301

bash run_scripts/run_search_nasbench301.sh

Search results will be saved in ./nasbench301/.

DARTS experiments without surrogate models

Download the pretrained embeddings cate_darts.pt from here.

python search_methods/dngo_ls_darts.py --dim 64 --init_size 16 --topk 5 --dataset darts --output_path bo  --embedding_path cate_darts.pt

Search log will be saved in ./darts/. Final search result will be saved in ./darts/bo/dim64.

Evaluate the learned cells on the DARTS search space on CIFAR-10

python darts/cnn/train.py --auxiliary --cutout --arch cate_small
python darts/cnn/train.py --auxiliary --cutout --arch cate_large
  • Expected results (CATE-Small): 2.55% avg. test error with 3.5M model params.
  • Expected results (CATE-Large): 2.46% avg. test error with 4.1M model params.

Transfer learning on ImageNet

python darts/cnn/train_imagenet.py  --arch cate_small --seed 1 
python darts/cnn/train_imagenet.py  --arch cate_large --seed 1
  • Expected results (CATE-Small): 26.05% test error with 5.0M model params and 556M mult-adds.
  • Expected results (CATE-Large): 25.01% test error with 5.8M model params and 642M mult-adds.

Visualize the learned cell

python darts/cnn/visualize.py cate_small
python darts/cnn/visualize.py cate_large

Experiments on outside search space

Build outside search space dataset

bash run_scripts/generate_oo.sh

Data will be saved in ./data/nasbench101_oo_train.json and ./data/nasbench101_oo_test.json.

Generate architecture pairs

python preprocessing/data_generate_oo.py --flag extract_seq
python preprocessing/data_generate_oo.py --flag build_pair

The corresponding training data and pair indices will be saved in ./data/nasbench101/.

Pretraining

python run.py --do_train --parallel --train_data data/nasbench101/nasbench101_oo_trainSet_train.pt --train_pair data/nasbench101/oo_train_pairs_k2_params_dist2e6.pt  --valid_data data/nasbench101/nasbench101_oo_trainSet_validation.pt --valid_pair data/nasbench101/oo_validation_pairs_k2_params_dist2e6.pt --dataset oo

The pretrained models will be saved in ./model/.

Extract embeddings on outside search space

# Adjacency encoding
python inference/inference_adj.py
# CATE encoding
python inference/inference.py --pretrained_path model/oo_model_best.pth.tar --train_data data/nasbench101/nasbench101_oo_testSet_split1.pt --valid_data data/nasbench101/nasbench101_oo_testSet_split2.pt --dataset oo_nasbench101

The extracted encodings will be saved as ./adj_oo_nasbench101.pt and ./cate_oo_nasbench101.pt.

Alternatively, you can download the data, pair indices, pretrained models, and extracted embeddings from here.

Run MLP predictor experiments on outside search space

for s in {1..500}; do python search_methods/oo_mlp.py --dim 27 --seed $s --init_size 16 --topk 5 --dataset oo_nasbench101 --output_path np_adj  --embedding_path adj_oo_nasbench101.pt; done
for s in {1..500}; do python search_methods/oo_mlp.py --dim 64 --seed $s --init_size 16 --topk 5 --dataset oo_nasbench101 --output_path np_cate  --embedding_path cate_oo_nasbench101.pt; done

Search results will be saved in ./oo_nasbench101.

Citation

If you find this useful for your work, please consider citing:

@InProceedings{yan2021cate,
  title = {CATE: Computation-aware Neural Architecture Encoding with Transformers},
  author = {Yan, Shen and Song, Kaiqiang and Liu, Fei and Zhang, Mi},
  booktitle = {ICML},
  year = {2021}
}