
Contrastive Learning with Non-Semantic Negatives

This repository is the official implementation of the NeurIPS 2021 paper Robust Contrastive Learning Using Negative Samples with Diminished Semantics.

tl;dr: Contrastive learning uses positive pairs that preserve semantic information while perturbing superficial features of the training images. Analogously, we propose to generate negative samples in which only the superficial, non-semantic features are preserved, making the model more robust.

News

  • Updated scripts and added new checkpoints based on the hyperparameters identified by IFM on the ImageNet-100 dataset.

Preparation

Install PyTorch and see preprocess/ for the ImageNet-100 and ImageNet-Texture preprocessing scripts.
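As a rough illustration of what the ImageNet-100 preparation involves (the actual script in preprocess/ may differ), building the subset amounts to selecting 100 class folders from full ImageNet and linking them into a new dataset root. The function name and directory layout below are assumptions for illustration:

```python
from pathlib import Path

def build_imagenet100(imagenet_dir, out_dir, class_list):
    """Symlink the selected class folders (e.g. train/n01558993) from the
    full ImageNet directory into out_dir, for both train and val splits.
    Returns the number of links created."""
    created = 0
    for split in ("train", "val"):
        for wnid in class_list:
            src = Path(imagenet_dir) / split / wnid
            dst = Path(out_dir) / split / wnid
            if src.is_dir() and not dst.exists():
                dst.parent.mkdir(parents=True, exist_ok=True)
                dst.symlink_to(src.resolve(), target_is_directory=True)
                created += 1
    return created
```

The resulting directory can then be passed to the training commands below in place of /path/to/imagenet-100/.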

Training

The following commands pre-train MoCo-v2 with patch- or texture-based non-semantic negatives (NS). The code is developed with minimal modifications from the official MoCo implementation.

python moco-non-sem-neg.py -a resnet50 --lr 0.8 --batch-size 512 --moco-m 0.99 --dist-url 'tcp://localhost:10001' \
  --multiprocessing-distributed --world-size 1 --rank 0 \
  --mlp --moco-t 0.2 --aug-plus --cos --moco-k 16384 \
  --robust patch --num-nonsem 1 --alpha 2 --epochs 200 --patch-ratio 16 72 \
  --ckpt_path ./ckpts/mocov2_mocok16384_bs512_lr0.8_nonsem_16_72_noaug_nn1_alpha2_epoch200  \
  /path/to/imagenet-100/ 

python moco-non-sem-neg.py -a resnet50 --lr 0.8 --batch-size 512 --moco-m 0.99 --dist-url 'tcp://localhost:10001' \
  --multiprocessing-distributed --world-size 1 --rank 0 \
  --mlp --moco-t 0.2 --aug-plus --cos --moco-k 16384 \
  --robust texture --num-nonsem 1 --alpha 2 --epochs 200 \
  --ckpt_path ./ckpts/mocov2_mocok16384_bs512_lr0.8_texture_nn1_alpha2_epoch200 \
  /path/to/imagenet-100-texture/ 
  • Replace /path/to/imagenet-100/ with the path to your ImageNet-100 dataset directory.
  • Adjust --alpha and --moco-k to reproduce results under different configurations.
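Intuitively, the patch-based negatives are built by cutting an image into small patches and shuffling them, which destroys the global semantics while keeping local texture statistics (--patch-ratio bounds the sampled patch size). A minimal stdlib sketch of the shuffling idea, on a 2D grid standing in for an image; the real implementation in moco-non-sem-neg.py operates on augmented tensor batches:

```python
import random

def shuffle_patches(img, patch):
    """Split a 2D grid (H x W, both divisible by `patch`) into patch x patch
    tiles, shuffle the tiles, and reassemble: semantics gone, texture kept."""
    h, w = len(img), len(img[0])
    tiles = [
        [row[x:x + patch] for row in img[y:y + patch]]
        for y in range(0, h, patch)
        for x in range(0, w, patch)
    ]
    random.shuffle(tiles)
    out = [[0] * w for _ in range(h)]
    for idx, tile in enumerate(tiles):
        y = (idx // (w // patch)) * patch
        x = (idx % (w // patch)) * patch
        for dy, row in enumerate(tile):
            out[y + dy][x:x + patch] = row
    return out
```

A shuffled image is then encoded and used as an extra negative in the MoCo queue, weighted by --alpha.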

Linear Evaluation

Run the following code to reproduce the MoCo-v2 + patch-based NS model reported in Table 1.

python main_lincls.py -a resnet50 --lr 10.0 --batch-size 128 --epochs 60 \
  --pretrained ./ckpts/mocov2_mocok16384_bs128_lr0.03_nonsem_16_72_noaug_nn1_alpha2_epoch200/checkpoint_0199.pth.tar \
  --dist-url 'tcp://localhost:10001' --multiprocessing-distributed --world-size 1 --rank 0 \
  --ckpt_path ./ckpts/mocov2_mocok16384_bs128_lr0.03_nonsem_16_72_noaug_nn1_alpha2_epoch200 \
  /path/to/imagenet-100/ 
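main_lincls.py follows the standard MoCo linear-evaluation recipe: it keeps only the query encoder's backbone weights from the pre-trained checkpoint, dropping the momentum encoder, the queue, and the MLP head, then trains a fresh linear classifier on frozen features. A sketch of the key filtering on a plain dict (in practice this is applied to the torch state_dict loaded from checkpoint_0199.pth.tar):

```python
def extract_backbone(state_dict):
    """Keep query-encoder backbone weights; drop the momentum encoder,
    the negative queue, and the projection/fc head."""
    prefix = "module.encoder_q."
    out = {}
    for k, v in state_dict.items():
        if k.startswith(prefix) and not k.startswith(prefix + "fc"):
            out[k[len(prefix):]] = v
    return out
```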

Pre-trained Models

You can download pretrained models here:

| Model | k | α | ImageNet-100 | Corruption | Sketch | Stylized | Rendition | Checkpoints |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| MoCo | 16384 | - | 77.88±0.28 | 43.08±0.27 | 28.24±0.58 | 16.20±0.55 | 32.92±0.12 | R1, R2, R3 |
| +Texture | 16384 | 2 | 77.76±0.17 | 43.58±0.33 | 29.11±0.39 | 16.59±0.17 | 33.36±0.15 | R1, R2, R3 |
| +Patch | 16384 | 2 | 79.35±0.12 | 45.13±0.35 | 31.76±0.88 | 17.37±0.19 | 34.78±0.15 | R1, R2, R3 |
| +Patch | 16384 | 3 | 75.58±0.52 | 44.45±0.15 | 34.03±0.58 | 18.60±0.26 | 36.89±0.11 | R1, R2, R3 |
| MoCo | 8192 | - | 77.73±0.38 | 43.22±0.39 | 28.45±0.36 | 16.83±0.12 | 33.19±0.44 | R1, R2, R3 |
| +Patch | 8192 | 2 | 79.54±0.32 | 45.48±0.20 | 33.36±0.45 | 17.81±0.32 | 36.31±0.37 | R1, R2, R3 |
| MoCo* | 65536 | - | 80.00±0.14 | 45.15±0.42 | 30.38±0.30 | 16.68±0.39 | 30.38±0.30 | R1, R2, R3 |
| +Patch | 65536 | 2 | 81.18±0.09 | 46.74±0.32 | 32.46±0.55 | 17.63±0.14 | 36.66±0.18 | R1, R2, R3 |
| +Patch | 16394 | 2 | 81.49±0.11 | 47.48±0.20 | 34.20±0.40 | 17.95±0.41 | 38.45±0.19 | R1, R2, R3 |

* denotes training with the IFM hyperparameters.
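For context, IFM (implicit feature modification, Robinson et al., 2021) hardens the contrastive loss by perturbing the logits adversarially before the softmax: subtracting ε from the positive logit and adding ε to the negatives. A pure-Python sketch of an IFM-adjusted InfoNCE loss over precomputed similarity logits; the ε and temperature values are illustrative, not the hyperparameters used here:

```python
import math

def infonce_ifm(pos_logit, neg_logits, eps=0.1, t=0.2):
    """InfoNCE with IFM: shift logits toward the worst case, then take
    the negative log-softmax of the (perturbed) positive."""
    pos = (pos_logit - eps) / t
    negs = [(n + eps) / t for n in neg_logits]
    m = max([pos] + negs)  # stabilize the log-sum-exp
    lse = m + math.log(sum(math.exp(x - m) for x in [pos] + negs))
    return lse - pos
```

With eps=0 this reduces to the plain InfoNCE loss; larger ε makes every batch effectively harder.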

BibTeX

@article{ge2021robust,
  title={Robust Contrastive Learning Using Negative Samples with Diminished Semantics},
  author={Ge, Songwei and Mishra, Shlok and Li, Chun-Liang and Wang, Haohan and Jacobs, David},
  journal={Advances in Neural Information Processing Systems},
  volume={34},
  year={2021}
}
