Retrieval.pytorch - The code we used in the 2020 DIGIX competition

Overview

retrieval.pytorch

Dependencies

  • python3
  • pytorch
  • numpy
  • scikit-learn
  • tqdm
  • yacs

You can install yacs with pip (pip install yacs); the other dependencies can be installed with conda.

Prepare the dataset

First, download the dataset from here. Then move the archives into the directory $DATASET and decompress them with:

unzip train_data.zip
unzip test_data_A.zip
unzip test_data_B.zip

Then remove the empty directories in train_data:

cd train_data
rm -rf DIGIX_001453
rm -rf DIGIX_001639
rm -rf DIGIX_002284

Finally, edit the file src/dataset/datasets.py and set traindir, test_A_dir, and test_B_dir to the correct paths:

traindir = '$DATASET/train_data'
test_A_dir = '$DATASET/test_data_A'
test_B_dir = '$DATASET/test_data_B'
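
Before training, you can optionally sanity-check these paths with a short Python snippet. This is only a sketch, not part of the repo, and it assumes the $DATASET environment variable is set in your shell:

import os

dataset_root = os.path.expandvars('$DATASET')  # assumes $DATASET is exported in your shell
for name in ('train_data', 'test_data_A', 'test_data_B'):
    path = os.path.join(dataset_root, name)
    if os.path.isdir(path):
        # count the class sub-directories under each split
        n = sum(os.path.isdir(os.path.join(path, d)) for d in os.listdir(path))
        print(f'{name}: {n} sub-directories')
    else:
        print(f'{name}: missing ({path})')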

Train the networks to extract features

You can train dla102x and resnet101 with the commands below:

python experiments/DIGIX/dla102x/cgd_margin_loss.py
python experiments/DIGIX/resnet101/cgd_margin_loss.py

To train fishnet99, hrnet_w18, and hrnet_w30, you need to download their ImageNet pretrained weights from here. Specifically, download fishnet99_ckpt.tar for fishnet99, hrnetv2_w18_imagenet_pretrained.pth for hrnet_w18, and hrnetv2_w30_imagenet_pretrained.pth for hrnet_w30. Then move these weights to ~/.cache/torch/hub/checkpoints so that torch.hub.load_state_dict_from_url can find them.
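
If you want to confirm the weights are in the expected cache location before training, a small check like the following can help. This is a sketch, not part of the repo; it only assumes the three filenames listed above:

import os
import torch

ckpt_dir = os.path.expanduser('~/.cache/torch/hub/checkpoints')
for name in ('fishnet99_ckpt.tar',
             'hrnetv2_w18_imagenet_pretrained.pth',
             'hrnetv2_w30_imagenet_pretrained.pth'):
    path = os.path.join(ckpt_dir, name)
    if os.path.isfile(path):
        # make sure the file is a loadable torch checkpoint
        state = torch.load(path, map_location='cpu')
        size = len(state) if isinstance(state, dict) else type(state)
        print(name, 'loaded,', size)
    else:
        print(name, 'missing, download it first')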

Then you can train fishnet99, hrnet_w18, and hrnet_w30 with:

python experiments/DIGIX/fishnet99/cgd_margin_loss.py
python experiments/DIGIX/hrnet_w18/cgd_margin_loss.py
python experiments/DIGIX/hrnet_w30/cgd_margin_loss.py

After training, the model weights can be found in results/DIGIX/{model}/cgd_margin_loss/{time}/transient/checkpoint.final.ckpt. We also provide these weight files.
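
To check that a run finished and its checkpoint is readable, you could load it like this. This is a sketch; the exact contents of checkpoint.final.ckpt depend on the training scripts:

import glob
import torch

# locate every final checkpoint written under results/DIGIX
for path in sorted(glob.glob('results/DIGIX/*/cgd_margin_loss/*/transient/checkpoint.final.ckpt')):
    ckpt = torch.load(path, map_location='cpu')
    keys = list(ckpt.keys()) if isinstance(ckpt, dict) else type(ckpt)
    print(path, '->', keys)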

Extract features for retrieval

You can download the pretrained models from here and move them to the pretrained directory.

Then run the commands below:

python experiments/DIGIX_test_B/dla102x/cgd_margin_loss_test_B.py
python experiments/DIGIX_test_B/resnet101/cgd_margin_loss_test_B.py
python experiments/DIGIX_test_B/fishnet99/cgd_margin_loss_test_B.py
python experiments/DIGIX_test_B/hrnet_w18/cgd_margin_loss_test_B.py
python experiments/DIGIX_test_B/hrnet_w30/cgd_margin_loss_test_B.py

When finished, the query features for test_data_B can be found in results/DIGIX_test_B/{model}/cgd_margin_loss_test_B/{time}/query_feat, and the gallery features in results/DIGIX_test_B/{model}/cgd_margin_loss_test_B/{time}/gallery_feat.
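
A quick way to confirm that every model produced both feature sets is a small listing script like the one below. This is only a sketch, not part of the repo:

import glob
import os

for run in sorted(glob.glob('results/DIGIX_test_B/*/cgd_margin_loss_test_B/*')):
    for sub in ('query_feat', 'gallery_feat'):
        present = os.path.exists(os.path.join(run, sub))
        print(run, sub, 'present' if present else 'missing')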

Post process

You can download the features from here. Then put them into the features directory and decompress them with:

tar -xvf DIGIX_test_B_dla102x_5088.tar
tar -xvf DIGIX_test_B_fishnet99_5153.tar
tar -xvf DIGIX_test_B_hrnet_w18_5253.tar
tar -xvf DIGIX_test_B_hrnet_w30_5308.tar
tar -xvf DIGIX_test_B_resnet101_5059.tar

Then the features directory will be organized like this:

|-- DIGIX_test_B_dla102x_5088.tar  
|-- DIGIX_test_B_fishnet99_5153.tar  
|-- DIGIX_test_B_hrnet_w18_5253.tar  
|-- DIGIX_test_B_hrnet_w30_5308.tar  
|-- DIGIX_test_B_resnet101_5059.tar 
|-- DIGIX_test_B_dla102x_5088  
| |-- gallery_feat  
| |-- query_feat  
|-- DIGIX_test_B_fishnet99_5153  
| |-- gallery_feat  
| |-- query_feat  
|-- DIGIX_test_B_hrnet_w18_5253  
| |-- gallery_feat  
| |-- query_feat  
|-- DIGIX_test_B_hrnet_w30_5308  
| |-- gallery_feat  
| |-- query_feat  
|-- DIGIX_test_B_resnet101_5059  
| |-- gallery_feat  
| |-- query_feat  

Now the post-processing can be executed with:

python post_process/rank.py --gpu 0 features/DIGIX_test_B_fishnet99_5153 features/DIGIX_test_B_dla102x_5088 features/DIGIX_test_B_hrnet_w18_5253 features/DIGIX_test_B_hrnet_w30_5308 features/DIGIX_test_B_resnet101_5059
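
The actual ranking and fusion logic lives in post_process/rank.py. The snippet below is only a generic illustration of one common way to fuse several models' features (L2-normalize each model's features, concatenate them, then rank the gallery by cosine similarity); it uses hypothetical toy arrays and is not the repo's exact method:

import numpy as np

def l2norm(x):
    # row-wise L2 normalization
    return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-12)

# hypothetical toy features: one (num_query, dim) / (num_gallery, dim) pair per model
query_feats = [np.random.randn(4, 8), np.random.randn(4, 16)]
gallery_feats = [np.random.randn(10, 8), np.random.randn(10, 16)]

q = np.concatenate([l2norm(f) for f in query_feats], axis=1)
g = np.concatenate([l2norm(f) for f in gallery_feats], axis=1)

sim = l2norm(q) @ l2norm(g).T       # cosine similarity between queries and gallery
ranks = np.argsort(-sim, axis=1)    # best-matching gallery indices first
print(ranks[:, :5])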